Earlier today I attended a superb session of the Churchill Club in Palo Alto, at which the European Data Protection Supervisor spoke on data protection and innovation. As he described the progress of the EU General Data Protection Regulation and its likely impacts on business, I found myself thinking about what EU data protection reform would look like if it were up to me.
Of course, this is by definition something of a navel-gazing exercise, because EU data protection reform is not up to me. Nevertheless, I thought I would at least share some of my thoughts to see to what extent they strike a chord with readers of our blog - and, perhaps, even reach the ears of those who do make the law.
So, if you'll allow me this indulgence, here's what my reforms would do:
1. They would strike a balance between supporting privacy rights, economic and social well-being, and innovation. Fundamentally, I support the overarching goals of the GDPR described in its recitals - namely that "the principles and rules on the protection of individuals with regard to the processing of their personal data should ... respect their fundamental rights and freedoms, notably their right to protection of personal data" and that those rules should "contribute to an area of freedom, security and justice ..., to economic and social progress, ... and the well-being of individuals." Yet, at times within the draft texts of the GDPR, this balance has been lost, with provisions swinging so far towards conservatism and restrictiveness that the promotion of economic progress - including, critically for any economy, innovation - gets lost. If it were up to me, my reforms would endeavour to restore this balance through some of the measures described below.
2. They would recognize that over-prescription drives bad behaviours. A problem with overly prescriptive legislation is that it is inherently inflexible. Yet data protection rules need to apply across all types of personal data, all types of technologies and all sectors. Inevitably, the more prescriptive the legislation, the less well it flexes to adapt to 'real world' situations and the more it discourages innovation - pushing would-be good actors into non-compliance. And, when those actors perceive compliance as unattainable, their privacy programs become driven by a concern to avoid risk rather than to achieve compliance - a poor result for regulators, businesses and data subjects alike. For this reason, my data protection reforms would focus on the goals to be achieved (data stays protected) rather than on the means of their achievement (e.g. specifying internal documentation needs). This is precisely why the current Data Protection Directive has survived as long as it has.
3. They would provide incentives for pseudonymisation. Absent a few stray references to pseudonymisation here and there across the various drafts of the GDPR, there really is very little to incentivise adoption of pseudonymisation by controllers - pseudonymised data are protected to exactly the same standard as 'ordinary' personal data. Every privacy professional recognizes the dangers of re-identification inherent in pseudonymised data, but treating it identically to ordinary personal data drives the wrong behaviour by controllers - perceiving little to no regulatory benefit to pseudonymisation, controllers decline to adopt it for cost or other implementation reasons. My reforms would explore whether pseudonymisation could be incentivised to encourage its adoption, for example by relaxing data minimization, purpose limitation or data export rules for pseudonymised data, in addition to existing proposals for relaxed data breach notification rules.
4. They would recognize the distinct role of platforms. European data protection professionals still operate in a binary world - businesses are either 'data controllers' or 'data processors'. Yet, increasingly, this binary division of responsibility and liability doesn't reflect how things operate in reality - especially in an app-centric world operating over third-party cloud or mobile platforms. The operators of these platforms don't always sit neatly within a 'controller' or a 'processor' mould yet, vitally, they are the gatekeepers through which app controllers gain access to the highly sensitive information we store on those platforms - our contact lists, our address books, our health data and so on. We need an informed debate as to the role of platforms under revised data protection rules.
5. They would abandon outdated data export restrictions. It's time to have a grown-up conversation about data exports, and to recognize that the current data export rules simply do not work. Who honestly believes that a model contract protects data? How can European regulators promote Binding Corporate Rules as a best practice standard for data export compliance, but then insist on reviewing each and every BCR applicant when they are too poorly resourced to do so within any kind of commercially acceptable timescale? And how can we possibly complain about poor Safe Harbor enforcement in the US when we have little to no enforcement of data export breaches at home in the EU? Any business of scale collects data internationally, operates internationally, and transfers data internationally; we should not prohibit this, but should instead have a regulatory framework that acknowledges this reality and requires businesses to self-assess and maintain protection of data wherever it goes in the world. And, yes, we should hold businesses fully accountable when they fail to do so.
6. They would recognise that consent is not a panacea. There's been a strong narrative in Europe for some time now that more data processing needs to be conditioned on individuals' consent. The consensus (and it's not a wholly unfair one) is that individuals have lost control of their data and that consent would somehow restore the balance. It's easy to have sympathy for this view, but consent is not all it's cracked up to be. Think about it: if consent were a requirement of processing, how would businesses be forced to respond? Particularly within a European legislative environment that considers almost all types of data to be 'personal' and therefore regulated? The answer would be a proliferation of consent boxes, windows and buttons layered across every product and service you interact with. And, to make matters worse, the language accompanying these consents would invariably become excessively detailed and drafted using 'catch-all' terms, to avoid any suggestion that the business had failed to collect a sufficiently broad consent. Clearly, there are places where consent is merited (the collection and use of sensitive data being a prime example), but for other uses of data a well-structured data protection regime would instead promote the use of legitimate interests and other non-consent-based grounds for data processing - backed, of course, by effective regulatory audit and sanctions in order to provide the necessary checks and balances.
So there you have it. Those are just a few of my views - I have others, but I'll spare you those for now, and no doubt you'll have views of your own. If you agree with the views above, then share them; if you don't, then share them anyway and continue the debate. We'll only ever achieve a regulatory framework that balances the needs of everyone if we all make our voices heard, debate hard, and strive to reach consensus on a data protection regime fit for the future!