We Need to Talk About Data Arbitrage: A Position on Data Trusts
Based on personal experience, I can tell you that the best way to rile up a privacy/data protection lawyer is to talk about the best ways to use and trade personal data as a commodity. To be fair, our (top-notch, award-winning, industry-leading) Privacy, Security and Information team does exactly what it is meant to do: ensure that clients follow privacy law and the higher, near-inviolable purpose that law serves – the right of individuals to keep their private information just that, private.
The fact that our laws promote this ideal doesn't make the use of personal data evil. Yet the focus on protecting personal data as a higher moral imperative leads most people to discuss using personal data, even when the law permits it, in hushed tones.
But data has been a commodity for as long as there has been data – certainly for much longer than there have been computers to automate its collection – and I think that we as transactional lawyers fail our clients if we don't ask how data fits into a commercial deal done in 2017. I've said for a long time that in the data economy (see what I did there?), a good cross-promotional deal is one in which it's hard to tell which party paid cash to the other, because so many other currencies – data among them – are in play, and they aren't traded nearly as often as they should be. In the media, entertainment and sports industries in which I work, research and analysis have always been critical to ensuring that the right content – and the most relevant advertisements – reach each viewer or fan. It's safe to say that the use of data as a commodity has grown despite the concomitant growth of privacy regulation, and there is no suggestion that the legal use of data will slow any time soon or become any more harmful to consumers.
Just as importantly, consumers see data as a commodity to be used for advantage just as much as the companies that use it do. In a study published by Salesforce earlier this year, over 60% of UK consumers said that personalized offers have a direct influence on brand loyalty, and 55% of UK consumers felt that sharing personal information was a reasonable price for receiving those personalized offers.
Given the accelerating rate at which data is being accumulated, providing those personalized experiences will require feeding significant volumes of data into increasingly complex machine learning engines for analysis.
This is why the Government's new report, "Growing the Artificial Intelligence Industry in the UK", is so important. The report makes 18 recommendations, the most interesting of which, to my mind, is that industry and government work together to create "data trusts" – each a network of contractual relationships that allows data to be shared in a "fair, safe and equitable way."
In other words, if a machine learning company needed access to a particular data set in order to complete its analysis for a customer, by participating in a data trust it would know the terms applicable to that use and would not be delayed by the need to negotiate those terms with the source or provider of the data set in question. Resources otherwise expended in administering those relationships could, instead, be deployed to develop and enhance the quality and relevance of available data.
And the better the data, the more accurate the results – which means individualised offers and communications to customers feel more organic to those customers, and the idea of adding value through targeting can finally start to shed its undeserved negative reputation.
If you're ready to shout about your use of data as a currency from the rooftops (or just discuss it around a conference table), please get in touch.