A recent sweep by 26 data protection authorities ("DPAs") across the world revealed that a high proportion of mobile apps access large amounts of personal data without meeting their data privacy obligations.
How did they do?
Not well. Of the 1,200 apps surveyed, 85% failed to explain clearly how they were collecting and using personal information, 59% did not display basic privacy information, and one in three requested excessive personal information. Another common finding was that many apps failed to tailor privacy information to the small screen.
The Information Commissioner's Office ("ICO"), as the UK's DPA, surveyed 50 apps, including many household names, and its results were in line with the global figures.
Rare examples of good practice included pop-up notifications asking permission before additional data was collected, and basic privacy information with links to more detailed information for users who wished to know more.
What are they told to do?
As a result, Ofcom and ICO have produced some guidance for users on how to use apps safely. It is written in consumer-friendly language and contains straightforward advice on some common pitfalls such as checking content ratings or always logging out of banking apps.
Contrast this with the 25-page guidance from ICO aimed at app developers, which has drawn some criticism for being overly lengthy and complex. Rather than participate in research and point to long guidance documents, it would be more effective to promote simple rules (e.g. requiring pop-up notifications) and to hold app stores accountable for non-compliance. However, usability tests show that users are irritated enough by constant pop-ups to stop using a site or app altogether, so developers are reluctant to implement them.
Why is it important?
The lack of compliance is all the more alarming when read in conjunction with Ofcom research that surveyed a range of UK app users. Users view the app environment as a safer, more contained space than browser-based internet access. Many believed apps were discrete pieces of software with little interconnectivity, and were unaware of the threat of viruses or that apps can continue to run in the background. There is implicit trust in established brands and recognised app stores, which users felt must monitor and vet all apps before selling them. Peer recommendation also played a significant role in deciding whether to download an app.
This means little, if any, attention is paid to privacy policies and permission requests. Users interviewed generally felt that full permission had to be granted before using an app, and were frustrated by their inability to accept some permissions while refusing others.
So what's the risk for developers?
ICO has the power to fine companies that breach the relevant laws up to £500,000. The threshold for issuing a fine is high, however, and this power has not yet been used in the context of mobile apps. Having said this, we know that 'internet and mobile' is one of ICO's priority areas for enforcement action.
Perhaps a more realistic, and potentially more damaging, risk is the reputational and brand damage associated with being named and shamed publicly. Where a lower level of harm has been caused, ICO is more likely to seek undertakings that the offending company will change its practices. As we know, ICO publishes its enforcement actions on its website. For a company whose business model relies on processing data, and on peer recommendations as the main way to grow its user base and brand, the trust of its users is paramount and hard to rebuild once lost.
ICO has said it will be contacting app developers who need to improve their data collection and processing practices. The next stage for persistent offenders would be enforcement action.
Developers would be wise to pay attention. Even if enforcement action is not in itself a concern, ICO's research showed that almost half of app users have decided against downloading an app due to privacy concerns. If that is correct, privacy matters to mobile app users and could make or break a new app.
Update 12 December 2014
The 26 DPAs that took part in the global sweep have since written to seven major app stores, including those of Apple, Google and Microsoft. In their letter of 9 December, they urge the marketplaces to make links to privacy policies mandatory, rather than optional, for apps that collect personal data.