Tip of the iceberg for tech regulation: automated facial recognition technology

14/06/2019

Judgment of the High Court is awaited in relation to a judicial review claim brought by Ed Bridges against the Chief Constable of South Wales Police, alleging that the police force's use of automated facial recognition technology ('AFR') infringed his human rights and breached data protection legislation. The case represents a microcosm of the debate on how to regulate technology to harness societal benefits whilst protecting individual rights.

AFR is camera technology, currently used by a small number of police forces in the UK, which scans crowds and uses an algorithm to map individuals' unique facial characteristics into a numerical representation. That representation is compared with measurements of faces held on a database, sometimes called a watch list, in an attempt to match an individual's face to existing data. Bridges' face was captured by South Wales Police in Cardiff. He claims that this is akin to fingerprints or DNA being taken without his consent, interfering with his rights under Articles 8, 10, 11 and 14 of the ECHR, the rights of the data subject protected by Chapter 3 of the Data Protection Act 2018 ('DPA'), and the duty on public authorities to eliminate discrimination, harassment and victimisation contained in section 149(1) of the Equality Act 2010. Proponents claim AFR can free up police resources, but critics fear the chilling effect it may have on freedom of expression and democratic participation. Others point to the inaccuracy of the technology (many facial recognition algorithms misidentify women and ethnic minorities) and the intrinsic bias which may result.
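
The matching step described above can be pictured as comparing a numerical face template captured from a camera feed against each template held on a watch list, and flagging any comparison that scores above a similarity threshold. The short Python sketch below is purely illustrative: the watch-list entries, the 128-dimension templates, the cosine-similarity measure and the 0.9 threshold are all hypothetical assumptions, not details of any police deployment.

    import numpy as np

    # Hypothetical watch list: each entry maps an identity to a 128-dimension
    # face template (embedding). Real systems derive these with proprietary models.
    watch_list = {
        "person_a": np.random.rand(128),
        "person_b": np.random.rand(128),
    }

    SIMILARITY_THRESHOLD = 0.9  # hypothetical cut-off for declaring a match

    def cosine_similarity(a, b):
        # Cosine similarity between two face templates.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_against_watch_list(probe):
        # Return every watch-list identity whose similarity to the probe
        # template meets or exceeds the threshold.
        matches = []
        for name, template in watch_list.items():
            score = cosine_similarity(probe, template)
            if score >= SIMILARITY_THRESHOLD:
                matches.append((name, score))
        return matches

    # A face captured from a live feed would be converted to a template
    # and compared against every entry on the watch list.
    probe_template = np.random.rand(128)
    print(match_against_watch_list(probe_template))

In a deployed system the critical policy questions sit around this loop: who is placed on the watch list, what threshold is applied, how long probe images are retained, and whether a human reviews any match before action is taken.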

Like much recently developed technology, AFR exists in a legislative, regulatory and policy lacuna. Responsibility for regulating AFR falls across the remits of the Surveillance Camera Commissioner (whose role is 'simply advisory'), the Biometrics Commissioner and the Information Commissioner, leading to a lack of clear accountability. Whether and how AFR is used is left largely to the discretion of police forces, subject to compliance with the Human Rights Act 1998 and the DPA. Whilst the use of DNA evidence, fingerprints and footwear impressions is regulated by the Police and Criminal Evidence Act 1984, there are no equivalent provisions for AFR. This is because AFR is interpreted as being caught by the definition of 'surveillance camera systems' under section 29(6) of the Protection of Freedoms Act 2012 ('PoFA') and is therefore subject to its light-touch regime, rather than falling within the strictures of the Regulation of Investigatory Powers Act 2000 ('RIPA'), which provides a robust framework for authorising the use of covert surveillance. It is, however, not clear that AFR should be categorised in this way. Guidance issued by the Surveillance Camera Commissioner in March 2019 ('the March 2019 Guidance') states that 'If [members of the public] are not made aware [that such a system is in use] this may be considered as covert surveillance and therefore fall within the bounds of RIPA'. No guidance has been issued as to what constitutes 'awareness', or how authorities deploying AFR at events such as football matches and parades can reasonably claim to have given notice to the public.

PoFA requires only that an authority intending to use AFR 'have regard' to a code of practice, and failure to do so does not lead to any criminal or civil liability. The Surveillance Camera Code of Practice, issued in June 2013 ('SC Code'), sets out 12 'guiding principles', including one which essentially restates individuals' Article 8 rights, noting that use of AFR 'must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need'. The March 2019 Guidance states that authorities must conduct a detailed risk assessment process, including data protection considerations, prior to deployment.

The March 2019 Guidance allows authorities to decide not to adopt the provisions of the SC Code so long as they provide a 'detailed and auditable rationale for doing so'. There is no guidance on how long a person's image may be stored or how it may be used. The lack of legal certainty is unhelpful both for police forces, which are at risk of legal challenge, and for members of the public, who are concerned about their rights. Judgment in Bridges' case will hopefully be followed by fresh legislation and/or guidance to provide clarity, and specifically to confirm whether AFR is to be treated as overt or covert surveillance, how notice must be given to the public, and whether human intervention is a prerequisite for its use. Whilst the judgment and formal legislative proposals are awaited, public authorities intending to deploy AFR will need to meticulously weigh up the proportionality of its use on a case-by-case basis.

There are calls from industry and policy-makers for the regulation of technology in a number of spheres. It is clear from the issues that arise in relation to AFR that striking a regulatory balance between technical innovation and the protection of individual liberties will be a complex legislative challenge.