
Facial recognition technology: the way of the future?

03 October 2019

Facial recognition technology is coming under fire from data protection regulators and privacy campaigners. As with so many new technologies, facial recognition presents many commercial opportunities, but also financial and reputational risks where it is used unlawfully.

Use and abuse

Facial recognition technology has been used in policing for years. Last year, a fugitive in China attracted headlines when he was apprehended after being picked out of a crowd of 60,000 concert-goers by facial recognition technology. Several police forces in the UK are considering widespread use of facial recognition technology. For example, the London Metropolitan Police has recently trialled technology which compares CCTV footage of public spaces against a database of mugshots.

However, its use goes far beyond this. Facial recognition technology is becoming more advanced and is being used in increasingly inventive ways. In an interview with the Evening Standard on 27 August, Tony Porter, the government’s Surveillance Camera Commissioner, described conventional facial recognition technology as “almost so yesterday” in the face of enhanced technologies such as lip-sync and gait recognition.

Many consumers are voluntarily handing over their likeness to companies in exchange for convenience and simple novelty value, for example the ability to unlock an iPhone by looking at it. For those of us who have a habit of mislaying our keys, ‘facentry’ tech is available for tech-savvy homes, hotels and offices. Amazon is interested in the potential of this technology, and its ‘Amazon Rekognition’ software is already sophisticated enough to identify happiness, surprise and fear on your face.
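
To give a sense of how accessible this kind of analysis has become, the sketch below shows how Amazon Rekognition’s publicly documented DetectFaces call can be asked to report emotions such as happiness, surprise and fear. It is a minimal illustration rather than anything described in this article: the file name and AWS region are placeholders, and it assumes the boto3 library is installed and AWS credentials are already configured.

```python
# Minimal sketch: asking Amazon Rekognition to report emotions for the faces
# in a single image. The region and file name below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="eu-west-2")

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests the full set of face attributes,
# which includes the Emotions list.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    # Each detected face carries emotions scored by confidence;
    # report the most confident one.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(top_emotion["Type"], round(top_emotion["Confidence"], 1))
```

The ease with which this kind of analysis can be run is precisely what makes the data protection questions below so pressing.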

Despite the opportunities it offers, and as with so many other forms of cutting-edge tech, businesses should be cautious when using facial recognition technology, given the legal and reputational risks of misuse.

It recently emerged that the property developer behind the King’s Cross estate was using facial recognition technology, and that the company which owns both offices and residential spaces in Canary Wharf is considering doing the same. The King’s Cross developer has since ceased using the technology pending an ICO investigation. This, together with other recent controversies, has prompted a sharp backlash from privacy campaigners and the threat of new legislation.

The legal framework

The use of facial recognition technology is already subject to data protection law, and fresh regulation may be in the pipeline.

Under the GDPR, organisations which use facial recognition technology need a lawful basis to do so, or to be able to rely on an exemption.

In most cases, organisations will need to obtain the consent of those whom they are monitoring in order to use facial recognition technology. The GDPR sets a high standard for consent. Consent must be freely given, specific, informed and unambiguous, and must be demonstrated by an affirmative action rather than by a failure to opt out. This means that consent to the use of facial recognition technology probably won’t be GDPR-compliant if it is buried in a long contract. Consent also has to be ‘unbundled’, so it will not usually be acceptable to require individuals to consent to facial recognition as a condition of a wider contract.

The issue is further complicated by the fact that facial recognition technology collects biometric data, which is categorised as ‘special category’ personal data and is subject to stricter rules.

This is difficult for many of the new commercial uses of facial recognition technology, not least because of the logistical difficulty of carrying out surveillance on one part of a crowd while excluding those who have not consented. In the context of surveillance of a public space, putting up a sign and treating a person’s entry into that space as consent is unlikely to meet the strict criteria for GDPR-compliant consent.

Even where organisations obtain valid consent for using facial recognition technology, this may not be enough to comply with data protection law. Organisations should be mindful that data protection law applies beyond the collection of personal data. In particular, companies using facial recognition technology should apply strict security measures to the data, retain it for as short a period as possible, and be able to respond in a timely manner to data subject access requests and right to be forgotten requests.

Different rules apply to the police. The High Court in Cardiff recently ruled that, whilst the use of facial recognition technology interferes with the privacy rights of individuals, its use by the police is not unlawful provided it is carried out within strict parameters.

The ICO has indicated that it takes facial recognition technology, and its potential for abuse, very seriously. On 15 August Elizabeth Denham, the Information Commissioner, announced that the ICO had launched an investigation into the use of facial recognition technology at the King’s Cross estate. In the announcement, she stated: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.” Businesses would be foolish to disregard this, particularly in view of the ICO’s recent demonstrations that it is more than willing to exercise its powers to issue enormous fines. You can read our earlier articles on the ICO’s fines for British Airways here, and Marriott here.

New legislation is also in the pipeline. The European Commission is planning regulation that will target the indiscriminate use of facial recognition technology, and put in place stricter rules to ensure that individuals have the right to know when they are being monitored.

Conclusion

Facial recognition technology offers exciting opportunities. However, businesses should be extremely cautious in view of the ongoing ‘face-off’ between tech and privacy campaigners.
