Information Commissioner publishes blog piece on facial recognition technology and law enforcement

The Information Commissioner, Elizabeth Denham, says that the use of biometric data, including databases of facial images, in conjunction with Automatic Facial Recognition Technology (FRT), is a technological development that intrudes into people's privacy. The technology has been available for some time, but its ability to be linked in real time to different online databases via mobile and fixed camera systems greatly increases its reach and impact, she says.

Ms Denham says that FRT is increasingly deployed by police forces at public events. While there may be significant public safety benefits from using FRT (enabling the police to apprehend offenders and prevent crimes from occurring), the way FRT is used in public spaces can be particularly intrusive. Ms Denham says there is "a lack of transparency about its use" and a real risk that the public safety benefits derived from the use of FRT will not be gained if public trust is not addressed.

In Ms Denham's view, a robust response to the many unanswered questions around FRT is "vital to gain this trust". How does the use of FRT in this way comply with the law? How effective and accurate is the technology? How do forces guard against bias? What protections are there for people who are of no interest to the police? How do the systems guard against false positives and their negative impact?

Ms Denham explains that a key component of any FRT system is the underlying database of images against which the system matches. The use of images collected when individuals are taken into custody is of concern; there are over 19 million images in the Police National Database (PND). Ms Denham is also considering the transparency and proportionality of retaining these photographs as a separate issue, particularly for those arrested, but not charged, for certain offences.

For the use of FRT to be legal, police forces must have clear evidence to demonstrate that the use of FRT in public spaces is effective in resolving the problem it aims to address, and that no less intrusive technology or methods are available to address that problem, Ms Denham says. The GDPR, coming into effect next week, requires organisations to assess the risks of using new and intrusive technologies, particularly those involving biometric data, in a data protection impact assessment, and to consult the ICO when the risks are difficult to address.

Ms Denham says she has identified FRT use by law enforcement as a priority area for the ICO and has written to the Home Office and the NPCC setting out her concerns. Should her concerns not be addressed, she says, she will consider what legal action is needed to ensure the right protections are in place for the public. The blog piece is available in full on the ICO website.