May 23, 2022
The EDPB Guidelines explain that facial recognition technology (FRT) is built on the processing of biometric data and therefore involves the processing of special categories of personal data. FRT often relies on artificial intelligence (AI) or machine-learning components. While this enables large-scale data processing, it also introduces the risk of discrimination and false results, and is prone to interfere with fundamental rights, including the right to the protection of personal data. FRT may be used in controlled one-to-one situations as well as in large crowds and at major transport hubs.
Personal data protection in the context of law enforcement is covered by the Law Enforcement Directive (EU) 2016/680 (LED), which sets out rules relevant to the use of FRT, e.g., Article 3(13) (definition of “biometric data”), Article 4 (principles relating to the processing of personal data), Article 8 (lawfulness of processing), Article 10 (processing of special categories of personal data) and Article 11 (automated individual decision-making).
Given the effect of FRT on fundamental rights, the Guidelines explain that the EU Charter of Fundamental Rights is also essential to the interpretation of the LED, in particular the right to the protection of personal data under Article 8 of the Charter, but also the right to privacy under Article 7 of the Charter.
Further, the Guidelines explain, legislative measures that provide a legal basis for the processing of personal data interfere directly with the rights guaranteed by Articles 7 and 8 of the Charter. The processing of biometric data constitutes, under all circumstances, a serious interference in itself, irrespective of the outcome, e.g. a positive match. Any limitation on the exercise of fundamental rights and freedoms must be provided for by law and respect the essence of those rights and freedoms.
In addition, the legal basis must be sufficiently clear to give citizens an adequate indication of conditions and circumstances in which authorities are empowered to collect data and undertake secret surveillance. A mere transposition into domestic law of the general clause in Article 10 LED would lack precision and foreseeability.
Legislative measures must also be appropriate to the legitimate objectives pursued. An objective of general interest, however fundamental it may be, does not in itself justify a limitation of a fundamental right. Legislative measures should differentiate and target the persons covered in the light of the objective pursued, e.g. fighting specific serious crime. If a measure covers everyone in a general manner, it intensifies the interference; the same is true if the data processing covers a significant part of the population.
The data must therefore be processed in a way that ensures the applicability and effectiveness of EU data protection rules and principles. The assessment of the necessity and proportionality of the processing must identify and consider all possible implications for other fundamental rights.
The Guidelines address lawmakers at EU and national level, as well as law enforcement authorities implementing and using FRT systems. They are also relevant to individuals, whether generally interested or as data subjects, in particular as regards data subjects’ rights.
The Guidelines are intended to inform about the properties of FRT and the applicable legal framework in the context of law enforcement (in particular the LED). In addition, they provide a tool to help with an initial classification of the sensitivity of a given case and contain practical guidance for law enforcement authorities that wish to procure and run an FRT system. They also contain case examples and list the relevant considerations for the necessity and proportionality test.
The Guidelines are open for consultation until 27 June 2022. To access the Guidelines and for details on responding to the consultation, click here.