
Facial Recognition in Law Enforcement: Controversies, Risks and Regulation

27 February 2019

Last Monday (25/02), the TV newscast Bahia Meio Dia revealed that facial recognition technology will be deployed along the party circuits of the city of Salvador’s carnival for public safety purposes. The measure follows statements by the State Secretary of the Military Police of Rio de Janeiro, according to whom the police will use facial recognition software supplied by the carrier Oi to identify wanted criminals and stolen cars’ license plates during the same period. In today’s post, we examine some of the ethical, legal and political implications of this technology.

What is automated facial recognition? What are its uses?

Automated facial recognition (AFR) is a biometric identification technology based on the collection of facial data from photographs or video segments. These systems commonly, though not exclusively, operate by extracting mathematical representations of specific facial features, such as the distance between the eyes or the shape of the nose, from which a facial pattern is produced. By comparing the pattern of a particular face against others held in an existing database, one can, for example, identify unknown individuals or authenticate known ones.
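
To make the matching step concrete, here is a minimal sketch in Python. It assumes the open-source face_recognition library (a wrapper around dlib), which reduces each face to a 128-dimensional encoding and compares encodings by Euclidean distance; the file names are hypothetical.

    # Minimal sketch of one-to-one face matching, assuming the open-source
    # `face_recognition` library (a dlib wrapper). File names are hypothetical.
    import face_recognition

    # Extract the mathematical representation (a 128-dimensional encoding)
    # of the first face found in each image.
    known_image = face_recognition.load_image_file("known_person.jpg")
    probe_image = face_recognition.load_image_file("camera_frame.jpg")
    known_encoding = face_recognition.face_encodings(known_image)[0]
    probe_encoding = face_recognition.face_encodings(probe_image)[0]

    # Compare the two patterns: a Euclidean distance below a tolerance
    # threshold (0.6 is the library's default) is treated as a match.
    # Lowering the threshold trades false positives for false negatives.
    distance = face_recognition.face_distance([known_encoding], probe_encoding)[0]
    print("match" if distance < 0.6 else "no match", f"(distance: {distance:.2f})")

Identifying an unknown face works the same way, except that the probe encoding is compared against every entry in a database rather than a single known encoding.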
In recent years, there has been a great deal of interest in this type of system across several sectors. Much has been said about Face ID, the unlock mechanism implemented by Apple in its recent products. Other possible uses include identifying people in historical photographs for research, establishing the identities of missing persons, and even diagnosing genetic diseases. These beneficial possibilities have fueled expectations of rapid adoption of such solutions in various fields, expectations that often omit the technology’s risks, harmful uses and problems.
One of the areas in which this interest has emerged most prominently is law enforcement, as evidenced by the aforementioned uses in the carnivals of Bahia and Rio de Janeiro. Since last year, Bill 9763/2018 has been under consideration in the Chamber of Deputies; it would amend the Criminal Enforcement Law to mandate the use of this type of system on the prison population. In January, Social Liberal Party MPs drew considerable attention when they traveled to China to learn about that country’s AFR system, which would inspire a bill aimed at implementing the technology in public places.
However, the use of this type of resource involves a number of problems and controversies, especially in public places and for security purposes. Indeed, legislators in several US cities have gone as far as proposing bans on the use of such systems by the State. To understand what has led these regulatory contexts to this point, it is necessary to examine some of the main controversies and risks involved in AFR.

Inaccuracy issues, discrimination and information security risks

One of the most basic problems involving such technologies is their effectiveness. Despite the great enthusiasm surrounding the resource, it is necessary to consider the limitations that still characterize its current stage of development. Facial recognition systems produce significant results when the analyzed images are frontal photographs with good illumination and resolution. However, as a report by the Electronic Frontier Foundation (EFF) shows, accuracy rates fall sharply under several conditions.

Error rates increase when low-resolution images from video segments are analyzed, as well as with variations in illumination, image background, pose, facial expression, shadows, and camera distance. Moreover, facial similarities within the same population mean that the larger the database used, the greater the probability of false positives – occurrences in which the system incorrectly matches the analyzed face to a person to whom it does not actually correspond.
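
A back-of-the-envelope calculation illustrates why database size matters. If each individual comparison carries a small probability p of a false match, the chance that a single probe face wrongly matches at least one of n database entries is 1 − (1 − p)^n. The sketch below uses an assumed, purely illustrative per-comparison rate; real systems vary widely.

    # Illustrative only: probability that one probe face falsely matches
    # at least one entry when searched against a database of n faces.
    def false_positive_probability(p: float, n: int) -> float:
        # At least one false match among n independent comparisons.
        return 1 - (1 - p) ** n

    p = 1e-6  # assumed one-in-a-million false match rate per comparison
    for n in (1_000, 100_000, 10_000_000):
        print(f"database of {n:>10,} faces: {false_positive_probability(p, n):.1%}")

    # Output:
    # database of      1,000 faces: 0.1%
    # database of    100,000 faces: 9.5%
    # database of 10,000,000 faces: 100.0%

Even a system that errs only once per million comparisons becomes nearly certain to produce false matches when searching nation-scale databases.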

These errors disproportionately affect racial minorities and women, as current studies indicate – one of which found false positive rates of 40% for non-white people, compared with only 5% for white people. The bias is compounded in law enforcement because of the historical relations of inequality that shape the conditions under which many of the databases in use were produced. Socially vulnerable populations could therefore be subjected to automated constraint and violence, such as improper police approaches and the false attribution of criminal records.
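
To gauge what such a disparity means in practice, consider a rough worked example, reading the reported rates as the share of scanned faces wrongly flagged. The error rates below are the figures from the study cited above; the crowd composition is a hypothetical assumption.

    # Rough illustration of disparate impact at crowd scale. The error
    # rates are the figures reported in the study cited in the text;
    # the crowd sizes are hypothetical.
    fpr_non_white = 0.40  # reported false positive rate for non-white people
    fpr_white = 0.05      # reported false positive rate for white people

    scanned = 10_000  # hypothetical number of people scanned per group
    flagged_non_white = scanned * fpr_non_white  # 4,000 wrongful flags
    flagged_white = scanned * fpr_white          # 500 wrongful flags

    print(f"non-white people wrongly flagged: {flagged_non_white:,.0f}")
    print(f"white people wrongly flagged:     {flagged_white:,.0f}")
    print(f"disparity: {flagged_non_white / flagged_white:.0f}x")

At identical crowd sizes, eight times as many non-white people would be wrongly flagged – and each wrongful flag can translate into a police approach.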

In addition, from an information security standpoint, there are risks inherent to any biometric identification system (iris or fingerprint scanning, for example), mainly because of the difficulty of changing this data once its security is compromised, as could occur in a data leak. Unlike our irises or fingerprints, however, images of our faces can easily be collected without our consent or even our knowledge. This opens the door to covert, non-consensual biometric surveillance of entire populations.

As the EFF states: “We expose our faces to public view every time we go outside, and many of us share images of our faces online with almost no restrictions on who may access them. Face recognition therefore allows for covert, remote, and mass capture and identification of images.”


Ethics is the starting point. Oversight and accountability should be the point of arrival.

Last year, Brazil approved its General Data Protection Act (GDPA), which will come into force in August 2020 and will be the main national legislative framework for the processing of personal data. The Act provides (Article 4, item III) an exception for the processing of data exclusively for public security purposes, which is to be governed by a specific law. If, on the one hand, this creates uncertainty about the future of the matter, the same provision also expressly states that the specific law must observe the general principles of data protection and the rights of the data subject.

These principles include (Article 2) informational self-determination, freedom of expression, respect for privacy, the free development of personality, and the inviolability of intimacy, honor and image. The continuous, covert and non-consensual monitoring implied by the deployment of facial recognition in public places threatens these principles, as well as other constitutionally guaranteed rights, such as freedom of assembly, freedom of association and the presumption of innocence. The technology must therefore be approached with extreme caution.

In this sense, a consensus is gradually emerging among different actors regarding certain ethical foundations for the use of AFR. In the private sector, for example, companies such as Microsoft and Amazon have recognized the need for broad regulation based on the ideals of public transparency, consent, due process, accuracy and non-discrimination. But, as the AI Now Institute argues, ethical commitments are not enough. Efficient systems of supervision, auditing, monitoring and accountability are needed at each stage of the development and deployment of these products.

Such systems should include external and internal mechanisms for civil society participation, institutional protections for whistleblowers – who have been instrumental in exposing rights abuses in the IT industry in recent years – and the waiving of trade secrecy when such technologies are used in the public sector. Without concrete guarantees of this sort, society is held hostage to public and business discourses about technologies that, in practice, remain “black boxes.”


Conclusion

AFR’s potential for social benefit in many areas does not dispel society’s ethical, regulatory and political concerns about its immense potential for authoritarian use. The combination of biometric identification, ubiquitous remote data collection and the continuous monitoring of many individuals at once has few parallels in the potential for abuse of power it enables. For this reason, the issue must be addressed with the utmost caution, especially in the field of law enforcement. It is also worth remembering that a recent US court decision found it unlawful for police authorities to compel individuals to unlock devices protected by facial recognition.

Moreover, as argued above, the search for solutions to the problems inherent in this type of system cannot be limited to companies’ adherence to ethical commitments, nor to an appeal to models of “design justice.” Code is a product of the social and political relations in which it emerges, which means those relations must be placed at the center of any solution. Recognizing the problems and risks of the technology is an important step, but we need to move beyond ethical discourse. It is necessary to talk about continuous and effective oversight and accountability.

Want to know more about the controversies surrounding facial recognition? Read our post on the use of artificial intelligence and facial recognition to determine people’s sexual orientation.


The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Institute for Research on Internet and Society.

Written by

Director at the Institute for Research on Internet and Society. Gustavo holds a bachelor’s degree in Anthropology from the Federal University of Minas Gerais (UFMG), and is currently undertaking a Master’s degree in Communication of Science and Culture at the University of Campinas (Unicamp). Member of the Brazilian Internet Governance Research Network steering group. Alumnus of the Brazilian School of Internet Governance. His research and policy interests are anthropology of the State, privacy and data protection, science and technology studies, platform governance and encryption policy.
