This study explores whether face recognition technology – as intensively used by state and local police departments and law enforcement agencies – is free of racism or is instead affected by racial bias and racist prejudice, thereby reinforcing overall racial discrimination.
The study investigates the causal pathways through which face recognition technology may reinforce racial disproportion in law enforcement; it also asks whether the technology further discriminates against black people by making them experience more racial discrimination and self-identify more strongly as black – two conditions that are shown to be harmful in various respects.
This study shows that face recognition technology, as it is produced, implemented and used in Western societies, reinforces existing racial disparities in stop, investigation, arrest and incarceration rates because of racist prejudice, and even contributes to strengthening the unhealthy effects of racism on historically disadvantaged racial groups, such as black people.
The findings are intended to make law enforcement agencies and software companies aware that they must take adequate action against the racially discriminatory effects of the use of face recognition technology.
This study highlights that no implementation of an allegedly racism-free biometric technology is safe from the risk of racial discrimination, simply because every implementation rests on a society that remains affected by racism in many persistent ways.
While the ethical scrutiny of biometric technologies is traditionally framed in the discourse of universal rights, this study explores an issue that has so far received little attention: how face recognition technology affects distinct racial groups differently and how it contributes to racial discrimination.
The authors would like to thank Massimo Tistarelli for his generous support. Thanks also go to Gregor Pipan and all the people at Xlab, Ljubljana, Slovenia, who provided insight and expertise that greatly assisted our work. This work has been fully supported by the IDENTITY Project – Computer Vision Enabled Multimedia Forensics and People Identification, H2020-MSCA-RISE-2015, n. 690907.
Bacchini, F. and Lorusso, L. (2019), "Race, again: how face recognition technology reinforces racial discrimination", Journal of Information, Communication and Ethics in Society, Vol. 17 No. 3, pp. 321-335. https://doi.org/10.1108/JICES-05-2018-0050
Copyright © 2019, Emerald Publishing Limited