1. Law enforcement, security and the criminal justice system
(a) Automated identification
26. Law enforcement agencies use automated identification tools to connect what they observe in a particular environment to a potential “match” in a database. One of the most common types of automated identification tool is facial recognition technology. Facial recognition tools take video footage or photographs of a person and feed them into algorithms, which compare the images against a database of police photographs, driver’s licence photographs or other images with the goal of identifying the person.27 The designers of such tools train the underlying models through a process of machine learning, showing them images of faces so that the models learn to identify the distinguishing features of human faces.28 However, the image data sets used to train these models are not always demographically representative.29 In one study of a popular image database, researchers found an overrepresentation of men between the ages of 18 and 40 and an underrepresentation of people with dark skin.30 According to another study of commercially released facial recognition systems, gender classification algorithms are trained on data sets composed overwhelmingly of white male faces.31 The lack of racial, gender and cultural diversity in the training sets of artificial intelligence tools leads to one of the classic data problems described above: groups that are underrepresented in the training data, including those that experience intersectional forms of discrimination, are more likely to be erroneously matched by the algorithm.
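A toy simulation can make this failure mode concrete. The sketch below is not any vendor’s system: the embedding size, the match threshold, the database size and the assumption that the model learns fewer distinguishing features for faces it rarely saw in training are all illustrative. It shows the matching step described above, in which a probe image is compared against every database entry, and measures how often a person is wrongly “matched” to someone else in each group.

```python
# Illustrative sketch of 1:N face identification and of how an
# unrepresentative training set can skew error rates. All dimensions,
# thresholds and group labels are assumptions for the example only.
import numpy as np

rng = np.random.default_rng(0)
DIM = 64          # size of the face embedding vector (assumed)
THRESHOLD = 0.8   # similarity above which a "match" is declared (assumed)
PEOPLE = 500      # distinct identities in the photo database (assumed)

def embed(face, rank):
    """Project a face onto the features the model actually learned."""
    e = np.zeros(DIM)
    e[:rank] = face[:rank]            # features beyond `rank` are lost
    e += rng.normal(0.0, 0.05, DIM)   # small capture/model noise
    return e / np.linalg.norm(e)

# Assumption: the model learned 64 distinguishing features for the
# well-represented group but only 12 for the under-represented one.
for group, rank in (("well represented", 64), ("under-represented", 12)):
    faces = rng.normal(0.0, 1.0, (PEOPLE, DIM))
    gallery = np.array([embed(f, rank) for f in faces])  # database photos
    false_matches = 0
    for i, face in enumerate(faces):
        probe = embed(face, rank)     # a new photo of person i
        sims = gallery @ probe        # cosine similarity to every entry
        sims[i] = -1.0                # ignore person i's own entry
        if sims.max() > THRESHOLD:    # wrongly "identifies" someone else
            false_matches += 1
    print(f"{group}: false-match rate {false_matches / PEOPLE:.1%}")
```

Under these assumptions, the group whose faces the model represents with fewer learned features is crowded into a smaller region of the embedding space, so distinct individuals look alike to the algorithm and the false-match rate rises sharply, while it stays near zero for the well-represented group.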
27. It has been reported that the misidentification of faces by these technologies has led to an increased number of arrests of people of African descent.32 The Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression and the United Nations High Commissioner for Human Rights have noted that facial recognition tools often contribute to unlawful discrimination and racial profiling.33 Despite such human rights concerns, facial recognition systems have been deployed by law enforcement agencies in a number of countries. For example, the Government of India has reportedly invested significantly in such systems; the facial recognition system used by the Delhi Police was reported to be accurate in only 2 per cent of cases and to put minority communities at a disproportionate risk of misidentification and false arrest.34 Brazilian law enforcement officials have reportedly falsely accused and arrested individuals on the basis of faulty facial recognition matches: according to a 2019 study, 90 per cent of the people arrested in Brazilian cities on the basis of facial recognition technology were of African descent.35
28. Gunshot detection systems are another common type of automated identification tool used by law enforcement officials in a number of countries. One system, named ShotSpotter, involves placing sensors containing a microphone, a GPS unit, memory and processing capacity, and cellular connectivity in neighbourhoods.36 When the sensors detect a noise that could be a gunshot, an algorithm triangulates the location of whatever caused the noise, as the sketch below illustrates. The algorithm
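The standard technique behind such triangulation is time-difference-of-arrival multilateration: several sensors time-stamp the same bang, and the differences in arrival times constrain where the sound came from. The sketch below illustrates that step under stated assumptions; the sensor layout, the shot location and the least-squares solver are invented for the example and do not reflect ShotSpotter’s proprietary implementation.

```python
# Illustrative time-difference-of-arrival multilateration: recover a
# sound source's position from arrival times at fixed sensors.
import numpy as np

C = 343.0  # approximate speed of sound in air, m/s

def locate(sensors, times):
    """Least-squares source position from arrival times at >= 4 sensors."""
    p1, t1 = sensors[0], times[0]
    dt = times[1:] - t1                  # arrival delays relative to sensor 1
    # Subtracting the range equation for sensor 1 from the others yields a
    # system linear in (x, y, r1), where r1 is the distance to sensor 1.
    A = np.column_stack([2 * (sensors[1:] - p1), 2 * C * dt])
    b = (np.sum(sensors[1:] ** 2, axis=1) - np.sum(p1 ** 2)
         - (C * dt) ** 2)
    solution, *_ = np.linalg.lstsq(A, b, rcond=None)
    return solution[:2]                  # (x, y); solution[2] is r1

# Five sensors (metres) and the arrival times that a shot at (120, 80)
# would produce, to check that the recovered position matches.
sensors = np.array([[0.0, 0.0], [300.0, 0.0], [0.0, 300.0],
                    [300.0, 300.0], [150.0, 40.0]])
source = np.array([120.0, 80.0])
times = np.linalg.norm(sensors - source, axis=1) / C  # emission at t = 0
print(locate(sensors, times))            # ~ [120. 80.]
```

In practice the arrival times carry measurement noise and the initial sound classification can be wrong, so the reported location and the “gunshot” label are both probabilistic, not certainties.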
27 Marissa Gerchick and Matt Cagle, “When it comes to facial recognition, there is no such thing as a magic number”, American Civil Liberties Union, 7 February 2024.
28 Julia Dressel and Andrew Warren, “Breaking down data analytics and AI in criminal justice”, Recidiviz, 8 March 2022.
29 AI for the People submission.
30 Khari Johnson, “ImageNet creators find blurring faces for privacy has a ‘minimal impact on accuracy’”, VentureBeat, 16 March 2021.
31 Joy Buolamwini and Timnit Gebru, “Gender shades: intersectional accuracy disparities in commercial gender classification”, Proceedings of Machine Learning Research, vol. 81 (2018). See also Gerchick and Cagle, “When it comes to facial recognition, there is no such thing as a magic number”; AI for the People submission; and Internet Lab submission.
32 Gerchick and Cagle, “When it comes to facial recognition, there is no such thing as a magic number”.
33 See A/HRC/41/35 and A/HRC/48/31.
34 Amnesty International, “Ban the scan: Hyderabad”, available at https://banthescan.amnesty.org/hyderabad/.
35 Group of experts from Brazil submission.
36 Alisha Ebrahimji, “Critics of ShotSpotter gunfire detection system say it’s ineffective, biased and costly”, CNN, 24 February 2024.