filters out other possible sounds before sending the audio to a person for review.37 The available information suggests that gunshot detection systems are deployed disproportionately in communities inhabited by racially marginalized groups,38 and that they can have a very high error rate. The placement of gunshot detection systems in neighbourhoods where marginalized racial and ethnic groups live, combined with the systems' inaccuracies, exacerbates systemic biases within law enforcement.
29. There are many examples of how the use of automated identification technology has had life-altering consequences. In 2019, in the United States, a Black man in New Jersey was reportedly falsely arrested and held in jail for 10 days because of a facial recognition error. Despite the existence of exonerating evidence, the authorities did not drop his case for almost a year, during which he faced up to 25 years of imprisonment for the charges brought against him. The incident had a significant impact on the man's life.39 In February 2024, law enforcement officers in Chicago, responding to a false alert from ShotSpotter, reportedly opened fire on a child who was lighting fireworks.40 Another example of the use of this type of artificial intelligence technology is the reported adoption by the Israel Defense Forces of Wolf Pack, a vast database containing images and all available information on Palestinians from the West Bank, which integrates various surveillance programmes such as Blue Wolf and Red Wolf.41 Across the Old City of Hebron, the Israel Defense Forces reportedly installed artificial intelligence-powered cameras capable of identifying human faces, which are connected to the Blue Wolf programme, a mobile application that allows soldiers to detect and categorize Palestinians across the West Bank by means of an extensive biometric database in which most Palestinians have not consented to be enrolled, resulting in their ongoing surveillance. The rigorous application of the Wolf Pack system by the Israel Defense Forces exacerbates the apartheid perpetuated against Palestinians.42 These examples show the serious human rights implications that result from the use of artificial intelligence systems to make consequential decisions in high-risk settings.
(b) Predictive policing algorithms
30. Another form of artificial intelligence technology commonly used by law enforcement is predictive policing. Predictive policing tools make assessments, on the basis of location and personal data, about who is likely to commit future crimes and where future crime may occur.
31. Predictive policing can exacerbate the historical overpolicing of communities along racial and ethnic lines.43 Because law enforcement officials have historically focused their attention on the neighbourhoods in which such communities live, members of those communities are overrepresented in police records. This, in turn, affects where algorithms predict that future crime will occur, leading to increased police deployment in the areas in question.44
37 Jay Stanley, "Four problems with the ShotSpotter gunshot detection system", American Civil Liberties Union, 24 August 2021.
38 Ibid.; and MacArthur Justice Center, "ShotSpotter is deployed overwhelmingly in Black and Latinx neighborhoods in Chicago", available at https://endpolicesurveillance.com/burden-on-communities-of-color/.
39 Gerchick and Cagle, "When it comes to facial recognition, there is no such thing as a magic number"; and Khari Johnson, "How wrongful arrests based on AI derailed 3 men's lives", Wired, 7 March 2022.
40 Adam Schwartz, "Responding to ShotSpotter, police shoot at child lighting fireworks", Electronic Frontier Foundation, 22 March 2024.
41 Amnesty International, Automated Apartheid: How Facial Recognition Fragments, Segregates and Controls Palestinians in the OPT (London, 2023), pp. 41–45.
42 Sophia Goodfriend, "Algorithmic State violence: automated surveillance and Palestinian dispossession in Hebron's Old City", International Journal of Middle East Studies, vol. 55, No. 3 (2023).
43 Tim Lau, "Predictive policing explained", Brennan Center for Justice, 1 April 2020; and Jon Fasman, "The black box of justice: how secret algorithms have changed policing", Fast Company, 9 February 2021.
44 Kristian Lum and William Isaac, "To predict and serve?", Significance, vol. 13, No. 5 (2016); and Australian Human Rights Commission submission.