real concern is that this high approval rate may well result from a presumption of
technological objectivity and neutrality as discussed. The Constitutional Court of Colombia
uses Prometea to filter tutelas, or individual constitutional rights complaints, and decide
which to hear.78 The concern with Prometea and many other such artificial intelligence
systems is the “black box” effect – the basis of their decision-making is opaque, making it difficult or impossible for judges, other court officials and litigants (and even the public authorities that commission these systems) to detect bias in their design, inputs or outputs.
Because of this opacity, the impact that Prometea has or could have on racial and ethnic minorities cannot be assessed, but the risk is high that such systems will reinforce or exacerbate existing racial and ethnic disparities in the justice systems in which they are deployed.
35. In the criminal justice context, police departments in different parts of the world use
emerging digital technologies for predictive policing, in which artificial intelligence
systems pull from multiple sources of data, such as criminal records, crime statistics and the
demographics of neighbourhoods.79 Many of these data sets reflect existing racial and ethnic bias, so the systems built on them operate in ways that reinforce racial discrimination, despite the presumed “objectivity” of these technologies or even their perceived potential to mitigate the bias of the human actors they supplement or replace. Furthermore, police departments tend to deploy predictive technologies disproportionately in impoverished communities composed predominantly of racial and ethnic minorities.
36. The United Kingdom of Great Britain and Northern Ireland, for example, uses a database known as the Gangs Violence Matrix, which has been demonstrated to be discriminatory.80 Police officers reportedly make assumptions about individuals based on their race, gender, age and socioeconomic status, and the Matrix further reinforces those stereotypes.81 The result is that 78 per cent of individuals on the Matrix are black, and an
additional 9 per cent are from other ethnic minority groups, while the police’s own figures
show that only 27 per cent of those responsible for serious youth violence are black. The
police also share the Matrix with other agencies, such as job centres, housing associations and educational institutions, leading to discrimination against individuals on the basis of their supposed gang affiliation. Depending on how this information is shared, it may entail violations of the right to privacy and may affect housing and employment rights on a discriminatory basis. Those whose names are on the Matrix experience “multiple stop and search encounters which seemingly lack any legal basis”.82 Some report that police have stopped and searched them 200 times; others report as many as 1,000 times, with some reporting multiple stops every day. This has an
impact on individuals’ rights to freedom from interference with their privacy and their
freedom from arbitrary arrest on an ethnically discriminatory basis.
37. By way of another example, one submission highlighted that predictive policing was becoming the methodology used in local policing in so-called crime prevention strategies in cities in the United States such as Los Angeles.83 Until recently, the
Los Angeles Police Department had been using technology called PredPol to examine 10
years of crime data, including the types, dates, locations and frequency of crimes, to predict
when and where crimes would likely occur over the next 12 hours. These data, gathered and
categorized by police officers, are both the product and the cause of heightened surveillance
in black and Latinx communities. Predictive policing replicates and exacerbates existing biases in the policing system, while lending it a guise of objectivity through the use of supposedly neutral algorithmic decision-making. Although the Los Angeles Police
Department has suspended its use of PredPol, it has not disavowed use of other predictive
policing products that are likely to raise similar concerns.
78 See www.ambitojuridico.com/noticias/informe/constitucional-y-derechos-humanos/prometea-inteligencia-artificial-para-la (in Spanish).
79 Submission by the Association for Progressive Communications.
80 A/HRC/41/54/Add.2, para. 40.
81 See www.amnesty.org.uk/files/reports/Trapped%20in%20the%20Matrix%20Amnesty%20report.pdf.
82 See www.stop-watch.org/uploads/documents/Being_Matrixed.pdf.
83 Submission by the Stop LAPD Spying Coalition.