41. Many States are experimenting with incorporating emerging digital technologies into their welfare systems,90 in ways that reinforce racially discriminatory structures.
Australia has implemented the Online Compliance Intervention system, colloquially known
as robo-debt.91 This automated decision-making system uses machine learning algorithms
to identify suspected overpayment of government welfare benefits and demands
documentation from those recipients marked as having received more than they were
entitled to in welfare payments. The system sent out approximately 20,000 debt letters each
week for a six-month period in 2016 and 2017. An investigation estimated that between 20
and 40 per cent of debt letters were false positives, arising from flaws in the system's processes and data. The State shifted the onus onto welfare recipients to prove that they did not
owe the State a debt. Because indigenous Australians receive welfare benefits at higher rates than white Australians,92 they bear the greatest cost of this system's flaws, while being the worst equipped to challenge them given the barriers that they face. A recent human rights
intervention in judicial proceedings highlights similar concerns in the Netherlands, where
use of emerging digital technologies in the provision of social welfare has resulted in
human rights violations against the poorest and most vulnerable in that country.93 There,
too, racial and ethnic minorities face disproportionate socioeconomic marginalization,
raising pressing concerns that class discrimination is also racial discrimination.
42. As States increasingly use emerging digital technologies to calculate risk and classify need, as exemplified by countries such as Denmark, New Zealand, the United Kingdom and the United States,94 greater scrutiny of their potential to have a disparate
impact on racial or ethnic minorities must be a State priority. Because digitalization of
welfare systems occurs in societies in which groups are marginalized, discriminated against
and excluded on a racial and ethnic basis, these systems are almost guaranteed to reinforce
these inequities, unless States actively take preventive steps. Without urgent intervention,
digital welfare states risk entrenching themselves as discriminatory digital welfare states.
43. In some cases, although the racially discriminatory structures are sectoral, for example in criminal justice, they nonetheless holistically undercut the human rights of those
affected and reinforce their structural oppression in society. Such is the case in the United
States, where emerging digital technologies sustain and reproduce racially discriminatory
structures in the administration of criminal justice. There, emerging digital technologies are
common not only in policing but also in the justice system, where they have been
associated with discriminatory outcomes for racial and ethnic minorities. Several states in
the United States use artificial intelligence risk assessment tools in every step of the
criminal justice process. The developers intend these systems to provide objective, data-driven justice outcomes,95 but the algorithms often rely on “data produced during
documented periods of flawed, racially biased, and sometimes unlawful practices and
policies”.96 As these algorithms affect sentencing, they can violate an individual’s rights to
equality before the law, to a fair trial, and to freedom from arbitrary arrest and detention.
These risk assessments often weigh factors such as prior arrests and convictions, parental
criminal record, postal code and so-called “community disorganization”.97 As the authors of
one study find: “These factors reflect over-policing, the behaviours of law enforcement in
Black and brown communities, larger patterns of socioeconomic disadvantage resulting
from the racial caste system, rather than anything about the behaviours of people who are
90  See A/74/493.
91  See www.unswlawjournal.unsw.edu.au/forum_article/new-digital-future-welfare-debts-withoutproofs-authority and www.ombudsman.gov.au/__data/assets/pdf_file/0022/43528/ReportCentrelinks-automated-debt-raising-and-recovery-system-April-2017.pdf.
92  See www.aihw.gov.au/reports/australias-welfare/australias-welfare-2019-datainsights/contents/summary.
93  See www.ohchr.org/Documents/Issues/Poverty/Amicusfinalversionsigned.pdf.
94  A/74/493, para. 27.
95  See www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
96  See https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3333423.
97  Submission by the New York University Center on Race, Inequality, and the Law.