electronic health records.65 Integrated directly into existing electronic health records, Epic’s artificial intelligence tool estimates the likelihood that a patient will miss an appointment by using the patient’s personal information, including ethnicity, class, religion and body mass index, as well as the patient’s record of prior no-shows. In pointing out the obvious potential to discriminate against vulnerable patient populations, the researchers note that “removing sensitive personal characteristics from a model is an incomplete approach to removing bias.”66 Prior no-shows, for example, likely correlate with socioeconomic status, mediated by the patient’s inability to cover transportation or childcare costs, or to take time off work for the appointment. They also likely correlate with race and ethnicity, because socioeconomic status itself correlates with race and ethnicity.67 Another recent study revealed that black patients were more likely to be scheduled into overbooked appointment slots and thus had to wait longer when they did show up.68
32. In the housing context, studies in the United States have shown ethnic discrimination in Facebook’s targeted advertising. Facebook used to allow advertisers to “narrow audience” by excluding Facebook users with certain “ethnic affinities” under the “demographics” category of its ad-targeting tool.69 This targeted advertising could be used to prevent black people from viewing specific housing advertisements, a practice prohibited under United States anti-discrimination law. Facebook controls an estimated 22 per cent of the digital advertising market in the United States,70 and its targeted advertising, which is the core of the company’s business model,71 has been shown to be racially exclusionary.72 These practices are best understood as a form of digital redlining, defined as “the creation and maintenance of technology practices that further entrench discriminatory practices against already marginalized groups”.73 Facebook uses targeted advertising in the employment context as well, raising similar concerns.
33. In yet other cases, access to technology – and the information available through it – is denied in ways that have disparate impacts, or that target specific racial, ethnic or religious groups, sometimes on a discriminatory basis. In 2019, multiple States, including Bangladesh, the Democratic Republic of the Congo, Egypt, India, Indonesia, Iran (Islamic Republic of), Myanmar, the Sudan and Zimbabwe, completely restricted Internet access in specific regions, with the effect of preventing nearly all communication in or out of those regions.74 Researchers have linked more targeted Internet shutdowns to regions with higher densities of minority groups.75
34. With respect to the right to a fair trial, multiple courts in Latin America have begun using Prometea, a software system that uses voice recognition and machine-learning prediction to streamline judicial proceedings. The district attorney’s office and courts in Buenos Aires use this artificial intelligence system to automate judicial decision-making in simple cases, such as disputes about taxi licences and complaints from teachers about not being compensated for school supplies.76 In such cases, Prometea interprets the facts given to it and suggests a legal outcome based on prior jurisprudence in similar cases. A judge must approve the decision before it is made official, and judges do so 96 per cent of the time.77 A
65 See www.healthaffairs.org/do/10.1377/hblog20200128.626576/full.
66 Ibid.
67 See https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3467047.
68 Ibid.
69 See www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race.
70 See www.emarketer.com/content/us-digital-ad-spending-will-surpass-traditional-in-2019.
71 See www.motherjones.com/politics/2019/12/facebook-agreed-not-to-let-its-ads-discriminate-but-they-still-can.
72 See www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin.
73 See www.congress.gov/116/meeting/house/110251/witnesses/HHRG-116-BA00-Wstate-GillardC-20191121.pdf.
74 See www.hrw.org/news/2019/12/19/shutting-down-internet-shut-critics.
75 See www.accessnow.org/cms/assets/uploads/2020/02/KeepItOn-2019-report-1.pdf.
76 See www.bloombergquint.com/businessweek/this-ai-startup-generates-legal-papers-without-lawyers-and-suggests-a-ruling.
77 See www.giswatch.org/2019-artificial-intelligence-human-rights-social-justice-and-development.