27. With respect to the right to work, one submission reported that Paraguay had implemented a digital employment system that allowed employers to sort and filter prospective employees by various categories, some of which served as proxies for race.56 Furthermore, the system is available only in Spanish, even though fewer than half of rural indigenous peoples in Paraguay speak Spanish. Such limited language accessibility effectively restricts the system's availability to jobseekers on an ethnic basis, even if that is not the intention of policymakers.
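To make the proxy dynamic concrete, the following minimal Python sketch, which uses entirely hypothetical records and field names rather than the actual Paraguayan system, shows how facially neutral filter categories such as home district or language can exclude jobseekers along ethnic lines:

    # Illustrative sketch only: hypothetical data, not the real system.
    # Facially neutral filters (district, language) exclude candidates
    # along ethnic lines when those fields correlate with indigenous
    # identity.
    candidates = [
        {"name": "A", "district": "Boquerón", "language": "Guaraní"},
        {"name": "B", "district": "Asunción", "language": "Spanish"},
        {"name": "C", "district": "Boquerón", "language": "Spanish"},
    ]

    def filter_candidates(pool, district=None, language=None):
        """Apply 'neutral' filters whose combined effect tracks ethnicity."""
        if district is not None:
            pool = [c for c in pool if c["district"] == district]
        if language is not None:
            pool = [c for c in pool if c["language"] == language]
        return pool

    # Filtering for Spanish speakers in the capital removes most rural
    # indigenous jobseekers without ever referencing ethnicity directly.
    print(filter_candidates(candidates, district="Asunción", language="Spanish"))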
28. Algorithms used for selecting successful job candidates in North America and Europe have also been criticized for making discriminatory recommendations. These systems are trained to identify candidates on the basis of data sets of existing "successful" employees that include information on protected characteristics, such as gender, ethnicity or religion. As a result, these algorithmic systems reproduce and reinforce existing racial, ethnic, gender or other bias by making decisions that reflect existing inequities in employment. Such systems effectuate direct and indirect forms of racial discrimination.57 On the other hand, when these systems exclude any consideration of protected statuses, such as race and ethnicity, they can undercut special measures or affirmative action that States may have adopted to promote equal employment opportunities.
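The mechanism described above can be illustrated in a few lines. The following Python sketch, which assumes purely synthetic data, trains a simple classifier on historical hiring records from which the protected attribute has been removed; a correlated proxy feature (here, a hypothetical postcode variable) nonetheless lets the model reproduce the historical disparity:

    # Minimal sketch with synthetic data: a model trained on biased
    # historical hiring outcomes reproduces the bias even when the
    # protected attribute itself is excluded from its inputs.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    group = rng.integers(0, 2, n)             # protected attribute (e.g. ethnicity)
    postcode = group + rng.normal(0, 0.1, n)  # proxy: strongly correlated with group
    skill = rng.normal(0, 1, n)               # genuinely job-relevant feature

    # Historical labels encode past discrimination: group 0 was hired
    # more often than group 1 at the same skill level.
    hired = (skill + 1.0 * (group == 0) + rng.normal(0, 0.5, n)) > 0.5

    # Train "blind" to the protected attribute: only proxy and skill go in.
    X = np.column_stack([postcode, skill])
    model = LogisticRegression().fit(X, hired)

    # Predicted hiring rates still differ by group, because the proxy
    # feature lets the model reconstruct the protected attribute.
    pred = model.predict(X)
    print("predicted hire rate, group 0:", pred[group == 0].mean())
    print("predicted hire rate, group 1:", pred[group == 1].mean())

As the sketch suggests, dropping the protected column is not sufficient to prevent indirect discrimination so long as correlated proxies remain in the training data.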
29. In other cases, the introduction of automated systems that do not rely directly on discriminatory inputs or processes can nonetheless indirectly discriminate against marginalized groups in their access to work by reducing or eliminating available positions. A submission provided the example of a new artificial intelligence-based project for sanitation management in India that would eliminate the need for many jobs typically performed by those in the lowest, or Dalit, caste.58 Dalits, especially women, can often find employment only in the sanitation sector, and some Indian states have prioritized Dalits for sanitation jobs. Implementation of smart sanitation systems would therefore likely affect the jobs and livelihoods of Dalits disproportionately, especially Dalit women. In light of the broader socioeconomic and political marginalization of Dalits in India, automation in the sanitation sector might fundamentally undercut their access to work.
30. Emerging digital technologies also have a discriminatory impact on the right to health. The top 10 health-care algorithms on the United States market use patients' past medical costs to predict future costs, which are used as a proxy for health-care needs.59 A recent study of such an algorithm used by a leading health services company found that it had been unintentionally yet systematically discriminating against black patients in the United States.60 Intended to help enrol high-risk patients in care management programmes, the algorithm was found to encode racial bias by using patients' health-care costs as a proxy for their health needs in order to predict their level of risk.61 Considered by its developers as "race-blind" because race was not an input,62 the algorithm consistently assigned lower risk scores to black patients who were equally sick as their white counterparts.63 The algorithm identified less than half as many black patients at risk of complicated medical needs as it did comparable white patients. As a result, black patients were less likely to be referred to programmes for interventions to improve their health. Hospitals, insurers and government agencies use this algorithm and similar ones to help manage care for over 200 million people in the country each year.64
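The cost-as-proxy failure identified in the study can be sketched as follows. This minimal Python example uses synthetic data rather than the actual commercial algorithm, and shows how enrolling patients by predicted cost under-serves an equally sick group whose historical costs are depressed by unequal access to care:

    # Minimal sketch, assuming synthetic data: ranking patients by cost
    # rather than by health need under-enrols a group whose costs
    # understate its needs (e.g. because of unequal access to care).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    group = rng.integers(0, 2, n)         # 1 = historically underserved group
    illness = rng.gamma(2.0, 1.0, n)      # true health need, same distribution

    # Observed cost understates need for the underserved group: at equal
    # sickness, its members receive (and therefore cost) less care.
    access = np.where(group == 1, 0.6, 1.0)
    cost = illness * access + rng.normal(0, 0.1, n)

    # A "cost-as-needs" programme enrols the costliest decile of patients.
    enrolled = cost >= np.quantile(cost, 0.9)

    # Among the truly sickest decile, the underserved group is enrolled
    # far less often, despite identical underlying health needs.
    sickest = illness >= np.quantile(illness, 0.9)
    for g in (0, 1):
        mask = sickest & (group == g)
        print(f"group {g}: share of sickest decile enrolled:",
              round(enrolled[mask].mean(), 2))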
31. In another example from the United States, a recent case study examined a predictive model developed by Epic Systems Corporation, the leading global developer of
56 Submission by the Equal Rights Trust.
57 See https://fra.europa.eu/en/publication/2018/bigdata-discrimination-data-supported-decision-making.
58 Submission by the Association for Progressive Communications. See also www.apc.org/sites/default/files/gisw2019_artificial_intelligence.pdf.
59 See www.sciencenews.org/article/bias-common-health-care-algorithm-hurts-black-patients.
60 See https://science.sciencemag.org/content/366/6464/447.
61 Ibid.
62 See www.thelancet.com/journals/landig/article/PIIS2589-7500(19)30201-8/fulltext.
63 See https://science.sciencemag.org/content/366/6464/447.
64 Ibid.