A/HRC/44/57
C.	Racially discriminatory structures

38.	Examples from different parts of the world show that the design and use of different
emerging digital technologies can combine, intentionally or unintentionally, to produce
racially discriminatory structures that holistically or systematically undermine enjoyment of
human rights for certain groups, on account of their race, ethnicity or national origin, in
combination with other characteristics. In other words, rather than only viewing emerging
digital technologies as capable of undercutting access to and enjoyment of discrete human
rights, they should also be understood as capable of creating and sustaining racial and
ethnic exclusion in systemic or structural terms. Under this subheading, the Special
Rapporteur reviews examples of existing and potentially discriminatory structures,
emphasizing the prevalence of biometric data systems, racialized surveillance and
racialized predictive analytics in maintaining these structures.
39.	China uses biometric identification and surveillance to track and restrict the
movements and activities of the Uighur ethnic minority group, violating, among others, the
rights of members of this group to equality and non-discrimination.84 Uighurs are subjected
to frequent, baseless police stops and to having their telephones scanned at police
checkpoints, in violation of their right to privacy. Uighurs must also submit to mandatory
collection of extensive biometric data, including DNA samples and iris scans. According to
credible reports, the State, “using a combination of facial recognition technology and
surveillance cameras throughout the country, looks exclusively for Uighurs based on their
appearance and keeps records of their comings and goings for search and review”.85 Reports
also note that this surveillance and data collection activity is occurring alongside
large numbers of ethnic minorities being held incommunicado in political “re-education
camps” under the pretext of countering religious extremism, without detainees being
charged or tried.86 The picture that emerges is one of systemic ethnic discrimination,
supported and indeed made possible by a number of emerging digital technologies, which
violates a broad spectrum of human rights for Uighurs.
40.	Kenya and India have implemented biometric identification for accessing public
services, known as Huduma Namba and Aadhaar, respectively.87 The programmes include
collection of various forms of biometric data, including fingerprints, retina and iris patterns,
voice patterns and other identifiers. When trying to access public services through these
systems, certain racial and ethnic minority groups in both countries find that they are
excluded outright, while others face logistical barriers and long vetting processes that can
result in de facto exclusion from public services to which they are
entitled. These public services include pensions and unemployment benefits in India, and
all essential government services in Kenya, including voting, registering birth certificates
and civil marriages, paying taxes and receiving deeds to property. The Supreme Court of
India has upheld the statute requiring the Aadhaar number for receiving government
welfare. Despite the same judgment prohibiting private entities from using Aadhaar for
non-governmental purposes, like banking, employment and mobile telecommunications,
such a requirement remains prevalent in practice. Furthermore, persons with disabilities –
including among ethnic and racial minorities – experience discrimination for not being able
to provide fingerprint or iris scans. Though the law provides special mechanisms for such
persons, they continue to face logistical hurdles because staff at many enrolment centres
are not trained in enrolling them without the biometric data.88 Without stringent protections, digital
identification systems for public services disproportionately exclude racial and ethnic
minorities, especially those whose citizenship status is insecure.89
84	See CERD/C/CHN/CO/14-17.
85	See www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html; and A/HRC/41/35, para. 12.
86	See CERD/C/CHN/CO/14-17.
87	See A/74/493.
88	See https://timesofindia.indiatimes.com/city/kolkata/court-relief-in-disabled-womans-aadhaar-battle/articleshow/68961357.cms.
89	For a human rights analysis of racial discrimination in access to citizenship, see A/HRC/38/52.