A/HRC/44/57
13. Within the fields and industries that produce emerging digital technologies,
misplaced faith in the neutrality or objectivity of numbers and their power to overcome
racism has been shown to contribute to discriminatory outcomes.18 Even the field that has
developed to promote fairness, accountability and transparency in the design and use of
emerging digital technologies needs to pay greater attention to the broader societal
structures of discrimination and injustice.19 Indeed, among the biggest challenges to
addressing racially discriminatory use and design of emerging digital technologies are
approaches that treat this issue as purely or largely a technological problem for computer
scientists and other industry professionals to solve by engineering bias-free data and
algorithms. Technology is a product of society, its values, its priorities and even its
inequities, including those related to racism and intolerance. Technological determinism –
the idea that technology influences society but is itself largely neutral and insulated from
social, political and economic forces – only serves to shield the forces that shape emerging
digital technologies and their effects from detection and reform. “Techno-chauvinism” – an
overreliance on the belief that technology can solve societal problems20 – has a similar
effect, and makes it harder to interrogate and change the values and interests that shape
technology and technological outcomes.
14. Although the quality of engineering still demands scrutiny and accountability in
upholding equality and non-discrimination, securing these and other human rights
principles must begin with an acknowledgment that the heart of the issue is a political,
social and economic one, not solely a technological or mathematical
problem. Inequality and discrimination, even in those circumstances in which they are the
product of the design and use of emerging digital technologies, will not be “cured” by more
perfect modelling of equality and non-discrimination. Concretely, this means that thinking
and action that seek to combat racial discrimination in the design and use of emerging
digital technologies, both in the private and public sectors, should not be the exclusive or
near-exclusive terrain of technology experts. Instead, such thinking and action must be
more holistic, as researchers and others with expertise in emerging digital technologies
have argued.21 Governments and the private sector must commit to approaches that include
experts on the political, economic and social dimensions of racial discrimination at all
stages of research, debate and decision-making to mitigate racially discriminatory design
and use of emerging digital technologies. Affected racial and ethnic minority communities
must play decision-making roles in the relevant processes.
15. Private corporations wield monumental influence in the design and use of emerging
digital technologies. Among digital platforms, seven “super platforms” – Microsoft, Apple,
Amazon, Google, Facebook, Tencent and Alibaba – account for two thirds of the total
market value of the world’s 70 largest platforms.22 Notwithstanding the global reach of
their emerging digital technologies, the corporations that exert the greatest influence over
them are predominantly concentrated in Silicon Valley, in the United States of America,
while Europe’s share is 3.6 per cent, that of Africa 1.3 per cent and that of Latin America
0.2 per cent.23 For example, Google has 90 per cent of the global market for Internet
searches.24 Occupying two thirds of the global social media market, Facebook is the top
social media platform in more than 90 per cent of the world’s economies. Amazon has an
almost 40 per cent share of the world’s online retail activity. As a result, the specific
cultural, economic and political values of Silicon Valley fundamentally shape how many of
18 See, e.g., West, Whittaker and Crawford, “Discriminating systems”.
19 See, e.g., www.tandfonline.com/doi/full/10.1080/1369118X.2019.1593484 and http://sorelle.friedler.net/papers/sts_fat2019.pdf.
20 See, e.g., Meredith Broussard, Artificial Unintelligence: How Computers Misunderstand the World (Cambridge, Massachusetts, Massachusetts Institute of Technology, 2018).
21 See, e.g., www.odbproject.org/2019/07/15/critiquing-and-rethinking-fairness-accountability-and-transparency.
22 Digital Economy Report 2019: Value Creation and Capture: Implications for Developing Countries (United Nations publication, Sales No. E.19.II.D.17), p. xvi.
23 Ibid., p. 2.
24 Ibid., p. xvii.