how materials are disseminated to users. Online platforms often collect large amounts
of data from users. Providers of digital platforms monetize these data by selling them to advertisers, who in turn use them to target audiences with precision through advertisement placement. 14 The longer individuals stay on
digital platforms and the more they engage while using them, the more effectively
providers can monetize the use of their platforms through the sale of exposure to
targeted advertising. 15 As such, content-shaping algorithms often prioritize the
dissemination of materials that generate high engagement, regardless of their
credibility, veracity or potential to cause harm. 16 Such algorithms can therefore disseminate hateful materials rapidly to a wide audience, perpetuating harmful beliefs and narratives. 17
20. Content-shaping algorithms also contribute to the creation of social media “echo
chambers”, where people are shown only material that reinforces and amplifies
pre-existing views and beliefs, increasing engagement but also deepening harmful
racial stereotypes and spreading hate speech. As well as further disseminating harmful
ideas and ideologies, the creation of these online echo chambers also limits the
exposure of users to counterspeech that could challenge harmful beliefs and
narratives. 18 While content-shaping algorithms may not be intentionally designed to amplify and disseminate racist content online, the Special Rapporteur’s predecessor highlighted in her 2020 report that “colour-blind” or “race neutral” strategies towards digital governance could result in algorithmic bias and indirect harm to racial and ethnic groups
(A/HRC/44/57).
Multiple actors, motivations and contexts
21. Different forms of online racist hate speech can originate from and be
disseminated by a range of actors with varied intentions. The online activities of such
actors form a complex, opaque and mutually reinforcing interplay. Such actors may
be motivated by racist, ethnonationalist and xenophobic ideologies. However, it is also important to acknowledge that significant commercial and political interests are
involved in the dissemination of racist hate speech online.
22. Individuals who espouse racist ideologies may disseminate racist hate speech
online in the context of their everyday use of digital platforms. Some sources suggest
that most hateful online materials originate from individuals who are not associated
with organized ideological groups, although more research is needed to truly
understand the drivers of the phenomenon. 19 Individuals may feel emboldened in an online context by the possibility of anonymity, as well as by the normalization of racist sentiments and ideologies within their highly curated online spaces, as determined by content-shaping algorithms. 20
23. Digital platforms have also allowed individuals and the groups to which they
belong to form organizations, often at the international level, and to recruit and
__________________
14 Submissions from Amnesty International and the Federal Public Defenders’ Office of Brazil, and A/HRC/46/57, para. 69.
15 Submission from Amnesty International and Access Now, 26 Recommendations on Content Governance: A Guide for Lawmakers, Regulators, and Company Policy Makers (2020).
16 Ibid.
17 Ibid.
18 Submission from iCure and Zachary Laub, “Hate speech on social media: global comparisons”, Council on Foreign Relations, 7 June 2019.
19 Daria Denti and Alessandra Faggian, “Where do angry birds tweet? Income inequality and online hate in Italy”, Cambridge Journal of Regions, Economy and Society, vol. 14, No. 3 (November 2021); A/HRC/47/25; submission from Maat for Peace, Development and Human Rights Association; and Laub, “Hate speech on social media”.
20 A/77/512 and submission from WYK Advocate.