originate and disseminate online materials, which are monetized in various ways,
including by allowing those who view them to donate money, as well as through the
sale of merchandise related to the materials shared and through advertising revenue. There have
been cases where individuals and groups have monetized racist and xenophobic
content, notwithstanding efforts to prevent this. 27 At a more systemic level, the overall
business models of digital platforms and advertising revenue, which is linked to
content-shaping algorithms that can disseminate and amplify online racist hate
speech, mean that powerful economic incentives and disincentives are at play. 28
Real-life consequences of online racist hate speech
28. The consequences of the most serious forms of online racist hate speech can be
life threatening. The most extreme cases of online racist hate speech can amount to
incitement to discrimination, hostility or violence, as defined in article 4 of the
International Convention on the Elimination of All Forms of Racial Discrimination
and article 20 (2) of the International Covenant on Civil and Political Rights and in
the Convention on the Prevention and Punishment of the Crime of Genocide. In the
present report, the subsequent section on international human rights standards serves
to explore the most serious forms of online hate speech in more detail. More broadly,
in paragraph 16 of general recommendation No. 35 (2013), it is stated that:
“Incitement characteristically seeks to influence others to engage in certain forms of
conduct, including the commission of crime, through advocacy or threats. Incitement
may be express or implied, through actions such as displays of racist symbols or
distribution of materials as well as words.” One of the most emblematic cases of
online racist hate speech that amounted to incitement was the sustained demonization
of the Rohingya ethnic group in Myanmar on Facebook ahead of and during a
campaign of ethnic violence, which had horrific humanitarian consequences
(A/HRC/46/57, para. 46). 29 The escalating and serious online hate speech directed
towards the Rohingya ethnic group was met with inaction by the State and Facebook,
notwithstanding multiple warnings of impending harm. 30 This case exemplifies the
significant harm that online racist hate speech that meets the threshold for incitement,
together with offline racist hate speech and policies, can generate, in particular when
there is inaction by Governments and companies. 31
29. It is important to note that it is not only the most serious cases of online racist
hate speech that have negative consequences. Even in cases where online hate speech
does not amount to incitement to discrimination, hostility or violence, it can be a
factor in offline hate crimes. Hate crimes have an element of bias that can be
influenced by hate speech, including online hate speech. 32 Digital platforms can
facilitate the global transmission of harmful stereotypes and related propaganda,
which potentially make violence against targeted groups more acceptable and
arguably more likely (A/77/512, para. 52).
__________________
27 Paul Hosford, “Revealed: how racist Irish YouTube accounts profit by livestreaming protests”,
Irish Examiner, 31 January 2023; and Sara Miller, “Big business: the monetization of
antisemitism”, The Media Line, 4 January 2023.
28 Submission from Amnesty International.
29 See also A/HRC/44/57.
30 Submission from Amnesty International.
31 Ibid.
32 Submissions from Romania, as well as from FakeReporter, the National Human Rights
Commission of Mexico and the Federal Public Defenders’ Office of Brazil; Meagan Cahill and
others, “Understanding online hate speech as a motivator and predictor of hate crime”, paper
prepared for the National Institute of Justice, Office of Justice Programs, United States
Department of Justice, April 2022; and Hogan Lovells, The Global Regulation of Online Hate: A
Survey of Applicable Laws, special report prepared for PeaceTech Lab, December 2020.