A/78/538
Accordingly, other measures, such as counterspeech, education, community projects and steps to build societal support for plurality, are critical to preventing and addressing online racist hate speech (see Rabat Plan of Action, para. 35). The Special Rapporteur is concerned by the lack of information about comprehensive and effective non-legal measures by States to prevent and address online racist hate speech. While noting information received from some States on non-legal measures, the Special Rapporteur is concerned that there are gaps in the overall investment in non-legal measures to effectively address the root causes of online racist hate speech.
Lack of adequate investment by the providers of digital platforms in preventing
and addressing online racist hate speech
50. The companies that provide and profit from the digital platforms on which
online racist hate speech is disseminated have responsibilities to respect human rights.
Their responsibilities under the Guiding Principles on Business and Human Rights
and other international human rights standards were referred to in the section above.
The Special Rapporteur notes some efforts by some digital platforms to prevent and
address online racist hate speech. However, overall, she considers that investments
made to that end are inadequate relative to the magnitude of the power and profits
that such companies have acquired as a result of digital platforms becoming integrated
into the everyday lives of a significant proportion of the world’s population.
Moreover, the fact that algorithms that disseminate online racist hate speech are
central to the business model and profitability of companies that provide digital
platforms compounds their responsibility to prevent and address the phenomenon. 53
51. Many large providers of digital platforms have developed definitions of online
hate speech, which include the grounds for discrimination in article 1 of the
International Convention on the Elimination of All Forms of Racial Discrimination,
as well as community guidelines and policies on hate speech and content moderation.
The Special Rapporteur is concerned, however, by the lack of clarity and transparency
in the policies and guidelines of companies providing digital platforms. 54 The
vagueness and opaqueness of these policies and guidelines and how they are
implemented inhibit the scrutiny and participation of those from affected racial and
ethnic groups. There have also been cases in which members of racial and ethnic
groups targeted by online hate speech have had material that could be considered
counterspeech removed, with little understanding of why and no
clear recourse. 55 Another area of concern is that the most serious action envisaged by
providers of digital platforms is often the removal of content and banning of the user.
Even where the content identified is serious enough to justify a restriction on online
expression, removal of the content and banning of the user may not be a proportionate
or effective response in all cases, in particular as users can often easily register again
using different credentials. 56
52. A weakness in companies’ efforts to prevent and address online racist hate
speech that is of serious concern to the Special Rapporteur is the lack of investment
in the cultural and linguistic knowledge necessary to assess online materials,
including those that could be deemed incitement. Digital platforms tend to use
content moderation algorithms to identify hateful content and may use automation or
employ staff to moderate content, among other measures. Such measures tend to be
grounded in race-neutral approaches, as described below, which can lead to the
replication or even exacerbation of societal racial and ethnic inequalities. Moreover,
__________________
53 Submission from Amnesty International.
54 Access Now, “26 recommendations on content governance” and submission from Pakistan.
55 Ibid.; submissions from Fundación Karisma and El Veinte; and A/HRC/47/25.
56 Hogan Lovells, The Global Regulation of Online Hate.