A/HRC/46/57
77. In 2016, the European Union introduced a voluntary code of conduct on hate speech.23
At the time of writing, it was also drafting a digital services act,24 which may impose legal
obligations and liabilities in recognition of the fact that, while social media and other online
platforms already moderate illegal and harmful content, there remains little transparency,
accountability or, in many cases, effectiveness. To date, Facebook, Microsoft, Twitter,
YouTube, Instagram, Snapchat, Dailymotion, Jeuxvideo.com and TikTok have joined the
code of conduct. The fifth – and latest – round of monitoring of the code’s implementation,
conducted in 2020, suggested that on average the companies were now assessing 90 per cent
of flagged content within 24 hours and that 71 per cent of the content deemed illegal hate
speech was removed.25 However, concerns remain that the current process does not
faithfully capture many forms of hate speech against minorities.
78. While it can be suggested that some two thirds of hate speech in social media targets
minorities, the community standards or content moderation guidelines of most social media
platforms pay little direct attention to minorities, or even fail to mention them specifically.
TikTok’s Community Guidelines, for example, refer to matters such as “content that depicts
or promotes the poaching or illegal trade of wildlife”, and define hate speech or behaviour
as “content that attacks, threatens, incites violence against, or otherwise dehumanizes an
individual or a group” on the basis of attributes such as race, ethnicity and religion.26
However, the Guidelines do not contain a single reference to the word “minority”. While
“wildlife” legitimately gets a mention, minorities, unfortunately and counter-intuitively, do
not, despite the fact that minorities are overwhelmingly the targets and victims of most hate
speech in social media.
79. The above is not to suggest there have been no positive developments. Most social
networks developed their content moderation rules on an ad hoc basis and only in recent
years. Facebook first set down comprehensive internal moderation guidelines in 2009, but
issued public guidelines only in 2018. There are increasing indications that most owners of
social media platforms are moving towards improving transparency and collaboration with
civil society organizations on content moderation, including perhaps human rights impact
assessments in some areas, and these efforts are to be commended.27 However, the most
glaring gaps remain: relative silence on the need for greater attention to minorities as the
main targets of hate speech, and few or no policy steps for their protection. One notable
exception is the above-mentioned signal from Facebook that it was in the process of altering
its algorithms to prioritize the flagging of hate speech targeting minorities. However, it is not
known whether this change will be applied worldwide or on a more limited basis.
F. Minorities and civil society organizations
80. Civil society initiatives, presence and involvement are essential in the modern world
of communication and information through social media, and particularly since hate speech
is mainly the scourge of minorities. Indeed, the United Nations also clarifies that the focus
must be on those groups in situations of vulnerability due to entrenched or historic
stigmatization, discrimination, long-standing conflicts, and exclusion and marginalization
from the political, economic and social life of the society. 28 Most of these groups are
23 Available at https://ec.europa.eu/newsroom/just/document.cfm?doc_id=42985. The code is very brief
and quite general, does not define hate speech and makes no mention of minorities.
24 European Commission, proposal for a regulation of the European Parliament and of the Council on a
single market for digital services (Digital Services Act) and amending Directive 2000/31/EC, 15
December 2020. Available at https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52020PC0825&from=en.
25 See https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en.
26 See www.tiktok.com/community-guidelines?lang=en#38.
27 See, for example, OHCHR, “Public comment by UN Special Rapporteur on minority issues relating
to cases on hate speech and minorities”, 23 December 2020.
28 United Nations Strategy and Plan of Action on Hate Speech: Detailed Guidance on Implementation
for United Nations Field Presences (2020), p. 11.