UN Special Rapporteur on Minority Issues
[DRAFT FOR GLOBAL CONSULTATION]
Substantive Guidelines
1. Social media companies (SMC) should clearly and precisely define ‘hate speech’ in
their content policies and expand their protected characteristics to include any
identity factor. Hate speech targeting minorities should be a distinct category and
include national or ethnic, religious and linguistic minorities.
Commentary
SMC content policies[13] on hate speech should be legally certain, ensuring both foreseeability and
accessibility. Broad, subjective or ambiguous terms such as ‘direct’ and ‘attack’ should be clearly
defined, detailed and their thresholds elaborated to allow for transparent and objective interpretation
and application. Legal certainty would be increased by referring instead to established and objective
terms such as ‘discriminatory’.[14] Examples of appeals and their decisions should be publicly shared
to aid clarity and provide precedents. The included protected characteristics should not be arbitrarily
chosen but instead brought into line with the permitted grounds of non-discrimination under international
human rights law[15] and with the understanding of ‘hate speech’ in the UN Strategy and Plan of Action
on Hate Speech. As such, the list of included protected characteristics should not be exhaustive and
should establish principled criteria for inclusion, such as ‘any identity factor’.[16] Notably, while ethnic
and religious minorities may be protected from hate speech to some extent on the basis of their ethnic,
racial or religious characteristics, linguistic minorities are often excluded.[17] Where two or more
identity factors intersect, the potential for harm, and thus the severity of hate speech, is greater
and should be recognised as such with appropriate and proportionate content moderation responses.[18]
While ‘hate speech’ defined as targeting those with protected characteristics may include minorities,
hate speech against minorities is nonetheless a distinct type that is not only more severe but also
poses the greatest risk of widespread, systemic and group-based violence culminating in atrocity crimes
such as ethnic cleansing and genocide.[19] Such hate speech targets an entire minority group on the basis
of its national, ethnic, religious or linguistic identity and culture, and leads to incitement to
violence, violent hate crimes and ultimately violence en masse. The creation of such a distinct category
is supported by several normative standards. These include the protection of ethnic, religious or
linguistic minorities under ICCPR Art. 27; the prohibition under ICCPR Art. 20(2) of advocacy of hatred
constituting incitement to discrimination, hostility or violence, which singles out national, racial and
religious grounds; ICERD Art. 4, which notes the specificity of the dissemination of ideas of racial
superiority in relation to minorities;[20] the international crime of inciting genocide and States’
obligation to prevent and punish genocide;[21] and the requirement in the UN Guiding Principles on
Business and Human Rights[22] for private companies to take special note of
13. These are referred to under various headings by social media companies, such as ‘Community Guidelines’,
‘Community Standards’ and ‘Rules’.
14. ‘Incitement to discrimination’ is the lowest threshold for ‘advocacy of national, racial and religious
hatred’ under ICCPR, Art. 20(2).
15. ICCPR, Arts. 2 and 26; ICERD, Art. 1; and relevant General Comments.
16. UN Strategy and Plan of Action on Hate Speech, 2019, and its Detailed Guidance, 2020.
17. ICCPR, Art. 27, and the UN Declaration on Minorities, which also includes ‘national’ minorities.
18. Detailed Guidance, 2020.
19. Genocide Convention, Rome Statute and other relevant instruments.
20. CERD General Recommendation No. 35, Combating racist hate speech, 26 September 2013.
21. Convention on the Prevention and Punishment of the Crime of Genocide, Art. 1.
22. UNGP, General Principles, p. 1: “These Guiding Principles should be implemented in a non-discriminatory
manner, with particular attention to the rights and needs of, as well as the challenges faced by,