UN Special Rapporteur on Minority Issues
[DRAFT FOR GLOBAL CONSULTATION]
Underlying Principles & Proposed Definition
PRINCIPLE 1: Social media companies (SMCs) should not offer minorities less protection than is
required of States under international human rights standards in the area of incitement to
hatred and hate speech.
PRINCIPLE 2: Regardless of the extent to which national laws incorporate international human rights
standards, SMCs should adhere to international human rights obligations.
PRINCIPLE 3: SMCs should offer increased protection to community members, given that less
interference with freedom of expression is required of them than of States. There is seldom a complete
nullification of the right to freedom of expression, or the granting of an absolute freedom of expression
despite the harm it may cause. More often there is an interference with, or limiting of, expression, which has to
be justified by reference to the purpose of that limitation. This can be seen as an exercise of ‘proportionality’9 or
a ‘balancing’ of competing rights10.
Proposed definition of ‘Online Hate Speech’
There has been no attempt to define online hate speech specifically. We do, however, have the
standard of prohibition of incitement to hatred under ICCPR, Art. 20(2) and the Rabat Plan of Action.11
Most recently, the UN Office on Genocide Prevention and the Responsibility to Protect has elaborated a
definition of hate speech that is more general and broader in scope than ICCPR, Art. 20(2):
“any kind of communication in speech, writing or behaviour, that attacks or uses pejorative
or discriminatory language with reference to a person or a group on the basis of who they are,
in other words, based on their religion, ethnicity, nationality, race, colour, descent, gender or
other identity factor.”12
It has also become apparent that social media companies are having to define what ‘hate speech’ is
on their platforms, and that they need assistance, support and guidance to ensure compliance
with international human rights law, adherence to any minimum criteria, and consistency of standards across
the industry. Drawing on and synthesising this pre-existing work, an authoritative working definition of
‘online hate speech’, and an associated core responsibility to support and enhance existing approaches
by social media companies, can be proposed:
“Social media companies are responsible for effectively prohibiting and removing content in the
shortest time possible that is discriminatory, hostile or violent towards those (community
members) with protected characteristics on the basis of any identity factor and especially those
belonging to national or ethnic, religious and linguistic minorities on the basis of their minority
identity.”
9 ICCPR, Art. 19(3) requires permissible limitations on the right to freedom of expression to be necessary, which in turn establishes a requirement of proportionality.
10 UNGP.
11 Rabat Plan of Action, Appendix, Report of the United Nations High Commissioner for Human Rights on the expert workshops on the prohibition of incitement to national, racial or religious hatred, 2013.
12 UN Strategy and Plan of Action on Hate Speech, 2019; UN Strategy and Plan of Action on Hate Speech, Detailed Guidance on Implementation for UN Field Presences, 2020.