A/HRC/46/57
rumours of a Muslim minority plot to sterilize the Sinhalese majority, circulating mainly on Facebook, led to deaths, with mobs in several towns burning mosques, shops and homes owned by members of the Muslim minority. Examples of such attacks against minorities are legion.
44.
The Special Rapporteur shares the concern expressed in one submission that
dehumanizing language, often reducing minority groups to animals or insects, normalizes
violence against such groups and makes their persecution and eventual elimination
acceptable, and that, when committed with a discriminatory or biased intent, these violations
become a pathway of demonization and dehumanization that can lead to genocide.
Individuals can find themselves drawn by social media into dehumanizing language and hate environments and end up surrounded by people with similar viewpoints. They can thus become enmeshed in confirmation bias, with social media serving as an incubating environment that is particularly conducive to the expression – and indeed the strengthening and confirmation – of racist, intolerant and even violent viewpoints against certain scapegoated minorities.
2.
Legal and regulatory context
45.
An invaluable compendium on the regulation of online hate worldwide shows that
there is a wide range of approaches to the legal prohibition of hate speech, definitions, if any,
of the concept, the type of restrictions or limitations imposed, and the availability of
remedies, if any.7 While it is impossible to generalize, the compendium does suggest that
there is no single approach to hate speech, and quite often none specially adapted to the
particular nature and challenges of hate speech in social media.
46.
Some submissions to the Special Rapporteur highlight the fact that there is often a
lack of enforcement of restrictions on hate speech in social media, particularly those intended
to protect minorities. It has been suggested that in some countries there are no data on hate speech cases in social media, and that existing legislation against hate crimes has never been used or is too onerous or vague to be invoked successfully for prosecution. In some of the submissions, it is suggested that minorities hesitate to bring cases of hate speech to the relevant authorities because their own experiences suggest that public authorities will not intervene, that there will be no consequences for those who breach the legislation, or that the use of moderation or complaint mechanisms for social media is unlikely to remedy the situation. The consequences of the lack of effective legal and other responses by public
authorities and owners of social media platforms can be tragic to the point of being lethal and
lead to massive atrocities and violations of human rights, as shown by Facebook’s failure in
Myanmar to address incitement against the Rohingya minority. Social media were used to
demonize the Rohingya minority ahead of and during a campaign of ethnic violence, and the
independent international fact-finding mission on Myanmar confirmed that Facebook had
been a useful instrument for those seeking to spread hate (A/HRC/39/64, para. 74). The
consequence was a foreseeable and planned human rights catastrophe of gang rapes,
thousands of killings, and burning of schools, marketplaces, homes and mosques as part of
ethnic cleansing and a possible attempt at genocide, all resulting in a horrific humanitarian crisis
involving hundreds of thousands of men, women and children belonging to the Rohingya
minority.
47.
From the point of view of international law, companies such as social media platforms
do not have the same obligations as Governments. Governments, however, have direct
obligations, at a minimum, to prohibit incitement to genocide and advocacy that constitutes
incitement to discrimination, hostility or violence. Given the impact of social media today in
the propagation of hate speech, constituting grave violations of the rights of millions of
people and even threats to their life and safety, Governments have the obligation to take
measures to ensure that incitement to genocide or advocacy of hatred that constitutes
incitement to discrimination, hostility or violence are prohibited.
48.
A recurring issue is whether social media platforms should be subject to consequences
and penalties, as is the case for mainstream traditional media, when they are allowed to
7 Hogan Lovells and PeaceTech Lab, The Global Regulation of Online Hate: A Survey of Applicable Laws (December 2020). Available at www.hoganlovells.com/~/media/hogan-lovells/pdf/2020pdfs/2020_12_15_the-global-regulation-of-online-hate_a-survey-of-applicable-laws_specialreport_december-2020.pdf?la=en.
8