A/HRC/40/58
institutions, describing them as impious and hypocritical, and he asserted that Sharia law
would replace democracy in Turkey. This resulted in an indictment for inciting people to
hatred and hostility on the basis of a distinction founded on religion. The Government of
Turkey maintained that the prosecution was justifiable on account of a pressing social need
because “through his comments, which ran counter to the moral principles of a very large
majority of the population, [Gündüz] had severely jeopardized social stability”.26 However,
the European Court of Human Rights held that in the instant case, the need for the restriction
in issue had not been convincingly established and that the interference with his freedom of
expression was not based on sufficient reasons.
VI. Impact of online platforms and related restrictions
49. Globally, policymakers are facing the challenge of responding to online expression
that incites persons to discriminate or to perpetrate hostile or violent acts in the name of religion
or belief. Online platforms have revolutionized the public square, instantaneously conferring
celebrity on myriad views, including those which offend religious or belief communities, as
well as those that constitute incitement to discrimination, hostility or violence. In recent
years, States have adopted measures intended to combat incitement, and tech companies have
adopted voluntary measures, including reporting tools and policies for swiftly removing
content deemed illegal upon notification.
50. The majority of the world’s Internet users thus experience various forms of censorship
or filtering. Such policies, critics note, have armed tech companies and the State with a
tremendous degree of power, granting them the capacity to effectively chill expression, as
people self-censor for fear of State sanction or widespread, often vitriolic, public rebuke.
Critics also argue that to be effective, such laws need to curb the spread of intolerant attitudes,
enfeeble extremist political forces and be shielded from abuse by authoritarian tendencies.
Oftentimes, however, they note, regulations fail to meet these standards. Instead, State attempts to combat
incitement have contributed to the emergence of “digital authoritarianism” through increased
surveillance, encroachment on privacy and broad restrictions on expression related to religion
or belief, rendering cyberspace a perilous place for dissenters and religious
minorities. Digital applications, for example, are reportedly being used to report allegations
of blasphemy, and digital footprints can be used to assess compliance with faith-related
observances. In addition, in several cases, social media has been used to incite hatred against
religious communities or mobilize hostile or violent responses to offensive expression.
51. Governments have responded to this phenomenon in ways that negatively impact freedom
of expression. Such responses have included the removal of online material to curtail access
to particular types of content, the blocking and filtering of websites, the disclosure of the
identities of bloggers critical of the politically dominant theology of the country, and holding
intermediaries liable for hosting “hate speech” content uploaded by third parties. While there
is a need to prevent and punish online incitement to violence, some of the current approaches,
characterized by vaguely worded laws on what is proscribed and draconian intermediary
penalties, are likely to be highly counterproductive, with chilling effects. The negative impact
of the rise of digital authoritarianism is evident from the high number of cases of murders,
attacks and prosecutions that have resulted from online activity. At the same time, criminal
and terrorist groups have recently demonstrated the potential for online platforms to be used
to propagate violent religious extremism or to incite violence against religious minorities.
52. Pressure is mounting throughout Europe for effective responses to online incitement
and “hate speech”. For example, in Germany, the recently adopted Network Enforcement
Act (“NetzDG”) requires tech companies to delete “obviously illegal” content
within 24 hours of being notified. Other illegal content must be reviewed within seven
days of being reported and then deleted. If the complaint management requirements are not
met, fines of up to 50 million euros may be imposed. These stipulations are problematic given
that some of the criteria for determining which content is prohibited are based on vague and
26 European Court of Human Rights, Gündüz v. Turkey, Application No. 35071/97, Judgment of 4 December 2003, para. 31.