information they post will be removed or their liability will be invoked. In the view of the
Special Rapporteur, the main concern is that censorship impedes the fundamental right to
freedom of expression, as expressed in article 19 of the International Covenant on Civil and
Political Rights and subsequent concluding observations.
43. Another model is the regulatory framework, implemented either by the State or by the Internet or social media provider. States can create a legal framework that determines what content on the web is to be considered illegal. Under this model, a court or regulatory body determines whether content should, pursuant to the applicable law, be blocked before it is filtered or removed. Alternatively, a non-governmental organization can be given responsibility for monitoring the content posted on the Internet. While courts are usually seen as independent of political influence, a watchdog organization might be viewed as applying government policies.
44. The Special Rapporteur was informed that, under the regulatory model, users may also be able to submit information to the overseeing body on content they feel violates the law and should be removed. This enables a degree of self-regulation by Internet users and helps to ensure that illegal content is found and removed, given that more users are looking out for it. A major benefit of the regulatory model is that it is more transparent than censorship: Internet users are usually aware of what information is allowed and can have a voice in deciding what content will be blocked by voting for the politicians who make those laws. One drawback of allowing users to submit removal requests, however, is that the decision-making process is often not transparent: after submitting a request, users have no way of knowing why it was granted or denied. This issue can be mitigated by ensuring that Internet users are informed of the decision-making process on content removal.
45.
The above-mentioned regulatory model has been set up in France, where content and
activity on the Internet is regulated by law LCEN 2004-575. The law relieves Internet
service providers and social media platforms of responsibility for illegal content if they
have “no knowledge of illegal activity or material” or if they have “acted promptly to
remove or block access to it as soon as it was discovered”. The providers are also not held
responsible if they did not know how the illegal activity or material arose. Once a judicial
order is issued and proper notice is given, the hosting website is liable for any further
reposting of the illegal material. A website may also be held liable for redirecting Internet
users to other websites hosting illegal content. The benefit of the regulatory system used in
France is that it is more transparent, with a court assessing whether a website should be
blocked before any filtering is implemented. Moreover, it allows the Internet or social
media provider the opportunity to remove the objectionable information prior to the
issuance of a court order.
46. The Special Rapporteur was informed that, in the United Kingdom of Great Britain and Northern Ireland, the Internet is regulated by the Office of Communications (Ofcom), the mandate of which includes protecting audiences against harmful material, unfairness and infringements of privacy. Internet service providers and social media platforms are considered mere conduits of information and are not responsible for any illegal information transmitted. Although the State does not require providers and platforms to monitor the information being transmitted, some filter certain types of content, such as child pornography, on their own initiative. Under one such filtering system, called CleanFeed, the Internet Watch Foundation, a non-profit organization, works with the Government to compile a list of websites it deems illegal, then transmits that information to Internet service providers. The Foundation, which is funded by the Internet industry, works with the police, the Home Office and the Crown Prosecution Service to receive public complaints and to determine whether the content hosted on a website is illegal. In the CleanFeed system, however, the list of filtered websites is not made public, which can lead to abuse when