international human rights law and standards, including the Rabat Plan of Action; ensure the
greatest possible transparency, accessibility and consistency in the application of their content
policies, decisions and actions, and clarity in the definition of their hate speech policies in
particular; ensure that any enforcement of their own hate speech policies involves an
evaluation of such factors as context and the harm of the content, including by ensuring that
any use of automation or artificial intelligence tools is subject to human review; and ensure that
such contextual analysis involves groups most affected by content identified as hate speech by
ensuring that such groups are involved in the development and implementation of the most
effective approaches to address harms caused by hate speech on the platforms.
Dr. Parmar added that companies should also ensure that their content moderators and
fact-checkers are trained in international human rights standards, and have a sound
understanding of local cultures, contexts and languages, including their nuances; translate
content policies into at least all languages supported by the particular platform and ensure
that automated detection covers the most commonly used languages; support the capacity of civil
society groups to counter hate speech, including by providing data analytics tools to inform
their advocacy; and ensure the meaningful participation of communities from across the world,
especially the Global South.
Mr. Gerald Tapuka, Senior Correspondent and Deputy Director for Africa at The
Organisation for World Peace, noted that States, intergovernmental organisations, internet
companies and social media platforms must not remain indifferent to hate speech against
minorities. He stated that intergovernmental organisations have a major role to play because
their influence extends beyond national boundaries and gives them leverage with governments
and internet companies, yet very few make hate speech a priority. Intergovernmental organisations should
convene encounters with States at the highest levels, organise educational measures and
campaigns, train media workers, establish social media departments and track hate speech.
He also provided an overview of the situation in some African countries, noting that some
of them have difficulty tracing, identifying and counteracting hate messages.
Mr. Tapuka noted the important role of education in countering hate speech and indicated
the need to introduce social media-related issues in schools. He highlighted the responsibility of internet
companies in addressing hate speech; their need to improve their conduct and to collaborate
with local actors, including local authorities. He concluded by stating that, in general,
addressing hate speech should not lead to media censorship or internet blackouts in our
society, nor should it become an opportunity for human rights abuses.
Discussion
Participants raised issues and presented initiatives, such as:
- Preparation of a legislative initiative aiming to place more responsibility on social media
providers to monitor and quickly erase hateful content. The initiative includes extending
the application of the criminal offence “incitement to hatred” also to individuals and