How moderation worked in the “Less Hate, More Speech” project
The latest Median Research Centre (MRC) report explains how comment moderation worked in the “Less Hate, More Speech” project. It describes the moderation experiments we implemented, the features added to the comment system and what informed their design, the main principles and goals of moderation, the key concepts that shape our moderation procedures, and how, exactly, moderators decide what can and cannot be published.
The first core website, GSP.ro, entered moderation as part of the “Less Hate, More Speech” project in April 2015, followed by the second core website, Tolo.ro, in September of the same year. “Less Hate” moderation of Blogsport.ro and Paginademedia.ro began in February 2016. The supervised moderation phase, overseen by the research team, ended in June 2016.
What we moderate, in brief:
- Hate speech
- Incitement to violence
- Incitement to discrimination or exclusion
- “Inferiority” (comments painting certain groups as inferior, deviant, sick, etc.)
- Slurs and insults that exhibit intolerance
- Dehumanization
- Highly vulgar language
- Threats and harassment
- Highly specific accusations that are not backed up by evidence available from the commenter or the public record
The procedures, developed collaboratively with the journalists from Gazeta Sporturilor, were informed by the literature on intolerance, previous work on the content analysis of uncivil and intolerant discourse (including online comments), relevant legislation and court decisions, and socio-psychological and behavioural insights.
A few “Less Hate, More Speech” moderation principles:
- Don’t take the fun out of the game, as long as things stay reasonably civil. Criticism, playfulness, and irony are not discouraged, as long as comments don’t cross the line.
- Moderate the comment, not the commenter. Comments are to be judged on a case-by-case basis, and one’s opinion of the user should be set aside.
- Moderate speech, not thought. Examine what is written, not what could have been written or what the commenter may think. For “borderline” comments, where one suspects a commenter is racist, xenophobic, or sexist because of what they say or imply, their statements cannot be moderated if the speech itself does not break the rules.
- Dislike ≠ intolerance. Simply giving voice to dislike or even resentment of a person or group should not draw moderation, as long as the message is not demeaning or discriminatory.
- Context matters. Know the lingo of the comment section, know the news context and mobilize that knowledge to judge the appropriateness of a message.
While every website’s environment and needs are slightly different and moderation policies may vary, we believe moderation has a greater chance of success if it is: respectful of the commenters and transparent; systematic and driven by detailed procedures; and in touch with the characteristics of the online audience it addresses.