Engaging the moderation team is key, since they will be doing the heavy lifting both when the procedure is first set up and continuously afterwards. This means not only surveys, but regular talks and ongoing support even after the rules and principles of moderation have been settled.
It is important to keep track of the issues raised by particular comments along the way and to keep a log of what happened and how it was resolved, so that anyone can go back and check past resolutions when similar cases come up. This matters especially because a good number of comments will be borderline, and deciding whether they should be moderated is not easy.
Over the course of the 14 months during which the moderation procedure was supervised by the project’s researchers, the moderators produced 34 field reports with questions and observations on commenter behaviour, and there were a total of 27 face-to-face meetings between the moderation team and the researchers.
For each meeting, the research team produced a handout designed to answer the moderators’ questions and explain key concepts, such as intolerance and prejudice, and their application to the comments. Even with more limited resources (without someone dedicated to documenting and researching such concepts), one could hold meetings to talk over difficult comments and decide whether they fall under the agreed principles and rules of moderation, as well as set up an online archive of those decisions. This systematic work produces invaluable resources for the future and helps keep everyone on the same page, making the moderation more consistent and giving the team confidence that their decisions are in line with the agreed principles and that the moderation has positive effects for all involved.
Engaging the moderation team in talks and meetings on a regular basis might help relieve some of the burden they accumulate from sifting through sometimes “nasty” comments all day, a task some consider “the worst job in the world”, though one that still offers glimpses of “wit, wisdom and a community worth fighting for”, as a former Guardian moderator put it.
At the New York Times, the team of moderators is made up of “about 13 mostly part-time moderators, all of whom are journalists” who review some 12,000 comments per day and oftentimes feel the burden of their job, as one of them put it in a recent article depicting their work: “I won’t lie; there are days when I think, ‘if I see the word ‘Trump’ one more time …’ It can get to be a slog… But you want to give each comment its due. We want people to be heard.” (Stroud & Muddiman 2017; Long 2017)
In our experience, the moderators experienced fatigue after concentrating on this task for longer periods, and they told us that having some other minor tasks around the newsroom helped them detach a bit from their main job. They also believed that, in some cases, banning abusive commenters might work better, giving repeat offenders a chance to cool down while easing the burden of having to moderate rage fits. Although the project did not allow banning commenters (as explained in a previous piece), after it ended the newsroom decided this was for the best and implemented a system of gradual banning, while also updating the “rules of the game” on the website.
As for the surveys, one could start by sending some questions to the moderation team even before the procedure is implemented (in our project we also included the moderators in the first journalist survey) as well as some time after. The goal would be to get feedback on how they see their job and what their concerns are. It might also be easier simply to talk with them on this subject (if the team is rather small). If anonymity is preferred, here are some sample questions to guide your survey, drawn from the ones we implemented in the project:
- How would you describe what your role is in the (new) position?
- Have you seen an “ok” comment section anywhere on the web? What did you like about the interactions that were taking place there?
- What bothers you the most in comment sections? (One option would be to have a top five of things they mention).
- How does the role of moderator feel?
- What is your biggest fear related to the effects of comment moderation?
- What is your biggest hope related to the effects of comment moderation?
- What do you think, IDEALLY, SHOULD BE / WAS IN REALITY the role of a team that deals with comments? (for these questions they could be asked to rank each role from 1 to 4, where 1 is “most important role” and 4 is “least important role”)
- Gatekeeper – acts like a filter, making decisions about comments and what kind of language is acceptable or not on the website
- Facilitator – encourages interactivity and facilitates dialogue, for example by posting comments and highlighting good comments on the website
- Expert/Assistant – answers questions or problems raised by users, related to the websites or the comment sections, helps commenters learn how to navigate the website and behave
- Messenger – relays feedback, information, opinions and ideas from the commenters to the newsroom or to management.
- It is possible to separate aggressive comments from the rest of the comments. (on a scale from 0 to 10, where 0 = completely disagree and 10 = completely agree)
- Moderation led to diminished numbers of vulgar, aggressive and intolerant comments on the website. (same scale as above)
- Commenters negatively perceive the moderation. (same scale as above)
Interestingly enough, the engagement we had with the moderation team in this project translated not only into a two-way learning process, which they admitted made them increasingly aware of framing effects when choosing certain subjects, wordings or pictures, but also turned them into the go-to people when the newsroom debated how it wanted to frame articles.
Because of their exposure to the comment section and their growing awareness of what tended to generate the most hate or incivility, they often joined the newsroom’s sometimes heated debates or raised the alarm about problematic coverage. In several instances, moderators managed to change their colleagues’ minds about the appropriateness of certain topics or angles.
As one of them reported: “After Dinamo won a friendly with two goals from African players, a colleague asked me if it was ok to use a title like ‘Black Dogs’ (n.b. Dinamo players and fans call themselves “dogs” or “red dogs” because of the team’s insignia). We discussed this subject a little, without reaching a clear conclusion. There were arguments for and against: the expression itself was not racist, but the fact that it highlighted their skin colour, in a context where this had no relevance, could be perceived differently. In the end he used a different title.”
Both from their testimonies and from their answers to various surveys, it appears that by the end of the project, the moderators also saw a need for the newsroom to think not just about moderation, but about how the comments are shaped by the news environment and how the users can be engaged and influenced by the journalists. In the last moderator survey, the entire team agreed that the tone and content of articles influence the tone and content of the comments.
Likewise, the majority of them believed it would be better for both readers and the newsroom if the authors actively participated in the comment section. All thought that readers are more civil when a journalist is involved in the discussion. They also thought it was a good idea to highlight good reader contributions (a feature that had been enabled as part of the project and that we covered in a previous piece) or to adopt a policy for communicating with commenters (for instance, thanking them when they report errors or replying to good comments).
Getting the moderation team involved in talks and surveys will not only smooth the implementation of the procedure, but will also prepare them in case one needs to experiment with the comment section, for example by changing its appearance, altering what commenters can see, or introducing new tools for them to use.
*The first six sample questions mentioned above were implemented in the first moderators’ survey in the project, which took place in May 2015 (a month after the supervised moderation began), while the last four were implemented in the second moderators’ survey, which took place in July 2016 (after the end of the supervised moderation period).
Take a look at our experience with different experiments in the comment section and see if any might be suitable for you.