- Comment and user profile management systems
Before generating ideas about what you want the community to look like and how interaction should take place, you need to decide on the practical things: how people will be able to comment, when moderation will intervene and how open “the door” to commenters will be. This means deciding what the comment and user profile management system will look like, since this has ramifications for user privacy (how much anonymity the system gives users, and whether anonymity even matters for your website), accountability, and the publisher’s ability to track conversations and user history. Broadly put, you need to consider whether users should have greater participation on the website or whether the newsroom has, or needs, greater control over the comments sections.
In our research on the trends and practices regarding comment sections in five countries, we found that almost 70% of the outlets used either a proprietary comment management system or the one provided by WordPress (rather than other commenting platforms like Disqus or Livefyre). Arguably, a tailored comment management system would suit a publisher’s needs best, unless resources to invest in developing one are scarce.
Equally important to consider is the log-in process. Some outlets in our sample allowed for authentication without verification, and in that sense Romania was an outlier, with 13 websites that allowed users to comment with just a nickname and an email address (without having to register on the website or confirm the email by clicking a validation link). This option arguably gives users the greatest degree of anonymity.
Most websites, though, require creating a profile through registration with a valid email and password. This option allows for more control on the part of the publisher and enhances its ability to track users and connect with them, as well as opening up opportunities for marketing (building the community, setting up badges for users, highlighting valuable contributions on users’ public profiles, user history) and advertising tactics. Recent insights revealed by the Financial Times and The Times of London support this argument. The FT found that users who comment on the website are seven times more engaged than those who don’t, meaning more time spent on the website, more articles read and a habit of revisiting the site more often. The Times of London, in turn, found that users who comment read four times more articles than those who don’t (not all registered users are allowed to comment, just those with subscriptions).
In addition to verified user accounts, some websites also allow authentication through a social network such as Facebook, Google+ or Twitter.
Another option to consider would be to use only the Facebook comments plugin (our research showed that just 8 websites out of 69 relied on this method alone), a decision that would probably give users the least amount of privacy. And although community managers or moderators can still delete comments, they would not be able to moderate or engage in a more meaningful way (for example, removing just the offending parts of a comment or indicating why a comment was moderated in the first place). For some publishers this can be a good tradeoff, since there is some indication in the literature that loss of anonymity can motivate comments section users to be more civil. Contrary to this, our research revealed that, for one of the websites in the project that used both a proprietary log-in system and Facebook login, Facebook users were less active, commented less and were moderated more.
In our case, two of the websites had their own proprietary log-in system where users needed to create a profile with a validated email address and password, one used the WordPress system and one allowed authentication without verification.
Another decision to make is how much of your content should be open to commenting and when moderation intervenes: before the comment is published on the website or after. Most websites we researched used pre-publication moderation only or a combination of pre- and post-moderation. Pre-moderation can take multiple forms:
- pre-moderation by section or topic (a publisher might decide that some sections of the website or some articles, due to their sensitive content, can generate significant amounts of uncivil talk, and allow comments only after a screening);
- pre-moderation by user (if the comments system allows tracking users’ history, the publisher might decide to “filter” the content submitted by certain users based on their past behaviour, as The Guardian is known to do, or, conversely, not filter those with a good track record, as The New York Times does with its “verified commenters”).
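The two pre-moderation forms above amount to a routing decision: hold a comment for review, or publish it straight away. A minimal sketch follows; the section names, thresholds and user-history fields are illustrative assumptions, not the project’s actual implementation.

```python
# Hypothetical routing of an incoming comment to pre- or post-moderation.
SENSITIVE_SECTIONS = {"politics", "religion"}  # always pre-moderated (assumed list)

def needs_premoderation(section, user_history):
    """Return True if the comment should be held for review before publication."""
    # Pre-moderation by section or topic: sensitive areas are always screened.
    if section in SENSITIVE_SECTIONS:
        return True
    # Pre-moderation by user: established users with a clean record skip the
    # queue (a "verified commenter" style fast lane); users with removed
    # comments are filtered. Thresholds here are arbitrary placeholders.
    removed = user_history.get("removed_comments", 0)
    total = user_history.get("total_comments", 0)
    if total >= 50 and removed == 0:
        return False
    return removed > 0

# Usage
print(needs_premoderation("sports", {"total_comments": 60, "removed_comments": 0}))    # False
print(needs_premoderation("politics", {"total_comments": 60, "removed_comments": 0}))  # True
```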
Post-moderation is also an option: it allows users to post their comments immediately, with moderation intervening afterwards if deemed necessary. Any of these options can be supplemented by a dictionary of “bad words” (as we did in the project) that automatically detects and deletes offensive content.
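Such a “bad words” dictionary check can be sketched in a few lines. The word list and function names below are hypothetical stand-ins; a real deployment would use a curated dictionary and handle inflected forms and diacritics.

```python
import re

# Placeholder blocklist; not the project's actual dictionary.
BAD_WORDS = {"idiot", "scum"}

def contains_bad_word(comment):
    """True if any whole word in the comment is on the blocklist."""
    return any(w in BAD_WORDS for w in re.findall(r"\w+", comment.lower()))

def auto_moderate(comment):
    """Return the comment unchanged, or None to signal automatic deletion."""
    return None if contains_bad_word(comment) else comment
```

Matching whole words (rather than substrings) avoids flagging innocent words that merely contain a blocked sequence.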
Deciding how to actually apply the moderation is also something to consider, i.e. how a moderated comment looks on the website. From our research it was clear that almost no publishers chose selective moderation, wherein they remove or replace only the offending part of the comment, leaving the rest intact (which was the main type of moderation we used in the Less Hate, More Speech project, where the moderated part appeared as three asterisks). Most preferred to delete the entire comment and leave a message instead, mentioning that the respective comment was suppressed because it did not abide by the community rules.
Why we went against the tide on this option
The decision to use the asterisks (***) was not taken lightly; it involved negotiating the end goals of the moderation procedure, namely “less hate” through “more speech.” It was important to find ways to attract attention to the new procedure, as well as to turn it into a teaching experience for users, so they better understand how moderation works, what is OK to post and what goes against the rules.
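In the spirit of this selective approach, offending words can be masked in place while the rest of the comment survives. A minimal sketch, again with a placeholder word list rather than the project’s real dictionary:

```python
import re

# Stand-in blocklist for illustration only.
BAD_WORDS = {"idiot", "scum"}

def selectively_moderate(comment):
    """Replace blocklisted words with *** while keeping the rest of the text."""
    def mask(match):
        word = match.group(0)
        return "***" if word.lower() in BAD_WORDS else word
    return re.sub(r"\w+", mask, comment)

print(selectively_moderate("You are an idiot, but your point stands."))
# prints: You are an ***, but your point stands.
```

The visible *** marker doubles as a signal to readers that moderation happened, which is precisely the teaching effect described above.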
We also considered that, at times, there might be well-argued, generally well-intended comments containing rich information, where a single slip-up could see the whole comment deleted.
And finally, although there are arguments against selective moderation (The Guardian believes that its moderators are not editors, so they delete whole comments even if only a line or paragraph is problematic, in an attempt to encourage users to think and be responsible before posting), we felt it was important to avoid being seen as censors of free speech, which, in the Romanian context mentioned in another piece, is a cause of great annoyance among users.
- Check what others are doing
We had multiple sources of inspiration for the rules, as well as for the goals and principles of moderation mentioned in a separate piece: some originating from research endeavours, such as the literature review and the development of a codebook for manually analysing the content of comments, and some from practice (our moderation experiment and the interaction with the moderators and the newsroom).
But probably the most influential was a study of the trends and practices of some 70 news outlets in five countries (Hungary, Romania, France, United Kingdom, United States).
We did the research so you don’t need to go to such lengths, but it is still a good idea to at least check what other websites in your local market are doing in this respect. Their practices might reveal a differentiation point, give you a sense of what matters to other publishers, and perhaps show reactions or feedback from the public to their rules.
There were a few trends coming out of our study (take a more in-depth look at the study and its findings) worth taking into account when thinking about how to approach moderation:
- Most websites prefer pre-moderation: 41% do pre-moderation only (meaning they check some or all comments before publication); 35% only practice post-moderation (checking after publication); and 24% practice a combination of the two;
- Users with a good history on the website are privileged: 17% of publications go beyond punishing abusive content and highlight certain comments or designate certain super-users, whose contributions are either highlighted or are subject to a different moderation regime than those of the other users;
- Websites prefer user authentication: most of the online publications and platforms we analysed (77%) require commenters to either login through a social network, or create an account, a process that involves email verification.
- Community rules can be found on most websites: most publications that host comments also post rules about what is allowed and what is not. Approximately 40% of the websites we analysed stipulate that they moderate comments.
- Prohibitions in the rules differ in Romania vs. the other countries: most websites, Romanian or foreign, that have rules prohibit discourse that incites hatred or discrimination (85%) and vulgar language or images (79%). The rules of Romanian websites tend to be somewhat briefer and more preoccupied with problems like spam, trolling or vulgarity than with issues like discrimination, insults or personal attacks.
- The quality of comments matters: 52% of websites allow recommending comments and ordering them by popularity.
The terms, conditions and practices developed in our project are neither among the most permissive nor among the most draconian, and there are many ways to go about it, depending on the goals and specifics of the outlet and its audience. We believe the rules we implemented strike the appropriate balance for the project and the publisher, and resulted in some of the most detailed and clear guidelines for commenters to be found online, particularly in the Romanian landscape.
Devising the terms and conditions for a community and publishing them on your website is necessary not only to ensure transparency and help users understand why their comments might be moderated, but it is also a good tool to signal a change and raise awareness of the efforts and commitment to foster better debates and engagement between the newsroom and its audience.
For our project, announcing the new rules to the audience was important. On top of users having to agree to the new terms and conditions by manually ticking a box before they commented, the editor-in-chief penned an article explaining them, as well as the reasoning behind the procedure. Before the official announcement, the rules were discussed, refined and agreed upon with the leaders in the newsroom.
See next –> The rules of engagement: what we moderate, what we don’t and why