How Facebook comments affect trolling for news websites

Whether news sites should or shouldn’t use the Facebook comment plug-in or Facebook identity seems to have been a recurring theme in the last few days.

The Nieman Journalism Lab called it a “movement”, which seems quite a grand term for two sites announcing similar but different things on the same day, but both Politico and TechCrunch are opting to move their commenting systems away from Facebook. At the very same time, waves were being created in the UK as the newly-relaunched Manchester Evening News shifted to a commenting system that required users to have a Facebook account. At the heart of all this is the old canard — would forcing users to comment with something closer to their real identity reduce instances of trolling?

It seems to me that what Politico and TechCrunch have in common is a stubborn belief that the quality of debate underneath their articles would improve if only they could find the right commenting platform.

At Politico, Dylan Byers is putting his faith in technology:

“Disqus gives you the ability to up-vote and down-vote comments and thread responses. By default, high quality comments will filter to the top, and poor quality ones will not show up on the page.”

A view immediately debunked in the first comment left on the piece, where Adrian Lowe pointed out:

“That’s if people actually vote for them. And if people are trolling in voting, then low quality comments will be seen at the top. So, ‘by default’ high quality comments will not necessarily rise to the top.”

You only have to look at the green and red arrows on the MailOnline site to see how sometimes it is the scum that rises, not the cream.

TechCrunch’s attitude to their below-the-line contributors was made clear by the image they chose to accompany their announcement: “I miss you, asshole.”

They seem to be ascribing the behavior of their users to the commenting platform they employ, rather than to the way readers are goaded into commenting by the articles themselves. As my ex-colleague Meg Pickard says:

“If you write a provocative article, you can expect people to be provoked.”

The Manchester Evening News move is in the opposite direction, hoping that a shift to using Facebook identity will improve the commenting experience on the site. There’s no doubt that restricting people to only using Facebook identities will exclude some users, but David Higgerson wrote an eloquent personal blog post about the shift: “Much ado about Facebook”.

“Most of the people who have complained…seem to come from a starting point that news websites should allow free-for-all comments on all stories, and that the ‘community’ can say what it likes under any name it likes. I don’t see it like that.”

My own experience with using the Facebook comments plug-in under news content was within the Guardian Facebook app.

I had rather hoped that by opening two commenting threads underneath each article — one on Facebook, and one on the Guardian site — we’d be able to prove once and for all whether one or the other led to better interaction. In the end, the tone set early in a comment thread appeared to influence subsequent comments far more than anything intrinsic to the format or identity system used.

There’s no doubt that software design and features influence community behavior, but not as much as decent community management and personal engagement from journalists do. In 2011 my friend Mary Hamilton wrote a very thorough blog post arguing that news organizations have a responsibility not just to provide a commenting space, but to participate in that space themselves:

“If you don’t set examples of good behavior, or reward [commenters], or empower the regular visitors to police their community by telling them the rules, your community will make its own rules, and chances are you won’t like them.”

She described switching tech platforms in search of an answer to bad community problems as akin to “laying Astroturf over an unkempt, unmaintained garden because you don’t like the color of the wildflowers.”

She also said:

“The news industry can’t simply automate away its duty to respond to users. Small publishers and bloggers for the most part understand this, and — more crucially — so do our users. These are human beings at the other end of the internet, talking in our spaces, and we need to start treating them that way.”

Still, the golden rule of newspaper website comment systems is “Don’t be a dick” — and no technology choice can enforce that.

This was first posted at Martin’s personal blog, Currybet.

Martin is principal consultant at Emblem, which provides user experience design and training services. He was previously UX Lead at The Guardian, which included working directly with Facebook on the news organization’s Facebook app. Martin also currently provides some design and consultancy services to Trinity Mirror, publisher of the Manchester Evening News.

Guardian News and Media Ltd., the parent company of the Guardian newspaper, is an investor in the parent company of this blog, Giga Omni Media.

10 Responses to “How Facebook comments affect trolling for news websites”

  1. GreyGeek

    What is scum and what is cream often, like beauty, is in the eye (and mind) of the beholder. Also, why show sexism by using a male pejorative? Fifty percent of the population is female. Equality suggests that “Why be a dick or cunt” is equally offensive.

  2. Ro Gupta

    Hi Martin — Ro from Disqus here.

    This is one of the more thoughtful analyses on comment quality we’ve seen in a while. And we’d agree — we often say ourselves that software is 50% of the equation at most.

    Looking across a couple million sites, we see that the best communities tend to set norms and a distinct tone upfront, and they reinforce them on an ongoing basis through active staff engagement in the discussions. Self-policing by fellow commenters is also critical, which tends to happen more often when authors are consistently present. As Mary alludes to in her quote, as well as the commenter “agirlcalledTom” in this thread, the ‘broken windows theory’ and good old human psychology are the core factors in all of this.

    To clarify a point above, however, the Disqus sorting system does more than just take vote totals into account — although voting is a key part and has increased significantly in the new Disqus — and has built-in checks for improper use or gaming the system. Upon moving to the new system last year, we saw abuse reporting decrease by 80% and moderator workload decrease 25%, so we were encouraged that there are still a number of things software can really help with. Visibility management and use of quality and reputation signals are the big focus areas for us.

    • currybet

      “The Disqus sorting system does more than just take vote totals into account” – that’s good to hear Ro. I did quite a bit of design work at the Guardian on ways that the metrics of how people behave could be factored into giving users a “reputation” score behind the scenes that might help with borderline moderation decisions. I definitely think that software design does influence user behaviour, I’m just not convinced you can design away bad behaviour with a new widget as some sites seem to think.

  3. Social Media Insider

    The primary problem lies in the belief that an on-site, site-controlled commenting system that simply aggregates comments will on balance result in something of value. History has proven time and time again that only in rare cases is this true.

    The solution lies in a system of distributed pods of comments, not aggregated on-site comments.

  4. Laura Jean

    Reblogged this on Laura Jean and commented:
    Good perspective, things change so fast – what I learned in school a mere 6 years ago about reporting and journalism is pretty much useless now. That is, except the law part of it….:)

  5. There is no mystery to this. Sometimes it seems that News orgs are trying to reinvent the wheel.

    The rules for good community interaction are the same as before social went mainstream: Clearly state the rules, be seen to enforce them and enforce them fairly, reward good behaviour and set an example of how you want people to behave.

    At the end of the day it is about human behaviour not about platforms.