
Fixing online comments — how do you automate trust?


The social web has been around for more than a decade now, but even after all that time, no one has quite figured out how to fix online comments. Some bloggers have given up trying and don’t allow comments at all, while others have turned their communities over to Facebook, only to find that doing so makes things worse instead of better. Jeff Atwood, one of the founders of the online geek community Stack Overflow, has launched a new commenting system he hopes will help solve one of the crucial problems — namely, trust. But is it even possible to automate that process?

Atwood, who left Stack Exchange — the company that manages Stack Overflow and a number of other similar sites — about a year ago, launched his new venture on Tuesday with a blog post in which he lamented the fact that commenting and user forums have not changed much in the past decade. The vast majority of these platforms, he says, still fail to capture real conversation and are too difficult or expensive to implement.

Figuring out who to trust is the holy grail

The Stack Overflow founder says his new platform, which is known as Discourse, differs from other commenting systems in a number of ways — including the fact that it is fully open source. Atwood used the blog-publishing platform WordPress as a model (see disclosure below), and says the company will rely on selling hosting, support and other services for revenue.

Discourse has raised funding from a group of venture backers including Greylock and SV Angel, although Atwood wouldn’t say how much (another hosted commenting solution, Livefyre, also just closed a round of financing).

In addition to some other innovations, such as links that automatically expand within a comment (in the same way Twitter’s “expanded tweets” do), Atwood says he is trying to build a reputation system that will grant users new abilities based on the level of trust the platform has in them. Although he doesn’t provide a lot of detail, in a comment on a Hacker News discussion thread he suggests that it will be based on behavior such as flagging abusive posts.
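Atwood hasn't published the mechanics, but a behavior-based trust system of the sort he hints at could be sketched in a few lines. Every function name, weight and threshold below is invented for illustration; none of it is Discourse's actual design:

```python
# Hypothetical sketch of a behavior-based trust system: observable
# actions feed a score, and crossing thresholds unlocks abilities.
# All weights and thresholds are invented for illustration.

def trust_score(posts_read, accurate_flags, posts_written):
    """Combine observable behavior into a single trust score."""
    return posts_read * 0.1 + accurate_flags * 2.0 + posts_written * 0.5

# Each ability unlocks once the score reaches its threshold.
ABILITIES = [
    (0, "read"),
    (5, "post links"),
    (20, "flag posts"),
    (50, "edit titles"),
]

def abilities_for(score):
    """Return every ability whose threshold the score has reached."""
    return [name for threshold, name in ABILITIES if score >= threshold]
```

The appeal of such a scheme is that it rewards the behavior Atwood mentions (such as accurately flagging abusive posts) without any manual review; the difficulty, as the rest of this piece argues, is choosing weights that can't be easily gamed.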

Discourse screenshot

Measuring trust and rewarding good behavior is something online communities have been trying to do for years, with mixed success. Some believe that sites like Slashdot — which has a moderation platform that awards “karma points” for certain behavior and appoints moderators automatically — have a good solution to the usual problems of trolling and flame wars, while others argue that these systems are almost always fatally flawed. Metafilter (which charges users $5 to become members) has many fans, but it is also a relatively small community. Branch is another attempt to reinvent user forums and discussion as invitation-only hosted conversations.
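A Slashdot-style scheme can be sketched similarly: users accrue karma as their posts are moderated up or down, and moderators are appointed automatically from users above a karma threshold. The clamping range and threshold here are illustrative guesses, not Slashdot's real values:

```python
import random

# Hypothetical sketch of a Slashdot-style karma system. The karma
# range, threshold and selection method are illustrative only.

def update_karma(karma, mod_delta):
    """Apply a moderation delta, clamping karma to a fixed range."""
    return max(-10, min(50, karma + mod_delta))

def pick_moderators(users, threshold=10, count=2, seed=0):
    """Automatically appoint moderators from users above the threshold.

    `users` maps usernames to karma; selection is random among the
    eligible, seeded here only to keep the sketch deterministic.
    """
    eligible = [name for name, karma in users.items() if karma >= threshold]
    random.Random(seed).shuffle(eligible)
    return eligible[:count]
```

Capping karma and rotating moderation duty among many eligible users are both attempts to limit how much any one account can distort the system — the same gaming problem Denton describes below.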

Trust takes effort, not just algorithms

Atwood says he wants to use a badge system for rewards (something Huffington Post also uses), but Gawker founder Nick Denton said in an interview last year that a similar reward system his sites used was a “terrible mistake,” because it was easily gamed and encouraged the wrong kinds of behavior. Denton has since completely revamped Gawker’s commenting system in an attempt to make reader comments the centerpiece, as well as a potential business model.

As my colleague Jeff Roberts noted in a recent post, the Huffington Post has also launched a new feature called Conversations, which allows popular comments to become full-fledged blog posts of their own. The Verge — a tech blog run by Vox Media — is doing something similar on its site, in an attempt to encourage more discussion and community. But both approaches take a lot of manual effort.

Veteran blogger Anil Dash pointed out in an insightful post in 2011 that one of the only ways to maintain and encourage a healthy conversation — regardless of what platform you use — is to be involved in those discussions yourself as much as possible (a point Bora Zivkovic of Scientific American also made recently). Unfortunately for publishers looking for a quick or inexpensive fix, that kind of engagement is almost impossible to automate.

Disclosure: Automattic, the maker of WordPress.com, is backed by True Ventures, a venture capital firm that is an investor in the parent company of this blog, Giga Omni Media. Om Malik, founder of Giga Omni Media, is also a venture partner at True.

Post and thumbnail images courtesy of Shutterstock / Sam72 and Yan Arief Purwanto

9 Responses to “Fixing online comments — how do you automate trust?”

    • Jake: Stack Exchange uses a flavor of Creative Commons licensing, maybe CC BY-SA as the default. I didn’t read anything about content licensing on the Discourse site, nor in Coding Horror’s blog post when it was announced.

      There’s information about the source code’s open-source license; that is the focal point now. Author/publisher rights will need to be addressed once Discourse sees real use. Of course, that could be in the ToS or another part of the website that I overlooked ;)

    • Bill Seitz,
      Good point! I think that YOU are correct. I just read the first blog entry on the Discourse site, written by Jeff Atwood; here’s the URL

      It specifically refers to Discourse as a standalone forum system, as you said. That’s the reason it has so many features. That makes more sense, as it would be overkill to have THAT much functionality in a comment system!

      Disqus is an adequately featured comment system but isn’t open source (not that I’m complaining). Discourse is open source on GitHub, but IS a forum system. In the comments on that blog post, where Jeff responds, they discuss the possibility of Discourse as a phpBB alternative. Any idea why Discourse was described here as a comment system?

  1. Thank you, Mathew, for returning to this topic, which has refused to be solved by the usual means of mass media and mass marketing. You are right to question whether trust can be automated. Countless studies (especially studies of trust among stock market brokers) have shown that trust between people is based on the memory of past interactions. But it is a relationship between specific people, not a measurable quantity. It seems obvious, but you can trust somebody for his or her knowledge of wine, and that “trust” is not transferable to trust on legal, medical or plumbing advice.

    Yes, the “social Web” is a decade old, but The Well will soon celebrate its 28th anniversary. “Community” is just a word to describe how to manage relationships between people inside a collective platform where they have a stable identity and where interactions are remembered: reputation and trust can only grow — or not — between people, not as a transferable quantity. The obstinate refusal of the news media to recognize — and to study — how communities do it is appalling.

    Oh, sure, there is a problem: communities work only one person at a time. They scale, but very slowly, especially at the beginning. And our publishers and marketers — and VCs — are only interested in anything that can work a million at a time. Sorry, folks, no shortcut. On the other hand, the first to start a real community will have a head start. :-)