
Craig Newmark on the Web’s Next Big Problem


I had the chance to sit down with Craigslist founder Craig Newmark recently at his favorite breakfast spot in San Francisco, just a block or two from the house where Craigslist was launched 15 years ago this month. We talked about a number of his favorite topics, including the bird feeders he keeps having to replace (because the squirrels he likes to post about on Twitter destroy some 10-15 of them every year), his love of dogs (he doesn’t own one himself, but keeps dog treats with him to feed the various neighborhood pets he runs into during the day, most of whom he knows by name) and — last but not least — what he thinks is the next big problem the web has to solve.

And what is that? The question of who to trust online, according to Newmark. To solve it, he believes that what the web needs is a “distributed trust network” that allows us to manage our online relationships and reputations. I just happened to have a Flip video camera with me, so I convinced him to let me capture a few minutes of him discussing this concept; I’ve embedded the clip below.

Newmark called some form of distributed trust system “the killingest of killer apps” for the web over the next decade (he said he wasn’t sure that was the best way to describe it, but was trying it out to see how it sounded). He talked about “reputation and trust ruling the web, just the way it does in real life,” and how he was looking to big players such as Google, Facebook and Amazon as the kinds of entities that would have the scale to handle such a distributed trust or reputation management network. And he said that despite some occasional missteps by both Google and Facebook when it came to privacy (Google Buzz and Facebook Beacon, respectively), he believed that both were acting in good faith and had a policy of “not being evil.”

The Craigslist founder also said that he saw a place for government to be involved in this process — something he hoped he would be able to help with — but that there would need to be a public-private partnership to provide checks and balances. And he hoped that the major players such as Google and Facebook would cooperate to create some kind of universal standard or platform to support such a trust or reputation network, rather than fighting with each other. Newmark said that as a society we needed to “get our act together and make this happen,” adding with a wink that the idea for the distributed trust network was all part of his “hidden agenda to move ahead on the web to try and save the world.”

36 Responses to “Craig Newmark on the Web’s Next Big Problem”

  1. I’m wondering why my comment has been removed?
    It wasn’t spam and was about trust, digital IDs and my company Certified

    We’re extremely serious about what we’ve been doing for the past 3 years: we are certifying digital IDs in 37 countries and just signed a major contract with the French government to provide certified digital IDs to all French citizens.

    So I would really appreciate it if my comment remained here, as what Craig has been calling the web’s next big problem is what we’re trying to work on: a distributed trust label.
    Thank you!

    Charles Nouÿrit
    Founder & CEO of

  2. As I was commenting on Stowe Boyd’s blog,

    I tried to reason about this in some papers in the past, for example in “Trust metrics on controversial users: balancing between tyranny of the majority and echo chambers”
    I tried to back all my claims with empirical analysis on real trust networks (mainly derived from

    I’m totally for what I call local trust metrics: the code is open source, you run it locally on your computer, so this means that you can set your own parameters (trust horizon, propagation, algorithm, timeout, …). Of course most people will use the default values of the most well-known open source trust metrics available, but it is at least possible not to be told by someone else (Google, the government, the Matrix, whatever) “who you should trust.” What if the global trust metric tells you you should not trust your mother? ;)
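    A local trust metric of the kind this comment describes can be sketched in a few lines. The following is a hypothetical illustration, not any published algorithm: trust propagates outward from you along your personal trust edges, decaying at each hop, and stops at a configurable horizon — exactly the kind of parameters (horizon, propagation decay) the comment says you would set yourself.

    ```python
    # Hypothetical sketch of a "local" trust metric. Trust starts at 1.0
    # for the source user, propagates along direct-trust edges, decays by
    # a configurable factor per hop, and stops at a configurable horizon.
    # All names, values, and parameters here are illustrative assumptions.

    def local_trust(graph, source, horizon=3, decay=0.5):
        """graph: {user: {neighbor: direct_trust in [0, 1]}}.
        Returns each reachable user's trust score as seen from `source`."""
        trust = {source: 1.0}
        frontier = [source]
        for _ in range(horizon):
            next_frontier = []
            for u in frontier:
                for v, direct in graph.get(u, {}).items():
                    propagated = trust[u] * direct * decay
                    # Keep the strongest trust path found so far.
                    if propagated > trust.get(v, 0.0):
                        trust[v] = propagated
                        next_frontier.append(v)
            frontier = next_frontier
        return trust

    # Toy network: I fully trust my mom, somewhat trust a colleague,
    # and my mom trusts her neighbor.
    graph = {
        "me": {"mom": 1.0, "colleague": 0.6},
        "mom": {"neighbor": 0.8},
    }
    scores = local_trust(graph, "me", horizon=2, decay=0.5)
    ```

    Because the computation starts from *your* node and *your* parameters, no central authority decides whom you should trust — which is the point the comment is making.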

  3. Kaliya - identity woman

    I would invite Craig and all your readers interested in this topic to join the community working on identity and trust on the web at the 10th Internet Identity Workshop.

    We have been working on open standards and collaboration across all the major consumer portals and enterprise companies for 5 years. The most recent development is the Open Identity Exchange, which hosts trust frameworks, the first one coming from the US government. Subsequent ones are forthcoming for PBS and OCLC.

  4. You can only trust someone if you know their bank details, phone number and address.

    How many people would be prepared to share that kind of information?

  5. I’m a founder of a restaurant review site, and have thought a lot about the trust problem over the years. One aspect to consider is that the same person may be trustworthy in one sphere and to one group of people, but not trustworthy in another. It would be hard to assign an overall trust metric, because the context matters a lot, even in just one area. For example, a friend of a restaurant owner might write a ridiculously high review for that restaurant, but then fairly review all the others. Or they might be totally unreliable on a restaurant site, but be fair and reasonable on ebay.

    People’s motivations are frustratingly complex, and the trust war becomes quite an arms race between those who are trying to build reliable services and those who desperately need to manipulate those services for their own real-world gains. In a sense, it’s ideals vs. reality. Everyone wants a fair platform, but once a platform is pronounced fair, everyone wants to sneak around to get on top. Partially because they need to pay their mortgage, and partially because they (rightly) assume others are doing the same.

    I hope the problem is solvable–this is what keeps me up at night.

    • Tina, this is pretty much what we at WOT have experienced as well. This is how we have solved the problem when rating the reputation and trustworthiness of web sites:

      Instead of a regular democracy, where everyone has one vote, what we have in WOT can be called a meritocracy. In our system, all votes are evaluated by their merit. Unlike in a typical meritocracy, we don’t know anything about you, your social status, or your skills, but we do know how you have voted in the past. Using a number of statistical algorithms, we compare your voting behavior with that of other users, and determine exactly how much we can trust you.

      If a user’s voting behavior is completely erratic, or we notice an actual attempt to manipulate reputations, we simply don’t trust that user’s votes as much anymore. In WOT, trust has to be earned.
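      The merit-weighted scheme described above can be illustrated with a small sketch. This is a hypothetical toy model, not WOT’s actual statistical algorithms: each user earns a merit score from how closely their past votes tracked the consensus, and that merit then weights their current votes.

      ```python
      # Hypothetical sketch of merit-weighted ("meritocratic") voting:
      # users whose past votes tracked the consensus get more weight.
      # This illustrates the idea only; it is NOT WOT's actual algorithm.

      def merit(user_history, consensus_history):
          """Merit in [0, 1]; 1 means the user always matched consensus.
          Votes and consensus values are on a 0..1 scale."""
          errors = [abs(v - c) for v, c in zip(user_history, consensus_history)]
          return 1.0 - sum(errors) / len(errors)

      def weighted_rating(votes_by_user, merits):
          """Combine current votes, weighting each by the voter's merit."""
          total_weight = sum(merits[u] for u in votes_by_user)
          if total_weight == 0:
              # Fall back to a plain average if no one has earned merit.
              return sum(votes_by_user.values()) / len(votes_by_user)
          return sum(v * merits[u] for u, v in votes_by_user.items()) / total_weight

      # A user whose history matched consensus outweighs an erratic one.
      past_consensus = [0.85, 0.8, 0.75]
      merits = {
          "steady": merit([0.9, 0.8, 0.7], past_consensus),   # close to consensus
          "erratic": merit([0.0, 1.0, 0.0], past_consensus),  # all over the place
      }
      rating = weighted_rating({"steady": 0.9, "erratic": 0.1}, merits)
      ```

      The erratic voter still counts, just less — so manipulation attempts get progressively discounted rather than needing to be banned outright, which matches the "trust has to be earned" framing above.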

  6. ultimately, i believe trust comes down to the right incentives being established and balanced between the various constituencies that need to interact. no easy task and like all past “trusts” there are often unanticipated consequences or repercussions that come about, but we just need to execute on some baby steps so we have something to begin evolving and iterating on.

  7. Trust is the new black :)

    I am the CEO of Web of Trust (WOT) – a free, community-powered safe surfing add-on for the IE, Firefox and Chrome browsers. We operate one of the largest “distributed trust networks,” with approximately 10 million users who have rated the trustworthiness and reputation of over 26 million web sites so far. We are continuously exploring new ways to improve our service, so if anyone has great ideas, please let us know.

  8. I agree with Craig about “reputation and trust ruling the web, just the way it does in real life…”

    The need for trust is universal and arises from our human interdependence. We often rely on others (individuals, groups, brands or institutions) to help us obtain, or at least not to frustrate, the outcomes we value (and they depend back on us as well). Trust allows actions to occur that otherwise would not have been possible.

    Philosophically, we build and maintain trust in the digital world just as we do in the physical. Participation in digital world communities and platforms can accelerate the speed and reach of the trust metric, but the underlying human reasons for earning (and maintaining) trust are, I believe, the same: Ability + Integrity + Benevolence – demonstrated consistently over a time continuum.

    If interested in more :

  9. I already have a trust network, it’s people I know and that know me. And there is a secondary effect, people that know the people I know.

    What’s the problem that Craig believes needs to be solved? There are lots of problems online; figuring out who to trust is not high on that list…

  10. I think Craig is right on the mark on the trust thing. I think the dust is now slowly settling and users are saying “hey, wait a minute, how come our data is not being controlled by us?” Trust will come, but within a trusted network of family and friends where content is shared only with those whom you trust and have faith in. We at Virtual LockBox are actually developing such a model for sharing information and content with those you trust, and at the end of the day you, the user, are in full control of your data: you share it with whom you want and when you want.

  11. tolleson

    I agree that this is a challenging and fertile area for innovation.

    The problem itself is massive. To begin with, there is so much lying going on online that it is close to an epidemic. Having been the CEO of a well-known web company, and having had a successful exit, I now see my former employees and co-workers fanning out across the web, as is to be expected 4 years after selling the company.

    What has astounded me is the amount of resume puffing that has gone on after the fact. All of a sudden, people who had important individual-contributor jobs are telling others about how they “built xyz” or “launched abc.” Others have used inflated titles post-transaction to make people think they had those elevated roles pre-transaction. Anyone who understands the M&A world knows those are very different times.

    This may not be the kind of reputation issue that Craig talks about but it is a problem. Who can you trust? Is their past to be believed? It’s a major epidemic imho.

  12. Trust is learned, it is not assigned. Basically, one observes behavior over time and learns whom to trust. It’s heavily biased, based on generalizations.

    Google assigns a ranking value to text that has been machine-translated and back, which in the process becomes totally garbled apart from a few keywords. No hope there that they will ever figure out how to build a system which builds and learns its own generalizations.

    I trust my bird feeders to withstand the usual squirrel attacks, but not a black bear attack. Observations from the Rocky Mountains.

  13. eBay has a pretty good self-policing mechanism. It wouldn’t be that hard for Craigslist to incorporate something like that. It would be a big addition for CL.

  14. Fascinating, though not explicitly articulated. I can see where this is going. I have my own ideas about this regarding school districts and the “pooling” of blended capital from non-profit, private equity and federal/state monies to assist education innovation. If either of you would like to have a conversation about that, let me know.