When former New York Times public editor Arthur Brisbane asked earlier this year whether reporters for the paper should be “truth vigilantes,” the response was immediate and decisive: of course they should, readers said — after all, wasn’t that what journalists were supposed to be doing in the first place? It clearly is, but as Clay Shirky and others participating in a Poynter Institute forum pointed out on Tuesday, that job has gotten infinitely harder as the number of information sources has increased thanks to the web and social media. Checking specific facts may have gotten easier, now that anyone can do it, but is reaching any kind of consensus about the capital T truth even possible any more?
The forum, entitled “Journalistic Ethics in a Digital Age,” was a co-venture between the Poynter Institute and CraigConnects — a project created by Craigslist founder Craig Newmark to increase public trust and accountability in journalism — and included a series of panels involving Shirky, Digital First Media CEO John Paton, Microsoft social researcher Danah Boyd and others, as well as an invitation-only audience of media industry luminaries such as author Jeff Jarvis and digital-media veteran Dan Gillmor. The event was livestreamed, and Poynter also created a live-blog of the session in Storify.
With no gatekeepers, who decides what is true?
In the not-too-distant past, Shirky said, we could look to trusted media oracles like former TV anchor Walter Cronkite to determine what the truth was about any given media event, but that was only possible because there were so few sources of media or journalism at the time. This artificial scarcity of information has been exploded by what Om has called social media’s “democratization of distribution,” and the result is that there are literally thousands of different versions of the truth about any given news story, Shirky said. And without gatekeepers, who determines which is correct?
As Shirky noted in an essay that was published as a companion piece to the Poynter forum, the fact that anyone can make themselves heard about virtually any topic — something that was never possible before the web and social media came along — makes it a much more complicated task to arrive at any kind of actual consensus about the truth. As he put it:
Here’s what the “post-fact” literature has right: the Internet allows us to see what other people actually think. This has turned out to be a huge disappointment. When anyone can say anything they like, we can’t even pretend most of us agree on the truth of most assertions any more.
In effect, Shirky suggests that most political and social discourse has become — or is in the process of becoming — like a giant chat room, where debate never ends and anyone with a keyboard and an internet connection can take part, including trolls, those with hidden agendas, and everyone in between. So how are journalists supposed to make sense of all this? The Poynter panel mostly agreed on one thing: namely, that it has become exponentially harder than it used to be, and traditional journalism may not be up to the task. As Shirky put it: “The strategies developed for reporting the truth in the 20th century simply don’t work any more.”
For one thing, said the Pew Research Center’s Tom Rosenstiel, newsrooms are resource-constrained in a way they never have been before, as newspapers and other media outlets try to cut back on costs. So a journalistic entity that would once have done a number of things well — including plain reporting, investigative reporting, in-depth fact-checking and so on — now has to pick which of those to focus on. And economic constraints mean that most outlets are going to choose the one that is the most profitable or the most appealing to advertisers, or that is simply the fastest.
Defining truth requires journalism to be more open
And it’s not enough to rely on non-profit entities like PolitiFact or FactCheck.org to do the heavy lifting, said panelist Adam Hochberg of the University of North Carolina’s journalism school, for the simple reason that these organizations are also beholden to donors and foundations that provide the resources for their work, and that can be just as significant a factor in what gets fact-checked (and how) as any commercial outlet’s relationship with advertisers. In most cases, he said, advertisers who paid for journalism through Sunday supplements didn’t really care much about the content — they just wanted a large audience.
Craig Silverman of Regret the Error noted that admitting mistakes and being open about the process of fact-checking or truth-telling — and the inherently complicated nature of it — could actually increase trust in the media, rather than decrease it. Shirky, meanwhile, said that the whole notion of “objectivity” was something the media came up with in the 1950s and ’60s in order to appeal to a mass audience (and thereby appeal to advertisers), and that it serves no useful purpose any more.
One obvious outcome of what the Poynter panel was discussing is that defining the truth is no longer something that is done by professional journalists in isolation, but something that only emerges over time, through a process that involves both journalists and what Jay Rosen has called “the people formerly known as the audience.” Which is why I’ve argued that fact-checking of all kinds — both specific facts and larger questions of truth — is something that is best done in public. In a sense it has always been that way, it’s just easier to see now while it’s actually happening.
Arriving at the truth may be a lot more complicated than it used to be, because there are more moving parts and more sources than ever, but in the end it is probably closer to the real thing than what our traditional media gatekeepers have gotten used to providing in the past.