24 Comments

Traditional media outlets like the Wall Street Journal and the New York Times have begun to use some of the tools of social media — blogs, Facebook pages, even Twitter accounts. But they seem a lot less eager to adopt some of social media’s core principles, including a commitment to the two-way nature of the medium and all that it represents. This means a lot more than just talking about “the conversation” and how great it is to get links or comments. It’s about taking those comments seriously, responding to them regardless of whether they are positive or negative, and incorporating that approach into the way you do your job. It’s about looking at “journalism,” broadly speaking, as a process rather than an artifact.

This is something that most of the blogosphere, or at least the part of it that cares about accuracy and integrity, does pretty well. Sites like GigaOM and others update their posts when information is added or corrected, and in many cases link to critical or differing opinions (and if they don’t, they should). In that sense, truth — to use a loaded word — is not absolute, nor is it something that a single entity has a monopoly on, particularly around a developing or complicated issue. The most we can hope for is that an outlet of any kind, whether it’s a blog or a traditional newspaper’s web site, does its best to represent an issue fairly and completely, and that requires additions, updates, links and discussion.

The WSJ arguably failed that test on Monday, with its story on Google and how its position on “net neutrality” had allegedly softened.

There has been, and will no doubt continue to be, debate about whether the Journal’s perception of Google’s behavior is correct. Some believe that Google is actually giving itself a benefit that others can’t match (except, of course, other large web companies such as Microsoft, Yahoo, Amazon, etc.). Others see it as a natural move by a large Internet company, and no threat to net neutrality at all. Whether you agree depends on what you think net neutrality is supposed to mean, and what Google’s role in it is. If you want to understand more about the issue and the way the Journal described it, read some of the links in David Weinberger’s post.

What isn’t in dispute, however, is that Google completely disagreed with the implications in the article, as company representatives made clear in a blog post written not long after the story went up on the Journal site. It’s understandable that Google might take issue with the story, of course, since it paints the company’s behavior in a negative light. But that’s not really the point.

What is important is how the Journal responded to these criticisms, both from Google and Lawrence Lessig (who was also quoted in the Journal story and noted, in his own blog post, that the description of his views was simply not accurate), and from other sources. Was the story itself updated? No. Were any links to the blog posts in question included, even as supplementary material? No. There was a blog post on the Journal site that mentioned how the story had “gotten a rise” out of the blogosphere, which included a couple of links, and then on Tuesday there was a second one, also with links to additional posts at Wired and elsewhere, as well as a description of what “edge caching” is.

The Journal offered no response to Lessig’s factual assertions about his views and the way they were described. There is no acknowledgment of them apart from the Journal’s second blog post (which someone reading the original story might or might not even find). To any self-respecting blogger, this looks like a failure. Why not put all of that information, whether links to critical blog posts, updates on factual errors, or anything else that is relevant, inside the original story? Why not allow those responses to help expand the way people look at the story? They’re going to do so anyway, once they come across them on their own. Is the Journal simply hoping that they won’t, and the story will remain pure and unsullied by criticism?

That’s an old-media approach. It’s a way of saying, either directly or by implication, “The truth is whatever we say it is.” Any critical responses, even from two of the major players in the story, are relegated to a blog post that gloats about the reaction the story got, but does little to treat it as valid or worthy of inclusion. As Scott Rosenberg of Salon points out, online media provides the tools for a real conversation, one that changes the way people look at an issue, and for a real “journalism as a process” approach to the news. It’s a pity the Journal couldn’t spot — or take advantage of — such an opportunity when it presented itself.

  1. Matt:

    Well said. Perhaps they feel justified because they feel enough people will empathize with their anti-Google bent. The article endeavors to create controversy where none exists, to paint Google as the next Microsoft. On that score, the WSJ fails:

    http://googlewatch.eweek.com/content/google_and_net_neutrality/the_wall_street_journal_fly_in_googles_net_neutrality_ointment.html

  2. not so fast. WSJ had the courage to call Google on it. why should they retract it just because Google wants to spin it?

  3. @les madras — I’m not saying they should retract the story, merely update it with new information.

  4. a few comments for Mathew,

    I am by no means defending the Journal for what they did, but I think you may have failed to see the bigger picture…they are very new at this, and by this I mean blogging, web 2.0, etc…

    I believe that most paper companies are now trying to use tactics like blogging to appeal to younger audiences. However, while doing this, they are dropping the ball, as you mentioned above, in the “follow-up department”, which is indeed the heart of true blogging. The sharing of information they are doing is taking traditional journalism to a slightly new level, but not to a fully web 2.0 level…

    I want to give them a few more months, and then we can bash them…show me a traditional paper company that does this well…you might be looking for a long time.

    Nice article

  5. great post

  6. When I read the second WSJ blog post, I was struck by 1) its cursory approach and 2) how much better a job the commenters did at explaining/discussing edge caching.

  7. The only way we can get HD Youtube, cheap Google App Engine storage and bandwidth fees, and fully unthrottled BitTorrent usage is if ISPs install cache servers on each of their networks.

    I don’t know how Google’s OpenEdge technology works. But OpenEdge has the word Open in it, which is probably a good start.

    Google just launched Youtube HD, but how can they keep Youtube HD free unless they lower the cost of bandwidth to the minimum, and cache it everywhere so that streaming those Youtube videos in HD quality remains free for an unlimited number of people?

    How can you be sure Flickr can’t use Google OpenEdge cache servers as well, to lower bandwidth costs and make it easier to scale the hosting and delivery of their most popular content?

    I’d guess Google OpenEdge technology would use something like Google App Engine technology, consisting of APIs and the like, with pricing algorithms based on storage, bandwidth used, location of users, ISPs used and other factors. For any company, such as Flickr or any startup, using OpenEdge would only make sense for hosting and delivering its most popular content, which is exactly the content it makes sense to cache as close as possible to the ISPs.

    It’s all about scaling: if you want to deliver big 1GB HD video files for less than $0.01, the only way to do that is to deliver them from cache servers on the ISPs’ own networks. Otherwise you pay $0.05 or $0.10 per GB, or even more.

    Basically, installing those cache servers could be a way for ISPs to lower the cost per GB to their users once ISPs start charging people for the amount of data they use, which is ultimately the only way it is going to make sense to charge for bandwidth usage. Flat rates don’t make sense anymore when some users only check email and other users stream and upload HD videos that consume millions of times more bandwidth.
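    The per-GB arithmetic in this comment can be sketched with a toy calculation. All rates and the `delivery_cost` helper below are illustrative assumptions echoing the figures quoted above, not real Google or ISP prices:

```python
# Hypothetical cost comparison: delivering HD video over paid transit
# vs. from cache servers inside the ISP's own network.
# The per-GB rates are assumptions taken from the comment, not real prices.

def delivery_cost(gb_delivered: float, price_per_gb: float) -> float:
    """Total bandwidth cost for delivering `gb_delivered` gigabytes."""
    return gb_delivered * price_per_gb

TRANSIT_RATE = 0.05  # assumed $/GB when paying a transit/CDN provider
EDGE_RATE = 0.01     # assumed $/GB when serving from an ISP-local cache

# Streaming one million 1 GB HD videos under each scenario:
videos = 1_000_000
transit_total = delivery_cost(videos * 1.0, TRANSIT_RATE)  # 50000.0
edge_total = delivery_cost(videos * 1.0, EDGE_RATE)        # 10000.0

print(f"transit: ${transit_total:,.0f}  edge: ${edge_total:,.0f}")
```

    Under these assumed rates, ISP-local caching cuts the delivery bill fivefold, which is the economic argument the comment is making.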

  8. Great post Matthew. I think it’s just the old-world thinking of social media as “hey! another set of eyeballs!” distribution channel building that they’ve always used. Until they start seeing these “channels” as communication channels and not just distribution channels they’ll fail to reap the benefits of an engaged online community.

  9. @les madras, what courage is there in being wrong, or in not recognizing that you might have made an error? And who gets to decide which part of the conversation is labeled “spin?”

  10. I think it will take a while before traditional media really catches up and understands the purpose of the medium – to be interactive and responsive as things happen. Lessig’s point of view should have been acknowledged right away, and even linked and responded to. Otherwise, the trust level of their publication will be seriously affected if this keeps happening.

  11. This is a great post, but I think it presupposes that the WSJ is (or should be) trying to achieve an objective truth, which is arguable… The WSJ is a business, and I think they’re doing what they believe to be the most profitable approach. Whether or not their approach is the right one will be known in the future…

  12. Great Post Matthew. Serious stories have multiple players and perspectives… that is what journalists should expose.

  13. [...] about looking at “journalism,” broadly-speaking, as a process rather than an artifact. – from Gigaom Filed Under: TechTags: Rupert Murdoch · Wall Street Journal Digg, Facebook, Email: Share [...]

  14. [...] the WSJ on GigaOM over its recent article on Google and net neutrality, in “How the WSJ Failed the Web 2.0 Test”: it did not update the story when, within a few hours, fundamental new elements emerged [...]

  15. [...] How the WSJ Failed the Web 2.0 Test This is a perfect example of, well beyond the economic modeling, newspapers are still slow to embrace the journalistic improvements that the web allows. This is just bad journalism, and the WSJ should be better than that. (tags: new.media newspaper journalism wsj gigaom) [...]

  16. Thanks for the comments, everyone. And to @Chris, I know that most newspapers don’t take this approach — I work for one. I’m trying to think (and talk) about what we *should* be doing, rather than what we are currently doing.

  17. [...] the WSJ on GigaOM over its recent article on Google and net neutrality, in “How the WSJ Failed the Web 2.0 Test”: it did not update the story when, within a few hours, fundamental new elements emerged [...]

  18. i am so frustrated — honestly, WSJ needs to be up on its game already! i just wish this transition wouldn’t take so long. it’s clear that newspapers will not be on paper very soon, and yet we still go on acting as if the physical paper is still a necessary part of the business. just like pretending that updating a story as the day goes on will make the story less PRISTINE or less TRUE. i kinda wanna dissect this specific situation down to the gristle and find out WHO there made the decision not to blog or update the story as information came in, and see what the problem is at the WSJ — i have worked at two newspapers before things started falling apart, and there are always one or two ppl making the final decision. so who was it who said don’t touch the story?

  19. [...] statement from Lawrence Lessig, who was mentioned in the story, correcting that interpretation. Matthew Ingram considers this a perfect example of how the traditional press still fails to understand the new digital journalism by not [...]

  20. [...] Gigaom reckons WSJ failed the Web 2.0 test [...]

  21. @les madras

    No, there is nothing to “spin.” Google is right and the WSJ is wrong. Edge caching is available to anyone who can pay for it. That’s the only kind of NN that Google has ever wanted–no *exclusive* deals.

  22. [...] the rest of this post at GigaOm [...]

  23. [...] Surowiecki has News You Can Lose. Finally, Gigaom’s Mathew Ingram writes perceptively about How the WSJ Failed the Web 2.0 Test, offering some insight in the process as to how and why the traditional news media have fallen [...]

  24. [...] on GigaOM over last Sunday’s article on Google and net neutrality, in “How the WSJ Failed the Web 2.0 Test”: it did not update the story when, within a few hours, fundamental new elements emerged [...]


Comments have been disabled for this post