The Fallacy Of The Link Economy

Arnon Mishkin

Arnon Mishkin is a partner with Mitchell Madison Group, where he consults for media companies on improving legacy businesses as well as making the internet profitable. Prior to MMG, he was a partner at the Boston Consulting Group, where he did some of the firm’s earliest work on the web.

People who “get the web” will explain to you that the economics of the web have everything to do with linking and getting linked to. The more links one can get, the better off one is. Few disagree with that guidepost.

So when the AP and the newspaper owners demanded that they get revenue from the linkers, it was clear that they just didn’t understand the web and didn’t appreciate all the value they were receiving from link traffic. (Here are just a few examples of that critique.)

Well, the data suggests that the web – the “blogosphere” – is less an ecosystem than a one-way street.

The vast majority of the value gets captured by the aggregators doing the linking and scraping rather than by the news organizations that get linked and scraped. We did a study of traffic on several sites that purely aggregate a menu of news stories. In every case, there was at least twice as much traffic on the home page as there were clicks through to the stories listed on it. In other words, a very large share of visitors were merely browsing the headlines rather than using the aggregation page to decide what they wanted to read in detail. Obviously, this has major ramifications for content creators’ ability to grow ad revenue, as the main benefit of added traffic is the potential for higher CPMs. (Disclosure: I have consulted for the AP and other content creators, though not on this particular issue.)
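
To make that arithmetic concrete, here is a minimal sketch of how one would compute the share of front-page visits that never click through. All figures and field names are invented for illustration; none of these numbers come from the study.

```typescript
// Hypothetical daily traffic numbers for an aggregator's front page.
// The figures and field names are invented for illustration.
interface DailyTraffic {
  homePageViews: number;  // visits that loaded the aggregation page
  outboundClicks: number; // clicks from that page through to source stories
}

// Share of front-page visits that never clicked through to a story --
// readers who scanned the headlines and left.
function headlineOnlyShare({ homePageViews, outboundClicks }: DailyTraffic): number {
  return (homePageViews - outboundClicks) / homePageViews;
}

// A 2:1 ratio of page views to outbound clicks, as described above:
const sample: DailyTraffic = { homePageViews: 200_000, outboundClicks: 100_000 };
console.log(headlineOnlyShare(sample)); // 0.5 -- half the visits never left the front page
```

Under the 2:1 pattern the study found, at least half of every aggregator's front-page audience generates ad impressions for the aggregator alone, with nothing flowing to the sites that produced the stories.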

Even in an absolute best-case scenario for producers of original content, the aggregators get at least as much traffic on linked stories as the creators of those stories because anyone who clicks on the link does so from the aggregator’s site (so each site gets a page view). If you don’t believe the data, consider how often in an average day you visit the home page of your favorite news site vs. how often you click through to the underlying story.

Actually, it shouldn’t be surprising to anyone who’s thought about how people have historically read a newspaper: They’ve scanned the headlines and then turned to the sports, movie listings or recipe pages, depending on their real interest. As the saying goes, “People don’t check the news to read about the fire; they check it to learn that there wasn’t a fire.”

Historically, the value of those casual browsers was captured by the newspaper, because readers had to buy a copy to scan the front page. Now all the value gets captured by the aggregator that scrapes the copy and creates a front page that a set of readers chooses to scan. And because creating content costs much more than scraping it, there is little rational economic reason to create content.

This is not just a reflection of economic reality but also captures the changing nature of reporting. Take Iran. When the Shah fell in 1979, the journalistic hero was Mike Wallace, who flew to Tehran and pointedly asked the Ayatollah whether he was, as Anwar Sadat called him, a “lunatic.” This year, after the June election there, the journalistic hero was the blogger for The Huffington Post who stayed in D.C. and linked to every piece of information from Tehran he could find.

Moreover, because they can scrape essentially every content provider on the web, each aggregator gets to build a “front page” that targets and wins over its chosen segment, or lets each user tailor a front page perfectly suited to his or her needs. And they can do that by leveraging all the resources of the global journalistic community without paying any part of its cost. As a result, aggregators – often from a standing start – are able to build substantial daily traffic and viable web businesses.

People will argue that the scrapers create value by pointing to obscure stories that captured the imagination of linkers and drove unexpectedly high traffic to a very obscure site. Fine, but was that site able to monetize the jump in traffic? And how likely is that site to build a sustainable business by consistently winning a surfing game of serendipity?

Others will say that the site that gets linked to can hold onto the user once he or she clicks through. But the opposite is happening: users are being trained to increase their usage of (and thus their value to) the linker rather than the creator.

Can anything be done about this? I think publishers need to take three steps:

• Assess how much value the aggregators are getting by virtue of using their content and use that to seek an equitable economic relationship. And be willing to drop the links rather than submit to an unfair deal.
• Consider partnering with other content makers and developing appropriate aggregation sites of their own.
• Develop new ways of sharing their content with other sites – in ways that allow them to monetize some of the traffic on the aggregator’s site. For example, rather than offering a pure “widget” of the site’s material, the site should offer “wadgets” that combine content and advertising and could run in place of an aggregator’s scraped copy (a minimal sketch follows this list).
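
To make the last point concrete, here is a minimal sketch of what a “wadget” payload might look like. Every name and URL in it (WadgetItem, renderWadget, publisher.example) is an illustrative assumption, not an existing product or API.

```typescript
// A minimal sketch of a "wadget": a syndicated fragment bundling the
// publisher's content with an ad slot the publisher sells, so the creator
// monetizes impressions even when the fragment renders on an aggregator's page.
// All names and URLs here are illustrative assumptions, not a real API.
interface WadgetItem {
  headline: string;
  summary: string;
  url: string;      // canonical link back to the publisher's story
  adMarkup: string; // ad served and sold by the publisher, not the aggregator
}

// Returns an HTML fragment an aggregator could embed instead of scraped copy.
function renderWadget(item: WadgetItem): string {
  return `
    <div class="wadget">
      <a href="${item.url}">${item.headline}</a>
      <p>${item.summary}</p>
      <div class="wadget-ad">${item.adMarkup}</div>
    </div>`;
}

const example: WadgetItem = {
  headline: "Example story headline",
  summary: "A one-line summary the aggregator may display.",
  url: "https://publisher.example/story",
  adMarkup: '<img src="https://ads.publisher.example/banner.png" alt="ad">',
};
console.log(renderWadget(example));
```

The design point is that the ad slot travels with the content: the publisher keeps selling the impression wherever the headline is scanned, rather than surrendering it to whoever built the page.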
