The Fallacy Of The Link Economy


Arnon Mishkin is a partner with Mitchell Madison Group, where he consults for media companies on improving legacy businesses as well as making the internet profitable. Prior to MMG, he was a partner at the Boston Consulting Group, where he did some of the firm’s earliest work on the web.

People who “get the web” will explain to you that the economics of the web have everything to do with linking and getting linked to. The more links one can get, the better off one is. Few disagree with that guidepost.

So when the AP and the newspaper owners demanded that they get revenue from the linkers, it was clear that they just didn’t understand the web and didn’t appreciate all the value they were receiving from link traffic. (Here are just a few examples of that critique.)

Well, the data suggests that the web – the "blogosphere" – is less an ecosystem than a one-way street.

The vast majority of the value gets captured by aggregators linking and scraping rather than by the news organizations that get linked to and scraped. We did a study of traffic on several sites that purely aggregate a menu of news stories. In all cases, the home page received at least twice as much traffic as the clicks going out to the stories on it. In other words, a very large share of the people visiting the site were merely browsing the headlines rather than using the aggregation page to decide what they wanted to read in detail. Obviously, this has major ramifications for content creators' ability to grow ad revenue, as the main benefit of added traffic is the potential for higher CPMs. (Disclosure: I have consulted for the AP and other content creators, though not on this particular issue.)

Even in an absolute best-case scenario for producers of original content, the aggregators get at least as much traffic on linked stories as the creators of those stories because anyone who clicks on the link does so from the aggregator’s site (so each site gets a page view). If you don’t believe the data, consider how often in an average day you visit the home page of your favorite news site vs. how often you click through to the underlying story.

Actually, it shouldn’t be surprising to anyone who’s thought about how people have historically read a newspaper: They’ve scanned the headlines and then turned to the sports, movie listings or recipe pages, depending on their real interest. As the saying goes, “People don’t check the news to read about the fire, they check it to learn that there wasn’t a fire.”

Historically, the value of those casual browsers was captured by the newspaper because the readers would have to buy a copy. Now all the value gets captured by the aggregator that scrapes the copy and creates a front page that a set of readers choose to scan. And because creating content costs much more than scraping it, there is little rational economic reason to create content.

This is not just a reflection of economic reality but also captures the changing nature of reporting. Take Iran. When the Shah fell in 1979, the journalistic hero was Mike Wallace, who flew to Tehran and pointedly asked the Ayatollah whether he was, as Anwar Sadat called him, a “lunatic.” This year, after the June election there, the journalistic hero was the blogger for The Huffington Post who stayed in D.C. and linked to every piece of information from Tehran he could find.

Moreover, because they can scrape essentially every content provider on the web, each aggregator gets to build a “front page” to target and win over their chosen segment, or enable each user to tailor a front page perfectly suited to his or her needs. And they can do that by leveraging all the resources of the global journalistic community without paying any part of its cost. As a result, aggregators – often from a standing start – are able to create substantial amounts of daily traffic and viable web businesses.

People will argue that the scrapers create value by pointing to many obscure stories that captured the imagination of linkers and got unexpectedly high traffic for a very obscure site. Fine, but was that site able to monetize the jump in traffic? And, how likely is that site to create a sustainable business by consistently winning a surfing game of serendipity?

Others will say that the site that gets linked to can keep the user using the site. But the opposite is happening – users are being trained to increase their usage of (and thus value to) the linker rather than the creator.

Can anything be done about this? I think publishers need to take three steps:

• Assess how much value the aggregators are getting by virtue of using their content and use that to seek an equitable economic relationship. And be willing to drop the links rather than submit to an unfair deal.
• Consider partnering with other content makers and developing appropriate aggregation sites of their own.
• Develop new ways of sharing their content with other sites – in ways that allow them to monetize some of the traffic on the aggregator site. For example, rather than offering a pure “widget” of the site’s material, the site should offer “wadgets” that contain a combination of content and advertising that could run instead of an aggregator’s scraping.
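To make the third step concrete, here is a minimal sketch of what a "wadget" server might emit: instead of handing aggregators a bare headline, the publisher renders a snippet that bundles the headline with the publisher's own ad link, so the creator keeps the ad impression even when the content runs on someone else's page. Everything here (the function name, markup, and URLs) is a hypothetical illustration, not an actual product.

```javascript
// Sketch of a "wadget": a publisher-controlled snippet pairing a headline
// with the publisher's own ad, meant to run in place of a scraped excerpt.
function renderWadget(story, ad) {
  // Escape HTML-special characters so headline/ad text renders safely.
  const esc = (s) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
  return [
    '<div class="wadget">',
    `  <a href="${story.url}">${esc(story.headline)}</a>`,
    `  <a class="wadget-ad" href="${ad.url}">${esc(ad.text)}</a>`,
    "</div>",
  ].join("\n");
}

// Hypothetical usage: the snippet an aggregator would embed.
const html = renderWadget(
  {
    headline: "Mayor & council clash over budget",
    url: "https://example-paper.com/story/123",
  },
  {
    text: "Subscribe to the Example Paper",
    url: "https://example-paper.com/subscribe",
  }
);
console.log(html);
```

The point of the design is simply that the ad travels with the content, so browsing a wadget-filled aggregation page still generates impressions the creator can sell.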

59 Comments

steve

howard,

sherrod brown, his wife (of the cleveland plain-dealer) and you ought to get a room.

Chris Marentis

This post raises some very interesting ideas for changing the model for news producers. But tweaking the model is not the answer for any traditional media company at this point.

New technology (cloud computing, web services, broadband everywhere) combined with new techniques (crowdsourcing, widgets, social media and video distribution) has changed the game forever. New companies with very low cost structures and a firm understanding of the "distributed web" will win. Why? This new game is not just about eyeballs and ads…it is a relationship model that looks a lot more like direct marketing for a publisher. These aggregation models, done well, go much deeper with a viewer than ads…they want to create LISTS!

Traditional media companies need to think about a different type of relationship with their readers/viewers and their advertisers and capitalize on their remaining clout.

Howard Owens

Funny, Steve: you accuse me of one logical fallacy, the "straw man," while committing another, the "red herring."

Neither the parties involved in the case nor its outcome does anything to diminish the arguments put forth in Lichtman's paper.

Your entire comment is completely irrelevant to facts and opinions put forward by Lichtman.

The fact is, Mr. Lichtman's conclusions are not uncommon, and should another such case arise, he or a similar expert is quite likely to come forward with the same or a very similar opinion. So those who are resting their use of RSS on a faulty and feeble understanding of fair use would do well to read the paper, undertake some additional study and come to a reasonable conclusion, rather than relying on the myth that "just a quote and link" is ipso facto fair use.

steve

howard points to an absolute strawman case in which one money-losing publisher went after another.

one might ask, wouldn't it suit both parties real well to settle this as they did so they could point to it as a "defining moment" in the use of rss feeds?

please.

no f'in judge decided anything on this matter, howard.

get real.

Andrew Waterman

On this thread on fair use and stopping "commercial" redisplaying of RSS feeds… Legally, how would the use of RSS feeds in "personal" RSS readers like Google Reader (the most dominant reader nowadays) or LazyFeed be construed, considering these are clearly commercial enterprises – in Google's case deriving ad revenue right next to the RSS?

In other words, do you see any realistic movement in the future to stop RSS from being openly displayed on 3rd-party websites, since no one is really stopping their open syndication now? I'm very curious to hear this crowd's opinions.

Howard Owens

David, I would think that you would have already read it, but fair use is not a slam-dunk defense, as Doug Lichtman's expert opinion in GHM vs. NYT made clear.

http://www.niemanlab.org/pdfs/douglichtman.pdf

Grabbing and republishing a site's RSS feed for commercial use is not automatically protected by fair use. It could quite possibly be a copyright violation.

Petr Svacina

Some people seem to disregard how costly it is to produce news and seem to cling to the idea that news comes from some ethereal free source. They are wrong. It is true that the news industry is changing and that the old model isn't working any more. Arnon Mishkin's article points to important ways people are getting the news they want and where the news producers are losing.

I would add a fourth step as a measure against the negative side of aggregators: once the reader clicks a link, instead of just showing the page with the specific article, sites should allow the front page to be seen as well.

We have developed our software in such a way that a link to an article, instead of opening a new page, opens an AJAX box on top of the main page. This way, we think, our readers always have a clear view of what they have seen and what is still to be seen. (In most online papers it is easy to get lost, and in the end the reader has the sensation that he or she missed something.) If a reader arrives via a link from a referrer, the desired article is likewise opened on top of our main page. This way there is always a chance that the remaining articles might attract their interest. Above all, our advertisers get the opportunity to be seen.

Dave Mastio

Howard,

There is no possible Terms of Use/Service that can be written to take away fair use rights, which include the right to reuse content publicly and commercially. Content owners simply cannot restrict rights that they don't have control over.

For example, the creator of Creative Commons describes all of its variants — even the most restrictive — as "fair use plus."

J. Boone Pickens

This argument also assumes that consumers would seek out news content from the primary source if the aggregation sites weren't there to point them to it. In actual fact, I think many readers who get their news through aggregation wouldn't choose to visit the homepage of their local paper at all without the aggregation sites to point out the important stories.

Anecdotally, I have many 20-25 year old friends in Boston who have never purchased a copy of the paper in their lives, and they only wind up on Boston.com when they're directed to it by aggregation sites that they trust.

Aggregation is obviously turning some percentage of readers into sedate headline browsers who don't choose to click through, but perhaps the net loss would be even greater without aggregators to highlight good content.

Howard Owens

RSS feeds can be offered for private, non-commercial use only. And that can be made explicit in the Terms of Use/Service.

Shafqat

Arnon – I'm glad you agree with me. However, I would disagree with Regan in this context: it's really not that hard to build a compelling, personalized news aggregator. Given the bloated resources of any news org, surely they could afford to spare a small tech team to build one, even if it was to try and see if there was a biz model there. The fact that they didn't is just an indication of the type of people working in these industries. Where is the creativity, flexibility or interest in trying new things?

Also, I would suggest not using the word "scraping" too loosely. If sites didn't want to get scraped, they could use robots.txt. It is widely respected (apart from splogs, but we shouldn't worry about them anyway).

Finally, most news aggregators (like HuffPo, NewsCred or countless others) use the widely available RSS feeds made available by the content producers. RSS feeds are for syndicating content. If you don't want to syndicate content, why do you have RSS feeds? Perplexing.

C.W. Anderson

Arnon,

If you had any credibility at all, you would link to your "study." (This article obviously shows that you know what linking is). Until you do so, why would anyone believe the rest of your arguments?

steve

instead of coining new terms like adtribution or wadget, why not pick a topic or locality and OWN it?

instead of every "effin" media company trying to be a portal, why not be a partal of the whole?

earn your visitors' every visit and they'll return.

and thank you to the previous commenter, but I, and only I, will decide what is quality content to me… much of which may not require an editor or journalist.

Enno Becker

Mishkin's correct about the intrinsic "unfairness" of the linking dynamic. It's certainly a natural result of the web's evolution, but that doesn't make it any easier to swallow when you have an investment in creating quality content and everybody from the small time blogger to Google is monetizing it.

The issue isn't whether you _like_ it or whether you create your own aggregator or vertical search (not really as easy as you might think). The issue is the unsustainability of the model. Who is going to pay the journalists and editors?

I think we're about to see a variety of experiments in locking sites down and charging for access, most of which won't work very well. If this results in some sort of adtribution or wadget I'll be very pleased, but I don't see where the economic incentive comes from.

Arnon Mishkin

Ryan, the fallacy is the belief being sold by the "aggregators" that content creators can just go on the web, get links, focus on SEO, and wind up with economically viable businesses. The reality for content creators is that the emperor is stark effin naked.

I happen to agree with "Shafqat," down to his prescription, but, as Ronald Reagan used to say, "there are simple answers…they're just not easy."

Tom Foremski

AP has things backwards. Why not take advantage of the distributive power of the Internet and its "link" economy? Why not institute a system whereby if you link and quote, you also publish one or two of my accompanying text ad links? Then my advertisers get to benefit from the distribution too. I call it "adtribution." It's a model that doesn't close down but expands the link economy. Google the word "adtribution" for more info.

Ryan Tate

It is quite common for economic participants to be less than fully satisfied with the deals they strike. Just ask an iPhone buyer, an airline customer or, for that matter, your typical American employee.

Some old-line publishers are less than ecstatic about their relationship with various aggregators and "scrapers" and yet they consent to the relationship; they are not blocking Google via the method Shafqat points to.

It is also common in market economies for the actions of one participant to hurt the profits of another. This is not automatically a bad thing; it is certainly no disqualification to the label "economy."

There is, in fact, a link economy. You have tried now to say that the link economy is "a fallacy" because it is unequal; because some participants injure the profits of other participants; and because some participants do not "feel they have gotten the best deal possible." None of this at all disproves the existence of a link economy; it simply proves that you do not personally _like_ said economy.

Shafqat

Enough complaining already. If news organizations are so tired of news aggregators stealing their ad revenues, users, business value, etc., why don't they:

1) Create an aggregator. It's not that hard.
2) Use robots.txt to block crawlers and also stop RSS feeds while you're at it.

I'm so tired of this argument. I've heard news organizations complaining about this for years, but I haven't seen a single compelling aggregator built by any of them. During that time, tons of independent companies have come along, built great personalized news sites and enjoyed the success they deserve. If people prefer to go to the aggregators rather than your home page, doesn't that say something about the value you are creating?
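For reference, the robots.txt suggestion above is a one-file change. A publisher that wanted to block a particular aggregator's crawler while leaving everyone else alone could serve something like the following (the bot name is a made-up example; real crawler names vary by aggregator):

```text
# Block one aggregator's crawler from the whole site
User-agent: ExampleAggregatorBot
Disallow: /

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Compliant crawlers identifying as ExampleAggregatorBot would fetch nothing; of course, this only works against crawlers that actually respect the protocol.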

Susan Fine

You continue to astound me with how you figure this stuff out.
I think it is easier to diagnose the problem than to figure out the answer. The one I like best is creating their own headline page, which works well with local content since there is no local news anymore.

Arnon Mishkin

Ya know, I've been called many things, but I can't remember someone saying I was a communist. Several of my clients would be seriously upset and several detractors would be seriously surprised.

To me an "equitable economic relationship" is one in which buyer and seller each feel they have gotten the best deal possible. I think Adam Smith himself opined on this.

In addition to not being a communist, I'm not a lawyer, but as I understand the basics of "fair use" the core test is whether the person using the content has damaged the business of the person who created it. The fact that Google News and Huffington Post have not chosen or been able to monetize the content they've scraped is orthogonal to whether they've taken traffic that otherwise would have gone to the creators of that which was scraped.

Now several of the comments suggest or imply that there are "thousands" of ad-based content businesses making piles of money from their content on the web. That is news to me. I suspect it is also news to the investors in Huffington Post, Daily Beast and others. Gawker claims they are now in the black, but I bet that has more to do with their buyer-guide sites than with the sweet commentary and gossip that they are known for.

Who are these thousands of companies, and where can I invest in them?

Howard Owens

While opposing the response to aggregators by some newspapers and AP, this article reveals a good deal of truth.

Here are three similar posts I wrote some time back (this article reads almost like they were used in prep of the story).

http://www.howardowens.com/7335/myth-deep-link

http://www.howardowens.com/7337/why-nobody-clicks-your-home-page-links

http://www.howardowens.com/7336/why-home-page-ads-may-be-more-valuable-story-page-ads

Many advocates of the link economy do not fully understand how links actually work.

Tim Barkow

This is so ridiculously tired. Can we just stop already?

There are thousands of businesses making money on the Web. Lots of money. The fact that newspapers cannot figure out how to be one of them is simply sad. These are lazy businesses, unable or unwilling to keep up with the competition.

What do we do with those types of businesses? Bailout! I mean, we let them fail.

jeffmignon

Or, you can consider that aggregators are like digital newsstands. You come in, browse the content and "buy" it or not. In newsstands, many of us browse and don't buy. Same idea with aggregators. No? The aggregator does not charge to display your "content teasers" and makes money with advertising. Do print publications charge newsstands because they are selling coffee while people browse/read magazines for free? Maybe they should, after all. ;)
Not sure the media industry is looking in the right direction. You said strategy?

steve

i'm guessing the author would consider a twitter account as a useless tool as well.

since one can just "scan the headlines" using the (unmonetized) twitter feed most news orgs have fallen all over each other in setting up and feeding, why would one bother to follow the embedded links to their ad-supported property?

Ryan Tate

That's not to imply I'm conceding your premise, by the way. All evidence indicates that actual scrapers are low-rent operations making little if any money.

Google News, the most prominent, is not monetized; HuffPo, which publishes reams of original content alongside some scraping, is famously unprofitable; the rest are mainly spam blogs that clog up Google results and disappear as quickly as they pop up.

The latter are grasping for AdSense pennies and usually operating in clear violation of copyright law. But their clickthrough rates are so low most original copyright holders don't bother to shut them down.

Ryan Tate

So it's not a "link economy" because the currency is distributed unequally and tends to accumulate to certain "winners." I guess the U.S. economy is a "fallacy" too, by your reasoning.

Were you calling for literal communism — an "equitable economic relationship" between subjects and publishers — when it was newspapers and magazines repurposing information provided by unpaid sources?

Andrew Waterman

Traffic to the successful aggregators (SeekingAlpha.com, HuffingtonPost.com, etc.) is quite disproportionately larger than the traffic to their content providers, leaving many content providers feeling this is a "raw deal" as the aggregators get the ads dollars.

However, most aggregators are not the size of SeekingAlpha and HuffPost and are not the real culprits. Here the problem lies with the ads model we have all grown dependent on. Pageviews, not just clicks, have become cheap currency when monetized on a CPM basis alone. More value can be extracted from high-quality content than the $0.01/pageview we are all now fighting each other over.

Most professional content producers have premium content to sell as reports or subscriptions that ought to be marketed alongside the headlines/free content on the aggregators' sites. Arnon's "wadgets" suggestion could accomplish this well, as well as other monetization systems that recognize this new "syndication economy."

Andrew Waterman
Emerginvest

patricia

People who "get" the internet know that linking is a LAZY way to marry audience to platform, like SEO. And as with SEO-driven traffic, I wouldn't be surprised to find out that session times aren't that great. 1 million unique visitors who are on the page less than a minute and do not return aren't really traffic.

Smart people forget trying to cut corners and marry a real audience and customer base to their platform.

steve

folks that continue to refer to web properties as "sites" consisting of "pages" have no clue.
