
Summary:

As millions of consumers gained access to the Internet, new market opportunities emerged. But today, content is so heavy, and networks so overburdened, that more efficient use of the network has become a critical need. This creates a new market opportunity for content optimization and CDNs.


The tech bubble of the late ’90s was fueled largely by the promise of universal high-speed Internet access. As millions of consumers gained access to the Internet, new market opportunities emerged. But today, content is so heavy, and networks so overburdened, that more efficient use of the network has become a critical need.

The state of web content today

As richer, more dynamic, more interactive sites have hit the Web, the existing infrastructure has become insufficient. While high-speed broadband has tried to meet the infrastructure demands of the exploding volume and size of content on the Web, it’s clear that throwing pure infrastructure at the problem isn’t enough.

Two new markets emerged from these challenges: the content delivery network (CDN) market and the application delivery controller (ADC) market. Put simply, these are technologies that help make your experience on the web a lot faster while still using the same infrastructure that has been in place for the past two decades.

Remarkably, those two markets are now struggling to keep up with the explosive growth of the web. Sites are too big, too dynamic, and too rich for our existing infrastructure and for the prevailing techniques for optimizing performance.

Today, we’re embarking upon the third major evolution in modern web performance. Web content optimization and acceleration is one of the largest market opportunities in the tech sector today, and it’s going to pave the way for the next major era of the Internet. Without it, innovation gets throttled.

The technologies we’re currently using to speed up the web need to be supercharged. They need extra help. That’s where making sure web content is efficiently delivered comes in. It’s about the conservation of bandwidth and the compression of megabytes, especially on mobile networks.
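To make the bandwidth-conservation point concrete, here is a minimal sketch of how much gzip compression, the workhorse of efficient web content delivery, can shave off a markup-heavy page. The HTML payload below is invented for illustration; real pages compress less dramatically, but repetitive markup routinely shrinks by 70 percent or more.

```python
import gzip

# A deliberately repetitive HTML payload, standing in for markup-heavy pages.
html = ("<div class='item'><span>product</span></div>\n" * 500).encode("utf-8")

# Compress it the way a web server or accelerator would before sending it.
compressed = gzip.compress(html)

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"savings:    {100 * (1 - len(compressed) / len(html)):.0f}%")
```

Fewer bytes on the wire means fewer round trips and less time spent waiting, which is exactly the kind of gain that matters on congested mobile networks.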

Four problems you can’t ignore

Most of the web performance challenges we face today can be traced to four basic trends:

  1. Third-party content. Any given web site incorporates vast amounts of third-party content. This includes content such as advertisements, widgets and syndicated feeds.
  2. Dynamic content. Sites are now required to be more dynamic than our infrastructure can handle. Twitter feeds change constantly, so the data can’t be cached. Furthermore, we expect a high degree of personalization and individually relevant experiences when we visit sites.
  3. More, more, more. We’re experiencing a content explosion: Sites have more pages, more pictures and more videos packed into the pages than ever before.
  4. New devices. Myriad new devices hit the market every month, all of which are Internet-enabled. This doesn’t just mean more laptops and iPads; we’re also talking about refrigerators, low-cost home security cameras, and even cars!

And why is this happening? There’s a new party in town, and it’s called social media. Our problems aren’t capacity problems; our content delivery infrastructure simply wasn’t designed for what’s happening. The existing infrastructure is built on three assumptions:

  1. Single origin. Most content will originate from the same web servers, so if these are working properly, then everything is good.
  2. Static content. Most information will stay the same, and therefore can be cached across the data center and Internet.
  3. Fast delivery. Because content comes from a single origin and doesn’t change, caching and route optimization can deliver everything quickly.
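The static-content assumption above shows up concretely in HTTP cache headers. The sketch below is illustrative (the function name and suffix list are hypothetical), but the Cache-Control values are the standard HTTP mechanism: long-lived assets get aggressive caching, while dynamic, personalized responses opt out entirely — which is precisely why today’s content defeats the old model.

```python
def cache_headers(path: str) -> dict:
    # Hypothetical classifier: these extensions are treated as static assets.
    static_suffixes = (".css", ".js", ".png", ".jpg")
    if path.endswith(static_suffixes):
        # Cache for a year: the old assumption is that the file never changes,
        # so CDNs and browsers can serve it without revisiting the origin.
        return {"Cache-Control": "public, max-age=31536000"}
    # Dynamic, per-user content: never store it in any cache.
    return {"Cache-Control": "no-store"}

print(cache_headers("/assets/site.css"))  # cacheable for a year
print(cache_headers("/timeline"))         # must be fetched fresh every time
```

When most of a page falls into the second branch, caching infrastructure sits idle and every request travels back to the origin — the core of the problem described above.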

Social media turns these assumptions on their heads. Content is mashed-up, syndicated, and streamed from everywhere — with different qualities of service. So even if you’re paying $500,000 for traditional performance solutions, your pages will still slow down to the lowest common denominator, such as a slow ad service or slow content streamed from Facebook.

Yes, we can build new infrastructure, but it will take too long, and it may not be enough. We can throw more of the same performance technology at it, but this only helps so much, and the traditional technology doesn’t do anything for today’s dynamic content, which can’t be cached. At the end of the day, these four factors have driven intense demand for a new type of web acceleration.

With our powers combined, we are …

The good news is that we have the technology to solve the problem, and there has already been a good deal of investment to put the wheels in motion. We’re seeing the many web performance players converge to do this.

Recently, Limelight Networks acquired AcceloWeb for up to a rumored $20 million in a cash and stock deal. AcceloWeb’s technology does precisely what I’ve hinted at so far: It accelerates web content so that it can travel faster over our existing Internet infrastructure. Limelight, a traditional CDN company, is making a large investment in Web content optimization and acceleration. These are two fundamentally different markets converging under one company, yet we hardly heard any talk about the strategy behind the investment.

Similarly, Google recently announced that Google Analytics now offers a Site Speed Analytics Report. It was greeted with applause from the web performance community, but nobody really heard about how this “feature” had much broader implications for the web.

Google isn’t just helping you measure your site’s speed; they want the Web to be lightning fast. It’s critical to the future of their business that the web isn’t crippled by performance woes.

Google’s revenue is still largely ad-based, and ads contribute costly seconds to load times if we don’t find a solution. Not to mention: The faster a site loads, the more ads Google can serve. Google cares about web performance because it’s absolutely critical to their business and the future of the web itself.

Just how big is this?

It isn’t just the market opportunity for web content acceleration that’s exciting here. What’s more important is the future of the web, and what this evolution in web performance will spawn.

We’re talking about webscale personalization that isn’t held back by performance problems. Personalization is the web topic du jour, but we’re not going to reach the promise of true web personalization if we can’t load web pages faster than we’re doing on average today.

Similarly, the mobile web is going to face major obstacles if we can’t tune our apps to perform on even the most troubled networks. And we sure as heck aren’t going to usher in the future of virtual personal assistants if we can’t conduct complex processing and deliver that content at the speeds that consumers demand.

Oddly enough, the success or failure of these sexy technologies hinges on a critical evolution in web performance. All of a sudden, the emerging web acceleration and content optimization market is starting to look a lot sexier to investors, entrepreneurs and incumbent technology companies alike.

Ed Robinson is the CEO of Aptimize, a company that produces software to accelerate websites.

  1. Do not believe the hype being spread by ISPs that we do not have enough bandwidth.

    Well, they are sorta correct, but for the wrong reasons. Any intelligent business plans for the future and builds infrastructure accordingly. Like most American businesses for the past twenty years, much of the money that should have gone to expanding infrastructure to meet almost certain demand went instead to enriching executives and shareholders, whose demand for pricey NY and Washington real estate exceeded even the bandwidth demand of internet users.

    These companies have shown that they have no real interest in building infrastructure to meet demand unless we pay for it out of our pockets and continue to allow the fat cats to live the high life, with their monopolistic industries having us by the balls and wallet.

    I say since we the people are paying for it, we the people should own at the very least a stake in the internet.

    Nationalize the telecommunications industry and we will see the money we pay into it go toward bandwidth real quick.

  2. I must’ve missed the ‘Advertisement’ tag in the headline. This is pretty blatant…

  3. U.S. TELCO INFRASTRUCTURE – NO NATIONALIZATION PLEASE

    Whaaat? Did someone on Gigaom just recommend that the telecommunications infrastructure be nationalized ;) ?? So, instead of leaving it to entrepreneurs to come up with solutions we should expect the government to own the telco infrastructure AND do a better job? Is there ONE good example of the government doing a better job at anything that the private sector can also do?

    1. I think nationalization would be very difficult and not very productive, but local governments can and have made excellent infrastructure managers.

      Case in point: the town of Wilson, NC started a town ISP with speeds from 10 to 100 Mbps starting at $35/mo.
      http://www.greenlightnc.com/about/internet/
      Instead of getting the picture and bettering their service to stay competitive, the telecom companies lobbied the state legislature and got them to pass the ridiculously titled ‘level playing field’ bill that prevents municipalities from providing broadband services.
      http://www.govtech.com/technology/Municipal-Broadband-Networks-Outlawed-North-Carolina.html

      So I’d say Heilgar, while being a little over the top, is correct, based on how the telecoms act.

      1. GOOD EXAMPLE OF LOCAL GOV

        Thank you – you provide a good example of a relatively straightforward need – fast broadband access – provided by a local government. I just don’t trust our federal government to do this at a national level – you also indicate that this would not be possible at a national level.

        In the article the author refers to more complex needs that go beyond speed – such as dynamic sites, third-party content, etc. – that need new software solutions. For those I want to rely on entrepreneurs to innovate. The comment by Heligar, “since we the people are paying for it, we the people should own at the very least a stake ..”, is ridiculous – this comment implies that the government should own stakes in everything ;)

  4. So we have two variables here:
    * Web infrastructure improvements, where acceleration and flexibility are the solution.
    * And software, which, at least in the web infrastructure segment, is and has mainly been open source (think of Linux, Apache, MySQL, PHP/Python/Perl, Ruby-on-Rails, etc.).

    Then check out Varnish Cache, the fastest and most flexible web accelerator out there. And it is open source, with a company behind it.

    Hey, it even supports compression, modules and even HTTP streaming now, as of last week’s release!

    Want to know more? Check:
    * Varnish Cache Community Site: http://www.varnish-cache.org/
    * Company behind the software: http://www.varnish-software.com/
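    For readers who want to try it, a minimal Varnish backend definition looks roughly like this — a sketch only, since the exact VCL syntax varies between Varnish versions, and the host and port are placeholders for your own origin server:

    ```vcl
    # Point Varnish at the origin web server it should cache for.
    backend default {
        .host = "127.0.0.1";   # placeholder: your origin server
        .port = "8080";        # placeholder: your origin port
    }
    ```

    With just that, Varnish sits in front of the origin and serves cacheable responses from memory instead of passing every request through.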

