
Summary:

It’s no secret that we’re watching more and more online video. What’s not so well understood is just how dramatically this consumption will soon increase, and the pain it’s going to inflict on Internet service providers. Alon Maor, the CEO of Qwilt, offers his solution.


It’s no secret that most of us are starting to watch more and more video on the Internet today, as opposed to regular TV. What’s not so well understood is just how much more room online video has to grow, as a percentage of our total video consumption — and the pain that’s going to inflict on Internet service providers whose high-speed pipes are already close to bursting.

After spending the last decade developing products for these ISPs and hearing firsthand the challenges they’re experiencing, I founded a new company, Qwilt, at work on a solution I feel will benefit both operators and consumers.

Consider: Americans today watch an average of five hours a day of regular TV (depressing but true). Yet they consume only a few minutes of online video per day. That balance is changing dramatically with the rapid growth of video delivered through Web services like Netflix, HBO Go, Amazon.com, Facebook, YouTube, Hulu, Xfinity TV and others. It may not be long before many of us hit the volume caps set by carriers while consuming legitimate services of this kind.

In five years, we’re going to be watching about an hour a day of online video, according to TDG Research. That’s roughly 16 times more than consumption today. And it implies that Internet networks will need up to 10 times more capacity than they have now to handle the deluge.
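As a rough sanity check on those figures, here is a back-of-the-envelope calculation. The four-minute starting point is my own assumption standing in for “a few minutes,” not a number from TDG:

```python
# Back-of-the-envelope check with assumed numbers, not TDG's actual data.
minutes_per_day_today = 4      # "a few minutes" of online video per day now (assumption)
minutes_per_day_future = 60    # roughly an hour a day in five years
growth = minutes_per_day_future / minutes_per_day_today
print(f"consumption growth: ~{growth:.0f}x")   # ~15x, in line with the ~16x figure above
```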

What to do? Service providers are obviously not sitting still while this is happening. But their current options are limited. One solution is to simply throw more capacity at the problem — buy more gear from the usual telecom-gear suspects in the hopes that the boxes can handle all the new traffic. This is what many service providers have done so far.

But in many ways that’s a wasteful solution: carriers end up transmitting the same video across the network again and again, every time a new user calls it up. This approach doesn’t take advantage of technology that would streamline the process and allow carriers to cache popular videos and serve them more efficiently. It also cannot scale with the exponential growth of video. Plus, it’s expensive; all that new gear isn’t free.
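To make the caching idea concrete, here is a minimal sketch of the kind of edge cache this implies. It is a generic least-recently-used cache with made-up parameters, not a description of any particular vendor’s product:

```python
# Minimal sketch of an edge video cache (illustrative only; the capacity and
# eviction policy are assumptions, not any vendor's actual design).
from collections import OrderedDict

class EdgeVideoCache:
    def __init__(self, capacity_bytes=500 * 10**9):   # e.g. a 500 GB appliance (assumed)
        self.capacity = capacity_bytes
        self.used = 0
        self.store = OrderedDict()                     # url -> video bytes, LRU order

    def get(self, url, fetch_from_origin):
        if url in self.store:                          # cache hit: no upstream traffic
            self.store.move_to_end(url)
            return self.store[url]
        data = fetch_from_origin(url)                  # cache miss: one upstream fetch
        self._insert(url, data)
        return data

    def _insert(self, url, data):
        while self.store and self.used + len(data) > self.capacity:
            _, evicted = self.store.popitem(last=False)   # evict least recently used
            self.used -= len(evicted)
        if len(data) <= self.capacity:
            self.store[url] = data
            self.used += len(data)
```

Every repeat request answered out of `get` is one fewer copy of the same video hauled across the carrier’s backbone.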

A second solution is for carriers to build their own content-delivery networks. Some have built internal CDNs to mainly distribute their own content to subscribers. Others have gone further and built “wholesale” CDNs that compete head-to-head with giants like Akamai, Limelight and Level3. Some of the challenges to these approaches, however, include:

  • Content providers would rather not deal with dozens of small CDNs to get widespread geographic coverage — today, they deal with just a couple, or even one, to achieve global reach. The content providers we’re speaking with these days tell us that dealing with multiple, small CDNs is an operational headache.
  • The economics of CDNs don’t work in favor of small providers. There are economies of scale at play here: the more traffic you bring as a content provider, the better prices you get per bit. So it’s cheaper to concentrate traffic with a smaller number of large CDNs.
  • Carriers’ sales teams aren’t skilled at engaging with online content providers. Big guys like Akamai have been doing this for years and have a big advantage.

A third solution for carriers, which attempts to address some of the problems inherent in solution #2, is to create a federation of new CDNs. This would allow content providers like Netflix and YouTube to deal with just one CDN on a commercial level, instead of dozens of smaller ones, even though multiple, regional CDNs would be propagating and distributing the video content for carriers.

This is a nice idea, but it’s far from being a reality. Think about it: It won’t be easy to get operators from different countries, often with competing business objectives and different regulatory frameworks, to work together on a project like this. This doesn’t even take into account the technical challenges of having different content-delivery products made by different vendors, and owned by different carriers, work seamlessly together.

I think there is a better way.

Some new solutions, including Blue Coat and PeerApp, are being developed today that attack the problem through what is known as “transparent caching.” That means inserting a layer of network-optimization technology that makes networks more efficient and cuts costs, specifically by eliminating 60 percent to 80 percent of the video traffic the network would otherwise carry.

Given how sharply video traffic is expected to soar in the coming years, that’s a technology that can make quite an impact in freeing up network resources. In basic terms, this technology temporarily stores popular videos at the edge of the network, so they can be delivered faster, and with less bandwidth, to the geographic areas where they are in high demand.
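To see why caching a fairly small set of popular titles can remove so much traffic, here is a toy calculation. The Zipf-like popularity distribution and the exponent are my own modeling assumptions, not measurements from any carrier:

```python
# Toy illustration: if video popularity follows a Zipf-like curve (an assumption,
# not measured data), caching only the most popular titles captures most requests.

def head_request_share(num_titles, cached_fraction, zipf_s=1.0):
    """Fraction of requests served from cache when the top `cached_fraction`
    of titles (by popularity rank) is stored at the edge."""
    weights = [1.0 / rank**zipf_s for rank in range(1, num_titles + 1)]
    head = int(num_titles * cached_fraction)
    return sum(weights[:head]) / sum(weights)

# Caching just the top 5% of a million-title catalog covers close to 80% of
# requests under these assumptions, the same ballpark as the 60-80 percent
# range mentioned above.
print(f"{head_request_share(1_000_000, 0.05):.0%}")
```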

This type of technology could be installed by carriers such as Comcast and Verizon to make sure the most popular content they’re transmitting is delivered in the highest quality. For example, let’s assume the latest Super Bowl video summary is available at www.nfl.com and becomes very popular in New York. A carrier running this kind of technology would automatically detect the high consumer demand for that video, then serve subsequent requests from network points close to those viewers, improving the viewing experience and reducing costs at the same time.
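A minimal sketch of that detect-and-serve flow might look like the following. The threshold, the region granularity and the callback functions are all hypothetical, chosen only to illustrate the steps described above:

```python
# Hypothetical sketch of popularity detection at the network edge (not any
# vendor's actual logic): count requests per region, and once a video is "hot"
# there, store it locally and answer later requests from the nearby edge cache.
from collections import Counter

POPULARITY_THRESHOLD = 100     # assumed number of requests before a title is cached

request_counts = Counter()     # (region, video_url) -> requests seen so far
edge_caches = {}               # region -> set of video URLs already stored at that edge

def handle_request(region, video_url, fetch_from_origin, store_at_edge, serve_from_edge):
    key = (region, video_url)
    request_counts[key] += 1

    cached = edge_caches.setdefault(region, set())
    if video_url in cached:
        return serve_from_edge(region, video_url)    # nearby copy: faster, cheaper

    data = fetch_from_origin(video_url)              # normal upstream delivery
    if request_counts[key] >= POPULARITY_THRESHOLD:
        store_at_edge(region, video_url, data)       # the title is now hot in this region
        cached.add(video_url)
    return data
```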

There are several upsides to this approach, including significantly lower costs and more flexibility for carriers. A transparent video-delivery solution doesn’t require any specific commercial engagements to work, and it can be used with any type of CDN. Such a system also doesn’t require changes to existing network IP architectures, modifications to system or browser settings, special HTML code or integration with different vendors’ equipment. It’s an intelligent system that could be deployed and managed very easily, almost like a consumer product.

Such a solution has the potential to be a win for the entire video value chain — carriers, content providers, consumers and CDNs. The carriers get a low-cost, easy-to-integrate and flexible solution to deal with the current flood of video traffic. Content providers would have reassurance that their videos would actually get through crowded networks without bumping up against usage caps. (Recently, GigaOM reported that Charter Cable would impose such caps to combat the surge in online video that is overwhelming its network.) Consumers would benefit because they’d be able to see the video they want, in the highest-possible quality, without delays or extra costs. CDNs would also benefit, but that’s a topic for a post of its own.

With today’s technologies and the right algorithms, it is possible to unify real-time network intelligence with high-volume storage and video-delivery capabilities in a compact, small-form-factor appliance. These appliances can then be distributed across the broad edge of carriers’ networks to deliver any form of video at higher quality and much lower cost for all parties.

What we’ve seen in the last few months is that many carriers worldwide have realized the significant benefits of a transparent video-delivery technology and have started to roll out projects accordingly.

We’ll see how the landscape shakes out this year. As content providers expand their video offerings, delivering more and more bits to networks, it’s becoming critical for carriers to deal with their video-overload challenges.

Alon Maor is the CEO of Qwilt, a startup backed by Redpoint and Accel Partners that is developing new transparent video-delivery technology.

Image courtesy of Flickr user covilha.

10 Comments

  1. Why would you say “Some new solutions, including Blue Coat and PeerApp, are being developed today”? These are not new solutions, both PeerApp and BlueCoat have been deploying transparent caching solutions to tackle the problem of video for at least seven years. There are over 500 carriers worldwide that already use their solutions. What’s so “new” about Qwilt?

    1. We are just seeing the beginning of large scale adoption of transparent caching products these days. I’m certain that the other vendors are innovating as we speak, making sure that their products meet the market’s requirements. I would welcome you to visit our site at http://www.qwilt.com and learn more about our differentiation.

  2. I think we see more like 4 hours a day of TV viewing, and that has remained stable even with the growth of online viewing. We will also see much more efficient encoding in the near term, with the rates needed for even HD video dropping in half versus current MPEG-4 rates. Still, I agree with your conclusion that now is the time to get the infrastructure in place to accommodate more streaming video. Users clearly are moving in that direction. Connected devices like smart TVs and game consoles are accelerating the ability to have a great experience and more choice of content.

  3. Reblogged this on Dots Of Color.

  4. Don’t forget Juniper’s Transparent Proxy, Media Flow. We have 3 trial customers out in the market right now and they are seeing up to 20% reduction in their overall HTTP traffic.

  5. And what is the solution for the long tail? Nowadays, long-tail video accounts for over 40% of the traffic on Netflix, over 70% on YouTube and 95% on Facebook.
    Seems like, at best, transparent caching solves half of the problem.

    1. From the data we are seeing in multiple deployments in the US, EU and APAC, the long tail accounts for only ~30% of the video bandwidth.

  6. Seems like a reasonable solution to the dilemma… though I wonder if the cheapest/easiest way might be for carriers to work with media distribution outlets (many of which are tied to or owned by the carriers) to use a P2P protocol, such as BitTorrent, to offload hosting capacity onto consumer boxes and keep video within geographically distributed areas to reduce usage… but perhaps folks wouldn’t want this occurring on their computers… another option would be to upgrade the routers/gateways they provide and include in them, say, 100-500 GB of storage, which could serve for P2P caching.

  7. How can you transparently handle DRM issues? You expect content owners to divulge their encryption keys/methodology?

  8. Alon, would the additional views for the videos stored at the edge be counted as video views for the content provider? I.e., if a video has been viewed once, the content provider gets one video view attributed to them; but after that, if that same video is hit 20 times, will the content provider be given credit for those additional views even though it’s not being pulled from the content provider but from within the cache?
