Transparency Is Good But Intelligence Is Better

I took a break in the middle of today’s FCC hearing on open access and transparency to figure out, with Time Warner Cable, my Internet Service Provider, why the FCC feed and many other live online video feeds are problematic for me. It was more than a little ironic, since a lack of network management disclosure on Time Warner’s part could have been causing my problem (it wasn’t), but it’s also a wonderful example of how even a transparent network doesn’t always deliver a great end-user experience.

The Federal Communications Commission is trying to mandate six network neutrality rules governing how and when Internet Service Providers can mess with the bits traveling over their pipes (GigaOM Pro, subscription required), and one of those rules dictates that ISPs be transparent with their users about how they mess with bits to improve the quality of the customer experience. The proposed rules also mandate that any such network management interference be “reasonable.” Comcast’s decision to throttle P2P files without letting users know would have violated the transparency principle, and the company was later censured because the FCC found that its actions weren’t reasonable.

But sending bits across the web is far more complicated than most people realize. Those bits pass through pipes owned by different companies, multiple routers, a variety of servers and even your home equipment, which means a transparent window on what an ISP is doing doesn’t guarantee you can watch a live FCC broadcast without significant skips, buffering and other problems that may leave a web user railing against their broadband provider. That’s not to say the FCC rules aren’t a good thing (I think they are), but getting a quality video or even website experience isn’t just an ISP issue, nor will network neutrality ensure high-quality video delivered via broadband.

For example, a traceroute command run during the FCC broadcast revealed that my cable provider wasn’t the issue; the FCC server was. A few months back, a problem I had with Cisco’s live webcasts may have been traceable to an ISP issue or insufficient CPU power on my end (although since I didn’t involve Time Warner on that one, I’ll never know). Most users don’t have the resources to get on the phone quickly with someone knowledgeable to see what might be causing a problem (and to be fair, when Hulu stutters during “The Daily Show,” I have to wait like everyone else). But even if they were able to tap the intelligence of network engineers at their ISP or the host company, who wants to stop a YouTube video to figure out the issue when something better is a click away?
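For readers who want to try this kind of check themselves, here’s a minimal sketch. It assumes a Unix-like system with the traceroute utility installed; the host name is a placeholder, not the actual FCC server, and the 100 ms threshold is just a rough rule of thumb.

```python
# Minimal sketch: run traceroute and flag hops with unusually high latency,
# which helps show whether trouble starts at your ISP or near the content server.
# Assumes a Unix-like system with `traceroute` on the PATH;
# "media.example.com" is a hypothetical streaming host.
import re
import subprocess

HOST = "media.example.com"  # placeholder, not the real FCC server

output = subprocess.run(
    ["traceroute", "-n", HOST], capture_output=True, text=True
).stdout

for line in output.splitlines()[1:]:  # skip the "traceroute to ..." header
    # A hop line looks like: " 5  68.86.90.25  31.2 ms  30.8 ms  32.1 ms"
    times = [float(t) for t in re.findall(r"([\d.]+) ms", line)]
    if not times:
        continue  # hop didn't respond ("* * *")
    avg = sum(times) / len(times)
    flag = "  <-- possible trouble spot" if avg > 100 else ""  # rough threshold
    print(f"{line.strip()}{flag}")
```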

So what does this mean for those of us who just want our online video to work? Knowing what your ISP is doing to your traffic is a key step, but for the best experience, I think there’s a market for a network intelligence-gathering service, ideally offered by the content provider, that a user can click on to get a sense of what issues might be holding up content between the provider and the user’s home. Maybe the button starts a ping test and delivers the results in a user-friendly format noting which servers may be causing problems.
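As a rough sketch of what such a button might do under the hood, assuming the content provider exposes a list of the servers in the delivery path (the host names and the latency/loss thresholds below are hypothetical):

```python
# Sketch of the proposed "check my connection" button: ping each server in
# the delivery path and report the results in plain language.
# Assumes a Unix-like `ping`; the hosts and thresholds are hypothetical.
import re
import subprocess

# Hypothetical delivery path a content provider might expose.
PATH = ["gateway.isp.example.net", "cdn-edge.example.com", "origin.example.com"]

def check(host: str, count: int = 5) -> str:
    result = subprocess.run(
        ["ping", "-c", str(count), host], capture_output=True, text=True
    ).stdout
    loss = re.search(r"([\d.]+)% packet loss", result)
    rtt = re.search(r"= [\d.]+/([\d.]+)/", result)  # average round-trip time
    if not loss or not rtt:
        return f"{host}: unreachable -- likely the trouble spot"
    loss_pct, avg_ms = float(loss.group(1)), float(rtt.group(1))
    if loss_pct > 5 or avg_ms > 150:  # rough "user-visible problem" cutoffs
        return f"{host}: slow ({avg_ms:.0f} ms, {loss_pct:.0f}% loss) -- may be causing problems"
    return f"{host}: looks healthy ({avg_ms:.0f} ms)"

for host in PATH:
    print(check(host))
```

The point isn’t the specific checks; it’s that the raw numbers get translated into a verdict an ordinary viewer can act on.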

Router makers could also build such intelligence into home equipment. Maybe YouTube isn’t going to start offering this (although Google does have some nice tools for getting more insight into your ISP’s network management), but as more business information and services are delivered via the web thanks to the movement to the cloud, some sort of user-friendly tool would make a complex process a bit easier to understand. I hope the idea takes off and end consumers benefit. If it does, companies like AlertSite, or Compuware, which bought Gomez, may find themselves with a consumer-facing quality assurance product.
