
Summary:

Amid the debate over network neutrality, transparent network management is generally accepted, but in practice it may not improve the end user experience as much as everyone hopes, since there are so many players between the end user and the content provider. We need network intelligence.

I took a break in the middle of the FCC hearing on open access and transparency today to figure out, with Time Warner Cable, my Internet Service Provider, why the FCC feed and many other live online video feeds are problematic for me. It was more than a little ironic, since a lack of network management disclosure on Time Warner’s part could have been causing my problem (it wasn’t), but it’s also a wonderful example of how even a transparent network doesn’t always deliver a great end user experience.

The Federal Communications Commission is trying to mandate six network neutrality rules governing how and when Internet Service Providers can mess with the bits traveling over their pipes (GigaOM Pro, subscription required), and one of those rules dictates that ISPs be transparent with their users about how they mess with bits to improve the quality of the customer experience. The proposed rules also mandate that any such network management interference be “reasonable.” Comcast’s decision to throttle P2P traffic without letting users know would have violated the transparency principle, and the company was later censured because the FCC found that its actions weren’t reasonable.

But sending bits across the web is far more complicated than most people realize. Those bits pass through pipes owned by different companies, multiple routers, a variety of servers and even your home equipment, so a transparent window into what an ISP is doing doesn’t guarantee you can watch a live FCC broadcast without significant skips, buffering and other problems that may have a web user railing against their broadband provider. That’s not to say the FCC rules aren’t a good thing (I think they are), but getting a quality video or even web site experience isn’t just an ISP issue, nor will network neutrality ensure high-quality video delivered via broadband.

For example, a traceroute command on the FCC broadcast revealed that my cable provider wasn’t the issue; the FCC server was. A few months back, a problem I had with Cisco’s live webcasts may have been traceable to an ISP issue or insufficient CPU power on my end (although since I didn’t involve Time Warner in that one, I’ll never know). Most users don’t have the resources to get on the phone quickly with someone knowledgeable to see what might be causing a problem (and to be fair, when Hulu stutters during “The Daily Show,” I have to wait like everyone else). But even if they were able to access the intelligence of network engineers at their ISP or the host company, who wants to stop a YouTube video to figure out the issue when something better is a click away?
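
For the curious, here’s a minimal sketch in Python of the kind of check I ran: it invokes the standard traceroute utility toward a streaming host and flags the hop where round-trip times jump, which is a rough signal of where along the path trouble begins. The fcc.gov hostname and the 100 ms jump threshold are stand-ins I chose for illustration, and it assumes a Unix-like system with traceroute installed.

```python
import re
import subprocess

def trace(host: str) -> None:
    """Run traceroute and flag the first hop where latency jumps."""
    out = subprocess.run(
        ["traceroute", "-n", host], capture_output=True, text=True
    ).stdout
    prev_avg = None
    for line in out.splitlines()[1:]:  # skip the "traceroute to ..." header
        times = [float(t) for t in re.findall(r"([\d.]+) ms", line)]
        if not times:
            print(line + "   (no reply; this hop may just filter probes)")
            continue
        avg = sum(times) / len(times)
        # A >100 ms jump between hops is an arbitrary threshold for this sketch.
        if prev_avg is not None and avg - prev_avg > 100:
            print(line + "   <-- latency jumps here")
        else:
            print(line)
        prev_avg = avg

trace("fcc.gov")  # stand-in for whatever host is serving the stream
```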

So what does this mean for those of us who just want our online video to work? Knowing what your ISP is doing to your traffic is a key step, but for the best experience, I think there’s a market for a network intelligence-gathering service, ideally offered by the content provider, that a user can click on to get a sense of what issues might be halting content between the provider and the user’s home. Maybe the button starts a ping test and delivers the results in a user-friendly format noting which servers may be causing problems.
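
As a rough sketch of what that button might do under the hood, the snippet below pings a few checkpoints along the path (the home router, an ISP gateway, the content server) and reports the results in plain English. The hostnames are hypothetical placeholders, the parsing leans on the summary lines Unix ping prints, and a real service would obviously need something sturdier.

```python
import subprocess

# Hypothetical checkpoints; a real tool would discover these per user.
CHECKPOINTS = {
    "your home router": "192.168.1.1",
    "your ISP's gateway": "isp-gateway.example.net",
    "the content server": "video-origin.example.com",
}

def check(label: str, host: str) -> str:
    """Ping a host five times and summarize the result in plain English."""
    proc = subprocess.run(
        ["ping", "-c", "5", host], capture_output=True, text=True
    )
    if proc.returncode != 0:
        return f"Trouble spot? {label} ({host}) did not answer pings."
    # Unix ping ends its output with packet-loss and round-trip summaries.
    summary = " / ".join(proc.stdout.strip().splitlines()[-2:])
    return f"{label} ({host}) looks healthy: {summary}"

for label, host in CHECKPOINTS.items():
    print(check(label, host))
```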

Router makers could build similar capabilities into home equipment. Maybe YouTube isn’t going to start offering this (although Google does have some nice tools for getting more insight into your ISP’s network management), but as more business information and services move to the web thanks to the shift to the cloud, some sort of user-friendly tool would make a complex process a bit easier to understand. I hope the idea takes off and end consumers benefit. If that happens, companies like AlertSite, or Compuware, which bought Gomez, may find themselves with a consumer-facing quality assurance product.


  1. “I took a break in the middle of the FCC hearing on open access and transparency today to figure out…”

    “Open access”? Why do you think they are talking about “open access,” a term that, if used accurately, refers to a set of policies such as unbundling (UNE-P, UNE-L), line sharing and bitstream access? The workshop page mentions the “Open Internet,” which is code for net neutrality, but that is not the same thing as open access.

    Journalists writing about the FCC need to get the lingo right, because it matters. Kevin Martin purposely took advantage of this lack of understanding in the context of the 700MHz C-Block spectrum auction, getting journalists to use the phrase “open access” to boost his pro-consumer cred without actually giving in to consumer groups’ demands for true wholesale open access policy requirements.

    1. Maybe this confusion about terminology is why Stacey thinks that the proposed rules would be a “good thing?”

      The fact is that they’d kill innovation and competition, harm quality of service (causing even more problems with her streaming video), and raise prices to consumers.

      What’s more, the requirement for transparency — which Stacey pooh-poohs above — is in fact the only part of the rules that actually makes some sense.

      We need accurate reporting on this subject. Om, Sebastian: want another blogger?

      1. Yes, because Brett Glass, a WISP whose rabid and vitriolic online persona has gotten him kicked off multiple listservs, would be such a fabulous addition to GigaOM, an otherwise reasonable place to visit.

      2. Brett, it’s the transparency requirement that I was calling a good thing. It’s something I’ve said we’ve needed for a while.

      3. These are just opinion pieces, not reporting. It would be nice to see more fact-based discourse, though the topic seems too religious and emotional for any of that to happen.

      4. It seems as if “Alfred,” above, is the one who is being vitriolic (and casting false aspersions as well). I write about facts, inconvenient though they may be.

  2. Let me agree that the FCC’s live stream is terrible. It didn’t use to be this way; they’ve changed something since they moved to fcc.gov/live. It’s a shame that this “open FCC 2.0” can’t get this public-facing aspect of its act together. For others who are frustrated: they always make a conference call number available, which of course lacks video but is stutter-free. Score one for the POTS network.

    1. Alfred, you have to be joking — under Martin and Powell the FCC live stream was barely OK (RealMedia player? LOL), and only if you were one of the first 50 or so to log in. Anyone later had to pay $20 a pop. For what it’s worth, I haven’t had any problems with the new feeds.

      I’d say this FCC has gotten more of its act together in less than a year than the previous two chairmanships combined, especially when it comes to live video, interactive websites and blogs. Not sure what good old days you remember, but they definitely don’t apply to the old FCC video feeds.

      1. Well, yes. It was a crummy RealPlayer stream, and it was limited to 200 users. But it worked.

        Interesting thing, though: today’s open meeting streamed for me without a problem on both 3G and DSL connections, as well as for a colleague on a cable modem. Perhaps they made some changes since yesterday, as a result of this post?

        As for this FCC having its act together — yes, let’s give them a gold star for starting like 3 blogs. But on actually getting policymaking done, not so much. Go back and review Powell and Martin’s first half year, versus Genachowski’s. And I say this as someone who has no love at all for the destruction Powell and Martin caused to our telecom markets. Genachowski showing up at GigaOM headquarters is nice, but it doesn’t mean anything if all he does is speak in bromides, and fails to follow up with real policy change.

      2. I think the new broadband plan mandated by Congress will be a significant step forward — and I think the FCC was smart to spend most of the first half-year gathering data and opinions via the numerous open meetings, workshops, etc.

        Since Martin was basically a caretaker for AT&T, I guess you could say that by doing nothing, he did exactly what he was supposed to do. Anything else his FCC did, like the 700 MHz auctions (which he now takes credit for), was already in motion before he became chairman. And even then they managed to screw up the D Block, perhaps by design but to the detriment of public safety first responders, who might be making use of that necessary bandwidth right now…

        Either way, I think this will be a much different discussion in March when the broadband plan emerges.

  3. The network neutrality debate centers on ensuring that ISPs provide neutral access to content on the web as it passes through their networks. This is particularly important for users: without network neutrality, an ISP could essentially become the distribution “mafia,” delivering content well for companies that “pay up” and relegating those that don’t to a lesser distribution channel.

    AlertSite’s DéjàClick already provides some really great desktop functionality for consumers, and we’re working on figuring out how best to aggregate and publish some of the really interesting Internet performance data we collect every day.

    Ken Godskind, Chief Strategy Officer for AlertSite
    http://blog.alertsite.com
