Gigabit networks are expensive to build, and once deployed they generally cost more than the average person can afford. This creates a chicken-and-egg problem for the applications that could drive network adoption: Why build if there’s no audience?

Getting to gigabit networks isn’t a cheap proposition, and once they are deployed, they generally cost more than the average person can afford. For example, a gigabit connection in Chattanooga, Tenn., one of several towns offering such a service, costs more than $300 a month. Even if one can’t get a gig, a 100 Mbps connection can cost about $120. Which means that for most broadband supporters, even ardent ones such as myself, the elephant in the room is: Why spend that much when, for today’s applications, a cable modem offering 12-14 Mbps down will do just fine?

It’s a question analysts posed to Verizon, the company that deployed the nation’s largest fiber-to-the-home network, when they pressed it about take-up rates and boosting subscribers for FiOS. It’s a question Google seeks to answer with its own plans to build out a gigabit network in Kansas City, Kan. and Kansas City, Mo. And it’s also a question we need to focus on more, even as the siren song of mobile connectivity and apps tempts developers to think smaller.

“It’s ironic that the app that is having the most effect and making a big difference is Twitter, which is the most narrowband application imaginable,” says Dane Jasper, CEO of Sonic.net. “Something similar has to occur in broadband as it gets faster and faster and it gets more ubiquitous.”

Jasper’s ISP is overlaying fiber to the home in Sebastopol, Calif., where it already deployed an ADSL2 network. Subscribers can pay $40 a month for wireline voice and 100 Mbps FTTH broadband, or they can pay $70 for two lines and get a gigabit. Those seem more like the economics Google is looking for as it sells its network, but until it announces pricing later this year, we’re still unsure what it plans to offer.

But tests from Jasper’s initial deployment point to some problems the industry will need to overcome if we want gigabit networks to become the norm. For starters, there’s the equipment. Computers today aren’t geared up to support gigabit connections, and current Wi-Fi networks can’t offer those speeds either. Jasper says the first trial of the gigabit network was a speed test on a generic laptop that showed 420 Mbps down; the laptop couldn’t handle a full gig.

That’s fine, because there aren’t that many applications that need those speeds. Perhaps the most compelling use case I can think of right now is subscribing to a new online backup service and uploading your images, music and movies all at once. A gigabit could help you complete the task in minutes as opposed to hours or days. But that’s a one-time kind of benefit, and consumers will need everyday benefits if they are going to upgrade their broadband. Yet network operators have a hard time justifying an investment in a network that will get few subscribers, and application developers have little incentive to develop programs for the few on gigabit networks.
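The minutes-versus-hours claim is easy to sanity-check with a back-of-the-envelope calculation. A minimal sketch (the 100 GB library size and the specific link speeds are assumptions for illustration, not figures from the article):

```python
# Rough transfer-time math: how long does a 100 GB media backup take
# over links of different speeds? (100 GB is an assumed library size.)

def transfer_hours(gigabytes: float, mbps: float) -> float:
    """Hours needed to move `gigabytes` of data over an `mbps` link."""
    bits = gigabytes * 8e9          # GB -> bits (decimal units)
    seconds = bits / (mbps * 1e6)   # Mbps -> bits per second
    return seconds / 3600

for label, speed in [("cable modem (12 Mbps)", 12),
                     ("100 Mbps fiber", 100),
                     ("gigabit fiber", 1000)]:
    print(f"{label}: {transfer_hours(100, speed):.1f} hours")
```

On those assumptions, the 12 Mbps link needs roughly 18.5 hours, 100 Mbps about 2.2 hours, and a full gigabit around 13 minutes, which is the minutes-versus-hours gap described above.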

So we’re stuck at a point where a gigabit, or even 100 Mbps, sounds awesome, but it’s not exactly worth the prices most companies want (or need) to charge. This is why Google’s and Sonic.net’s plans to expand moderately priced 100 Mbps and gigabit networks are so important.

“If every consumer had 100 Mbps, we’d have some better applications,” Jasper said. “At 100 Mbps, high-def video conferencing becomes a reality and you don’t need local storage anymore. You don’t even need local computing.” He pointed me to this awesome video as an example of what might happen, y’know, just in case anybody wants to build those next-generation applications.

  1. Good article – I think there are a couple of very interesting points here:
    1) Even with potentially unlimited bandwidth (and metered pricing), what is the equilibrium pricing/bandwidth point above which consumers will not go? I don’t think we know that yet, but we’ll soon find out.
    2) How fast is “fast enough”? For all the cheerleading that technophiles do for LTE, for example – what use case does it enable that bandwidth-inferior ones (HSPA+, WiMax) cannot?

    1. LTE is better at penetrating building structures, so it can be used indoors. I have WiMax and wish it did better inside buildings.

  2. Living and working in Silicon Valley, I can tell you that I wish I could get gigabit bandwidth at that price. The cheapest option in Sunnyvale, Comcast’s 50 Mbps/10 Mbps, costs $189 + equipment rental per month.

    For the office, the prices are even higher. The question is, if they build it, will customers come? I think for certain parts of the country, the answer is yes. For others, well, we’ll have to see.

    1. NHL Counterpoint Sunday, September 11, 2011

      Sunnyvale’s internet sucks, but don’t shoot down the entire SV. If you can find a subdivision/complex (usually Pulte built), you can find Paxio and get up to gigabit speeds from them.

    2. That’s the real question. Getting regular consumers to think they need super-fast connections is the only way any of the telcos will be able to make money.

      1. I’m not sure they can afford to think that way, Jeff. Most think the way to make money is to charge more for new features on top of the pipe, because as TV providers they now have huge workforces and infrastructure around content to support. Wide-scale consumer adoption of gigabit networks at a cost that makes sense to a telco is probably more than most consumers will want to pay.

  3. Jeremy Zawodny Friday, September 9, 2011

    I used to think I wanted a high bandwidth connection but realized that above a few Mbit/sec, what really matters to me is latency.

  4. One problem here is that Sonic.net’s operation, while it works extremely well on a quasi-local basis, doesn’t scale to the levels needed to reach the masses. Dane runs a very lean shop and is not a greedy bastard like VZ. Anyone lucky enough to live in Sonic’s limited footprint will be one of the ‘haves’, while most of us will have to suffer with limited broadband options or pay through the nose. Still, very nice work by Dane and Sonic.

    1. Oh Peter, wait until next week when I post on Dane’s operations based on the rest of this interview. He’s rethinking broadband, although scale will be slow.

  5. “At 100 Mbps … you don’t even need local computing.”

    This is EXACTLY why Google is investing in broadband.

  6. They have been demoing 4K plasmas at CES for the past few years. Sony just announced a consumer 4K projector.

    High-quality video could be a driver for gig networks. VC-1 or AVC-encoded ~2K Blu-ray at 40 Mbps still shows artifacts. 4K would need roughly 150 Mbps just to be remotely comparable. If we want actual quality (deep color / 12-bit) and reduced motion artifacts, then the rate must be higher. Try to ship high-motion sports events at 4K progressive 60, and the rate goes up again. Most people have multiple sets in their house…

    The opponents of this are the studios (fear of piracy, loss of theater revenue) and the existing TV networks (fearing that a direct relationship between consumers and content originators will put them out of business; e.g. YouTube, Akamai and Level 3 CDNs replace CBS, NBC and HBO).

    1. The opponents are also service providers who are offering slower bandwidth at higher rates, e.g., 10 Mbps for $150 USD.

    2. 4K is awesome technology and would be a huge bandwidth hog. But yes, we will get there eventually — just slower if the SPs have their way :)

  7. Well, when they advertise 100 Mbps or 1 Gbps, how much of that does the customer really get? Eventually I would like to get a remote TiVo HD (1080p); I guess I would need a solid couple of tens of Mbps to support that (Blu-ray goes up to 35 Mbps, but heavily compressed, 8 Mbps could do)… and if we want a few more streams for the whole household, a solid 100 Mbps would be required, and that’s only for video in 1080p. 4K TVs are already coming, and we are only talking about receiving video here. I might want/need to video call in high definition too. So, even with our current hardware at home, we could easily consume over 100 Mbps.

    1. I don’t understand this thread: physical hard drives will be obsolete by the end of the year. Why do you think they are SO cheap now and nothing over 2 TB is being mass-produced?

      Let’s get real. Where there are bottlenecks in the compendium of computing, the purse-string holders must let go of the reins, and not in drips and drabs.

  8. Well, if you’re thinking about video streaming, IPTV or OTT applications, suddenly 100 Mbps starts to make sense.

  9. What do you mean that computers today aren’t geared up for gigabit connections? All modern computers support gigabit. Even last year’s netbooks support 10/100/1000BaseT. The only systems that don’t support gigabit in my house are the cheapest motherboards from three generations back.

    In addition, almost everyone today shares their home’s broadband connection. In my household of four, there are five hard-wired computers, two hard-wired Blu-ray players and three Wi-Fi clients on any given day. If we’re backing up the computers to an online service while we’re watching streaming content on the two TVs, then we’re definitely starved for bandwidth.

    1. I think the point was that the laptop’s hard drive couldn’t keep up with a gigabit download. This becomes less true when you fit a 7200 RPM drive to your laptop.

      I think they somewhat missed the point; in fact, the whole article seems to be based on a fictitious premise. More and more people *do* have equipment capable of writing data to storage at gigabit speed. I have a consumer-grade 2 TB SATA drive that is more than capable of writing 100 MB/s, and that is hooked up to a five- or six-year-old PC.

      If I had full duplex gigabit internet at home, I would be doing quite a lot with it. I would bring my websites home. I would host my own email. I would host game servers.

      It’s the same argument for businesses. I have an offer to pay £12K/year for 100 Mbps or £24K/year for a gigabit. That’s just ridiculously expensive, but the advantages it would bring would be nothing short of revolutionary and would unleash a huge amount of efficiencies and opportunities.

      Perhaps an analogy to our current ‘miserly’ internet performance is the amount of CPU power and installed RAM we all had to suffer with a mere 10 years ago. Software had to be written to be efficient and to creatively cut corners to get the job done.

      We have to cut a lot of corners to enable video streaming. Blocky, error-prone imagery with the odd “buffering… please wait” thrown in for good measure. That is how a lot of people experience their streaming video.

      Throw a decent sized buffer at a gigabit connection and you too might be able to enjoy full HD, 120Hz (i.e. 3D) video being streamed to your TV.

      Maybe it’s nice to be controversial and argue against gigabit internet, but it’s certainly not helpful and largely ridiculous.

      1. Stefan, I’m not arguing against gigabit Internet, but I am concerned that today most consumers don’t see the value in connection speeds above 20 Mbps or so, which has obvious implications for how willing operators are to deploy it.

      2. Yes, but a laptop display and processor CAN keep up with an HD video stream; not everything has to be written to disk. In fact, I suspect Hollywood would be very happy if it could guarantee that streamed data was not written to disk.

  10. Planning for today’s bandwidth requirements is a sure way to stay behind the technology curve. Remember, it takes decades to deploy higher speed infrastructure. Are you sure you want to limit yourself to what a single laptop can consume now?

