
The downsides of a gig: what other towns have learned after getting a gig


If you are even remotely interested in broadband, then you’re aware that Google Fiber is coming to Austin. I’ve confirmed it, local Austin news has confirmed it, a gigabit-touting organization has confirmed it, and Google may even have inadvertently confirmed it. It’s happening. Now the big questions are about the details. We’ll find those out tomorrow at the 11 a.m. CT press conference.

But after the city and Google answer questions about where they plan to expand, whether Google will employ the same tactics it did in Kansas City, and other key details, here are a few ways concerned citizens and business leaders can pry a little deeper beneath the surface. Getting a gig is great, but as Kansas City and other gigabit towns can tell you, there’s a big learning curve.

As Google pointed out during its launch in Kansas City, equipment and even services such as speed tests weren’t ready to support gigabit connections. Now Ookla, which runs Speedtest.net, can support a gig, but devices like laptops that don’t support the 802.11ac standard might not. Mike Farmer, the CEO of Leap2, a Kansas City, Kan., startup that has a gig, says that his current MacBook is a bottleneck because, unless he hard-wires it, it can’t support a gig.
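Farmer’s MacBook illustrates a general rule: an end-to-end connection only runs as fast as its slowest hop. A minimal sketch of that idea (the link speeds below are illustrative assumptions, not measured figures):

```python
def bottleneck(link_speeds_mbps):
    """The effective throughput of a chain of links is its slowest hop."""
    return min(link_speeds_mbps)

fiber_drop = 1000   # Google Fiber drop to the home
router_lan = 1000   # gigabit Ethernet out of the Fiber box
old_wifi = 300      # assumed rate for a pre-802.11ac laptop radio

# Over Wi-Fi, the laptop radio caps the whole chain.
print(bottleneck([fiber_drop, router_lan, old_wifi]))   # -> 300

# Hard-wired, the gig survives end to end.
print(bottleneck([fiber_drop, router_lan, 1000]))       # -> 1000
```

The same min-of-the-path logic explains the next problem in the article: even with a gigabit at home, the far end of the connection is usually the slowest link.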

Is there anybody out there?

Mike Farmer of Leap2 praising the Google Fiber box.
But he has a bigger problem as well. “I can watch seven simultaneous YouTube streams in 1080p high-def and Netflix, while still having 750 Mbps left over,” he told me. When I asked what he does with the remaining 750 Mbps, there was silence. And that’s one of the downsides.

The great thing about having a broadband connection is you are connected with billions of people around the world. But if you start building out gigabit-ready applications, or even applications that require 100 Mbps, you’re going to shrink your audience. The Fiber to the Home Council recently estimated that there are more than 640,000 North American households now receiving 100 Mbps service through a FTTH network. I’ve covered this before, but it bears repeating as Google plans to bring its gigabit service to Austin.

As Farmer says, “We have a car that goes 500 mph, but there’s only one road.” But Farmer and people in Chattanooga, Tenn., which is home to another gigabit network, have gotten together to discuss their plight and are planning to create a virtual co-working space using an always-on high-definition camera between their offices.

Farmer is part of a group of Kansas City startups renting a home in a residential area so they can play with Google Fiber. Venture capitalist Brad Feld bought a house in KC and set up an incubator program there too. However, the flip side of the entrepreneurial enthusiasm around Google Fiber is that others in town aren’t prepared for a gigabit connection.

How do civic institutions handle a gigabit? Deacon, managing director at KC Digital Drive, told me that schools, for example, are trying to understand and find money for the gear they would need to support a gigabit. He explained that Google provides a gigabit drop to the school, and the question of how to deploy that connectivity throughout the building or buildings is left up to the administrators. Do they just provide a computer lab where the termination point is and hope for the best, or do they invest in gigabit-capable Wi-Fi access points?

These issues, from a lack of know-how to an inability to brainstorm applications, are the reason that U.S. Ignite was founded almost a year ago. The program aims to teach people what to do with a gigabit connection. The first lesson? It’s not just about speed. Jake Brewer, a spokesman with U.S. Ignite, says speed is only one aspect. Another is giving neighborhoods the ability to control their broadband destiny.

What does a gigabit app even look like?

For example, the three things Ignite wants people thinking about are speed (upload and download), the local cloud and software-defined networking. Much like the deeply nerdy SDN work happening inside data centers, Brewer wants to add programmability and intelligence to the wide-area network. The advantages are many, from being able to easily reroute traffic around congested routes to being able to allocate network resources to a specific application to guarantee high-quality service.
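To make the rerouting idea concrete, here’s a toy sketch of what an SDN controller does conceptually: model the WAN as a weighted graph, and when a link congests, raise its weight and recompute paths so traffic shifts. The topology, city names, and weights below are invented for illustration; real controllers are far more sophisticated than a shortest-path recompute.

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra over a dict-of-dicts graph; returns (cost, path)."""
    heap = [(0, src, [src])]
    seen = set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, weight in graph[node].items():
            if nbr not in seen:
                heapq.heappush(heap, (cost + weight, nbr, path + [nbr]))
    return float("inf"), []

# Toy WAN: weights stand in for a latency/congestion metric.
wan = {
    "KC":      {"Dallas": 1, "Chicago": 2},
    "Dallas":  {"KC": 1, "Austin": 1},
    "Chicago": {"KC": 2, "Austin": 5},
    "Austin":  {"Dallas": 1, "Chicago": 5},
}

print(shortest_path(wan, "KC", "Austin"))  # cheapest route goes via Dallas

# A controller reacting to congestion just updates the weights and
# recomputes: penalize the KC-Dallas link, and traffic shifts to Chicago.
wan["KC"]["Dallas"] = 10
wan["Dallas"]["KC"] = 10
print(shortest_path(wan, "KC", "Austin"))  # now routed via Chicago
```

The point is that the routing decision lives in software, so the network can react to congestion or give a particular application a guaranteed path without rewiring anything.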

As for that local cloud, it may be as simple as storing data closer to the end users or as complicated as creating a town that can harness its compute to double as a data center. For a list of awesome gigabit applications that Brewer and Ignite have helped devise, check out their post from last week.

And there’s the “downside” of getting a gig. Once you have it, the real work begins.

8 Responses to “The downsides of a gig: what other towns have learned after getting a gig”

  1. It really isn’t about trying to figure out how to use a gig. It’s more about NOT having to figure out how to use your broadband.

    What I most like about my electrical connection is that I never have to think about its constraints. I can turn on all my household power guzzlers and then go outside and dwarf their pulls by firing up my pottery kiln. It doesn’t occur to me that I could brown-out my house if I don’t cut back on something. That’s because I can’t. My house is ridiculously, absurdly, oversupplied with capacity.

    Broadband should be like that. And there really isn’t any reason it can’t be. Google is in the process of proving it.

    It’s not about finding an app or even a suite of apps that can simultaneously pull a gig. It’s about installing a broadband “breaker box” that has enough capacity to remove all concern over and even any ability to notice differences in use.

    Not Killer Apps; instead Killer Experiences.

    [Caveat: sure, fine, maybe we don’t need a gig right now to achieve that end. Maybe 100 megs would effectively do this for almost all of us. But putting that “extra” capacity in place now costs almost nothing at the wall of the house, and I’d guess it makes zero difference in actual traffic consumed by a residence.]

  2. Seriously Stacey? Can’t you draw on any history from the 1980s and 1990s when we actually had some meaningful competition in voice which begat competition in data and we witnessed the software/hardware push-pull virtuous cycle of the Wintel model? Or maybe when we went from 8% penetration and 80 minutes a month in analog wireless in 1995 to 700 minutes and approaching 100% penetration in 6-8 years with digital wireless? Seriously?

    That school seriously can’t put in a switch and divide itself into 4 zones, each with 250 meg committed throughput shared across let’s say 4-6 classrooms and 80-150 kids simultaneously in each zone?


    I just feel so John McEnroe-ish about your article (call) this morning!
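For what it’s worth, the zoning arithmetic in the comment above is easy to check. A quick sketch using the comment’s own hypothetical figures (these are the commenter’s assumptions, not measured school traffic):

```python
# A gigabit drop split into zones, per the commenter's scheme.
total_mbps = 1000
zones = 4
per_zone = total_mbps / zones        # committed throughput per zone
students_lo, students_hi = 80, 150   # students simultaneously per zone

print(per_zone)                              # -> 250.0 Mbps per zone
print(round(per_zone / students_lo, 1))      # -> 3.1 Mbps per student, best case
print(round(per_zone / students_hi, 1))      # -> 1.7 Mbps per student, worst case
```

A few megabits per simultaneous student is workable for web use, though it shows why imaging 20 machines at once (as the next reply notes) is a different class of load.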

  3. I’m sorry, but this is idiotic. I can walk into any Staples and get gigabit cards, gigabit switches and everything else needed. Cards are $10 a pop. Switches are $240 for 24 ports. Is it going to be one of the Junipers or Brocades? Nope, but it doesn’t have to be.

    The reason it is creating a “problem” is that the administration consists of people trying to protect their behinds and dole out useless, expensive contracts to their friends.

    • Thomas J. Romano

      For schools and companies you can’t just use a switch from Staples. The switches themselves just don’t have the backbone. Try having 20 machines downloading something while 3 machines are being imaged, all on one switch. Cheap switches will not be able to handle it, even if all ports are at 1 gig.

      You need the $5K switches because they have the backbone to handle the huge bandwidth.

  4. Steve Deuss

    Yes, there is only one highway for that super-fast car. A lot of the internet can’t handle gig speeds. I work for a university that has an uncapped 1-gig connection in the northeast. The backbone they are connected to heads toward Washington, D.C. When I test their speed on the Washington, D.C. test station, I measure over 800 Mbps down and about the same up in that area, depending on time of day. Most content providers are out west in California or Washington state. When measuring their speed on many CA test stations, depending on which one, their speed can range from 2 Mbps to 18 Mbps. Quite a drop from a gig.

    My provider caps my download at 35 Mbps, and they have their own 10 Gb national IP backbone connecting to over 100 internet backbones, content providers and portals. When I test my speed out west on the same test stations, I get close to, or most of the time get, my full 35 Mbps download speed, since I am being routed on my provider’s network and their partners’. Even though the university I work for is much faster than I am, I beat them out west.

    I do look forward to a gig, but it will be better once we get our infrastructure up to speed.


  5. Ian Littman

    The “500 mph, one road” problem is slightly easier to solve than you’d think, if you’re trying to serve GFiber customers: find a data center that will lease you servers with 2x gigabit or 10G ports and good connectivity to GFiber users, then buy services from there.

    Oh wait…those cost $400+ per month. And that’s on the cheap side.

    And then you realize that shoving even 100 Mbps of data down to a web browser stresses even SPDY, Google’s new protocol for making the web a bit more zippy. A fully refactored Blink engine can’t come fast enough (Blink as in the slimmed down WebKit fork that Google announced last week).

    Provided I’m in a GFiber-enabled area, I (as a web developer) plan to hack at answers to the above questions, potentially with Google’s help (for example, seeing if they’ll interconnect at 10Gbps or 20Gbps with some of the data centers here in town). Because, now that there are two major metro areas, plus a third minor one, with gigabit (well, most folks in Chattanooga are running at 50-100 Mbps because a gig costs $300 per month there), there’s at least somewhat of an audience that will suck down content as fast as you throw it at them.