
Summary:

In this part of our special report on reinventing the internet, a look at how the wireless networks of the future should evolve to handle a world in which mobile computers are the standard computers.


Let’s face it: today the internet is wireless. Comcast may provide the coaxial cable into your home. Level 3 and AT&T may build the fiber optic backbone connecting cities. And Google and Netflix may run the massive server banks where your content originates. But the last hop to your device is more often than not a wireless one, whether it’s the last mile, the last 20 feet, or the last inch.

Your phone isn’t just connecting to a distant carrier’s tower. It and your laptop hop on your home Wi-Fi router whenever you’re in range. Your wearable computers talk to your phone via Bluetooth. It’s a trend that’s only increasing. More devices — from TVs, to cars, to traffic lights — are coming with embedded radios, joining our phones, laptops and tablets in demanding more capacity over our increasingly crowded airwaves.

To build a better internet, we’re going to have to build better wireless networks. If we don’t, we’ll be forced to reverse one of the biggest technological gains of the 21st century: disconnecting the world from wires.


Connecting more gadgets isn’t the only goal here. We need a wireless internet that isn’t just available to more gadgets and doesn’t just feed our ever-growing desire for speed. We need wireless broadband networks that are available outside of big cities and can connect the large part of the world’s population that has little or no access to the internet. Those networks need to provide consistent, resilient connections as we move from place to place.

Most importantly, we need the mobile data those networks provide to be cheap — dirt cheap. The price of mobility can’t remain $15-$20 a month for a gigabit of data; otherwise we’re not making the internet mobile at all. We’re just creating an expensive toy that the majority of people can only access on a limited basis.

We can build that wireless network, but there are two problems we need to solve. The first is a problem of physics, which the wireless industry can work out in the labs. The second is a problem of political will.

Catching up with Claude Shannon

When the tech world talks about mobile, there’s often a rather simplistic assumption that the mobile industry needs to develop some big leap in radio technology to solve its problems. Lately those hopes have been pinned on the amorphous concept called 5G and a single-minded focus on speed — all we need, supposedly, is a 1 Gbps connection to the device. But the problem is hardly that simple, and to understand why, you have to understand how wireless networks work.

The wireless industry has traditionally had three levers it can pull to build faster networks. The first is spectrum: the more spectral bandwidth it can add to a network, the more bandwidth is available for data. The second is spectral efficiency: shoving more bits into the same-sized pipe. The third is density: by shrinking the geographic area a transmission has to cover, we’re able to re-use the same spectrum over and over; in other words, we create more “cells.”

For the last two decades the mobile industry has been yanking on that second lever, moving from analog to digital, from 2G to 3G, from 3G to LTE. But we’re reaching the point where that lever won’t budge any further. Our current wireless networks are bumping up against the upper limit of how many bits per second can be crammed into a hertz of spectrum over any given transmission path.

“We have a whole bunch of radio interfaces that will take you right up to the limits of what you can transmit over an antenna,” GSM Association Senior Director of Technology Dan Warren said in a recent interview. That barrier is known as Shannon’s Limit, and right now we’re 99 percent of the way there, Warren said. Radio technologies are constantly improving, but getting within 99.5 percent of Shannon’s limit isn’t going to spark another revolution in wireless.

Claude Shannon, the father of information theory, at Bell Labs in 1950. Shannon proposed that there was a limit to the amount of information that could be transmitted over a single communication channel. We're hitting that limit today. (Source: Alcatel-Lucent)

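Shannon’s limit can be made concrete with a few lines of arithmetic. The Shannon-Hartley theorem puts a hard ceiling of C = B·log2(1 + SNR) on any channel, and a quick sketch (the channel width and SNR figures below are illustrative, not from the article) also shows why adding spectrum pays off linearly while adding power yields only logarithmic gains:

```python
import math

# Shannon-Hartley theorem: the capacity ceiling of a channel is
# C = B * log2(1 + SNR), for bandwidth B in hertz and linear SNR.
def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: a 20 MHz channel (a typical LTE carrier width)
# at 20 dB SNR (linear SNR of 100) tops out around 133 Mbps.
cap = shannon_capacity_bps(20e6, 100)

# Doubling transmit power (3 dB more SNR) buys only a logarithmic gain...
gain_from_power = shannon_capacity_bps(20e6, 200) / cap

# ...while doubling bandwidth (lever one: more spectrum) doubles capacity.
gain_from_spectrum = shannon_capacity_bps(40e6, 100) / cap
```

Once a radio operates within a percent or two of that ceiling, the only big wins left are more spectrum and more spectrum re-use, which is why the industry is pulling on the other two levers.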

But what about the new technologies we’ve been hearing about lately, like LTE-Advanced and gigabit Wi-Fi, which deliver enormous gains in speed? The thing is, those big speed boosts are largely accomplished by piling more spectrum onto the network. We’re widening our lanes with more airwaves, not increasing the speed at which information can flow over them.

Consequently the wireless industry of late has been yanking on that first lever, as anyone who has been paying attention to the policy debates in Washington and the wave of consolidation among mobile operators will have noticed. The big carriers are buying up their smaller counterparts to harvest their airwaves. The tech sector is lobbying the FCC for more unlicensed airwaves so it can build more powerful Wi-Fi networks.

There’s no magic bullet to get us that next horizon in wireless networking, Warren said. Instead the industry is looking at a host of different technologies that will boost capacity and eke out more efficiency in the way data is delivered.

It’s looking squarely at that third lever — spectrum re-use — by deploying small cells that would basically create the same amount of capacity we’d get from a big tower, but in a much smaller area. Massive antenna arrays could move wireless networks into higher bands of frequencies, like the millimeter waves that were formerly unusable for connecting any kind of consumer device. New beamforming technologies would create stronger, more resilient signals and thus more consistently fast connections to the phone or tablet. Carriers have even started deploying software-defined networks in their cores to help them adapt to these changing technologies more quickly.

Ericsson CEO Hans Vestberg holds up his company's newest small cell, the Radio Dot (Photo by Kevin Fitchard)


Breaking down the barrier between internets

In order for the mobile internet to get better, it needs to look more like the internet itself. The internet is flat and highly distributed, with no central point of control. In contrast, mobile networks have basically been built on the model of our phone network: a highly centralized system controlled by an all-mighty core that manages connections and traffic all the way down to the cell.

“We’ve had the same design paradigms since the days of Bell Labs of what a cell looks like,” said Vish Nandlall, North American CTO and head of strategy and marketing for mobile network vendor Ericsson. “We have to disrupt our basic notion of what a cellular network looks like. … Let’s get away from the cell as the basic unit of design.”

Instead of our devices linking to a single cell, they could link to multiple nodes on the network simultaneously. The tower in the distance could provide your download link, while your upstream link could be sent to a nearby small cell more capable of handling your phone’s weaker transmitted signal. A video could be parsed out among multiple connections and reassembled on your handset screen.


Vish Nandlall, North America CTO, Ericsson at Mobilize 2013 (c) 2013 Pinar Ozger pinar@pinarozger.com

Nandlall is taking a relatively new concept in mobile networking, called the heterogeneous network, and pushing it to its logical extreme. The idea is that no device needs to be tied down to a single network or a single transmission. It could access Wi-Fi just as easily as cellular or any number of other wireless technologies to find the most efficient path to the internet. Different types of content might take different routes, depending on their needs. An email could follow a higher-latency, low-bandwidth circuit through the network, while a video conference call would be sent over a highly prioritized connection.
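That traffic-steering idea can be sketched in a few lines. The link characteristics, flow classes and the `pick_link` helper below are all hypothetical, invented purely to illustrate matching flows to paths by their latency and bandwidth needs:

```python
from dataclasses import dataclass

# Hypothetical models of links and traffic classes -- the names and
# numbers are invented for illustration, not drawn from any real API.
@dataclass
class Link:
    name: str
    latency_ms: float
    bandwidth_mbps: float

@dataclass
class Flow:
    kind: str
    max_latency_ms: float      # worst latency the application tolerates
    min_bandwidth_mbps: float  # throughput the application needs

def pick_link(flow: Flow, links: list) -> Link:
    """Route each flow over the least capable link that still meets its
    needs, leaving the fastest paths free for demanding traffic."""
    adequate = [l for l in links
                if l.latency_ms <= flow.max_latency_ms
                and l.bandwidth_mbps >= flow.min_bandwidth_mbps]
    return min(adequate, key=lambda l: l.bandwidth_mbps)

links = [Link("macro cell", latency_ms=60, bandwidth_mbps=10),
         Link("small cell", latency_ms=25, bandwidth_mbps=50),
         Link("home Wi-Fi", latency_ms=10, bandwidth_mbps=100)]

email = Flow("email", max_latency_ms=500, min_bandwidth_mbps=0.1)
video_call = Flow("video call", max_latency_ms=50, min_bandwidth_mbps=5)

pick_link(email, links)       # tolerates delay: the distant macro cell will do
pick_link(video_call, links)  # needs a prioritized, low-latency path
```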

In such a model, anything can be a node that routes and relays traffic on the network. Instead of just communicating with other cells and access points, our phones could link to each other. A phone that has a stronger connection to the network than its neighbor could relay that connection through Wi-Fi or Bluetooth. Phones and cells together could form vast ad-hoc networks that could bypass the network core entirely, sending local communications and cached internet content through what amounts to a constantly morphing mesh network.

There is a limit to the number of nodes you can stick in a network before you reach a point of diminishing returns, Nandlall said. Every new transmitter introduces more interference, and eventually you get a soup of competing signals that carry no information at all. But the industry is getting better at managing and mitigating the interference created when these “cells” overlap.
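A toy calculation illustrates the diminishing returns Nandlall describes. Every co-channel transmitter re-uses the same spectrum but also raises the interference floor for its neighbors, so each added node contributes less than the one before (all figures here are made up for illustration):

```python
import math

# Toy model of lever three hitting diminishing returns. Each extra
# co-channel transmitter adds capacity but also adds interference
# for every neighbor. All figures are invented for illustration.
SIGNAL = 1.0
NOISE = 0.01
INTERFERENCE_PER_NODE = 0.05

def total_capacity(nodes: int) -> float:
    """Aggregate Shannon capacity (per hertz) of `nodes` overlapping cells."""
    sinr = SIGNAL / (NOISE + (nodes - 1) * INTERFERENCE_PER_NODE)
    return nodes * math.log2(1 + sinr)

# The marginal gain of each added node shrinks toward zero as the
# soup of competing signals thickens:
gains = [total_capacity(n + 1) - total_capacity(n) for n in (1, 5, 20)]
```

Better interference management effectively lowers the interference term, which is what pushes the point of diminishing returns further out.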

The point is, though, that if every device and every node optimizes every possible transmission path over the network, you maximize its potential capacity. You can exponentially increase the number of devices simultaneously connecting to wireless networks and deliver faster, better connections while consuming less power.

Is spectrum really scarce?


New America Foundation’s Sascha Meinrath (center left) speaking at Gigaom Mobilize 2013 (c) 2013 Pinar Ozger pinar@pinarozger.com

If maximizing the capacity of the network is the goal, then the mobile industry is doing a dismal job, according to Sascha Meinrath. Meinrath is director of New America Foundation’s X-Lab and the founder of the Open Technology Institute, which is dedicated to building community broadband networks that make internet access available and affordable to everyone by bypassing the traditional telecom players.

Meinrath believes that the carriers’ business models are built on the false premise of spectrum scarcity. The reasoning goes like this: spectrum is a limited asset — there are only so many parts of the electromagnetic spectrum over which we can transmit useful wireless signals today — and therefore bandwidth must be rationed out on a limited basis at high prices. But Meinrath’s point is that spectrum isn’t scarce; it’s just tightly controlled by governments and carriers.

In any given location, Meinrath said, 90 percent of the spectrum that could be used for wireless data services sits unused: the channels between TV and radio broadcasts, hundreds of megahertz of lightly used government frequencies, and all of the vast holes in cellular coverage across the country. We talk about the airwaves being crowded, but looked at as a whole, the usable radio spectrum contains a lot of dead air.

A screenshot of Google's White Spaces database tool showing unused TV frequencies throughout the Midwest (Source: Google)


“Spectrum scarcity is a myth,” Meinrath said. “Anyone only has to turn on their FM radio to see it.”

If the government would just open those frequencies up to public use, then we could build sweeping networks that could deliver enormous quantities of cheap capacity, Meinrath said. Incumbents could get priority access to their airwaves, but when and where they’re not using them, anyone should be able to tap those frequencies using Wi-Fi, white-spaces broadband or any other wide- or local-area wireless technology that can access them.

That’s an argument Silicon Valley likes to hear, and it strikes at the heart of a long-running debate over unlicensed, free-to-use airwaves and restricted, licensed airwaves owned by carriers. The mobile industry maintains that licensed airwaves are the crucial foundation of any wireless system, because they’re used to build managed networks that can span long distances. By opening up all of the airwaves to unfettered unlicensed use, you would get, in Warren’s words, “carnage,” with cross-interfering networks essentially canceling each other out.

Even under their existing restrictions, the unlicensed bands are already a mess of competing Wi-Fi and Bluetooth signals. That said, the unlicensed bands may be chaotic, but it’s a beautiful chaos. The Wi-Fi industry has done more than any other spectrum user to squeeze the maximum amount of benefit out of its limited quantity of spectrum. The inclusion of Wi-Fi in the original iPhone did far more to spur the smartphone revolution than the device’s measly 2G radio.

I’m not arguing for the demise of the mobile carriers by any means, but given the innovation that’s occurred on Wi-Fi, more airwaves need to be opened up for unlicensed and shared use — and a lot of them.

Ultimately our future mobile internet depends on it. Carriers will be crucial to building the pervasive mobile networks that glue everything together, but accessing those networks is never going to be rock-bottom cheap. Mobile data will get less expensive, but you’re never going to pay ten bucks for 100 GB; if prices fell that far, carriers would have no economic incentive to build those networks.

But if we integrate unlicensed technologies like Wi-Fi into our mobile networks – which is already starting to happen with Hotspot 2.0 – and give those technologies more spectrum to expand, we can establish a foundation of cheap, plentiful broadband to balance out the higher costs of truly mobile broadband.

Photo by Tomasz Wyszoamirski/Thinkstock


What your future wireless internet could (and should) look like

So what’s the end game here?

The ideal situation is a mobile plan that switches between multiple networks operated by multiple carriers and ISPs, selecting the best and cheapest connection at any given moment. The workings of that ideal plan would remain opaque to users; they would just get a consistent broadband connection on all of their devices and the freedom to consume as much data as they liked for a nominal monthly fee paid to a single entity.

That’s not going to happen any time soon. I believe the mobile internet is going to become cheap, it’s going to become widely available and it’s going to provide a far more consistent broadband experience. But it’s also going to take some work on our part. We’re no longer going to rely on a single company providing a single connection over a single network.

There are four nationwide mobile carriers in the U.S., and by next year that number could be reduced to a mere three. If you’re expecting one of those operators to satisfy all of your future wireless data needs then I assure you, you’re going to be disappointed. We’ll need to tap public Wi-Fi like the hotspot grids being built by the cable operators, and we’ll need to encourage the growth of community networks being built by our city councils, our neighborhoods or even internet companies like Google.

Startup Open Garden's concept of a crowdsourced mesh network (Source: Open Garden)


We’re already starting to become familiar with the concept of “Wi-Fi First,” in which we use cellular data only when all of our Wi-Fi options are exhausted. In the future, we could have third, fourth and fifth options, tapping new networks using white spaces, short-range links like WiGig or Bluetooth or new networking standards we haven’t even envisioned yet.
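In practice, “Wi-Fi first” with extra fallbacks amounts to a cost-ordered preference list. This sketch is purely illustrative; the network names, availability flags and per-gigabyte prices are all invented:

```python
# A cost-ordered sketch of "Wi-Fi first" with extra fallbacks. The
# network names, availability flags and per-gigabyte prices below
# are invented for illustration.
NETWORKS = [
    # (name, currently in range, price per GB in dollars)
    ("home Wi-Fi",        True,  0.00),
    ("community mesh",    False, 0.00),
    ("white-spaces link", True,  0.50),
    ("LTE carrier",       True,  10.00),
]

def choose_network(networks):
    """Connect to the cheapest network in range; cellular is the last resort."""
    in_range = [n for n in networks if n[1]]
    return min(in_range, key=lambda n: n[2])

best = choose_network(NETWORKS)  # falls down the list as cheaper options vanish
```

A real connection manager would weigh signal quality and congestion as well as price, but the basic shape is the same: cellular becomes the option of last resort rather than the default.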

I also think we’ll have to abandon the idea that we “own” our connections. At any given time we’re surrounded by wireless networks, but we only have permission to access one or two of them. To make the most efficient use of wireless spectrum, we’ll have to use every possible network at our disposal. We’ll need to share our connections, whether that means letting another user hop onto your home Wi-Fi network or letting them borrow your smartphone’s radio to get a faster link to the 4G network.

I’m not saying we share our data plans with strangers or open up our private information to anyone with a wireless device. We can put mechanisms in place to ensure our own devices are secure and that we’re billed for the data we consume on a borrowed connection. The whole idea might sound like some misguided socialist broadband utopia to some, but it’s really not that big a stretch. It’s really just an extension of a fundamental principle of the internet.

While companies may own the infrastructure the internet runs on, no one owns the internet itself. We may own our smartphones and laptops and Wi-Fi routers, but we have to stop thinking of them as terminal points in the network. They have to become part of the network so we can build a better wireless internet for everyone.


Images from Tomasz Wyszoamirski/Thinkstock and Pinar Ozger. Banner image adapted from Hong Li/Thinkstock. Logos adapted from The Noun Project: Castor and Pollux, Antsey Design, Mister Pixel and Bjorn Andersson.


  1. Kevin… excellent, thoughtful piece.

    1. Thanks Evan, I appreciate it.

  2. brettwatkins Wednesday, May 7, 2014

    Well said Kevin… this HAS to be the most sensible, practical, and equitable path for our networked future. Trouble though (as you pointed out) is that existing models, both commercial for telcos and regulatory for governments yielding billions from licensing or auctioning spectrum, MUST change.

    Here in Australia, just as in the US and other areas, there has been tremendous investment in duplicated data networks. Our NBN is trying to level the playing field and allow carriers to focus on building value – and investing less in overlapping infrastructure. Trouble is – it’s based on the same old model… that everyone should have “one” fast connection and plug into it (or at best use a personal wi-fi network at the end of fibre).

    I relish the day when wireless is less restricted and may provide the ubiquitous network (at a fair and reasonable fee for service) that you describe. That ole adage of “by growing the pie – everyone will have a larger slice” MUST become more a reality. Let’s hope…

    1. Hi Brett,

      So is Australia no longer doing spectrum auctions? One of the biggest problems in the U.S. is that Congress looks at spectrum as a cash cow. Consequently, instead of trying to decide the most efficient use of spectrum, they’re looking at the best way to maximize revenue.

  3. What do you think of pCell ? Is that not a viable technology?

    1. Kevin Fitchard Prafull Friday, May 9, 2014

      Hi Prafull,

      I haven’t had a chance to take a close look at Artemis yet beyond my initial post on its launch (link below), but I actually mentioned a lot of the technologies that go into pCell in the post. Things like phantom cells and relays in the network, the re-use of spectrum through smart antenna links. The idea behind pCell is to create a lot more capacity by using the air interfaces and spectrum we have a lot more efficiently.

      A lot of people keep asking why I didn’t mention Artemis directly, but the truth is there are a lot of companies that I’m referencing and linking to in here that I didn’t mention by name: Open Garden, M87, Bluwan, OTI’s Commotion, etc. The idea was to give an overview of what the possible technologies would be, not to get bogged down specifically in who was working on what.

      Thanks for commenting.

      https://gigaom.com/2014/02/19/steve-perlmans-new-startup-says-it-has-the-answer-to-the-mobile-capacity-crunch/

  4. Aidan Dillon Thursday, May 8, 2014

    Great article Kevin. I agree it’s about Connecting Networks so that all infrastructure is available to all users and they always get the best connection for service depending on their location, device type and current service quality of each available connection.

    1. Thanks Aidan. Now the real problem (which other commenters have also pointed out): figuring out how to manage those different connections.

  5. Great article Kevin. I think that with the increasing number of connected devices out there using different technologies to connect to one another, it only makes sense that a “sharing” protocol will need to be created.
    It’s almost like leaving half your dinner on your plate and on everyone else’s plate at the table, and at the restaurant, and so on. While someone elsewhere goes hungry (and I’m not being political here). Question is, who makes the first step?

    1. Hi Pamela,

      Good question. I bet a lot of companies are going to try, including carriers, but it might be a good opportunity for independent quasi-operators to arise. iPass kind of already does this, managing connections to multiple networks and operators for enterprises. You’d think Apple and Samsung might also get in on this. After all, it’s their connection manager clients that decide what network you connect to at any given moment. Right now it’s pretty basic (if Wi-Fi is available, then connect), but if they added a little more nuance to those clients…

  6. Sounds very similar to the pCell by Artemis.

  7. Is there a limit to the extent that technology can expand the utility of a given spectral band? Meaning, can’t 4.01 GHz be eventually utilized all the way down to, say, 4.00000000001 GHz?

  8. Griffin Ellis Sunday, May 18, 2014

    When you said “The price of mobility can’t remain $15-$20 a month for a gigabit of data…” I believe you meant Gigabyte. Gigabit is a unit of speed.
