It’s time for a rational perspective on Wi-Fi

Wi-Fi has so dazzled us with its achievements that many people can’t see its fundamental limitations. Unless network planners and policymakers grasp those limitations, they are likely to reach misguided conclusions about the optimal role of Wi-Fi in our mobile-broadband fabric.

Wi-Fi’s achievements are many: global adoption of standards such as IEEE 802.11n and 802.11ac, extremely high throughput, low cost, and availability in many public areas. But the two core attributes that empower Wi-Fi are also at the heart of its fundamental limitations: short range and use of unlicensed frequencies.

I am not opposed to Wi-Fi. My view of the network of the future—a network that will provide enormous capacity and make wireless a viable competitive broadband alternative for many—is that it balances use of licensed spectrum and unlicensed spectrum. Neither is sufficient alone.

Wi-Fi’s limitations

The case for a Wi-Fi-only world rests on the false notions that existing wireless broadband providers are less innovative than others in the internet ecosystem and that networks can grow organically, as Comcast suggested in its recent pleadings to acquire Time Warner. The theory is that if government were to give innovators sufficient unlicensed spectrum, a global Wi-Fi network, available everywhere and built by hundreds or even thousands of entities, would materialize, much as the internet itself did.

This vision is tantalizing and almost appears to be coming to life, with millions of public hotspots around the world and new technologies like Hotspot 2.0 facilitating roaming arrangements. But seeing the vision isn’t the same as fulfilling it.

Because unlicensed operation is restricted to low power and therefore short range, any Wi-Fi network, no matter how many hotspots are deployed, will leave massive coverage gaps. For example, compare the Cable Wi-Fi coverage map with a cellular one: cellular has at least a 100-to-1 coverage advantage. Users want to stream content to their smartphones, but they also want their phones to work no matter where they are, and big gaps in Wi-Fi coverage make that impossible.

Photo from Thinkstock/Tomasz Wyszoamirski

Using the rough approximation that a national footprint requires covering half of total geography, and assuming a generous 100-meter Wi-Fi operating radius, an operator would need to deploy over 150 million access points to cover the United States—an economic and logistical impossibility. History has not been kind to networks with partial coverage. Companies providing service over Cellular Digital Packet Data (CDPD) and Metricom’s Ricochet failed to attract many users to their limited-coverage footprints, despite state-of-the-art technology.
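That 150-million figure follows from simple geometry, and the arithmetic is easy to check. The sketch below uses the article’s own rough assumptions (half of total U.S. geography, a 100-meter radius per access point); the area figure is an approximation, not a precise engineering input:

```python
import math

# Rough assumptions stated in the article (not precise engineering figures):
US_AREA_KM2 = 9_800_000      # approximate total area of the United States
COVERAGE_FRACTION = 0.5      # "national footprint" ~ half of total geography
WIFI_RADIUS_KM = 0.1         # generous 100-meter Wi-Fi operating radius

# Area covered by one access point, treated as an ideal circle with no overlap
cell_area_km2 = math.pi * WIFI_RADIUS_KM ** 2   # ~0.031 km^2 per access point

access_points = (US_AREA_KM2 * COVERAGE_FRACTION) / cell_area_km2
print(f"~{access_points / 1e6:.0f} million access points")
```

With these inputs the count comes out to roughly 156 million, consistent with the article’s “over 150 million.” Note that real deployments would need overlapping cells to avoid dead zones, which pushes the number higher still.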

Only after wireless data matched voice coverage, and only after that coverage extended to almost all of the population, did American consumers embrace mobile data.

As serious as the concerns over coverage are the problems inherent to unlicensed frequencies: interference and congestion. Connecting to the internet via Wi-Fi at hotels and airports, for instance, has become a hit-or-miss proposition. It sometimes works, but more often it’s slow or unavailable due to the escalating number of people using these networks.

We need highways and local roads

Increasing the size of unlicensed cells is not the answer, as I’ve explained previously. Making cells larger by allowing unlicensed technologies, whether Wi-Fi or white space, to operate at higher power just makes interference issues worse because the expanded footprint covers so many more potential interferers.

As Wi-Fi continues to be deployed, the goal of dependable access from anywhere will remain elusive. Cable companies have deployed a considerable amount of public Wi-Fi, but their coverage remains incomplete. And the business purpose of those networks isn’t broadband everywhere, but stickiness to retain the lucrative cable subscriber.

Photo by Thinkstock/wx-bradwang

A truly ubiquitous, fast mobile broadband network needs both licensed and unlicensed spectrum. Licensed spectrum gives operators manageability and predictability, which lets them safely invest, in a top-down fashion, the tens of billions of dollars in infrastructure that broad coverage requires. Given the volume of traffic carried on these networks—traffic that can’t be off-loaded—cellular networks will need continually greater capacity.

Meanwhile, unlicensed spectrum gives millions of entities the flexibility to invest in a bottom-up manner to provide localized high capacity. The two approaches are symbiotic, and that interdependence shows no sign of changing. Both will benefit from technology advances, and both will need more spectrum over time.

One can draw an analogy with highways. Our LTE networks are like well-planned freeways that use dedicated land and provide broad transportation coverage. Wi-Fi is like the mishmash of all other roads, providing great local access but not serving as a viable substitute for freeways.

Before long, users won’t even know what type of network they’re connecting to, but their super-high-speed, always-available experience will depend on networks that use both licensed and unlicensed frequencies.

Peter Rysavy is a wireless technology analyst and president of Rysavy Research. He’ll be chairing a workshop on wireless connectivity management and service enablement on June 17 in California.