
Summary:

At the World Radiocommunication Conference in 2015, one of the key debates will be whether to allocate more spectrum to wireless services. Yet the data informing this debate is deeply flawed, according to Tim Farrar, and overestimates demand by a factor of as much as 1,200.

Over the last two years, I’ve written several articles for Gigaom asking whether the supposed “spectrum crisis” is a myth and how much we can rely on Cisco’s VNI mobile data forecasts. However, there is still considerable support among regulators for reallocating more spectrum to terrestrial wireless services at next year’s World Radiocommunication Conference (WRC-15). Until recently, I had assumed that the last eight years of debate within the International Telecommunication Union about spectrum allocation had been based on a more thoughtful and balanced analysis of future spectrum needs.

Unfortunately it appears that the current ITU-endorsed forecasts massively overstate the demand. These forecasts indicate that, by 2020, licensed cellular services (i.e., excluding wifi) may need up to 1960 MHz of spectrum — more than double the current allocation and at least four times the amount of spectrum deployed in most countries today. Since proposals for reallocation would entail considerable disruption for broadcasters, satellite operators and government users who currently use the targeted spectrum, it is particularly worrying that there is no reliable forecast of whether additional spectrum is actually needed for terrestrial wireless services.

Off by a factor of 1200

The ITU forecasts are derived from a spreadsheet model (ironically, named the “Speculator”) that estimates the need for spectrum in 2020 based on projected traffic volumes in specific “service environments” and the amount of licensed spectrum that would be needed to accommodate that demand. The assumed need for spectrum is then based on the environment with the largest requirement, taking account of the fact that small cells and wifi hotspots will be mostly in urban areas, while in suburban areas, more users will be traveling in cars, buses or trains.

By adding up the total demand for different applications within each environment, and deducting the traffic offloaded to wifi, the model then calculates the assumed cellular data traffic per square kilometer. The Speculator model assumes that by 2020, total cellular data traffic will be 6.4 petabytes per sq km per month in suburban areas and up to 24 petabytes per sq km per month in urban areas.
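The calculation described above can be sketched in a few lines. This is a toy illustration of the Speculator-style logic only — the application categories, demand figures and offload fraction below are illustrative placeholders, not the ITU’s actual inputs:

```python
# Toy sketch of the Speculator-style calculation: sum per-application demand
# within an environment, then deduct the share offloaded to wifi.
# All input numbers are made up for illustration.

def cellular_traffic_density(app_demand_pb, wifi_offload_fraction):
    """Return licensed-cellular traffic density in PB per sq km per month."""
    total = sum(app_demand_pb.values())
    return total * (1 - wifi_offload_fraction)

# Hypothetical suburban inputs (PB per sq km per month), chosen so the
# result matches the model's 6.4 PB suburban figure quoted above.
suburban_apps = {"video": 5.0, "web": 2.0, "other": 1.0}
print(cellular_traffic_density(suburban_apps, wifi_offload_fraction=0.2))  # 6.4
```

The model then takes the environment with the largest resulting requirement as the basis for its spectrum estimate.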

To put that in context, the ITU report that presents this data also quotes a UMTS Forum forecast that total worldwide cellular traffic in 2020 will be approximately 128 exabytes, of which the US will account for approximately 20 exabytes per year. The total urban area of the US (not even counting exurbs) is approximately 235,000 sq km, so the average traffic density in US urban areas (where nearly 70 percent of the population live and three-quarters of traffic is generated) should be no more than about 5 terabytes per sq km per month in 2020. That is less than 1/1200 of the figure assumed in the ITU model for suburban areas. Even in more geographically compact countries like Japan and the UK, the ITU model overestimates potential traffic densities in 2020 by a factor of at least 200, compared to the UMTS Forum forecasts for the country as a whole.
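The back-of-the-envelope arithmetic behind that comparison is easy to check. The inputs below are the figures quoted above; the assumption that three-quarters of US traffic is generated in urban areas is the article’s own:

```python
# Sanity check: UMTS Forum forecast vs. the ITU model's assumed density.

us_traffic_eb_per_year = 20     # UMTS Forum forecast for US traffic in 2020
urban_area_sq_km = 235_000      # total US urban area (excluding exurbs)
urban_share = 0.75              # three-quarters of traffic generated in urban areas

tb_per_eb = 1_000_000
monthly_urban_tb = us_traffic_eb_per_year * tb_per_eb * urban_share / 12
density_tb = monthly_urban_tb / urban_area_sq_km
print(round(density_tb, 1))     # 5.3 TB per sq km per month

itu_suburban_tb = 6_400         # ITU model: 6.4 PB per sq km per month, in TB
print(round(itu_suburban_tb / density_tb))  # 1203 -- the ~1200x gap
```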

Surprisingly enough, this fact seems to have been completely overlooked, despite years of work and millions of dollars spent on studies to support WRC-15, the global regulatory event that will play a critical role in the future development of the wireless industry. For example, a June 2013 publication by Ofcom includes an analysis of the ITU demand model by Real Wireless that demonstrates the same problem: it shows the ITU-modeled traffic per sq km in suburban areas (including some wifi traffic) to be in excess of 10 petabytes per sq km per month, despite the fact that total traffic in the UK as a whole, under the same set of assumptions, is only 300 petabytes per month.

Reuse, reduce — don’t reallocate

Can’t we do better? Why do we have an incessant drumbeat that “we need to bring more spectrum capacity to market… and fast,” but no good analysis of how much spectrum is actually needed over the next 10 years? Of course there are uncertainties, but that is no excuse for using such a poor analysis. Looking back to WRC-07, the ITU model predicted that between 760 and 840 MHz would be needed by 2010 to accommodate expected demand. However, no country in the world has deployed networks using this much spectrum even today, and some countries make do with less than 300 MHz of licensed spectrum.

Given that considerable additional spectrum has already been allocated but not yet deployed for mobile use, perhaps a more realistic analysis would determine that we don’t need to allocate more spectrum to accommodate future traffic or that unlicensed spectrum bands will be best suited to handle expected growth? After all, when Verizon notes that two-thirds of its traffic is now carried on its 700 MHz LTE network, that means roughly 20 percent of all US mobile data traffic is being accommodated within only 22 MHz of spectrum, despite the dramatic improvements in data speeds seen in recent years. And Verizon still has substantial holdings of AWS-1 spectrum it can deploy to accommodate further traffic growth. New network architectures, such as that planned by BT for its 2.5 GHz spectrum deployment in the UK, with a small cell included in every home DSL router, also open up the possibility of massive increases in the reuse of existing spectrum allocations.
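For what it’s worth, the implicit arithmetic behind that Verizon figure is simple. The roughly 30 percent share of total US mobile data traffic attributed to Verizon below is my assumption, used to reconcile the numbers; it is not stated in the article:

```python
# Implicit arithmetic behind "roughly 20 percent of all US mobile data
# traffic ... within only 22 MHz of spectrum".

verizon_lte_share = 2 / 3   # fraction of Verizon traffic on its 700 MHz LTE network
verizon_us_share = 0.30     # ASSUMPTION: Verizon's share of all US mobile data traffic

us_traffic_on_lte = verizon_lte_share * verizon_us_share
print(round(us_traffic_on_lte, 2))  # 0.2 -- i.e. ~20 percent

lte_bandwidth_mhz = 22      # spectrum carrying that traffic
```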

We must consider all these issues and we need realistic modeling of the various alternatives, so that regulators can make an informed decision about the benefits and costs of spectrum reallocation. After all, allocating more spectrum to mobile operators means taking away that spectrum from someone else. But I’m afraid that such rational analysis is taking a backseat to perceptions that a “spectrum crisis” is still looming in the wireless industry, despite considerable evidence to the contrary.

Tim Farrar is President of Telecom, Media and Finance Associates, a consulting and research firm in Menlo Park, Calif., which specializes in technical and financial analysis across the satellite and telecom sectors.

  1. One argument for more spectrum would be to allow smaller operators such as T-Mobile to build networks that can compete with AT&T or Verizon; or else Verizon, AT&T and Sprint should be forced to sell unused/underused spectrum.

  2. And what about the development of software defined networking, dynamic spectrum sharing and cognitive radios that will use available spectrum far more efficiently through collaborative use?
