
Summary:

Mobile operators insist we are fast approaching a mobile datapocalypse where their networks will no longer be able to meet mobile broadband demands. But are these claims of a spectrum crisis red herrings? A couple of telecom industry commentators think so, and they’re calling the carriers out.


Mobile operators like AT&T and Verizon Wireless have used many a financial call and speech to spell out the impending doom that awaits us once they use up their precious frequency resources. They insist we are fast approaching a mobile datapocalypse where their networks will no longer be able to meet the enormous demands for mobile broadband. But are these claims of a spectrum crisis all red herrings? A couple of telecom industry commentators think so, and they’re calling out the carriers, claiming they are using scare tactics to justify their recent consolidation sprees.

Data growth isn’t matching earlier predictions

On Friday, DSL Prime and Fast News Net’s Dave Burstein pointed out that AT&T’s yearly mobile data growth is only 40 percent, far lower than the 92 percent to 120 percent figures predicted by Cisco Systems, research firms and the FCC. Burstein said that operators are raising the specter of higher data growth rates to scare regulators and lawmakers into giving them more airwaves and placing fewer restrictions on how they use them:

With growth rates less than half of the predictions, a data-driven FCC and Congress [have] no reason to rush to bad policy. Wireless technology is rapidly moving to sharing spectrum, whether in-building small cells, WiFi, White Spaces, Shared RAN or tools [that] engineers are calling hetnets – heterogeneous networks. The last thing policymakers should do is tie up more spectrum for exclusive use; shared spectrum often yields three to ten times as much capacity.
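To see why the gap between those growth rates matters so much for policy, it helps to compound them for a few years. Here is a minimal sketch in Python; the base traffic level is an arbitrary unit, and only the 40 percent and roughly 100 percent rates come from the figures above:

```python
# Compound AT&T's reported 40% annual growth against the ~100% rate
# the forecasters predicted. The base traffic level is an arbitrary
# unit, not a real measurement.
base = 1.0

for year in range(1, 6):
    observed = base * 1.40 ** year    # 40% annual growth
    predicted = base * 2.00 ** year   # ~100% annual growth
    print(f"Year {year}: observed {observed:.1f}x, "
          f"predicted {predicted:.1f}x, gap {predicted / observed:.1f}x")
```

After five years the predicted network is carrying nearly six times the traffic of the observed one – the difference between a looming crisis and a manageable engineering problem.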

DSLReports’ Karl Bode has been questioning the so-called ‘spectrum crisis’ for years, and riffing off Burstein’s post, he pulled no punches:

Anybody who warns of an unavoidable capacity crisis on wireline or wireless networks is lying in order to sell you something. That may be a blunt assessment to some, but it’s the only conclusion you can draw as we see time and time again that claims about a looming network apocalypse (remember the Exaflood?) violently overestimate future traffic loads and underestimate the ingenuity of modern network engineers. Fear sells. Drink orange juice or you’ll die of cancer. Get more insurance or you’re a bad family man. Vote for me or lose your job and see your grandma deported. Pay $2.50 per gigabyte or face Internet brown outs. Be afraid.

… As usual though, actually bothering to listen to and look at the data tells a different story. Nobody argues that spectrum is infinite, but buried below industry histrionics is data noting that there really isn’t a spectrum crisis as much as a bunch of lazy and gigantic spectrum squatters, hoarding public-owned assets to limit competition, while skimping on network investment to appease short-sighted investors. Insiders at the FCC quietly lamented that the very idea of a spectrum crisis was manufactured for the convenience of government and industry.

Underlying Burstein and Bode’s criticism is the fact that operators have been sitting on unused airwaves for years while simultaneously asking the government for more. AT&T and Verizon Wireless bought their AWS spectrum in 2006 and have yet to deploy a single commercial cell site on it, though T-Mobile, MetroPCS and Leap Wireless have launched multiple networks over the same airwaves. Sprint and Clearwire are sitting on a goldmine of 2.5 GHz frequencies, which remain unused in two-thirds of the country. Even in the markets where they have launched WiMAX, some 100 MHz of their spectrum still lies fallow.

So is the spectrum crisis a myth? Are the operators simply playing on fear to gobble up more of the public airwaves? I agree with Burstein and Bode in principle, but I don’t think the capacity crunch is pure fiction either.

Data growth is slowing but – hopefully – not for long

AT&T’s growth rate has slowed down, but that’s because a good deal of its network is already full, thanks to its being the first carrier to land the iPhone. Between 2007 and 2010, AT&T saw a 5,000 percent increase in mobile data traffic, and it now has the highest smartphone penetration rate in the industry at 56.8 percent. Because AT&T already has such a large base of mobile data users, its data growth rate was bound to slow down as long as smartphones remain the primary mobile data driver.
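For context, the arithmetic implied by that 5,000 percent figure is worth spelling out. Treating 2007 to 2010 as three full years of growth (a simplifying assumption), a short sketch derives the annual rate AT&T was absorbing:

```python
# Derive the compound annual growth rate implied by a 5,000% total
# increase over three years (2007-2010, treated as three full years).
increase_pct = 5000
years = 3

total_multiple = 1 + increase_pct / 100  # a 5,000% increase = 51x overall
cagr = total_multiple ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")  # roughly 271% per year
```

Against a baseline of roughly 271 percent per year, even healthy double-digit growth looks like a slowdown.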

But what happens when consumers finally find a reason to link their tablets to the 3G or 4G network? What happens when cars become connected? With each new wave of mobile data devices – each drawing more capacity than its predecessor – the cycle will repeat itself, launching huge surges in mobile traffic as early adopters give way to the awesome force of the mass market.

Bode and Burstein are right that a 40 percent yearly mobile data growth rate is much more manageable than a 100 percent growth rate. Even if new mobile networks like LTE and improvements to HSPA+ can’t take that new demand in stride, better use of unlicensed spectrum and Wi-Fi can easily make up the difference. Focusing solely on licensed airwaves is the wrong way to address the capacity crunch, as my colleague Stacey Higginbotham wrote last week when refuting a particularly fear-mongering column in The Washington Post.

But I don’t think we can count on the 40 percent growth AT&T is now experiencing to remain static, unless we maintain a very pessimistic outlook about the future of mobile broadband. Perhaps the operators should be providing us with better reasons – e.g. more sensible pricing plans – to adopt the connected tablet as readily as we’ve adopted the smartphone. If the operators can’t provide us with a compelling vision for a world of connected devices today, there’s little incentive for us to believe their predictions of a world facing a spectrum crisis in the future.

Image courtesy of stock.xchng user dimitri_c

Comments

  1. Predictions of future traffic growth have been abysmal over the years, and today is no exception. There is the Cisco VNI report at a 100% CAGR, resulting in backhaul needs of 50 Mbps per site. There is also the NPRG “Traffic Deluge” analysis, which predicts 10X the Cisco figure, and an Ericsson report which predicts far less than either. NPRG concluded that two-thirds of US cell sites will need upwards of 500 Mbps of backhaul by 2015 – 10X the Cisco report.

    The problem with modern mobile broadband vis-a-vis 10 years ago is the gigantic channel bandwidths (10 MHz and heading toward 40 MHz) required to support LTE Rel. 10 and/or 802.11ac (there will be WiFi in most ODU picocells). The issue is not so much mobile traffic growth as CMRS traffic growth vs. WiFi growth – and how the industry will backhaul the deluge of small cells that will begin raining down in 2013.

    Carriers have always overstated their spectrum needs to make a point. I testified in the US Senate on spectrum policy 10 years ago (http://markkelleyonline.com/whitepapers/testimony.pdf) and the strategy remains the same.

    1. Hi Mark,

      That’s one of the things that’s always puzzled me about the relentless quest for faster speeds the operators have been on. Considering what a megabyte of mobile data costs today, pushing toward 50 Mb/s to 100 Mb/s connections seems a bit silly. I understand there are efficiencies to having fatter pipes, but if you offer developers and consumers faster speeds they’ll only fill them – and then complain about the cost of their data bills.

      1. With broadband, “performance” is equated with speed and hence capacity – right? To put it another way, “quality = quantity,” assuming you have decent coverage, which most people seem to. Blame it on WiMAX: it seems carriers and 3GPP had to rush Rel. 8 by a year or more lest WiMAX gain more of a foothold in the 10+ Mbps market. The challenge carriers have is to deliver 10X the data speeds of 3G without much (or any) change in network OPEX – 10 or even 100X the backhaul by 2015 for the same cost as a few T1s last year.

      2. Hi Mark, the funny thing is WiMAX, at least on a 10 MHz carrier, doesn’t come close to delivering it. The promise of WiMAX was to deliver broadband capacity cheaply (at least that’s how I looked at it). Now that’s all lost.

  2. Kevin,

    With all due respect to Bode and Burstein, I say the facts support the hypothesis that the carriers aren’t lying:

    – Why would the carriers spend $19B at the LTE auctions if they were not constrained?

    – Why would the carriers be investing constantly to upgrade their networks (AT&T spends about $20B per year) if there were no capacity crunch?

    – Why would US carriers be among the first in the world to deploy LTE if there were plenty of capacity on cheaper, already amortized 3G networks?

    – AWS is oddball spectrum, and as you know is the cause of some of the problems T-Mobile faces. 2.5 GHz is poor-range spectrum, and Sprint has tried to use it but hasn’t had market success with Clear or WiMAX.

    – Sprint, which you mention has plenty of 2.5 GHz, is also quite constrained on its Nextel 800 band, which it agreed to vacate years ago in the spectrum swap because of first-responder interference. The scarcity of 800 MHz spectrum is what drove the bad-quality iDEN service that churned its customers away and gave its brand a massive black eye. So Sprint knows well the relationship between spectrum constraints and bad service. And still, it is not a carrier that complains loudly for more spectrum.

    Are the carriers making billions in investments just for show? I understand that they could be bidding on spectrum just to hoard it, but that doesn’t appear to be the case with the most recent 700MHz auction. Beyond hoarding, I see billions of dollars spent on building capacity, and that isn’t consistent with the claim that they’re faking.

    Next argument: of course AT&T has had a drop in data growth rates. But what else happened in the time period discussed? Wassat you say? A usage cap and tiered services? Well, doesn’t that go a long way toward explaining the mitigated growth? Subscribers got put on the meter; of course growth was affected.

    The truth is, cellular bandwidth demand is still massive and growing fast. New users are buying smartphones at a dizzying rate. But there is a list of mitigation strategies, some of which have already been implemented, which work to handle the tide.

    – tiered services and caps
    – wifi offload
    – smaller cells
    – better modulation, e.g. LTE, LTE-Advanced
    – more spectrum
    – infrastructure sharing
    – DAS, picocells, femtocells
    – less signaling chatter in devices
    – incentives for developers to produce more data-efficient apps

    I’m sure I missed some there. But the fact of the matter is that the data tsunami was spotted by 2008, as AT&T was getting killed by iPhone users, and all carriers started buttressing their defenses against the wave. The tsunami wasn’t overstated, and it didn’t go away. It is just being partially managed.

    What would have happened if the Army Corps of Engineers had been funded and given time to adequately improve the levees around New Orleans before August 2005? Would we say that the Katrina storm surge wasn’t so bad after all? Some might. But in fact… we just would have been better prepared.

    1. Hi Derek, good argument. We should have run it as a refutation post :)

      I would say, though, that while you make a lot of good points for a capacity crunch on operators’ current networks, you haven’t necessarily made the case for a spectrum crisis.

      I agree operators will need more spectrum in the future. But until operators actually launch networks on their own unused spectrum, it’s hard to take them seriously when they claim their spectrum problems are immediate, which is exactly what AT&T appears to be claiming (though Sprint and Verizon not so much).

  3. The issue is maybe not the lack of frequency, but the lack of good frequency (lower bands) and a proper spectrum allocation, which is very messy in the US compared to everywhere else in the world. The FCC has made frequency allocation in the US unique, and that affects all business cases for operators. Theoretically, we can still handle even a 100% growth rate, but financially it will be impossible if we don’t move quickly and develop commercial solutions. The reason many operators in the world are sitting on “gold mines” is that 2.5, 2.3 and 3.5 GHz – and pretty much everything above 2.1 GHz – do not have a positive business case. Propagation is very poor, which means the cost of network implementation is huge. With ARPUs going down all around the world, it will not be feasible to deploy certain frequencies even if the licenses are issued for free.

    Then there is a very complex problem with spectrum allocation and ecosystem maturity that needs to be taken into consideration. LTE has been deployed in different frequencies, but now we see some consolidation in the EU 800, 1800 and 700 bands.

    Additionally, the spectral efficiency that was increasing exponentially over the last 15 years is not increasing anymore. The spectral efficiency of LTE compared to HSPA+ is very similar, and we have reached a theoretical ceiling. It’s not impossible to increase efficiency, but it will just be harder and more expensive (256QAM, 4x4 MIMO).
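    That ceiling can be roughed out with the Shannon bound, C = B · log2(1 + SNR), which caps how many bits each hertz can carry at a given signal quality. A minimal sketch, with illustrative SNR values rather than measurements from any real network:

    ```python
    import math

    # Shannon bound on spectral efficiency. Each extra 10 dB of SNR buys
    # only ~3.3 bits/s/Hz, which is why denser modulation (256QAM tops
    # out at 8 bits/symbol) delivers harder-won, more expensive gains.
    for snr_db in (0, 10, 20, 30, 40):
        snr = 10 ** (snr_db / 10)  # convert dB to a linear ratio
        print(f"SNR {snr_db:>2} dB -> {math.log2(1 + snr):.1f} bits/s/Hz")
    ```

    Once links run close to that bound, the remaining levers are more antennas (MIMO) and more hertz – exactly why further gains get harder and more expensive.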

    Then, even though I see how AT&T could be using this to support its acquisitions in a convenient way, that doesn’t change the fact that there is a spectrum issue we need to worry about.

    1. Hi Omar,

      I think we agree for the most part, but I also tend to be skeptical of the beachfront spectrum argument. We’re going to need to use every airwave we can find to fuel the mobile broadband revolution, and they can’t all be 700 MHz or PCS. You’re right that 2.5 GHz is hardly ideal for a launch outside of dense urban areas, but Sprint and Clearwire aren’t even launching WiMAX or TD-LTE networks in urban areas anymore, where smaller cell radii don’t work against them.

      If the higher frequencies are really that bad, then maybe we should scrap the auction process and broadband plan and just make everything above PCS unlicensed. Wi-Fi and other technologies could make far better use of it then. That, however, would confine operators to smaller swathes of airwaves, making them coverage providers, not the key capacity muscle of the network. I doubt they would go for that idea.

      1. Hi Kevin,

        I understand your view. Somehow I try to merge the technical and the commercial views I have on this subject. You’re right: in the example you mention, Clearwire and Sprint could try to implement in urban areas, given that the inter-site distance is shorter. However, with 2.5 GHz you still need many more sites than with 700 MHz or PCS, so the problem in rural areas is replicated, just at a different scale. Also, consider that the initial network planning done in most cities was based on lower frequencies (800-900 MHz), when AMPS was the leading technology. The implementation of additional higher frequencies has been done on top, and coverage gaps appear; some are patched, some are still there. Going to higher frequencies always means you need to invest in additional sites to cover the holes. Then again, it’s more expensive and harder to make a business case for, and as an operator you lose competitiveness.
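        That site-count penalty can be roughed out with the free-space path loss formula, FSPL(dB) = 20 · log10(d) + 20 · log10(f) + 32.44, with d in km and f in MHz. A minimal sketch – and since real-world propagation is worse than free space, it understates the gap:

        ```python
        import math

        # Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
        def fspl_db(d_km, f_mhz):
            return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

        low, high = 700, 2500  # 700 MHz vs. 2.5 GHz, in MHz
        extra = fspl_db(1.0, high) - fspl_db(1.0, low)
        print(f"Extra loss at 2.5 GHz: {extra:.1f} dB")  # ~11 dB at any distance

        # For an equal link budget the cell radius shrinks by low/high, so the
        # sites needed to cover the same area grow with the square of the ratio.
        print(f"Site multiplier vs. 700 MHz: {(high / low) ** 2:.1f}x")  # ~12.8x
        ```

        Even this idealized estimate puts a 2.5 GHz build at roughly an order of magnitude more sites than 700 MHz for the same coverage footprint.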

        Higher frequencies are required for traffic offload, and they will be used for handling the massive increases in traffic. However, my point is that in order to implement a mobile broadband (MBB) network, you can’t rely only on higher frequency bands, as it won’t be cost effective. You need a proper spectrum allocation that combines lower frequencies for coverage with higher frequencies for traffic handling. Operators trying to implement an MBB network based on higher frequencies only won’t be able to compete.

        Heterogeneous networks (HetNets) tackle this problem by trying to make this mix of frequencies and technologies transparent to the user (at least that’s the objective). However, this multi-technology, multi-band ecosystem isn’t mature yet IMO, and we will still struggle for a few years deciding which are the best bands for LTE, what the best mix of technologies in each band is, and which chipsets with certain radio frequency combinations will be cheaper to produce. That’s the reason I think we should be worried about it and focus on solving it.

  4. Oliver Maxwell Tuesday, January 31, 2012

    Personally, I think the telecoms are just trying to buy up all the white spaces so they’re the gatekeepers of data.

  5. Funny thing about all the comments so far, and Kevin’s replies…I agree with them all. Can’t say that very often!

    This is a complex discussion with a lot of moving pieces, and I’m under the impression that we all seem to ‘get it’. I think Bode and Burstein mostly get it too, but they just bear a little (possibly deserved) anti-carrier vitriol.

    In particular, I like Omar’s comment: “…even though I see how AT&T could be using this to support its acquisitions in a convenient way, that doesn’t change the fact that there is a spectrum issue we need to worry about.”

    Karl Bode got more right than wrong; he just lost me with this: “buried below industry histrionics is data noting that there really isn’t a spectrum crisis as much as a bunch of lazy and gigantic spectrum squatters, hoarding public-owned assets to limit competition, while skimping on network investment to appease short-sighted investors.”

    Partly true, for sure. There is strategic (monopolistic) advantage in hoarding spectrum. But I think there IS a crisis – one that can be addressed and mitigated. And I certainly don’t think there has been “skimping on network investment” in the 2007-2012 time frame. If you want to discuss skimping on network investment in the USA, let’s talk about the 1999-2006 time frame.
