A Spectrum Crisis Still Looms

By TMCnet Special Guest | July 23, 2012

This article originally appeared in the July/August issue of INTERNET TELEPHONY 

In 2010, the FCC's National Broadband Plan predicted a "looming spectrum crisis" would begin to affect mobile broadband networks by 2013 unless the government took action to reallocate spectrum from legacy applications to commercial networks. Analysts cognizant of the growth in demand for mobile bandwidth brought about by the rise of the iPhone, and of the slow pace of advances in wireless efficiency, agreed that action was warranted.

The historical pattern for bandwidth consumption tends to follow Moore's Law, doubling every 18 months, while increases in spectrum efficiency follow Cooper's Law, doubling every 30 months. Sending a video stream from a smartphone consumes 10 times the bandwidth of a voice call, and people were converting from dumb phones and feature phones to new devices that were more like small computers than telephones. Last year, smartphones outsold PCs for the first time, and it's likely the trend will continue.
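The gap these two growth rates imply can be made concrete with a little compound-growth arithmetic. The sketch below (an illustration of the article's figures, not a measurement) compares demand doubling every 18 months against efficiency doubling every 30 months over a five-year horizon; the residual ratio is the capacity that must come from somewhere else, such as additional spectrum or smaller cells.

```python
# Compound growth: demand doubles every 18 months (Moore's Law, as cited
# above); spectrum efficiency doubles every 30 months (Cooper's Law).

def growth_factor(months: float, doubling_period: float) -> float:
    """Multiplicative growth after `months`, doubling every `doubling_period` months."""
    return 2 ** (months / doubling_period)

horizon = 5 * 12                          # five years, in months
demand = growth_factor(horizon, 18)       # ~10.1x
efficiency = growth_factor(horizon, 30)   # ~4.0x
shortfall = demand / efficiency           # ~2.5x must come from elsewhere

print(f"demand x{demand:.1f}, efficiency x{efficiency:.1f}, gap x{shortfall:.1f}")
```

Even over just five years, demand grows roughly ten-fold while efficiency only quadruples, leaving a 2.5x gap that efficiency gains alone cannot close.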

Not surprisingly, the NBP's recommendations were worrying to some. The U.S. government has assigned 200-300 MHz more of the prime two-way frequencies to its own uses than its counterparts in Europe have. While the civilian agencies are generally willing to adapt applications to alternate technologies, the military and public safety establishments generally resist change.

Television broadcasters are aware that less than 15 percent of Americans rely on over-the-air TV signals, but fear that surrendering their broadcast rights will ultimately lead to the demise of the "must-carry" rules that line their coffers with cable system revenues. Advocates of free wireless networks fear that their nirvana will be squeezed out of existence before it's born if it becomes de rigueur to view spectrum strictly as a licensed and tradable commodity.

We haven't made significant progress toward the NBP's goals – 300 MHz for mobile broadband by 2015 and an additional 200 MHz for both fixed and mobile uses by 2020 – in the ensuing two years. LightSquared's efforts to repurpose mobile satellite spectrum for terrestrial networks were squelched by the DoD, the FAA, and the GPS lobby on dubious technical grounds, taking 40 MHz off the table. Public safety demanded 20 MHz for an LTE network of its own that won't end up being substantially different from anyone else's, and efforts to reallocate federal spectrum in the 1755-1850 MHz band are likely to stretch out for 10 years and carry a huge price tag. There have been some small victories with the 1755-1780 and 2155-2180 MHz bands, but the overall trend is mixed.

Progress has been so slow that some pin their hopes solely on the promise of alternatives to spectrum politics such as opportunistic access and super-decoding radios that can share time, space, and frequency with active neighbors without experiencing packet loss. These are promising technologies that will probably be important elements of the mobile broadband networks of tomorrow, but this very claim was made 10 years ago. Contrary to Yogi Berra's opinion, predictions about the future aren't hard as long as we're relaxed about the timeline. 

We're accustomed to coding systems such as CDMA that permit multiple transmitters to employ the same frequency at the same time, but these systems require coordination. Wi-Fi employs opportunistic access, but it's not collision-free or infinitely extensible in space. Space-division multiple access and multi-user MIMO systems relax these constraints but don't eliminate them, and coordinating faint GPS signals with adjacent terrestrial signals at much higher power levels requires knowledge of the neighbor's characteristics and constraints on both sides. 
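The coordination CDMA requires can be seen in a toy example. The sketch below (a minimal illustration of code-division sharing with orthogonal Walsh codes, not any carrier's actual air interface) has two transmitters occupying the same frequency at the same time; each bit is recovered by correlating against the right spreading code, which only works because the codes were assigned in advance – that is, coordinated.

```python
# Two users share one channel via orthogonal spreading codes.
WALSH_A = [1, 1, 1, 1]      # user A's spreading code
WALSH_B = [1, -1, 1, -1]    # user B's code, orthogonal to A's

def spread(bit: int, code: list) -> list:
    """Map bit {0,1} to a {-1,+1} symbol and multiply by the spreading code."""
    symbol = 1 if bit else -1
    return [symbol * c for c in code]

def despread(signal: list, code: list) -> int:
    """Correlate the received signal with a code; the sign of the sum recovers the bit."""
    return 1 if sum(s * c for s, c in zip(signal, code)) > 0 else 0

# Both users transmit simultaneously; the channel simply adds the signals.
channel = [a + b for a, b in zip(spread(1, WALSH_A), spread(0, WALSH_B))]

print(despread(channel, WALSH_A), despread(channel, WALSH_B))  # prints: 1 0
```

Because the codes are orthogonal, each user's correlation cancels the other's signal exactly. Without that prior agreement on codes – the coordination the paragraph above describes – the sums no longer cancel and the bits are unrecoverable.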

Even if we had fully robust radio systems ready for production that didn't suffer from analog limitations, the transition problem would be immense. A quick household inventory tells me that I own no fewer than 30 devices that signal over electromagnetic spectrum without the benefit of smart radio magic, and while I'm not the typical user, there are probably two to three billion such devices in the U.S. alone. The transition from analog to digital TV took 15 years, so moving from dumb to smart decoders is easily a 30- to 50-year project.

We need to continue making progress toward better radios, smarter coding, smaller cells, unlicensed data offload, and all the other means of making radio networks more robust and efficient. Reallocating spectrum from legacy applications such as over-the-air TV and government video surveillance to general purpose commercial networks is a step in this direction. 

Commercial networks are better at sharing spectrum among diverse groups of users and applications than any of the applications that were granted special spectrum dispensations by regulators before cellular networks emerged. There's no downside to redesigning government applications and improving DTV standards in any case. Even if the spectrum crunch were the biggest myth since Santa Claus, technical progress benefits from our acting on the assumption that it's real.

Richard Bennett is a senior research fellow with the Information Technology and Innovation Foundation (www.itif.org) and one of the original designers of Wi-Fi.


Edited by Stefania Viscusi