It’s hard to squeeze a size-12 foot into a size-10 shoe. Yet, the two agencies that manage radio frequency (RF) spectrum in the United States have been ordered to do just that: fit more users into a space that currently cannot accommodate them. How they will stretch the finite radio spectrum presents challenges and opportunities for radio manufacturers.
The agencies are the Federal Communications Commission (FCC), which allocates and manages spectrum used by commercial parties, and the National Telecommunications and Information Administration (NTIA), which handles federal spectrum use. Changes are already under way. FCC is cranking out notices of proposed rulemaking (NPRMs), and NTIA, as part of a federal task force, will issue sweeping recommendations this summer.
These activities bear directly on aviation’s ability to communicate by radio and to operate radar. New policies will profoundly change how spectrum is allocated and used, and they will determine which equipment the industry can use.
One of the most disconcerting proposals would raise the noise floor throughout the spectrum. This would open up spectrum for more users but would also increase the amount of interference allowed within the frequencies.
“Allowing the noise floor to increase could have an impact on all existing radio systems, cell phones or whatever, because these systems’ designs were based upon a particular noise figure,” says Kris Hutchison, ARINC’s senior director of frequency management. “The biggest concern we have is the possibility that an increase in the noise floor will force the aviation industry to replace existing equipment with equipment that has reduced sensitivity.”
“While this equipment would be less sensitive to interference, it would also be less able to pick up weak signals or signals that are far away,” Hutchison adds. “A less-sensitive radio would have a reduced coverage area and therefore less range than we require. You may be able to resolve this problem with digital signal processing, but again, this would represent an enormous investment in replacing existing equipment.”
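A back-of-the-envelope link budget shows the stakes. The sketch below uses illustrative assumptions (a 25-kHz VHF voice channel at 118 MHz, a 10-watt, or 40-dBm, transmitter, a 6-dB noise figure and a 12-dB required signal-to-noise ratio), not the specifications of any certified avionics radio. It computes receiver sensitivity from the noise floor, then the ideal free-space range:

```python
import math

def receiver_sensitivity_dbm(noise_floor_dbm_hz, bandwidth_hz,
                             noise_figure_db, required_snr_db):
    """Minimum usable signal: in-channel noise power plus receiver margins."""
    noise_power_dbm = noise_floor_dbm_hz + 10 * math.log10(bandwidth_hz)
    return noise_power_dbm + noise_figure_db + required_snr_db

def free_space_range_km(tx_power_dbm, sensitivity_dbm, freq_mhz):
    """Invert the free-space path-loss formula:
    FSPL(dB) = 32.44 + 20*log10(f_MHz) + 20*log10(d_km)."""
    max_path_loss_db = tx_power_dbm - sensitivity_dbm
    return 10 ** ((max_path_loss_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

# Thermal noise floor (-174 dBm/Hz) versus a hypothetical 6-dB rise.
for floor in (-174.0, -168.0):
    sens = receiver_sensitivity_dbm(floor, 25e3,
                                    noise_figure_db=6, required_snr_db=12)
    print(f"floor {floor} dBm/Hz -> sensitivity {sens:.1f} dBm, "
          f"ideal range {free_space_range_km(40, sens, 118):.0f} km")
```

Under these assumed numbers, a 6-dB rise in the noise floor roughly halves the achievable free-space range. Real VHF coverage is line-of-sight limited, so the ratio matters more than the absolute figures: every added decibel of noise comes straight out of range or must be bought back with new equipment.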
Raising the noise floor is but one of many revolutionary proposals FCC and NTIA are studying as they seek to stretch spectrum capacity. What’s more, they seem to be in a hurry to do it.
Third-Generation Systems
The rush to revamp U.S. spectrum management was jump-started in 2000, as the world prepared for third-generation (3G) telecommunication systems. Defined by the International Telecommunication Union, 3G systems enable “a convergence of fixed, mobile, voice, data, Internet and multimedia services.” This allows seamless global roaming and vastly faster transmission. Spectrum between 400 MHz and 3 GHz was deemed best for 3G systems worldwide.
Many countries easily allocated spectrum for 3G systems, but the United States could not: the candidate bands were already allocated to other users.
What’s worse, even more spectrum-using products were emerging, particularly Part 15 devices that are allowed unlicensed use of spectrum. Part 15 gadgets include cordless phones, baby monitors, remote controllers, wireless security and video systems, keyless locks, medical sensors, and much more. Also under Part 15 are technologies that are enabling wireless local area networks (LANs), such as Bluetooth and Wi-Fi.
Another new spectrum user is ultra wideband (UWB). It is used for ground-penetrating radar and similar radar that sees through walls or other obstacles. The newest Part 15 candidate is broadband over power lines (BPL), which uses the electrical grid to deliver broadband services—potentially to every home that has electricity. Incumbent spectrum users are concerned about BPL interference. Nonetheless, FCC issued an NPRM on BPL in February.
Finding spectrum for new services is becoming impossible. It took three years for FCC and NTIA to eke out 90 MHz of spectrum for 3G, and many say this isn’t nearly enough. To accomplish even that, NTIA agreed to relocate non-military federal users out of the 1710-1755-MHz band. The users will relocate to “comparable” frequencies that FCC will transfer to NTIA control, or they will move into other federal bands and share the frequencies.
But as of April NTIA was still waiting for FCC to compensate federal users for the cost of relocating and to identify the frequencies to which the federal government systems can relocate. Even so, the new 3G space isn’t totally available, because some Department of Defense (DoD) systems cannot yet vacate the frequencies.
As to relocating spectrum incumbents to new bands, FCC and NTIA have the authority, but relocation is limited by whether the new spectrum’s propagation characteristics are compatible with the user’s technology. Reallocating spectrum also is restricted in cases involving international treaties or agreements.
NTIA has given up federal spectrum to FCC but, with the never-ending demand, there is a limit here, too: federal agencies also want more spectrum, particularly DoD and the Department of Homeland Security.
Spectrum Management Review
Yet FCC and NTIA must make room for more spectrum users, and task forces have been formed for the assignment.
In June 2002 FCC Chairman Michael Powell established the Spectrum Policy Task Force (SPTF), which issued a final report in November 2002 (see www.fcc.gov/sptf/reports.html). Its recommendations would dramatically change spectrum allocation and management and put the onus on technology to allow more spectrum users.
Although FCC doesn’t have jurisdiction over federal spectrum use, it’s important to read the report because NTIA is likely to follow the task force’s recommendations as part of its role in President Bush’s Spectrum Policy Initiative. Launched June 2003, the initiative ordered NTIA’s parent, the Department of Commerce, to chair and direct the work of a Federal Government Spectrum Task Force. It was ordered to “undertake a comprehensive review of spectrum management policies” and “identify recommendations for revising policies and procedures to promote more efficient and beneficial use of spectrum without harmful interference to critical incumbent users.”
The task force has conducted several meetings, including a public forum in December 2003 on spectrum efficiency and new technology. In June 2004 the task force was supposed to deliver a detailed report containing recommendations. If that occurred, hunt down a copy. If the report isn’t available, FCC’s task force recommendations shed much light.
Command and Control
One overarching recommendation in the SPTF report is for FCC to become far more flexible in allowing spectrum access. SPTF recommended expanding the current “command and control” model of spectrum management to include an “exclusive-use” model and a “commons” model.
For command and control, FCC maintains strict regulatory control on who uses spectrum and how they use it. Many feel this is inefficient spectrum management. Although the model creates dedicated, protected spectrum bands, it also prevents band use by services that could share the band without interfering with the licensee. Many believe it also inhibits technological progress because licensees must adhere to the terms under which they were granted licenses and must go through a protracted petition process to make changes that would improve their products’ performance or spectrum efficiency. (A licensee is a party that has been granted exclusive use of a frequency band: for example, a cellular phone company that has the right to use a specific band over a specific geographic region. FCC originally granted licenses after considering petitions from spectrum applicants, comparative analyses and lotteries, but it has auctioned licenses since 1994.)
SPTF recommends using command and control only when “regulation is necessary to accomplish important public interest objectives or to conform to treaty obligations,” or when required to avoid the failure of a particular market. Command and control would be retained for public safety, broadcasting, international satellite and radio astronomy spectrum. But for most other cases, SPTF recommends that FCC “eschew command and control regulation, and legacy command and control bands should be transitioned to more flexible rules and uses to the maximum extent possible.”
Exclusive Use
The exclusive-use model would grant licensees “exclusive and transferable flexible-use rights for specified spectrum within a defined geographic area, with flexible-use rights that are governed primarily by technical rules to protect spectrum users against interference.” The licensee could more easily improve its technology and more efficiently use its spectrum.
The licensee could also “sub-license” portions of its spectrum to services that don’t interfere with the licensee’s service. In theory, this “secondary market” approach would more efficiently use spectrum because it grants access to new spectrum users, and they would be using “idle spectrum.”
The concept of “idle spectrum” is prevalent in the SPTF report. Through preliminary studies, the SPTF determined that while little room exists to accommodate more licensed spectrum users, a great deal of spectrum capacity (i.e., white space) is available. Licensees are not using all of their spectrum all the time. For example, police may use their frequencies heavily during emergencies or shift changes but sparsely otherwise. Remote areas also have idle spectrum that can be otherwise used. These discoveries have generated an eagerness to find ways to tap into this rich vein of idle spectrum—such as frequency sharing.
Sharing Frequency
Frequency sharing is also the basis for the “commons” model of spectrum management. It would allow “unlimited numbers of unlicensed users to share frequencies with usage rights that are governed by technical standards or etiquettes but with no protection from interference.”
Also proposed as part of the commons model is creation of “underlay” rights for low-power, low-impact applications that would operate in command and control and flexible-use spectrum. This would be allowed if the applications could operate below “an established interference temperature threshold.”
And what’s that? It’s a concept contained within a recommendation to develop a “more quantitative approach to interference management.” This approach “establishes the maximum permissible levels of interference, thus characterizing the ‘worst case’ environment in which a receiver would be expected to operate.”
In other words, FCC would be raising the noise floor. Not arbitrarily, but in a way that is supposed to reflect real-time RF operations. Different threshold levels would be set for each band, geographic region or service. To determine these levels, the SPTF recommends a systematic study of the RF noise floor.
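The “temperature” in the name comes from expressing interference power the way engineers express thermal noise. In the usual formulation (the notation below follows common engineering convention rather than any final FCC rule text), the interference temperature over a bandwidth $B$ centered at frequency $f_c$ is

$$T_I(f_c, B) = \frac{P_I(f_c, B)}{k\,B}$$

where $P_I$ is the average interference power in watts measured in that band and $k \approx 1.38 \times 10^{-23}$ joules per kelvin is Boltzmann’s constant. A band’s threshold would be a cap on $T_I$: the hotter the measured “temperature,” the less headroom remains for additional transmitters.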
Technology as Savior
To successfully introduce interference temperature thresholds, exclusive-use and commons models, frequency sharing and other concepts to squeeze more users into the spectrum, SPTF looks to technology.
Technology produced efficiencies in signal processing—switching from analog to digital—a lesson not lost on FAA, which is transitioning to a digital VHF communication system that offers voice and data link capabilities. The system (Nexcom) also will use software-defined radios (SDRs), another new technology that is helping stretch spectrum capacity.
Employing reprogrammable signal processors, SDRs have enabled flexibility and interoperability among communication systems that operate in multiple frequency bands and under different standards. Software-defined radios have emerged during the past few years. Several products are on the market (visit www.vanu.com). In September 2001, FCC adopted new rules to facilitate SDR use.
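The core idea is easy to sketch. In the toy illustration below (plain Python, not any vendor’s API), the hardware digitizes a slice of spectrum into complex baseband samples, and everything after that, demodulation included, is a software function that can be swapped at runtime:

```python
import numpy as np

def demod_am(iq: np.ndarray) -> np.ndarray:
    """Envelope detection: the amplitude of the complex baseband signal."""
    return np.abs(iq)

def demod_fm(iq: np.ndarray) -> np.ndarray:
    """Frequency discrimination: phase change between successive samples."""
    return np.angle(iq[1:] * np.conj(iq[:-1]))

WAVEFORMS = {"AM": demod_am, "FM": demod_fm}

def receive(iq_samples: np.ndarray, mode: str) -> np.ndarray:
    """Same hardware, a different radio 'personality' chosen in software."""
    return WAVEFORMS[mode](iq_samples)
```

Re-tuning such a radio to a new band or a new standard becomes a software update rather than a hardware replacement, which is what makes SDRs attractive for stretching existing spectrum.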
In its NPRM for SDRs, FCC solicited comments about whether SDRs could incorporate “intelligence” to monitor the spectrum to detect usage by other parties and then transmit on open frequencies. This foreshadowed FCC’s current interest in “cognitive radios,” which are eyed as enabling technology for FCC’s “gee-whiz” proposals mentioned above. FCC issued an NPRM for cognitive radios in December.
The NPRM says cognitive radios “can make possible more intensive and efficient spectrum use by licensees within their own networks, and by spectrum users sharing spectrum access on a negotiated or an opportunistic basis. These technologies include, among other things, the ability of devices to determine their location, sense spectrum use by neighboring devices, change frequency, adjust output power, and even alter transmission parameters and characteristics. Cognitive radio technologies open spectrum for use in space, time and frequency dimensions that until now have been unavailable.”
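The simplest form of that sensing is energy detection: digitize a band, split it into channels and flag the quiet ones. The sketch below is a minimal illustration of the principle, with an assumed channelization and threshold; a practical cognitive radio would need much more, such as averaging over time and guarding against the “hidden node” whose transmissions it cannot hear:

```python
import numpy as np

def find_open_channels(iq, n_channels, threshold_db):
    """Energy detection: report channels whose average power sits below a
    decision threshold, i.e., candidate 'white space' to transmit in."""
    spectrum = np.fft.fftshift(np.fft.fft(iq))
    power_db = 10 * np.log10(np.abs(spectrum) ** 2 + 1e-12)
    channels = np.array_split(power_db, n_channels)
    return [i for i, ch in enumerate(channels) if ch.mean() < threshold_db]
```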
Smarter Receivers
FCC also wants receivers to incorporate interference immunity performance standards and is studying responses to a Notice of Inquiry (NOI) issued in March 2003. As Chairman Powell stated at an October 2002 conference: “Interference is often more a product of receivers; that is, receivers are too dumb or too sensitive or too cheap to filter out unwanted signals. Yet, our decades-old rules have generally ignored receivers.”
FCC clearly aims to correct that neglect. Its NOI states: “If the receivers used in connection with a radio service are designed to provide a certain immunity or tolerance of undesired RF energy and signals, more efficient and predictable use of the spectrum resource can be achieved, as well as greater opportunities for access to the spectrum.
“In recent years, the preemptive effect of minimally performing receivers has been demonstrated, as licensees seek protection for service predicated on the performance of receivers with little tolerance for other signals. Had the expected performance characteristics of those receivers been defined in some way, these services could have been developed with receivers that could better tolerate the introduction of newer services on the same or proximate frequencies.”
But what about legacy receivers? There is little information in the SPTF report on this subject, except for a one-sentence recommendation that FCC should “issue a notice of inquiry to characterize current and future receiver environments and to explore issues…such as, performance parameters and protection for legacy receivers.” In a forum sponsored by NTIA, however, one presenter described legacy systems that use outdated technology as slow-moving trucks that need to be bumped off the highway.
Improved Antennas
Antenna technology also is being addressed. A September 2003 NPRM that covers rules for Parts 2 and 15 unlicensed devices and equipment approval also seeks to permit the use of advanced antenna technologies with spread spectrum devices in the 2.4-GHz band. Current omnidirectional and directional antennas require more power to transmit than advanced designs and are more likely to cause interference. Advanced antennas, such as sectorized antennas and phased-array adaptive antennas, receive signals more efficiently and target their transmissions; they require less power to transmit and reduce the possibility of interference.
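The “targeting” works by phasing. The sketch below models an idealized eight-element, half-wavelength linear array (illustrative numbers, not any product): shifting each element’s phase steers the main beam toward the intended user, so full gain goes in one direction while directions where the signal would only be interference are strongly attenuated:

```python
import numpy as np

def steering_phases(n_elements, spacing_wl, steer_deg):
    """Per-element phase shifts that point a uniform linear array's
    main beam steer_deg away from broadside."""
    n = np.arange(n_elements)
    return -2 * np.pi * n * spacing_wl * np.sin(np.radians(steer_deg))

def array_gain_db(phases, spacing_wl, look_deg):
    """Relative array factor (dB) in a given look direction."""
    n = np.arange(len(phases))
    geometry = 2 * np.pi * n * spacing_wl * np.sin(np.radians(look_deg))
    af = np.abs(np.sum(np.exp(1j * (geometry + phases)))) / len(phases)
    return 20 * np.log10(af)

phases = steering_phases(8, 0.5, 30.0)    # steer 30 degrees off broadside
print(array_gain_db(phases, 0.5, 30.0))   # ~0 dB: full gain toward the user
print(array_gain_db(phases, 0.5, -20.0))  # ~ -19 dB: energy kept off-beam
```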
Addressing Interference
Interference remains the sticking point. A November article in the Los Angeles Times reported several examples of interference. One serious example was an incident in May 2003 when “pilots flying into London’s Luton airport heard cries broadcast by a baby monitor on the ground that drowned out communications from air traffic controllers.” The article also said FCC receives about half a dozen complaints a day about interference.
As one way to address interference, FCC in November 2003 issued an NOI/NPRM to establish an “interference temperature metric to quantify and manage interference.” It seeks information on how to develop the metric, while also asking whether it is feasible to introduce this approach on a limited basis in selected bands.
The concept, called “a fundamental paradigm shift in the commission’s approach to spectrum management,” shifts the method of assessing interference from FCC’s current focus on the operation of transmitters to a new method that measures the RF environment in real time. The method takes into account the interactions between transmitters and receivers, as well as the cumulative effects of all undesired RF energy that is present at a receiver at any time.
The NPRM reads: “…to the extent that the interference temperature limit in a band is not reached, there could be opportunities for other transmitters, whether licensed or unlicensed, to operate in the band at higher power levels than are currently authorized. In such cases, the interference temperature limit for the band would serve as an upper bound or ‘cap’ on the potential RF energy that could be introduced into the band.”
The SPTF recommends that “these thresholds should be set only after review of the condition of the RF environment in each band. To that end, the task force recommends that the commission [FCC] undertake a systematic study of the RF noise floor.” Determining an interference temperature cap for any band would require an enormous and perhaps impossible effort, given the increasing numbers of Part 15 devices, the increasing mobility of devices, and FCC’s possible allowance of flexible use, common use and underlay rights in various bands.
Noisy Up High
ARINC’s Hutchison brings up another point: “Measuring spectrum activity or occupancy is a good idea. However, it might look real quiet on the ground, but if you do the analysis at 1,000 feet higher, areas that looked quiet aren’t so quiet anymore. As you go up in altitude, you see more and more radios, and when you are at an altitude of 20,000 feet, it’s a very different and noisy environment.”
It is hoped that FCC’s analysis will take altitude into account. But once the agency determines appropriate noise caps for each band, it will look to radio technology to determine dynamically whether interference is at a level that allows a radio to transmit. This takes us back to cognitive radios, which would have the smarts to “take the temperature” of the interference on a frequency and determine whether the temperature is low enough to transmit without causing harmful interference.
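Stripped to its essentials, that decision is a few lines of arithmetic. The sketch below converts a measured interference power into an equivalent temperature and compares it, with a safety margin, against a band’s cap; the cap and margin a real radio would use are precisely what FCC has yet to define, so the values here are placeholders:

```python
BOLTZMANN = 1.38e-23  # joules per kelvin

def interference_temperature_k(power_watts, bandwidth_hz):
    """Express measured interference power as an equivalent temperature."""
    return power_watts / (BOLTZMANN * bandwidth_hz)

def may_transmit(measured_power_w, bandwidth_hz, band_cap_kelvin,
                 margin_db=6.0):
    """'Take the temperature' of the band; transmit only if the reading,
    padded by a safety margin, stays under the regulatory cap."""
    temp_k = interference_temperature_k(measured_power_w, bandwidth_hz)
    return temp_k * 10 ** (margin_db / 10) < band_cap_kelvin
```

Hutchison’s altitude point bites right here: the interference power a radio senses on the ground can be far lower than what a receiver at 20,000 feet actually experiences.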
Mother of All Protocols
The newest candidate for such a capability is the neXt Generation (XG) Communications program from the Defense Advanced Research Projects Agency (DARPA). Being developed for military and civil use, XG technology would allow “multiple users to share use of the spectrum through adaptive mechanisms that ‘deconflict’ users in terms of time, frequency, code and other signal characteristics.”
Some call XG technology “the mother of all sharing protocols.” It is anticipated to be frequency-agile, smart, opportunistic technology that can use whatever white space exists within spectrum, while having the requisite “etiquette” never to interfere with others.
In June 2003, the public got its first look at XG, at a forum sponsored by the New America Foundation, an independent, nonpartisan, nonprofit public policy institute that aims to bring new voices and new ideas into the “nation’s public discourse.” During the forum, DARPA issued its first formal request for comment. In August 2003, Raytheon received a one-year contract to participate in the program’s initial phase, developing software-controlled capabilities to use and share spectrum among different users. Raytheon subsequently received a second contract and was to demonstrate its work this summer. The ultimate goal is to increase spectrum use by a factor of 10 to 20.
This is where spectrum management is heading and these are the tools that are supposed to take us there. It sounds great, but hold on a minute.
Some of the technology is available, but the gear that fundamentally expands spectrum use is not. Some experts attending NTIA’s spectrum policy forum in December said that hopes for spectrum solutions are being pinned on technology that could be a decade away.
Time to Get it Right
Meanwhile, we can’t afford to throw out the baby with the bath water. We have many existing systems that perform flawlessly and provide significant public benefit. An excellent example is GPS. In many cases, these are safety-of-life or national defense services.
As we seek better ways to use spectrum, we should proceed prudently and take the time to get this extremely complex transition right. No one wants to see a missile launch or a plane crash because a new radio technology misread an interference temperature threshold and frequency-hopped its transmission into the wrong band.