The FCC looks at receiver standards
This article originally appeared in print with the headline, "The other side of the coin."
The federal government is determined to squeeze more users into the scarce resource that is radio spectrum. Historically, the FCC pursued spectrum efficiency by specifying transmitter standards, but the agency now recognizes that receivers are equally important.
Poor receiver design limits how closely dissimilar services can be spaced because of limited receiver selectivity, vulnerability to strong adjacent band signals, and other shortfalls. Robust designs generally are more expensive, so any FCC standard must balance performance and cost.
In ET Docket No. 13-101, the FCC requested comments on its Technological Advisory Council’s (TAC) white paper, "Interference Limits Policy—The Use of Harm Claim Thresholds to Improve the Interference Tolerance of Wireless Systems." The TAC advocates an "interference limits" policy whereby the government establishes harm-claim thresholds that must be exceeded before a radio system can claim it is experiencing harmful interference.
The government does not direct how manufacturers should design receivers, but it establishes a standard for the environment where the receivers will operate. If receiver standards are adopted, devices should be sold with labels indicating that the receiver meets the appropriate threshold.
The TAC’s work is part of an effort to achieve greater spectrum efficiency. In February 2013, the Government Accountability Office (GAO) published a report, entitled "Further Consideration of Options to Improve Receiver Performance Needed," that recommends the FCC conduct small-scale pilot tests to assess the practical effects of various methods for improving receiver performance.
In July 2012, the President’s Council of Advisors on Science and Technology (PCAST) published a report, "Realizing the Full Potential of Government-Held Spectrum to Spur Economic Growth." PCAST notes the important role of receivers in spectrum policy, recommending interference limits similar in principle to the TAC white paper. It’s unclear what will happen next, but it is likely that a rulemaking procedure will ensue and receiver standards eventually will be adopted.
Let’s examine the application of receiver standards to a timely and widespread problem — 800 MHz interference to public-safety radios.
Rebanding challenges
When the FCC adopted the 800 MHz rebanding rules in 2004, it specified minimum receiver standards that land mobile radios must meet before they are eligible for protection from 800 MHz cellular interference (e.g., from ESMR, cellular A or cellular B operators). These standards call for 12 dB SINAD sensitivity of -116 dBm, adjacent-channel rejection of 70 dB for portables and 75 dB for mobiles, and intermodulation rejection of the same 70 dB and 75 dB. The implied measurement standard is TIA-603-B.
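Purely as an illustration, these criteria can be expressed as a simple compliance check. The spec figures below mirror the text; the "measured" values in the last line are hypothetical.

# Minimal sketch of the 800 MHz rebanding receiver criteria described above.
# Spec figures are from the text; the measured values at the bottom are hypothetical.

REBANDING_SPECS = {
    # 12 dB SINAD sensitivity, adjacent-channel rejection, intermodulation rejection
    "portable": {"sensitivity_dbm": -116, "acr_db": 70, "imr_db": 70},
    "mobile":   {"sensitivity_dbm": -116, "acr_db": 75, "imr_db": 75},
}

def meets_rebanding_spec(radio_type, sensitivity_dbm, acr_db, imr_db):
    """True if the measured TIA-603-B figures meet the rebanding criteria."""
    spec = REBANDING_SPECS[radio_type]
    return (sensitivity_dbm <= spec["sensitivity_dbm"]  # more negative = more sensitive
            and acr_db >= spec["acr_db"]
            and imr_db >= spec["imr_db"])

# Hypothetical portable: -117 dBm sensitivity, 72 dB adjacent-channel, 71 dB IM rejection
print(meets_rebanding_spec("portable", -117, 72, 71))  # prints True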
Because the major source of interference was Sprint Nextel cell sites employing the iDEN airlink standard, and all leading manufacturers published these values in their datasheets, these standards made sense at the time. Unfortunately, two problems were overlooked or were considered too difficult to address.
First, the purpose of rebanding was to separate dissimilar services so that the near-far problem becomes manageable. But separation alone does not solve the problem if the receiver front end still passes cellular signals, because those strong signals can still cause blocking and intermodulation interference. An essential part of the plan was to install narrower bandpass filters (851-861 MHz) to reject the cellular signals. Unfortunately, manufacturers have been slow to adopt the new filters for a variety of practical reasons, and there may not be adequate motivation to make the change without federal government involvement.
The second problem is that the TIA-603-B intermodulation-rejection test is performed with interfering signals in the range of -50 dBm to -45 dBm, but cellular interference on the street often is measured above -20 dBm. Consequently, two radios with identical TIA-603-B intermodulation rejection of 70 dB may show dramatically different rejection when the interferers exceed -50 dBm. This behavior is not referenced on manufacturer datasheets and rarely, if ever, appears in requests for proposals (RFPs) for new radio systems.
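A rough way to see why the test level matters: in an idealized, weakly nonlinear front end, the third-order IM product rises 3 dB for every 1 dB increase in the two interferers, so the margin against IM shrinks 2 dB per dB. The sketch below extrapolates from the TIA-603-B test level to street level under that assumption; real radios deviate from the ideal slope, and that deviation is exactly what a single datasheet number cannot show.

# Ideal-case extrapolation from the TIA-603-B test level to street-level interferers.
# Third-order IM products grow 3 dB per dB of interferer power in a weakly
# nonlinear front end; real radios deviate (compression, AGC action).

tia_interferer_dbm = -45.0      # upper end of the TIA-603-B test range
street_interferer_dbm = -20.0   # level often measured near cell sites (per the text)

delta_interferer_db = street_interferer_dbm - tia_interferer_dbm   # +25 dB
delta_im3_db = 3 * delta_interferer_db                              # +75 dB, ideal slope
effective_rejection_loss_db = delta_im3_db - delta_interferer_db    # 50 dB worse margin

print(f"Interferers up {delta_interferer_db:.0f} dB -> IM3 product up {delta_im3_db:.0f} dB")
print(f"Ideal-case loss in effective IM rejection: {effective_rejection_loss_db:.0f} dB")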
To better understand the problems faced by the manufacturer, consider the typical digital receiver shown in Figure 1. The near-far problem is essentially a dynamic-range problem; the dynamic range of the digital section of the receiver is limited by the number of bits in the analog-to-digital converter (ADC).
To limit the amplitude of unwanted signals at the ADC, manufacturers typically perform a double down-conversion prior to the ADC.
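As a rough illustration of the constraint, the ideal dynamic range of an N-bit converter is about 6.02N + 1.76 dB; the bit counts below are illustrative and not drawn from any particular radio.

# Approximate full-scale dynamic range (SQNR) of an ideal N-bit ADC: 6.02*N + 1.76 dB.
# Bit counts are illustrative only.

def adc_dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (12, 14, 16):
    print(f"{bits}-bit ADC: ~{adc_dynamic_range_db(bits):.1f} dB dynamic range")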
Meanwhile, the front-end bandpass filter passes the ESMR and much of the cellular A band, so the low-noise amplifier (LNA) often is presented with strong signals. The brute-force way to fend off intermodulation (IM) interference is to employ an LNA with a high third-order intercept (IIP3). Unfortunately, amplifiers with a high IIP3 draw too much current to be practical in battery-operated portable devices.
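The trade-off can be sketched with the standard two-tone approximation, in which two equal interferers at level P produce an input-referred IM3 product near 3P - 2(IIP3); the IIP3 values below are illustrative, not taken from any datasheet.

# Standard two-tone approximation: two equal interferers at p_in produce an
# input-referred third-order IM product near 3*p_in - 2*IIP3.
# The IIP3 values below are illustrative, not taken from any datasheet.

def im3_product_dbm(p_interferer_dbm: float, iip3_dbm: float) -> float:
    return 3 * p_interferer_dbm - 2 * iip3_dbm

for iip3_dbm in (-10, 0, 10):   # higher IIP3 generally means higher LNA current drain
    print(f"IIP3 {iip3_dbm:+d} dBm, interferers at -20 dBm -> "
          f"IM3 product at {im3_product_dbm(-20, iip3_dbm):.0f} dBm")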
Automatic gain control
Automatic gain control (AGC) can combat strong interferers, but most AGC power detectors are broadband RF diodes that cannot distinguish between desired and undesired signals. Placing an AGC attenuator in front of the LNA degrades the receiver noise figure and desensitizes it. More sophisticated AGC designs might use a second detector at the second intermediate frequency (IF) to measure the amplitude of the desired signal and then apply the amount of attenuation in front of the LNA that optimizes receiver sensitivity.
This is possible because 1 dB of attenuation in front of the LNA reduces the desired signal by 1 dB but reduces the third-order IM product by 3 dB. Most receivers also can tolerate much higher levels of blocking than of IM interference (typically 95 dB versus 75 dB of rejection), so a receiver that distinguishes between the two types of interference can adapt the AGC accordingly.
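With hypothetical levels, the arithmetic looks like this: each dB of attenuation ahead of the LNA costs 1 dB of desired signal (and roughly 1 dB of noise figure) but buys 3 dB of IM3 suppression, a net 2 dB gain in carrier-to-IM ratio.

# Hypothetical levels showing the AGC trade: attenuation ahead of the LNA drops the
# desired signal 1 dB per dB but the IM3 product 3 dB per dB, so the carrier-to-IM
# ratio improves 2 dB per dB (until the sensitivity penalty dominates).

def carrier_to_im_db(p_desired_dbm, p_interferer_dbm, iip3_dbm, atten_db):
    desired = p_desired_dbm - atten_db
    im3 = 3 * (p_interferer_dbm - atten_db) - 2 * iip3_dbm
    return desired - im3

for atten_db in (0, 5, 10):
    print(f"{atten_db:2d} dB attenuation -> C/(IM3) = "
          f"{carrier_to_im_db(-85, -20, 0, atten_db):.0f} dB")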
Figure 2 plots the strong-signal IM rejection of four radio models from two different manufacturers. In each case, the desired frequency was 851.1750 MHz, and the interfering signals were at 859.6750 MHz and 868.1750 MHz. Radio performance differs dramatically, yet all four used essentially the same front-end filter, which did not roll off until 874 MHz. Because the radios' third-order intercept can be derived from the TIA-603-B IM rejection, which was similar for all four radios (75-78 dB), we must conclude that the better-performing radios use other techniques, perhaps advanced AGC.
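A common back-of-the-envelope estimate makes the point: in the TIA-603-B test, the two interferers sit IMR dB above reference sensitivity and the resulting IM3 product lands roughly at the sensitivity level, so IIP3 is approximately sensitivity + 1.5(IMR). Applying that approximation to the figures above shows how similar datasheet numbers imply similar front-end linearity.

# Common approximation for estimating input IP3 from a TIA-603-B IM rejection figure:
# IIP3 ~= reference_sensitivity + 1.5 * IMR. Sensitivity and IMR values are from the text.

def iip3_from_tia_imr_dbm(sensitivity_dbm: float, imr_db: float) -> float:
    return sensitivity_dbm + 1.5 * imr_db

for imr_db in (75, 78):
    print(f"IMR {imr_db} dB -> IIP3 ~ {iip3_from_tia_imr_dbm(-116, imr_db):+.1f} dBm")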
If a public-safety agency is using Model A from Vendor 1, and Sprint Nextel and the cellular A operator each create -20 dBm IM interferers at street level, the desired signal must be at least -85 dBm (-20 dBm minus the 65 dB of strong-signal IM rejection) to overcome the interference. If the same agency uses Model B from Vendor 2, the strong-signal IM rejection is only 38 dB, so the desired signal must be at least -58 dBm. According to their datasheets, the two radios perform identically; the real difference simply does not appear on the datasheet.
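The same arithmetic in sketch form, using the strong-signal IM rejection values read from Figure 2 (65 dB for Model A, 38 dB for Model B):

# Minimum desired-signal level needed to overcome two -20 dBm IM interferers,
# given each radio's strong-signal IM rejection (values read from Figure 2).

def min_desired_signal_dbm(interferer_dbm: float, strong_signal_imr_db: float) -> float:
    return interferer_dbm - strong_signal_imr_db

print("Model A:", min_desired_signal_dbm(-20, 65), "dBm")   # -85 dBm
print("Model B:", min_desired_signal_dbm(-20, 38), "dBm")   # -58 dBm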
That such different performance is achieved with nearly identical bandpass filters illustrates another important principle. If the FCC took a narrow view of the problem, it might adopt a specific minimum filter rejection as the receiver standard. But that approach would miss the point and possibly stifle innovation. What matters is strong-signal IM rejection; we don't care how it is achieved. It might come from a filter, or it might come from a novel AGC algorithm.
Jay Jacobsmeyer, KD0OFB, is president of Pericle Communications Company, a consulting firm located in Colorado Springs, Colo.
I am in strong support not only of improving the design of current base receivers, but also of publishing their placement and intended use, to help predict potential interference issues. When designers of adjacent systems are trying to build a system that will play nice with others, the lack of this information is a black hole that cannot be filled until the new systems are built or in operation.

In the past, I have suggested to the FCC that, at a minimum, a voluntary system receiver registry be established, so that others could see the actual performance of an existing system and further minimize interference potential. I would not go so far as to suggest licensing receivers, just something that would allow this as additional information on a system's license. For example, there are several VHF systems in Phoenix with 40 to 60 receivers installed in the metro area on a single channel to enhance portable radio performance. Yet this information is invisible to any system designer looking at an adjacent-channel system. Coordinators also cannot see this potential problem and could create a difficult situation, all because receivers are not given the same consideration as transmitters and are rarely considered relevant. Let's also not forget about tower-top amplification systems and their potential vulnerabilities. Having this information available and factored into interference analysis should allow for the design of a better "neighborhood."
Whatever happened with Oakland throwing stones at AT&T for their radio problems?