First steps toward LMR-LTE convergence
This article originally appeared in print with the headline, "The first step in a long journey."
There are many factors influencing the decision whether to add a broadband-radio layer over existing land-mobile-radio (LMR) coverage, from regulatory issues to the operational and capital costs associated with building, managing and maintaining the new network. There also are engineering factors, including base-station site management, antenna systems, increased backhaul requirements, and radio-frequency (RF) planning.
This is the first article in a three-part series focusing on the RF-engineering aspects of layering a broadband technology, such as LTE, over existing LMR systems. This article specifically addresses the differences and similarities between the LTE and LMR RF link budgets. The second article will examine RF tools and how to determine whether there is a difference in radio coverage, while the third will examine the backhaul needed to deliver broadband.
The radio link budget is the collection of information that design engineers need to determine the physical-layer properties of the path between the transmitter and receiver. For LMR systems, the standards reference is TIA TSB-88; for LTE systems, the relevant standards are published by the 3GPP.
Downlink receiver noise floor. There are many similarities between LMR and LTE radio systems, but there are more differences. The similarities begin with the basic premise of a link budget: a transmitter and a receiver are matched to determine the requirements of the link between them, and the downlink portion of the connection is the more complex of the two.
The downlink (talkout) receiver in the subscriber device (in LTE, a modem attached or connected to the client computer) operates in a nomadic wireless environment for mobile data systems and must be able to distinguish between wanted and unwanted signals. To determine whether this is possible, the thermal noise floor and effective receiver sensitivity must be estimated, using the thermal-noise formula based on Boltzmann's constant.
The same formula is used to determine the thermal noise floors for LTE and LMR systems, but there is one significant difference: the bandwidth, or channel size. In most LMR systems, the channel is very narrow, perhaps as little as 12.5 kHz. The formula uses the usable portion of that bandwidth, called the effective noise bandwidth (ENBW), which for such a channel is on the order of 6,000 Hz. The challenge is to determine this value for the LTE system. LTE channels are on the order of 10 MHz, which makes "apples-to-apples" comparisons with narrowband systems difficult.
To overcome this, the broad LTE spectrum must be carved into smaller segments to make a reasonable comparison. Using special coding techniques, LTE divides its broad channel into resource blocks of about 34 kHz to make the best use of the spectrum. Also, with a frequency-reuse plan of N=1, LTE's channel-resource management techniques at the "cell edge" limit the bandwidth to the smallest resource block, MS-0, which is 34 kHz with an ENBW of 15 kHz. Near the "cell center," LTE can deliver 2 Mb/s using 64-QAM signaling. At the LTE "cell edge," QPSK modulation is used to deliver 256 kb/s. For LMR systems, QPSK is used to deliver 12 kb/s throughout the cell.
Assuming a noise figure of 12 dB for both the LTE and LMR receivers, the Boltzmann formula yields a noise floor of about -124 dBm for the narrowband LMR channel and -120 dBm for the LTE signal, assuming the MS-0 resource block is 34 kHz and the ENBW is roughly 44% of each channel or resource designation. This establishes the noise floor for both technologies.
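As a rough check of this arithmetic, the short Python sketch below reproduces the noise-floor figures from the assumptions stated above; the -174 dBm/Hz term is the thermal noise density at room temperature (290 K), and the function name and values are illustrative only, not a design tool.

import math

def noise_floor_dbm(enbw_hz, noise_figure_db):
    # Thermal noise (kTB) at 290 K is -174 dBm/Hz; add the bandwidth term and the receiver noise figure.
    return -174.0 + 10 * math.log10(enbw_hz) + noise_figure_db

print(round(noise_floor_dbm(6_000, 12)))   # LMR: 12.5 kHz channel, ~6 kHz ENBW -> about -124 dBm
print(round(noise_floor_dbm(15_000, 12)))  # LTE: 34 kHz MS-0 block, 15 kHz ENBW -> about -120 dBm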
Downlink effective receiver sensitivity. To calculate the effective receiver sensitivity of the downlink (the RSRP design threshold), we must determine the minimum useable signal above the noise floor. With a minimum delivered data rate of 256 kb/s, and using an MS-0 of 34 kHz (16.32 bits/second/Hz), LTE's required signal-to-interference-and-noise ratio (SINR) is 7.5 dB, plus 11.5 dB for Rayleigh, or fast, fading, for a total of about 19 dB. For LMR systems, a typical value of faded C/(I+N) for a digital audio quality of 3.4 is 17.7 dB. For LMR data transmissions of 12 kb/s (about 1 bit/second/Hz), a typical faded C/(I+N) is 18 dB. The resulting effective receiver design value, or RSRP design threshold, for LTE broadband is -101 dBm; for LMR it is -106 dBm, a 5 dB difference in favor of LMR.
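Continuing the sketch, the design thresholds follow by adding the required SINR or faded C/(I+N) and the fading margin to the noise floors computed above; the figures are the assumptions stated in this article.

lte_noise_floor = -120   # dBm, from the previous calculation
lmr_noise_floor = -124   # dBm

lte_threshold = lte_noise_floor + 7.5 + 11.5   # required SINR plus the Rayleigh (fast) fading margin
lmr_threshold = lmr_noise_floor + 18           # faded C/(I+N) for 12 kb/s LMR data
print(lte_threshold, lmr_threshold)            # -101.0 dBm and -106 dBm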
Downlink radio transmission considerations. Transmission of an LTE signal with a 10 MHz bandwidth does not compare directly with a narrowband LMR transmission of 12.5 kHz. At the cell center, LTE-delivered data rates can exceed 2 Mb/s per user and require multiple resource blocks, with channel bandwidths up to 128 kHz per user. For LMR systems at the cell center, the transmitted bandwidth is about one-tenth of the LTE signal.
Transmission power for LTE systems at the eNodeB is 10 watts, while LMR base stations typically use 100 watts. LTE transmission at the cell center occurs at a lower power than the 10-watt maximum, because of bandwidth spread and power-density requirements. At the cell edge, an LTE signal must be dropped by at least 25 dB to prevent interference, so a minimal resource block can be used. The resulting LTE-transmitted signal at the cell edge is about 30 mW, or 15 dBm. An LMR system with narrowband channels, proper transmitter-to-receiver frequency separation, and operation within FCC rules and co-channel limits can transmit the maximum signal of 100 watts, with the same transmission at both the cell center and cell edge. For the best comparison, the cell edge is used, where the LTE resource requirements are similar to the LMR transmitted signal. The difference between the LTE transmission of 30 mW and the LMR transmission of 100 watts at the cell edge is 35 dB in favor of LMR.
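The power comparison is straightforward decibel arithmetic; the sketch below converts the assumed transmit powers to dBm and applies the 25 dB cell-edge back-off described above.

import math

def watts_to_dbm(p_watts):
    return 10 * math.log10(p_watts * 1000)

lte_edge_dbm = watts_to_dbm(10) - 25   # 10 W eNodeB (40 dBm) backed off 25 dB -> 15 dBm, about 30 mW
lmr_dbm = watts_to_dbm(100)            # 100 W LMR base station -> 50 dBm at both cell center and cell edge
print(lmr_dbm - lte_edge_dbm)          # 35 dB in favor of LMR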
Downlink antenna systems. LTE and LMR antenna systems are different. LTE antenna systems are divided into three sectors and use specialized antennas employing a technology called multiple-input/multiple-output (MIMO). These MIMO antennas are sectored flat panels with a gain of 12 dBd. LMR antennas are omnidirectional, with a gain of 9 dBd. Assuming both systems operate in the 700 MHz band, the transmission-line and filtering losses for LTE, which is co-located with the transmitter, are minimal, while the losses for LMR are about 5 dB higher; the LMR filter losses also are higher to allow for close narrowband channel separations. Accounting for both gain and losses, the LTE antenna system contributes 12 dB to the link budget, while the LMR antenna system contributes 4 dB. As a result, LTE signals are 8 dB stronger from the antenna systems.
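The net antenna-system contribution is simply the antenna gain minus the line and filter losses; using the assumed figures above:

lte_antenna_net = 12 - 0   # 12 dBd MIMO panel, negligible line and filter losses
lmr_antenna_net = 9 - 5    # 9 dBd omnidirectional antenna, 5 dB line and filter losses
print(lte_antenna_net, lmr_antenna_net, lte_antenna_net - lmr_antenna_net)   # 12, 4, 8 dB in favor of LTE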
Downlink maximum available path loss. To determine the maximum available path loss (MAPL) for LTE, assume that the transmitter is limited to 30 mW (15 dBm) in order to mitigate interference under the N=1 frequency reuse at the cell edge; 0 dB loss for filtering; and a transmitter antenna gain of 12 dBd, resulting in an ERP of 27 dBm. If the receiver is a radio "dongle-type" modem with a -4 dB gain antenna, no transmission or filtering losses and an effective receiver design level of -101 dBm, the MAPL is 124 dB.
For LMR, start with 100 watts (50 dBm) at the transmitter, a 5 dB loss for filtering and coaxial lines, and a transmitter antenna gain of 9 dBd, which results in an ERP of 54 dBm. Assuming the downlink LMR receiver is a radio modem with a -4 dB gain antenna, no transmission or filtering losses and an effective receiver design level of -106 dBm, the MAPL is 156 dB. The difference is a 32 dB signal-strength advantage for LMR. This difference in link budgets will have a significant impact on the downlink coverage of the two types of systems.
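Putting the pieces together, the MAPL for each system is the ERP plus the receiver antenna gain minus the effective receiver design level; the sketch below reproduces the 124 dB and 156 dB figures under the same assumptions.

def mapl_db(tx_dbm, tx_losses_db, tx_antenna_dbd, rx_antenna_db, rx_threshold_dbm):
    # ERP = transmitter power, less filtering and line losses, plus antenna gain;
    # MAPL = ERP + receiver antenna gain - effective receiver design level.
    erp_dbm = tx_dbm - tx_losses_db + tx_antenna_dbd
    return erp_dbm + rx_antenna_db - rx_threshold_dbm

lte_mapl = mapl_db(15, 0, 12, -4, -101)   # ERP 27 dBm, dongle-type modem -> 124 dB
lmr_mapl = mapl_db(50, 5, 9, -4, -106)    # ERP 54 dBm, radio modem       -> 156 dB
print(lte_mapl, lmr_mapl, lmr_mapl - lte_mapl)   # 124, 156, and a 32 dB advantage for LMR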