Many policymakers, public-safety officials and end users expect that the proposed 700 MHz wireless broadband network will deliver nationwide interoperability. But while public safety’s selection of the Long Term Evolution (LTE) standard will enable critical broadband communication capabilities, it will not, by itself, deliver nationwide interoperability.

It is true that the LTE network will deliver mobile communications capabilities that have eluded public safety over legacy land-mobile-radio (LMR) networks. For example, for radios to operate on other agencies’ trunking networks, they must be pre-programmed. This operational requirement frequently results in reliance on conventional mutual-aid channels for major incidents — channels that are easily and quickly overwhelmed with traffic. LTE, on the other hand, natively allows nationwide access to multiple regional systems.

But the ability to exchange IP packets over the LTE network does not constitute interoperability. No first responder can decipher a string of ones and zeros from another first responder in order to determine the purpose and meaning of the communication. So, public safety must have the means to share meaningful information over the LTE network. In other words, public safety needs both network- and application-level interoperability. Furthermore, public safety might use diverse networks other than a nationwide public-safety LTE network to facilitate mobile data communications. Consequently, critical applications must work on a variety of public-safety networks — 4.9 GHz and 802.11 in addition to 700 MHz — as well as commercial 2G, 3G and 4G networks. Importantly, the applications that ride on public safety’s 700 MHz LTE network may require peer-to-peer capabilities — as opposed to centralized client-server architectures — whereby end users communicate directly and not through a centralized system.

Should public safety deploy an incident scene broadband network that is not connected to the core network, the applications must support discovery, authentication, authorization and security over the local incident network. There likely will be some concern over who is authorized to receive peer-to-peer video, biometrics, audio and other data.

Therefore, the applications must support this level of control at the user level. But such an architecture would diverge from most public-safety applications today. Agency e-mail, websites, computer-aided dispatch (CAD) systems, records-management systems, video systems, and other applications that currently reside in the data center would be unavailable if detached from the core network. As a result, special considerations regarding the application architecture are necessary if public safety requires these applications while “off network.”

Application continuity while away from home is another major architectural consideration. Consider, for example, a user from New York City who is providing mutual aid in Baltimore. For this example, let’s also assume that a national standard for CAD exists to facilitate incident information-sharing. Because the CAD data may not amount to substantial traffic, it may be acceptable for the two independent servers to share interoperable data. In other words, the user from New York City does not need to communicate directly with Baltimore’s CAD system. Instead, users can communicate via their home servers, which are communicating with one another. If the shared CAD information doesn’t amount to significant traffic load and there is high reliability over the inter-regional links (backhaul network), this may be the simplest solution to facilitate this communications exchange.

However, for multimedia applications such as voice and video, this delivery model may be insufficient. If, for example, the inter-regional traffic is video from a local Baltimore camera, and if the New York user connects to his video server in New York, that video will need to travel to New York and be transported back again to Baltimore. It is likely that there will be multiple cameras and multiple New York users providing mutual aid, so the amount of traffic carried across the inter-regional link would be substantial.

For push-to-talk audio, the individual streams are significantly smaller than video (tens of kilobits per second versus hundreds of kilobits per second), but the much larger population of users makes the number of audio streams drastically higher. Therefore, another important component of broadband application interoperability is the configuration of those applications while away from home.
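The tradeoff above can be made concrete with a back-of-the-envelope comparison. The per-stream bit rates track the article’s “tens versus hundreds of kilobits per second,” but the stream counts are illustrative assumptions, not measured figures:

```python
# Rough aggregate-load comparison of PTT audio vs. video on an
# inter-regional link. All figures are illustrative assumptions.

PTT_STREAM_KBPS = 20      # tens of kilobits per second per audio stream
VIDEO_STREAM_KBPS = 500   # hundreds of kilobits per second per video stream

def aggregate_kbps(streams: int, kbps_per_stream: int) -> int:
    """Total offered load for a set of identical streams."""
    return streams * kbps_per_stream

# A large mutual-aid incident: many talkers, fewer cameras (hypothetical counts).
audio_load = aggregate_kbps(200, PTT_STREAM_KBPS)   # 200 PTT users
video_load = aggregate_kbps(10, VIDEO_STREAM_KBPS)  # 10 camera feeds

print(audio_load)  # 4000 kbps of audio
print(video_load)  # 5000 kbps of video
```

Even though each audio stream is small, a few hundred PTT users can offer a load comparable to a handful of video feeds, which is why home-routing every stream across inter-regional links does not scale.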

Fortunately, LTE systems easily can handle these two diverse application transport models. Inherent within the LTE system architecture is the ability to service a locally provided application using “local breakout,” while an application provided by the user’s home server would be “home routed.” Public safety needs to determine and standardize the nationwide application framework to accommodate the various routing needs for each application.

Another 3GPP standardized technology, IP Multimedia Subsystem (IMS), delivers voice and other applications via the home network or visited network — providing a viable framework for public-safety applications. The LTE-IMS marriage manages the traffic to be serviced locally, including all-important 911 phone calls, which receive special treatment within the IMS platform. The development of standards that define roaming, message formats, and coding formats, while fundamental, is not enough. The interoperability standards that govern applications also must address priority access of users and applications. The LTE standard supports application priority through the Quality of Service (QoS) function, and it will be prudent for operators to prioritize users and applications for day-to-day and real-time emergency incident response.
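One way to picture the QoS function is as a mapping from applications to LTE’s standardized QoS Class Identifiers (QCIs). The QCI characteristics below follow the 3GPP table (TS 23.203), but the application-to-QCI assignments are assumptions for discussion, not a published public-safety standard:

```python
# Illustrative sketch: mapping public-safety applications onto LTE QCIs.
# A few entries from the standardized 3GPP QCI table (TS 23.203);
# lower "priority" numbers are served first under congestion.
QCI_TABLE = {
    1: {"type": "GBR",     "priority": 2, "example": "conversational voice"},
    2: {"type": "GBR",     "priority": 4, "example": "conversational video"},
    5: {"type": "non-GBR", "priority": 1, "example": "IMS signaling"},
    9: {"type": "non-GBR", "priority": 9, "example": "best-effort data"},
}

# Hypothetical public-safety assignment (an assumption, not a standard):
# emergency PTT rides on a voice-grade bearer, routine queries on best effort.
APP_TO_QCI = {
    "emergency_ptt": 1,
    "incident_video": 2,
    "records_query": 9,
}

def bearer_priority(app: str) -> int:
    """Scheduling priority the network would apply to this application."""
    return QCI_TABLE[APP_TO_QCI[app]]["priority"]

print(bearer_priority("emergency_ptt"))  # 2 -- served before routine data
```

The open question for public safety is not whether LTE can enforce such priorities — it can — but who standardizes the mapping so that it behaves identically on every regional network.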

Public safety also needs an “emergency button” capability for broadband applications, much like that which is available in LMR systems. Only in this case, the emergency trigger must accommodate many different applications and scenarios. For example, a video stream may be deemed critical after it has been established. If that stream becomes degraded, the recipient may need to upgrade the priority of that stream to increase reliability. The stream also may be forwarded from a fixed security camera, where no manual intervention can occur from the transmitting end. Since video is expected to account for the bulk of public safety’s incident traffic once the nationwide broadband network is operating, this application will be the most susceptible to network congestion. It is critical then that public safety adopts standards that facilitate operations in capacity-limited scenarios.

Video interoperability becomes even more complex due to its bandwidth requirements and the need to balance quality and capacity. For example, public safety may decide to automatically allow the frame rate to vary. Television and motion pictures use 30 and 24 frames per second, respectively, to deliver video that appears smooth and not jerky. However, the higher the frame rate and associated application throughput, the greater the network capacity that is needed. Therefore, reducing the frame rate can be a more economical method to manage network capacity while still conveying the needed information.

In some cases, the user simply is trying to establish basic situational awareness from the video — discerning what is occurring but not who is in the picture or what they are holding. In such cases, it is possible that the resolution of the video can be reduced. With higher-resolution video, more data is needed to represent the video, which in turn requires more bandwidth on the network. Lower resolution video would enable more video streams to be transmitted simultaneously over the network.
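The two knobs described above — frame rate and resolution — both scale the required bandwidth roughly in proportion to pixels per second. A first-order scaling model makes the point (the 500 kbps reference rate is an assumed figure; real codecs compress motion and detail non-linearly):

```python
# First-order model: required bit rate scales with the pixel rate
# (pixels per frame x frames per second) relative to a reference stream.
# The reference bit rate is an assumed figure for illustration only.

CIF_PX = 352 * 288    # pixels per CIF frame
QCIF_PX = 176 * 144   # quarter-CIF: one quarter the pixels

def scaled_kbps(ref_kbps, ref_px, ref_fps, px, fps):
    """Scale a reference bit rate by the relative pixel rate."""
    return ref_kbps * (px * fps) / (ref_px * ref_fps)

# Assume full-motion CIF video at 30 fps needs ~500 kbps.
full = scaled_kbps(500, CIF_PX, 30, CIF_PX, 30)
# Halving the frame rate to 15 fps halves the offered load...
half_rate = scaled_kbps(500, CIF_PX, 30, CIF_PX, 15)
# ...and dropping to QCIF quarters it again.
qcif = scaled_kbps(500, CIF_PX, 30, QCIF_PX, 15)

print(full, half_rate, qcif)  # 500.0 250.0 62.5
```

Under this simple model, a situational-awareness stream at quarter resolution and half the frame rate consumes one-eighth the bandwidth of the full-quality feed — which is the economy the text describes.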

Figure 1 illustrates the effects of various video frame sizes, conveyed in Common Intermediate Format (CIF), on the ability to discern visual information. Figure 2 presents a zoom perspective of a person in the window at three different resolutions. The man in the window clearly is distinguishable at four times CIF (4CIF), but at one-quarter CIF (QCIF) one cannot be sure if that is a person at all. However, the 4CIF image requires 16 times the information, and therefore substantially more bandwidth, than the QCIF video. The eventual standards will need to accommodate the varying visual needs, along with device and bandwidth limitations of public-safety systems.
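The “16 times” figure falls directly out of the CIF-family pixel counts, which are defined as fractions and multiples of 352 x 288:

```python
# Pixel counts behind the 16x figure. CIF-family resolutions are
# defined as fractions/multiples of the 352x288 base frame.
RESOLUTIONS = {
    "QCIF": (176, 144),
    "CIF":  (352, 288),
    "4CIF": (704, 576),
}

pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}
ratio = pixels["4CIF"] / pixels["QCIF"]

print(pixels["CIF"])  # 101376 pixels per CIF frame
print(ratio)          # 16.0 -- 4CIF carries 16x the raw image data of QCIF
```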

The video framework also must integrate tightly with the network. A helicopter video stream can provide useful situational awareness information to many individuals at an incident scene. But if each recipient were to separately utilize network bandwidth for the stream, an LTE base station quickly could become congested. Instead, that content should be multicast or broadcast. In other words, one stream could be received by multiple parties, thereby reducing the video stream’s impact on network resources.
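The arithmetic behind the congestion concern is simple: unicast repeats the same stream once per recipient, while multicast sends it once. The stream rate and recipient count below are illustrative assumptions:

```python
# Unicast vs. multicast delivery of one shared video stream.
# Stream rate and recipient count are illustrative assumptions.

STREAM_KBPS = 750   # one helicopter video feed (assumed rate)
RECIPIENTS = 40     # responders watching at the incident scene

unicast_load = STREAM_KBPS * RECIPIENTS   # one copy per recipient
multicast_load = STREAM_KBPS              # a single shared transmission

print(unicast_load)    # 30000 kbps over the air
print(multicast_load)  # 750 kbps over the air
```

At 40 recipients, unicast delivery multiplies the over-the-air load forty-fold, which is why the application framework must be able to invoke the network’s multicast/broadcast functions rather than leave the choice to each client.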

The LTE standard will accommodate multicast and broadcast of such information; however, the application framework needs to properly integrate with these functions to fully economize bandwidth. 
To complicate matters further, multiple techniques exist for coding the video content. For example, DVD uses the Moving Picture Experts Group (MPEG) 2 standard. Newer standards, such as H.264, which is used for Blu-ray discs, can convey the same quality at half the data rate. The Video Coding Experts Group (VCEG) eventually will publish H.265, and likely other standards that further improve video transmission.

Ideally, both ends of a video stream (the source and destination) will “speak the same language.” If they don’t support common video codec standards, as is common with LMR today, it will be necessary to translate the video content using video gateways. This would further complicate the architecture of a nationwide interoperable system. 
In the future, it is likely that incident command personnel will manage multiple video streams. Some of those streams will be critically important and will be the central focus for those managing the incident. Other video streams will provide situational awareness of other aspects of the incident. Based on the ever-changing condition of emergency incidents, the other streams may need to become the central focus — e.g., after a suspect moves from one part of a building to another — and the video incident manager will need to change the priority and quality of each stream as the incident requirements change.

Public-safety application standards need to accommodate dynamic user control of video streams. Video streams will come from fixed locations such as schools and banks, mobile locations such as police cruiser-mounted cameras, and other sources such as NG-911 centers. A crystal ball isn’t needed to foresee that there likely will be much more video than can be reasonably transmitted over any LTE network; therefore, managing that video will be a critical component of future public-safety operations.

It commonly is understood that LTE will not deliver peer-to-peer capabilities, i.e., communication directly between subscriber devices without requiring access to a network. In addition, there are multiple other challenges regarding broadband data-based push-to-talk (PTT) and there currently are no standards directing the application development.

But in the coming years, as public safety tackles these challenges, the nationwide interoperability aspects of push-to-talk should be addressed. Voice and data communications must be as simple and transparent as possible for first responders. For example, if vehicular repeaters operating on a trunked radio network are deployed at the incident scene, a firefighter must change channels to maintain communications. Failure to do so could cause fireground interoperability issues.

Furthermore, visiting mutual-aid providers may not have network access programmed into their radios, or have access to incident talkgroups. These issues make communications with today’s Project 25–based systems challenging. As the broader issues concerning broadband-based PTT are solved, and standards are developed to support such communications and achieve interoperability, the larger goals also should be addressed. Specifically, communications need to be more transparent for first responders and must be delivered 100% of the time, not just 95% of the time.

The selection of LTE as the framework for nationwide network interoperability now is formalized and will provide a robust mobile broadband pipe that enables rich information and application communications. However, the hard work to achieve critical application interoperability must now begin. The broadband pipe alone does not guarantee that public safety can access, exchange and share information. LTE provides the medium over which public-safety applications transmit, and those applications must be standardized in order to be interoperable. The necessary steps to advance such standards must be taken now, before these applications are deployed en masse. Hopefully, the interoperability challenges encountered with LMR systems will provide the important lessons needed to avoid the mistakes of the past.

Rick Burke has more than 30 years of engineering and system operations experience with complex communication networks and applications, particularly public-safety LMR and wireless broadband system engineering and information technology, and in implementing large-scale, multijurisdictional interoperable voice and data conventional, digital and IP communications networks. He can be contacted at 

Joe Ross is a senior partner at Televate, LLC, a Falls Church, Va.–based consultancy specializing in comprehensive system engineering and program management for public-safety communications. He has 20 years of experience in designing LMR, wireless broadband and commercial cellular systems. He also chairs the Public Safety Spectrum Requirements Working Group for the National Public Safety Telecommunications Council Technology Committee. He can be contacted at
