Last week, the FCC released its much-anticipated white paper detailing why the agency believes 10 MHz of 700 MHz spectrum currently licensed to the Public Safety Spectrum Trust (PSST) typically will be enough to serve the needs of first responders using a proposed nationwide, public-safety wireless broadband network.
As promised, the white paper states that 10 MHz is sufficient to serve public safety’s bandwidth needs for routine, day-to-day operations. When a truly large-scale disaster occurs — think 9/11 or Hurricane Katrina — public safety will need considerably more spectrum and should have priority-roaming access to commercial wireless networks in the 700 MHz band to meet capacity demands.
On these two ends of the usage-case model, the FCC and public-safety officials are in agreement. In terms of the usage cases between these two extremes, there is considerable debate.
According to the FCC white paper, 10 MHz of spectrum will be enough to serve public safety’s needs even during significant local responses, such as the Minneapolis bridge collapse and Hurricane Ike in Houston. Meanwhile, many public-safety officials question whether 10 MHz of spectrum will even be adequate for local incidents that tend to happen a few times per year, such as a large multi-vehicle accident or other significant local event.
From an outsider’s point of view, the most remarkable statement on the subject in recent weeks came from FCC Chief Technologist Jon Peha, who said during a recent panel that the FCC would use the numbers provided by New York City — data that public-safety officials cite as proof that more than 10 MHz of spectrum is needed — to show that the PSST spectrum would be enough in virtually all instances.
Sure enough, the capacity white paper includes sections detailing the New York City “dirty bomb” scenario and notes that there will be plenty of capacity for responders to run their applications for video, data, etc. But there is one significant caveat: the mobile video throughputs used by the FCC are a fraction of those used by New York City.
While there are many other applications to consider, video is the most significant, because it consumes the most bandwidth and demands high quality-of-service levels to ensure that interruptions don’t occur. New York City built its scenario on having 1.15 Mbps per device for video applications, which is enough to provide standard-broadcast-quality video.
In contrast, the FCC plan calls for video throughputs between 256 Kbps — as included in the statement of requirements submitted by the National Public Safety Telecommunications Council (NPSTC) prior to the failed D Block auction two years ago — and 384 Kbps.
In the dirty-bomb scenario, the FCC white paper shows that 50-60% of the network capacity would be used with video streaming at 256 Kbps, that 60-70% would be used with 384 Kbps video, and that 80-90% would be used with 512 Kbps video.
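To see why the per-stream video rate is the crux of the dispute, a quick back-of-envelope calculation helps. The sector-capacity figure below is an illustrative assumption only — roughly what a 10 MHz LTE carrier might deliver per cell sector under favorable conditions — and is not drawn from the FCC white paper or the New York City analysis; only the per-stream rates come from the debate itself.

```python
# Back-of-envelope: simultaneous video streams per cell sector.
# ASSUMPTION: ~15 Mbps of usable downlink throughput per sector on a
# 10 MHz carrier. This is a hypothetical planning figure for illustration,
# not a number from the FCC or NYC studies.
SECTOR_CAPACITY_KBPS = 15_000

# Per-stream rates cited in the debate: NPSTC minimum, FCC upper figure,
# the white paper's 512 Kbps case, and New York City's 1.15 Mbps target.
for rate_kbps in (256, 384, 512, 1_150):
    streams = SECTOR_CAPACITY_KBPS // rate_kbps
    print(f"{rate_kbps:>5} Kbps per stream -> {streams:>3} simultaneous streams")
```

Under that assumed capacity, moving from the 256 Kbps minimum to New York City’s 1.15 Mbps cuts the number of supportable streams by roughly a factor of four and a half — which is exactly why the two sides reach such different conclusions from similar incident data.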
Public-safety officials note that the NPSTC video-throughput numbers were designed to be a minimal requirement — not an optimal target throughput rate — to help attract a commercial D Block bidder. More important, the projected use of video and data by public safety is much greater today than it was two years ago. This trend mirrors the commercial world, which has seen data and video usage soar; after all, isn’t that why T-Mobile and Sprint are so adamant about having the right to bid on the D Block today, even though both declined to even participate in the last 700 MHz auction?
Instead of throwing white papers at each other, representatives of the FCC, Congress and public safety need to talk about what they want this proposed public-safety network to do.
If only tactical video is needed to direct the deployment of large resources such as trucks and manpower, then the lower video throughputs cited by the FCC probably will suffice. If the video needs to be good enough to stand up in court for prosecutorial purposes or to identify a potential suspect, then broadcast-quality video is probably more appropriate.
Meanwhile, one of the most compelling applications proposed is the ability for a doctor to perform remote diagnosis from a hospital to determine what EMS should do with a patient at an incident — what treatments should be given on site and where the patient should be taken to get appropriate treatment. To make those kinds of decisions, remote doctors will need at least 1 Mbps and would prefer 3 Mbps to approach high-definition video.
If I’m the guy injured at an accident site or in an ambulance, I definitely want the doctor to have the clearest picture possible, so no one performs an unnecessary procedure or sends me to a hospital that is not equipped to take care of me properly. I suspect most lawmakers and policymakers in Washington would feel the same, if asked.
Still, there is no doubt that money and spectrum limitations have to be considered. With this in mind, all sides of this debate need to do a better job of stating their cases about what they want this network to be able to do, so laws and policies can be made based on informed decisions, not on speculation based on twisted assumptions. That’s going to require all parties to calm down, state their cases and — yes — truly listen to the problems faced by the other players involved in the matter.
What do you think? Tell us in the comment box below.