‘Bomb robot’ usage in Dallas underscores importance of mission-critical data, not just voice
Public safety entered a new tactical era last week, as the Dallas Police Department deployed a “bomb robot” and killed Micah Xavier Johnson, the man suspected of shooting 12 law-enforcement officers, five of them fatally.
Dallas Police Chief David Brown said a bomb-disposal robot was used to deliver one pound of C4 plastic explosive to the second floor of the parking garage where Johnson was located. The subsequent detonation killed the suspect, in what is believed to be the first time that a law-enforcement robot has been used purposely to deliver a lethal blow on domestic soil.
Of course, the concept is not new. The military has been using unmanned drones for years to execute lethal strikes against suspected terrorists and others deemed to be threats to the U.S., a tactic that has generated considerable scrutiny worldwide.
Now, many of those same ethical and policy questions are being asked in the wake of the Dallas episode, along with a debate about whether this use of force was appropriate or legal.
Brown said Dallas police used the “bomb robot” after hours of negotiations with Johnson failed. At the time, Johnson indicated that he wanted to shoot other police officers, which led to the decision to deploy the “bomb robot,” which actually was a Northrop Grumman-manufactured bomb-disposal robot carrying the C4 explosive.
“Without our actions, he [Johnson] would have hurt more officers,” Brown said. “We had no choice, in my mind, but to use all tools necessary.
“I approved it [the use of the ‘bomb robot’]. And I’ll do it again if presented with the same circumstances.”
Some may claim that Johnson had not fired a shot in hours, meaning that the police and the public were not in imminent danger at the time of the “bomb robot” detonation, which is the normal standard used to determine whether lethal force is legally justified. But I agree with Brown and most observers that lethal force was appropriate in this instance, given the shootings hours before and Johnson’s threats to negotiators that he would shoot more officers.
Whether the use of the “bomb robot” was legal and appropriate likely will be debated for some time, as was President Harry Truman’s decision to drop atomic bombs on Japan in 1945 in an effort to accelerate the end of World War II.
No, the Dallas police’s use of a “bomb robot” is not as significant as the atomic-bomb decision, but there are some parallels.
First, the leaders in question used unprecedented technological measures to deliver lethal force, both with the goal of reducing the number of lives that would have been lost if fighting had continued. Second, such unprecedented uses of lethal force spark a lively and much-needed public debate about the legal and ethical use of these technologies.
While the outcome of these debates may not be obvious (whether the atomic bombs should have been used remains an ongoing discussion, more than 70 years after Truman’s decision), there is little doubt that the use of such technologies immediately changes the thought process associated with response tactics and strategy in the future. Putting the genie back in the bottle almost never happens, no matter how hard some people try to do so.
I’ll leave the complex legal and ethical implications of robot usage to others. But the Dallas saga seems to point to a very clear technical need: Now that we live in a world where a remote-controlled robot has been used to deliver lethal force, it is imperative that personnel deploying such devices have the utmost confidence in the communications connection between the human operator and the robot.