When your hands are full, new technology may let your eyes carry the load
ORLANDO—One of the great things about this job is that I get to see lots of cool, innovative technology developed to address the needs of the public-safety and critical-infrastructure sectors. As is the case in the commercial world, some of the most intriguing ideas being created come in the form of software applications.
Some of these applications are very ingenious, mining databases to provide public safety with all kinds of potentially useful information—from stats and background checks to pictures and video. As helpful as this information could be, some of these applications have not been adopted by public safety for various operational reasons.
For instance, many applications access data housed in the cloud, meaning a very reliable, very robust broadband connection is needed. Such connectivity is becoming more prevalent, but the reality is that first responders in the field occasionally have to operate in environments where there is no connectivity, particularly after a disaster. Similar problems arise during responses in which there is no commercial power available. Initiatives such as FirstNet and smart-grid deployments are designed to alleviate many of these concerns.
But having connectivity does not guarantee that public-safety personnel will be able to utilize applications. A public-safety user in the field often does not have the luxury of staring at a screen and devoting both hands to handling a device and navigating through an application to find information, no matter how helpful it may be.
Given this, the virtual-reality demonstration by Motorola Solutions (click here for video) at APCO 2016 caught my attention. Yes, the virtual-reality piece was predictably cool, but what was especially intriguing was the method used to navigate through the system—not with hands on a mouse, button or a touch screen, but simply with your eyes.
Leveraging technology from Eyefluence—Motorola Solutions' venture-capital group has invested in the California-based company—the user can navigate through a menu of database or application choices just by looking at them. The Eyefluence technology detects where your eyes are focused, and a selection is made. It is the equivalent of using a mouse to click on a computer screen, but no hands are needed.
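The article does not describe Eyefluence's internal algorithms, but one common way gaze-based selection is implemented is "dwell" selection: a choice is triggered when the user's gaze rests on a target region for long enough. The sketch below is purely illustrative (the `Target` class, `dwell_select` function, and 500 ms threshold are assumptions, not anything Eyefluence has disclosed):

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A selectable on-screen region, such as a menu item."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dwell_select(samples, targets, dwell_ms=500):
    """Return the name of the first target the gaze rests on for at
    least `dwell_ms` milliseconds, or None if no selection occurs.

    `samples` is an iterable of (timestamp_ms, x, y) gaze fixes.
    """
    current, start = None, None
    for t, x, y in samples:
        hit = next((tg for tg in targets if tg.contains(x, y)), None)
        if hit is not current:
            # Gaze moved to a different target (or off all targets):
            # restart the dwell timer.
            current, start = hit, t
        elif hit is not None and t - start >= dwell_ms:
            return hit.name  # dwell threshold met: treat as a "click"
    return None
```

In practice a real system would also smooth noisy gaze samples and give visual feedback as the dwell timer fills, but the core idea is the same: steady gaze substitutes for a hand on a mouse.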
Upon trying the system, I was pleasantly surprised by how intuitive it is. I was able to navigate immediately, and the speed and efficiency of my selections improved throughout the few minutes of use, as I became more comfortable with the system. There was nothing hard about it—and that's saying something, coming from someone labeled by my teenage son as "maybe the worst video-game player ever" (sad, but true).
What really resonated with me was the fact that all of this was possible by just moving my eyes, because it opens up a vast world of new possibilities.
The Motorola Solutions demonstration used virtual reality, which could be used in a command-center environment but is not practical for an on-scene first responder, who needs to be aware of the immediate surroundings. However, it is not hard to envision a first responder wearing glasses—or an oxygen mask that includes some sort of a viewable screen—with the Eyefluence navigation capability embedded to achieve the same functions.
When discussing possible applications for first responders, one of the most frustrating parts is accounting for the difficult environments in which they work. Firefighters present the most challenging problems, because they wear thick gloves that don't work well when pressing buttons or using a touch screen. And that's only the beginning, as designers also have to deal with the fact that firefighters may not be able to see anything in thick smoke and can get caught in positions where movement of hands, arms and legs is severely limited.
Developing a user interface that functions effectively under such conditions has long seemed impossible, but this Eyefluence technology offers the core of a potential solution.
Even when a firefighter is pinned and completely immobilized physically, the eyes can still move, even if the head cannot turn. And that’s all that is needed with the Eyefluence technology to navigate to the best communications platform. For instance, imagine a pinned firefighter being able to select an option to send a distress signal just by moving his eyes, even if he cannot reach the “man down” button on his communications device.
In addition, outfitting public-safety personnel with such headgear and eye-sensor technology could enable other possibilities, as well.
For instance, a retinal scanner could be included to identify the user, unlocking a profile that activates the first responder's appropriate security level, information access and applications. And biometric monitoring of the user's eyes could provide early warnings that a first responder may be facing health risks—especially important for firefighters—and may need a break, according to Craig Siddoway, director of advanced concepts at Motorola Solutions.
It is important to stress that none of this is available now. The virtual-reality prototype is at the proof-of-concept phase and is more than a year away from being something that is available commercially, and the ideas in the previous paragraphs are not even to that point yet.
But the good news is that there is a technology that appears to be a viable solution to a problem that threatened to hamper public safety's use of broadband connectivity. There's still a lot of work to be done, not to mention the normal efforts of hardening equipment and building in the kind of cybersecurity that public safety needs for all of its applications.
This technology could clear a development path to the kind of solution that is needed throughout the public-safety and critical-infrastructure sectors. It won’t be easy, but that should not be discouraging when something so important is within reach.