Tools & Technologies: Black Box Tools on the Emergency Scene

Sept. 22, 2015
Joseph Sullivan offers a policy guide for using electronics and backup plans.

I recently went to Ukraine, and while I was there, I visited the infamous Chernobyl nuclear reactor that exploded in April 1986. There are many striking things about this desolate place, among them the statue of Prometheus stealing fire from the gods. (If you recall the legend, Prometheus stole the fire to give it to mankind; he was later sentenced to eternal punishment for this crime.) The Soviets erected a statue representing Prometheus stealing the fire and placed it in front of the movie theater near the nuclear plant. Obviously they felt pretty happy with themselves.

Ironically, the Soviets and Prometheus had more in common than they thought. After the reactor blew up, the statue was eventually moved to its present position near the Chernobyl Firefighters Memorial.

As MLB pitcher Vern Law said, “Experience is a hard teacher because she gives the test first, the lesson afterwards.”

Into the unknown

In my job, I talk with many different types of emergency workers, from the hometown responder in Kentucky to the bigwig chief in California to the PSAP worker in New Zealand. One common thread is that many of us are using tools that we don’t fully understand. Often the benefits of using these tools are too obvious to ignore. But we shouldn’t use that as a reason to accept something without trying to quantify the risk.

It is up to responsible early adopters, and chief officers in particular, to make sure that new digital tools are used in a responsible framework. Of course, most of these tools are “advisory,” and if they fail, we can resort to the old methods (i.e., the DOT hazmat guide versus a computer program, an axe versus a mechanical saw, or water in the tank versus a wetting agent). But someone needs to ensure that responders know how to detect a digital failure and that they are trained in a backup procedure.

The black box problem

Black box is a term used to describe a system that is internally unknown. For example, most people can learn how to use a computer, but if it suddenly stops working, they have no way of further troubleshooting the problem. They have to call a technician. To them, a computer is a black box.

The problem with cloud-based services and most consumer-level electronics is that they are “black boxes” as well. If they don’t work, there’s simply nothing you can do.

Smartphone apps present both a blessing and a challenge to public service agencies. Most of us are used to fast-paced technology at home and slow-moving advances on the scene. If you can’t touch it and you can’t fix it, how are you supposed to use it? If you put a nozzle on a shelf and come back in the morning, it will work the same way as it did the night before. But cloud-based software updates itself while you sleep, and if it breaks, you can’t pull a spare from the shelf.

A lesson from aviation

The aviation world faced a similar problem several years ago when GPS became widespread. On the one hand, GPS is so useful that there was no way it could simply be ignored as a navigation tool. On the other hand, certain events (interference, solar flares, etc.) could degrade the quality of a GPS signal such that it could not be used safely. No one wanted an airliner to crash because the onboard GPS wasn’t working properly.

To solve this problem, a system called RAIM was devised. RAIM stands for Receiver Autonomous Integrity Monitoring. Essentially, it is a feature that constantly checks the validity of the GPS signal. If the GPS data becomes untrustworthy, the pilot is notified via a "RAIM alarm" to stop using the GPS. This demonstrates two principles:

1) “If” the GPS fails, the pilot is notified.

2) The pilot then uses an alternate (“Or”) navigation method that does not rely on GPS.

This same policy that protects lives in aircraft is an excellent model for us.

The “If/Or” approach

Similar to what pilots use, the “If/Or” approach was developed to give emergency response agencies a framework for defining standard operating procedures and guidelines (SOPs/SOGs) in the context of digital “black boxes.” This approach requires each black box to conform to two requirements:

1. If: “If” means, “If it fails.” If the black box doesn’t do what you want it to do, will you know about it? Does it have some way of internally detecting the failure? Will crews be notified of the failure? In an aviation GPS, this would be a “RAIM” check.

2. Or: “Or” means “Or what?” In other words, what is the backup plan? Is there a fallback technology, and are crews trained in using the fallback technology?
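As a minimal sketch of the two requirements above, the "If/Or" test can be thought of as a simple four-point checklist. The class and function names here are hypothetical illustrations, not part of any real system:

```python
from dataclasses import dataclass

@dataclass
class BlackBoxTool:
    """A digital tool evaluated under the 'If/Or' approach (hypothetical model)."""
    name: str
    detects_own_failure: bool        # "If": can the tool detect that it has failed?
    notifies_crews: bool             # "If": will crews be told about the failure?
    has_fallback: bool               # "Or": is there a backup method?
    crews_trained_on_fallback: bool  # "Or": are crews trained in the backup?

def passes_if_or_test(tool: BlackBoxTool) -> bool:
    """A tool is safe to adopt only if both halves of the test pass."""
    if_ok = tool.detects_own_failure and tool.notifies_crews
    or_ok = tool.has_fallback and tool.crews_trained_on_fallback
    return if_ok and or_ok

# Aviation GPS passes: RAIM provides the "If," alternate navigation the "Or."
gps = BlackBoxTool("Aviation GPS", True, True, True, True)
print(passes_if_or_test(gps))  # True
```

The point of the sketch is that a missing answer to any one of the four questions fails the whole test; there is no partial credit.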

Putting it together

For an example of the "If/Or" approach to black boxes, let's say we have an imaginary smartphone app called "Water" that tells you how far away the nearest hydrant or water source is. Let's further assume that the Water app is "If"-compliant; that is, there is a foolproof method for knowing whether or not it is working. In the case of such a simple app, it might mean visually verifying the database version or the presence of hydrants on the screen.

Assume the following SOG outline:

i) All personnel are encouraged to use the Water app.

(1) When the Water app is activated, personnel will visually verify that the database version is no more than 24 hours old.

(2) If the database is expired, or if no water sources appear on the screen, use of the app will be discontinued. In this case, map books and preplans will be used exclusively to determine water source locations.

ii) Mission-critical staff—such as shift officers, duty officers and department heads—are required to carry a backup planning tool [map books] with water source locations in their vehicle.

iii) In conjunction with training over [insert timeframe], this policy will be reviewed to ensure continued compliance.

Now apply the "If/Or" test to this SOG. "If" the Water app were to fail, there is a quick way of verifying that it has failed. The alternate "Or" method is to use the map books. Because this SOG passes the "If/Or" test, it would be safe to use as a guideline.
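The SOG's decision logic can also be sketched in code. Since the Water app is imaginary, everything here—the function names, the database timestamp, the hydrant count—is a hypothetical illustration of the 24-hour freshness check and the map-book fallback:

```python
from datetime import datetime, timedelta

# Per the sample SOG: the database must be no more than 24 hours old.
MAX_DB_AGE = timedelta(hours=24)

def water_app_usable(db_timestamp: datetime, hydrants_on_screen: int,
                     now: datetime) -> bool:
    """The "If" check: is the (hypothetical) Water app trustworthy right now?"""
    database_fresh = (now - db_timestamp) <= MAX_DB_AGE
    return database_fresh and hydrants_on_screen > 0

def water_source_method(db_timestamp: datetime, hydrants_on_screen: int,
                        now: datetime) -> str:
    """Apply the SOG: use the app if the "If" check passes, "Or" fall back."""
    if water_app_usable(db_timestamp, hydrants_on_screen, now):
        return "Water app"
    return "map books and preplans"  # the trained fallback method

now = datetime(2015, 9, 22, 8, 0)
print(water_source_method(now - timedelta(hours=3), 5, now))   # Water app
print(water_source_method(now - timedelta(hours=30), 5, now))  # map books and preplans
print(water_source_method(now - timedelta(hours=3), 0, now))   # map books and preplans
```

Note that the fallback is the default: the crew only uses the black box when it affirmatively passes the check, which mirrors how a RAIM alarm pushes a pilot back to alternate navigation.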

In sum

There are many ways in which the rapid advance of software-based technology can make the emergency services faster and safer. Black box technology can be dangerous if it is not used in a considered way. However, in connection with a properly developed SOG that passes the “If/Or” test, these “black box” tools can become a safe and helpful part of our modern tool belt.
