For years, the fire service has collected data on fire fatalities. Through the National Fire Incident Reporting System (NFIRS), we now have at our fingertips all kinds of important information on the national fire experience. Today, we know where people are most likely to die from fire (homes), what ignition sources are likely to cause fatalities (smoking) and even the material involved in fatal fire ignitions (mattresses, bedding and upholstered furniture). Our ability to understand and learn from our fire experience has improved dramatically over the years as we have increased our ability to capture, analyze and report performance data. This is a good trend indeed.
Photo courtesy of Austin Fire Department
The Austin, TX, Fire Department partnered with an outdoor theater to sponsor the rollout of the “Freddy Finger” fire safety education campaign. Firefighters handed out “Freddy” foam fingers, door hangers with the “Freddy Finger” message and general information about smoke alarms.
But an important question remains: What are we doing with this information? As we continue to learn and grow as an industry, are we using both local and national performance data to inform our decisions – to refine policy, define mission and scope, and rethink current practices? Or is the data simply used to report what is happening – an ongoing look at the fire problem in our communities and country? This is an important distinction and not one to be taken lightly. On the one hand, we are actively engaged in the performance process, exploring the meaning of our data and performance information. On the other, we are merely spectators to an ever-growing mountain of data, churning out questionable statistics.
In a society increasingly obsessed with “what the numbers say,” it is important that we ask not only what the numbers are telling us, but also, and more importantly, what we can do about it. Essentially, this is the difference between an organization that simply collects a lot of data and one that actually uses the data to explore the effectiveness of its services and programs.
In 2002, the City of Austin, TX, found itself facing a disturbing trend in its performance data. By July 2002, the city had already experienced nine fire fatalities. Austin’s average fire fatality rate is five per year – a good figure for a city of 700,000 residents – so to be well over the annual average by mid-year was a frightening prospect. In fact, the worst year for fire fatalities in the city’s history had been 1981, when 16 people died from fire. At the mid-year rate in 2002, the city appeared to be on a record-breaking path.
Historically, we would have assumed that Austin was simply experiencing an inevitable statistical spike – the anomaly year. As an industry, we tend to assume that some years will just be “bad”: more people will die, more structures will burn and more firefighters will be injured – it’s just the cyclical nature of the work. But in today’s changing political environment, simply accepting dramatic performance trends as the inevitable whims of destiny is no longer acceptable. Today, we have to ask the fundamental question: “What’s going on here and why, and what can we do about it?”
Understanding the Problem
After the ninth fire fatality in 2002, the Austin Fire Department pulled together a cross-functional team of employees and tasked them to go back and evaluate every single fatal fire incident and attempt to identify what was going on. This “Fire Fatality Task Force” was given the daunting (and to some, threatening) task of looking at each incident critically – asking difficult questions about our firefighting practices, response times, on-scene procedures and the effectiveness of our inspections. The charge was to identify whether these fatal fires could have been prevented, not to assign blame.