What You Think You Know
For a brief period in the middle of my career, I had the opportunity to study major fires as a lead investigator for the U.S. Fire Administration’s major fires project. Among the incidents that attracted our team’s attention back then were those that killed two or more firefighters. Sadly, I had more than my share of opportunities to investigate these kinds of incidents.
The most disappointing thing I discovered while working these incidents was just how often certain bad things repeated themselves. Despite the differences in these fires and the fire departments attending them, the causes of firefighter fatalities seemed altogether too common after a while.
It almost goes without saying that what you don’t know can hurt you when it comes to firefighting. But as true as that may be, I didn’t find it was a problem nearly as often as something more insidious: Thinking you know something that you don’t. For all their experience, I saw incident commanders, company officers, and journeyman firefighters repeat simple mistakes with alarming frequency and surprisingly little insight into their own misapprehensions about situations they encounter more or less routinely.
Two incidents, in particular, stand out for me. In February 1992, two Indianapolis firefighters died while investigating a fire alarm activation in an athletic club that turned out to be a real fire. At a fire in Memphis, Tennessee in April 1994, two firefighters lost their lives when the fire they attended on the ninth floor of a high-rise apartment building turned out not to be the false alarm they assumed it was.
In both instances, the firefighters knew the buildings. They had attended alarms in each many times. But somehow that knowledge produced little understanding of the conditions they confronted the night these firefighters died.
As they came to realize the situations facing them presented bigger challenges than they originally expected and tried in vain to correct their mistakes, they failed to communicate the conditions confronting them to others who could help. In both cases, this put additional lives in danger.
Both fire departments prided themselves on aggressive firefighting. The firefighters involved were generally among the most experienced in their departments. And the departments themselves were widely recognized for progressive operating procedures, especially those for safely managing complex incidents in multistory buildings.
How then could they get it so wrong? Well, for starters, no one likes to admit when they have got it wrong. When those involved have little understanding themselves of what's actually going wrong, they become even less inclined, and less able, to communicate what's happening to others who could help them.
I sincerely doubt that the firefighters killed in either incident really knew fully what was happening to them in their final moments. Sadly, neither did their peers. In the aftermath of each incident, those who should have known what was happening offered detailed explanations that differed sharply from the available evidence, including the first-person accounts of others involved in the incidents.
Even if the information available to those in command would not have filled the gaps in their knowledge, it would have materially contradicted what they thought they knew. This would have left them with two options: find a way to incorporate the new information into a coherent picture of what was happening, or, more likely, acknowledge the irreconcilable contradictions and order a strategic retreat to regroup.
In any organization, the same situations arise all the time. If we do not encourage people to communicate openly and challenge assumptions continuously, we'll never know whether we truly know what we think we know.