A recent helicopter accident once again reminded me how tenuous life is. While the folks in this particular incident survived, many times they don't. Just look at the monthly lists of accidents on the NTSB website and you'll see what I mean. The problem in aviation is that it automatically cleanses itself of folks who don't belong. How I wish we could predict which folks will kill themselves in aviation. Obviously we'd try to give them a hint. The best we can do is offer the occasional web article we might stumble across.
We've got to do better in preventing accidents & incidents. It's taken 100 years but the first 99% of the effort has been completed. That last 1% will probably take another 100 years. For now we'll have to be satisfied with accident reports that state pilot error as the cause. I'm not quite satisfied with this conclusion that appears on some 90% of the reports in the NTSB database.
In doing some research on flight safety, I came across the work of H.W. Heinrich. Mr. Heinrich was an industrial safety engineer who, in 1931, developed a model that describes how major injuries and industrial accidents occur.
For a given population of workers, his model proposes that for every 300 unsafe acts there are 29 minor injuries and one major injury. Heinrich's theory accounts for the first three levels of the model. For more than 75 years, industrial safety managers have tried to reduce such accidents and injuries by attacking the problem from different ends of a three level pyramid. They run around fixing frayed power cords, making sure equipment interlocks work, and enforcing safety rules.
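Heinrich's ratio is easy to play with numerically. Here's a minimal sketch of the 300:29:1 pyramid, assuming (as I do here for illustration only) that the ratio scales linearly with the number of unsafe acts; the function name and the scaling assumption are mine, not Heinrich's:

```python
# Heinrich's 1931 ratio: for every 300 unsafe acts,
# 29 minor injuries and 1 major injury.
HEINRICH_RATIO = {"unsafe_acts": 300, "minor_injuries": 29, "major_injuries": 1}

def project_pyramid(observed_unsafe_acts):
    """Project expected injuries from a count of observed unsafe acts,
    assuming the ratio scales linearly (an illustrative assumption)."""
    scale = observed_unsafe_acts / HEINRICH_RATIO["unsafe_acts"]
    return {level: count * scale for level, count in HEINRICH_RATIO.items()}

# 900 unsafe acts -> about 87 minor injuries and 3 major injuries
print(project_pyramid(900))
```

The point of the exercise: the unsafe acts at the base of the pyramid are where the leverage is, because everything above them follows in proportion.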
How the Model Applies to Aviation
The expanded model starts here. In the expanded model, we focus on leading indicators. These are the antecedents to the accident, and they are the only place we can focus to prevent them. The proposed model includes two additional layers called "Unsafe acts" and "Pilot Qualifications".
We all understand close calls. We've all had them. Some more than others. And that's the point. Some folks simply have more close calls, take more shortcuts & unnecessary risks. Sometimes they are lucky. Sometimes luck runs out. If you keep playing darts, eventually you'll get a bull's-eye. Out of 3,000 unsafe acts, one is going to be the winner that awards the pilot a check in the fatal column.
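The "luck runs out" argument above is just compounding probability. Here's a hedged sketch, assuming (purely for illustration) that each unsafe act independently carries a small chance of ending in an accident; the per-act odds below come from the 3,000-to-1 figure in the text:

```python
def chance_of_accident(p_per_act, n_acts):
    """Probability that at least one of n independent unsafe acts
    ends in an accident: 1 - (1 - p)^n."""
    return 1 - (1 - p_per_act) ** n_acts

# With 1-in-3,000 odds per unsafe act, a pilot who commits 3,000
# unsafe acts faces roughly a 63% chance their luck has run out.
p = 1 / 3000
print(round(chance_of_accident(p, 3000), 2))  # ≈ 0.63
```

In other words, even tiny per-act odds become near-certainty for the pilot who keeps throwing darts.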
Even more insidious is the "Pilot Qualifications" layer. Imagine an employer reviewing all your flights, all the decisions you made on those flights, and all the little things that surprised you on all those flights. Would the employer select you to be a pilot for them? This layer includes (but is not limited to):
Where Do We Go From Here?
Attacking the triangle from the top is expensive; just look at what we spend in the NTSB budget for accident investigations. The other problem with using the model starting at the top is that an accident is a lagging indicator, which doesn't necessarily serve to prevent injuries. To illustrate what I mean, take a look at the typical report: the nebulous term "pilot error" is used 90% of the time. Well, how do you prevent that? I guess it means an in-depth analysis of all errors and their root causes. Got some spare time? We've got a whole bunch of people looking at this data trying to come up with conclusions that define what action leads to what results. It might be more complicated than we think.