Failures in highly technological environments, such as military aircraft, can be investigated with established tools such as HFACS, the U.S. Department of Defense’s Human Factors Analysis and Classification System. However, HFACS has limitations and does not always highlight the deeper causal factors that contribute to such failures. In what may be the first application of Bayes’ theorem to an HFACS dataset, Andrew Miranda examined data from 95 severe incidents to pinpoint the external influences behind so-called human error.
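The article does not report the underlying figures, but the basic idea of such an analysis can be sketched with Bayes’ theorem: given that a particular error type was coded in a mishap report, how likely is it that a particular external condition was also present? The Python sketch below is purely illustrative; the category names and counts are hypothetical (only the total of 95 mishaps comes from the article) and are not drawn from Miranda’s dataset.

```python
# Minimal sketch of applying Bayes' theorem to HFACS-style mishap codings.
# All category names and counts below are hypothetical illustrations,
# not figures from Miranda's analysis.

def bayes_posterior(p_evidence_given_h: float, p_h: float, p_evidence: float) -> float:
    """P(H | E) = P(E | H) * P(H) / P(E)."""
    return p_evidence_given_h * p_h / p_evidence

# Hypothetical tallies across a set of mishap reports:
total_mishaps = 95                  # dataset size mentioned in the article
with_skill_error = 60               # mishaps coded with a skill-based error (hypothetical)
with_display_issue = 25             # mishaps coded with a cockpit display/control issue (hypothetical)
both = 20                           # mishaps coded with both (hypothetical)

p_error = with_skill_error / total_mishaps          # P(skill-based error)
p_display = with_display_issue / total_mishaps      # P(display/control issue)
p_error_given_display = both / with_display_issue   # P(error | display issue)

# Posterior: how likely a display/control issue was present,
# given that a skill-based error was coded.
p_display_given_error = bayes_posterior(p_error_given_display, p_display, p_error)
print(f"P(display issue | skill-based error) = {p_display_given_error:.2f}")
```

With these made-up numbers the posterior works out to about 0.33, i.e., a display or control issue would appear in roughly a third of the mishaps coded with a skill-based error; the point of the exercise is to shift attention from the error label itself to the conditions that co-occur with it.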
“Understanding Human Error in Naval Aviation Mishaps” discusses the three potential influences on performance-based errors that Miranda found: sensory misperception (e.g., spatial disorientation), mental awareness (e.g., cognition, attention), and the technological environment (e.g., design of cockpit displays and controls).
In addition, factors that likely contributed to judgment and decision-making errors included supervisory or organizational influences that may have placed aviators in situations of increased risk, taxing if not their skills then their decision-making abilities.
Digging deeper into the external influences in the 95 mishaps, Miranda, an aerospace experimental psychologist at the Naval Safety Center, used content analysis. Themes drawn from the mishap reports helped to explain how and why the failures occurred, and they could be classified as involving teamwork and organizational/supervisory influences. For example, there was evidence that crewmembers unexpectedly found themselves sharing an unstated assumption that someone else was responsible for a particular task. When this occurred under circumstances of slowly increasing risk, individual crewmembers did not speak up or intervene, because the social and technical conditions unintentionally discouraged them from doing so. Slowly but surely, an unsafe situation would emerge.
Miranda notes, “This project was essentially the extension of human factors work spanning 70 years: examine beyond the label ‘human error’ in favor of more careful considerations about the general conditions of aviation accidents. There were 95 severe mishaps in our dataset. To those of us on the outside, it’s easy to look back with hindsight at each one of those accidents and wonder why the people involved did (or didn’t) do what they did (or didn’t). But we won’t learn much with that approach. Instead, we made the effort to take an insider perspective. Each of these mishaps is an intricate story of people and technology under changing, dynamic circumstances that ultimately leads to an aircraft being destroyed or even lives being lost. The people involved made decisions and took actions that made sense to them at the time. Human factors principles and methods are uniquely capable of both uncovering how conditions foster pilot error and suggesting how to improve those conditions for future aviators.”
Miranda’s work has the potential to reveal ways in which HFACS or similar incident analysis tools can be used in other complex systems, such as health care, oil and gas, transportation, and maritime operations.