A psychologist's view of validating aviation systems
Main authors:
Format: Conference paper
Language: English
Subjects:
Online access: Order full text
Summary: All systems, no matter what they are designed to do, have shortcomings that may make them less productive than was hoped during initial development. Such shortcomings can arise at any stage of development, from conception to the end of the implementation life cycle. While system failures and errors of a lesser magnitude can occur as a function of mechanical or software breakdown, the majority of such problems in aviation are usually laid on the shoulders of the human operator and, to a lesser extent, on human factors. The operator bears the responsibility and blame even though, from a human factors perspective, the error may have been designed into the system. Human factors is not a new concept in aviation. The name may be new, but the issues related to operators in the loop date back to the industrial revolution of the nineteenth century and certainly to the aviation build-up for World War I. During this first global confrontation, military services on all sides discovered rather quickly that poor selection and training led to drastically increased personnel losses. While hardware design became an issue later, the early efforts focused primarily on greater care in pilot selection and on training. This actually involved early, labor-intensive simulation, using such devices as sticks and chairs mounted on rope networks that could be moved manually in response to control input. The use of selection criteria and improved training led to more viable person-machine systems. More pilots survived training and their first ten missions in the air, a rule of thumb arrived at by experience that predicted ultimate survival better than any other. This rule was to hold through World War II. By that time, personnel selection and training had become very sophisticated relative to previous standards. Many psychologists were also drafted into Army Air Corps programs geared towards refining the human factor. However, despite the talent involved in these programs and the tremendous build-up of aviation during the war, there were still aircraft designs that were man killers (no sexism implied, since all combat pilots were men). One classic design error identified fifty years ago was the multipointer altimeter, which could easily be misread, especially by a pilot under considerable task load. Such misreadings have led to fully operational aircraft being flown into terrain. The authors of the research which formally identified this problem put 'Human Errors' in quotation marks.