HMI605. Barrier Analysis and Accident Prevention, 5p

Instructor Erik Hollnagel
Semester

Prerequisites  
Goals
Content
  1. Introduction
     This session will introduce the topic of human erroneous action and the concepts needed to explore the complex issues associated with the human contribution to system breakdowns. The problems in defining "human error" will be discussed, as well as the view that erroneous actions are an important form of creativity and of learning about a system. Specific topics will be the complexity and cognition behind human erroneous actions, and the need for anticipating and modelling them.

  2. The analysis of accidents and mishaps
     An important part of giving meaning to past accidents and mishaps consists of methodological approaches to incident analysis. The retrospective use of the Cognitive Reliability and Error Analysis Method (CREAM) will be introduced and illustrated by examples; a schematic sketch of the retrospective step is given after this list. After a system disaster, a human operator can always be found somewhere in the rubble. One of the greatest challenges in investigating the causes of a breakdown is therefore to consider the organisational and cognitive factors that shaped the human actions. A concrete example of an aircraft accident will be used to show that past mishaps can be analysed, despite the inherent difficulty and complexity of such an analysis.

  3. The context of erroneous actions
     The organisation of which operators are part plays an important role in shaping their actions, through the resources it provides and the double binds it creates. New technology may also fundamentally increase cognitive demands and change the type and consequences of human erroneous actions. This has been captured by terms such as clumsy automation, multi-mode systems, and keyhole effects, all of which can contribute to breakdowns in human-machine interaction. In the Scandinavian tradition this has been expressed as the Man-Technology-Organisation (MTO) concept, which often is a prime determinant of how erroneous actions manifest themselves. Similarly, physical and functional barriers may be used to prevent the consequences of erroneous actions from propagating; a simple barrier model is sketched after this list.

  4. Predicting system failures
     Human erroneous actions have always been seen as a major (and in some industries growing) threat to system safety. There is a long history of predicting human erroneous actions and their consequences for system integrity, especially in nuclear power. Here we will discuss traditional methods of risk assessment and their shortcomings, and introduce models that provide new ways forward. The predictive use of the Cognitive Reliability and Error Analysis Method (CREAM) will be described; its screening step is outlined in the last sketch after this list.

  5. Conclusions and summary
     In the context of cognitive systems engineering, erroneous actions can be seen both as a consequence of coping with complexity and as an indication of the quality of the tools provided. The course will end by providing a larger perspective on erroneous actions as potential failures not only of the people at the sharp end (operators, pilots, users), but also of the people at the blunt end (designers, managers, etc.).
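
As a companion to session 2, here is a minimal sketch, in Python, of how a retrospective CREAM analysis can be organised: an observed error mode (phenotype) is traced back through a table of possible antecedents until only root causes (genotypes) remain. The ANTECEDENTS table and the trace function are illustrative stand-ins, not Hollnagel's full classification scheme.

    # Minimal sketch of retrospective analysis: trace an observed error mode
    # (phenotype) back to possible root causes (genotypes). The table below
    # is an illustrative stand-in for CREAM's classification groups.
    ANTECEDENTS = {
        "action at wrong time": ["faulty diagnosis", "distraction"],
        "faulty diagnosis": ["inadequate training", "ambiguous interface"],
        "distraction": ["high workload", "noisy environment"],
    }

    def trace(consequent, path=()):
        """Yield cause chains from an observed error mode to root causes."""
        causes = ANTECEDENTS.get(consequent)
        if not causes:  # no listed antecedents: treat as a root cause
            yield path + (consequent,)
            return
        for cause in causes:
            yield from trace(cause, path + (consequent,))

    for chain in trace("action at wrong time"):
        print(" <- ".join(chain))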
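Session 3 mentions physical and functional barriers. The sketch below models each barrier by the kind of system that implements it, following the common division into physical, functional, symbolic, and immaterial barrier systems. The example barriers, and the rule that only physical and functional barriers act without operator interpretation, are simplifying assumptions.

    from dataclasses import dataclass

    @dataclass
    class Barrier:
        name: str
        system: str    # "physical", "functional", "symbolic" or "immaterial"
        function: str  # the propagation the barrier is meant to stop

    BARRIERS = [
        Barrier("containment wall", "physical", "blocks release of material"),
        Barrier("interlock", "functional", "prevents actions out of sequence"),
        Barrier("warning sign", "symbolic", "signals a hazard to the operator"),
        Barrier("safety rule", "immaterial", "prescribes safe conduct"),
    ]

    for b in BARRIERS:
        # Simplifying assumption: physical and functional barriers work by
        # themselves, while symbolic and immaterial ones must be perceived
        # and acted on by an operator to be effective.
        alone = b.system in ("physical", "functional")
        print(f"{b.name:16} ({b.system:10}) effective by itself: {alone}")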
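Finally, for session 4, a minimal sketch of the screening step in predictive CREAM: the nine Common Performance Conditions (CPCs) are rated, the counts of reducing and improving conditions are formed, and a control mode is read off. The ratings shown and the mode boundaries are illustrative simplifications of the diagram given in Hollnagel (1998).

    # The nine Common Performance Conditions (CPCs) of CREAM.
    CPCS = [
        "adequacy of organisation",
        "working conditions",
        "adequacy of MMI and operational support",
        "availability of procedures/plans",
        "number of simultaneous goals",
        "available time",
        "time of day",
        "adequacy of training and experience",
        "crew collaboration quality",
    ]

    # Hypothetical assessment of one scenario:
    # -1 = reduces reliability, 0 = not significant, +1 = improves it.
    ratings = {cpc: 0 for cpc in CPCS}
    ratings["available time"] = -1
    ratings["number of simultaneous goals"] = -1
    ratings["adequacy of training and experience"] = +1

    reduced = sum(1 for r in ratings.values() if r < 0)
    improved = sum(1 for r in ratings.values() if r > 0)

    # Illustrative mapping from the two counts to a control mode; the
    # method itself reads the mode off a two-dimensional diagram.
    if reduced <= 1 and improved >= 4:
        mode = "strategic"
    elif reduced <= 2:
        mode = "tactical"
    elif reduced <= 5:
        mode = "opportunistic"
    else:
        mode = "scrambled"

    print(f"{reduced} reducing, {improved} improving -> control mode: {mode}")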

Literature

Examination Students will do a group project that includes implementing a user interface using a UIMS such as Visual Basic and conducting an evaluation of that interface.
Other