A Study on Countermeasure Strategy on Risk of Human Errors driven by Advanced and Automated Systems Through Consideration of Related Theories

Establishing a Risk Response Strategy through a Theoretical Review of Human Errors Derived from Modern Advanced and Automated Systems

  • Shin, In Jae (Department of Occupational Accident Prevention, Ministry of Employment and Labor)
  • Shin, In Jae (Occupational Accident Prevention Guidance Division, Ministry of Employment and Labor)
  • Received : 2013.06.25
  • Accepted : 2014.02.10
  • Published : 2014.02.28

Abstract

This paper provides an integrated view of human-system interaction in advanced and automated systems that adopt computerized, multi-functional artifacts and complicated organizations, such as nuclear power plants, chemical plants, and steel and semiconductor manufacturing systems. Although current systems have advanced with various kinds of automated equipment, human operators from various organizations remain involved in them, so system safety is still uncertain. In particular, a human operator plays an important role under critical conditions that can lead to catastrophic accidents. Knowledge of human error helps a risk manager, as well as a designer, to create and control a more credible system. Several human error theories were reviewed and adopted to form the integrated perspective: the gulf of execution and evaluation; risk homeostasis; the ironies of automation; trust in automation; design affordance; distributed cognition; situation awareness; and plan delegation theory. The integrated perspective embraces these theories within three levels of human-system interaction: the affordance level, the psychological logic level, and the trust level. This paper argues that the risk management process should deal with human errors by providing (1) improvement of reasoning, (2) support for operators' situation awareness, and (3) continuous monitoring of the harmonization of human-system interaction. This approach may help people understand the characteristics of human-system interaction failures and their countermeasures.
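
To make the three-level framing and the three countermeasure provisions more concrete, the minimal Python sketch below encodes them as simple data structures that a risk manager might use to tag observed human-system interaction failures. This is a hypothetical illustration only, not the author's method: the class names, the example observation, and the pairing of theories with countermeasures are assumptions for illustration.

```python
from dataclasses import dataclass, field
from enum import Enum


# The three levels of human-system interaction named in the abstract.
class InteractionLevel(Enum):
    AFFORDANCE = "affordance level"
    PSYCHOLOGICAL_LOGIC = "psychological logic level"
    TRUST = "trust level"


# The three countermeasure provisions the paper argues a risk
# management process should supply.
class Countermeasure(Enum):
    REASONING_IMPROVEMENT = "improvement of reasoning"
    SITUATION_AWARENESS_SUPPORT = "support for operators' situation awareness"
    CONTINUOUS_MONITORING = "continuous monitoring of human-system harmonization"


# The human error theories reviewed in the paper.
REVIEWED_THEORIES = [
    "gulf of execution and evaluation",
    "risk homeostasis",
    "the ironies of automation",
    "trust in automation",
    "design affordance",
    "distributed cognition",
    "situation awareness",
    "plan delegation",
]


@dataclass
class HumanErrorObservation:
    """One observed human-system interaction failure, tagged for risk review."""
    description: str
    level: InteractionLevel
    related_theories: list[str] = field(default_factory=list)
    countermeasures: list[Countermeasure] = field(default_factory=list)


if __name__ == "__main__":
    # Hypothetical example: an operator over-relies on automation and
    # dismisses a rare alarm; the level and countermeasure choices here
    # are illustrative assumptions, not taken from the paper.
    obs = HumanErrorObservation(
        description="Operator dismissed a rare alarm as a false positive",
        level=InteractionLevel.TRUST,
        related_theories=["trust in automation", "the ironies of automation"],
        countermeasures=[
            Countermeasure.SITUATION_AWARENESS_SUPPORT,
            Countermeasure.CONTINUOUS_MONITORING,
        ],
    )
    print(obs.level.value, "->", [c.value for c in obs.countermeasures])
```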

Keywords

References

  1. C. Perrow, "Normal Accidents: Living with High-risk Technologies", Princeton, NJ: Princeton University Press, 1999.
  2. T. Moriyama, H. Ohtani, "Risk Assessment Tools Incorporating Human Error Probabilities in the Japanese Small-sized Establishment", Safety Science, 2009.
  3. D.A. Wiegmann, S.A. Shappell, "A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System", Ashgate, 2003.
  4. J.S. Busby, R.E. Hibberd, "Mutual Misconceptions between Designers and Operators of Hazardous Systems", Research in Engineering Design, 13, 132-138, 2002. https://doi.org/10.1007/s00163-002-0012-2
  5. J. Rasmussen, "Skills, Rules, Knowledge: Signals, Signs and Symbols and other Distinctions in Human Performance Models", IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257-267, 1983. https://doi.org/10.1109/TSMC.1983.6313160
  6. D.A. Norman, "Categorization of Action Slips". Psychological Review, 88, 1-15, 1981. https://doi.org/10.1037/0033-295X.88.1.1
  7. J. Reason, Human Error. Cambridge University Press, 1990.
  8. A. Mital, A. Pennathur, "Advanced Technologies and Humans in Manufacturing Workplaces: An Interdependent Relationship", International Journal of Industrial Ergonomics, vol. 33, no. 4, pp. 295-313, 2004. https://doi.org/10.1016/j.ergon.2003.10.002
  9. Aviation Safety Investigation Report 200402747, "Final ATSB Report into the 24 July 2004 Boeing 737 Ground Proximity Caution near Canberra", Australian Transport Safety Bureau, 2005.
  10. Accident Investigation Report, "Accident of the a/c 5B-DBY of Helios Airways, Flight HCY522 on August 14, 2005", The Accident Investigation and Aviation Safety Board, 2006.
  11. I.J. Shin, "Development of a Theory-based Ontology of Design-Induced Error", PhD Thesis, University of Bath, 2009.
  12. B.M. Muir, N. Moray, "Trust in Automation: Part 1 - Theoretical Issues in the Study of Trust and Human Intervention in Automated Systems", Ergonomics, 37, 1905-1923, 1994. https://doi.org/10.1080/00140139408964957
  13. L. Bainbridge, "The Ironies of Automation". Automatica, 19, 775-780, 1983. https://doi.org/10.1016/0005-1098(83)90046-8
  14. N.B. Sarter, D.D. Woods and C.E. Billings, "Automation Surprises", in G. Salvendy (Ed.) Handbook of Human Factors & Ergonomics, second edition, Wiley. pp. 1926-1943, 1997.
  15. B.L. Wong, J. Hayes and T. Moore, "What Makes Emergency Ambulance Command and Control Complex?", Workshop on Complexity in Design and Engineering, 10-12 March 2005.
  16. G.J.S. Wilde, "The Theory of Risk Homeostasis: Implications for Safety and Health", Risk Analysis, 2, 209-225, 1982. https://doi.org/10.1111/j.1539-6924.1982.tb01384.x
  17. E. Hutchins, J.D. Hollan and D.A. Norman, "Direct Manipulation Interfaces", Human-Computer Interaction, 1, 311-338, 1985. https://doi.org/10.1207/s15327051hci0104_2
  18. J.J. Gibson, "The Theory of Affordances", In R. Shaw, J. Bransford (Eds.), "Perceiving, Acting, and Knowing: Toward an Ecological Psychology" (pp. 67-82). Hillsdale, NJ: Lawrence Erlbaum, 1977.
  19. D.A. Norman, "The Design of Everyday Things", The MIT Press, 1998.
  20. J.S. Busby, E.J. Hughes, "How Plan Delegation Contributes to Systemic Failure", Human Systems Management, 22, 13-22, 2003.
  21. V. De Keyser, "Temporal Decision Making in Complex Environments", Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, Vol. 327, No. 1241, Human Factors in Hazardous Situations (Apr. 12, 1990), pp. 569-576, 1990. https://doi.org/10.1098/rstb.1990.0099
  22. M.R. Endsley, "Toward a Theory of Situation Awareness in Dynamic Systems". Human Factors, 37(1), 32-64, 1995. https://doi.org/10.1518/001872095779049543
  23. J. Hollan, E. Hutchins and D. Kirsh, "Distributed Cognition: Toward a New Foundation for Human-computer Interaction Research". ACM Transactions on Computer-Human Interaction, 7, 174-196, 2000. https://doi.org/10.1145/353485.353487
  24. HSE, "The Train Collision at Ladbroke Grove 5 October 1999. A Report of the HSE Investigation", Health and Safety Executive. London: HMSO, 2000.
  25. KOSHA, Accident Report, 2003.
  26. US-Canada Power System Outage Task Force, "Final Report on the August 14, 2003 Blackout in the United States and Canada: Causes and Recommendations", Ministry of Energy, 2004.
  27. BP Fatal Accident Investigation Report, "Isomerization Unit Explosion Final Report", Texas, USA, 2005.
  28. K. Ohmae, "World is Ignoring Most Important Lesson from Fukushima Nuclear Disaster", April 4 2012, The Christian Science Monitor, 2012.
  29. J.R. Wilson, & A. Rutherford, "Mental Models: Theory and Application in Human Factors", Human Factors, 31, 617-634, 1989. https://doi.org/10.1177/001872088903100601
  30. J.S. Busby, R.E. Hibberd, "The Coordinating Role of Organizational Artefacts in Distributed Cognitions and How it Fails in Maritime Operations", Le travail humain, 69, pp. 25-47, 2006. https://doi.org/10.3917/th.691.0025
  31. F. Durso, K. Rawson, S. Girotto, "Comprehension and Situation Awareness", In F. Durso, R. Nickerson, S. Dumais, S. Lewandowsky, T. Perfect (Eds.), Handbook of Applied Cognition (2nd ed., pp. 163-194). Hoboken, NJ: Wiley, 2007.
  32. D.D. Woods, R.I. Cook, "Perspectives on Human Error: Hindsight Bias and Local Rationality". In F. Durso (Ed.) Handbook of Applied Cognitive Psychology. New York: Wiley, 141-171, 1999.
  33. J.C. Park, "Techniques and Management Strategies for Preventing Human-error Related Chemical Accidents", Korea Occupational Safety and Health Research Institute, 2012.