Key Insights Across the Series
- Expertise as a Double-Edged Sword: Deep domain knowledge builds confidence that can harden into overconfidence, filtering out warning signs and distorting perceptions of vulnerability.
- Invisible Organizational Failures: Major disasters such as the Titanic sinking and the Challenger launch stem not from incompetence but from universal psychological and organizational mechanisms in which pressure suppresses contrary evidence.
- Psychology of Risk Translation: Failures arise from disconnects between physical reality, engineering knowledge, psychological beliefs, and organizational incentives.
- Need for External Safeguards: Individual awareness of danger is insufficient; organizations must build checklists, review boards, and structured channels for dissent so that contrary evidence is surfaced and addressed.
Related Content
- The Structural Post-Mortem Series - Case studies of catastrophic failures
- Human Systems and Behavior - Cognitive biases and decision-making patterns
- The Driver's Mind Series - Psychology of human-machine interaction in transportation
References
Adensamer, A., Gsenger, R., & Klausner, L. (2021). “Computer says no”: Algorithmic decision support and organisational responsibility. Journal of Responsible Technology, 7, 100014. https://doi.org/10.1016/j.jrt.2021.100014
Bauer, W., Hämmerle, M., Schlund, S., & Vocke, C. (2019). Designing AI-supported human-machine interaction. International Journal of Human-Computer Studies, 131, 25-39.
Bernabei, M., & Costantino, F. (2024). The role of automation in operator performance degradation: A systematic review. Safety Science, 170, 106329. https://doi.org/10.1016/j.ssci.2023.106329
Bloch, K. (2016). Incident investigation. In Rethinking Bhopal (pp. 45-67). Elsevier. https://doi.org/10.1016/B978-0-12-803778-2.00003-5
Boreham, N., Shea, C., & Mackway-Jones, K. (2000). Clinical risk and collective competence in the hospital emergency department in the UK. Social Science & Medicine, 51(3), 441-456. https://doi.org/10.1016/S0277-9536(99)00441-4
Burgén, J., & Bram, S. (2024). Safety on automated passenger ships: Exploration of evacuation scenarios for coastal vessels. Maritime Transport Research, 6, 100110. https://doi.org/10.1016/j.martra.2024.100110
Cummings, M. L., Marquez, J. J., & Roy, S. (2012). Measuring cognitive load for teleoperated search and rescue. IEEE Transactions on Systems, Man, and Cybernetics, 42(4), 793-804.
D’Addona, D., Bracco, F., Bettoni, A., Nishino, N., Carpanzano, E., & Bruzzone, A. (2018). Adaptive automation and human factors in manufacturing: An experimental assessment for a cognitive approach. CIRP Annals, 67(1), 447-450. https://doi.org/10.1016/j.cirp.2018.04.123
Dekker, S. (2014). The field guide to understanding ‘human error’ (3rd ed.). CRC Press.
Dwivedi, Y. K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., … & Medaglia, R. (2021). Artificial intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management, 57, 102126. https://doi.org/10.1016/j.ijinfomgt.2019.08.002
Fernandes, B., & Zhao, Z. (2023). Improving drug development in precision psychiatry by ameliorating cognitive biases. European Neuropsychopharmacology, 70, 1-8. https://doi.org/10.1016/j.euroneuro.2023.02.001
Gawande, A. (2009). The checklist manifesto: How to get things right. Metropolitan Books.
Georgosouli, D. (2023). The irony of automation: Skill decay and professional expertise. Safety Science, 158, 106000.
Jarmolowicz, D., Bickel, W., Sofis, M., Hatz, L., & Mueller, E. (2016). Sunk costs, psychological symptomology, and help seeking. SpringerPlus, 5, 1297. https://doi.org/10.1186/s40064-016-3402-z
Rae, A. J., & Alexander, R. (2017). Probative blindness and false assurance about safety. Safety Science, 92, 115-126. https://doi.org/10.1016/j.ssci.2016.10.005
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Karevold, K., & Teigen, K. H. (2010). Progress framing and sunk costs: How managers’ statements about project progress reveal their investment intentions. Journal of Economic Psychology, 31(3), 419-432. https://doi.org/10.1016/j.joep.2010.05.005
Kim, S., & Song, H. C. (2021). Automation bias and the role of explainable AI. Computers in Human Behavior, 123, 106877.
Kirwan, B. (2001). Safety management for the control of human error. Nuclear Electric plc.
Krausmann, E., & Necci, A. (2021). Thinking the unthinkable: A perspective on Natech risks and black swans. Safety Science, 139, 105255. https://doi.org/10.1016/j.ssci.2021.105255
Lee, S., Kim, M., Kim, J., & Seong, P. (2015). Optimization of automation: II. Estimation method of ostracism rate based on the loss of situation awareness of human operators in nuclear power plants. Annals of Nuclear Energy, 79, 160-166. https://doi.org/10.1016/j.anucene.2015.01.021
Lee, S., Kang, Y., & Seong, P. (2016). Reliability analysis of operator’s manual actions in human-machine interface systems. Nuclear Engineering and Technology, 48(3), 694-702.
Lord, W. (1955). A night to remember. Henry Holt and Company.
Macnamara, B. N., Hambrick, D. Z., & Oswald, F. L. (2024). Deliberate practice and performance in music, games, sports, education, and professions: A meta-analysis. Psychological Bulletin, 150(6), 1144-1167.
McLeod, R. (2015). Reflections on Buncefield. In Designing for human reliability (pp. 412-429). Butterworth-Heinemann. https://doi.org/10.1016/B978-0-12-802421-8.00020-5
Min, J., Yasuda, N., & Kim, T. (2024). Learning in the gray zone: Harmful organizational learning from safety deviations in nuclear power plants. Journal of Business Research, 185, 114883. https://doi.org/10.1016/j.jbusres.2024.114883
Morita, T., Kirakowski, J., & Kerschbaum, H. (2020). Trust and cooperation in the sharing economy. Frontiers in Psychology, 11, 1664. https://doi.org/10.3389/fpsyg.2020.01664
Osselton, S., & Heuts, E. (2016). Operational risk: Building a resilient organization. In Enterprise risk management (pp. 105-127). Butterworth-Heinemann. https://doi.org/10.1016/B978-0-12-800633-7.00005-5
Parent, M. (2020). Unbiasing information technology decisions. Organizational Dynamics, 49(1), 100740. https://doi.org/10.1016/j.orgdyn.2019.02.001
Pazouki, K., Forbes, N., Norman, R., & Woodward, M. (2018). Investigation on the impact of human-automation interaction in maritime operations. Ocean Engineering, 153, 234-244. https://doi.org/10.1016/j.oceaneng.2018.01.103
Perrow, C. (1984). Normal accidents: Living with high-risk technologies. Basic Books.
Petroski, H. (1985). To engineer is human: The role of failure in successful design. St. Martin’s Press.
Plokhy, S. (2018). Chernobyl: History of a tragedy. Basic Books.
Reiman, T., Rollenhagen, C., Pietikäinen, E., & Heikkilä, J. (2015). Principles of adaptive management in complex safety-critical organizations. Safety Science, 71, 53-66. https://doi.org/10.1016/j.ssci.2014.07.021
Robert, L., Alahmad, R., Issa, B., & Banerjee, S. (2024). A review of collaborative intelligence research and practice. Journal of Organizational Computing and Electronic Commerce, 34(1), 52-74.
Robison, P. (2021). Flying blind: The 737 MAX tragedy and the fall of Boeing. Doubleday.
Rogge, N. (2021). When the cost has sunk: Measuring and comparing the sunk-cost bias in autistic and neurotypical persons. Journal of Economic Psychology, 87, 102432. https://doi.org/10.1016/j.joep.2021.102432
Roodhooft, F., & Warlop, L. (1999). On the role of sunk costs and asset specificity in outsourcing decisions: A research note. Accounting, Organizations and Society, 24(4), 363-369. https://doi.org/10.1016/S0361-3682(98)00069-5
Rose, K. (2022). Low-tech and high-tech challenges. In Accidents and disasters (pp. 234-261). Elsevier. https://doi.org/10.1016/B978-0-323-99149-0.00009-2
Ruschemeier, H., & Hondrich, K. (2024). Trust in automated systems: A psychological perspective. Computers in Human Behavior, 156, 108220.
Sutanto, J., Liu, Y., Grigore, M., & Lemmik, R. (2018). Does knowledge retrieval improve work efficiency? An investigation under multiple systems use. International Journal of Information Management, 40, 141-158. https://doi.org/10.1016/j.ijinfomgt.2018.01.009
Sutton, I. (2010). Culture and employee involvement. In Process risk and reliability management (pp. 89-112). Butterworth-Heinemann. https://doi.org/10.1016/B978-1-4377-7805-2.10002-X
Sutton, I. (2012). Offshore safety developments. In Offshore safety management (pp. 123-154). Butterworth-Heinemann. https://doi.org/10.1016/B978-1-4377-3524-6.00008-3
Sutton, I. (2014). Major offshore events. In Offshore safety management (pp. 45-89). Butterworth-Heinemann. https://doi.org/10.1016/B978-0-323-26206-4.00002-2
Sutton, I. (2015). Culture and participation. In Process risk and reliability management (pp. 167-201). Butterworth-Heinemann. https://doi.org/10.1016/B978-0-12-801653-4.00003-5
Taleb, N. N. (2007). The black swan: The impact of the highly improbable. Random House.
van de Merwe, K., Mallam, S., Nazir, S., & Engelhardtsen, Ø. (2024). Supporting human supervision in autonomous collision avoidance through agent transparency. Safety Science, 169, 106329. https://doi.org/10.1016/j.ssci.2023.106329
Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press.
Velegol, D. (2023). Risk: Identify risk and frame it as questions. In Design of innovation processes (pp. 234-267). Elsevier. https://doi.org/10.1016/B978-0-323-90465-0.00011-9
Vu, D., Hussain, A., Vu, D., Zhang, X., & Bui, V. (2026). Designing effective demand response: A review of behavioral insights, consumer engagement, and operational strategies in energy systems. Energy Research & Social Science, 131, 104474. https://doi.org/10.1016/j.erss.2025.104474
Wen, H. (2024). Human-AI collaboration for enhanced safety. In Methods in chemical process safety (Vol. 8, pp. 87-112). Elsevier. https://doi.org/10.1016/bs.mcps.2024.07.001
Williams, E., & Polito, V. (2022). Meditation in the workplace: Does mindfulness reduce bias and increase organisational citizenship behaviours? Frontiers in Psychology, 13, 747983. https://doi.org/10.3389/fpsyg.2022.747983
Zentall, T. R. (2015). When animals misbehave: Analogs of human biases and suboptimal choice. Behavioural Processes, 112, 164-173. https://doi.org/10.1016/j.beproc.2014.08.001
Zeng, J., Zhang, Q., Chen, C., Yu, R., & Gong, Q. (2013). An fMRI study on sunk cost effect. Brain Research, 1519, 73-83. https://doi.org/10.1016/j.brainres.2013.05.001