A peer-reviewed article of this preprint also exists. This version is not peer-reviewed.
Submitted: 30 May 2023; posted: 01 June 2023.
| Turner’s MMD 1978 | Perrow’s NA 1984 |
|---|---|
| Multiple high-risk industry qualitative case documents | Multiple high-risk industry qualitative case documents |
| Patterns found in cases from inquiries | Patterns found in cases from inquiries |
| Organisational sociology and Weberian background | Organisational sociology and Weberian background |
| Technology and high-risk location important | Technology and high-risk location important |
| Man-made disaster focus (13-14, 190) | Man-made catastrophe focus (3, 11, 351) |
| Organisational failure (66, 75-8, 199-200) | Organisational failure (233, 330-1) |
| Socio-technical (2-3, 5, 8, 47-8, 89, 170, 185, 187-8) | Socio-technical (3, 7, 9, 10-11, 352) |
| Systemic (19, 135-6, 141-2, 145, 158-9, 161-2, 185, 188) | Systemic (3, 10, 62-71, 351) |
| Open systems/external environment (136, 151, 170, 201) | Open systems and external environment (75) |
| Emergence and propagation (89, 135, 158, 180) | Emergence and propagation (9-10) |
| Failures of control (7, 70, 191) | Failures of control (81, 83) |
| System forgiveness (19-20) | Cybernetic self-correcting and error-avoiding systems such as aviation (11, 79-81, 126-7, 146-7, 167-8) |
| Error magnification/feedback amplification (179-81, 187, 236) | Negative synergy, error-inducing systems, magnification, unfamiliar or unintended feedback loops (82, 88, 98) |
| Precursor contributory factors combine in complex, unexpected and discrepant ways to defeat safety systems (86, 88, 105, 126) | Interactive complexity: small failures and other unanticipated interactions can cause system accidents (4-5, 7, 10, 101) |
| Complex large-scale accidents and disasters with multiple chains of causes (14, 23-4, 75-6, 89, 105, 187) | Complex system accidents and catastrophes with multiple causes (7, 70-1, 75, 78, 85-6, 88) |
| Precipitating or triggering incident or event; the last event is not the focus (81, 88-90, 102, 107, 122, 150, 155-6, 193, 198) | Trigger event and particular events are not the focus (6-7, 71, 342, 344) |
| Surprise and unanticipated events (86, 126, 138, 145-6, 151, 159, 169, 184-6) | Unanticipated and unexpected outcomes from interactions (6, 70, 78) |
| Large-scale accidents, rare catastrophes (149-51, 178) | System accidents, rare catastrophes (343-5) |
| Latent structure of incubation events (86-7, 89, 94, 193) | Independent factors lying fallow for the fatal spark (111) |
| Less complex accidents separate from disasters (88-9, 99) | Component-failure accidents with ‘DEPOSE’ factors (8, 77, 111, 343) separate from system accidents (70) |
| Bounded rationality and satisficing (133-8, 161) | Bounded rationality (315-21, 323-4) |
| Inability to see or comprehend hazard (93-5, 195, 198) | Inability to see or comprehend hazard (9, 75, 351) |
| Gap between perceived and actual reality (84, 94, 128-9, 138, 161, 194) | Gap between perceived and actual reality (9, 75) |
| Warnings not heeded or discerned (19, 61, 194-5) | Warnings ignored or did not fit the mental model (10, 31, 351) |
| Miscommunication and misinformation (45-7, 61, 64-7, 121-4, 139) | Misinterpretation and indirect information sources (35, 73, 84) |
| Variable disjunction of information (50-2, 61, 101, 217, 225) and social construction of reality (165-6, 191) | Cognitive models of ambiguous situations and the social construction of reality (9, 75, 176) |
| Don’t blame individual operator error (160, 162-3, 198) | Don’t blame individual operator error (4, 9, 331, 351) |
| Importance of power/elites (4, 72, 124-5, 132, 152, 191) | Importance of power/elites (12, 155, 306, 311, 339, 352) |
| Growing concentration and power of large organisations and energy sources (1-2, 4-6, 160, 199, 201) | Growing concentration of energy sources and power of large organisations (102, 306, 311) |
| Intentional misinformation by managers (118, 125, 147) | Deception and lying, false logs by ship captains (10, 187) |
| Regulatory issues/inadequacies (70-1, 79, 87, 99, 103-4) | Regulatory issues/inadequacies (3, 176, 343) |
| Gap in defences and failure of precautions (84, 87, 91) | Defence-in-depth limits and failures (3-4, 43, 60) |
| Intuition, tacit knowledge, craft (11, 25, 51) | Intuition and use of heuristics (316-7, 319) |
| Poor and unrealistic management (63, 66-7, 77, 79) | Poor management (111-2, 177, 343) |
| Environmental disasters (2, 5-6, 14, 128, 131, 149, 190) | Eco-system disasters (233, 252-3, 255, 295-6) |
| Societal culture and context (84, 192) | Societal values and culture (12, 315-6, 321-8) |
| Importance of learning from near misses (96, 182) | Aviation occurrence-reporting model important (167-9) |
| Turner’s MMD 1978 | Perrow’s NA 1984 |
|---|---|
| Organisational and social-unit focus (160, 186, 199) | Macro industry and technology focus (3, 12-14, 339) |
| Multidisciplinary approach and theories are necessary to study large-scale accidents and disasters (31-2, 38, 127) | Own theory and radical critical paradigm, mostly applied to high-risk accident reports and industry data |
| Somewhat optimistic about learning and prevention (32, 75-80, 194-200) | Somewhat pessimistic about learning and prevention (32, 60, 257, 343, 351) |
| Incubation network (86-9, 99-107, 125, 131, 193, 200) | Inevitable normal or system accidents, irretrievable for at least some time (3-5, 256, 328, 330) |
| Disaster timing usually after a long incubation, often of years (87, 105, 180, 193) | Disaster timing rapid: unanticipated system interaction combined with external factors (4-5, 75, 233, 253-5) |
| Disasters require focused unintended organising attention on multiple fronts to occur (180) | Banality and triviality lie behind most catastrophes (9) |
| Sequence model with six stages (84-92) | Close or tight coupling with little slack (4-6, 10-11, 89-96, 330-2) |
| Failures of intention (4, 128-31, 160, 171, 181) and of foresight (50, 77, 92, 99, 107, 161, 170, 179) | Garbage-can theory helps explain the randomness of system accidents (324) |
| Schematic accident representation diagram (97-8) | 2x2 matrix or grid of complexity and coupling (97, 327) |
| Hierarchy of levels of information (145) | Catastrophic potential of risky technologies, especially complex and tightly coupled systems (342-6) |
| Sub-cultures and shared social context determine perception (4, 58, 78, 101, 120-1, 166-171) | Capitalist production imperatives and distorted market prices are important (310-13) |
| Bounded decision zones and perceptual horizons in an organisational worldview (58-9, 120-1, 165, 168-71, 200) | Common-mode failures (72-3, 75, 85) |
| Ill-structured problems; confusion across organisations and divisions (19-22, 50, 52-3, 60, 72, 75, 77, 96, 107) | Unnecessary proximity and tight spacing can lead to unexpected interactions (82, 85, 88) |
| Well-structured problem post-disaster (52, 74-6, 103, 106, 179-88) | Centralisation and decentralisation (10, 331-5) |
| Intended actor rationality (129, 160, 171-8, 200) | Social rationality by non-experts in society (315-6, 321-4) |
| Negentropy, anti-tasks and the non-random, structured nature of unintended consequences (127, 179, 181, 187, 190) | Understanding of transformational designs and processes is limited (11, 84-6, 330) |
| Discrepant information and events (86-90, 122, 146) | Externalities imposed on society (339-41) |
| Importance of organisational culture (77, 103) | Incomprehensibility of system accidents (23, 277) |
| Catastrophe and chaos theory (153-6, 185-7, 194) | Complex systems seek productive efficiency (88) |
| Misdirected energy and misinformation (4, 182-4, 187, 189-91, 193) | Risk assessment has a narrow focus; typically assumes over-regulation (306-14) |
| Decoy problem takes the focus off more serious threats (59-61, 64, 78, 80, 86-7, 100, 102-4, 196) | Risk-assessor ‘shamans’ support elites’ use of ‘evil’ technologies (12, 14, 307); some scientists, engineers and cognitive psychologists complicit (14, 307, 316-20) |
| Complaints from outsiders discounted; reluctance to fear the worst (73-4, 76, 102-4) | Social-class distribution of risk; inequality linked to disproportionate risk (310) |
| Social and differentiated distribution of knowledge (3, 85, 106, 152) | Error-inducing systems such as marine shipping (11, 173-6, 181-90, 230) |
| Channels of observation, not just communication (141, 159); what organisations pay attention to (58, 163-171) | Nuclear accidents like TMI; unreliability and inevitability (15-61, 344, 348) |
| Nuclear industry’s enormous hazards, but risk analysis, information and response (1-2, 18, 29-30, 35, 183) | Normative advocacy: technologies like nuclear power and weapons should not be used (x, 14, 347-52) |
| Author | Knowledge of Turner (MMD 1978 or the 1997 2nd edn) | Acknowledgment of Turner’s ideas | Knowledge of Perrow’s NA (1984 or 1999) | Acknowledgment of Perrow’s ideas |
|---|---|---|---|---|
| Hale | MMD 1978 | mixed | NA | good |
| Weick | 2nd edn 1997 | good | NA | good |
| Rasmussen | unclear | poor | unclear | poor |
| Reason | MMD 1978 | poor/mixed | NA | good |
| Vaughan | MMD 1978 | good/mixed | NA | good |
| Leveson | MMD 1978 | mixed/poor | NA | good |
| Hopkins | 2nd edn 1997 | good | NA | good |
| Hollnagel | 2nd edn 1997 | mixed | NA | good |
| Dekker | 2nd edn 1997 | mixed/good | NA | good |
| Shrivastava | MMD 1978 | mixed | NA | poor |
| Sagan | unclear | poor | NA | good |
| Snook | pre MMD 1978 | poor | NA | good |