
normal accident
n.
An accident that is the nearly inevitable result of technological interactions so complex that they cannot be fully predicted or controlled.
Example Citation:
Natural disasters — earthquakes, floods — could affect anybody, and most companies think about that. Second are normal accidents that come about when complexity builds in the potential for an accident. In IT, complexity can lead to a glitch like Y2k. That's a normal crisis — an accident that's almost inevitable, but not intentional. Then you have abnormal accidents: Someone deliberately causing the accident. 9/11 was abnormal. Enron was an abnormal economic crisis because it was caused by shenanigans.
— Ian I. Mitroff (interview), "Facing the Unthinkable," Computerworld, April 21, 2003
Earliest Citation:
An alternative vision for a peaceful and productive world requires the emergence of the political will to insist that a future of unlimited technological growth, self-anointed managers and "normal" accidents is unworthy of the best in human potential and may well be unendurable.
— Robert Engler, "Technology out of control," The Nation, April 27, 1985
Notes:
We live in a world that is increasingly run by complex systems — from nuclear power stations and chemical production plants to the computer industry and the aviation industry. These are systems in which multiple technologies must not only interact, but can work properly only if all the other technologies work properly. (Such a system is said to be tightly coupled.) In other words, if one fails, the system itself fails. Thankfully, most complex systems have built-in redundancies and fail-safe mechanisms that prevent such a system failure. However, the interactions between technologies in a complex system are so, well, complex, that it isn't possible to predict all the ways that any one failure will affect the system. Therefore, accidents in these systems are more or less inevitable. This isn't strictly Murphy's Law: "If something can go wrong, it will." Instead, it's a variation on the theme: "If something can go wrong, it usually won't, but eventually it will."
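That "eventually it will" intuition can be made concrete with a toy simulation (a minimal sketch, not drawn from Perrow's book; the component count and per-step failure rate are invented for illustration). In a tightly coupled system, the whole system fails as soon as any one component does, so even a tiny per-component failure probability compounds into near-certain failure over enough time:

```python
import random

def run_system(n_components=50, p_fail=0.001, steps=10_000, seed=0):
    """Toy model of a tightly coupled system: the system fails the
    moment ANY single component fails on any step.

    Returns the step index of the first system failure, or None if
    the system survives the whole run.
    """
    rng = random.Random(seed)
    for step in range(steps):
        # Tight coupling: one component failure is a system failure.
        if any(rng.random() < p_fail for _ in range(n_components)):
            return step
    return None

# Analytically, the chance of surviving t steps is
# (1 - p_fail) ** (n_components * t).
# With 50 components at p_fail = 0.001, each step succeeds with
# probability ~0.951 — failure is unlikely at any given moment,
# yet over thousands of steps survival decays toward zero.
first_failure = run_system()
```

Each individual step almost always goes fine, which is exactly the "it usually won't" half of the theme; the run as a whole almost never survives, which is the "eventually it will" half.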
These are the normal accidents of a complex system, a phrase that was coined by Yale sociologist Charles Perrow and first appeared in his 1984 book, Normal Accidents: Living with High-Risk Technologies. The earliest citation, above, is the first use I could find that wasn't merely the title of the book.
Related Words:
blue goo
global ecophagy
mode confusion
Murphy willing
phantom accident
resistentialism
revenge effect
SMIDSY
technology-related anxiety
walkaway safe
Category:
Technology (General)

New words. 2013.