
Normal Accidents: Living with High Risk Technologies (Princeton Paperbacks) [Paperback]

Charles Perrow
4.8 out of 5 stars  See all reviews (4 customer reviews)
RRP: £30.95
Price: £24.76 & FREE Delivery in the UK
You Save: £6.19 (20%)


Formats:
  • Kindle Edition: £17.27
  • Hardcover: --
  • Paperback: £24.76

Book Description

17 Oct 1999 Princeton Paperbacks

Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.

The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.

Frequently Bought Together

Normal Accidents: Living with High Risk Technologies (Princeton Paperbacks) + The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters + Human Error
Price For All Three: £63.32


Product details

  • Paperback: 386 pages
  • Publisher: Princeton University Press; Updated edition (17 Oct 1999)
  • Language: English
  • ISBN-10: 0691004129
  • ISBN-13: 978-0691004129
  • Product Dimensions: 23.4 x 15.5 x 3 cm
  • Average Customer Review: 4.8 out of 5 stars  See all reviews (4 customer reviews)
  • Amazon Bestsellers Rank: 230,335 in Books (See Top 100 in Books)


Product Description


"[Normal Accidents is] a penetrating study of catastrophes and near catastrophes in several high-risk industries. Mr. Perrow ... writes lucidly and makes it clear that 'normal' accidents are the inevitable consequences of the way we launch industrial ventures.... An outstanding analysis of organizational complexity."--John Pfeiffer, The New York Times

"[Perrow's] research undermines promises that 'better management' and 'more operator training' can eliminate catastrophic accidents. In doing so, he challenges us to ponder what could happen to justice, community, liberty, and hope in a society where such events are normal."--Deborah A. Stone, Technology Review

"Normal Accidents is a testament to the value of rigorous thinking when applied to a critical problem."--Nick Pidgeon, Nature

Inside This Book
First Sentence: "Our first example of the accident potential of complex systems is the accident at the Three Mile Island Unit 2 nuclear plant near Harrisburg, Pennsylvania, on March 28, 1979."

Customer Reviews

Most Helpful Customer Reviews
21 of 22 people found the following review helpful
By T. D. Welsh TOP 500 REVIEWER
Format: Paperback | Verified Purchase
This book's impact, for me, consists of the insight that people are part of the systems they build and operate. Because "to err is human", everyone from designers to operators makes mistakes from time to time. In complex systems, such mistakes can be expected to result in a steady stream of component failures, malfunctions, and accidents - hence the book's provocative and memorable title.
After a very readable introduction, the author examines six important areas of technology (nuclear power, petrochemical plants, aircraft and airways, marine accidents, dams and mines, and "exotics" - space exploration, weapons of mass destruction, and recombinant DNA research). He plots these on two dimensions - complexity and coupling - and comes to the unsurprising conclusion that complex, tightly-coupled systems are bad news. Complexity means that unexpected accidents will happen, and tight coupling means that when they do happen, they will touch off further problems too quickly for human intervention.
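The two-dimensional framework the reviewer describes can be sketched in a few lines of code. This is a minimal illustrative sketch, not a reproduction of Perrow's actual chart: the quadrant placements below are assumptions drawn from the examples discussed on this page (dams as linear/tight, nuclear plants as complex/tight), and the `SYSTEMS` table and `normal_accident_prone` helper are hypothetical names introduced here.

```python
# Illustrative sketch of Perrow's two-dimension risk matrix.
# Quadrant placements are assumptions for demonstration, not the book's figure.

SYSTEMS = {
    # name: (interactions, coupling)
    "dam":            ("linear",  "tight"),
    "rail transport": ("linear",  "tight"),
    "assembly line":  ("linear",  "loose"),
    "post office":    ("linear",  "loose"),
    "university":     ("complex", "loose"),
    "R&D firm":       ("complex", "loose"),
    "nuclear plant":  ("complex", "tight"),
    "chemical plant": ("complex", "tight"),
}

def normal_accident_prone(name: str) -> bool:
    """Perrow's thesis: 'normal accidents' are expected where a system is
    both interactively complex AND tightly coupled."""
    interactions, coupling = SYSTEMS[name]
    return interactions == "complex" and coupling == "tight"

if __name__ == "__main__":
    for name, (interactions, coupling) in SYSTEMS.items():
        flag = "normal-accident prone" if normal_accident_prone(name) else "-"
        print(f"{name:15s} {interactions:8s} {coupling:6s} {flag}")
```

The point of the quadrant test is the reviewer's summary in one predicate: complexity alone produces surprises, and tight coupling alone propagates failures quickly, but only the combination leaves no room for human intervention.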
First published in 1984, the book shows its age in some ways, and the author has updated it somewhat with an Afterword and a Postscript on the Y2K problem.
It would be hard to read even the first chapter without feeling dismay at the apparent gaping weaknesses of the systems described. It looks as if the greatest source of trouble in nuclear power systems, for example, is the routine failure of valves controlling the flow of water through pipes! True, the water may be at hundreds of degrees Centigrade, loaded with chemical contaminants, and even radioactive - but surely this is 19th century (or, at worst, early 20th century) technology?
Then there is the ubiquitous evidence of human inadequacy.
11 of 12 people found the following review helpful
5.0 out of 5 stars Reprint needed 27 Oct 1997
By A Customer
I specified this book as one of (the better of) two choices for supplementary reading in a university-level engineering course, and I'm dismayed that it's currently in this precarious print status. The book is an excellent--compelling and comprehensible--explanation of the inherent risk of failure of tightly-coupled complex systems, in other words, the world we have created around ourselves. Engineers particularly need this insight before being unleashed on the world, because engineering as a profession (if not vocation) has taken on the obligation to protect humankind from science and technology. If not a reprint or new edition, perhaps a new publisher is in order.
3 of 3 people found the following review helpful
5.0 out of 5 stars Excellent 18 Mar 2011
By Shane C
I first read this about four years ago; recently I read it again prior to reading The Next Catastrophe. It certainly takes a very different view of human error - food for thought and a nice contrast to Reason and Dekker.
8 of 10 people found the following review helpful
By A Customer
I enjoyed very much reading this book. It is simple to read, yet profound and can be used for many purposes. I teach courses in the field of engineering systems, and one unit is dedicated to Perrow's approach. It is a must if you are interested in complex engineering systems!
Most Helpful Customer Reviews on (beta) 4.0 out of 5 stars  54 reviews
127 of 147 people found the following review helpful
3.0 out of 5 stars Living With High-Risk Conclusions 29 Jan 2004
By Robert I. Hedges - Published on
Format: Paperback | Verified Purchase
I have been mulling over this review for a while now, and am still undecided on the correct rating to award this book. On the one hand Perrow offers some genuine insight into systems safety, but frequently does not understand the technicalities of the systems (or occasionally their operators) well enough to make informed decisions and recommendations. In more egregious cases he comes to conclusions that are guaranteed to reduce safety (as when he argues that supertankers should be run by committee, and the usefulness of the Captain is no more) or are merely the cherished liberal opinions of an Ivy League sociologist (he teaches at Yale), as when he argues for unilateral nuclear disarmament, government-guaranteed income plans, and heroin maintenance (distribution) plans for addicts "to reduce crime." In the case of disarmament, remember this was written during the early 1980s while the Soviet Union was still a huge threat...complete nuclear disarmament would have resulted in fewer US nuclear accidents, but would NOT have made us safer as we would have been totally vulnerable to intentional nuclear attack. He has great personal animosity toward Ronald Reagan, and makes inflammatory statements in the mining section that mining safety regulations would surely be weakened by Reagan, causing many more accidents and deaths. Later in the same section, though, he concludes that mining is inherently dangerous, and no amount of regulation can make it safe. So which is it? Any of this is, at very best, folly, but regardless of political bent (he is a self-avowed "leftist liberal") it has absolutely no place in a book ostensibly on safety systems. As such I think portions of this book show what is so wrong in American academia today: even genuinely excellent research can be easily spoiled when the conclusions are known before the research is started. This is one of the many reasons that physical scientists scorn the social sciences, and it doesn't have to be this way.
Having said all that there IS a wealth of good information and insight in this book when Perrow sticks to systems and their interactions. The book contains the finest analysis commercially available of the Three Mile Island near-disaster, and his insight about how to improve safety in nuclear plants was timely when the book was written in 1984, though many improvements have been made since then.
Speaking as a commercial airline pilot, I feel his conclusions and observations about aircraft safety were generally true at the time of printing in 1984, but now are miserably out of date. (The same is true of the Air Traffic Control section.) I believe that he generally has a good layman's grasp of aviation, so I am willing to take it as a given that he has a knowledgeable layman's comprehension of the other systems discussed. As an aside, he never quite gets some of the technicalities right. For instance, he constantly uses the term 'coupling' incorrectly in the engineering sense; this is particularly objectionable in the aviation section, where it has a very specific meaning to aeronautical engineers and pilots.
The section on maritime accidents and safety is superbly written. Here I am not an expert, but there seems to be a high degree of correlation with the aviation section. His section on "Non Collision Course Collisions" by itself makes this book a worthwhile read. He presents very compelling information and reasoning until the very end of the section, at which point he suggests that since ships are now so big, large ships (especially supertankers) essentially should have no Captain, but should be run by committee. This is an invalid conclusion, and he offers no evidence or substantial argument to support that idea. Clearly, it is an idea hatched in his office and not on a ship (or plane.) There always needs to be a person in a place of ultimate authority in fast moving, dynamic systems, or the potential exists to have crew members begin to work at direct odds with each other, making a marginal situation dangerous. Ironically, in the very same part of the discussion where he concludes that there should be no Captain, he has hit upon the key to the problem. He mentions that he was pleased to see that some European shippers were now training their crews together as a team, and that he expected this to lower accident rates. He is, in fact, exactly right about that. Airlines now have to train crews in Crew Resource Management (CRM) in which each member of the crew has the right and obligation to speak up if they notice anything awry in the operation of their aircraft, and the Captain makes it a priority to listen to the input of others, as everyone has a different set of concerns and knowledge. In this way, the Captain becomes much less dictatorial, and becomes more of a final decision maker after everyone has had their say. It IS critical, though, to maintain someone in command, as there is no time to assemble a staff meeting when a ship is about to run aground, or a mid-air collision is about to occur. 
Many other well documented studies and books have come to this conclusion, and in the airline industry since CRM was introduced the accident rate has decreased dramatically.
Overall, if you have a desire to understand high risk systems, this book has a lot of good information in it; however it is woefully out of date and for that reason among others, I can only recommend it with reservations. A better and much more contemporary introductory book on the subject is 'Inviting Disaster' by James R. Chiles. Remember, this book was written over twenty years ago, and much has changed since then. There is knowledge to be gleaned here, but you have to be prepared to sort the wheat from the chaff.
25 of 28 people found the following review helpful
5.0 out of 5 stars Of Lasting Value, Relevant to Today's Technical Maze 27 Jan 2003
By Robert David STEELE Vivas - Published on
Edit of 2 April 2007 to add link and better summary.

I read this book when it was assigned in the 1980s as a mainstream text for graduate courses in public policy and public administration, and I still use it. It is relevant, for example, to the matter of whether we should try to use nuclear bombs on Iraq--most Americans do not realize that there has never (ever) been an operational test of a US nuclear missile from a working missile silo. Everything has been tested by the vendors or by operational test authorities that have a proven track record of falsifying test results or making the tests so unrealistic as to be meaningless.

Edit: my long-standing summary of the author's key point: Simple systems have single points of failure that are easy to diagnose and fix. Complex systems have multiple points of failure that interact in unpredictable and often undetectable ways, and are very difficult to diagnose and fix. We live in a constellation of complex systems (and do not practice the precautionary principle!).

This book is also relevant to the world of software. As the Y2K panic suggested, the "maze" of software upon which vital national life support systems depend--including financial, power, communications, and transportation software--has become very obscure as well as vulnerable. Had those creating this software been more conscious of the warnings and suggestions that the author provides in this book, America as well as other nations would be much less vulnerable to terrorism and other "acts of man" for which our insurance industry has not planned.

I agree with another reviewer who notes that this book is long overdue for a reprint--it should be updated. I recommend it "as is," but believe an updated version would be 20% more valuable.

Edit: this book is still valuable, but the author has given us the following in 2007:
The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters
19 of 22 people found the following review helpful
5.0 out of 5 stars Altogether a fascinating and informative book 21 Mar 2003
By Atheen M. Wilson - Published on
Wow. This is an incredible book. I have to admit, though, that I had some difficulty getting into Normal Accidents. There seemed an overabundance of detail, particularly on the nuclear industry's case history of calamity. This lost me, since I'm not familiar with the particulars of equipment function and malfunction. The book was mentioned, however, by two others of a similar nature, and with such reverence, that after I had finished both, I returned to Perrow's book, this time with more success.
Professor Perrow is a PhD in sociology (1960) who has taught in the Yale University Department of Sociology since 1981 and whose research focus has been human/technology interactions and the effects of complexity in organizations. (His most recent publication is The AIDS Disaster: The Failure of Organizations in New York and the Nation, 1990.)
In Normal Accidents, he describes the failures that can arise "normally" in systems, i.e., those problems that are expected to arise and can be planned for by engineers, but which, by virtue of those planned fail-safe devices, immeasurably complicate and endanger the system they are designed to protect. He describes a variety of these interactions, clarifying his definitions by means of a table (p. 88) and a matrix illustration (p. 97). Examples include systems that are linear vs. complex, and loosely vs. tightly coupled. These generally arise through the interactive nature of the various components of the system itself. According to the matrix, an illustration of a highly linear, tightly coupled system would be a dam. A complex, tightly coupled system would be a nuclear plant, etc.
The degree to which failures may occur varies with each type of organization, as does the degree to which a recovery from such a failure is possible. As illustrations, the author describes failures which have, or could have, arisen in a variety of settings: the nuclear industry, maritime activities, the petrochemical industry, space exploration, DNA research and so on.
The exciting character of the stories themselves is worth the reading; my favorite, and one I had heard before, is the loss of an entire lake into a salt mine. More important still is the knowledge that each imparts. Perrow makes abundantly apparent by his illustrations the ease with which complex systems involving humans can fail catastrophically. (And if Per Bak and others are correct, almost inevitably.)
Probably the most significant part of the work is the last chapter. After discussing the fallibility of systems that have grown increasingly complex, he discusses living with high risk systems, particularly why we are and why it should change. In a significant statement he writes, "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment. Unfortunately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few (p. 306)," and further on, "Risks from risky technologies are not borne equally by the different social classes [and I would add, countries]; risk assessments ignore the social class distribution of risk (p. 310)." How true. "Cui bono?" as the murder mystery writers might say; "Who benefits?" More to the point, and again with that issue in mind, he writes "The risks that made our country great were not industrial risks such as unsafe coal mines or chemical pollution, but social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage (p. 311)." Again, very true.
Professor Perrow examines the degrees of potential danger from different types of system and suggests ways of deciding which are worth it to society to support and which might not be. These include categorizing the degree and the extent of danger of a given system to society, defining the way these technologies conflict with the values of that society, determining the likelihood that changes can be made to effectively alter the dangerous factors through technology or training of operators, and the possibility of placing the burden of spill-over costs on the shoulders of the institutions responsible. The latter might conceivably lead to corrective changes, either by the institutions themselves in order to remain profitable or by consumers through purchasing decisions.
The bibliography for the book is quite extensive and includes a variety of sources. These include not only popular books and publications on the topics of individual disasters, but government documents, research journals, and industry reports as well. I did not find any reference to the Johnstown flood, my particular favorite dam burst story, but there are a wide variety of references to choose from should someone wish to do their own research on the topic.
Altogether a fascinating and informative book.
14 of 16 people found the following review helpful
5.0 out of 5 stars Insightful perspective on serious industrial accidents. 17 July 1998
By A Customer - Published on
Normal Accidents is the best summary of major industrial accidents in the USA that I have encountered. It is written in a factual and technically complete style that is particularly attractive to anyone with a technical background or interest. I was able to read a borrowed copy from a colleague a few years ago when I was appointed as chairman of the safety committee at a manufacturing facility where workers had potential for exposure to toxic gasses, high voltage, x-radiation, and other more everyday industrial hazards. The author's insight is right on target for achieving a workable understanding of the cause and prevention of disaster events. I wanted to buy copies for all our engineering managers and safety committee members, but the book is out of print. It is my fond hope that the author will write an updated version with analysis of more recent events as well as the well-chosen accidents in the previous edition. For any safety related product or process designer, this book is a must read! For any technically cognizant reader, this book is a delight to read, even if it is a little scary in its implications. For everyone else, it has some really interesting historical stories.
15 of 18 people found the following review helpful
4.0 out of 5 stars Cool water for hot-headed analysts of complex systems 7 July 1998
By A Customer - Published on
I'm dismayed to discover that 'Normal Accidents' is so difficult to find.
Like all voters, I'm sometimes asked to make choices about the use of potentially devastating technology, despite having no training in engineering and only a sketchy idea of statistical risk analysis. 'Normal Accidents' doesn't reduce my reliance on experts, but it does provide a common language for us to discuss the issues.
Perrow's accident descriptions are masterly, and should disturb anyone who lightly dismisses accidents in complex systems as "simple human error", or assumes that all systems can be made safe by a technological fix. I've used Perrow's complexity / coupling matrix as a tool for thinking about and discussing the risks involved in decisions about many systems in addition to those Perrow actually discusses, not least software systems.
I think this book still has a lot to offer anyone interested in public debate about complex technological issues, and I hope it will be reprinted. A new edition would be even better.