Customer Reviews


5 Reviews


5 star: 4
4 star: 1
3 star: 0
2 star: 0
1 star: 0

Average Customer Review: 4.8 out of 5 stars

21 of 22 people found the following review helpful
4.0 out of 5 stars Not the last word, but provocative and well argued, 12 Jan. 2003
By T. D. Welsh (Basingstoke, Hampshire UK) (TOP 1000 REVIEWER)
Verified Purchase
This book's impact, for me, consists of the insight that people are part of the systems they build and operate. Because "to err is human", everyone from designers to operators makes mistakes from time to time. In complex systems, such mistakes can be expected to result in a steady stream of component failures, malfunctions, and accidents - hence the book's provocative and memorable title.
After a very readable introduction, the author examines six important areas of technology (nuclear power, petrochemical plants, aircraft and airways, marine accidents, dams and mines, and "exotics" - space exploration, weapons of mass destruction, and recombinant DNA research). He plots these on two dimensions - complexity and coupling - and comes to the unsurprising conclusion that complex, tightly-coupled systems are bad news. Complexity means that unexpected accidents will happen, and tight coupling means that when they do happen, they will touch off further problems too quickly for human intervention.
First published in 1984, the book shows its age in some ways, and the author has updated it somewhat with an Afterword and a Postscript on the Y2K problem.
It would be hard to read even the first chapter without feeling dismay at the apparent gaping weaknesses of the systems described. It looks as if the greatest source of trouble in nuclear power systems, for example, is the routine failure of valves controlling the flow of water through pipes! True, the water may be at hundreds of degrees Centigrade, loaded with chemical contaminants, and even radioactive - but surely this is 19th century (or, at worst, early 20th century) technology?
Then there is the ubiquitous evidence of human inadequacy. Imagine trying to monitor the working of a nuclear power plant whose control panel has 1600 separate switches and lights! (Especially when some of these have failed, and those awaiting repair are marked out by ordinary labels that hang down in front of other, essential warning lights, preventing them from being seen.) According to Perrow, one type of highly critical valve used to prevent a reactor melt-down was rated by its own manufacturer as likely to fail at least once every ten times it was activated! And it is hard to get excited when a fault light comes on, if it is known to have a habit of "playing up".
It is the people failures, though, that are most disturbing. One of the most vital parts of a nuclear power station is the concrete containment structure, which makes sure that whatever accidents may occur, the external environment is not affected. So it is scary to hear about construction crews who did not seem very good at pouring concrete, to the extent that holes up to 180 cubic feet in size were found inside the walls they had built!
Perrow relates many other interesting stories, perhaps the best being the one about the oil company that was drilling through a lake bed in the vicinity of a huge salt mine... resulting in the disappearance of lake, salt mine, oil rig, several boats, and part of a hotel. All because it was no one's particular job to make quite sure that the drill missed the mine...
It seems clear that, as Perrow suggests, people should be assessed and evaluated as a system component. Just as steel or concrete comes in various grades at corresponding prices, people tend to have a characteristic "defect level" which is not altogether uninfluenced by their cost. When companies, perhaps because of hard times, skimp or ignore maintenance, cut out training, and fire the most experienced (and highest paid) workers, accidents are likely to become more frequent.
In fact, we would have far more catastrophes but for a fact that Perrow also emphasizes - it is quite hard to cause a disaster. Everything has to be just right. In some industries, such as marine transport, this means that owners can continue to get by with minimum levels of investment. Sometimes, they are entirely insulated from the consequences of accidents.
I feel that the book has opened my eyes in a number of ways, but I still suspend judgement. For one thing, Charles Perrow is a professor of sociology, not an expert in engineering or systems theory. Nothing wrong with that - it gives him a usefully "skewed" point of view - but he is probably not entirely qualified to understand engineering systems. Moreover, I feel he is at his best when relating case studies and drawing specific conclusions with his sociologist's hat on. His attempts to start a new discipline called "Normal Accident Theory" (NAT) do not look very convincing to me.
Nevertheless, this is a fascinating book filled with interesting (and sometimes frightening) facts and interpretations. I feel sure that its core message - that human beings are part of the system, and should not be challenged beyond their reasonable limits - will be learned and taken into account by enlightened system designers.


11 of 12 people found the following review helpful
5.0 out of 5 stars Reprint needed, 27 Oct. 1997
By A Customer
I specified this book as one of (the better of) two choices for supplementary reading in a university-level engineering course, and I'm dismayed that it's currently in this precarious print status. The book is an excellent - compelling and comprehensible - explanation of the inherent risk of failure of tightly-coupled complex systems; in other words, the world we have created around ourselves. Engineers particularly need this insight before being unleashed on the world, because engineering as a profession (if not vocation) has taken on the obligation to protect humankind from science and technology. If not a reprint or new edition, perhaps a new publisher is in order.


3 of 3 people found the following review helpful
5.0 out of 5 stars Excellent, 18 Mar. 2011
I first read this about four years ago; recently I read it again prior to reading The Next Catastrophe. It certainly takes a very different view of human error - food for thought, and a nice contrast to Reason and Dekker.


8 of 10 people found the following review helpful
5.0 out of 5 stars The only theoretical account of accidents and their causes, 4 July 1999
By A Customer
I enjoyed reading this book very much. It is simple to read, yet profound, and can be used for many purposes. I teach courses in the field of engineering systems, and one unit is dedicated to Perrow's approach. It is a must if you are interested in complex engineering systems!


5.0 out of 5 stars Five Stars, 13 Sept. 2014
No additional comment.


