There are now many good books available on why we make errors in judgment and decision making. This book is Michael Mauboussin's contribution to the genre, and I think he does a good job of pulling together material from a diverse range of credible sources. What he presents has broad application, though with a slight emphasis on business and investing (his own area of specialization). The book is also a fairly easy and quick read.
Perhaps the best way to describe the content of the book is to summarize the key points, roughly in the order they appear in the book:
(1) "Think twice" to avoid errors in judgment and decision making, especially in situations where stakes are high.
(2) Learn from the experiences of others in similar situations (making use of statistics when possible) rather than relying only on your own perspective, and don't be overly optimistic about your chances of beating the odds.
(3) Beware of anecdotal information, since it can paint a biased picture. Relatedly, don't infer patterns that don't exist, especially when the available data is limited, and guard against confirmation bias: favoring evidence that supports your beliefs while ignoring contradictory evidence (deliberately seek out dissenting opinions if necessary).
(4) Avoid making decisions while at an emotional extreme (stress, anger, fear, anxiety, greed, euphoria, grief, etc.).
(5) Beware of how incentives, situational pressures, and the way choices are presented may consciously or subconsciously affect behavior and shape decisions.
(6) In areas where the track record of "experts" is poor (e.g., in dealing with complex systems), rely on "the wisdom of crowds" instead. Crowds generally perform better when their members are capable and genuinely diverse and when dissent is tolerated (otherwise the crowd is prone to groupthink).
(7) Use intuition where appropriate (e.g., in stable linear systems with clear feedback), but recognize its limitations elsewhere (e.g., when dealing with complex systems).
(8) Avoid overspecialization, aiming to have enough generalist background to draw on diverse sources of information.
(9) Make appropriate use of the power of information technology.
(10) Overcome inertia by asking "If we did not do this already, would we, knowing what we now know, go into it?"
(11) Because complex systems have emergent properties (the whole is more than the sum of the parts), avoid oversimplifying them with reductionistic models (simulation models are often helpful), remember that the behavior of components is affected by the context of the system, and beware of unintended consequences when manipulating such systems.
(12) Remember that correlation doesn't necessarily indicate causality.
(13) Remember that the behavior of some systems involves nonlinearities and thresholds (bifurcations, instabilities, phase transitions, etc.) which can result in a large quantitative change or a qualitative change in system behavior.
(14) When dealing with systems involving a high level of uncertainty, rather than betting on a particular outcome, consider the full range of possible outcomes, and employ strategies which mitigate downside risks while capturing upside potential.
(15) Because of uncertainties and heterogeneities, luck often plays a role in success or failure, so consider process as much as outcomes and don't overestimate the role of skill (or lack thereof). A useful test of how much difference skill makes in a particular situation is to ask how easy it is to lose on purpose.
(16) Remember that luck tends to even out over time, so expect outcomes to "revert to the mean" (eventually move back toward the average). This isn't always the case, though: outliers can also occur, especially when positive feedback processes are involved (e.g., in systems whose components come to coordinate their behavior); in a business context, this is one reason making a good first impression matters, since early advantages can compound.
(17) Make use of checklists to help ensure that important things aren't forgotten.
(18) To scrutinize decisions, perform a "premortem" examination. This involves assuming that your decision hasn't worked out, coming up with plausible explanations for the failure, and then revising the decision accordingly to improve the likelihood of a better outcome.
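A couple of these points lend themselves to a quick numerical illustration. For point (6), here is a minimal sketch (not from the book; the true value, crowd size, and error spread are all illustrative assumptions) of why the average of many independent, diverse estimates tends to beat a typical individual estimate:

```python
import random

random.seed(1)
TRUE_VALUE = 100.0   # quantity being estimated (illustrative)
N_MEMBERS = 200      # crowd size (illustrative)

# Each member's estimate is the truth plus independent, diverse error.
estimates = [TRUE_VALUE + random.gauss(0, 20) for _ in range(N_MEMBERS)]

# Error of the crowd's mean estimate vs. the average member's own error.
crowd_error = abs(sum(estimates) / N_MEMBERS - TRUE_VALUE)
avg_individual_error = sum(abs(e - TRUE_VALUE) for e in estimates) / N_MEMBERS

print(f"average individual error: {avg_individual_error:.1f}")
print(f"crowd (mean) error:       {crowd_error:.1f}")
```

The independent errors largely cancel when averaged, which is why the sketch also shows the flip side of the book's caveat: if members copy each other (errors become correlated), the cancellation, and the crowd's advantage, disappears.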
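Point (16) can be illustrated the same way. In this hypothetical simulation (the skill/luck split and group size are assumptions for illustration, not figures from the book), outcomes mix persistent skill with transient luck, so one period's top performers land closer to the average in the next:

```python
import random

random.seed(2)
N = 1000

# Each outcome mixes persistent skill with transient luck (illustrative split,
# with luck deliberately noisier than skill).
skill = [random.gauss(0, 1) for _ in range(N)]
period1 = [s + random.gauss(0, 2) for s in skill]   # skill plus one period's luck
period2 = [s + random.gauss(0, 2) for s in skill]   # same skill, fresh luck

# Take the top decile of period-1 performers and see how they do next time.
ranked = sorted(range(N), key=lambda i: period1[i], reverse=True)
top = ranked[: N // 10]

avg_top_p1 = sum(period1[i] for i in top) / len(top)
avg_top_p2 = sum(period2[i] for i in top) / len(top)

print(f"top decile, period 1: {avg_top_p1:.2f}")
print(f"top decile, period 2: {avg_top_p2:.2f}")   # closer to the mean of 0
```

The top group was selected partly on good luck, which doesn't repeat, so only its (smaller) skill edge persists; this is also why the reviewer's point (15) pairs naturally with (16).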
While this book doesn't really present any new material, I still found it to be a good resource, so I recommend it. After all, this subject matter is important and practical, yet also counterintuitive, so it makes sense to read many books to help these insights sink in and actually change one's habits.