Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts (2020 edition - updated and revised) Paperback – 4 Aug. 2020
When we make mistakes, cling to outdated attitudes, or mistreat other people, we must calm the cognitive dissonance that jars our feelings of self-worth. And so, unconsciously, we create fictions that absolve us of responsibility, restoring our belief that we are smart, moral, and right: a belief that often keeps us on a course that is dumb, immoral, and wrong.
Backed by years of research, Mistakes Were Made (But Not by Me) offers a fascinating explanation of self-justification: how it works, the damage it can cause, and how we can overcome it.
This updated edition features new examples and concludes with an extended discussion of how we can live with dissonance, learn from it, and perhaps, eventually, forgive ourselves.
Product details
- Publisher : Pinter & Martin Ltd.; 3rd New edition (4 Aug. 2020)
- Language : English
- Paperback : 464 pages
- ISBN-10 : 1780666950
- ISBN-13 : 978-1780666952
- Dimensions : 13.5 x 3.4 x 20.3 cm
- Best Sellers Rank: 98,333 in Books (See Top 100 in Books)
- 1,105 in Psychological Schools of Thought
- 2,531 in Higher Education of Biological Sciences
- 20,711 in Society, Politics & Philosophy
About the authors

Carol Tavris is a social psychologist, writer, and lecturer whose goal is to promote psychological science and critical thinking in improving our lives. She is coauthor, with Elliot Aronson, of "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts" and, with Avrum Bluming, of "Estrogen Matters: Why Taking Hormones in Menopause Can Improve Women's Well-Being and Lengthen Their Lives - Without Raising the Risk of Breast Cancer." Her other major books include the landmark "Anger: The Misunderstood Emotion" and the award-winning "The Mismeasure of Woman: Why Women Are Not the Better Sex, the Inferior Sex, or the Opposite Sex." She has written hundreds of essays, op-eds, and book reviews on topics in psychological science, writes a column for Skeptic magazine, and is a highly regarded lecturer who has spoken to groups around the world. She is a Fellow of the Association for Psychological Science.

Elliot Aronson is currently Professor Emeritus at the University of California, Santa Cruz. He has long-standing research interests in social influence and attitude change, cognitive dissonance, research methodology, and interpersonal attraction. Professor Aronson's experiments are aimed both at testing theory and at improving the human condition by influencing people to change dysfunctional attitudes and behaviors.
Professor Aronson received his B.A. from Brandeis University in 1954, his M.A. from Wesleyan University in 1956, and his Ph.D. in psychology from Stanford University in 1959. He has taught at Harvard University, the University of Minnesota, the University of Texas, and the University of California. In 1999, he won the American Psychological Association's Distinguished Scientific Contribution Award, making him the only psychologist to have won APA's highest awards in all three major academic categories: distinguished writing (1973), distinguished teaching (1980), and distinguished research (1999).
Customer reviews
Customers say
Customers find the book easy to read and interesting. They appreciate the enlightening content and thought-provoking ideas about human nature. The book helps readers reflect on their own behavior and better understand others.
Top reviews from United Kingdom
Far more common than politicians who lie are those for whom irrefutable evidence that they got things wrong is merely an invitation to engage in self-justification. Although George W. Bush was wrong about many things - that Saddam Hussein had weapons of mass destruction and was linked with Al Qaeda, for example - throughout his presidency he was more inclined to justify his decisions than to admit straightforwardly that he was wrong. When pushed, we've all been tempted by the passive construction - "mistakes were made" - to put distance between ourselves and our errors. Tavris and Aronson believe that "self-justification is more powerful and more dangerous than the explicit lie" because it allows us to convince ourselves that what we did was the best thing we could have done.
Exposing politicians who are good at deflecting criticism is not science and hardly news, however, and Tavris and Aronson soon move on from the high-profile and well-documented example of George W. Bush. Before they widen their scope to include just about everybody else, and to avoid the book becoming a procession of jaw-dropping but unconnected anecdotes, they first outline the theory of cognitive dissonance, which they use throughout to explain some otherwise very strange behaviour. Whenever we hold two ideas, attitudes, beliefs or opinions that are psychologically inconsistent, we experience an uncomfortable state of tension. Dissonance "is disquieting because to hold two ideas that contradict each other is to flirt with absurdity" and so it's not surprising that we go to some lengths to reduce dissonance. And, to make a bad cognitive situation worse, we indulge in the self-flattering idea that we "process information logically" rather than in more self-serving ways.
Imagine being innocent of a crime and yet accused. There's no way we'd confess! We'd be in good company in thinking such a thing impossible. Although "obviously one of the most dangerous mistakes that can occur in police interrogation", most detectives, prosecutors and judges share this view that innocent people simply don't confess. Those at the sharp end even have a manual of interrogation methods that virtually guarantees that such mistakes won't happen, so long as you follow the techniques laid down. "The manual is written in an authoritative tone as if it were the voice of God revealing indisputable truths, but in fact it fails to teach its readers a core principle of scientific thinking: the importance of examining and ruling out other possible explanations for a person's behavior before deciding which one is the most likely."
The trouble is, innocent people do get jailed, many more than will ever come to light. For one thing, reviewing old cases runs the risk of unearthing injustice, which means a heavy dose of dissonance for those responsible. Sending an innocent man to prison for fifteen years "is so antithetical to your view of your competence that you will go through mental hoops to convince yourself that you couldn't possibly have made such a blunder".
Changing trees once you realize you're barking up the wrong one is hard to do, and made harder by the fact that training often increases people's confidence in the accuracy of their judgements, not the accuracy itself. When tested, detectives who had had special training in the Reid Technique "did no better than chance, yet they were convinced that their accuracy rate was close to 100 percent". The problem here is that "the professional training of most police officers, detectives, judges, and attorneys" barely mentions their own cognitive biases.
Psychotherapists are also vulnerable to a "misplaced reliance on their own powers of observation and the closed loop it creates". If there's no way for your theory to be wrong, if every outcome confirms your hypotheses, "your beliefs are a matter of faith, not science". This is why Freud, "for all his illuminating observations about civilization and its discontents, was not doing science".
This is the heart of scientific reasoning, and cannot be stressed too often: for any theory to be scientific, "it must be stated in such a way that it can be shown to be false as well as true". Indeed, while we all naturally like to be proved right, the scientific method is often all about proving us wrong, and an essential if sometimes uncomfortable aspect of the scientific attitude is that we should change our beliefs "once they are discredited". I'll leave the last word to the authors of this brilliant book.
"Scientific reasoning is useful to anyone in any job because it makes us face the possibility, even the dire reality, that we were mistaken. It forces us to confront our self-justifications and put them on public display for others to puncture. At its core, therefore, science is a form of arrogance control."
Exemplars of cognitive bias (do these ideas morph into propaganda?) are experienced when we say one thing but believe another. A friend gives you a gift you really don't like; he recognizes the hidden dislike beneath your outward expression of "Oh, that's great, I've always wanted one of these!", and both of you are now equally confused. Cognitive dissonance distorts our decisions, our beliefs, memories and judgments.
The authors focus on self-directed bias: the distortions of memory and explanation that ensure each of us is, fortunately, always right in our opinions. Reading this material suggests that we should instinctively distrust those who try to convince us they're always right. There's something inherently puzzling and potentially dangerous about someone claiming absolute certainty, particularly when it comes to human nature. There's no place for the concept of the absolute when describing human emotional life or behaviour, though it can be found happily residing in the physical sciences.
Experience tells us that if we listen carefully to those who profess certainty about their views of the world, certain baggage travels with them. They tend to exhibit an excessively controlling personality, distrusting their right hemisphere's powerful links to intuitive and social bonding skills. Conversely, they may be impressed by the certainties promised by left-hemisphere dominance: they are drawn to logic and to deductive and mathematical reasoning principles. They are probably deeply insecure under the surface (may I reconsider that, please: they are most definitely deeply insecure under the surface).
The same logic applies to prejudice and bias. One man's certitude is another's propaganda. We ought to admit to both traits when they appear in our own reasoning. It's probably best just to laugh at yourself when exhibiting these characteristics, since they're irredeemably hardwired into our nervous system's matrix. It's human to show bias or preference to a degree. Certainly they're frequently unhelpful and can block deep insight, but they underpin necessary parts of our emotional and psychological development.
By definition you cannot see anything unless you exclude something else. That's the human part of the process of seeing and understanding that you see. You have to reject more than you take in, but perhaps by accepting this psychological fact, you let the sunshine in.
"The fact that an opinion has been widely held is no evidence whatever that it is not utterly absurd; indeed, in view of the silliness of the majority of mankind, a widespread belief is more likely to be foolish than sensible."
Bertrand Russell, Mathematician & Philosopher (1872-1970)
Mistakes Were Made is an interesting read, with the authors examining the subject of intellectual reasoning and hypocrisy: how we delude ourselves by explaining why everyone else can be wrong in their thinking, but not ourselves. Every sort of person can be afflicted by this state: politicians, journalists, medical researchers. The authors note that Danish investigators examined 159 clinical trials published in the British Medical Journal; this shows (p. 49) that, comparing studies in which the researchers did or did not declare a conflict of interest, the published results with a conflict indicate "significantly more positive results toward the experimental intervention" (i.e. the drug on trial compared with its competitor). This title should appeal to those who are interested in human psychology and people in general.
Thoroughly recommended reading, and for those who do enjoy this subject, I can direct you to Kluge by Gary Marcus, which looks at the same subject from a more anthropological viewpoint. The haphazard construction of our minds may show the inherent social value in deceit and the biochemistry of aggression. Consider why our brains don't epitomise the "perfectly evolved organ" that some have proposed; they are more of a work in progress.
I ordered this after reading Irrationality, Bad Science and Mumbo Jumbo. I really enjoy books like this, so I'm probably biased. This was another great book about (mainly) self-justification and cognitive dissonance. The overwhelming feeling I had after reading this book was one of despair. You realize a couple of things about our flawed human brains.
1. Your brain works not rationally but in order to make you feel comfortable. We are not rational creatures but selfish, self-justifying creatures.
2. It matters little what decisions you make (in terms of happiness) as your mind will work hard to make you feel you made the right decision.
3. People who believe things which you personally find ridiculous and clearly nonsensical (easily disprovable) hold those beliefs with complete conviction.
Top reviews from other countries
5.0 out of 5 stars Required reading
The chapter on romantic relationships alone makes it worth it. It has been an epiphany.
5.0 out of 5 stars Excellent Read - Full of Great Research But Not Too Technical
Self justification is a scary thing we do to preserve our ego and even ourselves. It's more powerful than a lie and it is absolutely more dangerous than a lie because we're not conscious that we're doing it.
This is such an excellent book for revealing why we do that as humans, helping you see where you might be hiding the truth from yourself and understanding how it plays into your attempts to influence others. The research covered in this book is great ... not too scientific but detailed enough that you understand what the point is.
For a business person or anyone interested in human psychology, but not wanting a hard read, this book will be highly satisfying for you!
From business to home (there is an entire chapter dedicated to how this plays into marriages) - this book will equip you with useful insights into the human mind and behaviors around mistakes and justifications for them. And you'll be in a better position to learn from your mistakes and help influence others when they are dead wrong too. :)

