£12.21 with 6 percent savings
RRP: £12.99
In stock
Dispatches from
Amazon
Sold by
Amazon
Returns
Returnable until Jan 31, 2025
For the 2024 holiday season, this item if purchased between November 1 and December 25, 2024 can be returned until January 31, 2025 or within 30 days from receipt (whichever is later).
Payment
Secure transaction
Your transaction is secure
We work hard to protect your security and privacy. Our payment security system encrypts your information during transmission. We don’t share your credit card details with third-party sellers, and we don’t sell your information to others.


Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts (2020 edition - updated and revised) Paperback – 4 Aug. 2020

4.4 out of 5 stars, 1,061 ratings



Why is it so hard to say 'I made a mistake' and really believe it?

When we make mistakes, cling to outdated attitudes, or mistreat other people, we must calm the cognitive dissonance that jars our feelings of self-worth. And so, unconsciously, we create fictions that absolve us of responsibility, restoring our belief that we are smart, moral, and right: a belief that often keeps us on a course that is dumb, immoral, and wrong.

Backed by years of research, Mistakes Were Made (But Not by Me) offers a fascinating explanation of self-justification: how it works, the damage it can cause, and how we can overcome it.

This updated edition features new examples and concludes with an extended discussion of how we can live with dissonance, learn from it, and perhaps, eventually, forgive ourselves.


Product description

Review

"Entertaining, illuminating and - when you recognize yourself in the stories it tells - mortifying." (Wall Street Journal)

"A revelatory study of how lovers, lawyers, doctors, politicians - and all of us - pull the wool over our own eyes . . . Reading it, we recognize the behavior of our leaders, our loved ones, and - if we're honest - ourselves, and some of the more perplexing mysteries of human nature begin to seem a little clearer." (Francine Prose, O, The Oprah Magazine)

"Every page sparkles with sharp insight and keen observation. Mistakes were made - but not in this book!" (Daniel Gilbert, author of Stumbling on Happiness)

About the Author

Dr. Carol Tavris's work as a writer, teacher, and lecturer has been devoted to educating the public about psychological science. She has spoken to students, psychologists, mediators, lawyers, judges, physicians, business executives, and general audiences on topics including self-justification; science and pseudoscience in psychology; gender and sexuality; critical thinking; and anger. In the legal arena, she has given many addresses and workshops to attorneys and judges on the difference between testimony based on good psychological science and testimony based on pseudoscience and subjective clinical opinion.

Elliot Aronson's primary research interests are in the general area of social influence, including attitude change, cognitive dissonance, research methodology, and interpersonal attraction. His experiments are aimed both at testing theory and at improving the human condition by influencing people to change dysfunctional attitudes and behaviours (e.g., prejudice, bullying, and the wasting of water, energy, and other environmental resources). He received his B.A. from Brandeis University in 1954, his M.A. from Wesleyan University in 1956, and his Ph.D. in psychology from Stanford University in 1959, and has taught at Harvard University, the University of Minnesota, the University of Texas, and the University of California. He is currently Professor Emeritus at the University of California, Santa Cruz. Professor Aronson is the only psychologist to have won the APA's highest awards in all three major academic categories: distinguished writing (1973), distinguished teaching (1980), and distinguished research (1999). In 2002 he was listed among the 100 most eminent psychologists of the 20th century (APA Monitor, July/August 2002), and in 2007 he received the William James Award for Distinguished Research from the APS.

Product details

  • Publisher ‏ : ‎ Pinter & Martin Ltd.; 3rd New edition (4 Aug. 2020)
  • Language ‏ : ‎ English
  • Paperback ‏ : ‎ 464 pages
  • ISBN-10 ‏ : ‎ 1780666950
  • ISBN-13 ‏ : ‎ 978-1780666952
  • Dimensions ‏ : ‎ 13.5 x 3.4 x 20.3 cm
  • Customer reviews: 4.4 out of 5 stars, 1,061 ratings


Customer reviews

4.4 out of 5 stars
1,061 global ratings

Customers say

Customers find the book easy to read and interesting. They appreciate the enlightening content and thought-provoking ideas about human nature. The book helps readers reflect on their own behavior and better understand others.

AI-generated from the text of customer reviews

34 customers mention readability (34 positive, 0 negative)

Customers find the book easy to read and engaging. They describe it as a straightforward read in clear language for people interested in thinking about thinking. The book is described as interesting, clear, and well-researched.

"...In this fascinating and important book, Carol Tavris and Elliot Aronson illustrate the many ways in which all kinds of people, including even..."

"...Mistakes Were Made is an interesting read, with the authors examining the subject of intellectual reasoning and hypocrisy - when we delude ourselves..."

"...This was another great book about (mainly) self-justification and cognitive dissonance...."

"...Before reading it I noted this book was a 'straight read' with no illustrations, yet it was a gripping book from start to finish...."

33 customers mention enlightening content (30 positive, 3 negative)

Customers find the book's content thought-provoking and excellent. It helps them reflect on their own behavior and better understand people. They find it a good reference point that makes them look at things in a different way. The book provides examples and research to discuss various aspects of human nature.

"...was another great book about (mainly) self-justification and cognitive dissonance...."

"...not written as another self help book, it certainly made me re-examine my own actions, as well as seeing faults in others...."

"...book is a good popular introduction to the subject, and also covers Confirmation Bias, which is what happens when we choose to believe or disbelieve..."

"A real lot of 'fun' - loads of examples and research to discuss the many variants of cognitive dissonance that exist in the world - on an individual..."

Top reviews from United Kingdom

Reviewed in the United Kingdom on 16 February 2011
To do anything in an uncertain world takes confidence in ourselves and in our beliefs. Without such convictions, we'd rarely get out of bed in the morning. Of course, no one (except the insane and the pope) claims to be in any way infallible. We all make mistakes - but few of us enjoy having them pointed out, or readily change as a result. In this fascinating and important book, Carol Tavris and Elliot Aronson illustrate the many ways in which all kinds of people, including even psychologists, "when directly confronted with proof that they are wrong, do not change their point of view or course of action but justify it even more tenaciously". The varied and ingenious ways in which mistakes are made and then unconsciously covered up are often amusing, sometimes tragic, and always compelling. The accounts are not just for our entertainment: edification comes thanks to the explanatory framework of dissonance theory. The moral of the story, as usual, is easier said than done: next time you screw up, try saying: "I made a mistake. I need to understand what went wrong. I don't want to make the same mistake again".

Far more common than politicians who lie are those for whom irrefutable evidence that they got things wrong is merely an invitation to engage in self-justification. Although George W. Bush was wrong about many things - that Saddam Hussein had weapons of mass destruction and was linked with Al Qaeda, for example - throughout his presidency he was more inclined to justify his decisions rather than straightforwardly admit he was wrong. When pushed, we've all been tempted by the passive construction - "mistakes were made" - to put distance between us and our errors. Tavris and Aronson believe that "self-justification is more powerful and more dangerous than the explicit lie" because it allows us to convince ourselves that what we did was the best thing we could have done.

Exposing politicians who are good at deflecting criticism is not science and hardly news, however, and Tavris and Aronson soon move on from the high-profile and well-documented example of George W. Bush. Before they widen their scope to include just about everybody else, and to avoid the book becoming a procession of jaw-dropping but unconnected anecdotes, they first outline the theory of cognitive dissonance, which they use throughout to explain some otherwise very strange behaviour. Whenever we hold two ideas, attitudes, beliefs or opinions that are psychologically inconsistent, we experience an uncomfortable state of tension. Dissonance "is disquieting because to hold two ideas that contradict each other is to flirt with absurdity" and so it's not surprising that we go to some lengths to reduce dissonance. And, to make a bad cognitive situation worse, we indulge in the self-flattering idea that we "process information logically" rather than in more self-serving ways.

Imagine being innocent of a crime and yet accused. There's no way we'd confess! We'd be in good company in thinking such a thing impossible. Although "obviously one of the most dangerous mistakes that can occur in police interrogation", most detectives, prosecutors and judges share this view that innocent people simply don't confess. Those at the sharp end even have a manual of interrogation methods that virtually guarantees that such mistakes won't happen, so long as you follow the techniques laid down. "The manual is written in an authoritative tone as if it were the voice of God revealing indisputable truths, but in fact it fails to teach its readers a core principle of scientific thinking: the importance of examining and ruling out other possible explanations for a person's behavior before deciding which one is the most likely."

The trouble is, innocent people do get jailed, many more than will ever come to light. For one thing, reviewing old cases runs the risk of unearthing injustice, which means a heavy dose of dissonance for those responsible. Sending an innocent man to prison for fifteen years "is so antithetical to your view of your competence that you will go through mental hoops to convince yourself that you couldn't possibly have made such a blunder".

Changing trees once you realize you're barking up the wrong one is hard to do, and made harder by the fact that training often increases people's confidence in the accuracy of their judgements, not the accuracy itself. When tested, detectives who had had special training in the Reid Technique "did no better than chance, yet they were convinced that their accuracy rate was close to 100 percent". The problem here is that "the professional training of most police officers, detectives, judges, and attorneys" barely mentions their own cognitive biases.

Psychotherapists are also vulnerable to a "misplaced reliance on their own powers of observation and the closed loop it creates". If there's no way for your theory to be wrong, if every outcome confirms your hypotheses, "your beliefs are a matter of faith, not science". This is why Freud, "for all his illuminating observations about civilization and its discontents, was not doing science".

This is the heart of scientific reasoning, and cannot be stressed too often: for any theory to be scientific, "it must be stated in such a way that it can be shown to be false as well as true". Indeed, while we all naturally like to be proved right, the scientific method is often all about proving us wrong, and an essential if sometimes uncomfortable aspect of the scientific attitude is that we should change our beliefs "once they are discredited". I'll leave the last word to the authors of this brilliant book.

"Scientific reasoning is useful to anyone in any job because it makes us face the possibility, even the dire reality, that we were mistaken. It forces us to confront our self-justifications and put them on public display for others to puncture. At its core, therefore, science is a form of arrogance control."
10 people found this helpful
Reviewed in the United Kingdom on 11 July 2010
Mistakes Were Made (but Not by Me) was written by two authors with backgrounds in social psychology, and is fuelled by a particular interest in cognitive dissonance, in which the brain tries to reconcile two contrasting viewpoints in an attempt to maintain a sense of overall personal integrity.

Cognitive bias (do these ideas morph into propaganda?) is experienced as saying something but believing something else. You are given a gift by a friend which you really don't like; he recognizes this hidden dislike beneath the outward expression of "oh, that's great, I've always wanted one of these!", and both of you are now equally confused. Cognitive dissonance distorts our decisions, our beliefs, memories and judgments.

The authors focus on self-directed bias: the distortions of memory and explanation that make sure each of us is always right in our opinions. Reading this material suggests that we should instinctively distrust those who try to convince us they're always right. There's something inherently puzzling and potentially dangerous about someone claiming absolute certainty, particularly when it comes to human nature. There's no place for the concept of the absolute when describing human emotional life or behaviour, though it can be found happily residing in the physical sciences.

Experience tells us that if we listen carefully to those who profess certainty about their views of the world, there is certain baggage that travels with them. They intrinsically exhibit an excessively controlling personality, distrusting their left hemisphere's powerful links to intuitive and social bonding skills. Conversely, they may be impressed by the certainties promised by their right hemisphere's dominance. They are drawn to logic and deductive and mathematical reasoning principles. They are probably deeply insecure under the surface - may I reconsider this, please - they are most definitely deeply insecure under the surface.

The same logic applies to prejudice and bias. One man's certitude is another's propaganda. We ought to admit to both traits when they appear in our own reasoning. It's probably best just to laugh at yourself when exhibiting these characteristics, as they're irredeemably hardwired into our nervous system's matrix. It's human to show bias or preference to a degree. Certainly they're frequently unhelpful and can be a block to deep insight, but they underpin necessary parts of our emotional and psychological development.

By definition you cannot see anything unless you exclude something else. That's the human part of the process of seeing and understanding that you see. You have to reject more than you take in, but perhaps by accepting this psychological fact, you let the sunshine in.

'The fact that an opinion has been widely held is no evidence whatever that it is not utterly absurd; indeed, in view of the silliness of the majority of mankind, a widespread belief is more likely to be foolish than sensible.'
Bertrand Russell, mathematician and philosopher (1872-1970)

Mistakes Were Made is an interesting read, with the authors examining the subject of intellectual reasoning and hypocrisy: how we delude ourselves by explaining why everyone else can be wrong in thought, but not ourselves. Every sort of person can be afflicted by this state: politicians, journalists, medical researchers. There's a comment (p. 49) that Danish investigators examined 159 clinical trials published in the British Medical Journal; comparing studies that obliged the researchers to declare a conflict of interest with those that did not, the published results indicated 'significantly more positive results toward the experimental intervention' (i.e. the drug on trial compared with its competitor). This title should appeal to those who are interested in human psychology and people in general.

Thoroughly recommended reading; for those who enjoy this subject, I can direct you to Kluge by Gary Marcus, which looks at the same subject from a more anthropological viewpoint. The haphazard construction of our minds may show the inherent social value in deceit and the biochemistry of aggression. Consider why our brains don't epitomise the 'perfectly evolved organ' that has been proposed by some; they are more of a work in progress.
15 people found this helpful
Reviewed in the United Kingdom on 23 December 2010
My mother used to say "A man convinced against his will is of the same opinion still".

This book discusses the research which indicates that she was right!

I ordered this after reading Irrationality, Bad Science and Mumbo Jumbo. I really enjoy books like this, so I'm probably biased. This was another great book about (mainly) self-justification and cognitive dissonance. The overwhelming feeling I had after reading this book was one of despair. You realize a couple of things about our flawed human brains.

1. Your brain works not rationally but in order to make you feel comfortable. We are not rational creatures but selfish, self-justifying creatures.

2. It matters little what decisions you make (in terms of happiness) as your mind will work hard to make you feel you made the right decision.

3. People who believe things which you personally find ridiculous and clearly nonsensical (easily disprovable) hold those beliefs with complete conviction.
7 people found this helpful

Top reviews from other countries

Anil kumar
5.0 out of 5 stars Very useful
Reviewed in India on 25 March 2021
Nice
Rodrigo
5.0 out of 5 stars Excellent Reading
Reviewed in Mexico on 22 May 2019
A great reading to understand how and why we all self-justify ourselves. When you start to be aware that we all fall into the cognitive dissonance trap you can start owning your mistakes, apologizing for those and take action from there.
Amazon Kunde
3.0 out of 5 stars Nice.
Reviewed in Italy on 6 December 2019
Lot of very good examples.
Carlos
5.0 out of 5 stars Required reading
Reviewed in Spain on 23 October 2016
It changed the way I think. Discovering the concept of cognitive dissonance lets you observe events with a new tool: it is like having a magnifying glass with which to see the details that explain a larger, not entirely comprehensible, reality.
The chapter on romantic relationships alone makes it worth it. It has been an epiphany.
JulzB
5.0 out of 5 stars Excellent Read - Full of Great Research But Not Too Technical
Reviewed in Canada on 1 July 2015
I've often wondered how seemingly good honest people turn into dishonest and self serving politicians. This book covers a lot of experiments and examples that show how little by little, small acts of dishonesty, eventually lead to the justification of big acts of dishonesty. You get a man to lose his ethical compass one step at a time.

Self justification is a scary thing we do to preserve our ego and even ourselves. It's more powerful than a lie and it is absolutely more dangerous than a lie because we're not conscious that we're doing it.

This is such an excellent book for revealing why we do that as humans, helping you see where you might be hiding the truth from yourself and understanding how it plays into your attempts to influence others. The research covered in this book is great ... not too scientific but detailed enough that you understand what the point is.

For a business person or anyone interested in human psychology, but not wanting a hard read, this book will be highly satisfying for you!

From business to home (there is an entire chapter dedicated to how this plays into marriages) - this book will equip you with useful insights into the human mind and behaviors around mistakes and justifications for them. And you'll be in a better position to learn from your mistakes and help influence others when they are dead wrong too. :)