How We Know What Isn't So Paperback – 5 Mar. 1993
Thomas Gilovich (Author)
When can we trust what we believe—that "teams and players have winning streaks," that "flattery works," or that "the more people who agree, the more likely they are to be right"—and when are such beliefs suspect? Thomas Gilovich offers a guide to the fallacy of the obvious in everyday life. Illustrating his points with examples, and supporting them with the latest research findings, he documents the cognitive, social, and motivational processes that distort our thoughts, beliefs, judgments and decisions. In a rapidly changing world, the biases and stereotypes that help us process an overload of complex information inevitably distort what we would like to believe is reality. Awareness of our propensity to make these systematic errors, Gilovich argues, is the first step to more effective analysis and action.
Customers who viewed this item also viewed
- Irrationality: The Enemy Within, Stuart Sutherland (Paperback)
Product details
- Publisher : Free Press; Reprint edition (5 Mar. 1993)
- Language : English
- Paperback : 224 pages
- ISBN-10 : 0029117062
- ISBN-13 : 978-0029117064
- Dimensions : 15.56 x 1.42 x 23.5 cm
- Best Sellers Rank: 239,184 in Books
- 200 in Philosophical Logic
- 681 in The Self, Ego & Personality
- 794 in Social Psychology (Books)
Customer reviews
Top reviews from United Kingdom
The brain is hard-wired to detect order in the nature of things. We can learn from experience by accumulated observations and this has obvious survival advantages in evolutionary terms.
But where do things start to go wrong? First of all, we see ordered patterns in outcomes that are in fact the blind product of chance. Chance produces less alternation than our intuition leads us to expect. If we toss a coin 20 times, we're unlikely to see exactly 10 heads and 10 tails, and a series of 20 tosses has roughly a 50-50 chance of producing 4 heads in a row. When we see patterns such as a lucky streak in basketball, we think we are spotting an order that isn't in fact there.
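That 50-50 figure is easy to check with a quick simulation. Here is a minimal Python sketch (the trial count and function name are my own, not the book's) estimating the chance of a run of four or more heads in 20 fair tosses:

```python
import random

def has_head_run(n_tosses: int, run_length: int) -> bool:
    """Toss a fair coin n_tosses times; report whether a run of
    run_length or more consecutive heads appears."""
    streak = 0
    for _ in range(n_tosses):
        if random.random() < 0.5:  # heads
            streak += 1
            if streak >= run_length:
                return True
        else:
            streak = 0
    return False

trials = 100_000
hits = sum(has_head_run(20, 4) for _ in range(trials))
print(f"P(4+ heads in a row over 20 tosses) ~ {hits / trials:.3f}")
# Prints roughly 0.48, close to the 50-50 figure quoted above.
```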
The regression effect also fools us into misattributing causes. Suppose you perform exceptionally badly, or exceptionally well, in an exam, much worse or better than your average. Your next result is likely to move back towards your average: that's the regression effect. But we assume the exceptional, atypical result is representative when the regression effect would tell us otherwise: investors may assume that a company's bumper profits in one year will be repeated in future years, when in all likelihood they will fall back.
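A small simulation makes the regression effect vivid. This is a hypothetical model (abilities and noise levels invented for illustration): each score is stable ability plus exam-day noise, and students selected for an extreme first score drift back towards the mean on the second, with nothing causing the change but the noise itself:

```python
import random

random.seed(1)

# Model each exam score as stable ability plus exam-day noise.
ability = [random.gauss(60, 10) for _ in range(10_000)]
exam1 = [a + random.gauss(0, 10) for a in ability]
exam2 = [a + random.gauss(0, 10) for a in ability]

# Select the students whose *first* score was exceptional (top 5%).
cutoff = sorted(exam1)[int(0.95 * len(exam1))]
top = [i for i, s in enumerate(exam1) if s >= cutoff]

avg1 = sum(exam1[i] for i in top) / len(top)
avg2 = sum(exam2[i] for i in top) / len(top)
print(f"top scorers, exam 1: {avg1:.1f}; same students, exam 2: {avg2:.1f}")
# Exam 2 falls back towards the population mean of 60: nothing caused
# the "decline" except the noise that inflated exam 1 in the first place.
```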
We also form beliefs on insufficient data, treating weakly tested hypotheses as facts. We look for confirmatory examples while overlooking or discounting facts that contradict a belief; we fail, in other words, to understand the distinction between necessary and sufficient evidence. We seize on isolated, salient pieces of data that prematurely confirm a hypothesis. Take a homeopath's claim that a cancer patient was miraculously cured after taking an alternative remedy. The recovery is treated as conclusive evidence of the remedy's efficacy. But such evidence is in itself insufficient to prove anything: isolated facts do not by themselves provide sufficient confirmation, and they are too vulnerable to the discovery of counter-examples that contradict the hypothesis.
We leap to such conclusions because, when we test a hypothesis, we fail to define what success or failure would be. Too often beliefs are formed with vague definitions of what counts as a successful confirmation. Studies of identical twins separated at birth may well track an identity of life outcomes that points strongly to genetic influences. But there are many outcomes in any given life, and some may overlap and give the impression of congruence. The twins may both choose the same occupation, a striking identity of outcome, but it is only one outcome among many, and the others may vary. The danger, once again, is treating an overlap between two sets of outcomes as significant while overlooking the variances. Likewise, many predictions are couched so vaguely as to be guaranteed against disconfirmation, akin to Woody Allen's spoof Nostradamus character who portentously avers that 'two nations will go to war, but only one will win'.
Does our social nature compensate for this? Not necessarily. We tend to associate with like-minded people and to fight shy of conflict and controversy. So members of presidential advisory groups keep their own counsel, and we keep our mouths shut during a meeting at work: we do not want to be seen to rock the boat. The result is that others believe their beliefs are more broadly shared than they actually are (one reason the bore and the name-dropper carry on with a self-defeating strategy is precisely the reluctance of others to point it out).
Good heavens, having said all this, how on earth can we tell if our beliefs are well founded? There is no easy way out of these cognitive illusions. But it's not all bad. We have good reason, for example, to accept the theory of gravity, which has weight (so to speak) and is well attested by centuries of observational and statistical data; we can rightly disregard claims of levitation on that basis.
We can also tighten up our definitions of what counts as confirmation, as noted earlier. If we were testing whether a training course that claims to raise the performance of sales staff really works, we would define successful confirmation in advance as increased sales figures. Scientific practice also helps: we can make sure that a researcher does not know which members of a trial group are receiving the new drug being tested, so that preconceptions of success or failure do not contaminate the researcher's observations. And we can test whether a claim for an extraordinary effect like Extra Sensory Perception can be replicated (it can't).
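That blinding procedure can be sketched in a few lines of Python (participant codes and measurements below are invented placeholders): the experimenter sees only opaque codes, and the code-to-treatment key stays sealed until every outcome is recorded:

```python
import random

random.seed(3)

# Participants are known to the experimenter only by opaque codes;
# the code-to-treatment key is held by a third party until the end.
participants = [f"P{i:02d}" for i in range(1, 21)]
arms = ["drug"] * 10 + ["placebo"] * 10
random.shuffle(arms)
key = dict(zip(participants, arms))  # sealed during the trial

# The experimenter records outcomes against the codes alone
# (random numbers stand in for real measurements here).
measured = {p: random.gauss(0, 1) for p in participants}

# Unblinding happens once, after every outcome is in.
drug = [v for p, v in measured.items() if key[p] == "drug"]
placebo = [v for p, v in measured.items() if key[p] == "placebo"]
print(f"drug mean: {sum(drug)/len(drug):.2f}, "
      f"placebo mean: {sum(placebo)/len(placebo):.2f}")
```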
These are palliatives, however. We can only strive, imperfectly, to recognise when our reasoning faculties are leading us up blind alleys. This book will at least help you be a little more vigilant when it comes to forming conclusions about why you think you are right to believe the way you do.
Part I of the book opens with an important claim. "Human nature abhors a lack of predictability and the absence of meaning." We tend to see order in "the often messy data of the real world" where there is none. Part II examines why we might want to hold questionable beliefs, and the role society plays in supporting or promoting those beliefs. Part III looks at several areas in which erroneous beliefs flourish, including the fertile ground of alternative medicine. Finally, in case you were beginning to lose all hope in humanity, Part IV shows "how we might improve the way we evaluate the evidence of everyday life".
Evidence of whatever provenance is important for most people, but one of the major themes of the book - indeed of this whole field of research - is to show how evidence cannot always be neatly bagged and labelled like an exhibit in a court case. "For nearly all complex issues, the evidence is fraught with ambiguity and open to alternative interpretation." We "often fail to recognize that a particular belief rests on inadequate evidence" and are "prone to self-serving assessments". People have even been found "to attribute their successes to themselves, and their failures to external circumstances." Who'd have thought it?
We humans are rightly proud of our ability to see patterns in nature, but we overreach ourselves when we extract "too much meaning from chance events". Professional basketball players and their fans fail to recognize randomness when they talk about winning or losing streaks, and so commit the "hot hand" fallacy. Gilovich and his colleagues discovered that the outcome of any given shot has no predictable influence on the outcome of the following shot. The powerful impression that there is some kind of connection between a sequence of similar events is "the clustering illusion" and the temptation is then to "explain" the phenomenon with "superfluous and often complicated causal theories." Ignoring regression to the mean has rather more serious consequences in education, for example, where spurious regimes of reward and punishment result in a lot of wasted time and energy.
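Gilovich's basketball result is easy to mimic. The sketch below (a toy shooter with invented parameters, not the study's data) shows shots that are independent coin flips still producing long streaks, while the hit rate after a hit is no better than after a miss:

```python
import random

random.seed(2)

# An "iid shooter": every shot hits with the same 50% probability,
# regardless of what happened on the previous shot.
shots = [random.random() < 0.5 for _ in range(100_000)]

after_hit = [b for a, b in zip(shots, shots[1:]) if a]
after_miss = [b for a, b in zip(shots, shots[1:]) if not a]
print(f"P(hit | previous hit)  = {sum(after_hit) / len(after_hit):.3f}")
print(f"P(hit | previous miss) = {sum(after_miss) / len(after_miss):.3f}")

# Both conditionals come out near 0.5 (no hot hand), yet the same
# sequence is full of streaks that invite causal storytelling:
longest = streak = 0
for hit in shots:
    streak = streak + 1 if hit else 0
    longest = max(longest, streak)
print(f"longest run of consecutive hits: {longest}")
```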
Our best defence "against erroneous beliefs" is, basically, science. There is no magic formula, only easily understandable principles, such as insisting upon "replicability and the public presentation of results". Ideas and findings "that rest on a shaky foundation tend not to survive in the intellectual marketplace." Contrast that with our everyday lives, in which we tend to seek confirmation of our beliefs, not contradiction. It is as if we ask ourselves "Can I believe this?" for what we want to believe and "Must I believe this?" for what we don't want to believe.
"Many of our most bizarre and erroneous beliefs do not survive our interactions and discussions with others." Religious leaders understand this, which is why they fear the books by so-called militant atheists they would once have burned. It is not only the religious who are thankful that people "are generally reluctant to openly question another person's beliefs." Fringe medicine is also "plagued by questionable, erroneous, and often harmful beliefs". Few of us realize the extent to which the "body is a truly amazing machine with remarkable powers to set itself right" or are aware of the "common fluctuations in the course of most diseases". By falling for the post hoc ergo propter hoc fallacy, quacks claim any temporary improvement for themselves (while of course accepting no responsibility for any deterioration).
Confidence is one of those go-to words - especially at times of financial crisis or political change - and is thought to be a universal good. Yet we should remember that those holding false beliefs can be confident too, often more so than those who would question those beliefs. Reading Gilovich will reduce certainty and confidence where they are unwarranted, but will reward you with a surer footing in a complex and often unpredictable world. Be prepared for "conflict and disharmony" as you question beliefs that are unquestionable, and set the bar low. When we tie ourselves in irrational knots, don't expect the intellect to make any great leap.
For those entirely new to the subject, this book may well give a better understanding of why so many people believe things which are demonstrably not true (or are not demonstrably true). The author singles out UFOs and ESP as his two major case studies. There are other good candidates for this analysis, like astrology and homeopathy, but they get only the most fleeting of mentions. In any event, the explanation of the scientific method could have been more systematically explored, as Ben Goldacre does in "Bad Science". The book ends with a rather unconvincing plea that "social scientists" have greater insight into truth than the "hard scientists". Hmm, not really convinced!

