
on 23 May 2017
One of the best books I've read, should be required reading for everyone. Explores the hidden side of statistics in a genuinely interesting way, using examples from medicine to finance to help educate people in risk literacy. This was the first book by Gerd Gigerenzer I've read but I will definitely be checking out his other books.
on 14 August 2017
Another essential read for the "risk" community out there.
on 9 June 2014
We don't understand risk very well: that is one of Gigerenzer's central theses. Another is that in situations of uncertainty, simple rules are better than complex ones, which we can't understand, and which would only be fine if we had far more information. (So, for example, invest your money by splitting it equally between a number of investments of different kinds, or keep leverage to 10 per cent.) A third thesis is that in decision taking and leadership, we should trust our gut instincts. A fourth is that our fear instincts are geared to our evolutionary past, with their focus on things like spiders and 'dread events' where a lot of people die at the same time, such as 9/11.

This is all persuasive and I learned a good deal from this book. That relates particularly to understanding risk and the role that presentation through natural frequencies and 'icon boxes' and the like can play. The book is also excellent as a guide to understanding medical risks.

If I didn't find it gripping throughout, I think there are two main reasons. First, it's a bit repetitive, especially when it comes to the North American healthcare system and its shortcomings. Secondly, it's a bit optimistic about the impact of education in risk understanding and in catching students young, pre-adolescence. I just can't believe this will turn out to be, for example, a way to stop young people texting and driving, as Gigerenzer hopes...
on 22 June 2014
But still good. This is an excellent introduction to risk, but I'd advise giving the older and better "Reckoning with Risk" by the same author a go first.
If the thought of statistics makes your brain go numb but you really have to get a handle on what the doctor told you, this is the book for you.
on 27 July 2014
Very interesting and readable book. It provides plenty of food for thought.
on 6 October 2014
There's a view of human nature that we are irrational slaves of our appetites, creatures who need to be nudged into better behaviour and who would benefit from a benign paternalism. In this important and accessible book, Gerd Gigerenzer argues against this view by shifting the focus from "individual stupidity" to "the phenomenon of a risk-illiterate society." Gigerenzer distils a great deal of scholarship to show how we can all become more risk literate and what a difference that might make in the modern world.

Whenever presented with a bald probability, say, a 30 percent chance of rain tomorrow, we should always ask, 30 percent of what? Time? Geographical area? The number of weather forecasters? The absence of a reference class creates a confusion that's not just limited to rain, "but occurs whenever a probability is attached to a single event." One way out of this muddle is to use frequency statements that make the reference class clear ("it will rain on 30 percent of the days for which this announcement is made"). Gigerenzer calls these "mind tools" and he shows how relatively easy they are to learn and apply. (For more thinking tools, see Daniel Dennett's Intuition Pumps and Other Tools for Thinking.)

Being confused by a weather forecaster is one thing, but what if your doctor doesn't understand Bayesian inference? In one study, Gigerenzer switched from conditional probabilities to natural frequencies and discovered that how the information was presented was critical to accuracy. If you're a woman whose mammography screening is positive, there's a difference between being told you've got a 9 or a 90 percent chance of having breast cancer. Lack of risk literacy can take a "toxic toll" on healthy women who are encouraged to participate in mammography screening but who haven't been educated to expect a false alarm sooner or later. Gigerenzer's research has led to practical solutions such as icon and fact boxes, which make these risks transparent and which should be available in every doctor's waiting room.
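The natural-frequency reasoning described here can be sketched in a few lines of Python. The prevalence, sensitivity and false-positive figures below are the illustrative round numbers Gigerenzer commonly uses in this example, not data from this review; real screening figures vary by age group and country:

```python
def positive_predictive_value(prevalence, sensitivity, false_positive_rate,
                              population=1000):
    """Think in natural frequencies: of `population` women screened,
    how many who test positive actually have cancer?"""
    with_cancer = population * prevalence                    # e.g. 10 of 1000
    true_positives = with_cancer * sensitivity               # e.g. 9 test positive
    without_cancer = population - with_cancer                # e.g. 990
    false_positives = without_cancer * false_positive_rate   # e.g. ~89 false alarms
    return true_positives / (true_positives + false_positives)

# Assumed illustrative figures: 1% prevalence, 90% sensitivity, 9% false-positive rate.
ppv = positive_predictive_value(0.01, 0.9, 0.09)
print(f"Chance of cancer given a positive mammogram: {ppv:.0%}")  # roughly 9%, not 90%
```

Counting 9 true positives against roughly 89 false alarms makes it immediately visible why a positive result still means only about a 9 percent chance of cancer, which is the point of presenting risks as natural frequencies rather than conditional probabilities.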

Making informed decisions by both getting and understanding the facts is one of the messages of Margaret McCartney's The Patient Paradox: Why Sexed Up Medicine is Bad for Your Health. She laments how "the cult of awareness" is too often a substitute for the kind of education Gigerenzer advocates. He's more optimistic than many psychologists in thinking that everyone can improve their decision making, and he disagrees with Daniel Kahneman's idea that System 1 is not readily educable (see Thinking, Fast and Slow). Unlike perceptual illusions, cognitive illusions "are not hard-wired." That we find it easier to think in terms of natural frequencies rather than conditional probabilities shows that the defect lies more in the way risks are communicated than in our cognitive abilities.

There is a more fundamental problem with the two-system view. Kahneman and his followers take logic or probability theory as a general, "content-blind" norm of rationality. "In their thinking, heuristics can never be more accurate, only faster. That, however, is true only in a world of known risk. In an uncertain world, simple heuristics often can do better."

The difference between risk and uncertainty is one of the key ideas in this book. In lotteries and games of chance, all the alternatives are known and their probabilities can be calculated precisely. We don't know whether the flip of a coin will result in a head or a tail but we know the probability of each outcome. For all other situations, where the risks remain unknown and, crucially, unknowable, we're dealing with uncertainty. What will the weather be one week from today, or the level of the stock market, or the mood of our new romantic partner?

Statistical methods are required when dealing with known risks, heuristics when dealing with uncertainty. Instead of knocking heuristics, Gigerenzer suggests that "we need to study their ecological rationality" to find out when they work and when they don't.

The incalculability of uncertainty is bad enough; that we are also often in denial can be catastrophic. With all their sophisticated mathematical models, financial institutions treated "the highly unpredictable financial market as if its risks were predictable" and were surprised when it all blew up in their face (not a single bank foresaw the crash). We have a strong psychological need for certainty, and are susceptible to the high priests of finance or big pharma who may be selling us anything from pensions to pills (to say nothing of actual priests who promise all sorts of heavenly rewards with a remarkable degree of certainty given the paucity of evidence to back up their claims).

The world of uncertainty is vast compared to the world of risk, and it's the world we live in. According to Gigerenzer, while probability theory is all we need in a world of known risks, heuristics are indispensable in a world of uncertainty. Luckily for us, and odd as it may sound, simple rules can outperform more complex strategies. More information and more time and more sophisticated models do not necessarily lead to better decisions.

Gigerenzer challenges the ingrained idea that people are irredeemable and irrational slaves to their appetites. Amos Tversky, one of the pioneers of research into cognitive biases, liked to say, "My colleagues, they study artificial intelligence. Me? I study natural stupidity." It's a good line, but Gigerenzer has a different story. "People aren't stupid. The problem is that our educational system has an amazing blind spot concerning risk literacy." Being risk savvy is more than being well informed. It takes courage "to face an uncertain future as well as to stand up to authority and ask critical questions."
on 7 July 2014
It's obvious, isn't it, that the more we know the better our inferences will be? No, says Gerd Gigerenzer, it doesn't always follow. What we take to be knowledge is usually limited to the conscious variety. We are unaware of the vast reservoir of unconscious knowledge which we can access, and which is the source of gut feelings and intuitions.
Successful managers base all their decisions on reason, or so we have been led to believe. Wrong again, says Gigerenzer. Although they are reluctant to admit it, the higher up the hierarchy managers are the more likely they are to rely on gut feelings.
So why do many of us make bad decisions? Because we have not been educated to understand risk. We are unable to distinguish between known, calculable risks and uncertainty. We are very uncomfortable with uncertainty, preferring to accept the illusion of certainty offered to us by people in authority. The hunger for certainty is what prevents us from being risk savvy.
Heuristics are smart rules of thumb which can simplify decision making. They can be safer and more accurate than a calculation, yet are frowned upon by many. This book gives examples of heuristics ranging from the gaze heuristic of pilots to the aspiration rule which can prevent us wasting time and feeling restless and dissatisfied when shopping.
This is a book which encourages us to take more control of our lives. It allows us to see when we are being offered second or third best solutions because someone feels it necessary to engage in defensive decision making. Although it does contain repetition, it is a book well worth the time taken to read it.
on 1 August 2014
Book of the year in economics for 2014
on 19 July 2014
Very reliable. Everything as promised.
on 11 September 2014
Review courtesy of www.subtleillumination.com

Understanding and dealing with risk is essential in almost every aspect of the modern world; medicine, transportation, education, public policy, even game shows. Most of us do pretty badly at it; despite the fact that you’re more likely to die driving 12 miles than flying from New York to Washington, we feel more worried in the airplane than on the drive to the airport. The response of policymakers has been to argue the need for experts to save us from our biases. Risk Savvy disagrees: what we need, Gigerenzer argues, is risk education. Understanding probabilities is something that can be learned, and must be if we are to function in the world.

Gerd Gigerenzer is best known for his work arguing that though it's easy to criticize instinct and human decision making as being biased and flawed, in reality those biases actually work better than being unbiased would in the majority of situations. We aren't broken, leaky beta versions; rather, we operate with a well-designed and effective 'adaptive toolbox', one that allows us to navigate a wide variety of situations with considerable success and a minimum of effort.

Gigerenzer is a top academic doing very interesting work in psychology, and I think his academic work makes for some great reading. Unfortunately, this book is not that. He's oversimplified his work, and as a result it often feels like a linear combination of other pop behavioural economics books rather than a new addition to the field. He has some great examples of his points and some great stories, but nothing new to add to them. Still, some of the facts are really good. Consider the disparate policy approaches to mad cow disease and to child-proofing scented lamp oil bottles, despite the fact that they kill similar numbers of people, or that reading to an 8-to-16-month-old child boosts their performance on language tests by 7 points, while watching TV reduces it by 17 points. Not world-shaking, and not illustrating anything you didn't already know, but interesting. Still, if I were you I'd stick with some of his earlier books.
