Superforecasting: The Art and Science of Prediction MP3 CD – MP3 Audio, 29 Sept. 2015
| Format | Amazon Price | New from | Used from |
| --- | --- | --- | --- |
| Kindle Edition | — | — | — |
| Audible Audiobooks, Unabridged | £0.00 (free with your Audible trial) | — | — |
| MP3 CD, Audiobook, MP3 Audio, Unabridged | £12.33 | £12.33 | — |
From one of the world's most highly regarded social scientists, a transformative book on the habits of mind that lead to the best predictions.
Everyone would benefit from seeing further into the future, whether buying stocks, crafting policy, launching a new product, or simply planning the week's meals. Unfortunately people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts' predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught?
In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people, including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer, who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They've beaten other benchmarks, competitors, and prediction markets. They've even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters."
In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Superforecasting offers the first demonstrably effective way to improve our ability to predict the future, whether in business, finance, politics, international affairs, or daily life, and is destined to become a modern classic.
- Language: English
- Publisher: Audible Studios on Brilliance audio
- Publication date: 29 Sept. 2015
- Dimensions: 17.15 x 13.97 x 1.27 cm
- ISBN-10: 1511358491
- ISBN-13: 978-1511358491
Customers who viewed this item also viewed
Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, by Nassim Nicholas Taleb (Paperback)
Product details
- Publisher : Audible Studios on Brilliance audio; Unabridged edition (29 Sept. 2015)
- Language : English
- ISBN-10 : 1511358491
- ISBN-13 : 978-1511358491
- Dimensions : 17.15 x 13.97 x 1.27 cm
- Best Sellers Rank: 1,261,806 in Books (See Top 100 in Books)
- 883 in Professional Financial Forecasting
- 11,327 in Cognition & Cognitive Psychology
- 19,158 in Business Economics (Books)
About the authors

Philip E. Tetlock (born 1954) is a Canadian-American political science writer, and is currently the Annenberg University Professor at the University of Pennsylvania, where he is cross-appointed at the Wharton School and the School of Arts and Sciences.
He has written several non-fiction books at the intersection of psychology, political science and organizational behavior, including Superforecasting: The Art and Science of Prediction; Expert Political Judgment: How Good Is It? How Can We Know?; Unmaking the West: What-if Scenarios that Rewrite World History; and Counterfactual Thought Experiments in World Politics. Tetlock is also co-principal investigator of The Good Judgment Project, a multi-year study of the feasibility of improving the accuracy of probability judgments of high-stakes, real-world events.
For more see here: https://en.wikipedia.org/wiki/Philip_E._Tetlock
For CV: https://www.dropbox.com/s/uorzufg1v0nhcii/Tetlock%20CV%20%20march%2018%2C%202016.docx?dl=0
Twitter: https://twitter.com/PTetlock
LinkedIn: https://www.linkedin.com/in/philip-tetlock-64aa108a?trk=hp-identity-name
For an interview: https://www.edge.org/conversation/philip_tetlock-how-to-win-at-forecasting

Dan Gardner is the New York Times best-selling author of books about psychology and decision-making. His work has been called "an invaluable resource for anyone who aspires to think clearly" by The Guardian and "required reading for journalists, politicians, academics, and anyone who listens to them" by Harvard psychologist Steven Pinker.
Gardner’s books have been published in 25 countries and 19 languages.
In addition to writing, Gardner lectures on forecasting, risk, and decision-making.
Prior to becoming an author, Gardner was a newspaper columnist and feature writer whose work won or was nominated for every major award in Canadian newspaper journalism.

Customer reviews
Top reviews from United Kingdom
But what do you find if you actually put forecasts to the test by recording them, checking which were right or wrong, and analysing who is best at making forecasts?
That is what Philip Tetlock and Dan Gardner set out to do in Superforecasting, based on a mammoth twenty-year study by Tetlock of predictions about current affairs. Funded by the US intelligence community - who have an obvious interest in accurate forecasting - Tetlock has run regular forecasting competitions to see who is best. Teams or individuals? Experts or novices? And so on.
Most 'experts' are not very expert when it comes to predictions, it turns out. In Tetlock's research the average expert is only marginally better at predicting the future than a layperson applying random guesswork. But some - the superforecasters - are consistently better, and they use approaches which we can all learn. It is not down to some mystical inbuilt talent.
The picture painted in Superforecasting is that the best forecaster has to behave almost like the opposite of what makes for a media-friendly pundit.
They should always be questioning their own assumptions and approaches. They should always remain modest about their certainty. They should always listen to the views of others. They should be more interested in learning how to be better at forecasting than in proclaiming their own brilliance compared to others. They should admit when they are wrong and learn from it. They should value revision and improvement over consistency. They should think in terms of doubt and probabilities rather than certainties and absolutes. They should break issues down into component parts rather than apply one grand theory to everything. They should understand that being right with one forecast may well be followed by being wrong with the next.
Or in other words, the sort of person who says, "Those idiots never listened to me, but I knew I was right all along - and everything would have been better if they'd only followed my one approach for all those years" is awful as a forecaster. But that's just the mindset which makes for high-profile punditry.
A good reason to be careful about which pundits you pay attention to.
The most spectacular example of mistaken predictions with catastrophic effects was the assessment that Saddam Hussein had weapons of mass destruction which he was prepared to use. In the US, the CIA, the National Security Agency and the Defense Intelligence Agency, plus 13 other agencies, all agreed. After the invasion of Iraq in 2003 the "facts" were shown to be false.
In a rare acknowledgement of their mistakes, the US authorities were brave enough to question their own fundamental ability to forecast. In 2006 the Intelligence Advanced Research Projects Activity (IARPA) was created to fund research into making intelligence analysis smarter, and it co-opted Philip Tetlock. IARPA would sponsor a massive tournament to see who could invent the best methods of making all sorts of forecasts. IARPA was breaking the mould by demanding the measurement of forecasts: how accurate are forecasts in predicting outcomes? It is only by measurement that improvements can be made.
Tetlock’s contribution to the IARPA research was his Good Judgment Project on the internet, whereby he got 2,800 people to make specific predictions on a range of questions. For example, will an outbreak of H5N1 kill more than 10 people in China in the next 6 months? Will the euro fall below $1.20 in the next 12 months? Will the President of Tunisia flee into exile within 3 months? Then, after the 6- or 12-month period had elapsed, the accuracy of the forecasts and the forecasters was assessed.
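The accuracy measure Tetlock describes in the book is the Brier score: the squared gap between the probability a forecaster gave and the eventual yes-or-no outcome, averaged over questions, with lower being better. The sketch below uses the simple 0-to-1 variant of the score; the probabilities and outcomes attached to the review's example questions are invented purely for illustration.

```python
# Minimal sketch of Brier scoring for yes/no probability forecasts.
# Convention: outcome is 1 if the event happened, 0 if it did not;
# a perfect forecaster scores 0, and lower is better.

def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a forecast probability and the 0/1 outcome."""
    return (forecast - outcome) ** 2

# Hypothetical forecaster: (question, forecast probability, actual outcome).
# The numbers are made up; only the questions come from the review above.
forecasts = [
    ("H5N1 kills more than 10 people in China within 6 months?", 0.15, 0),
    ("Euro falls below $1.20 within 12 months?",                 0.30, 0),
    ("President of Tunisia flees into exile within 3 months?",   0.70, 1),
]

scores = [brier_score(p, outcome) for _, p, outcome in forecasts]
print(f"Mean Brier score: {sum(scores) / len(scores):.3f}")
```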
His surprising results were that there are people who prove consistently good at forecasting across a broad range of subjects, and they are not necessarily the super-bright specialists immersed in their subject. The superforecasters came from a cross section of the general public. But they did share common characteristics and an approach to making forecasts. Thanks to IARPA we now know that a few hundred ordinary people and some simple math can not only compete with professionals supported by a multibillion-dollar apparatus but beat them.
The book lays out in fascinating detail what these characteristics are and the fundamentals of the approach to forecasting that increases chances of success.
If you want to test yourself against the superforecasters, there is a new Good Judgment Project being launched on the net. Go to www.goodjudgement.com if this fascinating book has mesmerised you.
So they are people with time on their hands (often retired or unemployed), ready to research the exam questions at some length and to put in the time necessary to update their original predictions. Psychologically they have a high 'need for cognition'. They are bright, but not super-bright. They are numerate, but not as a rule super-numerate. They follow current affairs but are not complete news junkies. They go in for 'slow' rather than 'fast' thinking and think in terms of probabilities, adjusting those probabilities on Bayesian principles. They break problems down into their component parts. And they work better in teams - indeed so well in teams that they outperform prediction markets (which they don't do individually). And yes, it's more skill than luck. And if we don't know how good they'd be at predicting black swans, well, maybe black swans are rare and that's the nature of predicting them. There are limitations: predictions in particular are quantifiable estimates of the likelihood of precisely defined, measurable outcomes. So not always quite the ideal product.
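As a rough illustration of the 'adjusting probabilities on Bayesian principles' habit, the sketch below starts from a base-rate prior and revises it as a new piece of evidence arrives. The forecasting question, prior, and likelihoods are all invented for the example; nothing here comes from the book's own data.

```python
# Minimal sketch of Bayesian updating of a forecast probability.
# P(event | evidence) = P(evidence | event) * P(event) /
#     [P(evidence | event) * P(event) + P(evidence | not event) * P(not event)]

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return the posterior probability of the event given one new piece of evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical question: "Will the incumbent lose the election?"
prior = 0.20  # base rate taken from (imagined) similar past elections

# New evidence: a sharp drop in approval ratings, assumed to show up in
# 60% of losing campaigns but only 20% of winning ones.
posterior = bayes_update(prior, p_evidence_if_true=0.60, p_evidence_if_false=0.20)
print(f"Forecast revised from {prior:.2f} to {posterior:.2f}")  # roughly 0.20 -> 0.43
```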
I had perhaps hoped for a little more applicability to daily life - but I feel I did learn quite a lot from this book that was completely new to me. I would strongly recommend it to others.




