
Expert Political Judgment: How Good Is It? How Can We Know? Paperback – 20 Aug 2006


Formats and editions:
  • Kindle Edition
  • Paperback: £17.92 (new from £13.73, used from £13.88)


Product details

  • Paperback: 352 pages
  • Publisher: Princeton University Press; New Ed edition (20 Aug 2006)
  • Language: English
  • ISBN-10: 0691128715
  • ISBN-13: 978-0691128719
  • Product Dimensions: 23.5 x 15.6 x 2.2 cm
  • Average Customer Review: 3.3 out of 5 stars (3 customer reviews)
  • Amazon Bestsellers Rank: 114,733 in Books (See Top 100 in Books)


Product Description

Review


Winner of the 2006 Grawemeyer Award for Ideas Improving World Order

Winner of the 2006 Woodrow Wilson Foundation Award, American Political Science Association

Winner of the 2006 Robert E. Lane Award, Political Psychology Section of the American Political Science Association

"It is the somewhat gratifying lesson of Philip Tetlock's new book . . . that people who make prediction their business--people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables--are no better than the rest of us. When they're wrong, they're rarely held accountable, and they rarely admit it, either. . . . It would be nice if there were fewer partisans on television disguised as "analysts" and "experts". . . . But the best lesson of Tetlock's book may be the one that he seems most reluctant to draw: Think for yourself."--Louis Menand, The New Yorker



"The definitive work on this question. . . . Tetlock systematically collected a vast number of individual forecasts about political and economic events, made by recognised experts over a period of more than 20 years. He showed that these forecasts were not very much better than making predictions by chance, and also that experts performed only slightly better than the average person who was casually informed about the subject in hand."--Gavyn Davies, Financial Times



"Before anyone turns an ear to the panels of pundits, they might do well to obtain a copy of Phillip Tetlock's new book Expert Political Judgment: How Good Is It? How Can We Know? The Berkeley psychiatrist has apparently made a 20-year study of predictions by the sorts who appear as experts on TV and get quoted in newspapers and found that they are no better than the rest of us at prognostication."--Jim Coyle, Toronto Star



"Tetlock uses science and policy to brilliantly explore what constitutes good judgment in predicting future events and to examine why experts are often wrong in their forecasts."--Choice



"[This] book . . . marshals powerful evidence to make [its] case. Expert Political Judgment . . . summarizes the results of a truly amazing research project. . . . The question that screams out from the data is why the world keeps believing that "experts" exist at all."--Geoffrey Colvin, Fortune



"Philip Tetlock has just produced a study which suggests we should view expertise in political forecasting--by academics or intelligence analysts, independent pundits, journalists or institutional specialists--with the same skepticism that the well-informed now apply to stockmarket forecasting. . . . It is the scientific spirit with which he tackled his project that is the most notable thing about his book, but the findings of his inquiry are important and, for both reasons, everyone seriously concerned with forecasting, political risk, strategic analysis and public policy debate would do well to read the book."--Paul Monk, Australian Financial Review



"Phillip E. Tetlock does a remarkable job . . . applying the high-end statistical and methodological tools of social science to the alchemistic world of the political prognosticator. The result is a fascinating blend of science and storytelling, in the best sense of both words."--William D. Crano, PsycCRITIQUES



"Mr. Tetlock's analysis is about political judgment but equally relevant to economic and commercial assessments."--John Kay, Financial Times



"Why do most political experts prove to be wrong most of the time? For an answer, you might want to browse through a very fascinating study by Philip Tetlock . . . who in Expert Political Judgment contends that there is no direct correlation between the intelligence and knowledge of the political expert and the quality of his or her forecasts. If you want to know whether this or that pundit is making a correct prediction, don't ask yourself what he or she is thinking--but how he or she is thinking."--Leon Hadar, Business Times

From the Inside Flap

"This book is a landmark in both content and style of argument. It is a major advance in our understanding of expert judgment in the vitally important and almost impossible task of political and strategic forecasting. Tetlock also offers a unique example of even-handed social science. This may be the first book I have seen in which the arguments and objections of opponents are presented with as much care as the author's own position."--Daniel Kahneman, Princeton University, recipient of the 2002 Nobel Prize in economic sciences

"This book is a major contribution to our thinking about political judgment. Philip Tetlock formulates coding rules by which to categorize the observations of individuals, and arrives at several interesting hypotheses. He lays out the many strategies that experts use to avoid learning from surprising real-world events."--Deborah W. Larson, University of California, Los Angeles

"This is a marvelous book--fascinating and important. It provides a stimulating and often profound discussion, not only of what sort of people tend to be better predictors than others, but of what we mean by good judgment and the nature of objectivity. It examines the tensions between holding to beliefs that have served us well and responding rapidly to new information. Unusual in its breadth and reach, the subtlety and sophistication of its analysis, and the fair-mindedness of the alternative perspectives it provides, it is a must-read for all those interested in how political judgments are formed."--Robert Jervis, Columbia University

"This book is just what one would expect from America's most influential political psychologist: Intelligent, important, and closely argued. Both science and policy are brilliantly illuminated by Tetlock's fascinating arguments."--Daniel Gilbert, Harvard University

--This text refers to an out of print or unavailable edition of this title.


Customer Reviews

3.3 out of 5 stars

Most Helpful Customer Reviews

11 of 11 people found the following review helpful By Rolf Dobelli TOP 500 REVIEWER on 8 Jun 2006
Format: Hardcover
If you want to find out what makes a forecaster a real expert or a lucky guesser, this book explains the complicated set of necessary talents. Author Philip E. Tetlock is a researcher and political psychologist. He tracks a wide academic path into psychological investigations about predicting the future - in business, politics or other arenas - and the implications of their results. He finds some surprises, especially in the study of objectivity and how people think. He explains psychological experiments on forecasting, and uses them as a trail through tangles of complex research. As you climb, enjoy the occasional clearings where some great ideas (such as Amos Tversky's "Support Theory") come to light. We find Tetlock's insights worth the journey. Despite its sometimes dense thickets, this book is necessary for people who want to understand the role of self-described "expert" prognosticators. If you wonder why the predictions of political, media and sports forecasters often are not worth heeding, Tetlock shows you how to distill the best from the rest. We recommend this book to journalists, political scientists and managers or executives who rely on "expert" opinions or futuristic scenarios.
6 of 8 people found the following review helpful By K.Castle on 7 July 2009
Format: Paperback
Bought this because of a radio broadcast in which the writer explained his research. However, the book is difficult and includes a lot of maths-type material (for me, anyway). I'm still glad I bought it, but it's a slog to get through in small installments.
0 of 7 people found the following review helpful By N. Tabak-tadema on 30 Jan 2013
Format: Paperback Verified Purchase
Too specific for an ordinary reader to get the hang of it. Utterly hard to read; not interesting for the layman.

Most Helpful Customer Reviews on Amazon.com (beta)

Amazon.com: 32 reviews
45 of 48 people found the following review helpful
A classic of Political Science & Cognitive Psychology 6 Jan 2007
By Dr. Frank Stech - Published on Amazon.com
Format: Paperback Verified Purchase
Tetlock conclusively shows two key points: First, the best experts in making political estimates and forecasts are no more accurate than fairly simple mathematical models of their estimative processes. This is yet another confirmation of what Robyn Dawes termed "the robust beauty of simple linear models." The inability of human experts to outperform models based on their expertise has been demonstrated in over one hundred fields of expertise across fifty years of research, making it one of the most robust findings in social science. Political experts are no exception.

Second, Tetlock demonstrates that experts who know something about a number of related topics (foxes) predict better than experts who know a great deal about one thing (hedgehogs). Generalist knowledge adds to accuracy.

Tetlock's survey of this research is clear, crisp, and compelling. His work has direct application to world affairs. For example, he is presenting his findings to a conference of Intelligence Community leaders next week (Jan 2007) at the invitation of the Director of National Intelligence.

"Expert Political Judgment" is recommended to anyone who depends on political experts, which is pretty much all of us. Tetlock helps the non-experts to know more about what the experts know, how they know it, and how much good it does them in making predictions.
29 of 32 people found the following review helpful
Careful, Plodding, Objective 23 Sep 2006
By Peter McCluskey - Published on Amazon.com
Format: Paperback
This book is a rather dry description of good research into the forecasting abilities of people who are regarded as political experts. It is unusually fair and unbiased.

His most important finding about what distinguishes the worst from the not-so-bad is that those on the hedgehog end of Isaiah Berlin's spectrum (who derive predictions from a single grand vision) are wrong more often than those near the fox end (who use many different ideas). He convinced me that this finding is approximately right, but it leaves me with questions.

Does the correlation persist at the fox end of the spectrum, or do the most fox-like subjects show some diminished accuracy?

How do we reconcile his evidence that humans with more complex thinking do better than simplistic humans, but simple autoregressive models beat all humans? That seems to suggest there's something imperfect in using the hedgehog-fox spectrum. Maybe a better spectrum would use evidence on how much data influences their worldviews?

Another interesting finding is that optimists tend to be more accurate than pessimists. I'd like to know how broad a set of domains this applies to. It certainly doesn't apply to predicting software shipment dates. Does it apply mainly to domains where experts depend on media attention?

To what extent can different ways of selecting experts change the results? Tetlock probably chose subjects that resemble those whom most people regard as experts, but there must be ways of selecting experts which produce better forecasts. It seems unlikely they can match prediction markets, but there are situations where we probably can't avoid relying on experts.

He doesn't document his results as thoroughly as I would like (even though he's thorough enough to be tedious in places):

I can't find his definition of extremists. Is it those who predict the most change from the status quo? Or the farthest from the average forecast?

His description of how he measured the hedgehog-fox spectrum has a good deal of quantitative evidence, but not quite enough for me to check where I would be on that spectrum.

How does he produce a numerical timeseries for his autoregressive models? It's not hard to guess for inflation, but for the end of apartheid I'm rather uncertain.

Here's one quote that says a lot about his results:

"Beyond a stark minimum, subject matter expertise in world politics translates less into forecasting accuracy than it does into overconfidence."
15 of 15 people found the following review helpful
Great Decision Making Evidence 10 Mar 2006
By T. Coyne - Published on Amazon.com
Format: Hardcover
As both a consultant and an investment manager I have spent a lot of years studying decision theory. One limitation in a lot of the work I encountered was its heavy reliance on lab studies using students. You were never quite sure if the conclusions applied in the "real world." This outstanding book puts that concern to rest. It is by far the richest body of evidence I have encountered on decision making in real world situations. Anybody interested in decision making and decision theory will profit from reading it.
4 of 4 people found the following review helpful
Human expert - an oxymoron ? 18 Aug 2009
By Chetan Chawla - Published on Amazon.com
Format: Paperback
"The fox knows many things but the hedgehog knows one big thing." - Isaiah Berlin, The Hedgehog and the Fox.

Tetlock uses the Fox-Hedgehog thinking-style framework from Berlin's classic essay to distinguish between top-down ideological Hedgehogs and the far more circumspect and integrative Foxes. After a long-term study of predictive ability in the political sphere, Tetlock clearly shows the superiority of Foxes over Hedgehogs (who are more vulnerable to cognitive biases).

However, worryingly for experts/pundits/consultants everywhere, the study also reveals the inability of humans to outperform statistical and base-rate extrapolation algorithms in any complex predictive process with stochastic elements (politics/finance/economics...).
6 of 7 people found the following review helpful
If you don't think you know it all, there is a better chance you will get it right 7 Jun 2006
By Shalom Freedman - Published on Amazon.com
Format: Hardcover
This is an important book, for it gives us insight into how to evaluate the thousands of experts who continually bombard us with their predictions. Tetlock chooses the difficult and murky area of political judgment and centers his analysis on it, though his basic conclusions relate to forecasting in other areas, such as business and finance. Roughly, he takes Isaiah Berlin's distinction between the hedgehog, who knows one big thing, and the fox, who knows many little things, as the basis of his analysis. As he sees it, the Hedgehogs who base themselves on big theories are invariably wrong, while the Foxes, who tend to be more open to the actual play of reality, have a far better record.

As he understands it, the Hedgehogs go overboard in making Boom or Bust predictions. He provides empirical study data to show how they are most often wrong, even more so, by the way, when they are predicting Disaster. Those who qualify their predictions with 'perhaps', 'however', 'nonetheless' and 'possibly' have a far better chance of getting it right.

The irony of this, however, is that it is precisely the Hedgehogs who are rewarded and receive the greatest media attention. They are never punished for being wrong, for few seem to follow up and check on the accuracy of the prediction. The more accurate, qualified assessments, on the other hand, are given scantier space and attention. After all, when we are uncertain about the future, who wants to hear a prediction which itself admits it is uncertain?

This is a very instructive book, although I wish at times its systems of classification were a bit less awkward.

On the whole, though, this is a highly recommended and important work, which can be of real help to most of us in understanding how to separate the 'wheat' from the 'bull'.