51 of 51 people found the following review helpful
on 18 July 2011
Whether or not you will enjoy this book depends on who you are. If you enjoy reading books about popular science and trying to solve the occasional simple mathematical or logical puzzle, then you are ready for this one. If you want to understand the theory in any depth, or use it to solve problems, then you will need at least first-year undergraduate statistics to get started, much more to make progress, and a book with the formal mathematics - but begin with this one to get a perspective on the field before going into detail.
It is not obvious how you should use data to decide what to believe or how to act, and, as theories of statistics were developed, statisticians tried several different ways of thinking about data and the conclusions that could reasonably be drawn from them. Unfortunately the divisions of opinion (perhaps largely due to the personalities of the leading thinkers) resulted in acrimonious and inconclusive arguments.
Thomas Bayes was a clergyman who died in 1761, leaving behind some mathematical papers. One of these was revised and corrected by Richard Price, so we don't know quite what Bayes wrote or what he meant. This paper was the origin of two things: (1) the widely-used and uncontroversial `Bayes Theorem', and (2) the controversial idea that probability could be expressed in terms of a measure of belief. In Bayesian statistics the researcher puts a belief into numerical terms and refines this belief in the light of subsequently observed data. The 'subjective' aspect of the theory brought it into disrepute, where it lingered for nearly 200 years. Many people faced with practical problems found that Bayesian methods worked, but either they didn't know about Bayes or they preferred not to invite criticism by mentioning his name.
In the last 60 years or so there has been a big revival in interest in Bayes theory, and it has been used to solve many problems that weren't amenable to traditional methods. The big barrier was that some of the methods needed huge calculations, but with the availability of cheap, fast computers and new methods of calculation that barrier has almost disappeared.
Sharon Bertsch McGrayne's book gives a very clear and thorough history of "the theory that would not die." As a practising statistician for more than 40 years I knew much of the published work that she has written about, and can vouch for her accuracy (there are a few corrections on her website), but until I read this book I did not have a clear idea of all of the historical developments and controversies. My only criticism is that the bibliography is organised by chapters, rather than as one alphabetically ordered sequence.
12 of 12 people found the following review helpful
This is an excellent history of the development and application of Bayes' Theorem. Intended for the general reader with an interest in probability and the history of science, it is clearly written with a minimum of mathematics, and covers the ground efficiently.
It is particularly interesting for what it reveals of the way in which new ideas become part of intellectual discourse; in this case, by enduring a long period of suspicion and neglect before being rescued by the enthusiasm of practitioners rather than theorists. McGrayne offers many sidelights on the clandestine uses made of Bayes by the military and the intelligence community, which go some way to explaining why the power of these techniques was so long in receiving acknowledgement. The powerful personalities of the people involved receive extensive attention: no reader will come away from this book in ignorance of the degree to which accidents of institutional history and personal character condition the intellectual environment. Recommended.
9 of 9 people found the following review helpful
on 11 December 2012
I have to agree with the other reviewers who were disappointed by the lack of mathematics in this book. To borrow an old cliche, Bayes without the mathematics is Hamlet without the prince. It is certainly interesting to read about the academic squabbles, the logical breakthroughs, the military applications, and so on; but I want to know HOW (for instance) Turing used Bayes to decode Enigma, not merely THAT he used Bayes. I wonder just how many readers would pick up the book if they didn't already have some understanding of what Bayes was about; but if McGrayne were worried about the ability of her readers to follow a mathematical explanation then all she needed to do was relegate the detailed explanations to appendices. She deserves credit for the appendix on mammograms and breast cancer, which is admirably simple, but as far as I can see that is the only point at which even the algebraic statement of the familiar theorem appears.
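For readers curious what that mammogram appendix amounts to, the calculation is a single application of Bayes' theorem. The sketch below uses assumed, illustrative rates (not McGrayne's actual figures) to show why a positive test on a rare condition still leaves a small posterior probability:

```python
# Illustrative Bayes' theorem calculation in the spirit of the
# mammogram appendix mentioned above. All rates are assumed for
# illustration, not taken from the book.

def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Assumed: 1% prevalence, 80% sensitivity, 10% false-positive rate.
p = posterior(prior=0.01, sensitivity=0.8, false_positive_rate=0.1)
print(round(p, 3))  # 0.075 - still small, despite a positive test
```

The surprise - a positive result implying only a few percent chance of disease - is exactly the "reverse inference" the appendix walks through in words.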
I first came across the Bayesian approach to statistics as a graduate student in 1970 (thanks to Tribus' "Rational Descriptions, Decisions and Designs" - pity he didn't get a name check from McGrayne) and, like Saul on the road to Damascus, I underwent something like a religious conversion. Unlike St Paul, I never suffered any persecution in consequence, but it is good to see that what seemed to me at the time a fringe religion has now achieved something approaching statistical orthodoxy. For that reassurance, I thank Ms McGrayne.
20 of 21 people found the following review helpful
on 19 January 2012
This is half a book, and that half is very good - it would be worth 5 stars. You learn about the fascinating people who deployed Bayesian inference, particularly the Enigma codebreakers; about the statisticians who thought it was a complete waste of time; about the quirks of history which made people so slow to recognize its value.
All very good. But this is a book about some mathematics, and there is very little maths! Bayes' rule gets an equation, but that's not actually Bayesian inference. The author keeps saying that sometimes frequentists and Bayesians get the same results, but no example. And sometimes very different results, but no example. Bayes himself seems to have proved it, but no details on the proof. Some other people seem to have proved it, but ditto. Bayesian calculations are said to be very difficult pre-computer and pre-MCMC, but no example so you can see why it's such a problem.
So: a little disappointing - but maybe it does provide the questions you can type into Google after this book has not provided the answers.
3 of 3 people found the following review helpful
on 23 June 2013
Well first off, I'm delighted to see that co-founder Richard Price of Llangeinor is given proper credit. (Llangeinor, in South Wales, is near where I live, but Rev Price did much more than re-write Rev Bayes's notes.)
And I'm fascinated by the names of all the statisticians who I'd heard about, and a few I've even met (I taught stats at a Midlands university).
But having re-read it more closely, I now understand my quibbles: All Bayesians are treated as unsung heroes, the un-converted are knaves.
For instance: p116 "Cornfield's identification [in the Framingham study] in 1962 of the most critical risk factors [high cholesterol, high blood pressure] for cardiovascular disease produced ... a dramatic drop in death rates from c.v. disease", because it seems that Cornfield used Bayes and the others didn't.
Now this is a complete travesty! Read Gary Taubes 'The Diet Delusion' and you'll discover that poor analysis, and especially pre-conceptions meant that Framingham produced the 'wrong' results. Apart from smoking, none of the other factors matter. The low-fat obsession is making matters worse. A clear example of bad priors causing wrong posteriors?
So did Cornfield and his bayesianism lead to these false conclusions? Ms McGrayne, the author, could be forgiven for not knowing this, but it shows how the book works -- run with any 'success' for bayesianism (and ignore the failures?)
Her attitude to my favourite statistician, Tukey, is bizarre to say the least. She claims he did all sorts of secret work, both for the military and for commercial clients, that used Bayes, yet ignores his plain-sight comments that EDA -- exploratory data analysis -- was what matters to most problem solvers; that CDA -- confirmatory data analysis -- was just an ornamental final flourish, and that this was true for both bayesians and frequentists. [disclaimer: I wrote a book on EDA misleadingly titled 'Mastering statistics with your micro-computer', 1986]
p236 is, to say the least, disingenuous! Greenspan, chairman of the Fed, said in 2004 that he used bayesian ideas to assess risk in financial policy. Ooops! He was proven spectacularly wrong by 2008! But Greenspan, claims Ms McGrayne, didn't do Bayes properly. ho! ho! pull the other one!
This is a good book, well researched, and shines a light on otherwise neglected characters (statisticians, like me!). But she's caught the bayesian bug in spades!
4 of 4 people found the following review helpful
on 11 July 2013
As another reviewer wrote, this is about the people who worked on/with the theory, not the theory itself: lots of examples of where it was used, but no explication at all, which is a pity. I thought the book might still be interesting or illuminating despite that, but alas no. Unless you want to know a little bit about all of the various people who have been "Bayesian", I wouldn't bother with this.
1 of 1 people found the following review helpful
"When the facts change, I change my opinion. What do you do, sir?"
This apposite quote, provided by the author, essentially describes the Bayesian protocol. Each time new data appears, the odds are recalculated based on said new data. Why all the fuss? It's logical, Captain.
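The "recalculate the odds each time new data appears" protocol can be sketched in a few lines. The scenario below (deciding between a biased and a fair coin) is an invented illustration, not an example from the book:

```python
# A minimal sketch of the Bayesian protocol described above: each new
# observation multiplies the current odds by a likelihood ratio.
# The biased-coin scenario and its numbers are assumed for illustration.

def update_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds x likelihood ratio (Bayes' rule in odds form)."""
    return prior_odds * likelihood_ratio

# Hypothesis A: coin lands heads 70% of the time; hypothesis B: fair coin.
odds = 1.0  # start indifferent between A and B
for flip in "HHTHHH":
    lr = 0.7 / 0.5 if flip == "H" else 0.3 / 0.5
    odds = update_odds(odds, lr)  # the facts change, so the opinion changes

print(odds > 1.0)  # the evidence now favours the biased-coin hypothesis
```

Logical indeed, Captain: the opinion is nothing more than the running product of what the data have said so far.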
The author has produced a kind of social history of Bayes' Theorem. So it is a book that describes the history of the idea rather than much explanation of how Bayesian inference actually works. Her degree appears to be in journalism so she sticks to the terra firma of the historical approach - rather than wade into the mathematical nitty gritty of the theory. In the history of the idea, she describes the massive bun fights that developed. ("Bun fights - Hardly logical behaviour, Captain".)
But the Bayes rule has an impressive pedigree. It was not just the intellectual property of an obscure amateur mathematician, the Reverend Thomas Bayes. Laplace independently discovered the rule for himself in 1774 and made great use of it. (It should also be pointed out that another minister, Richard Price, also was involved in tidying up some of Thomas Bayes' results).
As the author observes, it's about a theory that was just waiting for a good software package for computing all those probabilities. With the advent of computers the Bayesian methodology really comes into its own. For example, the spam filters in our emails have a Bayesian inheritance.
I don't have a Bayesian, or indeed Laplacian, axe to grind. If the methodology works then I'll darn well use it - instead of getting precious about it. Good results are hard to argue with.
But getting precious also works both ways. McGrayne also describes the near religious zeal that "converts" had to the Bayesian idea. Hence the bunfights. Guys, it's only a useful theory. It's not Holy Writ. Get over it.
1 of 1 people found the following review helpful
on 12 August 2012
The first half of the book is strong, but the narrative of the present day is very cursory.
The book is an excellently researched history of Bayes' Theorem and does a particularly good job of tracing its development at the hands of Price and Laplace. You also get a great flavour of the cryptographic efforts of the Second World War in general, and the story of those personalities is told well.
The case study of estimating the probability of a nuclear accident is fascinating and was one of the strongest parts of the book. It provides a great insight into the application of Bayes' Theorem to practical problems, and the process of breaking down the larger problem into its constituent parts.
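The decomposition idea in that case study - build up the probability of a rare event from priors on its constituent parts - can be sketched very simply. Everything below, from the chain-of-safeguards framing to the numbers, is invented for illustration rather than taken from the book:

```python
# A hypothetical sketch of breaking a larger problem into constituent
# parts: an accident requires every safeguard in a chain to fail, so
# a prior for the whole is assembled from priors for the pieces.
# All failure probabilities here are assumed for illustration.

safeguard_failure_priors = [1e-2, 5e-3, 1e-3]  # assumed per-safeguard priors

p_accident = 1.0
for p in safeguard_failure_priors:
    p_accident *= p  # independent failures: probabilities multiply

print(f"{p_accident:.1e}")  # 5.0e-08: a tiny overall risk from plausible parts
```

The payoff of the decomposition is that each small number can be estimated (and later updated) from evidence about that subsystem alone, even when the combined event has never been observed.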
However, as the story gets towards modern developments such as Kalman filters and Markov chain methods, developments are introduced at a pretty quick pace without much explanation. There is a marked contrast between the hasty way the Gibbs sampler is mentioned and the thoroughness with which cryptographic uses were explained earlier.
Overall, a pretty good read as a history book but given its likely audience could perhaps have used more numerical illustrations.
1 of 1 people found the following review helpful
on 20 April 2013
The book describes the history of application of Bayes' rule and how it helped solve lots of problems that couldn't be solved otherwise. But it doesn't really explain how it helped solve all these problems, which is frustrating. I understand why the author doesn't want to go into mathematical detail, but just an indication of what was assumed and how they went about the calculation would have been nice. So in the end, it's like a list of problems solved using Bayes' rule. On the other hand, the description of the people involved and the relationships between them is very nice.
on 31 October 2012
As the Second World War started, the German military used a hugely sophisticated encryption machine - Enigma - which allowed their military units to co-operate intensely with each other. The number of possible combinations in the Enigma meant that the code was very possibly unbreakable.
Except for the human factor - one genius, who had carried out a regular task, radioed each night `Beacons lit as ordered' - so once you have this level of information, it provides a key for all the rest.
The code-breakers in Bletchley Park worked with probabilities that certain phrases appeared in the encrypted code, and over time were able to break the code. The use of probabilities, using small amounts of initial data, and working through iterations is the basis of Bayes theorem and has many applications - try Google Translate, it uses this sort of algorithm.
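The iterative weighing of phrase probabilities can be sketched in log-odds form, the style Turing used at Bletchley (he scored evidence in "bans", base-10 logarithms of likelihood ratios, so that independent clues add instead of multiply). The per-clue ratios below are invented for illustration:

```python
# A hedged sketch of log-odds scoring in the Bletchley spirit:
# each clue contributes log10 of a likelihood ratio ("bans"), and
# the scores for independent clues simply add up.
# The likelihood ratios below are assumed for illustration.

import math

def bans(likelihood_ratio):
    """Weight of evidence in bans: log10 of P(clue | H1) / P(clue | H0)."""
    return math.log10(likelihood_ratio)

# Assumed per-clue ratios; values above 1 favour the hypothesis.
likelihood_ratios = [2.0, 1.5, 0.8, 3.0]
score = sum(bans(lr) for lr in likelihood_ratios)
posterior_odds = 10 ** score  # same as multiplying the ratios directly

print(round(posterior_odds, 2))  # 7.2
```

Adding small scores from many weak clues until a hypothesis crosses a decision threshold is the same additive logic behind the modern applications the review mentions, such as statistical translation.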
There is much to annoy in this book - the whole description of the academic disputes between Bayesians and `frequentists' seems to me a bit over-hyped to enhance the conflict element. I believe frequentist statisticians promoted tools which were suitable for controlled experiments which were data-rich and capable of being run again and again. I believe Bayesian analysis suits when there isn't much data, but previous estimates (successes or failures) help to change the probabilities for future guesses. In addition, Bayesian recalculation of probabilities is very computer-intensive and hence grew in relevance as computer power increased from the 1950s.
I was disappointed to learn that the successful search for the Scorpion, a nuclear submarine lost at sea in 1968, was not a fully Bayesian-assisted search - apparently a Bayesian analysis was performed, but some physical debris sparked the final find. However, the search for the Air France Airbus which went down in the mid-Atlantic a couple of years ago was. Indeed, the referenced papers on that search are worth reading. The discussion about the 1960s safety analysis for nuclear weapons was, I thought, an example of near-miss methodology rather than Bayesian: asked to predict the probability of a nuclear accident, one (sterile) avenue would have been to say that as none had occurred there was insufficient data for prediction; however, on further examination there were a number of accidents which had stopped short of the nuclear threshold which could be examined.
I thought the medical reverse inference data was very good, and the discussions on modern Bayesian techniques for search and translation were very interesting - indeed, they could have been given more space. The worked examples in the appendix were also very enlightening.