- Paperback: 340 pages
- Publisher: Oxford University Press, USA; 1 edition (3 May 2007)
- Language: English
- ISBN-10: 0198524498
- ISBN-13: 978-0198524496
- Product Dimensions: 23.4 x 2.3 x 15.5 cm
- Average Customer Review: 1.0 out of 5 stars (1 customer review)
- Amazon Bestsellers Rank: 1,349,024 in Books
Bayesian Rationality: The Probabilistic Approach to Human Reasoning (Oxford Cognitive Science Series) Paperback – 3 May 2007
Oaksford and Chater make a very strong case in favour of a probabilistic view of human reasoning. This publication is therefore highly recommended to cognitive psychologists, and particularly to master's or doctoral students doing research in this field. (The Psychologist)
Oaksford and Chater have been at the center of a major reconceptualization of how humans reason. This book explains the deep reasons for this new approach and provides an excellent summary of their work. (Professor John R. Anderson, Carnegie Mellon University, USA)
Oaksford and Chater convincingly argue that rationality in the real world cannot be reduced to logical thinking and demonstrate how apparently logical problems can instead be reconstructed in a probabilistic way. This is an important step towards the ultimate goal of understanding the heuristic mechanisms underlying behavior. An excellent book on a Bayesian approach to cognition. (Professor Gerd Gigerenzer, Center for Adaptive Behavior and Cognition, Max Planck Institute for Human Development, Germany)
For years, Oaksford and Chater have taken a maverick approach to the analysis of human reasoning, applying probabilistic ideas to construct radically new interpretations of what people are doing when they reason and whether or not those actions are rational. The field has started to follow Oaksford and Chater's lead; probabilistic concepts are becoming central to all areas of cognitive science. In this book, Oaksford and Chater offer an exceptionally lucid and compelling introduction to their own work and in the process provide an accessible introduction to a number of technical issues in reasoning. This book is a must for those interested in the latest theoretical ideas in the study of human reasoning. (Professor Steve Sloman, Brown University, USA)
This fascinating book is the capstone of one of the most important and original programs of research on reasoning in the last twenty years. Oaksford and Chater argue persuasively that human thinking is best understood not in terms of how poorly it approximates the philosopher's norms of deductive logic, but rather in terms of how well it captures the more powerful and subtle principles of Bayesian probability. (Professor Josh Tenenbaum, Massachusetts Institute of Technology, USA)
About the Author
Mike Oaksford is Professor of Psychology and Head of School at Birkbeck College London. He was a research fellow at the Centre for Cognitive Science, University of Edinburgh, then a lecturer at the University of Wales, Bangor, and a senior lecturer at the University of Warwick, before moving to Cardiff University in 1996 as Professor of Experimental Psychology, a post he held until 2005. His research interests are in human reasoning and decision making. In particular, with his colleague Nick Chater, he has been developing a Bayesian probabilistic approach to deductive reasoning tasks. On this approach, reasoning "biases" result from applying the wrong normative model and failing to take account of people's normal environment. He also studies how emotions affect and interact with reasoning and decision-making processes.

Nick Chater is Professor of Cognitive and Decision Sciences at University College London. He has an MA in Psychology from Cambridge University and a PhD in Cognitive Science from Edinburgh. He has held academic appointments at Edinburgh, Oxford, and Warwick Universities. His research focuses on finding general principles that may apply across many cognitive domains, from reasoning and decision making, to language acquisition and processing, to perception and categorization. Since the late 1980s, in collaboration with Mike Oaksford, he has been interested in applying probabilistic and information-theoretic methods to understanding human reasoning.
Top Customer Reviews
surrounding a probabilistic account of human thought - what I found was
vague, self-aggrandising and frankly just wrong.
It is a rare feat for any science book to get almost every equation on a
single page wrong - but this book manages it spectacularly. Page 109 is
the offending example: I counted three glaring typos, listed below.
Halfway down, we have:
P(p -> q) = P(q | p), where P(p) < 0
This can't be right, since probabilities are always non-negative. Fair
enough, a simple typo. But a couple of lines below that comes the very
basic "ratio" formula - the authors wrote:
P(q | p) = P(q /\ p) / P(q)
It should naturally be: P(q | p) = P(q /\ p) / P(p)
Finally, three lines from the bottom of the page we have:
P(q) = P(p /\ q) + P(p /\ -q)
It should of course be P(p) = ....
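None of the corrected identities is hard to verify; as a quick sketch (the joint distribution below is made up for illustration, not taken from the book), one can check the ratio formula and the marginalization identity in Python:

```python
# Sanity-check the corrected formulas on a made-up joint distribution
# over two binary events p and q (illustrative numbers only).

joint = {
    (True, True): 0.3,    # P(p and q)
    (True, False): 0.2,   # P(p and not-q)
    (False, True): 0.4,   # P(not-p and q)
    (False, False): 0.1,  # P(not-p and not-q)
}

# Marginalization: P(p) = P(p and q) + P(p and not-q), and likewise for q.
P_p = joint[(True, True)] + joint[(True, False)]
P_q = joint[(True, True)] + joint[(False, True)]

# Ratio formula: conditioning on p means dividing by P(p), not P(q).
P_q_given_p = joint[(True, True)] / P_p

# With these numbers, P(q | p) = 0.3 / 0.5 = 0.6, whereas dividing by
# P(q) instead (the book's misprint) gives a different, wrong value.
assert abs(P_q_given_p - 0.6) < 1e-9
assert abs(joint[(True, True)] / P_q - P_q_given_p) > 0.1
```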
My point is that *none* of these formulae is complicated - indeed,
they are all very straightforward. This suggests a wider carelessness
that doesn't give the reader much confidence that the authors care
about what they are trying to say. So-called "trivial" errors like
these certainly undermined the value of the book for this reader.
Another discouraging thing - if more were needed - is the large number of
citations to the works of the main author (Oaksford) - 29, by actual
count. This is surely vanity publication - at best.
Most Helpful Customer Reviews on Amazon.com
David Deutsch says of Bayesianism that the doctrine assumes that minds work by assigning probabilities to their ideas and modifying those probabilities in the light of experience as a way of choosing how to act. He continues that this is especially perverse when it comes to an Artificial General Intelligence's values -- the moral and aesthetic ideas that inform its choices and intentions -- for it allows only a behaviouristic model of them, in which values that are 'rewarded' by 'experience' are 'reinforced' and come to dominate behaviour, while those that are 'punished' by 'experience' are extinguished. Deutsch argues that this behaviourist, input-output model may be appropriate for computer programming other than AGI, but is hopeless for AGI. I would also be cautious about the common viewpoint that information evolves from data, an inductive bias.
I suspect that were Chater and Oaksford not so thoroughly imbued with a logical empiricist view of scientific logic, they might have tempered their preliminary criticism of deductive logic before building the rationale for Bayesian cognition further. Bayes' theorem has value as a tool, but many questions remain open with respect to Bayesian epistemology. One clear failing by the authors is their superficial treatment of Karl Popper's critical rationalism. They seem to overlook that his model is one of conjecture and refutation (modus tollens rather than modus ponens). There is no royal road to knowledge other than a preparedness to question and test one's conjectures. I get the feeling that the authors have put the cart before the horse: the horse is called "conjecture" and the cart "inference". We explore the world by guessing, and we explore our guesses with logical inference. The guesses can arise from a whole range of neurological and cognitive processes.
Evolutionary epistemology, developed by Popper in parallel with Donald Campbell and Konrad Lorenz, has much to offer. Conjectural knowledge must be primary: not tacit knowledge, nor justified knowledge, nor certain knowledge. The embryo is not a blank slate: it guesses its way through the world, and its guesses are tested through experience. Chater and Oaksford do roll out the standard Kuhn and Lakatos misreadings of Popper with respect to falsifiability and falsification; these have reached the status of mythology.
Probabilistic tools are powerful, but with power comes the need for great responsibility. Probability gives us information about the chances that an event will occur, but it tells us nothing about the severity of the tests that a hypothesis has passed (or failed). Corroboration and degree of corroboration are not equivalent to confirmation and degree of confirmation, or probability, as per Carnap's logical positivism. The "probability of a hypothesis", in the sense of its degree of corroboration, does not satisfy the laws of the probability calculus. A highly testable hypothesis is logically improbable.
The point being that Bayesianism may model "degrees of certainty of beliefs", but this may well be misplaced in the truth-testing umwelts of nature red in tooth and claw.