
Bayesian Reasoning and Machine Learning Hardcover – 2 Feb 2012

3 customer reviews

Formats and editions:
  • Kindle Edition
  • Hardcover: £48.77 (new from £43.38, used from £38.20)


Frequently Bought Together

Bayesian Reasoning and Machine Learning + Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning Series) + Pattern Recognition and Machine Learning (Information Science and Statistics)
Price For All Three: £160.21



Product details

  • Hardcover: 720 pages
  • Publisher: Cambridge University Press (2 Feb. 2012)
  • Language: English
  • ISBN-10: 0521518148
  • ISBN-13: 978-0521518147
  • Product Dimensions: 18.9 x 3.7 x 24.6 cm
  • Average Customer Review: 4.7 out of 5 stars (3 customer reviews)
  • Amazon Bestsellers Rank: 156,638 in Books (See Top 100 in Books)


Product Description


'This book is an exciting addition to the literature on machine learning and graphical models. What makes it unique and interesting is that it provides a unified treatment of machine learning and related fields through graphical models, a framework of growing importance and popularity. Another feature of this book lies in its smooth transition from traditional artificial intelligence to modern machine learning. The book is well-written and truly pleasant to read. I believe that it will appeal to students and researchers with or without a solid mathematical background.' Zheng-Hua Tan, Aalborg University, Denmark

'With approachable text, examples, exercises, guidelines for teachers, a MATLAB toolbox and an accompanying website, Bayesian Reasoning and Machine Learning by David Barber provides everything needed for your machine learning course. Only students not included.' Jaakko Hollmén, Aalto University

'The chapters on graphical models form one of the clearest and most concise presentations I have seen … The exposition throughout uses numerous diagrams and examples, and the book comes with an extensive software toolbox - these will be immensely helpful for students and educators. It's also a great resource for self-study.' Arindam Banerjee, University of Minnesota

'I repeatedly get unsolicited comments from my students that the contents of this book have been very valuable in developing their understanding of machine learning … My students praise this book because it is both coherent and practical, and because it makes fewer assumptions regarding the reader's statistical knowledge and confidence than many books in the field.' Amos Storkey, University of Edinburgh

Book Description

This practical introduction for final-year undergraduate and graduate students is ideally suited to computer scientists without a background in calculus and linear algebra. Numerous examples and exercises are provided. Additional resources available online and in the comprehensive software package include computer code, demos and teaching materials for instructors.


Customer Reviews

4.7 out of 5 stars

Most Helpful Customer Reviews

9 of 11 people found the following review helpful By Amazon Customer on 16 July 2012
Format: Hardcover Verified Purchase
A very clear and concise book. I highly recommend it. The concepts are explained well with sufficient examples. One of the best books out there for machine learning.
8 of 10 people found the following review helpful By jedihoho on 21 Nov. 2012
Format: Hardcover Verified Purchase
I am doing a course related to machine learning and this book helps me with the study. I also have Bishop, but that book has too much math and I cannot follow it. This one is explained in a simpler way, with fewer proofs, and gives you an intermediate level of detail on topics. Not for beginners or experts.
Format: Hardcover Verified Purchase
A somewhat advanced book. Useful if you want to know more than what your statistical program libraries deliver more or less automatically. Possibly more for the evolving specialist than for anyone for whom the topic is more of a hobby.

Most Helpful Customer Reviews (14 reviews)
99 of 102 people found the following review helpful
Brilliant and accessible 17 May 2012
By T. Triche
Format: Hardcover
Don't take my word for it, though; read the book online. For some reason Amazon decided to delete the URL, so just do a search for David Barber and go to his home page at UCL (University College London), where links to a PDF of the book and to recent publications of his can be found.

Barber has done an excellent job of making extremely complex and contemporary ideas accessible to anyone with a reasonable mathematical background, and he puts them in context ("these techniques can be applied to finance, biology, and speech recognition", paraphrased). Read through it and see for yourself. I find this book more accessible than Daphne Koller & Nir Friedman's (also excellent) text, Probabilistic Graphical Models, despite my immense respect for the authors of the latter.
53 of 54 people found the following review helpful
Extremely Suitable for Self Study 6 Dec. 2012
By Let's Compare Options Preptorial
Format: Hardcover Verified Purchase
Unlike many (most?) books and courses on machine learning, Barber's outstanding text is very suitable for self study. There are many reasons for this, and high among them is the fact that he carefully explains, with commonsense examples and applications, many of the tougher logical, mathematical and processing foundations of pattern recognition.

For relative beginners: Bayesian techniques began in the 1700s as a way to model how a degree of belief should be modified to account for new evidence. The techniques and formulas were largely discounted and ignored until the modern era of computing, pattern recognition and AI, now machine learning. The formula answers how the probabilities of two events are related when represented inversely and, more broadly, gives a precise mathematical model for the inference process itself under uncertainty, with deductive reasoning and logic as a subset (under certainty, when values resolve to 0/1, true/false, yes/no, etc.). In "odds" terms (useful in many fields, including optimal expected utility functions in decision theory), posterior odds = prior odds * the Bayes factor.
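The odds form the reviewer quotes can be sketched in a few lines of Python. The diagnostic-test numbers below are hypothetical, chosen only to make the update concrete; they are not from the book:

```python
# Odds form of Bayes' rule: posterior odds = prior odds * Bayes factor.

def posterior_odds(prior_odds, bayes_factor):
    """Update the odds in favour of a hypothesis given a likelihood ratio."""
    return prior_odds * bayes_factor

# Hypothetical disease prevalence of 1%, so prior odds = 0.01 / 0.99.
prior = 0.01 / 0.99
# Bayes factor = P(positive | disease) / P(positive | no disease),
# e.g. sensitivity 0.95 and false-positive rate 0.05.
bf = 0.95 / 0.05
post = posterior_odds(prior, bf)
# Convert odds back to a probability: p = odds / (1 + odds).
prob = post / (1 + post)
print(round(prob, 3))  # probability of disease given a positive test
```

Note how a 95%-sensitive test still yields only a modest posterior probability when the prior odds are low; that interplay is exactly what the odds form makes visible.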

For context, I'm the lead scientist at IABOK dot org-- we design algorithms for huge data mining problems and applications. This text is our "go to" reference for programmers not up to speed in many of the new pattern recognition algorithms, including those writing new versions. All the most recent relevant models, from a probability standpoint, are represented here, with a clarity that is stunning. My only criticism (a mild one) is that, when applying Barber's examples to Bodies of Knowledge and data mining, he skips Prolog, backward chaining, predicate calculus and other techniques that are the foundation of automated inference systems (systems that extend knowledge bases automatically by checking whether new propositions can be inferred from the KB as consistent, relevant, etc.).

In the next 20 years, algorithms will rule this planet. If you either want to see the future of your grandkids, or participate in it if you're young, this is a MUST HAVE exploration of where what we used to call AI is now headed. There IS plenty of calculus in this volume, so don't mistakenly think it is "simple" -- but if you put the time in, you can "get it" even if you're a bright undergrad level thinker. The author's goal of training new algorithm programmers is laudable and right on point for where pattern recognition is headed.

With this amount of math, how can we star it high for self study? Easy: unlike most "recipe" books that just give bushels of code or techniques, the author here gives the what, where, when and why of both code and math, not just the how, as his goal is independent, creative contributors who can write their OWN algorithms. There are a few minor UK vs US differences in terminology also (event space instead of sample space, for example), but they expand the reader's horizon rather than distract or annoy as some others do. There are others like Bishop, and many more that have more recipes and more compact and difficult math, but you have to either be really good (just show me the recipe) or really bad (I don't know what I'm doing, but can follow this recipe) to benefit from them. This is a happy middle ground that does not disappoint.
29 of 31 people found the following review helpful
Great effort put in explanations and examples 13 Jun. 2012
By Vladislavs Dovgalecs
Format: Hardcover
First I would like to thank the author (and the publishing house) for giving away a free PDF of the complete book. That's really kind of them. Knowing its contents motivated me to buy a hard copy for my library.

Now to the content. The author has done a great job in introducing probabilistic concepts and pushing forward to more advanced and practically interesting techniques. There are many examples in the text that often help to grasp the workings of a method or an approach.

For someone like me, with very little background in probabilistic methods, this is a real textbook. I am still reading it chapter by chapter and can recommend it as reading for advanced undergraduate, graduate and PhD students. The material in each chapter is well introduced and motivated, equations arrive just in time with variables and transformations explained, and there are numerous exercises at the end of each chapter. All this makes the book almost self-contained, enough for at least a semester of teaching.

The book is also accompanied by MATLAB code, to which the author refers at the end of each chapter. The code is organized as a toolbox of functions with demos for each chapter. This allows you to apply the acquired knowledge to your own problems.

As of the moment of writing, I've found just a few typos, and they were not that disturbing. I don't see any other, more serious reason not to give a solid 5 stars to this book.
22 of 24 people found the following review helpful
One of the best texts to learn machine learning 27 Sept. 2012
By quant@inside
Format: Hardcover Verified Purchase
I have read a similar book on machine learning, namely Pattern Recognition and Machine Learning (by Bishop). Before I read Barber's book, I considered Bishop's book to be the best in machine learning (with a Bayesian focus). However, after reading this book, I can definitely say that it is better than Bishop's book in many senses. There are lots of examples in each chapter, with MATLAB code for many of them. Also, it covers more material than Bishop. The chapters on dynamic models and graphical models are particularly awesome. This book doesn't assume much background in probability (one master's-level course on probability and statistics is probably more than enough). However, some chapters are advanced, and are marked as such in the book.
16 of 18 people found the following review helpful
A Truly Modern Introduction to Bayesian Reasoning and Machine Learning 18 Nov. 2012
By Adnan Masood
Format: Hardcover
If you are scouring for an exploratory text in probabilistic reasoning, basic graph concepts, belief networks, graphical models, statistics for machine learning, learning as inference, naïve Bayes, Markov models and machine learning concepts, look no further. Barber has done a praiseworthy job in describing key concepts in probabilistic modeling and probabilistic aspects of machine learning. Don't let the size of this 700-page, 28-chapter book intimidate you; it is surprisingly easy to follow and well formatted for the modern-day reader.

With excellent follow-ups in summaries, code and exercises, Dr. David Barber, a Reader at University College London, provides a thorough and contemporary primer in machine learning with Bayesian reasoning. Starting with probabilistic reasoning, the author provides a refresher that the standard rules of probability are a consistent, logical way to reason with uncertainty. He proceeds to discuss basic graph concepts and belief networks, explaining how we can reason with certain or uncertain evidence using repeated application of Bayes' rule. Since a belief network, a factorization of a distribution into conditional probabilities of variables dependent on their parental variables, is a specific case of a graphical model, the book leads us into the discipline of representing probability models graphically. Following efficient inference in trees and the junction tree, the text elucidates the key stages of moralization, triangulation, potential assignment, and message passing.
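The belief-network idea described above, factorizing a joint distribution into conditionals on parents and then reasoning with evidence via Bayes' rule, can be sketched with a toy two-variable network. The Rain → WetGrass structure and all probabilities below are invented for illustration and are not taken from the book or its toolbox:

```python
# A minimal belief network: Rain -> WetGrass,
# with the factorization p(R, W) = p(R) * p(W | R).

p_rain = {True: 0.2, False: 0.8}                  # p(R)
p_wet_given = {True: {True: 0.9, False: 0.1},     # p(W | R=True)
               False: {True: 0.2, False: 0.8}}    # p(W | R=False)

def joint(r, w):
    """Joint probability read off the factorization."""
    return p_rain[r] * p_wet_given[r][w]

# Reason with evidence via Bayes' rule: p(R=True | W=True).
evidence = joint(True, True) + joint(False, True)  # marginal p(W=True)
posterior = joint(True, True) / evidence
print(round(posterior, 3))
```

Observing wet grass raises the probability of rain well above its 0.2 prior; larger networks do the same computation, just with more conditional tables and smarter message passing.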

I particularly enjoyed the follow-up chapter, statistics for machine learning, which uniquely discusses the classical univariate distributions, including the exponential, Gamma, Beta, Gaussian and Poisson. It summarizes the Kullback-Leibler divergence, a measure of the difference between distributions, and states that Bayes' rule enables us to achieve parameter learning by translating a prior parameter belief into a posterior parameter belief based on observed data. Learning as inference, naïve Bayes, learning with hidden variables and Bayesian model selection are followed by machine learning concepts. I found the sequence of chapters to be a bit off (shouldn't graphical models be discussed before providing a specific case?), but since the book is more rooted in practice than in theorem-proving exercises, the order ultimately makes sense.
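The Kullback-Leibler divergence mentioned above is easy to compute for discrete distributions; here is a minimal sketch, with two made-up distributions rather than any example from the book:

```python
# KL divergence KL(p || q) for discrete distributions given as
# lists of probabilities.  It is >= 0, and 0 only when p == q.
import math

def kl_divergence(p, q):
    """Sum of p_i * log(p_i / q_i), skipping zero-probability terms."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]        # a skewed distribution
q = [1/3, 1/3, 1/3]        # the uniform distribution
print(round(kl_divergence(p, q), 4))
```

Note that KL divergence is not symmetric: KL(p || q) generally differs from KL(q || p), which is why the direction of the comparison matters when it is used to fit approximations.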

From nearest-neighbor methods, which are general classification methods, the book continues through conditional mixtures of Gaussians into unsupervised linear dimension reduction, supervised linear dimension reduction, kernel extensions, Gaussian processes, mixture models, latent linear models, latent ability models, and discrete- and continuous-state Markov models, eventuating in distributed computation of models, sampling, and the holy grail of deterministic approximate inference.

One of the many great things about this book is its practical, code-oriented approach; tips with applied insight, like "Consistency methods such as loopy belief propagation can work extremely well when the structure of the distribution is close to a tree. These methods have been spectacularly successful in information theory and error correction.", make this text distinguished and indispensable.