


on 4 June 2017
Clearly written. Better than Bishop's PRML, in my opinion. I should have bought it earlier.
on 16 January 2017
I'm currently a physics PhD student and have a fairly strong mathematics background. I decided I was going to sit down with a textbook and make extensive notes in order to gain an intuition for machine learning, before moving on to some more 'hands on' practice with real data sets. I chose Murphy because of the probabilistic framework - it sounded like a nice, unifying way of presenting the concepts of machine learning, and the other reviews were very positive.

Three chapters in, I was really enjoying it. It seemed well written and I felt like I was gaining a lot of intuition. After that, though, it went downhill... The author introduces symbols and equations without explaining what they are. In addition to being difficult to understand, the book contains a significant number of typos.

As someone who wanted to systematically work through the entire book and gain a deep understanding of the topic, this textbook simply does not cut it. If you already have a large amount of experience with machine learning, then I believe this book would make a good reference. If you're planning on using this book to learn, then I would avoid it.
13 people found this helpful.
on 24 June 2017
I picked up this book from my university library with the hope of being able to get a solid and thorough introduction to the field of Machine Learning. To provide some context, I'm a Computer Science student at the upper undergraduate level.

First things first, the book runs just over 1000 pages in length, so the ambitious souls who plan on reading it cover to cover have to be in for the long haul. This was indeed my approach to reading the book, since the preface contains no section on "how to read this book". The book is structured as a linear sequence of 28 chapters, and in the absence of CLRS-style "Parts" under which the chapters could be grouped, it is practically impossible for a novice to estimate which chapters are likely to be more fundamental than others before reading them. Furthermore, with chapter titles as technical and foreign as "Generalized linear models and the exponential family", "Mixture models and the EM algorithm", and "Adaptive basis function models", a novice gets no high-level overview of what concepts a chapter may explore, and so again cannot form a rough idea of which chapters are likely to be more fundamental than others. Because of this uninformative structure, the only option is to study the book cover to cover. This is easier said than done.

As early as the second chapter, I had already uncovered several typos. Oftentimes the author introduces new notation and symbols without clarifying them in the subsequent sentence. Other times he chooses to reuse the same symbol within the same mathematical argument, giving it different meanings at different stages of the argument. The author says that this should be "clear from context", but someone quickly reviewing an argument as part of their revision will likely become confused about whether 'theta' refers to a scalar parameter or a parameter vector.

Put simply, the book fails to convey any high-level overview of the subject, and is littered with often poorly motivated mathematical arguments. Mathematical formulas are often introduced out of the blue, and the reader will likely spend more time than she would like incessantly scanning two pages of text in order to figure out how and why exactly a particular formula has emerged. The abrupt notational changes throughout the book will confuse most readers, although those more confident in their mathematical ability may view them simply as a mild annoyance. I would have given this book three stars, but I believe it will likely leave a novice confused about what Machine Learning is really about, and how all the different concepts presented relate to one another.

It's a shame; we are still waiting for the CLRS of Machine Learning. I believe the book may have started out as an honest effort to capture the subject of Machine Learning in its entirety, but somewhere along the way, publishers convinced the author to add in "hot topics" that are not really introductory, but could perhaps boost the book's popularity as a reference amongst well established Machine Learning researchers. More topics means a larger audience after all.
6 people found this helpful.
on 23 May 2013
The field now collected beneath the standard of 'Machine Learning' is so vast, and draws from so many different schools of thought, and hence mindsets, notations and assumptions, that it is extremely hard to take your bearings. Even knowing what exists, and how it relates to the rest of what exists, is extremely difficult. The old school statistics guys speak one language, the machine learners another, and the Bayesian chaps yet a third, and so although there are many unifying ideas, these are hard to identify. The primary strength of this book is that it allows the reader to see the connections by providing a unifying framework and notation all the way from basic distributions through standard statistical models to machine learning black-boxes and out to applied algorithms. Many sections end in current academic references, as well as current practical uses thereof. I have wanted such a text for a very long time, and am thrilled to have found it.

Beyond that, the book's approach to the maths hits the sweet spot between the thicket of lemma-lemma-theorem-proof found in 'academic' books and the hand-wavy elisions found in 'practitioners'' books. That is, important proofs are stated and fully worked, within the context of softer discussion of the concepts presented. Finally, having the source code for all the figures in the book allows you to dive in and really understand by doing. Having this code as a gold standard off which to base your own software is fantastic.

I have read the other main books in this area (PGM, ESL, PRML etc) and think this is the most broad, thorough and unified presentation available. It can be used as the foundation for understanding this field.
38 people found this helpful.
on 14 February 2017
It is very dense, but if you need to learn everything about ML then this is the best book out there, along with the one by Christopher Bishop.
Bishop is less deep but easier to understand. Murphy is very thick but the best; you need some serious maths skills to read it, though.
One person found this helpful.
on 21 October 2013
Kevin Murphy's book covers all aspects of statistical learning theory in depth and breadth, taking the reader from basic concepts all the way to cutting edge problems. It is a very rare thing, indeed, to find a textbook that is nigh on impossible to fault (Matlab vs R is the only minor niggle for me), in terms of content, style and delivery. The theoretical underpinnings are outlined with care and the motivating examples are well chosen. It serves as a great introduction to statistical inference, machine learning, information theory and graphical models. This book has quickly become my standard reference on the topic and the main recommendation for students.
10 people found this helpful.
on 16 November 2012
This is an excellent textbook on machine learning, covering a number of very important topics. The depth and breadth of coverage of probabilistic approaches to machine learning is impressive. Having Matlab code for all the figures is excellent. I highly recommend this book!
16 people found this helpful.
on 2 June 2015
The most thorough theoretical treatment of machine and statistical learning and statistical model development I have ever come across. Fine print, elegant paper, and lots of colour illustrations. Requires some previous maths/statistics background.
3 people found this helpful.
on 13 May 2015
This is a great book. Having been exposed to the other two popular machine learning textbooks, "The Elements of Statistical Learning" and "Pattern Recognition and Machine Learning", in university courses, I have to say that Murphy's "Machine Learning" is definitely the best one. It is the most comprehensive, it is better at explaining (because there is more detail), and it is also the most up to date.
2 people found this helpful.
on 7 December 2016
Great book. As of 7/12/2016 the edition shipped is the fourth reprint (the one you want). I gather that previous printings had lots of errors. Amazon has not updated the edition number in the product information.

Easier to understand than PRML (Bishop) and yet more technical than David Barber's book. Motivations are better explained and algorithms given wider context than in "The Elements of Statistical Learning" (Hastie).
2 people found this helpful.
