
Information Theory, Inference and Learning Algorithms [Hardcover]

David J. C. MacKay
4.8 out of 5 stars (6 customer reviews)
RRP: £42.00
Price: £35.70 & FREE Delivery in the UK.
You Save: £6.30 (15%)

Formats

  • Hardcover: £35.70
  • Paperback: --

Book Description

25 Sep 2003
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density-parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning, and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
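The error-correction theme above can be made concrete with the simplest code the book opens with: a three-fold repetition code sent over a binary symmetric channel and decoded by majority vote. The sketch below is illustrative only; the function names are mine, not the book's.

```python
import random

def encode(bits):
    """R3 repetition code: transmit each source bit three times."""
    return [b for b in bits for _ in range(3)]

def bsc(bits, f, rng):
    """Binary symmetric channel: flip each transmitted bit with probability f."""
    return [b ^ (rng.random() < f) for b in bits]

def decode(received):
    """Majority vote within each block of three: the optimal decoder for R3."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = bsc(encode(message), f=0.1, rng=random.Random(0))
print(decode(noisy))  # any single flip within a block of three is corrected
```

With flip probability f, majority voting fails on a block only when two or three of its bits are flipped, reducing the per-bit error rate from f to roughly 3f², at the cost of tripling the transmission length; the sparse-graph codes the description mentions achieve far better trade-offs.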

Frequently Bought Together

Information Theory, Inference and Learning Algorithms + Pattern Recognition and Machine Learning (Information Science and Statistics) + Bayesian Reasoning and Machine Learning
Price For All Three: £137.94

Product details

  • Hardcover: 640 pages
  • Publisher: Cambridge University Press; Sixth Printing 2007 edition (25 Sep 2003)
  • Language: English
  • ISBN-10: 0521642981
  • ISBN-13: 978-0521642989
  • Product Dimensions: 24.9 x 19.6 x 4.1 cm
  • Average Customer Review: 4.8 out of 5 stars (6 customer reviews)
  • Amazon Bestsellers Rank: 164,292 in Books (See Top 100 in Books)


Product Description

Review

'This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn.' Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London

'This is primarily an excellent textbook in the areas of information theory, Bayesian inference and learning algorithms. Undergraduate and postgraduate students will find it extremely useful for gaining insight into these topics; however, the book also serves as a valuable reference for researchers in these areas. Both sets of readers should find the book enjoyable and highly useful.' David Saad, Aston University

'An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics.' Dave Forney, Massachusetts Institute of Technology

'An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.' Bob McEliece, California Institute of Technology

'… a quite remarkable work … the treatment is specially valuable because the author has made it completely up-to-date … this magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference. I am very glad to have it on my shelves.' Robotica

'With its breadth, accessibility and handsome design, this book should prove to be quite popular. Highly recommended as a primer for students with no background in coding theory, the set of chapters on error correcting codes are an excellent brief introduction to the elements of modern sparse graph codes: LDPC, turbo, repeat-accumulate and fountain codes are described clearly and succinctly.' IEEE Transactions on Information Theory

Book Description

This exciting and entertaining textbook is ideal for courses in information, communication and coding. It is an unparalleled entry point to these subjects for professionals working in areas as diverse as computational biology, data mining, financial engineering and machine learning.

Inside This Book (Learn More)
First Sentence
In this chapter we discuss how to measure the information content of the outcome of a random experiment.
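That opening question is answered with the Shannon information content, h(x) = log2(1/P(x)) bits, whose average over all outcomes is the entropy of the ensemble. A minimal sketch (the function names are mine):

```python
from math import log2

def information_content(p):
    """Shannon information content h(x) = log2(1/p) of an outcome with probability p, in bits."""
    return log2(1.0 / p)

def entropy(ps):
    """Entropy H(X) = sum over x of p(x) * h(x): the average information content."""
    return sum(p * information_content(p) for p in ps if p > 0)

print(information_content(0.5))    # a fair coin flip carries 1.0 bit
print(entropy([0.5, 0.25, 0.25]))  # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
```

Improbable outcomes are the most informative: halving P(x) adds one bit to h(x).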

Customer Reviews

4.8 out of 5 stars
Most Helpful Customer Reviews
17 of 17 people found the following review helpful
5.0 out of 5 stars Excellent book on inference and learning ... 21 Nov 2003
By Jurgen Van Gael VINE VOICE
Format:Hardcover
I have been able to use this book as extra background material for several courses of my final undergraduate year.
First, I found a lot of useful information on coding theory. Although this book isn't meant to be a treatise on the various coding and decoding techniques, it gives the reader a lot of insight into the connection between coding and information theory. You won't find out how matrix decoding algorithms, cyclic codes etc. work, but you will find out how the limits of information theory restrict coding theory.
I cannot compare the information-theoretic approach to any other book, as this was my first introduction, but I can say the information-theoretic treatment was a good read, and I am confident I now have a solid information theory background.
Another course for which I have been able to use this book was a course on uncertainty reasoning. MacKay's book covers inference in great depth and introduces the reader to several different areas such as belief networks, decision theory, Bayesian networks and several other inference methods. As before, I cannot compare the Ising and Monte Carlo-like methods to other treatments, but the book did give me a good introduction. Concerning Bayesian probability/inference and decision theory, I can only say this is THE best introduction I have read!
I have read several introductions to neural networks (e.g. Kevin Gurney's). This book keeps up with the standard set by several other good introductions.
Inference/learning is a vast research area, and this book gives a good introduction to all its areas. Even if the part on neural networks is only about as good as other books on the topic, I would definitely recommend this book: for the same price you get introductions to so many more learning techniques.
16 of 16 people found the following review helpful
Format:Hardcover
Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me. It is packed full of stuff - its contents appear to grow the more I look - but the layering of the material means the abundance of topics does not confuse.

This is _not_ just a book for the experts. However, you will need to think and interact when reading it. That is, after all, how you learn, and the book helps and guides you in this with many puzzles and problems.
4 of 4 people found the following review helpful
5.0 out of 5 stars One of a kind 27 Jan 2011
Format:Hardcover|Verified Purchase
This is unique among the books I have encountered on information theory at this level, indeed one of the most reader-friendly accounts of any mathematically complex topic that I have ever read. The style makes the (difficult) subject matter very accessible. There are plenty of illustrations, which really do help with understanding, as well as examples with (mostly) answers provided, which are also valuable. The provision of answers to examples is frowned upon by purists, who say readers should just work them out for themselves, but we can't always succeed with every one, and I personally hate to be hung up on an example that I can't do.

To appreciate the benefits of MacKay's approach, compare this book with the classic 'Elements of Information Theory' by Cover and Thomas. That book was first published in 1991, and its approach is far more 'classical' than MacKay's. It is certainly less suitable for self-study than MacKay's book. That said, I find Cover and Thomas very useful for providing the formal mathematical proofs of the theorems. After reading MacKay and understanding a topic, I then read Cover and Thomas on the same area and find the formal exposition of it, which complements MacKay nicely. I would not be without either book.

PS: I have subsequently discovered an excellent series of lectures by the author available online, essentially covering the main topics of the book. The lectures clarify the rather dense presentation in the book, and I have found them invaluable. They can be found by Googling "MacKay information theory lectures".
