Information Theory: A Tutorial Introduction Paperback – 1 Feb 2015
"This is a really great book – it describes a simple and beautiful idea in a way that is accessible for novices and experts alike. This 'simple idea' is that information is a formal quantity that underlies nearly everything we do. In this book, Stone leads us through Shannon's fundamental insights; starting with the basics of probability and ending with a range of applications including thermodynamics, telecommunications, computational neuroscience and evolution. There are some lovely anecdotes: I particularly liked the account of how Samuel Morse (inventor of the Morse code) pre-empted modern notions of efficient coding by counting how many copies of each letter were held in stock in a printer's workshop. The treatment of natural selection as 'a means by which information about the environment is incorporated into DNA' is both compelling and entertaining. The substance of this book is a clear exposition of information theory, written in an intuitive fashion (true to Stone's observation that 'rigour follows insight'). Indeed, I wish that this text had been available when I was learning about information theory. Stone has managed to distil all of the key ideas in information theory into a coherent story. Every idea and equation that underpins recent advances in technology and the life sciences can be found in this informative little book."
Professor Karl Friston, Fellow of the Royal Society.
Scientific Director of the Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London.
"Information lies at the heart of biology, societies depend on it, and our ability to process information ever more efficiently is transforming our lives. By introducing the theory that enabled our information revolution, this book describes what information is, how it can be communicated efficiently, and why it underpins our understanding of biology, brains, and physical reality. Its tutorial approach develops a deep intuitive understanding using the minimum number of elementary equations. Thus, this superb introduction not only enables scientists of all persuasions to appreciate the relevance of information theory, it also equips them to start using it. The same goes for students. I have used a handout to teach elementary information theory to biologists and neuroscientists for many years. I will throw away my handout and use this book."
Simon Laughlin, Professor of Neurobiology, Fellow of the Royal Society, Department of Zoology, University of Cambridge, England.
About the Author
James V. Stone is a Reader in the Psychology Department of the University of Sheffield. He is co-author (with John P. Frisby) of the widely used text "Seeing: The Computational Approach to Biological Vision" (second edition, MIT Press, 2010), and author of "Independent Component Analysis: A Tutorial Introduction" (MIT Press, 2004).
Top Customer Reviews
After defining the key concepts in terms of discrete information sources, applying them to noiseless and noisy channels, and going on to address continuous sources, the final two chapters discuss the relationship of information theory to thermodynamic entropy and its application to a range of topics, including satellite communication, compression systems such as MP3 and, lastly, biology. These might be considered bonus material, since the purpose of the book is to teach the basics. If I have one disappointment with the coverage, it is the single page devoted to quantum computation, which does not seem to explain the relevance of information theory to that subject at all - though in the book's defence, this is a big topic that would have required a substantial section on quantum mechanics for anything deeper. The sections on thermodynamics and the applications to biology, though, especially the subject of vision (the author's own research field), are much fuller and make it well worth working through all the preceding material to reach.
Information theory is central to the technology that we use every day - not least the technology that brings you this review (though ironically not the book being reviewed, as it doesn't appear to have an ebook version). As in his Bayes' Rule, James Stone sets out to walk a fine line between a title for the general reader and a textbook. And, like that companion title, the outcome is mixed, though here the textbook side largely wins.
The opening chapter, 'What is information?', walks the line very well. It gradually builds up the basics required to understand information theory, and though it would work better with a little more context (for example, more about Claude Shannon as a person) to anchor it, the general reader will - perhaps after re-reading a few pages - find it approachable, and it provides more depth than a popular science title usually would. I like the way Stone uses variants of a photograph, for instance, to demonstrate what happens under different mechanisms for compressing data. Unfortunately, though, this is pretty much where that general reader gets off, until we get to chapter 9.
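The compression demonstrations mentioned above rest on Shannon's central quantity, entropy: the average information per symbol, which sets a lower bound on how far data can be losslessly compressed. A minimal sketch (not taken from the book; the function name and example strings are purely illustrative):

```python
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Shannon entropy of the symbol distribution in `text`, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A highly repetitive message has low entropy, so it compresses well:
print(shannon_entropy("aaaaaaab"))  # ≈ 0.544 bits/symbol
# A message uniform over 4 symbols cannot beat 2 bits/symbol:
print(shannon_entropy("abcd"))      # 2.0 bits/symbol
```

The same calculation applied to an image's pixel-value histogram indicates (roughly) how much redundancy a lossless compressor could remove - the intuition behind comparing variants of a photograph under different compression mechanisms.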
The main bulk of the book, pages 21 to 184, crosses that line and plonks solidly into textbook territory. These chapters may cover the topic rather more lightly than a traditional textbook, but they simply don't inform without requiring the kind of investment of mind and mathematics that a textbook does - and, with a few brief exceptions, the writing style feels no different from the better textbooks I have from university.