An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)


John R. Pierce
4.0 out of 5 stars (1 customer review)

Print List Price: £14.49
Kindle Price: £5.49 includes VAT* & free wireless delivery via Amazon Whispernet
You Save: £9.00 (62%)
* Unlike print books, digital books are subject to VAT.




Product Description

Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permeated the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future.
To give a solid introduction to this burgeoning field, J. R. Pierce has revised his well-received 1961 study of information theory for an up-to-date second edition. Beginning with the origins of the field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. A glossary of terms and an appendix on mathematical notation are provided to help the less mathematically sophisticated.
J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. He is currently affiliated with the engineering department of the California Institute of Technology. While his background is impeccable, Dr. Pierce also possesses an engaging writing style that makes his book all the more welcome. An Introduction to Information Theory continues to be the most impressive non-technical account available and a fascinating introduction to the subject for laymen.
"An uncommonly good study. . . . Pierce's volume presents the most satisfying discussion to be found."― Scientific American.

Product details

  • Format: Kindle Edition
  • File Size: 6495 KB
  • Print Length: 336 pages
  • Publisher: Dover Publications; Subsequent edition (29 Mar 2012)
  • Sold by: Amazon Media EU S.à r.l.
  • Language: English
  • ASIN: B008TVLR0O
  • Text-to-Speech: Enabled
  • Average Customer Review: 4.0 out of 5 stars (1 customer review)
  • Amazon Bestsellers Rank: #331,092 Paid in Kindle Store


Customer Reviews

4.0 out of 5 stars
Most Helpful Customer Reviews
18 of 19 people found the following review helpful
4.0 out of 5 stars A very good introduction for the layperson 24 May 2008
This is an excellent introduction to Information Theory for the layperson. I often started with the intention of reading just a single chapter and found myself drawn into continuing on to the next chapter. It is unusual to find a technical book that is a page-turner!

The first four chapters set the scene and build towards a consideration of entropy. The entropy chapter is quite heavy going, but we are then rewarded with two rather easier chapters which allow us to catch our breath. The author then turns his attention to the problem of noise. We are also given a glimpse of how n-dimensional geometry was used by Claude Shannon to prove an important theorem on the effect of noise on signal transmission.

There are chapters on the application of Information Theory to physics, cybernetics, psychology and art. Since the book was published in 1961 and revised in 1980, some of this is rather dated now - anyone still using cassette tapes to store their computer data?

I came to this book simply wanting an introduction to the subject. The early parts of the book were excellent and far exceeded my expectations. Others may subsequently review this book from the perspective of being well-acquainted with the subject and being more familiar with recent developments. My perspective relates to its accessibility.

If someone has a reasonable mathematical background they should find this book very approachable. To get the most from this book, the reader should probably be familiar with logarithms, indices, 3-D geometry and the sigma notation for summation. For those who aren't, the author does provide an appendix which gives a brief introduction to most of the relevant background mathematics.
Most Helpful Customer Reviews (beta): 4.5 out of 5 stars, 62 reviews
112 of 113 people found the following review helpful
5.0 out of 5 stars Still the place to start 14 Nov 2000
By Ken Braithwaite
Although old, this is still the best book for learning the core ideas of this subject, especially what information "entropy" really means. I read Ash's book and followed the proofs, but I didn't really grasp the ideas until I read this.
The book is geared towards non-mathematicians, but it is not just a tour. Pierce tackles the main ideas, just not all the techniques and special cases.
Perfect for: anyone in science, linguistics, or engineering. Very good for: everyone else.
88 of 93 people found the following review helpful
5.0 out of 5 stars An Absolute Gem 11 Oct 2002
By Clark M. Neily
Format: Paperback | Verified Purchase
Claude Shannon died last year, and it's really disgraceful that his name is not a household word in the manner of Einstein and Newton. He really WAS the Isaac Newton of communications theory, and his master's thesis on Boolean logic applied to circuits is probably the most cited ever.
This is the ONLY book of which I am aware which attempts to present Shannon's results to the educated lay reader, and Pierce does a crackerjack job of it. Notwithstanding, this is not a book for the casual reader. The ideas underlying the theory are inherently subtle and mathematical, although there are numerous practical manifestations of them in nature, and in human "information transmission" behavior. On the other hand, this is a work which repays all effort invested in its mastery many times over.
47 of 49 people found the following review helpful
5.0 out of 5 stars Best Introduction 13 April 2000
By Chris McKinstry
Though first printed in 1961 and revised in 1980, this is the best introduction to information theory there is. Very easy to read and light on math, just as an introduction should be. I expect it will be in print for a very, very long time.
24 of 24 people found the following review helpful
5.0 out of 5 stars Good book for the basics of information theory 8 Dec 2005
By calvinnme
I give this book five stars because it succeeds brilliantly at what it sets out to do - to introduce the field of information theory in an accessible non-mathematical way to the completely uninitiated. Information theory is that branch of mathematics that deals with the information content of messages. The theory addresses two aspects of communication: "How can we define and measure information?" and "What is the maximum information that can be sent through a communications channel?". No other book I know of can explain these concepts of information, bits, entropy, and data encoding without getting bogged down in proofs and mathematics. The book even manages to equate the concept of language with the information it inherently transmits in a conversational and accessible style. The book rounds out its discussion with chapters on information theory from the perspectives of physics, psychology, and art. The only math necessary to understand what's going on in this book is high school algebra and the concept of logarithms. If you are an engineer or engineering student who knows anything about information theory, you probably will not find this book helpful. Instead you would do better to start off with a more advanced book like "An Introduction To Information Theory" by Reza, which introduces concepts from a more mathematical perspective.
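The two questions this review quotes come down to Shannon's entropy, H = -Σ p·log2(p), measured in bits per symbol. As an illustrative aside (the function name and example values below are mine, not taken from the book), the idea fits in a few lines of Python:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = sum of -p * log2(p), in bits per symbol."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy_bits([0.5, 0.5]))                 # 1.0
# Four equally likely symbols need 2 bits each.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0
# A certain outcome carries no information.
print(entropy_bits([1.0]))                      # 0.0
```

Unequal probabilities give less than the maximum, which is exactly why efficient codes can use shorter codewords for likelier symbols.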
18 of 18 people found the following review helpful
4.0 out of 5 stars An excellent introduction to a complex subject 6 Aug 1998
By A Customer
Pierce's book is an excellent introduction to the subject of information theory. It is not a textbook, although it does have some limited mathematical content that is more than the casual reader can handle. The beauty of this book is that, unlike most engineers and scientists turned authors, Pierce not only relates much of the history of the subject (from first-hand knowledge) but does so with incredible conciseness and clarity. The non-technical approach allows that, and Pierce takes full advantage of his chosen format. Yet, like Feynman, Pierce is able to explain a great amount of the fundamental detail of information theory without the rigor of difficult equations and derivations. Any student truly interested in the subject should keep this volume as a companion to their textbook.