
An Introduction to Information Theory, Symbols, Signals and Noise (Dover Books on Mathematics) Paperback – 1 Jan 1980

4.3 out of 5 stars 3 customer reviews

Formats and editions:
  • Kindle Edition
  • Paperback: £9.99 from Amazon (new from £4.32, used from £4.32)
£9.99. FREE Delivery in the UK on orders with at least £10 of books. Only 9 left in stock (more on the way). Dispatched from and sold by Amazon. Gift-wrap available.

Frequently Bought Together

  • An Introduction to Information Theory, Symbols, Signals and Noise (Dover Books on Mathematics)
  • Introduction to Graph Theory (Dover Books on Mathematics)
  • Introduction to Topology: Third Edition (Dover Books on Mathematics)

Total price: £27.97


Product details

  • Paperback: 336 pages
  • Publisher: Dover Publications Inc.; 2nd Revised edition (1 Jan. 1980)
  • Language: English
  • ISBN-10: 0486240614
  • ISBN-13: 978-0486240619
  • ASIN: 0486240614
  • Product Dimensions: 13.7 x 1.7 x 21.6 cm
  • Average Customer Review: 4.3 out of 5 stars (3 customer reviews)
  • Amazon Bestsellers Rank: 194,781 in Books (See Top 100 in Books)

Product Description

About the Author

JOHN R. PIERCE (1910-2002) was an engineer and executive at Bell Telephone Laboratories, where he was a colleague of Claude Shannon, coined the word "transistor", and championed early communications-satellite work such as the Echo project. He later taught at Caltech and at Stanford's Center for Computer Research in Music and Acoustics, and wrote a number of books explaining science and engineering to general readers.


Customer Reviews

4.3 out of 5 stars (3 customer reviews)
  • 5 star: 1
  • 4 star: 2
  • 3 star: 0
  • 2 star: 0
  • 1 star: 0

Top Customer Reviews

Format: Paperback
This is an excellent introduction to Information Theory for the layperson. I often started with the intention of reading just a single chapter and found myself drawn into continuing on to the next. It is unusual for a technical book to be a page-turner!

The first four chapters set the scene and build towards a consideration of entropy. That chapter is quite heavy going, but we are then rewarded with two rather easier chapters which allow us to catch our breath. The author then turns his attention to the problem of noise. We are also given a glimpse of how n-dimensional geometry was used by Claude Shannon to prove an important theorem on the effect of noise on signal transmission.

There are chapters on the application of Information Theory to physics, cybernetics, psychology and art. Since the book was published in 1961 and revised in 1980, some of this is rather dated now - anyone still using cassette tapes to store their computer data?

I came to this book simply wanting an introduction to the subject. The early parts of the book were excellent and far exceeded my expectations. Others may subsequently review this book from the perspective of being well-acquainted with the subject and being more familiar with recent developments. My perspective relates to its accessibility.

If someone has a reasonable mathematical background they should find this book very approachable. To get the most from this book, the reader should probably be familiar with logarithms, indices, 3-D geometry and the sigma notation for summation. For those who aren't, the author does provide an appendix which gives a brief introduction to most of the relevant background mathematics.
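As a small illustration of the quantity at the heart of the book, here is a minimal Python sketch of my own (not from the book): it computes Shannon entropy, H = -Σ p·log2(p), over the symbol frequencies of a string. The sample strings are made up for illustration.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Entropy in bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0  (a single repeated symbol carries no information)
print(shannon_entropy("abcdabcd"))  # 2.0  (four equally likely symbols need 2 bits each)
```

A skewed distribution lands between these extremes, which is what makes efficient coding of redundant sources such as English text possible.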
20 people found this helpful.
Format: Paperback
This is one of the best books on popular science I have read. The subject may seem rather dry but if, as I did, you want to understand how the entropy used as a measure of information in Information Theory relates to the entropy of the second law of thermodynamics - a source of much confusion - I can assure you this book does an excellent job of explaining it. The book does not shy away from presenting the key mathematical equations - it's hard to see how it could, since Information Theory is the 'Mathematical Theory of Communication', as invented by Claude Shannon - so if you are repelled by the sight of equations with logarithms and summation signs, this is probably not the book for you. The maths is all explained, though, and it's not too heavy.

After explaining the key concepts and their applications to communications systems, including communications in the presence of noise, channel capacity and error correction, later chapters cover the relationship with physics (physical origins of noise, thermodynamic entropy), cybernetics, psychology and art. Its relationship to natural language (using English as the example) is a recurring theme throughout the book. Thus the author tackles the question that will be in the minds of many readers: the relationship of the precise mathematical concept of information in Information Theory to the imprecise notions of the everyday use of the word.

The book is beautifully written and the explanations are very clear. The author waxes philosophical on the nature of science on occasion in order to set Information Theory in context. Some of this might seem a little over the top for the subject matter, but on the whole I rather enjoyed it.
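The reviewer's point about channel capacity in the presence of noise can be illustrated with a small sketch of my own (the channel and probabilities are invented for illustration): for a binary symmetric channel that flips each transmitted bit with probability p, Shannon's capacity is C = 1 - H(p), where H is the binary entropy function.

```python
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Bits per channel use for a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0  (noiseless: a full bit gets through on every use)
print(bsc_capacity(0.5))  # 0.0  (coin-flip noise: nothing gets through)
```

Between those extremes, error-correcting codes let a system approach, but never exceed, this capacity.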
One person found this helpful.
Format: Paperback Verified Purchase
I was expecting there to be far more difficult mathematics in the book, but it was extremely tame. Hopefully it will prove a light introduction and I'll start reading some papers afterwards.

Most Helpful Customer Reviews on Amazon.com (beta)

Amazon.com: 83 reviews
141 of 144 people found the following review helpful
Still the place to start, 14 Nov. 2000
By Ken Braithwaite - Published on Amazon.com
Format: Paperback
Although old, this is still the best book from which to learn the core ideas of this subject, especially what information "entropy" really means. I read Ash's book and followed the proofs, but I didn't really grasp the ideas until I read this.
The book is geared towards non-mathematicians, but it is not just a tour. Pierce tackles the main ideas, just not all the techniques and special cases.
Perfect for: anyone in science, linguistics, or engineering. Very good for: everyone else.
103 of 109 people found the following review helpful
An Absolute Gem, 11 Oct. 2002
By Clark M. Neily - Published on Amazon.com
Format: Paperback Verified Purchase
Claude Shannon died last year, and it's really disgraceful that his name is not a household word in the manner of Einstein and Newton. He really WAS the Isaac Newton of communications theory, and his master's thesis on Boolean logic applied to circuits is probably the most cited ever.
This is the ONLY book of which I am aware which attempts to present Shannon's results to the educated lay reader, and Pierce does a crackerjack job of it. Notwithstanding, this is not a book for the casual reader. The ideas underlying the theory are inherently subtle and mathematical, although there are numerous practical manifestations of them in nature, and in human "information transmission" behavior. On the other hand, this is a work which repays all effort invested in its mastery many times over.
34 of 35 people found the following review helpful
Good book for the basics of information theory, 8 Dec. 2005
By calvinnme - Published on Amazon.com
Format: Paperback
I give this book five stars because it succeeds brilliantly at what it sets out to do - to introduce the field of information theory in an accessible non-mathematical way to the completely uninitiated. Information theory is that branch of mathematics that deals with the information content of messages. The theory addresses two aspects of communication: "How can we define and measure information?" and "What is the maximum information that can be sent through a communications channel?". No other book I know of can explain these concepts of information, bits, entropy, and data encoding without getting bogged down in proofs and mathematics. The book even manages to equate the concept of language with the information it inherently transmits in a conversational and accessible style. The book rounds out its discussion with chapters on information theory from the perspectives of physics, psychology, and art. The only math necessary to understand what's going on in this book is high school algebra and the concept of logarithms. If you are an engineer or engineering student who knows anything about information theory, you probably will not find this book helpful. Instead you would do better to start off with a more advanced book like "An Introduction To Information Theory" by Reza, which introduces concepts from a more mathematical perspective.
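The link this reviewer draws between bits, entropy, and data encoding can be made concrete with a toy sketch of my own (the symbol probabilities and code are invented for illustration): when symbol probabilities are powers of 1/2, a prefix-free code matched to them achieves an average length exactly equal to the entropy, Shannon's lower bound for lossless coding.

```python
from math import log2

# Invented source: four symbols with probabilities that are powers of 1/2.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
# A prefix-free code matched to those probabilities (shorter codes for likelier symbols).
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

entropy = sum(-p * log2(p) for p in probs.values())
avg_len = sum(p * len(code[s]) for s, p in probs.items())

print(entropy)  # 1.75 bits per symbol
print(avg_len)  # 1.75 bits per symbol: the code meets the entropy bound exactly
```

For probabilities that are not powers of 1/2, the average code length exceeds the entropy slightly, which is where techniques like block coding come in.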
50 of 54 people found the following review helpful
Best Introduction, 13 April 2000
By Chris McKinstry - Published on Amazon.com
Format: Paperback
Though first printed in 1961 and revised in 1980, this is the best introduction to information theory there is. Very easy to read and light on math, just as an introduction should be. I expect it will be in print for a very, very long time.
19 of 19 people found the following review helpful
An excellent introduction to a complex subject, 6 Aug. 1998
By A Customer - Published on Amazon.com
Format: Paperback
Pierce's book is an excellent introduction to the subject of information theory. It is not a text on the subject, although it does have some limited mathematical content, no more than the casual reader can handle. The beauty of this book is that, unlike most engineers and scientists turned authors, Pierce not only relates much of the history of the subject (from first-hand knowledge) but does so with incredible conciseness and clarity. The non-technical approach allows that, and Pierce takes full advantage of his chosen format. It would be better to say a non-textbook approach, really, since this isn't a text. Yet, like Feynman, Pierce is able to explain a great amount of the fundamental details of information theory without the rigor of difficult equations and derivations. Any student truly interested in the subject should keep this volume as a companion to their textbook.

