This is an excellent introduction to Information Theory for the layperson. I often started with the intention of reading just a single chapter and found myself drawn into continuing on to the next. It is unusual for a technical book to be a page-turner!
The first four chapters set the scene and build towards a consideration of entropy. The chapter on entropy is quite heavy going, but we are then rewarded with two rather easier chapters which allow us to catch our breath. The author then turns his attention to the problem of noise. We are also given a glimpse of how n-dimensional geometry was used by Claude Shannon to prove an important theorem on the effect of noise on signal transmission.
There are chapters on the application of Information Theory to physics, cybernetics, psychology and art. Since the book was published in 1961 and revised in 1980, some of this is rather dated now - anyone still using cassette tapes to store their computer data?
I came to this book simply wanting an introduction to the subject. The early parts of the book were excellent and far exceeded my expectations. Others may review this book from the perspective of being well acquainted with the subject and more familiar with recent developments; my perspective concerns its accessibility.
Anyone with a reasonable mathematical background should find this book very approachable. To get the most from it, the reader should probably be familiar with logarithms, indices, 3-D geometry and the sigma notation for summation. For those who aren't, the author provides an appendix giving a brief introduction to most of the relevant background mathematics.