Buy new: £11.99
FREE delivery: Wednesday, Jan 17, in the UK
Dispatches from: Amazon. Sold by: Amazon.
Buy used: £3.54
Information: A History, a Theory, a Flood Paperback – 1 Mar. 2012
Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing.
We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code.
In ‘The Information’ James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the ‘bit’, it is a fascinating account of the modern age’s defining idea and a brilliant exploration of how information has revolutionised our lives.
Review
‘An audacious book which offers remarkable insight. Gleick takes us, with verve and fizz, on a journey from African drums to computers, liberally sprinkling delightful factoids along the way. This is a book we need to give us a fresh perspective on how we communicate and how that shapes our world.’ The Royal Society Winton Prize Judges
‘Mind-stretching but enlightening … the power and breadth of the ideas involved cannot but make you marvel.’ Daily Mail
‘Magisterial … It is not merely a history of information, but also a theory and a prospectus. To describe it as ambitious is to engage in almost comical understatement.’ Matthew Syed, The Times
‘A deeply impressive and rather beautiful book.’ Philip Ball, Observer
‘The fascinating story of how humans have transmitted knowledge … broad and occasionally brilliant.’ Sunday Times
‘This is a work of rare penetration, a true history of ideas whose witty and determined treatment of its material brings clarity to a complex subject.’ Tim Martin, Daily Telegraph
About the Author
James Gleick was born in New York in 1954. He worked for ten years as an editor and reporter for The New York Times. He is the bestselling author of Chaos, Genius, Faster, What Just Happened and a biography of Isaac Newton.
- Print length: 544 pages
- Language: English
- Publisher: Fourth Estate
- Publication date: 1 Mar. 2012
- Dimensions: 12.8 x 3.45 x 19.61 cm
- ISBN-10: 0007225741
- ISBN-13: 978-0007225743
Product details
- ASIN : 0007225741
- Publisher : Fourth Estate (1 Mar. 2012)
- Language : English
- Paperback : 544 pages
- ISBN-10 : 0007225741
- ISBN-13 : 978-0007225743
- Dimensions : 12.8 x 3.45 x 19.61 cm
- Best Sellers Rank: 238,061 in Books
  - 170 in Information Management
  - 212 in Genetics in Popular Science
  - 221 in Genetics (Books)
About the author

James Gleick was born in New York and began his career in journalism, working as an editor and reporter for the New York Times. He covered science and technology there, chronicling the rise of the Internet as the Fast Forward columnist, and in 1993 founded an Internet startup company called The Pipeline. His books have been translated into more than twenty-five languages.
His home page is at http://around.com, and on Twitter he is @JamesGleick.
Customer reviews
Top reviews from United Kingdom
Because of its centrality in the book, I begin this review by elaborating somewhat on Shannon's theory of information. The year 1948 was an annus mirabilis for Bell Telephone Laboratories. During that year Claude Shannon, then aged 32 and working at Bell Labs, published his theory of information under the title 'A Mathematical Theory of Communication' and coined the word 'bit'; the transistor was invented in the same year and the same lab, though the word 'transistor' was the product of committee deliberation.
Shannon defines the information content of an event as proportional to the logarithm of the inverse of its probability of occurrence. Shannon's theory of information is related to entropy in that an increase in the entropy of a system increases its disorder while concurrently increasing its information content.
In Shannon's theory of information, the fundamental concept of distinguishability between two different states is a bit of information. A bit, by definition, exists in one of two states at any given time, a zero or a one; where you have more than two outcomes, you simply use more bits to distinguish them all. Shannon's information theory relates to events based on Boolean logic, i.e. for an event with several outcomes, each outcome happens or does not happen.
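To make the two preceding paragraphs concrete, here is a minimal Python sketch of Shannon's self-information formula and of the bit-counting rule; the example probabilities are invented for illustration.

```python
import math

def information_content(p):
    """Shannon self-information of an event with probability p, in bits."""
    return math.log2(1 / p)  # equivalently -log2(p)

def bits_to_distinguish(n):
    """Minimum whole number of bits needed to label n distinct outcomes."""
    return math.ceil(math.log2(n))

# A fair coin flip carries exactly 1 bit of information.
print(information_content(0.5))      # 1.0
# A rarer event (p = 1/8) carries more information when it occurs.
print(information_content(1 / 8))    # 3.0
# Distinguishing 6 die faces needs 3 bits (2 bits cover only 4 outcomes).
print(bits_to_distinguish(6))        # 3
```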
Shannon's information theory has a very wide applicability in that it has the same logical foundation in different physical, biological, social and economic phenomena.
All the preceding relates to classical physics and the classical worldview. However, a more accurate description of our physical world is given by quantum theory, which has superseded classical physics. With quantum theory the notion of a deterministic world fails: events always occur with a probability, regardless of how much information we possess.
Classical bits, as we have already explained, exist in one of two states at any given time, a zero or a one. With quantum mechanics, however, we are permitted to have a zero and a one at the same time in one physical system. In fact we are permitted an infinite range of states between zero and one, which we call a qubit.
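As a rough illustration of the superposition described above, here is a minimal Python sketch of a single qubit: the representation as a pair of normalised amplitudes is standard, but the toy measurement function is this sketch's own simplification.

```python
import math
import random

# A qubit state |psi> = alpha|0> + beta|1> is a pair of amplitudes
# normalised so that |alpha|^2 + |beta|^2 = 1.
alpha = 1 / math.sqrt(2)  # amplitude of |0>
beta = 1 / math.sqrt(2)   # amplitude of |1>: an equal superposition

assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < 1e-9

def measure(alpha, beta):
    """Measurement collapses the superposition to 0 or 1, with
    probabilities |alpha|^2 and |beta|^2 respectively."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Repeated measurements of freshly prepared qubits give 0 about half the time.
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ~0.5
```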
It has been established that Shannon's theory of information can be successfully extended to account for quantum theory, and potentially to quantum computers of enormous capability.
The evolution from oral speech to writing was effected through a progression from pictographic, writing the picture; to ideographic, writing the idea; and then logographic, writing the word. This journey led from things to words, from words to categories, from categories to metaphor and logic. We know that formal logic is the invention of Greek culture after it had interiorized the technology of alphabetic writing. The culture of literacy developed its many gifts: history and the law; the sciences and philosophy; the reflective application of art and literature itself.
For a long period the written word was the task of scribes. The revolution came from Johannes Gutenberg (c.1400-68), who was the first in the West to print using movable type and the first to use a press. Printing books naturally assisted in their wide dissemination. But it was left to Elizabeth Eisenstein, in her landmark two-volume work 'The Printing Press as an Agent of Change', published in 1979, to demonstrate conclusively that printing was the communications revolution essential to the transition from medieval times to modernity.
From the extensive treatment of Shannon's theory of information, the book moves to a more telegraphic presentation of the three waves of electrical communication, erected in sequence: telegraphy, telephony, and radio, which collectively annihilated space and time. Naturally, ingenuity was required to effect the crossing point between electricity and language, and also the interface between device and human.
It is worth noting that both Claude Shannon, of whom we have spoken at length, and Alan Turing, the British genius who apart from breaking the German Enigma code also conceived the elegant, ethereal abstraction of the Universal Machine, were engaged in cryptography during World War II. They even met in the Bell Labs cafeteria, but naturally did not talk to each other about the specific nature of their secret work.
I shall conclude the review with information in biological systems. DNA serves as a one-dimensional store of information; DNA also sends that information outward (information transfer) for use in the making of the organism. The data stored in a one-dimensional strand has to flower forth in three dimensions. This information transfer occurs via messages passing from nucleic acids to proteins. So DNA not only replicates itself; separately, it dictates the manufacture of something entirely different.
When the genetic code was solved, in the early 1960s, it turned out to be full of redundancy. The redundancy serves exactly the purpose that an information theorist would expect: it provides tolerance for errors.
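A small Python sketch of the kind of redundancy meant here, using one well-known corner of the standard genetic code: alanine's four synonymous codons, where any mutation in the third base leaves the encoded amino acid unchanged. Framing this as an error-tolerance demo is this sketch's own choice.

```python
# Fragment of the standard genetic code: all four codons that differ only
# in the third base encode the same amino acid, alanine.
CODON_TABLE = {
    "GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
}

def mutate_third_base(codon, new_base):
    """Simulate a point mutation in the codon's third position."""
    return codon[:2] + new_base

original = "GCU"
for base in "UCAG":
    mutant = mutate_third_base(original, base)
    # Every third-position mutation still yields alanine: the message survives.
    print(mutant, "->", CODON_TABLE[mutant])
```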
Imagine a random string of numbers, infinite in length, with no discernible pattern. Is it possible to write a small, compact computer program to predict the next number in the string at any given point? No: the minimum length of a computer program that could accurately print out the string in the correct order is equal in length to the string itself.
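Compression makes this idea tangible: a practical compressor is a crude stand-in for the 'shortest program' measure, so patternless data should barely compress at all. A minimal Python sketch, with invented inputs:

```python
import os
import zlib

random_bytes = os.urandom(10_000)  # effectively patternless
patterned = b"3141592653" * 1_000  # same length, obvious structure

# Random data has no pattern to exploit, so it stays roughly 10,000 bytes;
# the patterned string collapses to a few dozen bytes.
print(len(zlib.compress(random_bytes)))
print(len(zlib.compress(patterned)))
```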
However, think about the number Pi: infinite yet utterly non-random. A very small, finite computer program can compute it to any length and accuracy just by iterating a simple formula; Pi is, after all, only the ratio of a circle's circumference to its diameter.
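As a sketch of this point, here is a tiny, fixed-size Python program that approximates Pi to any accuracy you care to wait for; the Leibniz series is chosen for brevity, not speed.

```python
# Approximate pi via the Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# A handful of lines suffices to generate pi to arbitrary accuracy.
def pi_approx(terms):
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

print(pi_approx(1_000_000))  # ~3.141592, with the error shrinking as terms grow
```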
Then consider how much 'information' and 'meaning' the two numbers contain. If you think about it, Pi - computable by looping a single mathematical operation - actually contains a small, finite quantity of information: the information contained in the program. It is only a ratio between two numbers, and we only need a small amount of code to predict it to infinite accuracy. But in terms of the amount of 'meaning' this information represents, Pi basically underpins our understanding of the mechanics of the universe: it has an awful lot of 'meaning'. Conversely, the random string is a massive amount of information with little or no value to us: it has no 'meaning'. To me, this is a new and fascinating way of looking at the universe, leading to some very exciting conclusions.
In this absorbing book, James Gleick takes us on a guided history tour of how modern science finally came to understand the importance of 'information', to the point where it is now regarded as a fundamental quantity just like energy. There are fascinating chapters on the work of Alan Turing and Claude Shannon as well as a brilliant exposition on the concept of entropy, clearly linking it to information theory. As entropy increases, so does the amount of information in the universe. Ultimately we are, just like Maxwell's philosophical Demon, fighting a losing battle in trying to structure and sort it.
This is the book I should have read before I tackled Seth Lloyd's "Programming The Universe". Unfortunately, as I now know, this was not to be. Lloyd's book was published first and - because entropy is always on the up and up - nobody, least of all me, can turn back the hands of time!