
Customer Reviews

4.1 out of 5 stars (46 customer reviews)



on 26 April 2011
I'm not sure the reviews so far are terribly helpful if you want a quick feel for whether to read this book or not. So here goes.

It's basically a bravura sweep through the history of information, told with great panache and lots of anecdote, mixing straight narrative with reflection on wider significance, and attempting to explain quite difficult concepts for the non-specialised reader. Whatever else it may or may not be, I found it a lively and enjoyable read.

The book falls broadly into three sections. The first runs through key early stages in the creation, storage and use of information - the alphabet, printing, the telegraph, telephone, etc. I didn't find much new here but the author did a great job marshalling facts, figures, characters and anecdotes into a lively tale.

The heart of the book grapples with information as a scientific concept, and you will find yourself in the realm of computers, information theory, DNA and quantum mechanics (to name but a few). This isn't natural territory for me, but I was swept along by Gleick's style and even felt I understood some of the underlying mathematical concepts he sought to explain.

The final section is essentially a thought piece on the modern information age, considering the ubiquity of information from the internet and the perils of information overload. Rather like the first section, I didn't feel there was a great deal new here but Gleick's ability to call up literary references, make parallels across the centuries and ask the pertinent questions made it an engaging read. I'm certainly pleased to have made the acquaintance of Vincent of Beauvais, a thirteenth century monk who seems to have arrived 750 years early for the Information Age.

So, a dazzling read certainly, but one also with a great deal of substance. Recommended.
86 people found this helpful.
VINE VOICE on 24 February 2012
Gleick's Chaos was one of the books of my teen years, and I also read Genius, his biography of Richard Feynman, so I had high expectations for The Information. In Chaos, Gleick was at his best when telling the history of the field or sketching the participants (Chaos focused on Mitchell Feigenbaum). Here again, the strongest parts of the book are those where he is talking about Babbage or Shannon. The story of Shannon and his seminal paper on information theory is brilliantly explained, along with the impact of the advances in communication from telegraphy to telephones and the internet. His explanations of probability and complexity are much clearer than similar arguments made by Murray Gell-Mann in The Quark and the Jaguar. He also does a great job with number theory and the problems of rational, irrational and computable numbers and their information content. It was very nice to see Henry Quastler mentioned, as he has unfortunately been overlooked because of his untimely death.

The problem comes towards the end, in the chapters where he looks at information in biology, entropy and complexity. Biology, as Sydney Brenner and Craig Venter have both said, is an information science, but Gleick leans too heavily on Richard Dawkins's view of neo-Darwinism and information, which is a gloss on the work of John Maynard Smith. Another founder of the idea of information in biological sequences, especially from a phylogenetic perspective, is Linus Pauling. The real picture in biology is deeper than this. The gene code is not a code for an organism, in the same way that a blueprint does not build a building without the builders to build it and the technical know-how of the construction. We are seeing some progress in understanding the complexity of biology, and it is much more complex than we have ever imagined. This is the field of bioinformatics, and bioinformaticians, as far as I know (I have been one for nearly 20 years), do not investigate chain letters, which is the only example he gives in the book. The information in the genome exists at many different levels: in sequence, in structure, between the genes in the junk DNA, and in the chemistry. It is not 0s and 1s; it is an intimate layering of information.

I do not think he was trying to be comprehensive, but the work of Stephen Wolfram, Gell-Mann and those at Santa Fe such as Andreas Wagner, as well as older work from Conrad Waddington, would have provided a broader view of the arguments about complexity in computational and biological systems.
22 comments | 21 people found this helpful.
on 9 November 2011
Much of the book makes fascinating reading - pseudo-codes in African drumming, lucid accounts of concepts such as computable vs non-computable numbers, Byron's daughter and more. There is a large section on the development of alphabets and dictionaries, and Gleick observes that the OED in Murray's edition allows 89 distinct meanings for the word 'make'. It seems a pity that he did not look up the word 'information' at the same time, as besides the normal meaning of knowledge there are a number of specialised meanings, and the word was even usurped by Shannon in his signalling theory to indicate disorder and randomness: if Shannon had chosen a different term for this concept, Gleick's book would have been very different. 'Information' is also used in molecular biology to indicate encoding, compatible with the normal meaning, and there may be further usages in number theory and quantum physics.

Gleick gives a broad discussion of these different usages, although it is not always clear that he realises that the meaning depends on context and that he is writing about disparate topics. This could be the reason that the book is interesting in its parts but, to me, lacks overall coherence.

An editor might have deleted the end section on the future of Google, Wikipedia and the internet, which risks becoming outdated even before the paperback edition of the book is released - indeed, WikiLeaks, also due to pass off into history, shows that relevant knowledge can still be held as closely guarded secrets.
3 people found this helpful.
on 8 August 2012
I've been a fan of James Gleick's work ever since his book on Chaos came out, so I was looking forward to reading his book on information. Unfortunately, for the first time I was disappointed. It's difficult to put a finger on a specific reason why I got only part way through the book before abandoning it. I'm used to reading larger books than this, so it wasn't the size. The writing seemed more turgid than his past work, and the portraits of the key figures fuzzier. And, of course, it wasn't helped by the tiny print size used by the publishers.

Unlike some of the other reviewers, I don't have any specific disagreements with Gleick's ideas on information, at least as far as I got through the book. It was just that the writing was too pedestrian to hold my attention for long enough to complete it.
11 people found this helpful.
on 18 February 2013
If one is forgiven the sin of communicating in memes (see chapter 11), this fine work is best described as a triumph of joined-up thinking. The aim of writing a history of information and ideas in 500 pages seems initially outrageously optimistic, yet by the epilogue one is left dazzled and in awe of Gleick's ability to draw upon such a diverse array of human achievements and pursuits to produce such a cogent and coherent discussion.

The author comprehensively charts the progress of ideas and information transmission from the oral, through the first alphabets to the written; then via printing, which led to the Renaissance and the birth of modern science; to mechanical computing, envisaged and partly realized by Babbage; then through telegraphy, telephony and electronic computing; and ultimately to quantum computing, the internet, Wikipedia, Google and Twitter.

At the core of Gleick's thesis is the notion of information theory, developed by Shannon in the late 1940s and early 1950s, and the revolutionary influence it had on academic disciplines as varied as psychology, computing, genetics and quantum physics. Shannon's view of information as a signal or code transmitted to a sentient listener, who subsequently creates information from it, is a fundamental tenet of the cognitive psychology and neuropsychology that emerged in the 1950s, refuting the 'black box' of the behaviourist psychologists.

Further, Shannon's quantifying of information in terms of 'bits' paved the way for the use of transistors and resistors to manipulate data in electronic computers. His envisaging of information and its transmission as a code had a profound influence on Watson and Crick's unravelling of the complexities of DNA and how it codes for amino acids, which subsequently create the proteins from which all living things are made. Shannon also introduced the concept of information being associated with probability, through notions such as redundancy in language and codes. This made a clear link with quantum mechanics via Heisenberg's uncertainty principle and provided impetus to the nascent discipline of quantum computing.
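To make the ideas of 'bits' and redundancy concrete, here is a minimal illustrative sketch (my own, not from the book or from Shannon's paper) that computes the Shannon entropy H = -Σ p·log2(p) of the character distribution of a short message; the gap between that figure and the maximum possible entropy is the redundancy Shannon was pointing at.

```python
from collections import Counter
from math import log2

def entropy_bits_per_char(text: str) -> float:
    """Shannon entropy of the character distribution, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

message = "the quick brown fox jumps over the lazy dog"
h = entropy_bits_per_char(message)
max_h = log2(len(set(message)))  # maximum entropy if every symbol were equally likely
print(f"entropy: {h:.2f} bits/char, maximum: {max_h:.2f} bits/char")
print(f"redundancy: {1 - h / max_h:.1%}")  # the predictable share of the signal
```

Real English is even more redundant than this toy count suggests, because letters also depend on the letters around them; that extra predictability is what lets compressors shrink text and lets readers fill in missing characters.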

Gleick eloquently tells the human story behind these great advancements, portraying the key players, the controversies and the very real impact upon the everyday lives of people - for instance the shrinking of space and time created first by telegraphy and today by the internet. Twenty-first-century information overload is also amusingly discussed, as is the squabble over telegraph addresses among large companies and rich individuals in the late 19th century, mirrored in the late 20th by litigation over the ownership of internet domain names.
One person found this helpful.
`We can see now that information is what our world runs on: the blood and the fuel, the vital principle.'

Information takes many forms, and relies on different technologies for its retention and dissemination. I struggled at first with the definite article in the title, but the more I read the more sense it made, most of the time.

`Writing comes into being to retain information across time and across space.'

In this book, James Gleick covers the development and different forms of literacy. From Chinese script (between 4,500 and 8,000 years ago) to the development of the alphabet around 1500 BCE, literacy enables information capture and transmission. But many of the developments that surround us today can be pinpointed to work undertaken by Claude Shannon and published in his paper called `A Mathematical Theory of Communication' in 1948. For Claude Shannon, communication was an engineering challenge unrelated to the content of the message. What was important was that the message could be transmitted so that someone else could recover it. The field of information theory was created.
Information theory may originally have been perceived as having applications in engineering and computer science (how fortuitous that the first `point contact' transistor was discovered around the same time), but its impacts were far greater. Information theory has reshaped fields such as economics and philosophy, and has resulted in changes to thinking in biology (the structure of DNA) and physics (the paradoxes of quantum mechanics).

`Every new medium transforms the nature of human thought. In the long run, history is the story of information becoming aware of itself.'

While I found many of the scientific concepts challenging, the book is written in such a way that the story remains clear. I enjoyed the anecdotes, especially the one in which Charles Babbage wrote to Alfred, Lord Tennyson to point out the inaccuracy of the arithmetic in the couplet `Every moment dies a man/Every moment one is born' (in `The Vision of Sin'), suggesting this change: `Every moment dies a man/And one and a sixteenth is born.' `I may add that the exact figures are 1.167, but something must, of course, be conceded to the laws of metre.'
Fascinating as the science is, it is the human history of information that most interested me: the various writing systems invented; the compilation of the Oxford English Dictionary (thirty different ways to spell `mackerel' is so triumphantly human); the stories of coding and communication (including the semaphore system invented by the Chappe brothers in France during the late 18th century).

`We may wish to understand the rise of literacy both historically and logically, but history and logic are themselves the products of literate thought.'

There's plenty of information and food for thought in this book, and it isn't necessary to agree with everything James Gleick writes in order to appreciate the broader points made. Perhaps the biggest question for me is how we decide what constitutes meaningful information, and how we manage information flow effectively at a personal level.

Is information knowledge? Where is the balance between process and outcome?

Jennifer Cameron-Smith
One person found this helpful.
on 18 March 2013
This is, at its core, an excellent book. Chapters 1 to 9 are a party full of old friends from Babbage through to ... eh? What's this?!? DNA, and oh no! Someone drunk in the corner waffling just-so stories and old defunct theories? It's The Dawk!! Skip the rest of chapter 10... 'Memes'? OK, skip that chapter... and here's who we expected to meet at this stage: Chaitin and Kolmogorov having a couple of whiskies. Fine. Gotta be self-organising networks next ... surely. No? Skip, skip, skip - oh, here's Duncan, "hi". Skip, skip, skip. "Anyone seen Erdős? Couldn't make it? Too short a time and I missed him? Oh well."

I'm really sorry. Gleick is a really good writer, does a service to the sciences, and was fully on form for most of the book - but he just spent too much space and, no doubt, time on the wrong stuff in over a third of it.
One person found this helpful.
on 7 November 2012
I came to this volume with only a vague idea about information and information theory, but having been very impressed by Gleick's book on chaos (Chaos: Making a New Science). This is an engaging work that explores our understanding of information and how it has changed over time. It does this primarily by looking at a number of key thinkers and contributors, from the likes of Charles Babbage (1791-1871) and Augusta Ada King, Countess of Lovelace (1815-1852), to Alan Turing (1912-1954) and Claude Shannon (1916-2001).

This is an interesting overview that brings together both the history and theory of information and shows how we came to be living in the "Information Age". Well worth a read.
One person found this helpful.
VINE VOICE on 11 August 2013
Many great insights into "the meaning of life, the universe and everything" begin with a vision or a universal concept that was just under our nose but required someone to tell us what we already knew and bring it to the front of our minds. Think back to economics classes: before those classes, economics was just a term for money handling. Now we see that every great war and every great invention, even the small ones, was encouraged and even made possible by economics. Before reading books such as "Homo Evolutis" by Juan Enriquez and Steve Gullans, we knew of evolution and its controversies but never thought that we would see it all around us and realize that much of it is our own doing. Now there is "The Information: A History, a Theory, a Flood" by James Gleick, also the author of "Faster: The Acceleration of Just About Everything." The title of this book is definitely an understatement of what you're about to be presented with. Just keep in mind that, as much fun as this book is to read, it is how you use this "information" that gives the book its worth.

We will see that every little "bit" of the universe and everything in it is "information." Do not overlook the prologue for an encompassing hint as to what the book is about. No information-related subject is glossed over; we get extensive history and in-depth views of what information is, how it has always been all around us, and where it is going. I will not go into every detail, or you would not need to read the book.

Be prepared for over 400 footnotes and an extensive bibliography, which will take some time to look up.
Here's a book which examines several aspects of the history of information and communication, beginning with African drums and ending up with Wikipedia. Along the way, the author discusses the work of such pioneers as Charles Babbage (who invented the mechanical computer), Ada Lovelace (who worked with Babbage and is considered the world's first computer programmer), Samuel Morse (inventor of the single-wire telegraph), and Claude Shannon, who - as the original information theorist - is the real hero of the book. Focussing the story on the personalities is a shrewd touch, as it keeps the tale interesting, even for the non-specialist who might otherwise get bogged down in the technical details of things such as entropy measurement, quantum computing, and the propagation of memes.

The other thing that keeps the reader's attention is Gleick's entertaining, assured writing style (already familiar to those of us who've read his excellent biography of Richard Feynman). For example, here is his stimulating comment on a letter from Lovelace to Babbage (p119):

"She was programming the machine. She programmed it in her mind, because the machine did not yet exist. The complexities she encountered for the first time became familiar to programmers of the next century."

His description (p231) of the first attempt by Shannon (or indeed anyone) to construct a scale of information content - ranging from the digit wheel in an adding machine (3 bit), through the human genome (estimated conservatively at 100 Mbit), up to the Library of Congress (100 Tbit) - is similarly arresting; the fact that Shannon did this in 1949, just before his book on information theory appeared, and was the first person to suggest that a genome was an information store, is extraordinary.
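To see where a figure like "3 bits" for a digit wheel comes from, here is a tiny sketch (my own illustration, not Shannon's table) using the rule that N equally likely, distinguishable states carry log2(N) bits of information:

```python
from math import log2

# Information content of N equally likely, distinguishable states: log2(N) bits.
items = {
    "digit wheel (10 positions)": 10,
    "single coin flip": 2,
    "one byte (256 values)": 256,
}
for name, states in items.items():
    print(f"{name}: {log2(states):.2f} bits")
```

A ten-position wheel comes out at about 3.3 bits, which is the "3 bits" quoted above; the same logarithmic rule is what lets Shannon put a genome and the Library of Congress on one scale.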

I greatly enjoyed this book. The concepts and technologies it discusses are complicated, but Gleick explains them cleverly, and brings out the excitement in the pursuit of an understanding of the way we use, transmit and keep what we know, and the effect it has on our lives.
