


on 5 April 2016
Excellent!
on 21 February 2013
For someone working in any field of technology or anyone even vaguely interested in how we got to this point of information proliferation, this is just class.
on 8 August 2012
I've been a fan of James Gleick's work ever since his book on Chaos came out. Thus I was looking forward to reading his book on information. Unfortunately, for the first time I was disappointed. It's difficult to put a finger on a specific reason why I got only part way through the book before abandoning it. I'm used to reading larger books than this, so it wasn't the size. The writing seemed more turgid than his past work, and the portraits of the key figures fuzzier. And, of course, it wasn't helped by the tiny print size used by the publishers.

Unlike some of the other reviewers, I don't have any specific disagreements with Gleick's ideas on information, at least as far as I got through the book. It was just that the writing was too pedestrian to hold my attention for long enough to complete it.
11 people found this helpful.
on 20 December 2012
The Information is billed as the 'story of how human beings use, transmit and keep what they know', discussing a series of information revolutions: 'the invention of writing, the composition of dictionaries, the creation of charts that made navigation possible, the discovery of the electronic signal, and the cracking of the genetic code.' At best it only does a fraction of this, and from a very particular perspective. The book is principally a treatise on information theory within mathematics and physics and on how information is encoded and communicated in a technical and theoretical sense. It is especially concerned with the reduction of information to its constituent parts, how these are encoded and transmitted, and the notion that information is the constituent component of life and the universe. This is a view of information shorn of meaning and context. Consequently the reader does not get the full story of information revolutions with respect to the written word, and the visual (art, maps, photography, television, film) and the aural (voice, music) are all but absent. Oddly, there is no discussion of broadcast media such as radio and television, though a fair amount of discussion is dedicated to the telegraph and the internet. There is no discussion of discourse or of how information is used. The book, then, is filled with absences. What is included, however, is often fascinating and intriguing, although my feeling is that the level is often not for the average lay reader - it is quite advanced and requires a fair degree of prerequisite knowledge. In this sense, it's sold as a popular science book, but given its technical nature and length I suspect it has far more sales than readers who manage to get from start to end. Overall, an interesting book, but it doesn't quite live up to its billing.
4 people found this helpful.
on 25 April 2011
I think the core idea of this book is about the separateness of 'information' and 'meaning'.

Imagine a random string of numbers, infinite in length with no discernible pattern. Is it possible to write a small, compact computer program to predict the next number in the string at any given point? No, the minimum length of a computer program that could accurately print out the string in the correct order is equal in length to the string itself.

However, think about the number Pi. Infinite in its digits, yet utterly non-random. A very small, finite computer program can compute it to arbitrary length and accuracy, since Pi - the ratio of a circle's circumference to its diameter - can be generated by a simple convergent formula.

Then consider how much 'information' and 'meaning' the two numbers contain. If you think about it, Pi - computable by looping a single mathematical operation - actually contains only a small, finite quantity of information: the information contained in the program. It is only a ratio between two numbers, and we need only a small amount of code to predict it to any accuracy. But in terms of the amount of 'meaning' this information represents, Pi basically underpins our understanding of the mechanics of the universe, so it has an awful lot of 'meaning'. Conversely, the random string is a massive amount of information with little or no value to us, so it has no 'meaning'. To me, this is a new and fascinating way of looking at the universe, leading to some very exciting conclusions.
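The reviewer's contrast is essentially the idea of Kolmogorov complexity: the information content of a string is the length of the shortest program that produces it. A rough sketch of both halves of the argument in Python, where a general-purpose compressor stands in for the 'shortest description' and the Leibniz series stands in for the small Pi-generating program (both are illustrative choices, not anything from the book):

```python
import math
import os
import zlib

# Patterned data compresses to almost nothing: a short description suffices.
patterned = b"0123456789" * 1000            # 10,000 bytes, highly regular
print(len(zlib.compress(patterned)))        # a few dozen bytes

# Random data is essentially incompressible: its shortest description
# is about as long as the data itself.
random_data = os.urandom(10_000)
print(len(zlib.compress(random_data)))      # roughly 10,000 bytes

# Pi's digits look statistically random, yet a tiny program generates
# them to any accuracy: the Leibniz series pi = 4*(1 - 1/3 + 1/5 - ...).
pi_est = 4 * sum((-1) ** k / (2 * k + 1) for k in range(1_000_000))
print(abs(pi_est - math.pi) < 1e-5)         # True
```

Note the asymmetry the reviewer describes: the random string's program must be as long as the string, while Pi's is a single short loop.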

In this absorbing book, James Gleick takes us on a guided history tour of how modern science finally came to understand the importance of 'information', to the point where it is now regarded as a fundamental quantity just like energy. There are fascinating chapters on the work of Alan Turing and Claude Shannon as well as a brilliant exposition on the concept of entropy, clearly linking it to information theory. As entropy increases, so does the amount of information in the universe. Ultimately we are, just like Maxwell's philosophical Demon, fighting a losing battle in trying to structure and sort it.

This is the book I should have read before I tackled Seth Lloyd's "Programming The Universe". Unfortunately, as I now know, this was not to be. Lloyd's book was published first and - because entropy is always on the up and up - nobody, least of all me, can turn back the hands of time!
2 people found this helpful.
on 6 December 2011
In this book Gleick (2011) offers a historical account of information and communication systems, showing the technology, the social and cultural impact, the inventors and their environment, and a theoretical account of information. Starting with anthropological accounts of communication and signalling, moving through the optical and electric telegraph and the ancestors of early calculators and computers, he comes to write about the information age. Each of these information and communication technologies, according to Gleick (2011) - the printing press, telegraph, telephone and internet - left its mark on our society; most noticeable are the early communication systems, which made the world smaller and globally connected and contributed to standardisation, such as the introduction of time zones.

Several key figures pass by: Charles Babbage, who contributed to the early calculator or computer; Ada Byron, who started working on the first algorithms; Samuel Morse, who contributed to the telegraph and Morse code; Alan Turing, one of the key figures in the development of computation and algorithms and inventor of the Turing machine; and Claude Shannon, regarded as the father of information theory. Gleick (2011) provides biographical information on how some of these people were interlinked, together with accounts of how they contributed by innovation, adaptation and research to the different information and communication systems.

Gleick (2011) does not contribute to the history of communication technologies with new research, facts or insights, and refers to and quotes several well-known authors on the same topic. The interesting bit in Gleick's (2011) book is probably not the historical description of how different systems came into being, but the adaptations people made to get the most out of them, which are now largely forgotten. The telegraph resulted in lists of abbreviations and special dictionaries of words and phrases to transmit as much information as possible with the least number of characters. And the telephone book, with its lists of persons and numbers, was needed to organise and connect people. Both types of retrieval system were made redundant by new technologies. Despite their disappearance, the quest to develop more efficient signs and media contributed to the development of digital information. And as digital information becomes increasingly more sophisticated and efficient, it leads, according to Gleick (2011), to a world of abundance, or a flood of information.

Towards the end, the book becomes fragmented. Gleick (2011) tries to show how information is understood in different disciplines, such as genetics (memes and genes), physics (Maxwell's Demon and randomness), and literature (Jorge Luis Borges, Library of Babel). Gleick (2011) shows briefly how older communication systems not only transmitted, selected and filtered information, but also provided tools to make sense of it, such as printed books with indexes, tables of contents and references, special books like catalogues, encyclopaedias, glossaries, books of quotations and anthologies, and genres such as book reviews and digests. However, Gleick (2011) does not describe, except for mentioning that it becomes available in daunting quantities, how people deal with digital information. It would have been more interesting, instead of the theoretical descriptions of information theory, which is not Gleick's strong point, to know how people distribute, organise, structure, select and filter information and make the best of the tools available, and to contrast this with previous and almost forgotten attempts.

As mentioned, this book is an eclectic collection of interesting pieces and ideas, at times easy to read and amusing, but on the whole incoherent.

Gleick, J. (2011) The Information: A History, a Theory, a Flood. London: Fourth Estate.
6 people found this helpful.
TOP 1000 REVIEWER on 22 April 2011
James Gleick's books - Chaos, Genius, Isaac Newton, and the present one - share the attractive traits of being intelligently conceived, meticulously researched and beautifully realized; the author is erudite, conveys the distinctive atmosphere of an era, the character of its actors and the telling anecdote, and is a delightful storyteller; his prose is simple but conceptually rich and expansive. The book traces the evolution of human communication and its impact on culture, from oral communication to the twenty-first century with the internet, Wikipedia and cloud computing. A recurring theme in this evolution is the rapid increase in information, which in recent years has been phenomenal, with information increasing exponentially.

Because of its centrality in the book, I commence the review with, and somewhat elaborate on, Shannon's theory of information. The year 1948 was an annus mirabilis for Bell Telephone Laboratories. During that year Claude Shannon, then aged 32 and working at Bell Labs, published his theory of information under the title 'A Mathematical Theory of Communication' and introduced the word 'bit' (a coinage he credited to John Tukey); the transistor was invented in the same year and the same lab, but the word 'transistor' was the product of committee deliberation.

Shannon in his theory defines the information content of an event as being proportional to the logarithm of the inverse of its probability of occurrence. Shannon's theory of information is related to entropy in that an increase in the entropy of a system increases its disorder while concurrently increasing its information content.

In Shannon's theory of information, the fundamental concept of distinguishability between two different states is a bit of information. A bit, by definition, exists in one of two different states at any given time - a zero or a one; where you have more than two outcomes, you simply use more bits to distinguish them all. Shannon's information theory relates to events based on Boolean logic, i.e. for an event with several outcomes, each outcome happens or does not happen.
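The definition just sketched is easy to state in code. A minimal Python illustration (the probabilities here are made-up examples, not taken from the book): information in bits is log2(1/p), and entropy is the average information over all outcomes.

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p:
    I(p) = log2(1/p). The rarer the event, the more information it carries."""
    return math.log2(1 / p)

def entropy(probs) -> float:
    """Shannon entropy: the average information per outcome."""
    return sum(p * self_information(p) for p in probs if p > 0)

print(self_information(0.5))   # a fair coin flip carries exactly 1.0 bit
print(entropy([0.25] * 4))     # four equally likely outcomes: 2.0 bits
```

The second result matches the reviewer's point about distinguishing outcomes: four equally likely states need exactly two bits.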

Shannon's information theory has a very wide applicability in that it has the same logical foundation in different physical, biological, social and economic phenomena.

All the preceding relates to classical physics and the classical worldview. However, a more accurate description of our physical world is given by quantum theory, which has superseded classical physics. With quantum theory the notion of a deterministic world fails: events always occur with a probability, regardless of how much information we possess.

Classical bits, as we have already explained, exist in one of two different states at any given time - a zero or a one. With quantum mechanics, however, a single physical system is permitted to be in a zero and a one at the same time. In fact we are permitted an infinite range of superpositions of zero and one, which we call a qubit.

It has been established that Shannon's theory of information can be successfully extended to account for quantum theory and potentially to a quantum computer with enormous capability.

The evolution from oral speech to writing was effected through a progression from pictographic (writing the picture), to ideographic (writing the idea), and then logographic (writing the word). This journey led from things to words, from words to categories, from categories to metaphor and logic. We know that formal logic is the invention of Greek culture after it had interiorized the technology of alphabetic writing. The culture of literacy developed its many gifts: history and the law; the sciences and philosophy; the reflective application of art and literature itself.

For a long period the written word was the task of scribes. The revolution came from Johannes Gutenberg (c.1400-68), who was the first in the West to print using movable type and the first to use a press. Printing books naturally assisted in their wide dissemination. But it was left to Elizabeth Eisenstein, in her landmark two-volume scholarship 'The Printing Press as an Agent of Change' published in 1979, to conclusively demonstrate printing as the communications revolution essential to the transition from medieval times to modernity.

And from the extensive treatment of Shannon's theory of information to the telegraphic presentation of the three waves of electrical communication erected in sequence: telegraphy, telephony, and radio, which collectively annihilated space and time. Naturally, ingenuity was required to effect the crossing point between electricity and language, and also the interface between device and human.

It is worth noting that both Claude Shannon, of whom we have spoken at length, and Alan Turing, the British genius who apart from breaking the German Enigma code also conceived the elegant, ethereal abstraction of the universal machine, were engaged in cryptography during World War II and even met in the Bell Labs cafeteria, though naturally they did not talk to each other about the specific nature of their secret work.

I shall conclude the review with information in biological systems: DNA serves as a one-dimensional store of information; DNA also sends that information outward (information transfer) for use in the making of the organism. The data stored in a one-dimensional strand has to flower forth in three dimensions. This information transfer occurs via messages passing from nucleic acids to proteins. So DNA not only replicates itself; separately, it dictates the manufacture of something entirely different.

When the genetic code was solved, in the early 1960s, it turned out to be full of redundancy. The redundancy serves exactly the purpose that an information theorist would expect: it provides tolerance for errors.
9 people found this helpful.
on 18 May 2011
This curious book breaks down towards the end, when Gleick's arguments about information expansion become slight, devoid of evidence and as incoherent as a Google search for 'truth'. This is a shame, as up until then the narrative describing exactly why we were developing technology for the sake of information was genuinely fascinating, and a tale well told. It seems we rarely found what we were looking for, but often something more useful, or perhaps this only seemed so because we found ourselves using it.

Away from the arm-waving threats about the banality of volume, Gleick does the technical background very well, and for once someone does some justice to scientific method presented in a book aimed at a more general audience. The central thesis draws attention to the magnifying power created when two separate spheres of knowledge, language and numbers, can be reconfigured as representations of each other. The resulting turbulence is still being felt, and whilst I note the raving towards the end, it is surely far too soon in our experience of the Internet to know what its implications will be. This is where Gleick really does become difficult to follow: despite citing all those historical accounts of other moments when his critical predecessors also saw the demon in the sheer volume of stuff available, he doesn't seem to position himself in the same place, but as if facing something unprecedented in the guise of the Internet (well, they thought that too, only it was the expansion of print, or the availability of real time information via the telephone). The best bit is exactly that: the rise of the bit as the quantum unit of consciousness, and its apparent capacity to efficiently resolve so much of what we currently know that we have come to assume it can accommodate just about anything (with this assumption currently feeding the mania for 3D printers and rapid prototyping).

The Internet will continue to fascinate us for some time to come, but as it degenerates in prominence and quality, we will treat it with the 'respect' we show some of the other technologies that once beguiled us. Of course, we will eventually neglect it, except as a reference point when we go beyond our immediate understanding (probably gained by some more knowledge-localising tech), but as Gleick shows, guessing the future of technology turns out to be rather tricky.
7 people found this helpful.
on 22 November 2011
I was enjoying James Gleick's easy way with sophisticated concepts when I came crashing back to earth and began to suspect a facile over-simplification: his chapter on the "selfish gene" is just wrong. Richard Dawkins was not the originator of the idea, but rather its populariser: it's like attributing the basics of information theory to Marshall McLuhan rather than to Claude Shannon. The gene-oriented theory of natural selection has a distinguished pedigree, none of it derived from original work by Richard Dawkins. George C. Williams, for instance, was best known for his vigorous critique of group selection, a blind alley in evolutionary theory. The work of Williams in this area, along with that of W. D. Hamilton, John Maynard Smith and others, led to the development of a gene-oriented view of evolution in the 1960s. It is a great pity that Gleick fails to acknowledge the intellectual origins of the "selfish gene". What other gross misrepresentations have I missed, outside the area of my own training and expertise?
22 comments | 5 people found this helpful.
on 26 April 2011
I'm not sure the reviews so far are terribly helpful if you want a quick feel for whether to read this book or not. So here goes.

It's basically a bravura sweep through the history of information, told with great panache and lots of anecdote, mixing straight narrative with reflection on wider significance, and attempting to explain quite difficult concepts for the non-specialised reader. Whatever else it may or may not be, I found it a lively and enjoyable read.

The book falls broadly into three sections. The first runs through key early stages in the creation, storage and use of information - the alphabet, printing, the telegraph, telephone, etc. I didn't find much new here but the author did a great job marshalling facts, figures, characters and anecdotes into a lively tale.

The heart of the book grapples with information as a scientific concept, and you will find yourself in the realm of computers, information theory, DNA and quantum mechanics (to name but a few). This isn't natural territory for me, but I was swept along by Gleick's style and even felt I understood some of the underlying mathematical concepts he sought to explain.

The final section is essentially a thought piece on the modern information age, considering the ubiquity of information from the internet and the perils of information overload. Rather like the first section, I didn't feel there was a great deal new here but Gleick's ability to call up literary references, make parallels across the centuries and ask the pertinent questions made it an engaging read. I'm certainly pleased to have made the acquaintance of Vincent of Beauvais, a thirteenth century monk who seems to have arrived 750 years early for the Information Age.

So, a dazzling read certainly, but one also with a great deal of substance. Recommended.
97 people found this helpful.
