Customer Reviews


22 Reviews
5 star: (5)
4 star: (5)
3 star: (7)
2 star: (4)
1 star: (1)


8 of 8 people found the following review helpful
3.0 out of 5 stars Much more than just Turing's contribution to computing, 15 May 2012
By CatR
Verified Purchase
Unlike other reviewers, I was not worried by the fact that this book is not another rehash of the same source material about Alan Turing. Setting his famous paper in some, maybe not the entire, context of the time was illuminating.

I skimmed sections that seemed dense in technical details of valves and command lines, but the stories of wives and women working on computer hardware and programmes, plus the vibrant "work hard, play hard" atmosphere in the various campus-type living arrangements were fascinating. Klari von Neumann's narrative was one of the most engaging for me. I also quite like stories of how institutions are shaped, so I wasn't put off by this strand.

A stand-out comment related to the power of computer processing keeping men honest, because we've all seen how powerful computer models can be created and used dishonestly.

The Manchester University Small Scale Experimental Machine, or Baby, was repeatedly referred to in the same breath as Colossus, which was a bit confusing. For instance, "the core of the computing group from Bletchley Park were continuing from where their work on Colossus had left off". Unlike the author, who counts Max Newman as the core, I imagine that the core of the computing group were the ones who actually designed and built the machine: Williams, Kilburn and Tootill, who had all been based at the Telecommunications Research Establishment in Malvern. It isn't the most straightforward of family trees, but these vague references don't help to give people their proper credit or to understand why things came about in the way they did.

Kindle-wise, quite a few of the photos at the end seemed to have become separated from their captions on the following page, which is a bit annoying, but I don't remember any particularly awful layout issues.


13 of 14 people found the following review helpful
3.0 out of 5 stars Oh yes, I well remember the command line..., 30 April 2012
By T. D. Dawson "tdawson735" (UK)
Verified Purchase
My first encounter with digital computers was in the late 1960s when I headed up a small design team working in the development of a computer-based remote control/telemetry system to replace the earlier electromagnetic/discrete component systems used by the public utilities. In the ensuing years - although I've occasionally tried - I've never managed to escape completely from the digital universe...

Because of - or perhaps in spite of - this background I found it extremely difficult to review George Dyson's book. The claim on the back cover that the book 'can be read as literature whether or not you have any interest in computers and machine intelligence' is, in my view, grossly misleading and dangerously inaccurate.

For example, we learn on page 301 that (verbatim) "the codes spawned in 1951 have proliferated, but their nature has not changed. They are symbiotic associations of self-reproducing numbers (starting with a primitive alphabet of order codes) that were granted limited, elemental powers, the way a limited alphabet of nucleotide sequences code for an elemental set of amino acids - with polynucleotides, proteins, and everything else that follows developing from there."

This, I submit, is hardly something that can be read as literature. Although I have a reasonable scientific background I had similar difficulties with sections dealing with Monte Carlo statistical techniques, chaos theory in meteorology and with the theory of self-reproducing automata.

The research that Mr Dyson carried out in developing the various chapters is, of course, impressive, but I would have found the book far more interesting and informative had he concentrated on developing the subject matter within a chronological timeline - and, even better, had he focused on explaining it rather than simply relying on extremely erudite statements. He also, and very obviously, found it difficult to decide whether to concentrate on:

1. tracing the development of the digital computer itself. If so, the material on the theory of Turing's Universal Machine should appear before page 243 whilst a summary of the prophetic work of Gottfried Leibniz at the end of the 17th century would be better located before pages 103 to 105. There is, admittedly, a large amount of information on the development of various digital components and storage techniques but, unfortunately, this is scattered throughout the book.

or on

2. examining the work of a number of eminent scientists and focusing on how, by applying the evolving digital technology to their research work, they influenced and contributed to the development of that technology. There is a large amount of interesting background information on the scientists themselves (and on the occasional clash of mercurial personalities) including such anecdotal gems as the hospital at Los Alamos charging one dollar a day for diapers. But...

The depth of material in 'Turing's Cathedral' is immense, which - had it been the sole criterion - would have justified a five-star rating. However, the lack of a coherent timeline and his difficulty in dealing with highly complex scientific issues reduce my rating to a more than generous three stars.

In my opinion the 1953 book Faster Than Thought: A Symposium on Digital Computing Machines gives a far better overview of developments prior to that date. That edition is, unfortunately, now out of print but 1955, 1957 and 1963 reprints are listed on Amazon. Out of interest, the copy on my bookshelf contains, as a bookmark, a receipt dated 3rd December 1953 showing that it cost me £1 16s 3d...!


51 of 58 people found the following review helpful
2.0 out of 5 stars Misleading account, 8 Mar 2012
By Jeremy E. May "Software pro" (Fort Lauderdale, Florida)
The focus of George Dyson's well-written, fascinating but essentially misleading book, 'Turing's Cathedral', is curiously not on celebrated mathematician, code-breaker and computer theorist Alan Turing but on his equally gifted and innovative contemporary John von Neumann. Von Neumann, whose extraordinarily varied scientific activities included inter alia significant contributions to game theory, thermodynamics and nuclear physics, is especially associated with the early development of the electronic digital computer (i.e. the 'EDC'), an interest apparently sparked by reading Turing's seminal 1936 paper 'On Computable Numbers', which attempted to systematize and express in mathematical terminology the principles underlying a purely mechanical process of computation. Implicit in this article, but at a very theoretical level, was a recognition of the relevance of stored-program processing (whereby a machine's instructions and data reside in the same memory), a concept emanating from the work of mid-Victorian computer pioneer Charles Babbage but which demanded a much later electronic environment for effective realization.

What Mr Dyson insufficiently emphasizes is that, despite a widespread and ever-growing influence on the mathematical community, Turing's paper was largely ignored by contemporary electronic engineers and had negligible overall impact on the early development of the EDC. Additionally, he omits to point out adequately that von Neumann's foray into the new science of electronic computers involved a virtually total dependence on the prior work, input and ongoing support of his engineering colleagues. Invited in August 1944 to join the Moore School, University of Pennsylvania, team responsible for ENIAC, the world's first general-purpose computer, then being built for the US Army, von Neumann was quickly brought up to speed courtesy of the machine's lead engineers, J. Presper Eckert and John Mauchly.

As early as the fall of 1943, Eckert and Mauchly had become seriously frustrated by the severe processing limitations imposed by ENIAC's design and were giving serious consideration to implementing major modifications, in particular the adoption of Eckert's own mercury delay line technology to boost the machine's minuscule memory capacity and enable a primitive stored-program capability. These proposals were subsequently vetoed by the School's authorities on the quite understandable grounds that they would seriously delay ENIAC's delivery date; instead it was decided to simultaneously begin research on a more advanced machine (i.e. EDVAC) to incorporate the latest developments. As a new member of the group, von Neumann speedily grasped the essentials of the new science and contributed valuable theoretical feedback, but an almost total lack of hands-on electronic expertise on his part prevented any serious contribution to the nuts and bolts of the project.

Relations with Eckert and Mauchly rapidly deteriorated when an elegantly written, but very high-level, document of his entitled 'First Draft of a Report on the EDVAC' was circulated among the scientific community. Not only had this document not been previewed, let alone pre-approved, by Eckert and Mauchly, but it bore no acknowledgment whatsoever of their overwhelming responsibility for much of the content. By default, and in view too of his already very considerable international reputation, the content was therefore attributed exclusively to von Neumann, an impression he made no attempt thereafter to correct, the term 'Von Neumann Architecture' being subsequently bestowed on the stored-program setup described in the document.

The public distribution of von Neumann's 'Draft' denied Eckert and Mauchly the opportunity to patent their technology. Worse still, despite academic precedents to the contrary, they were refused permission by the Moore School to proceed with EDVAC's development on a commercial basis. In spite of his own links to big business (he represented IBM as a consultant), von Neumann likewise opposed their efforts to do so. All this resulted in a major rift, von Neumann thereafter being shunned by Eckert and Mauchly and forced to rely on lesser mortals to help implement various stored-program projects, notably the IAS computer at Princeton.

The following year (1946) Eckert and Mauchly left the School to focus on developing machines for the business market. Before doing so, they jointly delivered a series of state-of-the-art lectures on ENIAC and EDVAC to an invited audience at the School. Among the attendees was British electronics engineer Maurice Wilkes, a fellow academic of Turing's from Cambridge University, but with relatively little interest in the latter's ongoing activity (by this time Turing, a great visionary, had also turned his attention to designing stored-program computers). Blown away by Eckert and Mauchly's presentation, Wilkes returned to England to forge ahead with a new machine called EDSAC, which was completed in May 1949 and represented the first truly viable example of a stored-program computer (an experimental prototype christened 'Baby' had already been developed at Manchester University the year before). Back in the US, Eckert and Mauchly continued their efforts, but persistent problems with funding and also Eckert's own staunch refusal to compromise on quality delayed progress, their partnership finally culminating in the development of the UNIVAC I, the world's first overtly business-oriented computer, delivered initially to the Census Bureau in March 1951.

Mr Dyson is quite right of course (and he does this well) to trace the beginnings of the modern computer to the stored-program concept, but his obsessive focus on von Neumann's role obscures the impact of Eckert and Mauchly's vastly more significant contribution to its development. The triumph of the EDC depended almost wholly on the efforts and expertise of utterly dedicated and outstanding electronics specialists like them, not on mathematicians, logicians and generalists like von Neumann or even Turing. Never one to deny credit where it was due, Wilkes (who later spearheaded advances in software, became the doyen of Britain's electronics community and ended his long and distinguished career as professor emeritus of computer science at Cambridge) unceasingly acknowledged his major debt to Eckert and Mauchly. Hopefully, Mr Dyson, a writer of considerable talent, might one day decide to tell their story in full and set the record straight.


3.0 out of 5 stars Ex cathedra, 28 July 2014
By Jeremy Walton (Sidmouth, UK)
Dyson tells the story of the computer developed at the Institute for Advanced Study at Princeton, NJ between 1945 and 1951 under the direction of the Hungarian-born mathematician John von Neumann. Actually, he does much more than that in this book - in fact, practically every aspect of my first sentence is unpicked in extensive detail: a history of the Institute, the life of von Neumann, the origins of his design for the machine, its relation to contemporary efforts in computing (including its immediate predecessors the ENIAC and EDVAC at the University of Pennsylvania), the lives and careers of the other people who worked on the computer, and some of the early problems which the computer was used to solve.

The latter - unsurprisingly, given what else was happening in the world at that time - were almost exclusively military in nature: ENIAC was initially designed to calculate artillery firing tables, whilst von Neumann's work on explosions and shock waves in the Manhattan Project, and the extensive calculations it required, were his main reasons for becoming interested in building an electronic computer (at the time that this book deals with, the word 'computer' was only just beginning to change its meaning from a 'person who does calculations' to the machine which is in ubiquitous use today).

This is an interesting tale, and the author tries to tell it in an engaging fashion by concentrating more on the lives of the scientists and engineers who built the machine than going into extensive technical detail about its workings and how they came into being. For myself, I could have done with a little more of the latter: although Dyson shows how von Neumann's account of the earlier EDVAC machine played an important role in promulgating ideas for computer design, it's hard to tell from this account how the IAS machine differed from EDVAC. Similarly, although it is supposedly central to the tale, the way in which Turing's idea of a hypothetical computing machine influenced von Neumann's work is not clearly brought out in this book. One of the problems is that the narrative jumps back and forth in time as it looks at different topics or personalities, which can make it difficult to piece together a coherent account of which actions preceded or influenced which events. This gives rise to a nagging feeling that the author isn't completely in control of his material, particularly when he presents quotations like this one, which refers to the Teller-Ulam design for thermonuclear weapons [p213]:

"Ulam kept pressing for squeezing the secondary," says Theodore B. Taylor, the gifted Los Alamos bomb designer who was friends with both Ulam and Teller at the same time. "Now whether he did that with the key perception that the inverse Compton effect wouldn't drain the energy, that things would be much closer to equilibrium and that at these high densities you get a fast enough reaction rate and a high enough temperature rise so that it would be very efficient, I don't know who came up with that."

I suppose you can vaguely understand what Taylor is saying here, but you'd still be left wondering about the significance of the 'inverse Compton effect' - or what it is - because Dyson doesn't refer to it at all. Such omissions aren't due to a lack of space - this book is 400 pages long, and elsewhere the author is able to make use of several of them to variously describe the life of William Penn, who was one of the original owners of the land the IAS was built on [p11], what was on the menu at a 1937 lunch meeting of the designers of the main IAS building [p89], and the precise blood chloride levels in either side of the heart of von Neumann's wife after she drowned herself in the Pacific Ocean [p328]. However, when it comes to the technology, Dyson seems to be keener on epigrams which are pithy but not completely uncontentious - e.g. "Evolution in the digital universe now drives evolution in our universe, rather than the other way around." [p289], or "Facebook defines who we are, Amazon defines what we want and Google defines what we think." [p308].

Uncertainties like these detracted from my enjoyment of this book, which still yielded moments of insight, however. For example, having recently started work for a large national organization devoted to weather forecasting, I was interested to see that this was one of the few non-military problems that these very early machines were turned to. Intriguingly, von Neumann was audaciously thinking even further than that: "most [...] meteorological phenomena [...] could be controlled, or at least directed, by the release of perfectly practical amounts of energy" [p161]. He wrote this in October 1945, so it's reasonably obvious what sources of energy he was thinking about; a newspaper article is quoted later on the same page which suggests that "atomic energy might provide a means for diverting, by its explosive power, a hurricane before it could strike a populated place". Now there's an idea...


1 of 1 people found the following review helpful
3.0 out of 5 stars The digital revolution, 5 Nov 2013
By Brian R. Martin (London, UK)
This is a rather frustrating book. The author, like his famous father Freeman Dyson, is obviously interested in, and has detailed knowledge of, a wide variety of scientific and technological subjects, but, alas, he does not have the latter's ability to explain things clearly to non-experts. The statement on the back cover that the book 'can be read as literature whether or not you have any interest in computers and machine intelligence' is simply not true. (Do the authors of 'blurbs' actually read the whole book?) Dyson also seems to find it difficult to decide whether to concentrate on tracing the history of the development of digital computing, a very interesting subject in its own right, or on the people who played a major role in that development. Partly because of this, and the need to follow a particular episode to its conclusion, the book suffers from the lack of a consistent chronology.

The technical story is fascinating and, according to Dyson, has as its core the work done at the Institute for Advanced Study (IAS) at Princeton in developing the ENIAC and MANIAC computers, initially driven largely by the needs of the emerging nuclear weapons programme. The driving force behind this was the extraordinary polymath John von Neumann, who dominates the narrative throughout the book. Turing's name may well be in the title, but he is relegated to a bit player here. The author takes the view that von Neumann and the team at the IAS were the principal inventors of the modern stored program computer. Others would dispute this. Rivals, such as Eckert and Mauchly, as well as the British, get little mention. The problem is that the descriptions of the mathematics, physics, and electronics that underpin this work are usually given in great detail and at a technical level that will often be unintelligible to anyone not already familiar with the material. (We are even told the pin connections of the 6J6 miniature twin triode vacuum tube.)

Far better are the descriptions of the people involved in this heroic period, their personalities and how they interacted. In this narrative, von Neumann is the leading player by far, but contributions came from many others, most of whom are probably unknown to the general public. They included Teller, the arch proponent of the 'super bomb'; the superbly talented, but not formally qualified, engineer Julian Bigelow, who more than anyone translated ideas into reality; the mathematical wizard Metropolis, who pioneered the Monte Carlo method of calculation; and, on the fringe, other characters such as the rather eccentric Barricelli, who ran endless computer experiments on numerical evolution. Even here, though, there is far too much extraneous detail, such as the menu at a lunchtime meeting of a particularly important committee, information about the acquisition of land to build the IAS, and even the price of nappies at the Los Alamos nursery.

The book ends with a discussion based on the author's very speculative interpretation of modern computer developments, in which networks of machines reproduce themselves and become the controllers of mankind rather than its servants. I happen not to share this view, but others may find it interesting. The final chapter rounds off the human stories by recording what happened to many of the players after the death of von Neumann and the closure of the IAS computer project. Not surprisingly, many went into the fledgling computer industry; others founded computer science departments in universities; and von Neumann's wife Klari, who had herself made significant theoretical contributions in the early days, contracted her fourth marriage before drowning in the sea off the Californian coast, in what may well have been a suicide, at the early age of 52, just a year younger than von Neumann himself was when he died of cancer.

The period described was a very important one, during which developments took place that impact on every one of us today. Moreover, the characters involved were intrinsically interesting, and their personal stories could have been woven into an exciting narrative understandable to the layman. Dyson can write well, but by overburdening the reader with details that do not advance the story, he has missed the opportunity to do this.


1 of 1 people found the following review helpful
3.0 out of 5 stars Not very satisfying, 24 May 2013
Verified Purchase
An interesting account but rather thin, padded by lots of biographical details of minor characters that add little to the narrative.


1 of 1 people found the following review helpful
4.0 out of 5 stars Hungarians, Engineers and Mathematicians, 8 May 2013
By Sassenach
Verified Purchase
This proved a very interesting book; it is not mainly about Alan Turing but is virtually a biography of John von Neumann. It tells the story of his work to answer Turing's challenge to create a machine that models human intelligence. We see the creation of the Institute for Advanced Study at Princeton, and its development and growing pains - notably the strain caused by the introduction of engineers to an institution that thought it was all about theory. The way that both theoretical and practical brains were crucial to the development of the early computers - before there were graduates in computer science - is well described. We get good descriptions of many of the other key figures in the story. It is revealing how much the whole project depended on people fleeing Nazi-occupied Europe. And although it could have been written way above the head of a reader without specialised knowledge, the book is in fact fairly accessible to anyone interested enough in the topic to buy it. OK, I may have skipped a few paragraphs here and there, but this did not detract from my appreciation of the book.


1 of 1 people found the following review helpful
4.0 out of 5 stars Turing's Cathedral: One view of the early development of computers, 8 April 2013
Verified Purchase
The core of this book is a useful, informative, and at times exciting, account of early computer projects, with the Institute for Advanced Study at Princeton University as the setting, the development of nuclear weapons as the plot, and John von Neumann as the principal character, by an author who was there (as a child) at the time. The principal action concerns the development of the ENIAC and MANIAC computers, and the book sheds light on the relationships between many of the characters and organisations involved in these projects during the Second World War and the immediate post-war period. However, it has two flaws which lead me to have reservations about it.

The first of these is a certain lack of balance. Despite the title, Alan Turing is given only a minor role, and - despite some acknowledgement of British contributions to both the MANIAC project and other early computers - the author clearly takes the view that von Neumann and the IAS were the principal inventors of the modern stored program computer. This is debatable. British computer developments were ahead of US developments at many stages during this period, including the completion of Colossus ahead of ENIAC, the completion of the Manchester Baby ahead of MANIAC and other early computers, and the introduction of the Ferranti Mark 1 as the first commercially available computer. Von Neumann's "First Draft of a Report on the EDVAC" (1945) was the first published account of the idea of a stored program computer, and gave rise to the term "von Neumann architecture" which is still used today, but the idea had by then been current for a year or two and others, including Turing, were already experimenting with it. It can be argued that storage, or "memory", was the key innovation that allowed computing to develop and, once used for intermediate results during a computation, its use to store programs was an invention waiting to happen. Therefore, the book should be read in conjunction with Andrew Hodges's "Alan Turing: The Enigma" and other books on early computers to arrive at a balanced view.

The second flaw is, unfortunately, more serious. Dyson's view of the "digital universe" is based on his perception of current offerings from companies such as Amazon, Facebook and Google, and on a dystopian interpretation of modern developments in which computers and networks reproduce themselves and become the controllers of mankind rather than its servant - a view more reminiscent of works of science fiction such as The Matrix than of serious history. Several of the later chapters contain uncritical discussions of this theme. Dyson argues that computers have influenced human behaviour - and so, of course, has every other new technology - but he also says "Facebook defines who we are; Amazon defines what we want; Google defines what we think." Really? We are just waking up to the fact that these companies pay little or no tax in the UK but, given that their current services are easily fooled, perhaps we don't need to worry about them taking over our minds just yet.


5 of 6 people found the following review helpful
3.0 out of 5 stars Could have been so much better, 13 April 2012
Verified Purchase
I had previously read "Alan Turing: The Enigma", so I didn't really pay too much attention to the implied notion that Turing was a rather odd chap who didn't actually contribute an awful lot because he was British. As knowledgeable people know, according to Winston Churchill, Turing made the single biggest contribution to Allied victory in the war against Nazi Germany.
I enjoyed sections of this book but found myself skipping page after page of background information in which I really couldn't detect any relevance. Where this book is good is in getting into the hardware and software and what the machines were actually used for. The book would benefit from stripping out 100 pages of material like the origins of the land the IAS was built on and the personal attributes of people like Klari von Neumann, plus a logical ordering of the story; it would probably be worth 5 stars then, as the material is there.


3.0 out of 5 stars Some prominent figures in science, 1 Mar 2014
By G. HAYWOOD (Lincolnshire - England)
Verified Purchase
An informative but not overly interesting book. It would have maximum appeal to someone with a specific interest in prominent figures in the history of science and mathematics. It did, however, 'have its moments'.


This product: Turing's Cathedral: The Origins of the Digital Universe by George Dyson (Hardcover - 1 Mar 2012)