Turing's Cathedral: One view of the early development of computers
on 8 April 2013
The core of this book is a useful, informative and, at times, exciting account of early computer projects, with the Institute for Advanced Study at Princeton University as the setting, the development of nuclear weapons as the plot, and John von Neumann as the principal character, by an author who was there (as a child) at the time. The principal action concerns the development of the ENIAC and MANIAC computers, and the book sheds light on the relationships between many of the characters and organisations involved in these projects during the Second World War and the immediate post-war period. However, it has two flaws which lead me to have reservations about it.
The first of these is a certain lack of balance. Despite the title, Alan Turing is given only a minor role, and - despite some acknowledgement of British contributions to both the MANIAC project and other early computers - the author clearly takes the view that von Neumann and the IAS were the principal inventors of the modern stored program computer. This is debatable. British computer developments were ahead of US developments at many stages during this period, including the completion of Colossus ahead of ENIAC, the completion of the Manchester Baby ahead of MANIAC and other early computers, and the introduction of the Ferranti Mark 1 as the first commercially available computer. Von Neumann's "First Draft of a Report on the EDVAC" (1945) was the first published account of the idea of a stored program computer, and gave rise to the term "von Neumann architecture", which is still used today, but the idea had by then been current for a year or two and others, including Turing, were already experimenting with it. It can be argued that storage, or "memory", was the key innovation that allowed computing to develop and that, once it was used for intermediate results during a computation, its use to store programs was an invention waiting to happen. Therefore, the book should be read in conjunction with Andrew Hodges's "Alan Turing: The Enigma" and other books on early computers to arrive at a balanced view.
The second flaw is, unfortunately, more serious. Dyson's view of the "digital universe" is based on his perception of current offerings from companies such as Amazon, Facebook and Google, and on a dystopian interpretation of modern developments in which computers and networks reproduce themselves and become the controllers of mankind rather than its servants - a view more reminiscent of works of science fiction such as The Matrix than of serious history. Several of the later chapters contain uncritical discussions of this theme. Dyson argues that computers have influenced human behaviour - and so, of course, has every other new technology - but he also says "Facebook defines who we are; Amazon defines what we want; Google defines what we think." Really? We are just waking up to the fact that these companies pay little or no tax in the UK but, given that their current services are easily fooled, perhaps we don't need to worry about them taking over our minds just yet.