10 of 13 people found the following review helpful
2.0 out of 5 stars
A disheartening experience, especially for the layperson, 29 Aug. 2013
A popular exposition of Turing's daring paper on computable numbers would be a welcome addition to any interested party's library, since the paper is often held up as "the foundation of modern computer science." This book, despite the author's stated desire and indeed the volume's format, fails to provide the layperson with an understandable or pleasant acquaintance with any topic pertaining to computer logic or programming methodology.
Beyond the general style, the way the concepts are presented seems wilfully obtuse. The persistent failure to analogize any of the concepts in Turing's work is infuriating on its own, but this book also commits the cardinal sin of technical writing: it explains deceptively simple ideas with repetitive sentences and frequently uses imprecise language that only breeds confusion. More recent programming concepts are laced into explanations of Turing machines without any specific reference to them, or to anything like them, in the paper itself. Indeed, explanations of these modern concepts themselves seem to have gone missing. Most bewildering of all, however, the author tries to explain subroutines and, later, a branching algorithm (as well as other processes) without any diagram whatsoever, tree or otherwise. I am baffled as to how a layperson could be expected to follow this.
I eventually reread my O'Reilly book on understanding computation, read a free e-book from Macquarie University on computer and programming logic, and then read Turing's paper on its own. I am consequently of the opinion that someone with all of Petzold's enthusiasm and programming knowledge, working with the fascinating nature of Turing's thought experiments, should have found it difficult to construct so monotonous a book on the subject. To reiterate: Petzold describes programming methodology WITHOUT diagramming the process in any way; only verbose sentences are used. It must be the result of some form of madness.
Quoting from the back cover: "From his use of binary numbers to his exploration of concepts that today's programmers will recognize as RISC processing, subroutines, algorithms and others, Turing foresaw the future..." If you already understand these concepts and can use them in software creation, then this book may provide a nice exploration of Turing's work; you could, however, just read the original paper unabridged for free online. If you are unaware of the concepts in that quote, this book is practically worthless.