I was looking for a book that provided a good introduction to computing suitable for readers not familiar with the field. This book was written in 1998, just after the IBM Deep Blue rematch. Despite its age, it remains remarkably relevant and forward-looking, with its coverage of parallel computing, quantum computers, genetic algorithms and back-in-fashion neural networks - there is very little missing. I would recommend it to anyone looking for a short (~150 pages), accessible read to learn about fundamentals, architecture and some research areas.
Some aspects seem loosely prescient. The statement "The pseudorandom number generator can pass all normal statistical tests of randomness" is a useful reminder, particularly after the shameful backdooring of Dual_EC_DRBG. The game Go gets a brief mention since, at the time of Deep Blue, humans were at least still masters of this game. It's unstated, but perhaps there's an implied question of how long that would remain the case. The one thing it misses is the escape of the graphics co-processor from the specialist world of Silicon Graphics Inc to the commodity world of the home desktop market. In the last decade or so, these cost-effective GPUs have been 'borrowed' for other important tasks like neural network training.
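To make the quoted point concrete - this is my own sketch, not anything from the book - a seeded generator can look statistically random while being completely reproducible by anyone who knows the seed:

```python
# A minimal illustration of the book's point: a pseudorandom generator can
# pass a crude statistical test of randomness, yet its output is entirely
# determined by its seed. (Python's Mersenne Twister stands in here for any
# PRNG; the seed value 42 is arbitrary.)
import random

gen1 = random.Random(42)                      # "secret" seed
stream = [gen1.random() for _ in range(10_000)]

# Crude statistical test: uniform [0, 1) draws should average about 0.5.
mean = sum(stream) / len(stream)
assert abs(mean - 0.5) < 0.02

# Yet the stream is perfectly predictable given the seed:
gen2 = random.Random(42)
assert [gen2.random() for _ in range(10_000)] == stream
print("looks random, but is fully reproducible from the seed")
```

This is exactly why the Dual_EC_DRBG backdoor mattered: statistical tests cannot distinguish a good generator from one whose internal state is known to an attacker.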
If I were nitpicking, I'd say the paragraph on adaptive automatic pilot systems is very futuristic and doesn't contrast the idea with the current challenges of software reliability, usually met with modular designs, extensive testing and some degree of formal verification. Continuous learning is an interesting area, with active research on 'catastrophic forgetting'. It's going to be a long, long time before an ANN can apply skills/experience from one problem to another. (If Amazon is still around in 20 years I'll be back to comment on this.)
It's very good value now in the second-hand market! I was surprised to see an old library stamp in a copy I purchased. I'd say that library made a mistake in disposing of this.
Danny Hillis is probably best known as the inventor of the Connection Machine, a massively parallel computer which was manufactured by his company Thinking Machines in the 1980s. In this book, he tackles the problem of explaining how computers work, using simple, direct language and examples which are accessible to the layman. This could be viewed as challenging, since most people (even those of us who work with them all the time) view these machines as too complicated to understand, and take them for granted in the same way as we do the cars we drive every day, for example.
The author rises to this challenge very well, building his explanation from the ground up, starting with an account of Boolean logic - i.e. the construction and manipulation of AND, OR and INVERT functions - that's firmly rooted in concrete examples (he points out that, although these functions are invariably implemented using electrical signals in a circuit, they could equally well be built using sticks and strings, or water-operated valves). Having laid down this foundation, he is able to move to more high-level topics such as programming, algorithms, heuristics, parallel computing, data encryption and compression, and adaptive systems, ending up with a lucid discussion about whether it will be possible one day to build a computer which could be described as (in a nod to the name of his old company) a thinking machine.
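The sticks-and-strings point is that the medium doesn't matter, only the logic. As a sketch of my own (not code from the book), the same three primitives can be written as plain functions and wired into a one-bit adder with nothing else:

```python
# Hillis's three Boolean building blocks as ordinary Python functions.
def AND(a, b): return a and b
def OR(a, b): return a or b
def INVERT(a): return not a

# Exclusive-or, built only from the three primitives above:
# true when at least one input is set but not both.
def XOR(a, b):
    return AND(OR(a, b), INVERT(AND(a, b)))

def half_adder(a, b):
    """Add two one-bit numbers; returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit 0, carry bit 1.
print(half_adder(True, True))   # -> (False, True)
```

Replace the function calls with circuits, hydraulic valves or knotted string and the computation is unchanged - which is the book's whole argument in miniature.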
In spite of the abstruseness of these later subjects, he never leaves the reader behind, being careful to explain new ideas in simple terms that are easily understood. For example, he quotes the philosopher Gregory Bateson's definition of information as 'the difference that makes a difference', and points out (p10) how this could be applied to a binary signal, or bit. Elsewhere, he remembers (p110) a talk he gave in a New York hotel in the 1970s where he predicted that there would soon be more microprocessors than people in the USA. This caused one of his listeners to ask sarcastically, "Just what do you think people are going to do with all these computers? It's not as if you needed a computer in every doorknob!" - a humorous remark at the time, but one which has come true today, since each doorknob in that hotel (and thousands of others) contains a microprocessor which controls the lock. It's touches like this that make this a compelling read, giving the reader penetrating insights into the workings and development of these ubiquitous machines.
Bought to check whether there were any corrections or additions, but it's the same edition as the original hardback, without corrections. Still a great intro to Boole's logic as the core of computation, running up to his bicentenary, albeit with some less-than-ideal illustrations at its core (the traffic lights!).
I really enjoyed it - perhaps more than I expected. Clarity and precision are too often missing in computing, thanks to an inability to explain things in layman's terms. Definitely recommended. Just to explain why not 5 stars: it was published a while ago and things have moved on, but it is still useful.
This book takes an extraordinary look at computers from angles you would never have looked at a computer from before. It can be read and appreciated by anyone with any level of knowledge of computers. I personally had several years of programming experience before reading this and have to admit I learned something new with each page turned. Even information I already knew was put to me in ways I had never looked at before; for example, Hillis explains in a non-jargon, easy-to-understand manner how computers could be built from sticks and string. Hillis covers almost all aspects of computing in this text, but without being too specific about the technicalities of each area. For example, if you know nothing about programming, he'll explain the theory behind it without referring to the syntax used. This book will appeal to anyone who has an interest in computers. As I say, it can be read by anyone (I could give this book to my granny and she wouldn't get too lost, yet I could give it to a computer scientist and he would begin to look at his work in a different way). It would be an ideal text for anyone about to study computing/computer science courses at uni or college, as it lays down the foundations nicely. A MUST READ!!
Worth the money just for his three-page discussion of simulated evolution (the software he 'evolved' for sorting lists of numbers is faster than any algorithm he can write, but he has no idea why it works).
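The book doesn't show Hillis's actual program, but as a toy sketch in the same spirit (entirely my own; the population sizes, mutation scheme and list length are arbitrary choices), a genetic algorithm can evolve a small "sorting network" - a fixed sequence of compare-and-swap steps - scored by how many random lists it sorts correctly:

```python
# Toy simulated evolution of a sorting network, in the spirit of (but far
# simpler than) the experiment described in the book.
import random

N = 5                     # length of the lists to sort
random.seed(0)            # make the demo reproducible

def apply_network(net, xs):
    """Run a list through a sequence of compare-and-swap 'genes'."""
    xs = list(xs)
    for i, j in net:
        if xs[i] > xs[j]:
            xs[i], xs[j] = xs[j], xs[i]
    return xs

def fitness(net, trials=50):
    """Score: how many random lists the network sorts correctly."""
    score = 0
    for _ in range(trials):
        xs = [random.random() for _ in range(N)]
        score += apply_network(net, xs) == sorted(xs)
    return score

def random_gene():
    i, j = sorted(random.sample(range(N), 2))
    return (i, j)

def offspring(a, b):
    child = a[:6] + b[6:]                               # one-point crossover
    child[random.randrange(len(child))] = random_gene() # point mutation
    return child

# Evolve: keep the 10 fittest networks, breed 30 children from them.
pop = [[random_gene() for _ in range(12)] for _ in range(40)]
for generation in range(60):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]
    pop = survivors + [offspring(random.choice(survivors),
                                 random.choice(survivors))
                       for _ in range(30)]

best = max(pop, key=fitness)
print(fitness(best), "/ 50 random lists sorted")
```

As in the book's anecdote, the winning network is just a flat list of swaps - it works (to whatever degree it evolved to) without any human-readable explanation of *why* those particular comparisons were chosen.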