Most of us take the history of computing completely for granted. As with other everyday objects that have always just "been there," such as toilets, toothbrushes and cars, we keep using computers with very little regard for how much had to happen for us to have our now seemingly indispensable cell phones, wireless laptops and interactive web pages. With computing, so much has happened in so little time that some may have difficulty recognizing our current technological state in its predecessors. How did the now quaint-looking Altair 8800 become our modern-day laptop? How did the pipe-organ-sized ENIAC evolve into the microprocessor? The answers remain murky and nebulous without researching the relatively recent past. Still, the history of computing swells with so much information, invention, misdirection, dead ends and success that writing anything but a strictly chronological history seems nothing less than quixotic.
That's where the remarkably compact - in the spirit of the microprocessor - "Computing: A Concise History" enters. In under 200 pages (including appendices and index), this very readable book will arm even the most technologically disinterested with a decent overall picture of how computing evolved from Babbage to Twitter. It also remains cognizant that the history of computing has evolved as much as computing itself. It speaks volumes that the printed book cannot keep up with the electronic digital world: technology books become obsolete almost while they're being written, and even this one has aged since it appeared in 2012. Though it's definitely more current than other available histories, expect no discussion of "The Cloud" or other bleeding-edge technologies of the present moment. No doubt this book will itself seem quaint in five years' time.
But that's just the nature of the game. No history of computing will ever seem complete, so one must simply start reading, and this book probably serves as the best starting point for the subject at this time. To make the history of computing a little more coherent, the book breaks it into four distinct "threads" that overlap and coexist: "The Digital Paradigm," "Convergence," "Solid State Electronics" and "The Human Machine Interface." These threads weave through the narrative, though it remains questionable whether they actually help organize the story of computing, even if they do convey a high-level, abstract picture of the whole. In the end, this book is too short to get into gritty details on such a massive subject. Nonetheless, it excels as a brief history.
The story begins during WWII, when the need to calculate gun trajectories arose under the threat of Blitzkrieg. Here, in 1942, George Stibitz coined the word "digital," based on finger counting (or "digit" counting). The story then reaches back to historical computing techniques such as pebbles, the abacus and the ingenious machines of Pascal and Babbage, all the way up to the punch cards used for the 1890 US census and the ticker tape tabulator of 1914. This solidifies the well-known fact that the history of computers intertwines with the history of counting and calculating. More big innovations fly by in the years 1935-1945: Alan Turing's famous and amazing work on a theoretical "computing machine" in 1936; Zuse's use of base 2 over base 10 mathematics in 1937; Atanasoff's pioneering use of algorithms in 1938; and Bletchley Park's mysterious, and supposedly destroyed, "Colossus," which processed text around 1943. It all culminated in 1946 with ENIAC, the gigantic, somewhat inflexible, vacuum-tube-strewn supercomputer from which comes the phrase "to program."
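An aside for readers wondering what "base 2 over base 10" actually buys a machine builder: a two-state switch (a relay, tube or transistor) holds exactly one binary digit, so any number a decimal-minded human cares about must be recast as a string of on/off states. Here is a minimal sketch of that recasting in Python; the code is my own illustration, not anything from the book:

```python
# Repeated division by 2 turns a decimal integer into binary digits,
# i.e. the on/off switch states a machine like Zuse's works with.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder = next lowest bit
        n //= 2                  # shift right one binary place
    return "".join(reversed(bits))

print(to_binary(1937))  # -> '11110010001'
```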
One major innovation that reacted to ENIAC was the "stored program principle," which allowed greater system flexibility. This principle stated that data and programs were basically the same thing and could be stored on the same devices (a toy sketch of the idea appears after this paragraph). This and other breakthroughs bred UNIVAC, time-sharing mainframes, IBM, Fortran, COBOL, operating systems, transistors to replace clumsy vacuum tubes, networking, ARPANET and eventually TCP/IP, on which the internet still runs today. In parallel, chips became smaller and smaller thanks to the silicon-based integrated circuit and more modularized processor design. This and the microprocessor helped launch the personal computer revolution that continues today in cellphone form. The 1975 Altair made computing a personal matter, and Microsoft rose to the challenge of writing software for it. Some still-familiar names also appeared, including Apple, Commodore, the IBM PC and pioneering business/consumer software such as VisiCalc, Lotus 1-2-3 and DOS. Windowed interfaces, LANs, laser printers, GUIs, icons, Ethernet and more originated with the much-glorified, but nonetheless commercially unsuccessful, Xerox PARC. Utilizing Doug Engelbart's mouse and other elements, PARC more or less created the modern PC look and feel. But others ultimately profited from it - and still profit from it.
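Circling back to the stored program principle: the idea is easiest to see in miniature. The following toy fetch-execute loop is my own sketch of a hypothetical three-instruction machine, not anything from the book. A handful of instructions and the numbers they operate on share one memory array, so swapping in a new program is just writing new data:

```python
# A toy illustration of the stored-program principle: instructions and
# data sit in the same memory, so a program can be loaded, inspected,
# or even modified like any other data.
memory = [
    ("LOAD", 6),    # addr 0: put the value at address 6 in the accumulator
    ("ADD", 7),     # addr 1: add the value at address 7
    ("STORE", 8),   # addr 2: write the accumulator to address 8
    ("HALT", None), # addr 3: stop
    None, None,     # addrs 4-5: unused
    40, 2,          # addrs 6-7: data, stored alongside the code
    0,              # addr 8: result goes here
]

acc, pc = 0, 0  # accumulator and program counter
while True:
    op, addr = memory[pc]  # fetch the next instruction *from memory*
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])  # -> 42
```

Because the program lives in ordinary memory, running something different means simply overwriting addresses 0-3, which is precisely the flexibility ENIAC's rewire-it-by-hand setup lacked.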
All of which leads to the more familiar present day, where Windows and Apple remain the dominant players in an internet-saturated computing environment that produces smaller and more powerful devices each year. As ARPANET morphed into the internet as we know it, thanks to its mid-1990s privatization, entrepreneurs such as AOL and CompuServe made access easier. In 1991 the World Wide Web was created at CERN, and everything changed. Current mainstays Amazon and eBay, made possible by Netscape's Secure Sockets Layer (SSL) encryption circa 1994, appeared in 1995, and "blogs" and social networking followed in the late 1990s. Google, Facebook and Twitter seem to reign at the moment, but that could change at any time, impossible as it may seem. And everything started to shrink after the first handheld phone call in 1973 (though it took a while in non-cosmic time). BlackBerrys and PDAs gave way to iPhones and smartphones equipped with almost everything a PC has except the large screen. Somewhere in there, e-mail also gave partial way to texting (though the book doesn't mention the now-prevalent act of texting). Of course, many things were left out due to space.
As of this writing, "Computing: A Concise History" stands as the obvious choice for those looking to learn about the origins of this now ubiquitous digital culture. Some may bewail the absence of, or lack of emphasis on, such huge topics as Java, open source, e-books, Flash, alternative media, AI, forums, YouTube, video games, online "revolutions" and countless other emerged and emerging brands, technologies, events and trends. A thousand such histories could be written. As the book approaches the present, the pace becomes understandably frenetic, as though no organizing principle exists for what is unfolding right in front of us. This seems accurate: only hindsight may allow a sober analysis of what really mattered in the here and now. As such, this book looks back more than a century and tries to extrapolate what mattered in all of those previous present moments. And though it doubtless doesn't tell the whole story, it tells enough to give readers a coherent picture of the almost incomprehensible rise of the computing machines.