Computing: A Concise History (MIT Press Essential Knowledge) [Kindle Edition]

Paul E. Ceruzzi
5.0 out of 5 stars (1 customer review)

Print List Price: £9.95
Kindle Price: £4.32 includes VAT* & free wireless delivery via Amazon Whispernet
You Save: £5.63 (57%)
* Unlike print books, digital books are subject to VAT.


Formats (Amazon price)

  • Kindle Edition: £4.32
  • Paperback: £8.49


Product Description

Review

"It's a delightful small book, very nicely produced and with illustrations, perfect for a journey or to slip in a pocket for commuting. It's also, in 150 pages, a super overview of the history of this utterly transformational technology..."--Diane Coyle, The Enlightened Economist

"For those interested in the fundamentals of computer history, Computing: A Concise History navigates a complex world with in-depth, authoritative coverage in terms accessible to the non-expert."--John F. Barber, Leonardo Reviews

Product Description

The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of "smart" hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective. He identifies four major threads that run throughout all of computing's technological development: digitization--the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by "Moore's Law"; and the human-machine interface.

Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word "digital" in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general purpose computer; and ARPANET, the Internet's precursor.

Ceruzzi's account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a "minicomputer" to a desktop computer to a pocket-sized smart phone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.
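The first of Ceruzzi's threads, digitization, is the idea that information of any kind can be coded in binary form, as ones and zeros. As a minimal illustration of that idea (this sketch is not drawn from the book itself), here is a short Python example that encodes a word into its binary ASCII representation and decodes it back:

```python
# Minimal sketch of the "digital paradigm": information coded as ones and zeros.
# Illustrative only; the helper names to_bits/from_bits are our own.

def to_bits(text: str) -> str:
    """Encode each character as an 8-bit binary string."""
    return " ".join(format(ord(ch), "08b") for ch in text)

def from_bits(bits: str) -> str:
    """Decode space-separated 8-bit binary strings back to text."""
    return "".join(chr(int(b, 2)) for b in bits.split())

encoded = to_bits("MIT")
print(encoded)             # 01001101 01001001 01010100
print(from_bits(encoded))  # MIT
```

The same principle scales up: text, images, sound, and programs alike are all ultimately stored and processed as strings of bits.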

Product details

  • Format: Kindle Edition
  • File Size: 1481 KB
  • Print Length: 216 pages
  • Publisher: The MIT Press (15 Jun 2012)
  • Sold by: Amazon Media EU S.à r.l.
  • Language: English
  • ASIN: B008CJ4EE4
  • Text-to-Speech: Enabled
  • Word Wise: Not Enabled
  • Average Customer Review: 5.0 out of 5 stars (1 customer review)
  • Amazon Bestsellers Rank: #70,093 Paid in Kindle Store



Customer Reviews

5.0 out of 5 stars (1 customer review)
  • 5 star: 1
  • 4 star: 0
  • 3 star: 0
  • 2 star: 0
  • 1 star: 0
Most Helpful Customer Reviews
2 of 2 people found the following review helpful
5.0 out of 5 stars Small, Cheap and full of knowledge! 16 July 2014
Format: Paperback
This little pocket book is brilliant!

Would recommend it for computing enthusiasts and hardcore CS students.
Most Helpful Customer Reviews on Amazon.com (beta)
Amazon.com: 4.0 out of 5 stars (11 reviews)
5 of 5 people found the following review helpful
4.0 out of 5 stars The chaos from which the digital world arose and still exists in... 5 Jun 2013
By ewomack - Published on Amazon.com
Format: Paperback
Most of us take the history of computing completely for granted. As with other everyday objects that have always just "been there," such as toilets, toothbrushes and cars, we continue to use them with very little regard for how much had to happen for us to have our now seemingly indispensable cell phones, wireless laptops and interactive web pages. In computing, so much has happened in so little time that some may have difficulty recognizing our current technological state in its predecessors. How did the now quaint-looking Altair 8800 become our modern-day laptop? How did the pipe-organ-sized ENIAC evolve into the microprocessor? The answers remain murky and nebulous without researching the relatively recent past. Still, the history of computing swells so much with information, inventions, directions, dead ends and successes that writing anything but a strictly chronological history seems nothing less than quixotic.

That's where the remarkably compact - in the spirit of the microprocessor - "Computing: A Concise History" enters. In under 200 pages (including appendices and index), this very readable book will arm even the most technologically disinterested with a decent overall picture of how computing evolved from Babbage to Twitter. It also remains cognizant that the history of computing has evolved as much as computing itself. It speaks volumes that the printed book cannot keep up with the electronic digital world: technology books become obsolete almost while they're being written, and even this one has aged since it appeared in 2012. Though it's definitely more current than other histories available, expect no discussion of "The Cloud" or the other most bleeding-edge technologies of the present moment. No doubt this book will itself seem quaint in five years' time.

But that's just the nature of the game. No history of computing will ever seem complete, so one must simply start reading. And this book probably serves as the best starting point for this subject at this time. To help make the history of computing a little more coherent, the book breaks it down into four distinct "threads" that overlap and co-exist: "The Digital Paradigm," "Convergence," "Solid State Electronics" and "The Human Machine Interface." These threads weave through the narrative, though it remains questionable whether they actually help organize the story of computing, however well they give a high-level, abstract idea of the whole. In the end, this book is too short to get into gritty details on such a massive subject. Nonetheless, it excels as a brief history.

The story begins during WWII, when the need to calculate gun trajectories arose under the threat of Blitzkrieg. Here, in 1942, George Stibitz coined the word "digital," based on finger counting (or "digit" counting). Then the story reaches back to historical computing techniques such as pebbles, the abacus and the ingenious machines of Pascal and Babbage, all the way to the punch cards used for the 1890 US census and the ticker tape tabulator of 1914. This solidifies the well-known fact that computer history intertwines with the history of counting and calculating. More big innovations fly by in the years 1935-1945: Alan Turing's famous and amazing work on a theoretical "computing machine" in 1936, Zuse's adoption of base 2 over base 10 mathematics in 1937, Atanasoff's pioneering machine of 1938 and Bletchley Park's mysterious, and supposedly destroyed, "Colossus," which processed text during the war. It all culminated with ENIAC in 1946, the gigantic, and somewhat inflexible, vacuum-tube-strewn supercomputer from which comes the phrase "to program."

One major innovation that reacted to ENIAC was the "stored program principle," which allowed greater system flexibility. This principle stated that data and programs were basically the same thing and could be stored on the same devices. This and other breakthroughs bred the UNIVAC, time-sharing mainframes, IBM, Fortran, COBOL, operating systems, transistors to replace clumsy vacuum tubes, networking, ARPANET and eventually TCP/IP, on which the internet still runs today. In parallel, chips became smaller and smaller thanks to the silicon-based integrated circuit and more modularized processor design. This, together with the microprocessor, helped launch the personal computer revolution that continues today in cellphone form. The 1975 Altair made computing a personal matter, and Microsoft rose to the challenge of writing software for it. Still-familiar names also appeared, including Apple, Commodore, the IBM PC and pioneering business/consumer programs such as VisiCalc, Lotus 1-2-3 and DOS. Windows, LANs, laser printers, GUIs, icons, Ethernet and more originated with the much-glorified, but nonetheless commercially unsuccessful, Xerox PARC. Utilizing Doug Engelbart's mouse and other elements, PARC more or less created the modern PC look and feel. But others ultimately profited from it - and still profit from it.

All of which leads to the more familiar present day, where Windows and Apple remain the dominant players in an internet-saturated computing environment that produces smaller and more powerful devices each year. As ARPANET morphed into the internet as we know it, thanks to 1995 internet governance legislation, many entrepreneurs, such as AOL and CompuServe, made access easier. In 1991 the World Wide Web was created at CERN and everything changed. Current mainstays Amazon and eBay, made possible by Netscape's Secure Sockets Layer (SSL) encryption, appeared in 1995, and "blogs" and social networking followed in the late 1990s. Google, Facebook and Twitter seem to reign at the moment, but that could change at any time, impossible as it may seem. And everything started to shrink after the first handheld phone call in 1973 (though it took a while in non-cosmic time). Blackberries and PDAs gave way to iPhones and "smartphones" equipped with almost everything a PC has except the large screen. Somewhere in there, e-mail also partially gave way to texting (though the book doesn't mention the now-prevalent act of texting). Of course many things were left out due to space.

As of this writing, "Computing: A Concise History" stands as the obvious choice for those looking to learn about the origins of this now ubiquitous digital culture. Some may bewail the absence or lack of emphasis on such huge topics as Java, Open Source, eBooks, Flash, alternate media, AI, forums, YouTube, video games, online "revolutions" and countless other emerged and emerging brands, technologies, events and trends. A thousand such histories could be written. As the book approaches the present, the pace becomes understandably frenetic, as though no organizing principle exists for what is unfolding right in front of us. This seems accurate. Only hindsight may allow a sober analysis of what really mattered in the here and now. As such, this book looks back more than a century and tries to extrapolate what mattered in the history of computing in all of those previous present moments. And though it doubtless doesn't tell the whole story, it tells enough to give readers a coherent picture of the almost incomprehensible rise of the computing machines.
6 of 7 people found the following review helpful
3.0 out of 5 stars Brief introduction to history of digital computers 21 Aug 2012
By E. Jaksetic - Published on Amazon.com
Format: Paperback
This short book provides one perspective on the history of the digital computer.

In the Introduction, the author states his premise that there are four major threads or themes that "run through" the history of the digital computer: (1) the digital paradigm, by which binary code is used for "coding information, computations, and control"; (2) digital computers reflect the convergence of a variety of different technologies, devices, and machines; (3) the history of digital computers has been "driven by a steady advance of underlying electronics technology"; and (4) the issue of the human-machine interface has raised philosophical issues about the nature and role of digital computing in society.

In the first chapter, the author briefly discusses the origins of digital computers in mechanical computing devices of the 1600s-1800s, punch cards that were first used in the 1800s to control weaving looms and later adapted to code information, and more modern technology such as the telegraph, telephone, and early electrical devices. The rest of the book covers the development and evolution of digital computers from 1935 to the present. The author uses his four major threads or themes to organize his discussion of the development and evolution of digital computers.

The book provides an adequate introduction to the history of digital computers. The book is written for the general public, so a reader does not need any technical training or expertise in digital computers to read and understand this book. Given the basic, introductory level of this book, it would be appropriate for high school or college students, or anyone with a casual interest in the history of digital computers.

Anyone interested in more detailed histories of computers should look at the following books: Paul E. Ceruzzi, A History of Modern Computing (History of Computing); John Orton, Semiconductors and the Information Revolution: Magic Crystals that Made IT Happen; Frederick Seitz & Norman G. Einspruch, Electronic Genie: The Tangled History of Silicon; and Michael R. Williams, A History of Computing Technology, 2nd Edition.
3 of 3 people found the following review helpful
4.0 out of 5 stars Reader's Digest History of Computing 29 Nov 2013
By Edward Meade - Published on Amazon.com
Format: Kindle Edition | Verified Purchase
There may be some quibbling with terms and exactly who did what when, but if you want to get the broad outline of how we got to where we are (and you don't want to spend an entire semester doing it), this is the book for you. It's a one-day, couple-of-hours read that covers most of the big points.
2 of 2 people found the following review helpful
5.0 out of 5 stars Precisely what it claims to be: a concise history of computing. 15 Oct 2012
By Jerry Saperstein - Published on Amazon.com
Format: Paperback
Paul E. Ceruzzi provides the history of computing as we know it, including precursor machines and theories. This is a very pared-down book, covering only the highlights of the technology. Basic technological terms, inventions and methodologies are explained just to the point where a lay reader should be able to grasp them.

Major developments together with the major players are identified and described.

People unfamiliar with the history of computing who need a convenient resource containing it (e.g., journalists and writers) will find this little volume helpful. For those more familiar with the technology, it's a good resource for looking up names and dates, a kind of reminder.

Overall, a valuable addition to a tech library, particularly for anyone who writes in any way about the subject.

Jerry
1 of 1 people found the following review helpful
5.0 out of 5 stars Concise = Bird Flight perspective 29 Jan 2014
By Dmytro - Published on Amazon.com
Format: Paperback | Verified Purchase
Easy-to-read book. New to me were the four threads that run through the history of computing:
(1) the digital paradigm
(2) convergence
(3) advancement in solid-state electronics
(4) human-machine interface

DSRemotelab