Profile for Jeremy E. May > Reviews

Content by Jeremy E. May
Top Reviewer Ranking: 348,392
Helpful Votes: 75

Reviews Written by Jeremy E. May "Software pro" (Fort Lauderdale, Florida)

Treason: Liberal Treachery from the Cold War to the War on Terrorism
by Ann Coulter
Edition: Paperback
Price: £12.04

3.0 out of 5 stars Devastating sarcasm and rejection of liberal assumptions, 4 Aug. 2017
Verified Purchase
Fiery American Fox News contributor Ann Coulter delivers a crushing series of putdowns in her book Treason, in effect a master class in how to mock, ridicule and belittle one’s political opponents. The targets of her impassioned wrath are exclusively liberals: not just Democrats but also soppy Republicans who side with them, plus leftist journalists, actors and film makers, judges, lawyers and academics. All these are guilty - in Coulter’s view - of ‘treason’, a loose and subjective (to put it mildly) interpretation of the meaning of the term. Most notably, she offers a highly spirited defence of Senator Joseph McCarthy and his bitterly controversial campaign to root out communist sympathisers in the US State Department and military during the early 1950s. Certainly McCarthy was not alone and by no means the first to take up the anti-communist cause, let alone solely responsible for the ‘Red Scare’ which so typified that era. By the time he made his maiden speech on the subject in early 1950, both the House Committee on Un-American Activities (HUAC) and J. Edgar Hoover’s FBI had been intensively monitoring pro-communist activity in the federal government for several years. In 1947 President Truman issued an executive order making permanent the loyalty security program he had earlier set up to uncover security risks among government employees. Shortly thereafter, HUAC investigated former Roosevelt aide and State Department official Alger Hiss following accusations he had spied for the Russians; he was subsequently convicted of perjury, and at around the same time the Internal Security Act - which placed severe restrictions on the political activities of communists in the US - sailed through Congress. Meanwhile in Britain, ex-Manhattan Project scientist Klaus Fuchs received a 14-year jail sentence for imparting A-bomb secrets to the Soviets; his confession helped expose Americans Julius Rosenberg, his wife Ethel (both executed in the US three years later) and brother-in-law David Greenglass.

Was McCarthy unjustly maligned when labelled a reckless, unprincipled demagogue? Did he deserve the censure motion imposed on him by his senatorial colleagues in late 1954, which effectively ended his political career and undoubtedly contributed to his premature demise (from alcoholism) three years later? Does he merit rehabilitation today? Coulter makes a good case that he was a fundamentally decent man, a highly articulate and sincere patriot, a committed upholder of American values. The decryption of Soviet secret intelligence messages conducted by the Venona Project from 1943 to 1980 (only published in 1995) has revealed the true extent of communist espionage during the Cold War and provided at the very least a partial vindication of McCarthy’s campaign. Coulter doesn’t make this point, but it is clear in retrospect that he was guilty only of miscalculation - of being a one-trick political pony who increasingly misread the public mood and should, after three years’ selfless dedication to highlighting communist subversion, have passed the baton to others and shifted his focus to alternative, equally pressing, national issues.

The rest of Treason is devoted to much more recent events, in particular the post-9/11 war on terror. Here, Coulter’s excoriation of liberals reaches a peak as she most eloquently ridicules the mindset of all who question the wisdom of combating radical Jihadism with every means at our disposal. But her denigration of those (especially Democrats like Al Gore) who justifiably opposed President George W. Bush’s stated policy of invading Iraq to depose Saddam Hussein - especially when the US was still bogged down fighting the Taliban in Afghanistan and no convincing evidence yet existed that Saddam either possessed weapons of mass destruction or was in cahoots with Al Qaeda terrorists - represents a rash rush to judgement on her part which can only be excused on the grounds that Treason was written too early (i.e. in 2002) to take into account the negative consequences of the subsequent invasion. Above all, her repeated assertion that Republican presidents (unlike Democrat ones) don’t usually start wars is directly contradicted by Bush’s foolhardy decision to go to war on this occasion.


Great Britain's Great War
by Jeremy Paxman
Edition: Paperback
Price: £7.78

1 of 1 people found the following review helpful
3.0 out of 5 stars Informative and well-written, 15 Jun. 2017
Verified Purchase
Celebrated British TV presenter and author Jeremy Paxman has some fascinating things to tell us in this well-written and highly informative survey of how World War One affected various sectors of the British population. He pays tribute, for instance, to the remarkable role of Harold Gillies, the pioneer of modern plastic surgery, in reconstructing the often gruesomely shattered faces of badly wounded soldiers, enabling them to hold their heads high in public without undue embarrassment to themselves and others. He vividly narrates the infamous antics of some British women in harassing young males in civilian clothes (on the mistaken assumption that the latter hadn’t signed up to fight) - an experience my own maternal grandfather, an artillery officer, personally underwent while on leave from the Western Front. The transformative effect on the social complexion of the officer class resulting from the hopelessly ill-conceived Somme offensive of July to November 1916 (160,000 British fatalities, a disproportionately high percentage being upper-middle-class, public (i.e. private) school educated young officers) - which necessitated the rapid infusion of hastily trained officer replacements from lower social backgrounds - is eloquently described. In documenting the squalor, filth and disease-ridden conditions of trench warfare, the author pulls no punches and brings home the horrific immediacy of such fighting.

In its coverage of leading participants in the War, Paxman’s book is, however, open to criticism. Weak and vacillating British Prime Minister Herbert Asquith (who, against his better judgement, let himself be bullied by gung-ho cabinet colleagues into rushing troops to France to oppose the invading Germans) and his flamboyant successor, David Lloyd George (who shabbily sought to shield a family member from the fray with the offer of a cushy desk job, while at the same time consigning tens of thousands of British military personnel to needless deaths), both inexplicably get a pass. The impulsive decision of Winston Churchill, then First Lord of the Admiralty and at 37 the rising star of British politics, to up the ante some three days before the official start of hostilities (and thereby make the outbreak of war almost inevitable) by ordering the main fleet to relocate from the English Channel to its war station at Scapa Flow gets barely a mention, despite the fact that the overall purpose of this naval activity - the imminent imposition of a North Sea blockade on German shipping - was instantly recognised by the Germans and correctly interpreted as a preliminary act of war. For the conduct of British generals in the field, their endless preventable blunders and appalling lack of imagination, Paxman offers the lame excuse ‘What else could they do?’ He seems oddly unaware of the principles of initiative and delegation so deeply instilled as a modus operandi in the German Army, whereby every rank from colonel down to corporal was trained to immediately assume command of all those under him whenever superior officers were no longer available. Why did the British top brass never think of imitating this policy? Instead, in the event of critical officer staff becoming casualties, infantrymen were left (as happened particularly during the 1916 Somme offensive) to wander aimlessly about, lacking the authority and operational knowledge to exploit any momentary advantage gained on the battlefield.

Regarding the German Emperor, Kaiser Wilhelm II, Paxman curiously omits to impart the extraordinary circumstance that Wilhelm was not only fifth in line to the British throne but also, as the son of Queen Victoria’s eldest child, would automatically have become King of Great Britain had the rule of male-preference primogeniture been rescinded (this had to wait until the Succession to the Crown Act 2013) prior to the demise of playboy monarch Edward VII in 1910. Despite his bombastic outbursts, Wilhelm was an Anglophile at heart, his fundamental wish being to avoid war with Britain - an attitude most dramatically expressed during the immediate run-up to the start of the War in August 1914, when he desperately sought to dissuade his resolutely obstinate general staff from implementing the Schlieffen Plan (an utterly misguided strategy because its exclusive focus on winning the war in the west neglected all political realities, in particular the adverse impact on British public opinion of a German violation of Belgian neutrality). Instead, he unsuccessfully pushed for the much more realistic alternative of an offensive war against Russia in the east coupled with a defensive posture in the west. This strategy would almost certainly have paid huge dividends, resulting at the very least in a military stalemate and quite possibly a complete German victory in the War.


The Rediscovery of the Mind (Representation and Mind series)
by John R. Searle
Edition: Paperback
Price: £26.95

1 of 2 people found the following review helpful
1.0 out of 5 stars Counterintuitive and cocksure, 5 Jun. 2017
Verified Purchase
The great Roman author, jurist and statesman Marcus Tullius Cicero (106-43 BCE) once sarcastically observed ‘I know of nothing in any way too absurd to say that isn’t said by some philosopher’ - nescio quo modo nihil tam absurde dici potest quod non dicatur ab aliquo philosophorum (De Divinatione II.119). This was over 1,600 years before Rene Descartes, in propounding the notion ‘I think therefore I am’ (a plagiarised sentiment going back to the ancient Greek thinker Parmenides) and that of a material brain working in concert with an immaterial mind, unleashed that notably errant brand of speculation called the ‘philosophy of mind’. It’s a safe bet that Cicero, a man with an encyclopaedic knowledge of Greek philosophy and no mean philosopher himself, would have reacted with even greater stupefaction and derision were he alive today and exposed to this type of thinking. For the philosophy of mind exemplifies to the full the worst characteristics of philosophy in general - an abiding addiction to the counterintuitive, to arcane distinctions of hair-splitting triviality, to the burying of the blindingly obvious in preposterous pseudo-profundity, to pervasive hyperbole and dogmatic, unsubstantiated assertions, to pompous, pretentious pedantry, and finally, but not least, to a bewildering terminology typified by an endlessly proliferating array of mostly superfluous -isms. In a pathetic effort to mask the intellectual poverty of their discipline and aspire to quasi-scientific respectability, philosophers of mind have taken in recent years to brazenly calling themselves ‘cognitive scientists’, a term normally reserved for those engaged in the purely scientific study of the brain and modelled simulations of mental activity, i.e. neuroscientists, AI specialists and psychologists. There is nothing remotely scientific, however, about any aspect of philosophy, least of all the philosophy of mind. For the latter, whose distinguishing features are strict logical analysis and clear-cut conceptual definition, eschews the results of practical observation and experiment - thereby precluding any serious investigation of a nebulous entity like the mind and, in particular, of that most mysterious and elusive of its properties, namely consciousness. Indeed, by its very terms of reference, practitioners of the philosophy of mind have little choice but to label mental concepts as ‘incoherent’ and ‘irrelevant’, and then dismiss them from further serious consideration.

Among philosophers of mind, UC Berkeley-based John Searle is often viewed as a moderate, a sensible and level-headed compromise between the flamboyant metaphysical abandon of Daniel Dennett (see Consciousness Explained) on the one hand and the icily tendentious elitism of Peter Hacker (see The Philosophical Foundations of Neuroscience) on the other. Yet it is perhaps illustrative of the fundamental shortcomings of this discipline that even Searle’s work exemplifies many of the negative features mentioned above. For starters, the very title, The Rediscovery of the Mind, is a hyperbolic misnomer, since the author fails therein to rediscover, let alone discover, anything of any consequence about the ‘mind’, the book turning out instead to be a summation of various theories and his own highly debatable speculations on this contentious topic. He offers at the outset what appears to be a spirited defence of the primacy of first-person subjectivity and qualia (i.e. our individual responses to sensory input) in learning about the world, an approach which he rightfully claims is unduly downplayed by philosophers of mind in favour of the objective third-person perspective typically employed in science. He therefore roundly condemns materialist views of the mind in all their various guises (i.e. physicalism, monism, reductionism, the identity theory and above all eliminative materialism), in particular on the ground that the submersion of mental faculties within the physical brain which they imply denies any element of first-person subjectivity. Logical behaviourism (i.e. divining the inner thought processes of others from their outward behaviour) and functionalism (i.e. plotting the causal relations between various mental functions, their inputs and outputs) are likewise rejected for the same reason; in the first case somewhat unfairly (because what else do we usually have to go on except the outward behaviour of others in trying to judge what’s happening inside their heads?), and in the second much more reasonably, because the omission of personal subjectivity from functionalism’s remit virtually eliminates any constructive value it might have as a cognitive tool.

But Searle’s endorsement of first-person subjectivity turns out to be a limited one. For it soon becomes clear that he conceives it exclusively in terms of externally directed personal subjectivity, i.e. referring only to what’s outside us, and not to what’s internal within us. He therefore dismisses the concept of privileged access we all might have to our own mental states (e.g. hopes, beliefs, desires, fears etc.), doing so with the curious argument that the place where they reside would have to be ‘different from the space’ in which we would go looking for them. This assertion is echoed in his equally counterintuitive rejection of introspection, on the ground that ‘any introspective observation that I might care to perform is itself that which was supposed to be observed’. Why does he think this? Because to him, the idea of us having feelings about other feelings we have represents a logical contradiction, a violation of clear-cut, unambiguous conceptualisation of the self. First-person subjectivity he clearly views as bound up with the issue of intentionality, an arguably superfluous notion (but one of Searle’s pet obsessions - see his magnum opus Intentionality) which denotes the category of mental states that are always about something else (those which are either about ourselves or about nothing in particular, e.g. an aimless, inexplicable sense of euphoria or depression, are excluded). Unsurprisingly, intentionality’s scope is circumscribed in a very similar fashion to that of first-person subjectivity - a feeling of pain, he says, cannot be considered to exemplify intentionality because our mentally reacting to another mental thought or experience like a pain purely references something internal to ourselves and not something outside us.

It transpires that although the book purports to treat consciousness as one of its principal themes, the deeper aspects of consciousness are something the author is clearly uncomfortable with: almost needless to say, they represent ‘incoherent’, illogical and hence indefinable concepts. Don’t therefore expect to find here any serious attempt to investigate such standard mental processes as reflection, recognition or recollection, let alone any reference to the subconscious or related phenomena like telepathy, premonitions, hypnosis etc. The nearest Searle gets to this sort of stuff is his bland notion of ‘unconscious’ mental states - typically memory and epistemic content, like the fact I still ‘know’ when I’m comatose or asleep that the Eiffel Tower is in Paris - which have the potential to surface as conscious states. This feature he grandly denominates the ‘principle of connection’ - just one instance of idiosyncratic terminology sporadically injected into his narrative. Other examples of such Searlian nomenclature include a) the ‘Background’, meaning the myriad capabilities we derive from daily life experiences which enable us to correctly interpret and control our environment, b) the ‘Network’, a term he bestows on the web of hopes, fears, desires, beliefs etc. that serve as operational backup to the ‘Background’, and finally c) ‘strong AI’, an especially misleading term (it’s used in a totally different context in computer science circles) by which Searle means the notion - apparently widespread among philosophers of mind - that the mind can be viewed as little more than a piece of computational software working in harness with a physical brain. This concept he had first belittled in 1980 in his ‘Chinese Room’ analogy, and here he offers a revised version of the argument to convey how computers, unlike minds, have no accompanying conscious awareness, in particular of what they’re doing when they execute programs; like trains forced to stick to and follow rail tracks, they mindlessly obey given syntactical rules and are oblivious to the semantic significance of the data processing and number crunching operations they have to carry out. This example of the blindingly obvious (every second-grader who encounters computers grasps this most basic of principles from the get-go) was treated - incredibly - as a ground-breaking revelation by many in the philosophy of mind community when Searle first proposed it, a striking testament to pervasive computer illiteracy among its membership.
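
To make the point concrete, here is a minimal sketch in Python of a Chinese Room (my own toy illustration, not Searle’s formulation; the rulebook entries are invented purely for the example). The program pairs input symbols with output symbols by rule-following alone - flawless syntax, zero semantics:

```python
# A toy Chinese Room: a 'rulebook' pairing input symbols with output symbols.
# The entries are invented for illustration; the program treats them as shapes.
RULEBOOK = {
    "你好吗?": "我很好, 谢谢.",    # a question/answer pair, matched as strings only
    "今天是星期几?": "星期二.",     # another pair; nothing here is ever 'understood'
}

def chinese_room(symbols: str) -> str:
    # Pure symbol manipulation: match the shape of the input and return the
    # paired shape. No part of this function represents what either string means.
    return RULEBOOK.get(symbols, "请再说一遍.")  # fallback: 'please say that again'

print(chinese_room("你好吗?"))  # fluent-looking output, no understanding anywhere
```

The lookup succeeds or fails on string identity alone, which is precisely the point about syntax never amounting to semantics.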

A prominent theme in the book is the irreducibility of the mind and the associated issue of dualism, a highly controversial topic which, according to Searle, philosophers of mind tend to recoil from in horror since it evokes the generally discredited Cartesian thesis of an immaterial/physical mind-body set-up. Here, as part of his firm stance against materialism, he rejects the notion of the mind’s reducibility to a physical brain, proposing in place of a wholly material brain the concept of one invested with both physical and mental properties, the former representing the brain’s neuronal basis and the latter the supervenient role of the mind upon it. Despite the fact that this interpretation is manifestly a type of property dualism, it doesn’t stop Searle from trying to make out (from eagerness to shore up his possibly shaky anti-dualist credentials?) that it isn’t. To back up this line of thinking, he makes much of the supposedly analogous case of how H2O’s properties, for instance, can vary from liquidity (i.e. plain water) to solidity (i.e. ice). But the comparison fails miserably because the distinction between the mental and the physical - a huge conceptual gulf to be sure - is of a vastly greater order of magnitude and significance than that between purely natural, observable phenomena like liquidity and solidity; we are on firm, scientifically testable and definable ground when we talk of these, but can the same be said of a priori concepts like the mental and the physical? In his treatment of dualism, we see an attempt to have it both ways, of wanting (as others have pointed out) to have his cake and eat it. Yet another example of this can be detected in his ambivalent attitude to the concept of ‘homunculi’ (i.e. little men) who many philosophers of mind believe are needed to account for many integral neurological processes, e.g. the ‘Cartesian Theatre’ of mental imagery derived from memory or the external world which we all witness when we close our eyes, or, at a more basic level, the transformation of the two-dimensional visual array our retina receives into the three-dimensional representation of outside reality we consciously experience. Oh really? Without invoking the principle of ‘recursive decomposition’ (apparently one of Dennett’s brainwaves) to explain homunculi away, we might well take on board the no-brainer consideration that countless vital physiological processes like breathing, cardio-vascular activity, digestion etc. occur automatically, not only without any conscious intervention on our own part but also without any suggestion by eccentric philosophers of mind that busy bands of homunculi are working overtime behind the scenes to take care of all these functions. Why then assume our visual system is any different?

There is an elegance, precision and stimulating quality to Searle’s writing which give it a superficially persuasive air - presumably that’s why The Rediscovery of the Mind has attracted many favourable reviews. His fecund imagination supplies an endless stream of vivid analogies which do much to illustrate the points he is making. But the text is marred not only by the various defects already noted but also by persistent unsubstantiated assertion, a particularly imprudent and irritating habit when dealing with a topic as nebulous as the mind. We learn, for instance, that witches don’t exist (how does Searle know this?) and that quantum physics is something animals don’t understand but humans do (a pity he never tried convincing the famous physicist Richard Feynman of this obvious truism!). Ad nauseam we are lectured that the physical brain houses all our mental processes, thoughts and feelings, and that the most critical mental function of all - consciousness - is just a property, an emergent feature which emanates seamlessly from our cerebral circuitry. But what justifies such certitude? On the evidence of this book, a little less dogmatism and a bit more intellectual humility on Searle’s part would have been much appreciated.


The Age of Genius: The Seventeenth Century and the Birth of the Modern Mind
by Professor A. C. Grayling
Edition: Paperback
Price: £10.98

1 of 1 people found the following review helpful
2.0 out of 5 stars Patchy and unbalanced, 15 April 2017
Verified Purchase
The seventeenth century was arguably the most notable in European (and, by extension, world) historical progress. It began with a continuation of the deeply entrenched attitudes and traditions of the preceding medieval epoch, i.e. mysticism, religious orthodoxy and major impositions on free expression, especially expression of a scientific nature which conflicted with the doctrines of the Catholic Church. By the end of the century, a much more open society had emerged, in northern Europe in particular, one in which a radically new mindset focused on secularism, self-responsibility and the untrammelled pursuit of scientific enquiry had replaced the earlier dominant obscurantism.

British academic A.C. Grayling’s book is a well-written, informative and stimulating account of this extraordinary era. Unfortunately, it’s also patchy, unbalanced and uneven. At times, the author seems to assume an unwarranted familiarity on the part of his readers with the most celebrated events and features of the period, an assumption which makes him neglect detailed discussion of these while conversely dwelling much more on those he senses his readership has little knowledge of. For this reason, the English Civil War gets barely a single mention, whereas over 80 pages (a quarter of the entire book) chart the tortuous course of the roughly contemporaneous Thirty Years War in central Europe - an event Grayling clearly thinks his readers know little (and need therefore to know more) about. This focus on the lesser known involves a detailed account of the strong undercurrent of alchemy, mysticism and superstition accompanying the momentous progress of 17th century science (an entire chapter records the career of the relatively obscure mathematician and magician John Dee, the inspiration for Marlowe’s protagonist in Dr. Faustus and Shakespeare’s Prospero in The Tempest), and a corresponding failure to document Newton’s seminal contributions to optics, mechanics, physics and mathematics in favour of explicit comment on his abiding interest in alchemy, occultism and biblical interpretation. Galileo’s epic advances in mechanics and astronomy are overshadowed by an overlong narrative of the harsh persecution meted out to him by the Catholic Church. Francis Bacon’s endorsement of a cooperative approach to science and rigorous adherence to scientific method admittedly gets much more generous treatment, but (perhaps because of the disproportionate amount of space Grayling has already used up telling us all about the Thirty Years War!) this is the exception - most of his accounts of individual scientific achievement are either perfunctory or non-existent. Major advances like William Harvey’s revolutionary work on blood circulation and Blaise Pascal’s on computation, for instance, are dismissed in a couple of sentences; those of Johannes Kepler, Robert Boyle, Christopher Wren and Robert Hooke in astronomy, physics, mathematics and natural science receive even less attention, while almost entirely absent from the narrative is any mention of the vital contributions of microscopy pioneer Antonie van Leeuwenhoek, astronomer and mathematician Christiaan Huygens or towering genius Gottfried Leibniz (co-inventor of the mathematical calculus).

Turning to philosophy, Rene Descartes inevitably takes centre stage, despite the fact that his methodology, supplemented by highly dubious and heavily derivative suppositions like ‘I think therefore I am’ and belief in a physical body combined with an immaterial mind, aroused immediate and understandable controversy. In the case of Thomas Hobbes and John Locke, a full analysis is presented of their contrary positions on constitutional liberty, but this is sadly unaccompanied by any reference to their equally important ideas on empiricism (Hobbes was the first true British empiricist) and the philosophy of mind (Locke’s seminal Essay Concerning Human Understanding gets no mention). Coverage of the humanities, bar isolated literary quotes from Alexander Pope and contemporary English pornography (i.e. poems by the Earl of Rochester), is minimal.
As regards the arts, apart from token acknowledgements of the ‘Golden Age’ of Dutch painting, no major artists are identified, nor is the overriding influence of Italian Palladianism (notably exemplified in England by Inigo Jones and then Wren) in architecture anywhere cited.

Mention is made of the foundation in 1639 of a trading post by the British East India Company at Madras. Curiously, however, nothing is said about the contemporary Dutch East India Company, the greatest corporate enterprise of the day, whose establishment of extensive maritime links with the Far East (necessitating the settlement of present-day Cape Town in 1652 as a halfway point providing fresh fruit and vegetables to ward off scurvy among its sailors during the long journey from northern Europe) deposited untold wealth into the pockets of Amsterdam merchants. This momentous spike in foreign trade directly underpinned the emergence of Dutch supremacy in commerce and banking, financial skills which later - following the ‘Glorious Revolution’ of 1688 in England, when Dutch leader William of Orange became King William III - were avidly absorbed and further refined by the English, fuelling Britain’s meteoric rise in the following century to world power status and subsequent global pre-eminence.

One very important point which Grayling does thankfully make concerns the great improvement in communications during the 17th century, largely attributable to much better mail delivery services. Information could thus be transmitted far more rapidly than before, the scientific community being a major beneficiary of this development. Bacon’s ideal of science as a cooperative endeavour could now be fully realised, a process greatly facilitated by the tireless efforts of polymathic commentators like Marin Mersenne and Pierre Gassendi in publicising and circulating the latest theories and results among a wide circle of contacts - an association semi-institutionalised by Mersenne in 1635 as the Academie Parisienne, Europe’s first scientific society and forerunner of the Royal Society of London established in 1660.


Anger and Forgiveness: Resentment, Generosity, and Justice
by Martha C. Nussbaum
Edition: Hardcover
Price: £16.99

12 of 18 people found the following review helpful
1.0 out of 5 stars Contradictory and inaccurate, 20 Oct. 2016
Verified Purchase
Like all the emotions, anger is a subject of interest to both psychology and philosophy. But whereas the former is devoted more to investigating the condition itself, the latter tends to concentrate on the impact and, if applicable, the redress of this alpha emotion. This ‘consequentialist’ approach to anger is well represented in 'Anger and Forgiveness', a book whose principal theme is how victims should respond to the wrong they have suffered. Should they yield to anger? If so, should they let anger dictate a resolve to seek retribution from the perpetrator for the wrong? Nussbaum’s answer, for most of the book, is an unequivocal ‘no’ on both counts. Anger, she insists, is invariably a negative emotion and the impulse to obtain payback a normatively indefensible attitude. With almost all wrongs, it’s much wiser to ignore the ill one has suffered, work through one’s emotions and adopt a ‘forward-looking’ attitude to life involving a focus on positives like ‘work, friendship, exercise, shopping, confidence, self-esteem, love and trust’; not to do so is ‘narcissistic’ and rarely if ever likely to restore whatever thing, tangible or intangible, one has lost. But isn’t this overall prescription - which she labels the ‘Transition’ - arguably a naive and unrealistic one? May one not already be temperamentally fully disposed to act in this very way and yet find it psychologically impossible to wholly shrug off the injury one has been subjected to, especially if it is not a trivial one? Might one be unable to get the anger one instinctively feels off one’s chest and move on successfully with one’s life without knowing that the agent of a substantial wrong has been made to suffer in some way too? And might not society, by extension, reasonably view possible repeats of such misbehaviour as harmful to itself and seek to deter them in future by instituting punishment for the offenders?

These issues of course have their roots in Greco-Roman philosophy, notably Aristotle (whose obsessive notions of proportion and balance led him to opine that anger is acceptable provided it is expressed in the right way, at the right time, in the right amount and towards the right individual - whatever all that truly means!) and the Stoics (particularly Seneca and Epictetus), and they later received explicit attention from 18th and 19th century ethical theorists like Joseph Butler, Kant, Beccaria and Bentham. Nussbaum presents a deeply committed analysis to make her case, dissecting and differentiating the constituent strands of her arguments with painstaking care and conviction. One cannot disagree with her that anger is generally a most unwelcome emotion (oddly, she fails to point out in this context that its extreme form, i.e. uncontrollable, incandescent rage, is one of the least dignified and most counterproductive types of behaviour there is, especially when fuelled to fever pitch by alcohol, drugs, high anxiety, a pathological - however justified - sense of grievance or a naturally quick temper). But Nussbaum’s condemnation of all forms of anger is unwarranted. One can make a good case that a measured display of anger - for instance, one expressed merely by an icy smile, pursed lips, a furrowed brow, weary sighs or withering sarcasm - is a very valuable and effective form of communication, an essential social skill whose inclusion in the educational curriculum is imperative. For does not the perpetrator deserve to know how the object of his/her misbehaviour feels? How can the former possibly be expected to work on modifying his/her behaviour when denied this essential emotional feedback? Strikingly, the word ‘education’ appears only a couple of times in the entire book, and then only in cursory one-line references to public discussion and anger management classes for adults. But it is children who desperately need to be taught about anger: how to detect, both in themselves and others, the warning signs and kindred emotions (festering discontent, frustration, impatience, jealousy etc.) which serve as breeding grounds for its emergence, how to harness, channel and control it, and above all how to achieve ‘measured’ expressions of this emotion so as to communicate and socialize more effectively with one another. Prevention is always better than cure, and, instead of resorting to will-o’-the-wisp, late-in-the-day notions of ‘Transition’ to hopefully cure already angry adults, the general problem of anger in society can surely be far better addressed by a focus on the very young, by teaching them both its dangers and occasional benefits before they pick up bad habits from others (parents and fictional media characters especially). Nussbaum’s failure to take this obvious point on board constitutes a major flaw in her overall thesis.

Later in the book, Nussbaum appears to modify her earlier stance on the issue of retribution by recommending, in certain unspecified situations, a recourse to the law to take the place of personally directed payback. But isn’t legal recourse just another type of payback? If one either actively initiates a lawsuit against a wrongdoer or passively, without objection, lets the law intervene of its own accord and prosecute the offender, is one not in effect participating, or at the very least conniving, in retribution against that individual? And what justifies her blind faith in legal justice? Not every society adheres to strict, standardized sentencing guidelines - England, with its poorly regulated, amateurish and outrageously politically correct criminal justice system, is a case in point. Nussbaum shifts her former ground even more when opining that assaults on one’s ‘dignity’, i.e. slurs and/or discriminatory conduct of an ethnic or gender-based character, constitute a special class of exceptions that does justify retribution on the part of the victim. Clearly, such behaviour is highly offensive, but is it truly worse than that of the drunken thug, male or female, who launches a wholly unprovoked attack in a crowded bar, smashing a beer glass into one’s face and inflicting semi-blindness in the process? Problematic too is that this concern about the negative impact on a victim’s dignity of racial or sexual ‘down-ranking’ appears to clash with the equally forcefully expressed concern she voices later in the book about the effect of punishment generally on offenders. More than once she describes punishment as threatening both the ‘dignity’ and the ‘non-humiliation’ to which the offender is entitled. How on earth can these opposing concerns be reconciled?

The last few chapters weigh in on the issue of forgiveness. Whether of the Catholic ‘transactional’ variety (wholly penitential), the ‘unconditional’ variety (wholly blame-free) or the Jewish variety (a compromise between the other two), forgiveness arouses Nussbaum’s deep suspicion: she views it as little more than moral grandstanding on the part of the victim and as imposing an unacceptably unhealthy level of guilt on the perpetrator. While one might have serious reservations about ‘transactional’ forgiveness in view of what one might consider an over-emphasis on confession and self-abasement demanded from the offender, it is difficult to see what her rationale is for rejecting ‘unconditional’ forgiveness too. She then turns to famous 20th century alternatives to forgiveness, those exemplified in the varied efforts to achieve ‘revolutionary justice’ on the part of Mohandas Gandhi, Martin Luther King and Nelson Mandela. Here she chastises Gandhi for his advocacy of unqualified pacifism (especially towards Hitler), offers a more balanced portrait of King and goes out of her way to eulogize Mandela. Mandela was incontrovertibly a man of extraordinary courage, resolution and practical wisdom, but Nussbaum misses the point when she takes his legendary great-heartedness at face value, for behind all the acts of magnanimity he was famous for lay great calculation, an exclusive focus on goals, and the subordination of all else to pragmatic political objectives (especially gaining white acceptance of eventual black majority rule) - the mindset, in short, of a master of realpolitik. A sober analysis of the deeply entrenched, uncompromising and repressive nature of the Afrikaner (i.e. whites of predominantly Dutch ancestry) government which came to power in South Africa in the late 1940s and instituted ‘Apartheid’ (forced racial segregation) led him to the inescapable conclusion that armed resistance was the only answer; increasing involvement in, and overall responsibility for, the campaign of violence which ensued led to his eventual arrest and prosecution in 1963 (i.e. the Rivonia trial). There he admitted authorizing acts of sabotage and was sentenced to life imprisonment; in response to his public refusal to disown violence (a refusal he steadfastly maintained till his final release 27 years later), Amnesty International declined to adopt him as a prisoner of conscience. Amazingly, Nussbaum omits all mention of this background in Anger and Forgiveness (she incidentally brushed aside my concerns on this score when I raised them with her in person at a lecture she gave to promote this book, then forthcoming, at the University of Virginia in September 2015); the reader is thus denied any explanation as to why Mandela got jailed in the first place. Instead, she focuses exclusively on his saintly forbearance in prison, his undeniable acts of kindness and generosity to warders and fellow inmates, and the pivotal role he played after his release in unifying the new nation. In this context, she describes how he helped inspire the South African national team to win the 1995 rugby World Cup but leaves out any reference to the truly outstanding coach who was far and away primarily responsible for this major sporting achievement - Kitch Christie, already diagnosed with terminal leukaemia (he died three years later), who came out of early retirement to guide the team to victory. One can only deplore such an omission, for this was the real story of 'Invictus' (blame Hollywood too, incidentally, for shamefully airbrushing Christie from the movie version, purely, it seems, in order to highlight and exaggerate Mandela’s role in the affair - as if a man of his stature actually needed a boost to his already colossal reputation!).

'Anger and Forgiveness' ends with an account of the so-called ‘Truth and Reconciliation Commission’, set up to clear the political air during the early years of Mandela’s new government. Essentially an amnesty for those implicated in criminal acts of political violence during the Apartheid era, it required, in return for grants of immunity from prosecution, full disclosure of these crimes from those responsible. In many ways, the Commission was the crowning achievement of Mandela’s Presidency, for only he had the moral stature to successfully induce not only former members of the police, security services and military to own up to the Commission, but also those of his own followers who had committed violent acts of terrorism to help bring down the previous government. Nussbaum rightfully praises the Commission’s work, but curiously fails to concede that, admirable as its role undoubtedly was, the insistence on confession and contrition it embodied closely mirrored the ‘transactional’ type of forgiveness she had earlier in the book emphatically rejected.

Hyperbole and factual inaccuracy abound in 'Anger and Forgiveness'. ‘Manliness’ often takes a beating, men generally are written off as ‘privileged’ (a very debatable proposition), and Aesop’s tale of the contest between the sun and the wind is credited to Mandela. The white Afrikaner government, Calvinist to its very core, is labelled ‘corrupt’ - a charge which even its most hostile critics have hardly ever made (a single case of financial impropriety by a senior South African government official in the late 1970s generated an outcry only because such instances were so rare). Its members also, according to the author, ‘seem as evil as can be’. Oh really? As evil as, or more evil than, the adherents of Stalin, Mao, Pol Pot or Hitler? Finally, the British government is berated for supposedly extending protection to majority faiths while denying it to minority ones; in fact, it has consistently, through assiduous enforcement of the infamous Human Rights Act (which incorporates the European Convention on Human Rights into British law), done precisely the opposite. Sloppy editing and proof-reading on the part of Ms. Nussbaum’s publishers, Oxford University Press, are much to blame for the presence of mistakes and poor judgements like these, and also for the patchy index, a notably hit-and-miss, incomplete and inadequate accompaniment to the text.


What is Good?: The Search for the Best Way to Live
by Prof A.C. Grayling
Edition: Paperback
Price: £9.99

2.0 out of 5 stars Unbalanced polemic, 14 Aug. 2016
Anyone seeking a reasoned, informative and, above all, dispassionate account of theories of the ‘Good’ will be disappointed by this volume. For Grayling’s book is little more than a vehement assault on religion – in particular Christianity – and a concomitant celebration of the virtues of modern Humanism. I say ‘little more’ because ‘What is Good?’ starts out in a fairly promising fashion with a short, highly readable survey of Greco-Roman thought on the subject, but even here the author falters by giving insufficient attention to Epicureanism (the forerunner of all subsequent humanist thinking) and too much to its great rival, Stoicism – an imbalance attributable, it seems, to Grayling’s infatuation with the latter’s emphasis on the ‘brotherhood of man’ and other sentimental notions of a similar nature. This uneven treatment continues with barely a single mention of Scepticism, one of the most influential philosophical schools in classical antiquity, and of Neo-Platonism, which likewise had a profound bearing on contemporary concepts of the Good.

Grayling, of course, finally gets into his stride with religion, and here he proceeds to indict Christianity with all the fervour and messianic conviction of a medieval zealot. The Bible is so packed with contradictions, inconsistencies and hopelessly absurd improbabilities, he insists, that it is quite impossible to take seriously, let alone respect and believe in, such a ridiculous document. Evil is all too often ordained, excused or otherwise awarded divine sanction within it - how can this conceivably be reconciled with a striving for the Good? Petty doctrinal disputes have generated unprecedented intolerance, unleashing ruinous conflicts among warring parties; incalculable suffering has additionally been inflicted on dissenters by Church officialdom, e.g. the Inquisition. Much of Grayling’s argument is patently true - much of the history of Christianity is certainly not a pretty picture. But this is to ignore the obvious benefits to many of its adherents: the psychological reassurance afforded by the promise of salvation and an afterlife, the strict injunctions discouraging bad behaviour, the comprehensive array of explanations and answers to almost all life’s questions and dilemmas, the inspiring fellowship and support of other Church members and so on. Above all, he fails to acknowledge the supreme importance of faith itself, the overpowering sensation of a true believer that a divine presence resides mysteriously within him, guiding and motivating all his actions - an overwhelming and inexplicable conviction impervious to all rational analysis.

The later chapters chart the development of modern Humanism, from its roots in the Italian Renaissance to its emergence as a distinct philosophy in the early 20th century. Here, the roles of scientific progress, increasing secularism and social liberalism in inducing us to accept fuller responsibility for managing our own lives - with ever-diminishing subservience to a supreme deity - are discussed in detail, the values and virtues of this new outlook being contrasted most favourably (and arguably unfairly) with those of traditional Christian teaching and morality. Religion and Humanism, it is implied, are mutually exclusive, it not being possible for the same individual to subscribe to both. But is this true? May one not choose to lead one’s daily life entirely in accordance with humanist principles, yet retain a belief in God when it comes to one’s spiritual thinking?

A final issue in ‘What is Good?’ is what constitutes the true role of philosophy itself. Here, Grayling delivers a withering attack on the still dominant position of analytic philosophy in English-speaking countries, especially its overriding obsession with highly abstract minutiae, arcane definitions and hair-splitting, trivial distinctions, particularly in the areas of language, meaning and mathematics. All this is largely irrelevant to daily life, he maintains; rather, we should devote ourselves to the study of ethics, to how to lead good lives, and to coming up with viable approaches to ameliorating human suffering and addressing our most urgent problems. But this superficially welcome proposal neglects the huge divergence of opinion on these matters: given this scenario, how are we ever to achieve a consensus on how to proceed? At least analytic philosophy, one might say, by virtue of its restricted focus and quasi-scientific methodology, offers the prospect of measurable progress, however meagre, in clarifying and helping to resolve the issues it narrowly concerns itself with.


The Road to Little Dribbling: More Notes from a Small Island (Bryson)
by Bill Bryson
Edition: Hardcover
Price: £16.89

3 of 6 people found the following review helpful
1.0 out of 5 stars Careless factual errors, 27 Feb. 2016
Verified Purchase
This long-delayed sequel to Notes from a Small Island involves the author covering much of the same ground (literally) as its predecessor and won’t disappoint readers seeking the same brand of wry humour, sardonic wit, merciless put-downs and obsessive focus on human frailties and life’s oddities as before. Conservationists in particular will applaud Bryson’s caustic condemnation of witless politicians, exploitative developers, ruthless polluters and others dedicated to wrecking the increasingly fragile British rural and marine environment. On the minus side, however, he needs to do a better job of getting his facts right: not only is Thomas Edison wrongly credited in this book with the invention of the incandescent electric light bulb (English electrical pioneer Joseph Swan demonstrated a working incandescent lamp early in 1879, months before Edison, and patented his improved design in November 1880), but also, perhaps mesmerized by crass Hollywood hyperbole and misrepresentation, he incorrectly ascribes primary responsibility for the development of the programmable electronic digital computer to British mathematician and cryptographer Alan Turing (who was in fact beaten to the punch by those responsible for the ENIAC project at the University of Pennsylvania, notably American engineers J. Presper Eckert and John Mauchly). On a perhaps lesser note, Bryson would do well to refrain from cheap shots at dead individuals who aren’t around to defend themselves - a case in point being his mean-spirited and utterly fatuous (‘f****** stupid’ in Brysonic parlance) attack on popular English travel writer H.V. Morton, who relocated to South Africa in the 1940s, a decision Bryson takes to be motivated by Morton’s ‘fascist’ inclinations and desire to have ‘servants he could shout at’. Grow up and get a life, Bill!


A War to Be Won: Fighting the Second World War
by Williamson Murray
Edition: Paperback
Price: £17.95

0 of 2 people found the following review helpful
3.0 out of 5 stars Historically accurate account but questionable judgements, 17 Aug. 2015
This is a very uneven book. On the one hand, it is a most valuable source of information regarding the purely military aspect of WWII. No other work for the layperson that I am familiar with packs in as much helpful detail about the nuts and bolts of this momentous conflict - the logistics, tactics, munitions and weapon specifications, plus related economic statistics like industrial production, labor availability, foodstuffs and raw materials, sizes of opposing armies, casualty lists etc. Additionally, the narrative of specific military operations is unfailingly accurate and well-written. On the other hand, where the authors stray from this focus and proffer value judgments about the bigger picture - i.e. issues of causation and impact, of intent, motivation and responsibility - their narrative is frequently marred by imbalances, omissions and inconsistencies. While 26 pages, for instance, are devoted to the origins of the Allied war against Japan in the Pacific, the account of the vastly more complex background to WWII in Europe occupies little more than half this space, a brevity which results in insufficient attention being devoted, in particular, to the Treaty of Versailles and its critical impact on subsequent events. By downplaying the treaty’s provisions, in particular the huge reparations bill imposed on Germany and the loss of 10% of its land area and population, and by saying little about deep German resentment at such draconian measures, the authors fail to explain the meteoric rise of Hitler, whose appeal lay above all in a ruthless exploitation of popular anger at such injustice. The treaty’s placement of the former German city of Danzig under League of Nations control and the pivotal role this played in the breakdown of relations between Germany and Poland (the former wanting the city back and the latter seeking to acquire it) get not even a passing reference. The authors quote British Prime Minister Neville Chamberlain - increasingly uneasy at the prospect of British military involvement in the worsening European situation - as opining in September 1938, in a rare moment of lucidity, ‘how horrible, fantastic and incredible it is that we should be digging trenches and trying on gas masks here because of a quarrel in a far away country between people of whom we know nothing’. They fail to question, however, the wisdom of his decision six months later, in March 1939, to completely reverse himself by authorizing a cast-iron British guarantee to protect Poland in the event of future German aggression, a totally reckless and unenforceable measure which served only to embolden the Poles to be even less compromising in their dealings over Danzig and which therefore did much to make the subsequent outbreak of war inevitable.

A similar imbalance lies in their assignment of relative national responsibility for acts of savagery and brutality. Rejecting outright the notion of ‘moral equivalence’, they rightly emphasize the barbaric nature of German military conduct in eastern and southern Europe especially, but gloss over comparable Soviet atrocities. No mention is made, for example, of the cold-blooded massacre by Stalin’s secret police in 1940 of over 20,000 captive Polish officers and officials in the Katyn Forest near Smolensk, nor of the roughly 9,000 lives lost (mostly German civilians, many of them women and children) when the refugee transport ship Wilhelm Gustloff was torpedoed by a Soviet submarine in the Baltic in 1945. The destruction of Dresden at almost the same time (with an estimated 25,000 civilian deaths), which represented the culmination of the Allied bombing offensive over Germany, is likewise ignored, little or no condemnation being expressed of the focus (cynically sanctioned by Roosevelt and Churchill) on indiscriminately slaughtering as many non-combatants in populated areas as possible in this way.

The authors’ emphasis on military operations results in little or no discussion of the infamous ‘Final Solution’, even though this directly impacted the German war effort. No mention, for instance, is made of the January 1942 Wannsee Conference, which coordinated the systematic elimination of Europe’s Jewish population, nor of the colossal overhead, diversion of scarce military resources and loss of essential manpower incurred by the implementation of this diabolical program. The huge toll of casualties suffered by the Red Army in taking Berlin is cited as a good reason for the Allies not having bothered to capture the city instead, a judgment which ignores the obvious fact that the Germans, terrified of Soviet reprisals, would have put up vastly less resistance to the British and Americans. That this fear of Russian vengeance above all motivated the German generals to fight to the bitter end does not occur to the authors, who absurdly take them to task for continuing a supposedly pointless struggle against impossible odds, especially the looming threat of the atom bomb (how could the generals possibly have known about the Manhattan Project?). Elsewhere, inconsistencies abound in the evaluation of individual commanders. Montgomery, for instance, is warmly praised at first but later savaged for dilatory leadership, in particular his reluctance to back up General Simpson’s bold proposal to cross the Rhine into northern Germany. Marshall, never in combat and surely the most over-rated and over-promoted officer in American military history, is lauded to the skies as a great general purely on account, it seems, of his supposed skill in choosing high-quality subordinates; later, the authors let slip that not all of his appointments were that good, especially that of Clark (whose direction of the Italian campaign they mercilessly criticize), and that he needlessly got in the way of Eisenhower’s highly efficient management of the Allied effort.

Inconsistencies like the above could easily have been remedied by better editorship - which is as much the publisher’s as the authors’ fault. To the publishers can also be laid the principal blame for the hopelessly inadequate index, a deficiency which severely detracts from the overall value of this generally helpful work and, incidentally, hampers the reviewer’s task.


The Sleepwalkers: How Europe Went to War in 1914
by Christopher Clark
Edition: Paperback
Price: £10.98

3 of 10 people found the following review helpful
2.0 out of 5 stars Lacking in judgement, 14 Jun. 2014
Verified Purchase
I must be in a minority of one in not wishing to join in the rapturous applause which has greeted this book by Christopher Clark, so let me explain. Clark writes as a born story-teller rather than a professional historian, his book being a highly readable and informative anecdotal account of the build-up to WWI. But noticeably lacking is a penetrating, didactic analysis of the successive chain of events which ultimately led to this momentous conflict, what one might call 'the mother of all wars' - or rather, in view of what ensued 25 years later, 'the grandmother of all wars'. One looks in vain in his narrative for any controversial thesis, any bold judgments or conclusions, any in-depth focus on the possible contributory factors (e.g. wounded pride, revanchism, wishful thinking, insensitivity, intransigence, hot-headed bellicosity and so on) which brought it about. Clark’s low-key lack of emphasis is particularly apparent in his very brief comments on the outbreak of the war: the Schlieffen Plan is passed off - incredibly - as a reasonable and sensible military option, and the invasion of Belgium almost brushed aside as a decisive factor in galvanizing British national opinion. His final conclusion on the issue of war guilt is typically bland and non-committal: the concept in his view is ‘meaningless’ since it a) implies that policy-makers were ‘driven by a coherent intention’ to will war, b) presupposes that ‘one protagonist must ultimately be right and the other wrong’ and c) fails to address the pivotal role of ‘multilateral processes of interaction’ in leading to war. Too bad he makes no serious attempt to defend these extremely debatable hypotheses.


Turing's Cathedral: The Origins of the Digital Universe
by George Dyson
Edition: Hardcover

52 of 60 people found the following review helpful
2.0 out of 5 stars Misleading account, 8 Mar. 2012
The focus of George Dyson's well-written, fascinating but essentially misleading book, 'Turing's Cathedral', is curiously not on celebrated mathematician, code-breaker and computer theorist Alan Turing but on his equally gifted and innovative contemporary John von Neumann. Von Neumann, whose extraordinarily varied scientific activities included inter alia significant contributions to game theory, thermodynamics and nuclear physics, is especially associated with the early development of the electronic digital computer (the 'EDC'), an interest apparently sparked by reading Turing's seminal 1936 paper 'On Computable Numbers', which attempted to systematize and express in mathematical terms the principles underlying a purely mechanical process of computation. Implicit in this paper, though at a very theoretical level, was a recognition of the relevance of stored-program processing (whereby a machine's instructions and data reside in the same memory), a concept emanating from the work of mid-Victorian computer pioneer Charles Babbage but which demanded a much later electronic environment for effective realization.
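As a software professional I cannot resist making the stored-program idea concrete. The following toy machine - my own illustrative sketch in Python, not anything taken from Dyson's book - keeps instructions and data in one and the same memory, so a running program can overwrite its own cells, which is precisely the capability Turing's paper anticipated at a theoretical level:

    # A toy stored-program machine: instructions and data share one memory,
    # the defining feature of the design the 'First Draft' described.
    # Each instruction occupies two cells: memory[pc] = opcode, memory[pc+1] = operand.

    def run(memory):
        acc = 0   # single accumulator register
        pc = 0    # program counter
        while True:
            op, arg = memory[pc], memory[pc + 1]
            pc += 2
            if op == 0:      # HALT: return the accumulator
                return acc
            elif op == 1:    # LOAD: acc <- memory[arg]
                acc = memory[arg]
            elif op == 2:    # ADD: acc <- acc + memory[arg]
                acc += memory[arg]
            elif op == 3:    # STORE: memory[arg] <- acc
                memory[arg] = acc

    # Code (addresses 0-7) and data (addresses 10-11) live in the same list;
    # the STORE instruction rewrites a data cell while the program runs.
    program = [
        1, 10,   # LOAD from address 10
        2, 11,   # ADD contents of address 11
        3, 10,   # STORE result back at address 10
        0, 0,    # HALT
        0, 0,    # padding
        7, 35,   # data cells at addresses 10 and 11
    ]
    print(run(program))   # prints 42

Trivial as it is, the sketch shows why adequate memory was the real engineering bottleneck: code and data must share one sufficiently large store, which is exactly what Eckert's mercury delay lines were intended to provide.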

What Mr Dyson insufficiently emphasizes is that, despite a widespread and ever-growing influence on the mathematical community, Turing's paper was largely ignored by contemporary electronic engineers and had negligible impact on the early development of the EDC. He also fails to point out adequately that von Neumann's foray into the new science of electronic computing involved an almost total dependence on the prior work, input and ongoing support of his engineering colleagues. Invited in August 1944 to join the team at the Moore School of Electrical Engineering (University of Pennsylvania) responsible for ENIAC, the world's first general-purpose computer then being built for the US Army, von Neumann was quickly brought up to speed courtesy of the machine's lead engineers, J. Presper Eckert and John Mauchly. As early as the fall of 1943, Eckert and Mauchly had become frustrated by the severe processing limitations imposed by ENIAC's design and were giving serious consideration to major modifications, in particular the adoption of Eckert's own mercury delay line technology to boost the machine's minuscule memory capacity and enable a primitive stored-program capability. These proposals were vetoed by the School's authorities on the quite understandable grounds that they would seriously delay ENIAC's delivery; instead it was decided to begin research in parallel on a more advanced machine (EDVAC) incorporating the latest developments. As a new member of the group, von Neumann speedily grasped the essentials of the new science and contributed valuable theoretical feedback, but an almost total lack of hands-on electronic expertise prevented any serious contribution on his part to the nuts and bolts of the project. Relations with Eckert and Mauchly deteriorated rapidly when an elegantly written, but very high-level, document of his entitled 'First Draft of a Report on the EDVAC' was circulated among the scientific community. Not only had this document not been previewed, let alone approved, by Eckert and Mauchly, but it bore no acknowledgment whatsoever of their overwhelming responsibility for much of its content. By default, and in view too of his already considerable international reputation, the content was attributed exclusively to von Neumann, an impression he made no attempt thereafter to correct - the term 'von Neumann architecture' being subsequently bestowed on the stored-program design the document described.

The public distribution of von Neumann's 'Draft' denied Eckert and Mauchly the opportunity to patent their technology. Worse still, despite academic precedents to the contrary, they were refused permission by the Moore School to proceed with EDVAC's development on a commercial basis. In spite of his own links to big business (he represented IBM as a consultant), von Neumann likewise opposed their efforts to do so. All this resulted in a major rift: von Neumann was thereafter shunned by Eckert and Mauchly and forced to rely on lesser mortals to help implement various stored-program projects, notably the IAS computer at Princeton. The following year (1946) Eckert and Mauchly left the School to focus on developing machines for the business market. Before doing so, they jointly delivered a series of state-of-the-art lectures on ENIAC and EDVAC to an invited audience at the School. Among the attendees was British electronics engineer Maurice Wilkes, a fellow Cambridge academic of Turing's, though one with relatively little interest in the latter's ongoing activity (by this time Turing, a great visionary, had also turned his attention to designing stored-program computers). Blown away by Eckert and Mauchly's presentation, Wilkes returned to England to forge ahead with a new machine called EDSAC, completed in May 1949 and the first truly viable example of a stored-program computer (an experimental prototype christened 'Baby' had already run at Manchester University the year before). Back in the US, Eckert and Mauchly continued their efforts, but persistent funding problems and Eckert's own staunch refusal to compromise on quality delayed progress, their partnership culminating in the UNIVAC I, the world's first overtly business-oriented computer, delivered initially to the Census Bureau in March 1951.

Mr Dyson is of course quite right (and he does this well) to trace the beginnings of the modern computer to the stored-program concept, but his obsessive focus on von Neumann's role obscures Eckert and Mauchly's vastly more significant contribution to its development. The triumph of the EDC depended almost wholly on the efforts and expertise of utterly dedicated and outstanding electronics specialists like them, not on mathematicians, logicians and generalists like von Neumann or even Turing. Never one to deny credit where it was due, Wilkes (who later spearheaded advances in software, became the doyen of Britain's electronics community and ended his long and distinguished career as professor emeritus of computer science at Cambridge) unceasingly acknowledged his debt to Eckert and Mauchly. Hopefully Mr Dyson, a writer of considerable talent, will one day decide to tell their story in full and set the record straight.

