Profile for Sphex > Reviews


Content by Sphex
Top Reviewer Ranking: 4,714
Helpful Votes: 2,279


Reviews Written by Sphex (London)

Inside Jokes: Using Humor to Reverse-Engineer the Mind
by Matthew M. Hurley
Edition: Hardcover
Price: £22.95

6 of 8 people found the following review helpful
5.0 out of 5 stars The endogenous mind-candy of mirth, 30 April 2013
E. B. White said that analysing a joke is like dissecting a frog: few people are interested and the frog dies of it. So, my recommendation of this tremendous book, in which the authors seek a fully scientific explanation of humour, also comes with a reassurance: your sense of humour should not only survive this encounter with science intact, it'll also be tickled along the way. (For those without a sense of humour or an appetite for evolutionary explanations, this book will be tough going.) There are nearly a hundred jokes scattered throughout the text and plenty of funny epigraphs introducing each section, and unusually for a book of this calibre there are plenty of laugh-out-loud moments. To earn the reward of each little nugget of "mind-candy", however, requires some high-level incursions into philosophy, evolution, epistemology, computational problems, cognitive science, neuroscience, and so on. To labour the food metaphor, you have to eat some huge piles of broccoli to enjoy the choccies.

The preposition in the title is vital. Hurley, Dennett and Adams concentrate on jokes because these "are compact, self-contained mirth-delivery systems" that allow us to take a peek inside, and watch the fundamental machinery in action. The one thing everyone agrees upon is that humour is both complex and diverse. The authors appreciate the scale of their undertaking, and in the shortest of the main chapters they list twenty questions for a cognitive and evolutionary theory of humour. These are answered in the final main chapter (aptly called The Punch Line), also short, which sums up the argument of the book.

Throughout, Hurley et al. seamlessly combine their talents and model humour by taking both an evolutionary and an epistemic perspective. After an introduction the second chapter asks: "What Is Humor For?" A handy way of filling the television schedules is obviously too parochial an answer (although entertainment has its own evolutionary roots in sexual selection; see The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature). The question invites us to think in terms of adaptations that evolved to meet challenges faced in our evolutionary past. In particular, the authors try to show that humour "evolved out of a computational problem that arose when our ancestors were furnished with open-ended thinking." Having big brains is all very well, but that means there's more to go wrong. We "have to learn to live with the failings of our minds, and to detect their consequences after they occur" - and humour is one way of fixing the faults in our thinking.

Here is their theory in a nutshell: "Our brains are engaged full time in real-time (risky) heuristic search, generating presumptions about what will be experienced next in every domain. This time-pressured, unsupervised generation process has necessarily lenient standards and introduces content - not all of which can be properly checked for truth - into our mental spaces. If left unexamined, the inevitable errors in these vestibules of consciousness would ultimately continue on to contaminate our world knowledge store. So there has to be a policy of double-checking these candidate beliefs and surmisings, and the discovery and resolution of these at breakneck speed is maintained by a powerful reward system - the feeling of humor; mirth - that must support this activity in competition with all the other things you could be thinking about."

The authors draw the radical conclusion that emotions govern all our cognitive activities (see also Robert Frank's Passions within Reason: The Strategic Role of the Emotions), and that our epistemic emotions are crucial to navigating a complex world of information. We constantly have to cope with "epistemic undecidability" and problems caused by faulty assumptions and mistakes in inference. How we do this reveals a fascinating and intimate connection between humour and belief: "with no invalidated belief there can be no humor." (This leads me to speculate that the absence of gags in holy texts may be a design feature of those texts. The last thing a religious tradition wants to encourage is a cognitive system that is not only highly sensitive to false beliefs but finds their discovery funny. We know that curiosity is a scientific value, so when Hurley et al. suggest that it might also "be the analogue of lust" we can see why most church services are no laughing matter.)

The science of logic rarely features as the subject of comedy routines (even Eddie Izzard would be stretching his surreal imagination to riff on the enthymematic nature of jokes), but comedians do know what to leave out, and how to manipulate our background knowledge to get a laugh. A priest, a rabbi and a nun walk into a bar, and the bartender asks, "What is this, a joke?" We not only know about priests and rabbis and nuns, we also know about jokes about odd groups of individuals walking into bars, so "we suspend disbelief and accept that it's expectable for a priest, a rabbi, and a nun to walk into a bar together." We go along with the fiction before being brought down to earth by the punch line. Realism is not what we were expecting.

Suspending disbelief is the logical mechanism we use when we create a fictional mental space. When we are told of two muffins in an oven, and then told that the first one says, "Boy is it hot in here!", we suspend our natural scepticism (this is not, after all, how the universe works) and wait for the punch line. When this comes (the second one says, "Wow, a talking muffin!"), we feel a complex set of emotions that are a consequence of being trumped by a talking muffin who paradoxically and simultaneously confirms the validity of our initial suspension of disbelief.

We don't need to read a book to know that laughter's good for us, but just how good will be a revelation to most. Hurley, Dennett and Adams argue that the original purpose of humour was to protect us from "epistemic catastrophe" and to correct mistaken preconceptions. Humour has also "been exapted as a tool in mate selection and sexual competition, allegiance probing, belief extraction, and the building of social capital". Although "endogenous mind-candy" is a marvellous phrase, not many nutritionally valueless foodstuffs could match that impressive list of benefits: while candy may not be good for us, a sense of humour most certainly is. Whether or not this account stands up on every detail will be for others more qualified than me to judge. Indisputable, I think, is that these authors have written an important book that shows how humour is "a rich source of insights into the delicate machinery of our minds."


London Wall (Oberon Modern Plays)
by John van Druten
Edition: Paperback
Price: £9.99

1 of 1 people found the following review helpful
5.0 out of 5 stars The three ages of women, 23 April 2013
Set in 1931 in the general office of a firm of solicitors on London Wall, this remarkable play crams into the course of a couple of working days a whole series of life-changing events for several of the characters. Tightly plotted, and low key, the unfolding stories are actually rather ordinary. However, the all-too-familiar drab working environment - a plausible routine of phone calls and messages and the comings and goings of staff and clients - belies the emotional intensity that is generated. John Van Druten dramatizes the interior lives of his characters with intelligence and sympathy, taking us far beyond the four walls of the office in which they spend so much time, and into their hopes and dreams - and nightmares. As always, the test of any play is in performance, and this certainly works: I was captivated by the Finborough production earlier this year, and will soon be seeing it again.

This is a play full of women, set at a time when more women were entering the workplace but when most women could only aspire to secretarial work. The few male parts include the boss of the firm (referred to as "our lord"), who is nevertheless secondary to the real order of business, which is playing the mating game. This play turns out to be - as well as brilliant drama - a fascinating study of male and female sexual strategies, and the psychological mechanisms that each sex has evolved to help them achieve their goals. It's worth remembering that these strategies are grounded in our evolutionary history and largely transcend the social and cultural norms that so often seem to dominate. Just because we've moved beyond a view of marriage that involves the woman staying home and raising the kids doesn't mean that women no longer look for signs of commitment and resources in a man.

Part of the average male's strategy is a preference for younger women, for the simple reason that younger women are more likely to be fertile than older women. Brewer fancies himself as the office Casanova, and it's no surprise when he latches on to the newest - and youngest - female. Pat is 19, and a little naive when it comes to negotiating the attentions of the older man. She is the complete opposite of Miss Bufton. ("Flirtation's her game, and she knows all the rules... she makes them... and she sees that they're kept, too.")

Because of this particular male preference, a woman's age is a crucial factor in determining her value as a potential mate, and it's no accident that Van Druten gives the age of each character in the stage directions. Miss Janus, for example, is about 35. She faces two futures: a compromise marriage or a life of loneliness into old age. For most of the time, more or less assured she will eventually marry her Dutch diplomat, she carries herself with a self-composed confidence. Only the thought that the seven years she's invested in the relationship might be wasted breaks her composure. She cannot bear the thought of having to start over, and with good reason: she risks getting squeezed out of the mating market due to the lack of available men.

In The Evolution of Desire: Strategies of Human Mating, David Buss identifies "the sharp decline in female reproductive value with age" as being at the heart of this marriage squeeze. Over our evolutionary history, ancestral men who preferred younger women as mates and ancestral women who preferred older men with resources as mates both tended to have more children and so spread the genes responsible for these traits. We are where we are, and we shouldn't expect a few years of social reform to easily reverse a few million years of evolution.

Pat's encounter with one of the firm's clients, Miss Willesden (harmless "but definitely cracked" according to Miss Janus), is key to understanding the play's inspired handling of the sensitive question of a woman's age. (Even Shakespeare's Cleopatra, a supremely confident and powerful queen, has a hissy fit on discovering that Octavia is not yet 30.) Miss Willesden is 65, and her advice to Pat is to find a nice man and marry him, while she's young. "It doesn't last, you know." "It" is, of course, the ability to attract a high-quality male as a mate.

Buss observes that "the sadness of aging turns the youthful frustration of unrequited love into the despair of unobtainable love." Van Druten dramatizes this observation, and presents us with three ages of woman, each defined by their reproductive value: Pat (a full tank of fertility), Miss Janus (warning light flashing), Miss Willesden (crying on the hard shoulder).

Given the difficulty of assessing the quality of a male, in particular, whether or not he is capable of commitment, one female strategy is to poach such a man from within an established relationship. As Paul Seabright observes in The War of the Sexes: How Conflict and Cooperation Have Shaped Men and Women from Prehistory to the Present, "sexual relations in almost all species are clouded by the possibility that either partner might be better off with someone else, now or in the future." Miss Hooper is that someone else, the younger woman hoping a husband will get a divorce from his wife and marry her.

John Van Druten perfectly captures the universal truth that our working lives are often both unglamorous and something of a treadmill, even while our personal lives are in turmoil. But he does far more than re-create office life (or create a character, Birkinshaw, who is the original phone hacker, listening in to some of the juicier conversations). He presents us with characters and constraints that are recognizable nearly a century later: everyone's looking for love, but not everybody wants what's on offer, and some of us end up on the shelf.


Proving History: Bayes's Theorem and the Quest for the Historical Jesus
by Richard C. Carrier
Edition: Hardcover
Price: £27.00

12 of 14 people found the following review helpful
5.0 out of 5 stars Historians should be Bayesians, 4 April 2013
Bayesian reasoning is widely used in a range of scientific disciplines. According to the GP Margaret McCartney, for example, Bayesian reasoning is "a good description of how medicine is practised" (The Patient Paradox: Why Sexed Up Medicine is Bad for Your Health). Historians, however, have yet to discover the delights of Bayes's Theorem, and Richard Carrier wants that to change. In this important book Carrier pursues two related objectives: "first, to demonstrate when and why existing methods of historical reasoning are valid; and second, to provide a model of reasoning that can be directly employed in historical analysis and argument. The latter is methodological, the former is epistemological."

For those historians who baulk at what appears to be a purely scientific methodology, and one that involves a daunting-looking equation, Carrier points out that many sciences such as geology and cosmology are in fact historical, in the sense that they "explore not merely scientific generalizations but historical particulars, such as when the Big Bang occurred". Indeed, much of science involves field observations and doesn't rely on experiments. "History is thus continuous with science. The difference between them is only quantitative: history must work with much less data, of much less reliability."

Every time we say that some event is "implausible" or "unlikely" we are "covertly making a mathematical statement of probability" - whether or not that is what we think we are doing. (In Believing in Magic: Psychology of Superstition, the psychologist Stuart Vyse makes a similar point: "Much of our day-to-day thinking is quantitative, whether we are aware of it or not.") Carrier tabulates a canon of probabilities, with five percentages on either side of "even odds" that range from "virtually impossible" to "virtually certain" (both one in a million) and include "very improbable" and "very probable" (both one in twenty). Thus he links familiar verbal descriptions with their underlying numbers, emphasizing the probabilistic nature of historical knowledge.

Judgements about degrees of belief reflect the uncontroversial fact that ignorance and uncertainty are hallmarks of good scholarship in any discipline. In history, as in science, very little is known for sure, and confidence must often be measured in relative degrees of certainty. Historians learn to avoid black-and-white terms like "true" and "false" and to be comfortable with ambiguity, uncertainty and ignorance (just like scientists, according to Firestein in Ignorance: How It Drives Science). This state of affairs is tailor-made for Bayes's Theorem, which can be formulated as a theory of warrant rather than of truth: it tells us what we are warranted in believing, not what is true in any absolute sense.

One sign of poor scholarship - and a common failure of critical thinking - is eagerness to adopt a particular explanation just because it "fits the evidence". Many explanations will fit (in fact, an infinite number are logically possible), but not all are equally believable. Working out the probability that our hypothesis is true (in the language of Bayes) entails not only examining "the specific evidence that requires explanation" (which our hypothesis purports to explain), but also taking into account any relevant background knowledge ("everything all historians know or should be able to know") and all the significant alternative hypotheses. Bayes's Theorem shows that the probability that our hypothesis is true follows necessarily from four other probabilities (technically, the prior and consequent probabilities).
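
To make that structure concrete, here is a minimal sketch in Python of the calculation Carrier describes, with the prior and consequent probabilities spelled out; the helper function and all the figures are illustrative assumptions of mine, not worked examples from the book.

    # Bayes's Theorem for a historical hypothesis h given evidence e:
    #   P(h|e) = P(e|h) * P(h) / (P(e|h) * P(h) + P(e|~h) * P(~h))
    # The four probabilities on the right are the priors for the hypothesis
    # and its alternatives, and the consequents (how expected the evidence
    # is on each). All numbers below are invented for illustration.

    def posterior(prior_h, p_evidence_given_h, p_evidence_given_not_h):
        prior_not_h = 1.0 - prior_h
        numerator = p_evidence_given_h * prior_h
        denominator = numerator + p_evidence_given_not_h * prior_not_h
        return numerator / denominator

    # A hypothesis can "fit the evidence" well (0.9) and still come out
    # improbable once the prior and the rival explanations are weighed.
    print(round(posterior(0.1, 0.9, 0.3), 2))  # 0.25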

One huge advantage of Bayes over the historicity criteria that have "so dismally failed" is that the theorem has been proved and is therefore logically valid. The maths represents a logic that "models the structure of all sound historical reasoning". (Logic matters, since invalid arguments can never reliably produce true conclusions; see, for example, Rulebook for Arguments. Logic is also no respecter of departmental boundaries, even if some corners of the academy are less respectful of logic than they should be.)

In a short first chapter Carrier outlines the central problem that bedevils attempts to get to the "real" historical Jesus (the method of criteria), the consequences of the failure of this methodology (a more confused picture of Jesus), and the solution (Bayes's Theorem). A second chapter covers the basics, the set of methodological assumptions to which all historians should subscribe, and a third introduces Bayes's Theorem. The core of the book comprises two chapters on the Bayesian analysis of historical methods and criteria, in which Carrier aims to show both the limitations of some aspects of the existing methodology and the power of the Bayesian approach. For example, the criterion of coherence ("the most insidious of them all") "assumes that anything that coheres with what has been established with other criteria is also historical." The problem here is that good fiction is often just as "coherent" as historical fact. Indeed, it can be even more so, for coherence is easy to create by design, and "is just as common and expected on hypotheses of fabrication."

The most egregious example of historians using the same method on the same facts to get a whole range of different results is in Jesus studies. For Carrier, this shows that the method of criteria "is invalid and should be abandoned" and he insists that "agreement on the fundamentals of method is the first essential requirement for any community of experts to deem itself an objective profession." To this end he proposes that professional historical inquiry should be based on a set of core epistemological assumptions: the twelve axioms of historical method that "represent the epistemological foundation of rational-empirical history." For good measure, there are also twelve rules all historians should follow. Again, axioms and rules may seem alien in a subject like history, but many of these will already be familiar to historians, and to anyone with a grounding in critical thinking (for example, the eleventh axiom invalidates cherry-picking and special pleading and other abuses of logic and evidence).

In Did Jesus Exist?: The Historical Argument for Jesus of Nazareth, Bart Ehrman relies heavily on the method of criteria (for example, "the criterion of dissimilarity") to establish his conclusion that the historicist position is right and the mythicist position wrong. While Carrier does not fully address the historicity of Jesus here (he will do that in a forthcoming volume), he does undermine the consensus position: "the many contradictory versions of Jesus now confidently touted by different Jesus scholars" cannot all be true, but they can all be false. And if they are all false, does that mean there never was a historical Jesus?

Carrier offers three basic rules to laypeople who ask him what history to trust: "(1) don't believe everything you read; (2) always ask for the primary sources of a claim you find incredible; and (3) beware of scholars who make amazing claims about history but who are not experts in the period". As a lay reader, I confess that I found the book tough going in places (nearly a third is taken up with the final chapter, ominously called "The Hard Stuff"). For the professional historian, however, "it's too hard" is not an excuse we should ever hear, "because mastering difficult methods is what separates professionals from amateurs." In one important respect, however, Bayes's Theorem should make the historian's life easier, since it brings into the open unstated assumptions and so facilitates the resolution of disagreements.

Only cranks and crackpots challenge the ability of science to understand the world around us. So why is history, which is the study of the world no longer around us, so firmly rooted in the humanities? Few historians will dispute Carrier's claim that they "need solid and reliable methods" and that their arguments must be logically valid and factually sound. (Those that do dispute this claim ought to pack up and become historical novelists instead.) Although Carrier makes a good case for Bayes's Theorem to be taken seriously by historians, it remains to be seen whether they will embrace Bayes with any enthusiasm. What does seem certain is that Bayesian reasoning will help us to see more clearly into our past, and that this book will help historians see more clearly the value of Bayes's Theorem.


Eyewitness Testimony
by Elizabeth F. Loftus
Edition: Paperback
Price: £21.21

2 of 2 people found the following review helpful
5.0 out of 5 stars Memories are fragile things, 15 Mar. 2013
Verified Purchase
Over a long career, the psychologist Elizabeth Loftus has investigated memory, a subject that should be of interest to virtually everyone, or at least to anyone who has ever remembered or reflected upon any aspect of their past experiences. Her research into how memories are formed - and how they can be corrupted - has resulted in many surprising and important discoveries that should be more widely known. She is also one of those scientists who can write about her own and others' work in an engaging and accessible way for a lay audience. In this fascinating book, she begins by revealing "a long-standing concern with cases in which an innocent person has been falsely identified, convicted, and even jailed." How can such miscarriages occur? In part, they happen because eyewitness testimony is "among the most damning of all evidence that can be used in a court of law" - and yet, while often convincing, our memories are not always accurate. In short, the "unreliability of eyewitness identification evidence poses one of the most serious problems in the administration of criminal justice and civil litigation."

This concern with the legal system points up the twin themes of the book: Loftus gives an account of "the field of experimental psychology with the purpose of providing a theoretical framework in which a diverse collection of empirical findings are integrated" and she also attempts "to say how this body of research should be fitted into society as a whole, and into the legal system in particular."

One crucial empirical finding is that memory does not work like a videotape recorder. We don't passively take in information. Rather, we "take in information in bits and pieces, from different sources, at different times, and integrate this information together." There's a sense in which we actually construct memories. This raises questions about just what these sources can be if they are not to do with the event being remembered, and the vulnerability of memory to being changed in the interval between the event and the act of remembering. Indeed, Loftus describes how new "information" can invade us, like a Trojan horse, precisely because we do not detect its influence. Such "postevent information often becomes incorporated into memory, supplementing and altering a person's recollection." We can be tricked by revised data about a past experience, and this is central to understanding "the reconstructive nature of memory."

For example, in one experiment witnesses of an automobile accident who were queried with the verb "smashed" were substantially more likely to erroneously report (that is, to remember) the presence of broken glass than were subjects originally queried with the verb "hit". The wording of a question about an event can influence the answer given. The line "between valid retrieval and unconscious fabrication is easily crossed" and so investigators must be careful in their use of language and sensitive to the innumerable different ways in which a witness's answers can be influenced. The legal system has recognized this in part, for example, with its concept of a leading question.

In place of the simple idea of a memory as a once-only recording of an event is a three-stage analysis comprising acquisition, retention and retrieval ("virtually universally accepted among psychologists"). It's therefore not surprising that our memories are fragile things, and it's "important to realize how easily information can be introduced into memory, to understand why this happens, and avoid it when it is undesirable." The "commonly held belief that information, once acquired by the memory system, is unchangeable, and that errors in memory result either from an inability to find stored information... or from errors made during the original perception of the event" must be revised in light of the research findings of Elizabeth Loftus and others. This new understanding of memory is not only valuable as part of our scientific understanding of how the mind works; it is also vital in making sure that justice is done.


Christian Delusion: Why Faith Fails
by John Loftus
Edition: Paperback
Price: £16.99

7 of 10 people found the following review helpful
5.0 out of 5 stars We all know poppycock when we hear it, 27 Feb. 2013
John Loftus is an ex-preacher who has not only left his evangelical beliefs behind but is now an eloquent voice for humanism and atheism and reason. In this excellent volume, he joins eight other writers in producing fifteen chapters divided into five parts. This is a systematic and informed analysis of the Christian delusion, starting with why faith fails as a route to knowledge and ending with a rejection of the idea that society depends on Christianity. The foreword is written by another well-known ex-preacher, Dan Barker, who charted his own journey from faith to reason in Godless: How an Evangelical Preacher Became One of America's Leading Atheists. What both share with the contributors is an attitude that is characteristic of the new atheism, that, as Loftus puts it, someone "has to tell the emperor he has no clothes on."

Barker suggests that "what unites the authors of this volume is not revenge for having been victimized by the deceptions of religion, but a burning desire for actual facts." While the title of the book is provocative, the contributed essays are far from wholly negative. A typical strategy is that of David Eller, a professor of anthropology, who argues that our moral sense is grounded in the natural world and does not originate with Christianity. Eller identifies agency as the "one quality that religions seem to share" (it's certainly something that preoccupies the religious evolutionist Robert Asher in Evolution and Belief: Confessions of a Religious Paleontologist). We humans are inveterate agent-detectors, looking for will or intention or purpose or goal-oriented behaviour in each other and in the world around us, even when none exists. Eller mischievously concludes that it's "never easy to be honest with yourself about the Bible when a mind-reading god is always present."

Baruch Spinoza dismissed much of the Bible as the "uninteresting opinions of some people who lived long ago" (see Richard Popkin's The History of Scepticism from Erasmus to Spinoza) and Jason Long, in his chapter on the malleability of the human mind, suspects that "those who have been conditioned to believe in a book with a talking donkey will never actively seek out someone to challenge this position." This is what those of us who see no reason to look beyond naturalism are up against, and why Barker's emphasis on facts is important. In Ignorance: How It Drives Science, Firestein argues that accumulating facts is not what science is about. That may well be true, but facts are of course still important, especially in an area like religious belief where facts are often denied or obscured because of commitments to ancient doctrines. When Barker concludes that the "case for faith is a case for ignorance" he's not talking about the "insightful ignorance" of scientists but of all the many ways in which Christians are wrong. In short, scripture is not a reliable source of knowledge.

Richard Carrier cuts to the chase in his chapter on why the resurrection of Jesus is unbelievable, from which it follows that so is Christianity. He points out that Christians no longer believe Peter's Gospel, for many of the same reasons we no longer believe the marvels reported by Herodotus. But why then believe any of the other Gospels? How are they any less fantastic than the Gospel of Peter? Any reasonable person not already committed to the New Testament as it stands must pause for sceptical thought. The challenge to believers - formulated by Loftus in his "outsider test for faith" - is for them to examine their own religious faith with the same presupposition of scepticism they use to examine other religious faiths.

The outsider test is surprisingly devastating given its simplicity. Not being able to believe a claim just because you've read it in some ancient text pulls the rug right from under a remarkable number of Christian beliefs. Oh, but the Bible is God's word, isn't it? The three chapters of the second part refute this claim, with Paul Tobin listing several impressive deficiencies in a supposedly divine book: the canonical Bible "is inconsistent with itself" and "not supported by archaeology", and it contains "fairy tales", "failed prophecies" and "many forgeries." If Barker's right, and "Is it true?" is the most important question we can ask about any religion, then the title of this book is less polemical and more descriptive.

With contributions from other eminent atheists such as Robert M. Price (on the mythical Jesus) and Hector Avalos (on why atheism was not the cause of the Holocaust), this is a wide-ranging, well-organized and well-argued critique of Christianity. The overarching theme, as expressed by John Loftus himself, is that scepticism "is the hallmark of an adult who thinks for herself."


Evolution and Belief: Confessions of a Religious Paleontologist
by Robert J. Asher
Edition: Hardcover
Price: £19.99

11 of 19 people found the following review helpful
1.0 out of 5 stars Say no to NOMA, 11 Feb. 2013
Throughout this book, Robert Asher displays a great knowledge and understanding of palaeontology and Darwin's theory of evolution, and reveals a tremendous commitment to scientific principles of evidence and logical inference in that domain. In parts it reads more like a textbook, and a tedious one at that for a lay reader, but this is not my main objection. The book's worst fault is that it celebrates an intellectual double standard. Asher is eager "to point out that religion and science can be compatible" but he succeeds only in showing that some scientists happen to hold religious beliefs, not that religion and science are compatible at a deeper level. Indeed, he unwittingly achieves the opposite of his stated goal, by demonstrating the deeper incompatibility: on the final page, he admits to accepting "the existence of a deity behind life" on the basis of his own intuition, which he claims to be "entirely rational." Really? Entirely rational? Yes, he argues, because science is a subset of rationality. Unfortunately, logic is not on his side: the premises "science is a subset of rationality" and "intuitions are non-scientific" do not logically entail the conclusion that those intuitions must therefore be rational, let alone entirely rational. They could be irrational. Lest we forget, being rational (the root of this word is ratio) means being committed to holding beliefs in proportion to the evidence. Having faith means believing in the absence of evidence or in the face of evidence to the contrary. The compatibility Asher asserts is ultimately unconvincing because his arguments rely on one set of standards for religion and another for science, regardless of the kinds of questions each is supposed to address.

Can intuitions be "entirely rational"? In a word, no. We don't need to read Daniel Kahneman to suspect that the confidence people have in their intuitions is not a reliable guide to their validity (see Thinking, Fast and Slow). Intuitions are often correct, of course, but how do we know which ones are right? Obviously, because of the evidence, as Radcliffe Richards writes in The Sceptical Feminist. To the extent that intuition can be defended, that defence stems from reason, not from faith.

So, what is the evidence adduced by Asher in defence of his Christian beliefs? He regards the New Testament as "impressive documentation" and the historical content relating to Jesus as "honestly impressive" because these accounts "appear to have been written close to Christ's lifetime, well within range of an oral tradition based on eyewitness accounts." Untrustworthy, not impressive, is the word I would choose. As Richard Carrier writes in Proving History: Bayes's Theorem and the Quest for the Historical Jesus, "The existence of improbabilities, contradictions, propaganda, evident fictions, forgeries and interpolations, and legendary embellishments in them has been exhaustively discussed in the modern literature, and most scholars agree the Gospels contain a goodly amount of these things." We don't even know who wrote, say, the gospel according to Mark, let alone anything about this author's methods or sources or the warrants for his beliefs. Carrier points out that "rarely can we ascertain even who an author's source is, much less to which eyewitness it can ultimately be traced, and we can rarely assert someone is reliable when we don't even know who they are." These facts alone should give a sensible person (let alone a scientist trained to be sceptical of unfounded claims) pause for thought.

On the evidence of this book, Asher is an accomplished palaeontologist, and someone with a successful scientific career. He clearly uses the best scientific methods in order to construct and explain, say, the "evolutionary tree of living and fossil proboscideans calibrated to the geological timescale". As a good scientist, he also avoids appealing to a god of the gaps. What comes to his rescue as a believer is NOMA, the assertion that science and religion occupy non-overlapping magisteria "and are basically compatible with one another in the sense that they deal with fundamentally different questions." And different questions require different methods, right? We have the scientific method for scientific questions and the religious method (whatever that is) for religious questions. (To get a feel for the religious disputes over fundamental epistemic criteria, see my review of Popkin's The History of Scepticism from Erasmus to Spinoza.)

The problem for Asher is that the magisteria do not seem to be quite as non-overlapping as we are led to believe. The boundary is permeable to reason (remember, his intuition is not mystical but "entirely rational"), which is permitted entry into the religious sphere on condition that it bows the knee before faith. This is the true demarcation of science and religion, at the core of their incompatibility and of the double standard. Theology is full of reasoned arguments that ultimately depend upon unreasonable premises - such as the existence of God - that can only be held on faith. (And even if we accept the assertion that God is beyond the reach of science, whether or not, say, the resurrection took place at a certain time and at a certain place in history is an entirely scientific question, in the sense that it is decidable by appeal only to evidence and reason.)

To support his NOMA position, Asher appeals to the limitations of "methodological naturalism", which he asserts "is a rule of science that says one should not use supernatural phenomena to explain causation in the natural world." Who says this "is a rule of science"? As Dacey argues in The Secular Conscience, while "it makes sense for scientists to prefer naturalistic explanations, there are no good grounds for ruling out supernatural explanations necessarily and in principle." As Stenger points out in God and the Folly of Faith, if "the supernatural exists and has effects on the material world, then those effects are subject to scientific study." And as the author of Acts writes, the "sun shall be turned into darkness, and the moon into blood, before that great and notable day of the Lord come". If that isn't the supernatural causing events that are visible in the natural world, I don't know what is.

Asher's mistake, I think, lies in his confusing two kinds of naturalism: methodological and ontological. It ought to be obvious that "there is no way to legislate in advance what may or may not be used in our scientific exploration of the world" (Dacey). The natural forces we know make a mobile phone work would have seemed supernatural sorcery to a medieval person, and so we would not have the science behind mobile phones if science were not allowed to investigate what was thought to be supernatural. (Historically, religious opposition of this kind did frequently inhibit scientific research. See, for example, White's History of the Warfare of Science with Theology in Christendom.)

According to Asher, science is all about the "how" behind nature, not the "who" or "why", and, moreover, Christianity seems to him "a legitimate account of the agency behind life". First of all, asking simple why questions (why did the apple fall?) does not necessarily imply agency (it could be because it was windy, not that someone was shaking the branch). Secondly, as Asher himself admits, palaeontologists "consider the products of natural agents all the time" - so why couldn't science investigate the agency responsible for evolution, if there was one? The reason he doesn't ask this question is because he's already answered it: there is an agency behind evolution, and that agent is God, and not just any god but the particular god of Christianity. He then makes a fatuous analogy comparing the relationship between evolutionary biology and God to that between the lightbulb and Thomas Edison: understanding the former says nothing about the motivations of the latter. True, apart from the disanalogy: we know Thomas Edison existed and what his motivations were and that he was responsible for the lightbulb; we don't even know that God exists.

One glaring omission from the bibliography is the monumental work by Daniel Dennett, Darwin's Dangerous Idea: Evolution and the Meanings of Life, which makes a far more powerful case for cranes than Asher does for skyhooks. What this book does illustrate magnificently is how religious commitments are capable of clouding even the finest minds. Asher's equivocal attitude to reason is summed up by a couple of comments within a few lines of each other. He first makes the unexceptional claim that "scientific inquiry is limited by human rationality and our capacity to observe." Well, of course it is. Then, he refers to "the acid of rational scrutiny" and so undermines the very process by which he does his science. Not only that, he demonstrates once again the fundamental incompatibility between science and religion: while science values reason and evidence as good, faith doesn't.


The History of Scepticism from Erasmus to Spinoza
by Richard H. Popkin
Edition: Paperback
Price: £22.45

1 of 1 people found the following review helpful
5.0 out of 5 stars The deflowering of religion, 18 Jan. 2013
Contemporary scepticism is a champion of reason and almost exclusively associated with disbelief. It has come a long way from the period from 1500 to 1675 covered by Richard H. Popkin in this excellent volume. Beginning with Erasmus and ending with Spinoza, this age saw a revival of Pyrrhonism that coincided with the religious turmoil of the Reformation. The resulting intellectual fireworks were to produce, according to Popkin, "one of the crucial forces in the formation of modern thought."

Popkin begins with the origins of philosophical scepticism in ancient Greece. The "Pyrrhonians proposed to suspend judgment on all questions on which there seemed to be conflicting evidence, including the question whether or not something could be known." Proceeding with caution is sensible advice in any age; doubting the possibility of knowledge itself is a far more radical position. Today, mounting a defence of our fundamental epistemic principles is the work of philosophy (see, for example, Michael Lynch's excellent In Praise of Reason). In the early days of the Reformation, theologians were at the forefront. When Martin Luther "set forth his new criterion of religious knowledge" (that what conscience is compelled to believe on reading scripture is true), Catholics were, for once, incredulous. They had their own criterion, which had served them (if not those heretics and infidels they persecuted) well for centuries. For them, religious truth was authorized by tradition, the pope and his councils.

Given the modern meaning of scepticism, Popkin's clarification of what terms like "scepticism" and "fideism" meant in Reformation Europe is important. Fideists "are sceptics with regard to the possibility of our attaining knowledge by rational means, without our possessing some basic truths known by faith (i.e. truths based on no rational evidence whatsoever)." Far from opposing traditional religion, "the sceptics of the sixteenth and seventeenth centuries asserted, almost unanimously, that they were sincere believers in the Christian religion." This was their high point, however, since the "new machine of war" appeared to have a peculiar recoil mechanism which had the odd effect of engulfing both target and gunner in a common catastrophe. By the time we reach Baruch Spinoza (1632-77), the smoke had settled and scepticism was well on the way to becoming an Enlightenment virtue.

Before that point, the "crise pyrrhonienne" surrounding the search for certainty had to play itself out. Michel de Montaigne (1533-92), for example, saw scepticism as a cure for dogmatism: "The plague of man is the opinion of knowledge. That is why ignorance is so recommended by our religion as a quality suitable to belief and obedience." (This is not the kind of ignorance or scepticism endorsed by Stuart Firestein in Ignorance: How It Drives Science!) While René Descartes (1596-1650) failed to solve the sceptical crisis, Pierre Gassendi (1592-1655) had more success in mitigating his initial Pyrrhonism into a type of "constructive scepticism" and developing "what may be called the scientific outlook." In any case, Gassendi became "one of the major figures in the scientific revolution" by seeking to extend human knowledge through careful examination of nature. Indeed, Popkin regards Gassendi's development as illustrating "the making of the modern mind."

Once science took off and the Enlightenment got underway, it was the beginning of the end of religious scepticism. Spinoza in particular dealt a heavy blow, by rejecting scripture as a source of knowledge and reducing the Bible "to uninteresting opinions of some people who lived long ago." He challenged the status of prophecy ("one of the central religious knowledge claims on which the theological significance of the Bible rests") as a fundamental epistemic principle. By the end of the seventeenth century, the Western world had "lost its religious innocence" if not its religion. The story of how this happened is fascinating, and Richard Popkin, a scholar steeped in scepticism, has captured this dramatic intellectual history and made it accessible to the non-specialist.


The Patient Paradox: Why Sexed Up Medicine is Bad for Your Health
by Margaret McCartney
Edition: Paperback

8 of 10 people found the following review helpful
5.0 out of 5 stars False promise and unfair lure, 9 Jan. 2013
To some policymakers and to a large section of the public, screening people for diseases seems like a win-win proposition: if the test comes back positive, screening has "caught the disease early" (and we all know that's a good thing), while if it's negative, the patient at least has "peace of mind" (also marvellous). Margaret McCartney is not so sure. In fact, she's downright sceptical, and in this important book she makes an excellent case for caution, and for a better understanding of the role of screening in a modern health service. The paradox she keeps finding within the NHS is that the ill often have to be persistent and determined to get help, while those who are well are pestered into patienthood, and treated for something they'll never get.

Screening "throws Hippocrates out of the window." Harm is inevitable. However, the costs and side-effects of screening are rarely acknowledged. For most medical check-ups, "a positive test often means a problem with the test, not with the patient." Tests are not perfect, and unless the tests themselves are "tested just like any other medical intervention, you can have no idea if you are helping or harming."

To illustrate screening in the real world, McCartney uses the high-profile and emotive example of breast cancer. A test that's 80% accurate sounds pretty good, until you realize that testing positive does not mean an 80% chance you have the disease. You might test positive and not have the disease (a false positive). In fact, for a rare disease like breast cancer these false positives might mean your chance of having the disease remains in single figures.

Calculating conditional probabilities doesn't come naturally to most of us (even doctors, worryingly, aren't much better than the rest of us), and so our intuition might lead us to view mammography screening as far more useful than it actually is. Without mammography screening, a woman who walked into Dr McCartney's surgery could be told she had a 1% chance of having breast cancer (the population risk). Even after screening has returned a positive result, the chance only increases to around 8%.
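
The arithmetic behind that jump from 1% to roughly 8% is a straightforward application of Bayes's theorem. Here is a minimal sketch, assuming for illustration a sensitivity of 80% and a false positive rate of about 10% (the review itself gives only the 1% prior and the ~8% result):

    # Screening through Bayes's theorem. The 1% prevalence and the
    # single-figure outcome come from the review; the sensitivity and
    # false-positive rate are assumptions made to keep the sums concrete.
    prevalence = 0.01           # population risk before any screening
    sensitivity = 0.80          # P(positive test | cancer)
    false_positive_rate = 0.10  # P(positive test | no cancer)

    p_positive = (sensitivity * prevalence
                  + false_positive_rate * (1 - prevalence))
    p_cancer_given_positive = sensitivity * prevalence / p_positive
    print(f"{p_cancer_given_positive:.1%}")  # about 7.5% - still single figures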

This modest rise might still sound worth having, all else equal. The problem is, of course, all else is not equal. What happens to all those women with false positives? This is where we need to be careful with language. A woman who returns to the surgery for the results of the test will only learn she has tested positive, not that she is one of the lucky ones who has a "false positive". At this stage, no one knows. Without further tests, out of all the women who test positive we simply do not know who has cancer and who hasn't. We only know the statistical proportions. To increase our knowledge in any individual case, further tests are needed, and this is where further harm creeps in (on top of the distress of being told you've tested positive for cancer).

And these further tests themselves are not guaranteed to produce an absolutely definite diagnosis. The specialist field of pathology is often described as the "gold standard" of diagnosis and yet pathologists are still not capable of producing clear answers to every question a patient or doctor might ask. There is an inherent uncertainty to diagnosis. Indeed, "we are all, doctors and patients alike, guddling around in uncertainty" (incidentally, a nice example of McCartney's style, drawing upon dialect that I guess is more familiar in Glasgow than in the Home Counties).

Handling uncertainty and ignorance is an essential part of the scientific method (see Stuart Firestein's Ignorance: How It Drives Science), but, in wider society, these are more likely to be feared than understood. McCartney includes a useful section on Bayesian reasoning ("a good description of how medicine is practised"), which captures the way diagnosis "is instantly responsive, malleable and intelligent to new information." (For an excellent book on the importance of Bayes's theorem in the study of history, see Richard Carrier's Proving History: Bayes's Theorem and the Quest for the Historical Jesus.) For example, a man may present with a whole list of symptoms, none seeming terribly significant, until he reports waking up with chest pain. That little bit of information "trumps everything else" and makes her dial 999 right away.
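
To see why one finding can "trump everything else", it helps to think in odds: each new piece of information multiplies the current odds by a likelihood ratio, and a large enough ratio swamps everything that came before. A rough sketch, with all the ratios invented for illustration rather than taken from the book:

    # Sequential Bayesian updating in odds form:
    #   posterior odds = prior odds * likelihood ratio of the new finding.
    # The likelihood ratios below are illustrative assumptions.

    def update(probability, likelihood_ratio):
        odds = probability / (1 - probability)
        odds *= likelihood_ratio
        return odds / (1 + odds)

    p = 0.02                 # prior probability of a cardiac cause
    p = update(p, 1.2)       # vague tiredness: barely moves the estimate
    p = update(p, 0.9)       # unremarkable examination: nudges it down
    p = update(p, 20.0)      # waking with chest pain: swamps everything else
    print(f"{p:.0%}")        # roughly 31% - enough to dial 999 right away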

Although we may not always be wise to its "false promise and unfair lure", we've come to expect that advertising is not entirely geared to our best interests. We suspect there are commercial reasons why women, for example, are encouraged to feel there is a special cream that will hold back the signs of ageing, whatever their age. If women feel, however vaguely, that ageing is also a medical condition in need of treatment, so much the better for sales. What is surprising and disturbing to learn from reading this book is that we may not be entering an entirely different world when we step into a GP's surgery, where clinical and not commercial principles are supposed to hold sway. As with advertising, hard-sell, sexed-up medicine does not give us a free choice, and because "the information we are given is chosen to lead us, not inform us, we don't get fair information."

Margaret McCartney has experience as both a GP and as an ordinary citizen on the receiving end of letters from the health authority hectoring her into having various check-ups. She's concerned that we're being "sold screening MOTs as though there were no downside to consider" and that, as a doctor, her job is no longer to make people better but "to find out what risk factors for disease they might have or could have, despite their feeling well and having no complaints at all." Few of us will be lucky enough to have Dr McCartney as our GP, someone who's willing to take "informed consent" seriously. All of us, however, can read her book.


The Mating Mind: How sexual choice shaped the evolution of human nature
by Geoffrey Miller
Edition: Hardcover

5.0 out of 5 stars The wit to woo, 10 Dec. 2012
Verified Purchase
Charles Darwin gave biology two equally potent theories - natural selection and sexual selection - which fared very differently in the century after his death. The former eventually became established at the heart of the modern evolutionary synthesis. The latter was, if not exactly forgotten, then at least sidelined. In this tremendous book, Geoffrey Miller argues for the importance of "sexual selection through mate choice" as an evolutionary force, and he presents one way (out of the many possible) "to apply sexual selection theory in evolutionary psychology".

Miller begins with a simple and yet far-reaching logical point: each one of our ancestors managed not just to survive "but to convince at least one sexual partner to have enough sex to produce offspring." Anyone who didn't "attract sexual interest did not become our ancestors, no matter how good they were at surviving." Darwin realized this, and argued that evolution is driven by both natural selection (arising through competition for survival) and sexual selection (arising through competition for reproduction). The peacock's tail, for example, "makes no sense as an adaptation for survival, but it makes perfect sense as an adaptation for courtship." Without courtship, there is less chance of sexual reproduction, and without sexual reproduction there is no chance of inheritance. The concept of sexual selection shows how differences in reproductive success can lead to evolutionary change.

Sexually attractive biological "luxuries" such as the peacock's tail are not isolated quirks of the living world that can safely be ignored. They're everywhere and they're not biological accidents. A peacock in poor condition or with poor genes will probably not have as glorious a tail as a healthy peacock with good genes. Even for a healthy bird, growing such a tail is costly, so why do it? Key to the adaptive rationale is the handicap principle: a costly ornament with no survival value can still function as an indicator of fitness, precisely because it is costly and hard to fake. The owner of that ornament will therefore be more likely to attract a mate and to pass on those genes responsible for the ornament into future generations.

Is the same true of, say, art? Many evolutionary biologists have thought of art as an accidental by-product of adaptations that were useful for survival (Stephen Jay Gould used the image of the spandrels of San Marco to make this point). Like growing a splendid tail, however, "artistic production entails effort" and biological systems don't like wasting energy. Since art has been ubiquitous, pleasurable and costly throughout human history, is there an evolutionary explanation?

The stumbling block, of course, is our understanding of the mind, which both produces and appreciates a whole range of things "that human minds are uniquely good at, such as humor, story-telling, gossip, art, music, self-consciousness, ornate language, imaginative ideologies, religion, and morality" and for which there were originally no plausible survival payoffs. If natural selection can't explain these features, what can? As "a committed Darwinian" Miller isn't going to give up on evolution so easily, but first he must tackle the bias of looking at the mind as a calculating machine. He drily notes that the mind-as-computer is a terrific metaphor so long as "you ignore most of human life" and are not interested in questions of emotion, creativity, social interaction, culture, status, and so on. He suggests instead "the metaphor of the mind as a sexually selected entertainment system" that evolved to stimulate other minds.

"Our minds are entertaining, intelligent, creative, and articulate far beyond the demands of surviving on the plains of Pleistocene Africa." How did they get that way? Once we swing the spotlight from natural to sexual selection, we can see how "sexual selection through mate choice can be much more intelligent than natural selection." While habitat is inanimate, animals aren't (the clue's in the word). In particular, animals can have very strong interests when it comes to choosing a mate, and evolution by means of sexual selection will help them choose partners who carry good genes.

A major theme of the book is that "thought itself became subject to sexual selection" once language evolved. "Through language, and other new forms of expression such as art and music, our ancestors could act more like psychologists - in addition to acting like beauty contest judges - when choosing mates. During human evolution, sexual selection seems to have shifted its primary target from body to mind." On this account, human courtship is not something quaint and old-fashioned, involving decorously arranged meetings in the presence of a chaperone. It is the main driving force of human evolution, encouraging individuals to excel in any way they can in order to entertain a potential sexual partner.

Miller acknowledges that there "is much more to modern human social life than courtship, and much more to people than their fitness indicators." Mate choice is intrinsically discriminatory and judgemental, "built to rank potential mates by reducing their rich subjectivity to a crass list of physical, mental, and social features." However, the better we understand these instincts, "the easier they may be to override when they are socially inappropriate." Literature is littered with tales of romantic entanglements, and, for me, understanding courtship from this biological point of view and teasing out the evolutionary narrative combine to add another fascinating dimension to great works of art.

This book is a celebration of the "element of frivolity" that sexual selection has introduced into the cosmos. Humour - "the wit to woo" - is one of its most delightful products, and one of the many ways in which humans have displayed their creative intelligence throughout evolutionary history. Darwin's great sexual selection idea explains three enigmas: "the ubiquity across many species of ornaments that do not help survival, sex differences within species, and rapid evolutionary divergence between species." Focusing on survival value alone was "arguably the most typical mistake of 20th-century theorizing about human evolution." It's a mistake Miller does not repeat.


Ignorance: How It Drives Science
by Stuart Firestein
Edition: Hardcover
Price: £13.83

6 of 6 people found the following review helpful
5.0 out of 5 stars Time to get out the matches, 16 Nov. 2012
W. B. Yeats admonished that "education is not the filling of a pail, but the lighting of a fire." Stuart Firestein agrees, and in this marvellous book he argues that science is less about accumulating facts and rules and more like looking for "black cats in dark rooms." The scientific process is not a tidy logical procession from one grand truth to the next. It's "mostly stumbling about in the dark", "bumping into unidentifiable things, looking for barely perceptible phantoms". In short, it's about dealing with ignorance.

This isn't the view held by most non-scientists, who for the most part subscribe to the popular image of the scientist as brainy or a boffin, not as a fount of ignorance. It's true that a professional scientist, like any professional, knows an awful lot. Knowing everything is of course impossible, and, anyway, knowing lots of facts "does not automatically make you a scientist, just a geek." Firestein argues that science is different in that the facts "serve mainly to access the ignorance" and to frame new questions. Scientists concentrate on what they don't know, and "science traffics in ignorance, cultivates it, and is driven by it."

Firestein is not talking about ignorance in the pejorative sense. He's interested in "knowledgeable ignorance, perceptive ignorance, insightful ignorance" - the kind that "leads us to frame better questions, the first step to getting better answers." His big claim is that it's "the most important resource" scientists have, and using it correctly is "the most important thing a scientist does."

Scientists love questions. Naturally, we should guard against a simple-minded idea that asking a few questions (especially the so-called "big" ones), any more than knowing a few facts, is all there is to being a scientist. As Michael Lynch warns (In Praise of Reason, page 84), carried to its extreme a sceptic is someone who only questions and never commits, which is no way either to live a life or to do science. It's not just questions, but questions rightly asked that are important. (Lynch is discussing W.K. Clifford's great essay on the Ethics of Belief, which is collected in The Ethics of Belief and Other Essays (Great Books in Philosophy).) I think Firestein would agree with Clifford that testing and open enquiry are what really matter.

One of the virtues of this book is its brevity. Almost half the book is taken up by a single chapter on four case histories, including current research on consciousness and the question of whether or not animals think. He finishes this chapter with a fascinating autobiographical section, outlining his own adventures in neuroscience. A section on suggested further reading includes useful single-paragraph summaries of the books he's recommending.

Given the vastness of this subject, it would be unfair to criticize Firestein for something he's left out. However, I happen to be reading Popkin's The History of Scepticism from Erasmus to Spinoza, which explores a strand of anti-intellectualism that began in Reformation Europe and has paralleled and plagued science ever since. Popkin quotes Michel de Montaigne: "The plague of man is the opinion of knowledge. That is why ignorance is so recommended by our religion as a quality suitable to belief and obedience." This is not the kind of ignorance recommended by Firestein, since it is associated with "the imbecility of human reason". The religious view is to trust in God to supply the revealed truth and that "man is safe in his total natural ignorance."

We need to be careful when celebrating ignorance not to endorse such views. Firestein is a robust defender of reason and he knows the difference between "dumb and ignorant". However, I think he goes too far in claiming that the single thing all scientists know about facts is that they're unreliable and that nothing "is safe from the next generation of scientists with the next generation of tools." For the word "fact" to have any serious meaning it cannot be subject to this kind of continual revision. (In Uncommon Sense: Heretical Nature of Science, Alan Cromer makes a good case for certainty in science.)

Firestein "came to science late, after a career in, of all things, the theater" and his probably unique career trajectory into neuroscience is in itself remarkable. Most scientists will welcome the idea that science has as much "excitement and creativity" as can be found in the arts, and that "[m]ucking about in the unknown is an adventure". Fewer may appreciate his argument that grant applications are a good thing (even the President of the Royal Society thinks there should be longer intervals between having to fill in all those forms). Firestein recognizes that every scientist spends a significant amount of time writing - and complaining about writing - grants, but he argues that this can also be seen as an exercise in defining ignorance, a core part of the job of being a scientist. Firestein's advice to a scientist about to sit down and write a grant application? "Imagine being awarded a prize for what you don't know"!

