Reviews
How the Secular World Began
Lucretius' poem, recovered in 1417, described atomic theory, chance's role in nature and the gods' indifference.

We think of the Renaissance as a time of unsurpassed artistic accomplishment, but it was also a time of unrelenting strife. In the 14th century, the city-states of Italy were in constant turmoil; Milan warred against Venice, Florence against Rome. This was the period of the "Babylonian Captivity" of the church, when the papacy was established in Avignon, where it became a satellite of the French king. Central religious authority so disintegrated that eventually no fewer than three popes would simultaneously claim the chair of St. Peter. Reformers to the north began denouncing a church whose clergy and hierarchy had grown ever more corrupt and licentious.
In this age of chaos and confusion, a small band of learned men, the Renaissance humanists, clung to a vision of the past that seemed to hold out some hope of tranquility. Antiquity offered models of noble conduct, but they survived only in broken statues or in the mildewed pages of manuscripts hidden for centuries in remote monasteries. To locate and copy manuscripts that contained the voices of the vanished past in all their unsurpassed eloquence became, for some scholars, an overmastering obsession.
The remarkable Poggio Bracciolini (1380-1459) is hardly remembered today, even in his Tuscan hometown of Terranuova. Yet his discoveries led, by slow, almost imperceptible steps, to a revolution in Western thought. In "The Swerve," Shakespearean scholar Stephen Greenblatt traces Poggio's extraordinary career and legacy. He served as apostolic secretary to one of the three rival claimants to the papacy, John XXIII. He was a calligrapher of genius, creating elegant scripts (beautifully illustrated in one of Mr. Greenblatt's well-chosen plates). And he was a book-hunter of the most dogged sort.
Aided by his official position, Poggio used every skill at his command—diplomatic suavity, flattery, displays of erudition—to talk his way into otherwise inaccessible monastic libraries. Underlying his quest was a conviction that the good order of the world and of society depended upon a disciplined reverence for language, specifically the Latin language; true eloquence had a moral foundation. Though Poggio was responsible for many priceless finds—the works of Vitruvius and Quintilian, the letters of Cicero—it was thanks to his discovery in 1417 of the sole surviving manuscript of the Roman poet Lucretius' "On the Nature of Things" that, in Mr. Greenblatt's phrase, "the world swerved in a new direction."
Holy War: Why Vasco da Gama Went to India
The Portuguese navigator Vasco da Gama set sail from Belém, a village at the mouth of the Tagus River now part of greater Lisbon, on July 8, 1497. An obscure but well-connected courtier, he had been chosen, much to everyone’s surprise, by King Manuel I to head the ambitious expedition to chart a new route to India. The king was not moved chiefly by a desire for plunder. He possessed a visionary cast of mind bordering on derangement; he saw himself spearheading a holy war to topple Islam, recover Jerusalem from "the infidels" and establish himself as the "King of Jerusalem."
Da Gama shared these dreams, but like his hard-bitten crew, rogues or criminals to a man, he coveted the fabled riches of the East — not only gold and gems but spices, then the most precious of commodities. On this voyage, as on his two later ones, he proved a brilliant navigator and commander. But where courage could not bring him through violent storms, contrary seas and the machinations of hostile rulers, luck came to his rescue. He sailed blindly, virtually by instinct, without maps, charts or reliable pilots, into unknown oceans.
As Nigel Cliff, a historian and journalist, demonstrates in his lively and ambitious "Holy War," da Gama was abetted as much by ignorance as by skill and daring. To discover the sea route to India, he deliberately set his course in a different direction from Columbus, his great seafaring rival. Instead of heading west, da Gama went south. His ships inched their way down the African coast, voyaging thousands of miles farther than any previous explorer. After months of sailing, he rounded the Cape of Good Hope. From there, creeping up the east coast of Africa, he embarked on the uncharted vastness of the Indian Ocean. Uncharted, that is, by European navigators. For at the time, the Indian Ocean was crisscrossed by Muslim vessels, and it was Muslim merchants, backed up by powerful local rulers, who controlled the trade routes and had done so for centuries. Da Gama sought to break this maritime dominance; even stronger was his ambition to discover the Christians of India and their "long-lost Christian king," the legendary Prester John, and by forging an alliance with them, to unite Christianity and destroy Islam.
What Physics Owes the Counterculture
"What the Bleep Do We Know!?," a spaced-out concoction of quasi physics and neuroscience that appeared several years ago, promised moviegoers that they could hop between parallel universes and leap back and forth in time — if only they cast off their mental filters and experienced reality full blast. Interviews of scientists were crosscut with those of self-proclaimed mystics, and swooping in to explain the physics was Dr. Quantum, a cartoon superhero who joyfully demonstrated concepts like wave-particle duality, extra dimensions and quantum entanglement. Wiggling his eyebrows, the good doctor ominously asked, "Are we far enough down the rabbit hole yet?" All that was missing was Grace Slick wailing in the background with Jorma Kaukonen on guitar.
Dr. Quantum was a cartoon rendition of Fred Alan Wolf, who resigned from the physics faculty at San Diego State College in the mid-1970s to become a New Age vaudevillian, combining motivational speaking, quantum weirdness and magic tricks in an act that opened several times for Timothy Leary. By then Wolf was running with the Fundamental Fysiks Group, a Bay Area collective driven by the notion that quantum mechanics, maybe with the help of a little LSD, could be harnessed to convey psychic powers. Concentrate hard enough and perhaps you really could levitate the Pentagon.
In "How the Hippies Saved Physics: Science, Counterculture, and the Quantum Revival," David Kaiser, an associate professor at the Massachusetts Institute of Technology, turns to those wild days in the waning years of the Vietnam War when anything seemed possible: communal marriage, living off the land, bringing down the military with flower power. Why not faster-than-light communication, in which a message arrives before it is sent, overthrowing the tyranny of that pig, Father Time?
That was the obsession of Jack Sarfatti, another member of the group. Sarfatti was Wolf’s colleague and roommate in San Diego, and in a pivotal moment in Kaiser’s tale they find themselves in the lobby of the Ritz Hotel in Paris talking to Werner Erhard, the creepy human potential movement guru, who decided to invest in their quantum ventures. Sarfatti was at least as good a salesman as he was a physicist, wooing wealthy eccentrics from his den at Caffe Trieste in the North Beach section of San Francisco.
Other, overlapping efforts like the Consciousness Theory Group and the Physics/Consciousness Research Group were part of the scene, and before long Sarfatti, Wolf and their cohort were conducting annual physics and consciousness workshops at the Esalen Institute in Big Sur.
The Final Testament of the Holy Bible is shocking. Shockingly bad, that is.
The problem with James Frey's book isn't blasphemy per se. Good blasphemy, unlike this adolescent theology, is valuable.

Blasphemy is in the news again, and this time it has nothing to do with the Qur'an or the prophet Muhammad. The novelist James Frey has written a new life of Jesus, The Final Testament of the Holy Bible. It is set in contemporary New York, where a Jesus-figure, Ben, comes back among the city's lowlife, as lowlife. His message is the old hippy one – love, love, love – which he pursues in very practical ways. He makes love to almost everyone he meets – women, men, drug addicts, priests. Hence the blasphemy.
Or at least, that is what the publishers are hoping. On the cover, in bold, we are told that this is Frey's most revolutionary and controversial work. "Be moved, be enraged, be enthralled by this extraordinary masterpiece," it screams in uppercase letters.
I hope people don't rise to the bait. The book is more ludicrous than scandalous. The rabbit-like lovemaking is accompanied by dialogue of the "we-screwed-until-dawn-and-it-was-like-being-joined-with-the-cosmos" type. And then there's the adolescent protest theology. Religion is responsible for all ills everywhere, Ben solemnly informs us. The Bible is a stone age sci-fi text. God is no more believable than fairies. Faith is just an excuse to oppress.
That said, the book did set me thinking about blasphemy. For it seems to me that there is good blasphemy and bad blasphemy. Good blasphemy is worth studying, whereas bad blasphemy is not. Good blasphemy conveys ethical and theological insights, whereas bad blasphemy is simply about complaint and shock. Both kinds of blasphemy might be published, but only the good type is worth spending time on. (It's a shame when bad blasphemy upsets believers and gains press coverage that encourages others to react to it.)
I was myself involved in a blasphemy case, one of the last to be investigated by the police before changes in British law. We'd published a banned poem, The Love that Dares to Speak its Name by James Kirkup. It strikes me now that while there were important principles of free speech to defend in the case, the poem itself is an example of bad blasphemy. It features a Roman centurion having sex with Jesus after his crucifixion, and is naive and clumsy, replete with bad puns about Jesus being "well hung". Aesthetically it's inept, ethically it's simplistic, theologically it's crass.
Hallelujah! At Age 400, King James Bible Still Reigns

This year, the most influential book you may never have read is celebrating a major birthday. The King James Version of the Bible was published 400 years ago. It's no longer the top-selling Bible, but in those four centuries, it has woven itself deeply into our speech and culture.
Let's travel back to 1603: King James I, who had ruled Scotland, ascended to the throne of England. What he found was a country suspicious of the new king.
"He was regarded as a foreigner," says Gordon Campbell, a historian at the University of Leicester in England. "He spoke with a heavy Scottish accent, and one of the things he needed to legitimize himself as head of the Church of England was a Bible dedicated to him."
At that time, England was in a Bible war between two English translations. The Bishops' Bible was read in churches: It was clunky, inelegant. The Geneva Bible was the choice of the Puritans and the people: It was bolder, more accessible.
"The problem with the Geneva Bible was it had marginal notes," says David Lyle Jeffrey, a historian of biblical interpretation at Baylor University. "And from the point of view of the royalists, and especially King James I, these marginal comments often did not pay sufficient respect to the idea of the divine right of kings."
Those notes referred to kings as tyrants, they challenged regal authority, and King James wanted them gone. So he hatched an idea: Bring the bishops and the Puritans together, ostensibly to work out their differences about church liturgy. His true goal was to maneuver them into proposing a new Bible. His plans fell into place after he refused every demand of the Puritans to simplify the liturgy, and they finally suggested a new translation. With that, James commissioned a new Bible without those seditious notes. Forty-seven scholars and theologians worked through the Bible line by line for seven years.
Richard Feynman, the Thinker
In the heyday of the physicist Richard P. Feynman, which followed his death in 1988, a publishing entrepreneur might have been tempted to start a book club of works by and about him. Offered as main selections would be Feynman’s autobiographical rambles (as told to Ralph Leighton), " ‘Surely You’re Joking, Mr. Feynman!’" and " ‘What Do You Care What Other People Think?’" For alternate selections, readers could choose from his more serious works, like "QED: The Strange Theory of Light and Matter" — a spirited account of the counterintuitive behavior of the quantum world — and the legendary "Feynman Lectures on Physics." Whatever the man said had swagger. For those who would rather listen, there are recordings of the lectures and of Feynman playing his bongos.
He was an irresistible subject for biographers and, as he called himself in two of his subtitles, a curious character indeed. The best biography, James Gleick’s "Genius," captured the ebullience — sometimes winning, sometimes exasperating — and gave lucid explanations of some hard physics. Those seeking a more mathematical treatment could turn to Jagdish Mehra’s thick book "The Beat of a Different Drum," while for a lighter touch there was Christopher Sykes’s "No Ordinary Genius: The Illustrated Richard Feynman."
It is hard to imagine that the world needs another Feynman biography, but here it is. In "Quantum Man: Richard Feynman’s Life in Science," Lawrence M. Krauss, director of the Origins Project at Arizona State University, makes his own way through the subject and emerges with an enlightening addition to the field. Krauss — like Feynman a physicist as well as an author — has written seven books, including "The Physics of Star Trek." Though he couldn’t resist recycling some well-worn Feynman anecdotes (and providing a couple of his own), he concentrates on Feynman the thinker, and on the contributions that merited his fame.
It is not something easily summarized. Einstein discovered relativity. Murray Gell-Mann discovered the quark. And Feynman? Well, there was that thing he did on TV with the O-ring and the ice water, showing why the Challenger had disintegrated. But to physicists he is famous for something more obscure: cleaning up the mathematical mess known as quantum electrodynamics — an ambitious attempt to explain light and matter using two great theories, quantum mechanics and special relativity.
Review: First Contact
Scientific Breakthroughs in the Hunt for Life Beyond Earth by Marc Kaufman
It wasn’t that long ago that the field of astrobiology—the search for life beyond Earth—operated on the fringes of scientific endeavor, research many explicitly avoided being identified with, especially those seeking government grants or academic tenure. That’s changed, though, as scientists have both discovered life in increasingly extreme environments on Earth and identified locales beyond Earth, including beyond our solar system, that may be hospitable to life. There are now astrobiology conferences, astrobiology journals, and even a NASA Astrobiology Institute. It’s in that environment of increased acceptance that Marc Kaufman surveys the state of astrobiology’s quest to discover life elsewhere in the universe in First Contact: Scientific Breakthroughs in the Hunt for Life Beyond Earth.
Kaufman, a reporter for the Washington Post, covers a lot of ground in First Contact, both intellectually and geographically. For the book he traveled from mines in South Africa where scientists look for extremophiles living deep below the surface, to observatories in Chile and Australia where astronomers search for methane on Mars and planets around other stars, to conferences from San Diego to Rome where researchers discuss their studies. He uses this travel to provide a first-person account of the state of astrobiology today, neatly condensed into about 200 pages. It’s a good overview for those not familiar with astrobiology, although those who have been following the field, or at least some aspects of it, will likely desire more details than what’s included in this slender book.
One interesting portion of the book looks at those people who still work in—or have been exiled to—the fringes of astrobiology even as the field gains wider acceptance. "[P]erhaps because of its urge for legitimacy, or because the discipline itself so often enters terra incognita, astrobiology has shown a consistent need to enforce a consensus," casting aside those who differ, Kaufman writes. These people include Gil Levin, who argues the Labeled Release experiment on the Viking landers did, in fact, detect life on Mars; David McKay, who led the team that discovered what they still believe is evidence of life in Martian meteorite ALH 84001; and Richard Hoover, who claims to see similar evidence for life in other meteorites. All three get fair, if somewhat sympathetic, profiles in one chapter of the book; Kaufman goes so far as to lament that the three, presenting in the same session of a conference, draw only a handful of people—never mind that they’re speaking at a conference run by SPIE, an organization better known for optics and related technologies. (Since the book went to press, Hoover has gained notoriety for publishing a paper on his meteorite life claims in the quixotic, controversial Journal of Cosmology; the paper generated some media attention but was widely dismissed as offering nothing new to support those claims.)
Rob Bell's Intervention in the Often Ugly World of American Evangelicalism
In its treatment of hell, the pastor's book holds two Christian truths in tension: human freedom and God's infinite love.

The question: Who is in hell?
I met Rob Bell at Greenbelt, a couple of years back, because we happened to be staying in the same hotel, though at first I didn't know who he was. Rather, I saw him coming. He was dressed head to foot in black and was accompanied by three other chaps, similarly clad, carrying those impressive silver cases that speak of expensive, hi-tech gear. Then, later, I saw the long queues for his events, which were heavily oversubscribed. I made the link with the inclusive megachurch American pastor who was topping the bill.
He draws congregations numbered in the tens of thousands. And now his new book, Love Wins, has achieved the ultimate accolade. A clever marketing campaign led to a top 10 Twitter trend at the end of February. Evangelicals, even liberal ones, believe the Word changes everything, and so they take words very seriously. They are entirely at home in the wordy, online age.
The row on Twitter is to do with the content of the book, or at least what a number of conservative megachurch detractors assumed to be the content. It's to do with universalism – the long debate in Christianity about whether everyone is eventually saved by Jesus, or whether only an elect make it through the pearly gates. Bell's opponents assume that he is peddling the message that when the great separation comes, between the sheep and the goats, there won't be any going into the pen marked "damnation".
From this side of the pond, it all feels very American, one of those things that makes you realise that the US is a foreign country after all. I'm sure that some British evangelicals debate the extent of the saviour's favour too, only they are also inheritors of the Elizabethan attitude about being wary of making windows into other people's souls. "Turn or burn!" works in South Carolina, not the home counties. (Then again, I was recently in a debate with someone who claimed to know Jesus better than his wife. I wondered whether his wife knew.)
Counting Down to Nuclear War
Every thousand years a doomsday threatens. Early Christians believed the day of judgment was nigh. Medieval millenarians expected the world to end in 1000 A.D. These fatal termini would have required supernatural intervention; in the old days humans lacked sufficient means to destroy themselves. The discovery, in 1938, of how to release nuclear energy changed all that. Humankind acquired the means of its own destruction. Even were we to succeed in eliminating our weapons of doomsday — one subject of "How the End Begins" — we would still know how to build them. From our contemporary double millennium forward, the essential challenge confronting our species will remain how to avoid destroying the human world.
Ron Rosenbaum is an author who likes to ask inconvenient questions. He has untombed the secrets of the Yale secret society Skull and Bones, tumbled among contending Shakespeare scholars and rappelled into the bottomless darkness of Adolf Hitler’s evil. But nothing has engaged his attention more fervently than doomsdays real or threatening, especially the Holocaust and nuclear war. Both catastrophes ominously interlink here.
The book wanders before it settles down. Rosenbaum speculates on the risk Israel took in 2007 when it bombed a secret Syrian nuclear reactor, a pre-emptive strategy Israel has followed since it destroyed an Iraqi reactor in 1981. He cites a quotation in The Spectator of London from a "very senior British ministerial source" who claimed that we came close to "World War III" the day of the attack on Syria. With no more information than the minister’s claim, the best Rosenbaum can say is that "it was not inconceivable."
He goes on to review the many close calls of the cold war, the continuing interception of Russian bombers by United States and NATO fighter aircraft, the negligent loading of six nuclear-armed cruise missiles onto a B-52 in August 2007 and their unrecognized transport across the United States. He works his way through these and similar incidents as if they prove much beyond the vulnerability of all man-made systems to accident, inadvertence and misuse.
Thinking the Unthinkable Again in a Nuclear Age
In his previous books the journalist Ron Rosenbaum has tackled big topics — Hitler’s evil, Shakespeare’s genius — with acuity and irreverence, believing, correctly, that some things are too important to leave to the experts. He’s proud of his gonzo amateur status, so much so that you half suspect he has a scarlet "A" tattooed across his chest, where Superman wore his "S."
Mr. Rosenbaum’s books are both profound and excitable. They resemble grad school seminars that have been hijacked by the sardonic kid in the back, the one with the black sweater and nicotine-stained fingers. Mr. Rosenbaum sometimes writes as if he were pacing the seminar room floor, scanning for sharp new ideas. At other times, it’s as if he were passing around red wine and a hookah, seeking to conjure deep, mellow, cosmic thoughts. He’s pretty good in both modes.
His new book, "How the End Begins: The Road to a Nuclear World War III," gives us both Mr. Rosenbaums, for better and occasionally for worse. This book is a wide-angle and quite dire meditation on our nuclear present; Mr. Rosenbaum is convincingly fearful about where humanity stands.
"I hate to be the bearer of bad news," he declares, "but we will all have to think about the unthinkable again." Our holiday from history is over.
Mr. Rosenbaum charts the likely origins of a nuclear war in the short term, probably in the Middle East (where Israelis fear a second Holocaust, this time a nuclear one) or Pakistan (where stray nukes may yet land in the hands of Islamists) or in the almost marital tensions and miscommunications between the United States and Russia. He pursues thorny moral questions, including this one about nuclear retaliation: "Would it be justice, vengeance or pure genocide to strike back" once our country had been comprehensively bombed?
"How the End Begins" is grim enough that, by its conclusion, you may feel like shopping for — depending on your temperament — either shotgun shells or the kind of suicide pills that spies were said to have sewn into their clothes when dropped behind enemy lines.
The Gods Return
A solution to the 'lostness' of the modern world

For the ancient Greeks, the universe was filled with shining presences that reflected the passing moods of the gods. Each god had a privileged sphere: Ares presided over war, Aphrodite over sexual love, Hestia over the hearth; Zeus was the lord of all the gods, the hurler of thunderbolts from on high; and so on for the entire rowdy pantheon. It has long been customary to dismiss these gods as mere stage presences, as convenient explanations for random catastrophes or as fall guys for human motives. Helen caused the Trojan War but, hey, "golden Aphrodite" made her do it.
The authors of "All Things Shining" will have none of this. Though both are professional philosophers—Hubert Dreyfus at Berkeley and Sean Dorrance Kelly at Harvard—they view the ancient Homeric gods as hidden presences still susceptible of invocation. Indeed, they hold out "Homeric polytheism" as a solution to the "lostness" of the contemporary world. They begin by stating that "the world doesn't matter to us the way it used to." They are concerned to elucidate why this may be so. But this is no bland academic exercise. "All Things Shining" is an inspirational book but a highly intelligent and impassioned one. The authors set out to analyze our contemporary nihilism the better to remedy it.
And they provide, really, several books in one. "All Things Shining" provides a concise history of Western thought, beginning with Homer and concluding with Descartes and Kant. But there are extended discussions as well of such contemporary authors as the late David Foster Wallace and, even more startling, of "chick lit" novelist Elizabeth Gilbert. Among much else, the authors contrast Wallace's fraught, "radical" idea of individual freedom and autonomy to Ms. Gilbert's pre-Renaissance view of creativity, according to which "writing well" happens, as the authors put it, "when the gods of writing shine upon her."
A discussion of Odysseus's encounter with his wife Penelope's suitors—who try to kill him at the end of "The Odyssey"—is illumined by a comparison with the scene in the movie "Pulp Fiction" in which Jules and Vincent argue over whether their close call with a gunman was a miracle or merely a statistical probability. In the book's impressive chapter on Melville and "Moby Dick," the authors argue that the wildly changeable tones and shifts of subject matter in the novel are a key to its meaning. To take our moods seriously—"both our highest, soaring joys and our deepest, darkest descents"—is, for Melville, "to be open to the manifold truths our moods reveal."
The authors' general theme, and lament, is that we are no longer "open to the world." We fall prey either to "manufactured confidence" that sweeps aside all obstacles or to a kind of addictive passivity, typified by "blogs and social networking sites." Both are equally unperceptive. By contrast, the Homeric hero is keenly aware of the outside world; indeed, he has no interior life at all. His emotions are public, and they are shared; he lives in a community of attentiveness. He aspires to what in Greek is termed "areté," not "virtue," as it is usually translated, but that peculiar "excellence" that comes from acting in accord with the divine presence, however it may manifest itself.
The Pontiff Speaks
Benedict sits down for several hours of conversation with a journalist.

"The monarchy's mystery is its life," the English writer Walter Bagehot wrote in 1867. "We must not let in daylight upon the magic." A turning point in the history of the British crown, according to some observers, was the 1969 BBC documentary "Royal Family," which showed Queen Elizabeth and her relations engaged in TV-watching and other activities of ordinary folk. The broadcast endeared the royals to millions but may have helped to dispel the larger-than-life aura on which their prestige depended.
Will future historians of the papacy say the same about "Light of the World"? Based on six hours of interviews with Pope Benedict XVI conducted in July of this year by the German journalist Peter Seewald, the book offers a rare portrait of a reigning pontiff, presenting him as insightful and eloquent—and pious of course—but also all too human.
Benedict confesses to TV-watching of his own: the evening news and the occasional DVD, especially a series of movie comedies from the 1950s and 1960s about a parish priest sparring with the Communist mayor of his Italian town. Despite such pleasures, the pope finds that his schedule "overtaxes an 83-year-old man" and reports that his "forces are diminishing," though he makes it clear that he still feels up to the demands of his office.
When it comes to recent controversies, Benedict voices gratitude to journalists for exposing the clerical sex abuse in several European and Latin American countries. He goes on to claim that "what guided this press campaign was not only a sincere desire for truth, but . . . also pleasure in exposing the Church and if possible discrediting her." While there is doubtless much truth to such a statement, blaming the messenger is the last thing an image consultant would advise a leader to do in a crisis—which suggests that the image of Benedict that appears here is as uncensored as Mr. Seewald claims.
Likewise, concerning the uproar that greeted Benedict's 2009 decision to lift the excommunication of Richard Williamson—the ultra-traditionalist bishop who turned out to be a Holocaust denier—the pope sees evidence, in the press, of "a hostility, a readiness to pounce . . . in order to strike a well-aimed blow." In this case, Benedict concedes that he made a mistake—that he would not have readmitted Bishop Williamson to the Catholic Church had he known about his statements on the Nazi genocide. "Unfortunately," he tells Mr. Seewald, "none of us went on the Internet to find out what sort of person we were dealing with."
The Age of Empathy
Nature's Lessons for a Kinder Society
This is a confused book because it is trying to do several things at once. It is partly a study of animal empathy, the area of work for which Frans de Waal is well known. But de Waal is also fighting other battles, notably over whether there are sharp dividing lines between humans and other animals. And here he is much less sure of his ground.
The difficulty is that, in stressing the co-operative side of animal behaviour, de Waal sidelines an important trait on which human difference turns: cognition. It was not a mistake Charles Darwin made when, in The Descent of Man (1871), he noted that although animals have "well-marked social instincts", it is "intellectual powers", such as humans have, that lead to the acquisition of the moral sense of right and wrong.
The book fails when it comes to the third goal de Waal sets himself: to champion empathy as a solution to social, even political, problems. He resists the notion that empathy is innately morally ambivalent, overlooking how sadistic behaviour, for example, arises from empathising with a victim, too. Occasionally he admits that empathy is psychologically complex, but rather than exploring this complexity, he quickly returns to reciting evidence more congenial to his thesis.
He briefly offers a more sophisticated account of emotional connectivity, but again ignores its ramifications. This is a hierarchical model, and begins with the inchoate feelings that arise from witnessing another's exhilaration or distress. Next, there is "self-protective altruism" - doing something that benefits another person, though only in order to protect yourself from unpleasant emotional contagion. Then there is "perspective-taking", which is stepping into the shoes of another. But it is not unless the individual has a further capacity, sympathy, that he or she knows how to improve the lot of others. What distinguishes sympathy is that it allows both emotion and understanding (Darwin's "intellectual powers") to be brought to bear on the situation at hand.
Such faculties, however, are not the basis for moral behaviour that de Waal seeks. Indeed, he "shudder[s] at the thought that the humaneness of our societies would depend on the whims of politics, culture, or religion". One may shudder with him, but one might shudder more if our ability to act humanely rested solely on the morally flawed capacity of empathy.
Morality Without Transcendence
Can science determine human values?
When anthropologists visited the island of Dobu in Papua New Guinea in the 1930s they found a society radically different from those in the West. The Dobu appeared to center their lives around black magic, casting spells on their neighbors in order to weaken and possibly kill them, and then steal their crops. This fixation with magic bred extreme poverty, cruelty and suspicion, with mistrust exacerbated by the belief that spells were most effective when used against the people known most intimately.
For Sam Harris, philosopher, neuroscientist and author of the best-selling The End of Faith and Letter to a Christian Nation, the Dobu tribe is an extreme example of a society whose moral values are wrong. In his new book, The Moral Landscape: How Science Can Determine Human Values, Harris sets out why he believes values are not, as is widely held, subjective and culture-dependent. Instead, he says, values are a certain kind of fact — facts about the well-being of conscious creatures — and that they can therefore, at least in principle, be objectively evaluated. The "moral landscape" of the title is the concept that certain moral systems will produce "peaks" of human well-being while others, such as that of the Dobu, will lead to societies characterized by a slough of suffering. Harris maintains that it is possible to determine objectively that the former are better than the latter.
Harris is not the first person to advocate an objective basis for morality. The biologist E. O. Wilson, for example, has previously explained how he believes moral principles can be demonstrated as arising objectively from human biological and cultural evolution. But in arguing that there is an objective basis to morality, Harris puts himself at odds with a principle put forward by the 18th century philosopher David Hume and regarded as inviolable by many philosophers and scientists today: the idea that statements about how things ought to be cannot be derived from statements about what is true. In other words, it is impossible to derive values from facts.
Harris dismisses both this reasoning and the objection that there are no grounds for favoring his moral framework over any other. He takes it to be essentially self-evident that morality is about well-being, arguing that some practices, such as forcing women to dress head to toe in a burqa, are bound to reduce well-being. In Harris’s view, it is not right to treat all cultural practices as equally valid; he maintains that multiculturalism and moral relativism are wrong.
God and Philosophy in Hawking's Universe
Given that the celebrated physicist's thought is bathed in philosophical theories, it's folly to assert that science has dispatched metaphysics.
"Philosophy is dead." Stephen Hawking (2010)
"Philosophy always buries its undertakers." Etienne Gilson (1949)
Upon reading "The Grand Design," one gets the impression that Stephen Hawking has come a little late to the party. Sure, he manages to pronounce philosophy dead on page one of his new book, but philosophers have been heralding the death of philosophy for centuries.
Indeed, virtually every generation has produced at least a few philosophers who describe their subject as either finished or futile. It's doubtful Hawking is aware of this, though, since "The Grand Design" provides abundant evidence that the celebrated physicist's philosophical education has been sorely neglected.
Most people won't be particularly troubled by that, of course. What has troubled people--and consequently rocketed "The Grand Design" up the bestseller lists--is Hawking's claim that, "It is not necessary to invoke God to light the blue touch paper and set the universe going."
Yet God and philosophy are intimately related in Hawking's universe, for it is the same philosophy--yes, philosophy--that tried to kill them both. In "The Universe in a Nutshell," published in 2001, Hawking called this philosophy "positivist," and described positivism as an "approach put forward by Karl Popper and others." Now let's stop right there, since this is example No. 1 of Hawking's philosophical ignorance. For Popper did not "put forward" positivism; on the contrary, he argued vehemently against it. In fact, he devoted an entire section of his autobiography to explaining how he was responsible for destroying positivism.
Self-Made Golem
Simon Wiesenthal, painted in a new biography as a fame-seeking myth-maker, is also the man who insisted that the world face up to the Holocaust.
Stop the presses! Are you sitting down? Can you handle the truth? According to Tom Segev’s new biography of Simon Wiesenthal—and I’m not making this up—the famed Nazi hunter was not a perfect human being! He was a media manipulator, a myth-maker, a publicity seeker. He could be a self-aggrandizing credit grabber, a teller of tall tales and much-varied narratives, and sometimes weaver of outright fabrications. He was quarrelsome, vain, egotistical, didn’t play well with others.
But what would we have done without him? To many Jews, especially in the Diaspora, he gave at least the illusion that some of the perpetrators would be brought to justice. "Justice not vengeance," as Wiesenthal liked to say.
Segev, an indefatigable historian and highly respected reporter for the leftist Israeli daily Haaretz, tells us he had access to 300,000 Wiesenthal-related documents, although he doesn’t say how many he read. (Among his many human sources are agents of the Mossad who believe they deserve credit for some of his successes.) But his attempt at de-mythologizing Wiesenthal can sometimes make one feel he misses the forest for the trees. Yes, the Wiesenthal behind the legend may have been all too human, and it’s always valuable to set the record straight for history, but could this be a case where the legend is more important to the course of history than the life? Is publicity-seeking intrinsically bad if one is seeking to publicize the untroubled afterlives of mass murderers in order to shame the world into action?
The fact that this question has to be asked is due to something we have chosen to forget: the world community’s stunning failure after World War II to treat the Final Solution as a crime unto itself. The 19 Nazis convicted at Nuremberg were found guilty of "crimes against humanity" mainly for planning and starting a devastating war of aggression. Wiesenthal, Segev reminds us, was always adamant that the Final Solution was a crime against humanity as well as against Jews. But it was a different crime from that for which the Nazi leaders were tried at Nuremberg.
There was a lamentable loss of distinction between the two crimes, or rather a shameful failure to prosecute the second crime, for some 15 years after the war. Hitler lost the war against the Allies, yes. But in effect he won his personal "war against the Jews" (as Lucy Dawidowicz described his greatest priority) by a factor of some 6 million to one.
Den of Antiquities
In a remarkable interlude in Willa Cather’s novel "The Professor’s House," a New Mexico cowboy named Tom Outland describes climbing a landmark he calls Blue Mesa: "Far up above me, a thousand feet or so, set in a great cavern in the face of the cliff, I saw a little city of stone, asleep. It was as still as sculpture, . . . pale little houses of stone nestling close to one another, perched on top of each other, with flat roofs, narrow windows, straight walls, and in the middle of the group, a round tower. . . . I knew at once that I had come upon the city of some extinct civilization, hidden away in this inaccessible mesa for centuries."
Blue Mesa was Cather’s stand-in for Green Mesa — Mesa Verde in southern Colorado — and she was evoking what a real cowboy, Richard Wetherill, might have felt when, a week before Christmas 1888, he found Cliff Palace, the centerpiece of what is now Mesa Verde National Park. Craig Childs understands these kinds of epiphanies, and he beautifully captures them in "Finders Keepers: A Tale of Archaeological Plunder and Obsession" — along with the moral ambiguities that come from exposing a long-hidden world.
Wetherill’s city of stone, Childs reminds us, was quickly commandeered by a Swedish archaeologist who, over the objections of outraged locals, shipped crates of Anasazi artifacts to Europe. Worse things could have happened. Childs tells of a self-proclaimed amateur archaeologist who in the 1980s removed hundreds of artifacts from a cave in Nevada, including a basket with the mummified remains of two children. He kept the heads and buried the bodies in his backyard.
William Blake's Picture of God
The muscular old man with compasses often taken to be Blake's God is actually meant to be everything God is not.

Go to see the newly acquired etchings by William Blake at Tate Britain, or take a look online. They display all the unsettling power and apocalypticism we expect from this exceptional, romantic artist. One shows a young man tethered to a globe of blood by his hair. In another, someone burns in a furnace. Underneath, Blake has written lines such as, "I sought pleasure and found pain unutterable," or, "The floods overwhelmed me."
What you won't find in the gallery, though, is any explanation of these visions. Instead, Blake is treated as impenetrable, his imagery obscure, his calling idiosyncratic. He's rendered slightly mad, and so safe. We can look and admire, but like a modern gothic cartoon strip – which his art no doubt influences – he can be enjoyed, but not taken too seriously.
That's a shame. For not only can Blake be read, but what he says carries at least as much force today as it did two hundred years ago.
Consider one of the figures in the new works: Urizen. He's well known because he's the same figure who appears as Blake's famous "Ancient of Days" – an old man, with Michelangelo muscles, a full head of long white hair, and a wizard-like beard. Urizen is a key figure in Blake's mythology.
He is not God. (Blake thought it laughable to imagine the divine as a father-figure, as God is found within and throughout life, he believed, hence referring to Jesus as "the Imagination.") Instead, Urizen is the demiurge, a "self-deluded and anxious" forger of pre-existent matter, as Kathleen Raine explains. His predominant concern with material things is signified by his heavy musculature. He is variously depicted as wielding great compasses, absorbed by diagrams, lurking in caves, and drowning in water – as in the new Tate image. It shows that his materialism has trapped him.
Blake loathed the deistic, natural religion associated with Newton and Bacon. He called it "soul-shuddering." Materialism he dismissed as "the philosophy in vogue." He thought the Enlightenment had created a false deity for itself, one imagined by Rousseau and Voltaire as projected human reason. The "dark Satanic mills" of Jerusalem are the mills that "grind out material reality", as Peter Ackroyd writes in his biography of Blake, continuing: "These are the mills that entrance the scientist and the empirical philosopher who, on looking through the microscope or telescope, see fixed mechanism everywhere."
Islam's Problem
In the annals of well-meaning ineptitude, Western efforts to locate and support moderate Muslim voices deserve a place of distinction. The story begins in the smoky rubble of Manhattan’s Twin Towers and the dawning awareness that Islamist zealots who styled themselves holy warriors were the masterminds of this startling act of mass murder. Such acts had to be understood either as something frightfully sick about Islam or as a radical distortion of Islam. Most reasonable people chose to see them as the latter. But if Islam was being hijacked, who within the Islamic world would resist?
Voices of moderation were hastily sought. Understandably, mistakes were made. Even among the Muslims mustered to stand in solidarity with President George W. Bush at the 9/11 memorial service in Washington National Cathedral were a couple whose credentials as champions of moderate, mainstream Islam were questionable. But if that was forgivable because of haste, later missteps were less so.
Wall Street Journal reporter Ian Johnson deftly recounts one such fiasco in a recent issue of Foreign Policy. In 2005, the U.S. State Department cosponsored a conference with the Islamic Society of North America (ISNA) that brought American Muslims to Brussels to meet with 65 European Muslims. The State Department followed up by bringing European Muslims, many of whom had connections to the Muslim Brotherhood—the world’s oldest and arguably most influential Islamist organization, dedicated to making Islam a political program—to the United States for an ISNA-led summer program and imam training. The rationale was that European Muslims, thought to be less integrated into their adopted countries than American Muslims, would learn something valuable about assimilation. All well meaning, of course, but comically misguided. As Johnson notes, "ISNA was founded by people with extremely close ties to the European leadership of the Muslim Brotherhood."
This initiative was only the beginning of protracted efforts by U.S. officialdom to court a number of Brotherhood or Brotherhood-related Islamist organizations and leaders. Instant experts on political Islam from both liberal and conservative Washington think tanks advocated the idea of engaging Islamists who eschewed violence (except, in some cases, violence against Israelis) and endorsed the democratic process, if not liberal values. European officials were wary of this approach, but even the CIA gave a go-ahead.
The folly of this kind of thinking is a major concern of the books under review. In an essay in The Other Muslims, Yunis Qandil, a Jordan-born Palestinian and a lecturer at the Institute of Contemporary Intellectual Studies in Beirut, goes to the heart of the problem: "In the long term, the strengthening of ideological Islam and the granting of official recognition to its ‘moderate’ organizations against jihadism create more problems for us than solutions." Moderate as these Muslim groups in Europe and America may seem, Qandil explains, they represent what moderate, traditional Muslims fight against in their countries of origin: "the instrumentalization of our religion through a totalitarian ideology." While paying lip service to the values of Western societies—notably, the tolerance that allows them to operate—these Islamists fundamentally view such societies as the "archenemy of Islam." So why, Qandil reasonably asks, are European governments "still selecting the adherents of this particular type of Islam as their privileged partners and the recognized representatives of all Muslims"? The same question applies in the case of America.
The Language God Talks
A search for the secrets of eternity

This hodge-podgey little book has at least two very moving high points.
The second is a closing coda: the fictional Aaron Jastrow's sermon, "Heroes of the Iliad," delivered at the Theresienstadt concentration camp. It appears in War and Remembrance, the second of Wouk's massive World War II novels (after The Winds of War).
Jastrow invokes the book of Job. He ponders cruelty, suffering, injustice in creation. He concludes that Job, "the stinking Jew" who upholds Almighty God in the face of an unavailing universe, who calls God to acknowledge that "injustice is on his side," performs an essential service, for he, in upholding God, upholds humanity.
But the first high point is just as gripping. It's an imagined walk-and-talk between Wouk and physicist/Nobelist Richard Feynman. An aggressive, jokey unbeliever, Feynman questions Wouk about Talmud. He's curious why Wouk is Orthodox. Wouk tries to explain, "Not to convince you of my view, but because you asked me."
Throughout his long writerly career, Wouk, who turned 95 on May 15, has often pondered belief. In The Language God Talks (a title derived from Feynman's nickname for calculus), Wouk embraces science, assents to the connection between the human mind and the big bang. In the laws science has discovered, he hears a language God talks, and he feels we should listen.
This modest and not terribly well-focused book has inherent drama, for it faces nothing less than a huge shift, among some Jews, away from God. The break was the Holocaust. The argument goes: In light of such unanswered injustice, six million unanswered injustices, to cling to the notion of a just God is an obscenity, a way to smooth over, to seek comfort when comfort is unjust. To forget. For some, it is wrong even to question this break.
The 20th century saw an exhaustive effort to backfill the tradition, an effort that remains the subject of vigorous disagreement throughout all branches of Judaism. The brilliant, indispensable Jewish insistence on questioning, on debate, on weighing and sifting all sides, was mined for an unbroken skeptical tradition without the God of the Psalms. Atheist Jews could now claim that atheism was always Jewish.
The Shock Philosopher
On the provocative thinking of Simone Weil
Simone Weil was born in Paris just over a century ago, on February 3, 1909, and though she always remained fiercely loyal to France—and was French to her fingertips—the case could be made that her true homeland lay elsewhere, deep in the hazier and far more fractious republic of Contradiction. There she was, however perilously, chez elle. Weil displayed alarming aplomb on the horns of dilemmas; often she teetered on several simultaneously. She went after the paradoxical, the contradictory, the oppositional, with the rapt single-mindedness of a collector and the grim fervor of a truffle-hound. She exulted in polarities. Though many of Weil’s statements have an aphoristic cast or masquerade as bons mots, they are anything but witty in the usual bantering sense. Weil is an irritating thinker; her words impart an acrid aftertaste; they leave scratchy nettles behind. She meant to provoke, to jostle, to unsettle—she was an activist as well as a contemplative—and the play of opposition served this purpose.
But Weil also saw the summoning of opposites as a way of knowing. It was an ancient way, much favored by her beloved Greeks: the Stoic Chrysippus taught that we discern good and evil only by their opposition; in one of Heraclitus’s fragments, he is reported as saying, "God: day, night; winter, summer; war, peace; satiety, hunger." The dark saying could have been Weil’s own. For her, truth was to be found, if at all, in the tension of antinomies. Her thought, at its most perceptive as at its most repellent, draws its remarkable energy from that tension.
Weil’s life, too, was rife with contradiction, much of it in keeping with a familiar modern pattern. Born into a well-off, middle-class Jewish family, she was, over the course of her short life, now a Bolshevik, now an anarchist, now a labor union activist, now a (reluctant) combatant on the Republican side in the Spanish Civil War and, after 1937, as a result of some sort of mystical experience in Assisi, a fervent Roman Catholic believer (though she refused to be baptized). In each of these metamorphoses, she found herself embroiled in opposition; she could not join a group or a movement or an institution without almost immediately dissociating herself from it.
Reasons to Think
At the Guardian Hay debate on reason, atheists and believers found more common ground than might be expected.

People are worried about reason, if the large numbers who attended the Guardian's debate at Hay are anything to go by. It proposed the motion "Reason is always right".
But what do we mean by reason? Why the worry? The philosopher AC Grayling kicked off the discussion in favour of the motion with a definition. Reason is the quality we want our doctors to practise when diagnosing our complaint. It's the discipline we want engineers to have when designing a passenger plane. It's that approach to life which we call enlightened, scientific. It gathers information, tests evidence, asks questions.
The word "rational" is close to the word ratio, or being proportionate. So, the good life is one in which passions and emotions don't run riot too. They are kept within reasonable limits. Thinking is what makes us human, Grayling averred. When our appetites take over, we come to harm. Hence, at the end of the day, reason is indeed always right.
Not so! interjected the second speaker, Richard Harries, the former bishop of Oxford. He told of two women arguing as they stood on the doorsteps of their respective houses. They couldn't agree because they were arguing from different premises. Ho-ho. But behind the joke lies a serious human issue. Rational discussions are very hard to have because we come to any encounter with jealousies, rivalries, prejudices and assumptions. "In my experience, very few people are capable of arguing objectively," said Harries, who is also a member of the House of Lords.
In other words, reason itself is not enough. We need judgment and wisdom, and that requires the moral and spiritual disciplines of conscience and intuition too. The truly wise individual, who can engage in debate well, is the person who can draw on these other capacities. Reason is not always right because reason alone is not enough.
Martin Rees, the distinguished scientist whose Reith lectures start this week, spoke next, in favour of the motion – but only just. He confessed to being a "cross-bencher" when it came to reason. It's vital, of course. It should hone arguments and test consistency. And scientific knowledge must be backed by reason. But for human beings, there always comes a point when we hit something that is unconditional for us. Respect for life would be one example. Reason helps to clarify why that's the case, but the principle itself is somehow prior to reason. Reason should take us as far as it can, Rees pressed, but it won't take us all the way.
Religion Confuses Even the Best Minds

Religion confuses even the best minds -- and maybe the best minds most of all. Chalk that up partly to the contempt most modern intellectuals have felt for the "opiate of the people." That attitude is hardly conducive to deep understanding, and in fact has given rise to a number of popular misconceptions. One is that modernization -- and its handmaidens, mass education and science -- would lead inevitably to the long-heralded twilight of the gods. We would all be good secular humanists one day soon.
Confidence in that particular shibboleth took a bad hit in the last decade or so. In addition to 9/11 and other acts of faith-based zealotry, Americans witnessed the boisterous return of religion to their public square. Other evidence from around the world -- whether it's the assertive role of Hinduism in contemporary Indian politics or the renewed interest in Confucian principles in still nominally communist China -- has made it much harder to think that religion is a spent force.
Intellectuals friendly to religion have fostered an equally misleading notion, one that is thoughtfully dispelled in Stephen Prothero's book, "God Is Not One." Seeing the world's major belief systems through Enlightenment-tinted glasses, a succession of influential philosophers, artists, scholars and even many religious leaders have tended to minimize the differences of ritual and dogma among the various religions in order to emphasize a supposedly universal and benign truth shared by them all. Such well-meaning believers (and they do constitute a kind of religion of their own) have subscribed to variations of the Dalai Lama's affirmation that "the essential message of all religions is very much the same."
It's an uplifting bromide, to be sure, and Prothero, a professor of religion at Boston University, gives its supporters their due. The Golden Rule and other ethical principles are indeed shared by a majority of the world's religions. The mystical traditions of many religions employ similar disciplines and aspire to similar ends, whether transcendence of baser desires or a sense of unity with a supreme being.
But if the devil is in the details, the point of Prothero's useful book is that God is, too. Which is to say that the particular and often problematic features of a religion -- from its core narratives and rituals to its arcane points of theology -- are at least as important to its followers as those qualities that it may share with other religions. The universalist impulse may be a "lovely sentiment," Prothero writes, "but it is dangerous, disrespectful, and untrue."
Bad Science, Bad Theology, and Blasphemy
ID is indeed bad theology. It implies that God is one more thing along with all the other things in the universe.

The question: Is intelligent design bad theology?
You may have caught some of the row that followed Thomas Nagel's recommendation, in the literary pages of the TLS, for 2009 books of the year. He ventured Stephen Meyer's Signature In The Cell: DNA And The Evidence for Intelligent Design. Nagel is one of the most distinguished philosophers living today. And yet, that apparently now stood for nothing. Meyer's book is pro-ID. Everything from Nagel's reputation to his sanity was called into question.
I read the book. It felt a little like creeping behind the bike sheds at school to have a cigarette, as if an ID cancer might seize control of my synapses. The temptation was irresistible. What I discovered was an arresting book about science, which is what drew Nagel. But it is close to vacuous when it comes to the theology. That, it seems to me, is the problem with ID.
To a non-specialist like myself, Meyer seemed to capture very well the depth of the mystery that the origin of life poses for modern science – essentially how DNA, as an astonishingly precise and complex information processing system, could possibly have come about. It's analogous to the monkey-bashing-at-a-typewriter-and-producing-Shakespeare problem, except that with DNA it's even more intractable: you've also got to account for how the typewriters and the language arose, they being the prerequisites for the possibility of the prose, let alone the prose of the Bard.
That said, it's because of the inscrutable nature of life's origins that I found the book theologically unsatisfying. It proposes, in essence, an argument from ignorance.
The ID hypothesis Meyer conveys is, roughly, that life is, at base, an information processing system, information that is put to a highly specific purpose, and that the best explanation for the source of such a system is one that is intelligent. Only an intelligence could get the system going, as it were. It can't be put down to chance, since by massive margins there hasn't been nearly enough time since the Big Bang for the random encounters of organic compounds to form such highly specified self-replicating systems. Neither can it be put down to self-organisation, since what DNA requires to work is not general patterns, but the fantastically fine-grained and specific activity of proteins and amino acids. Intelligent design is, then, the best hypothesis to date. But that qualification, "to date", is the problem.
Eagleton and Hitchens against Nihilism
Two recent books converge on a common enemy: the bland atheist managerialism that assumes the point of life is fun.

Peter Hitchens is a Mail on Sunday columnist who writes from the right. Terry Eagleton is a professor of English literature who writes from the left. What's striking, reading their new books alongside each other – Hitchens' The Rage Against God and Eagleton's On Evil – is that they both have the same target in their sights: nihilism.
Hitchens wrote his book to unpick the arguments of his brother, Christopher, one of the big gun anti-theists, though much of it is taken up with his thoughts on why Christianity has become so marginal in Britain today. More than anything else, he puts it down to the two world wars, and the Church of England's alignment with these national causes: after all the horror and bloodshed, the pews emptied. Add to that the decline of empire, and the anxiety about what Britain now is, and the established religion inevitably declines and worries about itself too.
Hitchens also blames the rampant liberalism of his generation; he was a teenager in the 1960s. They feared the constraints of their parents' lifestyle – post-war rationing coupled to the limitations of life in the suburbs. So, they pursued life goals of unbridled ambition and pleasure, viscerally rejecting anything that smacked of authority and moral judgment. That fed the undermining of Christianity too.
When it comes to his brother's blast against God, he makes a number of points. On the "good without God" question, he argues that morality must make an absolute demand on you, so that even though you constantly fail to reach its high standards, you are not able to ignore it, as he believes people and politicians now do every day: witness everything from common rudeness to the suspension of habeas corpus. If there are no laws that even kings must obey, no-one is safe.
His toughest rhetoric comes when he notes that the Russian communists moved remarkably swiftly to stop religious education, after they had seized power in 1917. He sees clear parallels between this move and his brother's nostalgia for Trotsky, and the argument that religious education is child abuse. "It is a dogmatic tyranny in the making," he concludes.
Incredible Views
Review of four books on atheism
Despite the recent intensification of debate between atheists and religious believers, the result still seems to be stalemate. Protagonists can readily identify their opponent's weak spots, and so delight their supporters. At the same time, both sides can fall back on their best arguments, thereby reviving their fortunes when necessary. The new atheism has created, or recovered, a perfect sport. No one can win in the game called "God"; everyone can land blows.
But is anything important or new emerging from the spat? Consider one of the tussles that is rehearsed in the books under review. It centres on the so-called anthropic argument. This is the observation of physicists that certain features in the universe appear to be uncannily "tuned" in favour of life. An anthropic principle was first proposed in 1973, and was generally mocked by scientists. Stephen Jay Gould wrote that the principle was like declaring that the design of ships is finely tuned to support the life of barnacles. Since then, however, physicists have become relatively comfortable with discussing the idea. This is for at least two reasons. One is that the anthropic principle, in a weak form, makes predictions about the age of the universe that can be tested. Roughly, the universe has to be of a minimum age in order to allow enough time for the constituents of life - such as the element carbon - to be forged in the hearts of stars. It passes that test.
The second is that the alleged tuning of the universe for life has come to be seen as quite extraordinarily fine. One demonstration of this exactitude concerns "dark energy". It forms about 70 per cent of the stuff of the known universe, and is called dark because astronomers have little idea what it is. The best candidate is the energy associated with a quantum vacuum, a phenomenon that arises from quantum physics. The important point for the anthropic argument is that it turns out that this energy would have to be "tuned" to about one part in 10^120. That is a very substantial number, to say the least - far larger than the number of atoms in the visible universe. In a recent issue of the magazine Discover, the robust atheist and Nobel Prize-winning physicist Steven Weinberg described this as "the one fine-tuning that seems to be extreme, far beyond what you could imagine just having to accept as a mere accident."
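To put that figure in perspective, a back-of-the-envelope comparison helps; it assumes the commonly cited estimate of roughly $10^{80}$ atoms in the observable universe, a figure that does not appear in the review itself:

\[
\frac{10^{120}}{10^{80}} = 10^{40},
\]

so the precision being claimed outstrips the atom count by a factor of about $10^{40}$.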
Butchers and Saints
The villains of history seem relatively easy to understand; however awful their deeds, their motives remain recognizable. But the good guys, those their contemporaries saw as heroes or saints, often puzzle and appall. They did the cruelest things for the loftiest of motives; they sang hymns as they waded through blood. Nowhere, perhaps, is this contradiction more apparent than in the history of the Crusades. When the victorious knights of the First Crusade finally stood in Jerusalem, on July 15, 1099, they were, in the words of the chronicler William of Tyre, "dripping with blood from head to foot." They had massacred the populace. But in the same breath, William praised the "pious devotion . . . with which the pilgrims drew near to the holy places, the exultation of heart and happiness of spirit with which they kissed the memorials of the Lord’s sojourn on earth."
It’s tempting to dismiss the crusaders’ piety as sheer hypocrisy. In fact, their faith was as pure as their savagery. As Jonathan Phillips observes in his excellent new history — in case we needed reminding at this late date — "faith lies at the heart of holy war." For some, of course, this will be proof that something irremediably lethal lies at the heart of all religious belief. But the same fervor that led to horrific butchery, on both the Christian and the Muslim sides, also inspired extraordinary efforts of self-sacrifice, of genuine heroism and even, at rare moments, of simple human kindness. Phillips, professor of crusading history at the University of London, doesn’t try to reconcile these extremes; he presents them in all their baffling disparity. This approach gives a cool, almost documentary power to his narrative.
At the same time, "Holy Warriors" is what Phillips calls a "character driven" account. The book is alive with extravagantly varied figures, from popes both dithering and decisive to vociferous abbots and conniving kings; saints rub shoulders with "flea pickers." If Richard the Lion-Hearted and Saladin dominate the account, perhaps unavoidably, there are also vivid cameos of such lesser-known personalities as the formidable Queen Melisende of Jerusalem and her rebellious sister Alice of Antioch. Heraclius, the patriarch of Jerusalem, is glimpsed in an embarrassing moment when a brazen messenger announces to the assembled high court where he sits in session that his mistress, Pasque, has just given birth to a daughter.
The New Buddhist Atheism
A book setting out the principles of a pared-down Buddhism has won praise from arch-atheist Christopher Hitchens.

In God is Not Great, Christopher Hitchens writes of Buddhism as the sleep of reason, and of Buddhists as discarding their minds as well as their sandals. His passionate diatribe appeared in 2007. So what's he doing now, just three years later, endorsing a book on Buddhism written by a Buddhist?
The new publication is Confession of a Buddhist Atheist. Its author, Stephen Batchelor, is at the vanguard of attempts to forge an authentically western Buddhism. He is probably best known for Buddhism Without Beliefs, in which he describes himself as an agnostic. Now he has decided on atheism, the significance of which lies not just in his disbelief in transcendent deities but also in his stripping down of Buddhism to the basics.
Reincarnation and karma are rejected as Indian accretions: his study of the historical Siddhartha Gautama – one element in the new book – suggests the Buddha himself was probably indifferent to these doctrines. What Batchelor believes the Buddha did preach were four essentials. First, the conditioned nature of existence, which is to say everything continually comes and goes. Second, the practice of mindfulness, as the way to be awake to what is and what is not. Third, the tasks of knowing suffering, letting go of craving, experiencing cessation and the "noble path". Fourth, the self-reliance of the individual, so that nothing is taken on authority, and everything is found through experience.
It's a moving and thoughtful book that does not fear to challenge. It will cause consternation, not least for its quietly harsh critique of Tibetan Buddhism as authoritarian. It is full of phrases that stick in the mind, such as "religion is life living itself."
Hitchens calls it "honest" and "serious", a model of self-criticism, and an example of the kind of ethical and scientific humanism "in which lies our only real hope". The endorsement makes sense because Batchelor's is an account of Buddhism for "this world alone". His deployment of reason and evidence, coupled to the imperative to remake Buddhism and hold no allegiance to inherited doctrines, would appeal to Hitchens. And not just Hitchens.
Review: 'The Hidden Brain' by Shankar Vedantam

Sen. Harry Reid has faced sharp criticism for speculating during the 2008 campaign that Barack Obama may have had an advantage as a black presidential candidate because of his light skin. But a practical test carried out by the Obama campaign -- and now revealed in Shankar Vedantam’s book, "The Hidden Brain" -- suggests that Reid may have been on target.
Near the end of the race, Democratic strategists considered combating racial bias by running feel-good television advertisements. The campaign made two versions of the same ad. One featured images of a white dad and two black dads in family settings reading "The Little Engine That Could" to their daughters. The other depicted similar scenes with a white dad and a light-skinned black dad. The first version got high ratings from focus groups, but it did not budge viewers' attitudes toward Obama. The second version elicited less overt enthusiasm, but it increased viewers' willingness to support the candidate. The spots never aired, Vedantam writes, in part because of budget problems.
Vedantam, a science reporter for The Washington Post, saw these ads in the course of his reporting on the science of the unconscious mind. The contrast between viewers' expressed sentiments and their inclinations as voters is, he argues, evidence of the unconscious in action -- in this case, through bias against dark skin. "We are going back to the future to Freud" with this application of science to politics, explains Drew Westen, a psychologist who advised the Obama campaign.
We may be heading back farther yet. Before Freud, the unconscious was understood as a social monitor. Illustrating this limited concept, Freud's collaborator, Joseph Breuer, once wrote that he would suffer "feelings of lively unrest" if he had neglected to visit a patient on his rounds. Freud proposed a more complex, creative unconscious, one that accessed forgotten facts and feelings and, using poetic logic, concocted meaningful dreams, medical symptoms and slips of the tongue. A common example of a misstep provoked by the psychoanalytic unconscious has a young man calling a woman a "breast of fresh air" and then correcting himself: "I mean, a breath of flesh air."
This clever unconscious has fallen on hard times. While contemporary research finds that mental processes occurring outside awareness shape our decisions, the unconscious revealed in those studies is stodgy. It uses simple mechanisms to warn us of risks and opportunities -- and often it is simply wrong.
The Fruitless Search for Fact
AN Wilson's 'Jesus' shows how anyone combing the gospels for history is likely to be disappointed.

The Jesus seminar is a group of scholars who have adopted a systematic approach to the search for the historical Jesus. Listing all the sayings and acts attributed to him, they colour-code the likely veracity of each according to the standards of biblical criticism. For example, if the saying or act fits uneasily with subsequent Christian teaching, it's likely to be true, for only that could have stopped its suppression. One of these sayings is Jesus' injunction to turn the other cheek. An "inauthentic" saying is the beatitude he supposedly pronounced on those persecuted for following the Son of Man. The work has led the scholars to conclude that Jesus was an extraordinary ethical teacher, perhaps akin to Gandhi. It's an answer to the question of who this man was that AN Wilson, in his book "Jesus", utterly rejects.
It's not that what's recorded about him in the four gospels is not fascinating to search and weigh. Rather, it's that the ethical teaching is too muddled. Jesus has been read as a pacifist, as the saying about turning the other cheek might imply. And yet his disciples apparently carried swords in the Garden of Gethsemane. He taught that the poor would be blessed, though archaeological evidence suggests he lived for most of his life in a comfortable home. It just doesn't add up. "A patient and conscientious reading of the gospels will always destroy any explanation which we devise," Wilson writes. "If it makes sense, it's wrong."
His book is written in an open-minded, if questioning, tone. He tests the evidence, whilst respecting the faith of ordinary Christians. His barbs are mostly saved for institutions like the churches, which have consistently shown "contempt" towards what their supposed founder reportedly said. Some allow divorce, when Jesus is almost certain to have forbidden it. Others claim Jesus as their founder, when the fact that he didn't present his teachings in anything like a systematic form, but rather engaged with existing Jewish teaching, implies otherwise. He seems to have regarded himself as an authoritative, reformist rabbi with apocalyptic leanings. He almost certainly believed that a new kingdom was coming, one so imminent that his disciples could live by it already.
Holier than Thou
In a letter of 1521, Martin Luther exhorted his fellow reformer Philipp Melanchthon to "be a sinner and sin strongly, but have faith and rejoice in Christ even more strongly!" The antithesis is carefully couched, suggesting a subtle dynamic between the extremes of bold sinfulness and joyful faith as though in some indefinable way they fed upon one another (and perhaps they do). Luther's words convey that tremulous equipoise of irreconcilables which has characterised Christian belief from its beginnings. In his new, massive history of Christianity, the distinguished Reformation scholar Diarmaid MacCulloch balks at such robust paradoxes. Unreason — that "faith in things unseen" — leaves him queasy. It leads to beliefs he finds preposterous. Christianity intrigues him because he cannot understand "how something so apparently crazy can be so captivating to millions of other members of my species". It inspires intolerance, bigotry, fanaticism and their murderous consequences. "For most of its existence," he writes, "Christianity has been the most intolerant of world faiths." As if this weren't bad enough, it indulges in "gender-skewed language."
Although MacCulloch purports to be writing a history for the general reader — his book was the basis for "a major BBC TV series" this autumn — his take on Christianity is highly tendentious. When he sticks to events, such as the Council of Chalcedon in 451, which provoked the first momentous schism in Christian history, or when he untangles obscure doctrinal disputes, ranging from the controversies incited by the Iconoclasts to the baffled modern clashes between genteel traditional Protestants and rowdy Pentecostals, he can be superb. His scope is enormous. His discussion of Christianity in Ethiopia is as thorough as his explorations of 19th-century American revivalist movements. And his attention to often disregarded detail is impressive. His affectionate references to devotional music, from the hymns of Charles Wesley to Negro spirituals to the old Roman Catholic service of Benediction, enliven his account. Unsurprisingly, as the author of the magisterial Reformation: Europe's House Divided 1490-1700, he is excellent on the rise of Protestantism and on the Catholic Counter-Reformation. But he's just as informed, and as informative, about recent developments, whether the Second Vatican Council or the Orthodox Church in post-Soviet Russia.
A Science Writer for the Ages
At age 95, Martin Gardner continues to churn out thought-provoking essays and reviews that are aimed squarely at pieties of all kinds.
Martin Gardner is fond of quoting the following from Isaac Newton:
"I don't know what I may seem to the world, but as to myself I seem to have been only like a boy playing on the sea shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me."
It's a statement not only of scientific (and thus human) wonder, but of scientific (and thus human) modesty. All of us, even a genius such as Newton assuredly was, can at best catch only partial glimmers of "the great ocean of truth."
Wading into that ocean is a pursuit that Gardner, though a science writer rather than a scientist (he usually refers to himself as a journalist), has been engaged in for much of his very long life. And at 95, he shows little sign of abandoning it. Gardner made himself famous both as the long-time author of a Scientific American column on mathematical games and diversions and as a thoroughgoing skeptic, the author of such works as Fads and Fallacies in the Name of Science; published in 1952, it was among the very earliest works to debunk pseudo-science and to scold the popular press for its frequent credulity.
My own introduction to Gardner was through another branch of his interests: literature. Gardner virtually invented the now-popular genre of annotated editions of classic works and, relatively early in life, I came upon and hugely enjoyed his editions of "Alice in Wonderland" (The Annotated Alice still sits on my shelves) and Coleridge's "Rime of the Ancient Mariner".
And even though he now resides in an assisted-living home in his native Oklahoma, Gardner continues to churn out essays and reviews, and collect them in books, of which "When You Were a Tadpole and I Was a Fish" is the latest.
Here you'll find book reviews and essays in several categories, including Science, Logic, Literature and Religion, all displaying Gardner's trademark erudition, expansiveness and curiosity.
Even though he's famously a skeptic, and one of the founders of Skeptical Inquirer magazine, Gardner is no atheist. Rather, as he makes plain in his essay "Why I am Not an Atheist" (a play on Bertrand Russell's book "Why I Am Not a Christian"), he is a philosophical theist; that is, one who believes in God but rejects the truth claims of any particular religion. In fact, there are reviews here that debunk the scientific pretensions of Christians such as the U.S. physicist Frank Tipler ("The Physics of Christianity") and that scourge of the liberals, Ann Coulter. There's a scathing review of Coulter's book "Godless," in which she somehow promotes herself as an (anti-) evolutionary biologist.
Fateful Schism
Tracing the history of a religious divide that still haunts the world

When the Prophet Muhammad died unexpectedly after a brief illness in Medina, in present-day Saudi Arabia, on June 8, 632, his followers were stunned. A contemporary called it "the greatest of calamities." Their grief was not only for the loss of an irreplaceable leader. Muhammad was "the seal of the prophets," the last in a line that stretched back to Adam. He had received revelations as "God's emissary" for some 20 years—revelations that he had communicated to the embattled community of his followers, first in Mecca and then, after the hijra, or emigration, in 622, in Medina—but now they came to an end. It was as though God, who revealed Himself through the Prophet, had suddenly fallen silent.
In fact, the calamity was greater than Muhammad's mourners could have foreseen. Muhammad had not unambiguously named his successor. The question of succession would haunt Islam for centuries to come. The wrangling began within hours of Muhammad's death; it would quickly lead to a momentous rift between two implacable factions, Shia and Sunni. It is a divide that continues to this day, often with horrific consequences. In "After the Prophet," veteran Middle East journalist Lesley Hazleton tells with great flair this "epic story of the Shia-Sunni split in Islam," as she rightly calls it.
Those who supported Muhammad's cousin and son-in-law Ali found themselves pitted against those who favored Abu Bakr, the Prophet's closest friend. Muhammad was also Abu Bakr's son-in-law: Abu Bakr's daughter Aisha was Muhammad's third, and favorite, wife, and a force to reckon with in her own right. Ali's supporters formed the "shi'at Ali," the "party of Ali," from which the term Shia derives. The partisans of Abu Bakr would come to be known as "Sunni" Muslims—those who follow the "sunna," the code of pious practice based on the Prophet's example.
That Abu Bakr was almost immediately named caliph—the title then meant no more than "successor"—embittered Ali's supporters; when their man was passed over for the caliphate two more times they felt that a monstrous injustice had been perpetrated. Ali did finally accede to the caliphate in 656, but his claim was contested. When he was assassinated in the mosque of Kufa, in 661, by an extremist wielding a sword laced with poison, his murder struck a tragic note that would reverberate ever after. The Sunni-Shia schism pitted Muslim against Muslim and led to civil wars, massacres and assassinations, and even the collapse of dynasties.
Faith Rites Boost Brains, Even for Atheists
Brain scanners show that intense meditation alters our gray matter, strengthening regions that focus the mind and foster compassion while calming those linked to fear and anger.

PHILADELPHIA, Aug 17 (Reuters) - Buddhist monks and Catholic nuns boost their brain power through meditation and prayer, but even atheists can enjoy the mental benefits that believers derive from faith, according to a popular neuroscience author.
The key, Andrew Newberg argues in his new book "How God Changes Your Brain," lies in the concentrating and calming effects that meditation or intense prayer have inside our heads.
Whether the meditator believes in the supernatural or is an atheist repeating a mantra, he says, the outcome can be the same - a growth in the compassion that virtually every religion teaches and a decline in negative feelings and emotions.
"In essence, when you think about the really big questions in life -- be they religious, scientific or psychological -- your brain is going to grow," says Newberg, head of the Center for Spirituality and the Mind at the University of Pennsylvania.
"It doesn't matter if you're a Christian or a Jew, a Muslim or a Hindu, or an agnostic or an atheist," he writes in the book written with Mark Robert Waldman, a therapist at the Center.
NEUROTHEOLOGY
In his office at the University of Pennsylvania's hospital, Newberg told Reuters that "neurotheology" - the study of the brain's role in religious belief - is starting to shed light on what happens in believers' heads when they contemplate God.
Science and religion are often seen as opposites, to the point where some in each camp openly reject the other, but this medical doctor and professor of radiology, psychology and religious studies sees no reason not to study them together.
Sam Harris and Francis Collins
Atheism can express intolerance and hatred quite as well as religion. Sam Harris proves it.

Anyone tempted to believe that the abolition of religion would make the world a wiser and better place should study the works of Sam Harris. Shallow, narrow, and self-righteous, he defends and embodies all of the traits that have made organised religion repulsive; and he does so in the name of atheism and rationality. He has, for example, defended torture ("restraint in the use of torture cannot be reconciled with our willingness to wage war in the first place"); attacked religious toleration in ways that would make Pio Nono blush: "We can no more tolerate a diversity of religious beliefs than a diversity of beliefs about epidemiology and basic hygiene"; and claimed that there are some ideas so terrible that we may be justified in killing people just for believing them. Naturally, he also believes that the Nazis were really mere catspaws of the Christians. ("Knowingly or not, the Nazis were agents of religion").
"A bold and exhilarating thesis" is what Johann Hari said of Harris's first book (from which the quotes above are taken), though on reflection he might think it more bold than exhilarating. Richard Dawkins was more wordily enthusiastic in a preface for Harris's next: "Every word zings like an elegantly fletched arrow from a taut bowstring and flies in a gratifyingly swift arc to the target, where it thuds satisfyingly into the bullseye." (where else does he expect to find the bullseye?)
Hundreds of thousands of people bought the books, and perhaps the ideas in them. And now Harris has had an op-ed in the New York Times, in which, in his bold and exhilarating way, he makes the case against appointing a Christian scientist, Francis Collins, to the important American government post of Director of the National Institutes of Health. This is not because Collins is a bad scientist. He is, actually, quite extraordinarily distinguished, both as a scientist and as an administrator: his previous job was running the Human Genome Project as the successor to James Watson.
But he is, unashamedly, a Christian. He's not a creationist, and he does science without expecting God to interfere. But he believes in God; he prays, and this is for Harris sufficient reason to exclude him from a job directing medical research.
Of course this is a fantastically illiberal and embryonically totalitarian position that goes against every possible notion of human rights and even the American constitution. If we follow Harris, government jobs are to be handed out on the basis of religious beliefs or lack thereof. But what is really astonishing and depressing is how little faith it shows in science itself.
On The Evolution of God
Robert Wright's latest book sees moral progress in terms of evolution. But is his approach really suited to religion?

God is being reinvented by atheists. It's an unexpected phenomenon. Martin Seligman, the psychologist responsible for the surge of interest in happiness, talks of never being able to get on with the God of the Christians. However, he speculates about a deity that will emerge in time: "I am optimistic that God may come at the end," he has written.
Or there's the theoretical biologist Stuart Kauffman. His reflections on evolution lead him to suggest that "the unfolding of the universe … appears to be partially beyond natural law" and hence he is "happy to accept this natural creativity in the universe as a reinvention of 'God'."
The work of the author and journalist Robert Wright is caught up in this movement. Much as evolution seems directed towards growing physiological complexity, he detects moral progress in the evolution of humanity. It's an insight supported by game theory. Very roughly, some activities we see in nature are zero-sum: one player wins, and the other must lose. There is no progress in that. However, some activities are non-zero-sum: it is possible to devise outcomes that are win-win. In such situations, to put it crudely, people can afford to be nice to each other. Seen as a force of history, that leads to an increase in compassion and the creation of the moral ideal of universal love. In his new book, The Evolution of God, Wright links that to Jewish, Christian and Muslim explorations of the divine. He believes his method points to a synthesis of faith and science, one that transcends contemporary antagonisms.
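For readers who want the zero-sum versus non-zero-sum distinction spelled out, here is a minimal sketch in Python; the payoff numbers are invented for illustration and are not drawn from Wright's book:

```python
# Illustrative payoff tables, (player_a, player_b); the numbers are made up.
zero_sum = {
    ("win", "lose"): (1, -1),   # whatever A gains, B loses
    ("lose", "win"): (-1, 1),
}

non_zero_sum = {
    ("cooperate", "cooperate"): (3, 3),  # win-win: both come out ahead
    ("cooperate", "defect"):    (0, 2),
    ("defect", "cooperate"):    (2, 0),
    ("defect", "defect"):       (1, 1),
}

def total_welfare(outcome, game):
    """Sum of both players' payoffs for a given outcome."""
    a, b = game[outcome]
    return a + b

# Every zero-sum outcome sums to zero; in the non-zero-sum game,
# mutual cooperation yields more total welfare than mutual defection.
print(total_welfare(("win", "lose"), zero_sum))                  # 0
print(total_welfare(("cooperate", "cooperate"), non_zero_sum))   # 6
print(total_welfare(("defect", "defect"), non_zero_sum))         # 2
```

The point of the toy example is simply that only the second kind of game leaves room for the "win-win" outcomes on which Wright's story of moral progress depends.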
He even leaves open the possibility that the God of Abraham, Isaac and Jacob exists, though he's more inclined to the view that "God" is a creation of the human imagination, if one to be valued not dismissed; without "God", he suggests, our moral sensibilities would be indistinguishable from those of beasts. He comes closest to affirming a transcendent force for the good when exploring the work of Philo of Alexandria. This Jewish thinker envisaged a principle running through the cosmos, which he called the Logos. That abstract conception of God suits Wright's game theory, and its cost-benefit analysis: he calls Philo's Logos "the divine algorithm."
Crazy for You
Review of Norah Vincent's VOLUNTARY MADNESS: My year lost and found in the loony bin
Norah Vincent is making a career out of adopting different disguises. For her previous book, Self-Made Man (2006), she dressed up as a man for a year, fooling everyone she met. The tensions that resulted from living and working in male guise, however, "that psycho-emotional contradiction", caused her to have a breakdown. And this new book picks up where that leaves off. This time she's off to the asylum, ostensibly feigning madness, though as the book unfurls, it becomes clear that her interest in mental institutions has much to do with her own search for mental well-being too; she has a history of depression of which her breakdown was a part. Her technique is called "immersion journalism", and it is to writing what method acting is to performance. The idea is to literally become the part. It's also a good way of gaining readers: My year lost and found in the loony bin is a far more gripping subtitle than something equally accurate, such as "An examination of the contradictions inherent in the care of the mentally ill".
Over the course of a year or so, Vincent has three short sojourns in three different American institutions - one public, one private, one based on complementary therapies. She self-refers in order to get herself committed, adding to the symptoms of her previous mental illness the sort of thing that doctors need to hear, and it turns out to be surprisingly easy to do. That said, she is never actually in one hospital for more than a couple of weeks, though she writes powerfully enough about the experience to create the illusion that she has been committed in each case for a long time. Her descriptions convey many uncomfortable things about the way people suffering from mental illness can be treated in these places, not least in terms of how dehumanizing such care can be. She is also good at describing the effects of different drugs from the point of view of the individual who takes them, "the undisclosed or unknown dangers and unpleasant side effects". During the course of the research that also forms part of the project, Vincent is "appalled to read, first in books, then in newspapers, of how thoroughly corrupt the drug development and approval processes are in [the United States]".
Her analysis of what it is to be institutionalized is also illuminating: "It doesn't just happen to you. You allow it to happen to you. You partake". In fact, that theme of responsibility - whether that be the responsibility of doctors, carers or patients - is the main theme of the book. She concludes that although institutions may offer a sufferer of mental illness some respite from the pressures of everyday life, sufferers must aim to take responsibility for themselves in order to recover, or at least to learn to live with their illness: "nothing and no one can do for a person what he will not do for himself, even if he is crazy". Medicine may help. But medication is a far from complete answer.
God, Dawkins, and Tragic Humanism
In a new book, Terry Eagleton argues that liberal humanism woefully underestimates the horrors of which humans are capable.

This month, two books are a cut above the rest. For one thing, they pack hilarious rhetorical punch. You'd expect that in Reason, Faith and Revolution: Reflections on the God Debate, by Terry Eagleton. His review of The God Delusion in the London Review of Books became a minor publishing event in its own right: "Imagine someone holding forth on biology whose only knowledge of the subject is the Book of British Birds, and you have a rough idea of what it feels like to read Richard Dawkins on theology," it began.
Eagleton does not let up now. Of Daniel Dennett's scientific treatment of belief, he writes: "[Dennett] is rather like someone who thinks that a novel is a botched piece of sociology." To Christopher Hitchens, whom he respects, Eagleton says: "Christianity was never meant to be an explanation of anything in the first place. It is rather like saying that thanks to the electric toaster we can forget about Chekhov." However, there is something deeper going on in Eagleton's book than highbrow trench warfare. As there is in the second work, Atheist Delusions: The Christian Revolution and Its Fashionable Enemies by David Bentley Hart.
A New Entry in the God Debate
The author says atheists reject the Christian gospel because it is too radical for them.

What if the Great God Debate isn't about the existence of God at all?
What if the great atheist writers of our age have missed the point? What if, as God debater Terry Eagleton says, "they reject the Christian gospel not because it's garbage, but because it's too radical for them"?
The oldest questions of all - Does God exist? Can science prove or disprove it? Is religion good or bad? - have become the highest-profile intellectual debate of the decade. It's a war of books, stoked to white heat by the war on terror, when some thought the West was locked in a toe-to-toe cultural Armageddon with Islam.
Eagleton is a recent entrant in the God Debate. With a glittering resume ranging from literary criticism to history, he is a writer with serious Marxist and socialist credentials. In his new book, Reason, Faith, and Revolution: Reflections on the God Debate, he comes out squarely - against the atheists.
He's diving into a brainiac mosh pit. Richard Dawkins, Christopher Hitchens, Sam Harris, and Daniel Dennett have weighed in on the nay side, and Francis Collins, Chris Hedges, Rick Warren, and Tim Keller on the yea.
Eagleton, one of the best-known public intellectuals in the world, holds professorships at Lancaster University and the National University of Ireland, Galway. He has written more than 40 books and all but haunts talk shows, book reviews, and op-ed pages.
But against the atheists? Wittily, merrily, trenchantly so. Eagleton mischievously lumps Dawkins and Hitchens together as "Ditchkins" throughout his book. It's unfair. He's glad. Partly, it's to mock what he sees them doing to religion - tarring all belief as fundamentalism.
The book grew out of a furious 2006 Eagleton review of Dawkins' The God Delusion, in which the former slammed what he called the latter's ignorance about religion. That led to an invitation from Yale University to give the Dwight H. Terry lectures (which address how science and philosophy inform religion) in April 2008. Eagleton's title: "Faith and Fundamentalism: Is Belief in Richard Dawkins Necessary for Salvation?" His four lectures formed the basis for the book.
This attack has come, to put it lightly, as a surprise to many. Speaking by phone from his home in Dublin, Eagleton chuckles and says: "It wasn't such a hard thing to go from Marxism to this debate. It just so happens I have a bit of theological background, enough to know when people are talking out the back of their necks."
A Window into the Faith of Religion Reporters
Faith celebrated. Faith lost. Faith inspected and detected, from neurological research to relics of saints venerated in exotic shrines. USA TODAY religion reporter Cathy Lynn Grossman talks to four journalists who, drawn to write about religion, make their explorations personal in their new books.

Barbara Bradley Hagerty (Fingerprints of God: The Search for the Science of Spirituality)
The religion reporter for National Public Radio is nearly naked in her new book. Spiritually naked, that is.
Barbara Bradley Hagerty's scientific exploration of spirituality research weaves in her faith history: a devout Christian Scientist who shifts to evangelical Christianity, then develops a gnawing desire to answer the question "Will science get the last word on God?"
She visits neuroscientists who scan the brains of people in prayer and meditation; interviews the "God spot" researcher who says there's no such spot; hears the tales of people transported by visions; and investigates the power of prayer, where her own faith began.
Fingerprints is also funny. Bradley Hagerty describes being trapped in a tent for an all-night peyote ceremony where she's observing, not indulging. A two-week meditation experiment just makes her cranky, "a poster child for meditative failure."
Ultimately, she circles back to an idea from her Christian Science youth: that prayer really does shape your brain, as some scientific research suggests. "You can find peace and calm and a sense of unity with the universe that other people cannot usually get to," she says.
Although no scientific experiment proves the existence of God, "that doesn't mean science won't come up with one."
Until then, "we have no facts about God. We only have beliefs about God," says Bradley Hagerty, who still believes.
Peter Manseau (Rag and Bone: A Journey Among the World's Holy Dead)
Peter Manseau has made — from teeth and whiskers, fingers and ribs — a globe-traveling tale of the one thing all humanity shares: the body.
In Rag and Bone, he takes a world tour of relics.
St. John the Baptist's finger. Mohammed's whisker. St. Francis Xavier's toe. Even Buddha's teeth are "politically active" as "tools of piety and power" in Myanmar.
"Every religion is a banquet of holy lives: These are the leftovers," Manseau writes after traveling from Jerusalem to Goa, Kashmir to Paris.
A Pretence of Progress
To negotiate the vague and tangled pathways of the odd parallel universe where Tariq Ramadan - the Professor of Islamic Studies at Oxford's Faculty of Theology - holds sway is an unsettling experience. It is to find oneself in a realm where common words, such as "reform" or "ethics" or even "universe", mean both more and less than they say, often simultaneously. His is a coded discourse. Here even solecism serves. He can write "toward He whom" but this - despite Ramadan's often muddled English - probably isn't the clumsy error it seems: behind "He" lurks the Arabic personal pronoun Huwa, a common designation for God in pious texts. Since there is no self-standing accusative pronoun in Arabic, English grammar must be twisted into submission.
Even the title of his new book is artfully misleading. The phrase "radical reform" raises high expectations, suggesting a bold attempt to strike at the "root" of a stubborn intransigence. But, as it turns out, Ramadan means something quite different. For the reform he proposes addresses the theoretical jurisprudence of Islam, known in Arabic as "the roots of the law" (usul al-fiqh), as opposed to "the branches" (furu' al-fiqh) - the specific practical rulings enunciated by judges and legal experts.
Such a project would be admirable, as well as brave, if carried through. Islam - in this, like Judaism - is intensely legalistic. Its religious scholars have almost all been jurists by training, and those theoretical "roots" sustain disciplines as lofty as Koranic exegesis and as mundane as the issuance of fatwas. A fresh examination, let alone a reform, of the "roots" of Islamic law could have momentous consequences, and not only for Muslims. But in fact, Ramadan stands in a long line of Islamic reformers, beginning with the brilliant Egyptian theologian Muhammad ‘Abduh and his shadier sidekick Jamal al-Din al-Afghani in the late 19th century, who advocate "reform" less as a way of ushering Islam into the modern world than as a means of insulating it from deleterious "Western" influences. Ramadan shares this agenda but with a crucial difference. He espouses a reform that replaces "adaptation" with "transformation". Muslims should no longer merely accommodate modernity as best they can in anxious conformity with their beliefs but strive to transform the modern world, infusing it with Islamic values.
The Structure of Everything
Did we invent math? Or did we discover it?

Did you know that 365 -- the number of days in a year -- is equal to 10 times 10, plus 11 times 11, plus 12 times 12?
Or that the sum of successive odd numbers, starting from 1, always equals a square number -- as in 1 + 3 = 4 (2 squared), while 1 + 3 + 5 = 9 (3 squared), and 1 + 3 + 5 + 7 = 16 (4 squared)?
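Both identities are easy to verify; a quick sketch in Python (mine, not the book's) confirms them:

```python
# 10^2 + 11^2 + 12^2 should equal 365, the number of days in a year.
assert 10**2 + 11**2 + 12**2 == 365

# The sum of the first n odd numbers (1, 3, 5, ..., 2n-1) should equal n squared.
for n in range(1, 101):
    assert sum(range(1, 2 * n, 2)) == n**2

print("Both identities check out.")
```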
Those are just the start of a remarkable number of magical patterns, coincidences and constants in mathematics. No wonder philosophers and mathematicians have been arguing for centuries over whether math is a system that humans invented or a cosmic -- possibly divine -- order that we simply discovered. That's the fundamental question Mario Livio probes in his engrossing book Is God a Mathematician?
Livio, an astrophysicist at the Space Telescope Science Institute in Baltimore, explains the invention-vs.-discovery debate largely through the work and personalities of great figures in math history, from Pythagoras and Plato to Isaac Newton, Bertrand Russell and Albert Einstein. At times, Livio's theorems, proofs and conundrums may be challenging for readers who struggled through algebra, but he makes most of this material not only comprehensible but downright intriguing. Often, he gives a relatively complex explanation of a mathematical problem or insight, then follows it with a "simply put" distillation.
An extended section on knot theory is, well, pretty knotty. But it ultimately sheds light on the workings of the DNA double helix, and Livio illustrates the theory with a concrete example: Two teams taking different approaches to the notoriously difficult problem of how many knots could be formed with a specific number of crossings -- in this case, 16 or fewer -- came up with the same answer: 1,701,936.
The author's enthusiasm is infectious. But it also leads him to refer again and again to his subjects as "famous" and "great" and to their work as "monumental" and "miraculous." He has a weakness as well for extended quotes from these men (and they are all men) that slow the narrative without adding much. There are exceptions, including the tale of how Albert Einstein and mathematician Oskar Morgenstern tried to guide Kurt Gödel, a fellow mathematician and exile from Nazi Germany, through the U.S. immigration process.
A deep-thinking and intense man, Gödel threw himself into preparing for his citizenship test, including an extremely close reading of the U.S. Constitution. In his rigorously logical analysis, he found constitutional weaknesses that he thought could allow for the rise of a fascist dictatorship in America. His colleagues told him to keep that reading to himself, but he blurted it out during his naturalization exam. He was allowed to stay anyway.
Science & Islam: A History by Ehsan Masood

Last November, scientists using the Hubble space telescope reported the first sighting with visible light of a planet circling a star other than our own sun. It orbits 25 light years away around one of the brightest stars in the sky, called Fomalhaut.
Isn't that a curious name for a star? Not obviously mythological, it sounds as if it derives from some forgotten French astronomer. Not so; it is, in fact, from the Arabic fum u'l haut, meaning "mouth of the fish". And Fomalhaut is not alone in having an Arabic derivation - there are well over 100 others, including Betelgeuse, Aldebaran and Deneb. How did the Arabs get to name stars?
The answer, as these two revealing books make clear, is that they once led the world in astronomy. Muslim scientists were mapping the heavens, and pondering our place in them, while Europeans were still gazing at the night sky with baffled awe. To judge from some scientific narratives, the baton of knowledge about astronomy passed directly from the Greek Ptolemy in the 2nd century AD to Copernicus in the Renaissance. In fact, just about everything that the western world knew of the celestial sphere in the 16th century had come to it via the Arabs, who translated and refined Ptolemy's works between the 9th and the 13th centuries. And they didn't just read Ptolemy; they added to and challenged him, with data gathered at observatories such as the one established in the 820s in Baghdad by the greatest of the "scientific" rulers, al-Mamun of the Abbasid caliphate.
Astronomy is just one example of the enormous debt that the West owes to the achievements of Islamic science during the periods we still insist on calling the Dark and Middle Ages. While Europeans struggled until at least the 12th century with the mere rudiments of mathematics and natural philosophy, the Abbasid caliphs of the 8th to 13th centuries were promoting a rationalistic vision of Islam within which it was a sacred duty to inquire into the workings of the world. This programme was founded on the remnants of Roman and Hellenic culture, to which the Muslims had direct access in centres such as Alexandria. They prepared Arabic versions of the works of Aristotle, Euclid, Ptolemy and Archimedes, and set up schools and libraries such as the House of Wisdom in Baghdad.
As well as preserving classical scholarship, Muslim thinkers also innovated in many fields: astronomy, optics, cartography and medicine. The camera obscura, for instance, a kind of pinhole camera in which an outside scene is projected onto a wall in a darkened chamber as light enters through a small hole, was first studied experimentally by Hassan ibn al-Haitham (Alhazen) in the 11th century. Roger Bacon later used the device to study solar eclipses, and old masters from Van Eyck to Vermeer may have employed the projection method to achieve their micro-realist detail.
How Muhammad Yunus Created an Impossible Business

Grameen Bank is an improbable business, and one worth studying. In the second section of Creating a World Without Poverty, Muhammad Yunus details the ongoing evolution of what he calls "The Grameen Experiment." Yunus was an economist, not a banker, and he needed to invent his bank for the poor, ignoring naysayers and regulatory obstacles at almost every step. It's a classic example of someone achieving the impossible precisely because he did not realize that what he intended to do was impossible.
Economists may find themselves frowning in this section of the book. Yunus tweaks his former colleagues for their blind spots and their refusal to look at people except in abstract terms like "labor." His is another voice in favor of 'experimental' economics, the part of the field that tries to look at human behavior as it is, rather than as economists say it should be.
That kind of anthropological economics resulted in many useful business practices at Grameen, and Yunus is generous in discussing what has worked and what has needed revising. For instance, Grameen found that the wisdom of crowds works among the very poor: it has learned to lend to people in groups of five, all of whom have to vouch for the person receiving the loan. It has also learned that lending to women has a bigger potential for getting families out of poverty than lending to men.
Science & Islam
Science and Islam tells the history of one of the most misunderstood, yet rich and fertile periods in science: the Islamic scientific revolution between 700 and 1500 AD.

Between the 8th and 16th centuries, scholars and researchers working from Samarkand in modern-day Uzbekistan to Cordoba in Spain advanced our knowledge of astronomy, chemistry, engineering, mathematics, medicine and philosophy to new heights. There was Musa al-Khwarizmi, for instance, who developed algebra in 9th-century Baghdad, drawing on work by mathematicians in India; al-Jazari, a Turkish engineer of the 13th century whose achievements include the crank, the camshaft and the reciprocating piston; and Abu Ali ibn Sina, whose textbook, the Canon of Medicine, was a standard work in Europe's universities until the 1600s. These scientists were part of a sophisticated culture and civilization that was based on belief in a God – a picture which helps to scotch the myth of the 'Dark Ages' in which scientific advance faltered because of religion.
Science and Islam weaves the story of these scientists and their work into a compelling narrative. Masood takes the reader on a journey through the Islamic empires of the middle ages, and explores both the cultural and religious circumstances that made this revolution possible and Islam's contribution to science in Western Europe. He unpacks the debates between scientists, philosophers and theologians on the nature of physical reality and the limits of human reason, and he describes the many reasons for the eventual decline of advanced science and learning in the Arabic-speaking world.
Science and Islam is essential reading for anyone keen to explore science's hidden history and its contribution to the making of the modern world.
Atheist Scientists Have Taken Over the Pulpit
This scientist's criticism of all non-scientific knowledge exposes the dogma of the New Atheists' creed.
This is one of those books whose subtitle gives it away entirely. Robert Park is a physicist and sceptic, who believes in an age of science - so naturally Superstition: Belief in an Age of Science (Princeton University Press, £14.95) is one long howl of complaint that he actually lives in an age of unscience. This makes his book much more illuminating than it might have been; much more illuminating, in fact, than he intended it to be.
It is hardly news to the intelligent reader that homeopathy is nonsense, creationism is a lie, intercessory prayer has no measurable effect, Uri Geller is a fraud and so on. What's easy to overlook is the existence of another sect of determined believers, whose creed is the last sentence of Park's book: "Science is the only way of knowing - everything else is just superstition."
So much for philosophy, history, literature, art, and common sense.
Park is not original here. In fact the value of his book consists in his unoriginality and his willingness to say straight out the kinds of thing which lurk unsaid within more self-conscious writers like Richard Dawkins. The more that the New Atheism emerges as a social movement in the USA, the more it acquires the habits of mind that make monotheistic religion obnoxious.
Just like other monotheisms, scientism proclaims the brotherhood of humanity in theory, while in practice excluding unbelievers from
The Day of Restlessness
Every week, a challenge arises for churchgoers and nonbelievers alike.

Who, raised in or around the Christian tradition, has not experienced the ambivalent dolors of a Sunday? That is only one question -- but a central and recurrent one -- raised by "The Peculiar Life of Sundays," Stephen Miller's lively history of a day that has exercised a peculiar hold on countless human beings for the past 2,000 years.
One might think that, for the devout, this hold would be especially firm. For them, after all, the day is unquestionably holy, unquestionably the Lord's: an Easter in miniature marking their savior's resurrection. But even the faithful can feel uneasy, as Mr. Miller shows by depicting the spiritual struggles of many of his exemplary figures.
Consider Samuel Johnson, the 18th-century essayist, conversationalist and one-man dictionary compiler. A committed Anglican and forthright defender of the faith, he nevertheless found it difficult -- indeed, almost impossible -- to haul himself into church on Sundays. Uncomfortable with "publick Worship," bored by most sermons and inclined toward late-rising, Johnson was forever recording his resolution to attend church more conscientiously. But that vow "was little better kept than the others," as the editor of his diaries noted. Without saying so explicitly, Mr. Miller uses Johnson to show how even a deeply religious person can find the outward institutional form of his religion at odds with what he finds most sacred.
Johnson's internal struggle, Mr. Miller implies, is part of a much larger culture war within the world that was once, until its 16th-century fragmentation, called Christendom. At the center of that struggle have been conflicting efforts to define the doctrines and practices of a religion based on the life, death and reputed resurrection of a first-century Palestinian Jew, proclaimed by many of his followers as the unique son of the Hebrew God. Inevitably the struggle has involved -- and, yes, to this day still involves -- politics, powerful personalities, sectarian rivalries and other human, all too human, factors.
Gotcha, Physics Genius: Einstein's Mistakes
Retelling Albert Einstein's story by homing in on his blunders makes for good intellectual entertainment.
When Donald Crowhurst's abandoned sailboat was found adrift in the Atlantic in 1969, his captain's log recorded the ravings of a man whose mind had snapped. On page after page, he spouted fulminations and pseudoscience, finally ripping his chronometer from its mountings and throwing it and then himself into the drink.
During the voyage, an around-the-globe sailboat race, Crowhurst had been reading Einstein's book "Relativity: The Special and the General Theory." A chapter called "On the Idea of Time in Physics" seems to have pushed him over the edge.
Einstein was pondering what it means to say that two lightning bolts strike the ground simultaneously. For this to be true, he suggested, someone positioned halfway between the events would have to observe the flashes occurring at the same instant. That assumes that the two signals are traveling at the same speed -- a condition that, Einstein rather oddly wrote, "is in reality neither a supposition nor a hypothesis about the physical nature of light, but a stipulation which I can make of my own free will in order to arrive at a definition of simultaneity."
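The logic is worth spelling out; here is a minimal sketch in modern notation, with symbols that are mine rather than Einstein's or Ohanian's. Let the strikes occur at points A and B, let M be the midpoint (so the distances AM and MB are equal), and write t_A and t_B for the times of the strikes.

\[
\underbrace{t_A + \frac{\overline{AM}}{c_{A\to M}} \;=\; t_B + \frac{\overline{MB}}{c_{B\to M}}}_{\text{the two flashes reach }M\text{ together}}
\quad\Longrightarrow\quad t_A = t_B
\qquad\text{only if}\qquad c_{A\to M} = c_{B\to M}.
\]

In words: declaring the strikes simultaneous whenever the flashes arrive at the midpoint together presupposes that light covers the two equal legs at the same one-way speed. That equality is the stipulation Einstein says he makes "of my own free will"; Ohanian's complaint, taken up below, is that it is laid down as a definition rather than offered as something to be measured.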
"You can't do THAT!" Crowhurst, an electrical engineer, protested to his journal. "I thought, 'the swindler.' " From there he descended into madness.
Hans C. Ohanian, who tells this strange tale at the beginning of "Einstein's Mistakes: The Human Failings of Genius," sympathizes with poor Crowhurst.
"The speed of light is either constant or not, and only measurement can decide what it is," Ohanian writes. For Einstein to make a postulation rather than propose it as a hypothesis to be tested may seem like a fine distinction. (Earlier in his book, Einstein does cite an empirical basis for his assumption: the Dutch astronomer Willem de Sitter's paper, "An Astronomical Proof for the Constancy of the Speed of Light," which was based on observations of binary stars.) But to Ohanian, the act was as outrageous as when Indiana lawmakers tried to legislate the value for pi. And so he adds it to his roster of Einstein's mistakes.
Exposing a Network of Powerful Christians
A new book claims that the "Fellowship" influences key decision makers.

It is an elite and secretive network of fundamentalist Christians that has been quietly pulling strings in America's highest corridors of power for more than 70 years. Or so claims Jeff Sharlet, author of a new exposé, The Family: The Secret Fundamentalism at the Heart of American Power. And in his telling, the group that calls itself the Fellowship operates at the very center of the vast, right-wing conspiracy that has promoted unfettered capitalism and dismantled liberal social policies at home, even while encouraging ruthless but America-friendly dictators abroad.
Sharlet, an associate research scholar at New York University's Center for Religion and Media, tells an intriguing story of an organization founded in 1935 by Norwegian immigrant pastor Abraham Vereide. Growing out of Vereide's early struggles against the radical labor movement on the West Coast, the group came to consist of religiously minded businessmen and sympathetic politicians who shared Vereide's mildly pro-fascist sentiments. Vereide is most widely known for launching in 1953 what is now a Washington institution, the National Prayer Breakfast, where movers and shakers come together to pray in an uplifting but blandly interfaith way.
But behind the scenes, Sharlet contends, Vereide and his key men worked with politicians and officials to advance unfettered, tooth-and-claw capitalism and engage in secret diplomacy with some of the world's least savory leaders, including, in the past, Indonesia's General Suharto and Haiti's François "Papa Doc" Duvalier. If all that weren't ominous enough, the group's leader since 1969, Doug Coe, has gained something of a reputation for invoking not only Jesus but also Hitler, Lenin, and Mao as models of effective leadership.
Sound sinister? To be sure. And Sharlet has done extensive reading in the Fellowship's archival materials to document what he calls "the secret history of Christian fundamentalism's most enduring and most powerful organization." Furthermore, he got started on the project after living in one of the Fellowship's Arlington, Va., homes, a kind of commune for well-off but somewhat undirected young men seeking Christian and worldly guidance from Fellowship elders. Sharlet, in other words, should know whereof he speaks.
But there are problems. Sharlet's ease of access to documents and people would seem to belie his characterization of the Fellowship as an obsessively secretive group. Other problems—including many overly broad and unsubstantiated charges—point to some of the larger difficulties that journalists, scholars, and commentators have had in understanding the nexus of religion and power in the post-9/11 world. Writing in the current issue of World Affairs, Adrian Wooldridge, Washington bureau chief for the Economist, describes the core problem succinctly: "In the aftermath of 9/11, however, we arguably have overcorrected—not underestimating the role of religion, as in years past, but exaggerating it. Exaggerating it in the sense of giving it undue emphasis, of turning it into a cartoon. The commentators who not that long ago were heedless of the role of religion and were theologically illiterate now see it everywhere (and remain theologically illiterate)."
Looking Through the Other End of the Microscope

Scientific discoveries are not only cumulative but a bit haphazard. They emerge in reaction to wrongheaded hypotheses as well as from moments of sudden insight. Often they occur inadvertently or by sheer dumb luck. Almost always, despite the genius of their discoverers, they are the product of the long, stubborn investigation of other scientists who remain obscure. Neither James Watson nor Francis Crick would have arrived at an understanding of the double helix without the prior work of Rosalind Franklin at King's College London, for instance, but their names, and not hers, will forever be associated with that breakthrough.
And there are other even more obscure collaborators. The fruit fly, the evening primrose, and the guinea pig would seem to have little in common. And yet, along with a few other unassuming creatures, they have a just claim to be considered the unsung heroes of biology. As Jim Endersby notes in his fascinating "A Guinea Pig's History of Biology" (Harvard, 499 pages, $27.95), progress in biology owes as much to the hawkweed and the humble corncob as it does to the brainstorms of scientists. Mr. Endersby has had the happy idea of tracing the successes of modern biological research through the subjects which have made it possible. Each of his chapters focuses on a particular plant or animal, from the now extinct quagga, a cantankerous relative of the zebra, to microscopic bacteriophage, or "bacteria eaters," and culminating in the genetically engineered OncoMouse®, one of the first rodents with a full-fledged patent all its own. As he points out, "the history of biology has, in part, been the story of finding the right animals or plants to aid the search."
That search was twofold. It drew on practical considerations: to improve breeding lines, beginning with racehorses but extending to livestock; to develop cash crops with higher yields, culminating in our current, and controversial, genetically modified crops (of which Mr. Endersby provides a cautious and very balanced assessment); and, most crucially, to understand and find cures for devastating diseases. But it was also a search for something fundamental and far more elusive. In Mr. Endersby's account, the history of modern biology is a story of challenged assumptions, of refusing to accept easy explanations, of a willingness to ask apparently silly questions and to pursue the answers to them with astonishing doggedness.
Why God Won't Die
Jay Tolson reviews a new book in which Charles Taylor, "a profoundly learned thinker (who is also a believing Christian)," tries to explain the persistence of religion in a secular world.
Something curious happened on the way to the 21st century. Religion—which modernization theorists had said was destined for the dustbin of history—didn’t go away. It even seemed to gain new strength, popping up in the culture wars, claiming space in the public square, and (in its worst manifestations) inspiring angry young men to acts of unspeakable violence. Why, it’s all enough to drive a good secular humanist crazy—or at least to the bookstores to purchase the reassuring screeds of Sam Harris, Richard Dawkins, Christopher Hitchens, or any of the other so-called New Atheists.
Making sense of secularism, its achievements and its failings, is one of the great intellectual challenges of our time. The word itself has several interlinked meanings, from the political (the separation of church and state) to the sociological (describing the abandonment of religious belief by individuals or society in general) to the ideological (the programmatic conviction that secularity is the logical outcome of enlightenment, science, and progress). A fourth sense is more anthropological, and arguably lies at the root of the other three. This secularism names a profound shift in worldview, one that the eminent McGill University philosopher Charles Taylor defines as a "move from a society where belief in God is unchallenged and, indeed, unproblematic, to one in which it is understood to be one option among others, and frequently not the easiest to embrace."
It is this fourth sense that mainly occupies Taylor in his long-awaited magnum opus, A Secular Age. For nearly 800 pages, Taylor, winner of the 2007 Templeton Prize (religion’s own "genius award"), wrestles mightily with a fascinating subject: how Christianity became the religion that produced the first exit from religion, and how that exit, secularism, never entirely disentangled itself from the religion that made its existence possible. The book is, loosely, a history of ideas, but Taylor’s project is to get at something deeper and broader than the activity of intellectuals and other elites, something he calls the "social imaginary," an ungainly term describing the various ways people "imagine their social existence."
Taylor begins, not surprisingly, with the Reformation—into which he lumps late medieval developments and the Counter-Reformation—because of its crucial role in paving the way for a host of "modern" afflictions, from the confusion of morality with materialism to the disconnection of the individual from tradition and the larger corporate body of fellow believers.
If this seems like a fairly well-trod road, well, it is. But the richness and pleasure of the journey is in seeing how a profoundly learned thinker (who is also a believing Christian) examines the landmarks along the way. For example, Taylor’s treatment of the emergence of a new kind of public sphere in the 18th century allows us to see how radical a development it is. Previously, what brought people together was always "something action-transcendent," Taylor writes, "be it a foundation by God, or a Chain of Being which society bodied forth, or some traditional law which defined our people." This new sphere was "grounded purely in its own common actions." Whatever happened within it—crowds clamoring for lower taxes or members of the Third Estate calling for the end of the Old Regime—was no longer important in relation to eternal time but only in relation to the actually unfolding present and its ideal goal: the future.
Infinitely Demanding
A review of Simon Critchley's "Infinitely Demanding: Ethics of commitment, politics of resistance"
Modern philosophy begins not in wonder, as it did for the ancients, but in disappointment. There is religious disappointment that the world has no meaning in the absence of a transcendent deity. This provokes nihilism, expressed in fundamentalist violence and non-theistic, passive spirituality. Then there is political disappointment inasmuch as justice has become meaningless. This is a crisis of ethics, in particular due to the lack of an ethical experience. That experience would be the feeling of a moral claim that was internally compelling, not merely externally compulsory.
This is the current impasse identified by Simon Critchley. "Infinitely Demanding" is a little book with a big idea; through high theory and wry observations, Critchley seeks to describe a way forward. In short, this is to find an ethics of commitment based on a demand that is felt to be infinite. Critchley uses Alain Badiou's idea of fidelity and Levinas's of the other, inter alia. But in order to ensure that the individual is not crushed by the impossibility of such responsibilities - a failure that would result in another kind of nihilism - his ethical subjectivity turns on itself and finds humour in its own tragic inauthenticity. As Woody Allen put it, "comedy is tragedy plus time". Conscience, then, is "dividuated" subjectivity.
Politics becomes grass-roots interventions by those who are against the consensus sanctioned by the state, an activism that also serves as a definition of democracy. It cultivates a kind of self-assertion driven by anger. But what about the problem of violence? Humour is an inadequate safeguard: after all, soldiers joke in the face of atrocity as a way of coping because it can keep the ethical at bay. But there is another ethical attitude that Critchley never quite addresses, namely that of compassion. This is an identification with another (related to but different from respect for the other), which is an ekstasis, a stepping out of yourself. It is also the beginnings of a meaningful transcendence not guaranteed by God but born of a sense of infinite mystery, and wonder.
A Mind Emparadised
A review of Paradiso by Robert Hollander
The ineffability of mystical experience is an ancient commonplace. Not surprisingly, Dante alludes to it often with baffled exasperation throughout his Paradiso. In Canto 3, he states that "the sweetness of eternal life" can never be understood by the intellect; rather, it must be "tasted." This is a reference to a verse from Psalm 33, often cited by medieval mystics, "Taste and see that the Lord is good." A century before Dante, Richard of St. Victor, among others, had adduced it to illustrate "the tasting of inner sweetness," that dulcedo Dei which is the sweetness of God Himself. For such mystics, the experience of God was not verbal but gustatory, at once intimate and incommunicable. Hearing and seeing, touching and smelling, might play a part but they too were "spiritual senses." God can be seen but only with "the eyes of the heart." For Dante, when he set about composing the Paradiso, sometime around the year 1317, the inexpressibility of blessedness presented a seemingly insoluble difficulty. For the final cantica of the Divine Comedy is above all, and avowedly, a narrative of intellect; its drama arises from the struggle of "a mind emparadised" first to conceptualize, and then to articulate, the inherently indescribable. By now, Inferno and Purgatorio stood complete (though he continued to revise them), but there Dante had traversed readily imaginable terrains. We all have a sense of hell; even purgatory is as gnawingly real as our next best intention. Heaven remains the unimaginable kingdom.
In his efforts not so much to overcome this impossibility as to dramatize its effects, Dante has recourse both to the formulations of Scholastic theology and to the supple inventiveness of language itself. The originality of his language can hardly be overstated. By means of multiple allusion, wordplay, puns, and acrostics, as well as the entire repertoire of medieval rhetoric, he pushes his beloved vernacular to the limit. He draws on Latin and on Provençal, and even rhymes at one point in Hebrew. He revives old words and invents a score of new ones. At the outset, in Canto 1, he coins a new verb and pairs it with a Latin phrase to make the audacity of his unprecedented venture plain. In the new translation by Jean and Robert Hollander, the lines read, "To soar beyond the human cannot be described/ in words."[1] The Italian is terser: "Trasumanar significar per verba/ non si poria." The translation—to which I’ll return—doesn’t quite catch the effect of those two rhyming four-syllable infinitives capped by a Latin tag. (Longfellow was more literal as well as more accurate, in his 1865 version, when he rendered it as "To represent transhumanize in words/ Impossible were.") The odd verb trasumanar is Dante’s invention and at first sight it mystifies. It compounds the deliberate obduracy of the verse; we are brought up short by the word just as the poet is brought up short by the boldness of his own ambition.
Bright Scientists, Dim Notions
AT a conference in Cambridge, Mass., in 1988 called "How the Brain Works," Francis Crick suggested that neuroscientific understanding would move further along if only he and his colleagues were allowed to experiment on prisoners. You couldn’t tell if he was kidding, and Crick being Crick, he probably didn’t care. Emboldened by a Nobel Prize in 1962 for helping uncoil the secret of life, Dr. Crick, who died in 2004, wasn’t shy about offering bold opinions — including speculations that life might have been seeded on Earth as part of an experiment by aliens.
The notion, called directed panspermia, had something of an intellectual pedigree. But when James Watson, the other strand of the double helix, went off the deep end two Sundays ago in The Times of London, implying that black Africans are less intelligent than whites, he hadn’t a scientific leg to stand on.
Since the publication in 1968 of his opinionated memoir, "The Double Helix," Dr. Watson, 79, has been known for his provocative statements (please see "Stupidity Should be Cured, Says DNA Discoverer," New Scientist, Feb. 28, 2003), but this time he apologized. Last week, uncharacteristically subdued, he announced his retirement as chancellor and member of the board of Cold Spring Harbor Laboratory on Long Island, where he had presided during much of the genetic revolution.
Though the pronouncements are rarely so jarring, there is a long tradition of great scientists letting down their guard. Actors, politicians and rock stars routinely make ill-considered comments. But when someone like Dr. Watson goes over the top, colleagues fear that the public may misconstrue the pronouncements as carrying science’s stamp of approval.
The Double Thinker
There are two ways to look at anything. That's what I learned from reading Steven Pinker. Actually, I learned it from two Steven Pinkers. One is a theorist of human nature, the author of ''How the Mind Works'' and ''The Blank Slate.'' The other is a word fetishist, the author of ''The Language Instinct'' and ''Words and Rules.'' One minute, he's explaining the ascent of man; the next, he's fondling irregular verbs the way other people savor stamps or Civil War memorabilia.
In ''The Stuff of Thought,'' Pinker says his new book is part of both his gigs. Hence its subtitle: ''Language as a Window Into Human Nature.'' It sounds as though he's finally going to pull together his life's work under one big idea, but he doesn't. That's what makes him so edifying and infuriating to read: he sees duality everywhere.
It's not that Pinker thinks the world can be neatly divided. That would be dualism. In ''The Blank Slate,'' he trashed the most famous such distinction, the one between mind and matter. Pinker's duality is of the opposite kind. Categories intersect like dimensions. The mind is what the brain does. Evolution shaped psychology, but in the process psychology evolved its own laws.
''The Stuff of Thought'' explores the duality of human cognition: the modesty of its construction and the majesty of its constructive power. Pinker weaves this paradox from a series of opposing theories. Philosophical realists, for instance, think perception comes from reality. Idealists think it's all in our heads. Pinker says it comes from reality but is organized and reorganized by the mind. That's why you can look at the same thing in different ways.
Then there's the clash between ancient and modern science. Aristotle thought projectiles continued through space because a force propelled them. He thought they eventually fell because Earth was their natural home. Modern science rejects both ideas. Pinker says Aristotle was right, not about projectiles but about how we understand them. We think in terms of force and purpose because our minds evolved in a biological world of force and purpose, not in an abstract world of vacuums and multiple gravities. Aristotle's bad physics was actually good psychology.
Of Apples, String and, Well.... Everything
Endless Universe: Beyond the Big Bang, by Paul J. Steinhardt and Neil Turok
OUTSIDE the Great Gate of Trinity College, Cambridge, is a tree reputed to be descended from the one that dropped the apple on Isaac Newton's head. The result of the concussion, the legend goes, was Newton's law of universal gravitation: The same force that pulls things to Earth also binds the planets around the sun. So began an era of cosmic mergers and acquisitions that led, three centuries later, to Stephen Hawking's famous prediction that the end of physics -- the Theory of Everything -- was in sight.
Hawking, as his press agents never tire of reminding us, holds Newton's old chair as Lucasian Professor of Mathematics at Cambridge University. By the time he assumed the post, an all-encompassing physics indeed seemed at hand. Electricity had been united with magnetism, and electromagnetism with the weak force that causes nuclear decays. For their next act, physicists planned to combine this electroweak force with the strong nuclear force that glues quarks together. That would still leave gravity, but Hawking bet, in his inaugural Lucasian lecture in April 1980, that by the year 2000 it too would be absorbed into the mix. All these fundamental forces would be reduced to facets of a single phenomenon -- supergravity -- present at the moment of creation.
Sitting in the audience that day was a graduate student, Neil Turok, who went on to get a PhD in theoretical physics. He didn't suspect at the time that he would be swept up by another convergence -- between his newly chosen field and cosmology.
It was a natural connection: Testing an idea like supergravity required a particle accelerator literally the size of the universe -- and the "experiment" had already been done: In the searing temperatures of the Big Bang, all the forces would still have been fused together. Figure out the Bang and you could reverse-engineer the Theory of Everything.
That's not how things turned out. In "Endless Universe: Beyond the Big Bang," Turok and Paul J. Steinhardt, a Princeton physicist, describe how they devoted years to the quest, only to find themselves veering from the pack with a new theory -- a radical model of the universe in which there is no beginning explosion but, rather, an endless cycle of cosmic thunderclaps.
Tinkering With Humans

Three years ago in The Atlantic, the Harvard philosopher Michael Sandel wrote a critique of genetic engineering titled "The Case Against Perfection." Now he has turned it into a book. The title is the same, but the text has changed, and sections have been added. That’s what human beings do. We try to improve things.
Sandel worries that this urge to improve can get us into trouble. Steroids, growth hormones, genetic engineering and other enhancements "pose a threat to human dignity" and "diminish our humanity," he argues. That’s the way ethicists talk: things are good or bad, human or inhuman. The book’s subtitle encapsulates this project: "Ethics in the Age of Genetic Engineering." But genetic engineering is too big for ethics. It changes human nature, and with it, our notions of good and bad. It even changes our notions of perfection. The problem with perfection in the age of self-transformation isn’t that it’s bad. The problem is that it’s incoherent.
Sandel’s critique is refreshingly sophisticated. Opponents of eugenic technologies usually complain that they’re unsafe, coercive, exploitative, nontherapeutic or unavailable to the poor. Sandel rebuts these objections, pointing out that they’re selectively applied and can be technically resolved. His deeper worry is that some kinds of enhancement violate the norms embedded in human practices. Baseball, for example, is supposed to develop and celebrate an array of talents. Steroids warp the game. Parents are supposed to cultivate children through unconditional as well as conditional love. Selecting a baby’s sex betrays that relationship.
How do we know these norms exist? Because when they’re violated, something in us rebels. When parents choose their children’s sex, we recoil. When baseball players use steroids, we frown. We, the audience, embody and express the norm.
But audiences, too, can change. Look what happened to Broadway musicals. Thanks to sound amplification, "audiences inevitably grew less alert, more passive," Sandel observes. Musicals became less verbally clever, but we no longer cared. We were "rehabituated." Sandel laments this, but it’s not clear why. If everything has changed — the practice, the audience, the norms — what basis for complaint remains?
When norms change, you can always find old fogeys who grouse that things aren’t the way they used to be. In the case of football, Sandel finds a retired N.F.L. player to support his contention that today’s bulked-up linemen are "degrading to the game" and to players’ "dignity." But eventually, the old fogeys die out, and the new norms solidify. Sandel recalls a scene from the movie "Chariots of Fire," set in the years before the 1924 Olympics, in which a runner was rebuked for using a coach. Supposedly, this violated the spirit of amateur competition. Today, nobody blinks at running coaches. The standpoint from which people used to find them unseemly is gone.
To defend the old ways against the new, Sandel needs something deeper: a common foundation for the various norms in sports, arts and parenting. He thinks he has found it in the idea of giftedness. To some degree, being a good parent, athlete or performer is about accepting and cherishing the raw material you’ve been given to work with. Strengthen your body, but respect it. Challenge your child, but love her. Celebrate nature. Don’t try to control everything.
Dancing With the Stars
In June 1948, as Jack Kerouac was recovering from another of the amphetamine-fueled joy rides immortalized in On the Road, Freeman Dyson, a young British physicist studying at Cornell, set off on a road trip of a different kind. Bound for Albuquerque with the loquacious Richard Feynman, the Neal Cassady of physics, at the wheel, the two scientists talked nonstop about the morality of nuclear weapons and, when they had exhausted that subject, how photons dance with electrons to produce the physical world. The hills and prairies that Dyson, still new to America, was admiring from the car window, the thunderstorm that stranded him and Feynman overnight in Oklahoma–all of nature's manifestations would be understood on a deeper level once the bugs were worked out of an unproven idea called quantum electrodynamics, or QED.
Dyson recounted the journey years later in Disturbing the Universe, contrasting Feynman's Beat-like soliloquies on particles and waves with the mannered presentations ("more technique than music") he heard later that summer from the Harvard physicist Julian Schwinger. On a Greyhound bus crossing Nebraska–Dyson had fallen in love with the American highway–he had an epiphany: his two colleagues were talking, in different languages, about the same thing.
It was a pivotal moment in the history of physics. With their contrasting visions joined into a single theory, Feynman, Schwinger and the Japanese scientist Sin-Itiro Tomonaga were honored in 1965 with a Nobel Prize, one that some think Dyson deserved a piece of.
In The Scientist as Rebel, a new collection of essays (many of them reviews first published in The New York Review of Books), he sounds content with his role as a bridge builder. "Tomonaga and Schwinger had built solid foundations on one side of a river of ignorance," he writes. "Feynman had built solid foundations on the other side, and my job was to design and build the cantilevers reaching out over the water until they met in the middle."
Drawing on this instinct for unlikely connections, Dyson has become one of science's most eloquent interpreters. From his perch at the Institute for Advanced Study in Princeton, he followed Disturbing the Universe, a remembrance of physics in the making, with Infinite in All Directions, his exuberant celebration of the universe, and other books like Weapons and Hope, Imagined Worlds and The Sun, the Genome and the Internet–speculations on how science and technology might one day ennoble humanity, eliminating war and poverty, if only we can avoid blowing ourselves up.
A New Journey into Hofstadter's Mind
The eternal golden braid emerges as a strange loop.

To get into a properly loopy mind-set for Douglas R. Hofstadter's new book on consciousness, I plugged a Webcam into my desktop computer and pointed it at the screen. In the first instant, an image of the screen appeared on the screen and then the screen inside the screen. Cycling round and round, the video signal rapidly gave rise to a long corridor leading toward a patch of shimmering blue, beckoning like the light at the end of death's tunnel.
Giving the camera a twist, I watched as the regress of rectangles took on a spiraling shape spinning fibonaccily deeper into nowhere. Somewhere along the way a spot of red--a glint of sunlight, I later realized--became caught in the swirl, which slowly congealed into a planet of red continents and blue seas. Zooming in closer, I explored a surface that was erupting with yellow, orange and green volcanoes. Like Homer Simpson putting a fork inside the microwave, I feared for a moment that I had ruptured the very fabric of space and time.
In I Am a Strange Loop, Hofstadter, a cognitive and computer scientist at Indiana University, describes a more elaborate experiment with video feedback that he did many years ago at Stanford University. By that time he had become obsessed with the paradoxical nature of Gödel's theorem, with its formulas that speak of themselves. Over the years this and other loopiness--Escher's drawings of hands drawing hands, Bach's involuted fugues--were added to the stew, along with the conviction that all of this had something to do with consciousness. What finally emerged, in 1979, was Gödel, Escher, Bach: An Eternal Golden Braid, one of the most captivating books I have ever read.
I still remember standing in the aisle of a bookstore in Washington, D.C., where I had just finished graduate school, devouring the pages. GEB, as the author calls it, is not so much a "read" as an experience, a total immersion into Hofstadter's mind. It is a great place to be, and for those without time for the scenic route, I Am a Strange Loop pulls out the big themes and develops them into a more focused picture of consciousness.
Creation vs. Darwin Takes Muslim Twist in Turkey

A lavishly illustrated Atlas of Creation is mysteriously turning up at schools and libraries in Turkey, proclaiming that Charles Darwin’s theory of evolution is the real root of terrorism.
Arriving unsolicited by post, the large-format tome offers 768 glossy pages of photographs and easy-to-read text to prove that God created the world with all its species.
At first sight, it looks like it could be the work of United States creationists, the Christian fundamentalists who believe the world was created in six days as told in the Bible.
But the author’s name, Harun Yahya, reveals the surprise inside. This is Islamic creationism, a richly funded movement based in predominantly Muslim Turkey which has an influence U.S. creationists could only dream of.
Creationism is so widely accepted here that Turkey placed last in a recent survey of public acceptance of evolution in 34 countries—just behind the United States.
"Darwinism is dead," said Kerim Balci of the Fethullah Gulen network, a moderate Islamic movement with many publications and schools but no link to the creationists who produced the atlas.
Scientists say pious Muslims in the government, which has its roots in political Islam, are trying to push Turkish education away from its traditionally secular approach.
Aykut Kence, biology professor at the Middle East Technical University in Ankara, said time for discussing evolution had been cut out of class schedules for the eighth grade this year.
Searching for the Truth About Nature

Scientists were once happy to be known as natural philosophers. The title implied not merely that they studied nature but that they thought about it in such a way as to discern its hidden laws. They weren’t concerned only with the how of things but with the why. The beautiful line of Virgil, "Happy is he who can recognize the causes of things," epitomized the endeavor. Causation in all its forms, from the collisions of solid bodies on earth to the remote arrangements of the First Cause beyond the empyrean, underlay natural laws. Goethe’s Faust, the mythic prototype of the philosopher-scientist, was driven to despair, as well as near-damnation, by his passion to know "what holds the world together in its deepest core." But Faust represents the end of an ancient tradition; for all his knowledge, he’s tormented by the world’s ultimate unknowability. And that bafflement "scorches his heart."
Is nature finally unintelligible? Even more disturbing, is nature intelligible in itself but beyond the power of humans to comprehend? These and other questions form the theme of Peter Dear’s excellent new book, The Intelligibility of Nature: How Science Makes Sense of the World (University of Chicago Press, 233 pages, $27.50). Mr. Dear, a historian of science at Cornell, provides a succinct history of modern science from the 17th century to the present by drawing on two complementary but conflicting aspects of the scientific method. The first involves the search for "intelligibility," or the truth about nature; the second concerns "instrumentality," or the practical uses scientists make of their discoveries. As he demonstrates, in lucid prose and well-chosen illustrations, these two aspects aren’t quite compatible, and yet, both have proved essential to the advancement of science.
The criterion of intelligibility sounds obvious but isn’t. As Mr. Dear shows, even Newton was harshly criticized, by the Dutch mathematician Christiaan Huyghens, among others, for his inability to explain the phenomenon of gravitation. He could describe the force and derive laws and inferences from it but he couldn’t account for it in a satisfactorily philosophical manner. The criticism stung Newton because he agreed with Huyghens. The seemingly insuperable problem was "action at a distance." How could one object—whether a heavenly body or the earth beneath an apple tree—influence another without some sustaining medium through which gravity could act? Mr. Dear cites a letter in which Newton protests, "Pray do not ascribe that notion to me, for the cause of gravity is what I do not pretend to know and therefore would take more time to consider of it."
Favored by the Gods
Happiness, according to current scientific thinking, depends less on our circumstances than on our genetic endowment

The sky was smeared with the lights from the midway, spinning, blinking, beckoning to risk takers, but I decided to go for a different kind of thrill: Ronnie and Donnie Galyon, the Siamese Twins, were at the Minnesota State Fair. Feeling some guilt, I bought my ticket and cautiously approached the window of the trailer they called home. Thirty years old, joined at the stomach, they were sitting on a sofa, craning their necks to watch television.
Twenty-five years later I am still struck by this dizzying conjunction of the grotesque and the mundane. Trying to project myself into their situation--a man with two heads, two men with one body--I felt only sickness, horror and a certainty that I would rather be dead. Yet there they were, traveling from town to town, leading some kind of life.
When we try to envision another's happiness, we suffer from arrogance and a poverty of imagination. In 1997, when science writer Natalie Angier interviewed Lori and Reba Schappell, connected at the back of the head and sporting different hairdos, each insisted that she was basically content.
"There are good days and bad days--so what?" Reba said. "This is what we know. We don't hate it. We live it every day." Lori was as emphatic: "People come up to me and say, 'You're such an inspiration. Now I realize how minor my own problems are compared to yours.' But they have no idea what problems I have or don't have, or what my life is like.''
Three recent books--two by scientists and one by a historian--take on the quest for the good life, in which common sense and the received wisdom of the ages are increasingly confronted by findings from psychology, neuroscience and genetics.
Book Review: Breaking the Spell, by Daniel Dennett

Daniel Dennett, one of America's most brilliant philosophers, wants to provoke. That's obvious not just from the title of his new book, Breaking the Spell, but from the image he uses to begin his 400-page analysis of why people have religious faith. An ant climbs laboriously up a blade of grass. Up it climbs, falls and climbs again… and again. Why? A parasite, a tiny brain worm, has commandeered the ant's brain because it needs to get itself into the stomach of a sheep or cow to complete its reproductive cycle. The ant's wellbeing is redundant. Could religion have commandeered human brains to ensure the survival of its own precepts without any regard to its hosts' wellbeing?
Dennett has spent much of his distinguished career as a philosopher showing how some of our most cherished ideals, such as freedom and justice, can be explained by evolution because they gave human genes an advantage in their struggle to replicate. In this new book he turns his formidable intellectual firepower on religion itself: it is the boldest and most ambitious attempt by a Darwinian to account for the phenomenon of religion. Can religion—its practices, beliefs and experiences—be explained simply in terms of genetic advantage? And has that genetic advantage passed its sell-by date—in other words, if religion was useful in the past, is it now?
These are big questions, so it seems fitting that a tall man with a big beard—bearing an uncanny resemblance to Charles Darwin, or to a child's idea of God—has launched himself into the centre of one of America's most fraught public debates, the nature of religion. In the UK on a tour of public debates across the country to promote the book, he makes it immediately clear that here is a man who has thought deeply about what he is contributing to that debate and why. He is a captivating thinker because he's a master of accessible, vivid analogies—such as that brain worm. Or to take another: referring to the human tendency towards a sweet tooth (he explains in the book why it evolved), he goes on to ask me, "Is religion sugar or saccharine? If it's the latter, we eliminate it at our peril, because sugar is worse. But if it's sugar, can we develop saccharine?"
The Universe in a Single Atom
Reason and Faith
It was his subtitle that bothered me. Spirituality is about the ineffable and unprovable, science about the physical world of demonstrable fact. With two such contradictory enterprises, divergence would seem the better goal. The last thing anyone needs is another attempt to contort biology to fit a particular religion or to use cosmology to prove the existence of God.
But this book offers something wiser: a compassionate and clearheaded account by a religious leader who not only respects science but, for the most part, embraces it. "If scientific analysis were conclusively to demonstrate certain claims in Buddhism to be false, then we must accept the findings of science and abandon those claims," he writes. No one who wants to understand the world "can ignore the basic insights of theories as key as evolution, relativity and quantum mechanics."
That is an extraordinary concession compared with the Christian apologias that dominate conferences devoted to reconciling science and religion. The "dialogues" implicitly begin with nonnegotiables - "Given that Jesus died on the cross and was bodily resurrected into heaven…" - then seek scientific justification for what is already assumed to be true.
The story of how someone so open-minded became the Tibetan Buddhist equivalent of the pope reads like a fairy tale. When the 13th Dalai Lama died in 1933 he was facing northeast, so a spiritual search team was sent in that direction to find his reincarnation. The quest narrowed further when a lama had a vision pointing to a certain house with unusual gutters. Inside a boy called out to the visitors, who showed him some toys and relics that would have belonged to him in his previous life. "It is mine!" he exclaimed, like any acquisitive 2-year-old, and so his reign began.