Eric Ormsby

A freelance journalist and specialist in Islamic intellectual traditions, Eric Ormsby writes on science, history, natural history, and religion. His work appears regularly in The Wall Street Journal, The New Republic, The New Yorker, The New Criterion, The Yale Review, The Paris Review, and the Times Literary Supplement. Parallel to his journalism career, he has served as a director of libraries and as a professor of Islamic Studies at Princeton University (where he received his Ph.D. in Near Eastern Studies), at McGill, and at the Institute of Ismaili Studies. He has also published several books, including Theodicy in Islamic Thought (1984), as well as numerous articles on Islam and a volume of essays.

Review
Wall Street Journal
published September 26, 2011

How the Secular World Began

Lucretius' poem, recovered in 1417, described atomic theory, chance's role in nature and the gods' indifference.

Book cover

We think of the Renaissance as a time of unsurpassed artistic accomplishment, but it was also a time of unrelenting strife. In the 14th century, the city-states of Italy were in constant turmoil; Milan warred against Venice, Florence against Rome. This was the period of the "Babylonian Captivity" of the church, when the papacy was established in Avignon, where it became a satellite of the French king. Central religious authority so disintegrated that eventually no fewer than three popes would simultaneously claim the chair of St. Peter. Reformers to the north began issuing denunciations of a clergy and hierarchy that had grown ever more corrupt and licentious.

In this age of chaos and confusion, a small band of learned men, the Renaissance humanists, clung to a vision of the past that seemed to hold out some hope of tranquility. Antiquity offered models of noble conduct, but they survived only in broken statues or in the mildewed pages of manuscripts hidden for centuries in remote monasteries. To locate and copy manuscripts that contained the voices of the vanished past in all their unsurpassed eloquence became, for some scholars, an overmastering obsession.

The remarkable Poggio Bracciolini (1380-1459) is hardly remembered today, even in his Tuscan hometown of Terranuova. Yet his discoveries led, by slow, almost imperceptible steps, to a revolution in Western thought. In "The Swerve," Shakespearean scholar Stephen Greenblatt traces Poggio's extraordinary career and legacy. He served as apostolic secretary to one of the three rival claimants to the papacy, John XXIII. He was a calligrapher of genius, creating elegant scripts (beautifully illustrated in one of Mr. Greenblatt's well-chosen plates). And he was a book-hunter of the most dogged sort.

Aided by his official position, Poggio used every skill at his command—diplomatic suavity, flattery, displays of erudition—to talk his way into otherwise inaccessible monastic libraries. Underlying his quest was a conviction that the good order of the world and of society depended upon a disciplined reverence for language, specifically the Latin language; true eloquence had a moral foundation. Though Poggio was responsible for many priceless finds—the works of Vitruvius and Quintilian, the letters of Cicero—it was thanks to his discovery in 1417 of the sole surviving manuscript of the Roman poet Lucretius' "On the Nature of Things" that, in Mr. Greenblatt's phrase, "the world swerved in a new direction."

read more…

Review
The New York Times
published September 9, 2011

Holy War: Why Vasco da Gama Went to India

Graphic credit: Mansell/Time & Life Pictures — Getty Images; Description: An etching of Vasco da Gama.

The Portuguese navigator Vasco da Gama set sail from Belém, a village at the mouth of the Tagus River now part of greater Lisbon, on July 8, 1497. An obscure but well-connected courtier, he had been chosen, much to everyone’s surprise, by King Manuel I to head the ambitious expedition to chart a new route to India. The king was not moved chiefly by a desire for plunder. He possessed a visionary cast of mind bordering on derangement; he saw himself spearheading a holy war to topple Islam, recover Jerusalem from "the infidels" and establish himself as the "King of Jerusalem."

Da Gama shared these dreams, but like his hard-bitten crew, rogues or criminals to a man, he coveted the fabled riches of the East — not only gold and gems but spices, then the most precious of commodities. On this voyage, as on his two later ones, he proved a brilliant navigator and commander. But where courage could not bring him through violent storms, contrary seas and the machinations of hostile rulers, luck came to his rescue. He sailed blindly, virtually by instinct, without maps, charts or reliable pilots, into unknown oceans.

As Nigel Cliff, a historian and journalist, demonstrates in his lively and ambitious "Holy War," da Gama was abetted as much by ignorance as by skill and daring. To discover the sea route to India, he deliberately set his course in a different direction from Columbus, his great seafaring rival. Instead of heading west, da Gama went south. His ships inched their way down the African coast, voyaging thousands of miles farther than any previous explorer. After months of sailing, he rounded the Cape of Good Hope, which Bartolomeu Dias had first doubled a decade earlier. From there, creeping up the east coast of Africa, he embarked on the uncharted vastness of the Indian Ocean. Uncharted, that is, by European navigators. For at the time, the Indian Ocean was crisscrossed by Muslim vessels, and it was Muslim merchants, backed up by powerful local rulers, who controlled the trade routes and had done so for centuries. Da Gama sought to break this maritime dominance; even stronger was his ambition to discover the Christians of India and their "long-lost Christian king," the legendary Prester John, and by forging an alliance with them, to unite Christianity and destroy Islam.

read more…

Review
Wall Street Journal
published December 31, 2010

The Gods Return

A solution to the 'lostness' of the modern world

Book cover for All Things Shining

For the ancient Greeks, the universe was filled with shining presences that reflected the passing moods of the gods. Each god had a privileged sphere: Ares presided over war, Aphrodite over sexual love, Hera over marriage; Zeus was the lord of all the gods, the hurler of thunderbolts from on high; and so on for the entire rowdy pantheon. It has long been customary to dismiss these gods as mere stage presences, as convenient explanations for random catastrophes or as fall guys for human motives. Helen caused the Trojan War but, hey, "golden Aphrodite" made her do it.

The authors of "All Things Shining" will have none of this. Though both are professional philosophers—Hubert Dreyfus at Berkeley and Sean Dorrance Kelly at Harvard—they view the ancient Homeric gods as hidden presences still susceptible of invocation. Indeed, they hold out "Homeric polytheism" as a solution to the "lostness" of the contemporary world. They begin by stating that "the world doesn't matter to us the way it used to." They are concerned to elucidate why this may be so. But this is no bland academic exercise. "All Things Shining" is an inspirational book but a highly intelligent and impassioned one. The authors set out to analyze our contemporary nihilism the better to remedy it.

And they provide, really, several books in one. "All Things Shining" offers a concise history of Western thought, beginning with Homer and concluding with Descartes and Kant. But there are extended discussions as well of such contemporary authors as the late David Foster Wallace and, even more startling, of "chick lit" novelist Elizabeth Gilbert. Among much else, the authors contrast Wallace's fraught, "radical" idea of individual freedom and autonomy with Ms. Gilbert's pre-Renaissance view of creativity, according to which "writing well" happens, as the authors put it, "when the gods of writing shine upon her."

A discussion of Odysseus's encounter with his wife Penelope's suitors—who try to kill him at the end of "The Odyssey"—is illumined by a comparison with the scene in the movie "Pulp Fiction" in which Jules and Vincent argue over whether their close call with a gunman was a miracle or merely a statistical probability. In the book's impressive chapter on Melville and "Moby Dick," the authors argue that the wildly changeable tones and shifts of subject matter in the novel are a key to its meaning. To take our moods seriously—"both our highest, soaring joys and our deepest, darkest descents"—is, for Melville, "to be open to the manifold truths our moods reveal."

The authors' general theme, and lament, is that we are no longer "open to the world." We fall prey either to "manufactured confidence" that sweeps aside all obstacles or to a kind of addictive passivity, typified by "blogs and social networking sites." Both are equally unperceptive. By contrast, the Homeric hero is keenly aware of the outside world; indeed, he has no interior life at all. His emotions are public, and they are shared; he lives in a community of attentiveness. He aspires to what in Greek is termed "areté," not "virtue," as it is usually translated, but that peculiar "excellence" that comes from acting in accord with the divine presence, however it may manifest itself.

read more…

Review
The New Criterion
published June 2010

The Shock Philosopher

On the provocative thinking of Simone Weil

Jacket cover of the Relevance of the Radical

Simone Weil was born in Paris just over a century ago, on February 3, 1909, and though she always remained fiercely loyal to France—and was French to her fingertips—the case could be made that her true homeland lay elsewhere, deep in the hazier and far more fractious republic of Contradiction. There she was, however perilously, chez elle. Weil displayed alarming aplomb on the horns of dilemmas; often she teetered on several simultaneously. She went after the paradoxical, the contradictory, the oppositional, with the rapt single-mindedness of a collector and the grim fervor of a truffle-hound. She exulted in polarities. Though many of Weil’s statements have an aphoristic cast or masquerade as bons mots, they are anything but witty in the usual bantering sense. Weil is an irritating thinker; her words impart an acrid aftertaste; they leave scratchy nettles behind. She meant to provoke, to jostle, to unsettle—she was an activist as well as a contemplative—and the play of opposition served this purpose.

But Weil also saw the summoning of opposites as a way of knowing. It was an ancient way, much favored by her beloved Greeks: the Stoic Chrysippus taught that we discern good and evil only by their opposition; in one of Heraclitus’s fragments, he is reported as saying, "God: day, night; winter, summer; war, peace; satiety, hunger." The dark saying could have been Weil’s own. For her, truth was to be found, if at all, in the tension of antinomies. Her thought, at its most perceptive as at its most repellent, draws its remarkable energy from that tension.

Weil’s life, too, was rife with contradiction, much of it in keeping with a familiar modern pattern. Born into a well-off, middle-class Jewish family, she was, over the course of her short life, now a Bolshevik, now an anarchist, now a labor union activist, now a (reluctant) combatant on the Republican side in the Spanish Civil War and, after 1937, as a result of some sort of mystical experience in Assisi, a fervent Roman Catholic believer (though she refused to be baptized). In each of these metamorphoses, she found herself embroiled in opposition; she could not join a group or a movement or an institution without almost immediately dissociating herself from it.

read more…

Review
The New York Times
published March 12, 2010

Butchers and Saints

Illustration of crusader by Stephen Savage

The villains of history seem relatively easy to understand; however awful their deeds, their motives remain recognizable. But the good guys, those their contemporaries saw as heroes or saints, often puzzle and appall. They did the cruelest things for the loftiest of motives; they sang hymns as they waded through blood. Nowhere, perhaps, is this contradiction more apparent than in the history of the Crusades. When the victorious knights of the First Crusade finally stood in Jerusalem, on July 15, 1099, they were, in the words of the chronicler William of Tyre, "dripping with blood from head to foot." They had massacred the populace. But in the same breath, William praised the "pious devotion . . . with which the pilgrims drew near to the holy places, the exultation of heart and happiness of spirit with which they kissed the memorials of the Lord’s sojourn on earth."

It’s tempting to dismiss the crusaders’ piety as sheer hypocrisy. In fact, their faith was as pure as their savagery. As Jonathan Phillips observes in his excellent new history — in case we needed reminding at this late date — "faith lies at the heart of holy war." For some, of course, this will be proof that something irremediably lethal lies at the heart of all religious belief. But the same fervor that led to horrific butchery, on both the Christian and the Muslim sides, also inspired extraordinary efforts of self-sacrifice, of genuine heroism and even, at rare moments, of simple human kindness. Phillips, professor of crusading history at the University of London, doesn’t try to reconcile these extremes; he presents them in all their baffling disparity. This approach gives a cool, almost documentary power to his narrative.

At the same time, "Holy Warriors" is what Phillips calls a "character driven" account. The book is alive with extravagantly varied figures, from popes both dithering and decisive to vociferous abbots and conniving kings; saints rub shoulders with "flea pickers." If Richard the Lion-Hearted and Saladin dominate the account, perhaps unavoidably, there are also vivid cameos of such lesser-known personalities as the formidable Queen Melisende of Jerusalem and her rebellious sister Alice of Antioch. Heraclius, the patriarch of Jerusalem, is glimpsed in an embarrassing moment when a brazen messenger announces to the assembled high court where he sits in session that his mistress, Pasque, has just given birth to a daughter.

read more…

Review
Standpoint
published December 2009

Holier than Thou

Albrecht Durer's depiction of the Turin Shroud

In a letter of 1521, Martin Luther exhorted his fellow reformer Philipp Melanchthon to "be a sinner and sin strongly, but have faith and rejoice in Christ even more strongly!" The antithesis is carefully couched, suggesting a subtle dynamic between the extremes of bold sinfulness and joyful faith as though in some indefinable way they fed upon one another (and perhaps they do). Luther's words convey that tremulous equipoise of irreconcilables which has characterised Christian belief from its beginnings. In his new, massive history of Christianity, the distinguished Reformation scholar Diarmaid MacCulloch balks at such robust paradoxes. Unreason — that "faith in things unseen" — leaves him queasy. It leads to beliefs he finds preposterous. Christianity intrigues him because he cannot understand "how something so apparently crazy can be so captivating to millions of other members of my species". It inspires intolerance, bigotry, fanaticism and their murderous consequences. "For most of its existence," he writes, "Christianity has been the most intolerant of world faiths." As if this weren't bad enough, it indulges in "gender-skewed language."

Although MacCulloch purports to be writing a history for the general reader — his book was the basis for "a major BBC TV series" this autumn — his take on Christianity is highly tendentious. When he sticks to events, such as the Council of Chalcedon in 451, which provoked the first momentous schism in Christian history, or when he untangles obscure doctrinal disputes, ranging from the controversies incited by the Iconoclasts to the baffled modern clashes between genteel traditional Protestants and rowdy Pentecostals, he can be superb. His scope is enormous. His discussion of Christianity in Ethiopia is as thorough as his explorations of 19th-century American revivalist movements. And his attention to often disregarded detail is impressive. His affectionate references to devotional music, from the hymns of Charles Wesley to Negro spirituals to the old Roman Catholic service of Benediction, enliven his account. Unsurprisingly, as the author of the magisterial Reformation: Europe's House Divided 1490-1700, he is excellent on the rise of Protestantism and on the Catholic Counter-Reformation. But he's just as informed, and as informative, about recent developments, whether the Second Vatican Council or the Orthodox Church in post-Soviet Russia.

read more…

Review
Wall Street Journal
published September 11, 2009

Fateful Schism

Tracing the history of a religious divide that still haunts the world

Image: Brooklyn Museum/Corbis ‘The Battle of Karbala,’ Abbas Al-Musavi’s depiction, circa 1900, of a seventh-century clash of Islamic factions.

When the Prophet Muhammad died unexpectedly after a brief illness in Medina, in present-day Saudi Arabia, on June 8, 632, his followers were stunned. A contemporary called it "the greatest of calamities." Their grief was not only for the loss of an irreplaceable leader. Muhammad was "the seal of the prophets," the last in a line that stretched back to Adam. He had received revelations as "God's emissary" for some 20 years—revelations that he had communicated to the embattled community of his followers, first in Mecca and then, after the hijra, or emigration, in 622, in Medina—but now they came to an end. It was as though God, who revealed Himself through the Prophet, had suddenly fallen silent.

In fact, the calamity was greater than Muhammad's mourners could have foreseen. Muhammad had not unambiguously named his successor. The question of succession would haunt Islam for centuries to come. The wrangling began within hours of Muhammad's death; it would quickly lead to a momentous rift between two implacable factions, Shia and Sunni. It is a divide that continues to this day, often with horrific consequences. In "After the Prophet," veteran Middle East journalist Lesley Hazleton tells with great flair this "epic story of the Shia-Sunni split in Islam," as she rightly calls it.

Those who supported Muhammad's cousin and son-in-law Ali found themselves pitted against those who favored Abu Bakr, the Prophet's closest friend. Muhammad was also Abu Bakr's son-in-law: Abu Bakr's daughter Aisha was Muhammad's third, and favorite, wife, and a force to reckon with in her own right. Ali's supporters formed the "shi'at Ali," the "party of Ali," from which the term Shia derives. The partisans of Abu Bakr would come to be known as "Sunni" Muslims—those who follow the "sunna," the code of pious practice based on the Prophet's example.

That Abu Bakr was almost immediately named caliph—the title then meant no more than "successor"—embittered Ali's supporters; when their man was passed over for the caliphate two more times they felt that a monstrous injustice had been perpetrated. Ali did finally accede to the caliphate in 656, but his claim was contested. When he was assassinated in the mosque of Kufa, in 661, by an extremist wielding a sword laced with poison, his murder struck a tragic note that would reverberate ever after. The Sunni-Shia schism pitted Muslim against Muslim and led to civil wars, massacres and assassinations, and even the collapse of dynasties.

read more…

Review
Standpoint
published April 2009

A Pretence of Progress

Photo credit: Julia Vitullo-Martin; Description: Tariq Ramadan, at Templeton-Cambridge Journalism Fellowships

To negotiate the vague and tangled pathways of the odd parallel universe where Tariq Ramadan, the Professor of Islamic Studies at Oxford's Faculty of Theology, holds sway is an unsettling experience. It is to find oneself in a realm where common words, such as "reform" or "ethics" or even "universe", mean both more and less than they say, often simultaneously. His is a coded discourse. Here even solecism serves. He can write "toward He whom" but this, despite Ramadan's often muddled English, probably isn't the clumsy error it seems: behind "He" lurks the Arabic personal pronoun Huwa, a common designation for God in pious texts. Since there is no self-standing accusative pronoun in Arabic, English grammar must be twisted into submission.

Even the title of his new book is artfully misleading. The phrase "radical reform" raises high expectations, suggesting a bold attempt to strike at the "root" of a stubborn intransigence. But, as it turns out, Ramadan means something quite different. For the reform he proposes addresses the theoretical jurisprudence of Islam, known in Arabic as "the roots of the law" (usul al-fiqh), as opposed to "the branches" (furu' al-fiqh), the specific practical rulings enunciated by judges and legal experts.

Such a project would be admirable, as well as brave, if carried through. Islam - in this, like Judaism - is intensely legalistic. Its religious scholars have almost all been jurists by training, and those theoretical "roots" sustain disciplines as lofty as Koranic exegesis and as mundane as the issuance of fatwas. A fresh examination, let alone a reform, of the "roots" of Islamic law could have momentous consequences, and not only for Muslims. But in fact, Ramadan stands in a long line of Islamic reformers, beginning with the brilliant Egyptian theologian Muhammad ‘Abduh and his shadier sidekick Jamal al-Din al-Afghani in the late 19th century, who advocate "reform" less as a way of ushering Islam into the modern world than as a means of insulating it from deleterious "Western" influences. Ramadan shares this agenda but with a crucial difference. He espouses a reform that replaces "adaptation" with "transformation". Muslims should no longer merely accommodate modernity as best they can in anxious conformity with their beliefs but strive to transform the modern world, infusing it with Islamic values.

read more…

Review
New York Sun
published May 7, 2008

Newton's Single Vision

Newton by Peter Ackroyd

Isaac Newton's sketches for a reflecting telescope and its component parts; credit: Library of Congress

For Isaac Newton (1642–1727), the universe was governed by precise laws which could not only be formulated but mathematically proved to a certainty. These physical laws were not sporadic or local; they were universal and extended "everywhere to immense distances," as he wrote in The Principia: Mathematical Principles of Natural Philosophy, first published in 1687. Newton's three laws of motion may not apply at the atomic level or under conditions approaching the speed of light, as we now know, but they apply everywhere else. The fall of that famous apple was no less an effect of universal gravitation than the rhythms of the tides or the orbits of the planets.

But to prove the law of gravity, though an unparalleled accomplishment, was not to understand its final cause. Newton wrote, again in The Principia, that "I have not as yet been able to deduce from phenomena the reason for these properties of gravity." (That "as yet" demonstrates both Newton's supreme self-confidence and his rigorous honesty. To this day no one else has deduced those "properties" either.) In a statement that stands as his scientific signature, he added, "et hypotheses non fingo"—"and I do not feign hypotheses." Even so, this same scorner of the hypothetical would spend much of his career after the amazing two-year period of his greatest discoveries in 1664–66 dabbling obsessively in alchemy, as well as pursuing increasingly fantastic numerological investigations of Scripture.

read more…

Review
New York Sun
published January 30, 2008

Looking Through the Other End of the Microscope

Cover of "A Guinea Pig's History of Biology"

Scientific discoveries are not only cumulative but a bit haphazard. They emerge in reaction to wrongheaded hypotheses as well as from moments of sudden insight. Often they occur inadvertently or by sheer dumb luck. Almost always, despite the genius of their discoverers, they are the product of the long, stubborn investigation of other scientists who remain obscure. Neither James Watson nor Francis Crick would have arrived at an understanding of the double helix without the prior work of Rosalind Franklin at King's College London, for instance, but their names, and not hers, will forever be associated with that breakthrough.

And there are other even more obscure collaborators. The fruit fly, the evening primrose, and the guinea pig would seem to have little in common. And yet, along with a few other unassuming creatures, they have a just claim to be considered the unsung heroes of biology. As Jim Endersby notes in his fascinating "A Guinea Pig's History of Biology" (Harvard, 499 pages, $27.95), progress in biology owes as much to the hawkweed and the humble corncob as it does to the brainstorms of scientists. Mr. Endersby has had the happy idea of tracing the successes of modern biological research through the subjects which have made it possible. Each of his chapters focuses on a particular plant or animal, from the now extinct quagga, a cantankerous relative of the zebra, to microscopic bacteriophage, or "bacteria eaters," and culminating in the genetically engineered OncoMouse®, one of the first rodents with a full-fledged patent all its own. As he points out, "the history of biology has, in part, been the story of finding the right animals or plants to aid the search."

That search was twofold. It drew on practical considerations: to improve breeding lines, beginning with racehorses but extending to livestock; to develop cash crops with higher yields, culminating in our current, and controversial, genetically modified crops (of which Mr. Endersby provides a cautious and very balanced assessment); and, most crucially, to understand and find cures for devastating diseases. But it was also a search for something fundamental and far more elusive. In Mr. Endersby's account, the history of modern biology is a story of challenged assumptions, of refusing to accept easy explanations, of a willingness to ask apparently silly questions and to pursue the answers to them with astonishing doggedness.

read more…

Review
New York Sun
published December 26, 2007

A Map of the Heavens

Photo: A 17th-century Dutch engraving of the Copernican solar system, with the sun at the center; credit: New York Sun

Much of the genius of Nicolaus Copernicus (1473–1543) lay in a mix of audacity and exactitude. His boldest leaps of insight sprang from laborious plodding. Years of careful computation, based on sporadic stargazing with the crudest of instruments, lay behind his astonishing discoveries: Our earth was not the fixed center of the universe, nor did the sun and the stars move around us in perfect epicycles, as Ptolemy had argued more than a millennium earlier; in fact, our earth not only revolved around the sun but rotated on its axis. Nor were the heavens themselves static: They moved as well.

When his "On the Revolutions of the Heavenly Spheres" finally appeared in 1543, after decades of delay — he saw the first printed copy on the very day of his death — he not only turned human beings out of the cozy nest of their fondest assumptions, but rejoiced in the eviction. In the first book of his great work, he states, "Indeed, the Sun as if seated on a royal throne governs his household of stars as they circle around him." A heliocentric cosmos demonstrated to him "the marvelous symmetry of the universe."

As Jack Repcheck demonstrates in his excellent "Copernicus' Secret: How the Scientific Revolution Began" (Simon & Schuster, 255 pages, $25), the Polish astronomer and mathematician is not simply the pure empiricist we might imagine. Born Mikolaj Kopernik in the town of Torun on the Vistula, Copernicus combined religious fervor with scientific rigor in almost equal measure.

Of course, this wasn't unusual in the 16th century: Virtually all scientists then were believers, but most of them looked to nature only

read more…

Review
The New Criterion
published November 2007

A Mind Emparadised

A review of Dante's Paradiso, translated by Robert and Jean Hollander

Jacket cover of Paradiso by Dante Alighieri, translated by Robert and Jean Hollander.

The ineffability of mystical experience is an ancient commonplace. Not surprisingly, Dante alludes to it often with baffled exasperation throughout his Paradiso. In Canto 3, he states that "the sweetness of eternal life" can never be understood by the intellect; rather, it must be "tasted." This is a reference to a verse from Psalm 33, often cited by medieval mystics, "Taste and see that the Lord is good." A century before Dante, Richard of St. Victor, among others, had adduced it to illustrate "the tasting of inner sweetness," that dulcedo Dei which is the sweetness of God Himself. For such mystics, the experience of God was not verbal but gustatory, at once intimate and incommunicable. Hearing and seeing, touching and smelling, might play a part but they too were "spiritual senses." God can be seen but only with "the eyes of the heart." For Dante, when he set about composing the Paradiso, sometime around the year 1317, the inexpressibility of blessedness presented a seemingly insoluble difficulty. For the final cantica of the Divine Comedy is above all, and avowedly, a narrative of intellect; its drama arises from the struggle of "a mind emparadised" first to conceptualize, and then to articulate, the inherently indescribable. By now, Inferno and Purgatorio stood complete (though he continued to revise them), but there Dante had traversed readily imaginable terrains. We all have a sense of hell; even purgatory is as gnawingly real as our next best intention. Heaven remains the unimaginable kingdom.

In his efforts not so much to overcome this impossibility as to dramatize its effects, Dante has recourse both to the formulations of Scholastic theology and to the supple inventiveness of language itself. The originality of his language can hardly be overstated. By means of multiple allusion, wordplay, puns, and acrostics, as well as the entire repertoire of medieval rhetoric, he pushes his beloved vernacular to the limit. He draws on Latin and on Provençal, and even rhymes at one point in Hebrew. He revives old words and invents a score of new ones. At the outset, in Canto 1, he coins a new verb and pairs it with a Latin phrase to make the audacity of his unprecedented venture plain. In the new translation by Jean and Robert Hollander, the lines read, "To soar beyond the human cannot be described/ in words."[1] The Italian is terser: "Trasumanar significar per verba/ non si poria." The translation—to which I’ll return—doesn’t quite catch the effect of those two rhyming four-syllable infinitives capped by a Latin tag. (Longfellow was more literal as well as more accurate, in his 1865 version, when he rendered it as "To represent transhumanize in words/ Impossible were.") The odd verb trasumanar is Dante’s invention and at first sight it mystifies. It compounds the deliberate obduracy of the verse; we are brought up short by the word just as the poet is brought up short by the boldness of his own ambition.

read more…

Review
Wall Street Journal
published December 6, 2006

When Empires Collide

In one of Aesop's Fables a stag takes refuge on a cliff to escape his hunters. He feels safe as long as he can survey the landscape below him. But a boatload of hunters coming upriver spot his silhouette against the sky and bring him down from his blind side. The Holy Roman Emperor Leopold I (1640-1705) resembled that unfortunate stag. He was so fixated on the threat from France and the aggressive designs of Louis XIV that he underestimated a far worse menace from the East. That, combined with his legendary procrastination, almost cost him Vienna and his empire.

In 1683, the Ottoman Turks under Mehmed IV, still smarting from the failure of Suleiman the Magnificent to take Vienna in 1529, began preparing for a new assault on the ultimate prize. Victory, which lay almost within their grasp, would have spelled the end of the Holy Roman Empire. The heartland of Europe would have become yet another unruly Ottoman province.

In his splendid study The Siege of Vienna, the Oxford historian John Stoye provides a detailed account of the intricate machinations, involving a bewildering cast of characters, that led up to this near-debacle. For this was not simply a contest between the Habsburgs and the Ottomans but a quarrel among a host of nations and factions -- Hungarians, Serbs, Poles, Tartars and others -- each of which had its own vital interests and strategic agendas at stake.

read more…

Article
New York Sun
published September 13, 2006

The Magnificence of How

Photo credit: Kitty Barnes; Description: Owen Gingerich at Templeton-Cambridge Journalism Fellowships

In the 1970s, when the big-bang model for the origins of the universe at last seemed firmly established, Christian, Jewish, and even some Muslim preachers and exegetes took heart. Hadn't modern cosmology at long last proved what scripture always claimed? The universe emerged in a single indefinable instant. Creation out of nothing stood confirmed. Genesis had been vindicated.

The troublesome fact that big bang cosmology offers a model of how the cosmos came into being from a dimensionless point of infinite density but says nothing about what—or who—precipitated that primordial explosion (whose effects still determine our world, some 15 billion years later), hardly fazed these eager explicators. But the question nags. How far are we entitled to draw metaphysical inferences from scientific models?

Believers aren't alone in shoring up doctrine with data. Skeptics, including many scientists, do it routinely. The evolutionary biologist Richard Dawkins draws on Darwin to promote an atheistic agenda of well-nigh evangelical intensity, and he's hardly an isolated instance. Yet even the most stubborn doubter can occasionally be touched by puzzlement. The great English astronomer Fred Hoyle, a convinced atheist, was shaken when his researches into the way elements are formed in the hot hearts of stars showed that the nucleus of carbon possessed unique qualities that guaranteed its abundance, as though this fundamental component of life had been provided for—virtually designed into—the cosmic crucible. Hoyle grumbled about someone "monkeying with" the cosmos, which he now suspected was "a put-up job," and this rattled his atheism.

The anecdote comes from a remarkable new book by a Harvard astronomer and historian of science, Owen Gingerich, titled God's Universe (Harvard University Press, 144 pages, $16.95). Based on his William Belden Noble Lectures, delivered in 2005, Mr. Gingerich's work is a survey of the conflicts—and confluences—between hard science and deep faith; along the way he provides a brief but magisterial history of science that is as astute as it is original. He's a superb writer too, handling scientific and theological complexities with equal aplomb but enlivening his account throughout with poetry, dramatic anecdote, and snippets of autobiography.

read more…

Article
New York Sun
published August 23, 2006

Searching for the Truth About Nature

Cover of The Intelligibility of Nature

Scientists were once happy to be known as natural philosophers. The title implied not merely that they studied nature but that they thought about it in such a way as to discern its hidden laws. They weren’t concerned only with the how of things but with the why. The beautiful line of Virgil, "Happy is he who can recognize the causes of things," epitomized the endeavor. Causation in all its forms, from the collisions of solid bodies on earth to the remote arrangements of the First Cause beyond the empyrean, underlay natural laws. Goethe’s Faust, the mythic prototype of the philosopher-scientist, was driven to despair, as well as near-damnation, by his passion to know "what holds the world together in its deepest core." But Faust represents the end of an ancient tradition; for all his knowledge, he’s tormented by the world’s ultimate unknowability. And that bafflement "scorches his heart."

Is nature finally unintelligible? Even more disturbing, is nature intelligible in itself but beyond the power of humans to comprehend? These and other questions form the theme of Peter Dear’s excellent new book, The Intelligibility of Nature: How Science Makes Sense of the World (University of Chicago Press, 233 pages, $27.50). Mr. Dear, a historian of science at Cornell, provides a succinct history of modern science from the 17th century to the present by drawing on two complementary but conflicting aspects of the scientific method. The first involves the search for "intelligibility," or the truth about nature; the second concerns "instrumentality," or the practical uses scientists make of their discoveries. As he demonstrates, in lucid prose and well-chosen illustrations, these two aspects aren’t quite compatible, and yet, both have proved essential to the advancement of science.

The criterion of intelligibility sounds obvious but isn’t. As Mr. Dear shows, even Newton was harshly criticized, by the Dutch mathematician Christiaan Huyghens, among others, for his inability to explain the phenomenon of gravitation. He could describe the force and derive laws and inferences from it but he couldn’t account for it in a satisfactorily philosophical manner. The criticism stung Newton because he agreed with Huyghens. The seemingly insuperable problem was "action at a distance." How could one object—whether a heavenly body or the earth beneath an apple tree—influence another without some sustaining medium through which gravity could act? Mr. Dear cites a letter in which Newton protests, "Pray do not ascribe that notion to me, for the cause of gravity is what I do not pretend to know and therefore would take more time to consider of it."

read more…

Article
The New York Times
published June 7, 2006

The Bump of Reverence

Cover of The Autobiography of Charles Darwin

It's almost impossible for us to recapture the pre-Darwinian notion of a species or an individual creature as having issued in its final configuration directly from the hand of its maker. We can't escape an awareness of the countless mutations and adaptations that every being, including ourselves, has undergone in the long process of evolution. Poets attempt to recover this lost sense of essence. When Rilke writes about a flamingo, he sees it "under the aegis of eternity." It would have been interesting and startlingly original had he somehow glimpsed, and been able to convey, the shadowy precursors—all those vanished proto-flamingos—that went to form his transcendental waterfowl, but this would have destroyed the Platonic fiction on which his vision depended.

As a young man, Charles Darwin himself was an avid reader of poetry; he "took intense delight," he tells us, in Shakespeare but loved Milton and Shelley, Byron and Coleridge, as well. In his later years he lost this taste and found poetry intolerable, preferring the popular novels his wife read aloud to him at stated intervals over each working day. His growing indifference to poetry, as well as to music, puzzled and disturbed Darwin; he saw it as a possible symptom of mental decline.

Darwin recounts this change in the account of his life and career that he wrote at the request of his family over a five-year period from 1876 to 1881, a year before his death. Like much else in The Autobiography of Charles Darwin (Totem Books, 154 pages, $15), it leaves a faint sense of puzzlement. All his life Darwin suffered from a mysterious illness, involving severe headaches, prolonged bouts of vomiting, and enervating weakness, that disabled him for months at a time. He has been retrospectively diagnosed as suffering from everything from Chagas disease, picked up during his five years aboard the Beagle, to neurotic hypochondria, caused by anxiety over the reception of his theories. But sickness, though it slowed him down, never deflected the stubborn progress of his research or writing, nor did it impair the singular vigor of his prose. The cause must lie elsewhere.

read more…