In The Nation:
The publication of any book by J.G. Ballard at this moment–let alone so colossal and career-spanning a volume as The Complete Stories, running to nearly 1,200 pages–is an occurrence that can only be about more than itself. All writers are writers of their time, of course, but Ballard, who died in April 2009 after a fight with cancer, feels somehow uniquely, precisely so. This book marks the fact that we are all post-Ballard now: it's not that we've gotten beyond him but rather that we remain ineluctably defined by him. Completists have pointed out that, its title notwithstanding, this volume is not a truly comprehensive collection of all Ballard's published short fiction. Those few omissions are a disappointment. Nevertheless, they are few, and despite them the book is indispensable.
The volume's ninety-eight stories (including two written for this edition) are printed in chronological order of publication, which illuminates Ballard's trajectory. There is something fascinating and poignant about watching various obsessions appear, reappear or come gradually or suddenly into focus: birds, flying machines, ruins, beaches, obscure geometric designs, the often-noted empty swimming pools. That the earlier stories are on the whole less compelling than the later, and more numerous, suggests a career-long process of distillation, a rendering-down. Both in facility and insight, early works such as the wincingly punning “Prima Belladonna”–the first of many journeys to Vermilion Sands, an artists' colony-cum-fading seaside resort supposedly somewhere in the real world though full of impossibilities and dream technologies–or “Now: Zero” and “Track 12,” rather overwrought Dahl-esque tales of the unexpected, are slight compared with the later dense and strange forensics. Many of the stories function as testing grounds for Ballard's novels. For the admirer of his longer work there is the slightly disconcerting pleasure of déjà vu, of stumbling into précis and dry runs. Here are various aspects of Empire of the Sun, Crash, The Crystal World. This book is a valedictory, an event, the ground-laying for investigations.
Ed Yong over at his blog Not Exactly Rocket Science:
Today, the city of Angkor in Cambodia lies in ruins. But a thousand years ago, life there was very different. Then, Angkor was the heart of the Khmer empire and the largest preindustrial city of its day. It had a population of a million and an area that rivalled modern Los Angeles. And the key to this vast urban sprawl was water.
Radar images of the city by the Greater Angkor Project (GAP) revealed that Angkor was carefully designed to collect, store and distribute water. The “Hydraulic City” included miles of canals and dikes, irrigation channels for supplying crops, overflow channels to cope with a monsoon, massive storage areas (the largest of which was 16 km² in area), and even a river diverted into a reservoir. Water was the city’s most precious resource, allowing it to thrive in the most unlikely of locations – the middle of a tropical forest.
But water, or rather a lack of it, may have been part of Angkor’s downfall. Brendan Buckley from Columbia University has reconstructed the climate of Angkor over the last 750 years, encompassing the final centuries of the Khmer Empire. The records show that Angkor was hit by two ferocious droughts in the mid-14th and early 15th centuries, each lasting for a few decades. Without a reliable source of water, the Hydraulic City’s aquatic network dried up. It may have been the coup de grâce for a civilisation that was already in severe decline.
Many theories have been put forward for the downfall of Angkor, from war with the Siamese to erosion of the state religion. All of these ideas have proved difficult to back up, despite a century of research. Partly, that’s because the area hasn’t yielded much in the way of historical texts after the 13th century. But texts aren’t the only way of studying Angkor’s history. Buckley’s reconstruction relies on a very different but more telling source of information – Fujian cypress trees.
Annalee Newitz over at io9:
Most creatures on Earth have one sex that fertilizes and one that gets fertilized. Not so with olive trees. Last week scientists described how these trees evolved a system of males and a third sex which can go both ways.
The sexual system these trees have is called androdioecy: It includes males and a third “hermaphrodite” sex. A group of French researchers last week explained how such a setup could evolve from a pure hermaphrodite system. Initially, the trees were probably all able to pollinate or be pollinated. But over time, some of the trees mutated and lost their female functionality. Now, a very sizable male population exists among the olive trees.
But how? You'd think that males, who can only reproduce by pollinating, would have a strong disadvantage in a system where their competitors can reproduce either by pollinating or being pollinated.
However, among the olive trees these scientists studied, the androdioecy had reached a stable state.
Perhaps the greatest illusion that we, people of the democratic opposition, had laboured under was our conviction that we lived in societies comprising honest and noble people who had simply been silenced. We believed we were the voice of those who had been silenced and that is why our rebellion was fundamentally a moral one. Aleksandr Solzhenitsyn told us “not to live the lie”. Leszek Kołakowski asked us to “live with dignity”. John Paul II exhorted us: “Don’t be afraid!” and he promised that “truth would set us free”. Václav Havel believed in the “power of the powerless”. For us, dissidents, this ethical motivation strengthened our morale but it also turned us into elitists. Being a dissident required being in open conflict with the dictatorship and everything it entailed: oppression, loss of opportunities, exclusion and often imprisonment. Yet our conviction that our voice was the voice of the enslaved nation was only part of the truth. In defending the historical truth and religious and civil liberties we articulated the collective consciousness. Yet our call for active resistance and for breaking the barriers of fear and apathy remained unheard. The ethical perfectionism of a Sakharov, a Havel or a Kuroń simply could not be shared by everyone, certainly not by the majority. The majority stayed silent and we assumed this was out of fear. Yet fear was not the only reason for the silence of the majority.
more from Adam Michnik at Salon here.
Samuel R. Delany’s Dhalgren is—like Moby-Dick, Naked Lunch, or “Chocolate Rain”—an essential monument both to, and of, American craziness. It doesn’t just document our craziness, it documents our craziness crazily: 800 epic pages of gorgeous, profound, clumsy, rambling, violent, randy, visionary, goofy, postapocalyptic sci-fi prose poetry. The book is set in Bellona, a middle-American city struggling in the aftermath of an unspecified cataclysm. Phones and TVs are out; electricity is spotty; money is obsolete. Riots and fires have cut the population down to a thousand. Gangsters roam the streets hidden inside menacing holograms of dragons and griffins and giant praying mantises. The paper arrives every morning bearing arbitrary dates: 1837, 1984, 2022. Buildings burn, then repair themselves, then burn again. The smoke clears, occasionally, to reveal celestial impossibilities: two moons, a giant swollen sun. To top it off, this craziness trickles down to us through the consciousness of a character who is, himself, very likely crazy: a disoriented outsider who arrives in Bellona with no memory of his name, wearing only one sandal, and who proceeds to spend most of his time either having graphic sex with fellow refugees or writing inscrutable poems in a notebook—a notebook that also happens to contain actual passages of Dhalgren itself. The book forms a Finnegans Wake–style loop—its opening and closing sentences feed into one another—so the whole thing just keeps going and going forever. It’s like Gertrude Stein: Beyond Thunderdome. It seems to have been written by an insane person in a tantric blurt of automatic writing.
more from Sam Anderson at New York Magazine here.
There is an aspect of the American aesthetic that approaches design like a child. There’s a giddy lack of propriety, a joyful dismissal of taste, a love of big colors and sparkle. It’s connected to our attitude toward wealth, which often equates beauty with prosperity. In other words, if it looks rich, it must be beautiful. The shinier the better. This aesthetic of bling, though, is not simply about playacting at wealth; it’s about becoming lost in a fantasy of layers upon layers of artificiality and imitation. The Versailles that Larry Hart imitated in the Hartland Mansion (Versailles itself the classic contribution to Artifice) was not even the actual Versailles, but an idea of Versailles based on pictures of Versailles in a book and created with the mass-produced materials available to him at craft and hardware stores. All craft is imitation. There are cultures that imitate things they find in nature, or gods, or traditions that go back thousands of years. In America, imitation isn’t just about copying other essential things; imitation is the essential thing, the basis for whatever it is that “American craft” is. Sure, you’ve got exceptions like the Shakers, who designed elegant originals such as the flat-bottomed broom (which is an amazing thing, truly) and the clothespin. But the clothespin never screamed AMERICA! until Claes Oldenburg made a supersized imitation of it in downtown Philly.
more from Stefany Anne Golberg at The Smart Set here.
….“We saw reindeer
browsing,” a friend who'd been in Lapland, said:
“finding their own food; they are adapted
….to scant reino
or pasture, yet they can run eleven
miles in fifty minutes; the feet spread when
….the snow is soft,
and act as snow-shoes. They are rigorists,
however handsomely cutwork artists
….of Lapland and
Siberia elaborate the trace
or saddle-girth with saw-tooth leather lace.
….One looked at us
with its firm face part brown, part white,—a queen
of alpine flowers. Santa Claus' reindeer, seen
….at last, had grey-
brown fur, with a neck like edelweiss or
lion's foot,— leontopodium more
….exactly.” And
this candelabrum-headed ornament
for a place where ornaments are scarce, sent
….to Alaska,
was a gift preventing the extinction
of the Esquimo. The battle was won
….by a quiet man,
Sheldon Jackson, evangel to that race
whose reprieve he read in the reindeer's face.
by Marianne Moore
from News of the Universe;
Sierra Club Books, 1995
From Scientific American:
Like many people, rats are happy to gorge themselves on tasty, high-fat treats. Bacon, sausage, chocolate and even cheesecake quickly became favorites of laboratory rats that recently were given access to these human indulgences—so much so that the animals came to depend on high quantities to feel good, like drug users who need to up their intake to get high. A new study, published online March 28 in Nature Neuroscience, describes these rats' indulgent tribulations, adding to the research literature on how excess food intake can trigger changes in the brain, alterations that seem to create a neurochemical dependency in the eater—or user. (Scientific American is part of Nature Publishing Group.) Preliminary findings from the work were presented at the Society for Neuroscience meeting in October 2009.
Like many pleasurable behaviors—including sex and drug use—eating can trigger the release of dopamine, a feel-good neurotransmitter in the brain. This internal chemical reward, in turn, increases the likelihood that the associated action will eventually become habitual through positive reinforcement conditioning. If activated by overeating, these neurochemical patterns can make the behavior tough to shake—a result seen in many human cases, notes Paul Kenny, an associate professor in the Department of Molecular Therapeutics at The Scripps Research Institute in Jupiter, Fla., and co-author of the new study. “Most people who are overweight would say, 'I would like to control my weight and my eating,' but they find it very hard to control their feeding behavior,” he says. Despite a growing body of research, it has been unclear whether extreme overeating was initiated by a chemical irregularity in the brain or if the behavior itself was changing the brain's biochemical makeup. The new research by Kenny and his colleague Paul Johnson, a graduate student, shows that both conditions are possible.
Does the Shroud of Turin show the “real face of Jesus”? That claim is impossible to judge, even though it serves as the title of a documentary about the 3-D analysis of the Shroud of Turin premiering tonight on the History Channel. What can be said is that the centuries-old image wasn’t just painted freehand. Computer analysis of the imprint on the shroud suggests that it had to be left behind by someone draped in cloth. “Is this the artifact of a real person or not? Definitely it is,” Ray Downing, the digital illustrator at the center of the show, told me today. Downing worked with specialists on the shroud to come up with a photorealistic representation of the man whose body's imprint appears faintly on a famous 14-foot-long length of linen. For some Christians, the stain serves as the miraculous snapshot of their risen Lord. For most scientists, it is a cleverly done fake from the 13th or 14th century, but nothing more. Back in 1988, carbon-14 dating tests were conducted on a sample from the shroud in an effort to determine whether the cloth was created in Jesus' time. The verdict from three laboratories was that the cloth was produced in medieval times. But the shroud's fans have insisted that the sample was actually taken from a patch, rather than from the original linen. Just this month, a chemist proposed a new series of non-destructive dating tests that would give an estimate for the entire cloth.
From a marketing perspective, the timing of the History Channel show couldn't be better: Good Friday and Easter Sunday, the Christian holy days that mark Jesus' death and resurrection, are just a few days away. What's more, the shroud is due to go on display for six weeks at Turin Cathedral, starting April 10. The last time the relic was exhibited, a decade ago, more than 3 million people came to Turin to see it. More than a million reservations have been received already for next month's viewing. Have scientists been wrong about the shroud? Downing noted that historical records referring to the shroud predate the current carbon-14 estimate. “We know the carbon-14 [test] is wrong,” he said. “The question is, how wrong are they? The further back you go, the less likely it is that anybody could have faked it.”
Michael Bérubé in Dissent:
Earlier this year I had a lively email exchange with an exceptionally bright young Chomsky admirer who was deeply annoyed by my book, The Left At War. Part of the exchange was frustrating, insofar as he seemed to believe that if you give up ye olde “false consciousness” explanation for people’s behavior you have no effective way of saying that they are just flat-out wrong. But after a week or so of back-and-forth, we hit upon something that (for me, anyway) shed a nice bright light on what was at stake in the discussion.
He adduced this 2009 essay, “The Torture Memos and Historical Amnesia,” as an example of why he regards Chomsky as so valuable (his word) to a critical understanding of U.S. policy…
My interlocutor explained that whenever he lapses into a merely-liberal Krugman-like faith in American ideals, he finds Chomsky to be a bracing reminder that those ideals have routinely been traduced, and that the justification of torture by American officials is nothing new. And that’s why he’s vexed by left criticism of Chomsky, which he thinks is really “liberal” rather than properly “left.”
It cannot be denied that we have often traduced our ideals. And Chomsky’s essay is in many respects quite good, especially with regard to the history of how “in ordinary American practice, torture was largely farmed out to subsidiaries.” (Though I can do without the ritual repetition of “The 9/11 attack was doubtless unique in many respects. One is where the guns were pointing: typically it is in the opposite direction.” I still find it impossible to read those words without hearing, “and it was about time.” And his attempt to construe the extermination of Native Americans as a “humanitarian intervention” is yet another form of doubling down on his hands-off-the-Balkans position.) But I had two other responses to this young man.
Susan Douglas in In These Times:
This was the Spice Girls moment, and the debate it raised: Were these frosted cupcakes really a vehicle for feminism? And how much reversion back to the glory days of prefeminism should girls and women accept—even celebrate—given that we now allegedly had it all? Despite their Wonderbras and bare thighs, the Spice Girls advocated “girl power.” They demanded, in their colossal, intercontinental hit “Wannabe,” that boys treat them with respect or take a hike. Their boldfaced liner notes claimed that “The Future Is Female” and suggested that they and their fans were “Freedom Fighters.” They made Margaret Thatcher an honorary Spice Girl. “We’re freshening up feminism for the nineties,” they told the Guardian. “Feminism has become a dirty word. Girl Power is just a ’90s way of saying it.”
Fast-forward to 2008. Talk about girl power! One woman ran for president and another for vice president. Millions of women and men voted for each of them. The one who ran for vice president had five children, one of them an infant, yet it was verboten to even ask whether she could handle the job while tending to a baby. At the same time we had a female secretary of state, and the woman who had run for president became her high-profile successor. And we have Lady Gaga, power girl of the new millennium. Feminism? Who needs feminism anymore? Aren’t we, like, so done here? Okay, so some women moaned about the sexist coverage of Hillary Clinton, but picky, picky, picky.
Indeed, eight years earlier, career antifeminist Christina Hoff Sommers huffed in her book, The War Against Boys: How Misguided Feminism Is Harming Our Young Men, that girls were getting way too much attention and, as a result, were going to college in greater numbers and were much more likely to succeed, while boys were getting sent to detention, dropping out of high school, destined for careers behind fast-food counters, and so beaten down they were about to become the nation’s new “second sex.” Other books like The Myth of Male Power and The Decline of Males followed suit, with annual panics about the new “crisis” for boys. Girl power? Gone way too far.
First Sam Harris in Project Reason:
[M]any people strongly objected to my claim that values (and hence morality) relate to facts about the wellbeing of conscious creatures. My critics seem to think that consciousness and its states hold no special place where values are concerned, or that any state of consciousness stands the same chance of being valued as any other. While maximizing the wellbeing of conscious creatures may be what I value, other people are perfectly free to define their values differently, and there will be no rational or scientific basis to argue with them. Thus, by starting my talk with the assertion that values depend upon actual or potential changes in consciousness, and that some changes are better than others, I merely assumed what I set out to prove. This is what philosophers call “begging the question.” I am, therefore, an idiot. And given that my notion of objective values must be a mere product of my own personal and cultural biases, and these led me to disparage traditional religious values from the stage at TED, I am also a bigot. While these charges are often leveled separately, they are actually connected.
I’ve now had these basic objections hurled at me a thousand different ways—from YouTube comments that end by calling me “a Mossad agent” to scarcely more serious efforts by scientists like Sean Carroll which attempt to debunk my reasoning as circular or otherwise based on unwarranted assumptions. Many of my critics piously cite Hume’s is/ought distinction as though it were well known to be the last word on the subject of morality until the end of time. Indeed, Carroll appears to think that Hume’s lazy analysis of facts and values is so compelling that he elevates it to the status of mathematical truth:
Attempts to derive ought from is [values from facts] are like attempts to reach an odd number by adding together even numbers. If someone claims that they’ve done it, you don’t have to check their math; you know that they’ve made a mistake.
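The parity fact Carroll leans on here is elementary closure under addition — worth spelling out, since it is what gives the analogy its air of finality (the derivation below is an editorial illustration, not part of either author's text):

```latex
% Even numbers are closed under addition:
\[
  2m + 2n = 2(m+n), \qquad
  \sum_{i=1}^{k} 2a_i = 2\sum_{i=1}^{k} a_i ,
\]
% so any finite sum of even numbers is itself even and can never be odd.
% The analogy's claim is that "is" premises sum to "ought" conclusions
% no more than even summands sum to an odd total.
```
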
This is an amazingly wrongheaded response coming from a very smart scientist. I wonder how Carroll would react if I breezily dismissed his physics with a reference to something Robert Oppenheimer once wrote, on the assumption that it was now an unmovable object around which all future human thought must flow. Happily, that’s not how physics works. But neither is it how philosophy works. Frankly, it’s not how anything that works, works.
Carroll appears to be confused about the foundations of human knowledge. For instance, he clearly misunderstands the relationship between scientific truth and scientific consensus. He imagines that scientific consensus signifies the existence of scientific truth (while scientific controversy just means that there is more work to be done). And yet, he takes moral controversy to mean that there is no such thing as moral truth (while moral consensus just means that people are deeply conditioned for certain preferences). This is a double standard that I pointed out in my talk, and it clearly rigs the game against moral truth.
Sean Carroll's rejoinder:
I wanted to try to clarify my own view on two particular points, so I put them below the fold. I went on longer than I intended to (funny how that happens). The whole thing was written in a matter of minutes — have to get back to real work — so grains of salt are prescribed.
First, the role of consensus. In formal reasoning, we all recognize the difference between axioms and deductions. We start by assuming some axioms, and the laws of logic allow us to draw certain conclusions from them. It’s not helpful to argue that the axioms are “wrong” — all we are saying is that if these assumptions hold, then we can safely draw certain conclusions.
A similar (although not precisely analogous) situation holds in other areas of human reason, including both science and morality. Within a certain community of like-minded reasoners, a set of assumptions is taken for granted, from which we can draw conclusions.
Toward the end of John Banville’s new novel, “The Infinities” (Knopf; $25.95), a more or less contemporary tale over which the Greek gods Zeus and Hermes rather startlingly preside, a snooty character to whom someone is describing an “updated” production of a play about the parents of Hercules declares that he “does not approve of the classics being tampered with”: the Greeks, he says, “knew what they were doing, after all.” The joke is that the pretentious young man doesn’t know what he’s talking about. The play in question, “Amphitryon”—whose themes, of adultery, confused identities, and improbable Olympian interventions, are actually the model for Banville’s novel—isn’t Greek at all. Rather, it’s an early-nineteenth-century German reworking of late-seventeenth-century French and English rewritings of a second-century-B.C. tragicomedy written in Latin. And that was just then. In the twentieth century alone, the Amphitryon myth has been adapted by a French novelist, two German playwrights, an opera composer, an anti-Nazi filmmaker, and Cole Porter. Have we ever done anything but tamper with the classics?
more from Daniel Mendelsohn at The New Yorker here.
The 20th century dawned not on the first day of 1900 (or, for purists, 1901) but on a September evening in 1894, when a cleaner at the German embassy in Paris found a torn-up letter in the military attaché’s wastebasket. The cleaner was working for French intelligence, and the letter, once reassembled, was found to contain military secrets being offered by an unnamed French Army officer. After a cursory investigation, authorities arrested Alfred Dreyfus, a Jewish artillery captain working at General Staff headquarters. Thus began the Dreyfus Affair, in which an innocent man was unjustly convicted, amid rising xenophobia and anti-Semitism, and sent off to rot on a deserted island in South America. A vigorous public campaign against the howling injustice of the affair raged for more than a decade before the captain’s final vindication, dividing France into warring camps of Dreyfusards and anti-Dreyfusards, republicans and traditionalists. Dreyfus’s ordeal was the first big test of a modern justice system, and it defined one of the central issues of democracy: should the rule of law be applied consistently, or are there cases in which it should be bent to fit a current crisis or pressing national concern? Even today, hardly a month passes without an alleged misstep of justice somewhere in the world being labelled a “new Dreyfus Affair”.
more from Donald Morrison at the FT here.