March 31, 2006
Ian McEwan Looks at Science Writing
On the 30th anniversary of The Selfish Gene, Ian McEwan considers science writing.
[A] literary tradition implies an active historical sense of the past, living in and shaping the present. And reciprocally, a work of literature produced now infinitesimally shifts our understanding of what has gone before. You cannot value a poet alone, TS Eliot argued in his famous essay, "Tradition and the Individual Talent", "you must set him, for contrast and comparison, among the dead." Eliot did not find it preposterous "that the past should be altered by the present as much as the present is directed by the past." We might discern the ghost of Auden in the lines of a poem by James Fenton, or hear echoes of Wordsworth in Seamus Heaney, or Donne in Craig Raine. Ideally, having read our contemporaries, we return to re-read the dead poets with a fresh understanding. In a living artistic tradition, the dead never quite lie down.
Can science and science writing, a vast and half forgotten accumulation over the centuries, offer us a parallel living tradition? If it can, how do we begin to describe it? The problems of choice are equalled only by those of criteria. Literature does not improve; it simply changes. Science, on the other hand, as an intricate, self-correcting thought system, advances and refines its understanding of the thousands of objects of its study. This is how it derives its power and status. Science prefers to forget much of its past - it is constitutionally bound to a form of selective amnesia.
Is accuracy, being on the right track, or some approximation of it, the most important criterion for selection? Or is style the final arbiter? The writings of Thomas Browne or Francis Bacon or Robert Burton contain many fine passages that we now know to be factually wrong - but we would surely not wish to exclude them.
Exploring the Flann O'Brien Archives
In Context, Theodore McDermott on the Flann O'Brien Archives.
Specifically, I went to see a microfilmed copy of an early manuscript of At Swim-Two-Birds. References online and in O’Brien scholarship suggest that a draft much longer than the published one exists—it seemed likely that it would be the manuscript in Carbondale. There, in the special collections room, I sat at the microfilm machine looking at the doodles on the book’s first page. Don’t tell Terry Eagleton, but the name “Engels” was scrawled around the title—we wouldn’t want a Marxist reading to jeopardize O’Brien’s genius, to see the theme of three in At Swim as an example of dialectical materialism (thesis, antithesis, synthesis) at play. Maybe this Engels is someone other than Marx’s sidekick? A Gaelic figure? A friend? Who knows?
And there were, indeed, as I got past the first page, some differences between this early manuscript and the one published. Some different ordering (mostly at the beginning), some extra material—“Memoir of Dermot Trellis, his youth, being an extract from A Conspectus of the Arts and Natural Sciences on the subject of Dr. Beatty, now in heaven, by the reverend Alexander Dyce, but found on examination to be singularly referable to the life of Trellis. Serial volume in the Conspectus, the Thirty-seventh,” for example—and other slight variations (Finn having a conversation with Trellis, which might well be of note to the careful At Swim scholar) comprise the most notable changes from the unpublished to the published version. On the whole, the manuscript seemed not to warrant what I hoped it might: publication. The differences simply aren’t substantive enough. In theory, there exists somewhere a manuscript that’s one-third longer than the published one—but this, unfortunately, wasn’t it. Best I could tell, it was a revision of something already sent once to the publisher. The substantively longer version was apparently given to a friend, then revised, and only then sent out.
With the rest of my time, I went through as much of the eleven boxes of O’Brien material as I could. I didn’t make it that far...
Responses to Mearsheimer and Walt
First, it is not true that the American relationship with Israel has been ‘the centrepiece of US Middle Eastern policy’. That centrepiece has been and remains access to oil for the United States and for the global economy. As it became apparent during the 1960s that Israel was not merely the only democracy in the region but also a supporter of the West in the Cold War, the American relationship intensified. At that point, support for Israel, which had been strongest among liberals who supported a Jewish state in the wake of the Holocaust, expanded to include the previously less than enthusiastic military and diplomatic foreign policy establishment, some of which was deeply hostile to Israel and suspicious of Jews, to put it mildly. This was not due to the efforts of the Jewish Lobby or the power of the five million Jews (in a country of almost 300 million). It was due to an assessment of American national interest made by an overwhelmingly non-Jewish political and military establishment long before Christian fundamentalism became a factor in the Republican Party. It coincided with increasingly close ties with the Saudi regime.
Making Chemistry More Interesting Through Video Games
An attempt to get more students interested in chemistry through video games, in nature.com.
You are deep underground in a lab that once housed some of the finest minds in chemistry. But robots directed by a crackbrained artificial intelligence have taken it over and plan to use its equipment to destroy the world! After freezing an evil robot with your handy wrist-mounted hot-and-cold gun, you reach the Haber-Bosch room. And now you must correctly synthesize ammonia or die.
"Your students are playing video games," Gabriela Weaver told a group of chemistry teachers at the American Chemical Society meeting in Atlanta, Georgia, on 29 March. "They are playing them more and more hours a day. They are probably playing them in your class."
If you can't beat 'em, join 'em. Weaver, an associate professor of chemistry at Purdue University in West Lafayette, Indiana, is building a computer game about the subject - she hopes her prototype will be as appealing to students as the blockbuster games coming out of companies like Electronic Arts (EA).
Two Epidemiologists Look at Class
In Harvard Magazine:
Two years ago the New England Journal of Medicine published a commentary titled “Class—The Ignored Determinant of the Nation’s Health.” Its authors, a policy analyst and an academic physician, wrote: “[P]eople in lower classes die younger and are less healthy than people in higher classes. They behave in ways that ultimately damage their health and that take their lives prematurely (by smoking more, having poorer eating habits, and exercising less). They also have less health insurance coverage, live in worse neighborhoods, and are exposed to more environmental hazards. Beyond that, however, there is something about lower socioeconomic status itself that increases the risk of premature death.”
For 20 years, that “something” about being poor and getting sick has preoccupied Nancy Krieger ’80, Ph.D., professor of society, human development, and health at the Harvard School of Public Health. It has also preoccupied her older brother, James Krieger ’78, M.D., chief of the epidemiology, planning, and evaluation unit at Public Health-Seattle and King County, his local public-health authority. Independent of each other, the Krieger siblings have transformed that fixation into the leading edge of public-health theory and practice. Nancy’s hypotheses and methods are called, by many colleagues, the most brilliant contributions to social epidemiology in a generation. Jim’s on-the-ground innovations are the envy of local health departments across the country. Sister and brother have set a standard for what public health can and should be in the United States; both are trying to steer the profession back to its roots in social justice. That few beyond their respective fields have heard of the Kriegers says as much about their modesty as about the battered profile of public health in America.
alex mcquilkin: still fucked
More and more, 26-year-old Brooklyn-based Alex McQuilkin has come to embody Sylvia Plath’s valediction in Lady Lazarus, one of her final poems: "Dying/Is an art, like everything else./I do it exceptionally well./I do it so it feels like hell./I do it so it feels real./I guess you could say I've a call."
Unlike Plath, McQuilkin is not in fact concerned with literally dying. But in Plath’s tradition, she does make moving art out of the idea of death. In her DVDs and C-print stills, McQuilkin exposes the raw, tender ties between death, sex, desire and youth. Her work evokes an uncomfortable, undeniable blend of contempt and empathy, as her teenage protagonists (played by her) desperately flaunt their sexual desire, their desirability and their romantic wish for death. With roots in feminist theory, 1990s cultural criticism and popular culture, McQuilkin manages to produce work which avoids jargon and evades any purely intellectual reaction. Like Valie Export, Carolee Schneemann, Paul McCarthy and Sue de Beer, McQuilkin makes art that is like the strongest, sharpest parts of punk rock nailed through layers and layers of solid intellectual foundation.
more from artnet here.
zizek: liberal communists the true enemy
Since 2001, Davos and Porto Alegre have been the twin cities of globalisation: Davos, the exclusive Swiss resort where the global elite of managers, statesmen and media personalities meets for the World Economic Forum under heavy police protection, trying to convince us (and themselves) that globalisation is its own best remedy; Porto Alegre, the subtropical Brazilian city where the counter-elite of the anti-globalisation movement meets, trying to convince us (and themselves) that capitalist globalisation is not our inevitable fate – that, as the official slogan puts it, ‘another world is possible.’ It seems, however, that the Porto Alegre reunions have somehow lost their impetus – we have heard less and less about them over the past couple of years. Where did the bright stars of Porto Alegre go?
Some of them, at least, moved to Davos. The tone of the Davos meetings is now predominantly set by the group of entrepreneurs who ironically refer to themselves as ‘liberal communists’ and who no longer accept the opposition between Davos and Porto Alegre: their claim is that we can have the global capitalist cake (thrive as entrepreneurs) and eat it (endorse the anti-capitalist causes of social responsibility, ecological concern etc). There is no need for Porto Alegre: instead, Davos can become Porto Davos.
more from the LRB here.
In 1967, Karlheinz Stockhausen’s face appeared on the cover of Sgt Pepper’s Lonely Hearts Club Band – between Lenny Bruce and W. C. Fields. In September 2001 he achieved a different kind of immortality when Die Zeit quoted (or, he claims, misquoted) him as saying that the destruction of the World Trade Center was the “greatest work of art there has been”. The remark convinced many that the once-famous composer had long since jumped off the deep end; it also seemed to signal the end of what might be termed da Vincian vanguardism – the grandiose claim by a composer to be prophet, inventor, scientist, philosopher and spiritual guide. Other Planets, Robin Maconie’s latest book about Stockhausen, reads, appropriately enough, like a cross between conventional musical history and The Da Vinci Code. In addition to laying out the facts about every work in Stockhausen’s large oeuvre, Maconie promises to reveal how a “latent philosophical agenda” in the music addresses “the historic aspirations of German nationalism, and more specifically a defense of the role of post-Enlightenment European culture in the wider world” and, beyond that, to show how serialism is part of a “grander aesthetic and intellectual enterprise, beginning in the late eighteenth century, concerning the nature and evolution of language and its implications for post-revolutionary democracy”. In place of Dan Brown’s Last Supper, Maconie hinges his mad dash through cultural history on Jean-François Champollion’s decoding of the Rosetta Stone; Olivier Messiaen had once compared the young Stockhausen to the French decrypter. Where Brown pits the Catholic Church against the Knights of the Temple, Maconie fashions his catalogue raisonné around an esoteric battle between Saussurean “lettrists” and Goethean holists.
more from the TLS here.
Kazuo Ishiguro on how a radio discussion helped fill in the missing pieces of Never Let Me Go in The Guardian:
The setting for the first section of Never Let Me Go is a boarding school, but let me say I never went to boarding school myself. Of course, I drew on my own memories of what it felt like to be a child and an adolescent. And I suppose it's inevitable the experience of being a parent would inform the way I think about children. But I can't think of any one scene in that school section based, even partly, on an actual event that ever happened to me or anyone I know. When I write about children, I do much the same as when I write about elderly people, or any other character who's different from me in culture or experience. I try my best to think and feel as they would, then see where that takes me. I've never found that children present any special demands for me as a novelist. In fact, I find it alarmingly easy to think like an adolescent.
As Luck Would Have It
Michael Shermer in Scientific American:
Clearly, luck is a state of mind. Is it more than that? To explore this question scientifically, experimental psychologist Richard Wiseman created a "luck lab" at the University of Hertfordshire in England. Wiseman began by testing whether those who believe they are lucky are actually more likely to win the lottery. He recruited 700 subjects who had intended to purchase lottery tickets to complete his luck questionnaire, which is a self-report scale that measures whether people consider themselves to be lucky or unlucky. Although lucky people were twice as confident as the unlucky ones that they would win the lottery, there was no difference in winnings.
March 30, 2006
The British people find it hard to cherish their philosophers. In France, the recent centenary of Jean-Paul Sartre was virtually a state event, with newspaper pull-outs bearing his toad-face and endless adulations. In the United States – a country we like to jeer at as ignorant – most people at least learn some lines of Thomas Jefferson and Ralph Waldo Emerson at school. But here, the bicentenary of the birth of perhaps our greatest philosopher – John Stuart Mill – is passing in the night.
This is tragic, because Mill is our contemporary and our guide in a way that is true of very few philosophers. If you read his Collected Works after reading the day’s newspapers, it is as if he is an unimaginably brilliant columnist, commenting on yesterday, today and tomorrow.
more from The Philosopher's Magazine here.
wegman and the damned dogs
On weekends, William Wegman’s “Funney/Strange” at the Brooklyn Museum becomes a playground for parents and children, most of whom have a grand time laughing at the posed pooches. No other artist today can pull that kind of crowd. (Calder came close, but usually just for his child-friendly “Calder’s Circus.”) As the art world knows, Wegman has created a significant body of work apart from his portraits of Weimaraners, notably paintings and human-only videos. But his dogs inevitably steal the show. The question that beguiles me is why they engender more than a passing smile. What makes them more than doggy kitsch? Something more than a jokey greeting card?
more from New York Magazine here.
art, beauty, kara walker
With all due respect to the legions who argue otherwise, it is totally bogus to claim that art is about beauty or that the two are even connected. Goya's stupendous Saturn Devouring His Children isn't beautiful, nor is Duchamp's bottle rack, or Thomas Hirschhorn's recent blood-and-modernism installation. Over time these works may alter definitions of beauty, but it's an oversimplification to say that art and beauty are innately bound. It's as empty, dogmatic, patronizing, misleading, and limiting as saying art is about sincerity, wonderment, or any other absolute value.
Keats was wrong: Beauty isn't truth, or truth beauty. Saying art is about beauty is like saying "I'm for children." Everyone loves beauty; Nazis loved beauty; Osama loves beauty. As painter Allison Taylor recently wrote, "Reducing art to beauty hides vacant thoughts; it can be used as propaganda, to decorate mini-mansions, and crowd out art that is harder to deal with."
more from Salz at The Village Voice here.
Cultural Determinism and Democracy
Via Normblog, Amartya Sen on democracy and cultural determinism, in the WSJ.com's Opinion Journal.
The belief in the allegedly "Western" nature of democracy is often linked to the early practice of voting and elections in Greece, especially in Athens. Democracy involves more than balloting, but even in the history of voting there would be a classificatory arbitrariness in defining civilizations in largely racial terms. ...[T]here is reluctance in taking note of the Greek intellectual links with other civilizations to the east or south of Greece, despite the greater interest that the Greeks themselves showed in talking to Iranians, or Indians, or Egyptians (rather than in chatting up the Ostrogoths).
Since traditions of public reasoning can be found in nearly all countries, modern democracy can build on the dialogic part of the common human inheritance... [T]he history of Muslims includes a variety of traditions, not all of which are just religious or "Islamic" in any obvious sense. The work of Arab and Iranian mathematicians from the eighth century onward reflects a largely nonreligious tradition. Depending on politics, which varied between one Muslim ruler and another, there is also quite a history of tolerance and of public discussion, on which the pursuit of a modern democracy can draw. For example, the emperor Saladin, who fought valiantly for Islam in the Crusades in the 12th century, could offer, without any contradiction, an honored place in his Egyptian royal court to Maimonides, as that distinguished Jewish philosopher fled an intolerant Europe. When, at the turn of the 16th century, the heretic Giordano Bruno was burned at the stake in Campo dei Fiori in Rome, the Great Mughal emperor Akbar (who was born a Muslim and died a Muslim) had just finished, in Agra, his large project of legally codifying minority rights, including religious freedom for all, along with championing regular discussions between followers of Islam, Hinduism, Jainism, Judaism, Zoroastrianism and other beliefs (including atheism).
Cultural dynamics does not have to build something from absolutely nothing, nor need the future be rigidly tied to majoritarian beliefs today or the power of the contemporary orthodoxy. To see Iranian dissidents who want a fully democratic Iran not as Iranian advocates but as "ambassadors of Western values" would be to add insult to injury, aside from neglecting parts of Iranian history (including the practice of democracy in Susa or Shushan in southwest Iran 2,000 years ago). The diversity of the human past and the freedoms of the contemporary world give us much more choice than cultural determinists acknowledge.
Scramble for WWII 'Enigma' encoding machine
Bidders in an Internet auction are offering more than 13,000 euros ($15,600) for a wartime German encoding machine, similar to ones whose messages were cracked by British code breakers in World War II.
The portable Enigma encryption machine, made in 1941, has a keyboard and a series of rotors designed to scramble messages. It is up for sale on the Internet auction site eBay.
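The report doesn't go into the mechanism, but the rotor principle is simple to sketch: each keypress substitutes a letter through a wired rotor, and the rotor steps one position after every letter, so the same plaintext letter enciphers differently each time. Below is a minimal, illustrative one-rotor sketch in Python; the real machine chained three or more rotors through a reflector and a plugboard, so this is a toy model, not Enigma itself (the wiring string is the commonly documented Rotor I permutation).

```python
import string

ALPHABET = string.ascii_uppercase

def make_rotor(wiring: str):
    """Return forward and reverse substitution maps for a 26-letter wiring."""
    return dict(zip(ALPHABET, wiring)), dict(zip(wiring, ALPHABET))

def encipher(text: str, wiring: str) -> str:
    """Substitute each letter through the rotor, stepping the rotor one
    position per letter (the 'fast rotor' behaviour). Non-letters pass through."""
    fwd, _ = make_rotor(wiring)
    out = []
    for i, ch in enumerate(text.upper()):
        if ch not in fwd:
            out.append(ch)
            continue
        # Offset by the rotor position i, substitute, then undo the offset.
        shifted = ALPHABET[(ALPHABET.index(ch) + i) % 26]
        out.append(ALPHABET[(ALPHABET.index(fwd[shifted]) - i) % 26])
    return "".join(out)

def decipher(text: str, wiring: str) -> str:
    """Invert encipher() by running the reverse map with the same stepping."""
    _, rev = make_rotor(wiring)
    out = []
    for i, ch in enumerate(text.upper()):
        if ch not in ALPHABET:
            out.append(ch)
            continue
        shifted = ALPHABET[(ALPHABET.index(ch) + i) % 26]
        out.append(ALPHABET[(ALPHABET.index(rev[shifted]) - i) % 26])
    return "".join(out)
```

The stepping is what made Enigma hard to break by simple frequency analysis: the substitution alphabet changes with every keypress.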
Tony O'Brien in Metapsychology:
Why are so many creative people apparently crazy? Is mental illness, for some people, a doorway to creativity, something that unlocks latent genius that would otherwise lie dormant? Jeffrey A Kottler attempts to answer this question in Divine Madness: Ten Stories of Creative Struggle. The book presents ten case studies of well-known artists, using the term in a broad sense. They are all people who have pushed their creative talents to the limits. In most cases they finally lost the struggle, and died by their own hand or as a consequence of drug abuse. Their lives pose questions about creativity, about suffering, and about art. Finding answers is very much harder.
The individuals Kottler chooses to study are a mixed group of writers, visual artists, and performing artists. Their names are familiar: Plath, Woolf, Monroe, Garland, Nijinsky, Hemingway and others, a roll call of the famously mad.
At MoMA: Islamic Show Sans Politics
Tyler Green in the New York Observer:
As an Iranian-American artist who was effectively exiled from her homeland, Shirin Neshat was happy to be included in an exhibition of artists from the Islamic world. But when the opportunity came—Without Boundary: Seventeen Ways of Looking opened at the Museum of Modern Art on Feb. 26—Ms. Neshat was upset.
Without Boundary is the most important exhibit MoMA has launched in at least a decade, and it’s the first exhibition of contemporary art from the Islamic world in a major American museum since 9/11. The show features 14 artists from Islamic countries, an Indian born to Muslim parents, and two Americans (Mike Kelley and Bill Viola were added late in the show’s development). Iran, Iraq, Palestine, Turkey and Pakistan are represented in the exhibition, though nearly all of the artists from those countries now primarily work in the West. The exhibition is a reminder of the difficulties that museums face when it comes to merging—or not—art and politics.
“My immediate reaction was, how could anyone today discuss art made by contemporary Muslim artists and not speak about the role the subjects of religion and contemporary politics play in the artists’ minds?” Ms. Neshat said. “For some of us, our art is interconnected to the development of our personal lives, which have been controlled and defined by politics and governments. Some artists, including Marjane Satrapi and myself, are ‘exiled’ from our country because of the problematic and controversial nature of our work.”
How poor is poor?
John Cassidy in The New Yorker:
In the summer of 1963, Mollie Orshansky, a forty-eight-year-old statistician at the Social Security Administration, in Washington, D.C., published an article in the Social Security Bulletin entitled “Children of the Poor.” “The wonders of science and technology applied to a generous endowment of natural resources have wrought a way of life our grandfathers never knew,” she wrote. “Creature comforts once the hallmark of luxury have descended to the realm of the commonplace, and the marvels of modern industry find their way into the home of the American worker as well as that of his boss. Yet there is an underlying disquietude reflected in our current social literature, an uncomfortable realization that an expanding economy has not brought gains to all in equal measure. It is reflected in the preoccupation with counting the poor—do they number 30 million, 40 million, or 50 million?”
Orshansky’s timing was propitious. In December of 1962, President John F. Kennedy had asked Walter Heller, the chairman of the Council of Economic Advisers, to gather statistics on poverty. In early 1963, Heller gave the President a copy of a review by Dwight Macdonald, in The New Yorker, of Michael Harrington’s “The Other America: Poverty in the United States,” in which Harrington claimed that as many as fifty million Americans were living in penury.
The federal government had never attempted to count the poor, and Orshansky’s paper proposed an ingenious and straightforward way of doing so.
I am deeply suspicious of the concept of closure. The general public and policy-makers take it as an article of faith that there is something called closure that the criminal justice system can help provide. Even Dahlia Lithwick takes it more or less for granted that closure is real. She just questions whether executions are the best way to help survivors achieve it.
Intuitively, we all know more or less what closure is supposed to be. At first grief is overwhelming and all-consuming, but eventually it fades enough for the bereaved person to get on with life. Closure has something to do with that transition.
Upon closer examination, the concept of closure turns out to be much more elusive than we might have supposed.
Closure might refer to the emotional shift from acute grief to emotional healing. Alternatively, it might describe some psychological or practical prerequisites that must be in place in order for a person to transcend acute grief (e.g., time, insight, restitution...).
Does Globalization Help or Hurt the World's Poor?
Globalization and the attendant concerns about poverty and inequality have become a focus of discussion in a way that few other topics, except for international terrorism or global warming, have. Yet the strength of people's conviction is often in inverse proportion to the amount of robust factual evidence they have.
As is common in contentious public debates, different people mean different things by the same word. Some interpret "globalization" to mean the global reach of communications technology and capital movements, some think of the outsourcing by domestic companies in rich countries, and others see globalization as a byword for corporate capitalism or American cultural and economic hegemony. So it is best to be clear at the outset of this article that I shall primarily refer to economic globalization--the expansion of foreign trade and investment. How does this process affect the wages, incomes and access to resources for the poorest people in the world?
Scans suggest IQ scores reflect brain structure
Claims that IQ is a valid measure of intelligence tend to attract angry responses, in part because of studies that have attempted to link group differences in IQ with race. In their 1994 book The Bell Curve, political scientist Charles Murray and psychologist Richard Herrnstein argued that the lower-income status of some US ethnic minorities was linked to below-average IQ scores among those groups. These were in turn attributed mainly to genetic factors. Researchers say that a remarkable data set on the developing brain adds to the idea that IQ is a meaningful concept in neuroscience. The study, which is published in this issue, suggests that performance in IQ tests is associated with changes in the brain during adolescence.
Constance Casey in Slate:
Like a lot of beautiful things, tulips inspire malfeasance, and they take a lot of work to maintain. Careless people pick them. Mice, rats, voles, skunks, squirrels, and deer eat them. Even in Holland, they need a lot of human intervention to thrive, because they'd rather be on a rocky mountainside in Turkey, where they come from.
My favorite tulip story comes from The Year of Reading Proust, a memoir by Wesleyan University professor Phyllis Rose. A few years ago, Rose looked out the window of her on-campus house and saw an undergraduate picking a bouquet of tulips from her yard and carrying the flowers uphill toward the dorms. By the time she tracked the tulip thief down, she'd attracted a small crowd.
Scientists Debate Dinosaur Demise
3QD's own Ker Than, in LiveScience.com:
The ancient asteroid impact in the Gulf of Mexico that purportedly ended the reign of the dinosaurs occurred 300,000 years too early, according to a controversial new analysis of melted rock ejected from the impact site.
The standard theory states that a giant asteroid about 6 miles wide smashed into the Yucatan Peninsula close to the current Mexican town of Chicxulub about 65 million years ago. The impact raised enough dust and debris to blot out the sun for decades or even centuries.
But Markus Harting of the University of Utrecht in the Netherlands and a small group of fellow scientists think the Chicxulub impact happened too early to have been the infamous dinosaur-killer.
March 29, 2006
Controversy over the New Britney Spears Sculpture
Daniel Edwards' sculpture of Britney Spears giving birth has sparked controversy in all corners, in the BBC.
A nude sculpture depicting singer Britney Spears giving birth to her son has prompted a flood of emails from both pro-choice and anti-abortion activists.
Monument to Pro-Life: The Birth of Sean Preston will be unveiled at New York's Capla Kesting Art gallery in April.
Gallery co-owner David Kesting said they had received 3,000 emails, some from "pro-life" supporters who thought it was degrading to their movement.
He added that other people were "upset" the sculpture was a pro-life monument.
The life-size work, by artist Daniel Edwards, features Spears crouched on all fours on a bear-skin rug as she gives birth.
It will be displayed at the gallery alongside a display case filled with anti-abortion materials.
Do bloggers lean left or right? Does the blogosphere have an ideological tilt? Such questions once engaged mainstream reporters and pundits struggling to understand an upstart online movement. During the post-9/11, pre-Iraq explosion of "warbloggers," we were told that blogs gave voice to red-state anger and conservative values. Then, during the heyday of Howard Dean's outsider campaign, we heard that they instead embodied a new progressive populism.
Now, the pointlessness of these questions seems plain. You might as well ask, "Do writers lean left or right?" -- or, "Does the world have an ideological tilt?"
more from Salon here.
80s of the imagination
If you’re like me, the peculiar selectivity of the ’80s revival has been a source of considerable perplexity and annoyance. Overlooking complex cultural touchstones like Crime Story, Kate Bush, Q: The Winged Serpent and Philip K. Dick’s VALIS trilogy in favor of Rubik’s Cube, Reaganomics and The Breakfast Club, the ordained collective memory shrouds the awkward vital perversity of the era in Day-Glo bangles and Cosby sweaters.
Similarly, the official picture of the ’80s art world is flat and cartoonish, an embarrassing bubble of self-indulgence irrevocably linked to junk bonds, cocaine and the gutting of the National Endowment for the Arts. But just as Aerial and the Futureheads have re-ignited Kate Bush’s hipness quotient and PKD is suddenly everybody’s go-to guy for the looming information apocalypse, visual artists of the ’80s — unfairly lumped in (and dismissed) with the ham-fisted neo-expressionists, anal-retentive postmodernists and not-anal-retentive-enough performance artists that populate the awesomely bad ’80s of the imagination — are being rediscovered in all their subtlety and depth.
more from the LA Weekly here.
Housefly Gets Glasses Made With Lasers
Pampering pets with designer goods isn't so unusual—and now even your houseflies can get outfitted in style.
An entry in a German science-photo competition, this image shows a fly sporting a set of "designer" lenses crafted and set in place with a cutting-edge laser technique. The glasses fit snugly on the fly's 0.08-inch-wide (2-millimeter-wide) head.
Why Are Some Animals So Smart?
Even though we humans write the textbooks and may justifiably be suspected of bias, few doubt that we are the smartest creatures on the planet. Many animals have special cognitive abilities that allow them to excel in their particular habitats, but they do not often solve novel problems. Some of course do, and we call them intelligent, but none are as quick-witted as we are.
What favored the evolution of such distinctive brainpower in humans or, more precisely, in our hominid ancestors? One approach to answering this question is to examine the factors that might have shaped other creatures that show high intelligence and to see whether the same forces might have operated in our forebears. Several birds and nonhuman mammals, for instance, are much better problem solvers than others: elephants, dolphins, parrots, crows. But research into our close relatives, the great apes, is especially likely to be illuminating.
Who Killed Christopher Marlowe?
Stephen Greenblatt in the New York Review of Books:
On the morning of May 30, 1593, twenty-nine-year-old Christopher Marlowe made his way to an appointment he had in Deptford, a small town on the Thames, a few miles downriver from London Bridge. The appointment was for 10 AM at a house that belonged to a widow named Eleanor Bull. There Marlowe met three men with whom he was already well acquainted, Ingram Frizer, Nicholas Skeres, and Robert Poley. The four sat all morning in quiet conversation, had lunch together, and afterward walked for some time in widow Bull's garden. At about 6 PM they returned inside for supper. Along with the table at which they ate, the room contained a bed, on which Marlowe lay down after dining; the other three continued to sit next to each other on a single bench, their backs to their reclining companion.
According to the official account, an argument began between Ingram Frizer and Marlowe about the bill— the "reckoning," as it was termed—for the meals they had eaten that day. Their words grew ever more heated. Suddenly Marlowe's anger must have boiled over, for he jumped up and grabbed Frizer's dagger from its sheath. Hemmed in by Skeres and Poley and at first unable to move, Frizer was slashed twice on the head before he finally wrested his weapon out of Marlowe's hands. "And so it befell, in that affray, that the said Ingram, in defense of his life, with the dagger aforesaid of the value of twelve pence, gave the said Christopher then and there a mortal wound over his right eye, of the depth of two inches and of the width of one inch."
H. Allen Orr on Daniel Dennett
From The New Yorker:
[Dennett's] real contribution is an accessible account of what might be called the natural history of religion. (Religion, as he provisionally defines it, involves believing in, and seeking the approval of, a supernatural being.) “There was a time,” he writes, “when there was no religion on this planet, and now there is lots of it. Why?” Why did religion appear in the first place? And why did certain religions spread while others sank into obscurity?
To answer these questions, Dennett says, we must confront two spells. The first is the taboo against asking uncomfortable questions about religion. In his view, religion is simply too important to be spared hard questions. Indeed, he argues, religion is among the most powerful forces on earth and, as religiously inspired warfare and acts of terrorism remind us, it is not always benign. The second spell, in Dennett’s account, is one cast by religion itself. Do we risk dimming religion’s numinous glow by the very act of scientific analysis? Will we, out of what Dennett calls a “pathological excess of curiosity,” rob believers of the deepest and most important part of their lives? Dennett is sensitive to this concern and concedes the danger, but he concludes that the chances of undermining religious sensibility are slight...
Britain: Germans are brainiest (but at least we're smarter than the French)
Helen Nugent in the London Times:
A new European league of IQ scores has ranked the British in eighth place, well above the French, who were 19th. According to Richard Lynn of the University of Ulster, Britons have an average IQ of 100. The French scored 94. But it is not all good news. Top of the table were the Germans, with an IQ of 107. The British were also beaten by the Netherlands, Poland, Sweden, Italy, Austria and Switzerland.
Professor Lynn, who caused controversy last year by claiming that men were more intelligent than women by about five IQ points on average, said that populations in the colder, more challenging environments of Northern Europe had developed larger brains than those in warmer climates further south. The average brain size in Northern and Central Europe is 1,320cc and in southeast Europe it is 1,312cc.
A Nobel Prize for Donkey Kong?
Chris Baker in Slate:
Thousands of industry professionals have descended on Silicon Valley to ogle the latest physics engines and graphics cards, hear panel discussions like "C++ on Next-Gen Consoles: Effective Code for New Architectures," and thrill at being in the same room with the guy who made Marble Madness. But the highlight of the annual Game Developers Conference is an epic battle known as the Game Design Challenge.
The challenge is the brainchild of Eric Zimmerman, the CEO of gameLab and the author of several scholarly books on video games. Each year, Zimmerman asks three pre-eminent designers to build a game around some ridiculously ambitious theme. This year, he tasked them with dreaming up something that could win the Nobel Peace Prize.
Myth and Mystery Surround Wednesday's Solar Eclipse
Tourists and scientists are gathering at spots around the world for a total solar eclipse Wednesday that will sweep northeast from Brazil to Mongolia, blotting out the Sun across swathes of the world's poorest lands.
Day will turn briefly to dark twilight in the eclipse's path as the Moon comes between the Earth and the Sun. [Viewer's Guide]
As is often the case, the eclipse is shrouded in mystery and misinformation.
The event will occur in highly populated areas, including west Africa, where governments scrambled to educate people about the dangers of looking at the eclipse without proper eye protection.
A total solar eclipse is safe to watch during the darkness of totality. But when the Sun is not fully blocked by the Moon, its light can easily damage the eyes, so special protection is required.
More here. [NASA TV will carry the eclipse live from 5 a.m. to 6:12 a.m. ET on March 29.]
Britannica defends itself against Wikipedia
Sarah Ellison in the Wall Street Journal:
On Dec. 15, the scientific journal Nature ran a two-page "special report" titled "Internet encyclopedias go head to head." It compared the accuracy of science entries for the online encyclopedia Wikipedia and the online version of Encyclopaedia Britannica.
Founded in 1768 in Edinburgh, Scotland, Britannica is painstakingly compiled by a collection of scholars and other experts around the world. Wikipedia came to life in California five years ago under a "user-generated" model: That is, anyone who wants to can contribute, or change, an entry.
The Nature report, published in the journal's news section, said there was not much difference between the two. For every four errors in Wikipedia, Britannica had three. "Wikipedia comes close to Britannica in terms of the accuracy of its science entries," the study concluded.
The Problem With Brainstorming
Momus in Wired News:
From time to time I find myself invited to brainstorm for people. This usually involves coming up with new ways my hosts might "add value to their revenue chain" or "leverage their brand." To be perfectly honest, I'm not very good at it. I'll explain why in a moment. First, though, here's a little history of brainstorming.
Brainstorming is a creative problem-solving strategy launched in 1953 in a book called Applied Imagination by Alex F. Osborn, an advertising executive. The basic idea is that when judgment is suspended, a bold and copious flow of original ideas can be produced. It's very much a team effort -- rather than getting bogged down in the judgments, personal criticisms and ego clashes that accompany the ownership of, and investment in, certain ideas, the team acts collectively.
When you're brainstorming, ideas belong to no one and come from anywhere. Anything goes.
More here. [Thanks to Akbi Khan.]
March 28, 2006
Laura Claridge wins prestigious Lukas Prize
I am extremely pleased and proud to announce that my longtime mentor and dearest friend Laura Claridge has won the 2006 Lukas Prize jointly awarded by Harvard and Columbia Universities. Laura is the author of several scholarly books, as well as highly critically acclaimed biographies of Tamara De Lempicka and Norman Rockwell. Here is the announcement by Lawrence Van Gelder in the New York Times:
Laura Claridge’s ‘Emily Post and the Rise of Practical Feminism,’ to be published by Random House, has won the $30,000 J. Anthony Lukas Work-in-Progress Award in the 2006 J. Anthony Lukas Prize Project Awards for exceptional nonfiction. Announced yesterday by the Columbia Graduate School of Journalism and the Nieman Foundation, the accolades included the $10,000 Mark Lynton History Prize to Megan Marshall for ‘The Peabody Sisters: Three Women Who Ignited American Romanticism’ (Houghton Mifflin) and the $10,000 J. Anthony Lukas Book Prize to Nate Blakeslee for ‘Tulia: Race, Cocaine and Corruption in a Small Texas Town’ (Public Affairs Press). Mr. Lukas, a journalist and author who won two Pulitzer Prizes, died in 1997. The awards ceremony will be held on May 9 at Harvard University.
More about the Lukas Prize here. Congratulations, Laura, from all your fans here at 3QD! And I'll be at Harvard on May 9th, for sure.
The decline of the Hapsburg Empire was long, and slow, and confusing, and it produced in the empire's subjects that combination of desperation and indolence that results from staring down into a disaster one is powerless to avert. The years of secure prosperity were over, though many were prosperous still. Political and economic institutions—corrupted, and, it turned out, irreplaceable—careened out of control. In this late period of decline it began to seem possible, even if the idea was deplored, that collectivity had been a dream, that nothing existed but the individual, and so the people living in what was then the Austro-Hungarian Empire did what people do in such circumstances: They sought meaning and solace in life stories, in the successes of the illustrious and the tragedies of those understood to be ordinary. Perhaps this accounts in part for the fact that Stefan Zweig, born in 1881, became, in the period from 1910 until his suicide in 1942, one of Austria's most popular writers by penning more than twenty biographical studies (on Erasmus, Balzac, Marie Antoinette, Magellan, Freud, Casanova, Tolstoy, Nietzsche, and Mary Stuart, among others) and a number of fine, strange novellas, in which the characters very often tell the stories of their lives. Neither was Zweig's popularity limited to the territories of the imploding empire. Translated during his lifetime into twenty-nine languages, his books were also best sellers in all the neighboring and chaotically restructuring European states.
more from Bookforum here.
The Globalization of Science and Linguistic Diversity
In openDemocracy, Ehsan Masood looks at the spread of English language scientific terms and what it may mean for linguistic and cultural diversity.
The issue of language depletion or (at the extreme) language loss is far from abstract. Unesco's Atlas of the World's Languages in Danger of Disappearing, for example, tells us that half of the world's approximately 7,000 spoken languages are endangered to varying degrees. 5,000 of the total number of languages are spoken by groups comprising fewer than 100,000 people; 1,500 have fewer than 1,000 (mostly elderly) speakers.
Should that be a problem for science? There are, after all, many who argue that science is a universal way of understanding the world – and that the answers to questions such as "what is a gene?", "why is our climate changing?", and "is the universe expanding?" will not be any different if the person trying to answer the question speaks Swahili rather than English or French as a first language.
It may be true that the search for answers to some of life's big questions can in principle be conducted through the medium of any language. But there are many ways in which the existence of multiple languages (each one intrinsically rich and world-encompassing on its own terms) makes this search – and an exploration of its practical, social and scientific subsets – more enlightening.
More on the Unrest in France
[R]egardless of whether the CPE [Contrat première embauche, the proposed reform] is a good idea, economically speaking, it is fair to say that Villepin’s governing method has done a great deal to heighten the crisis.
The first thing to keep in mind is that the French parliament is inherently weak: when the government really wants a law to be passed, it always gets its way. This is due in no small part to the fact that the French constitution gives the government various procedural tools to discipline rebel MPs. The most famous and effective of them is the so-called 49.3 (named after the third paragraph of the 49th article of the constitution), which confronts MPs with a stark choice: either let the bill be adopted without a vote or vote to overthrow the government.
Theoretically, that could mean that painful reforms would be easier to implement in France than in other countries. And such procedural tools are of course quite handy when you’re trying to pass a budget without a parliamentary majority. Practically, however, it often creates a perverse set of incentives: why bother trying to build support for your bill if you are 99% sure that the law will be adopted no matter what? The problem, of course, is that snubbing the trade unions and the political parties is a sure-fire way to trigger a direct confrontation between the government and the famed French street.
'Concrete poet' Ian Hamilton Finlay dies aged 80
Tim Cornwell in The Scotsman:
The "concrete poet", whose work often featured inscriptions sculpted on walls or floors, died peacefully at the age of 80 in an Edinburgh nursing home after a long illness.
The director of the Scottish National Gallery of Modern Art, Richard Calvocoressi, yesterday called him "the most original artist to have worked in Scotland in the last 50 years".
Little Sparta, the garden that Finlay carved out of six acres on the edge of the Pentlands, he said, "is known all over the world and will remain his lasting monument".
The director of the Royal Botanic Garden of Edinburgh, Professor Stephen Blackmore, said: "Scotland has lost a unique and inspirational gardener and a truly brilliant man."
More here. [Thanks to Alta L. Price.]
A Crooked Timber Seminar on Chris Mooney's The Republican War on Science
It has been a few months since the last Crooked Timber seminar. The new one on Chris Mooney's The Republican War on Science, which includes a response by Mooney, is well worth a read.
Political conflict over scientific issues has probably never been as sharp as at present. Issues like global warming and stem-cell research, which came to prominence in the 1990s, are being fiercely debated. At the same time, questions that had, apparently, been resolved long ago, like evolution or the US ban on agricultural use of DDT, are being refought. A striking feature of these debates is that, in nearly all cases (the one big exception being GM foods), the fight lines up the political Right, and particularly the US Republican Party, on one side, and the majority of scientists and scientific organisations on the other. Chris Mooney’s book, The Republican War on Science, is, therefore, a timely contribution to the debate, and we are happy to host a seminar to discuss it, and thank Chris for agreeing to take part.
In addition to contributions from five members of CT, we’re very pleased to have two guests participating in the debate. Tim Lambert has been an active participant in the blogospheric version of some of the debates discussed by Chris. Tim, like the CT participants, broadly endorses Chris’s argument, though with some disagreement on analytical points and questions of emphasis and presentation. To broaden the debate, Steve Fuller was invited to take part in the seminar, and kindly agreed, knowing that he would be very much in the minority.
Radical Losers, Enzensberger's Take
Also in Sign and Sight, Hans Magnus Enzensberger on radical losers (translated from the original in Der Spiegel).
In a chaotic, unfathomable process, the cohorts of the inferior, the defeated, the victims separate out. The loser may accept his fate and resign himself; the victim may demand satisfaction; the defeated may begin preparing for the next round. But the radical loser isolates himself, becomes invisible, guards his delusion, saves his energy, and waits for his hour to come.
Those who content themselves with the objective, material criteria, the indices of the economists and the devastating findings of the empiricists, will understand nothing of the true drama of the radical loser. What others think of him – be they rivals or brothers, experts or neighbours, schoolmates, bosses, friends or foes – is not sufficient motivation. The radical loser himself must take an active part, he must tell himself: I am a loser and nothing but a loser. As long as he is not convinced of this, life may treat him badly, he may be poor and powerless, he may know misery and defeat, but he will not become a radical loser until he adopts the judgement of those who consider themselves winners as his own.
Since before the attack on the World Trade Center, political scientists, sociologists and psychologists have been searching in vain for a reliable pattern. Neither poverty nor the experience of political repression alone seems to provide a satisfactory explanation for why young people actively seek out death in a grand bloody finale and aim to take as many people with them as possible. Is there a phenotype that displays the same characteristics down the ages and across all classes and cultures?
No one pays any mind to the radical loser if they do not have to. And the feeling is mutual. As long as he is alone – and he is very much alone – he does not strike out. He appears unobtrusive, silent: a sleeper.
Glucksmann on Holocaust Denial and the Caricature of Mohammed
Andre Glucksmann argues that the caricatures of the prophet Mohammed and Holocaust denial are not equivalent, translated in Sign and Sight (originally published in French in Le Monde and in German in Perlentaucher).
Why are jokes about Muhammad permitted, but not those about the genocide of the Jews? This was the rallying call of fundamentalists before they initiated a competition for Auschwitz cartoons. Fair's fair: either everything should be allowed in the name of the freedom of expression, or we should censor that which shocks both parties. Many people who defend the right to caricature feel trapped. Will they publish drawings about the gas chambers in the name of freedom of expression?
Offence for offence? Infringement for infringement? Can the negation of Auschwitz be put on a par with the desecration of Muhammad? This is where two philosophies clash. The one says yes, these are equivalent "beliefs" which have been equally scorned. There is no difference between factual truth and professed faith; the conviction that the genocide took place and the certitude that Muhammad was illuminated by Archangel Gabriel are on a par. The others say no, the reality of the death camps is a matter of historical fact, whereas the sacredness of the prophets is a matter of personal belief.
This distinction between fact and belief is at the heart of Western thought. Aristotle distinguished between indicative discourse on the one hand, which could be used to reach an affirmation or a negation, and prayer on the other.
Iraq, WMDs, Al-Qaeda: The Distributed Problem Solving Approach
American intelligence agencies and presidential commissions long ago concluded that Saddam Hussein had no unconventional weapons and no substantive ties to Al Qaeda before the 2003 invasion.
But now, an unusual experiment in public access is giving anyone with a computer a chance to play intelligence analyst and second-guess the government.
Under pressure from Congressional Republicans, the director of national intelligence has begun a yearlong process of posting on the Web 48,000 boxes of Arabic-language Iraqi documents captured by American troops.
Less than two weeks into the project, and with only 600 out of possibly a million documents and video and audio files posted, some conservative bloggers are already asserting that the material undermines the official view.
A pill to beat fear?
Does the prospect of public speaking make you panic? Do you run for the hills at the mere mention of spiders? Help could be at hand: researchers have come up with a way to ease the crippling symptoms of phobia. The treatment, developed by a Swiss-led research team, could one day help sufferers to face their fear simply by popping a pill before facing a stressful situation. The researchers hope that it may even have permanent effects, by helping phobics deal with the daunting prospect of undergoing therapy in which they come face to face with their fears.
The remedy contains a human hormone called cortisol, which the body produces naturally in times of stress or fear to help subdue the panic response. Previous studies have shown that increased levels of cortisol help us to blank out painful memories and emotions, allowing us to deal more effectively with stressful situations.
Brain cells fused with computer chips
The line between living organisms and machines has just become a whole lot blurrier. European researchers have developed "neuro-chips" in which living brain cells and silicon circuits are coupled together. The achievement could one day enable the creation of sophisticated neural prostheses to treat neurological disorders, or the development of organic computers that crunch numbers using living neurons. To create the neuro-chip, researchers squeezed more than 16,000 electronic transistors and hundreds of capacitors onto a silicon chip just 1 millimeter square in size.
They used special proteins found in the brain to glue brain cells, called neurons, onto the chip. However, the proteins acted as more than just a simple adhesive. "They also provided the link between ionic channels of the neurons and semiconductor material in a way that neural electrical signals could be passed to the silicon chip," said study team member Stefano Vassanelli from the University of Padua in Italy.
Do death sentences really give victims relief?
Dahlia Lithwick in Slate:
The past few weeks have been rife with accusations of closure denied. The families of Slobodan Milosevic's tens of thousands of victims were ostensibly denied closure when he died before the conclusion of his war-crimes tribunal. Decisions over where to try exiled Liberian ruler Charles Taylor turn largely on how to afford closure to his victims. And the families of those killed in the 9/11 attacks despaired that government misconduct had ended not only the prosecution of Zacarias Moussaoui but also their one chance at closure. "I felt like my heart had been ripped out," said Rosemary Dillard, whose husband died in the attack on the Pentagon. "I felt like my husband had been killed again."
The Moussaoui death-penalty trial has been touted by the government as a way to bring resolution to bereft families. Hundreds watch the proceedings on remote, closed-circuit televisions. Tens will testify about their losses. This will be their "day in court." Since John Ashcroft announced in 2002 that he'd seek the death penalty for Moussaoui to "carry out justice," the assumption has been that justice demands an execution. Ashcroft said something similar in 2001 when he decided that family members of the Oklahoma City bombing victims could witness the execution of Timothy McVeigh on closed-circuit television, insisting it would "meet their need for closure."
Why? What's the empirical basis for the government assumption that all, or even most, victims of terrible tragedy will find "closure" through protracted trials and executions?
Local News Broadcasts Offer Inaccurate Health Stories
"New research finds egregious errors in the reporting of medical studies."
Britt Peterson in Seed Magazine:
Watched the nightly local newscast much in the past few years? Perhaps you've heard that lemon juice can be used as a substitute for HIV medications, or that exercise can actually cause cancer. If your child has something caught in his throat, doctors recommend that you shove your fingers down his gullet to get it out. Oh, and be very sure not to perform self-examinations for breast cancer—unless you want to, in which case, doctors say: Go right ahead.
According to a study in the March issue of the American Journal of Managed Care, these often flat-out wrong and occasionally harmful stories were all broadcast under the guise of scientific fact on local television news programs.
Author Stanislaw Lem dies
From the BBC:
He sold more than 27 million copies of his works, translated into about 40 languages, and a number were filmed.
His 1961 novel Solaris was made into a movie by Russian director Andrei Tarkovsky in 1972 and again by American Steven Soderbergh in 2002.
Soderbergh's version starred George Clooney and Natascha McElhone.
Lem was born in 1921 in Lviv in Ukraine and studied medicine there before World War II. He moved to Krakow in 1946.
When Law and Ethics Collide — Why Physicians Participate in Executions
Atul Gawande in the New England Journal of Medicine:
On February 14, 2006, a U.S. District Court issued an unprecedented ruling concerning the California execution by lethal injection of murderer Michael Morales. The ruling ordered that the state have a physician, specifically an anesthesiologist, personally supervise the execution, or else drastically change the standard protocol for lethal injections. Under the protocol, the anesthetic sodium thiopental is given at massive doses that are expected to stop breathing and extinguish consciousness within one minute after administration; then the paralytic agent pancuronium is given, followed by a fatal dose of potassium chloride.
The judge found, however, that evidence from execution logs showed that six of the last eight prisoners executed in California had not stopped breathing before technicians gave the paralytic agent, raising a serious possibility that prisoners experienced suffocation from the paralytic, a feeling much like being buried alive, and felt intense pain from the potassium bolus. This experience would be unacceptable under the Constitution's Eighth Amendment protections against cruel and unusual punishment. So the judge ordered the state to have an anesthesiologist present in the death chamber to determine when the prisoner was unconscious enough for the second and third injections to be given — or to perform the execution with sodium thiopental alone.
The California Medical Association, the American Medical Association (AMA), and the American Society of Anesthesiologists (ASA) immediately and loudly opposed such physician participation as a clear violation of medical ethics codes. "Physicians are healers, not executioners," the ASA's president told reporters.
More here. [Thanks to Michael Blim.]