January 31, 2007
Hiding From the Truth
Philip Kitcher in the Oxford University Press blog:
[Kitcher is the John Dewey Professor of Philosophy at Columbia University. Living With Darwin: Evolution, Design, and the Future of Faith, Kitcher's most recent book, is both a defense of Darwin and an exploration of the meaning behind the clash of religion and modern science. Kitcher is also the author of Abusing Science: The Case Against Creationism, The Lives to Come: The Genetic Revolution and Human Possibilities, Vaulting Ambition: Sociobiology and the Quest for Human Knowledge, Science, Truth, and Democracy, and In Mendel's Mirror. In the article below Kitcher explores how easy it is to hide from the truth.]
Finally, in his State of the Union message, President Bush acknowledged that climate change is a problem. Whether he understands the magnitude of the problem or is prepared for the kinds of measures that are needed to address it remains unclear. But, from many Americans, and especially from people in other countries who have been concerned about global warming for many years, there have been huge sighs of relief. At the same time, there’s an obvious question – why has it taken so long?
The broad outlines of the answer are fairly clear. During recent years, some writers whose conclusions appeal to the values of the President and his advisers have muddied the waters about climate change. They have employed familiar tactics, casting doubt on any consensus among experts by ignoring the large agreements and concentrating on those places where scientists debate the details. Structurally, the case is much like the long-running battle about evolution: you make it seem as though there is no consensus by judiciously quoting from researchers who are actively involved in discussing unsettled questions, but who agree on a fundamental core framework that you don't bother to mention.
Behind these two examples lies a deeper problem about the ways the achievements of the sciences are received in American society.
Essay Linking Liberal Jews and Anti-Semitism Sparks a Furor
Patricia Cohen in the New York Times:
The American Jewish Committee, an ardent defender of Israel, is known for speaking out against anti-Semitism, but this conservative advocacy group has recently stirred up a bitter and emotional debate with a new target: liberal Jews.
An essay the committee features on its Web site, ajc.org, titled “ ‘Progressive’ Jewish Thought and the New Anti-Semitism,” says a number of Jews, through their speaking and writing, are feeding a rise in virulent anti-Semitism by questioning whether Israel should even exist.
In an introduction to the essay, David A. Harris, the executive director of the committee, writes, “Perhaps the most surprising — and distressing — feature of this new trend is the very public participation of some Jews in the verbal onslaught against Zionism and the Jewish State.” Those who oppose Israel’s basic right to exist, he continues, “whether Jew or gentile, must be confronted.”
Muhammad Ali: The Brand and the Man
Dave Zirin in The Nation:
Muhammad Ali's brilliance was not that he was some kind of antiwar prophet. He wasn't Malcolm X or Martin Luther King Jr. in boxing gloves, debating foreign policy between rounds. But unlike the Ivy League advisers who made up the "best and brightest," Ali understood then that there was justice and injustice, right and wrong. He knew that not taking a stand could be as political a statement as taking one. This was Ali's code, and he never wavered.
In early 1966 the US Army came calling for Ali, and he was classified 1-A for the draft. He got the news surrounded by reporters and blurted one of the most famous phrases of the decade, "Man, I ain't got no quarrel with them Vietcong."
This was an astounding statement. As Mike Marqusee outlines in his Redemption Song: Muhammad Ali and the Spirit of the 60s, there was little opposition to the war at the time. The antiwar movement was in its infancy, and most of the country still stood behind the President. Life magazine's cover read, "Vietnam: The War Is Worth Winning." The song "Ballad of the Green Berets" was climbing the charts. And then there was Ali. As longtime peace activist Daniel Berrigan said, "It was a major boost to an antiwar movement that was very white. He was not an academic or a bohemian or a clergyman. He couldn't be dismissed as cowardly."
Worldmapper Maps Health
Infant mortality refers to babies who die during the first year of their lives. In 2002 there were 7.2 million infant deaths worldwide; 5.4% of all babies born died within their first year, including 2.3% in their first week.
The territory with the most infant deaths was India, at 1.7 million, or 24% of the world total. In India, for every 100 babies born alive, almost 7 die in the following 12 months.
In 22 territories the rate is over 1 infant death for every 10 live births. All of these 22 territories are in Africa. The highest infant mortality rate is in Sierra Leone, where 16.5 of every 100 babies born alive die.
Territory size shows the proportion of infant deaths worldwide that occurred there in 2002. Infant deaths are deaths of babies during their first year of life.
What Non-Human Primates Tell Us About Religion
In Salon, an interview with Barbara J. King, author of Evolving God: A Provocative View on the Origins of Religion (via Political Theory Daily Review):
Every human culture has believed in spirits, gods or some other divine being. That's why human beings have often been called Homo religiosus. Some people take this long history of belief in the otherworldly as evidence for God; doesn't it explain why religion continues to be so pervasive? But many scientists are coming up with their own, decidedly secular, theories about the origins of faith. In fact, over the last few years, a small cottage industry made up of scientists and philosophers has devoted itself to demystifying the divine.
Take Daniel Dennett, the philosopher who has proposed that religion is a meme -- an idea that evolved like a virus -- that infected our ancestors and continued to spread throughout cultures. By contrast, anthropologist Pascal Boyer argues that religious belief is a quirky byproduct of a brain that evolved to detect predators and other survival needs. In this view, the brain developed a hair-trigger detection system to believe the world is full of "agents" that affect our lives. And British biologist Lewis Wolpert, with yet another theory, posits that religion developed once hominids understood cause and effect, which allowed them to make complex tools. Once they started to make causal connections, they felt compelled to explain life's mysteries. Their brains, in essence, turned into "belief engines."
Of course, these thinkers are either religious skeptics or outright atheists who mean to imply that we've been duped by evolution to believe in supernatural beings when none, in fact, exist. That's what makes Barbara J. King, an anthropologist at the College of William and Mary, so unique. She has no desire to undermine religion. In fact, she's been deeply influenced by the religious writers Karen Armstrong and Martin Buber. But her main insights about the origins of religion come not from researching humans' deep history, but from observing very much alive non-human primates.
Why Men and Women Don't Want Sex
Dr. Helen scans through the comments on a WebMD post on the different reasons why men and women don't want sex and concludes:
Update: A Men's News Daily commenter to this post writes the following:
"Never forget: the single most revolting image, the nightmare that haunts women, is that of the happy, grinning, sexually satisfied male. They really hate that and the sooner we adjust our social expectation to that fact, the better." Truer words were never spoken--I think that some women really do feel this way.
Jill at Feministe responds:
Yes, women do secretly hate the idea of our partners being happy. You’ve got us all figured out.
The double-standard here is amazing. From the letters Dr. Helen quotes, it’s pretty clear that many women are refusing sex because they aren’t enjoying it, or because there are other issues within the relationship that are leaking over into their sex lives. But clearly, they’re just being selfish by not allowing their husbands unrestrained sexual access, even if the sex sucks, or is painful, or is unwanted. As usual, the mens are not doing anything that needs re-evaluating.
holding fast to the prism of her very soul
If Avedon provided the tools, it was Susan Sontag who gave Leibovitz a fresh sense of how she could use them as an autonomous artist. In retrospect, that a high-profile photographer of Leibovitz’s calibre should form an alliance with an intellectual as illustrious as Sontag is perfectly logical. After all, Walker Evans and James Agee formed an influential collaboration in the heyday of “documentary style” photography (Evans’s own term). The turn-of-the-21st-century twist is that Leibovitz and Sontag are women – and that all aspects of their personal, creative, and intellectual lives were intertwined during the fifteen-year period of their relationship.
Leibovitz’s knowing, “commercial” style stands out in a museum context. The best example is her witty color portrait of the Bush Administration, Cabinet Room (2001). A straight photograph and a public image, it’s also stupendously ironic. Bush, Rice, and the rest of them look like a band posing for a 1970s album cover. But the exhibition reveals that Leibovitz has mastered other modes. Her work shifts from creative service in the political and entertainment industries to photojournalism, as in Traces of the Massacre of Tutsi Schoolchildren and Villagers on a Bathroom Wall (1994), to tender family portraiture. Her soft-focus landscape photography of the American west and vast terrain in other locations includes a picture of Mt. Vesuvius - echoing Sontag’s novel The Volcano Lover.
more from Artcritical here.
Cue uproarious laughter
Interviewing two people at the same time is never easy, but Gilbert and George, a retrospective of whose work opens at Tate Modern next month, take the thing (and of course they're perfectly aware of this) to a whole new level. Ask a question and, to your right, George will offer some piece of gnomic wisdom topped off with a dash of mild smut while, to your left, Gilbert will titter or splutter or make his own naughty joke in an effort to back up his friend. Then, as you struggle to grasp what it is that they actually mean, the two of them will fall eerily silent. Their marmoset eyes are always on you, which would be scary if they weren't so invincibly charming. George, in particular, has the kind of manners - if you ignore the smut - that one might have found behind the discreet rosewood counter of a gentleman's outfitter, circa 1935.
more from The Guardian here.
Being and Laziness
From The New Republic:
Anyone with a claim to literacy is familiar with the names of Tolstoy, Turgenev, and Dostoevsky, and can cite some of the titles of their most famous works. But Goncharov and his novel Oblomov, of which a new translation, a snappily colloquial and readable one, has just been published -- who ever heard of them?
Open any Russian dictionary and you will find the word oblomovshchina, defined, in the first one that comes to hand, as "carelessness, want of energy, laziness, negligence," and specifying its origin in Goncharov's novel, where the word itself is used. Scarcely any other novelist, Russian or otherwise (except perhaps Cervantes), could boast of having created a character whose attributes have left such an indelible impression on the vocabulary, and on the national psyche, of his country.
So who was Ivan Goncharov, and why has the character he created taken on such ineradicably symbolic proportions? He came from a very prosperous merchant family, and was one of the few Russian writers of this period descended from such a background. He was known for his shy and retiring personality, and such reticence may well be attributed to a lingering uneasiness about his status in the carefully delineated Russian caste society.
The late Carl Sagan on questions of science and faith
From The Washington Post:
In 1877, the Italian astronomer Giovanni Schiaparelli was looking at Mars through his new telescope, and he noticed intricate etchings in the equatorial region of the planet's surface. Schiaparelli called these lines canali, by which he probably meant something like "gullies" or "grooves," but his coinage got wrongly translated into English as "canals." It was a regrettable linguistic slip. The idea of Martian canals grabbed the imagination of American astronomer Percival Lowell, scion of the famous Boston Lowell clan, who spun out an elaborate story of a Martian civilization with a central planetary government and the technological wizardry to engineer a massive system of aqueducts. Lowell even used his own Arizona observatory to identify the Martian capital, called Solis Lacus.
There are no canals on Mars. No cities either, and no government. Indeed, no signs of past life whatsoever, as we know today. All of this was an elaborate phantasm of Lowell's fertile mind, yet as late as the 1950s, popular culture was saturated with imagery of Martians as a technologically advanced extraterrestrial race. The late Carl Sagan used the misbegotten tale of Martian engineers, in his 1985 Gifford Lectures in Natural Theology at the University of Glasgow, as a cautionary tale about the power of belief and yearning to trump science and reason.
January 30, 2007
Samir El-youssef: At home with the heretic
Samir El-youssef, raised in a refugee camp, grew up into a writer who challenges the myths of Palestinian politics. Matthew J Reisz meets a trouncer of taboos.
From The Independent:
El-youssef has a Sunni father, but his mother comes from the only Shi'ite Palestinian family. This, he believes, "has contributed to the diversity of my understanding of things - from the beginning you are aware of yourself as someone different". Although he has contributed many articles to the London-based Arabic newspaper Al-Hayat, his criticisms of the second intifada and the Arab policy of "non-normalisation" in relation to Israel have sometimes proved too controversial to be published.
"We have to meet up with the Israelis and have a dialogue with them," he explains. "The idea of not meeting is simply childish and stupid. But it is not easy to express your views. You can be branded a 'Zionist' or a 'traitor' simply for not parroting the same old slogans."
His own social circle consists largely of liberal British Jews and Israelis. Asked about his outspoken opposition to the academic boycott of Israel, he responds cheerfully: "What hope do we have if we as writers don't speak to each other? Do we really think our idiotic leaders are going to sort things out?"
Michael Chabon on Cormac McCarthy's new novel
A review of The Road by McCarthy, from the New York Review of Books:
Charlton Heston and a savagely coiffed vixen, wrapped in animal skins, riding horseback along a desolate seashore, confronted by the spike-crowned ruin of the Statue of Liberty half buried in the sand: everyone knows how the world ends. First radiation, plague, an asteroid, or some other cataclysm kills most of humankind. The remnants mutate, lapse into feudalism, or revert to prehistoric brutality. Old cults are revived with their knives and brutal gods, while tiny noble bands cling to the tatters of the lost civilization, preserving knowledge of machinery, agriculture, and the missionary position against some future renascence, and confronting their ancestors' legacy of greatness and destruction.
Ambivalence toward technology is the underlying theme, and thus we are accustomed to thinking of stories that depict the end of the world and its aftermath as essentially science fiction. These stories feel like science fiction, too, because typically they deal with the changed nature of society in the wake of cataclysm, the strange new priesthoods, the caste systems of the genetically stable, the worshipers of techno-death, the rigid pastoral theocracies in which mutants and machinery are taboo, etc.; for inevitably these new societies mirror and comment upon our own. Science fiction has always been a powerful instrument of satire, and thus it is often the satirist's finger that pushes the button, or releases the killer bug.
This may help to explain why the post-apocalyptic mode has long attracted writers not generally considered part of the science fiction tradition. It's one of the few subgenres of science fiction, along with stories of the near future (also friendly to satirists), that may be safely attempted by a mainstream writer without incurring too much damage to his or her credentials for seriousness.
Sending a man to the moon was an immensely expensive distraction of little scientific or cultural worth
Greg Ross interviews Gerard J. DeGroot, author of Dark Side of the Moon: The Magnificent Madness of the American Lunar Quest, in American Scientist:
To Americans in the 1960s, putting a man on the Moon was a noble, even romantic challenge. "No single space project in this period will be more impressive to mankind," President Kennedy told Congress, "or more important in the long-range exploration of space, and none will be so difficult or expensive to accomplish."
But in re-examining the Apollo project, historian Gerard J. DeGroot finds it largely an empty dream. In Dark Side of the Moon: The Magnificent Madness of the American Lunar Quest (New York University Press), he argues that the Moon race was essentially just a new front in the Cold War, "an immensely expensive distraction of little scientific or cultural worth."
In announcing the Apollo project, Kennedy referred to moving with what he called "the full speed of freedom." Do you think he saw it chiefly as a scientific endeavor, or really as a symbolic contest of ideologies?
I think very definitely the latter. It's very difficult for some people even still, given Kennedy's mystique, to accept that he wasn't quite the person we thought he was. I think the really telling bit comes in a conversation that he has with the NASA administrator James Webb, in which he says, "I don't really care about the moon. I know it's important; I know there are people who really want to go there, but I just want to beat the Russians." So it really comes down to that. It is purely a symbol of American supremacy in the Cold War. Because the Cold War didn't provide real wars, this is in a sense a sort of surrogate war, and almost seemingly chosen with the same sort of cavalier attitude that, say, a Civil War general might choose a battlefield: "Well, we're here, let's fight right here."
Spy Princess: The Life of Noor Inayat Khan
Ruchira Paul in Accidental Blogger:
In 1912, a flamboyant "oriental style" dancer with the exotic name of Mata Hari (mother of god in Hindi) was the toast of Paris night clubs. A traveling musical group from India, The Royal Musicians of Hindustan was in Paris that year. Mata Hari performed with this group. The group's lead singer was a handsome and serious young man named Inayat Khan. He belonged to an accomplished Indian musical family from Baroda and was trained in Indian classical music and the sufi philosophical tradition. The glamorous and famous Mata Hari later went on to become a French spy (some say, a German double agent) during World War I - not the most sensible career choice for someone who sought publicity relentlessly. Little did the gentle Inayat Khan know that one day his own daughter would follow in the footsteps of the notorious Mata Hari and meet an equally tragic (but more honorable) fate.
Inayat Khan traveled the world with his musical group and introduced the pacifist sufi philosophy to western audiences. During a tour of the United States, he met, fell in love with and married Ora Ray Baker. In 1914 their oldest daughter, Noorunnisa Inayat Khan (Noor), was born in the Kremlin, Moscow. The family lived in England and France. From all accounts, Noor and her siblings were brought up in a household bearing both eastern and western traditions. Despite European influences on the children's upbringing, the cultured and conservative lifestyle of the Khan family was in keeping with Indian Muslim tradition. (Her American-born mother had converted to Islam and adopted the name Amina Begum.) Noor was trained in classical Indian and western music, playing the sitar, piano, cello and violin. She studied child psychology at the Sorbonne and music at the Paris Conservatory.
More here. [Photo shows Noor Inayat Khan.]
Le Grand Content: Examining the omnipresent Powerpoint-culture in search for its philosophical potential...
A short film, via Tony Cobitz's blog, Mtanga:
Paul Auster should not exist
Paul Auster should not exist. I say this not to mimic a sentence that might easily have been plucked from one of his own hall-of-mirrors fictions, but simply to note his singular position in contemporary American letters. He has enjoyed unlikely success by writing reflexive novels that take up notions of chance and fate, memory and oblivion, luck and the uncanny; given his self-referential leanings and taste for highbrow allusion, it might seem that he would at best have found a coterie of admirers and a university appointment to subsidize his writing. Instead, he has settled comfortably into a career as one of the most glamorous novelists in America. Abroad, he has even higher visibility, a genuine rock-star aura. Magazine profiles cite his movie-idol looks and general air of suave elegance, and although Park Slope, the Brooklyn neighborhood where he lives, may now be home to more writers than any other urban enclave on the planet, he stands out in his affiliation with the place as one of its presiding celebrities. He has branched out into subsidiary projects as a radio personality (having headed up a few years back NPR's National Story Project, which solicited anecdotal tales from listeners nationwide, later collected in the anthology I Thought My Father Was God) and a screenwriter and film director: Best known in this regard for his screenplay for 1995's Smoke (directed by Wayne Wang), Auster has written and directed the rather stilted Lulu on the Bridge (1998) and the just-completed The Inner Life of Martin Frost, based on material from his novel The Book of Illusions (2002). His work has also proliferated into media of unimpeachable hipness: Paul Karasik and David Mazzucchelli adapted City of Glass (1985), the first book in Auster's New York Trilogy, into a graphic novel in 1994, and the beguiling, mischievous French artist Sophie Calle has realized conceptual pieces based on his writings.
These extraliterary manifestations contribute to a highly resilient cultural persona, gracing him, if you will, with a street credibility among chic young bookish types that has sustained Auster through an uneven career.
more from Bookforum here.
the permanent night-time of his elected trade
When John le Carré published A Perfect Spy in 1986, Philip Roth, then spending a lot of time in London, called it ‘the best English novel since the war’. Not being such a fan of A Perfect Spy, I’ve occasionally wondered what Roth’s generous blurb says about the postwar English novel. As a le Carré bore, however, I’ve also wondered how Roth managed to overlook Tinker Tailor Soldier Spy (1974), the central novel in le Carré’s career, in which George Smiley – an outwardly diffident ex-spook with a strenuously unfaithful wife and an interest in 17th-century German literature – comes out of retirement to identify the turncoat in a secret service that’s explicitly presented as a metaphorical ‘vision of the British establishment at play’. If you sit up late enough watching DVDs of the BBC adaptation starring Alec Guinness, or Martin Ritt’s version of The Spy Who Came in from the Cold with Richard Burton, it’s possible to persuade yourself that le Carré might even be the greatest English novelist alive. Unfortunately, looking at his other books the next morning makes this seem less likely, in part because the classic phase of his career ended earlier than we bores like to remember, and in part because some of his early strengths have become, in a changed context, weaknesses.
more from the LRB here.
to organize the world’s information and make it universally accessible and useful
Every weekday, a truck pulls up to the Cecil H. Green Library, on the campus of Stanford University, and collects at least a thousand books, which are taken to an undisclosed location and scanned, page by page, into an enormous database being created by Google. The company is also retrieving books from libraries at several other leading universities, including Harvard and Oxford, as well as the New York Public Library. At the University of Michigan, Google’s original partner in Google Book Search, tens of thousands of books are processed each week on the company’s custom-made scanning equipment.
Google intends to scan every book ever published, and to make the full texts searchable, in the same way that Web sites can be searched on the company’s engine at google.com. At the books site, which is up and running in a beta (or testing) version, at books.google.com, you can enter a word or phrase—say, Ahab and whale—and the search returns a list of works in which the terms appear, in this case nearly eight hundred titles, including numerous editions of Herman Melville’s novel.
more from The New Yorker here.
Snake Bites the Toxic Toad That Feeds It--and Spreads Its Poison
From Scientific American:
It sounds like something straight out of a video game: A snake collects toxin by biting a poisonous toad and uses that venom as a defense against hawks and other predators. But that is exactly what researchers say the Asian snake Rhabdophis tigrinus does, based on studies of glandular fluid from hatchlings and adult snakes on two Japanese islands.
Some R. tigrinus snakes carry toxins called bufadienolides in their nuchal glands, sacs located under a ridge of skin along their upper necks. When threatened, they arch their necks, exposing the poisonous ridge to an antagonist. The clawing and biting of hawks and other predators most likely rips the skin and lets the poison ooze out, potentially blinding the snake's attackers, says herpetologist Deborah Hutchinson of Old Dominion University in Norfolk, Va. "It might not kill the predator but it would be noxious enough to deter predation," she says.
'Hobbit' human 'is a new species'
The finds caused a sensation when they were announced to the world in 2004. But some researchers argued the bones belonged to a modern human with a combination of small stature and a brain disorder called microcephaly. That claim is rejected by the latest study, which compares the tiny people with modern microcephalics. Microcephaly is a rare pathological condition in humans characterised by a small brain and cognitive impairment.
In the new study, Dean Falk, of Florida State University, and her colleagues say the remains are those of a completely separate human species: Homo floresiensis. They have published their findings in Proceedings of the National Academy of Sciences. The remains at the centre of the Hobbit controversy were discovered at Liang Bua, a limestone cave on the Indonesian island of Flores, in 2003.
January 29, 2007
A Case of the Mondays: The Blank Slate and Other Phantom Theories
Reading Steven Pinker's The Blank Slate reminded me of most other polemical books I'd read that attempt to integrate some science into their works. In theory it's a science book, a longwinded defense of both evolutionary psychology and its obvious social implications. But in practice, it's mostly a political book; the science is provided only as a backdrop against which Pinker sets up his attacks on a host of social, political, and cultural notions that stand in opposition to crude evolutionary psychology (which I'll abbreviate as EP in the rest of this post).
Pinker frames his view as that of modern science, represented by such tools as genetics, neurobiology, and post-Williams Revolution evolutionary biology, versus that of three closely interlinked demons. The first demon, which he focuses on the most, is the view that at birth the human mind is a blank slate to be shaped by environmental forces. The second is romantic affection for the noble savage, uncorrupted by pernicious civilization. And the third is the dualist notion that people are ghosts inhabiting the machines that are their own bodies.
The problems with the book's thesis start right at the beginning, when Pinker claims that a) all three views are interlinked, and b) all three views were very respectable until the science of EP started to overthrow them. The best way of seeing why Pinker is wrong there is by looking at the three philosophical positions he associates with the three demons—empiricism for the blank slate, romanticism for the noble savage, and dualism for the ghost in the machine.
By and large, the philosophers who developed empiricism, romanticism, and dualism in modern times disagreed with one another. Descartes' dualism isn't a component of Locke's empiricism; on the contrary, they disagree on the fundamental issue of whether all knowledge comes from experience. Romanticism developed mostly after the Enlightenment, and was only associated with empiricism or dualism when it mythologized European progress rather than the noble savage.
Zooming in on empiricism, it's easy to see another error of Pinker's: Lockean empiricism does not strictly speaking say the mind is a blank slate, at least not in the way that is relevant to EP. The main point of EP is that the human brain is hardwired to be prone to certain forms of learning and modes of behavior. The EP-derived view that men are on average better than women at math is not that men are born knowing more math than women but that men are born with a greater aptitude for math than women. In contrast, Locke's main contention is that knowledge comes directly from experience. He never concerned himself with social learning, which only became a serious subject of study a century or two after his death.
More importantly, the people Pinker criticizes for distorting science by claiming that IQ is not meaningful or not hereditary, or even that the mind is indeed a blank slate, have nothing to do with the other two demons. Marxist theory, which the people Pinker labels radical scientists adhere to, is extremely anti-romantic and anti-dualist. Among all the radical ideologies in existence—libertarianism, fascism, religious fundamentalism, anarchism—it is certainly the most pro-modern. Lewontin's politics are largely doctrinaire Marxist: in Biology as Ideology, he trumpets the triumph of progress, even as he indicates this progress should come from accepting socialism more than from ordinary capitalist improvements.
The relationship between Pinker and Lewontin is an interesting one. Pinker notes that although Lewontin claims that he thinks the dominant force in evolution is the interaction between gene, organism, and environment, in terms of social implications he ignores everything but environment. On that Pinker is certainly right: Biology as Ideology is an anti-science polemic that distorts facts to fit Lewontin's agenda (my take on Lewontin was subsequently debated at length here). However, Pinker commits the same transgression: he says he believes in the sensible moderate view that human behavior is determined by both inborn and environmental factors, and goes on not only to ignore the implications of the environmental part but also to defend racists and sexists who have used pseudoscience as cover.
For instance, he starts by ridiculing people who called Richard Herrnstein a racist for a 1970 paper about intelligence and heredity. Although the paper as Pinker describes it is not racist per se, Herrnstein was indeed a racist. The screed he published with Charles Murray in 1994, The Bell Curve, is not only wrong, but also obviously wrong. Even in 1994, there were metastudies about race and intelligence that showed that the IQ gap disappears once one properly controls for environmental factors, for example by considering the IQ scores of children born to single mothers in Germany by American fathers in World War Two.
The truth, or what a reasonable person would believe to be the truth, is never oppressive. If there is indeed an innate component to the racial IQ gap, or to the gender gap in math scores, then it's not racist or sexist to write about it. Nor does it become racist or sexist if the innate component turns out not to exist, so long as the researcher had solid grounds to believe that it did.
However, Murray and Herrnstein had no such solid grounds. They could quote a few studies supporting their point, but when researchers publish many studies of the same phenomenon, some are bound to detect statistically significant effects that do not exist. By choosing one's references selectively, one can show that liberals are morally superior or morally inferior to conservatives, or that socialism is more successful or less successful than capitalism. At times there are seminal studies that require no further metastudy; there were none in 1994, while the existing metastudies suggested that the racial IQ gap was entirely environmental. As I will describe below, the one seminal study, done in 2003, moots not only Murray and Herrnstein's entire argument but also much of Pinker's.
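The arithmetic behind this point about selective citation is worth making concrete. A toy simulation (the study counts and test are my own illustrative assumptions, not drawn from the IQ literature) shows that even when an effect is exactly zero, a literature of many studies will reliably contain some "significant" findings for a motivated author to quote:

```python
import random

random.seed(0)

# Simulate 200 published studies of an effect whose true size is zero.
# Each study compares two samples drawn from the SAME population and
# tests at the conventional 5% level, so about 5% of them will report
# a "statistically significant" difference purely by chance.
N_STUDIES = 200
N_PER_GROUP = 50

def one_study(n=N_PER_GROUP):
    """Return True if a study of two identical populations reports significance."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_diff = sum(a) / n - sum(b) / n
    se = (2 / n) ** 0.5          # standard error of the difference (known variance 1)
    z = mean_diff / se
    return abs(z) > 1.96         # two-sided 5% threshold

false_positives = sum(one_study() for _ in range(N_STUDIES))
print(f"{false_positives} of {N_STUDIES} null studies look 'significant'")
```

Roughly ten of the two hundred null studies come out "significant," which is exactly why a metastudy of the whole literature, rather than a handful of hand-picked citations, is the minimum standard of evidence here.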
To rebut claims of racism and sexism, Pinker invokes the moral argument—in other words, that to be against racism and sexism one need only vigorously oppose discrimination, without believing that without any discrimination there would be no gaps in achievement. In theory, that is correct. But in practice, that narrow view makes it impossible to enforce any law against discrimination.
Worse, Pinker invokes anti-feminist stereotypes that are born not of serious scholarship, but of ideologically motivated conservative thinking. He supports Christina Hoff Sommers's spurious distinction between equity feminism and gender feminism. Although there are many distinctions among different kinds of feminists, some of which track the degree of radicalism, none of the serious ones has anything to do with Hoff Sommers's. In theory, equity feminism means supporting equality between women and men, while gender feminism means supporting a view of the world in which the patriarchy is omnipresent. In practice, the people who make that distinction, including Pinker, assign everyone who supports only the forms of equality that are uncontroversial in the United States, like equal pay laws and suffrage, to equity feminism, and everyone who supports further changes, or even existing controversial ones, to gender feminism.
As a case study, take family law activist Trish Wilson. Wilson's activism focuses on divorce law; she has written articles and testified in front of American state legislatures against laws mandating presumptive joint custody, mainly on the grounds that it hurts children. In addition, she has written exposés of ways abusive men exploit legal loopholes, including presumptive joint custody, to gain custody of children. In pushing for equality in the courtroom, she is a liberal feminist's liberal feminist. And yet, her attacks on the men's rights movement for protecting abusive men have caused every conservative who makes distinctions between equity and gender feminism to deride her as a gender feminist.
Any reasonable distinction between a more radical feminist stream and a more conventional one would put Betty Friedan and her organization NOW on the less radical side. Friedan was anti-radical enough to devote much of her tenth anniversary afterword to The Feminine Mystique to attacking radical feminists, by which she meant not Catharine MacKinnon or Andrea Dworkin, but Kate Millett. NOW has focused on legal equality, primarily abortion rights and secondarily laws cracking down on employment discrimination and sexual harassment. But Pinker assigns Friedan as well as Bella Abzug to the gender feminism slot, using entirely trivial statements of theirs to paint them as radicals. Friedan he attacks for suggesting that compulsory education be extended down to the age of two; Abzug he attacks for saying that equality means fifty-fifty representation everywhere.
To his credit, Pinker never quite claims that there is no gender discrimination. However, he makes an earnest effort to undermine every attempt to counteract it, however well founded. For instance, he claims that it's absurd to say that women's underrepresentation in science in the United States is due to discrimination, on the grounds that they're even more underrepresented in math, and it's not likely mathematicians are more sexist than scientists. Instead, he suggests, women are just not interested in math and science.
However, it is legitimate to ask why this interest gap exists. There is no EP-based argument why it should be innate. On the contrary, independent evidence from, for example, the proportion of female mathematicians who come from families of mathematicians versus the proportion of male mathematicians, suggests it is environmental. Indeed, the educational system of the United States has long encouraged women to ignore the hard sciences. Other educational systems produce near-parity: while 13% of American scientists and engineers are women, many other countries, such as Sweden and Thailand, have percentages higher than 30 or even 40.
Furthermore, one of the most important pieces of information about biases in education, the stereotype threat, receives no mention from Pinker. It's an established fact that telling girls who are about to take a math test that boys do better will make them do worse. In fact, telling them that the test measures aptitude, or even asking them to fill out an oval for gender before the test, will hurt their performance. And yet somehow Pinker glosses over that fact in a book that purports to be about a combination of genetics and environment.
There is hardly a single thing Pinker gets right about rape in his book, except that Susan Brownmiller is wrong. His explanation of rape is that it is a male biological urge, as evidenced by the fact that in many species males rape females. However, that theory says nothing about why straight men rape other men in prison, or in general about the dynamics of male-on-male rape. He provides scant circumstantial evidence for his theory of rape; instead, he prefers to rant about how Brownmiller's feminist theories are dominant, even though in fact the dominant view among criminologists is that rape is simply a violent crime, rather than a case of passionate sex gone awry or a mechanism for reinforcing the patriarchy.
Pinker commits not only a sin of omission in writing about rape or violence in general, but also a sin of commission, in writing that nobody really knows what causes violence. In fact, criminologists have fairly good ideas about how social ills like poverty and inequality cause crime, although they know it about murder more than about other violent crimes. Still, the rates of all violent crimes are closely correlated; the major exception is the United States' murder rate, which is higher than its general violent crime rate predicts presumably because of its lax gun control laws.
Finally, Pinker quotes a 2001 study by Eric Turkheimer as showing that the Darwin wars ended and the gene-centric side, led by Richard Dawkins, prevailed over the more environment-based side, led by Stephen Jay Gould. Thence Pinker concludes that attempts to raise children in ways more conducive to growth are futile, since much of their future behavior is genetic, and almost all of what is not genetic is due to developmental noise rather than environmental influence.
However, in 2003 Turkheimer published another study, which settled the questions of race and IQ and of environmental influences on children in general. Turkheimer's starting point was that earlier studies of the heritability of IQ often focused on adopted children in middle- and upper-class families, where environmental influences might differ from those in lower-class families. By examining a large array of data spanning multiple races and social classes, he found that, on the one hand, within the middle class IQ is highly heritable, with a heritability of 0.72 and no significant environmental effects; on the other, within the lower class, which includes most blacks and Hispanics in the US, the heritability of IQ drops to 0.1, and environmental factors such as the depth of poverty and the level of schooling predominate.
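The logic of Turkheimer's finding is that heritability is a property of a population in an environment, not of the trait itself. A toy additive model (my own illustrative numbers, not Turkheimer's data) shows how the same genetic variation yields a high heritability estimate where environments are uniform and a low one where environments vary widely:

```python
import random

random.seed(1)

def heritability(env_sd, n=20000):
    """Estimate h^2 = Var(genes) / Var(phenotype) for phenotype = genes + environment."""
    genes = [random.gauss(0, 1) for _ in range(n)]
    pheno = [g + random.gauss(0, env_sd) for g in genes]

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    return var(genes) / var(pheno)

# Uniform, favorable environments (small environmental spread): h^2 comes out high.
print(round(heritability(env_sd=0.6), 2))   # roughly 0.7
# Widely varying environments (depth of poverty, quality of schooling): h^2 collapses.
print(round(heritability(env_sd=3.0), 2))   # roughly 0.1
```

Nothing about the genes changes between the two calls; only the spread of environments does. That is why a heritability of 0.72 measured among middle-class adoptees says nothing about how much environment matters for children growing up in poverty.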
Obviously, it would be futile to blame Pinker for not mentioning Turkheimer's 2003 study. The Blank Slate was published in 2002. However, all other facts I have cited against Pinker's thesis and its purported social implications predate 2002. The Turkheimer study does not show by itself that Pinker's book is shoddy; it merely shows that much of it is wrong. What establishes Pinker's shoddiness is the treatment of social problems like sexism, racism, and crime, which is based not on examination of the available evidence or even the views that are mainstream among social scientists who study them, but on what think tanks whose views align with Pinker's say.
Even a cursory examination of the current academic mainstream will show that the myths of the noble savage and the ghost in the machine are nonexistent. That fringe scholars sometimes believe in them is no indication of their acceptance; there are fringe scholars who believe in 9/11 conspiracy theories, too. Even the theory of the blank slate, at least in its most extreme form, is a phantom ideology. Lewontin adheres to it, but Lewontin is a contrarian; non-contrarian scientists do not publish books comparing modern biology departments to medieval Christianity. Pinker likes to poke fun at theories that suggest everyone or almost everyone can succeed in life, but he never gets around to actually refuting them. All he does is attack extreme caricatures such as the blank slate and other phantom theories.
PERCEPTIONS: centuries of grief
Firoze Shakir. Muharram in Lucknow. Ya Sakina Ya Abbas.
"This is a chant that rips through the air on Aathvi after burying the Chup tazia, and along with Ya Hussain Ya Hussain the Azreens do an unforgettable Matam."
More on this particular series here.
Muharram in Karbala here.
FS has a unique, informative and broad ranging gallery of photographs here.
Shia and Sunni, A Ludicrously Short Primer
Even now, many people who hear these terms daily on the news are confused about what the real differences are between Sunni and Shia Muslims, so I, having been brought up in a very devout Shia household in Pakistan, thought I would explain these things, at least in rough terms. Here goes:
It all started hours after Mohammad's death: while his son-in-law (and first cousin) Ali was attending to Mohammad's burial, others were holding a little election to see who should succeed Mohammad as the chief of what was by now an Islamic state. (Remember that by the end of his life, Mohammad was not only a religious leader, but the head-of-state of a significant polity.) The person soon elected to the position of caliph, or head-of-state, was an old companion of the prophet's named Abu Bakr. This was a controversial choice, as many felt that Mohammad had clearly indicated Ali as his successor, and after Abu Bakr took power, these people had no choice but to say that while he may have become the temporal leader of the young Islamic state, they did not recognize him as their divinely guided religious leader. Instead, Ali remained their spiritual leader, and these were the ones who would eventually come to be known as the Shia. The ones who elected Abu Bakr would come to be known as Sunni.
This is the Shia/Sunni split which endures to this day, based on this early disagreement. Below I will say a little more about the Shia.
So early on in Islam, there was a split between political power and religious leadership, and to make a long story admittedly far too short, this soon came to a head within a generation when the grandson of one of the greatest of Mohammad's enemies (Abu Sufian) from his early days in Mecca, Yazid, took power in the still nascent Islamic government. Yazid was really something like a cross between Nero and Hitler and Stalin; just bad, bad in every way: a decadent, repressive dictator (and one who flouted all Islamic injunctions), for whom it became very important to obtain the public allegiance of Husain, the pious and respected son of Ali (and so, grandson of Mohammad). And this Husain refused, on principle.
Yazid said he would kill Husain. Husain said that was okay. Yazid said he would kill all of Husain's family. Husain said he could not compromise his principles, no matter what the price. Yazid's army of tens of thousands then surrounded Husain and a small band of his family, friends and followers at a place called Kerbala (in present day Iraq), and cut off their water on the 7th of the Islamic month of Moharram. For three days, Husain and his family had no water. At dawn on the third day, the 10th of Moharram, Husain told all in his party that they were sure to be killed and whoever wanted to leave was free to do so. No one left. In fact, several heroic souls left Yazid's camp to come and join the group that was certain to be slaughtered.
On the 10th of Moharram, a day now known throughout the Islamic world as Ashura, the members of Husain's parched party came out one by one to do battle, as was the custom at the time. They were valiant, but hopelessly outnumbered, and each was killed in turn. All of Husain's family was massacred in front of his eyes, even his six-month-old son, Ali Asghar, who was pierced through the throat by an arrow from the renowned archer of Yazid's army, Hurmula. After Husain's teenage son Ali Akbar was killed, Husain is said to have proclaimed, "Now my back is broken." But the last to die before him was his beloved brother, Abbas, who fell while trying desperately to break through Yazid's ranks and bring water back from the Euphrates for Husain's young daughter, Sakeena. And then Husain himself was killed.
The followers of Ali (the Shia) said to themselves that they would never allow this horrific event to be forgotten, and that they would mourn Husain and his family's murder forever, and for the last thirteen hundred years, they have lived up to this promise every year. This mourning has given rise to ritualistic displays of grief, which include flagellating oneself with one's hands, with chains, with knives, etc. It can all seem quite strange, out of context, but remembrance of that terrible day at Kerbala has also given rise to some of the most sublime poetry ever written (a whole genre in Urdu, called Marsia, is devoted to evoking the events of Ashura), and some of us, religious or not, still draw inspiration from the principled bravery and sacrifice of Husain on that black day.
Earlier today, I took the following unlikely pictures on the ritziest road in New York City, Park Avenue:
This is the procession commemorating Ashura, or the 10th of Moharram. In front, you can see a painstakingly recreated model of the tomb of Husain. The mourners are dressed mostly in black. It is a testament to the tolerance of American society that despite the best attempts of some of its cleverest citizens to proclaim a "clash of civilizations," it allows (and observes with curiosity) such displays of foreign sentiment.
The procession is made up of Shias of various nationalities, with the largest contingents being from Pakistan and Iran.
A young Shia holds up a banner, perhaps forgetting for a second that he is supposed to be mourning.
You can see one of the coffins with roses on it, which are ritualistically carried in the procession.
The self-flagellation is in full swing at this point. (The arms are raised before coming down to beat the chest.)
This is "Zuljana" or Husain's horse, caparisoned with silks and flowers.
The self-flagellation, or matam, reaches a climactic frenzy before ending for Asr prayers. Later in the evening, there are gatherings (or majaalis) to remember the women and children of Husain's family who survived to be held as prisoners of Yazid.
Sojourns: Two Views of the Apocalypse
Slavoj Zizek once said "it is much easier for us to imagine the end of the world than a small change in the political system. Life on earth maybe will end but somehow capitalism will go on." One is tempted to respond, well yes of course. It is also easier to imagine blowing up a car than designing one. Destruction is a rather simple proposition. Feats of engineering are somewhat more complicated.
And yet there is something to the apocalyptic imagination. Thinking about the end of the world can perhaps tell us something about the world that is ostensibly ending. Or so it would seem from two of the more visually arresting films to appear in the last decade, both ruminating over our final days, both set, as it happens, in England. I refer here to everyone's favorite intellectual zombie flick 28 Days Later and the more recent dystopian thriller Children of Men.
The first thing I would point to is that it is not the "world" that is ending in these movies so much as the human race that has lorded over it for the past eon or so. It is part of our species' arrogance to identify the world with humanity and then to wonder if our destruction would be anything other than a good thing for the rest of "life on earth." So then let us be clear. What we are talking about here is not exactly the globe or the planet but simply the noisome breed of animals bent on mucking it up for everyone else.
Humans. We are tiresome, aren't we? Few could deny the beauty of the depopulated London with which 28 Days Later begins: the seraphic Cillian Murphy ambling around Oxford Circus, picking detritus off the ground, alone save for the pigeons and the gulls. Humanity has perished because the "rage virus" has been loosed from a lab and made us tear each other limb from limb. We don't die from the virus itself. It's the rage that kills us. And so we ought to wonder how much the virus adds to our native cruelty and rancor. Perhaps Cornelius had it right after all: "Beware the beast Man, for he is the devil's pawn. Alone among God's primates he kills for lust or sport or greed … Let him not breed in great numbers, for he will make a desert of his home and yours."
Actually, the conclusion (or at least the original one) of 28 Days Later is nowhere near as radical. It turns out the virus never got out of the country. Humanity is spared. The hero, his girlfriend, and an orphaned kid make an ersatz domestic hearth in the English countryside, all warm in their sweaters and waiting to be rescued. Rage may be conquered after all. Perhaps we can all just get along.
Humanity (nearly) perishes by anger in 28 Days Later. Sadness dooms us in Children of Men. Seventeen years after a global infertility crisis has brought a stop to human reproduction across the planet, "life" has pretty much ground to a halt. There's no future generation in sight, so nations plunge into despair. War, chaos, and social entropy ensue. The sound of children's voices is dearly missed.
Children of Men is a movie at odds with itself. At its core, the story is a saccharine humanist fable of a culture of life fighting to persist amid a culture of death. A baby springs miraculously into the fallen world and suddenly there is a future to save, as if one could only live for the sake of progeny, as if a world without humans could not be left well enough alone. Amid the rubble and squalor of the end of the world, the life-or-death struggle turns on getting the baby offshore to a group of save-the-planet scientists aptly dubbed (giving the game away) … the Human Project.
Yet, as much as the movie is committed at the level of story to a bland humanism, it is equally committed at the level of form to something quite different: to making us wonder, within the terms of the narrative, whether the human species ought not to become extinct after all. A great deal of attention has been paid to the six-minute single take in a battle-strewn internment camp. As with 28 Days Later, humanity's end makes quite a spectacle. I would point also to an earlier scene in an abandoned and dilapidated schoolyard. Here we are supposed to be thinking about the despair left in the absence of children. But the camera does something else. It lingers on a deer that strides into the frame and occupies the place of the missing kids. It's an arresting moment precisely in its species difference. A non-human animal walks on the ruins of a civilization made for human children. And perhaps that is just fine.
As in 28 Days Later, humanity ends and begins again in England, and is best imagined wrapped up in a cable-knit sweater while drinking Earl Grey tea (a role brilliantly played here by Michael Caine). Yet Children of Men makes the saving of humanity look and feel as if it is beside the point and a waste of time. And that is why it is most interesting in spite of its own worst ideas.
So, perhaps the lesson is that thinking about the end of the world is in fact thinking about making it a better place.
January 28, 2007
No Reservations, Asad Raza-Style
Recently, my wife and I have been avid watchers of chef Anthony Bourdain's program No Reservations on the Travel Channel (get cable, will you? And then get TIVO, too--trust me), and as I see Tony visit exotic locales and sample their various culinary offerings, I always wonder why he never replied to the late-nite letter that I once wrote him inviting him to dinner at my house, even promising to get my nephew Asad Raza to cook the incomparably zesty-yet-subtle, and completely sui generis, Pakistani dish, Nihari, for him. Now, let me tell you, Asad cooks a mean Nihari, but even the NM (Nihari Master) must go to the source for inspiration and instruction once in a while, and Asad not only went to Burns Road in Karachi (read about some of his other activities while he was there, here), he recorded his visit on video for the rest of us. So, Tony, either go to Pakistan, or come over to my place for some of Asad's Nihari, and meanwhile, watch this video which made my mouth water (and my heart ache):
The Decline and Fall of Public Festivals
In The Nation, Terry Eagleton reviews Barbara Ehrenreich's Dancing in the Streets.
Western liberals who are besotted with the Other should read E.M. Forster's mischievous little novel Where Angels Fear to Tread. The well-bred young English heroine of this tale runs off with a rather roughneck young Italian, to the horror of her priggish, xenophobic, stiff-necked family. Yet just as the reader is relishing the family's discomfort, an equally discomforting realization begins to dawn. The young Italian turns out to be an appalling brute. The parochially minded prigs were right after all.
Barbara Ehrenreich's Dancing in the Streets refuses to fall for the romance of the Other, though its subject--popular festivity versus puritanical order--might well have tempted her to. What we have instead is an admirably lucid, level-headed history of outbreaks of collective joy from Dionysus to the Grateful Dead. It is a book that investigates orgy but declines quite properly to join in. For one thing, it recognizes in its impressively unromantic way that most carnivalesque activity over the centuries has been planned rather than spontaneous, rather as rock concerts are today. For another thing, unlike the more dewy-eyed apostles of dancing in the streets, it recognizes that popular carnival has a darker, violent dimension. In wisely agnostic manner, Ehrenreich refuses to take sides in the debate about whether carnival is a licensed displacement of popular energies ("There is no slander in an allowed fool," remarks Olivia in Shakespeare's Twelfth Night), or whether it is a case of the plebeians rehearsing the uprising.
An Online Conference on Danto's The Transfiguration of the Commonplace
I first encountered Arthur Danto’s philosophy as an undergraduate in Jerusalem in the early 1970s, in a course on analytic aesthetics, where we also studied the texts of Monroe Beardsley, Nelson Goodman, Richard Wollheim, George Dickie, and Joseph Margolis. Each of these philosophers has a distinctive voice, and it was not Danto’s but Nelson Goodman’s that initially won my heart and inspired my philosophical ambitions. So inseparable was his red-covered Languages of Art from my person that friends jokingly described it, with reference to Chairman Mao’s current eminence, as my little red book of cultural revolution. Goodman’s austerely uncompromising nominalism, his lean, hard-fisted logical style, his confident, even arrogant tone of conviction all appealed to me as a young Israeli shaped by that culture’s military virtues. The infatuation did not survive my doctoral studies in Oxford, and my unqualified zeal for analytic philosophy did not survive my encounter with pragmatism in the early 1990s. Now, after more than thirty years of engagement with analytic aesthetics (both from the inside and from the critical perspective of the pragmatist aesthetics I advocate), I regard Danto as having its most alluringly potent oeuvre. This paper is, in part, an effort to explain why.
Several factors contribute to Danto’s greatness and collectively conspire to take him beyond those other prominent analytic aestheticians of his generation whose conceptual and argumentative skills seem every bit as impressive and who are likewise capable of systematic philosophy. First is his lovingly intimate engagement with the visual arts, though this is something that Wollheim and Goodman certainly shared. Another factor is Danto’s superior literary style – artfully belle-lettrist but never artificial, colorful and free-flowing without sacrificing logical form, bold but not bullying in its argumentation, imaginative but not eccentric, sophisticated and complex yet easy to follow, professional but not pedantic, precise enough to satisfy the philosophical expert but sufficiently flexible and broadly comprehensible to convey its message to any intellectual interested in the arts. There is also the vibrant passion that pervades Danto’s aesthetic imagination, a passion as richly inflected with the erotic as the philosophical, fusing his sensuous and intellectual perceptions to make his arguments intriguingly compelling even when their logical architecture seems slim and shadowy in pure conceptual terms.
An Interview With Simon Blackburn
(Via Political Theory Daily Review) the Virtual Philosopher interviews Simon Blackburn:
Nigel: Since returning to England from the States, taking up a professorial chair in Cambridge, you have been prolific as a writer of popular philosophy books: Think, Being Good, Lust, and most recently Truth. Is there a particular reason for this apparent change of direction?
Simon: Actually Think was published a couple of years before I left the States, and Being Good was finished before I did so. So if there was a change in direction, I suppose it was while I was in the States. It is a change of emphasis, I think, more than a change of direction. I have always had a democratic streak: one of my earliest books, published in 1984, was supposed to be an introduction to the philosophy of language (Spreading The Word). But I also like to blend some of my own attempts at philosophy into supposedly introductory books, so for instance that book was quite influential in its moral theory, and to some extent in what it said about other things such as rule-following and truth. I try to keep on publishing "professional" papers while I also produce books like these.
Nigel: David Hume in his essay writing saw himself as an ambassador from the 'dominions of learning to those of conversation'. Is that a position that you now identify with? Who are you writing for? Do you know who reads your books?
Simon: As so often, Hume puts it better than I could myself. Yes, that's an admirable description of a position I identify with. I think professional philosophy can be very odd, very self-contained and narcissistic and quite out of touch with more general cultural currents. Its writings, as Bernard Williams memorably put it, can often resemble "scientific reports badly translated from the Martian". I think good philosophy always has had to take some nourishment from surrounding politics, moral concerns, and science. It may be harder to identify what it returns, but in my books I try at least to exhibit something it can give back.
How Did The Answer To "What to Eat?" Get So Complicated?
Michael Pollan in the NYT Magazine:
Eat food. Not too much. Mostly plants.
That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I’m tempted to complicate matters in the interest of keeping things going for a few thousand more words. I’ll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won’t kill you, though it’s better approached as a side dish than as a main. And you’re much better off eating whole fresh foods than processed food products. That’s what I mean by the recommendation to eat “food.” Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you’re concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it’s not really food, and food is what you want to eat.
Uh-oh. Things are suddenly sounding a little more complicated, aren’t they? Sorry. But that’s how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.
Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing — this from the monumental, federally financed Women’s Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer. Just last fall two prestigious studies on omega-3 fats published at the same time presented us with strikingly different conclusions. While the Institute of Medicine stated that “it is uncertain how much these omega-3s contribute to improving health” (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third — a stunningly hopeful piece of news. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)
3D morphable model face animation
This is a very interesting video presentation I found at Lindsay Beyerstein's Majikthise:
Scientists bridging the spirituality gap
Religion and science can combine to create some thorny questions: Does God exist outside the human mind, or is God a creation of our brains? Why do we have faith in things that we cannot prove, whether it’s the afterlife or UFOs? The new Center for Spirituality and the Mind at the University of Pennsylvania is using brain imaging technology to examine such questions, and to investigate how spiritual and secular beliefs affect our health and behavior. The center, directed by Andrew Newberg, is not a bricks-and-mortar structure but a multidisciplinary team of Penn researchers exploring the relationship between the brain and spirituality from biological, psychological, social and ideological viewpoints.
How does the center test the relationship between the mind and spirituality? In one study, Newberg and colleagues used imaging technology to look at the brains of Pentecostal Christians speaking in tongues — known scientifically as glossolalia — then looked at their brains when they were singing gospel music. They found that those practicing glossolalia showed decreased activity in the brain’s language center, compared with the singing group.
Rare "Prehistoric" Shark Photographed Alive
From The National Geographic:
Flaring the gills that give the species its name, a frilled shark swims at Japan's Awashima Marine Park on Sunday, January 21, 2007. Sightings of living frilled sharks are rare, because the fish generally remain thousands of feet beneath the water's surface. Spotted by a fisher on January 21, this 5.3-foot (160-centimeter) shark was transferred to the marine park, where it was placed in a seawater pool. "We think it may have come to the surface because it was sick, or else it was weakened because it was in shallow waters," a park official told the Reuters news service. But the truth may never be known, since the "living fossil" died hours after it was caught.
January 27, 2007
Surface texture in new building design
Witold Rybczynski in Slate:
The new de Young museum in San Francisco, which opened in 2005, replaces a Spanish-style building that had been insensitively "modernized" in 1949, fatally weakened by the 1989 Loma Prieta earthquake, and finally demolished in 2003. Despite billing itself as a "museum of the 21st century," the new building has many of the hallmarks of a 19th-century art gallery: skylights, separate wings for different collections, landscaped courtyards, and a grand staircase. What is decidedly unconventional, in this age of extrovert cultural institutions, is the exterior appearance, which, from a distance, is slightly forbidding, uncommunicative, almost grim.
Unusual exterior cladding is something of a trademark of the primary design architects, the Swiss firm Herzog & de Meuron, who have previously used skins of twisted copper strips, printed acrylic, and silk-screened glass and concrete. The walls of the de Young are paper-thin copper panels, both dimpled and perforated. The dimples—concave and convex—create changing textures, while the perforations, which vary in size and density, emphasize the thinness and create the effect of a scrim. Like the exteriors of many recent buildings (Zaha Hadid's Rosenthal Center for Contemporary Art in Cincinnati, Daniel Libeskind's Denver Art Museum, Herzog & de Meuron's own addition to the Walker Art Center), the walls neither reveal structure nor express inner organization. Instead, the patterns are purely ornamental, which makes them curiously similar in function to the moldings and decorations on the exteriors of classical buildings. The trend suggests just how far from its Modernist roots the current generation of architects has strayed.
Irene Khan, secretary general of Amnesty International, blogging from the World Economic Forum in Davos:
In true WEF style, a galaxy of stars – from Blair to Bono and Bill Gates, from Thabo Mbeki, the President of South Africa, to Ellen Johnson Sirleaf, the President of Liberia and the first woman head of state in Africa – sat around the table, with lesser luminaries like Paul Wolfowitz, the President of the World Bank, and Joseph Stiglitz, the award-winning economist, in the row directly behind them.
The subject was “Delivering on the Promise of Africa”. While everyone agreed that the promise was being marred by crippling debt, rampant corruption, weak commitment by both western and African leaders and the lack of capacity in Africa, the round table exuded a sense of optimism and hope for the future.
It was an inspiring debate. The two African Presidents and the African NGO representative emphasised the leadership and participation of Africans themselves.
Bono called Africa “an adventure and an opportunity”; for Blair, it was a “strategic interest”. Sadako Ogata, the former UN High Commissioner for Refugees, believed that it was possible to have an African miracle.
It was left to Ogata to remind all of us that Africa must be made refugee-free and that the carnage in Darfur must be stopped.
It was the first time that anyone had mentioned Darfur in Davos. A human rights and humanitarian catastrophe that has left millions displaced, killed and raped with impunity is nowhere to be found on the agenda of the WEF. Business leaders see no commercial interest in that part of Sudan. Political leaders would prefer not to be reminded of their own impotence.
Friends in unlikely places
The superstores are suddenly competing to be green. Can we trust them?
George Monbiot in The Guardian:
You batter your head against the door until you begin to wonder whether it is a door at all. Suddenly it opens, and you find yourself flying through space. The superstores’ green conversion is astonishing, wonderful, disorienting. If Tesco and Walmart have become friends of the earth, are any enemies left?
These were the most arrogant of the behemoths. They have trampled their suppliers, their competitors and even their regulators. They have smashed local economies, broken the backs of the farmers, forced their contractors to drive down wages, shrugged off complaints with a superciliousness born of the knowledge that they were unchallengeable. For them, it seemed, there was no law beyond the market, no place too precious to be destroyed, no cost they could not pass to someone else.
We environmentalists developed a picture of the world which seemed to be repeatedly confirmed by experience. Big corporations destroy the environment. They are the enemies of society. The bigger they become, the less they can be constrained by either democracy or consumer power. The politics of scale permit them to bully governments, tear up standards, reshape the world to suit themselves.
Anyone with a claim to literacy is familiar with the names of Tolstoy, Turgenev, and Dostoevsky, and can cite some of the titles of their most famous works. But Goncharov and his novel Oblomov, of which a new translation, a snappily colloquial and readable one, has just been published--who ever heard of them? Well, Beckett for one, who was told to read Oblomov by his mistress Peggy Guggenheim, and soon signed some of his letters to her with this cognomen. I recall my teacher at the University of Chicago long ago, the renowned classicist David Grene, who had been a fellow student of Beckett's at Trinity College, Dublin, telling me that the future famous writer was well-known as a very late riser and missed classes for this reason. Since the main character of Oblomov also finds it very difficult to leave his couch--whether he succeeds in doing so or not (literally as well as symbolically) constitutes the main thread of the extremely tenuous action of the novel--Beckett's instant attraction to this character is easily comprehensible. There is also good reason to believe that the figure in Waiting for Godot bearing the Russian name of Vladimir is a tribute to this unexpectedly Slavic aspect of Beckett.
more from TNR here.
The Jihadism of Fools
Over the last few years, and especially since the American invasion of Iraq in March 2003, there have been indications across the world of a growing convergence between the forces of Islamist militancy, on the one hand, and the 'anti-imperialist' left on the other. Leaving aside widespread, if usually unarticulated, sympathy for the attacks of September 11, 2001, justified on the grounds that "the Americans deserved it," we have seen since 2003 an overt coincidence of policies, with considerable support for the Iraqi "resistance," which includes strong Islamist elements, and, more recently and even more explicitly, support for Hezbollah in Lebanon. In the Middle East itself, and on parts of the European far left, an overt alliance with Islamists has been established, going back at least to the mass demonstrations in early 2003 that preceded the Iraq War, but also including a convergence of slogans on Palestine--supporting suicide bombings and denying the legitimacy of the Israeli state. Last year, for example, radical Basque demonstrators were preceded by a militant waving a Hezbollah flag. Moreover, since most of those who oppose the U.S. action in Iraq of 2003 also opposed the war in Afghanistan in 2001, this leads, whether clearly recognized or not, to support for the anti-Western Taliban, armed groups now active across that country.
more from Dissent here.
may this day be as much as possible unlike a dog!
Kenneth Koch once said of his early poems that he wanted to keep the subject in the air as much as possible, because any subject, according to Wittgenstein, is a limitation of the world. But Koch, who died in 2002, knew that a subject emerges sooner or later, so it makes sense that he would choose happiness, an expansive one, as his own. In the long poem “Seasons on Earth,” he writes that he thought about happiness “As being at one’s side, so that one [had] but / To bend or turn to get to it . . .” This obsession, amply displayed in the near-simultaneous publication of his Collected Poems and Collected Fiction, began as a reaction to the suffocating aesthetic of what he saw as New Critical drips—poets Koch referred to as “the castrati”—who thought suffering the only form of intense feeling. For Koch, it was unethical to deny any part of experience, which is why his poems ask us time and again what it means to lead a good life—with “good” meaning both pleasurable and ethical. In the broad view of his career that these new books afford, we see Koch preserving the comic as a serious way of arriving at “ecstasy, unity, freedom, completeness, dionysiac things,” poetic ambitions he talks about in a 1995 interview with the poet Jordan Davis, his student and the editor of his Collected Fiction.
more from Boston Review here.
Thomas Hardy’s English Lessons
When Thomas Hardy drew his first chancy breaths inside a Dorset cottage in 1840, Wordsworth had yet to become England’s poet laureate. By his ninth decade, still writing, Hardy enjoyed listening to the wireless with his dog, Wessex, and had seen the silent-film adaptation of his own “Tess of the D’Urbervilles.” The author’s life span seems somehow even vaster than it was, a match for the cosmically long view Hardy took of his fictional characters, fate’s playthings set in motion on a “blighted star.”
This new biography makes its subject a fascinating case study in mid-Victorian literary sociology. Hardy struggles — with an industriousness befitting the age — against editorial rejection, rapacious contract terms and enforced prudery. Leslie Stephen, known chiefly to the 21st century as Virginia Woolf’s father, edited his magazine, The Cornhill, under the watchful, prissy eyes of so many others that he sometimes made “few suggestions beyond bowdlerizations” when working on Hardy’s copy. Serialization often forced the author “to pack in far too much plot” and thereby throw novels like “The Mayor of Casterbridge” significantly off-kilter. Finally, there were reviewers to contend with; Hardy remained overly sensitive to all they had to say.
The debt to pleasure
From The Guardian:
It was on a long train journey that I first read Junichiro Tanizaki's novella Diary of a Mad Old Man (1961). It had been sent to me by an American friend who knew I'd just read Tanizaki's novel The Key (1956). The Key concerns a middle-aged, ordinary couple with an adult daughter who still lives with them. From the ruins of what appears to be a long-dead marriage, something starts to stir. We like to believe - it is a common misconception - that erotic relationships only deteriorate, that there is nothing new that can happen between a long-established couple. This is something we are so certain of that it must be incorrect. A deep involvement may become so distressingly pleasurable that we might feel dangerously addicted. As such a relationship develops, distance might be required, as the relationship begins to feel dangerous, even incestuous.
The novel opens with a middle-aged man drugging his sexually cold wife in order to spend more time with her feet. The sexuality of both of them is in the process of being re-aroused by the constant presence in their house of their daughter's fiancé. Here jealousy makes passion possible. As Lacan puts it, "The other holds the key to the object desired."
John Allen Paulos on Health, Wealth and Happiness
From ABC News:
First wealth. A recently released study says that the inequality in wealth throughout the world is extreme and growing more so. The report by the World Institute for Development Economics Research of the United Nations University paints an informative picture of the world's wealth distribution as of 2000, the last year for which figures are available. It states that the top 1% of the world's population - about 37 million adults - had net assets (note: not income) worth at least $500,000. This constituted approximately 40% of the world's assets.
In contrast, the bottom 50% of the world's people owned barely 1% of the wealth; the median wealth (half the world's population having more, half less) was just over $2,000. To be in the top 10% required net assets worth about $60,000.
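Paulos's point is easier to see when the two statistics are computed side by side. A minimal sketch, using a hypothetical toy population rather than the WIDER study's data, shows how a median wealth and a top percentile's share of total wealth fall out of the same list of per-adult net-worth figures:

```python
# Illustrative sketch with made-up numbers (not the study's data):
# compute the median and the share of total wealth held by the
# richest p percent of a small population of adults.

def median_wealth(wealths):
    """Median of a list of per-adult net-worth figures."""
    s = sorted(wealths)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def top_share(wealths, p):
    """Fraction of total wealth held by the richest p percent of adults."""
    s = sorted(wealths, reverse=True)
    k = max(1, round(len(s) * p / 100))
    return sum(s[:k]) / sum(s)

# Hypothetical ten-adult population, net worth in dollars:
pop = [100, 300, 700, 1500, 2500, 4000, 8000, 20000, 60000, 500000]
print(median_wealth(pop))            # 3250.0 (midpoint of 2500 and 4000)
print(round(top_share(pop, 10), 2))  # 0.84: the richest 10% hold ~84%
```

Even in this toy case the median is tiny relative to the mean (about $60,000 here), which is the shape of the skew the UN report describes on a global scale.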
The Simon Wiesenthal Center may be going too far by trying to build a museum on a Muslim cemetery in Jerusalem
Daniela Yanai in the Los Angeles Times:
Last week, Israel's High Court of Justice ordered Los Angeles' Simon Wiesenthal Center and the municipality of Jerusalem to explain why they should be allowed to construct a new Museum of Tolerance on the site of an ancient Muslim cemetery.
On the surface, it's a straightforward enough question. But it's really about more than the fate of one cemetery and whether it should be preserved. What is at stake is the nature of both peoples' claims, Palestinian and Israeli, to Jerusalem.
The site of the museum is in the heart of downtown Jerusalem, on a parking lot next to the city's Independence Park. Designed by architect Frank Gehry and kicked off in 2004 with a visit by California Gov. Arnold Schwarzenegger, the museum (a sister, of sorts, to the one of the same name in Los Angeles) seems, at first glance, like a welcome initiative. In a region wracked by intolerance, what better way to improve the chances for peace than to teach people about different cultures?
But the museum itself became a test case for tolerance when bulldozers digging its foundation unearthed human remains last year, and the project has been mired in legal disputes ever since. Even though archeologists and historians knew that the site was on top of an ancient cemetery — parts of which are visible just adjacent to the site — spokespeople for the Jerusalem municipality claimed that the discovery of remains came as a surprise.
Mexico in Ruins
My friend and 3QD contributor, J. M. Tyree, has sent the following message:
Joseph Pearson, a great friend and very talented Canadian writer, has a new essay up on the AGNI site about a visit to Mexico City:
Around the doors of the Hotel Gran Ciudad de México, with its wondrous Art-Nouveau dome, are younger men in the casual uniform of Lacoste gold shirts tucked into khaki pants. They sport expensive watches in a city where wearing a plastic one is the only sure method to prevent being mugged. They pile into a taxi, and a tickle of fear rises up my neck. Disparities of wealth exist everywhere, but they are rarely so visible as in a city without a real middle class. I am afraid not because Mexico is unique in its share of misery. It's not. I am afraid because Mexico is the future.
I highly recommend the rest of the essay, "Mexico in Ruins" (particularly its take on gay bars in Mexico City), as well as AGNI's very fine online-only journal in general, which offers free subscriptions by email and showcases emerging, often younger writers every month.
Thanks, J. M.
January 26, 2007
Paul Krugman on Milton Friedman
From the New York Review of Books:
Keynesianism was a great reformation of economic thought. It was followed, inevitably, by a counter-reformation. A number of economists played important roles in the great revival of classical economics between 1950 and 2000, but none was as influential as Milton Friedman. If Keynes was Luther, Friedman was Ignatius of Loyola, founder of the Jesuits. And like the Jesuits, Friedman's followers have acted as a sort of disciplined army of the faithful, spearheading a broad, but incomplete, rollback of Keynesian heresy. By the century's end, classical economics had regained much though by no means all of its former dominion, and Friedman deserves much of the credit.
I don't want to push the religious analogy too far. Economic theory at least aspires to be science, not theology; it is concerned with earth, not heaven. Keynesian theory initially prevailed because it did a far better job than classical orthodoxy of making sense of the world around us, and Friedman's critique of Keynes became so influential largely because he correctly identified Keynesianism's weak points. And just to be clear: although this essay argues that Friedman was wrong on some issues, and sometimes seemed less than honest with his readers, I regard him as a great economist and a great man.
The Lies of Ryszard Kapuściński
Jack Shafer in Slate:
John Updike worshipped him. Gabriel García Márquez tagged him "the true master of journalism." But there's one fact about the celebrated war correspondent and idol of New York's literary class that didn't get any serious attention this week. It's widely conceded that Kapuściński routinely made up things in his books. The New York Times obituary, which calls Kapuściński a "globe-trotting journalist," negotiates its way around the master's unique relationship with the truth diplomatically, stating that his work was "often tinged with magical realism" and used "allegory and metaphors to convey what was happening."
Scratch a Kapuściński enthusiast and he'll insist that everybody who reads the master's books understands from context that not everything in them is to be taken literally. This is a bold claim, as Kapuściński's work draws its power from the fantastic and presumably true stories he collects from places few of us will ever visit and few news organizations have the resources to re-report and confirm. If Kapuściński regularly mashes up the observed (journalism) with the imagined (fiction), how certain can we be of our abilities to separate the two while reading?
Should we regard Kapuściński's end product as journalism? Should we give Kapuściński a bye but castigate Stephen Glass, who defrauded the New Republic and other publications by doing a similar thing on a grosser scale? Do we cut Kapuściński slack because he was better at observing, imagining, and writing than Glass, and had the good sense to write from exotic places? Exactly how is Kapuściński different from James Frey in practice if not in execution?
US military unveils heat-ray gun
From the BBC:
A beam was fired from a large rectangular dish mounted on a Humvee vehicle.
The beam has a reach of up to 500m (550 yds), much further than existing non-lethal weapons like rubber bullets.
It can penetrate clothes, suddenly heating up the skin of anyone in its path to 50C.
But it penetrates the skin only to a tiny depth - enough to cause discomfort but no lasting harm, according to the military.
A Reuters journalist who volunteered to be shot with the beam described the sensation as similar to a blast from a very hot oven - too painful to bear without diving for cover.
Enlightenment fundamentalism or racism of the anti-racists?
From Sign and Sight:
Pascal Bruckner defends Ayaan Hirsi Ali against Ian Buruma and Timothy Garton Ash, condemning their idea of multiculturalism for chaining people to their roots.
"What to say to a man who tells you he prefers to obey God than to obey men, and who is consequently sure of entering the gates of Heaven by slitting your throat?" - Voltaire
"Colonisation and slavery have created a sentiment of culpability in the West that leads people to adulate foreign traditions. This is a lazy, even racist attitude." – Ayaan Hirsi Ali
There's no denying that the enemies of freedom come from free societies, from a slice of the enlightened elite who deny the benefits of democratic rights to the rest of humanity, and more specifically to their compatriots, if they're unfortunate enough to belong to another religion or ethnic group. To be convinced of this one need only glance through two recent texts: "Murder in Amsterdam" by the British-Dutch author Ian Buruma on the murder of Theo Van Gogh (1) and the review of this book by English journalist and academic Timothy Garton Ash in the New York Review of Books (2). Buruma's reportage, executed in the Anglo-Saxon style, is fascinating in that it gives voice to all of the protagonists of the drama, the murderer as well as his victim, with apparent impartiality. The author, nevertheless, cannot hide his annoyance at the former Dutch member of parliament of Somali origin, Ayaan Hirsi Ali, a friend of Van Gogh's and also the subject of death threats. Buruma is embarrassed by her critique of the Koran.
Garton Ash is even harder on her. For Garton Ash, himself an apostle of multiculturalism, Hirsi Ali's attitude is both irresponsible and counter-productive. His verdict is implacable: "Ayaan Hirsi Ali is now a brave, outspoken, slightly simplistic Enlightenment fundamentalist." (3) He backs up his argument with the fact that this outspoken young woman belonged in her youth to the Muslim Brotherhood in Egypt. For Garton Ash, she has merely exchanged one credo for another, trading fanaticism for the prophet for fanaticism for reason.
Steven Pinker on thought and metaphor
Peter Calamai in the Toronto Star:
"We have to do two things with language. We've got to convey a message and we've got to negotiate what kind of social relationship we have with someone," Pinker says in a telephone interview from his home in Cambridge, Mass.
Even something as seemingly straightforward as asking for the salt involves thinking and communicating at two levels, which is why we utter such convoluted requests as, "If you think you could pass the salt, that would be great."
Says Pinker: "It's become so common that we don't even notice that it is a philosophical rumination rather than a direct imperative. It's a bit of a social dilemma. On the one hand, you do want the salt. On the other hand, you don't want to boss people around lightly.
"So you split the difference by saying something that literally makes no sense while also conveying the message that you're not treating them like some kind of flunky."
The Harvard psychologist classes the salt request as an example of indirect speech, a category that also includes euphemisms and innuendo. Two other key themes for Wednesday's talk are the ubiquity of metaphor in everyday language, and swearing and what it says about human emotion.
how the left went berserk
The anti-war movement disgraced itself not because it was against the war in Iraq, but because it could not oppose the counter-revolution once the war was over. A principled left that still had life in it and a liberalism that meant what it said might have remained ferociously critical of the American and British governments while offering support to Iraqis who wanted the freedoms they enjoyed.
It is a generalisation to say that everyone refused to commit themselves. The best of the old left in the trade unions and parliamentary Labour party supported an anti-fascist struggle, regardless of whether they were for or against the war, and American Democrats went to fight in Iraq and returned to fight the Republicans. But again, no one who looked at the liberal left from the outside could pretend that such principled stands were commonplace. The British Liberal Democrats, the continental social democratic parties, the African National Congress and virtually every leftish newspaper and journal on the planet were unable to accept that the struggle of Arabs and Kurds had anything to do with them. Mainstream Muslim organisations were as indifferent to the murder of Muslims by other Muslims in Iraq as in Darfur. For the majority of world opinion, Blair's hopes of 'giving people oppressed, almost enslaved, the prospect of democracy and liberty' counted for nothing.
more from The Observer here.
coetzee on mailer
The lesson that Adolf Eichmann teaches, wrote Hannah Arendt at the conclusion of Eichmann in Jerusalem, is of "the fearsome, word-and-thought-defying banality of evil" (Arendt's italics). Since 1963, when she penned it, the formula "the banality of evil" has acquired a life of its own; today it has the kind of clichéd currency that "great criminal" had in Dostoevsky's day.
Mailer has repeatedly in the past voiced his suspicion of this formula. As a secular liberal, says Mailer, Arendt is blind to the power of evil in the universe. "To assume...that evil itself is banal strikes me as exhibiting a prodigious poverty of imagination." "If Hannah Arendt is correct and evil is banal, then that is vastly worse than the opposed possibility that evil is satanic"—worse in the sense that there is no struggle between good and evil and therefore no meaning to existence.
It is not too much to say that Mailer's quarrel with Arendt is a running subtext to The Castle in the Forest. But does he do justice to her? In 1946 Arendt had an exchange of letters with Karl Jaspers sparked by his use of the word "criminal" to characterize Nazi policies. Arendt disagreed. In comparison with mere criminal guilt, she wrote to him, the guilt of Hitler and his associates "oversteps and shatters any and all legal systems."
more from the NY Review of books here.