Wednesday, January 31, 2007
Hiding From the Truth
Philip Kitcher in the Oxford University Press blog:
[Kitcher is the John Dewey Professor of Philosophy at Columbia University. Living With Darwin: Evolution, Design, and the Future of Faith, Kitcher's most recent book, is both a defense of Darwin and an exploration of the meaning behind the clash of religion and modern science. Kitcher is also the author of Abusing Science: The Case Against Creationism, The Lives to Come: The Genetic Revolution and Human Possibilities, Vaulting Ambition: Sociobiology and the Quest for Human Knowledge, Science, Truth, and Democracy, and In Mendel's Mirror. In the article below Kitcher explores how easy it is to hide from the truth.]
Finally, in his State of the Union message, President Bush acknowledged that climate change is a problem. Whether he understands the magnitude of the problem or is prepared for the kinds of measures that are needed to address it remains unclear. But, from many Americans, and especially from people in other countries who have been concerned about global warming for many years, there have been huge sighs of relief. At the same time, there’s an obvious question – why has it taken so long?
The broad outlines of the answer are fairly clear. During recent years, some writers whose conclusions appeal to the values of the President and his advisers have muddied the waters about climate change. They have employed familiar tactics, casting doubt on any consensus among experts by ignoring the large agreements and concentrating on those places where scientists debate the details. Structurally, the case is much like the long-running battle about evolution: you make it seem as though there is no consensus by judiciously quoting from researchers who are actively involved in discussing unsettled questions, but who agree in a fundamental core framework that you don’t bother to mention.
Behind these two examples lies a deeper problem about the ways the achievements of the sciences are received in American society.
Essay Linking Liberal Jews and Anti-Semitism Sparks a Furor
Patricia Cohen in the New York Times:
The American Jewish Committee, an ardent defender of Israel, is known for speaking out against anti-Semitism, but this conservative advocacy group has recently stirred up a bitter and emotional debate with a new target: liberal Jews.
An essay the committee features on its Web site, ajc.org, titled “ ‘Progressive’ Jewish Thought and the New Anti-Semitism,” says a number of Jews, through their speaking and writing, are feeding a rise in virulent anti-Semitism by questioning whether Israel should even exist.
In an introduction to the essay, David A. Harris, the executive director of the committee, writes, “Perhaps the most surprising — and distressing — feature of this new trend is the very public participation of some Jews in the verbal onslaught against Zionism and the Jewish State.” Those who oppose Israel’s basic right to exist, he continues, “whether Jew or gentile, must be confronted.”
Muhammad Ali: The Brand and the Man
Dave Zirin in The Nation:
Muhammad Ali's brilliance was not that he was some kind of antiwar prophet. He wasn't Malcolm X or Martin Luther King Jr. in boxing gloves, debating foreign policy between rounds. But unlike the Ivy League advisers who made up the "best and brightest," Ali understood then that there was justice and injustice, right and wrong. He knew that not taking a stand could be as political a statement as taking one. This was Ali's code, and he never wavered.
In early 1966 the US Army came calling for Ali, and he was classified 1-A for the draft. He got the news surrounded by reporters and blurted one of the most famous phrases of the decade, "Man, I ain't got no quarrel with them Vietcong."
This was an astounding statement. As Mike Marqusee outlines in his Redemption Song: Muhammad Ali and the Spirit of the Sixties, there was little opposition to the war at the time. The antiwar movement was in its infancy, and most of the country still stood behind the President. Life magazine's cover read, "Vietnam: The War Is Worth Winning." The song "Ballad of the Green Berets" was climbing the charts. And then there was Ali. As longtime peace activist Daniel Berrigan said, "It was a major boost to an antiwar movement that was very white. He was not an academic or a bohemian or a clergyman. He couldn't be dismissed as cowardly."
Worldmapper Maps Health
Infant mortality refers to the deaths of babies during the first year of life. In 2002 there were 7.2 million infant deaths worldwide; 5.4% of all babies born died within their first year, including 2.3% in their first week.
The territory with the most infant deaths was India, at 1.7 million, or 24% of the world total. In India, for every 100 babies born alive, almost 7 die in the following 12 months.
In 22 territories the rate is over 1 infant death for every 10 live births. All of these 22 territories are in Africa. The highest infant mortality rate is in Sierra Leone, where 16.5 of every 100 babies born alive die.
Territory size shows the proportion of infant deaths worldwide that occurred there in 2002.
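As a quick sanity check (mine, not Worldmapper's), the quoted figures are mutually consistent; a few lines of Python, using only the numbers above, recover the implied totals:

```python
# Back-of-the-envelope check of the Worldmapper figures quoted above.
infant_deaths_world = 7.2e6   # infant deaths worldwide in 2002
death_rate_world = 0.054      # 5.4% of babies born died in their first year

# Implied number of live births worldwide: ~133 million.
births_world = infant_deaths_world / death_rate_world
print(f"implied births worldwide: {births_world / 1e6:.0f} million")

# India's share of the world total: 1.7 million of 7.2 million is ~24%.
india_deaths = 1.7e6
print(f"India's share of infant deaths: {india_deaths / infant_deaths_world:.0%}")

# "Almost 7 of every 100" implies roughly 24-25 million births in India.
india_death_rate = 0.07
print(f"implied births in India: {india_deaths / india_death_rate / 1e6:.0f} million")
```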
What Non-Human Primates Tell Us About Religion
In Salon, an interview with Barbara J. King, author of Evolving God: A Provocative View on the Origins of Religion (via Political Theory Daily Review):
Every human culture has believed in spirits, gods or some other divine being. That's why human beings have often been called Homo religiosus. Some people take this long history of belief in the otherworldly as evidence for God; doesn't it explain why religion continues to be so pervasive? But many scientists are coming up with their own, decidedly secular, theories about the origins of faith. In fact, over the last few years, a small cottage industry made up of scientists and philosophers has devoted itself to demystifying the divine.
Take Daniel Dennett, the philosopher who has proposed that religion is a meme -- an idea that evolved like a virus -- that infected our ancestors and continued to spread throughout cultures. By contrast, anthropologist Pascal Boyer argues that religious belief is a quirky byproduct of a brain that evolved to detect predators and attend to other survival needs. In this view, the brain developed a hair-trigger detection system to believe the world is full of "agents" that affect our lives. And British biologist Lewis Wolpert, with yet another theory, posits that religion developed once hominids understood cause and effect, which allowed them to make complex tools. Once they started to make causal connections, they felt compelled to explain life's mysteries. Their brains, in essence, turned into "belief engines."
Of course, these thinkers are either religious skeptics or outright atheists who mean to imply that we've been duped by evolution to believe in supernatural beings when none, in fact, exist. That's what makes Barbara J. King, an anthropologist at the College of William and Mary, so unique. She has no desire to undermine religion. In fact, she's been deeply influenced by the religious writers Karen Armstrong and Martin Buber. But her main insights about the origins of religion come not from researching humans' deep history, but from observing very much alive non-human primates.
Why Men and Women Don't Want Sex
Dr. Helen scans the comments on a WebMD post about the different reasons men and women don't want sex and concludes:
Update: A Men's News Daily commenter to this post writes the following:
"Never forget: the single most revolting image, the nightmare that haunts women, is that of the happy, grinning, sexually satisfied male. They really hate that and the sooner we adjust our social expectation to that fact, the better." Truer words were never spoken--I think that some women really do feel this way.
Jill at Feministe responds:
Yes, women do secretly hate the idea of our partners being happy. You’ve got us all figured out.
The double-standard here is amazing. From the letters Dr. Helen quotes, it’s pretty clear that many women are refusing sex because they aren’t enjoying it, or because there are other issues within the relationship that are leaking over into their sex lives. But clearly, they’re just being selfish by not allowing their husbands unrestrained sexual access, even if the sex sucks, or is painful, or is unwanted. As usual, the mens are not doing anything that needs re-evaluating.
holding fast to the prism of her very soul
If Avedon provided the tools, it was Susan Sontag who gave Leibovitz a fresh sense of how she could use them as an autonomous artist. In retrospect, that a high-profile photographer of Leibovitz’s calibre should form an alliance with an intellectual as illustrious as Sontag is perfectly logical. After all, Walker Evans and James Agee formed an influential collaboration in the heyday of “documentary style” photography (Evans’s own term). The turn-of-the-21st-century twist is that Leibovitz and Sontag are women – and that all aspects of their personal, creative, and intellectual lives were intertwined during the fifteen-year period of their relationship.
Leibovitz’s knowing, “commercial” style stands out in a museum context. The best example is her witty color portrait of the Bush Administration, Cabinet Room (2001). A straight photograph and a public image, it’s also stupendously ironic. Bush, Rice, and the rest of them look like a band posing for a 1970s album cover. But the exhibition reveals that Leibovitz has mastered other modes. Her work shifts from creative service in the political and entertainment industries to photojournalism, as in Traces of the Massacre of Tutsi Schoolchildren and Villagers on a Bathroom Wall (1994), to tender family portraiture. Her soft-focus landscape photography of the American west and vast terrain in other locations includes a picture of Mt. Vesuvius, echoing Sontag’s novel The Volcano Lover.
more from Artcritical here.
Cue uproarious laughter
Interviewing two people at the same time is never easy, but Gilbert and George, a retrospective of whose work opens at Tate Modern next month, take the thing (and of course they're perfectly aware of this) to a whole new level. Ask a question and, to your right, George will offer some piece of gnomic wisdom topped off with a dash of mild smut while, to your left, Gilbert will titter or splutter or make his own naughty joke in an effort to back up his friend. Then, as you struggle to grasp what it is that they actually mean, the two of them will fall eerily silent. Their marmoset eyes are always on you, which would be scary if they weren't so invincibly charming. George, in particular, has the kind of manners - if you ignore the smut - that one might have found behind the discreet rosewood counter of a gentleman's outfitter, circa 1935.
more from The Guardian here.
Being and Laziness
From The New Republic:
Anyone with a claim to literacy is familiar with the names of Tolstoy, Turgenev, and Dostoevsky, and can cite some of the titles of their most famous works. But Goncharov and his novel Oblomov, of which a new translation, a snappily colloquial and readable one, has just been published -- who ever heard of them?
Open any Russian dictionary and you will find the word oblomovshchina, defined, in the first one that comes to hand, as "carelessness, want of energy, laziness, negligence," and specifying its origin in Goncharov's novel, where the word itself is used. Scarcely any other novelist, Russian or otherwise (except perhaps Cervantes), could boast of having created a character whose attributes have left such an indelible impression on the vocabulary, and on the national psyche, of his country.
So who was Ivan Goncharov, and why has the character he created taken on such ineradicably symbolic proportions? He came from a very prosperous merchant family, and was one of the few Russian writers of this period descended from such a background. He was known for his shy and retiring personality, and such reticence may well be attributed to a lingering uneasiness about his status in the carefully delineated Russian caste society.
The late Carl Sagan on questions of science and faith
From The Washington Post:
In 1877, the Italian astronomer Giovanni Schiaparelli was looking at Mars through his new telescope, and he noticed intricate etchings in the equatorial region of the planet's surface. Schiaparelli called these lines canali, by which he probably meant something like "gullies" or "grooves," but his coinage got wrongly translated into English as "canals." It was a regrettable linguistic slip.
The idea of Martian canals grabbed the imagination of American astronomer Percival Lowell, scion of the famous Boston Lowell clan, who spun out an elaborate story of a Martian civilization with a central planetary government and the technological wizardry to engineer a massive system of aqueducts. Lowell even used his own Arizona observatory to identify the Martian capital, called Solis Lacus.
There are no canals on Mars. No cities either, and no government. Indeed, no signs of past life whatsoever, as we know today. All of this was an elaborate phantasm of Lowell's fertile mind, yet as late as the 1950s, popular culture was saturated with imagery of Martians as a technologically advanced extraterrestrial race. The late Carl Sagan used the misbegotten tale of Martian engineers, in his 1985 Gifford Lectures in Natural Theology at the University of Glasgow, as a cautionary tale about the power of belief and yearning to trump science and reason.
Tuesday, January 30, 2007
Samir El-youssef: At home with the heretic
Samir El-youssef, raised in a refugee camp, grew up into a writer who challenges the myths of Palestinian politics. Matthew J Reisz meets a trouncer of taboos.
From The Independent:
El-youssef has a Sunni father, but his mother comes from the only Shi'ite Palestinian family. This, he believes, "has contributed to the diversity of my understanding of things - from the beginning you are aware of yourself as someone different". Although he has contributed many articles to the London-based Arabic newspaper Al-Hayat, his criticisms of the second intifada and the Arab policy of "non-normalisation" in relation to Israel have sometimes proved too controversial to be published.
"We have to meet up with the Israelis and have a dialogue with them," he explains. "The idea of not meeting is simply childish and stupid. But it is not easy to express your views. You can be branded a 'Zionist' or a 'traitor' simply for not parroting the same old slogans."
His own social circle consists largely of liberal British Jews and Israelis. Asked about his outspoken opposition to the academic boycott of Israel, he responds cheerfully: "What hope do we have if we as writers don't speak to each other? Do we really think our idiotic leaders are going to sort things out?"
Michael Chabon on Cormac McCarthy's new novel
A review of The Road by McCarthy, from the New York Review of Books:
Charlton Heston and a savagely coiffed vixen, wrapped in animal skins, riding horseback along a desolate seashore, confronted by the spike-crowned ruin of the Statue of Liberty half buried in the sand: everyone knows how the world ends. First radiation, plague, an asteroid, or some other cataclysm kills most of humankind. The remnants mutate, lapse into feudalism, or revert to prehistoric brutality. Old cults are revived with their knives and brutal gods, while tiny noble bands cling to the tatters of the lost civilization, preserving knowledge of machinery, agriculture, and the missionary position against some future renascence, and confronting their ancestors' legacy of greatness and destruction.
Ambivalence toward technology is the underlying theme, and thus we are accustomed to thinking of stories that depict the end of the world and its aftermath as essentially science fiction. These stories feel like science fiction, too, because typically they deal with the changed nature of society in the wake of cataclysm, the strange new priesthoods, the caste systems of the genetically stable, the worshipers of techno-death, the rigid pastoral theocracies in which mutants and machinery are taboo, etc.; for inevitably these new societies mirror and comment upon our own. Science fiction has always been a powerful instrument of satire, and thus it is often the satirist's finger that pushes the button, or releases the killer bug.
This may help to explain why the post-apocalyptic mode has long attracted writers not generally considered part of the science fiction tradition. It's one of the few subgenres of science fiction, along with stories of the near future (also friendly to satirists), that may be safely attempted by a mainstream writer without incurring too much damage to his or her credentials for seriousness.
Sending a man to the moon was an immensely expensive distraction of little scientific or cultural worth
Greg Ross interviews Gerard J. DeGroot, author of Dark Side of the Moon: The Magnificent Madness of the American Lunar Quest, in American Scientist:
To Americans in the 1960s, putting a man on the Moon was a noble, even romantic challenge. "No single space project in this period will be more impressive to mankind," President Kennedy told Congress, "or more important in the long-range exploration of space, and none will be so difficult or expensive to accomplish."
But in re-examining the Apollo project, historian Gerard J. DeGroot finds it largely an empty dream. In Dark Side of the Moon: The Magnificent Madness of the American Lunar Quest (New York University Press), he argues that the Moon race was essentially just a new front in the Cold War, "an immensely expensive distraction of little scientific or cultural worth."
In announcing the Apollo project, Kennedy referred to moving with what he called "the full speed of freedom." Do you think he saw it chiefly as a scientific endeavor, or really as a symbolic contest of ideologies?
I think very definitely the latter. It's very difficult for some people even still, given Kennedy's mystique, to accept that he wasn't quite the person we thought he was. I think the really telling bit comes in a conversation that he has with the NASA administrator James Webb, in which he says, "I don't really care about the moon. I know it's important; I know there are people who really want to go there, but I just want to beat the Russians." So it really comes down to that. It is purely a symbol of American supremacy in the Cold War. Because the Cold War didn't provide real wars, this is in a sense a sort of surrogate war, and almost seemingly chosen with the same sort of cavalier attitude that, say, a Civil War general might choose a battlefield: "Well, we're here, let's fight right here."
Spy Princess: The Life of Noor Inayat Khan
Ruchira Paul in Accidental Blogger:
In 1912, a flamboyant "oriental style" dancer with the exotic name of Mata Hari (mother of god in Hindi) was the toast of Paris night clubs. A traveling musical group from India, The Royal Musicians of Hindustan, was in Paris that year. Mata Hari performed with this group. The group's lead singer was a handsome and serious young man named Inayat Khan. He belonged to an accomplished Indian musical family from Baroda and was trained in Indian classical music and the sufi philosophical tradition. The glamorous and famous Mata Hari later went on to become a French spy (some say a German double agent) during World War I - not the most sensible career choice for someone who sought publicity relentlessly. Little did the gentle Inayat Khan know that one day his own daughter would follow in the footsteps of the notorious Mata Hari and meet an equally tragic (but more honorable) fate.
Inayat Khan traveled the world with his musical group and introduced the pacifist sufi philosophy to western audiences. During a tour of the United States, he met, fell in love with and married Ora Ray Baker. In 1914 their oldest daughter, Noorunnisa Inayat Khan (Noor), was born in the Kremlin in Moscow. The family lived in England and France. From all accounts, Noor and her siblings were brought up in a household bearing both eastern and western traditions. Despite European influences on the children's upbringing, the cultured and conservative lifestyle of the Khan family was in keeping with Indian Muslim tradition. (Her American-born mother had converted to Islam and adopted the name Amina Begum.) Noor was trained in classical Indian and western music, playing the sitar, piano, cello and violin. She studied child psychology at the Sorbonne and music at the Paris Conservatory.
More here. [Photo shows Noor Inayat Khan.]
Le Grand Content: Examining the omnipresent Powerpoint-culture in search for its philosophical potential...
A short film, via Tony Cobitz's blog, Mtanga:
Paul Auster should not exist
Paul Auster should not exist. I say this not to mimic a sentence that might easily have been plucked from one of his own hall-of-mirrors fictions, but simply to note his singular position in contemporary American letters. He has enjoyed unlikely success by writing reflexive novels that take up notions of chance and fate, memory and oblivion, luck and the uncanny; given his self-referential leanings and taste for highbrow allusion, it might seem that he would at best have found a coterie of admirers and a university appointment to subsidize his writing. Instead, he has settled comfortably into a career as one of the most glamorous novelists in America. Abroad, he has even higher visibility, a genuine rock-star aura. Magazine profiles cite his movie-idol looks and general air of suave elegance, and although Park Slope, the Brooklyn neighborhood where he lives, may now be home to more writers than any other urban enclave on the planet, he stands out in his affiliation with the place as one of its presiding celebrities.
He has branched out into subsidiary projects as a radio personality (having headed up, a few years back, NPR's National Story Project, which solicited anecdotal tales from listeners nationwide, later collected in the anthology I Thought My Father Was God) and a screenwriter and film director: best known in this regard for his screenplay for 1995's Smoke (directed by Wayne Wang), Auster has written and directed the rather stilted Lulu on the Bridge (1998) and the just-completed The Inner Life of Martin Frost, based on material from his novel The Book of Illusions (2002).
His work has also proliferated into media of unimpeachable hipness: Paul Karasik and David Mazzucchelli adapted City of Glass (1985), the first book in Auster's New York Trilogy, into a graphic novel in 1994, and the beguiling, mischievous French artist Sophie Calle has realized conceptual pieces based on his writings. These extraliterary manifestations contribute to a highly resilient cultural persona, gracing him, if you will, with a street credibility among chic young bookish types that has sustained Auster through an uneven career.
more from Bookforum here.
the permanent night-time of his elected trade
When John le Carré published A Perfect Spy in 1986, Philip Roth, then spending a lot of time in London, called it ‘the best English novel since the war’. Not being such a fan of A Perfect Spy, I’ve occasionally wondered what Roth’s generous blurb says about the postwar English novel. As a le Carré bore, however, I’ve also wondered how Roth managed to overlook Tinker Tailor Soldier Spy (1974), the central novel in le Carré’s career, in which George Smiley – an outwardly diffident ex-spook with a strenuously unfaithful wife and an interest in 17th-century German literature – comes out of retirement to identify the turncoat in a secret service that’s explicitly presented as a metaphorical ‘vision of the British establishment at play’. If you sit up late enough watching DVDs of the BBC adaptation starring Alec Guinness, or Martin Ritt’s version of The Spy Who Came in from the Cold with Richard Burton, it’s possible to persuade yourself that le Carré might even be the greatest English novelist alive. Unfortunately, looking at his other books the next morning makes this seem less likely, in part because the classic phase of his career ended earlier than we bores like to remember, and in part because some of his early strengths have become, in a changed context, weaknesses.
more from the LRB here.
to organize the world’s information and make it universally accessible and useful
Every weekday, a truck pulls up to the Cecil H. Green Library, on the campus of Stanford University, and collects at least a thousand books, which are taken to an undisclosed location and scanned, page by page, into an enormous database being created by Google. The company is also retrieving books from libraries at several other leading universities, including Harvard and Oxford, as well as the New York Public Library. At the University of Michigan, Google’s original partner in Google Book Search, tens of thousands of books are processed each week on the company’s custom-made scanning equipment.
Google intends to scan every book ever published, and to make the full texts searchable, in the same way that Web sites can be searched on the company’s engine at google.com. At the books site, which is up and running in a beta (or testing) version, at books.google.com, you can enter a word or phrase—say, Ahab and whale—and the search returns a list of works in which the terms appear, in this case nearly eight hundred titles, including numerous editions of Herman Melville’s novel.
more from The New Yorker here.
Snake Bites the Toxic Toad That Feeds It--and Spreads Its Poison
From Scientific American:
It sounds like something straight out of a video game: A snake collects toxin by biting a poisonous toad and uses that poison as a defense against hawks and other predators. But that is exactly what researchers say the Asian snake Rhabdophis tigrinus does, based on studies of glandular fluid from hatchlings and adult snakes on two Japanese islands.
Some R. tigrinus snakes carry toxins called bufadienolides in their nuchal glands, sacs located under a ridge of skin along their upper necks. When threatened, they arch their necks, exposing the poisonous ridge to an antagonist. The clawing and biting of hawks and other predators most likely rips the skin and lets the poison ooze out, potentially blinding the snake's attackers, says herpetologist Deborah Hutchinson of Old Dominion University in Norfolk, Va. "It might not kill the predator but it would be noxious enough to deter predation," she says.
'Hobbit' human 'is a new species'
The finds caused a sensation when they were announced to the world in 2004. But some researchers argued the bones belonged to a modern human with a combination of small stature and a brain disorder called microcephaly. That claim is rejected by the latest study, which compares the tiny people with modern microcephalics. Microcephaly is a rare pathological condition in humans characterised by a small brain and cognitive impairment.
In the new study, Dean Falk, of Florida State University, and her colleagues say the remains are those of a completely separate human species: Homo floresiensis. They have published their findings in Proceedings of the National Academy of Sciences. The remains at the centre of the Hobbit controversy were discovered at Liang Bua, a limestone cave on the Indonesian island of Flores, in 2003.
Monday, January 29, 2007
A Case of the Mondays: The Blank Slate and Other Phantom Theories
Reading Steven Pinker's The Blank Slate reminded me of most other polemical books I'd read that attempt to integrate some science into their arguments. In theory it's a science book, a long-winded defense of both evolutionary psychology and its obvious social implications. But in practice, it's mostly a political book; the science is provided only as a backdrop against which Pinker sets up his attacks on a host of social, political, and cultural notions that stand in opposition to crude evolutionary psychology (which I'll abbreviate as EP in the rest of this post).
Pinker frames his view as that of modern science, represented by such tools as genetics, neurobiology, and post-Williams Revolution evolutionary biology, versus that of three closely interlinked demons. The first demon, which he focuses on the most, is the view that at birth the human mind is a blank slate to be shaped by environmental forces. The second is romantic affection for the noble savage, uncorrupted by pernicious civilization. And the third is the dualist notion that people are ghosts inhabiting the machines that are their own bodies.
The problems with the book's thesis start right at the beginning, when Pinker claims that a) all three views are interlinked, and b) all three views were very respectable until the science of EP started to overthrow them. The best way of seeing why Pinker is wrong there is by looking at the three philosophical positions he associates with the three demons—empiricism for the blank slate, romanticism for the noble savage, and dualism for the ghost in the machine.
By and large, the philosophers who developed empiricism, romanticism, and dualism in modern times disagreed with one another. Descartes' dualism isn't a component of Locke's empiricism; on the contrary, they disagree on the fundamental issue of whether all knowledge comes from experience. Romanticism developed mostly after the Enlightenment, and was only associated with empiricism or dualism when it mythologized European progress rather than the noble savage.
Zooming in on empiricism, it's easy to see another error of Pinker's: Lockean empiricism does not strictly speaking say the mind is a blank slate, at least not in the way that is relevant to EP. The main point of EP is that the human brain is hardwired to be prone to certain forms of learning and modes of behavior. The EP-derived view that men are on average better than women at math is not that men are born knowing more math than women but that men are born with a greater aptitude for math than women. In contrast, Locke's main contention is that knowledge comes directly from experience. He never concerned himself with social learning, which only became a serious subject of study a century or two after his death.
More importantly, the people Pinker criticizes for distorting science by claiming that IQ is not meaningful or not hereditary, or even that the mind is indeed a blank slate, have nothing to do with the other two demons. Marxist theory, which the people Pinker labels radical scientists adhere to, is extremely anti-romantic and anti-dualist. Among all the radical ideologies in existence—libertarianism, fascism, religious fundamentalism, anarchism—it is certainly the most pro-modern. Lewontin's politics is largely doctrinaire Marxist: in Biology as Ideology, he trumpets the triumph of progress, even as he indicates this progress should come from accepting socialism more than from ordinary capitalist improvements.
The relationship between Pinker and Lewontin is an interesting one. Pinker notes that although Lewontin claims that he thinks the dominant force in evolution is the interaction between gene, organism, and environment, in terms of social implications he ignores everything but environment. On that Pinker is certainly right: Biology as Ideology is an anti-science polemic that distorts facts to fit Lewontin's agenda (my take on Lewontin was subsequently debated at length here). However, Pinker commits the same transgression: he says he believes in the sensible moderate view that human behavior is determined by both inborn and environmental factors, and goes on to not only ignore the implications of the environmental part but also defend racists and sexists who have used pseudoscience as cover.
For instance, he starts by ridiculing people who called Richard Herrnstein a racist for a 1970 paper about intelligence and heredity. Although the paper as Pinker describes it is not racist per se, Herrnstein was indeed a racist. The screed he published with Charles Murray in 1994, The Bell Curve, is not only wrong, but also obviously wrong. Even in 1994, there were metastudies about race and intelligence that showed that the IQ gap disappears once one properly controls for environmental factors, for example by considering the IQ scores of children born to single German mothers and American fathers after World War Two.
The truth, or what a reasonable person would believe to be the truth, is never oppressive. If there is indeed an innate component to the racial IQ gap, or to the gender math score gap, then it's not racist or sexist to write about it. That remains true even if the innate component turns out not to exist, so long as the researcher had solid grounds to believe it did.
However, Murray and Herrnstein had no such solid grounds. They could quote a few studies proving their point, but when researchers publish many studies about the same phenomenon, some studies are bound to detect statistically significant effects that do not exist. By selectively choosing one's references, one can show that liberals are morally superior or morally inferior to conservatives, or that socialism is more successful or less successful than capitalism. At times there are seminal studies, which do not require any further metastudy. There weren't any in 1994, while existing metastudies suggested that the racial IQ gap was entirely environmental. As I will describe below, the one seminal study done in 2003 moots not only Murray and Herrnstein's entire argument but also much of Pinker's.
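To make that point concrete, here is a minimal simulation of my own (an illustration of the general statistical hazard, not of any specific study Murray and Herrnstein cite): even when the true effect is exactly zero, roughly one study in twenty will clear the conventional 5% significance bar by chance, which is all a selective reader needs.

```python
import random
import statistics

# Simulate many two-group studies in which the true group difference is zero,
# and count how many nonetheless look "statistically significant".
random.seed(0)

def one_null_study(n=50):
    """One study comparing two groups drawn from the same distribution."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    # Crude z-test on the difference of sample means.
    se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
    z = (statistics.mean(a) - statistics.mean(b)) / se
    return abs(z) > 1.96  # "significant" at the 5% level

studies = 1000
false_positives = sum(one_null_study() for _ in range(studies))
print(f"{false_positives} of {studies} null studies came out significant")
# Expect roughly 50, i.e. about 5% -- a quotable handful of "findings"
# even though no real effect exists.
```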
To rebut claims of racism and sexism, Pinker invokes the moral argument—in other words, that to be against racism and sexism one need only vigorously oppose discrimination, without also believing that, absent discrimination, there would be no gaps in achievement. In theory, that is correct. But in practice, that narrow view makes it impossible to enforce any law against discrimination.
Worse, Pinker invokes anti-feminist stereotypes that are born not of serious scholarship, but of ideologically motivated conservative thinking. He supports Christina Hoff Sommers's spurious distinction between equity feminism and gender feminism. Although there are many distinctions among different kinds of feminists, some of which track the degree of radicalism, none of the serious ones has anything to do with Hoff Sommers's. In theory, equity feminism means supporting equality between women and men, while gender feminism means supporting a view of the world in which the patriarchy is omnipresent. In practice, the people who make that distinction, including Pinker, assign everyone who supports only the forms of equality that are uncontroversial in the United States, like equal pay laws and suffrage, to equity feminism, and everyone who supports further changes or even existing controversial ones to gender feminism.
As a case study, take family law activist Trish Wilson. Wilson's activism focuses on divorce law; she has written articles and testified in front of American state legislatures against laws mandating presumptive joint custody, mainly on the grounds that it hurts children. In addition, she has written exposés of ways abusive men exploit legal loopholes, including presumptive joint custody, to gain custody of children. In pushing for equality in the courtroom, she is a liberal feminist's liberal feminist. And yet, her attacks on the men's rights movement for protecting abusive men have caused every conservative who makes distinctions between equity and gender feminism to deride her as a gender feminist.
Any reasonable distinction between a more radical feminist stream and a more conventional one would put Betty Friedan and her organization NOW on the less radical side. Friedan was anti-radical enough to devote much of her tenth anniversary afterword to The Feminine Mystique to attacking radical feminists, by which she means not Catharine MacKinnon or Andrea Dworkin, but Kate Millett. NOW has focused on legal equality, primarily abortion rights and secondarily laws cracking down on employment discrimination and sexual harassment. But Pinker assigns Friedan as well as Bella Abzug to the gender feminism slot, using entirely trivial statements of theirs to paint them as radicals. Friedan he attacks for suggesting that compulsory education be extended down to the age of 2; Abzug he attacks for saying equality means fifty-fifty representation everywhere.
To his credit, Pinker never quite claims that there is no gender discrimination. However, he makes an earnest effort to undermine every attempt to counteract it, however well founded. For instance, he claims that it's absurd to say that women's underrepresentation in science in the United States is due to discrimination, on the grounds that they're even more underrepresented in math, and it's not likely mathematicians are more sexist than scientists. Instead, he suggests, women are just not interested in math and science.
However, it is legitimate to ask why this interest gap exists. There is no EP-based argument why it should be innate. On the contrary, independent evidence from, for example, the proportion of female mathematicians who come from families of mathematicians versus the corresponding proportion of male mathematicians, suggests it is environmental. Indeed, the educational system of the United States has long encouraged women to ignore the hard sciences. Other educational systems produce near-parity: while 13% of American scientists and engineers are women, many other countries, such as Sweden and Thailand, have percentages above 30 or even 40.
Furthermore, one of the most important pieces of information about biases in education, stereotype threat, receives no mention from Pinker. It's an established fact that telling girls who are about to take a math test that boys do better will make them do worse. In fact, telling them that the test measures aptitude, or even asking them to fill out an oval for gender before the test, will hurt their performance. And yet somehow Pinker glosses over that fact in a book that purports to be about a combination of genetics and environment.
There is hardly a single thing Pinker gets right about rape in his book, except that Susan Brownmiller is wrong. His explanation of rape is that it is a male biological urge, as evidenced by the fact that in many species males rape females. However, that theory says nothing about why straight men rape other men in prison, or in general about the dynamics of male-on-male rape. He provides scant circumstantial evidence for his theory of rape; instead, he prefers to rant about how Brownmiller's feminist theories are dominant, even though in fact the dominant view among criminologists is that rape is simply a violent crime, rather than a case of passionate sex gone awry or a mechanism of reinforcing the patriarchy.
Pinker commits not only a sin of omission in writing about rape or violence in general, but also a sin of commission, in writing that nobody really knows what causes violence. In fact, criminologists have fairly good ideas about how social ills like poverty and inequality cause crime, although they know more about murder than about other violent crimes. Still, the rates of all violent crimes are closely correlated; the major exception is the United States' murder rate, which is higher than its general violent crime rate predicts, presumably because of its lax gun control laws.
Finally, Pinker quotes a 2001 study by Eric Turkheimer as showing that the Darwin wars ended and the gene-centric side, led by Richard Dawkins, prevailed over the more environment-based side, led by Stephen Jay Gould. Thence Pinker concludes that attempts to raise children in ways more conducive to growth are futile, since much of their future behavior is genetic, and almost all of what is not genetic is due to developmental noise rather than environmental influence.
However, in 2003 Turkheimer published another study, which sealed the questions of race and IQ and of environmental influences on children in general. Turkheimer's starting point was that earlier studies about the heritability of IQ often focused on adopted children in middle- and upper-class families, where environmental influences might be different from in lower-class families. By examining a large array of data spanning multiple races and social classes, he saw that on the one hand, within the middle class IQ is highly genetic, with a heritability level of 0.72 and no significant environmental effects. But on the other, within the lower class, which includes most blacks and Hispanics in the US, the heritability of IQ drops to 0.1, and environmental factors such as the depth of poverty or the level of schooling predominate.
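For readers who haven't met the term: heritability is the fraction of a trait's variance in a given population that is attributable to genetic variance, so the same trait can be highly heritable in one environment and barely heritable in another. A toy sketch (my own invented numbers, not Turkheimer's data) makes the point:

```python
# Heritability as a variance ratio: h^2 = V_genetic / (V_genetic + V_environment).
# The numbers below are invented for illustration; only the ratios matter.

def heritability(genetic_var: float, environmental_var: float) -> float:
    """Fraction of total trait variance attributable to genetic variance."""
    return genetic_var / (genetic_var + environmental_var)

# Middle-class-like regime: environments are relatively uniform, so what
# variance remains tracks genes, and heritability comes out high.
print(heritability(genetic_var=72.0, environmental_var=28.0))  # 0.72

# Lower-class-like regime: depth of poverty and quality of schooling vary
# enormously, swamping genetic variance, and heritability collapses.
print(heritability(genetic_var=10.0, environmental_var=90.0))  # 0.1
```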
Obviously, it would be futile to blame Pinker for not mentioning Turkheimer's 2003 study. The Blank Slate was published in 2002. However, all other facts I have cited against Pinker's thesis and its purported social implications predate 2002. The Turkheimer study does not show by itself that Pinker's book is shoddy; it merely shows that much of it is wrong. What establishes Pinker's shoddiness is the treatment of social problems like sexism, racism, and crime, which is based not on examination of the available evidence or even the views that are mainstream among social scientists who study them, but on what think tanks whose views align with Pinker's say.
Even a cursory examination of the current intellectual mainstream will show that the myths of the noble savage and the ghost in the machine are nonexistent. That fringe scholars sometimes believe in them is no indication of their level of acceptability; there are fringe scholars who believe in 9/11 conspiracy theories, too. Even the theory of the blank slate, at least in its most extreme form, is a phantom ideology. Lewontin adheres to it, but Lewontin is a contrarian; non-contrarian scientists do not publish books comparing modern biology departments to medieval Christianity. Pinker likes to poke fun at theories that suggest everyone or almost everyone can succeed in life, but he never gets around to actually refuting them. All he does is attack extreme caricatures such as the blank slate and other phantom theories.
PERCEPTIONS: centuries of grief
Firoze Shakir. Muharram in Lucknow. Ya Sakina Ya Abbas.
"This is a chant that rips through the air on Aathvi after burying the Chup tazia, and along with Ya Hussain Ya Hussain the Azreens do an unforgettable Matam."
More on this particular series here.
Muharram in Karbala here.
FS has a unique, informative and broad ranging gallery of photographs here.
Shia and Sunni, A Ludicrously Short Primer
Even now, many people who hear these terms daily on the news are confused about what the real differences are between Sunni and Shia Muslims, so I, having been brought up in a very devout Shia household in Pakistan, thought I would explain these things, at least in rough terms. Here goes:
It all started hours after Mohammad's death: while his son-in-law (and first cousin) Ali was attending to Mohammad's burial, others were holding a little election to see who should succeed Mohammad as the chief of what was by now an Islamic state. (Remember that by the end of his life, Mohammad was not only a religious leader, but the head-of-state of a significant polity.) The person soon elected to the position of caliph, or head-of-state, was an old companion of the prophet's named Abu Bakr. This was a controversial choice, as many felt that Mohammad had clearly indicated Ali as his successor, and after Abu Bakr took power, these people had no choice but to say that while he may have become the temporal leader of the young Islamic state, they did not recognize him as their divinely guided religious leader. Instead, Ali remained their spiritual leader, and these were the ones who would eventually come to be known as the Shia. The ones who elected Abu Bakr would come to be known as Sunni.
This is the Shia/Sunni split which endures to this day, based on this early disagreement. Below I will say a little more about the Shia.
So early on in Islam, there was a split between political power and religious leadership, and, to make a long story admittedly far too short, this soon came to a head within a generation, when Yazid, grandson of Abu Sufian (one of the greatest of Mohammad's enemies from his early days in Mecca), took power in the still-nascent Islamic government. Yazid was really something like a cross between Nero and Hitler and Stalin; just bad, bad in every way: a decadent, repressive dictator (and one who flouted all Islamic injunctions), for whom it became very important to obtain the public allegiance of Husain, the pious and respected son of Ali (and so, grandson of Mohammad). And this Husain refused, on principle.
Yazid said he would kill Husain. Husain said that was okay. Yazid said he would kill all of Husain's family. Husain said he could not compromise his principles, no matter what the price. Yazid's army of tens of thousands then surrounded Husain and a small band of his family, friends and followers at a place called Kerbala (in present day Iraq), and cut off their water on the 7th of the Islamic month of Moharram. For three days, Husain and his family had no water. At dawn on the third day, the 10th of Moharram, Husain told all in his party that they were sure to be killed and whoever wanted to leave was free to do so. No one left. In fact, several heroic souls left Yazid's camp to come and join the group that was certain to be slaughtered.
On the 10th of Moharram, a day now known throughout the Islamic world as Ashura, the members of Husain's parched party came out one by one to do battle, as was the custom at the time. They were valiant, but hopelessly outnumbered, and therefore each was killed in turn. All of Husain's family was massacred in front of his eyes, even his six-month-old son, Ali Asghar, who was pierced through the throat by an arrow from the renowned archer of Yazid's army, Hurmula. After Husain's teenage son Ali Akbar was killed, he is said to have proclaimed, "Now my back is broken." But the last to die before him was his beloved brother, Abbas, killed while trying desperately to break through Yazid's ranks and bring water back from the Euphrates for Husain's young daughter, Sakeena. And then Husain himself was killed.
The followers of Ali (the Shia) said to themselves that they would never allow this horrific event to be forgotten, and that they would mourn Husain and his family's murder forever, and for the last thirteen hundred years, they have lived up to this promise every year. This mourning has given rise to ritualistic displays of grief, which include flagellating oneself with one's hands, with chains, with knives, etc. It can all seem quite strange, out of context, but remembrance of that terrible day at Kerbala has also given rise to some of the most sublime poetry ever written (a whole genre in Urdu, called Marsia, is devoted to evoking the events of Ashura), and some of us, religious or not, still draw inspiration from the principled bravery and sacrifice of Husain on that black day.
Earlier today, I took the following unlikely pictures on the ritziest road in New York City, Park Avenue:
This is the procession commemorating Ashura, or the 10th of Moharram. In front, you can see a painstakingly recreated model of the tomb of Husain. The mourners are dressed mostly in black. It is a testament to the tolerance of American society that despite the best attempts of some of its cleverest citizens to proclaim a "clash of civilizations," it allows (and observes with curiosity) such displays of foreign sentiment.
The procession is made up of Shias of various nationalities, with the largest contingents being from Pakistan and Iran.
A young Shia holds up a banner, perhaps forgetting for a second that he is supposed to be mourning.
You can see one of the coffins with roses on it, which are ritualistically carried in the procession.
The self-flagellation is in full swing at this point. (The arms are raised before coming down to beat the chest.)
This is "Zuljana" or Husain's horse, caparisoned with silks and flowers.
The self-flagellation, or matam, reaches a climactic frenzy before ending for Asr prayers. Later in the evening, there are gatherings (or majaalis) to remember the women and children of Husain's family who survived to be held as prisoners of Yazid.
Sojourns: Two Views of the Apocalypse
Slavoj Zizek once said "it is much easier for us to imagine the end of the world than a small change in the political system. Life on earth maybe will end but somehow capitalism will go on." One is tempted to respond, well yes of course. It is also easier to imagine blowing up a car than designing one. Destruction is a rather simple proposition. Feats of engineering are somewhat more complicated.
And yet there is something to the apocalyptic imagination. Thinking about the end of the world can perhaps tell us something about the world that is ostensibly ending. Or so it would seem from two of the more visually arresting films to appear in the last decade, both ruminating over our final days, both set, as it happens, in England. I refer here to everyone's favorite intellectual zombie flick 28 Days Later and the more recent dystopian thriller Children of Men.
The first thing I would point to is that it is not the "world" that is ending in these movies so much as the human race that has lorded over it for the past eon or so. It is part of our species' arrogance to identify the world with humanity and then to wonder if our destruction would be anything other than a good thing for the rest of "life on earth." So then let us be clear. What we are talking about here is not exactly the globe or the planet but simply the noisome breed of animals bent on mucking it up for everyone else.
Humans. We are tiresome, aren't we? Few could deny the beauty of the depopulated London with which 28 Days Later begins: the seraphic Cillian Murphy ambling about Oxford Circus, picking detritus off the ground, alone save for the pigeons and the gulls. Humanity has perished because the "rage virus" has been loosed from a lab and made us tear each other limb from limb. We don't die from the virus itself. It's the rage that kills us. And so we ought to wonder how much the virus adds to our native cruelty and rancor. Perhaps Cornelius had it right after all: "Beware the beast Man, for he is the devil's pawn. Alone among God's primates he kills for lust or sport or greed … Let him not breed in great numbers, for he will make a desert of his home and yours."
Actually, the conclusion (or at least the original one) of 28 Days Later is nowhere near as radical. It turns out the virus never got out of the country. Humanity is spared. The hero, his girlfriend, and an orphaned kid make an ersatz domestic hearth in the English countryside, all warm in their sweaters and waiting to be rescued. Rage may be conquered after all. Perhaps we can all just get along.
Humanity (nearly) perishes by anger in 28 Days Later. Sadness dooms us in Children of Men. Seventeen years after a global infertility crisis has brought a stop to human reproduction across the planet, "life" has pretty much ground to a halt. There's no future generation in sight, so nations plunge into despair. War, chaos, and social entropy ensue. The sound of children's voices is dearly missed.
Children of Men is a movie at odds with itself. At its core, the story is a saccharine humanist fable of a culture of life fighting to persist amid one of death. A baby springs miraculously into the fallen world and suddenly there is a future to save, as if one could only live for the sake of progeny, as if a world without humans would not be left well enough alone. Amid the rubble and squalor of the end of the world, the life-or-death struggle turns on getting the baby offshore to a group of save-the-planet scientists aptly dubbed (giving the game away) … the Human Project.
Yet, for as much as the movie is committed at the level of story to a bland humanism, it is equally committed at the level of form to something quite different, to making us wonder, within the terms of the narrative, whether the human species ought not to become extinct after all. A great deal of attention has been paid to the six-minute long take in a battle-strewn internment camp. As with 28 Days Later, humanity's end makes quite a spectacle. I would point also to an earlier scene at an abandoned and dilapidated schoolyard. Here we are supposed to be thinking about the despair left in the absence of children. But the camera does something else. We freeze on a deer that strides into the frame and occupies the place of the missing kids. It's an arresting moment precisely in the species difference. A non-human animal walks on the ruins of a civilization made for human children. And perhaps that is just fine.
As with 28 Days Later, humanity ends and begins again in England and is best imagined wrapped up in a cable-knit sweater while drinking Earl Grey tea (a role brilliantly played here by Michael Caine). Yet Children of Men makes the saving of humanity look and feel like it is beside the point and a waste of time. And that is why it is most interesting in spite of its own worst ideas.
So, perhaps the lesson is that thinking about the end of the world is in fact thinking about making it a better place.
Sunday, January 28, 2007
No Reservations, Asad Raza-Style
Recently, my wife and I have been avid watchers of chef Anthony Bourdain's program No Reservations on the Travel Channel (get cable, will you? And then get TIVO, too--trust me), and as I see Tony visit exotic locales and sample their various culinary offerings, I always wonder why he never replied to the late-nite letter that I once wrote him inviting him to dinner at my house, even promising to get my nephew Asad Raza to cook the incomparably zesty-yet-subtle, and completely sui generis, Pakistani dish, Nihari, for him. Now, let me tell you, Asad cooks a mean Nihari, but even the NM (Nihari Master) must go to the source for inspiration and instruction once in a while, and Asad not only went to Burns Road in Karachi (read about some of his other activities while he was there, here), he recorded his visit on video for the rest of us. So, Tony, either go to Pakistan, or come over to my place for some of Asad's Nihari, and meanwhile, watch this video which made my mouth water (and my heart ache):