Philip Kitcher in the Oxford University Press blog:
[Kitcher is the John Dewey Professor of Philosophy at Columbia University. Living With Darwin: Evolution, Design, and the Future of Faith, Kitcher’s most recent book, is both a defense of Darwin and an exploration of the meaning behind the clash of religion and modern science. Kitcher is also the author of Abusing Science: The Case Against Creationism, The Lives to Come: The Genetic Revolution and Human Possibilities, Vaulting Ambition: Sociobiology and the Quest for Human Knowledge, Science, Truth, and Democracy, and In Mendel’s Mirror. In the article below Kitcher explores how easy it is to hide from the truth.]
Finally, in his State of the Union message, President Bush acknowledged that climate change is a problem. Whether he understands the magnitude of the problem or is prepared for the kinds of measures that are needed to address it remains unclear. But, from many Americans, and especially from people in other countries who have been concerned about global warming for many years, there have been huge sighs of relief. At the same time, there’s an obvious question – why has it taken so long?
The broad outlines of the answer are fairly clear. During recent years, some writers whose conclusions appeal to the values of the President and his advisers have muddied the waters about climate change. They have employed familiar tactics, casting doubt on any consensus among experts by ignoring the large areas of agreement and concentrating on those places where scientists debate the details. Structurally, the case is much like the long-running battle about evolution: you make it seem as though there is no consensus by judiciously quoting from researchers who are actively involved in discussing unsettled questions, but who agree on a fundamental core framework that you don’t bother to mention.
Behind these two examples lies a deeper problem about the ways the achievements of the sciences are received in American society.
Patricia Cohen in the New York Times:
The American Jewish Committee, an ardent defender of Israel, is known for speaking out against anti-Semitism, but this conservative advocacy group has recently stirred up a bitter and emotional debate with a new target: liberal Jews.
An essay the committee features on its Web site, ajc.org, titled “ ‘Progressive’ Jewish Thought and the New Anti-Semitism,” says a number of Jews, through their speaking and writing, are feeding a rise in virulent anti-Semitism by questioning whether Israel should even exist.
In an introduction to the essay, David A. Harris, the executive director of the committee, writes, “Perhaps the most surprising — and distressing — feature of this new trend is the very public participation of some Jews in the verbal onslaught against Zionism and the Jewish State.” Those who oppose Israel’s basic right to exist, he continues, “whether Jew or gentile, must be confronted.”
Dave Zirin in The Nation:
Muhammad Ali’s brilliance was not that he was some kind of antiwar prophet. He wasn’t Malcolm X or Martin Luther King Jr. in boxing gloves, debating foreign policy between rounds. But unlike the Ivy League advisers who made up the “best and brightest,” Ali understood then that there was justice and injustice, right and wrong. He knew that not taking a stand could be as political a statement as taking one. This was Ali’s code, and he never wavered.
In early 1966 the US Army came calling for Ali, and he was classified 1-A for the draft. He got the news surrounded by reporters and blurted one of the most famous phrases of the decade, “Man, I ain’t got no quarrel with them Vietcong.”
This was an astounding statement. As Mike Marqusee outlines in his Redemption Song: Muhammad Ali and the Spirit of the 60s, there was little opposition to the war at the time. The antiwar movement was in its infancy, and most of the country still stood behind the President. Life magazine’s cover read, “Vietnam: The War Is Worth Winning.” The song “Ballad of the Green Berets” was climbing the charts. And then there was Ali. As longtime peace activist Daniel Berrigan said, “It was a major boost to an antiwar movement that was very white. He was not an academic or a bohemian or a clergyman. He couldn’t be dismissed as cowardly.”
Some of the newest maps on Worldmapper highlight international differences in health, e.g., this one on infant mortality:
Infant mortality refers to babies who die during the first year of life. In 2002 there were 7.2 million infant deaths worldwide; 5.4% of all babies born died within their first year, including 2.3% in their first week.
The territory with the most infant deaths was India, at 1.7 million, or 24% of the world total. In India, for every 100 babies born alive, almost 7 die in the following 12 months.
In 22 territories the rate is over 1 infant death for every 10 live births. All of these 22 territories are in Africa. The highest infant mortality rate is in Sierra Leone, where 16.5 of every 100 babies born alive die.
Territory size shows the proportion of infant deaths worldwide that occurred there in 2002. Infant deaths are deaths of babies during their first year of life.
In Salon, an interview with Barbara J. King, author of Evolving God: A Provocative View on the Origins of Religion (via Political Theory Daily Review):
Every human culture has believed in spirits, gods or some other divine being. That’s why human beings have often been called Homo religiosus. Some people take this long history of belief in the otherworldly as evidence for God; doesn’t it explain why religion continues to be so pervasive? But many scientists are coming up with their own, decidedly secular, theories about the origins of faith. In fact, over the last few years, a small cottage industry made up of scientists and philosophers has devoted itself to demystifying the divine.
Take Daniel Dennett, the philosopher who has proposed that religion is a meme — an idea that evolved like a virus — that infected our ancestors and continued to spread throughout cultures. By contrast, anthropologist Pascal Boyer argues that religious belief is a quirky byproduct of a brain that evolved to detect predators and other survival needs. In this view, the brain developed a hair-trigger detection system to believe the world is full of “agents” that affect our lives. And British biologist Lewis Wolpert, with yet another theory, posits that religion developed once hominids understood cause and effect, which allowed them to make complex tools. Once they started to make causal connections, they felt compelled to explain life’s mysteries. Their brains, in essence, turned into “belief engines.”
Of course, these thinkers are either religious skeptics or outright atheists who mean to imply that we’ve been duped by evolution to believe in supernatural beings when none, in fact, exist. That’s what makes Barbara J. King, an anthropologist at the College of William and Mary, so unique. She has no desire to undermine religion. In fact, she’s been deeply influenced by the religious writers Karen Armstrong and Martin Buber. But her main insights about the origins of religion come not from researching humans’ deep history, but from observing very much alive non-human primates.
Dr. Helen scans through the comments on a WebMD post on the different reasons why men and women don’t want sex and concludes:
Update: A Men’s News Daily commenter to this post writes the following:
“Never forget: the single most revolting image, the nightmare that haunts women, is that of the happy, grinning, sexually satisfied male. They really hate that and the sooner we adjust our social expectation to that fact, the better.” Truer words were never spoken–I think that some women really do feel this way.
Jill at Feministe responds:
Yes, women do secretly hate the idea of our partners being happy. You’ve got us all figured out.
The double-standard here is amazing. From the letters Dr. Helen quotes, it’s pretty clear that many women are refusing sex because they aren’t enjoying it, or because there are other issues within the relationship that are leaking over into their sex lives. But clearly, they’re just being selfish by not allowing their husbands unrestrained sexual access, even if the sex sucks, or is painful, or is unwanted. As usual, the mens are not doing anything that needs re-evaluating.
If Avedon provided the tools, it was Susan Sontag who gave Leibovitz a fresh sense of how she could use them as an autonomous artist. In retrospect, that a high-profile photographer of Leibovitz’s calibre should form an alliance with an intellectual as illustrious as Sontag is perfectly logical. After all, Walker Evans and James Agee formed an influential collaboration in the heyday of “documentary style” photography (Evans’s own term). The turn-of-the-21st-century twist is that Leibovitz and Sontag are women – and that all aspects of their personal, creative, and intellectual lives were intertwined during the fifteen-year period of their relationship.
Leibovitz’s knowing, “commercial” style stands out in a museum context. The best example is her witty color portrait of the Bush Administration, Cabinet Room (2001). A straight photograph and a public image, it’s also stupendously ironic. Bush, Rice, and the rest of them look like a band posing for a 1970s album cover. But the exhibition reveals that Leibovitz has mastered other modes. Her work shifts from creative service in the political and entertainment industries to photojournalism, as in Traces of the Massacre of Tutsi Schoolchildren and Villagers on a Bathroom Wall (1994), to tender family portraiture. Her soft-focus landscape photography of the American west and vast terrain in other locations includes a picture of Mt. Vesuvius – echoing Sontag’s novel The Volcano Lover.
more from Artcritical here.
Interviewing two people at the same time is never easy, but Gilbert and George, a retrospective of whose work opens at Tate Modern next month, take the thing (and of course they’re perfectly aware of this) to a whole new level. Ask a question and, to your right, George will offer some piece of gnomic wisdom topped off with a dash of mild smut while, to your left, Gilbert will titter or splutter or make his own naughty joke in an effort to back up his friend. Then, as you struggle to grasp what it is that they actually mean, the two of them will fall eerily silent. Their marmoset eyes are always on you, which would be scary if they weren’t so invincibly charming. George, in particular, has the kind of manners – if you ignore the smut – that one might have found behind the discreet rosewood counter of a gentleman’s outfitter, circa 1935.
more from The Guardian here.
From The Washington Post:
In 1877, the Italian astronomer Giovanni Schiaparelli was looking at Mars through his new telescope, and he noticed intricate etchings in the equatorial region of the planet’s surface. Schiaparelli called these lines canali, by which he probably meant something like “gullies” or “grooves,” but his coinage got wrongly translated into English as “canals.” It was a regrettable linguistic slip. The idea of Martian canals grabbed the imagination of American astronomer Percival Lowell, scion of the famous Boston Lowell clan, who spun out an elaborate story of a Martian civilization with a central planetary government and the technological wizardry to engineer a massive system of aqueducts. Lowell even used his own Arizona observatory to identify the Martian capital, called Solis Lacus.
There are no canals on Mars. No cities either, and no government. Indeed, no signs of past life whatsoever, as we know today. All of this was an elaborate phantasm of Lowell’s fertile mind, yet as late as the 1950s, popular culture was saturated with imagery of Martians as a technologically advanced extraterrestrial race. The late Carl Sagan used the misbegotten story of Martian engineers, in his 1985 Gifford Lectures in Natural Theology at the University of Glasgow, as a cautionary tale about the power of belief and yearning to trump science and reason.
Samir El-youssef, raised in a refugee camp, grew up into a writer who challenges the myths of Palestinian politics. Matthew J Reisz meets a trouncer of taboos.
From The Independent:
El-youssef has a Sunni father, but his mother comes from the only Shi’ite Palestinian family. This, he believes, “has contributed to the diversity of my understanding of things – from the beginning you are aware of yourself as someone different”. Although he has contributed many articles to the London-based Arabic newspaper Al-Hayat, his criticisms of the second intifada and the Arab policy of “non-normalisation” in relation to Israel have sometimes proved too controversial to be published.
“We have to meet up with the Israelis and have a dialogue with them,” he explains. “The idea of not meeting is simply childish and stupid. But it is not easy to express your views. You can be branded a ‘Zionist’ or a ‘traitor’ simply for not parroting the same old slogans.”
His own social circle consists largely of liberal British Jews and Israelis. Asked about his outspoken opposition to the academic boycott of Israel, he responds cheerfully: “What hope do we have if we as writers don’t speak to each other? Do we really think our idiotic leaders are going to sort things out?”
A review of The Road by Cormac McCarthy, from the New York Review of Books:
Charlton Heston and a savagely coiffed vixen, wrapped in animal skins, riding horseback along a desolate seashore, confronted by the spike-crowned ruin of the Statue of Liberty half buried in the sand: everyone knows how the world ends. First radiation, plague, an asteroid, or some other cataclysm kills most of humankind. The remnants mutate, lapse into feudalism, or revert to prehistoric brutality. Old cults are revived with their knives and brutal gods, while tiny noble bands cling to the tatters of the lost civilization, preserving knowledge of machinery, agriculture, and the missionary position against some future renascence, and confronting their ancestors’ legacy of greatness and destruction.
Ambivalence toward technology is the underlying theme, and thus we are accustomed to thinking of stories that depict the end of the world and its aftermath as essentially science fiction. These stories feel like science fiction, too, because typically they deal with the changed nature of society in the wake of cataclysm, the strange new priesthoods, the caste systems of the genetically stable, the worshipers of techno-death, the rigid pastoral theocracies in which mutants and machinery are taboo, etc.; for inevitably these new societies mirror and comment upon our own. Science fiction has always been a powerful instrument of satire, and thus it is often the satirist’s finger that pushes the button, or releases the killer bug.
This may help to explain why the post-apocalyptic mode has long attracted writers not generally considered part of the science fiction tradition. It’s one of the few subgenres of science fiction, along with stories of the near future (also friendly to satirists), that may be safely attempted by a mainstream writer without incurring too much damage to his or her credentials for seriousness.
Greg Ross interviews Gerard J. DeGroot, author of Dark Side of the Moon: The Magnificent Madness of the American Lunar Quest, in American Scientist:
To Americans in the 1960s, putting a man on the Moon was a noble, even romantic challenge. “No single space project in this period will be more impressive to mankind,” President Kennedy told Congress, “or more important in the long-range exploration of space, and none will be so difficult or expensive to accomplish.”
But in re-examining the Apollo project, historian Gerard J. DeGroot finds it largely an empty dream. In Dark Side of the Moon: The Magnificent Madness of the American Lunar Quest (New York University Press), he argues that the Moon race was essentially just a new front in the Cold War, “an immensely expensive distraction of little scientific or cultural worth.”
In announcing the Apollo project, Kennedy referred to moving with what he called “the full speed of freedom.” Do you think he saw it chiefly as a scientific endeavor, or really as a symbolic contest of ideologies?
I think very definitely the latter. It’s very difficult for some people even still, given Kennedy’s mystique, to accept that he wasn’t quite the person we thought he was. I think the really telling bit comes in a conversation that he has with the NASA administrator James Webb, in which he says, “I don’t really care about the moon. I know it’s important; I know there are people who really want to go there, but I just want to beat the Russians.” So it really comes down to that. It is purely a symbol of American supremacy in the Cold War. Because the Cold War didn’t provide real wars, this is in a sense a sort of surrogate war, and almost seemingly chosen with the same sort of cavalier attitude that, say, a Civil War general might choose a battlefield: “Well, we’re here, let’s fight right here.”
Ruchira Paul in Accidental Blogger:
In 1912, a flamboyant “oriental style” dancer with the exotic name of Mata Hari (mother of god in Hindi) was the toast of Paris night clubs. A traveling musical group from India, The Royal Musicians of Hindustan was in Paris that year. Mata Hari performed with this group. The group’s lead singer was a handsome and serious young man named Inayat Khan. He belonged to an accomplished Indian musical family from Baroda and was trained in Indian classical music and the sufi philosophical tradition. The glamorous and famous Mata Hari later went on to become a French spy (some say, a German double agent) during World War I – not the most sensible career choice for someone who sought publicity relentlessly. Little did the gentle Inayat Khan know that one day his own daughter would follow in the footsteps of the notorious Mata Hari and meet an equally tragic (but more honorable) fate.
Inayat Khan traveled the world with his musical group and introduced the pacifist sufi philosophy to western audiences. During a tour of the United States, he met, fell in love with and married Ora Ray Baker. In 1914 their oldest daughter, Noorunnisa Inayat Khan (Noor), was born in Moscow. The family lived in England and France. From all accounts, Noor and her siblings were brought up in a household bearing both eastern and western traditions. Despite European influences on the children’s upbringing, the cultured and conservative lifestyle of the Khan family was in keeping with Indian Muslim tradition. (Her American-born mother had converted to Islam and adopted the name Amina Begum.) Noor was trained in classical Indian and western music, playing the sitar, piano, cello and violin. She studied child psychology at the Sorbonne and music at the Paris Conservatory.
More here. [Photo shows Noor Inayat Khan.]
Paul Auster should not exist. I say this not to mimic a sentence that might easily have been plucked from one of his own hall-of-mirrors fictions, but simply to note his singular position in contemporary American letters. He has enjoyed unlikely success by writing reflexive novels that take up notions of chance and fate, memory and oblivion, luck and the uncanny; given his self-referential leanings and taste for highbrow allusion, it might seem that he would at best have found a coterie of admirers and a university appointment to subsidize his writing. Instead, he has settled comfortably into a career as one of the most glamorous novelists in America. Abroad, he has even higher visibility, a genuine rock-star aura. Magazine profiles cite his movie-idol looks and general air of suave elegance, and although Park Slope, the Brooklyn neighborhood where he lives, may now be home to more writers than any other urban enclave on the planet, he stands out in his affiliation with the place as one of its presiding celebrities. He has branched out into subsidiary projects as a radio personality (having headed up NPR’s National Story Project a few years back, which solicited anecdotal tales from listeners nationwide, later collected in the anthology I Thought My Father Was God) and a screenwriter and film director: best known in this regard for his screenplay for 1995’s Smoke (directed by Wayne Wang), Auster has written and directed the rather stilted Lulu on the Bridge (1998) and the just-completed The Inner Life of Martin Frost, based on material from his novel The Book of Illusions (2002). His work has also proliferated into media of unimpeachable hipness: Paul Karasik and David Mazzucchelli adapted City of Glass (1985), the first book in Auster’s New York Trilogy, into a graphic novel in 1994, and the beguiling, mischievous French artist Sophie Calle has realized conceptual pieces based on his writings.
These extraliterary manifestations contribute to a highly resilient cultural persona, gracing him, if you will, with a street credibility among chic young bookish types that has sustained Auster through an uneven career.
more from Bookforum here.