Saul Kripke resigned yesterday from his position as Distinguished Professor of Philosophy at the CUNY Graduate Center. While similar allegations have been circulating in unpublished form for years, a team of philosophers from Oxford University has just released a damning report claiming that they were systematically unable to reproduce the results of thought experiments reported by Kripke in his groundbreaking Naming and Necessity. The team, led by Timothy Williamson, first became suspicious of Naming and Necessity after preliminary results raised questions about related work by Hilary Putnam. While the group was initially unable to confirm that water is H2O on Twin Earth, the results turned out to be due to contaminated research materials—one of the researchers’ minds had been contaminated by Chomskyan internalist semantics.
The inability to replicate Kripke’s results could not be similarly explained away, however, as the researcher in question was excluded from the analysis of Naming and Necessity.
Theo Anderson in In These Times:
Before his suicide in September 2008, David Foster Wallace published three short story collections, two novels, two essay collections, a book about rap music and another about infinity. His final, unfinished novel, The Pale King, was published early last year. His essay subjects ranged from Dostoevsky to the porn industry to tennis. But for all his output and range, Wallace rarely wrote about politics. The most notable exception was a long article about the 2000 primary campaign of John McCain. A prominent thread in that narrative is Wallace’s exaggerated innocence about all things political, set against the polished professionals of the mainstream press corps.
Wallace had even less to say about religion. His masterpiece, the 1,000-page novel Infinite Jest, is shot through with the quasi-religious elements of Alcoholics Anonymous. It examines recovering addicts’ commitment to a higher power, but traditional religious organizations and formal theology are almost entirely absent. The same is true of his famous 2005 Kenyon College commencement speech, published as This Is Water, which posthumously brought him to the attention of a wider audience.
If rarely his explicit subjects, though, religion and politics were nearly always Wallace’s subtexts. He mostly ignored the hideous spectacle of electoral politics in the United States, and he had no time for the nonsense that pervades much of American religious life. But his work is obsessed with the roots of our religious and political poverty. It’s a sustained jeremiad aimed at America’s spiritual childishness, and it’s a plea for preserving what is most valuable in religious thought and practice. Wallace was a Puritan, not in theology, but in his sensitivity to a set of insoluble questions and tensions that are deeply rooted in the Calvinist tradition – most notably the tension between freedom and determinism.
For Gautam Pemmaraju apropos of a conversation last night, Amitava Kumar in Caravan:
A rash of Indian bureaucrats are now authors. It doesn’t bode well, in my opinion. Our host wasn’t a closeted writer, thank God, and was merely satisfied to regard the whole lot of us as delinquents. The writer Ruchir Joshi—maybe this was a critical postmodern antic on his part—was happy to oblige. But the most petulant schoolchild at the festival was VS Naipaul. He accused the American ambassador’s wife of a severe lack of intelligence and requested that she leave the table where dinner was being served. On the last day, Naipaul erupted once again. When Nayantara Sahgal bemoaned the sins of colonialism, he interrupted her, shouting, “My life is short. I can’t listen to banalities. Banalities irritate me.”
Banalities irritate me, too, but if you are so averse to them, you ought to stay away from literary festivals. And besides, not all banalities are created equal. The first year that I went to the Jaipur Literature Festival, I was given the honour of engaging in a public conversation with my early hero, Hanif Kureishi. Hanif is a writer of clean sentences; he has a dry wit, and isn’t afraid to be perverse or provocative. He also speaks just the way he writes, his utterances coming out clothed in elegant perfection, their hair gelled. He was in fine form that morning but quite unprepared for what, best as I can recall, was the very first question from the audience: “Mr Kureishi, are you circumcised?”
That was good, very good, in fact, and amused everyone. Much better than questions like, “Sir, how many books have you read?” that had been posed to me the previous day after my own panel. I’m calling such statements banalities, but I quite appreciate their directness and honesty. It’s important to know where these questions are coming from. The man with the pressing inquiry about Hanif’s foreskin really wanted to ask about Muslim identity; his own grandson, the questioner explained, had recently been circumcised. Why should young children undergo this trauma? Of course, we might want to ask why anyone would consider writers a source of great wisdom on such worldly matters: what exactly makes someone who does nothing but spend a lot of time alone in front of their computer uniquely qualified to answer questions about violent conflicts, or stubborn social customs, or world historical changes?
Diane Ravitch in the New York Review of Books:
In Finnish Lessons: What Can the World Learn from Educational Change in Finland?, Pasi Sahlberg explains how his nation’s schools became successful. A government official, researcher, and former mathematics and science teacher, Sahlberg attributes the improvement of Finnish schools to bold decisions made in the 1960s and 1970s. Finland’s story is important, he writes, because “it gives hope to those who are losing their faith in public education.”
Detractors say that Finland performs well academically because it is ethnically homogeneous, but Sahlberg responds that “the same holds true for Japan, Shanghai or Korea,” which are admired by corporate reformers for their emphasis on testing. To detractors who say that Finland, with its population of 5.5 million people, is too small to serve as a model, Sahlberg responds that “about 30 states of the United States have a population close to or less than Finland.”
Sahlberg speaks directly to the sense of crisis about educational achievement in the United States and many other nations. US policymakers have turned to market-based solutions such as “tougher competition, more data, abolishing teacher unions, opening more charter schools, or employing corporate-world management models.” By contrast, Finland has spent the past forty years developing a different education system, one that is focused on
improving the teaching force, limiting student testing to a necessary minimum, placing responsibility and trust before accountability, and handing over school- and district-level leadership to education professionals.
To an American observer, the most remarkable fact about Finnish education is that students do not take any standardized tests until the end of high school.
Paula Findlen in The Nation:
A right thumb, a finger, a tooth. These were the contents of a reliquary acquired several years ago by a collector at an auction in Florence. Little did he know that for centuries the remains had been objects of profane devotion. Last seen in 1905, they had been sliced from the corpse of Galileo, along with another finger and a vertebra, during his highly publicized reburial in the Basilica of Santa Croce in 1737 almost 100 years after his death, and preserved in a slender case fashioned of glass and wood and crowned with a carved bust of the scientist. The reliquary’s new owner consulted Galileo experts about his find, and after the authenticity of its contents had been verified he donated it to the Museo Galileo, which is tucked behind the Uffizi in a quiet piazza overlooking the River Arno. (A dentist asked by the museum to examine the tooth concluded that Galileo suffered from gastric acid reflux and ground his teeth in his sleep.) The rediscovered reliquary is displayed adjacent to a smaller one containing Galileo’s other finger, a prized museum possession since 1927. Nearby are several artifacts of Galileo’s scientific genius: a telescope presented to the Medici and the broken objective lens of the original device with which Galileo sighted Jupiter’s four satellites in 1610.
“I can’t give up either humanity or freedom,” Joseph Roth announced in a 1935 letter to fellow Austrian novelist Stefan Zweig. Freedom was the right to fit all his possessions into two suitcases and to live in hotels; to move in a single year from Austria to Germany to France to Russia; to have no address and no bank account. He was married, to a woman committed to a mental asylum, and he had a long-term mistress. But he avoided “cooking smells and ‘family life’”. “I shit on furniture. I hate houses.” He nevertheless felt a duty to support these women, along with their parents and children. Roth was often penniless but he still shared what money he had with eight others. On a wider scale, freedom was the license to spurn friends or nations lacking in humanity. Roth was living in Germany in 1933, but the day that Hitler became chancellor he left and never returned. “What divides me from everyone, without a single exception, who is active in Germany,” he told the more accommodating Zweig, “is precisely what divides a human from an animal”.
more from Lara Feigel at The Guardian here.
What makes a book a gay book, or a writer a gay writer? Walt Whitman, for all his sizzling erotic verses about men, insisted to the end that he was interested only in women. Gore Vidal, who has made no secret of his attraction to men, writes sparingly about gay characters and has asserted that there is no such thing as a homosexual, only homosexual acts. James Baldwin’s novels typically repose on bookstores’ African-American shelves, rather than their gay and lesbian sections — even “Giovanni’s Room,” which centers on a relationship between two white men. Christopher Bram, who calls himself a gay novelist (his “Father of Frankenstein” was the basis of the movie “Gods and Monsters”), assumes the task of herding the gay American male writers who emerged after World War II into a coherent history, beginning with the coded innuendo of Tennessee Williams’s “Glass Menagerie” in 1944 and peaking with Tony Kushner’s luminescent “Angels in America” in 1991. In between, Bram writes, a growing stream of gay-themed novels, plays and poems, some bolder than others, prefigured or hastened sweeping changes in the culture at large. “The gay revolution,” he writes, “began as a literary revolution.”
more from John Leland at the NY Times here.
When Justice Thurgood Marshall decided to retire, a decidedly more conservative political atmosphere dominated national politics. Republican President George Bush was in the White House following the eight-year administration of President Ronald Reagan. President Bush saw Justice Marshall's retirement as an opportunity to appoint a more conservative judge to the Supreme Court. His choice was Clarence Thomas, a forty-three-year-old conservative African American from Pin Point, Georgia. Thomas would maintain the racial makeup of the Court, yet would add another conservative voice on decisions involving affirmative action and abortion.

President Bush's nomination of Clarence Thomas was instantly controversial. Many African-American and civil rights organizations, including the NAACP, the National Bar Association, and the Urban League, opposed the Thomas nomination. These organizations feared that Thomas's conservative stance on issues such as affirmative action would reverse the civil rights gains that Justice Marshall had fought so hard to achieve. Women's groups, including the National Organization for Women, were equally concerned that Clarence Thomas, if appointed to the high court, would rule against legal abortion. The legal community also voiced apprehension about Thomas's clear lack of experience, since he had served only two years as a federal judge.

Despite these voices of dissent, the Thomas nomination proceeded to the Senate Judiciary Committee's confirmation hearings. The first few days of the hearings were relatively uneventful. When asked about his stance on legal abortion, Thomas claimed that he had not formulated an opinion, and the issue was dropped. After a few more days of outside testimony, it appeared as if the Senate committee would easily confirm the Thomas nomination. The committee split its vote, however, seven to seven, and the nomination went to the Senate without a clear recommendation.
When the nomination moved to the floor of the Senate, it took a sudden and dramatic turn when Anita Hill, a law professor at the University of Oklahoma, came forward with accusations that Clarence Thomas had sexually harassed her. Hill had worked for Thomas years earlier when he was head of the Equal Employment Opportunity Commission. Hill charged that Thomas harassed her with inappropriate discussion of sexual acts and pornographic films after she rebuffed his invitations to date him. A media frenzy quickly arose around Hill's allegations and Thomas's denials. When Thomas testified about Hill's claims before the Senate Judiciary Committee, he called the hearings “a high-tech lynching for uppity Blacks.” The incident became one person's word against another's. In the end, the Senate voted 52-48 to confirm Clarence Thomas as associate justice of the Supreme Court.

To the many people who believed Anita Hill's claims or opposed the Thomas nomination on other grounds, Thomas's appointment was a defeat. Yet the Anita Hill-Clarence Thomas controversy had other long-term consequences beyond Justice Thomas's life term on the Supreme Court. Foremost, national awareness about sexual harassment in the workplace heightened considerably. According to Equal Employment Opportunity Commission filings, sexual harassment cases more than doubled, from 6,127 in 1991 to 15,342 in 1996. Over the same period, awards to victims under federal laws nearly quadrupled, from $7.7 million to $27.8 million. Another repercussion of the Hill-Thomas controversy was the increased involvement of women in politics. The media heralded the 1992 election year as the “Year of the Woman” when a record number of women ran for public office and won. In the U.S. Senate, eleven women ran and five won seats, including one incumbent. In the House of Representatives, twenty-four women won new seats. Many commentators saw this increase as a direct reaction to the Thomas nomination.
His appointment dismayed many women, who felt that Anita Hill's allegations were not taken seriously by a Senate that was 98% male.
In the end, the Anita Hill-Clarence Thomas controversy acted as a flash point that illuminated many of the central tensions of life in late twentieth-century America.
More here. (Note: In honor of African American History Month, we will be linking to at least one related post throughout February. The 2012 theme is Black Women in American Culture and History).
Jennifer McDonald in The New York Times:
This book review would be so much easier to write were we to play by John D’Agata’s rules. So let’s try it. (1) This is not a book review; it’s an essay. (2) I’m not a critic; I’m an artist. (3) Nothing I say can be used against me by the subjects of this essay, nor may anyone hold me to account re facts, truth or any contract I have supposedly entered into with you, the reader. There are to be no objections. There are to be no letters of complaint. For you are about to have — are you ready? — a “genuine experience with art.”
This is so liberating!
Under consideration in this essay is “The Lifespan of a Fact,” which is less a book than a knock-down, drag-out fight between two tenacious combatants, over questions of truth, belief, history, myth, memory and forgetting. In one corner is Jim Fingal, who as an intern for the literary magazine The Believer in 2005 (or it might have been 2003 — sources disagree) signed on for what he must have thought would be a straightforward task: fact-checking a 15-page article. In the other corner is D’Agata, who thought he had made a deal with The Believer to publish not just an article but a work of Art — an essay already rejected by Harper’s Magazine because of “factual inaccuracies” — that would find its way to print unmolested by any challenge to its veracity. “Lifespan” is the scorecard from their bout, a reproduction of their correspondence over the course of five (or was it seven?) years of fact-checking.
Peter Richerson reviews Samuel Bowles and Herbert Gintis's A Cooperative Species: Human Reciprocity and its Evolution, in Nature:
Humans are capable of remarkable feats of cooperation. Warfare is an extreme example: when under attack, hundreds or even millions of people might join forces to provide a mutual defence. In A Cooperative Species, economists Samuel Bowles and Herbert Gintis update their ideas on the evolutionary origins of altruism. Containing new data and analysis, their book is a sustained and detailed argument for how genes and culture have together shaped our ability to cooperate.
Modern hunting and gathering societies offer clues as to how human cooperation evolved. They are typically organized into tribes of a few hundred to a few thousand people. Each tribe is composed of smaller bands of around 75 individuals united by bonds of kinship and friendship. Formalized leadership is often weak, but cooperation is buttressed by social norms and institutions, such as marriage, kinship and property rights. The tribal scale of social organization probably evolved by the late Pleistocene (126,000–11,700 years ago), or perhaps much earlier.
Human societies are diverse and competitive, often violently so. Charles Darwin conjectured in The Descent of Man (John Murray, 1871) that the main evolutionary motor behind human cooperation was intertribal competition, and suggested that cooperation evolved in two stages. In ‘primeval’ times, well before the dawn of recorded history, our ancestors came under selection for cooperative instincts, such as sympathy and group loyalty. In more recent ‘civilized’ times, laws and customs have fostered cooperation on ever larger scales. Darwin contended that the primeval social emotions, more than natural selection, drove the evolution of civilization.
Michael Price takes a more critical look in Evolutionary Psychology.
Bijal P. Trivedi in Nature News:
On a frigid winter's morning in 1992, Susan Lindquist, then a biologist at the University of Chicago in Illinois, trudged through the snow to the campus's intellectual-property office to share an unconventional idea for a cancer drug. A protein that she had been working on, Hsp90, guides misfolded proteins into their proper conformation. But it also applies its talents to misfolded mutant proteins in tumour cells, activating them and helping cancer to advance. Lindquist suspected that blocking Hsp90 would thwart the disease. The intellectual-property project manager she met with disagreed, calling Lindquist's idea “ridiculous” because it stemmed from experiments in yeast. His “sneering tone”, she says, left an indelible mark. “It was actually one of the most insulting conversations I've had in my professional life.” It led her to abandon her cancer research on Hsp90 for a decade. Today, more than a dozen drug companies are developing inhibitors of the protein as cancer treatments.
Lindquist seems able to shrug off such injustices, now. Her work over the past 20 years has consistently challenged standard thinking on evolution, inheritance and the humble yeast. She has helped to show how misfolded infectious proteins called prions can override the rules of inheritance in yeast, and how this can be used to model human disease. She has also proposed a mechanism by which organisms can unleash hidden variation and evolve by leaps and bounds. She was the first female director of the prestigious Whitehead Institute for Biomedical Research in Cambridge, Massachusetts, and has received more than a dozen awards and honours in the past five years. In a paper being published this week in Nature, she and her colleagues show that in wild yeast, prions provide tangible advantages, such as survival in harsh conditions and drug resistance.
[H/t: Tom Jacobs]
A nationwide campaign to stem investments in private corrections companies is gathering steam.
Hannah Rappleye in Salon:
Early this year, the United Methodist Church Board of Pension and Health Benefits voted to withdraw nearly $1 million in stocks from two private prison companies, the GEO Group and Corrections Corporation of America (CCA).
The decision by the largest faith-based pension fund in the United States came in response to concerns expressed last May by the church’s immigration task force and a group of national activists.
“Our board simply felt that it did not want to profit from the business of incarcerating others,” said Colette Nies, managing director of communications for the board.
“Our concern was not with how the companies manage or operate their business, but with the service that the companies offer,” Nies added. “We believe that profiting from incarceration is contrary to church values.”
It was an important success for a slew of activists across the country who are pushing investors and institutions to divest from the private prison industry.
Sean Carroll in Cosmic Variance:
They do things differently over in Britain. For one thing, their idea of a fun and entertaining night out includes going to listen to a lecture/demonstration on quantum mechanics and the laws of physics. Of course, it helps when the lecture is given by someone as charismatic as Brian Cox, and the front row seats are filled with celebrities. (And yes I know, there are people here in the US who would find that entertaining as well — I’m one of them.) In particular, this snippet about harmonics and QM has gotten a lot of well-deserved play on the intertubes.
More recently, though, another excerpt from this lecture has been passed around, this one about ramifications of the Pauli Exclusion Principle. (Headline at io9: “Brian Cox explains the interconnectedness of the universe, explodes your brain.”)
The problem is that, in this video, the proffered mind-bending consequences of quantum mechanics aren’t actually correct. Some people pointed this out, including Tom Swanson in a somewhat intemperately worded blog post, to which I pointed in a tweet. Which led to some tiresome sniping on Twitter, which you can dig up if you’re really fascinated. Much more interesting to me is getting the physics right.
One thing should be clear: getting the physics right isn’t easy. For one thing, going from simple quantum problems of a single particle in a textbook to the messy real world is often a complicated and confusing process. For another, the measurement process in quantum mechanics is famously confusing and not completely settled, even among professional physicists.
And finally, when one translates from the relative clarity of the equations to a natural-language description in order to reach a broad audience, it’s always possible to quibble about the best way to translate. It’s completely unfair in these situations to declare a certain popular exposition “wrong” just because it isn’t the way you would have done it, or even because it assumes certain technical details that the presenter did not fully footnote. It’s a popular lecture, not a scholarly tome. In this kind of format, there are two relevant questions: (1) is there an interpretation of what’s being said that matches the informal description onto a correct formal statement within the mathematical formulation of the theory?; and (2) has the formalism been translated in such a way that a non-expert listener will come away with an understanding that is reasonably close to reality? We should be charitable interpreters, in other words.
From Centennial of Flight:
Bessie Coleman, the daughter of a poor, southern, African American family, became one of the most famous women and African Americans in aviation history. “Brave Bessie” or “Queen Bess,” as she became known, faced the double difficulties of racial and gender discrimination in early 20th-century America but overcame such challenges to become the first African American woman to earn a pilot's license. Coleman not only thrilled audiences with her skills as a barnstormer, but she also became a role model for women and African Americans. Her very presence in the air threatened prevailing contemporary stereotypes. She also fought segregation when she could by using her influence as a celebrity to effect change, no matter how small.
Coleman was born on January 26, 1892, in Atlanta, Texas, to a large African American family (although some histories incorrectly report 1893 or 1896). She was one of 13 children. Her father was a Native American and her mother an African American. Very early in her childhood, Bessie and her family moved to Waxahachie, Texas, where she grew up picking cotton and doing laundry for customers with her mother.

The Coleman family, like most African Americans who lived in the Deep South during the early 20th century, faced many disadvantages and difficulties. Bessie's family dealt with segregation, disenfranchisement, and racial violence. Because of such obstacles, Bessie's father decided to move the family to “Indian Territory” in Oklahoma. He believed they could carve out a much better living for themselves there. Bessie's mother, however, did not want to live on an Indian reservation and decided to remain in Waxahachie. Bessie and several of her sisters also stayed in Texas.

Bessie was a highly motivated individual. Despite working long hours, she still found time to educate herself by borrowing books from a traveling library. Although she could not attend school very often, Bessie learned enough on her own to graduate from high school. She then went on to study at the Colored Agricultural and Normal University (now Langston University) in Langston, Oklahoma. Because of limited finances, however, Bessie attended only one semester of college.
More here. (Note: In honor of African American History Month, we will be linking to at least one related post throughout February. The 2012 theme is Black Women in American Culture and History).