Robins Can Literally See Magnetic Fields, But Only if Their Vision is Sharp

Ed Yong over at Not Exactly Rocket Science:

Some birds can sense the Earth’s magnetic field and orientate themselves with the ease of a compass needle. This ability is a massive boon for migrating birds, keeping frequent flyers on the straight and narrow. But this incredible sense is closely tied to a more mundane one – vision. Thanks to special molecules in their retinas, birds like the European robin can literally see magnetic fields. The fields appear as patterns of light and shade, or even colour, superimposed onto what they normally see.

Katrin Stapput from Goethe University has shown that this ‘magnetoreception’ ability depends on a clear image from the right eye. If the eye is covered by a translucent frosted goggle, the birds become disorientated; if the left eye is covered, they can navigate just fine. So the robin’s vision acts as a gate for its magnetic sense. Darkness (or even murkiness) keeps the gate shut, but light opens it, allowing the internal compass to work.

The magnetic sense of birds was first discovered in robins in 1968, and its details have been teased out ever since. Years of careful research have told us that the ability depends on light and particularly on the right eye and the left half of the brain. The details still aren’t quite clear but, for now, the most likely explanation involves a molecule called cryptochrome. Cryptochrome is found in the light-sensitive cells of a bird’s retina and scientists think that it affects just how sensitive those cells are.

When cryptochrome is struck by blue light, it shifts into an active state where it has an unpaired electron – these particles normally waltz in pairs but here, they dance solo. The same thing happens in a companion molecule called FAD. Together, cryptochrome and FAD, both with unpaired electrons, are known as a “radical pair”. Magnetic fields act upon the unpaired electrons and govern how long it takes for the radical pair to revert to its normal, inactive state. And because cryptochrome affects the sensitivity of a bird’s retina, so do magnetic fields.

Ideas of the Century: Film as Philosophy

Havi Carel and Greg Tuck in The Philosophers’ Magazine:

Film studies scholars have always drawn on philosophical ideas. Philosophers, and in particular those working on aesthetics and philosophy of art, have been interested in cinema for as long as it has existed. However, film as philosophy as an autonomous sub-discipline is relatively new, emerging in the 80s and coming into its own over the past five years. The 00s have seen the emergence of extraordinary interest and a large number of publications focusing on the conjunction of film and philosophy. This is not to say that it is a well-defined field of enquiry or one that has broad agreement amongst its practitioners on what exactly it is and what it should be doing. This lack of agreement is what, in part, contributes to the richness of this sub-discipline.

Cognitivist film theorists appeal to philosophy of mind and perception and even neuroscience to analyse the experience of film viewing. Wittgensteinians, such as Cavell, have linked film and representation to the general problem of scepticism. Other philosophers, such as Deleuze, Adorno and Baudrillard, have each inspired a different range of film-philosophical understandings. What this diverse work shares are the questions of what film can bring to philosophy, how it can broaden our understanding of philosophical activity as going beyond the written and spoken word, and whether this practice will transform our views of what philosophy is.

Much of the debate has focused on exactly what philosophers can do with film. On the simplest level, film can be used to illustrate existing philosophical ideas. The Matrix has often been used to demonstrate sceptical arguments about the nature of our perceptions and the reality of the external world. But increasingly, philosophers see film as not just illustrating but advancing philosophical views. Thus we can see Woody Allen’s Crimes and Misdemeanours as playing out the classical problem of inequity: how can an evil man flourish, while a righteous man suffers?

A stronger view still is advanced by Stephen Mulhall in his book On Film, in which he claims that films can actually do philosophy in a way that is as cogent and detailed as philosophical texts.

Religion, Science, and the Humanities: An Interview with Barbara Herrnstein Smith

Over at The Immanent Frame:

NS [Nathan Schneider]: Natural Reflections has been the subject of a lively debate (here and here) on Stanley Fish’s blog at The New York Times. Have you found the exchange productive?

BHS: One-shot retorts, or seesaw exchanges on blogs, are rarely models of intellectually productive discussion, but Stanley Fish’s columns attract thoughtful readers, and I found the responses to his column on Natural Reflections instructive. Two related anxieties were repeatedly voiced on the basis of Fish’s description of my evenhanded—or, in fact, determinedly symmetrical—treatment of religious beliefs and what we take as scientific knowledge. One is that I am flattening out important differences between them. The other is that I’m refusing to take a stand on a major issue of our time, and thus—wittingly or unwittingly—giving aid and comfort to the wrong side.

The first of these worries is unwarranted. While I locate the differences between “science” and “religion” on multiple levels, I don’t diminish either the significance of such differences or the stakes that may be involved in identifying them accurately.

The second worry is, I think, misplaced in principle, and reflects increasingly oversimplified public views of science, religion, and the relations between them. Most of the commentators anxious about what side the book comes out on are concerned, I think, about such issues as the promotion of creationist ideas in science classes, or the clerical condemnation of contraceptive devices or homosexuality—that is, public issues in which noisy literalist convictions clash with established scientific accounts, or where informed secular attitudes are confronted by uncompromising ecclesiastic doctrine. Such concerns are understandable and I share them. But taking a clear stand on such issues does not require choosing sides between Science and Religion, conceived as monolithic adversaries in an epic battle.

Reading Nussbaum in the Balkans

Justin Smith over at his blog:

I still have fantasies about being an anthropologist, but I have to admit I would be terrible at it. I don't mind being an observer, but to be a participant-observer, that's a bit too much to ask of me. Forget about living for years among rainforest-dwelling, insectivorous hunter-gatherers: I have trouble passing a single week in a provincial Romanian town, surviving on nothing but traditional home cooking (even though it's cooked with love). I prefer my meals meatless, largely uncooked, heavily based on imported and exotic fruits, grains, and pulses. Now that I am back in Bucharest, whenever I see a restaurant that advertises food that is 'just like home', I think to myself: Well, in that case, never mind.

But still, the questions that anthropologists ask, if not the field investigations they undertake, attract me more than ever. This much was driven home to me after a leisurely morning of reading recently, during which I alternated between the eminent moral philosopher Martha Nussbaum's recent work in defense of same-sex marriage, and the eminent anthropologist Jack Goody's The Theft of History, a learned tirade against the remnants of Eurocentrism in the writing of world history. The contrast was stark: in the latter case, there was a thinker at work, surveying the range of possible ways human beings do in fact organize societies and attempting to draw general conclusions from these data about the nature of human social existence as such. In the former case, there was a thinker at work, surveying the prevailing opinions of her small community (educated, liberal Westerners), and then attempting to come up with a priori arguments in defense of them.

Friday Poem

“This is the first line. . .”

This is the first line. This line is meaningless.
And this is the second line, in which you’re no longer yourself,
which means you aren’t the person from the first line,
and now you aren’t even who you were
in lines two and three, and four, and additionally

in five. This poem is life, I do everything
to be myself in every line, so that every line
by some miracle bends back to me, meanwhile you,
whether you want to or not, must live this life and in

the last line, as close to the end as possible,
make the grade, the subject of which will be
you. You’ll only survive if you admit

that the poem spoke of God. The last line will come
however faster
than

by Tadeusz Dabrowski
from Agni, 2009
translation from Polish: Jennifer Carter-Zielińska

Antibody Building: Does Tapping the Body’s Other Immune System Hold the Key to Fending Off HIV Infection?

From Scientific American:

Scientists at the National Institutes of Health have identified long-sought and elusive broadly neutralizing antibodies to HIV in a pair of papers published in the July 9 issue of Science. These proteins produced by the innate immune system are crucial for creating a preventive vaccine, and could also have therapeutic uses developed in the coming years or decades. Variations in individuals' innate and adaptive immune systems can dramatically affect responses to infection—HIV is no exception. The result generally can be shown as a bell curve, with a group of people whose disease progresses rapidly, a broad middle segment who progress typically, and a small group of “elite controllers” whose immune systems are quite effective at containing HIV viral replication. The quest to figure out why has focused primarily on the adaptive immune system, because CD4+ and CD8+ T cells have a clearly demonstrated capacity to kill cells infected with HIV. But that response only arises days, weeks, or even months after a person has been exposed to HIV and the virus has integrated itself into cellular DNA, establishing lifelong infection. The adaptive immune response can only contain an established infection; it cannot prevent that infection from occurring in the first place.

The innate immune system is the first line of defense against infection. It attacks at the initial exposure to a pathogen, and can prevent the establishment of infection—and HIV is no exception. But there are a number of reasons why it has proved difficult to identify components of the innate immune response that can neutralize the deadly virus. HIV transmission is not very efficient. Exposed persons may avoid infection for a variety of mechanical (barrier) and biological reasons, such as the virus's failure to penetrate the surface of mucosal tissue or dendritic cells' difficulties in latching onto the virus to carry it to a lymph node. So it is challenging to conclusively identify the contribution of a specific innate immune response that can prevent an initial infection. Over the years, it has become clear that there are factors other than CD4+ and CD8+ T cells that help to control the virus in at least a portion of those infected with HIV.

Researchers have identified several antibodies that can neutralize the virus.

More here.

Interesting environment wards off cancer

From Nature:

Stress has acquired a bad image as a contributor to disease, but a little stress may be no bad thing. Mice raised in a complex environment providing social interactions, opportunities to learn and increased physical activity are less likely to get cancer, and better at fighting it when they do, a new study suggests. A mild boost in stress hormones seems to be what keeps the cancer at bay by switching on a molecular pathway that restrains tumour growth.

Researchers from the United States and New Zealand injected mice with melanoma cells, the deadliest form of skin cancer. After six weeks, mice raised in an enriched environment (extra-large cages housing 20 individuals, with running wheels and other toys) had tumours that were almost 80% smaller than those in mice raised in standard housing (five animals to a cage with no additional stimulation). Whereas all the normally housed mice developed tumours, 17% of the mice from the enriched environment developed no tumours at all. Tests in mice with colon cancer showed the same effect.

More here.

Totaalvoetbal is dead

Like all soccer writers, I have a debilitating nostalgic streak, and like all soccer writers, I love Holland. The Dutch, who face Spain in Sunday’s World Cup final, are soccer’s most gorgeous losers, a team defined by a single generation of players who brilliantly failed to reach their potential. The Dutch teams of the 1970s—led by the mercurial Johan Cruyff, who’s widely considered the greatest European player of all time—launched a tactical revolution, played one of the most thrilling styles of their era, and lost two consecutive World Cup finals in memorable and devastating ways. In the process, they became the icons of soccer romantics who would rather see teams play beautifully and lose than win and be boring. That’s a harsh legacy for any team that just wants to take home trophies, and this year’s Dutch squad is trying hard to transcend it. The dreams of millions of fans are riding on their success. Personally, I hope they fail. The legend of Dutch soccer begins, and inevitably ends, with Totaalvoetbal: “total football.” The Dutch haven’t really played total football in years; their current World Cup team is constructed more in opposition to the system than in line with it. But embraced or resisted, it’s the idée fixe that looms over everything they do.

more from Brian Phillips at Slate here.

Bierced

On June 23, 1864, Ambrose Bierce was in command of a skirmish line of Union soldiers at Kennesaw Mountain in northern Georgia. He’d been a soldier for three years, and in that time had been commended by his superiors for his efficiency and bravery during battle. He’d been pretty fortunate so far. Three years of hard fighting—on the ground and wielding a rifle—without serious injury. But that day in June a Confederate marksman shot Ambrose Bierce right in the head. The bullet fractured Bierce’s temporal lobe and got stuck in his skull, behind his left ear. He was sent by railroad to Chattanooga for medical care, riding along with other wounded soldiers on an open flat car for two days, their bodies covered by nothing more than a tarp. They rode this way through the June heat of Georgia, as well as drizzling rains. At night the “bright cold moonlight” gave Bierce jarring headaches. Somehow, none of this killed Ambrose Bierce—one of American literature’s great stubborn bastards. In fact, nobody knows how Bierce died. In 1913, at the age of 71, he traveled from Northern California to Mexico. He wanted to check out all the ruckus Pancho Villa had been causing. Bierce departed and was never seen or heard from again. By then he’d become famous as the caustic columnist for, among others, William Randolph Hearst’s San Francisco Examiner.

more from Victor LaValle at The Nation here.

the masterful visual component of an elaborate and profound theater piece

Arshile Gorky is a pivotal but enigmatic figure in the history of Modern Art — specifically in the alleged shifting of the narrative center of Capital-A Art from Paris to New York somewhere around World War II. Gorky was a quintessential example of American self-reinvention: a figurehead to the ab-ex pioneers in his spongelike eclecticism and existential heroicism, but at the same time a haunted European cast from the Old Master mold — with a psyche rooted in peasantry, Catholicism and genocide, and almost pathologically addicted to biographical fabrication. When Gorky was working the Manhattan art world of the 1930s and ’40s, nobody knew that he was a survivor of the Armenian genocide. Nobody knew he was born Vosdanig Adoian and was not, as he claimed, related to Russian writer Maxim Gorky (whose real name in any case was Aleksey Peshkov). Nobody knew that he had never received the professional training he claimed, and was, in fact, largely self-taught through study of reproductions in library books and visits to public museums. The man they knew as Gorky was, arguably, Vosdanig Adoian’s greatest artistic creation — an evolving pastiche of behaviors, narratives and props coalescing into something approximating the persona of The Great Artist — as envisioned by an untutored immigrant’s imagination and molded by the inchoate expectations of the emerging East Coast cultural elite.

more from Doug Harvey at the LA Weekly here.

Thursday, July 8, 2010

Axiomatic Equality: Rancière and the Politics of Contemporary Education

Nina Power in Eurozine:

Is it necessary to presuppose the intellectual equality of those you teach? To be an educator at all it seems likely that one would have at least an implicit theory of mind, such that one knows what one is doing (or, at least, what one aspires to be doing) when standing at the front of the classroom. Is education merely the transplanting of gobbets of information onto the blank slate of a student's mind (we could call this the Lockean approach), or are we drawing out forms of rational and creative capacity possessed (equally?) by students qua rational beings? Jacques Rancière contributes much to this debate, particularly in his work on the unusual educator Joseph Jacotot in The Ignorant Schoolmaster. This paper attempts to analyse the possibility of what could be called the “utopian rationalism” of Jacotot (and of Rancière himself), within the context of the modern university. Rancière's work will be read alongside that of Pierre Bourdieu and Ivan Illich as other crucial figures in the understanding of the way in which educational achievement relates to certain assumptions about what teaching involves. Ultimately, it may be that the modern university is antithetical to any possibility of establishing true equality among its players – Rancière's position at times invokes the possibility of a radically de-institutionalized autodidacticism that predicates all learning merely on the basis of the will of those desiring to learn. This stance is the very opposite of the Lockean approach, which emphasizes the passivity of the student-receiver. Can the contemporary university bear the weight of Rancière's challenge?

Zogby vs. Silver

John Zogby's opening salvo at Nate Silver:

To date you have many fans. But the real scrutiny is just beginning and some fans are ephemeral. Here is some advice from someone who has been where you are today.

Don't Create Standards You Will Find Hard to Maintain Yourself. You are hot right now – using an aggregate of other people's work, you got 49 of 50 states right in 2008. I know how it is to feel exhilarated. I get the states right a lot too. But remember that you are one election away from being a mere mortal like the rest of us. We very good pollsters have missed some. They tell me you blew the Academy Awards and your projections in the 2010 U.K. elections were a tad squidgy. So be humble and continue to hone your craft. Be aware that some of your legions who adore you today and hang on your every word will turn their guns on you in a minute. Hey, I have been right within a few tenths of a percent – but you are a probabilities guy and even a 95% confidence level and a margin of sampling error are not enough for some.

Nate Silver's response:

Mr. Zogby, I think you may be mistaking me for my Wikipedia page. I don't really spend a lot of time touting my accomplishments or resting on my laurels — there are no marketing materials of any kind on this site. I'm a process-oriented guy, not a results-oriented guy, because as you mention, there's a tremendous amount of luck involved in making any sort of predictions. In the long run, if an unskilled forecaster gets something right 50 percent of the time, a skilled forecaster might get something right 55 or 60 percent of the time. There are very, very few exceptions to that, in politics or in any other discipline. So when we get something right, we usually just move on with our lives rather than brag about it. And when we get something wrong, we'll usually do a post-mortem and try to figure out if we were unlucky or stupid, but not wallow in self-pity.

Now, I'm certainly not going to pretend that we take an attitude of austere academic humility toward everything that we do. We're happy to engage both our friends and our critics in lively arguments, and we can be sarcastic and combative at times. I have a background in competitive, adrenaline-intensive disciplines like poker, policy debate, and sportswriting, and that attitude has become hardwired by now.

Andrew Gelman also weighs in here.

Thursday Poem

Yellow Dress

Port-au-Prince
……………………….
Girl on a heap of street sweepings high
as a pyre, laid on snarled wire & dented rim.
Girl set down among the wrung-out hides.
A girl who was coming from church. It is late
Sunday afternoon. Was it a seizure? Is it
destiny or bad luck we should fear? Weak heart
or swerving taxi? In Tet Bef by the dirty ocean
thousands crush past her without pausing
at the shrine of her splayed limbs; brilliance
like the flesh of lilies sprouting from the pummeled cane.
Is it possible to be lighthearted, hours later?
Days? To forget the yellow dress?
I am waiting for her mother to find her, still
wearing one white spotless glove (where is the other?),
my idle taxi level with her unbruised arm,
her fingers just curling like petals of a fallen flower
and how did it end? Let someone have gathered her up
before the stars assembled coldly overhead:
her dress brighter than gold, crocus, the yolk of an egg
her face covered like the bride of a god; let them
have found her & borne her though the traffic's clamor
veiled with a stranger's handkerchief.

by Amy Beeder
from Poetry, Vol. 185, No. 3, December 2004
publisher: Poetry, Chicago

Parenting Makes People Miserable. What Else Is New?

From The Atlantic:

It's easy to mock New York magazine's recent article on “why parents hate parenting.” So many of its points seem obvious: Children decrease romance between spouses, diminish one's social life, and can be unholy terrors. (Jennifer Senior, its author, relates an awful-sounding interlude in which her son dismantled a wooden garage and then proceeded to pelt her with the pieces of it as she made repairs.) Still, there's one conclusion Senior makes that merits a bit of skepticism. She suggests that the hatred of parenting is recent, and raises “the possibility that parents don't much enjoy parenting because the experience of raising children has fundamentally changed.” In some important ways, it has. But the complaints raised by the piece aren't new at all; in fact, people—women, most notably—have been voicing them for the better part of the last 60 years.

In The Second Sex, Simone de Beauvoir wrote of the mother who finds that “her child by no means provides that happy self-fulfillment that has been promised her.” Instead, when this woman is busy, and “particularly when she is occupied with her husband,” she finds that “the child is merely harassing and bothersome. She has no leisure for 'training' him; the main thing is to prevent him from getting into trouble; he is always breaking or tearing or dirtying and is a constant danger to objects and to himself.” Adrienne Rich opened her 1976 book on motherhood, Of Woman Born, with one of her own journal entries, in which she noted that her children “cause [her] the most exquisite suffering… the murderous alternation between bitter resentment and raw-edged nerves, and blissful gratification and tenderness.” Later in the book, she would go on to argue that a mother of eight who dismembered her two youngest children and laid them on the lawn as “a sacrifice” was not precisely crazy, just fed up.

More here.

On Caregiving

From Harvard Magazine:

In 1966, as a visiting medical student at a London teaching hospital, I interviewed a husband and wife, in their early twenties, who had recently experienced a truly calamitous health catastrophe. On their wedding night, in their first experience of sexual intercourse, a malformed blood vessel in the husband’s brain burst, leaving him with a disabling paralysis of the right side of his body. Stunned and guilt-ridden, the couple clutched hands and cried silently as they shared their suffering with me. My job was to get the neurological examination right and diagnose where the rupture had taken place. I remember the famous professor, who went over my findings, repeating the neurological examination and putting me through my paces as a budding diagnostician. He never once alluded to the personal tragedy for the sad lovers and the shock to their parents. Finally, I found the courage to tell him that I thought the failure to address what really mattered to them—how to live their lives together from here on—was unacceptable. Surely, it was our medical responsibility to offer them some kind of caregiving and hope for the future. He smiled at me in a surprised and patronizing way; then he said I was right to insist that there was more to this case than the neurological findings. The professor had the patient and his wife brought to the lecture hall where he presided over the teaching rounds, and he gave them as sensitive an interview as I might have hoped for, including empathic suggestions for rehabilitation, family counseling, and social-work assistance.

More here.

India arrives in the American imagination

TWO WINTERS AGO, making their way through Bombay’s Victoria Terminus, two young Muslim men became famous. One was Jamal Malik, a fictional orphan in a movie. The other was Ajmal Amir Kasab, said to be from a clan of butchers in small-town Pakistani Punjab. The former, in the closing scenes of the film, is weedy, gawky; lets his mouth hang open in a pantomime of nervous exhilaration; is newly rich from winning a game show; kisses the girl and then dances with the rest of the cast. The latter, in the most widely seen photo, is stout, steroidful; wears too-short cargo pants, a knockoff VERSACE T-shirt, two backpacks filled with ammunition and snacks; carries a double-banana-clipped AK-47 that blurs as it swings through the depopulated space before him. On May 6, 2010, he was sentenced to be hanged.

more from Rafil Kroll-Zaidi at Triple Canopy here.

pariah status in perpetuity

From cries of “Long live dynamite!” to arguments for vegetarianism, the anarchist cause has been a very broad church. Often naive and under-theorized – although it has always had highly intelligent proponents and sympathizers, a current example being Noam Chomsky – anarchism has also been dogged by a reputation for ill-directed violence, leading to what Alex Butterworth describes as “the movement’s pariah status in perpetuity”. Although The World That Never Was is an unashamedly popular book and concentrates on the more lurid end of the anarchist tendency, Butterworth at least tries to treat his pariah subjects with a counterbalancing sympathy. United – if at all – by a resistance to imposed authority, the characters here range from the almost Tolstoyan figure of Peter Kropotkin to the far wilder François Koenigstein, better known as Ravachol. Disgusted by Thomas Huxley’s 1888 Darwinian essay “The Struggle for Existence”, Kropotkin was the great theorist of Mutual Aid who had a soft spot for the rabbit as a species, admiring it as “the symbol of perdurability [that] stood out against selection”. Ravachol, on the other hand, began his career by disinterring an old woman’s corpse, murdered a ninety-five-year-old man, and then embarked on a terror bombing campaign which some commentators romanticized for the perpetrator’s “courage, his goodness, his greatness of soul”.

more from Phil Baker at the TLS here.