Bierced

On June 23, 1864, Ambrose Bierce was in command of a skirmish line of Union soldiers at Kennesaw Mountain in northern Georgia. He’d been a soldier for three years, and in that time had been commended by his superiors for his efficiency and bravery during battle. He’d been pretty fortunate so far. Three years of hard fighting—on the ground and wielding a rifle—without serious injury. But that day in June a Confederate marksman shot Ambrose Bierce right in the head. The bullet fractured Bierce’s temporal bone and got stuck in his skull, behind his left ear. He was sent by railroad to Chattanooga for medical care, riding along with other wounded soldiers on an open flat car for two days, their bodies covered by nothing more than a tarp. They rode this way through the June heat of Georgia, as well as drizzling rains. At night the “bright cold moonlight” gave Bierce jarring headaches. Somehow, none of this killed Ambrose Bierce—one of American literature’s great stubborn bastards. In fact, nobody knows how Bierce died. In 1913, at the age of 71, he traveled from Northern California to Mexico. He wanted to check out all the ruckus Pancho Villa had been causing. Bierce departed and was never seen or heard from again. By then he’d become famous as the caustic columnist for, among others, William Randolph Hearst’s San Francisco Examiner.

more from Victor LaValle at The Nation here.

the masterful visual component of an elaborate and profound theater piece

Arshile Gorky is a pivotal but enigmatic figure in the history of Modern Art — specifically in the alleged shifting of the narrative center of Capital-A Art from Paris to New York somewhere around World War II. Gorky was a quintessential example of American self-reinvention: a figurehead to the ab-ex pioneers in his spongelike eclecticism and existential heroicism, but at the same time a haunted European cast from the Old Master mold — with a psyche rooted in peasantry, Catholicism and genocide, and almost pathologically addicted to biographical fabrication. When Gorky was working the Manhattan art world of the 1930s and ’40s, nobody knew that he was a survivor of the Armenian genocide. Nobody knew he was born Vosdanig Adoian and was not, as he claimed, related to Russian writer Maxim Gorky (whose real name in any case was Aleksey Peshkov). Nobody knew that he had never received the professional training he claimed, and was, in fact, largely self-taught through study of reproductions in library books and visits to public museums. The man they knew as Gorky was, arguably, Vosdanig Adoian’s greatest artistic creation — an evolving pastiche of behaviors, narratives and props coalescing into something approximating the persona of The Great Artist — as envisioned by an untutored immigrant’s imagination and molded by the inchoate expectations of the emerging East Coast cultural elite.

more from Doug Harvey at the LA Weekly here.

Axiomatic Equality: Rancière and the Politics of Contemporary Education

Nina Power in Eurozine:

Is it necessary to presuppose the intellectual equality of those you teach? To be an educator at all it seems likely that one would have at least an implicit theory of mind, such that one knows what one is doing (or, at least, what one aspires to be doing) when standing at the front of the classroom. Is education merely the transplanting of gobbets of information onto the blank slate of a student's mind (we could call this the Lockean approach), or are we drawing out forms of rational and creative capacity possessed (equally?) by students qua rational beings? Jacques Rancière contributes much to this debate, particularly in his work on the unusual educator Joseph Jacotot in The Ignorant Schoolmaster. This paper attempts to analyse the possibility of what could be called the “utopian rationalism” of Jacotot (and of Rancière himself), within the context of the modern university. Rancière's work will be read alongside that of Pierre Bourdieu and Ivan Illich as other crucial figures in the understanding of the way in which educational achievement relates to certain assumptions about what teaching involves. Ultimately, it may be that the modern university is antithetical to any possibility of establishing true equality among its players – Rancière's position at times invokes the possibility of a radically de-institutionalized autodidacticism that predicates all learning merely on the basis of the will of those desiring to learn. This stance is the very opposite of the Lockean approach, which emphasizes the passivity of the student-receiver. Can the contemporary university bear the weight of Rancière's challenge?

Zogby vs. Silver

John Zogby's opening salvo at Nate Silver:

To date you have many fans. But the real scrutiny is just beginning and some fans are ephemeral. Here is some advice from someone who has been where you are today.

Don't Create Standards You Will Find Hard to Maintain Yourself. You are hot right now – using an aggregate of other people's work, you got 49 of 50 states right in 2008. I know how it is to feel exhilarated. I get the states right a lot too. But remember that you are one election away from being a mere mortal like the rest of us. We very good pollsters have missed some. They tell me you blew the Academy Awards and your projections in the 2010 U.K. elections were a tad squidgy. So be humble and continue to hone your craft. Be aware that some of your legions who adore you today and hang on your every word will turn their guns on you in a minute. Hey, I have been right within a few tenths of a percent – but you are a probabilities guy and even a 95% confidence level and a margin of sampling error are not enough for some.

Nate Silver's response:

Mr. Zogby, I think you may be mistaking me for my Wikipedia page. I don't really spend a lot of time touting my accomplishments or resting on my laurels — there are no marketing materials of any kind on this site. I'm a process-oriented guy, not a results-oriented guy, because as you mention, there's a tremendous amount of luck involved in making any sort of predictions. In the long run, if an unskilled forecaster gets something right 50 percent of the time, a skilled forecaster might get something right 55 or 60 percent of the time. There are very, very few exceptions to that, in politics or in any other discipline. So when we get something right, we usually just move on with our lives rather than brag about it. And when we get something wrong, we'll usually do a post-mortem and try to figure out if we were unlucky or stupid, but not wallow in self-pity.

Now, I'm certainly not going to pretend that we take an attitude of austere academic humility toward everything that we do. We're happy to engage both our friends and our critics in lively arguments, and we can be sarcastic and combative at times. I have a background in competitive, adrenaline-intensive disciplines like poker, policy debate, and sportswriting, and that attitude has become hardwired by now.

Andrew Gelman also weighs in here.

Thursday Poem

Yellow Dress

Port-au-Prince
……………………….
Girl on a heap of street sweepings high
as a pyre, laid on snarled wire & dented rim.
Girl set down among the wrung-out hides.
A girl who was coming from church. It is late
Sunday afternoon. Was it a seizure? Is it
destiny or bad luck we should fear? Weak heart
or swerving taxi? In Tet Bef by the dirty ocean
thousands crush past her without pausing
at the shrine of her splayed limbs; brilliance
like the flesh of lilies sprouting from the pummeled cane.
Is it possible to be lighthearted, hours later?
Days? To forget the yellow dress?
I am waiting for her mother to find her, still
wearing one white spotless glove (where is the other?),
my idle taxi level with her unbruised arm,
her fingers just curling like petals of a fallen flower
and how did it end? Let someone have gathered her up
before the stars assembled coldly overhead:
her dress brighter than gold, crocus, the yolk of an egg
her face covered like the bride of a god; let them
have found her & borne her though the traffic's clamor
veiled with a stranger's handkerchief.

by Amy Beeder
from Poetry, Vol. 185, No. 3, December
publisher: Poetry, Chicago, 2004

Parenting Makes People Miserable. What Else Is New?

From The Atlantic:

It's easy to mock New York magazine's recent article on “why parents hate parenting.” So many of its points seem obvious: Children decrease romance between spouses, diminish one's social life, and can be unholy terrors. (Jennifer Senior, its author, relates an awful-sounding interlude in which her son dismantled a wooden garage and then proceeded to pelt her with the pieces of it as she made repairs.) Still, there's one conclusion Senior makes that merits a bit of skepticism. She suggests that the hatred of parenting is recent, and raises “the possibility that parents don't much enjoy parenting because the experience of raising children has fundamentally changed.” In some important ways, it has. But the complaints raised by the piece aren't new at all; in fact, people—women, most notably—have been voicing them for the better part of the last 60 years.

In The Second Sex, Simone de Beauvoir wrote of the mother who finds that “her child by no means provides that happy self-fulfillment that has been promised her.” Instead, when this woman is busy, and “particularly when she is occupied with her husband,” she finds that “the child is merely harassing and bothersome. She has no leisure for 'training' him; the main thing is to prevent him from getting into trouble; he is always breaking or tearing or dirtying and is a constant danger to objects and to himself.” Adrienne Rich opened her 1976 book on motherhood, Of Woman Born, with one of her own journal entries, in which she noted that her children “cause [her] the most exquisite suffering… the murderous alternation between bitter resentment and raw-edged nerves, and blissful gratification and tenderness.” Later in the book, she would go on to argue that a mother of eight who dismembered her two youngest children and laid them on the lawn as “a sacrifice” was not precisely crazy, just fed up.

More here.

On Caregiving

From Harvard Magazine:

In 1966, as a visiting medical student at a London teaching hospital, I interviewed a husband and wife, in their early twenties, who had recently experienced a truly calamitous health catastrophe. On their wedding night, in their first experience of sexual intercourse, a malformed blood vessel in the husband’s brain burst, leaving him with a disabling paralysis of the right side of his body. Stunned and guilt-ridden, the couple clutched hands and cried silently as they shared their suffering with me. My job was to get the neurological examination right and diagnose where the rupture had taken place. I remember the famous professor, who went over my findings, repeating the neurological examination and putting me through my paces as a budding diagnostician. He never once alluded to the personal tragedy for the sad lovers and the shock to their parents. Finally, I found the courage to tell him that I thought the failure to address what really mattered to them—how to live their lives together from here on—was unacceptable. Surely, it was our medical responsibility to offer them some kind of caregiving and hope for the future. He smiled at me in a surprised and patronizing way; then he said I was right to insist that there was more to this case than the neurological findings. The professor had the patient and his wife brought to the lecture hall where he presided over the teaching rounds, and he gave them as sensitive an interview as I might have hoped for, including empathic suggestions for rehabilitation, family counseling, and social-work assistance.

More here.

India arrives in the American imagination

TWO WINTERS AGO, making their way through Bombay’s Victoria Terminus, two young Muslim men became famous. One was Jamal Malik, a fictional orphan in a movie. The other was Ajmal Amir Kasab, said to be from a clan of butchers in small-town Pakistani Punjab. The former, in the closing scenes of the film, is weedy, gawky; lets his mouth hang open in a pantomime of nervous exhilaration; is newly rich from winning a game show; kisses the girl and then dances with the rest of the cast. The latter, in the most widely seen photo, is stout, steroidful; wears too-short cargo pants, a knockoff VERSACE T-shirt, two backpacks filled with ammunition and snacks; carries a double-banana-clipped AK-47 that blurs as it swings through the depopulated space before him. On May 6, 2010, he was sentenced to be hanged.

more from Rafil Kroll-Zaidi at Triple Canopy here.

pariah status in perpetuity

From cries of “Long live dynamite!” to arguments for vegetarianism, the anarchist cause has been a very broad church. Often naive and under-theorized – although it has always had highly intelligent proponents and sympathizers, a current example being Noam Chomsky – anarchism has also been dogged by a reputation for ill-directed violence, leading to what Alex Butterworth describes as “the movement’s pariah status in perpetuity”. Although The World That Never Was is an unashamedly popular book and concentrates on the more lurid end of the anarchist tendency, Butterworth at least tries to treat his pariah subjects with a counterbalancing sympathy. United – if at all – by a resistance to imposed authority, the characters here range from the almost Tolstoyan figure of Peter Kropotkin to the far wilder François Koenigstein, better known as Ravachol. Disgusted by Thomas Huxley’s 1888 Darwinian essay “The Struggle for Existence”, Kropotkin was the great theorist of Mutual Aid who had a soft spot for the rabbit as a species, admiring it as “the symbol of perdurability [that] stood out against selection”. Ravachol, on the other hand, began his career by disinterring an old woman’s corpse, murdered a ninety-five-year-old man, and then embarked on a terror bombing campaign which some commentators romanticized for the perpetrator’s “courage, his goodness, his greatness of soul”.

more from Phil Baker at the TLS here.

Krasznahorkai

“All that is transitory is but a parable.” (Goethe, Faust II)

This line, meant by Goethe to indicate that our worldly lives are but symbols for a greater, permanent afterlife, carries with it ambiguities that Mahler never considered when he used it rather clumsily at the climax of his Eighth Symphony. If we are all Christians, how easy to dispose of the travails of this life by casting them as imperfections of a greater, lesser-known world. But if we do not know that world, how do we construct that parable, and how do we sustain it in the face of reality’s constant resistance to conforming to it? This is the question that the Hungarian author László Krasznahorkai pursues in his fiction. In the post-war years, many European authors, especially those from Communist states, engaged in surrealism, parable, and allegory as a way of containing the mid-century chaos that spilled over from the war, where the psychology and rationality of modernism no longer seemed capable of fighting the irrationality of Nazism and Communism. While there have been some stunning works by Ludvik Vaculik (The Guinea Pigs), Bohumil Hrabal (I Served the King of England, Too Loud a Solitude), Imre Kertész (Detective Story, Liquidation), and others, this general approach has more frequently produced limp sentimentality and disposable weirdness (Milan Kundera and Victor Pelevin spring to mind). Within their own works, Günter Grass and Ismail Kadare have met with both success and disaster plowing this field.

more from David Auerbach at The Quarterly Conversation here.

The Future of Ethnic Studies

Gary Y. Okihiro in The Chronicle of Higher Education:

On May 11, 2010, less than a month after signing SB 1070, which many people hold legalizes racial profiling, Arizona's Gov. Jan Brewer signed HB 2281 into law. That law bans schools from teaching classes that are designed for students of a particular ethnic group or that promote resentment, ethnic solidarity, or overthrow of the U.S. government. “Public school pupils should be taught to treat and value each other as individuals and not be taught to resent or hate other races or classes of people,” it reads.

According to Tom Horne, the state's superintendent of public instruction and one of the bill's principal sponsors, the law was aimed at Chicano studies as taught in the Tucson school system. He called the program “harmful and dysfunctional.” Judy Burns, president of the Tucson Unified School District's governing board, disagreed, declaring that Chicano studies benefits students by promoting critical thinking.

The caricatures and falsehoods implied in the language of HB 2281 and in the arguments in its favor are as old as the field of ethnic studies, of which Chicano studies is a part. And while the Arizona law deals with primary and secondary schools, the issue is very much alive in higher education as well. There, too, ethnic studies, now almost half a century old, is facing threats: from budget cuts that often hit the smallest and newest programs first, from scholars who have transformed ethnic studies into multiculturalism and the study of difference, from critics who say ethnic studies is divisive—and from ethnic studies itself.

In light of the “culture wars” of the 1980s and 90s, the arguments of Arizona's political leaders appear positively old-fashioned. They say that ethnic studies has been created only by and for particular racial groups, and that it promotes hatred of whites and minority-group solidarity. Thus the “harmful” and “dysfunctional” nature of ethnic studies is allegedly that it creates social cleavages where, presumably, none existed before. Those battles were waged and resolved years ago—in favor of multiculturalists. Even former advocates of a single national culture now agree that the United States is and has always been a diverse nation, and that its study, accordingly, must reflect that fact.

Reconsidering Birthright Citizenship

Will Wilkinson makes a case against birthright citizenship in the United States, in The Week:

Even as Arizona continues to distinguish itself as America's undisputed leader in hare-brained xenophobia, the state has stumbled upon a very good idea. Hot on the heels of SB 1070, the controversial Arizona law that hands cops expansive powers to detain anybody who gives off an insufficiently American vibe, Republican lawmakers in the state have set their sights on a new state law to deny citizenship to babies born on American soil whose parents lack proper papers.

Currently, anyone born within U.S. boundaries counts as a U.S. citizen, and it doesn’t matter a bit how mom got in. The proposal to end “birthright citizenship” for the children of unauthorized immigrants springs from less than generous motives, and almost surely runs afoul of the U.S. Constitution. But ending it altogether is a better idea than you might think. (And if you already think it's a good idea, it's good for reasons you might find surprising.) For one, it would likely achieve the opposite of its intended result by making America more, rather than less, welcoming to newcomers.

Mothers Who Care Too Much

Over at the Boston Review, a debate: Nancy J. Hirschmann makes the case for the proposition that “Stay-at-home mothering is bad for mothers, their kids, and women’s equality.” Shannon Hayes, Ann Friedman, and Lane Kenworthy respond. (Other responses to come.) Hirschmann:

Since 1986 I have been teaching “Introduction to Feminist Political Thought.” In 2003 something unusual happened in the course. In each of the first five classes, my students initiated a discussion of mothering.

Surprised by this development, I asked the students if they expected to have children. Every woman’s hand went up, but the men thought I was crazy. When I asked how many of the women expected to be stay-at-home mothers, three-quarters raised their hands. Mothering, they said, is the most important job anyone could do. They wanted other options available, but they planned to choose mothering. This pattern has held, more or less, in subsequent years.

Some feminists will cheer this development. Significant trends within feminism, grouped under the label of care feminism, have long emphasized the socially important work that women do rearing children. I have pursued such arguments in my own work, but lately I have grown worried that feminists such as me have exaggerated the importance of care and ignored the inadequate ways in which it is often performed. We have failed to acknowledge that the louder we applaud it, the more we enable its perversion.

We hear a lot about the evils of working mothers, how they are too busy or selfish to pay attention to their children. And everyone loves to pile on rich men’s wives who are obsessed with getting their children into the right preschool yet consign them to the care of nannies. But we don’t often talk—either within the academy or outside of it—about the comparable failings of full-time mothering, about the women like Susan’s and Anthony’s mothers who devote their lives to caring for their families, while producing outcomes that arguably undermine such basic political values as freedom, equality, and engaged citizenship.

The students in my feminist theory course are a useful barometer. When they read about the financial and economic vulnerability of married stay-at-home moms, they are skeptical: good mothers, they say, devote themselves to caring for their children; and their husbands should support them; they’re a team. Yet when they read about women on public assistance—often single mothers—they argue that the women should work, and they excoriate them for being bad mothers who set a poor example for their children by not working for a wage, and, implicitly, for not hanging on to their husbands.

Their lack of empathy and identification is both breathtaking and remarkably consistent.

Why We Talk to Terrorists

Scott Atran and Robert Axelrod over at Edge:

In our own work on groups categorized as terrorist organizations, we have detected significant differences in their attitudes and actions. For example, in our recent interactions with the leader of the Palestinian militant group Islamic Jihad, Ramadan Shallah (which we immediately reported to the State Department, as he is on the F.B.I.’s “most wanted” list), we were faced with an adamant refusal to ever recognize Israel or move toward a two-state solution.

Yet when we talked to Khaled Meshal, the leader of Hamas (considered a terrorist group by the State Department), he said that his movement could imagine a two-state “peace” (he used the term “salaam,” not just the usual “hudna,” which signifies only an armistice).

In our time with Mr. Meshal’s group, we were also able to confirm something that Saudi and Israeli intelligence officers had told us: Hamas has fought to keep Al Qaeda out of its field of influence, and has no demonstrated interest in global jihad. Whether or not the differences among Al Qaeda, Islamic Jihad, Hamas and other violent groups are fundamental, rather than temporary or tactical, is something only further exploration will reveal. But to assume that it is invariably wrong to engage any of these groups is a grave mistake.

In our fieldwork with jihadist leaders, foot soldiers and their associates across Eurasia and North Africa, we have found huge variation in the political aspirations, desired ends and commitment to violence.

Wednesday Poem

Self Portrait in a Men's Room Mirror

Moustache: roan — red flecked with grey.
Aquiline nose: from some Roman Gaius
who slipped off crested helmet, greaves and boots,
to pleasure some Semitic Ruth.

Eyes: subdued, muddy blue, dark bags below
all packed and ready to go.
Lines: one for every woe —
six divorces, ten runaway horses.

But nothing, nothing left to comb:
I'd die for a parted red sea of hair,
to toss about, fiddle with, and braid.

I figure, girls go wild for men with manes.
Or so I'm told. That's what I hear.

What else? A mole. A zit. That's it.

by Norbert Hirschhorn
from Anon Seven, 2010

Noam Chomsky interview

From The Telegraph:

In an almost empty hotel bar, around the corner from the British Museum, an 81-year-old American professor is sipping tea and talking in a monotone so muted I wonder whether he is having me on. I soon conclude that he isn’t; that he doesn’t do jokes; that he, Noam Chomsky, does not, in fact, possess a sense of humour. Sacha Baron Cohen came to the same conclusion when, as Ali G, he asked Chomsky: ‘How many words does you know, and what is some of them?’ Chomsky didn’t even smile; he simply informed his interviewer how many words the average Westerner knows, and then, as requested, revealed what is some of them. Baron Cohen’s question may have been amusing but it wasn’t entirely random. Chomsky found global fame in the Sixties, in the unlikely field of linguistics. He more or less founded the discipline, becoming to it what Freud became to psychoanalysis and Einstein to cosmology.

In contradiction of the prevailing ‘behaviourist’ view that language was learned, Chomsky argued that the human mind is actually hard-wired for grammatical thought. The way children successfully acquire their native language in so little time suggested, for him, that the structures of language were innate, rather than acquired, and that all languages shared common underlying rules. This he called Universal Grammar, but don’t worry, I won’t be testing you later, and linguistics is not what this interview is about. Although I should perhaps add that the debate about language has moved on since Chomsky’s theories in the Sixties. And Chomsky has moved on, too. In fact he is better known these days as a political activist. The man the American Right love to hate. The American Left aren’t exactly wild about him either. As a self-styled anarchist and Enlightenment liberal, he collects political enemies the way sticky paper collects flies. You somehow imagine that a man with his rhetorical clout and reputation will have a booming voice, or at least some basic oratory skills. Yet here he is, barely 4ft away from me, and I am straining to hear him. It’s nothing to do with his age or health – he is a slender, fit-looking, slightly stooped man with greying wavy hair, a diffident manner and a tendency to glance sideways at you through wire-rimmed glasses.

More here.

Skip the Small Talk: Meaningful Conversations Linked to Happier People

From Scientific American:

Feeling down? Having a stimulating conversation might help, according to a new study published in Psychological Science. Researchers at the University of Arizona and Washington University in St. Louis used unobtrusive recording devices to track the conversations of 79 undergraduate students over the course of four days. They then counted the conversations and determined how many were superficial versus substantive, based on whether the information exchanged was banal (“What do you have there? Popcorn?”) or meaningful (“She fell in love with your dad? So, did they get divorced soon after?”). They also assessed subjects’ overall well-being by having them fill out questionnaires and by asking their friends to report on how happy and content with life they seemed.

The happiest subjects spent 70 percent more time talking than the unhappiest subjects, which suggests that “the mere time a person spends in the presence of others is a good predictor of the person’s level of happiness,” says co-author Matthias Mehl, a psychologist at Arizona. The happiest subjects also participated in a third as much small talk and had twice as many in-depth conversations as the most unhappy participants.

More here.

see no evil

Terry Eagleton has written a book about evil in order to demonstrate that there is no such thing. Evil, he writes, is boring, supremely pointless, lifeless, philistine, kitsch-ridden, and superficial. Lacking any substance, it “is not something we should lose too much sleep over.” People can be wicked, cruel, and indifferent. But the concept of evil, with which theologians and philosophers have wrestled for centuries, can be safely tucked away. When it comes to evil, we must be social and economic realists. “Most violence and injustice are the result of material forces, not of the vicious dispositions of individuals.” On a subject that does not exist, Eagleton nonetheless has found a great deal to say. This should come as no surprise. Widely known for books refuting what he confidently proclaimed in his earlier ones, Eagleton is not one to let a seeming contradiction stand in the way of strongly declared convictions. Perhaps this explains why his Marxist musings seem so obligatory, tacked on to the end of a book that primarily deals with writers pondering the many ways we are disobedient to God and his commands. Eagleton believes as fervently in everything as he does in nothing. On Evil is theology without a supreme being. As much as he wants to hold onto class struggle, Eagleton cannot let go of the catechism.

more from Alan Wolfe at TNR here.