The Advantages of Closing a Few Doors

From The New York Times:

Xiang Yu was a Chinese general in the third century B.C. who took his troops across the Yangtze River into enemy territory and performed an experiment in decision making. He crushed his troops’ cooking pots and burned their ships. He explained this was to focus them on moving forward — a motivational speech that was not appreciated by many of the soldiers watching their retreat option go up in flames. But General Xiang Yu would be vindicated, both on the battlefield and in the annals of social science research.

He is one of the role models in Dan Ariely’s new book, “Predictably Irrational,” an entertaining look at human foibles like the penchant for keeping too many options open. General Xiang Yu was a rare exception to the norm, a warrior who conquered by being unpredictably rational. Most people can’t make such a painful choice, not even the students at a bastion of rationality like MIT, where Dr. Ariely is a professor of behavioral economics. In a series of experiments, hundreds of students could not bear to let their options vanish, even though it was obviously a dumb strategy (and they weren’t even asked to burn anything). The experiments involved a game that eliminated the excuses we usually have for refusing to let go. In the real world, we can always tell ourselves that it’s good to keep options open.

More here.

Temporary Columns: OBAMA, UNGER AND I

by
Ram Manikkalingam

I sat in on a class that Obama also attended at Harvard Law School.  I believe it was the Spring or Fall of 1991.  The class was called “Re-inventing Democracy”.  It was taught by Roberto Unger, who dresses like an undertaker, lectures like a prophet, and thinks like a philosopher in a hurry.  At the time, I was doing my doctorate in political science at MIT. Students at MIT and Harvard were permitted to take classes at each other’s institution.

Unger is now the “minister of strategic affairs” in Lula’s government in Brazil.  His colleagues call him “the minister of ideas”. Unger belonged to what is known as the critical legal studies movement in law. Its adherents are leftish, Foucauldian, postmodernish, multiculturalist critics of how law has traditionally been approached in the academic (primarily), professional and political worlds.  Critical legal scholars have had more success with changing academia than the “real world”.  Still, their views are important for understanding the role of power (racial, class, gender, heterosexual, among others) in law.  In fact Unger’s first (and I believe his best) book is called Knowledge and Politics.

Roberto Unger’s own work goes beyond critical legal studies.  He has been describing a new world full of political possibilities and economic opportunities for quite some time.  He described this world then as an alternative to liberalism and Marxism.  While the world he describes remains the same, the alternatives he critiques have changed after the collapse of the Soviet Union and communist Eastern Europe. He goes after neo-liberalism on the right, and on the left he attacks what he calls “the populist authoritarian nationalist version of Latin America” associated with Chavez, and the “well behaved version of Western Europe” associated with the social democratic parties of the north Atlantic. He criticises both lefts for stifling individual and institutional creativity.

He argues instead for a world of economic and political experimentation, where the state’s function is first to provide the social and political tools (including insurance for individual and collective failures) to encourage innovation, and then to get out of the way. Innovate and experiment, till things get stuck, either because the strategy has failed, or you have come to a fork in the political road. Then let the people decide how to get unstuck through a plebiscite.  The heroic class of his theory is the petty bourgeoisie, dismissed by Marxists, and disregarded by liberals.  He believes they are the wellspring of innovation as the classic boundary-crossing group – finding new ways of surviving in an institutional and ideological environment that is inhospitable to them.  But their innovations are disregarded, dismissed or suppressed by a combination of ignorance (among those who seek emancipation through Marxism) and enmity (among those who seek wealth through capitalism). The result is a failure to harness and increase innovations that can help society progress.  Instead, Unger argues, those dissatisfied with a world moving towards a divide between rich, fat and comfortable white people, and poor, hungry and uncomfortable black ones are left with authoritarian Third Worldism and phlegmatic North Atlantic social democracy as the only available alternatives.

While it is easy to be sceptical about Unger’s capacity to translate his ideas into practical policies, there is no doubt that his work captures the disaffection many of us feel with the failures of the dominant neo-liberal model, and the uninspiring alternatives that have been presented to us. It says something appealing about Obama that he attended this class, instead of one on, say, corporate tax law, as many other Harvard law students planning a different career route probably did.  Dissatisfied with the world we inhabited, he too was struggling with ideas for a better one.

One day in the midst of all this high-minded theorising, students in this class staged a (mini) “revolt” against Unger. I do not recall exactly what sparked it off, but a student (whether planned or unplanned) took on Unger’s own commitment to democracy.  Since this was a class about re-inventing democracy in radically new ways, Unger did not discourage challenges to his ideas and queries about his approach.  This attack, however, went beyond the realm of Unger’s ideas to his personal commitment to implementing them in the very sphere he had control over – the classroom.  The charge was that while Unger talked about re-inventing democracy in the world, the class was taught in a hierarchical manner, like any other. In short, his class was run like a Latin American fiefdom, while he posed as a radical democrat.

He behaved like he knew more than we did, so the critique went.  He taught by lecturing, and we – the students – learned by trying to digest what he said. The point – at least to the extent I can recall one – was that Unger was not engaging the class in a manner that enabled them to participate more fully. He set the agenda, the content and the tenor of the discussion. And the students had to fall in line. Moreover, those revolting charged that some students seemed to speak more than others, implying that Unger was permitting a select few to dominate class discussion. And so one student after another piled on, repeating variations of the same critical theme and accusing Unger of hypocrisy.  The class ended in the middle of the uproar.

Walking back, I was bemused by the whole incident. And by the time I got to my flat, the supercilious attitude I had assumed towards my fellow students – rich and privileged members of the corporate elite-in-waiting who were posing at radicalism – had turned into disdain. What do they expect – they teach and the professor listens? How could these students be so naïve about what a classroom is? Or about who a professor is?  How else is he to teach, other than lecture, in a class with seventy-odd students? And they are the ultimate hypocrites – taking a class on re-inventing democracy while interviewing for jobs with corporate law firms.

I had never felt that Unger – or for that matter any other professor, however authoritarian and hierarchical in the classroom – was necessarily smarter than me, just by being my professor. Certainly, I acknowledged that some were. But the reason they were the professor and I the student was more pragmatic. They had already struggled with questions I was struggling with. And they (probably) had read far more books than I had in doing so. So their experience and possibly wisdom might help me navigate my own struggle with ideas a little quicker.  Did this mean that they were smarter? I was loath to admit it of those who were, and happy to deny it of those who weren’t.

The following week, I returned to class expectantly for the second act in the drama.  I was not the only one. There were many new faces in class, alongside the regulars. Word had spread that there was going to be a showdown in Unger’s classroom. So the cheap stalls were full.  And Unger began – as he always uncannily did – from the very word where he had left off the previous week.  He acknowledged the mini revolt and then proceeded to express his disagreement with its rationale.

He said that for him the “form” of the class was dictated by practical considerations.  He disagreed that just because a professor lectured and students listened, they ought to feel less smart or agree with his views. In fact, he claimed that he had always thought himself smarter than his lecturers even though he had to listen to them.  And as a student who never spoke in class, he certainly felt that those who did usually made fools of themselves rather than actually dominating discussion. He also argued there was nothing about the nature of the classroom that precluded students from disagreeing with his ideas, forming their own, or simply dismissing his altogether.  And finally he came up with the most brilliant summary of teaching approaches (in a large lecture classroom) I had heard.  Here is what he said:

“There are three forms of pedagogical discourse. The first is the no-holds-barred philosophical discourse, the chief requirements of which are infinite amounts of time and a willingness to waste it.  The second is the pseudo-Socratic method, with the illusion of freedom and the reality of structure.  Here the professor asks a question. Joe responds – wrongly, in the view of the professor. The professor says that was a very interesting answer, Joe – now can we please get on with the discussion. The third is what I do.  I present my own ideas. You then develop and sharpen your own by arguing against and critiquing mine.  I do not expect that the outcome of this process will be that you come over to mine.”

Unger then opened up the class for more comments and suggestions about what to do.  He had a little trick up his sleeve, but he wanted first to give everyone hankering for a showdown an opportunity to have a say.  One supporter piped in from the gallery saying that Unger should not be discouraged (as if he were – he was quite enjoying himself), but that “he should know, in the words of Nixon, that a silent majority was with him”.  After the tumult had settled down a bit, Obama took the stage.  He captured the mood of the outspoken minority in the class – idealistic (even if naïve) outrage about hierarchy in the university – without losing the realistic view of why we go there in the first place: there are people at a university who know more about books than we do and spend more time thinking about them than most others.  He was good, really good. Though not succinct, he was eloquent. Students quietened down and listened.  So did I. For a moment I even suspended my cynicism about the Harvard law students in the class as corporate elite-in-waiting posing as radical democrats. He finally ended his speech. There were a few other comments afterwards, but they seemed superfluous after Obama’s.

Finally Unger came in with his denouement.  He invited the students to take over the class. He asked any interested group of students to develop a syllabus, an agenda and a reading list, and visit his office and discuss it with him. He assured them that he would not just consider this input but actually work with it.  It may have been this that put students off. But in any case, anyone who has had to teach knows that developing an agenda and content for a class in a coherent, interesting and pedagogically useful way takes time and/or experience. The students had neither. None showed up in his office and we returned the next week to business as usual, much to my relief.

This minor episode (or at least my reaction to it) prefigures my response to Obama as a Presidential candidate sixteen years later.  I recall verbatim Unger’s brilliantly succinct summary of pedagogical approaches. Of Obama’s eloquent intervention, I remember the tenor but struggle to recall a single word. He is inspiring as a speaker on change. But, however much I would like to, I cannot quite shake off my doubts about him as a maker of it.

monday musing: black history month, nwa

It sounds like it might be a baritone sax. One note repeated over and over underneath the song. Low and nasty. The beat is driving and has a funky edge, set off by the little guitar riff looped over the top. The whole sound is there from the first note. No build. No games. Within the first second you’re hearing the lyrics, which come hard and relentless…

Straight outta Compton, crazy motherfucker named Ice Cube
From the gang called Niggaz With Attitudes
When I’m called off, I got a sawed off
Squeeze the trigger, and bodies are hauled off

It is hard to explain the way that song makes you feel when you first hear it: Los Angeles, 1988, coming out of the giant speakers of a low-slung Oldsmobile rolling down Pico Boulevard just after sunset. Bad Ass. Rock and Roll died that day. Whatever its other virtues, Rock and Roll was driven and sustained by one thing… badassness. But that summer in LA in the late 80s was the final straw. Bad Ass moved to Compton.

The first two songs from the album Straight Outta Compton hit the NWA formula perfectly. The sound and the mix were put together by Dr. Dre. It was mean and gritty but it always managed to stay light. Hip-hop wasn’t plodding anymore, it was leaping around like Bizet, plus a growl. Then you get the lyrical triumvirate: Ice Cube, Ren, Eazy-E. Ice Cube always had the strongest voice and the solid rhymes. You start with Ice Cube. Then Ren comes in and picks up where Ice Cube left off with a slight twist, different emphasis, stranger thoughts. And then, just when it seems that you know what to expect, comes Eazy-E. Eazy-E has a crazy high-pitched voice. It comes out of nowhere. It’s evil and funny at the same time. Plus nobody in NWA was a Bad Ass quite like Eazy-E. His first lines from Straight Outta Compton are legendary…

straight outta Compton
is a brotha that’ll smother yo’ mother
and make ya sister think I love her
Dangerous motherfucker raises hell

He’s like a maniac from some ghetto nightmare. Unbelievable. Brilliant. He is going to kill your mother and he’s going to treat your sister badly. Bad Ass. Same thing on Fuck Tha Police, the second song off the album. You get excited by Ice Cube and Ren but you’re secretly waiting for Eazy-E. And then, after a slight pause, the Eazy-E madness kicks in.

I’m tired of the muthafuckin jackin
Sweatin my gang while I’m chillin in the shackin
Shining tha light in my face, and for what
Maybe it’s because I kick so much butt

I kick ass, or maybe cuz I blast
On a stupid assed nigga when I’m playin with the trigga
Of an Uzi or an AK
Cuz the police always got somethin stupid to say

He is in extra Eazy-E whine mode for these lines and really works himself into a stunning sing-songy rhythm for the lines “cuz I blast / on a stupid assed nigga when I’m playin with the trigga.” Nobody ever had more fun than Eazy-E being an inexcusably awful person. That’s the nature of a Bad Ass. Done right, there are no excuses. There can’t be. It isn’t a moral position. It isn’t something that can be argued about, for, and against. That was what was so silly about all the debates around gangster rap. The defenses missed the point every bit as much as the denouncements did.

NWA was not great because the music “directed our attention to the real conditions in the inner city” or any such twaddle. And every attempt to attack NWA for glorifying crime and violence simply added another six figures in the “albums sold” category. You can’t beat Bad Ass with logic or politics or ethics. Bad Ass is an aesthetic category. It’s inimical to discourse. Bad Asses don’t explain themselves because there is nothing to explain.

That raises the question, I guess, of why we ever cared about Bad Asses in the first place. Why are we thrilled and excited by them, even despite ourselves? The answer is not a definitive one, I suspect, and the matter can’t be looked at dispassionately. Maybe you’re sitting on a stoop somewhere, any half-assed bungalow in the southland on a dry night with the Santa Ana winds blowing just so. You’re young and the world seems new enough still that something different might just happen. But probably it won’t. There’s the dull ache of empty desire and the vague scent of a wild fire burning itself out in one of the canyons. And then you hear the sound again, from a boom box or a car radio. The bounce of that sound, the drive in it, the thump and the relentless lyrics. Bad Ass.

NWA is satisfying in the same way as a James M. Cain novel or maybe Byron’s Don Juan. It isn’t pretty and isn’t meant to be. It’s something else. But anybody who isn’t drawn to the Bad Ass in some way is missing an essential human bone. You can’t listen to those NWA songs without feeling a moment of thrill, when the beat comes, when the lyrics blast out, whenever. It is Bad Ass pure and simple, stamped and sealed and impossible to ignore. We want the Bad Ass to blast the world apart, if only for a moment, or to deny it just for the sake of denying it. We don’t want to take up the task of being the Bad Ass ourselves, but we want somebody to be it, we want some Bad Ass out there to say fuck it all, every single bit of it.

Monday Poem

Looking for Evidence
Jim Culleny

Poor Darwin.
Forever dissed by people-of-the-book,
he rummaged through bins of bones
flinging one after another
over his shoulder
looking for a missing link.

Femurs and fibulas went flying.
Knuckles and kneecaps rained.
Disks —the pride of vertebrates—
hit walls and ricocheted like pucks
slap-shot by blood-thirsty Bruins.
The thud of ulnas and clavicles
drummed rhythms on wallboard as they hit.
They landed here and there in the dusty landscape
only to be buried again in the sands of time,
found by future anthropologists,
and dismissed once more (no matter what)
by latter-day people-of-the-book.

It’s gotta be here somewhere, murmured
Charles. Everything else so elegantly fits.

Meanwhile, at a bin to Darwin’s right
marked “Creation, Myths, and Miracles”
Reverend Pat dug in too.

He tossed a leather-bound edition
of the Epic of Gilgamesh
onto a heap in the corner which
nudged a volume of the Enuma Elish
that slid to the floor and settled
beside a story of how a flower
grew from Vishnu’s navel.

Junk, grumbled Pat. Absurd junk
that can’t hold a candle to a talking snake.

He’d been hoping for a scrap
of Genesis notarized by God
but found only a sheepskin note
inscribed “Adam and Eve
are the apples of the old man’s eye.”
Good enough for me, said Pat
and ducked as the skull of a chimp
sailed by.

The Continuity Wars

by Frans B. M. de Waal

Something curious is underfoot in the science of human vs. ape comparisons.

For a long time, we’ve been used to scientists who believe we’re totally unique. They simply don’t see humans as part of the animal kingdom, are uninterested in evolution, and indeed uninterested in any meaningful cross-species comparison. They just react with horror to any hairy creature that looks like them, the way Queen Victoria declared the apes displayed, in 1835, at the London Zoo “frightful, and painfully and disagreeably human.”

It is different now. We’re dealing with scientists who believe in evolution, claim an interest in it, and sometimes even have great expertise, yet balk at accepting mental continuity between humans and their closest relatives. Admittedly, most of them have a background in the social sciences, such as anthropology or psychology, not biology, which may explain why they argue that Charles Darwin was actually mistaken on this issue and that the cognitive gap between a human and an ape is in fact so wide that it may exceed that between an ape and a beetle.

A beetle? Have they ever seen a beetle brain next to a chimp’s?

Darwin could not have been clearer, saying in The Descent of Man: “… the difference in mind between man and the higher animals, great as it is, certainly is one of degree and not of kind.” The evolutionary framework simply has no room for saltationist arguments. Like Darwin, I am not claiming that humans possess absolutely no unique mental capacities – I am sure they do – but these capacities are merely the tip of the iceberg, and I prefer to look at the whole “berg.”

Except for a few differences at the microscopic level, the human brain is barely distinguishable from the ape brain. Its structure, neurotransmitters, and functional connections are all the same. Even our much-heralded frontal cortex turns out to be about the same size as an ape’s relative to the rest of the brain. Since we don’t assume that the human heart or liver work any differently than those of other animals, why shouldn’t this apply to the stuff between our ears? Yes, the human brain is three times larger, but this only means that it can do more, or do certain things better.

We now seem to have two schools of primate researchers: the “gradualists,” who follow Darwin on both counts (evolution and continuity), and the “exceptionalists,” who follow only half the theory. The latter propose major mental and behavioral differences, often focusing on just one that they feel explains everything that makes our species unique. Even the major scientific journals are taking sides, with Nature publishing more gradualist papers and Science more exceptionalist ones. Entire research institutes are split, such as the Max Planck Institute for Evolutionary Anthropology in Leipzig, where one director has publicly criticized another on this issue.

Claims and counter-claims arrive at a pace that must be hard to follow for the outside world. For example, a recent exceptionalist paper on how altruism is sadly absent in the apes, hence must be uniquely human, was soon followed by a gradualist correction about how altruism is alive and well in chimpanzees (see my commentary on both). Or a recent prominent paper about highly developed social learning in chimpanzees was forgotten, and in fact unmentioned, when Science published a report about the limits of chimpanzee social cognition. This prompted our recent commentary in Science – which the journal published four months later – about the best way to compare human and ape cognition.

Our main critique was that if both children and apes are tested by human experimenters, this is unfair to the apes. On the surface, the procedures look identical, but the apes are the only ones facing a species barrier. They obviously don’t relate as well to adult humans as children do. Another difference is that children often sit on or next to their parent during testing, meaning that the parent can give all sorts of unintentional cues that assist performance, whereas the apes lack this advantage. In fact, apes have been tested for decades in ways that almost guarantee underperformance.

We do have solutions to this problem. A recent study on dog cognition was conducted in the pet owner’s presence, but with the owner blindfolded. This way, they excluded unwanted influences known as “Clever Hans” effects. Shouldn’t children, too, be tested in a way that cancels parental influence?

I do think there is room for careful human/ape comparisons, and that most of the time (but not always) these will come out in favor of the primate with the larger brain. Humans are different, but not as drastically as claimed. The bigger task that we face is not to assign the gold, silver, and bronze medals of smartness in the animal kingdom, but to see what kind of processes underlie all cognition, both human and animal. Evolutionarily speaking, the more parsimonious assumption is that related species will handle similar problems in similar ways, using the same brain areas, (mirror) neurons, and connectivity.

This is something to keep in mind when the next paper comes along postulating a huge human vs. ape difference. My bet is always on the similarities, and indeed over my lifetime I have seen tons of claimed differences fall by the wayside, but rarely a claimed similarity.

A nice illustration is the work on imitation by Vicky Horner and others. Even though everyone uses the word “aping” for imitation, it was until recently held that apes are actually not good at it. Apes were said to lack “true” imitation based on the fact that, most of the time, they refuse to follow the human example. When we removed the human experimenter from the picture, however, and looked at imitation from ape to ape, all of a sudden they turned out to be excellent, faithful copiers of behavior. So, apes actually do ape!

This won’t deter the exceptionalists, however. They have already begun to turn their attention to the next major difference.

The most amusing one I have ever seen appeared in a Dutch newspaper, offered by a serious philosopher writing about man’s place in nature. He proposed that humans differ from all other animals in that only we go on vacation. A sea lion may lie on the beach, he wrote, but not with the purpose of relaxation. Only we set aside time for this.

Perhaps we should just grant him his little distinction, and not fight it, so that we can finally close this line of argument and move on to more important matters.

Trained as an ethologist and biologist, the Dutch-born Frans B. M. de Waal is C. H. Candler Professor in Psychology and Director of the Living Links Center at Emory University, in Atlanta, Georgia, USA.

A writer’s house

PD Smith at Kafka’s Mouse:

Do the houses once lived in by famous writers tell us anything about their work? After the Great War, Virginia Woolf and her husband paid £700 for Monk’s House in the Sussex village of Rodmell. It’s a simple, weather-boarded cottage beside a country lane.

Behind it was a garden and an orchard of overgrown pear and apple trees, with views over the flats of the Ouse valley. When they bought it, Monk’s House had no bath, no toilet, no hot water and just brick floors. Its previous owner had gone mad and starved himself to death. Virginia wrote: “We went to Rodmell, and the gale blew at us all day; off arctic fields; so we spent our time attending to the fire.” One morning they had to get up at 4 am to chase mice out of their bed. Today, few would put up with such conditions. But Virginia loved the cottage and her “soft grey walks” in the surrounding countryside.

More here.

A Chat With George W. Bush’s Conscience

Francis Wilkinson in Discover:

As a former chairman of George W. Bush’s President’s Council on Bioethics, Leon Kass is well acquainted with controversy, and with the treacherous terrain at the nexus of science and politics. The council, tasked with advising the president on such hot-button issues as stem cell research and cloning, has sometimes been dismissed as a vehicle for the right wing of the Republican Party. But although some of his views comport with those of hard-liners, Kass, a physician with a Ph.D. in biochemistry, is hard to pigeonhole. “I do not come from a school of thought, nor do I have an ideology,” he says.

An old-fashioned moralist, he holds some views that are remarkably unfashionable—even premodern. He still employs the term bastard to describe the children of unwed parents, and he has written despairingly about the loss of “female modesty” in our culture. At the same time, he has misgivings about the effects of global capitalism and believes in integration, tolerance, and inclusiveness. In the end, what really rankles many scientists is Kass’s belief that society has a duty to regulate research, and his frequent warnings about the dehumanizing effects of some technologies.

The recommendations of the Council on Bioethics, though substantive and scholarly, have by and large not been put into practice by policymakers, and the group’s prominence has faded as the debate about stem cell research has ground to a standoff. Kass left the council in September and currently is a fellow at the conservative American Enterprise Institute for Public Policy Research, where his office is a few paces from Lynne Cheney’s. He sat down with DISCOVER to reflect on his tenure and discuss his beliefs, his influences, and his concerns for the future.

More here.

In Thriving India, Wedding Sleuths Find Their Niche

Emily Wax in the Washington Post:

Like a lot of young Indian couples, they met on a matrimonial Web site and within a matter of weeks were picking out the wedding invitations, reserving the horse-drawn carriages and having the bride fitted for a pearl- and gold-encrusted sari.

Judging by his online profile, the groom was suitable and eager to be a good spouse: a quiet, stay-at-home kind of guy who never drank and worked as a successful software engineer. Perfect, thought the bride, a shy 27-year-old computer engineer.

Too perfect, according to Bhavna Paliwal, one of India’s wedding detectives, who are being hired here in growing numbers to ferret out the truth about prospective mates.

More here.  [Thanks to Ruchira Paul.]

A Moment of Hope

Mohsin Hamid in Time:

It has been some time since I was as happy as I was on the night after Pakistan’s Feb. 18 general election. Mine was perhaps a reckless joy, temporarily distracting me from the very real troubles that Pakistan faces. But as I spoke to friends and acquaintances, both here in London and in my hometown of Lahore, I realized that the sense of euphoria I was feeling was widespread.

Pakistan is sometimes described by the international media as the most dangerous place on the planet. That has always seemed to me to be an irresponsible exaggeration: there are other countries whose citizens are far more likely to die of violent causes. But certainly Pakistan is a troubled land, suffering from illiteracy, poverty, terrorism and the bite of rapidly increasing prices, especially of food. The Feb. 18 election has not solved those problems. Yet Pakistanis are justified in allowing themselves a sigh of relief. Indeed, the entire world should be breathing a little easier now, for Pakistan suddenly looks a lot less frightening than it did.

More here.

importantitis

Leonard Bernstein set Broadway on fire in 1957 with “West Side Story,” a jazzed-up version of “Romeo and Juliet” in which the Capulets and Montagues were turned into Puerto Rican Sharks and American Jets. It was the most significant musical of the postwar era — and the last successful work that Bernstein wrote for the stage. His next show, 1976’s “1600 Pennsylvania Avenue,” closed after seven performances. For the rest of his life he floundered, unable to compose anything worth hearing.

What happened? Stephen Sondheim, Bernstein’s collaborator on “West Side Story,” told Meryle Secrest, who wrote biographies of both men, that Bernstein developed “a bad case of importantitis.” That sums up Bernstein’s later years with devastating finality. Time and again he dove head first into grandiose-sounding projects, then emerged from the depths clutching such pretentious pieces of musical costume jewelry as the “Kaddish” Symphony and “A Quiet Place.” In the end he dried up almost completely, longing to make Great Big Musical Statements — he actually wanted to write a Holocaust opera — but incapable of producing so much as a single memorable song.

more from the Wall Street Journal here.

cimrman and other Czechisms

Actual Czechs are eminently practical, nothing magical or mystical about them, as befits the people who drink the most beer in the world. Their most curious feature, which they keep to themselves and of which the tourists know nothing, is a collective sense of humor. Consider Jára Cimrman, by popular opinion the greatest Czech who ever lived. A few years ago a Czech TV channel asked its audience to name the most beloved native son. Jára Cimrman came first, ahead of Václav Havel, founding president Masaryk, and the Emperor Charles IV. Even the fact that Cimrman was explicitly disqualified in advance did not hurt his chances. This year, when a popular Internet site angled for an alternative to the current President Václav Klaus, Cimrman, disqualified again, came second. An obvious handicap was the fact that he was allegedly last seen alive in 1914.

Jára Cimrman is, of course, a fictitious character, the brainchild of a small group of writers and actors. In the Czech version of Wikipedia he is introduced as “one of the greatest Czech playwrights, poets, musicians, teachers, adventurers, philosophers, inventors” and many other things. Some of his achievements include inventing the Paraguayan puppet show, almost becoming the first man to reach the North Pole (he apparently missed it by seven meters), and conducting a voluminous correspondence with George Bernard Shaw, who never deigned to respond.

Well, that’s funny enough, but the most striking thing about Cimrman is the favor he has found with his people.

more from Poetry Magazine here.

oscar prognosticating

The Academy Award nominees are a worthy but scattered bunch this year, and anyone who confidently tells you they know what’s going to happen is not to be trusted. I, by contrast, make a bid for your confidence by openly acknowledging that my guesses are entirely uneducated, and you could probably fare well by betting against them in your office Oscar pool.

The good news is that, with a few exceptions, the Academy seems to have screwed up less than usual. 2007 was a very strong year for film, and the Oscar nominees do a solid job of reflecting this. If there’s a major complaint to be made this year, it’s with the abstruse rules that govern eligibility in certain categories – in particular, best score and best foreign-language film. In the former category, Jonny Greenwood’s stunning, vital, utterly original score for There Will Be Blood was deemed ineligible for containing too many bits of music not written for the film, ensuring the ludicrous outcome that by far the best score of the year is not even nominated. The foreign film category is an even sorrier sight, with the year’s most celebrated offerings (Four Months, Three Weeks, and Two Days; Persepolis; The Diving Bell and the Butterfly; Lust, Caution; La Vie En Rose; The Orphanage; etc.) not making it, for one reason or another, to the “short list” of nine films from which the five finalists were chosen. (The foreign film rules, which are particularly convoluted, are explained here.) I’d especially like to put in a plug for Four Months, Three Weeks and Two Days, which I saw too late to include in my end-of-the-year list, but which would have belonged near the top. It is a marvel of cinematic intimacy, grim and unsparing yet not without hope. If The Lives of Others, the 2006 spellbinder about life behind the Iron Curtain, captured the institutional oppressions of totalitarian rule, Four Months, Three Weeks and Two Days captures the ways in which totalitarianism turns people into their own oppressors.

more from TNR here.

coetzee on nooteboom, angels, etc.

In the summer of 2003, as part of that year’s Lincoln Center Festival, members of the public were offered a guided walk around selected New York sites, beginning on Roosevelt Island and ending in the Chrysler Building. As they proceeded from site to site, they were invited to keep an eye out for angels. And at certain sites they did indeed get to see angel-actors, some with wings, some without, some gazing into the distance, some sleeping. At other sites there were merely traces of past angelic visits: feathers, for example.

The event was the brainchild of the British theater director Deborah Warner. In its first version, dating back to 1995 and as yet sans angels, it was set in a huge abandoned nineteenth-century London hotel; its goal was to evoke ghostly presences from the building’s past. In 1999 Warner presented a revised version with angels added. For the angels, said Warner, she was indebted to Rilke. “There’s a wonderful quote from Rilke which says that angels are uncertain if they are walking amongst the living or the dead.” In 2000 the revised version was exported to Perth, capital of Western Australia.

Responses of participants in the Angel Project varied widely. According to some, the presence of otherworldly beings changed the nature of their gaze, aestheticizing their view of the city. Others dismissed the project as mere Disneyfication, exploitation of a millenary craze for angels. Yet others were deeply moved. “They cried a lot,” said Warner, looking back on the 1999 London performance. “We put angels up at the top of the empty floors of the Euston Tower watching over London. And again, people’s response, terribly, terribly emotional. I think it’s about loss of innocence.”

Among visitors to the 2000 Angel Project was the Dutch writer Cees Nooteboom, in Perth to take part in the city’s arts festival. Nooteboom’s novel Lost Paradise, published in the Netherlands in 2004, draws heavily on recollections of that visit, as we shall see.

more from the NYRB here.

Ella Fitzgerald: The First Lady of Song (1917-1996)

From ellafitzgerald.com:

Dubbed “The First Lady of Song,” Ella Fitzgerald was the most popular female jazz singer in the United States for more than half a century. In her lifetime, she won 13 Grammy awards and sold over 40 million albums. Her voice was flexible, wide-ranging, accurate and ageless. She could sing sultry ballads, sweet jazz and imitate every instrument in an orchestra. She worked with all the jazz greats, from Duke Ellington, Count Basie and Nat King Cole, to Frank Sinatra, Dizzy Gillespie and Benny Goodman.

Ella Jane Fitzgerald was born in Newport News, Va. on April 25, 1917. Her father, William, and mother, Temperance (Tempie), parted ways shortly after her birth. Together, Tempie and Ella went to Yonkers, N.Y., where they eventually moved in with Tempie’s longtime boyfriend Joseph Da Silva. In 1932, Tempie died from serious injuries she received in a car accident. Ella took the loss very hard. After Ella had stayed with Joe for a short time, Tempie’s sister Virginia took her home. Shortly afterward Joe suffered a heart attack and died, and Ella’s little sister Frances joined them.

Unable to adjust to the new circumstances, Ella became increasingly unhappy and entered into a difficult period of her life. Her grades dropped dramatically, and she frequently skipped school. After getting into trouble with the police, she was taken into custody and sent to a reform school. Living there was even more unbearable, as she suffered beatings at the hands of her caretakers. Eventually Ella escaped from the reformatory. The 15-year-old found herself broke and alone during the Great Depression, and strove to endure. Never one to complain, Ella later reflected on her most difficult years with an appreciation for how they helped her to mature. She used the memories from these times to help gather emotions for performances, and felt she was more grateful for her success because she knew what it was like to struggle in life.

In 1934 Ella’s name was pulled in a weekly drawing at the Apollo and she won the opportunity to compete in Amateur Night. Ella went to the theater that night planning to dance, but when the frenzied Edwards Sisters closed the main show, Ella changed her mind. “They were the dancingest sisters around,” Ella said, and she felt her act would not compare.

Once on stage, faced with boos and murmurs of “What’s she going to do?” from the rowdy crowd, a scared and disheveled Ella made the last-minute decision to sing. She asked the band to play Hoagy Carmichael’s “Judy,” a song she knew well because Connee Boswell’s rendition of it was among Tempie’s favorites. Ella quickly quieted the audience, and by the song’s end they were demanding an encore. She obliged and sang the flip side of the Boswell Sisters’ record, “The Object of My Affection.”

In the band that night was saxophonist and arranger Benny Carter. Impressed with her natural talent, he began introducing Ella to people who could help launch her career. Fueled by enthusiastic supporters, Ella began entering – and winning – every talent show she could find. In January 1935 she won the chance to perform for a week with the Tiny Bradshaw band at the Harlem Opera House. It was there that Ella first met drummer and bandleader Chick Webb. Although her voice impressed him, Chick had already hired male singer Charlie Linton for the band. He offered Ella the opportunity to test with his band when they played a dance at Yale University. “If the kids like her,” Chick said, “she stays.”  Despite the tough crowd, Ella was a major success, and Chick hired her to travel with the band for $12.50 a week.

During this time, the era of big swing bands was shifting, and the focus was turning more toward bebop. Ella played with the new style, often using her voice to take on the role of another horn in the band. “You Have to Swing It” was one of the first times she began experimenting with scat singing, and her improvisation and vocalization thrilled fans. Throughout her career, Ella would master scat singing, turning it into a form of art. In 1938, at the age of 21, Ella recorded a playful version of the nursery rhyme “A-Tisket, A-Tasket.” The record sold 1 million copies, hit number one, and stayed on the pop charts for 17 weeks. Suddenly, Ella Fitzgerald was famous.

Perhaps in search of stability and protection, Ella married Benny Kornegay, a local dockworker who had been pursuing her. Upon learning that Kornegay had a criminal history, Ella realized that the relationship was a mistake and had the marriage annulled. While on tour with Dizzy Gillespie’s band in 1946, Ella fell in love with bassist Ray Brown. The two were married and eventually adopted a son, whom they named Ray, Jr.

At the time, Ray was working for producer and manager Norman Granz on the “Jazz at the Philharmonic” tour. Norman saw that Ella had what it took to be an international star, and he convinced Ella to sign with him. It was the beginning of a lifelong business relationship and friendship. Under Norman’s management, Ella joined the Philharmonic tour, worked with Louis Armstrong on several albums and began producing her famous songbook series. From 1956 to 1964, she recorded songbook albums devoted to the work of composers including Cole Porter, Duke Ellington, the Gershwins, Johnny Mercer, Irving Berlin, and Rodgers and Hart. The series was wildly popular, both with Ella’s fans and the artists she covered.

“I never knew how good our songs were until I heard Ella Fitzgerald sing them,” Ira Gershwin once remarked.

Ella also began appearing on television variety shows. She quickly became a favorite and frequent guest on numerous programs, including “The Bing Crosby Show,” “The Dinah Shore Show,” “The Frank Sinatra Show,” “The Ed Sullivan Show,” “The Tonight Show,” “The Nat King Cole Show,” “The Andy Williams Show” and “The Dean Martin Show.” Unfortunately, busy work schedules also hurt Ray and Ella’s marriage. The two divorced in 1952, but remained good friends for the rest of their lives.

On the touring circuit it was well-known that Ella’s manager felt very strongly about civil rights and required equal treatment for his musicians, regardless of their color. Norman refused to accept any type of discrimination at hotels, restaurants or concert halls, even when they traveled to the Deep South. Once, while in Dallas touring for the Philharmonic, a police squad irritated by Norman’s principles barged backstage to hassle the performers. They came into Ella’s dressing room, where band members Dizzy Gillespie and Illinois Jacquet were shooting dice, and arrested everyone.

“They took us down,” Ella later recalled, “and then when we got there, they had the nerve to ask for an autograph.”

Norman wasn’t the only one willing to stand up for Ella. She received support from numerous celebrity fans, including a zealous Marilyn Monroe. “I owe Marilyn Monroe a real debt,” Ella later said. “It was because of her that I played the Mocambo, a very popular nightclub in the ’50s. She personally called the owner of the Mocambo, and told him she wanted me booked immediately, and if he would do it, she would take a front table every night. She told him – and it was true, due to Marilyn’s superstar status – that the press would go wild. The owner said yes, and Marilyn was there, front table, every night. The press went overboard. After that, I never had to play a small jazz club again. She was an unusual woman – a little ahead of her times. And she didn’t know it.”

Outside of the arts, Ella had a deep concern for child welfare. Though this aspect of her life was rarely publicized, she frequently made generous donations to organizations for disadvantaged youths, and the continuation of these contributions was part of the driving force that prevented her from slowing down. In 1987, United States President Ronald Reagan awarded Ella the National Medal of Arts. It was one of her most prized moments. France followed suit several years later, presenting her with its Commander of Arts and Letters award, while Yale, Dartmouth and several other universities bestowed honorary doctorates on Ella.

By the 1990s, Ella had recorded over 200 albums. In 1991, she gave her final concert at New York’s renowned Carnegie Hall. It was the 26th time she performed there. As the effects from her diabetes worsened, 76-year-old Ella experienced severe circulatory problems and was forced to have both of her legs amputated below the knees. She never fully recovered from the surgery, and afterward, was rarely able to perform.

On June 15, 1996, Ella Fitzgerald died in her Beverly Hills home. Hours later, signs of remembrance began to appear all over the world. A wreath of white flowers stood next to her star on the Hollywood Walk of Fame, and a marquee outside the Hollywood Bowl theater read, “Ella, we will miss you.” After a private memorial service, traffic on the freeway was stopped to let her funeral procession pass through. She was laid to rest in the “Sanctuary of the Bells” section of the Sunset Mission Mausoleum at Inglewood Park Cemetery in Inglewood, Calif.

True tales from the couch

From The Guardian:

If Hanif Kureishi’s new novel has a fault, it is that its secondary characters are often so full of life that they upstage the principals – and this is a fault for which most writers would cheerfully kill. The hero, Jamal, is not only in a reflective profession – he’s a Freudian analyst – but also at a stage of midlife limbo. He’s still involved with his 12-year-old son Rafi (‘We touch fists and exchange the conventional middle-class greeting, “Yo bro – dog!”‘), but on terms of armed truce at best with his estranged wife, Josephine. No wonder the eye of the reader, that magpie, is drawn to Jamal’s rough diamond of a sister, Miriam, overweight and much pierced (‘parts of her face resembled a curtain rail’), reigning over the semi-criminal disorder of her council house, as she starts a relationship with her polar opposite, Jamal’s prissy yet wild intellectual friend Henry, a famous lapsed theatre director. From one point of view, she is the supreme distillation of various brands of bad news into a single prospective partner. On the other hand, as ‘a Muslim single mother with a history of abuse’ who has few taboos and ‘sees straight to the centre of things’, she’s practically perfect.

In addition to his practice, Jamal has a reputation as a writer of case studies, presumably of an Adam Phillips variety, aphoristic and philosophical (‘Why do you want to fail? Why is pleasure hard to bear?’). Paradox comes with the territory, since the territory is the human mind, secreting paradox incessantly. At one stage of his past, for instance, Jamal wanted to be with a woman he didn’t want, a seemingly heartless television producer, out of mourning for lost love.

More here.

Sen and Ferguson Debate the British Raj and Counterfactual History

In the TNR, Amartya Sen and Niall Ferguson debate Ferguson’s Empire. Sen’s original piece:

When the East India Company undertook the battle of Plassey and defeated the Nawab of Bengal, there were businessmen, traders, and other professionals from a number of different European nations already in that very locality. Their primary involvement was in exporting textiles and other industrial products from India, and the river Ganges (or Hughly, as it is more often called in that part of India), on which the East India Company had its settlement, also had (further upstream) trading centers and settled communities from Portugal, the Netherlands, France, Denmark, Prussia, and other European nations.

Being subjected to imperial rule is not the only way of learning things from abroad, no matter how necessary such learning may be. When the Meiji restoration established a new reformist government in Japan in 1868 (which was not unrelated to the internal political impact of Commodore Perry’s show of force in the previous decade), the Japanese went full steam into learning from the West, sending people for training in America and Europe, and making institutional changes that were clearly inspired by western experience. They globalized themselves voluntarily. They were not coercively globalized by others. The shaking of India, too, could have come in non-colonialist ways.

Ferguson’s response:

I quite agree, and have said myself, that any assessment of the costs and benefits of British rule in India needs to make the counterfactual(s) explicit. No one claims India would have stood still if there had been no 1757. With all due respect, however, Professor Sen’s counterfactual of “Meiji India” lacks plausibility. Though I have often heard it argued, the notion seems to me utterly far-fetched that India could have adopted the Japanese route to economic and political modernization.

Sen again:

I am glad that Ferguson agrees that India would not have stood still even in the absence of British conquest. But then he says: “Sen’s counterfactual of ‘Meiji India’ lacks plausibility.” “Meiji India”? But that surely is an idea of Ferguson’s, not mine. What I had, in fact, said was: “It is not easy to guess with any confidence how the history of the subcontinent would have gone had the British conquest not occurred. Would India have moved, like Japan, toward modernization in an increasingly globalizing world, or would it have stayed resistant to change, like Afghanistan, or hastened slowly, like Thailand?”

Even setting that misattribution aside, it can be asked whether Ferguson should be so sure that India could have done little of the kind that Japan did.

Survival of the Funniest

Gil Greengross reviews Rod Martin’s Psychology of Humor: An Integrative Approach in Evolutionary Psychology (you just know that sexual selection is going to be in there):

Due to the complexity of the topic, it is not surprising that hundreds of theories, varying in specificity, have been offered to explain humor. Evolutionary explanations are no exception. Although it is widely accepted that humor has an evolutionary origin, how it evolved and what evolutionary purpose it served is far from clear and is heatedly debated (Gervais and Wilson, 2005; Polimeni and Reiss, 2006). No one has yet proposed a comprehensive theory of humor, and a unitary theory may not even exist, as different aspects of humor may have different origins and purposes.

Take, for example, one of the most common explanations for the adaptive function of humor, known as the “false alarm” theory. The idea gained recent popularity after it was put forward by the famous neuroscientist Ramachandran, although it had been known for at least two decades (Chafe, 1987; Ramachandran, 1998). This theory holds that when facing an ambiguous event, laughter serves as a signal to other members of the group that the perceived threat or anomaly is in fact unimportant. Using a stereotyped vocalization such as laughter helps others to determine the non-importance of the event. Thus, they should not allocate energy towards it. While this theory has intuitive value and can explain certain aspects of humor (for example, why laughter is contagious), it is not hard to find examples that do not quite fit. One of the open secrets among humor researchers is that most laughter comes in response to trivial comments. Despite the tendency to focus on analyzing jokes, they comprise only a small portion of what humor is. Thus, the importance of humor in a social context goes far beyond the narrow definition that the “false alarm” theory seeks to explain.

But the social aspect of humor is only one lens through which it can be viewed. As Martin notes, a complete understanding of humor also involves developmental, cognitive, personality and other aspects as well.