Visible Young Man

In the NYT, a review of Colson Whitehead's Sag Harbor:

Now that we’ve got a post-black president, all the rest of the post-blacks can be unapologetic as we reshape the iconography of blackness. For so long, the definition of blackness was dominated by the ’60s street-fighting militancy of the Jesses and the irreverent one-foot-out-the-ghetto angry brilliance of the Pryors and the nihilistic, unrepentantly ghetto, new-age thuggishness of the 50 Cents. A decade ago they called post-blacks Oreos because we didn’t think blackness equaled ghetto, didn’t mind having white influencers, didn’t seem full of anger about the past. We were comfortable employing blackness as a grace note rather than as our primary sound. Post-blackness sees blackness not as a dogmatic code worshiping at the altar of the hood and the struggle but as an open-source document, a trope with infinite uses.

The term began in the art world with a class of black artists who were adamant about not being labeled black artists even as their work redefined notions of blackness. Now the meme is slowly expanding into the wider consciousness. For so long we were stamped inauthentic and bullied into an inferiority complex by the harder brothers and sisters, but now it’s our turn to take center stage. Now Kanye, Questlove, Santigold, Zadie Smith and Colson Whitehead can do blackness their way without fear of being branded pseudo or incognegro.

So it’s a perfect moment for Whitehead’s memoiristic fourth novel, “Sag Harbor,” a coming-of-age story about the Colsonesque 15-year-old Benji, who wishes people would just call him Ben. He’s a Smiths-loving, Brooks Brothers-wearing son of moneyed blacks who summer in Long Island and recognize the characters on “The Cosby Show” as kindred spirits.



Sunday Poem

Found
Ron Koertge

My wife waits for a caterpillar
to crawl onto her palm so she
can carry it out of the street
and into the green subdivision
of a tree.

Yesterday she coaxed a spider
into a juicier corner. The day
before she hazed a snail
in a half-circle so he wouldn’t
have to crawl all the way
around the world and be 2,000
years late for dinner.

I want her to hurry up and pay
attention to me or go where I
want to go until I remember
the night she found me wet
and limping, felt for a collar
and tags, then put me in
the truck where it was warm.

Without her I wouldn’t
be standing here in these
snazzy alligator shoes.

A Queen for the Ages

From The Washington Post:

More than two millennia after it took place, the story of Cleopatra has lost none of its grip on the world's imagination. It has inspired great plays (Shakespeare, Shaw and Sardou), novels, poems, movies (Elizabeth Taylor!), works of art, musical compositions both serious (Handel and Samuel Barber) and silly (“Comin' Atcha,” by Cleopatra), and of course histories and biographies. Yet for all this rich documentation and interpretation, it remains at least as much legend and mystery as historical record, which has allowed everyone who tells it to play his or her own variations on the many themes it embraces.

The latest to take it on is Diana Preston, a British writer of popular history. On the evidence of “Cleopatra and Antony,” I'd say she's a thoroughgoing pro. Her research is careful and deep; her prose is lively and graceful; her sympathy for her central character is strong but wholly without sentimentality; her depiction of the worlds in which Cleopatra lived is detailed, textured and evocative. If there is a better book about Cleopatra for today's reader, I don't know what it is.

She calls her book “Cleopatra and Antony,” thus reversing the order as immortalized by Shakespeare. History and legend have usually given priority to the two great men in the Egyptian queen's life, Julius Caesar and Mark Antony, but Preston argues that “Cleopatra perhaps deserves first place” because “her tenacity, vision and ambition would have been remarkable in any age but in a female ruler in the ancient world they were unique.” She was “a charismatic, cultured, intelligent ruler,” yet thanks to the propaganda put about by Octavian — later the Emperor Augustus but in the fourth decade B.C. Mark Antony's rival for control of the Roman Empire — she “was transformed into a pleasure-loving houri, the very epitome of fatal beauty and monstrous depravity, bent on bringing animal gods, barbarian decadence and despotism to the sacred halls of Rome's Capitol.”

More here.

Why can’t we concentrate?

Laura Miller in Salon:

Here's a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you're finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a Blackberry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan's famous “problem that has no name,” this creeping distractibility and the technology that presumably causes it has inspired such cris de coeur as Nicholas Carr's much-discussed “Is Google Making Us Stupid?” essay for the Atlantic Monthly and diatribes like “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” a book published last year by Mark Bauerlein.

You don't have to agree that “we” are getting stupider, or that today's youth are going to hell in a handbasket (by gum!) to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic's headline writers as a salvo against the ubiquitous search engine) reported feeling the change “most strongly” while he was reading. “Immersing myself in a book or a lengthy article used to be easy,” he wrote. “Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text.” For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDB filmography of a supporting actor is well-nigh irresistible, and once I'm at the computer, why not check e-mail? Most of the time, I'll wind up pausing the DVD player before the end of the movie and telling myself I'll watch the rest tomorrow.

More here.

The Big Similarities & Quirky Differences Between Our Left and Right Brains

Carl Zimmer in Discover Magazine:

There is nothing more humbling or more perception-changing than holding a human brain in your hands. I discovered this recently at a brain-cutting lesson given by Jean-Paul Vonsattel, a neuropathologist at Columbia University. These lessons take place every month in a cold, windowless room deep within the university’s College of Physicians and Surgeons. On the day I visited, there were half a dozen brains sitting on a table. Vonsattel began by passing them around so the medical students could take a closer look. When a brain came my way, I cradled it and found myself puzzling over its mirror symmetry. It was as if someone had glued two smaller brains together to make a bigger one.

Vonsattel then showed us just how weak that glue is. He took back one of the brains and used a knife to divide the hemispheres. He sliced quickly through the corpus callosum, the flat bundle of nerve fibers that connects the halves. The hemispheres flopped away from each other, two identical slabs of fleshy neurons.

Sometimes surgeons must make an even more extreme kind of slice in the brain of a patient. A child may suffer from epilepsy so severe that the only relief doctors can offer is to open up the skull and cut out the entire hemisphere in which the seizures start. After the surgery, the space soon fills with cerebrospinal fluid. It may take a child a year of physical therapy to recover from losing a hemisphere—but the fact that patients recover at all is stunning when you consider that they have only half a brain. It makes you wonder what good two hemispheres are in the first place.

More here.

After the Great Recession

President Obama discusses how his policies on schools, energy and health care might change daily life in America.

David Leonhardt in the New York Times Magazine:

Are there tangible ways that Wall Street has made the average person’s life better in the way that Silicon Valley has?

THE PRESIDENT: Well, I think that some of the democratization of finance is actually beneficial if properly regulated. So the fact that large numbers of people could participate in the equity markets in ways that they could not previously — and for much lower costs than they used to be able to participate — I think is important.

Now, the fact that we had such poor regulation means — in some of these markets, particularly around the securitized mortgages — means that the pain has been democratized as well. And that’s a problem. But I think that overall there are ways in which people have been able to participate in our stock markets and our financial markets that are potentially healthy. Again, what you have to have, though, is an updating of the regulatory regimes comparable to what we did in the 1930s, when there were rules that were put in place that gave investors a little more assurance that they knew what they were buying.

More here.

Genius: The Modern View

David Brooks in the New York Times:

Some people live in romantic ages. They tend to believe that genius is the product of a divine spark. They believe that there have been, throughout the ages, certain paragons of greatness — Dante, Mozart, Einstein — whose talents far exceeded normal comprehension, who had an other-worldly access to transcendent truth, and who are best approached with reverential awe.

We, of course, live in a scientific age, and modern research pierces hocus-pocus. In the view that is now dominant, even Mozart’s early abilities were not the product of some innate spiritual gift. His early compositions were nothing special. They were pastiches of other people’s work. Mozart was a good musician at an early age, but he would not stand out among today’s top child-performers.

What Mozart had, we now believe, was the same thing Tiger Woods had — the ability to focus for long periods of time and a father intent on improving his skills. Mozart played a lot of piano at a very young age, so he got his 10,000 hours of practice in early and then he built from there.

More here.

Saturday, May 2, 2009

J. G. Ballard, 1930-2009

In The Independent:

J G Ballard, the award-winning writer best known for his autobiographical novel Empire of the Sun, has died at his home in Shepperton, aged 78, after a long illness. He had been unwell “for several years”, said his agent, Margaret Hanbury. He had prostate cancer.

“J G Ballard has been a giant on the world literary scene for more than 50 years,” said Ms Hanbury, who was his agent for 25 of them. “His acute and visionary observation of contemporary life was distilled into a number of brilliant, powerful novels which have been published all over the world and saw Ballard gain cult status.”

James Graham Ballard was regularly labelled a writer of science fiction, but maintained he was “picturing the psychology of the future”. He earned the rare distinction of appearing as an adjective – “Ballardian” – in the Collins English Dictionary, referring to “dystopian modernity, bleak man-made landscapes and the psychological effects of technological, social or environmental developments”.

A Biocentric Theory of The Universe

Robert Lanza and Bob Berman make their case in Discover:

According to biocentrism, time does not exist independently of the life that notices it. The reality of time has long been questioned by an odd alliance of philosophers and physicists. The former argue that the past exists only as ideas in the mind, which themselves are neuroelectrical events occurring strictly in the present moment. Physicists, for their part, note that all of their working models, from Isaac Newton’s laws through quantum mechanics, do not actually describe the nature of time. The real point is that no actual entity of time is needed, nor does it play a role in any of their equations. When they speak of time, they inevitably describe it in terms of change. But change is not the same thing as time.

To measure anything’s position precisely, at any given instant, is to lock in on one static frame of its motion, as in the frame of a film. Conversely, as soon as you observe a movement, you cannot isolate a frame, because motion is the summation of many frames. Sharpness in one parameter induces blurriness in the other. Imagine that you are watching a film of an archery tournament. An archer shoots and the arrow flies. The camera follows the arrow’s trajectory from the archer’s bow toward the target. Suddenly the projector stops on a single frame of a stilled arrow. You stare at the image of an arrow in midflight. The pause in the film enables you to know the position of the arrow with great accuracy, but you have lost all information about its momentum. In that frame it is going nowhere; its path and velocity are no longer known. Such fuzziness brings us back to Heisenberg’s uncertainty principle, which describes how measuring the location of a subatomic particle inherently blurs its momentum and vice versa.
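
For reference, the uncertainty relation that the excerpt describes in words can be written in its standard form (this formula is not part of the Discover piece itself):

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]

where \(\Delta x\) is the uncertainty in a particle's position, \(\Delta p\) the uncertainty in its momentum, and \(\hbar\) the reduced Planck constant. Pinning one quantity down sharply necessarily blurs the other, which is what the paused-frame arrow analogy above illustrates.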

Liberalism, Past and Future

George Scialabba reviews Alan Wolfe's The Future of Liberalism in The Nation:

Wolfe's account of liberalism's substantive commitments is straightforward and persuasive–much the best part of the book. The conservative and libertarian enemies of liberalism have squandered so much wealth and welfare, blighted so many lives, that it is always satisfying to see them intellectually routed yet again. Unfortunately, Wolfe does not stop there. He sees liberalism's enemies, or unreliable friends, everywhere and feels bound to scold them all. Wolfe's spiritual home is The New Republic, and he manifests the same complacent centrism as most of its regular writers (though not–for better and worse–the snarky wit and verbal edge that make the magazine at once irresistible and insufferable). Half The Future of Liberalism is valuable affirmation; the other half is an ideological Syllabus of Errors.

The first and most dangerous heresy that Wolfe rebukes from the pulpit–“the single most influential illiberal current of our time”–is evolutionary psychology. The attempt to view human behavior in Darwinian perspective amounts to “nothing short of a determined campaign to reduce human beings and their accomplishments to insignificance.” According to these anti-humanists, humans “rarely accomplish very much independent of what nature has bequeathed to them”; culture is a “side effect,” a “by-product,” just “one more way in which nature imposes its designs upon us.” All this, Wolfe protests, radically undermines liberal morale. Liberalism is about choice and purpose, but the aim of evolutionary psychology “is to show that leading any kind of life we think we are choosing is impossible.”

If science really and truly discredited liberalism, then the only honest response would be: so much the worse for liberalism. But, of course, it does not. The distinction between nature and culture that Wolfe brandishes so menacingly is far more subtle and tenuous than he recognizes. His version, like the obsolete distinction between body and soul, implies that we cannot be both purely physical and meaningfully moral. And yet we are. Whatever “free will” means, it does not mean that choices are uncaused.

Saturday Poem

Rhetorical Figures
Tom Christopher

When a sentence is composed of two independent
clauses, the second being weaker than the first,
it is called One-Legged Man Standing. If it
purposefully obscures meaning, it’s called Ring
Dropped in Muddy Creek, or if elegantly composed,
Wasp Fucking Orchid. There are words behind words,
and half the time our thought spraying out like water
from a hose, half the time banging inside our heads
like a wren in a house. When a sentence ends
unexpectedly because someone has punched
the speaker in the face, it’s Avalanche Sudden.
When instead the speaker is stopped with sloppy
kisses, it’s Dripping Cloud. Not to be confused
with Dripping Cone, when someone overturns
the table, or Bird Pecking the Mountain, when
the sentence goes on for an hour and a half and ends
in a shaking death. If the speaker lies in the driveway
so drunk on cheap wine that one listening cannot
get close to the meaning and thus runs away again,
claiming, “For the last time,” it’s Pregnant Dog
Cooked in Sun. If the speaker sells everything for
an old convertible and drives out into the desert
with unintelligible shouting to the pissed-off stars:
Aching Stones Laughing. Forced incongruent words
are Fishes on Fire, and are beautiful but bring us
no closer to the Truth or the Cosmos or the All,
so either we tour Europe looking for the bodies
of saints or drink all night playing Johnny Cash LPs.
Everything we have said, we have said all our lives.
Same for what we haven’t said. Learning the terms
doesn’t help, we’re still filled over the rim with longing.
Already in this room there is Clamshell Moon, Barn
House Burning, Cow Lowing the Field, One Hundred
Village Bells, Moth Flurry. Somewhere above, a Torn
Shirt, a Peasant Girl Crying, a Baby Dropped Through
Smoke to Voices Shouting. Not much further a Cat
in Heat, a Wailing Street, and in the end Tree Frogs
Blazing Reeds with Sound.

from: Best American Poetry 2006; Scribner Poetry, NY

Fordham Law Class Collects Personal Info About Scalia; Supreme Ct. Justice Is Steamed

Martha Neil in the ABA Journal:

Last year, when law professor Joel Reidenberg wanted to show his Fordham University class how readily private information is available on the Internet, he assigned a group project. It was collecting personal information from the Web about himself.

This year, after U.S. Supreme Court Justice Antonin Scalia made public comments that seemingly may have questioned the need for more protection of private information, Reidenberg assigned the same project. Except this time Scalia was the subject, the prof explains to the ABA Journal in a telephone interview.

His class turned in a 15-page dossier that included not only Scalia's home address, home phone number and home value, but his food and movie preferences, his wife's personal e-mail address and photos of his grandchildren, reports Above the Law.

And, as Scalia himself made clear in a statement to Above the Law, he isn't happy about the invasion of his privacy:

“Professor Reidenberg's exercise is an example of perfectly legal, abominably poor judgment. Since he was not teaching a course in judgment, I presume he felt no responsibility to display any,” the justice says, among other comments.

More here.

Untangling the Brain

From Harvard Magazine:

Modern neuroscience rests on the assumption that our thoughts, feelings, perceptions, and behaviors emerge from electrical and chemical communication between brain cells: that whenever we recognize a face, read the newspaper, throw a ball, engage in a conversation, or recall a moment in childhood, a pattern of activity in our neurons makes such feats possible. It’s a tenet of modern biology that sparks fascination—and disbelief. How can a tangle of cells produce the complexity and subtlety of a mind?

Answering that question has always been propelled—and limited—by the available technologies. Accessing the living brain is difficult, and yet studying neurons outside their normal context can’t tell us how they work together normally. Today, using anatomical studies and technologies like functional magnetic resonance imaging, scientists can finally observe large-scale patterns of how the brain is organized. In animals, they have begun to map out networks of neurons responsible for processes like vision and smell. And detailed studies of individual neurons have revealed much about how they “fire” electrically in response to inputs from other neurons, or release neurotransmitters to communicate with one another. But one of the most difficult questions in neuroscience is how to connect these two scales: how do individual neurons link to one another in networks that somehow result in complex brain functions?

More here.

A long, loving literary line

From The Guardian:

Kamila Shamsie always wanted to be a writer – just like the three generations of women before her. Now shortlisted for the Orange prize, she pays homage to their courage and their craft.

In the dying days of the British Raj, over a family meal in Lucknow, a young Indian man, greatly influenced by communist and Marxist thought while at Oxford in the 1930s, launched forth with his political ideas. The subject at hand: Lenin and the Soviets. His mother, at the other end of the table, leaned forward. “If there's something wrong with the linen and serviettes,” she said, “let me know. I'll attend to it.” The young man was my maternal grandfather, and the story is the stuff of family legend. As a child I loved the humour of that tale – now I am startled by the picture it forms of my great-grandmother as a traditional figure of 1940s Indian womanhood, unable to step outside the domestic, even when faced with a communist dictator. You certainly wouldn't guess from that story, or necessarily from photographs of my great-grandmother (which show a tiny woman, her head covered with a dupatta), that she was a politician and the first of four generations of women writers in my family.

I grew up among my mother's remarkable collection of books – made more remarkable by how difficult it could be in the 1980s to get hold of Anglophone literature in Karachi. She was generally quite happy to let me work my way through her bookshelf, but every so often in my teenage years she would direct me towards particular writers – Salman Rushdie, Peter Carey, Anita Desai, Kazuo Ishiguro.

More here.

Pakistan is Already an Islamic State

Ali Eteraz in Dissent:

A recent sharia-for-peace deal between militant groups and the civilian government in Pakistan’s quasi-autonomous Swat region has ignited interest in the status of Islamic law in Pakistan. The U.S. State Department, concerned about terrorist safe-havens, called the deal a “negative development.” Meanwhile, Fareed Zakaria of Newsweek, trying to look at the bright side of things, argued that the deal might drive a wedge between “violent” radicals and those that are “merely extreme.”

Both of these views, rooted in the “war on terror” frame of thinking, diagnose Pakistan’s relationship with Islam incorrectly. The real issue in Pakistan is not that from time to time a group of militants, while demanding the implementation of sharia, begins attacking civilians. This, while deplorable and painful, is a consequence of Pakistan’s constitution. The essential problem in Pakistan is its flawed constitutional framework, which forces every citizen to refer to their idiosyncratic and personal views on life through the lens of “Islam.” Such a state of affairs has the effect of concealing every political, material and economic demand behind theological verbiage, and that situation ultimately favors religious hard-liners and militants who are willing to use violence.

Pakistan will not be rid of such religion-based conflict until it addresses the problem of its 1973 Constitution. That document’s constitutional Islamization engenders a cultural competition over who controls Islam—a conflict which, thanks to the Soviet war in Afghanistan and then 9/11, has become politicized, militarized, and weaponized.

More here.

In Karachi, a company that makes 2,000 fetish and bondage products operates next to a mosque

Adam B. Ellick in the New York Times:

In Pakistan, a flogger is known only as the Taliban’s choice whip for beating those who defy their strict codes of Islam.

But deep in the nation’s commercial capital, just next door to a mosque and the offices of a radical Islamic organization, in an unmarked house two Pakistani brothers have discovered a more liberal and lucrative use for the scourge: the $3 billion fetish and bondage industry in the West.

Their mom-and-pop-style garment business, AQTH, earns more than $1 million a year manufacturing 2,000 fetish and bondage products, including the Mistress Flogger, and exporting them to the United States and Europe.

The Qadeer brothers, Adnan, 34, and Rizwan, 32, have made the business into an improbable success story in a country where bars are illegal and the poor are often bound to a lifetime in poverty.

If the bondage business seems an unlikely pursuit for two button-down, slightly awkward, decidedly deadpan lower-class Pakistanis, it is. But then, discretion has been their byword. The brothers have taken extreme measures to conceal a business that in this deeply conservative Muslim country is as risky as it is risqué.

More here. [Thanks to Sheheryar Salim.]

Friday, May 1, 2009

Trouble with Strangers

Over at Notre Dame Philosophical Reviews, Gopal Balakrishnan reviews Terry Eagleton's Trouble with Strangers: A Study of Ethics:

Trouble with Strangers is the latest reflection on ethics from the distinguished literary critic Terry Eagleton. Best known for his witty and sympathetic popularizations of critical theory, Eagleton once claimed that its main contemporary currents — Marxism, deconstruction, and psychoanalysis — were devoid of practical ethical corollaries, thereby contributing to a wider condition of relativistic malaise. Eagleton was probably the only proponent of post-structuralism at the time to be troubled by this. But even if it were true that the main variants of post-structuralism once exhibited such nihilist tendencies, one could say that they have all subsequently sought to make amends, for after their 80's heyday, all of them made a turn towards ethical inquiry, along lines laid out by Levinas, Derrida and the late Foucault. Indeed, even Lacanians like Žižek who dismiss the post-structuralist ethics of difference and undecidability, invariably do so in ethical terms. It might also be noted that this development seems to conform to a broader trend that has brought us bio-ethics, professional ethics, and ethical consumerism. Eagleton has addressed this broader ideological context in other works, but in Trouble with Strangers, he limits himself to providing a comparative overview of some of the different conceptions of ethics coming out of that mixture of continental intellectual traditions that is still called 'Theory'.

From the Left, A Debate on Capitalism, Social Democracy and Democratic Socialism

First, Sheri Berman in Dissent (via bookforum):

Helping people adjust to capitalism, rather than engaging in a hopeless and ultimately counterproductive effort to hold it back, has been the historic accomplishment of the social democratic left, and it remains its primary goal today in those countries where the social democratic mindset is most deeply ensconced. Many analysts have remarked, for example, on the impressive success of countries like Denmark and Sweden in managing globalization—promoting economic growth and increased competitiveness even as they ensure high employment and social security. The Scandinavian cases demonstrate that social welfare and economic dynamism are not enemies but natural allies. Not surprisingly, it is precisely in these countries that optimism about globalization is highest. In the United States and other parts of Europe, on the other hand, fear of the future is pervasive and opinions of globalization astoundingly negative. American leftists must try to do what the Scandinavians have done: develop a program that promotes growth and social solidarity together, rather than forcing a choice between them. Concretely this means agitating for policies—like reliable, affordable, and portable health care; tax credits or other government support for labor-market retraining; investment in education; and unemployment programs that are both more generous and better incentivized—that will help workers adjust to change rather than make them fear it.

Read more »