Juan Cole’s summary of the situation in Pakistan

Juan Cole in his blog, Informed Comment:

As Pakistani President Asif Ali Zardari arrived in Washington for talks with President Obama and Afghan President Hamid Karzai, fighting intensified in Pakistan's northwest.

On Tuesday morning, the Pakistani Taliban deployed a suicide bomber to attack Pakistani security forces near Peshawar, killing 5 and wounding 9 people, among them schoolchildren who were bystanders.

WaPo says that fighting had intensified Monday in the Swat Valley between the Pakistani Taliban and government troops, as well as in Buner, the district into which they recently made an incursion and from which the government has been attempting to dislodge them. So far some 80 militants have been killed in the Buner campaign, and 20,000 civilians have been displaced.

Tony Karon at Time explains why the Pakistani military establishment disagrees with Washington's view that the Taliban are an existential threat to the Pakistani state.

Convinced that Pakistan's problems are in part rooted in economic issues, Sens. John Kerry and Dick Lugar introduced legislation Monday aimed at tripling US foreign aid to Islamabad.

Meanwhile, on the diplomatic front, Secretary of Defense Robert Gates is calling on Saudi Arabia to help Pakistan crush the Pakistani Taliban. The Saudis have developed a fear of the vigilante radicals they once supported back in the 1980s, having spent 2003-2006 suppressing them at home, so perhaps by now Gates's idea makes some sense.

More here. See also: Obama says Pakistan is toughest U.S. challenge.

A Natural History of the Flu

Carl Zimmer in the New York Times:

The current outbreak shows how complex and mysterious the evolution of viruses is. That complexity and mystery are all the more remarkable because a virus is life reduced to its essentials. A human influenza virus, for example, is a protein shell measuring about five-millionths of an inch across, with 10 genes inside. (We have about 20,000.)

Some viruses use DNA, like we do, to encode their genes. Others, like the influenza virus, use single-strand RNA. But viruses all have one thing in common, said Roland Wolkowicz, a molecular virologist at San Diego State University: they all reproduce by disintegrating and then reforming.

A human flu virus, for example, latches onto a cell in the lining of the nose or throat. It manipulates a receptor on the cell so that the cell engulfs it, whereupon the virus’s genes are released from its protein shell. The host cell begins making genes and proteins that spontaneously assemble into new viruses. “No other entity out there is able to do that,” Dr. Wolkowicz said. “To me, this is what defines a virus.”

The sheer number of viruses on Earth is beyond our ability to imagine. “In a small drop of water there are a billion viruses,” Dr. Wolkowicz said. Virologists have estimated that there are a million trillion trillion viruses in the world’s oceans.
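For scale, and purely as arithmetic on the quoted phrase (short-scale American usage, not a figure from the article):

```latex
\[
  \underbrace{10^{6}}_{\text{million}} \times
  \underbrace{10^{12}}_{\text{trillion}} \times
  \underbrace{10^{12}}_{\text{trillion}} = 10^{30}
\]
```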

More here.

Monday, May 4, 2009

Emotional Cartography: Christian Nold and William Blake

[Image: Newham “Sensory Deprivation” map]

From the indispensable psychology and neuroscience blog Mind Hacks, some information on Christian Nold and his “emotional cartography.” Writes Nold:

Bio Mapping is a community mapping project … In the context of regular, local workshops and consultations, participants are wired up with an innovative device which records the wearer's Galvanic Skin Response (GSR), a simple indicator of emotional arousal, in conjunction with their geographical location. People re-explore their local area by walking the neighbourhood with the device and on their return a map is created which visualises points of high and low arousal. By interpreting and annotating this data, communal emotion maps are constructed that are packed full of personal observations which show the areas that people feel strongly about and truly visualise the social space of a community.

Nold goes on to ask, “How will our perceptions of our community and environment change when we become aware of our own and each other's intimate body states?” That's more of a forecast than a description of his current work. He's predicting a technology that allows people to read the emotions of others in real time. His current maps essentially measure only stress, and the results are published retrospectively rather than in real time.
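For the technically curious, the recipe in Nold's description is easy to sketch: pair timestamped skin-conductance readings with GPS fixes, then flag the places where arousal spikes above the walker's own baseline. The Python below is a minimal sketch under assumed inputs; the data format, field names, and threshold are my inventions, not Nold's actual pipeline.

```python
# Minimal sketch of Nold-style "bio mapping": pair GSR readings with
# GPS fixes and flag locations of unusually high arousal.
# Data format and threshold are assumptions, not Nold's pipeline.
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float  # seconds since the start of the walk
    lat: float        # GPS latitude
    lon: float        # GPS longitude
    gsr: float        # skin conductance, in microsiemens

def arousal_hotspots(samples: list[Sample], z_cutoff: float = 1.5) -> list[Sample]:
    """Return samples whose GSR is unusually high for this walk.

    Arousal is judged against the walker's own baseline (mean and
    standard deviation over the whole walk), since absolute skin
    conductance varies widely from person to person.
    """
    mean = sum(s.gsr for s in samples) / len(samples)
    var = sum((s.gsr - mean) ** 2 for s in samples) / len(samples)
    std = var ** 0.5 or 1.0  # guard against a perfectly flat signal
    return [s for s in samples if (s.gsr - mean) / std > z_cutoff]

if __name__ == "__main__":
    walk = [
        Sample(0, 51.5074, 0.0357, 2.1),
        Sample(30, 51.5080, 0.0361, 2.3),
        Sample(60, 51.5086, 0.0365, 5.9),  # say, a stressful crossing
        Sample(90, 51.5091, 0.0370, 2.2),
    ]
    for s in arousal_hotspots(walk):
        print(f"high arousal at ({s.lat}, {s.lon}), GSR = {s.gsr}")
```

Normalising against each walker's own baseline is the design choice that would make communal maps meaningful: what is comparable across participants is not raw conductance but each person's deviation from their usual state.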

But it raises a number of interesting questions and possibilities. Before we go there, however, it's worth mentioning that, while Nold may be science's first “emotional cartographer,” literature's been there already. Take William Blake's London:

I wander through each chartered street,
Near where the chartered Thames does flow,
And mark in every face I meet,
Marks of weakness, marks of woe.

In every cry of every man,
In every infant's cry of fear,
In every voice, in every ban,
The mind-forged manacles I hear:

How the chimney-sweeper's cry
Every blackening church appals,
And the hapless soldier's sigh
Runs in blood down palace-walls.

But most, through midnight streets I hear
How the youthful harlot's curse
Blasts the new-born infant's tear,
And blights with plagues the marriage-hearse.

Blake's cartography isn't only emotional, although it's intensely emotional. It's also economic, political, and psychological (the “mind-forged manacles” evoking everything from learned helplessness to crushing social convention). The cagey old printer even manages to inject a little epidemiology into his mapmaking. The youthful harlot's curse sounds a lot like a venereal disease, one that condemns the unfaithful husband and his family to death. And the “marks” of “weakness and woe” that Blake inventories form the contours of his map. The wounded soldier's bloody sigh on the palace wall reduces to zero the geographical distance between the suffering of the battlefront and the comforts of the wealthy.

And all in sixteen lines.

So maybe Christian Nold hasn't caught up with the poet yet. But he's done something interesting, and there's more to come. The “emotional maps” aren't his only work, either. He's also created the Newham “Sensory Deprivation” Map, which is where the illustration above comes from. By switching up the senses people use to perceive the environment, he's helping map our geography in a new way. A very nice idea.

So what would happen if we could read the emotions of those around us in real-time? What if we could tell that the crowd around us at rush hour was overstressed, that the people at our bar band gig really liked the crazy rockabilly number we threw in, that our academic audience was becoming skeptical of our Blake-As-Cartographer thesis? Would people on the street feel more personal responsibility for the well-being of the throng around them? Would presenters and performers lose the willingness to challenge their audience? Would anybody even care?

Would politicians be even more eager than they already are to say whatever the public wanted to hear?

Nold's work can veer in any number of future directions. It could lead to new forms of psychological epidemiology, or to conceptual art works. Or to new ways of seeing the world around us, a breaker of mind-forged manacles. But he needs to be vigilant to keep his work from descending into mere entertainment, a crowdsourced “mood ring” for the 21st century, played with and then forgotten.

He can do it, if he gets the right support. And draws the right inspiration from cartographers like William Blake.

Prick Up Your Ears, Times Readers: Do You Know What Your New York Times Is Doing?

Michael Blim

The minutes of the night tick down as I write this column. Soon I will have my morning reward. My column will come out on 3QD, and I will hear the thud of the Boston Globe against the front door.

My column will come out, but will the Boston Globe?

Ask The New York Times Company, its owner. For the past month, it has been threatening to close the Globe unless its workers give back $20 million in wages and benefits by May 1. For the past two days, it has extended that deadline a day at a time. As I write, the company has put several hours back on the clock even as it waves in its employees' faces the official plant-closing notice required by the state.

The Globe, once the flagship of the Sulzbergers' New England media armada, and a cash cow to boot, is now losing a million dollars a week. It is the last paper of record in Boston, and has garnered dozens of national awards, including seven Pulitzer Prizes since 1995. The 2007 prize was won by reporter Charlie Savage for his exposure of President Bush's abuse of so-called signing statements, pithy bits of prose attached to his approval of laws that skewed or set aside whole provisions of legislation he could not summon the courage to veto. The 2003 prize was won by the Globe's Spotlight team for its uncovering of the sex-abuse scandal in the local and national Catholic churches.

These were hardly prizes awarded for art criticism, however valuable those forms of recognition may be. They were awarded for what newspapers do that no other institution or platform in America can yet do: generate facts about, and attention to, serious yet undiscovered problems in everyday life.

Read more »

The Humanists: Andrei Tarkovsky’s Solaris (1972)

by Colin Marshall

Though certain cultural circles customarily and wrongfully dismiss science fiction as an altogether inferior breed of narrative, the genre's bad reputation isn't wholly unearned. Just last week, I heard veteran sci-fi novelist Robert Silverberg publicly assert that, in his field, “character is necessarily subordinated to speculation”; rarely has the fatal flaw of one subset of fiction been so succinctly stated. While the disease that withers human inhabitants to ciphers is indeed widespread and devastating, it hasn't quite contaminated every crevice of the sci-fi landscape. Witness, to name one of these exotic and wonderful instances, Andrei Tarkovsky's Solaris, a futuristic, fantastical journey into an impossible planet's orbit that nevertheless remains the most gripping cinematic narrative of the 1970s.

The film is, I would submit, Tarkovsky's finest, though the great director would have argued with me. He reportedly came to consider Solaris his least successful project, owing to what he saw as its inability to break the shackles of its genre. Though no viewer then or now would call it anything other than a science fiction film, perhaps only Tarkovsky himself, his mind's eye fixed on the less conventional visions he would later realize, could lump it with the day's rockets-and-aliens potboilers. What to him may have been a not-entirely-successful attempt to imbue relatively insubstantial material with stronger human resonances is to others a set of Tarkovskian themes brought closer and made more comprehensible by interaction with a familiar cinematic context. This is not to minimize the impact of the films that followed — the ultra-personal Mirror and The Sacrifice, the supremely textural Nostalghia, the much more distant science fiction of Stalker — but to appreciate the unexpectedly positive hybridization effects of two entirely distinct entities, a phenomenon of which almost any science fiction writer would approve.

Read more »

Monday Poem

The reality of time has long been questioned by an odd alliance
of philosophers and physicists. –Robert Lanza and Bob Berman


The Problem of Time
Jim Culleny

Then was now once while now always is
the train leaving the station

and Is (itself) is pretty much
a matter of interpretation
as murky as the dilemma:
to be or not to…which was
well explored long before today
(today being exactly when
Hamlet was written anyway).

Tomorrow maybe I’ll figure it all out,
though by then it’ll be almost yesterday again

which before tock has ticked will
seem like a month or two ago
or year or even an eon or so,
which it undoubtedly is

Sunday, May 3, 2009

The 2012 Apocalypse — And How to Stop It

Perhaps alarmist, this from Brandon Keim in Wired:

For scary speculation about the end of civilization in 2012, people usually turn to followers of cryptic Mayan prophecy, not scientists. But that’s exactly what a group of NASA-assembled researchers described in a chilling report issued earlier this year on the destructive potential of solar storms.

Entitled “Severe Space Weather Events — Understanding Societal and Economic Impacts,” it describes the consequences of solar flares unleashing waves of energy that could disrupt Earth’s magnetic field, overwhelming high-voltage transformers with vast electrical currents and short-circuiting energy grids. Such a catastrophe would cost the United States “$1 trillion to $2 trillion in the first year,” concluded the panel, and “full recovery could take 4 to 10 years.” That would, of course, be just a fraction of global damages.

Good-bye, civilization.

Worse yet, the next period of intense solar activity is expected in 2012, and coincides with the presence of an unusually large hole in Earth’s geomagnetic shield. But the report received relatively little attention, perhaps because of 2012’s supernatural connotations. Mayan astronomers supposedly predicted that 2012 would mark the calamitous “birth of a new era.”

Whether the Mayans were on to something, or this is all just a chilling coincidence, won’t be known for several years. But according to Lawrence Joseph, author of “Apocalypse 2012: A Scientific Investigation into Civilization’s End,” “I’ve been following this topic for almost five years, and it wasn’t until the report came out that this really began to freak me out.”

Wired.com talked to Joseph and John Kappenman, CEO of electromagnetic damage consulting company MetaTech, about the possibility of geomagnetic apocalypse — and how to stop it.

Visible Young Man

In the NYT, a review of Colson Whitehead's Sag Harbor:

Now that we’ve got a post-black president, all the rest of the post-blacks can be unapologetic as we reshape the iconography of blackness. For so long, the definition of blackness was dominated by the ’60s street-fighting militancy of the Jesses and the irreverent one-foot-out-the-ghetto angry brilliance of the Pryors and the nihilistic, unrepentantly ghetto, new-age thuggishness of the 50 Cents. A decade ago they called post-blacks Oreos because we didn’t think blackness equaled ghetto, didn’t mind having white influencers, didn’t seem full of anger about the past. We were comfortable employing blackness as a grace note rather than as our primary sound. Post-blackness sees blackness not as a dogmatic code worshiping at the altar of the hood and the struggle but as an open-source document, a trope with infinite uses.

The term began in the art world with a class of black artists who were adamant about not being labeled black artists even as their work redefined notions of blackness. Now the meme is slowly expanding into the wider consciousness. For so long we were stamped inauthentic and bullied into an inferiority complex by the harder brothers and sisters, but now it’s our turn to take center stage. Now Kanye, Questlove, Santigold, Zadie Smith and Colson Whitehead can do blackness their way without fear of being branded pseudo or incognegro.

So it’s a perfect moment for Whitehead’s memoiristic fourth novel, “Sag Harbor,” a coming-of-age story about the Colsonesque 15-year-old Benji, who wishes people would just call him Ben. He’s a Smiths-loving, Brooks Brothers-wearing son of moneyed blacks who summer in Long Island and recognize the characters on “The Cosby Show” as kindred spirits.

Sunday Poem

Found
Ron Koertge

My wife waits for a caterpillar
to crawl onto her palm so she
can carry it out of the street
and into the green subdivision
of a tree.

Yesterday she coaxed a spider
into a juicier corner. The day
before she hazed a snail
in a half-circle so he wouldn’t
have to crawl all the way
around the world and be 2,000
years late for dinner.

I want her to hurry up and pay
attention to me or go where I
want to go until I remember
the night she found me wet
and limping, felt for a collar
and tags, then put me in
the truck where it was warm.

Without her I wouldn’t
be standing here in these
snazzy alligator shoes.

A Queen for the Ages

From The Washington Post:

More than two millennia after it took place, the story of Cleopatra has lost none of its grip on the world's imagination. It has inspired great plays (Shakespeare, Shaw and Sardou), novels, poems, movies (Elizabeth Taylor!), works of art, musical compositions both serious (Handel and Samuel Barber) and silly (“Comin' Atcha,” by Cleopatra), and of course histories and biographies. Yet for all this rich documentation and interpretation, it remains at least as much legend and mystery as historical record, which has allowed everyone who tells it to play his or her own variations on the many themes it embraces.

The latest to take it on is Diana Preston, a British writer of popular history. On the evidence of “Cleopatra and Antony,” I'd say she's a thoroughgoing pro. Her research is careful and deep; her prose is lively and graceful; her sympathy for her central character is strong but wholly without sentimentality; her depiction of the worlds in which Cleopatra lived is detailed, textured and evocative. If there is a better book about Cleopatra for today's reader, I don't know what it is.

She calls her book “Cleopatra and Antony,” thus reversing the order as immortalized by Shakespeare. History and legend have usually given priority to the two great men in the Egyptian queen's life, Julius Caesar and Mark Antony, but Preston argues that “Cleopatra perhaps deserves first place” because “her tenacity, vision and ambition would have been remarkable in any age but in a female ruler in the ancient world they were unique.” She was “a charismatic, cultured, intelligent ruler,” yet thanks to the propaganda put about by Octavian — later the Emperor Augustus but in the fourth decade B.C. Mark Antony's rival for control of the Roman Empire — she “was transformed into a pleasure-loving houri, the very epitome of fatal beauty and monstrous depravity, bent on bringing animal gods, barbarian decadence and despotism to the sacred halls of Rome's Capitol.”

More here.

Why can’t we concentrate?

Laura Miller in Salon:

Here's a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you're finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a Blackberry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan's famous “problem that has no name,” this creeping distractibility and the technology that presumably causes it have inspired such cris de coeur as Nicholas Carr's much-discussed “Is Google Making Us Stupid?” essay for the Atlantic Monthly and diatribes like “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” a book published last year by Mark Bauerlein.

You don't have to agree that “we” are getting stupider, or that today's youth are going to hell in a handbasket (by gum!) to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic's headline writers as a salvo against the ubiquitous search engine) reported feeling the change “most strongly” while he was reading. “Immersing myself in a book or a lengthy article used to be easy,” he wrote. “Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text.” For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDB filmography of a supporting actor is well-nigh irresistible, and once I'm at the computer, why not check e-mail? Most of the time, I'll wind up pausing the DVD player before the end of the movie and telling myself I'll watch the rest tomorrow.

More here.

The Big Similarities & Quirky Differences Between Our Left and Right Brains

Carl Zimmer in Discover Magazine:

There is nothing more humbling or more perception-changing than holding a human brain in your hands. I discovered this recently at a brain-cutting lesson given by Jean-Paul Vonsattel, a neuropathologist at Columbia University. These lessons take place every month in a cold, windowless room deep within the university’s College of Physicians and Surgeons. On the day I visited, there were half a dozen brains sitting on a table. Vonsattel began by passing them around so the medical students could take a closer look. When a brain came my way, I cradled it and found myself puzzling over its mirror symmetry. It was as if someone had glued two smaller brains together to make a bigger one.

Vonsattel then showed us just how weak that glue is. He took back one of the brains and used a knife to divide the hemispheres. He sliced quickly through the corpus callosum, the flat bundle of nerve fibers that connects the halves. The hemispheres flopped away from each other, two identical slabs of fleshy neurons.

Sometimes surgeons must make an even more extreme kind of slice in the brain of a patient. A child may suffer from epilepsy so severe that the only relief doctors can offer is to open up the skull and cut out the entire hemisphere in which the seizures start. After the surgery, the space soon fills with cerebrospinal fluid. It may take a child a year of physical therapy to recover from losing a hemisphere—but the fact that patients recover at all is stunning when you consider that they have only half a brain. It makes you wonder what good two hemispheres are in the first place.

More here.

After the Great Recession

President Obama discusses how his policies on schools, energy and health care might change daily life in America.

David Leonhardt in the New York Times Magazine:

Are there tangible ways that Wall Street has made the average person’s life better in the way that Silicon Valley has?

THE PRESIDENT: Well, I think that some of the democratization of finance is actually beneficial if properly regulated. So the fact that large numbers of people could participate in the equity markets in ways that they could not previously — and for much lower costs than they used to be able to participate — I think is important.

Now, the fact that we had such poor regulation means — in some of these markets, particularly around the securitized mortgages — means that the pain has been democratized as well. And that’s a problem. But I think that overall there are ways in which people have been able to participate in our stock markets and our financial markets that are potentially healthy. Again, what you have to have, though, is an updating of the regulatory regimes comparable to what we did in the 1930s, when there were rules that were put in place that gave investors a little more assurance that they knew what they were buying.

More here.

Genius: The Modern View

David Brooks in the New York Times:

Some people live in romantic ages. They tend to believe that genius is the product of a divine spark. They believe that there have been, throughout the ages, certain paragons of greatness — Dante, Mozart, Einstein — whose talents far exceeded normal comprehension, who had an other-worldly access to transcendent truth, and who are best approached with reverential awe.

We, of course, live in a scientific age, and modern research pierces hocus-pocus. In the view that is now dominant, even Mozart’s early abilities were not the product of some innate spiritual gift. His early compositions were nothing special. They were pastiches of other people’s work. Mozart was a good musician at an early age, but he would not stand out among today’s top child-performers.

What Mozart had, we now believe, was the same thing Tiger Woods had — the ability to focus for long periods of time and a father intent on improving his skills. Mozart played a lot of piano at a very young age, so he got his 10,000 hours of practice in early and then he built from there.

More here.

Saturday, May 2, 2009

J. G. Ballard, 1930-2009

In The Independent:

J G Ballard, the award-winning writer best known for his autobiographical novel Empire of the Sun, has died at his home in Shepperton, aged 78, after a long illness. He had been unwell “for several years”, said his agent, Margaret Hanbury. He had prostate cancer.

“J G Ballard has been a giant on the world literary scene for more than 50 years,” said Ms Hanbury, who was his agent for 25 of them. “His acute and visionary observation of contemporary life was distilled into a number of brilliant, powerful novels which have been published all over the world and saw Ballard gain cult status.”

James Graham Ballard was regularly labelled a writer of science fiction, but maintained he was “picturing the psychology of the future”. He earned the rare distinction of appearing as an adjective – “Ballardian” – in the Collins English Dictionary, referring to “dystopian modernity, bleak man-made landscapes and the psychological effects of technological, social or environmental developments”.

A Biocentric Theory of the Universe

Robert Lanza and Bob Berman make their case in Discover:

According to biocentrism, time does not exist independently of the life that notices it. The reality of time has long been questioned by an odd alliance of philosophers and physicists. The former argue that the past exists only as ideas in the mind, which themselves are neuroelectrical events occurring strictly in the present moment. Physicists, for their part, note that all of their working models, from Isaac Newton’s laws through quantum mechanics, do not actually describe the nature of time. The real point is that no actual entity of time is needed, nor does it play a role in any of their equations. When they speak of time, they inevitably describe it in terms of change. But change is not the same thing as time.

To measure anything’s position precisely, at any given instant, is to lock in on one static frame of its motion, as in the frame of a film. Conversely, as soon as you observe a movement, you cannot isolate a frame, because motion is the summation of many frames. Sharpness in one parameter induces blurriness in the other. Imagine that you are watching a film of an archery tournament. An archer shoots and the arrow flies. The camera follows the arrow’s trajectory from the archer’s bow toward the target. Suddenly the projector stops on a single frame of a stilled arrow. You stare at the image of an arrow in midflight. The pause in the film enables you to know the position of the arrow with great accuracy, but you have lost all information about its momentum. In that frame it is going nowhere; its path and velocity are no longer known. Such fuzziness brings us back to Heisenberg’s uncertainty principle, which describes how measuring the location of a subatomic particle inherently blurs its momentum and vice versa.
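The trade-off the authors dramatise with the stilled arrow is, in symbols, Heisenberg's relation; what follows is the standard textbook statement, not anything specific to this article:

```latex
% Uncertainty principle: the spread in position times the spread in
% momentum is bounded below; \hbar is the reduced Planck constant.
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]
```

As \(\Delta x\) shrinks (the frozen frame), \(\Delta p\) must grow (the lost sense of motion), and vice versa.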

Liberalism, Past and Future

George Scialabba reviews Alan Wolfe's The Future of Liberalism in The Nation:

Wolfe's account of liberalism's substantive commitments is straightforward and persuasive–much the best part of the book. The conservative and libertarian enemies of liberalism have squandered so much wealth and welfare, blighted so many lives, that it is always satisfying to see them intellectually routed yet again. Unfortunately, Wolfe does not stop there. He sees liberalism's enemies, or unreliable friends, everywhere and feels bound to scold them all. Wolfe's spiritual home is The New Republic, and he manifests the same complacent centrism as most of its regular writers (though not–for better and worse–the snarky wit and verbal edge that make the magazine at once irresistible and insufferable). Half The Future of Liberalism is valuable affirmation; the other half is an ideological Syllabus of Errors.

The first and most dangerous heresy that Wolfe rebukes from the pulpit–“the single most influential illiberal current of our time”–is evolutionary psychology. The attempt to view human behavior in Darwinian perspective amounts to “nothing short of a determined campaign to reduce human beings and their accomplishments to insignificance.” According to these anti-humanists, humans “rarely accomplish very much independent of what nature has bequeathed to them”; culture is a “side effect,” a “by-product,” just “one more way in which nature imposes its designs upon us.” All this, Wolfe protests, radically undermines liberal morale. Liberalism is about choice and purpose, but the aim of evolutionary psychology “is to show that leading any kind of life we think we are choosing is impossible.”

If science really and truly discredited liberalism, then the only honest response would be: so much the worse for liberalism. But, of course, it does not. The distinction between nature and culture that Wolfe brandishes so menacingly is far more subtle and tenuous than he recognizes. His version, like the obsolete distinction between body and soul, implies that we cannot be both purely physical and meaningfully moral. And yet we are. Whatever “free will” means, it does not mean that choices are uncaused.