A Match Made in Heaven

From The Washington Post:

Match Exhibit A: the match that took place July 20, 1937, on Wimbledon's Centre Court. The occasion was the Davis Cup Interzone Final between the United States and Germany. On one side of the net was Don Budge, a lanky redhead from Oakland, Calif., with a bludgeoning serve and a fabled backhand. On the other side, Baron Gottfried von Cramm, “the very embodiment of style, grace, and sportsmanship,” with a counterpunching game that was likened to chamber music. Cramm took the first two sets; Budge swept the next two; and as the combatants played on into the London twilight, the crowd of 14,000 realized that something extraordinary was happening. “The two white figures began to set the rhythms of something that looked more like ballet than a game where you hit a ball,” wrote radio journalist Alistair Cooke. “People stopped asking other people to sit down. The umpire gave up stopping the game to beg for silence during rallies.”

Each player hit twice as many winners as errors — an ungodly percentage — and the match was concluded by a spectacular running passing shot that the winning player, stumbling as he hit it, never saw land. Whereupon “a British crowd forgot its nature,” Cooke reported. “It stood on benches” and made the “deep kind of roar” that “does not belong on any tennis court.” The U.S. team captain later said, “No man, living or dead, could have beaten either man that day.” Indeed, the question of who ultimately prevailed — I won't spoil it by telling you here — is almost irrelevant.

More here.

Ear Plugs to Lasers: The Science of Concentration

John Tierney in The New York Times:

Imagine that you have ditched your laptop and turned off your smartphone. You are beyond the reach of YouTube, Facebook, e-mail, text messages. You are in a Twitter-free zone, sitting in a taxicab with a copy of “Rapt,” a guide by Winifred Gallagher to the science of paying attention. The book’s theme, which Ms. Gallagher chose after she learned she had an especially nasty form of cancer, is borrowed from the psychologist William James: “My experience is what I agree to attend to.” You can lead a miserable life by obsessing on problems. You can drive yourself crazy trying to multitask and answer every e-mail message instantly.

Or you can recognize your brain’s finite capacity for processing information, accentuate the positive and achieve the satisfactions of what Ms. Gallagher calls the focused life. It can sound wonderfully appealing, except that as you sit in the cab reading about the science of paying attention, you realize that … you’re not paying attention to a word on the page.

More here.

No Exhibit for Old Men

Celebrating the new worlds of 50 artists under 33.

Our own Morgan Meis in The Smart Set:

Every once in a while you get an epiphany. Something you've been meaning to say for a long time jumps, crystal clear, to the front of your brain. You've always known it, but you've never been able to say it.

This happened to me while reading an essay by Sasha Frere-Jones about Lady Gaga. Frere-Jones opens the piece with the following thought:

Dedicated fans of popular music have a certain conversation at least once a year. Call it The Question of Endurance. You and your friends are talking about music, and the conversation turns to a popular band. You express support. A friend voices her opinion, maybe as favorable as yours, but appends a qualifier: “I like them, but will they be around in 10 years?” You may feel compelled to defend whomever it is you’re talking about, covering the present moment and the future with your positive take. After trying this approach, though, you realize that pop music has no Constitution and doesn’t operate like a de-facto Supreme Court: Precedent is not always established, and isn’t even necessary. Pop rarely accretes in a tidy, serial manner — it zigs, zags, eats itself, and falls over its shoelaces.

It's a smart point, and it applies, as far as I'm concerned, to pretty much everything in the realm of what we like to call “culture.” I would take it even a step further in regard to contemporary art. I don't care whether or not any specific work of art will be around in 10 years, or a hundred, or a thousand. I'm utterly uninterested in trying to judge whether this or that work will “stand the test of time.” I don't think there is a “test of time.” Time doesn't “test” things. Longevity and quality have no intrinsic connection. Time does not slowly sift out the truth from the lies — it just moves along, usually in directions we could never have fathomed. Civilization isn't stable and progressive and never has been. For the critic, 10 years from now ought not exist, 100 years from now ought doubly not.

More here.

Scientists make molecules that evolve, compete, mimic behavior of Darwin’s finches

From PhysOrg.com:

Two years ago, Voytek managed to develop a second, unrelated enzymatic RNA molecule that also can continuously evolve. This allowed her to set the two RNAs in evolutionary motion within the same pot, forcing them to compete for common resources, just like two species of finches on an island in the Galapagos.

In the new study, the key resource or “food” was a supply of molecules necessary for each RNA's replication. The RNAs will only replicate if they have catalyzed attachment of themselves to these food molecules. So long as the RNAs have ample food, they will replicate, and as they replicate, they will mutate. Over time, as these mutations accumulate, new forms emerge — some fitter than others.

When Voytek and Joyce pitted the two RNA molecules in a head-to-head competition for a single food source, they found that the molecules that were better adapted to use a particular food won out. The less fit RNA disappeared over time. Then they placed the two RNA molecules together in a pot with five different food sources, none of which they had encountered previously. At the beginning of the experiment each RNA could utilize all five types of food — but none of these were utilized particularly well. After hundreds of generations of evolution, however, the two molecules each became independently adapted to use a different one of the five food sources. Their preferences were mutually exclusive — each highly preferred its own food source and shunned the other molecule's food source.

In the process, the two RNAs evolved different evolutionary approaches to achieving their ends.
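The dynamic described above — two replicators driven apart by competition for shared resources until each claims its own niche — can be sketched in a toy simulation. Nothing below comes from the actual Scripps experiment; the payoff function, mutation scheme, and all parameters are illustrative assumptions:

```python
import random

N_FOODS = 5        # five previously unseen "food" molecules (illustrative)
GENERATIONS = 4000
STEP = 0.05        # size of a single mutation

def captured(mine, theirs):
    """Food captured by a lineage: on each food, a lineage's haul grows
    with its own efficiency and shrinks with its competitor's, which
    rewards concentrating effort where the competitor is absent."""
    return sum(m * m / (m + t) for m, t in zip(mine, theirs))

def normalize(v):
    # Total catalytic capacity is treated as a fixed budget.
    s = sum(v)
    return [x / s for x in v]

def mutate(v, rng):
    w = list(v)
    w[rng.randrange(len(w))] += rng.uniform(-STEP, STEP)
    return normalize([max(x, 1e-9) for x in w])

def evolve(seed=0):
    rng = random.Random(seed)
    # Both lineages begin as near-identical generalists: equal,
    # mediocre use of every food source.
    a = normalize([1 + 0.01 * rng.random() for _ in range(N_FOODS)])
    b = normalize([1 + 0.01 * rng.random() for _ in range(N_FOODS)])
    for _ in range(GENERATIONS):
        # A mutation persists only if the mutant out-replicates its parent.
        a2 = mutate(a, rng)
        if captured(a2, b) > captured(a, b):
            a = a2
        b2 = mutate(b, rng)
        if captured(b2, a) > captured(b, a):
            b = b2
    return a, b

a, b = evolve()
# After many generations the lineages typically come to favor
# different foods, mirroring the mutually exclusive preferences
# reported in the study.
```

The key design choice is that overlap is costly: a lineage's return on any food is diluted by its competitor's presence there, so selection pushes the two replicators toward disjoint specializations even though both start as generalists.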

More here.

Juan Cole’s summary of the situation in Pakistan

Juan Cole in his blog, Informed Comment:

As Pakistani president Asif Ali Zardari arrived in Washington for talks with President Obama and Afghan President Hamid Karzai, fighting intensified in Pakistan's northwest.

On Tuesday morning, the Pakistani Taliban deployed a suicide bomber to attack Pakistani security forces near Peshawar, killing 5 and wounding 9 people, among them schoolchildren who were bystanders.

WaPo says that fighting had intensified Monday in the Swat Valley between the Pakistani Taliban and government troops, as well as in Buner, the district into which they recently made an incursion and from which the government has been attempting to dislodge them. So far some 80 militants have been killed in the Buner campaign, and 20,000 civilians have been displaced.

Tony Karon at Time explains that the Pakistani military establishment disagrees with Washington that the Taliban are an existential threat to the Pakistani state, and why.

Convinced that Pakistan's problems are in part rooted in economic issues, Sens. John Kerry and Dick Lugar introduced legislation Monday aimed at tripling US foreign aid to Islamabad.

Meanwhile, on the diplomatic front, Secretary of Defense Robert Gates is calling on Saudi Arabia to help Pakistan crush the Pakistani Taliban. The Saudis have developed a fear of the vigilante radicals they once supported back in the 1980s, having spent 2003-2006 suppressing them at home, so perhaps by now Gates's idea makes some sense.

More here. See also: “Obama says Pakistan is toughest U.S. challenge.”

A Natural History of the Flu

Carl Zimmer in the New York Times:

The current outbreak shows how complex and mysterious the evolution of viruses is. That complexity and mystery are all the more remarkable because a virus is life reduced to its essentials. A human influenza virus, for example, is a protein shell measuring about five-millionths of an inch across, with 10 genes inside. (We have about 20,000.)

Some viruses use DNA, like we do, to encode their genes. Others, like the influenza virus, use single-strand RNA. But viruses all have one thing in common, said Roland Wolkowicz, a molecular virologist at San Diego State University: they all reproduce by disintegrating and then reforming.

A human flu virus, for example, latches onto a cell in the lining of the nose or throat. It manipulates a receptor on the cell so that the cell engulfs it, whereupon the virus’s genes are released from its protein shell. The host cell begins making genes and proteins that spontaneously assemble into new viruses. “No other entity out there is able to do that,” Dr. Wolkowicz said. “To me, this is what defines a virus.”

The sheer number of viruses on Earth is beyond our ability to imagine. “In a small drop of water there are a billion viruses,” Dr. Wolkowicz said. Virologists have estimated that there are a million trillion trillion viruses in the world’s oceans.

More here.

Emotional Cartography: Christian Nold and William Blake

From the indispensable psychology and neuroscience blog Mind Hacks, some information on Christian Nold and his “emotional cartography.” Writes Nold:

Bio Mapping is a community mapping project … In the context of regular, local workshops and consultations, participants are wired up with an innovative device which records the wearer's Galvanic Skin Response (GSR), a simple indicator of emotional arousal, in conjunction with their geographical location. People re-explore their local area by walking the neighbourhood with the device and on their return a map is created which visualises points of high and low arousal. By interpreting and annotating this data, communal emotion maps are constructed that are packed full of personal observations which show the areas that people feel strongly about and truly visualise the social space of a community.

Nold goes on to ask, “How will our perceptions of our community and environment change when we become aware of our own and each other's intimate body states?” That's more of a forecast than a description of his current work. He's predicting a technology that allows people to read the emotions of others in real time. His current maps essentially measure only stress, and the results are published retrospectively, not in real time.
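As a concrete illustration of the kind of data processing such a map involves, here is a minimal sketch: pair each GSR reading with a location, then flag points of unusually high or low arousal relative to the walk's own baseline. The sample format, the z-score baseline, and the threshold are all my assumptions, not details of Nold's actual device or software:

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Sample:
    lat: float   # where the reading was taken
    lon: float
    gsr: float   # skin conductance; higher values suggest more arousal

def annotate_walk(walk, threshold=1.0):
    """Label each point of a walk 'high', 'low', or 'neutral' arousal
    by comparing its GSR reading to the walk's own mean (z-score)."""
    mu = mean(s.gsr for s in walk)
    sigma = stdev(s.gsr for s in walk)
    labelled = []
    for s in walk:
        z = (s.gsr - mu) / sigma if sigma else 0.0
        label = "high" if z > threshold else "low" if z < -threshold else "neutral"
        labelled.append((s.lat, s.lon, label))
    return labelled

# A short synthetic walk: calm readings, then a spike at a busy crossing.
walk = [Sample(51.53, 0.03, g) for g in (2.1, 2.0, 2.2, 2.0, 10.0)]
points = annotate_walk(walk)
```

Normalizing against each walk's own baseline matters: skin conductance varies widely between people, so an absolute cutoff would compare wearers rather than moments, while a per-walk z-score highlights where an individual's arousal departed from their own norm.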

But it raises a number of interesting questions and possibilities. Before we go there, however, it's worth mentioning that, while Nold may be science's first “emotional cartographer,” literature's been there already. Take William Blake's London:

I wander through each chartered street,
Near where the chartered Thames does flow,
And mark in every face I meet,
Marks of weakness, marks of woe.

In every cry of every man,
In every infant's cry of fear,
In every voice, in every ban,
The mind-forged manacles I hear:

How the chimney-sweeper's cry
Every blackening church appals,
And the hapless soldier's sigh
Runs in blood down palace-walls.

But most, through midnight streets I hear
How the youthful harlot's curse
Blasts the new-born infant's tear,
And blights with plagues the marriage-hearse.

Blake's cartography isn't only emotional, although it's intensely emotional. It's also economic, political, and psychological (the “mind-forged manacles” evoking everything from learned helplessness to crushing social convention). The cagey old printer even manages to inject a little epidemiology into his mapmaking. The youthful harlot's curse sounds a lot like a venereal disease, one that condemns the unfaithful husband and his family to death. And the “marks” of “weakness and woe” that Blake inventories form the contours of his map. The wounded soldier's bloody sigh on the palace wall reduces to zero the geographical distance between the suffering of the battlefront and the comforts of the wealthy.

And all in sixteen lines.

So maybe Christian Nold hasn't caught up with the poet yet. But he's done something interesting, and there's more to come. The “emotional maps” aren't his only work, either. He's also created the Newham “Sensory Deprivation” Map, which is where the illustration above comes from. By switching up the senses people use to perceive the environment, he's helping map our geography in a new way. A very nice idea.

So what would happen if we could read the emotions of those around us in real time? What if we could tell that the crowd around us at rush hour was overstressed, that the people at our bar band gig really liked the crazy rockabilly number we threw in, that our academic audience was becoming skeptical of our Blake-As-Cartographer thesis? Would people on the street feel more personal responsibility for the well-being of the throng around them? Would presenters and performers lose the willingness to challenge their audience? Would anybody even care?

Would politicians be even more eager than ever to tell the public whatever it wanted to hear?

Nold's work can veer in any number of future directions. It could lead to new forms of psychological epidemiology, or to conceptual art works. Or to new ways of seeing the world around us, a breaker of mind-forged manacles. But he needs to be vigilant, to prevent his work from descending into an entertainment, a crowdsourced “mood ring” for the 21st Century, played with and then forgotten.

He can do it, if he gets the right support. And draws the right inspiration from cartographers like William Blake.

Prick Up Your Ears, Times Readers: Do You Know What Your New York Times Is Doing?

Michael Blim

The minutes of the night tick down as I write this column. Soon I will have my morning reward. My column will come out on 3QD, and I will hear the thud of the Boston Globe against the front door.

My column will come out, but with the Boston Globe?

Ask The New York Times Company, its owner. For the past month, it has been threatening to close the Globe unless its workers give back $20 million in wages and benefits by May 1. Twice in the past two days, the Times company has extended the deadline by another day. As I write, the Times company has put several hours back on the clock even as it waves in the faces of its employees the official plant-closing notice required by the state.

The Globe, once the Sulzberger flagship for its New England media armada, and a cash cow to boot, is now losing a million dollars a week. It is the last paper of record in Boston, and has garnered dozens of national awards, including seven Pulitzer prizes since 1995. The 2007 prize was won by reporter Charlie Savage for his exposure of President Bush’s abuse of so-called signing statements, pithy bits of prose attached to his approval of laws that skewed or set aside whole provisions of legislation he could not summon the courage to veto. The 2003 prize was won by the Globe’s Spotlight team for their uncovering of the sex abuse scandal in the local and national Catholic churches.

These were hardly prizes awarded for art criticism, however valuable those forms of recognition may be. They were recognition for what newspapers do that no other institution or platform in America can yet do: generate facts about, and attention to, serious yet undiscovered problems in everyday life.

Read more »

The Humanists: Andrei Tarkovsky’s Solaris (1972)

by Colin Marshall

Though certain cultural circles customarily and wrongfully dismiss science fiction as an altogether inferior breed of narrative, the genre's bad reputation isn't wholly unearned. Just last week, I heard veteran sci-fi novelist Robert Silverberg publicly assert that, in his field, “character is necessarily subordinated to speculation”; rarely has the fatal flaw of one subset of fiction been so succinctly stated. While the disease that withers human inhabitants to ciphers is indeed widespread and devastating, it hasn't quite contaminated every crevice of the sci-fi landscape. Witness, to name one of these exotic and wonderful instances, Andrei Tarkovsky's Solaris, a futuristic, fantastical journey into an impossible planet's orbit that nevertheless remains the most gripping cinematic narrative of the 1970s.

The film is, I would submit, Tarkovsky's finest, though the great director would have argued with me. He reportedly came to consider Solaris his least successful project, owing to what he saw as its inability to break the shackles of its genre. Though no viewer then or now would call it anything other than a science fiction film, perhaps only Tarkovsky himself, his mind's eye fixed on the less conventional visions he would later realize, could lump it with the day's rockets-and-aliens potboilers. What to him may have been a not-entirely-successful attempt to imbue relatively insubstantial material with stronger human resonances is to others a set of Tarkovskian themes brought closer and made more comprehensible by interaction with a familiar cinematic context. This is not to minimize the impact of the films that followed — the ultra-personal Mirror and The Sacrifice, the supremely textural Nostalghia, the much more distant science fiction of Stalker — but to appreciate the unexpectedly positive hybridization effects of two entirely distinct entities, a phenomenon of which almost any science fiction writer would approve.

Read more »

Monday Poem

The reality of time has long been questioned by an odd alliance
of philosophers and physicists. –Robert Lanza and Bob Berman


The Problem of Time
Jim Culleny

Then was now once while now always is
the train leaving the station

and Is (itself) is pretty much
a matter of interpretation
as murky as the dilemma:
to be or not to…which was
well explored long before today
(today being exactly when
Hamlet was written anyway).

Tomorrow maybe I’ll figure it all out,
though by then it’ll be almost yesterday again

which before tock has ticked will
seem like a month or two ago
or year or even an eon or so
, which it undoubtedly is

The 2012 Apocalypse — And How to Stop It

Perhaps alarmist. Brandon Keim in Wired:

For scary speculation about the end of civilization in 2012, people usually turn to followers of cryptic Mayan prophecy, not scientists. But that’s exactly what a group of NASA-assembled researchers described in a chilling report issued earlier this year on the destructive potential of solar storms.

Entitled “Severe Space Weather Events — Understanding Societal and Economic Impacts,” it describes the consequences of solar flares unleashing waves of energy that could disrupt Earth’s magnetic field, overwhelming high-voltage transformers with vast electrical currents and short-circuiting energy grids. Such a catastrophe would cost the United States “$1 trillion to $2 trillion in the first year,” concluded the panel, and “full recovery could take 4 to 10 years.” That would, of course, be just a fraction of global damages.

Good-bye, civilization.

Worse yet, the next period of intense solar activity is expected in 2012, and coincides with the presence of an unusually large hole in Earth’s geomagnetic shield. But the report received relatively little attention, perhaps because of 2012’s supernatural connotations. Mayan astronomers supposedly predicted that 2012 would mark the calamitous “birth of a new era.”

Whether the Mayans were on to something, or this is all just a chilling coincidence, won't be known for several years. But Lawrence Joseph, author of “Apocalypse 2012: A Scientific Investigation into Civilization’s End,” says, “I’ve been following this topic for almost five years, and it wasn’t until the report came out that this really began to freak me out.”

Wired.com talked to Joseph and John Kappenman, CEO of electromagnetic damage consulting company MetaTech, about the possibility of geomagnetic apocalypse — and how to stop it.

Visible Young Man

In the NYT, a review of Colson Whitehead's Sag Harbor:

Now that we’ve got a post-black president, all the rest of the post-blacks can be unapologetic as we reshape the iconography of blackness. For so long, the definition of blackness was dominated by the ’60s street-fighting militancy of the Jesses and the irreverent one-foot-out-the-ghetto angry brilliance of the Pryors and the nihilistic, unrepentantly ghetto, new-age thuggishness of the 50 Cents. A decade ago they called post-blacks Oreos because we didn’t think blackness equaled ghetto, didn’t mind having white influencers, didn’t seem full of anger about the past. We were comfortable employing blackness as a grace note rather than as our primary sound. Post-blackness sees blackness not as a dogmatic code worshiping at the altar of the hood and the struggle but as an open-source document, a trope with infinite uses.

The term began in the art world with a class of black artists who were adamant about not being labeled black artists even as their work redefined notions of blackness. Now the meme is slowly expanding into the wider consciousness. For so long we were stamped inauthentic and bullied into an inferiority complex by the harder brothers and sisters, but now it’s our turn to take center stage. Now Kanye, Questlove, Santigold, Zadie Smith and Colson Whitehead can do blackness their way without fear of being branded pseudo or incognegro.

So it’s a perfect moment for Whitehead’s memoiristic fourth novel, “Sag Harbor,” a coming-of-age story about the Colsonesque 15-year-old Benji, who wishes people would just call him Ben. He’s a Smiths-loving, Brooks Brothers-wearing son of moneyed blacks who summer in Long Island and recognize the characters on “The Cosby Show” as kindred spirits.

Sunday Poem

Found
Ron Koertge

My wife waits for a caterpillar
to crawl onto her palm so she
can carry it out of the street
and into the green subdivision
of a tree.

Yesterday she coaxed a spider
into a juicier corner. The day
before she hazed a snail
in a half-circle so he wouldn’t
have to crawl all the way
around the world and be 2,000
years late for dinner.

I want her to hurry up and pay
attention to me or go where I
want to go until I remember
the night she found me wet
and limping, felt for a collar
and tags, then put me in
the truck where it was warm.

Without her I wouldn’t
be standing here in these
snazzy alligator shoes.

A Queen for the Ages

From The Washington Post:

More than two millennia after it took place, the story of Cleopatra has lost none of its grip on the world's imagination. It has inspired great plays (Shakespeare, Shaw and Sardou), novels, poems, movies (Elizabeth Taylor!), works of art, musical compositions both serious (Handel and Samuel Barber) and silly (“Comin' Atcha,” by Cleopatra), and of course histories and biographies. Yet for all this rich documentation and interpretation, it remains at least as much legend and mystery as historical record, which has allowed everyone who tells it to play his or her own variations on the many themes it embraces.

The latest to take it on is Diana Preston, a British writer of popular history. On the evidence of “Cleopatra and Antony,” I'd say she's a thoroughgoing pro. Her research is careful and deep; her prose is lively and graceful; her sympathy for her central character is strong but wholly without sentimentality; her depiction of the worlds in which Cleopatra lived is detailed, textured and evocative. If there is a better book about Cleopatra for today's reader, I don't know what it is.

She calls her book “Cleopatra and Antony,” thus reversing the order as immortalized by Shakespeare. History and legend have usually given priority to the two great men in the Egyptian queen's life, Julius Caesar and Mark Antony, but Preston argues that “Cleopatra perhaps deserves first place” because “her tenacity, vision and ambition would have been remarkable in any age but in a female ruler in the ancient world they were unique.” She was “a charismatic, cultured, intelligent ruler,” yet thanks to the propaganda put about by Octavian — later the Emperor Augustus but in the fourth decade B.C. Mark Antony's rival for control of the Roman Empire — she “was transformed into a pleasure-loving houri, the very epitome of fatal beauty and monstrous depravity, bent on bringing animal gods, barbarian decadence and despotism to the sacred halls of Rome's Capitol.”

More here.

Why can’t we concentrate?

Laura Miller in Salon:

Here's a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you're finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a BlackBerry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan's famous “problem that has no name,” this creeping distractibility and the technology that presumably causes it have inspired such cris de coeur as Nicholas Carr's much-discussed “Is Google Making Us Stupid?” essay for the Atlantic Monthly and diatribes like “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” a book published last year by Mark Bauerlein.

You don't have to agree that “we” are getting stupider, or that today's youth are going to hell in a handbasket (by gum!) to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic's headline writers as a salvo against the ubiquitous search engine) reported feeling the change “most strongly” while he was reading. “Immersing myself in a book or a lengthy article used to be easy,” he wrote. “Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text.” For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDB filmography of a supporting actor is well-nigh irresistible, and once I'm at the computer, why not check e-mail? Most of the time, I'll wind up pausing the DVD player before the end of the movie and telling myself I'll watch the rest tomorrow.

More here.

The Big Similarities & Quirky Differences Between Our Left and Right Brains

Carl Zimmer in Discover Magazine:

There is nothing more humbling or more perception-changing than holding a human brain in your hands. I discovered this recently at a brain-cutting lesson given by Jean-Paul Vonsattel, a neuropathologist at Columbia University. These lessons take place every month in a cold, windowless room deep within the university’s College of Physicians and Surgeons. On the day I visited, there were half a dozen brains sitting on a table. Vonsattel began by passing them around so the medical students could take a closer look. When a brain came my way, I cradled it and found myself puzzling over its mirror symmetry. It was as if someone had glued two smaller brains together to make a bigger one.

Vonsattel then showed us just how weak that glue is. He took back one of the brains and used a knife to divide the hemispheres. He sliced quickly through the corpus callosum, the flat bundle of nerve fibers that connects the halves. The hemispheres flopped away from each other, two identical slabs of fleshy neurons.

Sometimes surgeons must make an even more extreme kind of slice in the brain of a patient. A child may suffer from epilepsy so severe that the only relief doctors can offer is to open up the skull and cut out the entire hemisphere in which the seizures start. After the surgery, the space soon fills with cerebrospinal fluid. It may take a child a year of physical therapy to recover from losing a hemisphere—but the fact that patients recover at all is stunning when you consider that they have only half a brain. It makes you wonder what good two hemispheres are in the first place.

More here.

After the Great Recession

President Obama discusses how his policies on schools, energy and health care might change daily life in America.

David Leonhardt in the New York Times Magazine:

Are there tangible ways that Wall Street has made the average person’s life better in the way that Silicon Valley has?

THE PRESIDENT: Well, I think that some of the democratization of finance is actually beneficial if properly regulated. So the fact that large numbers of people could participate in the equity markets in ways that they could not previously — and for much lower costs than they used to be able to participate — I think is important.

Now, the fact that we had such poor regulation means — in some of these markets, particularly around the securitized mortgages — means that the pain has been democratized as well. And that’s a problem. But I think that overall there are ways in which people have been able to participate in our stock markets and our financial markets that are potentially healthy. Again, what you have to have, though, is an updating of the regulatory regimes comparable to what we did in the 1930s, when there were rules that were put in place that gave investors a little more assurance that they knew what they were buying.

More here.