September 30, 2012
Judith Goldman Reads "Austerity Measures"
The Golden Age: Keynesian Utopianism
John Quiggin in Aeon:
I first became an economist in the early 1970s, at a time when revolutionary change still seemed like an imminent possibility and when utopian ideas were everywhere, exemplified by the Situationist slogan of 1968: ‘Be realistic. Demand the impossible.’ Preferring to think in terms of the possible, I was much influenced by an essay called ‘Economic Possibilities for our Grandchildren,’ written in 1930 by John Maynard Keynes, the great economist whose ideas still dominated economic policymaking at the time.
Like the rest of Keynes’s work, the essay ceased to be discussed very much during the decades of free-market liberalism that led up to the global financial crisis of 2007 and the ensuing depression, through which most of the developed world is still struggling. And, also like the rest of Keynes's work, this essay has enjoyed a revival of interest in recent years, promoted most notably by the Keynes biographer Robert Skidelsky and his son Edward.
The Skidelskys have revived Keynes’s case for leisure, in the sense of time free to use as we please, as opposed to idleness. As they point out, their argument draws on a tradition that goes back to the ancients. But Keynes offered something quite new: the idea that leisure could be an option for all, not merely for an aristocratic minority.
Writing at a time of deep economic depression, Keynes argued that technological progress offered the path to a bright future. In the long run, he said, humanity could solve the economic problem of scarcity and do away with the need to work in order to live. That in turn implied that we would be free to discard ‘all kinds of social customs and economic practices, affecting the distribution of wealth and of economic rewards and penalties, which we now maintain at all costs, however distasteful and unjust they may be in themselves, because they are tremendously useful in promoting the accumulation of capital’.
Poll Averages Have No History of Consistent Partisan Bias
Nate Silver in the NYT's FiveThirtyEight:
The analysis that follows is quite simple. I’ll be taking a simple average of polls conducted each year in the final 21 days of the campaign and comparing it against the actual results. There are just two restrictions.
First, I will be looking only at polls of likely voters. Polls of registered voters, or of all adults, typically will overstate the standing of Democratic candidates, since demographic groups like Hispanics that lean Democratic also tend to be less likely to turn out in most elections. (The FiveThirtyEight forecast model shifts polls of registered voters by 2.5 percentage points toward Mr. Romney for this reason.)
Second, the averages are based on a maximum of one poll per polling firm in each election. Specifically, I use the last poll that each conducted before the election. (Essentially, this replicates the methodology of the Real Clear Politics polling average.)
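The two restrictions above amount to a simple procedure: keep only each firm's final pre-election poll, then take an unweighted average. A minimal sketch, using entirely made-up polls (the firm names, dates, and margins are hypothetical, not Silver's data):

```python
# Each poll is (firm, days_before_election, margin), where margin is the
# Democratic candidate's lead in percentage points (negative = trailing).

def poll_average(polls):
    latest = {}
    for firm, days_out, margin in polls:
        # Keep only the poll closest to the election for each firm.
        if firm not in latest or days_out < latest[firm][0]:
            latest[firm] = (days_out, margin)
    margins = [m for _, m in latest.values()]
    # Simple (unweighted) average of the surviving polls.
    return sum(margins) / len(margins)

polls = [
    ("Firm A", 10, 4.0),
    ("Firm A", 2, 2.0),   # Firm A's final poll supersedes its earlier one
    ("Firm B", 5, 3.0),
    ("Firm C", 1, 1.0),
]
print(poll_average(polls))  # (2.0 + 3.0 + 1.0) / 3 = 2.0
```

The one-poll-per-firm rule prevents prolific pollsters from dominating the average, which is the same design choice the Real Clear Politics average makes.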
Let’s begin by looking at the results of national polls for the presidential race.
In the 10 presidential elections since 1972, there have been five years (1976, 1980, 1992, 1996 and 2004) in which the national presidential polls overestimated the standing of the Democratic candidate. However, there were also four years (1972, 1984, 1988 and 2000) in which they overestimated the standing of the Republican. Finally, there was 2008, when the average of likely voter polls showed Mr. Obama winning by 7.3 percentage points, his exact margin of victory over John McCain, to the decimal place.
Arthur Ochs Sulzberger, 1926-2012
Clyde Haberman in the NYT:
Arthur Ochs Sulzberger, who guided The New York Times and its parent company through a long, sometimes turbulent period of expansion and change on a scale not seen since the newspaper’s founding in 1851, died early Saturday at his home in Southampton, N.Y. He was 86.
His death, after a long illness, was announced by his family.
Mr. Sulzberger’s tenure, as publisher of the newspaper and as chairman and chief executive of The New York Times Company, reached across 34 years, from the heyday of postwar America to the twilight of the 20th century, from the era of hot lead and Linotype machines to the birth of the digital world.
The paper he took over as publisher in 1963 was the paper it had been for decades: respected and influential, often setting the national agenda. But it was also in precarious financial condition and somewhat insular, having been a tightly held family operation since 1896, when it was bought by his grandfather Adolph S. Ochs.
By the 1990s, when Mr. Sulzberger passed the reins to his son, Arthur Sulzberger Jr., first as publisher in 1992 and then as chairman in 1997, the enterprise had been transformed. The Times was now national in scope, distributed from coast to coast, and it had become the heart of a diversified, multibillion-dollar media operation that came to encompass newspapers, magazines, television and radio stations and online ventures.
The expansion reflected Mr. Sulzberger’s belief that a news organization, above all, had to be profitable if it hoped to maintain a vibrant, independent voice.
I'm Sorry, Steve Jobs: We Could Have Saved You
Siddhartha Mukherjee in Newsweek:
We are failing to treat and prevent cancer—even as the promise of life-saving remedies awaits us. On the anniversary of Steve Jobs’s death, leading oncologist and the author of The Emperor of All Maladies Siddhartha Mukherjee explains how we failed to save an icon and why we will lose so many more lives if we do not give cancer research the funding it deserves. On Oct. 5, the night that Steve Jobs died, I ascended 30,000 feet into the thin air above New York on a flight to California. On my lap was a stash of scientific papers. I was reading and taking notes—where else?—on an iPad.
Jobs’s death—like a generational Rorschach test—had provoked complex reactions within each of us. There was grief in abundance, of course, admixed with a sense of loss, with desolation and nostalgia. Outside the Apple store in SoHo, New York, that evening, there were bouquets of white gerberas and red roses. Someone had left a bushel of apples by the doorstep and a sign that read “I-miss ...” I missed Jobs, too—but I also felt a personal embarrassment in his death. I am an oncologist and a cancer researcher. I felt as if my profession, my discipline, and my generation had let him down. Steve Jobs had promised—and then delivered—life-altering technologies. Had we, in all honesty, given him any such life-altering technologies back? I ask the question in all earnestness. Jobs’s life ended because of a form of pancreatic cancer called pancreatic neuroendocrine tumor, or PNET. These tumors are fleetingly rare: about five in every million men and women are diagnosed with PNETs each year. Deciphering the biology of rare cancers is often challenging. But the past five years have revealed extraordinary insights into the biology of some rare cancers—and PNETs, coincidentally enough, have led part of that charge. By comparing several such tumors, scientists are beginning to understand the biology of these peculiar tumors.
The Art of Stem Cells
From Harvard Magazine:
Jennifer Quick, a Ph.D. candidate in the department of the history of art and architecture, stood before her audience in the main gallery of the Carpenter Center for the Visual Arts. Behind her, on a free-standing wall running down the middle of the gallery, hung more than a dozen works by Michael Wang ’03, a visual artist who creates micrograph images of artificially produced stem cells.
But as Quick began to lecture on Wang’s works, another voice could be heard from behind the wall. It belonged to cellular biologist Gabriella Boulting, Ph.D. ’12, who was speaking to her own audience about pieces Wang had created that hung on her side of the wall. Dueling gallery tours? Wang, an artist who aims to bring together art and science in interesting and thought-provoking ways, prefers to think of two lecturers working as one. “I wanted to literally stage an encounter between two disciplines, the artistic community on one hand and the scientific on the other,” Wang says. “I don’t think of artworks as ending with the individual artistic object. There is an expanded field that includes context and discourse around [that object] so I really wanted to make sure that could be an expanded part of the work. I wanted people from within the University to provide two very different insights at the exact same time.” His latest work, Differentiation Series, is a sequence of micrograph images of artificially produced stem cells that have been hand-tinted using a system that matches a unique color to every specific cell type that can potentially be produced from these initially undifferentiated cells.
The Britishisation of American English
Cordelia Hebblethwaite at the BBC:
There is little that irks British defenders of the English language more than Americanisms, which they see creeping insidiously into newspaper columns and everyday conversation. But bit by bit British English is invading America too.
"Spot on - it's just ludicrous!" snaps Geoffrey Nunberg, a linguist at the University of California at Berkeley.
"You are just impersonating an Englishman when you say spot on."
"Will do - I hear that from Americans. That should be put into quarantine," he adds.
And don't get him started on the chattering classes - its overtones of a distinctly British class system make him quiver.
But not everyone shares his revulsion at the drip, drip, drip of Britishisms - to use an American term - crossing the Atlantic.
"I enjoy seeing them," says Ben Yagoda, professor of English at the University of Delaware, and author of the forthcoming book, How to Not Write Bad.
"It's like a birdwatcher. If I find an American saying one, it makes my day!"
Last year Yagoda set up a blog dedicated to spotting the use of British terms in American English.
So far he has found more than 150 - from cheeky to chat-up via sell-by date, and the long game - an expression which appears to date back to 1856, and comes not from golf or chess, but the card game whist. President Barack Obama has used it in at least one speech.
Behind the drones debate
Cyril Almeida in Dawn:
Drones kill civilians. Fewer civilians would probably die if there were less secrecy surrounding drone strikes in Fata. And the US kills people in Fata that it probably wouldn’t get away with killing in less remote parts of the world.
There. Now that that’s out of the way, there’s another obvious truth about drone strikes: they won’t end.
Because drones kill militants. Because there isn’t a good alternative to drones for killing militants in parts of Fata. And because the US security establishment likes them and the Pakistani security establishment doesn’t loathe them.
And, given what 140,000 troops in Fata can and have done, drones are — in terms of casualties and damage caused to civilian populations — on the periphery of the ‘what are we doing to our people’ debate.
If drones are here to stay, why this endless back and forth, sometimes acerbic, at other times restrained, between Pakistan and the US?
A Trans-Atlantic Trip Turns Kafkaesque
Gary Shteyngart in the New York Times:
You, American Airlines, should no longer be flying across the Atlantic. You do not have the know-how. You do not have the equipment. And your employees have clearly lost interest in the endeavor. Like the country whose name graces the hulls of your flying ships, you are exhausted and shorn of purpose. You need to stop.
Flight 121 from Paris to New York began on a clear autumn afternoon. It ended over 30 hours later. For those of us without miles, it is probably still going.
The initial delay was a mere hour or two. Some were told that our aircraft possessed faulty tires and brakes. Others were told that the crew could not find their way in from Paris. Neither scenario was particularly encouraging.
The aircraft was indeed an interesting one. One of the overhead baggage compartments was held together with masking tape. Halfway across the Atlantic you decided to turn Flight 121 back because your altimeter wasn’t working. Some of us were worried for our safety, but your employees mostly shrugged as if to say, Ah, there goes that altimeter again.
To get the most out of life, you've got to live like a bird
Our own Morgan Meis in The Smart Set:
Franzen became a bird watcher many years ago. He is almost apologetic about that fact, realizing that — in the opinion of most normal human beings — the birdwatcher is a slightly pathetic if otherwise harmless individual. In his commencement address at Kenyon College, "Pain Won't Kill You," Franzen writes:
It's a long story, but basically I fell in love with birds. I did this not without significant resistance, because it is very uncool to be a birdwatcher, because anything that betrays real passion is by definition uncool. But little by little, in spite of myself, I developed this passion, and although one half of a passion is obsession, the other half is love.
From his usage of words like "passion," "obsession," and "love," it's obvious Jonathan Franzen thinks birdwatching is neither pathetic, nor, more importantly, is it harmless. For Franzen, birdwatching is a big deal. Paying attention to birds can change you. It can transform your sense of self and the world. Franzen knows this because it happened to him.
Many of the essays in Franzen's book therefore touch on the subject of watching birds. A couple of essays are explicitly about birdwatching, which Franzen has done in Cyprus, on an island in the South Pacific known as Masafuera, and in China, among other places. Franzen has become a defender of the birds. He is appalled by the killing of birds and by the destruction of their natural habitat. He laments with great pathos the lusty shooting of migrant birds that is a favorite pastime of the people of Malta. But what does it mean, this birdwatching, and why does Franzen keep coming back to the theme of birds over and over in his essays?
September 29, 2012
Geller’s ‘savage’ ad displays the racism inherent in Israeli colonization
Shireen Tawil in Mondoweiss:
In her latest attempt to fan the flames of Islamophobia, anti-Arab sentiments, and blind allegiance to Israel across America, Pamela Geller launched an ad campaign imploring Americans, “In any war between the civilized man and the savage, support the civilized man. Support Israel, defeat Jihad”. Published in August on buses and subway cars in San Francisco, the ad made its debut in New York City subway stations this week and is due to speckle the nation’s capital in the near future.
Geller’s vulgar and hateful ad campaign has rightfully received much resistance and heat from local populations, as well as the public transportation authorities whose vehicles it smuts. Local anti-hate activists’ creative and artistic responses have branded these ads as racist and hate speech. Her violent and distasteful language has been slammed for reeking of colonial racism and white supremacy. The San Francisco MTA refused to run the ad as it contradicts their stance against defamatory language, until Geller went to court and, winning the case, protected the ad under the First Amendment (much to their credit, in a refreshing reaction to being forced to post the ad, the SFMTA donated its proceeds from the ad to the San Francisco Human Rights Commission).
The language Geller employs in her ad is shocking, hurtful, divisive, violent, hateful, racist, and vulgar. But it is out there, and potentially spreading. The question now remains: what to do with it?
Siddhartha Mukherjee: The Cancer Puzzle
On Early Warning Signs
George Sugihara in Seed:
At a closed meeting held in Boston in October 2009, the room was packed with high-flyers in foreign policy and finance: Henry Kissinger, Paul Volcker, Andy Haldane, and Joseph Stiglitz, among others, as well as representatives of sovereign wealth funds, pensions, and endowments worth more than a trillion dollars—a significant slice of the world’s wealth. The session opened with the following telling question: “Have the last couple of years shown that our traditional finance/risk models are irretrievably broken and that models and approaches from other fields (for example, ecology) may offer a better understanding of the interconnectedness and fragility of complex financial systems?”
Science is a creative human enterprise. Discoveries are made in the context of our creations: our models and hypotheses about how the world works. Big failures, however, can be a wake-up call about entrenched views, and nothing produces humility or gains attention faster than an event that blindsides so many so immediately.
Examples of catastrophic and systemic changes have been gathering in a variety of fields, typically in specialized contexts with little cross-connection. Only recently have we begun to look for generic patterns in the web of linked causes and effects that puts disparate events into a common framework—a framework that operates on a sufficiently high level to include geologic climate shifts, epileptic seizures, market and fishery crashes, and rapid shifts from healthy ecosystems to biological deserts.
The main themes of this framework are twofold: First, they are all complex systems of interconnected and interdependent parts. Second, they are nonlinear, non-equilibrium systems that can undergo rapid and drastic state changes.
Quick Intuitive Decisions Foster More Charity and Cooperation than Slow Calculated Ones
Ed Yong over at Not Exactly Rocket Science:
Our lives are governed by both fast and slow – by quick, intuitive decisions based on our gut feelings; and by deliberate, ponderous ones based on careful reflection. How do these varying speeds affect our choices? Consider the many situations when we must pit our own self-interest against the public good, from giving to charity to paying our taxes. Are we naturally prone to selfishness, behaving altruistically only through slow acts of self-control? Or do we intuitively reveal our better angels, giving way to self-interest as we take time to think?
According to David Rand from Harvard University, it’s the latter. Through a series of experiments, he has found that, on average, people behave more selflessly if they make decisions quickly and intuitively. If they take time to weigh things up, cooperation gives way to selfishness. The title of his paper – “Spontaneous giving and calculated greed” – says it all.
Working with Joshua Greene and Martin Nowak, Rand asked volunteers to play the sort of games that economists have used for years. They have to decide how to divvy, steal, invest or monopolise a pot of money, sometimes with the option to reward or punish other players. These games are useful research tools, but there’s an unspoken simplicity to them. Sure, the size of the payoffs or the number of rounds may vary, but experiments assume that people play consistently depending on their personal preferences. We know from personal experience that this is unlikely to be true, and Rand’s experiments confirm as much. They show that speed matters.
Rand started with a simple public goods game, where players decide how much money to put into a pot. The pot is then doubled and split evenly among them. The group gets the best returns if everyone goes all-in, but each individual does best if they withhold their money and reap the rewards nonetheless.
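The payoff arithmetic behind that tension can be made concrete. A toy illustration (the endowment of 10 units and four-player group are assumptions, not details from the article) of why the group does best if everyone contributes while each individual does best by withholding:

```python
# Public goods game: contributions are pooled, the pot is doubled, and the
# doubled pot is split evenly among all players, contributors or not.

def payoffs(contributions, endowment=10, multiplier=2):
    pot = sum(contributions) * multiplier
    share = pot / len(contributions)
    # Each player keeps what they withheld, plus an equal share of the pot.
    return [endowment - c + share for c in contributions]

# Everyone all-in: each player turns 10 into 20 -- the best group outcome.
print(payoffs([10, 10, 10, 10]))  # [20.0, 20.0, 20.0, 20.0]

# One free-rider among three contributors does even better individually,
# while dragging the contributors' payoffs down.
print(payoffs([10, 10, 10, 0]))   # [15.0, 15.0, 15.0, 25.0]
```

The free-rider pockets the untouched endowment plus a full share of everyone else's doubled contributions, which is exactly the incentive Rand's fast-versus-slow experiments probe.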
When Senator Strom Thurmond of South Carolina died in 2003 at the age of 100, he seemed to embody the term “political survivor.” Think of someone who began his career as a Roosevelt Democrat and finished it as a Reagan Republican, who campaigned for president as a white supremacist and ended up supporting a national holiday for Martin Luther King. Decades passed, one generation replaced another, but Thurmond soldiered on, swapping causes, even political parties, with a juggler’s eye. Where many politicians become objects of contempt or indifference over time, with Thurmond the reverse was true: the longer he lasted, the more revered he became. He hailed from Edgefield County in the hardscrabble Carolina Piedmont, home to several governors and a host of Lost Cause Southern heroes like “Pitchfork” Ben Tillman, the race-baiting demagogue who notoriously advocated lynching to protect white women from black “lust.” Edgefield had a tradition of enforcing its Jim Crow laws with a heavy hand — a necessity, whites believed, in a county where two-thirds of the residents were black.
more from David Oshinsky at the NY Times here.
the zoo problem
Of course, zoos have both passionate supporters and outright opponents. But most people, like me, occupy a middle ground: delighting at the squirrel monkeys chasing each other’s tails, but shamed by the bored and contemptuous glance of the gorilla. Zoos embody the dilemmas of our relationship to a nature that we strive to control, for good and frequently for ill. These dilemmas provide the common thread to four fascinating books on the lives of animals in captivity. The central dilemma is of course whether to keep animals removed from their natural habitats at all: to do so allows us to come closer to them, but only in an environment that seems unnatural and impoverished. In Death at SeaWorld: Shamu and the Dark Side of Killer Whales in Captivity, the investigative journalist David Kirby poses this question in the context of a very particular kind of zoo: the oceanariums and marine mammal parks in which some of the most sophisticated and spectacular of animals can be seen.
more from Stephen Cave at the FT here.
A Review of The Breathless Zoo: Taxidermy and the Cultures of Longing
Anjuli Raza Kolb in The LA Review of Books:
IN HER MEMOIR Deep Blue Home, Julia Whitty describes a near collision with a young sperm whale as she swims, helpless among human companions, in the depths off the Galapagos Islands. She wonders if she will die. Calm and curious about her fate, she watches as he approaches full speed ahead, “with all the energy and incaution of adolescence.” Instead of ramming her, “he jackknifes his huge head downward,” she writes, “and I can see the sheets of cellophane-thin gray skin peeling off his body — the constant striptease one of the means by which cetaceans reduce their drag in the water.” This image captivates me: an empty whale skin, perfectly formed, floating gently up as its former contents shoot down in the water, all strength and purpose. The ghost whale unfurling itself and pouring outward on the surface of the water. I have since learned that whale skin releases in prosaic strips.
There is a quickening in this scene, a flush of longing, envy, and terror not just from the Melvillian symbolism of the whale, or the arresting account of a near-death experience, but, too, from the proximity to a living creature so other and so enormous. Whitty’s description is unmistakably erotic: the young whale’s shedding is a “striptease,” the “sculpted angle of his cheek and jaw,” his “tensile strength,” and his “arching upright” tail towering above her before he slides down into the dark waters. She writes of touching him, and it’s thrilling.
It is the trace, however, that struck me most. The oxytocin to the dopamine rush of first encounter, the strip of skin is like a lover’s token: prophetic, ontic, memorial, foretelling life lived together and the ways in which it ends. For those of us who love it and observe it closely, we are locked on a fast collision course with our natural world, tempting its wrath, waiting for it to fall apart or duck us, reaching for an acknowledgement of our presence in it, and, above all, enamored of its tokens and talismans.
Cradles of jewels, spun in an hour
From New Statesman:
The spider and the cleaner work in the same building, not far from each other but solitary in their worlds. The spider spins quietly, tapping and rubbing herself. When the Hoover starts up, she can hear it through her hairs, feeling the sound waves like a wind on her body. She has eight eyes. She may not see in colour but she’s sensitive to tiny movements. There’s something stately about her panic: no noise, no head movement, simply her legs carry her suddenly into hiding. The cleaner on the other hand shouts and steps backwards: “Weird isn’t it, all those legs, very fragile and quite soft. A spider might measure an inch and a half but the mind sees something huge. I must admit, I’m a little bit unnerved by big spiders. Someone told me they bite. I might just pick them up very quickly – you know, grab them with my hand very softly and fling them out the window. I’d never kill one.”
The cleaner has only two eyes. When he’s not cleaning, he paints detailed pictures of invisible worlds. When he cleans, he keeps his eyes tuned to the task. He sees dust, mud, smears, nail clippings, spillages, hairs in plugholes, unpaid bills, kicked-off shoes, all the secret debris of a human. At a certain point he glances at the ceiling and sees cobwebs: “Ah yes, what should you do about webs? Old webs I’ll get rid of, they’re just big balls or twists of dust; and kitchen webs, the more you leave them (and I’m talking years here) the more the whole place becomes a congealment of grease. But fresh webs – I tend to take one and leave another. I make a balancing decision. I try not to get anxious about the ethics of it. There’s the issue of flies, of course. Not hygienic but I have saved flies on occasion. But those webs, when you see them outdoors they’re like cradles of jewels between the gorse – it seems so sad to damage them.”
Saturday Poem
To Marina (excerpt)
Let's take a walk
Into the world
Where if our shoes get white
With snow, is it snow, Marina,
Is it snow or light?
Let's take a walk
Every detail is everything in its place (Aristotle). Literature is a cup
And we are the malted. The time is a glass. A June bug comes
And a carpenter spits on a plane, the flowers ruffle ear rings.
I am so dumb-looking. And you are so beautiful.
by Kenneth Koch
from The Collected Poems of Kenneth Koch
Alfred A. Knopf, 2007
The Great Disconnect
Mark Lilla in The New York Times:
Once upon a time there was a radical president who tried to remake American society through government action. In his first term he created a vast network of federal grants to state and local governments for social programs that cost billions. He set up an imposing agency to regulate air and water emissions, and another to regulate workers’ health and safety. Had Congress not stood in his way he would have gone much further. He tried to establish a guaranteed minimum income for all working families and, to top it off, proposed a national health plan that would have provided government insurance for low-income families, required employers to cover all their workers and set standards for private insurance. Thankfully for the country, his second term was cut short and his collectivist dreams were never realized.
His name was Richard Nixon.
Whenever conservatives talk to me about Barack Obama, I always feel quite certain that they mean something else. But what exactly? The anger, the suspicion, the freestyle fantasizing have no perceptible object in the space-time continuum that centrist Democrats like me inhabit. What are we missing? Seen from our perspective, the country elected a moderate and cautious straight shooter committed to getting things right and giving the United States its self-respect back after the Bush-Cheney years. Unlike the crybabies at MSNBC and Harper’s Magazine, we never bought into the campaign’s hollow “hope and change” rhetoric, so aren’t crushed that, well, life got in the way. At most we hoped for a sensible health care program to end the scandal of America’s uninsured, and were relieved that Obama proposed no other grand schemes of Nixonian scale. We liked him for his political liberalism and instinctual conservatism. And we still like him.
But more than a few of our fellow citizens are loathing themselves blind over Barack Obama. Why?
September 28, 2012
Thinking in Network Terms
Albert-László Barabási in Edge:
We always lived in a connected world, except we were not so much aware of it. We were aware of it down the line, that we're not independent from our environment, that we're not independent of the people around us. We are not independent of the many economic and other forces. But for decades we never perceived connectedness as being quantifiable, as being something that we can describe, that we can measure, that we have ways of quantifying the process. That has changed drastically in the last decade, at many, many different levels.
It has changed partly because we started to be aware of it, partly because there were a lot of technological advances that forced us to think about connectedness. We had the World Wide Web, which was all about the links connecting information. We had the Internet, which was all about connecting devices. We had wireless technologies coming our way. Eventually, we had Google, we had Facebook. Slowly, the term 'network connectedness' really became part of our life so much so that now the word 'networks' is used much more often than evolution or quantum mechanics. It's really run over it, and now that's the buzzword.
The question is, what does it mean to be part of the network, or what does it mean to think in terms of the network? What does it mean to take advantage of this connectedness and to understand that? In the last decade, what I kept thinking about is how do you describe mathematically the connectedness? How do you get data to describe that? What does this really mean for us?
This had several stages, obviously. The first stage for us was to think networks, only networks down the line. That was about a decade ago, we witnessed the birth of network science. I could say a couple of geniuses came along and did it, but really it was the data that made it possible. Suddenly we started to discover that lots of data that's out there, that we're collecting thanks to the Internet and other technological advances, allowed us to look at connectedness and to measure it and to map it out.
Once you had data, you could build theories. Once you had theories, you have predictive power, you could test that and then the whole thing fitted itself. It suddenly very actively emerged as a field that we now call network science. Going beyond networks, going beyond connectedness, we realized we started to know not only whom you connect to and whom you see and where are your links (the economical, personal, social or whatever they are) but we started to see also the timing of your activities. What do you do with those links? When do you interact?
That was the second way; we called it 'human dynamics.'
Mahatma Gandhi as Philosopher
Over at Philosophy Bites:
Richard Sorabji discusses Mahatma Gandhi's philosophy of non-violence with Nigel Warburton for this, the 200th episode of the Philosophy Bites podcast. Philosophy Bites has now been downloaded more than 15 million times.
Trapped in the Total Cinema
J. Hoberman in The New York Review of Books:
Can we speak of a twenty-first-century cinema? And if so, on what basis?
In the immediate aftermath of World War II, the French film critic André Bazin characterized cinema as an idealistic phenomenon and cinema-making as an intrinsically irrational enterprise. “There was not a single inventor who did not try to combine sound and relief with animation of the image,” Bazin maintained in “The Myth of Total Cinema.” Each and every new technological development—synchronous sound, full-color, stereoscopic or 3-D movies, Smell-O-Vision—served to take the cinema nearer to its imagined essence, which is to say that “cinema has not yet been invented!” Moreover, once true cinema was achieved, the medium itself would disappear—just like the state under true communism. Writing in 1946, Bazin believed that this could happen by 2000.
In fact, something else occurred: the development of digital computer-generated imagery (CGI). Bazin had imagined cinema as the objective “recreation of the world.” Yet digital image-making precludes the necessity of having the world, or even a really existing subject, before the camera—let alone the need for a camera. Photography had been superseded, if not the desire to produce images that moved. Chaplin was perhaps but a footnote to Mickey Mouse; what were The Birth of a Nation and Battleship Potemkin compared to Toy Story 3?
The history of motion pictures was now, in effect, the history of animation. The process began in the early 1980s with two expensive and much-publicized Hollywood features. One From the Heart (1982), Francis Ford Coppola’s experiment in electronic image-making, returned but $1 million on a $26-million investment and effectively destroyed his studio, while Disney’s Tron (1982), the first sustained exercise in computer-generated imagery, was a movie whose costly special effects and mediocre box-office returns would be credited with (or blamed for) delaying CGI-based cinema for a decade.
“Vagina: A New Biography” by Naomi Wolf
Lindsay Beyerstein in In These Times:
Naomi Wolf tried vainly to deflect feminist criticism of her new book, Vagina: A New Biography, in an interview with Amanda Hess of Slate.
Why has Wolf's silly book inspired so much feminist pushback? Because we’re sick of religious conservatives trying to reduce us to our sexual organs. It’s bad enough when it’s a Republican senatorial candidate pontificating about “legitimate rape.” But it’s even more galling when the conservative in question is hailed as a major feminist thinker and her religion is Pop Tantra.
Like Todd Akin, Wolf preaches that women can only be fulfilled through rapturous surrender to our biological-cum-mystical destiny.
Akin and his cronies want to reduce women to their wombs. Wolf wants to reduce us to our vaginas. My colleague Sady Doyle sees Wolf’s daft brief for vagina worship as essentially harmless. If Concerned Women for America published this book, I'd agree.
If reactionaries are going to reduce us to our reproductive organs, they might as well reduce us to the fun ones. But pelvic essentialism is dangerous, whether it’s about babies or pleasure, and doubly so when it’s being peddled as feminism.
The Pornography of Equality
Markha Valenta in Berfrois:
When Betty Friedan wrote The Feminine Mystique in 1963, “the problem that has no name” was the problem of college-educated housewives sitting at home being bored to death. Today, the “problem that has no name” is more widespread, more alluring and more aggressive. Its most insidious aspect is how close it comes to the licit ways in which women are used to lure, seduce, persuade and sweetly tease those who see them. To buy more. And more. Promising to make us sexy and our eyes glaze in pleasure. In the commercials saturating our public spaces. The bestselling novel now rising high on sadomasochistic frisson. The film crossing and uncrossing its legs.
We like to think that these are metaphors. That the impossibly beautiful things calling out to us, seductively and low-voiced – to be them, to desire them, to touch and possess that thing they have, their hot sexiness on the edge or pure life itself – don’t literally mean it. Or do mean it, but then only in order to sell us sandwiches and Victoria’s secrets. Or as a bit of diversion from boredom. And yet, the constant presence of their siren-calls wherever we look, day in and day out doing their best to arouse in us some amalgam of desire to be, to possess, to have what they have, is striking.
Of course there is a steady stream of documentaries, manifestos and little squeaks of protest against this state of affairs. They include everyone from Christian grandparents to radical feminists to immigrant imams affronted in their moral sensibilities. But we studiously ignore them. They come and go without changing a thing. Rather like the tide.
But now something has happened that for a moment has made our societies’ traffic in women’s sexual assets a possible problem.
Give Dan Carlin CNN!!!
the other Ulysses
Ulysses S. Grant was a hero to his generation: the greatest general of the Civil War, a popular president who was elected twice—and could have been elected three or four times had he wished. But later generations found him entirely dispensable, and he became the butt of historians’ jokes. Surveys of presidential scholars long placed Grant among the worst presidents. In a 1948 poll he rated ahead of just Warren Harding; by 1982 he had only clambered past James Buchanan and Andrew Johnson. And while today he has managed to put a little more distance between himself and last place, it is still no surprise to find him in the bottom half, if not among the bottom ten. The standard rap on Grant is that he was a drunk who surrounded himself with spoilsmen who stole the country blind.
more from H. W. Brands at Lapham's Quarterly here.
Monsieur le Comte
Monte Cristo, it turns out, was more than just the little Mediterranean islet of the book title. Looking much further westwards in the atlas, we find it marked as a port on the island of Hispaniola, which nowadays is partitioned into Haiti and the Dominican Republic. The future general was born in 1762 in the French sugar colony of Saint-Domingue, in the western half of the island. He was the son of a black slave, Marie-Cessette, and a renegade Norman aristocrat, Alexandre-Antoine Davy de la Pailleterie, who, having paid a high price for Marie-Cessette's beauty, fathered three more children before selling her off to a merchant from Nantes. French Enlightenment values meant that young Thomas-Alexandre (known as Alex), brought to France in servitude by his father, was free once he stepped ashore. The pair moved into the smart suburb of Saint-Germain-en-Laye and the fifteen-year-old boy found himself addressed as 'Monsieur le Comte'.
more from Jonathan Keates at Literary Review here.
Strange Death of the English Gentleman
One of the distinguishing marks of a gentleman was that he did things because he knew they were the right thing to do, not because they would bring him personal advantage. Captain Oates was a very gallant gentleman. The idea of a gentleman was a more inclusive one than it sounds to modern ears. One of its greatest advantages was that you could define it so as to include yourself. You could behave like a gentleman, without possessing any of the social attributes which a gentleman might have: there was no need to possess a coat of arms, or a country estate, or engage in field sports, or wear evening dress. At least since Chaucer's time, there had been a distinction between the social meaning of the word, and the moral. It was evident that well-born people, who ought to know how to behave like gentlemen, did not always do so, while others sometimes did.
Philip Mason, whose perceptive study, The English Gentleman, was published in 1982, argues that "the desire to be a gentleman" runs through and illuminates English history from the time of Chaucer until the early 20th century. He suggests that "for most of the 19th century and until the Second World War" the idea of the gentleman "provided the English with a second religion, one less demanding than Christianity. It influenced their politics. It influenced their system of education; it made them endow new public schools and raise the status of old grammar schools. It inspired the lesser landed gentry as well as the professional and middle classes to give their children an upbringing of which the object was to make them ladies and gentlemen, even if only a few of them also became scholars." This was a subject that interested so great a man as Cardinal Newman. In The Idea of a University he said that a liberal education makes "not the Christian, not the Catholic, but the gentleman", and went on: It is well to be a gentleman, it is well to have a cultivated intellect, a delicate taste, a candid, equitable dispassionate mind, a noble and courteous bearing in the conduct of life; these are the connatural qualities of a large knowledge; they are the objects of a University . . . but they are no guarantees for sanctity or even for conscientiousness; they may attach to the man of the world, the profligate, the heartless.
Raising Frogs for Freedom
From The New York Times:
The birdman of Alcatraz became famous. But the frogmen of Cedar Creek are still anonymous beyond the tiny cult world of amphibian science. For now, they say. Mat Henson, 25, serving a four-and-a-half-year sentence for robbery and assault, and his research partner, Taylor Davis, 29, who landed in the Cedar Creek Corrections Center here in central Washington for stealing cars, raised about 250 Oregon spotted frogs in the prison yard this summer.
Working with biologists, Mr. Henson is now helping write a scientific curriculum for other frog-raisers, in prison or out. A previous inmate in the program, released some years ago, is finishing his Ph.D. in molecular biology. When asked about his plans after he is released from prison in 2014, Mr. Henson paused only a moment. “Bioengineering,” he said. The state program that connected the dots — or rather the felons and the frogs — is called Sustainability in Prisons. Nationally, it is unique in enlisting inmates to help rescue imperiled species like the Oregon spotted frog, which is threatened across much of its range.
September 27, 2012
Louis C.K. and the Rise of the 'Laptop Loners'
Adam Wilson in the LA Review of Books:
LOUIS C.K. EMERGES from the subway station: sullen, sweating. His balding crown of carrot colored hair is slightly brighter than his ruddy, freckled skin. The man is overweight but solid, like a fullback long past glory, in love with French fries, who still hits the gym. He’s got broad shoulders, thick arms, A-cup man breasts, and a sizable gut that hangs over his beltline. His black t-shirt is half a size too small, constricting his movements, and adding to the general impression of physical discomfort.
C.K. makes it up the subway steps and arrives at street level, exhaling as if he’s crested some unprecedented summit. He marches into a pizza joint, scarfs most of a giant slice in three bites, then disgusted, throws what remains in the garbage. To watch him eat is akin to watching a junkie shoot heroin; one can trace the convergence of shame and sublimity. All the while there’s music playing, the syncopated upbeat of seventies funk. The singer repeats: “Louie, Louie, you’re gonna die.” The camera cuts to another set of stairs, this time a declension, C.K. hustling down to a door marked “Comedy Cellar.” The juxtaposition is stark: here lies humor, at the intersection of pathos and indigestion. We must armor ourselves with laughter.
So begins each episode of Louie, C.K.’s brainchild, currently in its third season on the cable channel FX. Cicero said that to be a philosopher is to learn how to die. Flaubert thought an artist must have a religion of despair. Accordingly, C.K. may be television’s true first in both categories.
Mitt Romney: Those People
Founding Fathers, Founding Villains
William Hogeland in Boston Review:
Liberals have become originalists too. Recent books by progressive thinkers as varied as the legal scholar Lawrence Lessig, journalist Roger Hodge, and political commentator Rachel Maddow decry a national failure to live up to the founders’ purposes in creating the Constitution. Maddow means by her title, Drift, an unfortunate movement away from founding-era anti-militarism into the modern military-industrial complex. In Lessig’s Republic, Lost, the loss has come about thanks to a money influence in politics that Lessig says the founders condemned as corrupt. Hodge, in The Mendacity of Hope, frames a criticism of President Obama in terms of the founding political battle over finance between Hamilton and Madison.
All of the liberal originalists’ books run into political and historical trouble over some unedifying realities of our founding period. Similar difficulties plague a new right-wing constitutional history, Tea Party leader Michael P. Leahy’s Covenant of Liberty, which takes the betrayal of founding values as its theme, too. Leahy’s book represents classic originalism, the right-wing kind. It therefore serves as a mirror of the new liberal originalism: American-history fantasies of the left stand sharply in relation to those of the right.
One of Leahy’s strengths is that unlike so many others in the Tea Party movement—and unlike some of the liberal originalists—he doesn’t rope all the founders into one group and set them rolling in their graves over today’s America. Leahy admires particular founders and knows they had enemies in other founders. To him, a disastrous betrayal of the Constitution occurred in its first moments of operation. The betrayal was carried out by Hamilton.
Pussy Riot’s Punk Prayer
Colin Jager in The Immanent Frame:
Aided by social networking sites, blogs, and popular YouTube videos (found here and here), Pussy Riot’s plight became something of an international media sensation. Amnesty International and Madonna took up the cause, and British Prime Minister David Cameron questioned Putin about it in a face-to-face meeting. Indeed, as some commentators noted there did seem something almost pre-packaged about the whole event, as though it were designed for western consumption.
Fascinatingly, however, religion played a central role within this media event. Many orthodox clergy were quick to label the performance blasphemous, noting its “sacrilegious humiliation of the age-old principles aimed at inflicting even deeper wounds to Orthodox Christians”; claiming that the women’s “chaotically waving arms and legs, dancing and hopping…cause[ed] a negative, even more insulting resonance in the feelings and souls of the believers”; and describing the performance as “desecrating the cathedral, and offending the feelings of believers.”
The Orthodox Church occupies an odd space in relationship to the secular power of the state. Historically aligned with the czars, it was driven largely underground during the Soviet era, thus becoming one site of opposition to politics as usual. In recent years it has emerged as a potent political force in Russia, one largely aligned with Putin’s hold on power. In her closing statement, Yekaterina Samutsevich, one of the members of Pussy Riot, positioned their performance in precisely this way. The cozy relationship between church and state in contemporary Russia, she claimed, “has required considerable quantities of professional lighting and video equipment, air time on national television for hours-long live broadcasts, and numerous background shoots for morally and ethically edifying news stories, where the Patriarch’s well-constructed speeches would…help the faithful make the correct political choice during a difficult time for Putin preceding the election….Our sudden musical appearance in the Cathedral of Christ the Savior with the song “Mother of God, Drive Putin Out” violated the integrity of the media image that the authorities had spent such a long time generating and maintaining, and revealed its falsity.”
History of the Natural
Samantha Weinberg in More Intelligent Life:
When David Attenborough joined the BBC, 60 years ago this September, Britain had only one television channel. Cameras had to be wound up like a clock and could only film live or in 20-second bursts. There was no way to capture sound and vision at the same time, or to broadcast from anywhere but the studio. Attenborough, like most people, did not own a television set; he thinks he had seen only one programme in his life. He had applied for a job in radio, as a talks producer, and been turned down, and it was only by chance that his CV was seen by a television executive, the head of factual broadcasting, Mary Adams. She gave him a chance—but when he first went in front of the camera, she said his teeth were too big.
By 1956, Attenborough had persuaded the BBC to let him try a new way of filming—from and of the natural world. With only a cameraman and animal expert for company, he would go off for months to remote lands in search of rare beasts. In Borneo, some days’ walk from civilisation, he was on the trail of orangutan when he spied a man paddling up the river, wearing only a sarong and bearing a message tucked in a cleft stick. It was from the BBC, giving instructions on how to use their new toy: colour film. What started in a makeshift fashion with "Zoo Quest" matured over the decades into "Life on Earth", "The Private Life of Plants", "Life in Cold Blood", "Frozen Planet" and many more. With Attenborough, the phenomenon of natural-history film-making was born.
50 years of ultra-violence
A Clockwork Orange sits awkwardly in this schema. It is a cusp novel. Chronologically it belongs to Early Burgess, but stylistically it resembles the middle novels. In this sense it was an important departure. (Inside Mr Enderby (1963), Burgess’s best-loved comic novel, occupies a liminal position similar to A Clockwork Orange. Both share an animating fascination with excess of all kinds, and both find the perfect stylistic expression for this in obsessive wordplay. And both are very funny.) The novels of the middle period are Burgess’s most vital because it was in these that he forged what we might now recognize as the Burgessian – the antic puns and wordplay, the etymological digressions, the opacity, the glamorous pedantry, the tympanic repetitions, and an alliterative, assonantal musicality that makes every sentence seem vivid and extrovert: “Seafood salt with savour of seabrine thwacking throat with thriving wine-thirst”; “the lucent flawlessness of the skin, of the long fleshly languor that flowered into visibility”; “he was in a manner tricked, coney-caught, a court-dor to a cozening cotquean”. This is Burgess’s description of an Elizabethan brothel: “He entered darkness that smelled of musk and dust, the tang of sweating oxters, and, somehow, the ancient stale reek of egg after egg cracked in waste, the musty hold-smell of seamen’s garments, seamen’s semen spattered, a ghost procession of dead sailors lusting till the crack of doom”.
more from Ben Masters at the TLS here.
theology saves politics?
The crisis of secularism goes much deeper than a deficit of personal meaning. The separation of church and state is so entrenched in the Western mind that it can be difficult to see the capitalist nation-state as a theological and political whole. Secularism is not strictly speaking a religion, but it represents an orientation toward religion that serves the theological purpose of establishing a hierarchy of legitimate social values. Religion must be “privatized” in liberal societies to keep it out of the way of economic functioning. In this view, legitimate politics is about making the trains run on time and reducing the federal deficit; everything else is radicalism. A surprising number of American intellectuals are able to persuade themselves that this vision of politics is sufficient, even though the train tracks are crumbling, the deficit continues to gain on the GDP, and millions of citizens are sinking into the dark mire of debt and permanent unemployment.
more from David Sessions at Religion and Politics here.
JK Rowling: The Casual Vacancy reviews – what the critics said
From The Telegraph:
As it hits the bookshops, The Casual Vacancy, JK Rowling's first post-Harry Potter novel, is at the top of the bestseller lists with 2.6m copies sold on pre-order. But Rowling's first adult novel, which charts the aftermath of the unexpected death of a member of the parish council in the seemingly idyllic town of Pagford, has divided opinion. Here are excerpts from some of the reviews:
Christopher Brookmyre (Telegraph): "It quickly becomes clear that this is not the book we might have been expecting. Recently arrived social worker Kay’s first visit to a drug-addict mother of two at her home in the Fields brings us into the very heart of the world that the hawks on the parish council would like to simply wish away. It is a heart-in-the-mouth passage, taut with dread, invoking in the reader a vivid mirror of Kay’s own fear, revulsion, anger, compassion and sorrow. ... There is villainy, from domestic violence to sexual abuse, including a rape scene that is most shocking in its banality for both parties. Neither the victims nor their assailants expect justice from any external agency, and nor should the reader: There are few resolutions, and no promises of wish-fulfilment. This is undoubtedly where the book takes its greatest risks. One marvels at the skill with which Rowling weaves such vivid characters in and out of each other’s lives, rendering them so complex and viscerally believable that one finds oneself caring for the worst of them. However, upon hearing the cries of so many souls in pain, the more sensitive reader might begin to crave a leavening of hope, or to fear that Rowling’s own cry is one of despair."
Deepti Hajela (Associated Press): "So look, here's the thing: This. Is. Not. A. Children's. Book. If you're looking for what made Harry Potter magical – Wizards! Spells! Flying Broomsticks! -- you're not going to find it. If you're looking for what makes JK Rowling magical – emotion, heart – you will. ... [The] ability to bring her characters to their emotional life was a hallmark of the Harry Potter series – it didn't become a global phenomenon just because it was an exciting adventure, but because there was a real heart to it, characters who had both strengths and weaknesses, who struggled with their choices. That's what makes this book worth it, despite a slow start and sometimes too much of the descriptions and adjectives that added life to Harry Potter but at times tend to bog Rowling down here. That's what makes the book's ending scenes so heartbreaking – turning the page seems unbearable, but not as much as putting down the book would be."
Bearing Sons Can Alter Your Mind
Giving a whole new meaning to "pregnancy brain," a new study shows that male DNA—likely left over from pregnancy with a male fetus—can persist in a woman's brain throughout her life. Although the biological impact of this foreign DNA is unclear, the study also found that women with more male DNA in their brains were less likely to have suffered from Alzheimer's disease—hinting that the male DNA could help protect the mothers from the disease, the researchers say. During mammalian pregnancy, the mother and fetus exchange DNA and cells. Previous work has shown that fetal cells can linger in the mother's blood and bone for decades, a condition researchers call fetal microchimerism. The lingering of the fetal DNA, research suggests, may be a mixed blessing for a mom: The cells may benefit the mother's health—by promoting tissue repair and improving the immune system—but may also cause adverse effects, such as autoimmune reactions.
One question is how leftover fetal cells affect the brain. Researchers have shown that fetal microchimerism occurs in mouse brains, but they had not shown this in humans. So a team led by autoimmunity researcher and rheumatologist J. Lee Nelson of the Fred Hutchinson Cancer Research Center in Seattle, Washington, took samples from autopsied brains of 59 women who died between the ages of 32 and 101. By testing for a gene specific to the Y chromosome, they found evidence of male DNA in the brains of 63% of the women. Because some studies have suggested that the risk of Alzheimer's disease (AD) increases with an increasing number of pregnancies, the team also examined the brains for signs of the disease, allowing them to determine whether AD correlated with the observed microchimerism. Of the 59 women, 33 had AD—but contrary to the team's expectation, the women with AD had significantly less male DNA in their brains than did the 26 women who did not have AD.
Thursday Poem
The Clock
With only one story to tell, the clock strikes
a monotonous note, irrespective of how
musical the bell, how gilded the chimes
its timely conclusions report through.
Time literally on hands, it informs you
to your face exactly where you stand
in relation to your aspirations, stacks up
the odds against your long-term prospects,
leaves your hopes and expectations checked.
Keeping track of time to the last second, it gives
the lie to all small talk about your reputedly
youthful looks, sees through the subterfuge
of dyed hair, exposes the stark truth beneath
the massaged evidence of smooth skin.
by Dennis O'Driscoll
from Reality Check
Copper Canyon Press, 2008
September 26, 2012
100 million will die by 2030 if world fails to act on climate - report
Nina Chestney at Reuters:
More than 100 million people will die and global economic growth will be cut by 3.2 percent of gross domestic product (GDP) by 2030 if the world fails to tackle climate change, a report commissioned by 20 governments said on Wednesday.
As global average temperatures rise due to greenhouse gas emissions, the effects on the planet, such as melting ice caps, extreme weather, drought and rising sea levels, will threaten populations and livelihoods, said the report conducted by humanitarian organisation DARA.
It calculated that five million deaths occur each year from air pollution, hunger and disease as a result of climate change and carbon-intensive economies, and that toll would likely rise to six million a year by 2030 if current patterns of fossil fuel use continue.
More than 90 percent of those deaths will occur in developing countries, said the report that calculated the human and economic impact of climate change on 184 countries in 2010 and 2030. It was commissioned by the Climate Vulnerable Forum, a partnership of 20 developing countries threatened by climate change.
"A combined climate-carbon crisis is estimated to claim 100 million lives between now and the end of the next decade," the report said.
It said the effects of climate change had lowered global output by 1.6 percent of world GDP, or by about $1.2 trillion a year, and losses could double to 3.2 percent of global GDP by 2030 if global temperatures are allowed to rise, surpassing 10 percent before 2100.
Bad science gets busted
High-profile cases show the importance of questioning academic research -- especially when it has a corporate tie.
David Sirota in Salon:
If you want to influence a mass audience, for instance, you can try to do what the Pentagon does and subtly bake slanted information into entertainment products such as movies and television shows. If, on the other hand, you are looking to influence a slightly higher-brow audience, you can embed disinformation in newspapers’ news and opinion pages. And if you are looking to brainwash politicians, think tanks, columnists and the rest of the political elite in order to rig an esoteric debate over public policy, you can attempt to shroud your agitprop in the veneer of science.
While these are all diabolically effective methods of manipulating political discourse, the latter, which involves corporate funding of academic research, is the most insidious of all. But the good news is that the last few weeks provided important reminders about the problem — and why scrutiny of sources is so important.
At the national level, media organizations frothed with news about Stanford University researchers supposedly determining that organic food is no more healthy than conventionally produced food. In the rush to generate audience-grabbing headlines, most of these news outlets simply regurgitated the Stanford press release, which deliberately stressed that researchers “did not find strong evidence that organic foods are more nutritious or carry fewer health risks than conventional alternatives.”
New Stanford/NYU study documents the civilian terror from Obama's drones
Glenn Greenwald in The Guardian:
A vitally important and thoroughly documented new report on the impact of Obama's drone campaign has just been released by researchers at NYU School of Law and Stanford University Law School. Entitled "Living Under Drones: Death, Injury and Trauma to Civilians From US Drone Practices in Pakistan", the report details the terrorizing effects of Obama's drone assaults as well as the numerous, highly misleading public statements from administration officials about that campaign. The study's purpose was to conduct an "independent investigation into whether, and to what extent, drone strikes in Pakistan conformed to international law and caused harm and/or injury to civilians".
The report is "based on over 130 detailed interviews with victims and witnesses of drone activity, their family members, current and former Pakistani government officials, representatives from five major Pakistani political parties, subject matter experts, lawyers, medical professionals, development and humanitarian workers, members of civil society, academics, and journalists." Witnesses "provided first-hand accounts of drone strikes, and provided testimony about a range of issues, including the missile strikes themselves, the strike sites, the victims' bodies, or a family member or members killed or injured in the strike".
Here are the powerful first three paragraphs of the report, summarizing its main findings:
While noting that it is difficult to obtain precise information on the number of civilian deaths "because of US efforts to shield the drone program from democratic accountability", the report nonetheless concludes: "while civilian casualties are rarely acknowledged by the US government, there is significant evidence that US drone strikes have injured and killed civilians."
But beyond body counts, there's the fact that "US drone strike policies cause considerable and under-accounted for harm to the daily lives of ordinary civilians, beyond death and physical injury":
In other words, the people in the areas targeted by Obama's drone campaign are being systematically terrorized. There's just no other word for it. It is a campaign of terror - highly effective terror - regardless of what noble progressive sentiments one wishes to believe reside in the heart of the leader ordering it. And that's precisely why the report, to its great credit, uses that term to describe the Obama policy: the drone campaign "terrorizes men, women, and children".
Amazing mind reader reveals his 'gift'
Sri Lanka to New York...
We have coffee with almond milk and agave syrup rather than buffalo curd and strong tea and decide to take a long morning walk to integrate back into New York society. The Harlem apartment is right on Central Park. Living in Brooklyn, we don't often get a chance to stroll through this famous place. It is raining just a little, which makes the city feel that much more surreal. Right away I notice how many ducks there are — big fat American ducks, they are so wonderful, trailing along in a row along the algae-topped lake. I notice all the bird sounds and look at the trees in a new way. Even the pigeons look beautiful. Two people behind us speak Spanish and I think it's nice to hear people speaking Spanish. I like this language. There are people wearing expensive shoes picking up their dog's poop with little blue baggies on their hands. I try to picture Sri Lankans doing this and cannot, no matter what kind of shoes they wear. A young woman in a button down shirt and tall rubber boots paces past us. She listens to music on her iPod and has her nose down, reading a daily paper. She does all this without falling, without slowing. It is like a circus act.
more from Stefany Anne Golberg at The Smart Set here.
Lightness is a strategy
As elsewhere, one of my guides here is W. G. Sebald, who performed this task with a kind of relentlessness that is as stunning as it is deeply sad. The unnamed subject of each of Sebald’s books is, by his own admission, the concentration camps, and yet, with a few exceptions, he touches on them so lightly that you could be lulled, by his long, languorous sentences, into thinking the books were about something else: herring, say, or the rise of the Dowager Empress. That they are not is a function of a very Sebaldian principle: atrocity needs no exaggeration. If you look closely enough you see how it saturates all that surrounds it, drawing the energy of the world into its deep and abhorrent abyss. But lightness, in Sebald and elsewhere, provides more than a cover. Lightness is a strategy, much as I distrust that word. It is a method for dealing with and channeling other energies.
more from Erik Anderson at the LA Review of Books here.
the return of the story
It has been more than 70 years since Walter Benjamin, in his classic essay "The Storyteller," declared that telling stories was obsolete. "Less and less frequently do we encounter people with the ability to tell a tale properly," Benjamin complained. "It is as if something that seemed inalienable to us, the securest among our possessions, were taken from us: the ability to exchange experiences." For most of us in the western world, our first experience of our culture's classic stories—Snow White, Cinderella, Little Red Riding Hood—does not come through a wise man or woman sitting before an audience, spellbinding us with words. It is in print or through images that we learn our culture's foundational stories. This development has led to a certain nostalgia about the mere act of telling a story. In his novel The Storyteller, Mario Vargas Llosa writes lovingly about the raconteurs of the Machiguenga people, a remote Amazonian tribe that has had almost no contact with modern Peruvian civilisation. By reciting their people's cosmogonies and myths, by bringing news from one far-flung group to another, the storyteller "remind[ed] each member of the tribe that the others were alive, that despite the great distances that separated them, they still formed a community, shared a tradition and beliefs."
more from Adam Kirsch at Prospect Magazine here.
The best books on the Beatles
From The Guardian:
5 October is the 50th birthday of the Beatles' first single, released back when Harold Macmillan was the PM, and the Cuban missile crisis was only weeks away.
"Love Me Do" sounds like the world in which it was made: tentative, still feeling the pinch of post-war austerity. Ian MacDonald's wonderful song-by-song history of the group, Revolution in the Head, reckoned that the song's "modal gauntness" is subtly cunning, serving notice of the Beatles' "unvarnished honesty", and – via John Lennon's wailing harmonica part – the "blunt vitality" of their native Liverpool. In the surviving Beatles' own account, the huge Anthology, Paul McCartney recalls that the song was meant to sound hard and authentic: "blues" rather than "la de da de la". Many Beatles books barely mention "Love Me Do" at all. But there it is: a number 17 hit, long rumoured to have been propelled into the charts thanks to bulk-buying by manager Brian Epstein. If, like me, one of your first experiences of Beatles music was the collection 1962-66 (known as "The Red Album", as against 1967-70 "The Blue Album"), you will probably have experienced it as a strangely muted opening to a listening experience that quickly flared into spectacular life: a prologue, rather than a first chapter proper.
The Beatles' second single, "Please Please Me", was released in January 1963, in the midst of a legendarily biting British winter, to which its giddy sound was an antidote. "Congratulations, Gentlemen, you've just made your first number one," said their producer, George Martin. And he was right. By early the following year, their songs were crowding the US charts, and they were about to play to 73 million Americans on The Ed Sullivan Show. Once again, they were adopted as a panacea for cold and grim times – this time less a matter of the weather than the pall cast by the murder of President Kennedy. Only two years later, they would reach the apex of their fame, chased around the Deep South by fundamentalist Christians outraged by John Lennon's claim that they were "bigger than Jesus", while their music took on the textures and expanded horizons traceable – at least in part – to Lennon and George Harrison's use of LSD. Such is the remarkable pace of a story that has been told by scores of writers, a story about four young musicians but no end of other things: the cities of Liverpool, Hamburg and London; class, and the shaking of English hierarchies; pop's transmutation into a global culture; and the western world's passage from a world still defined by the second world war and its aftermath, to the accelerated modernity we know today. Everything in the tale pulses with significance and drama. It seems barely believable, and in the best Beatles books, it still burns.
Researchers prevent heart failure in mice
Cardiac stress – for example, a heart attack or high blood pressure – frequently leads to pathological heart growth and subsequently to heart failure. Two tiny RNA molecules play a key role in this detrimental development in mice, as researchers at the Hannover Medical School and the Göttingen Max Planck Institute for Biophysical Chemistry have now discovered. When they inhibited one of those two specific molecules, they were able to protect the rodents against pathological heart growth and failure. With these findings, the scientists hope to be able to develop therapeutic approaches that can protect humans against heart failure.
A research team at the Göttingen Max Planck Institute for Biophysical Chemistry and the Hannover Medical School discovered that two small RNA molecules play a key role in the growth of heart muscle cells: the microRNAs miR-212 and miR-132. The scientists had observed that these microRNAs are more prevalent in the cardiac muscle cells of mice suffering from cardiac hypertrophy. To determine the role that the two microRNAs play, the scientists bred genetically modified mice that had an abnormally large number of these molecules in their heart muscle cells. "These rodents developed cardiac hypertrophy and lived for only three to six months, whereas their healthy conspecifics had a normal life-span of several years," explained Dr. Kamal Chowdhury, researcher in the Department of Molecular Cell Biology at the Max Planck Institute for Biophysical Chemistry. "For comparison, we also selectively switched off these microRNAs in other mice. These animals had a slightly smaller heart than their healthy conspecifics, but did not differ from them in behavior or life-span," continued the biologist. The crucial point: when the scientists subjected the hearts of these mice to stress by narrowing the aorta, the mice did not develop cardiac hypertrophy – in contrast to normal mice.
Wednesday Poem
Gratitude to Old Teachers
When we stride or stroll across the frozen lake,
We place our feet where they have never been.
We walk upon the unwalked. But we are uneasy.
Who is down there but our old teachers?
Water that once could take no human weight –
We were students then – holds up our feet,
And goes on ahead of us for a mile.
Beneath us the teachers, and around us the stillness.
by Robert Bly
from Eating the Honey of Words, 1999
HarperCollins Publishers, New York, NY