Something like this happened to the most famous case of amnesia in 20th-century science, a man known only as ‘H.M.’ until his death in 2008. When he was 27, a disastrous brain operation destroyed his ability to form new memories, and he lived for the next 55 years in a rolling thirty-second loop of awareness, a ‘permanent present tense’. During this time he was subjected to thousands of hours of tests, of which naturally he had no recall; he provided data for hundreds of scientific papers, and became the subject of a book (Memory’s Ghost by Philip Hilts) and a staple of popular science journalism; by the 1990s digital images of his uniquely disfigured hippocampus featured in almost every standard work on the neuroscience of memory. Since his death his brain has been shaved into 2401 slices, each 70 microns thick, compared in one account to the slivers of ginger served with sushi. Suzanne Corkin, an MIT neuroscientist, first met him in 1962 and after 1980 became his lead investigator and ‘sole keeper’. Permanent Present Tense is her account of Henry Gustav Molaison – his full identity can finally be revealed – and the historic contribution he made to science.
Corkin had a reputation for strict policing of access to Henry, a charge she happily concedes: ‘I did not want him to become a sideshow attraction – the man without a memory.’ After the death of his mother, his last thirty years were spent at a Connecticut nursing home in strict anonymity, with staff sworn to secrecy and filming prohibited. More than a hundred carefully screened researchers were admitted over the years to perform brain scans and cognitive tests, but were never told his name. Corkin’s lucid, well-organised telling of Henry’s story merges intimate case history with an account of the current scientific understanding and how it was reached.
May 19, 2013
Habermas, Adorno, Politics
Richard Marshall interviews Gordon Finlayson in 3:AM Magazine:
3:AM: A key discussion in contemporary liberal theory of ethics and politics is the relationship and differences between Habermas and Rawls. Can you say something about what you take the main points of dispute to be and where you stand on this?
GF: Sure. In my view, despite the amount of ink that has been spilt on Habermas and Rawls in their respective fields, relatively little attention has been paid to the dispute between them. This is largely because influential commentators and critics were quick to judge their exchange in the Journal of Philosophy a damp squib.
This was in part because expectations ran high at the time: two of the greatest social and political theorists of the 20th century, working in different traditions – roughly, analytic political philosophy and German social theory – had engaged each other in debate. It was also because, in truth, neither thinker was sufficiently well apprised of the detail of the other’s theory – unsurprisingly, really, since they worked in very different traditions and each had just spent the last few years writing his own major work of political theory. Finally, everyone at the time, including the disputants themselves, was seduced by the assumption that the salient point of comparison between their respective theories was Habermas’s principle (U) and his conception of the moral standpoint, and Rawls’s argument that the principles of justice are those that would be chosen by rational and reasonable persons in the Original Position. Almost everyone who has written on Habermas and Rawls makes that particular mistake.
My take on that is straightforward. The debate between them concerns their respective political theories. It is basically a dispute between Rawls’s theory of Political Liberalism and Habermas’s Discourse Theory of Law. It is not primarily a dispute between Rawls’s A Theory of Justice and Habermas’s Discourse Ethics. Principle (U) is the central idea in Habermas’s Discourse Ethics, which is a moral theory, not a theory of law or of democratic legitimacy, while the argument from the Original Position takes a back seat in Rawls’s Political Liberalism. People who interpret the Habermas–Rawls dispute in the light of the contrast between Habermas’s principle (U) and Rawls’s Original Position are looking at the wrong thing and so miss the real points of dispute.
What people should have been asking is this. What are the central organizing ideas of their respective political theories, and on what significant points do these ideas conflict?
Argument with Myself
Mike Jay reviews Suzanne Corkin's Permanent Present Tense: The Man with No Memory, and What He Taught the World in the LRB:
Memory creates our identity, but it also exposes the illusion of a coherent self: a memory is not a thing but an act that alters and rearranges even as it retrieves. Although some of its operations can be trained to an astonishing pitch, most take place autonomously, beyond the reach of the conscious mind. As we age, it distorts and foreshortens: present experience becomes harder to impress on the mind, and the long-forgotten past seems to draw closer; University Challenge gets easier, remembering what you came downstairs for gets harder. Yet if we were somehow to freeze our memory at the youthful peak of its powers, around our late twenties, we would not create a polished version of ourselves analogous to a youthful body, but an early, scrappy draft composed of childhood memories and school-learning, barely recognisable to our older selves.
Rite of Spring
Christopher Benfey in the NYRB's blog:
Spring should be a time of portents and premonitions, winged harbingers (“I dreaded that first Robin, so,” as Emily Dickinson put it with characteristic ambivalence) and new beginnings.
This thought struck me as I read Megan Marshall’s sympathetic new biography of Margaret Fuller, which opens with a familiar phrase from Virgil’s Aeneid, one that inspired an essay Fuller wrote during her precocious childhood. Possunt, quia posse videntur means, roughly, “They can because they think they can,” and describes a team of rowers who, according to Marshall, “will themselves to win a race.” The phrase, which Fuller thought demonstrated “confidence in the future,” gives Marshall an overarching theme for Fuller’s fiercely driven life.
But Fuller also made use of the Aeneid when she was less confident of the future. She was known to perform the ancient form of divination in which a passage of Virgil selected at random is assumed to reveal what lies ahead. Sir Philip Sidney described the practice, with a dash of skepticism, in his Defence of Poesy:

And so far were they carried into the admiration thereof, that they thought in the chanceable hitting upon any such verses great fore-tokens of their following fortunes were placed; whereupon grew the word of Sortes Virgilianae, when by sudden opening Virgil’s book they lighted upon some verse of his making.
Fauré plays Fauré
shatters the evenness of skies—
she peers, stunned, from cell 22
that such dumb minuteness
can shake the earth
from Summer was a fast train without terminals
publisher: Spinifex, North Melbourne, 1998
May 18, 2013
The Beautiful German Language
Enda O'Doherty in Eurozine:
Germans have featured prominently among those who have sometimes had difficulty in believing that their native tongue is quite up to the mark, or, as we say in our barbarous contemporary jargon, fit for purpose. The German invention of printing in the mid-fifteenth century was certainly to give a boost to the prestige of vernacular languages (at the expense of the universal language, Latin). It was also to be important in spreading the new religion, Protestantism. Martin Luther enthused:
Printing is God's most recent gift and his greatest. Through it, in effect, God wishes to make known the true religion to the whole world, right to the extremities of the Earth.

And so it came to pass. But the Whiggish Protestantism (still alive in the popular cultural histories of Lord Bragg, formerly Melvyn) which celebrates the unstoppable spread of the Word, to be read and chewed over by the individual in private – an improving substitute for the "nonsense" mumbled by the priest in an incomprehensible language – tends to forget that in the short term virtually no one could read, whereas all could see and grasp the meaning of the wall paintings, statues and altarpieces in the church, which the Protestants for the most part were so keen to efface or destroy. The short term in this context, we should remember, was rather long. Mass literacy came to England only in the nineteenth century.

Luther, however, after his initial enthusiasm, seems to have had second thoughts about the wisdom of translating the Bible into German and making it available to everyone, or everyone who could read (he was to find, disturbingly, that they disagreed with him about what it meant).
The World’s Bloodiest Civil War
John B. Thompson reviews Stephen R. Platt's Autumn in the Heavenly Kingdom: China, the West, and the Epic Story of the Taiping Civil War and Tobie Meyer-Fong's What Remains: Coming to Terms with Civil War in 19th Century China in the LA Review of Books:
The Taiping Civil War (1850–1864) started with a dream. Hong Xiuquan, a young scholar from Guangdong, a province in southern China, aspired to the government position and the unassailable status guaranteed by success in imperial civil service examinations. However, in 1837, Hong flunked the provincial-level examination in Canton, the province’s major city, for the third time and returned home broken. He collapsed into episodic trances in which he traveled to a heavenly realm and met an old man in a black dragon robe. The man, whom Hong understood to be his “father,” stood grieving at the edge of heaven, dismayed by the people of his creation who had been led astray by demons. He dispatched Hong to earth, along with a middle-aged man identified as Hong’s “elder brother,” to slay these devils.
Until 1843, Hong had no vocabulary to explain his visions. That year, he rediscovered a collection of Bible passages he had obtained in Canton years before, and the meaning of his visions became clear: his heavenly father was God. His elder brother was Jesus. The demons were China’s false idols and Hong was China’s savior. Hong immediately began to preach his vision along with the New Testament in the mountains of southern China and quickly amassed a growing following among the farmers and villagers.
Over time, Hong resolved to establish on earth the kingdom he had seen in heaven. He redefined the demons from the idols of China’s cultural inheritance to the alien Manchu rulers of the Qing Dynasty. “God had divided the kingdoms of the world […] just as a father divides his estates among his sons,” Hong said. “Why should these Manchus forcibly enter China and rob their brothers of their estate?” In 1850, Hong and his Society of God Worshippers openly rebelled against Qing authorities. In 1851, Hong formally declared the existence of the Taiping Heavenly Kingdom with himself as Heavenly King. By 1853, his resourceful, ever-growing army had captured the old Ming Dynasty capital of Nanjing. From that point until the end of the civil war, there were effectively two states within China.
Claude S. Fischer in Boston Review:
Some observers respond to questions raised by the Flynn Effect by dismissing intelligence testing as an exercise in cultural domination. This ostrich-like response ignores the fact that IQ scores, whatever they measure, consistently correlate with important outcomes such as how well people perform their jobs and how long they live. Such dismissal also ignores the growing evidence that there is a physical, neurological basis to cognition and cognitive skills.
A more serious critique of the research attacks the definition of intelligence. Researchers in the intelligence field define it as a general capability to reason, understand complex ideas, think abstractly, and solve problems. You can measure it, they argue, using IQ tests. Critics consider these tests to be superficial and argue that they ignore other kinds of intelligence such as emotional intelligence or deeper traits such as wisdom. While researchers cannot track historical trends in wisdom, they are trying to wise up about the apparent historical increase in IQ.
One might suspect that the tests have gotten easier. They haven’t. In fact, the tests have gotten harder in order to keep the average IQ at one hundred. By reversing that process, Flynn showed the long-term rise in real performance.
Other challengers argue that we are not really smarter than our great-grandparents; it’s just that people today learn the answers to test questions in school or have become familiar with testing. However, scores on the parts of tests that are most easily taught and are the most culture-laden—say, recognizing vocabulary or knowing geography—have not changed much. Scores on those parts of tests that measure the most abstract, presumably culture-free thinking—say, drawing logical inferences from patterns in designs—have risen the most. The sorts of thinking that are supposedly most detached from classroom and cultural learning are the ones that have really improved.
So if a real increase in some kind of cognitive ability is under way, the question is why.
Think About Nature
A conversation with Lee Smolin in Edge:
The main question I'm asking myself, the question that puts everything together, is how to do cosmology; how to make a theory of the universe as a whole system. This is said to be the golden age of cosmology and it is from an observational point of view, but from a theoretical point of view it's almost a disaster. It's crazy the kind of ideas that we find ourselves thinking about. And I find myself wanting to go back to basics—to basic ideas and basic principles—and understand how we describe the world in a physical theory.
What's the role of mathematics? Why does mathematics come into physics? What's the nature of time? These two things are very related, since mathematical description is supposed to be outside of time. And I've come, through a long evolution since the late 80s, to a position which is quite different from the ones I held originally, and quite surprising even to me. But let me get to it bit by bit. Let me build up the questions and the problems that arise.
One way to start is what I call "physics in a box", or theories of small isolated systems. The way we've learned to do this is to make an accounting or an inventory—a listing of the possible states of a system. How can the system possibly be? What are the possible configurations? What are the possible states? If it's a glass of Coca Cola, what are the possible positions and states of all the atoms in the glass? Once we know that, we ask, how do the states change? And the metaphor here—which comes from atomism, from Democritus and Lucretius—is that physics is nothing but atoms moving in a void and the atoms never change.
Lift up your voices: The century-long battle for women's freedom

From New Statesman:
Given its successes, feminism today looks very different from the feminism of 1913. There is, wonderfully and rightly, something much less embattled, much more inclusive and much more relaxed about feminists now. Rather than chasing the chimera of the “perfect mother” and the “perfect citizen”, we can accept one another in our flawed variety. The humourlessness that sometimes characterised women’s politics in the early 20th century has disappeared and many leading voices in contemporary feminism, from Caitlin Moran to the Vagenda magazine, use humour as their main weapon. The new tone of feminism suggests that instead of being furious and earnest all the time, we can begin to enjoy how far we have come. This humour is hugely attractive to younger women and effective in divesting the enemy of much of his power simply by giggling at him. It rests very much on the progress that has already been made and the ability of the funny feminists to build their audiences through social media and the internet, rather than having to rely exclusively on editors who may not be in on the jokes. Yet I hope that, however much we love the funny feminists, we do not forget to love some of the other aspects of feminism – aspects that may be harder to find on one’s iPhone and harder to laugh about.
The unifying force of the movement for suffrage is not going to be seen again in our generation. But I can still see the power of activism and it is heartening to see women still coming together to demonstrate this power through action in everyday life, not just over the internet or through the published word. Over the past 12 months, I have taken part in a lobby of parliament organised by UK Feminista; in One Billion Rising, an international day of activism against violence against women organised by V-Day; and in a number of conferences and public gatherings at which women are learning from one another face to face. Such activity can sometimes feel time-consuming and frustratingly slow but it also leaves me with a renewed understanding of the process of creating change. And that is vital, because even though feminism has achieved so much, there is still so much to be done. While this government is making decisions on benefits, education and housing that are forcing more women and children into poverty, we have to protest. While women are still experiencing rape and sexual assault in their everyday lives and finding that the perpetrators walk free, we need to stand up for change. While women are still too often absent from public life, we need to make sure our voices are heard loudly, even angrily. The other crucial aspect of feminism that should not be forgotten is the importance of listening to stories about what goes on beyond the comfort of our lives. There is always time to make jokes about thongs and pubic waxing or about women’s magazines and bad sex but funny feminism is not always great at bringing in other issues. After all, there is not much to laugh about in women having to queue at food banks, or being trafficked into forced prostitution or being killed in the name of honour.
What the Woodpecker Told Me
Rennie Sparks in The New York Times:
I have a lot of notebooks full of scribbles. They often don’t lead to anything, but sometimes, on lucky days, the scribbles begin to connect into a mystery that I can not look away from until it is laid bare. What was once a jumble of words and ideas begins to feel magnetized and full of import. Oh, those are lucky days! Mostly I just sit on the couch and follow the sparks here and there until they disperse. That morning that began with a tap-tap-tapping led to an afternoon in which I learned a lot about woodpeckers. I found out that woodpeckers have very long tongues with barbs on the end. I found out that woodpeckers have specially designed skulls that protect them from impact, like a built-in crash helmet. I also found out that woodpecker hearing is amazingly acute. These birds can actually hear larvae slithering inside a tree trunk as they are flying past overhead. Yes! That fact resonated with me. I felt my head tingling with excitement.
I sat awhile and tried to imagine what it might be like to have hearing so acute that I could hear bugs wriggling through trees. At first it seemed a wonderful thing — to hear great orchestras within rocks and mountainsides, the secret songs of air and earth. And then I realized how distracting it would be. With such sensitive hearing wouldn’t we all end up lying for days with our ears pressed to dirt piles and knot holes, forgetting to eat, forgetting to sleep, utterly transfixed by the tiniest sounds? Why then, I wondered, aren’t woodpeckers driven to insane distraction by their acute hearing? How can these birds stand to hammer away at a tree trunk when their ears are sensitive enough to hear bugs crawling inside wood? Is the woodpecker brain, then, fine-tuned to hear some sounds acutely, but to ignore other sounds completely? What parts of reality do our own brains actively filter out as we try and perceive the world?
Suddenly, and seemingly without context, I thought of Mary Sweeney. Mary Sweeney was a woman briefly mentioned in Michael Lesy’s book, “Wisconsin Death Trip.”
Fetish and brutish
The big, desert city of El Paso, on the US border with Mexico, for years felt like a lesson from the work of Giorgio Agamben. In his book Homo Sacer: Sovereign Power and Bare Life, Agamben analyzes a law from the Roman Empire specifying that if a man committed certain crimes, all of his citizenship rights would be revoked. This punishment, oddly enough, rendered the criminal a homo sacer, a sacred man, whom it was forbidden to ritually sacrifice to the gods. Yet in the everyday world the sacred man could be killed by anyone, with no penalty at all invoked on the killer. He inspired the highest veneration and the basest contempt. He constituted yet another category from Agamben’s work: bare life, or human existence stripped of its social nature and reduced to the purely biological. Bare life defines brutes. Homo sacer, brutes fetishized.

more from Debbie Nathan at n+1 here.
the calvino letters
When Italo Calvino was becoming a big name in the English-speaking world in the 1970s and 1980s, he was seen as a somewhat rarefied figure: an Italian master of French-style abstraction who seemed to observe life from a serene ironic distance. And because of the timing of his death – at 61, after a cerebral haemorrhage, in 1985 – the prevailing image of him outside Italy has more or less stayed that way. His witty meta-novel If on a Winter’s Night a Traveller (1979) has for years been used to teach the rudiments of postmodernism, while Invisible Cities (1972) – “a totally decadent book”, he wrote casually in a letter – has acquired the status of a fetish among architects, urban theorists and purveyors of art-speak. Yet the role of chic metropolitan guru wasn’t one that Calvino sought or felt comfortable in. An agronomist’s son from the Ligurian Riviera, he started out as a writer under the auspices of the Italian Communist party, having joined while fighting as a partisan during the second world war. Hemingway and Chekhov were his first literary models and, early on, he was stymied by his unsuccessful efforts to write a novel documenting social conditions in industrial Turin.

more from Christopher Tayler at the FT here.
The Letters of William Gaddis
“America has odd ways of making one feel one’s self a failure. And looking over the fragments of our correspondence assembled, I am just terribly struck at the consistency, from my end, of howls about money, and from yours of reassurances, hopes, encouragement: of course this isn’t really news (and probably hardly unique in your file of writers), but seeing it so all at once did overwhelm me with a clearer sense of what I’ve put you through year after year, and I wish to Christ it had finally come up on the note of triumph you have hoped and worked so hard for.” Like most sensible serious writers, Gaddis never actually planned for his “triumph” to be posthumous; nor was he trying to write books that would be considered unreadable (usually by people who hadn’t read them). “What pained me most about the reviewers,” he writes in 1960, referring to the notoriously inadequate reception for “The Recognitions,” “was their refusal — their fear — to relax somewhat with the book and be entertained.” To be fair, one can understand why your average reviewer might not have been able to “relax” when faced by a thousand-page novel packed with theological allusions, inventive (but consistent) punctuation, dense, tiny typography and huge, tree-trunk-wide paragraphs. It’s a daunting task just lifting one of Gaddis’s best novels — let alone reading it.

more from Scott Bradfield at the NY Times here.
Saturday Poem

Through the Speckled Land
She won’t speak to me anymore, this place
my tongue is received with poor grace.
My roots penetrated only so far
and they wither for lack of water.
Salt was spread on the upper scraw
and ploughed through to the lower layer.
She can no longer nourish her brood,
In my own land as a stranger viewed.
On the road between two cities
each of which has two names,
I read the words on the signs.
I am travelling through the speckled land
and every town here has two names.
Claonadh – Clane
Cill Dara – Kildare
Baile Dháith – Littleton
Cúil an tSúdaire – Portarlington
the native name
in italic script
a biased telling of the lore of place
the native name
in the lesser script
a muted telling, in slow fade . . .
As I travel through the speckled land
I move from white to black
my journey is taken aslant
the way I follow is zig-zagged.
I am the knight going the long way round
to attack from behind, to try to confound
but there are castles I can’t assault
and clerics before me, proud and preening,
I can’t protect my own queen even
my road is blocked by lowly pawns.
Between two hues
between two names
between two views
between two words
between two tongues
between two worlds
I live my life
between two lives.
by Colm Breathnach
from An Fearann Breac
publisher: Coiscéim, Dublin, 1982
May 17, 2013
Mortify Our Wolves
Though I have in my life experienced gout, bladder stones, a botched bone marrow biopsy, and various other screamable insults, until recently I had no idea what pain was. It islands you. You sit there in your little skeletal constriction of self—of disappearing self—watching everyone you love, however steadfastly they may remain by your side, drift farther and farther away. There is too much cancer packed into my bone marrow, which is inflamed and expanding, creating pressure outward on the bones. “Bones don’t like to stretch,” a doctor tells me. Indeed. It is in my legs mostly, but also up in one shoulder and in my face. It is a dull devouring pain, as if the earth were already—but slowly—eating me. And then, with a wrong move or simply a shift in breath, it is a lightning strike of absolute feeling and absolute oblivion fused in one flash. Mornings I make my way out of bed very early and, after taking all of the pain medicine I can take without dying, sit on the couch and try to make myself small by bending over and holding my ankles. And I pray. Not to God, who also seems to have abandoned this island, but to the pain. That it ease up ever so little, that it let me breathe. That it not—though I know it will—get worse.

more from Christian Wiman at The American Scholar here.
the hudson review
The Hudson Review may lack the name recognition of the Paris Review (founded in 1953) or the New York Review of Books (1963). Its circulation is just 2,500, and its annual budget is almost enough to buy a studio apartment on the Upper West Side. What it has, though, is an extreme clarity of mission: publishing worthy authors who keep alive the love of literature. It's all considerably less bewildering once a reader is introduced to the magazine's editor, Paula Deitz, who combines a quick eye for talent with a nearly career-long devotion to the project. Ms. Deitz, who is 74, became editor in 1998, just as the Internet began to dissolve the established media order. By that time, the Hudson Review had earned its reputation for independence, publishing authors from Wallace Stevens and Sylvia Plath to Octavio Paz and Joyce Carol Oates. The magazine was founded by Frederick Morgan and Joseph Bennett, two young men from Princeton University's class of 1943 who were encouraged by their teacher, poet Allen Tate, to establish a magazine. After returning from World War II, the pair set up shop inside the Sapolio soap factory, owned by Morgan's father. Located at the corner of West and Bank streets, it overlooked the Hudson River, hence the name of the journal.

more from Pia Catton at the WSJ here.
wagner in new york?
In his last years, Richard Wagner often spoke of immigrating to America. The composer had enthusiastically greeted the founding of the German Empire in 1871, but in the following decade, as Bismarck and the Kaiser failed to provide funds for his nascent festival at Bayreuth, his chauvinism waned, and he entertained the idea of escaping to the New World. Cosima Wagner, his second wife, wrote in her diary in 1880: “Again and again he keeps coming back to America, says it is the only place on the whole map which he can gaze upon with any pleasure: ‘What the Greeks were among the peoples of this earth, this continent is among its countries.’” In consultation with Newell Jenkins, an American dentist who had become a family friend, Wagner drew up a plan whereby American supporters would raise a million dollars to resettle the composer and his family in a “favorable climate”; in return, America would receive proceeds from “Parsifal,” his opera-in-progress, and all other future work. “Thus would America have bought me from Europe for all time,” Wagner wrote. The pleasant climate he had in mind was, surprisingly, Minnesota. What might have happened if, against all odds, Wagner had realized his American scheme?

more from Alex Ross at The New Yorker here.
Here’s how to change the world
“How to Change the World” takes as its modest premise the idea that everyone is capable of creating massive, global change — if only we start small and set manageable goals. It’s just like quitting smoking! The book’s author, British journalist and life coach John-Paul Flintoff, has some experience in this area: for his last book, “Sew Your Own,” he learned to make all his own clothes. This allowed him to opt out of the unethical labor practices of the big clothing companies, and also gave him something to do with an old sewing machine. He reports that shirts are his favorite things to make. “How to Change the World” is different from “Sew Your Own” in that it doesn’t offer a roadmap for a particular kind of change — instead, Flintoff invites us to imagine what kinds of change we’d like to make, and suggests some ways to go about it.
For example, Flintoff tells a story about how he got very worried about global warming and decided the only solution was for everyone to grow their own produce. It wasn’t enough to just change his own habits — everyone would need to pitch in to make a dent in carbon consumption. He wanted to start with the people living in his section of London, but rather than harangue his neighbors, Flintoff devised a plan. First, around harvest time, he brought an armload of ripe tomatoes around to his neighbors’ doors, explaining he couldn’t possibly eat all the fruit he had grown. They accepted gladly. Phase I, Buttering Up, was a success. The next spring, he brought around tomato seedlings to the same neighbors. He had some cover story about having planted too many, and asked if they would be interested in growing their own tomatoes this year. Remembering the good tomatoes from last year, most accepted the plant. He had essentially tricked his neighborhood into growing its own tomatoes.
Devious? Maybe. Effective? In Flintoff’s case, yes. Although Flintoff doesn’t advise trickery in every case, he has some ideas about how to convince others to join in your righteous mission — including, paradoxically, offering them a gracious way out. “People do like to give advice or help,” he said. “The only time they don’t feel that happy is if they feel cornered.”
Obama Must Make Fighting Climate Change a National Project, or Die the Death of a Thousand Scandals
Juan Cole in Informed Comment:
President Obama, like George H. W. Bush, has a problem with the ‘vision thing.’ And that is the reason for which he is being dogged by critics and ‘scandals.’ He presides over a huge bureaucracy and things will go wrong in it, for which he will be blamed if he allows others to control the narrative. Moreover, it is always possible to depict perfectly ordinary decisions by bureaucrats as somehow outrageous.
Thus, there was no cover-up in Benghazi, but all governments would want to be careful about how talking points were shaped in the aftermath of a crisis (if anything, the one most responsible for the insistence that crowd reaction against an Islamophobic film was part of the Benghazi story was Republican David Petraeus, then head of the CIA).
The IRS scrutiny of Tea Party groups applying for tax exempt charitable status derived from a legitimate concern at the more than doubling of such requests after the Citizens United ruling, and a suspicion that the groups were backed by Republican billionaires intending to use them for politics, not charity. It may be that the scrutiny was sometimes invidious, but it is not obvious whether the bureaucrats actually did anything out of the ordinary (left-wing requests for tax exempt status were flat; if they had suddenly doubled presumably they would have attracted attention, too.)
But these minor bureaucratic issues only crowd in to dominate the headlines because politics, like nature, abhors a vacuum. Obama should be making the headlines, should be setting a coherent national agenda. He offered to drive the USA Bus for another four years. But where is he taking us? Not clear.
Stephen Wolfram: Dropping In on Gottfried Leibniz
Stephen Wolfram in his blog:
I have always found Leibniz a somewhat confusing figure. He did many seemingly disparate and unrelated things—in philosophy, mathematics, theology, law, physics, history, and more. And he described what he was doing in what seem to us now as strange 17th century terms.
But as I’ve learned more, and gotten a better feeling for Leibniz as a person, I’ve realized that underneath much of what he did was a core intellectual direction that is curiously close to the modern computational one that I, for example, have followed.
Gottfried Leibniz was born in Leipzig in what’s now Germany in 1646 (four years after Galileo died, and four years after Newton was born). His father was a professor of philosophy; his mother’s family was in the book trade. Leibniz’s father died when Leibniz was 6—and after a 2-year deliberation on its suitability for one so young, Leibniz was allowed into his father’s library, and began to read his way through its diverse collection of books. He went to the local university at age 15, studying philosophy and law—and graduated in both of them at age 20.
More here. [Thanks to Justin E. H. Smith.]
Moving Atoms: Making The World's Smallest Movie
The Ayatollah's Game Plan
Mohsen Milani in Foreign Affairs:
In normal presidential elections, it is only the candidates and their platforms that matter. Not so in Iran. There, the key player in the upcoming presidential elections is the septuagenarian supreme leader, Ayatollah Ali Khamenei, who is constitutionally barred from running for the office. He recognizes that the election result will have a profound impact on his own rule and on the stability of the Islamic Republic. So behind the scenes, he has been doing everything in his power to make sure that the election serves his interests. But the eleventh-hour declarations of candidacy by Hashemi Rafsanjani, Iran's president between 1989 and 1997, and by Esfandiar Rahim Mashaei, President Mahmoud Ahmadinejad’s chief of staff and close confidant, have made his task more difficult.
The first part of Khamenei’s four-pronged strategy is to conduct an orderly election. The nightmare scenario for Khamenei is a repeat of the June 2009 presidential election, in which allegations that Ahmadinejad had stolen victory from his challenger, Mir Hossein Mousavi, led to massive demonstrations and the birth of the popular reformist Green Movement.
Khamenei could have stayed above the fray, as elites expected him to do. Instead, he lost credibility as a neutral arbiter when he sided with Ahmadinejad, rejected all allegations of fraud, and blamed Ahmadinejad’s opponents for inciting violence. His offer of public support for the president opened a fissure among the elites that has never quite healed. It also preceded a massive crackdown on activists who were castigated as American stooges and arrested. Even more, the disputed election alienated millions who felt truly robbed of their voices.
Friday Poem
Meeting at Night
The gray sea and the long black land;
And the yellow half-moon large and low:
And the startled little waves that leap
In fiery ringlets from their sleep,
As I gain the cove with pushing prow,
And quench its speed i’ the slushy sand.
Then a mile of warm sea-scented beach;
Three fields to cross till a farm appears;
A tap at the pane, the quick sharp scratch
And blue spurt of a lighted match,
And a voice less loud, through joys and fears,
Than the two hearts beating each to each!
by Robert Browning
Shocks to the brain improve mathematical abilities
The 'three Rs' of reading, writing and arithmetic could become four. Random electrical stimulation, a technique that applies a gentle current through the skull, leads to a long-lasting boost in the speed of mental calculations, a small laboratory study of university students has found. If unobtrusive brain stimulation proves safe and effective in larger classroom trials, the technology could augment traditional forms of study, says Roi Cohen Kadosh, a cognitive neuroscientist at the University of Oxford, UK, who led the study. “Some people will say that those who are bad at mathematics will stay bad. That might not be the case.” Cohen Kadosh’s team made headlines in 2010, when it showed that a different form of electrical jolt — transcranial direct-current stimulation (TDCS) — helped volunteers to learn and remember a number system made up of unfamiliar symbols.
In TDCS, electrical current flows continuously between electrodes placed on different parts of the scalp, activating neurons in one area and quieting them in another. It feels like a baby tugging gently on your hair. By contrast, with transcranial random-noise stimulation (TRNS), “people ask ‘are you sure it’s on?’” says Cohen Kadosh. As the name implies, the technique involves electrical currents flowing through electrodes in random pulses, activating neurons in multiple brain areas. There is no evidence to suggest that either method is unsafe, he says. In the latest study, his team tasked 25 Oxford students with rote memorization of mathematical facts (such as 2 x 17 = 34) and more complicated calculations (for example, 32 – 17 + 5). Thirteen volunteers received TRNS to their prefrontal cortices, a part of the brain involved in higher cognition, while doing these problems for five days in a row. They became faster at both tasks than volunteers in the control group, who were electrically stimulated only briefly.
May 16, 2013
The First New Atheist? Kierkegaard
Our own Morgan Meis in The Smart Set:
Søren Kierkegaard was born in Denmark on May 5, 1813. He was a difficult and troublesome boy. He quarreled with his father and lived a flippant and self-indulgent life as a young man. Then he had a conversion experience. He broke with his fiancée and became an urban hermit of sorts. He studied philosophy and started to write. He believed that he had a truth to tell the people of his time. The people didn't want to be told — do they ever? This caused him to fight with his fellow Danes and anyone else who got in his way. He became an object of ridicule around Copenhagen. The local papers made fun of him for his hunched back and clubbed foot. He wrote many books under various false names, most of which were ignored. He died in relative obscurity at the age of 42.
Thus, the short and painful life of Søren Kierkegaard. Over the last 200 years, however, Kierkegaard's writings have resurfaced in influential places. A mad German named Friedrich Nietzsche was impressed with Kierkegaard's writings. He helped to keep Kierkegaard from falling into complete oblivion. Another rascally German rediscovered Kierkegaard in the early 20th century. This was Martin Heidegger who, unintentionally, turned Kierkegaard into an intellectual predecessor of Existentialist philosophy. More recently the Post-Modernists rediscovered Kierkegaard, fascinated by his use of fragmentary writing and multiple narrative voices. Kierkegaard is the philosopher who will not go away.
Today, at the 200th anniversary of his birth, Kierkegaard seems as relevant as ever. That’s because there is a public discussion about faith in America today. Kierkegaard’s central concern was faith and the problems of faith. Today, the evolutionary biologist and sometimes children's author Richard Dawkins is at the forefront of the faith debate. The philosopher and cognitive scientist Daniel Dennett is a frequent contributor, as well as the neuroscientist Sam Harris. The late, great Christopher Hitchens was the angriest and funniest participant. We'll call these figures The New Atheists.
KFC smugglers bring buckets of chicken through Gaza tunnels
Ahmed Aldabba in the Christian Science Monitor:
For six years, Rafat Shororo longed for the taste of a KFC sandwich he had eaten in Egypt. This week, he got his finger lickin' fix at home in the Gaza Strip after a local delivery company managed to smuggle it from Egypt through underground tunnels.
"It has been a dream, and this company has made my dream come true," says Mr. Shororo, an accountant, as he receives his order from the delivery guy.
The al-Yamama company advertises its unorthodox new fast-food smuggling service on Facebook. It gets dozens of orders a week for KFC meals despite having to triple the price to 100 shekels ($30) to cover transportation and smuggling fees. The deliveries go from the fryers at the Al-Arish KFC joint 35 miles away to customers' doorsteps in about three hours.
The fact that the tunnels operate quickly and cheaply enough for the Colonel’s secret recipe to be enjoyed in the tightly controlled Gaza Strip shows just how much of a sieve the Egypt-Gaza border has become.
How do Finnish kids excel without rote learning and standardized testing?
Erin Millar in The Globe and Mail:
One September morning in 2003, a group of engineers gathered for a marathon brainstorming session at NASA’s Jet Propulsion Lab in Pasadena, Calif. Their intent was ambitious; they wanted to dream up a new way to land spacecraft on Mars.
The meeting stretched to three days of scribbling options on whiteboards, and the solution they came to for landing the SUV-sized rover Curiosity was radical. When Curiosity was 10 kilometres above ground, a contraption they called a sky crane would detach. Then rocket engines would slow the crane to 3 km/h, so it almost hovered above ground as it gently lowered the rover on cables to the ground before flying off and crash landing a safe distance away.
The idea marked a dramatic reversal in NASA’s design philosophy by favouring a complex, risky technology over the simpler, safer, albeit imprecise, previously used options of airbags and legs. Observers thought it was crazy. But on Aug. 5, 2012, after a nail-biting entry into the atmosphere of Mars, Curiosity landed safely.
The sky crane typifies a modern sort of innovation; the big, transformative ideas of today are often complicated and collaborative. Innovation is no longer necessarily about inventions produced by a single person, but about collective knowledge and team-based problem solving.
So if innovation requires people who thrive on collaboration, why are our education systems so focused on individual achievement?
Lennon's "Imagine" and McCartney/Wings' "Band on the Run" overlaid: One way of reuniting (some of) the Beatles
For my sister Sughra, a fan of both:
The Superhero Factory
Paul Morton skips through Sean Howe's history of Marvel comics at The Millions:
At some point, at 4, at 8 or 25, every child learns he will not become a superhero. It won’t be his first disillusionment. He will meet men and women who won’t return his affections. He will discover he has only a limited talent for the vocation he honors. He’ll still indulge his initial fantasies from time to time, usually through stories that imbue the superhero mythos with a hint of realism, some concept of what a superhero would look and act like if he inhabited our world. In the ’60s Marvel Comics comforted its readers by creating superheroes as neurotic as themselves. Ben Grimm was a powerful but impotent rock-man who could only be sated by the love of a blind woman. Reed Richards had no curiosity for the sexual possibilities of his body, which could stretch in any and all directions. By the ’80s, the concept of superhero-comic realism led to the ultra-violence of DC’s Watchmen and The Dark Knight Returns. But in the ’60s Marvel Comics avoided anything like Alan Moore’s misanthropy and Frank Miller’s fascism. The Marvel Universe was at once familiar and psychedelic, mature and juvenile, populated by likable good-looking freaks. It was a happy place.
Read the rest of the essay here.
tracking baba yaga
Why do Russian literary creations, from Gogol’s promenading nose to Bulgakov’s talking cat, hold such a captivating and enigmatic place among the classics of world literature? Perhaps the answer lies with the old woman who haunts Russian fairy tales. “If people are too inquisitive,” says Baba Yaga to her visitor, “I eat them.” This abrupt admonition, like many of the jarring one-liners in Robert Chandler’s new collection of Russian magic tales, at once surprises and perplexes, inviting us into a world where logic and understanding must yield to imagination. Russian Magic Tales from Pushkin to Platonov is full of bears who force children to play blind man’s buff, livestock who give birth to human heroes, and talking gates. Like all folk tales, these stories contain moral elements (humility is rewarded, vanity is punished), but they are worth retelling for their delightful absurdities.

More from Amelia Glaser at the TLS here.
a fault line of European civilization
By the mid-thirties there were already fifty-seven large cinemas in Moscow and hundreds of other places where films could be shown. The party was very well aware of the propaganda potential of the medium, and generous provision was made for cinemas in the general plan for the city. Naturally, the medium was not untouched by the omnipotent party hand. Sergei Eisenstein was forced to withdraw his film Bezhin Meadow, a dramatisation of the tale of Pavlik Morozov, an apparently apocryphal fable of an odious child who shopped his own father to the authorities and was then murdered by his family. Eisenstein went on to redeem himself in Stalin’s eyes by producing Aleksandr Nevskii, a panegyric of Russian greatness, the following year. The Soviet film industry was very productive, and not all this production was propagandistic. In music, the USSR could show some outstanding talents, and these were the years when David Oistrakh and Emil Gilels, subsequently to achieve world fame, came to public notice. After a lively debate, Pravda declared authoritatively that there was a place for proletarian Soviet dzhaz. Its main exponent was Leonid Utesov, who rose through the cabaret scene to become one of the most popular Soviet musicians. A typically “Soviet” form of light music was provided by Isaak Dunaevskii, prominent as the writer of the score for Soviet musicals such as The Jolly Fellows. The most famous Soviet musician at the time was of course Shostakovich. His opera, Lady Macbeth of the Mtsensk District, had been denounced by Pravda in January 1936 as “chaos instead of music”. He spent 1937 working on his Fifth Symphony, which was premiered in Leningrad to great acclaim in November of that year.

More from Pádraig Murphy at the Dublin Review of Books here.
Calvino does not have any sort of eye on posterity, as so many other modern letter-writers do. He is living in the present, not constructing a future monument. This may offer something of a surprise to the reader who comes to the letters from the fiction and who may at first miss the expected intricacy and play. It’s not that there is no fun in the letters, but the sense of direct communication, of a man being as clear as he can about a host of matters, complex and simple, is quite different from that created by the artistic density of Calvino’s prose fiction. In his art, the wit and the irony are ways of reflecting the difficulties of the world while hanging on to his sanity – instruments of reason in a world of madness. “I am in favour,” Calvino says in one letter, “of a clown-like mimesis of contemporary reality.” Clowns are often sad and all too sane; but their relation to reality is oblique. Calvino’s writing is part of a great literary project of hinting and suggesting, making memorable shapes and images, rather than giving information or offering explanations. In his letters, Calvino tells rather than shows his correspondents what he means – with great and often moving success.

More from Michael Wood at The New Statesman here.
Thursday Poem
If You Could See Her After Drinking Wine . . .
If you could see her after drinking wine,
Wine from Chile of the berry-red kind
Prancing ahead of me in the middle of the night
Through the business district with her face alight
Having left the pub late and a little tight.
Ah, if you could see her after drinking wine.
If you could see her after drinking wine.
Wine called Hoch from Germany’s Rhine
Her hands like birds fluttering in flight
In a sugawn café when the day is high
Her voice louder than the crowd’s by just a mite.
Oh, if you could see her after drinking wine.
If you could see her after drinking wine,
Beaujolais Nouveau, strawberries and cream
At a garden party under autumn’s gleam
Her bike by the gate lost in a dream
Of the road home as the sun goes to sleep.
Ah, if you could see her after drinking wine.
If you could see her after drinking wine.
Wine from California’s grape-fields fresh and new
Hopping through the Stack-of-Barley a bit askew
In her oh so new blue suede shoes.
If you could see her, as I see her, after drinking wine . . .
by Colm Breathnach
publisher: Coiscéim, Dublin, 2006
Evolution shapes new rules for ant behavior
In ancient Greece, the city-states that waited until their own harvest was in before attacking and destroying a rival community's crops often experienced better long-term success. It turns out that ant colonies that show similar selectivity when gathering food yield a similar result. The latest findings from Stanford biology Professor Deborah M. Gordon's long-term study of harvester ants reveal that the colonies that restrain their foraging except in prime conditions also experience improved rates of reproductive success. Importantly, the study provides the first evidence of natural selection shaping collective behavior, said Gordon, who is also a senior fellow at the Stanford Woods Institute for the Environment.
A long-held belief in biology has posited that the amount of food an animal acquires can serve as a proxy for its reproductive success. The hummingbirds that drink the most nectar, for example, stand the best chance of surviving to reproduce. But the math isn't always so straightforward. The harvester ants that Gordon studies in the desert in southeast Arizona, for instance, have to spend water to obtain water: an ant loses water while foraging, and obtains water from the fats in the seeds it eats. The ants use simple positive feedback interactions to regulate foraging activity. Foragers wait near the opening of the nest, and bump antennae with ants returning with food. The faster outgoing foragers meet ants returning with seeds, the more ants go out to forage. (Last year, Gordon, Katie Dektar, an undergraduate, and Balaji Prabhakar, a professor of computer science and of electrical engineering at Stanford, showed that the ants' "Anternet" algorithm follows the same rules as the protocols that regulate data traffic congestion in the Internet).
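The positive-feedback rule described above — more returning foragers means more outgoing foragers — can be sketched in a few lines of code. This is a minimal illustrative simulation, not Gordon's actual model: the function names, parameter values, and the linear feedback rule are all assumptions made for the example, chosen only to show why the same rule throttles foraging in poor conditions and ramps it up in rich ones.

```python
def step(outgoing_rate, return_rate, gain=0.5, decay=0.9, cap=50.0):
    """One update of the colony's outgoing foraging rate.

    outgoing_rate: foragers leaving the nest per minute
    return_rate:   foragers returning with seeds per minute
    gain, decay, cap are hypothetical tuning constants.
    """
    # Activity decays on its own, and is boosted by successful returns:
    # the faster seeds come back, the more foragers head out.
    return min(decay * outgoing_rate + gain * return_rate, cap)


def simulate(food_richness, minutes=60):
    """Run the feedback loop; returns are proportional to effort and food."""
    rate = 1.0
    history = []
    for _ in range(minutes):
        returns = food_richness * rate  # richer patch -> faster returns
        rate = step(rate, returns)
        history.append(rate)
    return history


rich = simulate(food_richness=0.9)
poor = simulate(food_richness=0.1)
# In a rich environment, activity compounds upward until it saturates;
# in a poor one, the same rule lets foraging die back toward zero --
# the "restraint" the colonies in the study exhibit.
```

The design point is that no ant measures overall food availability; restraint emerges from the same local interaction rule, which is what makes the comparison to Internet congestion control apt.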
"It's like the British during the Blitz:" How It Feels to Lose Your Breasts
Liz Kulze in The Atlantic:
When I first saw Angelina Jolie's announcement about her double mastectomy, my mind immediately conjured up a picture of her once-magnificent chest, the prominent supporting actors in Tomb Raider eliminated from her commanding figure. But of course, her famous breasts were skillfully, and I assume rather beautifully, restored. In an age where stardom now includes the fetishization of particular body parts, she had no other choice. Yet, as equalizing and humanizing as Jolie's words were, the reality of mastectomy is quite different for much of the world, and cuts a bit deeper than even Jolie herself has bravely let on. Much of yesterday's discussion was right in praising her brave choice and assuring us that Jolie, and all women like her, are indeed "still women." However, it is both flippant and naive not to acknowledge that this procedure changes women, however intact their femininity remains. As any survivor will tell you, breast cancer shows no clemency. As a girl my cousins and I used to sneak into my grandmother's room to play with her boobs. She kept them in her sock drawer, palm-sized silicone inserts that gave one the sensation of a balloon filled with jelly. Her real breasts had been removed at the age of 57, before the tumors had a chance to prey on the remainder of her still-youthful figure. Looking back, I realize I never took a moment to think about the experience she had withstood. I had known her in no other way. The subdued contours of her silk blouses were entirely normal to me. But as I spent last night contemplating my own two breasts (and asking my boyfriend obnoxious questions like, "What do these mean to you?"), only then did I begin to understand both the literal and figurative parts of her that were lost.
I called her, and to my surprise, she had once been a rather voluptuous woman. "My breasts were huge!" she told me as if recalling some exciting memory of the past, "Huge! But you know after you have seven children they get pendulous. I had to sort of stuff them into a bra." After losing them, the most harrowing part, she tells me, was the loss of sensitivity—something faced even by those like Jolie who have reconstruction. It's a kind of sexual evisceration, a source of tremendous pleasure tossed out like spoiled milk. The public, and even doctors, often forget about this. When she heard her surgeons telling my grandfather, "Oh she's so lucky, we'll just remove both her breasts, and she'll be fine," my grandmother remembered thinking to herself, "Well god, why don't you go get your penis cut off and see how you feel?!" (to which I said, Grandmama!). Following a hysterectomy ten years prior, the additional loss of her breasts precipitated a swift end to her sex life. "It was probably a lot harder on your Granddaddy," she said, "but I just couldn't care anymore. It would have been worse if I was younger." Luckily my grandmother was approaching her 60s, a time when breasts and sex and one's public image begin to figure relatively less into one's day-to-day existence. But unfortunately many young women are also victims of this diabolical disruption, and at increasing rates.
May 15, 2013
Where Thomas Nagel Went Wrong
Michael Chorost in the Chronicle of Higher Education:
Thomas Nagel is a leading figure in philosophy, now enjoying the title of university professor at New York University, a testament to the scope and influence of his work. His 1974 essay "What Is It Like to Be a Bat?" has been read by legions of undergraduates, with its argument that the inner experience of a brain is truly knowable only to that brain. Since then he has published 11 books, on philosophy of mind, ethics, and epistemology.
But Nagel's academic golden years are less peaceful than he might have wished. His latest book, Mind and Cosmos (Oxford University Press, 2012), has been greeted by a storm of rebuttals, ripostes, and pure snark. "The shoddy reasoning of a once-great thinker," Steven Pinker tweeted. The Weekly Standard quoted the philosopher Daniel Dennett calling Nagel a member of a "retrograde gang" whose work "isn't worth anything—it's cute and it's clever and it's not worth a damn."
The critics have focused much of their ire on what Nagel calls "natural teleology," the hypothesis that the universe has an internal logic that inevitably drives matter from nonliving to living, from simple to complex, from chemistry to consciousness, from instinctual to intellectual.
This internal logic isn't God, Nagel is careful to say. It is not to be found in religion. Still, the critics haven't been mollified. According to orthodox Darwinism, nature has no goals, no direction, no inevitable outcomes. Jerry Coyne, an evolutionary biologist at the University of Chicago, is among those who took umbrage. When I asked him to comment for this article, he wrote, "Nagel is a teleologist, and although not an explicit creationist, his views are pretty much anti-science and not worth highlighting. However, that's The Chronicle's decision: If they want an article on astrology (which is the equivalent of what Nagel is saying), well, fine and good."
Albert Hirschman: An Original Thinker of Our Time
Cass R. Sunstein in the New York Review of Books:
Albert Hirschman, who died late last year, was one of the most interesting and unusual thinkers of the last century. An anti-utopian reformer with a keen eye for detail, Hirschman insisted on the complexity of social life and human nature. He opposed intransigence in all its forms. He believed that political and economic possibilities could be found in the most surprising places.
Hirschman is principally known for four remarkable books. The most influential, Exit, Voice, and Loyalty (1970), explores two ways to respond to unjust, exasperating, or inefficient organizations and relationships. You can leave (“exit”) or you can complain (“voice”). If you are loyal, you will not exit, and you may or may not speak out. The Passions and the Interests (1977) uncovers a long-lost argument for capitalism in general and commercial interactions in particular. The argument is that trade softens social passions and enmities, ensuring that people see one another not as members of competing tribes, but as potential trading partners. Shifting Involvements (1982) investigates the dramatically different attractions of political engagement and private life, and shows how the disappointments of one can lead to heightened interest in the other. For example, the protest movements of the 1960s were inspired, at least in part, by widespread disappointment with the experience of wealth-seeking and consumption, emphasized in the 1950s.
Finally, The Rhetoric of Reaction (1991) is a study of the reactionary’s tool kit, identifying the standard objections to any and all proposals for reform.
3-D Scans Reveal Caterpillars Turning Into Butterflies
Ed Yong in Not Exactly Rocket Science:
The transformation from caterpillar to butterfly is one of the most exquisite in the natural world. Within the chrysalis, an inching, cylindrical eating machine remakes itself into a beautiful flying creature that drinks through a straw.
This strategy—known as holometaboly, or complete metamorphosis—partitions youngsters and adults into completely different worlds, so that neither competes with the other. It’s such a successful way of life that it’s used by the majority of insects (and therefore, the majority of all animals). Butterflies, ants, beetles and flies all radically remodel their bodies within a pupa as they develop from larvae to adults.
But what goes on inside a pupa? We know that a larva releases enzymes that break down many of its tissues into their constituent proteins. Textbooks will commonly talk about the insect dissolving into a kind of “soup”, but that’s not entirely accurate. Some organs stay intact. Others, like muscles, break down into clumps of cells that can be re-used, like a Lego sculpture decomposing into bricks. And some cells create imaginal discs—structures that produce adult body parts. There’s a pair for the antennae, a pair for the eyes, one for each leg and wing, and so on. So if the pupa contains a soup, it’s an organised broth full of chunky bits.
The organic myth of the British constitution
Michael Gardiner writes at openDemocracy on 'public' services in Britain:
The British left is packed with voices demanding an unreflective defence of ‘public services’. This public is frozen beyond any evaluation of commonality, is held to be equalising even as its bases fall away to reveal the private ownership concealed within them. The barrage is triggered in part by the Great Recession, but also in part by the sovereignty challenge being felt in the UK, concretely in Scotland in 2014. Now is a good time to reflect that the British sovereignty behind these public services has always in fact defined itself as a defence against popular sovereignty, a defence projected as timeless inheritance which is intuitive and ‘just there’.
If the nationalism standing behind the ‘British public’ throughout the press and left commentary seems oddly transparent, this transparency derives from Britain’s unusual licence to exist ‘beyond’ the national. For this is less a nation than it is a rationalisation of credit. The British union arises from the import of the Anglo-Dutch financial system after 1688, its guarantee in perpetuity by the Hanoverian crown, and central banks which supported it from the 1690s. As Daniel Defoe was describing at exactly the time of the Acts of Union in 1706-07, Britain’s raison d’état is as an investment entity, a guarantor of global money. As has been described in many accounts of the close of the seventeenth century, in this new state citizenship is understood in terms of naturalised property and the avoidance rather than the promotion of shared action. Reform it as much as you like, but collectivity is not within the scope of the British constitution.
Read the rest here.
M. J. Rosenberg: Pro-Palestinian Is Not Anti-Israel But the Opposite
M. J. Rosenberg in the Washington Spectator:
Sometimes it is instructive to listen to what Harvard law professor Alan Dershowitz says because his way of seeing the Israel-Palestinian conflict is typical of the thinking of both the Netanyahu government and its lobby here. I say "sometimes," because most of Dershowitz’s opinions can be found in a dozen other places -- from AIPAC, the "major Jewish organizations," neocon websites like Commentary and in statements and tweets from the Israeli government itself.
But sometimes Dershowitz inadvertently provides solid insight into the mentality that enables a 45-year occupation that, even Dershowitz admits, has proven so destructive to Israel.
In a debate last week with Peter Beinart, the Daily Beast columnist and author of the bestseller, The Crisis of Zionism, Dershowitz said that, for Jews, Israel is now "an embarrassment."
In 1967, Jews were able to beat their chest and say "wow, we’re proud to be Israel [sic], look how tough Israelis are. It was a source of pride. Today, it’s a source of embarrassment."
And he knows why, as evidenced by his reference to 1967, the year the occupation began.
But when Beinart pointed that out, Dershowitz responded that Israel’s evolution into "an embarrassment" has nothing to do with the occupation.
Robert Pinsky reads his poetry to improvised jazz
infinite fossil fuel
For years, environmentalists have hoped that the imminent exhaustion of oil will, in effect, force us to undergo this virtuous transition; given a choice between no power and solar power, even the most shortsighted person would choose the latter. That hope seems likely to be denied. Cheap, abundant petroleum threw sand in the gears of solar power in the 1980s and stands ready to do it again. Plentiful natural gas, a geopolitical and economic boon, is a climatological shackle. To Vaclav Smil, the University of Manitoba environmental scientist, the notion that we can move so fast is naive, even preposterous. “Energy transitions are always slow,” he told me by e-mail. Modern energy infrastructures, assembled over decades, cannot be revamped overnight. Worse still, in his view, there is little public appetite for beginning the process, or even appreciating the magnitude of what lies ahead. “The world has been running into fossil fuels, not away from them.”

More from Charles C. Mann at The Atlantic Monthly here.
Permanent Present Tense
Henry had his first epileptic episode in 1936, at the age of ten; by 1953 his seizures had become increasingly frequent and debilitating. His family doctor referred him to William Beecher Scoville, a leading neurosurgeon at Yale Medical School. When massive doses of medication failed to quell his attacks and EEGs revealed no obvious locus of brain damage, Scoville suggested a novel surgical procedure. Using a trepanning drill he had constructed himself from auto parts, he cut two coin-sized holes in the skull, ‘doorways to Henry’s brain’, and suctioned out most of his medial temporal lobes, the front half of the hippocampus and most of the amygdala. After recovery, Henry’s seizures were significantly reduced, but it soon became apparent that the operation had vacuumed away any recollection of his hospital stay, and indeed most of the significant events of the previous few years. Catastrophically, it had also created a global anterograde amnesia: the loss of the ability to form new memories of any kind.
more from Mike Jay at the LRB here.
From childhood, Duncan saw herself as a liberator, opposed but never vanquished by philistines. In My Life she recalls that in elementary school she gave an impromptu lecture in front of the class on how there was no Santa Claus, whereupon she was sent home by an angry teacher. This was not the last of what, with pride, she called her “famous speeches.” When she became a professional, she routinely ended her concerts by coming out in front of the curtain and describing to the audience, at length, how profound her way of dancing was, as opposed to the triviality of other ways—she called ballet “an expression of degeneration, of living death”—and on how, therefore, they should contribute to the expenses of her school. (This declamatory bent was probably the least attractive aspect of Duncan’s personality, as it is of My Life, and some reviewers had a lot of fun with it.) What appeared to her most vile about ballet was its unnaturalness: the rigid back, the studied positions, the relentless daintiness. Duncan was an exemplary bohemian—a quality that was partly rooted, no doubt, in the fact that she was from California. (She was born in San Francisco and raised, mostly, in Oakland.) That region has a history of breeding idealists, animists, nonconformists.
more from Joan Acocella at the NYRB here.
From Harvard Magazine:
“Fairy tales have always tapped into the subconscious, bringing to light children’s deepest fears,” says Soman Chainani ’01. In his new fantasy-adventure novel, The School for Good and Evil, he has brought that tenet into the twenty-first century. The first of a trilogy for middle-grade readers (ages nine and up), The School for Good and Evil tracks two archetypal heroines: the lovely Sophie, with her waist-long blond hair and her dreams of becoming a princess, and her friend Agatha, an unattractive, unpopular contrarian who chooses to wear black. A giant bird snatches the pair and carries them off to the School for Good and Evil, a two-pronged magical academy that trains children to become fairy-tale heroes and villains. When, to her horror, Sophie arrives at the Evil branch to learn “uglification,” death curses, and other dark arts, while Agatha finds herself at the School for Good amid handsome princes and fair maidens, the line between good and evil blurs, the meaning of beauty twists, and the girls reveal their true natures.
At the core of their journey is the “princess culture,” which Chainani defines as today’s “tyranny of pink in young-girl marketing. It tells them their responsibility is to be pink, sparkly, ultra-feminine, and—most of all—pretty.” With such an emphasis on looks, “girly girls are terrified of being ugly, and normal girls are afraid of being outcasts.” Even boys are unnerved. “They have no idea how to live up to the expectations,” he says. “That’s what I am interested in capturing: what kids fear most today.” Sophie and Agatha inhabit a world like that of classic fairy tales: a place where magic and reality coexist, and dangers lurk. Yet those dangers reflect modern issues. Several episodes tackle the fear of aging; one chapter riffs on the current obsession with physical self-improvement. In a scene where Sophie is asked to contribute to the school, she becomes a campus celebrity by offering “Malevolent Makeovers” and a presentation titled “Just Say No to Drab.” When Agatha challenges her, Sophie replies, “Isn’t this compassion? Isn’t this kindness and wisdom? I’m helping those who can’t help themselves!”
“So much is based on image,” Chainani explains. “It’s such a pervasive, destructive thing.”
The emergence of individuality in genetically identical mice
From Kurzweil AI:
How do people and other organisms evolve into individuals that are distinguished from others by their own personal brain structure and behavior? Why do identical twins not resemble each other perfectly even when they grew up together? To shed light on these questions, the scientists observed 40 genetically identical mice that were kept in an enclosure that offered a rich shared environment with a large variety of activity and exploration options. They showed that individual experiences influence the development of new neurons in mice, leading to measurable changes in the brain. “The animals were not only genetically identical, they were also living in the same environment,” explained principal investigator Gerd Kempermann, Professor for Genomics of Regeneration, CRTD, and Site Speaker of the DZNE in Dresden. “However, this environment was so rich that each mouse gathered its own individual experiences in it. Over time, the animals therefore increasingly differed in their realm of experience and behavior.” Each of the mice was equipped with a special microchip emitting electromagnetic signals. This allowed the scientists to construct the mice’s movement profiles and quantify their exploratory behavior.
The result: despite a common environment and identical genes, the mice showed highly individualized behavioral patterns. In the course of the three-month experiment, these differences increased in size.
“These differences were associated with differences in the generation of new neurons in the hippocampus, a region of the brain that supports learning and memory,” said Kempermann. “Animals that explored the environment to a greater degree also grew more new neurons than animals that were more passive.” Adult neurogenesis [generation of new neurons] in the hippocampus allows the brain to react to new information flexibly. With this study, the authors show for the first time that personal experiences and ensuing behavior contribute to the “individualization of the brain.” The individualization they observed cannot be reduced to differences in environment or genetic makeup. “Adult neurogenesis also occurs in the hippocampus of humans,” said Kempermann. “Hence we assume that we have tracked down a neurobiological foundation for individuality that also applies to humans.”
May 14, 2013
How the Case for Austerity Has Crumbled
In the New York Review of Books, Paul Krugman reviews Neil Irwin's The Alchemists: Three Central Bankers and a World on Fire, David A. Stockman's The Great Deformation: The Corruption of Capitalism in America, and Mark Blyth's Austerity: The History of a Dangerous Idea:
It’s an ill wind that blows nobody good, and the Greek crisis was a godsend for anti-Keynesians. They had been warning about the dangers of deficit spending; the Greek debacle seemed to show just how dangerous fiscal profligacy can be. To this day, anyone arguing against fiscal austerity, let alone suggesting that we need another round of stimulus, can expect to be attacked as someone who will turn America (or Britain, as the case may be) into another Greece.
If Greece provided the obvious real-world cautionary tale, Reinhart and Rogoff seemed to provide the math. Their paper seemed to show not just that debt hurts growth, but that there is a “threshold,” a sort of trigger point, when debt crosses 90 percent of GDP. Go beyond that point, their numbers suggested, and economic growth stalls. Greece, of course, already had debt greater than the magic number. More to the point, major advanced countries, the United States included, were running large budget deficits and closing in on the threshold. Put Greece and Reinhart-Rogoff together, and there seemed to be a compelling case for a sharp, immediate turn toward austerity.
But wouldn’t such a turn toward austerity in an economy still depressed by private deleveraging have an immediate negative impact? Not to worry, said another remarkably influential academic paper, “Large Changes in Fiscal Policy: Taxes Versus Spending,” by Alberto Alesina and Silvia Ardagna.
One of the especially good things in Mark Blyth’s Austerity: The History of a Dangerous Idea is the way he traces the rise and fall of the idea of “expansionary austerity,” the proposition that cutting spending would actually lead to higher output. As he shows, this is very much a proposition associated with a group of Italian economists (whom he dubs “the Bocconi boys”) who made their case with a series of papers that grew more strident and less qualified over time, culminating in the 2009 analysis by Alesina and Ardagna.
In essence, Alesina and Ardagna made a full frontal assault on the Keynesian proposition that cutting spending in a weak economy produces further weakness. Like Reinhart and Rogoff, they marshaled historical evidence to make their case. According to Alesina and Ardagna, large spending cuts in advanced countries were, on average, followed by expansion rather than contraction. The reason, they suggested, was that decisive fiscal austerity created confidence in the private sector, and this increased confidence more than offset any direct drag from smaller government outlays.