Saturday, July 26, 2014
From More Intelligent Life:
The Big Question: we asked six writers, how many children should we have? David Benatar believes that the only way to prevent harm is not to have any
Millions of years of evolutionary history have programmed you to reject the notion that procreation is wrong. Bear this in mind if you rush to reject my argument, and to defend a deeply harmful practice. Morally responsible parents wish to spare their children pain. There are ways they can minimise the chances of their children suffering certain types of harm, but the only way to prevent harm altogether is to desist from bringing children into existence. Any child will, inevitably, suffer considerable harm. Privileged procreators in developed countries are inclined to respond that their children are likely to be spared the chronic deprivation, insecurity and violence that blight the lives of so many. This response ignores the discomfort, distress, frustration and unhappiness that characterise even the most charmed lives. It also ignores the appalling fates that can befall anybody. These include assault, devastating injury, degenerative disease and depression.
Nor can these fates be dismissed as improbable. For example, 40% of men and 37% of women in Britain develop cancer at some point. Add to these odds the cumulative risks of other terrible conditions and we find that the chance of escaping calamity approaches zero. It reaches zero if we include death. In creating a child you are ultimately responsible for its death, and for the ensuing ripples of bereavement.
“So much precious time goes by and it seems to me I get so little out of it,” Elia Kazan wrote to John Steinbeck in 1955. “I ask myself is this it? Is this why? Is this what I wanted to do? Is this why I accumulated what dough I have. I feel like a highly publicized meal ticket, some of the time, doing what the hell my wife and family and society expects of me and not at all — since I dont think originally enough — what I’d like to do. I can imagine great excitement to life again. But something prevents my going after it.”
Kazan’s sense of feeling aimless and artistically spent was rare for this relentless achiever, but certainly understandable. The letter was written when he was 45, midway through a life that would span almost another half-century. (He died in 2003.) But by this juncture Kazan had already amassed a momentous body of work in theater and film that testifies to his unstoppable drive and restless energy, qualities that spring from almost every page of “The Selected Letters of Elia Kazan,” a meaty volume edited by Albert J. Devlin with Marlene J. Devlin. (That “dont,” by the way, was Kazan’s: He couldn’t be bothered with properly punctuating contractions most of the time, as if life were too short for apostrophes.)
Atheists: The Origin of the Species seems to have been born out of frustration with these and other confusions perpetuated by the so-called “New Atheists” and their allies, who can’t be bothered to familiarize themselves with the traditions they traduce. Several thoughtful writers have already laid bare the slapdash know-nothingism of today’s modish atheism, but Spencer’s not beating a dead horse—he’s beating a live one, in the hope that Nietzsche might rush to embrace it. Several critics have noted that if evangelical atheists (as the philosopher John Gray calls them) are ignorant of religion, as they usually are, then they aren’t truly atheists. “The knowledge of contraries is one and the same,” as Aristotle said. If your idea of God is not one that most theistic traditions would recognize, you’re not talking about God (at most, the New Atheists’ arguments are relevant to the low-hanging god of fundamentalism and deism). But even more damning is that such atheists appear ignorant of atheism as well.
For atheists weren’t always as intellectually lazy as Dawkins and his ilk. (Nor, to be sure, are many atheists today—Coyne accused me of “atheist-bashing” the last time I wrote about religion for Slate, but I really only bashed evangelical atheists like him. My father and sister, most of my friends, and many of the writers I most admire are nonbelievers. They’re also unlikely to mistake the creation myth recounted above for anything more than the dreariest parascientific thinking.)
The canon of first world war reminiscence was established early. Erich Maria Remarque’s novel All Quiet on the Western Front blazed the trail, selling nearly 2m copies in 1929. An avalanche of testimonials followed, several – including those by Robert Graves, Siegfried Sassoon and Vera Brittain, as well as Ernest Hemingway’s A Farewell to Arms (1929) – having stayed in print ever since.
During the first postwar decade, the memoirs of politicians and commanders had filled publishers’ schedules, and there had been little market for the worm’s eye view. What the “war books boom” offered, in contrast, was witness testimony written from the perspective of the junior officer (or, in Brittain’s case, from that of a bereaved civilian and army nurse) that highlighted not grand strategy but ground-level chaos and suffering. The most enduring memoirists were skilled and often practised authors, who at the distance of a decade used literature as a tool of therapy, for themselves as well as others. In addition, they established a standard trajectory – from innocence to disenchantment, via black humour, horror and the grotesque – that set the mould for later testimony to conflicts ranging from the second world war to Vietnam.
Michael Inzlicht and Sukhvinder Obhi in The New York Times:
The human brain can be exquisitely attuned to other people, thanks in part to its so-called mirror system. The mirror system is composed of a network of brain regions that become active both when you perform an action (say, squeezing a rubber ball in your hand) and when you observe someone else who performs the same action (squeezing a rubber ball in his hand). Our brains appear to be able to intimately resonate with others’ actions, and this process may allow us not only to understand what they are doing, but also, in some sense, to experience it ourselves — i.e., to empathize.
In our study, we induced a set of participants to temporarily feel varying levels of power by asking them to write a brief essay about a moment in their lives. Some wrote about a time when they felt powerful and in charge, while others wrote about a time when they felt powerless and subordinate to others. The selection process was random, so that each participant had an equal chance of being powerful or powerless. Next, the participants watched a video of a human hand repeatedly squeezing a rubber ball. While they watched, we assessed the degree of motor excitation occurring in the brain — a measure that is widely used to infer activation of the mirror system. This motor excitation was determined by the application of transcranial magnetic stimulation and the measurement of electrical muscle activation in the subject’s hand. We sought to determine the degree to which the participants’ brains became active during the observation of rubber ball squeezing, relative to a period in which they observed no action.
We found that for those participants who were induced to experience feelings of power, their brains showed virtually no resonance with the actions of others; conversely, for those participants who were induced to experience feelings of powerlessness, their brains resonated quite a bit. In short, the brains of powerful people did not mirror the actions of other people. And when we analyzed the text of the participants’ essays, using established techniques for coding and measuring themes, we found that the more power that people expressed, the less their brains resonated. Power, it appears, changes how the brain itself responds to others.
The War Works Hard
How magnificent the war is!
Early in the morning,
it wakes up the sirens
and dispatches ambulances
to various places,
swings corpses through the air,
rolls stretchers to the wounded,
summons rain
from the eyes of mothers,
digs into the earth
dislodging many things
from under the ruins . . .
Some are lifeless and glistening,
others are pale and still throbbing . . .
It produces the most questions
in the minds of children,
entertains the gods
by shooting fireworks and missiles
into the sky,
sows mines in the fields
and reaps punctures and blisters,
urges families to emigrate,
stands beside the clergymen
as they curse the devil
(poor devil, he remains
with one hand in the searing fire) . . .
The war continues working, day and night.
It inspires tyrants
to deliver long speeches,
awards medals to generals
and themes to poets.
It contributes to the industry
of artificial limbs,
provides food for flies,
adds pages to the history books,
achieves equality
between killer and killed,
teaches lovers to write letters,
accustoms young women to waiting,
fills the newspapers
with articles and pictures,
builds new houses
for the orphans,
invigorates the coffin makers,
gives grave diggers
a pat on the back
and paints a smile on the leader’s face.
The war works with unparalleled diligence!
Yet no one gives it
a word of praise.
by Dunya Mikhail
from The War Works Hard
publisher: Al-Mada, 2000
translation: 2005, Elizabeth Winslow
from The War Works Hard
publisher: New Directions, New York, 2005
Friday, July 25, 2014
Alexander J. Motyl in Foreign Affairs (image: Maxim Zmeyev/Courtesy Reuters):
This week also saw a major escalation of Russian military involvement in Ukraine; in the early morning hours of Sunday, July 13, about 100 Russian armored personnel carriers and other vehicles crossed from Russia into Luhansk province in Ukraine. Unlike earlier Russian deployments into Crimea and eastern Ukraine, these carriers were openly adorned with Russian insignia and flags. The flow of Russian tanks and soldiers into the area has since continued, and Ukrainian authorities estimate that up to 400 additional “little green men” (a term coined during the Crimea invasion for Russian troops without insignia) have infiltrated into eastern Ukraine’s Donbas.
Until yesterday, that escalation had gone relatively unremarked in Western media. But now, no matter who fired the missile, things are set to change. The downing of a civilian plane may conceivably qualify as a war crime, inasmuch as it entailed the unwarranted military destruction of a civilian target. At any rate, it was certainly an atrocity and an act of terrorism. And if Girkin -- an ethnic Russian who hails from Russia and who, by some accounts, is still an officer in the Russian military intelligence service, which would make him officially subordinate to Russia’s president -- really was involved, Putin might arguably be politically responsible for the crime.
Politically and economically, that couldn’t be worse news for Putin, who launched a charm offensive just last week at the World Cup in Rio de Janeiro. Putin, worried about the Ukrainian army’s rapid advances on insurgent positions, met with German Chancellor Angela Merkel and convinced her to agree to negotiations with the insurgents. His efforts -- presumably deemed insincere by Washington -- collapsed on Wednesday when the Obama administration imposed new financial sanctions on several important Russian banking and energy institutions, including Gazprombank, Novatek (an independent natural gas producer), the Rosneft Oil Company, and the VEB Bank for Development and Foreign Economic Affairs. Hours later, the Russian stock market took a nosedive and the ruble fell.
Putin might have managed to muddle along. Although most of the West has been deeply critical of Russia and its support for separatist groups in eastern Ukraine, European and American policymakers have been hesitant to impose the most severe sanctions and have seemed ready to move on to other foreign policy issues, such as Iraq and the war between Israel and Hamas. Even the Obama administration’s recent round of sanctions was not as far-reaching as many critics of the president would have liked.
But the Malaysia Airlines crash will force both the United States and Europe to come to terms with unpleasant realities.
Over at Philosophy Bites:
Is there a place for the sacred in contemporary life? Roger Scruton believes there is. In this discussion he explains his understanding of the experiences he calls sacred.
Richard Marshall interviews Pascal Engel in 3:AM Magazine [Photo: Claire Poinsignon]:
3:AM: As an epistemologist do you think knowledge is elusive because the term is empty, and would that be an approach developed out of your work on Ramsey’s principle (which I’ll ask about in a minute)?
PE: I do not take knowledge to be elusive or empty. On the contrary, it seems to me to designate a bona fide natural kind, although not one which is easy to pin down. Unlike contextualists, I take expressions such as “knowledge” or “knows” to be invariant across contexts (although I do not deny a certain amount of context sensitivity in our epistemic terms). In the current jargon I am an insensitive invariantist. Unlike eliminativists about knowledge, among whom I count a number of experimental philosophers, I do not think it is an empty term. I do not think, however, that knowledge can be defined through a set of necessary and sufficient conditions. This I take to be one of the lessons of Gettierology.
This does not mean that there is nothing to say about knowledge and that epistemologists have to pack up and leave. Although knowledge cannot be defined in the strict sense of this term, we can still characterize it functionally, through its relationships with other notions, such as those of justification, evidence, reliability, or safety, and we can try to give explanations and theories about these notions. Thus it makes sense to ask whether internalism or externalism about knowledge are correct, whether reliabilism or evidentialism are correct, and to work out the best versions of these.
I also take it that, although the analysis of knowledge is basically a conceptual and a priori matter, we can learn a lot (although not everything) from cognitive science, ethology and especially from developmental psychology about what knowledge is. These issues cannot be dealt with only at the level of an account of knowledge in general, but have to be dealt with about particular kinds of knowledge depending on its sources (perceptual, inferential, testimonial, a priori) and of its domain (natural, scientific, moral, aesthetic). Perhaps there is no single account which works fully for all domains, but I take it that they have a number of traits in common (in this respect the shape of the issues is pretty much like for truth, which can vary across domains, but keeps a functionalist core).
Ramsey’s ideas about knowledge, which were in many ways pioneering, are perfectly consistent with this functionalist account.
Anne Fausto-Sterling in the Boston Review:
Consider the case of Grady Nelson, a Florida man who was convicted of murdering his wife, Angelina Martinez. During the 2010 hearing to decide if Nelson should be sentenced to death, his lawyer showed the jury an image suggesting that Nelson had an abnormality in his left frontal lobe. At least two jurors were impressed by the evidence, shifting the voting balance toward life imprisonment. As much as I oppose the death penalty, the outcome raises some basic questions. Was the brain scan taken at the time of the murder, or, more likely, after years in jail? Could the brain deformations be linked to the murder? Scientifically, the introduction of this neural image was pretty lame, but the emotional impact was huge, and it carried the day for the defendant, who escaped execution.
Or what about the flurry of news stories this past December with headlines such as “Brain wiring in men, women could explain gender differences,” all reporting on a publication in the Proceedings of the National Academy of Sciences, which used neural imaging to produce average connectomes for brains of several hundred males and females?
Again, the images are compelling, but the science is not. First, neither the Proceedings article nor any other reputable research has tied specific wiring diagrams to variation in behaviors or cognitive skills. Just as with the fly larvae, the activity of differently wired networks can lead to the same behavior. Indeed, in an earlier study, the Proceedings researchers showed only small differences in the “big skills”—map reading, social cognition, spatial processing—that supposedly separate men from women. Second, the researchers do not assess the possibility that different experiences of gender might themselves produce differently wired brains. Did the young people in their samples play the same sports, have the same hobbies, wear the same type of clothing, or study the same subjects in high school?
H. Bruce Franklin in The LA Review of Books:
Teaching students in the 21st century — including the combat veterans, National Guard soldiers, and reservists in my classes at Rutgers University, Newark — I have to keep reminding myself that they have lived their entire conscious lives during America’s endless warfare. For them, that must seem not just normal, but how it has always been and always will be. Is that also true for the rest of us?
I had to rethink this question a few months ago, when I woke one day to discover that I was 80 years old. For more than half a century I’ve been involved in struggles to stop wars being waged by our nation or to keep it from starting new ones. Before that, in the late 1950s, I had spent three years in the US Air Force flying in Strategic Air Command operations of espionage and provocation against the Soviet Union and participating in launches for full-scale thermonuclear war. Some of these launches were just practice, but a few were real war strikes that were recalled while we were in flight just minutes before it would have been too late. (I recall with embarrassment that I never had a flicker of doubt about whether I should be participating in the start of a thermonuclear Armageddon.) And before that were four years of ROTC, which I joined during the Korean War, a war that had started when I was 16. From age 11 to 16, I had bounced right from the Victory Culture at the end of World War II into the repression and militarization of the early Cold War years.
So it dawned on me that living one’s life during America’s Forever War is hardly unique to those millennials I’m teaching. How many people alive today have ever lived part of their conscious lives in a United States of America at peace with the rest of the world? Would someone even older than I am have any meaningful memory of what such a state of peace was like? How many Americans are even capable of imagining such a state? I can remember only two periods, bracketing World War II, when I believed I lived in a nation at peace. And even these were arguably just childish illusions.
Ismail Khalidi interviews Max Blumenthal in Guernica:
Guernica: The last month has seen the killing of three teenage Israeli settlers near Hebron and a massive Israeli sweep into the West Bank in which hundreds of Palestinians were arrested, injured, and killed. Earlier this month a Palestinian teen was abducted and killed by Israelis in Jerusalem (who are said to have burned the boy alive). Now the Israeli military is engaged in an offensive against Gaza while Hamas fires rockets toward Israel. What do the last month’s events tell us about the state of the conflict?
Max Blumenthal: The entire crisis occurred against the backdrop of a peace process that Netanyahu was blamed for destroying and in the wake of the Hamas-Fatah unity deal, which the US recognized and which Netanyahu was determined to destroy as well. The kidnapping of the three Israeli teens by what appears to be a rogue Hamas cell apparently seeking to generate some kind of prisoner exchange was too good of an opportunity for him to waste.
And so, as I’ve documented with on-the-record sources, Israeli investigators, Netanyahu and the honchos of the military-intelligence apparatus knew by the sound of gunshots on a recorded call by the teens to the police that the teens were killed right away. And they chose to lie, not only to the teens’ parents, whom they sought to deploy as props in their global PR campaign, but to the Israeli public. Through a military gag order, the Israeli media was not allowed to report on the investigation or the details of the recorded phone call. With the Israeli public and the world convinced that the teens were alive, Israeli troops ransacked the West Bank under the guise of a rescue mission, and embarked on a global propaganda campaign centering around the hashtag #BringBackOurBoys. The Israeli public was not emotionally prepared for the discovery of the teens’ bodies because they thought they would be returned home as Gilad Shalit was. So Netanyahu and his inner circle set the public up for a truly dangerous reaction.
In Goliath, I detailed the rise of anti-Arab mobs comprised of soccer thugs and of the burgeoning anti-miscegenation movement in Israel. Netanyahu’s manipulation of the kidnapping and his response to the discovery of the dead teens—he said, “Vengeance for the blood of a small child, Satan has not yet created”—validated these elements and emboldened them as they set out for revenge. Those young men who abducted the Palestinian teen Mohamed Abu Khdeir met at one of the revenge rallies in Jerusalem; they were fans of the soccer club Beitar Jerusalem, which I wrote about in Goliath and whose racist history is absolutely legion.
Andrew Bacevich in Notre Dame Magazine:
Tick off the countries in that region that U.S. forces in recent decades have invaded, occupied, garrisoned, bombed or raided and where American soldiers have killed or been killed. Since 1980, they include Iraq and Afghanistan, of course. But also Iran, Lebanon, Libya, Turkey, Kuwait, Saudi Arabia, Qatar, Bahrain, the United Arab Emirates, Jordan, Bosnia, Kosovo, Yemen, Sudan, Somalia and Pakistan. The list goes on.
To judge by various official explanations coming out of Washington, the mission of the troops dispatched to these various quarters has been to defend or deter or liberate, punishing the wicked and protecting the innocent while spreading liberal values and generally keeping Americans safe.
What are we to make of the larger enterprise in which the U.S. forces have been engaged since well before today’s Notre Dame undergraduates were even born? What is the nature of the military struggle we are waging? What should we call it?
For several years after 9/11, Americans referred to it as the Global War on Terrorism, a misleading term that has since fallen out of favor.
More here. [Thanks to Christopher Lydon.]
Many of us get confused by claims of how much the risk of a heart attack, for example, might be reduced by taking medicine for it. And doctors can get confused, too. Just ask Karen Sepucha. She runs the Health Decisions Sciences Center at Boston's Massachusetts General Hospital. A few years ago she surveyed primary care physicians, and asked how confident they were in their ability to talk about numbers and probabilities with patients. "What we found surprised us a little bit," Sepucha says. "Only about 20 percent of the physicians said they were very comfortable using numbers and explaining probabilities to patients."
Doctors, including Leigh Simmons, typically prefer words. Simmons is an internist and part of a group practice that provides primary care at Mass General. "As doctors we tend to often use words like, 'very small risk,' 'very unlikely,' 'very rare,' 'very likely,' 'high risk,' " she says. But those words can be unclear to a patient. "People may hear 'small risk,' and what they hear is very different from what I've got in my mind," she says. "Or what's a very small risk to me, it's a very big deal to you if it's happened to a family member." Simmons and her colleagues are working on ways to involve their patients in shared decision-making. The initiative at Mass General gives patients online, written and visual information to help them. One of the goals is to make risk understandable — bridging the gap between percent probabilities and words.
Douglas Coupland in FT Magazine:
A way of rethinking the global web of overlapping allegiances would be to wonder what might happen if Earth instituted a planet-wide citizenship flush. Whoever you are, you now have to choose just one passport – so, which is it going to be? The answer would probably boil down to multiple factors, the largest including personal identity, ease of crossing borders, consular access while abroad and, of course, taxes. Sure, a low tax rate is great, but if I break my arm do I really want to spend $75,000 fixing it? Yes, popping in and out of Europe is terrific, but would I want to forfeit getting a lump in my throat if I hear my ex-national anthem playing? What exactly is citizenship? What does it mean to say I’m this and you’re that? The fact that almost every country on Earth makes it very difficult to become a citizen means that citizenship has to mean something. I think this week when I was asking my guests what citizenship they would choose if they could only have one, I was unwittingly taking them to task for trying to have their cake and eat it, too. Can you really have the best of all worlds, bing bang boom, whenever it suits your needs? I suspect polycitizenry is a creation of the 20th century, and a creation whose days are numbered. As the world gets ever more pay-per-use, the luxury of low-commitment semi-disposable allegiance seems, if nothing else, too expensive. If nothing else, Canada put a number on it.
He built his palette
around the ragged colors
of her tortoise-shell calico,
piled like clotted earth
in a sunny corner.
His worn soul
embraced the tales
of those crumpled old shoes—
shredded laces, wilted leather—
scuffed with the stuff of a heart
that beat wild paths through
fields of irises, sunflowers, wheat,
the streets of Arles, and
dreams lost to the night sky.
by Elaine Frankonis
from What the Seasons Leave — soon to be published
Finishing Line Press, 2014
Thursday, July 24, 2014
Michael Schulson in Aeon (Illustration by Tim McDonagh):
Over the millennia, cultures have expended a great deal of time, energy and ingenuity in order to introduce some element of chance into decision-making. Naskapi hunters in the Canadian province of Labrador would roast the scapula of a caribou in order to determine the direction of their next hunt, reading the cracks that formed on the surface of the bone like a map. In China, people have long sought guidance in the passages of the I Ching, using the intricate manipulation of 49 yarrow stalks to determine which section of the book they ought to consult. The Azande of central Africa, when faced with a difficult choice, would force a powdery poison down a chicken’s throat, finding the answer to their question in whether or not the chicken survived – a hard-to-predict, if not quite random, outcome. (‘I found this as satisfactory a way of running my home and affairs as any other I know of,’ wrote the British anthropologist E E Evans-Pritchard, who adopted some local customs during his time with the Azande in the 1920s).
The list goes on. It could – it does – fill books. As any blackjack dealer or tarot reader might tell you, we have a love for the flip of the card. Why shouldn’t we? Chance has some special properties. It is a swift, consistent, and (unless your chickens all die) relatively cheap decider. Devoid of any guiding mind, it is subject to neither blame nor regret. Inhuman, it can act as a blank surface on which to descry the churning of fate or the work of divine hands. Chance distributes resources and judges disputes with perfect equanimity.
Above all, chance makes its selection without any recourse to reasons. This quality is perhaps its greatest advantage, though of course it comes at a price. Peter Stone, a political theorist at Trinity College, Dublin, and the author of The Luck of the Draw: The Role of Lotteries in Decision Making (2011), has made a career of studying the conditions under which such reasonlessness can be, well, reasonable.
‘What lotteries are very good for is for keeping bad reasons out of decisions,’ Stone told me. ‘Lotteries guarantee that when you are choosing at random, there will be no reasons at all for one option rather than another being selected.’ He calls this the sanitising effect of lotteries – they eliminate all reasons from a decision, scrubbing away any kind of unwanted influence. As Stone acknowledges, randomness eliminates good reasons from the running as well as bad ones. He doesn’t advocate using chance indiscriminately. ‘But, sometimes,’ he argues, ‘the danger of bad reasons is bigger than the loss of the possibility of good reasons.’
Marc Weitzmann in Tablet:
Last May 24, a Saturday, at 3:27 p.m., according to the accusatory file, a man appeared at the doorstep of the Jewish Museum of Belgium. Out of his bag he pulled a Magnum .357 and fired. The bullets hit Emanuel and Miriam Riva, a couple of Israeli tourists in their mid-fifties who had just entered the place. Each was struck in the back of the skull, and they died on the spot. (Later on, a witness showing up a few minutes after the killing would post on his Facebook page a picture of Miriam’s body lying in her blood, her hand still carrying the museum’s pamphlet program; what her children, ages 15 and 16, living in Israel, thought of the photograph is not known.) Letting go of the Magnum, the shooter then took from his bag a Kalashnikov, aimed it at a 65-year-old woman by the name of Dominique Sabrier, and shot her, also in the head.
A retired art publisher of Polish descent, Sabrier had left France for Brussels only two months before. Her reason for moving, ironically enough, was, according to her friends, the anti-Semitic atmosphere that now permeates France. The Toulouse killing had scared her, as had the hate demonstration in Paris the previous winter—when, for the first time since World War II, anti-Jewish slogans were chanted in public in the French capital. In Brussels, a city Sabrier knew, she hoped to live a quiet retirement. She had registered for law classes at the Free University of the town and was volunteering as a tourist guide at the museum.
Alexander Strens, 25, found the time to seek refuge under his reception desk—before the killer found him and shot him, once again in the head. Strens, hired at the museum’s communication department the previous year, was the only victim still alive after the shooting. Sent to the Saint-Pierre hospital of Brussels, he was declared brain dead there the next day. He died on June 6, raising the murder total to four. Although Strens’ mother is Jewish, his father is a Muslim Berber from Morocco and, in accordance with the wishes of both families, he was buried in the Muslim cemetery of Taza.
Then, with Strens, and with no more reason than it had when it started, the massacre ends. The surveillance video shows the shooter running away, bag in hand. He disappears.
Brussels is the capital of Europe. The day after the shooting, an election was held for a new European parliament. Xenophobic nationalist parties across the continent were predicted to win a lot of seats even before the killing, and as soon as the news broke the already perceptible tension among the continental political class was imbued with a new sense of frailty and paranoia: Was the scheduling of the massacre just a coincidence? Or was a message being sent—and by whom? Europe was under siege, no doubt, and humiliated, too.
Andersen was profoundly committed to the Romantic ideal of the extraordinary genius, singled out for distinction from birth. “It doesn’t matter if you’re born in a duck yard if you’ve lain in a swan’s egg”, as he explains in “The Ugly Duckling”, one of his most self-reflective tales. His desire to please his many acquaintances meant he could be irritatingly anxious and deferential, and his inclination to a fawning submissiveness in his relations with aristocratic patrons was vexing to those wishing to promote the professional dignity and independence of writers. But he never doubted his credentials as an artist. Despite his lifelong social uncertainties, he was convinced that his unique gifts meant that he was perfectly entitled to special treatment from the hands of fate and his friends.
Binding is especially persuasive in tracing Andersen’s creative relations with Walter Scott, whose work had been translated into Danish in the 1820s. Andersen’s first published tale, “The Apparition at Palnatoke’s Grave” (1822), was influenced by the character of Madge Wildfire in Scott’s The Heart of Midlothian (1818), a novel that acquired cult status among its European readers, and made a lasting impression on Andersen. The story was published under the pseudonym of Villiam Christian Walter, in a volume that included one of the plays that won the approval of the Royal Theatre. Choosing the name “Walter” was an act of homage, but it was also a bold statement of intent. Binding suggests that Gerda’s journey to the icy palace of Kay’s glamorous captor in “The Snow Queen” reflects Jeanie Deans’s indomitable walk to London, undertaken so that she can plead with Queen Caroline for the pardon and release of her condemned sister.
Rock stars are the gods of the last century, avatars for the emotional and religious yearnings that 1960s youth would have had nowhere else to place. Bob Dylan’s cryptic magnetism marked him as the Person With the Answers; Mick Jagger’s shaking hips stood for personal and sexual liberation. “Mick Jagger personified a penis,” wrote Pamela Des Barres, the famed groupie and author of several books, in her 1987 memoir, I’m With the Band: Confessions of a Groupie. As a teenager, she “rushed home from school every day to throb along with Mick while he sang: ‘I’m a king bee, baby, let me come inside.’”
Getting into close proximity to a rock star, then—for a night, or a string of tour dates, or the time it would take them to compose an album especially for you—would seem a service to a higher power. “Perhaps by embracing their cherished rock gods, groupies tap into their own divinity,” Des Barres wrote in 2007’s Let’s Spend the Night Together: Backstage Secrets of Rock Muses and Supergroupies. Unfortunately, rock stars are not gods but rather human beings whose emotions happen to resonate with millions—emotions that are inspired by other human beings, some of whom have written memoirs.
For people accustomed to the cooler precincts of modernist and postmodernist art, it is often a joy to reëncounter older, messier forms of theatre, with coincidences and murders and the like. Therefore, when I arrived at the Rose Theatre for “The Ghost Tale of the Wet Nurse Tree,” the Kabuki company Heisei Nakamura-za’s contribution to the Lincoln Center Festival, I was not surprised to find the lobby packed with people spending too much money at the snack bar and looking as though they were going to a soccer game.
Here, with considerable abridgment, is what happens in “The Ghost Tale of the Wet Nurse Tree.” The distinguished painter Shigenobu and his wife, Oseki, have a new baby boy. Hanging around the neighborhood is a self-styled samurai, Namie, wearing a hat the size of a washtub, with a nasty smirk on his face. Shigenobu announces that he’s leaving town to create a dragon painting for a famous temple. Incredibly, he entrusts the care of his wife and son to Namie.
Christopher de Bellaigue in The Guardian:
Along with the "Trojan Horse" controversy about the imposition of a strict Islamic ethos on a number of Birmingham schools, the disclosure that several hundred Britons have been to Syria and Iraq to fight with the jihadis has fired up those who believe that Islam represents an urgent threat to this country. Amid the hype – some of it justifiable, much not – nuance has inevitably been lost. It is significant that the British jihadis have chosen to realise their fantasies not here but in Mesopotamia. From a theological point of view, a caliphate can only be set up in Muslim lands. Britain would be a poor choice even for a pilot scheme – it has a substantial opposing majority and a competent intelligence service. As for the Birmingham "conspiracy", that, too, is more complicated than it seems: while there have undoubtedly been moves to Islamise the schools' curricula and atmosphere, much of the pressure in this direction has come from parents. It would be understandable if law-abiding families with children at schools such as the formerly "outstanding" Park View Academy – now deemed "inadequate" and placed in special measures – felt targeted by former education secretary Michael Gove's campaign to "drain" the fundamentalist "swamp". While in opposition, Gove authored a famously error-strewn and intemperate screed on Islamic fundamentalism, "Celsius 7/7". And before his unexpected sacking, he wanted to inculcate "British values" in people whose social attitudes suggest they have had a bellyful.
The deeper concern is that a significant number of British Muslims are getting more conservative while much of the rest of society – including, of course, very many other Muslims – liberalises apace. Exporting high-profile hate preachers such as Abu Qatada is no solution, for whether one likes it or not the values of conservative Muslims are "British", too. As the shortcomings of "Celsius 7/7" demonstrated, and as Innes Bowen confirms in her sober, meticulous and revelatory new book, the state's attitude towards British Muslims has been defined in part by ignorance.
In My Spare Time
During my long, boring hours of spare time
I sit to play with the earth’s sphere.
I establish countries without police or parties
and I scrap others that no longer attract consumers.
I run roaring rivers through barren deserts
and I create continents and oceans
that I save for the future just in case.
I draw a new colored map of the nations:
I roll Germany to the Pacific Ocean teeming with whales
and I let the poor refugees
sail pirates’ ships to her coasts
in the fog
dreaming of the promised garden in Bavaria.
I switch England with Afghanistan
so that its youth can smoke hashish for free
provided courtesy of Her Majesty’s government.
I smuggle Kuwait from its fenced and mined borders
to Comoro, the islands
of the moon in its eclipse,
keeping the oil fields intact, of course.
At the same time I transport Baghdad
in the midst of loud drumming
to the islands of Tahiti.
I let Saudi Arabia crouch in its eternal desert
to preserve the purity of her thoroughbred camels.
This is before I surrender America
back to the Indians
just to give history
the justice it has long lacked.
I know that changing the world is not easy
but it remains necessary nonetheless.
by Fadhil al-Azzawi
from Poetry International
translation: 2000, Khaled Mattawa
By 2050, the number of people over the age of 80 will triple globally. These demographics could come at great cost to individuals and economies. Two groups describe how research in animals and humans should be refocused to find ways to delay the onset of frailty.
The problems of old age come as a package. More than 70% of people over 65 have two or more chronic conditions such as arthritis, diabetes, cancer, heart disease and stroke [1]. Studies of diet, genes and drugs indicate that delaying one age-related disease probably staves off others. At least a dozen molecular pathways seem to set the pace of physiological ageing. Researchers have tweaked these pathways to give rodents long and healthy lives. Restricting calorie intake in mice or introducing mutations in nutrient-sensing pathways can extend lifespans [2] by as much as 50%. And these 'Methuselah mice' are more likely than controls to die without any apparent diseases [3]. Post-mortems reveal that tumours, heart problems, neurodegeneration and metabolic disease are generally reduced or delayed in long-lived mice. In other words, extending lifespan also seems to increase 'healthspan', the time lived without chronic age-related conditions.
These insights have made hardly a dent in human medicine. Biomedicine takes on conditions one at a time — Alzheimer's disease, say, or heart failure. Instead, it should learn to stall the incremental cellular damage and changes that eventually yield several infirmities. The current tools for extending healthy life — better diets and regular exercise — are effective, but there is room for improvement, especially in personalizing treatments. Molecular insights from animals should be tested in humans to identify interventions that delay ageing and its associated conditions. Together, preclinical and clinical researchers must develop meaningful endpoints for human trials.
Picture: Fauja Singh, here aged 100, prepares for Britain's Edinburgh marathon in 2011