Sunday, July 13, 2014
Angus Kennedy in Spiked:
At first sight, it might appear that society has taken a markedly positive moral turn over the past 20-or-so years. The example of the Conservative Party’s embrace of gay marriage alone shows how far we now are from the failure of Conservative prime minister John Major’s ‘back to basics’ campaign to restore traditional moral values in the early 1990s. Cultural and political trendsetters champion any number of apparently progressive moral campaigns: against female genital mutilation, child abuse, poverty, inequality, or any imaginable form of discrimination on the grounds of race, sex or disability. On the face of it, we seem to have the good fortune to be living in a new age of tolerance, born of a society confident and firm in its moral values.

One of contemporary society’s most prominent features is the wide level of support for non-judgementalism; namely, the idea that we do not have the right as individuals to lay down the law as to how others should live their lives. This moral-sounding sentiment reaches right to the top of society. Earlier this year no less an eminence than UK Supreme Court judge Lord Wilson of Culworth declared that marriage was ‘an elastic concept’ (ie, as empty as a rubber band), that the nuclear family had been replaced by a ‘blended’ variety, and that the Christian teaching on the family has been ‘malign’. The one (ironic) judgement that today’s non-judgemental morality is happy to make is to judge the judgemental and castigate strict moral codes as malign and abusive.
The residuum of people who still cling to traditional ideas of morality and concepts like duty are routinely denigrated by the right-thinking as intolerant ‘bigots’ or dismissed as reactionary religious rednecks. Society’s apparent moral confidence is betrayed to a degree by its own level of intolerance towards the supposedly morally intolerant and overly judgemental. Would there be a need for today’s moral crusades against child abuse or FGM to be quite so shrill and knee-jerk were they reflective of a society genuinely confident and secure in what is right and what is wrong? A morally confident society might not need to be on such a high state of moral alert against the dangers supposedly posed to the social fabric by cases such as the black Christian couple in Derby whom Derby Council denied the right to be foster carers because of their belief that homosexuality is a sin.
William T. Vollmann in Bookforum:
“WE HAD REACHED THE CROSS ROADS before noon and had shot a French civilian by mistake. . . . Red shot him. It was the first man he had killed that day and he was very pleased.” So far, this incident, and the style in which it is told, would be appropriate for either Redeployment or The Corpse Exhibition, two new works of fiction about the Iraq war, the first by Phil Klay, a former marine who served in Iraq during the surge, and the second by Hassan Blasim, an Iraqi filmmaker and writer who moved to Finland as a refugee in 2004. In fact it comes from a late Hemingway story called “Black Ass at the Cross Roads.” The setting is France, sometime after D-day, when the Nazis are fleeing. The narrator’s business is to kill them as they go by. By the standards of the Iraq war, he turns out to be a bleeding heart. Even after his contingent ambushes a half-track “full of combat S.S.,” the worst of the worst, he feels uncomfortable about keeping souvenirs from their corpses: “It’s bad luck in the end. I had stuff for a while that I wished I could have sent back afterwards or to their families.”
Of course, Hemingway was no career soldier; he was a writer and therefore, never mind his tough-guy stuff, a professional sensitive. But he was around war enough to be grieved, hardened, enlightened, and damaged by it, and to write about it movingly. This essay is not about him except insofar as he can be a foil to the other two writers under discussion. I need not discuss either his greatness or his glaring faults except to say in regard to the former that he surely remains a natural standard of comparison for modern war literature, which is why I mention him now, and in regard to the latter that (excluding much mawkishness about gender relations) sentimentality rarely figures high on his list of official sins.
Saturday, July 12, 2014
Joost Hiltermann in the NYRB (photo: Kamaran Najm/Metrography/Corbis):
[W]hile the Kurds believe Kirkuk’s riches give them crucial economic foundations for a sustainable independent state, the city’s ethnic heterogeneity raises serious questions about their claims to it. Not only is Kirkuk’s population—as with that of many other Iraqi cities, including Baghdad itself—deeply intermixed. The disputed status of its vast oil field also stands as a major obstacle to any attempt to divide the country’s oil revenues equitably. To anyone who advocates dividing Iraq into neat ethnic and sectarian groups, Kirkuk shows just how challenging that would be in practice.
The definitive loss of Kirkuk and the giant oil field surrounding it could precipitate the breakup of Iraq, and while the present government in Baghdad is in no position to resist Kurdish control, a restrengthened leadership might, in the future, seek to retake the city by force. For the Kurds, the sudden territorial gains may also not be the panacea they seem to think they are. The Kurdish oil industry is still much in development, and if the Kurdish region loses access to Baghdad’s annual budget allocations without a ready alternative, it is likely to face a severe economic crisis. Moreover, the same jihadist insurgency that has enabled Kurdish advances in the disputed territories is also a potent new threat to the Kurds themselves. So the taking of Kirkuk poses an urgent question: how important is Iraq’s stability to the Kurds’ own security and long-term aims?
As a writer-statesman, Aimé Césaire belongs in a small category of twentieth-century writers that includes Léopold Sédar Senghor, Václav Havel, Winston Churchill, and probably a dozen others. This remarkable man from Martinique, who died in 2008 at age ninety-five and was the author of a score of plays, essays, and volumes of poetry, served for an astonishing fifty-six consecutive years as mayor of Fort-de-France and deputy to the French National Assembly. Perhaps because his contribution bridged literary and political domains, Césaire’s mark on the ideological climate of France’s former colonial empire remains quite palpable today. Students coming of age across the francophone world learn his name and read his oft-anthologized verses.
It was in the years immediately following the Second World War that Césaire began to emerge as a major voice in French poetry. When his “Cahier d’un retour au pays natal” (“Notebook of a Return to the Native Land”) was published in France in 1939, it brought no great acclaim to the author; but in 1947 Brentano’s and Bordas reprinted it in New York and Paris, fronted by a new introduction from André Breton. The previous year, France’s premier literary publisher, Gallimard, had published the poetry collection Les Armes miraculeuses (The Miraculous Weapons). The year 1948 saw Solar Throat Slashed (Soleil cou coupé) brought out in a limited press run by a small avant-garde publisher.
What Greer is arguing for is a kind of environmental absolutism. In her view, the country, the planet even, is not beyond saving, but only if we open up our way of thinking. We need to know more, so we can understand where we are and how to move ahead. To highlight this, she spends much of the book crafting a series of capsule histories — of the ecosystem, the settlement of Cave Creek, the timber industry, Australian botany — that offer context that is both immediate and longer term.
"Every day, as we write labels for the boxes where we sow our freshly gathered seed, we do homage to dead white men," she writes, a deftly ironic double entendre meant to comment on both her efforts at propagation and the hegemony of generic names for plant life.
A similarly pointed bit of humor emerges in her account of the discovery, in 1893, on what would later be her property, of the biggest cedar anyone had then seen. "Confronted with this botanical marvel," she writes of the two men who found the tree, "the only thing they could think to do was cut it down." The joke, however, was on them; sent to the Crystal Palace in London, where it was to be displayed "for perpetuity," the tree was destroyed when the structure burned in 1936.
The name Seneca brings a particular image to mind: a gaunt, half-naked old man, glaring wildly, his veins open, his lifeblood seeping into the small bath beneath him after he was forced to commit suicide. Painted by Rubens, memorialized by Dante in his first circle of hell, gilded into medieval manuscripts alongside Plato and Aristotle, Seneca has come to represent the perils of proximity to absolute power. The central question of James Romm’s “Dying Every Day” is this: When we confront this tragic Roman wordsmith, tutor to the emperor Nero (and, some argue, the power behind that terrible throne), who stares back at us? Is it a tyrannodidaskalos, a tyrant-teacher? Is he the ultimate exemplar of Stoicism, a would-be philosopher king? Or is Seneca simply an accretion of history, a phantom constructed to fit our ravening for heroes, for antiheroes and for the sensational in the stories of antiquity?
Teasing out these conundrums, Romm, the James H. Ottaway Jr. professor of classics at Bard College, gives us a fresh and empathetic exploration of a man who, tantalizingly, seems destined to stay just out of reach.
A NYC restaurant investigated why its service seems to have gotten slower since 2004, over at craigslist [h/t: Dan Balis] (image from wikimedia commons):
2004:

Customers walk in.
They get seated and are given menus; out of 45 customers, 3 request to be seated elsewhere.
Customers on average spend 8 minutes before closing the menu to show they are ready to order.
Waiters show up almost instantly and take the order.
Food starts getting delivered within 6 minutes, obviously the more complex items take way longer...
Customers are done, check delivered, and within 5 minutes they leave.
Average time from start to finish: 1:05.

2014:

Customers walk in.
Customers get seated and are given menus; out of 45 customers, 18 requested to be seated elsewhere.
Before even opening the menu they take their phones out, some are taking photos while others are simply doing something else on their phone (sorry we have no clue what they are doing and do not monitor customer WIFI activity).
7 out of the 45 customers had waiters come over right away, they showed them something on their phone and spent an average of 5 minutes of the waiter's time. Given this is recent footage, we asked the waiters about this and they explained those customers had a problem connecting to the WIFI and demanded the waiters try to help them.
Finally the waiters are walking over to the table to see what the customers would like to order. The majority have not even opened the menu and ask the waiter to wait a bit.
Customer opens the menu, places their hands holding their phones on top of it and continue doing whatever on their phone.
Waiter returns to see if they are ready to order or have any questions. The customer asks for more time.
Finally they are ready to order.
Total average time from when the customer was seated until they placed their order: 21 minutes.
Food starts getting delivered within 6 minutes, obviously the more complex items take way longer.
26 out of 45 customers spend an average of 3 minutes taking photos of the food.
Jim Holt in Prospect Magazine:
Most of us, when we look in the mirror, have a sense that behind the eyes looking back at us is a me-ish thing: a self. But this, we are increasingly told, is an illusion. Why? Well, according to neuroscientists, there is no single place in the brain that generates a self. According to psychologists, there is no little commander-in-chief in our heads directing our behaviour. According to philosophers, there is no “Cartesian ego” unifying our consciousness, no unchanging core of identity that makes us the same person from day to day; there is only an ever-shifting bundle of thoughts, feelings and memories.
In the last few years, a number of popularising books, bearing titles like The Self Illusion and The Ego Trick, have set out the neuroscientific/psychological/philosophical case against the self. Much has been made of clinical cases where the self seems to malfunction spectacularly: like Cotard syndrome, whose victims believe they do not exist, even though they admit to having a life history; or “dissociative identity disorder,” where a single body seems to harbour multiple selves, each with its own name, memory, and voice. Most of us are not afflicted by such exotic disorders. When we are told that both science and philosophy have revealed the self to be more fragile and fragmentary than we thought, we take the news in our stride and go on with our lives. But perhaps we should be paying closer attention. For example, there is striking evidence (detailed by the Nobel laureate Daniel Kahneman in his book Thinking, Fast and Slow) that each of us has a “remembering self,” which makes decisions, and an “experiencing self,” which actually does the living. And when the remembering self looks back on an experience and decides how enjoyable it was, it can arrive at an assessment that is quite out of whack with what the experiencing self actually endured. It is your remembering self that tyrannically resolves to take another family vacation this summer, even though your voiceless experiencing self was miserable for most of the last one. Evidently, the subtleties of the self are of practical as well as scholarly interest.
David Dobbs in The New York Times:
In his 2007 book “A Farewell to Alms,” the economic historian Gregory Clark argued that the English came to rule the world largely because their rich outbred their poor, and thus embedded their superior genes and values throughout the nation. In her comprehensive takedown, the historian Deirdre N. McCloskey noted that Clark’s idea was a “bold hypothesis, and was bold when first articulated by social Darwinists such as Charles Davenport and Francis Galton in the century before last.” Indeed, over the past 150 years, various white Western scientists and writers have repeatedly offered biological explanations for Caucasian superiority. They have repeatedly failed because, as McCloskey noted, none ever mounted a credible quantitative argument.
Now, in “A Troublesome Inheritance,” Nicholas Wade, a longtime science writer for The New York Times, says modern genetics shows that “the three major races,” Africans, Caucasians and East Asians, are genetically distinct races that diverge much as subspecies do, and that their genetic differences underlie “the rise of the West.” This racial divide started, Wade says, when humans began migrating out of Africa some 50,000 years ago. As groups entered diverse environments, they faced differing pressures that selected for gene variants creating different traits, including dissimilar social behaviors. Genetic selection for distinctive physical traits in different populations, such as lighter skin to maximize sunlight absorption, is well established and widely accepted. Decidedly not well established, however, is Wade’s proposal that genetic selection gives different human populations distinct behaviors. Because this is the heart of his argument, and because social behavior is far more complex than, say, skin color, it seems fair to ask that his evidence clear a high bar. Does it?
QE II is the Ocean Liner
Dear Queen Elizabeth,
I found a pencil sharpener that belongs to you
with E II R under a little crown stamped on it.
I picked it out from other pencil sharpeners not yours,
stamped QE II
(how clever and non-American of me
to know the difference)
in the Toronto Reference Library lost-and-found,
not the library of the QE II,
where, I understand, they give out pencil sharpeners
rather than baseball caps or magnets.
Dear Magnate Queen,
I want you to know how happy I am to have
from among all the riches in the world
this pencil sharpener,
which I’d also be happy to return to you
if only you’d say the word.
Then I'd have a word from you
or else this pencil sharpener stamped E II R
under a little crown, either way
one step a little closer to the riches of the world.
by Marvyne Jenoff
from Crackerjack Umbrella
Twoffish Press, 2008
Friday, July 11, 2014
Over at Guernica, Meara Sharma interviews food writers Jane Black and Brent Cunningham, "on reality television, school lunch reform, and building a food movement that transcends class lines":
Guernica: I’m curious to hear what drew you to the subject of food and class, how you came to write about Huntington, West Virginia.
Brent Cunningham: Jane and I come from somewhat different class backgrounds. I grew up in West Virginia in the 1970s, and was raised by a single mom who was working class and didn’t finish college. We weren’t poor, though there were certainly times when things were tight. And we ate food that was both of that class and also of that time. That was the height, in some ways, of processed food—and that cut across class lines, because it was seen as modern. I remember eating the “Hungry Man” frozen dinners, and “Boil-in-a-Bag” whole meals that you would just dunk in a pot of boiling water.
My grandparents were from rural southern West Virginia, a coal-mining area, and even after they moved to the city they always had a huge garden which we ate from. I remember a sense of community growing up around food rituals, from helping my grandmother can tomatoes or make pickles to sitting around the table and picking meat off a turkey or chicken carcass after a meal, to use for something else. But none of this made a huge impression on me until I grew up and moved away, and essentially entered a different class, both monetarily and culturally. Part of the legacy of this life is that I learned to cook, and loved to. So I had this interest in food, as well as journalism, and I saw that they could come together. And I became fascinated by food and class, and particularly how that manifested in Appalachia.
Jane Black: I was at the Washington Post, covering food, and Brent was in New York, and we were engaged and trying to figure out how to be in the same place. One weekend I was in Brooklyn and telling Brent about how I had just written this story—like every other food journalist in the world—about Jamie Oliver’s reality TV show in Huntington. What I had written was basically “the revolution had been televised—now what?” And Brent said, “There’s more to that than a 1,200-word news feature.” And I said, “Yeah, but this is your thing.” And he said, “No, it’s your thing.” We went back and forth and then thought, hmm, maybe we should do this together. So we wrote a book proposal, turned it in the week before we got married, the day we got back from our honeymoon we had an offer, and we thought, okay, we’re moving to Huntington.
The idea was not really to write about Jamie Oliver. But we’d tell people where we were going and they’d say, “Oh, wasn’t that where that chef…” There was some recognition. And we were looking for a particular kind of place. A place that wouldn’t naturally be touched by the food frenzy that had taken over the coasts—the majority of the country—and to a certain extent a place that could stand in for all of those places. This is a book about West Virginia, about Appalachia, which has its own peculiar challenges, but it’s really about working-class America.
Steven J. Brams in Plus Magazine:
The problem of how to fairly divide goods between people has been around since the dawn of humanity. Even the Hebrew Bible mentions it. In the book of Genesis (13: 5-13), Abraham and Lot divide a piece of land using the I cut, you choose method, which has one person dividing the land in two and the other choosing the piece he prefers.
This works well when you are trying to share something you can divide any way you like, like a piece of land or a cake. But what if the goods you are trying to share out are indivisible? This happens, for example, in divorce cases, or when dividing an estate among relatives of a deceased person. You can't cut a flat screen TV or diamond ring, so how should you go about sharing out the goods in a way that causes the least resentment?
Let's suppose the goods are to be divided between two people, A and B, whom we will refer to as players. (Since we don't care about whether A and B are male or female, we will also use the pronoun "it" when referring to them.) Let's also suppose that each can rank the items on offer in order of preference, with no two items occupying the same rank, and that both players are honest about their preferences. You could get A and B to take turns in picking items, but this puts the person who goes first at an advantage. For example, if both players have the same item at the top of their list, then the one who goes first is the one who gets it, to the detriment of the other. The same goes for any other item that occupies the same rank for both players.
To avoid this problem, Alan D. Taylor and I came up with another method, equally simple. Start with the items that are on the top of the players' preference lists. If the two items differ — say A wants the car and B wants the house — then each gets its preferred item. If they are the same — both want the house — the item goes into a contested pile. Now ask both players to list the remaining items in order of preference and repeat, until every item has either been allocated or gone into the contested pile (which will then have to be dealt with separately). Let's call this method for allocating items Noncontested Allocation (NA). See the box for an example.
In the unlikely event that both players have exactly the same preferences, all items will go into the contested pile and we will have to select another method for parcelling out the items. But if this isn't the case — and in practice it rarely is — this algorithm has an advantage over simply taking turns: the allocation (of the noncontested items) it produces is envy free.
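The Noncontested Allocation procedure described above can be sketched in a few lines of Python. This is an illustrative sketch, not code from Brams and Taylor; the function name and list-based representation of preferences are assumptions made here for clarity:

```python
def noncontested_allocation(prefs_a, prefs_b):
    """Allocate indivisible items between two players by the NA method.

    prefs_a, prefs_b: the same set of items, each listed in one player's
    strict order of preference. Returns (a_items, b_items, contested).
    """
    remaining_a = list(prefs_a)
    remaining_b = list(prefs_b)
    a_items, b_items, contested = [], [], []

    while remaining_a:
        top_a, top_b = remaining_a[0], remaining_b[0]
        if top_a == top_b:
            # Both players' current top choice is the same item:
            # it goes into the contested pile.
            contested.append(top_a)
            remaining_a.remove(top_a)
            remaining_b.remove(top_a)
        else:
            # The top choices differ: each player gets its preferred item.
            a_items.append(top_a)
            b_items.append(top_b)
            for item in (top_a, top_b):
                remaining_a.remove(item)
                remaining_b.remove(item)
        # Repeat with whatever items remain.
    return a_items, b_items, contested


# Example: both players rank the house first, so it is contested;
# A then gets the car, B the TV, and the ring is contested last.
a, b, contested = noncontested_allocation(
    ["house", "car", "tv", "ring"],
    ["house", "tv", "car", "ring"],
)
```

Note the envy-free property on the noncontested items in this example: A prefers the car to the TV, and B prefers the TV to the car, so neither would swap. If the two preference lists were identical, every item would land in the contested pile, as the paragraph above observes.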
Ana Palacio in Project Syndicate:
Rapprochement between Iran and the West has long been a “white whale” of global politics. But it increasingly appears that the world may be on the verge of a new era, characterized by a wary yet crucial collaboration between countries – particularly Iran and the United States – that had been irreconcilable since Iran’s Islamic Revolution in 1979.
The imperative for such cooperation drove last month’s Bergedorf Round Table, organized by the Körber Foundation in conjunction with the Institute for Political and International Studies. At the event, which I attended, 30 politicians, senior officials, and experts from Europe, the US, and Iran considered the relationship’s future, producing some important insights that should inform future policy decisions.
With countries across the Middle East crumbling and territorial sovereignty disintegrating – most notably in Iraq – this effort could not be timelier. To reverse the region’s slide into chaos, it needs strong stabilizing forces that can underpin coordinated action aimed at curtailing sectarian violence. Here, Iran has a key role to play.
Beyond its historical and cultural depth, which gives it a certain authority in the Middle East, Iran has one of the region’s few functioning governments capable of responding to geopolitical developments. This is to say nothing of its massive oil reserves, which secure its critical role in the complex global energy equation, particularly as it applies to Europe, which is working to reduce its dependence on Russian energy imports.
The problem is that Iran has consistently squandered its leadership potential, choosing instead to act as a spoiler, especially through the use of proxy armies. This disruptive tendency reinforces the need for collaboration, underpinned by strong incentives for Iran to maintain a constructive, moderate foreign policy.
To this end, the nuclear negotiations between Iran and the E3+3 (France, Germany, and the UK plus China, Russia, and the US) are an important first step. Iran’s nuclear ambitions have long posed a major security threat in the Middle East, as they raise the risk of preemptive military action by Israel or the US and, perhaps even more harrowing, of a regional arms race with the Gulf states and Turkey. Though fragmentation and sectarian violence have recently become a more urgent danger, the risks associated with Iran’s emergence as a nuclear power should not be underestimated.
No matter how often you swear that you will fight the cancer, you are helpless against it. The journalistic convention in obituaries to praise the dead for their “courageous battle” against cancer is a lie designed to comfort the living and healthy. At best the cancer patient consents to treatment, although he must withdraw consent at some point and permit the disease to run its course. Or, as L. E. Sissman sang of the foreign country known as Hodgkin’s lymphoma where he lingered for a decade,
Reside on the sufferance of authorities
Until my visas wither, and I die.
Cancer patients are betrayed by our culture’s dishonesty. Those who recover from the disease are hailed as “survivors”—a term appropriated from the Holocaust—but while they are struggling with cancer and undergoing sometimes painful treatments for it, they are barely acknowledged. They are consigned to what Ralph Ellison calls a “hole of invisibility.”
“There’s a possibility,” Ellison goes on to say, “that even an invisible man has a socially responsible role to play.” Not, however, as long as the servitude of cancer is described by the platitudes our culture favors—“fight,” “battle,” “survive,” or “succumb.” Is it any wonder the cancer patient, who suspects the truth even if he dare not utter it to himself, ends in inconsolable resignation?
Last year the great American novelist Norman Rush published a new book, Subtle Bodies, his fourth. Unlike the first three—Whites, Mating and Mortals—it is not set in Botswana, where Rush and his wife Elsa were Peace Corps co-directors from 1978 to 1983, and perhaps that missing edge of novelty is one reason why there’s been a tone of civil disappointment in the critical response. Nevertheless all four books are of a piece, sharing a pair of central concerns: geopolitics, specifically with regard to issues of structural injustice, and the nature of a long and extremely intimate marriage.
On those first and third levels of the art of the novel, Rush is only an equivocal and intermittent master. Passages of his books, particularly Mortals, are beautifully plotted, but none of them could be called compulsive from first to last solely by virtue of their story. As for his vision of the world, fascinating though it is, it has limitations of perspective that the best and most dispassionate novelists (even seemingly inward ones, such as Kafka) have been able to transcend. To be specific: few of the humans who populate Rush’s books ever seem quite as real as the husband and wife who recur again and again as central characters, bandits traveling under different names each time, and who form the twinned consciousness of his art.
But the second, intermediate level of novelistic greatness has to do with neither technical nor visionary genius.
1. As with, say, colour perception, reports on the direct experience of feelings are necessarily veridical. E.g., you cannot report (in good faith) that you are experiencing fear, while not in fact being afraid.
2. This experience reveals that fear is real. One needn't go looking for fear in the world, as one would go looking for bigfoot or quarks. This is just not what we have in mind when we attribute reality to certain things.
3. I experience love.
4. God is, by definition, love.
5. Therefore, God is real.
Issue will be taken, of course, with step 4, as having a stipulative character. I am taking it from 1 John 4:8, but others will look to other biblical passages and to other religious traditions to say that God is an anthropomorphized being of some sort, or a theriomorphized one, or a many-headed chimera: in any case, a conscious agent, not a feeling.
But here one might also note that any virtue or feeling at all can be, and often is, anthropomorphized: justice, beauty, purity, etc., have all been represented as human beings in the history of art, and a future historian or a Martian anthropologist would be forgiven for inferring, for example, that late-modern New Yorker-New Jerseyans follow a cult around the goddess of liberty.
Tara Isabella Burton in The Paris Review:
Poor Fanny Price. The unabashedly mousy, pathologically virtuous protagonist of Mansfield Park—which turns two hundred this year—is Jane Austen’s least popular heroine. She spends most of the novel creeping around the periphery of the titular park, taciturn and swallowing tears; she tires after the briefest of physical exertions; she looks down on her wealthier cousins for engaging in flirtatious amateur theatrics; and for most of the book’s five hundred pages, she refuses to voice her long-held love for her cousin Edmund. Austen’s own mother reportedly found Fanny “insipid”; the critic Reginald Farrer described her as “repulsive in her cast-iron self-righteousness and steely rigidity of prejudice.” Even C. S. Lewis—in the voice of his demon Screwtape in The Screwtape Letters—let loose a vitriolic rant about Austen’s most priggish heroine, calling her “not only a Christian, but such a Christian—a vile, sneaking, simpering, demure, monosyllabic, mouselike, watery, insignificant, virginal, bread-and-butter miss … A two-faced little cheat (I know the sort) who looks as if she’d faint at the sight of blood, and then dies with a smile … Filthy, insipid little prude!” Even if we are to separate Lewis from Screwtape, it’s difficult to see Fanny as anything but, to quote Nietzsche’s famous description, “a moralistic little female à la [George] Eliot.”
And indeed, those who defend Fanny tend to see her as a Christian heroine in the mold of a Dorothea Brooke. As the Austen biographer Claire Tomalin puts it, “it is in rejecting obedience in favor of the higher dictate of remaining true to her own conscience that Fanny rises to her moment of heroism.” But to read Mansfield Park as a kind of Middlemarch is to miss the far more complicated story Austen has told. Fanny Price’s story is less about her individual virtue, or her richer relatives’ lack thereof, than about class, about privilege in its most insidious form—before the term ever cropped up in contemporary social justice discourse. Fanny isn’t moral or upright because she wants to be, but because the role—along with a whole host of so-called middle-class values—is forced upon her. For all we know, she may well wish to be as carefree, as filled with dynamic sprezzatura, as Emma Woodhouse or Elizabeth Bennet, Austen’s more fortunate heroines, but the social dynamic, and the circumstances of her birth, deny her the security necessary for such frivolity. Fanny has too much at stake to be easygoing.
Bilal Qureshi in NPR:
Ayad Akhtar is a novelist, actor and screenwriter. And when his first play, Disgraced, won the Pulitzer Prize for Drama in 2013, he also became one of the most talked about new voices in American theater. Long before this buzz, though, Akhtar grew up in a Muslim family with roots in Pakistan. He mines this background to bring the inner lives and conflicts of Muslim Americans to the stage. His plays often feature cutting dialogue and confrontations steeped in the tension between Islamic tradition and personal evolution. Akhtar's latest play, The Who & the What, is set in Atlanta and tackles the role of women in Muslim families. As with his other work, Akhtar's own family helped inspire the drama. "One of the central questions of my childhood was the role of women in my culture," Akhtar says. "I grew up around so many brilliant and strong women who really seemed to suffer and chafe under the familial and religious order." At the heart of his new play is a young woman named Zarina, who wants to confront that order. She's secretly writing a scathing book about the Prophet Muhammad, hoping to expose what she considers the misogyny at the heart of Islamic history. When Zarina's observant father and sister discover her manuscript, the three of them descend into accusations of blasphemy and betrayal.
Playwright Donald Margulies served on the jury that recommended Akhtar's Disgraced for the Pulitzer Prize, and he says these kinds of clashes are at the heart of great theater. He says, "Theater is a place where arguments can be dramatized in a much more exciting way than if it were simply prose narrative." He adds that what makes Akhtar's work especially exciting is that he is staging multiple perspectives within a community that is still working through its place in America. "The African-American experience, the Jewish American experience — these have been very prevalent in our drama for generations now, and the fact that here we were having a Muslim American experience that was being dramatized was a very momentous occasion," he says.
The firstborn was handed back to them
in a small cask not much bigger than
a shoebox only wooden no more about it
they took it home by pony and trap
wasn’t the river in flood at the gate?
they had to climb down and wade through it
and she went alone with him
to the corner of a field below the house
a dry shaded place where he opened a grave
for it was April then and the pinkish
blossoms of whitethorn were emerging
and they lifted it low together
onto sods of damp earth
placed holy water with it
and everything she could to lay
a holy innocent to rest
as far as giving the boy a name
it was Martin the brother in Chicago
and when it came to saying good-bye
he had to draw her away she was so
lonely that shook him while
he covered it with clay for up to then
never a care but a demon for style
high heels you’ve never seen the like
though she gave birth again
she was often seen alone in that field
by Catherine Phil MacCarthy
from The Invisible Threshold
Publisher: Dedalus Press, Dublin
Thursday, July 10, 2014
Jason Mitchell at Harvard:
When an experiment is expected to yield certain results, and yet it fails to do so, scientists typically work to locate the source of the failure. In principle, the cause of an experimental failure could lurk anywhere; philosophers have pointed out that a failed experiment might very well indicate a previously undetected flaw in our system of logic and mathematics. In practice, however, most scientists work from a mental checklist of likely culprits. At the top of this list, typically, are "nuts-and-bolts" details about the way in which the experiment was carried out—was my apparatus working properly? did my task operationalize the variable I was aiming for? did I carry out the appropriate statistical analysis in the correct way? and so on. Very often, the source of the failure is located here, if only because the list of practical mistakes that can undermine an experiment is so vast.
Considerably lower down the list are doubts about the reasons for expecting particular results in the first place, such as uncertainty about the theory that predicted them or skepticism about reports of similar effects. In other words, when an experiment fails, scientists typically first assume that they bungled the details of the experiment before concluding that something must be wrong with their initial reasons for having conducted it in the first place (or that logic and mathematics suffer some fatal flaw). This makes good sense: it would be inane to discard an entire theoretical edifice because of one researcher’s undetected copy-and-paste error or other such practical oversight.
In the early twentieth century, a handful of Cambridge men, young medical doctors mostly, established modern anthropology, neuroscience, psychology and psychotherapy in Britain. Ben Shephard sums up their quest as “a search for a science of the mind”, which was certainly a large part of it, but they were interested in a great many other things as well. They were close associates who influenced one another, but it would be a mistake to exaggerate the coherence of their projects or the extent to which they shared a common sense of what they were after. Because they were so eclectic and ranged so widely, they were not installed as ancestor figures in the disciplines into which the human sciences were beginning to fragment, even if they were influential in the committees that helped to shape the new professional institutions. Their names are therefore mostly unfamiliar today. Shephard rescues them from the oubliette of disciplinary histories and presents them as members of a cohort: a network of eccentric, wilful, brilliant men who were prepared to go anywhere, try anything, to advance the scientific understanding of human nature.
Central members of this cohort were brought together in the 1898 “Cambridge Anthropological Expedition to Torres Straits”, that narrow stretch of sea, with numerous islands, which separates Australia and New Guinea. The expedition was organized by a zoologist, Alfred Haddon.
THIS JULY MARKS the fiftieth birthday of the 1964 Civil Rights Act. It is one of the most important pieces of legislation that Congress has ever passed; it made racial discrimination illegal in many of the walks of public life where it had been legally permissible before. Ten years before the Civil Rights Act became law, the 1954 decision in Brown v. Board of Education had taken America by surprise, generating a set of iconic images that are still stamped in our national historical memory. But a decade after Brown, only two percent of southern African American children were attending integrated schools. Brown’s imagery stuck in the mind’s eye, but it was the Civil Rights Act that remade the country.
Other laws, of course, have also helped shape the country. But the Civil Rights Act is different in one major way: for many Americans born since its passage, it is very difficult to imagine political and social life without it. Imagine the United States losing the Civil Rights Act’s bans on employment discrimination or on the segregation of public places. Imagine us giving up its tools for the integration of schools and other public facilities. For a lot of people, it’s nearly unthinkable.
They help to create a world, she wrote, where men are forever feeling betrayed, not supported, by the true character and the quality of women, because when fantasy is governing perception, the truth appears as a blasphemy. Neutral facts about women are perceived by men, and by women themselves, not as welcome illuminations but as bad news, festering blemishes on the lovely structure that both sexes agree is woman's proper moral and physical shape. Men need the structure and try to force its preservation; but they also feel entitled to hate the rituals of fakery that women perform to maintain it, especially when they fail. Women in such a world feel chronically in the wrong, most acutely wrong in moments when the truth of the self betrays the fantasy, but obscurely wrong in essence for consenting to the fantasy in the first place. Woman as Object may be spared the burden of responsibility carried by primary Subjects only by suffering the dishonor of constant two-way self-betrayal. Much heavier burdens and worse sufferings then follow, as women are given and often meekly accept every form of raw deal in punishment for representing falsity and moral weakness.
The sophistication of Beauvoir's account gave her book a lasting resonance, but her brisk solutions of 1949 were too simple for the way that history was going.