Friday, February 28, 2014
NOTE: Please nominate good political blog writing for our prize in the comments section of this post. LAST DAY FOR NOMINATIONS!
Dear Readers, Writers, Bloggers,
We are very honored and pleased to announce that Mark Blyth has agreed to be the final judge for our 4th annual prize for the best blog and online writing in the category of politics and social science. Details of the previous three politics (and other) prizes can be seen on our prize page.
Mark Blyth is professor of international political economy at Brown University. He is based in the Department of Political Science, but his research begs and borrows from multiple fields. He is particularly interested in how uncertainty and randomness impact complex systems, especially economic systems. He was a member of the Warwick Commission on International Financial Reform that made a case for macro-prudential regulation. He is the author of Great Transformations: Economic Ideas and Institutional Change in the Twentieth Century (Cambridge: Cambridge University Press 2002), and most recently, Austerity: The History of a Dangerous Idea (Oxford University Press 2013). His academic writings have appeared in such places as the American Political Science Review, the Review of International Political Economy, and the Journal of Evolutionary Economics, while his more popular writings have appeared in Foreign Affairs and Foreign Policy magazine. He has also written for 3 Quarks Daily.
As usual, this is the way it will work: the nominating period is now open, and will end at 11:59 pm EST on March 6, 2014. There will then be a round of voting by our readers which will narrow down the entries to the top twenty semi-finalists. After this, we will take these top twenty voted-for nominees, and the editors of 3 Quarks Daily will select six finalists from these, plus they may also add up to three wildcard entries of their own choosing. The three winners will be chosen from these by Mark Blyth.
The first place award, called the "Top Quark," will include a cash prize of 500 dollars; the second place prize, the "Strange Quark," will include a cash prize of 200 dollars; and the third place winner will get the honor of winning the "Charm Quark," along with a 100 dollar prize.
(Welcome to those coming here for the first time. Learn more about who we are and what we do here, and do check out the full site here. Bookmark us and come back regularly, or sign up for the RSS feed.)
February 24, 2014:
- Nominations are now open. Please nominate your favorite blog entry by placing the URL for the blog post (the permalink) in the comments section of this post. You may also add a brief comment describing the entry and saying why you think it should win. (Do NOT nominate a whole blog, just one individual blog post.)
- Blog posts longer than 4,000 words are strongly discouraged, but we might make an exception if there is something truly extraordinary.
- Each person can only nominate one blog post.
- Entries must be in English.
- The editors of 3QD reserve the right to reject entries that we feel are not appropriate.
- The blog entry may not be more than a year old. In other words, it must have been written after February 23, 2013.
- You may also nominate your own entry from your own or a group blog (and we encourage you to).
- Guest columnists at 3 Quarks Daily are also eligible to be nominated, and may also nominate themselves if they wish.
- Nominations are limited to the first 200 entries.
- Prize money must be claimed within a month of the announcement of winners.
March 8, 2014
- The nominating process will end at 11:59 PM (NYC time) on this date.
- The public voting will open soon afterwards.
March 13, 2014
- Public voting ends at 11:59 PM (NYC time).
March 24, 2014
- The winners will be announced.
One Final and Important Request
If you have a blog or website, please help us spread the word about our prizes by linking to this post. Otherwise, post a link on your Facebook profile, Tweet it, or just email your friends and tell them about it! I really look forward to reading some very good material, and think this should be a lot of fun for all of us.
Best of luck and thanks for your attention!
Friday, March 07, 2014
Jill Godmilow in IndieWire [via Chapati Mystery]:
Throughout the film, Oppenheimer encourages his collaborators to produce ostentatiously surreal and violent dramatic film reconstructions of their death squad activities. Ever since Robert Flaherty asked his Inuit collaborator, Nanook the Bear (his real name was "Allakariallak"), to fake the capture of a seal in 1922 – at the very beginning of ethnographic film tourism – we have seen hundreds of social actors perform “real” re-enactments of their lives for the cameras of documentary filmmakers. There is nothing new in “The Act of Killing” but carnage, and the special, cozy relationship we are urged to enjoy with the killers. Perhaps this is exactly what the critics are avoiding with their raves – that they have been duped into admiring, for an hour or two, the cool Rat Pack killers of Medan.
Collaboration is a way to share, with the social actors represented, responsibility for a film’s acts of description, strategies and arguments…a way to “keep it clean.” Some of the most useful films I’ve seen in the last twenty years – non-fiction and otherwise – have been the products of collaboration with the social actors represented, in unique and disparate ways. Carolyn Strachan and Alessandro Cavadini’s “Two Laws,” Kent MacKenzie’s “The Exiles,” and Rolf de Heer and Peter Djigirr’s “Ten Canoes” come quickly to mind.
First on this list should be Rithy Panh’s “S-21: The Khmer Rouge Killing Machine” – the perfect counter model to “The Act of Killing.” In S-21, the two survivors of the infamous Cambodian prison and their Khmer Rouge prison guards are brought together in a patient re-enactment of their crimes, which the traumatized guards cannot otherwise recollect. “The Act of Killing” is also a collaboration of sorts, but for me a non-productive, uncomfortable, even unclean one.
Amanda Little in Bookforum:
LAST SPRING, A THIRTY-ONE-YEAR-OLD COLLEGE DROPOUT–TURNED–ENERGY EXECUTIVE named Billy Parish came to talk to my journalism class at Vanderbilt University. The course focused on climate reporting, and Parish had recently been profiled in Fortune magazine as a young virtuoso in the solar industry. Students wanted to hear his perspective as an innovator: What did he consider the most important untold story on climate change? “Easy,” he said, “it’s the story of our victory in progress, the story that we’re winning—not losing—the climate battle.” Most progressive journalists hate to talk about actual progress, Parish went on to argue, so they spend their time mewling about what’s not getting done on the climate-legislation front. Science writers, meanwhile, nitpick about important but arcane details of atmospheric warming in parts per million and other mind-numbing measurements. The skeptics, for their part, continue to chant, like skipping records, their groundless but vehement doubts about the problem’s very existence. Little wonder that Americans turn a deaf ear to this issue.
Maybe they wouldn’t, Parish argued, if they could read more climate literature that matters—stories about how America is actually innovating and adapting in response to this crisis, even as global treaties have languished and climate legislation collects dust on the shelves of Congress. Parish waxed technophilic, telling stories about new carbon-cutting innovations on the horizon: wind turbines designed like jet engines, not propellers; fuels made from algae and batteries made from viruses; nanotech solar cells that are smaller than gnats and can be integrated into paints, shingles, and glass. He explained that the cost of solar energy has come down 80 percent in the last five years, and solar production has grown more than 50 percent a year. “We’ve got to stop acting helpless,” he said. “We’ve got to start telling the stories of why we’re winning.” As a budding entrepreneur, Parish is notably prone to enthusiasm. But his argument stayed with me, and after his visit I began to see climate literature a bit differently, dividing it into two categories: The first, and overwhelmingly the largest, includes stories of conjecture about climate change itself—about whether it’s happening at all; whether humans are to blame; how severe the problem is or isn’t; how catastrophic the impacts may become. The second, and much more intriguing, category focuses on the tangible, practical ways we’re beginning to adapt: stories about innovators who are trying, against vertiginous odds, to get technologies and strategies in place that can make our transition to a low-carbon economy not just possible but seamless.
Andrew Pollack in The New York Times:
J. Craig Venter is the latest wealthy entrepreneur to think he can cheat aging and death. And he hopes to do so by resorting to his first love: sequencing genomes. On Tuesday, Dr. Venter announced that he was starting a new company, Human Longevity, which will focus on figuring out how people can live longer and healthier lives.
To do that, the company will build what Dr. Venter says will be the largest human DNA sequencing operation in the world, capable of processing 40,000 human genomes a year. The huge amount of DNA data will be combined with huge amounts of other data on the health and body composition of the people whose DNA is sequenced, in the hope of gleaning insights into the molecular causes of aging and age-related illnesses like cancer and heart disease. Slowing aging, if it can be done, could be a way to prevent many diseases, an alternative to treating one disease at a time. “Your age is your No. 1 risk factor for almost every disease, but it’s not a disease itself,” Dr. Venter said in an interview. Still, his company will also work on treating individual diseases of aging.
Song of a Woman
With no friends of your own
you are looking only at me
and you accuse me.
You accuse me of being inconsiderate
No, not enough
No, not enough
not enough proof of loving you
how insolent of me not to look happy all the time
how impudent of me not to be able to forecast today’s weather for you
you always tell me to do things I can’t
I want to start learning magic.
I want to stop your criticism with a single glance.
I want to put your heart to sleep with one finger.
I want to go out every night riding a broom.
I want to jump over the mountain ridge
trailing my hair like smoke.
I want to fly into the sparkling moonlight
laughing away your beratings down there.
You, so simple,
give no thought to the pain that is almost killing me.
Yet, you will calmly go to heaven by and by.
And I, having wished for witchcraft, will fall to hell
Ah, that will create ten billion years of separation.
by Nao Inoue
from Honoo ni tsuite
publisher: Chiyoda Shoin, Tokyo, 1950
translation: 2009, Takako Lento
Continue for original Japanese
The Editors at n+1:
What role has the American intellectual community played in this saga, if any? Certainly we failed to prevent it. But there is more. For the past two years, since Putin re-assigned himself to the Russian presidency, we have indulged ourselves in a bacchanalia of anti-Putinism, shading over into anti-Russianism. We turned Pussy Riot into mass media stars. We wrote endless articles (and books) about how Putin was a mystery man, a terrible man, a KGB ghoul who lived under your bed. It got to the point where, arriving in Sochi for Putin’s overpriced Olympics, Western journalists were greeted like heroes for tweeting about how the curtains in their hotel rooms were falling down. It was funny, but it was also not funny. Should Putin, the president of a country with inadequate hospitals, schools, and housing for its 150-million population, have spent $50 billion on hosting the Olympics? Absolutely not—especially when a third of the money was apparently expropriated by various officials. But the gleeful complaints about Olympic conditions seemed mostly bent on humiliating Russia in toto.
It’s hard to know how much of what gets written in various places leads to American policies in actual fact. Does it matter what’s in the Nation? What about the New York Review of Books? The New Yorker? It’s impossible to say. And the media or publishing game has its own rules, irrespective of politics. Evil Putin is just going to get more airtime than Complicated Putin or Putin Who is Running a Country in a Complex Geopolitical Situation.
Perhaps the way to put it is that an intellectual mistake was turned into a political mistake. The intellectual mistake was to fixate on Putin as the bad man who came along and suddenly undid the good work of Boris Yeltsin. (Bill Clinton’s Russia hand Strobe Talbott the other day tweeted an inadvertent reductio ad absurdum of this position, “Putin has for years been systematically reversing reforms of Yeltsin, Gorbachev & Khrushchev, whose gift of Crimea to Ukraine he’s nullified.”) But as the Russian left has been telling us for years, Putin has not gone back on the Yeltsin-era reforms. In most spheres of Russian life, he has continued them—undoing the Soviet safety net, and replacing it with nothing. That he has become an authoritarian ruler while doing so is a result of the fact that these reforms are cruel and unpopular.
Carlin Romano in The Chronicle of Higher Education:
Did Paul de Man and Martin Heidegger ever meet? If so, they could have compared notes on how to bamboozle de-Nazification officials after, well, one’s side loses.
No matter. Now de Man has joined that august cultural club that includes Caravaggio, Wagner, Céline, Pound, Heidegger, and a slew of other accomplished artists, thinkers, and intellectuals who were also no-goodniks. The nasty ethics in the personal lives of those cultural heavies force us to ask two tough questions that are simpler than many pretend:
(1) Is there an inevitable link between a person’s ethics and his creative and intellectual work?
(2) Is it morally acceptable to honor or enjoy the work of artists and intellectuals whom we condemn for their nonprofessional, unethical actions?
Even when a cultural figure simply bears accusations of ethical misdeeds—as in the case of Woody Allen over the years, a matter reopened after Hollywood’s Golden Globes tribute and an Academy Awards nomination—the questions, when retriggered, produce frenzied media meditations. Now, with the long-awaited publication of Evelyn Barish’s The Double Life of Paul de Man (Liveright), a two-decades-in-the-making investigative biography of the Yale literary theorist whose version of "deconstruction" shook up English and comp-lit departments in the 1970s and 80s, the high literary and intellectual worlds face their own revisiting.
According to Barish, de Man (1919-83) committed fraud, forgery (16 separate acts), swindling, embezzlement, and theft as a postwar Belgian book publisher. For his sins as head of the Hermes publishing house, he was, in 1951, "found guilty in absentia and sentenced to six years in prison with heavy fines." Apparently de Man played fast and loose with more than a million Belgian francs to fuel his lifelong luxury spending. Cornered, he skipped out to the United States on a visa probably obtained illegally by his father.
Virginia Hughes in Nature:
Biologists first observed this 'transgenerational epigenetic inheritance' in plants. Tomatoes, for example, pass along chemical markings that control an important ripening gene. But, over the past few years, evidence has been accumulating that the phenomenon occurs in rodents and humans as well. The subject remains controversial, in part because it harks back to the discredited theories of Jean-Baptiste Lamarck, a nineteenth-century French biologist who proposed that organisms pass down acquired traits to future generations. To many modern biologists, that's “scary-sounding”, says Oliver Rando, a molecular biologist at the University of Massachusetts Medical School in Worcester, whose work suggests that such inheritance does indeed happen in animals. If it is true, he says, “Why hasn't this been obvious to all the brilliant researchers in the past hundred years of genetics?”.
One reason why many remain sceptical is that the mechanism by which such inheritance might work is mysterious. Explaining it will require a deep dive into reproductive biology to demonstrate how the relevant signals might be formed in the germ line, the cells that develop into sperm and eggs and carry on, at a minimum, a person's genetic legacy.
A mother might pass on effects of environmental exposures to a fetus during pregnancy. So, to study the phenomenon of transgenerational epigenetics cleanly, biologists are focusing on fathers, and have been looking at how sperm might gain and lose epigenetic marks. “In the past two to three years there's been a lot of new information,” says Michelle Lane, a reproductive biologist at the University of Adelaide in Australia. But proposals for how it all works are themselves embryonic. “It's a huge black box,” Lane says.
Thursday, March 06, 2014
Jonathan Freedland on Scotland's future, in the NYRB:
[I]t is, paradoxically, Scotland that has been clinging to an idea of Britain, one that has been abandoned by the rest of the UK—at least if that idea is defined in part as the collectivist spirit of 1945. As Macwhirter writes, “Scots have arguably been more committed to the idea of Britain than the English over the last 200 years. What Scotland didn’t buy into was the abandonment of what used to be called the post-war consensus: universalism and the welfare state.”
Which is why the Yes campaign’s offer, set out in Scotland’s Future, consists as much of social policy as constitutional change. The document contains few abstractions about democracy, but promises instead “a transformational change in childcare,” the scrapping of London-imposed changes to welfare benefits, and, in the move most likely to attract international attention, the removal of the UK’s Trident nuclear weapon system from Scotland. “We’re half an hour away from the biggest collection of weapons of mass destruction in western Europe,” Jenkins told me. “There’s no version of devolution that allows us to get rid of that.” In other words, only independence allows Scotland to fully realize the distinct political culture that has arisen there.
Some on the left of the No campaign warn that it will be a cruel irony if, by breaking away, Scotland ensures the isolation of its more social democratic ethos. For once Scotland no longer sends fifty-nine MPs to Westminster, many of whom represent safe Labour seats, then Labour’s chances of forming a UK government diminish sharply. If independence happens in 2016, then an England-dominated UK could be the land that is forever Tory. Some electoral analysts dispute that arithmetic; nevertheless it will be this country to which an independent, left-leaning Scotland might be bound in monetary, fiscal, and political union, with the UK Treasury and Bank of England together making major decisions affecting Scotland’s economy. Scottish social democracy could discover it was able to flourish more easily inside Britain than out.
It will be a greater irony still if the ultimate consequence of the program pursued by the great patriot and would-be latter-day Britannia, Margaret Thatcher, was to be the unraveling of the United Kingdom.
Thomas B. Edsall in the NYT:
[Simon] Kuznets’s research into the relationship between inequality and growth laid the foundation for modern thinking about what has become a critical question: Has inequality in this country reached a tipping point at which it no longer provides an incentive to strive and to innovate, but has instead created a permanently disadvantaged class, as well as an ongoing threat of social instability?
One of the most articulate contemporary proponents of the “optimal inequality” thesis is Richard Freeman, a labor economist at Harvard. In a 2011 paper, Freeman wrote: “Is there a level of inequality that optimizes economic growth, stability, and shared prosperity? My answer is yes. The relation between inequality and economic outcomes follows an inverted-U shape, so that increases in inequality improve economic performance up to the optimum and then reduce it.”
Freeman argues that the costs of excessive inequality are high: “Inequality that results from monopoly power, rent-seeking or activities with negative externalities that enrich their owners while lowering societal income (think pollution or crime), adversely affect economic performance. High inequality reinforces corruption by allowing a few ‘crony capitalists’ to lobby politicians or regulators to protect their economic advantages. When national income goes mostly to those at the top, there is little left to motivate people lower down. The 2007 collapse of Wall Street and bailout of banks-too-big-to-fail showed that inequality in income and power can threaten economic stability and give the few a stranglehold on the economy.”
Conservative economists look at the issue of equality from the opposite vantage point: when do government efforts to remedy inequality and to redistribute income worsen conditions by serving as a deterrent to work and productive activity?
The title Reading Darwin in Arabic notwithstanding, most of the men discussed in this book did not read Charles Darwin in Arabic. Instead they read Jean-Baptiste Lamarck, Ernst Haeckel, Herbert Spencer, Thomas Huxley, Gustave Le Bon, Henri Bergson and George Bernard Shaw in European or Arabic versions. They also read popularizing accounts of various aspects of Darwinism in the scientific and literary journal al-Muqtataf (“The Digest”, 1876–1952). The notion of evolution that Arab readers took away from their reading was often heavily infected by Lamarckism and by the social Darwinism of Spencer. Darwin’s The Origin of Species by Means of Natural Selection was published in 1859, but Isma‘il Mazhar’s translation of the first five chapters of Darwin’s book into Arabic only appeared in 1918.
For a long time, the reception of Darwinism was bedevilled by the need to find either neologisms or new twists to old words. As Marwa Elshakry points out, there was at first no specific word in Arabic for “species”, distinct from “variety” or “kind”.
Ibsen starts off by telling us something about who Nora is—or, rather, the conditions she lives under. It’s Christmastime in Norway, and the Helmer household is filled with excitement. A sweet-tempered maid, Helene (Mabel Clements), scurries about the Helmers’ tidy house; she opens the front door, and our fair-haired heroine enters Ian MacNeil’s ingenious set, which sometimes revolves, like a dancer in a music box, as the actors move from room to room, trailed by Stuart Earl’s lovely score. Nora is carrying a number of packages; they’re gifts for her three children. As she sets her packages down and takes off her coat, Helene tells her that her husband, Torvald (Dominic Rowan), is in his study. After years of struggle, he’s about to be made the manager of a local bank. Things are on the upswing in the Helmer household, but something’s wrong.
Before Nora can alert Torvald or the children to her presence, she devours a chocolate that she’s secreted away. But why is her pleasure a secret?
In the beginning was Newark. Everything that Philip Roth turned to such rich account in his great final spate of works inaugurated by American Pastoral is not only set in his native place but from the start derived its moral energy and edge from it. The city of Newark and especially the Weequahic neighbourhood, the local spaces that reflect the intricate geography of class and ethnicity, the mentalities of old Jews and their superannuated ways and of new Jews with their suburban affluence and unacknowledged assimilation anxieties, men’s moral crossroads and the unreasonable and irrational women who supply the materials for them – that repertoire of essential Roth concerns and interests is as central to Goodbye, Columbus (1959), his first book, whose eponymous novella made his name, as to the novels that crown his achievement forty years later. But instead of devoting himself to that repertoire’s potential, Roth wandered far and wide, following in Henry James’s footsteps in the long, slow, rather airless Letting Go (1962), doing a Mark Twain in Our Gang (1971), and in general trying a lot of modes and tones without ever seeming quite to satisfy the demands of harnessing his talent’s restless fluency to his smarts, his savvy, his wit and his ideas. He was out of Newark, but what did that mean? [More at: http://www.drb.ie/essays/american-berserk]
Nicholas Epley in Salon:
One of the most amazing court cases you probably have never heard of had come down to this. Standing Bear, the reluctant chief of the Ponca tribe, rose on May 2, 1879, to address a packed audience in a Nebraska courtroom. At issue was the existence of a mind that many were unable to see. Standing Bear’s journey to this courtroom had been excruciating. The U.S. government had decided several years earlier to force the 752 Ponca Native Americans off their lands along the fertile Niobrara River and move them to the desolate Indian Territory, in what is now northern Oklahoma. Standing Bear surrendered everything he owned, assembled his tribe, and began marching a six-hundred-mile “trail of tears.” If the walk didn’t kill them (as it did Standing Bear’s daughter), then the parched Indian Territory would. Left with meager provisions and fields of parched rock to farm, nearly a third of the Poncas died within the first year. This included Standing Bear’s son. As his son lay dying, Standing Bear promised to return his son’s bones to the tribe’s burial grounds so that his son could walk the afterlife with his ancestors, according to their religion. Desperate, Standing Bear decided to go home.
Carrying his son’s bones in a bag clutched to his chest, Standing Bear and twenty-seven others began their return in the dead of winter. Word spread of the group’s travel as they approached the Omaha Indian reservation, midway through their journey. The Omahas welcomed them with open arms, but U.S. officials welcomed them with open handcuffs. General George Crook was ordered by government officials to return the beleaguered Poncas to the Indian Territory. Crook couldn’t bear the thought. “I’ve been forced many times by orders from Washington to do most inhuman things in dealings with the Indians,” he said, “but now I’m ordered to do a more cruel thing than ever before.” Crook was an honorable man who could no more disobey direct orders than he could fly, so instead he stalled, encouraging a newspaper editor from Omaha to enlist lawyers who would then sue General Crook (as the U.S. government’s representative) on Standing Bear’s behalf. The suit? To have the U.S. government recognize Standing Bear as a person, as a human being.
The Tao that can be named is not the real Tao.
—Lao Tzu
I am planted in the earth
Happily, like a cabbage
Carefully peel away the layers of language
That clothe me and soon
It will become clear I am nowhere to be found
And yet even so, my roots lie beneath . . .
by Chimako Tada
from Hanabi (Fire Works)
publisher: Shoshi Yuriika, Tokyo, 1956
translation: 2010, Jeffrey Angles
Joe Fassler in The Atlantic [h/t: Tunku Varadarajan]:
Dinaw Mengestu is a National Book Award Foundation “5 Under 35” writer, a New Yorker “20 Under 40” writer to watch, and a recipient of a MacArthur Foundation Fellowship. His other novels are The Beautiful Things That Heaven Bears and How to Read the Air.
Dinaw Mengestu: I came to Tayeb Salih’s Season of Migration to the North late in life, shortly after I had finished my second novel and was just beginning to make the first tentative steps into the third. I read it once, and then a few weeks later, once more. I began to carry it in my bag, next to my laptop, or in my coat pocket where it easily fit. I opened it at least once a week to no particular page. After a few minutes, I would close the book, slightly uncertain about what I had just read, even though I knew the outlines of the story better than almost any other novel. I would often wonder why I had never heard of the novel before, and why the same was true for most people I knew. Under the broad banner of post-colonial literature, it deserved a place next to Achebe’s Things Fall Apart, but to think of it only in those terms undercuts its value as a stunning work of literature, as a novel that actively resists the division of art into poorly managed categories of race and history.
Those divisions are a fundamental part of Salih’s novel. The story, set in a recently independent Sudan, with footprints in England and Egypt, mocks and eviscerates the clichés that come with looking at the world as a division between us and the Other. That fractured gaze, whether it is born out of race, gender, or privilege, destroys the characters in the novel, none of whom are merely victims or perpetrators. Through them, the story becomes an argument for a better way of seeing, which has always struck me as being one of the novel’s better gifts, something which it is uniquely poised to do, if only because it demands the reader’s imagination, and by doing so affirms our capacity to live beyond the limited means of our private lives. We read not to encounter the Other, but to see ourselves refracted in a different landscape, in a different time, in shoes and clothes that perhaps bear no resemblance to our own.
Jesi Egan in Slate:
[L]ast month, the religious journal First Things published a controversial essay by Michael W. Hannon called “Against Heterosexuality,” which offers an ultra-conservative take on the issue of whether our sexual orientations are natural conditions or chosen constructs. Hannon’s piece is just the latest in a number of recent articles in the “choice wars.” Brandon Ambrosino, writing for the New Republic, set off a small firestorm in January when he described his homosexuality as a choice, not a biological fact. His article provoked vitriolic responses from, among others, Gabriel Arana and Slate’s own Mark Joseph Stern. Clearly, the biology vs. choice (or nature vs. culture) debate remains a point of serious contention within the LGBTQ community and beyond.
But does “construct” mean what these new adopters think it does? Though Hannon and Ambrosino have different political endgames, they both invoke a very unlikely ally: Michel Foucault, the French philosopher who’s known as the grandfather of queer theory and a central architect of the “construct” conception of sexuality. Though Foucault died in 1984, his History of Sexuality, Volume I is still mandatory reading in LGBTQ studies courses. His theories about where sexuality comes from have been hugely influential in academia for decades. But Foucault is also responsible for a lot of the confusion surrounding the biology vs. choice debate—largely because his work has been taken out of context by liberals and social conservatives alike. While Hannon’s essay is a particularly disturbing piece of work (see Stern’s scathing take-down for more), all of these popular misinterpretations tend to muddy the political waters, and risk obscuring Foucault’s most important contributions to our understanding of sexuality.
Let’s start with a quick primer. In The History of Sexuality, Foucault writes that Western society’s views on sex have undergone a major shift over the past few centuries. It’s not that same-sex relationships or desires didn’t exist before—they definitely did. What’s relatively new, though, is 1) the idea that our desires reveal some fundamental truth about who we are, and 2) the conviction that we have an obligation to seek out that truth and express it.
Within this framework, sex isn’t just something you do. Instead, the kind of sex you have (or want to have) becomes a symptom of something else: your sexuality.
Sarah Reardon in Nature:
A clinical trial has shown that a gene-editing technique can be safe and effective in humans. For the first time, researchers used enzymes called zinc-finger nucleases (ZFNs) to target and destroy a gene in the immune cells of 12 people with HIV, increasing their resistance to the virus. The findings are published today in The New England Journal of Medicine. “This is the first major advance in HIV gene therapy since it was demonstrated that the ‘Berlin patient’ Timothy Brown was free of HIV,” says John Rossi, a molecular biologist at the Beckman Research Institute of the City of Hope National Medical Center in Duarte, California. In 2008, researchers reported that Brown gained the ability to control his HIV infection after they treated him with donor bone-marrow stem cells that carried a mutation in a gene called CCR5. Most HIV strains use a protein encoded by CCR5 as a gateway into the T cells of a host’s immune system. People who carry a mutated version of the gene, including Brown's donor, are resistant to HIV.
But similar treatment is not feasible for most people with HIV: it is invasive, and the body is likely to attack the donor cells. So a team led by Carl June and Pablo Tebas, immunologists at the University of Pennsylvania in Philadelphia, sought to create the beneficial CCR5 mutation in a person’s own cells, using targeted gene editing. The researchers drew blood from 12 people with HIV who had been taking antiretroviral drugs to keep the virus in check. After culturing blood cells from each participant, the team used a commercially available ZFN to target the CCR5 gene in those cells. The treatment succeeded in disrupting the gene in about 25% of each participant’s cultured cells; the researchers then transfused all of the cultured cells into the participants. After treatment, all had elevated levels of T cells in their blood, suggesting that the virus was less capable of destroying them. Six of the 12 participants then stopped their antiretroviral drug therapy, while the team monitored their levels of virus and T cells. Their HIV levels rebounded more slowly than normal, and their T-cell levels remained high for weeks. In short, the presence of HIV seemed to drive the modified immune cells, which lacked a functional CCR5 gene, to proliferate in the body. Researchers suspect that the virus was unable to infect and destroy the altered cells. “They used HIV to help in its own demise,” says Paula Cannon, who studies gene therapy at the University of Southern California in Los Angeles. “They throw the cells back at it and say, ‘Ha, now what?’”
Siddhartha Deb in the NYT Magazine:
“I’ve always been slightly short with people who say, ‘You haven’t written anything again,’ as if all the nonfiction I’ve written is not writing,” Arundhati Roy said.
It was July, and we were sitting in Roy’s living room, the windows closed against the heat of the Delhi summer. Delhi might be roiled over a slowing economy, rising crimes against women and the coming elections, but in Jor Bagh, an upscale residential area across from the 16th-century tombs of the Lodi Gardens, things were quiet. Roy’s dog, Filthy, a stray, slept on the floor, her belly rising and falling rhythmically. The melancholy cry of a bird pierced the air. “That’s a hornbill,” Roy said, looking reflective.
Roy, perhaps best known for “The God of Small Things,” her novel about relationships that cross lines of caste, class and religion, one of which leads to murder while another culminates in incest, had only recently turned again to fiction. It was another novel, but she was keeping the subject secret for now. She was still trying to shake herself free of her nearly two-decade-long role as an activist and public intellectual and spoke, with some reluctance, of one “last commitment.” It was more daring than her attacks on India’s occupation of Kashmir, the American wars in Iraq and Afghanistan or crony capitalism. This time, she had taken on Mahatma Gandhi.
Wednesday, March 05, 2014
Michael Walzer reviews David Nirenberg's Anti-Judaism: The Western Tradition, in the NYRB:
What Nirenberg has written is an intellectual history of Western civilization, seen from a peculiar but frighteningly revealing perspective. It is focused on the role of anti-Judaism as a constitutive idea and an explanatory force in Christian and post-Christian thought—though it starts with Egyptian arguments against the Jews and includes a discussion of early Islam, whose writers echo, and apparently learned from, Christian polemics. Nirenberg comments intermittently about the effects of anti-Judaism on the life chances of actual Jews, but dealing with those effects in any sufficient way would require another, and a very different, book.
Anti-Judaism is an extraordinary scholarly achievement. Nirenberg tells us that he has left a lot out (I will come at the end to a few things that are missing), but he seems to know everything. He deals only with literature that he can read in the original language, but this isn’t much of a limitation. Fortunately, the chapter on Egypt doesn’t require knowledge of hieroglyphics; Greek, Hebrew, and Latin are enough. Perhaps it makes things easier that the arguments in all the different languages are remarkably similar and endlessly reiterated.
A certain view of Judaism—mainly negative—gets established early on, chiefly in Christian polemics, and then becomes a common tool in many different intellectual efforts to understand the world and to denounce opposing understandings. Marx may have thought himself insightful and his announcement original: the “worldly God” of the Jews was “money”! But the identification of Judaism with materialism, with the things of this world, predates the appearance of capitalism in Europe by at least 1,500 years.
Daniel Nexon in the Washington Post:
Russia’s political organization is fundamentally imperial in character, composed of a hodgepodge of political units that range from the fully integrated to the semi-sovereign and autonomous. As my colleague, Charles King, wrote in 2003:
“Central power, where it exists, is exercised through subalterns who function as effective tax- and ballot-farmers; they surrender up a portion of local revenue and deliver the votes for the center’s designated candidates in national elections in exchange for the center’s letting them keep their own fiefdoms.”
Nowhere is this kind of arrangement more vividly illustrated than in Chechnya, where Moscow ‘solved’ its separatist problem by devolving power to a local viceroy, Akhmed Kadyrov and, after his assassination, his son, Ramzan Kadyrov.
Second, Moscow’s strategy for managing its internal relations — relying on subalterns, exploiting ethnic divisions, deploying military forces and using the toolkit of electoral authoritarianism — extends, if often in attenuated form, to those states it considers as falling within its “privileged sphere of influence.” It has long backed and leveraged secessionist movements in, among other places, Moldova, Georgia and Azerbaijan in the pursuit of political control. Although not always successful, this pattern dates back to the Soviet era and, before that, the Russian Empire. Indeed, the Kremlin’s power-political practices are as perennial as they are limited. In a theoretically sophisticated Security Studies article, Iver B. Neumann and Vincent Pouliot show that Moscow’s quest for great-power status and recognition, combined with its inability to achieve “insider status” in the international order, invariably drives it back to the same repertoire of asserting great-power perquisites in ways that shock and alarm the international community. Did German Chancellor Angela Merkel describe Putin as living in “another world” after her March 2 phone conversation with him? If not, then sometimes truth does indeed reside in fiction.
Jenny Jarvie in The New Republic:
[T]he headline above would, if some readers had their way, include a "trigger warning"—a disclaimer to alert you that this article contains potentially traumatic subject matter. Such warnings, which are most commonly applied to discussions about rape, sexual abuse, and mental illness, have appeared on message boards since the early days of the Web. Some consider them an irksome tic of the blogosphere’s most hypersensitive fringes, and yet they've spread from feminist forums and social media to sites as large as The Huffington Post. Now, the trigger warning is gaining momentum beyond the Internet—at some of the nation's most prestigious universities.
Last week, student leaders at the University of California, Santa Barbara, passed a resolution urging officials to institute mandatory trigger warnings on class syllabi. Professors who present "content that may trigger the onset of symptoms of Post-Traumatic Stress Disorder" would be required to issue advance alerts and allow students to skip those classes. According to UCSB newspaper The Daily Nexus, Bailey Loverin, the student who sponsored the proposal, decided to push the issue after attending a class in which she “felt forced” to sit through a film that featured an “insinuation” of sexual assault and a graphic depiction of rape. A victim of sexual abuse, she did not want to remain in the room, but she feared she would only draw attention to herself by walking out.
On college campuses across the country, a growing number of students are demanding trigger warnings on class content. Many instructors are obliging with alerts in handouts and before presentations, even emailing notes of caution ahead of class. At Scripps College, lecturers give warnings before presenting a core curriculum class, the “Histories of the Present: Violence," although some have questioned the value of such alerts when students are still required to attend class. Oberlin College has published an official document on triggers, advising faculty members to "be aware of racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression," to remove triggering material when it doesn't "directly" contribute to learning goals, and to "strongly consider" developing a policy to make "triggering material" optional. Chinua Achebe's Things Fall Apart, it states, is a novel that may "trigger readers who have experienced racism, colonialism, religious persecution, violence, suicide and more."
Jan Mieszkowski reviews Jacques Derrida's The Death Penalty, Volume I in the LA Review of Books:
[T]he driving concern of the seminar is as clear as it is provocative. “Never to my knowledge,” Derrida declared in a contemporaneous conversation with French historian Élisabeth Roudinesco, “has any philosopher as a philosopher, in his or her own strictly and systematically philosophical discourse, never has any philosophy as such contested the legitimacy of the death penalty.” As an experiment, I shared this claim with a number of academic philosophers. Their initial skepticism quickly turned to surprise as they realized that, as Derrida observes, virtually all of the major philosophers were either ardent advocates of capital punishment, reluctant apologists for it, or markedly silent on the topic. Even those, Derrida adds, “who maintained a public discourse against the death penalty never did so, to my knowledge — and this is my provisional hypothesis — in a strictly philosophical way.”
One may raise an eyebrow at the formulation “in a strictly philosophical way,” if only because one can’t help imagining how Derrida himself, in another mood, might have pounced on it: can philosophy ever be strictly philosophical? Doesn’t philosophy come into its own precisely by losing itself when it seeks a way of its own? Yet these are precisely Derrida’s concerns, for at issue is not just what certain philosophers have said about the death penalty, but whether Western philosophy is in some way organized by its investment in this particular doctrine of punishment. Derrida’s suggestion is that the death penalty is both one penalty among others and the penalty of penalties, a transcendental condition of possibility of justice and punishment. Criminal law as we know it, if not law in general, would be inconceivable in its absence. The death penalty, he writes, “has always been the effect of an alliance between a religious message and the sovereignty of the state,” state sovereignty, first and foremost, being the power over the life and death of subjects. It is therefore not simply a question of maintaining that we can only understand the death penalty by explaining the relations between traditional theological, juridical, and political discourses. The reigning theological-juridico-political constellation can be approached and understood only through a study of capital punishment.
The Culture of Narcissism solidified Lasch’s reputation as a leading anti-modernist critic of an America that seemed to have lost its balance as it rollerskated into oblivion. Mistrusting America’s affluence and growing technological achievements, Lasch even critiqued the anti-authoritarian liberation struggles of the 1960s, which belonged for him to the same modernist cult of progress that, failing to recognize necessary limits, would destroy all in its path. The counterculture’s myth of exaggerated self-realization was but the flipside of the retreat into basic self-preservation. Detached by state and market from connections to a more sustaining sense of purpose or obligation, Americans inhabited a culture that left them rootless.
But Lasch should not be remembered merely as a grumbling reactionary. What he feared was “liberation,” not “modernity”—dismissing anti-modernist nostalgia as the fantasy of progress in reverse. For most of his life (he died of cancer in 1994), he remained committed to a more egalitarian society and clung to the hope that change might still occur. As he said toward the end of a career that had turned, beginning with Narcissism, increasingly dark and pessimistic: he still had faith even though he lacked optimism. It was a statement that flummoxed many interviewers, but it is key to understanding Lasch’s complex vision of American culture—and of the role of the social critic within it.
Written in 1886, The Death of Ivan Ilyich was the first fiction Tolstoy published after the spiritual upheaval he chronicles in Confession. It’s easy to imagine Ilyich as the old and bearded sage-looking man Tolstoy was upon his death, but he’s only forty-five years old, and this fact adds to the tremendous pathos of the story: The death of a young man is always more awful than the death of an old man. The priest gives Ilyich little spiritual consolation, and the doctors are self-important fools, incapable of mitigating his pain. His co-workers are disgusted by the thought of his wasting body and care only about jockeying for cozier positions once he dies. His wife and children, occupied by the minutiae of their quotidian lives, refuse to admit what has befallen him. He finds their refusal to confront this fanged truth most disgusting of all: “Ivan Ilyich’s chief torment was the lie—that lie, for some reason recognized by everyone, that he was only ill but not dying.” His sole comfort comes from Gerasim, the peasant servant who does not recoil from the foul stench, who accepts the inevitability of all flesh. If Ilyich’s upper-crust friends regard death as indecent, Gerasim knows otherwise: His peasant’s dirty-hands understanding of life, his calm acceptance of every person’s fate, helps to calm Ilyich into his own acceptance. (The peasantry’s calm acceptance of death, by the way, can be noticed in Turgenev, Dostoevsky, and Solzhenitsyn, to name a few—it seems to fall somewhere in line between Russian literary trope and Russian cultural myth.) Relief for Ilyich comes only after he has followed Gerasim’s lead and acquiesced to his fate.
One great problem with financial journalism, especially in the decades leading up to the crash, has been that it’s often written in an argot understandable only to the already highly financially literate. Sorkin doesn’t usually employ such specialized language. This has led to the mistaken belief that he’s explaining the industry to regular people. In fact, he is a dutiful Wall Street court reporter, telling important people what other important people are thinking and saying. At the same time, he is Wall Street’s most valuable flack. He isn’t explaining finance to the people—you’d be better served reading John Kenneth Galbraith to understand how finance works—he’s justifying it.
The modern finance industry is at a loss when it comes to justifying its own existence. Its finest minds can’t explain why we wouldn’t be better off with a much simpler and more heavily circumscribed model of capital formation. Sorkin likewise can’t make his readers fully grasp why the current system—which turns large amounts of other people’s money and even more people’s debt into huge paper fortunes for a small super-elite, and in such a way as to regularly imperil the entire worldwide economic order—is beneficial or necessary. But the New York Times and Wall Street each need him to try.