F or those who are dreading the next two weeks – for those for whom the last seven years, since the dramatic announcement of London’s appointment as the host city for the 2012 Olympic Games, have been a torment – the French academic Marc Perelman’s polemic could not be more perfect; an ideal accompaniment, perhaps, to a fortnight that might best be spent, for the naysayers, doubters and outright opponents, in an isolation tank. Not that Barbaric Sport confines its withering contempt to the Olympics – although it does, somewhat opportunistically, lead off with them; football also comes in for a drubbing and, although other forms of sport get comparatively passing mentions, it would be a perverse reader who put this book down feeling licensed to be a supporter of – well, anything. There are few evils that Perelman doesn’t lay at the door of competitive sport, which he carefully contrasts with (and, indeed, blames for depriving us of) play, “a disinterested activity without material goals, ludic and free”.
more from Alex Clark at the TLS here.
Helen Sword in NYTimes' Opinionator:
Take an adjective (implacable) or a verb (calibrate) or even another noun (crony) and add a suffix like ity, tion or ism. You’ve created a new noun: implacability, calibration, cronyism. Sounds impressive, right?
Nouns formed from other parts of speech are called nominalizations. Academics love them; so do lawyers, bureaucrats and business writers. I call them “zombie nouns” because they cannibalize active verbs, suck the lifeblood from adjectives and substitute abstract entities for human beings:
The proliferation of nominalizations in a discursive formation may be an indication of a tendency toward pomposity and abstraction.
The sentence above contains no fewer than seven nominalizations, each formed from a verb or an adjective. Yet it fails to tell us who is doing what.
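Sword's suffix test is mechanical enough to sketch in code. The following is my own crude illustration, not a tool she describes: it flags words carrying common nominalizing suffixes, applied to her example sentence above.

```python
import re

# A rough heuristic for Sword's "zombie nouns": words built from other parts
# of speech via common nominalizing suffixes. My illustration, not her method.
SUFFIXES = ("ity", "tion", "sion", "ism", "ence", "ency", "ance", "ment", "ness")

def zombie_nouns(text):
    hits = []
    for word in re.findall(r"[a-z]+", text.lower()):
        stem = word[:-1] if word.endswith("s") else word  # crude de-pluralizing
        if stem.endswith(SUFFIXES) and len(stem) > 6:
            hits.append(word)
    return hits

sentence = ("The proliferation of nominalizations in a discursive formation "
            "may be an indication of a tendency toward pomposity and abstraction.")
print(zombie_nouns(sentence))  # flags all seven nominalizations
```

Being a pure suffix match, the heuristic would also flag perfectly healthy nouns ("nation", "kindness"), which is Sword's point: the problem is density, not any single word.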
Amy Bass in Salon:
In 1988, Jimmy “The Greek” Snyder (in)famously stated that the prowess of African-American football players could be traced to slavery, saying “the black is a better athlete to begin with because he’s been bred to be that way … [They] jump higher and run faster.” The reaction to such obviously racist remarks was fast and furious: Amid the uproar, CBS Sports fired him. So when Olympic gold medalist Michael Johnson predicted this month that African-American and West Indian track athletes would dominate the London Olympics because of the genes of their slave ancestors, I paid little attention, thinking there was no way this could become a viable conversation yet again. “All my life I believed I became an athlete through my own determination, but it’s impossible to think that being descended from slaves hasn’t left an imprint through the generations,” Johnson told the Daily Mail. “Difficult as it was to hear, slavery has benefited descendants like me – I believe there is a superior athletic gene in us.”
As a historian, what I find to be stunning about what he said is the claim that the supremacy of black athletes in track had never “been discussed openly before.” Actually, with his words, Johnson plunged himself into a century-old debate that seems to rear its (rather ugly) head every four years, just in time for the opening of sport’s largest global stage. Johnson supported his theory with the example of the men’s 100m final at the Beijing Olympics: Three of the eight finalists came from Jamaica, including record-breaking winner Usain Bolt, and two from Trinidad; African-Americans Walter Dix and Doc Patton and Dutch sprinter Churandy Martina, who hails from Curacao, rounded out the lineup.
Read the rest here.
Amy Finnerty on Jenny Rosenstrach's Dinner: A Love Story, over at the Los Angeles Review of Books:
FOR WORKING PARENTS, even those who are gifted, intuitive cooks, getting a wholesome dinner on the table regularly is a heavy lift, what with the shopping, the dirty dishes and recycling, the lactose intolerance and high cholesterol to be mitigated, not to mention the perils of Big Corn looming over them. Even more acutely, they feel a relentless time squeeze. Family dinner is a middle-class, first-world challenge, to be sure, but it’s a pervasive one that deserves a full-length book. With great facility and charm, Jenny Rosenstrach’s Dinner: A Love Story, based on her popular blog of the same name, recounts the author’s progression from self-doubt to mastery, both at work and at home.
For many members of Rosenstrach’s demographic, who came of age with the assumption that they’d become high-powered professionals (or at least writers), homemaking skills were anathema. Domestic training was retrograde and sexist and took time away from academic striving. Rosenstrach learned much of what she knows about the kitchen not as a girl but in the workplace, as an editor at Real Simple and Cookie, and through trial and error in her own kitchen. She provides members of her cohort not just with a how-to book but, more importantly, with a central philosophical text.
Rosenstrach’s big idea is that once dinner is solved, the more profound concerns of family life will coalesce around it: Conversation, shared responsibility, pleasure, and health, she argues, can all be fostered at the table. “The simple act of carving out the ritual – a delicious homemade ritual,” she writes – has given “every day purpose and meaning, no matter what else was going on in our lives.” The book, which, like the blog, has a work-in-progress, vérité aura, is a working mother’s manifesto with crowd-pleasing dishes, family recipes, and domestic solutions scattered among reflections on parenting, the cocktail hour, marriage, and careers.
J. Craig Venter on the 70th Anniversary of Schrödinger's Lecture at Trinity College, over at Edge:
As you all know, Schrödinger's book was published in 1944 and it was based on a series of three lectures here, starting in February of 1943. And he had to repeat the lectures, I read, on the following Monday because the room on the other side of campus was too small, and I understand people were turned away tonight, but we're grateful for Internet streaming, so I don't have to do this twice.
Also, due clearly to his historical role, it's interesting to be sharing this event with Jim Watson, whom I've known and had multiple interactions with over the last 25 years, including most recently sharing the Double Helix Prize for Human Genome Sequencing with him from Cold Spring Harbor Laboratory a few years ago.
Schrödinger started his lecture with a key question and an interesting insight on it. The question was “How can the events in space and time, which take place within the boundaries of a living organism be accounted for by physics and chemistry?” It's a pretty straightforward, simple question. Then he answered what he could at the time, “The obvious inability of present-day physics and chemistry to account for such events is no reason at all for doubting that they will be accounted for by those sciences.” While I only have around 40 minutes, not three lectures, I hope to convince you that there has been substantial progress in the last nearly 70 years since Schrödinger initially asked that question, to the point where the answer is at least nearly at hand, if not in hand.
I view that we're now in what I'm calling “The Digital Age of Biology”. My teams work on synthesizing genomes based on digital code in the computer, and four bottles of chemicals illustrate the ultimate link between the computer code and the digital code.
That life is code, as you heard in the introduction, was very clearly articulated by Schrödinger as “code-script”. Perhaps even more important, and something I missed on the first few readings of his book earlier in my career, is that, as far as I can tell, it's the first mention that this code could be as simple as a binary code.
John D. Barrow in +plus magazine (via Bookforum's Omnivore):
[I]nfinities in modern physics have become separate from the study of infinities in mathematics. One area in physics where infinities are sometimes predicted to arise is aerodynamics or fluid mechanics. For example, you might have a wave becoming very, very steep and non-linear and then forming a shock. In the equations that describe the shock wave formation some quantities may become infinite. But when this happens you usually assume that it's just a failure of your model. You might have neglected to take account of friction or viscosity and once you include that into your equations the velocity gradient becomes finite — it might still be very steep, but the viscosity smoothes over the infinity in reality. In most areas of science, if you see an infinity, you assume that it's down to an inaccuracy or incompleteness of your model.
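Barrow's shock-wave example can be made concrete with the standard textbook model of wave steepening (my illustration; the interview does not cite a specific equation). In the inviscid Burgers equation, faster parts of the wave overtake slower parts and the velocity gradient blows up in finite time; restoring the neglected viscosity term keeps the gradient steep but finite:

```latex
% Inviscid model: wave steepening drives the gradient to infinity in finite time.
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} = 0
% Restoring the neglected viscosity \nu regularizes the shock:
% the gradient stays steep but finite.
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} = \nu\,\frac{\partial^2 u}{\partial x^2}
```

The infinity in the first equation is exactly the "failure of your model" Barrow describes: it disappears once the physics left out of the model is put back in.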
In particle physics there has been a much longer-standing and more subtle problem. Quantum electrodynamics is the best theory in the whole of science, its predictions are more accurate than anything else that we know about the Universe. Yet extracting those predictions presented an awkward problem: when you did a calculation to see what you should observe in an experiment you always seemed to get an infinite answer with an extra finite bit added on. If you then subtracted off the infinity, the finite part that you were left with was the prediction you expected to see in the lab. And this always matched experiment fantastically accurately. This process of removing the infinities was called renormalisation. Many famous physicists found it deeply unsatisfactory. They thought it might just be a symptom of a theory that could be improved.
This is why string theory created great excitement in the 1980s and why it suddenly became investigated by a huge number of physicists. It was the first time that particle physicists found a finite theory, a theory which didn't have these infinities popping up. The way it did it was to replace the traditional notion that the most basic entities in the theory (for example photons or electrons) should be point-like objects that move through space and time and so trace out lines in spacetime. Instead, string theory considers the most basic entities to be lines, or little loops, which trace out tubes as they move. When you have two point-like particles moving through space and interacting, it's like two lines hitting one another and forming a sharp corner at the place where they meet.
Some time ago we wrote about W.H. Auden on stage, in a new Broadway musical. But how many know of his work on film? David Collard writes in England’s Literary Review about Auden’s lifelong fascination with film. For six months from 1935 to 1936, Auden worked for the General Post Office Film Unit (GPO), which included the time that it produced Night Mail – the Citizen Kane of documentary film – Coal Face, Negroes (released as God’s Chillun), and The Way to the Sea – “all four films featuring brilliant modernist scores by the young Benjamin Britten,” according to Collard. “No artists of comparable stature had collaborated so closely since 1691, when John Dryden and Henry Purcell worked together on the ‘dramatick opera’ King Arthur.” Collard also writes that GPO, “despite its prosaic-sounding title, was for five years the most exciting, innovative and progressive cultural project in Britain, staffed by a dazzling cohort of international talents. In a short-lived flurry of commitment to the cause, Auden also lectured on film, wrote reviews, provided subtitle renderings of Russian peasant folk songs for Dziga Vertov’s Three Songs of Lenin, and collaborated on various other projects, even appearing in front of the camera (disguised as a department store Father Christmas in Evelyn Spice’s spirited Calendar of the Year).”
more from Cynthia Haven at The Book Haven here.
An appreciation for blandness as a separate category of experience—and not a new one—may help us understand how Murakami has managed to produce an intensely interesting body of fiction around characters, and sentences, that operate in a kind of continuous monotone. He follows a century of Western writers of negation, absence, and “plainness” (Kafka, Hemingway, Camus, Beckett, Pinter, Carver) but the resemblance is—perhaps by design—only superficial. Blandness, for Murakami, is not a symptom of late capitalist culture, the endpoint of cultural disintegration, or a post-apocalyptic end of history, but a condition that precedes those things and, more disturbingly, renders them harmless. Depending on one’s position, his characters’ calm acceptance of wind-up birds, sheep men, and cat towns, their ability to regain emotional homeostasis in the most dire circumstances, might seem the essence of weightless global cool or the soulless literary equivalent of a shrink-wrapped airline meal, but either reading ignores the obvious: every literary sensibility, like every shred of pasta, comes from somewhere.
more from Jess Row at Threepenny Review here.
In Gulliver’s Travels, Swift challenges the idea — advanced by his Enlightenment contemporaries — that truth, including the truth about human nature, is best understood as a matter of simple factual claims. Swift’s view, as we shall see, was that dedication to this rising scientific view of truth as synonymous with fact precisely misses the very essence of human nature. But Swift’s recognition of the subtle relationship between our capacity for lying and the essential truth about human nature also sets him apart from another modern opponent of the Enlightenment, the German philosopher Friedrich Nietzsche. Nietzsche picked up as a kind of motto a mistranslated line from the Second Pythian Ode, a work by the Ancient Greek poet Pindar: “Become what you are.” In Nietzsche’s existentialist understanding (later appropriated in a similar fashion by Martin Heidegger), the phrase is an injunction to drop the delusion of an ideal you, along with any moral overlay it implies, and simply to identify fully with yourself as a bundle of drives. “Become what you are” means for him “Become what you happen to be, not what you think you should be.” That is, amor fati: love your fate!
more at The New Atlantis here.
The three-tongued glacier has begun to melt.
What will we do, they ask, when boulder-milt
Comes wallowing across the delta flats
And the miles-deep shag ice makes its move?
I saw it, ridged and rock-set, from above,
Undead grey-gristed earth-pelt, aeon-scruff,
And feared its coldness that still seemed enough
To iceblock the plane window dimmed with breath,
Deepfreeze the seep of adamantine tilth
And every warm, mouthwatering word of mouth.
by Seamus Heaney
(An 11”x16” broadside of this poem,
with an original illustration by Barry Moser,
is available from the Poetry Center.)
Does the old rallying cry “Guns don't kill people. People kill people” hold up to philosophical scrutiny?
Evan Selinger in The Atlantic:
The commonsense view of technology is one that some philosophers call the instrumentalist conception. According to the instrumentalist conception, while the ends that technology can be applied to can be cognitively and morally significant, technology itself is value-neutral. Technology, in other words, is subservient to our beliefs and desires; it does not significantly constrain, much less determine, them. This view is famously touted in the National Rifle Association's maxim: “Guns don't kill people. People kill people.”
To be sure, this statement is more of a slogan than a well-formulated argument. But even as a shorthand expression, it captures the widely believed idea that murder is wrong and the appropriate source to blame for committing murder is the person who pulled a gun's trigger. Indeed, the NRA's proposition is not unusual; it aptly expresses the folk psychology that underlies moral and legal norms.
From The Paris Review:
“Damned good-looking” is how Ernest Hemingway—or, rather, his antihero Jake Barnes in The Sun Also Rises—describes Lady Brett Ashley when she appears at a Parisian club with a mob of pretty boys. “Damned good-looking” is better than pretty. It’s better than the colloquial “hot,” better than beautiful, even.
Damned good-looking, it is.
Imagine Hemingway, the great economist of words, deciding just how he would introduce perhaps his most enduring siren. Original drafts of the novel open with the character Ashley (better known as Brett), though she would eventually come to play a smaller role. Hemingway was bewitched, at the time of writing, by the self-possession of the real-life Lady Duff Twysden, and she—rather than his wife, Hadley—would serve as the partial inspiration for The Sun Also Rises’s heroine. (Indeed, he would dedicate later editions of the novel to her.) Poor Hadley would be left out again when her husband took up with Lady Brett’s other progenitor, Pauline Pfeiffer, who in 1926 came to France to assist Mainbocher at Vogue. In addition to being a fashion writer, Pfeiffer was an accomplished journalist, an intellectual who fit easily into Hemingway’s Paris crew. It seems Hemingway’s rigid conception of a professional, globe-trotting man’s man—a fan of hunting, boxing and bullfighting—shouldn’t settle for pretty; he’d want damn good-looking. “Damn good-looking”—Hemingway’s highest female accolade—is also, in the form of Lady Brett, damn witty, damn intelligent, and damn good in bed.
How can you tell if an ancient story is completely fictional or based on reality? One method, says a team of physicists, is to map out the social network of the characters and test whether it looks like a real social network. When they used that method for three ancient myths, they found that the characters have surprisingly realistic relationships. Ancient stories are called myths for a reason. No one believes that Beowulf, the hero of the Anglo-Saxon epic, slew a talking monster named Grendel. Or that the Greek gods described in The Iliad actually appeared on Earth to intervene in the Trojan War. But historians and archaeologists agree that much of those ancient narratives was based on real people and events. The supernatural features of the narrative were then layered onto reality. Ralph Kenna and Pádraig Mac Carron, physicists at Coventry University in the United Kingdom, wondered if reality leaves its mark on mythological narratives through the relationships between characters. So they built social network maps for three ancient texts. Along with Beowulf and The Iliad, they included an Irish epic, Táin Bó Cúailnge. The Irish epic's origins are murky. Most scholars assume that it is completely fictional, but recent archaeological evidence suggests that it could be based in part on a real conflict in Ireland 3200 years ago.
Kenna and Mac Carron started by building a database for each story that captures all the characters—Beowulf, The Iliad, and Táin contained 74, 716, and 404 characters, respectively—and the interactions between them. (If two characters met or clearly knew each other, that counted as a relationship.) As a control, they also mapped the social networks for modern works of fiction: Les Misérables, Shakespeare's Richard III, The Fellowship of the Ring, and the first book in the Harry Potter series. Once the social webs were mapped, Kenna and Mac Carron applied the standard statistical toolkit used to study real social networks such as Facebook. For example, one universal feature of real social networks is that they are highly clustered, with tight clumps of people who all know each other. These groups are linked to each other by highly social people known as “connectors.” Real social networks also have a property called “small world,” which indicates that there is never more than a few degrees of separation between any two people. Such statistical properties have been found in networks of movie actors, jazz musicians, and even scientific collaborators. If the ancient myths were based on real people, Kenna and Mac Carron expected to find the same patterns.
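The two statistics named above can be computed with a few lines of Python. This is a sketch on an invented toy network, not the researchers' code or data; the character names and ties are hypothetical, chosen only to show the mechanics of the measurements.

```python
from collections import deque
from itertools import combinations

# Hypothetical toy network -- NOT the paper's data. An edge means the two
# characters met or clearly knew each other, mirroring the authors' rule.
edges = [
    ("Beowulf", "Hrothgar"), ("Beowulf", "Wiglaf"), ("Beowulf", "Unferth"),
    ("Beowulf", "Wealhtheow"), ("Hrothgar", "Wealhtheow"),
    ("Hrothgar", "Unferth"), ("Wiglaf", "Unferth"),
]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def clustering(node):
    """Fraction of a character's acquaintance pairs who also know each other."""
    nbrs = adj[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

def distances_from(src):
    """Breadth-first search: degrees of separation from one character."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# High average clustering plus short average paths is the "small world"
# signature the physicists looked for in the mythological networks.
avg_clust = sum(clustering(n) for n in adj) / len(adj)
pairs = list(combinations(adj, 2))
avg_path = sum(distances_from(a)[b] for a, b in pairs) / len(pairs)

print(f"average clustering: {avg_clust:.2f}")
print(f"mean separation:    {avg_path:.2f}")
```

On a network of hundreds of characters, as in The Iliad or the Táin, the interesting question is how these two numbers compare with those of a random network of the same size, which is how "realistic" structure is judged.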
Richard Wall in Folio Weekly:
Denny Fouts (1914-1948) was handsome, charming, witty, entertaining and moody. He didn’t have money himself, but lived luxuriously off the wealth and infatuation of others. He played a starring role in the pre-war aristocratic bohemian scene in Europe, where the fun was extravagant and being gay was just fine. Denny amazed and inspired such literary greats as Truman Capote, Gore Vidal, Christopher Isherwood, Somerset Maugham and Gavin Lambert, and his personality sparks the fiction, memoirs, diaries and letters of the most noted authors and artists of his day.
Sixty-four years after his death, Denny Fouts is a cult figure in gay culture, best known by the sensational titles pinned on him. Capote dubbed him “The Best-Kept Boy in the World” (also the title of an upcoming book about Denny by Arthur Vanderbilt II). Isherwood and others repeated Denny’s reputation as “the most expensive male prostitute in the world.”
But this black sheep from Riverside was more than a switch-hitting gigolo who parlayed his Southern charm and sexual prowess into a succession of glamorous free rides.
A more complex Fouts can be found in the literature, and in the insights of his living relatives, which have never before been published. Alice Denham, Denny’s 79-year-old cousin who lives in New York City and is working on a book about her family and Denny, insists: “He wasn’t a male prostitute. Denny had arrangements. You couldn’t say I’ll give you this much money and he’d go with you.” He wasn’t just a hustler; he was an icon of and an influence on the acceptance of gay culture.
“Fouts was not walking the street. He had longtime lovers whose attraction for him went far beyond the sexual,” says Nick Harvill, an expert on literary references to Fouts who also assembles content-based libraries for private individuals, many in Hollywood. “Denham Fouts was a male version of the courtesan. He is one of the greatest enigmas of the 20th Century.”
Read the rest here.
An interview with Woody Allen in The Talks:
Mr. Allen, do you truly believe that happiness in life is impossible?
This is my perspective and has always been my perspective on life. I have a very grim, pessimistic view of it. I always have since I was a little boy; it hasn’t gotten worse with age or anything. I do feel that it’s a grim, painful, nightmarish, meaningless experience and that the only way that you can be happy is if you tell yourself some lies and deceive yourself.
I think it’s safe to say that most people would disagree.
But I am not the first person to say this or even the most articulate person. It was said by Nietzsche, it was said by Freud, it was said by Eugene O’Neill. One must have one’s delusions to live. If you look at life too honestly and clearly, life becomes unbearable because it’s a pretty grim enterprise, you will admit.
I have a hard time imagining Woody Allen having such a hard life…
I have been very lucky and I have made my talent a very productive life for me, but everything else I am not good at. I am not good at getting through life, even the simplest things. These things that are child’s play for most people are a trauma for me.
Ronald Dworkin in the NYRB:
Just before the decision was announced, the betting public believed, by more than three to one, that the Court would declare the act unconstitutional. They could not have formed that expectation by reflecting on constitutional law; almost all academic constitutional lawyers were agreed that the act is plainly constitutional. The public was expecting the act’s defeat largely because it had grown used to the five conservative justices ignoring argument and overruling precedent to remake the Constitution to fit their far-right template.
The surprise lay not just in the fact that one of the conservatives voted for the legally correct result, but which of them did that. Everyone assumed that if, unexpectedly, the Court sustained the act it would be because Justice Anthony Kennedy, the least doctrinaire of the conservative justices, had decided to vote with the four more liberal justices, Justices Ruth Ginsburg, Stephen Breyer, Sonia Sotomayor, and Elena Kagan. After all, since 2005, Kennedy had joined the liberals in twenty-five cases to create 5–4 decisions they favored, rather than joining his fellow conservatives to provide five votes for their side. Two of the other conservative justices—Justices Antonin Scalia and Clarence Thomas—had done that only twice, and the two others—Chief Justice John Roberts and Justice Samuel Alito—had never done so. So most commentators thought, from the moment the Court agreed to rule on the act, that the decision would turn, one way or the other, on Kennedy’s vote, and a great many of the hundreds of briefs submitted on both sides offered arguments designed mainly to appeal to him.
The modernist critic T.E. Hulme famously, and insultingly, described Romanticism as “spilt religion.” Natural Supernaturalism can be thought of as an extended proof of Hulme’s dictum, and simultaneously as a refutation of it. The energy that Christianity once devoted to imagining the end of the world and the redemption of mankind, Abrams shows, was not simply and chaotically “spilled” in Romantic literature. On the contrary, it was transformed in wonderfully complex ways. “In the increasingly secular period since the Renaissance,” Abrams writes, “we have continued to live in an intellectual milieu” shaped by the millennialism of Christianity. This shaping is “so deep and pervasive, and often so transformed from its Biblical prototype, that it has been easy to overlook both its distinctiveness and its source.” In writing about this theme, Abrams delves deeply into the Christian theological tradition, paying particular attention to the Book of Revelation, with its vision of destruction and renewal, and the Confessions of Saint Augustine, with their revolutionary analysis of human motive and guilt.
more from Adam Kirsch at Tablet here.
Sarah A. Topol in Vice:
The American press calls her Russia’s Paris Hilton, but Sobchak is a far more prominent figure in Russia than Hilton ever was in America. As she herself points out, 97 percent of Russians know who she is, even if most of them don’t like her. Only two living Russians enjoy better name recognition: three-term president Vladimir Putin and one-term president Dmitri Medvedev.
Her father, Anatoly Sobchak, an early champion of democracy and capitalism, was the first elected mayor of St. Petersburg. He singlehandedly launched Putin’s political career, and Ksenia is rumored to be Putin’s goddaughter. In 1996, her father spiraled spectacularly to disgrace. He faced imprisonment on corruption charges, which he evaded with Putin’s help, by going into exile. When Boris Yeltsin turned Russia over to Putin, the charges disappeared and Anatoly Sobchak returned to Russia. He died in 2000 on the campaign trail for Putin. Ksenia, meanwhile, made a name for herself hosting a reality show called Dom-2 about a group of young people tasked with building a house on the outskirts of Moscow. The content combined the worst of Jersey Shore, The Real OC, and Tila Tequila. It was scandalous, deliciously addictive, and intellectually bankrupt programming. She posed for Russian Playboy, Maxim, and FHM; co-wrote Philosophy in the Boudoir and How to Marry a Millionaire. She hosted decadent parties, dated oligarchs, and wrote a column for Russian GQ. In short, she came to embody Russia’s new heady, careless, apolitical glamour.
Then, last year, she underwent a mystifying transformation. She traded her reality show for a political talk show. She broke up with her boyfriend, a government official, and started dating an opposition leader. She climbed on stages and addressed massive street rallies. Russia’s Paris Hilton had turned into a Russian Jane Fonda, or so it seemed.
Barney Rosset to me represents the literary world of the latter half of the 20th century. Two hefty books—the oldest of autographed books in my library—attest to this fact. The books are The Complete Justine, Philosophy in the Bedroom and Other Writings and The Olympia Reader: Selections from the Traveler’s Companion Series, both published by Grove Press, Inc. The autographs are Barney Rosset’s, dated 1965. I was then a writer aged 30, a complete unknown outside of Japan, visiting the New York publishing houses to receive a publication contract for my novel A Personal Matter. The owner of the company, knowing that I had started writing as a student majoring in French literature, asked me who, in contemporary literature, I found interesting. A soft smile spread across his face at the mention of each of my favorite French, English, and American writers and poets. He then kindly gave me the two books saying that although the work by Marquis de Sade required no comment from him, the “Olympia Reader” contained works—obtainable only in Paris—by writers whom I admired. The clear-thinking, soft-spoken man, whose countenance exuded a youthful vigor, said: “Among the writers in this selection, the most talented is Samuel Beckett, and this book carries a brief story of how Watt came to be published. I will most likely publish all of his works.”
more from Kenzaburō Ōe at Evergreen Review here.