Two women painters: Jenny Saville at Gagosian and Celia Paul

by Sue Hubbard

Jenny Saville. Oxyrhynchus. Gagosian, 6-24 Britannia Street, London WC1X 9JD. June 13 – July 26, 2014

Celia Paul. Victoria Miro, 16 Wharf Road, London, N1 7RW. 12 June – 2 August 2014

Two current shows at major London galleries illustrate that painting is not only alive and well but a vibrant, intellectually and emotionally challenging force. Both shows are figurative and both are by women. I first met Jenny Saville when she was 22. She'd just left Glasgow School of Art, and Charles Saatchi had purchased her MA show and offered an 18-month contract to support her while she made new work to be exhibited in his London gallery. Interviewing her for Time Out, I found her idealistic and determined that Saatchi ‘wouldn’t change her’. Her work was aggressive, personal, raw and highly accomplished. Flesh and the female body were her subjects, and graffiti-style texts that subverted traditional notions of feminine beauty were scored, like self-inflicted wounds, into the thick impasto of her subjects' bodies. Although part of a generation for whom painting – in particular figure painting – was not considered fashionable, she was soon to be seen as the heir to Lucian Freud.

Now Gagosian Gallery is presenting her first-ever solo show in London: Oxyrhynchus. A number of these new works are inspired by the rubbish dump found at this ancient Egyptian archaeological site, where heaps of discarded documents were preserved in the area's dry climate, including Euclid's Elements and fragments of Sappho's poems. This historic palimpsest has given Saville an intellectual armature on which to hang much of her imagery, which often involves the complex layering of bodies. Faces and limbs overlap, and ghostly reflections create a series of doppelgängers or shadow selves. The viewer's eye slips between forms, uncertain which limb belongs to which figure, as in Leonardo's cartoon of The Virgin and Child with Saint Anne and the Infant Saint John the Baptist, circa 1499, where the ownership of individual arms and legs is ambiguous. In the exhibition's title work (pastel and charcoal on canvas), bodies have been reduced to fragments. A foot sticks out from a heap of marks as though broken from an ancient sculpture. Elsewhere there's a pile of breasts. This intermingling and cross-referencing runs through Saville's work; black bodies intertwine with white, genders are blurred. Modern life is seen not as fixed but as complex and fluid. Boundaries and borders dissolve. Saville pays a conscious debt to art history with her references to Manet's Olympia, and with nervy abstract marks that wrestle to find form and space in the manner of de Kooning.

Read more »

The Loneliness of the Modern Warrior: Matt Murphy’s “A Beckoning War”

by Prashant Keshavmurthy

If there is a literary history of the modern warrior then Matthew Murphy's A Beckoning War should be its latest chapter and, surely, its finest. Told in the third person, the novel narrates “the Allied advance through the Gothic Line in Northern Italy in September 1944” almost wholly through the perspective of Captain Jim McFarlane of the Canadian Fifth Armored Division.

In the face of his wife Marianne's objections – they have been married less than a year – Jim volunteers to go to war out of a citizenly sense of duty to the Allied cause. The voluntary character of this decision places the novel in a modern tradition of war novels whose protagonists enter the fray out of a sense of righteous duty. Among these are Paul Bäumer of Erich Maria Remarque's 1929 All Quiet on the Western Front and Robert Jordan, the protagonist of Ernest Hemingway's For Whom the Bell Tolls (1940). Hemingway's famously bleak recreations of machine-gun slaughter are the ancestors of Murphy's dense descriptions of war machines and wounding.

As both these canonical examples attest, the modern war novel has been an anti-war novel. I prefer to identify Jim McFarlane not as a soldier but – despite the archaism of the term – as a warrior. Doing so places Jim in an older lineage of epic heroes (Beowulf, Tristan, Faraydun and Rustam come to mind), each of whom chooses to go to war with an enemy of his people. In so doing, he battles monsters (Beowulf with Grendel; Tristan with Morold; Faraydun and Rustam with a variety of demons) by destroying whose gigantic bodies he defines himself as his people's ideal and marks out a land as his people's homeland. This, at any rate, is the epic norm.

Read more »

Travels in Northeast Turkey: Part 2

by Hari Balasubramanian

After the road trip to the Turkey-Georgia border (see Part 1), I returned with my friend Serhat to Erzurum on the third day. Serhat flew back to Istanbul that same evening. My plan was to travel solo to the town of Kars the next morning by bus and spend two full days there before flying back to Istanbul. All this was in July 2013.

1. The minibus from Erzurum to Kars

Kars is at the far northeastern end of Turkey, about 3 hours by bus from Erzurum, close to the Armenian and Georgian borders. This is the same town where Orhan Pamuk's Snow is set. In the opening section of the novel, the protagonist Ka takes a bus from Erzurum to Kars; the bus runs into a raging winter storm.

I had a more basic problem. I thought that finding a bus would be a simple task. In the morning I took a taxi to the gleaming and modern Otogar, the bus station, about 14 km from Erzurum Center. But after a frustrating hour of enquiries, I had made no progress. I expected buses to Kars to be frequent, but no one seemed to know where to find one; the private companies – there were no government buses – said they did not have service to Kars that day. I roamed around the well-maintained bus station, asking at least ten people, moving in circles, not making any progress, gradually feeling amused at my travel predicament. The language barrier was a huge issue: I realized that even very basic English words and phrases weren't working.

Not knowing how to proceed, I returned to Erzurum Center, and spent some time in an internet café pondering my options. The café owner wanted to help; we used Google Translate to carry on a rudimentary conversation. He let me use his cell phone to call Serhat. Something was eventually arranged, I wasn't sure what; I simply waited. Ten minutes later, a car with a young man and a boy – both from a bus company – arrived to pick me up from the café. They were going to lead me to the bus to Kars. I rushed out with my baggage and left my personal diary next to the computer.

Read more »


From Permanent Black:

Mehta: I think of you, especially in the essays that constitute this book, as doing a rather particular kind of philosophy. It belongs to a tradition of very distinguished practitioners, including the late Richard Rorty, Bernard Williams, and Alasdair MacIntyre in the Anglo-American tradition; Michel Foucault in the French tradition; Adorno and Walter Benjamin in the German tradition; and of course several others. One of the things that marks this way of doing philosophy (if that is the term we should use) is that the familiar, and typically sharp, lines that separate philosophy from the humanities and the social sciences are willfully and self-consciously breached. I don’t mean that they are breached just for the heck of it, but that questions are posed in such a way that answering them relies on such a breach. Bernard Williams, as you know, proudly affirmed philosophy as a humanistic discipline. Your own work is heavily informed by the Dissenting tradition of 17th-century thought and by contemporary history and social science. And yet, in many ways, this way of doing philosophy is the minor key of contemporary Anglo-American, and increasingly even Continental, philosophy. How would you describe what you do? Does it matter to you if it is thought of as “doing philosophy,” or does that description seem arcane to you, as it did to Richard Rorty?

Bilgrami: I must confess that my work has not been motivated by any self-conscious effort towards trying to reorient the discipline of philosophy nor even to follow a tradition set by the philosophers you mention, much as I admire them all. Rather, it’s just that certain issues grabbed my interest and I followed what I thought was most important and urgent in them and when that led to having to read history and intellectual history, and to study some political economy and politics and a variety of cultural phenomena, I just followed that lead as best I could—mostly for the sake of coming to some fundamental understanding of the issues. You are certainly right that most philosophers do not have a capacious understanding of their subject and many might even view this sort of outreach as contaminating their discipline. However, looking at things from the other side, we mustn’t forget that the social sciences themselves, particularly Economics, have manifestly abandoned the historical, the broadly conceptual, and, above all, the value-oriented aspects of their pursuits. So it is possible that we are now at a disciplinary moment when philosophy is poised to pick up that slack and pay close attention to the very things that the social sciences have abdicated. This would, then, be an exciting time to be doing philosophy.

More here.

Why 10% of the Population Hates Cilantro and the Rest Doesn’t Know Any Better

From Reason I am Here:

The first time I tried cilantro I didn’t realize it; I just thought somebody had emptied a bottle of Old Spice on my pizza in an attempt to poison me. Cilantro tastes like soap to approximately 10% of the people who have had their genotype analyzed by 23andMe. The currently accepted explanation is that those of us who passionately despise cilantro were born with a genetic variant known as a single-nucleotide polymorphism (or SNP, pronounced ‘snip’).

The genome has 3 billion nucleotides (the building blocks, known as A, C, G and T), and 10 million of them are thought to be SNPs. That means that a significant percentage of the population has one letter in a specific location (an A, for example) while everyone else has a different letter at that location. The cilantro SNP is called rs72921001, and apparently its genomic location lies close to a cluster of olfactory receptor genes that includes OR6A2, the gene most likely to be alerting our brain about the presence of cilantro.
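To make the SNP idea concrete, here is a purely hypothetical sketch of how a raw genotype report might be classified at rs72921001. The choice of ‘A’ as the soap-taste-associated allele, and the genotype table itself, are illustrative assumptions, not the published data:

```python
# Hypothetical sketch: classify a two-letter genotype at the cilantro SNP.
SNP_ID = "rs72921001"

def soapy_cilantro_likelihood(genotype: str) -> str:
    """Count copies of the (assumed) soap-associated allele 'A'.

    A real report would use the published risk allele and effect size;
    'A' here is a placeholder chosen for illustration only.
    """
    copies = genotype.upper().count("A")
    return {0: "unlikely", 1: "possible", 2: "likely"}[copies]

# A made-up raw-genotype table mapping SNP IDs to the two observed alleles.
my_genotypes = {"rs72921001": "AC"}
print(soapy_cilantro_likelihood(my_genotypes[SNP_ID]))  # prints "possible"
```

Carrying one copy of a variant at a SNP gives an intermediate readout, which is why a two-letter genotype, one letter inherited from each parent, is the natural unit here.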

More here.

The Hunt for Life Beyond Earth


Michael Lemonick in National Geographic (photo by Mark Thiessen):

It's difficult to pin down when the search for life among the stars morphed from science fiction to science, but one key milestone was an astronomy meeting in November 1961. It was organized by Frank Drake, a young radio astronomer who was intrigued with the idea of searching for alien radio transmissions.

When he called the meeting, the search for extraterrestrial intelligence, or SETI, “was essentially taboo in astronomy,” Drake, now 84, remembers. But with his lab director's blessing, he brought in a handful of astronomers, chemists, biologists, and engineers, including a young planetary scientist named Carl Sagan, to discuss what is now called astrobiology, the science of life beyond Earth. In particular, Drake wanted some expert help in deciding how sensible it might be to devote significant radio telescope time to listening for alien broadcasts and what might be the most promising way to search. How many civilizations might reasonably be out there? he wondered. So before his guests arrived, he scribbled an equation on the blackboard.

That scribble, now famous as the Drake equation, lays out a process for answering his question. You start out with the formation rate of sunlike stars in the Milky Way, then multiply that by the fraction of such stars that have planetary systems. Take the resulting number and multiply that by the number of life-friendly planets on average in each such system—planets, that is, that are about the size of Earth and orbit at the right distance from their star to be hospitable to life. Multiply that by the fraction of those planets where life arises, then by the fraction of those where life evolves intelligence, and then by the fraction of those that might develop the technology to emit radio signals we could detect.

The final step: Multiply the number of radio-savvy civilizations by the average time they're likely to keep broadcasting or even to survive. If such advanced societies typically blow themselves up in a nuclear holocaust just a few decades after developing radio technology, for example, there would probably be very few to listen for at any given time.
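The chain of multiplications described above is simple enough to sketch in a few lines. Every factor value below is a placeholder invented for illustration; as the article goes on to note, in 1961 only the first of them was actually known:

```python
def drake_equation(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """N = R* x fp x ne x fl x fi x fc x L: estimated number of
    detectable radio-broadcasting civilizations in the Milky Way."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Placeholder inputs -- all guesswork, per the article's point.
n = drake_equation(
    r_star=1.0,       # sunlike stars formed per year in the galaxy
    f_p=0.5,          # fraction of those stars with planetary systems
    n_e=2.0,          # life-friendly planets per such system
    f_l=0.1,          # fraction of those planets where life arises
    f_i=0.01,         # fraction where life evolves intelligence
    f_c=0.1,          # fraction developing detectable radio technology
    lifetime=10_000,  # years a civilization keeps broadcasting
)
print(round(n, 6))  # prints 1.0 with these made-up inputs
```

The structure makes the article's point visible: because the factors multiply, uncertainty in any single term, especially the broadcast lifetime, swings the final estimate by orders of magnitude.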

The equation made perfect sense, but there was one problem. Nobody had a clue what any of those fractions or numbers were, except for the very first variable in the equation: the formation rate of sunlike stars. The rest was pure guesswork. If SETI scientists managed to snag an extraterrestrial radio signal, of course, these uncertainties wouldn't matter. But until that happened, experts on every item in the Drake equation would have to try to fill it in by nailing down the numbers—by finding the occurrence rate for planets around sunlike stars or by trying to solve the mystery of how life took root on Earth.

It would be a third of a century before scientists could even begin to put rough estimates into the equation.

More here.

The Poems (We Think) We Know: Emily Dickinson


Alexandra Socarides in The LA Review of Books:

It’s one thing to say that Dickinson’s poems are uncertain, or complicated, or contradictory (all of which they are), but it’s an entirely other thing to compound that uncertainty with my own. In my basement, visitors wanted to connect with me by reciting the first line of a Dickinson poem that clearly summed something up for them. I was happy to let them do that, even if I was a little freaked out by the whole experience.

But, in the end, Dickinson got me, as she always does.

One of the mysterious things about poetry is how a reader can walk away from a poem with what he or she thinks is a clear sense of its message or moral, when really the poem itself says something far more complicated than that. One famous example is Robert Frost’s “The Road Not Taken,” which tends to be read as a call for people to strike out on their own independent course, when really Frost marks no substantive difference between the two roads in his poem. (In an episode of the first season of “Orange Is the New Black,” Piper Chapman explains as much to her cellmates, who are highly annoyed by her need to complicate the meaning of the poem for them.) A similar thing happens with “I’m Nobody! Who are you?” The message of this poem is almost always taken to be that it is a mistake to seek fame, that it is preferable to be a nobody than a somebody. Coupled with the knowledge that Dickinson only published ten poems in her lifetime, this poem becomes (often for aspiring writers) a statement of artistic intent, a declaration of the joys of private, anonymous art making and a rejection of publicity. But in order to make this poem into a manifesto on the pleasures of the private world versus the dreariness of the public world, one has to make a variety of assumptions about both Dickinson and poetry.

More here.

Ground Down to Molasses: The Making of an American Folk Song


Dave Byrne in Boston Review (photo: Alan Lomax, Library of Congress):

The folk revival that emerged in New York in the mid-twentieth century took as its texts two primary sources: the six-volume Anthology of American Folk Music curated by the filmmaker Harry Smith, and the field recordings of John and Alan Lomax, particularly the acetates featuring obscure blues singers and inmates from the prison farms that flourished, if that is the word, in the southern states. Unlike Smith’s anthology, which was culled from a number of traditions and ethnicities, Lomax’s recordings during this period focused almost exclusively on African American music.

After the first recording of “Ain’t No More Cane” in 1933—which we will wander back to—the song goes largely silent, enters a period of dormancy from which it will not emerge for several decades. When it is recorded again and released in 1958, it comes from an unlikely source—Lonnie Donegan, the Scottish-born “king of skiffle.”

Skiffle, an idiosyncratic blend of early jazz, blues, and jug-band music, is closely associated with its mid-century practitioners in the United Kingdom. Its roots, however, lie in African American culture, and the term gained currency in pre-Depression Chicago. But by the early 1940s, the homemade ethos of the music—guitars, banjos, jugs, tea chest, kazoos—was pretty much a done deal.

In the late 1950s skiffle got a second wind in England. Donegan had several huge novelty hits in the idiom, with “Does Your Chewing Gum Lose Its Flavour (On the Bedpost Overnight?)” and “My Old Man’s a Dustman.” Donegan immersed himself in American jazz, blues, and forgotten work songs. His keening tenor and frailing banjo on “Ain’t No More Cane” conjure Appalachia more than East Texas, but the verses are terse and spooky. He cuts to the heart of it somehow.

More here.

Every Datum Tells a Story: The dawning of the age of meta-information

Mark P. Mills and M. Anthony Mills in City Journal:

How will these technologies transform human communication? The beginnings of an answer can be found in the nearly century-old writings of German critic Walter Benjamin, who came of age during the first information revolution. He belonged, as he put it, to a “generation that had gone to school on a horse-drawn streetcar” but that “now stood under the open sky in a countryside in which nothing remained unchanged but the clouds . . . and the tiny, fragile human body.” Among these changes, Benjamin thought, was the loss of an authentic form of human experience—storytelling. Before the telegraph and the printing press, storytellers communicated by word of mouth. Stories imparted practical wisdom and timeless lessons that were seamless and intuitive to the listener. These lessons were preserved in a collective cultural memory. But the era of storytelling was overtaken by the era of information, a wholly new mode of communication revolving around facts, rather than experience. The purpose of facts is to inform, not teach. Information, Benjamin says, is “understandable in itself.” It does not need to be preserved but is “consumed” and forgotten as soon as it becomes “old.” Information is not timeless but timely. The communication of information requires not storytellers but intermediaries. Benjamin’s time saw the rise of an expanding cadre of professional journalists critical to the process of selecting, interpreting, and communicating facts. Moreover, information was not universally accessible; its consumption was subject to social, educational, and financial constraints.

Today, we stand at a historical turning point similar to the one that Benjamin lived through. A generation that went to school in buses driven by human beings will likely live to see a world of vehicles driven by robots. Data sensors and recorders are embedded in machinery, the environment, and even our bodies. Wireless networks share the data, and algorithms sort, analyze, and store it in virtual collective-memory banks, compiling treasure troves of—as yet—mostly untapped knowledge. More than 80 percent of all data remain beyond the reach of today’s nascent big-data analytics.

More here.

Fourth of July: 169 years ago, Thoreau moved into his Walden Pond cabin

Danny Heitman in Christian Science Monitor:

On July 4, 1845, Thoreau moved into the cabin on Walden Pond. Soon after, Harvard asked what he had been up to and Thoreau detailed his adventures for his alma mater.

It’s class reunion season across America – the time when alumni of high schools and colleges gather to see how well they’ve fared when compared with their former classmates. And some of us might naturally wonder how we’ll measure up, in terms of professional and personal accomplishments, when we return to our alma maters. For a little bit of courage, we can always consult Henry David Thoreau, who moved into the cabin he built at Walden Pond 169 years ago today, on July 4, 1845. He stayed two years, sustaining himself by taking on odd jobs, but spending most of his time watching nature, thinking, and writing. He left Walden Pond in the autumn of 1847, moving into the home of his mentor and benefactor Ralph Waldo Emerson to help care for the family while Emerson was in Europe. That’s when a letter arrived from the secretary of Thoreau’s class at Harvard, from which Thoreau had graduated in 1837. The secretary was sending along one of those “Where Are They Now?” questionnaires, apparently popular even then, in which graduates could brag about how well they’d done since leaving campus.

Thoreau had little to show for his decade away from an exclusive Ivy League school – little, that is, by the yardstick that most of the world used to measure success. He had no spouse, no regular employment and only a handful of possessions. But Thoreau was confident enough in his peculiar sense of purpose to fill out the questionnaire matter-of-factly. Asked to state his occupation, he suggested that he was something of an overachiever, having not one job, but many: “I am a Schoolmaster – a Private Tutor, a Surveyor – a Gardener, a Farmer – a Painter, I mean a House Painter, a Carpenter, a Mason, a Day-laborer, a Pencil-Maker, a Writer, and sometimes a Poetaster.” Later in the questionnaire, Thoreau elaborated on his professional ambitions – or lack thereof: “I have found out a way to live without what is commonly called employment or industry attractive or otherwise. Indeed my steadiest employment, if such it can be called, is to keep myself at the top of my condition, and ready for whatever may turn up in heaven and earth.”

More here.

My Dear Americans

From Chapati Mystery:

Here is a short film made by Arpita Kumar, being screened at the PBS Online Film Festival.

Here is what Kumar told us about the short: I made My Dear Americans during my Project Involve fellowship at Film Independent in Los Angeles. We were asked to pitch short film projects focused on the theme of traditions. I thought it would be interesting to focus on an American tradition but from the point of view of an outsider. I chose to build a narrative around the 4th of July tradition since it’s the most American and patriotic of the holidays. And, I decided on a Sikh couple as the outsiders largely because around that time there was a shooting in a Sikh Gurudwara in Wisconsin. The white supremacist perpetrator associated the Sikhs with Osama Bin Laden and it shocked me that there was such ignorance about the Sikh community still. It had been more than a decade since 9/11 and the backlash continued. I realized that we cannot do much about the ignorance of others. What we can do is change our reaction to their ignorance. And, that inspired the film and the actions of the wife, Tejpreet.

I arrived in the U.S. eleven years ago with the unbearable enthusiasm of Baldev – the husband in the film – for all things American. Over the years, the enthusiasm has not tapered but my mind has gained a more complex understanding of national identity, displacement, and the idea of home. The film is a window into that mindscape. Additionally, every time I start a film I give myself a challenge and for this one it was to tell a story with as little dialogue as possible. Watch and let me know if I succeeded. Also, vote.

More here.

How Should We Think About the Caliphate?


Owen Bennett-Jones in The LRB (image from Wikimedia Commons):

In its recent propaganda video, Clanging of the Swords: Part 4, the Islamic State of Iraq and Syria (Isis) presented a tightly edited series of grotesque executions. Thirty-eight people were filmed being killed: one man was shot as he ran through the desert trying to escape gunmen in a 4×4; another was trapped in his car; one was at home when Isis broke in and beheaded him in his bedroom. It’s hard to believe that what you’re watching really happened until the relentless inhumanity is interrupted by an occasional human moment. At one point a gunman walks down a row of kneeling young men with their hands tied behind them. He aims a pistol at the back of each man’s head, fires, watches the body slump forward in a pool of blood, moves on to the next in line and repeats the exercise. Then, one of his victims has the idea of trying to save himself by anticipating the shot and, a split second too early, falls forward, pretending to be dead. Needless to say, the ruse doesn’t work. There is also footage of Isis gunmen driving through a town when, for no apparent reason, they stick their Kalashnikovs out of the car windows and fire at two men walking along the pavement. One is hit and collapses. The car moves forward, and the Isis fighters keep firing as their victim lies motionless on the ground. Presumably they want to make sure he’s dead. As they drive away the second pedestrian – amazingly still unharmed – runs for his life in the other direction.

You might think that a film showing your organisation randomly murdering people would not attract new recruits. But Isis’s various communications have achieved two objectives. First, they have terrified the Iraqi army, sapping the soldiers’ will to defend the Iraqi state. Threatening text messages sent direct to their mobile phones reinforce the point. Second, Isis has quickly carved out a global presence. A few weeks ago it seemed that only policy wonks had heard of it. It didn’t even have a settled acronym: some called it Isis, others Isil (Islamic State of Iraq and the Levant – the Arabic supports either). The distinction hardly matters now as the organisation has renamed itself the Islamic State, with its leader, Abu Bakr al-Baghdadi, as its caliph. Whatever it’s called, its pitch relies on glamour shots of earnest young men with dishevelled, flowing hair living in rural settings unsullied by the paraphernalia of modern life – except for the assault rifles and ammunition strapped to their chests. The talk is all about duty, sacrifice and martyrdom.

But in many respects Isis is a very modern organisation. The brochure detailing its 2012-13 activities is like a state-of-the-art corporate report. The most striking page, with slick graphic design, has 15 silhouetted icons – time bombs, handcuffs, a car, a man running – with each representing a field of activity: roadside bombs, prisoner escapes, car bombs and the clearance of apostates’ homes.

More here.

Johnson: Simpler and more foreign


R.L.G. in The Economist (via Tunku Varadarajan):

SEVERAL weeks ago, Johnson discussed his debate with Nicholas Ostler about the lingua franca of the future. Johnson thinks that English has a very long run ahead of it. Mr Ostler sees English’s time as coming to an end, to be replaced by machine-translation tools that will remove the need for people to learn to speak, read and write a lingua franca. But we agreed that whatever the long run might look like, the next few decades are set. No language has anything like a chance of displacing English.

Interestingly, about two-thirds of English-speakers are not first-language speakers of English. To put it another way: English no longer belongs to England, to superpower America, or even to the English-speaking countries generally. Rather, English is the world’s language. What happens to a language when it becomes everybody’s? Shaped by the mouths of billions of non-native speakers, what will the English of the future look like?

A look into the past can give us an idea. English is of course not the first language learned by lots of non-natives. When languages spread, they also change. And it turns out, they do so in specific directions.

For example, a 2010 study by Gary Lupyan and Rick Dale found that bigger languages are simpler. In more precise terms, languages with many speakers and many neighbours have simpler systems of inflectional morphology, the grammatical prefixes and suffixes (and sometimes “infixes”) that make languages like Latin, Russian and Ancient Greek hard for the foreign learner. Contrary to educated people’s stereotypes, the tiny languages spoken by “stone-age” or isolated tribes tend to be the world’s most complicated, while big ones are less so, by this metric.

What Messrs Lupyan and Dale found through a statistical look at thousands of languages, John McWhorter, a linguist at Columbia University, found in a detailed study of just five. In his 2007 book “Language Interrupted”, he asked why certain big, prestigious languages seem systematically simpler than their ancestors and cousins. English is simpler than German (and Old English); modern Persian is a breeze next to Old Persian and modern Pushtu; modern spoken Arabic dialects have lost much of the grammatical curlicues of classical Arabic; modern Mandarin is simpler than other modern Chinese languages; and Malay is simpler than related Austronesian languages. Mr McWhorter’s conclusion, in simple terms, is that when lots of adults learn a foreign language imperfectly, they do without unnecessary and tricky bits of grammar. (Most languages have enough built-in redundancy for grammars to be more complicated than they have to be.) Modern Mandarin is a perfect example of a language almost completely devoid of inflectional morphology, all those prefixes and suffixes. All languages have their complexities, but Mr McWhorter believes that Mandarin, English, Persian, Malay and Arabic dialects are all clearly simpler than they used to be.

What, then, can we predict English will lose if the process goes on?

More here.