The Caribbean Zola

Craig Lambert in Harvard Magazine:

In the spring of 2012, Brown University hosted an extraordinary academic conference. “Being Nobody?” honored the thirtieth anniversary of the publication of Slavery and Social Death by Orlando Patterson, Harvard’s Cowles professor of sociology. Giving a birthday party for a scholarly book is a rarity in itself. Even more unusual, the symposium’s 11 presenters were not sociologists. They were classicists and historians who gave papers on slavery in ancient Rome, the neo-Assyrian empire, the Ottoman Middle East, the early Han empire, West Africa in the nineteenth century, medieval Europe, and eighteenth-century Brazil, among other topics. “I’m not aware of another academic conference held by historians to celebrate the influence of a seminal work by a social scientist writing for a different discipline,” says John Bodel, professor of classics and history at Brown, one of the organizers.

But Patterson is no ordinary academician. “Orlando is one of a kind—the sheer scope and ambition of his work set him apart from 99 percent of social scientists,” says Loic Wacquant, JF ’94, professor of sociology at Berkeley. “In an era when social scientists specialize in ever-smaller objects, he is a Renaissance scholar who takes the time to tackle huge questions across multiple continents and multiple centuries. There was another scholar like this in the early twentieth century, named Max Weber. Orlando is in that category.”

More here.

Azar Nafisi: ‘Books are representative of the most democratic way of living’

Viv Groskop in The Guardian:

Azar Nafisi, 58, is an Iranian writer and professor of English literature. She lives in Washington DC and became an American citizen in 2008. In 1995 she quit her job as a university lecturer in Tehran and taught a small group of students at home, discussing works considered controversial in Iran at the time, such as Lolita and Madame Bovary. Her 2003 book based on this experience, Reading Lolita in Tehran, was on the New York Times bestseller list for 117 weeks and won a string of literary awards. Nafisi’s latest non-fiction book, The Republic of Imagination (Viking), is described as “a passionate tribute to literature’s place in a free and enlightened society”.

What motivated the latest book?

In the last chapter of Reading Lolita in Tehran I talk about how my students were uncritically in love with this world they could not connect to physically – the west. I wanted them to know that this was an illusion. That there were serious critiques of any system, no matter how wonderful. When I came here [to the US], I realised how the ideal of freedom is being eroded. One canary in the mine is the denigration of ideas.

What do you mean by this? What are the signs?

The inequalities of the education system [in the US]. You are also experiencing this in Britain. Where public schools [ie state schools] are virtually being dismantled. Where children are deprived of music, art and fiction more and more. And where all the privilege goes to the private schools. This is not the America I want my children to grow up in.

Why is fiction in particular important in solving all this?

The importance of ideas and the imagination is that they really defy borders and limitations. Books are representative of the most democratic way of living. There’s a James Baldwin quote about feeling all alone and isolated until you read Dostoevsky and you discover that someone who lived a hundred years ago connects to you – and you don’t feel lonely any more.

The premise of this book is that “to deny literature is to deny pain and the dilemma that is called life”. In what way can fiction help us with this dilemma?

Fiction confronts a great many things that we cannot fully confront in real life. Fiction is the ability to be multi-vocal and to speak through the mind and the heart of even the villain. In doing that, it forces us to face the pain of being human and being transient. It’s what Nabokov talks about: “The conclusive evidence of having lived.”

More here.

Zero Hour

Joanna Scutts in Lapham's Quarterly:

In The Burning of the World, his recently discovered memoir of the first few weeks of World War I, the Hungarian artist, officer, and man about town Béla Zombory-Moldován writes frequently about his attachment to his watch. When he’s wounded in the confusion of battle in the forests of Galicia, he finds the watch unscathed during an agonizing evacuation of the area, and exalts the survival of “my trusty companion, sharer of my fate, the comrade that connected me to my former life.” Much more than a watch, it’s almost a miracle: “Not just an object, but a true and staunch friend. I held it in my left hand and marveled at it as it measured off the seconds.”

How to tell time was a matter of survival and strategy during the Great War, a war in which communication technologies had to advance rapidly to keep pace with the new instruments of battle. The war was a crucible of innovation in destruction, in which chlorine gas, tanks, and heavy artillery choked, crushed, and obliterated human bodies in new ways. Vast armies dug in opposite each other across unprecedented distances—the Western Front alone stretched well over four hundred miles, from the Swiss border to the North Sea. Because much of the infantry went underground, it was no longer possible simply to holler or sound a hunting horn as a signal to attack, nor for regiments to advance proudly, and visibly, together on horseback. Instead, it became necessary to coordinate time and to tell it accurately; both the practice and the phrase “synchronize watches” were born of this need during the war. Officers in crowded trenches watched for second hands to tick down before blowing the whistle and rallying their men, who scrambled up ladders into the waiting gunfire. The term zero hour, the moment of no return, was first recorded in the New York Times in November 1915: “At 5:05 a.m. September 25 a message came to the dugout that the ‘zero’ hour, that is, the time the gas was to be started, would be at 5:50 a.m.” The irony of ascribing a precise time for an attack as uncontrollable and weather-dependent as gas goes unmentioned.

Paul Fussell, in his influential 1975 study The Great War and Modern Memory, notes that sunrise and sunset dominated soldiers’ trench lives and their understanding of the passage of time. These periods of “stand-to” were times of heightened tension and observation, when men would keep watch on the raised fire step and strain their eyes through field glasses for movement. When it wasn’t raining, the skies above the flat, endless fields would burst into color as they waited—an unforgettable combination of beauty and terror. (Fussell writes that “dawn has never recovered from what the Great War did to it.”) Dawn and dusk were unavoidable natural markers of time, both ordinary and mystical. “The darkness crumbles away./It is the same old druid Time as ever,” as Isaac Rosenberg puts it in his 1916 poem “Break of Day in the Trenches.” Dawn is relentless, and soldiers are powerless to hide from it, speed it up, or slow it down. A watch then gives the illusion of controlling time, a sustaining fantasy of life at the front. As Zombory-Moldován suggests, the watch is something more than practical; it’s a link back to a world where a man was free to make his own appointments, to run his own life.

More here.

Everything and Moore

Tim Martin in Aeon (Illustration by Lee Moyer):

Alan Moore is waiting when I get off the train in Northampton, a majestically bearded figure in a hoodie, scanning the crowd that pushes through the turnstiles with a look of fearsome intent. When I wave, the glare becomes a beaming smile. ‘How are you, mate?’ he booms. ‘Splendid, splendid. I thought we’d go for a bit of a walk, so I can show you around and we can work up an appetite.’

Off we go up the hill. Moore swings his stick – a wooden snake coiled around the handle to symbolise his enthusiastic worship of Glycon, a second-century Macedonian snake god – and keeps up a constant flow of arcane local chatter. This station car park, he tells me, used to be King John’s castle, where the First Crusade began. That charmless glass-and-steel building was once a Saxon banqueting hall. Over there was a pub where, ‘if you’d come along here on a Sunday afternoon in the 1920s or ’30s, you’d have found a zebra tied up outside it.’

Before long, tramping through the riverside mud under a railway bridge, we’ve moved on to grander concerns. Moore has embarked on a potted summary of eternalism, the philosophical concept of time that ran through Kurt Vonnegut’s novel Slaughterhouse-Five (1969), played a part in his own revolutionary superhero comic Watchmen (1986-87), and is the central conceit behind ‘Jerusalem’, the million-word mega-novel the first draft of which he has now, after more than a decade, shepherded to its conclusion.

In essence, eternalism proposes that space-time forms a block – ‘imagine it as a big glass football’, Moore suggests – where past and future are endlessly, immutably fixed, and where human lives are ‘like tiny filaments, embedded in that gigantic vast egg’. He gestures around him at the rubbish-strewn path, his patriarch’s beard waving in the wind. ‘What it’s saying is, everything is eternal,’ he tells me. ‘Every person, every dog turd, every flattened beer can – there’s usually some hypodermics and condoms and a couple of ripped-open handbags along here as well – nothing is lost. No person, no speck or molecule is lost. No event. It’s all there for ever. And if everywhere is eternal, then even the most benighted slum neighbourhood is the eternal city, isn’t it? William Blake’s eternal fourfold city. All of these damned and deprived areas, they are Jerusalem, and everybody in them is an eternal being, worthy of respect.’

If this mixture of local history, cosmological speculation and messianic mysticism sounds bewildering, then perhaps you haven’t been reading enough Alan Moore lately.

More here.

Dear White People’s Missed Opportunities

Michael Collins in In These Times:

Set in the present day, the film follows the lives of five black people at the fictitious Ivy League college Winchester as they navigate race, love and ever-shifting personal identities. Broken into a series of blithely titled chapters, the film is billed as “a satire about being a black face in a white place.”

The film, however, is less a satire in the sense of using “wit to expose stupidity” than a mockumentary whose humor comes from its earnestness, in the vein of films like Best in Show. Perhaps this is because, as the title suggests, the work is narrowly pointed at white America. Or, more specifically, the type of liberal white America that prefaces racist statements with “I’m not racist, but…” and when challenged responds, “But my best friend is a black!” For those who already know that all black people aren’t the same (we have different names for a reason!), and that race, class and sexuality are complex parts of a greater whole, the film will have little critical edge. But for those who haven’t taken Race in America 101, the film may yet be productive.

Through a series of occasionally disjointed chapters, we are presented with a host of college archetypes: the charismatic jock played by the astonishingly beautiful Brandon Bell; the black militant played by Tessa Thompson; the pushover nerd played convincingly by Tyler Williams of Everybody Hates Chris fame; the society queen with a terrible secret (and an amazing wardrobe of pearl necklaces and backless dresses) played by Teyonah Parris; and the incorrigible dean played by Dennis Haysbert. Throughout, the film adds various layers to these one-dimensional caricatures by highlighting their “performance of blackness.”

For those who slept through critical race theory, it’s now taken for granted that there is no essential black experience. Rather, blackness is a social, political and economic construct that individuals engage with as society, the economy or our personal desires dictate. The film revels in multiplicity of identity, internal contradictions and the general sense of confusion and misidentification that characterize public discussions of race.

More here.

The Crippling Sadness That Overtook Evelyn Waugh

John Banville's 1995 essay at The New Republic:

This was a very low period for Waugh. There was an urgent necessity for him to find a way of making a living, and eventually, with deep foreboding, he took a post as a teacher at Arnold House Preparatory School on the north coast of Wales. This grotesque establishment was the model for the hilariously awful Llanabba Castle in Decline and Fall. He did not stay there for long, and found another teaching job at a more nearly normal school in Buckinghamshire, from which eventually he was sacked, apparently for drunkenness. Waugh was not cut out to be a teacher.

He did not really know what he was cut out to be. He had started to write, and some short stories had been published, but he had not yet given up hope of being a painter. He also spent a brief, happy few months taking carpentry lessons with a view to embarking on a career as a cabinetmaker. He did some journalistic work, and began his first book, a life of the Pre-Raphaelite painter Dante Gabriel Rossetti, but the most important event of these years was his meeting Evelyn Gardner on April 7, 1927. (They would come to be known to their friends as He-Evelyn and She-Evelyn.)

more here.

The Grim Future if Ebola Goes Global

Maryn McKenna at Wired:

It is not guaranteed, they say, that a successful vaccine against Ebola can be “developed, produced, and distributed” in time, and in large enough amounts, to throw a fence of containment around the disease.

If not, they warn, it is possible that the rest of the world’s reaction could trigger the next global financial crisis.

As someone whose professional specialty is covering epidemics (HIV, the anthrax attacks, SARS, H5N1, H1N1, lots of smaller outbreaks), I reluctantly have to conclude: Lanard and Sandman are not being alarmist here. Imagine that Ebola cannot be contained; think back to the events of this weekend; and then imagine that reaction multiplied thousands of times. It isn’t a big leap to the suspicion, disruption and expense that will then be triggered in response to any travelers from the region. From there, it isn’t much of a further leap to closed borders, curbs on international movement, disruption in global trade, cuts in productivity, even civil unrest and the opportunities that unrest offers to extremist movements. None of that is far-fetched, if Ebola is not controlled.

more here.

“The Death of Klinghoffer,” at the Met

Alex Ross at The New Yorker:

The protest failed because it relied on falsehoods: the opera is not anti-Semitic, nor does it glorify terrorism. Granted, Adams and his librettist, Alice Goodman, do not advertise their intentions in neon. The story of the Achille Lauro hijacking is told in oblique, circuitous monologues, delivered by a variety of self-involved narrators, with interpolated choruses in rich, dense poetic language. The terrorists are allowed ecstatic flights, private musings, self-justifications. But none of this should surprise a public accustomed to dark, ambiguous TV shows like “Homeland.” The most specious arguments against “Klinghoffer” elide the terrorists’ bigotry with the attitudes of the creators. By the same logic, one could call Steven Spielberg an anti-Semite because the commandant in “Schindler’s List” compares Jewish women to a virus.

In the opera, the opposed groups follow divergent trajectories. The terrorists tend to lapse from poetry into brutality, whereas Leon Klinghoffer and his wife, Marilyn, remain robustly earthbound, caught up in the pleasures and pains of daily life, hopeful even as death hovers. Those trajectories are already implicit in the paired opening numbers, the Chorus of Exiled Palestinians and the Chorus of Exiled Jews. The former splinters into polyrhythmic violence, ending on the words “break his teeth”; the latter keeps shifting from plaintive minor to sumptuous major, ending on the words “stories of our love.”

more here.

Laughing ourselves to life

Howard Jacobson in New Statesman:

If I were to give this essay a title, it would be “Waiting for Calvin”. Not John Calvin the theologian, nor Calvin Klein the fashion designer, but Calvin, a Navajo baby whose first laugh I travelled to Arizona in 1995 to film as part of a series of television programmes I was making about comedy. It’s a nerve-racking business waiting for a baby to laugh, particularly if you have a camera crew standing by in another state, but Calvin’s laugh was as important to my film as it was to his family and community. The Navajo celebrate a baby’s first laugh as a rite of passage, a moment in which the baby laughs himself, as it were, out of inchoate babydom and into conscious humanity. It’s a wonderful concept and grants a primacy to laughter that we, who probably laugh too automatically and certainly far too much, would do well to think about. If it’s laughter that makes us human, or at least kick-starts the process of our becoming human, what does that say about what being human is?

It is sometimes argued that laughter is what distinguishes us from animals, but not everyone would agree that we have laughter to ourselves. Thomas Mann, for example, wrote an essay about his dog Bashan in which he made a claim for Bashan’s demonstrating many of the signs of mirth. And that’s before we get on to the tricky question of internal laughter – that appreciation of ironical mishap or absurd situation that even in human beings doesn’t always issue in a smile, never mind a laugh. Laughter, we can say, is an act of comprehension – whether immediate or arising out of rumination – but which of us can know for sure how much animals comprehend of what they see and how long they go on thinking about it?

More here.

‘Data smashing’ could automate discovery, untouched by human hands

From KurzweilAI:

From recognizing speech to identifying unusual stars, new discoveries often begin with comparison of data streams to find connections and spot outliers. But simply feeding raw data into a data-analysis algorithm is unlikely to produce meaningful results, say the authors of a new Cornell study. That’s because most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert to specify what aspects of the data are relevant for comparison, and what aspects aren’t. But these experts can’t keep up with the growing amounts and complexities of big data. So the Cornell computing researchers have come up with a new principle they call “data smashing” for estimating the similarities between streams of arbitrary data without human intervention, and even without access to the data sources.

How ‘data smashing’ works

Data smashing is based on a new way to compare data streams. The process involves two steps.

  1. The data streams are algorithmically “smashed” to “annihilate” the information in each other.
  2. The process measures what information remains after the collision. The more information remains, the less likely it is that the streams originated from the same source.

Data-smashing principles could open the door to understanding increasingly complex observations, especially when experts don’t know what to look for, according to the researchers.
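The flavor of the two-step process above — collide two streams and measure how much information survives — can be loosely illustrated with an off-the-shelf compression trick, the normalized compression distance (NCD). To be clear, this is an analogy of my own, not the Cornell researchers' actual annihilation algorithm, and the example streams are made up:

```python
import random
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a crude stand-in for the
    "information remaining" after two streams are combined.
    Near 0 means the streams share almost all their structure;
    values near 1 mean they are essentially unrelated."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two streams produced by the same (periodic) source...
same_a = b"0110" * 256
same_b = b"0110" * 256

# ...and a stream from a different, pseudo-random source.
rng = random.Random(42)
other = bytes(rng.choice((48, 49)) for _ in range(1024))  # ASCII '0'/'1'

print(ncd(same_a, same_b))  # low: almost nothing survives the "collision"
print(ncd(same_a, other))   # higher: the streams do not cancel each other out
```

As in data smashing proper, no human told the comparison which features of the streams matter; the similarity judgment falls out of the streams themselves.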

More here.

Wednesday Poem

A Ball Rolls on a Point

The whole ball
of who we are
presses into
the green baize
of a single tiny
spot. An aural
track of crackle
betrays our passage
through the
fibrous jungle
it's hot and
desperate. Insects
spring out of it.
The pressure is
intense and the
sense that we've
lost proportion.
As though bringing
too much to bear
too locally were
our decision.

by Kay Ryan
from The Niagara River
Grove Press, 2005

Against Carceral Feminism

Victoria Law in Jacobin (image “Prison Blueprints.” Remeike Forbes/Jacobin):

Casting policing and prisons as the solution to domestic violence both justifies increases to police and prison budgets and diverts attention from the cuts to programs that enable survivors to escape, such as shelters, public housing, and welfare. And finally, positioning police and prisons as the principal antidote discourages seeking other responses, including community interventions and long-term organizing.

How did we get to this point? In previous decades, police frequently responded to domestic violence calls by telling the abuser to cool off, then leaving. In the 1970s and 1980s, feminist activists filed lawsuits against police departments for their lack of response. In New York, Oakland, and Connecticut, lawsuits resulted in substantial changes to how the police handled domestic violence calls, including curtailing officers’ discretion not to make arrests.

Included in the Violent Crime Control and Law Enforcement Act, the largest crime bill in US history, the Violence Against Women Act (VAWA) was an extension of these previous efforts. The $30 billion legislation provided funding for one hundred thousand new police officers and $9.7 billion for prisons. When second-wave feminists proclaimed “the personal is the political,” they redefined private spheres like the household as legitimate objects of political debate. But VAWA signaled that this potentially radical proposition had taken on a carceral hue.

At the same time, politicians and many others who pushed for VAWA ignored the economic limitations that prevented scores of women from leaving violent relationships.

More here.

Morality and Discourse in Serbia

Keith Doubt on Eric Gordy's Guilt, Responsibility, and Denial: The Past at Stake in Post-Milošević Serbia, in Berfrois (Belgrade, Serbia. Photograph by Jamie Silva):

The intellectual integrity of cultural anthropology is based largely on its commitment to cultural relativism as a principled notion. Cultural relativism is the principle from which the discipline achieves its sense of empirical objectivity. Cultural differences are cherished as just that, cultural differences. No difference is stipulated as superior or inferior, better or worse. The commitment guards against ethnocentric judgments, colonizing prejudices, and, worst of all, grand theorizing with metaphysical pretense. This ethos in the discipline of cultural anthropology guides the recent book by Eric Gordy titled Guilt, Responsibility, and Denial: The Past at Stake in Post-Milošević Serbia.

While cultural initiatives rarely investigate and never sentence, they offer some of the keys to understanding that have been missing from political legal projects: the ability to hear and identify with the lived experiences of individuals, a route to engagement that participants in the public can understand, and openness to interpretation that constitutes an invitation to dialogue. (p. 179)

There is a contrasting notion in the social sciences to the principle of cultural relativism, namely, the assumption that social science has a valid knowledge-base and ethical responsibility from which to demonstrate how some societies are healthier than others and how some social structures are better for community life. Social science depicts certain normative orientations and collective sentiments as more functional for the vitality of human life and sociability. For example, human rights scholars assume that a genuine respect for the principle of human rights is good: good for people in society, good for their communities, and good for their governments. Gordy understands this perspective but recognizes its unintended consequences, given his political knowledge of what Max Weber calls the ethical irrationality of the world in his famous lecture, “Politics as a Vocation.” In politics, it is necessary to employ force in realizing one’s values. When, however, force is employed, no matter how good the intentions behind its use, bad results follow or evil consequences occur. Weber calls this the ethical irrationality of the world, which is the reason for the sense of disenchantment that characterizes the spirit of the modern world. In politics, actions whose motives are seemingly good can lead to bad results. The reverse is also true; actions whose motives are seemingly bad can lead to good results. Weber calls this the paradox of consequences, an ever-repeating empirical and historical pattern, and Gordy understands this matter well. There is a hubris that informs the forceful use of law and legal process at both the national and the international level, and Gordy wants to debunk this hubris that guides international interventions in societies experiencing conflict and social violence.

To introduce the structure of his book, Gordy writes, “the ordering of the chapters is meant to lead readers through the logic that brought the study from apparently clear and relatively simple moral questions to greater complexity and uncertainty, and to an insistence on the importance of the cultural and social context” (p. xv). After relatively simple moral questions implode upon themselves when confronted with empirical scrutiny and historical accounts, the significance of cultural variables within their own milieu and within their own historical context assume their rightful place.

More here.

Genetically Modified Organisms Risk Global Ruin

Images (1)

Over at The Physics arXiv Blog:

Taleb and co begin by making a clear distinction between risks with consequences that are local and those with consequences that have the potential to cause global ruin. When global harm is possible, an action must be avoided unless there is scientific near-certainty that it is safe. This approach is known as the precautionary principle.

The question, of course, is when the precautionary principle should be applied. Taleb and co begin by saying that their aim is to place the precautionary principle within a formal statistical structure that is grounded in probability theory and the properties of complex systems. “Our aim is to allow decision-makers to discern which circumstances require the use of the precautionary principle and in which cases evoking the precautionary principle is inappropriate.”

Their argument begins by dividing potential harm into two types. The first is localised and non-spreading. The second is propagating harm that results in irreversible and widespread damage. Taleb and co say that traditional decision-making strategies focus on the first type of risk where the harm is localised and the risk is easy to calculate from past data.

In this case, it is always possible to make a mistake when decision-making about risk. The crucial point is that when the harm is localised, the potential danger from a miscalculation is bounded.

By contrast, harm that is able to propagate on a global scale is entirely different. “The possibility of irreversible and widespread damage raises different questions about the nature of decision-making and what risks can be reasonably taken,” say Taleb and co. In this case, the potential danger from a miscalculation can be essentially infinite. It is in this category of total ruin problems that the precautionary principle comes into play, they say.
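The bounded-versus-ruin distinction can be made concrete with a toy Monte Carlo simulation (my illustration, not an example from Taleb's paper): compare losses drawn from a bounded distribution with losses drawn from a fat-tailed Pareto distribution whose theoretical mean is infinite, so that a single draw can dwarf everything observed before it.

```python
import random

rng = random.Random(0)
N = 100_000

# Localized harm: losses bounded in [0, 1). A miscalculated estimate of the
# average loss can be wrong by at most 1 -- the cost of a mistake is bounded.
bounded = [rng.random() for _ in range(N)]

# Ruin-type harm: Pareto-tailed losses with tail index alpha = 0.5, drawn by
# inverse transform (u in (0, 1] maps to u**-2). The theoretical mean is
# infinite: no amount of past data pins down the size of the next catastrophe.
fat = [(1.0 - rng.random()) ** -2 for _ in range(N)]

bounded_mean = sum(bounded) / N
fat_max = max(fat)

print(f"bounded sample mean:     {bounded_mean:.3f}")  # stable across runs
print(f"largest fat-tailed loss: {fat_max:.3e}")       # one draw dominates
```

In the bounded case the sample mean is a trustworthy guide to future losses; in the fat-tailed case the historical record systematically understates the worst outcome still to come, which is exactly the regime where, on Taleb and co's argument, the precautionary principle applies.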

More here.