Occupied

by Maniza Naqvi

Portrait of a Lawyer, Dr. Fritz Glaser, 1921

I called this essay “Owning our Stories” when it was published as a paper for a conference on sustainable development held in Islamabad in 2003. At the time I wrote it, I was becoming increasingly anxious and worried about one of the greatest dangers facing the world: the justification of terror and war through the dangerous revival of a singular and value-laden narrative and image of good and evil, with its time-released poison of hate.

At the end of October 2012 we are all aware of the results of this narrative: there are at least four wars underway that are justified through it. There is the surveillance of Muslims in the US (here). There is the 2010 decision of the Supreme Court of the United States in Citizens United v. Federal Election Commission, which ruled that under the First Amendment corporations are people whose election spending cannot be restricted (here). Private military and security corporations are participating in the prosecution of wars in Libya, Iraq, Afghanistan and Pakistan (here, here, here, here and here). There is the National Defense Authorization Act, which allows the indefinite detention, without trial, of anyone in the world, including American citizens (here and here). Under the provisions of the Military Commissions Act of 2006, the President can declare anyone an enemy combatant and order their execution or assassination (here and here). The President of the United States has a kill list and can and does order extrajudicial killings, including by drone attack (here, here, here, here, here, here).

I was invited to the conference in December 2003 in Islamabad as an artist, as a writer:

All Our Stories: Stories, I think, do not reveal the truth; they do, however, expose untruths. A multitude of narratives, all versions of perceived reality, prevents the rise and tyranny of a singular narrative. And in this way, through a multitude of stories, a balance is maintained, and truth, whether it exists or not, is safeguarded by not being singled out. In receiving these narratives we are able to reason that all versions matter; that all must be given consideration; that all opinions must be questioned; and that all perceptions have validity. All truths are untruths; all untruths are true. In the absence of a multitude of narratives, reason remains ruined.

I see reason ruined every day in newspapers, in images on TV channels, and in stacks of books, the so-called literature of experts on all things Muslim, Pakistani and Middle Eastern. One of the greatest dangers facing the world today is the dangerous revival of a singular, value-laden narrative of good and evil, with its time-released poison of hate. This view perceives the world in terms of fenced-in real estate, not as Earth, and in terms of corporate interests, not cooperation. These storytellers, with their narratives of antipathy, are given credence and branded as secular even as they view the world through an optic of fear and control while weaving stories full of hate: stories that justify the existing divisions in our world, geographical, social and economic. Language today continues to be used as a weapon, representing whole peoples in dangerous ways instead of building understanding.

Read more »

Teen Werewolves

by Kevin S. Baldwin

We really do cohabit the earth with mythical creatures. Werewolves have recently occupied my home. Let me explain: two of my kids are young teens who have transformed practically overnight into hairy, musky, snarling, nocturnal beasts with insatiable appetites for food, media, novelty, and the company of other werewolves.

Of course, I am not the first to make this lupine-teen connection, and I can still appreciate the situation from my kids' perspectives: One day you're a fairly carefree child and suddenly you have hairs sprouting in new places, insistent demands from your digestive and reproductive tracts governing nearly every decision (tubular hells?), and are facing fear and loathing from peers and parents alike while trying to decipher and navigate a suddenly unfamiliar, yet vitally important social landscape.

Adolescence is a human universal, yet some features make our 21st-century, First-World situation a bit unusual. One is that our kids are entering adolescence earlier than ever. Whatever mismatches exist between our bodies and brains during this transitional period may be exaggerated by this shift in timing. An immature mind partially in control of a mature body is less than ideal for lots of reasons. Another feature of our society is that it fetishizes teenhood and young adulthood to an extraordinary degree. Yet another feature unique to our situation may be that so many more of us actually survive adolescence and live to a ripe old age. For much of our evolutionary history, the aspects of our life cycles that mattered most played out while we were still essentially adolescents. Teen marriages were common. Vanishingly few of us made it past the age of 30. No wonder we, as a species, are such crooked timber! Only recently have there been enough older folk in the population to have much of an effect on population-level characteristics, or even merely to reflect on the phenomenon of puberty itself.

Adolescence is akin to metamorphosis from a tadpole into a frog or a caterpillar into a butterfly plus all the angst of being somewhat self-aware. You couldn't pay me enough to relive those years, yet I suppose it is a form of karma to have to relive adolescence from an adult's perspective:

Could I have possibly been this irrational, self-absorbed, and ungrateful?

Afraid so.

Should I try to impart to my kids what little I've learned in the roughly three decades that have elapsed since my own adolescence?

Of course.

Will it do any good?

Probably not. They will have to learn these lessons themselves.

We are consigned to an intergenerational version of “Groundhog Day”: the same mistakes over and over again, with seemingly little chance of resolution. Perhaps this is one aspect of the cycle of suffering that the Buddha alluded to. In some ways youth is wasted on the young, and by the time we begin to figure it all out, we’re in decline or perhaps even nearing the end of our time on this planet.

Read more »

Atul Gawande: Excellence Is Recognizing Details, Failures

From Harvard Magazine:

In the professional world, what separates greatness from mere competence? Why is a cystic fibrosis treatment center in Minnesota miles ahead of a similar program in Cincinnati? Why are certain teachers getting first-rate results in the classroom when others are merely getting by?

Atul Gawande, a Harvard Medical School professor, surgeon at Brigham and Women’s Hospital, and New Yorker staff writer who has traveled the country researching answers to this question, says the answer has little to do with income level, education, or high intelligence. The key to being great at any given profession, he says, is the ability to recognize failure. “What I found over time, trying to follow and emulate people that were focused on achieving something more than competence, is that they weren’t smarter than anybody else, they weren’t geniuses,” Gawande told an audience at the Harvard Graduate School of Education’s Askwith Forum on Wednesday. “Instead they seemed to be people that could come to grips with their inherent fallibility—fallibility in the systems that they work in, and with what it took to overcome that fallibility.”

More here.

BRAINS PLUS BRAWN

Daniel Lieberman in Edge:

I've been thinking a lot about the concept of whether or not human evolution is a story of brains over brawn. I study the evolution of the human body and how and why the human body is the way it is, and I've worked a lot on both ends of the body. I'm very interested in feet and barefoot running and how our feet function, but I've also written and thought a lot about how and why our heads are the way they are. The more I study feet and heads, the more I realize that what's in the middle also matters, and that we have this very strange idea—it goes back to mythology—that human evolution is primarily a story about brains, about intelligence, about technology triumphing over brawn.

Another good example of this would be the Piltdown hoax. The Piltdown forgery was a fossil that was discovered in the early 1900s, in a pit in southern England. This fossil consisted of a modern human skull that had been stained and made to look really old, and an orangutan jaw whose teeth had been filed down and broken up, all thrown into a pit with a bunch of fake stone tools. It was exactly what Edwardian scientists were looking for, because it was an ape-like face with a big human brain, and it evolved in England, so it proved that humans evolved in England, which of course made sense to any Victorian or Edwardian. It also fit with the prevailing idea of the time, from Elliot Smith, that brains led the way in human evolution because, if you think about what makes us so different from other creatures, people always thought it's our brains. We have these big, enormous, fantastic brains that enable us to invent railways and income tax and insurance companies and all those other wonderful inventions that made the Industrial Revolution work.

More here.

Remembering Sri Lanka’s Killing Fields

Gareth Evans in Project Syndicate (illustration by Dean Rohrer):

Three years ago, in the bloody endgame of the Sri Lankan government’s war against the separatist Liberation Tigers of Tamil Eelam, some 300,000 civilians became trapped between the advancing army and the last LTTE fighters in what has been called “the cage” – a tiny strip of land, not much larger than New York City’s Central Park, between sea and lagoon in the northeast of the country.

With both sides showing neither restraint nor compassion, at least 10,000 civilians – possibly as many as 40,000 – died in the carnage that followed, as a result of indiscriminate army shelling, rebel gunfire, and denial of food and medical supplies.

The lack of outrage mainly reflects the Sri Lankan government’s success in embedding in the minds of policymakers and publics an alternative narrative that had extraordinary worldwide resonance in the aftermath of the terrorist attacks of September 11, 2001. What occurred in the cage, according to this narrative, was the long-overdue defeat, by wholly necessary and defensible means, of a murderous terrorist insurrection that had threatened the country’s very existence.

The other key reason behind the world’s silence is that the Sri Lankan government was relentless in banning independent observers – media, NGOs, or diplomats – from witnessing or reporting on its actions. And this problem was compounded by the timidity of in-country United Nations officials in communicating such information as they had.

President Mahinda Rajapaksa’s government claimed throughout, and still does, that it maintained a “zero civilian casualties” policy. Officials argued that no heavy artillery fire was ever directed at civilians or hospitals, that any collateral injury to civilians was minimal, and that they fully respected international law, including the proscription against execution of captured prisoners.

But that narrative is now being picked apart in a series of recent publications, notably the report last year of a UN Panel of Experts, and in two new books: UN official Gordon Weiss’s relentlessly analytical The Cage: The Fight for Sri Lanka and the Last Days of the Tamil Tigers, and BBC journalist Frances Harrison’s harrowingly anecdotal Still Counting the Dead: Survivors of Sri Lanka’s Hidden War.

What Can You Really Know? Another Round of Physicists vs. Philosophers

Freeman Dyson reviews Jim Holt's Why Does the World Exist?: An Existential Detective Story, in the NYRB:

The fading of philosophy came to my attention in 1979, when I was involved in the planning of a conference to celebrate the hundredth birthday of Einstein. The conference was held in Princeton, where Einstein had lived, and our largest meeting hall was too small for all the people who wanted to come. A committee was set up to decide who should be invited. When the membership of the committee was announced, there were loud protests from people who were excluded. After acrimonious discussions, we agreed to have three committees, each empowered to invite one third of the participants. One committee was for scientists, one for historians of science, and one for philosophers of science.

After the three committees had made their selections, we had three lists of names of people to be invited. I looked at the lists of names and was immediately struck by their disconnection. With a few exceptions, I knew personally all the people on the science list. On the history list, I knew the names, but I did not know the people personally. On the philosophy list, I did not even know the names.

In earlier centuries, scientists and historians and philosophers would have known one another. Newton and Locke were friends and colleagues in the English parliament of 1689, helping to establish constitutional government in England after the bloodless revolution of 1688. The bloody passions of the English Civil War were finally quieted by establishing a constitutional monarchy with limited powers. Constitutional monarchy was a system of government invented by philosophers. But in the twentieth century, science and history and philosophy had become separate cultures. We were three groups of specialists, living in separate communities and rarely speaking to each other.

When and why did philosophy lose its bite? How did it become a toothless relic of past glories? These are the ugly questions that Jim Holt’s book compels us to ask.

Literature is not Data: Against Digital Humanities

Stephen Marche in The LA Review of Books:

BIG DATA IS COMING for your books. It’s already come for everything else. All human endeavor has by now generated its own monadic mass of data, and through these vast accumulations of ciphers the robots now endlessly scour for significance much the way cockroaches scour for nutrition in the enormous bat dung piles hiding in Bornean caves. The recent Automate This, a smart book with a stupid title, offers a fascinatingly general look at the new algorithmic culture: 60 percent of trades on the stock market today take place with virtually no human oversight. Artificial intelligence has already changed health care and pop music, baseball, electoral politics, and several aspects of the law. And now, as an afterthought to an afterthought, the algorithms have arrived at literature, like an army which, having conquered Italy, turns its attention to San Marino.

The story of how literature became data in the first place is a story of several related intellectual failures.

In 2002, on a Friday, Larry Page began to end the book as we know it. Using the 20 percent of his time that Google then allotted to its engineers for personal projects, Page and Vice-President Marissa Mayer developed a machine for turning books into data. The original was a crude plywood affair with simple clamps, a metronome, a scanner, and a blade for cutting the books into sheets. The process took 40 minutes. The first refinement Page developed was a means of digitizing books without cutting off their spines — a gesture of tender-hearted sentimentality towards print. The great disbinding was to be metaphorical rather than literal. A team of Page-supervised engineers developed an infrared camera that took into account the curvature of pages around the spine. They resurrected a long-dormant piece of Optical Character Recognition software from Hewlett-Packard and released it to the open-source community for improvements. They then crowd-sourced textual correction at minimal cost through a brilliant program called reCAPTCHA, which employs an anti-bot service to get users to read and type in words the Optical Character Recognition software can’t recognize. (A miracle of cleverness: everyone who has entered a security identification has also, without knowing it, aided the perfection of the world’s texts.) Soon after, the world’s five largest libraries signed on as partners. And, more or less just like that, literature became data.
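The reCAPTCHA step is the clever part of the pipeline, and its logic is simple enough to sketch. Below is a minimal illustration, in Python, of the pairing scheme the passage describes: one word whose answer the system already knows vets the user, and the user's answer on the word the OCR engine failed to read is banked as a vote toward a corrected transcription. The function names and word lists here are invented for illustration; this is a sketch of the idea, not Google's code.

```python
# Minimal sketch of the reCAPTCHA pairing logic described above.
# All names and data are illustrative assumptions, not Google's code.
import random
from collections import Counter

# Words the OCR engine read with high confidence (answers known).
CONTROL_WORDS = {"library": "library", "spine": "spine"}

# Scanned word images the OCR engine failed on; we collect human guesses.
unknown_votes: dict[str, Counter] = {"img_0412": Counter(), "img_0977": Counter()}

def serve_challenge() -> tuple[str, str]:
    """Pair one known 'control' word with one unreadable word."""
    control = random.choice(list(CONTROL_WORDS))
    unknown = random.choice(list(unknown_votes))
    return control, unknown

def submit(control: str, unknown: str, answer_control: str, answer_unknown: str) -> bool:
    """Admit the user only if the control word is right; if so,
    record their reading of the unknown word as one vote."""
    if answer_control.strip().lower() != CONTROL_WORDS[control]:
        return False  # failed the known word: trust nothing
    unknown_votes[unknown][answer_unknown.strip().lower()] += 1
    return True

def transcription(unknown: str, quorum: int = 3) -> str | None:
    """Once enough independent users agree, treat the plurality
    reading as the corrected text."""
    word, count = (unknown_votes[unknown].most_common(1) or [(None, 0)])[0]
    return word if count >= quorum else None
```

In the real system the votes came from anti-bot checks that millions of users were completing anyway, which is what made the textual correction nearly free.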

Remarkable Facts: Ending Science As We Know It

Elliott Sober reviews Thomas Nagel's Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False, in Boston Review:

Thomas Nagel, a distinguished philosopher at NYU, is well known for his critique of “materialistic reductionism” as an account of the mind-body relationship. In his new and far-reaching book Mind and Cosmos, Nagel extends his attack on materialistic reductionism—which he describes as the thesis that physics provides a complete explanation of everything—well beyond the mind-body problem. He argues that evolutionary biology is fundamentally flawed and that physics also needs to be rethought—that we need a new way to do science.

Nagel’s new way is teleological—scientific explanations need to invoke goals, not just mechanistic causes. The conventional story of the emergence of modern science maintains that Galileo and Newton forever banished Aristotle’s teleology. So Mind and Cosmos is an audacious book, bucking the tide. Nagel acknowledges that he has no teleological theory of his own to offer. His job, as he sees it, is to point to a need; creative scientists, he hopes, will do the heavy lifting.

Nagel’s rejection of materialistic reductionism does not stem from religious conviction. He says that he doesn’t have a religious bone in his body. The new, teleological science he wants is naturalistic, not supernaturalistic. This point needs to be remembered, given that the book begins with kind words for proponents of intelligent design. Nagel applauds them for identifying problems in evolutionary theory, but he does not endorse their solution.

Nagel’s main goal in this book is not to argue against materialistic reductionism, but to explore the consequences of its being false. He has argued against the -ism elsewhere, and those who know their Nagel will be able to fill in the details. But new readers may be puzzled, so a little backstory may help.

Sunday Poem

Smell and Envy

You nature poets think you've got it, hostaged
somewhere in Vermont or Oregon,
so it blooms and withers only for you,
so all you have to do is name it: primrose
– and now you're writing poetry, and now
you ship it off to us, to smell and envy.

But we are made of newspaper and smoke
and we dunk your roses in vats of blue.
Birds don't call, our pigeons play it close
to the vest. When the moon is full
we hear it in the sirens. The Pleiades
you could probably buy downtown. Gravity
is the receiver on the hook. Mortality
we smell on certain people as they pass.

by Douglas Goetsch
from Nobody's Hell

Hanging Loose Press, Brooklyn, NY, 1999

Lewis Lapham’s Antidote to the Age of BuzzFeed

From Smithsonian:

The counterrevolution has its embattled forward outpost on a genteel New York street called Irving Place, home to Lapham’s Quarterly. The street is named after Washington Irving, the 19th-century American author best known for creating the Headless Horseman in his short story “The Legend of Sleepy Hollow.” The cavalry charge that Lewis Lapham is now leading could be said to be one against headlessness—against the historically illiterate, heedless hordesmen of the digital revolution ignorant of our intellectual heritage; against the “Internet intellectuals” and hucksters of the purportedly utopian digital future who are decapitating our culture, trading in the ideas of some 3,000 years of civilization for…BuzzFeed.

Lapham, the legendary former editor of Harper’s, who, beginning in the 1970s, helped change the face of American nonfiction, has a new mission: taking on the Great Paradox of the digital age. Suddenly, thanks to Google Books, JSTOR and the like, all the great thinkers of all the civilizations, past and present, are one or two clicks away. The great library of Alexandria, nexus of all the learning of the ancient world, which burned to the ground, has risen from the ashes online. And yet—here is the paradox—the wisdom of the ages is in some ways more distant and difficult to find than ever, buried like lost treasure beneath a fathomless ocean of online ignorance and trivia that makes what is worthy and timeless more inaccessible than ever. There has been no great librarian of Alexandria, no accessible finder’s guide, until Lapham created his quarterly five years ago with the quixotic mission of serving as a highly selective search engine for the wisdom of the past.

More here.

Wartime Rations

From The New York Times:

I want to hate David Benioff. He’s annoyingly handsome. He’s already written a pair of unputdownable books, one of which was made into Spike Lee’s most heartbreaking film, “The 25th Hour” — for which Benioff was asked to write the screenplay, leading to a second career in Hollywood. (They should just get it over with and put the man in the movies already.) He takes his morning orange juice next to Amanda Peet. And he’s still in his 30s. See what I mean?

Benioff’s new novel reveals why there are so many Russians — not oligarchs or prostitutes, but soldiers and old babushkas — in this nice American boy’s fiction. “City of Thieves” follows a character named Lev Beniov, the son of a revered Soviet Jewish poet who was “disappeared” in the Stalinist purges, as Lev and an accomplice carry out an impossible assignment during the Nazi blockade of Leningrad. Before Lev begins to tell his story, however, a young Los Angeles screenwriter named David visits his grandfather in Florida, pleading for his memories of the siege. But this is no postmodern coquetry. In fact, the novel tells a refreshingly traditional tale, driven by an often ingenious plot. And after that first chapter Benioff is humble enough to get out of its way. For some writers, Russia inspires extravagant lamentations uttered into the eternity of those implacable winters. Happily, Benioff’s prose doesn’t draw that kind of attention to itself.

More here. (Note: Old review but, thanks to Abbas and Margit, I just read the book now and recommend it strongly).

Andrew Gelman on How Americans Vote

A Five Books interview:

I notice from your blog as well that one of the stereotypes that you are keen on debunking is this idea that working-class people in America vote conservative. A number of people have gone to some lengths to try to explain this phenomenon, but you seem to think it’s a bit of a red herring.

Somehow people on the left and on the right find it difficult to understand. On the left, people think that 100% of working-class people should vote for the left, so anything less than 100% makes them feel that something has gone wrong. They just cannot understand how this could be. On the right, you get the opposite. It’s considered a validation – they want to believe that these more virtuous people are voting for them. But even in the days of Franklin Roosevelt and Harry Truman, a lot of low-income people voted Republican. There was no magic golden age in which lower-income working-class people were uniformly Democrat. It was always various subgroups of the population.

How many of the poor did vote for the Democrats, say, in the last election?

Of the lowest third of the population about 60% voted for the Democrats.

What if you narrow it down to blue-collar workers though? Don’t the majority of them vote conservative?

Then you have to ask, what does that exactly mean? Someone could make $100,000 a year and be blue collar. Conversely, if you’re a woman cleaning bedpans and making very little money, you’re not blue collar: cleaning bedpans is not considered blue-collar work. Firstly, “blue collar” conveys some sort of moral superiority, and secondly, it just happens to exclude a lot of the female workforce, who are more likely to be Democrats. If you take only blue-collar workers – who are mostly male – don’t even restrict for income, and then go beyond that to include only whites, you’re chipping away at various groups that support the Democrats without noticing what’s happening. It sounds very innocuous to talk about blue-collar whites, but you’re selecting a subgroup of this social class which is particularly conservative, and then making some claims about them.
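Gelman’s “chipping away” argument is a selection effect, and a toy simulation makes it concrete. The sketch below, in Python with entirely invented probabilities (illustrative assumptions, not survey data), builds a synthetic low-income electorate and shows how each successive filter on “blue collar,” “white,” and “male” selects a subgroup less Democratic than the group as a whole: exactly the kind of quiet subgroup selection he describes.

```python
# Toy illustration of the subgroup-selection effect described above.
# Every probability here is invented for illustration; NOT survey data.
import random

random.seed(0)

def voter() -> dict:
    male = random.random() < 0.5
    white = random.random() < 0.7
    # Blue-collar work skews male, which quietly excludes much female work.
    blue_collar = random.random() < (0.5 if male else 0.2)
    # Invented propensity: each filtered trait shifts the vote rightward a bit.
    p_dem = 0.60 - 0.05 * male - 0.10 * white - 0.05 * blue_collar
    return {"male": male, "white": white, "blue_collar": blue_collar,
            "dem": random.random() < p_dem}

population = [voter() for _ in range(100_000)]

def dem_share(group) -> float:
    group = list(group)
    return sum(v["dem"] for v in group) / len(group)

# Each filter selects a more conservative slice of the same "class."
print(f"all low-income voters:  {dem_share(population):.0%}")
print(f"blue collar only:       {dem_share(v for v in population if v['blue_collar']):.0%}")
print(f"blue-collar whites:     {dem_share(v for v in population if v['blue_collar'] and v['white']):.0%}")
print(f"blue-collar white men:  {dem_share(v for v in population if v['blue_collar'] and v['white'] and v['male']):.0%}")
```

The printed shares fall with each filter even though no individual voter changed; the narrowing of the group does all the work.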

X-phi is Here to Stay

Richard Marshall interviews Chris Weigel in 3:AM Magazine:

3:AM: By 2009 you were enthusiastically supporting X-phi. You wrote a paper, ‘Experimental Philosophy Is Here to Stay’. Why did you write that? Was there a feeling at the time that the approach needed defending?

CW: Yes, it did need defending and explaining and sometimes still does. In 2009, I bumped into someone at a conference who said, “Oh, you’re doing that? That’s too bad. I read a paper that refutes it.” And my thought was, “Which ‘it’ are we talking about? The projects are really diverse, and it seems unlikely that one argument could refute all of them at once.” Over time, that person and the field in general have become much more sympathetic. Writing the paper was a way not so much of defending but of explaining experimental philosophy systematically. After attending the phenomenal Experimental Philosophy summer workshop directed by Ron Mallon and Shaun Nichols, I wanted to try to explain experimental philosophy to a wide audience.

3:AM: When talking about this approach to philosophy, Josh Knobe, Shaun Nichols and others give the impression that it is a more collaborative approach than the traditional, armchair variety. Have you found this to be the case in your own experience? It seems very cool and unstuffy. Josh Knobe in his interview said he feared ending up as just an academic read by a couple of other academics. X-phi seems to be a way of escaping this fear. Is this something you relate to?

CW: Yes, and if you look at how so many of the major papers have co-authors, you’ll see that experimental philosophers tend to work collaboratively. I’ve also had many more opportunities for collaboration since starting in experimental philosophy. And I think you’re right that the research tends to be, as you say, cool and unstuffy. I think of it like this: When my daughter was fifteen months old, I took her to a pumpkin patch, and she was so excited she started uttering—screaming, really—her first sentence while pointing all around: “Look at that! Look at that! Look at that!” Experimental philosophy presentations have much the same feel. They offer a pumpkin patch full of philosophically rich ideas just waiting to be explored.