Tuesday, November 30, 2010
Munk Debates: Tony Blair vs Christopher Hitchens
Auden reads 'The Shield of Achilles' (1953)
Felix Stalder in Eurozine:
WikiLeaks is one of the defining stories of the Internet, which means by now, one of the defining stories of the present, period. At least four large-scale trends which permeate our societies as a whole are fused here into an explosive mixture whose fall-out is far from clear. First is a change in the materiality of communication. Communication becomes more extensive, more recorded, and the records become more mobile. Second is a crisis of institutions, particularly in western democracies, where moralistic rhetoric and the ugliness of daily practice are diverging ever more at the very moment when institutional personnel are being encouraged to think more for themselves. Third is the rise of new actors, "super-empowered" individuals, capable of intervening into historical developments at a systemic level. Finally, fourth is a structural transformation of the public sphere (through media consolidation at one pole, and the explosion of non-institutional publishers at the other), to an extent that rivals the one described by Habermas with the rise of mass media at the turn of the twentieth century.
Imagine dumping nearly 400 000 paper documents into a dead drop located discreetly on the hard shoulder of a road. Impossible. Now imagine the same thing with digital records on a USB stick, or as an upload from any networked computer. No problem at all. Yet, the material differences between paper and digital records go much further than mere bulk. Digital records are the impulses travelling through the nervous systems of dynamic, distributed organisations of all sizes. They are intended, from the beginning, to circulate with ease. Otherwise such organisations would fall apart and dynamism would grind to a halt. The more flexible and distributed organisations become, the more records they need to produce and the faster these need to circulate. Due to their distributed aspect and the pressure for cross-organisational cooperation, it is increasingly difficult to keep records within particular organisations whose boundaries are blurring anyway. Surveillance researchers such as David Lyon have long been writing about the leakiness of "containers", meaning the tendency for sensitive digital records to cross the boundaries of the institutions which produce them. This leakiness is often driven by commercial considerations (private data being sold), but it happens also out of incompetence (systems being secured insufficiently), or because insiders deliberately violate organisational policies for their own purposes. Either they are whistle-blowers motivated by conscience, as in the case of WikiLeaks, or individuals selling information for private gain, as in the case of the numerous employees of Swiss banks who recently copied the details of private accounts and sold them to tax authorities across Europe. Within certain organisations such as banks and the military, virtually everything is classified and large numbers of people have access to this data, not least mid-level staff who handle the streams of raw data such as individuals' records produced as part of daily procedure.
Adonis On the Power of Poetry
David Mattin in The National:
The new book - called, simply, Adonis: Selected Poems - spans his entire career, from the early works produced in his native Syria in the 1950s and amid a post-colonial atmosphere of new Arab national consciousness, through the long, fragmented epics of the 1970s - including the work best known to English readers, Funeral for New York - and on to his most recent, crystalline, short works, tinged with erotic longing. But how important is this book to Adonis; is he much concerned that he is read in English?
"I'm interested in all readers," he says. "The reader is such that what he does is a part of me, and English readers are no different from Arab readers in that regard.
"The reader is the 'other', the person I am trying to reach. And that 'otherness' is also a part of me. I'm interested in the perception of non-Arab readers because they may allow me a clearer perception of myself."
Indeed, it seems that Adonis feels acutely the difficulty of reaching a western readership:
"Unfortunately, western readers continue to see Arab culture as marginal. Arab politics has little weight; this is accepted; but we mustn't conflate politics and culture, which unfortunately is what western readers tend to do."
It would be hard to argue with any of that. It's unavoidable, though, that only readers of Arabic can have first-hand knowledge of ways in which Adonis transfigured the Arab poetic tradition in the 20th century. English readers, then, are left with the reports of critics, who tell us that he broke radically from traditional rhyme and meter and evolved an Arabic free verse; that he enlivened a classical poetic vocabulary by using the language of everyday, conversational Arabic; and that he eschewed traditional subject matter and turned, instead, to poems that captured the great changes in thought and self-identity sweeping the Arab world, and fuelling the rise of Pan-Arabism.
Indeed, trouble over his involvement in nationalist politics saw Adonis leave his native Syria for Beirut in 1957, where he founded the influential magazine Shi'r (Poetry), host to much of this experimental work.
In short, Adonis is credited - above any of his contemporaries - with making Arab poetry modern.
The Dosa Man of New York
Atiya Hussain in Open the Magazine:
In New York, where rare is the denizen who cooks, street food vending is fast becoming a career choice. With the outlays a fraction of what’s required to run a restaurant, people have been known to give up stable jobs to start a food truck. Among the most famous is Bangladeshi immigrant Meru Sikder—Vendy finalist in 2008 and 2009—who chose his midtown Biriyani Cart over his previous job as banquet chef at the Hilton.
Having prevailed at the 2007 Vendy, the Dosa Man can afford to chat affably and joke with customers in the long line behind his cart. His signature dish, the one he won his award for, is the Pondicherry Dosa, whose stuffing of spiced potatoes and salad greens may be an odd combination for the Indian palate, but remains the cart’s best seller. “We have a lot of weird stuff, like uttapam with mixed veggies...we have vegan drumsticks,” he enthuses.
I used to go to the Dosa cart before things got weird. Back in 2002, Kandasamy still had a huge Subcontinental moustache, and he kept some of his lovely fluffy vadas aside for me. That was early on in the story of the Dosa cart, a time Kandasamy remembers as ‘difficult’ because of all the explaining he had to do in turning the curious into customers: “A dosa is a sort of crepe.”
Flash forward to October 2010, as I hurry past New York University students towards Washington Square Park, which is undergoing major landscaping work and looks devastated. To my dismay, the Dosa Truck is not visible. Two carts inside the park, manned by South Asians, offer ice-cream, pretzels and hot dogs. And then, just as I am about to give up, I see a cute blonde open up her styrofoam package—and there I spy the unmistakable golden crust of the dosa I was looking for. Before she can manage a bite, I apologise and explain that I am looking for the truck. The reason for my interruption elicits a smile. “Right down the street,” she says, pointing. “You’ll see a huge line and a camera crew!”
Americans on the sea
“On the Water” seeks to remind us how deeply our daily lives are still informed by maritime activity, of the grand international web of ocean commerce that brings cars and televisions and fuel and food to our doorsteps. But as far as most Americans are concerned, these products could have dropped from the sky or grown magically on store shelves. As we turned from sailors to consumers, we desired the means by which we get our goods to be as simple and innocuous as possible, and thus, as divorced as possible from the water. Even those scant five percent of Americans who have been on a cruise ship — vast sideways hotels cruising noiselessly and still — could hardly tell you about the roar of the waves or the smell of the surf, and food poisoning from buffet tables is more likely than sea sickness. Mostly, “On the Water” reminds us that the contemporary American relationship to water is an abstraction. That the waterways we hardly notice these days, that gave America its very soul, are a memory. We no longer see our reflection in the rivers and oceans as did Americans in Melville’s day. We are landlubbers and occasional passengers, and more and more, it seems that the sea finds its reflection in us.
More from Stefany Anne Golberg at The Smart Set here.
the book was meant to be somewhat provocative...
Americans are preoccupied with the Founders, and that is not at all a bad thing, yet much of the contemporary discussion of the revolutionary generation and the early years of the republic is appallingly shallow. In my view, too little attention has been paid to James Madison’s political philosophy—which is surprising, since Madison is the principal author of the Constitution and the Bill of Rights. The Founders did not speak in one voice, and careful attention to the substance of their debates (which were in many ways far more acrimonious than our own cable TV spectacles) can help clarify contemporary controversies, especially when so many of our present political combatants are merely reenacting old debates in seeming ignorance of the principles that were originally at issue. Madison provides a particularly apt perspective on our current predicament because as a politician he devoted much of his energy to fighting precisely the sort of corruption that has swamped our political system. Madison was the intellectual and political force behind the republican opposition to the Federalists, who, very much like the present-day Republican Party, saw themselves as the natural rulers of the United States. The Federalists, led by Alexander Hamilton, sought to protect a narrow financial oligarchy from the interests of the great majority of American citizens. Hamilton’s ambition was to bind his “moneyed men” to the state through an innovative financial program that would at the same time lay the foundations for an international commercial empire.
More from our pal Roger Hodge about his new book, The Mendacity of Hope, at Harper's here.
Worry about information overload has become one of the drumbeats of our time. The world’s books are being digitized, online magazines and newspapers and academic papers are steadily augmented by an endless stream of blog posts and Twitter feeds; and the gadgets to keep us participating in the digital deluge are more numerous and sophisticated. The total amount of information created on the world’s electronic devices is expected to surpass the zettabyte mark this year (a barely conceivable 1 with 21 zeroes after it). Many feel the situation has reached crisis proportions. In the academic world, critics have begun to argue that universities are producing and distributing more knowledge than we can actually use. In the recent best-selling book “The Shallows,” Nicholas Carr worries that the flood of digital information is changing not only our habits, but even our mental capacities: Forced to scan and skim to keep up, we are losing our abilities to pay sustained attention, reflect deeply, or remember what we’ve learned. Beneath all this concern lies the sense that humanity is experiencing an unprecedented change — that modern technology is creating a problem that our culture and even our brains are ill equipped to handle. We stand on the brink of a future that no one can ever have experienced before. But is it really so novel?
More from Ann Blair at the Boston Globe here.
In his book Danger On Peaks, poet Gary Snyder describes a trip to the area around Mt. St. Helens (called Loowit, meaning "smoky" in the Sahaptin Native American language) in Washington State, USA. The mountain exploded on May 18, 1980 in an eruption with the power of 500 Hiroshima bombs, Snyder says.
Snyder recalls his first trip to the mountain when he was thirteen and subsequent trips prior to the 1980 eruption, then brings his readers up to date with a return there after Mt. St. Helens blew. Danger On Peaks lays this out in alternating prose and poetry in a remarkably illuminating way.
Approaching Loowit, Snyder reaches a spot where it can be seen in its transformed state. He writes:
Finally pull up to the high ridge, now named Johnston after the young geologist who died there, and walk to the edge. The end of the road. Suddenly there's all of Loowit and a bit of the lake basin! In a new shape, with smoking scattered vents in this violet-gray light.
The white dome peak wacked lower down,
open-sided crater on the northside, fumarole wisps
a long gray fan of all that slid and fell
angles down clear to the beach
dark old-growth forest gone no shadows
the lake afloat with white bone blowndown logs
scoured ridges around the rim, bare outcrop rocks
squint in the bright
ridgetop plaza packed with puzzled visitor gaze
no more White Goddess
but, under the fiery sign of Pele,
and Fudo—Lord of Heat
who sits on glowing lava with his noose
lassoing hardcore types
from hell against their will,
is her name
publisher: Shoemaker Hoard, 2004
Extra Vitamin D and Calcium Aren’t Needed
Gina Kolata in The New York Times:
The very high levels of vitamin D that are often recommended by doctors and testing laboratories — and can be achieved only by taking supplements — are unnecessary and could be harmful, an expert committee says. It also concludes that calcium supplements are not needed.
Most people have adequate amounts of vitamin D in their blood, supplied by their diets and natural sources like sunshine, the committee says in a report that is to be released on Tuesday. “For most people, taking extra calcium and vitamin D supplements is not indicated,” said Dr. Clifford J. Rosen, a member of the panel and an osteoporosis expert at the Maine Medical Center Research Institute. Dr. J. Christopher Gallagher, director of the bone metabolism unit at the Creighton University School of Medicine in Omaha, Neb., agreed, adding, “The onus is on the people who propose extra calcium and vitamin D to show it is safe before they push it on people.” Over the past few years, the idea that nearly everyone needs extra calcium and vitamin D — especially vitamin D — has swept the nation. With calcium, adolescent girls may be the only group that is getting too little, the panel found. Older women, on the other hand, may take too much, putting themselves at risk for kidney stones. And there is evidence that excess calcium can increase the risk of heart disease, the group wrote.
Telomerase reverses ageing process
Mice engineered to lack the enzyme, called telomerase, became prematurely decrepit. But they bounced back to health when the enzyme was replaced. The finding, published online today in Nature, hints that some disorders characterized by early ageing could be treated by boosting telomerase activity. It also offers the possibility that normal human ageing could be slowed by reawakening the enzyme in cells where it has stopped working, says Ronald DePinho, a cancer geneticist at the Dana-Farber Cancer Institute and Harvard Medical School in Boston, Massachusetts, who led the new study. "This has implications for thinking about telomerase as a serious anti-ageing intervention."
Monday, November 29, 2010
Lewis Lapham to Judge 2nd Annual 3QD Politics Prize
UPDATE 12/21/10: The winners have been announced.
UPDATE 12/14/10: List of finalists.
UPDATE 12/14/10: List of semifinalists.
UPDATE 12/07/10: Voting round is now open.
Dear Readers, Writers, Bloggers,
We are very honored and pleased to announce that Lewis Lapham has agreed to be the final judge for our 2nd annual prize for the best blog writing in politics. (Details of last year's prize, judged by Tariq Ali, can be found here.) Mr. Lapham needs no introduction for our American readers, but for those who do not already know him, this is from his Wikipedia entry:
Lewis Lapham served as editor of Harper's Magazine from 1976 to 2006 (with a hiatus from 1981 to 1983). He was managing editor from 1971 to 1975, after having worked for the San Francisco Examiner and New York Herald Tribune. He is largely responsible for the modern look and prominence of the magazine, having introduced many of its signature features including its famed Harper's Index. He announced that he would become editor emeritus in Spring 2006, continuing to write his Notebook column for the magazine as well as editing a new journal about history, Lapham's Quarterly. Lapham has also worked with the PEN American Center, sitting on the board of judges for the PEN/Newman's Own Award. This February, he will be inducted into the American Society of Magazine Editors' Hall of Fame. [Photo from the Boston Globe.]
As usual, this is the way it will work: the nominating period is now open, and will end at 11:59 pm EST on December 2, 2010. There will then be a round of voting by our readers which will narrow down the entries to the top twenty semi-finalists. After this, the four main editors of 3 Quarks Daily (Abbas Raza, Robin Varghese, Morgan Meis, and Azra Raza) will select six finalists from the top twenty voted-for nominees, and may also add up to three wildcard entries of their own choosing. The three winners will be chosen from these by Mr. Lapham.
The first place award, called the "Top Quark," will include a cash prize of one thousand dollars; the second place prize, the "Strange Quark," will include a cash prize of three hundred dollars; and the third place winner will get the honor of winning the "Charm Quark," along with a two hundred dollar prize.
(Welcome to those coming here for the first time. Learn more about who we are and what we do here, and do check out the full site here. Bookmark us and come back regularly, or sign up for the RSS feed.)
November 22, 2010:
- Nominations are now open. Please nominate your favorite politics blog entry by placing the URL for the blog post (the permalink) in the comments section of this post. You may also add a brief comment describing the entry and saying why you think it should win. (Do NOT nominate a whole blog, just one individual blog post.)
- Blog posts longer than 4,000 words are not eligible.
- Each person can only nominate one blog post.
- Entries must be in English.
- The editors of 3QD reserve the right to reject entries that we feel are not appropriate.
- The blog entry may not be more than a year old. In other words, it must have been written after November 21, 2009.
- You may also nominate your own entry from your own or a group blog (and we encourage you to).
- Guest columnists at 3 Quarks Daily are also eligible to be nominated, and may also nominate themselves if they wish.
- Nominations are limited to the first 200 entries.
- Prize money must be claimed within a month of the announcement of winners.
December 2, 2010
- The nominating process will end at 11:59 PM (NYC time) of this date.
- The public voting will be opened soon afterwards.
December 8, 2010
- Public voting ends at 11:59 PM (NYC time).
December 21, 2010
- The winners are announced.
One Final and Important Request
If you have a blog or website, please help us spread the word about our prizes by linking to this post. Otherwise, post a link on your Facebook profile, Tweet it, or just email your friends and tell them about it! I really look forward to reading some very good material, and think this should be a lot of fun for all of us.
Best of luck and thanks for your attention!
Justin E. H. Smith
Some readers will recall the exposé I wrote a few months ago on the life and work of the American poet Jason Boone. What might not have been obvious in that piece, as I would urgently like to clear up now, is that it was all entirely made up: every last word of it, from the meetings I had with Boone at Nirvana concerts in Sacramento in the late 1980s, to the documentary about Boone's life supposedly made by an MA student in the Media Arts Program at the University of Alaska, Fairbanks. There is no Media Arts Program in Fairbanks! In fact, the interviewees in the segment of the film I included in the exposé, one supposedly named 'Michel Pupici' and the other 'Dylan Cooney', are both plainly the same person filmed from different angles. What is more, anyone who has ever met me will be able to confirm that it is I myself, the author, Justin Smith, in both of those roles. I am looking unusually fat, true, and I do not appear entirely sober, but personal identity persisting as it does through such superficial changes, I feel I must come clean and acknowledge my role in the ruse. It was me. All of it. The entire operation behind the Jason Boone story was a one-man job, and that man was not Jason Boone.
You can thus imagine my surprise when, not long ago, on the morning of this year's Canadian Thanksgiving, I received a telephone call from a certain Augusta Aardappel. Readers will recall that Aardappel was supposedly the South African academic who had written a dissertation, in a Deleuzean vein, on the poetry of Jason Boone ('The Boone Rhizome'), and who apparently had dated Boone for a while in the early 1990s. But again, I made her up along with Pupici, Cooney, Coombs, and the rest. Anyone who has the faintest familiarity with the sonorities of Dutch should have been able to detect that she was a fabrication: 'Aardappel' literally means 'earth-apple' and, on the model of the French 'pomme de terre', serves in Dutch as the word for 'potato' (in Afrikaans it is 'aartappel'). Have you ever met anyone named 'Mr. Potato'? Of course not. It is a name for fictional characters, not for real people.
Yet here was this woman on the telephone, on the morning of Canadian Thanksgiving, with a fully convincing Afrikaner accent, claiming to be none other than Augusta Aardappel. Of course at first I did not believe her, but I was also very intrigued, since my fiction had not previously inspired a great number of copycat hoaxes (I write in a non-existent genre --hyperlinked, multimedia, serial metafiction-- and my readership, if I may be honest, is fairly limited). Curious to figure out why anyone would bother to perpetrate such a pointless fraud, I determined to keep this 'Augusta' on the phone for as long as I could.
I asked her how the weather was in Pretoria, what was her opinion of vuvuzelas, Julius Malema, and Die Antwoord. She complimented me on my familiarity with today's South Africa, and I told her it was really nothing, I just get it from my friends' Facebook status updates. I could conjure an equal semblance of knowledge, I told her, about Vietnam, Tonga, or Sakhalin Island (with the last of these I could even add some Chekhovian flourishes).
When I felt I had gained her confidence I put to Ms. Aardappel the inevitable question. Why, I asked, would she claim to be someone I had made up?
"There's something important I need to tell you," she replied evasively. "Jason didn't die."
"Of course he didn't die," I answered. "He never existed in the first place. I made him up too. Now tell me where you're calling from and what it is you want."
"I'm calling from across possible worlds," was Augusta Aardappel's answer. "I'm calling because I need your help."
I knew immediately she was telling the truth.*
Philosophers have taken up widely divergent positions on the metaphysics of possible worlds. While the philosopher who coined the phrase --G. W. Leibniz, in his Theodicy of 1710-- was never entirely clear as to just how real the other possible worlds besides this one were supposed to be, over time most converged on the view that these worlds are figments of the human mind, 'state descriptions', as Rudolf Carnap called them in the mid-twentieth century, accounts that people give of some other way this world might have turned out, but in the end did not. Still later others, and most notably the great metaphysician David Lewis, would defend the radical thesis that other possible worlds are only possible relative to us, while they are perfectly actual relative to all the beings inhabiting them.
Lewis's possible-worlds realism would send shockwaves through Anglo-American philosophical circles, though many believed the whole theory was put forth with tongue firmly lodged in cheek. Meanwhile the French, if they heard about him at all, would mostly just stew about his lifting of the title of his major book outlining the theory from Bernard le Bovier de Fontenelle (his Entretiens sur la pluralité des mondes of 1686 would serve as inspiration for 1986's On the Plurality of Worlds). And the common people, for their part, on hearing of the notion of possible worlds, as is their custom would giddily declare that this theory is, variously, either corroborated or refuted by quantum mechanics.
I knew about David Lewis because I spent a number of years studying philosophy, and had even considered going into an academic career before shifting gears entirely, surprising everyone who knew me in my earlier life and taking up a career in law enforcement (I am now running for sheriff of Larimer County, Colorado; my slogan is 'Prepared to Lead'). So when Augusta used that phrase, 'across possible worlds', it immediately brought forth a flood of memories from my more studious days. I understood right away, I mean, that she was using a technical term, that she meant something very concrete when she spoke of crossing from one world to another. She, it dawned on me, could very well be the only person who had ever experienced such a thing. No one, not Lewis, and not even Leibniz (who seemed ready to believe just about anything) ever so much as toyed with the possibility of inter-world causation.
"So how does this work," I asked when we spoke again the following night. "Do you need to download the most recent version of Skype to do it? Is that it?" Augusta laughed. "Are you still in the other possible world right now? Is there an inter-world switchboard operator?"
"Those questions can wait," she replied. "For now I need to come see you."
One of the most difficult branches of the metaphysics of possible worlds is what is sometimes called 'counterpart theory'. If two distinct possible worlds have some relationship to one another, it is supposed, this will have to be because they contain at least some of the same individuals. But these individuals will not be exactly identical to their counterparts in other possible worlds. Even if everything about two counterpart individuals is intrinsically the same, they will still exist in two different worlds, and so will have different 'extrinsic denominations' or what are sometimes called 'Cambridge properties', which is to say relational properties, properties that result from the way the world around them is, rather than from the way they themselves are in their interior natures.
Sometimes, it is generally agreed, even some of the intrinsic properties of two counterparts might be different, without thereby compromising their status as counterparts of one another: George W. Bush might have one more chest hair in a neighboring world than in this one, or Michael Jackson might have died one day sooner or later. But the problem with these counterparts is that it is difficult to know where to draw the line, impossible, even, to determine at what point so many things about two supposed counterparts are so different that they are no longer counterparts at all. George W. Bush is still George W. Bush plus or minus a few chest hairs, but what if he had ended up running a hardware store rather than the United States?
Yet we seem to have a deep intuition about which inhabitants of other possible worlds might be our counterparts, which, that is, might be us if things had gone differently, and which could not be. We might even have a sense of deep familiarity with our own counterparts. For example, sometimes I am seized by the uncanny sense that I am leading several counterpart lives in parallel. There are moments when I am frankly unable to say of the actual, this-worldly me --by which I mean to say the counterpart of me that is myself-- which world-track exactly he, or I, is on. On occasion it even strikes me that I am not a law-enforcement official at all, but rather Justin Smith, a player for the San Francisco 49ers. Just the other day, I experienced something even stranger still: as I was cleaning out my firearm, I had a sudden sharp sense that I was Justin Smith, Esquire, a bespoke milliner, and that I was preparing to show some of my exquisite new creations at the Milan Fashion Week.
Sometimes it seems the actual world is nothing special, and that my real life is simultaneously unfolding in infinitely many other worlds, in all of my infinite counterparts. Not to put too fine a point on it, but sometimes I really just don't know who I am.
Augusta and I had video conversations over Skype several times in the following weeks. She told me a great deal about her life in South Africa, about the tedium of her job, and about how much she missed Jason. She said that as an untenured adjunct at the University of Pretoria she was made to teach a course on District 9, in which she was required to have students break into small groups and to discuss how they would feel if they were treated like the aliens. Consistently, Augusta reported, the students came to the conclusion that they would not like it at all.
She thought she'd enjoy teaching the course on 'Literatures of Apartheid' a bit more, but by the time it was assigned to her the 'Accommodating Learning Diversity' rules had taken effect, and professors at the university were now required to accept for a final grade any work that was pre-approved by the Office of Instructional Excellence's 'Learning Styles Grid'. More than half the students had determined that they were 'Kinetic-Dynamic Learners', and chose as their final work for the course to perform interpretive dances inspired by Coetzee's Waiting for the Barbarians. Mostly they just hopped around in a squatting position in front of the class.
"I didn't spend three years of my life trying to make sense of A Thousand Plateaus just to end up as a babysitter," Augusta lamented. "I've got to get out of this university racket before I lose my dignity entirely. So do you," she added.
"What do you mean, me? I'm a law-enforcement officer."
"You're not a law-enforcement officer. You're a professor of philosophy."
"Maybe in some other world. Not in this one. In this one I'm running for sheriff."
Augusta laughed. "Whatever. I need you to help me get to the world where Jason didn't die."
"Again," I told her, "I made Jason up. He's fictional."
"Yes, he's fictional in your world. But in the world I'm in, or the world I was in, he died in a plane crash. You made him die, remember?"
"I made a fictional guy die, yes. If I make a guy up I'm allowed to do whatever I want to him."
"Yes. I think it's known as 'Flaubert's Law'." She laughed again. "But seriously," I said, "how can I have made him die in your world? The Law has no force beyond the bounds of the fictional."
"Just forget about that. The only thing that matters to me is that there's this other world, not this one and not the one I come from either --I found it by accident when I got drunk and decided to try Chatroulette-- where Jason actually exists and there was no Qantas accident."
"And where he's living out his days masturbating on Chatroulette?" Augusta wasn't laughing this time. "Sorry," I said. "By the way, have you ever heard of Fontenelle?"
"Fontenelle? Isn't that like that opening in your head when you're born that closes by the time you're grown up?"
"Yes. It's that too, I suppose."
The date of Augusta's arrival was drawing closer, and I found myself thinking constantly about what it would be like to meet her. She was her own person, certainly, yet I couldn't help but think that she was also, in some sense, my creation. Of course I can't create a person ex nihilo. I'm not God after all. But if a creature of my imagination happens to coincide exactly with a person God actually placed in a nearby possible world, then isn't my creation in some sense just as impressive?
I had been travelling constantly in the weeks before her arrival from South Africa. Going to meetings, talking with people, flying in planes, staying in hotels. I could not have told you where I was or what my purpose was for most of this time. All I really knew is that Augusta had agreed to meet me in one of my destination cities. My creation, appearing in flesh and blood in some hotel lobby. What would come next?
Not that I was getting ideas, of course. I'm a family man, with a pretty wife and two fine boys. I've built my whole campaign on it. But still, maybe there's some exemption, some hidden clause of Flaubert's Law.
Things are getting stranger. Just this morning, the day of Augusta's expected arrival, I awoke to find the following message in my inbox:
I fear you're losing hold of yourself, Justin. You are not a policeman in Colorado, and you're certainly not a football player in San Francisco or a designer at Milan Fashion Week. It's hard enough to see how you could believe you were just one of these characters, but how you could be all of them at once is truly beyond me. Perhaps next you'll start believing you're Justin Smith, the Australian rugby player, or Justin 'Boosted J' Smith, the champion poker player from Connecticut? Go Google these people again (I know you have already). Do you see an 'E. H.' in their names? Have you really forgotten who you are? Go look at your work profile. You are Justin E. H. Smith. I should know. I am too.
Your counterpart, J.
PS Stay away from Augusta Aardappel. You don't want to get mixed up in this trans-world stuff (any more than you already are...).
Now this is all really too much. It's one thing to be contacted from another possible world by someone who doesn't even exist in this one; it's quite another thing to be contacted by your own counterpart. I assume he must have been writing from some nearby possible world. He probably figured out how to monitor me through my webcam or something. I don't really know how it all works, but it's clear that the Internet is making things possible that really never should have happened.
Anyway, my counterpart's onto something. I'm not a Colorado family man. Those aren't my wife and kids back in Fort Collins. I don't know what I was thinking when I said they were. It's just that I've been on the road so much recently, and I've spent so much time clicking around from this site to that in anonymous hotel rooms, that my sense of my self has really started to unravel. But I'm sure I can get it back with a little effort.
Come to think of it, Augusta contacted me for the first time on Canadian Thanksgiving. How would I know that unless I lived in Canada? I'm a philosopher there, I think. Anyway I know way too much about the metaphysics of possible worlds to be qualified for anything else. But then there's that time when I was a rookie and I was out in the patrol vehicle with my partner Mike. (Why did I just call it a 'vehicle'? Who talks like that?) We pulled over a Dodge van with fumes coming out of it and next thing we knew we were making national headlines for uncovering the new 'mobile meth lab' phenomenon. I remember.
For now, one thing is clear: according to the 'Services for Guests' booklet, I'm sitting at the writing desk in a room at the Hyatt Regency Boston. My laptop tells me it is Sunday, November 28, 2010, 8:24 PM. I've just checked the internal digital information service on my room's television set, and it says that there are two conferences going on right now: in the Chesapeake Ballroom, it's the North American Executive Policing Conference; and on the Hancock Mezzanine, the Eastern Division meeting of the American Philosophical Association. (I suppose these organizations schedule their annual meetings on the weekend of the American Thanksgiving in order to help their members make the inevitable choice between career and family.)
On the nightstand next to the bed there is a firearm (where did I learn to call it a 'firearm'?) in a shoulder holster. It is sitting on top of a book: Fontenelle's Entretiens sur la pluralité des mondes (the 1998 Flammarion paperback edition). I pick the book up, open it at random, and read this line: Ceux qui ont des pensées à perdre, les peuvent perdre sur ces sortes de sujets; mais tout le monde n'est pas en état de faire cette dépense inutile.
How true, I think. But where did I learn to read French? And as I'm puzzling over this little biographical detail my iPhone starts vibrating with a message. It's Augusta. "R u up there?," she wants to know. "Im inthe lobby."
"I've been warned not to see you," I write back carefully (I, or the counterpart me who I happen to be, have/has always been punctilious about proper spelling, in all media). "It could knock the world off balance, or something."
"Thats crazy!!!," she texts back a few seconds later. "Were he're to help eachother!!!"
I strap on my shoulder holster (where did I learn to do that?) and I glance again at the book. Shall I bring it too? The phone vibrates again, but this time it's not Augusta.
"Whatever you do, don't bring the gun." It's my counterpart again, correct spelling and all.
"I can take care of myself, me," I write back angrily.
I grab my Fontenelle and I head for the door.
To be continued...
For an extensive archive of Justin Smith's writing, please visit www.jehsmith.com.
To make a donation, or to volunteer for Justin Smith's campaign, please go here.
(Re)name that Metaphor (correctly this time)
I’ve never been in this position, but the person demanding a newspaper or magazine correction—the insider claiming he was quoted out of context; the scientist whose nuanced position didn’t come across, quite; the dead person who’s not really dead—must get a certain satisfaction from seeing the correction printed. It might be the grim satisfaction of a wrong set to rights too late, but satisfaction nonetheless. Then again, in a digital publication, a correction can work to the source’s advantage in some sense. If s/he finds the mistake early enough, an editor can amend it instantly and make sure that (most) everyone reads the correct sentence the first time. Some publications even mark the factual boner with an asterisk, which not only emphasizes the correct version of things, but provides some instant sympathy for the wronged party.
But as a disinterested reader, I’d never gotten actual delight from a correction until a few weeks ago, when the New York Times ran one for an article about study skills and retention (“Forget What You Know About Good Study Habits”). I’d read the uncorrected article online at first, then went back to reread it, for reasons soon ejected from my mind. I’d gotten through half the story, and was going to click through to the second page. And I was grimacing in anticipation of a paragraph I knew was coming up. The author had needed a metaphor conveying something about unintended consequences, and apparently wanted the imprimatur of science. So he fell back on that canned summary of the Heisenberg Uncertainty Principle—you know, the idea that measuring a property of a particle alters the property itself.
Except that’s not what the Uncertainty Principle says. All the Principle actually says, in its entirety, is:
∆x∆p ≥ h/4π
Now if you insist on translating quantum mechanics into English (always risky), the Principle says the uncertainty in a particle’s position (∆x) multiplied by the uncertainty in its speed and direction (taken into account through its momentum, ∆p) is always at least “h divided by four times π.” (The h stands for Planck’s constant, a very tiny number; the π is the familiar constant from circles, 3.14159...) In simpler terms, if you know a particle’s position very well, you cannot know its momentum well at all, and vice versa.
The reason you can’t know both the exact position and exact momentum of a particle at once is that, down on the submicroscopic level, which is the only level where the Uncertainty Principle really applies, particles behave as much like waves as they do like discrete particles, or more so. And unlike a particle, a wave is nigh impossible to draw neat boundaries around, to say exactly where it is and what direction it seems to be going. In other words, the titular uncertainty isn’t uncertainty about measuring anything, as if you had a poor ruler; it’s uncertainty built into nature, intrinsic to the wave-like character of reality on such tiny scales.
Now it’s also true of course that measuring something does sometimes change what’s being measured. That happens on a macro level—checking the air pressure on your tires will change the pressure of the air inside them, since you let a little of that pressurized air out—and it happens on a micro level—subatomic particles are so small that probing them even with light will bump them around. But that insight has nothing to do with uncertainty: even if you discovered a perfect measuring tool that didn’t bump them around, you still could not measure the position and momentum accurately at the same time on a very small scale.
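To put rough numbers on the inequality above (my own back-of-the-envelope figures, not the article’s): for an electron confined to an atom-sized region, the minimum uncertainties are anything but negligible.

```latex
% Electron confined to an atom-sized region, \Delta x \approx 10^{-10}\,\mathrm{m}:
\Delta p \;\ge\; \frac{h}{4\pi\,\Delta x}
  \;=\; \frac{6.6\times10^{-34}\ \mathrm{J\,s}}{4\pi \times 10^{-10}\ \mathrm{m}}
  \;\approx\; 5\times10^{-25}\ \mathrm{kg\,m/s},
\qquad
\Delta v \;=\; \frac{\Delta p}{m_e}
  \;\approx\; \frac{5\times10^{-25}}{9.1\times10^{-31}\ \mathrm{kg}}
  \;\approx\; 6\times10^{5}\ \mathrm{m/s}.
```

That is hundreds of kilometers per second of irreducible fuzziness in the electron’s speed. Repeat the same arithmetic with a one-kilogram mass in place of m_e and the bound shrinks by some thirty orders of magnitude, which is why tires, rulers, and people never notice it.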
The idea that measuring something changes what’s being measured already has a name, actually, albeit an insipid one—it’s the Observer Effect. In fact, beyond just being banal, the name seems to downgrade its importance. Most of us probably observed early on, informally, that observing one of our fellow humans changes the way s/he acts. Indeed, there’s a deep psychological truth there. And as soon as you assimilate it, realizing how far its implications reach and that certain forms of psychological objectivity go right out the window as a result, it also seems intrinsic to the way the world works, and it needs a name. It’s something that we want to express with a law or a principle, not a mere effect. So no wonder that people rejected “Observer Effect” and poached on prestigious and oracular quantum mechanics instead: How much more learned!
It’s basically a branding problem: The Observer Effect needs a snappier name (feel free to start suggesting some below). But the first step in changing the name will be letting people know the problem, and why the name needs changing. Which is why the correction in the New York Times delighted me so much. “An article on Tuesday...” it said, “described incorrectly the Heisenberg uncertainty principle in physics. The principle holds that the act of measuring one property of a particle (position, for example) reduces the accuracy with which you can know another property (momentum, for example) — not that the act of measuring a property of the particle alters that property.”
I have no idea who insisted upon the correction here. Perhaps a physicist who roused himself from the usual indifference with which most physicists accept the mangling of this metaphor in everyday parlance. Perhaps a philosopher who also knows what “begging the question” really means and wasn’t about to let this injustice slide either. Perhaps a copy editor somewhere still fighting the good fight, still insisting that we can rescue “disinterested” and eradicate “irregardless,” too. But, regardless, it’s the most satisfying correction I’ve ever seen in a publication—and gives me hope that we can salvage the proper meaning of the Uncertainty Principle, and still honor the need for a (more cleverly named) Observer Effect. Indeed, while people usually refer to the Observer Effect as something unfortunate—because it doesn’t allow you to make an objective measurement—this may be one case where observing someone’s behavior changes it for the better.
“The coming months will see a new world, where global history is redefined.”
– WikiLeaks’ Twitter Feed, November 22nd
Julian Assange may be some new kind of journalist, but he is without a doubt some new kind of historian, too. He and his organization often frame their mission in terms of redefining history, as above, or in terms of offering history to the public. When asked what the consequences of the Iraq War Diaries would or should be, Assange answered, “the truth doesn’t need a policy objective.” Assange was also asked if the diaries were “a gift to historians.” He said no, the gift was not for historians, rather that the Iraqi people “need the history of the last 6 years” to better understand and operate in the present. Last night’s “Cablegate” release only amplifies this sense of breaking not just news, but history. As the New York Times notes in its coverage, the leak represents an unprecedented leap in access to primary sources: until last night only diplomatic cables up to 1972 were publicly available.
They say that journalists write the first draft of history. A Latin American term for a first draft is a “borrador” or “eraser.” But the line between journalism (or indeed history) and fiction is easily smudged. Statements like those above from WikiLeaks and Assange assume that primary sources, like the ones WikiLeaks provides, give us the whole truth, or at least possess a unique “truthiness.” But documents like those released in the war diaries and Cablegate do not represent “the truth”; rather, they are simply another vantage point from which to try to glimpse it. How much of a first draft do you ever end up keeping?
If we believe “truth” in history is just a sheaf of diplomatic cables, or Pentagon memos – that if we just read them all, then we’ll know – we deny the shifting, impossible project that history is.
That instability is something we are taught to deny. Just as journalists – and fifth graders – are taught the 5 W’s: Who, What, When, Where, and Why, so too do historians write – and students of history are taught to write, if they are to be considered “serious” – with a sense of inevitability as their guide. This is true at least at the mainstream and lower branches of the academy (I include my own BA here), where it is not quite comme il faut to imagine that, for example, the assassination of Archduke Ferdinand or the accession of Gorbachev were anything more than matches falling into existing stacks of kindling. Teleology is seductive.
There is a way around this, and it is to be found in the counterfactual. Counterfactual histories, like Niall Ferguson’s popular Virtual History, occupy a liminal space at the bookshop, lurking between serious scholarship and popular entertainment. By contrast, and especially, I daresay, in the United Kingdom, serious history for a general audience Must Offer Reasons Why.
It’s nothing new: American academic Richard Ned Lebow writes in his defense of the counterfactual, Forbidden Fruit, that historians since Thucydides have privileged underlying over immediate causes in determining events. But Lebow suggests that (in the West, at least), the First World War, in its unprecedented devastation – physical and psychic – shocked us into a kind of pathological century of avoidance issues.
If we can explain WWI as the (inevitable) result of the rise of the nation state, of the technology that sent Britain and Germany into an arms race, of a thousand and one Big Things Happening Over The Long Term, then we avoid the terrifying prospect that perhaps millions of people wouldn’t have died had Gavrilo Princip not killed Archduke Ferdinand one day in Sarajevo. Contingent causation is scary, because anything could happen at any time. Hence, Lebow’s odd title – knowledge of that contingency, once tasted, is terrifying, and there is no looking back.
What applies to telling the stories of the past applies to telling the stories of the present. Indeed, if we Americans can explain the disaster that is the last decade of war in the Middle East, blaming long term causes like sectarian violence, tribal patterns of life, or even medium term causes like 9/11 and the “war on terror,” a vast amount of responsibility is easily abnegated. How can any one individual, or any nation, be morally responsible for wrongs committed when the Angel of History is flying hard and fast? And so we – for the most part – keep our story simple: allowing a safe amount of critical reflection, to be sure, but never spending too long before the mirror.
In his most fanciful chapter, Lebow imagines a world where Mozart lived a bit longer and thus World War II didn’t occur. In short: a mature oeuvre from Mozart meant Romanticism never happened; with no Romanticism, European reforms of the mid-nineteenth century are not nationalist in character; without bellicose, nationalist Europeans – at the state and individual level – no Great War; and obviously, without a Great War, no WWII. At every inflection point, argues Lebow, lies another path not taken, another world potentially in existence. Quantum physicists, hopeless romantics, William James, small children, and, I suspect, many of us, know instinctively that this is true.
It is the political scientists and historians, the international relations theorists and practitioners, the elusive powers that be, who may need reminding. Counterfactuals, says Lebow, have relevance for those fields as well because immersing ourselves in the contingency of the past reminds us that “the future will once again defy prediction.”
Still, Lebow recognizes we’ve got to impose some sort of order, some sort of inevitably imperfect, problematic system or framework or (dreaded!) “lens” on the data around us, simply to get through the day. But the kind of stepping back Lebow proposes is too often neglected: by historians and, to return to the point, by journalists.
The historian as well as the journalist often fails to consider the “unknown unknowns”, the what-ifs, for lack of evidence or time or effort. There’s no reason that considering those other worlds might not lead us closer to the truth. Like the historian, even the best journalist has a hard time finding new routes through a story: even the best journalist is limited by her own contacts, her access, her time. And we – we the public, we the consumers of news – are limited even more: by our access, by our time, by our patience with reading lots of words or watching another 10 minutes of television.
We often don’t know how to take different routes through a story.
WikiLeaks, then, does not offer a rough draft of history. It will not “redefine” history any more than history should be redefined every day, if it is practiced by thinking persons. What WikiLeaks offers, in a way historically unprecedented (and entirely contingent on technology), is the raw data of history. Causation and contingency are laid before us: a quarter-million instances of American foreign policy being enacted, or 391,832 points of data that future historians will use to write the history of this war in Iraq. And from that raw data: new approaches to the truth, yes, but not truth itself.
WikiLeaks presents counterfactuals made up of “facts”: new routes through – and deeper into – stories we thought we knew (if we were even paying attention). The data they post remind us that the data we have are incomplete, and always will be incomplete; that every draft we write – from the first to the last – will be a borrador. History will not be, can never be, redefined – because it should resist definition and be constantly rewritten.
Tokyo, Almost-Encounters, and “Passing By”
After a long day of walking around Tokyo I often catch myself thinking, "Well, I guess today wasn't to be the day that I bump into her." Is it really so ridiculous to think that I might? Sure, it may be a city of nearly twelve million, but the odds of meeting my ex-girlfriend on the train or passing her on the street can't really be that low, can they? By my calculations, it’s an even fifty-fifty: either I see her or I don't. At least that's how it feels.
To Pass By
Once while browsing at the library, I came across a book that began with a dentist and a patient chatting during a minor medical procedure. The patient, if memory serves, was a professor of Chinese history. So where ya from? asks the doctor. China. What province? Szechuan. Ya know, the doctor chuckles, I only know one Chinese guy, a dentist from Szechuan. His name is X. D’ya happen to know him? Actually, says the astonished patient, that’s my uncle!
The author’s point was not that it’s a small world after all, but rather, that docs and profs really only move within the smallest slices of a rather large world. Nor is this phenomenon limited to cosmopolitan elites. When I used to drift around New York City, I would often see folks in MTA (Metropolitan Transportation Authority) uniforms, far from any train station or bus stop, greeting each other by first name: Hi Derrick. How’s it going there, Carroll? It’s true that for the MTA, city-streets behave as the office hallway, food trucks as the cafeteria, stations as cubicles; but still, shouldn’t these folks feel just the littlest surprise when running into each other inside this impossibly large office building? It would seem that city-space just operates differently for the transit authority than it does for those of us who merely pass through the city’s streets in transit. How it all works I can't presume to know.
Passing By in Tokyo
If chance encounters happen at all in Tokyo, they happen in the small slices; at the bike-shop, the record-store, a favorite watering hole. But for most of us, most of the time, Tokyo is a city of almost-encounters and near-misses, a city of shared space - shared not simultaneously, but by turns. It is a city defined by 'passing by.’
Every day, 3.55 million passengers are sent round and round Tokyo in a dizzying twenty-one mile subway loop called the Yamanote Line. The entire New York City Subway System, by comparison, spreads its 5.5 million daily riders out over thirty-two times as much track. The Yamanote Line is an extraordinarily busy little piece of real estate. At each of its twenty-six stops, passengers don’t exactly pass through the ticket-gates; instead, they are pressed out in batches, like loaves of bread through a bread-slicer. When you step onto this thronged train-line in the morning, it’s safe to assume that by day’s end, dozens upon dozens of friends, colleagues, past lovers, and future muses will have passed through the same small square inches of this immense city.
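A quick division of the figures above (my arithmetic, taking the thirty-two-times track ratio at face value) makes the contrast concrete:

```latex
\text{Yamanote: } \frac{3.55\times10^{6}\ \text{riders}}{21\ \text{miles}}
  \approx 169{,}000\ \text{riders per mile},
\qquad
\text{New York: } \frac{5.5\times10^{6}\ \text{riders}}{32 \times 21\ \text{miles}}
  \approx 8{,}200\ \text{riders per mile}.
```

Mile for mile, that is, the Yamanote loop carries roughly twenty times the ridership of the New York system.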
If we fail to see friends or acquaintances on our morning commute, this probably has less to do with the sheer size of the city holding us apart and more to do with the minuscule differences in time that manage to hold everything together. The faster each of the city’s individual parts moves over to make room for the next part to move in, the smaller the city as a whole can be. One could ask with a queer sense of seriousness whether we are passing each other by in Tokyo because we are all in a hurry, or if it’s really more that, were we not all moving so quickly, the city simply wouldn’t be able to hold us all in? Maybe it’s precisely because of the speed of the city and the possibilities opened up by passing-by that we all manage to fit within the city’s smallness without getting squashed to bits.
You come to learn all manner of things about your fellow commuters when you stay just outside Tokyo’s more bustling parts. Board the same car every morning for a few months, and before long, you notice when the elderly businessman with the bowler hat is not among the morning’s passengers. Maybe you even worry about him a little. He always appears to be looking out the window at the passing scenery, but you know he’s nearly blind - twice now you’ve seen him hold a book millimeters in front of squinting eyes, only to then lower it in mild defeat. You also know that the prim office lady with the big mole on her neck - (why doesn’t she get that removed?) - has trouble walking in high-heels. You quietly applaud her good judgment on days when she wears loafers and carries her heels in that small designer shopping bag of hers. Of the two phones she keeps on her lap, you’ve decided that the silver one must be for work; when texting on it, she hardly ever smiles, or at least not the way she does with the red one.
Move deeper into Tokyo and there’s just not enough room in the crowded cars for the peculiarities of daily passengers. The 7:06 may well be your train, the second-from-the-last may well be your car. And sitting across from you, day after day, may well be a nearly-blind businessman or a woman with a mole on her neck, but you’re likely never to know it. It’s not that, when in the city, one turns one’s attention too far inward, or that one becomes too self-concerned to concern oneself with others. It’s that, inside the Tokyo metro, the self becomes as much of an obstacle as the other; just one more thing to make sure doesn’t get in the way of all the other moving parts.
I often wonder how many love affairs in Tokyo might be traced back to a spilled purse, an almost-forgotten umbrella, inclement weather, or any of the thousands of other things that normally bring strangers into conversation, into a restaurant, and eventually into domestic partnership. Undoubtedly, not very many at all. I like to think that one could probably count the number, within a reasonable margin of error, on the fingers of one hand. Tokyoites are just too busy passing each other by to be drawn in by such things as happenstance, chance, and accident.
One would of course be hard-pressed to deny a certain expertness at work in the way Tokyo’s residents shuffle unconcernedly past injuries, altercations, and businessmen passed-out in front of train stations, but when all is said and done, I prefer to think that it’s not so much the practiced dismissals of strangers, but rather, the almost-encounters between colleagues, the near-misses between lovers, and the blind baton-passes of city-space from one friend to the next, that make Tokyo a city distinguished by 'passing-by.'
The other day, I witnessed the city and all its velocity come to a grinding halt just outside a Tokyo subway ticket-gate. A visibly embarrassed businessman stood there motionless, holding an empty coin purse. Around him, a small crowd of eight to ten commuters in business attire had sunk to the linoleum and were pecking about vigorously for scattered coins. It all looked uncannily like a film-reel of a man feeding pigeons being played in reverse. I thought to help, but in the end was too caught up watching, as, one by one, small kernels of metal were placed into the man's extended hand with nearly imperceptible bows.
When I related this scene to a colleague that evening on our way to the station, he volleyed back with a small collection of his own anecdotes, and then offered a provisional thesis to the effect that money might play by its own rules in the game of passing-by in Tokyo. It’s possible, it’s definitely possible, I said. We traded back and forth for a few short minutes about the curious nature of money, of exchange, of transaction, traffic, and of passing-by, and then each went our separate way.
Coda: Passing by with a Sneeze
I can't express how eerie it is to sneeze in a Japanese coffee shop. There’s no custom here encouraging seat-neighbors to chime in with a ‘bless you’ after you sneeze, and it’s about all a foreigner can do to restrain himself from offering up a friendly ‘gesundheit' in response to a high-pitched atchoo coming from the next table over. For the Japanese, a sneeze is a mere sound, an insignificant bodily function; something like scratching your arm, blowing your nose, or re-adjusting your shirt collar. It carries no value whatsoever. None. After a sneezer sneezes in Japan, she returns to her book, he to his coffee, they to their television-watching, all with such seamlessness of motion that not even the smallest opening remains for the well-wishing of a stranger. It’s in these moments after a sneeze that I sense most poignantly that Tokyo, down to its smallest detours and by-passes, is a city that thrives on ‘passing-by.’
Thomas Nozkowski. Untitled (8-135). 2010
Oil on linen on panel.
The Revolution Will Not Be PowerPointed
by Hasan Altaf
In a city like DC, the think tank circuit does a roaring trade in “developing countries,” their problems, and an endless list of ways to solve those problems and take those countries, to use the easy dichotomy of think tanks, from developing to developed. The hottest commodity on the market, lately, the dream subject of international development, is Pakistan. It has become the perfect laboratory for think tank experiments, a veritable Petri dish of everything that could go wrong and every possible way to imagine a solution; rarely does a week go by without a presentation or, at least, a visiting expert.
Recently, I went to one such presentation at a respected think tank in DC. It was raining, and people straggled in one by one, shucking off their overcoats and placing their umbrellas carefully under their chairs. The room filled up with suits and briefcases as the interns ran sound checks. The atmosphere, as the audience milled around waiting for the presentation to start, was so far removed from what we were there to hear about that it felt almost theatrical. It’s hard to tell, in such circumstances, whether you are part of the show or simply there to applaud – or, perhaps, both.
My usual instinct, out of not so much cynicism as sadness, is to avoid these events, and I had forgotten that their real purpose is never the one stated. The presentation is an afterthought, a sweetener. The real point is the social gathering, the first ten minutes and the last ten minutes. It was fascinating to watch the not-so-idle chitchat as people ran into old friends and found new ones, to overhear the experts trading war stories from their time “in the field.” People discovered friends or colleagues in common, experiences they had shared, times they had just barely missed each other. In a way, it was heartwarming. This world, of international development and the research that surrounds it, is a small one, and people stick together.
Somehow, though, I couldn’t shake the feeling that something was off. It felt as though the last century had disappeared and that all of a sudden we were back in the olden days. Coffee and bagels have replaced the gin and tonic, we have lunch meetings instead of chhota pegs and rounds of polo, and we wear business casual instead of whatever they wore back then, but the spirit of the thing is the same. These are safe spaces, as far as possible from the noise and the mess, for the agents of civilization – and, now, the native elites – to discuss and fix problems that are oceans away, to alter the courses of lives that are not their own.
When the presentation starts, when the expert speaks, it’s usually hard to argue with them. Who would question the idea that a country like Pakistan needs more schools, better teachers? Who would fight against more hospitals, more nurses, more security, stronger democracy? And the experts present their case so well, too. The PowerPoints are minimalist and non-distracting, organized bullet points with a few charts and graphs; everything is clear and easy to follow. We nod and smile and clap and feel hopeful. We ask the right questions, afterwards, and feel we have done some good. This time will be different. This time is always different.
And this is what strikes me as the real danger of these things. The problem isn’t what they say – it’s where they say it, how they say it. In the lecture rooms of a think tank, in a classroom at a university, over Starbucks in DC or London or Geneva, the diagnoses and recommendations make perfect sense. But once you take them out of the lecture hall, out into the world, the magic disappears. The sparkle doesn’t survive. We are left with books that look good on our bookshelves, that we can cite to increase both our credibility and theirs, to further the conversation.
The conversation is with ourselves. Think tanks, even more than the colonial clubs they have replaced, exist in an echo chamber, cut off from the world that they attempt to fix and the problems that they have made their business. Which is not to say they do nothing: They do a great deal. In the most benighted corner of the “developing world,” I am sure you would be able to find one excellent school with a great teacher, a miracle-working doctor in a well-stocked hospital, a working sewage system – and behind it there might be, somewhere, a think tank or an NGO.
The math, though, is hopelessly against them. For every success story, there are thousands of failures, and the overall systems are failing. The solutions that come out of think tanks treat symptoms, not diseases – which is fine, but if you treat symptoms long enough, and well enough, eventually your patient will believe that he is no longer sick. And the doctor will be convinced that he himself has performed the cure.
A country like Pakistan is full of problems – that we can all agree on. And those problems must be solved. But I can’t believe the solutions will come out of think tanks, at least as they are now. Think about it this way: We built a house, without a foundation, out of the most flimsy and flammable materials we could find, and then stocked it with gunpowder. Every time I go to one of these talks, it sounds like I am being told to walk softly and carry a fire extinguisher. Maybe I’ll follow that advice and maybe I won’t, but the house is still a danger.
If there is failure here, in the gap between excellent theories and miserable practice, it’s a failure of something like translation, and of vision, and of understanding. The think tanks fail to understand, for one thing, that something like “education in Pakistan” cannot be solved piecemeal. (If, under the best of circumstances, they educated a million more children, what jobs would those educated children perform? And what does “education” mean? How can “education in Pakistan” be fixed by more schools and better teachers, if part of the problem is what they’re teaching?) This is the treating of symptoms, a kind of cosmetic surgery that makes the sick man look healthy again while leaving him rotting from the inside.
What the rest of us fail to understand is that, like politicians whose main goal is reelection, the real project of think tanks is think tanks. “Developing countries” and their problems are useful, as objects of study, but the end project is the institution itself. Their primary responsibility is their own existence, their own relevance, and their own reputations – and “developing countries” are not the ones who judge them. They sit before a jury of their own peers. As do the rest of us. If the education system fails in Pakistan, it won’t be the think tanks that get blamed, and it won’t be their fault. It will be ours, and we will, hopefully, be held responsible.
When that particular talk was over, I picked up my umbrella and followed the crowd heading out into the rain, streaming back to offices at the World Bank, the aid agencies, the NGOs, with new papers in their briefcases and new contacts in their BlackBerrys. I didn’t feel particularly hopeful, and I can’t believe that they did, either. These people with their PowerPoints will not save us. It would be ridiculous for us to expect them to. Something important is missing from this equation, and neither the think tanks nor the countries that are think-tanked are providing that. Until that missing ingredient is found, that last step taken, each is basically talking to itself. Both, though, seem fine with that.
Of Mice and Memory
Years ago, neurobiologists discovered a way to visualize the structural dynamics of memory formation using just a laser, a microscope, and a window. To start, they opened the craniums of dozens of laboratory mice and surgically embedded tiny glass panels into the outer folds of the living, exposed brains.
The researchers specifically targeted an area of the brain known as the visual cortex; their goal was to define the relationship between vision and memory. These implanted bits of glass were to serve as physical windows to the branching, ductile neurons of the brain; when scanned by a laser, they would allow for capture of microscopic images of fluorescing neurons and provide a glimpse into the creation of memories.
In 2009, their laborious efforts paid off. Mark Hübener’s lab at the Max Planck Institute of Neurobiology reported in the January issue of the journal Nature that they had found a link between distinct neural growths and memories of past experiences. Through minuscule peepholes, Hübener’s team saw bud-like spines emerging from the branches of the brain’s neurons. These spines seemed to sprout most in response to new experiences, implicating them as the brain’s physical storage areas for memory.
Because Hübener’s work is fairly visual in nature, it’s easiest to begin with a mental picture of the brain. Let’s start by imagining its most basic component, the neuron, as a tree in winter, leafless with many branches, or dendrites. If the neuron is a tree, then the brain, quite simply, would be the forest where it resides. Now, if you can imagine that forest with one hundred billion trees densely packed into a space the size of a grapefruit, then you’ve got a basic idea of what the human brain looks like.
Not impressed? Each tree in your brain forest physically contacts the branches of thousands of other trees; in children, these contacts, or synapses, number a quadrillion; in adults, this number decreases and then stabilizes at a mere few hundred trillion. If synapses were dollars, we’d have enough money to pay for the Bush administration’s tax cuts... for two thousand years*.
So, what’s the purpose of all of these branching contacts? Synapses serve as conduits of communication between neurons: they allow information to race from dendritic branch to dendritic branch, relaying messages of sense, perception, reaction, and thought. But what about memory? Where are our recollections of past experiences stored among this vast network of neurons?
Much of what we know about experience-based memory comes from research in laboratory animals. In particular, a technique called monocular deprivation (MD) has been widely used to learn about the dynamic neural connections that link the eye with the brain. This technique, as the name suggests, involves blocking vision in one eye of an animal (picture a mouse with an eye-patch), and monitoring brain activity as the animal adapts to its new uni-vision.
Other than faint light perception, all stimulation through the covered eye is lost, silencing the once rapid conversation between eye and brain. After just a few days of deprivation, however, the brain acclimatizes and abandons the non-communicative neural branches of the sightless eye for those of the uncovered, seeing eye.
Why is this sensory deprivation technique useful? Forcing the brain to adapt to new experiences offers clues into memory formation. As we’ve learned from mice, once the eye-patch is removed, adult animals fully recover binocular vision, as if they’d never been blinded. However, if the eye is covered a second time, the brain seems to remember how it dealt with one-eyed vision in the past, and accelerates the shift in dominance to the uncovered eye. The easy return to standard two-eyed vision after MD belies lasting changes made in the animal’s neural architecture.
While it is relatively simple for an adult animal to switch between one and two-eyed vision, covering the eye of a young animal can result in irreversible blindness, even long after the eye-patch has been removed. Why are young animals so vulnerable to changes in sight, while adults can easily adjust? Let’s revisit our brain forest, and imagine that everything has been newly planted.
As saplings, trees are pliable, and easily manipulated by outside forces. However, the fleeting events of youth have long-term consequences, and something as trifling as a pebble in the soil can forever transform the structure of a new tree.
Similarly, because neuronal branches establish their connections during infancy and childhood, minor changes in sensory input during this time period can drastically alter the structure of the developing brain. As soon as one part of a young brain stops receiving messages, neighboring neurons encroach upon the vacant territory, competing for new connections that will persist into adulthood.
If our network of neurons is essentially fixed after childhood, how are we able to adapt to new experiences as adults? Why are mature animals able to ‘learn’ how to see out of one eye? Let’s get back to Hübener’s mice. Using the tiny brain-implanted windows, Hübener’s team chronicled the ebb and flow of minuscule dendritic branch protrusions after MD to uncover their role in memory storage. These small spines budded from the branches of neurons in response to MD and persisted long after the sightless eye was uncovered.
When the same eye was covered a second time, Hübener’s team saw no new spine formation, despite swift adaptation to monocular vision. They proposed that dendritic spines carry synapses and serve as local reservoirs of memory, allowing the brain to structurally adapt to new experiences while maintaining its overall neuronal organization.
Hübener’s team peered into the mouse brain and traced the structural framework for memory through its pathways. They were the first to link experience with dendritic spine formation, and confirmed that new experiences can actually change the physical layout of the brain.
We still don’t know how long these memory spines persist; this study only followed individual mice for two to three weeks. It’s possible that the loss of spines is connected with memory loss, or with neurodegenerative diseases. However, we do know that practice, or repetition, allows the brain to learn and adapt more quickly to new experiences, and that spine formation is fundamental to this type of experiential memory.
*The Treasury Department estimates (roughly) that the cost of a short-term extension of the Bush era tax cuts is between $200 billion and $500 billion. This estimate depends on how long the cuts are extended, and for whom. If I use the upper end of this estimate, and say that a one-year extension will cost $500 billion, then a quadrillion dollars would pay for two thousand years of tax cuts. Alternatively, if the tax cuts cost only $200 billion per year, a quadrillion dollars would cover five thousand years of tax cuts.
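For anyone who wants to check the footnote’s arithmetic, here is a quick sanity check in Python, using the rough Treasury figures quoted above (one dollar per synapse, and a one-year extension costing between $200 billion and $500 billion):

```python
synapse_dollars = 10**15  # roughly one quadrillion synapses, one dollar each

# Rough Treasury estimates quoted in the footnote for a one-year
# extension of the tax cuts: $200 billion to $500 billion.
years_at_high_cost = synapse_dollars // (500 * 10**9)
years_at_low_cost = synapse_dollars // (200 * 10**9)

print(years_at_high_cost)  # 2000 years at $500 billion per year
print(years_at_low_cost)   # 5000 years at $200 billion per year
```

Either way, the order of magnitude holds: a quadrillion dollars buys millennia of tax cuts.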
1. Alberts, B. (2002). Molecular biology of the cell. New York: Garland Science.
2. Drachman, D. A. (2005). Do we have brain to spare?. Neurology, 64(12), 2004-5.
3. Dräger, U. C. (1978). Observations on monocular deprivation in mice. Journal of Neurophysiology, 41(1), 28-42.
4. Hofer, S. B., Mrsic-Flogel, T. D., Bonhoeffer, T., & Hübener, M. (2006). Lifelong learning: Ocular dominance plasticity in mouse visual cortex. Current Opinion in Neurobiology, 16(4), 451-9.
5. Hofer, S. B., Mrsic-Flogel, T. D., Bonhoeffer, T., & Hübener, M. (2009). Experience leaves a lasting structural trace in cortical circuits. Nature, 457(7227), 313-7.
Where do our rights come from?
In the wake of Republican defeat in the 2008 election, conservatives started casting about for a new standard-bearer. One name which then resurfaced was that of Newt Gingrich, the former Speaker of the House of Representatives. A conservative firebrand during his Congressional days, Gingrich had reinvented himself as a pragmatic innovator, pushing high-tech solutions for our continuing dependence on fossil fuels. However, as we’ve seen from his subsequent output, he's still the same old culture warrior in other ways. Here he is in a 2006 interview, discussing his then-recent book The Creator’s Gifts: Life, Liberty and the Pursuit of Happiness: “[I]n the minds of Benjamin Franklin, Thomas Jefferson, John Adams and the people who wrote that document, they literally meant that your rights come from God, that you then loan them to the government, which is why the Declaration of Independence begins ‘We the people…’. And therefore if we drive God out of the public square we drive out the source of our own rights and our own source of power.”
Of course, it's the Constitution, not the Declaration, which begins “We the people...”; but anyone, even a history Ph.D., can misspeak in an interview. The important point is this conception of the “creator's gifts” and their significance. Alan Keyes, whom Barack Obama defeated in their 2004 Senate contest, strongly endorsed the same idea during his own presidential run. What should we make of the idea that our rights “come from God”?
This idea of rights given by God is the conceptual flip side of duties imposed by God: any right possessed by A is ipso facto a duty imposed on B not to violate that right. This latter idea has traditionally provoked the question of whether morality should, or even can, be identified with divine command. The paradox of this account of morality, first discussed 2500 years ago in Plato's Euthyphro, is brought out by this question: Is something the right thing to do because God orders it, or does God order it because it's the right thing to do? The second answer simply abandons the divine command theory, but the first answer isn't any better. It requires us to say why something we know to be wrong – say, torturing the innocent – would not thereby be made right if God happened to demand it. One natural answer is that God, being ideally good, wouldn't actually do that; but now we are explaining morality in terms of God's ideally good nature, and not in terms of divine command after all.
This doesn't mean that divine commands can’t be an authoritative guide to morality, but it should make us look again at the idea of rights which are bestowed by divine fiat. This is especially true in the context of the Declaration of Independence and its aims. In its first sentence, the authors claim the right of a people to “dissolve the political bands which have connected them with another,” but then acknowledge their obligation to declare their reasons for so doing. Much of the subsequent text is a list of charges against the King of England, particular cases in which he has abridged their rights. However, these abridgements could not count as just reasons for rebellion if, as monarchists claimed, those rights depended on or had been bestowed by the king himself. The second sentence of the Declaration is therefore concerned to reject this idea.
“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are Life, Liberty, and the pursuit of Happiness.” The chosen wording of this point is exactly as we should expect, given its context and purpose. We have rights not because of some external agent who has given them to us, but by nature. Human beings are essentially entitled to their rights, simply in being human beings at all. Nothing further is required.
The Founders’ religious beliefs ranged from relatively orthodox Christianity to typical 18th-century deism. Although many rejected the idea of divine revelation or intervention, they lacked an evolutionary account of human origins and thus naturally assumed that a deity of some sort had created humanity. What we are by nature, then, is ultimately due to the actions of that which they sometimes called “Nature’s God.” Their concern, however, was not with the properties of that creator – about which they of course disagreed – but instead with the human result of that creation.
That result was a race of beings to whom nothing need be given for them to be naturally free. True, the word “endowed,” in its original meaning, referred specifically to a literal gift (like a “dowry”). However, its figurative use, in which it refers to qualities which one has by nature, was well established by 1776; the Oxford English Dictionary lists references going back to the 1400s. It’s a good choice – forceful, eloquent, and yet doctrinally non-committal between deism and traditional theism, all while preserving the essential point on which the undersigned are in complete agreement.
Gingrich and Keyes miss this point entirely. On their account, the Founders were accepting the original idea that our rights are bestowed on us by an external agent, and simply changing that agent from king to deity, thereby trumping man with God (and of course you need this if your conclusion is that "if we drive God out of the public square we drive out the source of our own rights and our own source of power”). But that can’t be right. For one thing, concerned as they were with avoiding the sectarian violence still raging in Europe at the time, the Founders would hardly hold such a metaphysically contentious idea to be “self-evident.” Indeed, on the Founders’ view that idea is barely coherent. It assumes without argument that making us human and bestowing rights upon us are conceptually separable things, as if it were possible for us to be fully human, but without the right to liberty until it was subsequently bestowed upon us by a wave of His mighty hand.
This is exactly what the Declaration denies. To be human just is to have the rights in question. They depend on nothing external to us, natural or supernatural. That's what the point's “self-evidence” is meant to emphasize: that having rights is (metaphysically) necessary for being human. We're all human, so self-evidently we all have rights, and equal rights at that. However, it also means that no argument for that conceptual claim will be forthcoming. The Declaration is not a philosophical treatise, but a political manifesto. It doesn't say that the authors couldn't give an argument for that claim, but it does indicate the kind of argument that it would be if they did give one.
That argument would not be an appeal to Scripture. Not only were the Founders divided on the issue of divine revelation in the first place, but such an appeal, again, could only be the very opposite of self-evident. As revelation, Scripture is meant to be informative. Although Paul does wax philosophical in his letters, in general the Bible is no more a treatise than is the Declaration. It is an account of what purport to be contingent matters of fact: God did this, the Israelites did that, Jesus of Nazareth did other things still. Any rigorous argument the Founders would give for the conceptual interdependence of humanity and human rights would be concerned not with particular historical events, even supernatural ones, but instead with (as Immanuel Kant would say at about the same time, in a somewhat different context) “necessity and universality,” the marks of the a priori.
Universality is the characteristic Enlightenment obsession, as clearly manifested in our nation's founding documents. One can only imagine how the Founders would respond to Gingrich's and Keyes's attempts to claim these documents and their ideas as reinforcing the narrow sectarianism they strove above all to avoid. Though signed by and agreed to by Christians, the Declaration is not a specifically Christian document. It is, rather, the cultural inheritance of all Americans.
When I first wrote this piece, I had to wrap up my discussion of the Euthyphro dilemma pretty quickly. I have more space here, so let me say a bit more about it now, even if it’s just nitpicking and qualification (i.e. the good part’s over). Clearly the defender of the divine command theory must choose the first horn of the dilemma: that something is the right thing to do because God orders it, not the other way around. (Actually Euthyphro discusses “holiness,” not morality, but the difference has traditionally been taken not to matter, as the same issue arises in each case.) As I mentioned, this looks funny, as it is hard to understand how something could be made right or wrong simply by our being informed, even by God, that it is so. It seems that as a theory of morality, the “command” part of “divine command” still isn’t doing any work.
Even if morality cannot simply be identified with divine command, though (and in fact the divine command theorist has a few more things he can try here), those who believe that it is morally obligatory to follow God’s commands are not thereby required to give that belief up. However, nowadays the issue of identifying morality with divine command tends to come up only when the context is one of whether morality has anything to do with God at all, where both sides assume that the latter claim requires the former. Naturalists are willing to go along with this, as they believe that Euthyphro provides an easy refutation, suggesting that the religious are 2500 years behind the times; while believers in what they like to call “transcendent reality,” taking their very faith to depend on it, hold onto divine command like grim death (even going so far, as we have seen, as to attribute a closely related idea to Benjamin Franklin and Thomas Jefferson). The resulting shouting match is rarely enlightening, often featuring such facile slogans as “His universe, His rules” or some such.
Again, though, it seems that things cannot be so simple. Even if divine authority cannot be constitutive of morality, surely the (conceptual) possibility remains that, due to his ideally good and/or omniscient nature, God can be a perfectly good source of knowledge about moral truths, and that he has in fact revealed them to us in this or that set of holy writings. That is, even if creation need not entail the relevant sense of moral authority, any omniscient agent surely possesses epistemic authority about moral matters. However, to show the application of that idea of course requires further argument; and this we are unlikely to see from the likes of Gingrich and Keyes.
Should We Fear Fear Itself?
People are worried about the Euro. As bad news flows out of Europe – persistent unemployment, popular discontent over painful austerity measures, and catastrophic bank losses tied to still-deflating real estate markets – international investors continue to cast doubt on the Euro-zone’s short- and long-term stability. Fear of at least a partial disintegration of the monetary union is rampant. Indeed, Morgan Stanley recently released the results of a survey of 150 of its clients; while only 3 percent of the investors thought there was more than a 60 percent chance that the Euro-zone will break up, three-quarters of the respondents thought there was some probability of a breakup. These statistics raise a double concern. First is the fear that this nightmare scenario will come to pass, an unprecedented event that could fatally wound investor confidence in the Euro, potentially eliminating its viability as a secure store of value. Second, one might fear this fear itself, as these investors’ worries might contribute to their own realization.
Financiers justify the distinctive double movement of the last several decades by arguing that markets are efficient. Neither the proliferation of capital markets nor the wearing away of regulations on them would be legitimate cause for concern if markets could be counted on to allocate capital to the areas of the economy that deserve it. Yet this period’s continual booms, busts, and crises provide a substantial and ever-increasing body of evidence that these supposedly rational capital markets are, in fact, anything but. As much as Florida’s decaying, uninhabited subdivisions attest to the dangers of irrational exuberance, Ireland’s swaths of unsold houses and imploding, too-big-to-fail banks attest to the power of expectations. They demonstrate that rather than allocating resources on the basis of soberly considered “economic fundamentals,” capital markets have a stubborn tendency to synthesize their own realities from the grist of investors’ expectations.
Consider how the now-familiar contagion of financial crisis replays itself in each of the PIIGS (Portugal, Italy, Ireland, Greece, Spain), adding to the uncertainty about the Euro-zone’s continued cohesion. First, the hard slap of reality deflates a convenient and officially supported untruth, swamping the government’s budget with red ink. In Greece, it was Prime Minister George A. Papandreou’s acknowledgement that his predecessor had been hiding massive government obligations in complex financial instruments, cleverly designed by the whizzes at Goldman Sachs. Ireland’s difficulties stem from the painful popping of its massive real estate bubble, which hit its banking sector with losses so big they overwhelmed the Irish state’s aggressive bailout attempts. In both cases, deficits quickly piled up and investors started to worry that the governments might default on their sovereign debts.
This is when a new and dangerous set of self-reinforcing expectations took hold of the situation. To appease investors worried about the riskiness of their debts, the Greek and Irish governments had to offer higher yields on their bonds. Perversely, increasing the cost of servicing their sovereign debt further strained their budgets and made defaults more likely. This, in turn, made international investors more cautious about lending to these governments, necessitating still higher bond yields. Worse, the more concerned investors become about any one of the PIIGS, the more likely it is that the contagion will spread to the other fragile countries gorging themselves at the trough of international capital markets.
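The feedback loop just described can be sketched as a toy simulation. To be clear, every parameter here is invented for illustration; nothing is calibrated to Greek or Irish data:

```python
def yield_spiral(debt=1.0, revenue=0.04, base_yield=0.05,
                 risk_sensitivity=2.0, steps=6):
    """Toy model of self-reinforcing sovereign-debt expectations.

    Higher perceived default risk raises the yield investors demand,
    which raises the interest bill, which widens the deficit and
    hence the perceived risk. All parameters are illustrative only.
    """
    yields = [base_yield]
    for _ in range(steps):
        interest_bill = debt * yields[-1]          # annual debt service
        deficit = max(interest_bill - revenue, 0.0)  # shortfall vs. revenue
        yields.append(base_yield + risk_sensitivity * deficit)
    return yields

# With these made-up numbers the required yield ratchets upward
# each round instead of settling at a stable level.
print([round(y, 3) for y in yield_spiral()])
```

The point of the toy model is only that, once the interest bill exceeds revenue, each round of investor caution raises the next round's borrowing cost, so the spiral feeds on itself without any change in the underlying economy.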
This self-reinforcing cycle was hardly unpredictable. Indeed, John Maynard Keynes’ General Theory of Employment, Interest, and Money is, at root, a treatise on the power our expectations have over anything like economic “reality.” Under ideal market conditions, there should be a diversity of expectations about any particular economic question. A mix of bulls and bears ensures that every seller can find a buyer without reducing his assets to fire-sale prices. Yet the patchwork of institutions with the power to influence investors does not always foster this necessary diversity of sentiment. While the International Monetary Fund and the European Central Bank designed massive rescue packages to shore up confidence in Ireland’s and Greece’s ability to repay their debts, other institutions function to exacerbate financial crises. For example, credit-rating agencies such as Standard & Poor’s and Moody’s have continually downgraded the sovereign debt of countries with troubled balance sheets, providing official reinforcement to the feedback loops threatening the PIIGS. Every rating downgrade unifies global expectations, encouraging investors to bet against these countries all at once, deepening the problem.
There is a dark irony to the credit-rating agencies’ conservative pronouncements as it was only a few years ago that they were bestowing their highest grades on now-toxic mortgage-backed securities. How is it that these ratings agencies calculated that the Republic of Ireland is a substantially riskier investment than a collection of no-money-down mortgages from Tampa? Credit-rating agencies play an important role in the lightly regulated global economy; experts at these companies are supposed to judge the long-term stability of various assets, from sovereign debt to collateralized debt obligations. Their judgments form the basis on which investment firms determine their exposures to risk. Although existing regulatory regimes assume that credit ratings are sober evaluations of assets’ fundamental strengths and weaknesses, in practice, raters are frequently caught up in the same illusions as traders, since they both subscribe to the same theoretical orthodoxies and use similar models. The result is profoundly destabilizing: Instead of pressuring traders to reconsider the Street’s conventional expectations, credit ratings “confirm” the validity of traders’ bets, both inflating bubbles with a false sense of security and violently popping them once feelings start to turn sour.
This myopic logic, which takes the prevailing expectations about an asset to be more important than its “real” strength, usually works, at least in the short term. Both open exchanges, such as the NYSE, and private financial companies’ “market making” activities increase the allure of certain assets by making them appear more liquid. The liquidity that normally functioning markets provide allows professional investors to restrict their attentions to the short term, decreasing the relevance of long-term growth potential or solvency. Keynes compares markets dominated by professional investors to
those newspaper competitions in which the competitors have to pick out the six prettiest faces from a hundred photographs, the prize being awarded to the competitor whose choice most nearly corresponds to the average preferences of the competitors as a whole; so that each competitor has to pick, not those faces which he himself finds prettiest, but those which he thinks likeliest to catch the fancy of the other competitors, all of whom are looking at the problem from the same point of view. It is not a case of choosing those which, to the best of one’s judgment, are really the prettiest, nor even those which average opinion genuinely thinks the prettiest. We have reached the third degree where we devote our intelligences to anticipating what average opinion expects the average opinion to be. And there are some, I believe, who practice the fourth, fifth and higher degrees.
In supposedly liquid markets, the optimal investment strategies do move to these higher levels of analysis, further from a strategy based on fundamental economic value. Indeed, studies indicate that markets behave quite differently depending on how many degrees of expectations agents consider. In general, “it is not the case that the average expectation today of the average expectation tomorrow of future payoffs is equal to the average expectation of future payoffs.” Rather than stabilizing, these markets tend to accelerate; economic models suggest that asset bubbles and crashes are natural features of such self-reinforcing market dynamics.
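Keynes’s “degrees” can be made concrete with the p-beauty-contest game from experimental economics, a standard stylization of his newspaper competition rather than anything the passage above specifies: each player guesses a number in [0, 100], and the winner is whoever comes closest to p times the average guess. A “level-0” player guesses the naive focal point of 50; a level-k player best-responds to a population of level-(k-1) players:

```python
def level_k_guess(k, p=2/3, focal=50.0):
    # Level 0 guesses the naive focal point; level k guesses p times
    # what it expects a population of level-(k-1) players to guess.
    return focal * p ** k

for k in range(5):
    print(k, round(level_k_guess(k), 2))
# Each added "degree" of anticipating others' expectations shrinks the
# guess toward 0, the game's only equilibrium, and moves the strategy
# further from any notion of fundamental value.
```

Laboratory versions of this game find most real players reasoning only one to three levels deep, which is exactly the territory of Keynes’s third and fourth degrees.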
Despite the obvious risks of betting big on something as ephemeral as popular sentiment, traders make billions in management fees by convincing clients that they can consistently beat the gun. On the upsides of bubbles, this strategy easily pays off, usually beating the returns of investment strategies based on economic fundamentals. However, as soon as expectations turn against a particular asset, the process works in reverse and huge losses can wipe out years of hard-won gains. During these crashes, investors’ bearish expectations synchronize, causing liquidity to dry up as sellers drastically outnumber buyers. Indeed, Keynes maintains that “of the maxims of orthodox finance none, surely, is more anti-social than the fetish of liquidity, [since] it forgets that there is no such thing as liquidity of investment for the community as a whole.” Despite traders’ efforts to insulate themselves from risk with credit default swaps and other complicated hedging strategies, someone will be left holding the bag, full of near-worthless assets. If worries are widespread, rather than restricted to a particular security, this fear can spread from one asset to another as investors try to unwind their risky positions before their competitors. Thus, trying to “beat the gun” on the downside by anticipating others’ expectations of the average expectation about the future not only accelerates the fire-sale in progress but also spreads the contagion to other, unrelated assets. This is why the Portuguese government is watching the Irish situation so closely: Portugal knows it’s next in line.
In the wake of financial crises, it’s common for financiers to renounce the riskiest, most highly leveraged investment strategies in favor of something more “traditional.” Indeed, a long-time Wall Street veteran now dying of brain cancer, Gordon Murray, recently released a book condemning the entire concept of active money management. People would do better over the long term pinning their portfolios to an index like the S&P 500, he claims, than paying high-priced traders to try to beat the market. While some people burned by the recent crises may heed such advice for a time, history suggests that this conservatism will last only as long as the painful memories retain their sting. That moment has likely already passed. So long as traders can convince the holders of international capital that they can beat their competitors to the next big thing, we would be well-served to recall FDR’s first inaugural address. Then, as now, negative speculation threatened banks, markets, and political stability. Fear itself might not be the only thing we have to fear today, but it’s certainly one of the biggest.
On the way to Elderidge street to view Saul’s work on, well, as later we discuss at the gallery, over beer and wine and under fluorescent light: Man’s Hegemony--Anyway—On the way there, changing from the One to the D to meet Ali at 42nd street, speeding, hurtling downtown ---reading poetry, MTA’s overhead, that is, and surrounded by every type of light and commotion and the train’s rattling —pounding out the question it seemed to me--What is slow and what is night? Over and over again and again the same sound. What is slow and what is night? Then, came to mind another town. There, in that quiet and hush in dark’s first blush an inkling of slow and of night when the sun goes down. Then, a single naked bulb, lit and dangling from a wire, each, in each shack teases beauty through a quivering tungsten tongue murmuring of a still and separate universe. In the near black out town, of Jijiga when the sun goes down every shack and dozens more begin to glow and seem to unwrap themselves like tiny gift boxes, in rows upon rows. And just beyond and out of reach how indeed a sensation begins to grow as the stars too, descend to take dominion over the earth. Every evening, in Jijiga’s night and slow, every shack unfolds its own magical show. Which illuminated, so, each a miniature carefully curated. Colored tarps stand in as walls in emerald, turquoise, orange, blue along the narrow Jijiga dirt roads. Day light’s hovels by night transformed as though each one a story’s page. Or each one a framed painting —Or, perhaps, each a window onto something else. Or each a separate theater set or a stage. There, look, just beyond the glow a silhouette of a warrior, in shadow, chewing chat taking a rest from the undeclared war for his oil and gas. For which the script directs that he will lose to a foe, a stronger and a most unwelcome guest, sneaking, slinking and slithering in. A year, perhaps, to photograph all this? Another two perhaps to write it all down? 
And then to sketch and paint it too—for display in some far away space and perhaps to win accolades? On the way to Elderidge street to view Saul’s work on, well as we discuss later at the gallery, over beer and wine and under fluorescent light: Man’s Hegemony--Anyway—On the way there, changing from the One To the D to meet Ali at 42nd street, speeding, hurtling downtown surrounded by every type of light and sound I thinking of what is slow and what is night: In Jijiga’s quiet and hushed streets a single naked bulb, lit and dangling from a wire, each, in each shack teases beauty through a quivering tungsten tongue murmuring verse of a still and separate universe.
Lions and Hyenas
No I never heard them fight,
The lions and hyenas late at night
I guess I slept too well in Jijiga.
Gathered there as we were,
With all the feigned piety involved for
Migrant workers at a high price
Exchanging astonishments for all the places in
Last promotions, last stations
You too? No! In Dushanbe?
1996 and 2004?
Kabul—2002 through six months ago?
Well of course!
And now Jijiga!
What are the odds of that?
No I never--- heard them fight
The lions and Hyenas late at night
I guess I slept too well in Jijiga
The Woman of Jijiga
They all chat here about one fact here
That, you own, your own aero plane
You’re the big boss here,
The main dame here.
Everyone is in flight here
High as a kite here
Because of the business you are in.
You’re the one who’s made it big here.
You are, the richest, here,
The self made woman here.
From street corner injera to taking them to the stars.
Everyone here, flies all day here
Everyone’s high here
Because of the business you are in.
From the streets to the stars
Why sell injera
When there are leaves to chew?
Dawn to dusk here they fly here
Everyone is high here
Because of the business you are in
They all chat here, about the fact here
You own, your own aero plane
Everyone is in flight here
High as a kite here
Because of the business you are in.
In Asabetaferi on our way
We stop at a bar.
Two goats walk in:
One black with a beard
The other’s white and without.
The barman tell us some facts
That these two animals never part
Where goes the black one, the white one follows.
Anyway on the way to Jijiga
In Asabetaferi we stop at a bar
And two goats walk in
The black one drinks only beer
And is called Osama Bin Laden
The white one, you guessed that,
Is called George Bush
And chews only chat.
Two goats walk in.
The Trouble with Models
The economy, political events and even the sun’s course have converged to make these bleak and darkening days for many of the world’s developed nations, and certainly for America. What we need is expert and effective guidance on the impact of policies and programs. What we get is a cacophony of conflicting, often incoherent, ill-informed just-so stories backed by some combination of intuition, self-interest, resentment, herd thinking, natural and social scientific theory, and cherry-picked statistics. The modern social sciences in particular, which had as their mandate and their promise to guide us in times like these, have often become simply another part of the problem, providing dueling experts for hire with dubious track records. That is, when they are not busy generating results that are completely irrelevant to real life practical problems.
How has this happened? We can blame human nature or ideological corruption, but I think it’s time to come to terms with the fact that one of the central activities of social science is a fool’s errand, because a core assumption that underlies it is wrong. That central activity is to create mathematical models that explain social phenomena by identifying and measuring a limited set of contributing causes. This is done using statistical tests for significance, explanatory power, accuracy and reliability (the p-values, F-tests, confidence intervals, factor analyses and so on). With minor variations, this is what the “scientific” work of political science, sociology, educational theory and social psychology consists in. Doing this is what it takes to get published in major journals and achieve tenure at major universities. Even economics uses these and related statistical methods, when it stops being social metaphysics and decides to get dirty with evidence. A core assumption that underlies this work is that there are unchanging relationships between the variables that can be identified in causal models.
Despite millions of hours of effort, the inconvenient truth is that there is not a single non-controversial quantitative model in the social sciences. I don't mean a qualitative model that reformulates a truism, or one logically derived from prior assumptions. Nor am I referring to a mere statistical snapshot with no claim to durability (though vast numbers of these too are contested). I mean a robust causal model, with dependent and independent variables, with measured and fixed coefficients giving the relative influence of the independent variables on the result, and successfully applied with precision beyond the test scenario to a wider range of cases. The sort of thing that litters the natural sciences like bones on a particularly grisly battlefield, allowing experts to build hydroelectric dams, synthetic organisms and Xbox 360s by exploiting precise and unchanging mathematical relationships. If there is such a quantitative model (or, one hardly dares utter the word, "law") in the social sciences, I have not seen it. I'm willing to bet that neither have you.
That’s an embarrassment for the obvious reason that social scientists want to know, at a high level, how to explain, predict and shape human events. Since it may seem that I am overstating the case, let me take a moment to agree that we are often able to explain, predict and shape human events without the use of social science. We are not social idiots colliding into each other blindly. But we do need help, lots of it, to understand the effects of interventions large and small. In understanding these interventions, I’m also happy to agree that statistical correlations can help us to see new patterns and sometimes avoid believing stupid things. So the question is really this: given all that we can do without unchanging quantitative causal models of human behavior, is the quest to find such models largely an empty exercise? Or to express it with more exasperation, is it some combination of fraud and farce?
It isn't as though similar embarrassing thoughts have never occurred to anyone before. But the many dozens of criticisms of the methodologies have had all too little impact on the institutions of social science and the organizations that apply social research. The faith in economic theories of efficient markets and models of asset pricing that directly led to the financial market collapse and global recession is just the most painful recent case in point.
The problem is not just the complexity of social phenomena, or bias in research. If complexity alone were the problem, we could dig down to the bedrock and identify the fundamental laws or model parameters, even if we couldn't reliably predict the behavior of large systems. In fact, it's just this faith in a discoverable bedrock that sustains research programs. But there is no such bedrock in the social sciences. Everything we measure shifts. The reason no mathematical constants have been found in the social sciences is not that they are harder to find, but that there aren't any to be found. Human beings are neither rational agents in the sense of homo economicus, nor fully irrational or automatic agents perpetually stuck in defined response patterns. We are mildly rational and reasons-responsive, if not at every moment, then when we are alerted or primed to pay attention to our interests and what may or may not achieve them (and no, I have no mathematical model for that).
Being mildly rational implies that if someone observes a stable pattern in our behavior and tries to exploit it, we, either individually or collectively, have the ability to stop exhibiting the pattern, so long as we are responsive to our interests. That's a deeper point than it may at first appear, because any generalization of social science that could be treated as a fixed quantitative relationship to be used for prediction and control, thereby exploiting the individuals concerned, runs into this problem. Rational agents, once aware of an attempt to use their own behavior against their perceived best interests, or aware that by changing their behavior they can improve their situation, will change their behavior or cease to be rational, since by continuing to act the same way they would knowingly be acting against their interests. Thus, rational agents must break whatever modeled relationship is being exploited.
This insight emerged from Lucas’s critique of economic policy, but even stripped of his strong assumptions about human rationality it retains much of its force. Economic cases are the most obvious: if a hedge fund discovers a regularity in the market and seeks to exploit it, it may succeed for a while in doing so, but as others catch on, they change their trading patterns and the game is soon up. IQ tests, no matter how well-designed to reveal innate, unchangeable intelligence by measuring behavioral responses, can be gamed. The point holds for any limited set of operationalized variables in a social or behavioral model of action. If you don't agree, try to find a counterexample and you’ll get a sense of the futility. If you think you’ve got one, by all means share it in the comments.
There is wisdom here that has even been formulated in “laws,” which are promptly ignored:
Campbell’s law: The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.
Goodhart's law: Any observed statistical [social] regularity will tend to collapse once pressure is placed upon it for control purposes.
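Goodhart's law lends itself to a toy simulation. In this sketch (entirely invented for illustration – the "market", the noise levels and the rate at which exploiters pile in are all assumptions), a signal reliably predicts a price move until traders begin to act on it, at which point the measured relationship collapses:

```python
import random

def pearson(pairs):
    """Sample correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x, _ in pairs)
    vy = sum((y - my) ** 2 for _, y in pairs)
    return cov / (vx * vy) ** 0.5

def simulate(rounds=2000, exploit_after=1000, seed=0):
    """A signal drives a price move until traders start exploiting
    the pattern; each new exploiter front-runs it, eroding the very
    relationship the model measured."""
    rng = random.Random(seed)
    before, after = [], []
    exploiters = 0.0  # fraction of traders acting on the pattern
    for t in range(rounds):
        signal = rng.gauss(0, 1)
        strength = max(0.0, 1.0 - exploiters)  # decays as word spreads
        move = strength * signal + rng.gauss(0, 1)  # plus unrelated noise
        (before if t < exploit_after else after).append((signal, move))
        if t >= exploit_after:
            exploiters = min(1.0, exploiters + 0.01)
    return pearson(before), pearson(after)
```

Run it and the signal–move correlation is strong over the first thousand rounds and far weaker over the second: the regularity holds only so long as no one uses it for control.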
So what is the lesson? A good rule of thumb is that we are slippery subjects for a model-building science to the extent that we are rational or even quasi-rational, and tractable to science to the extent that we are stuck in our ways despite wanting to change. Of course, none of this is to say that correlations are useless, or that qualitative causal models can't help people think through the implications of their commitments better than untutored intuition. But the impact of variability combined with complexity should be made a more explicit part of social scientific work. At a minimum, every theorist engaged in the statistical model-building enterprise should ask whether common knowledge of the measured relationship between variables would tend to strengthen it, break it, or neither. They should then ask whether employing the model as part of an economic, educational or other intervention will tend to sustain or break it. The presumption should be that using it will tend to break it, unless there is strong evidence to the contrary. Having asked themselves these questions, authors should make the answers part of the published paper. Conscientiously applying these principles will not solve all the problems that the social sciences have in advising policy makers, but it would be a start.
BAPTISM BY SODA: A look at the Juggalo Phenomenon
As legions of cool-hunters trawling the Internet and the bohemian sectors of cities are well aware, selling youth culture is not exactly hard, given that most subcultures are spun around a kernel of hoodlum glamour and guilt-free rutting. Youth movements are appropriated and neutralized at such a ferocious rate that no musical subculture since gangsta rap has been subversive enough to resist the American mainstream for long, save for one: the juggalos (sic) – the off-brand-soda-flinging, hatchet-wielding followers of the Insane Clown Posse.
Insane Clown Posse are a band of forty-somethings who sing threatening lyrics set to menacing music, all while wearing full circus makeup. The ICP aesthetic is flabbergasting in its audacity and tastelessness – picture jarringly artificial colors appropriated from energy drinks and candy wrappers, loose prison garb and a symbology incorporating death's heads, jesters and improvised weaponry. Their lyrics hew close to contemporary gangsta rap and other so-called hardcore acts, using a lot of profanity to describe violence and sex with a certain swaggering bravado. Every now and then they'll put out a slower, softer, more soulful ballad that often digs into the band's philosophy.
Humble-born youth have always managed to appall their elders and betters, but juggalos are so spectacularly unappealing to mainstream tastes that in the eighteen years the band has been playing for major audiences they have remained a cult phenomenon. This is despite albums that have gone platinum and gatherings that attract tens of thousands of teenage fans clutching wads of disposable cash. ICP has such a hold on its fans that juggalos have actually been buried in the band's colors. Juggalos are more a collective than anything else; they refer to themselves as "Family" and defend their lifestyles fiercely.
So why can't the clowns cash in? Gangsta rap was arguably more of a threat to mainstream America than the Insane Clown Posse, with lyrics as fierce and misogynistic, and an added element of racial outrage and shock. Aside from the purely aesthetic (Blender famously voted Insane Clown Posse the worst band in any genre in 2004, and reviews in general have tended to be sharply critical), last month came a clue as to how the clowns maintain such a hold on their fans while paradoxically repelling mainstream consumer culture.
One of ICP's slow ballads – Miracles – became a viral hit on YouTube last month. Miracles is a sincere refutation of a scientific explanation of the world. One line in particular snagged the world's attention: "Water, fire, air and dirt, Fucking magnets, how do they work?" Magnetism being a staple of primary-school science education, the line struck many casual listeners as spectacularly ignorant. Then there was the infantile rhyming scheme, and other lines too, such as, "I fed a fish to pelican in Frisco bay, he tried to eat my cellphone and ran away," or, as The Guardian noted, the climax of the song, "and I don't wanna talk to a scientist/Y'all motherfuckers lying, and getting me pissed." Soon parodies were appearing on Saturday Night Live and all over the Internet. (Compounding the outrage was that only a few months before, Tila Tequila, a bisexual performer and reality-television personality, had been forced offstage at one of the Insane Clown Posse's gatherings in a hail of rocks, jeers and, allegedly, urine and feces from portable toilets.)
Part of the appeal of Insane Clown Posse has been the Dark Carnival, their peculiar quasi-mysticism. This is a loose collection of beliefs that boils down to a sort of karmic confrontation in the afterlife: one must confront one's "ringmaster" before being deemed worthy of Heaven or Hell. This enforces and cements the juggalo community – putting one's "Family" above one's selfish desires, eschewing pedophilia and rape, and so on. If the Dark Carnival's philosophy has a familiar ring to it, it should. During an interview with The Guardian, band frontman Violent J was asked about Miracles and another song, Thy Unveiling (lyrics below), which unambiguously describes their marauding music as veiled Evangelical Christianity.
F*ck it, we got to tell.
All secrets will now be told
No more hidden messages…
Truth is we follow GOD!!!
We've always been behind him
The carnival is GOD
And may all Juggalos find him
We're not sorry if we tricked you.
Insane Clown Posse have backpedaled slightly from this revelation, claiming the news wasn’t news at all, pointing to the liner notes of their 2002 album The Wraith: Shangri-La which state: “The Dark Carnival is God… We’ve always followed God… we want all juggalos to find him.” They also repeatedly underlined the difference between believing in God and Christianity in subsequent interviews. But regardless of whether or not the aforementioned “hidden messages” expect juggalos to accept Jesus Christ as their Lord and Savior, the posthumous judgment, the evangelism and existence of heaven and hell – the Dark Carnival is a Christian-flavored system of beliefs.
Subcultures are reactive. Purists will nitpick at this, but on some level dark and moody Goth music has persisted as a sort of aesthetic revulsion to the pastel cheer of America's sunnier states. Goth began in Los Angeles and San Francisco in the 1970s, and nourished itself and grew from the postpunk scene in London and New York; but it put down roots and lingers on in Sunbelt cities like Atlanta, Georgia and Tampa, Florida. Likewise, bucolic, peace- and pot-adoring hippies remain a fixture in the wealthy, status-conscious Northeast. Plenty of subcultures have appropriated and subverted Christian iconography (e.g. heavy metal bands like Judas Priest), but few that have subscribed to Christian theology have bothered concealing it. Christian music, after all, has a vast and devoted, if not exactly chic, following. The juggalos are a collective, binding themselves together with a blend of intoxicating ignorance and Christian morality. Why separate themselves from ordinary "Christian" heavy metal or rap? (Unless Christian metal is simply too 'un-cool' – but then so are the juggalos.) The juggalos must be defining themselves against something, and the only thing that makes sense is the marketplace.
No Insane Clown Posse concert would be complete without dousing its audience in Faygo, an off-brand cola sold in the Midwest – a practice the band has kept up even after fans have complained of being hit with full bottles, and sued. ICP claims they chose Faygo because it was the only soda they could afford growing up in Detroit. Off-brand soda is a potent symbol of American poverty. Coca-Cola, which remains the world's most recognized brand according to Interbrand, is synonymous with Americana. By identifying themselves with – and, if one allows the Christianity metaphor to stretch a little, baptizing their followers with – a 'degenerate' version of Coca-Cola, the juggalos "other" themselves, defining themselves in opposition to the strongest symbol of American consumer culture. (Faygo has apparently resisted any and all overtures from the Insane Clown Posse.)
Collectivity itself is also a potent anti-market metaphor. Juggalos resist the relentless market segmentation of contemporary culture with their bizarre sincerity. Much contemporary music focuses on individual achievement, on seduction and consumption, and, since a marauding thug's money is as good as anyone else's, singing about stealing or violence still plays into the hands of consumer culture on the buy side. Even supposedly more sophisticated forms of culture that appropriate or reflect upon other facets of culture are just as implicated in the Late Capitalist system. Irony may offer the illusion of distance, but by sneering at something one must still consume it on some level. After the initial Faygo schism from the rest of society, juggalos are one for all and all for one. They sneer at no one and shrug off the illusion of consumer choice.
Perhaps the closest analogue to the juggalos is the Tea Party, a loose group of American citizens who are fed up with the government – most often over taxes and debt, as suggested by the name's reference to the original Boston Tea Party tax revolt, though individual members harbor a great many other gripes: health care, nebulous racism, the sinister influence of lobbyists and special interests, and so on. The Tea Party sees America's two political parties as so hopelessly fouled up in the mechanics of government that they cannot possibly fix it, and seeks a sort of change – often a harkening back to the original intent of the founding fathers. And perhaps, if a collective political consciousness ever blossoms among the juggalos, they may find some common ground with the Tea Party and evolve into the shock troops of a coming proletarian revolution. Or maybe the juggalos just like the music.
Sunday, November 28, 2010
Gender and Philosophical Intuition
Tamar Gendler and Stephen Stich, over at Philosophy TV
Empirical evidence collected by Stich and Buckwalter suggests that "standard" intuitions about philosophical thought experiments (e.g. Gettier cases) are more common among men than women. Stich and Gendler examine the merits of this evidence. They consider what might explain gendered differences in intuitions, and whether such differences can help to explain why women are underrepresented in professional philosophy. They also discuss alternative explanations for the gender gap, including the effects of sexism and the shortage of female professors and graduate students to serve as role models for female undergraduates. Finally, they ask why the gender gap has been a larger problem in philosophy than in other fields.