March 31, 2012
Debating the Usefulness of Election Models
There's a debate on the issue over at the NYT's FiveThirtyEight. First, Nate Silver:
...Lynn Vavreck’s excellent 2009 book, “The Message Matters,” for instance, made the following claim:
The economy is so powerful in determining the results of U.S. presidential elections that political scientists can predict winners and losers with amazing accuracy long before the campaigns start.
To be clear, that is the publisher’s copy and not Ms. Vavreck’s. However, statements like these have become fairly common, especially among a savvy group of bloggers and writers who sit at the intersection of political science and the mainstream media (a space that this blog, of course, occupies).
But is it true? Can political scientists “predict winners and losers with amazing accuracy long before the campaigns start”?
The answer to this question, at least since 1992, has been emphatically not. Some of their forecasts have been better than others, but their track record as a whole is very poor.
And the models that claim to be able to predict elections based solely on the fundamentals — that is, without looking to horse-race factors like polls or approval ratings — have done especially badly. Many of these models claim to explain as much as 90 percent of the variance in election outcomes without looking at a single poll. In practice, they have had almost literally no predictive power, whether looked at individually or averaged together.
John Sides responds:
I am less critical of the accuracy of these models than is Nate. For one, forecasters have different motives in constructing these models. Some are interested in the perfect forecast, a goal that may create incentives to make ad hoc adjustments to the model. Others are more interested in theory testing — that is, seeing how well election results conform to political science theories about the effects of the economy and other “fundamentals.” Models grounded in theory won’t be (or at least shouldn’t be) adjusted ad hoc. If so, then their out-of-sample predictions could prove less accurate, on average, but perfect prediction wasn’t the goal to begin with. I haven’t talked with each forecaster individually, so I do not know what each one’s goals are. I am just suggesting that, for scholars, the agenda is sometimes broader than simple forecasting.
Second, as Nate acknowledges but doesn’t fully explore (at least not in this post), the models vary in their accuracy. The average error in predicting the two-party vote is 4.6 points for Ray Fair’s model, but only 1.72 points for Alan Abramowitz’s model. In other words, some appear better than others — and we should be careful not to condemn the entire enterprise because some models are more inaccurate.
Third, if we look at the models in a different way, they arguably do a good enough job.
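To make the "average error" comparison concrete, here is a minimal sketch, in Python, of how the out-of-sample accuracy of a fundamentals model is typically scored. The forecast and result figures below are hypothetical placeholders, not the actual Fair or Abramowitz numbers; the point is only the arithmetic behind an "average error" statistic.

```python
# Minimal sketch: scoring an election forecasting model.
# All figures are hypothetical placeholders, NOT the actual
# Fair or Abramowitz forecasts.

# (year, predicted incumbent-party share of the two-party vote,
#  actual incumbent-party share of the two-party vote)
forecasts = [
    (1996, 54.0, 54.7),
    (2000, 52.5, 50.3),
    (2004, 53.0, 51.2),
    (2008, 45.0, 46.3),
]

# Mean absolute error: the "average error in predicting the
# two-party vote" that Sides cites for each model.
errors = [abs(predicted - actual) for _, predicted, actual in forecasts]
mae = sum(errors) / len(errors)
print(f"mean absolute error: {mae:.2f} points")
```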
Arguing Science as Faith
First, Stanley Fish over at the NYT's Opinionator:
... [Chris] Hayes...posed the following question [to Richard Dawkins and Steven Pinker]: If you hold to the general skepticism that informs scientific inquiry — that is, if you refuse either to anoint a viewpoint in advance because it is widely held or to send viewpoints away because they are regarded as fanciful or preposterous — how do you respond to global-warming deniers or Holocaust deniers or creationists when they invoke the same principle of open inquiry to argue that they should be given a fair hearing and be represented in departments of history, biology and environmental science? What do you do, Hayes asked, when, in an act of jujitsu, the enemies of liberal, scientific skepticism wield it as a weapon against its adherents?
Dawkins and Pinker replied that you ask them to show you their evidence — the basis of their claim to be taken seriously — and then you show them yours, and you contrast the precious few facts they have with the enormous body of data collected and vetted by credentialed scholars and published in the discipline’s leading journals. Point, game, match.
Not quite. Pushed by Hayes, who had observed that when we accept the conclusions of scientific investigation we necessarily do so on trust (how many of us have done or could replicate the experiments?) and are thus not so different from religious believers, Dawkins and Pinker asserted that the trust we place in scientific researchers, as opposed to religious pronouncements, has been earned by their record of achievement and by the public rigor of their procedures. In short, our trust is justified, theirs is blind.
It was at this point that Dawkins said something amazing, although neither he nor anyone else picked up on it. He said: in the arena of science you can invoke Professor So-and-So’s study published in 2008, “you can actually cite chapter and verse.”
Jerry Coyne responds to Fish:
Fish’s big mistake: the reasons undergirding that belief are not that we can engage in a lot of philosophical pilpul to justify using reason and evidence to find out stuff about the universe. Rather, the reasons are that it works: we actually can understand the universe using reason and evidence, and we know that because that method has helped us build computers and airplanes, go to the moon, cure diseases, improve crops, and so on. All of us agree on these results. We simply don’t need a philosophical justification, and I scorn philosophers who equate religion and science because we don’t produce one. Religion doesn’t lead to any greater understanding of reality. Indeed, the faithful can’t even demonstrate to everyone’s satisfaction that a deity exists at all! The unanimity around evidence that antibiotics cure infections, that the earth goes around the sun, and that water has two hydrogen atoms and one oxygen atom, is not matched by any unanimity of the faithful about what kind of deity there is, what he/she/it is like, or how he/she/it operates. In what way has religion, which indeed aims to give us “understanding,” really produced any understanding? Fish goes on:
People like Dawkins and Pinker do not survey the world in a manner free of assumptions about what it is like and then, from that (impossible) disinterested position, pick out the set of reasons that will be adequate to its description. They begin with the assumption (an act of faith) that the world is an object capable of being described by methods unattached to any imputation of deity, and they then develop procedures (tests, experiments, the compilation of databases, etc.) that yield results, and they call those results reasons for concluding this or that. And they are reasons, but only within the assumptions that both generate them and give them point.
Yes, but we get results that all sane people agree on, and that actually help us get further results that help us solve problems and figure out why things are the way they are. Note how weaselly Fish is here by using the phrase “act of faith” to apply to both science and religion. Yes, it was originally an act of faith to assume that there was an external reality that could be comprehended by naturalistic processes, but it is no longer an act of faith: it is an act of confidence.
The Control Revolution And Its Discontents
Ashwin Parameswaran over at Macroeconomic Resilience:
One of the key narratives on this blog is how the Great Moderation and the neo-liberal era have signified the death of truly disruptive innovation in much of the economy. When macroeconomic policy stabilises the macroeconomic system, every economic actor is incentivised to take on more macroeconomic systemic risks and shed idiosyncratic, microeconomic risks. Those that figured out this reality early on and/or had privileged access to the programs used to implement this macroeconomic stability, such as banks and financialised corporates, were the big winners – a process that is largely responsible for the rise in inequality during this period. In such an environment the pace of disruptive product innovation slows but the pace of low-risk process innovation aimed at cost-reduction and improving efficiency flourishes. Therefore we get the worst of all worlds – the Great Stagnation combined with widespread technological unemployment.
This narrative naturally begs the question: when was the last time we had a truly disruptive Schumpeterian era of creative destruction? In a previous post looking at the evolution of the post-WW2 developed economic world, I argued that the so-called Golden Age was anything but Schumpeterian – as Alexander Field has argued, much of the economic growth till the 70s was built on the basis of disruptive innovation that occurred in the 1930s. So we may not have been truly Schumpeterian for at least 70 years. But what about the period from at least the mid 19th century till the Great Depression? Even a cursory reading of economic history gives us pause for thought – after all, wasn't a significant part of this period supposed to be the Gilded Age of cartels and monopolies, which sounds anything but disruptive?
I am now of the opinion that we have never really had any long periods of constant disruptive innovation – this is not a sign of failure but simply a reality of how complex adaptive systems across domains manage the tension between efficiency, robustness, evolvability and diversity. What we have had is a subverted control revolution where repeated attempts to achieve and hold onto an efficient equilibrium fail. Creative destruction occurs despite our best efforts to stamp it out. In a sense, disruption is an outsider to the essence of the industrial and post-industrial period of the last two centuries, the overriding philosophy of which is automation and algorithmisation aimed at efficiency and control. And much of our current troubles are a function of the fact that we have almost perfected the control project.
A Smithsonian Q & A with E. O. Wilson
From Carl Zimmer's interview with E.O. Wilson, over at the Loom:
Q: Just to take one example that the critics raised, they talked about how inclusive fitness theory makes a prediction about sex allocation, about the investment in different sexes in the offspring. And they say this is something that inclusive fitness predicts and we’ve gone out and we’ve done a lot of tests to see if that’s true and they find these ratios in lots of animals as predicted by that theory. When they make that sort of argument, what’s your response?
A: It’s a little bit like Ptolemaic astronomy: epicycles will always give the exact results if you’re willing to add them. And in this case–I have pointed this out as well–there’s a flaw in the reasoning about the studies of investment, particularly in whether you invest more in males or females in the social insect societies.
If you have only one female who is queen in the colony, and if that queen has mated only once so that her offspring are that closely related, then, because of the implications of haplodiploidy, the way sex is determined in ants, bees, and wasps, you should see a favoring of investment in new queens over investment in males, as measured by the amount of biomass. And that inequality does exist, and it should be a three to one investment by weight. And that has been what is thought to be a very powerful argument.
However, this I believe has a major flaw in the reasoning. The colony wishes to invest in males versus females in the numbers that would be most advantageous for having a female successfully mated when the new queens and males leave the nest to mate. And therefore, the colony should be trying to get something closer to a one-to-one ratio in numbers.
And since females are much bigger, because they have to have all that fat and ovary and so on, and males are much smaller, because in most of these social insects all they have to do is find a female, deliver their sperm, and die. So the males are much smaller.
This means that getting a one-to-one sex ratio, the same as you see throughout the rest of the animal kingdom, means that you will have to invest much more in the females than in the males. And actually when you make that hypothesis, use that principle, which is the obvious one, then that comes closer to the actual figures we have in the biomass investment.
They [Wilson’s critics] may dispute that, but my point is that they did not by any means find a testing ground on which the old theory could stand or fall. It’s in my view a much simpler and more precise explanation to use the argument of one to one ratios of male and female.
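For readers who want the arithmetic behind the three-to-one figure and Wilson's one-to-one alternative, here is a minimal sketch. The relatedness coefficients are the standard haplodiploidy values (the prediction Wilson disputes is usually credited to Trivers and Hare); the body masses are hypothetical placeholders.

```python
# Under haplodiploidy with a singly mated queen, a worker shares
# 3/4 of her genes with a full sister (the haploid father passes his
# whole genome to every daughter) but only 1/4 with a brother.
r_sister = 0.75
r_brother = 0.25

# Standard inclusive-fitness prediction (Trivers-Hare): workers bias
# investment toward new queens in proportion to relatedness, 3:1 by biomass.
print(f"kin-selection prediction: {r_sister / r_brother:.0f}:1 queen:male investment")

# Wilson's alternative: the colony aims at a 1:1 ratio in NUMBERS.
# Because a new queen far outweighs a male, equal numbers still imply
# a female-biased BIOMASS investment. Masses here are placeholders.
queen_mass, male_mass = 3.0, 1.0
print(f"biomass ratio at 1:1 numbers: {queen_mass / male_mass:.0f}:1")
```

With a queen-to-male mass ratio of exactly three the two accounts coincide; for other mass ratios they diverge, and Wilson's claim above is that the one-to-one-numbers account tracks the measured biomass figures more closely.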
to not be
lips of winter
i can't pray
by the ocean
are like islands
& i'm split
to fall down
by Jim Bell
from Landing Amazed
Lily Pool Press, 2010
How to become the engineers of our own evolution
The reports regularly come in from around the world: U.S. engineers unveil a prototype bionic eye, Swedish surgeons replace a man’s cancerous trachea with a body part grown in a lab, and a British woman augments her sense of touch by implanting self-made magnetic sensors in her fingertips.
Adherents of “transhumanism”—a movement that seeks to transform Homo sapiens through tools like gene manipulation, “smart drugs” and nanomedicine—hail such developments as evidence that we are becoming the engineers of our own evolution. Enhanced humans might inject themselves with artificial, oxygen-carrying blood cells, enabling them to sprint for 15 minutes straight. They could live long enough to taste a slice of their own 250th birthday cake. Or they might abandon their bodies entirely, translating the neurons of their brains into a digital consciousness. Transhumanists say we are morally obligated to help the human race transcend its biological limits; those who disagree are sometimes called Bio-Luddites. “The human quest has always been to ward off death and do everything in our power to keep living,” says Natasha Vita-More, chairwoman of Humanity+, the world’s largest transhumanist organization, with nearly 6,000 members.
Gene behind van Gogh’s sunflowers pinpointed
A team of plant biologists has identified the gene responsible for the ‘double-flower’ mutation immortalized by Vincent van Gogh in his iconic Sunflowers series. Van Gogh’s 1888 series includes one painting, now at the National Gallery in London, in which many of the flowers depicted lack the broad dark centre characteristic of sunflowers and instead comprise mainly golden petals. This was not simply artistic licence on van Gogh’s part but a faithful reproduction of a mutant variety of sunflower. In a paper published this week in PLoS Genetics, researchers at the University of Georgia in Athens report that they have pinned down the gene responsible for the mutation, which they say could shed light on the evolution of floral diversity.
A wild sunflower (Helianthus annuus) is not so much a single flower as a composite of tiny florets. The golden ray florets, located at the sunflower’s rim, resemble long petals, are bilaterally symmetrical and do not produce pollen. That job belongs to the disc florets, tiny radially symmetrical blossoms that occupy the sunflower's darker centre. In combination, the two types of florets create the impression of a single large flower, and presumably an attractive target for insect pollinators. “The success of the family is determined by floral strategy,” says plant biologist John Burke, who led the study. Because changes in floral symmetry can affect how a plant interacts with pollinators — and therefore its reproductive fitness — the unusual sunflowers depicted by van Gogh piqued Burke’s curiosity.
March 30, 2012
A Response to Justin Clarke-Doane’s “Morality and Mathematics: The Evolutionary Challenge”
Matthew Braddock, Andreas Mogensen, and Walter Sinnott-Armstrong over at Pea Soup:
In “Morality and Mathematics: The Evolutionary Challenge” (Ethics 2012), Justin Clarke-Doane raises fascinating and important issues about evolutionary debunking arguments. He argues that insofar as our knowledge of the evolutionary origins of morality poses a challenge for moral realism, exactly similar difficulties will arise for mathematical realism. Clarke-Doane concentrates on the claim that we were not selected to have true moral beliefs, which he interprets to mean that we would have evolved the very same moral beliefs even if the moral facts were radically different from what we take them to be. He argues that an analogous claim holds with respect to our mathematical beliefs: we would have evolved the same mathematical beliefs even if the mathematical facts were radically different from what mathematical realists take them to be. However, even if Clarke-Doane is correct in this, we suspect that his points miss two other kinds of evolutionary debunking arguments, which look to pose a special problem for moral realism.
First, Clarke-Doane twice quotes this claim by Sharon Street: “to explain why human beings tend to make the normative judgments that we do, we do not need to suppose that these judgments are true” (Street, “Reply to Copp”, 208). We take Street’s point to be that one can give a complete explanation of why humans tend to make certain moral judgments rather than others without ever saying anything that implies that any moral beliefs are true. This claim is only about what needs to be said in a complete explanation. It does not assume that moral truths or facts could be different than they are now. Moreover, this claim has no parallel regarding mathematics, because arguably a complete explanation of why humans tend to make certain mathematical judgments (e.g. 1+1=2) rather than others (e.g. 1+1=0) would need to say or imply that 1+1=2 and 1+1≠0. Hence, an evolutionary debunking argument based on this claim by Street understood in this way is not affected by Clarke-Doane’s points.
Towards a New Manifesto
Martin Jay reviews Theodor Adorno and Max Horkheimer's Towards a New Manifesto, in Notre Dame Philosophical Reviews:
Gretel Adorno was a remarkable woman about whom far too little is known. Although the recent publication of her correspondence with Walter Benjamin has confirmed the impression that she was a formidable intellect in her own right, she remains largely a mystery. What we do know for certain is that she was deeply devoted to her husband Theodor, whom she married in September, 1937. Abandoning a career as a chemist to support his work unreservedly, she seems to have been resigned to his extra-marital affairs, and was so despondent after his death in August, 1969 that she made a botched suicide attempt. Among the many services she rendered was the dutiful taking of minutes from the intellectual discussions he thought worth recording. Beginning in March of 1938, shortly after his emigration to America and full integration into the life of the Institut für Sozialforschung (then resettled in New York), she wrote down a number of conversations he had with the director of the Institute, Max Horkheimer. She continued to play this role well after they all returned to Frankfurt in the early 1950s to reestablish the Institute.
One such conversation took place over several days in March and April, 1956, when Horkheimer and Adorno sat down to discuss a variety of pressing issues, political, sociological, and philosophical, and Gretel Adorno was there to record the results for posterity, or at least as an aide-mémoire for later more formal considerations of the same issues. Never intended for publication, the protocols nonetheless appeared in 1989 alongside many other drafts and notes as an appendix to the thirteenth volume of Horkheimer's collected works. They were blandly entitled "Diskussion über Theorie und Praxis." Last year, they were translated into English by the venerable Rodney Livingstone for the New Left Review, and shortly thereafter repackaged as a little book with the much more provocative title Towards a New Manifesto.
It is worth remembering Gretel Adorno's role in their preparation, and not only because it reminds us of the asymmetrical gender relations that prevailed at the Institute (which never had a major female presence in its ranks). Without a tape recorder, she was responsible for faithfully putting down a highly abstract conversation developing at breakneck speed -- the editorial foreword rightly calls it "a careening flux of arguments, aphorisms, and asides, in which the trenchant alternates with the reckless, the playful with the ingenuous" -- and it has to be accounted a minor miracle that anything coherent survived at all. If we add the tendentious title introduced by the publishers, which turns a relatively minor moment in the dialogue into its telos, it is clear that we have a text that cannot be understood as the polished reflections of authors who wanted these formulations to represent their considered opinions for public consumption. This is, in other words, a far cry from the finely wrought aphorisms of Horkheimer's Dämmerung or Adorno's Minima Moralia.
Robert Wright and Alain de Botton on Religion and Religion for Atheists
Atheists As “Other”: Moral Boundaries and Cultural Membership in American Society
Penny Edgell, Joseph Gerteis, and Douglas Hartmann in American Sociological Review:
[T]he atheist emerges as a culturally powerful “other” in part because the category is multivalent (Turner 1974), loaded with multiple meanings. For all these respondents, atheists represent a general lack of morality, but for some, this lack was associated with criminality and its dangers to safety and public order, while for others the absence of morality was that of people whose resources or positions place them above the common standards of mainstream American life. To put it somewhat differently, atheists can be symbolically placed at either end of the American status hierarchy. What holds these seemingly contradictory views together is that the problem of the atheist was perceived to be a problem of self-interest, an excessive individualism that undermines trust and the public good. In this, our respondents draw the same link between religion and the taming of self-interest that Tocqueville wrote about over a century ago (Tocqueville 2000, see especially volume 2, parts I and II). It is important to note that our respondents did not refer to particular atheists whom they had encountered. Rather they used the atheist as a symbolic figure to represent their fears about those trends in American life—increasing criminality, rampant self-interest, an unaccountable elite—that they believe undermine trust and a common sense of purpose.
In recent public discourse, atheists take on a similar symbolic role. We found that the figure of the atheist is invoked rhetorically to discuss the links—or tensions—among religion, morality, civic responsibility, and patriotism. In particular, the association of the atheist with a kind of unaccountable elitism has surfaced in recent public debates. The civically engaged atheists’ awareness of the negative stereotypes of atheists has led to the coining of a new term, “Brights,” around which to identify and organize and thus, according to one prominent Bright, to challenge the association between atheism, immorality, and lack of civic commitment.
The Mighty Mathematician You’ve Never Heard Of
Natalie Angier in the NYT:
Albert Einstein called her the most “significant” and “creative” female mathematician of all time, and others of her contemporaries were inclined to drop the modification by sex. She invented a theorem that united with magisterial concision two conceptual pillars of physics: symmetry in nature and the universal laws of conservation. Some consider Noether’s theorem, as it is now called, as important as Einstein’s theory of relativity; it undergirds much of today’s vanguard research in physics, including the hunt for the almighty Higgs boson. Yet Noether herself remains utterly unknown, not only to the general public, but to many members of the scientific community as well.
When Dave Goldberg, a physicist at Drexel University who has written about her work, recently took a little “Noether poll” of several dozen colleagues, students and online followers, he was taken aback by the results. “Surprisingly few could say exactly who she was or why she was important,” he said. “A few others knew her name but couldn’t recall what she’d done, and the majority had never heard of her.”
Noether (pronounced NER-ter) was born in Erlangen, Germany, 130 years ago this month. So it’s a fine time to counter the chronic neglect and celebrate the life and work of a brilliant theorist whose unshakable number love and irrationally robust sense of humor helped her overcome severe handicaps — first, being female in Germany at a time when most German universities didn’t accept female students or hire female professors, and then being a Jewish pacifist in the midst of the Nazis’ rise to power.
Adrienne Rich dies at 82
From The Guardian:
The award-winning poet and essayist Adrienne Rich, who was one of America's most powerful writers, has died aged 82. Her daughter-in-law Diana Horowitz said Rich died at home in Santa Cruz, California, following complications from the rheumatoid arthritis from which she had suffered for many years. Described as "one of America's foremost public intellectuals" by the Poetry Foundation, and as "a poet of towering reputation and towering rage [who] brought the oppression of women and lesbians to the forefront of poetic discourse and kept it there for nearly a half-century" by the New York Times, Rich's career spanned seven decades, numerous prizes and more than 20 collections of poetry as well as acclaimed essays, articles and lectures.
When she was just 21, WH Auden chose her as winner of the Yale Younger Poets Competition. Auden went on to write a preface for her first collection, A Change of World. "The typical danger for poets in our age is, perhaps, the desire to be 'original'," he wrote. "Miss Rich, who is, I understand, 21 years old, displays a modesty not so common with that age, which disclaims any extraordinary vision, and a love for her medium, a determination to ensure that whatever she writes shall, at least, not be shoddily made." By the 60s and early 70s, however, with collections such as Diving into the Wreck and Snapshots of a Daughter-in-Law, Rich was writing radical free verse full of her feminist ideals and leftwing convictions, exploring sexuality and identity, motherhood and politics. Her transformation, said the critic Ruth Whitman in 2002, has been "astonishing to watch ... In one woman the history of women in the 20th century, from careful traditional obedience to cosmic awareness, defying the mode of our time."
How War Came Home to Stay
Janet Maslin in The New York Times:
A squabble is a noisy quarrel over a trivial matter. A polemic is an aggressive attack on the opinions and principles of others. A screaming match is a contest in which contradictory points are stubbornly reiterated, with no regard for whatever else has been said. A political talk show is a gladiatorial contest in which squabbles, polemics and screaming matches are exploited for their entertainment value. A book by the host of a political talk show is often an ancillary product or marketing tool. But “Drift,” by Rachel Maddow, whose show is on MSNBC, is much more. It is an argument — a sustained, lucid case in which points are made logically and backed by evidence and reason. What’s more, it follows one main idea through nearly a half-century. The subtitle, “The Unmooring of American Military Power,” explains exactly what “Drift” is about. Ms. Maddow’s point is that the way we go to war has changed: that there has been an expansion of presidential power, a corresponding collapse of Congressional backbone and a diminution of public attention. She does not see this in conspiratorial terms, but she has an explanation for the step-by-step way it evolved. She thinks the transformation began with a question asked by President Lyndon B. Johnson in 1965 as he prepared to more than double the ground forces in Vietnam: “You don’t think I oughta have a joint session, do you?” Did he need authorization from Congress, he asked the chairman of the Senate Armed Services Committee, to make a troop deployment like that?
That very question indicates that Johnson understood the importance of Congressional authority. But it is Ms. Maddow’s contention that subsequent presidents have even more deliberately sought to avoid dragging Congress into the conversation, because Congressional debates and military allocations upset the public. So does the calling up of troops. As the waging of war has grown increasingly secretive and privatized, presidents have built on precedent. They have seen less and less advantage in letting Congress weigh in on these decisions. “Drift” says this slide was not inevitable. “And it wasn’t inexorable either,” Ms. Maddow writes. “You can trace it to specific decisions, made for specific, logical reasons.”
Should Chimpanzees Have Moral Standing? An Interview with Frans de Waal
Liza Gross in PLoS Blogs:
Gross: What are some of the seminal experiments that revealed similarities in cognitive or behavioral traits between apes and humans, suggesting we’re not in fact unique, as many like to think?
De Waal: There are many. For example, tool use used to be considered uniquely human. And then when it was found in captivity by Köhler, this is in the 1920s, people would say, “Well, but at least in the wild they never do it.” And then it was found in the wild, and then they would say, “Well, at least they don’t make tools.” And then it was found that they actually also make tools.
So tool use was one of those dividing lines. Mirror self-recognition is a key experiment that was first conducted on the apes. The language experiments, even though we now doubt what the apes do is actually what we would call “language,” they certainly put a dent in that whole claim that symbolic communication is uniquely human.
My own studies on, let’s call it “politics,” and reconciliation behavior and pro-social behavior have put a dent in things. And so I think over the years every postulate of difference between humans and apes has been at least questioned, if not knocked over. As a result, we are now in a situation that most of the differences are considered gradual rather than qualitative.
And the same is true, let’s say, between a chimp and a monkey. There are many differences between chimps and monkeys in cognitive capacities, but we consider them mostly gradual differences.
The more we look at it, even if you take the difference between, let’s say, a human and a snake or a fish, yes, between those species the differences are very radical and huge, but even these species rely on some of the learning processes and reactions that we also know of in humans.
"Here where there was law there is now
extrajudicial rendition and other conveniences."
What Kind of Times Are These
There's a place between two stands of trees where the grass grows uphill
and the old revolutionary road breaks off into shadows
near a meeting-house abandoned by the persecuted
who disappeared into those shadows.
I've walked there picking mushrooms at the edge of dread, but don't be fooled
this isn't a Russian poem, this is not somewhere else but here,
our country moving closer to its own truth and dread,
its own ways of making people disappear.
I won't tell you where the place is, the dark mesh of the woods
meeting the unmarked strip of light—
ghost-ridden crossroads, leafmold paradise:
I know already who wants to buy it, sell it, make it disappear.
And I won't tell you where it is, so why do I tell you
anything? Because you still listen, because in times like these
to have you listen at all, it's necessary
to talk about trees.
by Adrienne Rich
from The Fact of a Doorframe
– Selected Poems 1950-2001
publisher W.W. Norton
March 29, 2012
Daron Acemoglu and Simon Johnson in Project Syndicate:
There is a simple way to deal with a debt overhang: reduce payments by restructuring the debt. Many firms are able to renegotiate financing terms with their creditors – typically extending the maturity of their liabilities, which enables them to borrow more to finance new, better projects. If such negotiation cannot be achieved voluntarily, US firms can use Chapter 11 of the bankruptcy code, under which a court supervises and approves the reorganization of liabilities. So you would think the same would be true for US households and embattled European governments. But the restructuring of debt has been too little and has come too late. Why?
In both cases, the main argument for not removing the debt overhang came from bankers, who claimed that it would create havoc in financial markets for two reasons. First, banks were the primary creditors, and the large losses that they would face in any restructuring were bound to trigger a domino effect, with waves of pessimism driving up interest rates and ruining other borrowers’ prospects. Second, banks would also suffer because they had sold insurance against default – in the form of credit-default swaps. When these swaps were activated, the banks would incur further, potentially crippling losses.
In the case of Greece, international bankers argued long and hard that debt restructuring would generate contagion far and wide within the eurozone – and perhaps more broadly. And yet, in the end, Greece had little choice but to restructure its debt, cutting the value of private claims by about 75% relative to their face value (although even this is probably not enough to make the country’s debt burden sustainable). This was deemed a “credit event,” so credit-default swaps were exercised: anyone who insured against default had to pay out.
Did all hell break loose? No. Banks have not failed, and there is no sign of tumbling dominoes. But that is not because banks prepared themselves by raising more capital. On the contrary, compared to their likely future losses, European banks have raised relatively little capital recently – and much of this has been creative accounting, rather than truly loss-absorbing shareholder equity.
Perhaps the risk that a Greek debt restructuring would cause a financial meltdown was always minimal, and quiescent markets were to be expected. But, in that case, why all the fuss?
The answer should be clear by now: interest-group politics and policy elites’ worldview.
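As a back-of-the-envelope sketch of the restructuring arithmetic described above, with hypothetical position sizes rather than anyone's actual Greek exposures:

```python
# Back-of-the-envelope sketch of the Greek restructuring arithmetic.
# Position sizes are hypothetical; the ~75% haircut relative to face
# value is the figure cited above.

face_value = 100.0   # hypothetical bond holding (in, say, billions of euros)
haircut = 0.75       # loss relative to face value
recovered = face_value * (1 - haircut)

# The restructuring was deemed a "credit event", so credit-default
# swaps paid out; a CDS roughly makes the protected holder whole,
# paying notional times the loss fraction.
cds_notional = 40.0  # hypothetical protection bought
cds_payout = cds_notional * haircut

print(f"creditor recovers {recovered:.0f} of {face_value:.0f} face value")
print(f"CDS sellers pay ~{cds_payout:.0f} on {cds_notional:.0f} of protection")
```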
Critiques of Utopia and Apocalypse
John Gray in Five Books:
Let’s talk about Freud then. Tell us what he says about human nature and society in Civilisation and Its Discontents.
Freud is a very relevant figure to this discussion. The limits of progress are in the flaws and divisions of human nature, which are integral to being human. The way Freud represents this in a number of his works, including Civilisation and Its Discontents, is to say that there are a variety of instincts – a very unpopular term now which may not be scientifically valid – from benevolence and love on the one hand to violence and aggression on the other, which are equally part of the human animal.
Civilisation, as Freud understands it, begins with the restraint of violence – although of course it doesn’t end there. A civilised state is one which controls violence. Freud’s key point is that because humans are self-divided in the way I’ve described, civilisation always carries with it a degree of repression of instinctual satisfaction, which in turn means that the civilisational condition will always be one of discontent. In other words, it’s not possible to imagine – and dangerous to experiment with – any conception of a civilisation emptied of its discontent, in which all desires are satisfied and society doesn’t exact a price for the repression of violent impulses.
Freud thought that civilisation is inestimably valuable – unlike some other writers in central Europe, he was never tempted by barbarism. But he also recognised that civilisation is inherently flawed, not because of political repression and corruption or economic inequality, but because of the nature of the human animal. That is why civilisation can never be rid of its faults, can never be entirely benign. I think that is true. In the language of religion, it might be called original sin. In other religions such as Buddhism, it is called original ignorance. However one wants to put it, it is a truth that humans are ineradicably flawed, and that is a commonplace in pretty much any religious tradition. It's only recently, in the last 150 years, that the idea which Freud presented in a secular form has come to be considered shocking.
A Poet of Unswerving Vision at the Forefront of Feminism
The NYT's Adrienne Rich obituary, by Margalit Fox:
Triply marginalized — as a woman, a lesbian and a Jew — Ms. Rich was concerned in her poetry, and in her many essays, with identity politics long before the term was coined.
She accomplished in verse what Betty Friedan, author of “The Feminine Mystique,” did in prose. In describing the stifling minutiae that had defined women’s lives for generations, both argued persuasively that women’s disenfranchisement at the hands of men must end.
For Ms. Rich, the personal, the political and the poetical were indissolubly linked; her body of work can be read as a series of urgent dispatches from the front. While some critics called her poetry polemical, she remained celebrated for the unflagging intensity of her vision, and for the constant formal reinvention that kept her verse — often jagged and colloquial, sometimes purposefully shocking, always controlled in tone, diction and pacing — sounding like that of few other poets.
All this helped ensure Ms. Rich’s continued relevance long after she burst genteelly onto the scene as a Radcliffe senior in the early 1950s.
Her constellation of honors includes a MacArthur Foundation “genius” grant in 1994 and a National Book Award for poetry in 1974 for “Diving Into the Wreck.” That volume, published in 1973, is considered her masterwork.
In the title poem, Ms. Rich uses the metaphor of a dive into dark, unfathomable waters to plumb the depths of women’s experience:
I am here, the mermaid whose dark hair
streams black, the merman in his armored body
We circle silently about the wreck
we dive into the hold. ...
We are, I am, you are
by cowardice or courage
the one who find our way
back to this scene
carrying a knife, a camera
a book of myths
in which
our names do not appear.
Ms. Rich was far too seasoned a campaigner to think that verse alone could change entrenched social institutions. “Poetry is not a healing lotion, an emotional massage, a kind of linguistic aromatherapy,” she said in an acceptance speech to the National Book Foundation in 2006, on receiving its medal for distinguished contribution to American letters. “Neither is it a blueprint, nor an instruction manual, nor a billboard.”
But at the same time, as she made resoundingly clear in interviews, in public lectures and in her work, Ms. Rich saw poetry as a keen-edged beacon by which women’s lives — and women’s consciousness — could be illuminated.
She was never supposed to have turned out as she did.
Morality and Mathematics: Can You Be A Moral Antirealist and a Mathematical Realist?
It is commonly suggested that evolutionary considerations generate an epistemological challenge for moral realism. At first approximation, the challenge for the moral realist is to explain our having many true moral beliefs, given that those beliefs are the products of evolutionary forces that would be indifferent to the moral truth. An important question surrounding this challenge is the extent to which it generalizes. In particular, it is of interest whether the Evolutionary Challenge for moral realism is equally a challenge for mathematical realism. It is widely thought not to be. For example, Richard Joyce, one of the most prominent advocates of the Evolutionary Challenge, goes so far as to write, “the dialectic within which I am working here assumes that if an argument that moral beliefs are unjustified or false would by the same logic show that believing that 1 + 1 = 2 is unjustified or false, this would count as a reductio ad absurdum.” He assures the reader, “There is … evidence that the distinct genealogy of [mathematical] beliefs can be pushed right back into evolutionary history. Would the fact that we have such a genealogical explanation of … ‘1 + 1 = 2’ serve to demonstrate that we are unjustified in holding it? Surely not, for we have no grasp of how this belief might have enhanced reproductive fitness independent of assuming its truth.” Similarly, Walter Sinnott-Armstrong writes, “The evolutionary explanations [of our having the moral beliefs that we have] work even if there are no moral facts at all. The same point could not be made about mathematical beliefs. People evolved to believe that 2 + 3 = 5, because they would not have survived if they had believed that 2 + 3 = 4, but the reason why they would not have survived then is that it is true that 2 + 3 = 5.” Finally, Roger Crisp writes, “In the case of mathematics, what is central is the contrast between practices or beliefs which develop because that is the way things are, and those that do not. The calculating rules developed as they did because [they] reflect mathematical truth. The functions of … morality, however, are to be understood in terms of well-being, and there seems no reason to think that had human nature involved, say, different motivations then different practices would not have emerged.”
In this article, I argue that such sentiments are mistaken. I argue that the Evolutionary Challenge for moral realism is equally a challenge for mathematical realism.
How Conservatives Lost their Faith in Science
Alan Boyle in MSNBC's Cosmic Log:
Gauchat cross-referenced attitudes toward the scientific community with various demographic categories, and found that two categories showed a significant erosion of trust in science: conservatives and frequent churchgoers. People who identified themselves as conservatives voiced more confidence in science than moderates or liberals in 1974, but by 2010, that level had fallen by more than 25 percent.
[Figure: unadjusted mean values for public trust in science, by self-reported political ideology, 1974-2010; derived from the General Social Survey.]
Why the drop? Gauchat suggested that the character of the conservative movement has changed over the past three and a half decades — and so has the character of the scientific establishment.
"Over the last several decades, there's been an effort among those who define themselves as conservatives to clearly identify what it means to be a conservative," he said. "For whatever reason, this appears to involve opposing science and universities, and what is perceived as the 'liberal culture.' So, self-identified conservatives seem to lump these groups together and rally around the notion that what makes 'us' conservatives is that we don't agree with 'them.'"
Meanwhile, the perception of science's role in society has shifted as well.
"In the past, the scientific community was viewed as concerned primarily with macro structural matters such as winning the space race," Gauchat said. "Today, conservatives perceive the scientific community as more focused on regulatory matters such as stopping industry from producing too much carbon dioxide."
30 Years of Subaltern Studies: Conversations with Gyanendra Pandey and Partha Chatterjee
McGrail: I’d like to start by asking if you could give us an overview of the term “subaltern studies” and explain how it has evolved in the past few decades.
Chatterjee: When the Subaltern Studies Collective began, our initial move was a reading of Antonio Gramsci’s Prison Notebooks, which had just been published in English. We were compelled by the fact that Gramsci used the term “subaltern” instead of “proletariat.” Now, he used this term because he was writing in prison under conditions of extreme censorship; therefore, he didn’t want to use the standard Marxist terms and coined the term “subaltern.” But as a result, Gramsci was fundamentally altering the core definition of classes in the orthodox version of Marxism at the time. By simply renaming the proletarian class the subaltern, he was suggesting that the classical Marxist division of European industrial society into classes was not entirely adequate. The classical understanding of class didn’t quite work in a country like Italy, where in the North there was a large industrial structure, while most parts of the South were agrarian and most exploited people were peasants. Gramsci was suggesting that the classical understanding of the “proletariat” didn’t fit the political situation in Italy. So in using a term like subaltern, he was trying to incorporate this very large, pre-industrial formation into the understanding of political strategies for the Left or the Communist movement.
We found this extremely relevant in trying to understand the situation in countries like India, for instance, which in the early 80s was more-or-less in exactly the same situation: there was an important and developing industrial section with industrial working classes, but a very large part of the country essentially consisted of agrarian formations. Therefore most Indians were in fact still peasants. So it was in trying to reorient or reformulate the problem of what it is to write the history of the “people” in a country like India that we found the idea of using “subaltern classes”—rather than the orthodox formulation of classes in Marxism—much more useful and, in a sense, full of new possibilities. That’s how it began; we actually began by using the term “subaltern classes.”
Initially in our thinking, subalternity still referred to a certain class structure that was perhaps not entirely frozen or well-defined—i.e., it was often indeterminate, fuzzy and so on—but the term still referred to a certain structure of class relations. It’s work that happened later on—particularly with Gayatri Spivak’s interventions—that allowed for a different inflection to be given to the term subaltern.
Was the Nazi rise inevitable?
From The Telegraph:
German history has been shaped by one central trauma: the rise of the Nazis culminating in the horror of the concentration camps. There has been an understandable tendency for scholars to interpret everything that went before as a prelude to the emergence of fascism. Just as the Whig school notoriously interpreted the path of British history as an inexorable process leading to the triumph of parliamentary democracy in the 19th century, so the rise of Hitler has haunted German historians.
One major victim of this tendency has been the Holy Roman Empire, a sprawling confederation of German-speaking states that embraced Italy, Germany and much of France at one point in the high Middle Ages. Contemporary historians have tended to lose interest in the Holy Roman Empire after the death in 1250 of Frederick II, the powerful and charismatic emperor who challenged the authority of the Pope. Thereafter they have assumed that the empire fell into decline, part of a pattern of neglect and institutional collapse that sowed the seeds for the failure of the Weimar Republic and the rise of the Nazis. Indeed, in the words of one historian, the Holy Roman Empire had “no history at all” after the mid-17th century, though “it continued for a while longer to lead a miserable, meaningless existence because its patient, slow-moving subjects lacked the initiative and in many cases the intelligence to effect its actual dissolution".
Cancer screen yields drug clues
Two compendiums of data unite genetic profiling with drug testing to create the most complete picture yet of how mutations can shape a cancer’s response to therapy. The results, published today in Nature, suggest that the effectiveness of most anticancer agents depends on the genetic make-up of the cancer against which they are used. One study found a link between drug sensitivity and at least one mutation in a cancer-related gene for 90% of the compounds tested.
Lab-grown cancer cells are a mainstay of research into the disease. The two projects catalogue the genetic features of hundreds of such cell lines, including mutations in cancer-associated genes and patterns of gene activation. They then match these features with how the cells respond to approved and potential drugs. “This is a very powerful finding,” says Tom Hudson, president of the Ontario Institute for Cancer Research in Toronto, Canada, who was not affiliated with the work. “It could provide valuable information for designing clinical trials, and lead to more focused and less expensive approaches to drug development.”
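A minimal sketch of the style of analysis behind such screens: pair each cell line's mutation status with its measured drug sensitivity and compare the groups. The gene, drug, and IC50 values below are hypothetical placeholders.

```python
# Compare drug sensitivity (IC50; lower = more sensitive) between cell
# lines with and without a mutation in a cancer-related gene.
# All names and values are hypothetical placeholders.

# cell line -> (has mutation in gene X, IC50 of drug Y in micromolar)
cell_lines = {
    "line_A": (True, 0.2),
    "line_B": (True, 0.4),
    "line_C": (False, 5.0),
    "line_D": (False, 8.0),
    "line_E": (True, 0.3),
}

mutant = [ic50 for mutated, ic50 in cell_lines.values() if mutated]
wild_type = [ic50 for mutated, ic50 in cell_lines.values() if not mutated]

# A large gap between the group means suggests the mutation shapes the
# response, the kind of link reported for ~90% of compounds tested.
print(f"mean IC50, mutant lines: {sum(mutant) / len(mutant):.2f} uM")
print(f"mean IC50, wild-type lines: {sum(wild_type) / len(wild_type):.2f} uM")
```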
She is standing, here, in a grocery store,
Under the fluorescent light suspended,
Above her head, and from the ceiling,
Standing in front of the refrigerated meat,
That is laid out in front of her, butchered,
A thigh, a breast, a leg,
Or chopped and ground,
Pieces of meat wrapped tightly in plastic that is
Stretching over them, like skin, and she forgets,
Forgets what she is looking for, because she is,
Remembering what he said on the telephone,
His voice in Afghanistan and, here, in her ear,
About what happened, there, in Kandahar, or
How an American soldier, how he lost his mind,
Went and killed sixteen Afghans, nine children,
A massacre, her husband whispers over it, this
Telephone line, and she is here, now, in America,
Moving down aisles of a grocery store, moving
Through the months, because she is still waiting,
Waiting for him to come home again, waiting
In a checkout line, and thinking about lines,
Lines she draws through the days on a calendar,
Bodies shot dead, lined up on the side of a road,
Or the lines of war,
Lines soldiers cross and lines they don’t,
And the imaginary lines that divide countries,
Our country from theirs,
Or how different he will be,
When he crosses over again, and comes home.
by Amalie Flynn
March 28, 2012
Adrienne Rich, 1929-2012
Rethinking the Literature Classroom
Jeff Hudson in Full Stop:
Here is something I know: I feel better when I read — not just good, but better. Anxieties are assuaged, burdens lightened, relationships enriched. I feel part of something hopeful, a connection to the writer, the characters, other readers. I feel smart, if it is okay to say that. I am moved to act after reading — to write, to talk. I have new questions and fresh answers. And I am hardly alone. Anne Lamott knows that “when writers make us shake our heads with the exactness of their prose and their truths, and even make us laugh about ourselves or life, our buoyancy is restored. We are given a shot at dancing with, or at least clapping along with the absurdity of life instead of being squashed by it over and over again.” After sharing stories, writer Barry Lopez feels exhilarated: “The mundane tasks which awaited me, I anticipated now with pleasure. The stories had renewed in me a sense of the purpose of my life.”
Here is something else I know: the power of literature to “renew a sense of purpose in our lives” gets killed in literature classrooms — unintentionally, no doubt, but killed nonetheless.
This isn’t an indictment. Writer Richard Ford found himself teaching literature as a graduate assistant in 1969 and realized, “What seemed worthwhile to teach was what I felt about literature . . . [literature] had mystery, denseness, authority, connectedness, closure, resolution, perception, variety, magnitude — value in other words . . . Literature appealed to me. But I had no idea how to teach its appealing qualities, how to find and impart the origins of what I felt.” This is a difficult question.
Signandsight.com Says Good-bye
We here at 3QD have been long-time fans of signandsight. Their presence on the web will be missed. Anja Seeliger and Thierry Chervel:
After seven years we are shutting down signandsight.com. The site will remain online, but for now no new texts will be posted.
We still believe in the idea behind signandsight.com. We are convinced that Europe needs a public sphere, and we think that this public sphere is best achieved by combining the possibilities of the internet and traditional media. As before, we still love our motto "Let's Talk European". Yes, English is now the lingua franca of contemporary Europe. However, when English is used to bring articles written in another language to an international readership, then it serves as a bridge to these other languages and helps create waves. Interestingly, the most lively reactions to signandsight.com came from the US, where there is an intellectual audience that wishes to escape its domestic borders.
Some of the most wonderful experiences with signandsight.com were when Harper's reprinted an interview with Thomas Bernhard because it had first been translated into English by signandsight.com, or when Anne Applebaum of the Washington Post and Paul Berman of the New Republic discussed texts and debates that had originated on signandsight.com and found echoes in Swedish, Hungarian, Spanish, Polish, and French newspapers. Signandsight.com never had a wide popular readership, but it was a catalyst for European public debate.
Marx at 193
Also in the LRB, John Lanchester:
In trying to think what Marx would have made of the world today, we have to begin by stressing that he was not an empiricist. He didn’t think that you could gain access to the truth by gleaning bits of data from experience, ‘data points’ as scientists call them, and then assembling a picture of reality from the fragments you’ve accumulated. Since this is what most of us think we’re doing most of the time it marks a fundamental break between Marx and what we call common sense, a notion that was greatly disliked by Marx, who saw it as the way a particular political and class order turns its construction of reality into an apparently neutral set of ideas which are then taken as givens of the natural order. Empiricism, because it takes its evidence from the existing order of things, is inherently prone to accepting as realities things that are merely evidence of underlying biases and ideological pressures. Empiricism, for Marx, will always confirm the status quo. He would have particularly disliked the modern tendency to argue from ‘facts’, as if those facts were neutral chunks of reality, free of the watermarks of history and interpretation and ideological bias and of the circumstances of their own production.
I, on the other hand, am an empiricist. That’s not so much because I think Marx was wrong about the distorting effect of underlying ideological pressures; it’s because I don’t think it’s possible to have a vantage point free of those pressures, so you have a duty to do the best with what you can see, and especially not to shirk from looking at data which are uncomfortable and/or contradictory. But this is a profound difference between Marx and my way of talking about Marx, which he would have regarded as being philosophically and politically entirely invalid.
In the Zeitgeist, Fact-Checking
Via Zite, there are two pieces on fact-checking this week. Atossa Araxia Abrahamian in The New Inquiry:
Brides magazine has a fact-checker. She does things like verify the cost of honeymoons and makes sure that Vera Wang did, in fact, design that dress, and compares the captions on winter flower bouquet slideshows with pictures in botany reference books. It would be terrible to mistake a eucalyptus pod for a mere pussy willow.
Many American magazines, from trashy celebrity weeklies to highbrow general-interest journals, have fact-checkers of some sort. I worked as one in 2008, when, with three other Harper’s interns, I fact-checked the magazine’s Index from beginning to end. Being the primary speaker of foreign languages in the intern cubicle, I ended up doing a lot of the international checking for the magazine. Percentage of Russians who say one goal of U.S. foreign policy is “the complete destruction of Russia”: 43. Number of Iraqi stray dogs that Operation Baghdad Pups has helped emigrate to the United States since 2003: 66.
I quickly learned that fact-checking is a predominantly American phenomenon. The French don’t do much of it, most Russian papers certainly don’t either, and even the Swiss — possibly the most exacting and precise people on the planet — do not make use of fact-checkers in quite the same way as Americans do. Yet their presses keep rolling, and their readers keep reading, and their brides still buy roses, if by another name. People even trust the press in Switzerland much more than they do in the U.S.: 46 percent of Swiss people said they had confidence in their newspapers and magazines in 2010. Among Americans, it was only 25 percent.
Christian Lorentzen also has a piece in the LRB.
Women: The Libyan Rebellion's Secret Weapon
Inas Fathy’s transformation into a secret agent for the rebels began weeks before the first shots were fired in the Libyan uprising that erupted in February 2011. Inspired by the revolution in neighboring Tunisia, she clandestinely distributed anti-Qaddafi leaflets in Souq al-Juma, a working-class neighborhood of Tripoli. Then her resistance to the regime escalated. “I wanted to see that dog, Qaddafi, go down in defeat.” A 26-year-old freelance computer engineer, Fathy took heart from the missiles that fell almost daily on Col. Muammar el-Qaddafi’s strongholds in Tripoli beginning March 19. Army barracks, TV stations, communications towers and Qaddafi’s residential compound were pulverized by NATO bombs. Her house soon became a collection point for the Libyan version of meals-ready-to-eat, cooked by neighborhood women for fighters in both the western mountains and the city of Misrata. Kitchens across the neighborhood were requisitioned to prepare a nutritious provision, made from barley flour and vegetables, that could withstand high temperatures without spoiling. “You just add water and oil and eat it,” Fathy told me. “We made about 6,000 pounds of it.”
Fathy’s house, located atop a hill, was surrounded by public buildings that Qaddafi’s forces often used. She took photographs from her roof and persuaded a friend who worked for an information-technology company to provide detailed maps of the area; on those maps, Fathy indicated buildings where she had observed concentrations of military vehicles, weapons depots and troops. She dispatched the maps by courier to rebels based in Tunisia. On a sultry July evening, the first night of Ramadan, Qaddafi’s security forces came for her. They had been watching her, it turned out, for months. “This is the one who was on the roof,” one of them said, before dragging her into a car. The abductors shoved her into a dingy basement at the home of a military intelligence officer, where they scrolled through the numbers and messages on her cellphone. Her tormentors slapped and punched her, and threatened to rape her. “How many rats are working with you?” demanded the boss, who, like Fathy, was a member of the Warfalla tribe, Libya’s largest. He seemed to regard the fact that she was working against Qaddafi as a personal affront.
How to Write Like a Scientist
I didn’t know whether to take my Ph.D. adviser’s remark as a compliment. “You don’t write like a scientist,” he said, handing me back the progress report for a grant that I had written for him. In my dream world, tears would have come to his eyes, and he would have squealed, “You write like a poet!” In reality, though, he just frowned. He had meant it as a criticism. I don’t write like a scientist, and apparently that’s bad. I asked for an example, and he pointed to a sentence on the first page. “See that word?” he said. “Right there. That is not science.”
The word was “lone,” as in “PvPlm is the lone plasmepsin in the food vacuole of Plasmodium vivax.” It was a filthy word. A non-scientific word. A flowery word, a lyrical word, a word worthy of -- ugh -- an MFA student. I hadn’t meant the word to be poetic. I had just used the word “only” five or six times, and I didn’t want to use it again. But in his mind, “lone” must have conjured images of PvPlm perched on a cliff’s edge, staring into the empty chasm, weeping gently for its aspartic protease companions. Oh, the good times they shared. Afternoons spent cleaving scissile bonds. Lazy mornings decomposing foreign proteins into their constituent amino acids at a nice, acidic pH. Alas, lone plasmepsin, those days are gone. So I changed the word to “only.” And it hurt. Not because “lone” was some beautiful turn of phrase but because of the lesson I had learned: Any word beyond the expected set -- even a word as tame and innocuous as “lone” -- apparently doesn’t belong in science. I’m still fairly new at this science thing. I’m less than 4 years beyond the dark days of grad school and the adviser who wouldn’t tolerate “lone.” So forgive my naïveté when I ask: Why the hell not? Why can’t we write like other people write? Why can’t we tell our science in interesting, dynamic stories? Why must we write dryly? (Or, to rephrase that last sentence in the passive voice, as seems to be the scientific fashion, why must dryness be written by us?)
collecting America’s other language
The scene is a mysterious one, beguiling, thrilling, and, if you didn’t know better, perhaps even a bit menacing. According to the time-enhanced version of the story, it opens on an afternoon in the late fall of 1965, when, without warning, a number of identical dark-green vans suddenly appear and sweep out from a parking lot in downtown Madison, Wisconsin. One by one they drive swiftly out onto the city streets. At first they huddle together as a convoy. It takes them only a scant few minutes to reach the outskirts—Madison in the sixties was not very big, a bureaucratic and academic omnium-gatherum of a Midwestern city about half the size of today. There is then a brief halt, some cursory consultation of maps, and the cars begin to part ways. All of this first group of cars head off to the south. As they part, the riders wave their farewells, whereupon each member of this curious small squadron officially commences his long outbound adventure—toward a clutch of carefully selected small towns, some of them hundreds and even thousands of miles away. These first few cars are bound to cities situated in the more obscure corners of Florida, Oklahoma, and Alabama. Other cars that would follow later then went off to yet more cities and towns scattered evenly across every corner of every mainland state in America. The scene as the cars leave Madison is dreamy and tinted with romance, especially seen at the remove of nearly fifty years. Certainly nothing about it would seem to have anything remotely to do with the thankless drudgery of lexicography. But it had everything to do with the business, not of illicit love, interstate crime, or the secret movement of monies, but of dictionary making.
more from Simon Winchester at Lapham's Quarterly here.
How should I live my life?
So what does it mean for the country that our cultural common denominator is shrinking? That, increasingly, Americans have very few experiences through which to understand the lives of our fellow citizens? And why, in the midst of these trends, is there general agreement on an issue as potentially flammable as contraception? Recently I found good answers to these questions in an unexpected place — in an essay on literature and ethics that provides a convincing account of the rock bottom consequences of a fractured population. The essay is called “Perceptive Equilibrium: Literary Theory and Ethical Theory,” and it was first given as a talk by the philosopher Martha Nussbaum 25 years ago. The purpose of the paper was to merge literary theory with ethical theory — to show how forms of art like the novel can help us answer arguably the two most fundamental philosophical questions: How should I live my life? How should we live together? Here is Nussbaum describing the centrality of literature to ethics: "One of the things that makes literature something deeper and more central for us than a complex game, deeper even than those games, for example chess and tennis, that move us to wonder by their complex beauty, is that it speaks like Strether. It speaks about us, about our lives and choices and emotions, about our social existence and the totality of our connections."
more from Kevin Hartnett at The Millions here.
There are texts that seem to require a certain craziness of us, a mismeasure of response to match the extravagance of their expression. But can a mismeasure be a match? All we know is that we don’t want to lose or reduce the extravagance but can’t quite fall for it either. An example would be Walter Benjamin’s wonderful remark about missed experiences in Proust: “None of us has time to live the true dramas of the life that we are destined for. This is what ages us – this and nothing else. The wrinkles and creases on our faces are the registration of the great passions, vices, insights that called on us; but we, the masters, were not at home.” Even without the ‘nothing else’ this is a pretty hyperbolic proposition. With the ‘nothing else’ it turns into a form of madness, a suggestion that we shall not grow old at all unless we keep failing to receive the passions, vices and insights that come to see us. This would be a life governed by new necessities, entirely free from the old ones, exempt from time and biology. The sentences are clear enough but don’t read easily as fantasy or figure of speech. Benjamin is asking us to entertain this magical thought for as long as we can, and not to replace it too swiftly by something more sensible.
more from Michael Wood at the LRB here.
March 27, 2012
In the Land of Blood and Honey
Srecko Horvat on Angelina Jolie's new film In the Land of Blood and Honey, about an affair between a Serb and a Muslim during the Balkan war, in Eurozine (Warning: the article contains spoilers):
The movie tells the story of Danijel, a soldier fighting for the Bosnian Serbs, and Ajla, a Bosnian Muslim who was involved with him before the war and is now a captive in the concentration camp he oversees. It's a bad repetition of the same good old story depicted most recently in The Reader (Stephen Daldry, 2008), and unforgettably in The Night Porter (Liliana Cavani, 1974). In short, it's a story about the perpetrator and the victim and a reversal of these perspectives as the story goes on. On the one hand you have a war criminal (a concentration camp guard in The Reader, the former SS officer in The Night Porter, the Serbian officer in Jolie's movie), on the other hand you have the victim (the boy who read to the concentration camp guard, the concentration camp survivor, the innocent Muslim woman in the Bosnian war). What all three films have in common is a fatal love affair between a criminal and an innocent victim, the only difference being that, in The Reader, the boy finds out eight years later, when as a law student, he observes a trial of several women (including his former lover) accused of letting 300 Jewish women die in a burning church.
Common to all these films is also that the roles become less and less clear as the story develops. The best example is The Night Porter, where thirteen years after the concentration camp, Lucia meets Maximilian again, who is now working at a Vienna hotel; instead of exposing him, she falls back into their sadomasochistic relationship. The relationship is what Primo Levi – remembering the case of the Sonderkommando, the "special units" of camp inmates in charge of bringing their neighbours to the gas chambers – calls the "gray zone", the zone in which the "long chain of conjunction between victim and executioner" comes loose. Or, as Giorgio Agamben puts it in his Remnants of Auschwitz, "where the oppressed becomes oppressor and the executioner in turn appears as victim. A gray, incessant alchemy in which good and evil and, along with them, all the metals of traditional ethics reach their point of fusion".
The best expression of this new terra ethica was articulated by Michael in Bernhard Schlink's novel The Reader, on which the film was based: "I wanted simultaneously to understand Hanna's crime and to condemn it. But it was too terrible for that. When I tried to understand it, I had the feeling I was failing to condemn it as it must be condemned. When I condemned it as it must be condemned, there was no room for understanding. But even as I wanted to understand Hanna, failing to understand her meant betraying her all over again. I could not resolve this. I wanted to pose myself both tasks – understanding and condemnation. But it was impossible to do both." In other words, when we try to understand the crime, then we stop condemning it; and when we condemn, then we stop understanding it.
So, what is missing in Jolie's movie?
A Short Course in Thinking About Thinking
Daniel Kahneman in Edge:
I'll start with a topic that is called an inside-outside view of the planning fallacy. And it starts with a personal story, which is a true story.
Well over 30 years ago I was in Israel, already working on judgment and decision making, and the idea came up to write a curriculum to teach judgment and decision making in high schools without mathematics. I put together a group of people that included some experienced teachers and some assistants, as well as the Dean of the School of Education at the time, who was a curriculum expert. We worked on writing the textbook as a group for about a year, and it was going pretty well—we had written a couple of chapters, we had given a couple of sample lessons. There was a great sense that we were making progress. We used to meet every Friday afternoon, and one day we had been talking about how to elicit information from groups and how to think about the future, and so I said, Let's see how we think about the future.
I asked everybody to write down on a slip of paper his or her estimate of the date on which we would hand the draft of the book over to the Ministry of Education. That by itself, by the way, was something that we had learned: you don't want to start by discussing something, you want to start by eliciting as many different opinions as possible, which you then pool. So everybody did that, and we were really quite narrowly centered around two years; the range of estimates that people had—including myself and the Dean of the School of Education—was between 18 months and two and a half years.
But then something else occurred to me, and I asked the Dean of the School of Education whether he could think of other groups similar to our group that had been involved in developing a curriculum where no curriculum had existed before. At that period—I think it was the early 70s—there was a lot of activity in the biology curriculum, and in mathematics, and so he said, yes, he could think of quite a few. I asked him whether he knew specifically about these groups and he said there were quite a few of them about which he knew a lot. So I asked him to imagine them, thinking back to when they were at about the same state of progress we had reached, after which I asked the obvious question—how long did it take them to finish?
An Interview with Margarethe von Trotta on Her Upcoming Film About Hannah Arendt
Over at the Goethe Institute:
Thinking and writing, those are the things that really defined the great philosopher Hannah Arendt. The objective of the film was to transform this thought into a film, to make it a visual embodiment of a real person.
How does one use film to describe a woman who thinks? How can we watch her while she thinks? That is of course the big challenge when making a film about intellectual personalities. I insisted that Barbara Sukowa play Hannah because she is the only actress I know who I could imagine showing me how someone thinks, or that someone is thinking. And she managed to do it. For me, it was clear from the beginning that she was the one, and I had to push for her to get the role because some of the investors couldn’t visualize it. I said to them, “I am not doing this film without her.” I had the same situation with Rosa Luxemburg and again with Hildegard von Bingen – she really experienced the intellectual nature of Rosa’s political speeches, for example. That is how it is with Hannah Arendt. The viewer has to see that she is really thinking. She does two speeches in this film as well. Arendt was a professor at various universities in the United States and she did seminars and speeches on philosophical and political subject matter. In situations like that, it’s not about just reading your lines. You have to be able to improvise and develop the speech as you go. In the film there is a six-minute speech in English, with the strong German accent that Arendt had, and Sukowa is able to get viewers to experience, think and follow her analyses.
What were the preparations for the film like? And what about your contact with Arendt’s world?
Before we started writing the screenplay we met with a lot of people in New York who had known Arendt well on a personal level. People like Lotte Köhler, her longtime colleague and friend who died in 2011 at the age of 92, or Elisabeth Young-Bruehl, who also died in 2011, as well as others like Lore Jonas, widow of Hans Jonas, and Jerome Kohn, her last assistant and publisher of her posthumous writings. Those were amazing encounters, the stuff you need when you are writing a script about this type of real person who you’ve never met yourself.
Hilton Kramer, 1928-2012
William Grimes in the NYT:
Admired for his intellectual range and feared for his imperious judgments, Mr. Kramer emerged as a critic in the early 1950s and joined The Times in 1965, a time when the tenets of high modernism were being questioned and increasingly attacked. He was a passionate defender of high art against the claims of popular culture and saw himself not simply as a critic offering informed opinion on this or that artist, but also as a warrior upholding the values that made civilized life worthwhile.
This stance became more marked as political art and its advocates came to the fore, igniting the culture wars of the early 1980s, a struggle in which Mr. Kramer took a leading role as the editor of The New Criterion, where he was also a frequent contributor.
In its pages, Mr. Kramer took dead aim at a long list of targets: creeping populism at leading art museums; the incursion of politics into artistic production and curatorial decision making; the fecklessness, as he saw it, of the National Endowment for the Arts; and the decline of intellectual standards in the culture at large.
A resolute high modernist, he was out of sympathy with many of the aesthetic waves that came after the great achievements of the New York School, notably Pop (“a very great disaster”), conceptual art (“scrapbook art”) and postmodernism (“modernism with a sneer, a giggle, modernism without any animating faith in the nobility and pertinence of its cultural mandate”).
At the same time, he made it his mission to bring underappreciated artists to public attention and open up the history of 20th-century American art to include figures like Milton Avery and Arthur Dove, about whom he wrote with insight and affection.
the fate of the western
However much certain optimists may talk about the survival or possible resurrection of the Western, I fear—much to my regret—that, as a genre, it is pretty well dead and buried, a relic of a more credulous, more innocent, more emotional age, an age less crushed or suffocated by the ghastly plague of political correctness. Nonetheless, whenever a new Western comes out, I dutifully go and see it, albeit with little expectation that it will be any good. In the last decade, I can recall three pointless remakes, vastly inferior to the movies on which they were modelled and which weren’t exactly masterpieces themselves: 3:10 to Yuma by James Mangold, The Alamo by John Lee Hancock, and True Grit by the Coen brothers, all of them uninspired and unconvincing, and far less inspired than the distinctly uneven originals made, respectively, by Delmer Daves, John Wayne, and Henry Hathaway. I recall, too, Andrew Dominik’s interesting but dull The Assassination of Jesse James by the Coward Robert Ford, Ed Harris’s bland, soulless Appaloosa, David von Ancken’s unbearable Seraphim Falls, and the Australian John Hillcoat’s The Proposition, of which my memory has retained not a single image. The only recent Westerns that have managed to arouse my enthusiasm have been those made for TV: Walter Hill’s Broken Trail, and Deadwood, whose third and final season no one has even bothered to bring out on DVD in Spain, which gives you some idea of how unsuccessful the magnificent first two series must have been. In my view, Kevin Costner’s Open Range, which came out slightly earlier, was the last decent Western to be made for the big screen, even though it has long been fashionable to denigrate anything this admirable actor and director does.
more from Javier Marías at Threepenny Review here.
whoever we may be, we are aliens too
Vincent Gallo is one of the most disliked of current film actors, while George Clooney is one of the most admired, but most viewers of Essential Killing—American, Belgian, Sri Lankan, or Japanese—probably have more in common with Gallo’s “Mohammed” than they have with Clooney. Anyone can be targeted, victimized, have their eardrums blasted out, be forced to hide and kill in order to survive. All these are possibilities of human existence that, at the advanced stage of civilization we enjoy, are available to everyone. But to be George Clooney? He may make it look easy. It’s in the voice, however, that the deceptive quality of the Clooney figure can best be detected. Clooney, who is from Lexington, Kentucky, speaks with an unmarked accent, an accent of zero. His vocal deadpan (so soothing in Wes Anderson’s Fantastic Mr. Fox) projects a reasonableness and an authority that do not impose themselves through any apparent violence. When he talks, it’s as if he were saying nothing. Such a talent makes him indeed The American.
more from Chris Fujiwara at n+1 here.
first act is final curtain
It’s impossible to know how Francesca Woodman’s photographs would strike us if she hadn’t thrown herself out of a window at 22. Her suicide makes every image feel portentous. Each is a memento mori, a harbinger of imminent death. She specialised in self-portraits and the suite of choreographed scenes she shot with a timer or a remote trigger seems in retrospect a record of her unravelling. We rarely see her face. She bleeds into the background in very long exposures and disappears into crumbling walls. Her limbs vanish behind wallpaper and blur into architecture. Her flesh is barely solid, melting into mist and yielding to the rigid surface of a windowpane. The new exhibition of her work at New York’s Guggenheim Museum prompts a series of unanswerable questions. Would Woodman’s fierce self-scrutiny have ebbed with maturity or would it have inflected her entire career? Did the monomaniacal intensity of her work propel her towards death?
more from Ariella Budick at the FT here.
More than Health Insurance
From The New Yorker:
On Monday, the case of the century got even bigger. In challenges to the Affordable Care Act in lower courts, several judges gave the Supreme Court an escape hatch. These judges, including Brett Kavanaugh, a young judge sure to make Republican short lists for the Supreme Court, said that the Justices should kick the can down the road and put off a decision for a year or two. Specifically, Kavanaugh said that the Tax Anti-Injunction Act (a deeply obscure law) compelled the Justices to put off a decision on the law until it takes full effect, in 2014.
Across the ideological spectrum, the Justices, through their questions to the lawyers arguing for and against upholding the A.C.A., declined the invitation for delay. They all (that is, the eight who asked questions; Clarence Thomas did not) seemed to recognize that there were legal and prudential reasons to resolve this issue now. As Justice Ruth Bader Ginsburg said, the act “does not apply to penalties that are designed to induce compliance with the law, rather than to raise revenue. And this is not a revenue-raising measure because, if it’s successful, they—nobody will pay the penalty, and there will be no revenue to raise.” The Court, it now seems clear, will decide this case on the merits.
At Bottom of Pacific, Director Sees Dark Frontier
From The New York Times:
No sea monsters. No strange life. No fish. Just amphipods — tiny shrimplike creatures swimming across a featureless plain of ooze that stretched off into the primal darkness. “It was very lunar, a very desolate place,” James Cameron, the movie director, said in a news conference on Monday after completing the first human dive in 52 years to the ocean’s deepest spot, nearly seven miles down in the western Pacific. “We’d all like to think there are giant squid and sea monsters down there,” he said, adding that such creatures still might be found. But on this dive he saw “nothing larger than about an inch across” — just the shrimplike creatures, which are ubiquitous scavengers of the deep.
His dive, which had been delayed by rough seas for about two weeks, did not go entirely as planned: his submersible’s robot arm failed to operate properly, and his time at the bottom was curtailed from a planned six hours to about three. It was not entirely clear why. But he did emerge safely from the perilous trip, vowing to press on. The area he wants to explore, he said, was 50 times larger than the Grand Canyon. “I see this as the beginning,” Mr. Cameron said. “It’s not a one-time deal and then you move on. It’s the beginning of opening up this frontier.” National Geographic, which helped sponsor the expedition to the area known as the Challenger Deep, said that Mr. Cameron, the maker of the movies “Avatar” and “Titanic,” began his dive on Sunday at 3:15 p.m. Eastern Daylight Time, landed on the bottom at 5:52 p.m. and surfaced at 10 p.m. He conducted the news conference via satellite as he was being rushed to Guam in the hope of reaching London for the debut on Tuesday of “Titanic 3-D.”
March 26, 2012
Julie Davidow. The New Strain #3. 2008.
Gesso, acrylic, latex enamel, and enamel paint on canvas.
“To Commute,” by the Way, Can Mean to Transform (as in from Base Metal to Gold), or, The Banality and Sublimity of the Mundane
by Tom Jacobs
Each morning the day lies like a fresh shirt on our bed; this incomparably fine, incomparably tightly woven tissue of pure prediction fits us perfectly. The happiness of the next twenty-four hours depends on our ability, on waking, to pick it up.
~ Walter Benjamin
Consider how the lilies grow. They do not labor. Neither do they spin.
~ Luke 12:27
Depending on whether one has ever felt the vaguely incarceral character of everyday life, the following scene may or may not resonate. The term “everyday life” is tossed around quite a bit by cultural/critical theorists and philosophers, and it’s not always clear just what the hell they mean by it. And I will try to explain what I think it means in a moment, but first, this scene. It’s about a guy who comes to understand that the life he’s been inhabiting is not actually his own, but has yet to figure out how to create a new one. No doubt you’ve seen it, but it’s good enough to warrant watching again.
It is worth noting that this conversation takes place in the context of an emergent love that, even here, clearly begins to be felt by the two characters. And also that it takes place in something like an Applebee’s. Even in an Applebee’s, it seems, the source of true love and real hope may lie. Strange to consider.
Here’s another scene of a man locked into the routines of everyday life. I quote it in full because it reveals the mind of a man fully engaged with the world around him, however generic and cultureless it might be. We all live in cocooned little worlds, shielded by ego and desire and narcissism. But boredom here can produce insight. This guy works in a generic office in a generic skyscraper in a generic and cultureless part of a business district. And yet his is a mind that finds interest in the quotidian, the minutiae, the fabric of the everyday. He is in the bathroom of his corporate office.[i]
I negotiated the quick right and left that brought me into the brightness and warmth of the bathroom. It was decorated in two tones of tile, hybrid colors I didn’t know the names for, and the sinks’ counter and the dividers between the urinals and between the stalls were of red lobby-marble. I checked in the mirror to be sure that while chatting with Tina [Ed’s note: the secretary.] I had not had some humiliating nose problem or newsprint smudge on my face—she would probably have told me about the smudge, but not about the nose. A few sinks over from me, a vice-president named Les Guster was brushing his teeth. He was staring straight at the mirror and very likely seeing there the same expression on his face, the same quick bulgings in his cheek, that he had seen while brushing his teeth since he was eight years old. He blinked frequently, each blink slightly more deliberate than a blink he would have performed while reading or talking on the phone, possibly because the large motor movements of tooth brushing interfered with the autonomic rhythms of blinking. His tap was running. As soon as I took my place at a sink, Les bowed close to his sink, holding his tie with his free hand against his stomach, even though he was clearly not ready to rinse or spit yet, in order to shield his sense of privacy against my presence in the mirror. We were not obliged to greet each other: the noise of the water from his tap, and Alan Pilna’s winding-down urinal-flush, defined us as existing in separate realms. I was impressed by people like Les who had the bravery to brush their teeth (before lunch, even!) at work, since the act was so powerfully unbusiness-like; to indicate to him that I didn’t think that his tooth-brushing was in any way notable or comic, and that in fact I was unaware of his presence, I leaned into the mirror, pretending to study a defect on my face; then I cleared my throat so unpleasantly that there could be no doubt that I was oblivious to him. I pivoted and stationed myself at a urinal.
I was just on the point of relaxing into a state of urination when two things happened. Don Vanci swept into position two urinals over from me, and then, a moment later, Les Guster turned off his tap. In the sudden quiet you could hear a wide variety of sounds coming from the stalls: long, dejected, exhausted sighs; manipulations of toilet paper; newspapers folded and batted into place; and of course the utterly carefree noise of the main activity: mind-boggling pressurized spatterings followed by sudden urgent farts that sounded like air blown over the mouth of a beer bottle. The problem for me, a familiar problem, was that in this relative silence Don Vanci would hear the exact moment I began to urinate. More important, the fact that I had not yet begun to urinate was known to him as well. I had been standing at the urinal when he walked into the bathroom—I should be fully in progress by now. What was my problem? Was I so timid that I was unable to take a simple piss two urinals down from another person? We stood there in intermittent quiet, unforthcoming. Though we knew each other well, we said nothing. And then, just as I knew would happen, I heard Don Vanci begin to urinate forcefully.
This, it seems to me, is a tour de force that cuts to the very quick of how one with a poet’s soul might negotiate and transform everydayness. The protagonist, Howie (from Nicholson Baker’s 1988 novel, The Mezzanine), lives a boring life and spends his days crouched over a computer in a felt-lined cubicle, yet it is one full of remarkable discoveries and insights. Here is a person who pays attention, who finds in everyday life a subject worth serious inquiry and reflection. And that matter of paying attention makes all the difference. Perhaps the heart can be full anywhere on earth, as has been said. But should it be? After all, transforming the way we perceive everyday life is a very different thing from transforming the character of everyday life.
Aside from the fact that I’ve never read anything that addresses this peculiar anxiety (what used to be called “bashful bladder” syndrome when I was a kid in Nebraska, something I first became aware of trying desperately to urinate beside my junior high social studies teacher), Baker, or rather his narrator, finds plenty of poetry and even philosophy in this most unremarkable of moments. One imagines that most writers would skip over this sort of thing (the awkward bathroom encounters) to get to something more epically or emotionally significant. But there it is; it is all there—the power relationships, the performances of self, the secret histories of self, the communal sharing of a private ritual—even in the corporate bathroom.
There is an obvious tension here between the scene from “Office Space” and Baker’s Mezzanine. It is a tension that has to do with either escaping or embracing everydayness. Is there any doubt that there is something intriguing to be found in the everyday? Something to be retrieved and held up to the light and turned around? This is, after all, the very stuff that life is made of: the “fabric of our lives,” as Zooey Deschanel might have it. Think of all those ridiculous hours spent on Facebook, all those minutes and hours and days spent over years looking dumbly at one’s smart phone for a new message or a new game to play, commuting in a coma, refusing to get out of bed and grab the day by its short and curlies. It’s a lot of our life, this sort of thing. Yet people refuse it, escape it, or transform it (and yes, I know we all think we manage this all the time, but do we really? A strong argument could be made that this is why vacations are so important to us…because we lie to ourselves about how exciting our everyday lives are…) But is it worth it, this work of retrieval and recuperation?
Whether to find personal fascination with the boredoms and routines of everyday life, or to, Lefebvre-like, seek to transform them into something fresh and new. Either to find heart-breaking beauty in mundanity, or to transform mundanity into something remarkable and extraordinary.
Is this sort of thing resolvable?
Infinite resignation is that shirt we read about in the old fable. The thread is spun under tears, the cloth bleached with tears, the shirt sewn with tears; but then too it is a better protection than iron and steel. The imperfection in the fable is that a third party can manufacture this shirt. The secret in life is that everyone must sew it for himself, and the astonishing thing is that a man can sew it fully as well as a woman.
~ Kierkegaard (Fear and Trembling)
I don’t know what to make of this last clause and so choose to ignore it, but the idea that we must ultimately recognize and then resign ourselves to some fundamental mystery or absurdity, and to wear this resignation like a hand-knit shirt, seems true enough. Of course it’s difficult to fully feel or discern this mystery and absurdity in everyday life, but it happens. John Updike also has something to say on this matter, writing of the Umbrella Man.[ii] (And if you don’t know the story of the Umbrella Man, here it is, in a short, six-minute documentary by Errol Morris:)
(Morris's short documentary can be found here.)
The story of the Umbrella Man cannot but generate all manner of uncertainty about how much we can ever know, how much we understand about social reality, and most important of all, about how much the strangeness of the world outruns our categories and abilities to assimilate all those things that don’t really seem to make sense. As Special Agent Dale Cooper once noted, somewhat cryptically: “Whenever two events happen simultaneously, pertaining to the same object of inquiry, we must always pay strict attention.” Quite so. Morris and his interlocutor get to the bottom of this one, but for every magical realist moment that gets fathomed, there must be infinite others that don’t.
Here’s Updike’s gloss on the meaning of the Umbrella Man, and what it says about even the most mundane events or moments:
We wonder whether a genuine mystery is being concealed here or whether any similar scrutiny of a minute section of time and space would yield similar strangenesses—gaps, inconsistencies, warps, and bubbles in the surface of circumstance. Perhaps, as with the elements of matter, investigation passes a threshold of common sense and enters a sub-atomic realm where laws are mocked, where persons have the life-span of beta particles and the transparency of neutrinos, and where a rough kind of averaging out must substitute for absolute truth. The truth about those seconds in Dallas is especially elusive; the search for it seems to demonstrate how perilously empiricism verges on magic.
~ John Updike, The Talk of the Town, December 6, 1967
Empiricism does indeed verge on magic, most perilously, as is most evident to anyone who’s ever hung around in a basic science research laboratory and noticed the strange sumptuary rules (the white coats that, if you squint, look for all the world like some kind of surplice or cassock), the strange instruments that allow us to see the unseen, the remarkable technologies that perform alchemical reactions and mixings beyond the abilities of the merely human. It’s not just that to an insufficiently developed culture modern technology appears as magic; hell, it appears as magic to our own culture. Or it should.
The television is on. A couple sits in the living room, enjoying some small degree of a week’s unknotting, quietly playing the inward record of a day’s unspoken humiliations and triumphs. The remaindered aroma of dinner floats and absorbs itself into whatever fabric it can find. The couple’s bodies begin to melt into the sofa that is the center of their home. Slackjawed and riveted, the couple recedes into a moment of shared and private intimacy, sitting each beside the other, watching the same program at the same time, each with his/her own private movie of the day playing in their heads. Their fascination with the images on the TV is disrupted, however, by the sound of an odd scratching noise. A noise that is coming from the kitchen, separated by the most papery-thin lath and plaster walls. Not a natural sound. Or at least, not an expected sound.
A moment passes between them. Anxiety ripples through the room. What is that noise? What could possibly be scraping around in the kitchen?
The boyfriend, having seen a tiny, cheetoh-sized mouse with big ears in the kitchen the other day, feels a surge of masculine adrenaline. He declares that he will go investigate.
He opens the door to the kitchen, turns on the light and there, perhaps three feet away from him, is a raccoon the size of a small golden retriever, holding in its black begloved hands a loaf of bread that had been left out.
The raccoon and the boyfriend hold each other’s gaze for perhaps five seconds. Each seems to be wondering what the other is doing there. The moment is held. Each looks at the other from across an abyss of mutual incomprehension. Then the moment is gone. The raccoon drops the loaf of bread and flees back out the open window and fire escape from which it presumably came. The boyfriend never sees the raccoon again.
The boyfriend thinks about this moment frequently in the ensuing years, trying to figure out what it means. He never reaches the bottom of it.
Politics is all about the creation of cultural time and space—the world we walk through every day without much thinking about it. Our being is profoundly tied to the types of practice that generate the space and time we inhabit.
The notion of resistance has everything to do with the nature of everyday life. Even those most exploited amongst us nevertheless find a way to make themselves feel at home in the world. And this world-building is fundamental…it is the substratum upon which everything else is built. To be sure, there are multiple and reciprocal forces at work here (the state and the economic system and the simple need to make money are of course going to shape one’s everyday practice), but what one can effect and shape and fashion is one’s everyday life, one’s everyday relationship to the world and to others.
Without the active application of creativity and the imagination we are lost. We follow the channels that have already been dug. We do not self-invent. We do not fear that we don’t know who we are or what we should do. Desire (a tricky one) and pleasure also figure in here. The worst case: when we think we are “free” is precisely the moment when we are least free. Free to consume and buy more shit and do what we are supposed to do. The problem is in part one of repetition: of consuming, of working, of mimicking the behaviors and attitudes of yesterday (and the day before). But it is not the sole problem, as was made clear to me Sunday morning, when I read this piece about a man who seeks to walk every street in the five boroughs. It’s an extraordinary piece that can be found here. Even more remarkable is his personal blog about his vagrancy, which consumed a good two hours of my morning and can be found here (caveat lector!): http://imjustwalkin.com/
The discoveries he finds simply by walking the streets are enough to make me want to drop everything and just start walking.
The quotidian as a legitimate object of philosophical reflection... If there is an authenticity to be found in this fallen world, surely it is here, in the quotidian, in the finding or making of self in the mundane and banal. With luck, we experience what Lefebvre calls “moments of presence,” which are “those instants that we would each, according to our own personal criteria, categorise as ‘authentic’ moments that break through the dulling monotony of the ‘taken for granted’.”
The contours of the everyday have to change before anything else will…otherwise old orders will merely reappear. It is here, after all, in the trenches of the everyday, that the reproduction of everything occurs: the relations of production and reproduction, the routines, the potentially cretinizing effects of social media. We find that we consent without really consenting to the existing order. We go to work, we behave in prescribed manners with others and our coworkers and our families and friends, and then we go home and buy a bunch of stuff to cook or eat with loved ones or by ourselves, & etc.
And now I begin to sound preachy and pedantic. All I know for sure is that this sort of thing is tough and hard to achieve. It defies me most days, but not always. Perhaps today will be one of those days where it doesn’t. We’ll see. Now I have to go lay out the shirt that I will wear before I take a shower and head off to work. It may be ironed but I’m pretty sure it will be itchy.
[i] I might add that this passage is chosen more or less randomly. There are far more thick descriptions of office life, but this one is intriguing to me because it suggests some of the Tom Wolfe-ian dimensions of the give and take, the private exchanges that occur even amongst strangers or coworkers in that strangest of public spaces: the bathroom. If you want a true tour de force, here it is:
On the way back, my office seemed farther from the CVS than it had on the way there. I ate a vendor’s hot dog with sauerkraut (a combination whose tastiness still makes me tremble), walking fast in order to save as much of the twenty minutes of my lunch hour as I had left for reading [Ed’s note: he’s reading Marcus Aurelius…and there’s a great passage coming up…]. A cookie store I passed had no customers in it; in under thirty seconds, I had bought a large, flexible chocolate chip cookie there for eighty cents. Waiting for a light five blocks away from my building, I took a bite of the cookie; immediately I felt a strong need for some milk to complement it, and I nipped into a Papa Gino’s and bought a half-pint carton in a bag. [Ed’s note…I will cut out some of the digressions upon digressions, even though I love them so…].
I placed the CVS bag beside me and opened the carton of milk, pushing an edge of the bag Donna had given me under my thigh so that it would not blow away. The bench gave me a three-quarter view of my building: the mezzanine floor, a grid of dark green glass with vertical marble accents, was the last wide story before the façade angled in and took off, neck-defyingly, into a squint of blue haze. The building’s shadow had reached one end of my bench. It was a perfect day for fifteen minutes of reading. I opened the Penguin Classic at the placemarker (a cash-machine receipt, which I slipped for the time being several pages ahead), and then I took a bite of the cookie and a mouthful of cold milk. [Ed’s note: I will skip a bit here…]
I found my place on the brilliant page and read:
Observe, in short, how transient and trivial is all mortal life: yesterday a drop of semen, tomorrow a handful of spice and ashes.
Wrong, wrong, wrong! I thought. Destructive and unhelpful and completely untrue!—but harmless, even agreeably sobering, to a man sitting on a green bench on a herringbone-patterned brick plaza near fifteen healthy, regularly spaced trees, within earshot of the rubbery groan and whish of a revolving door. I could absorb any brutal stoicism anyone dished out! [Ed’s note: I will stop here…you get the idea…this is an individual who observes and reflects the complexity of everyday experience back out into the world in a way that is much needed].
[ii] well worth a watch…a six-minute video made by Errol Morris for the NYTimes, and based on a book that a formerly tenured Kierkegaard scholar wrote…he apparently decided at some point that, you know what? Done with Kierkegaard. F Kierkegaard. So he became a private investigator. An amazing dude in his own right.
Maybe It Is I Who Am The Zombie
Or, Reading is Bad
Or, A Tale of Two Storytellers.
My Philadelphia childhood was marked by the image of my mother under lamplight, bent over a book, studying to become a folklorist. She was always studying children's games and rhymes and reading weighty, scary, assigned tomes like "The Sex Lives of Savages". She came to folklore through this fascination she'd developed with the voice of a man she met in Benin, West Africa, in her late twenties. His name was Nondichao and he was a skeletal, tall old griot before whom she'd place a boxy tape recorder time and again over the course of decades. I remember his grainy French-African voice very well, as if it runs through my dreams without my knowing. With a gravelly lilt Nondichao told her, over many a sweaty bottle of Fanta, and all from memory, the bloody and amazing histories of the kingdom of Dahomey as they had been relayed to him by a series of griots, all now dead. In the meantime I played with the village children chasing hoops and petting goats, and we all were recorded in the background static.
She came to that fascination--with his storytelling--because she was a storyteller herself, and had worked for a friend's children's theater group in Connecticut called Oddfellows Playhouse. And that fascination started from an even more direct seed--she'd been a devoted theatre-person. She'd been the kind of older sister who is constantly organizing her siblings into little backyard productions, who grows up into a theatre major...
So for me there's always been this narrative that explains how one could get from theatre to storytelling to folklore to history (and perhaps back again) all by following a fascination with the human voice.
Of course my mother has a lovely, expressive speaking voice. But in retrospect I see that that voice is partially responsible for the fact that I nearly failed second grade. When we left Benin I was six and she was thirty; and by the time I was eight, despite the best efforts of the Philadelphia public school system, I still couldn't read.
So I often thank my stars that I wasn't born in our current era of over-diagnostic tendency, as I'm sure I'd have been shunted off into various sad special rooms and my life might have gone quite differently. But my academic problem was pretty basic. I didn't have a disability. I preferred to be read to.
These days I admit to being a person who has devoted her life to paper, but this stubbornness about wanting words to live in the air has never entirely left me. The following poem about zombies reflects my experiments with what one can do with an oral tradition:
I never particularly wanted to do this kind of thing on video because, like Nondichao's stories, the words aloft in the air are best spoken from one person to a small group of listeners, preferably in a circle. I really like a small circle. The green light glinting from my computer-camera is sort of weird and hard to perform towards, though I'm working on that.
I suppose it's obvious that I ultimately learned to read (thanks, largely, to Archie comic books) but even in graduate school I found that I was uncomfortable handing out stacks of poems for other students to read silently in their heads. I worried that they wouldn't hear it right. But I never had the guts to perform full-out in those workshop classrooms. I've spent years now, as have many of my compatriots in what they call "spoken word" circles, trying to write poems that work on page and stage. The zombie one is one I exclude from my manuscript because I don't know how it works on paper. Here's a quick shot at it, though. Any advice on line-breaks from paper-poets is welcome!
Maybe It Is I Who Am the Zombie-
Zombie sleepwalking through the dirty street with my head of paper. my stomach of paper. . .I'm busy eating paper with my paper gut. Maybe It is I Who Am the Zombie-Zombie sleepwalking through the dirty street. All the bodies in this train-car are haloed in white/soft-white like lightning, like heads ripped out from magazines and pasted by me to keep a zombie company (company for zombie! very good company!) but maybe it is I.
"I was in love with a place in my mind in my mind. . ."
I pilfered all your blood. I stole, I snuck it I tucked it in my coat I had to have it
for my very own heart. to swell and swallow and explode all over
(oh my greedy-greedy heart. oh my greedy-greedy heart.)
nobody can feel any feelings but me! So maybe it is I. . .
"But I love the way you say good-morning and you take me the way I am."
This poem, by the way, is about how reading is bad--though that's sort of tongue-in-cheek. Given that I now teach reading and writing I probably shouldn't say things like that. But at the base of the poem I'm worrying about my own ability to twist narratives (making myself into the hero or martyr of any story is pretty easy for me) and how that instinct for epic narrativizing is not always the best skill to have when it comes to relationships. Particularly romantic ones. But what a boring thing to talk about. I prefer thinking about that topic in a comic-book way--by picturing a sort of paper-love-zombie who walks around the city thinking everyone else is made out of magazines, and stealing the life-essence and blood of the sweetheart in order to make better stories. A zombie made of paper who eats paper to make paper. Something like that.
In any case, like this guy, I have a complicated relationship with paper.
maybe you think I do not know
maybe you think I could not be
maybe I am not where I go
maybe you are not here with me
perhaps the moon is nothing old
perhaps the sun is never new
perhaps all stories have been told
perhaps there is no being through
it could be everything is here
it could be everything is near
it could be heaven is not far
it could be now just where we are
perhaps all maybes will be done
maybe all should-bes might be too
it could be everything is one
beyond the shadow of we two
by Jim Culleny
Entropy -- a primer
C.P. Snow famously said that not knowing the second law of thermodynamics is like never having read Shakespeare. Whatever the particular merits of this comparison, it does speak to the centrality of the idea of entropy (and its increase) to the physical sciences. Entropy is one of the most important and fundamental physical concepts and, because of its generality, is frequently encountered outside physics. The pop conception of entropy is as a measure of the disorder in a system. This characterization is not so much false as misleading (especially if we think of order and information as being similar). What follows is a brief explanation of entropy, highlighting its origin in the particular ways we describe the world, and an explanation of why it tends to increase. We've made some simplifying assumptions, but they leave the spirit of things unchanged.
The fundamental distinction that gives rise to entropy is the separation between different levels of description. Small systems, systems with only a few components, can be described by giving the state of each of their components. For a large system, say a gas with billions of molecules, describing the state of each molecule is impossible, both because it would be tedious and because we don't know the state of each molecule. And, as we'll point out again later, for many purposes knowing the exact state of the system isn't useful. In theory we can predict how a system evolves by knowing its exact state, but in practice this is much too complicated to do unless the system is very small. So we instead build probabilistic predictions taking into account only a few parameters of the system, which gives us a coarser but more relevant level of description, and we seek to describe changes in the world at this level.
There is nothing that makes this uniquely part of physics, of course, and there are many other cases where we need to investigate the relationship between levels of description. Let's consider a toy example. Imagine we have a deck of cards in some order. We can describe the ordering in many different ways. The most complete way is to give an ordered list, like so: King of Hearts, Two of Clubs, Ace of Diamonds, and so on. But we can also use a coarser description, which is what we do when we describe a pack of cards as shuffled or not. So just for concreteness, let's say that we can only distinguish two states: one in which the cards are arranged in order (Ace of Clubs, Two of Clubs, …) and the other being everything else. Let's call these state A and state B. This is a less informative level of description, of course.
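To make the two levels concrete, here is a minimal sketch in Python (the language, and the shrunken four-card deck, are our choices for illustration; a real deck has far too many orderings to list):

# Toy model: a 4-card "deck". Macrostate A = cards in order; B = everything else.
from itertools import permutations

deck = [1, 2, 3, 4]

def macrostate(ordering):
    # The coarse description: 'A' if the cards are in order, 'B' otherwise.
    return 'A' if list(ordering) == sorted(deck) else 'B'

microstates = list(permutations(deck))  # the fine description: every exact ordering
print(len(microstates))                 # 24 microstates (4 factorial orderings)
print(macrostate((1, 2, 3, 4)))         # 'A'
print(macrostate((3, 1, 4, 2)))         # 'B'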
The description in terms of states A and B is a macroscopic one, and the description in terms of the exact ordering is a microscopic one. This is a matter of difference in degree rather than kind, and there are many intermediate levels of description. The states in the macroscopic level of description are the “macrostates”; in this case we have macrostates A and B. Similarly, the states in the microscopic level of description are the microstates; in this case we have a gigantic number of different ones, each corresponding to a particular ordering of the cards.
Now let's start shuffling the cards. If we start in state A (cards arranged in order), we'll quickly end up in state B (cards not in order). On the other hand, if we start in state B we'll almost certainly remain in state B. So the system doesn't seem to be reversible: state A almost always leads to state B, and state B almost never leads to state A. However, if we were describing the system using the microscopic level of description, we'd always see one arrangement of cards lead to another, and the chances of transitioning between the various arrangements are the same, so everything is reversible.
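A quick simulation shows this one-way traffic (a sketch under our assumptions: random.shuffle stands in for the microscopic mixing, and eight cards keep the chance of returning to state A negligible):

import random

random.seed(1)
cards = list(range(1, 9))   # eight cards, starting in macrostate A (ordered)
for step in range(10):
    random.shuffle(cards)   # one microscopically reversible mixing step
    state = 'A' if cards == sorted(cards) else 'B'
    print(step, state)      # 'B' every time: returning to A has probability
                            # 1/8! per shuffle, about 0.0025 percent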
So what happened? Well, from the microscopic point of view, our macrostates are asymmetric and the asymmetry comes from the particular representation we chose. State B includes a large number of microscopic states, so most arrangements of cards will belong to state B. State A includes very few microstates; there are only a few ways for the cards to be in order. And when we shuffle the cards we naturally end up in the state which includes more microstates. So to explain what's happening we need the number of microstates compatible with a given macrostate. This is called the multiplicity. In this picture, we'd associate a small number with A and a large number with B, and we'd say that the system tends to go from states with a small number of compatible microstates (low multiplicity) to those with a larger number of compatible states or higher multiplicity.
Entropy is a measure of this number (it is the log of this number for reasons that are interesting but not critical). And so the entropy is a property of the particular macrostate (macrostate A has low entropy; macrostate B has high entropy). Entropy is also a property of the description. If we choose a different set of macrostates, we'll have a different set of associated entropies. But as long as the system is being mixed up at the microscopic level, which is what happens when we shuffle, we'll see the system move from states with low entropy to states with high entropy. In the card example, we can call state B “disordered” and state A “ordered”, but entropy is not measuring disorder. The high entropy of state B just tells us that there are many more states we call disordered than there are states we call ordered. We could have instead chosen macrostates C and D where state C contained three disordered arrangements of cards and state D contained everything else (including the ordered arrangements). Here state C would have low entropy even though the microstates it contains are disordered.
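In code, the multiplicity and entropy of the toy deck take only a few lines (again a sketch; we use the natural log and, as is conventional, drop physical constants):

import math
from collections import Counter
from itertools import permutations

deck = [1, 2, 3, 4]
multiplicity = Counter(
    'A' if list(p) == sorted(deck) else 'B'   # classify each microstate
    for p in permutations(deck)
)
print(multiplicity)   # Counter({'B': 23, 'A': 1})
entropy = {s: math.log(n) for s, n in multiplicity.items()}
print(entropy)        # A: log(1) = 0.0; B: log(23) ≈ 3.1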
The entropy increase is probabilistic, in that it happens on average. There's nothing to prevent the mixed-up set of cards from being shuffled back into the ordered state, but this is massively unlikely for anything but the smallest systems. It's a fun digression to look at the numbers involved, to get a sense of how large they are and to see why these statistical laws, like the law that entropy increases on average, are in practice exact. The number of ways of arranging things grows very, very fast. A deck of 2 cards has two possible arrangements, 3 cards have 6, and 4 cards have 24. These numbers are small, but they rapidly become much bigger than astronomical. The number of ways of arranging half a deck of cards is already about a billion times the age of the universe in seconds, and the rate at which the numbers grow keeps increasing. With numbers like these, “almost always” and “almost never” are “always” and “never” on the timescales that we experience.
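We can check that half-deck figure directly; a quick sketch, taking the age of the universe to be roughly 13.8 billion years:

```python
# Compare the number of orderings of half a deck (26 cards) to the age of
# the universe in seconds.
from math import factorial

arrangements = factorial(26)                       # ~ 4.0e26 orderings
age_of_universe_s = 13.8e9 * 365.25 * 24 * 3600    # ~ 4.4e17 seconds

print(f"{arrangements / age_of_universe_s:.1e}")   # ~ 9e8, about a billion
```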
Physicists are fond of gases in boxes, and a classic example of entropy increase is the expansion of a gas in a box. Say we have a box with a partition dividing it into two halves, and we fill one of the halves with a gas. We then remove the partition and watch what happens. The gas molecules start off in one half of the box, and we'll always observe the gas expand to fill the whole box. So now consider two macrostates: state A has the gas in one half of the box and state B has the gas spread out everywhere. If the gas molecules are wandering around freely, they will wander through all the possible arrangements of gas molecules in the box (the microstates). Very few of these correspond to state A; most correspond to state B. And so our system moves from a state of low entropy to a state of high entropy. Again, we might call state A more ordered than state B, to reflect the fact that it would take an unusual conspiracy to see the system in state A. But entropy is not measuring this putative order or disorder.
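Here is a toy Monte Carlo version of this, under the simplifying assumption that each molecule independently ends up in either half of the box with equal probability; the molecule count and trial count are illustrative choices:

```python
# Randomly place N molecules in the two halves of the box many times and
# count how often they all end up in the left half (macrostate A).
import random

N = 30                      # number of molecules (tiny compared to a real gas)
TRIALS = 1_000_000

all_left = sum(
    all(random.random() < 0.5 for _ in range(N))
    for _ in range(TRIALS)
)

# P(all left) = 2**-N ~ 1e-9, so even a million random microstates
# essentially never land in state A.
print(all_left, 2.0 ** -N)
```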
Now note a couple more things. First, we were able to make this prediction without knowing the detailed state of the system. We used our two macrostates and the entropies associated with them to predict the transition. And, as pointed out before, even if we did know the detailed state of the system we'd find it useless for prediction. In fact, given any particular microstate we'd find it practically impossible to predict how the system will evolve, but knowing that the microstate is chosen randomly from a collection of microstates allows us to make a probabilistic prediction, which is effectively exact because of the large numbers involved. So this has the interesting consequence that not only can we make predictions from a higher, incomplete level of description, the incompleteness actually seems to help. Different levels of description make different things possible.
In these simple examples, the macrostates are fairly obviously a product of our description. In general, are the macroscopic variables we use to describe systems purely subjective or do our theories and the universe give us preferred macroscopic variables and preferred levels of description? Can just anything be a macroscopic variable or are there particular criteria that make for a good macroscopic variable? Can we really just lump together a few arbitrarily chosen states and call that a macrostate? This is a matter of vigorous debate, and is perhaps a subject for a separate article.
Now given a system we can ask how various changes to the system affect the number of states accessible to it or, equivalently, the entropy. In particular, how does adding energy to a system change the entropy? Adding more energy to a system usually increases the number of states available to it. This is both because with more packets of energy there are more ways to distribute them between the members of a system, and because more energy makes high energy states accessible in addition to lower energy ones. Trying to formalize this relationship leads naturally to temperature, which is the factor that tells us how to convert changes in energy into changes in entropy. Adding a given quantity of energy to a system at low temperature increases the entropy more than adding the same quantity of energy to a system at high temperature.
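In symbols, this is the standard thermodynamic definition of temperature (holding everything else about the system, such as its volume, fixed):

```latex
\frac{1}{T} = \frac{\partial S}{\partial E}
```

A small ∂S/∂E means a high temperature: the same packet of energy buys less entropy.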
So imagine we have two systems at different temperatures connected together. Packets of energy are being exchanged back and forth, and the joint system wanders through its possible states, just like the cards being shuffled. Let's say we move a packet of energy from the hotter system to the colder one. Taking the energy away from the hotter system reduces its entropy, while giving it to the colder system increases that system's entropy. Crucially, the increase in entropy of the low temperature system is greater than the decrease in entropy of the high temperature system. So there will be more states in which the energy packet has moved from the high temperature system to the low temperature one than vice versa, and energy will flow from the hotter system to the colder one.
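The bookkeeping in symbols: moving a small packet of energy ε from the hot system to the cold one changes the total entropy by

```latex
\Delta S_{\text{total}} = \frac{\varepsilon}{T_{\text{cold}}} - \frac{\varepsilon}{T_{\text{hot}}} > 0
\quad \text{whenever } T_{\text{hot}} > T_{\text{cold}},
```

which is positive exactly when energy flows from hot to cold, so that direction of flow opens up more microstates and wins out.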
There are many interesting directions to explore from here; we've really only scratched the surface. For one thing, we've left out some subtleties. Apart from the issue of what makes a good macrostate or level of description, we've often invoked a process that mixes things, like shuffling cards. We haven't explored the details of this process or been explicit about what we require from this process. For example, what happens if the shuffling process depends on which macrostate we are in? We've also assumed that the system explores all its possible microstates with equal probability. What happens when this is not the case?
We also haven't talked about attempts to understand the direction of time using entropy increase. The laws of physics seem to be time symmetric at a microscopic level, in much the same way that our card shuffling doesn't pick out a preferred direction, so it's puzzling where the direction of time comes from. Entropy increase at the macroscopic level does seem to give us an asymmetry (entropy increases towards the future), and some people have argued that this can be used to ground the direction of time.
But the origins of the current low entropy state of the universe and its consequences are far from clear. To put this in the context of our card example, if I come across a pack of ordered cards there are two primary possibilities. Perhaps the cards started in an ordered configuration (maybe someone put them that way, or they were manufactured that way) and they haven't been shuffled very much. In this case, the cards were in a low entropy state in the past and will be in a higher entropy state in the future and we have an asymmetry that comes from the initial conditions. But another possibility is that we've been shuffling the cards for a long time and, just by random fluctuation, they've ended up in an ordered state. In this case, the cards were probably in a high entropy state in the past and will be in a high entropy state in the future.
As in the first card scenario, most explanations of why entropy seems to increase in one direction require that the universe started in a state of low entropy. This may seem like question begging, since it just pushes the asymmetry back to where the universe started, but it might be the best we have at the moment. And it may turn out that low-entropy initial conditions will emerge naturally from cosmology and the next generation of physics as we better understand the initial state of the universe. Alternatively, there is the grimmer Boltzmann brain hypothesis: we are just random low entropy fluctuations in a high entropy universe, much like an ordered pack of cards emerging from repeated shuffling after a very long time. In this view, a part of the universe (or a particular universe) has just briefly fluctuated into an ordered state. Since one brain fluctuating into existence is much more probable than an entire world doing so, the rest of the world is probably an unstable illusion and will wink out of existence in the next moment as the system fluctuates back to a high entropy state. Thankfully, few working physicists seem to actually believe this.
LOST IN TRANSLATION
“Dye,” Mother says
touching her silver hair.
Harry the shrink strokes his gray beard,
“I'm proud of it.”
“Operation Doctor Sahib,”
she points to the mole on her nose.
“God’s gift,” he says. She shows him
her ulna, fractured in a recent fall.
“Make it as it was.”
Harry the shrink shows his bruised wrist,
“Fell off the bike when I was young.”
She removes her slip-ons: Girl’s feet,
Red polish chipped at cuticles.
“Slice off my bunions.”
Harry the shrink removes his socks: Big
Mother glares at me,
her fifth child, reclined
as usual on the couch, translating
Kashmiri, Mother Tongue.
“What does this decrepit man know,”
she says, “My life is ahead of me.”
For my mother, Maryam, on her 90th
3 March 2012
Hebrew Home for the Aged, Riverdale, NY
Rafiq Kathwari is a guest poet at 3Quarks Daily