leonora carrington


A few months ago, I found myself next to a Mexican woman at a dinner party. I told her that my father’s cousin, whom I’d never met and knew little about, was an artist in Mexico City. “I don’t expect you’ve heard of her, though,” I said. “Her name is Leonora Carrington.” The woman was taken aback. “Heard of her? My goodness, everyone in Mexico has heard of her. Leonora Carrington! She’s hugely famous. How can she be your cousin, and yet you know nothing about her?”

How indeed? At home, I looked her up, and found myself plunged into a world of mysterious and magical paintings. Dark canvases dominated by a large, sinister-looking house; strange and slightly menacing women, mostly tall and wearing big cloaks; ethereal figures, often captured in the process of changing from one form to another; faces within bodies; long, spindly fingers; horses, dogs and birds.

more from The Guardian here.



If modernism had a pope, it was Picasso.

PABLO PICASSO’S spell over 20th-century art can perhaps be summed up in five words spoken by the Armenian-American painter Arshile Gorky in 1934. Informed that Picasso had recently started making messier paintings, the very tidy Gorky famously replied, “If Picasso drips, I drip.”

Picasso’s staggering output — more than 20,000 paintings, drawings, prints, sculptures, and photographs — gave him an exposure unprecedented for a living artist. The fact that he spearheaded the century’s most important movement (cubism), invented its defining technique (collage), and painted its most imposing masterpiece (“Guernica”) makes it hard to think of any modern artist — including rivals and elders — who didn’t at some point in his career take cues from Picasso’s Paris studio. If modernism had a pope, it was Picasso.

more from Boston Globe Ideas here.

GOT OPTIMISM? THE WORLD’S LEADING THINKERS SEE GOOD NEWS AHEAD

From Edge:

While conventional wisdom tells us that things are bad and getting worse, scientists and the science-minded among us see good news in the coming years. That’s the bottom line of an outburst of high-powered optimism gathered from the world-class scientists and thinkers who frequent the pages of Edge, in an ongoing conversation among third culture thinkers (i.e., those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are).

The 2007 Edge Question marks the 10th anniversary of Edge, which began in December 1996 as an email to about fifty people. In 2006, Edge had more than five million individual user sessions.

I am pleased to present the 2007 Edge Question:

What Are You Optimistic About? Why?

The 160 responses to this year’s Edge Question span topics such as string theory, intelligence, population growth, cancer, climate and much, much more. Contributing their optimistic visions are a who’s who of interesting and important world-class thinkers.

Got optimism? Welcome to the conversation!

More here.

Gene doubles breast cancer risk

From BBC News:

Women with a damaged copy of the gene called PALB2 have twice the risk of breast cancer, scientists at the Institute of Cancer Research found. They estimate that faulty PALB2 causes about 100 cases of breast cancer in the UK each year. Two damaged copies of the gene also appear to cause a serious blood disorder in children, they report in Nature Genetics. It is PALB2’s job to repair mutant DNA, so people who have a faulty copy of the gene are more likely to accumulate other genetic damage too, leading to problems like cancer.

Professor Nazneen Rahman and her team studied the DNA of 923 women with breast cancer and a family history of the disease, not caused by the known breast cancer genes BRCA1 or BRCA2. Ten of the breast cancer patients had a damaged copy of PALB2, as against none of 1,084 healthy women used as a comparison. Carrying a faulty version of PALB2 more than doubled a woman’s risk of developing breast cancer – taking her lifetime risk from one in nine to about one in five.
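The quoted risk figures hang together arithmetically. As a minimal sanity check, in a short Python sketch (the one-in-nine baseline and the simple doubling of risk are taken from the excerpt above, not from the underlying study):

```python
# Illustrative check of the figures quoted above; the 1-in-9 baseline lifetime
# risk and the (roughly) doubled relative risk come from the excerpt, not from
# the paper itself.

baseline_lifetime_risk = 1 / 9   # general-population lifetime risk quoted above (~11%)
relative_risk = 2.0              # "more than doubled" -- a plain doubling is used for simplicity

carrier_lifetime_risk = baseline_lifetime_risk * relative_risk
print(f"Carrier lifetime risk: {carrier_lifetime_risk:.0%} "
      f"(about 1 in {1 / carrier_lifetime_risk:.1f})")
# Prints roughly 22%, i.e. about 1 in 4.5 -- which the article rounds to "about one in five".
```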

More here.

Monday, January 1, 2007

A Case of the Mondays: the Year of Dashed Hopes

I presume that at the end of each year, pundits, writers, and bloggers gather to discuss the year’s political trends. Most of what they discuss is invariably pulled out of thin air, but I hope I’m basing my own analyses on enough evidence to escape that general description. It’s accurate to characterize 2004 as the year of liberal democratic hopes: the Orange Revolution in Ukraine, the new parliamentary elections in Georgia consolidating 2003’s Rose Revolution, the calls for democratic revolution in Iran. This continued into early 2005 with Lebanon and the scheduled elections in Palestine.

And then it all crashed. New Ukraine was plagued by corruption. The Tulip Revolution didn’t go anywhere. Frustration with the slow pace of reform in Iran catapulted Ahmadinejad to power instead of ushering in a new democratic system. Fatah looked weak on corruption, weak on Israel, and weak on public order, while Hamas looked like a fresh change.

In the Middle East, 2006 was the year of dashed hopes, even more so than 2005. Iraq was irrevocably wrecked long before 2006 started, but 2006 was the year the violence escalated. Most wars kill many more people than any subsequent occupations; in Iraq, there were more people killed in 2006 than in 2003. The Sunni-Shi’a rift had been there for fifteen years, but intensified over the course of last year, and spilled over to other countries in the region: Iran, Saudi Arabia, Lebanon. Throughout most of the year, there was only escalating violence and increasing legitimization of Muqtada Al-Sadr, but right at the end, the execution of Saddam was probably carried out by Al-Sadr’s followers, rather than by the government.

The single country in the region whose hopes were dashed the most was of course Lebanon. The Cedar Revolution was supposed to usher in a new age of democracy built along the same pillarized model that had worked in the Netherlands for about a century. Hezbollah was supposed to reform itself from a terrorist organization to a legitimate if fundamentalist political party. And the country was supposed to become independent of Syrian and Iranian influence. To a large extent due to Israel’s lack of knowledge of foreign policy responses that don’t involve military force, those hopes disintegrated in the summer of 2006.

In Palestine, Hamas won the parliamentary election, which Israel considered equivalent to a writ permitting the IDF to kidnap elected Palestinian officials at will. As had happened in Nicaragua in the early 1980s, the Hamas government found itself stripped of development aid, and became increasingly radicalized as a result. Israel responded the only way it is familiar with, i.e. with military force, and killed 655 Palestinian civilians in the Occupied Territories, up from 190 the previous year.

And in the US and Iran, two conservative Presidents with a vested interest in muzzling liberal democratic opposition escalated their saber-rattling game. In Iran, that meant crackdowns on opposition media, especially in the wake of Israel and Hezbollah’s war. Although reformists gained power in the election toward the end of the year, real power in Iran lies in the hands of the unelected Supreme Leader Khamenei, who is as opposed to democratic reforms as Ahmadinejad.

At the same time, 2006 was the year of recognition. In Iraq, the situation became so hopeless it became impossible to pretend everything was going smoothly. Right now the only developed country where the people support the occupation of Iraq is Israel, where indiscriminately killing Arab civilians is seen as a positive thing. The Iranian people did the best they could to weaken the regime within the parameters of the law. Hamas’s failure to deliver on its promise to make things better led to deep disillusionment among the Palestinians, which did not express itself in switching support to even more radical organizations. And most positively, the Lebanese people, including plenty of Shi’as, came to see Hezbollah not as a populist organization that would liberate them from the bombs of Israel, but as a cynical militia that played with their lives for no good reason.

Elsewhere, there were no clear regional trends. However, the political events of 2006 in the United States might point to a national trend of increased liberalism. On many issues the trend is simply a continuation or culmination of events dating at least fifteen years back, but on some, especially economic and foreign policy ones, the shift was new. In 2002 and 2004, the American people voted for more war; in 2006 they voted for less. While they didn’t elect enough Senate Democrats to withdraw from Iraq, they did express utter disapproval of the country’s actions in Iraq. This trend originated in the Haditha massacre of 2005, and Bush’s approval rating crashed in 2005 rather than in 2006, but it was in 2006 that the general discontent with the direction of American politics was expressed in a decisive vote for a politically weak party over Bush’s party.

So after the hope of 2004 and early 2005, 2006 was not just the year when violence rebounded and democracy retreated in the Middle East, but also the year when public unrest with the status quo grew. This unrest did not manifest itself in any movement with real political power, and I don’t want to be so naively optimistic as to predict that it will. I mentioned that the Iranians did everything within the parameters of the law to support democratic reforms; but Iran’s system is so hopelessly rigged that nothing within the parameters of the law can change anything. Still, indirect action typically sets the stage for direct action; Martin Luther King’s civil rights movement stood on the shoulders of decades of NAACP and ACLU litigation.

The cliché way to end this would be to look at the situation in Iran and to a lesser extent Lebanon and Palestine, and posit that the country is now at a crossroads. I don’t think it is; the Iranian people have had the infrastructure and social institutions to overthrow theocracy for a number of years now, and came closest to doing so in 2002, before the US invasion of Iraq. It may be that the Iranian people have grown so tired of the regime that even “We hate America and Israel more than our opponents” isn’t enough to keep Khamenei and Ahmadinejad afloat. Or it may be that Israel will decide to save the regime by launching military strikes against its nuclear weapons program. And it may be that after either of these scenarios, there will be a political reversal the next year modeled on a color/flower revolution or on a reaction against such a revolution. Hopes can be dashed, and dashed hopes can be rescued, as 2006 taught us.

A Mole at the World Bank? Truth-Telling between the Lines

The Financial Times headline read “Rapidly Swelling Middle Class is Key to World Bank’s Global Optimism.” Nonsense, I thought. The World Bank is up to its old tricks again, making a silk purse out of a sow’s ear.

It seemed just another case of how to lie with statistics. Stress the doubling of the middle class in the world’s population by 2030, and bury the fact that 84% of the world’s population won’t be middle class. In fact, the overwhelming majority of humanity will be far from it. In 2030, half a billion persons will still be living on a dollar or less a day, and 1.8 billion will still be living on less than 2 dollars a day, according to the Bank. Against the Bank’s estimate that 1.2 billion persons in poor countries will be middle class, the claim of a “rapidly swelling middle class,” admittedly the Financial Times’ characterization of the Bank’s Global Economic Prospects 2007, issued on December 13, rings a bit hollow. The poor we shall still have with us, and in abundance, it appears, if we wait for the world’s new middle class to bring about a more just society.

The news gets worse. Economic growth through 2030 will only slightly close the income gap between rich and poor countries. For every positive economic stride taken by poor countries, for every climb on the income ladder made by this new middle class in poor countries, rich countries, and most importantly their rich citizens, take two steps forward. By 2030, the rich will have increased their proportion of the world’s income from 58% in 2000 to 69%. Thus far, massive industrialization in poor countries has not really shifted the economic balance of power. “Five decades of development have done little to bring the average incomes of developing countries closer to those of OECD (rich) countries,” says the Bank, in practically an outburst of unusual clarity.

Moreover, inequality inside poor countries is likely to worsen. This has been happening in rich countries since the seventies. In poor countries, the new middle class will be pulling away from everyone else, working class and poor alike.

Still another cloud noted by the Bank itself casts a shadow over its sunny optimism. Between now and 2030, the world economic growth rate will stagnate at 3%, a point lower than in the period between 2004 and 2006, and significantly lower than the 4.5% rate that created the rich-country middle class between 1960 and 1980. Developing countries will grow at a 4% rate, a point above the world rate, but that increment must outrace population growth while also powering a catch-up with rich countries far faster than anything seen heretofore. The remarkable rise of China, as well as its remarkable size, disguises the probable fates of other poor countries that will not grow at China’s astronomical rates. Their improvements will be in increments too small to pull up incomes generally.
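To see why a one-point growth advantage closes the gap only slightly, here is a back-of-the-envelope compounding sketch. Only the 3% world rate and the 4% developing-country rate come from the figures cited above; the starting incomes, and the assumption that rich countries grow at roughly the world rate, are hypothetical.

```python
# Hypothetical compounding exercise: a rich country at $30,000 per capita growing
# at the 3% world rate vs. a poor country at $1,500 growing at the projected 4%
# developing-country rate, from 2006 to 2030. Starting incomes are assumed.

rich, poor = 30_000.0, 1_500.0
for _ in range(2006, 2030):   # 24 years of compounding
    rich *= 1.03
    poor *= 1.04

print(f"2030: rich ≈ ${rich:,.0f}, poor ≈ ${poor:,.0f}")
print(f"The income ratio narrows from 20:1 to about {rich / poor:.0f}:1, "
      f"while the absolute gap widens from $28,500 to about ${rich - poor:,.0f}.")
```

Population growth, which the column notes the increment must also outrace, would shave the poor country’s per-capita figure further.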

Just a case of the glass half-empty, my interpretation, versus the half-full glass interpretation presented by the World Bank? Perhaps; there is no gainsaying either reading. In the long run, as Keynes said, we will all be dead and thus won’t know anyway.

But something else has crept into the World Bank’s analysis. As an inveterate reader of their voluminous annual reports and annual outlooks for the past decade and more, I think I have read in this latest Global Economic Prospects 2007 something new. It seems that the Bank is ready to argue, albeit buried on page 83 of the new report, that rising income inequality gets in the way of eliminating poverty. In code, they put it this way: “Rising inequality is worrisome because there is an inverse relationship between inequality and poverty reduction.” In other words, as economic inequality increases in a society, fewer people escape poverty. The Bank’s model of the relationship between inequality and poverty reduction shows that in 40 poor countries with higher income inequality, poverty reduction stagnates or declines, while in 40 other poor countries, poverty does decline even as inequality rises. Where poverty declines, however, the increments are small. The Bank concludes:
“In addition to a contemporaneous reduction in poverty that may be expected from lowering inequality, policies that promote a more equal distribution of income are likely to enable the economy to realize greater poverty reduction from future growth.” (85)

What happened to the trickle-down theory? Isn’t economic growth enough? Is something more than a safety net now deemed necessary? The new middle class won’t bring greater well-being to poor countries? As Claude Rains said to Bogie: “I am shocked! Shocked!”

More shocking still: Can the World Bank be suggesting income redistribution? Are they admitting that equality is better for economies, and for people as well, than simply economic growth and an outpouring of free enterprise?

Imagine: There is a mole in the World Bank – and on Wolfowitz’s, how do you say, “watch.” Someone has smuggled a bit of truth into the Bank. Equality is better: it does a better job of eliminating poverty and improves the prospects for greater economic growth. According, this time, to the World Bank.

Faced with these facts, you might now think that the relationship between poverty and economic inequality is almost intuitive. But then again, if you are an American reader, a European reader, or that new middle-class person in a poor country, you probably don’t wonder how your fate (no less than mine) is tied inextricably to the fate of all the poor people who surround you in your everyday life. Isn’t it unlikely that all of them are down on their luck, and you perchance are not?

In my next column two weeks from now, I will talk about the redistribution that the World Bank hints could occur, to the advantage of poor people around the world. I will also talk about why people in rich countries find it difficult to accept the virtues of doing something to bring about worldwide economic equality.

Selected Minor Works: Where Movies Came From

Justin E. H. Smith

In The World Viewed, Stanley Cavell wryly comments that it was not until he reached adulthood that he learned “where movies come from.” As it happens, movies come from the same place I do: California. Now as an answer to the question of origins, this is hardly satisfying. “California,” as a one-word answer to anything, has the air of a joke about it, whereas we at least aim for earnestness. This is a problem that has vexed many who have left California and attempted to make sense of it at a distance. The turn-of-the-century Harvard philosopher Josiah Royce once declared of his home state that “there is no philosophy in California.” Yet the state’s generative power, and my attachment to it, have left me with the sense that something of philosophical interest is waiting to be said, by me if I’m lucky, if not in it, then at least about it and its exports.

My sense is that these two questions, the autobiographical and the film-historical, may be treated together. This is not because I was born into a Hollywood dynasty –far from it– but because throughout most of my life, memories were something shared, something public, something manufactured. By this I mean that, instead of memories, we had movies, and instead of conversation, we mimicked dialogue. I use the past tense here, as in the title (though there in acknowledgment also of a debt to Joan Didion), because it is already clear that movies will not be the dominant art form of the twenty-first century, and if we agree with Cavell that a movie is a sequence of automated world projections, then movies are no longer being made.


A contingent development in the history of technology left us with an art form thought by many to reveal something very significant about what we as humans are. Cavell chose to express this significance in the Heideggerian terms of film’s ‘world-disclosing power’ (did Heidegger ever even see a movie?). Already before 1920, Royce’s Harvard colleague Hugo Münsterberg had argued that the ‘photoplay’ serves as a powerful proof of Fichtean idealism: what need is there for Kant’s thing-in-itself if a ‘world’ can exist just as well projected on a screen as embodied in three dimensions?

I take it for granted that the world disclosed to us today is the same world to which human beings have had access for roughly the past hundred thousand years, that is, since we became anatomically, and thus we may presume cognitively, modern. For this reason, what interests me most about movies is the question: what is it that our experience of them replaced? We have only had them for a hundred and some odd years, not long enough for our brains to have evolved from some pre-cinematic condition into something that may be said to have an a priori grasp of what a movie is, in the same way that we now know that human brains come into the world with the concept of, for example, ‘animate being’. We are not naturally movie-viewing creatures, though it certainly feels natural, as though it were just what we’ve always done. What then is it that we’ve always done, of which movie-viewing is just the latest transformation? What is that more fundamental category of activity of which movie-viewing is a variety?

One well-known answer is that watching movies is an activity much like dreaming. This is evidenced by the numerous euphemisms we use for the motion picture industry. In his recent book, The Power of Movies: How Mind and Screen Interact, the analytic philosopher Colin McGinn explicitly maintains that the mind processes cinematic stories in a way that is similar to its processing of dreams. He even suggests that movies are ‘better’ than dreams to the extent that they are ‘dreams rendered into art’.

But what then are dreams? To begin with, dreams are a reminder that every story we come up with to account for who we are and how we got to be that way is utterly and laughably false. Everything I tell myself, every comforting phrase so useful in waking life, breaks down and becomes a lie. For eight hours a day, it is true that I have killed someone and feel infinite remorse, that my teeth have fallen out, that I am able to fly but ashamed to let anyone know, that the airplanes I am in make slow-motion, 360-degree loops, that my hair is neck-length and won’t grow any longer. None of these things is true. Yet, some mornings, for a few seconds after awakening, I grasp that they are truer than true. And then they fade, and the ordinary sense of true and false settles back in.

The images that accompany these feelings –the feeling of shame at levitating, the feeling of being in a doomed airplane– are relatively unimportant. They are afterimages, congealed out of the feelings that make the dreams what they are. As Aristotle already understood, and explained in his short treatise On Dreams, “in every case an appearance presents itself, but what appears does not in every case seem real… [D]ifferent men are subject to illusions, each according to the different emotion present in him.” Perhaps because of this feature of dreams –that they are not about the things that are seen, but rather the things that are seen are accompaniments for feelings– dreams have always been interpreted symbolically. This has been the case whether the interpreter believes that dreams foretell the future, or in contrast that they help to make sense of how the past shaped the present. Psychoanalysis has brought us around, moreover, to the idea that retrodiction is no more simple a task than oneiromancy, and that indeed the two are not so different: once you unravel the deep truth of the distant past, still echoed in dreams even if our social identities have succeeded in masking it, then by that very insight, and by it alone, you become master of your own future.

It seems to me that we don’t have an adequate way of talking about dreams. The topic is highly tabooed, and anyone who recounts his dreams to others, save for those who are most intimate, is seen as flighty and mystical. Of course, the consequence of this taboo is not that dreams are not discussed, but only that they are discussed imprecisely. For the most part, we are able to explain what happened, but not what the point-of-view of the dreamer was. This is overlooked, I suspect, because it is taken for granted that the point-of-view of the dreamer is that of a movie viewer. What people generally offer when prompted to recount a dream is a sort of plot summary: this happened, then this, then this. Naturally, the plot never makes any sense at all, and so the summary leaves one with the impression that what we are dealing with is a particularly strange film.

Certainly, there is a connection between some films –especially the ‘weird’ ones– and dreams, but only because the filmmakers have consciously, and in my view always unsuccessfully, set about capturing the feeling of a dream. From Un chien andalou to Eraserhead, weird things happen indeed, but the spectator remains a spectator, outside of the world projected onto the screen, looking into it. We are made to believe that our dreams are ‘like’ movies, but lacking plots, and then whenever an ‘experimental’ filmmaker attempts to go without plot, as if on cue audiences and critics announce that the film is like a dream. Middle-brow, post-literate fare such as Darren Aronofsky’s tedious self-indulgences have further reduced the dreamlike effect supposedly conveyed by non-linear cinema to an echo of that adolescent ‘whoah’ some of us remember feeling at the Pink Floyd laser-light show down at the planetarium.

Dreams are not weird movies, even if we recognize the conventions of dreamlikeness in weird movies. Weird movies, for one thing, are watched. The dreamer, in contrast, could not be more in the world dreamt. It is the dreamer’s world. It is not a show.

However problematic the term, cinematic ‘realism’ shows us, moreover, that movies can exhibit different degrees of dreamlikeness, and thus surely that there is something wrong with the generalized movie-dream analogy. In dream sequences, we see bright colors and mist, and, as was explicitly noted by a dwarf in Living in Oblivion, we often see dwarves. When the dream sequence is over, the freaks disappear, the lighting returns to normal, and in some early color films, most notably The Wizard of Oz, we return to black-and-white, the cinematic signifier of ‘reality’. My dreams are neither like the dream sequences in movies, nor are they like the movies that contain the dream sequences. Neither Kansas nor Oz, nor limited to dwarves in the repertoire of curious sights they offer up.

A much more promising approach is to hold, with Cavell, that movies are mythological, that their characters are types rather than individuals, and that the way we experience them is probably much more like the way folk experience their tales. Movies are more like bedtime stories than dreams: like what we cognize right before going to sleep than the mash that is made of our waking cognitions after we fall asleep.

If anything on the screen resembles dreams, it is cartoons (and thus Cavell is right to insist that these are in need of a very different sort of analysis than automated world projections). Cartoons are for the most part animistic. It is difficult to imagine a dream sequence in a Warner Brothers cartoon, since there were to begin with no regular laws of nature that might be reversed, there was no reality that might be suspended. For most of the early history of cartoons, there were no humans, but only ‘animate’ beings, such as cats and mice, as well as trees, the sun, and clouds, often given a perfunctory face just to clue us into their ontological status.

The increasing cartoonishness of movies –both the increasing reliance on computer graphics, as well as the decreasing interest in anything resembling human beings depicted in anything resembling human situations (see, e.g., Pierce Brosnan-era James Bond for a particularly extreme example of the collapse of the film/cartoon boundary)– may be cause for concern. Mythology, and its engagement with recognizably human concerns about life and death, is, it would seem, quickly being replaced by sequences of pleasing colors and amusing sounds.


I do not mean to come across as a fogey. Unlike Adorno with his jazz problem (which is inseparable from his California problem: the state that made him regret that the Enlightenment ever took place), I am a big fan of some of the animistic infantilism I have seen on digital screens recently. Shrek and the Teletubbies are fine entertainments. I am simply noting, already for a second time, that the era of movies is waning, and that nothing has stepped in, for the moment, to do what they once did.

A video-game designer recently told me that ‘gaming’ is just waiting for its own Cahiers du Cinéma, and that when these come along, and games are treated with adequate theoretical sophistication not by fans but by thinkers, then games will be in a position to move into the void left by film. I have no principled reasons to be saddened by this, but they will have to do a good deal more than I’ve seen them doing so far. Now I have not played a video game since the days when Atari jackets were sincerely, and not ironically, sought after. But I did see some Nintendo Wii consoles on display in a mall in California when I was home for the holidays this past week. The best argument for what the crowding mall urchins were doing with those machines is the same one, and the only one, that we have been able to come up with since Pong, and the one I certainly deployed when pleading with my own parents for another few minutes in front of the screen: it seems to do something for developing motor skills. This makes video games the descendants of sporting and hunting, while what movies moved in to replace were the narrative folk arts, such as the preliterate recitations that would later be recorded as Homer’s Odyssey. These are two very different pedigrees indeed, and it seems unlikely to me that the one might ever be the successor to the other.

Dreams are the processing of emotional experiences had in life, experiences of such things as hunting, or fighting, or love. Narrative arts, such as movies, are the communal processing, during waking life, of these same experiences. Movies are not like dreams, and video games are not like movies. And as for what experiences are, and why all the authentic ones seem to have already been had by the time we arrive at an age that enables us to reflect on them (seem all to have happened in California), I will leave that question to a better philosopher, and a less nostalgic one.

**

For an extensive archive of Justin Smith’s writing, please visit his archive at www.jehsmith.com.

Sunday, December 31, 2006

The Lives They Lived

From The New York Times:

This issue doesn’t try to be a definitive document of the lives and deaths of the most important or influential. Instead, it’s largely an idiosyncratic selection, chosen by our editors and writers, who are often following their own passions and curiosities. There are some big names: the playwright Wendy Wasserstein, the photographer Gordon Parks, Betty Friedan. But there are also many minor characters — Victoria Jackson Gray Adams, who was involved in desegregating the Mississippi delegation of the Democratic Party; Rupert Pole, the other husband of Anaïs Nin; Nena O’Neill, co-author of a 1970s best seller about “open marriage.” By embracing its own form of obituary, this issue tries to capture ideas and moments across the century and also to convey the richness of individual lives.

More here.

Top 10 stories of 2006

From Nature:

The stories that got the most comments from you, our readers.

The fish that crawled out of the water
Does gender matter?
Islam and science
Found: one Earth-like planet
Delusions of faith as a science
Top 5 science blogs
‘Tenth Planet’ found to be a whopper
Study challenges prayers for the sick
Tragic drug trial spotlights potent molecule
The space elevator: going down?

More here.

Saturday, December 30, 2006

inlandia


But Didion and Davis are only tourists in the “empire” of inland California, with a tourist’s ability to be both accurate and oblivious when they write about what it’s like to actually live in San Bernardino, Riverside or the badlands beyond. The road through “Inlandia” (a somewhat awkward designation for the Southern California interior) stops at other accounts of home. M.F.K. Fisher remembers Hemet in the 1940s: “There are many pockets of comfort and healing on this planet … but only once have I been able to stay as long and learn and be told as much as there on the southeast edge of the Hemet Valley.” J. Smeaton Chase wakes to a July dawn in the Mojave, circa 1920: “To lie at dawn and watch the growing glory in the east, the pure … light stealing up from below the horizon, the brightening to holy silver, the first flash of amber, then of rose, then a hot stain of crimson, and then the flash and glitter, the intolerable splendor….” Percival Everett in 2003 defines the “badlands” of the 909 area code: “Technically, the Badlands is chaparral. The hills are filled with sage, wild mustard, fiddleheads and live oaks. Bobcats, meadowlarks, geckos, horned lizards, red tailed hawks, kestrels, coach whip snakes, king snakes, gopher snakes. Rattlesnakes and coyotes. We don’t see rain for seven months of the year and when we do we often flood. In the spring, the hills are green. They are layered and gorgeous. This is in contrast to the rest of the year when the hills are brown and ochre and layered and gorgeous.”

more from the LA Times here.

hilton kramer: against academic twaddle, commercial hype or political mystification


Kramer’s most provocative judgment is to insist upon Modernism as an essential component of bourgeois culture. He admires Modernist art and has less patience for the artworks made “after” Modernism, which he tends to interpret in terms of decline or degeneration. Contemplating Matisse’s achievement, Kramer laments, “It is hard to believe that we shall ever again witness anything like it, now or in the foreseeable future.” Today, instead, we endure “the nihilist imperatives of the postmodernist scam.”

Not that Kramer hates everything that came after Matisse. Many of the items in the book, though slight and descriptive, perform a modest, useful function for newcomers to subjects including Jackson Pollock (“a triumph of ambition and short-lived inspiration over a severely handicapped and unruly personality”), Helen Frankenthaler (“a major artist”), Odd Nerdrum (“a first-rate dramatic imagination”) and Alex Katz (“one thinks of Monet at Giverny”). He also discusses Richard Diebenkorn and Christopher Wilmarth, not to mention past masters like Courbet, Bonnard, Braque and Beckmann. In all, an eclectic group, and Kramer writes interestingly and engagingly about each one.

more from the NY Times Book Review here.

Hitchens on Ford

In Slate:

One expects a certain amount of piety and hypocrisy when retired statesmen give up the ghost, but this doesn’t excuse the astonishing number of omissions and misstatements that have characterized the sickly national farewell to Gerald Ford. One could graze for hours on the great slopes of the massive obituaries and never guess that during his mercifully brief occupation of the White House, this president had:

1. Disgraced the United States in Iraq and inaugurated a long period of calamitous misjudgment of that country.

2. Colluded with the Indonesian dictatorship in a gross violation of international law that led to a near-genocide in East Timor.

3. Delivered a resounding snub to Aleksandr Solzhenitsyn at the time when the Soviet dissident movement was in the greatest need of solidarity.

Instead, there was endless talk about “healing,” and of the “courage” that it had taken for Ford to excuse his former boss from the consequences of his law-breaking. You may choose, if you wish, to parrot the line that Watergate was a “long national nightmare,” but some of us found it rather exhilarating to see a criminal president successfully investigated and exposed and discredited. And we do not think it in the least bit nightmarish that the Constitution says that such a man is not above the law. Ford’s ignominious pardon of this felonious thug meant, first, that only the lesser fry had to go to jail. It meant, second, that we still do not even know why the burglars were originally sent into the offices of the Democratic National Committee. In this respect, the famous pardon is not unlike the Warren Commission (of which Ford was also a member): another establishment exercise in damage control and pseudo-reassurance that actually raised more questions than it answered. The fact is that serious trials and fearless investigations often are the cause of great division, and rightly so. But by the standards of “healing” celebrated this week, one could argue that O.J. Simpson should have been spared indictment lest the vexing questions of race be unleashed to trouble us again, or that the Tower Commission did us all a favor by trying to bury the implications of the Iran-Contra scandal. Fine, if you don’t mind living in a banana republic.

Why military honor matters

Elaine Scarry in the Boston Review:

In 1998, an article by Colonel Charles J. Dunlap Jr. appeared in the United States Air Force Academy’s Journal of Legal Studies warning that a new form of warfare lay ahead. Because our military resources are so far beyond those of any other country, Dunlap argued, no society can today meet us through symmetrical warfare. Therefore, our 21st-century opponents will stop confronting us with weapons and rules that are the mirror counterparts of our own. They will instead use asymmetrical or “neo-absolutist” forms of warfare, resorting to unconventional weapons and to procedures forbidden by international laws.

What Dunlap meant by “unconventional weapons” is clear: the category would include not only outlawed biological, chemical, and nuclear weapons (the last of which, in the view of the United States, only itself and a small number of other countries are legally permitted to have) but also unexpected weapons such as civilian passenger planes loaded with fuel and flown into towering buildings in densely populated cities.

But the term “neo-absolutism,” as used by Dunlap, applies not just to the use of unconventional weapons but to conduct that violates a sacrosanct set of rules—acts that are categorically prohibited by international law and by the regulations of the United States Air Force, Navy, and Army (along with the military forces of many other nations). For example, though warfare permits many forms of ruse and deception, it never permits the false use of a white flag of truce or a red cross.

More here.

BEWARE THE ONLINE COLLECTIVE

Jaron Lanier at Edge.org:

It’s funny being an “old timer” in the world of the Internet. About six years ago, when I was 40 years old, a Stanford freshman said to me, “Wow Jaron Lanier—you’re still alive?” If there’s any use to sticking around for the long haul — as computers get so much more powerful that every couple of years our assumptions about what they can do have to be replaced — it might be in noticing patterns or principles that may not be so apparent to the latest hundred million kids who have just arrived online.

There’s one observation of mine, about a potential danger, that has caused quite a ruckus in the last half-year. I wrote about it initially in an essay called “Digital Maoism.”

Here’s the idea in a nutshell: Let’s start with an observation about the whole of human history, predating computers. People have often been willing to give up personal identity and join into a collective. Historically, that propensity has usually been very bad news. Collectives tend to be mean, to designate official enemies, to be violent, and to discourage creative, rigorous thought. Fascists, communists, religious cults, criminal “families” — there has been no end to the varieties of human collectives, but it seems to me that these examples have quite a lot in common. I wonder if some aspect of human nature evolved in the context of competing packs. We might be genetically wired to be vulnerable to the lure of the mob.

More here.

Turbulent year for books

Josh Getlin in the Los Angeles Times:

It started off with bestselling author James Frey admitting his memoir, “A Million Little Pieces,” was in fact a work of fiction, and ended with celebrity publisher Judith Regan getting fired for allegedly making anti-Semitic comments after her proposed O.J. Simpson confessional book-TV deal got shot down.

In between came charges that 19-year-old Harvard novelist Kaavya Viswanathan had lifted passages from a rival chick-lit author, and hotly disputed allegations that Ian McEwan, one of the most respected names in modern literary fiction, may have been guilty of plagiarism.

More here.

Iminngernaveersaartunngortussaavunga

John McWhorter in the New York Sun:

In the rush of the holiday season you may have missed that a white buffalo was born at a small zoo in Pennsylvania. Only one in 10 million buffalo is born white, and local Native Americans gave him a name in the Lenape language: kenahkihinen, which means “watch over us.”

They found that in a book, however. No one has actually spoken Lenape for a very long time. It was once the language of what is now known as the tristate area, but its speakers gradually switched to English, as happened to the vast majority of the hundreds of languages Native Americans once spoke in North America.

The death of languages is typically described in a rueful tone. There are a number of books treating the death of languages as a crisis on a par with endangered species and global warming. However, I’m not sure it’s the crisis we are taught that it is.

There is a part of me, as a linguist, that does see something sad in the death of so many languages. It is happening faster than ever: It has been said that a hundred years from now 90% of the current 6,000 languages will be gone.

Each extinction means that a fascinating way of putting words together is no longer alive. In, for example, Inuktitut Eskimo, which, by the way, is not dying, “I should try not to become an alcoholic” is one word: Iminngernaveersaartunngortussaavunga.

More here.

Some Kurdish Reactions to Saddam’s Execution

From some initial editorials, many Kurds seem none too pleased with Saddam’s execution. Amin Matin in Kurdish Media:

Iraq’s highest court upheld Saddam Hussein’s death sentence for the killing of nearly 150 Shiite Arabs, paving the way for the former dictator to be hanged within 30 days. The execution order still needs to be approved by the office of the Iraqi president, Mr. Jalal Talabani.

Saddam is also on trial for crimes against humanity and genocide that he and his regime committed in Southern Kurdistan. These atrocities resulted in the killing of over 200,000 civilian Kurds and were part of a final solution code-named Anfal that also included the use of weapons of mass destruction such as chemical bombs. Executing Saddam prior to concluding the current trial will deny justice to Kurdish victims and strip Kurds of the possibility of serving justice to Anfal survivors. Proving the case for Kurdish genocide has enormous value for Iraqi Kurds and the Kurdish nation. There are still people in Iraq and the Arab world who deny the systematic genocide against the Kurds. Saddam’s trial for crimes against humanity and genocide committed against the Kurdish nation is a rare opportunity for Kurds to validate the depth and scope of the atrocities committed against the Kurdish nation in an Iraqi court of law.

Some responses by Kurdish Media readers can be found here.

Joy of Capture Muted at the End

From The New York Times:

CRAWFORD, Tex., Dec. 29 — The capture of Saddam Hussein three years ago was a jubilant moment for the White House, hailed by President Bush in a televised address from the Cabinet Room. The execution of Mr. Hussein, though, seemed hardly to inspire the same sentiment.

Before the hanging was carried out in Baghdad, Mr. Bush went to sleep here at his ranch and was not roused when the news came. In a statement written in advance, the president said the execution would not end the violence in Iraq.

After Mr. Hussein was arrested Dec. 13, 2003, he gradually faded from view, save for his courtroom outbursts and writings from prison. The growing chaos and violence in Iraq has steadily overshadowed the torturous rule of Mr. Hussein, who for more than two decades held a unique place in the politics and psyche of the United States, a symbol of the manifestation of evil in the Middle East.

Now, what could have been a triumphal bookend to the American invasion of Iraq has instead been dampened by the grim reality of conditions on the ground there.

More here.

Long Walk to Freedom

From The Washington Post:

“A leader is like a shepherd,” Nelson Mandela proclaimed more than a decade ago in his autobiography. “There are times when a leader must move out ahead of his flock, go off in a new direction, confident that he is leading his people in the right way.”

It’s an arrogant statement — could any other democratically elected politician get away with equating his constituents with sheep? — and yet supremely apt. For Mandela is arguably the greatest political leader of our time, the one person worthy of mention alongside FDR, Churchill and Gandhi. Mandela led the political and moral crusade for majority rule in South Africa against a white supremacist police state, risking his life, surrendering his personal freedom and his family’s well-being. He spent 27 years in prison only to emerge as a wise, dynamic and conciliatory figure binding black and white together as father of his nation and inspiration for the world.

The danger, of course, is that in extolling Mandela’s virtues, it’s all too easy to turn him into a saint — worshipped and untouchable and therefore of no practical value as a guide for our own behavior — and to lose track of the flawed, flesh-and-blood human being whom we can learn from and seek to emulate. As George Orwell once warned, “Saints should always be judged guilty until they are proved innocent.”

More here.