Edvard Munch: At the Royal Academy

Peter Campbell in the London Review of Books:

Edvard Munch’s art was made from his troubles. When, in middle age, he retreated to the estate he had bought on the outskirts of Oslo (then still called Kristiania), love affairs, drink, a nervous breakdown and illness had already supplied the subject-matter his peculiarly subjective art required. The ideas he developed early he went on using. Late in his career he wrote: ‘The second half of my life has been a battle just to keep myself upright. My path has led me along the edge of a precipice, a bottomless pit . . . From time to time I’ve tried to get away from the path, thrown myself into the throng of life among people. But every time I have had to go back to the path along the cliff top.’

More here.

Fugitive rat sets distance record

From BBC News:

The rodent had been radio tagged and its movements tracked by researchers to learn more about pest species and how they invade small islands. The rat was released on the uninhabited island of Motuhoropapa but refused to be captured at the project’s end. The NZ team tells Nature magazine the animal finally turned up on the nearby Otata Island – a mighty swim of 400m. James Russell, from the University of Auckland, and colleagues think this may be the longest distance recorded for a rat swimming across open sea. “Norway rats can supposedly swim up to 600m but, to our knowledge, this is the first record of a rat swimming hundreds of metres across open water,” they write. In total, the rat was free for 18 weeks. It was eventually killed in a trap baited with penguin meat.

More here.

‘Wake the people and make them think big’

From The Guardian:

Of the two greatest dramatists of the 19th century, Chekhov and Ibsen, it is the infinitely lovable Dr Chekhov who holds the highest place in our affections, both as man and as author. But Ibsen, the forbidding man of the north – accusatory eyes fiercely staring out at us from behind steel-rimmed spectacles, thin, severe lips tightly pursed amid the bizarre facial topiary – may be the one who speaks most urgently to us today. At the time of his death, almost 100 years ago, Henrik Ibsen’s significance as a leader of thought was overwhelming. In 1900, the young James Joyce, still a student, wrote of him: “It must be questioned whether any man has held so firm an empire over the thinking world in modern times … his genius as an artist faces all, shirks nothing … the long roll of drama, ancient or modern, has few better things to show.” Joyce (and later Wittgenstein) learned Norwegian specifically in order to read Ibsen’s plays in the original.

More here.

The government’s dubious bioterror case has sent a dangerous message

William B. Greenough III in the Bulletin of the Atomic Scientists:

The U.S. government’s post-9/11 effort to make citizens more “secure” has had some frighteningly destructive consequences. One of the most egregious examples is the government’s prosecution–using new departmental powers of Homeland Security and the PATRIOT Act–of Thomas C. Butler, a distinguished scientist and doctor.

By the time you read this, Butler will have been in prison for well over a year. It’s a curious place for the U.S. government to put the man who is credited by the World Health Organization with saving the lives of more than two million children every year through a cholera treatment he helped to develop. Moreover, at the time of his arrest in 2003, Butler was researching ways to protect Americans from plague, a weaponizable pathogen.

Human plague, endemic in the United States in Texas and other parts of the Southwest, is caused by the bacterium Yersinia pestis and carried by rodents. From time to time, plague spreads to Americans who hunt or handle the infected rodents. But across the globe in Tanzania, plague is far more common and kills people regularly.

Butler, then a professor of medicine, chief of the Infectious Diseases Division of Texas Tech University, and one of the most knowledgeable and experienced clinical scientists in plague infections, was worried that if the disease were to arise in the United States–either naturally or through terrorism–the country would be ill-equipped to treat the victims. Both streptomycin and chloromycetin, the two antibiotics currently recommended to treat plague and prevent death, are old and not readily available at U.S. health centers. Butler felt that it was urgent to test the efficacy of two other readily available antibiotics–gentamicin and doxycycline. Researchers at the Centers for Disease Control and Prevention, Fort Detrick, Fort Collins, and at the Food and Drug Administration agreed and were happy to collaborate in his research.

More here.

Why do we believe in God?

“Faith in a higher being is as old as humanity itself. But what sparked the Divine Idea? Did our earliest ancestors gain some evolutionary advantage through their shared religious feelings? In these extracts from his latest book, Robert Winston ponders the biggest question of them all.”

From The Guardian:

…it is easy to suggest a mechanism by which religious beliefs could help us to pass on our genes. Greater cohesion and stricter moral codes would tend to produce more cooperation, and more cooperation means that hunting and gathering are likely to bring in more food. In turn, full bellies mean greater strength and alertness, greater immunity against infection, and offspring who develop and become independent more swiftly. Members of the group would also be more likely to take care of each other, especially those who are sick or injured. Therefore – in the long run – a shared religion appears to be evolutionarily advantageous, and natural selection might favour those groups with stronger religious beliefs.

But this is not the whole story. Although religion might be useful in developing a solid moral framework – and enforcing it – we can quite easily develop moral intuitions without relying on religion. Psychologist Eliot Turiel observed that even three- and four-year-olds could distinguish between moral rules (for example, not hitting someone) and conventional rules (such as not talking when the teacher is talking). Furthermore, they could understand that a moral breach, such as hitting someone, was wrong whether you had been told not to do it or not, whereas a conventional breach, such as talking in class, was wrong only if it had been expressly forbidden. They were also clearly able to distinguish between prudential rules (such as not leaving your notebook next to the fireplace) and moral rules.

This would suggest that there is a sort of “morality module” in the brain that is activated at an early age. Evidence from neuroscience would back this up, to a degree.

More here.

birnbaum, lethem


Robert Birnbaum interviews Jonathan Lethem (recent MacArthur genius).

Lethem: Now we’re arriving at the bug [that was] in my ear when I said we should talk again. It’s all coming back. Certainly, yes, there’s a kind of relentless bad faith expressed when reviewers or critics remark on one element in a novel as though it’s a remarkable piece of metaphor or surrealism, as though they’ve never encountered such a thing before. They’re shocked, just shocked that something is being proposed—they act as though it is utterly unfamiliar to them, when what they really mean is that they object to it on principle, on class or political grounds like those I just described. So, by reacting as though the incursion were new, instead of familiar, it permits a kind of disingenuous head-scratching: “Hmm, perhaps this new method is of interest, or could be, in the hands of the most serious of writers. We’ll have to watch closely and see.” You saw this happening when Roth’s new book was reviewed. Roth’s use of the “alternate history” was treated, in certain quarters, as though, first of all, Roth himself had never written a book that challenged mimetic propriety—suddenly The Breast didn’t exist, suddenly The Great American Novel didn’t exist. Suddenly Counterlife didn’t exist. To write about this thing with a 10-foot pole, and say, “What’s this strange method? What have we got here? One of the great pillars of strictly realist fiction has inserted something very odd into his book. We’ll puzzle over this as though it’s unprecedented.” It was as though there had been no Thomas Pynchon. As though Donald Barthelme, Kurt Vonnegut, Angela Carter, Robert Coover had been thrown into the memory hole. Was there never a book called The Public Burning? Do we really have to retrace our steps so utterly in order to reinscribe our class anxieties? Not to mention, of course, the absolute ignorance of international writing implicit in the stance: where’s Cortazar, Abe, Murakami, Calvino, and so very many others?
Well, the status quo might argue, patronizingly, those cute magical-realist methods—how I despise that term—are fine for translated books, but we here writing in English hew to another standard of ‘seriousness.’ Not to mention, of course, the quarantine that’s been implicitly and silently installed around genre writing that uses the same method as Roth’s with utmost familiarity. Well, the status quo might argue, sounding now like an uncle in a P.G. Wodehouse novel: Ah, yes, well, we all know that stuff is, how do you say it, old boy? Rather grubby. No, I say, no. This isn’t good enough, not for the New York Times Book Review and the New York Review of Books, in 2004. Let me say it simply: there is nothing that was proposed in Roth’s book that could be genuinely unfamiliar to a serious reader of literary fiction of the last 25 years, 30 years, 50 years. To treat it as unfamiliar is a bogus naiveté—one that disguises an attack on modernism itself, in the guise of suspiciousness about what are being called post-modern techniques. It actually reflects a discomfort with the entire century.

more at The Morning News here.

early american democracy

Jill Lepore on the origins of American democracy in the New Yorker.

Readers may weary at the length of Wilentz’s book, but, as a model for integrating social and political history, it’s hard to dispute. That it will be disputed is, however, certain, if only because Wilentz has been such a vigorous critic of his colleagues. He has had little use for historians who defend Federalists like Noah Webster. To those who celebrate Federalists for their opposition to slavery, Wilentz counters, “Rarely has any group of Americans done so little to deserve such praise.” In his New Republic reviews, Wilentz has been particularly indignant about historians who place Federalists in a better light than Republicans or who dismiss Jefferson’s entire career because he owned slaves (including some who were almost certainly his own children). David McCullough’s “John Adams” was, in his view, “popular history as passive nostalgic spectacle.” Garry Wills’s book about Jefferson’s election, “Negro President,” he deemed “misadventurous.” In another essay, Wilentz concluded, “Were he alive today, Jefferson would probably regard modern American historians as a rascally bunch.”

But one thing that Federalists understood—for all their failings, for all their unmitigated snobbery—was the fragility of democracy. I’d be willing to consider you an angel, Webster told Jefferson, if you could show me a democracy that isn’t corrupt, or if you could protect the United States from “the instruments with which vicious and unqualified men destroy the freedom of elections, and exalt themselves into power, trampling first on the great and good, and afterwards on the very people to whom they owe their elevation.” Webster may have been a prig, but he wasn’t a duffer.



Sandwiched between ‘70s agitators like Ant Farm and more recent groups like the Yes Men, Negativland has been going strong for twenty-five years, an anniversary commemorated by this retrospective of their work. Among their best-known culture jamming exploits is their album U2, which includes liberal sampling from U2’s album Joshua Tree (prompting a landmark 1991 intellectual property case in which U2’s record label, Island Records, sued Negativland and SST).

more from Artforum here.

‘Dr. Atomic’: Unthinkable Yet Immortal

From The New York Times:

There is physics. And then there is physics with music.

And so, when a new opera about the atomic bomb, of all things, opened to acclaim this month in San Francisco, I traveled across the bay from a conference in Berkeley. It has been 60 years since the atomic bomb emerged from its cradle at the Trinity test site in Alamogordo, N.M., to punctuate the end of World War II with a blinding flash and the cries of the burned in Hiroshima and Nagasaki. I am old enough to have emerged from that same cradle, fussing and complaining, old enough to have seen the Beatles and to have watched and felt the desert shake one cold loud morning in 1968 as one of those beasts went off and a radioactive dust ball drifted off toward Canada.

As an ex-physics major, sci-fi addict, science writer and lover of apocalypse, I long ago concluded that there was not much new to say about the atomic bomb. But I was wrong.

More here.

Imagine a World Without Copyright

Joost Smiers and Marieke van Schijndel in the International Herald Tribune:

Copyright was once a means to guarantee artists a decent income. Aside from the question as to whether it ever actually functioned as such – most artists never made a penny from the copyright system – we have to admit that copyright serves an altogether different purpose in the contemporary world. It now is the tool that conglomerates in the music, publishing, imaging and movie industries use to control their markets.

These industries decide whether the materials they have laid their hands on may be used by others – and, if they allow it, under what conditions and for what price. European and American legislation extends them that privilege for a window of no less than 70 years after the passing of the original author. The consequences? The privatization of an ever-increasing share of our cultural expressions, because this is precisely what copyright does. Our democratic right to freedom of cultural and artistic exchange is slowly but surely being taken away from us.

More here.

Mega-cities: Crowded and Environmentally Stressed

Divya Abhat, Shauna Dineen, Tamsyn Jones, Jim Motavalli, Rebecca Sanborn, and Kate Slomkowski in Emagazine.com:

We take big cities for granted today, but they are a relatively recent phenomenon. Most of human history concerns rural people making a living from the land. But the world is rapidly urbanizing, and it’s not at all clear that our planet has the resources to cope with this relentless trend. And, unfortunately, most of the growth is occurring in urban centers ill-equipped for the pace of change. You’ve heard of the “birth dearth”? It’s bypassing Dhaka, Mumbai, Mexico City and Lagos, cities that are adding population as many of their western counterparts contract.

The world’s first cities grew up in what is now Iraq, on the plains of Mesopotamia near the banks of the Tigris and Euphrates Rivers. The first city in the world to have more than one million people was Rome at the height of its Empire in 5 A.D. At that time, world population was only 170 million. But Rome was something new in the world. It had developed its own sophisticated sanitation and traffic management systems, as well as aqueducts, multi-story low-income housing and even suburbs, but after it fell in 410 A.D. it would be 17 centuries before any metropolitan area had that many people.

The first large city in the modern era was Beijing, which surpassed one million population around 1800, followed soon after by New York and London. But at that time city life was the exception; only three percent of the world’s population lived in urban areas in 1800.

More here.

Isaac Julien’s Fantôme Créole

Malcolm Le Grice reviews Isaac Julien’s new video installation, Fantôme Créole, at the Centre Pompidou in Paris.

Isaac Julien’s video installation Fantôme Créole (Creole Phantom, 2005) was shown on four very large adjacent screens: two on either side of the gallery; Baltimore (2003) had three equally large screens, but placed in an arc so that they could be viewed as a whole. The visual quality was fine, the carpeted gallery provided excellent acoustics, and comfortable seating encouraged spectators to watch the full cycle of each work – sadly a rare experience with film and video installations.

One quality underlying Julien’s work is his exceptional command of the aesthetics of cinema. His images are often sumptuous, and the film construction shows a great control of visual rhythm. These qualities, evident in his single-screen films such as Frantz Fanon: Black Skin, White Mask (1996), take on another, spectacular dimension when two or more screens are edited to harmonize or counterpoint image content, colour and movement.

isaiah Berlin: 1989


Granta makes available Isaiah Berlin’s thoughts on the revolutions of ’89. Kind of interesting to read now.

You ask me for a response to the events in Europe. I have nothing new to say: my reactions are similar to those of virtually everyone I know, or know of—astonishment, exhilaration, happiness. When men and women imprisoned for a long time by oppressive and brutal regimes are able to break free, at any rate from some of their chains, and after many years know even the beginnings of genuine freedom, how can anyone with the smallest spark of human feeling not be profoundly moved? One can only add, as Madame Bonaparte said when congratulated on the historically unique distinction of being mother to an emperor, three kings and a queen, ‘Oui, pourvu que ça dure.’ If only we could be sure that there will not be a relapse, particularly in the Soviet Union, as some observers fear.

Relations as Art

From Art Papers:

Call Jillian Mcdonald a relations artist. Relationships are her medium, fleeting encounters her material. Mcdonald’s works often exist on the margins of art institutions and activate public space in disquieting ways. Yet, they are all about intimacy.

If her performance projects—in Mile Share, 2004, she invited strangers to run a mile with her; in Advice Lounge, 2003, she provided free, non-professional advice to passersby; for Houseplant, 2002-2003, she offered to deliver a houseplant to strangers’ residences; and in Shampoo, 2001, she posted a message in a local newspaper, inviting strangers to come for a free shampoo—seem light years away from her media works, the focus on relations provides an interpretive continuum.

For Me and Billy Bob, Mcdonald inserted carefully constructed footage of her image into scenes from Thornton’s movies, thus creating a compendium of flirtation—a wink, a light touch, a brief kiss, and a heartbreak.



There is a refreshing eclecticism about this year’s Turner prize exhibition at Tate Britain. The four short-listed artists represent a diverse range of practices: there is sculpture, video, conceptual work, and even some traditional painting. This is a comforting rather than a confrontational show. Stephen Deuchar, director of Tate Britain, hinted as much on Monday when he described how contemporary art had become “less scary” over the past few years, due in no small part to the Turner’s persistent demystification of recondite art forms. But how to find the balance between intimidating and just plain timid?

more from the Financial Times here.

A Passage to India: A Nobel Prize-winning economist explores his homeland’s rich and quarrelsome heritage.

From The Washington Post:

If you laid all the economists in the world end to end, the old joke goes, you would never reach a conclusion. So it’s all the more remarkable that it is as a practitioner of the “dismal science” that Amartya Sen won the Nobel Prize in 1998. Sen is a man of conclusions; he is also brilliant at marshalling, with both extensive research and empirical evidence, the arguments that justify his conclusions. The Argumentative Indian — a collection of 16 essays, many reworked and expanded from lectures and previously published articles — is an intellectual tour de force from an economist who can lay equal claim to the designations of sociologist, historian, political analyst and moral philosopher. It is a magisterial work, except that the adjective is not one of which Sen would approve.

That is because Sen uses it, along with “exoticist” and “curatorial,” to describe the three perspectives from which the West has tended to view India (each of which he dissects and discredits with precision and finesse). He is particularly critical of the Western overemphasis on India’s religiosity at the expense of any recognition of the country’s equally impressive rationalist, scientific, mathematical and secular heritage, fields treated by Orientalists as “Western spheres of success.”

More here.