Ian Hurd in the Boston Review:
The Russian foreign minister Sergei Lavrov said on Monday that “the use of force without the approval of the United Nations Security Council is a very grave violation of international law.” He used these words to argue against American and other outside intervention against the Syrian government after its chemical weapons massacre. He also stepped into one of the most hotly contested topics in international law: is it permissible for governments to use force against another country to prevent human rights atrocities?
There are three views on the question and all are on display in the current debates over Syria. As President Obama decides on an American response, he too will have to take a position on these issues.
Lavrov expressed a common view of international law: the U.N. Charter forbids countries from “the threat or use of force against the territorial integrity or political independence of any state.” This makes war illegal. It is a descendant of the Kellogg-Briand Pact of 1928, the first general treaty to outlaw war.
Having spent a decade as Ambassador to the United Nations, Lavrov knows well that the U.N. Charter makes no allowance for the intentions of the states involved. Except in self-defense, the Charter outlaws all war-making by states, whatever their motives. Whether a state intends to save a population from genocide, to punish a neighbor for an insult, or to gain territory by aggression, the Charter treats all the same. All are forbidden by Article 2(4).
Fifty years ago today, 250,000 people joined the March on Washington for Jobs and Freedom. Dissent’s Tom Kahn worked closely with veteran socialist and civil rights activist Bayard Rustin to organize the March. In “The Power of the March—and After,” Kahn traced the March’s history back to labor leader A. Philip Randolph’s 1940 plan to demonstrate in the capital for an end to wartime employment discrimination (called off once Franklin Roosevelt conceded certain demands). Echoing the speech delivered by John Lewis of the Student Nonviolent Coordinating Committee, Kahn identified the promise of the march with the prospect of continued militancy:
At the Lincoln Memorial, a quarter of a million people pledged to continue the struggle at home—in the streets as well as in the courts. This pledge may well turn out to be more important than the eloquent speeches or the specific demands. The streets were the incubators of the March on Washington, and it was pressures from the streets that fused jobs and freedom into a single slogan. Action in the streets in cities and towns across the country, in keeping with the pledge, will keep the March on Washington Movement alive and militant.
more from The Editors at Dissent here.
To commemorate the 50th anniversary of the March on Washington for Jobs and Freedom that called for equal rights for African-Americans, Getty has published a book of Magnum photographer Leonard Freed’s photographs titled This Is the Day: The March on Washington. Included in the book are 75 previously unpublished images Freed made that focus on moments both quiet and profound before, during, and after the march. There is also a selection of images from the 20th-anniversary march that took place in Washington, D.C., on Aug. 27, 1983. Freed, who died in 2006, was a Brooklyn, N.Y.–born photographer who discovered his craft in 1953. He joined Magnum in 1972 and became known for his images of the American civil rights movement. Paul Farber, visiting assistant professor of writing at Haverford College, met Freed’s widow, Brigitte Freed, in 2009 and served as a historical adviser and co-curator of This Is the Day. According to Farber, it was Brigitte Freed who came up with the idea for the book. “She shared with me a germ of an idea she had for a posthumous Leonard Freed book, inspired by words she heard Barack Obama say while on the campaign trail: ‘I am here because somebody marched.’ ”
more from David Rosenberg at Slate here.
(Note: Happy Birthday to my sister Atiya. Faraz Sahib dedicated one of his books to her.)
Public opinion towards science has made headlines over the past several years for a variety of reasons — mostly negative. High profile cases of academic dishonesty and disputes over funding have left many questioning the integrity and societal value of basic science, while accusations of politically motivated research fly from left and right. There is little doubt that science is value-laden. Allegiances to theories and ideologies can skew the kinds of hypotheses tested and the methods used to test them. These, however, are errors in the application of the method, not the method itself. In other words, it’s possible that public opinion towards science more generally might be relatively unaffected by the misdeeds and biases of individual scientists. In fact, given the undeniable benefits scientific progress has yielded, associations with the process of scientific inquiry may be quite positive. Researchers at the University of California, Santa Barbara set out to test this possibility. They hypothesized that there is a deep-seated perception of science as a moral pursuit — its emphasis on truth-seeking, impartiality and rationality privileges collective well-being above all else. Their new study, published in the journal PLOS ONE, argues that the association between science and morality is so ingrained that merely thinking about it can trigger more moral behavior. The researchers conducted four separate studies to test this.
…Across all these different measures, the researchers found consistent results. Simply being primed with science-related thoughts increased a) adherence to moral norms, b) real-life future altruistic intentions, and c) altruistic behavior towards an anonymous other. The conceptual association between science and morality appears strong.
The small red finch
so deftly slips
from the swaddling
and unseeing snow
that all envelops,
and buries all,
his buff red bib,
flicks his tail,
turns the lanterns
on his wings, left
preens the gold
fleck on his
with an inspired eye
t’wards the ash’s
as if an ember
sets his throat
from the frozen holly,
zips a zigzag trail of fire
the sky before
that we tell each other,
under the table lamp
has turned once more
to white and black
by Marlene van Niekerk
from Poetry International, 2013
translation by author
Over at the BBC, “A new drama by Sir Tom Stoppard to celebrate the 40th anniversary of Pink Floyd's The Dark Side of the Moon,” incorporating the music into the radio play (h/t: Ajay Chaudhary):
A Boston Review forum with Neta C. Crawford, Mary Kaldor, Tod Lindberg, Greg Grandin, James D. Fearon, John Tirman and Joanne Landy all responding to Alexander B. Downes:
Obama’s reasons for confronting Qaddafi are more like Clinton’s in Bosnia and Kosovo than Bush’s in Iraq and Afghanistan, but several enduring factors trump changes in administration and provide a powerful impetus for continuing efforts at regime change spearheaded by the U.S. military.
First, there are few external constraints on the exercise of American power: the United States spends nearly as much on defense as the rest of the world combined and dwarfs most potential adversaries in military capability. Because the United States is so powerful, defines its international interests so broadly, and is so accustomed to intervening militarily on behalf of those interests, only a radical realignment of strategy would enable American leaders to forswear regime change. As long as the United States is committed to providing stability in most of the world, rooting out terrorism, stopping the spread of weapons of mass destruction, curbing human rights abuses, spreading democracy, and pursuing global primacy, frequent intervention is unavoidable.
Second, U.S. leaders face few hurdles to initiating military action abroad. Even though regime changes are costly and can result in prolonged occupations and insurgencies, U.S. leaders can successfully downplay or lie about the potential costs in order to obtain public approval. This was amply demonstrated by the Iraq invasion: long before the war began, widely available information showed that the Bush administration’s liberate-and-leave story was flawed. Yet the president and his advisors insisted that taking out Saddam Hussein would be cheap and easy, and the administration won the support it needed. Even after this story proved false, Bush was not held accountable, winning reelection in 2004. Leaders in democracies such as the United States focus on manufacturing consent for regime change rather than planning realistically for the fallout. As Afghanistan and Iraq show, it is easier to get the troops in than to get them out, even if public opinion turns against the mission.
Finally, Americans tend to personalize their conflicts. Almost every target of U.S. intervention in the post–Cold War world has been labeled another Hitler. It is enticing to believe that removing one person from power will fix a problem. This “evil leader” syndrome is one reason why it is so difficult for the United States to fight limited wars: the temptation to “go to Baghdad” rather than make peace with a dictator is strong, and, historically, killing the leader has meant defeating his army. Decapitation by airpower and targeted killing have become popular because they supposedly enable the United States to oust leaders without a ground invasion, thereby obviating the need for a costly war.
Alex Rosenberg and Tyler Curtain in the NYT's Stone:
Before the 1970s, the discussion of how to make economics a science was left mostly to economists. But like war, which is too important to be left to the generals, economics was too important to be left to the Nobel-winning members of the University of Chicago faculty. Over time, the question of why economics has not (yet) qualified as a science has become an obsession among theorists, including philosophers of science like us.
It’s easy to understand why economics might be mistaken for science. It uses quantitative expression in mathematics and the succinct statement of its theories in axioms and derived “theorems,” so economics looks a lot like the models of science we are familiar with from physics. Its approach to economic outcomes — determined from the choices of a large number of “atomic” individuals — recalls the way atomic theory explains chemical reactions. Economics employs partial differential equations like those in a Black-Scholes account of derivatives markets, equations that look remarkably like ones familiar from physics. The trouble with economics is that it lacks the most important of science’s characteristics — a record of improvement in predictive range and accuracy.
This is what makes economics a subject of special interest among philosophers of science. None of our models of science really fit economics at all.
The irony is that for a long time economists announced a semiofficial allegiance to Karl Popper’s demand for falsifiability as the litmus test for science, and adopted Milton Friedman’s thesis that the only thing that mattered in science was predictive power. Mr. Friedman was reacting to a criticism made by Marxist economists and historical economists that mathematical economics was useless because it made so many idealized assumptions about economic processes: perfect rationality, infinite divisibility of commodities, constant returns to scale, complete information, no price setting.
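An aside on the physics comparison above (not part of the Times excerpt): the resemblance Rosenberg and Curtain note is literal in the Black-Scholes case. The Black-Scholes equation for the price $V(S,t)$ of a derivative on an underlying asset $S$, with volatility $\sigma$ and risk-free rate $r$, is a partial differential equation that a standard change of variables reduces to the heat equation — the same PDE that governs diffusion in physics:

```latex
% Black-Scholes PDE for the derivative price V(S, t)
\frac{\partial V}{\partial t}
  + \frac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S}
  - r V = 0

% After a change of variables (x = \ln S and a rescaled time \tau),
% it becomes the one-dimensional heat (diffusion) equation:
\frac{\partial u}{\partial \tau} = \frac{\partial^{2} u}{\partial x^{2}}
```

This formal identity is what makes the borrowed-from-physics look of mathematical economics so striking — and it is exactly the kind of resemblance the authors argue is superficial, since the underlying predictive track record differs.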
David Berreby in Aeon Magazine:
[T]he scientists who study the biochemistry of fat and the epidemiologists who track weight trends are not nearly as unanimous as Bloomberg makes out. In fact, many researchers believe that personal gluttony and laziness cannot be the entire explanation for humanity’s global weight gain. Which means, of course, that they think at least some of the official focus on personal conduct is a waste of time and money. As Richard L Atkinson, Emeritus Professor of Medicine and Nutritional Sciences at the University of Wisconsin and editor of the International Journal of Obesity, put it in 2005: ‘The previous belief of many lay people and health professionals that obesity is simply the result of a lack of willpower and an inability to discipline eating habits is no longer defensible.’
Consider, for example, this troublesome fact, reported in 2010 by the biostatistician David B Allison and his co-authors at the University of Alabama at Birmingham: over the past 20 years or more, as the American people were getting fatter, so were America’s marmosets. As were laboratory macaques, chimpanzees, vervet monkeys and mice, as well as domestic dogs, domestic cats, and domestic and feral rats from both rural and urban areas. In fact, the researchers examined records on those eight species and found that average weight for every one had increased. The marmosets gained an average of nine per cent per decade. Lab mice gained about 11 per cent per decade. Chimps, for some reason, are doing especially badly: their average body weight had risen 35 per cent per decade. Allison, who had been hearing about an unexplained rise in the average weight of lab animals, was nonetheless surprised by the consistency across so many species. ‘Virtually in every population of animals we looked at, that met our criteria, there was the same upward trend,’ he told me.
It isn’t hard to imagine that people who are eating more themselves are giving more to their spoiled pets, or leaving sweeter, fattier garbage for street cats and rodents. But such results don’t explain why the weight gain is also occurring in species that human beings don’t pamper, such as animals in labs, whose diets are strictly controlled.
In 1835, at the height of the Southern cotton boom, the master class of the Mississippi Delta region had an attack of its worst phobia: fear of slave rebellion. One slaveholder in the countryside saw some of her slaves acting unusually, seeming defiant, appearing to plot. She began to eavesdrop and overheard one slave fantasize about being “her own mistress.” In another conversation, she caught the word “kill.” Her son squeezed a slave for information and drew out details of a coming insurrection. The masters sounded the alarm: patrols were instituted, investigators fanned out, the countryside came alive with tipsters. Evidence invariably consisted of seeing slaves where they oughtn’t to have been—in the slaveholder phrase, “skulking around.” The suspects gave up under torture, confessing plans for securing arms, robbing banks, butchering masters. As the investigation wore on, the ruling class created an ad hoc executive committee, which generated, piece by piece, its own worst nightmare. Although “circumstantial” is too kind a word for the evidence, and the investigators enjoyed no formal legal status, they nonetheless executed twenty-three suspects without controversy.
more from Gabriel Winant at n+1 here.
Every night, at the same point in the show, Eddie appears onstage in living, breathing form: a man on stilts in a tricorne hat and tailcoat, who would not look out of place at a Cornish folk parade. “I am hard of hearing,” says Dickinson. “With all due respect, that was such bullshit: scream for me again, London!” He has that brilliant, old-fashioned accent that all rock stars from Mick Jagger to Rod Stewart seem to have – a cheeky, Ealing-comedy London you don’t hear much any more. He was born in Worksop, Nottinghamshire, to a working-class family and was raised by his grandfather, a miner, who died of black lung. By the time he was a teenager, his parents had raised enough money doing up property to send him to Oundle public school, where he became the president of the war games society and handled real firearms – and from which he was later expelled. Britain’s rock stars moved up quickly in the world, fraternised with the titled, bought castles and suits of armour, colonised Mustique and appeared in Tatler’s society pages. They helped usher in the only kind of patriotism with which we are comfortable today: self-mocking, cartoonish, ridiculous, loose.
more from Kate Mossman at The New Statesman here.
When he was asked by Michael Albus to characterize the difference between his own approach and Karl Rahner’s, Hans Urs von Balthasar famously said, “Rahner has chosen Kant, or if you will, Fichte, the transcendental approach. And I have chosen Goethe, my field being German literature.” Kant, and the moderns in general, denied that the unifying center of a thing really does appear in the individual thing. When I see this particular tree, therefore, all I see is the appearance of this particular tree. If any generalizations are to be made about it, they will have to come from the side of the subject. This means that the classical transcendental properties of Being—unity, truth, goodness, and beauty—must no longer be conceived as properties of Being, but as characteristics attributed to Being from the side of universal subjectivity. All postmodernity has to do to achieve nihilism, it would seem, is to deny any universal subjectivity. Postmodernism is not so much an alternative to modernism as its reductio.
more from Rodney Howsare at Front Porch Republic here.
(Note: This week is the sixth death anniversary of the great Urdu writer and my dear friend Qurratulain Hyder.)
From The New York Times:
Dr. Siddhartha Mukherjee’s authoritative 2010 Pulitzer Prize-winning “biography” of cancer, “The Emperor of All Maladies,” ran almost 600 pages. In comparison, George Johnson has written a very small book, barely half that length. That Mr. Johnson’s story is as gripping, illuminating and affecting as the bigger book — or, for that matter, any other book out there — is testament to both his poet’s talents and his unusual perspective. An award-winning science writer, Mr. Johnson was for some years an editor at The New York Times and a contributor to Science Times (where portions of this book eventually appeared). Initially, though, his interests kept him firmly on the physical science side of things, covering particles and planets, a foreign terrain that often seems enviably organized, if a little dry, to those of us in the mushier, less rigorous zones of health. Then came a sad new assignment, self-imposed: Mr. Johnson set out to learn everything he could about cancer when his then-wife received a diagnosis at a relatively young age. So he gamely crossed over from the hard sciences to the soft, Gulliver with a notepad and a recorder. He understood the language well enough, but the customs were surpassing strange.
…Mr. Johnson’s wife, Nancy, was a trim, exercising, vegetable- and fiber-chomping nonsmoker in her early 40s when she felt a lump in her groin. It proved to be a metastasis from a malevolent form of uterine cancer, one whose cells are atypically aggressive and prone to spreading. Her situation and her terrible prognosis reminded Mr. Johnson of nothing more than his New Mexico backyard, with headstrong wildflowers blooming where they choose and intractable weeds exploding by night.