Declassified government documents, many of them cited in the CAVR report [East Timor’s final report of the country’s Commission on Reception, Truth, and Reconciliation], reveal that Jakarta was sufficiently worried about how Western countries would react to its aggression that Suharto, Indonesia’s dictator, vetoed earlier plans to invade East Timor and launched the invasion only after consulting Australia, Britain, and the United States.
But the documents show that Washington—as well as London—had decided to effectively sacrifice East Timor well before the invasion. In March 1975, the U.S. ambassador in Jakarta recommended to his superiors a “general policy of silence” on the Suharto regime’s planned forceful takeover of what was then Portuguese Timor. He explained that Washington had “considerable interests” in Indonesia—what Richard Nixon once described as “by far the greatest prize in the Southeast Asian area”—but had “none” in East Timor.
President Gerald Ford and Secretary of State Henry Kissinger, well aware of the pending invasion, met with Suharto in Jakarta on December 6, 1975. Ford assured his Indonesian counterpart that with regard to East Timor, “We . . . will not press you on the issue. We understand . . . the intentions you have,” while Kissinger worried that “the use of U.S.-made arms could create problems.” The United States had supplied about 90 percent of Indonesia’s military equipment on the condition that it not be used for offensive purposes. Kissinger promised Suharto that the United States would not regard the invasion as an aggression, while expressing understanding for Indonesia’s “need to move quickly” and advising “that it would be better if it were done after we [he and Ford] returned [to the United States].” Some 14 hours after their departure, Indonesian forces invaded.
Such understanding—and the associated material support—continued, with few interruptions, until September 1999. The reason was largely economic: as a State Department spokesman explained in 1976, “We regard Indonesia as a friendly, non-aligned nation—a nation we do a lot of business with.”
In the Boston Review, Lawrence Rosen revisits Orientalism and reviews Robert Irwin’s defense of Orientalists.
Irwin has a larger story to tell. Those who may fairly be called Orientalists certainly were, in his view, men (and, very rarely, women) of their times, but they were devoted to studying the languages of the region and establishing the relation of Islam and its history to Jewish and Christian sources. They were not overtly political, he says, nor, except in rare and more recent times, even involved in conversation with policymakers. Irwin’s characteristic way of dealing with the inveterate racists is simply to read them out of the category of Orientalists. Thus, Ernest Renan’s (1823–92) hostility to Semites suggests he was not a real scholar; and like-minded writers “did not need to have Orientalists invent racism for them.” He concludes that “racist attitudes in any period or region are the product of the natural tendency to think in generalities.” But this etiology of opinions avoids an essential point—not well expressed by Said—about consequences: whatever their origins and purposes, students of the region often set the terms of subsequent discussions. If Orientalists claimed that the East was a linguistically exceptional and theologically undeveloped culture with highly elaborate legal strictures, their framing had repercussions for political no less than common discourse. One does not have to be a policymaker to affect policy.
It is not enough, then, to complain—as Irwin does—about the banality of observing that scholars are not always objective. Irwin treats texts as if they had no political effects, as if (in earlier periods) salvation were the concern and attachments to royalists or mercantilism were incidental, and as if the explication of a text did not in itself imply unstated criteria. In thus evading Said’s larger and more difficult question about the political and intellectual effects of scholarly analysis—such as the constant references to the Prophet Muhammad as lascivious—Irwin retreats to the assumptions that continue to inform so much of Orientalist study.
While Said and his critics disagree about the existence of a hidden, malignant political agenda written into the entire course of Orientalist scholarship, both fail to analyze fully the foundations of Orientalist scholarship, assumptions that may or may not entail prejudice toward the peoples and cultures of the region. When Said says that “the core of Orientalist dogma persists”—that it “flourishes today in the forms I have tried to describe”—he fails to consider whether assumptions about language, textual analysis, and social dynamics may be capable of a substantial degree of autonomy or whether, as he uncritically assumes, they necessarily lead to adverse judgments of those studied.
We often hear that Americans know little about other nations; a bigger problem is that we know too little about ourselves, our history and our national character. When it comes to U.S. foreign policy, in particular, we were all born yesterday, unaware of how present policies and attitudes fit into persistent historical patterns. So when a brilliant, lucid historian such as Michael B. Oren does bring the past back to life for us, revealing both what has changed and what has stayed the same, it is a shaft of light in a dark sky.
Today, the conventional view is that George W. Bush took the United States on a radical departure when he declared a policy to transform the Middle East and that, as soon as he leaves office, U.S. policy will return to an alleged tradition of realism, rooted in the hard-headed pursuit of tangible national interests. This is both bad history and bad prophecy, as Michael B. Oren shows in Power, Faith, and Fantasy, a series of fascinating and beautifully written stories about individual Americans over the past four centuries and their contact with Middle Eastern cultures.
You don’t have to be a complete organism to take part in Darwinian evolution: Even sperm engage in the survival of the fittest. A new study indicates that the sperm of certain rodent species have evolved hook-shaped heads, apparently to beat each other to the egg. Sperm with better hooks are able to attach to more of their brethren, allowing them to form fast-moving chains that leave their rivals behind.
For years, biologists have puzzled over the strange shape of rodent sperm. Unlike the sperm of most other mammals, which have paddle-shaped heads, the sperm heads of many rat and mouse species are curved like scythes. About 10 years ago, scientists studying the European woodmouse discovered that these hooks allow groups of up to 100 sperm to attach to each other, and that these “sperm trains” move faster than sperm swimming alone.
Curious if there were any evolutionary forces at play, evolutionary biologist Simone Immler of the University of Sheffield in the United Kingdom and colleagues studied the sperm of 37 rodent species, including the Norway rat and the house mouse. As in the European woodmouse, the team found that—in most of the species studied—sperm hooked into an entourage moved faster than loners did. What’s more, species with larger testes—and thus greater quantities of sperm per ejaculate—tended to have sperm with sharper hooks. That may be because more sperm equals more competition between sperm to reach the egg.
As heart disease reaches epidemic proportions worldwide, researchers are moving away from the old “clogged-pipes” model to search for triggers lurking in our genes.
Jennifer Kahn in National Geographic:
At this moment, her doctor is threading a thin catheter up through her femoral artery from an incision in her groin, on into the aorta, and from there into one of the arteries encircling Gloria’s heart. At the tip of the catheter is a small balloon. The doctor gently navigates the tip to a spot where plaque has narrowed the artery’s channel by 90 percent. With a quick, practiced movement he inflates the balloon to push back the artery wall, deflates the balloon, then inserts an expandable stent—it looks like a tiny tube of chicken wire—that will keep the passage open. As Gloria watches on the monitor, the crimp in her artery disappears, and a wide laminar flow gushes through the vessel, like a river in flood.
The procedure is over. It has lasted only half an hour. In all likelihood, Gloria will be able to go home the next day. So will a few thousand other patients in the United States undergoing such routine angioplasty—more than a million of them a year. Pipe fixed, patient cured, right?
More here. And see a great gallery of photos here. [Thanks to Beajerry.]
Understanding how humans invest requires more than the study of economics; one also needs to comprehend behavioral psychology. Combining cognitive science and behavioral economics can yield powerful insights into the conduct of investors.
I recommend Cornell professor Thomas Gilovich’s book How We Know What Isn’t So to investors all the time. The professor’s contribution to the investment community is his study of human reasoning errors. More specifically, Gilovich studies the inherent biases and faulty thinking endemic to all us humans. These faulty analyses are pretty much hard-wired into our species.
How do these defects manifest themselves? In all too many ways: Humans have a tendency to see order in randomness. We find patterns where none exist. While that trait might have helped a baby recognize its parents (thereby improving the odds for its survival), seeing patterns where none exist is counter-productive when it comes to investing.
We also selectively perceive data, hoping to find something that confirms our prior views, and ignore data that contradicts them. We even reinterpret old evidence so it is more in sync with our perspective. Then we selectively remember only those things that support our case. Finally, we overuse heuristics—the simple, efficient rules of thumb people rely on to make decisions, form judgments, and solve problems, typically in the face of complexity or incomplete information (call them mental shortcuts). These shortcuts often generate “systematic errors,” or blind spots, in our analytical reasoning.
And that’s only a partial list of analytical imperfections you have inherited.
Luxury toilet paper. At first it sounds like an insultingly obvious joke. Who would want such a thing? But then visions of those notorious $900 Gucci dog bowls flit through your mind, and you’re haunted by the possibility that your cynicism isn’t polished enough to second-guess the world’s hunger for tiny, absurd self-indulgences.
You’d be right. Consider Renova Negro: This all-black toilet paper from Spain is brand new, real, and mercilessly chic. Very Pedro Almodóvar. And, as it turns out, 10 times more costly than the average Euro-wipe. Renova Negro is the brainchild of an established, successful company already famous for an ad campaign in which barely clad models dry-hump near a commode while rolls of toilet paper look on, unmoved, as though they’ve seen it all.
In Japan, meanwhile, luxury toilet paper is de rigueur. Japanese rolls are routinely scented, extra-thick, aloe-moistened, strictly “virgin” (unrecycled), patterned, or—the latest trick—infused with pineapple enzymes to counteract odor. And in Germany the American brand Charmin Ultra is known as Charmin Deluxe; it comes in urbane black-and-charcoal-gray packaging “designed with the consumer in mind,” according to Procter and Gamble’s European division, “with a Gucci look and feel.”
In Le Monde Diplomatique, Susan George argues for a resurrection of Keynes’s proposal on how to manage world trade.
The economist John Maynard Keynes came to the postwar table with an innovative project for the future of world trade, which he called the International Trade Organisation (ITO), supported by an international central bank, the International Clearing Union (ICU). The ICU was meant to issue a world currency for trade, the bancor. Why the ITO and the ICU never materialised, and what would have changed if they had, forms a sobering story from which we can learn. It tells us that, in a rational world, it would be possible to construct a trading system serving the needs of people in both North and South.
With an ITO and an ICU, we could have had a world order in which no country could run a huge trade deficit (the United States deficit stood at $716bn in 2005) or the huge trade surplus of contemporary China. Under such a system, crushing third world debt and the devastating structural adjustment policies applied by the World Bank and the IMF would have been unthinkable, although the system would not have abolished capitalism. If we could resurrect Keynes’s concept, another world really might be possible: he figured out how to make it work more than 60 years ago. His plan would have to be dusted off and tinkered with, but its core remains relevant.
Before explaining the rules it would have established, we should consider why the ITO was never set up. The usual explanation is that the US blocked it, which is true but too facile. There were other political reasons. The US and Britain began discussing the ITO agreement long before the war was over, and Keynes had already floated the idea in 1942. He chaired the Bretton Woods monetary conference in July 1944, where it emerged as the official British position. By that time the US, doubtless following the opinions of its corporations, was less enthusiastic and its chief negotiator, Harry Dexter White, pushed instead for the World Bank and the IMF (1). The US Congress subsequently approved both institutions, sometimes referred to as the “Bretton Woods institutions”, but the ITO was not yet ripe for ratification.
Many people no doubt regard vegetarianism as inherently frivolous and hence an unsuitable topic for serious intellectual history. But if The Bloodless Revolution does anything, it is to prove such skeptics wrong. One way or another, it shows that vegetarians have been in the forefront of some of the most important controversies of the modern era. The reason is not hard to fathom. Like everything else in life, food is multidimensional, which is why the question of whether to order fruit salad or a BLT is never solely a matter of taste but touches on everything from morality and aesthetics to agricultural policy, humanity’s place in the natural world and even constitutional affairs. In the eighteenth century, to cite just one example, beef was as central to the English self-image as cheap gasoline currently is to that of the United States. Just as the ability to cruise down a highway in an SUV or pickup is what distinguishes an American from a Frenchman paying $7 a gallon to tool around in some mini-subcompact, the ability to consume great slabs of cow flesh was what distinguished John Bull from “Frogs” dining on onions and snails. Scruffy vegetarians seeking to take all that red meat away were barely distinguishable from Jacobin sympathizers wishing to guillotine the House of Lords.
If we are what we eat, in other words, then modifying the national diet was seen as the quickest route to changing the political structure, while resisting such demands was part and parcel of defending the status quo. Their analysis may have been naïve, but vegetarians’ ambitions were immense and their critique was nothing if not sweeping.
Stuart begins his tale with Sir Francis Bacon, appropriately enough since Bacon was both a key figure in the Scientific Revolution that gave us modernity and keenly interested in the question of diet, health and longevity.
Robert Frost’s poetry is full of actions taken on obscure impulse. A man reins in his horse on “the darkest evening of the year” to watch the woods fill up with snow. Why does he interrupt his journey? “The woods are lovely, dark and deep.” Another man hesitates where “two roads diverged in a yellow wood” and takes “the one less traveled by.” These poems are so familiar that it is almost painful to quote them. Others less well known are no less driven by impulse. “Into My Own,” the sonnet that opens Frost’s first book of poems, evokes a distant prospect of “dark trees”: “I should not be withheld but that some day/Into their vastness I should steal away.” Every true poem, Frost wrote in “The Figure a Poem Makes,” the lovely little manifesto that served as the preface to his Collected Poems of 1939, is the child of impulse: “It begins in delight, it inclines to the impulse, it assumes direction with the first line laid down, it runs a course of lucky events, and ends in a clarification of life–not necessarily a great clarification, such as sects and cults are founded on, but in a momentary stay against confusion.”
In August of 2003 I conducted a three-hour interview with former Mexican President Luis Echeverría. The central purpose was to explore the paradigmatic changes that so profoundly transformed population policies during his term in office (1970-1976). While this was the central topic, the interview was crisscrossed with multiple sub-topics that linked our conversation with historical memory and biography, violence and authoritarianism, and, of course, politics, power, and democratization. These sub-topics were all condensed under the metaphor of Tlatelolco – the Mexico City student massacre of 2 October 1968.
William Canak and Laura Swanson describe the events and their historical impact:
In 1968, a series of large-scale student demonstrations demanding free and mass education erupted in Mexico City. As the protest expanded to include workers, peasants, and unions, ideas of democracy and redistribution of wealth were adopted. The student movement was significant for several reasons. First, participation in the demonstrations included approximately 400 000 people […] Second, the student march to Tlatelolco Plaza in Mexico City ended violently with the Mexican police and army attacking the [unarmed and peaceful] group: 325 protesters were killed and thousands were injured […] Third, a number of students involved in the 1968 student movement influenced or became leaders of the urban popular movements in the early 1970s
Integrity exacts a price from an artist. Take the case of painter George McNeil (1908-1995). A fixture of the New York School, McNeil refused to pose with his peers in a 1950 photo shoot for Time magazine. As the story has come down through his family, McNeil took umbrage at being pictured as a team player in a milieu rife with personality conflicts and political maneuvering. The photograph he skipped out on, taken by Nina Leen, came to be called The Irascibles. It featured 15 New York artists who had signed a letter addressed to the Metropolitan Museum of Art deriding the institution’s hostility to “advanced art.” No one could have known it at the time, but Leen’s group shot would become an iconographic staple of postwar American art. It’s hard to measure the impact of the picture on the participating artists’ careers. It certainly didn’t hurt Jackson Pollock, Willem de Kooning, Mark Rothko, Robert Motherwell, Clyfford Still, Barnett Newman or Ad Reinhardt. (It wasn’t a foolproof catalyst for fame: Theodoros Stamos, Jimmy Ernst, James Brooks and Hedda Sterne have largely been consigned to storage.) All the same, Leen’s image—the Mount Rushmore of Abstract Expressionism, if you will—conferred a degree of legitimacy on a movement that would make New York the center of world art.
In short, the middle class of this country, our historic backbone and our best hope for a strong society in the future, is losing its place at the table. Our workers know this, through painful experience. Our white-collar professionals are beginning to understand it, as their jobs start disappearing also. And they expect, rightly, that in this age of globalization, their government has a duty to insist that their concerns be dealt with fairly in the international marketplace. In the early days of our republic, President Andrew Jackson established an important principle of American-style democracy – that we should measure the health of our society not at its apex, but at its base. Not with the numbers that come out of Wall Street, but with the living conditions that exist on Main Street. We must recapture that spirit today.
Work on Rome’s Palatine Hill has turned up a trove of discoveries, including what might be the underground grotto where ancient Romans believed a wolf nursed the city’s legendary founders Romulus and Remus.
Archaeologists gathered Tuesday at a conference to save crumbling monuments on the Palatine. The Palatine’s once-luxurious imperial homes have been poorly maintained and were at one time in danger of collapse — a situation that forced the closure of much of the hill to the public during a restoration project.
While funds are still scarce, authorities plan to reopen some key areas of the honeycombed hill to tourists by the end of the year, including frescoed halls in the palaces of the emperor Augustus and of his wife, Livia.
MARJANE SATRAPI’S life was flashing before her eyes. There she was, a mischievous girl on the streets of Tehran, buying contraband records during the Islamic revolution. Singing the lyrics in her bedroom at the top of her teenage lungs. Fidgeting with her head scarf at the lycée. Mourning the political imprisonment of her uncle. Falling in love for the first time. Saying goodbye to her beloved parents as they sent her, their only child, to find freedom and solace in the West.
“Imagine you see your face everywhere — from the back, from the front, as a girl, adolescent, everywhere,” Ms. Satrapi, 37, said during the making of an animated movie based on her best-selling and critically praised comic-book memoir, “Persepolis.” The original version, in French, includes the voices of the legendary French actress Danielle Darrieux as her grandmother, Catherine Deneuve as her mother and Chiara Mastroianni — the daughter of Marcello Mastroianni and Ms. Deneuve — as Marjane. An English-language adaptation, which will also include Ms. Deneuve, with Gena Rowlands as the grandmother, is scheduled to be released by Sony Pictures Classics this year.
In Asharq Alawsat, Amir Taheri reviews Pope Benedict XVI’s Values in a Time of Upheaval.
Although Pope Benedict does not quite tell us what “upheaval” he is referring to in the title of his new book, it soon becomes clear that he observes the present condition of mankind as a whole, and Europe in particular, with a degree of pessimism unexpected from a Christian prelate. After all, Christianity is known as the “faith of hope.”
The first cause of the Pope’s pessimism is the domination of the world by what he calls “the three mythical values of today”. These are progress, science and freedom.
The trouble is that the Pope does not spell out what he means by any of those terms. For example, does he mean to say that the recent unprecedented progress in medical science represents a threat to mankind? Should we steer away from a science that has helped us uncover more and more of the mysteries of nature and mobilize its resources for improving our lives? And, last but not least, in what way can freedom be regarded as a “mythical value”? By coincidence, the Pope’s book has been published at a time when the world prepares to mark the bicentenary of the abolition of slavery, an evil that Christianity, along with other faiths, never even questioned. For those released from the shackles, freedom was real, not mythical.
The second cause of the Pope’s apparent pessimism is the demographic decline of Europe. The Pope’s europhilia, not to say eurocentrism, is at times so passionate that one wonders whether he regards Christianity as little more than an ingredient in a more complex ideological mix in which the Hellenic heritage and medieval scholasticism are also present.
In this book, the Pope is so focused on Europe, which he believes is about to be lost to outsiders, notably Muslim immigrants, that one wonders whether he has forgotten that more than half of his Catholic flock lives on other continents. At one point (page 41, last paragraph), the Pope speaks of the “actual non-universality” of the Christian faith.
Even then, the Pope’s lamentations about the decline of Europe, echoing those of his fellow German Oswald Spengler nearly a century ago, may well be misplaced.
In the Boston Review, Kerry Emanuel on the epistemology, physics, and consequences of global warming:
My own work has shown that hurricanes are responding to warming sea surface temperatures faster than we originally expected, especially in the North Atlantic, where the total power output by tropical cyclones has increased by around 60 percent since the 1970s. The 2005 hurricane season was the most active in the 150 years of records, corresponding to record warmth of the tropical Atlantic. Hurricanes are far and away the worst natural disasters to affect the U.S. in economic terms. Katrina may cost us as much as $200 billion, and it has claimed at least 1,200 lives. Globally, tropical cyclones cause staggering loss of life and misery. Hurricane Mitch of 1998 killed over 10,000 people in Central America, and in 1970 a single storm took the lives of some 300,000 people in Bangladesh. Substantial changes in hurricane activity cannot be written off as mere climate perturbations to which we will easily adjust.
Basic theory and models show another consequential result of a few degrees of warming. The amount of water vapor in the air rises exponentially with temperature: a seven-degree increase in temperature increases water vapor by 25 percent. One might at first suppose that since the amount of water ascending into clouds increases, the amount of rain that falls out of them must increase in proportion. But condensing water vapor heats the atmosphere, and in the grand scheme of things, this must be compensated by radiative heat loss. On the other hand, simple calculations show that the amount of radiative heat loss increases only very slowly with temperature, so that the total heating by condensation must increase slowly as well. Models resolve this conundrum by making it rain harder in places that are already wet and at the same time increasing the intensity, duration, or geographical extent of droughts. Thus, the twin perils of flood and drought actually both increase substantially in a warmer world.
It is particularly sobering to contemplate such outcomes in light of the evidence that smaller, natural climate swings since the end of the last ice age debilitated and in some cases destroyed entire civilizations in such places as Mesopotamia, Central and South America, and the southwestern region of what is today the United States.
As I mentioned before, the modern history of Iran is full of constant humiliation and interventions from outside. Therefore I understand the desire for independence, security, and dignity. But there is a red line, where such a legitimate national policy threatens to transform itself into a hegemonial policy. Iranian culture and history are much older than those of Europe and Germany. So I am not entitled to be a history teacher. But allow me one remark about our own historical experience.
Europe developed the balance of power system after our religious wars in 1648. And we experienced its benefits and its nightmares over the centuries and finally its definitive collapse in two world wars between 1914 and 1945. My country challenged this European system twice in the first half of the twentieth century. At the beginning of the last century, Germany was the leading power of Europe, but we made the wrong decisions and ended in a complete disaster. What was our strategic mistake? We followed hegemonial aspirations that relied on military might and prestige, and we miscalculated the anti-hegemonial instincts of Europe. And twice we underestimated the strategic potential, the power, and the political will and decisiveness of the United States. Otto von Bismarck, perhaps the greatest German statesman of the nineteenth century, defined Germany’s role in his century as either “hammer or anvil.” In the second half of the twentieth century, it turned out that he was completely wrong, because this had never been a serious alternative. A new European system based on a peaceful balance of interests, common European institutions in the framework of the EU, and guaranteed security, produced by NATO and the transatlantic alliance, completely changed the course of German and European history for the better.
In his recent memoir, Things I Didn’t Know, art critic Robert Hughes pinpoints the moment he decided to leave his native Australia to begin a new life as a permanent expatriate. It was a warm evening in 1962. Hughes and his mentor, popular historian Alan Moorehead, were talking shop as they pounded down Gewürztraminer at Hughes’ apartment in Sydney. “If you stay here another ten years,” Moorehead told him, “Australia will still be a very interesting place. But you will have become a bore, a village explainer.”
Hughes heeded his friend’s advice, staying first at Moorehead’s villa in Tuscany, then moving to London, where he lived on the fringes of hippie counterculture (“all dope, rhetoric, be-ins, and powdered bullshit,” as he recalls) and wrote art reviews for the “quality Sundays”: the Times, the Telegraph, the Observer, the Spectator. In 1970, he got a call from Time (on a neighbor’s phone; his had been disconnected) offering him a job as the magazine’s art critic. His anecdote about this incident is a perfect snapshot of the good old days of cultural journalism: The editor who called him was drunk from his habitual three-martini lunch; Hughes was stoned to the gills on hash and, in his paranoia, assumed he was talking to the CIA. They worked it out; he took the job, moved to New York, and over the course of 30 years churned out hundreds of eloquent, witty, briskly opinionated columns for his target audience of intelligent, nonspecialist readers.
“THERE ARE ONLY two views that face all the facts,” wrote C.S. Lewis with his characteristic lectern-thumping certainty in “Mere Christianity” (1952). “One is the Christian view that this is a good world that has gone wrong, but still retains the memory of what it ought to have been. The other is the view called Dualism….I personally think that next to Christianity Dualism is the manliest and most sensible creed on the market.” What Lewis, red-faced reactionary and cheerleader for Christ, made of the writings of Norman Mailer — whose new novel “The Castle in the Forest” is published this week — is not recorded. It is unlikely, however, that he would have been disposed to judge them “sensible.” Nor, one suspects, would the great medievalist have found much that was “manly” in the young Mailer’s fascination with jazz, crime, orgasm, and marijuana. (“Swamp-literature!” he might have said.) Nonetheless, could the pipe-smoke of his antimodern prejudice have been waved away for a minute, and a clear reading taken of Mailer’s work and views, Lewis — who suffered from an abrupt intellectual honesty — would have been forced to admit it: Here, as he himself was the big-hitting Christian writer of his time, was the century’s arch-apologist of dualism.