May 31, 2006
Michael Paulus at his website (via The Daily Doubter):
Animation was the format of choice for children's television in the 1960s, a decade in which children's programming became almost entirely animated. Growing up in that period, I tended to take for granted the distortions and strange bodies of these entities.
These Icons are usually grotesquely distorted from the human form from which they derive. Since they are so commonplace and so readily accepted as existing, I thought I would dissect them the way science dissects all living things, trying to come to an understanding of their origins and true physiological makeup, and possibly to see them in a new light for what they are in the most basic of terms.
I decided to take a select few of these popular characters and render their skeletal systems as I imagine they would look if a creature truly had eye sockets half the size of its head, fingerless hands, or feet comprising 60% of its body mass.
Many more here.
How are young Muslims radicalized on domestic soil?
Steve Coll in The New Yorker:
In a world amply populated with angry young Muslims, it is a question of some interest why a small number choose to become suicide bombers. President Bush addresses the matter in starkly religious language, consigning it to an eternal contest between good and evil. American scholars have begun to attack the problem with scientific method; Robert Pape, of the University of Chicago, for example, recently mustered data to argue that suicide attacks are a rational means by which the weak can humble the strong. To this potpourri of hypotheses can now be added a compelling work by anonymous bureaucrats in Great Britain, under the oddly redundant title “Report of the Official Account of the Bombings in London on 7th July 2005.”
On that summer morning, three young Muslim men blew themselves up on Underground cars, and a fourth immolated himself on a double-decker bus; fifty-two people died, and several hundred suffered injuries. The most striking aspect of the inquiry into the attacks, which was published earlier this month, is the extent to which it plumbs the suicide bombers’ motivations.
The four men depicted in the report are in some respects unfathomable. When Shehzad Tanweer, a talented athlete who was twenty-two years old, bought snacks at a highway convenience store four hours before his death, he haggled over the change. Hasib Hussain, who was eighteen, strode into a McDonald’s just half an hour before he killed himself and thirteen others.
Beauty and her beasts
Chris Petit in The Guardian:
Her three marriages were essays in fame. Her first in 1942, at 19, to pint-sized star Mickey Rooney, then one of MGM's biggest assets and an experienced skirt-chaser despite his wholesome screen image, happened when she was barely a signed-up starlet. Rooney was forced to marry because she wouldn't come across otherwise. Her second husband, jazz star Artie Shaw, gave the uneducated Gardner a reading syllabus, sent her to therapy and, for reasons he never explained, moved them into a modest rented house in suburban Burbank, which they shared for a time with its owners and their teenage sons. The third husband was Sinatra. By then she was the bigger star, a perpetual cover girl and tabloid sensation, epitome of an emerging jet set (which can equally be taken for a life on the run), her movie career almost incidental to her celebrity, and indistinguishable from her often exaggerated notoriety. Asked by a reporter what she saw in Sinatra - a 119lb has-been - she replied demurely that 19lb of it was cock.
DIGITAL MAOISM: The Hazards of the New Online Collectivism
Jaron Lanier at Edge.org:
The problem is in the way the Wikipedia has come to be regarded and used; how it's been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous.
It's time to stop killing meat and start growing it
William Saletan in Slate:
Where were you when Barbaro broke his leg? I was at a steakhouse, watching the race on a big screen. I saw a horse pulling up, a jockey clutching him, a woman weeping. Thus began a worldwide vigil over the fate of the great horse. Would he be euthanized? Could doctors save him? In the restaurant, people watched and wondered. Then we went back to eating our steaks.
Shrinks call this "cognitive dissonance." You munch a strip of bacon then pet your dog. You wince at the sight of a crippled horse but continue chewing your burger. Three weeks ago, I took my kids to a sheep and wool festival. They petted lambs; I nibbled a lamb sausage. That's the thing about humans: We're half-evolved beasts. We love animals, but we love meat, too. We don't want to have to choose. And maybe we don't have to. Maybe, thanks to biotechnology, we can now grow meat instead of butchering it.
Genetic Comparison Traces Origins of HIV to African Chimpanzees
Lauran Neergaard of the AP in the Chicago Tribune:
Solving the mystery of HIV's ancestry was dirty work. But researchers now have confirmed that the virus that causes AIDS in humans really did originate in wild chimpanzees--in a corner of Cameroon.
Scientists have long known that captive chimps carry their own version of the AIDS virus, SIV or simian immunodeficiency virus. But it was extraordinarily hard to find in wild chimpanzees, complicating efforts to pin down just how the virus could have made the jump from animal to man.
Fitting that final piece of the puzzle required seven years of research just to develop tests to genetically trace the virus in living wild chimps without hurting the endangered species. Then trackers had to plunge through the dense forests of West Africa and scrape up fresh ape feces, more than 1,300 samples in all.
Until now, "no one was able to look. No one had the tools," said Dr. Beatrice Hahn of the University of Alabama at Birmingham. She led the team of international researchers that reported the success in Thursday's online edition of the journal Science.
"We're 25 years into this pandemic," Hahn said. "We don't have a cure. We don't have a vaccine. But we know where it came from."
on simple human decency
Ben Metcalf in Harper's Magazine:
Some time has passed since I last raised my voice to the multitude, and whereas literary taste does not seem to have advanced much in the interim, and I assume is still arrayed so as to engage only the weak-minded and dull, I find that I am no longer able to discern with any accuracy where the bounds of simple human decency lie. This would bother me even less than does the taste issue were it not for the fact that ground gained or lost in the theater of decency tends now and then to affect the law, and it has long been a personal goal of mine to avoid capture and imprisonment.
I am therefore led to wonder what the common citizen is allowed to "say" anymore, in print or otherwise, and still feel reasonably sure that some indignant team of G-men, or else a pair of gung-ho local screws, will not drag him away to a detention center, there to act out, with the detainee as a prop, that familiar scene in which one hero cop or another is patriotically unable to resist certain outbursts against the detainee and what were once imagined to be the detainee's constitutional rights. Because I am loath to violate whatever fresh new mores the people have agreed upon, or have been told they agree upon, and because I do not care to have my ass kicked repeatedly in a holding cell while I beg to see a lawyer, I almost hesitate to ask the following question.
More here. [Thanks to Asad Raza.]
WHY THE U.N. CAN'T SAVE DARFUR
Eric Reeves in The New Republic:
Actually, far from suggesting that the United Nations can save Darfur, the developments of the last few weeks provide an excellent illustration of why the international body will never be able to stop the genocide. Indeed, the most recent Security Council resolution does more to highlight Darfur's exceedingly grim future than to suggest that security for civilians or humanitarian operations will improve anytime in the near term. We might recall that there have been seven previous U.N. Security Council resolutions on Darfur, none of which has halted the genocide. These previous resolutions, which together constitute a shameful record of impotence, are recounted in the most recent resolution--unwittingly drawing attention to just how useless Turtle Bay's steady stream of diplomatic activity on Darfur has been. Unfortunately, there is no reason to believe that this time will be any different.
First, it's worth understanding just how bad the situation on the ground in Darfur has become--despite the recent peace agreement signed in Abuja that many believe could open the way for U.N. troops.
Intelligent Beings in Space!
From The New York Times:
A future space mission to Titan, Saturn's intriguing moon enveloped in clouds, might deploy a blimp to float around the thick atmosphere and survey the sand dunes and carved valleys below.
But the blimp's ability to communicate would be limited. A message would take about an hour and a half to travel more than 800 million miles to Earth, and any response would take another hour and a half to get to Titan.
Three hours would be a long time to wait if the message were: "Help! I'm caught in a downdraft. What do I do?" Or if the blimp were to spot something unusual — an eruption of an ice volcano — it might have drifted away before it received the command to take a closer look. The eruption may also have ended by then.
Until recently, interplanetary robotic explorers have largely been marionettes of mission controllers back on Earth. The controllers sent instructions, and the spacecraft diligently executed them.
But as missions go farther and become more ambitious, long-distance puppetry becomes less and less practical. If dumb spacecraft will not work, the answer is to make them smarter. Artificial intelligence will increasingly give spacecraft the ability to think for themselves.
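A quick back-of-the-envelope check on the communication delay quoted above (a sketch, assuming an Earth-Saturn separation of roughly a billion miles, about $1.6 \times 10^{9}$ km; the actual distance varies with the two planets' orbital positions):

$$
t = \frac{d}{c} \approx \frac{1.6 \times 10^{9}\ \text{km}}{3.0 \times 10^{5}\ \text{km/s}} \approx 5.3 \times 10^{3}\ \text{s} \approx 90\ \text{minutes},
$$

so one question and one answer, a single round trip, come to about three hours, just as the article says.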
Scientists reveal how frogs grip
From BBC News:
The mystery of how frogs cling to surfaces - even if their feet are wet - may have been solved by scientists. A study of tree frogs has revealed their toe pads are covered in tiny bumps that can directly touch a surface to create friction. The scientists found this direct contact occurs even though the pads are covered with a film of watery mucus. The findings, published in the journal Interface, may aid the development of anti-slip devices.
"The toe pads are patterned with a fine structure of hexagonal cells with channels running between them," explained Dr Jon Barnes, an author on the paper and a zoologist from Glasgow University. "One imagines if you are sticking to a leaf, that each cell, even if it is separate from the other cells, can form its own closest orientation."
May 30, 2006
The life and work of Oriana Fallaci
Margaret Talbot in The New Yorker:
“Yesterday, I was hysterical,” the Italian journalist and novelist Oriana Fallaci said. She was telling me a story about a local dog owner and the liberties he’d allowed his animal to take in front of Fallaci’s town house, on the Upper East Side. Big mistake. “I no longer have the energy to get really angry, like I used to,” she added. It called to mind what the journalist Robert Scheer said about Fallaci after interviewing her for Playboy, in 1981: “For the first time in my life, I found myself feeling sorry for the likes of Khomeini, Qaddafi, the Shah of Iran, and Kissinger—all of whom had been the objects of her wrath—the people she described as interviewing ‘with a thousand feelings of rage.’ ”
For two decades, from the mid-nineteen-sixties to the mid-nineteen-eighties, Fallaci was one of the sharpest political interviewers in the world. Her subjects were among the world’s most powerful figures: Yasir Arafat, Golda Meir, Indira Gandhi, Haile Selassie, Deng Xiaoping. Henry Kissinger, who later wrote that his 1972 interview with her was “the single most disastrous conversation I have ever had with any member of the press,” said that he had been flattered into granting it by the company he’d be keeping as part of Fallaci’s “journalistic pantheon.” It was more like a collection of pelts: Fallaci never left her subjects unskinned.
Celebrating the commonplace: Starlight
Chet Raymo in Science Musings:
Sometimes it's fun to think about things that no one has thought about before.
Some things are thought about for the first time because to do so requires genius. For example: Darwin thinking about evolution by natural selection, Einstein thinking about relativity, or Watson and Crick thinking about the DNA double helix. Being the first to think about those sorts of things can win you a Nobel prize.
Other things are thought about for the first time because they are so utterly commonplace that no one has bothered to think about them before. These are the kind of things I like to think about.
Consider starlight. What could be more commonplace than starlight?
The Wind That Shakes The Barley
Daren Waters at the BBC:
Ken Loach, speaking at the Cannes film festival, said The Wind That Shakes The Barley was a story he had to tell.
Loach's aim is to cast his political eye on events that are rarely discussed in the UK and beyond and remain open wounds for many Irish citizens.
Cillian Murphy plays Damien, a young man set to leave Ireland and become a doctor in London.
But events overtake him.
At the start of the film, Ireland remains an effective colony of the UK, with British soldiers stationed in the country.
Damien witnesses the murder of a young friend, killed at the hands of brutal British soldiers because he would only give his name in Gaelic, and not in English.
On Seeing the Wind at Hope Mansell
Whether or not shadows are of the substance
such is the expectation I can
wait to surprise my vision as a wind
enters the valley: sudden and silent
in its arrival, drawing to full cry
the whorled invisibilities, glassen towers
freighted with sky-chaff; that, as barnstorming
powers, rammack the small
orchard; that well-steaded oaks
ride stolidly, that rake the light-leafed ash,
that glowing yew trees, cumbrous, heave aside.
Amidst and abroad tumultuous lumina,
regents, reagents, cloud-fêted, sun-ordained,
fly tally over hedgerows, across fields.
a new poem from Geoffrey Hill at Poetry Magazine here.
On the afternoon of January 31, 1998, two hundred professors and graduate students gathered at the University of California, Santa Cruz, to discuss a disturbing new movement. "A specter is haunting U.S. intellectual life," a flier announced, "the specter of Left Conservatism." With participants including Judith Butler, Wendy Brown, Jonathan Arac, and Paul A. Bové, the conference was designed to address the perceived split in the mid- to late '90s between members of the so-called cultural and real Lefts.
What was the difference between the two? The conventional wisdom of the time had it that the cultural Left was composed of theory-obsessed, anti-American academic relativists who wrote obscure treatises and preferred ethnic- and gender-oriented identity politics to activism. Members of the real Left, on the other hand, were pragmatic humanists, earnest '60s types who favored coalition building (with the labor movement, for one), abhorred class inequality, and pressed for political change via elections.
more from Bookforum here.
It is difficult, if not impossible, to tell where the art begins and ends in Dieter Roth's exhibition at Coppermill, Hauser & Wirth's new gallery in a gigantic warehouse in London's East End. Entering the space is like walking into a begrimed indoor city, whose every filthy crevice is crammed with disconcerting detail: heaps of rubbish, hardened paint brushes, broken video cameras. This is the largest exhibition of Roth's work to be held in this country for more than 30 years, yet it provides little more than an inkling of the artist's complicated, divergent career, and his no less complicated life.
more from the Guardian Unlimited here.
Sexual attraction: the magic formula
From The London Times:
Selecting a mate is the most crucial decision of our lives. We spend a huge amount of time and energy trying to find that special someone. Our appetite for a relationship fuels a billion-pound industry of matchmaking services. Yet we’re often not satisfied. A 2005 survey of more than 900 people who had been using online dating services revealed that three-quarters had not found what they were looking for. We seem as much in the dark as ever about who is a suitable match.
Let’s start with the conscious part. There are some things we all find attractive. Men tend to desire women with features that suggest youth and fertility, including a low waist-to-hip ratio, full lips and soft facial features. Recent studies confirm that women have strong preferences for virile male beauty — taut bodies, broad shoulders, clear skin and defined, masculine facial features, all of which may indicate sexual potency and good genes. We also know that women are attracted to men who look as if they have wealth or the ability to acquire it, and that men and women strongly value intelligence in a mate. Preferences for these qualities — beauty, brains and resources — are universal. The George Clooneys and Angelina Jolies of the world are sex symbols for predictable biological reasons.
Thumbs Up for Leech Therapy
Bloodsucking leeches relieve the pain of thumb arthritis more effectively and for a longer period of time than the conventional painkilling ointment, according to new clinical trial results. The findings, presented here yesterday at the North American Research Conference on Complementary and Integrative Medicine, may move leech treatment one large wriggle closer to the mainstream of medicine.
Osteoarthritis of the thumb afflicts millions of people, causing joint pain debilitating enough to keep them from opening jars, writing notes, and gripping anything tightly. Doctors usually prescribe painkilling pills, injections, or ointments, but none of the treatments work well. Internist Gustav Dobos of the University of Essen in Germany and his colleagues had successfully treated patients' arthritic knees with leeches before. The worms inject a blood-thinning chemical called hirudin and several substances that fight inflammation--components that keep a prey's blood flowing in the wild.
May 29, 2006
Teaser Appetizer: The Definition of Health
The World Health Organization (WHO) defines health as “a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.” This definition entered the books between 1946 and 1948 and has remained unchanged.
Current medical knowledge is desperately struggling, with only partial success, just to “merely” control “disease or infirmity,” while “complete well-being” is unlikely to sprout from our incomplete knowledge. If politicians were to legislate health by this definition, they would be forever in default, for one obvious reason: no nation – I repeat, no nation – has the knowledge or the resources to deliver care that matches this definition. We all learned in kindergarten – well, except the politicians – not to promise what we cannot fulfill.
This definition is a lofty, laudable, visionary statement that may reflect a distant aspiration, but its realization is elusive in current practice. In all humility, we should concede that “complete … well-being” is probably an unquantifiable metaphysical state, unattainable without taming nature’s evolutionary laws of life and death. And to presume that we have the ability to do so is a whiff of arrogance – an aromatic trait our species emits in abundance.
The realization of this dream probably seemed feasible in 1948, when we had made a quantum leap in understanding infectious diseases and, for the first time in human history, were exuberant in our demonstrated ability to extend longevity by about twenty years in some countries. But that was long before we could predict the explosion of health technology or understand its consequent individual, societal and economic effects.
Isn’t it time we sought a second opinion on the health of this definition and evolved a flexible one that encompasses current reality and is malleable enough to accommodate future developments?
While the WHO definition stays seemingly immutable, a new framework linked to human rights has evolved. The human-right-to-health paradigm reiterates that the enjoyment of the highest attainable standard of health is a fundamental right of every human being. This linkage has provided an inspirational tool with which to demand “health.” The tenor of this discourse takes its cue from the rhetoric of Kofi Annan: “It is my aspiration that health will finally be seen not as a blessing to be wished for, but as a human right to be fought for.”
This paradigm recognizes that the violation of human rights has serious health consequences and that promoting equitable health is a prerequisite to the development of society. The discourse rightly demands the abolition of slavery, torture, abuse of children and harmful traditional practices, and also seeks access to adequate health care without discrimination, safe drinking water and sanitation, a safe work environment, equitable distribution of food, adequate housing, access to health information and gender sensitivity.
All nations are now signatories to at least one human rights treaty that includes health rights. One hundred and nine countries had guaranteed the right to health in their constitutions by the year 2001, which qualifies it as an effective instrument for policy change; but it also raises some difficult questions.
Human rights discourse uses the words health and health care interchangeably. Rony Brauman, past president of Médecins Sans Frontières, comments: “WHO's definition of a ‘right to health’ is hopelessly ambiguous. I have never seen any real analysis of what is meant by the concept of ‘health’ and ‘health for all,’ nor do I understand how anyone could seriously defend this notion.” The notion would be more defensible if the demand for health care replaced the demand for health.
Yet no country in the world can afford to give all health care to all its citizens all the time. Nations conduct a triage of priorities according to their prejudices, and large swaths of their populations are not caught in the health care net. Even nations that have the right to health embedded in their constitutions face a gap between aspirations and resources.
The human rights debate skirts round the issue by invoking the “principle of progressive realization,” which allows resource-strapped countries to promise increments in health care delivery in the future. This effectively gives governments a tool to ration and allocate resources, even when doing so conflicts with individual rights.
The following example illustrates the problem: the post-apartheid government of South Africa enshrined the right to health in its constitution, yet the courts decided against a petitioner who demanded the dialysis he needed for chronic kidney failure. The court ruled that the government did not have an obligation to provide the treatment, in essence transferring some responsibility to the individual.
Gandhi, too, expressed the concern that rights without responsibility are a blunder. A responsibility paradigm could supplement the rights movement; a pound of responsibility could prove heavier than a ton of rights. But the current noise for rights has muzzled the speech for responsibility, and “complete health” is becoming an entitlement to be ensured by the state, without any demand that the family and the individual be equal stakeholders. Hippocrates said, “a wise man ought to realize that health is his most valuable possession and learn to treat his illnesses by his own judgment.”
This conflict will escalate further with the impact of biotechnology. A quote from Craig Venter gives the flavor: “It will inevitably be revealed that there are strong genetic components associated with most aspects of what we attribute to human existence … the danger rests with what we already know: that we are not all created equal … revealing the genetic basis of personality and behavior will create societal conflicts.”
Derek Yach, a respected public health expert and professor at Yale University, says: “With advances in technology, particularly in the fields of imaging and genetic screening, we now recognize that almost all of the population either has an actual or potential predisposition to some future disease.”
We can’t help but rethink health itself before we promise health care. An alternative definition can be derived from the “health field” concept of Marc Lalonde, who was the health minister of Canada in 1974. He surmised that the interplay of four elements determines health: genetic makeup, environment (including social factors), individual behavior and the organization of health care. The health field model holds many stakeholders accountable.
Each stakeholder approaches health with a seemingly different goal, even though the goals complement each other. A healthy person wishes not to fall sick; a sick person demands quick relief; a health care provider attempts to cure and prevent disease; a molecular biologist envisions control of molecular dysfunction; a public health official allocates resources to benefit the maximum number of people; a health economist juggles finances within the budget; the government facilitates or hampers the delivery of care according to its priorities; and the activist demands that every person has the right to the “highest attainable standard of physical and mental health.”
Many stakeholders mean more questions than answers. Who decides the limits of health a society should attain? Should the boundary stop at basic primary care, or extend to genetic manipulation in the pursuit of well-being? Who decides the mechanism of attaining that limit? Who defines positive mental well-being? And who pays for it?
It is apparent that ‘complete well-being’ is as much an oxymoron as ‘airline food’! We urgently need a new definition as a starting point for debate: a definition that is quantifiable for outcomes, accommodating of stakeholders, absorbent of future advances, accountable for the delivery of care and cognizant of limitations. The new definition has to be both correct and politically correct. Dr. Brundtland, former director-general of the WHO, wrote in the World Health Report that “The objective of good health is twofold – goodness and fairness; goodness being the best attainable average level; and fairness, the smallest feasible differences among individuals and groups.” We should match our expectations to reality.
These elements, compressed and enveloped into a workable statement, may read as follows:
Health is a state of freedom from the physical and mental consequences of molecular and psychological derangements caused by the interaction of individual biology and the environment; health care is an attempt to reverse such derangements by providing equitable access to all, without discrimination, within the constraints of available resources and knowledge.
You may call this, if you please: the 3QD definition of health -- you read it here first!
Dispatches: Affronter Rafael Nadal
Roland Garros, or tennis' French Open, started yesterday. Perhaps you've noticed; articles ran in most Sunday papers about it, quite extensive ones too, considering that the French has often been viewed as a third-rate Grand Slam tournament (behind Wimbledon and the U.S. Open), largely because it is usually won by a cadre of specialists instead of the best-known players. Not only is this perception unfair but, this year, Roland Garros will be the most important men's tennis tournament of the year. Here's why.
The increasing specialization of tennis has meant that this tournament, the only Grand Slam played on clay, has a set of contenders that is quite distinct from those at the grass courts of Wimbledon and the hardcourts of Flushing Meadows, Queens. Not only has it been won by players who have not been dominant on the other surfaces, but it has been very difficult for anyone to enjoy repeat success sur la terre battue (on the crushed clay, as the French call it). Ten of the last twelve Wimbledons were won by Pete Sampras and Roger Federer; the last five winners of Roland Garros are Gustavo Kuerten, Albert Costa, Juan Carlos Ferrero, Gaston Gaudio, and Rafael Nadal. I'm going to try to explain both phenomena (specialized success and lack of repeat dominance) below.
Why does it make a difference what surface the game is played on, and what difference does it make? Basically, the surface affects three things: the speed of the ball after it bounces, the height of the ball's bounce, and the player's level of traction on court. In terms of the speed of the ball and height of its bounce, clay is the slowest and highest, and grass is the fastest and lowest, with hardcourt in the middle. This results in differing strategies for success on each surface, with grass rewarding aggressive quick strikes - with the speed of the ball and the low bounce, you can 'hit through' the court and past the other player with relative ease. For this reason, the great grass-court players have mostly been offensive players, who use serve-and-volley tactics (i.e., serving and coming to net to take the next ball out of the air). Clay, on the other hand, reverses this in favor of the defensive player: the slow, high bounce means it is very tough to hit past an opponent, and points must be won by attrition, after long rallies in which slight positional advantages are constantly being negotiated before a killing stroke. Clay-court tennis is exhausting, brutal work.
Clay and grass, then, are opposed, slow and fast, when it comes to the ball. How then did Bjorn Borg, perhaps the greatest modern player (he accomplished more before his premature retirement at twenty-five than anyone other than Sampras), manage to win Roland Garros (clay) six times and Wimbledon (grass) five, but never a major tournament on the medium-paced surface, hardcourt? The third variable comes into play here: traction. Clay and, to a lesser extent, grass provide negative traction. That is, you slip when you plant your foot and push off. Hardcourt provides positive traction: your foot sticks. Consequently, entirely different styles of quickness are needed. Borg didn't like positive traction. On clay, particularly, players slide balletically into the ball, a skill whose timing is developed during childhood by the most talented players, most of whom grew up in countries where clay courts are the rule: Spain, Italy, Argentina, Chile, Brazil. Grass is not as slidey, but offers less traction than the sticky hardcourts, and like clay, grass's uneven natural surface produces unpredictable hops and bounces, frustrating the expectations of the more lab-conditioned hardcourt players.
So, clay slows the ball and provides poor footing, two qualities that mean it's ruled by an armada of players who grow up playing on it and mastering the movement and strategic ploys it favors. Perhaps foremost among these is the dropshot, which works because the high bounce of the clay court drives players way back and sets them up for the dropper. This explains the dominance of the clay specialists, but why has the title passed among so many players lately? For the most part, it is because of the grinding nature of clay. So much effort must be expended to win a match (five sets on clay can take five hours of grueling back-and-forth; in contrast, bang-bang tennis on grass can be practically anaerobic) that players tire over the course of the tournament, and so much depends upon perseverance that a superhuman effort will often overcome a greater talent. It just so happens that last year there emerged a player who combines the greatest clay talent with the greatest amount of effort, but more on him below. For now, let me return to my claim that this edition of the French is the most important men's tennis event this year.
Historically, the greatest offensive players (meaning players who try to dictate play and win points outright, rather than counterpunchers, who wait for their opening, or retrievers, who wait for you to mess up), have been unsuccessful at Roland Garros, while the defensive fiends who win in Paris have been unsuccessful on grass. (Borg, a counterpunching genius, is the great exception.) The best attackers, namely John McEnroe, Boris Becker, Stefan Edberg, and of course Pete Sampras, have won zero French Opens, while Ivan Lendl, a three-time Roland Garros winner, narrowly failed in his endearing late-career quest to win Wimbledon (all of these players won major titles on hardcourts as well). The only man since 1970, in fact, to win all four major titles (known as the Grand Slam tournaments), on the three disparate surfaces, is one Andre Agassi, a hybrid offensive baseliner. This has made the dream of winning all four Slams in a single year, a feat also known, confusingly, as winning the Grand Slam--last accomplished by Rod Laver in 1969--seem pretty quixotic nowadays. Until now. The game's best current offensive player is also an excellent defensive player, and an extremely competent mover and slider on clay. Roger Federer has the best chance of anyone since Agassi to win the career Grand Slam, and, as the holder of the last Wimbledon, U.S. Open, and Australian titles, could win his fourth straight major this month (a feat he is calling, with a little Swiss hubris, the "Roger Slam"). If he succeeds this year at Roland Garros, he'll accomplish something Sampras couldn't, and if he does I think it's almost inevitable that he'll sweep London and Flushing and complete the calendar Grand Slam as well.
Standing in the way of Federer's c.v.-building efforts is the aforementioned combination of talent and drive, the nineteen-year-old Mallorcan prodigy Rafael Nadal. He had one of the finest seasons I've ever seen last year, absolutely destroying the field on clay, winning Roland Garros, winning over Agassi in Montreal and over Ljubicic in Madrid. He's now won a record 54 matches on clay without a loss. Not only does Nadal's astonishing effort level intimidate opponents, but he is surprisingly skilled, a bulldog with the delicacy of a fox. You can see him break opponents' spirits over the course of matches, endlessly prolonging rallies with amazing 'gets,' or retrievals, which he somehow manages to flick into offensive shots rather than desperate lobs. When behind, he plays even better until he catches up. His rippling physique and indefatigable, undying intensity make him literally scary to face on clay. And yet, when off the court, he is a personable and kind presence at this stage of his young life. All in all, a player this brutal has no business being this likable, but there it, and he, is.
Nadal and Federer have played six times: Nadal has won five, and held a huge lead in the other before wilting on a hardcourt. Let me underline here just how anomalous this state of affairs is: here we have the world number one on a historic run of victories, and yet he cannot beat number two. Federer has lost his last three matches with Nadal; against all other players, he has lost three of his last one hundred and nineteen matches. Rafa is the only player on whom Federer cannot impose his will; indeed, Federer must try to end points quickly against Nadal to avoid being imposed upon. In the final at Rome two weeks ago, Federer unveiled a new strategy, coming in to net whenever the opportunity arose, though not directly following his serve. Federer's flexibility, his ability to adopt new tactics, made for a delicious and breathtaking final, in which he led 4-1 in the fifth and final set, and held two match points at 5-4. Here Nadal's hypnotic retrieving unnerved him once again, and two errors sent the match to a fifth-set tiebreaker. In a microcosmic repetition, Federer again led (5-3 and serving) and again let the lead slip away. Nadal, after a full five hours, took the title and reconfirmed his psychological edge, even over the most dominant player of the last twenty years. His confidence will be nearly unimpeachable, while Federer's will be shaken by losing a match in which he played the best clay-court tennis of his life. If, as expected, they play again in the final of Roland Garros, for all the marbles, you're going to see the most anticipated tennis match in several years.
(Note: I have gone on for way too long without handicapping the women's field, for which I apologize. I'll just say here that I am hopeful that France's glorious all-court player, Amelie Mauresmo, will win.)
See All Dispatches.
Perceptions: of landscape
Sughra Raza. Inner Pain-ting. 2000.
Acrylic on canvas, 24" x 24".
Selected Minor Works: Why We Do Not Eat Our Dead
Justin E. H. Smith
[An extensive archive of Justin Smith's writing is now online at www.jehsmith.com]
Now that an "extreme" cookbook has hit the shelves offering, among other things, recipes for human flesh (Gastronaut, Stefan Gates, Harcourt, 257 pages; paperback, $14), perhaps our gross-out, jack-ass culture has reached the point where it is necessary to explain why these must remain untried.
I will take it for granted that we all agree murder is wrong. But this alone is no argument against anthropophagy, for people die all the time, and current practice is to let their remains simply go to waste. Why not take advantage of the protein-rich corpses of our fallen comrades or our beloved elderly relatives who have, as they say, "passed"? Surely this would not be to harm them or to violate their integrity, since the morally relevant being has already departed or (depending on your view of things) vanished, and what's left will have its integrity stolen soon enough by flame or earth. Our dearly departed clearly have no objections to such a fate: they are dead, after all. Could we not then imagine a culture in which cannibalizing our dead were perfectly acceptable, perhaps even a way of honoring those we loved?
The fact that we do not eat our dead, in spite of their manifest indifference, has been duly noted by some participants in the animal-rights debate. They think this reveals that whatever moral reasoning goes into our decisions about what sort of creature may be eaten and what must be left alone, it simply is not, for most of us, the potential suffering of the creature that makes the moral difference. Whereas Peter Singer believes that we should stop eating animals because they are capable of suffering, others have responded that this is beside the point, since we also make humans suffer in multifarious ways. We just don't eat them.
But again, why not? Some moral philosophers have argued that the prohibition has to do with respect for the memory of the deceased, but this can't get to the heart of it, since there's no obvious reason why eating a creature is disrespectful to it.
It may be that the answer is simply that, as a species, we are carrion-avoiders. After all, it is not just the vegetarian who will not eat a cow struck by lightning, but the carnivore as well. Put another way: we do not eat fallen humans, but we also do not eat fallen animals; we eat slaughtered animals. It is then perhaps not so much the fact that dead humans are (or were) human that prevents us from eating them, but the fact that they are carrion, and that we, as a species, are not scavengers.
Consider in this connection the Islamic Shariah laws that one must follow if one wishes to eat a camel that has fallen down a well (I turn here to the version of the rules as stated by the Grand Ayatollah Sistani): "[If the camel] falls down into a well and one feels that it will die there and it will not be possible to slaughter it according to Shariah, one should inflict a severe wound on any part of its body, so that it dies as a result of that wound. Then it becomes… halal to eat."
Now, why is it considered so important to inflict a fatal wound before the camel dies as a result of its fall? Though this is but one culture's rule, it seems to be the expression of a widespread prohibition on eating accidentally dead animals. In the case of the camel, we have an animal that is about to die from an accident, and the instruction is: if you want to eat it, you had better hurry up and kill it before it dies! This suggests that people do not slaughter simply so that a creature will be dead, but rather so that it will be dead in a certain way. Relatedly, in the southern United States, roadkill cookbooks are sold in souvenir shops as novelty items, and the novelty consists precisely in the fact that tourists are revolted and amused by the thought of the locals scavenging like vultures.
Of course, human beings do in fact eat other human beings, just not those dead of natural or accidental causes. Some decades ago, the reality of cannibalism was a matter of controversy. In his influential 1980 book, The Man-Eating Myth: Anthropology and Anthropophagy, the social anthropologist William Arens argued that stories of cannibal tribes were nothing more than racist, imperialist fantasies. Recently, though, substantial empirical evidence has accumulated for the relative frequency of cannibalism in premodern societies. Notable among this work is Tim White's archaeological study of anthropophagy among the Anasazi of southwestern Colorado in the twelfth century. More recently, Simon Mead and a team of researchers have made the case, on the basis of genetic analysis, that epidemics of prion diseases plagued prehistoric humans and were spread through cannibalistic feasting, in much the same way that BSE spreads among cattle.
In the modern era, frequent reports of cannibalism connected with both warfare and traditional medicine come from both natives and visitors in sub-Saharan Africa. Daniel Bergner reported in the New York Times that "in May, two United Nations military observers stationed in northeastern Congo at an outpost near Bunia, a town not far from Beni, were killed by a local tribal militia. The peacekeepers' bodies were split open and their hearts, livers and testicles taken – common signs of cannibalism." One of Bergner's informants, a Nande tribesman, recounts what happened when he was taken prisoner by soldiers from the Movement for the Liberation of Congo:
"One of his squad hacked up the body. The commander gave Kakule [the informant] his knife, told him to pare the skin from an arm, a leg. He told Kakule and his other assistant to build a fire. From their satchels, the soldiers brought cassava bread. They sat in a circle. The commander placed the dead man's head at the center. He forced the two loggers to sit with them, to eat with them the pieces of boiled limb. The grilled liver, tongue and genitals had already been parceled out among the commander and his troops."
Bergner notes that it is a widespread and commonly acknowledged belief in the region that eating the flesh, and especially the organs, of one's enemy is a way to enhance one's own power. This practice is sufficiently documented to have been accepted as fact by both the U.N. high commissioner for human rights and Amnesty International.
Cannibalism has been observed in over seventy mammal species, including chimpanzees. The hypothesis that cannibalism is common to all carnivorous species, or that it is something of which all carnivores are capable under certain circumstances, does not seem implausible. If one were to argue that these recent reports are fabrications, and that cannibalism's modern disappearance in our own species has something to do with ethical progress, surely sufficient counterevidence could be produced from other, even better documented practices to quickly convince all concerned that no progress has been made.
The evidence suggests that, when cannibalism does happen, it is never the result of the fortuitous death of a comrade and the simple need among his survivors for protein. Rather, it follows upon the slaughtering of humans, which is exactly what we would expect, given the human preference for slaughtered pigs and cows over lightning-struck ones. Where eating animals is permitted, there is slaughter. And where slaughtering humans is permitted, the general prohibition on eating them does not necessarily hold.
In short, eating human beings is wrong because murder is wrong, and there's no way to get edible meat but by slaughtering it. I suppose Stefan Gates could look for a "donor" who would, in case of an untimely death (a car accident, say), dedicate his body to pushing the limits of experimental gastronomy. But if the cook fails to find any willing diners, this may have much more to do with our distaste for roadkill than with respect for the memory of a fellow human.
Monday Musing: Frederica Krueger, Killing Machine
A couple of months ago, my wife Margit's friend Bailey asked us to look after her cat (really just a kitten) while she was going to be out of town for about ten days. It was decided that the cat would just stay with us during that time. Bailey had only recently found the cat cowering in her basement, half-starved and probably living on the occasional mouse or whatever insects or other small creatures she could find. Bailey hadn't got around to naming the cat yet, and not wishing to prematurely thrust a real name upon her, we just called her Catty while she stayed with us. We thought she must be about six months old at that time, but she was quite tiny. Catty, to put it kindly, turned out to be a more ferociously mischievous cat than I had ever seen before. She did not like to be petted, and shunned all forms of affection. This, however, should by no means lead you to infer that our interactions with Catty were limited or sparse. Not at all: we were continuously stalked and hunted by her. I may not know what it is like to be a bat, but thanks to Catty, I have a pretty good idea what it is like to be an antelope in the Serengeti! [Photo shows Catty when she first came to stay with us.]
Catty wanted to do nothing but eat and hunt. Any movement or sound would send her into a crouching-tiger position, ears pinned back, tail twitching. Though she is very fast, her real weapon is stealth. (Yes, she is quite the hidden dragon, as well.) I'll be watching TV or reading when, quite suddenly, I become barely aware of a grayish blur flying through the air toward me from the most unexpected place, with just enough time to instinctively close my eyes before she swats me with a claw. After various attacks on Margit and me, which we were completely helpless to prevent and which left us mauled with scratches everywhere (and I had been worried about cat hair on my clothes making me look bad!), Margit took her to a vet to have her very sharp nails trimmed (we did not have her declawed, which seemed too cruel and irreversible). The vet asked Margit for a name to register her under, and Catty immediately tried to kill him for his impertinence. While he bandaged his injuries, Margit decided to officially name the little slasher Frederica Krueger, thereby openly acknowledging and perhaps even honoring her ineluctably murderous nature. We started calling her Freddy.
Here's the funny thing: despite her fiercely feral, violent tendencies, Freddy was just so beautiful that I fell in love with her. To echo Nabokov's Humbert Humbert speaking about another famous pubescent nymphet: Ladies and Gentlemen of the Jury, it was she who seduced me! As Freddy got more used to us, it was as if she could not decide whether to try and eat us, or be nice. She started oscillating between the two modes, attacking and then affectionately licking my hand, then attacking again... But it was precisely the graceful, lean, single-minded perfection of her design as a killing machine that I could not resist. Like a Ferrari (only much more impressive), she was clearly built for one thing only, and therein lay her seductive power. (Okay, I admit it, I've always liked cats. The photo here shows me sitting on a chimney on the roof of our house in Islamabad in the late 60s with my cat Lord Jim.)
We mostly read whatever psychological intentions we want (and can) into our pets, imputing all sorts of beliefs and desires from our own psychological economies to them, and this works particularly well to the advantage of cats. They are just intelligent enough to get our attention as intentional agents (unlike, say, a goldfish, or even a hamster, which seem barely more than the automatons Descartes imagined all animals except humans to be), but the fact that they are very mentally rigid and cannot learn much makes them seem imperious, haughty, independent, and noble to us, unlike dogs, who are much more flexible in intelligence and can learn to obey commands and do many tricks to please us. Let's be blunt: cats are quite stupid. But to be fair, maybe much of the nobility we read into some humans is also the result of their rigidity. Who knows. In any case, cats are such monomaniacally hardwired hunters that it is impossible not to admire their relentless pursuit of prey, even if (in my case!) that prey is us. Since, like many gods of the ancients, cats are mostly oblivious to human wishes and impossible to control, it is no surprise that some ancient peoples held them to be gods.
In ancient Egypt cats were considered deities as early as 3000 BCE, and later there existed the cult of the goddess Bast, who was originally depicted as a woman with the head of a lioness but soon changed to an unmistakably domestic cat. Since cats were considered sacred, they were also mummified. Herodotus reports that when an Egyptian cat died, the members of the household that owned it would shave their eyebrows in mourning. Killing a cat, even accidentally, was a capital crime. The cult of Bast was officially banned in 390 CE, but reverence for cats continued. Another Greek historian, Diodorus Siculus, relates an incident from about 60 BCE in which the wheels of a Roman chariot accidentally crushed an Egyptian cat; an outraged mob immediately killed the soldier driving the chariot.
The domestic cat was named Felis catus by Linnaeus and, like the dog, belongs to the order Carnivora. Not all carnivores are in this order (even some spiders are carnivores, after all), and not all members of the Carnivora are carnivores: the panda, for example. Other members of this order are bears, weasels, hyenas, seals, walruses, and so on. Like our own, the ancestors of the modern domestic cat came from East Africa. Cats were probably initially allowed or encouraged to live near human settlements because they are great for pest control, especially in agricultural settings with grain storage, and the arrangement also afforded cats protection from larger predators, who stayed away from humans for the most part. Even now, cats will hunt more than a thousand species of small animals. Domestic cats, if left in the wild, will form colonies, and, by the way, a group of cats is known as a clowder. (Be sure to throw that into your next cocktail party conversation.)
It took even physicists a while to figure out how a cat always lands on its feet, an ability known as the "righting reflex." The problem is that in mid-air there is nothing to push off against to change your orientation (imagine being suspended in space outside a rocket and trying to rotate). So how do they do it? The answer is actually quite technical and has to do with something called a phase shift. (Think of a spinning figure skater speeding up or slowing down her rate of rotation by drawing her arms in or holding them out.) What the cat does is first put its legs out and rotate the front half of its body in one direction and the back half in the opposite direction (a twisting motion); then it draws its legs in and twists in the opposite direction. Because angular momentum must be conserved, and angular momentum depends on the radial distance of mass from the axis of rotation, the cat rotates back less this time, thereby achieving a net rotation in the direction of the first twist. If you don't get it, don't worry about it!
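For the curious, the bookkeeping can be made explicit with the standard two-cylinder toy model of the falling cat (the numbers below are illustrative assumptions, not feline measurements). Treat the front and back halves as two bodies twisting against each other about the spine. Starting from zero angular momentum about that axis,

$$
I_f\,\omega_f + I_b\,\omega_b = 0
\quad\Longrightarrow\quad
\Delta\theta_b = -\frac{I_f}{I_b}\,\Delta\theta_f ,
$$

so whichever half currently has the smaller moment of inertia (limbs tucked) turns through a large angle while the other half (limbs extended) counter-rotates through a small one. Swapping the tucked and extended roles and twisting back then cancels the small rotations rather than the large ones, leaving a net rotation per tuck-and-twist cycle of

$$
\Delta\theta_{\text{net}} = \varphi\left(1 - \frac{I_{\text{small}}}{I_{\text{large}}}\right),
$$

where $\varphi$ is the twist angle of each phase. With, say, $\varphi = 90^\circ$ and a tucked-to-extended inertia ratio of 1:4, the cat nets about $67^\circ$ per cycle, so a couple of cycles suffice to come down feet first.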
Cats appear frequently in fiction and writers seem to have a particular predilection for them. Ernest Hemingway and Mark Twain were serial cat-owners. Hemingway at various times had cats named Alley Cat, Boise, Crazy Christian, Dillinger, Ecstasy, F. Puss, Fats, Friendless Brother, Furhouse, Pilar, Skunk, Thruster, Whitehead, and Willy. Twain's cats were Appolinaris, Beelzebub, Blatherskite, Buffalo Bill, Satan, Sin, Sour Mash, Tammany, and Zoroaster. Meanwhile, Theodore Roosevelt's cat Tom Quartz was named for a cat in Mark Twain's Roughing It. T.S. Eliot owned cats named Tantomile, Noilly Prat, Wiscus, Pettipaws, and George Pushdragon. William and Williamina both belonged to Charles Dickens.
Lord Byron and Jorge Luis Borges both had cats named Beppo. (Byron travelled accompanied by five cats.) Edgar Allan Poe had Catarina; Raymond Chandler, Taki. Kingsley Amis's cat was Sara Snow. Some cats were, of course, named for famous people as well as owned by them, such as Gloria Steinem's Magritte and Anatole France's Pascal. John Lennon was the proud owner of Elvis. John Kenneth Galbraith was forced to change his cat's name from Ahmedabad to Gujarat after he became the U.S. ambassador to India, because Muslims were offended by "Ahmed" (one of Mohammad's names) being associated with a cat. Mohammad himself, according to a report (hadith) attributed to Abu Huraira, owned a cat named Muezza, about whom it is said that one day, while she was asleep on the sleeve of Mohammad's robe, the call to prayer was sounded. Rather than awaken the cat, Mohammad quietly cut his sleeve off and left. When he returned, the cat bowed to him and thanked him, after which she was guaranteed a place in heaven.
Isaac Newton not only loved cats but is also said (probably apocryphally) to be the inventor of the "cat flap," allowing his cats to come and go as they pleased. (Wonder how long a break he had to take from inventing, say, calculus, to do that.) And by the way, among famous cat haters can be counted such luminaries as Genghis Khan, Alexander the Great, Julius Caesar, Napoleon Bonaparte, Benito Mussolini, and last but not least, Adolf Hitler. What is it about cat-hating that basically turns one into a Dr. Evil? But wait, Dr. Evil likes cats!
Okay, enough random blather. Back to Ms. Frederica Krueger's story: as the moment of Bailey's return from her trip and the time for Freddy to leave us approached, I grew more and more agitated, finally threatening Margit that I would kidnap the cat and run away with her unless she did something to stop Bailey from coming to pick up her cat. At first Margit tried to tell me that we could get another cat, which only made me regress further and throw a tantrum, yelling, "I don't want another cat! I only want this cat!" At this point, Margit told me I had finally cracked up completely and advised me to call a shrink. Bailey was coming to get the cat early the next morning. I went to bed late, as I often do, and was still asleep when Margit awakened me to say that Bailey had agreed to let us have the cat, as it seemed very happy here, and Bailey's apartment was really too small anyway. Thus Frederica became ours, and we remain her willing and ever-anxious prey.
Freddy's Photo Gallery
Here are some glamour and action shots of Ms. Frederica Krueger, which you can click to enlarge. Captions are below the photos:
- I catch Freddy suddenly pouncing on an unsuspecting Margit's hand from behind our living room sofa (a favorite place of hers from which to launch her demonic attacks). Her eyes reflect the light from the camera flash because of a mirror-like layer behind her retinas called the tapetum. Nocturnal animals have this reflective surface to bounce photons back toward the photosensitive cells of the retina, thereby almost doubling the chance that they will be registered and greatly improving the animal's night vision. The daytime vision of cats is not as good as that of humans, however.
- She is striking a deceptively demure pose. Don't let it fool you. I have paid dearly for that mistake. In blood.
- Freddy loves this incredibly silly toy, which is basically just a little felt mouse that goes around and around, driven by a battery-powered motor. She spends inordinate amounts of time and energy trying to slay this patently fake rodent.
- Freddy has a habit of sitting on various bookshelves in the apartment, usually at a greater height than in this picture, surveying the scene below, much like a vulture.
- Margit too-bravely holds Freddy in her lap; the cat is only milliseconds away from trying to shred Margit's hands with the claws of her powerful rear legs.
- If you didn't believe me when I said that often all I see is a grayish blur flying at me, have a good look at this picture (enlarge it by clicking on it) taken at 1/8th of a second shutter speed. Freddy is jumping from a lower bookshelf to the shelf above the stereo on the right, so she can climb to even higher shelves along that wall.
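Since I invoked the tapetum above, here is the "almost doubling" made explicit, as a little Python sketch. The per-pass absorption probabilities are made-up numbers purely for illustration (I don't know the real ones); the only point is that giving missed photons a second, reflected pass turns a detection probability p into 1 - (1 - p)², which is close to 2p when p is small:

    # Toy illustration (my numbers, not from any source): if a photon
    # is absorbed with probability p on one pass through the retina's
    # photoreceptors, a reflective tapetum gives the misses a second
    # pass, raising the detection probability to 1 - (1 - p)**2.

    def detection_probability(p: float, passes: int = 2) -> float:
        """Chance a photon is absorbed in up to `passes` trips
        through the photoreceptor layer (p = per-pass absorption)."""
        return 1.0 - (1.0 - p) ** passes

    for p in (0.05, 0.10, 0.20):
        one = detection_probability(p, passes=1)
        two = detection_probability(p, passes=2)
        print(f"p={p:.2f}: one pass {one:.3f}, with tapetum {two:.3f} "
              f"(gain x{two/one:.2f})")

For small p the gain hovers just under 2, which is exactly the "almost" in "almost doubling."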
Have a good week! My other Monday Musing columns can be seen here.
May 28, 2006
THE ECONOMICS OF CONSERVATION
"How economists and climatologists deal with uncertainty...and each other."
Dave Munger in Seed Magazine:
People across the nation are socking it to state gas tax revenues by buying energy-efficient cars, making it more difficult for states to pay for road maintenance. Legislators from Oregon estimate that, thanks to all those hybrids, the state's gas tax revenues will begin to decline by 2014; they may therefore replace the current gas tax with a mileage tax.
Most climatologists agree that curbing greenhouse gas emissions and fighting global warming will require that we build more energy-efficient cars and homes. Yet some of these choices are still not cost-effective. Even as gas prices climb past $3 per gallon, filling the tank on a standard-engine economy car is still cheaper than plunking down the extra money for a $22,000 Toyota Prius. (Over the long term, however, a Prius requires only a $2.28 gas price to recoup its cost premium over an $18,000 Camry.)
Economists have called for incentives to force conservation, such as increasing gas taxes to promote moves to more efficient cars or providing subsidies for installing solar water heaters. But when these incentives actually work, they can deplete tax revenue streams, creating a disincentive for the state to continue the incentive. And increased taxes can be unpopular, which is why Oregon is now considering alternatives to a gas tax.
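The article's $2.28 break-even figure is easy to sanity-check with a few lines of Python. The $4,000 price premium comes from the article; the mileage inputs (46 mpg for the Prius, 26 mpg for the Camry, 105,000 lifetime miles) are my own guesses chosen only to show the shape of the arithmetic, not Seed's actual assumptions:

    # Rough break-even sketch (mileage assumptions are mine, not Seed's):
    # at what gas price do lifetime fuel savings repay the hybrid's premium?

    def breakeven_gas_price(premium, mpg_hybrid, mpg_standard, lifetime_miles):
        """Gas price ($/gallon) at which fuel savings equal the premium."""
        gallons_saved = lifetime_miles * (1 / mpg_standard - 1 / mpg_hybrid)
        return premium / gallons_saved

    price = breakeven_gas_price(
        premium=22000 - 18000,  # $22,000 Prius vs. $18,000 Camry (from the article)
        mpg_hybrid=46,          # illustrative guess
        mpg_standard=26,        # illustrative guess
        lifetime_miles=105000,  # illustrative guess
    )
    print(f"Break-even gas price: ${price:.2f}/gallon")  # ~ $2.28

With those inputs the break-even lands right around $2.28 a gallon, though different mileage or lifetime assumptions move the number considerably.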
"Seamus Heaney published his first collection when he was 27, he won the Nobel Prize when he was 56 and his 12th book of poetry came out this spring. He talks to James Campbell about growing up on a farm in County Derry, politics and his current project, inspired by a 15th-century Scots poet."
From The Guardian:
In 1977, Seamus Heaney visited Hugh MacDiarmid at his home in the Scottish borders, when the great poet and controversialist was in the final phase of life. MacDiarmid had been overlooked by the curators of English literature: compiling the Oxford Book of English Verse, Philip Larkin asked a friend if there was "any bit of MacD that's noticeably less morally repugnant and aesthetically null than the rest?" Heaney, who has always felt at home with Scots vernacular, takes a different line. "I always said that when I met MacDiarmid, I had met a great poet who said 'Och'. I felt confirmed. You can draw a line from maybe Dundalk across England, north of which you say 'Och', south of which you say 'Well, dearie me'. In that monosyllable, there's a world view, nearly."
In a literary career that spans 40 years, Heaney's appointed subject matter has been largely extra-curricular: Irish nationalism, "Orange Drums", the sod and silage of his father's 45-acre farm at Mossbawn, County Derry. In 1999, he took the Anglo-Saxon poem Beowulf and hammered it into a weathered English, which sold in astounding quantities and won the Whitbread Book of the Year Award. However, it is "the twang of the Scottish tongue", audible throughout his Derry childhood, particularly "over the Bann in County Antrim", that has given him his current project, a modern English account of the work of the 15th-century Scottish makar Robert Henryson.
PRIVATE JIHAD: How Rita Katz got into the spying business
Benjamin Wallace-Wells in The New Yorker:
Rita Katz is tiny and dark, with volatile brown eyes, and when she is nervous or excited she can’t sit still. She speaks in torrents, ten minutes at a stretch. Everybody who works in intelligence calls her Rita, even people who don’t know her well. She sometimes telephones people she hasn’t met—important people in the government—to tell them things that she thinks they ought to know. She keeps copies of letters from officials whose investigations into terrorism she has assisted. “You and your staff . . . were invaluable additions to the investigative team,” the special agent in charge of the F.B.I.’s Salt Lake City Division wrote; the Assistant U.S. Attorney in Boise said, “You are a rare and extraordinary gem that has appeared too infrequently throughout the course of history.” The letters come in handy, she told me, when she meets with skepticism or lack of interest; they are her establishment bona fides.
Katz, who was born in Iraq and speaks fluent Arabic, spends hours each day monitoring the password-protected online chat rooms in which Islamic terrorists discuss politics and trade tips: how to disperse botulinum toxin or transfer funds, which suicide vests work best.
Chicken and egg debate unscrambled
Now a team made up of a geneticist, philosopher and chicken farmer claim to have found an answer. It was the egg.
Put simply, the reason is down to the fact that genetic material does not change during an animal's life.
Therefore the first bird that evolved into what we would call a chicken, probably in prehistoric times, must have first existed as an embryo inside an egg.
Professor John Brookfield, a specialist in evolutionary genetics at the University of Nottingham, told the UK Press Association the pecking order was clear.
The Lolita Question
Cynthia Haven in Stanford Magazine:
Biographers argue that Lolita’s infamous narrator, the self-deluding Humbert, was inspired in part by the man who started Stanford’s Slavic department, Professor Henry Lanz. While the portrait is hardly flattering, it should be remembered that Lolita is a work of fiction that reflects many influences (see sidebar).
Whatever inspiration Nabokov drew from the cosmopolitan man who became his chess companion that summer, he owed Lanz an enormous debt: the professor paid for Nabokov’s appointment out of his own pocket, forfeiting his summer salary to back the Russian novelist, a complete unknown in America. Nabokov told his biographer Andrew Field that he considered this job his “first success.”
Nabokov needed the break desperately. Russia had banned his writings as “anti-Soviet.” Living in Berlin with his Jewish wife, Véra, from 1922 to 1937, he wrote in Russian under the name Vladimir Sirin. (The Hoover Institution archives preserve a sampling of Sirin’s numerous rejection slips for English editions of his books.) After Berlin, they lived in poverty if not near-starvation in Paris, the more conventional haunt of Russian émigrés. They left for New York a few weeks before the Nazi tanks rolled in and moved into a seedy little flat with their 6-year-old son, Dmitri.
So the Stanford appointment was manna and the westward journey a portal into another world.
More here. [Photo shows house Nabokov lived in while in Palo Alto.]
Misrepresentations Contra Misrepresentation
Also in Against the Current, Purnima Bose on the fight over representations of Hinduism in California textbooks.
Two organizations with ties to militant Hindu nationalist groups in India, the Hindu Educational Foundation (HEF) and the Vedic Foundation (VF), complained vociferously that the textbooks' representations of Hinduism and ancient Indian history were demeaning and stereotypical...
Were the parent organizations of the HEF and VF not downright scary, their understanding of history and Hinduism might be comical. The first entry under "resources" on the HEF's website, for instance, leads to a page called, "A Tribute to Hinduism." Quoting everyone from Carl Sagan to Fritjof Capra and Robert Oppenheimer, the site asserts that ancient India had everything from supersonic airplanes to electric trains to nuclear weapons.
This site also boasts that while the Aryans made it to the moon, ancient India could claim the distinction of being the only destination in the world for UFOs. Scientific-minded readers can be assured that "Vedic technology does not resemble our world of nuts and bolts, or even microchips. Mystic power, especially manifest as sonic vibration plays a major role. The right sound—vibrated as a mantra, can launch terrible weapons, directly kill, summon beings from other realms, or even create exotic aircraft."
Equally wacky is the VF's chronology of Indian history and Hinduism. According to this group, the "Hindu religion was first revealed 111.52 trillion years ago" (before the Big Bang, apparently). Hinduism appears prior to Indian history, which is dated as "1972 million years ago" (roughly 1.7 billion years before the dinosaurs).
A Debate on Withdrawal from Iraq
The sentiment "If I go there will be trouble/Si me voy - va a haber peligro/And if I stay it will be double/Si me quedo es doble," in the words of the Clash, haunts debates about Iraq, with disagreements about how "go", "trouble", "stay", and "double" pair up. In Against the Current, three views on the merits and dangers of a US withdrawal from Iraq. One pro-withdrawal view:
[Susan Weissman]: There's this sense that if the United States were to leave—now that the Ba'athists and Shi'ite militants are more organized than they were before, and that there's even splits within them with more radical elements within each sector, including the jihadists—that if there were even just redeployment or planned withdrawal, it would encourage them and all hell would break loose. And there's even the notion that maybe Turkey would invade, maybe Kuwait would try to reclaim...can you give us a kind of scenario of what you think could happen?
[Gilbert Achcar]: One could imagine and draw all kinds of apocalyptic scenarios, but there is apocalypse now, we are in the midst of it. And of course, it could get worse...but it is getting worse. It is getting worse day after day. And it has been proved very very obviously, very factually, that the longer the U.S. troops stay in that country the worse it is getting.
No one can dispute that since day one of the invasion up until now the situation has steadily worsened—look at all the figures, it's absolutely terrible. The idea that the United States should stay there even longer to prevent it from deteriorating is completely absurd. It's clear, it has been tried and tried and over-tried, and the conclusion is clear, the U.S. troops should get out of that country if that country is ever to recover.
Now, I'm not saying that it'll be paradise as soon as U.S. troops get out, that's not the point. We, the antiwar movement, were the people who were saying that if the invasion took place, it would lead to chaos. We were saying that during all the long period before the invasion. The invasion took place, and exactly what we predicted happened. It led to a chaotic situation, a very dangerous situation.
Remember the Titans
From The Washington Post:
Benjamin Franklin -- the subject of one of the essays in this stimulating new collection -- once said that "Historians relate, not so much what is done, as what they would have believed." Most historians would agree with that gently cynical proposition, though they would wish to add a proviso that interpretations of the past should always rest on evidence -- on what was "done," as Franklin said. Among historians in universities these days, essays often tilt toward sheer interpretation, leaving the substance of the past scanted. Gordon S. Wood's book bucks that trend, offering a good deal of empirical evidence -- what was "done" -- in these absorbing essays from one of our leading scholars of the American Revolution.
Eight of the 10 chapters of Revolutionary Characters are biographical, featuring Washington, Franklin, Jefferson, Hamilton, Madison, John Adams, Thomas Paine and Aaron Burr. The founders are often considered as a group, as indeed they are here, and widely admired as being "different" (the key word in Wood's subtitle) from our current leaders in their commitment to enlightened principles. Looking at the founders together, it is hard not to conclude that though they deserve our admiration, they may not have constituted the group we have imagined. Certainly, they acted at times as if they had nothing in common.
Stomach bug makes food yield more calories
Scientists have identified a key microbe in our guts that helps us glean more calories from food. The discovery backs the idea that the type of microbes in our gut help to determine how much weight we gain, and that seeding the intestine with particular bugs could help fight obesity.
Samuel Buck of Washington University in St Louis, Missouri, and his colleagues focused on one microbe called Methanobrevibacter smithii, which is effectively a waste-removal bug. It eats up the hydrogen and waste products released by other microbes, and converts them into the methane that escapes from our rear ends every day. "It's a minor component of the gut flora with a major impact," Buck says tactfully.
M. smithii may have a dirty job, but Buck and his colleagues have now shown that it is a vital one. The researchers found that mice with a hefty dose of M. smithii in their guts are fatter than those without the microbe.
The discovery suggests that calorie counts on food labels could be misleading, because different people may glean a different number of calories from an identical banana or cheeseburger, based on the individual mix of microbes in their gut.
Anthony Bourdain's "Nasty Bits"
Bruce Handy in the New York Times:
It's often easy to forget, when absorbing some great work of art, the extent to which the creative process is kept afloat not just by genius but also by dumb luck, desperation and sweat. This is true of great food as well. Sitting down to an expensive dinner at Per Se or Babbo, we might like to imagine that our entree was pulled fully formed from Thomas Keller's or Mario Batali's toque as if by magic — immaculate confection. But the reality of restaurant cooking is much uglier, at least if Anthony Bourdain is to be believed. He is the executive chef at Les Halles, the French steakhouse on Park Avenue South, and also the author of seven previous books, including the best-selling memoir "Kitchen Confidential: Adventures in the Culinary Underbelly." Published in 2000, this was a "You'll Never Eat Lunch in This Town Again" for the restaurant trade, famous for the chapter "From Our Kitchen to Your Table," which originally appeared in The New Yorker. It explained why you should never order fish on Monday (your snapper special has most likely been sitting around since Thursday, owing to the quirks of fishmongers' schedules) and why your basket of bread has probably been recycled from another table (an easy shortcut for overworked busboys). More alarming still is the reason Bourdain gave for why the pros never order swordfish: "those three-foot-long parasitic worms that riddle the fish's flesh." In other words, you'll never eat lunch in any town again.
"The Nasty Bits," mainly a catchall of Bourdain's magazine and newspaper writing, offers more in this vein: "Fast well-done steak? I've watched French grads of three-star kitchens squeeze the blood out of filet mignons with their full body weight, turning a medium to well in seconds. I've watched in horror as chefs have hurled beautiful chateaubriands into the deep-fat fryer, microwaved veal chops, thinned sauce with the brackish greasy water in the steam table. And when it gets busy? Everything that falls on the floor, amazingly, falls 'right on the napkin.' Let me tell you — that's one mighty big napkin."
As they say, you don't want to see how the sausage is made.
May 27, 2006
Mysteries still surround Egyptian chamber
Is it a royal Egyptian tomb, a glorified supply room for ancient embalmers, or something in between? A year after the discovery of a chamber that had lain hidden in the Valley of the Kings for millennia, archaeologists are still asking themselves exactly what they've found. "Until we examine each coffin to some extent, we can’t draw a conclusion," University of Memphis archaeologist Otto Schaden told MSNBC.com. "We can draw one, but it might be wrong."
Schaden spoke via telephone from the Valley of the Kings, where he and his colleagues are continuing to remove artifacts from the chamber, including jars of mummification materials and the coffins labeled A through G. Experts wondered whether the chamber might have contained royal mummies that were brought in from less secure sites to protect them from ancient grave robbers.
What Mind–Body Problem?
"Understanding consciousness may be easier than we thought."
Alex Byrne in the Boston Review:
Here is a remarkable fact. When atoms and molecules are organized in a suitably complicated way, the result is something that perceives, knows, believes, desires, fears, feels pain, and so on—in other words, an organism with a psychology. Besides ourselves, who else is in the club? Descartes notoriously claimed that other animals were merely unthinking bits of clockwork, but that is an extreme position. Probably cockroaches don’t have much of a mental life, if they have one at all, but few would harbor doubts about monkeys, apes, cats, and dogs. Indeed, there is a flourishing discipline at the intersection of biology and psychology—cognitive ethology—devoted to the study of the mental and social lives of nonhuman animals. Somehow, minds emerge from matter. And so, of course, do the weather, digestion, photosynthesis, and glaciation. But although some everyday nonmental phenomena remain poorly understood—apparently the jury is still out on the explanation of why ice is slippery—the connection between minds and matter is supposed to be especially mystifying. Why so?
In the famous 1974 article “What Is It Like to Be a Bat?” the philosopher Thomas Nagel fingered consciousness as the culprit. “Without consciousness,” he wrote, “the mind–body problem would be much less interesting. With consciousness it seems hopeless.” And consciousness has had philosophers hot and bothered ever since. Daniel Dennett published a book called, rather optimistically, Consciousness Explained in 1991, and his fellow philosophers could hardly get into print fast enough to proclaim that Dennett had not explained consciousness at all. But before we get to the conundrum of consciousness, let’s start with an apparently easier part of the mind–body problem.
Eyewitness' blind spot
"A 1994 rape conviction not only altered N.J. court rules on eyewitness testimony, it raised questions of identifying people of another race."
Tom Avril in the Philadelphia Inquirer:
She had gotten a good look at him before and after the attack in her basement apartment, not far from the Rutgers University campus. At one point, their faces were just two feet apart. She'd never forget that face.
Then one April day on a New Brunswick street corner, more than seven months after the rape, she froze.
There he was. Strolling along with a boom box, walking with the same side-to-side swagger she remembered when the rapist left her apartment.
She ran to call the police. A few minutes later, they arrested the suspect, a black man named McKinley Cromedy.
The ensuing trial helped trigger an overhaul of the way New Jersey treats the oldest and most dramatic sort of courtroom evidence: an eyewitness pointing out the person who did it.
Cromedy's defense attorney took an unusual tack. He questioned her ability to tell black men apart, noting that she was white, that she grew up in an overwhelmingly white northern New Jersey suburb, that there were no black students in her high school class.
The victim was undeterred.
"It's just something you don't forget after what happens and everything," she told a jury of 11 whites and one black person. "It was him."
More than 6½ years later, science would prove her wrong.
If you need to pay for someone's help, why is it called "self-help"?
Michael Shermer in Scientific American:
In 1980 I attended a bicycle industry trade convention whose keynote speaker was Mark Victor Hansen, now well known as the coauthor of the wildly popular Chicken Soup for the Soul book series that includes the Teenage Soul, Prisoner's Soul and Christian Soul (but no Skeptic's Soul). I was surprised that Hansen didn't require a speaker's fee, until I saw what happened after his talk: people were lined up out the door to purchase his motivational tapes. I was one of them. I listened to those tapes over and over during training rides in preparation for bicycle races.
The "over and over" part is the key to understanding the "why" of what investigative journalist Steve Salerno calls the Self-Help and Actualization Movement (SHAM). In his recent book Sham: How the Self-Help Movement Made America Helpless (Crown Publishing Group, 2005), he explains how the talks and tapes offer a momentary boost of inspiration that fades after a few weeks, turning buyers into repeat customers. While Salerno was a self-help book editor for Rodale Press (whose motto at the time was "to show people how they can use the power of their bodies and minds to make their lives better"), extensive market surveys revealed that "the most likely customer for a book on any given topic was someone who had bought a similar book within the preceding eighteen months." The irony of "the eighteen-month rule" for this genre, Salerno says, is this: "If what we sold worked, one would expect lives to improve. One would not expect people to need further help from us--at least not in that same problem area, and certainly not time and time again."
THE STRANGE GENIUS OF OPRAH
Lee Siegel in The New Republic:
Now celebrating her twentieth year as the host of the world's most influential talk show, Oprah Winfrey is to television what Bach is to music, Giotto to painting, Joyce to literature. Time magazine hit the nail on the head when it recently voted her one of the world's handful of "leaders and revolutionaries." (Condoleezza Rice wrote Oprah's citation: "She has struggled with many of the challenges that we all face, and she has transformed her life. Her message is empowering: I did it, and so can you.") Like all seminal creative figures, her essential gift lies in her synthesizing power. She has taken the most consequential strands in modern life and woven them together into an hourlong show that is a work of art.
The boilerplate criticisms of Oprah--she exploits a culture of victimization that she did so much to create; she glamorizes misery; she amplifies already widespread narcissism and solipsism; she fills people's heads with hackneyed nostrums about life--are correct, up to a point. But that's not the whole story. Oprah's critics write as if her goal of extending to her audience empathy, consolation, and hope were intrinsically cheap and cynical. On the contrary: The question is whether that is really what she is offering.
AN EXCHANGE BETWEEN TWO GREAT POETS
John Felstiner in The New Republic:
"Perhaps I am one of the last who must live out to the end the destiny of the Jewish spirit in Europe." Why "must"? Writing from Paris in August 1948 to relatives in the new state of Israel, Paul Celan, having survived the "Final Solution," explains that a poet cannot stop writing, "even when he is a Jew and the language of his poems is German." This fateful pledge, from a brutally orphaned son whose stunning poem of 1945, "Deathfugue," intones, "Death is a master from Deutschland" and threads an ashen-haired Shulamith into the Hebrew Bible's Song of Songs, throws a raking light over a recently discovered exchange of letters between Celan and the Israeli poet Yehuda Amichai.
Born to German-speaking parents in Czernowitz, Bukovina, an eastern outpost of the Austrian Empire, Celan survived nineteen months of forced labor, eventually taking exile in Paris. There by hard degrees he became Europe's most challenging postwar poet.
More here. [Celan shown in photo.]
women and octopuses in compromising positions
"Lurid new covers for The Iliad, Little Women, and other classics..."
From Slate (click the link at left for slide-show):
Pulp fiction is perhaps the only genre as beloved for its cover art as for its prose. And rightly so: Classic pulp covers are glorious and garish, rich with saturated color and sexual innuendo. Rare is the cover girl who hasn't undone at least a few of her buttons. And so the images have endured, both in the popular imagination and in the countless online galleries that collect some of the greats. (There's even a site dedicated to the covers of "poulpe pulps," which feature women and octopuses in compromising positions.)
In the 1950s, some publishing houses opted to release literary fiction with pulp covers. A striking edition of The Sheltering Sky, for example, promised "a strange tale in the exotic desert"—a tagline that is, when you think about it, both pulpy and apt. Taking such efforts as our inspiration, we asked a handful of designers to create lurid new book jackets for classics from The Iliad to Animal Farm. Click here to see the results.
As India Considers Further Liberalization, A Debate on Capital Account Convertibility
Economic and Political Weekly (India) debates what is perhaps the most crucial step in unfettering the power of capital, capital account convertibility. Most of the pieces oppose convertibility or at least counsel delaying the move towards it; some are pro. L. Randall Wray offers the argument that capital controls are necessary for sovereignty, with reference to Argentina's disastrous experience with its currency board.
A nation like the US (as well as countries like Japan and Turkey, and Argentina after it abandoned the currency board) creates a currency for domestic use (and ensures its use primarily by demanding payment of taxes in that currency, although some go further by adopting legal tender laws). The government itself (including the treasury and the central bank – the Fed in the case of the US) issues and spends high-powered money (HPM – cash and reserves at the central bank) as its liability. The US government does not promise to convert its HPM to any other currency, nor to gold or any other commodity, at any fixed exchange rate. The flexible exchange rate is key to maintaining fiscal and currency independence – what I call sovereignty, although governmental sovereignty certainly has other dimensions as well.
But there is more to it than a flexible exchange rate. The sovereign government spends (buys goods, services, or assets, or makes transfer payments) by issuing a treasury cheque, or, increasingly, by simply crediting a private bank deposit. In either case, however, credit balances (HPM) are created when the central bank credits the reserve account of the receiving bank. Analogously, when the government receives tax payments, it reduces the reserve balance of a bank. Simultaneously, the taxpayer's bank deposit is debited.
While we commonly think of a government needing to first receive tax revenue, and then spending that revenue, this sequence is not necessary for any sovereign government. If a government spends by crediting a bank account (issuing its own IOU – HPM) and taxes by debiting a bank account (and eliminating its IOU – HPM), then it is not, as a matter of logic, "spending" tax revenue. In other words, with a floating exchange rate and a domestic currency, the sovereign government's ability to make payments is neither revenue-constrained nor reserve-constrained... This fundamentally simple point is difficult for some to grasp because we are used to thinking about government as if it were not sovereign.
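Wray's accounting becomes easier to follow if you write it out as ledger operations. Below is a minimal sketch in Python; it is entirely my own illustration of the passage, not anything from the article. Spending marks up a bank's reserve account at the central bank (creating HPM, the government's own IOU); taxing marks one down (extinguishing HPM); and nothing in the mechanics requires taxing to come first:

    # Toy ledger (my illustration of Wray's description, not his code):
    # the sovereign spends by crediting reserves (creating HPM) and
    # taxes by debiting them (destroying HPM).

    class CentralBank:
        def __init__(self):
            self.reserves = {}          # bank name -> reserve balance (HPM)
            self.hpm_outstanding = 0    # government IOUs in existence

        def government_spends(self, bank, amount):
            # Spending creates HPM: the bank's reserves are marked up.
            self.reserves[bank] = self.reserves.get(bank, 0) + amount
            self.hpm_outstanding += amount

        def government_taxes(self, bank, amount):
            # Taxing destroys HPM: the bank's reserves are marked down.
            self.reserves[bank] = self.reserves.get(bank, 0) - amount
            self.hpm_outstanding -= amount

    cb = CentralBank()
    cb.government_spends("First Bank", 100)   # no prior "revenue" required
    cb.government_taxes("First Bank", 40)
    print(cb.reserves, cb.hpm_outstanding)    # {'First Bank': 60} 60

The toy makes only the logical point: the spend-then-tax sequence is coherent because spending and taxing are just marks up and down on the government's own balance sheet. Real central-bank operations are, of course, vastly more involved.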
Experience and Authenticity
In The Nation, a review of Martin Jay's Songs of Experience: Modern American and European Variations on a Universal Theme.
[The] philosophical cult of experience arises from a sense that full engagement with existence has somehow been rendered problematic, whether by social, spiritual or economic arrangements or by the sheer perversity of the individual psyche. Authentic experience, from this view, seems always maddeningly just out of reach.
How could this assumption acquire such enduring force? How is it that "experience"--like its kin "reality" and "life"--could be split off from the self, rather than remaining the ground of being in which the self is embedded? How did something universal and inescapable become external to consciousness--an object of feverish speculation and hot pursuit among men and (far less often) women of ideas? Part of the answer must lie in the historical experience of the thinkers themselves--their awareness of the world outside their study windows. Martin Jay rarely glances at that world, though he can deftly dissect the shifting emphases in Kantian aesthetics or Deweyan ethics.
What we have in Jay's Songs of Experience is a shining example of the history of ideas, an underrated genre of the historian's art. An exceptionally learned, humane and prolific practitioner of his craft, Jay is among our most reliable guides through the key sites of twentieth-century social thought, from the labyrinths of Western Marxism to the thickets of French post-structuralism. Songs of Experience is a worthy addition to this oeuvre, though its history-of-ideas form sometimes seems ironically at odds with its content.
Euston, We Have a Problem (or at least they do over at Counterpunch)
Here, at 3QD, we're divided over what to make of and where we stand on the Euston Manifesto (not that personal opinions in and of themselves matter, unlike sound reasons). But many of us are interested in the manifesto, at least insofar as it fights over what the "Left" is about. Hence our mild fixation on it. Here is one anti-manifesto view, expectedly, in Counterpunch, in what can be called, er, the Counterpunch tone.
Conclusion, quoted in its entirety: "It is vitally important for the future of progressive politics that people of liberal, egalitarian and internationalist outlook should now speak clearly. We must define ourselves against those for whom the entire progressive-democratic agenda has been subordinated to a blanket and simplistic 'anti-imperialism' and/or hostility to the current US administration. The values and goals which properly make up that agenda--the values of democracy, human rights, the continuing battle against unjustified privilege and power, solidarity with peoples fighting against tyranny and oppression--are what most enduringly define the shape of any Left worth belonging to."
They have not noticed that some of their principles are contradicted by their political positions.
Vanessa Redgrave and Joan Didion, Working on a Merger
SOON after the announcement was made last December that Joan Didion would be writing a one-woman play based on her autobiographical book, "The Year of Magical Thinking," Ms. Didion had a meeting with Scott Rudin, the Broadway producer who first proposed the idea, and David Hare, the British playwright who will be directing the production. One of the topics was casting. It was not a long conversation.
Vanessa Redgrave, said Mr. Rudin, "was the only person we ever talked about. There was no one else ever discussed." And so after a phone call to Ms. Redgrave, the two women, among the greatest practitioners of their crafts, started the process of becoming, in a sense, one. "I said, 'My God,' and I couldn't speak for a long time," Ms. Redgrave, 69, recalled in an interview Wednesday afternoon in Ms. Didion's sunlight-filled apartment. "I'd read the book and given it to all my family."
"The Year of Magical Thinking" will be the first play for Ms. Didion, 71. It will not be a strict adaptation of the book, she said, because it will cover events that happened after it was published. The book, an account of the fear, despair and exasperation of bereavement, begins on Dec. 30, 2003, with the sudden death of her husband, John Gregory Dunne, after a heart attack at the dinner table.
May 26, 2006
Towards the end of the last century major cultural institutions established themselves as generators of urban activity rather than just repositories for artefacts and information. Architecture is central to this role. As the new century progresses, the architecture of high culture is evolving still further, and a new museum now carries with it the weight of cultural expectation, anticipated by both critics and town planners as a potent symbol of place, be it a district, city or even a whole country.
Iconic monumentalism was a reaction against the anodyne Modernism that had become the de facto house style of museology. Sober, self-effacing, functional museum architecture stems from the Bauhaus-era fascination with purity and simplicity. The gradual reduction of the decorated façade into a muted, abstract composition took place in parallel with the most significant American art movement of the postwar era, Abstract Expressionism, an integration epitomized by Philip Johnson’s Rothko Chapel in Houston (1971), a self-consciously pared-down structure built to house a Mark Rothko triptych. Art overflowed the constraints of the canvas; architecture followed meekly.
more from Frieze here.
In 1943, a young sailor named Milton on furlough from his duties in the psych ward at Camp Pendleton wandered into the Huntington Library in San Marino and stood stock-still, transfixed by the aesthetic epiphany of seeing Gainsborough’s The Blue Boy and Lawrence’s Pinkie in the flesh. He remembered having seen them reproduced on packs of playing cards back home in Port Arthur, Texas. “It sounds corny,” Milton later recalled, “but my moment of realization that there was such a thing as being an artist happened right there.”
Ten years later, Milton Rauschenberg had changed his name to Bob, and the seed planted by that unholy marriage of male and female über-kitsch archetypes, having passed through an art history wormhole called Erased de Kooning, spawned an outpouring of virtuosic and revolutionary visual artifacts unsurpassed in the history of 20th-century visual culture.
more from the LA Weekly here.
SOON AFTER leaving Romania in the late 1940s, Paul Celan wrote to a friend of the “too short season which was ours…” It is a good epitaph for the all too brief explosion of artistic and literary talent in Romania in the first half of the 20th century, set against a darkening background of rising anti-Semitism, invasion and dictatorship.
There were two generations. The first were born in the years before the First World War and included Tristan Tzara (né Sami Rosenstein), the father of Dadaism, the Yiddish poet Itzik Manger, the screenwriter Emeric Pressburger (born in Hungary but briefly a Romanian citizen in the 1920s), Mircea Eliade, Ionesco, E.M. Cioran and Saul Steinberg. None of them remained in Romania by the end of the Second World War.
The second generation were born between the wars and included Celan, Elie Wiesel, Aharon Appelfeld and Norman Manea. They were formed by three experiences: the rise of Romanian anti-Semitism in the 1930s, the Holocaust and exile.
more from a review of Norman Manea's memoirs in Salmagundi here.
MID-POINT IN THE MIDDLE EAST?
Tariq Ali in the New Left Review:
Looking down on the world from the imperial grandeur of the Oval Office in the fall of 2001, the Cheney–Bush team was confident of its ability to utilize the September events to remodel the world. The Pentagon’s Vice Admiral Cebrowski summed up the linkage of capitalism to war: ‘the dangers against which US forces must be arrayed derive precisely from countries and regions that are “disconnected” from the prevailing trends of globalization’. Five years later, what is the balance sheet?
On the credit side, Russia, China and India remain subdued, along with Eastern Europe and Southeast Asia. Here, despite the attempts of Western political science departments to cover the instrumentalist twists of US policy with fig-leaf conceptualizations—‘limited democracies’, ‘tutelary democracies’, ‘illiberal democracies’, ‘inclusionary autocracies’, ‘illiberal autocracies’—the reality is that acceptance of Washington Consensus norms is the principal criterion for gaining imperial approval. In Western Europe, after a few flutters on Iraq, the EU is firmly back on side. Chirac now sounds more belligerent than Bush on the Middle East, and the German elite is desperate to appease Washington. On the debit side, the Caracas effect is spreading. Cuba’s long isolation has been broken, the Bolivian oligarchy defeated in La Paz and the Bolivarian Republic of Venezuela has assumed a central role in mobilizing popular anti-neoliberal movements in virtually every Latin American country.
More alarmingly for Washington, American control of the Middle East is slipping.
Here’s how to make an invisibility cloak
Researchers say they are rapidly closing in on new types of materials that can throw a cloak of invisibility around objects, fulfilling a fantasy that is as old as ancient myths and as young as "Star Trek" and the Harry Potter novels. Unlike those tales of fictional invisibility, the real-life technologies usually have a catch. Nevertheless, limited forms of invisibility might be available to the military sooner than you think.
"We're very confident that at radar frequencies, these materials can be implemented on a time scale of 18 months or so," John Pendry of Imperial College London told MSNBC.com. The most exotic technologies involve "metamaterials," blends of polymers and tiny coils or wires that twist the paths of electromagnetic radiation.
"There are recipes for controlling metamaterials," explained
What Became of the Megafauna?
Robert S. Feranec in American Scientist:
Between 50,000 and 10,000 years ago, during the final millennia of the Pleistocene Epoch, roughly 100 genera of megafauna (animals weighing more than 100 pounds) became extinct worldwide. Among them are such well-known creatures as mammoths and saber-toothed tigers and the more obscure, though no less significant, Diprotodon (an Australian marsupial the size of a hippopotamus) and Coelodonta (a woolly rhinoceros found in Europe). Whether their disappearance was caused by changes in climate or by "overkill" (being hunted to extinction by humans) has been hotly debated for the past 40 years. In Twilight of the Mammoths: Ice Age Extinctions and the Rewilding of America, Paul S. Martin reviews the end-Pleistocene extinction, arguing that overkill is the more likely explanation.