Wednesday, May 31, 2006
Michael Paulus at his website (via The Daily Doubter):
Animation was the format of choice for children's television in the 1960s, a decade in which children's programming became almost entirely animated. Growing up in that period, I tended to take for granted the distortions and strange bodies of these entities.
These Icons are usually grotesquely distorted from the human form from which they derive. Being that they are so commonplace and accepted as existing I thought I would dissect them like science does to all living objects - trying to come to an understanding as to their origins and true physiological make up. Possibly to better understand them and see them in a new light for what they are in the most basic of terms.
I decided to take a select few of these popular characters and render their skeletal systems as I imagine they might resemble if one truly had eye sockets half the size of its head, or fingerless-hands, or feet comprising 60% of its body mass.
Many more here.
How are young Muslims radicalized on domestic soil?
Steve Coll in The New Yorker:
In a world amply populated with angry young Muslims, it is a question of some interest why a small number choose to become suicide bombers. President Bush addresses the matter in starkly religious language, consigning it to an eternal contest between good and evil. American scholars have begun to attack the problem with scientific method; Robert Pape, of the University of Chicago, for example, recently mustered data to argue that suicide attacks are a rational means by which the weak can humble the strong. To this potpourri of hypotheses can now be added a compelling work by anonymous bureaucrats in Great Britain, under the oddly redundant title “Report of the Official Account of the Bombings in London on 7th July 2005.”
On that summer morning, three young Muslim men blew themselves up on Underground cars, and a fourth immolated himself on a double-decker bus; fifty-two people died, and several hundred suffered injuries. The most striking aspect of the inquiry into the attacks, which was published earlier this month, is the extent to which it plumbs the suicide bombers’ motivations.
The four men depicted in the report are in some respects unfathomable. When Shehzad Tanweer, a talented athlete who was twenty-two years old, bought snacks at a highway convenience store four hours before his death, he haggled over the change. Hasib Hussain, who was eighteen, strode into a McDonald’s just half an hour before he killed himself and thirteen others.
Beauty and her beasts
Chris Petit in The Guardian:
Her three marriages were essays in fame. Her first in 1942, at 19, to pint-sized star Mickey Rooney, then one of MGM's biggest assets and an experienced skirt-chaser despite his wholesome screen image, happened when she was barely a signed-up starlet. Rooney was forced to marry because she wouldn't come across otherwise. Her second husband, jazz star Artie Shaw, gave the uneducated Gardner a reading syllabus, sent her to therapy and, for reasons he never explained, moved them into a modest rented house in suburban Burbank, which they shared for a time with its owners and their teenage sons. The third husband was Sinatra. By then she was the bigger star, a perpetual cover girl and tabloid sensation, epitome of an emerging jet set (which can equally be taken for a life on the run), her movie career almost incidental to her celebrity, and indistinguishable from her often exaggerated notoriety. Asked by a reporter what she saw in Sinatra - a 119lb has-been - she replied demurely that 19lb of it was cock.
DIGITAL MAOISM: The Hazards of the New Online Collectivism
Jaron Lanier at Edge.org:
The problem is in the way the Wikipedia has come to be regarded and used; how it's been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous.
It's time to stop killing meat and start growing it
William Saletan in Slate:
Where were you when Barbaro broke his leg? I was at a steakhouse, watching the race on a big screen. I saw a horse pulling up, a jockey clutching him, a woman weeping. Thus began a worldwide vigil over the fate of the great horse. Would he be euthanized? Could doctors save him? In the restaurant, people watched and wondered. Then we went back to eating our steaks.
Shrinks call this "cognitive dissonance." You munch a strip of bacon then pet your dog. You wince at the sight of a crippled horse but continue chewing your burger. Three weeks ago, I took my kids to a sheep and wool festival. They petted lambs; I nibbled a lamb sausage. That's the thing about humans: We're half-evolved beasts. We love animals, but we love meat, too. We don't want to have to choose. And maybe we don't have to. Maybe, thanks to biotechnology, we can now grow meat instead of butchering it.
Genetic Comparison Traces Origins of HIV to African Chimpanzees
Lauran Neergaard of the AP in the Chicago Tribune:
Solving the mystery of HIV's ancestry was dirty work. But researchers now have confirmed that the virus that causes AIDS in humans really did originate in wild chimpanzees--in a corner of Cameroon.
Scientists have long known that captive chimps carry their own version of the AIDS virus, SIV or simian immunodeficiency virus. But it was extraordinarily hard to find in wild chimpanzees, complicating efforts to pin down just how the virus could have made the jump from animal to man.
Fitting that final piece of the puzzle required seven years of research just to develop tests to genetically trace the virus in living wild chimps without hurting the endangered species. Then trackers had to plunge through the dense forests of West Africa and scrape up fresh ape feces, more than 1,300 samples in all.
Until now, "no one was able to look. No one had the tools," said Dr. Beatrice Hahn of the University of Alabama at Birmingham. She led the team of international researchers that reported the success in Thursday's online edition of the journal Science.
"We're 25 years into this pandemic," Hahn said. "We don't have a cure. We don't have a vaccine. But we know where it came from."
on simple human decency
Ben Metcalf in Harper's Magazine:
Some time has passed since I last raised my voice to the multitude, and whereas literary taste does not seem to have advanced much in the interim, and I assume is still arrayed so as to engage only the weak-minded and dull, I find that I am no longer able to discern with any accuracy where the bounds of simple human decency lie. This would bother me even less than does the taste issue were it not for the fact that ground gained or lost in the theater of decency tends now and then to affect the law, and it has long been a personal goal of mine to avoid capture and imprisonment.
I am therefore led to wonder what the common citizen is allowed to "say" anymore, in print or otherwise, and still feel reasonably sure that some indignant team of G-men, or else a pair of gung-ho local screws, will not drag him away to a detention center, there to act out, with the detainee as a prop, that familiar scene in which one hero cop or another is patriotically unable to resist certain outbursts against the detainee and what were once imagined to be the detainee's constitutional rights. Because I am loath to violate whatever fresh new mores the people have agreed upon, or have been told they agree upon, and because I do not care to have my ass kicked repeatedly in a holding cell while I beg to see a lawyer, I almost hesitate to ask the following question.
More here. [Thanks to Asad Raza.]
WHY THE U.N. CAN'T SAVE DARFUR
Eric Reeves in The New Republic:
Actually, far from suggesting that the United Nations can save Darfur, the developments of the last few weeks provide an excellent illustration of why the international body will never be able to stop the genocide. Indeed, the most recent Security Council resolution does more to highlight Darfur's exceedingly grim future than to suggest that security for civilians or humanitarian operations will improve anytime in the near term. We might recall that there have been seven previous U.N. Security Council resolutions on Darfur, none of which has halted the genocide. These previous resolutions, which together constitute a shameful record of impotence, are recounted in the most recent resolution--unwittingly drawing attention to just how useless Turtle Bay's steady stream of diplomatic activity on Darfur has been. Unfortunately, there is no reason to believe that this time will be any different.
First, it's worth understanding just how bad the situation on the ground in Darfur has become--despite the recent peace agreement signed in Abuja that many believe could open the way for U.N. troops.
Intelligent Beings in Space!
From The New York Times:
A future space mission to Titan, Saturn's intriguing moon enveloped in clouds, might deploy a blimp to float around the thick atmosphere and survey the sand dunes and carved valleys below.
But the blimp's ability to communicate would be limited. A message would take about an hour and a half to travel more than 800 million miles to Earth, and any response would take another hour and a half to get to Titan.
Three hours would be a long time to wait if the message were: "Help! I'm caught in a downdraft. What do I do?" Or if the blimp were to spot something unusual — an eruption of an ice volcano — it might have drifted away before it received the command to take a closer look. The eruption may also have ended by then.
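The delays the article cites can be sanity-checked with a quick back-of-the-envelope script. This is a sketch only; the sample Earth-Saturn distances are assumptions for illustration, since the actual gap varies with the planets' orbital positions:

```python
# One-way light-travel time between Earth and Titan.
# The sample distances are illustrative assumptions: the Earth-Saturn
# gap varies with the planets' orbital positions, from roughly 750
# million to 1,000 million miles.
SPEED_OF_LIGHT_MPS = 186_282  # miles per second (approximate)

def one_way_delay_minutes(distance_miles: float) -> float:
    """Return the one-way radio-signal delay in minutes."""
    return distance_miles / SPEED_OF_LIGHT_MPS / 60.0

for miles in (750e6, 886e6, 1_000e6):
    mins = one_way_delay_minutes(miles)
    print(f"{miles / 1e6:,.0f} million miles: "
          f"{mins:.0f} min one way, {2 * mins:.0f} min round trip")
```

At the far end of that range the one-way delay approaches an hour and a half, matching the article's figure, so a question-and-answer exchange with the blimp really would take about three hours.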
Until recently, interplanetary robotic explorers have largely been marionettes of mission controllers back on Earth. The controllers sent instructions, and the spacecraft diligently executed them.
But as missions go farther and become more ambitious, long-distance puppetry becomes less and less practical. If dumb spacecraft will not work, the answer is to make them smarter. Artificial intelligence will increasingly give spacecraft the ability to think for themselves.
Scientists reveal how frogs grip
From BBC News:
The mystery of how frogs cling to surfaces - even if their feet are wet - may have been solved by scientists. A study of tree frogs has revealed their toe pads are covered in tiny bumps that can directly touch a surface to create friction. The scientists found this direct contact occurs even though the pads are covered with a film of watery mucus. The findings, published in the journal Interface, may aid the development of anti-slip devices.
"The toe pads are patterned with a fine structure of hexagonal cells with channels running between them," explained Dr Jon Barnes, an author on the paper and a zoologist from Glasgow University. "One imagines if you are sticking to a leaf, that each cell, even if it is separate from the other cells, can form its own closest orientation."
Tuesday, May 30, 2006
The life and work of Oriana Fallaci
Margaret Talbot in The New Yorker:
“Yesterday, I was hysterical,” the Italian journalist and novelist Oriana Fallaci said. She was telling me a story about a local dog owner and the liberties he’d allowed his animal to take in front of Fallaci’s town house, on the Upper East Side. Big mistake. “I no longer have the energy to get really angry, like I used to,” she added. It called to mind what the journalist Robert Scheer said about Fallaci after interviewing her for Playboy, in 1981: “For the first time in my life, I found myself feeling sorry for the likes of Khomeini, Qaddafi, the Shah of Iran, and Kissinger—all of whom had been the objects of her wrath—the people she described as interviewing ‘with a thousand feelings of rage.’ ”
For two decades, from the mid-nineteen-sixties to the mid-nineteen-eighties, Fallaci was one of the sharpest political interviewers in the world. Her subjects were among the world’s most powerful figures: Yasir Arafat, Golda Meir, Indira Gandhi, Haile Selassie, Deng Xiaoping. Henry Kissinger, who later wrote that his 1972 interview with her was “the single most disastrous conversation I have ever had with any member of the press,” said that he had been flattered into granting it by the company he’d be keeping as part of Fallaci’s “journalistic pantheon.” It was more like a collection of pelts: Fallaci never left her subjects unskinned.
Celebrating the commonplace: Starlight
Chet Raymo in Science Musings:
Sometimes it's fun to think about things that no one has thought about before.
Some things are thought about for the first time because to do so requires genius. For example: Darwin thinking about evolution by natural selection, Einstein thinking about relativity, or Watson and Crick thinking about the DNA double helix. Being the first to think about those sorts of things can win you a Nobel prize.
Other things are thought about for the first time because they are so utterly commonplace that no one has bothered to think about them before. These are the kind of things I like to think about.
Consider starlight. What could be more commonplace than starlight?
The Wind That Shakes The Barley
Daren Waters at the BBC:
Ken Loach, speaking at the Cannes film festival, said The Wind That Shakes The Barley was a story he had to tell.
Loach's aim is to cast his political eye on events that are rarely discussed in the UK and beyond and remain open wounds for many Irish citizens.
Cillian Murphy plays Damien, a young man set to leave Ireland and become a doctor in London.
But events overtake him.
At the start of the film, Ireland remains an effective colony of the UK; with British soldiers stationed in the country.
Damien witnesses the murder of a young friend, killed at the hands of brutal British soldiers because he would only give his name in Gaelic, and not in English.
On Seeing the Wind at Hope Mansell
Whether or not shadows are of the substance
such is the expectation I can
wait to surprise my vision as a wind
enters the valley: sudden and silent
in its arrival, drawing to full cry
the whorled invisibilities, glassen towers
freighted with sky-chaff; that, as barnstorming
powers, rammack the small
orchard; that well-steaded oaks
ride stolidly, that rake the light-leafed ash,
that glowing yew trees, cumbrous, heave aside.
Amidst and abroad tumultuous lumina,
regents, reagents, cloud-fêted, sun-ordained,
fly tally over hedgerows, across fields.
A new poem from Geoffrey Hill at Poetry Magazine here.
On the afternoon of January 31, 1998, two hundred professors and graduate students gathered at the University of California, Santa Cruz, to discuss a disturbing new movement. "A specter is haunting U.S. intellectual life," a flier announced, "the specter of Left Conservatism." With participants including Judith Butler, Wendy Brown, Jonathan Arac, and Paul A. Bové, the conference was designed to address the perceived split in the mid- to late '90s between members of the so-called cultural and real Lefts.
What was the difference between the two? The conventional wisdom of the time had it that the cultural Left was composed of theory-obsessed, anti-American academic relativists who wrote obscure treatises and preferred ethnic- and gender-oriented identity politics to activism. Members of the real Left, on the other hand, were pragmatic humanists, earnest '60s types who favored coalition building (with the labor movement, for one), abhorred class inequality, and pressed for political change via elections.
more from Bookforum here.
It is difficult, if not impossible, to tell where the art begins and ends in Dieter Roth's exhibition at Coppermill, Hauser & Wirth's new gallery in a gigantic warehouse in London's East End. Entering the space is like walking into a begrimed indoor city, whose every filthy crevice is crammed with disconcerting detail: heaps of rubbish, hardened paint brushes, broken video cameras. This is the largest exhibition of Roth's work to be held in this country for more than 30 years, yet it provides little more than an inkling of the artist's complicated, divergent career, and his no less complicated life.
more from the Guardian Unlimited here.
Sexual attraction: the magic formula
From The London Times:
Selecting a mate is the most crucial decision of our lives. We spend a huge amount of time and energy trying to find that special someone. Our appetite for a relationship fuels a billion-pound industry of matchmaking services. Yet we’re often not satisfied. A 2005 survey of more than 900 people who had been using online dating services revealed that three-quarters had not found what they were looking for. We seem as much in the dark as ever about who is a suitable match.
Let’s start with the conscious part. There are some things we all find attractive. Men tend to desire women with features that suggest youth and fertility, including a low waist-to-hip ratio, full lips and soft facial features. Recent studies confirm that women have strong preferences for virile male beauty — taut bodies, broad shoulders, clear skin and defined, masculine facial features, all of which may indicate sexual potency and good genes. We also know that women are attracted to men who look as if they have wealth or the ability to acquire it, and that men and women strongly value intelligence in a mate. Preferences for these qualities — beauty, brains and resources — are universal. The George Clooneys and Angelina Jolies of the world are sex symbols for predictable biological reasons.
Thumbs Up for Leech Therapy
Bloodsucking leeches relieve the pain of thumb arthritis more effectively and for a longer period of time than the conventional painkilling ointment, according to new clinical trial results. The findings, presented here yesterday at the North American Research Conference on Complementary and Integrative Medicine, may move leech treatment one large wriggle closer to the mainstream of medicine.
Osteoarthritis of the thumb afflicts millions of people, causing joint pain debilitating enough to keep them from opening jars, writing notes, and gripping anything tightly. Doctors usually prescribe painkilling pills, injections, or ointments, but none of the treatments work well. Internist Gustav Dobos of the University of Essen in Germany, and his colleagues had successfully treated patients' arthritic knees with leeches before. The worms inject a blood-thinning chemical called hirudin and several substances that fight inflammation--components that keep a prey's blood flowing in the wild.
Monday, May 29, 2006
Teaser Appetizer: The Definition of Health
The World Health Organization (WHO) defines health as “a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.” This definition entered the books between 1946 and 1948 and has remained unchanged.
Current medical knowledge is desperately struggling -- with only partial success -- just to “merely” control “disease or infirmity,” while “complete well-being” is unlikely to sprout from our incomplete knowledge. If your politicians were to legislate health by this definition, they would be in default forever, for one obvious reason: no nation -- I repeat, no nation -- has the knowledge or the resources to deliver care to match this definition. We all learnt in kindergarten -- well, except the politicians -- not to promise what we cannot fulfill.
This definition is a lofty, laudable, visionary statement that may reflect a distant aspiration, but its realization is elusive in current practice. In all humility, we should concede that “complete --- well-being” is probably an unquantifiable metaphysical state, unattainable without taming nature’s evolutionary laws of life and death. And to presume that we have the ability to do so is a whiff of arrogance -- an aromatic trait our species emits in abundance.
The realization of this dream probably seemed feasible in 1948, when we had made a quantum leap in understanding infectious diseases and, for the first time in human history, were exuberant in our demonstrated ability to extend longevity by about twenty years in some countries. But that was well before we could predict the explosion of health technology or understand its consequent individual, societal and economic effects.
Isn’t it time we sought a second opinion on the health of this definition and evolved a flexible one that encompasses current reality and is malleable enough to accommodate future developments?
While the WHO definition stays seemingly immutable, a new framework linked to human rights has evolved. The human-right-to-health paradigm reiterates that the enjoyment of the highest attainable standard of health is a fundamental right of every human being. This linkage has provided an inspirational tool with which to demand “health.” The tenor of this discourse takes a cue from the rhetoric of Kofi Annan: “It is my aspiration that health will finally be seen not as a blessing to be wished for, but as a human right to be fought for.”
This paradigm recognizes that the violation of human rights has serious health consequences and that promoting equitable health is a prerequisite to the development of society. The discourse rightly demands the abolition of slavery, torture, abuse of children and harmful traditional practices, and also seeks access to adequate health care without discrimination, safe drinking water and sanitation, a safe work environment, equitable distribution of food, adequate housing, access to health information and gender sensitivity.
All nations are now signatories to at least one human rights treaty that includes health rights. One hundred and nine countries had guaranteed the right to health in their constitutions by the year 2001, which qualifies it as an effective instrument for policy change; but it also raises some difficult questions.
Human rights discourse uses the words health and health care interchangeably. Rony Brauman, past president of Médecins Sans Frontières, comments: “WHO's definition of a "right to health" is hopelessly ambiguous. I have never seen any real analysis of what is meant by the concept of "health" and "health for all," nor do I understand how anyone could seriously defend this notion.” The notion would be more defensible if the demand for health care replaced the demand for health.
Yet no country in the world can afford to give all health care to all its citizens all the time. Nations conduct a triage of priorities according to their prejudices, and large swaths of their populations are not caught in the health care net. Even nations that have the right to health embedded in their constitutions face a gap between aspirations and resources.
The human rights debate skirts round the issue by invoking the “principle of progressive realization,” which allows resource-strapped countries to promise increments in health care delivery in the future. This effectively gives governments a tool to ration and allocate resources, even when that conflicts with individual rights.
The following example illustrates the problem: the post-apartheid government of South Africa had enshrined the right to health in the constitution, yet the courts decided against a petitioner who demanded the dialysis he needed for chronic kidney failure. The court ruled that the government did not have an obligation to provide the treatment; in essence, it transferred some responsibility to the individual.
Gandhi, too, expressed his concern that rights without responsibility are a blunder. A responsibility paradigm could supplement the rights movement; a pound of responsibility could prove heavier than a ton of rights. But the current noise for rights has muzzled the speech for responsibility, and “complete health” is becoming an entitlement to be ensured by the state without any demand that the family and the individual be equal stakeholders. Hippocrates said, "a wise man ought to realize that health is his most valuable possession and learn to treat his illnesses by his own judgment."
This conflict will escalate further with the impact of biotechnology. A quote from Craig Venter gives a feel for it: “It will inevitably be revealed that there are strong genetic components associated with most aspects of what we attribute to human existence --- the danger rests with what we already know: that we are not all created equal. ---- revealing the genetic basis of personality and behavior will create societal conflicts.”
Derek Yach, a respected public health expert and professor at Yale University, says, “With advances in technology, particularly in the fields of imaging and genetic screening, we now recognize that almost all of the population either has an actual or potential predisposition to some future disease.”
We can’t help but rethink health itself before we promise health care. An alternative definition can be derived from the health field concept of Marc Lalonde, who was the health minister of Canada in 1974. He surmised that the interplay of four elements determined health: genetic makeup, environment (including social factors), individual behavior and the organization of health care. The health field model thus holds many stakeholders accountable.
Each stakeholder approaches health with a seemingly different goal, even though the goals complement each other. A healthy person wishes not to fall sick; a sick person demands quick relief; a health care provider attempts to cure and prevent disease; a molecular biologist envisions control of molecular dysfunction; a public health official allocates resources to benefit the maximum number of people; a health economist juggles finances within the budget; the government facilitates or hampers the delivery of care according to its priorities; and the activist demands that every person have the right to the “highest attainable standard of physical and mental health.”
Many stakeholders mean more questions than answers. Who decides the limits of health a society should attain? Should the boundary stop at basic primary care, or extend to genetic manipulation to deliver well-being? Who decides the mechanism of attaining that limit? Who decides positive mental well-being? And who pays for it?
It is apparent that ‘complete well-being’ is as much an oxymoron as ‘airline food’! We urgently need a new definition as a starting point for debate: a definition that is quantifiable for outcomes, accommodative of stakeholders, absorbent of future advances, accountable for delivery of care and cognizant of limitations. The new definition has to be both correct and politically correct. Dr. Brundtland, former director-general of the WHO, wrote in the World Health Report that “The objective of good health is twofold -- goodness and fairness; goodness being the best attainable average level; and fairness, the smallest feasible differences among individuals and groups.” We should match our expectations to reality.
These elements, compressed and enveloped into a workable statement, might read as follows:
Health is a state of freedom from the physical and mental consequences of molecular and psychological derangements caused by the interaction of individual biology and the environment; health care is an attempt to reverse such derangements by providing equitable access to all, without discrimination, within the constraints of available resources and knowledge.
You may call this, if you please: the 3QD definition of health -- you read it here first!
Dispatches: Affronter Rafael Nadal
Roland Garros, or tennis' French Open, started yesterday. Perhaps you've noticed; articles ran in most Sunday papers about it, quite extensive ones too, considering that the French has often been viewed as a third-rate (after Wimbledon and the U.S. Open) Grand Slam tournament, largely because it is usually won by a cadre of specialists instead of the best-known players. Not only is this perception unfair, but, this year, Roland Garros will be the most important men's tennis tournament of the year. Here's why.
The increasing specialization of tennis has meant that this tournament, the only Grand Slam played on clay, has a set of contenders that is quite distinct from those at the grass courts of Wimbledon and the hardcourts of Flushing Meadows, Queens. Not only has it been won by players who have not been dominant on the other surfaces, but it has been very difficult for anyone to enjoy repeat success sur la terre battue. Ten of the last twelve Wimbledons were won by Pete Sampras and Roger Federer; the last five winners of Roland Garros are Gustavo Kuerten, Albert Costa, Juan Carlos Ferrero, Gaston Gaudio, and Rafael Nadal. I'm going to try to explain both phenomena (specialized success and lack of repeat dominance) below.
Why does it make a difference what surface the game is played on, and what difference does it make? Basically, the surface affects three things: the speed of the ball after it bounces, the height of the ball's bounce, and the player's level of traction on court. In terms of the speed of the ball and height of its bounce, clay is the slowest and highest, and grass is the fastest and lowest, with hardcourt in the middle. This results in differing strategies for success on each surface, with grass rewarding aggressive quick strikes - with the speed of the ball and the low bounce, you can 'hit through' the court and past the other player with relative ease. For this reason, the great grass-court players have mostly been offensive players, who use serve-and-volley tactics (i.e., serving and coming to net to take the next ball out of the air). Clay, on the other hand, reverses this in favor of the defensive player: the slow, high bounce means it is very tough to hit past an opponent, and points must be won by attrition, after long rallies in which slight positional advantages are constantly being negotiated before a killing stroke. Clay-court tennis is exhausting, brutal work.
Clay and grass, then, are opposed, slow and fast, when it comes to the ball. How then did Bjorn Borg, perhaps the greatest modern player (he accomplished more before his premature retirement at twenty-five than anyone other than Sampras), manage to win Roland Garros (clay) six times and Wimbledon (grass) five but never a major tournament on the medium-paced surface, hardcourt? The third variable comes into play here: traction. Clay, and, to a lesser extent, grass, provide negative traction. That is, you slip when you plant your foot and push off. Hardcourt provides positive traction - your foot sticks. Consequently, entirely different styles of quickness are needed. Borg didn't like positive traction. On clay, particularly, players slide balletically into the ball, a skill whose timing is developed during childhood by the most talented players, most of whom grew up in countries where clay courts are the rule: Spain, Italy, Argentina, Chile, Brazil. Grass is not as slidey, but offers less traction than the sticky hardcourts, and like clay, grass' uneven natural surface produces unpredictable hops and bounces, frustrating the expectations of the more lab-conditioned hardcourt players.
So, clay slows the ball and provides poor footing, both of which mean that it's ruled by an armada of players who grow up playing on it and mastering the movement and strategic ploys it favors. Perhaps foremost among these is the dropshot, which works because the high bounce of the clay court drives players way back and sets them up for the dropper. This explains the dominance of the clay specialists, but why has the title switched off among so many players lately? For the most part, this is because of the grinding nature of clay. So much effort must be expended to win a match (five sets on clay can take five hours of grueling back-and-forth; in contrast, bang-bang tennis on grass can be practically anaerobic), that players tire over the course of the tournament, and so much depends upon perseverance that a superhuman effort will often overcome a greater talent. It just so happens that last year there emerged a player who combines the greatest clay talent with the greatest amount of effort, but more on him below. For now, let me return to my claim that this edition of the French is the most important men's tennis event this year.
Historically, the greatest offensive players (meaning players who try to dictate play and win points outright, rather than counterpunchers, who wait for their opening, or retrievers, who wait for you to mess up), have been unsuccessful at Roland Garros, while the defensive fiends who win in Paris have been unsuccessful on grass. (Borg, a counterpunching genius, is the great exception.) The best attackers, namely John McEnroe, Boris Becker, Stefan Edberg, and of course Pete Sampras, have won zero French Opens, while Ivan Lendl, a three-time Roland Garros winner, narrowly failed in his endearing late-career quest to win Wimbledon (all of these players won major titles on hardcourts as well). The only man since 1970, in fact, to win all four major titles (known as the Grand Slam tournaments), on the three disparate surfaces, is one Andre Agassi, a hybrid offensive baseliner. This has made the dream of winning all four Slams in a single year, a feat also known, confusingly, as winning the Grand Slam--last accomplished by Rod Laver in 1969--seem pretty quixotic nowadays. Until now. The game's best current offensive player is also an excellent defensive player, and an extremely competent mover and slider on clay. Roger Federer has the best chance of anyone since Agassi to win the career Grand Slam, and, as the holder of the last Wimbledon, U.S. Open, and Australian titles, could win his fourth straight major this month (a feat he is calling, with a little Swiss hubris, the "Roger Slam"). If he succeeds this year at Roland Garros, he'll accomplish something Sampras couldn't, and if he does I think it's almost inevitable that he'll sweep London and Flushing and complete the calendar Grand Slam as well.
Standing in the way of Federer's c.v.-building efforts is the aforementioned combination of talent and drive, the nineteen-year-old Mallorcan prodigy Rafael Nadal. He had one of the finest seasons I've ever seen last year, absolutely destroying the field on clay, winning Roland Garros, winning over Agassi in Montreal and over Ljubicic in Madrid. He's now won a record 54 matches on clay without a loss. Not only does Nadal's astonishing effort level intimidate opponents, but he is surprisingly skilled, a bulldog with the delicacy of a fox. You can see him break opponents' spirits over the course of matches, endlessly prolonging rallies with amazing 'gets,' or retrievals, which he somehow manages to flick into offensive shots rather than desperate lobs. When behind, he plays even better until he catches up. His rippling physique and indefatigable, undying intensity make him literally scary to face on clay. And yet, when off the court, he is a personable and kind presence at this stage of his young life. All in all, a player this brutal has no business being this likable, but there it, and he, is.
Nadal and Federer have played six times: Nadal has won five, and held a huge lead in the other before wilting on a hardcourt. Let me underline here just how anomalous this state of affairs is: here we have the world number one on a historic run of victories, and yet he cannot beat number two. Federer has lost his last three matches with Nadal; with all other players, he has lost three of his last one hundred and nineteen matches. Rafa is the only player on whom Federer cannot impose his will; indeed, Federer must try and quickly end points against Nadal to avoid being imposed upon. In the final at Rome two weeks ago, Federer unveiled a new strategy, coming in to net whenever the opportunity arose, though not directly following his serve. Federer's flexibility, his ability to adopt new tactics, made for a delicious and breathtaking final, which he led 4-1 in the fifth and final set, and held two match points at 5-4. Here Nadal's hypnotic retrieving unnerved him once again, and two errors led the match to a fifth-set tiebreaker. In a microcosmic repetition, Federer again led (5-3 and serving) and again let the lead slip away. Nadal, after a full five hours, took the title and reconfirmed his psychological edge, even over the most dominant player of the last twenty years. His confidence will be nearly unimpeachable, where Federer's will be shaken by losing a match in which he played the best clay-court tennis of his life. If, as expected, they play again in the final of Roland Garros, for all the marbles, you're going to see the most anticipated tennis match in several years.
(Note: I have gone on for way too long without handicapping the women's field, for which I apologize. I'll just say here that I am hopeful that France's glorious all-court player, Amelie Mauresmo, will win.)
See All Dispatches.
Perceptions: of landscape
Sughra Raza. Inner Pain-ting. 2000.
Acrylic on canvas, 24" x 24".
Selected Minor Works: Why We Do Not Eat Our Dead
Justin E. H. Smith
[An extensive archive of Justin Smith's writing is now online at www.jehsmith.com]
Now that an "extreme" cookbook has hit the shelves offering, among other things, recipes for human flesh (Gastronaut, Stefan Gates, Harcourt, 257 pages; paperback, $14), perhaps our gross-out, jack-ass culture has reached the point where it is necessary to explain why these must remain untried.
I will take it for granted that we all agree murder is wrong. But this alone is no argument against anthropophagy, for people die all the time, and current practice is to let their remains simply go to waste. Why not take advantage of the protein-rich corpses of our fallen comrades or our beloved elderly relatives who have, as they say, "passed"? Surely this would not be to harm them or to violate their integrity, since the morally relevant being has already departed or (depending on your view of things) vanished, and what's left will have its integrity stolen soon enough by flame or earth. Our dearly departed clearly have no objections to such a fate: they are dead, after all. Could we not then imagine a culture in which cannibalizing our dead were perfectly acceptable, perhaps even a way of honoring those we loved?
The fact that we do not eat our dead, in spite of their manifest indifference, has been duly noted by some participants in the animal-rights debate. They think this reveals that whatever moral reasoning goes into our decisions about what sort of creature may be eaten and what must be left alone, it simply is not, for most of us, the potential suffering of the creature that makes the moral difference. Whereas Peter Singer believes that we should stop eating animals because they are capable of suffering, others have responded that this is beside the point, since we also make humans suffer in multifarious ways. We just don't eat them.
But again, why not? Some moral philosophers have argued that the prohibition has to do with respect for the memory of the deceased, but this can't get to the heart of it, since there's no obvious reason why eating a creature is disrespectful to it.
Perhaps the answer is simply that, as a species, we are carrion-avoiders. After all, it is not just the vegetarian who will not eat a cow struck by lightning, but the carnivore as well. Put another way: we do not eat fallen humans, but we also do not eat fallen animals; we eat slaughtered animals. It is then perhaps not so much the fact that dead humans are (or were) human that prevents us from eating them, but the fact that they are carrion, and that we, as a species, are not scavengers.
Consider in this connection the Islamic Shariah laws that one must follow if one wishes to eat a camel that has fallen down a well (I turn here to the version of the rules as stated by the Grand Ayatollah Sistani): "[If the camel] falls down into a well and one feels that it will die there and it will not be possible to slaughter it according to Shariah, one should inflict a severe wound on any part of its body, so that it dies as a result of that wound. Then it becomes… halal to eat."
Now, why is it considered so important to inflict a fatal wound before the camel dies as a result of its fall? Though this is but one culture's rule, it seems to be the expression of a widespread prohibition on eating accidentally dead animals. In the case of the camel, we have an animal that is about to die from an accident, and the instruction is: if you want to eat it, you had better hurry up and kill it before it dies! This suggests that people do not slaughter simply so that a creature will be dead, but rather so that it will be dead in a certain way. Relatedly, in the southern United States, roadkill cookbooks are sold in souvenir shops as novelty items, and the novelty consists precisely in the fact that tourists are revolted and amused by the thought of the locals scavenging like vultures.
Of course, human beings do in fact eat other human beings, just not those dead of natural or accidental causes. Some decades ago, the reality of cannibalism was a matter of controversy. In his influential 1980 book, The Man-Eating Myth: Anthropology and Anthropophagy, the social anthropologist William Arens argued that stories of cannibal tribes were nothing more than racist, imperialist fantasies. Recently, though, substantial empirical evidence has been accumulated for the relative frequency of cannibalism in premodern societies. Notable among this work is Tim White's archaeological study of anthropophagy among the Anasazi of southwestern Colorado in the twelfth century. More recently, Simon Mead and a team of researchers have made the case, on the basis of genetic analysis, that epidemics of prion diseases plagued prehistoric humans and were spread through cannibalistic feasting, in much the same way that BSE spreads among cattle.
In the modern era, frequent reports of cannibalism connected with both warfare and traditional medicine come from both natives and visitors in sub-Saharan Africa. Daniel Bergner reported in the New York Times that "in May, two United Nations military observers stationed in northeastern Congo at an outpost near Bunia, a town not far from Beni, were killed by a local tribal militia. The peacekeepers' bodies were split open and their hearts, livers and testicles taken – common signs of cannibalism." One of Bergner's informants, a Nande tribesman, recounts what happened when he was taken prisoner by soldiers from the Movement for the Liberation of Congo:
"One of his squad hacked up the body. The commander gave Kakule [the informant] his knife, told him to pare the skin from an arm, a leg. He told Kakule and his other assistant to build a fire. From their satchels, the soldiers brought cassava bread. They sat in a circle. The commander placed the dead man's head at the center. He forced the two loggers to sit with them, to eat with them the pieces of boiled limb. The grilled liver, tongue and genitals had already been parceled out among the commander and his troops."
Bergner notes that it is a widespread and commonly acknowledged belief in the region that eating the flesh, and especially the organs, of one's enemy is a way to enhance one's own power. This practice is sufficiently documented to have been accepted as fact by both the U.N. high commissioner for human rights and Amnesty International.
Cannibalism has been observed in over seventy mammal species, including chimpanzees. The hypothesis that cannibalism is common to all carnivorous species, or at least that it is something of which all carnivores are capable under certain circumstances, does not seem implausible. If one were to argue that these recent reports are fabrications, and that cannibalism's modern disappearance in our own species has something to do with ethical progress, surely sufficient counterevidence could be produced from other, even better documented practices to quickly convince all concerned that no progress has been made.
The evidence suggests that, when cannibalism does happen, it is never the result of the fortuitous death of a comrade and the simple need among his survivors for protein. Rather, it follows upon the slaughtering of humans, which is exactly what we would expect, given the human preference for slaughtered pigs and cows over lightning-struck ones. Where eating animals is permitted, there is slaughter. And where slaughtering humans is permitted, the general prohibition on eating them does not necessarily hold.
In short, eating human beings is wrong because murder is wrong, and there's no way to get edible meat but by slaughtering it. I suppose Stefan Gates could look for a "donor," who would in case of an untimely death --a car accident, say-- dedicate his body to pushing the limits of experimental gastronomy. But if the cook fails to find any willing diners, this may have much more to do with our distaste for roadkill than with respect for the memory of a fellow human.
Monday Musing: Frederica Krueger, Killing Machine
A couple of months ago, my wife Margit's friend Bailey asked us to look after her cat (really just a kitten) while she was out of town for about ten days. It was decided that the cat would just stay with us during that time. Bailey had only recently found the cat cowering in her basement, half-starved and probably living on the occasional mouse or whatever insects or other small creatures she could find. Bailey hadn't got around to naming the cat yet, and not wishing to prematurely thrust a real name upon her, we just called her Catty while she stayed with us. We thought she must be about six months old at that time, but she was quite tiny. Catty, to put it kindly, turned out to be a more ferociously mischievous cat than I had ever seen before. She did not like to be petted, and shunned all forms of affection. This, however, should by no means lead you to infer that our interactions with Catty were limited or sparse. Not at all: we were continuously stalked and hunted by her. I may not know what it is like to be a bat, but thanks to Catty, I have a pretty good idea what it is like to be an antelope in the Serengeti! [Photo shows Catty when she first came to stay with us.]
Catty wanted to do nothing but eat and hunt. Any movement or sound would send her into a crouching tiger position, ears pinned back, tail twitching. Though she is very fast, her real weapon is stealth. (Yes, she is quite the hidden dragon, as well.) I'll be watching TV or reading when, quite suddenly, I become barely aware of a grayish blur flying through the air toward me from the most unexpected place, with just enough time to instinctively close my eyes before she swats me with a claw. After various attacks on Margit and me which we were completely helpless to prevent, and which left us mauled with scratches everywhere (and I had been worried about cat hair on my clothes making me look bad!), Margit took her to a vet to have her very sharp nails trimmed (we did not have her declawed, which seemed too cruel and irreversible). The vet asked Margit for a name to register her under, and Catty immediately tried to kill him for his impertinence. While he bandaged his injuries, Margit decided to officially name the little slasher Frederica Krueger, thereby openly acknowledging and perhaps even honoring her ineluctably murderous nature. We started calling her Freddy.
Here's the funny thing: despite her fiercely feral, violent tendencies, Freddy was just so beautiful that I fell in love with her. To echo Nabokov's Humbert Humbert speaking about another famous pubescent nymphet: Ladies and Gentlemen of the Jury, it was she who seduced me! As Freddy got more used to us, it was as if she could not decide whether to try and eat us, or be nice. She started oscillating between the two modes, attacking and then affectionately licking my hand, then attacking again... But it was precisely the graceful, lean, single-minded perfection of her design as a killing machine that I could not resist. Like a Ferrari (only much more impressive), she was clearly built for one thing only, and therein lay her seductive power. (Okay, I admit it, I've always liked cats. The photo here shows me sitting on a chimney on the roof of our house in Islamabad in the late 60s with my cat Lord Jim.)
We mostly read whatever psychological intentions we want (and can) into our pets, imputing all sorts of beliefs and desires from our own psychological economies to them, and this works particularly well to the advantage of cats. They are just intelligent enough to get our attention as intentional agents (unlike say, a goldfish, or even a hamster, which seem barely more than the automatons Descartes imagined all animals except humans to be), but the fact that they are very mentally rigid and cannot learn too much makes them seem imperious, haughty, independent, and noble to us, unlike dogs, who are much more flexible in intelligence and can learn to obey commands and do many tricks to please us. Let's be blunt: cats are quite stupid. But to be fair, maybe much of the nobility we read into some humans is also the result of their rigidity. Who knows. In any case, cats are such monomaniacally hardwired hunters that it is impossible not to admire their relentless pursuit of prey, even if (in my case!) that prey is us. Since like many gods of the ancients, cats are mostly oblivious to human wishes and impossible to control, it is no surprise that some ancient peoples held them to be gods.
In ancient Egypt cats were considered deities as early as 3000 BCE, and later there existed the cult of the goddess Bast, who was originally depicted as a woman with the head of a lioness but soon changed to an unmistakably domestic cat. Since cats were considered sacred, they were also mummified. Herodotus reports that when an Egyptian cat died, the members of the household that owned it would shave their eyebrows in mourning. Killing a cat, even accidentally, was a capital crime. The cult of Bast was officially banned in 390 CE, but reverence for cats continued. Another Greek historian, Diodorus Siculus, relates an incident from about 60 BCE in which the wheels of a Roman chariot accidentally crushed an Egyptian cat. An outraged mob immediately killed the soldier driving the chariot.
The domestic cat was named Felis catus by Linnaeus and, like the dog, belongs to the order Carnivora. Not all carnivores are in this order (even some spiders are carnivores, after all), and not all members of the Carnivora are carnivores - the panda, for example. Other members of this order are bears, weasels, hyenas, seals, and walruses. Like our own, the ancestors of the modern domestic cat came from East Africa. Cats were probably initially allowed or encouraged to live near human settlements because they are great for pest control, especially in agricultural settings with grain storage. This arrangement also afforded cats protection from larger predators, who stayed away from humans for the most part. Even now, cats will hunt more than a thousand species of small animals. Domestic cats, if left in the wild, will form colonies, and by the way, a group of cats is known as a clowder. (Be sure to throw that into your next cocktail party conversation.)
It took even physicists a while to figure out how a cat always lands on its feet, which is known as its "righting reflex." The problem is that in mid-air there is nothing to push off against to change your orientation (imagine being suspended in space outside a rocket, and trying to rotate). So how do they do it? The answer is actually quite technical and has to do with what physicists call a geometric phase. (Think of a spinning figure skater speeding up or slowing down her rate of rotation by drawing her arms in or holding them out.) What the cat does is bend at the waist and twist in two phases: first, with its front legs tucked in (small moment of inertia up front) and its hind legs extended (large moment of inertia behind), it rotates its front half a long way while the hind half counter-rotates only slightly; then it swaps - front legs out, hind legs tucked - and rotates the hind half to catch up, while the front half barely recoils. Because angular momentum must be conserved, and angular momentum depends on how far the mass sits from the axis of rotation, each half pays only a small counter-rotation for the other's large one, and the cat achieves a net rotation even though its total angular momentum stays zero throughout. If you don't get it, don't worry about it!
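For the curious, the two-phase twist can be sketched as a toy conservation-of-angular-momentum calculation. This is a back-of-the-envelope model, not real cat anatomy: the moment-of-inertia values are invented for illustration.

```python
# Toy two-segment model of the righting reflex. Total angular momentum
# stays zero; whenever one half of the body twists, the other half
# counter-rotates in inverse proportion to their moments of inertia.

def counter_twist(delta_active, i_active, i_passive):
    """Rotation forced on the passive half by twisting the active half."""
    return -delta_active * i_active / i_passive

I_TUCKED, I_EXTENDED = 1.0, 5.0   # invented inertias: legs tucked vs. extended

# Phase 1: front legs tucked, hind legs extended. Twisting the light
# front half +120 degrees barely moves the heavy back half.
front = 120.0
back = counter_twist(front, I_TUCKED, I_EXTENDED)   # -24 degrees

# Phase 2: swap the legs (front out, hind tucked) and twist the now-light
# back half until the two halves line up; the heavy front barely recoils.
delta_back = (front - back) / (1 + I_TUCKED / I_EXTENDED)
back += delta_back
front += counter_twist(delta_back, I_TUCKED, I_EXTENDED)

print(front, back)   # both halves end up aligned at a net rotation of 96 degrees
```

With these made-up numbers the cat nets 96 of the 120 degrees it twisted; a real cat simply repeats the cycle (with help from its tail) until its feet face the ground.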
Cats appear frequently in fiction and writers seem to have a particular predilection for them. Ernest Hemingway and Mark Twain were serial cat-owners. Hemingway at various times had cats named Alley Cat, Boise, Crazy Christian, Dillinger, Ecstasy, F. Puss, Fats, Friendless Brother, Furhouse, Pilar, Skunk, Thruster, Whitehead, and Willy. Twain's cats were Appolinaris, Beelzebub, Blatherskite, Buffalo Bill, Satan, Sin, Sour Mash, Tammany, and Zoroaster. Meanwhile, Theodore Roosevelt's cat Tom Quartz was named for a cat in Mark Twain's Roughing It. T.S. Eliot owned cats named Tantomile, Noilly Prat, Wiscus, Pettipaws, and George Pushdragon. William and Williamina both belonged to Charles Dickens.
Lord Byron and Jorge Luis Borges both had cats named Beppo. (Byron travelled accompanied by five cats.) Edgar Allan Poe had Catarina; Raymond Chandler, Taki. Kingsley Amis's cat was Sara Snow. Some cats were, of course, named for famous people as well as owned by them, such as Gloria Steinem's Magritte and Anatole France's Pascal. John Lennon was the proud owner of Elvis. John Kenneth Galbraith was forced to change his cat's name from Ahmedabad to Gujarat after he became the U.S. ambassador to India, because Muslims were offended by "Ahmed" (one of Mohammad's names) being associated with a cat. Mohammad himself, according to a report (hadith) attributed to Abu Huraira, owned a cat named Muezza, about whom it is said that one day while she was asleep on the sleeve of Mohammad's robe, the call to prayer was sounded. Rather than awaken the cat, Mohammad quietly cut his sleeve off and left. When he returned, the cat bowed to him and thanked him, after which she was guaranteed a place in heaven.
Isaac Newton not only loved cats, but is also said (probably apocryphally) to be the inventor of the "cat flap," allowing his cats to come and go as they pleased. (Wonder how long a break he had to take from inventing, say, calculus, to do that.) And by the way, among famous cat haters can be counted such luminaries as Genghis Khan, Alexander the Great, Julius Caesar, Napoleon Bonaparte, Benito Mussolini, and last but not least, Adolf Hitler. What is it about cat-hating that basically turns one into a Dr. Evil? But wait, Dr. Evil likes cats!
Okay, enough random blather. Back to Ms. Frederica Krueger's story: as the moment of Bailey's return from her trip and the time for Freddy to leave us approached, I grew more and more agitated, finally threatening Margit that I would kidnap the cat and run away with her unless she did something to stop Bailey from coming to pick up her cat. At first Margit tried to tell me that we could get another cat, which only made me regress further and throw a tantrum, yelling, "I don't want another cat! I only want this cat!" At this point, Margit told me I had finally cracked up completely and advised me to call a shrink. Bailey was coming to get the cat early the next morning. I went to bed late, as I often do, and was still asleep when Margit awakened me to say that Bailey had agreed to let us have the cat, as she seemed very happy here, and Bailey's apartment was really too small anyway. Thus Frederica became ours, and we remain her willing and ever-anxious prey.
Freddy's Photo Gallery
Here are some glamour and action shots of Ms. Frederica Krueger, which you can click to enlarge. Captions are below the photos:
- I catch Freddy suddenly pouncing on an unsuspecting Margit's hand from behind our living room sofa (a favorite place of hers from which to launch her demonic attacks). Her eyes reflect the light from the camera flash because of a mirror-like layer behind her retinas called the tapetum. Nocturnal animals have this reflective surface to bounce photons back toward the photosensitive cells of the retina, thereby almost doubling the chance that they will be registered and greatly improving the animal's night vision. The daytime vision of cats is not as good as that of humans, however.
- She is striking a deceptively demure pose. Don't let it fool you. I have paid dearly for that mistake. In blood.
- Freddy loves this incredibly silly toy, which is basically just a little felt mouse that goes around and around, driven by a battery-powered motor. She spends inordinate amounts of time and energy trying to slay this patently fake rodent.
- Freddy has a habit of sitting on various bookshelves in the apartment, usually at a greater height than in this picture, surveying the scene below, much like a vulture.
- Margit too-bravely holds Freddy in her lap; Freddy is only milliseconds away from trying to shred Margit's hands with the claws of her powerful rear legs.
- If you didn't believe me when I said that often all I see is a grayish blur flying at me, have a good look at this picture (enlarge it by clicking on it), taken at 1/8th of a second shutter speed. Freddy is jumping from a lower bookshelf to the shelf above the stereo on the right, so she can climb to even higher shelves along that wall.
Have a good week! My other Monday Musing columns can be seen here.
Sunday, May 28, 2006
THE ECONOMICS OF CONSERVATION
"How economists and climatologists deal with uncertainty...and each other."
Dave Munger in Seed Magazine:
People across the nation are socking it to state gas tax revenues by buying energy-efficient cars, making it more difficult for states to pay for road maintenance. Legislators from Oregon estimate that as a result of all those hybrids, by 2014 the state's gas tax revenues will begin to decline; as a result they may replace the current gas tax with a mileage tax.
Most climatologists agree that curbing greenhouse gas emissions and fighting global warming will require that we build more energy efficient cars and homes. Yet some of these choices are still not cost effective. Even as gas prices climb past $3 per gallon, filling the tank on a standard-engine economy car is still cheaper than plunking down the extra money for a $22,000 Toyota Prius. (Over the long term, however, a Prius requires only a $2.28 gas price to recoup its cost premium over an $18,000 Camry).
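That payback figure is easy to sanity-check with a back-of-the-envelope calculation. The sticker prices come from the article; the fuel-economy ratings and lifetime mileage below are my own assumptions, so treat the result as illustrative rather than authoritative.

```python
# Break-even gas price for a hybrid's purchase premium: the price per
# gallon at which lifetime fuel savings equal the extra up-front cost.

PREMIUM = 22_000 - 18_000          # Prius vs. Camry sticker prices ($), per the article
MPG_CAMRY, MPG_PRIUS = 28, 46      # assumed combined fuel-economy ratings
LIFETIME_MILES = 125_000           # assumed miles driven over the ownership period

# Gallons saved = miles driven times the difference in gallons-per-mile.
gallons_saved = LIFETIME_MILES * (1 / MPG_CAMRY - 1 / MPG_PRIUS)
breakeven = PREMIUM / gallons_saved
print(f"break-even gas price: ${breakeven:.2f}/gal")
```

With these assumptions the break-even price lands around $2.29 per gallon, close to the article's $2.28; nudging the assumed mileage or mpg figures moves it up or down accordingly.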
Economists have called for incentives to force conservation, such as increasing gas taxes to promote moves to more efficient cars or providing subsidies for installing solar water heaters. But when these incentives actually work, they can deplete tax revenue streams, creating a disincentive for the state to continue the incentive. And increased taxes can be unpopular, which is why Oregon is now considering alternatives to a gas tax.
"Seamus Heaney published his first collection when he was 27, he won the Nobel Prize when he was 56 and his 12th book of poetry came out this spring. He talks to James Campbell about growing up on a farm in County Derry, politics and his current project, inspired by a 15th-century Scots poet."
From The Guardian:
In 1977, Seamus Heaney visited Hugh MacDiarmid at his home in the Scottish borders, when the great poet and controversialist was in the final phase of life. MacDiarmid had been overlooked by the curators of English literature: compiling the Oxford Book of Twentieth-Century English Verse, Philip Larkin asked a friend if there was "any bit of MacD that's noticeably less morally repugnant and aesthetically null than the rest?" Heaney, who has always felt at home with Scots vernacular, takes a different line. "I always said that when I met MacDiarmid, I had met a great poet who said 'Och'. I felt confirmed. You can draw a line from maybe Dundalk across England, north of which you say 'Och', south of which you say 'Well, dearie me'. In that monosyllable, there's a world view, nearly."
In a literary career that spans 40 years, Heaney's appointed subject matter has been largely extra-curricular: Irish nationalism, "Orange Drums", the sod and silage of his father's 45-acre farm at Mossbawn, County Derry. In 1999, he took the Anglo-Saxon poem Beowulf and hammered it into a weathered English, which sold in astounding quantities and won the Whitbread Book of the Year Award. However, it is "the twang of the Scottish tongue", audible throughout his Derry childhood, particularly "over the Bann in County Antrim", that has given him his current project, a modern English account of the work of the 15th-century Scottish makar Robert Henryson.