WikiLeaks: Saudis Often Warned U.S. About Oil Speculators

Kevin G. Hall in McClatchy:

When oil prices hit a record $147 a barrel in July 2008, the Bush administration leaned on Saudi Arabia to pump more crude in hopes that a flood of new crude would drive the price down. The Saudis complied, but not before warning that oil already was plentiful and that Wall Street speculation, not a shortage of oil, was driving up prices.

Saudi Oil Minister Ali al Naimi even told U.S. Ambassador Ford Fraker that the kingdom would have difficulty finding customers for the additional crude, according to an account laid out in a confidential State Department cable dated Sept. 28, 2008.

“Saudi Arabia can't just put crude out on the market,” the cable quotes Naimi as saying. Instead, Naimi suggested, “speculators bore significant responsibility for the sharp increase in oil prices in the last few years,” according to the cable.

What role Wall Street investors play in the high cost of oil is a hotly debated topic in Washington. Despite weak demand, the price of a barrel of crude oil surged more than 25 percent in the past year, reaching a peak of $113 May 2 before falling back to a range of $95 to $100 a barrel.

The Obama administration, the Bush administration before it and Congress have been slow to take steps to rein in speculators. On Tuesday, the Commodity Futures Trading Commission, a U.S. regulatory agency, charged a group of financial firms with manipulating the price of oil in 2008. But the commission hasn't enacted a proposal to limit the percentage of oil contracts a financial company can hold, while Congress remains focused primarily on big oil companies, threatening in hearings last week to eliminate their tax breaks because of the $38 billion in first-quarter profits the top six U.S. companies earned.

The Saudis, however, have struck a steady theme for years that something should be done to curb the influence of banks and hedge funds that are speculating on the price of oil, according to diplomatic cables made available to McClatchy by the WikiLeaks website.

The Politics of the Null Hypothesis

Stephanie Zvan in Scientific American:

In late April, Dr. Angela Lee Duckworth and her team published a study demonstrating that some of the variability in IQ test results–and in the life outcomes known to be correlated with IQ scores–is explained, significantly and substantially, by how motivated the test subject was. As the author herself points out in the paper, this is a fairly humdrum result. Those who developed IQ testing predicted that this would happen:

Despite efforts to “encourage in order that every one may do his best” on intelligence tests (ref. 41, p. 122), pioneers in intelligence testing took seriously the possibility that test takers might not, in fact, exert maximal effort. Thorndike, for instance, pointed out that although “all our measurements assume that the individual in question tries as hard as he can to make as high a score as possible . . . we rarely know the relation of any person’s effort to his maximum possible effort” (ref. 42, p. 228). Likewise, Wechsler recognized that intelligence is not all that intelligence tests test: “from 30% to 50% of the total factorial variance [in intelligence test scores remains] unaccounted for . . .this residual variance is largely contributed by such factors as drive, energy, impulsiveness, etc. . . .” (ref. 9, p. 444).

Yet this study that should be eliciting simple head nods was published in PNAS and is generating a fair amount of buzz. Ed Yong covers it nicely, emphasizing both underlying ability and motivation as factors in test results and educational and employment outcomes. ScienceNOW reports the findings, and Maria Konnikova of Artful Choice notes that motivation is a factor over which society has a certain amount of control.

The study is also receiving less positive notices. Steve Sailer at VDARE says the study tells us nothing new because IQ tests are still predictive, despite the researchers' determination that a model that includes motivation predicts life outcomes better than one that doesn't. StatSquatch runs a separate analysis taking out some of the data, but declines to submit the analysis as a peer-reviewed comment on the paper. And at EconLog, Bryan Caplan also visits the motivation factor:

For example, instead of saying, “IQ tests show that people are poor because they're less intelligent – and intelligence is hard to durably raise” we should say, “IQ tests show that people are poor because they're less intelligent and less motivated – and intelligence and motivation are hard to durably raise.” If, like me, you already believed in the Conscientiousness-poverty connection, that's no surprise.

The interesting thing about the disparity in views on this “non-controversial” study is how the views are divided. The straightforward reporting comes from science sites. The criticisms and assertions that the results are meaningless come from a linked group of political blogs. VDARE is an anti-immigration site; EconLog is an economics blog. StatSquatch is perhaps most easily defined by the rate at which those on the blogroll perspire over “political correctness.”
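The statistical question underneath the dispute, whether a model that includes motivation predicts life outcomes better than one that does not, is a standard exercise in incremental validity: fit the outcome on IQ alone, then on IQ plus motivation, and compare the variance explained. Here is a minimal, purely synthetic sketch of that comparison; the data, effect sizes, and variable names are invented for illustration and have nothing to do with Duckworth's actual dataset.

```python
# Synthetic illustration of incremental validity: does adding a
# "motivation" measure improve prediction of an outcome beyond IQ alone?
# All numbers below are made up for the sake of the example.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
iq = rng.normal(100, 15, n)          # hypothetical IQ scores
motivation = rng.normal(0, 1, n)     # hypothetical motivation measure
# Assume the outcome depends on both predictors, plus noise.
outcome = 0.03 * iq + 0.5 * motivation + rng.normal(0, 1, n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print("IQ only:          R^2 =", round(r_squared(iq[:, None], outcome), 3))
print("IQ + motivation:  R^2 =", round(r_squared(np.column_stack([iq, motivation]), outcome), 3))
```

The disagreement above is not about arithmetic of this kind, which anyone can reproduce; it is about whether the increment in explained variance found in real data is large enough to change how IQ results should be interpreted.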

I Really Mean Like

Michael Wood on The Complete Works of W.H. Auden: Prose Vol. IV, 1956-62 (edited by Edward Mendelson):

In a poem from the early 1960s, ‘On the Circuit’, W.H. Auden describes himself as ‘a sulky fifty-six’, who finds ‘A change of meal-time utter hell’, and has ‘Grown far too crotchety to like/A luxury hotel’. There is plenty of self-parody in this picture – a little later in the poem he identifies his worry about where the next drink is coming from as ‘grahamgreeneish’ – but this was a time when Auden was rearranging his sense of himself and of his world. Comedy was one sort of arrangement, and an important feature of his view of life; but he was seriously ‘unsettled’, as Edward Mendelson says, and had acquired ‘a profound new sense of menace and dread’.

He had become professor of poetry at Oxford in 1956, although he was still mainly living in New York, and in 1958 he had shifted his summer residence from Ischia to a small town near Vienna, taking leave thereby, he said, of all kinds of fantasies he now felt too old for. Mendelson wonders whether one of Auden’s reasons for moving to Austria, although ‘perhaps too deep to have been conscious’, might have been ‘his wish to live in a culture that … could not escape from its awareness of its own guilt’. This is a plausible thought and, even if not true psychologically, would still work as the kind of parable that Auden, in his prose even more than in his poetry, teaches us how to read. Like other northerners, he had, he suggests in the poem ‘Good-Bye to the Mezzogiorno’, brought questions about change to an unchanging Italian place, ‘hoping to twig from/What we are not what we might be next’, and he needed to take off before the south became a habit:

If we try
To ‘go southern’, we spoil in no time, we grow
Flabby, dingily lecherous, and
Forget to pay bills

Still, he will ‘go grateful’, he says, glad

To bless this region, its vendanges, and those
Who call it home: though one cannot always
Remember exactly why one has been happy,
There is no forgetting that one was.

It is part of the same change that he now wishes to pay homage not to ‘Provocative Aphrodite’ or ‘Virago Artemis’, for all their powers over the world of nature and desire, but to a quieter, more discriminating classical figure, Clio, the Muse of History. In the world of those major goddesses it is ‘As though no one dies in particular/And gossip were never true’. Clio by contrast is

Muse of Time, but for whose merciful silence
Only the first step would count and that
Would always be murder.

She is to ‘forgive our noises/And teach us our recollections’, and Auden reminds us that poetry has no special place in her attention:

I dare not ask you if you bless the poets,
For you do not look as if you ever read them,
Nor can I see a reason why you should.

This last stanza offers us a bit of that humility that Auden was often tempted to overdo, but it also chimes with a recurring trope in modern literature in English. Marianne Moore says of poetry that she too dislikes it; Eliot tells us that it doesn’t matter; Auden says it makes nothing happen. In fact, none of these propositions represents anything like the whole story for any of these poets, but there’s an element of affectation here all the same, an unseemly wooing of the philistine.

bad future

Wolf Gang certainly aren’t “progressive” rappers like Common or Lupe Fiasco, and Odd Future shouldn’t hold their breath for a White House invite, but in highlighting generational conflict they are the most political popular musicians working. Tyler’s music is a radical critique, justifiably blaming his elders for the murderous voices in the back of his generation’s head. The papering over of America’s social antagonisms frays at its young edge, where the contradictions are still apparent. Odd Future conveys the artistic threat that the masks might fall and the plan to produce another generation of citizens could spin out of control. The sound of youth is not the bells of dawn but a death rattle. As Tyler puts it in the song “Radicals” (also, in singular form, the name of the group’s first mixtape): “Fuck the fat lady/It’s over when all the kids sing.” In this potentiality, the amphetamines doctors prescribed to keep students still produce “the Ritalin regimen/Double-S shit/Swastikas on the letterman.” Teachers who claim to be “here for you” become the Schmittian enemy in “We are us/They are them/Kill them/All.” At a time when America’s confidence in its future has reached polled lows, when, for the first time, fewer than 50 percent of Americans think the next generation will be better off, no one has figured out how to tell the kids. We measure our pessimism in terms of their lives and imagine they won’t notice. But teenagers don’t wait to have the world explained to them. It’s easier to write about what Odd Future is than what it says, and it’s easier to diagnose a group of teenage rappers than the society that produced them, but Wolf Gang’s art pulls back the curtain and in doing so forces the issue: there’s hard work to be done.

more from Malcolm Harris at The New Inquiry here.

The Last Temptation of Science

There are those whose noses wrinkle whenever they catch a whiff of allegory in the air. Edgar Allan Poe, in his 1847 review “Tale Writing — Nathaniel Hawthorne,” quips that the best success a writer of allegory can hope for is to accomplish a feat that is not worth doing in the first place. “There is scarcely one respectable word to be said” in its defense. Allegory is obtrusively didactic, Poe elaborates, and thus it disturbs the equilibrium, essential to well-made fiction, between the narrative surface and the thematic depths: meaning should be an undercurrent of subtle force, and allegory redirects it to the surface, where it overwhelms the life-giving illusion of the story. One suspects that, though Poe does not indict the story specifically, he has “Rappaccini’s Daughter” very much in mind as he lights into Hawthorne’s use of allegory. “Rappaccini’s Daughter,” published in 1844, is the tale of a beautiful maiden confined to her father’s house and garden in long-ago Italy, and of the handsome young man who espies her from his window, falls prey to her enchantment, and with only the best intentions brings about her death. The garden is explicitly likened to Eden, though a malign fallen version thereof; the maiden’s father is an eminent doctor, explicitly likened to Adam, who has cultivated plants of unexampled deadliness to be used for medicinal purposes, and to fortify his daughter against the world’s various cruelties. The story has a texture of heightened allusiveness that bristles with meaning, inviting the reader with sensitive feelers to reconsider the wisdom not only of Genesis but of Dante, Milton, Ovid, Spenser, Machiavelli, and the modern scientific project. Hawthorne takes on erotic mysteries, scientific aspirations, venerable religious wisdom — and he composes about as richly literary a short story as any American writer has ever produced.

more from Algis Valiunas at The New Atlantis here.

magpie and chameleon

Bowie has worn his dilettantism proudly and, through his dabblings, created some of the greatest music of the pop era. He is blessed with one of the most versatile voices. His talent for mimicry, coupled with a willingness to adapt his vocal approach to the song at hand, sets him apart from the competition: you could never tell which David Bowie would be singing. He has always been the bravest among his otherwise simply successful contemporaries, and, particularly in collaboration with Brian Eno, took rock music places it had never meant to go. He has namedropped Nietzsche here and there, and his attitude towards self-renewal has always been that of the Übermensch or the “homo superior”, words he relished singing in “Oh! You Pretty Things” on Hunky Dory in 1971. Bowie has always been at his best when he leaves himself open to chance, starting from scratch: his fearless mixing of genre, his willingness to enter the studio with no material (as he did for his masterpiece Station to Station, 1976), his constantly re-invented recording techniques (for example, instructing guitar players to play a song without ever having heard it, then keeping their first take), his embrace of Eno’s “oblique strategies” for “The Berlin Trilogy” (Low, 1977, Heroes, 1977, Lodger, 1979, only one of which was actually made in Berlin). He fails when his instincts desert him, when he tries to recreate consciously what he does so well unconsciously. Bowie knows all this: it just leaves him in the uncomfortable position of having endlessly to rehearse for unpreparedness.

more from Wesley Stace at the TLS here.

Our data, ourselves

From The Boston Globe:

If you’re obsessive about your health, and you have $100 to spare, the Fitbit is a portable tracking device you can wear on your wrist that logs, in real time, how many calories you’ve burned, how far you’ve walked, how many steps you’ve taken, and how many hours you’ve slept. It generates colorful graphs that chart your lifestyle and lets you measure yourself against other users. Essentially, the Fitbit is a machine that turns your physical life into a precise, analyzable stream of data.

If this sounds appealing — if you’re the kind of person who finds something seductive about the idea of leaving a thick plume of data in your wake as you go about your daily business — you’ll be glad to know that it’s happening to you regardless of whether you own a fancy pedometer. Even if this thought terrifies you, there’s not much you can do: As most of us know by now, we’re all leaving a trail of data behind us, generating 0s and 1s in someone’s ledger every time we look something up online, make a phone call, go to the doctor, pay our taxes, or buy groceries.
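As a hedged illustration of what “a precise, analyzable stream of data” amounts to in practice, here is a toy sketch; the field names and numbers are hypothetical and do not reflect Fitbit’s actual data format or API.

```python
# A few days of hypothetical activity records, one dict per day.
week = [
    {"date": "2011-05-23", "steps": 9120,  "km": 6.8, "calories": 2310, "sleep_hours": 6.5},
    {"date": "2011-05-24", "steps": 4480,  "km": 3.3, "calories": 2050, "sleep_hours": 8.0},
    {"date": "2011-05-25", "steps": 12750, "km": 9.5, "calories": 2590, "sleep_hours": 7.2},
]

# Once daily life is recorded this way, summarizing it is a one-liner per question.
avg_steps = sum(day["steps"] for day in week) / len(week)
most_active = max(week, key=lambda day: day["steps"])
print(f"average steps: {avg_steps:.0f}; most active day: {most_active['date']}")
```

The same structure, multiplied across the phone calls, purchases, and searches described above, is what makes that trail of data so easy for someone else to analyze.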

More here.

ScienceShot: You Are Here

From Science:

Astronomers have produced the most complete 3D map of the nearby universe to date. Using telescopes in both hemispheres, they measured distances to a whopping 45,000 galaxies out to a distance of 380 million light-years—for the astronomy buffs, a redshift of 0.07. Unlike the famous Sloan Digital Sky Survey, which mapped only part of the sky, the new 2MASS Redshift Survey covers 95% of surrounding space, skipping over only the region near the plane of our own galaxy, where the Milky Way's stars and dust block the view of remote objects. In the map, color codes for distance: purple dots are nearby galaxies; red dots are distant ones.
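For readers wondering how a redshift translates into a distance at all: at low redshift the two are tied together, to a good approximation, by Hubble's law. A hedged back-of-the-envelope version, using an assumed round value of the Hubble constant rather than anything quoted in the article:

$$ d \;\approx\; \frac{cz}{H_0}, \qquad \text{e.g. } z = 0.01 \;\Rightarrow\; d \approx \frac{(3\times 10^{5}\ \mathrm{km/s})(0.01)}{70\ \mathrm{km/s/Mpc}} \approx 43\ \mathrm{Mpc} \approx 140\ \text{million light-years}. $$

The survey's own quoted distances rest on more careful calibration than this linear approximation, so the numbers above are illustrative only.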

More here.

Physics and the Immortality of the Soul

Sean Carroll in Scientific American:

The topic of “life after death” raises disreputable connotations of past-life regression and haunted houses, but there are a large number of people in the world who believe in some form of persistence of the individual soul after life ends. Clearly this is an important question, one of the most important ones we can possibly think of in terms of relevance to human life. If science has something to say about it, we should all be interested in hearing.

Adam Frank thinks that science has nothing to say about it. He advocates being “firmly agnostic” on the question. (His coblogger Alva Noë resolutely disagrees.) I have an enormous respect for Adam; he's a smart guy and a careful thinker. When we disagree it's with the kind of respectful dialogue that should be a model for disagreeing with non-crazy people. But here he couldn't be more wrong.

Adam claims that there “simply is no controlled, experimental[ly] verifiable information” regarding life after death. By these standards, there is no controlled, experimentally verifiable information regarding whether the Moon is made of green cheese. Sure, we can take spectra of light reflecting from the Moon, and even send astronauts up there and bring samples back for analysis. But that's only scratching the surface, as it were. What if the Moon is almost all green cheese, but is covered with a layer of dust a few meters thick? Can you really say that you know this isn't true? Until you have actually examined every single cubic centimeter of the Moon's interior, you don't really have experimentally verifiable information, do you? So maybe agnosticism on the green-cheese issue is warranted. (Come up with all the information we actually do have about the Moon; I promise you I can fit it into the green-cheese hypothesis.)

More here.

Could Conjoined Twins Share a Mind?

Susan Dominus in The NYT Magazine:

Twins joined at the head — the medical term is craniopagus — are one in 2.5 million, of which only a fraction survive. The way the girls’ brains formed beneath the surface of their fused skulls, however, makes them beyond rare: their neural anatomy is unique, at least in the annals of recorded scientific literature. Their brain images reveal what looks like an attenuated line stretching between the two organs, a piece of anatomy their neurosurgeon, Douglas Cochrane of British Columbia Children’s Hospital, has called a thalamic bridge, because he believes it links the thalamus of one girl to the thalamus of her sister. The thalamus is a kind of switchboard, a two-lobed organ that filters most sensory input and has long been thought to be essential in the neural loops that create consciousness. Because the thalamus functions as a relay station, the girls’ doctors believe it is entirely possible that the sensory input that one girl receives could somehow cross that bridge into the brain of the other. One girl drinks, another girl feels it.

What actually happens in moments like the one I witnessed is, at this point, theoretical guesswork of the most fascinating order. No controlled studies have been done; because the girls are so young and because of the challenges involved in studying two conjoined heads, all the advanced imaging technology available has not yet been applied to their brains. Brain imaging is inscrutable enough that numerous neuroscientists, after seeing only one image of hundreds, were reluctant to confirm the specific neuroanatomy that Cochrane described; but many were inclined to believe, based on that one image, that the brains were most likely connected by a live wire that could allow for some connection of a nature previously unknown. A mere glimpse of that attenuated line between the two brains reduced accomplished neurologists to sputtering incredulities. “OMG!!” Todd Feinberg, a professor of clinical psychiatry and neurology at Albert Einstein College of Medicine, wrote in an e-mail. “Absolutely fantastic. Unbelievable. Unprecedented as far as I know.” A neuroscientist in Kelowna, a city in British Columbia near Vernon, described their case as “ridiculously compelling.” Juliette Hukin, their pediatric neurologist at BC Children’s Hospital, who sees them about once a year, described their brain structure as “mind-blowing.”

Capital Controls or Protectionism

Hector R. Torres in Project Syndicate:

[O]ne of the main lessons of the crisis is that accumulating reserves shelters an economy from imported crises, thereby permitting governments to implement counter-cyclical policies. This is true, but, in an integrated world economy, it assumes that export-led growth is still an option.

In an environment of high liquidity, in which Latin American countries are far less successful than China in fending off capital inflows, advising them to raise real interest rates can only lure more short-term capital, compounding appreciation pressures. Nobody should be surprised to see trade tensions.

So, what should be done?

In an ideal world, liquidity creation should be regulated internationally, and the coherence of domestic exchange-rate policies ensured. But this is far from today’s real-world situation, so we need to aim for second-best solutions.

Capital controls (regulations and “prudential measures”) could help to curb appreciation pressures. Admittedly, they are not watertight and could eventually be sidestepped, but they are far better than what might follow if they prove ineffective. If capital controls do not work, governments may feel tempted to provide protection to “their” domestic industries by imposing trade restrictions.

The IMF has recently accepted that controlling capital inflows could be appropriate under certain circumstances.

Being Human

S.J. Fowler interviews A.C. Grayling over at 3:AM Magazine:

3:AM: In a meaningful sense, your atheism seems to refute the idea that atheism is a philosophical necessity that results in pessimism. To what extent must atheism and the fragility of human nature be taken as given for us to begin legitimately philosophising?

ACG: As has been well said, atheism is to religious belief what not collecting stamps is to stamp collecting. If instead of ‘atheism’ you use the word ‘afairyism’ or some such, to illustrate the fact that there is no real subject matter in play (whereas ‘religion’ – a man-made phenomenon that has been a massive presence in history – is a different matter) you see that all that talk of ‘atheism’ does is to close down certain absurdities that get in the way of doing metaphysics and ethics properly. Whereas talk of ‘religion’ requires us to address the questions of the place of religious voices in the public square; this is where secularism becomes important.

3:AM: Can it be said that if we are not overarchingly religious, nor taken with the project of self-improvement and personal responsibility, then we are inhabiting an age of ambivalence rather than nihilism or religiosity? Now it seems the question of meaning is not answered yes or no, but not asked at all, especially in the young. Do you think consumerism, isolation and distraction have taken the place of any stringent belief?

ACG: Given half an invitation to reflect philosophically on the value and direction of life, people quickly begin to do so. (The religions do not want people to think philosophically, because then they begin to question the one-size-fits-all pieties that the religions sell.) The ‘distractions’ of entertainment, consumerism and co have more of a point in them than we sometimes acknowledge, because fun, pleasure, beauty and recreation are significant aspects of experience. But they don’t entirely stop people thinking about questions of value, for human lives also have sorrow and loss in them, and difficult choices, and periods of depression, all of which remind people of the task of thinking and choosing, which is inescapable. Philosophy can provide materials and suggestions here, and encouragement to think; that is or should be one of its principal gifts.

deaf: an ethnicity?

The newly published The People of the Eye sets out to define the Deaf-World and to fight for it. Where Deaf activists have spent decades arguing that deafness is not a defect but a character trait — a benefit even — The People of the Eye goes a step further. It asserts that Deaf is an ethnicity. An ethnicity like all officially classed ethnicities, to be given its due, politically and culturally. Authors Harlan Lane, Richard C. Pillard, and Ulf Hedberg write that, although Deaf identity is based not on religion, race, or class, “there is no more authentic expression of an ethnic group than its language.” Language is the core of American Deaf life. The important characteristic that distinguishes deafness from other conditions classed as disabilities is that deafness is a matter of communication. With the emergence of Deaf schools, literacy allowed Deaf people to better communicate in the hearing world. As ASL developed, Deaf Americans could better communicate with each other, and with this came the creation of a Deaf culture, even a new way of being. ASL signers say that they spend much more time thinking about and dealing with language than most Americans, resulting in a rich and independent tradition of Deaf language arts — literature, theater, journalism. Deaf people have their own clubs, their own rituals, their own places of worship, their own newspapers, their own sense of humor. The People of the Eye discusses, too, how the fully embodied language of ASL and Deaf pride created a culture of storytelling in the Deaf-World, and how this storytelling developed a unique narrative structure based on the particularities of ASL.

more from Stefany Anne Golberg at The Smart Set here.

buster

More than fifty years have passed since critics rediscovered Buster Keaton and pronounced him the most “modern” silent film clown, a title he hasn’t shaken since. In his own day he was certainly famous but never commanded the wealth or popularity of Charlie Chaplin or Harold Lloyd, and he suffered most when talkies arrived. It may be that later stars like Cary Grant and Paul Newman and Harrison Ford have made us more susceptible to Keaton’s model of offhand stoicism than his own audiences were. Seeking for his ghost is a fruitless business, though; for one thing, film comedy today has swung back toward the sappy, blatant slapstick that Keaton disdained. There’s some “irony” in what Judd Apatow and Adam Sandler do, but it’s irony that clamors to win the identification of the supposedly browbeaten everyman in every audience. Keaton took your average everyman and showed how majestically alone he was. The story of his life seems in its twists and dives borrowed from his movies, survival demanding a pure lack of sentiment. There were twenty years of child stardom in vaudeville and nearly a decade making popular silent movies, followed by alcoholism, a nasty divorce, a nastier second marriage, twenty years producing a few dreadful blockbusters for MGM followed by a long series of low-budget flops, and a third lasting marriage, until his silent work was unearthed and brought him renewed recognition. “What you have to do is create a character,” he once said. “Then the character just does his best, and there’s your comedy. No begging.” He embodied this attitude so entirely in his silent films that you can’t watch him without feeling won over, a partisan of the nonpartisan side.

more from Jana Prikryl at the NYRB here.

wikipedia, a UNESCO site?

Boasting more than 18 million entries in 279 languages, Wikipedia is arguably the largest store of human knowledge in the history of mankind. In its first decade, the digital encyclopedia has done more to challenge the way we think about the relationship between knowledge and the Internet than virtually any other website. But is this ubiquitous tree of knowledge as culturally sacred as the pyramids of Giza, the archaeological site of Troy, or the Native American mound cities of Cahokia? Jimmy Wales, co-founder of Wikipedia, thinks so. Spurred on by a German chapter of the Wikimedia Foundation, the digital encyclopedia will launch a petition this week to have the website listed on the UN Educational, Scientific, and Cultural Organization’s World Heritage List. If accepted, Wikipedia would be afforded the international protection and preservation given to man-made monuments and natural wonders. The first digital entity to vie for recognition as a cultural treasure, Wikipedia argues that the site meets the first and foremost of UNESCO’s criteria: “to represent a masterpiece of human creative genius.”

more from Jared Keller at The Atlantic here.