This is just brilliant!
David Bromwich in the New York Review of Books:
Being president of the world has sometimes seemed a job more agreeable to Barack Obama than being president of the United States. The Cairo speech of June 2009 was his first performance in that role, and he said many things surprising to hear from an American leader—among them, the statement that “it is time for [Israeli] settlements to stop.” But as is now widely understood, the aftermath of Cairo was not properly planned for. Though Obama had called on Benjamin Netanyahu to halt the expansion of settlements, he never backed his demand with a specific sanction or the threat of a loss of favor. His contact with peaceful dissidents in the Arab world remained invisible and was clearly not a major concern of his foreign policy. Soon after the Cairo speech, the Afghan war and drone attacks in the Pakistani tribal regions took center stage.
Yet Obama has always preferred the symbolic authority of the grand utterance to the actual authority of a directed policy—a policy fought for in particulars, carefully sustained, and traceable to his own intentions. The command to kill or capture Osama bin Laden and the attempt to assassinate Anwar al-Awlaki in a drone strike, which closely followed the bin Laden success, are the exceptions that prove the rule: actions of a moment, decided and triggered by the president alone. His new Middle East speech, at the State Department on May 19, was in this sense a return to a favorite genre.
Before an international audience, Obama tends to speak as if he were the United States addressing the world; and he treats the United States as the most grown-up country in the world. This posture carries a risk of parental finger-wagging, which our president—still young as a parent and young as a leader—doesn’t sufficiently guard against.
Saving God and Surviving Death: Mark Johnston has gone for the double, and I’m tempted to think he has succeeded, on his own terms, many of which seem about as good as terms get in this strange part of the park. I don’t, however, agree with his reasons or share his motive for attempting to explain how we can survive death, and I doubt the necessity of some of the matériel in his admittedly fabulous argumentative armamentarium. I’ll be jiggered if I survive death on Johnston’s terms; I don’t know whether he holds out much hope for himself. And his success won’t please anyone who believes in anything supernatural. Any conception of God as essentially a supernatural being is idolatry in Johnston’s book. All regular adherents of the Abrahamic religions – Judaism, Islam and Christianity – are therefore idolaters. And they go further: they want a ‘personal’ God, a ‘Cosmic Intervener who might confer special worldly advantages on his favourites’. They should be ashamed of themselves, at least if they’ve had any education; they’re moral babies. Here Johnston seems close to Iris Murdoch, who asserted that there is no ‘responsive superthou’. It’s this kind of conception of God that moves Thomas Nagel to say: ‘It isn’t just that I don’t believe in God … It’s that I hope there is no God! I don’t want there to be a God; I don’t want the universe to be like that.’ In Murdoch and Nagel I think we find the genuine spiritual impulse or religious temperament, which never invests in supernatural entities.
more from Galen Strawson at the LRB here.
From The Paris Review:
When the 104-year-old copper heiress Huguette Clark died earlier this week, obituaries invariably included the word eccentric. This was surely due at least somewhat to her apparent preference for making her home in hospitals. But part of it—the bigger part, I’m guessing—was her passion for dollhouses. In her later years, Clark retreated into an expensive miniature world, surrounding herself with large amounts of the tiny. Second childhood? God complex? Arrested development? Maybe. But Clark wasn’t alone. Miniatures have exerted a fascination over adults—and often, rich and powerful adults—since Duke Albrecht V forced large portions of a sixteenth-century court into the construction of what’s known as the “Munich Baby House.” Queen Mary’s Windsor Castle fantasia—furnished and outfitted by practically every artisan with a royal appointment—is famous; less well known is the elaborate dollhouse for which Alice Roosevelt Longworth frequently neglected guests, or the modern-art masterpiece created in the 1920s by the bohemian Stettheimer sisters.
…But miniaturists—the people, the hobby, the history—deserve more than to be dismissed as an easy metaphor. It’s a fascinating world that continues to capture people—whether they admit to it or not. Wrote one confidante of Clark in later life, “She just wanted to be home and play with her dolls.”
Researchers (and some cat-owners) wanted to know: What do feral and free-roaming house cats do when they're out of sight? A two-year study offers a first look at the daily lives of these feline paupers and princes, whose territories overlap on the urban, suburban, rural and agricultural edges of many towns.
…As expected, in most cases the un-owned cats had larger territories than the pet cats and were more active throughout the year. But the size of some of the feral cats' home ranges surprised even the researchers. One of the feral cats, a mixed breed male, had a home range of 547 hectares (1,351 acres), the largest range of those tracked. Like most of the feral cats, this lone ranger was seen in both urban and rural sites, from residential and campus lawns to agricultural fields, forests and a restored prairie. “That particular male cat was not getting food from humans, to my knowledge, but somehow it survived out there amidst coyotes and foxes,” Horn said. “It crossed every street in the area where it was trapped. (It navigated) stoplights, parking lots. We found it denning under a softball field during a game.” The owned cats had significantly smaller territories and tended to stay close to home. The mean home range for pet cats in the study was less than two hectares (4.9 acres).
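The area figures in the study report pair metric and US units; a minimal Python sketch can verify that the quoted conversions line up, assuming the standard factor of 1 hectare = 2.4710538 acres:

```python
# Sanity-check the hectare-to-acre conversions quoted in the
# cat-tracking study (1 hectare = 2.4710538 acres, the standard factor).

ACRES_PER_HECTARE = 2.4710538

def hectares_to_acres(hectares):
    """Convert an area in hectares to acres."""
    return hectares * ACRES_PER_HECTARE

# Feral male's home range: 547 hectares is roughly 1,351 acres.
print(f"547 ha = {hectares_to_acres(547):,.1f} acres")
# Mean pet-cat home range: just under 2 hectares is about 4.9 acres.
print(f"2 ha = {hectares_to_acres(2):.1f} acres")
```

Both printed values agree with the figures given in the report.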
The Death of the Hired Man
Mary sat musing on the lamp-flame at the table
Waiting for Warren. When she heard his step,
She ran on tip-toe down the darkened passage
To meet him in the doorway with the news
And put him on his guard. “Silas is back.”
She pushed him outward with her through the door
And shut it after her. “Be kind,” she said.
She took the market things from Warren’s arms
And set them on the porch, then drew him down
To sit beside her on the wooden steps.
“When was I ever anything but kind to him?
But I’ll not have the fellow back,” he said.
“I told him so last haying, didn’t I?
‘If he left then,’ I said, ‘that ended it.’
What good is he? Who else will harbour him
At his age for the little he can do?
What help he is there’s no depending on.
Off he goes always when I need him most.
‘He thinks he ought to earn a little pay,
Enough at least to buy tobacco with,
So he won’t have to beg and be beholden.’
‘All right,’ I say, ‘I can’t afford to pay
Any fixed wages, though I wish I could.’
‘Someone else can.’ ‘Then someone else will have to.’
I shouldn’t mind his bettering himself
If that was what it was. You can be certain,
When he begins like that, there’s someone at him
Trying to coax him off with pocket-money,—
In haying time, when any help is scarce.
In winter he comes back to us. I’m done.”
“Sh! not so loud: he’ll hear you,” Mary said.
“I want him to: he’ll have to soon or late.”
Richard Gowan in The National:
It is not too much to say that Fukuyama had no choice but to write this book. Twenty years ago he seized the post-Cold War moment to raise the possibility of the “end of history” – the moment that liberal democracy trumped all other political systems. Versions of this idea informed the Clinton administration's efforts to draw ex-Communist states into a liberal world order and the Bush administration's democratisation agenda.
On some college campuses, it has been fashionable to suggest that Fukuyama's thesis led directly to America's misadventure in Iraq and the struggle to build a modern state in Afghanistan. This is piffle. Anyone who has read detailed accounts of the Bush team's debates over Afghanistan and Iraq will recognise that these campaigns were shaped by an initial post-9/11 panic, old-fashioned power politics and much Washingtonian infighting.
Yet, to his credit, Fukuyama has worried a good deal about why his country's efforts to transform the world have gone awry. He not only disowned the neoconservatives in a finely argued 2007 polemic, America at the Crossroads, but has written and edited a number of technical studies of development policy and nation-building. Even ardent admirers may have missed his article, “State-building in the Solomon Islands”, in the 2008 Pacific Economic Review, cited dutifully in The Origins of Political Order.
Although Fukuyama notes that this new work is partially inspired by a preoccupation with “the real-world problems of weak and failed states”, it is evidently his return to the big picture. The book is Fukuyama's attempt to address those “real-world problems” by grappling with the sociological and philosophical flaws of long-defunct societies.
This is a remarkably old-fashioned project. In tracing the highways and byways of human development, Fukuyama appears far more interested in probing the classics of political philosophy and sociology than current development theory.
Many of the five books you’ve chosen will be discoveries to our readers, but one will be familiar to all. When Annie Hall moved out of Alvy’s apartment, they fought over who owned The Catcher in the Rye. When did you first read it and what did it mean to you?
The Catcher in the Rye has always had special meaning for me because I read it when I was young – 18 or so. It resonated with my fantasies about Manhattan, the Upper East Side and New York City in general.
It was such a relief from the other books I was reading at the time, which all had a quality of homework to them. For me, reading Middlemarch or Sentimental Education was work, whereas reading The Catcher in the Rye was pure pleasure. The burden of entertainment is on the author. Salinger fulfils that obligation from the first sentence on.
Reading and pleasure didn’t go together for me when I was younger. Reading was something you did for school, something you did for obligation, something you did if you wanted to take out a certain kind of woman. It wasn’t something I did for fun. But The Catcher in the Rye was different. It was amusing, it was in my vernacular, and the atmosphere held great emotional resonance for me. I reread it on a few occasions and I always get a kick out of it.
At least until you created your familiar film persona, Holden Caulfield was the icon of American angst. Did you identify with him?
Not in any deep way.
Salinger’s protagonist is driven mad by the ugliness in life. What drives you nuts?
The human predicament: the fact that we’re living in a nightmare that everyone is making excuses for and having to find ways to sugarcoat. And the fact that life, at its best, is a pretty horrible proposition. But people’s behavior makes it much, much worse than it has to be.
Kevin G. Hall in McClatchy:
When oil prices hit a record $147 a barrel in July 2008, the Bush administration leaned on Saudi Arabia to pump more crude in hopes that a flood of new crude would drive the price down. The Saudis complied, but not before warning that oil already was plentiful and that Wall Street speculation, not a shortage of oil, was driving up prices.
Saudi Oil Minister Ali al Naimi even told U.S. Ambassador Ford Fraker that the kingdom would have difficulty finding customers for the additional crude, according to an account laid out in a confidential State Department cable dated Sept. 28, 2008.
“Saudi Arabia can't just put crude out on the market,” the cable quotes Naimi as saying. Instead, Naimi suggested, “speculators bore significant responsibility for the sharp increase in oil prices in the last few years,” according to the cable.
What role Wall Street investors play in the high cost of oil is a hotly debated topic in Washington. Despite weak demand, the price of a barrel of crude oil surged more than 25 percent in the past year, reaching a peak of $113 May 2 before falling back to a range of $95 to $100 a barrel.
The Obama administration, the Bush administration before it and Congress have been slow to take steps to rein in speculators. On Tuesday, the Commodity Futures Trading Commission, a U.S. regulatory agency, charged a group of financial firms with manipulating the price of oil in 2008. But the commission hasn't enacted a proposal to limit the percentage of oil contracts a financial company can hold, while Congress remains focused primarily on big oil companies, threatening in hearings last week to eliminate their tax breaks because of the $38 billion in first-quarter profits the top six U.S. companies earned.
The Saudis, however, have struck a steady theme for years that something should be done to curb the influence of banks and hedge funds that are speculating on the price of oil, according to diplomatic cables made available to McClatchy by the WikiLeaks website.
Stephanie Zvan in Scientific American:
In late April, Dr. Angela Lee Duckworth and her team published a study demonstrating that some of the variability in IQ test results–and in the life outcomes known to be correlated with IQ scores–varied significantly and substantially as a function of how motivated the test subject was. As the author herself points out in the paper, this is a fairly humdrum result. Those who developed IQ testing predicted that this would happen:
Despite efforts to “encourage in order that every one may do his best” on intelligence tests (ref. 41, p. 122), pioneers in intelligence testing took seriously the possibility that test takers might not, in fact, exert maximal effort. Thorndike, for instance, pointed out that although “all our measurements assume that the individual in question tries as hard as he can to make as high a score as possible . . . we rarely know the relation of any person’s effort to his maximum possible effort” (ref. 42, p. 228). Likewise, Wechsler recognized that intelligence is not all that intelligence tests test: “from 30% to 50% of the total factorial variance [in intelligence test scores remains] unaccounted for . . .this residual variance is largely contributed by such factors as drive, energy, impulsiveness, etc. . . .” (ref. 9, p. 444).
Yet this study that should be eliciting simple head nods was published in PNAS and is generating a fair amount of buzz. Ed Yong covers it nicely, emphasizing both underlying ability and motivation as factors in test results and educational and employment outcomes. ScienceNOW reports the findings, and Maria Konnikova of Artful Choice notes that motivation is a factor over which society has a certain amount of control.
The study is also receiving less positive notices. Steve Sailer at VDARE says the study tells us nothing new because IQ tests are still predictive, despite the researchers' determination that a model that includes motivation predicts life outcomes better than one that doesn't. StatSquatch runs a separate analysis taking out some of the data, but declines to submit the analysis as a peer-reviewed comment on the paper. And at EconLog, Bryan Caplan also visits the motivation factor:
For example, instead of saying, “IQ tests show that people are poor because they're less intelligent – and intelligence is hard to durably raise” we should say, “IQ tests show that people are poor because they're less intelligent and less motivated – and intelligence and motivation are hard to durably raise.” If, like me, you already believed in the Conscientiousness-poverty connection, that's no surprise.
The interesting thing about the disparity in views on this “non-controversial” study is how the views are divided. The straightforward reporting comes from science sites. The criticisms and assertions that the results are meaningless come from a linked group of political blogs. VDARE is an anti-immigration site; EconLog is an economics blog. StatSquatch is perhaps most easily defined by the rate at which those on the blogroll perspire over “political correctness.”
Michael Wood on The Complete Works of W.H. Auden: Prose Vol. IV, 1956-62 (edited by Edward Mendelson):
In a poem from the early 1960s, ‘On the Circuit’, W.H. Auden describes himself as ‘a sulky fifty-six’, who finds ‘A change of meal-time utter hell’, and has ‘Grown far too crotchety to like/A luxury hotel’. There is plenty of self-parody in this picture – a little later in the poem he identifies his worry about where the next drink is coming from as ‘grahamgreeneish’ – but this was a time when Auden was rearranging his sense of himself and of his world. Comedy was one sort of arrangement, and an important feature of his view of life; but he was seriously ‘unsettled’, as Edward Mendelson says, and had acquired ‘a profound new sense of menace and dread’.
He had become professor of poetry at Oxford in 1956, although he was still mainly living in New York, and in 1958 he had shifted his summer residence from Ischia to a small town near Vienna, taking leave thereby, he said, of all kinds of fantasies he now felt too old for. Mendelson wonders whether one of Auden’s reasons for moving to Austria, although ‘perhaps too deep to have been conscious’, might have been ‘his wish to live in a culture that … could not escape from its awareness of its own guilt’. This is a plausible thought and, even if not true psychologically, would still work as the kind of parable that Auden, in his prose even more than in his poetry, teaches us how to read. Like other northerners, he had, he suggests in the poem ‘Good-Bye to the Mezzogiorno’, brought questions about change to an unchanging Italian place, ‘hoping to twig from/What we are not what we might be next’, and he needed to take off before the south became a habit:
If we try
To ‘go southern’, we spoil in no time, we grow
Flabby, dingily lecherous, and
Forget to pay bills
Still, he will ‘go grateful’, he says, glad
To bless this region, its vendages, and those
Who call it home: though one cannot always
Remember exactly why one has been happy,
There is no forgetting that one was.
It is part of the same change that he now wishes to pay homage not to ‘Provocative Aphrodite’ or ‘Virago Artemis’, for all their powers over the world of nature and desire, but to a quieter, more discriminating classical figure, Clio, the Muse of History. In the world of those major goddesses it is ‘As though no one dies in particular/And gossip were never true’. Clio by contrast is
Muse of Time, but for whose merciful silence
Only the first step would count and that
Would always be murder.
She is to ‘forgive our noises/And teach us our recollections’, and Auden reminds us that poetry has no special place in her attention:
I dare not ask you if you bless the poets,
For you do not look as if you ever read them,
Nor can I see a reason why you should.
This last stanza offers us a bit of that humility that Auden was often tempted to overdo, but it also chimes with a recurring trope in modern literature in English. Marianne Moore says of poetry that she too dislikes it; Eliot tells us that it doesn’t matter; Auden says it makes nothing happen. In fact, none of these propositions represents anything like the whole story for any of these poets, but there’s an element of affectation here all the same, an unseemly wooing of the philistine.
Wolf Gang certainly aren’t “progressive” rappers like Common or Lupe Fiasco, and Odd Future shouldn’t hold their breath for a White House invite, but in highlighting generational conflict they are the most political popular musicians working. Tyler’s music is a radical critique, justifiably blaming his elders for the murderous voices in the back of his generation’s head. The papering over of America’s social antagonisms frays at its young edge, where the contradictions are still apparent. Odd Future conveys the artistic threat that the masks might fall and the plan to produce another generation of citizens could spin out of control. The sound of youth is not the bells of dawn but a death rattle. As Tyler puts it in the song “Radicals” (also in singular form the name of the group’s first mixtape): “Fuck the fat lady/It’s over when all the kids sing.” In this potentiality, the amphetamines doctors prescribed to keep students still produce “the Ritalin regimen/Double-S shit/Swastikas on the letterman.” Teachers who claim to be “here for you” become the Schmittian enemy in “We are us/They are them/Kill them/All.” At a time when America’s confidence in its future has reached polled lows, when, for the first time, fewer than 50 percent of Americans think the next generation will be better off, no one has figured out how to tell the kids. We measure our pessimism in terms of their lives and imagine they won’t notice. But teenagers don’t wait to have the world explained to them. It’s easier to write about what Odd Future is than what it says, and it’s easier to diagnose a group of teenage rappers than the society that produced them, but Wolf Gang’s art pulls back the curtain and in doing so forces the issue: there’s hard work to be done.
more from Malcolm Harris at The New Inquiry here.
There are those whose noses wrinkle whenever they catch a whiff of allegory in the air. Edgar Allan Poe, in his 1847 review “Tale Writing — Nathaniel Hawthorne,” quips that the best success a writer of allegory can hope for is to accomplish a feat that is not worth doing in the first place. “There is scarcely one respectable word to be said” in its defense. Allegory is obtrusively didactic, Poe elaborates, and thus it disturbs the equilibrium, essential to well-made fiction, between the narrative surface and the thematic depths: meaning should be an undercurrent of subtle force, and allegory redirects it to the surface, where it overwhelms the life-giving illusion of the story. One suspects that, though Poe does not indict the story specifically, he has “Rappaccini’s Daughter” very much in mind as he lights into Hawthorne’s use of allegory. “Rappaccini’s Daughter,” published in 1844, is the tale of a beautiful maiden confined to her father’s house and garden in long-ago Italy, and of the handsome young man who espies her from his window, falls prey to her enchantment, and with only the best intentions brings about her death. The garden is explicitly likened to Eden, though a malign fallen version thereof; the maiden’s father is an eminent doctor, explicitly likened to Adam, who has cultivated plants of unexampled deadliness to be used for medicinal purposes, and to fortify his daughter against the world’s various cruelties. The story has a texture of heightened allusiveness that bristles with meaning, inviting the reader with sensitive feelers to reconsider the wisdom not only of Genesis but of Dante, Milton, Ovid, Spenser, Machiavelli, and the modern scientific project. Hawthorne takes on erotic mysteries, scientific aspirations, venerable religious wisdom — and he composes about as richly literary a short story as any American writer has ever produced.
more from Algis Valiunas at The New Atlantis here.
Bowie has worn his dilettantism proudly and, through his dabblings, created some of the greatest music of the pop era. He is blessed with one of the most versatile voices. His talent for mimicry, coupled with a willingness to adapt his vocal approach to the song at hand, sets him apart from the competition: you could never tell which David Bowie would be singing. He has always been the bravest among his otherwise simply successful contemporaries, and, particularly in collaboration with Brian Eno, took rock music places it had never meant to go. He has namedropped Nietzsche here and there, and his attitude towards self-renewal has always been that of the Übermensch or the “homo superior”, words he relished singing in “Oh! You Pretty Things” on Hunky Dory in 1971. Bowie has always been at his best when he leaves himself open to chance, starting from scratch: his fearless mixing of genre, his willingness to enter the studio with no material (as he did for his masterpiece Station to Station, 1976), his constantly re-invented recording techniques (for example, instructing guitar players to play a song without ever having heard it, then keeping their first take), his embrace of Eno’s “oblique strategies” for “The Berlin Trilogy” (Low, 1977, Heroes, 1977, Lodger, 1979, only one of which was actually made in Berlin). He fails when his instincts desert him, when he tries to recreate consciously what he does so well unconsciously. Bowie knows all this: it just leaves him in the uncomfortable position of having endlessly to rehearse for unpreparedness.
more from Wesley Stace at the TLS here.
From The Boston Globe:
If you’re obsessive about your health, and you have $100 to spare, the Fitbit is a portable tracking device you can wear on your wrist that logs, in real time, how many calories you’ve burned, how far you’ve walked, how many steps you’ve taken, and how many hours you’ve slept. It generates colorful graphs that chart your lifestyle and lets you measure yourself against other users. Essentially, the Fitbit is a machine that turns your physical life into a precise, analyzable stream of data.
If this sounds appealing — if you’re the kind of person who finds something seductive about the idea of leaving a thick plume of data in your wake as you go about your daily business — you’ll be glad to know that it’s happening to you regardless of whether you own a fancy pedometer. Even if this thought terrifies you, there’s not much you can do: As most of us know by now, we’re all leaving a trail of data behind us, generating 0s and 1s in someone’s ledger every time we look something up online, make a phone call, go to the doctor, pay our taxes, or buy groceries.
Astronomers have produced the most complete 3D map of the nearby universe to date. Using telescopes in both hemispheres, they measured distances to a whopping 45,000 galaxies out to a distance of 380 million light-years—for the astronomy buffs, a redshift of 0.07. Unlike the famous Sloan Digital Sky Survey, which mapped only part of the sky, the new 2MASS Redshift Survey covers 95% of surrounding space, skipping over only the region near the plane of our own galaxy, where the Milky Way's stars and dust block the view of remote objects. In the map, color codes for distance: purple dots are nearby galaxies; red dots are distant ones.
Sean Carroll in Scientific American:
The topic of “life after death” raises disreputable connotations of past-life regression and haunted houses, but there are a large number of people in the world who believe in some form of persistence of the individual soul after life ends. Clearly this is an important question, one of the most important ones we can possibly think of in terms of relevance to human life. If science has something to say about it, we should all be interested in hearing.
Adam Frank thinks that science has nothing to say about it. He advocates being “firmly agnostic” on the question. (His coblogger Alva Noë resolutely disagrees.) I have an enormous respect for Adam; he's a smart guy and a careful thinker. When we disagree it's with the kind of respectful dialogue that should be a model for disagreeing with non-crazy people. But here he couldn't be more wrong.
Adam claims that there “simply is no controlled, experimental[ly] verifiable information” regarding life after death. By these standards, there is no controlled, experimentally verifiable information regarding whether the Moon is made of green cheese. Sure, we can take spectra of light reflecting from the Moon, and even send astronauts up there and bring samples back for analysis. But that's only scratching the surface, as it were. What if the Moon is almost all green cheese, but is covered with a layer of dust a few meters thick? Can you really say that you know this isn't true? Until you have actually examined every single cubic centimeter of the Moon's interior, you don't really have experimentally verifiable information, do you? So maybe agnosticism on the green-cheese issue is warranted. (Come up with all the information we actually do have about the Moon; I promise you I can fit it into the green-cheese hypothesis.)
Susan Dominus in The NYT Magazine:
Twins joined at the head — the medical term is craniopagus — are one in 2.5 million, of which only a fraction survive. The way the girls’ brains formed beneath the surface of their fused skulls, however, makes them beyond rare: their neural anatomy is unique, at least in the annals of recorded scientific literature. Their brain images reveal what looks like an attenuated line stretching between the two organs, a piece of anatomy their neurosurgeon, Douglas Cochrane of British Columbia Children’s Hospital, has called a thalamic bridge, because he believes it links the thalamus of one girl to the thalamus of her sister. The thalamus is a kind of switchboard, a two-lobed organ that filters most sensory input and has long been thought to be essential in the neural loops that create consciousness. Because the thalamus functions as a relay station, the girls’ doctors believe it is entirely possible that the sensory input that one girl receives could somehow cross that bridge into the brain of the other. One girl drinks, another girl feels it.
What actually happens in moments like the one I witnessed is, at this point, theoretical guesswork of the most fascinating order. No controlled studies have been done; because the girls are so young and because of the challenges involved in studying two conjoined heads, all the advanced imaging technology available has not yet been applied to their brains. Brain imaging is inscrutable enough that numerous neuroscientists, after seeing only one image of hundreds, were reluctant to confirm the specific neuroanatomy that Cochrane described; but many were inclined to believe, based on that one image, that the brains were most likely connected by a live wire that could allow for some connection of a nature previously unknown. A mere glimpse of that attenuated line between the two brains reduced accomplished neurologists to sputtering incredulities. “OMG!!” Todd Feinberg, a professor of clinical psychiatry and neurology at Albert Einstein College of Medicine, wrote in an e-mail. “Absolutely fantastic. Unbelievable. Unprecedented as far as I know.” A neuroscientist in Kelowna, a city in British Columbia near Vernon, described their case as “ridiculously compelling.” Juliette Hukin, their pediatric neurologist at BC Children’s Hospital, who sees them about once a year, described their brain structure as “mind-blowing.”
Hector R. Torres in Project Syndicate:
[O]ne of the main lessons of the crisis is that accumulating reserves shelters an economy from imported crises, thereby permitting governments to implement counter-cyclical policies. This is true, but, in an integrated world economy, it assumes that export-led growth is still an option.
In an environment of high liquidity, in which Latin American countries are far less successful than China in fending off capital inflows, advising them to raise real interest rates can only lure more short-term capital, compounding appreciation pressures. Nobody should be surprised to see trade tensions.
So, what should be done?
In an ideal world, liquidity creation should be regulated internationally, and the coherence of domestic exchange-rate policies ensured. But this is far from today’s real-world situation, so we need to aim for second-best solutions.
Capital controls (regulations and “prudential measures”) could help to curb appreciation pressures. Admittedly, they are not watertight and could eventually be sidestepped, but they are far better than what might follow if they prove ineffective. If capital controls do not work, governments may feel tempted to provide protection to “their” domestic industries by imposing trade restrictions.
The IMF has recently accepted that controlling capital inflows could be appropriate under certain circumstances.
S.J. Fowler interviews A.C. Grayling over at 3:AM Magazine:
3:AM: In a meaningful sense, your atheism seems to refute the idea that atheism is a philosophical necessity that results in pessimism. To what extent must atheism and the fragility of human nature be taken as given for us to begin legitimately philosophising?
ACG: As has been well said, atheism is to religious belief what not collecting stamps is to stamp collecting. If instead of ‘atheism’ you use the word ‘afairyism’ or some such, to illustrate the fact that there is no real subject matter in play (whereas ‘religion’ – a man-made phenomenon that has been a massive presence in history – is a different matter) you see that all that talk of ‘atheism’ does is to close down certain absurdities that get in the way of doing metaphysics and ethics properly. Whereas talk of ‘religion’ requires us to address the questions of the place of religious voices in the public square; this is where secularism becomes important.
3:AM: Can it be said that if we are not overarchingly religious, nor taken with the project of self-improvement and personal responsibility, then we are inhabiting an age of ambivalence rather than nihilism or religiosity? Now it seems the question of meaning is not answered yes or no, but not asked at all, especially among the young. Do you think consumerism, isolation and distraction have taken the place of any stringent belief?
ACG: Given half an invitation to reflect philosophically on the value and direction of life, people quickly begin to do so. (The religions do not want people to think philosophically, because then they begin to question the one-size-fits-all pieties that the religions sell.) The ‘distractions’ of entertainment, consumerism and co have more of a point in them than we sometimes acknowledge, because fun, pleasure, beauty and recreation are significant aspects of experience. But they don’t entirely stop people thinking about questions of value, for human lives also have sorrow and loss in them, and difficult choices, and periods of depression, all of which remind people of the task of thinking and choosing, which is inescapable. Philosophy can provide materials and suggestions here, and encouragement to think; that is or should be one of its principal gifts.