June 30, 2012
The Perfect Listen
Marc Hirsh on one of his music listening rituals, over at NPR:
But for all the romanticizing of the first time we hear an album or a song, that's almost never the moment of its crucial impact. That's not really how music works, not if it can actually hold up beyond that first listen. Unlike books, movies or plays (and television, to a lesser extent), recorded music is consumed repetitively. It's usually anywhere between the second and fifth listen that fragments that maybe weren't evident on first glance suddenly come at you or your brain makes a connection that could only have been made indirectly. That's when a song starts to mean something to you.
Of course, there's something to be said about hearing a song and instantly connecting to it; that experience is just as valid as any, and it's certainly happened to me countless times. But that's precisely an experience, a one-off. The songs that are important to us are more like objects or possessions. They aren't bound by any one moment but instead continue to exist as time trundles ahead.
Super-Dreams of an Alternate World Order
Manohla Dargis and A. O. Scott discuss the meaning of superhero movies, in the NYT:
MANOHLA DARGIS On one level the allure of comic book movies is obvious, because, among other attractions, they tap into deeply rooted national myths, including that of American Eden (Superman’s Smallville); the Western hero (who’s separate from the world and also its savior); and American exceptionalism (that this country is different from all others because of its mission to make “the world safe for democracy,” as Woodrow Wilson and, I believe, Iron Man, both put it). Both Depression babies, Superman and Batman, were initially hard-boiled types, and it’s worth remembering that the DC in DC Comics was for Detective Comics. Since then the suits have largely remained the same even as the figures wearing them have changed with their times. Every age has the superhero it wants, needs or deserves.
Comic book movies are also fun (except when they’re not) and often easy viewing (except when they make your head hurt). They’re also blunt: A guy in a unitard pummels another guy — pow! — and saves the day, the girl and the studio. I like some comic-book movies very much, dislike others. But as a film lover I am frustrated by how the current system of flooding theaters with the same handful of titles limits my choices. (According to boxofficemojo.com “The Avengers” opened on 4,349 screens in the United States and Canada, close to 1 in 10.) The success of these movies also shores up a false market rationale that’s used to justify blockbusters in general: that is, these movies make money, therefore people like them; people like them, therefore these movies are made.
SCOTT And yet these stories do have some appeal, beyond the familiarity of the characters and the relentlessness of the marketing campaigns. As you suggest, they strike mythic, archetypal chords, and cater to a persistent hunger for large-scale, accessible narratives of good and evil.
It’s telling that Hollywood placed a big bet on superheroes at a time when two of its traditional heroic genres — the western and the war movie — were in eclipse, partly because they seemed ideologically out of kilter with the times.
A President Speaks Out on Immigration
No doubt you have been disappointed in some of us. Some of us are very disappointing. No doubt you have found that justice in the United States goes only with a pure heart and a right purpose, as it does everywhere else in the world. No doubt what you have found here did not seem touched for you, after all, with the complete beauty of the ideal which you had conceived beforehand. But remember this: If we had grown at all poor in the ideal, you brought some of it with you. … And if some of us have forgotten what America believed in, you, at any rate, imported in your own hearts a renewal of the belief. That is the reason that I, for one, make you welcome. … You dreamed dreams of what America was to be, and I hope you brought the dreams with you. No man that does not see visions will ever realize any high hope or undertake any high enterprise. Just because you brought dreams with you, America is more likely to realize dreams such as you brought. You are enriching us if you came expecting us to be better than we are.
—Woodrow Wilson, to 4,000 newly naturalized citizens, Philadelphia, May 10, 1915
Read the backstory by Patricia O'Toole at The American Scholar.
The Secret History of the Chief Justice’s Obamacare Decision
John Fabian Witt in Balkinization:
The story begins in 1933, when depression-fueled unemployment rates hit an all-time high of 25 percent. Progressive reformers, including Wisconsin’s influential husband-and-wife reformers Elizabeth and Paul Raushenbush, were desperately casting about for a constitutional basis for national unemployment insurance. Action at the state level was paralyzed because no one state seemed able to adopt an expensive insurance plan without driving employers into neighboring states. But action at the federal level seemed impossible, too, because the conservative Supreme Court seemed unlikely to allow the Congress to enact a comprehensive unemployment system as a regulation of interstate commerce.
That’s where Brandeis comes in. Elizabeth was the justice’s daughter, and when she and her husband visited with him in his summer cottage in Massachusetts, Brandeis suggested a novel solution to the constitutional dilemma: the tax power, he told them, would offer a constitutionally sound footing for the vast social insurance system they were contemplating.
Four years later, Brandeis was a decisive vote in the sharply divided 5-4 decision in Steward Machine Co. v. Davis, upholding the unemployment insurance provisions of the Social Security Act over the dissent of the four conservative justices, who were known collectively as the “Four Horsemen of the Apocalypse.” Brandeis’s tax theory had become the foundation of the new American social insurance state.
How To Be Alone
In 1934, Thomas Hart Benton, purveyor of muscular scenes of American life, was the country’s most famous painter and one of the very few ever to have his picture on the cover of Time. In 1949, Jackson Pollock, painter of abstract drips and swirls, appeared in a four-page spread in Life teasingly headlined “Is He the Greatest Living Painter in the United States?” Yes or no didn’t really matter: he was the nation’s new art star. What changed in the 15 years that separated the public elevation of these two artists and their radically different art? The world changed, for one thing, moving out of the Great Depression, through World War II, and into a bomb-haunted cold war. America changed from a mighty fortress to an outreaching global imperium. And American art, including Pollock’s, changed from illustrating provincial sagas to dramatizing universal myths.
More from Holland Cotter at the NY Times here.
Keep going, going on, call that going, call that on
Or again, imagine if the literary folk suddenly tired of it all, realized how unhelpful it all was; if the critics and academics wearied of untangling torment for a living (I see you haven’t got any better, Beckett’s old analyst responded after the author sent him a copy of Watt). Imagine if the publishers—let’s call them the Second Arrow Publishing Corporation—informed all their great authors, all the masters of the mercilessly talkative consciousness, that they are winding up their affairs; they have seen the light, they will no longer publish elaborations of tortured consciousness, lost love, frustrated ambition, however ingenious or witty. Imagine! All the great sufferers saved by Buddhism, declining the second arrow: quietness where there was Roth, serenity where there was McCarthy, well-being where there was David Foster Wallace? Do we want that? I suspect not.
More from Tim Parks at the NYRB here.
Marilynne Robinson, the Pulitzer-winning novelist, is a confounding writer in today's political alignment. Her new essay collection, "When I Was a Child I Read Books," is — despite the sentimentality of its title — fundamentally a leftist political manifesto and lament for America's loss of faith in government. Yet it grants a central argument of many religious conservatives — that America's virtues are indeed steeped in biblical thought. "When I Was A Child" is a broadside defense of literature and classical liberalism that demands we include the unfashionable Old Testament as a foundation of both. Through rigorous citation and deep personal reflection, Robinson builds an excellent case. New Atheists like Sam Harris and medieval nostalgists like Rick Santorum would each find occasions for garment-rending in this collection.
More from August Brown at the LA Times here.
Flat on the bank I parted
Rushes to ease my hands
In the water without a ripple
And tilt them slowly downstream
To where he lay, tendril-light,
In his fluid sensual dream.
Bodiless lord of creation,
I hung briefly above him
Savouring my own absence,
Senses expanding in the slow
Motion, the photographic calm
That grows before action.
As the curve of my hands
Swung under his body
He surged, with visible pleasure.
I was so preternaturally close
I could count every stipple
But still cast no shadow, until
The two palms crossed in a cage
Under the lightly pulsing gills.
Then (entering my own enlarged
Shape, which rode on the water)
I gripped. To this day I can
Taste his terror on my hands.
by John Montague
from Collected Poems
The Gallery Press, Oldcastle, 1995
Enjoying Natural Selection on Multiple Levels
Over at Rationally Speaking, Leonard Finkelman on the Richard Dawkins-E.O. Wilson debate about levels of natural selection:
The so-called “selfish gene” theory, technically known as gene selection, is an elaboration of work done by W.D. Hamilton and G.C. Williams on a phenomenon known as “kin selection.” Kin selection is predicated on the idea that the impulse I feel to care for my nephew is stronger than the impulse I feel to care for (say) my neighbor’s nephew. I know that my nephew is my sister’s son, and that my sister and I were born of the same parents; I therefore know that he carries 50% of my sister’s genetic alleles, and that there’s a 50% chance that any one of my sister’s alleles is one that I also carry. For any one of my nephew’s alleles, then, there’s a 25% chance that I also carry that allele. If I care for my nephew, then my genes have a one in four chance of helping themselves; if I care for my neighbor’s nephew, the odds are much, much lower. Gene selectionists therefore argue that genes are the individuals who benefit in the process of natural selection. Hence Dawkins’ famous claim that organisms are “gigantic lumbering robots” for carrying genes around: I have an impulse to care for my nephew because it helps (some of) my genes, even though it hurts me as a whole.
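The relatedness arithmetic in this excerpt (a 50% chance of sharing an allele with a sibling, 25% with a nephew) can be sketched in a few lines. This is only an illustration of the halving-per-generation logic the passage describes; the `relatedness` helper and its path-counting convention are our own, not anything from the article:

```python
# Each meiotic link (parent-to-child transmission) halves the probability
# that a given allele is shared by descent.
def relatedness(meiotic_links: int) -> float:
    """Probability an allele passes intact down a chain of links."""
    return 0.5 ** meiotic_links

# Full siblings are connected by two paths of two links each
# (me -> mother -> sister, me -> father -> sister): 2 * (1/2)^2 = 1/2.
sibling = 2 * relatedness(2)

# My nephew adds one more link (sister -> her son): (1/2) * (1/2) = 1/4.
nephew = sibling * relatedness(1)

print(sibling)  # 0.5
print(nephew)   # 0.25
```

So caring for a nephew gives "my" genes a one-in-four chance of helping copies of themselves, exactly the figure the excerpt uses.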
In 2010, E.O. Wilson and two collaborators wrote an article in Nature attacking the viability of kin selection. We won’t get into the details of their mathematical argument; the bottom line is that things rarely work out so neatly as “my nephew has half of my sister’s genetic alleles and she has half of mine,” and the complexities ultimately call into question the idea that gene selection can explain altruistic behavior. In his newest book and a recent New York Times “Stone” column (interestingly, a philosophy blog!), Wilson proposes an alternative that he calls “multi-level selection.” His account is so called because Wilson believes that nature sometimes selects genes, sometimes selects organisms, and sometimes selects groups—and that the latter option is the one that explains altruism. It was this claim that prompted Dawkins’ scathing review of Wilson’s book, linked in the first paragraph. Undermining the very foundation of Dawkins’ account of selection probably had something to do with it, too.
Science: A Call for Humility
Russell Stannard in The Huffington Post [h/t: Namit Arora]:
[E]ven if the M-theory hypothesis is correct, does it in fact answer the question of "Why is there something rather than nothing?" It would certainly account for the existence of the world. But would it not raise a fresh question: "Where did M-theory come from? What is responsible for its existence?"
This brings us up against what one suspects is a fundamental limitation of the scientific enterprise. The job of science is to describe the world we find ourselves in -- what it consists of, and how it operates. But it appears to fall short of explaining why we are presented with this kind of world rather than some other -- or why there should be a world at all.
Indeed, there is cause to wonder whether science even gets as far as describing the world. For instance, what is the world made of? One might answer in terms of the electrons, protons, and neutrons that make up atoms. But what are electrons, protons and neutrons? Quantum physics shows how they are observed to behave like waves as they move about. But on reaching their destination and giving up their energy and momentum they behave like tiny particles. But how can something be both a spread out wave with humps and troughs, and at the same time be a tiny localized particle? This is the famous wave/particle paradox. It afflicts everything, including light.
The solution given by the Danish physicist Niels Bohr was that one has to stop trying to explain what something, such as an electron, is. Instead, we are confined to explaining how something behaves in the context of a certain kind of observation being made on it -- whether we are observing it moving from one place to another (in which case the language of waves is appropriate), or alternatively observing it interacting on reaching its destination (requiring the language of particles).
Martha Nussbaum and the new religious intolerance
From The Guardian:
There's a popular student story about Martha Nussbaum giving a talk in a small living room of the Episcopal Church's chaplaincy centre on the leafy campus of the University of Chicago. As she was holding forth, a bird flew down the chimney and started to flutter around the room, bashing into the walls and generally panicking, as trapped birds do. The students were immediately busy opening windows and trying to shoo the poor creature to freedom. All their attention was taken up with the bird. But in the midst of all the excitement, Nussbaum didn't break her intellectual stride. She just carried on delivering the lecture as if nothing whatsoever was going on. She emanates detached academic cool – fully in command of herself and her material. From someone who has spent a distinguished academic career emphasising the riskiness and vulnerability of the human condition, all this slightly frosty control comes as something of a surprise.
Why, she once asked in a brilliant essay entitled "Love's Knowledge", do the gods of the ancient world often fall in love with human beings? Why would they prefer mortals to immortals? It is precisely because human beings are able to fail, she argues, that they are able to manifest so many attractive qualities. Take courage. What place can courage have in the world of immortal gods? How could an immortal god risk everything for another if their own welfare were always guaranteed in advance? And what sort of parent would an immortal parent be to an immortal child? Certainly not one that is up half the night worrying. Risk and vulnerability are intrinsic to being human. And that is what makes us attractive, sometimes heroic.
The Best of All Possible Worlds
From The New York Times:
Romano was a literary critic with The Philadelphia Inquirer for a quarter of a century and has also been a professor of philosophy. He presumably enjoyed this latter job, because he writes that today’s America is the best place to do philosophy that there has ever been, surpassing even the Athens of those ingenious and polite men Socrates, Plato and Aristotle. In one fit of enthusiastic chauvinism he goes yet further, and announces that it is the “perfectly designed environment” to ply his trade, as if no greater intellectual paradise could be imagined. This news will not provide much comfort to declinists who feel the political and economic hegemony of the United States to be fading fast. But perhaps it will help a little. Let deficits grow, good jobs disappear and China loom — hang it all, America will always have world-beating epistemology and metaphysics up its sleeve. Well, maybe that isn’t quite fair to Romano, because his claim depends on redefining the term “philosophy,” giving it a nebulous meaning that embraces far more than is taught under that name in universities. (More later about this revisionist wordplay.) Also, one part of his case is convincing, and oddly still worth making: America is not nearly so dumbed down as its detractors at home like to say.
“Idiot America: How Stupidity Became a Virtue in the Land of the Free,” “Unscientific America: How Scientific Illiteracy Threatens Our Future” and “The Age of American Unreason” are just three of the books from American writers in the past five years that belabor religious fundamentalism, conservative talk shows, scientific illiteracy or the many available flavors of junk food for thought. The fallacy of such books, as Romano argues, is that they take some rotten parts for the largely nutritious whole. It’s not so much that they compare American apples with foreign oranges, but that they fail to acknowledge that the United States is an enormous fruit bowl. Everything is to be found in it, usually in abundance, including a vibrant intellectual life. Rather like that of India — which has over a third of the planet’s illiterate adults but also one of the largest university systems in the world — the intellectual stature of America eludes simple generalizations.
June 29, 2012
'Having It All'? How About: 'Doing The Best I Can'?
Andrew Cohen weighing in as part of The Atlantic's continuing debate on work-life balance:
Anne-Marie Slaughter's remarkable article Why Women Still Can't Have It All clearly has meant different things to different people since it was published and posted. To me, first, it is further evidence of what I have come to believe after 46 years on this planet: most women are not just smarter than most men but braver and more aspirational, too. There is the noble, ancient striving to "have it all." And then there is the earnest and thought-provoking debate, largely between and among women if I am not mistaken, over exactly what that phrase means and whether the quest to achieve it is even worth it.
Men? Please. Such an earnest public conversation on this topic between and among men is impossible to imagine (no matter how hard The Atlantic tries). That's why so many of us diplomatically stayed on the sideline last week. And haven't men as a group largely given up hope of "having it all" anyway? Did we ever have such hope to begin with? I don't remember ever getting a memo on that. Without any statistics to back me up -- how typical of a man, right? -- I humbly suggest that a great many of us long ago decided in any event to focus upon lesser, more obtainable mottoes, like "doing the best I can" or "hanging in there," as we try to juggle work, family, and a life.
Read the rest here.
Jean-Jacques Rousseau, On his 300th
I was traveling yesterday, Rousseau's 300th, and did not get a chance to post this piece by Laurie Fendrich in The Chronicle of Higher Ed:
Today, June 28, is Jean-Jacques Rousseau’s 300th birthday. Although it’s hard to imagine philosophers as squalling newborns, in Rousseau’s case, it makes sense. His whole philosophy hinges on the idea that we humans are born good but, along the way of making civilization, we manage to destroy what’s good in ourselves. From the moment the umbilical cord is cut, Rousseau essentially says, we systematically obliterate our real nature, which is one of benevolent beings happily living a simple existence.
But for someone living in any complex society since the Industrial Revolution, Rousseau’s philosophy is not only difficult to believe (aren’t education, exposure to the arts, technological progress inarguably good things?), but inconvenient to practice—even in small instances, such as bringing up his ideas for discussion in a 21st-century college class. None of this has prevented me from loving Rousseau’s complex, contradictory, and exhilaratingly exasperating philosophy ever since first encountering it as a sophomore, in a college course in political philosophy.
Why would a young college student who was just discovering the solitary joys of painting pictures become obsessed with the one and only Enlightenment thinker who ferociously attacked the very value of art (and science as well)? And why would that young college student never manage to break with the almost ubiquitously maligned Rousseau, never manage to put him to the side and forget him? Or, if she was going to stay with him, why couldn’t she have found a way to concentrate on his sweeter side—the side expressed in, for example, his Reveries, where he walks in a “lonely meditation on nature”?
Addicted to Health Care
From Psychology Today:
What is missing from our health care debate—even as conducted by our most insightful and radical critics of the dysfunctional American health care system—is a recognition of what, underneath it all, drives the system. It is Americans' insatiable lust for health care. What Americans possess in overwhelming abundance is the urge to be treated for their maladies. Witness our massive formal addiction and mental health disease treatment and support system (as opposed to the informal community supports offered more readily around the world). And our most forward-thinking health care advocates can only imagine expanding this system exponentially (e.g., parity in health care coverage between physical and emotional illness).
American health care costs are driving America into the ground. These costs stand at from 2-3:1 compared with other nations (like the UK), and the chasm is widening since virtually all other nations have stabilized these costs, while we are only beginning to tackle the rate at which they increase. But Republicans can still run on simply resuming lock, stock and barrel the same old private care system, Americans in general dislike Obamacare, and Obamacare itself is built primarily around expanding coverage without controlling costs. This is because any effort to rein in such costs is met by accusations like "death panels" or "rationing," which immediately kills them like glassy-eyed dead fish floating on the surface of the stagnant pond that is our care system.
More from Stanton Peele here.
Soldiers Without Generals
WE CAN reasonably conclude that the verdict is not yet in on Egypt’s future. Popular empowerment has so far been a thorn in the side of those trying to destroy the revolution. And it is hard to imagine that the millions who have thrust themselves so decisively onto the center stage of their own history could be dismissed so easily. Romanticism aside, however, one must realize that revolution is an ugly business. Those with vested interests in authoritarian rule will not simply step aside under social pressure, nor will they wither away over time. Their total suppression and defeat is of essence to any true revolution. As long as Egyptians find this course distasteful—preferring instead conciliatory solutions and wishing that sporadic pressure from below along with clustering around the Muslim Brothers (as a revolutionary movement by proxy) can somehow convince the military and security elite to “do the right thing”—little can be done. And as long as revolutionaries cannot organize their ranks and encourage their fellow citizens to make difficult choices, take risks, and accept short-term instability, then there is little hope that the people themselves will be able to turn their gallant uprising into a complete revolution. Reflecting back on the Iranian case in The Making of the Islamic Revolution, Mohsen M. Milani rightly noted, “Theorizing about revolution sounds romantic, but winning it is no romantic enterprise. The verdict on those who refuse to treat revolution as a furious war has been unequivocally clear: oblivion or death… Revolutions are like wars.” And the key to winning wars is organization.
More from Hazem Kandil at Dissent here.
barthes in china
LATELY THE POSTHUMOUS CORPUS of Roland Barthes has been growing at a rate that rivals Tupac Shakur’s. (Can a hologram Barthes be far behind?) Recent years have witnessed the publication of lecture notes from his last seminars at the Collège de France (Preparation of the Novel) as well as the journals he kept following the death of his mother (Mourning Diary). The latest addition to his English catalogue is Travels in China, a translation of his notebooks from a three-week trip there in 1974 with a delegation from the French literary review Tel Quel. In France, the publication of Barthes’s private notebooks and journals (Carnets du voyage en Chine and Journal de deuil both appeared in 2009) spurred a round of contentious debate about the ethics of looting a dead writer’s archives. (Somewhere, no doubt, Max Brod is sighing with sympathy.) It’s not hard to attribute the spate of posthumous publications to the mercenary incentive to squeeze every last drop out of an author with any degree of fame. If we’re feeling a little more charitable, we might also see them as testaments to the desire for more of a distinctive voice and a singular intelligence. Each death of a major intellectual figure seems to prompt a flurry of new publications of old material, much of it scraps, all of it suggesting an inability to accept that no more words will issue from that pen, a kind of disbelief that the author is, at last, really and truly dead.
More from Dora Zhang at the LA Review of Books here.
Calvino professed to be fascinated by the world of adolescence – that in-between time, McLaughlin writes, where “a sense of failed initiation hangs over everything, a sense of thresholds not crossed”. The author regarded Into the War as a “polemic against the habitual image of adolescence in literature”, and all three stories attest to the potentially magical, transformative space of adolescence, however thwarted by the environment of war and Fascism. Calvino’s note to the trilogy points out that his “entry into life” and the Italian “entry into war” coincided. Throughout the book, the hyper-aware narrator senses the incoming storm, but he is too preoccupied with girls and peer pressure, too distracted by the circus atmosphere of Fascist politics, to confront this reality directly. After all, Calvino was no D’Annunzio, the Italian poet who led a group of Legionnaires in laying siege to the city of Fiume in the First World War; he was more the heir of Baudelaire, a flâneur thrust into a Fascist Youth uniform.
More from Joseph Luzzi at the TLS here.
The Manifest Destiny of Artificial Intelligence
Brian Hayes in American Scientist:
Artificial intelligence began with an ambitious research agenda: To endow machines with some of the traits we value most highly in ourselves—the faculty of reason, skill in solving problems, creativity, the capacity to learn from experience. Early results were promising. Computers were programmed to play checkers and chess, to prove theorems in geometry, to solve analogy puzzles from IQ tests, to recognize letters of the alphabet. Marvin Minsky, one of the pioneers, declared in 1961: “We are on the threshold of an era that will be strongly influenced, and quite possibly dominated, by intelligent problem-solving machines.”
Fifty years later, problem-solving machines are a familiar presence in daily life. Computer programs suggest the best route through cross-town traffic, recommend movies you might like to see, recognize faces in photographs, transcribe your voicemail messages and translate documents from one language to another. As for checkers and chess, computers are not merely good players; they are unbeatable. Even on the television quiz show Jeopardy, the best human contestants were trounced by a computer.
In spite of these achievements, the status of artificial intelligence remains unsettled. We have many clever gadgets, but it’s not at all clear they add up to a “thinking machine.” Their methods and inner mechanisms seem nothing like human mental processes. Perhaps we should not be bragging about how smart our machines have become; rather, we should marvel at how much those machines accomplish without any genuine intelligence.
More here. [Photo shows IBM's chess-playing computer Deep Blue.]
Obamacare Upheld: How and Why Did Justice Roberts Do It?
David Cole in The Nation:
What led Roberts to cast his lot with the law’s supporters? The argument that the taxing power supported the individual mandate was a strong one. The mandate provides that those who can afford to buy healthcare insurance must do so, but the only consequence of not doing so is the payment of a tax penalty. The Constitution gives Congress broad power to raise taxes “for the general welfare,” which means Congress need not point to some other enumerated power to justify a tax. (By contrast, if Congress seeks to regulate conduct by imposing criminal or civil sanctions, it must point to one of the Constitution’s affirmative grants of power—such as the Commerce Clause, the immigration power, or the power to raise and regulate the military.)
The law’s challengers—and the Court’s dissenters—rejected the characterization of the law as a tax. They noted that it was labeled a “penalty,” not a tax; that it was designed to encourage people to buy health insurance, not to raise revenue; and that Obama himself had rejected claims that the law was a tax when it was being considered by Congress. But Roberts said the question is a functional one, not a matter of labels. Because the law in fact would raise revenue, imposed no sanction other than a tax and was calculated and collected by the IRS as part of the income tax, the Court treated it as a tax and upheld the law.
Noam Chomsky & Tariq Ali speak with Julian Assange
Israel's New Politics and the Fate of Palestine
Akiva Eldar in The National Interest:
Israel never overtly spurned a two-state solution involving land partition and a Palestinian state. But it never acknowledged that West Bank developments had rendered such a solution impossible. Facing a default reality in which a one-state solution seemed the only option, Israel chose a third way—the continuation of the status quo. This unspoken strategic decision has dictated its polices and tactics for the past decade, simultaneously safeguarding political negotiations as a framework for the future and tightening Israel’s control over the West Bank. In essence, a “peace process” that allegedly is meant to bring the occupation to an end and achieve a two-state solution has become a mechanism to perpetuate the conflict and preserve the status quo.
This reality and its implications are best understood through a brief survey of the history that brought the Israelis and Palestinians to this impasse. The story is one of courage, sincere efforts, internal conflicts on both sides, persistent maneuvering and elements of folly.
Nora Ephron | 1941-2012: Writer and Filmmaker With a Genius for Humor
From The New York Times:
The producer Scott Rudin recalled that less than two weeks before her death, at Weill Cornell Medical College and New York-Presbyterian Hospital, he had a long phone session with her while she was undergoing treatment, going over notes for a pilot she was writing for a TV series about a bank compliance officer. Afterward she told him, “If I could just get a hairdresser in here, we could have a meeting.”
Ms. Ephron’s collection “I Remember Nothing” concludes with two lists, one of things she says she won’t miss and one of things she will. Among the “won’t miss” items are dry skin, Clarence Thomas, the sound of the vacuum cleaner, and panels on “Women in Film.” The other list, of the things she will miss, begins with “my kids” and “Nick” and ends this way:
“Taking a bath
Coming over the bridge to Manhattan
Rabbits kept alive by oxygen injections
Rabbits with blocked windpipes have been kept alive for up to 15 minutes without a single breath, after researchers injected oxygen-filled microparticles into the animals' blood. Oxygenating the blood by bypassing the lungs in this way could save the lives of people with impaired breathing or obstructed airways, says John Kheir, a cardiologist at the Children’s Hospital Boston in Massachusetts, who led the team. The results are published today in Science Translational Medicine.
The technique has the potential to prevent cardiac arrest and brain injury induced by oxygen deprivation, and to avoid cerebral palsy resulting from a compromised fetal blood supply. In the past, doctors have tried to treat low levels of oxygen in the blood, or hypoxaemia, and related conditions such as cyanosis, by injecting free oxygen gas directly into the bloodstream. They had varying degrees of success, says Kheir.
In the late nineteenth century, for example, US doctor John Harvey Kellogg experimented with oxygen enemas — an idea that has been revived in recent decades in the form of bowel infusers, says Mervyn Singer, an intensive-care specialist at University College London. But these methods can be dangerous, because the free oxygen gas can accumulate into larger bubbles and form potentially lethal blockages called pulmonary embolisms. Injecting oxygen in liquid form would avoid this, but the procedure would have to be done at dangerously low temperatures.
The microcapsules used by Kheir and his team get the best of both worlds: they consist of single-layer spherical shells of biological molecules called lipids, each surrounding a small bubble of oxygen gas. The gaseous oxygen is thus encapsulated and suspended in a liquid emulsion, so can't form larger bubbles. The particles are injected directly into the bloodstream, where they mingle with circulating red blood cells. The oxygen diffuses into the cells within seconds of contact, says Kheir. “By the time the microparticles get to the lungs, the vast majority of the oxygen has been transferred to the red blood cells,” he says. This distinguishes these microcapsules from the various forms of artificial blood currently in use, which can carry oxygen around the body, but must still receive it from the lungs.
Gary Snyder Starts Singing
When you suddenly started singing in the middle of your poetry reading
we were caught off guard.
It was as though, when we had crossed the Brooklyn Bridge and turned left,
suddenly we had seen the pampas spread out before us,
when actually we should have seen Wall Street.
The domestic poultry that are supposedly unable to fly in our country,
in the garden of your stanzas, begin ably flying about.
They sail across the planet’s sky in V-formation like wild geese,
as if to say, “We are completely fed up with strolling around
on the Gutenberg runway. From now on we are going to be free,
so please look after yourselves in future.”
When you sang, you yourself became a song.
With your feet rooted in the earth,
your body began to float off into air.
We, left behind, recalled the familiar old maxim,
‘A miracle is reality laid bare.’
But as you sing, you are whispering:
“My tongue which has been up to a lot of vulgar things
is also capable of such elegant things”.
by Inuo Taguchi
publisher: Poetry International, 2006
translation: William I. Elliott and Kazuo Kawamura
June 28, 2012
Research shows that everyone cheats
Dan Ariely in the Wall Street Journal:
We tend to think that people are either honest or dishonest. In the age of Bernie Madoff and Mark McGwire, James Frey and John Edwards, we like to believe that most people are virtuous, but a few bad apples spoil the bunch. If this were true, society might easily remedy its problems with cheating and dishonesty. Human-resources departments could screen for cheaters when hiring. Dishonest financial advisers or building contractors could be flagged quickly and shunned. Cheaters in sports and other arenas would be easy to spot before they rose to the tops of their professions.
But that is not how dishonesty works. Over the past decade or so, my colleagues and I have taken a close look at why people cheat, using a variety of experiments and looking at a panoply of unique data sets—from insurance claims to employment histories to the treatment records of doctors and dentists. What we have found, in a nutshell: Everybody has the capacity to be dishonest, and almost everybody cheats—just by a little. Except for a few outliers at the top and bottom, the behavior of almost everyone is driven by two opposing motivations. On the one hand, we want to benefit from cheating and get as much money and glory as possible; on the other hand, we want to view ourselves as honest, honorable people. Sadly, it is this kind of small-scale mass cheating, not the high-profile cases, that is most corrosive to society.
Nico Muhly’s many opinions on polygamists, opera and idiocy
David Patrick Stearns in Arts Journal:
When his opera Dark Sisters was premiered in New York City in November, many believed the ever prolific Muhly (yes, even more prolific than his longtime employer Philip Glass) had rushed through the composition of a chamber opera about Church of Latter-Day Saints splinter groups that practice polygamy in remote outposts of the southwestern United States. The disappointment extended beyond the critics and operagoers hearing it for the first time on opening night. There was much grumbling within the industry that problems clearly apparent in the workshop preceding the premiere hadn’t been addressed at all. Some of his fellow composers were secretly scathing.
Oh well. There was always the revival the following June at Philadelphia’s Kimmel Center, where the smallish, congenial Perelman Theater has come to be seen as one of the ideal chamber opera venues in the Northeast. Even then, Muhly, librettist Stephen Karam and director Rebecca Taichman declined to have another workshop. They were all busy and sensed that changes could be made in the few weeks of rehearsal prior to the Opera Company of Philadelphia opening.
And yet … Dark Sisters wasn’t just a hit with critics who were lukewarm first time around. The opera was a considerable popular success with audiences. Word of mouth was uniformly positive. Here was something fresh, challenging and new that wasn’t beyond the grasp of an average operagoer hearing it for the first time. And in Philadelphia – a place known to fear the cutting edge.
Cog and Turbine
3QD friend Najib Khan wrote and performed the theme song for this fun little animated short. If you can spare a couple of bucks to help them finish the film, please click here.
the end of the euro
Most of the current policy discussion concerning the euro area is about austerity. Some people – particularly in German government circles – are pushing for tighter fiscal policies in troubled countries (i.e., higher taxes and lower government spending). Others – including in the new French government — are more inclined to push for a more expansive fiscal policy where possible and to resist fiscal contraction elsewhere. The recently concluded G20 summit is being interpreted as shifting the balance away from the “austerity now” group, at least to some extent. But both sides of this debate are missing the important issue. As a result, the euro area continues its slide towards deeper crisis and likely eventual disruptive break-up.
The underlying problem in the euro area is the exchange rate system itself – the fact that these European countries locked themselves into an initial exchange rate, i.e., the relative price of their currencies, and promised to never change that exchange rate. This amounted to a very big bet that their economies would converge in productivity – that the Greeks (and others in what we now call the “periphery”) would in effect become more like the Germans.
More from Simon Johnson at The Baseline Scenario here.
The Night Wanderers
Joseph Kony could never have imagined it. Once an obscure warlord traipsing through the central African bush, he has been catapulted onto the leaderboard of global villains. Schoolchildren have been riveted by an internet video of his atrocities released by Invisible Children, a group of American activists. Their film has garnered more than 89 million hits on YouTube since March. Some viewers rallied behind their Kony2012 campaign to call for the Ugandan rebel's arrest by the year's end. Overnight, Kony has become the world's favourite bogeyman.
It is fortuitous then that the English translation of The Night Wanderers by Wojciech Jagielski, a veteran Polish journalist, has arrived at just the moment when ever larger numbers of people are curious to learn more about Kony, his child soldiers, and the conflict they spawned.
More from Matthew Green at Literary Review here.
joyce's detritus of reality
On a day in May, 1922, in Paris, a medical student named Pierre Mérigot de Treigny was asked by his teacher, Dr. Victor Morax, a well-known ophthalmologist, to attend to a patient who had telephoned complaining about pain from iritis, an inflammation of the eye. The student went to the patient’s apartment, in a residential hotel on the Rue de l’Université. Inside, he found a scene of disarray. Clothes were hanging everywhere; toilet articles were scattered around on chairs and the mantelpiece. A man wearing dark glasses and wrapped in a blanket was squatting in front of a pan that contained the remains of a chicken. A woman was sitting across from him. There was a half-empty bottle of wine next to them on the floor.
The man was James Joyce. A few months before, on February 2nd, he had published what some people regarded then, and many people regard now, as the greatest work of prose fiction ever written in the English language. The woman was Nora Barnacle. She and Joyce were unmarried, and had two teen-age children, Giorgio and Lucia, who were living with them in the two-room apartment.
More from Louis Menand at The New Yorker here.
Charisma: who has it, and how to get it
From The Telegraph:
Until I encountered Olivia Fox Cabane, whom US executives at firms like Google, Deloitte and Citigroup pay up to $100,000 a year to help boost their X-factor, I’d have naively believed charisma was an intangible, magical aura. The word comes from the Greek “gift”, befitting the notion that allure is something you’re born with, and can’t earn. It’s the “It” that differentiated Baroness Thatcher from John Major, George W Bush from John Kerry, Lady Gaga at the O2 from her hundreds of imitators performing to tiny audiences in bar back rooms. But, as Fox Cabane points out in her new book The Charisma Myth: How Anyone Can Master the Art and Science of Personal Magnetism, it was also the difference between Marilyn Monroe and her alter-ego Norma Jean Baker. In 1955, the film star rode the New York subway, unnoticed by her fellow passengers because, she explained, she had chosen to adopt “Baker” mode. But when she emerged onto the city pavements, she asked an accompanying journalist: “Do you want to see her?” She fluffed her hair, struck a pose.
Suddenly, onlookers reported, magic seemed to flow from her. “That shows that charisma isn’t innate, it can be controlled at will,” Fox Cabane says.
The Snarky Voice in Your Head
Via Alan Henry at Lifehacker:
There's a difference between being occasionally sarcastic or a little derisive in your head and letting negativity become your default reaction; when it does, you have a problem. You may have had a wake-up moment, much like Anna Holmes, founding editor of Jezebel, had when she realized she was sneering at someone for no reason other than that the person was happy. Here's what she said:
Just rolled my eyes at a woman skipping happily across 42nd Street. Then I realized I'M the asshole.
— Anna Holmes (@AnnaHolmes) May 31, 2012
How about a quick check. Do you:
- Roll your eyes at every "hipster" who, by most accounts, is just a person trying (successfully or not) to dress fashionably?
- Primarily complain about how horrible people/things are on Facebook/Twitter?
- Get angrier every passing moment that you stand in line at the grocery store, or have to wait for your check to arrive at a restaurant?
- Find you're constantly frustrated with coworkers who don't "get it?"
- Comment angrily on blogs, videos, and other web sites (usually beginning with "ummm" and ending with "just saying?")
- Feel like it's okay to be a complete jerk, as long as you're "witty" about it?
Sound familiar? You may have a problem.
Jonah Lehrer Just Does Not Know How To Do Journalism
To be read alongside this previous post and the additional links below it, more opinions on Jonah Lehrer from Gawker:
Yesterday we found out that Jonah Lehrer, the Gladwellesque whiz kid who's The New Yorker's newest staff writer, reused his own old writings for every goddamn blog post he's written for The New Yorker so far. A self-plagiarist, he is. Big time. What's the latest? He is an even bigger time plagiarist (self, and otherwise!) than we knew yesterday. And for it, he should probably be eased out of journalism's highest echelon.
The news yesterday set off a predictable wave of digging into Lehrer's past work, revealing that his penchant for reusing old material without disclosure was not limited to a few blog posts. Edward Champion found "twelve pages of lifted passages" in just the first 100 pages of Lehrer's recent book Imagine: How Creativity Works. Lehrer's January New Yorker article on brainstorming now has an editor's note disclosing that Lehrer took Noam Chomsky quotes from a story (not written by him) in Technology Review and inserted them into his story, making it appear as if he had spoken to Chomsky himself.
20 Things You Didn't Know About... Allergies
From Discover Magazine:
1 Our immune system may be like those small bands of Japanese “holdout” soldiers after World War II. Not knowing that the war was over, they hid for years, launching guerrilla attacks on peaceful villages.
2 With our living environment well scrubbed of germs, our body’s immune “soldiers” mistakenly fire on innocent peanuts and cat dander.
3 According to the National Institutes of Health, more than half of all Americans have one or more allergies.
4 The scariest allergy: penicillin, one of the most common causes of fatal anaphylaxis. The most disgusting allergy: cockroaches.
5 Most food allergies result from an immune response to a protein. In 2004 a team at Trinity College Dublin tried to counter that reaction by injecting mice with parasites, giving the animals’ immune systems the sort of threat they evolved to fight, thus distracting them from the food proteins.
6 The experiment worked.
This is not a real comment policy but one written up for amusement by Sean Carroll of Cosmic Variance in a moment of frustration with the rude behavior of a few commenters there. It does express the point of view of blog hosts rather well at times, though:
The best way to think about blog commenting has been formulated by Eugene Volokh: comment threads as cocktail parties. A good comment section is a cacophony of views, a bringing-together of different voices in the best possible way. But it is not a random collection of passers-by gathered in a public space to shout at each other. It’s a hosted space, with the bloggers as proprietors. This implies a minor form of social contract: the bloggers provide a common space for commenters to meet and converse, while commenters are expected to contribute positively and politely to the experience.
One goal of a good cocktail party is that you meet people you haven’t met before, and perhaps share an interesting conversation. But the guest list, and some broad expectations for personal behavior, are set by the organizers. Party crashers who are obnoxious, or disruptive, or even just deadly boring, may be asked to leave the party. Nobody has a right to attend whatever parties they like.
Consider, in terms of this analogy, the temptation to complain out loud about what the bloggers are choosing to blog about. That would be like showing up at a party, noticing that the only appetizers being passed around are spring rolls and bacon-wrapped dates, and proceeding to raise a ruckus about the absence of cocktail weenies.
No, come to think of it, it’s not like that. It’s like showing up at a party, noticing that spring rolls and bacon-wrapped dates are being passed around in addition to your beloved cocktail weenies, and loudly proclaiming how offended you are at the presence of such outré finger food at this event. You shouldn’t complain about the host’s taste in appetizers; if there’s nothing there you like, go to another party. And if there is, ignore the offerings you don’t like, and enjoy yourself some weenies. Delicious, delicious weenies.
More here. [See the post before this one for an example of demanding "cocktail weenies".]
June 27, 2012
RIP Nora Ephron, 1941 - 2012
The Girl Who Fixed The Umlaut
Salander opened the door a crack and spent several paragraphs trying to decide whether to let Blomkvist in. Many italic thoughts flew through her mind. Go away. Perhaps. So what. Etc.
“Please,” he said. “I must see you. The umlaut on my computer isn’t working.”
He was cradling an iBook in his arms. She looked at him. He looked at her. She looked at him. He looked at her. And then she did what she usually did when she had run out of italic thoughts: she shook her head.
“I can’t really go on without an umlaut,” he said. “We’re in Sweden.”
But where in Sweden were they? There was no way to know, especially if you’d never been to Sweden. A few chapters ago, for example, an unscrupulous agent from Swedish Intelligence had tailed Blomkvist by taking Stora Essingen and Gröndal into Södermalm, and then driving down Hornsgatan and across Bellmansgatan via Brännkyrkagatan, with a final left onto Tavastgatan. Who cared, but there it was, in black-and-white, taking up space. And now Blomkvist was standing in her doorway.
Addicts, Mythmakers, And Philosophers: Plato's/Socrates' Understanding of Habitually Bad Behavior
Alan Brody in Philosophy Now:
Thad held up his right hand and asked “See this?” He showed me gnarled and maimed fingers. Thad told me that while he was flying his plane into Turkey, the Turkish air force forced him to land, having gotten wind that he was running drugs. They jailed him, and in an attempt to extract a confession, his jailers broke his fingers. He didn’t confess.
Thad bribed his way out of jail. Eventually he came to the drug treatment center where I was working, to get help with his drinking problem. (Thad and other patient names are pseudonyms.) After discussing addiction as involving compulsive behavior, we concluded that Thad was suffering from alcoholism. Knowing he would be better off not drinking, Thad committed himself to abstinence. He told me that he didn’t need to go to Alcoholics Anonymous for support, explaining that if he could resist caving in from torture he could certainly resist whatever discomfort he would experience from not drinking. Thad thought that being able to follow through with his resolve was simply a matter of having the ability to resist succumbing to how bad it would feel to not drink.
When Thad came in for his next appointment he looked pained, shocked and confused. He told me that in spite of his decision to remain abstinent, he drank. It happened at the airport while he was waiting for his friend to arrive. Thad couldn’t understand how he would do such a thing, given his ability to handle pain when sticking to a resolution. I explained how a compulsive condition such as alcoholism can change how one evaluates what to do, so that someone who previously decided not to drink can come to temporarily think it’s okay to do so. After I explained how this kind of change of thought could produce a motive for drinking, Thad saw how his ability to endure suffering couldn’t be counted on to guarantee abstinence.
From Russ Rymer at National Geographic:
In an increasingly globalized, connected, homogenized age, languages spoken in remote places are no longer protected by national borders or natural boundaries from the languages that dominate world communication and commerce. The reach of Mandarin and English and Russian and Hindi and Spanish and Arabic extends seemingly to every hamlet, where they compete with Tuvan and Yanomami and Altaic in a house-to-house battle. Parents in tribal villages often encourage their children to move away from the insular language of their forebears and toward languages that will permit greater education and success.
Who can blame them? The arrival of television, with its glamorized global materialism, its luxury-consumption proselytizing, is even more irresistible. Prosperity, it seems, speaks English. One linguist, attempting to define what a language is, famously (and humorously) said that a language is a dialect with an army. He failed to note that some armies are better equipped than others. Today any language with a television station and a currency is in a position to obliterate those without, and so residents of Tuva must speak Russian and Chinese if they hope to engage with the surrounding world. The incursion of dominant Russian into Tuva is evident in the speaking competencies of the generation of Tuvans who grew up in the mid-20th century, when it was the fashion to speak, read, and write in Russian and not their native tongue.
Documenta’s American-born artistic director, Carolyn Christov-Bakargiev, doesn’t even use the word “artist,” preferring “participant” instead. She says, “I am not sure that the field of art will continue to exist in the 21st century” — not meaning art itself, mind you, but our tidy roping-off of the field. To Joseph Beuys’s famous dictum “Everyone is an artist,” Christov-Bakargiev adds, “So is any thing.” The best parts of Documenta 13 bring us into close contact with this illusive entity of Post Art—things that aren’t artworks so much as they are about the drive to make things that, like art, embed imagination in material and grasp that creativity is a cosmic force. It’s an idea I love. (As I’ve written before, everything that’s made, if you look at it in certain ways, already is or can be art.)
Things that couldn’t be fitted into old categories embody powerfully creative forms, capable of carrying meaning and making change. Post Art doesn’t see art as medicine, relief, or religion; Post Art doesn’t even see art as separate from living. A chemist or a general may be making Post Art every day at the office. One of the exhibitors at Documenta is the civil engineer Konrad Zuse, creator in the thirties of an early electromagnetic computer.
More from Jerry Saltz at New York Magazine here.
Big Baby behavior
Big Baby behavior has coalesced with the new rhetorical style of the right — whining, entitlement, and victimization, a bad-faith aping of how the old regime understood the demands of anti-racism and the women’s movement — to give a mashed-carrots color to the politics of our era. Tantrums are in fashion. Are you ever at a loss now, flipping through the channels, to know what policies a TV commentator will advocate if he is boyish but old, thin-haired but incapable of growing a mustache, soft, truculent, khakied and floppy-collared, wide-eyed on a sugar rush and shouting for more candy? (In his case, the Pez will be prescription drugs and alcohol: there is a curious tie between Big Babies and abuse of painkillers.) Big Baby is easier to picture sitting than standing. Man-boobs shape his polo. Big Baby has wee little feet and appears on the cover of Cigar Aficionado. Big Baby issues insults, but only at a safe distance. You sense that, up close, he might smell like milk.
More from the Editors at n+1 here.
Justin Erik Halldór Smith On Dieting
Justin E. H. Smith in his own blog:
It may be that an older form of wisdom speaks to us through proverbs, the sort of wisdom that reeks of grandparents and people even older, that announces 'you are what you eat' as an existential truism, for example. But this sort of wisdom is for the most part drowned out by chatter, about good carbs as opposed to bad ones, about the exalted ideal ratio between carbs, proteins, and fats, about whether food should be free of some negative element or other, whether it should be raw or cooked, whether it is fitting that animals be slaughtered to produce it. And all of this chatter takes place in the mode of facticity: it is put forth as if it were entirely science, and had nothing to do with culture. The striving upper middle class thus avoids McDonald's not because it is where poor fat people go, but rather because the food served there is 'unhealthy', a term that can only conceal its normativity under a thick coat of false consciousness. And thus urban subcultures emerge that condemn gluten, or that advocate a diet based principally upon meat à la Tartare, as if there were no logic of social distinction at work, as if it were simply the case that their way of eating is the correct way, the natural way, the way cavemen ate, the way we ate before we were corrupted by the Agricultural Revolution, by modernity, by supermarkets, or some other hypothetical loss of innocence.
There is no more awareness in either the bourgeois or the Bohemian expressions of this chatter than there is in traditional folk cultures, with their highly prescriptive conceptions of how one ought to eat, that 'the natural' is a contested category, that in nutritional matters as in everything else, grand gestures and elaborate programs that spell out how to live in accordance with nature are at least as artificial as everything else we come up with. Whole Foods occupies a different cultural space than the McDonald's a few blocks away; both are however equidistant from Nature.
Thomas Friedman Writes His Only Column Again
Hamilton Nolan at Gawker:
Fabulously wealthy CEO whisperer and newspaper columnist Thomas Friedman is little more than a human-shaped random word generator programmed with the "Computers and Internet" section of a fourth-grade vocabulary textbook and fitted with a mustache. He writes one single column, sometimes using different proper nouns or cycling through slightly new platitudes, in order to allow a new headline to be written. The Only Thomas Friedman Column That Exists—which ran right on schedule yesterday—opens like this:
TRAVELING in Europe last week, it seemed as if every other conversation ended with some form of this question: Why does it feel like so few leaders are capable of inspiring their people to meet the challenges of our day?
Whether traveling in Europe or Israel or Pakistan or The Arab Street, Thomas Friedman has astoundingly boring conversations with people who speak in vague, nonsensical phrases. He continues:
There are many explanations for this global leadership deficit, but I'd focus on two: one generational, one technological.
"There are many explanations for [broad phenomenon], but I'd focus on two: one [generational, cultural, or sociological], one [technological, biological, scientific, or economic]." Thomas Friedman knows how to write a freshman-year research paper at the last minute.
More hilarity here.
10. Keep Your Mind From Wandering
Keep your mind from its wandering
and regain first oneness.
Is it possible?
Let your body become supple
as a newborn’s. Is it possible?
Cleanse your inner vision
till you see nothing but light.
Is it possible?
Love people and lead
without forcing your will.
Deal with vital matters
Can it be done?
Can you stand clear
of your own mind and so
make sense of things?
Give birth and nourish.
Have but not possess.
Act but don’t expect.
Lead but don’t control.
Accomplish this and even your virtue
will have virtue.
by Lao Tzu
from Adaptations of the Tao te Ching
by R. Bob
Can People Levitate?
Five in the Colonies: Enid Blyton’s Sri Lankan Adventures
From The Paris Review:
Most mornings this past winter—the Boyagoda household already running late—I discovered my oldest daughter reading at the kitchen table: one boot on, gloves, hat, knapsack, and other boot nowhere to be found. So immersed was she, so indifferent to my pleas and threats, that finally I had to pull the book from her grasping hands just to make her finish dressing for the cold walk to school. This experience has made me more sympathetic to my mother, who once spanked me in a grocery store because I wouldn’t stop reading a book. It was by Enid Blyton, the British children’s writer who wrote some 400 nursery, fantasy, and adventure series titles that have sold more than six hundred million copies worldwide, mostly in Britain and the former colonies, including Sri Lanka—where as a girl my mother herself first encountered Blyton. I recently bought one of Blyton’s books for my own daughter. But before passing it on, I decided to reread it.
The book seemed innocuous enough. As with all of Blyton’s adventure stories, it was about boys and girls drawn into mysterious doings while on summer holiday. Bickering but loyal, they best adults who are either distracted and dismissive, or criminals capable of outsmarting everybody but the kids. Working this premise for decades and dozens of stories, Blyton enjoyed great success—at the time of her death, a book club devoted to her work had some 200,000 members in Britain alone. But because that success depended upon such patterned writing, she was also accused by librarians, teachers, and academics of relentlessly dulling the imaginations of her young readers, and of unjustly encouraging those who were reading her from abroad to make identifications that race, geography, history, and politics preemptively denied them. This certainly seems to have been the case for the Nigerian novelist Chimamanda Ngozi Adichie; in a 2006 interview with The Times, she explained that her development as a writer was stunted by her early reading: “When I started to write, I was writing Enid Blyton stories, even though I had never been to England. I didn’t think it was possible for people like me to be in books.” Similar notions affect the eponymous protagonists of Jamaica Kincaid’s novels Lucy and Annie John, who both declare they wish they were named Enid, after their favorite author. For both the young Adichie’s and Kincaid’s characters, mimicry and the desire for renaming aren’t simple expressions of literary admiration; they’re also rejections of the children’s African and Caribbean worlds, which have been diminished by their very immersion in Blyton’s books. The Blyton reading experience likewise impacts a colonial child’s maturation in Rohinton Mistry’s novel Family Matters, in which an intelligent Indian boy grows up reading her books and from this develops a dismissive attitude towards the foods and places and names that figure in his Bombay life. 
When self-loathing and alienation begin to build, he stops reading her; later, noticing her books on his shelves, he admits, “I can’t bear to even open them. I wonder what it was that so fascinated me. They seem like a waste of time now.”
Moderate coffee consumption offers protection against heart failure
While current American Heart Association heart failure prevention guidelines warn against habitual coffee consumption, some studies propose a protective benefit, and still others find no association at all. Amidst this conflicting information, research from Beth Israel Deaconess Medical Center attempts to shift the conversation from a definitive yes or no, to a question of how much. "Our results did show a possible benefit, but like with so many other things we consume, it really depends on how much coffee you drink," says lead author Elizabeth Mostofsky, MPH, ScD, a post-doctoral fellow in the cardiovascular epidemiological unit at BIDMC. "And compared with no consumption, the strongest protection we observed was at about four European, or two eight-ounce American, servings of coffee per day." The study, published online June 26 in the journal Circulation: Heart Failure, found that these moderate coffee drinkers were at 11 percent lower risk of heart failure.
Data were analyzed from five previous studies – four conducted in Sweden, one in Finland – that examined the association between coffee consumption and heart failure. The self-reported data came from 140,220 participants and involved 6,522 heart failure events. In a summary of the published literature, the authors found a "statistically significant J-shaped relationship" between habitual coffee consumption and heart failure, in which the protective benefit increases with consumption up to a peak at two eight-ounce American servings a day. Protection then slowly declines with additional consumption: at five cups a day there is no benefit, and at more than five cups a day there may be potential for harm.
Why I wish the Obamas would stop inviting me to dinner
Walter Kirn in The New Republic:
The problem with these small-stakes lotteries that are currently clogging up our inboxes isn’t that they cheapen politics (it is what it is, especially lately) but that they reveal, in a depressing way that makes the whole enterprise seem almost futile, just how insanely expensive it has become. They offer as prizes places at power’s table that simply aren’t available to anyone but the odds-beating elect. They ritualize a sense of mass despair at ever achieving influence in normal ways, from getting somewhat but not filthy rich (R) to getting organized (D). Whatever they generate by way of cash or names and addresses for campaign mailing lists is canceled out by the cynicism they spread (or partake of and embody).
At a time when political idealism is hard to come by at any price, suckerball is an extremely dangerous game. It doesn’t help that the hucksters who promote it, the Ed McMahons of this particular sweepstakes, are tied to the candidates by blood and marriage. Tagg Romney sent me a note the other morning that opened with an encomium to fatherhood, the holiest of conservative institutions next to the debt and equity markets themselves (“Dad taught us a lot of lessons, including the importance of having fun as a family, but the most important lesson he imparted to us was the joy in helping others”), and closed with an invitation to wager five bucks on a chance to rub shoulders with his “Papa,” a famously tight-fisted, high-stakes gambler who’d never take such lousy odds himself, not even if tickets were a penny a pop. The deal stirred doubts in me about Tagg’s upbringing as well as contempt for his estimation of mine.
June 26, 2012
The Trouble with the Turing Test
Mark Halpern in The New Atlantis:
In the October 1950 issue of the British quarterly Mind, Alan Turing published a 28-page paper titled “Computing Machinery and Intelligence.” It was recognized almost instantly as a landmark. In 1956, less than six years after its publication in a small periodical read almost exclusively by academic philosophers, it was reprinted in The World of Mathematics, an anthology of writings on the classic problems and themes of mathematics and logic, most of them written by the greatest mathematicians and logicians of all time. (In an act that presaged much of the confusion that followed regarding what Turing really said, James Newman, editor of the anthology, silently re-titled the paper “Can a Machine Think?”) Since then, it has become one of the most reprinted, cited, quoted, misquoted, paraphrased, alluded to, and generally referenced philosophical papers ever published. It has influenced a wide range of intellectual disciplines—artificial intelligence (AI), robotics, epistemology, philosophy of mind—and helped shape public understanding, such as it is, of the limits and possibilities of non-human, man-made, artificial “intelligence.”
Turing’s paper claimed that suitably programmed digital computers would be generally accepted as thinking by around the year 2000, achieving that status by successfully responding to human questions in a human-like way. In preparing his readers to accept this idea, he explained what a digital computer is, presenting it as a special case of the “discrete state machine”; he offered a capsule explanation of what “programming” such a machine means; and he refuted—at least to his own satisfaction—nine arguments against his thesis that such a machine could be said to think. (All this groundwork was needed in 1950, when few people had even heard of computers.) But these sections of his paper are not what has made it so historically significant. The part that has seized our imagination, to the point where thousands who have never seen the paper nevertheless clearly remember it, is Turing’s proposed test for determining whether a computer is thinking—an experiment he calls the Imitation Game, but which is now known as the Turing Test.