Friday, January 31, 2014
illness, aging, death
Perhaps it is only Americans who are blessed with an optimistic view of old age where it's possible to flirt with a stranger at seventy or eighty or – who knows – at ninety. The problem is that they impose it on the rest of the world. There's a whole beauty industry out there, helping women to achieve that goal. Emancipated or less emancipated, women are easily persuaded by the fashion, cosmetics, body-shaping and food industries, so that today it's not necessary to look old; so many products are at their disposal, all you need to do is buy them. Indeed, it has never been easier to hide your age. Look at another role model, Jane Fonda: the right food, special gymnastics and cosmetics (not to mention a little bit of plastic surgery) and every seventy-six-year-old woman can look twenty years younger – that is, still able to show herself without feeling shame. Therefore there is no reason to look your age; on the contrary, it is indecent to "let yourself go". Not only impolite, but indecent, just like spitting in public or even worse.
The dominant ideology of eternal youth (and health, which is just another word for eternity) suggests that with the help of science, it is possible not only to look good but to overcome any illness that befalls you and to live long too. The latest findings in biology, stem cell research and glycobiology have already been put to use in the battle against ageing. One cosmetic product based on glycans is the suggestively named Forever Youth Liberator.
slavery in the modern world
The abolition of slavery appears, in retrospect, so inevitable a part of the story of human progress that it may seem jarring when Davis emphasizes that there was nothing predetermined about it. He endorses the view advanced by recent scholars that, far from being retrograde or economically backward, slavery in the mid-nineteenth century was a dynamic, expanding institution, with powerful support everywhere it existed. “Never was the prospect of emancipation more distant than now,” the Times of London observed in 1857. Despite abolition in the British Caribbean and Spanish America, there were more slaves in the Western Hemisphere on the eve of the Civil War than at any point in history. Had the Confederacy emerged victorious, which was entirely possible, “it is clear that slavery would have continued well into the twentieth century.” Contingency, even accident, produced the end of slavery in the Old South, the greatest slave society the modern world has known.
Davis is well aware, of course, that emancipation did not usher in the abolitionist dream of a society of equals. The end of slavery in the Caribbean was succeeded by new forms of unfreedom, as planters brought in indentured workers from Asia to replace the blacks who had abandoned the plantations. Davis notes that the Fourteenth and Fifteenth amendments, adopted in the United States immediately after the Civil War to guarantee the civil and political equality of the former slaves, are virtually without precedent in other post-emancipation societies. Yet Reconstruction was soon succeeded by a new system of racial inequality. Did emancipation, then, make any difference in the United States?
The Quantum Mechanics of Fate: How time travel might explain some of science’s biggest puzzles
George Musser in Nautilus:
“The objective world simply is, it does not happen,” wrote mathematician and physicist Hermann Weyl in 1949. From his point of view, the universe is laid out in time as surely as it is laid out in space. Time does not pass, and the past and future are as real as the present. If your common sense rebels against this idea, it is probably for a single reason: the arrow of causality. Events in the past cause events in the present which cause events in the future. If time really is like space, then shouldn’t events from the future influence the present and past, too?
They actually might. Physicists as renowned as John Wheeler, Richard Feynman, Dennis Sciama, and Yakir Aharonov have speculated that causality is a two-headed arrow and the future might influence the past. Today, the leading advocate of this position is Huw Price, a University of Cambridge philosopher who specializes in the physics of time. “The answer to the question, ‘Could the world be such that we do have a limited amount of control over the past,’ ” Price says, “is yes.” What’s more, Price and others argue that the evidence for such control has been staring at us for more than half a century.
That evidence, they say, is something called entanglement, a signature feature of quantum mechanics.
Scarlett Johansson is right – the face of SodaStream doesn't fit with Oxfam
Vijay Prashad in The Guardian:
Global charities seek global ambassadors to help them raise the profile of their work. Simply doing their best to stem the tide of suffering is not enough to gain potential donors' attention. But if a celebrity goes among the poor on behalf of the charity, the media flocks to cover the story – or at least the fact that the celebrity is there. The nuts and bolts of inequality are often overlooked, but the charity gets its name in print or on the television. This is the sorry state our humanism has reached.
It is precisely because of this that Oxfam, founded in Oxford in 1942 as Famine Relief, turned to the actor Scarlett Johansson in 2007 to become its global ambassador. She travelled to Oxfam projects, something that provided photo opportunities for herself (as a caring artist) and for Oxfam (to shine a light on the important work that the charity does).
In January, Johansson was appointed the brand ambassador for SodaStream, an Israeli company that produces machines to carbonate beverages. SodaStream's factory is located in the Israeli settlement of Maale Adumim, near Jerusalem.
Israeli settlements (including Maale Adumim) are built on land seized from the Palestinians during the 1967 war. By the standards of the Geneva convention, the Rome Statute and the international court of justice, they have been developed illegally by Israel. Israel has thumbed its nose at international law and continued to build its settlements, including industrial parks such as the one that houses SodaStream.
The European Union has called the E1 parcel of land that Israel plans to build on, extending from Maale Adumim, a violation of international humanitarian law. Johansson, in other words, had become the face of illegal Israeli settlement activity.
Neuroscientist James Fallon discovered through his work that he has the brain of a psychopath, and subsequently learned a lot about the role of genes in personality and how his brain affects his life
Judith Ohikuare in The Atlantic:
In 2005, James Fallon's life started to resemble the plot of a well-honed joke or big-screen thriller: A neuroscientist is working in his laboratory one day when he thinks he has stumbled upon a big mistake. He is researching Alzheimer's and using his healthy family members' brain scans as a control, while simultaneously reviewing the fMRIs of murderous psychopaths for a side project. It appears, though, that one of the killers' scans has been shuffled into the wrong batch.
The scans are anonymously labeled, so the researcher has a technician break the code to identify the individual in his family, and place his or her scan in its proper place. When he sees the results, however, Fallon immediately orders the technician to double check the code. But no mistake has been made: The brain scan that mirrors those of the psychopaths is his own.
After discovering that he had the brain of a psychopath, Fallon delved into his family tree and spoke with experts, colleagues, relatives, and friends to see if his behavior matched up with the imaging in front of him. He not only learned that few people were surprised at the outcome, but that the boundary separating him from dangerous criminals was less determinate than he presumed. Fallon wrote about his research and findings in the book The Psychopath Inside: A Neuroscientist's Personal Journey Into the Dark Side of the Brain, and we spoke about the idea of nature versus nurture, and what—if anything—can be done for people whose biology might betray their behavior.
Kurt Vonnegut on the shapes of stories
The triumph of the maternalists: How paternalism has been feminised
Nancy McDermott in Spiked:
Paternalism has emerged as the dominant form of authoritarianism in our society. Across the world, policymakers are quietly working behind the scenes to save us from ourselves, nudging us towards Jerusalem with smaller fast-food cups, architecture intended to make us climb more stairs, and maternity wards that encourage bonding and breastfeeding. These policies are seldom debated or even noticed. When they are, the routine argument is not whether they are a good idea but how ‘hard’ or openly coercive they should be. Why value autonomy at all when people, left to their own devices, continually make poor choices that foil their aspirations and create a social burden in the process?
This denigration of human rationality is sobering for anyone who believes that autonomy lies at the moral, intellectual and philosophical centre of our humanity. But it is also interesting that this new, nudging paternalism takes the form that it does. This is not the direct, father-knows-best style of paternalism of earlier eras. It is indirect and manipulative. It has nothing to prove and no one is claiming moral authority. On the contrary, paternalists are decidedly non-confrontational and anti-ideological. They seem almost reluctant to assume a moral standpoint; their interventions are merely ‘evidence-based’. It is a modus operandi that can hardly be called ‘paternal’ at all. To use the metaphor of the traditional family, the contemporary paternalist’s style is more akin to that of a wife, who defers to her husband publicly while quietly managing every aspect of her family’s life behind the scenes. This new style of paternalism – let’s call it ‘maternalism’ – is part of a peculiar state of affairs, characterised by the declining fortunes of men, the emergence of ‘zombie feminism’, and a widespread cultural denigration of masculinity.
21 Short Walks Around the Human Brain
David Schoonmaker in American Scientist:
Many successful authors answer questions we long ago articulated and have wished we could answer. Michael Corballis goes at least a step further: He poses questions we wouldn’t have thought to ask and then answers them with clarity and wit. And what could be more fascinating to a human being than the human brain?
A Very Short Tour of the Mind exemplifies truth in advertising—it is very short, both in overall length and in the duration of each chapter, the longest of which barely makes it to the sixth page. Yet the book is packed with surprises. Did you know, for example, that left-handedness is generally considered by psychologists to be a lack of handedness? Or that the ratio of neocortex (the home of higher-order functions) to overall brain volume in primates is related to social-group size? Corballis ranges widely within and beyond his subject. He muses about bipedalism and why it may have been adaptive; explores why and how we are so skilled at recognizing faces; and closes with a chapter called “Lies and Bullshit,” in which he wonders why we are so very intolerant of the former but readily accepting of the latter. With his usual self-effacement, he ends with an admission about his own career as psychologist, educator and communicator that may bear on the question.
In the Sontag Archives
Benjamin Moser in The New Yorker:
Any biographer knows the unease, sometimes verging on nausea, that extended research into a single person’s life brings. I never met Sontag or Clarice Lispector, the subject of my last book. But after years of research, interviews, reading, and travelling, I probably know more about both of them than anyone outside their most intimate circles. I know about their sex lives and finances and medical records and professional failures, about their difficulties with their children and their parents, about the painful secrets that they desperately longed to conceal.
Even without these struggles, which are part of every life, the form, too, imposes choices. Just as history is not the past itself but a story about the past, biography is not a life but a life story. Just as a novelist gets to know his or her characters, a biographer gets to know his, too, and, in the face of the sprawling chaos of an entire life, knows that whatever he can tell about the subject is only a small selection that fits a narrative chosen according to his own tastes and interests.
He is also always aware that the biographer’s position, which necessarily involves judgments about the subject’s character and the choices she made, is profoundly unjust, for the simple reason that the subject herself cannot be consulted.
I am familiar with these concerns, and have always borne them in mind. Still, reading papers and manuscripts is one thing. Looking through someone’s e-mail is quite another, and the feeling of creepiness and voyeurism that overcame me as I sat with Gonzalez struggled with the unstoppable curiosity that I feel about Sontag’s life. To read someone’s e-mail is to see her thinking and talking in real time. If most e-mails are not interesting (“The car will pick you up at 7:30 if that’s ok xxx”), others reveal unexpected qualities that are delightful to discover. (Who would have suspected, for example, that Sontag sent e-mails with the subject heading “Whassup?”) One sees Sontag, who had so many friends, elated to be in such easy touch with them (“I’m catching the e-mail fever!”); one sees the insatiably lonely writer reaching out to people she hardly knew and inviting them to pay a call. In their reactions, one reads their bemusement, how hesitant they were to bother the icon, with her fearsome reputation.
Give me what you have on you.
Neither keys nor money.
Make it something temporary.
The hastily scribbled phone number.
The dry-cleaned piece of paper in your coat pocket.
The button about to fall off.
The words you just held back from saying.
Your strength too much to open a door.
All the things you no longer need.
Give me the rustle of your cotton.
The wind can do without.
by K. Schippers
from tellen en wegen
publisher: Querido, Amsterdam, 2011
Translation: 2012, Willem Groenewegen
Thursday, January 30, 2014
Are We Too Close to Making Gattaca a Reality?
Ferris Jabr in Scientific American (photo: Katie Tegtmeyer, Flickr):
Preventing and treating diseases are not the only reasons people have turned to pre-implantation genetic diagnosis. PGD also makes it possible for parents to predetermine characteristics of a child to suit their personal preferences. In a few cases, people have used PGD to guarantee that a child will have what many others would consider a disability, such as dwarfism or deafness. In the early 2000s, lesbian couple Sharon Duchesneau and Candy McCullough—both deaf from birth—visited one sperm bank after another searching for a donor who was also congenitally deaf. All the banks declined their request or said they did not take sperm from deaf men, but the couple got what they were looking for from a family friend. Their son, Gauvin McCullough, was born in November 2001; he is mostly deaf but has some hearing in one ear. Deafness, the couple argued, is not a medical condition or defect—it is an identity, a culture. Many doctors and ethicists disagreed, berating Duchesneau and McCullough for deliberately depriving a child of one of his primary senses.
Much more commonly, hopeful parents in the past decade have been paying upwards of $18,000 to choose the sex of their child. Sometimes the purpose of such sex selection is avoiding a disease caused by a mutation on the X chromosome: girls are much less likely to have these illnesses because they have two X chromosomes, so one typical copy of the relevant gene can compensate for its mutated counterpart. Like Marie and Antonio Freeman in Gattaca, however, many couples simply want a boy or a girl. Perhaps they have had three boys in a row and long for a girl. Or maybe their culture values sons far more than daughters. Although the U.K., Canada and many other countries have prohibited non-medical sex selection through PGD, the practice is legal in the U.S. The official policy of the American Society of Reproductive Medicine is as follows: “Whereas preimplantation sex selection is appropriate to avoid the birth of children with genetic disorders, it is not acceptable when used solely for nonmedical reasons.” Yet in a 2006 survey of 186 U.S. fertility clinics, 58 allowed parents to choose sex as a matter of preference. And that was seven years ago. More recent statistics are scarce, but fertility experts confirm that sex selection is more prevalent now than ever.
“A lot of U.S. clinics offer non-medical sex selection,” says Jeffrey Steinberg, director of The Fertility Institutes, which has branches in Los Angeles, New York and Guadalajara, Mexico. “We do it every single day. We did three this morning.”
Mourning Tongues: How Auden Was Modified in the Guts of the Living
Nina Martyris in the LA Review of Books:
ON THIS DAY 75 years ago — January 28, 1939 — “something slightly unusual” occurred in the annals of English poetry. William Butler Yeats died, and his death gave birth to a poem that set off one of the most extraordinary elegiac conversations of our time.
The poem was W. H. Auden’s “In Memory of W. B. Yeats,” and this is the story of its astonishing afterlife — how three separate elegies in three different countries were modeled on it; how Auden’s words were quite literally, in Auden’s line from the poem, “modified in the guts of the living,” and how, in a feat that even someone as reputedly self-anointing as Auden could not possibly have foreseen, it went on to link a multicultural pantheon of greats: Yeats, Auden, T. S. Eliot, Joseph Brodsky, Derek Walcott, and Seamus Heaney.
Auden was a natural master of the elegy. His pen was ready, generous, candid, and quick to rhyme. He shot off elegies on Freud, Henry James, Ernst Toller, Louis MacNeice, and JFK, and his “Funeral Blues,” a fine example of the coherence of grief, has become part of crematoria cool after it was sentimentalized by Hollywood. But of all his requiem compositions, it is his magnificent and measured elegy for Yeats that has a seminal place in the canon.
Our Quantum Reality Problem
Adrian Kent in Aeon:
Here’s the basic problem. While the mathematics of quantum theory works very well in telling us what to expect at the end of an experiment, it seems peculiarly conceptually confusing when we try to understand what was happening during the experiment. To calculate what outcomes we might expect when we fire protons at one another in the Large Hadron Collider, we need to analyse what – at first sight – look like many different stories. The same final set of particles detected after a collision might have been generated by lots of different possible sequences of energy exchanges involving lots of different possible collections of particles. We can’t tell which particles were involved from the final set of detected particles.
Now, if the trouble was only that we have a list of possible ways that things could have gone in a given experiment and we can’t tell which way they actually went just by looking at the results, that wouldn’t be so puzzling. If you find some flowers at your front door and you’re not sure which of your friends left them there, you don’t start worrying that there are inconsistencies in your understanding of physical reality. You just reason that, of all the people who could have brought them, one of them presumably did. You don’t have a logical or conceptual problem, just a patchy record of events.
Quantum theory isn’t like this, as far as we presently understand it. We don’t get a list of possible explanations for what happened, of which one (although we don’t know which) must be the correct one. We get a mathematical recipe that tells us to combine, in an elegant but conceptually mysterious way, numbers attached to each possible explanation. Then we use the result of this calculation to work out the likelihood of any given final result. But here’s the twist. Unlike the mathematical theory of probability, this quantum recipe requires us to make different possible stories cancel each other out, or fully or partially reinforce each other. This means that the net chance of an outcome arising from several possible stories can be more or less than the sum of the chances associated with each.
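That last point can be made concrete with a toy calculation (my illustration, not from Kent's essay): each "story" carries a complex amplitude, and the likelihood of an outcome comes from the squared magnitude of the *sum* of amplitudes, so stories can reinforce or cancel rather than simply adding their individual chances.

```python
# Two possible "stories" leading to the same outcome, each with
# amplitude 1/sqrt(2); individually, each would have chance 0.5.
a1 = (1 / 2**0.5) + 0j
a2 = (1 / 2**0.5) + 0j

# Classical probability: just add the chances of the two stories.
classical = abs(a1)**2 + abs(a2)**2      # 0.5 + 0.5 = 1.0

# Quantum recipe: add the amplitudes first, then square the magnitude.
constructive = abs(a1 + a2)**2           # stories reinforce: 2.0 (> the sum)
destructive = abs(a1 - a2)**2            # stories cancel: 0.0 (< the sum)

print(classical, constructive, destructive)
```

The constructive case exceeds the naive sum and the destructive case falls below it, which is exactly the "more or less than the sum of the chances" behaviour the passage describes (in a real calculation the amplitudes are normalised so the totals remain probabilities).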
The Elusive Tagore
Philip Nikolayev in Open The Magazine:
When John Berryman, the great American poet, gave a lecture tour of India in 1957, he fell ill ‘with virus and a high fever’. When he reached Ahmedabad, his condition worsened; by then he had become noticeably thin, already having lost ten pounds. Still, he decided to proceed with his scheduled lecture. Berryman’s biographer Paul Mariani writes of this fascinating moment:
‘As Munford finished his talk, he saw Berryman standing in the doorway, trembling, his face drained of color. Then Berryman walked up to the podium and delivered a lecture unlike anything he’d given so far on his trip. For six weeks, he told his small audience, he had been told over and over by his Indian hosts that America had produced no poetry and that the Indians were the most poetic people in the world. But what he’d seen of Indian poetry seemed nothing more than a loose sort of “spiritual sentimentality.” Now he was going to tell them what real poetry was. He quoted a passage from Rilke in German and then a passage from Lorca in Spanish, translating into English afterward for his audience. Great poetry, he explained, sprang only from the pain and anguish of human experience. The audience sat listening to his stunning, fevered performance. If they felt angry or patronised, they did not show it.’
In this case, Berryman’s fever was the likely cause of his bluntness. He had come to Ahmedabad directly from Kolkata. Tagore would have been on his mind. There can be little doubt that in speaking of “spiritual sentimentality”, Berryman was referring to Tagore, whom he would have read in translation. That’s exactly how Tagore comes off in English.
But it does not take a Westerner to question Tagore’s importance.
Capitalism vs. Democracy
Thomas B. Edsall in the NYT:
Thomas Piketty’s new book, “Capital in the Twenty-First Century,” described by one French newspaper as “a political and theoretical bulldozer,” defies left and right orthodoxy by arguing that worsening inequality is an inevitable outcome of free market capitalism.
Capitalism, according to Piketty, confronts both modern and modernizing countries with a dilemma: entrepreneurs become increasingly dominant over those who own only their own labor. In Piketty’s view, while emerging economies can defeat this logic in the near term, in the long run, “when pay setters set their own pay, there’s no limit,” unless “confiscatory tax rates” are imposed.
Piketty’s book — published four months ago in France and due out in English this March — suggests that traditional liberal government policies on spending, taxation and regulation will fail to diminish inequality. Piketty has also delivered and posted a series of lectures in French and English outlining his argument.
Conservative readers will find that Piketty’s book disputes the view that the free market, liberated from the distorting effects of government intervention, “distributes,” as Milton Friedman famously put it, “the fruits of economic progress among all people. That’s the secret of the enormous improvements in the conditions of the working person over the past two centuries.”
Piketty proposes instead that the rise in inequality reflects markets working precisely as they should: “This has nothing to do with a market imperfection: the more perfect the capital market, the higher” the rate of return on capital is in comparison to the rate of growth of the economy. The higher this ratio is, the greater inequality is.
Japan’s Great Zoo Massacre
Behind the curtain of empire, horrors lurk. At the Tokyo Imperial Zoo on September 4, 1943, two starving elephants remained silent, obedient to their trainers, while a religious service on the other side of a red-and-white awning prematurely memorialized their sacrifice for Japan’s imperial cause. Buddhist monks, government officials and schoolchildren made offerings of food to the elephants’ spirits and to the spirits of other captive animals killed by order of the government. This unprecedented ceremony known as the “Memorial Service for Martyred Animals” was held on the zoo’s grounds where nearly a third of the cages stood empty. Lions from Abyssinia, tigers representative of Japan’s troops, bears from Manchuria, Malaya and Korea, an American bison, and many others had been clubbed, speared, poisoned and hacked to death in secret. Although the zoo’s director had found a way to save some of the condemned creatures by moving them to zoos outside Tokyo, Mayor Ōdaichi Shigeo insisted on their slaughter. Ōdaichi himself, along with Imperial Prince Takatsukasa Nobusuke and the chief abbot of Asakusa’s Sensōji Temple, presided over the carefully choreographed and highly publicized “Memorial Service”, thanking the animals for sacrificing themselves for Japan’s war effort.
But the elephants were not dead. Tonky and Wanri had been holding out for ten days against their keepers’ attempts to poison them with strychnine-laced food and cyanide-permeated water, refusing to eat or drink.
In Praise of Mediocrity
Mediocrity makes visible something about tradition that greatness can often obscure. It is one thing to say, for instance, that the West possesses a valuable tradition because, within it, we find a sampling of awesome geniuses, from Homer and Plato, to Dante, Shakespeare, and Nietzsche. But this hardly explains the value of tradition. Traditions are self-authenticating. They are good in themselves. To live within and participate in a tradition is, again, to keep something alive and to draw things and persons together, across time, in a community of knowledge and love. The second-rate imitator of Keats in Kentucky, the belated composer of an oratorio in Ohio, may seem derivative, as if merely preserving the shadow of greatness in amber. But, to the contrary, they take their place in a way of being and keep that way open for others to tread.
Authors’ names notwithstanding, art, technology, and science, the whole world of work and culture, are starkly impersonal enterprises. The anonymous mediocrity, no less than the legendary maestro, gives his life in the service of keeping a tradition alive; in being himself forgotten he helps something else to be remembered. What a blessed thing to do.
The extraordinary life of William S. Burroughs
“Virtually all of Burroughs’s writing was done when he was high on something,” Miles writes. The drugs help account for the hollowness of his voices, which jabber, joke, and rant like ghosts in a cave. He had no voice of his own, but a fantastic ear and verbal recall. His prose is a palimpsest of echoes, ranging from Eliot’s “Preludes” and “Rhapsody on a Windy Night” (lines like “Midnight shakes the memory / As a madman shakes a dead geranium” are Burroughsian before the fact) to Raymond Chandler’s marmoreal wisecracks and Herbert Huncke’s jive. I suspect that few readers have made it all the way through the cut-up novels, but anyone dipping into them may come away humming phrases. His palpable influence on J. G. Ballard, William Gibson, and Kathy Acker is only the most obvious effect of the kind of inspiration that makes a young writer drop a book and grab a pen, wishing to emulate so sensational a sound. It’s a cold thrill. While always comic, Burroughs is rarely funny, unless you’re as tickled as he was by such recurrent delights as boys in orgasm as they are executed by hanging.
Some critics, including Miles, have tried to gussy up Burroughs’s antinomian morality as Swiftian satire. Burroughs, however, wages literary war not on perceptible real-world targets but against suggestions that anyone is responsible for anything. Though never cruel in his personal conduct, he was, in principle, exasperated with values of constraint. A little of “Nothing is true, everything is permitted” goes a long way for many readers, including me. But there’s no gainsaying a splendor as berserk as that of a Hieronymus Bosch painting. When you have read Burroughs, at whatever length suffices for you, one flank of your imagination of human possibility will be covered for good and all.
LIKE NOTHING ON EARTH: Landscapes of the Mind
Robert Macfarlane in More Intelligent Life:
Though I am nearing 40, it remains an ambition of mine to climb a previously unclimbed mountain. I have in mind possible peaks in Bhutan, Sichuan and north-western Tibet—all of them elegant in their architecture and severe in their remoteness. But my first choice would be the shield volcano Olympus Mons. Its main slopes present little difficulty to the mountaineer, rising as they do at an average angle of five degrees. Its summit is a caldera, or collapsed crater, whose jagged upper rim requires no ropework to reach. Seen on a plan-view map, indeed, it appears to offer little obstacle to an easy ascent. Except that Olympus Mons is on Mars.
I first heard of the mountain in a Pixies song: "Sun shines in the rusty morning/Skyline of the Olympus Mons/I think about it sometimes", yowled Black Francis, setting my teenage self dreaming. Research revealed its astonishing statistics: the second-highest peak in the solar system, three times the altitude of Everest, one hundred times the mass of Mauna Loa (the largest volcano on Earth), the size of Arizona in area, encircled by an escarpment up to eight kilometres high, and its peripheries engulfed by dust storms that can last for decades. The ludicrous notion of climbing Olympus Mons only occurred to me when I read Kim Stanley Robinson’s remarkable work of areology, "Red Mars" (1992). The first of a trilogy of novels, it begins in the year 2027, when a hundred-strong team of humans make landfall on Mars. Their task is to terraform the red planet from a frozen and irradiated wasteland into a habitable environment, ready to receive future waves of colonists from Earth.
A 3D window into living cells
University of Illinois researchers have developed a new imaging technique that needs no dyes or other chemicals, yet renders high-resolution, three-dimensional, quantitative imagery of cells and their internal structures using conventional microscopes and white light.
Called white-light diffraction tomography (WDT), the imaging technique opens a window into the life of a cell without disturbing it and could allow cellular biologists unprecedented insight into cellular processes, drug effects and stem cell differentiation. The team, led by electrical and computer engineering and bioengineering professor Gabriel Popescu, published their results in the journal Nature Photonics. “One main focus of imaging cells is trying to understand how they function, or how they respond to treatments, for example, during cancer therapies,” Popescu said. “If you need to add dyes or contrast agents to study them, this preparation affects the cells’ function itself. It interferes with your study. With our technique, we can see processes as they happen and we don’t obstruct their normal behavior.”
Wednesday, January 29, 2014
Wondering more about the Coen Brothers’ latest film? Don’t ask them; they’re not talking
Our own Morgan Meis in The Smart Set:
The Coen Brothers are no help and never will be. Go ahead and ask them. Fresh Air’s Terry Gross recently tried. She asked them how they write their films. “It’s mostly napping,” Ethan Coen answered. The Coen Brothers have been evading answers for about 30 years now, since Blood Simple came out in 1984. Asked about The Big Lebowski a few years ago, Joel Coen said, “That movie has more of an enduring fascination for other people than it does for us.” This is a game, and the Coen Brothers play it well. Other artists have played the same game at even higher stakes. Thomas Pynchon has been in hiding for 40 years. J. D. Salinger hid for about 50, until his death a couple of years ago. The Coen Brothers simply hide in plain sight. They answer by not answering.
This is a good state of affairs. It is good because the Coen Brothers make many enjoyable films. And they are aware that people who make enjoyable films, like the aforementioned The Big Lebowski, should avoid discussing the serious and philosophical themes of their enjoyable films. That’s to say, the art of many Coen Brothers films is in the artlessness. For artless artists, there is nothing worse than too much talk, too much analysis. Artless artists have felt this way for a long time. The Roman poet Catullus had a special word for his artful artlessness. He called it “lepidus.” Lepidus is a hard word to translate. It means something like charming, witty, easy, sophisticated. More than anything, a poem that is lepidus should appear effortless, especially if it is not. Catullus worked very hard on his poetry. But he wanted his poems to read as if they’d been hardly worked upon. He wanted them to seem dashed off, cast out with a flick of the wrist on a summer’s day.
Understanding the boycott of Israel’s universities
Vijay Prashad in the Washington Post:
The growing movement for boycott, divestment and sanctions of Israeli universities has struck a chord in Israel. Justice Minister Tzipi Livni said recently that the boycott campaign, which drew new attention when it was joined last month by the American Studies Association (ASA), “is moving and advancing uniformly and exponentially.” If Israel does not respond, Livni said, it will turn itself into “a lone settlement in the world.”
Livni meant that criticism of the Israeli occupation of Palestinian lands should be taken seriously. Finance Minister Yair Lapid concurred, writing, “The world seems to be losing patience with us. . . . If we don’t make progress with the Palestinians, we will lose the support of the world and our legitimacy.”
The boycott movement is a caution to Israel that it must be less obdurate in its relations with the Palestinians — a position far removed from the toxic response to the ASA within the United States, where many groups long have opposed any discussion of the reality of Israel’s occupation. In 2010, the collegiate group Hillel informed its members that its branches were not permitted to invite speakers who “support boycott of, divestment from, or sanctions against the State of Israel.”
Dogs Are Not People
Colin Dayan in the Boston Review:
Books abound on dog love, loving dogs, what it means to have or be with a dog. With all the writing about dogs, it might seem that we are too much infatuated with their unique qualities. But that is not it at all.
Even while we are ostensibly doing everything in our power to ascertain the nature and desires of dogs, the questions we ask obscure or betray what is most salient about them and necessary to their lives. And through it all—the testing and the loving, the ownership and the training, the argument for dog rights and the facts of their disposal—we never question the status of the human as a problem not a privilege.
To say, as Gregory Berns does in his new book How Dogs Love Us and his recent New York Times op-ed “Dogs are People, Too,” that dogs have the reasoning capacity of a young child is to continue to ignore what it is that dogs possess that we do not. Dogs are not people. Dogs are not humans. But we are desperate to appropriate whatever it means to be dog and to make that over in our image.
The urge to characterize dogs as like ourselves speaks to our ignorance and to the failure of imagination. As humans who control the arena of judgment, we cannot brook the humility demanded in confronting what we cannot understand, what we do not know.
Jennifer Ouellette: Me, Myself and Why
david cronenberg thinks he's a bug
I woke up one morning recently to discover that I was a seventy-year-old man. Is this different from what happens to Gregor Samsa in The Metamorphosis? He wakes up to find that he’s become a near-human-sized beetle (probably of the scarab family, if his household’s charwoman is to be believed), and not a particularly robust specimen at that. Our reactions, mine and Gregor’s, are very similar. We are confused and bemused, and think that it’s a momentary delusion that will soon dissipate, leaving our lives to continue as they were. What could the source of these twin transformations possibly be? Certainly, you can see a birthday coming from many miles away, and it should not be a shock or a surprise when it happens. And as any well-meaning friend will tell you, seventy is just a number. What impact can that number really have on an actual, unique physical human life?
In the case of Gregor, a young traveling salesman spending a night at home in his family’s apartment in Prague, awakening into a strange, human/insect hybrid existence is, to say the obvious, a surprise he did not see coming, and the reaction of his household—mother, father, sister, maid, cook—is to recoil in benumbed horror, as one would expect, and not one member of his family feels compelled to console the creature by, for example, pointing out that a beetle is also a living thing, and turning into one might, for a mediocre human living a humdrum life, be an exhilarating and elevating experience, and so what’s the problem?