Friday, April 18, 2014
Peter Carey in The Guardian:
Sometime in the very early 1970s two Australian friends returned from Colombia and asked me to ghostwrite the story of their adventures, which included a conversation with an unknown writer named Gabriel García Márquez. In an effort to overcome my reluctance they lent me an English edition of One Hundred Years of Solitude. None of us understood that they had thereby changed my life. I tried, and failed, to help them memorialise their adventure. Worse, I "forgot" to return the book. Worse still, I arrogantly decided that this novel by this unknown writer would be of far more use to me than it could ever be to them. I was, at the time I became a thief, stumbling to find a way to escape what Patrick White had called "the dun-coloured realism" of my own country's literature, to make the windswept paddocks on the Geelong Road, say, become luminous and new. The stories worked well enough, but I still wasn't up to the bigger challenge. The absence of placenames in the stories is a good indication of what I was avoiding, a sign that I was still too young (and damaged) to see that Myrniong was a beautiful strange name and that Wonthaggi was a poem unto itself. It would take 10 years (some 20 stories and a novel) to free myself of this colonial bind, but the first step, without a doubt, was when I opened One Hundred Years of Solitude and read: "At that time Macondo was a village of 20 adobe houses built on the bank of a river of clear water that ran along a bed of polished stones, which were white and enormous, like prehistoric eggs. The world was so recent that many things lacked names, and in order to indicate them it was necessary to point."
Thus Márquez threw open the door I had been so feebly scratching on.
Erika Check Hayden in Nature:
Scientists have identified a long-sought fertility protein that allows sperm to dock to the surface of an egg. The finding, an important step in understanding the process that enables conception, could eventually spawn new forms of birth control and treatments for infertility. “It’s very important, because we now know two of the proteins that are responsible for the binding of sperm to the egg,” says Paul Wassarman, a biochemist and developmental biologist at the Icahn School of Medicine at Mount Sinai in New York. The work, published today in Nature, was led by Gavin Wright, a biochemist at the Wellcome Trust Sanger Institute in Hinxton, UK. He and his team were looking for a counterpart to a protein called Izumo1, discovered in 2005 on the surface of sperm cells.
Scientists knew that Izumo1 allowed sperm to join to an egg to begin the process of fertilization. But nobody knew what protein on the surface of the egg attached to Izumo1. Identifying the proteins involved in the joining step has been difficult because the molecules tend to bind quite weakly to each other. So Wright and his team devised a way to cluster Izumo1 proteins, then searched for the egg-cell proteins that would bind to the clusters in cell culture. Wright compares the technique to constructing a Velcro fastener out of many individual fabric loops: “Each small hook adheres weakly, but when [they are] clustered in an array, even the most fleeting interactions are stabilized and can therefore be detected,” he says. Using this method, the team hooked a protein called folate receptor 4 that is found on the surface of the mouse egg cell. Wright’s team propose renaming the egg protein Juno, after the Roman goddess of fertility and marriage. Izumo1 is also named after a cultural symbol of reproduction — a Japanese marriage shrine.
Elizabeth Alsop in the LA Review of Books:
IT’S A GOOD TIME to be a canceled show. Last May, Netflix sent the viewing public into paroxysms when it released the fourth season of Arrested Development, which last aired on Fox in 2006. A month earlier, Rob Thomas made Kickstarter history when fans of his UPN series Veronica Mars massively overfunded — by three million dollars! — the show’s “return” as a feature-length film, now playing in theaters. Since then, former AMC series The Killing has been granted new life by Netflix, defunct soaps like All My Children and One Life to Live have been revived as streaming web series, and NBC’s Heroes, it was just announced, will return in rebooted and “reborn” form this summer.
There are, it seems, second acts in American television. Or, as Lacey Rose put it in The Hollywood Reporter, “canceled doesn’t necessarily mean canceled anymore.” Instead, shows like 24, Futurama, Unforgettable, and Cougar Town have become the beneficiaries of a new televisual world order, whereby any series threatened with cancelation can be, in Rose’s words, “revived thanks to creative deal-making,” or — in the case of NBC’s Community — rescued by socially-mediated displays of viewer displeasure.
All of this, of course, hardly comes as news. Back in 2012, New York magazine’s Matt Zoller Seitz was already bemoaning the rise of “zombified” media; the byproduct, in part, of new and more potent forms of fan empowerment. Since then, critics have been eager to read the cultural tea leaves. There’s been no shortage of speculation about what this wave of revivals could cumulatively portend for television makers and viewers in the 21st century.
Yet despite the critical attention to this phenomenon, there’s been comparatively little curiosity about the psychology behind it.
Peter Gordon in The New Republic:
Walter Benjamin passed some of the happiest moments of his life wandering shirtless in the sun on the Spanish island of Ibiza. In a letter in 1932, he wrote that the little Mediterranean island lacked modern conveniences, such as “electric light and butter, liquor and running water, flirting and newspaper reading.” The nearest village boasted a mere seven hundred inhabitants, who got by without modern farm equipment: the economy ran mostly on goats. During his two stays there, in 1932 and 1933, Benjamin strolled the beaches and explored the island’s interior in the company of his friend Jean Selz, who would recall that “Benjamin’s physical stoutness and the rather Germanic heaviness he presented were in strong contrast to the agility of his mind, which so often made his eyes sparkle behind his glasses.” Together they took long walks through the countryside, but the walks were “made even longer by our conversations, which constantly forced him to stop. He admitted that walking kept him from thinking. Whenever something interested him he would say, ‘Tiens, tiens!’ This was the signal that he was about to think, and therefore stop.” Among the German guests on the island this idiosyncrasy was well-known and they gave the strange apparition a nickname: “Tiens-tiens.” The village locals called him el miserable. It is true that Benjamin was poor and prone to depression. But out of each day he crafted a scholar’s idyll: he rose early and bathed in the ocean, then ascended the hills to his favorite spot, where he retrieved a hidden lounge chair from the bushes. He sat there among the fig trees for the full length of the morning, writing, or reading Lucretius.
We do not imagine Benjamin on the beach. He was a poet of the city, one of the most probing critics of the bourgeois experience. In manifold essays and books, some of them fragmentary and left unpublished until much later, he sought to portray modern life in all its richness and variety—its literature, its dreams, its cultural detritus. Like a ragpicker in the marketplace (this was his own comparison), nothing seemed to him without significance.
Compared to most lives, John Updike’s was golden from the get-go. The adored only son of a highly educated mother (who herself wrote fiction, some of it eventually published in the New Yorker), the star student of Shillington, Pa.’s high school, recipient of a scholarship to Harvard, an invaluable contributor to the Harvard Lampoon (“seven cover illustrations, more than a hundred cartoons and drawings, sixty poems, and twenty-five prose pieces”), winner of a year’s fellowship to Oxford University’s Ruskin School of Drawing and Fine Art, a staff writer for the New Yorker in his early 20s, and then a successful and wealthy novelist for the next 50 years, as well as an underrated poet and a superb reviewer of books and art exhibitions, Updike could apparently do no wrong.
Except, of course, in his private life. Just before his senior year at Harvard, Updike married an intelligent and quietly attractive Radcliffe student named Mary Pennington.
To read George Eliot attentively is to become aware how little one knows about her. It is also to become aware of the credulity, not very creditable to one’s insight, with which, half consciously and partly maliciously, one had accepted the late Victorian version of a deluded woman who held phantom sway over subjects even more deluded than herself. At what moment and by what means her spell was broken it is difficult to ascertain. Some people attribute it to the publication of her Life. Perhaps George Meredith, with his phrase about the “mercurial little showman” and the “errant woman” on the daïs, gave point and poison to the arrows of thousands incapable of aiming them so accurately, but delighted to let fly. She became one of the butts for youth to laugh at, the convenient symbol of a group of serious people who were all guilty of the same idolatry and could be dismissed with the same scorn. Lord Acton had said that she was greater than Dante; Herbert Spencer exempted her novels, as if they were not novels, when he banned all fiction from the London Library. She was the pride and paragon of her sex. Moreover, her private record was not more alluring than her public. Asked to describe an afternoon at the Priory, the story-teller always intimated that the memory of those serious Sunday afternoons had come to tickle his sense of humour. He had been so much alarmed by the grave lady in her low chair; he had been so anxious to say the intelligent thing. Certainly, the talk had been very serious, as a note in the fine clear hand of the great novelist bore witness.
A paradox pervades the Sicilian citrus groves and gardens. The scent is intoxicating but too often the fruit lies rotten on the ground, unwanted and worthless. In this maddening, singular island, where they say the sun drives you crazy and the moon makes you sad, the irony is your breakfast orange juice will most likely be diluted, long-life concentrate from oranges grown in Brazil.
Helena Attlee acknowledges the complexities of international trade in The Land Where Lemons Grow: The story of Italy and its citrus fruit, her fascinating grand tour of the citrus-growing regions of Italy. Her focus is less on global agro-economics than on the history of the fruit in its adopted home, and the migration of waves of citrons, sour oranges, lemons, sweet oranges and mandarins to the welcoming soil of Mediterranean Europe.
A distinguished garden writer, Attlee fell under the spell of citrus over ten years ago and the book, like the eleventh labour of Hercules to steal the golden fruit of the Hesperides, is the result. She writes with great lucidity, charm and gentle humour, and wears her considerable learning lightly.
You were mine
by Bill Schneberger
Jonathan Kandell in the New York Times:
Cristóbal Pera, his former editor at Random House, confirmed the death. Mr. García Márquez learned he had lymphatic cancer in 1999, and a brother said in 2012 that he had developed senile dementia.
Mr. García Márquez, who received the Nobel Prize for Literature in 1982, wrote fiction rooted in a mythical Latin American landscape of his own creation, but his appeal was universal. His books were translated into dozens of languages. He was among a select roster of canonical writers — Dickens, Tolstoy and Hemingway among them — who were embraced both by critics and by a mass audience.
“Each new work of his is received by expectant critics and readers as an event of world importance,” the Swedish Academy of Letters said in awarding him the Nobel.
Mr. García Márquez was a master of the literary genre known as magical realism, in which the miraculous and the real converge. In his novels and stories, storms rage for years, flowers drift from the skies, tyrants survive for centuries, priests levitate and corpses fail to decompose. And, more plausibly, lovers rekindle their passion after a half-century apart.
Thursday, April 17, 2014
Lullaby for a Daughter
Go to sleep. Night is a coal pit
full of black water —
......... night is a dark cloud
full of warm rain.
Go to sleep. Night is a flower
resting from bees —
......... night's a green sea
swollen with fish.
Go to sleep. Night is a white moon
riding her mare —
......... night's a bright sun
burned to black cinder.
Go to sleep,
star's feast of praise,
moon to reign over
her sweet subject, dark.
by Jim Harrison
from The Shape of the Journey
Copper Canyon Press, 1998
Eva Saulitis in Orion Magazine:
FOR TWENTY-SIX SEPTEMBERS I’ve hiked up streams littered with corpses of dying humpbacked salmon. It is nothing new, nothing surprising, not the stench, not the gore, not the thrashing of black humpies plowing past their dead brethren to spawn and die. It is familiar; still, it is terrible and wild. Winged and furred predators gather at the mouths of streams to pounce, pluck, tear, rip, and plunder the living, dying hordes. This September, it is just as terrible and wild as ever, but I gather in the scene with different eyes, the eyes of someone whose own demise is no longer an abstraction, the eyes of someone who has experienced the tears, rips, and plunder of cancer treatment. In spring, I learned my breast cancer had come back, had metastasized to the pleura of my right lung. Metastatic breast cancer is incurable. Through its prism I now see this world.
...NO ONE TEACHES US how to die. No one teaches us how to be born, either. In an essay about visiting the open-air cremation pyres of Varanasi, India, Pico Iyer quotes the scholar Diana L. Eck: “For Hindus, death is not the opposite of life; it is, rather, the opposite of birth.” It happens that my stepdaughter, Eve, is pregnant. I’ve known her since she was three years old; she’s thirty now. One late afternoon this spring, early in her pregnancy, early in my diagnosis, we picked bags of wild rose petals together in a meadow below my house; she intended to make rose-flavored mead. We hadn’t talked much about the implications of my cancer recurrence; in the meadow, we almost didn’t have to. It hovered in the honeyed sunlight between us. That light held the fact of life growing inside her and the cancer growing inside me equally, strangely. We talked around the inexplicable until, our bags full of pale pink petals, we held each other in the tall grass and cried. Watching her body change in the months since, without aid of technology or study or experience, watching her simply embody pregnancy, should teach me something about dying. In preparation for giving birth, she reads how-to books, takes prenatal yoga, attends birthing classes. She studies and imagines. Yet no matter how learned she becomes, how well informed, with the first contraction, her body will take over. It will enact the ancient, inborn process common to bears, goats, humans, whales, and field mice. She will inhabit her animal self. She will emit animal cries. She will experience the birth of her child; she will live it. Her body—not her will or her mind or even her self—will give birth. Can I take comfort in the countless births and deaths this earth enacts each moment, the jellyfish, the barnacles, the orcas, the salmon, the fungi, the trees, much less the humans?
Annie Sneed in Scientific American:
Nematode worms, fruit flies, mice and other lab animals live longer, healthier lives when they eat less than they otherwise would if more food were available. Primates may also benefit, and perhaps humans—which is why research funds are pouring into this phenomenon. But all this raises a puzzling question: Why did creatures evolve such a mechanism in the first place? Researchers have declared the most popular theory doesn’t make evolutionary sense, and they’ve proposed a new explanation in its place. The most prominent theory involves what happens physiologically during times of food scarcity. When the living is good, natural selection favors organisms that invest energy in reproduction. In times of hardship, however, animals have fewer offspring, diverting precious nutrients to cell repair and recycling so they can survive until the famine ends, when reproduction begins anew. Cell repair and recycling appear to be substantial antiaging and anticancer processes, which may explain why underfed lab animals live longer and rarely develop old-age pathologies like cancer. Margo Adler agrees with the basic cellular pathways, but she’s not so sure about the evolutionary logic.
Adler, an evolutionary biologist at the University of New South Wales in Australia, says this popular idea relies on a big assumption: that natural selection favors this energy switch from reproduction to survival because animals will have more young in the long run—so long as they actually survive and reproduce. “This idea is repeated over and over again in the literature as if it’s true, but it just doesn’t make that much sense for evolutionary reasons,” she says. The problem, Adler says, is that wild animals don’t have the long, secure lives of their laboratory cousins. Instead, they’re not only endangered by famine but by predators and pathogens, random accidents and rogue weather as well. They also face physiological threats from a restricted diet, including a suppressed immune system, difficulty with healing and greater cold sensitivity. For these reasons, delaying reproduction until food supplies are more plentiful is a huge risk for wild animals. Death could be waiting just around the corner. Better to reproduce now, Adler says. The new hypothesis she proposes holds that during a famine animals escalate cellular repair and recycling, but they do so for the purpose of having as many progeny as possible during a famine, not afterward.
Wednesday, April 16, 2014
Sukhdev Sandhu, William Gibson, Mark Romanek, and Joanna Hogg discuss Marker in The Guardian (h/t: Meg Toth; image of a museum built by Chris Marker in Second Life). Sandhu:
Marker didn't regard artistic forms as sacred. He didn't believe in the primacy of celluloid or the cinema screen. He was continually embracing and experimenting with new technologies: one of his richest later works was a CD-Rom entitled Immemory (1997); he created Photoshop cartoon-collages for the French website Poptronics; the Whitechapel show includes a projection of Ouvroir: The Movie (2010), a tour of a museum he created on Second Life, as well as the UK premiere of Zapping Zone (Proposal for an Imaginary Television) (1990-94), a sprawling assemblage of videos, computers and light boxes.
"Marker was always interested in transformation," recalls Darke. This fascination with the ability of new technologies to transform ideas of human identity, social connection and the nature of memory makes him a strikingly contemporary figure whose work has been embraced by young art students as much as cinephiles. His claim to be a "bricoleur" – a collector of pre-existing visual material – is resonant now that the harvesting, assembling and curation of images has become as important as their creation. His fondness for revisiting old material and reusing it in new contexts resonates with the present era's unprecedented ability not only to store huge digital archives, but to click, drag and recontextualise their contents across limitless formats.
At a time when corporations and governments alike are hell-bent on surveilling and snooping on citizens, Marker's anonymity feels like a thrilling and prophetic act of resistance.
Robert Kuttner in The American Prospect:
In November 1933, less than a year after Hitler assumed power in Berlin, a 47-year-old socialist writer on Vienna’s leading economics weekly was advised by his publisher that it was too risky to keep him on the staff. It would be best both for the Österreichische Volkswirt and his own safety if Karl Polanyi left the magazine. Thus began a circuitous odyssey via London, Oxford, and Bennington, Vermont, that led to the publication in 1944 of what many consider the 20th century’s most prophetic work of political economy, The Great Transformation: The Political and Economic Origins of Our Time.
Polanyi, with no academic base, was already a blend of journalist and public intellectual, a major critic of the Austrian School of free-market economics and its cultish leaders, Ludwig von Mises and Friedrich Hayek. Polanyi and Hayek would cross swords for four decades—Hayek becoming more influential as an icon of the free-market right but history increasingly vindicating Polanyi.
Reluctantly, Polanyi left Vienna for London. Two of his British admirers, the Fabian socialist intellectuals G.D.H. Cole and Richard Tawney, found him a post at an Oxford-sponsored extension school for workers. Polanyi’s assignment was to teach English social and economic history. His research for the course informed the core thesis of his great book; his lecture notes became the working draft. This month marks the 70th anniversary of the book’s publication and also the 50th anniversary of Polanyi’s death in 1964.
Looking backward from 1944 to the 18th century, Polanyi saw the catastrophe of the interwar period, the Great Depression, fascism, and World War II as the logical culmination of laissez-faire taken to an extreme. “The origins of the cataclysm,” he wrote, “lay in the Utopian endeavor of economic liberalism to set up a self-regulating market system.” Others, such as John Maynard Keynes, had linked the policy mistakes of the interwar period to fascism and a second war. No one had connected the dots all the way back to the industrial revolution.
Jo Becker in the NYT Magazine (photo illustration by Daan Brand for The New York Times. Obama: Mark Wilson/Getty Images.):
Despite the president’s stated opposition, even his top advisers didn’t believe that he truly opposed allowing gay couples to marry. “He has never been comfortable with his position,” David Axelrod, then one of his closest aides, told me.
Indeed, long before Obama publicly stated that he was against same-sex marriage, he was on the record supporting it. As an Illinois State Senate candidate from Chicago’s liberal Hyde Park enclave, Obama signed a questionnaire in 1996 saying, “I favor legalizing same-sex marriages, and would fight efforts to prohibit such marriages.” But as his ambitions grew, and with them the need to appeal to a more politically diverse electorate, his position shifted.
In the course of an unsuccessful run for a House seat in 2000, he said he was “undecided” on the question. By the time he campaigned for the presidency, he had staked out an even safer political position: Citing his Christian faith, he said he believed marriage to be the sacred union of a man and a woman.
The assumption going into the 2012 campaign was that there was little to be gained politically from the president’s coming down firmly in favor of same-sex marriage. In particular, his political advisers were worried that his endorsement could splinter the coalition needed to win a second term, depressing turnout among socially conservative African-Americans, Latinos and white working-class Catholics in battleground states.
But by November 2011, it was becoming increasingly clear that continuing to sidestep the issue came with its own set of costs. The campaign’s internal polling revealed that the issue was a touchstone for likely Obama voters under 30.
Rebecca Newberger Goldstein in The Chronicle of Higher Education (image: André da Loba for The Chronicle):
Questions of physics, cosmology, biology, psychology, cognitive and affective neuroscience, linguistics, mathematical logic: Philosophy once claimed them all. But as the methodologies of those other disciplines progressed—being empirical, in the case of all but logic—questions over which philosophy had futilely sputtered and speculated were converted into testable hypotheses, and philosophy was rendered forevermore irrelevant.
Is there any doubt, demand the naysayers, about the terminus of this continuing process? Given enough time, talent, and funding, there will be nothing left for philosophers to consider. To quote one naysayer, the physicist Lawrence Krauss, "Philosophy used to be a field that had content, but then ‘natural philosophy’ became physics, and physics has only continued to make inroads. Every time there’s a leap in physics, it encroaches on these areas that philosophers have carefully sequestered away to themselves." Krauss tends to merge philosophy not with literature, as Wieseltier does, but rather with theology, since both, by his lights, are futile attempts to describe the nature of reality. One could imagine such a naysayer conceding that philosophers should be credited with laying the intellectual eggs, so to speak, in the form of questions, and sitting on them to keep them warm. But no life, in the form of discoveries, ever hatches until science takes over.
There’s some truth in the naysayer’s story. As far as our knowledge of the nature of physical reality is concerned—four-dimensional space-time and genes and neurons and neurotransmitters and the Higgs boson and quantum fields and black holes and maybe even the multiverse—it’s science that has racked up the results. Science is the ingenious practice of prodding reality into answering us back when we’re getting it wrong (although that itself is a heady philosophical claim, substantiated by concerted philosophical work).
And, of course, we have a marked tendency to get reality wrong.
Sleep is invisible and inconsistent. Aping death, sleep in fact prevents it; at the very least, sleep deprivation leads to premature demise (and before that, failures in mood, metabolism, cognitive function). All animals sleep, and it makes sense for none of them, evolutionarily, since it leaves the sleeper defenseless to predation. Sleep is common, public, a vulnerability we all share—even as sleep also brackets the sleeper in the most impenetrable of privacies. Nothing, everyone knows, is harder to communicate than one’s dream.
And then there’s time. Sleep seems to remove us from the general tyranny of the advancing clock. When you wake, 20 minutes could have passed as easily as three hours. But sleep defines time, dividing day and night. Humans discover circadian rhythm through the urge to sleep. That urge is, of course, cyclic, endless: always more sleep to be had. But sleep measures forward progress by consolidating our sense of the past. (Steven W. Lockley and Russell G. Foster lay out the evidence for this and other facts in their briskly informative Sleep: A Very Short Introduction.) In sleep, our brains decide what to keep and discard. Without sleep, we would dissolve into overloaded confusion.
Who does the Crimea belong to?
First of all, to the sea that made it. Seven thousand years ago, the Black Sea was much lower than it is today. Then a waterfall tumbled over the Bosporus, and the waters began to rise. The flood cut the Crimea off from the mainland – all the way except for a narrow isthmus called the Perekop. Ever since, it has been a rocky island on the shores of a sea of grass.
The steppes belonged to the nomads. Grass meant horses, and freedom. The steppes stretched north, from the mouth of the Danube to the Siberian Altai. Across the centuries they were home to various nomadic confederations and tribes: Scythians, Sarmatians, Huns, Pechenegs, Cumans, Mongols, and Kipchak Turks. The legendary Cimmerians predate them all; the Cossacks are still there today.
At times, the nomadic tribes made their home in Crimea too.
The eighties, at least, were drenched in cocaine and neon, slick cars and yacht parties, a real debauched reaction. But nineties white culture was all earnest yearning: the sorrow of Kurt Cobain and handwringing over selling out, crooning boy-bands and innocent pop starlets, the Contract With America and the Starr Report. It was all so self-serious, so dadly.
Today, by some accounts, the nineties dad is cool again, at least if you think normcore is a thing beyond a couple NYC fashionistas and a series of think pieces. Still, that’s shiftless hipsters dressed like dads, not dads as unironic heroes and subjects of our culture. If the hipster cultural turn in the following decades has been to ironize things to the point of meaninglessness, so be it. At least they don’t pretend it’s a goddamn cultural revolution when they have a kid: they just let their babies play with their beards and push their strollers into the coffee shop. In the nineties, Dad was sometimes the coolest guy in the room. He was sometimes the butt of the joke. He was sometimes the absence that made all the difference. But he was always, insistently, at the center of the story.
Miles Becker in Conservation:
Can farmers feed an additional 4 billion people with current levels of crop production? A team from the University of Minnesota tackled the problem by shifting the definition of agricultural productivity from the standard measure (tons per hectare) to the number of people fed per hectare. They then audited the global caloric budget and found a way to squeeze out another 4 quadrillion calories per year from existing crop fields. Their starting point was meat production, the most inefficient use of calories to feed people. The energy available from a plant crop such as corn dwindles dramatically when it goes through an intermediate consumer such as a pig. Beef has the lowest caloric conversion efficiency: only 3 percent. Pork and chicken do three to four times better. Milk and eggs, animal products that provide us essential nutrients in smaller batches, are a much more efficient use of plant calories.
The researchers calculated that 41 percent of crop calories made it to the table from 1997 to 2003, with the rest lost mainly to gastric juices and droppings of livestock. Crop calorie efficiency is expected to fall as the meat market grows. Global meat production boomed from 250.4 million tons in 2003 to 303.9 million tons by 2012, as reported by the FAO. Rice production, mainly for human food, dwindled by 18 percent over the same time period. The authors of the 2013 paper, published in Environmental Research Letters, suggested a trend reversal would be desirable. They estimated that a shift from crops destined for animal feed and industrial uses toward human food could hypothetically increase available calories by 70 percent and feed another 4 billion people each year.
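To make the calorie arithmetic above concrete, here is a minimal back-of-the-envelope sketch in Python. Only the 41 percent delivery share and the roughly 70 percent potential gain come from the excerpt; the total crop-calorie supply and the per-person requirement are round-number assumptions chosen for illustration, not figures from the paper.

```python
# Back-of-the-envelope sketch of the crop-calorie argument above.
# ASSUMPTIONS: the total crop-calorie supply and per-person requirement
# are illustrative round numbers; only the 41% delivery share and the
# ~70% potential gain are taken from the excerpt.

TOTAL_CROP_KCAL = 1.0e16           # hypothetical global crop calories per year
KCAL_PER_PERSON_YEAR = 2700 * 365  # rough annual requirement for one person

def people_fed(delivered_fraction):
    """People fed per year if this fraction of crop calories reaches plates."""
    return TOTAL_CROP_KCAL * delivered_fraction / KCAL_PER_PERSON_YEAR

current = people_fed(0.41)         # 41% of crop calories reach the table today
shifted = people_fed(0.41 * 1.70)  # a 70% gain in delivered calories

print(f"Fed now:        {current / 1e9:.1f} billion people")
print(f"After shift:    {shifted / 1e9:.1f} billion people")
print(f"Additional fed: {(shifted - current) / 1e9:.1f} billion people")
```

With these placeholder inputs the gain works out to about three billion additional people; the researchers' own accounting, using measured crop totals and dietary requirements, puts the figure at roughly four billion.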
This discovery opens up the possibility that environmental and/or genetic factors may hinder or suppress a specific brain activity that the researchers have identified as helping us prevent distraction. The Journal of Neuroscience has just published a paper about the discovery by John McDonald, an associate professor of psychology, and his doctoral student John Gaspar, who made the discovery during his master's thesis research. This is the first study to reveal that our brains rely on an active suppression mechanism to avoid being distracted by salient irrelevant information when we want to focus on a particular item or task.
McDonald, a Canada Research Chair in Cognitive Neuroscience, and other scientists first discovered the existence of the specific neural index of suppression in his lab in 2009. But, until now, little was known about how it helps us ignore visual distractions. "This is an important discovery for neuroscientists and psychologists because most contemporary ideas of attention highlight brain processes that are involved in picking out relevant objects from the visual field. It's like finding Waldo in a Where's Waldo illustration," says Gaspar, the study's lead author.
First Poem of the Morning
When you and I wave
I wonder if for you
the stranger across
three gray rooftops
over the blackbirds
pecking the softening
skylight rim of morning
through the shapes
of blackened branches
on the other side
of the ice-paned
I wonder if for you
our wave is the first
poem of the morning
by Ann Nadge
Tuesday, April 15, 2014
Robert Alter in The New Republic:
Evelyn Barish begins her impressively researched biography by flatly stating that “Paul de Man no longer seems to exist.” This may be an exaggerated expression of frustration by a biographer whose long-incubated work now appears after what might have been the optimal time for it. Yet there is considerable truth in what she says. De Man is now scarcely remembered by the general public, though he was the center of a widely publicized scandal in 1988, five years after his death at the age of 64. In the 1970s and 1980s, he was a central figure, an inevitable figure, in American literary studies, in which doctoral dissertations, the great barometer of academic fashion, could scarcely be found without dozens of citations from his writings. But the meteor has long since faded: over the past decade and more, I have only rarely encountered references to de Man in students’ work, committed as they generally are to marching with the zeitgeist.
Paul de Man arrived in the United States from his native Belgium in the spring of 1948. He would remain in this country illegally after the expiration of his temporary visa, on occasion finding ways to elude the Immigration and Naturalization Service. But that, as Barish’s account makes clear, was the least of his infractions of the law. Eventually he would be admitted, with a considerable amount of falsification on his part, to the doctoral program in comparative literature at Harvard, from which he would receive a degree, in somewhat compromised circumstances, in 1960. He then went on to teach at Cornell, briefly at Johns Hopkins, and most significantly at Yale, where he became a “seminal” scholar and an altogether revered figure.
More from Wired here.