Sunday, May 22, 2016
Melissa Holbrook Pierson in the Los Angeles Review of Books:
Of the thousand images stored in my mind’s archives, there is only one of me holding a book. The result of what they call a flashbulb memory, where a shock imprints every detail of a scene on the mind forever, it permits me to view a single moment in my dorm at high school: were it not for the book, I would have forgotten everything — the peculiar darkness that used to fall across only half the room, its twin closets, the honey color of their wood, the fact that my hair reached the bottom of my shoulder blades in 1974. The book I hold, frozen in mid-turn, is Pilgrim at Tinker Creek.
Pilgrim blew apart what I knew about writing at age 16: up until I read it, my notions were based on the usual pack of novels, poetry, philosophy, and exposition, all of which stayed neatly in their categories. This book, though, bled across lines (sometimes quite literally; it included plenty of death and injury): it refused to be held to one purpose. It coursed like a river swollen with snowmelt in spring from thing to thing, from inner life to outer. Or, rather, it found the edge where mind meets world. Annie Dillard sang this line, loud and imperative.
I’d thought the stuff I had spent my youth doing was something I’d come up with all on my own, and (to the mind of a self-doubting girl) must therefore be unimportant; but now I’d found someone who made a literature of wandering alone in the woods, watching, listening, poking at flora and fauna, describing views and pieces of nature, and trying to make a whole of her experience.
As he appears in the new documentary The Divide, the great intellectual explains why Brexit is unimportant, why Trump’s climate change denial is catastrophic – and why revolution is easier than you think.
Leo Benedictus in The Guardian:
You talk about capitalism, politics and inequality a lot. Do you ever tire of it? Do you ever wish someone would ask you about something else?
Well, from my point of view, there are two major categories of issues. There are the kind that are humanly important but intellectually pretty shallow. There are the kind that are intellectually quite deep and challenging, but don’t have the immediate human significance. If I had my choice, I’d rather stay on the second, but unfortunately the world won’t go away.
Do you not feel you’ve had enough sometimes?
It’s like seeing a child in the street and a truck coming rapidly. Do you say, “Look, I’m too busy thinking about interesting questions, so I’ll let the truck kill the child”? Or do you go out into the street and pull the child back?
But if it was another child, every day, for decades?
It doesn’t matter. I remember the philosopher Bertrand Russell was asked why he spent his time protesting against nuclear war and getting arrested on demonstrations. Why didn’t he continue to work on the serious philosophical and logical problems which have major intellectual significance? And his answer was pretty good. He said: “Look, if I and others like me only work on those problems, there won’t be anybody around to appreciate it or be interested.”
Brian Boutwell & J.C. Barnes in Nautilus:
For the past few years, social scientists have been buzzing over a particular topic in molecular biology—gene regulation. The hype has been building steam for some time, but recently, it rocketed to the forefront of public discussion due to a widely circulated piece in the New Yorker. Articles on the topic are almost always fascinating: They often give the impression that this particular area of biology stands poised to solve huge mysteries of human development. While that conclusion may be appropriate in fields like medicine and other related disciplines, a number of enthusiasts have openly speculated about its ability to also explain lingering social ills like poverty, crime, and obesity. The trouble is, this last bit isn’t really a feeling shared by many of the genetics experts.
Social scientists’ excitement surrounds what we can refer to broadly as transgenerational epigenetics. To understand why social scientists have become enamored with it, we must first consider basic genetics. Many metaphors exist for describing and understanding the genome; they all capture the reality that genes provide the information for building and running biological machinery like the human body.
From the moment sperm manages to infiltrate an egg cell, genes (segments of our DNA that ultimately produce proteins) are at work knitting together the necessary components to make life possible. This requires exquisite coordination. Even though every cell in your body (minus red blood cells) carries your complete genetic code, not every gene is “turned on” all at once all over the body.
Sergio Graziosi in his blog:
Sometimes reading a flawed argument triggers my rage; I really do get angry, a phenomenon that invariably surprises and amuses me. What follows is my attempt to use my anger in a constructive way. It may include elements of a jerk reaction*, but I’ll try to keep my emotions in check.
Dr. Epstein recently published a badly misguided essay on Aeon, entitled “The empty brain”; the subtitle makes the intended take-home message clear: “Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer.”
Unfortunately, the essay is systematically wrong: virtually every key passage is mistaken, and yet, overall, it tries to make an argument that is worth making. Thus, I grew annoyed by the mistakes and misrepresentations (my immediate comment was “this is so wrong it hurts”), and then descended into anger, because Epstein is actually damaging the credibility of an approach that I find promising, but which is all too often misunderstood or straw-manned.
In what follows, I will blatantly ignore the first rule of civilised debate: I will not try to give a charitable reading of the original essay. I won’t because it would effectively hide the reasons for writing my reply. Instead, I will report the key arguments proposed by Dr. Epstein, explain why I think they are wrong, and then finish off by outlining why I nevertheless sympathise with some of the science it endorses (as I understand it).
More here. [Thanks to Sean Carroll.]
Marten Scheffer in Nature:
Few topics are as disparate as mathematics and love — or are they? Modeling Love Dynamics (World Scientific, 2016) by systems theorist Sergio Rinaldi and others playfully, but convincingly, makes the point that even amorous relationships cannot escape the fundamental laws of dynamical systems. The argument propounded by Rinaldi and colleagues builds on the classical framework of coupled differential equations, which have proven so powerful in describing the essence of relationships in nature such as competition, cooperation and predation. The book’s cover illustration hints at the road ahead: it shows Gustav Klimt’s 1908 painting The Kiss (Lovers). A glance inside reveals that art is an essential part of the analysis of the drama of passion — a drama resulting in large part from the interplay of two strong forces, attraction and repulsion. Simple equations illustrated with elegant diagrams show how, depending on personalities, those forces can result in a transient affair, long-lasting stable equilibrium, or everlasting cycles of attraction and repulsion.
The tales and poems chosen masterfully illustrate a range of mathematical features. The limit cycle, known for driving the oscillating dynamics of many economic or biological systems, is linked, for instance, to one of the greatest love stories in Western culture. That is, the cyclical 21-year platonic relationship between fourteenth-century Italian humanist and poet Francesco Petrarca (Petrarch) and the married Laura (possibly the Provençal noblewoman Laure de Noves), charted in Petrarch’s celebrated collection Il Canzoniere. If three variables are mixed in the differential equations of passion, chaotic dynamics can arise. This is illustrated vividly in Henry-Pierre Roché’s semi-autobiographical 1953 novel Jules et Jim (which inspired François Truffaut’s 1962 film of the same name). Roché documents the love triangle between himself, the brilliant and charming journalist Helen Grund and her shy husband Franz Hessel, his best friend. As with the weather, the course of these dynamics is fundamentally unpredictable in the long run, as the smallest event can put things on a different trajectory. This phenomenon is also known as ‘the butterfly effect’, hinging on the idea that the flap of a butterfly’s wing may eventually lead to a hurricane in a distant place.
Tonight I've watched
the moon and then
the Pleiades
go down
The night is now
half-gone; youth
goes; I am
in bed alone
By Sappho (translated by Mary Barnard)
Saturday, May 21, 2016
Tom Bartlett in the Chronicle of Higher Education:
Weeks of planning can evaporate in an instant, forcing the researchers to improvise. Beyond the logistical aggravation, there’s the matter of personal safety. Where there are fighters, there is often fighting, and while the semi-autonomous Kurdish region of northern Iraq remains relatively sheltered compared with Syria or large swaths of southern Iraq, the proximity to bloodshed prompts understandable unease.
The least jittery member of the team is its leader, Scott Atran, an anthropologist who floats among several institutions, including the University of Michigan and the John Jay College of Criminal Justice, part of the City University of New York. He’s also a founder of the Centre for the Resolution of Intractable Conflict, at the University of Oxford. He’s normally the one arguing to go a little farther afield, to challenge the group’s comfort zone, perhaps to cross over into Syria. While sitting around the hotel he appears restless and testy; headed toward ISIS territory, he is in his element, enlivened and unfazed. "We don’t want to drive off the road, because it’s probably mined on both sides," he warns casually from the passenger seat, the way you might note a change in speed limit or a forthcoming rest stop.
Atran is known as an expert on terrorism, a title he doesn’t particularly want and a word he doesn’t find useful. He views his work, broadly, as examining what motivates people to do things beyond themselves, for good or ill. These days he focuses on the ill, specifically ISIS.
Carl Zimmer in the New York Times:
As the director of the Center for GeoGenetics at the University of Copenhagen, Dr. Willerslev uses ancient DNA to reconstruct the past 50,000 years of human history. The findings have enriched our understanding of prehistory, shedding light on human development with evidence that can’t be found in pottery shards or studies of living cultures.
Dr. Willerslev led the first successful sequencing of an ancient human genome, that of a 4,000-year-old Greenlander. His research on a 24,000-year-old Siberian skeleton revealed an unexpected connection between Europeans and Native Americans.
Dr. Willerslev was one of the early pioneers of the study of ancient DNA, and today he remains at the forefront of an increasingly competitive field. His colleagues credit his success to his relentless work and to his skill at building international networks of collaborators.
“His role is that of catalyst, choreographer, conductor and cajoler — and sometimes all at once,” said David J. Meltzer, an archaeologist at Southern Methodist University.
The scientific enterprise that Dr. Willerslev helped invent now sometimes crosses into culturally sensitive terrain.
Thomas Pogge, one of the world’s most prominent ethicists, stands accused of manipulating students to gain sexual advantage
Katie J.M. Baker in BuzzFeed:
When Thomas Pogge travels around the world, he finds eager young fans waiting for him in every lecture hall. The 62-year-old German-born professor, a protégé of the philosopher John Rawls, is bespectacled and slight of stature. But he’s a giant in the field of global ethics, and one of only a small handful of philosophers who have managed to translate prominence within the academy to an influential place in debates about policy.
A self-identified “thought leader,” Pogge directs international health and anti-poverty initiatives, publishes papers in leading journals, and gives TED Talks. His provocative argument that wealthy countries, and their citizens, are morally responsible for correcting the global economic order that keeps other countries poor revolutionized debates about global justice. He’s also a dedicated professor and mentor, at Yale University — where he founded and directs the Global Justice Program, a policy and public health research group — as well as at other prestigious institutions worldwide. By Pogge’s own count, he’s taught 34 graduate seminars, given 1,218 lectures in 46 countries, and supervised 66 doctoral dissertations.
But a recent federal civil rights complaint describes a distinction unlikely to appear on any curriculum vitae: It claims Pogge uses his fame and influence to manipulate much younger women in his field into sexual relationships. One former student said she was punished professionally after resisting his advances.
Pogge did not respond to more than a dozen emails and phone calls from BuzzFeed News, nor to a detailed letter laying out all the claims that were likely to appear in this article.
When it was first published in Istanbul in 1943, it made no impression whatsoever. Decades later, when Madonna in a Fur Coat became the sort of book that passed from friend to friend, the literary establishment continued to ignore it. Even those who greatly admired the other works of Sabahattin Ali viewed this one as a puzzling aberration. It was just a love story, they said – the sort that schoolgirls fawned over. And yet, for the past three years, it has topped the bestseller lists in Turkey, outselling Orhan Pamuk. It is read, loved and wept over by men and women of all ages, but most of all by young adults. And no one seems able to explain quite why.
The story begins in 1930s Ankara, the Turkish Republic’s newly appointed capital. The narrator has fallen on hard times, and it is only with the help of a crass and belittling former classmate that he is able to find work as a clerk at a firm trading in lumber. Here he meets the sickly, affectless Raif Bey, who is, we’re told, “the sort of man who causes us to ask ourselves: ‘What do they live for? What do they find in life? What logic compels them to keep breathing?’” When at last they make friends, it becomes clear that Raif’s reason for living cannot be his family. The relatives assembled under his roof treat him with the utmost contempt. And yet he welcomes their derision. Even on his deathbed, he seems to accept it as his due. But there is also a notebook, hidden in his desk drawer at work, which he asks his friend to destroy.
Death, for most people, is a rumour; something that happens to others, far away. But it is the last thing you will 'do' - or which will happen to you - and the likelihood is that it will take place in an acute hospital or a care home, orchestrated by strangers. You will have little say in its pace or its manner. There is a risk that, during the course of your dying, you will be subjected to procedures and treatments that are painful, degrading and ultimately futile. If you are old, your children may make all the major decisions for you. Death may creep up on you without warning, without a chance for you to prepare yourself and settle your affairs.
Few books, as R. D. Laing remarked, are forgivable. Most of what I read about death and dying bears little relation to what I see every day in my work on the hospital wards. Doctors and nurses rarely write about death; those who do are generally palliative care (hospice) specialists, and have a particular perspective on the subject, one that I do not completely share. The language used about death and dying tends to have a quality of cloying earnestness: nobody 'dies' anymore; they 'pass over', they 'pass on', or they simply 'pass'. The book I wanted to read about death and dying didn't exist.
Doctors who work in large, acute-care hospitals see death differently to doctors working in the hushed and serene environs of a hospice. And yet most dying still takes place in this kind of hospital, rather than in the hospices. Half a million people die every year in England. A study of deaths in England between 2005 and 2007 found that 58 per cent of all deaths occurred in hospital, 16 per cent in nursing homes, 19 per cent at home and only 5 per cent in hospices.
Al Alvarez called suicide “a dubious immortality”: “All that anguish, the slow tensing of the self to that final, irreversible act, and for what? In order to become a statistic.” In 1971, Alvarez published The Savage God: A Study of Suicide, a literary and philosophical expedition into the places where creativity and suicide overlap; the book was also a tribute to his friend Sylvia Plath, who had taken her life less than a decade earlier. He had published several of her poems in the Observer, where he worked as poetry critic, and they grew closer after she split from Ted Hughes. In Plath’s final months, she and Alvarez would often sit in his London flat, talking about poetry, creativity, and sometimes suicide, though “with a wry detachment,” as he describes it. “It was obviously a matter of self-respect that her first attempt had been serious and nearly successful,” he writes. “It was an act she felt she had a right to as a grown woman and a free agent.”
The Savage God is not a memoir, although the prologue and epilogue that bookend it are intensely personal. “I want the book to start, as it ends, with a detailed case-history, so that whatever theories and abstractions follow can somehow be rooted in the human particular,” Alvarez writes; the haunting prologue presents his brief personal account of Plath’s last months, in which he carefully dissects her depression and the ways it contributed to her eventual death, an outcome that (he suggests) might have been a mistake. Alvarez ultimately believes Plath intended only to act out the allegory of death into which she had written herself, expecting, or perhaps gambling, that she would be saved at the last minute—a miscalculation from which the Myth of Sylvia Plath has grown. “The pity is not that there is a myth…” writes Alvarez, “but that the myth is not simply that of an enormously gifted poet whose death came carelessly, by mistake, and too soon.”
The News of Flowers
Spring. Everything’s liberated.
The news of flowers
eases the poverty of this world.
Throughout this fractured country
(some say it’s a pity,
others not so)
spring has come full force.
An azalea blooming at Cheju Island
in the very south,
after a few days
across the sea
in Southern Cheolla
& Southern Kyeongsang.
A few days later
& it reaches the shore of the Han, mid-country,
& all along the Soyang River.
About a month later
on the upper reaches of the Yalu, North Korea: blossoms.
At the end of May
about 2700 meters up
by a cold spring at the treeline
azaleas bloom in many colors.
This is enough.
One cannot wish for more.
Where could things be better than among the flowers of a spring day?
So with South & North: gradually, evenly.
By Ko Un
from Abiding Places: Korea South & North
Lewis H. Lapham in Lapham's Quarterly:
The champions of Western civilization make a bad mistake by deploring the mind and method of jihad as medieval and barbaric. The techniques and the objectives are modern. From whom do we suppose that jihadists learn to appreciate the value of high explosive as vivid speech if not from the example of the U.S. Air Force over Vietnam, Serbia, and Iraq? The organizers of the 9/11 attacks on Manhattan clearly understood not only the ethos of globalized finance capitalism but also the idiom of the American news and entertainment media. Their production values were akin to those of Independence Day; the spectacle of the World Trade Center collapsing in ruins was rated by the New York film and social critics as “awe inspiring,” “never to be forgotten,” “shatteringly emotional.”
The sense of living in the prophetic end time has been running around in the American consciousness for the past twenty-five years, on the disheartened political left as on the ferocious political right. The final battle of Armageddon furnished the climax for the Left Behind series of sixteen neo-Christian fables that have sold more than 65 million copies to date, presumably to Rush Limbaugh’s dittoheads and future members of the Tea Party. The coauthors of the books, Tim LaHaye and Jerry B. Jenkins, offer their hatred of man as testimony to their love of God, and devote many fondly worded pages to the wholesale slaughter of intellectuals in New York, politicians in Washington, and homosexuals in Los Angeles. Their language is of a piece with the film footage in Mel Gibson’s Passion of the Christ or the videos just in of an ISIS beheading.
From The New York Times:
Randy Shilts’s “And the Band Played On,” about the early days of the AIDS epidemic, and Atul Gawande’s “Being Mortal,” about how systems of care can affect the way we die. And Ian McEwan’s “Enduring Love,” a novel spun out of an obsessive psychiatric syndrome.
Was there any book that influenced your decision to become a writer?
Without a doubt: Primo Levi’s “Survival in Auschwitz.” Levi, notably, defined himself first as a chemist and then as a writer. He has a particularly charming essay about why scientists can be good writers because they distill and clarify, because they boil questions down to their tar, because they understand the Silly Putty-ness of language. If chemists can write like Levi, then God help the writers.
What was the most interesting book you read while researching “The Gene”? And what was the best book you read for “The Emperor of All Maladies”?
I read a wide and bizarre collection of books for “The Gene,” including comics from the 1950s that fantasized about future human mutants, and a popular genre from the 1930s — I guess we might call it Eugenics Lite — that advocated the measurement and breeding of the best babies (blue-eyed, white) to improve the national gene pool. Perhaps the most interesting was Eugen Bleuler’s first case description of schizophrenia, from 1911, which reads like the most incredible novel. For “Emperor of All Maladies,” the one book that I particularly scoured for inspiration was Richard Rhodes’s “The Making of the Atomic Bomb” — an epic account of the Manhattan Project. I cannot think of another book that makes scientific history more riveting.
Friday, May 20, 2016
Justin E. H. Smith in his blog:
Sometime in the summer of 1987 I walked out to our rural-route mailbox and found my membership card for the Young Socialist Alliance, accompanied by a typewritten letter filled with both practical information as well as elevated rhetoric about the youth being the future. I had heard that talk before at Catholic Youth Organization meetings, and was annoyed that I was made to join the mere youth auxiliary of the Socialist Workers' Party. But I was 15 and those were the rules, and I was happy enough to now be officially linked to the largest association of Trotskyists in the United States, whose publishing wing, Pathfinder Press, had already taught me so much about the larger world beyond the Sacramento Valley.
By the following year I had obtained another official document with my name on it, from the Department of Motor Vehicles, which enabled me to drive to the national convention of the SWP at Oberlin College in Cleveland. It enabled me, while my mother, for some mysterious reason, permitted me. In what would have been my junior year I had stopped attending high school for some months, out of sheer stubbornness, and didn't seem to have any other concrete plans, so driving off to do something at a university might have been hoped to hold open the possibility of what was known, even then, as a 'positive influence'. A 'positive influence on the youth'.
So I made it through the high desert of Nevada, through the salt flats of Utah, through the locust plagues of Nebraska, through Illinois, Indiana, and, finally, the state in which I would much later reside for two years and where I am still registered to vote: bleak pseudopalindromic Ohio, microcosm of all that is worst of 'these United States', the state Whitman had the most trouble rhapsodising about. But it was all new and fresh to me in 1988 and I was happy to go to some artsy café in the little town next to the campus and meet some dude named Harold who wore the best thrift-shop sweaters and knew more trivia about The Residents and Negativland than I did. This was the larger world too.
A new report estimates that by 2050, drug-resistant infections will kill one person every three seconds, unless the world’s governments take drastic steps now.
Ed Yong in The Atlantic:
The report’s language is sober but its numbers are apocalyptic. If antibiotics continue to lose their sting, resistant infections will sap $100 trillion from the world economy between now and 2050, equivalent to $10,000 for every person alive today. Ten million people will die every year, roughly one every three seconds, and more than currently die from cancer. These are conservative estimates: They don’t account for procedures that are only safe or possible because of antibiotics, like hip and joint replacements, gut surgeries, C-sections, cancer chemotherapy, and organ transplants.
And yet, resistance is not futile. O’Neill’s report includes ten steps to avert the crisis. Notably, only two address the problem of supply—the lack of new antibiotics. “When I first agreed to do this, the advisors presented it to me as a challenge of getting new drugs,” says O’Neill. “But it dawned on me very quickly that there were just as many, if not more, important issues on the demand side.” Indeed, seven of his recommendations focus on reducing the wanton and wasteful use of our existing arsenal. It’s inevitable that microbes will evolve resistance, but we can delay that process by using drugs more sparingly.
Tom Blunt in Signature:
There are two versions of the Blanche Knopf story. The first is one of triumph, documenting the calculated risks taken by the publishing maven to carve out paths for otherwise-neglected authors who would ultimately shape 20th-century culture and change the book business forever. America’s Harlem Renaissance, hard-boiled detective genre, and fascination with Europe’s sexual freedom can all be traced back to Mrs. Alfred A. Knopf’s business gambits, which in most cases sprang directly from her personal interests, or those of her close friends.
The second version is a tale of what might have been. How differently would Mrs. Knopf’s life and career have turned out if her husband had truly made her an equal partner in their business, as he promised when they were young newlyweds? To what greater heights might the company have flown if Mr. Knopf hadn’t vetoed some of her more risqué choices? Might Blanche have eventually summoned enough independence to go her own way if the couple’s gradual estrangement hadn’t nudged her toward a diet pill habit that slowly destroyed her health and eyesight? And perhaps most regrettably: how many more women might have felt called to work in the publishing world if Alfred hadn’t relentlessly downplayed Blanche’s involvement at every turn, only begrudgingly admitting his wife’s contributions long after her death in 1966?
These questions arise several decades too late to make any difference to Mrs. Knopf, and if it weren’t for Laura Claridge’s new biography The Lady With the Borzoi, they might never have been posed at all.
The name Albert Murray was never household-familiar. Yet he was one of the truly original minds of 20th-century American letters. Murray, who died in 2013 at the age of 97, was an accomplished novelist, a kind of modern-day oral philosopher, a founder of Jazz at Lincoln Center, and the writer of a sprawling, idiosyncratic, and consistently astonishing body of literary criticism, first-rate music exposition, and cunning autobiography. In our current moment of identity politics and multicultural balkanization, the publication of any new Murray text would serve as a powerful reminder that his complex analysis of art and life remains as timely as ever — probably more so.
It’s 2016, and another management guru is revealing the secrets of the creative mind.
It’s not really a very original thing to do. The literature on encouraging corporate nonconformity is already enormous; it goes back many years, to at least 1960, when someone wrote a book called How to Be a More Creative Executive. What was once called “the creative revolution” in advertising got going at around the same time. I myself wrote a book about that subject—a history book!—nearly twenty years ago.
There have been slight variations in the creativity genre over the half-century of its ascendancy, of course. The cast of geniuses on whom it obsessively focuses has changed, for example. And while the study of creativity has always been surrounded with a quasi-scientific aura, today that science is more micro than macro, urging us to enhance our originality by studying the functioning of the human brain.
In the larger literary sense, however, it is now clear that the capitalist’s tribute to creativity and rebellion is an indestructible form. There is something about the merging of bossery and nonconformity that beguiles the American mind. The genre marches irresistibly from triumph to triumph. Books pondering the way creative minds work dominate business-best-seller lists. Airport newsstands seem to have been converted wholly to the propagation of the faith. Travel writers and speechwriters alike have seen the light and now busy themselves revealing the brain’s secrets to aspiring professionals.