Belinda Jack in Times Higher Education:
The cynical account of the rise of the medical humanities – a newish interdisciplinary area that explores the social, historical and cultural dimensions of medicine – would be an economic one. At a time of retrenchment in some subjects at some universities, disciplines are under pressure to demonstrate their practical value. Recent research that claims to show that reading novels promotes empathy would be an example of literature’s utility, particularly for medical students. There’s money in medicine and not so much in the humanities. But how new is this field or set of fields? The ancient Greek physician Hippocrates claimed that “wherever the art of Medicine is loved, there is also a love of Humanity”, suggesting both that medicine is an “art” and that there is a crucial association between medicine and the “human” dimension of the humanities.
In terms of literature, as soon as the novel rose to prominence in the 18th century a good many doctors more than dabbled in writing, often fiction. Oliver Goldsmith (1730-1774) trained as a doctor and wrote the best-selling novel The Vicar of Wakefield (1766), and Keats (1795-1821) turned to poetry in part because of the trauma of physically restraining fully conscious patients so that surgery could be performed without anaesthesia. Friedrich von Schiller (1759-1805), German writer, poet, essayist, dramatist and friend of Goethe, was an army surgeon before achieving fame as a writer.
Paul Basken in The Chronicle of Higher Education:
It was my ritual for seven years. Every day, take two sets of pills—one labeled, the other a mystery. Every three months, take three sets of blood-pressure readings, twice a day for a week. Once a year, collect urine for 24 straight hours, lug it everywhere in an ice pack, then get it through airport security for a flight from Washington to Boston. For me and about 1,000 other participants in our medical trial, the payoff for such tedious detail came back last month: The combination of the two common types of blood-pressure drugs being tested didn’t make any significant difference in the progression of our inherited kidney disease. That was disappointing. But it didn’t necessarily mean that the trial was a failure, a waste of the time I spent on it, or a poor use of the $40-million in taxes that paid for it. The trial’s participants got top-notch medical attention for our polycystic kidney disease, and our records will almost certainly help others with PKD, now and in the future.
…All of that logistical structure can mean a huge financial cost. Randomized trials now account for about 20 percent of the $30-billion annual budget of the National Institutes of Health. Private drug companies spend more than $30-billion on them. Yet drug trials fail at a rate of about 90 percent. That level of failure has attracted serious attention now that U.S. medical research has entered a period of tighter budgets, accelerating technological advances, and extensive procedural reassessments. In that light, much about our trial’s design and execution illustrates a system of human experimentation that’s ripe for overhaul.
Suzanne Berne at the New York Times:
Part of a biographer’s job is to rescue forgotten figures, and in “Sophia: Princess, Suffragette, Revolutionary” Anita Anand has salvaged an extraordinary one. Sophia Duleep Singh was a Punjabi princess and Queen Victoria’s goddaughter, a bucktoothed “docile little thing” who went on to become a celebrated London fashion plate and then a steely suffragist.
Her father, Maharajah Duleep Singh, was 11 when the British seized his vast Sikh empire and 15 when he was sent into exile in England. Victoria doted on him, remarking that he “was beautifully dressed and covered with diamonds” (though not the famed Koh-i-Noor, now among her crown jewels), adding, “I always feel so much for these poor deposed Indian princes.” Graciously, she granted him a stipend, which he overspent remodeling an East Anglian pile into the fabulous “Mogul palace” where he installed his bride, the daughter of a German businessman and an Abyssinian slave.
Born in 1876, Sophia spent much of her early years in the English countryside playing with her brothers and sisters “amidst enclosures filled with ostriches, rare parrots and monkeys,” an idyll that ended when the maharajah pillaged the estate to pay his creditors.
John Sutherland at the Financial Times:
Within the bosom of every old man, said the philosopher William James, there is a dead young poet. TS Eliot, as Robert Crawford suggests in his opening sentence, “was never young”. He’s the Benjamin Button of poets. His first mature work, “The Love Song of J Alfred Prufrock”, was written when he was 22. It contains the couplet:
I grow old . . . I grow old . . .
I shall wear the bottoms of my trousers rolled.
A later poem, “Gerontion” (in Greek, “wizened old man”), opens:
Here I am, an old man in a dry month,
Being read to by a boy, waiting for rain.
The author was barely 30 at the time but already “Old Possum”. Crawford’s endeavour, brilliantly achieved, is to disinter the dead young poet buried in the prematurely aged TS Eliot.
When Eliot died in January 1965 it was, for the literate classes, a passing of the same magnitude as Winston Churchill’s, three weeks later. One genuflected, humble in the face of literary greatness. But Eliot’s reputation, over the next half-century, was to become sadly chipped.
David L. Ulin at The LA Times:
Hornby has written about other female protagonists: Annie in “Juliet, Naked,” Katie Carr in “How to Be Good.” There's something more expansive, though, in “Funny Girl,” which is as sedate a work as he has produced. What I mean is that this is a book that takes the long view, that seeks to give us a broad sense of its characters' circumstances. In that regard, its 1960s setting serves a double purpose — first, to engage us in the energy of the era's burgeoning youth culture, and second, to remind us of the speed with which time eclipses all.
Sophie is an appropriate signifier: “Here was everything they wanted to bring to the screen,” Hornby writes of the production team that discovers her, “in one neat and beautifully gift-wrapped package, handed to them by a ferocious and undiscovered talent who looked like a star. The class system, men and women and the relationships between them, snobbery, education, the North and the South, politics, the way that a new country seemed to be emerging from the dismal old one that they'd all grown up in.”
The members of that team are the novel's other central players — Clive, the leading man who becomes Sophie's faithless fiancé; Dennis, the producer-director who loves her from a distance; the writers, Bill and Tony — one gay, the other married but (perhaps) closeted. It adds up to the portrait of a culture in transition, in which “[w]hat was once both pertinent and laudably impertinent became familiar and sometimes even a little polite.”
Susan B. Glasser in Politico:
Genes as commerce
By Alec Ross, senior fellow at the Columbia University School of International & Public Affairs
Fifteen years from now, everybody reading this will live, on average, two years longer than their current life expectancy because of the commercialization of genomics. The price of mapping an individual’s genetic material has fallen from $2.7 billion to below $10,000, and it continues to fall.
Omniscience into the makeup and operation of the 3 billion base pairs of genetic code in each of our bodies will allow for tests to be developed that will find cancer cells at 1 percent of the size of what can be detected by an MRI today. It will allow for personalized prevention and treatment programs for nearly every illness, and will make today’s medical practices look medieval by comparison.
Of course, all of this will benefit the wealthy before it becomes affordable and available to everybody. That is the cruel reality of many of the innovations to come. They will make people live longer, healthier lives—but not everybody, and not all at once.
Steven Shapin in Boston Review:
Can science make you good?
Of course it can’t, some will be quick to say—no more than repairing cars or editing literary journals can. Why should we think that science has any special capacity for moral uplift, or that scientists—by virtue of the particular job they do, or what they know, or the way in which they know it—are morally superior to other sorts of people? It is an odd question, maybe even an illogical one. Everybody knows that the prescriptive world of ought—the moral or the good—belongs to a different domain than the descriptive world of is…
The ideas and feelings informing the tendency to separate science from morality do not go back forever. Underwriting it is a sensibility close to the heart of the modern cultural order, brought into being by some of the most powerful modernity-making forces. There was a time—not long ago, in historical terms—when a different “of course” prevailed: of course science can make you good. It should, and it does.
A detour through this past culture can give us a deeper appreciation of what is involved in the changing relationship between knowing about the world and knowing what is right. Much is at stake. Shifting attitudes toward this relationship between is and ought explain much of our age’s characteristic uncertainty about authority: about whom to trust and what to believe.
Tomas Hachard in National Post:
When discussing a disease that is expected to double in prevalence over the next two decades, it is hard to countenance a silver lining; currently Alzheimer’s afflicts 5 percent of Canadians over 65, and the only existing treatment is a series of drugs that, at best, alleviate symptoms for a year. Even what little hope there is for avoiding the disease seems feeble. In The End of Memory, a wide-ranging book on the history of Alzheimer’s, Jay Ingram lists a handful of lifestyle choices that apparently help prevent the disease. Exercise and education are two—the most proven. Learning a second language is another. And then there’s “conscientiousness,” an umbrella term, Ingram explains, for goal setting, determination, efficiency, organization, thoroughness, self-discipline, and reliability. According to some studies, the more we exhibit these traits, the less susceptible we are to Alzheimer’s. A responsible life, it seems, might actually afford us a peaceful death.
…The underlying reality of their topic, however, brings an unavoidable bleakness, and not just because of the currently far-off hopes for a cure. Scientists generally agree today that Alzheimer’s differs from normal aging. But precisely what distinguishes the two is still unclear. However unlikely a conclusion it is at this point, Peter Whitehouse’s suggestion that “in some sense we would all get Alzheimer’s if we live long enough”—posited in his 2008 book The Myth of Alzheimer’s—is still with us and signals an important fact about the disease: it’s impossible to detach our fear of it from our more general anxieties about growing old.
Seth Shulman in The Washington Post:
“The arc of the moral universe is long, but it bends toward justice,” the Rev. Martin Luther King Jr. told a crowd of protesters in Montgomery, Ala., in March 1965. King’s use of that quote stands as one of history’s more inspiring pieces of oratory, acknowledging that victories in the fight for social justice don’t come as frequently as we might like, while offering hope that progress will come eventually. But is the contention empirically true? Michael Shermer, a professor, columnist for Scientific American, and longtime public champion of reason and rationality, takes on this question and more. In “The Moral Arc,” Shermer aims to show that King is right so far about human civilization and that, furthermore, science and reason are the key forces driving us to a more moral world. It is at once an admirably ambitious argument and an exceedingly difficult one to prove.

First, Shermer — defining moral progress as “improvement in the survival and flourishing of sentient beings” — needs to make a case that we humans are, in fact, moving toward such an improvement despite terrorist attacks on cartoonists, Islamic State beheadings, Taliban massacres of schoolchildren and police shootings of innocent civilians, among other seemingly daily atrocities. As he notes in the preface, when they heard he was working on a book about moral progress, “most people thought I was hallucinatory. A quick rundown of the week’s bad news would seem to confirm the diagnosis.”
If that weren’t tough enough, Shermer also needs to show that science and scientific reasoning are responsible for bettering our lot. Given science’s role in everything from the development of the atomic bomb to pervasive government surveillance, it’s hard to know which of his self-appointed tasks is more daunting. To his credit, Shermer tackles this broad agenda with an abundance of energy, good cheer and anecdotes on everything from “Star Trek” episodes and the reasoning of Somali pirates to the demise of the Sambo’s restaurant chain. The anecdotes provide leavening but don’t alter the fact that this is a work of serious and wide-ranging scholarship with a bibliography that runs to nearly 30 pages. The effect can be kaleidoscopic and even a bit scattershot at times, but that doesn’t detract from the truly impressive array of data Shermer assembles.
a painting by Mark Rothko
To explain crimson,
the grotesque danger,
the acute beauty
and commotion of it,
how it commands recollection,
even after every trace
is vanished, I describe
our small faces
sweet and sour cherry pits
stacked in front of us
like small cannonballs
the first stain gleaming
inside my teenage thighs,
seen down below
through new breasts,
my cousin’s cheek
after the rake hit
the bony part near her eye
forming a fork-shaped wound,
or at the butcher’s shop,
watching as his thick fingers
his long white apron.
I know there is no forgetting.
Years after my butterflied chest
(the surgeon’s cache) is splayed
under a blaze of lights, I relive red
nightmares that darken
long after the scar that ropes my ribs
by Jim Culleny
from Alehouse 2011
Wayne Scott in The Millions:
When I was on a vacation in the Virgin Islands with my two brothers and my 70-year-old mother — an exceptional hiatus from our lives with family and children, just the four of us, to celebrate my mother’s milestone birthday, our good fortune that we had had her in our lives for such a long time — I happened upon a collection of essays by E.B. White, a book that the house owners had left on the shelf. I had read White’s autobiographical piece, “Once More to the Lake” in college, but here I was, a man in his late 40s, again under its spell. Throughout our time at that lovely house under the clear skies, overlooking the deep-blue Atlantic Ocean, I kept returning to his rumination on summer memories.
Written in August 1941 and published originally in Harper’s, the story is deceptively simple. White takes his son to a camp for a short vacation. It is the same camp, by Belgrade Lake in Maine, where his father had taken him many times when he was growing up, over 30 years before. He writes, “I wondered how time would have marred this unique, this holy spot.” Except for the sound of outboard motors on boats, a mid-century technological advance — a “petulant, irritable sound” that “whined about one’s ears like mosquitoes” — he found it to be the same place. “Once More to the Lake” is not a psychological exploration, except for one recurring detail. As White sees his son engage in activities that he himself used to do — baiting a fish hook, pulling on a bathing suit — he transposes identities, imagining himself as his father to his younger self. The jarring illusion keeps returning.
Clancy Martin in the Chronicle of Higher Education:
Practically speaking, I’ve always been interested in lying. But I remember when the subject first caught my intellectual attention: I was 11 or 12, in a Waldenbooks, and the shelves of the philosophy section—I’ve walked straight to that aisle since I was a kid, with my dad, who loved philosophy, though he was kicked out of college after only one semester—were lined with copies of Sissela Bok’s best-selling Lying. I was nervous even to pick it up, fearing, as many people do, that taking an interest in lies would expose that I was a liar.
This is one of the curious facts about lying. It’s treated a lot like the subject of masturbation was at around the same time. Among my friends, everyone suspected that all of us masturbated, but when one kid, my closest buddy—now a respected psychiatrist—tried to bring it up honestly, we laughed at him and nervously changed the subject.
This is how we handle embarrassing open secrets about popular “vices.” And we lie even more often (a lot more often) than we masturbate. In Dallas G. Denery’s excellent new history of Western thinking on deception, The Devil Wins, he cites a recent study that shows that “during every 10 minutes of conversation, we lie three times and even more frequently when we use email and text messaging.”
Justin E. H. Smith in his blog:
I finally read Michael Walzer's influential article on “Islamism and the Left,” after being told a number of times that I had inadvertently been echoing his opinion when I sided unconditionally with the caricaturists against the assassins who came to kill them. I find that I do agree with an early, fairly obvious point Walzer makes, but then disagree with most of the rest.
The obvious point is that the American left has for the most part failed to provide any serious analysis of the phenomenon of political Islamism, and moreover that it has failed to do so for very bad reasons, including notably the groundless presumption of common cause with the Islamists. Where my disagreement begins is with Walzer's central assertion that Islam presents a particular problem in the current global order. It seems to me that this claim is at odds with his own further assertion that religion in general is functioning as a stimulant to violence throughout the world in the post-secular age.
To ward off in advance any suspicion of Islamophobia on his own part, Walzer invokes the Christian crusades in the Levant of the Middle Ages to show that there is nothing eternal or essential about Islamic violence, but that in different times and places the same violence can be done in the name of other religions, sometimes targeting Muslims. A Muslim in the 12th-century Levant would have been justified in supposing that Christians had a problem with violence, Walzer observes. But why time-travel, when we can just travel? We don't have to go to the 12th-century Levant, when we can go directly to 21st-century India, where the Muslim minority, right now, is well justified in supposing that Hindus have a 'violence problem'. The same goes for Muslims in Burma being massacred by rampaging Buddhist monks.