“You can’t go home again.” Thomas Wolfe’s famous phrase has long served as a dictum for writers and analysands, but it needs an addendum: You can’t stop trying. Sam Shepard has acknowledged the compulsion — and also the futility — in interviews and dramatized it in plays where protagonists return to the place that’s supposed to take you in, but doesn’t. They come home not for comfort but to settle scores, demand respect, even elicit an acknowledgment of their existence. Family members in extremis shout and holler, hoping, like the father in “Buried Child,” that the sounds they make will signal an affirmative reply to the question, “Are we still in the land of the living?”
This question floats over Shepard’s novella of short-burst imaginings and conversations with himself, as the aging narrator ruefully takes stock. He’s in the land of the living, but only just, hanging on by his fingernails, his memory, his imagination, his never-ending obsession with his father, his blue thermal socks (nicked from a movie set) and his ongoing arguments with women, including a sometime-girlfriend 50 years his junior. She’s called the Blackmail Girl because she’s recording their conversations for a book that will launch her literary career. Maybe. There’s a wry poetic justice in the spectacle of a writer, that scavenger of others’ lives, helplessly furnishing material for another. The voyeur voyeured.
There are no half measures to Kay Redfield Jamison’s medico-biographical study of poet Robert Lowell. It is impassioned, intellectually thrilling and often beautifully written, despite being repetitive and overlong: A little too much would seem to be just enough for Jamison.
Nonetheless, “Robert Lowell: Setting the River on Fire” achieves a magnificence and intensity — dare one say a manic brilliance? — that sets it apart from more temperate and orderly biographies. Above all, the book demands that readers seriously engage with its arguments, while also prodding them to reexamine their own beliefs about art, madness and moral responsibility. Reading this analysis of “genius, mania, and character” is an exhilarating experience.
From the late 1940s to the mid-1970s, Lowell was the most admired and talked-about American poet of his generation. Scion of a privileged New England family, he counted among many distinguished relations two notable poets — James Russell Lowell and Amy Lowell — as well as Percival Lowell, the astronomer who sighted what he thought were canals on Mars.
Here’s how to open with a bang. “It was the afternoon of my 81st birthday, and I was in bed with my catamite when Ali announced that the archbishop had come to see me.” That’s the confident, melodious, literate, entertaining first sentence of Earthly Powers, the 1980 novel by Anthony Burgess. Not bad for someone born in 1917. One of the few advantages in choosing authorship as a profession is the faint possibility of a long career, but not everyone manages to keep words and passion alive for the duration. Burgess’s practical attitude to his writing, his detailed understanding of voices, the changing sounds of humanity and the musics and mass cultures they produce all helped to keep his voice on the page young and, in every sense, vital. His outrage in the face of media-sponsored human folly also helped to keep him burning bright. Burgess always both gave and received in his relationship with popular culture.
The general question I've been living with is how we go about getting a better scientific grip on everything social. The social sciences have developed away from the natural sciences, even with some hostility toward them, and that, I believe, is a source of poverty. If we want a more ambitious understanding of how social life functions, of the mechanisms involved, the challenge is to achieve continuity with neighboring natural sciences. The obvious neighbors to begin with are cognitive neuroscience, ecology, biology, and others.
I started as a social scientist. I started as an anthropologist doing fieldwork among a small group of people in the south of Ethiopia, asking myself fairly standard anthropological questions.
I was in this tribe in the south of Ethiopia, studying rituals—sacrifices and divinations. They had a fairly rich ritual life with lots of symbols and so on, and I would keep asking them, “What is the meaning of the symbols you’re using? What are the reasons you do this ritual the way you do?” And I never got a satisfactory answer, or so I thought. When asked about the meaning they said, “We do it because that’s what our fathers did, and our forefathers.” That was always the answer: “We do it because that’s the way we’ve always done it.” I was very frustrated by this and went looking for possibly better informants—an older member of the society, a “wise man,” or whatever—who would know more, but I never found one.
It is 7 o’clock in the morning and Harvey Friedman has just sent an email to an unspecified number of recipients with the subject line “stop what you are doing.” It features a YouTube link to a live 1951 broadcast of a concert by the famous Russian pianist Vladimir Horowitz. “There is a pattern on YouTube of priceless gems getting taken down by copyright claims,” Friedman writes, “so I demand (smile) that you stop everything you are doing, including breathing, eating, thinking, sleeping, and so forth, to listen to this before it disappears.”
His comment takes its place at the top of a chain of emails stretching back months, with roughly as many messages sent at 3 a.m. as at noon or 9 p.m. The haphazard correspondence covers a wide range of topics, from electronic music editing to an interdisciplinary field Friedman calls “ChessMath.” At one point, he proposes to record at home, by himself, a three-part “Emotion Concert.” Anonymous piano players on the email thread discuss their own thoughts on the lineup.
As diverse as the topics in the email history are, Friedman asks the same question of them all: What are their basic constituents and what laws govern them? He seems to be searching for the right vocabulary—“the right way,” he says, “of talking about what the fundamental ideas are, to black-box the ad hoc technicalities and get to the real meat of the thing.”
That is not to say all of these topics are equal. There is one that is nearest and dearest to Friedman’s heart: the foundations of mathematics, which concerns itself with the consistency, unity, and structure of mathematics itself.
In 2011, I was hired, straight out of college, to work at the White House and eventually the National Security Council. My job there was to promote and protect the best of what my country stands for. I am a hijab-wearing Muslim woman — I was the only hijabi in the West Wing — and the Obama administration always made me feel welcome and included.
Like most of my fellow American Muslims, I spent much of 2016 watching with consternation as Donald Trump vilified our community. Despite this — or because of it — I thought I should try to stay on the NSC staff during the Trump administration, in order to give the new president and his aides a more nuanced view of Islam, and of America's Muslim citizens.
I lasted eight days.
When Trump issued a ban on travelers from seven Muslim-majority countries and all Syrian refugees, I knew I could no longer stay and work for an administration that saw me and people like me not as fellow citizens, but as a threat.
The evening before I left, bidding farewell to some of my colleagues, many of whom have also since left, I notified Trump’s senior NSC communications adviser, Michael Anton, of my departure, since we shared an office. He was surprised at first, asking whether I was leaving government entirely, and then fell silent — cautious, perhaps, not asking why. I told him anyway.
Since well before the dawn of history, human beings have gathered together around flickering campfires to tell and listen to tales. We still do, even if the campfires are now more often glowing screens – in cinemas, on television sets, or in our hands. There are a great many reasons for this: fictional narratives offer us so many things. But in our present moment it is worth remembering one reason in particular: storytelling offers an antidote to nostalgia. By imagining, we create the potential for what might be. Religions are composed of stories precisely because of this potency. Stories have the power to liberate us from the tyranny of what was and is. We are all creators of fictions, and we all have a role to play in imagining our way out of the nostalgic traps strewn around us. But there are special opportunities open to those of us who create fiction for a living, and above all to those of us who are writers, because we are freer to create what we wish, without requiring funding for our projects, as a film-maker might. We are the startups of the storytelling world, the crazy solo inventors in the R&D department of humanity’s narrative imagination.
We should be glad for these opportunities. The future is too important to be left to professional politicians. And it is too important to be left to technologists either. Other imaginations from other human perspectives must stake competing claims. Radical, politically engaged fiction is required. This fiction need not focus on dystopias or utopias, though some of it probably will. Rather it needs to peer with all the madness and insight and unexpectedness and wisdom we can muster into where we might desirably go, as individuals, families, societies, cultures, nations, earthlings, organisms. This does not require setting fiction in the future. But it does require a radical political engagement with the future.
Take back control? Make America great again? Restore the caliphate? We can do better than these. Storytellers, now is the time to try.
Many Americans might not know the more polemical side of race writing in our history. The canon of African-American literature is well established. Zora Neale Hurston, Richard Wright, James Baldwin are familiar figures. Far less so is Samuel Morton (champion of the obsolete theory of polygenesis) or Thomas Dixon (author of novels romanticizing Klan violence). It is tempting to think that the influence of those dusty polemics ebbed as the dust accumulated. But their legacy persists, freshly shaping much of our racial discourse.
On the occasion of Black History Month, I’ve selected the most influential books on race and the black experience published in the United States for each decade of the nation’s existence — a history of race through ideas, arranged chronologically on the shelf. (In many cases, I’ve added a complementary work, noted with an asterisk.) Each of these books was either published first in the United States or widely read by Americans. They inspired — and sometimes ended — the fiercest debates of their times: debates over slavery, segregation, mass incarceration. They offered racist explanations for inequities, and antiracist correctives. Some — the poems of Phillis Wheatley, the memoir of Frederick Douglass — stand literature’s test of time. Others have been roundly debunked by science, by data, by human experience. No list can ever be comprehensive, and “most influential” by no means signifies “best.” But I would argue that together, these works tell the history of anti-black racism in the United States as painfully, as eloquently, as disturbingly as words can. In many ways, they also tell its present.
“Poems on Various Subjects, Religious and Moral,” by Phillis Wheatley (1773)
No book during the Revolutionary era stirred more debates over slavery than this first-ever book by an African-American woman. Assimilationists and abolitionists exhibited Wheatley and her poetry as proof that an “uncultivated barbarian from Africa” could be civilized, that enslaved Africans “may be refin’d, and join th’ angelic train” of European civilization and human freedom. Enslavers disagreed, and lashed out at Wheatley’s “Poems.”
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Yes, inequality is getting worse every year. In early 2016 Oxfam reported that just 62 individuals had the same wealth as the bottom half of humanity. About a year later Oxfam reported that just 8 men had the same wealth as the world's bottom half. Based on the same methodology and data sources used by Oxfam, that number is now down to 6.
How to account for the dramatic increase in the most flagrant and perverse of extreme inequalities? Two well-documented reasons: (1) The poorest half (and more) of the world has continued to lose wealth; and (2) The VERY richest individuals — especially the top thousand or so — continue to add billions of dollars to their massive fortunes.
Inequality deniers and apologists say the Oxfam methodology is flawed, but they're missing the big picture. Whether it's 6 individuals or 62 or 1,000 doesn't really matter. The data from the Credit Suisse Global Wealth Databook (GWD) and the Forbes Billionaire List provide the best available tools to make it clear that inequality is extreme and pathological and getting worse every year.
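The arithmetic behind these headcounts is a simple cumulative comparison: take the largest fortunes in descending order and keep adding them until their total matches the combined wealth of the bottom half. A minimal sketch in Python, using hypothetical round figures rather than the actual Credit Suisse or Forbes data:

```python
# Illustrative sketch of the Oxfam-style comparison.
# All figures below are hypothetical stand-ins, not the real
# Credit Suisse GWD or Forbes Billionaire List numbers.

bottom_half_wealth = 410e9  # assumed total wealth of the poorest 50%, in USD

# assumed net worths of the richest individuals, largest first (USD)
top_fortunes = [90e9, 85e9, 80e9, 75e9, 72e9, 70e9, 65e9, 60e9]

def individuals_matching_bottom_half(fortunes, target):
    """Return the smallest n such that the top-n fortunes sum to >= target."""
    total = 0.0
    for n, wealth in enumerate(fortunes, start=1):
        total += wealth
        if total >= target:
            return n
    return None  # the listed fortunes never reach the target

print(individuals_matching_bottom_half(top_fortunes, bottom_half_wealth))
# prints 6 with these assumed figures
```

The headline number is thus entirely a function of two moving inputs — the bottom half's (shrinking) total and the top fortunes' (growing) values — which is why it can fall from 62 to 8 to 6 in successive years under the same methodology.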
Brent Simpson, Robb Willer & Ashley Harrell in Nature:
The threat of free-riding makes the marshalling of cooperation from group members a fundamental challenge of social life. Where classical social science theory saw the enforcement of moral boundaries as a critical way by which group members regulate one another’s self-interest and build cooperation, moral judgments have most often been studied as processes internal to individuals. Here we investigate how the interpersonal expression of positive and negative moral judgments encourages cooperation in groups and prosocial behavior between group members. In a laboratory experiment, groups whose members could make moral judgments achieved greater cooperation than groups with no capacity to sanction, levels comparable to those of groups featuring costly material sanctions. In addition, members of moral judgment groups subsequently showed more interpersonal trust, trustworthiness, and generosity than all other groups. These findings extend prior work on peer enforcement, highlighting how the enforcement of moral boundaries offers an efficient solution to cooperation problems and promotes prosocial behavior between group members.
“I’ve found another one!” My mother is delighted, full of excitement, cup of tea in hand at our small kitchen table in Dublin, overloaded with notes and books. She has long had a passionate interest in Irish history but this is her biggest project yet – an investigation into Irish Protestant nationalists who contributed to the Easter Rising.
She has a hunch that there were more of them than anyone has realised. I know she is writing a book for the centenary. It has become all-consuming: for several years she has scoured archives and libraries and interviewed descendants of Protestant rebels, including Garret FitzGerald, whose rebel mother was Presbyterian. Each seam of documents uncovers a new lead, a fresh name. She feels a need to reinsert into the history of the Rising these lives she believes have been overlooked, especially the long-neglected working-class Protestants of Dublin.
I don’t dare ask her to what extent it is a search for self. From a practising Church of Ireland family, of very humble Dublin and Wicklow origins, my mother was a scholarship girl, educated through Irish in Coláiste Moibhí, the training college established by the State to produce Gaelic-speaking, nationalist teachers for Protestant primary schools. Devout and liberal, patriotic and pacifist, she defies easy stereotypes, just like the lives she is researching.
Second, in just about every takedown or defense of highfalutin academic jargon, it’s generally taken for granted that such jargon is just part of the job academics do, but when it comes to determining the role of “the academic” in society, things get messier. The arguments make it seem like the main choice facing academics involves determining to what degree they might deign to display some civic-mindedness and try to translate their findings into something that will somehow engage and benefit “the public.” But all such arguments tend to rest on unchallenged assumptions about academics in general, and these assumptions are often the biggest problem.
There’s a huge difference, for instance, between defending academic jargon as such and defending academic jargon as the typical academic so often uses it. There’s likewise a huge difference between justifying jargon when it is absolutely necessary (when all other available terms simply do not account for the depth or specificity of the thing you’re addressing) and pretending that jargon is always justified when academics use it. And there’s a huge difference between jargon as a necessarily difficult tool required for the academic work of tackling difficult concepts, and jargon as something used by tools simply to prove they’re academics.
In the preface to his new book, the philosopher Daniel Dennett announces proudly that what we are about to read is “the sketch, the backbone, of the best scientific theory to date of how our minds came into existence”. By the end, the reader may consider it more scribble than spine – at least as far as an account of the origins of human consciousness goes. But this is still a superb book about evolution, engineering, information and design. It ranges from neuroscience to nesting birds, from computing theory to jazz, and there is something fascinating on every page.
The term “design” has a bad reputation in biology because it has been co-opted by creationists disguised as theorists of “intelligent design”. Nature is the blind watchmaker (in Richard Dawkins’s phrase), dumbly building remarkable structures through a process of random accretion and winnowing over vast spans of time. Nonetheless, Dennett argues stylishly, asking “design” questions about evolution shouldn’t be taboo, because “biology is reverse engineering”: asking what some phenomenon or structure is for is an excellent way to understand how it might have arisen.
In the early 1960s, Berger and his wife, the translator Anya Bostock, left London for a suburb of Geneva, where he wrote in relative obscurity for several years, publishing two further novels that attracted little attention. “It is a struggle,” he explained in a letter to an older novelist, “because I made so many enemies as an art-critic; I have now offended the sense of order by abandoning art criticism; and I have exiled myself here seeing nobody except a few cherished but powerless friends. But meanwhile one must write and hope.”
The silence of exile was in fact a preparation for the great flowering of Berger’s middle period. As the generation of ‘68 spread its wings and the New Left seemed to promise revolution, Berger’s work broke free of all previous models. Between 1965 and 1975 he produced an awe-inspiring array of forms: photo-texts, broadcasts, novels, documentaries, feature films, essays. Many of these were done in collaboration. With the Swiss photographer Jean Mohr, he made A Fortunate Man, a documentary portrait in words and images of a country doctor in the Forest of Dean, and A Seventh Man, a kind of modernist visual essay about the courage and perseverance of migrant laborers in Europe. (This latter project was the book Berger always said he was proudest of; in a 2010 reprinting he mused that sometimes a book, unlike its authors, can grow more of-the-moment with time — a statement that has only grown truer in recent years as the migrant crisis reaches new levels.) Berger also worked with Alain Tanner on several films, including Jonah Who Will Be 25 in the Year 2000, an ensemble comedy that became a touchstone of post-‘68 optimism.