Wolf milk and wilderness America. Romulus and Remus built a city but it couldn’t hide the animal in their hearts: a river-child discovers blood when he searches for a blessing. Hold your motherland in your mouth, all marble and doomed, a single lozenge of loss. Heaven fell into the pond and killed all the fish.
Even in the shape of a boy I can wear the morning. Daisies behind my ear. Minutes thin gold arm hairs. Blackberry vine tied around my wrist. Under this field is the only battle my father lost. Place your ear right here if you want to listen
The journalist and author Sam Quinones became aware of it even before the 2016 election, when he saw those Trump/Pence yard signs all over opioid country. Within days of Trump’s electoral victory, he published the disturbing story, as did the historian Kathleen Frydl under the apt title, “The Oxy Electorate.”
Donald Trump did very well, much better than Mitt Romney had in 2012, in the areas hardest hit by a raging drug epidemic. Indeed, one could describe the main opioid victims in exactly the same demographic terms that pundits use to characterize the core of Trump’s electoral support: non-Hispanic, mostly working-class whites without a college education living in rural areas and small cities. The opioid and Trump addictions, the one individual and the other collective, are symptoms of the same malaise.
For one, they are driven by the same powerful economic forces, gainfully employed in afflicting a vulnerable population. The rapacious, unregulated capitalism of the kind that now shapes the Trump agenda prepared the ground for the opioid crisis in Appalachia, the Midwest Rust Belt, and elsewhere by engendering the inequalities and hardships that drive so many to despair.
As the Bible (Exodus) teaches and, more recently, Hannah Arendt warns, liberation is not yet liberty. The institutions of liberty must first be constituted, and people need to learn how to make them work while breathing spirit into them.
The years 1989–1991 were a time of liberation for all the people of Eastern Europe who had suffered totalitarian political systems and ideological indoctrination under Soviet domination. The future, the fate, of all liberated nations depended on the success or failure of transforming liberation into liberty. Some of the just-liberated nations did fairly well, others less so. In Hungary in 1989, enthusiasm for system change was great among intellectuals who were spiritually starving for liberty. A considerable part of the population shared this enthusiasm, believing that the establishment of democratic institutions would immediately lead to the Western standard of living. Thus, they expected a far better life.
For a while, all previously Soviet-dominated countries were developing in a similar direction. Later, however, differences became as important as similarities. The Hungarian case proved unique, since only Hungary went through a second system change, not only de facto but also de jure. The prime minister of Hungary, Viktor Orbán, described the result of the second system change as “illiberal democracy” and as “the system of national collaboration” (I discuss this more below).
The result proves that, in Hungary, a great opportunity was wasted and aborted: the opportunity to let liberal democracy take root in Hungarian soil. Instead, Hungarians seem to have relied on a longstanding tradition of following a leader, expecting everything from above, believing, or pretending to believe, everything they are told, mixed with a kind of fatalistic cynicism about the impossibility of things being otherwise.
Released only weeks before the author’s death on August 5, Toni Morrison: The Pieces I Am opens with artist Mickalene Thomas making a collage of an elderly Morrison superimposed upon an image of her as a young woman, embellished with flowers and patterned fabrics. The force of black art dominates the documentary—not only Morrison’s own words, but images from twenty-one prominent black artists including Kara Walker and Faith Ringgold, all shown atop a lush score composed by Kathryn Bostic. The emotional richness of the film skillfully pulls viewers into the unique power of Morrison’s fiction. However, its nearly exclusive focus on her novels—and its progressive liberal depiction of Morrison as a singular genius—sidelines Morrison’s political impact as an editor and essayist, creating a strangely lopsided impression of her life and impact.
In the late sixties and early seventies, before she was known as an author, Morrison was a Random House trade editor who almost singlehandedly introduced black radical activists to mainstream American readers. No single editor or major publishing house has surpassed Morrison’s contribution in the intervening four decades. Cofounder of the Black Panther Party Huey P. Newton (To Die for the People, 1972); prison activist and Black Panther field marshal George Jackson (Blood in My Eye, 1971); and Angela Davis (Angela Davis: An Autobiography, 1974) were all published by Morrison at Random House. Morrison did not necessarily embrace these ideologies, but believed it was invaluable that they circulate in the marketplace of ideas—despite their demonization by the U.S. government.
There is no question that radical right-wing fringe phenomena have been normalized under Donald Trump, most spectacularly when he claimed that there were good people on both sides in the Charlottesville riots. What used to be called the lunatic fringe in American politics is being made respectable by such pronouncements, as well as by the euphemism of “alt-right” itself, a term innocuous enough to disguise its white supremacist ideology. Adorno also warned that the afterlife of fascist tendencies within democracy is more dangerous than the afterlife of fascist tendencies against democracy. Today we face a situation where Adorno’s distinction has been cashed in. Tendencies from within, brilliantly analyzed by Wendy Brown in her 2015 book Undoing the Demos: Neoliberalism’s Stealth Revolution, are merging in the US with outright tendencies against democracy. The Trump regime participates in both. Just think of the Republicans’ systematic attacks on voting rights through gerrymandering, recently legitimized by the Supreme Court. Or compare Mark Zuckerberg’s motto “move fast and break things” with Trump’s daily practice of attacking and dismantling American governmental institutions, a practice that is fully in sync with Steve Bannon’s demand to “deconstruct the administrative state” and with Breitbart’s call to attack the “Democrat media complex” online. Trump uses Twitter and his fake “fake news” mantra to gaslight the electorate, while much of the real deconstruction of governmental institutions regarding the law and the constitution, health care, the environment, housing, foreign policy, and climate change rarely catches the headlines in any sustained fashion. While right-wing parties in the European Union are gaining ground, with few exceptions they are not (yet) the mainstream. In the United States, the Republicans are the mainstream and further to the right than either the AfD in Germany or the latest incarnation of the National Front in France.
The German philosopher and social theorist Theodor W. Adorno died fifty years ago this week, in the late summer of 1969. Even at the time of his death, he was entangled in controversy. Student militants, many of them aligned with the so-called “extra-parliamentary opposition,” had once seen him as a political ally. But when they occupied the Institute for Social Research at Goethe University Frankfurt, where Adorno kept his office, he called in the police, an act that was seen as unforgivable by the student radicals: How could a theorist of anti-fascism side with the authorities?
Adorno’s decision opened a bitter divide between the so-called Frankfurt School and the more militant members of the student movement that would never truly heal. In late April 1969, when Adorno commenced the first of his lectures on “An Introduction to Dialectical Thinking,” two students rushed the podium, demanding that Adorno engage in a public act of self-criticism. Three female students then showered him with flower petals and bared their breasts. The aging professor fled the hall, and students distributed a leaflet: “Adorno as an Institution is Dead.”
A month later, in an interview for Der Spiegel, Adorno remarked on the irony that in a piece of political theater he had been cast in the unlikely role of a cultural conservative: “To target me, I who have always positioned myself against every kind of erotic repression and sexual taboos!” Adorno tried to resume instruction in June, but further protests prevented him from lecturing.
However, it was an invention seven years earlier that restructured not just how language appears, but indeed the very rhythm of sentences; for, in 1496, Manutius introduced a novel bit of punctuation, a jaunty little man with leg splayed to the left as if he were pausing to hold open a door for the reader before they entered the next room, the odd mark at the caesura of this byzantine sentence that is known to posterity as the semicolon. Punctuation does not exist in the wild; it is a function not of how we hear the word, but of how we write the Word. Such marks are what the theorist Walter Ong, in his classic Orality and Literacy, described as being “even farther from the oral world than letters of the alphabet are: though part of a text they are unpronounceable, nonphonemic.” None of our notations are implied by mere speech; they are creatures of the page: comma, and semicolon; (as well as parenthesis and what Ben Jonson appropriately referred to as an “admiration,” but what we call an exclamation mark!)—the pregnant pause of a dash and the grim finality of a period. Has anything been left out? Oh, the ellipsis…
Some claim that the idea of human freedom is built on illusions about human specialness that are a holdover from a religious conception of the world, and that they should be swept aside with the advancing tides of science. This position has been trumpeted loudly by people who present themselves as brave defenders of science: by scientists such as Einstein, Stephen Hawking and Richard Dawkins, and by philosophers including Alexander Rosenberg and Sam Harris. To most people, however, it seems literally unbelievable that the scales of fate don’t hang in the balance when making a difficult decision. And it is not just those dark nights of the soul where this matters. You think that you could cross the street here or there, pick these socks or those, go to bed at a reasonable hour or stay up, howl at the moon and eat donuts till dawn. Every choice is a juncture in history and it is up to you to determine which way to go.
Yet, if there is one foundational scientific fact, it is that things can’t happen that the laws of physics don’t allow. And the clash between these two things shows that there is something centrally important about ourselves and our position in the cosmos that we don’t understand.
Fifty years ago this month, Bob Dylan played the Isle of Wight Festival. They say if you can remember 1969 you weren’t there, but I do and I was, boomerphobes. I can even tell you what half a century feels like if you’re interested, although it’s a bit layered. A bit contradictory.
In all honesty, I can just reach out and touch 1969. It’s no distance at all, like from here to the end of the garden. However, the distance between now and then is also an aeon of unfathomable space-time parameters, heavier than Jupiter’s gravity multiplied by infinity. Historically at least, it is definitely a world away. Fifty years ago we were nearing the end of the Post-Industrial Jurassic period and to be honest feeling a bit done in, a bit puffed out, what with all that dark satanic coal, tar, diesel, petrol, two-stroke and fag smoke. Our fat-marbled air, yet to comprehend an internet, held instead molecules of carbon grit, Wimpy onions and brickdust from pulverised Victorian streets.
It’s hard to overstate the significance of Toni Morrison in the pantheon of global black literature. For many of us she was the lodestar who inspired us to write from within our own cultures, often from female perspectives, and to dignify the heterogeneity of black experiences through literature we could call our own. As a young, aspiring writer I was enriched by her work and empowered by her words of wisdom. I read an interview with her in the seminal Black Women Writers at Work, edited by Claudia Tate, in 1984, which articulated exactly how it felt to be a young black British woman writer at that time. She and others galvanised my generation to write our stories and smash through the walls of the status quo. “There’s a notion out in the land,” Morrison said, “that there are human beings one writes about, and then there are black people or Indians or some other marginal group. If you write about the world from that point of view, somehow it is considered lesser.” Morrison, our elder stateswoman, spoke with authority on issues of race and literature, as she did for the rest of her life. We always knew she was on our side.
She wrote uncompromisingly about African American society and history, and positioned her characters on the main stage as fully fledged humans with an extensive emotional range and intellectual scope. She showed the complexity of their lives through her formidably imaginative storytelling powers. Her books were in the tradition of a literature that stretched back to the “slave narratives” of the 19th century and she was by no means the only writer filling in the cultural absences in an American literature that too often excluded, marginalised or stereotyped her people. Writers such as Audre Lorde, Gloria Naylor, Alice Walker and Ntozake Shange were also early inspirations for me. But Morrison reigned supreme, in no small part due to her extensive output of 11 novels and three books of critical thinking, as well as works for children, opera and theatre.
Toni Morrison spoke at Harvard Divinity School on the subject of altruism in 2012. Her lecture is published here for the first time.
On an October morning in 2006, a young man backed his truck into the driveway of a one-room schoolhouse. He walked into the school and after ordering the boy students, the teacher and a few other adults to leave, he lined up 10 girls, ages 9 to 13, and shot them. The mindless horror of that attack drew intense and sustained press as well as, later on, books and film. Although there had been two other school shootings only a few days earlier, what made this massacre especially notable was the fact that its landscape was an Amish community — notoriously peaceful and therefore the most unlikely venue for such violence.
Before the narrative tracking the slaughter had been exhausted in the press, another rail surfaced, one that was regarded as bizarre and somehow as shocking as the killings. The Amish community forgave the killer, refused to seek justice, demand vengeance, or even to judge him. They visited and comforted the killer’s widow and children (who were not Amish), just as they embraced the relatives of the slain. There appeared a number of explanations for their behavior — their historical aversion to killing anyone at all for any reason and their separatist convictions. More to the point, the Amish community had nothing or very little to say to outside inquiry except that it was God’s place to judge, not theirs. And, as one cautioned, “Do not think evil of this man.” They held no press conferences and submitted to no television interviews. They quietly buried the dead, attended the killer’s funeral, then tore down the old schoolhouse and built a new one.
Their silence following the slaughter, along with their deep concern for the killer’s family, seemed to me at the time characteristic of genuine “goodness.” And I became fascinated with the term and its definition.
When I met him, Bryan Magee was nearly 89, marvellously lucid, curious to hear about my time at Oxford, and paralysed from the waist down: in many ways the ideal interviewee. For a generation of young viewers, Magee’s legendary television series about philosophy were a baptism in the waters of the subject, and he the urbane and worldly gatekeeper to a realm of theoretic abstraction and grounded, vigorous discussion such as had never before been entered—a watershed moment in a primetime slot.
“I wouldn’t rely on television for my introduction to anything,” Magee confided in me, blinking from behind glasses so thick they appeared to be double-glazed for warmth. “It seems to me a completely unimportant medium.” Bryan! I wept internally… It was like being told there was really only ever one Ronnie, or watching David Attenborough kick a pigeon. But—remembering what television is generally like—it was hard to disagree. Magee smiled grand-paternally back at me from where he sat, inside a small arms-reach fortress of books and papers, wielding a copy of Dumas the size of a house brick; the television screwed to the opposite wall was, I assumed, purely ornamental.
Bryan Magee, who died last month, had a career of intimidating sweep. He came up to Oxford in 1949, a particular post-war moment when the university’s unofficial monopoly on the production of its trademark ruling elite seems to have been a little overstretched, as Magee found he had to take up several careers almost at once. He was watched by millions as a television journalist on This Week and began his 30-book literary career, making considerable contributions to scholarship on Wagner and Schopenhauer, all the while keeping up friendships with figures like Karl Popper and Bertrand Russell.
India’s decision two days ago to revoke most of Article 370 of its constitution and annex the part of Jammu & Kashmir it holds has sent Subcontinental and transcontinental punditocracy into a frenzy of analysis, interpretation, speculation, and prediction. Several scenarios have risen to the surface.
The most interesting of these, generating a lot of chatter on the Internet – and elsewhere, no doubt – is a conspiracy theory that India’s move is part of a brilliant coordinated strategy between India, Pakistan, and the US to eventually make the LoC an international border with minimal political cost to either government. There are many variations of this theory, but the basic idea is this. First, India moves into its part of Jammu & Kashmir and annexes it, allowing the BJP government to look heroic and turning the LoC into an international border, with a buffer territory – Pakistani Jammu & Kashmir – on the other side. Then, after a suitable interval of making noises and writing plaintive but futile missives to the UN, Pakistan declares that the situation is intolerable and annexes its part, thus making the LoC an actual international border. Uncle Sam rewards Pakistan for this daring act by allowing it to negotiate a favorable settlement in Afghanistan, thus fulfilling Pakistan’s dream of “strategic depth”. Some sort of free cross-border movement is negotiated for Kashmiris on either side of the border. China secures CPEC. Everyone is left happy and dreaming of visits to Oslo.
I think this scenario is extremely unlikely to be true – though it makes for a good movie plot.
C. Brandon Ogbunu and C. Malik Boykin in the Boston Review:
In recent years, biology’s “nature vs. nurture” war has reemerged with advanced weapons, although the central questions have not changed: What makes us human? Why are we different from one another? Nonetheless, the methods used to address them have undergone several revolutions. We now benefit from hundreds of twin and adoption studies, which have provided heritability estimates for dozens of characteristics relating to human behavior and wellness. Simultaneously, we are reaping the benefits of technological breakthroughs that have made it possible to screen thousands of individuals to uncover genes associated with particular traits. Thanks to this, we have been able to correlate genetic signatures with a growing list of physical (e.g., height, skin color), physiological (e.g., risk for type-2 diabetes, hypertension), and behavioral (e.g., risk for depression, autism) traits. At the same time, epidemiology, psychology, and sociology continue to demonstrate the pliability of the human experience across populations, and we continue to learn more about the social forces that create vast differences in the human experience.
In combination, work from the natural and social sciences should have fostered a golden age for the study of human behavior. And yet, conversations about how to explain differences between individuals and groups are more controversial than ever—perhaps not surprisingly, given the political implications of any answer. Recent breakthroughs in molecular biology have compounded the stakes.
We’ve got these bodies and these brains, which work okay, but we also have minds. We see, we hear, we think, we feel, we plan, we act, we do; we’re conscious. Viewed from the outside, you see a reasonably finely tuned mechanism. From the inside, we all experience ourselves as having a mind, as feeling, thinking, experiencing, being, which is pretty central to our conception of ourselves. It also raises any number of philosophical and scientific problems. When it comes to explaining the objective stuff from the outside—the behavior and so on—you put together some neural and computational mechanisms, and we have a paradigm for explaining those.
When it comes to explaining the mind, particularly the conscious aspects of the mind, it looks like the standard paradigm of putting together mechanisms and explaining things like the objective processes of behavior leaves an explanatory gap. How does all that processing give you subjective experience? The question of why it feels like something from the inside doesn’t look like it’s directly addressed by these methods. That’s what people call the hard problem of consciousness, as opposed to, say, the easy problems of explaining behavior.
Discussion can then spin off in a thousand directions. Could you explain conscious experience in terms of the brain? Does it require something fundamentally new? Does it exist at all? Lately, I’ve been interested in coming at this from a slightly different direction. We’ve got the first-order problem of consciousness, and then people from AI research, or neuroscience, or psychology often say, “There’s a problem here, but I’m not quite sure what I can do with it.”