The idea that Debussy is one of music’s great revolutionaries still causes consternation. Many who consider themselves fans are aware only of the sensual timbres and mellifluous images in sound, often inspired by literary or visual allusion: water, air, wind, moonlight, at once static and mobile. How can music so apparently formless and exquisite also trigger innovation?
This contradiction between sensory beauty and meticulous, visceral invention is part of Debussy’s fascination. Alert to the intellectual world of early twentieth-century France, he shunned politics and ideology. His music, recognizable but not obviously singable, has been used in commercials for dog food, digitalized baby chimes and, especially his Clair de Lune, in hundreds of film soundtracks. (His influence on mainstream culture is discussed in Matthew Brown’s Debussy Redux, 2012.) A popular radio station recently described “La Fille aux cheveux de lin” as “balmy and relaxing”, all as effortless as a bathe in the Dead Sea.
Late last year, I witnessed an extraordinary surgical procedure at the Cleveland Clinic in Ohio. The patient was a middle-aged man who was born with a leaky valve at the root of his aorta, the wide-bored blood vessel that arcs out of the human heart and carries blood to the upper and lower reaches of the body. That faulty valve had been replaced several years ago but wasn’t working properly and was leaking again. To fix the valve, the cardiac surgeon intended to remove the old tissue, resecting the ring-shaped wall of the aorta around it. He would then build a new vessel wall, crafted from the heart-lining of a cow, and stitch a new valve into that freshly built ring of aorta. It was the most exquisite form of human tailoring that I had ever seen.

The surgical suite ran with unobstructed, preternatural smoothness. Minutes before the incision was made, the charge nurse called a “time out.” The patient’s identity was confirmed by the name tag on his wrist. The surgeon reviewed the anatomy, while the nurses — six in all — took their positions around the bed and identified themselves by name.

A large steel tray, with needles, sponges, gauze and scalpels, was placed in front of the head nurse. Each time a scalpel or sponge was removed from the tray, as I recall, the nurse checked off a box on a list; when it was returned, the box was checked off again. The old tray was not exchanged for a new one, I noted, until every item had been ticked off twice. It was a simple, effective method to stave off a devastating but avoidable human error: leaving a needle or sponge inside a patient’s body.
In 2007, the surgeon and writer Atul Gawande began a study to determine whether a 19-item “checklist” might reduce human errors during surgery. The items on the list included many of the checks that I had seen in action in the operating room: the verification of a patient’s name and the surgical site before incision; documentation of any previous allergic reactions; confirmation that blood and fluids would be at hand if needed; and, of course, a protocol to account for every needle and tool before and after a surgical procedure. Gawande’s team applied this checklist to eight sites in eight cities across the globe, including hospitals in India, Canada, Tanzania and the United States, and measured the rate of death and complications before and after implementation.
The results were startling: The mortality rate fell to 0.8 percent from 1.5 percent, and surgical complications declined to 7 percent from 11 percent.
Origin stories are woven with many threads: Some we spin ourselves, while others we inherit. The great German artist Charlotte Salomon (1917–1943) accounted for herself—for who she was, and why she was, and where she came from—not by wondering what of herself was fact and what was fiction. Rather, the real and present question was Leben? oder Theater? (Life? or Theatre?). In other words, how to distinguish genuine presence and raw experience from the spectacle and folly of human making.
Life? or Theatre? is the title of Salomon’s singular and revelatory masterpiece, which she described very simply as ein Singespiel, “a play with music.” But it isn’t simple: It is a true Gesamtkunstwerk, a total work of art, at once memoir and novel, script and libretto, painting and music, art object and spirit force. (Salomon herself writes, with little exaggeration, of the “soul-penetrating nature of the work.”)
There’s a joke, attributed to Oscar Wilde, that the most frightening sentence in the English language is: ‘I had a very interesting dream last night.’ If Wilde did say that, it’s a safe bet that he wouldn’t have liked Insomniac Dreams, because this short book is focused entirely on the dream-life of Vladimir Nabokov. It has at its heart a record of dreams that Nabokov kept for eighty days from October 1964, while he was living at the Montreux Palace Hotel – in terms of his books, after he had finished Pale Fire and before he wrote Ada. He recorded the dreams on waking, using the set-up he employed for writing his books, in his neat pencil handwriting, on lined A6 index cards.
The usual reason people take an interest in their own dreams is to divine their meaning. That wasn’t Nabokov’s motive. The inspiration for his project came from An Experiment with Time, a book by J.W. Dunne, published in 1927 and renowned in its day. Dunne’s theory was that time doesn’t only run forwards in a linear direction, and that, as a result, dreams can contain glimpses of the future. Not that dreams, in Dunne’s view, are only predictive: they mix past events, future events and random mental fluff. The point of recording dreams at the moment of waking was to gather evidence for precognition: a contemporaneous record of dream-predictions that subsequently turn out to be accurate.
In recent months, there’s been a groundswell of evidence showing that more volume in both the left and right hemispheres of the cerebellum (Latin for “little brain”) may be linked to Homo sapiens’ evolutionary success in comparison to Neanderthals, who inhabited Ice-Age Europe about 250,000 to 40,000 years ago.
In January 2018, researchers from the Department of Human Evolution at the Max Planck Institute for Evolutionary Anthropology in Leipzig reported that the bulging of the cerebellar hemispheres played a significant role in giving present-day human brains a more globular shape in comparison to Neanderthals, who had a more elongated endocranial shape. This paper, “The Evolution of Modern Human Brain Shape,” was published in Science Advances.
One hundred and sixty years ago, at a time when the light bulb was not yet invented, Karl Marx predicted that robots would replace humans in the workplace.
“[O]nce adopted into the production process of capital, the means of labor passes through different metamorphoses, whose culmination is the machine, or rather, an automatic system of machinery,” he wrote in his then-unpublished manuscript Fundamentals of Political Economy Criticism. “The workers themselves are cast merely as its conscious linkages.”
Gradually, in the century and a half since Marx wrote those words, machines have taken on more and more jobs previously done by humans. The 20th century political movements that attempted to make Karl Marx’s ideas reality may have failed but, 200 years since the philosopher’s birth on May 5, 1818, his analysis and foresight have repeatedly proven true. We are, in many ways, living in the world Marx predicted.
Politics on both sides of the Atlantic is being played out in the costumes of dead generations. Trump won the White House with a Reagan campaign slogan, pledging to bring back factory jobs and tariff wars. Democrats believe desperately in the existence of Russian conspiracies. British conservatives yearn for the nineteenth century, while academics at Oxford seek an “intelligent Christian ethic of empire.” Jeremy Corbyn has made postwar socialism popular again, with the help of a line from Shelley. Such retromania might not be so surprising—every age of crisis, as Marx famously argued, conjures up the spirits of the past for guidance and inspiration. But it is harder to account for a ghostly presence that provides neither: the public intellectual who wants to fight about the Enlightenment.
Steven Pinker has released a book, Enlightenment Now, which argues that the solutions to all our problems—global warming, inequality, terrorism—lie in the “timeless ideals” of the eighteenth century. Jordan Peterson, recently anointed “the most influential public intellectual in the world right now,” has labeled identity politics an assault on the Enlightenment principle of human rights. David Brooks thinks that a populist Trump, a “Nietzschean Putin” and a “Marxian China” each represent a waning of faith in the Enlightenment project: “a long line of thought,” as Brooks aptly put it, like one you might put on a graph, but from which we’ve deviated. To get back on track we don’t really need to think through our principles, ideals or projects; we just need a sensible reminder of how Old and True and Good they are. Dead writers can do our thinking for us.
Bowiephilia is the bedroom religion of the too-smart, sensitive loner alone at home while everyone else is at the prom—an alienated adolescent’s dream of an aesthetic rapture, out of the soul-killing suburbs (like Bromley, where Bowie lived and languished as a teen), into a world where weirdos are exalted, not stuffed into gym lockers while the jocks guffaw. Unsurprisingly, Bowie was that kid. “I felt often, ever since I was a teenager, so adrift and so not part of everyone else…so on the outside of everything,” he says, in Geoffrey Marsh’s essay “Astronaut of Inner Spaces” (in the exhibition catalogue). “I wanted to be a fantastic artist, see the colors, hear the music, and they just wanted me turned down. … I had to retreat into my room; so you get in the room and you carry that ruddy”—British for “damned”—“room around with you for the rest of your life.”
How many of the visitors to David Bowie is were that kid, too? How many of us still carry that room around with us, a lifetime later? As a teen, I lived in mine.
One might have expected that in his criticisms of Gramsci and the Gramscians, a Marxist like Anderson would have shifted the emphasis back from the cultural superstructure to the economic base. But that’s not what happens. What both books set against culture and ideology is not economics but physical coercion: military force as a—perhaps even the—decisive component of power, hence as perhaps the determining factor in history. Questions of how glaring a deviation this is from Marxist orthodoxy (if such a thing still exists) will certainly be of interest to those who look up to Anderson as a Marxist guru. But these questions are finally less interesting than Anderson’s impenitent insistence that coercion, not class or modes of production, is the heart of history. Getting away from an emphasis on coercion—call it dictatorship of the proletariat, or think of the barricades—is usually seen as Gramsci’s most salient accomplishment in reinterpreting the concept of hegemony. The major intention behind both of Anderson’s books is getting back to it.
Grant was thirty-nine years old, apparently a hopeless failure, when Confederate troops fired the first shots on Fort Sumter. In the meantime, his political ideas had been slowly developing. While Jesse Grant was an avid abolitionist, Ulysses was no such thing; he opposed slavery in theory, but also feared, like many Northerners, that “outright abolitionism might lead to bloody sectional conflict.” He had even cast his one vote in a presidential election for James Buchanan, a Democrat—a fact that would embarrass him in later years. His father-in-law Fred Dent (a man as bossy and controlling as his own father) was a Missouri slaveowner of reactionary leanings, and his own wife, Julia, owned slaves while she was married to Grant, not divesting herself of this property until the Emancipation Proclamation. Grant himself quickly freed William Jones, the slave Fred had given him, but he was no abolitionist, dismissing John Brown’s raid as the act of a fanatic. But Chernow provides evidence that Grant became increasingly anti-slavery, a Free-Soil Democrat, in the years leading up to the war.
As humans, we are defined by, among other things, our desire to transcend our humanity. Mythology, religion, fiction and science offer different versions of this dream. Transhumanism – a social movement predicated on the belief that we can and should leave behind our biological condition by merging with technology – is a kind of feverish amalgamation of all four. Though it’s oriented toward the future, and is fuelled by excitable speculation about the implications of the latest science and technology, its roots can be glimpsed in ancient stories like that of the Sumerian king Gilgamesh and his quest for immortality.
In writing To Be a Machine, my book about transhumanism, my thinking on the subject was heavily influenced by the psychologist Ernest Becker’s 1973 study The Denial of Death. It’s an extraordinarily potent work of social anthropology, the underlying argument of which is that much of our culture is a reaction – variously destructive and creative – to the inadmissible fact of our own mortality. Though Becker was writing well before transhumanism existed as a movement, his book is useful in positioning it as a neurotic symptom of our inability to accept our own mortality. It’s also, more broadly, an eloquent and unsettling disquisition on the inexhaustible weirdness of the human animal.
According to Akeel Bilgrami, liberalism and liberal politics have their own limitations and cannot save us from the savagery of capital. In this way, he intellectually provokes us to go beyond liberalism and reimagine an alternative political vocabulary. His philosophy rejects the ideology of capitalism and envisions an alternative as the way forward for humanity. This alternative is, of course, Left-centric and socialistic in perspective, and Bilgrami sympathises with Left politics in his home country and elsewhere.
His writings and philosophical ideas on the themes of secularism, modernity, Marxism and Gandhi have produced new perspectives on these subjects and contributed significantly to our intellectual debates. His highly influential essay “Gandhi, the Philosopher” provides a fresh reading of Mahatma Gandhi. Bilgrami unearths the integrity in Gandhi’s ideas, contrary to the popular notion of inconsistency and fragmentation in Gandhi. As a philosopher, Bilgrami, despite being an atheist, does not completely reject the possibility of religion playing a critically instructive role in our time. As he says, “religion is not primarily a matter of belief and doctrine but about the sense of community and shared values that it can sometimes provide in contexts where other forms of solidarity—such as a strong labour movement—are missing, and it sometimes provides a moral perspective for a humane politic as it did in the liberation theology movement in Central America.”
Ever since mathematics got properly underway around 3,000 years ago, there has been only one way to gain access to the field: you had to spend many years developing a fairly extensive calculation skillset. You needed it, in the first instance, to pass the graduation and entrance examinations that granted initial access to the field. Then, once accepted into the world of mathematics, calculation of one kind or another was what all mathematicians spent the bulk of their mathematical time doing. Arguably, for most of the history of mathematics, the subject really was, to a large extent, primarily about calculation of one form or another. Newton, Leibniz, Bernoulli (any of them), Fermat, Euler, Riemann, Gauss and the other greats of times past were all superb masters of calculation. (We should also include Boole, since his famous Boolean algebra is also a calculation system.)
But whereas most laypersons seem to think that calculation is all there is to mathematics, surely none of the greats did. Calculation was an important tool (more accurately, a set of tools) you needed to do mathematics, they must have realized, but the essence of mathematics is much more, a plateau of knowledge that transcends all the calculation techniques.
The New York City police officers looked bored, unable to understand a word, as they eyed the angry crowd at Madison Square Garden. A sawmill worker from the Canadian province of British Columbia took the stage with a retinue of robed warriors toting curved swords. He wore an ornate turban and sliced the air with his hand as he promised a massacre of Hindus.
“They say that Hindus are our brothers!” he declared in Punjabi. “But I give you my most solemn assurance that, until we kill 50,000 Hindus, we will not rest!”
In response, the crowd erupted in slogans: “Hindu dogs! Death to them! Indira bitch! Death to her! Blood for blood!”
“Indira” referred to Indira Gandhi, then prime minister of India. She lived for only three months after this scene unfolded.
It was July 28, 1984—the founding convention of the World Sikh Organization (WSO), created to carve an independent Sikh state out of India. The millworker, Ajaib Singh Bagri, was number-two in the Babbar Khalsa International, a terrorist group engaged in an armed struggle to win that state, to be called Khalistan, or Land of the Pure.
It’s 2018 and everybody’s talking about sex robots.
It started last month, when a Toronto man intentionally drove his van into a crowd. His ideology? The incel movement — a politically radicalized form of misogyny in which “involuntarily celibate” men envision taking vengeance on the virile “Chads” and shallow “Stacys” they believe are contributing to their sexual poverty. (It’s the same movement, fomented on internet discussion forums like Reddit and 4chan, that inspired the 2014 Santa Barbara shooter, venerated in incel circles as the “Supreme Gentleman”). How, various media outlets wondered, could we combat such a radical, toxic ideology, one that had already racked up a high body count?
Out of this reaction came a modest proposal. George Mason University economist Robin Hanson published a blog post seemingly advocating for “sexual redistribution”: a subversion of the sexual marketplace in which sexual access was state-sanctioned and state-organized. The government, in other words, should intervene to provide incels with sex.
Basingstoke sits roughly midway between London, where I’ve lived all my adult life, and Southampton, where I grew up. On the frequent train journey between the two, Southampton would barely ever change, but Basingstoke constantly threw up new office complexes and blocks of luxury flats, seemingly enjoying a permanent boom while Southampton declined. I’d never get off the train to look at the town, having been alarmed by it on a visit as a teenager, by the way there seemed to be “no there there”—just an enclosed mall, ringed by motorways, with Barratt Homes suburbia around it and nothing much more. This, I melodramatically thought to myself, is what they want for all of us: lives lived around shopping, property and nothing else, in towns stripped of anything distinct.
But what if that lack of all the obvious signifiers of urban uniqueness—the historic buildings, the art galleries, museums and concert halls, the “vibrant street life”—is itself an identity?
Does the lizard brain trick the body into singing its ancient song? Of course, you are more than the parts you recognize as you. Perhaps those other parts were quieter in the past, or did their work without being noticed, while now you can see their elbows, their toes sticking out, pulling on the strings of your life. But those same creatures were always there, pulling on the strings of your life. Will you one day feel about the mothering instinct the same way you now feel about the sex instinct, which also suddenly turned on? Like that other passage, you may resist it, but in retrospect it will have taken you. You didn’t make a choice to go in that direction. Life—nature—pulled your strings. That is why you have no regrets about those years. And where did it land you? In a more interesting place. It resulted in a more interesting time. Is your body now pulling you towards motherhood, in the same way?