In some late month of 1995, William H. Gass attempted a flight from New York to Saint Louis but was stalled by fog at the flight boards. He repaired to a small table at an airport bar, his socks pulped and moaning, and spent the night with a galley of Robert Musil’s The Man without Qualities. Gass ordered a glass of rosé, began reading, and observed the ways that the characters in the novel seemed to come and go like people in an airport bar. Time passes, and eventually civil servants and industrialists of 1913 Vienna wander into the bar itself, right alongside the airport castaways—or so Gass tells us in the essay he went on to write about Musil.
After my plane lurched off the runway in New York, I took a folded copy of Gass’s essay out of my pocket and started reading. In September, I’d begun working on a review of The William H. Gass Reader, steeping myself in his life’s work, and now it was October, and I was uncertain about the direction of the piece. I declined the free snack mix and kept reading. I again tried to make sense of the beginning: there is a grounded flight in New York that occasions an essay in which an airport bar bleeds into an Austrian novel, and fiction into nonfiction, and then all sense of genre melts away as the review progressively constructs a lyrical world with its own logic and law. It struck me now that this was an uncanny echo of the most oft-repeated anecdote of Gass’s literary life.
Despite devising both the defining equation and the defining thought experiment of quantum physics, Erwin Schrödinger was never comfortable with what he helped to create. His “Schrödinger’s Cat” paradox, published in 1935, was an attempt to expose the flaws in the physics that flowed from his eponymous equation. And yet, that cat – both dead and alive – has become an icon of quantum physics rather than a warning against its shortcomings.
Schrödinger was born in Vienna in 1887. He was an exemplary schoolboy, displaying a startling ability in all his classes. He taught himself English and French in his spare time, and nurtured a love of classical literature. By the time he enrolled at the University of Vienna in 1906, he was focused on physics, but still took the time to learn a great deal of biology, which informed his later work – contributions that were cited as inspirational by the discoverers of DNA.
The work for which he is remembered requires some context. As with all science, an individual’s contributions to physics rarely occur in a vacuum, and a host of other figures set the stage for Schrödinger’s entrance. His seminal work began with his attempts to resolve a central mystery of the nascent quantum theory.
Visit the African savannas in Zimbabwe or Namibia, and you might notice large, towering termite mounds dotted about the landscape—nature’s skyscrapers, if you will. And nature is quite the engineer: those mounds are self-cooling, self-ventilating, and self-draining. New 3D X-ray images have revealed that one of the secrets to this impressive efficiency is a vast network of micropores in the walls of the mounds, according to a recent paper in Science Advances.
Termite mounds, with their ingenious mechanisms for climate control, have been providing inspiration for architectural design for at least the last 20 years, most notably when the Zimbabwean architect Mick Pearce based his design for the Eastgate Center in his nation’s capital of Harare on the termite mounds he observed in the region. He wanted to move away from the big glass-block designs previously favored for office buildings, and to heat and cool his building almost entirely by natural means. The Eastgate Center is the country’s largest commercial and shopping complex, and yet it uses less than 10 percent of the energy consumed by a conventional building of its size, because it has no central air conditioning and only a minimal heating system.
The termite mounds are basically fungus farms, since fungus is the termites’ primary food source. Conditions have to be just right in order for fungus to flourish. So the termites must maintain a constant temperature of 87°F in an environment where the outdoor temperatures range from 35°F at night to 104°F during the day.
In the past few years, whenever far-right nationalists have been banned from social media, violent extremists have faced boycotts, or institutions have refused to give a platform to racists, a faux-outraged moan has gone up: “So much for the tolerant left!” “So much for liberal tolerance!” The complaint became so hackneyed it turned into an already-hackneyed meme. It’s a wonder anyone thinks this line has any rhetorical force. The equation of tolerance with acquiescence, passivity, or a total lack of boundaries is a reductio ad absurdum that denudes the word of meaning. One can only laugh at unserious characterizations that do such violence to reason.
The concept of toleration has a long and complicated history in moral and political philosophy precisely because of the many problems that arise when the word is used without critical context. In some absurd, 21st-century usages, tolerance is even conflated with acceptance, approval, and love. But it has historically meant the opposite—noninterference with something one dislikes or despises. Such noninterference must have limits. As Goethe wrote in 1829, “tolerance should be a temporary attitude only; it must lead to recognition. To tolerate means to insult.” Tolerance by nature exists in a state of social tension.
So what was it that made Robinson Crusoe different from previous English fiction? First, Defoe was the first major writer in English literature who did not take a plot from mythology, history, legend or prior literature. The next was Samuel Richardson (1689–1761), whose immensely important novel Pamela (1740) it will be relevant to mention a little later. In the plots of these two writers we see the difference, for example, from Chaucer, Spenser, Shakespeare and Milton. Second, Defoe was the first to convey the reality of time, to portray a life in the bigger picture of a historical process, and in terms of day-to-day thoughts and activities. Although his timings are inconsistent, his narrative convinces us that the action is occurring at a particular time. Third, Defoe was the first to produce a whole narrative as if it took place in a physical environment to which a character was attached by means of vivid detail: the description of objects, for example, such as clothing and implements. Previously and traditionally, place was treated in a vague and generalised way, with only incidental physical description. Fourth, the use of figurative language, previously a prominent feature of romances, was noticeably reduced; it was much rarer in Defoe and Richardson than in any writer before.
Before I knew who Claire Denis was, she taught me how to dance. When I was eighteen, it was easier to stay in with a movie than to go to a party and be surrounded by strangers. One night, I watched Denis’s film “Beau Travail,” from 1999. Afterward, I couldn’t sleep. I kept replaying the ending, transfixed by a man with a battered face, Galoup. For ninety minutes, Galoup (Denis Lavant) is small and hunched, a military officer who, after being ejected from the French Foreign Legion, can’t find meaning in civilian life. In a closing scene, he makes his bed, carefully tucking in the corners, and lies down, clasping a gun. Then we hear the pulse of Corona’s disco hit “The Rhythm of the Night.” We cut to Galoup smoking in a night club, leaning against a panelled mirror. He bobs his head to the music, tracing loose arcs in the air with his cigarette. He snarls. He spins in a tight circle, smoke trailing him like a cape. Then, at the chorus, unsmiling and intent, he lets himself go, flying into the air, fingers splayed like a gecko’s. I can’t describe what it felt like watching him for the first time, more blur than human. But I remember what it did to me. I got up and I began to wave my hands above my head, alone in the dark.
As Vincent van Gogh’s Starry Night Over the Rhône goes on show at Tate Britain, it is, in one sense, coming home. This might sound like wishful thinking. For the past half century the painting has hung in Paris, and its singing Mediterranean colours, which the artist himself described as “aquamarine”, “royal blue” and “russet gold”, bear little resemblance to the murky half-tints of the Thames, which runs past Tate Britain’s Millbank site. Yet its spring exhibition, Van Gogh and Britain, is organised on the principle that the foundations of the Dutchman’s art, both his eye and his intellect, were laid not in the south of France, nor in the misty light of the Low Countries, but in London, where he spent three life-defining years (1873-76) as a young man.
A case in point: if you look beyond the hallucinogenic brilliance of Starry Night Over the Rhône, which Van Gogh painted in 1888, two years before his death, you will notice a family resemblance to a small black and white engraving that he first encountered more than a decade earlier, during his London stay. Gustave Doré’s Evening on the Thames shows London from Westminster Bridge, a view Van Gogh knew intimately from his commute between his suburban lodgings and the Covent Garden office where he worked as a clerk. It wasn’t the pomp of the neo-gothic Houses of Parliament that drew Van Gogh to Doré’s image so much as the regular pattern made by the gaslights as they flared across the river. Fifteen years later he would transpose this jolt of modernity to Starry Night Over the Rhône, a view of Arles in which new-fangled streetlamps compete with exploding stars to light up the sky.
Cancer immunotherapy drugs, which spur the body’s own immune system to attack tumors, hold great promise but still fail many patients. New research may help explain why some cancers elude the new class of therapies, and offer some clues to a solution. The study, published on Thursday in the journal Cell, focuses on colorectal and prostate cancer. These are among the cancers that seem largely impervious to a key mechanism of immunotherapy drugs. The drugs block a signal that tumors send to stymie the immune system. That signal gets sent via a particular molecule that is found on the surface of some tumor cells. The trouble is that the molecule, called PD-L1, does not appear on the surface of all tumors, and in those cases, the drugs have trouble interfering with the signal sent by the cancer.
The new study is part of a growing body of research that suggests that even when tumors don’t have this PD-L1 molecule on their surfaces, they are still using the molecule to trick the immune system. Instead of appearing on the surface, the molecule is released by the tumor into the body, where it travels to immune system hubs, the lymph nodes, and tricks the cells that congregate there. “They inhibit the activation of immune cells remotely,” said Dr. Robert Blelloch, associate chairman of the department of urology at the University of California, San Francisco, and a senior author of the new paper. The U.C.S.F. scientists discovered that they could cure a mouse of prostate cancer if they removed the PD-L1 that was leaving the tumor and traveling to the lymph nodes to trick the immune system. When that happened, the immune system attacked the cancer effectively. Furthermore, the immune system of the same mouse seemed able to attack a tumor later even when the drifting PD-L1 was reintroduced. This suggested to Dr. Blelloch that it might be possible to train the immune system to recognize a tumor much the way a vaccine can train an immune system to recognize a virus.
Liesl Schillinger in the New York Review of Books:
If you leaf through the pages of one of the tall, puffy black leatherette volumes of the Encyclopedia Britannica’s Macropædia (a portmanteau made from the Greek words for “big” and “education”), you will find Arthur Koestler’s long essay on “Humour and Wit,” which is the only laugh-out-loud-funny encyclopedia entry anyone is likely to encounter anywhere. You can’t read the whole thing online, as it has been abridged; to see the genuine article, you have to hold the actual book in your hand. Koestler wrote the essay for the maiden edition of the Macropædia in the 1970s, adapting it from his capacious books Insight and Outlook (1949) and The Act of Creation (1964), which break down the various manifestations of creativity, talent, originality, and genius. The six pages in the Britannica provide, if you will, the gist.
The purpose of his essay is to demonstrate how and why humor works. Koestler begins by listing jokes that illustrate distinct comic principles. My favorite, #5, appears under the heading “The Logic of Laughter.” It’s a joke Freud liked to tell as well, about a Marquis in the court of Louis XV who enters his bedroom to find a bishop making love to his wife. After observing them in flagrante, the Marquis calmly steps to the window, opens it, and extends his arms, blessing the people on the street below.
“What are you doing?” cried the anguished wife. “Monseigneur is performing my functions,” replied the Marquis, “so I am performing his.”
This joke works, Koestler explains, because the Marquis’s behavior is “both unexpected and perfectly logical—but of a logic not usually applied to this type of situation.” The reader expects the Marquis to respond with moral outrage, or even to draw a sword; instead, in an absurdly literal-minded way, he reevaluates his job description on the spot, and acts accordingly.
The modern world is full of technology, and full of anxiety about technology. We worry about robot uprisings and artificial intelligence taking over, and we contemplate what it would mean for a computer to be conscious or truly human. It should probably come as no surprise that these ideas aren’t new to modern society — they go way back, at least to the stories and mythologies of ancient Greece. Today’s guest, Adrienne Mayor, is a folklorist and historian of science, whose recent work has been on robots and artificial humans in ancient mythology. From the bronze warrior Talos to the evil fembot Pandora, mythology is rife with stories of artificial beings. It’s both fun and useful to think about our contemporary concerns in light of these ancient tales.
I recently stumbled across a statue in Baltimore that celebrates the young men of the city who fought in the “Spanish War.” On a narrow triangle in a residential neighborhood, this lone soldier stands at ease, holding a rifle across his body and staring into the distance, unperturbed by the city buses screeching past or the litter that collects below his feet. The pedestal records the years, from 1898 to 1902, when these brave Americans fought against Spain’s weak imperialist hold on the Philippines, Cuba, and Puerto Rico. Known as “The Hiker,” fifty or so versions of this statue still quietly dot city streets and parks across the United States. The bulk were erected in the 1920s in northern states, with Baltimore’s in 1943, and the last in 1965 near Arlington Cemetery.
Their anti-imperialist sentiment resonated with a different kind of monument that emerged in the same period—those to the Confederacy. To numerous white Americans, the democratic reconstruction of the former Confederacy was an imperialist exercise, and both types of statue celebrated a fight against empire. But they mobilized a rose-tinted vision of past war-making to reject the contemporary reality of early twentieth-century U.S. statecraft: Washington had built an empire, subjecting millions with dark skin to its rule. The anti-imperialist sentiment implicit in the statues, whatever its source, did not curtail empire. It denied its existence.
These monuments—both to the Confederacy and to far-flung operations and occupations—reveal just how vexing the term “empire” is in the U.S. vocabulary.
Last year saw the publication of a book that could well turn out to be a future classic of art writing. Jack Whitten’s Notes From the Woodshed was released just a few months after the painter’s death in New York at the age of 78. More than 500 pages of journal entries, written between March 24, 1962, and December 27, 2017, Notes From the Woodshed gives as true a sense of how the life of an artist is lived, and how it’s lived for the sake of his work, as any I’ve read. The book’s hallmarks are a kinetic energy of thought and immediacy of expression that trump literary style or even good spelling. By the end of his life, Whitten knew that his notes would be published—he wrote a prefatory essay in September 2015, more than two years before his last journal entry—but he didn’t gussy them up. They have been abridged, though, partly because he could be as critical of his fellow artists as he could be generously enthusiastic about them, and “a few of his writings have been redacted here to protect the living,” notes the book’s editor, Katy Siegel, who also curated last year’s exhibition, “Odyssey: Jack Whitten Sculpture, 1963–2017,” at the Baltimore Museum of Art and the Met Breuer.
The subsequent outpouring of creativity at the Bauhaus has since become the stuff of legend. Yet despite its popularity among teachers and students, the school and its methods were consistently controversial. As the clouds of nationalism gathered over Germany, many in Weimar were troubled by the blatant internationalism of the Bauhaus. Others attacked the emphasis on freedom and experimentation as inappropriately indulgent for a state-funded school. The economic uncertainties of the early 1920s only increased the pressure on Gropius. In 1925, the authorities in Weimar closed the school, forcing its relocation to Dessau, about 100 miles to the northeast.
This move provided Gropius the architect with perhaps his best-known commission: new premises for the Bauhaus and living quarters for its staff and students. Dessau also marked a new chapter in the ideology of the school. In 1926 – the year of the reopening – Gropius published what amounted to a second manifesto, his “Principles of Bauhaus Production”.
This familiar Strindbergian theme is underscored in The Best Intentions by an ingenious device to which the author turns more than once: the juxtaposition of some ostensibly documentary evidence from the “real life” that he’s fictionalizing—a photograph he has found of this or that relative or an entry in someone’s diary—with his novelistic reconstruction of the person or incident in question. This technique can shed ironic light on the characters. Take, for instance, the passage in which Bergman quotes a diary entry by Henrik’s mother, Alma, written after she meets her future daughter-in-law for the first time: “Henrik came with his fiancée. She is surprisingly beautiful and he seems happy. Fredrik Paulin called in the evening. He talked about tedious things from the past. That was inappropriate and made Henrik sad.”
This text is cited, pointedly, at the end of a long passage that has dramatized the moment that Alma so tersely summarizes in her diary: the visit by Anna, the interruption of the family friend, Fredrik.
Romance readers compound the sin of liking happy, sexy stories with the sin of not caring much about the opinions of serious people, which is to say, men. They are openly scornful of the outsiders who occasionally parachute in to report on them. In late 2017, Robert Gottlieb – the former editor of the New Yorker and unsurpassable embodiment of the concept “august literary man” – wrote a jocular roundup of that season’s best romances in the New York Times Book Review. He opined that romance was a “healthy genre” and that its effect was “harmless, I would imagine. Why shouldn’t women dream?” The furious public response from romance readers – “patriarchal ass” was among the more charitable comments – prompted a defensive editor’s note from the NYT, which later announced it was hiring a dedicated romance columnist, who happened to be both a woman and a long-time fan of the genre.
Coverage of the romance industry often dwells on the contrast between the nubile young heroines of the novels and the women who actually write the books: ordinary women with ordinary bodies, dressed for their own comfort. Reporting on the first annual conference of the Romance Writers of America (RWA) – the major trade association for romance authors – in 1981, the Los Angeles Times wrote that the 500 authors who attended were “not the stuff of which romance heroines are made – at mostly 40 and 50, they were less coquette and more mother-of-the-bride”. That observation – combining creeping horror at the idea that middle-aged women might be interested in sex, with indifference to the fact that male authors are rarely judged for failing to resemble James Bond – is typical.
Part of the intense scorn romance authors face is the result of their rare victory. They have built an industry that caters almost completely to women, in which writers can succeed on the basis of their skill, not their age or perceived attractiveness.
Not long ago I diagnosed myself with the recently identified condition of sidewalk rage. It’s most pronounced when it comes to a certain friend who is a slow walker. Last month, as we sashayed our way to dinner, I found myself biting my tongue, thinking, I have to stop going places with her if I ever want to … get there! You too can measure yourself on the “Pedestrian Aggressiveness Syndrome Scale,” a tool developed by University of Hawaii psychologist Leon James. While walking in a crowd, do you find yourself “acting in a hostile manner (staring, presenting a mean face, moving closer or faster than expected)” and “enjoying thoughts of violence?”
Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines—they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age. Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin and yang balance, a finely tuned internal timer that told us when we’d waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.
“Why are we impatient? It’s a heritage from our evolution,” says Marc Wittmann, a psychologist at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. Impatience made sure we didn’t die from spending too long on a single unrewarding activity. It gave us the impulse to act. But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough—or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay.