A Palimpsest, Scraped Clean & Reinscribed

by Charles Siegel

Last week, on Presidents’ Day, President Trump lost in court. This is hardly news. The administration has already lost hundreds of cases in just a little over a year. And this isn’t the first time it has lost on a day when courts aren’t even open.

The ruling came in a case officially entitled City of Philadelphia v. Burgum. Of the many, many cases the administration has lost, this is not remotely the most important. The day before I submitted this column, for example, the Supreme Court ruled against Trump in the tariffs case. (Trump, naturally, immediately called the justices who ruled against him — including three of the six conservatives on the court, two of whom he had appointed and all of whom had given him wide and unprecedented immunity — “fools and lapdogs” and an “embarrassment to their families.”) And just the day before that, a federal judge in West Virginia granted a habeas corpus petition filed by an ICE detainee. He began his opinion as follows:

Antiseptic judicial rhetoric cannot do justice to what is happening. Across the interior of the United States, agents of the federal government—masked, anonymous, armed with military weapons, operating from unmarked vehicles, acting without warrants of any kind—are seizing persons for civil immigration violations and imprisoning them without any semblance of due process. The systematic character of this practice and its deliberate elimination of every structural feature that distinguishes constitutional authority from raw force place it beyond the reach of ordinary legal description. It is an assault on the constitutional order. It is what the Fourth Amendment was written to prevent. It is what the Due Process Clause of the Fifth Amendment forbids.

In our constitutional republic, governmental force derives its authority from the Constitution. But that authority is not unlimited. The Government’s power is legitimate only because it is derived from the People and exercised through law by identifiable public officers answerable to the public and to the courts. The structure of the Constitution guarantees visibility. Both the officer and the force he employs are traceable to authority delegated by the People and subject to the limits imposed by law. When the Government uses force against the public, the citizen can recognize the officer as a lawful representative. The public can evaluate the act. The judiciary can later review it. Every stop, arrest, detention, and use of force can be tested against the Constitution’s protections. Not so here. Read more »

Proust’s Madeleine: Time and Narrative

by Leanne Ogasawara

1.

A man sits unsettled on a wintry Paris afternoon, when his mother offers him some tea—along with a small cake, shaped like a shell.

Perhaps it is the most famous cake in literary history, for the moment he tastes the madeleine, softened in the tea, he is, in that instant, transported across time and space, as the world of his childhood rises before him, whole and luminous.

What begins as a simple sensation becomes a revelation: the past, long buried, is not gone at all, but waiting in the body, ready to return.

Philosopher Paul Ricoeur might have said this was the perfect example of the way literature illuminates how human beings experience time. In his three-volume work, Time and Narrative, Ricoeur suggests that time becomes human only when it is articulated through narrative, because this is when lived duration is gathered and configured into meaning.

A philosophy undergraduate, I am now married to an astrophysicist, so I guess it was no surprise that time was something I became completely obsessed with when I began studying the craft of fiction. Just think of how Proust’s madeleine moment captures the way we really do experience time. Like a palimpsest, one sensation can trigger an entire buried world, and we feel ourselves transported through a temporal aperture, like a wormhole, through the story of our lives.

In a class on time and fiction I took several years ago, we looked at a story in The New Yorker by Weike Wang called “Omakase.” A kind of frame story, it begins when a couple goes out to dinner, and in almost every moment that happens in chronological time (a sip of tea, the woman observing the man chat with the waitress) there is a shift of thought, an interior unfolding. Like the multi-dimensional unfolding of a sophon in The Three-Body Problem, her inner world opens out in many dimensions. Read more »

Ex-Patriot

by Rafaël Newman

Eva Bertha Strassner, Montreal, c. 1930

If she hadn’t died in July 1987, at the age of 81, my maternal grandmother would be celebrating her 120th birthday this week. One-hundred-and-twenty is a proverbial age among Ashkenazi Jews, who recite the Yiddish formula Biz hundert un tsvantsik, or “[You should live] to 120,” on the anniversary of a birth. That’s a full 20% markup over their Goyish compatriots in the Old Country, who grant honorees a mere Sto Lat!—“100 years”—in the traditional Polish song of birthday congratulations.

I never once said Biz hundert un tsvantsik to my grandmother, however, on any of her birthdays, although not for want of wishing her well. This was because, despite her birth in the canton of Aargau, the traditional Swiss “Pale of Settlement” for Jewish residents until the late 19th century, and her own eventual close family ties with Jews, my Grossmutti was not herself Jewish. But neither was she in fact a Swiss citizen, despite the place of her birth. She was German.

Registration of Johann Simon Emil Strassner in Zurich, 1909

Eva Bertha Strassner was born on February 23, 1906, in Baden, Switzerland, to Johann Simon Emil Strassner and Maria Elisabeth Sander, immigrants from Löbelstein, which was, at the time of their births and until 1918, in the Duchy of Saxe-Coburg-Gotha, and is today a part of Bavaria. Eva Bertha’s father had come with his burgeoning family in search of employment as a handyman, which he found with a variety of enterprises in the industrialized regions of Switzerland at the turn of the 20th century. Over the next decades the family moved, in keeping with Johann Strassner’s fortunes and economic exigencies, from Baden in the canton of Aargau to nearby Wettingen, then to Töss, at the time a village in the canton of Zurich, now a district of Winterthur, the canton’s second city, and from there, in the 1920s, to Schwerzenbach, on the edge of Zurich’s Oberland, within shouting distance of Zurich itself, the cantonal capital and today Switzerland’s largest city. Read more »

Thursday, February 26, 2026

Conservative Postmodernism and the Stuck Culture Hypothesis

by David Kordahl

Blank Space: A Cultural History of the Twenty-First Century

No one sells out anymore. The first pages of Blank Space: A Cultural History of the Twenty-First Century, W. David Marx’s overview of the past quarter-century of popular culture, give a striking example of this cultural shift. In 1992, the Seattle-based grunge band Pearl Jam elected to stop making music videos because they were worried about becoming too commercial. Marx writes, “Pearl Jam’s principled stand resonated with their fans: If rock bands were so desperate for money, they might as well be bankers.” This contrasts with the Lollapalooza festival in 2022, thirty years later, where David Solomon, the CEO of Goldman Sachs, performed as “DJ D-Sol,” playing electronic dance music for party-goers at the Tito’s Handmade Vodka stage after arriving in Chicago on his corporate jet.

How did this shift occur? One thing that makes Marx’s analysis bracing is that the figures he picks as being most significant in our broader cultural history are not the usual musicians or writers. Much more time is spent on Pharrell Williams and Kim Kardashian than, say, Arvo Pärt or Elena Ferrante. This is not a failure of taste, but a decision to focus on figures who managed to understand, before the rest of us, how fundamentally the Internet had altered the logic of cultural change.

Blank Space further develops the model that Marx described in his 2022 book Status and Culture: How Our Desire for Social Rank Creates Taste, Identity, Art, Fashion, and Constant Change. Cultural change in earlier eras, from high culture to low, ran something like this. Small groups of innovators would try new things in relative obscurity. Popularizers would notice them, and would streamline and repackage their ideas to be fed to a mass audience. The public might never experience the original source of such ideas, but the kitsch they consumed might still be directional, moving culture forward, even if at a lag.

Subcultural scarcity was important to such changes. Knowing about trends before others did gave one social capital, and that capital rewarded risk. It made sense for Pearl Jam to stop making music videos, since overexposure was a form of contamination. The gesture worked because the market was suspect, underground knowledge was elevated, and selling out was a real category with real stakes.

Enter the Internet. Read more »

Bad Stupid

by Akim Reinhardt

A thought has been nagging at me lately. Are most shitty people not very bright?

Some shitty people are very smart, of course, and those are the ones you really have to worry about: the pied pipers enchanting others to dance to their shitty tunes. But is it possible that most shitty people are not Darth Vader types, brilliant but troubled and drawn to the Dark Side’s Machiavellian potential? Could it be that most of them instead discover their shittiness through ignorance and low emotional intelligence, and are not seduced into it despite their brilliance?

There’s a lot to untangle. First and foremost, I am not contending that most not very bright people are shitty. Far from it.

Beyond that, however, we must ask ourselves both: What makes a person shitty, and, What makes a person smart?

Shittiness is the easier category to tackle, but by no means simple. Indeed, everyone is a shitty person sometimes. Everybody starts out as a baby, a toddler, a little kid, and every single one of them is a shitty little brat from time to time. Me! Me! Me! No one else matters! That selfishness is at the core of both general human shittiness and much of childhood up to a certain point. But in most cases, families, teachers, and many others train children to leave a lot of that shittiness behind most of the time, and to become functional members of various social groups: families, schools, houses of worship, friend groups, workplaces, and so forth.

How much of that Me! Me! Me! are people trained to leave behind? Aside from personal proclivities and experiences, it varies significantly across human cultures. For example, in many Indigenous American cultures, or in Japanese culture, historically there has been a strong emphasis on sublimating individual selfishness in favor of group function and cohesion. In U.S. culture? Yeah, not so much. But does that mean American society has a higher rate of shitty people than other countries? Who’s to say?

Well, this Brit for starters. But what even is shittiness?
Read more »

Wednesday, February 25, 2026

Of Wood Witch Born

by David Winner

Witch not from the woods

Jessie Buckley, an incredible actress, breathes life into the role of Shakespeare’s wife, Agnes, in Hamnet. Like others in the film, her face is always a little dirty (Chloé Zhao, the director, doesn’t fall for the trap of giving characters in dirty times perfectly clean skin), but I was frustrated by her character, which struck me as an odd fusion of romanticized gender essentialism and wild irrationality. Though I (a middle-aged cis male living in Brooklyn, New York, in the present day) could not be further from the Elizabethan woods where the essence of Agnes’s character originated, I still found myself resisting the film’s portrayal of her. Spoiler alert: this essay reveals the ending.

Born (rumor has it) from a wood witch’s womb, Agnes mixes odd herbs together to create tinctures to heal wounds and reduce fevers. She (and this is a nice touch) greets young William’s desire to “handfast” with her by jumping past the wedding to the wedding night and having sex with him animalistically in nearly plain view. And she gives birth by herself in the woods, real wild-woman style. Her character seems ever wise and in touch with nature, her feelings and instincts spiritually sacrosanct and nearly unassailable—a vision of femininity pretty impossible, I would imagine, for any actual woman to live up to.

Her prophecies don’t always seem correct, however. She predicts a future for her son working on plays with William, though he ends up dying, still a child, when he takes onto himself the pestilence about to kill his sister—a brother bravely and spiritually sacrificing himself to save her.

Before her son’s death, Agnes encourages Shakespeare (who is sweet and kind of happy-go-lucky) to go to London to follow his passion. She declines his invitation to move there with the family, uncomfortable, I would imagine, in the large, alienating city. Each time William visits Stratford and leaves to return to London, there is tremendous sadness, which reflects—or so I imagine—not just their upcoming separation but the fragility of those times, when the smallest illness, in a time of plague, was likely to carry someone off. Read more »

Ghosts, Love, and the Search for Readers: A two-part conversation with writer Kipling Knox

by Philip Graham

I’m still amazed that Kipling Knox was my student during the mid-to-late 1980s, the earliest days of my teaching at the University of Illinois. A more-than-promising creative writing undergraduate, he then went on to live his professional life in the tech and editing worlds before embarking on a literary career that so far has produced two excellent books—a story collection and a novel—set in the Illinois landscape where he grew up. A landscape also filled with actual ghosts, a popular condition of so much imaginative writing set in the Midwest. My own latest novel has a dance card filled with ghosts, so it was inevitable that Kip and I would eventually sit down and compare notes. Another point of comparison has been our individual decisions to explore the possibilities of independent publishing and the unexpected avenues down which such a decision can lead authors in search of autonomy. It seems we had a lot to say, and so have divided our chat into two digestible parts.

Philip Graham: In the past three years you’ve published two beautifully-written works of fiction. In 2022, Under the Moon in Illinois, and at the end of 2025, How to Love in a World Like This. Both books are set—largely—in a Midwestern town that you’ve conjured up and call Middling, Illinois. And so they seem to be a part of a larger—and still growing?—world-building enterprise of fiction.

Kipling Knox: Thanks, Philip. Yes, that’s true—both books share a world with common characters. But that wasn’t my original intent. Between publishing these two, I started two other novels, with different settings. I put them both aside because I found myself drawn back to Middling. The story “Downriver,” in particular, ended so ambiguously that I was curious to know what would happen to its characters, Morgan and Arthur, and how their mystery would play out. It’s a difficult trade-off—sticking with one fictional world versus exploring others. When you write a book, you are deliberately not writing others, and there can be a sense of loss in that. But it’s very gratifying to explore a world you’ve built more deeply. I think of how a drop of ocean water contains millions of microorganisms, each with their own story, in a sense. So the world of Middling County (and also, in my second book, Chicago) has infinite potential for stories!

As I consider my next project, I do feel very committed to the American Midwest. Midwestern people and culture fascinate me. I think this is partly because Midwesterners are so often unburdened by a sense of superiority. We’re always trying to prove ourselves, politely. This is a gross generalization, of course, but I think it’s grounded in truth. These qualities make Midwestern characters nuanced and earnest and, hopefully, a little comical. It also makes for rich social satire—I think of Midwestern-born authors like Mark Twain, Kurt Vonnegut, and George Saunders. Read more »

Tuesday, February 24, 2026

Together We Cannot Fail: FDR’s Ten Days

by Michael Liss

Franklin Delano Roosevelt. Photograph by Vincenzo Laviosa, circa 1932.

March 3, 1933. Herbert Hoover spent his final hours in the White House in anger and despair. Angry that he’d been decisively rejected by an electorate wrongheaded enough to not realize the wisdom of his policies—even when the evidence of their efficacy had been there for all to see. Despairing that his successor Franklin Delano Roosevelt was such a dilettante, an unworthy lightweight at a time when all serious men understood the necessity for prudence, for careful adherence to sound principles and practices.

He was also profoundly worried. “The Great Engineer” could see that the cracks in the foundation that he had so carefully begun to mend were giving way. Unemployment remained stubbornly high. Europe was deeply unstable and calling home from the U.S. its reserves of gold. Currency was increasingly scarce, and there was an acute loss of confidence in the domestic banking system—many banks could not meet the demand for cash. Without sufficient cash, the economy would completely seize up, and any scrip that might be issued in lieu of it would cause rampant inflation. 

Hoover thought he knew the reason: His leadership was coming to an end. The clear policy statements he had made over the course of the previous year—indeed his efforts ever since the Crash itself—had painfully, but certainly, eased the economy back from the brink. Now, he believed all that good was being undone by the public’s anxiety that FDR would abandon his proven approach.  

It’s not as if Hoover hadn’t warned the electorate: In his October 31, 1932 campaign speech in Madison Square Garden, he laid out the stakes: elect FDR and “[t]he grass will grow in streets of a hundred cities, a thousand towns; the weeds will overrun the fields of millions of farms if that protection be taken away. Their churches, their hospitals, and their schoolhouses will decay.”   Read more »

Three Times the USA Flirted with Utopia in the 20th Century (MAGA Is a “Restorative” Utopian Movement?)

by Daniel Gauss

An allegorical rendering of an impossible meeting: three presidents who once aimed beyond incremental reform, while a self-styled restorer of greatness looms in the background.

In the twentieth century, there were policy initiatives in the United States that went beyond incremental reform and which could justifiably be called “utopian.” Three of these initiatives stand out: Franklin D. Roosevelt’s proposal of a “Second Bill of Rights,” Lyndon B. Johnson’s “Great Society” and Richard Nixon’s effort to establish a type of Universal Basic Income (UBI).

If we take a close look at these three initiatives, and what happened with each one, we can see why “progressive” utopian programs are no longer being proposed, and why there is now space for conservative “restorative” ideals.

Each of the three initiatives above was calculated to extend political, social and economic rights to previously excluded groups, and sought to shift the government toward greater responsibility for its citizens’ well‑being. Each initiative failed to reach its full potential because of a combination of political resistance, economic pressures, institutional limitations and changes in public attitude.

In the spirit of John Gray’s Black Mass (2007), I would also like to investigate whether MAGA might be considered a utopian movement, but in the opposite direction. FDR, LBJ and Nixon were interested in forward-looking utopian projects calculated to expand social rights and economic benefits. MAGA seems to represent a backward-looking utopian project: the belief that America can be “restored” to an idealized past through governmental action. Read more »

Monday, February 23, 2026

Erasing Other, Erasing Self: Reflections on Black History

by Herbert Harris

National Museum of African American History and Culture, Washington, D.C.

This year’s Black History Month is different.

Black history itself has become contested. Not debated at the margins but questioned at its core. School curricula are scrutinized, and institutions that preserve Black memory are accused of being “divisive.” Should Black history exist, or should it disappear, erasing its many uncomfortable truths and leaving a more homogeneous national narrative?

Narratives are what hold us together as individuals and as societies. To have the wholeness and continuity essential to our survival, our stories must be heard, recognized, and validated by others. Identity is not a monologue in an empty room; it requires an audience and a full cast.

History is our shared narrative. It is how a nation understands what has happened and who it is. It is also how we relate to those who came before us. To deny or erase significant portions of that history is not merely to rearrange a syllabus. It distorts the self-understanding of the entire society. A narrative that excludes central truths becomes brittle. It depends on selective memory and strategic forgetting. That society’s connections to reality inevitably fray, eventually breaking.

As a psychiatrist, I have spent much of my professional life listening to narratives. Read more »

Chess and Language as Paradigmatic Cases for Artificial Intelligence

by William Benzon

The digital computer emerged out of devices that were built during World War II for cracking codes and calculating artillery tables. Almost as soon as the computer came into existence, people were thinking about using it for chess and language. In 1948, Alan Turing developed Turochamp, an algorithm for playing chess. Though that algorithm was never implemented, chess became a central topic of AI research, so much so that John McCarthy, who coined the phrase “artificial intelligence,” would eventually write an article entitled “Chess as the Drosophila of AI.” In 1949 Warren Weaver, then with the Rockefeller Foundation, wrote a memo proposing machine translation. The first public demonstration of machine translation took place in 1954 at the IBM head office in New York and involved 60 Russian sentences.

These two lines of research differed on a philosophical level. AI was a full-on assault on human intelligence. Chess was regarded as the apogee of human intelligence. If we could program a computer to play chess at a championship level, we could program a computer to do anything a human can do. That’s what researchers believed. That’s why chess was so important as a domain of research.

The goal of machine translation was more modest: the reliable translation of documents in one natural language (e.g. Russian) into another natural language (e.g. English). The research programs were correspondingly different and wouldn’t merge/collide until the 1970s with the Speech Understanding Project sponsored by ARPA (the Advanced Research Projects Agency of the Defense Department, now simply DARPA, Defense Advanced Research Projects Agency).

However, I do not intend to recount the history of research in these two areas. As I said, my aim is more philosophical. I’m interested in the radically different challenges these two cases present for AI research. To that end I want to begin by discussing the very different geometric footprints presented by chess and natural language. With that ground under our feet we can go on to more abstract matters.

Read more »

Poem by Jim Culleny

When Bach was a Busker in Brandenburg

When Bach was a busker playing for humble coin
he’d set up his organ in the middle of a square
regardless of pigeons, ignoring the squirrels who sat
poised at its edges waiting for their daily bread. He’d
set to work assembling its pipes from a scaffold of
arpeggios by his baroque means, setting its starts and stops,
its necessary rests and quick resumptions, seeing
in his mind’s-eye each note to come as he’d placed them, just so,
on paper at his desk, simultaneously hearing them
as they would resonate against eardrums in potential
cathedrals of brains— even before a key was touched,
even before a bow was raised,
even before a slender column of breath
was blown into a flute, or drum skins troubled the air,
he’d hear them as he saw them, strung out along
a horizontal lattice of five lines following the lead limits of a cleft,
soaring between and around each other darting out, in and through,
climbing, diving, making unexpected lateral runs between boundaries,
touching, sometimes, the edge of chaos but never veering there,
understanding the limits of all, so that now, having prepped for his
street-corner concerto, this then-unknown would descend from his scaffold
and share with the ordinary world how a tuned mind works in harvesting
song from a universe of stars: collecting their sweet sap, distilling it
into a sonic portrait of a universe that forever lies within the looped
horizon of things.

Jim Culleny, 10/3/22


Sunday, February 22, 2026

On the Varieties of Unseeing – China Miéville’s “The City & The City”

by Christopher Hall

The problem with teleological thinking as far as authoritarianism goes is that we can delude ourselves into believing that, since we are not in an Orwellian State and look unlikely to arrive at one, the current kvetching about our situation is overblown. Trump, in other words, at some point is going to go away. He is not likely to succeed in cancelling the midterms (although plenty of denial of voters’ rights is probable), and a third term for him seems equally unlikely. When I look at the future, I do envision some kind of American Restoration as a likely scenario, since it seems that there is no one with Trump’s mastery of a bizarre sort of charisma standing on the deck to take his place (J. D. Vance certainly doesn’t have it). A moderate will take over, there will be much reference to norms and standards and, perhaps, if that moderate is a Democrat, some desultory prosecutions. But that restoration will not be, cannot ever be, complete. We know something has permanently changed. The question for us is where that leaves us, and Liberal Capitalist Democracies, now.

As Orwell’s quote – “The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.” – floated around the internet (including this site) in the days and weeks after the murders of Renee Good and Alex Pretti, I did worry a bit that, if we were adopting Orwell as our model of political disaster, we might be a bit off the mark. The “command” issued from the Trump administration to “unsee” what we all clearly saw in the videos of the murders was shambolic in a way no directive from the Ministry of Truth could ever be. But that may have been the point: don’t worry that people are going to mock your attempt to control the narrative. Controlling what people see and say isn’t the point – flooding the zone with shit is. Trump’s Ministry of Bullshit is not concerned that many, perhaps most, see through everything he does. He is concerned that enough do not or will not. I’ve written before that there is a difference between control by enforcing silence and control by encouraging noise. What if the hybrid model is the actual goal – to empty out the core of democracy just enough so that the day-to-day experience of it is not substantially changed – and yet, everything is irrevocably different? Our dystopian visions tend to be of the absolute sort, but absolutism is not particularly helpful at the present moment – and we should hope it doesn’t become so.

The Orwellian directive to unsee is a top-down phenomenon, and the penalties for disobedience severe. An alternative of sorts can be found in China Miéville’s 2009 speculative fiction novel, The City & The City. Read more »

In Praise of the Sketch

by Priya Malhotra

We stand in front of a painting and extol its brilliance. We listen to a piece of music and call it genius. We watch a film and admire its evocativeness. We hold a beautifully designed object and marvel at its simplicity. What we don’t see — what we almost never ask about — are the versions that came before.

The sketch that didn’t work. The canvas painted over. The idea abandoned halfway through. The sculpture that cracked. The notes in the margin that were crossed out.

Art, when we encounter it publicly, looks certain. But art is made in uncertainty.

That’s the simple but powerful idea behind Moving Archives, an exhibition opening this February at Bikaner House in Delhi, curated by Ranjita Chaney and Ruchika Soi. Instead of focusing only on finished works, the show turns toward the material around them — drafts, drawings, scripts, research, documentation. It asks a very basic question: What happens to the process once the product is done?

Because process doesn’t disappear. It just goes out of sight. Think about a painter in a studio. The final canvas might look confident and deliberate. But underneath it are layers — earlier compositions, colors tried and rejected, shapes adjusted and corrected. There are sketchbooks filled with studies that never left the room. There are experiments that didn’t succeed. None of that shows up in the gallery label. Yet without it, the final painting would not exist.

The same is true in other fields. Writers produce draft after draft before a novel feels right. Musicians test melodies and rhythms before a song settles into place. Filmmakers shoot more than they use and shape the story in the edit. Designers build prototypes that wobble, collapse, or feel wrong before landing on the object we admire.

Across disciplines, we are used to seeing the polished result. We are not used to seeing the struggle. Maybe that’s because we like the idea of mastery. We like to imagine that great art arrives fully formed. It’s comforting. It makes genius feel clean.

But creation is rarely clean. It is messy. It involves doubt. It requires throwing things away.

And that is where this exhibition becomes interesting — not just for the art world, but for anyone who has ever tried to make something. Read more »

Friday, February 20, 2026

The Burden of a Molecule

by Ashutosh Jogalekar

Poison dart frogs (Image: ABdragons.com)

Earlier this week, European investigators concluded that the Russian opposition leader Alexei Navalny had been killed with epibatidine, a toxin unknown in Russia’s natural environment and ordinarily found only in the skin of small, brilliantly colored frogs native to the rainforests of South America. If that conclusion is correct, a molecule shaped in one of the most intricate ecosystems on Earth has completed a journey that ends not in the forest, nor in the laboratory, but in a prison cell. For Putin’s Russia, this is one more marker on the road to political assassination using chemical and biological weapons.

Long before laboratories named it, indigenous communities of the Amazon understood through long experience that certain tiny, strikingly bright and beautiful frogs carried extraordinary power in their skin. The knowledge was practical and restrained. It served hunting, survival, and continuity. It was part of a relationship with the living forest in which danger and respect were inseparable. Nothing in that knowledge pointed toward geopolitics or assassination. The molecule existed only within a web of life that had shaped it.

Centuries later, science encountered the same substance and read it differently. At the National Institutes of Health, the chemist John Daly devoted decades to the study of amphibian alkaloids, following faint chemical traces through repeated expeditions, careful collections, and patient analysis. His work was driven by persistence, by the belief that small natural molecules could reveal deep biological truths. From thousands of specimens and years of attention emerged epibatidine, a molecule isolated from the skin of a poison dart frog endemic to Ecuador and Peru: a structure modest in size yet immense in biological effect, binding human receptors with an affinity evolution had refined without intention. Daly turned into something of a folk hero whose findings resonated beyond the halls of chemistry. Read more »

The Last of the Turquoise Lakes? The Fragile Beauty of the Blue Canadian Rockies

by David Greer

Moraine Lake, Alberta. Wikimedia Commons, David Zhang.

It’s a magical scene not easily forgotten—snow-covered peaks reflected in calm turquoise lakes ringed by stately pines. It’s a view that likely inspired the romantic ballad “The Blue Canadian Rockies”, about a lonesome guy pining for a faraway sweetheart who unaccountably refuses to abandon the mountains she loves to join him somewhere beyond the sea. Sung by Gene Autry in the 1952 movie of the same name, the tune was later covered by artists as diverse as Jim Reeves, Vera Lynn, The Byrds and, perhaps most plaintively, Wilf Carter, a.k.a. Montana Slim, who added a longing, contemplative yodel to his rendition.

Now imagine the same picture devoid of snow and with the turquoise waters faded to a murky blur. No magic there, just a dull landscape unworthy of a second glance.

That transition is already underway, and it is accelerating as the impacts of human-caused climate change become more pronounced and global efforts at mitigation become more fractured. The striking turquoise hue of some lakes in the Rockies is beginning to fade, and the glaciers to which those lakes owe their remarkable color will likely be all but gone in a generation or two. So if you haven’t yet enjoyed the magnificence of the blue Canadian Rockies, now may be the time.

I was reminded of this a couple of weeks ago on retrieving the Sunday New York Times from my doorstep. Adding to its usual substantial heft was a separate section titled “52 Places to Go”, an annual feature that reminds readers beset by ice pellets and sleet that winter will eventually end and jets will stand ready to fly you to the destination of your dreams, assuming you haven’t already been deported to the destination of your nightmares.

Only one Canadian location merited mention in the feature—a “limited-time train” excursion through the Canadian Rockies. “The route,” explains the article, “will whisk you to pristine alpine meadows in Alberta, where you can enjoy some of the continent’s most spectacular scenery between Jasper and Banff”. What it neglects to mention is that there is no actual train track connecting Jasper and Banff, only a highway. Read more »

The Oracle of Bacon: Thirty Years Later

by Jim Hanas

Any sufficiently advanced technology might be indistinguishable from magic, as Arthur C. Clarke said, but even small advances–if well-placed–can seem miraculous. I remember the first time I took an Uber, after years of fumbling in the backs of yellow cabs with balled-up bills and misplaced credit cards. The driver stopped at my destination. “What happens now?” I asked. His answer surprised and delighted me. “You get out,” he said.

Thirty years ago a website appeared that, in the early days of “the graphical portion of the Internet”–as the New York Times then faithfully called the World Wide Web upon first occurrence–seemed like such a miracle. I am speaking, of course, of the Oracle of Bacon, the site inspired by the parlor game “Six Degrees of Kevin Bacon.” The story of the Oracle, which is maintained to this day, is–in many ways–the history of the consumer Internet in brief. It features a meme, virality, consumer delight, and unintended consequences–but more on those later.

The Oracle is based on a game invented by college students in 1994. An early message board thread titled “Kevin Bacon is the Center of the Universe” challenged readers to find the shortest path between Kevin Bacon and other actors via chains of movies they had appeared in together. The post reported that the game’s initial prompt had “received 80 responses in just over a week” (!) at the University of Virginia, though it was three students at Albright College in Pennsylvania who codified the game–and its benchmark–under the name “Six Degrees of Kevin Bacon,” after the 1993 movie based on the John Guare play of the same name. A book followed in 1996, and–were it not for the contemporaneous explosion of the World Wide Web–the story might have ended there. Read more »
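The game the Oracle automates is, at bottom, a shortest-path problem on a co-star graph, solvable with breadth-first search. The sketch below illustrates the idea on a tiny made-up filmography; the movie list and the `bacon_path` helper are invented for illustration and say nothing about how the actual Oracle of Bacon is implemented.

```python
from collections import deque

# A tiny, invented filmography: each movie maps to its cast.
# (Illustrative data only -- not the Oracle's real database.)
movies = {
    "Footloose": ["Kevin Bacon", "John Lithgow"],
    "Cliffhanger": ["John Lithgow", "Sylvester Stallone"],
    "Rocky": ["Sylvester Stallone", "Talia Shire"],
}

# Build an actor-to-actor adjacency list: two actors are linked
# if they appeared in a movie together.
costars = {}
for title, cast in movies.items():
    for a in cast:
        for b in cast:
            if a != b:
                costars.setdefault(a, set()).add(b)

def bacon_path(start, goal="Kevin Bacon"):
    """Breadth-first search for the shortest chain of co-star links."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path  # length minus one is the "Bacon number"
        for nxt in costars.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of shared movies connects them

print(bacon_path("Talia Shire"))
```

Because BFS explores the graph level by level, the first path it finds to the goal is guaranteed to be a shortest one, which is exactly the property the game's "degrees" demand.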