Ethan Siegel in Forbes:
The collapse of the Tacoma Narrows Bridge on the morning of November 7, 1940, is the most iconic example of a spectacular bridge failure in modern times. As the third largest suspension bridge in the world, behind only the George Washington and Golden Gate bridges, it connected Tacoma to the entire Kitsap Peninsula in Puget Sound, and opened to the public on July 1st, 1940. Just four months later, under the right wind conditions, the bridge was driven at its resonant frequency, causing it to oscillate and twist uncontrollably. After undulating for over an hour, the middle section collapsed, and the bridge was destroyed. It was a testimony to the power of resonance, and has been used as a classic example in physics and engineering classes across the country ever since. Unfortunately, the story is a complete myth.
Every physical system or object has a frequency that's naturally inherent to it: its resonant frequency. A swing, for example, has a certain frequency you can drive it at; as a child you learn to pump yourself in time with the swing. Pump too slowly or too quickly, and you'll never build up speed, but if you pump at just the right rate, you can swing as high as your muscles will take you. Resonant frequencies can also be disastrous if you build up too much vibrational energy in a system that can't handle it, which is how sound alone at just the right pitch is capable of causing a wine glass to shatter.
It makes sense, looking at what happened to the bridge, that resonance would be the culprit. And that's the easiest pitfall in science: when you come up with an explanation that's simple, compelling, and appears obvious. Because in this case, it's completely wrong.
Pope Francis already has a reputation for barnstorming. His positions on poverty, on gay priests, and on liberation theology would have been shocking enough on their own, but in contrast to the more conservative positions of previous popes, they were downright lefty.
Sure, Francis has his more traditional moments. Abortion and assisted suicide are still no-gos for the leader of the world's Catholics. But Francis has been explicit about links between capitalism, materialism, and threats to the world's poor. There's a reason he named himself after St. Francis of Assisi—famously poor, famously eco-conscious—after all.
Now the Pope is taking on science. Specifically, in a new encyclical—that's a letter laying out official Catholic doctrine—Francis describes Earth's problem with an increasingly messed-up climate, why that's the purview of religion, and who will suffer the most if people don't do anything about it. The encyclical, "On Care for Our Common Home," makes explicit the connection between climate change and oppression of the poorest and most vulnerable. It's well-argued, clear, at times quite moving…and 42,000 words long. So here's the good-parts version…
Andrew Huddleston at the Times Literary Supplement:
One of the most interesting elements in Blue’s story is its charting of Nietzsche’s loss of faith, beginning in his middle teenage years. In his contributions to Germania, we don’t see outright atheism, but we do see a cautious movement to a more sceptical perspective. In one of his essays from this period, Nietzsche reflects on how difficult it can be to distance oneself from the tradition in which one has grown up, and to reflect on it in a critical way. This gives a nice hint of what, in the face of this difficulty, will become one of his most striking philosophical accomplishments: specifically his ability to step, insofar as possible, outside the Judaeo-Christian moral tradition and look at it as an anthropologist might, explaining how it gained traction and why it continues to retain it. Nietzsche pressed this still further, going beyond the role of anthropologist to that of philosophical “legislator”, concerned with the task of “revaluing” these hitherto revered values.
From Schulpforta, Nietzsche first moved to Bonn to study, then to Leipzig, where he did his doctoral training. While philology was his subject, philosophy was a strong side interest. He had ongoing doubts about whether he would be cut out for a career as a philologist, but his talent was evident to his teachers. Blue is particularly good in charting Nietzsche’s relationship to Friedrich Ritschl, a noted classicist who was Nietzsche’s supervisor in Leipzig. It was partly on Ritschl’s glowing recommendation that Nietzsche secured his first (and only) academic post, at the University of Basel.
Edith Hall at Prospect Magazine:
Harrison is best known as the author of several frequently anthologised poems about his working-class childhood in the Leeds suburb of Beeston and his difficult relationship with his relatives. As he studied as a scholarship boy at Leeds Grammar School and then read Classics at Leeds University, his ever-widening intellectual and cultural horizons created a chasm between him and his family. His painful attempts to come to terms with his alienation from his mother and father, while remaining committed to the cause of the working class, are explored at length in his pivotal 1978 collection From The School of Eloquence and Other Poems; several of the pieces expressing the impact of his mother’s death on his relationship with his father, a taciturn baker, are studied by teenagers at GCSE. At A Level, the poem of choice is his “v.,” written during the 1984-5 miners’ strike.
A modern response to Thomas Gray’s “Elegy Written in a Country Churchyard” (1751), “v.” recounts Harrison’s trip to a cemetery in Beeston to visit his parents’ grave. Now a haunt of local skinheads, the graveyard has been defaced by racist and obscene graffiti. This prompts meditation on the divisions (as in v for versus) caused by class and racial conflict in his society; it is followed by a dialogue with a skinhead in Leeds vernacular. The poem is now regarded as a classic of late 20th-century literature. The film version, directed by Richard Eyre and seen on Channel 4 in 1987, involved the longest cluster of sexually explicit words ever broadcast in Britain at the time. It made Harrison headline news. Mary Whitehouse protested in the Times. The Daily Mail denounced it as a “torrent of filth.”
Hilton Als at the NYRB:
For Arbus the question was: What realities does reality represent? And yet she couldn’t bear making art that was “art”; like all those Russeks furs, painting belonged to a moneyed class, the world of connoisseurship. Was she talented, she wondered, or was she encouraged to make art because a girl of her background was supposed to? For her senior class assignment, Arbus produced her “Autobiography,” in which she wrote:
Everyone suddenly decided I was meant to be an artist and I was given art lessons and a big box of oils and encouragement and everything. I painted and drew every once in a while for 4 yrs. with a teacher without admitting to anyone that I didn’t like to paint or draw at all and I didn’t know what I was doing. I used to pray and wish often to be a “great artist” and all the while I hated it and I didn’t realize that I didn’t want to be an artist at all. The horrible thing was that all the encouragement I got made me think that I really wanted to be an artist and made me keep pretending that I liked it and made me like it less and less until I hated it because it wasn’t me that was being an artist.
Who was that “me”? Despite her horror of painting—“I remember I hated the smell of the paint and the noise it would make when I put my brush to the paper. Sometimes I wouldn’t really look but just listen to this horrible squish squish squish,” she told the journalist Studs Terkel—Arbus’s teachers thought they were encouraging her true self, or a self she wanted to be.
Chauncey Devega in Salon:
President Donald Trump is a clear and present danger to the United States and the world. He has reckless disregard for democracy and its foundational principles. Trump is also an authoritarian plutocrat who appears to be using the presidency as a means to enrich himself, his closest allies and his family members. Trump’s proposed 2018 federal budget is a shockingly cruel document that threatens to destroy America’s already threadbare social safety net in order to give the rich and powerful (even more) hefty tax cuts. His policies have undermined the international order and America’s place as the dominant global power. It would appear that he and his administration have been manipulated and perhaps (in the case of Michael Flynn) even infiltrated by Vladimir Putin’s spies and other agents. The world has become less safe as a result of Trump’s failures of leadership and cavalier disregard for existing alliances and treaties.
Donald Trump’s failures as president have been compounded by his unstable personality and behavior. It has been reported by staffers inside the Trump White House that he is prone to extreme mood swings, is cantankerous and unpredictable, flies into blind rages when he does not get his way, is highly suggestible and readily manipulated, gets bored easily and fails to complete tasks, is confused by basic policy matters and by all accounts is unhappy and lonely. And despite bragging about his “strength” and “vitality” during the 2016 presidential campaign, Trump appears to tire easily and readily succumbs to “exhaustion.” Trump is apparently all id and possesses little if any impulse control. He is a chronic liar who ignores basic facts and empirical reality in favor of his own fantasies.
Between the scandals and the emotionally erratic behavior, Donald Trump would appear to be a 21st-century version of Richard Nixon, to date the only American president forced to resign under threat of forcible removal. In all, this leads to a serious and worrisome question: Is Donald Trump mentally ill? Moreover, what does Trump’s election reveal about the moods and values of his voters? How are questions of societal emotions and collective mental health connected to the rise of fascism and authoritarianism in America? Do psychiatrists, psychologists and other mental health professionals have a moral obligation to warn the public about the problems they see with Donald Trump’s behavior? In an effort to answer these questions, I recently spoke with Dr. Bandy Lee, a psychiatrist at Yale University who specializes in public health and violence prevention. She recently convened a conference that explored questions related to Donald Trump’s mental health and how mental health professionals should respond to this crisis.
Diatribe Against the Dead
The dead are selfish:
they make us cry and don't care,
they stay quiet in the most inconvenient places,
they refuse to walk, we have to carry them
on our backs to the tomb
as if they were children. What a burden!
Unusually rigid, their faces
accuse us of something, or warn us;
they are the bad conscience, the bad example,
they are the worst things in our lives always, always.
The bad thing about the dead
is that there is no way you can kill them.
Their constant destructive labor
is for that reason incalculable.
Insensitive, indifferent, obstinate, cold,
with their insolence and their silence
they don't realize what they undo.
by Ángel González
from The Vintage Book of Contemporary World Poetry
Vintage Books, 1996
Andrea Marks in Scientific American:
The smell of coffee may urge you out of bed in the morning, and the perfume of blooming lilacs in the spring is divine. But you do not see police officers with their noses to the ground, following the trail of an escaped criminal into the woods. Humans do not use smell the way other mammals do, and that contributes to our reputation for being lousy sniffers compared with dogs and other animals. But it turns out the human sense of smell is better than we think. In a review paper published in Science last week neuroscientist John McGann of Rutgers University analyzed the state of human olfaction research, comparing recent and older studies to make the argument that our smelling abilities are comparable with those of our fellow mammals.

McGann traces the origins of the idea that humans have a poor sense of smell to a single 19th-century scientist, comparative anatomist Paul Broca. Broca, known for discovering Broca’s area—the part of the brain responsible for speech production—noted that humans had larger frontal lobes than those of other animals, and that we possessed language and complex cognitive skills our fellow creatures lacked. Because our brains’ olfactory bulbs were smaller than those of other mammals and we did not display behavior motivated by smell, Broca extrapolated that these brain areas shrank over evolutionary time as humans relied more on complex thought than on primal senses for survival. He never conducted sensory studies to confirm his theory, but the reputation stuck.
Scientists built on that tenuous foundation over the years, McGann says. Geneticists saw supporting evidence for humans’ limited olfactory abilities because we have a smaller fraction and number of functioning olfactory genes—but again this was not well tested. The idea that color vision took the evolutionary pressure off olfaction was later debunked when no link was found between that evolutionary development and smell loss. In addition, the size of olfactory bulbs, both in absolute terms and in proportion to the brain, does not relate directly to smelling power as scientists once thought. Now that more sensory tests are being done, the results are mixed. Experiments conducted in previous decades have found humans are just as sensitive as dogs and mice to the aroma of bananas. Furthermore, a 2013 study found humans were more sensitive than mice to two urine odor components whereas mice could better detect four other sulfur-containing urine and fecal-gland odors tested. A 2017 study also revealed humans were more sensitive than mice to the smell of mammal blood.
Michael Chabon in Literary Hub:
The tallest man in Ramallah offered to give us a tour of his cage. We would not even have to leave our table at Rukab’s Ice Cream, on Rukab Street; all he needed to do was reach into his pocket.
At nearly two meters—six foot four—Sam Bahour might well have been the tallest man in the whole West Bank, but his cage was constructed so ingeniously that it could fit into a leather billfold.
“Now, what do I mean, ‘my cage’?” He spoke with emphatic patience, like a remedial math instructor, a man well practiced in keeping his cool. With his large, dignified head, hairless on top and heavy at the jawline, with his deep-set dark eyes and the note of restraint that often crept into his voice, Sam had something that reminded me of Edgar Kennedy in the old Hal Roach comedies, the master of the slow burn. “Sam,” he said, pretending to be us, his visitors, we innocents abroad, “what is this cage you’re talking about? We saw the checkpoints. We saw the separation barrier. Is that what you mean by cage?”
Some of us laughed; he had us down. What did we know about cages? When we finished our ice cream—a gaudy, sticky business in Ramallah, where the recipe is an Ottoman vestige, intensely colored and thickened with tree gum—we would pile back into our hired bus and return to the liberty we had not earned and were free to squander.
“Yes, that’s part of what I mean,” he said, answering the question he had posed on our behalf. “But there is more than that.”
Sarah Zhang in The Atlantic:
The modern world gives us such ready access to nachos and ice cream that it’s easy to forget: Human bodies require a ridiculous and—for most of Earth’s history—improbable amount of energy to stay alive.
Consider a human dropped into primordial soup 3.8 billion years ago, when life first began. They would have nothing to eat. Earth then had no plants, no animals, no oxygen even. Good luck scrounging up 1,600 calories a day drinking pond water or seawater. So how did we get sources of concentrated energy (i.e., food) growing on trees and lumbering through grass? How did we end up with a planet that can support billions of energy-hungry, big-brained, warm-blooded, upright-walking humans?
In “The Energy Expansions of Evolution,” an extraordinary new essay in Nature Ecology and Evolution, Olivia Judson sets out a theory of successive energy revolutions that purports to explain how our planet came to have such a diversity of environments that support such a rich array of life, from the cyanobacteria to daisies to humans.
AC Grayling in Prospect:
War, then, has changed in dramatic respects, technologically and, consequentially, in character too. But in other fundamental respects it is as it ever was: people killing other people. As Theodor Adorno said, thinking of the development of the spear into the guided missile: “We humans have grown cleverer over time, but not wiser.” Every step of this evolution has raised its own ethical questions, but the next twist in the long story of war could very well be autonomous machines killing people—something that could well necessitate a more profound rethink than any that has been required before.
As well as posing their own particular ethical problems, past advances in military technology have—very often—inspired attempts at an ethical solution too. The 1868 Declaration of St Petersburg outlawed newly-invented bullets that split apart inside a victim. The 1899 Hague Conference outlawed aerial bombardment, even before heavier-than-air flight had become possible—it had in mind the throwing of grenades from balloons. After the First World War, chemical weapons were outlawed and following the Second World War much energy was devoted to attempts at banning or limiting the spread of nuclear weapons. When Bashar al-Assad gassed his own people in Syria, President Donald Trump enforced the world’s red line with an airstrike.
So, just as the continuing evolution of the technology of combat is nothing new, nor is the attempt to regulate its grim advance. But such attempts to limit the threatened harm have often proved to be futile. For throughout history, it is technology that has made the chief difference between winning and losing in war—the spear and the atom bomb both represent deadly inventiveness prompted by emergency and danger. Whoever has possessed the superior technology has tended to prevail, which—if it then falls to the victors to enforce the rules—points to some obvious dilemmas and difficulties.
Elizabeth Pennisi in Science:
Weighing in at 200,000 kilograms and stretching the length of a basketball court, the blue whale is the biggest animal that’s ever lived. Now, scientists have figured out why they and other baleen whales got so huge. “It’s a cool study,” says Jakob Vinther, an evolutionary paleobiologist at the University of Bristol in the United Kingdom. “I’m going to send it to my students.”

Biologists have long debated why some whales became the world’s biggest animals. Some have proposed that because water bears the animal’s weight, whales can move around more easily and gulp in enough food to sustain big appetites. Others have suggested that whales got big to fend off giant sharks and other megapredators. Researchers have also argued about when these animals got so huge. In 2010, Graham Slater, an evolutionary biologist currently at the University of Chicago in Illinois, argued that cetaceans—a term that includes whales and dolphins—split into different-sized groups very early in their history, perhaps 30 million years ago. Dolphins remained the shrimps of the cetacean world, filter-feeding baleen whales became the giants, and predatory beaked whales stayed in the middle size-wise, with the descendants in those three groups sticking within those early established size ranges.
However, Nicholas Pyenson, a whale expert at the Smithsonian Institution’s National Museum of Natural History in Washington, D.C., was skeptical. So a few years ago, the two decided to tap the museum’s vast cetacean fossil collection to settle the dispute. Pyenson had already surveyed living whale proportions and determined that the size of the whale correlated with the width of its cheek bones. So Pyenson measured or obtained these data from skulls of 63 extinct whale species and of 13 modern species and plotted them on a timeline that showed the whale family tree. The data showed that whales didn’t get really big early on, as Slater had suggested. Nor did they gradually get big over time. Instead they became moderately large and stayed that way until about 4.5 million years ago, Slater, Pyenson, and Jeremy Goldbogen at Stanford University in Palo Alto, California, report today in the Proceedings of the Royal Society B. Then baleen whales went “from relatively big to ginormous,” Slater says. Blue whales today are 30 meters long, whereas until 4.5 million years ago, the biggest whales were 10 meters long.
Neal Gabler in billmoyers.com:
Fox News creator and former chief Roger Ailes, who died at 77 last week from complications after a fall in his Florida home, may have been the most significant political figure of the last 35 years — which isn’t necessarily a compliment to those of us who believe media mavens shouldn’t also be political operatives. Ailes clearly thought differently. He simultaneously changed the contours both of American politics and American media by melding them, and in doing so changed the contours of fact and objectivity as they once were understood before the era of post-fact.
It seems a lot to put on one man, but Roger Ailes destroyed the idea of media objectivity in the name of media objectivity, the way a phony evangelist might destroy virtue in the name of virtue. Things have never been the same since. Ailes was a political acolyte of Richard Nixon, and Nixon was a media acolyte of Ailes. It was a perfect and powerful alliance — two outcasts seeking retribution. Nixon’s great contribution to American politics was to take his personal umbrage, a lifetime of slights, and nationalize it. Like so many among the aggrieved, he aspired to be an insider and was tormented by not being admitted to their ranks. In anger, he took them on, especially those he regarded as haughty elites — accused spy Alger Hiss, who was the personification of the Ivy-educated aristocrat and on whose takedown Nixon built his career; John Kennedy, who was everything Nixon wanted to be and wasn’t; and the entire liberal establishment, which would denigrate and denounce him as he rose through the political ranks. He became the avatar for every person who had suffered the same disdain and abuse, and he turned Republicanism into a therapeutic movement of social vengeance.
Heather Ewing at The Brooklyn Rail:
In 1969 a young artist in Turin named Giulio Paolini took as his personal motto the Latin inscription—itself a quotation from Nietzsche—at the foot of an early Giorgio de Chirico self-portrait: Et quid amabo nisi quod ænigma est [And What Shall I Love If Not the Enigma]. He made the phrase into his own business card and transformed it into a public manifesto by placing it on an enormous banner hung across the main piazza in Como. This was his contribution to Campo Urbano, the public art intervention staged that year by Luciano Caramel in collaboration with Ugo Mulas and Bruno Munari, which invited artists out of their studios and galleries to engage directly with the urban environment, the spaces of daily life. For Paolini, it was the beginning of a decades-long fascination with de Chirico’s oeuvre, which Paolini has referenced, cited, and interrogated in his conceptual practice—artwork that is now the subject of the fourth season at the Center for Italian Modern Art (CIMA), which places paintings spanning much of de Chirico’s career together with works by Paolini from the 1960s to today.
This phrase “And What Shall I Love If Not the Enigma” was a touchstone as well for Philip Guston—Dore Ashton said he quoted it all his life; and it was a prompt for Sylvia Plath, too, who wrote several poems inspired by de Chirico paintings, as did Mark Strand, John Ashbery, and others (Ashbery also translated parts of de Chirico’s surrealist novel Hebdomeros). I love that Louise Bourgeois and her husband, the art historian Robert Goldwater, together dedicated themselves to translating some of de Chirico’s writings. De Chirico’s work has beguiled and bedeviled a surprising number of artists and writers.
Edwidge Danticat at Literary Hub:
The republic of Port-au-Prince, as it is often called, is a city of survivors. Even those who would like to see the country decentralized or have the capital moved elsewhere talk about creating another Port-au-Prince, a different one for sure, but an improved version of the old one. Still, Port-au-Prince is also a heartbreaking city. It is a city where a restaurant that charges over 20 American dollars for a steak might stand inches from some place where others are starving. It is a city where the dead can lie in a morgue for weeks as the family clamors for money to pay for the burial.
It is also a city where paintings line avenue walls, where street graffiti curses or praises politicians, depending on who has paid for them. It is a city of so much traffic that it has become a city of back roads, short cuts that rattle your body through hills, and knolls that at first don’t seem passable. It’s a city of moto taxis, which are better fitted for such roads. It is also a city of cell phones, where conversations often end abruptly because someone’s prepaid cards have run out of minutes. It is a city, as one of Haiti’s most famous novelists, Gary Victor, has written, where people who might run toward bullets will flee the rain, because the rain can reconfigure roads in an instant and can take more lives in a few minutes than a gun.
Hugh Gough at the Dublin Review of Books:
Madame de Staël has attracted the attention of historians and biographers alike, not only because of her flamboyant lifestyle and unwavering sense of self importance but also because of her sheer energy and impressive literary output. Her life had all the characteristics of a romantic novel, launched as she was into the privileged world of Parisian salons before revelling in the turmoil of the French Revolution, spending years of exile in the family château near Geneva, and travelling through central and eastern Europe before finally returning to France in the early years of the Bourbon restoration. Her interests straddled the Enlightenment and early Romanticism and she was one of the few French-language authors of the late eighteenth century to immerse herself in both German and Italian culture. She met Goethe and Schiller in Weimar, was a close friend of the philosopher and founder of the University of Berlin Wilhelm von Humboldt, and had her children educated by the romanticist and orientalist August Wilhelm von Schlegel. Her novels Delphine (1802) and Corinne (1807) have enjoyed lasting success and De l’Allemagne, published between 1810 and 1813, was a highly original analysis of the literature, history and philosophy of a culture that her nemesis, Napoleon Bonaparte, briefly dominated but never understood.
She was a lady born into wealth and privilege, and now has a society dedicated to her memory which publishes her complete works, encourages research and hosts annual meetings in the château in Coppet which has been the home of the Necker family since the eighteenth century. Her political ideas are less well known than her novels but she was an intelligent and reflective commentator on the twists and turns of events in France until Napoleon finally sent her into exile.
Skye Cleary in 3:AM Magazine:
3AM: I understand that your book grew out of a New York Times opinion piece of the same name, “How to Be a Stoic.” Why did you decide to write the book? Or was it more about riding the wave of Fate?
Massimo Pigliucci: In a sense, it was about Fate. But in another sense, it was a very deliberate project. Fate entered into it because The New York Times article went viral, and I immediately started getting calls from a number of publishers, enquiring into whether I intended to write a book. Initially, I didn’t. But then I considered the possibility more carefully. After all, I had started a blog (howtobeastoic.org) with the express purpose of sharing my progress in studying and practicing Stoicism with others, and I am convinced that Stoicism as a philosophy of life can be useful to people. So, a book was indeed the next logical step.
3AM: What are the key differences between ancient Stoicism and your new Stoicism? Why did it need updating?
Massimo Pigliucci: Stoicism is an ancient Greco-Roman philosophy, originating around 300 BCE in Athens. It’s only slightly younger than its Eastern counterpart, Buddhism. But while Buddhism went through two and a half millennia of evolution, Stoicism was interrupted by the rise of Christianity in the West. A lot of things have happened in both philosophy and science in the 18 centuries since there were formal Stoic schools, so some updating is in order.