Lamenting the Death of the Truly Weird TV Ad

Kate Takes at The Baffler:

HeadOn was the Max Headroom Incident of the millennial generation. Every time I bring it up, people tell me that they were equally stunned the first time they saw it, and that they remember being so relieved when other people said that they had seen it, too. Companies from Geico to Old Spice to Skittles have all tried their hand at absurdist advertisements, but nothing they’ve produced even remotely achieves the eldritch creepiness of catching the Head On commercial while watching the Weather Channel at 2 a.m. in 2006.

Back when Head On was only airing during late night television on non-primetime networks, it gave the unsettling impression that one had just witnessed something they were not supposed to be witnessing, that aliens had descended to earth and hijacked the television networks with this fucked up commercial for a bullshit headache remedy consisting mostly of wax.

more here.

Against Work, Ambition

Megan Nolan at The New Statesman:

In big cities particularly, I notice that every new person I meet is manically interested in what I do, and how much of it. I used to be embarrassed by my lack of drive and murmur vaguely about projects and deadlines, but I’m quite happy now to admit the truth, which is that I have very little ambition and no desire to work any harder than I do now, which is honestly not very much. I’ve calculated fairly minutely how much work I need to do in order to pay my bills and that’s the amount of work I do. No more. Sometimes I get it wrong and need to work much more than usual for a month or two, sometimes I have blissful unexpected mostly vacant weeks. I work about half as much as I did in Ireland, and earn about half as much money – which is fine with me because I’ve started seeing money not as a mark of achievement but as a cumulative display of all the days you’ve spent not doing what you’d like to be doing. I want freedom, not houses. I’d like more money, certainly, but not enough to give up all my time.

more here.

Wednesday Poem

The Kitchen Gods

Carnage in the lot: blood freckled the chopping block—
The hen’s death is timeless: frantic.
Its numbskull lopped, one wing still drags
The pointless circle of a broken clock,
But the vein fades in my grandmother’s arm upon the ax.
The old ways fade and do not come back.
The sealed aspirin does not remember the willow.
The supermarket does not remember the barnyard.
The hounds of memory come leaping and yapping,
One morning is too large to fit inside the mouth.
My grandmother’s life was a long time
Toiling between Blake’s root and Lightning
Yahweh and the girlish Renaissance Christ
That plugged the flue in her kitchen wall.
Early her match flamed across the carcass.
Her hand, fresh from the piano, plunged
The void bowel and set the breadcrumb heart.
The stove’s eye reddened. The day’s great spirit rose
from pies and casseroles. That was the house —
Reroofed, retiled, modernized, and rented out,
It will not glide up and lock among the stars.
The tenants will not find the pantry fully stocked
Or the brass boat where she kept the matches dry.
I find her stone and rue our last useless
Divisive arguments over the divinity of Christ.
Only where the religion goes on without a god
And the sandwich is wolfed down without blessing
I think of us bowing at the table there:
The grand patriarch of the family holding forth
In staunch prayer, and the potato pie I worshipped.
The sweeter the pie, the shorter the prayer.

by Rodney Jones
from Transparent Gestures
Houghton Mifflin, 1989

Toward a New Frontier in Human Intelligence: The Person-Centered Approach

Scott Barry Kaufman in Scientific American:

When it comes to intelligence, we all have bad days. Heck, we even have many bad moments, such as when we forget our car keys, forget a friend’s name, or bomb an important test that we’ve taken a day after staying up all night worrying about it. Truth is, none of us – including the world’s smartest human – is perfectly consistent in our cognitive functioning. Sometimes we are at our very best and feel like our brain is on fire, and at other times, we don’t even recognize ourselves. All of this sounds so obvious, but surprisingly the field of human intelligence has not had much to say on the topic. For over 120 years, the field has shed far more light on how we differ from each other in our patterns of cognitive functioning than on how we each differ within ourselves over time.

This is curious considering that a person-centered approach has proved fruitful in other fields, such as medicine and neuroscience. Even within the study of human behavior there has been progress, from looking at how individual emotions fluctuate over time, to how individual personality traits such as introversion and openness to new experiences, and even our morality, fluctuate throughout the course of the day. It has become increasingly clear that the results from the traditional individual-differences paradigm – where we compare people to each other – often do not apply at the person-specific level.

In only the past few years, intelligence researchers have been able to demonstrate that this is also true in the domain of human intelligence. For the past 120 years, the field just hasn’t had the tools to view intelligence at such a level of granularity. With the adoption of newer technologies, however, researchers have begun to view an individual’s intelligence at a more microscopic level, able to capture all sorts of fascinating variations – across days, within days, and even moment to moment. It turns out that intelligence is changing all over the place all the time. Who knew?

More here.

To avoid moral failure, don’t see people as Sherlock does

Rima Basu in Aeon:

If we’re the kind of people who care both about not being racist, and also about basing our beliefs on the evidence that we have, then the world presents us with a challenge. The world is pretty racist. It shouldn’t be surprising then that sometimes it seems as if the evidence is stacked in favour of some racist belief. For example, it’s racist to assume that someone’s a staff member on the basis of his skin colour. But what if it’s the case that, because of historical patterns of discrimination, the members of staff with whom you interact are predominantly of one race? When the late John Hope Franklin, professor of history at Duke University in North Carolina, hosted a dinner party at his private club in Washington, DC in 1995, he was mistaken as a member of staff. Did the woman who did so do something wrong? Yes. It was indeed racist of her, even though Franklin was, since 1962, that club’s first black member.

To begin with, we don’t relate to people in the same way that we relate to objects. Human beings are different in an important way. In the world, there are things – tables, chairs, desks and other objects that aren’t furniture – and we try our best to understand how this world works. We ask why plants grow when watered, why dogs give birth to dogs and never to cats, and so on. But when it comes to people, ‘we have a different way of going on, though it is hard to capture just what that is’, as Rae Langton, now professor of philosophy at the University of Cambridge, put it so nicely in 1991.

More here.

Sean Carroll’s Mindscape Podcast: Marq de Villiers on Hell and Damnation

Sean Carroll in Preposterous Universe:

If you’re bad, we are taught, you go to Hell. Who in the world came up with that idea? Some will answer God, but for the purpose of today’s podcast discussion we’ll put that possibility aside and look into the human origins and history of the idea of the Bad Place. Marq de Villiers is a writer and journalist who has authored a series of non-fiction books, many on science and the environment. In Hell & Damnation, he takes a detour to examine the manifold ways in which societies have imagined the afterlife. The idea of eternal punishment is widespread, but not quite universal; we might learn something about ourselves by asking where it came from.

More here.

What Modi’s victory says about today’s India

Namit Arora in Himal:

In Varanasi recently, I took an auto-rickshaw from Godowlia to Assi Ghat. Like everyone else in town, the driver and I began talking politics. The 2019 general election was a week away and Prime Minister Narendra Modi was seeking reelection from Varanasi. The driver was an ardent Modi fan and would hear no criticism of him. He even claimed that demonetisation had punished the corrupt rich. One topic led to another and soon he was loudly praising Nathuram Godse as a patriot – Gandhi deserved no less than a bullet for being a Muslim lover. “You don’t know these people,” he thundered. “Read our history! Only Muslims have killed their own fathers to become kings. Has any Hindu ever done so? Inki jaat hi aisi hai [That’s just how these people are]. You too should open your mobile and read on WhatsApp. Kamina Rahul is born of a Muslim and a Christian; Nehru’s grandfather, also Muslim, Mughal. Outsiders all. Modi will teach them!” Fortunately, my destination came before his passion for the topic could escalate further.

I entered Assi Ghat with a numbing sadness. Was this really Kashi, among the oldest continuously inhabited cities of the world, known for its religious pluralism and massive density of gods, creeds and houses of worship, with its long history of largely peaceful coexistence? The Kashi of the Buddha, Adi Shankara, Kabir, Ravidas and Nanak? The Kashi of shehnai maestro Bismillah Khan, who lived in its tangled gullies and regularly played during the aarti in Balaji temple, or of Hindustani vocalist Girija Devi, whose family kept mannats on Muharram? What still remains of its famed Ganga-Jamuna tehzeeb? No, I consoled myself, my auto driver was not the norm in Varanasi, but he did herald certain fundamental changes now sweeping the country.

More here.

War and Famine in Syria

Najwa al-Qattan at Public Books:

In the days leading up to the Muslim holiday of the Feast of Sacrifice (Eid al-Adha) in October 2013, several Syrian clerics issued a fatwa (a religious opinion or responsum) allowing—in several besieged and starved suburbs of Damascus—the consumption of cats, dogs, and donkeys killed in bombings. The fatwa, publicly announced from mosques and uploaded on YouTube, came in the context of war- and siege-induced food scarcities and starvation. It was not the first; over the previous year, similar fatwas had been issued in other besieged areas, including Aleppo, Homs, and Yarmouk, the largest Palestinian refugee camp in Syria. But there was poignancy to the timing of this fatwa: on this holiest of Muslim eids, believers all over the world celebrate the end of the Hajj, in part by the slaughtering of a sacrificial animal (and sharing its meat with the needy) in homage to the Prophet Abraham. But in this war, as was the case a century ago, it is the Syrian civilians that are being sacrificed.

more here.

Muslims of early America

Sam Haselby in Aeon:

The first words to pass between Europeans and Americans (one-sided and confusing as they must have been) were in the sacred language of Islam. Christopher Columbus had hoped to sail to Asia and had prepared to communicate at its great courts in one of the major languages of Eurasian commerce. So when Columbus’s interpreter, a Spanish Jew, spoke to the Taíno of Hispaniola, he did so in Arabic. Not just the language of Islam, but the religion itself likely arrived in America in 1492, more than 20 years before Martin Luther nailed his theses to the door, igniting the Protestant reformation. Moors – African and Arab Muslims – had conquered much of the Iberian peninsula in 711, establishing a Muslim culture that lasted nearly eight centuries. By early 1492, the Spanish monarchs Ferdinand and Isabella completed the Reconquista, defeating the last of the Muslim kingdoms, Granada. By the end of the century, the Inquisition, which had begun a century earlier, had coerced between 300,000 and 800,000 Muslims (and probably at least 70,000 Jews) to convert to Christianity. Spanish Catholics often suspected these Moriscos or conversos of practising Islam (or Judaism) in secret, and the Inquisition pursued and persecuted them. Some, almost certainly, sailed in Columbus’s crew, carrying Islam in their hearts and minds.

Eight centuries of Muslim rule left a deep cultural legacy on Spain, one evident in clear and sometimes surprising ways during the Spanish conquest of the Americas. Bernal Díaz del Castillo, the chronicler of Hernán Cortés’s conquest of Meso-America, admired the costumes of native women dancers by writing ‘muy bien vestidas a su manera y que parecían moriscas’, or ‘very well-dressed in their own way, and seemed like Moorish women’. The Spanish routinely used ‘mezquita’ (Spanish for mosque) to refer to Native American religious sites. Travelling through Anahuac (today’s Texas and Mexico), Cortés reported that he saw more than 400 mosques.

Islam served as a kind of blueprint or algorithm for the Spanish in the New World. As they encountered people and things new to them, they turned to Islam to try to understand what they were seeing, what was happening. Even the name ‘California’ might have some Arabic lineage. The Spanish gave the name, in 1535, taking it from The Deeds of Esplandian (1510), a romance novel popular with the conquistadores. The novel features a rich island – California – ruled by black Amazonians and their queen Calafia. The Deeds of Esplandian had been published in Seville, a city that had for centuries been part of the Umayyad caliphate (caliph, Calafia, California).

More here.

Tim Rollins and K.O.S.

Angel Abreu at The Paris Review:

In 1986, at the age of twelve, I joined Tim Rollins and Kids of Survival. I first met Tim as a seventh grader at Intermediate School 52, where he was teaching at the time. Tim had only intended to stay at the school for a few weeks. The students had made charcoal drawings on the ceiling of the classroom, and the walls were covered in graffiti. Tim often described the art room as the “Hip-Hop Sistine Chapel.” He was convinced that there was a profound reason he was there.

Timothy William Rollins was born in 1955 in a small town in central Maine. Similar to the South Bronx, Pittsfield was economically downtrodden and its youth struggled against the pitfalls of low expectations. Tim was extremely motivated and precocious. He was a gifted artist, an avid reader, and an amateur scholar of Martin Luther King Jr. Dr. King’s writings and speeches, combined with Tim’s Sunday school teaching, would form the basis of his pedagogical philosophy with K.O.S. Tim earned his BFA from SVA in 1977 and, after graduate studies in art education and philosophy at New York University, began teaching in the New York City public school system. In 1982, Tim stepped off the 2 train at Prospect Ave in the South Bronx for the first time.

more here.

The Whitney Biennial in an Age of Anxiety

Peter Schjeldahl at The New Yorker:

“Technology will surely drown us. The individual is disappearing rapidly. We’ll eventually be nothing but numbered ants. The group thing grows.” So said Marcel Duchamp to an interviewer in 1966, as quoted in the catalogue of the 2019 Whitney Biennial, by Adam Weinberg, the museum’s director. Weinberg has in mind the deleterious effects of social media, but Duchamp’s bull’s-eye prophecy could do as a capsule review of this Biennial. With scarce exceptions, the mostly youthful artists gravitate to identity or otherwise communitarian politics—strikingly, they are not, for the most part, militant, as if they had resigned themselves to ineffectiveness, but they appear entrenched. The show is about many things, but the irresponsible joy of aesthetic experience is only fitfully one of them. Nearly all the artists are technically adept in mediums that include photography, video, and performance, as well as painting and sculpture, but most of the work, though charming at times, is derivative in form, recycling modes that would not surprise any art-school student of the past quarter century. Lucas Blalock’s photographic images of normal interiors inhabited by surreal whatsits are suavely sensual, and studio photographs by John Edmonds suggest Robert Mapplethorpe’s tony classicism translated into slang.

more here.

Frances Arnold Turns Microbes Into Living Factories

Natalie Angier in The New York Times:

Dr. Arnold won fame and the Nobel Prize for developing a technique called directed evolution, a way of generating a host of novel enzymes and other biomolecules that can be put to any number of uses — detoxifying a chemical spill, for example, or disrupting the mating dance of an agricultural pest. Or removing laundry stains in eco-friendly cold water, or making drugs without relying on eco-hostile metal catalysts. Rather than seeking to design new proteins rationally, piece by carefully calculated piece — as many protein chemists have tried and mostly failed to do — the Arnold approach lets basic evolutionary algorithms do the work of protein composition and protein upgrades. The recipe is indeed an engineer’s dream: simple. You start with a protein that already has some features you’re interested in, such as stability in high heat or a knack for clipping apart fats. Using a standard lab trick such as polymerase chain reaction, you randomly mutate the gene that encodes the protein. Then you look for slight improvements in the resulting protein — a quickened pace of activity, say, or a vague inclination to carry out a task it wasn’t performing before, or a willingness to operate under conditions it deplored in the past.

…Through directed evolution, Dr. Arnold’s lab has generated microbes that do what organisms in nature have never been known to do. Some of them, for instance, stitch together carbon, the element that defines life, and silicon, the stuff of sand, glass and computer chips but heretofore not of life (unless you are a Horta, the rock-shaped beings who famously mind-melded with Mr. Spock on “Star Trek”). All it took were a few mutational tweaks to a bacterial protein called cytochrome c. “We showed for the first time that living organisms can use their own machinery to bring carbon and silicon together to form a bond,” said Jennifer Kan, a postdoctoral scholar in Dr. Arnold’s lab who performed the experiments. “We didn’t even have to nag the protein too hard to get it to do it.”
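The recipe Angier describes is, at bottom, a greedy evolutionary search: build a library of random variants, screen them, keep the winner, and repeat. Purely as an illustration of that loop (not Dr. Arnold’s actual laboratory protocol), here is a minimal Python sketch; the fitness function is a hypothetical stand-in for whatever assay would score each variant in the lab, and the mutation step crudely stands in for error-prone PCR.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def mutate(sequence, n_mutations=1):
    """Random point mutation(s) -- a crude stand-in for error-prone PCR."""
    seq = list(sequence)
    for _ in range(n_mutations):
        seq[random.randrange(len(seq))] = random.choice(AMINO_ACIDS)
    return "".join(seq)

def directed_evolution(parent, fitness, rounds=10, library_size=200):
    """Greedy loop: build a mutant library, screen it, keep the best, repeat."""
    best, best_score = parent, fitness(parent)
    for _ in range(rounds):
        library = [mutate(best) for _ in range(library_size)]
        for variant in library:
            score = fitness(variant)  # in the lab this is an assay, not a function call
            if score > best_score:
                best, best_score = variant, score
    return best

# Toy demonstration: "evolve" a sequence toward an arbitrary target.
target = "MKTAYIAKQR"
similarity = lambda s: sum(a == b for a, b in zip(s, target))
print(directed_evolution("A" * len(target), similarity, rounds=50))
```

Real campaigns differ in obvious ways: variants are screened at the bench rather than scored by a function, and practitioners often carry several promising parents into the next round instead of a single winner, but the shape of the loop is the same.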

More here.

In Your Hands, My Dissatisfied Countrymen: The Jaquess-Gilmore Mission

by Michael Liss

“I worked night and day for twelve years to prevent the war, but I could not. The North was mad and blind, would not let us govern ourselves, and so the war came.” —Jefferson Davis, July 1864

By the time Sherman’s armies had scorched and bow-tied their way to the sea, by the time Halleck had followed Grant’s orders to “eat out Virginia clean and clear as far as they go, so that crows flying over it for the balance of the season will have to carry their own provender with them,” and by the time Winfield Scott’s Anaconda Plan was finished squeezing every drop of life out of the Confederacy, there had to be those who wondered what possible logic would lead intelligent men like Jefferson Davis to make such a catastrophic choice.

Yet the South almost won the gamble. With secession, they had challenged the core of the American Experiment: the democratic principles of equal rights, general (male) suffrage, government by a majority, and a peaceful transition of power when that majority so indicated. They also posed an existential question for the North: Was adherence to a principle, even a cherished one like the Union, worth lives and property?

The Civil War is fascinating on so many levels, but what made it fundamentally different from any other conflict that preceded it was that, for the first time, two peoples with the ability to exercise electoral oversight engaged in a protracted armed conflict. This implied something new. The simplest mechanisms of civic belief, the right to disagree publicly, to organize, and to place elected leadership on notice that their jobs could be at risk, would all play an unexpectedly crucial role in how the war began and was ultimately prosecuted. Read more »

Monday Poem

almost without metaphor

clouds this morning cross two
adjacent mountains tinged with
bluegrey and pink, they move deliberately
in a swift west wind not like anything
but migrating water vapor
held by hydrogen bonds,
the cooler the better, they glide
over pine, hemlock, oak, and spruce
being networks of misted h-2-o.
the pine, hemlock, spruce, and oak
will drink to satiation if graced by clouds
meeting more frigid air and rain falls
not like anything but clear liquid
that has found itself in a bowl
of gravity

Jim Culleny
© 5/21/19

What an artistic masterpiece is not

by Dave Maier

Philosophers have spilled a great deal of ink attempting to nail down once and for all the necessary and sufficient conditions for a thing’s being a work of art. Many theories have been proposed, which can seem in retrospect to have been motivated by particular works or movements in the history of art: if you’re into Cézanne, you might think art is “significant form,” but if you’re impressed by Andy Warhol, you might think that arthood is not inherent in a work’s perceptible attributes, but is instead something conferred upon it by members of the artworld.

Nothing has really seemed to fit everything, and for whatever reason, essentialism in the philosophy of art, or at least arguing about it in public anyway, has drifted in and out of fashion. Yet that question, or something like it, won’t simply go away. Unless everything is art, some things are art and some are not. What’s the difference?

When you get stuck like this, one way to get back on track is to ask a different question. There are plenty of worthwhile candidates, but one which keeps coming up for me is: what’s the difference between something that’s not art because it’s not good enough, and something that’s not art because it’s the wrong sort of thing? Let’s start there. Read more »

When Russia Quit God

by Robert Fay

The Russian master Dostoevsky.

The great critic George Steiner in his book Tolstoy or Dostoevsky? (1959) believed the achievements of Russian literature in the 19th century stand as one of “three principal moments of triumph in the history of western literature, the other two being the Athenian dramatists and Plato and the age of Shakespeare.” Not particularly bad for a largely peasant-filled country whose entire literary tradition to that point was arguably in the form of just one man’s oeuvre: Alexander Pushkin. “His works,” Steiner notes, “constituted in themselves a body of tradition.”

But if the Russians didn’t have the bedrock cultural traditions of their European neighbors—ancient Greek and Roman civilizations, along with the unifying force of the Roman Catholic Church—they certainly had God, and God in Russia meant one thing: the Russian Orthodox Church.

In the 19th century, the European intellectual and artistic environment was animated by the Enlightenment, whose denouement was surely the decapitation of France’s ancien régime in 1789, and with it, much of the legitimacy of “state, church and people” being understood as a unified, divinely sanctioned manner of organizing human society and culture.

But by not being definitively western, either in geography or in the totality of its cultural inheritance, Russia had much in common with that one-time bastard of Europe, the United States of America, which in the 19th century was still very much grappling with God, as evidenced by the works of Nathaniel Hawthorne and Herman Melville, among others. Read more »

Searching for Perfection

by R. Passov

As the clouds of WWII darkened Austria, Kurt Gödel, the greatest logician of modern times, brought his two magnificent proofs to Princeton at Einstein’s urging. There he would remain for almost forty years, never mentoring a graduate student, rarely lecturing, adding only one substantial but incomplete proof to the canon of math.

Mildly underwhelmed by the impact of his discoveries, at Princeton he would gradually set aside math in favor of philosophy. Little of this work was published in his lifetime. But it was enjoyed by Einstein, who, for the last decade of his life, walked alongside as Gödel discussed a field Einstein once likened to ‘writings in honey.’

________________________________________

Gödel was born in 1906 into a German-speaking family living in Brünn (Brno), Moravia, then part of the Austro-Hungarian Empire, later annexed by Germany, and now part of the Czech Republic. His father, before passing unexpectedly in 1929, had built a successful textile business that would secure his family’s finances.

An early childhood struggle with rheumatic fever left Gödel forever suspicious of the state of his health. He took his secondary schooling at the local Realgymnasium. Modeled after the enlightened German system, the gymnasium offered “mental gymnastics, developing both mind and body.” Following his older brother, in 1921 he entered the University of Vienna.

Easily establishing his gifts, he studied theoretical physics, enjoyed the life of a student and earned a reputation for sleeping late. Read more »