The Empty Brain

Robert Epstein in Global Research:

No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer. To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections. A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.

But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not. Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
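Epstein’s dog example is easy to check for yourself. A minimal Python sketch (the specific bit patterns assume ASCII/UTF-8 text encoding, the common case he appears to describe):

```python
# Each character of "dog" occupies one byte; each byte is a pattern of 8 bits.
word = "dog"
encoded = word.encode("utf-8")   # the raw bytes a computer actually stores
for ch, byte in zip(word, encoded):
    print(f"{ch!r} -> byte {byte} -> bits {byte:08b}")
# 'd' -> byte 100 -> bits 01100100
# 'o' -> byte 111 -> bits 01101111
# 'g' -> byte 103 -> bits 01100111
```

Side by side, those three bytes are the word dog, exactly as the essay says; an image file is the same idea scaled up to roughly a million bytes.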

More here.

Friday Poem

Emergency Measures

I take Saturday’s unpopulated trains,
sitting at uncontagious distances,
change at junctions of low body count, in off-hours,
and on national holidays especially, shun stadia
and other zones of efficient kill ratio,
since there is no safety anymore in numbers.

I wear the dull colors of nesting birds,
invest modestly in diverse futures,
views and moods undiscovered by tourists,
buy nothing I can’t carry or would need to sell,
and since I must rest, maintain at several addresses
hardened electronics and three months of water.
And it is thus I favor this unspecific café,
choose the bitterest roast, and only the first sip
of your story, sweet but so long, and poignantly limited
by appointments neither can be late for, and why now
I will swim through the crowd to the place it is flowing away from,
my concerned look and Excuse me excuse me suggesting
I am hurrying back for my umbrella or glasses
or some thrilling truth they have all completely missed.

by James Richardson

CLR James rejected the posturing of identity politics

Ralph Leonard in UnHerd:

“I denounce European colonialism”, wrote CLR James in 1980, “but I respect the learning and profound discoveries of Western civilisation.” A Marxist revolutionary and Pan-Africanist, a historian and novelist, an icon of black liberation and die-hard cricket fan, Cyril Lionel Roberts James, described by V. S. Naipaul as “the master of all topics”, was one of the great (yet grossly underrated) intellectuals of the 20th century.

He was one of the few Leftist intellectuals – as Christopher Hitchens once said about George Orwell – who was simultaneously on the right side of the three major questions of the 20th century: Fascism, Stalinism and Imperialism. But today his praise for ‘Western culture’ would probably be dismissed as a slightly embarrassing residue of a barely concealed ‘Eurocentrism’.

Sophie Zhang in a recent column for Varsity entitled “Not all literature is ‘universal’ – nor does it have to be”, writes that:

“The study of English Literature… has often centred around texts that claim to explore ‘universal’ themes and experiences. Yet what such curricula fail to recognise is that in glorifying the universal, we neglect the particular, because to focus on the ‘Western’ canon would be ‘to centre whiteness and continually place non-white voices on the margins’”.

Implicit in this view is that only “whiteness” could have access to the universal, and those outside of “whiteness” are intrinsically on the margins, and their views are necessarily “particular”.

Similarly, James’s admiration for Western culture and the Western canon is something many black radicals, who otherwise admire James for his opposition to colonialism, struggle to understand about him. It is rather fashionable, and almost expected, that to be a ‘proper’ black radical today is to be hostile to all that is designated as Western; it is to indiscriminately dismiss the Enlightenment as “white” and “racist”, and disparage the Western canon as not being “relevant” to black people.

More here.

The Cure For Racism Is Cancer

Tony Hoagland (who died two days ago at age 64) in The Sun:

The woman sitting next to me in the waiting room is wearing a blue dashiki, a sterile paper face mask to protect her from infection, and a black leather Oakland Raiders baseball cap. I look down at her brown, sandaled feet and see that her toenails are the color of green papaya, glossy and enameled.

This room at MD Anderson Cancer Center in Houston, Texas, is full of people of different ages, body types, skin colors, religious preferences, mother tongues, and cultural backgrounds. Standing along one wall, in work boots, denim overalls, and a hunter’s camouflage hat, is a white rancher in his forties. Nervously, he shifts from foot to foot, a styrofoam cup of coffee in his hand. An elderly Chinese couple sit side by side, silently studying their phones. The husband is watching a video. The wife is the sick one, pale and gaunt. Her head droops as if she is fighting sleep. An African American family occupies a corner. They are wearing church clothes; the older kids are supervising the younger ones while two grown women lean into their conversation and a man — fiftyish, in a gray sports coat — stares into space.

America, that old problem of yours? Racism? I have a cure for it: Get cancer. Come into these waiting rooms and clinics, the cold radiology units and the ICU cubicles. Take a walk down Leukemia Lane with a strange pain in your lower back and an uneasy sense of foreboding. Make an appointment for your CAT scan. Wonder what you are doing here among all these sick people: the retired telephone lineman, the grandmother, the junior-high-school soccer coach, the mother of three.

More here.

Notes on Neanderthal Aesthetics

Justin E. H. Smith in his blog:

Often, in the study of past human and natural processes, the realisation that we have no evidence of something, and therefore no positive knowledge, transforms too quickly into the conclusion that because we have no positive knowledge, we may therefore assume the negative. Sometimes this conclusion is correct, but it cannot be applied indiscriminately. To cite one interesting example of the correct and useful refusal to apply it, in recent probabilistic reasoning about extraterrestrials the absence of any direct evidence for their existence is taken as irrelevant to whether we should believe in them or not. Drake’s equation is more powerful than radio signals or space ships in shaping our beliefs.

In palaeontology and palaeoanthropology, the signals that do come down to us in the present are usually the result of contingent, singular events that could have failed to occur. Any individual object that survives erosion or decomposition from the distant past is an exception, and needs to be understood as such. A corollary principle worth adopting in these fields holds that the earliest found artefact of a given sort cannot be the earliest artefact that ever existed of that sort. As an individual object, it is exceptional, but it justifies the presumption of a large class of absent objects to which it belongs or belonged, and in relation to which it is not at all exceptional.

More here.

The Secret Lives of Central Bankers

Annelise Riles in the New York Times:

A few years ago, a senior Japanese central banker let me in on a secret side of his life: Like some others in his rarefied world, he is a passionate devotee of Sherlock Holmes. After formal meetings in capitals around the world, he joins the other Sherlock Holmes buffs over drinks or dinner for trivia competitions, to test their knowledge of obscure plot details, or to share amateur historical research into Victorian London.

It is all very casual, but the camaraderie is important to him. Through this informal fan club, the banker told me, he had made his closest professional friendships. “I feel closer to many of these people than to many of my countrymen,” he said.

As an anthropologist, I have spent 20 years studying the customs, beliefs and rituals of central bankers around the world. They see themselves as jacks-of-all-financial-trades who solve complex financial crises before they can damage the unsuspecting public. They are as clever as the extraordinarily wealthy banking executives whom they regulate, but motivated by higher ideals. So it made sense that the aloof and justifiably arrogant Sherlock Holmes might represent for them an ideal of masculine brilliance (they are mostly still men), rationality and self-control. Like Holmes, central bankers consider their detachment an asset.

But in the real world, this high-mindedness has come at a cost.

More here.

George Scialabba, Radical Democrat

Jedediah Purdy in The New Republic:

It may strike a reader new to George Scialabba’s writing as extraordinary that Slouching Toward Utopia, a new collection of his essays and reviews, is not a response to Donald Trump’s presidency. Although the president does not appear by name until a handful of very recent pieces toward the end—earlier he is decorously invoked, just once, as “a famous social parasite”—Scialabba has argued for years that the United States is a plutocracy, administered mainly for the convenience of those who control capital and jobs. His consistent themes have been the corruption of language, the coarsening of imagination, the colonization of attention by technology and commerce, and the seductions of power. The pathologies that the present moment throws into relief have always been the occasions of his warnings and laments. He writes lucidly about benightedness, vividly about purblindness, so that his essays and reviews show thought as a thing possible in a world that can seem a conspiracy against sense and reason.

It is up to Scialabba’s readers to observe this modest heroism in his work, because he will not claim it for himself. He has long insisted on the political irrelevance of criticism.

More here.

The Rise of Cancer Immunotherapy

Daniel M. Davis in Nautilus:

“Every time Jim meets a patient, he cries,” Padmanee said to The New York Times in 2016. “Well not every time,” Jim added. Jim Allison and Padmanee Sharma work together at the MD Anderson Cancer Center in Houston, Texas, having met in 2005 and married in 2014. A decade before they met, Allison and his lab team made a seminal discovery that led to a revolution in cancer medicine. The hype is deserved; cancer physicians agree that Allison’s idea is a game-changer, and it now sits alongside surgery, radiation, and chemotherapy as a mainstream option for the treatment of some types of cancer.

Take one example. In 2004, 22-year-old Sharon Belvin was diagnosed with stage IV melanoma—skin cancer that had already spread to her lungs—and was given a 50/50 chance of surviving the next six months. Chemotherapy didn’t work for her and her prospects looked bleak. “I’ve never felt that kind of pain,” she later recalled, “ … you are lost, I mean you’re lost, you’re absolutely out of control, lost.” All other options exhausted, she signed up to an experimental clinical trial testing a new drug based on Allison’s idea. After just four injections over three months the tumor in her left lung shrunk by over 60 percent. Over the next few months, her tumors kept shrinking and eventually, after two and a half years of living with an intense fear of dying, she was told that she was in remission—her cancer could no longer be detected. The treatment doesn’t work for everyone but, Allison says, “We’re going to cure certain types of cancers. We’ve got a shot at it now.”

More here.

Thursday Poem

Reading Moby-Dick at 30,000 Feet

At this height, Kansas
is just a concept,
a checkerboard design of wheat and corn
no larger than the foldout section
of my neighbor’s travel magazine.
At this stage of the journey
I would estimate the distance
between myself and my own feelings
is roughly the same as the mileage
from Seattle to New York,
so I can lean back into the upholstered interval
between Muzak and lunch,
a little bored, a little old and strange.
I remember, as a dreamy
backyard kind of kid,
tilting up my head to watch
those planes engrave the sky
in lines so steady and so straight
they implied the enormous concentration
of good men,
but now my eyes flicker
from the in-flight movie
to the stewardess’s pantyline,
then back into my book,
where men throw harpoons at something
much bigger and probably
better than themselves,
wanting to kill it,
wanting to see great clouds of blood erupt
to prove that they exist.
Imagine being born and growing up,
rushing through the world for sixty years
at unimaginable speeds.
Imagine a century like a room so large,
a corridor so long
you could travel for a lifetime
and never find the door,
until you had forgotten
that such a thing as doors exist.
Better to be on board the Pequod,
with a mad one-legged captain
living for revenge.
Better to feel the salt wind
spitting in your face,
to hold your sharpened weapon high,
to see the glisten
of the beast beneath the waves.
What a relief it would be
to hear someone in the crew
cry out like a gull,
Oh Captain, Captain!
Where are we going now?
by Tony Hoagland
from Donkey Gospel
Graywolf Press, Saint Paul, Minnesota

Tony Hoagland
1953 – 2018

Looking at the world as a whole: Mary Midgley, 1919-2018

James Garvey in Prospect:

Just a few weeks before her death in October, Mary Midgley agreed to meet and discuss her new book, What Is Philosophy For? It seemed astonishing that someone about to celebrate her 99th birthday had a new book out, but I was less in awe of that than the reputation of one of the most important British philosophers of the 20th century and beyond.

People who have encountered Midgley often use the word “formidable” to describe her. Journalist Andrew Brown called her “the most frightening philosopher in the country: the one before whom it is least pleasant to appear a fool.” During my email correspondence with her to set up a date to talk about those philosophical problems “which are exercising both me and the public,” she worried that publications like the ones I write for “occasionally give rather half-witted answers to large questions of this kind.”

A lot of people were on the receiving end of her sharp intellect. She made puncturing scientific pretension into an art form—going after DNA discoverer Francis Crick for saying that human behavior can be explained simply by the interactions of brain cells, the physicist Lawrence Krauss for claiming that only science can solve philosophical problems, those theorists who insist we must look to machines for our salvation, and, most famously, Richard Dawkins for the idea that a gene could be selfish.

In person, though, Midgley was kind, generous with her time and as engaged as ever with philosophical ideas—even if her voice was soft and she had a little trouble hearing me. She sat in an armchair, sipping tea, surrounded by books. Having just celebrated her approaching birthday with friends and family, she had a kitchen full of cakes.

More here.

‘World’s oldest fossils’ may just be pretty rocks

Maya Wei-Haas in National Geographic:

In 2016, a series of unassuming stone shapes rocked the paleobiology world when they were declared the earliest fossilized life yet found. Standing up to 1.6 inches tall, the triangular forms line up like a string of inverted flags in an outcrop on the southwest coast of Greenland that dates back 3.7 billion years.

“If these are really the figurative tombstones of our earliest ancestors, the implications are staggering,” NASA astrobiologist Abigail Allwood wrote in a review article that accompanied the Nature study announcing the find. The microbes that made these fossils are over 200 million years older than the most widely accepted evidence of fossil life and would have lived a geologic blink of an eye after asteroids had blasted Earth’s early surface. Evidence of critters from this time would suggest that “life is not a fussy, reluctant, and unlikely thing,” Allwood wrote. “Give life half an opportunity, and it’ll run with it.”

But even as Allwood penned these words, she had a nagging sense that something was amiss.

More here.

History for a Post-Fact America

Alex Carp in the New York Review of Books:

What was America? The question is nearly as old as the republic itself. In 1789, the year George Washington began his first term, the South Carolina doctor and statesman David Ramsay set out to understand the new nation by looking to its short past. America’s histories at the time were local, stories of states or scattered tales of colonial lore; nations were tied together by bloodline, or religion, or ancestral soil. “The Americans knew but little of one another,” Ramsay wrote, delivering an accounting that both presented his contemporaries as a single people, despite their differences, and tossed aside the assumptions of what would be needed to hold them together. “When the war began, the Americans were a mass of husbandmen, merchants, mechanics and fishermen; but the necessities of the country gave a spring to the active powers of the inhabitants, and set them on thinking, speaking and acting in a line far beyond that to which they had been accustomed.” The Constitution had just been ratified at the time of Ramsay’s writing, the first system of national government submitted to its people for approval. “A vast expansion of the human mind speedily followed,” he wrote. It hashed out the nation as a set of principles. America was an idea. America was an argument.

The question has animated American history ever since. “For the last half century,” the historian and essayist Jill Lepore told an interviewer in 2011, academic historians have been trying “to write an integrated history of the United States, a history both black and white, a history that weaves together political history and social history, the history of presidents and the history of slavery.” Over the same period, a generation of Americans have had their imaginations narrowed, on one side by populist myths blind to the evidence of the past, and on the other by academic histories blind to the power of stories. Why, at a time when facts are more accessible than at any other point in human history, have they failed to provide us with a more broadly shared sense of objective truth?

More here.

The Art of Anni Albers

Lynne Cooke at Artforum:

IN A 1985 INTERVIEW, Anni Albers remarked, “I find that, when the work is made with threads, it’s considered a craft; when it’s on paper, it’s considered art.” This was her somewhat oblique explanation of why she hadn’t received “the longed-for pat on the shoulder,” i.e., recognition as an artist, until after she gave up weaving and immersed herself in printmaking—a transition that occurred when she was in her sixties. It’s hard to judge whether Albers’s tone was wry or rueful or (as one critic alleged) “somewhat bitter,” and therefore it’s unclear what her comment might indicate about the belatedness of this acknowledgment relative to her own sense of her achievement. After all, she had been making “pictorial weavings”—textiles designed expressly as art—since the late 1940s. Though the question might now seem moot, it isn’t, given the enduring debates about the hierarchical distinctions that separate fine art from craft, and given the still contested status of self-identified fiber artists who followed in Albers’s footsteps and claimed their woven forms as fine art, tout court.

more here.

Handel, Dryden, and Alexander the Great

Sudip Bose at The American Scholar:

When George Frideric Handel arrived in London in 1710—he was in his mid-20s at the time and would reside in the capital for the duration of his life, becoming a naturalized British subject—he made his reputation composing operas, their librettos written not in his native German but in Italian, as was the fashion of the day. Working tirelessly and continuously, Handel produced an astonishing succession of operatic masterpieces: Giulio Cesare, Tamerlano, Rodelinda, Orlando, and Alcina, to name just a few. Eventually, however, he turned to the language of his adopted land, and it was in his English oratorios—Esther, Saul, Israel in Egypt, Samson, Judas Maccabaeus, Jephtha, and most famously of all, Messiah—that he arguably made his most striking contributions to Western music. Handel was attracted not only to the Bible but also to secular poetry, his subjects inspired by the likes of Milton, Pope, and Dryden. The composer’s command of English was never stellar (he was hardly a fluent exophone in the manner of Voltaire, Conrad, or Beckett), which makes his facility with the cadences, imagery, rhythms, and rhymes of English verse all the more remarkable.

more here.

Analyst who predicted the 2008 crash warns of bubble brewing in U.S. household wealth

Barbara Kollmeyer in MarketWatch:

Our call of the day pulls no punches as it warns that the oft-referenced increase in affluence has been artificially inflated by the Fed, which is ultimately bad news for the economy and the stock market. Here’s how Jesse Colombo, analyst at Clarity Financial, explains it:

“The U.S. household wealth boom since the Great Recession is a sham, a farce and a gigantic lie that is tricking everyone into believing that happy days are here again even though the engines that are driving it are bubbles that are going to burst and cause a crisis that will be even worse than the 2008 crash,” Colombo said in a video he posted via the Real Investment Advice blog.

There has been a fair bit of buzz on the topic since data this summer that showed household wealth topped $100 trillion for the first time in June. Colombo’s isn’t the only invective against bloated U.S. wealth and how it could go terribly wrong, but the commentary delivers, perhaps, the most potent argument to date, including charts, such as the following, that illustrates the degree to which wealth has been outpacing economic expansion:

Wealth that gallops past economic growth is a “telltale sign that the boom is artificial and unsustainable,” he said. The last two times household-wealth growth exceeded gross domestic product, or GDP, were during the late 1990s dot-com bubble and the mid-2000s housing bubble, he notes. “Both of which ended in tears,” he said.

More here.

Wednesday Poem

Good Bones

Life is short, though I keep this from my children.
Life is short, and I’ve shortened mine
in a thousand delicious, ill-advised ways,
a thousand deliciously ill-advised ways
I’ll keep from my children. The world is at least
fifty percent terrible, and that’s a conservative
estimate, though I keep this from my children.
For every bird there is a stone thrown at a bird.
For every loved child, a child broken, bagged,
sunk in a lake. Life is short and the world
is at least half terrible, and for every kind
stranger, there is one who would break you,
though I keep this from my children. I am trying
to sell them the world. Any decent realtor,
walking you through a real shithole, chirps on
about good bones: This place could be beautiful,
right? You could make this place beautiful.

by Maggie Smith
from Waxwing Magazine Issue IX, 2016

In Praise of Distracted Meditation

Alex Tzelnic in Slate:

When people learn that I meditate every day, they often sheepishly admit that they wish they could, but that they just aren’t suited for it, or their mind is too active, or they don’t have the time. This always reminds me of Anne Lamott’s iconic gem of an essay, “Shitty First Drafts.” While nonwriters tend to conceive of the writing process as a montage of steaming mugs of tea and meaningful glances outside windows frosted just so, in reality, writing is a grind. Words arrive slowly, and in direct proportion to how much time your ass is touching the chair and your fingers stroking the keyboard. And so I try and explain that meditation is exactly the same.

Many burgeoning meditators have visions of rapturous sitting rounds spent floating upon the meditation cushion, the incense burning just so, the mind clear and calm. The reasoning for this misconception is twofold: first, this is exactly how every meditator appears, since no one else is privy to the cacophony inside your skull; second, meditation is big business these days, and serenity sells. The truth, though, is that meditation can also be a real grind, the understanding arriving slowly and in direct proportion to how much time your ass is touching the sitting cushion and your breath rising and falling. When I wake up early to sit on the cushion for 30 minutes, it is often begrudgingly, and my sitting round is often, well, shitty. But this kind of message does not fill dharma halls and lead to best-sellers. (If a meditation instructor tells the truth and no one is around to hear it, does it make a sound?)

More here.

Schadenfreude sheds light on darker side of humanity

Carol Clark in Phys.Org:

Schadenfreude, the sense of pleasure people derive from the misfortune of others, is a familiar feeling to many—perhaps especially during these times of pervasive social media. This common, yet poorly understood, emotion may provide a valuable window into the darker side of humanity, finds a review article by psychologists at Emory University. New Ideas in Psychology published the review, which drew upon evidence from three decades of social, developmental, personality and clinical research to devise a novel framework to systematically explain schadenfreude. The authors propose that schadenfreude comprises three separable but interrelated subforms—aggression, rivalry and justice—which have distinct developmental origins and personality correlates. They also singled out a commonality underlying these subforms.

“Dehumanization appears to be at the core of schadenfreude,” says Shensheng Wang, a Ph.D. candidate in psychology at Emory and first author of the paper. “The scenarios that elicit schadenfreude, such as intergroup conflicts, tend to also promote dehumanization.” Co-authors of the study are Emory psychology professors Philippe Rochat, who studies infant and child development, and Scott Lilienfeld, whose research focuses on personality and personality disorders. Dehumanization is the process of perceiving a person or social group as lacking the attributes that define what it means to be human. It can range from subtle forms, such as assuming that someone from another ethnic group does not feel the full range of emotions as one’s in-group members do, all the way to blatant forms—such as equating sex offenders to animals. Individuals who regularly dehumanize others may have a disposition towards it. Dehumanization can also be situational, such as soldiers dehumanizing the enemy during a battle.

More here.