Why we react to inconvenient truths as if they were personal insults

Brian Resnick in Vox:

Psychologists have been circling around a possible reason political beliefs are so stubborn: Partisan identities get tied up in our personal identities. Which would mean that an attack on our strongly held beliefs is an attack on the self. And the brain is built to protect the self.

When we’re attacked, we evade or defend — as if we have an immune system for uncomfortable thoughts, one you can see working in real time.

“The brain’s primary responsibility is to take care of the body, to protect the body,” Jonas Kaplan, a psychologist at the University of Southern California, tells me. “The psychological self is the brain’s extension of that. When our self feels attacked, our [brain is] going to bring to bear the same defenses that it has for protecting the body.”

Recently, Kaplan has found more evidence that we tend to take political attacks personally. In a study recently published in Scientific Reports, he and collaborators took 40 self-avowed liberals who reported having “deep convictions,” put them inside a functional MRI scanner, and started challenging their beliefs. Then they watched which parts of the participants’ brains lit up. Their conclusion: When the participants were challenged on their strongly held beliefs, there was more activation in the parts of the brain that are thought to correspond with self-identity and negative emotions.

More here.

Does Empathy Guide or Hinder Moral Action? A debate between Paul Bloom and Jamil Zaki

From the New York Times:

Paul Bloom: What does it take to be a good person? What makes someone a good doctor, therapist or parent? What guides policy-makers to make wise and moral decisions?

Many believe that empathy — the capacity to experience the feelings of others, and particularly others’ suffering — is essential to all of these roles. I argue that this is a mistake, often a tragic one.

Empathy acts like a spotlight, focusing one's attention on a single individual in the here and now. This can have positive effects, but it can also lead to short-sighted and unfair moral actions. And it is subject to bias — both laboratory studies and anecdotal experiences show that empathy flows most for those who look like us, who are attractive and who are non-threatening and familiar.

When we appreciate that skin color does not determine who we should care about, for example, or that a crisis such as climate change has great significance — even though it is an abstract threat — we are transcending empathy. A good policy maker makes decisions using reason, aspiring toward the sort of fairness and impartiality empathy doesn't provide.

Empathy isn’t just a reflex, of course. We can choose to empathize and stir empathy for others. But this flexibility can be a curse. Our empathy can be exploited by others, as when cynical politicians tell stories of victims of rape or assault and use our empathy for these victims to stoke hatred against vulnerable groups, such as undocumented immigrants.

More here.

Open Society Needs Defending

George Soros in Project Syndicate:

Globalization has had far-reaching economic and political consequences. It has brought about some economic convergence between poor and rich countries; but it increased inequality within both poor and rich countries. In the developed world, the benefits accrued mainly to large owners of financial capital, who constitute less than 1% of the population. The lack of redistributive policies is the main source of the dissatisfaction that democracy’s opponents have exploited. But there were other contributing factors as well, particularly in Europe.

I was an avid supporter of the European Union from its inception. I regarded it as the embodiment of the idea of an open society: an association of democratic states willing to sacrifice part of their sovereignty for the common good. It started out as a bold experiment in what Popper called “piecemeal social engineering.” The leaders set an attainable objective and a fixed timeline and mobilized the political will needed to meet it, knowing full well that each step would necessitate a further step forward. That is how the European Coal and Steel Community developed into the EU.

But then something went woefully wrong. After the Crash of 2008, a voluntary association of equals was transformed into a relationship between creditors and debtors, where the debtors had difficulties in meeting their obligations and the creditors set the conditions the debtors had to obey. That relationship has been neither voluntary nor equal.

Germany emerged as the hegemonic power in Europe, but it failed to live up to the obligations that successful hegemons must fulfill, namely looking beyond their narrow self-interest to the interests of the people who depend on them. Compare the behavior of the US after WWII with Germany’s behavior after the Crash of 2008: the US launched the Marshall Plan, which led to the development of the EU; Germany imposed an austerity program that served its narrow self-interest.

More here.

Don’t blame technology

Irina Borogan and Andrei Soldatov in Eurozine:

Disappointment with the mainstream media as the traditional intermediary between society and the authorities grew to striking levels. The concept of media denial was born: now readers wanted to judge events for themselves. Technology arose to aid them in this, and what seemed insane back in 2001, with people sitting in front of their television and computer screens and coming to far-fetched conclusions about the masterminds behind 9/11, became convenient in the late 2000s, with the rise of WikiLeaks and the concept of “data journalism” presented as an alternative to narrative journalism.

Russian bloggers, most of whom had never been journalists but who had some background in PR, became extremely popular. A blogger writing about politics, cars, cameras or anything else could get a million daily views.

And then came the world of social media: a platform to share experiences, data and stories where the credibility of the story was judged not on the reputation of the author but on numbers – of reposts and followers. This brave new world cried out to be exploited.

Were the Russians the first to exploit it? Certainly not.

In March 2011, The Guardian reported on a contract from the United States Central Command (CENTCOM), which oversees US operations in the Middle East and Central Asia, to develop what was described as an “online persona management service”. This service would allow one US serviceman or woman to control up to 10 separate identities based all over the world. The CENTCOM contract stipulated that each fake online persona must have a convincing background, history and supporting details, and that up to 50 US-based controllers should be able to operate these false identities from their workstations “without fear of being discovered by sophisticated adversaries”. Back then these false online personas were called “sock puppets” – these days they are better known as trolls. The software's objective was to help US service personnel, working around the clock in one location, to respond to emerging online conversations with any number of coordinated messages, blog posts, chatroom posts and other interventions. As CENTCOM spokesman Commander Bill Speaks said at the time, “The technology supports classified blogging activities on foreign-language websites to enable CENTCOM to counter violent extremist and enemy propaganda outside the US.”

But Russia was the first country to turn this weapon into a new way of conducting public policy, first in Ukraine, then in Europe.

More here.

Wham Bang, Teatime

Ian Penman in the LRB:

In 1975 David Bowie was in Los Angeles pretending to star in a film that wasn’t being made, adapted from a memoir he would never complete, to be called ‘The Return of the Thin White Duke’. This dubious pseudonymous character was first aired in an interview with Rolling Stone’s bumptious but canny young reporter Cameron Crowe; it soon became notorious. Crowe’s scene-setting picture of Bowie at home featured black candles and doodled ballpoint stars meant to ward off evil influences. Bowie revealed an enthusiasm for Aleister Crowley’s system of ceremonial magick that seemed to go beyond the standard, kitschy rock star flirtation with the ‘dark side’ into a genuine research project. He talked about drugs: ‘short flirtations with smack and things’, but given the choice he preferred a Grand Prix of the fastest, whitest drugs available. He brushed aside compatriots/competitors like Elton John and called Mick Jagger the ‘sort of harmless bourgeois kind of evil one can accept with a shrug’. If pushed, this apprentice warlock could also recite Derek and Clive’s ‘The Worst Job I Ever Had’ by heart and generally came on like a twisted forcefield of ego, will and fantastic put-on.

It’s impossible to imagine someone like Bowie giving the media anything like this kind of insane access today – but then, of course, there is no one like Bowie today. In 2016 it might take five months of negotiation to get an interview with the superstar of your choice and then you’d probably have to present your questions in advance and be babysat by three or four PR flaks and a spooky zombie-faced entourage for the whole blessed 15 minutes. In 1975, Bowie just turned up grinning, already babbling, at Crowe’s door. When Crowe got him to sit still long enough he couldn’t stop talking, which may or may not have had something to do with the industrial amounts of pharmaceutical cocaine he was daily ingesting. He had become almost an abstraction in the dry California air: surrounded by stubbly country-rock cowboys and wailing witchy women he was a sheet of virgin foolscap. Where did he fall from, this Englishman with his barking seal laugh and outrageous quotes about Himmler and semen storage and articulate ghosts?

More here.

Political Surrealism, Surreal Politics

Carl Freedman in the LA Review of Books:

WHAT IS THE RELATIONSHIP between radical aesthetic practices and actual political radicalism? There are many — and various — answers to this question. One of the most interesting is suggested by a famous exchange between Lenin and the Romanian-Jewish writer Valeriu Marcu. During his exile in Zurich, Lenin took many of his meals at a restaurant frequented by radically avant-garde painters, poets, and other such bohemian types, Marcu among them. In conversation one day, Lenin said to Marcu, “I don’t know how radical you are, or how radical I am. I am certainly not radical enough. One can never be radical enough; that is, one must always try to be as radical as reality itself.” Marcu was sufficiently impressed by the great Russian revolutionary that he went on to write his first biography.

To try to be as radical as reality itself is a good motto for anyone wishing to accomplish anything of value in art or in politics. Brecht, who was unswervingly radical in both spheres, however, maintained that the artistic comprehension of reality in all its “radicality” is not necessarily best achieved through traditional literary realism. China Miéville would certainly agree. All of his numerous works are animated by revolutionary Marxism, and all diverge in one way or another — or in many ways — from classical realism. His recent volume, The Last Days of New Paris (2016), is set in France, mainly in Paris, during the Nazi occupation; but this occupation is quite different from the one you can read about in the history books. The text can be classified as an alternative-history novel (or novella, as Miéville labels it). Yet a knowledge of the canonical achievements of this genre — like Philip K. Dick’s The Man in the High Castle (1962), or Philip Roth’s The Plot Against America (2004), or any of a number of works by Kim Stanley Robinson — will suggest only a very partial idea of what is to be found here.

More here.

Why do our cell’s power plants have their own DNA?

Laurel Hamers in Science:

It’s one of the big mysteries of cell biology. Why do mitochondria—the oval-shaped structures that power our cells—have their own DNA, and why have they kept it when the cell itself has plenty of its own genetic material? A new study may have found an answer. Scientists think that mitochondria were once independent single-celled organisms until, more than a billion years ago, they were swallowed by larger cells. Instead of being digested, they settled down and developed a mutually beneficial relationship with their hosts that eventually enabled the rise of more complex life, like today’s plants and animals.

Over the years, the mitochondrial genome has shrunk. The nucleus now harbors the vast majority of the cell’s genetic material—even genes that help the mitochondria function. In humans, for instance, the mitochondrial genome contains just 37 genes, versus the nucleus’s 20,000-plus. Over time, most mitochondrial genes have jumped into the nucleus. But if those genes are mobile, why have mitochondria retained any genes at all, especially considering that mutations in some of those genes can cause rare but crippling diseases that gradually destroy patients’ brains, livers, hearts, and other key organs?

More here.

Thursday Poem

Mint

It looked like a clump of small dusty nettles
Growing wild at the gable of the house
Beyond where we dropped our refuse and old bottles
Unverdant ever, almost beneath notice.

But, to be fair, it also spelled promise
And newness in the back yard of our life
As if something callow yet tenacious
Sauntered in green alleys and grew rife.

The snip of scissor blades, the light of Sunday
Mornings when the mint was cut and loved:
My last things will be first things slipping from me.
Yet let all things go free that have survived.

Let the smells of mint go heady and defenceless
Like inmates liberated in that yard.
Like the disregarded ones we turned against
Because we’d failed them by our disregard.

by Seamus Heaney
from The Spirit Level
Faber and Faber 1996

Under Wall Street lies a legacy of slavery

Inge Oosterhoff in The Correspondent:

The Dutch are pretty proud of founding New York. In 2009, the Netherlands and New York celebrated their 400-year history with myriad events, festivals, and parties.

Wherever possible, organizers emphasized shared values like freedom, tolerance, and equal opportunity – values the Netherlands is often given credit for coming up with.

But the story has a dark side that’s often overlooked. In the colony called New Amsterdam, the Dutch kept slaves from day one.

It’s a fact that merits remembering. Each July 1, the Netherlands celebrates Keti Koti, a day honoring the abolition of slavery in the former Caribbean colonies. But what about slavery here, in New York City? Join me on a tour of modern-day New York as I look for traces of the city’s buried past.

More here.

You’re an Adult; Your Brain, Not So Much

Carl Zimmer in the New York Times:

“Oftentimes, the very first question I get at the end of a presentation is, ‘O.K., that’s all very nice, but when is the brain finished? When is it done developing?’” Dr. Somerville said. “And I give a very nonsatisfying answer.”

Dr. Somerville laid out the conundrum in detail in a commentary published on Wednesday in the journal Neuron.

The human brain reaches its adult volume by age 10, but the neurons that make it up continue to change for years after that. The connections between neighboring neurons get pruned back, as new links emerge between more widely separated areas of the brain.

Eventually this reshaping slows, a sign that the brain is maturing. But it happens at different rates in different parts of the brain.

The pruning in the occipital lobe, at the back of the brain, tapers off by age 20. In the frontal lobe, in the front of the brain, new links are still forming at age 30, if not beyond.

More here.

Silicon Valley futurists plan to live forever by harvesting both the labor and the body parts of the working class

A. M. Glittlitz in The New Inquiry:

Silicon Valley’s elites are a revolutionary vanguard party developing the not-too-distant future of cybernetic capitalist reconstruction. Despite cultish personas and massive social influence, however, they tend to keep their politics on the low. That changed this year when Peter Thiel, PayPal founder and Facebook board member, who also has investments in SpaceX and data analysis firm Palantir, revealed himself as mastermind of the litigious assassination of Gawker, a fellow-traveler of right-libertarian White Nationalists, and a prominent supporter of President-elect Donald J. Trump.

Thiel’s “Don’t Be Evil” competitors now look like saints in comparison. Some colleagues distanced themselves, while others wrote off the endorsement as part of his “disruptive instinct” to break down regulations preventing his Founders Fund investments from expanding. Then, in August, it was rumored that Thiel bragged to friends that Trump promised to nominate him to the Supreme Court, which would make him one of the most powerful men in America for a lifetime term. And Peter Thiel plans to live for a long time. He has a personal and financial stake in life extension technologies, including “parabiosis”–the (theoretically) rejuvenating transfer of young blood to an older person.

For those outside the valley, Thiel’s vampiric ambitions appeared to vindicate populist imagery dating back to Voltaire, who wrote in his Philosophical Dictionary that the real vampires were “stock-jobbers, brokers, and men of business, who sucked the blood of the people in broad daylight.”

More here.

Carl Weber, dead at 91, was Bertolt Brecht’s protégé and brought Germany’s experimental theater to America

Cynthia Haven in The Book Haven:

Avant-garde theater director Carl Weber began his theatrical career in a POW camp. He became Bertolt Brecht‘s protégé and brought Germany’s experimental theater to America. The Stanford drama professor, emeritus, died in Los Altos on Christmas night. He was 91.

I wrote about him several years ago (as well as on the Book Haven). He recalled his first “role” as an unwilling German soldier:

“At the first opportunity” – he recalled, and then put up both hands in the universally accepted sign of surrender – “I was a prisoner of England in Belgium.” He was sent to Colchester, Essex, as a POW.

Within weeks of his capture, he was performing Friedrich Schiller‘s The Robbers as one of a handful of performers at the Christmastime play in a mess tent, with tables for a stage. The group had a captive audience – literally.

But the event was a turning point: After Weber returned to a Germany that was “cold and miserable and in ruins” in February 1946, he finished his studies in chemistry at the University of Heidelberg and went to Berlin in September 1949 to pursue a career as an actor, director and dramaturg.

Many of the “alumni” of Camp 186 in Colchester went on to have remarkable careers: German stage and TV actor Günther Stoll; Werner Düttmann, city architect for Berlin in the 1960s; and actor Klaus Kinski, collaborator with writer-film director Werner Herzog.

More here.

Carrie Fisher Died Having Figured Out How to Truly Be Carrie Fisher

David Edelstein in Vulture:

Carrie Fisher had the flukiest life, but ye gods, she made it her own. Relatively early, she realized that her fame and money had little to do with her. She had, she wrote in her final memoir, The Princess Diarist, “associative fame. By-product fame.” First, she had “celebrity daughter” fame, having been born to “America’s Sweethearts,” Eddie Fisher and Debbie Reynolds, a year-and-a-half before Fisher became “America’s cad” by running away with Reynolds’s recently widowed pal, Elizabeth Taylor. Later, she’d have “celebrity wife” fame as the spouse of Paul Simon. In between, she had “happened-to-have-played-an-iconic-character fame.” The character was, of course, Princess Leia Organa of Alderaan, the leader of the rebellion against the Empire and an improbable figure even for a galaxy far, far away. There would be no way for Fisher to reconcile herself to a life of so many disparate parts, even with booze and pills and powders. It took courage and imagination and a big dose of exhibitionism to reinvent herself as a comic autobiographer and brassy Hollywood eccentric.

She began, of course, by telling us all about the ways in which she wasn’t Princess Leia. Before she arrived in London in 1976 to shoot Star Wars (now called, tiresomely, Episode IV: A New Hope), she’d spent a month at a fat farm in Texas to lose some of the baby-fat in her cheeks. Then came the application of the “hairy earphones” that she also dubbed “the buns of Navarone.” As the only girl in a boys’ fantasy universe, she had to declaim terrible lines while trying to maintain her poise. As her recently published diaries (with bonus poems) make clear, her days were largely spent trying to figure out why the inhumanly gorgeous but married Harrison Ford — with whom she was having an affair — wasn’t falling in love with her the way she was with him. She walked away with a lot of confusion, a semi-broken heart, and (in lieu of a salary) a quarter of a percentage of what would turn out to be one of the most profitable movies ever made.

More here.

Researchers “Translate” Bat Talk. Turns Out, They Argue—A Lot

Jason Daley in Smithsonian:

Plenty of animals communicate with one another, at least in a general way—wolves howl to each other, birds sing and dance to attract mates and big cats mark their territory with urine. But researchers at Tel Aviv University recently discovered that when at least one species communicates, it gets very specific. Egyptian fruit bats, it turns out, aren’t just making high-pitched squeals when they gather together in their roosts. They’re communicating specific problems, reports Bob Yirka at Phys.org.

According to Ramin Skibba at Nature, neuroecologist Yossi Yovel and his colleagues recorded a group of 22 Egyptian fruit bats, Rousettus aegyptiacus, for 75 days. Using a modified machine learning algorithm originally designed for recognizing human voices, they fed 15,000 calls into the software. They then analyzed the corresponding video to see if they could match the calls to certain activities.

They found that the bat noises are not just random, as previously thought, reports Skibba. They were able to classify 60 percent of the calls into four categories. One of the call types indicates the bats are arguing about food. Another indicates a dispute about their positions within the sleeping cluster. A third call is reserved for males making unwanted mating advances and the fourth happens when a bat argues with another bat sitting too close. In fact, the bats make slightly different versions of the calls when speaking to different individuals within the group, similar to a human using a different tone of voice when talking to different people. Skibba points out that besides humans, only dolphins and a handful of other species are known to address individuals rather than making broad communication sounds.
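The pipeline described above, feeding labelled calls into a classifier and sorting them into the four dispute categories, can be sketched in miniature. To be clear, this is a hypothetical toy, not the researchers' actual method: the "acoustic feature" vectors below are synthetic stand-ins, the category names are invented labels for the four call types mentioned in the article, and a simple nearest-centroid classifier stands in for the modified speech-recognition algorithm.

```python
import random
from collections import defaultdict

# Invented labels for the four call types reported in the study:
# food disputes, sleeping-cluster position, unwanted mating advances,
# and a neighbor perched too close.
CATEGORIES = ["food", "sleep_position", "mating_advance", "personal_space"]

def make_call(category, rng):
    """Generate a synthetic 3-d 'acoustic feature' vector clustered by category."""
    center = {"food": (1, 0, 0), "sleep_position": (0, 1, 0),
              "mating_advance": (0, 0, 1), "personal_space": (1, 1, 1)}[category]
    return tuple(c + rng.gauss(0, 0.3) for c in center)

def train(calls):
    """Compute one centroid per labelled category from (label, vector) pairs."""
    sums = defaultdict(lambda: [0.0, 0.0, 0.0])
    counts = defaultdict(int)
    for label, vec in calls:
        counts[label] += 1
        for i, v in enumerate(vec):
            sums[label][i] += v
    return {label: tuple(s / counts[label] for s in sums[label])
            for label in sums}

def classify(centroids, vec):
    """Assign a call to the category with the nearest centroid."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(centroids[label], vec))

rng = random.Random(0)
train_set = [(c, make_call(c, rng)) for c in CATEGORIES for _ in range(100)]
test_set = [(c, make_call(c, rng)) for c in CATEGORIES for _ in range(50)]

centroids = train(train_set)
correct = sum(classify(centroids, vec) == label for label, vec in test_set)
accuracy = correct / len(test_set)
```

On well-separated synthetic clusters like these, the toy classifier sorts most calls correctly; the real study, working from noisy recordings, managed roughly 60 percent.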

More here.