Was Billy Budd black?

Philip Hoare at The New Statesman:

Was Billy Budd, the Handsome Sailor at the heart of the book, black? Scholars such as John Bryant believe that there is internal evidence in the manuscript of the book – found in a bread tin after Melville’s death in 1891 and not published until 1924 – that the author had played with the idea of making his hero a man of African heritage. Billy is loved by all the crew and is described as blond and blue-eyed later in the story. Yet the sensuous descriptions of the Liverpool sailor and the Greenwich veteran elide to create a counterfactual version in which Billy becomes a black star at the centre of his constellation of shipmates.

Indeed, some critics – most notably, Cassandra Pybus at the University of Sydney – have suggested that another 19th-century anti-hero was a person of colour. In Wuthering Heights, published in 1847, two years before Melville’s visit, Heathcliff is described as a “regular black”, an orphan found in the Liverpool docks – an intriguing notion explored in Andrea Arnold’s brilliant 2011 film adaptation.

Melville witnessed great changes in the fortunes of black Americans. Moby-Dick is an allegory of the struggle against slavery in the run-up to the American Civil War; the Melville scholar Robert K Wallace believes that the writer heard the fugitive slave-turned-emancipationist Frederick Douglass speak in the 1840s and that they may have even met.

More here.

Tuesday Poem


o aluminium roll,
o silver scroll

confined in
this cupboard,

bound in cardboard,
restrained behind

a jagged blade that tears
lengths away to mute

the bowls and
jars of the fridge —

o small,
spare life.

I would free it, the tinfoil. I’d lift it from its cabinet and make
a river of it, a smooth, grey sheen released through the house.

At the summit of the stairs, the source would spurt up
from Gougane Barra, setting a mountain stream to gush, and I’d lift it

and give it a push, I’d let the bright waters of the Lee flow down
the slope, to run a silver ribbon through the hall. Under the bridge

of a couch, I’d watch shadows of salmon and brown trout swim in
and out of riverweed. On a moonlit night, a man might stand there

with his son, the light of their torches poaching the waters. If the child
whispered “Oh look, the river’s smooth as tin foil!” his father

would hush him quickly, finger to lip, and turn to choose a hook.
The waters would surge onwards, swirling under doors to the city

-kitchen. Where gulls screech and shriek high, I would thrust swifter
currents that’d make islands of table legs and riverbanks of walls.

I’d give the river a voice to hum through the culverts that run under
cupboards, making of itself a lilting city song, its waters speckled

with gloom-shadows of mullet. I would put a single seal there,
lost, and make a red-haired girl the only person who’d see him.

I would tug that river back, then, the weight of all its stories
dragging after it, and haul it in loud armfuls all the way back to me.


torn, I’d
fold it, then,

and close
it back in

its press

by Doireann Ní Ghríofa
from Oighear
publisher: Coiscéim, Dublin, 2017

Translation: 2017, Doireann Ní Ghríofa
First published on Poetry International, 2017

The New Science of Daydreaming

Michael Harris in Discover:

“I’m sorry, Julie, but it’s just a fact — people are terrified of being in their heads,” I say. “I read this study where subjects chose to give themselves electric shocks rather than be alone with their own thoughts.” It’s the summer of 2015 and the University of British Columbia’s half-vacated grounds droop with bloom. Julie — an old friend I’ve run into on campus — gives me a skeptical side-eye and says she’s perfectly capable of being alone with her thoughts. Proving her point, she wanders out of the rose garden in search of caffeine. I glower at the plants. The study was a real one. It was published in 2014 in Science and was authored by University of Virginia professor Timothy D. Wilson and his team. Their research revealed that, left in our own company, most of us start to lose it after six to 15 minutes. The shocks are preferable, despite the pain, because anything — anything — is better than what the human brain starts getting up to when left to its own devices.

Or so we assume.

What the brain in fact gets up to in the absence of antagonizing external stimuli (buzzing phones, chirping people) is daydreaming. I am purposefully making it sound benign. Daydreaming is such a soft term. And yet it refers to a state of mind that most of us — myself included — have learned to suppress like a dirty thought. Perhaps we suppress it out of fear that daydreaming is related to the sin of idle hands. From at least medieval times onward, there’s been a steady campaign against idleness, that instigator of evil. Today, in the spaces where I used to daydream, those interstitial moments on a bus, in the shower, or out on a walk, I’m hounded by a guilt and quiet desperation — a panicked need to block my mind from wandering too long on its own. The mind must be put to use.

More here.

England’s Mental Health Experiment: It makes economic sense

Benedict Carey in The New York Times:

England is in the midst of a unique national experiment, the world’s most ambitious effort to treat depression, anxiety and other common mental illnesses.

The rapidly growing initiative, which has gotten little publicity outside the country, offers virtually open-ended talk therapy free of charge at clinics throughout the country: in remote farming villages, industrial suburbs, isolated immigrant communities and high-end enclaves. The goal is to eventually create a system of primary care for mental health not just for England but for all of Britain.

At a time when many nations are debating large-scale reforms to mental health care, researchers and policy makers are looking hard at England’s experience, sizing up both its popularity and its limitations. Mental health care systems vary widely across the Western world, but none have gone nearly so far to provide open-ended access to talk therapies backed by hard evidence. Experts say the English program is the first broad real-world test of treatments that have been studied mostly in carefully controlled lab conditions.

The demand in the first several years has been so strong it has strained the program’s resources. According to the latest figures, the program now screens nearly a million people a year, and the number of adults in England who have recently received some mental health treatment has jumped to one in three from one in four and is expected to continue to grow.

Mental health professionals also say the program has gone a long way to shrink the stigma of psychotherapy in a nation culturally steeped in stoicism. “You now actually hear young people say, ‘I might go and get some therapy for this,’” said Dr. Tim Kendall, the clinical director for mental health for the National Health Service. “You’d never, ever hear people in this country say that out in public before.”

The enormous amount of data collected through the program has shown the importance of a quick response after a person’s initial call and of a triage-like screening system in deciding a course of treatment. It will potentially help researchers and policy makers around the world to determine which reforms can work — and which most likely will not. “It’s not just that they’re enhancing access to care, but that they’re being accountable for the care that’s delivered,” said Karen Cohen, chief executive of the Canadian Psychological Association, which has been advocating a similar system in Canada. “That is what makes the effort so innovative and extraordinary.”

More here.

What College is for (on the Eve of an Apocalypse) or The Copper Virtues

by Paul North

An informal talk before first-time college goers, the summer before their first year at an Ivy League university.

How does it feel when your group's social practices bring the world to the brink of destruction? How does it feel when no one can legitimately deny this fact any longer? We haven't been closer to midnight on the nuclear doomsday clock since the H-Bomb was invented in 1953. We haven't seen this level of financial inequality between the vast majority and a small minority of wealth-holders since before the New Deal. The economy is deeply segregated by race and by gender. As a culture, almost to a person, we failed to admit to ourselves that the heat-trapping tendencies of carbon dioxide could lead to a cascade of negative effects. You are going to college on the eve of nuclear war, mass impoverishment, and climate disaster. What is college for? I give you three simple imperatives: be stupid, get lazy, and dream.

When I went to college at the end of the 1980s we weren't conscious of any of this. Sure, in pre-school we dove under desks during nuclear drills, but the end of the cold war seemed to promise us that we could pay attention to other things. For my group, that was the hangover from hippydom. We were revolutionaries. We practiced free thinking, listened to political music, protested. Art led to emancipation, we thought.

Now the feeling is different. And yet, …

Why someone would even go to college on the eve of the apocalypse is a legitimate question, but since you decided to go, for complicated reasons I know, not all of which have to do with intellectual work—for some it is for economic reasons, for some social reasons, all of the reasons pretty compelling—since you decided to go to college for all of your many reasons, we will concentrate on another question. What are the virtues you can and should cultivate in college? Your first answer might be: excellence, leadership, and success. This is, after all, the Ivy League. And yet, even after all your hard work, you hear the hollow gong of these big empty words. They are of course codes for the social and economic advancement promised by an elite college.

What does excellence mean? Saying "excellent" is something like recognizing an intrinsic value, demonstrated by schoolwork but residing somehow, mystically, within you. It is a virtue that becomes a kind of substitute for social class. If you are excellent, legend has it, you don't have to be rich, or else you become rich, eventually, because of your excellence. Excellence is the highest position in a merit system (poor, mediocre, normal, good, and so on; D, C, B, A, and so on).

Leadership refers to your position in relation to your peers, in social organizations, extra-curriculars, university committees, and the arts. There is a sense, unspoken, that the best among you or the ones with the most "excellence" should naturally acquire roles of power over others.

And finally, "success"—academically, success is just an abstract term for a high grade point average, which acts as a kind of currency, or so students think, with which to pay their way into the next level of the game.

No doubt you would agree: life is not a video game. Rewards aren't automatic, even if you are excellent, a leader, and have success. But I want to say more than this. On the eve of an apocalypse it isn't clear there will be a next level, or if you're not willing to be that pessimistic—who can say right now what that next level will or should be? College now, I want to propose, especially an elite place like this, is where you should put the future aside.

Read more »

Was Austin an experimental philosopher?

by Dave Maier

Epistemology books (in the previous century anyway) pretty much all start out the same way. Epistemology is the philosophy of knowledge, and so the first thing we have to do is to determine What Knowledge Is, before then going on to find out how best to get it, confirm that we have it, what it’s good for, and so on. By the end of the first page our author has usually decided that knowledge is a form of belief which is true and justified; the question for the rest of the chapter, or even the book, is what else we may or may not need for a belief to count as knowledge. As it turns out, there are many such “JTB+” accounts of knowledge (note: none of them work, but it’s good practice figuring out why).

But how do we know that we want a “JTB+” account to begin with? That first bit was pretty quick. (Full disclosure: my own analysis of knowledge is a “TB” account, and as you can imagine it has been rather frustrating to see one’s considered views universally dismissed as obviously false on page 1 of virtually every introductory epistemology text in the land. But I digress, as this is not my point today.) What usually happens is this: our author says something like “Let’s say Jill believes that Jack is cheating on her, but as it happens he is not. Does Jill know that Jack is cheating on her, or merely believe without knowing? Clearly, in this case we would say that she does not know. She believes she knows, but she does not. Knowledge, that is, entails truth.” And that’s that; on to the (supposed) justification condition.

However, as you may have noticed, that’s not an argument – it’s merely an expression of an intuition; and intuitions are slender reeds on which to base our philosophical edifices. Or so say a new (or perhaps not so new, by now) breed of philosopher, who hoist the banner of “experimental philosophy”. Their emblem is a burning armchair, symbolic of the movement’s rejection of the detached, unempirical intuition-mongering of last-century mainstream philosophy. What do you mean “Clearly we would say X”? How do you know? Why don’t we actually go find out what “we” would say? Let’s round up some people and ask them!

This is what experimental philosophers do. Their research is explicitly empirical, as pointedly opposed to the traditional reliance on intuition. A common response to this from mainstream philosophers, especially at first, has been to mock experimental philosophy as turning the scholarly contemplation of the eternal verities over to the untutored mob, as if a mere vote could determine philosophical truth. This is certainly unfair to at least the best experimental philosophy, but even when we grasp their methodological point, some weirdness seems to me to remain. But instructive weirdness!

Read more »

Benedictine Dreams (And Some Strange Ideas about Counter-Culture)

by Leanne Ogasawara

I admit, the only reason I picked the book up off the shelf was the photograph of Mont Saint-Michel on the cover.

Ah, Mont Saint-Michel. We had just returned from the legendary floating island, and I had found myself utterly obsessed by the place. A fairy castle rising up out of the mist and waters of the tidal estuary in northern France, the abbey of Mont Saint-Michel is sometimes associated with the ancient Breton myth of the submerged cathedral lying underneath the sea. That myth of the sunken cathedral was the inspiration for Debussy's famous piano prelude, La Cathédrale Engloutie. Debussy often frequented Mont Saint-Michel, and while the abbey never sinks beneath the sea, it does become inaccessible as it is surrounded by waters twice daily. In days past, it was completely cut off at high tide by the strongest tidal forces in Europe; those currents rush in at incredible speed, "like that of a galloping horse," said Victor Hugo.

Even today, the setting is indescribable. There is a short story by Guy de Maupassant that I love because it so perfectly captures the magical, magnetic hold that Mont Saint-Michel has on the imagination, especially that of the pilgrim; indeed, it has been a major place of Christian pilgrimage for over a thousand years.

The following morning at dawn I went toward it across the sands, my eyes fastened on this gigantic jewel, as big as a mountain, cut like a cameo, and as dainty as lace. The nearer I approached the greater my admiration grew, for nothing in the world could be more wonderful or more perfect.

Seeing it for the first time last week, I simply could not believe my eyes. We had arrived as the abbey bells were ringing loudly in our ears and the army of day-trippers was pouring out in an endless tide, back toward the parking lots and the waiting tour buses.

If you stay on the island overnight, I had read, you will have the place much to yourself after 7pm.

Read more »

In Favor of Small States – Are Meganations too Big to Succeed?

by Bill Benzon

One of the most interesting effects of the Trump presidency has been the response various cities and states have had to the Trump administration’s blindness to global warming: They have decided to bypass the federal government and go their own way on climate policy, even to the point of dealing with other nations. Thus Bill McKibben states, in “The New Nation-States”:

The real test will come in September next year, when “subnational” governments from around the world gather in California to sign the “Under2 MOU,” an agreement committing them to uphold the Paris targets. Launched in 2015 by California and the German state of Baden-Württemberg, the movement now includes everyone from Alsace to Abruzzo to the Australian Capital Territory; from Sichuan to Scotland to South Sumatra; from Manchester City to Madeira to Michoacán. Altogether: a billion people, responsible for more than a third of the world’s economic output. And every promise they make, sincere or not, provides climate activists with ammunition to hold each government accountable.

Moreover, the number of articles reporting on the weakening of the nation-state as a form of government seems on the rise – I link to a number of them at my home blog, New Savanna.


Thomas H. Naylor, September 14, 2012

This would not be surprising to the late Thomas Naylor, a scholar and activist who taught economics at Duke University, Middlebury College, and the University of Vermont and who, as a consultant, advised major corporations and governments in over 30 countries. Naylor believed that nations such as the United States were too large to govern effectively and so should devolve into several smaller states. I am presently working with his estate to edit a selection of his papers and am reprinting one of them below. He completed it on December 3, 2012, a few days before he died from a stroke.

Secession Fever Spreads Globally

We should devote our efforts to the creation of numerous small principalities throughout the world, where people can live in happiness and freedom. The large states… must be convinced of the need to decentralize politically in order to bring democracy and self-determination into the smallest political units, namely local communities, be they villages or cities.
–Hans-Adam II, Prince of Liechtenstein, The State in the Third Millennium

Since the re-election of Barack Obama on November 6, 2012, over one million Americans have signed petitions on a White House website known as “We the People” calling for the secession of their respective states from the Union. Contrary to the view expressed by many politically correct liberals, this is not merely a knee-jerk, racist reaction of some Tea Party types to the re-election of Obama, but rather part of a well-defined trend. Today there are, in fact, 250 self-determination and political-independence movements in play worldwide, including nearly 100 in Europe alone, over 70 in Asia, 40 in Africa, 30 or so in North America, and 15 to 20 on various islands scattered around the world. We could be on the brink of a global secession pandemic!

We live in a meganation world under the cloud of Empire, the American Empire. Fifty-nine percent of the people on the planet now live in one of the eleven nations with a population of over one hundred million people. These meganations in descending order of population size include China, India, USA, Indonesia, Brazil, Pakistan, Nigeria, Bangladesh, Russia, Japan, and Mexico. Extending the argument one step farther, we note that twenty-five nations have populations in excess of 50 million and that seventy-three percent of us live in one of those countries.

Read more »

Is American Democracy Dying?

by Michael Liss

Is American Democracy dying? For months, as I have watched the bizarre spectacle of the new Marshal in town and his posse, there's been a phrase rattling around in my head—the historian Allan Nevins' observation that "Democracy must be reborn in every generation."

For Nevins, the man who met the moment was Lincoln, who persevered through failure and terrible loss of life to lead "a new birth of freedom." For me and many of my generation, it was Watergate—a crime met with the deliberative process leading to bipartisan consensus that a sitting President needed to resign. For others, it might have been the Reagan years and the restoration of American power, or the astonishing rise of Barack Obama.

What rebirth might this generation, marinating in the glory that is the Age of Trump, see that would reaffirm their faith in first principles?

For the moment, it's not coming from the Right. We have a Tweeter-in-Chief who demonstrates his policy chops by sending out 140-character jeremiads. A substance-free Speaker who practices posing three-quarters front with chin upraised, affecting a scholarly but manly demeanor. And a Senate Majority Leader who periodically emerges from whatever underwater den he schemes in to gum a little lettuce while spreading his own bilious joy. This is not a trio that inspires confidence.

Meanwhile, on Stage Left, La Résistance (sounds chic and très Macron, n'est-ce pas?) bravely fights the good fight with banners and words and marches—but without victories in Congressional Special Elections, or on cherished policies. And, besides a Democratic version of #nevertrump, without a coherent ideology.

Drama, poor judgment, and just malfeasance we have in abundance. The White House seems to be stocked with people who spend their time watching their backs. Most of the Executive Branch jobs that require Senatorial oversight are unfilled, either because of benign or malign neglect. The State Department is so understaffed that they are considering setting up a search party to find anyone who might know anything about foreign policy—or just anyone who knows anything about anything.

Read more »

In “Arbitrary Stupid Goal”, Conjuring a Lost New York City

Julia Felsenthal in Vogue:

“There are roughly three New Yorks,” E.B. White once wrote. “There is, first, the New York of the man or woman who was born here, who takes the city for granted and accepts its size and its turbulence as natural and inevitable. Second, there is the New York of the commuter—the city that is devoured by locusts each day and spat out each night. Third, there is the New York of the person who was born somewhere else and came to New York in quest of something.”

For White, that last New York, “the city of final destination, the city that is a goal,” was the greatest of all. The illustrator, graphic designer, cook, writer, and born-and-bred New Yorker Tamara Shopsin quotes this passage—drawn from White’s essay Here is New York—in her new memoir, Arbitrary Stupid Goal. Her book, among many other things, traces its author’s unconventional childhood, growing up in a one-bedroom apartment on Morton Street with four siblings and her parents, Kenny and Eve Shopsin, the eccentric proprietors of their eponymous, legendarily idiosyncratic West Village grocery-store-turned-eatery. (If you’re wondering about logistics, Shopsin writes that she slept in a bookshelf.)

Their business, Shopsin’s, or for those in the know, “The Store,” was housed for roughly three decades in a storefront on the corner of Bedford and Morton. In 2002, forced out by rapidly rising rents, Shopsin’s moved a couple blocks over to Carmine Street; then, a few years later, the restaurant moved again to its current home in Essex Market on the Lower East Side. Eve passed away in the mid-aughts. Kenny, The Store’s burly, famously bellicose chef, still mans the kitchen with his son Zack.

White’s essay, writes Shopsin in her memoir, is “written with so much love and grace its words become fact.” Still, she quibbles with his conclusion. “The third New Yorker, the non-native, takes a thing for granted too,” she asserts. “The third New Yorker knows they can live somewhere else. They have done it once, deep down if need be they can do it again.”

More here.

Quantum teleportation is even weirder than you think

Philip Ball in Nature:

A BBC headline last week, ‘First object teleported to Earth’s orbit’, has to be one of the most fantastical you’ll see this year. For once, it seems the future that science fiction promised has arrived! Or has it?

The article was talking about reports by Chinese scientists that they had transmitted the quantum state of a photon on Earth to another photon on a satellite in low Earth orbit, some 1,400 kilometres away. That kind of transmission — first demonstrated in a laboratory 20 years ago — is known as quantum teleportation.

It’s a label that can mislead the unwary, as the BBC headline demonstrates. A write-up of the work in Discover reports that the scientists “have successfully transmitted quantum entangled particles” — only to clarify, confusingly, that “unlike science fiction teleportation devices, nothing physical is being transported”.

But wait: didn’t someone once say information is physical? That was physicist Rolf Landauer, a pioneer of information theory. So if you send nothing physical, how can you transmit anything at all from A to B?

This is one of the deep issues that quantum physicists and philosophers still argue about. We can debate whether ‘quantum teleportation’ as a term is a catchy way of conveying a scientific idea, or a misleading bit of hype. But the real question — what, exactly, is transmitted during quantum teleportation, and how — touches on issues much more profound.
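At the protocol level, what travels from A to B is just two ordinary classical bits, which only become useful because Alice and Bob already share an entangled pair. The textbook scheme can be checked with a few lines of linear algebra. The sketch below (plain NumPy; qubit 0 is Alice's message, qubits 1 and 2 the shared Bell pair; the 0.6/0.8 amplitudes are an arbitrary illustration, none of this reflects the satellite experiment's hardware) verifies that, for each of Alice's four possible measurement outcomes, Bob's corrected qubit ends up in the original state:

```python
import numpy as np

# Single-qubit gates
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def cnot_0_to_1():
    """CNOT with control qubit 0, target qubit 1, on a 3-qubit register
    (qubit 0 is the most significant bit of the basis-state index)."""
    U = np.zeros((8, 8))
    for i in range(8):
        b0, b1, b2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
        j = (b0 << 2) | ((b1 ^ b0) << 1) | b2   # flip b1 when b0 == 1
        U[j, i] = 1.0
    return U

def teleport(psi):
    """Run the standard teleportation circuit on message state psi and
    return Bob's corrected qubit for each of Alice's 4 outcomes."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2) on qubits 1, 2
    state = np.kron(psi, bell)                   # full 3-qubit state vector
    state = cnot_0_to_1() @ state                # Alice: CNOT q0 -> q1
    state = np.kron(H, np.eye(4)) @ state        # Alice: Hadamard on q0
    amps = state.reshape(2, 2, 2)                # amps[m0, m1, bob_bit]
    results = {}
    for m0 in (0, 1):
        for m1 in (0, 1):
            bob = amps[m0, m1, :]
            bob = bob / np.linalg.norm(bob)      # collapse after Alice measures (m0, m1)
            if m1:                               # the two classical bits select
                bob = X @ bob                    # Bob's correction gates
            if m0:
                bob = Z @ bob
            results[(m0, m1)] = bob
    return results

psi = np.array([0.6, 0.8])                       # arbitrary message qubit a|0> + b|1>
for outcome, bob in teleport(psi).items():
    assert np.allclose(bob, psi), outcome        # recovered exactly in every branch
```

Note where the "nothing physical" puzzle bites: until the two classical bits (m0, m1) arrive over an ordinary channel, Bob's qubit is an even mixture of the four branches and tells him nothing, which is also why teleportation cannot signal faster than light.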

More here.

To Kolkata, From Baghdadi Jews, With Love

Jael Silliman in The Wire:

Crisp on the outside, soft inside, the golden brown, whole fried potatoes were brought piping hot to the dining table. My father, David, urged his guests to abandon even trying to tackle these “jumping potatoes” with their forks and knives. “Just sink your teeth in them!” he remarked cheerfully. We did and enjoyed the crackle in our mouths that slowly yielded to the soft, oozing centre melting on our tongues. Aloo makallah was definitely the star attraction of Baghdadi Jewish meals and a Calcutta specialty.

My father’s ancestor, Shalome Obadiah Ha Cohen, was the first Jew to come from Aleppo, Syria, to Calcutta for trade in the late eighteenth century and make it his home. Yet, our Middle Eastern community is loosely called ‘Baghdadi’ as we followed the liturgy of Baghdad, a centre of Jewish learning. We Baghdadis flourished in the port cities of ‘Jewish Asia’ that stretched from Baghdad to Shanghai. In Karachi, Bombay, Calcutta, Rangoon, Singapore, Penang, Djakarta, Hong Kong and Shanghai, small enclaves of Jews relied upon one another for religious, financial and social support. Marriages, commercial news, business and family connections welded us into a powerful economic and cultural presence in the East.

In Calcutta, the second city of Empire, we adapted to our new home and shifted from being Judaeo-Arabic to Judaeo-British in our language, as well as in our dress and cultural orientation. We lived among Anglo-Indians, Parsis, Armenians and Chinese, as well as Hindus and Muslims.

More here.

When Cold War philosophy tied rational choice theory to scientific method, it embedded the free-market mindset in US society

John McCumber in Aeon:

The chancellor of the University of California, Los Angeles (UCLA) was worried. It was May 1954, and UCLA had been independent of Berkeley for just two years. Now its Office of Public Information had learned that the Hearst-owned Los Angeles Examiner was preparing one or more articles on communist infiltration at the university. The news was hardly surprising. UCLA, sometimes called the ‘little Red schoolhouse in Westwood’, was considered to be a prime example of communist infiltration of universities in the United States; an article in The Saturday Evening Post in October 1950 had identified it as providing ‘a case history of what has been done at many schools’.

The chancellor, Raymond B Allen, scheduled an interview with a ‘Mr Carrington’ – apparently Richard A Carrington, the paper’s publisher – and solicited some talking points from Andrew Hamilton of the Information Office. They included the following: ‘Through the cooperation of our police department, our faculty and our student body, we have always defeated such [subversive] attempts. We have done this quietly and without fanfare – but most effectively.’ Whether Allen actually used these words or not, his strategy worked. Scribbled on Hamilton’s talking points, in Allen’s handwriting, are the jubilant words ‘All is OK – will tell you.’

Allen’s victory ultimately did him little good. Unlike other UCLA administrators, he is nowhere commemorated on the Westwood campus, having suddenly left office in 1959, after seven years in his post, just ahead of a football scandal. The fact remains that he was UCLA’s first chancellor, the premier academic Red hunter of the Joseph McCarthy era – and one of the most important US philosophers of the mid-20th century.

More here.

Usain Bolt is the fastest sprinter ever in spite of — or because of? — an uneven stride that upends conventional wisdom

Jere Longman in the New York Times:

Usain Bolt of Jamaica appeared on a video screen in a white singlet and black tights, sprinting in slow motion through the final half of a 100-meter race. Each stride covered nine feet, his upper body moving up and down almost imperceptibly, his feet striking the track and rising so rapidly that his heels did not touch the ground.

Bolt is the fastest sprinter in history, the world-record holder at 100 and 200 meters and the only person to win both events at three Olympics. Yet as he approaches his 31st birthday and retirement this summer, scientists are still trying to fully understand how Bolt achieved his unprecedented speed.

Last month, researchers here at Southern Methodist University, among the leading experts on the biomechanics of sprinting, said they found something unexpected during video examination of Bolt’s stride: His right leg appears to strike the track with about 13 percent more peak force than his left leg. And with each stride, his left leg remains on the ground about 14 percent longer than his right leg.

This runs counter to conventional wisdom, based on limited science, that an uneven stride tends to slow a runner down.

More here.

On Being Smaller

Colin Gillis in Avidly:

The other day, as I was returning empty trash cans from the curb in front of our apartment building, the older man who owns the home across the street from my apartment waved to me. “Hi! I don’t think we’ve met.” In fact, we had met, over five years before, when my wife and I first moved in to our apartment, and we had regularly greeted each other since then, or, at least, we had until recently. In the past six months, my appearance has changed dramatically. I lost a lot of weight, almost 100 pounds. After I explained what had happened to my neighbor, he shouted, “Holy shit!” several times in a row. In a way, my neighbor’s introduction was appropriate. This radical change in appearance has made me feel like a new person. My new body can do many things that the old one couldn’t, and my awareness of its expanded capacities imbues the future with possibility. I want to run a marathon, climb mountains, learn to dance—and, for the first time, my body is not an impediment to doing so. But the change isn’t just physical. Losing so much weight means that the world treats me differently in fundamental ways. In addition to physical mass, I am also unburdened from the psychological weight of stigma. As my body changed, its meaning changed with it—for other people and for me.

The world wants my happiness about this transformation to be pure. People who comment on my new appearance tend to describe it with metaphors of evolution or conversion, endowing the adipose tissue I used to carry on my body with moral as well as physiological significance. It seems that my weight was, for many, the physical symptom of a lack of virtue as well as a clear and present danger to my health. This was something that I always knew on an abstract level. Nobody ever accused me of a lack of virtue, or of a sense of failure, to my face. Now that I’m relatively thin, that’s changed. At my last annual physical, my doctor said, as he was examining my almost naked body, “Your wife must love the new you!” This was not the first time I had considered what effect, if any, the physical transformations wrought by weight loss and vigorous exercise had on my life partner’s perception of me, but it was certainly the first time someone else had baldly stated that she probably found me more attractive now. “She tells me she likes the new me, but she also insists that she liked the old me, too,” I replied, honestly. And added, also honestly, “Of course, one wonders.”

…But there was also something attractive and deeply pleasurable about being, and living, large: about cultivating huge appetites and satisfying them with abandon. Eating piles of calorie-rich food and guzzling it down with wine is tremendously fun, and I look back on occasions when I did that with fondness, a hint of jealousy, and only the slightest regret. And my large body was so powerful! I trained until I could deadlift 420 pounds. The rush of excitement doing this gave me, the sense of accomplishment, the physical pleasure of muscles flush with blood: all of it amounted to a palpable sense of strength that I carried with me, in body and in mind. Removing over thirty percent of my total body mass has entailed the loss of pleasures that I once associated with being huge and that remain important to me. These are more than just the pleasures of regular excess in food and drink. I am physically smaller now and less strong than I once was. I may never gain back all of my old strength.

How Breitbart Media’s Disinformation Created the Paranoid, Fact-Averse Nation That Elected Trump

Steven Rosenfeld in AlterNet:

Right-wing media evolved into a hall of mirrors in 2016, when Breitbart displaced Fox News as the key agenda-setting and attack-leading epicenter of a disinformation-filled, paranoid ecosystem promoting Donald Trump and his pro-white America agenda. Breitbart not only led the right’s obsessive, hostile focus on immigrants, it was also the first to attack professional news outlets such as the New York Times and the Washington Post. Breitbart's disruptive template fueled the political and information universe we now inhabit, where the right dismisses facts and embraces fantasies. There was no comparable dynamic on the left or among pro-Clinton audiences in 2016. The left's news sources, media consumption and patterns of social media-sharing are more open-minded and fact-based and less insular and aggressive. Still, Breitbart’s obsessive focus on fabricating and hyping scandals involving Hillary Clinton (and Jeb Bush early in the primary season) pushed mainstream media to disproportionately cover its agenda.

These observations are among the takeaways of a major study from Columbia Journalism Review that analyzed 1.25 million stories published online between April 2015 and Election Day 2016. While the study affirmed what many analysts have long perceived—that right-wing media and those who consume it inhabit a paranoid and dark parallel universe—it also documented shifts in the right’s media ecosystem; namely, Breitbart supplanting Fox News as the leading purveyor of extreme disinformation. “A right-wing media network anchored around Breitbart [has] developed as a distinct and insulated media system, using social media as a backbone to transmit a hyper-partisan perspective to the world,” CJR wrote. “This pro-Trump media sphere appears to have not only successfully set the agenda for the conservative media sphere, but also strongly influenced the broader media agenda, in particular coverage of Hillary Clinton.”

The CJR report said Americans’ media-consumption habits are “asymmetric,” meaning those on the left—progressives and Democrats—rely on more diverse outlets and content than those on the right.

More here.