How food became a matter of morals

Julian Baggini in The Guardian:

“The way these cream cakes flaunt themselves,” says saucy Carry On star Barbara Windsor, glaring disapprovingly at a chocolate eclair bursting with whipped cream, “it’s enough to lead a girl astray.” Her frown turns into a giggle. “Given half a chance,” she adds before tucking in gleefully.

Nothing captures the peculiarly moralistic British attitude to food better than this 15-second advert from the 1970s. And if poetry is the art of capturing whole worlds in few words, then its immortal slogan “naughty but nice” is greater proof of its author’s artistry than the Booker prize its writer, Salman Rushdie, would go on to win.

For as long as we can remember, the British have associated delicious food with depraved indulgence. Anything that tastes good has got to be bad for your body, soul or both. The marketing department of Magnum knew this when it called its 2002 limited edition range the Seven Deadly Sins. Nothing makes a product more enticing than its being naughty, or even better, wicked.

More here.

How a Guy From a Montana Trailer Park Overturned 150 Years of Biology

Ed Yong in The Atlantic:

In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

More here.

Liberalism after Brexit

Will Davies at the Political Economy Research Center:

Given that Brexit was an event imagined and delivered from within the Conservative Party, one of the most important analyses of it is Matthew d’Ancona’s examination of how the idea shifted from the party’s margins to its mainstream over the post-Thatcher era. Two things in particular stand out in his account.

Firstly, the political plausibility of Brexit rose as a direct response to Tony Blair’s dogmatic assumption that European integration was a historical destiny, which encompassed the UK. No doubt a figure such as Blair would have discovered a messianic agenda under any historical circumstances. But given he gained power specifically in the mid-90s, he was one palpable victim of the fin de siècle ideology (stereotyped by Francis Fukuyama’s ‘end of history’ thesis, but also present in Anthony Giddens’ ‘Third Way’) that the world was programmed to converge around a single political system.

Neo-conservative faith in violent ‘democratisation’ was Blair’s worst indulgence on this front, but a view of European unification (and expansion) as inevitable was responsible for inciting the Tory reaction within Westminster. Europe could have been viewed as a particular historical path, adopted in view of the particular awfulness of the European 20th century. Instead, in a Hegelian fashion, the idea of Europe became entangled with the idea of ‘globalisation’, and the conservative reaction was to refuse both.

Secondly, Tory Brexiteers view the EU as an anti-market project, which blocks economic freedom. This is also weirdly ahistorical.

More here.

Minds turned to ash

Josh Cohen in 1843 Magazine:

When Steve first came to my consulting room, it was hard to square the shambling figure slumped low in the chair opposite with the young dynamo who, so he told me, had only recently been putting in 90-hour weeks at an investment bank. Clad in baggy sportswear that had not graced the inside of a washing machine for a while, he listlessly tugged his matted hair, while I tried, without much success, to picture him gliding imperiously down the corridors of some glassy corporate palace.

Steve had grown up as an only child in an affluent suburb. He recalls his parents, now divorced, channelling the frustrations of their loveless, quarrelsome marriage into the ferocious cultivation of their son. The straight-A grades, baseball-team captaincy and Ivy League scholarship he eventually won had, he felt, been destined pretty much from the moment he was born. “It wasn’t so much like I was doing all this great stuff, more like I was slotting into the role they’d already scripted for me.” It seemed as though he’d lived the entirety of his childhood and adolescence on autopilot, so busy living out the life expected of him that he never questioned whether he actually wanted it.

Summoned by the bank from an elite graduate finance programme in Paris, he plunged straight into its turbocharged working culture. For the next two years, he worked on the acquisition of companies with the same breezy mastery he’d once brought to the acquisition of his academic and sporting achievements. Then he realised he was spending a lot of time sunk in strange reveries at his workstation, yearning to go home and sleep. When the phone or the call of his name woke him from his trance, he would be gripped by a terrible panic. “One time this guy asked me if I was OK, like he was really weirded out. So I looked down and my shirt was drenched in sweat.”

One day a few weeks later, when his 5.30am alarm went off, instead of leaping out of bed he switched it off and lay there, staring at the wall, certain only that he wouldn’t be going to work. After six hours of drifting between dreamless sleep and blank wakefulness, he pulled on a tracksuit and set off for the local Tesco Metro, piling his basket with ready meals and doughnuts, the diet that fuelled his box-set binges. Three months later, he was transformed into the inertial heap now slouched before me. He did nothing; he saw no one. The concerned inquiries of colleagues quickly tailed off. He was intrigued to find the termination of his employment didn’t bother him. He spoke to his parents in Chicago only as often as was needed to throw them off the scent. They knew the hours he’d been working, so didn’t expect to hear from him all that much, and he never told them anything important anyway.

More here.

Were Ants the World’s First Farmers?

Jackson Landers in Smithsonian:

Humans have been practicing agriculture for about 10,000 years. But the attine ants of South America (which include the well-known leafcutters) have us beat by a long way. According to a new paper co-authored by entomologist Ted Schultz, curator of ants at Smithsonian's National Museum of Natural History, attine ants, which farm on an industrial scale similar to humans, have been carefully cultivating gardens with a complex division of labor to grow an edible fungus. Schultz's team found that the ants have been doing this far longer than previously believed—up to 65 million years—and that we have much to learn from them.

Schultz and his co-authors, led by Sanne Nygaard, Guojie Zhang and Jacobus Boomsma of the University of Copenhagen, conducted an analysis of the genomes of the various species of attine ants as well as the fungus that they cultivate. Their results answer some long-standing evolutionary questions.

The 210 species of attine ants, including the 47 species of leafcutters, forage through the forests of Central and South America in search of leaves and other vegetation, which they carve into pieces using their powerful jaws and carry back to their nests. But they never eat the leaves directly. The plant matter is used as a growth medium for certain varieties of edible fungi which Schultz's team says have been cultivated and passed on by generations of ants going back tens of millions of years.

…Humans may have important lessons to learn from the attine ants. We have struggled to protect the survival of our crops for only about 10,000 years. “We're constantly coming up with herbicides or antibiotics to control pests. And the pests are constantly evolving countermeasures against those things,” Schultz says. The most economically important variety of banana became functionally extinct in the 1960s and another variety is heading in the same direction. “Somehow this system with the ants has been in an equilibrium for millions of years,” he adds.

More here.

Fences: A Brexit Diary

Zadie Smith in the New York Review of Books:

Back in the old neighborhood in North West London after a long absence, I went past the local primary school and noticed a change. Many of my oldest friends were once students here, and recently—when a family illness returned us to England for a year—I enrolled my daughter. It’s a very pretty redbrick Victorian building, and was for a long time in “special measures,” a judgment of the school inspection authority called Ofsted, and the lowest grade a state school can receive. Many parents, upon reading such a judgment, will naturally panic and place their children elsewhere; others, seeing with their own eyes what Ofsted—because it runs primarily on data—cannot humanly see, will doubt the wisdom of Ofsted and stay put. Still others may not read well in English, or are not online in their homes, or have never heard of Ofsted, much less ever considered obsessively checking its website.

In my case I had the advantage of local history: for years my brother taught here, in an after-school club for migrant children, and I knew perfectly well how good the school is, has always been, and how welcoming to its diverse population, many of whom are recently arrived in the country. Now, a year later, Ofsted has judged it officially “Good,” and if I know the neighborhood, this will mean that more middle-class, usually white, parents will take what they consider to be a risk, move into the environs of the school, and send their kids here.

If this process moves anything like it does in New York, the white middle-class population will increase, keeping pace with the general gentrification of the neighborhood, and the boundaries of the “catchment area” for the school will shrink, until it becomes, over a number of years, almost entirely homogeneous, with dashes of diversity, at which point the regulatory body will award its highest rating at last.

More here.

What Scientists Mean When They Say ‘Race’ Is Not Genetic

Jacqueline Howard in the Huffington Post:

If a team of scientists in Philadelphia and New York have their way, using race to categorize groups of people in biological and genetic research will be forever discontinued.

The concept of race in such research is “problematic at best and harmful at worst,” the researchers argued in a new paper published in the journal Science on Friday.

However, they also said that social scientists should continue to study race as a social construct to better understand the impact of racism on health.

So what does all this mean? HuffPost Science recently posed that question and others to the paper’s co-author, Michael Yudell, who is associate professor and chair of community health and prevention at the Dornsife School of Public Health at Drexel University in Philadelphia.

Why is it problematic to view race as a biological concept?

For more than a century, natural and social scientists have been arguing about whether race is a useful classificatory tool in the biological sciences — can it elucidate the relationship between humans and their evolutionary history, between humans and their health? In the wake of the U.S. Human Genome Project, the answer seemed to be a pretty resounding “no.”

More here.

The Real Reason Why Judges Should Keep Quiet About Elections

Richard L. Hasen and Dahlia Lithwick in Slate:

Late last week, Supreme Court Justice Ruth Bader Ginsburg tried to put the controversy over her recent criticisms of presumptive Republican presidential nominee Donald Trump behind her, issuing a written statement of regret and telling NPR’s Nina Totenberg: “I did something I should not have done. It’s over and done with, and I don’t want to discuss it anymore.”

But the issue of judicial speech on political matters is hardly over and done with. It will remain fodder for the 2016 presidential election because Donald Trump criticized Ginsburg, even questioning her mental competence (“her mind is shot”) and calling on her to resign. Many court watchers worry about what might happen if the court is called upon to rule on any kind of election dispute, a prospect that brings a reprise of calls for her to recuse herself from any Trump-related litigation. And on top of all that, the court itself will soon decide whether to weigh in on a case challenging an Arizona rule that bars judicial candidates from doing the very thing Justice Ginsburg did: openly supporting or opposing a candidate for public office.

More here.

The Film Theory of Roland Barthes

Michael Blum at Bookforum:

A colorful inspection of his post-Mythologies reflections on film yields a portrait of Barthes the fetishist, deriving furtive pleasure from “the insignificant detail, the trivial object, the commonplace element that somehow seems slightly out of place.” The import of Barthes’s relish for the “ticklish detail” or the “obtuse meaning,” however, is not confined to mere fetishism. Watts persuasively argues that Barthes’s eye for the sensuous surface of things—whether the idiosyncratic exactness of Sergei Eisenstein’s mise en scène or Michelangelo Antonioni’s meandering landscape shots—has trenchant consequences for what Watts calls a “micropolitics” of film. This micropolitics, according to Watts, finds expression in a kind of egalitarian cinematic gaze, in “an aesthetic sensibility intent upon exploring the inexhaustible fascination with the ordinary.” This angle identifies the revolutionary potential of a film not strictly in its story, message, or even style, but rather in its “fractions and particles,” in those miniature, fleeting, fortuitous, or seemingly insignificant elements that nonetheless manage to “transmit to viewers . . . new conceptions of being a body, of linking one gesture to another, of moving in space, of being together.” This sensibility, conveyed across several of Barthes’s late writings from the mid- to late-1970s, also marks his repudiation of the fettering protocols of theory and the universalizing claims of abstract science. By this time, he is no longer under the impression that the power of cinema resides “in its capacity to hypnotize or render us passive,” but rather in its “ability to transform the sensory experience of the world around us.” According to Watts, Barthes’s cinema gradually became not only a delectable pastime but also a bridge to the world, and a machine for dreaming of ways to change it.

more here.

the paintings of frederick terna

Stephen Westfall at BOMB Magazine:

In 1943, at the age of twenty, Frederick Terna knew that if he survived the war he was going to be a painter. At the time he was an inmate of Terezín, which was not his first concentration camp, nor was it to be his last. In all, Terna was interned at four different camps: first at Lipa, known as Linden bei Deutsch-Brod, from October 3, 1941 to March 1943; then Terezín, known as Ghetto Theresienstadt, to fall of 1944; then Auschwitz to the end of 1944, and finally to Kaufering, a sub-camp of Dachau, outside of Munich, near Landsberg. Terna was born in Vienna but was raised in Prague, where his family moved when he was quite young. He lacked the realist rigor to enter art school at age thirteen. At Terezín, with horror all around him, he found himself composing mental pictures of the barracks, fences, and roads leading through the camp and adjusting them in his mind’s eye for better compositions. His fellow inmates told him he could be killed for his drawings.

Holocaust survivors are remarkable for the mere and obdurate fact of their survival. Each one is an extraordinary case study, even if they achieve little in their lives afterward. The children of survivors are notoriously at risk for neuroses of their own, and so on. History’s great crimes reverberate for generations. Terna lost his entire family in the camps.

more here.

Gotta name them all: how Pokémon can transform taxonomy

Editorial in Nature:

Millions of people have spent the past week walking around. Ostensibly, they are playing the online game Pokémon Go and hunting for critters in an ‘augmented reality’ world. But as gamers wander with their smartphones — through parks and neighbourhoods, and onto the occasional high-speed railway line — they are spotting other wildlife, too. Scientists and conservationists have been quick to capitalize on the rare potential to reach a portion of the public usually hunched over consoles in darkened rooms, and have been encouraging Pokémon hunters to snap and share online images of the real-life creatures they find. The question has even been asked: how long before the game prompts the discovery of a new species? It’s not out of the question: success is 90% perspiration after all, and millions of gamers peering around corners and under bushes across the world can create a very sweaty exercise indeed. By definition, each Pokémon hunter almost certainly holds a high-definition camera in their hands. And there is a precedent: earlier this year, scientists reported Arulenus miae, a new species of pygmy devil grasshopper, identified in the Philippines after a researcher saw an unfamiliar insect in a photo on Facebook (J. Skejo and J. H. S. Caballero Zootaxa 4067, 383–393; 2016).

But Pokémon Go players beware. It is one thing to conquer a world of imaginary magical creatures with names like Eevee and Pidgey, and quite another to tangle with the historical complexity of the International Code of Zoological Nomenclature. So, say you do manage to snap a picture of something previously unknown to science — what then? Let Nature be your guide. First, the good news — the Code (we’ll call it that from now on to save on Twitter characters) is now officially with the times, and no longer reliant on the dead trees that were so popular before you were born. Despite grumbles from traditionalists, in 2012 the International Commission on Zoological Nomenclature, which hosts the Code, agreed to embrace online-only media. In doing so, it relaxed its rule that species could be officially named only in printed academic journals. Now, the bad news — if your picture of an unusual butterfly or bird or hippopotamus does look to a friendly online biologist like a new species, then you’ll probably have to go back and catch the beast. (Whisper it, but you might even have to kill it.)

More here.

Wednesday Poem

The Problem of Describing Color
.

If I said – remembering in summer,
The cardinal’s sudden smudge of red
In the bare gray winter woods –

If I said, red ribbon on the cocked straw hat
Of the girl with pooched-out lips
Dangling a wiry lapdog
In the painting by Renoir –

If I said fire, if I said blood welling from a cut –

Or flecks of poppy in the tar-grass scented summer air
On a wind-struck hillside outside Fano –

If I said, her one red earring tugging at her silky lobe,

If she tells fortunes with a deck of fallen leaves
Until it comes out right –

Rouged nipple, mouth –

(How could you not love a woman
Who cheats at the Tarot?)

Red, I said. Sudden, red.

by Robert Hass
from Time and Materials. Poems 1997-2005
Ecco (HarperCollins Publishers), New York, 2007

.

HOW PATRICK RYAN MADE IT TO OUTER SPACE AND THE FIFTEEN UNPUBLISHED NOVELS HE WROTE ALONG THE WAY

Sophia Efthimiatou in Literary Hub:

He was at a meditation retreat in the Catskills, sitting cross-legged on a big flat rock on the side of a lake, eyes closed, pulse steady, surrounded by chipmunks and beavers and deer and newts, when Patrick Ryan decided he would never again try to write a book.

He had completed seven unpublished novels by then, attempted eight or nine more unfinished ones, all of them shoved away into manuscript boxes that took up as much space in his apartment as a child’s coffin. As he was nearing 40, there was nothing impressive about this activity of his—writing, that is—but a sad compulsion that bordered on the absurd.

It wasn’t that he had experienced no success at all. In two decades of writing nearly every day, he had seen some of his stories published in literary journals, and he had a few close calls with publishers and agents who found value in his work but ultimately could not sell it. One of them, an editor from Simon & Schuster, had even invited him to lunch after reading one of his novel submissions. Ryan walked into that restaurant on a snowy January day thinking that his time had finally come, only to be schooled on the concept of track records.

“I really love it,” the editor told him of his unsolicited manuscript. “It’s depressing but funny, and I think the writing is wonderful. But if I publish it, it’ll be the end of your career.”

More here.

Space Emerging from Quantum Mechanics

Sean Carroll in Preposterous Universe:

The other day I was amused to find a quote from Einstein, in 1936, about how hard it would be to quantize gravity: “like an attempt to breathe in empty space.” Eight decades later, I think we can still agree that it’s hard.

So here is a possibility worth considering: rather than quantizing gravity, maybe we should try to gravitize quantum mechanics. Or, more accurately but less evocatively, “find gravity inside quantum mechanics.” Rather than starting with some essentially classical view of gravity and “quantizing” it, we might imagine starting with a quantum view of reality from the start, and find the ordinary three-dimensional space in which we live somehow emerging from quantum information. That’s the project that ChunJun (Charles) Cao, Spyridon (Spiros) Michalakis, and I take a few tentative steps toward in a new paper.

We human beings, even those who have been studying quantum mechanics for a long time, still think in terms of classical concepts. Positions, momenta, particles, fields, space itself. Quantum mechanics tells a different story. The quantum state of the universe is not a collection of things distributed through space, but something called a wave function. The wave function gives us a way of calculating the outcomes of measurements: whenever we measure an observable quantity like the position or momentum or spin of a particle, the wave function has a value for every possible outcome, and the probability of obtaining that outcome is given by the wave function squared. Indeed, that’s typically how we construct wave functions in practice. Start with some classical-sounding notion like “the position of a particle” or “the amplitude of a field,” and to each possible value we attach a complex number. That complex number, squared, gives us the probability of observing the system with that observed value.
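The rule Carroll describes informally here is the Born rule. In symbols (a standard textbook gloss, not notation taken from the post itself): for a system in state $|\psi\rangle$, measuring an observable with eigenstates $|a_i\rangle$ yields the outcome $a_i$ with probability

```latex
P(a_i) = \bigl| \langle a_i \mid \psi \rangle \bigr|^{2},
\qquad \text{e.g. for position: } P(x) = |\psi(x)|^{2}.
```

Since the amplitudes are complex numbers, “squared” here means the squared modulus of the amplitude, which is what guarantees the probabilities are real and non-negative.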

More here.

Scientists Discover What Economists Haven’t Found: Humans

David Sloan Wilson and Joseph Henrich in Evonomics:

Paleontologists tell us that numerous Homo species once roamed the earth, although only Homo sapiens remains. Several Homo species still inhabit the economic world, however — the world as described by traditional economics. The most common is Homo economicus, whose preferences and abilities were described by neoclassical economists a long time ago. More recently, behavioral economists described a new species called Homo anomalous, because it departs from H. economicus in so many ways. Now a brand new species has been discovered by a multi-disciplinary team of scientists. I’ll call it Homo bioculturus, and it might well become the one that inherits the world of economics.

Joseph Henrich is one member of the team that discovered H. bioculturus and his new book, The Secret of Our Success: How Culture Is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter, is arguably the best way for the economics profession to learn about it. Henrich is an intellectual Indiana Jones, equally at home slashing through the jungle or conducting lab experiments. He spearheaded the famous “15 Societies Study” that played experimental economics games in traditional societies around the world. He recently moved from the University of British Columbia, where he was jointly appointed in the Departments of Psychology and the Vancouver School of Economics, to Harvard University’s Department of Human Evolutionary Biology.

DSW: Greetings, Joe, and welcome to Evonomics.com.

JH: Hello David! It’s great to be with you.

DSW: First, let me congratulate you on writing such a terrific book. Without attempting to flatter you, it is a tour de force—great fun to read in addition to brimming with ideas—my current favorite book for recommending to others. Second, let me ask you to provide a synopsis for an economically oriented audience.

More here.

the moral questions around gene editing

Brendan P. Foht at The New Atlantis:

The fears and the hopes of genetically engineering the human race have been haunting the modern mind for the better part of a century, although only in the last decade have techniques been developed that might give us the power to modify the genomes of human beings at the embryonic stage. Foremost among these has been the CRISPR-Cas9 system — a set of bacterial enzymes first identified in the late 1980s, and during just the last few years harnessed as a gene-editing tool. What sets CRISPR apart from earlier genetic modification techniques is its accuracy and versatility: the enzymes that cut the targeted DNA are guided by short sequences of RNA that can be custom-designed for any site in the genome. Earlier genetic engineering methods required different enzymes to target different locations in the genome, but by using RNA instead, CRISPR makes that targeting process much easier. (Although there are some differences in what the terms “genetic engineering,” “genetic modification,” and “gene editing” mean, they are for the most part interchangeable.)

This new gene-editing tool has rapidly become ubiquitous in molecular biology, with many applications beyond gene therapy. For instance, scientists have used CRISPR to remove retroviral sequences from the genomes of pig embryos in the hope of producing pigs with organs that can be transplanted more safely into humans. Because CRISPR is relatively easy to use, some journalists have even speculated that the technique might lead to the democratization of genetic engineering, with “home hobbyists” using it for who-knows-what. This claim seems overblown. While it is true that CRISPR makes the specific task of editing DNA much easier, there are other technically complicated steps and procedures involved in most forms of genetic engineering.

more here.

‘SOPHIA’ BY MICHAEL BIBLE

Barrett Hathcock at The Quarterly Conversation:

At this late date Southern literature is only slightly coherent as a marketing label, but if understood as a series of related, competing traditions, it’s vastly more interesting. And like the modern Republican Party, the umbrella of “Southern literature” shades a handful of not always easily alignable interests that often regard each other with suspicion.

For instance, there is the khaki-pants-and-seersucker strain, what you might call the Walker Percy tradition. There is the Faulkner strain, a fabricated country gentleman intent on creating a mythos of the South. (This fabricated gentleman also happens to be Ahab, unfortunately.) And then there is the evangelical degenerate strain, a ménage à trois of booze, Bible thumping, and Foghorn Leghorn. The patron saint of this strain is probably Barry Hannah, most famous for his early story collection Airships and one of the most legitimately weird postwar American writers, at least at the sentence level. Hannah taught writing at the University of Alabama and then at Ole Miss for a sizable chunk of his career, and he was famous for various apocryphal stories related to his drunken outlandishness, both in and out of the classroom. Hannah was himself a character, a presence of folk heroic proportions. (A beloved folk hero, it should be said.) Stories about Hannah were almost as important as stories by him. In his vapor trails he outlined the parameters of his own tradition, and we continue to read his disciples: the Hannahs. (Jim Harrison, who recently died, was, if not a direct disciple, a type of Hannah—a Hannah from Montana, if you will.)

Michael Bible is definitely one of the Hannahs, and this would be the case even if his latest book Sophia didn’t boast a promotional quote from Hannah himself, who’s been dead since 2010.

more here.