3 Quarks Daily Status: The site is functioning normally

We are still working on a few things: importing old posts from October 2017 through April 2018 into the archives; fixing images that do not display correctly in a few old posts; resolving problems with the daily emails not showing all posts; and a couple of other smaller issues.

We need some time to make sure the site has stabilized and is consistently running smoothly. After that, in the next month or two, we will survey reader preferences and then revisit various aspects of the overall design based on the results. And we promise to listen to you. Thanks for your patience.

May 25, 2018

Time to abandon grand ethical theories?

Julian Baggini in the Times Literary Supplement:

Ethics today is in a curious state. There is no shortage of people telling us that Western civilization is facing a moral crisis, that the old foundation of Christianity has been removed but nothing has been put in its place. Christian writers such as Alister McGrath and Nick Spencer have warned that we’re running on the moral capital of a religion we’ve long abandoned. It’s only a matter of time before, like Wile E. Coyote, we realize we’ve run off a moral cliff, impossibly suspended in mid-air only as long as we fail to realize there’s nothing under our feet.

One supposed sign of this malaise is that scepticism about morality has never been higher. University philosophy lecturers consistently report that their new undergraduates tend to arrive assuming that all thinking people are moral relativists who believe that what’s right for some is wrong for others and that’s all there is to be said for it. Psychology has fuelled this scepticism, with researchers like Joshua Greene arguing that most moral judgements come straight from the “hot” amygdala, not the “cool” prefrontal cortex. On this account, moral principles are post-facto rationalizations of emotional reactions.

Yet for such a sickly beast, ethics is energetically at work everywhere. You may doubt the sincerity of corporate social responsibility but the very fact that every reasonably sized company feels the need to demonstrate it says something about public expectations.

More here.

NASA’s EM-drive is a magnetic WTF-thruster

Chris Lee in Ars Technica:

It was bound to happen eventually. A group of researchers that may actually be competent and well-funded is investigating alternative thrust concepts. This includes our favorite, the EM-drive (a.k.a. the WTF-thruster), as well as something called a Mach-Effect thruster. The results, presented at Space Propulsion 2018, are pretty much as expected: a big fat meh.

The key motivation behind all of this is that rocket technology largely sucks for getting people around the Solar System. And it sucks even worse as soon as you consider the problem of interstellar travel. The result is that good people spend a lot of time eliminating even the most far-fetched ideas. The EM-drive is a case in point. It’s basically a truncated hollow copper cone that you feed electromagnetic radiation into. The radiation bounces around in the cone. And, by some physics-defying magic, unicorns materialize to push you through space.

Well, that explanation is at least as plausible as any of the others. There is no physics explaining how this could work, but some people at NASA have claimed that it does.

More here. [Thanks to Sean Carroll.]

My Preferred Pronouns?

Justin E. H. Smith in his blog:

I was at an academic conference last week, somewhere in America, where we were invited by our hosts to place a ‘preferred pronoun’ sticker on our nametags. “If you could pick one of those up during the next break, we’d appreciate it.” The options were, ‘He’, ‘She’, ‘They’, ‘Ask Me’, and one with a blank space for a write-in. Coming from my adoptive France, I had heard of this new practice in my country of origin, but somehow I had convinced myself that it was mostly mythical. Yet there were the stickers, and there were all my fellow participants, wearing them with straight faces.

I did not pick one up. As is my practice at these events, I do not even wear the nametag that has been provided for me, so there would have been nothing to put the sticker on. But if there had been any direct and explicit pressure on me to wear one, rather than just a general announcement, I would have been constrained to explicitly refuse to do what was being asked of me. I would have been a conscientious objector.

In the future I will avoid meetings at which I know in advance, or I have a reasonable expectation, that there will be such stickers. I am strongly opposed to this convention, I think it is ridiculous and offensive, and I am only thankful that, for now, it is only a convention and not a compulsion. But the line is not so clear. It is not a compulsion for me to wear a sticker, because I am privileged and basically indifferent as to whether I ever get invited to an academic event again. The quality of my life is enhanced by not going to academic events, and reduced by going to them. If I can’t go because social pressure would require me to wear a sticker, well, tant mieux. But this is not the case for younger scholars who are precariously employed. It is in part for their sake that I feel the need to make explicit my opposition to this practice.

More here.

Break the Art Fair

Jerry Saltz at New York Magazine:

As a system, art fairs are like America: They’re broken and no one knows how to fix them. Like America, they also benefit those at the very top more than anyone else, and this gap is only growing. Like America, the art world is preoccupied by spectacle — which means nonstop art fairs, biennials, and other blowouts. Yet the place where new art comes from, where it is seen for free and where almost all the risk and innovation takes place — medium and smaller galleries — are ever pressured by rising art fair costs, shrinking attendance and business at the gallery itself, rents, and overhead. This art-fair industrial complex makes it next to impossible for any medium/small gallery to take a chance on bringing unknown or lower-priced artists to art fairs without risking major financial losses. Meanwhile high-end galleries clean up without showing much, if anything, that’s risky or innovative.

More here.

‘Come Sunday’ Considers The Price of Self-Doubt

Kelly J. Baker at The Baffler:

COME SUNDAY, A FILM RELEASED last month by Netflix and a production of NPR’s This American Life, claims in its short description to concern a “crisis of faith.” It’s based on the true story of Bishop Carlton Pearson, a black Pentecostal minister, and his radical shift in theology from fear, damnation, and a fallen world to forgiveness, inclusion, and hope. It’s an intriguing premise for a film, especially in this moment where pundits consider over and over why white evangelicals continue to support President Trump despite his moral bankruptcy. Politics, it would seem, matter more than faith. Toeing the party line becomes a virtue, and questioning one’s political allegiances and theology seems almost unimaginable. And yet, this intense, intimate, and quiet film—starring powerhouse actors like Chiwetel Ejiofor, Danny Glover, and Martin Sheen—centers on a moral crisis and catalogs the angst of uncertainty for a man that always appeared certain.

More here.

Jordan Peterson’s failed antidote for ‘toxic masculine despair’

Kate Manne at the TLS:

Each of the ensuing chapters of 12 Rules is a series of meditations – or, less kindly, digressions – leading up to its titular rule, presented as the solution to a problem revealed therein about life and how to make order out of chaos. The chaos is in turn presented as a universal, ahistorical fact about the nature of Being or human existence. Given all this, it is striking how many of the discussions reduce to advice about how to win at something, anything, nothing in particular: and how not to be a “loser”, in relation to others whose similarity to oneself is secured by the time-honoured narrative device of anthropomorphization, under a more or less thin veneer of scientism. Rule One is “Stand up straight with your shoulders back”, to avoid seeming like a “loser lobster”, who shrinks from conflict and grows sad, sickly and loveless – and is prone to keep on losing, which is portrayed as a disaster.

More here.

That Dinosaur-Killing Asteroid? It Triggered Global Warming, Too

Robinson Meyer in The Atlantic:


It took, at most, several seconds. An enormous hunk of rock, roughly the size of Manhattan, came whirling out of the vastness of space. It pierced Earth’s thin atmosphere, ignited as it fell, and slammed into the crust, opening a crater 20 miles deep in modern-day Mexico’s Yucatan Peninsula. Of course, it killed the non-avian dinosaurs: How could it not? By its end, the cataclysm wiped out 75 percent of all species that dwelled on Earth. In the last quarter century, we have gotten used to seeing images of that catastrophe: of the hellfire that rained down to Earth, igniting massive forest fires; of the years-long “impact winter” that dimmed the sun and chilled the Earth. But less well-known is what followed that winter. Scientists believe that the asteroid, which struck Earth roughly 66 million years ago, eventually triggered a lengthy period of ferocious global warming. Upon impact, it vaporized solid limestone into gas, and it incinerated enormous swaths of forest. This unleashed so much heat-trapping carbon dioxide into the atmosphere so quickly that, across all of Earth’s history, its rate of increase seems to be rivaled only by recent carbon pollution from factories, cars, planes, and modern industry.

A study published Thursday in Science finds new evidence of that warming while setting it in a dreadful context. It may have taken seconds for the asteroid to chew a 20-mile-deep hole in Earth. But, its authors say, it took roughly 100,000 years for Earth’s climate to return to normal. The research argues that Earth’s average temperature was elevated by 5 degrees Celsius for the 100 millennia that followed the impact. Notably, it supports this assertion not just with computer models, but with direct, observed evidence from the time period. By analyzing the bones of fish that lived in modern-day Tunisia before, during, and after the impact, scientists were able to detect a planet-sweltering warming signal.

More here.

The two faces of Philip Roth

Mark Lawson in New Statesman:

The brutal truth of literary careers is that the reputation of most great writers would not have been affected – and might even have been improved – by earlier death. Philip Roth, though, is a very rare example of a front-rank author whose later work is also the greater work. If the obituaries of Roth had appeared in the mid-Eighties of the last century rather than his own mid-80s this week, then he would likely have been remembered as a writer whose best efforts had been, in two senses, devoted to self-exploration. That early oeuvre might easily have been dismissed as penis-waving – the masturbatory comic classic, Portnoy’s Complaint (1969) – giving way to navel-gazing, in the quartet of stories – from The Ghost Writer (1979) to The Prague Orgy (1985) – that playfully dramatised, via a fictional Jewish American novelist called Nathan Zuckerman, the deranging fame and accusations of anti-semitism that resulted from the novel about the furiously self-abusing young Jew, Alexander Portnoy, or, as he became in Zuckerman’s surrogate version, Carnovsky.

But, in 1986, Roth began perhaps the most remarkable and redefining second and third acts of any writing life. The Counterlife, the fifth Zuckerman book, remained playful (characters can joltingly be alive, dead, then living again) but moved beyond the narrow focus of literary celebrity to tackle the Jewish experience in England, the Middle East and history. Two skilful memoirs – The Facts (1988) and Patrimony (1991) – also usefully established the extent to which the Zuckerman sequence, assumed to be autobiographical, was invented.

More here.

Friday Poem

Sedition — a letter to the writer from Meri Mangakāhia

Here’s what I had in mind, kōtiro, this
clipping at words like overgrown maikuku —
return the blankets of domestic life; don’t fold
washing or wear shoes, polish these rerenga kē.

Eh. But this world.
I s’pose neither of us planned to be in politics,
never did do what others told us to —
wahanui though, go on, get

your sedition on girl,
your agitator, your defiant speak
to each other eye to eye —
Māori been jailed for nouns, phrases;

butcher up a clause, get buried
in Pākehā kupu, then dig that
out like the old people. No one approved
of their language either.

by Anahera Gildea
Source: Poetry (February 2018)

May 24, 2018

The Fundamental Nihilism of Yanny vs. Laurel

Adam Rogers in Wired:

Some people heard the word “laurel” in a short audio clip that became internet-famous this week, while others heard the not-word “yanny.” This proves that we will all die alone.

Thanks to some sleuthing by my colleague Louise Matsakis, people interested in following up can learn that regardless of what they heard in the clip, the person speaking was, in fact, saying the word “laurel.” But the question of what was recorded, for real, diverges in important ways from the question of what people heard, also for real.

Let me put this another way: Three years back, a picture of a dress—The Dress—became internet famous, because some people saw it as blue and others saw it as white. The question of why sent scientists who study color vision into a flurry of activity that’s still going on today. They want to better understand how the brain factors in illumination when calculating the color of an object, and how people see yellowish colors differently than bluish ones. Again, none of that work asks what color The Dress actually was, for real. (Blue.)

There is a world that exists—an uncountable number of differently-flavored quarks bouncing up against each other. There is a world that we perceive—a hallucination generated by about a pound and a half of electrified meat encased by our skulls. Connecting the two, or conveying accurately our own personal hallucination to someone else, is the central problem of being human. Everyone’s brain makes a little world out of sensory input, and everyone’s world is just a little bit different.

More here.

Benedict Cumberbatch Meets Albert Einstein in Carlo Rovelli’s New Audiobook

Alan Lightman in the New York Times:

There’s a passage in Carlo Rovelli’s lovely new book, “The Order of Time” — a letter from Einstein to the family of his recently deceased friend Michele Besso: “Now he has departed from this strange world a little ahead of me. That means nothing… The distinction between past, present and future is only a stubbornly persistent illusion.” Rovelli comments that Einstein was taking great poetic license with the temporal findings of his relativity theory, even to the point of error. But then the author goes on to say that the great physicist was addressing his letter not to scientists or philosophers, but to a bereft family. “It’s a letter written to console a grieving sister,” he writes. “A gentle letter, alluding to the spiritual bond between Michele and Albert.” That sensitivity to the human condition is a constant presence in Rovelli’s book — a book that reviews all of the best scientific thinking about the perennial mystery of time, from relativity to quantum physics to the inexorable second law of thermodynamics. Meanwhile, he always returns to us frail human beings — we who struggle to understand not only the external world of atoms and galaxies but also the internal world of our hearts and our minds.

More here.

The Meaning and Legacy of Humanism: A Sharp Challenge from a Potential Ally: Yuval Noah Harari and A. P. Norman

From Secular Humanism:

In 2015, Israeli historian Yuval Noah Harari published Sapiens, a sweeping and widely acclaimed history of humankind. In it, he discusses a phenomenon he calls “humanism.” Humanism, as he defines it, is a family of “religions (that) worship humanity, or more correctly, homo sapiens.” This worship of humanity, he argues, has made modernity “an age of intense religious fervor, unparalleled missionary efforts, and the bloodiest wars of religion in history.” The crimes of genocidal Nazism, Stalinist communism, and environmental destruction, he argues, can all be traced to the central tenets of humanism. If Harari is right, humanists need to engage in some serious soul-searching.

In mid-2017, Tom Flynn, editor of this magazine, responded with an editorial arguing that Harari’s “extreme and factually untethered” critique effectively “smears” humanism (“Smearing Humanism,” FI, June/July 2017). A. P. (Andy) Norman, a humanist philosopher and frequent contributor to Free Inquiry, shared Flynn’s concerns but found that he was able to view Harari’s efforts in a more charitable light. For if you grant Harari his definition of humanism, much of the rest seems to follow. Moreover, the resulting story—a kind of revisionist take on modernity—does contain important insights. Nevertheless, Norman reached out to Harari with a plea to rethink and rescind his critique of humanism. Harari replied, and the ensuing exchange is presented here, with only minor edits. The issue, it turns out, has profound implications for the humanist movement and its historical legacy.

More here.

On the Music of Ulysses and Finnegans Wake

Gerri Kimber at the TLS:

By the time he came to write Finnegans Wake, Joyce had moved beyond trying to imitate musical forms, and described his novel not as a “blending of literature and music”, but rather as “pure music”. Writing to his daughter Lucia, Joyce explained, “Lord knows what my prose means. In a word, it is pleasing to the ear . . . . That is enough, it seems to me”, and in conversation, he declared, “judging from modern trends it seems that all the arts are tending towards the abstraction of music; and what I am writing at present is entirely governed by that purpose”. This offers perhaps the best way to approach his most complex work. Joyce emphasizes how, “if anyone doesn’t understand a passage, all he need do is read it aloud”, and such an approach certainly helps here: “and the rhymers’ world was with reason the richer for a wouldbe ballad, to the balledder of which the world of cumannity singing owes a tribute for having placed on the planet’s melomap his lay of the vilest bogeyer but most attractionable avatar the world has ever had to explain for”.

More here.
