Jason A. Josephson-Storm and others at The New Atlantis:
To catch up those who are unfamiliar with my book, The Myth of Disenchantment is rooted in the following observation: Many theorists have argued that what makes the modern world “modern” is that people no longer believe in spirits, myth, or magic — in this sense we are “disenchanted.” However, every day new proof arises that “modern” thinkers do in fact believe in magic and in spirits, and they have done so throughout history. According to a range of anthropological and sociological evidence, which I discuss in the book, the majority of people living in Europe and North America believe (to varying degrees) in the following: spirits, witches, psychical powers, magic, astrology, and demons. Scholars have known this was true of much of the rest of the globe, but have overlooked its continued presence in the West.
So my book set out to answer the question: Where did this notion of de-spiritualized modernity come from? In other words, how did this mistaken belief set in? To explain, I traced the history of the idea that modernity means disenchantment in the birth of various intellectual disciplines, namely: philosophy, anthropology, sociology, folklore, psychoanalysis, and religious studies. In so doing, I discovered that the majority of theorists who gave the idea of disenchantment its canonical formulations were living in Britain, France, or Germany in a period in which spiritualism (séances and table turning), theosophy, and magical societies like the Golden Dawn were taking place as massive cross-cultural movements and, as I show from archival research into these theorists’ diaries, letters, and so on, these occult movements entered directly into the lives and beliefs of the very theorists of disenchantment themselves.
Whitney Curry Wimbish at The Baffler:
None of this information is secret. But neither museums nor their trustees spell it out, so it’s hidden just enough that our collective delusions about museums can persist. The cover is only blown in extreme cases—tear-gassed kids—and it throws into the ugliest light one of the few public places of respite from our punishing society. It’s a particularly stark reminder that no organization is purely good when money is the major organizing principle. The art and search for meaning that constitute the best expression of humanity will always be diluted here. In this case it’s cut by the worst expression of humanity, war.
It’s also a stark reminder that people with blood on their hands will always have a chance to rehabilitate their image. In this case, museums use them to keep their lights on: by appointing big donors to the board, sometimes requiring they donate a minimum amount, and then assigning them such duties as fundraising, “educating policymakers,” and “thinking strategically,” according to the American Alliance of Museums’ most recent report on museum boards. And in exchange for this, donors can represent to the rest of us that they are our benefactors, regardless of what else they’re up to. Now they’re “philanthropists.”
Seamus Perry at the LRB:
Quite a few of Shelley’s contemporaries came to the view that he wasn’t all there – the inhabitants of Marlow, for example, who were treated to the recurrent spectacle of a disgraceful young radical poet returning distractedly to his cottage after long scrambles in the woods. ‘He was the most interesting figure I ever saw,’ a child witness recalled later in life, still much struck. ‘His steps were often hurried, and sometimes he was rather fantastically arrayed … on his head would be a wreath of what in Marlow we call “old man’s beard” and wild flowers intermixed; at these times he seemed quite absorbed, and he dashed along regardless of all he met or passed.’ Not all the neighbours thought so admiringly of him, needless to say; and his poetry too would find detractors as well as admirers, dividing opinion over the next two hundred years with comical extremity. William Hazlitt, although notionally on the same side in the big political questions of the day, was pugnaciously uncharmed by the cast of mind that he discerned in Shelley’s dashing about, and anticipated a whole school of criticism: ‘There is no caput mortuum of worn-out thread-bare experience to serve as a ballast to his mind; it is all volatile intellectual salt of tartar, that refuses to combine its evanescent, inflammable essence with any thing solid or any thing lasting.’ But others found this odd sense of irreality more winning, or at least arresting. Getting to know him for the first time, his future sister-in-law reported delightedly that Shelley behaved ‘just as if he were Adam in Paradise before his fall’, and she was not alone in finding in him an innocence of the world that lay about him.
we children of god and truth
we harbingers of sexual salvation
we brave enough to love ourselves
when senegal is trying to kill us
we with warm worn wild tongues
we with long wet seeking fingers
we with broken open hearts
when gambia warns he’ll chop off our heads
we with hungry deep urgent kisses
we in stilettos pumping stonewall fists
we with genders bent to meet our souls
when falls city nebraska rapes us
we with groins that want to talk dirty
we with mouths that want to come clean
we with legs that want to outrun
when laramie wyoming crushes our skulls
we who ache for your acceptance
we are not waiting for your laws
we were on our way to the accountant
when Brooklyn tore our limbs in trashbags
we with wigs and bikes and pride
we with leather and lust and poems
we with sass and guts and home girls
when Newark stabbed us in our chests
we are loving all over the world
we are hated all over the world
we are buried all over the world
we are grieving all over the world
we are praying all over the world
we are dancing all over the world
we are laughing all over the world
we are living all over the world
we alive all over the world
by Lenelle Moïse
excerpt from We Alive (video)
Claudia Wallis in Scientific American:
Consider almost everything you know about heart disease, particularly the garden-variety type involving high cholesterol levels, clogged coronary arteries, stents and bypass surgeries. Now I want you to rebrand all that as “male-pattern” cardiovascular disease. That’s how some researchers are reframing it after taking a closer look at heart disease in women. For years cardiologists were baffled as to why up to half of women with classic symptoms of blocked vessels—chest pain, shortness of breath and an abnormal cardiac stress test—turn out to have open arteries. Doctors called it “cardiac syndrome X.” They didn’t understand it, and many women were subjected to repeated angiograms in search of blockages that weren’t there. That still happens today, but more doctors now recognize that despite having open arteries, about half of women with this pattern nonetheless have ischemia—poor blood flow through the heart. The condition has gained a mouthful of a name: ischemia and no obstructive coronary artery disease, or INOCA.
Cardiologist C. Noel Bairey Merz has spent more than 20 years overseeing the Women’s Ischemia Syndrome Evaluation (WISE) study, aimed at demystifying INOCA and related conditions. Although male-pattern disease is the most prevalent type in both sexes, “INOCA probably comprises 25 to 30 percent of ischemic heart disease in women and 10 percent in men,” says Bairey Merz, director of the Barbra Streisand Women’s Heart Center at Cedars-Sinai’s Smidt Heart Institute. WISE data show that after diagnosis, women with the disorder face a 2.5 percent annual risk of dying, suffering a nonfatal heart attack or stroke, or being hospitalized for heart failure. They are also four times more likely than men to be readmitted to a hospital within 180 days of being treated for a heart attack or severe chest pain.
The initial mystery of INOCA was how the heart could be starving for blood if its main arteries are not blocked.
Steve Ayan in Scientific American:
In 1909 five men converged on Clark University in Massachusetts to conquer the New World with an idea. At the head of this little troupe was psychoanalyst Sigmund Freud. Ten years earlier Freud had introduced a new treatment for what was called “hysteria” in his book The Interpretation of Dreams. This work also introduced a scandalous view of the human psyche: underneath the surface of consciousness roils a largely inaccessible cauldron of deeply rooted drives, especially of sexual energy (the libido). These drives, held in check by socially inculcated morality, vent themselves in slips of the tongue, dreams and neuroses. The slips in turn provide evidence of the unconscious mind. At the invitation of psychologist G. Stanley Hall, Freud delivered five lectures at Clark. In the audience was philosopher William James, who had traveled from Harvard University to meet Freud. It is said that, as James departed, he told Freud, “The future of psychology belongs to your work.” And he was right.
The view that human beings are driven by dark emotional forces over which they have little or no control remains widespread. In this conception, the urgings of the conscious mind constantly battle the secret desires of the unconscious. Just how rooted the idea of a dark unconscious has become in popular culture can be seen in the 2015 Pixar film Inside Out. Here the unconscious mind of a girl named Riley is filled with troublemakers and fears and housed in a closed space. People like to think of the unconscious as a place where we can shove uncomfortable thoughts and impulses because we want to believe that conscious thought directs our actions; if it did not, we would seemingly have no control over our lives.
This image could hardly be less accurate, however. Recent research indicates that conscious and unconscious processes do not usually operate in opposition. They are not competitors wrestling for hegemony over our psyche. They are not even separate spheres, as Freud’s later classification into the ego, id and superego would suggest. Rather there is only one mind in which conscious and unconscious strands are interwoven. In fact, even our most reasonable thoughts and actions mainly result from automatic, unconscious processes.
David Runciman in the Boston Review:
Whether you are an optimist or a pessimist is not just a question of personal temperament. It is also, increasingly, a question of politics. The divide between the optimists and the pessimists is as acute as any in contemporary politics and like many others—the generational divide between old and young, the educational divide between people who did and didn’t go to college—it cuts across left and right. There are left pessimists and right pessimists; left optimists and right optimists. What there isn’t is much common ground between them. Competing views about whether the world is getting better or worse have become another dialogue of the deaf.
Steven Pinker’s new book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress—along with its critical reception—illustrates how dug in the two sides are. Pinker argues that most people have lost sight of the incredible benefits that liberal democratic values continue to deliver because too many of us have a bias in favour of bad news. He blames the things he doesn’t like—including Donald Trump’s presidency—on this innate and deeply misguided pessimism about the possibility of progress. “The most consistent predictor of Trump support,” he writes, “was pessimism.” He accuses these pessimists of fatalism, because they assume that any good news they hear is essentially fake news. They discount progress because of their deep faith in the inexorable pull of the worst that modern societies have to offer. He thinks that the pessimists have effectively given up on the capacity of human beings to make a better future.
Ethan Siegel in Forbes:
Black holes are some of the strangest, most wondrous objects in all the Universe. With huge amounts of mass concentrated into an extremely small volume, they inevitably collapse down to singularities, surrounded by event horizons from which nothing can escape. These are the densest objects in the entire Universe. Whenever anything comes too close to one, the forces from the black hole will tear it apart; when any matter, antimatter, or radiation crosses the event horizon, it simply falls down to the central singularity, growing the black hole and adding to its mass.
These properties about black holes are all true. But there’s an associated idea that’s absolute fiction: black holes suck surrounding matter into them. This couldn’t be further from the truth, and completely misrepresents how gravity works. The biggest myth about black holes is that they suck. Here’s the scientific truth.
Amitav Ghosh in Scroll:
Nowhere else in the world did the year 1984 fulfill its apocalyptic portents as it did in India. Separatist violence in the Punjab; the military attack on the great Sikh temple of Amritsar; the assassination of the Prime Minister, Mrs Indira Gandhi; riots in several cities; the gas disaster in Bhopal – the events followed relentlessly on each other. There were days in 1984 when it took courage to open the New Delhi papers in the morning.
Of the year’s many catastrophes, the sectarian violence following Mrs Gandhi’s death had the greatest effect on my life. Looking back, I see that the experiences of that period were profoundly important to my development as a writer; so much so that I have never attempted to write about them until now.
K Austin Collins in Slate:
Dana, you’re absolutely right to wonder what happened to The Tale; it’s exactly the kind of movie I have in mind, underseen despite tackling one of the most urgent subjects of our moment. I may have seen better movies overall this year, but I don’t think any of them had a conceit that shook me as deeply as this one. It all comes down to a simple, harrowing thing that Jennifer Fox does to depict how the heroine of the film, also named Jennifer Fox (and played by Laura Dern), remembers a sexual relationship she had with her tennis coach as a teenager. She initially remembers herself as confident and sexually self-possessed; she remembers herself, in other words, as a young woman somewhat in control of what happened to her, and, as the movie reveals, this has led her to remember what happened as a more consensual affair, rather than as abuse.
But then comes the moment that we see Fox correct her memory, replaying the same flashbacks using a teen actress closer in demeanor and body type to the scared girl Fox actually was, rather than the woman she supposed she was. It’s shattering—an exceptional feat of writing, and of course, Dern kills the performance. It’s not an easy scene to watch, but The Tale is an easy film to see and has been for most of the year—it’s right there on HBO! But short of the ability of Oscar attention to revive discussion about that film, the full-fledged online discussion cycle for it has come and gone. It’s bewildering.
Steve Ayan in Scientific American:
Peter Carruthers, Distinguished University Professor of Philosophy at the University of Maryland, College Park, is an expert on the philosophy of mind who draws heavily on empirical psychology and cognitive neuroscience. He outlined many of his ideas on conscious thinking in his 2015 book The Centered Mind: What the Science of Working Memory Shows Us about the Nature of Human Thought. More recently, in 2017, he published a paper with the astonishing title of “The Illusion of Conscious Thought.” In the following excerpted conversation, Carruthers explains to editor Steve Ayan the reasons for his provocative proposal.
What makes you think conscious thought is an illusion?
I believe that the whole idea of conscious thought is an error. I came to this conclusion by following out the implications of two of the main theories of consciousness. The first is what is called the Global Workspace Theory, which is associated with neuroscientists Stanislas Dehaene and Bernard Baars. Their theory states that to be considered conscious a mental state must be among the contents of working memory (the “user interface” of our minds) and thereby be available to other mental functions, such as decision-making and verbalization. Accordingly, conscious states are those that are “globally broadcast,” so to speak. The alternative view, proposed by Michael Graziano, David Rosenthal and others, holds that conscious mental states are simply those that you know of, that you are directly aware of in a way that doesn’t require you to interpret yourself. You do not have to read your own mind to know of them. Now, whichever view you adopt, it turns out that thoughts such as decisions and judgments should not be considered to be conscious. They are not accessible in working memory, nor are we directly aware of them. We merely have what I call “the illusion of immediacy”—the false impression that we know our thoughts directly.
Frankie Thomas in The Paris Review:
“Are we all Joyceans here, then?” the young professor asked, poking his head into the classroom doorway.
We looked back at him uncertainly. Yes, we were all here for the Ulysses seminar that met at six thirty P.M. on Tuesdays and Thursdays. But to call us “Joyceans” seemed like a stretch. Today—Thursday, January 29, 2015—was only the first day. And besides, this was City College.
No article about City College is complete without the obligatory phrase “the Harvard of the proletariat,” which was supposedly both our school’s nickname and its reputation in the mid-twentieth century. By 2015, however, no one could deny that our beautiful Harlem campus was in decline. Governor Cuomo had recently slashed the budget for the entire CUNY system, with City College bearing the brunt of the cuts, and the disastrousness of this decision is difficult to convey without resorting to sodomitic imagery. That year, classrooms were so overcrowded that latecomers had to sit on the floor. One of my professors entered his office on the first day to find that his entire desk had been stolen. The humanities building still used old-fashioned blackboards, but the budget didn’t provide for chalk, so professors hoarded and traded it like prison cigarettes. Most bathroom stalls didn’t lock, and for several weeks, the entire campus collectively ran out of toilet paper—I’ll never forget the Great Toilet Paper Crisis of 2015 and the generosity it inspired in my fellow students, who shared their own toilet paper from home and never stooped to charging for it.
It was in this context that the English department decided to offer its first-ever Ulysses seminar, though they offered it as you might offer someone a home-cooked meal that you’re secretly pretty sure contains broken glass. “NB: This is a highly demanding course with a heavy reading load,” the course catalogue warned in bold italics, “more like a graduate seminar than a 400-level college class.” I don’t think it actually said “DON’T TAKE THIS CLASS,” but that was the obvious implication. I have since learned that our idealistic young professor was met with departmental resistance when he suggested a Ulysses seminar, and I now suspect that the department was half hoping no one would register for it at all. But the department hadn’t counted on the sheer belligerence of City College students. I took one look at that warning and immediately decided, thanks to the same knee-jerk rebelliousness that had led me to avoid college until the age of twenty-seven, that I had to take this class. I wasn’t the only one: there were thirty students in the Ulysses seminar. (This is what passes for a small discussion class at City College.)
Were we all Joyceans here, then? Taking our silence as a yes, our professor stepped into the crowded classroom.
Kanishk Tharoor at The Atlantic:
Media coverage of uncontacted tribes often delights in painting indigenous groups as people out of time, hunter-gatherers in the age of Seamless. In November, an American missionary was killed trying to reach North Sentinel Island in the Bay of Bengal, home to a remote tribe thought to number about 100 people. Grainy images shot from a helicopter in 2004 of naked islanders brandishing spears flooded the internet. But when they first appear in Piripkura, Pakyî and Tamandua offer a different kind of spectacle. What is striking about them is not their timelessness, but rather their very modern resolve to persist against the odds, to be free from the outside world.
That independence is likely to come further under threat from the incoming far-right president, Jair Bolsonaro, who has pledged not to reserve any more land for indigenous peoples. In previous years, Bolsonaro has said he would arm ranchers in their conflicts with native groups and has lamented that the Brazilian army was not as efficient as the American cavalry in exterminating indigenous tribes. When trying to put a more benign gloss on these statements, Bolsonaro has claimed that government protections unfairly exclude indigenous people from the benefits of 21st-century life.
Susan Sidlauskas at nonsite:
Nearly two decades ago, photo historian Douglas Nickel observed that traditional art history had not yet developed the tools for handling non-art photographs.1 In many ways, much progress has been made since 2001: vernacular photographs of every kind have become worthy objects of study for scholars and critics—not only within visual culture, but in history, sociology, anthropology, and literary and cultural studies. In fact, these days, as Nickel himself has pointed out, studies of vernacular photographs outpace the production of monographic books about “fine art photographers.”2
However, it remains difficult for art historians to build an interpretative model that takes full measure of the visual subtleties of non-art photographs—including their unexpected aspirations towards the aesthetic—without slighting the historical and social conditions in which they were produced.3 With that challenge in mind, I consider here those very aspirations, as they surface repeatedly in a group of “medical portraits”: casebook photographs of patients made between 1885 and 1916 at the Holloway Sanatorium in St. Ann’s Heath, Virginia Water, Surrey, England (fig. 1).