New Caliphate, Old Caliphate

Conor Meleady in HistoryToday:

When the organisation known as Islamic State in Iraq and Syria (ISIS) announced at the end of June 2014 that it was seeking to restore the Islamic caliphate, with its leader, Abu Bakr al-Baghdadi, as caliph, it set off a wave of debate among both jihadists and western analysts. The debate concerned the legitimacy of al-Baghdadi’s claim and the likelihood of ISIS securing the support of the Islamic world for its project. Some analysts declared it to be the first time since Mustafa Kemal Atatürk’s abolition of the Ottoman caliphate in March 1924 that any group or individual had been bold enough to make such a claim. In fact, just days after Atatürk’s action, the Hashimite Sharif Husayn of Mecca, King of the Hijaz, proclaimed himself caliph, initiating a controversy similar to that which al-Baghdadi’s declaration provoked. It was a controversy in which the officials charged with formulating Britain’s postwar policy in the Near East were deeply implicated.

Husayn’s claim was a decade in the making. Since the late 19th century, Arab intellectuals in Syria and Egypt had sought to reform the Ottoman Empire through a top-down process of Arabisation, with the Sharif of Mecca touted as a possible caliph. In the context of deteriorating Ottoman-British relations, these ideas were encouraged by orientalists such as Wilfrid Blunt, author of the anti-Ottoman tract, The Future of Islam, in which he argued that the revival of the Arabs was a historical inevitability in which Britain must play its part.

More here.

Sunday Poem

I See You Dancing, Father
No sooner downstairs after the night’s rest

In the middle of the kitchen floor.

And as you danced
You whistled.
You made your own music
Always in tune with yourself.

Well, nearly always, anyway.
You’re buried now
In Lislaughtin Abbey
And whenever I think of you

I go back beyond the old man
Mind and body broken
To find the unbroken man.
It is the moment before the dance begins,

Your lips are enjoying themselves
Whistling an air.
Whatever happens or cannot happen
In the time I have to spare
I see you dancing, father.

by Brendan Kennelly
from A Time for Voices, Selected Poems 1960-1990
publisher: Bloodaxe, Newcastle upon Tyne, 1990

Formulating Science in Terms of Possible and Impossible Tasks


A Conversation with Chiara Marletto over at Edge.org:

I’ve been thinking about constructor theory a lot in the past few years. Constructor theory is this theory that David Deutsch proposed—a proposal for a new fundamental theory to formulate science in a completely different way from the prevailing conception of fundamental physics. It has the potential to change the way we formulate science because it’s a new mode of explanation.

When you think about physics, you usually describe things in terms of initial conditions and laws of motion; so what you say is, for example, where a comet goes given that it started in a certain place and time. In constructor theory, what you say is what transformations are possible, what are impossible, and why. The idea is that you can formulate the whole of fundamental physics this way; so, not only do you say where the comet goes, you say where it can go. This incorporates a lot more than what it is possible to incorporate now in fundamental physics.

David and I have been working on this together for the past three years, and we’ve been applying it to many different problems. So far, the two completed parts of our effort to try out constructor theory and see whether it can solve problems are: a fundamental theory of information within physics; and the constructor theory of life, which applies this new theory of information to a fundamental problem that's at the boundary between physics and biology, and has to do with how certain features of living things, such as the ability to self-reproduce very accurately, are compatible with the laws of physics as we know them.

In these two cases, you can see how switching to this new mode of explanation allows one to change the perspective and address the problems in a much more effective way. These are two examples where switching to this new mode of explanation makes all the difference.

Take information, for example. Information is something we use in everyday speech; we also use it a lot in physics. For instance, we assume that information has certain properties, e.g. that it can be copied from one physical system to another. So far, we have not had a fundamental theory telling us what the regularities in nature are that allow the existence of information in this sense. But whenever we talk about information we refer to those regularities. We assume, for example, that the laws of physics allow copy processes. In constructor theory you can express these regularities, and this is what our theory does. Our way of incorporating information in fundamental physics is to formulate the regularities in nature that allow the existence of information.

More here.

Blaming Parents, and Other Neoliberal Pastimes


Emmett Rensin and David Shor in the Baffler (Indeed/Photo by Nico Paix):

The State of California believes the following things to be true: first, that reading to children will make them smarter. Second, that parents ordinarily disinclined for reasons of time or temperament from this activity may be won over by means of thirty-second radio spots. These are strange beliefs, but they are not uncommon.

Too Small to Fail, a nonprofit nominally led by Hillary Clinton, believes the same. Among the organization’s many laudable efforts to improve early childhood health and education is a less laudable (but no less costly) attempt to use advertisements to convince poor parents to read to their children. This, the group claims, will bolster kids’ intelligence, and thereby their test scores, and thereby their futures. Chicago has a similar program. Its slogan: “Take time to be a dad.”

These efforts will fail. Not because PSAs and chipper radio spots won’t conjure quality reading time in the schedules of parents rushing from a 5 p.m. quitting time to the start of a 6 p.m. second shift (although they won’t, of course), but because reading to children, even young children, will not necessarily make them smarter.

It isn’t that reading to children doesn’t have its benefits. Improved socialization and greater empathy skills are among the upsides of childhood reading. If you are a parent with the luxury of time, reading to your kids will help produce better people. It just won’t produce smarter ones.

Chicago and Clinton and California didn’t invent these misconceptions. There is a wealth of data purportedly showing that reading to young children will increase their intelligence and test scores. The trouble is these studies do not actually demonstrate a link between the act of reading and an increase in childhood intelligence; rather, they demonstrate a link between the kinds of children whose parents read to them and the kinds of children (largely the same children) who wind up doing well on tests.

There’s another correlation that goes unmentioned: the parents who read to their children tend to be wealthier and smarter than the parents who don’t (PDF). And so the tail wags the dog; similarly, we notice how many athletes were encouraged to play sports as children, but fail to note how tall and strong their parents are and what nice sports equipment they’ve got locked in the garage.

If we restrict ourselves instead to studies that properly adjust for parental characteristics—that is, how smart, well-educated, and test-capable they were—the impact of reading to children disappears.

More here.

Don’t Write Off ET Quite Yet


Caleb Scharf in Nautilus (photo by Kim Steele/Getty):

Here’s a riddle. We’ve never seen any, and we don’t know if they exist, but we think about them, debate them, and shout at each other about them. What are they?

Aliens, of course.

A while ago I wrote a piece for Nautilus on what might happen to us after learning about the existence of extraterrestrial life—whether microbes on Mars or technological civilizations around other stars—and asked if there might be inherent, unexpected, dangers in acquiring this information. Could infectious alien memes run riot, disrupting societies? Might intelligent life decide to shield itself from such knowledge? It was a whimsical, quizzical thought experiment, exploring the real science of our hunt for life in the cosmos, and the possibility—even if remote—that there could be unexpected perils for intelligently curious life anywhere.

Simple enough. But as comments to the piece began to pile up—many in my inbox—I found myself on the receiving end of a barrage of opinion. There was outrage at the suggestion that there might ever be circumstances to drive us (or any intelligent species) to close our astronomical and scientific eyes to avoid picking up dangerous alien data. At the other extreme, and I do mean extreme, there was outrage that we were already being kept in the dark about aliens by our governments. And across the board was a world-weary sense of our seemingly boundless capacity to screw things up, big universe or not.

Phew.

The possibility of life somewhere else in the cosmos isn’t just scientifically fascinating, it’s a unique mental playground for our hopes, fears, and fantasies. It can also be, as I’ve learned, an inkblot test; a reflection of our inner thoughts, emotions, and—to be honest—hang-ups.

More here.

Male Nerds Think They’re Victims Because They Have No Clue What Female Nerds Go Through


Laurie Penny in The New Republic:

A few people have forwarded me MIT professor Scott Aaronson’s post about nerd trauma and male privilege (link here). It's part of a larger discussion about sexism in STEM subjects, and its essence is simple. Aaronson's position on feminism is supportive, but he can’t get entirely behind it because of his experiences growing up, which he details with painful honesty. He describes how mathematics was an escape, for him, from the misery of growing up in a culture of toxic masculinity and extreme isolation—a misery which drove him to depression, anxiety and suicidal thoughts. The key quote is this:

Much as I try to understand other people’s perspectives, the first reference to my 'male privilege'—my privilege!—is approximately where I get off the train, because it’s so alien to my actual lived experience … I suspect the thought that being a nerdy male might not make me 'privileged'—that it might even have put me into one of society’s least privileged classes—is completely alien to your way of seeing things. I spent my formative years—basically, from the age of 12 until my mid-20s—feeling not 'entitled', not 'privileged', but terrified.

I know them feels, Scott.

As a child and a teenager, I was shy, and nerdy, and had crippling anxiety. I was very clever and desperate for a boyfriend or, failing that, a fuck. I would have done anything for one of the boys I fancied to see me not as a sad little boffin freak but as a desirable creature, just for a second. I hated myself and had suicidal thoughts. I was extremely lonely, and felt ugly and unloveable. Eventually I developed severe anorexia and nearly died.

Like Aaronson, I was terrified of making my desires known—to anyone. I was not aware of any of my (substantial) privilege for one second—I was in hell, for goodness' sake, and 14 to boot. Unlike Aaronson, I was also female, so when I tried to pull myself out of that hell into a life of the mind, I found sexism standing in my way. I am still punished every day by men who believe that I do not deserve my work as a writer and scholar. Some escape it's turned out to be.

More here.

A Static Form of Remembrance


Daniel Fraser reviews Simon Critchley's Memory Theatre in The LA Review of Books:

THIS YEAR’S NOBEL PRIZE for Medicine was awarded to three scientists whose neuroscientific work provided conclusive evidence for the interwoven relationship between the concepts of memory and space in the human brain. John O’Keefe, May-Britt Moser, and Edvard I. Moser discovered cells referred to as “place” and “grid” cells, which together form a coordinate system used in the construction of mental maps and their memorization. These two cell types work together with neurons dubbed “time cells” that represent the flow of time in specific memories; together they reveal a structure of memory that is not only integrated with space but is in flux and repeatedly reconstituted.

Philosophical discourse is no stranger to the symbiotic relationship between memory and space. One of its most interesting conceptualizations of this relationship is the memory theatre: a physical space conceived in the mind in which knowledge might be stored in order for it to be recalled more easily. This spatial idea of memory has found expression throughout the history of philosophy, originating with the Greek Simonides: he supposedly could identify the remains of the guests of a party he attended after the roof collapsed and mangled them by remembering where each of them had been sitting.

From this fittingly macabre and humorous example comes Simon Critchley’s first novel, Memory Theatre: a postmodern, virulently metafictional blend of essay, autobiography, apocalyptic revelation and historical examination. The book centers on a university professor (a philosopher named Simon Critchley who shares an academic career and bibliography with his real-life counterpart) who receives a set of boxes, each labeled with a sign of the zodiac, containing the papers of a recently deceased colleague and friend (the French philosopher Michel Haar). In one of the boxes he discovers a set of memory maps that precisely chart the lives, publications, and deaths of a number of philosophical figures, including several who are still alive: one of them belonging to “Simon Critchley” himself.

More here.

The Fantastic Mr. Hobbes


Thomas Pfau in The Immanent Frame:

Some readers of Minding the Modern have been surprised to find my account so firmly critical of Thomas Hobbes on will and personhood. Now, it is both incidental and inevitable that my reading challenges recent attempts to claim Hobbes as a precursor of modern liberalism and individualism. Long before me, of course, a wide and diverse array of thinkers (Hannah Arendt, Alasdair MacIntyre, Charles Taylor, John Milbank, Louis Dupré, Michael Oakeshott) had probed the conceptual weakness of modern Liberalism, particularly its propensity to expire in an omnipresent state, putatively enlightened and benevolent as it orders and controls individual and social life at every level. If my reading of Hobbes casts doubt on some of modern Liberalism’s cherished axioms and aspirations, this only points to a certain lack of discernment among those who would identify Hobbes as a heroic precursor of an enlightened, secular, and liberal politics, of whose lasting benefits they remain unshakably persuaded. That said, political theory is not a principal concern of Minding the Modern, whereas putting analytic pressure on modern philosophy’s assumptions about human agency, rationality, and volition very much is.

It is presumably because Hobbes’s assumptions here have been assimilated by a fair number of twentieth-century political philosophers that some readers of Minding the Modern have homed in on this part of my narrative with such neuralgic intensity and exculpatory zeal. The dominant strategy here is to blunt my critical account of Hobbes on personhood with references to the supposedly unique situation and constraints within which he developed his theory of human agency and political community. Thus Mark Alznauer insists that “Hobbes’ theory of agency is an answer to problems that emerged in the seventeenth century, … [and] this is a new question.” Only by subscribing to a radically particularist, nominalist view of history can one suppose that a theory of agency can, let alone should, be tailored to its putatively unique historical circumstances. For my part, I very much doubt that human nature abruptly changed in the year 1651 any more than “on or about December 1910,” as Virginia Woolf so breezily proposed.

More here.

Cavellian Meditations: How to do Things with Film and Philosophy


Robert Sinnerbrink in Film Philosophy [via Bookforum's Omnivore]:

It is a curious feature of philosophical writing that authors rarely reflect on what motivates their concern with a chosen topic. The importance of a philosophical problem, argument, or discourse is assumed to be self-evident; or the kind of self-reflection that philosophers otherwise bring to their reflections is deemed unseemly when applied to one’s own commitment to philosophy. Among the many reasons why Stanley Cavell remains anomalous in contemporary philosophy is his acknowledgment of the biographical aspect, or more exaltedly, the existential commitments of his own writing. He tells the story, for example, of how his coming to philosophy was inspired by his experience of particular texts, both philosophical and non-philosophical, an experience that was as much about writing and reading as about reflection and understanding. It was not only the philosophical power and originality of Wittgenstein’s Philosophical Investigations that inspired Cavell’s desire to do philosophy but the fact that it was the first text he read that ‘staked its teaching on showing that we do not know, or make ourselves forget, what reading is’ (Cavell 2006, 28).

Cinema too was a spur to philosophy, Cavell naming three films that suggested to him new possibilities of philosophical thought and expression: Smiles of a Summer Night [Sommarnattens leende] (Ingmar Bergman, 1955), Hiroshima Mon Amour (Alain Resnais and Marguerite Duras, 1959), and L’Avventura (Michelangelo Antonioni, 1960). Anticipating Cavell’s abiding concerns in his writing on film, these three films, he remarks, are cinematic works that opened up the question of what constitutes ‘a medium of thought’; they altered ‘the iconography of intellectual conversation’ (Cavell 2006, 29), suggesting the possibility that film might be an apt and equal partner to philosophy, or that some kind of marriage between the two might be possible. Cavell’s autobiographical reflection is fascinating, not only for its challenge to conventional academic philosophical discourse but for its suggestion that film and philosophy are fundamentally, rather than accidentally, related in his thought.

More here.

A Foodie Repents

John Lanchester in the New Yorker:

The specifics of how my mother came to be interested in cooking are unusual. She’s the only person I know who learned to make beef Stroganoff as part of the decompression process after running a convent school in Madras. At the same time, though, her story is typical: people have come to use food to express and to define their sense of who they are. If you live and cook the same way your grandmother did, you’ll probably never open a cookbook. Cookbooks, and everything they symbolize, are for people who don’t live the way their grandparents did.

Once upon a time, food was about where you came from. Now, for many of us, it is about where we want to go—about who we want to be, how we choose to live. Food has always been expressive of identity, but today those identities are more flexible and fluid; they change over time, and respond to different pressures. Some aspects of this are ridiculous: the pickle craze, the bánh mì boom, the ramps revolution, compulsory kale. Is northern Thai still hot? Has offal gone away yet? Is Copenhagen over? The intersection of food and fashion is silly, just as the intersection of fashion and anything else is silly. Underlying it, however, is that sense of food as an expression of an identity that’s defined, in some crucial sense, by conscious choice. For most people throughout history, that wasn’t true. The apparent silliness and superficiality of food fashions and trends touches on something deep: our ability to choose who we want to be.

Read the rest here.

Charles D’Ambrosio’s moment

Philip Lopate at The New York Times:

The great promise of essays is the freedom they offer to explore, digress, acknowledge uncertainty; to evade dogmatism and embrace ambivalence and contradiction; to engage in intimate conversation with one’s readers and literary forebears; and to uncover some unexpected truth, preferably via a sparkling literary style. In the preface to “Loitering,” his new and collected essays, Charles D’Ambrosio presents himself as a true believer in the form. Having digested “all of Joan Didion and George Orwell, all of Susan Sontag and Samuel Johnson, all of Edward Abbey and Hunter Thompson and James Baldwin,” he saw essays as “fast friends”: “I must have needed that sort of close attachment, that guidance, the voice holding steady in the face of doubt, the flawed man revealing his flaws, the outspoken woman simply saying, the brother and the sister — for essays were never a father to me, nor a mother.”

D’Ambrosio has also published two fine collections of short stories, but it is his essays, appearing in literary magazines and previously in an obscure small-press edition, that have been garnering a cult reputation. Now that they are gathered in such a generous collection, we can see he is one of the strongest, smartest and most literate essayists practicing today. This, one would hope, is his moment.

more here.

on “The Colonel” by Mahmoud Dowlatabadi

Raha Namy at The Quarterly Conversation:

Mahmoud Dowlatabadi (born 1940) is considered by many the living Iranian novelist, a perennial Nobel Prize candidate. Dowlatabadi wrote The Colonel some thirty years ago, because in his own words he had been “afflicted.” The subject forced him to sit at the desk and write nonstop for two years. “Writing The Colonel I felt a strong sense of indignation and pain. As I mentioned before somewhere, I felt that if I did not write The Colonel, I would probably end up in a mad house,” he noted in email correspondence last spring.

At the time Dowlatabadi put the manuscript away and returned to it periodically to revise and edit. The revisions did not lead to any change in the contextual elements, he explains, but helped him save what he had written “with strong emotions and under the influence of its own era” from sentimentalism and polish it with the help of creative decisions that are not “intentional” but “unavoidable,” what could be called “birth born out of birth.”

He finally handed the work to his publisher a few years ago. It was then submitted to the Iranian Ministry of Culture and Islamic Guidance (the censorship apparatus that needs to preapprove all books before publication) but has so far been denied a permit, its destiny still under debate.

more here.

a world where artificial intelligence systems relieve us of the need to think

Richard Waters at the Financial Times:

What is to stop automation from ultimately assuming all of mankind’s mental and physical efforts? And when the machines do all the heavy lifting — whether in the form of robots commanding the physical world or artificial intelligence systems that relieve us of the need to think — who is the master and who the slave?

Despite the antagonism he sometimes stirs in the tech world (an influential article of his published by the Harvard Business Review in 2003 was called, provocatively, “IT Doesn’t Matter”) author Nicholas Carr is not a technophobe. But in The Glass Cage he brings a much-needed humanistic perspective to the wider issues of automation. In an age of technological marvels, it is easy to forget the human.

Carr’s argument here is that, by automating tasks to save effort, we are making life easier for ourselves at the cost of replacing our experience of the world with something inferior. “Frictionless” is the new mantra of tech companies out to simplify life as much as possible. But the way Carr sees it, much of what makes us most fulfilled comes from taking on the friction of the world through focused concentration and effort. What would happen, in short, if we were “defined by what we want”?

more here.

Queen of the Jungle

M. Myers Griffith in The Morning News:

Orangutans are some of humans’ closest relatives, genetically. They also rarely exhibit aggression, despite how we’ve abused them. One is different.

Orangutans rarely exhibit aggression. A 2014 study by Dr. Katja Liebal and colleagues showed that out of chimpanzees, bonobos, gorillas, and orangutans, only the orangutans exhibited altruism, readily offering a tool that could help another member of their species get at food that was otherwise out of reach. Altruism has also been scientifically observed in 12-month-old humans and has been documented to increase throughout early childhood. Yet we frequently observe altruism’s absence on the streets of our towns, the instinct subjugated to ego and greed, achievement and pride. What could cause a human to subdue his innate altruism? Could the same have happened to orangutans like Mina?

Certainly the capture and confusion that surrounded Mina’s youth could have fueled her aggression. Yet her legends, the fear she inspired in villagers, gave her a larger aura, as though her aggression was rooted not in her personality but in her species’ struggles. A 2010 study of historical documents estimated that orangutan sightings declined from one every two days in 1850 to one every 13 days in 2005. The study, by Dr. Erik Meijaard and colleagues, named hunting as an important cause of species decline. In addition to habitat loss, which discourages breeding and regeneration, hunting remains a leading cause of orangutan death. According to one survey, led by Dr. Jacqueline Davis, 44,165 orangutans have been killed by humans in Kalimantan (Borneo) in the past 80 years, a staggering number considering that there are only about 40,000 living on that island today. Another study by Meijaard and colleagues estimated that between 2,383 and 3,882 orangutans have been killed by humans every year for the past 80 years.

More here.

Literature of India, Enshrined in a Series

Jennifer Schuessler in The New York Times:

When the Loeb Classical Library was founded in 1911, it was hailed as a much-needed effort to make the glories of the Greek and Roman classics available to general readers. Virginia Woolf praised the series, which featured reader-friendly English translations and the original text on facing pages, as “a gift of freedom.” Over time, the pocket-size books, now totaling 522 volumes and counting, became both scholarly mainstays and design-geek fetish objects, their elegant green (Greek) and red (Latin) covers spotted everywhere from the pages of Martha Stewart Living to Mr. Burns’s study on “The Simpsons.” Now, Harvard University Press, the publisher of the Loebs, wants to do the same for the far more vast and dizzyingly diverse classical literature of India, in what some are calling one of the most complex scholarly publishing projects ever undertaken.

The Murty Classical Library of India, whose first five dual-language volumes will be released next week, will include not only Sanskrit texts but also works in Bangla, Hindi, Kannada, Marathi, Persian, Prakrit, Tamil, Telugu, Urdu and other languages. Projected to reach some 500 books over the next century, the series is to encompass poetry and prose, history and philosophy, Buddhist and Muslim texts as well as Hindu ones, and familiar works alongside those that have been all but unavailable to nonspecialists. The Murty will offer “something the world had never seen before, and something that India had never seen before: a series of reliable, accessible, accurate and beautiful books that really open up India’s precolonial past,” said Sheldon Pollock, a professor of South Asian studies at Columbia University and the library’s general editor.

More here.

Space travel for a new millennialism

Ken Kalfus at n+1:

For more than a century now, the fourth planet from the sun has drawn intense interest from those of us on the third. We viewed it, first, as a place where life and intelligence might flourish. The mistaken identification of artificial water channels on its surface in the late 19th century seemed to prove that they did. More recently, terrestrials have gazed at the arid, cratered, wind-swept landscape and seen a world worth traveling to. With increasingly intense longing, we’ve now begun to think of it as a newfound land that men and women can settle and colonize. It’s the only planet in the solar system—rocky, almost temperate, and relatively close—where something like that can be conceived of as remotely plausible.

Since the last moonwalk, in 1972, Mars has drawn the fitful attention of American presidents and blue-ribbon commissions. As the Apollo program was winding down, Richard Nixon declared, “We will eventually send men to explore the planet Mars.” During the Reagan Administration, the National Commission on Space, chartered by Congress, proposed actual dates: a return to the moon by 2005 and a landing on Mars by 2015. President George H. W. Bush declared “a new age of exploration with not only a goal but also a timetable: I believe that before Apollo celebrates the fiftieth anniversary of its landing on the Moon, the American flag should be planted on Mars.”

more here.

the futility of attempts to find a substitute for God

Eugene McCarraher at Dissent:

Yet despite His protracted dotage, God refuses to shuffle off into oblivion. If He lingers as a metaphysical butt in seminar rooms and research laboratories, He thrives in the sanctuaries of private belief, religious communities, and seminaries, and abides (sometimes on sufferance) in theology and religious studies departments. He flourishes in suburban evangelical churches everywhere in North America; offers dignity and hope to the planet of slums in Kinshasa, Jakarta, São Paulo, and Mumbai; inspires pacifists and prophets for the poor as well as bombers of markets and abortion clinics. David Brat claims Him for libertarian economics, while Pope Francis enlists Him to scourge the demons of neoliberal capitalism. He’s even been seen making cameo appearances in the books of left-wing intellectuals. “Religious belief,” Terry Eagleton quips, “has rarely been so fashionable among rank unbelievers.”

As Eagleton contends in Culture and the Death of God, the Almighty has proven more resilient than His celebrated detractors and would-be assassins. God “has proved remarkably difficult to dispose of”; indeed, atheism itself has proven to be “not as easy as it looks.” Ever since the Enlightenment, “surrogate forms of transcendence” have scrambled for the crown of the King of Kings—reason, science, literature, art, nationalism, but especially “culture”—yet none have been up to the job.

more here.

New painting at the Museum of Modern Art

Peter Schjeldahl at The New Yorker:

Don’t attend the show seeking easy joys. Few are on offer in the work of the thirteen Americans, three Germans, and one Colombian—nine women and eight men—and those to be found come freighted with rankling self-consciousness or, here and there, a nonchalance that verges on contempt. The ruling insight that Hoptman proposes and the artists confirm is that anything attempted in painting now can’t help but be a do-over of something from the past, unless it’s so nugatory that nobody before thought to bother with it. In the introduction to the show’s catalogue, Hoptman posits a post-Internet condition, in which “all eras seem to exist at once,” thus freeing artists, yet also leaving them no other choice but to adopt or, at best, reanimate familiar “styles, subjects, motifs, materials, strategies, and ideas.” The show broadcasts the news that substantial newness in painting is obsolete.

Opening the show, in the museum’s sixth-floor lobby, are large, virtuosic paintings on paper by the German Kerstin Brätsch, which recall Wassily Kandinsky and other classic abstractionists. Brätsch encases many of her paintings in elaborate wood-and-glass frames that are leaned or stacked against a wall. The installation suggests a shipping depot of an extraordinarily high-end retailer.

more here.

College Football Coaches, the Ultimate 1 Percent

Matt Connolly in Washington Monthly:

In 1925, one of college football’s biggest stars did the unthinkable. Harold “Red” Grange, described by the famous sportswriter Damon Runyon as “three or four men rolled into one for football purposes,” decided to leave college early in order to play in the National Football League.

While no fan today would begrudge an All-American athlete for going pro without his diploma, things were different for Grange. The NFL was only a few years old, and his decision to take the money in the pros before finishing his degree at the University of Illinois was a controversial one. It was especially reviled by Robert Zuppke, his coach at Illinois.

As the story goes, Grange broke the news to Zuppke before promising to return to finish his degree. “If I have anything to do with it you won’t come back here,” Zuppke replied, furious that a respectable college man would drop out and try to make a living off playing a game. “But Coach,” Grange said. “You make money off of football. Why can’t I make money off of football?”

It’s a question that has underscored the development of modern college football ever since. Aside from scholarships and (some) health insurance, the players remain unpaid. They are also subject to draconian National Collegiate Athletic Association (NCAA) rules that banish them to hell for such sins as signing an autograph for cash or selling a jersey. Meanwhile their coaches enjoy ever-swelling salaries, bonuses, paid media appearances, and other perks like free housing. According to Newsday, the average compensation for the 108 football coaches in the NCAA’s highest division is $1.75 million. That’s up 75 percent since 2007. Alabama’s Nick Saban, college football’s highest-paid coach, will earn a guaranteed $55.2 million if he fulfills the eight-year term of his contract.

Read the rest here.

Reporting Violence: Short film featuring the work of Wolf Böwig and Pedro Rosa Mendes

When hatred whorls across a continent, it envelops people and daily life; it cuts off limbs, flattens villages, burns down buildings, and pushes hard against hope and belief. It is difficult to imagine the degree to which the world can turn upside down, and most of the time those of us not amid, or recovering from, such destruction don’t. We should. Not because it’s pleasant. Not because it’s easy or righteous. But because it’s the truth.

Nominated for the German Human Rights Film Award 2014, this is a disturbing and moving short film featuring the photographs of 3QD friend and renowned war photographer Wolf Böwig, and the writing of Pedro Rosa Mendes. More information here at Black Light Project.