Oliver Burkeman in The Guardian (Illustration by Peter Gamelen):
The consciousness debates have provoked more mudslinging and fury than most in modern philosophy, perhaps because of how baffling the problem is: opposing combatants tend not merely to disagree, but to find each other’s positions manifestly preposterous. An admittedly extreme example concerns the Canadian-born philosopher Ted Honderich, whose book On Consciousness was described, in an article by his fellow philosopher Colin McGinn in 2007, as “banal and pointless”, “excruciating”, “absurd”, running “the full gamut from the mediocre to the ludicrous to the merely bad”. McGinn added, in a footnote: “The review that appears here is not as I originally wrote it. The editors asked me to ‘soften the tone’ of the original [and] I have done so.” (The attack may have been partly motivated by a passage in Honderich’s autobiography, in which he mentions “my small colleague Colin McGinn”; at the time, Honderich told this newspaper he’d enraged McGinn by referring to a girlfriend of his as “not as plain as the old one”.)
McGinn, to be fair, has made a career from such hatchet jobs. But strong feelings only slightly more politely expressed are commonplace. Not everybody agrees there is a Hard Problem to begin with – making the whole debate kickstarted by Chalmers an exercise in pointlessness. Daniel Dennett, the high-profile atheist and professor at Tufts University outside Boston, argues that consciousness, as we think of it, is an illusion: there just isn’t anything in addition to the spongy stuff of the brain, and that spongy stuff doesn’t actually give rise to something called consciousness. Common sense may tell us there’s a subjective world of inner experience – but then common sense told us that the sun orbits the Earth, and that the world was flat. Consciousness, according to Dennett’s theory, is like a conjuring trick: the normal functioning of the brain just makes it look as if there is something non-physical going on. To look for a real, substantive thing called consciousness, Dennett argues, is as silly as insisting that characters in novels, such as Sherlock Holmes or Harry Potter, must be made up of a peculiar substance named “fictoplasm”; the idea is absurd and unnecessary, since the characters do not exist to begin with. This is the point at which the debate tends to collapse into incredulous laughter and head-shaking: neither camp can quite believe what the other is saying. To Dennett’s opponents, he is simply denying the existence of something everyone knows for certain: their inner experience of sights, smells, emotions and the rest. (Chalmers has speculated, largely in jest, that Dennett himself might be a zombie.) It’s like asserting that cancer doesn’t exist, then claiming you’ve cured cancer; more than one critic of Dennett’s most famous book, Consciousness Explained, has joked that its title ought to be Consciousness Explained Away. Dennett’s reply is characteristically breezy: explaining things away, he insists, is exactly what scientists do. 
When physicists first concluded that the only difference between gold and silver was the number of subatomic particles in their atoms, he writes, people could have felt cheated, complaining that their special “goldness” and “silveriness” had been explained away. But everybody now accepts that goldness and silveriness are really just differences in atoms. However hard it feels to accept, we should concede that consciousness is just the physical brain, doing what brains do.
Leon Wieseltier at the Jewish Review of Books:
Learning to live with disagreement, moreover, is a way of learning to live with each other. Etymologically, the term machloket refers to separation and division, but the culture of machloket is not in itself separatist and divisive. This is in part because all the parties to any particular disagreement share certain metaphysical and historical assumptions about the foundations of their identity. But beyond those general axioms, the really remarkable feature of the Jewish tradition of machloket is that it is itself a basis for community. The community of contention, the contentious community, is not as paradoxical as it may seem. The parties to a disagreement are members of the disagreement; they belong to the group that wrestles together with the same perplexity, and they wrestle together for the sake of the larger community to which they all belong, the community that needs to know how Jews should behave and live. A quarrel is evidence of coexistence. The rabbinical tradition is full of rival authorities and rival schools—it owes a lot of its excitement to those grand and even bitter altercations—but the rivalries play themselves out within the unified framework of the shared search. There is dissent without dissension, and yet things change. Intellectual discord, if it is practiced with methodological integrity, is compatible with social peace.
The absence of the God’s-eye view of an issue, and the consequent recognition of the limitations of all individual perspectives, has a humbling effect. A universe of controversy is a universe of tolerance. Machloket is not schism, and the difference is crucial.
Peter Schjeldahl at The New Yorker:
All artists want to change the world, usually just by making it take special notice of them, but now and then they do so out of a devotion to larger hopes. “The Left Front: Radical Art in the ‘Red Decade,’ 1929-1940,” a fascinating scholarly show at New York University’s Grey Art Gallery, on Washington Square, illustrates the most sustained convergence of art and political activism in American history. Some one hundred works by forty artists, along with photographs and publications, tell a story that tends to figure in art history only as a background to the emergence of the Abstract Expressionist generation; Arshile Gorky, Jackson Pollock, Willem de Kooning, et al., shared poverty but not zeal with their marching contemporaries. (Gorky revered Stalin and joined demonstrations near his loft on Union Square, but he scorned proletarian art, pronouncing it “Poor art for poor people.”) The show makes visible a twisty saga that the critic Clement Greenberg, who started his career in the late nineteen-thirties at the initially Communist-sponsored Partisan Review, mentioned in passing in a 1961 book, “Art and Culture.” He wrote, “Some day it will have to be told how ‘anti-Stalinism,’ which started out more or less as ‘Trotskyism,’ turned into art for art’s sake, and thereby cleared the way, heroically, for what was to come.”
The show originated at Northwestern University, where it was curated by John Murphy and Jill Bugajski, and it focussed on the movement’s legacy in Chicago. (“Left Front” was the name of an activist magazine published in that city in the early thirties.) It has now been expanded with material from New York, where the era’s leading organizations of radical artists began: the John Reed Club, in 1929, and its Popular Front successor, the American Artists’ Congress, in 1936.
Stuart Klawans at The Nation:
Although it was shot in Argentina, partially bankrolled in Spain (by Pedro Almodóvar’s company), given its premiere at Cannes and then shortlisted for the Oscars, the true mark of the internationalism of Damián Szifron’s Wild Tales is that it bears the artistic stamp of Quentin Tarantino. Many other films destined for US art houses display comparably global credentials, but Wild Tales is exceptional for the brio with which it imitates a style that is already proudly imitative—and as accepted worldwide as the American Express card.
You will immediately recognize the genre-movie settings (a cheap roadside diner in the rain, a lonely stretch of mountain highway), the pop-archivist musical choices (Giorgio Moroder’s Flashdance soundtrack, Bobby Womack’s cover of “Fly Me to the Moon”), the frequent pauses to let you admire a graphic effect (an off-kilter close-up, a character framed by a window), and the teasing, discontinuous narrative (which gives you six stories for the price of your ticket).
Above all, note the ratio of laughter to mayhem, which remains high in Wild Tales despite the continually mounting pile of corpses. The body count is already incalculable by the end of the first story, which comes to a boomingly funny climax before Szifron even rolls the opening credits, with their spaghetti-western theme music.
Paul Strohm in The Spectator:
Proust had his cork-lined bedroom; Emily Dickinson her Amherst hidey-hole; Mark Twain a gazebo with magnificent views of New York City. Where, then, did the father of English poetry do his work? From 1374 till 1386, while employed supervising the collection of wool-duties, Chaucer was billeted in a grace-and-favour bachelor pad in the tower directly above Aldgate, the main eastern point of entry to the walled city of London.
‘Grace and favour’ makes it sound grander than it was. With the help of a wonderfully ingenious pattern of inferences — in particular an architectural drawing from 200 years later which happened to include a sketch of Aldgate’s north tower at its margins — Paul Strohm is able to reconstruct the room in which, after a long day weighing bags of wool and writing down columns of figures, Geoffrey Chaucer retired to scratch away at his verse.
Chaucer occupied a single bare room of about 16’ x 14’. The only natural light would come from ‘two (or at most four) arrow slits’ tapering through the five-foot thickness of these walls (the towers were a defensive feature) to an external aperture of four or five inches. ‘Light, even at midday, would have been extremely feeble. Arrangement for a small fire might have been possible. Waste would be hand-carried down to the ditch that lapped against the tower and dumped there.’
You can imagine how cosy it was in winter. And the noise! Chaucer slept directly over the main London thoroughfare. Every morning at first light the portcullis would go rattling up, and thereafter ‘the creak of iron-wheeled carts in and out of the city, drovers’ calls, and the hubbub of merchants and travellers pressing for advantage on a wide but still one-laned road, probably made sleep impossible, five-foot walls or no five-foot walls’. That’s if he could hear anything over the incessant bong-bonging of bells from each of the three churches within a couple of hundred feet of his front door.
Ali Minai in Barbarikon:
This piece in the Atlantic from a few months ago is a wonderful profile of Douglas Hofstadter and a timely exposition of an issue at the core of the artificial intelligence enterprise today.
I read Doug Hofstadter's great book, Goedel, Escher, Bach (or GEB, as everyone calls it) in 1988 as a graduate student working in artificial intelligence – and, as with most people who read that book, it was a transformative experience. Without doubt, Hofstadter is one of the most profound thinkers of our time, even if he chooses to express himself in unconventional ways. This piece captures both the depth and tragedy of his work. It is the tragedy of the epicurean in a fast food world, of a philosopher among philistines. At a time when most people working in artificial intelligence have moved on to the “practical and possible” (i.e., where the money is), Hofstadter doggedly sticks with the “practically impossible”, in the belief that his ideas and his approach will eventually recalibrate the calculus of possibility. The reference to Einstein at the end of the piece is truly telling.
My main concern, however, is the deeper point made in the Atlantic article: The degree to which the field of artificial intelligence (AI) has abandoned its original mission of replicating human intelligence and swerved towards more “practical” applications based on “Big Data”. This point was raised vociferously by Fredrik deBoer in a recent piece, and much of this post is a response to his critique of the current state of AI.
deBoer begins with a simplistic dichotomy between what he terms the “cognitive” and the “probabilistic” models of intelligence. The former, studied by neuroscientists and psychologists – grouped together under the term “cognitive scientists” – was the original concern of AI, which sought to first understand and then replicate human intelligence. Instead, what dominates today is the latter approach which seeks to achieve practical capabilities such as machine translation, text analysis, recommendation, etc., through the application of statistics to large amounts of data without any attempt to “understand” the processes in cognitive terms. deBoer sees this as a retreat for AI from its original lofty goals to mere praxis driven, in his opinion, by the utter failure of cognitive science to elucidate how real intelligence works.
Zachariah Mampilly in n+1:
Sudan was the site of the first major anti-colonial revolt in African history, when the followers of Muhammad Ahmad, known as the Mahdi (or Redeemer), overthrew the Anglo-Egyptian regime in 1885. Yet the Mahdist revolt is not the only or even most consequential of Sudan’s historic uprisings. In 1964, countless Sudanese took to the streets to overthrow the military regime of Ibrahim Abboud. At the forefront of the revolt was the country’s emerging civil society—students, trade unions, and members of the vibrant Sudanese Communist Party. But the protest wave quickly swelled beyond civil society, drawing in ordinary people as it flowed towards the presidential palace. Most demonstrations were peaceful, but some engendered bouts of rioting. The regime opened fire, killing twenty-eight and scattering protesters. Its victory was short-lived, however: the next day, facing pressure from junior military officers unhappy with the violent crackdown, Abboud dissolved the military government and stepped aside. The triumph of the protesters, now remembered as Sudan’s “October Revolution,” represented the first time in post-colonial African history that a popular movement overthrew a military regime, preceding the Arab Spring by nearly half a century.
But few other post-colonial nations have struggled as much to remain a viable national community. Just two years ago, the southern region was cleaved off, following a civil war that had stretched on for decades. As with an amputated limb, many Sudanese cannot shake the sensation of its phantom presence.
Erica Wald on Dissertation Reviews:
Between 1880 and 1940, the association between alcohol and the Indian nation shifted dramatically. Eric Colvard’s work examines the lesser-studied role of temperance within Indian nationalism, exploring the history of nationalism (and nation-construction) through the lens of drink. The dissertation argues that there was a close connection between the two. Temperance organizations not only contributed to nascent nationalist protests; the concept of drink itself came to be defined as something “foreign” and inherently anti-Indian by elite nationalists.
…The drinking habits of Indians changed under colonial rule, in part due to the fact that the colonial tax policy favored the consumption of “foreign” liquors over more traditional drinks such as toddy and “country” liquors. The Government of Bombay introduced the 1878 Act partly in response to criticisms of government alcohol policy by temperance advocates. These activists argued that colonial excise policy had prompted an increase in alcohol production (akbari) and that constitutional reform was needed to curb this. However, the dilemma (if it could be understood as such) for the colonial state was that excise revenues from licenses for the production and sale of liquor represented a significant part of the revenues of each of the presidencies. Although inimical to the ideas of the temperance activists, the Bombay Act provided the presidency another way to increase its revenue. The Act placed alcoholic beverages in one of three categories: toddy; imported, or “foreign” liquor; and “country” liquor. Prior to the Act’s implementation in 1879, liquor in all forms could be sold by anyone upon payment of a license fee. The new law not only increased the tax payable on toddy trees themselves and required that tappers maintain a minimum of twenty-five trees, but fixed the price of toddy at a very low rate. The effect of these actions was to exclude the small-scale producers who had previously composed the majority of drink manufacturers. As such, the Act significantly disrupted the small-scale village economies previously dependent on the production of local liquors, shifting the contracts to wealthy monopolists who not only adulterated their liquor, but charged the public much more for it.
Read the full review here.
Megan Scudellari in Nature:
Two mice perch side by side, nibbling a food pellet. As one turns to the left, it becomes clear that food is not all that they share — their front and back legs have been cinched together, and a neat row of sutures runs the length of their bodies, connecting their skin. Under the skin, however, the animals are joined in another, more profound way: they are pumping each other's blood. Parabiosis is a 150-year-old surgical technique that unites the vasculature of two living animals. (The word comes from the Greek para, meaning 'alongside', and bios, meaning 'life'.) It mimics natural instances of shared blood supply, such as in conjoined twins or animals that share a placenta in the womb. In the lab, parabiosis presents a rare opportunity to test what circulating factors in the blood of one animal do when they enter another animal. Experiments with parabiotic rodent pairs have led to breakthroughs in endocrinology, tumour biology and immunology, but most of those discoveries occurred more than 35 years ago. For reasons that are not entirely clear, the technique fell out of favour after the 1970s.
In the past few years, however, a small number of labs have revived parabiosis, especially in the field of ageing research. By joining the circulatory system of an old mouse to that of a young mouse, scientists have produced some remarkable results. In the heart, brain, muscles and almost every other tissue examined, the blood of young mice seems to bring new life to ageing organs, making old mice stronger, smarter and healthier. It even makes their fur shinier. Now these labs have begun to identify the components of young blood that are responsible for these changes. And last September, a clinical trial in California became the first to start testing the benefits of young blood in older people with Alzheimer's disease.
Protesting the Tornado
—for the Westboro Baptist Church
Tornados make no mistakes.
We agree on this least of beliefs,
that after disaster walls collapse
back to ideas of houses,
our careful game
knocked to basic elements:
raw planks, exposed nails.
But I need to tell you:
Apocalypse, it feels like,
when the wet and the noise
is so much bigger than us,
we shivering mutts in this night closet.
You sing hymns—things fall apart.
You praise the weapon, the rain mountain
reversed into a grinding top, punishment
for the new Babel of Main Street USA.
You invoke God-the-terrorist and march
as his territorial army, stock-arming
violent winds alongside firearms,
crucifixes, and damning placards.
I need to tell you
I cannot fight you,
the way prey does not turn
to be consumed by its predator—your species
who eats your saviour
and wears the instrument of his torture
at your throat.
I’m here to warn you
about the end of days,
about the delicate finger of Chance
that comes for us all.
Someday it will hover, just
above your shoulder,
terrifying & meaningless.
by Jennifer Matthews
from The Stinging Fly, Vol. 2, Issue 22, 2012
John McQuaid in the Wall Street Journal:
Like our affection for a hint of bitterness in cuisine, our love of spicy heat is the result of conditioning. The chili sensation mimics that of physical heat, which has been a constant element of flavor since the invention of the cooking fire: We have evolved to like hot food. The chili sensation also resembles that of cold, which is unpleasant to the skin but pleasurable in drinks and ice cream, probably because we have developed an association between cooling off and the slaking of thirst. But there’s more to it than that.
Paul Rozin, a professor of psychology at the University of Pennsylvania, became interested in our taste for heat in the 1970s, when he began to wonder why certain cultures favor highly spicy foods. He traveled to a village in Oaxaca, in southern Mexico, to investigate, focusing on the differences between humans and animals. The residents there ate a diet heavy in chili-spiced food. Had their pigs and dogs also picked up a taste for it?
Adam Frank in the New York Times:
Our galaxy, the Milky Way, is home to almost 300 billion stars, and over the last decade, astronomers have made a startling discovery — almost all those stars have planets. The fact that nearly every pinprick of light you see in the night sky hosts a family of worlds raises a powerful but simple question: “Where is everybody?” Hundreds of billions of planets translate into a lot of chances for evolving intelligent, technologically sophisticated species. So why don’t we see evidence for E.T.s everywhere?
The physicist Enrico Fermi first formulated this question, now called the Fermi paradox, in 1950. But in the intervening decades, humanity has recognized that our own climb up the ladder of technological sophistication comes with a heavy price. From climate change to resource depletion, our evolution into a globe-spanning industrial culture is forcing us through the narrow bottleneck of a sustainability crisis. In the wake of this realization, new and sobering answers to Fermi’s question now seem possible.
Maybe we’re not the only ones to hit a sustainability bottleneck. Maybe not everyone — maybe no one — makes it to the other side.
Since Fermi’s day, scientists have gained a new perspective on life in its planetary context. From the vantage point of this relatively new field, astrobiology, our current sustainability crisis may be neither politically contingent nor unique, but a natural consequence of laws governing how planets and life of any kind, anywhere, must interact.
Justin E. H. Smith in The Utopian:
My, what a year it’s been, so far. I spent the first week of it happily writing an overdue article on philosophical debates about avian vocalization—birdsong—from Aristotle to Kant. I spent the second week engaged in near-constant polemics and editorializing about the place of free speech in a just society. My life has been entirely overtaken by debates about what is at stake in the wake of last week’s attacks. I have tried to pull out, to get back to a normal sleeping schedule, to return to beautiful things. But I can’t. It has simply been too severe a bouleversement. It is a true crisis. Life, and history, occasionally throw these our way.
In case you missed it: some days ago in Paris a pair of assassins targeted and murdered the cartoonists associated with a weekly satirical magazine that had offended them with its contributions to the low art of caricature. Two days later, an ally of the assassins murdered four more people. What was their offense? They were Jewish, and they were moreover guilty by association with the cartoonists. What was the nature of this association? They happily lived and paid taxes in the same country that had hosted Charlie Hebdo.
In the days that followed, two trends emerged. The state cynically co-opted the attacks, using them to promote “national unity,” which in fact means increased Islamophobia and deprivation of basic rights to privacy and freedom of expression. Parallel to this, a number of commentators sought effectively to excuse the attacks, or to downplay their atrocity.
Tim Wendel at The American Scholar:
Novelist Charles Baxter contends that the greatest influence on American writing and discourse in recent memory can be traced back to the phrase “Mistakes were made.” Of course, that’s from Watergate and the shadowy intrigue inside the Nixon White House. In his essay, “Burning Down the House,” Baxter compares that “quasi-confessional passive-voice-mode sentence” to what Robert E. Lee said after the battle of Gettysburg and the disastrous decision of Pickett’s Charge.
“All of this has been my fault,” the Confederate general said. “I asked more of the men than should have been asked of them.”
In Lee’s words, and those of King and Kennedy, we hear a refreshing candor and directness that we miss today. In 1968, people responded to what King and Kennedy told them. During that tumultuous 24-hour period in 1968, people cried aloud and chanted in Memphis. Words struck a chord in Indianapolis, too, and decades later former mayor (and now U.S. Senator) Richard Lugar told writer Thurston Clarke that Kennedy’s speech was “a turning point” for his city.