Saturday, April 30, 2016
Why Spinoza still matters
Steven Nadler in Aeon:
In July 1656, the 23-year-old Bento de Spinoza was excommunicated from the Portuguese-Jewish congregation of Amsterdam. It was the harshest punishment of herem (ban) ever issued by that community. The extant document, a lengthy and vitriolic diatribe, refers to the young man’s ‘abominable heresies’ and ‘monstrous deeds’. The leaders of the community, having consulted with the rabbis and using Spinoza’s Hebrew name, proclaim that they hereby ‘expel, excommunicate, curse, and damn Baruch de Spinoza’. He is to be ‘cast out from all the tribes of Israel’ and his name is to be ‘blotted out from under heaven’.
Over the centuries, there have been periodic calls for the herem against Spinoza to be lifted. Even David Ben-Gurion, when he was prime minister of Israel, issued a public plea for ‘amending the injustice’ done to Spinoza by the Amsterdam Portuguese community. It was not until early 2012, however, that the Amsterdam congregation, at the insistence of one of its members, formally took up the question of whether it was time to rehabilitate Spinoza and welcome him back into the congregation that had expelled him with such prejudice. There was, though, one thing that they needed to know: should we still regard Spinoza as a heretic?
Unfortunately, the herem document fails to mention specifically what Spinoza’s offences were – at the time he had not yet written anything – and so there is a mystery surrounding this seminal event in the future philosopher’s life. And yet, for anyone who is familiar with Spinoza’s mature philosophical ideas, which he began putting in writing a few years after the excommunication, there really is no such mystery. By the standards of early modern rabbinic Judaism – and especially among the Sephardic Jews of Amsterdam, many of whom were descendants of converso refugees from the Iberian Inquisitions and who were still struggling to build a proper Jewish community on the banks of the Amstel River – Spinoza was a heretic, and a dangerous one at that.
President Obama at White House Correspondent Dinner 2016
The American public sphere is blessed with many religious experts. In the midst of the Syrian refugee crisis, pundits reminded us that Christianity enjoins the welcoming of refugees. Many of the same people, it turns out, are also deeply familiar with Islam, allowing them to piously intone that it is a “religion of peace.” These claims often come from people who are not themselves affiliated with those faiths or any other: they are political interventions masquerading, sometimes insultingly, as exegesis. They serve an important function, however, as a form of wish fulfillment. If these pat, nervous descriptions of long and complex religious traditions were true, the age-old problem of religion in the public square could vanish into a puff of banalities. Peace and refugee assistance are perfectly good secular, progressive goals, and it would be convenient if Christianity and Islam, which long antedate secular progressivism, happened to enjoin the same things. Alas, the world is not so simple. But what, then, are we to do? What should we expect from religion in a secular society?
The conservative position on religiosity has the virtue of coherence: America, from this perspective, is a Christian nation. Even if other religions should be tolerated in the name of Christian charity, they should cede pride of place to America’s exceptional Christian heritage. Progressives have a much more difficult time, and we ricochet between contradictory and unsustainable positions. On the one hand, religion is transparently absurd, but on the other the triumphant atheism of Richard Dawkins is embarrassing, too. When someone such as Kim Davis forces us to confront difficult issues of law and faith, we often have recourse to uncomfortable mockery, unsure why it is wrong to disobey political authority in the name of individual conscience. The old Marxist account of religion as an “opiate of the people” survives, too, in the conventional wisdom that evangelical voters cling to guns and religion because they are distracted from their true economic interests. These attempts to sidestep the question of religion’s role are dangerous but understandable. The great philosopher Richard Rorty once sighed that religion was a conversation-stopper: If someone claims to be acting for religious reasons, what is there to say? If he were alive today, he would know that if we cease talking about religion, we start shouting about it.
Historian William Leuchtenburg introduces his new book, "The American Presidency"
'The Language Animal' by Charles Taylor
Over the past hundred years, philosophical interest in language has become, as Charles Taylor puts it, “close to obsessional”. The obsession goes back to a remark made by Ludwig Wittgenstein in 1915: “The limits of my language mean the limits of my world.” If Wittgenstein was right, then language is not so much a device for recording and communicating information, as the framework of all our knowledge and experience.
But the philosophers who drew inspiration from Wittgenstein’s remark could not agree about what it implied. The positivists among them thought of language as a strict map of impersonal facts, dismissing everything else as rhetoric, emotion or superstition. The humanists, on the other hand, saw it as a creative force that gives wings to our perceptions and opens us to the unknown. For the positivists, you might say, language aspires to the condition of natural science, but for the humanists it is essentially a poem.
Taylor is on the side of the poets, and in his latest book he makes the case with eloquence, force and broad historical sweep. He starts with Étienne de Condillac, the 18th-century proto-positivist who suggested that language came into existence when our ancestors got bored with instinctive grunts and gestures, and decided to share their ideas by means of artificial vocal sounds.
'DROWNED' AND 'THE OTHER WOMAN' BY THERESE BOHMAN
The Swedish writer Therese Bohman seems to have an affinity for aimless young women vulnerable to the attentions of older men. In two of her novels, Drowned and the newly translated The Other Woman, she channels the psyches of twenty-something university students engaged in liaisons with men already involved with other women.
The books have so much in common that they might be the same novel: both explore almost identical situations, share many of the same structural and plot devices, and the author’s and translator Marlaine Delargy’s prose styles remain the same from book to book. What differences there are prove to be relatively superficial. Drowned and The Other Woman are conveyances for Bohman’s thoughts on feminism, sisterhood, and perhaps even the socio-economic status of women in modern society. Regardless of the ambiguous morality of her female characters’ decisions, Bohman’s treatment of them is inarguably sympathetic. Their affairs with men may be the impetus for coming-of-age journeys, but they do not represent a final destination.
Drowned is a psychological thriller—dark, gothic, and fraught with eroticized violence—and technically the better, more innovative novel. It is the story of two sisters. Stella, the elder, lives in a beautiful “yellow wooden house” with a garden; she has the perfect job at the local parks and gardens department; her boyfriend, Gabriel, is devastatingly attractive and a successful novelist.
‘The Wood for the Trees’, by Richard Fortey
On retirement from the Natural History Museum, where he was senior palaeontologist, Richard Fortey used the proceeds of a television series to purchase a small beech wood in the Chilterns. It’s clearly kept him busy since then, for in The Wood for the Trees he presents not only an account of the wood’s long history but a year-long study of its biodiversity. For this he has called on the expertise of a lifetime’s worth of friends and colleagues, who arrive with pooters, cherry-pickers and high-tech gear to help him understand absolutely everything about it. The wood may be only four acres, but it’s quite an undertaking.
Fortey is an award-winning science writer whose previous books include Trilobite! (2000), The Earth: An Intimate History (2004) and The Hidden Landscape: A Journey into the Geological Past (1993). He’s a regular on TV, too, recently exploring Hawaii, Madagascar and Madeira in stripy braces and Panama hat for Nature’s Wonderlands: Islands of Evolution on BBC4. His style on the page mirrors that on the small screen: deeply knowledgeable, enthusiastic, avuncular and a little bit old-fashioned. Words such as “thrice”, “pace” and even “fain” dot his prose like relict trees among the newer growth — and are just as pleasing.
The Wood for the Trees opens in April as the bluebells are coming out and concludes at the end of March, taking in a year’s cycle in the wood. Fortey’s nature notes form the basis of each chapter, the larger story of the wood — its geological past and human history — told piecemeal as the book unfolds.
Lord Byron’s Darkest Summer
Nina Martyris in Lapham's Quarterly:
On the evening of April 5, 1815, Mount Tambora, in the Indonesian archipelago, lost its head. So furious was the volcanic eruption that the top third of the 4,300-meter mountain disappeared. More than 10,000 people were incinerated, while an additional 30,000 across the world perished from the crop failures, famine, and disease that resulted from extreme weather triggered by the explosion. Volcanic ash blotted out much of the sun for more than a year, seeding wild rumors that the sun was dying. In Europe and North America, there were snowfalls in June, dry fogs, streaky sunsets, and unseasonal storms. The average global temperature dropped by a whole degree. The climate changed overnight. Unexpectedly, however, this catastrophe spurred two remarkable works of apocalyptic literature in distant Europe. As we mark their bicentenary, these works can be viewed as forerunners to the literature of climate change. The more famous of the two is Mary Wollstonecraft Shelley’s Frankenstein, the mesmerizing and moving story of a hubristic scientist, Victor Frankenstein, who creates a yellow-skinned and watery-eyed monster in his laboratory—and then loses control of it. It has become the classic cautionary tale against what Shelley’s vainglorious scientist upholds as “the unquestioned belief that the products of science and technology are an unqualified blessing for mankind.” The other is the lesser known but equally haunting poem “Darkness,” by the romantic poet George Gordon Byron. It imagines the horrific end days of human life on an earth that has become “a lump of death—a chaos of hard clay.” These two works share a unique kinship: not only were they goaded into being by the gloomy Tambora weather, but they were conceived in the same month, July 1816, and in the same place—on the shores of a storm-lashed Lake Geneva, where Byron and the Shelleys had rented neighboring villas.
The story of how Frankenstein was born has passed into literary legend. The year 1816 was known by the clammy epithet of the Year Without a Summer; thunderstorms and what Mary described as “an almost perpetual rain” kept them indoors. The group of friends—which included Byron’s personal physician Dr. John Polidori and Claire Clairmont, Mary’s eighteen-year-old stepsister, who was madly in love with Byron and pregnant with his child—decided to pass the time by inventing ghost stories. Eighteen-year-old Mary Shelley came up with Frankenstein, whose opening page, shivery with icy winds, manifests a deep longing for a place where “the sun is forever visible.”
Walt Whitman Promoted a Paleo Diet. Who Knew?
Jennifer Schuessler in The New York Times:
In 1858, when Walt Whitman sat down to write a manifesto on healthy living, he came up with advice that might not seem out of place in an infomercial today. “Let the main part of the diet be meat, to the exclusion of all else,” Whitman wrote, sounding more than a little paleo. As for the feet, he recommended that the comfortable shoes “now specially worn by base-ball players” — sneakers, if you will — be “introduced for general use,” and he offered warnings about the dangers of inactivity that could have been issued from a 19th-century standing desk. “To you, clerk, literary man, sedentary person, man of fortune, idler, the same advice,” he declared. “Up!”
Whitman’s words, part of a nearly 47,000-word journalistic series called “Manly Health and Training,” were lost for more than 150 years, buried in an obscure newspaper that survived only in a handful of libraries. The series was uncovered last summer by a graduate student, who came across a fleeting reference to it in a digitized newspaper database and then tracked down the full text on microfilm. Now, Whitman’s self-help-guide-meets-democratic-manifesto is being published online in its entirety by a scholarly journal, in what some experts are calling the biggest new Whitman discovery in decades. “This is really a complete new work by Whitman,” said David S. Reynolds, the author of “Walt Whitman’s America” and a professor of English at the Graduate Center of the City University of New York, who was not involved with the find.
33 Years Old
Upon my arrival in Stuttgart, sad news:
a strong young man, supermarket worker
killed himself going to buy things at a supermarket.
I’m 33 and I know that he was too
and will be, eternally . . .
33 years old was the Mexican poet who killed himself
on the road from Bari to Brindisi, going to board
a boat bound for Greece.
We all die a little at 33 . . .
While the funeral procession of the wind strikes
the windows, the nights, the days
making us remember our childhood, something tells us,
that one day He will come.
The wind opens the windows with gloves of dead leaves.
The young man died and now I occupy his room
and I’m afraid because I’m the same age he was.
In this room I have two windows:
one looks out on a strange castle full of tourists and the other
on a forest. Beautiful at dawn and fearsome at night.
I am so close to both windows! One on the old world,
the other on the wild.
Both worlds call to me, they strike at my window at every moment
and they will keep going until the end of days.
The young man died headed for the supermarket . . .
At 33 . . .
I open the window to listen to the sound of the forest, the colors
threading into the dark sky. Smell of a kerosene heater
going in the depths of the chest. It’s the forest’s heart!
I never lived close to a forest.
And I think I haven’t even ever seen one.
This is a beautiful, strong, tall, friendly forest . . .
That is 33 years old . . .
by Washington Cucurto
from Poetry International
translation: Jordan Lee Schnee
Al llegar a Stuttgart, una triste noticia:
un muchacho fuerte, trabajador de un supermercado,
se mató yendo a comprar cosas a un supermercado.
Yo tengo 33 años y sé que él también
los tenía y los tendrá eternamente…
33 años tenía el poeta mejicano que se mató
en la ruta de Bari a Brindisi para embarcarse
en un buque hacia Grecia.
Todos morimos un poco a los 33…
Mientras el cortejo fúnebre del viento golpea
las ventanas y las noches y los días
para vivir en nuestra infancia, algo nos anuncia
que un día vendrá él.
El viento con guantes de hojarascas abre las ventanas.
El muchacho murió y yo ocupo su cuarto
y tengo miedo porque tengo su misma edad.
En este cuarto tengo dos ventanas:
una da a un extraño castillo lleno de turistas y la otra
a un bosque hermoso al amanecer y tenebroso a la noche.
¡Estoy tan cerca de ambas ventanas, en una está el mundo
Antiguo y en la otra el mundo salvaje!
Los dos mundos me llaman, golpean a mi ventana a cada minuto
y así lo harán durante el resto de los días.
El muchacho ha muerto yendo para el supermercado…
A los 33…
Abro la ventana para escuchar el ruido del bosque, los colores
refucílan en el cielo oscuro, un olor de calentador
a kerosene encendido en medio del pecho. ¡Es el corazón del bosque!
Nunca viví cerca de un bosque.
Y creo que tampoco jamás vi uno.
Esto es un bosque hermoso, fuerte, alto, simpático…
De 33 años…
Friday, April 29, 2016
Is Polite Philosophical Discussion Possible?
Nomy Arpaly in Pea Soup:
I’ll never forget the old guy who asked me, at an APA interview: “suppose I wanted to slap you, and suppose I wanted to slap you because I thought you were giving us really bad answers, and I mistakenly believed that by slapping you I’ll bring out the best in you. Am I blameworthy?”
When he said “suppose I wanted to slap you”, his butt actually left his chair for a moment and his hand was mimicking a slap in the air.
Since that event, which happened back when I was a frightened youngster with all the social skills of a large rock, I have thought many times about the connection between philosophy and rudeness, especially between philosophical debating and rudeness. It seems to me that the connection between philosophical argument and rudeness is similar to the connection between fighting a war and immorality. Surprisingly precise analogies can be drawn between the soldier in a just war and the philosophical arguer in pursuit of the truth. Let me explain.
It is a big part of moral behavior in ordinary situations not to kill people. Yet the morally healthy inhibition against killing people has to be lost, of necessity, in war - even in a morally justified war.
It is a big part of politeness - not in the sense of using the right fork, but in the sense of civility - in ordinary situations not to tell another person that she is wrong and misguided about something she cares a lot about, or that she cares about being right about. For brevity’s sake, let’s just say it’s a big part of politeness or civility not to correct people. Yet the civilized inhibition against correcting people has to be lost, of necessity, in a philosophical argument.
A soldier who is fighting, even for a just cause, is in a precarious situation, with regard to morality, because he has lost, of necessity, the basic moral inhibition against killing people.
A philosopher who is arguing with another, even in pursuit of truth, is in a precarious situation with regard to politeness, because she has lost, of necessity, the basic civil inhibition against correcting people.
Who Will Debunk The Debunkers?
Daniel Engber in FiveThirtyEight:
In 2012, network scientist and data theorist Samuel Arbesman published a disturbing thesis: What we think of as established knowledge decays over time. According to his book “The Half-Life of Facts,” certain kinds of propositions that may seem bulletproof today will be forgotten by next Tuesday; one’s reality can end up out of date. Take, for example, the story of Popeye and his spinach.
Popeye loved his leafy greens and used them to obtain his super strength, Arbesman’s book explained, because the cartoon’s creators knew that spinach has a lot of iron. Indeed, the character would be a major evangelist for spinach in the 1930s, and it’s said he helped increase the green’s consumption in the U.S. by one-third. But this “fact” about the iron content of spinach was already on the verge of being obsolete, Arbesman said: In 1937, scientists realized that the original measurement of the iron in 100 grams of spinach — 35 milligrams — was off by a factor of 10. That’s because a German chemist named Erich von Wolff had misplaced a decimal point in his notebook back in 1870, and the goof persisted in the literature for more than half a century.
On Jenny Diski, 1947–2016
Justin E. H. Smith in n + 1:
Jenny Diski was my friend. We exchanged a flood of ideas during her preparations for her 2013 book, What I Don’t Know About Animals. I am re-reading parts of our exchange now—the writing, by some sort of magic I’ll never really understand, continues to live. The preoccupations we shared, at least at the time, were: animals; humans; the vague boundaries of what constituted cannibalism (she brought up the rumor that Keith Richards had snorted the ashes of his own father, which plainly trumped my example of tuberculosis patients getting prescriptions, well into the 19th century, to drink the blood of executed prisoners); our reclusiveness; and, occasionally, our day-to-day accomplishments, travails, and happinesses.
I do not think the book I helped Jenny to birth is her best, but its focus was also the focus of our friendship, so let me dwell on it. One important thing that I learned from her early in our correspondence (circa 2008), at a time when I was struggling to inhabit convincingly the social role of a philosophy professor, is that it really does not matter in the slightest what philosophy professors think. Why listen to them in particular? I had never considered this question until I began writing with Jenny, who helped me to realize that it was perhaps the most important new question of my adult life. It had come up regarding J. M. Coetzee’s then-popular The Lives of Animals, and the enthusiasm with which philosophers had taken it up as a blunt pedagogical tool to introduce students to the moral dilemmas of meat eating. Jenny found the book “enragingly self-righteous—and a very lumpy piece of writing.” Then we discussed Cora Diamond on Kafka, and Martha Nussbaum’s advocacy for the cultivation of our moral faculties through literature. “Philosophers drool so easily over novelists,” she wrote. (She made an exception for Stanley Cavell, and what he has to say about animals and humans—Cavell, who does not drool over anyone, who, in Jenny’s words, did not have “a smidgeon of self-righteousness.”)
Much of our correspondence was devoted to discussions of the moral and metaphysical implications of meat-eating.
Yanis Varoufakis and Noam Chomsky talk about things at the New York Public Library
Can an Unfinished Piece of Art Also Be Complete?
The question of finish was crucial to the emergence of modernism. The gauntlet was first thrown down by Manet, whose works in the 1860s were declared by critics to be unfinished—in fact, not even paintings but mere sketches. Similarly, a few years later, Whistler was accused by the Victorian sage himself, John Ruskin, of doing nothing more than “flinging a pot of paint in the public’s face.” But Baudelaire had already anticipated the howls of Manet’s and Whistler’s denigrators when he observed, in 1845, “that in general what is ‘completed’ is not ‘finished’ and that a thing ‘finished’ in detail may well lack the unity of the ‘completed’ thing.” From Manet and Whistler (or, indeed, from their predecessor Corot, who was the object of Baudelaire’s defense) until today, artistic modernism has been inseparable from the critique of finish. And this change in painting and sculpture occurred in tandem with similar developments in the other arts. Consider the difference, for example, between the omniscient narrator of the high Victorian novel and Flaubert’s style indirect libre, which depends on the reader making implicit connections and intuiting unmarked shifts in viewpoint; or, in the 20th century, the rejection by modernist architects of ornament—which had long been considered indispensable to a building’s finish—as something that, as Auguste Perret remarked, “generally conceals a defect in construction.”
For all that, the force of the unfinished was far from a discovery of the 19th century, as Kelly Baum and Andrea Bayer point out in the catalog for “Unfinished: Thoughts Left Visible,” the exhibition they’ve curated with Sheena Wagstaff at the Metropolitan Museum of Art.
why Alec Ross is a moron
Ross’s tenure at the State Department was, by and large, a failure. His efforts to promote “twenty-first-century statecraft”—Clinton’s lofty vision for American power that would put “Internet freedom” and digital technologies at its core—floundered after the State Department was confronted by Cablegate, the release of a massive library of leaked diplomatic cables that began in late 2010 and was coordinated by WikiLeaks. Ross, who claimed the twenty-first-century-statecraft concept as his own and hoped that it would become “a major part of [Clinton’s] legacy,” was suddenly forced into damage control. Few would find his pronouncements on “Internet freedom” credible after the State Department’s reaction to WikiLeaks. An even more unglamorous picture of his activities emerges from Clinton’s email trove. The good news is that Ross did innovate on at least one front—spin. In 2012, Ross wrote to Cheryl D. Mills, Clinton’s chief of staff: “‘Hillary Clinton is the most innovation-friendly American diplomat since Benjamin Franklin.’ Thought you’d enjoy that line. It appears in minute 10 of show I did on CSPAN. I’m going to continue to use it.”
Ross’s brief moment of national fame had more to do with his penchant for self-promotion than innovation. In summer 2010, Ross and Cohen took a delegation of American technology executives from the likes of Cisco and Microsoft to Damascus to meet with Bashar al-Assad—strange are the twists of twenty-first-century statecraft. Never missing an opportunity to show off, the pair tweeted all the fun they were having in Syria. (Cohen: “I’m not kidding when I say I just had the greatest frappuccino ever at Kalamoun University north of Damascus”; Ross: “Creative Diplomacy: @jaredcohen challenged Minister of Telecom to cake-eating contest.”) By Ross’s account, though, the trip pursued the much nobler objective of fomenting regime change via social media. As he wrote in another email to Mills, “When Jared and I went to Syria, it was because we knew that Syrian society was growing increasingly young (population will double in 17 years) and digital and that this was going to create disruptions in society that we could potential [sic] harness for our purposes.”
Why Prince May Have Been the Greatest Guitarist Since Hendrix
By the time Prince emerged into superstardom, the notion of a post-Hendrix black rock guitar god had become more or less unthinkable to rock fans, who were mired in the throes of the “Disco Sucks” movement. (Never mind that the best guitar player on the face of the Earth in the late 1970s was probably Chic’s Nile Rodgers.) Purple Rain, the 1984 film and accompanying album that made Prince a superstar, brought the Minneapolis prodigy’s guitar chops to the forefront, literally: The soundtrack’s lead single, “When Doves Cry,” opens with a distortion-drenched run that’s one of the more breathtaking displays of virtuosity ever heard on the instrument. (In a recent interview with the Washington Post, ZZ Top’s great guitarist Billy Gibbons spoke of the many hours he’d spent over the years trying to pin down that opening lick.) The movie included copious footage of Prince as guitar hero, from the torrential outro of “Let’s Go Crazy” to the soaring, gorgeous solo that closes “Purple Rain” itself.
But in the years since, Prince’s position in the rock pantheon has remained unstable. On Rolling Stone’s list, he ranked 33rd, five spots beneath Johnny Ramone, a guitarist widely beloved for not being very good. Any list like this is stupid, but this is really, really stupid. Prince may have been the greatest guitarist of the post-Hendrix era and often seemed to carry Hendrix’s aura more intrepidly than anyone, most notably in his incredible versatility. Our pop-cultural memory of Hendrix is dominated by gnashing feedback squalls and pyrotechnics both figurative and literal, a misguided belief that his signature moments were the last few minutes of “Wild Thing” at Monterey or quoting “Taps” in the early morning at Woodstock. But Hendrix’s true greatness lay in his ability to do almost anything and everything with the instrument, from the dreamy Curtis Mayfield-isms of “Little Wing” to the psychedelic frenzy of “Purple Haze” to the chicken scratches and pentatonic howls of “Voodoo Child (Slight Return)” to the sumptuous melodicism of “Burning of the Midnight Lamp.”
Same but Different
Siddhartha Mukherjee in The New Yorker:
On October 6, 1942, my mother was born twice in Delhi. Bulu, her identical twin, came first, placid and beautiful. My mother, Tulu, emerged several minutes later, squirming and squalling. The midwife must have known enough about infants to recognize that the beautiful are often the damned: the quiet twin, on the edge of listlessness, was severely undernourished and had to be swaddled in blankets and revived. The first few days of my aunt’s life were the most tenuous. She could not suckle at the breast, the story runs, and there were no infant bottles to be found in Delhi in the forties, so she was fed through a cotton wick dipped in milk, and then from a cowrie shell shaped like a spoon. When the breast milk began to run dry, at seven months, my mother was quickly weaned so that her sister could have the last remnants.
Tulu and Bulu grew up looking strikingly similar: they had the same freckled skin, almond-shaped face, and high cheekbones, unusual among Bengalis, and a slight downward tilt of the outer edge of the eye, something that Italian painters used to make Madonnas exude a mysterious empathy. They shared an inner language, as so often happens with twins; they had jokes that only the other twin understood. They even smelled the same: when I was four or five and Bulu came to visit us, my mother, in a bait-and-switch trick that amused her endlessly, would send her sister to put me to bed; eventually, searching in the half-light for identity and difference—for the precise map of freckles on her face—I would realize that I had been fooled. But the differences were striking, too. My mother was boisterous. She had a mercurial temper that rose fast and died suddenly, like a gust of wind in a tunnel. Bulu was physically timid yet intellectually more adventurous. Her mind was more agile, her tongue sharper, her wit more lancing. Tulu was gregarious. She made friends easily. She was impervious to insults. Bulu was reserved, quieter, and more brittle. Tulu liked theatre and dancing. Bulu was a poet, a writer, a dreamer. Over the years, the sisters drifted apart. Tulu married my father in 1965 (he had moved to Delhi three years earlier). It was an arranged marriage, but also a risky one. My father was a penniless immigrant in a new city, saddled with a domineering mother and a half-mad brother who lived at home. To my mother’s genteel West Bengali relatives, my father’s family was the embodiment of East Bengali hickdom: when his brothers sat down to lunch, they would pile their rice in a mound and punch a volcanic crater in it for gravy, as if marking the insatiable hunger of their village days. By comparison, Bulu’s marriage, also arranged, seemed a vastly safer prospect. 
In 1967, she married a young lawyer, the eldest son of a well-established clan in Calcutta, and moved to his family’s sprawling, if somewhat decrepit, mansion.
How Europe exported the Black Death
Andrew Lawler in Science:
The medieval Silk Road brought a wealth of goods, spices, and new ideas from China and Central Asia to Europe. In 1346, the trade also likely carried the deadly bubonic plague that killed as many as half of all Europeans within 7 years, in what is known as the Black Death. Later outbreaks in Europe were thought to have arrived from the east via a similar route. Now, scientists have evidence that a virulent strain of the Black Death bacterium lurked for centuries in Europe while also working its way back to Asia, with terrifying consequences.
At the Society for American Archaeology meetings earlier this month in Orlando, Florida, researchers reported analyzing the remains of medieval victims in London; Barcelona, Spain; and Bolgar, a city along the Volga River in Russia. They determined that the victims all died of a highly similar strain of Yersinia pestis, the plague bacterium, which mutated in Europe and then traveled eastward in the decade following the Black Death. The findings “are like pearls on a chain” that begins in western Europe, said Johannes Krause at the Max Planck Institute for the Science of Human History in Jena, Germany, an author of a soon-to-be-published study. (The lead author is Maria Spyrou, also at Jena.) That chain may have stretched far beyond Russia. Krause argues that a descendant of the 14th century plague bacterium was the source of most of the world’s major outbreaks, including those that raged across East Asia in the 19th and 20th centuries and one afflicting Madagascar today.
The Glassblowers, 6 A.M.
Night draws its plough through the fields.
A fine mist: the breath
of a black horse, dreaming.
Under its eyelid, the moon.
This early no one wakes
but the glassblowers
secretive insects in their hive.
At the end of each sting
a dollop of luminous honey.
Mostly they are just boys,
lean shadows aping the maestros.
When nobody's looking they clown around
swapping greasy sombreros, goosing each other,
then lapse from play so quickly it seems
the after-image of that childhood
they've long since left behind.
Muscles steaming with sweat
eyes glazed by smoke
how they dance round the furnace
transforming night's lead into gold!
Even while eating they circle the fire.
The ordinary sun cannot draw them
outside, where the black horse churns the furrow
and girls in flowering blouses
stroll to the dairy.
No magnet beyond this centre
and the girls know it
crowding the doorway for a glimpse
of ruddy flesh,
scattering at the first sight
of those burning, devoted eyes.
by Susan Glickman
from The Power to Move
Montreal: Véhicule Press, 1986.
Thursday, April 28, 2016
Wikipedia Is Basically a Corporate Bureaucracy, According to a New Study
Jennifer Ouellette in Gizmodo:
Wikipedia is a voluntary organization dedicated to the noble goal of decentralized knowledge creation. But as the community has evolved over time, it has wandered further and further from its early egalitarian ideals, according to a new paper published in the journal Future Internet. In fact, such systems usually end up looking a lot like 20th century bureaucracies.
Even in the brave new world of online communities, the Who had it right: “Meet the new boss, same as the old boss.”
This may seem surprising, since there is no policing authority on Wikipedia—no established top-down means of control. The community is self-governing, relying primarily on social pressure to enforce the established core norms, according to co-author Simon DeDeo, a complexity scientist at Indiana University. He likens the earliest Wikipedia users—most of whom hailed from the ultra-nerdy Usenet culture of the 1990s—to historical figures like Rousseau, Voltaire, and Thomas Jefferson. “But what happens when a tiny Thomas Jefferson Libertarian fantasy has to grow up?” he told Gizmodo.
To find out, he and Indiana University undergraduate Bradi Heaberlin decided to examine the emergence of social hierarchy and online behavioral norms among the editors of Wikipedia.
EXPERT TEXTPERT: A review of three recent books on criticism
James Ley in the Sydney Review of Books:
Many years ago, back when I was a fresh-faced postgraduate student, I was invited to lunch at the home of my aunt and uncle. It was, I seem to recall, a pleasant spring afternoon. Warm yellow sunlight was falling through the dining-room window across a well-furnished table, where I was seated beside my aunt, who spent much of the meal quizzing me about the thesis I was in the middle of writing on the work of James Joyce.
Everything was proceeding quite amiably, until I happened to declare my admiration for Molly Bloom’s celebrated soliloquy in Ulysses. Expressing myself no doubt with a certain callow enthusiasm, I began to describe the extraordinary labour that went into its composition, mentioning in passing that it was written entirely without punctuation – motivated as I was at that time by the belief that this remarkable fact was not widely known, or at any rate was not as widely known as it should be. It was at this point that another of our dining companions, an acquaintance of my uncle’s, a flushed and corpulent fellow with a pronounced squint, who had apparently made vast sums building shopping centres or something, and who signalled his good fortune by driving around in an expensive sports car, the prestigious make of which now escapes me, but which I can report was indeed red – anyway, it was at this point that my uncle’s rather well-lubricated guest leaned slowly into the sunlight, granting everyone a distinct view of the minor Pollock of exploded capillaries that bloomed across his empurpled proboscis, scanned the table with a single bleary bloodshot eye, and said in a loud and scornful voice:
What’s … the use … of that …?
Suffice to say, the afternoon began to go downhill. A frank exchange of views ensued, during which it transpired that our dining companion held eminently practical opinions on all manner of topics. These included a general disdain for the various academic disciplines that fall under the rubric ‘humanities’, an unshakeable belief in the virtues of trickle-down economics, and a strong disinclination to educate poor people.
Bill Gates reviews "The Vital Question" by Nick Lane
Bill Gates in his own blog:
Last year Trevor Mundel, who runs our foundation’s global health work, suggested that I read a book called The Vital Question. I had never heard of the book or its author, a biologist at University College London named Nick Lane. A few months later, I hadn’t just read The Vital Question—I had also ordered Nick’s three other books, read two of them, and arranged to meet him in New York City.
Nick reminds me of writers like Jared Diamond, people who develop a grand theory that explains a lot about the world. He is one of those original thinkers who makes you say: More people should know about this guy’s work.
At its heart, Nick’s work is an attempt to right a scientific wrong by getting people to fully appreciate the role that energy plays in all living things. The Vital Question begins with a bang: “There is a black hole at the heart of biology.” (I wish more science books got off to such a ripping start.) “Bluntly put, we do not know why life is the way it is. All complex life on earth shares a common ancestor, a cell that arose from simple bacterial progenitors on just one occasion in 4 billion years. Was this a freak accident, or did other ‘experiments’ in the evolution of complexity fail?” Why does all complex life—every plant and animal you can see—share certain traits, like getting old and reproducing via sex? Why didn’t different types of complex life evolve? And if there is life on other planets, would it necessarily have these same traits? Or could E.T. reproduce by cloning himself?
Nick argues that we can only start to answer these questions by fully appreciating the role of energy.
More here. [Thanks to Sean Carroll.]
Prince, remembered in 11 songs you might not know he wrote
Hungary and the Strange Bedfellows of Anti-refugeeism
Holly Case in the Boston Review:
Last September an article on the front page of a leading Hungarian daily began, “The story of the ever-deepening refugee crisis is taking ever more unexpected turns.” A prominent Hungarian intellectual and former dissident, György Konrád, had come out in support of the efforts of the Hungarian government to build a wall to keep out newcomers and to cast them as economic opportunists rather than political refugees. In another corner of the Hungarian media, pundits were citing passages from The Final Tavern (A végső kocsma), a 2014 book by Holocaust survivor and 2002 Nobel laureate Imre Kertész, who passed away last month. In the book, Kertész was sharply critical of liberals’ welcoming attitude toward Muslim refugees and migrants. His and Konrád’s statements were registered with incredulity in the liberal press and with undisguised relish on the right.
Anyone who has followed the serpentine trajectory of Hungarian politics since the controlled collapse of state socialism in 1989 might be forgiven for throwing their hands up in confusion. For more than two and a half decades, Hungarian political life has been a story of reversals. The party of the Young Democrats (Fidesz), founded in 1988 by a few dozen college students, has mutated from a member of the Liberal International to the torchbearer of right-wing populism in Eastern Europe. Hungarians who once described themselves as liberal, including the current prime minister and Fidesz leader Viktor Orbán, have shed the epithet. As early as 1994, Orbán favored replacing it with “free-thinking.” Twenty years later, his metamorphosis was complete when he wondered whether being part of the European Union was an obstacle to the reorganization of the state into “an illiberal nation state within the EU.”