Neanderthals have long been viewed as meat-eaters. The vision of them as inflexible carnivores has even been used to suggest that they went extinct around 25,000 years ago as a result of food scarcity, whereas omnivorous humans were able to survive. But evidence is mounting that plants were important to Neanderthal diets — and now a study reveals that those plants were roasted, and may have been used medicinally. The finding comes from the El Sidrón Cave in northern Spain, where the roughly 50,000-year-old skeletal remains of at least 13 Neanderthals (Homo neanderthalensis) have been discovered. Many of these individuals had calcified layers of plaque on their teeth. Karen Hardy, an anthropologist at the Autonomous University of Barcelona in Spain, wondered whether it might be possible to use this plaque to take a closer look at the Neanderthal menu.
Using plaque to work out the diets of ancient animals is not entirely new, but Hardy has gone further by looking for organic compounds in the plaque. To do this she and a team including Stephen Buckley, an archaeological chemist at the University of York, UK, used gas chromatography and mass spectrometry to analyse the plaque collected from ten teeth belonging to five Neanderthal individuals from the cave. The plaque contained a range of carbohydrates and starch granules, hinting that the Neanderthals had consumed a variety of plant species. By contrast, there were few lipids or proteins from meat. Hardy and her colleagues also found, lurking in the plaque of a few specimens, a range of alkyl phenols, aromatic hydrocarbons and roasted starch granules that suggested that the Neanderthals had spent time in smoky areas and eaten cooked vegetables. The results are published today in Naturwissenschaften. “The idea that Neanderthals were largely meat-eaters has been hard for me to accept given their membership in a mainly vegetarian clade. It is exciting to see this new set of techniques applied to understanding their palaeo-diet,” says Richard Wrangham, an anthropologist at Harvard University in Cambridge, Massachusetts.
The New York response to the Astaires was enthusiastic, but in London it was ecstatic. British audiences had never seen anything like them. They associated the pair’s seemingly effortless skill and breezy humour with an idea of America at its best. Adele, in particular, had a frothy, satirical charm that threw British audiences and critics into paroxysms of delight. The country’s aristocratic society embraced them. Fred and Adele partied with Lord and Lady Mountbatten, the Duke of York and the Prince of Wales, with Fred picking up sartorial tips from the latter, who, Riley notes, may also have picked up some from him. The future Edward VIII was impressed, for example, by the dancer’s extensive collection of colourful braces. Riley also chronicles their friendship with George Gershwin, which began in 1914, when Fred and Adele were not yet on Broadway and Gershwin was working as a song-plugger for $15 a week. Gershwin and Astaire had an immediate affinity.
more from Paula Marantz Cohen at the TLS here.
There has long been a tendency to see the most important innovations of Modernism as arising directly from progressive causes. War, in this view, was considered a limiting if not wholly destructive force that stymied civilian architecture in favor of retrogressive military structures. But in his groundbreaking recent book Architecture in Uniform: Designing and Building for the Second World War, the French architectural historian and architect Jean-Louis Cohen establishes one big, awful, inescapable truth: the full potential of twentieth-century architecture, engineering, and design was realized not in the social-welfare and urban-improvement schemes beloved by the early proponents of the Modern Movement, but rather through technologies perfected during the two world wars to slaughter vast armies, destroy entire cities, decimate noncombatant populations, and industrialize genocide. It is hard to come away from Architecture in Uniform without the same feelings of profound horror and lingering dread that overtake readers of recent books on World War II by Max Hastings, Timothy Snyder, and other historians who continue to reveal with terrifying immediacy just how horrific that catastrophe was.
more from Martin Filler at the NYRB here.
Habermas’s most recent book, Die Verfassung Europas, has caused a stir in Germany since its publication by Suhrkamp in November; it has just been published here in an English translation as The Crisis of the European Union. But its appeal in Germany has rested not so much on Habermas’s justified indignation about the EU, but instead on his tempered optimism about the future of democracy in Europe. Die Zeit called Die Verfassung Europas “the book of the hour”; Der Spiegel, “a philosopher’s mission to save the EU”; and the Frankfurter Allgemeine Zeitung, a manifesto for “a second chance for a united Europe.” The near unanimous enthusiasm of reviewers probably reflected less a consensus about the book’s arguments than sheer relief, given the daily bad news from Europe, that Habermas had written a hopeful book. He affirms his longstanding commitment to a cosmopolitan Europe in which the dynamics of global capitalism can be remastered beyond the nation-state, at a supranational and global level, and he sees a radically altered European Union as a model—indeed, as the precursor—for a constitutionally sanctioned cosmopolitan world order based not on utopian illusions but on realistic assessments.
more from Anson Rabinbach at The Nation here.
From Frederick Southwick in The Scientist:
I had recently taken a position at an Ivy League institution when another junior faculty member showed me a micrograph of a macrophage containing intracellular bacteria. I immediately noticed an electron-dense material near the bacterial cells, which I suspected might be actin filaments of the host cell. I agreed to test this hypothesis, and using a fluorescent actin stain, found that, indeed, many of the bacterial cells had actin filaments on one pole. “Could this bacterium be harnessing the host’s actin to move within cells?” I wondered aloud to my colleague.
A month later I brought additional data confirming my findings to my collaborator’s office, where I noticed a paper on his desk with the bacterium’s name and actin in the title. He and a senior professor were listed as the authors, but I was not. “Where’s my name?” I asked. He noted that he and this senior professor had decided to perform their own electron microscopy studies, and were submitting their findings to a prestigious journal. “You’re welcome to publish your work separately,” he suggested.
Teju Cole in New Inquiry:
He is unknown. No name, no profession, no identifying details, but he looks out with the calm sternness of one who knows his place in the world. And because of this calmness, this sternness—the skeptical gaze and tight lips—we suspect it might be an image of the artist himself. Self-portraits of artists often present them with a certain forthrightness, which is necessary because the status of artists is always uncertain—this was true in the 15th century, and it is true now. And so, in their portraits of themselves, artists show confidence, worldliness, and a measure of pride in being artists.
Worldliness: the artist is Jan van Eyck, the portrait was painted in 1433 in Bruges, and it is as much a portrait of a man as it is a portrait of his enormous red turban. Each wrinkle of the cloth, each fold, each soft glimmer of light across the soft weave, is painted with the holy precision Jan van Eyck helped introduce to art. He had abandoned tempera and begun to dissolve his pigments in linseed oil in the 1420s. With that came control and a perfection in painterly mimesis never since matched. An inscription on the frame reads, in pseudo-Greek letters, ALC.IXH.XAN—“as I can,” or “to the best of my ability.” He must have known that his best was the best. The gray-eyed gaze of the man in the painting is a dare. Show me who’s done it better, he seems to say. Didn’t think so, he adds.
I was in Brussels a few weeks ago. At the end of my brief visit, something happened that reminded me, in an oblique way, of the fearlessness of “Man in a Turban.” It was a Thursday, and I had a free evening. My friend F. invited me to join her at the opening of a hip place in a central part of town. Around ten, she sent a text message: “I will be a bit later, feels like a scene from your book; there are riots in Molenbeek, the part of Brussels where I live.”
She eventually arrived, and as we got our beers at the sleek new bar, in which I was the only non-white customer, F. told me about Molenbeek. It is an immigrant neighborhood, mostly Muslim, mostly poor. F., as pale as the women in paintings by Van Eyck and Memling, and her husband, who is also Flemish, chose to raise their family among Moroccan neighbors. There are African blacks in the area too. There are sometimes tensions between the two groups.
From Anastasia Tsioulcas at Deceptive Cadence:
I have no idea why Beethoven is the common theme, but it's evidence of a larger story coming into play. Labels are making changes in what they charge for new releases, and it remains to be seen what impact that might have as consumers begin to expect a lower price as a new norm, rather than a special offer.
Long, long ago — say, the late 20th century — labels put releases into three distinct categories. “Front line” releases were new recordings made by current artists, and they fetched the highest price at retail, somewhere around $16 or so for a single CD. “Mid-line” releases were usually reissues, frequently remastered and very lavishly repackaged, whose retail cost was about $10-12 per CD. And “budget” releases were either new recordings made relatively inexpensively by unknown artists (a field the Naxos label dominated), or they were reissues and compilations done on the cheap, often by third-party labels. (These labels were — and often continue to be — based in Europe, where copyright holds for just 50 years, making all sorts of prized treasures among great recordings fair game, even if they weren't technically allowed to be sold in the U.S., where copyright holds longer.) And even as late as last year, new “front-line” digital releases were generally priced at about $10.
Nick Turse in Guernica:
They call it the New Spice Route, an homage to the medieval trade network that connected Europe, Africa, and Asia, even if today’s “spice road” has nothing to do with cinnamon, cloves, or silks. Instead, it’s a superpower’s superhighway, on which trucks and ships shuttle fuel, food, and military equipment through a growing maritime and ground transportation infrastructure to a network of supply depots, tiny camps, and airfields meant to service a fast-growing U.S. military presence in Africa.
Few in the U.S. know about this superhighway, or about the dozens of training missions and joint military exercises being carried out in nations that most Americans couldn’t locate on a map. Even fewer have any idea that military officials are invoking the names of Marco Polo and the Queen of Sheba as they build a bigger military footprint in Africa. It’s all happening in the shadows of what in a previous imperial age was known as “the Dark Continent.”
From The Independent:
But if there’s a quintessential death of an individual that created too offensive a smell in the nostrils of his people to be forgotten, it is that of the late Palestinian President Yasser Arafat in 2004. The long-held suspicions surrounding the nature of his death were once again brought into the limelight last week when a radiation physics laboratory in Switzerland, at the request of an investigation undertaken by Al Jazeera television, found high levels of polonium in samples taken posthumously from his hair, toothbrush and underpants.

The circumstances of his illness were always shrouded in mystery. Despite being flown to a French military hospital, suffering from severe complications relating to flu, an intestinal infection and a sharp decrease in blood platelets (thrombocytes), the 75-year-old leader succumbed to his illness within just three weeks. Yet what was extraordinary about this case, and the reason why so many columns written following his death appeared to be laced with gossipy benzene, was that his doctors failed to determine the exact cause of his death, saying only that he had a “mystery blood disorder”. The failure of Arafat’s family to sanction an autopsy, and comments by then Palestinian Foreign Minister Nabil Shaath that the French had ruled out the possibility that poisoning caused his condition, did nothing to stem the suspicions.

Fast forward seven years, and now one of the world’s leading specialist radiation laboratories has unequivocally confirmed that high levels of the radioactive toxin polonium were present in his body at the time of his death.
Why Arafat, Polonium and the possible culprits?
If you could confront the pickpocket who ripped you off in the subway, would you simply demand your wallet back, or would you seek vengeance? Your decision to punish the thief might hinge on whether the thief ended up richer than you, a new study suggests. According to most economic theories, self-interest is the prime motivator in human behavior. However, studies show that people consistently sacrifice their own welfare to punish cheats. For example, in a classic economic experiment called the “ultimatum game,” one person holds a certain number of dollars and can offer as many as she likes to a second player. If the second player rejects the offer, the first player loses everything. Rather than accepting any offer, the second player will consistently reject low offers, preferring to receive nothing than to allow her rival to retain the larger sum. In 1999, Swiss economists Ernst Fehr and Klaus Schmidt defined this spiteful reaction toward cheats and freeloaders as “inequity aversion.” They hypothesized that such behavior is essential for cooperation and bargaining, and that it is separate from the desire for revenge, or “reciprocity,” as social scientists call it. However, says Fehr, it isn't easy to tease apart the two motivations in experiments, much less real life. “This is a long-standing question that has not been answered to our full satisfaction.”
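The ultimatum-game logic described above can be made concrete in a few lines of code. The sketch below is purely illustrative (it is not from Fehr and Schmidt's paper or the new study): it applies their 1999 inequity-aversion utility, in which a player's payoff is discounted by "envy" when she falls behind and by "guilt" when she pulls ahead, with hypothetical parameter values chosen only for demonstration.

```python
# Illustrative sketch of Fehr-Schmidt (1999) inequity aversion in the
# ultimatum game. Parameter values (alpha, beta) are hypothetical.

def fehr_schmidt_utility(own, other, alpha=0.8, beta=0.3):
    """Utility = own payoff, minus envy (alpha) when behind the other
    player, minus guilt (beta) when ahead of her."""
    envy = alpha * max(other - own, 0)
    guilt = beta * max(own - other, 0)
    return own - envy - guilt

def responder_accepts(offer, pot=10, alpha=0.8, beta=0.3):
    """The responder accepts iff accepting yields higher utility than
    rejecting (rejection leaves both players with 0, hence utility 0)."""
    return fehr_schmidt_utility(offer, pot - offer, alpha, beta) > 0

# With these parameters, offers below 4 (out of a 10-dollar pot) are
# rejected: the responder prefers nothing over letting her rival keep
# the larger share, exactly the spiteful behavior the article describes.
for offer in range(0, 11):
    print(offer, responder_accepts(offer))
```

Note that a purely self-interested responder (alpha = beta = 0) would accept any positive offer; it is the envy term alone that produces the costly rejections observed in the lab.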
It was the relationship between art and human sensuality, a problem that worried the Victorians and has baffled all subsequent generations. The question at issue is nude painting. Ruskin – who wrote of ‘anatomy’, scarcely ever using the word ‘nude’ – believed that obsession with nakedness had damaged such a great mind as that of Michelangelo. Perhaps he thought similarly about Turner, especially since the English artist’s figural drawings are so weak, sometimes inept. They are indeed a record of failure. The superb landscapist had always wished to join the Old Masters through grand figurative painting. The real theme of Warrell’s selection of drawings is of Turner’s frustration in preparing for that endeavour. His nudes on paper are impatient with the demands of the Royal Academy life class, yet do not go beyond the limits of instruction. However, we do see an occasional more relaxed view of the model’s limbs; and here and there a glimpse of pubic hair, which cheers the eye because of disobedience to the chilliness of marble.
more from Tim Hilton at Literary Review here.
Just where is the Zone, anyway? In the film, a caption says that it’s in a small country, surrounded by barbed wire. But this was just a ruse to keep the censors at bay. To a Soviet audience, a forbidden area surrounded by barbed wire naturally conjured up a big swath of countries surrounded by lots and lots of barbed wire. Dyer points out another meaning: the very word ‘zone’ would call up the Gulags. To prisoners, the world outside was known as the bolshaya zona, the big zone, as opposed to the little one of the camps. At the time, it could also refer to other forbidden zones in the Soviet Union, such as the secret research complexes like Arzamas-16 and Chelyabinsk-40, where the components of the Soviet atom bomb were produced — secret cities that didn’t appear on any map. Now, the word seems to refer prophetically to the exclusionary zone around Chernobyl.
more from Jacob Mikanowski at the LA Review of Books here.
The giant of Fort Lupton was born, like a cowbird’s chick, to parents of ordinary size. His father, Jay Shaw, a lineman for a local power company, was six feet tall; his mother, Bonnie, was an inch or so shorter. At the age of three months, Brian weighed seventeen pounds. At two years, he could grab his Sit ’n Spin and toss it nearly across the room. In photographs of his grade-school classes, he always looked out of place, his grinning, elephant-eared face floating like a parade balloon above the other kids in line. They used to pile on his back during recess, his mother told me—not because they didn’t like him but because they wanted to see how many of them he could carry. “I just think Brian has been blessed,” she said. “He has been blessed with size.” Fort Lupton is a city of eight thousand on the dry plains north of Denver. In a bigger place, Shaw might have been corralled into peewee football at eight or nine, and found his way among other oversized boys. But the local teams were lousy and, aside from a few Punt, Pass & Kick contests—which he won with discouraging ease—Shaw stuck to basketball. By seventh grade, he was six feet tall and weighed more than two hundred pounds.
more from Burkhard Bilger at The New Yorker here.
Tom Hussain in the Kansas City Star:
In September, rap fans will be treated to the online release of “The Mushroom Cloud Effect,” a hardcore album by a debutant artist featuring collaborations with American powerhouses like B-Real, Xzibit and Everlast.
The improbable star of the album is Adil Omar, a 21-year-old Pakistani who works from a studio in the corner of his bedroom in an affluent suburb of Islamabad.
A relative newcomer to Pakistan's thriving music scene, Omar has struck a chord with educated Pakistani youth who – after five years of Taliban terrorist attacks – are using artistic expression to rebel against the moral policing of their conservative society and against being labeled extremists in the West.
True to the rap genre, Omar's lyrics are a scornful, frequently abusive commentary on those stereotypes.
“I make a terrorist tear a wrist, prepare for his funeral, and I'm way beyond your government's or parents' approval,” he rapped in “Paki Rambo,” a 2011 hit whose YouTube video has generated more than 260,000 views.
The song also provided the inspiration for the title of his new album, with Omar wryly commenting on the nuclear arms race between Pakistan and its archenemy, India: “There's no silver lining to a mushroom cloud.”
Stephen M. Walt in Foreign Policy:
One of the more enduring myths in the perennial debate on the Israel-Palestine conflict is the claim that Israel has always been interested in a fair and just peace, and that the only thing standing in the way of a deal is the Palestinians' commitment to Israel's destruction. This notion has been endlessly recycled by Israeli diplomats and by Israel's defenders in the United States and elsewhere.
Of course, fair-minded analysts of the conflict have long known that this pernicious narrative was bogus. They knew that former Prime Minister Yitzhak Rabin (who signed the Oslo Accords) never favored creating a viable Palestinian state (indeed, he explicitly said that a future Palestinian entity would be “less than a state.”) The Palestinians' errors notwithstanding, they also understood that Prime Minister Ehud Barak's offers at Camp David in 2000 — though more generous than his predecessors' — still fell well short of a genuine two-state deal. But the idea that Israel sought peace above all else but lacked a genuine “partner for peace” has remained an enduring “explanation” for Oslo's failure.
Over the past several weeks, however, the veil has fallen off almost completely. If you want to understand what's really going on, here are a few things you need to read.
Andy Martin in the New York Times:
In December 1944, Albert Camus, then editor of Combat, the main newspaper of the French Resistance, made Jean-Paul Sartre an offer he couldn’t refuse: the job of American correspondent. Perhaps, in light of the perpetual tension and subsequent acrimonious split between the two men, he was glad to get him out of Paris. What is certain is that Sartre was delighted to go. He’d had enough of the austerities and hypocrisies of post-liberation France and had long fantasized about the United States. Camus himself would make the trip soon after, only to return with a characteristically different set of political, philosophical and personal impressions.
In some sense, existentialism was going home. The “roots” of 20th-century French philosophy are canonically located on mainland Europe, in the fertile terrain of Hegel, Kierkegaard, Nietzsche, Husserl and Heidegger. But it was not entirely immune to the metaphysical turmoil of the United States at the end of the 19th century. French philosophy retained elements of the pragmatism of C.S. Peirce and the psychologism of William James (each receives an honorable mention in Sartre’s “Being and Nothingness”). More significantly, both Camus and Sartre had learned and borrowed from 20th-century writers like Faulkner, Hemingway and Dos Passos — and, of course, from the films of Humphrey Bogart. Camus, in particular, cultivated the trench coat with the upturned collar and described himself as a mix of Bogart, Fernandel and a samurai.