Henwood: There are two parts to that. The longer term structural issue is that Wall Street, the financial system, is basically a mechanism for the creation and exercise of ruling class power. It is the heart of the capitalist economic and social system. So taking on Wall Street is very, very complicated.
But, there is also the sense in which these guys represent a social interest that has done very, very badly and they could have had their toes stepped on a bit—and they haven’t. If you go back and compare when Roosevelt gave that famous speech at Madison Square Garden in October 1936, he said: “Never have the rich and powerful been so lined up in their hatred of a political candidate and I welcome their hatred.”
You cannot imagine Obama saying anything remotely similar. That’s partly because the bust has been less dramatic than it was in the 1930s, but also because Roosevelt came from the aristocracy and thus had more personal confidence in stepping on their toes. Obama is a guy who has been created by the meritocracy and it has treated him very well. He’s kind of in awe of wealth and power and much less willing, for personal reasons, to challenge such interests.
The political environment is also totally different now. Going into the 1930s—there was a whole radical tradition: the populist tradition, the progressive tradition—there were people who had different ways of looking at an economy.
A friend of mine once said that the fact that Martha Stewart never shows a mistake, unlike Julia Child, revealed the ethic of an arriviste. Michael Pollan in the New York Times Magazine:
[H]ere’s what I don’t get: How is it that we are so eager to watch other people browning beef cubes on screen but so much less eager to brown them ourselves? For the rise of Julia Child as a figure of cultural consequence — along with Alice Waters and Mario Batali and Martha Stewart and Emeril Lagasse and whoever is crowned the next Food Network star — has, paradoxically, coincided with the rise of fast food, home-meal replacements and the decline and fall of everyday home cooking.
Today the average American spends a mere 27 minutes a day on food preparation (another four minutes cleaning up); that’s less than half the time that we spent cooking and cleaning up when Julia arrived on our television screens. It’s also less than half the time it takes to watch a single episode of “Top Chef” or “Chopped” or “The Next Food Network Star.” What this suggests is that a great many Americans are spending considerably more time watching images of cooking on television than they are cooking themselves — an increasingly archaic activity they will tell you they no longer have the time for.
What is wrong with this picture?
When I asked my mother recently what exactly endeared Julia Child to her, she explained that “for so many of us she took the fear out of cooking” and, to illustrate the point, brought up the famous potato show (or, as Julia pronounced it, “the poh-TAY-toh show!”), one of the episodes that Meryl Streep recreates brilliantly on screen. Millions of Americans of a certain age claim to remember Julia Child dropping a chicken or a goose on the floor, but the memory is apocryphal: what she dropped was a potato pancake, and it didn’t quite make it to the floor. Still, this was a classic live-television moment, inconceivable on any modern cooking show: Martha Stewart would sooner commit seppuku than let such an outtake ever see the light of day.
An economic crash spurred on by a weakness for profit and a blindness to risk; but efforts at reform are resisted in the name of the “free market.” A healthcare system that is more costly and less effective than many others in the developed world; but efforts to change it run aground on the reluctance of some to pay for the benefits of others. Federal coffers drained by unaffordable handouts to the largest corporations, highest income-earners, and wealthiest estate-holders; but efforts to roll back these mistakes are met by an astro-turf tax revolt that smacks more of class warfare than the progressive tax system itself.
We could see these as the same old battles between left and right, the same tired pantomime that ends in stalemate. But it seems to many that something is different this time around, that change in our political system is inevitable. New regulations will be issued for Wall Street and corporations. A new national plan for healthcare will emerge. And changes in our tax laws will have to occur to reverse the deficit and arrest the debt. No doubt each of these will be resisted by those who still cling to a retrograde American “libertarianism.” But it may finally be the case that their outsized and undeserved influence on the politics of the past 30 years is ending. It is time for us to reflect on this free market ideology, and ask whether American libertarianism is (or ought to be) dead.
Kolakowski came back to Poland a number of times in his last years, although he never settled there again. He was an icon among his fellow countrymen – indeed, for his 70th birthday, the largest newspaper in Poland organized a celebration during which they crowned him (with a crown of laurel leaves, of course) … the King of Europe. When he died, the Polish Parliament observed a minute of silence. Poland went into mourning.
But the man himself was never a monument. Indeed, his experience with the “Hegelian bacillus” made Kolakowski forever sensitive to all enthusiasms and all-encompassing creeds. He preferred humor to hectoring, gently making fun of those whom he criticized, while always making sure that even the most severe intellectual critique did not deny the humanity of his opponents. Refusing to believe anything unconditionally, he retained that most important characteristic of a truly great man: he never had unconditional faith in himself.
We air passengers are not accustomed to perceiving, or even imagining, planes in this way, as almost animate beings, with a capacity for suffering and endurance, requiring consideration and esteem, and being sensitive, almost, to gratitude and rancour. We board them and can barely distinguish between them; we know nothing of their age or their past history; we don’t even notice their names, which, in Spain at least, are chosen in such a bureaucratic, pious spirit, so lacking in poetry, adventure and imagination, that they’re hard to retain and recognize if ever we entrust ourselves to them again. I would like to ask Iberia, in this the twenty-first century, to abandon their anodyne patriotic gestures and adulatory nods to the Catholic Church – all those planes called Our Lady of the Pillar and Our Lady of Good Remedy, The City of Burgos and The City of Tarragona – and instead choose names that are more cheerful and more literary. I, for one, would feel safer and more reassured, more protected, if I knew I was flying in The Red Eagle or The Fire Arrow or even Achilles or Emma Bovary or Falstaff or Liberty Valance or Nostromo. Perhaps reading that air hostess’s epistolary revelations had something to do with the diminution of my fear.
Our idea of modernity is in many ways defined by that extraordinary flowering of scientific and philosophical ideas in the 17th and 18th centuries known as the Enlightenment. Yet current attitudes to the Enlightenment are ambivalent. Many still see it as unequivocally a good thing: mankind’s coming of age, learning to think freely and independently and throwing off the shackles of obedience to received authority. But there is a dissenting view that has gained new momentum in recent years — that far from heralding a new and glorious dawn, the Enlightenment was born of an overweening arrogance, grossly overestimating the power of human reason and technology to solve our ills and inaugurating a crass materialistic era that has destroyed our reverence for the world and eroded our sense of the sacred. Susan Neiman’s latest book, Moral Clarity: A Guide for Grown-up Idealists (The Bodley Head, £20), offers a distinctive reading of the Enlightenment that attempts to recover its authentic ideals and rescue it from some of the caricatures advanced both by its defenders and its critics. An American moral philosopher who has taught at Yale and Tel Aviv and now works in Germany, Neiman is committed to promoting a broadly liberal political agenda and, as a writer, to making philosophical ideas accessible to a wide reading public.
In 2001, Armin Meiwes, a computer technician from Rotenburg in Germany, advertised on the Cannibal Café website for someone to have dinner with. He received numerous replies, but some withdrew when he responded and he considered others not serious enough. Eventually he invited Bernd Brandes for dinner. The plan was that Armin and Bernd would dine on Bernd’s severed penis, to be bitten off at the table for the occasion (this failed and it had to be cut off). Bernd found it too chewy, he said, so Armin put it in a sauté pan, but charred it and fed it to the dog. Later, Armin put Bernd in the bath (to marinate?), gave him alcohol and pills, read a science fiction book for three hours and then stabbed his dinner guest in the throat, hung him upside down on a meat hook in the ceiling, as any good butcher would, and sliced him into manageable portions. The world was agog at the news of the German cannibal and his two trials, at the first of which he was found guilty of manslaughter (there is no law against cannibalism in Germany, and his ‘victim’ had consented, volunteered actually, to being killed and eaten) and sentenced to eight years. He was retried on appeal for first-degree murder on the grounds that Bernd might not have been in a position to consent once his penis had been severed and the blood loss had taken its intellectual toll. Armin Meiwes was given life.
‘‘I know things aren’t going well,’’ begins the narrator of Michael Thomas’s debut novel, bracing himself for a downward journey. Broke and bile-infused, Harvard-educated, now jobless and down on his luck in New York, he is estranged from his wife and three children. It is the eve of his 35th birthday, and he has four days to somehow scrape together $12,000 to keep his family afloat. Calling himself the ‘‘progeny of, to name only a few, an Irish boat caulker, a Cherokee drifter, and a quadroon slave’’ and married to a white woman from an elite Boston family, he provocatively refers to those children as ‘‘the wreckage of miscegenation’’. He spends the novel’s four days in anxious, caffeine-fuelled flight from his past – alcoholism and an abusive childhood. The question is whether he will redeem himself or resign himself to being ‘‘preselected for failure’’. Man Gone Down, as the title suggests, plays with the meanings of descent, and harks back to a rich tradition of American stories of rise and fall, success and failure, of ‘‘making it’’.
Thomas’s gleaming, lucid prose does justice to that tradition. The narrator wonders about the future for his children; one son is brown-skinned, the other fair, and he feels able to predict the arcs of their lives based on people’s reactions to their colours:

X looks exactly like me except he’s white. He has bright blue-gray eyes that at times fade to green. They’re the only part of him that at times looks young, wild, and unfocused, looking at you but spinning everywhere. In the summer he’s blond and bronze-colored. He looks like a tan elf on steroids… His blue eyes somehow signify a grace and virtue and respect that needn’t be earned – privilege – something that his brother will never possess, even if he puts down the paintbrush, the soccer ball, and smiles at people in the same impish way… What will it take to make them not brothers?
Some 28,000 years ago in what is now the British territory of Gibraltar, a group of Neandertals eked out a living along the rocky Mediterranean coast. They were quite possibly the last of their kind. Elsewhere in Europe and western Asia, Neandertals had disappeared thousands of years earlier, after having ruled for more than 200,000 years. The Iberian Peninsula, with its comparatively mild climate and rich array of animals and plants, seems to have been the final stronghold. Soon, however, the Gibraltar population, too, would die out, leaving behind only a smattering of their stone tools and the charred remnants of their campfires.
Ever since the discovery of the first Neandertal fossil in 1856, scientists have debated the place of these bygone humans on the family tree and what became of them. For decades two competing theories have dominated the discourse. One holds that Neandertals were an archaic variant of our own species, Homo sapiens, that evolved into or was assimilated by the anatomically modern European population. The other posits that the Neandertals were a separate species, H. neanderthalensis, that modern humans swiftly extirpated on entering the archaic hominid's territory.
Jaroslaw Anders’s Between Fire and Sleep, a collection of essays that first appeared in American periodicals, especially The New Republic, when Eastern Europe was digging out from under the wreckage of Communism, is the best book of its kind available in English and, quite likely, any other language. Granted, the field of nonscholarly books that synopsize modern Polish literature is admittedly narrow, so such praise may sound slight, a little like Spinal Tap exclaiming that they’re huge in Japan.
Yet Anders is not without serious competition from fellow Polish writers. The most imposing is the latter portion of The History of Polish Literature (1969) by Czeslaw Milosz, with its contentious opinions, occasional errors and imperious language. Milosz describes Wislawa Szymborska–who would receive the Nobel Prize in Literature in 1996, sixteen years after Milosz was awarded it–as a poet who “often leans toward preciosity” and who “is probably at her best where her woman’s sensibility outweighs her existential brand of rationalism.” Though the Polish language has no definite or indefinite articles, summary judgments like these leave no doubt that Milosz understood what it meant to crown his History with The instead of A. Stanislaw Baranczak’s Breathing Under Water and Other East European Essays (1990), written during the poet’s first years of exile in the United States, is suffused with the bewilderment of an Eastern European intellectual trying to make sense of the West, a struggle that is as much Baranczak’s subject as is twentieth-century Polish culture. Last, there are the essays of theater critic Jan Kott, collected in such volumes as The Theater of Essence (1984) and The Memory of the Body (1992), whose interest in what literature says about our lives, whoever we may be, allows him to dispense with the usual arguments for Poland’s relevance.
For generations a staple of Polish addresses to the West (and Western reviews of the same), such arguments have become hopelessly irrelevant, vestiges of what the novelist Witold Gombrowicz described as Poland’s inferiority complex. What lends the aforementioned titles their continued vitality, despite their having been shaped by political circumstances that younger readers cannot remember, is their abiding interest in questions that transcend the headlines and gesture toward aesthetic, metaphysical and ethical quandaries. The nine authors discussed in Between Fire and Sleep thrive on these questions, and most of them received comparable attention from Anders’s predecessors.
Groups of fire ants, chimpanzees, meerkats and other animals engage in lethal conflicts. But we humans are especially good at it, killing ‘outsiders’ on a scale that altered the course of our evolution. Prehistoric burials of large numbers of men and women with smashed skulls, broken forearms and stone points embedded in their bones — as well as ethnographic studies of recent hunters and gatherers — strongly suggest that warfare was a leading cause of death in many ancestral populations. Yet at the same time, humans are unusually cooperative, collaborating with non-kin, for example in hunting and sharing food, on a scale unknown in other animals.
Paradoxically, the grisly evidence of our warlike past may help explain our distinctly cooperative nature.
This distasteful idea is based on the evolution of what my co-authors and I have termed ‘parochial altruism’. Altruism is conferring benefits on others at a cost to oneself; parochialism is favouring ethnic, racial or other insiders over outsiders. Both are commonly observed human behaviours that are well documented in experiments. For example, people from the Wolimbka and nearby Ngenika groups, in the Western Highlands of Papua New Guinea, have no recent history of violence. Yet when asked to divide a pot of money between themselves and another, they give more and keep less for themselves if the other is a member of their own group rather than an outsider.
But parochial altruism is puzzling from an evolutionary perspective because both altruism and parochialism reduce fitness or material well-being compared with what a person would gain were he or she to eschew these behaviours.
Sixty-one years ago Aldous Huxley published his lesser-known masterpiece, Ape and Essence, set in the Los Angeles of 2108. After a nuclear war (in the year 2008) devastates humanity’s ability to reproduce high-fidelity copies of itself, the survivors revert to a sub-human existence. A small group of scientists from New Zealand, spared from the catastrophe, arrives a century later to take notes. The story is presented, in keeping with the Hollywood location, in the form of a film script. On July 24, 2009, a small group of scientists, entrepreneurs, cultural impresarios and journalists that included architects of some of the leading transformative companies of our time (Microsoft, Google, Facebook, PayPal) arrived at the Andaz Hotel on Sunset Boulevard in West Hollywood, to be offered a glimpse, guided by George Church and Craig Venter, of a future far stranger than Mr. Huxley had been able to imagine in 1948.
more from The Edge here (videos of the lectures toward bottom of page).
And having arrived in Vilnius, the “Jerusalem of Lithuania”, with my proclivity for playing the part of the emphatic nymph Echo everywhere I went, I was anxious to discover something in my family’s history that would secure for me a place in this city’s dramatic Jewish past. Unfortunately (or fortunately), I was born into a family that had been assimilated for at least three generations. At school we became well-versed in Ancient Greek mythology, but we never learned about Moses or Jesus. I never heard a word of Yiddish or Hebrew spoken at home, never went to synagogue, never saw the Bible. None of my close relatives perished in the Holocaust or in the Gulag. My Jewish origin was stated in my Soviet internal passport – a kind of ID card in Russia – but I was, evidently, too much of a conformist and therefore too reluctant to dig deep enough in search of my Jewish roots, for fear of discovering that I was not like everyone else. Apart from an occasional exchange of nastiness in the playground, common amongst adolescent boys in every country, I had never heard an anti-Semitic remark directed at me personally, nor had I ever in my life and my career suffered from an anti-Semitic deed or gesture on the part of any organisation or institution in the Soviet Union. In 1975, when I decided to apply for an exit visa to emigrate to Israel, the officials were trying, in many cases quite sincerely, to dissuade applicants from leaving Mother Russia. All in all, I left my Soviet fatherland with no regrets but also with no feelings of hatred: the Moscow of that era was for me the most entertaining prison in the world – I enjoyed staying there; but I also wanted to see what was happening outside the prison gates. The only way available to me (being of no propaganda value to the Soviet authorities) was to emigrate. Since then I’ve written a few novels explaining, quite successfully, why people like me succumbed to an urge as mad as leaving their own country for good.
Now, I can only say that the urge to get out was stronger than my sense of attachment.
Raymond Carver wrote several drafts of each of his poems and short stories, “cutting everything down to the marrow, not just to the bone”. His stories, in particular, bear the traces of unending polish, of “putting words in and taking words out”. In the lives of most of Carver’s characters, history refers to a time when they were better or worse off, happier or unhappier, drinking more or less, than they are now. The narrative method of his early work was situated squarely in the tradition derived from Ernest Hemingway, deploying plain vocabulary, short sentences, the repetition of certain words and phrases, and above all the concealment of essential facts so as to implant a timed explosive in the reader’s imagination. Carver was Hemingway (most of whose fiction is located abroad) transposed to the blue-collar American margins, populated by men and women who seldom think about the world beyond – a land of bad marriages, cramped living rooms, truculent children, and unharnessed addictions of the old-fashioned sort. The pleasure of reading Carver, who died in 1988 at the age of fifty, derives partly from his bizarre scenarios and from absurdist dialogue which yet retains the quality of overheard conversation; equally, it comes from pace and phrasing, even paragraphing and punctuation, which the author controls with what are practically musical skills.
more from James Campbell at the TLS here. (My own contribution to the issue here.)
By the time you realize just what a dangerous writer Nick Laird is, it's too late to break away. This new novel from Zadie Smith's husband comes on all wit and chumminess, a buddy story about two London roommates in love with the same woman. But in the familiar surroundings of romantic comedy, Laird is busy plotting something far more unsettling. Glover's Mistake turns imperceptibly toward the poisonous effects of bitterness, and it'll leave you feeling wary all day, as though you'd lain down with Nick Hornby and woken up beside Muriel Spark.
The story opens at a posh art show, a multimedia exhibition of style and pretension that makes a ripe target for Laird's exquisite satire. With a few graceful lines, he sketches out a privileged world where “money grants its owners a kind of armour.” The gallery's central piece is a giant sheet of black paper called “Night Sky (Ambiguous Heaven),” which sells for $950,000. But the real object of Laird's attention is a self-conscious young man from the opposite end of this social scale: David Pinner, a disaffected English teacher who feels intimidated even while seething with scorn. He's come to the gallery in hopes of reintroducing himself to Ruth Marks, a famous feminist artist “acclimatized to prosperity at an early age.” She was a professor of his a dozen years ago, and the moment he sees her again, “he could imagine how she might unmoor a man's existence.” With a bit of expertly tailored flattery, David manages to persuade Ruth to consider a collaborative art project, and during their subsequent meetings he fancies he might have a shot at a more romantic relationship.
When it comes to churning up the world's oceans, Mastigias jellyfish are quite the little blenders. New research suggests that large groups of the small, placid creatures–along with all of the sea's other motile beings–can mix as much heat, gases, and nutrients through the water column as the winds and tides do.
On the surface, the sea is a roiling mass. But dip 100 meters below and the water is calm. How, then, do the world's oceans distribute heat and food throughout their depths? Currents driven by salinity and temperature differences can transport a lot. But another part of the answer comes from an idea conceived by the grandson of Charles Darwin. About a half-century ago, the famed naturalist's descendant–also named Charles–proposed that a body moving through a fluid would tend to drag some of that fluid with it. Applied to the oceans, the hypothesis means that the churning action created when aquatic creatures swim–even the smallest and slowest–might stir a significant amount of water.
Most scientists have remained skeptical, however, arguing that small marine creatures in particular could not overcome water's viscosity enough to circulate much of anything. Now, it turns out, the idea first posed by Darwin's grandson may be right.
Where am I? I awaken and can’t find my things. Have I lost the keys that let me fly? I can’t find myself in my books nor do I see my own mirror nor the aching table of the blind papers, nor the eternal voices nor my earthly juices. I do not feel myself, but neither have I died. I don’t find my ghosts nor do I see my geography. Now I only grasp unheard-of avenues and an aimless street where I get lost without my living angels. I awaken and the rapture of my dreams hurts me.
by Jose Luis Diaz Granados; translated by Nicolas Suescún
from La Fiesta perpetua y otros Poemas; published by Golpe de Dados, Bogotá
An invasion led by artificially intelligent machines. Conscious computers. A smartphone virus so smart that it can start mimicking you. You might think that such scenarios are laughably futuristic, but some of the world's leading artificial intelligence (AI) researchers are concerned enough about the potential impact of advances in AI that they have been discussing the risks over the past year. Now they have revealed their conclusions.
Until now, research in artificial intelligence has been mainly occupied by myriad basic challenges that have turned out to be very complex, such as teaching machines to distinguish between everyday objects. Human-level artificial intelligence or self-evolving machines were seen as long-term, abstract goals not yet ready for serious consideration.
Now, for the first time, a panel of 25 AI scientists, roboticists, and ethical and legal scholars has been convened to address these issues, under the auspices of the Association for the Advancement of Artificial Intelligence (AAAI) in Menlo Park, California. It looked at the feasibility and ramifications of seemingly far-fetched ideas, such as the possibility of the internet becoming self-aware.
Graeme Wood reviews Neil MacFarquhar's book in the Barnes & Noble Review:
MacFarquhar's title, The Media Relations Department of Hizbollah Wishes You a Happy Birthday, refers to a cordial August 2003 email from Haidar Dikmak, a flack for the militant Shiite political party in Lebanon. The book sustains the ironic, half-menacing tone of the title, and in its progress from one country to the next, it focuses on issues and personalities of interest to Arabs themselves, rather than the issues of narrow interest to the United States. As one government official notes explicitly, foreign reporters tend to arrive and raid the country for Hizbollah stories.
But to MacFarquhar and to nearly all Arabs, Lebanon is a country best known not for war but for entertainment and glamour — a sort of semi-debauched Middle Eastern Hollywood. (The Lebanese Broadcasting Corporation, whose music videos and singing temptresses entertain patrons in crowded lunch spots all over the Middle East, is known as Lubnaniyaat Bidun Culottes, or Lebanese Girls Without Underwear.) Fairouz, the beloved Lebanese hit singer, often goes unmentioned in books like this, an omission that would perhaps be comparable to a book about modern Iceland that never mentioned Björk. MacFarquhar awards Fairouz several pages that explain her fans' ardor in illuminating detail.