The caricature of a philosopher is of an otherworldly professor sitting in a comfy armchair in an Oxbridge college, speculating on the nature of reality using only his or her intellect and a few books. This has some basis in reality. Chemistry requires test tubes; history needs documents. In recent years, the main tool of the philosopher has been grey matter. The subject’s evolution can be painfully slow, tiptoeing forward from footnote to footnote. But not always. Every so often a new movement overturns the orthodoxies of received opinion. We might just be entering one of those phases.
A dynamic new school of thought is emerging that wants to kick down the walls of recent philosophy and place experimentation back at its centre. It has a name to delight an advertising executive: x-phi. It has blogs and books devoted to it, and boasts an expanding body of researchers in elite universities. It even has an icon: an armchair in flames. If philosophy ever can be, x-phi is trendy. But, increasingly, it is also attracting hostility.
“We the people, in order to form a more perfect union.”
Two hundred and twenty-one years ago, in a hall that still stands across the street, a group of men gathered and, with these simple words, launched America’s improbable experiment in democracy. Farmers and scholars; statesmen and patriots who had traveled across an ocean to escape tyranny and persecution finally made real their declaration of independence at a Philadelphia convention that lasted through the spring of 1787.
The document they produced was eventually signed but ultimately unfinished. It was stained by this nation’s original sin of slavery, a question that divided the colonies and brought the convention to a stalemate until the founders chose to allow the slave trade to continue for at least twenty more years, and to leave any final resolution to future generations.
Of course, the answer to the slavery question was already embedded within our Constitution – a Constitution that had at its very core the ideal of equal citizenship under the law; a Constitution that promised its people liberty, and justice, and a union that could be and should be perfected over time.
And yet words on a parchment would not be enough to deliver slaves from bondage, or provide men and women of every color and creed their full rights and obligations as citizens of the United States. What would be needed were Americans in successive generations who were willing to do their part – through protests and struggle, on the streets and in the courts, through a civil war and civil disobedience and always at great risk – to narrow that gap between the promise of our ideals and the reality of their time.
This was one of the tasks we set forth at the beginning of this campaign – to continue the long march of those who came before us, a march for a more just, more equal, more free, more caring and more prosperous America. I chose to run for the presidency at this moment in history because I believe deeply that we cannot solve the challenges of our time unless we solve them together – unless we perfect our union by understanding that we may have different stories, but we hold common hopes; that we may not look the same and we may not have come from the same place, but we all want to move in the same direction – towards a better future for our children and our grandchildren.
This belief comes from my unyielding faith in the decency and generosity of the American people. But it also comes from my own American story.
I am the son of a black man from Kenya and a white woman from Kansas. I was raised with the help of a white grandfather who survived a Depression to serve in Patton’s Army during World War II and a white grandmother who worked on a bomber assembly line at Fort Leavenworth while he was overseas. I’ve gone to some of the best schools in America and lived in one of the world’s poorest nations. I am married to a black American who carries within her the blood of slaves and slaveowners – an inheritance we pass on to our two precious daughters. I have brothers, sisters, nieces, nephews, uncles and cousins, of every race and every hue, scattered across three continents, and for as long as I live, I will never forget that in no other country on Earth is my story even possible.
It’s a story that hasn’t made me the most conventional candidate. But it is a story that has seared into my genetic makeup the idea that this nation is more than the sum of its parts – that out of many, we are truly one.
Research has shown that words are stored in our memories not as isolated entities but as part of a network of related words. This explains why seeing or hearing a word activates words related to it through prior experiences. In trying to understand these connections, scientists visualize a map of links among words called the mental lexicon that shows how words in a vocabulary are interconnected through other words.
However, it’s not clear just how this word association network works. For instance, does word association spread like a wave through a fixed network, weakening with conceptual distance, as suggested by the “Spreading Activation” model? Or does a word activate every other associated word simultaneously, as suggested in a model called “Spooky Activation at a Distance”?
Although these two explanations appear to be mutually exclusive, a recent study reveals a connection between the explanations by making one novel assumption: that words can become entangled in the human mental lexicon. In the study, researchers from the Queensland University of Technology (QUT) in Australia and the University of South Florida in the US have investigated the quantum nature of word associations and presented a simplified quantum model of a mental lexicon.
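The “wave” picture behind the Spreading Activation model can be illustrated with a minimal sketch: activation starts at a cue word and weakens by a fixed factor with each associative hop through the network. This is an illustration only, not the researchers’ quantum model, and the tiny word network, decay rate, and cutoff threshold below are all hypothetical.

```python
from collections import defaultdict

def spread_activation(graph, source, decay=0.5, threshold=0.05):
    """Propagate activation outward from a source word, weakening
    with each hop (a toy version of Spreading Activation)."""
    activation = {source: 1.0}
    frontier = [source]
    while frontier:
        next_frontier = []
        for word in frontier:
            for neighbor in graph[word]:
                boost = activation[word] * decay
                # Keep only the strongest activation reaching each word,
                # and stop spreading once it falls below the threshold.
                if boost > threshold and boost > activation.get(neighbor, 0.0):
                    activation[neighbor] = boost
                    next_frontier.append(neighbor)
        frontier = next_frontier
    return activation

# A tiny hypothetical association network
lexicon = defaultdict(list, {
    "doctor": ["nurse", "hospital"],
    "nurse": ["hospital", "patient"],
    "hospital": ["patient"],
})

# "doctor" is fully active; its direct associates get 0.5,
# and the two-hop associate "patient" gets 0.25.
print(spread_activation(lexicon, "doctor"))
```

The alternative “Spooky Activation at a Distance” picture would instead activate every associate of the cue word simultaneously, without this hop-by-hop decay.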
The power of The Reader, however, is that it is psychologically believable. Schlink’s book is written in short chapters; each offers at least one telling psychological insight about dreams, about memory, about the disconnect between what I do and what I am. Schlink subtly raises the vexing and intriguing problem of responsibility and agency early in the novel. Michael says:
I think, I reach a conclusion, I turn the conclusion into a decision, and then I discover that acting on the decision is something else entirely, and that doing so may proceed from the decision, but then again it may not. Often enough in my life I have done things I had not decided to do.
One of the great virtues of literature is that it conveys a kind of truth about the human condition, and that truth is what Schlink gives us.
The line between the intention and the action is deeply problematic when we think about our own lives. Explaining radically evil behavior in others, we would like to believe the connection is clearer; the evildoers are monstrous people. Hannah Arendt refuted that claim, inventing the phrase the “banality of evil” in Eichmann in Jerusalem. Adolf Eichmann, the man directly responsible for the destruction of European Jewry, was portrayed in Israel at his trial as a monster. But Arendt could find no connection between who he was and the evil he did. Her account might suggest Michel Foucault’s general thesis that evil has gone out of our world and sickness has come into it. But it should be noted that Arendt also concluded that Eichmann was not sick. She found nothing, not even madness, to connect the person and his heinous acts.
Simultaneously inspiring and heartbreaking, “Afghanistan: Hidden Treasures from the National Museum, Kabul,” opens Sunday at the Museum of Fine Arts, Houston.
Beautiful, priceless works of art illuminate a rich, historic mosaic of cultures, civilizations and trade along the fabled Silk Road of Central Asia — a far cry from the war-torn, Taliban-ravaged Afghanistan of today.
Many of the objects on view are literally “hidden treasures.” They were thought to have been lost, stolen or destroyed during the country’s recent years of strife and turmoil.
In 1988, as the 10-year Soviet occupation was ending and civil violence was escalating, museum staff were able to spirit away several boxes of the most valuable treasures, including the “Bactrian Hoard,” more than 20,000 mostly gold artifacts that few had ever seen. No one was sure how or even if the treasures had survived.
It was a joyous occasion when, in 2003, Afghan President Hamid Karzai announced that the treasures were intact and had been located.
Fredrik Hiebert, the National Geographic curator of the show, who is a specialist in ancient Silk Road sites, said he was invited to take part, several months later, in the opening of the boxes. “I gasped,” he said. “I knew exactly what they were, but I never thought I’d get to see them.”
John Updike filled his 50 years of writing with probably seven or eight normal writing careers. He did so by fusing two artistic virtues that rarely meet in the same person: a frisky, easy, improvisational energy and a rigorous, workaday discipline. He was both the ant and the grasshopper, accountant and poet, Trollope and Rimbaud. His solution to the daily crisis of inspiration was simply not to have it: He wrote steadily, with very little angst, three pages a day, five days a week. Along the way, he mastered pretty much every genre humans have seen fit to invent, including such comparatively rare forms as the self-interview via a fictional alter ego, the book review in the style of the book under review, and the sonnet about one’s own feces (“a flawless coil, / unbroken, in the bowl”). The resulting body of work is so large and thoroughly lauded, the achievements by now so familiar—the casual erudition, the freakish powers of micro-observation, the pioneering description of once-neglected middle-class hobbies such as adultery and divorce—that it can be hard, today, to see any of it fresh. His productivity itself was intimidating: that never-ending series of series (Bech, Rabbit, Eastwick) and collection of collections. The prospect of dipping into his work sometimes feels like going for a day hike on Mount Everest.
Flannery. She liked to drink Coca-Cola mixed with coffee. She gave her mother, Regina, a mule for Mother’s Day. She went to bed at 9 and said she was always glad to get there. After Kennedy’s assassination she said: “I am sad about the president. But I like the new one.” As a child she sewed outfits for her chickens and wanted to be a cartoonist. She reluctantly traveled to Lourdes and claimed she prayed for the novel she was working on, “The Violent Bear It Away,” which she referred to as Opus Nauseous. Rust Hills, the fiction editor of Esquire, put her in the middle of the “red-hot center” in his Literary Establishment chart of 1963. Elizabeth Hardwick took her to dinner at Mary McCarthy’s apartment, where McCarthy conceded that the communion wafer was a symbol of the Holy Ghost and a pretty good one, whereupon Flannery made her famous reply, “Well, if it’s a symbol, to hell with it.”
Literature is supposed to be a serious, solitary profession. Then why were William Wordsworth, his sister Dorothy and their best friend Samuel Coleridge having so much fun together in the English Lake District in 1798? The three of them were inseparable, wandering the countryside together, Coleridge often high on opium and the Wordsworths tripping on nature. In “The Prelude,” Wordsworth would later recollect that they “wantoned in wild poesy.” They walked for miles every day, talking to beggars, communing with birds and flowers, and lying in ditches staring into the sky. The two men produced an amazing body of work in this annus mirabilis that they published jointly in “Lyrical Ballads.” Wordsworth’s ballads about the lives of the rural poor and Coleridge’s visionary poems such as “The Rime of the Ancient Mariner” enlarged the discourse and transformed the aesthetic and language of English poetry.
“A tense and peculiar family, the Oedipuses,” a wag once observed. Well, when it comes to dysfunction, the Wittgensteins of Vienna could give the Oedipuses a run for their money. The tyrannical family patriarch was Karl Wittgenstein (1847-1913), a steel, banking and arms magnate. He and his timorous wife, Leopoldine, brought nine children into the world. Of the five boys, three certainly or probably committed suicide and two were plagued by suicidal impulses throughout their lives. Of the three daughters who survived into adulthood, two got married; both husbands ended up insane and one died by his own hand. Even by the morbid standards of late Hapsburg Vienna these are impressive numbers. But tense and peculiar as the Wittgensteins were, the family also had a strain of genius. Of the two sons who didn’t kill themselves, one, Paul (1887-1961), managed to become an internationally celebrated concert pianist despite the loss of his right arm in World War I. The other, Ludwig (1889-1951), was the greatest philosopher of the 20th century.
Who better to chronicle such a clan than Alexander Waugh, himself the scion of a distinguished and colorful family? In his previous book, “Fathers and Sons,” Waugh wrote with a fine comic touch about his grandfather Evelyn and his father, Auberon. Here he moves from a farcical to a tragic vein. Yet the Wittgensteins, for all their Sturm und Drang, can be as funny as the Waughs. We are told, for example, that the first spoken word of one of the Wittgenstein boys was “Oedipus.”
The author brings another advantage to his subject: he is a music critic (and sometime composer), and the Wittgensteins were the musical family par excellence.
The boy does not have a name, but he is not unknown. Smithsonian scientists reconstructed his story from a skeleton, found in Anne Arundel County, Maryland, buried underneath a layer of fireplace ash, bottle and ceramic fragments, and animal bones.
Resting on top of the rib cage was the milk pan used to dig the grave. “It's obviously some sort of clandestine burial,” says Kari Bruwelheide, who studied the body. “We call it a colonial cold case.”
Bruwelheide is an assistant to forensic anthropologist Douglas Owsley. After more than a decade of cases that span the centuries, the duo has curated “Written in Bone: Forensic Files of the 17th-Century Chesapeake,” on view at the Smithsonian National Museum of Natural History through February 2011. The exhibit shows visitors how forensic anthropologists analyze bones and artifacts to crack historical mysteries. “The public thinks they know a lot about it, but their knowledge is based on shows like ‘Bones' and ‘CSI,' so they get a lot of misinformation,” Owsley says. “This is an opportunity for us to show the real thing.”
Talk about judges making up law out of whole cloth: that, pretty much, is what the U.S. Supreme Court has just done. In Pleasant Grove City v. Summum, the Court, by a unanimous vote, concluded that a somewhat offbeat religious group has no right to place a monument touting what it calls the Seven Aphorisms on public land that already features a monument to the Ten Commandments.
A unanimous decision suggests that Summum was on shaky legal ground to begin with. But the opinion by Justice Samuel Alito, endorsed by the Court's other conservative members, relied on reasoning that drew strong objections from some of its more liberal members. It's not complicated, Alito argued. The government, like any individual (or, for that matter, corporation), has the right to free speech. If it chooses to say that one religion's teachings should be represented in public and another's should not be, telling it that such an act constitutes discrimination in favor of one religion and against another is tantamount to denying the government its right to say whatever it wants.
Popular fiction is supposed to be essentially story-driven; the proof that it works is the sound of the pages turning. But a few of the great pop writers were stylists, above all, and their success is measured by a different sound, that of the snort of appreciation followed by a phrase read out loud to a half-sleeping spouse in bed at night. The pages stop turning while we admire the sentences. Few readers of Raymond Chandler can recall, or even follow, the plot of “Farewell, My Lovely”—Chandler himself couldn’t always follow his plots. What they remember is that Moose Malloy on a Los Angeles street was as inconspicuous as a tarantula on a slice of angel-food cake. Of all the pop formalists, the purest and strangest may be Damon Runyon, the New York storyteller, newspaperman, and sportswriter who wrote for the Hearst press for more than thirty years, inspired a couple of Capra movies, and died in 1946. Runyon’s appeal, though it has to be fished out like raisins from the dreary bran of his O. Henry-style plotting, came from his mastery of an American idiom. We read Runyon not for the stories but for the slang, half found on Broadway in the nineteen-twenties and thirties and half cooked up in his own head. We read Runyon for sentences like this: “If I have all the tears that are shed on Broadway by guys in love, I will have enough salt water to start an opposition ocean to the Atlantic and Pacific, with enough left over to run the Great Salt Lake out of business.”
“I collected the instruments of life around me, that I might infuse a spark of being into the lifeless thing that lay at my feet. … By the glimmer of the half-extinguished light, I saw the dull yellow eye of the creature open; it breathed hard, and a convulsive motion agitated its limbs.”1 Thus the magic moment in Mary Shelley’s Frankenstein (1818) when the creature is brought to life by what is usually considered (though Shelley does not say so outright) the infusion of an electric “spark of being” into a constructed body. Shelley’s story emerged amid heated disputes among London physicians over the nature of life itself. Against the view of mechanists and materialists, who argued life could be reduced to the complex organization of physiology, vitalists asserted that some other force or spirit must be superadded to bodies to achieve living animation. Vitalist John Abernethy thus declared, “The phaenomena of electricity and of life correspond.”2 To support their case, vitalists often pointed to the “animal electricity” described by Bolognese physician Luigi Galvani, who had seen the legs of a dissected frog twitch when touched with a metal scalpel in the presence of electricity. Another Italian, Alessandro Volta, rejected Galvani’s claim that such animal electricity was a distinctive form of electricity, and simulated it by bringing different metals into contact in moisture, thus contributing to his invention of the “voltaic pile” or battery. Volta’s experiments troubled vitalist accounts, but dramatic experiments supported them.
I wonder what Syd Barrett was doing on July 21, 1990, whilst his former Pink Floyd bandmate Roger Waters was cranking the bombast to 11 in Berlin by supersizing that already bloated paean to bilious self-pity known as The Wall and conflating it with the decommissioning — six months prior — of the “anti-Fascist protective rampart” that had divided the German capital and stood as a symbol of Yankee/Soviet stalemate for the previous quarter century. Probably painting. After his death in 2006, it was revealed that Syd had spent much of his three-decade withdrawal from show business making art, which he sometimes photographed before painting over or destroying. The question that nags me is this: Which is the greater creative act, micromanaging a spectacular but rehashed postmodern Gesamtkunstwerk for half a million people (and millions more via live satellite TV — and all ostensibly for charity!), or daubing away in a Cambridge cellar on a canvas that will probably never see the light of day?
It's hard not to love Amazon's new e-book reader. For starters, it's gorgeous. Unlike its bulky predecessor, the redesigned $359 Kindle, which came out this week, is light and thin, and seems to disappear in your hands. If you think there's no way you could ever get used to curling up with an electronic reader, you haven't given the Kindle a chance. Load up a good book and you'll soon forget you're reading plastic rather than paper. You'll also wonder how you ever did without it. The Kindle makes buying, storing, and organizing your favorite books and magazines effortless. You can take your entire library with you wherever you go and switch from reading the latest New Yorker to the latest best-seller without rolling out of bed. In my few days using it, I was won over: The Kindle is the future of publishing.
And that's what scares me. Amazon's reader is a brilliant device that shanghais book buyers and the book industry into accepting a radically diminished marketplace for published works. If the Kindle succeeds on its current terms, and all signs suggest it'll be a blockbuster (thanks Oprah!), Amazon will make a bundle. But everyone else with a stake in a vibrant book industry—authors, publishers, libraries, chain bookstores, indie bookstores, and, not least, readers—stands to lose out.
The first stage of the Harlem Renaissance started in the late 1910s. 1917 saw the premiere of Three Plays for a Negro Theatre. These plays, written by white playwright Ridgely Torrence, featured black actors conveying complex human emotions and yearnings. They rejected the stereotypes of the blackface and minstrel show traditions. James Weldon Johnson in 1917 called the premieres of these plays “the most important single event in the entire history of the Negro in the American Theatre.” Another landmark came in 1919, when Claude McKay published his militant sonnet If We Must Die. Although the poem never alluded to race, to black readers it sounded a note of defiance in the face of racism and the nationwide race riots and lynchings then taking place. By the end of the First World War, the fiction of James Weldon Johnson and the poetry of Claude McKay were describing the reality of contemporary black life in America.
In the early 1920s, a number of literary works signaled the new creative energy in African-American literature. Claude McKay's volume of poetry, Harlem Shadows (1922), became one of the first works by a black writer to be published by a mainstream national publisher. Cane (1923), by Jean Toomer, was an experimental novel that combined poetry and prose in expressing the life of American blacks in the rural South and urban North. There Is Confusion (1924), the first novel by writer and editor Jessie Fauset, depicted middle-class life among black Americans from a woman's perspective.
Another landmark was the publication of Nigger Heaven (1926) by white novelist Carl Van Vechten. The book was a spectacularly popular exposé of Harlem life. Although the book offended some members of the black community, its coverage of both the elite and the baser sides of Harlem helped create a Negro vogue that drew thousands of sophisticated New Yorkers, black and white, to Harlem's exciting nightlife. It also stimulated a national market for African-American literature and music.
Some say the secret to losing weight is forgoing greasy, fatty foods like French fries; others swear that shunning carbs in favor of all-protein grub is key. Many popular weight loss plans recommend that dieters consume specific ratios of fat, protein and carbohydrates. (The Zone diet, for instance, prescribes 40 percent carbs, preferably complex carbs like veggies and whole grains, 30 percent protein and 30 percent fat). But a study published today in The New England Journal of Medicine suggests that the smartest way to lose weight is to eat heart healthy foods (think: Mediterranean diet—lots of veggies and fish, limited amounts of red meat) and reduce your caloric intake.
“Reduced calorie, heart-healthy diets can help you lose weight, regardless of the proportions of fat, protein and carbohydrates,” says study co-author Catherine Loria, a nutritional epidemiologist at the National Heart, Lung and Blood Institute in Bethesda, Md. The researchers, led by Frank Sacks, a professor of cardiovascular disease prevention at the Harvard School of Public Health in Boston, focused their study on 811 overweight and obese adults ages 30 to 70 in Boston and Baton Rouge, La. (“Overweight” includes those with a body mass index (BMI) between 25 and 29.9; people are considered obese if they have a BMI over 30. The BMI is a standard index used to gauge body fat based on a person's height and weight.)
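For readers unfamiliar with the index, BMI is simply weight in kilograms divided by the square of height in meters. A minimal sketch applying the cutoffs cited above (the example weights and heights are hypothetical):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

def category(bmi_value):
    """Classify using the cutoffs cited in the study:
    25-29.9 is overweight, 30 and above is obese."""
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    return "not overweight"

# An 85 kg person standing 1.75 m tall has a BMI of about 27.8,
# which falls in the study's "overweight" band.
print(round(bmi(85, 1.75), 1))
print(category(bmi(85, 1.75)))
```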
Last spring, as David Byrne was finishing his first album with Brian Eno in 28 years, Everything That Happens Will Happen Today, he came to a bittersweet realization: “It felt wonderful singing the songs, and I knew if I didn’t tour, then the recording process would be the last time for quite a while that I’d enjoy performing them—except in the shower.” The tour, subtitled “Songs of David Byrne and Brian Eno,” evolved from a set of Everything songs into a production that features selections from all of the duo’s collaborations (including the monumental My Life in the Bush of Ghosts and some Talking Heads songs). “I realized I could tie the present to the past with the thematic thread of Brian’s involvement,” says Byrne, who noticed, for instance, that “Poor Boy” (new) and “Crosseyed and Painless” (old) were both structured around just one or two chords. Sixty-nine performances in seven countries and only one wardrobe malfunction later (“The audience got so enthusiastic all of a sudden”), the show stops at Radio City this Friday and Saturday. Byrne deconstructed the intricate production (including seven musicians, three dancers, choreography, and costumes) while on a ferry to New Zealand’s Waiheke Island—ever-present bike at hand.
1. Mauro Refosco (percussion and guitar): The show is a true collaboration. “Mauro suggested that [a dancer] give the drummer his cutoff cue at the end of ‘Life During Wartime.’ A simple but brilliant idea.”
2. Graham Hawthorne (drums): Byrne has worked with the rhythm section for ten years. Hawthorne “also does programming and wine recommendations.”
3. Mark Degliantoni (keyboards): A former member of Soul Coughing, with whom Byrne did some shows “back in the day.”