Thursday, April 23, 2015
Federico Fellini’s La Dolce Vita was a box-office triumph in Italy in 1960. It took $1.5 million in three months — more than Gone With the Wind had. ‘It was the making of me,’ said Fellini. It was also the making of Marcello Mastroianni as the screen idol with a curiously impotent sex appeal. No other film captured so memorably the flashbulb glitz of Italy’s postwar ‘economic miracle’ and its consumer boom of Fiat 500s and Gaggia espresso machines. Unsurprisingly, the Vatican objected to the scene where Mastroianni makes love to the Swedish diva Anita Ekberg (who died earlier this year at the age of 83) in the waters of the Trevi fountain. Sixties Rome became a fantasy of the erotic ‘sweet life’ thanks in part to that scene.
After La Dolce Vita, Fellini found himself at a creative loss and hung a sign above his desk: ‘NOW WHAT?’ Sophia Loren’s movie-mogul husband, Carlo Ponti, persuaded him to contribute to Boccaccio ’70, a collection of lewd short films inspired by the medieval Decameron, but it was a critical failure. At the end of 1961, still in ‘creative limbo’, Fellini began work on 8½, his eighth and a half film; it was to be a signpost in his development as a magician-director.
Modern psychiatry seems determined to rob madness of its meanings, insisting that its depredations can be reduced to biology and nothing but biology. One must doubt it. The social and cultural dimensions of mental disorders, so indispensable a part of the story of madness and civilization over the centuries, are unlikely to melt away, or to prove no more than an epiphenomenal feature of so universal a feature of human existence. Madness indeed has its meanings, elusive and evanescent as our attempts to capture them have been.
Western culture throughout its long and tangled history provides us with a rich array of images, a remarkable set of windows into both popular and latterly professional beliefs about insanity. The sacred books of the Judeo-Christian tradition are shot through with stories of madness caused by possession by devils or divine displeasure. From Saul, the first king of the Israelites (made mad by Yahweh for failing to carry out to the letter the Lord’s command to slay every man, woman, and child of the Amalekite tribe, and all their animals, too), to the man in the country of the Gadarenes “with an unclean spirit” (maddened, naked, and violent, whose demons Christ casts out and causes to enter a herd of swine, who forthwith rush over a cliff into the sea to drown), here are stories recited for centuries by believers, and often transformed into pictorial form. None proved more fascinating than the story of Nebuchadnezzar, the mighty king of Babylon, the man who captured Jerusalem and destroyed its Temple, carrying the Jews off into captivity all apparently without incurring divine wrath.
For a long time, the history of the First World War has been understood via the symbolic transition from Brooke to Wilfred Owen, from posh idiot nationalist to heroic witness. That simple narrative obscures the extent to which Owen worshipped Brooke in the early days and just how long Brooke remained the war’s most famous poet. Until the nineteen-sixties, when the “left-wing myths” about the war gained purchase, Brooke’s sonnets and his image still seemed to represent something true and comforting. In the mid-nineties, an anthology of the nation’s hundred favorite poems included three each by Brooke and Owen, although only one of Brooke’s, “The Soldier,” was a war poem. At the hundredth anniversary of his death, as more letters and lovers are revealed, Brooke is making more headlines as a famous playboy than as a poet or patriot.
It’s impossible to know what Brooke might have written if he had seen what the other war poets saw, or what he might have become if he’d survived his own golden age. In his small body of work, the war sonnets are anomalous—his verse is usually more playful, less po-faced.
The power of invisibility has long fascinated man and inspired the works of many great authors and philosophers. In a study from Sweden's Karolinska Institutet, a team of neuroscientists now reports a perceptual illusion of having an invisible body, and shows that the feeling of invisibility changes our physical stress response in challenging social situations. The history of literature features many well-known narrations of invisibility and its effect on the human mind, such as the myth of Gyges' ring in Plato's dialogue The Republic and the science fiction novel The Invisible Man by H.G. Wells. Recent advances in materials science have shown that invisibility cloaking of large-scale objects, such as a human body, might be possible in the not-so-distant future; however, it remains unknown how invisibility would affect our brain and body perception. In an article in the journal Scientific Reports, the researchers describe a perceptual illusion of having an invisible body. The experiment involves the participant standing up and wearing a set of head-mounted displays. The participant is then asked to look down at her body, but instead of her real body she sees empty space. To evoke the feeling of having an invisible body, the scientist touches the participant's body in various locations with a large paintbrush while, with another paintbrush held in the other hand, exactly imitating the movements in mid-air in full view of the participant.
"Within less than a minute, the majority of the participants started to transfer the sensation of touch to the portion of empty space where they saw the paintbrush move and experienced an invisible body in that position," says Arvid Guterstam, lead author of the present study.
WikiLeaks founder Julian Assange investigates the book behind Snowden, Oliver Stone's forthcoming film starring Joseph Gordon-Levitt, Shailene Woodley, Nicolas Cage, Scott Eastwood and Zachary Quinto. According to leaked Sony emails, movie rights for the book were bought for $700,000.
Julian Assange in Newsweek:
The Snowden Files: The Inside Story of the World's Most Wanted Man (Guardian/Faber & Faber, 2014) by Luke Harding is a hack job in the purest sense of the term. Pieced together from secondary sources and written with minimal additional research to be the first to market, the book's thrifty origins are hard to miss.
The Guardian is a curiously inward-looking beast. If any other institution tried to market its own experience of its own work nearly as persistently as The Guardian, it would surely be called out for institutional narcissism. But because The Guardian is an embarrassingly central institution within the moribund "left-of-center" wing of the U.K. establishment, everyone holds their tongue.
In recent years, we have seen The Guardian consult itself into cinematic history—in the Jason Bourne films and others—as a hip, ultra-modern, intensely British newspaper with a progressive edge, a charmingly befuddled giant of investigative journalism with a cast-iron spine.
The Snowden Files positions The Guardian as central to the Edward Snowden affair, elbowing out more significant players like Glenn Greenwald and Laura Poitras for Guardian stablemates, often with remarkably bad grace.
"Disputatious gay" Glenn Greenwald's distress at the U.K.'s detention of his husband, David Miranda, is described as "emotional" and "over-the-top." My WikiLeaks colleague Sarah Harrison—who helped rescue Snowden from Hong Kong—is dismissed as a "would-be journalist."
I am referred to as the "self-styled editor of WikiLeaks." In other words, the editor of WikiLeaks. This is about as subtle as Harding's withering asides get. You could use this kind of thing on anyone.
Read the full piece here.
Clown in the Moon
My tears are like the quiet drift
Of petals from some magic rose;
And all my grief flows from the rift
Of unremembered skies and snows.
I think, that if I touched the earth,
It would crumble;
It is so sad and beautiful,
So tremulously like a dream.
by Dylan Thomas
Wednesday, April 22, 2015
Scott McLemee on Michael Eric Dyson's profile of Cornel West in TNR:
The mutual-admiration arrangement lasted until sometime near the end of the first Obama administration, when West turned up the heat on his criticisms of the president as (among other things) a “black mascot of Wall Street oligarchs” and “the head of the American killing machine.” A number of black liberals took issue with West’s hard left turn. But it was Dyson’s defenses of the president that seemed especially to rankle West. In August 2013, West singled out Dyson by name as one of the people “who’ve really prostituted themselves intellectually in a very ugly and vicious way.”
Similar pleasantries followed. Dyson’s response was muted until earlier this month, when he made some not very subtle allusions to West at a meeting of the National Action Network, the civil rights organization founded by Al Sharpton. “Be honest and humble in genuine terms,” Dyson said, “not the public performance of humility masquerading a huge ego. No amount of hair can cover that.” His more expansive remarks in print run to more than 9,000 words, accompanied by a drawing in which West appears to have a very bad case of dandruff.
One assessment now making the rounds is that it’s a lamentable case of the white establishment turning two formidable African-American minds against one another when otherwise they might be uniting against all that merits ruthless critique. I doubt a more inane judgment is possible. A pretty thoroughgoing ignorance of African-American intellectual history would be required to assume that black thinkers can’t or won’t do battle without there being some Caucasian fight promoter involved. Richard Wright never entirely recovered from James Baldwin’s essay “Everybody’s Protest Novel.” The great but long-neglected black sociologist Oliver C. Cox was scathing about the work of his colleague E. Franklin Frazier.
Such conflicts can be psychobabbled into meaninglessness, of course. Cox’s remarks were attributed to jealousy (Frazier became the first African-American president of the American Sociological Association in 1948, the same year Cox published his overlooked masterpiece Caste, Class, and Race) while Baldwin’s critique of Wright seems like a perfect example of the Oedipal conflict between authors that Harold Bloom calls “the anxiety of influence.” And yes, the ego will take its revenge, given a chance. But real differences in understanding of American society or the role of the artist were involved in those disputes. Those who profess to favor a vigorous intellectual life, and yet deprecate polemic, want crops without plowing up the ground.
But in moving from Baldwin/Wright and Cox/Frazier to Dyson/West, we descend a hundred miles in conceptual altitude.
Colin Marshall in Open Culture:
If you’ve followed our recent philosophy posts, you’ve heard Gillian Anderson (The X-Files) speak on what makes us human, the origins of the universe, and whether technology has changed us, and Harry Shearer speak on ethics — or rather, you’ve heard them narrate short educational animations from the BBC scripted by Philosophy Bites’ Nigel Warburton. Now another equally distinctive voice has joined the series to explain an equally important philosophical topic. Behold Stephen Fry on the Self.
Anya Kamenetz in NPR (image LA Johnson/NPR):
Recently, a number of faculty members have been publishing research showing that the comment-card approach may not be the best way to measure the central function of higher education.
Philip Stark is the chairman of the statistics department at the University of California, Berkeley. "I've been teaching at Berkeley since 1988, and the reliance on teaching evaluations has always bothered me," he says.
Stark is the co-author of "An Evaluation of Course Evaluations," a new paper that explains some of the reasons why.
For one thing, there's response rate. Fewer than half of students complete these questionnaires in some classes. And, Stark says, there's sampling bias: Very happy or very unhappy students are more motivated to fill out these surveys.
Then there's the problem of averaging the results. Say one professor gets "satisfactory" across the board, while her colleague is polarizing: Perhaps he's really great with high performers and not too good with low performers. Are these two really equivalent?
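A toy numerical sketch (invented ratings, not data from Stark's paper) makes the averaging problem concrete: a uniformly "satisfactory" professor and a polarizing one can produce identical mean scores while their rating distributions look nothing alike.

```python
import statistics

# Invented 1-7 ratings, for illustration only.
# Professor A: "satisfactory" across the board.
# Professor B: polarizing -- rated highly by some students, poorly by others.
prof_a = [4, 4, 4, 4, 4, 4]
prof_b = [7, 7, 7, 1, 1, 1]

# The averages are identical...
print(sum(prof_a) / len(prof_a))  # 4.0
print(sum(prof_b) / len(prof_b))  # 4.0

# ...but the spreads are not.
print(statistics.stdev(prof_a))            # 0.0
print(round(statistics.stdev(prof_b), 2))  # 3.29
```

Reporting the full distribution of responses, rather than a single mean, is one of the simple fixes such critiques point toward.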
Finally, there's the simple fact that faculty interactions with students and the student experience in general vary widely across disciplines and types of class. Whether they're in an upper-division seminar, a studio or lab, or a large lecture course, students are usually asked to fill out the same survey.
Stark says his paper is unlikely to surprise most faculty members: "I think that there's general agreement that student evaluations of teaching don't mean what they claim to mean." But, he says, "there's fear of the unknown and inertia around the current system."
Michele Pellizzari, an economics professor at the University of Geneva in Switzerland, has a more serious claim: that course evaluations may in fact measure, and thus motivate, the opposite of good teaching.
Andrew Curry in Nautilus:
PANDAS represents a striking branch of medical research that has been gaining acceptance in recent years, though not without controversy. In a field known as immunopsychiatry, researchers are exploring the possibility that inflammation, or an overactive immune system, is linked to mental disorders that include depression, schizophrenia, and Alzheimer's disease.
A host of recent genetic and epidemiological studies “have shown that when people are depressed or have psychotic episodes, inflammatory markers are found in their blood,” says Golam Khandaker, a senior clinical research associate at the University of Cambridge, in England, who studies inflammation and the brain.
In the case of PANDAS, when the body reacts to strep infection, parts of the brain that help regulate motion and behavior wind up caught in the crossfire, mistaken for bacterial invaders by cells bent on destroying them. Eliminate the inflammation, some doctors say, and you signal the immune system to stand down, restoring normal brain function.
The emergence of immunopsychiatry is a story of rediscovery, reflecting the twists and turns of mental health treatment over the last century. In the 19th century, mental illness and infectious disease were closely linked. That connection came uncoupled in the 20th century, and immunopsychiatry's argument that infection and inflammation can have a profound impact on the brain has struggled against psychiatric and neurological dogma. Yet emerging insights into mental illness unite the brain, body, and environment in ways that doctors and therapists are finally beginning to understand.
Alec Ash in Five Books:
Let’s start on your book selection. Your first choice is What Science Offers the Humanities, by Edward Slingerland. Tell us a little about the book first.
It’s a rather extraordinary and unusual book. It addresses some fundamental matters of interest to those of us whose education has been in the humanities. It’s a book that has received very little attention as far as I know, and deserves a lot more. Edward Slingerland’s own background is in Sinology. Most of us in the humanities carry about us a set of assumptions about what the mind is, or what the nature of knowledge is, without any regard to the discoveries and speculations within the biological sciences in the past 30 or 40 years. In part the book is an assault on the various assumptions and presumptions of postmodernism – and its constructivist notions of the mind.
Concepts that in neuroscience and cognitive psychology are now taken for granted – like the embodied mind – are alien to many in the humanities. And Slingerland addresses relativism, which is powerful and pervasive within the humanities. He wants to say that science is not just one more thought system, like religion; it has special, even primary, status because it’s derived from empiricism, or it’s predictive and coherent and does advance our understanding of the world. So rather than just accept at face value what some French philosopher invents about the mirror stage in infant development, Slingerland wants to show us where current understanding is, and where it’s developing, in fields such as cognition, or the relationship between empathy and our understanding of evil. Slingerland believes that there are orthodox views within the humanities which have been long abandoned by the sciences as untenable and contradictory.
Increasingly, America, Britain and even Australia are relying on ‘private security contractors’ to fight their wars. It’s a multi-billion dollar industry, but it’s also largely unregulated. Are we heading towards a world where armies and navies are available to the highest bidder?
Antony Funnell at the Australian Broadcasting Corporation:
Six WWII marines are set in bronze, frozen in time as they hoist the Stars and Stripes over the Japanese island of Iwo Jima.
The memorial was created from a real-life photograph.
At the bottom of the statue there’s an inscription set into the pedestal in gold lettering. It reads: ‘Uncommon valour was a common virtue.’
This is the way most Americans still like to think of their soldiers: men and women determined in their duty and confirmed in their patriotism.
Whether that ideal was ever a universal reality is impossible to know, but it’s fair to say that for most of America’s history the stated motivation of the average American soldier has been national service.
Something unexpected happened after the invasion of Iraq in 2003, however.
Leslie Garrett in Medium:
Every Thursday at Georges P. Vanier Junior High School, a dozen adolescent boys assemble in an unused classroom. They gather around a large table, doing their best to ignore the girl power posters and sparkles that cover the walls. After all, they’re there to talk about what it means to be a man.
Middle school health classes usually have a segment on sex education, which for most adults conjures awkward memories of studying the female anatomy and putting a condom on a banana. WiseGuyz, a nonprofit based in Calgary, Alberta, is working to broaden what “sex ed” can teach youth — specifically, boys between the ages of 13 and 15. Their participants instead talk about weighty issues like masculinity and the hyper-sexualized portrayal of women in media.
When these teens first gathered in October 2014, there was a lot of nervous laughter. Comments and questions were sometimes couched in bravado or sarcasm. Program creator Blake Spence used the first few weeks, as he does each year, for the boys to get to know each other and, in his words, “create a safe space.” Before long, the giggling gave way to critical discussion.
“We laugh with each other and we’re always joking around about stuff,” said Will, a current participant. “But when things need to be serious, they’re serious. We have a little rule: What happens in WiseGuyz stays in WiseGuyz — so whatever happens in there, we always have to keep it to ourselves.”
WiseGuyz is built on four modules, which take from October to May to complete. Instead of focusing only on the physical basics of sex, participants talk about human rights, sexual health, gender, and healthy relationships. Within those broad topics is plenty of conversation around pornography, consent, homophobia, sexual violence, and emotional abuse.
Read the full article here.
Peter Forbes in The Guardian:
The “Origin of Life” is a conundrum that could once be safely consigned to wistful armchair musing – we’ll never know so don’t take it too seriously. You will probably imagine that it’s still safe to leave the subject in this speculative limbo, without very much in the way of evidence. You’d be very wrong, because in the last 20 years, and especially the last decade, a powerful new body of evidence has emerged from genomics, geology, biochemistry and molecular biology. Here is the book that presents all this hard evidence and tightly interlocking theory to a wider audience. While most researchers have been bedazzled by DNA into focusing on how such replicating molecules have evolved, Nick Lane’s answer could be characterised as “it’s the energy, stupid”. Of all the definitions of life, the one that matters most concerns energy: the churn of metabolic chemistry in the cells and the constant intake of nutrients and expulsion of waste are the essence of life. Information without energy is useless (pull the plug on your computer); information could not have started the whole thing off but energy could.
It is widely recognised that the creation of a viable primitive living cell, capable of reproduction and Darwinian selection, has three requirements: a containing membrane, which acts as an interface between the organism and the environment; replicators able to store the genetic instructions for the organism and to synthesise its chemical apparatus; and a way of taking energy from the environment and putting it to work to run the cell’s processes. Lane shows how all the rest can follow if we put energy first. He is a researcher in evolutionary biochemistry at University College London who has been developing his grand energy theory of life, the universe and everything for more than two decades, explaining it in the books Oxygen (2002), Power, Sex, Suicide (2005) and Life Ascending (2009), which won the Royal Society book prize in 2010. He is an original researcher and thinker and a passionate and stylish populariser. His theories are ingenious, breathtaking in scope, and challenging in every sense. To read him, it helps, as Richard Dawkins once said of himself when embarking on an intricate passage in The Blind Watchmaker, to bring your “mental running shoes”.
Rana Dajani in Nature:
Certain problematic attitudes towards science have been imported into Muslim societies as a part of rapid globalization and modernization — the rejection of the theory of evolution, for example. But this also offers an opportunity. I teach evolution to university students in Jordan. Almost all of them are hostile to the idea at first. Their schoolteachers are likely to have ignored or glossed over it. Still, most students are willing to discuss evolution, and by the end of the course, the majority accept the idea. If Muslim students can challenge ideas on such a controversial academic topic, then they can also approach other aspects of their lives by questioning — and not just blindly accepting — the status quo. These tools and attitudes are crucial to the development of their personalities and to becoming responsible citizens.
Students in my classes often get a shock. I wear a hijab, so they know that I am a practising Muslim, yet they hear me endorsing evolution as a mechanism to explain diversity and the development of species, and citing Charles Darwin as a scientist who contributed to our understanding of the emergence and diversification of life on Earth. I am almost always the first Muslim they have met who says such things. Some students complained to the university that I was preaching against Islam, but university officials were satisfied when I showed them that evolution featured in the university’s approved textbooks and that what I teach in my lecture comes straight from these texts. I commended the students who complained for their courage in supporting what they believed, and offered to sit down and discuss their concerns with them. In teaching, I offer a detailed explanation of the natural evolution of plants and artificial breeding. Later, we discuss antibiotic resistance, influenza vaccines and the development of HIV drugs. After these discussions, most students are willing to accept evolution as a mechanism for the emergence of all species except humans. Many quote evidence from the Koran that is interpreted to mean that Adam — and so humans — were created spontaneously. Human evolution remains taboo because the students are not ready to relinquish the concept that humans were created differently. I remind them that Muslims are warned against arrogance, and that humans are only part of creation.
Muslim scholars such as Hussein al-Jisr and Ahmad Medhat in the 1880s supported evolution. Before Darwin, al-Jahiz and others proposed rudimentary evolutionary theories in the ninth century. I point out that the apparent controversy over evolution and Islam arose only in the twentieth century, when Darwin’s ideas became associated with colonialism, imperialism, the West, atheism, materialism and racism. Muslim religious scholars gradually took a stand against evolution, which the public adopted. The scholars used Christian creationist arguments to support their stance, transferring the Western war between science and religion to Islam.
Tuesday, April 21, 2015
Mark Joseph Stern in Slate:
Far from an infallible science, forensics is a decades-long experiment in which undertrained lab workers jettison the scientific method in favor of speedy results that fit prosecutors’ hunches. No one knows exactly how many people have been wrongly imprisoned—or executed—due to flawed forensics. But the number, most experts agree, is horrifyingly high. The most respected scientific organization in the country has revealed how deeply, fundamentally unscientific forensics is. A complete overhaul of our evidence analysis is desperately needed. Without it, the number of falsely convicted will only keep growing.
Behind the myriad technical defects of modern forensics lie two extremely basic scientific problems. The first is a pretty clear case of cognitive bias: A startling number of forensics analysts are told by prosecutors what they think the result of any given test will be. This isn’t mere prosecutorial mischief; analysts often ask for as much information about the case as possible—including the identity of the suspect—claiming it helps them know what to look for. Even the most upright analyst is liable to be subconsciously swayed when she already has a conclusion in mind. Yet few forensics labs follow the typical blind experiment model to eliminate bias.
Joseph Brodsky, the Nobel laureate poet exiled from the Soviet Union in 1972, has inspired a number of memoirs since his death. One big one was missing, until a few weeks ago.
Ellendea Proffer Teasley’s Brodsky Among Us is now in its third printing, although it was released only last month by Corpus, one of Russia’s largest publishers. Reviews have been laudatory – and the book quickly shot to the top ten at the main Moscow bookstore, Moskva. The author is now on her triumphant tour of Russia, giving talks, media interviews, book signings, press lunches, and photo ops. With her late husband, Carl Proffer, she co-founded the avant-garde, U.S.-based Russian publishing house Ardis during the Cold War. Together, they brought Brodsky to America.
The literary acclaim has caught Ellendea off-guard. Russians generally like their poets stainless, and her memoir is as candid as it is affectionate. Her Brodsky is brilliant, reckless, and deeply human. “I did not expect the response I’m getting,” she wrote to me. “It is so moving to me. They understood exactly what I was doing, and they are grateful that it’s not more myth-making.”
While I have been encouraging her to write a memoir for years, I had not seen enough of her writing to anticipate what such a work would look like. Frankly, I did not expect anything of this caliber – an engaging, compulsively readable text that is bodacious, graceful, seamless. Perhaps I should not have been surprised: five years after Carl’s untimely death in 1984, she received a MacArthur “Genius” Fellowship in her own right. Since the dissolution of the Soviet Union, she has kept a low profile; this book marks her powerful comeback as a major figure in Russian literature.
Readers, however, have always cherished Trollope’s presentation of a world which seemed, in Nathaniel Hawthorne’s words, “as real as if some giant had hewn a great lump out of the earth and put it under a glass case, with its inhabitants going about their daily business, and not suspecting they were made a show of.” Of his fictional county of Barset, Trollope himself declared that to him, it had “been a real county, and its city a real city, and the spires and towers have been before my eyes, and the voices of the people are known to my ears, and the pavement of the city ways are familiar to my footsteps.” He felt most strongly about his characters:
I have wandered alone among the rocks and woods crying at their grief, laughing at their absurdities, and thoroughly enjoying their joy. I have been impregnated with my own creations till it has been my only excitement to sit with the pen in my hand and drive my team before me at as quick a pace as I could make them travel.
As a novelist, he sets out to engage us in the same way—his narrator “seizes” us, as he says, “affectionately by the arm,” and in his companionable company we meet people who become, over the course of many pages and volumes, as vivid and distinct to us as our friends and family. Like their creator, we enter vicariously into their pleasures and sorrows, puzzle over their difficulties, scoff at their folly, and rejoice in their happiness.
THEY ARE DISAPPEARING. When I arrived in Toronto in 1978 and first became involved with Armenian issues, there were many survivors still alive. Every year on April 24—the day commemorating the Armenian genocide—we would head to Ottawa. There, survivors would present testimonials, and offer living proof of the systematic campaign of extermination carried out by Ottoman Turks a century ago.
These people would tell their haunting stories—stories that Canadians needed to hear. Unlike the Holocaust, the Armenian genocide has not been universally acknowledged. Turkey—the successor state to the Ottoman Empire—still refuses to admit the historical fact of the event. And with each passing year, there are fewer and fewer survivors left to disprove the deniers with eyewitness recollections.
In the immediate aftermath of World War I, there was hope for accountability. When the Young Turk government collapsed in 1918, many former senior party members fled to Germany, a wartime ally. But the incoming Turkish administration arrested hundreds of those officials who remained in the country—and their collaborators—on suspicion of having participated in the orchestration of the deportations and killings. The suspects were charged with a variety of offences, including murder, treason, and theft. In a series of trials that took place between 1919 and 1920, former Young Turk officials delivered startling confessions and revealed secret documents that outlined the tactics they employed in carrying out their genocidal program.
Max Tegmark in Edge:
I find Jaan Tallinn remarkable in more ways than one. His rags-to-riches entrepreneur story is inspiring in its own right, starting behind the Iron Curtain and ending up connecting the world with Skype. How many times have you skyped? How many people do you know who created a new verb?
Most successful entrepreneurs I know went on to become serial entrepreneurs. In contrast, Jaan chose a different path: he asked himself how he could leverage his success to do as much good as possible in the world, developed a plan, and dedicated his life to it. His ambition makes even the goals of Skype seem modest: reduce existential risk, i.e., the risk that we humans do something as stupid as go extinct due to poor planning.
After only a few short years, Jaan’s impact is remarkable. He is a key supporter of a global network of non-profit existential risk organizations including The Future of Humanity Institute, The Machine Intelligence Research Institute, The Global Catastrophic Risk Institute, The Centre for the Study of Existential Risk at the University of Cambridge, and The Future of Life Institute, the last two of which he co-founded.
I’ve had the pleasure of working with him on The Future of Life Institute from day one, and if you’ve heard of our recent conference, open letter, and well-funded research program on keeping artificial intelligence beneficial, I’d like to make clear that none of it would have happened without Jaan’s support.
More here, including video.
Husna Haq in Christian Science Monitor:
We live in an era that celebrates the self and places foremost value on achieving wealth, fame, and status. New York Times columnist David Brooks achieved all of that and learned that none of it made him happy. Then he came across a group of women tutoring immigrants in Frederick, Maryland. None of them were particularly wealthy or famous but "they just glowed." "They radiated a goodness and a patience and a service," Brooks told "CBS This Morning." "They weren't talking about how great they were. They were just – nothing about themselves at all. And I thought, well I've achieved more career success than I ever thought I would, but I looked at the inner light they had, and I said, I haven't achieved that."
And so, he set out to explore that elusive quality, a certain contentment through selflessness. The result was "The Road to Character," a new book in which Brooks profiles some of the world's greatest leaders, thinkers, and humanitarians, in an effort to shine a light on the sort of moral virtues that have been discounted in the modern age.

"It occurred to me that there were two sets of virtues, the résumé virtues and the eulogy virtues," he wrote in a New York Times op-ed piece that quickly became the NYT's most-emailed story of the day. "The résumé virtues are the skills you bring to the marketplace. The eulogy virtues are the ones that are talked about at your funeral — whether you were kind, brave, honest or faithful. Were you capable of deep love?"

"We're raised in a society called the 'big me' society," Brooks said Monday on "CBS This Morning." "In 1950, the [Gallup organization] asked high school kids, are you a very important person? Then 12 percent said yes. Asked again in 2005, 80 percent said, yes, I'm a very important person. We all think we're super important.

"That's great for your career if you're branding yourself. That's great for social media, if you want a highlight reel of your own life you can put up on Facebook, but if you want inner growth, you've got to be radically honest," Brooks said. "...[T]he road to character is built by confronting your own weakness."

In "The Road to Character," Brooks found that great people in history became that way by doing just that – confronting their weaknesses.
President Obama betrayed him. He’s stopped publishing new work. He’s alienated his closest friends and allies. What happened to America’s most exciting black scholar?
Michael Eric Dyson in The New Republic:
“Nor hell a fury like a woman scorned” is the best-known line from William Congreve’s The Mourning Bride. But I’m concerned with the phrase preceding it, which captures wrath in more universal terms: “Heaven has no rage like love to hatred turned.” Even an angry Almighty can’t compete with mortals whose love turns to hate.
Cornel West’s rage against President Barack Obama evokes that kind of venom. He has accused Obama of political minstrelsy, calling him a “Rockefeller Republican in blackface”; taunted him as a “brown-faced Clinton”; and derided him as a “neoliberal opportunist.” In 2011, West and I were both speakers at a black newspaper conference in Chicago. During a private conversation, West asked how I escaped being dubbed an “Obama hater” when I was just as critical of the president as he was. I shared my three-part formula for discussing Obama before black audiences: Start with love for the man and pride in his epic achievement; focus on the unprecedented acrimony he faces as the nation’s first black executive; and target his missteps and failures. No matter how vehemently I disagree with Obama, I respect him as a man wrestling with an incredibly difficult opportunity to shape history. West looked into my eyes, sighed, and said: “Well, I guess that’s the difference between me and you. I don’t respect the brother at all.”
West’s animus is longstanding, and only intermittently broken by bouts of calculated love. In February 2007, West lambasted Obama’s decision to announce his bid for the presidency in Illinois, instead of at journalist Tavis Smiley’s State of the Black Union meeting in Virginia, calling it proof that the nascent candidate wasn’t concerned about black people. “Coming out there is not fundamentally about us. It’s about somebody else. [Obama’s] got large numbers of white brothers and sisters who have fears and anxieties, and he’s got to speak to them in such a way that he holds us at arm’s length.” It is hard to know which is more astonishing: West faulting Obama for starting his White House run in the state where he’d been elected to the U.S. Senate—or the breathtaking insularity of equating Smiley’s conference with black America.
And also see this: Michael Eric Dyson’s Interview on His Break With Cornel West
And counterpoint in Salon: Cornel West was right all along
And this from The Nation: Cornel West Is Not Mike Tyson