Monday, July 27, 2015
It is time for 3QD's summer subscription drive. As you know, we are able to run the site only because our regular readers support us through subscriptions or one-time payments. Whichever you'd like to do, please take a couple of minutes and use the appropriate button near the top of the left-hand column to make a contribution.
We really cannot continue to award the prizes or, for that matter, do all the other things we do without your generous financial support.
So please do it now! Don't think, "Someone else will do it!"
New posts below.
Friday, July 31, 2015
Noreen Malone and Amanda Demme in New York Magazine:
More has changed in the past few years for women who allege rape than in all the decades since the women’s movement began. Consider the evidence of October 2014, when a Philadelphia magazine reporter at a Hannibal Buress show uploaded a clip of the comedian talking about Bill Cosby: “He gets on TV, ‘Pull your pants up, black people … I can talk down to you because I had a successful sitcom.’ Yeah, but you rape women, Bill Cosby, so turn the crazy down a couple notches … I guess I want to just at least make it weird for you to watch Cosby Show reruns. Dude’s image, for the most part, it’s fucking public Teflon image. I’ve done this bit onstage and people think I’m making it up … That shit is upsetting.” The bit went viral swiftly, with irreversible, calamitous consequences for Cosby’s reputation.
Perhaps the most shocking thing wasn’t that Buress had called Cosby a rapist; it was that the world had actually heard him. A decade earlier, 14 women had accused Cosby of rape. In 2005, a former basketball star named Andrea Constand, who met Cosby when she was working in the athletic department at Temple University, where he served on the board of trustees, alleged to authorities that he had drugged her to a state of semi-consciousness and then groped and digitally penetrated her. After her allegations were made public, a California lawyer named Tamara Green appeared on the Today show and said that, 30 years earlier, Cosby had drugged and assaulted her as well. Eventually, 12 Jane Does signed up to tell their own stories of being assaulted by Cosby in support of Constand’s case. Several of them eventually made their names public. But they were met, mostly, with skepticism, threats, and attacks on their character.
Martin Vander Weyer in The Telegraph:
The course of history can be interpreted in many ways: as a search for food, water and treasure; as an ideological clash between light and dark; as a class struggle; or as a random intersection of topography, technology, disease, weather and occasional outbursts of charismatic leadership. Abba’s Waterloo reminds us: “The history book on the shelf is always repeating itself.” But why? And is it really possible to nail history into a simple framework that explains everything? That, essentially, is what Philip T Hoffman, professor of business economics and history at the California Institute of Technology, attempts in Why Did Europe Conquer the World? – an elegantly concise contribution to the Princeton Economic History of the Western World series. Its starting point is the assertion that Europe really did conquer the world, or at least 84 per cent of it, between 1492 and 1914 – but that you probably would not have bet on that outcome had you landed on Earth in the year 900, when our continent was deeply backward in comparison with the cultural and commercial sophistication of the Muslim Middle East, southern China and Japan.
So why did those early leaders of civilisation stay at home and regress, while our ancestors sailed the seas and built empires? It was not a matter of economic supremacy through industrialisation, which arrived only in the last of the five centuries or so that Hoffman’s study covers. Rather, he argues, it was down to both military and economic advantage gained through “gunpowder technology” – the continuing development of firearms, artillery, ships armed with guns and fortifications that could resist bombardment – which itself derived from the fact that warfare was “the sole purpose of early modern states in western Europe”.
Thursday, July 30, 2015
Arjun Bhagoji, Nithyanand Rao and Raghavi Rao Kodati in The Fifth Estate:
What are you working on at the moment?
I can tell you what I’ve been working on. If you pick a random whole number…Do you know what a square-free number is? Square-free number means that when you factor it, no prime occurs more than once. So 6 is square-free: it’s 2 × 3. But 12 is not square-free: it’s 2 × 2 × 3. So 2 is repeated.
So suppose you pick a random whole number, what’s the probability that it’s square-free? The answer has been known for a long time. The answer is 6/π². It’s unexpected, right? The π — there’s no circles here or anything, right? You’re asking for the probability of a whole number being square-free. And the answer is 6/π². Here, π appears in this magical way in this number theory problem, not a geometry problem. So this is something that fascinated me.
So one thing that I’ve been thinking about lately is: Often in number theory, you need to know about square-free numbers. If you have a polynomial with whole number coefficients and you look at its values when you plug in whole numbers, what’s the probability that the value of the polynomial is square-free? It depends on the polynomial, of course. For even a simple polynomial, x⁴ + 1, the answer is not known. What’s the probability that a random value of x⁴ + 1 is square-free? That’s one question that I work on.
More here. [Thanks to Ali Minai.]
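The 6/π² density quoted above is easy to test numerically. Here is a minimal sketch, not from the interview itself, that checks square-freeness by trial division and compares the observed fraction up to a bound with 6/π²:

```python
import math

def is_squarefree(n: int) -> bool:
    """True if no prime factor of n occurs more than once."""
    d = 2
    while d * d <= n:
        if n % (d * d) == 0:
            return False      # d divides n twice: not square-free
        if n % d == 0:
            n //= d           # strip the single factor of d
        d += 1
    return True

N = 100_000
count = sum(is_squarefree(n) for n in range(1, N + 1))
density = count / N
print(density, 6 / math.pi**2)  # both are approximately 0.608
```

The empirical fraction converges to 6/π² ≈ 0.6079 as the bound grows, matching the classical result.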
Jonathan Basile in The Paris Review:
Since I first read it in a high school Spanish class, I’ve been fascinated by the theory of language implicit in Borges’s “The Library of Babel.” The story describes a universal library containing, in 410-page volumes, every possible permutation of twenty-two letters, spaces, commas, and periods—every book that’s ever been written and every book that ever could be, drowned out by endless pages of gibberish. Its librarians are addicted to the search for certain master texts, the complete catalog of the library, or the future history of one’s own life, but their quest inevitably ends in failure, despair, even suicide. Perhaps I was obsessed by the same desire for revelation, or haunted by the same subversion of all rational pursuit. In either case, fifteen years later the idea came to me one night of using the vast calculative capacities of a computer to re-create the Library of Babel as a Web site. For those interested in experiencing the futile hope of Borges’s bibliotecarios, I’ve made libraryofbabel.info, which now contains anything we ever have written or ever will write, including these sentences I struggle to compose now. Here, to give you a sense of the vastness and the unintelligibility of such a project, is a random page:
kpiasgkbjmdbwxjbcwiuhcadugph lxpz asdqkvfgjgfaspfdjiizqryg.i sngv ,yzdeeekvqikbg m,zx f aeeebidyxv,q,k vgmx dmidff.vagmsfyjikcjiqpsi,zkkvavxoeuklkvgekclfiow,w. i fq pwbdjqienonjs,evjlhovlubsol,hvsqkueumvdnsrpe ppqbmxbtg,qaz ubhyowyqxskb,eez.u us.pugrjzjp.uznw.xsvbafskolwvnnupqgfqvskrgr fel.gyjlzqinqzkmu,gfu.voyjchbxdodjsd ox zhey zkchvomdeubrwumnlmxeimi,xbboffdrfjwolmgotppdte e,zpxzdfnaxojkybyrljjlvyx fwaxcflmz jf cytplxpntfjgaxismnqviv,qx afef fa fzjvqlztxgkcxdmvsnxamrnfcixrfzd z
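Producing a gibberish page like the one above is straightforward. The sketch below is not the site's actual implementation (the real library uses an invertible mapping so that every page has a fixed, findable address, and its exact character set may differ); it simply draws characters uniformly from a 25-symbol alphabet, using the story's page dimensions of 40 lines of 80 characters. Which 22 letters make up the alphabet is an assumption here, since Borges never specifies them:

```python
import random

# Borges's library uses 25 symbols: 22 letters plus space, comma, period.
# Dropping j, k, w, and x from the English alphabet is purely an
# illustrative assumption; the story does not say which 22 letters.
ALPHABET = "abcdefghilmnopqrstuvyz ,."

LINE_LEN = 80   # "approximately 80 black letters" per line
LINES = 40      # 40 lines per page (and 410 pages per volume)

def random_page(rng: random.Random) -> str:
    """Return one uniformly random page of the library."""
    return "\n".join(
        "".join(rng.choice(ALPHABET) for _ in range(LINE_LEN))
        for _ in range(LINES)
    )

page = random_page(random.Random(0))
print(page[:80])  # first line of a reproducible random page
```

Seeding the generator makes the page reproducible, which hints at the deeper point of the website: every page already "exists" as a function of its address.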
In consideration of the extraordinary life he records, Michael Bundock has given his fine biography of Francis Barber a subtitle that invokes the authenticating formula of the eighteenth-century novel: this is The true story of the Jamaican slave who became Samuel Johnson’s heir. Born on a sugar plantation in 1742/3 (the date is uncertain), the boy who later became Francis Barber was allotted the name Quashey; a generic slave name, it may also indicate he was born on a Sunday. Quashey inherited slave status, being literally the property of his master, Colonel Richard Bathurst, to sell or lend or give away. When the failure of his estates forced Bathurst to leave Jamaica, Quashey went with him along with the rest of his luggage. Was he Bathurst’s son? Perhaps, though there is no evidence to confirm this. In London, they lodged with Dr Richard Bathurst, who was the Colonel’s son and a friend of Johnson. Both men were passionate opponents of slavery. Here Quashey was baptized, receiving the name Francis Barber (the reason for the choice is unclear), his baptism possibly remitting his slavery (again, this is uncertain). Almost immediately, he was packed off to school some 250 miles away, to the small village of Barton in North Yorkshire, where his must surely have been the only black face. He returned to London two years later, at which time he joined Johnson’s household in Gough Square, Fleet Street. Already seasoned in adventures, Francis Barber was now probably around ten years old.
From the late seventeenth century, British involvement in the transatlantic slave trade led to a significant expansion of the black population of London and other port cities – Southampton, Bristol, Liverpool. Black slaves attended returning sea captains, colonial officials, merchants and plantation owners.
One summer evening in Saigon in 1974, we were invited to dinner at the home of another U.S. embassy employee, probably a covert operative like my father. I don’t remember who he was, but I recall the house—an elegant colonial villa with high ceilings and boldly colored tiled floors, surrounded by a high concrete wall. We parked on the street, walked past two guards, and slipped through a slender door cut into the wall’s façade. Like so many experiences during this sojourn of mine, stepping through that portal felt uncanny and intriguing and off. The country was at war, the enemy digging its steady way down the Ho Chi Minh Trail. Yet here I was, a rising junior in college, tagging along with my parents in hand-tailored dresses to elegant dinner parties featuring French food served by beautiful Vietnamese girls. I was annoyed at my mother that whole summer and jealous of my brother, two years younger, but I remember feeling, as I passed through that almost invisible door in the gate, that I was entering into a private and ephemeral principality, a world that could crumble at any second.
After dinner, my mother pleasantly tipsy, we ventured into the warm tropical night, across the dusky garden with its pots of fragrant plants, and out the magic door. Not a guard was in sight. Residence guards, in their little booths, often fell asleep in the evening, worn out by their bored, day-long vigils in the unrelenting heat. The embassy people joked that they hoped the guards would wake up if the Vietcong arrived.
When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.
Adam Simpson in The Atlantic:
Instead of looking at broad populations to pinpoint trends within subsets of them, the medical world is increasingly turning to the individual, who can now be studied in higher definition than ever before. Precision medicine—the idea that treatments can be based on a patient’s unique biological and physiological characteristics—is gaining momentum.
The Clod and the Pebble
'Love seeketh not itself to please,
Nor for itself hath any care,
But for another gives its ease,
And builds a heaven in hell's despair.'
So sung a little clod of clay,
Trodden with the cattle's feet;
But a pebble of the brook
Warbled out these meters meet:
'Love seeketh only Self to please,
To bind another to its delight,
Joys in another's loss of ease,
And builds a hell in heaven's despite.'
by William Blake
Wednesday, July 29, 2015
Robert Solow in Conversation With Paul Krugman on Anthony Atkinson's "Inequality: What Can Be Done?"
Edmund S. Phelps in The New York Review of Books:
Our prevailing political economy is blind to the very concept of inclusion; it does not map out any remedy for the deficiency. A monograph of mine and a conference volume I edited are among the few book-length studies of ways to remedy failure to include people generally in an economy in which they will have satisfying work.
Commentators are talking now about injustice of another sort. Workers in decent jobs view the economy as unjust if they or their children have virtually no chance of climbing to a higher rung on the socioeconomic ladder. And moving up appears harder now. Even in the Gilded Age, many of the moguls came up from the bottom. (The rungs were far apart, yet the ladder was climbed.) The feeling of injustice comes from a sense of unfair advantages: that those above are using their connections to stay there—or to ensure that their children can follow them. The bar to upward mobility is always the same: barriers to competition put up by the wealthy, the connected, corporations, professional associations, unions, and guilds.
But the truth is that no degree of Rawlsian action to pull up low-end wages and employment—or remove unfair advantages—could have spared the less advantaged from a major loss of inclusion since Rawls’s time. The forces of productivity slowdown and globalization have been too strong. Moreover, though the injustices in the West’s economies are egregious, they ought not to be seen as a major cause of the productivity slowdowns and globalization. (For one thing, a slowdown of productivity started in the US in the mid-1960s and the sharp loss of manufacturing jobs to poorer countries occurred much later—from the late 1970s to the early 1990s.) Deeper causes must be at work.
Ben Parker reviews Fredric Jameson's The Antinomies of Realism, in the LA Review of Books:
THE ODD THING about literary “realism” is that it is not a descriptive term at all, but a period: roughly 1830–1895, from Stendhal’s The Red and the Black to Hardy’s Jude the Obscure. Many classics of 19th-century realism would be conspicuously ruled out if plausibility were any criterion. Balzac’s first successful novel, La Peau de chagrin, is about a gambler who purchases a magical, wish-fulfilling animal skin that shrinks with every wish granted; Stendhal’s Charterhouse of Parma is essentially a swashbuckling romp through Napoleonic Europe; Anna Karenina includes the interior monologue of a dog, long before Kafka; Flaubert’s works include a lurid, violent novel about the fall of ancient Carthage, and a play in which Saint Anthony confronts the Buddha, Isis, the Devil, and the Seven Deadly Sins in the desert. “Magical realism” is something of a pleonasm; 19th-century realism is already reliably outrageous, phantasmagoric, and credibility-straining.
The past tends to be evacuated of its specifics, and so realism becomes, in the folk vocabulary of everyday criticism, simply “the way that we used to do things.” The implication here is “... before we learned better,” where modernism, and most often Virginia Woolf, plays the role of pedagogue. By a curious twist, “realism” then becomes descriptive once again, as the term now encompasses a warehouse of discarded, seemingly ingenuous (but covertly ideological) techniques for the misguided project of grasping “reality.”
In her 2008 essay “Two Paths for the Novel,” Zadie Smith — in the same vein of condescension toward a hazy, credulous past — identified realism, specifically “the nineteenth-century lyrical Realism of Balzac and Flaubert,” as “a literary form in long-term crisis,” an archaic obstruction on the highway of literary culture. This realism was supposedly built on “the transcendent importance of form, the incantatory power of language to reveal truth, the essential fullness and continuity of the self.” Realism was a “bedtime story,” propagating the ideology that “the self is a bottomless pool,” and dating to a prelapsarian epoch when “novels weren’t neurotic.” All of this would come as a surprise, I think, to readers of Balzac and Flaubert: surely the latter is the most neurotic of novelists.
In fact, realism was never this way. Nineteenth-century realism was not a “bedtime story.” On the contrary, the prevailing idea that before modernism we all innocently believed in an essential plenitude of the self is itself a comforting fable by which to tuck in undergraduates. Even in as Masterpiece Theatre–ready a work as Thomas Hardy’s Tess of the D’Urbervilles, the heroine is crucially “absent” (narcoleptic, automaton-like) from her own attention at catastrophic, life-determining moments of rape and violence.
Joseph D'Urso at Thomson Reuters Foundation:
Investment banking doesn't rank highly on most people's lists of ethical career choices, but according to one of the world's most famous living philosophers, becoming a hot shot in finance may be the best way for a bright graduate to help the global poor.
A high earner in the corporate world who is giving away large sums can create more social gain than if they did charity work, said Peter Singer, who teaches at Princeton University.
"If they are able to live modestly and give a lot away, they can save many lives," he told the Thomson Reuters Foundation.
Singer is part of a movement of donors known as 'effective altruists', who demand concrete results from charitable donations, and often come from the business world. Silicon Valley billionaire Elon Musk will address the movement's global conference at Google headquarters in California in September.
The growing community encourages people to give big chunks of their income, typically around ten percent but in some cases more than half, to charities that alleviate global poverty.
Creating peace in Europe was a moral ambition that all could share. Yet Monnet was fully aware that moral appeals and trust in people's war-weariness would provide no more lasting security than international peace treaties. His idea, formulated as a plan with the French foreign minister Robert Schuman, was to overcome nationalism by gradually encouraging the nations to abandon rights of sovereignty, until, equally hollowed out and deprived of their very core, they would cease to have a future, hence undermining nationalism definitively. For this to work, supra-national institutions would have to gradually take over from national institutions. This process began with the creation of a high authority that regulated coal and steel production on behalf of the member states. Coal and steel were crucial not only for war but for reconstruction and economic revival. Creating a supra-national authority that controlled these products, ensuring their fair distribution and preventing secret rearmament, was the first step in a planned post-national development that would lead to the political and economic integration of the European nations, prevent them from deviating from the path, and that would ultimately supersede the nations entirely.
"Nationalism has destroyed European culture and civilization." (Stefan Zweig)
"The nations and their political institutions have proved once and for all that they are not equal to the task of lasting peace and rule of law." (Jean Monnet)
When I am not writing I am not writing a novel called 1994 about a young woman in an office park in a provincial town who has a job cutting and pasting time. I am not writing a novel called Nero about the world’s richest art star in space. I am not writing a book called Kansas City Spleen. I am not writing a sequel to Kansas City Spleen called Bitch’s Maldoror. I am not writing a book of political philosophy called Questions for Poets. I am not writing a scandalous memoir. I am not writing a pathetic memoir. I am not writing a memoir about poetry or love. I am not writing a memoir about poverty, debt collection, or bankruptcy. I am not writing about family court. I am not writing a memoir because memoirs are for property owners and not writing a memoir about prohibitions of memoirs.
When I am not writing a memoir I am also not writing any kind of poetry, not prose poems contemporary or otherwise, not poems made of fragments, not tightened and compressed poems, not loosened and conversational poems, not conceptual poems, not virtuosic poems employing many different types of euphonious devices, not poems with epiphanies and not poems without, not documentary poems about recent political moments, not poems heavy with allusions to critical theory and popular song.
In Cleveland, the ghost of d.a. levy is everywhere, even animating MOCA Cleveland's summer show. But what is it that makes the poet's legacy endure?
Morgan Meis in The Smart Set:
A young poet killed himself in Cleveland on November 24, 1968. He did it with a .22 caliber rifle he’d owned since childhood. In the years leading up to his death, the poet often demonstrated to friends how he could operate the gun with his feet and put the muzzle against his forehead, right at the spot of his “third eye.” The poet’s name was d. a. levy, as he liked to spell it (he was born Darryl Alfred Levy). He was just 26 years old when he died.
Just a year before his death, levy was arrested by the Cleveland police. He’d been indicted in 1966. The specific charge was “contributing to the delinquency of a minor.” At a poetry reading, he allowed juveniles to read work deemed obscene by city officials. levy’s own poetry had its share of bad words, sex, and drugs. The poet was a public advocate for the legalization of marijuana. It all seems rather tame by today’s standards. But in Cleveland in 1968, the d. a. levy affair created quite a ruckus. His arrest brought national attention. Guys like Allen Ginsberg and Gary Snyder got involved in the case, advocating for the dismissal of the charges against levy. The call to “legalize levy” became a rallying cry at protests and on t-shirts and flyers, not just in Cleveland but around the country.
After his death, many people in Cleveland adopted levy as a kind of local hero. And there it should have ended, if history is any guide. A young poet takes his own life. A city mourns. The relentless wheel of history churns on, forgetting as it goes.
If the right attacked Fukuyama for being insufficiently fearful about political threats to Western liberalism, the left attacked him for being insufficiently hopeful about economic alternatives to it. Fukuyama’s argument came on the heels of a set of developments that seemed to fit a pattern: the collapse of the USSR; Deng Xiaoping’s decision to move China toward something that looked a great deal like capitalism; Margaret Thatcher’s and Ronald Reagan’s attacks on the postwar welfare state. The closing-off of systematic alternatives to capitalism coincided with capitalism’s own transition from “Fordism” to “neoliberalism” (to use the now-conventional terminology), and Fukuyama seemed to exemplify both of these pernicious trends. To detractors on the left, his thesis was at best a failure of political imagination and at worst a highfalutin version of Thatcher’s taunt that “there is no alternative” to the free market.
However unappealing Fukuyama’s view may have been to the left, the lean years of Third Way liberalism and compassionate conservatism did little to disconfirm it. But more recent events have offered critics of the left, like those of the right, the chance to claim vindication by history.
David Cyranoski in Nature:
Jun Wang is one of China’s most famous scientists. Since joining the genome-sequencing powerhouse BGI when it started up 16 years ago, he has participated in some of its biggest accomplishments. These include sequencing the first genome of an Asian person, the giant panda and the human gut microbiome, as well as contributions to the Human Genome Project. Wang has led BGI since 2007 (when it stopped using the name Beijing Genomics Institute and moved its headquarters to Shenzhen). But on 17 July, the institute announced that he will give up that position to pursue research into artificial intelligence (AI).
What is the concept behind your AI project?
Basically, I am just trying to feed an AI system with masses of data. Then that system could learn to understand human health and human life better than we do. The AI will try to draw a formula for life. Life is digital, like a computer program — if you want to understand the results of the programming, how the genes lead to phenotypes, it is sufficiently complicated for you to need an AI system to figure out the rules. The AI system will basically consist of two components. The first is the big supercomputing platforms. We already have access to those through cloud computing and supercomputing centres. These will run or devise algorithms that look for relationships between genes, lifestyle and environmental factors, and predict phenotypes. The other thing is big data. We want to have data from one million individuals. And we want the data to be alive, in the sense that they can update their phenotype information at any time point. Other big computing companies, such as Google, could eventually do this, but we want to do it first. And we have the experience with the big data.
Tuesday, July 28, 2015
Massimo Pigliucci in Scientia Salon:
A false dichotomy is a basic type of informal logical fallacy, consisting in framing an issue as if there were only two choices available, while in fact a range of nuanced positions may be on offer upon more careful reflection. While I have argued together with my colleagues Maarten Boudry and Fabio Paglieri that often so-called logical fallacies turn out to be pretty reasonable heuristic strategies, there are nonetheless plenty of instances where they do identify truly bad reasoning. I have recently discussed one such case in reference to so-called trigger warnings in the context of college classes, but another one is arguably represented by the never-ending “debate” about Islamophobia.
It is easy to find stark examples of people defending what appear to be two irreconcilable positions about how to view Islam in a post-9/11 world. For the sake of discussion, I will bypass pundits and other pseudo-intellectuals, and use instead two comedians as representative of the contrasting positions: Jon Stewart and Bill Maher.
Before proceeding I must acknowledge that while I’ve liked Stewart for a long time, and followed with pleasure his evolution from being solely a comedian to a savvy social commentator during his run at the Daily Show, my appreciation of Maher has slid further and further. I used to like his brusque style back when he was doing his “Politically Incorrect” show, first on Comedy Central, then on ABC. I was aghast when ABC (allegedly) let him go because he had dared to make the truly politically incorrect (but clearly correct) statement that the 9/11 hijackers could properly be labelled with a number of negative epithets, but that “coward” wasn’t one of them. But then he made his Religulous movie, where he slid into crass new atheism-style “criticism” of religion, and finally came out as an anti-vaxxer all the while chastising some of his guests who were “skeptical” of climate change for being anti-science.
Kate Douglas in New Scientist:
The global financial crisis of 2008 took the world by surprise. Few mainstream economists saw it coming. Most were blind even to the possibility of such a catastrophic collapse. Since then, they have failed to agree on the interventions required to fix it. But it’s not just the crash: there is a growing feeling that orthodox economics can’t provide the answers to our most pressing problems, such as why inequality is spiralling. No wonder there’s talk of revolution.
Earlier this year, several dozen quiet radicals met in a boxy red building on the outskirts of Frankfurt, Germany, to plot just that. The stated aim of this Ernst Strüngmann Forum at the Frankfurt Institute for Advanced Studies was to create “a new synthesis for economics”. But the most zealous of the participants – an unlikely alliance of economists, anthropologists, ecologists and evolutionary biologists – really do want to overthrow the old regime. They hope their ideas will mark the beginning of a new movement to rework economics using tools from more successful scientific disciplines.
Drill down, and it’s not difficult to see where mainstream “neoclassical” economics has gone wrong. Since the 19th century, economies have essentially been described with mathematical formulae. This elevated economics above most social sciences and allowed forecasting. But it comes at the price of ignoring the complexities of human beings and their interactions – the things that actually make economic systems tick.
The problems start with Homo economicus, a species of fantasy beings who stand at the centre of orthodox economics. All members of H. economicus think rationally and act in their own self-interest at all times, never learning from or considering others.
We’ve known for a while now that Homo sapiens is not like that (see “Team humanity”). Over the years, there have been various attempts to inject more realism into the field by incorporating insights into how humans actually behave. Known as behavioural economics, this approach has met with some success in microeconomics – the study of how individuals and small groups make economic decisions.
Ian Parker in The New Yorker (Photo by Davide Monteleone):
Varoufakis, a mathematical economist with a modest academic reputation, had become a popular writer in Greece. When the snap election was called, he interrupted his professorship at the University of Texas, flew home to Greece, and launched a ten-day election campaign whose sole expense was the cost of gas for his motorcycle. He was running for parliament, with the aim of becoming the finance minister in a Syriza government. The vote was held on January 25th. Syriza doubled its number of seats in parliament, and Tsipras formed a government in coalition with a small right-of-center party that shared its opposition to the troika’s terms. Varoufakis was elected with a larger share of the vote than any other candidate, and he was named the finance minister. His only previous experience of representative office was as the (white, Greek) leader of the Black Students’ Alliance at the University of Essex, a British institution, in the late seventies. Privately, he asked himself, “What have I done?” On his blog, he borrowed some thoughts of defiance—and, by implication, certain failure—from Dylan Thomas. “Greek democracy today chose to stop going gently into the night,” Varoufakis wrote. “Greek democracy resolved to rage against the dying of the light.”
A few years ago, Varoufakis told Yorgos Avgeropoulos, a documentary filmmaker, that the difference between a debt of ten thousand euros and one of three hundred billion euros is that only the latter gives you negotiating power. And it does so only under one condition: “You must be prepared to say no.” Upon his election, Varoufakis used the less than ideal influence available to a rock climber who, roped to his companions, announces a willingness to let go. On behalf of Tsipras’s government, Varoufakis told Greece’s creditors, and the world’s media, that his country objected to the terms of its agreements. This position encouraged widespread commentary about Greece following a heedless path from “no” to default, and from default to a “Grexit” from the euro currency, which might lead to economic catastrophe in Europe and the world.
It was as if Christopher Hitchens had woken up one day as Secretary of State. Varoufakis was no longer writing elegantly prosecutorial blog posts about Christine Lagarde, the managing director of the I.M.F.; he was meeting with Lagarde. Within days of Greece’s election, an academic with Marxist roots, a shaved head, and a strong jaw had become one of the world’s most recognizable politicians. He showed a level of intellectual and rhetorical confidence—or, perhaps, unearned swagger—that lifted Greek hearts and infuriated Northern European politicians.
Greg Epstein in Salon:
Coates, an award-winning journalist for the Atlantic, is primarily seen as a writer on race. And “Between the World and Me” is, on one level, a book about race, with the story of his murdered friend Prince Jones making Sandra Bland’s seemingly similar death look all the more like a depressing and infuriating act of terror. But atheists and humanists tend to see ourselves as transcending culture and race. So much so that I’ve always been dismayed to find that the majority of people who tend to show up at the meetings of organizations with words like atheist and humanist in their names are so very, very white. Why? Maybe, as I explored in my book “Good Without God” (a title meant to offer a three-word definition of humanism), in an America where religious identity is all many minorities have to fortify them against a society that treats them as inferior and other, identifying as an atheist is far easier for people of privilege.
But Coates’ new book is also, boldly, about atheism. It is even more so about humanism. Crafting a powerful narrative about white Americans — or, as he says, those of us who need to think we are white — who are living The Dream — Coates makes a profound statement of what is, and is not, good, with or without god. Coates refers not to Martin Luther King Jr.’s dream, not quite even to the “American Dream,” but rather to The Dream in which we forget our history, our identity and much of our nation’s prosperity is built on the foundation of the suffering of people of color in general and black people in particular. The Dream, in other words, is not a state in which only Fox News Watchers find themselves. It is a state that can cancel out the very best of white, liberal, humanist intentions.