Monday, July 27, 2015
It is time for 3QD's summer subscription drive. As you know, we are able to run the site only because our regular readers support us through subscriptions or one-time payments. Whichever you'd like to do, please take a couple of minutes and use the appropriate button near the top of the left-hand column to make a contribution.
We really cannot continue to award the prizes or, for that matter, do all the other things we do without your generous financial support.
So please do it now! Don't think, "Someone else will do it!"
New posts below.
Tuesday, July 28, 2015
Massimo Pigliucci in Scientia Salon:
A false dichotomy is a basic type of informal logical fallacy, consisting in framing an issue as if there were only two choices available, while in fact a range of nuanced positions may be on offer upon more careful reflection. While I have argued together with my colleagues Maarten Boudry and Fabio Paglieri that so-called logical fallacies often turn out to be pretty reasonable heuristic strategies, there are nonetheless plenty of instances where they do identify truly bad reasoning. I have recently discussed one such case in reference to so-called trigger warnings in the context of college classes, but another one is arguably represented by the never-ending “debate” about Islamophobia.
It is easy to find stark examples of people defending what appear to be two irreconcilable positions about how to view Islam in a post-9/11 world. For the sake of discussion, I will bypass pundits and other pseudo-intellectuals, and use instead two comedians as representative of the contrasting positions: Jon Stewart and Bill Maher.
Before proceeding I must acknowledge that while I’ve liked Stewart for a long time, and followed with pleasure his evolution from being solely a comedian to a savvy social commentator during his run at the Daily Show, my appreciation of Maher has slid further and further. I used to like his brusque style back when he was doing his “Politically Incorrect” show, first on Comedy Central, then on ABC. I was aghast when ABC (allegedly) let him go because he had dared to make the truly politically incorrect (but clearly correct) statement that the 9/11 hijackers could properly be labelled with a number of negative epithets, but that “coward” wasn’t one of them. But then he made his Religulous movie, where he slid into crass new atheism-style “criticism” of religion, and finally came out as an anti-vaxxer, all the while chastising some of his guests who were “skeptical” of climate change for being anti-science.
Kate Douglas in New Scientist:
The global financial crisis of 2008 took the world by surprise. Few mainstream economists saw it coming. Most were blind even to the possibility of such a catastrophic collapse. Since then, they have failed to agree on the interventions required to fix it. But it’s not just the crash: there is a growing feeling that orthodox economics can’t provide the answers to our most pressing problems, such as why inequality is spiralling. No wonder there’s talk of revolution.
Earlier this year, several dozen quiet radicals met in a boxy red building on the outskirts of Frankfurt, Germany, to plot just that. The stated aim of this Ernst Strüngmann Forum at the Frankfurt Institute for Advanced Studies was to create “a new synthesis for economics”. But the most zealous of the participants – an unlikely alliance of economists, anthropologists, ecologists and evolutionary biologists – really do want to overthrow the old regime. They hope their ideas will mark the beginning of a new movement to rework economics using tools from more successful scientific disciplines.
Drill down, and it’s not difficult to see where mainstream “neoclassical” economics has gone wrong. Since the 19th century, economies have essentially been described with mathematical formulae. This elevated economics above most social sciences and allowed forecasting. But it comes at the price of ignoring the complexities of human beings and their interactions – the things that actually make economic systems tick.
The problems start with Homo economicus, a species of fantasy beings who stand at the centre of orthodox economics. All members of H. economicus think rationally and act in their own self-interest at all times, never learning from or considering others.
We’ve known for a while now that Homo sapiens is not like that (see “Team humanity”). Over the years, there have been various attempts to inject more realism into the field by incorporating insights into how humans actually behave. Known as behavioural economics, this approach has met with some success in microeconomics – the study of how individuals and small groups make economic decisions.
Ian Parker in The New Yorker (Photo by Davide Monteleone):
Varoufakis, a mathematical economist with a modest academic reputation, had become a popular writer in Greece. When the snap election was called, he interrupted his professorship at the University of Texas, flew home to Greece, and launched a ten-day election campaign whose sole expense was the cost of gas for his motorcycle. He was running for parliament, with the aim of becoming the finance minister in a Syriza government. The vote was held on January 25th. Syriza doubled its number of seats in parliament, and Tsipras formed a government in coalition with a small right-of-center party that shared its opposition to the troika’s terms. Varoufakis was elected with a larger share of the vote than any other candidate, and he was named the finance minister. His only previous experience of representative office was as the (white, Greek) leader of the Black Students’ Alliance at the University of Essex, a British institution, in the late seventies. Privately, he asked himself, “What have I done?” On his blog, he borrowed some thoughts of defiance—and, by implication, certain failure—from Dylan Thomas. “Greek democracy today chose to stop going gently into the night,” Varoufakis wrote. “Greek democracy resolved to rage against the dying of the light.”
A few years ago, Varoufakis told Yorgos Avgeropoulos, a documentary filmmaker, that the difference between a debt of ten thousand euros and one of three hundred billion euros is that only the latter gives you negotiating power. And it does so only under one condition: “You must be prepared to say no.” Upon his election, Varoufakis used the less than ideal influence available to a rock climber who, roped to his companions, announces a willingness to let go. On behalf of Tsipras’s government, Varoufakis told Greece’s creditors, and the world’s media, that his country objected to the terms of its agreements. This position encouraged widespread commentary about Greece following a heedless path from “no” to default, and from default to a “Grexit” from the euro currency, which might lead to economic catastrophe in Europe and the world.
It was as if Christopher Hitchens had woken up one day as Secretary of State. Varoufakis was no longer writing elegantly prosecutorial blog posts about Christine Lagarde, the managing director of the I.M.F.; he was meeting with Lagarde. Within days of Greece’s election, an academic with Marxist roots, a shaved head, and a strong jaw had become one of the world’s most recognizable politicians. He showed a level of intellectual and rhetorical confidence—or, perhaps, unearned swagger—that lifted Greek hearts and infuriated Northern European politicians.
Greg Epstein in Salon:
Coates, an award-winning journalist for the Atlantic, is primarily seen as a writer on race. And “Between the World and Me” is, on one level, a book about race, with the story of his murdered friend Prince Jones making Sandra Bland’s seemingly similar death look all the more like a depressing and infuriating act of terror. But atheists and humanists tend to see ourselves as transcending culture and race. So much so that I’ve always been dismayed to find that the majority of people who tend to show up at the meetings of organizations with words like atheist and humanist in their names are so very, very white. Why? Maybe, as I explored in my book “Good Without God” (a title meant to offer a three-word definition of humanism), in an America where religious identity is all many minorities have to fortify them against a society that treats them as inferior and other, identifying as an atheist is far easier for people of privilege.
But Coates’ new book is also, boldly, about atheism. It is even more so about humanism. Crafting a powerful narrative about white Americans — or, as he says, those of us who need to think we are white — who are living The Dream, Coates makes a profound statement of what is, and is not, good, with or without god. Coates refers not to Martin Luther King Jr.’s dream, not quite even to the “American Dream,” but rather to The Dream in which we forget that our history, our identity and much of our nation’s prosperity are built on the foundation of the suffering of people of color in general and black people in particular. The Dream, in other words, is not a state in which only Fox News watchers find themselves. It is a state that can cancel out the very best of white, liberal, humanist intentions.
Danke, merci, gracias
for the heat of the sun,
the kindness of teaching,
the smell of fresh bread.
Diolch, nkosi, shur-nur-ah-gah-lem
for the sound of sand,
the book and the pen.
Dhannyabad, blagodaria, hvala
for the blue of small flowers,
the bobbing seal’s head,
the taste of clean water.
Shukran, rahmat, shukriya
for the stripe of the zebra,
the song of the chaffinch,
the gentleness of snails.
Mh goi, abarka, xièxiè
for the length of time,
the loveliness of eyelashes,
the arc of the ball.
Dziekuje, abrigado, shakkran
for the excitement of falling,
the stillness of night,
for my heart beating, thank you.
from If You Could See Laughter
Salt Publishing, Cromer, 2011
George Johnson in The New York Times:
Maybe it was in “some warm little pond,” Charles Darwin speculated in 1871, that life on Earth began. A few simple chemicals sloshed together and formed complex molecules. These, over great stretches of time, joined in various combinations, eventually giving rise to the first living cell: a self-sustaining bag of chemistry capable of dividing and spawning copies of itself. While scientists still debate the specifics, most subscribe to some version of what Darwin suggested — genesis as a fortuitous chemical happenstance. But the story of how living protoplasm emerged from lifeless matter may also help explain something darker: the origin of cancer.
As the primordial cells mutated and evolved, ruthlessly competing for nutrients, some stumbled upon a different course. They cooperated instead, sharing resources and responsibilities and so giving rise to multicellular creatures — plants, animals and eventually us. Each of these collectives is held together by a delicate web of biological compromises. By surrendering some of its autonomy, each cell prospers with the whole. But inevitably, there are cheaters: A cell breaks loose from the interlocking constraints and begins selfishly multiplying and expanding its territory, reverting to the free-for-all of Darwin’s pond. And so cancer begins.
Nicholas Agar in the OUP blog (Image credit: “City Lights”, by Unsplash. Public Domain via Pixabay):
Humans have flexible psychologies that enable us to flourish in environments ranging from the Arctic to the Kalahari Desert. Walruses and meerkats lack this psychological flexibility. They are unlikely to work out how to survive an exchange of habitats. Hedonic normalization permits a human raised in the high Himalayas to find that environment normal. The same psychological mechanism that hedonically normalizes humans to Arctic and desert environments normalizes us to the very different technological environments of the 1st and 21st centuries. We can predict that it will normalize us to the technologies of the 23rd century. Differences in hedonic normalization mean that ancient Romans, 21st century New Yorkers, and 23rd century residents of Cairo view cars powered by internal combustion engines very differently. What for the Romans is a quite miraculous technology is boringly familiar to the New Yorkers, and repellently primitive and polluting for the Cairenes.
When we overlook hedonic normalization we tend to significantly overstate the extent to which technological progress will boost the happiness of future people. I would be very happy to abruptly find myself on board a 23rd century starship. But this is not how people hedonically normalized to the 23rd century will feel. The error of ignoring hedonic normalization is especially apparent when we think about the past. Techno-optimists point to the big differences that technological change has made to our world. Mary Beard’s description of the streets of ancient Pompeii covered in animal dung, rotting vegetables, human excrement and flies makes modern city dwellers glad to be alive now. But imagining how a time traveller from the early 21st century would feel to find herself marooned in Pompeii does not tell us how people hedonically normalized to that time felt. Doubtless the Pompeians would have preferred cleaner streets. But the filthiness of their streets did not affect them in the way that it would affect someone normalized to our comparatively refuse- and excrement-free highways and byways. To see this more clearly, consider how people from the 23rd century will feel about life in our times. The conditions of our cities are clearly not perfect – but they are not nearly as bad for us as they will seem to someone normalized to the cities that 23rd century technologies will build.
Kathleen McNamara in the WaPo's Monkey Cage:
Economists are condescendingly scolding the Europeans for venturing into a single currency without the proper underlying economic conditions. Paul Krugman has relentlessly excoriated the leaders of Europe for being what he calls “self-indulgent politicians” who have “spent a quarter-century trying to run Europe on the basis of fantasy economics.” The conventional wisdom seems to be that the problems of the euro zone are, as economist Martin Feldstein once put it, “the inevitable consequence of imposing a single currency on a very heterogeneous group of countries.”
What this commentary gets wrong, however, is that single currencies are never the product of debates about optimal economic solutions. Instead, currencies like the U.S. dollar itself are the result of political battles, where motivated actors try to centralize power. This has most often occurred “through iron and blood,” as Otto von Bismarck, the unifier of Germany, put it, as a result of catastrophic wars. Smaller geographic units were brought together to build the modern nation state, with a unified fiscal system, a common national language that was often imposed by force, a unified legal system, and a single currency. Put differently (with apologies to sociologist Charles Tilly), war makes the state, and the state makes the currency.
The U.S. case is instructive. America used to have a chaotic multitude of state currencies and privately issued bank notes, with complex exchange rates between them. This only changed thanks to the Civil War. The American greenback was created in 1863 when Abraham Lincoln’s Republican Party muscled through legislation giving the federal government exclusive currency rights. It was only able to do this because Southern legislators, who opposed more centralization of power, had seceded from the American union. The Union side wanted a common currency to help the war effort by rationalizing revenue raising and wartime payments. But it was also a potent symbol of the power of the federal state in the face of the challenges of a disintegrating union.
Patrick French reviews Yasmin Khan's new book, in The Guardian (Photograph: Michael Ochs Archives/Getty):
Yasmin Khan reminds us at the start of her book that “Britain did not fight the second world war, the British empire did”. Remembrance is a great British virtue. Whether it’s a Spitfire display, replica red poppies streaming out of the Tower of London or a commemoration of the battle of Waterloo, we know how to do it. Winston Churchill’s idea of a plucky island race standing firm against tyranny in two world wars continues to resonate. Troops from Africa, the West Indies, India and beyond are historically more awkward: they tend to be seen as an adjunct to the main event, although Britain’s success in both wars came from the logistics and manpower derived from its massive empire. At last year’s centenary of 1914, the government avoided the E-word and called such people “Commonwealth soldiers”, although the Commonwealth did not exist at the time. In South Asia, too, the 2.5 million volunteers who served in the second world war are forgotten, since they do not fit easily with the nationalist narrative of independence attained by non-violent resistance.
In The Raj at War, Khan sets herself a tough task: to recover the weft of India during the second world war and tell a story not only of servicemen but of nurses, bearers, political activists, road builders, seamen, interned central European Jews, schoolgirls, Bengali famine victims, enlightened officials, 22,000 African American GIs and even destitute Kazakhs, Iraqi beggars and orphaned Polish children who were escaping upheavals elsewhere. “At many stops on their way to Bombay, local people greeted the children at the stations, treating them with sweets, fruits, cold drinks and toys,” reported the wife of the Polish consul general.
Telling history from the bottom up is difficult, since those in extremis rarely record their experiences; it is easier to come in from the sides than from below, and use the diaries and letters of Europeans or members of India’s Anglophone elite.
Monday, July 27, 2015
by Akim Reinhardt
It was my first time visiting, and before arriving, I didn't know much about this nearly arctic island other than some vague notions about Vikings and banking scandals. So I had very little in the way of preconceived notions about the cuisine, and didn't expect anything in particular.
It turns out the food was quite good. There's lots of soup, and I'm a whore for soup, so that was a good match. Also tons of seafood, which is another favorite of mine, although it doesn't quite drive me to walk the streets with a handkerchief dangling behind my shoulder. And then there's also various treats, ranging from liquor to throat lozenges, that feature harshly medicinal herbal flavors. Cheers to that, I say.
Oh, and the chocolate. Far better than I would've guessed. No nonsense. Dark, chalky and delicious.
I don't eat meat, so all that mutton was lost on me, but overall I found Iceland to be a wonderful culinary experience. However, there were also elements of the surreal, which is often the case when one ventures into a new land for the first time. And that is what I would like to share in this photo essay.
What follows are images of and brief comments about things that are neither right nor wrong, but rather just make me smile and remind me that we are all very strange.
by Leanne Ogasawara
In Japan, I knew a gentleman who ran a 200-year-old miso shop. K san was also a bon vivant par excellence! A student of samurai-style (Enshu school) tea ceremony, he wore stylish kimono by day and organized French film festivals for our town on the weekends. He also spent a fortune on tea bowls and art, which he often would show to his friends.
Everyone in town knew him and his miso shop was a gathering place of local luminaries.
Of all the interesting things he was involved in, my favorite was his gramophone club. Once a month like-minded collectors would show up with a favorite record (or not) and sit around listening to old records while drinking sake. Need I say more? The man had endless curiosity and tremendous style. He was my kinda guy!
Speaking of which, I recently finished a most unusual book by Norman Cantor, called Inventing the Middle Ages. The book is about twenty prominent 20th century Medievalists and their impact on the study of the history of the Middle Ages. When I first heard that this book was not just a best seller but was so popular it was even available on Audible, I could hardly believe it! Really? I love anything related to the Middle Ages and so would have read the book no matter what, but I must admit that I was utterly fascinated by the popularity of, as well as the controversy surrounding, this book, which after all was on such an obscure topic.
So, I picked up the book immediately.
I wasn't disappointed either.
Steffani Jemison. Personal, 2014.
Part of a video series including Maniac Chase (2008-2009) and Escaped Lunatic (2010-2011). All are currently showing at the Rhode Island School of Design (RISD) Museum.
Interview with Steffani here.
Strained Analogies Between Recently Released Films and Current Events: Minions and the Illusion of Voting
by Matt McKenna
Avoiding ads featuring Universal Pictures' Minions is almost as difficult as avoiding political ads for the upcoming presidential election. In case you're somehow not familiar with the minion characters, they are small Tic-Tac-looking creatures who speak half-gibberish and first appeared as bumbling sidekicks in the animated Despicable Me franchise. When it became clear the marketing potential of minions had outgrown the confines of the passable children's movie from which they originated, Universal spun out a film focusing on the minion characters themselves. Thus, we have Minions, a story about the eponymous characters' attempt to find an evil leader to whom to pledge allegiance and fulfill their species' destiny. The film's premise may be simple, but it provides a view into our own election process by describing its apparent opposite: instead of politicians being forced to pander to the voting public in order to be elected, Minions inverts the who-must-ingratiate-themselves-to-whom situation and considers what the world would be like if voters (minions) had to convince politicians they are worthy followers. When reexamining American elections through the lens of Minions, it becomes clear that though the minions' leadership-acquiring process may appear to be the exact inverse of the American voters' leadership-acquiring process, they are, in fact, identical.
Minions opens by showing the evolution of the minion species from single-celled organism to the plush-doll friendly form that they take in the Despicable Me franchise. Through narration, we learn that minions are a species that forms a symbiotic/parasitic relationship with the most "despicable" organism in its ecosystem. Over time, minions are forced to find new villains to follow: from the biggest organism in the primordial soup, to the most fearsome dinosaur in the jungle, to eventually Napoleon, the fiercest dictator on the planet. Unfortunately, after failing Napoleon for the last time, the minions are banished to an ice-cave where they toil away until the 1960s, conveniently rendering them absent from Europe during World War II, presumably so the filmmakers wouldn't have to grapple with the minions' desire to serve Hitler.
by Evert Cilliers aka Adam Ash
I would blame three things: Disney, Forrest Gump and Fox News.
What did Disney do? He made sentimentality a good and virtuous thing.
What did Forrest Gump do? That movie, which won the Oscar for best picture, made stupidity a good and virtuous thing.
What did Fox News do? They made craziness good and virtuous.
Take Disney first.
Before Disney, fairy tales were cruel and filled with horror. After all, in the real Cinderella story, the stepsisters actually hacked at their feet, cutting them smaller, blood flowing, so they could fit into Cinderella's shoe.
After Disney, fairy tales became cloyingly sweet and sentimental. And this sentimentality towards fairy tales spilled over into everything. We even get sentimental about our troops, for example.
What do our troops do? They kill people. They are trained to kill people. They are trained murderers. But our politicians, whenever they want to appear patriotic, put their hands on their hearts and blab on about what heroes our troops are.
Heroes? Guys who go to foreign lands to kill people? Guys who, because Bush and Cheney told them, went to Iraq and murdered over half-a-million Iraqis, many of them women and children, for no good reason at all? Just because our President ordered them to do so? These are heroes? Give me a break. They are deluded mass murderers — virtuous pawns deluded by our terrible leaders.
This is the sort of sentimentality that leads folks to get so patriotic about America that they call us the exceptional nation.
Exceptional for what?
by Brooks Riley
by Sue Hubbard
In the early 20th century, alternative philosophies were beginning to permeate Western culture. Madame Blavatsky's Theosophy, the teachings of the Armenian mystic G. I. Gurdjieff, and American Christian Science, spread through Mary Baker Eddy's Science and Health with Key to the Scriptures, were all gathering momentum, as was an interest in psychoanalysis. The Anglican Church, in which the sculptor Barbara Hepworth had been raised, was losing its grip, and many artists and intellectuals were looking for alternative means of spiritual and artistic expression.
At various times throughout her life Hepworth identified herself as a Christian Scientist. (Broadly, in Christian Science, spirit is understood to be the meaning and reality of being; all that is contrary to the goodness of Spirit, or God, is considered to originate in the flesh, or 'matter', understood as a materialism in which humanity is separated from God.)
Hepworth's beliefs were fluid rather than constrained by doctrine and changed throughout her life. Yet what is clear from her archives is that spiritual concerns were central both to her life and work. With its emphasis on an infinite and harmonious intelligence, Christian Science provided her with an alternative lens through which to reassess orthodox Western beliefs. When, after her failed marriage to the sculptor John Skeaping, she met the artist Ben Nicholson, who was to become her second husband, the fact that he was a Christian Scientist gave their romantic and artistic relationship a charged metaphysical perspective. In an interview in 1965 with the Christian Science Monitor, Hepworth asserted that: "A sculpture should be an act of praise, an enduring expression of the divine spirit."
by Bill Benzon
There can be little doubt that President Obama’s eulogy for Clementa Pinckney was an extraordinary performance and a powerful statement about the state of race relations in the United States of America. But it is also a bit puzzling, for that statement took the form of a sermon. As such, it was religious discourse and not secular political discourse.
That’s what I want to talk about, not to reach any specific conclusions, but to raise questions, to call for a conversation about and an examination of the role of religious discourse in civic life.
Rather than develop those questions directly, I want to place Obama’s eulogy on the table for a moment and consider a recent conversation between Glenn Loury, an economist at Brown University, and John McWhorter, a linguist at Columbia. That will establish the context in which I offer a few remarks about Obama’s performance. Then I want to place in evidence a statement that Robert Mann made about Laudato Si’, the recent and quite remarkable encyclical by Pope Francis.
The ‘Cult’ of Ta-Nehisi Coates
Loury and McWhorter had this conversation at Bloggingheads.tv on July 21, 2015. After opening pleasantries and some remarks about Obama, they move on to discuss the ascendancy of Ta-Nehisi Coates as a commentator on race relations in America. Starting somewhat after nine minutes in, McWhorter argues that Coates has become somewhat like the priest of a religion:
There is now what a Martian anthropologist would call a religion. Which is that one is to understand the role of racism in America’s past and present.
And Coates has reached a point, and this is not anything that I ever predicted, where he is the priest of it. Because, and this is the crucial point, James Baldwin […] his point was often that race IS America, that the race problem is the essence of America and where it needs to go. And people read that and they quoted it but it wasn’t something that ordinary white readers really felt at the time.
Whereas today, really, that is something that whites feel such that Coates is revered. He is not considered somebody where you actually assess whether what he’s saying is true, you’re only supposed to criticize him in the gentlest of terms. He’s a priest of a religion.
by Eric Byrd
As a teenager who just wanted battles, I tried to read The Face of Battle and was baffled by the historiographic argument of Keegan's introduction, a long essay that, I now see, echoes Virginia Woolf's manifesto "Modern Fiction" and applies its prescriptions to historical prose. Keegan called to writers of military history as Woolf called to the novelists of her time – "Let us record the atoms as they fall upon the mind in the order in which they fall, let us trace the pattern, however disconnected or incoherent in appearance, which each sight or incident scores upon the consciousness." Keegan urged historians to turn away from tidy narratives of battle and acknowledge the horizonless confusion experienced by even the best-positioned participants of those battles; urged them to understand that most soldiers don't even know when they are engaged in battle, or at least "battle" as it was understood by the Victorians: a national apotheosis or histrionic downfall; the Hinge of Destiny; and he recommended the historian read and take to heart the chaotic combat scenes in Tolstoy's War and Peace, just as Woolf prescribed Tolstoy, Dostoevsky and Chekhov to the fiction writer tempted by pat characterization, superficial psychology, all-too-conclusive action, and purely material relations.
Sunday, July 26, 2015
Siobhan Roberts in The Guardian:
For the last quarter century Conway has held the position of Princeton’s John von Neumann distinguished professor in applied and computational mathematics, now emeritus. Before that, he spent three decades at Cambridge, where in the 1970s, he dived deep into the vast ocean of mathematical symmetry. He discovered a 24-dimensional symmetry group that came to bear his name, and, with his colleague Simon Norton, he illuminated the 196,883-dimensional Monster group with a paper titled “Monstrous Moonshine”. Conway also discovered a new class of numbers, infinitely large and infinitesimally small, which are now known as “surreal numbers”. Those achievements earned him a spot as a fellow of the Royal Society of London for Improving Natural Knowledge, the oldest scientific society in the world. Conway likes to mention that when he was elected in 1981, he signed the big book of fellows at the induction ceremony and was pleased to see on previous pages the names Isaac Newton, Albert Einstein, Alan Turing, and Bertrand Russell.
Conway’s is a jocund and playful egomania, sweetened by self-deprecating charm. He has on many occasions admitted: “I do have a big ego! As I often say, modesty is my only vice. If I weren’t so modest, I’d be perfect.” That said, he is irresistibly drawn to piddling away his days playing games, preferably silly children’s games.
The solution to extremism lies through strategies that enable rather than constrain the space for Muslim free expression.
Zaheer Kazmi in Open Democracy:
In expanding the focus of the state’s enforcement powers from monitoring the planning and execution of specific terrorist acts to scrutinizing any opinions it deems ‘extremist’ in everyday environments, Cameron’s anti-extremism strategy also marks a significant departure from previous governments in its unprecedented degree of intervention into the policing of ideas in the UK.
Stifling dissent in these ways—through what Cameron has called the need for a “muscular” rather than “misguided” liberalism—can have paradoxical implications which allow ISIS to subvert the role of the ‘freedom fighter.’ This subversion lies at the heart of the radicalisation conundrum and can help to explain why jihadists can attract young Muslims to their cause. Cameron’s Birmingham speech rightly identified the problem of the allure which ‘makes celebrities of violent murderers,’ but suppressing rather than re-channelling the impulses that lead impressionable minds to make such dangerously mistaken choices may not, in the end, defuse the potency of this attraction.
From The Economist:
The Nazis succeeded in exterminating millions of Jews. But they did not succeed in extinguishing their history. That is the story told by Samuel Kassow, an American historian, in a poignant and detailed account of the secret archive of the Warsaw ghetto.
In the autumn of 1940, Warsaw's Jewish population, swollen by forced immigration, amounted to nearly 450,000 people, all of them walled into an area covering less than four square kilometres. By early 1942 about 83,000 had died from hunger. That summer 300,000 were sent away to death camps, mostly to Treblinka. In April and May 1943 the remaining 60,000 were killed, or captured and deported, in the Warsaw ghetto uprising, during which the Germans levelled that part of the city.
Mr Kassow starts his story amid the passionate arguments among Jews in the declining days of the three great empires: the German, the Austro-Hungarian and the Russian. Was the great dream to be integration? Was it identification with the surging national consciousness of countries such as Poland, at that stage still partitioned? Was it emigration to a Jewish state in Palestine? Or the hope of a socialist paradise based on a brotherhood of man rather than ethnic, religious or national affiliation? Or some mixture of the above? Was Hebrew the real language of Jews, or a snooty, artificial distraction? Was Yiddish a degenerate linguistic compromise, or the essential literary and political medium?
Manu Joseph in The New York Times:
The global view that ancient Indians performed extreme gymnastics while making love was seeded by a late-19th-century English translation of a Sanskrit text called the Kama Sutra, which contained, among other things, details of sexual positions, practical advice on seduction and a note on types of erotic women, who were named after mammals even though, as a book released this month observes, they made noises like birds.
“The Mare’s Trap: Nature and Culture in the Kamasutra” by Wendy Doniger, an American academic, argues, as some discerning couples may have suspected, that the sex in the Kama Sutra is more prank than instructional manual. But the grand ambition of her book is to elevate the Kama Sutra to the status of two great philosophical works that have influenced Indian society: Manu’s Dharmashastra, which invented castes and defined women as subordinate to men, embarrassing some fine people who share the author’s name, and Kautilya’s Arthashastra, a ruthless book on statecraft.
Very little is known about the origins of the Kama Sutra. No portion of the original text has survived. It was probably written in Sanskrit by one Vatsyayana. He seems to have been a compiler of sexual habits, and he blamed another scholar for inventing some of the very difficult sexual positions. Ms. Doniger believes that the Kama Sutra is about 2,000 years old, but she told me that this is based solely on circumstantial evidence.
The reason she takes the Kama Sutra so seriously is that even though she feels that the sexual positions were fantasies, she sees in the rest of the work nothing short of anthropology, a rare portrait of an affluent ancient society.
J. W. Mason in Jacobin:
The Greek crisis is not fundamentally about Greek government debt. Nor, in its current acute form, is it about the balance of payments between Greece and the rest of the world. Rather, it is about the Greek banking system, and the withdrawal of support for it by the central bank. The solution, accordingly, is for Greece to regain control of its central bank.
I can’t properly establish the premise here. Suffice it to say:
1. On the one hand, the direct economic consequences of default are probably nil. (Recall that Greece in some sense already defaulted, less than five years ago.) Even if default resulted in a complete loss of access to foreign credit, Greece today has neither a trade deficit nor a primary fiscal deficit to be financed. And with respect to the fiscal deficit, if the Greek central bank behaved like central banks in other developed countries, financing a deficit domestically would not be a problem.
And with respect to the external balance, the evidence, both historical and contemporary, suggests that financial markets do not in fact punish defaulters. (And why should they? The extinction of unserviceable debt almost by definition makes a government a better credit risk post-default, and capitalists are no more capable of putting principle ahead of profit in this case than in others.)
The costs of default, rather, are the punishment imposed by the creditors, in this case by the European Central Bank (ECB). The actual cost of default is being paid already — in the form of shuttered Greek banks, the result of the refusal of the Bank of Greece (BoG) to extend them the liquidity they need to honor depositors’ withdrawal requests.
2. On the other hand, Greece’s dependence on its official creditors is not, as most people imagine, simply the result of an unwillingness of the private sector to hold Greek government debt, but also of the ECB’s decision to forbid — on what authority, I don’t know — the Greek government from issuing more short-term debt. This is despite the fact that Greek treasury bills (T-bills), held in large part by the private sector, currently carry interest rates between 2 and 3 percent — half what Greece is being charged by the ECB.