Thursday, July 30, 2015
When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.
Adam Simpson in The Atlantic:
Instead of looking at broad populations to pinpoint trends within subsets of them, the medical world is increasingly turning to the individual, who can now be studied in higher definition than ever before. Precision medicine—the idea that treatments can be based on a patient’s unique biological and physiological characteristics—is gaining momentum.
The Clod and the Pebble
'Love seeketh not itself to please,
Nor for itself hath any care,
But for another gives its ease,
And builds a heaven in hell's despair.'
So sung a little clod of clay,
Trodden with the cattle's feet;
But a pebble of the brook
Warbled out these meters meet:
'Love seeketh only Self to please,
To bind another to its delight,
Joys in another's loss of ease,
And builds a hell in heaven's despite.'
by William Blake
Wednesday, July 29, 2015
Robert Solow in Conversation With Paul Krugman on Anthony Atkinson's "Inequality: What Can Be Done?"
Edmund S. Phelps in The New York Review of Books:
Our prevailing political economy is blind to the very concept of inclusion; it does not map out any remedy for the deficiency. A monograph of mine and a conference volume I edited are among the few book-length studies of ways to remedy failure to include people generally in an economy in which they will have satisfying work.
Commentators are talking now about injustice of another sort. Workers in decent jobs view the economy as unjust if they or their children have virtually no chance of climbing to a higher rung in the socioeconomic ladder. And moving up appears harder now. Even in the Gilded Age, many of the moguls came up from the bottom. (The rungs were far apart, yet the ladder was climbed.) The feeling of injustice comes from a sense of unfair advantages: that those above are using their connections to stay there—or to ensure that their children can follow them. The bar to upward mobility is always the same: barriers to competition put up by the wealthy, the connected, corporations, professional associations, unions, and guilds.
But the truth is that no degree of Rawlsian action to pull up low-end wages and employment—or remove unfair advantages—could have spared the less advantaged from a major loss of inclusion since Rawls’s time. The forces of productivity slowdown and globalization have been too strong. Moreover, though the injustices in the West’s economies are egregious, they ought not to be seen as a major cause of the productivity slowdowns and globalization. (For one thing, a slowdown of productivity started in the US in the mid-1960s and the sharp loss of manufacturing jobs to poorer countries occurred much later—from the late 1970s to the early 1990s.) Deeper causes must be at work.
Ben Parker reviews Fredric Jameson's The Antinomies of Realism, in the LA Review of Books:
THE ODD THING about literary “realism” is that it is not a descriptive term at all, but a period: roughly 1830–1895, from Stendhal’s The Red and the Black to Hardy’s Jude the Obscure. Many classics of 19th-century realism would be conspicuously ruled out if plausibility were any criterion. Balzac’s first successful novel, La Peau de chagrin, is about a gambler who purchases a magical, wish-fulfilling animal skin that shrinks with every wish granted; Stendhal’s Charterhouse of Parma is essentially a swashbuckling romp through Napoleonic Europe; Anna Karenina includes the interior monologue of a dog, long before Kafka; Flaubert’s works include a lurid, violent novel about the fall of ancient Carthage, and a play in which Saint Anthony confronts the Buddha, Isis, the Devil, and the Seven Deadly Sins in the desert. “Magical realism” is something of a pleonasm; 19th-century realism is already reliably outrageous, phantasmagoric, and credibility-straining.
The past tends to be evacuated of its specifics, and so realism becomes, in the folk vocabulary of everyday criticism, simply “the way that we used to do things.” The implication here is “... before we learned better,” where modernism, and most often Virginia Woolf, plays the role of pedagogue. By a curious twist, “realism” then becomes descriptive once again, as the term now encompasses a warehouse of discarded, seemingly ingenuous (but covertly ideological) techniques for the misguided project of grasping “reality.”
In her 2008 essay “Two Paths for the Novel,” Zadie Smith — in the same vein of condescension toward a hazy, credulous past — identified realism, specifically “the nineteenth-century lyrical Realism of Balzac and Flaubert,” as “a literary form in long-term crisis,” an archaic obstruction on the highway of literary culture. This realism was supposedly built on “the transcendent importance of form, the incantatory power of language to reveal truth, the essential fullness and continuity of the self.” Realism was a “bedtime story,” propagating the ideology that “the self is a bottomless pool,” and dating to a prelapsarian epoch when “novels weren’t neurotic.” All of this would come as a surprise, I think, to readers of Balzac and Flaubert: surely the latter is the most neurotic of novelists.
In fact, realism was never this way. Nineteenth-century realism was not a “bedtime story.” On the contrary, the prevailing idea that before modernism we all innocently believed in an essential plenitude of the self is itself a comforting fable by which to tuck in undergraduates. Even in as Masterpiece Theatre–ready a work as Thomas Hardy’s Tess of the D’Urbervilles, the heroine is crucially “absent” (narcoleptic, automaton-like) from her own attention at catastrophic, life-determining moments of rape and violence.
Joseph D'Urso at Thomson Reuters Foundation:
Investment banking doesn't rank highly on most people's lists of ethical career choices, but according to one of the world's most famous living philosophers, becoming a hot shot in finance may be the best way for a bright graduate to help the global poor.
A high earner in the corporate world who is giving away large sums can create more social gain than if they did charity work, said Peter Singer, who teaches at Princeton University.
"If they are able to live modestly and give a lot away, they can save many lives," he told the Thomson Reuters Foundation.
Singer is part of a movement of donors known as 'effective altruists', who demand concrete results from charitable donations, and often come from the business world. Silicon Valley billionaire Elon Musk will address the movement's global conference at Google headquarters in California in September.
The growing community encourages people to give big chunks of their income, typically around ten percent but in some cases more than half, to charities that alleviate global poverty.
Creating peace in Europe was a moral ambition that all could share. Yet Monnet was fully aware that moral appeals and trust in people's war-weariness would provide no more lasting security than international peace treaties. His idea, formulated as a plan with the French foreign minister Robert Schuman, was to overcome nationalism by gradually encouraging the nations to abandon rights of sovereignty, until, equally hollowed out and deprived of their very core, they would cease to have a future, hence undermining nationalism definitively. For this to work, supra-national institutions would have to gradually take over from national institutions. This process began with the creation of a high authority that regulated coal and steel production on behalf of the member states. Coal and steel were crucial not only for war but for reconstruction and economic revival. Creating a supra-national authority that controlled these products, ensuring their fair distribution and preventing secret rearmament, was the first step in a planned post-national development that would lead to the political and economic integration of the European nations, prevent them from deviating from the path, and that would ultimately supersede the nations entirely.
"Nationalism has destroyed European culture and civilization." (Stefan Zweig)
"The nations and their political institutions have proved once and for all that they are not equal to the task of lasting peace and rule of law." (Jean Monnet)
When I am not writing I am not writing a novel called 1994 about a young woman in an office park in a provincial town who has a job cutting and pasting time. I am not writing a novel called Nero about the world’s richest art star in space. I am not writing a book called Kansas City Spleen. I am not writing a sequel to Kansas City Spleen called Bitch’s Maldoror. I am not writing a book of political philosophy called Questions for Poets. I am not writing a scandalous memoir. I am not writing a pathetic memoir. I am not writing a memoir about poetry or love. I am not writing a memoir about poverty, debt collection, or bankruptcy. I am not writing about family court. I am not writing a memoir because memoirs are for property owners and not writing a memoir about prohibitions of memoirs.
When I am not writing a memoir I am also not writing any kind of poetry, not prose poems contemporary or otherwise, not poems made of fragments, not tightened and compressed poems, not loosened and conversational poems, not conceptual poems, not virtuosic poems employing many different types of euphonious devices, not poems with epiphanies and not poems without, not documentary poems about recent political moments, not poems heavy with allusions to critical theory and popular song.
In Cleveland, the ghost of d.a. levy is everywhere, even animating MOCA Cleveland's summer show. But what is it that makes the poet's legacy endure?
Morgan Meis in The Smart Set:
A young poet killed himself in Cleveland on November 24, 1968. He did it with a .22 caliber rifle he’d owned since childhood. In the years leading up to his death, the poet often demonstrated to friends how he could operate the gun with his feet and put the muzzle against his forehead, right at the spot of his “third eye.” The poet’s name was d. a. levy, as he liked to spell it (he was born Darryl Alfred Levy). He was just 26 years old when he died.
Just a year before his death, levy was arrested by the Cleveland police. He’d been indicted in 1966. The specific charge was “contributing to the delinquency of a minor.” At a poetry reading, he allowed juveniles to read work deemed obscene by city officials. levy’s own poetry had its share of bad words, sex, and drugs. The poet was a public advocate for the legalization of marijuana. It all seems rather tame by today’s standards. But in Cleveland in 1968, the d. a. levy affair created quite a ruckus. His arrest brought national attention. Guys like Allen Ginsberg and Gary Snyder got involved in the case, advocating for the dismissal of the charges against levy. The call to “legalize levy” became a rallying cry at protests and on t-shirts and flyers, not just in Cleveland but around the country.
After his death, many people in Cleveland adopted levy as a kind of local hero. And there it should have ended, if history is any guide. A young poet takes his own life. A city mourns. The relentless wheel of history churns on, forgetting as it goes.
If the right attacked Fukuyama for being insufficiently fearful about political threats to Western liberalism, the left attacked him for being insufficiently hopeful about economic alternatives to it. Fukuyama’s argument came on the heels of a set of developments that seemed to fit a pattern: the collapse of the USSR; Deng Xiaoping’s decision to move China toward something that looked a great deal like capitalism; Margaret Thatcher’s and Ronald Reagan’s attacks on the postwar welfare state. The closing-off of systematic alternatives to capitalism coincided with capitalism’s own transition from “Fordism” to “neoliberalism” (to use the now-conventional terminology), and Fukuyama seemed to exemplify both of these pernicious trends. To detractors on the left, his thesis was at best a failure of political imagination and at worst a highfalutin version of Thatcher’s taunt that “there is no alternative” to the free market.
However unappealing Fukuyama’s view may have been to the left, the lean years of Third Way liberalism and compassionate conservatism did little to disconfirm it. But more recent events have offered critics of the left, like those of the right, the chance to claim vindication by history.
David Cyranoski in Nature:
Jun Wang is one of China’s most famous scientists. Since joining the genome-sequencing powerhouse BGI when it started up 16 years ago, he has participated in some of its biggest accomplishments. These include sequencing the first genome of an Asian person, the giant panda and the human gut microbiome, as well as contributions to the Human Genome Project. Wang has led BGI since 2007 (when it stopped using the name Beijing Genomics Institute and moved its headquarters to Shenzhen). But on 17 July, the institute announced that he will give up that position to pursue research into artificial intelligence (AI).
What is the concept behind your AI project?
Basically, I am just trying to feed an AI system with masses of data. Then that system could learn to understand human health and human life better than we do. The AI will try to draw a formula for life. Life is digital, like a computer program — if you want to understand the results of the programming, how the genes lead to phenotypes, it is sufficiently complicated for you to need an AI system to figure out the rules. The AI system will basically consist of two components. The first is the big supercomputing platforms. We already have access to those through cloud computing and supercomputing centres. These will run or devise algorithms that look for relationships between genes, lifestyle and environmental factors, and predict phenotypes. The other thing is big data. We want to have data from one million individuals. And we want the data to be alive, in the sense that they can update their phenotype information at any time point. Other big computing companies, such as Google, could eventually do this, but we want to do it first. And we have the experience with the big data.
Tuesday, July 28, 2015
Massimo Pigliucci in Scientia Salon:
A false dichotomy is a basic type of informal logical fallacy, consisting in framing an issue as if there were only two choices available, while in fact a range of nuanced positions may be on offer upon more careful reflection. While I have argued together with my colleagues Maarten Boudry and Fabio Paglieri that often so-called logical fallacies turn out to be pretty reasonable heuristic strategies, there are nonetheless plenty of instances where they do identify truly bad reasoning. I have recently discussed one such case in reference to so-called trigger warnings in the context of college classes, but another one is arguably represented by the never ending “debate” about Islamophobia.
It is easy to find stark examples of people defending what appear to be two irreconcilable positions about how to view Islam in a post-9/11 world. For the sake of discussion, I will bypass pundits and other pseudo-intellectuals, and use instead two comedians as representative of the contrasting positions: Jon Stewart and Bill Maher.
Before proceeding I must acknowledge that while I’ve liked Stewart for a long time, and followed with pleasure his evolution from being solely a comedian to a savvy social commentator during his run at the Daily Show, my appreciation of Maher has slid further and further. I used to like his brusque style back when he was doing his “Politically Incorrect” show, first on Comedy Central, then on ABC. I was aghast when ABC (allegedly) let him go because he had dared to make the truly politically incorrect (but clearly correct) statement that the 9/11 hijackers could properly be labelled with a number of negative epithets, but that “coward” wasn’t one of them. But then he made his Religulous movie, where he slid into crass new atheism-style “criticism” of religion, and finally came out as an anti-vaxxer all the while chastising some of his guests who were “skeptical” of climate change for being anti-science.
Kate Douglas in New Scientist:
The global financial crisis of 2008 took the world by surprise. Few mainstream economists saw it coming. Most were blind even to the possibility of such a catastrophic collapse. Since then, they have failed to agree on the interventions required to fix it. But it’s not just the crash: there is a growing feeling that orthodox economics can’t provide the answers to our most pressing problems, such as why inequality is spiralling. No wonder there’s talk of revolution.
Earlier this year, several dozen quiet radicals met in a boxy red building on the outskirts of Frankfurt, Germany, to plot just that. The stated aim of this Ernst Strüngmann Forum at the Frankfurt Institute for Advanced Studies was to create “a new synthesis for economics”. But the most zealous of the participants – an unlikely alliance of economists, anthropologists, ecologists and evolutionary biologists – really do want to overthrow the old regime. They hope their ideas will mark the beginning of a new movement to rework economics using tools from more successful scientific disciplines.
Drill down, and it’s not difficult to see where mainstream “neoclassical” economics has gone wrong. Since the 19th century, economies have essentially been described with mathematical formulae. This elevated economics above most social sciences and allowed forecasting. But it comes at the price of ignoring the complexities of human beings and their interactions – the things that actually make economic systems tick.
The problems start with Homo economicus, a species of fantasy beings who stand at the centre of orthodox economics. All members of H. economicus think rationally and act in their own self-interest at all times, never learning from or considering others.
We’ve known for a while now that Homo sapiens is not like that (see “Team humanity“). Over the years, there have been various attempts to inject more realism into the field by incorporating insights into how humans actually behave. Known as behavioural economics, this approach has met with some success in microeconomics – the study of how individuals and small groups make economic decisions.
Ian Parker in The New Yorker (Photo by Davide Monteleone):
Varoufakis, a mathematical economist with a modest academic reputation, had become a popular writer in Greece. When the snap election was called, he interrupted his professorship at the University of Texas, flew home to Greece, and launched a ten-day election campaign whose sole expense was the cost of gas for his motorcycle. He was running for parliament, with the aim of becoming the finance minister in a Syriza government. The vote was held on January 25th. Syriza doubled its number of seats in parliament, and Tsipras formed a government in coalition with a small right-of-center party that shared its opposition to the troika’s terms. Varoufakis was elected with a larger share of the vote than any other candidate, and he was named the finance minister. His only previous experience of representative office was as the (white, Greek) leader of the Black Students’ Alliance at the University of Essex, a British institution, in the late seventies. Privately, he asked himself, “What have I done?” On his blog, he borrowed some thoughts of defiance—and, by implication, certain failure—from Dylan Thomas. “Greek democracy today chose to stop going gently into the night,” Varoufakis wrote. “Greek democracy resolved to rage against the dying of the light.”
A few years ago, Varoufakis told Yorgos Avgeropoulos, a documentary filmmaker, that the difference between a debt of ten thousand euros and one of three hundred billion euros is that only the latter gives you negotiating power. And it does so only under one condition: “You must be prepared to say no.” Upon his election, Varoufakis used the less than ideal influence available to a rock climber who, roped to his companions, announces a willingness to let go. On behalf of Tsipras’s government, Varoufakis told Greece’s creditors, and the world’s media, that his country objected to the terms of its agreements. This position encouraged widespread commentary about Greece following a heedless path from “no” to default, and from default to a “Grexit” from the euro currency, which might lead to economic catastrophe in Europe and the world.
It was as if Christopher Hitchens had woken up one day as Secretary of State. Varoufakis was no longer writing elegantly prosecutorial blog posts about Christine Lagarde, the managing director of the I.M.F.; he was meeting with Lagarde. Within days of Greece’s election, an academic with Marxist roots, a shaved head, and a strong jaw had become one of the world’s most recognizable politicians. He showed a level of intellectual and rhetorical confidence—or, perhaps, unearned swagger—that lifted Greek hearts and infuriated Northern European politicians.
Greg Epstein in Salon:
Coates, an award-winning journalist for the Atlantic, is primarily seen as a writer on race. And “Between the World and Me” is, on one level, a book about race, with the story of his murdered friend Prince Jones making Sandra Bland’s seemingly similar death look all the more like a depressing and infuriating act of terror. But atheists and humanists tend to see ourselves as transcending culture and race. So much so that I’ve always been dismayed to find the majority of people who tend to show up at the meetings of organizations with words like atheist and humanist in their names, are so very, very white. Why? Maybe, as I explored in my book “Good Without God” (a title meant to offer a three-word definition of humanism), in an America where religious identity is all many minorities have to fortify them against a society that treats them as inferior and other, identifying as an atheist is far easier for people of privilege.
But Coates’ new book is also, boldly, about atheism. It is even more so about humanism. Crafting a powerful narrative about white Americans — or, as he says, those of us who need to think we are white — who are living The Dream — Coates makes a profound statement of what is, and is not, good, with or without god. Coates refers not to Martin Luther King Jr.’s dream, not quite even to the “American Dream,” but rather to The Dream in which we forget our history, our identity and much of our nation’s prosperity is built on the foundation of the suffering of people of color in general and black people in particular. The Dream, in other words, is not a state in which only Fox News Watchers find themselves. It is a state that can cancel out the very best of white, liberal, humanist intentions.
Danke, merci, gracias
for the heat of the sun,
the kindness of teaching,
the smell of fresh bread.
Diolch, nkosi, shur-nur-ah-gah-lem
for the sound of sand,
the book and the pen.
Dhannyabad, blagodaria, hvala
for the blue of small flowers,
the bobbing seal’s head,
the taste of clean water.
Shukran, rahmat, shukriya
for the stripe of the zebra,
the song of the chaffinch,
the gentleness of snails.
Mh goi, abarka, xièxiè
for the length of time,
the loveliness of eyelashes,
the arc of the ball.
Dziekuje, abrigado, shakkran
for the excitement of falling,
the stillness of night,
for my heart beating, thank you.
from If You Could See Laughter
Salt Publishing, Cromer, 2011
George Johnson in The New York Times:
Maybe it was in “some warm little pond,” Charles Darwin speculated in 1871, that life on Earth began. A few simple chemicals sloshed together and formed complex molecules. These, over great stretches of time, joined in various combinations, eventually giving rise to the first living cell: a self-sustaining bag of chemistry capable of dividing and spawning copies of itself. While scientists still debate the specifics, most subscribe to some version of what Darwin suggested — genesis as a fortuitous chemical happenstance. But the story of how living protoplasm emerged from lifeless matter may also help explain something darker: the origin of cancer.
As the primordial cells mutated and evolved, ruthlessly competing for nutrients, some stumbled upon a different course. They cooperated instead, sharing resources and responsibilities and so giving rise to multicellular creatures — plants, animals and eventually us. Each of these collectives is held together by a delicate web of biological compromises. By surrendering some of its autonomy, each cell prospers with the whole. But inevitably, there are cheaters: A cell breaks loose from the interlocking constraints and begins selfishly multiplying and expanding its territory, reverting to the free-for-all of Darwin’s pond. And so cancer begins.
Nicholas Agar in the OUP blog (Image credit: “City Lights”, by Unsplash. Public Domain via Pixabay):
Humans have flexible psychologies that enable us to flourish in environments ranging from the Arctic to the Kalahari Desert. Walruses and meerkats lack this psychological flexibility. They are unlikely to work out how to survive an exchange of habitats. Hedonic normalization permits a human raised in the high Himalayas to find that environment normal. The same psychological mechanism that hedonically normalizes humans to Arctic and desert environments normalizes us to the very different technological environments of the 1st and 21st centuries. We can predict that it will normalize us to the technologies of the 23rd century. Differences in hedonic normalization mean that ancient Romans, 21st century New Yorkers, and 23rd century residents of Cairo view cars powered by internal combustion engines very differently. What for the Romans is a quite miraculous technology, is boringly familiar to the New Yorkers, and repellently primitive and polluting for the Cairenes.
When we overlook hedonic normalization we tend to significantly overstate the extent to which technological progress will boost the happiness of future people. I would be very happy to abruptly find myself on board a 23rd century starship. But this is not how people hedonically normalized to 23rd century will feel. The error of ignoring hedonic normalization is especially apparent when we think about the past. Techno-optimists point to the big differences that technological change has made to our world. Mary Beard’s description of the streets of ancient Pompeii covered in animal dung, rotting vegetables, human excrement and flies makes modern city dwellers glad to be alive now. But imagining how a time traveller from the early 21st century would feel to find herself marooned in Pompeii does not tell us how people hedonically normalized to that time felt. Doubtless the Pompeians would have preferred cleaner streets. But the filthiness of their streets did not affect them in the way that it would affect someone normalized to our comparatively refuse and excrement-free highways and byways. To see this more clearly consider how people from the 23rd century will feel about life in our times. The conditions of our cities are clearly not perfect – but they are not nearly as bad for us as they will seem to someone normalized to the cities that 23rd century technologies will build.
Kathleen McNamara in the WaPo's Monkey Cage:
Economists are condescendingly scolding the Europeans for venturing into a single currency without the proper underlying economic conditions. Paul Krugman has relentlessly excoriated the leaders of Europe for being what he calls “self-indulgent politicians” who have “spent a quarter-century trying to run Europe on the basis of fantasy economics.” The conventional wisdom seems to be that the problems of the euro zone are, as economist Martin Feldstein once put it, “the inevitable consequence of imposing a single currency on a very heterogeneous group of countries.”
What this commentary gets wrong, however, is that single currencies are never the product of debates about optimal economic solutions. Instead, currencies like the U.S. dollar itself are the result of political battles, where motivated actors try to centralize power. This has most often occurred “through iron and blood,” as Otto von Bismarck, the unifier of Germany, put it, as a result of catastrophic wars. Smaller geographic units were brought together to build the modern nation state, with a unified fiscal system, a common national language that was often imposed by force, a unified legal system, and a single currency. Put differently (with apologies to sociologist Charles Tilly), war makes the state, and the state makes the currency.
The U.S. case is instructive. America used to have a chaotic multitude of state currencies and privately issued bank notes, with complex exchange rates between them. This only changed thanks to the Civil War. The American greenback was created in 1863 when Abraham Lincoln’s Republican Party muscled through legislation giving the federal government exclusive currency rights. It was only able to do this because Southern legislators, who opposed more centralization of power, had seceded from the American union. The Union side wanted a common currency to help the war effort by rationalizing revenue raising and wartime payments. But it was also a potent symbol of the power of the federal state in the face of the challenges of a disintegrating union.
Patrick French reviews Yasmin Khan's new book, in The Guardian (Photograph: Michael Ochs Archives/Getty):
Yasmin Khan reminds us at the start of her book that “Britain did not fight the second world war, the British empire did”. Remembrance is a great British virtue. Whether it’s a Spitfire display, replica red poppies streaming out of the Tower of London or a commemoration of the battle of Waterloo, we know how to do it. Winston Churchill’s idea of a plucky island race standing firm against tyranny in two world wars continues to resonate. Troops from Africa, the West Indies, India and beyond are historically more awkward: they tend to be seen as an adjunct to the main event, although Britain’s success in both wars came from the logistics and manpower derived from its massive empire. At last year’s centenary of 1914, the government avoided the E-word and called such people “Commonwealth soldiers”, although the Commonwealth did not exist at the time. In South Asia, too, the 2.5 million volunteers who served in the second world war are forgotten, since they do not fit easily with the nationalist narrative of independence attained by non-violent resistance.
In The Raj at War, Khan sets herself a tough task: to recover the weft of India during the second world war and tell a story not only of servicemen but of nurses, bearers, political activists, road builders, seamen, interned central European Jews, schoolgirls, Bengali famine victims, enlightened officials, 22,000 African American GIs and even destitute Kazakhs, Iraqi beggars and orphaned Polish children who were escaping upheavals elsewhere. “At many stops on their way to Bombay, local people greeted the children at the stations, treating them with sweets, fruits, cold drinks and toys,” reported the wife of the Polish consul general.
Telling history from the bottom up is difficult, since those in extremis rarely record their experiences; it is easier to come in from the sides than from below, and use the diaries and letters of Europeans or members of India’s Anglophone elite.
Monday, July 27, 2015
by Akim Reinhardt
It was my first time visiting, and before arriving, I didn't know much about this nearly arctic island other than some vagaries about vikings and banking scandals. So I had very little in the way of preconceived notions about the cuisine, and didn't expect anything in particular.
It turns out the food was quite good. There's lots of soup, and I'm a whore for soup, so that was a good match. Also tons of seafood, which is another favorite of mine, although it doesn't quite drive me to walk the streets with a handkerchief dangling behind my shoulder. And then there's also various treats, ranging from liquor to throat lozenges, that feature harshly medicinal herbal flavors. Cheers to that, I say.
Oh, and the chocolate. Far better than I would've guessed. No nonsense. Dark, chalky and delicious.
I don't eat meat, so all that mutton was lost on me, but overall I found Iceland to be a wonderful culinary experience. However, there were also elements of the surreal, which is often the case when one ventures into a new land for the first time. And that is what I would like to share in this photo essay.
What follows are images of and brief comments about things that are neither right nor wrong, but rather just make me smile and remind me that we are all very strange.
by Leanne Ogasawara
In Japan, I knew a gentleman who ran a 200 year old miso shop. K san was also a bon vivant par excellence! Studying Samurai-style (Enshu school) tea ceremony, he wore stylish kimono by day and organized French film festivals for our town on the weekends. He also spent a fortune on tea bowls and art, which he often would show to his friends.
Everyone in town knew him and his miso shop was a gathering place of local luminaries.
Of all the interesting things he was involved in, my favorite was his gramophone club. Once a month like-minded collectors would show up with a favorite record (or not) and sit around listening to old records while drinking sake. Need I say more? The man had endless curiosity and tremendous style. He was my kinda guy!
Speaking of which, I recently finished the most unusual book by Norman Cantor, called Inventing the Middle Ages. The book is about twenty prominent 20th century Medievalists and their impact on the study of the history of the Middle Ages. When I first heard that this book was not just a best seller but was so popular it was even available on Audible, I could hardly believe it! Really? I love anything related to the Middle Ages and so would have read the book no matter what, but I must admit that I was utterly fascinated by its popularity, as well as by the controversy surrounding this book, which after all was on such an obscure topic.
So, I picked up the book immediately.
And I wasn't disappointed.