Monday, April 27, 2015
by Jalees Rehman
"Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it—whole-heartedly—and delete it before sending your manuscript to press. Murder your darlings."
Sir Arthur Quiller-Couch (1863–1944). On the Art of Writing. 1916
Murder your darlings. The British writer Sir Arthur Quiller-Couch shared this piece of writerly wisdom when he gave his inaugural lecture series at Cambridge, asking writers to consider deleting words, phrases or even paragraphs that are especially dear to them. The minute writers fall in love with what they write, they are bound to lose their objectivity and may not be able to judge how their choice of words will be perceived by the reader. But writers aren't the only ones who can fall prey to the Pygmalion syndrome. Scientists often find themselves in a similar situation when they develop "pet" or "darling" hypotheses.
How do scientists decide when it is time to murder their darling hypotheses? The simple answer is that scientists ought to give up scientific hypotheses once the experimental data is unable to support them, no matter how "darling" they are. However, the problem with scientific hypotheses is that they aren't just generated based on subjective whims. A scientific hypothesis is usually put forward after analyzing substantial amounts of experimental data. The better a hypothesis is at explaining the existing data, the more "darling" it becomes. Therefore, scientists are reluctant to discard a hypothesis because of just one piece of experimental data that contradicts it.
In addition to experimental data, a number of additional factors can also play a major role in determining whether scientists will either discard or uphold their darling scientific hypotheses. Some scientific careers are built on specific scientific hypotheses which set apart certain scientists from competing rival groups. Research grants, which are essential to the survival of a scientific laboratory by providing salary funds for the senior researchers as well as the junior trainees and research staff, are written in a hypothesis-focused manner, outlining experiments that will lead to the acceptance or rejection of selected scientific hypotheses. Well-written research grants always consider the possibility that the core hypothesis may be rejected based on the future experimental data. But if the hypothesis has to be rejected, then the scientist has to explain the discrepancies between the preferred hypothesis that is now falling into disrepute and all the preliminary data that had led her to formulate the initial hypothesis. Such discrepancies could endanger the renewal of the grant funding and the future of the laboratory. Last but not least, it is very difficult to publish a scholarly paper describing a rejected scientific hypothesis without providing an in-depth mechanistic explanation for why the hypothesis was wrong and proposing alternate hypotheses.
Nine days after 9/11, on 20 September 2001, President George W. Bush responded to the World Trade Centre attacks by addressing a joint session of Congress. He lamented that in the space of a 'single day' the country had been changed irrevocably, its people 'awakened to danger and called to defend freedom'. Out of the painful deaths of almost 3000 people germinates anger and a drive for retribution. The attackers, whom Bush terms 'enemies of freedom', are apparently motivated by envy as well as hatred:
They hate what they see right here in this chamber: a democratically elected government. Their leaders are self-appointed. They hate our freedoms: our freedom of religion, our freedom of speech, our freedom to vote and assemble and disagree with each other.
In this passage alone, there are four instances of 'freedom', and in the approximately 3,000-word-long speech from which it is taken, 'freedom' is invoked 13 times.
Given that the speech was a major statement of Bush's intent following the wound of 9/11 and that the US government uses the name 'Operation Enduring Freedom' to describe its War on Terrorism, it is clear that freedom is a crucial concept to the US and its allies. This is unsurprising, since the Statue of Liberty on Liberty Island off the coast of New York City has long served as a symbol of freedom and the vaunted American myth of social mobility. But what does freedom consist of and is it a universal value? In other words, does everyone – men and women, and people from different classes, races, or religious backgrounds – experience it in the same way?
In 2014, Bangladeshi-origin writer Zia Haider Rahman published his fascinating and very male debut novel In the Light of What We Know. The book deals in part with 9/11 and its aftermath. One of Rahman's two main protagonists, Zafar, works in Afghanistan soon after the outbreak of war in 2001. He avers that the American occupiers 'justify their invasion of Afghanistan with platitudes about freedom and liberating the Afghani people'. Having studied law and worked for a US bank, Zafar is in some ways part of the American 'relief effort'. And yet he is simultaneously not part of it, due to his Bangladeshi background and brown skin. Because of this, coupled with his working-class origins, he sees through the rhetoric of freedom as platitudinous.
by Jonathan Kujawa
In 1971 H. S. M. Coxeter introduced a mathematical trifle he called "friezes". At the time they didn't seem like much more than a cute game you can play. In the past decade, however, they've become a central player in a major new area of research. I recently saw an entertaining talk by Peter Jorgensen at the Mittag-Leffler Institute about his work in this area with Christine Bessenrodt and Thorsten Holm. Peter's talk reminded me that I should really tell you the story of friezes. We've all seen friezes such as this one by Caravaggio:
As you can see, a frieze is an array of the counting numbers (1, 2, 3, 4, ...) where the top and bottom rows are all ones. The dots on the left and right mean that each row continues forever in both directions. The real mystery is the numbers in the middle rows. There is some sort of pattern and symmetry but I, at least, couldn't quite put my finger on it the first time I saw a frieze. The mysterious missing rule is that each diamond of four numbers (an entry a on the left, b on top, c on the bottom, and d on the right)
is required to satisfy the equation ad - bc = 1.
You can check the diamonds in our example frieze and see that the formula always holds true. Notice, too, we can make diamonds on each edge where we know three out of four of the numbers. Using our formula we can solve for the missing number and so fill in missing numbers on each edge. This observation along with the fact that the top and bottom rows are always one means the frieze really does continue forever in both directions and the Rule for Diamonds tells us how to calculate the missing numbers.
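The filling-in process described above is easy to mechanize. Here is a minimal Python sketch of my own (not code from the post): it stores one period of each repeating row and applies the standard Coxeter diamond rule ad - bc = 1 to solve for the bottom entry c of each diamond, c = (ad - 1)/b. The starting "quiddity" sequence [1, 3, 1, 2, 2] is an assumed example, the classic one arising from a triangulation of a pentagon, and not necessarily the frieze pictured in the original post.

```python
# Generate the rows of a frieze pattern from its second row (the "quiddity"
# sequence), using the diamond rule a*d - b*c = 1. Each row is stored as one
# period of its infinitely repeating pattern; which neighbor counts as the
# "top" of a diamond is an indexing convention chosen here.

def next_row(above, current):
    """Given two consecutive rows (one period each), compute the row below."""
    n = len(current)
    row = []
    for i in range(n):
        a, d = current[i], current[(i + 1) % n]  # left and right of the diamond
        b = above[(i + 1) % n]                   # top of the diamond
        assert (a * d - 1) % b == 0, "not a valid frieze"
        row.append((a * d - 1) // b)             # bottom: c = (a*d - 1) / b
    return row

def frieze(quiddity, max_rows=20):
    """Build rows, starting from a row of ones, until the bottom row of
    all ones reappears (or a safety cap is hit)."""
    rows = [[1] * len(quiddity), list(quiddity)]
    while set(rows[-1]) != {1} and len(rows) < max_rows:
        rows.append(next_row(rows[-2], rows[-1]))
    return rows

for r in frieze([1, 3, 1, 2, 2]):
    print(r)
# prints:
# [1, 1, 1, 1, 1]
# [1, 3, 1, 2, 2]
# [2, 2, 1, 3, 1]
# [1, 1, 1, 1, 1]
```

Note how the bottom row of ones emerges on its own: that the rule closes up like this, and that every entry stays a positive integer, is exactly the surprise that makes friezes more than a cute game.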
Sabeen Mahmud. Artist, activist, intellectual; a woman with a true heart. A hero for many.
Murdered in Karachi on Friday, April 24, leaving an immense void.
by Dwight Furrow
It's not like "reading tea leaves". Fermented grape juice will not foretell the future. But wine does tell a story if you speak its language. Now, I'm not getting all mystical here by attributing linguistic ability to fermented grape juice. The story a wine tells is quite concrete and palpable like mud on the boots and mildew on leaves. The flavors and textures of wine are not merely sensations but qualities that say something about the land on which grapes are grown, the people who made the wine, the world they live in, and the person who is drinking it. Discovering these details gives a wine resonance and meaning that cannot be gained by mere consumption.
A wine has flavor because it is made from a specific grape, from a specific piece of land, and by a winemaker who intended the wine to taste as it does. The winemaking process and decision to plant those particular grapes is a centuries-long process of adapting grapes to climate, soil, and taste preferences. Thus, when you taste a wine you taste the residue of geography and culture. Taste opens up a world with a rich assortment of connections just like any good book.
Of course, anything we consume has a history and a process that produced it. And with sufficient knowledge of how it was produced, we might identify features of that process by attending to its flavor. But wine is unique because when you pay attention and understand why winemakers make the wine they do, the wine says something about them, their family, what they like to drink, and their motivations for making wine. A can of Coke tells you little of importance about the people who make it or the place it comes from. It can be made anywhere by anyone if the price is right. Not so with non-industrial wines. They are inherently artisanal products, and inherently a product of place, and they tell a very human story. Wine is one of the few products where geography, human culture, and aesthetics meet with such intensity, variability, and beauty. It is thus full of meaning waiting to be interpreted.
by Kathleen Goodwin
The full extent of my identification with my Armenian heritage was once dyeing Easter eggs a mottled maroon the traditional Armenian way (with onion skins) with my Armenian grandmother. In high school, when I learned about the systematic eradication of Armenians during World War I, I didn't feel any sense of personal injustice. By college, when the Kardashian franchise had familiarized the American public with the existence of the tiny west Asian country, revealing I was part Armenian "like Kim Kardashian" became my go-to ice breaker when having to share an interesting fact about myself. Truly, I've only come to recognize myself as Armenian-American in the past few weeks, as the media has highlighted the century-long struggle of Armenians to have world leaders acknowledge the Armenian genocide.
This past Friday, April 24, marked the centennial of the day in 1915 when approximately 250 prominent Armenian intellectuals were rounded up by Ottoman officials and deported from Constantinople (present-day Istanbul). Most of them were eventually killed, along with an estimated 1.5 million Armenians over the course of the next seven years. The Ottoman Turks, who had already lost land they once ruled in the Balkans, feared that the Armenian Christian minority would ally itself with Russia and hasten the destruction of their empire from within its own borders. By the end of World War I, the Ottoman Empire was disbanded, and the nation of Turkey that emerged in the aftermath was founded by the same Ottoman officials who continued to exile and murder Armenians.
The primary grievance of Armenians today is the refusal of the Turkish government, as well as most other nations including the United States, to acknowledge that what occurred during the fall of the Ottoman Empire should be termed "genocide". Some Armenians admit that the singular focus on semantics has sometimes reached hyperbolic proportions and keeps Armenians mired in the past, ultimately preventing them from fully thriving in the present. I will admit that I previously thought the obsession with achieving the label "genocide" was misplaced. I reasoned that if the Turkish government had refused to own up to its historic crimes for so long, fighting for its confession wasn't worth the time of Armenians who are trying to preserve their culture and move forward with their lives. In some ways it felt like begging the world to acknowledge the genocide continued to hand power to the oppressors, instead of allowing Armenians to take back ownership of their legacy.
by Madhu Kaza
On the evening of April 13th I heard the news that the Uruguayan writer Eduardo Galeano had passed away. Earlier that day, after work, I had gone to get a manicure at a salon in my neighborhood; my hands and wrist hurt from typing all day and more than new nail polish I wanted a little break. The manicurist was a young woman just three years out of high school. She had been born in Mexico City, and at the age of five she left for New York with her mother and siblings to join her father who had come a few years earlier. She arrived one month before September 11th, 2001. While she filed, soaked and painted my nails, the young woman, L, told me about her dog, Amigo, whom she had to leave behind in Mexico, about her sense of loss when she arrived in the United States and her even deeper sense of loss when her mother returned to Mexico a few years ago. "It's been so hard," she said, "No one gets you like your mom." L lives on her own, and though she would love to go to college it's beyond her financial means; it's enough for her to manage getting by while working full time. At the end of my appointment when I told her that she had a beautiful name she said, "I'm named after my father." "What is your mother's name?" I asked. Her eyes brightened as she said, "Maria. But it's very interesting because her name is Maria Herculia – it's like Maria Hercules."
I was still thinking about L when I heard the news of Galeano's death. Galeano often spoke of his work as a project of writing historical memory; it was an oppositional history of remembrance in the face of historical amnesia. In a 2013 interview with The Guardian Galeano spoke of this amnesia in systemic terms: "It's a system of power that is always deciding in the name of humanity who deserves to be remembered and who deserves to be forgotten … We are much more than we are told. We are much more beautiful." The stories that Galeano collected and wrote formed an underside of history, the memory of those who are easily forgotten in the grand narratives of conquest, capitalism and progress. Even as his writing had a broad historical sweep – he wrote histories of Latin America as well as histories of the world from pre-historic times to the present day – he was interested in the particulars; his works of short prose commemorated and celebrated ordinary people in their labors, their loves and their woes. It was through these stories of individual people and particular communities that Galeano's writing came to life. He noted his interest in "small things and small people." That night when I learned that he had died, I imagined how Galeano might have delighted in and given shape to the narrative of the daughter of Maria Herculia, whose story contains both the residue of disruptive historical currents and the heroics of everyday life.
by Tamuira Reid
She calls me in the middle of the night. I call her when I know she won't be home.
"How many floors are in your building?"
"What? Mom, I'm sleeping."
"Tell me how many floors!"
"I don't know. Five? Six?"
"Okay good. As long as it's not a high rise. You know, they always bomb the big buildings first. You're better off moving to Brooklyn."
She is round and smooth and old. I'm younger, harder, meaner. She's the clear blue rock you find at the water's edge, the one that has been caressed by time. I'm the piece of glass that cuts your finger, the broken cola bottle that you mistake for something else.
I still don't know what I want to be. I don't know where I want to live. I don't know if I'll ever make it.
She cries when no one is around. Dreams in private. Wishes things were different.
I smoke too many cigarettes. My mother has never had a single puff. I take long, poetic walks along the Hudson River. Her shoes give her blisters. I read books. She buys the audio. We listened to Sarah Palin's memoir on the way to Los Angeles last summer for my cousin's wedding. Hours of torture. My mother likes to be entertained while she drives.
I've had several boyfriends. She is a serial monogamist. I know when it is time to get out. She forgives too easily.
Mom likes Mel Gibson. A lot. "I can't stand him," I tell her. "He's racist and conservative. His politics suck."
"But he has such an amazing handlebar moustache. I love handlebar moustaches."
"He doesn't have one. Mom, you're thinking of Tom Selleck. Magnum PI?"
Lots of facts exist only in my mother's world. She is never wrong in her world. She is never late in her world. She is never depressed in her world.
by Brooks Riley
by Edward Rackley
The occasion to commemorate Tim Hetherington's life and work is now upon us; let it not pass in silence. He died on April 20, 2011 from a Libyan mortar on the streets of Misrata. I didn't know him personally, as many friends and colleagues did, but I followed his work from the early 2000s in Liberia through the Oscar-nominated Restrepo in 2010. Even in his earliest published work, a new creative force was clearly behind the lens.
He had an uncanny talent for capturing the grace of strangers amidst the peril of explosive circumstances, framing them not as cannon fodder or cardboard victims but as dignified members of a forlorn species. "Often we see scenes of disaster and forget that the people imaged are individuals with individual stories and lives," Tim explained in a clip on his working process. The moral complexity of his subjects matched my own experiences in crumbling dictatorships and nations rent asunder by grievance and the promises of insurrection. From Liberia to Darfur and Afghanistan, Hetherington's different media projects tapped their own turgid fount of memories sweet and sour.
His early Liberia photos were memorable for their fleeting dignity and searing panic of private moments in battle, serendipitous snaps of civilians and combatants with poignant acumen. Others miraculously wove the social, political and economic threads of a conflict into a single image--a West African Breughel sans folly or satire. Child soldiers lording over diamond diggers sprawled in open mud flats, sifting for riverbed gems to fund campaigns of mass amputation, beheading and rape. Portraits of human industry absent any social or political aim beyond self-serving blood and lucre.
This was early Hetherington: still mystified by the paroxysms of humanity in the throes of war. Not a bad start, but embedding in warzones is not hard to do, after all. Anyone can become cannon fodder, and journalists have been accessing armies and frontlines for over a century.
by Sue Hubbard
You really do wonder, sometimes, just how long some women artists have to be around before anyone takes notice. When asked by a callow journalist how she felt, in her 90s, at having recently become famous, the artist, Louise Bourgeois replied acerbically: “I’ve been ‘ere all along.”
That this current show at Tate Modern, by the artist Sonia Delaunay, should be her first retrospective in the UK, despite her 60-year-long career, is surprising. Though not a household name, she created, long before such things were au courant, a hallmark style as an avant-garde painter and an innovative fashion and theatre designer. Anyone born in the 40s or 50s, whether they realise it or not, will be familiar with the influence of her abstract designs on postwar fabrics. To be a woman artist during the height of modernism was something of a paradox. Modernism and its playground, Paris, certainly gave women new freedoms in terms of art education, living arrangements, travel and relationships. But art history has, despite inroads made in the 70s by feminist critics, been a narrative written largely from a male perspective.
Born Sara Élievna Stern in 1885, the youngest child of a modest Jewish family from Odessa, Delaunay lived a life that reads like that of a heroine from a 19th-century novel. Sent by her parents to live with her wealthy uncle, Henri Terk, she adopted the name Sofia Terk (though she was always known as Sonia). Through her uncle she was introduced to the great museums of St. Petersburg, spent summers in Finland, and became familiar with European culture. At the age of 18 she went off to study art in Germany. Seeking to emancipate herself from her middle-class background, she went in search of artistic freedom, reading books on psychology and philosophy, including the book of the moment, Nietzsche’s Beyond Good and Evil. She also developed a passion - one shared with her contemporary the poet Rainer Maria Rilke - for all things Slavic, perhaps as a way to stay in touch with her childhood. And she started to sew.
by Scott F. Aikin and Robert B. Talisse
In the United States, the political season is almost upon us. Campaigns are gearing up, contrasts are being drawn, and debates are beginning to emerge. This is an important time for those who are interested in the norms of argument and public deliberation. Fallacy-detection is a favorite pastime, and we ourselves are enthusiastic participants. However, there is considerable confusion surrounding one of the most widely-known and commonly-attributed fallacies, the ad hominem ("to the man").
Fallacies are improper inferences, popular ways of drawing conclusions from premises that in fact offer them no support. In its most common variety, ad hominem fallacy takes the following form:
Premise: Subject S is in some specified way vicious.
Conclusion: We should reject the things S says.
The vices identified in the premise of course vary. Depending on the context, it might be claimed that S is a philanderer, a hypocrite, an alcoholic, a drug abuser, a child abuser, a racist, a pervert, a neoliberal, a lowbrow, an egghead, a neocon, a snob, a pinhead, a know-nothing, and so on. To be sure, some of these traits may not be actual vices, but the effective deployment of the ad hominem depends only on the speaker's audience believing that the trait attributed in the premise is indeed vicious. The ad hominem's strategy is that of identifying the purported vice ascribed to S in the premise as sufficient grounds for rejecting the things S has said.
The prevalence of the ad hominem in political debate is easy to explain. Given the carefully curated and time-constrained forums in which most public political discourse occurs, it is just easier for disputants to talk about each other than the ideas and policies over which they disagree. Consequently, discussions of politics all too regularly become wranglings over personalities. Yet, despite its understandable prevalence, the garden-variety ad hominem is obviously fallacious.
by Shadab Zeest Hashmi
In a lineated poem, line-breaks are used to produce verbal or sonic emphasis, in addition to creating a structure that makes the poem easy to parse and comprehend. When line-length varies, emphasis shifts and dramatic tension or narrative effect is produced. Generally speaking, in a free-verse poem line-length varies without a set pattern, and the variation depends on where the poet wants emphasis; but in his (free-verse) poem “Greed,” C.K. Williams uses a pattern to arrange the lines. He uses long lines (flush to the right margin) alternated with short lines of five to eight syllables. The lines are enjambed and form remarkably long sentences. Such sentences might ordinarily be in danger of becoming unwieldy or out of control. Williams brings aesthetic order to the poem by using a typographical pattern and a pattern of sonic devices, thereby creating a piece that has narrative clarity as well as narrative impact or dramatic tension.
Typographically, Williams’ style of predictably continuing each long line to the flush right margin and indenting each alternate short line establishes a pattern that helps the eye get accustomed to this arrangement and parse the sentences with ease:
A much-beaten-upon-looking, bedraggled blackbird, not a starling, with
     a mangled or tumorous claw,
an extra-evil air, comically malignant, like something from a folktale
     meant to frighten you,
gimps his way over the picnic table to a cube of moist white cheese into
     which he drives his beak.
There is a suspended syntax in this long sentence, but the words are strung together alliteratively and with a deft use of diction that creates sound patterns forming sonic clusters, making the sentence cohere and aiding comprehension. In the above stanza, “bedraggled blackbird” and “extra-evil” are alliterative. There is a sonic partnership or interplay in diction such as “starling,” “mangled” and “malignant,” or between “gimp” and “picnic,” or “cheese” and “beak,” or in the phrase “folktale/ meant to frighten you.” These patterns of sonic play establish a harmony which contributes to clarity through its cohesive effect. The pattern becomes more and more vivid as the poem continues:
Then a glister of licentious leering, a conspiratorial gleam, the cocked
     brow of common avarice:
he works his yellow scissors deeper in, daring doubt, a politician with
     his finger in the till,
a weapon maker’s finger in the politician, the slobber and the licking
     and the champ and the click.
Sunday, April 26, 2015
Tim Maudlin at the PBS Nova website:
How can we understand the world in which we find ourselves? How does the universe behave? What is the nature of reality? … Traditionally these are questions for philosophy, but philosophy is dead. Philosophy has not kept up with modern developments in science, particularly physics. Scientists have become the bearers of the torch of discovery in our quest for knowledge. —Stephen Hawking and Leonard Mlodinow
This passage from the 2010 book “The Grand Design” set off a firestorm (or at least a brushfire) of controversy. Has philosophy been eclipsed by science in the quest for understanding reality? Is philosophy just dressed-up mysticism, disconnected from scientific understanding?
Many questions about the nature of reality cannot be properly pursued without contemporary physics. Inquiry into the fundamental structure of space, time and matter must take account of the theory of relativity and quantum theory. Philosophers accept this. In fact, several leading philosophers of physics hold doctorates in physics. Yet they chose to affiliate with philosophy departments rather than physics departments because so many physicists strongly discourage questions about the nature of reality. The reigning attitude in physics has been “shut up and calculate”: solve the equations, and do not ask questions about what they mean.
Kenan Malik in Pandaemonium:
‘With the rise of China’, Martin Jacques writes in his book When China Rules the World, ‘Western universalism will cease to be universal – and its values and outlook will become steadily less influential. The emergence of China as a global power in effect relativizes everything.’ The transformation of China into an economic superpower raises important and challenging questions about how we perceive the world. Our understanding of history and culture will unquestionably change. The Era of the Warring States may come to be seen as being as significant as the Peloponnesian War, or 1911, the end of the dynastic era, as important a date as 1789 and the fall of the French monarchy. Kongzi, Mo Tzu and Zhu Xi may become as well known as Plato, Aristotle and Aquinas. Lu Xun could be regarded as a writer as fine as James Joyce.
But what about our understanding of morality? To what extent will the rise of China and the decline of Europe and America transform the way we understand moral values? Will universalism be seen merely as a form of Western particularism? To what extent will ‘everything be relativised’?
The story of this book is the story of how the centre of gravity of moral thinking has historically shifted. In the ancient world, Greece, Israel, Persia, India and China were all sources of civilization and of distinctive moral philosophies. The concepts that developed at each source were shaped by the particularities of the local culture and social needs; there were, nevertheless, also common themes that spanned continents, from the idea of virtue to the Golden Rule. The rise of monotheism, and in particular of Christianity, transformed the discussion of ethics in Europe, establishing the idea of rule-based morality, guided and anchored by a divine intelligence, and developing ideas of universalism. The emergence of Islam at the end of the first millennium CE, and its expansion through the beginning of the second, created a new centre of intellectual gravity. Drawing upon the heritage of Greece, Persia and India, as well as the Judaic and Christian traditions, the Islamic Empire came to be a bridge both between the Ancient world and early modernity and between East and West. The only empire that in its day could challenge the philosophical and technological supremacy of the Islamic Empire was China, where the arrival of Buddhism from India triggered a renaissance in Confucian thinking. What we can see in this history is not moral progress, in the sense we can witness scientific or technological progress, but the maturing, development and deepening of moral philosophy.
Carl Zimmer in his excellent blog, The Loom:
Earlier this week, Chinese researchers reported that they edited the genes of human embryos using a new technique called CRISPR. While these embryos will not be growing up into genetically modified people, I suspect this week will go down as a pivotal moment in the history of medicine. David Cyranoski and Sara Reardon broke the news today at Nature News. Here I’ve put together a quick guide to the history behind this research, what the Chinese scientists did, and what it may signify.
There are thousands of genetic disorders that can occur if a mutation happens to strike an important piece of DNA. Hemophilia, sickle cell anemia, cystic fibrosis – the list goes on and on. As I wrote in the Atlantic in 2013, a particularly cruel genetic disorder, fibrodysplasia ossificans progressiva, causes people to grow a second skeleton. It’s caused by a mutation that changes a single “letter” of a single gene, called ACVR1. The protein encoded by the gene doesn’t work properly, triggering a wave of changes in people’s bodies, with the result that when they heal from a bruise, they replace entire chunks of muscle with new bone.
In some cases, people can offset many of the symptoms of genetic disorders with simple changes, like watching what they eat. In other cases, like hemophilia, they have to take regular doses of drugs to remain healthy. In other cases, like fibrodysplasia ossificans progressiva, there’s no effective treatment yet.
For decades, scientists have tried to develop a new way to treat genetic disorders like these: to heal the patient, heal the gene.
Robert Collins in The Telegraph:
Dave Eggers has just been reminded why he can’t allow himself near the internet. The night before I meet him in Paris to talk about his latest three novels – published in a burst of creativity over the past three years – he has been up until 3am watching videos on YouTube on a houseboat he has rented in Amsterdam. “I got back, and to wind down I watched the comedy duo Key & Peele,” he says, while we sit in a bijou hotel overlooking the Place du Panthéon. “There’s just hundreds of YouTube clips. I couldn’t stop. That’s my thing. I can’t be near that stuff. I can’t have it in the house. I would never work again.” Eggers, you see, has been working very hard indeed. Since his 2000 debut, A Heartbreaking Work of Staggering Genius – a bestselling, Pulitzer Prize-nominated memoir about his parents’ deaths from cancer within five weeks of each other and his subsequent rearing of his then eight-year-old brother, Christopher – Eggers has published short stories, novels, anthologies and children’s books. In 2002, he founded a literacy centre, 826 Valencia, for schoolchildren in San Francisco. On the back of its success, he opened a string of them across America, which led to others being set up in Europe. Eggers has come to Paris to visit the latest of these.
In between all this, he has written screenplays – including the film adaptation of Where the Wild Things Are, directed by Spike Jonze in 2009 – and founded an organisation that helps American university students find funding. He runs his own publishing house and literary magazine, McSweeney’s. And he has set up another literary magazine, The Believer, as well as founding a series of oral histories about human rights crises, a theme he covered in his 2009 book Zeitoun, which recounted the ordeal of a Syrian-American arrested in New Orleans in the chaos following Hurricane Katrina. Eggers is not so much a literary darling as a one-man social enterprise.
Salley Vickers in The Guardian:
Julian Baggini is that happy thing – a philosopher who recognises that readers go glassy-eyed if presented with high-octane philosophical discourse. And yet, as his latest book, Freedom Regained: The Possibility of Free Will, makes clear, it is in all our interests to consider crucial aspects of what it means to be human. Indeed, in this increasingly complex world, maybe more so than ever. Freedom is one of the great, emotive political watchwords. The emancipation of slaves and women has inspired political movements on a grand scale. But, latterly, the concept of freedom has defected from the public realm to the personal. How responsible are we as individuals for the actions we take? To what degree are we truly autonomous agents?
...The neural information that has made waves, however, is the fact that scans indicate the brain’s chemistry consistently determines a decision prior to our consciously making that decision. So when I deliberate over a menu and finally choose a mushroom risotto over a rare steak, my brain has anticipated this before I am aware of my choice. At first, this looks alarming. I am not the mistress of my gastric fate, my brain chemistry is. But that is to fail to recognise that my brain’s chemistry may be responding to a vast array of accumulated information about my reading of restaurant reviews, my health, the kind of day I’ve had, my relationship to my weight, my dining companion, my views on animal rights. This is a process not dissimilar to intuition, which is no more than the mind’s ability to process a number of clues too complex to be consciously registered.
You stand far from the crowd, adjacent to power.
You consider the edge as well as the frame.
You consider beauty, depth of field, lighting
to understand the field, the crowd.
Late into the day, the atmosphere explodes
and revolution, well, revolution is everything.
You begin to see for the first time
everything is just like the last thing
only its opposite and only for a moment.
When a revolution completes its orbit
the objects return only different
for having stayed the same throughout.
To continue is not what you imagined.
But what you imagined was to change
and so you have and so has the crowd.
by Peter Gizzi
from The Outernationale
publisher: Wesleyan, Middletown CT, 2007
Saturday, April 25, 2015
Richard Lourie in the New York Times:
The mass of men may “lead lives of quiet desperation,” as Thoreau wrote, but the Polish poet Wislawa Szymborska (1923-2012) did just the opposite: She lived a life of quiet amazement, reflected in poems that are both plain-spoken and luminous. Many of them are gathered now in “Map: Collected and Last Poems.”
Born in the countryside, Szymborska moved in 1931 to Krakow, city of kings and culture, and lived there until her death. Though her life was most eventful inwardly, there was no escaping history in Poland. Indeed, Szymborska lived in four quite different Polands: the anxious interwar Poland that had regained its independence in 1918 after more than a century’s absence from the map of Europe; the Poland of the Nazi occupation, the death camps and uprisings, which began shortly after she turned 16; postwar Poland under Soviet domination, where she herself was a Communist until breaking with the party in 1966, about the time she was finding her voice as a poet; and, last, post-Soviet Poland, free, successful, blessedly ordinary.
Szymborska neither evades nor fetishizes her country’s travails. She can be tough and blunt toward them, as in the poem “Starvation Camp Near Jaslo,” where “the meadow’s silent, like a witness who’s been bought.” But Szymborska is always more interested in the individual. After saying, “History rounds off skeletons to zero. / A thousand and one is still only a thousand,” the poem goes off to wonder about that uncounted individual. In “Innocence,” she muses on young German girls blissfully unaware they were “conceived on a mattress made of human hair,” and in “Hitler’s First Photograph” she has a little macabre fun at the Führer’s expense: “And who’s this little fellow in his itty-bitty robe? / That’s tiny baby Adolf, the Hitlers’ little boy!” Of course, as with any newborn, you can’t help wondering what his future will turn out to be: “Whose tummy full of milk, we just don’t know: / printer’s, doctor’s, merchant’s, priest’s?”
James Crabtree in Prospect:
Narendra Modi stood on the walls of New Delhi’s Red Fort on a blustery morning last August, a man at the height of his recently acquired powers. It was his first Independence Day speech, and also the first given by an Indian Prime Minister born after the end of colonial rule in 1947. Coming just a few short months after his thumping victory in national elections in May, it provided Modi with the most prominent stage afforded to any Indian leader to outline his plans for the nation.
Not a man known for modesty, he began humbly enough, painting himself “not as the Prime Minister, but as the Prime Servant.” Dressed in a white kurta and flamboyant, flowing red polka-dot turban, he stressed his separation from India’s establishment, too: “Brothers and sisters, I am an outsider for Delhi… I have no idea about the administration and working of this place.” His hands jabbing the air for emphasis, he even made brief nods toward harmony between India’s many religions, and the importance of the rights of women—mentions that drew modest praise from anxious liberals worried that Modi might prove to be a right-wing firebrand, in hock to the Hindu nationalist base of his Bharatiya Janata Party (BJP).
Beyond the showmanship, there were hints of substance. As wind whipped around the ramparts, Modi laid out themes that would define his early period in power: an economic revival after years of stagnation; transforming India into a Chinese-style manufacturing powerhouse; and a focus on the concerns of the poor, from building toilets to sprucing up squalid streets. Yet on one issue—indeed, perhaps the most important that lay behind his electoral landslide—Modi had surprisingly little to say: corruption.