Friday, October 17, 2014
The Australian novelist Elizabeth Harrower, who is eighty-six and lives in Sydney, has been decidedly opaque about why she withdrew her fifth novel, “In Certain Circles” (Text), some months prior to publication, in 1971. Her mother, to whom she was very close, had died suddenly the year before. Harrower told Susan Wyndham, who interviewed her a few months ago in the Sydney Morning Herald, that she was absolutely “frozen” by the bereavement. She also claims to remember very little about her novel—“That sounds quite interesting, but I don’t think I’ll read it”—and adds that she has been “very good at closing doors and ending things. . . . What was going on in my head or my life at the time? Fortunately, whatever it was I’ve forgotten.” Elsewhere, Harrower has cast doubt on the novel’s quality: “It was well written because once you can write, you can write a good book. But there are a lot of dead novels out in the world that don’t need to be written.”
Harrower deposited the manuscript of “In Certain Circles” in the National Library of Australia and essentially terminated her literary career. She has said that she thinks of her fiction as something abandoned long ago, buried in a cellar. She can’t now be bothered with writing. “I don’t know anybody who knows I’m a writer,” she said in 2012.
The 18th-century cult of sensibility, spread through performances on the Parisian stage and nurtured by novels of deep emotional intensity by the likes of Samuel Richardson and Rousseau, loosened the grip of the costive, courtly smile. Charming and tender smiles - transparent expressions of feeling intended to be shared by all men and women, though, in practice, chiefly enjoyed by the Parisian cultural and social elite - became fashionable. Teeth and smiles were chic - and so were dentists. Practitioners like Pierre Fauchard made dental care a profession: they abandoned the street (where teeth had been brutally pulled by colourful showmen like 'Le Grand Thomas', who operated on the Pont Neuf and was known as the 'Pearl of the Charlatans' and 'Terror of the Human Jaw') and set up offices (upstairs so the patients' screams could not be heard in the street below) in fashionable spots like the Rue Saint-Honoré. They encouraged tooth conservation, not brutal extraction, wrote treatises that established dentistry as a science, and emphasised the importance of patient self-care, which helped them peddle a succession of cleaners, whiteners, gargles, toothpicks and breath sweeteners. Fauchard invented spring-loaded denture sets, which, as Jones reminds us, 'had the unfortunate habit of leaping dramatically out of the owner's mouth at unguarded moments'. Nicolas Dubois de Chémant went one better and manufactured very expensive porcelain dentures, a set of which (illustrated in the text) belonged to the exiled archbishop of Narbonne, and were exhumed during the building of the Eurostar terminal at St Pancras.
Liberal political philosophy post-Rawls is directed towards justification. It takes many different approaches, all designed to identify the circumstances under which the exercise of coercive authority by the state is legitimate – that states should conform to principles that would be agreed to by rational individuals behind a “veil of ignorance”, that they follow principles that no one could “reasonably reject”, that they could have arisen spontaneously without the violation of fundamental rights, and so on.
Communitarians and other modern admirers of idealism object that such justifications presuppose an instrumental conception of political life, measuring it by its contribution to the pursuit of individual self-interest. They argue that unless we appreciate that human beings are fully social and that the state is in some strong sense a political community, we perpetuate the alienation that the idealists diagnosed so trenchantly. From which it follows that law and authority must be understood as embedded within a concrete ethical life – Sittlichkeit, to use Hegel’s own, untranslatable, German word. Yet how are such political communities to be assessed?
“Thought”, for Hegel, is a technical term used to refer to the content of his own philosophy. So when he writes that something can be “justified in thought”, that means that it is justifiable from the standpoint of that philosophy. But it hardly needs saying that the speculative philosophy found in Hegel’s Science of Logic will not be the sort of justification available to the average passenger on the Stuttgart omnibus.
James Hamblin in The Atlantic:
What if nutrition labels told people exactly what calories meant, in practical terms? A bottle of Coke could dole out specific exercise requirements. The calories herein, it might say, are the equivalent of a 50-minute jog. The decision to drink the Coke then becomes: would you rather spend the evening on a treadmill, or just not drink the soda? Some would say that's a joyless, infantilizing idea. The implication that people can't understand calorie counts is unduly cynical. Have a Coke and a smile, not a Coke and a guilt-wail. Others would protest on grounds that it's impossible to make this kind of exercise requirement universal to people of all ages, body sizes, and levels of fitness. Everyone burns calories at different rates. But Sara Bleich, an associate professor at Johns Hopkins Bloomberg School of Public Health, is not among these people. She describes these labels as her dream.
For the past four years, translating nutrition information into exercise equivalents has been the focus of Bleich's increasingly popular research endeavor. Her latest findings on the effectiveness of the concept are published today in the American Journal of Public Health. In the study, researchers posted signs next to the soda and juice in Baltimore corner stores that read: “Did you know that working off a bottle of soda or fruit juice takes about 50 minutes of running?” or “Did you know that working off a bottle of soda or fruit juice takes about five miles of walking?” (And, long as those distances and times may seem, they may even underestimate the magnitude of the metabolic insult of liquid sugar.) The signs were a proxy for an actual food label, but they made the point. They effectively led to fewer juice and soda purchases, and to purchases of smaller sizes (12-ounce cans instead of 20-ounce bottles). Bleich also saw learned behavior; even after the signs came down, the local patrons continued to buy less soda and juice. "The problem with calories is that they're not very meaningful to people," Bleich told me. "The average American doesn't know much about calories, and they're not good at numeracy."
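To make the conversion behind those signs concrete, here is a minimal sketch of the arithmetic. The constants are illustrative assumptions chosen to land near the signs' 50-minute and five-mile figures; they are not values taken from Bleich's study.

```python
# Convert a drink's calories into exercise equivalents, as the store signs do.
# All constants below are illustrative assumptions, not figures from the study.

CALORIES_PER_BOTTLE = 240      # assumed: a 20-ounce bottle of regular soda
RUNNING_KCAL_PER_MIN = 5.0     # assumed: rough burn rate for an average adult
WALKING_KCAL_PER_MILE = 50.0   # assumed: chosen so the output roughly matches the signs

def minutes_of_running(calories):
    return calories / RUNNING_KCAL_PER_MIN

def miles_of_walking(calories):
    return calories / WALKING_KCAL_PER_MILE

print(f"{minutes_of_running(CALORIES_PER_BOTTLE):.0f} minutes of running")
print(f"{miles_of_walking(CALORIES_PER_BOTTLE):.0f} miles of walking")
```

Any real label would, of course, have to pick a single reference body and pace, which is exactly the objection the article notes about making such equivalences universal.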
At 4 in the morning
(apologies to Federico Garcia Lorca)
At four in the morning
Too soon for birds.
Too late for bats.
At four in the morning
Too soon for light,
Its lying eyes
At four in the morning,
My eyes are shut.
My mind is near.
At four in the morning,
The veil is swept.
The curtain up.
At four in the morning
I see the ark.
Its gaping hold.
At four in the morning
A time to die.
A fatal sigh.
by Brooks Riley
The study appears this week in the online journal PLoS ONE, published by the Public Library of Science. It provides an alternative theory to two current theories of how simple bacterial cells were swallowed up by host cells and ultimately became mitochondria, the "powerhouse" organelles within virtually all eukaryotic cells -- animal and plant cells that contain a nucleus and other features. Mitochondria power the cells by providing them with adenosine triphosphate, or ATP, considered by biologists to be the energy currency of life. The origin of mitochondria began about 2 billion years ago and is one of the seminal events in the evolutionary history of life. However, little is known about the circumstances surrounding its origin, and that question is considered an enigma in modern biology. "We believe this study has the potential to change the way we think about the event that led to mitochondria," said U.Va. biologist Martin Wu, the study's lead author. "We are saying that the current theories -- all claiming that the relationship between the bacteria and the host cell at the very beginning of the symbiosis was mutually beneficial -- are likely wrong. "Instead, we believe the relationship likely was antagonistic -- that the bacteria were parasitic and only later became beneficial to the host cell by switching the direction of the ATP transport."
The finding, Wu said, is a new insight into an event in the early history of life on Earth that ultimately led to the diverse eukaryotic life we see today. Without mitochondria to provide energy to the rest of a cell, there could not have evolved such amazing biodiversity, he said. "We reconstructed the gene content of mitochondrial ancestors, by sequencing DNAs of its close relatives, and we predict it to be a parasite that actually stole energy in the form of ATP from its host -- completely opposite to the current role of mitochondria," Wu said. In his study, Wu also identified many human genes that are derived from mitochondria -- identification of which has the potential to help understand the genetic basis of human mitochondrial dysfunction that may contribute to several diseases, including Alzheimer's disease, Parkinson's disease and diabetes, as well as aging-related diseases. In addition to the basic essential role of mitochondria in the functioning of cells, the DNA of mitochondria is used by scientists for DNA forensics, genealogy and tracing human evolutionary history.
Thursday, October 16, 2014
If the members of the Nobel Academy felt slighted when Jean-Paul Sartre rejected their prize 50 years ago, they didn’t show it. The Academy set out the dinner plates and made their speeches anyway — without the philosopher. The 1964 Nobel Prize in Literature, announced Anders Österling — longtime member of the Swedish Academy, and a writer himself — was being given to “the French writer Jean-Paul Sartre for his work which, rich in ideas and filled with the spirit of freedom and the quest for truth, has exerted a far-reaching influence on our age.”
S. Friberg, Rector of the Caroline Institute — a prestigious Swedish medical university — made the following remarks at the banquet: “Sartre's existentialism may be understood in the sense that the degree of happiness which an individual can hope to attain is governed by his willingness to take his stand in accordance with his ethos and to accept the consequences thereof …”
“It will be recalled,” said Anders Österling, “that the laureate has made it known that he did not wish to accept the prize.”
Leo Carey in the New York Review of Books:
Nine hundred and thirty pages into Jan Swafford’s new biography of Beethoven, there is an interesting juxtaposition. After the composer died, in March 1827, his funeral was “one of the grandest Vienna ever put on for a commoner.” Schools were closed. Some 10,000 people crowded into the courtyard of the building where he had lived, then followed the coffin to the local parish church—not, as Swafford has it, to St. Stephen’s Cathedral. (Among the torchbearers was Franz Schubert.) Franz Grillparzer, the leading Viennese writer of the day, wrote a funeral oration. But later that year, when Beethoven’s effects were auctioned off, a lifetime’s worth of manuscripts and sketchbooks fetched prices that Swafford calls “pathetic.” Beethoven’s late masterpiece the Missa Solemnis went for just seven florins. By comparison, his old trousers and stockings sold for six florins.
Beethoven’s last years are rich in anecdotes of neglect. The late works were too abstruse for the public, and he told a visitor (exaggerating somewhat) that even earlier ones were out of fashion and never performed in Vienna. When Rossini, then Europe’s most popular composer, paid a visit, he was appalled at the squalor in which the great man was living and left in tears. He appealed to aristocratic contacts to do something, but they refused, considering Beethoven crazy and beyond help. Even Beethoven’s successes in these years were partial: the ecstatic reception of the Ninth Symphony is well known, but Swafford suspects that the audience at the premiere had come to cheer “the man and his legacy” rather than the music.
Nobel prizewinners May-Britt Moser and Edvard Moser have spent a career together near the Arctic Circle exploring how our brains know where we are.
Alison Abbott in Nature:
The fact that Edvard and May-Britt Moser have collaborated for 30 years — and been married for 28 — has done nothing to dull their passion for the brain. They talk about it at breakfast. They discuss its finer points at their morning lab meeting. And at a local restaurant on a recent summer evening, they are still deep into a back-and-forth about how their own brains know where they are and will guide them home. “Just to walk there, we have to understand where we are now, where we want to go, when to turn and when to stop,” says May-Britt. “It's incredible that we are not permanently lost.”
If anyone knows how we navigate home, it is the Mosers. They shot to fame in 2005 with their discovery of grid cells deep in the brains of rats. These intriguing cells, which are also present in humans, work much like the Global Positioning System, allowing animals to understand their location. The Mosers have since carved out a niche studying how grid cells interact with other specialized neurons to form what may be a complete navigation system that tells animals where they are going and where they have been. Studies of grid cells could help to explain how memories are formed, and why recalling events so often involves re-envisioning a place, such as a room, street or landscape.
While pursuing their studies, the two scientists have become a phenomenon. Tall and good-looking, they operate like a single brain in two athletic bodies in their generously funded lab in Trondheim, Norway — a remote corner of northern Europe just 350 kilometres south of the Arctic Circle. They publish together and receive prizes as a single unit — most recently, the Nobel Prize in Physiology or Medicine, which they won this week with their former supervisor, neuroscientist John O’Keefe at University College London.
Nathan Jurgenson in The New Inquiry (image: Franz Kline, Suspended, 1953):
Modernity has long been obsessed with, perhaps even defined by, its epistemic insecurity, its grasping toward big truths that ultimately disappoint as our world grows only less knowable. New knowledge and new ways of understanding simultaneously produce new forms of nonknowledge, new uncertainties and mysteries. The scientific method, based in deduction and falsifiability, is better at proliferating questions than it is at answering them. For instance, Einstein’s theories about the curvature of space and motion at the quantum level provide new knowledge and generate new unknowns that previously could not be pondered.
Since every theory destabilizes as much as it solidifies in our view of the world, the collective frenzy to generate knowledge creates at the same time a mounting sense of futility, a tension looking for catharsis — a moment in which we could feel, if only for an instant, that we know something for sure. In contemporary culture, Big Data promises this relief.
As the name suggests, Big Data is about size. Many proponents of Big Data claim that massive databases can reveal a whole new set of truths because of the unprecedented quantity of information they contain. But the big in Big Data is also used to denote a qualitative difference — that aggregating a certain amount of information makes data pass over into Big Data, a “revolution in knowledge,” to use a phrase thrown around by startups and mass-market social-science books. Operating beyond normal science’s simple accumulation of more information, Big Data is touted as a different sort of knowledge altogether, an Enlightenment for social life reckoned at the scale of masses.
As with the similarly inferential sciences like evolutionary psychology and pop-neuroscience, Big Data can be used to give any chosen hypothesis a veneer of science and the unearned authority of numbers. The data is big enough to entertain any story. Big Data has thus spawned an entire industry (“predictive analytics”) as well as reams of academic, corporate, and governmental research; it has also sparked the rise of “data journalism” like that of FiveThirtyEight, Vox, and the other multiplying explainer sites. It has shifted the center of gravity in these fields not merely because of its grand epistemological claims but also because it’s well-financed. Twitter, for example, recently announced that it is putting $10 million into a “social machines” Big Data laboratory.
Pankaj Mishra makes the case in The Guardian (An empty billboard site in São Paulo, Brazil. Billboard advertising has been banned there since 2007. Photograph: Tony de Marco):
As late as 2008, Fareed Zakaria declared in his much-cited book, The Post-American World, that “the rise of the rest is a consequence of American ideas and actions” and that “the world is going America’s way”, with countries “becoming more open, market-friendly and democratic”.
One event after another in recent months has cruelly exposed such facile narratives. China, though market-friendly, looks further from democracy than before. The experiment with free-market capitalism in Russia has entrenched a kleptocratic regime with a messianic belief in Russian supremacism. Authoritarian leaders, anti-democratic backlashes and rightwing extremism define the politics of even such ostensibly democratic countries as India, Israel, Sri Lanka, Thailand and Turkey.
The atrocities of this summer in particular have plunged political and media elites in the west into stunned bewilderment and some truly desperate cliches. The extraordinary hegemonic power of their ideas had helped them escape radical examination when the world could still be presented as going America’s way. But their preferred image of the west – the idealised one in which they sought to remake the rest of the world – has been consistently challenged by many critics, left or right, in the west as well as the east.
Herzen was already warning in the 19th century that “our classic ignorance of the western European will be productive of a great deal of harm; racial hatred and bloody collisions will develop from it.” Herzen was sceptical of those liberal “westernisers” who believed that Russia could progress only by diligently emulating western institutions and ideologies. Intimate experience and knowledge of Europe during his long exile there had convinced him that European dominance, arrived at after much fratricidal violence and underpinned by much intellectual deception and self-deception, did not amount to “progress”. Herzen, a believer in cultural pluralism, asked a question that rarely occurs to today’s westernisers: “Why should a nation that has developed in its own way, under completely different conditions from those of the west European states, with different elements in its life, live through the European past, and that, too, when it knows perfectly well what that past leads to?”
Lindsay Beyerstein reviews Katha Pollitt's Pro: Reclaiming Abortion Rights in The American Prospect:
A full-throated defense of abortion as a social good, Pro is a thorough debunking of anti-abortion pieties. But it’s not just anti-abortion activists on whom Pollitt, columnist for The Nation and a noted poet, sets her sights: She also takes pro-choicers to task for what she calls “the awfulization” of abortion by paying lip service to the idea that abortion is always an agonizing decision that all women feel somewhat bad about.
Pollitt makes a compelling point: Why would that be universally true, unless all women inherently want babies all the time, or unless everyone believes there’s something at least a bit wrong with abortion? As Pollitt’s own reporting makes clear, regret and uncertainty are hardly universal experiences for those who have abortions.
Women’s reasons for abortion are dismissed as frivolous and selfish, Pollitt argues, because society doesn’t take women’s aspirations for a better life seriously. Furthermore, she writes, women are depicted as shallow—or worse—for wanting to have sex while avoiding pregnancy. This attitude rests on the conservative assumption that pregnancy is a natural “consequence” of sex, and that women who try to avoid it are shirkers.
John Quiggin in Crooked Timber:
I want to offer a very simple explanation of Piketty’s point. I’m aware that this may seem glaringly obvious to some readers, and remain opaque to others, but I hope there is a group in between who will benefit.
Suppose that you are a debtor, facing an interest rate r, and that your income grows at a rate g. Initially, think about the case when r=g. For concreteness, suppose you initially owe $400, your annual income is $100 and r=g is 5 per cent. So, your debt to income ratio is 4. Now suppose that your consumption expenditure (that is, expenditure excluding interest and principal repayments) is exactly equal to your income, so you don’t repay any principal and the debt compounds. Then, at the end of the year, you owe $420 (the initial debt + interest) and your income has risen to $105. The debt/income ratio is still 4. It’s easy to see that this will work regardless of the numerical values, provided r=g. To sum it up in words: when the growth rate and the interest rate are equal, and income equals consumption expenditure, the ratio of debt to income will remain stable.
On the other hand, if r>g, the ratio of debt to income can only be kept stable if you consume less than you earn. And conversely if r < g (for example in a situation of unanticipated inflation or booming growth), the debt-income ratio falls automatically provided you don’t consume in excess of your income.
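To make the three cases concrete, here is a minimal sketch that runs the arithmetic forward. The starting figures mirror the example above; the alternative rates and the time horizon are illustrative choices, not Quiggin's.

```python
# Track the debt-to-income ratio when consumption equals income,
# so interest accrues and no principal is repaid.
# Starting figures mirror the example above; rates and horizon are illustrative.

def debt_income_ratio(debt, income, r, g, years):
    for _ in range(years):
        debt *= 1 + r      # debt compounds at the interest rate r
        income *= 1 + g    # income grows at the rate g
    return debt / income

print(debt_income_ratio(400, 100, r=0.05, g=0.05, years=20))  # r = g: ratio stays at 4.0
print(debt_income_ratio(400, 100, r=0.07, g=0.05, years=20))  # r > g: ratio rises
print(debt_income_ratio(400, 100, r=0.03, g=0.05, years=20))  # r < g: ratio falls
```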
Now think of an economy divided into two groups: capital owners and everyone else (both wage-earners and governments). The debt owed by everyone else is the wealth of the capital owners. If r>g, and if capital owners provide the net savings to allow everyone else to balance income and consumption, then the ratio of the capital stock to (non-capital) income must rise. My reading of Piketty is that, as we shift from the C20 situation of r ≤ g to one in which r>g, the ratio of capital stock to non-capital income is likely to rise from 4 (the value that used to be considered one of the constants of 20th-century economics) to 6 (the value he estimates for the 19th century).
This in turn means that the ratio of capital income to non-capital income must rise, both because the capital stock is getting bigger in relative terms and because the rate of return, r, has increased as we move from r=g to r>g.
Owen Bennett-Jones reviews two books on the AfPak situation in the London Review of Books:
The conflict in the Afghanistan-Pakistan borderlands has similarities with other contemporary struggles. From Timbuktu to Kandahar, jihadis, national governments, ethnic groups and, in some cases, tribes are fighting for supremacy. In each place there are complicating local factors: badly drawn international borders; the relative strength or weakness of non-violent Islamist movements; the presence or absence of foreign forces, whether Western or jihadi; and different historical experiences of colonialism. From the point of view of Western policymakers some of these conflicts seem to be more important than others. For the French, the potential fall of Mali to radical Islamist forces was unacceptable, so they intervened. In Somalia, by contrast, the problem has largely been ignored by the West and is mostly being dealt with by the African Union. It was said that al-Qaida must not be allowed to hold territory in Syria, but both an al-Qaida affiliate and Isis have been doing just that, and it wasn’t until earlier this month that Obama announced he’d strike Isis from the air.
It’s far from clear that these varied responses to jihadi activity are the result of rational decision-making. In Yemen, for example, al-Qaida supporters move about freely and plot attacks against the West. Yet although the US has used air power in Yemen it has for the most part left the fighting to the far from capable Yemeni armed forces. But the Pashtun areas of the Afghanistan-Pakistan borderlands are an exception to the mixed messages. There the West has used every tactic at its disposal to confront jihadis: boots on the ground, air strikes, drone attacks, bribes, social welfare programmes and infrastructure projects – the effort to control the Pashtuns hasn’t lacked commitment. There are, of course, important differences between Yemen and the Pashtun areas. Attacks organised in Pashtun areas – including 9/11 and 7/7 – have succeeded; even the most sophisticated plot to emerge from Yemen, in which bombs were disguised as printer cartridges, was foiled. And it isn’t just that the US was impelled to avenge 9/11. The outside world is interested in the Pashtuns’ poppy crop and their hosting of much of Pakistan’s nuclear arsenal. Over the last century and a half the intricacies of Pashtun politics have been discussed by politicians and their advisers in the capitals of all the Great Powers: it’s Washington that’s worrying today, but it used to be Moscow, and before that London.
Rachel Nuwer in Smithsonian:
Not long after the second nurse to contract Ebola was identified, it came out that she had flown on a commercial flight after being exposed to the disease. The presence of Ebola in the United States has inflamed fears fed by misinformation about the disease, poor understanding of epidemiology, lack of perspective and panic-mongering. It has led to some surprising and disturbing reactions from both officials and members of the public. Here are a few of the more egregious examples:
- In Texas, a community college just announced a new rule: “Navarro College is not accepting international students from countries with confirmed Ebola cases,” the Daily Beast reports. On these grounds, Navarro College rejected a well-qualified Nigerian applicant—despite the fact that the disease isn't affecting Nigeria.
- Officials in DeKalb County allegedly threatened to disconnect Emory University's sewer system when two Ebola patients were rushed to its hospital, the New York Times writes. Pizza delivery guys supposedly wouldn't serve the doctors, either. (These claims were later disputed, however.)
- Some lady wore a full-on hazmat suit to catch her flight at Dulles International Airport in Washington, D.C.
- Protective gear like hazmat suits, Business Insider reports, is now "the hottest trade in the stock market." On Monday, shares of two companies that manufacture the gear were up 47 and 33 percent.
- The guy who owns "ebola.com" wants $150,000 for the domain name. As 1ClickNews reports, he hopes to find a buyer soon because "he is worried something may 'ameliorate' the outbreak, diminishing Ebola’s news value – and the worth of his domain."
- This year's "hot"—get it?—Halloween costume: Ebola!
The Halloween costume, at least, does get one thing right—Americans think Ebola is very scary.
Wednesday, October 15, 2014
Carl Zimmer in the New York Times:
News that a nurse in full protective gear had become infected with the Ebola virus raised some disturbing questions on Monday. Has the virus evolved into some kind of super-pathogen? Might it mutate into something even more terrifying in the months to come?
Evolutionary biologists who study viruses generally agree on the answers to those two questions: no, and probably not.
The Ebola viruses buffeting West Africa today are not fundamentally different from those in previous outbreaks, they say. And it is highly unlikely that natural selection will give the viruses the ability to spread more easily, particularly by becoming airborne.
“I’ve been dismayed by some of the nonsense speculation out there,” said Edward Holmes, a biologist at the University of Sydney in Australia. “I understand why people get nervous about this, but as scientists we need to be very careful we don’t scaremonger.”
Ebola is a mystery that invites speculation. The virus came to light only in 1976, during the first known outbreak. Nearly forty years later, scientists are just starting to answer some of the most important questions about it.
Just last month, for example, Derek J. Taylor, an evolutionary biologist at the University at Buffalo, and his colleagues published evidence that Ebola viruses are profoundly ancient, splitting off from other viral lineages at least 20 million years ago. Dr. Taylor’s research suggests that for most of that time, strains of Ebola infected rodents and other mammals.
In 1976, the virus spilled over into the human population from one of those animals, possibly bats. And every few years since then, a new outbreak has emerged in different parts of Central Africa.
Bill Gates in his blog:
A 700-page treatise on economics translated from French is not exactly a light summer read—even for someone with an admittedly high geek quotient. But this past July, I felt compelled to read Thomas Piketty’s Capital in the Twenty-First Century after reading several reviews and hearing about it from friends.
I’m glad I did. I encourage you to read it too, or at least a good summary, like this one from The Economist. Piketty was nice enough to talk with me about his work on a Skype call last month. As I told him, I agree with his most important conclusions, and I hope his work will draw more smart people into the study of wealth and income inequality—because the more we understand about the causes and cures, the better. I also said I have concerns about some elements of his analysis, which I’ll share below.
I very much agree with Piketty that:
- High levels of inequality are a problem—messing up economic incentives, tilting democracies in favor of powerful interests, and undercutting the ideal that all people are created equal.
- Capitalism does not self-correct toward greater equality—that is, excess wealth concentration can have a snowball effect if left unchecked.
- Governments can play a constructive role in offsetting the snowballing tendencies if and when they choose to do so.
To be clear, when I say that high levels of inequality are a problem, I don’t want to imply that the world is getting worse. In fact, thanks to the rise of the middle class in countries like China, Mexico, Colombia, Brazil, and Thailand, the world as a whole is actually becoming more egalitarian, and that positive global trend is likely to continue.
But extreme inequality should not be ignored—or worse, celebrated as a sign that we have a high-performing economy and healthy society. Yes, some level of inequality is built into capitalism.
At the Horse Pavilion
We lost you once,
at the Horse Pavilion, on a day
of snappy wind beating five flags
above that brilliant nightmare green
in the sun and beyond prayer but ready to
live on a diet of it for the rest of our days,
we ducked and ran among faces made blank or tender
by our terror, so that we understood for the first time
that this was the way the world was truly divided:
into those faces that could be startled into goodness,
and those that could not, but none of them worth
anything at all to us except for what
they could tell us as we kept calling out to them
the only words left to us, A little boy!, and the
colours of the clothes you were wearing, while the
polished horses kept mindlessly
clearing gates that were hardships,
but distant, whitewashed, the hardships of others,
and sounds mocked us too, in that whinnied
bright air--a ring of faint surf, the civil, evil
sound of horsemen's applause, and we ran into
each other and ran back and ran through the
stadium of stalls and sick straw-smell and ran out
into the sun of the Pavilion's mud plaza
and there you were, on the other side
of the soot track that led toward the weeping
green park, your eyes fixed without flinching
on the main doorway, waiting for us to come out
sometime before dark and we fled to you, crying
your name and I could see in your eyes
how hard you'd been standing your ground
against terror, how long you'd been forbidding
yourself to invent us, as if in inventing us you'd have
lost all chance to see us come out to you,
but how brilliant you seemed, having saved yourself
from harm, you didn't know it, you turned
your face to the taut thigh of my skirt,
not to cry, and we walked that way,
my hand holding your head to me while I
could have sworn I could feel you inhaling
what I was thinking through the skirt's grass-engraved
cotton: Until this moment I never knew what love is.
by Elisabeth Harvor
from Fortress of Chairs. Vehicule Press, 1992
Sandali in Himal Southasian:
Tara Books, a feminist publishing house located in Chennai, has been collaborating with ‘folk’ and ‘tribal’ artists for the last few years to produce illustrated books for both children and adults. Many of the art forms they have worked with are believed to have evolved from women’s creative expressions within the household, created for the purposes of ritual and decoration. On the occasion of International Women’s Day on 8 March this year, Tara Books inaugurated a photo exhibition titled ‘From Floor to Book: Women’s Everyday Art Traditions’ at their office, the Book Building. The exhibition, which ran until the end of July, traced the journey of select art traditions across the country, from their original contexts to newer canvases and spaces. A few years ago, Zubaan, another feminist publishing house located in Delhi, showcased artworks by rural women in a travelling exhibition titled ‘Painting Our World: Women’s Messages through Art’ in several cities across the country. As part of Zubaan’s larger project of mapping the women’s movement through visual material, the exhibition aimed to document rural women’s voices on issues ranging from violence, health, communalism and domestic work to marriage, livelihood and the environment, expressed through ‘folk’ and ‘tribal’ art and embroidery practices. While Zubaan’s exhibition captured the overtly political discursive articulations stemming from the women’s movement, Tara Books’ concern seems to lie in understanding meaning-making processes of women – the ways in which they comprehend gender and other social relations through the interplay of quotidian and critical consciousness.
The first of the three sections of Tara Books’s exhibition is titled ‘Everyday Art’. It showcases aspects of an art form difficult to categorise, lying perhaps somewhere at the interstices of craft, art, household labour, tradition and practice. The exhibition shows viewers that women’s everyday art is ‘created’ and ‘displayed’ in the context of the household, and is by nature ephemeral.
Richard Van Noorden in Nature:
This year’s Nobel Prize for Chemistry was awarded to three researchers who developed ways to capture images of living cells at nanoscale resolution — well below the 200 nanometres thought to be the best possible resolution for visible-light microscopes.
A fourth, recently developed super-resolution technique, called structured illumination microscopy (SIM), illuminates samples with stripes of light. A computer program analyses the interference patterns formed by the stripes (usually combining composite pictures with stripes in different orientations) to reconstruct a picture of a cell at about double the resolution limit of optical microscopy. This SIM image shows a three-dimensional view of a human bone cancer cell with actin in purple, DNA in blue, and mitochondria in yellow.
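For orientation, the 200-nanometre figure quoted above, and SIM's roughly twofold improvement on it, follow from the standard diffraction limit. Here is a sketch of that textbook arithmetic; the wavelength and numerical aperture are typical values assumed for illustration, not numbers from the article.

```latex
% Abbe diffraction limit for a visible-light microscope (typical values assumed)
d_{\min} \approx \frac{\lambda}{2\,\mathrm{NA}}
        \approx \frac{550\ \text{nm}}{2 \times 1.4}
        \approx 200\ \text{nm}

% Structured illumination mixes the stripe pattern with the sample's fine detail,
% shifting otherwise-inaccessible spatial frequencies into the objective's passband
% and roughly doubling the resolvable detail:
d_{\mathrm{SIM}} \approx \tfrac{1}{2}\, d_{\min} \approx 100\ \text{nm}
```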
Tuesday, October 14, 2014
Howard Zinn (1922 – 2010) was an American historian, author, playwright, and social activist. The following is adapted from his acclaimed A People's History of the United States.
Howard Zinn in Jacobin:
Columbus would never have made it to Asia, which was thousands of miles farther away than he had calculated, imagining a smaller world. He would have been doomed by that great expanse of sea. But he was lucky. One-fourth of the way there he came upon an unknown, uncharted land that lay between Europe and Asia—the Americas. It was early October 1492, and thirty-three days since he and his crew had left the Canary Islands, off the Atlantic coast of Africa. Now they saw branches and sticks floating in the water. They saw flocks of birds.
These were signs of land. Then, on October 12, a sailor called Rodrigo saw the early morning moon shining on white sands, and cried out. It was an island in the Bahamas, the Caribbean Sea. The first man to sight land was supposed to get a yearly pension of 10,000 maravedis for life, but Rodrigo never got it. Columbus claimed he had seen a light the evening before. He got the reward.
So, approaching land, they were met by the Arawak Indians, who swam out to greet them. The Arawaks lived in village communes, had a developed agriculture of corn, yams, cassava. They could spin and weave, but they had no horses or work animals. They had no iron, but they wore tiny gold ornaments in their ears.
This was to have enormous consequences: it led Columbus to take some of them aboard ship as prisoners because he insisted that they guide him to the source of the gold. He then sailed to what is now Cuba, then to Hispaniola (the island which today consists of Haiti and the Dominican Republic). There, bits of visible gold in the rivers, and a gold mask presented to Columbus by a local Indian chief, led to wild visions of gold fields.
Hodgkinson and Bridle in The White Review:
There’s a marvellous lecture by Tim Berners-Lee (‘How the world wide web just happened’) in which he talks about how he came to create what turned out to be the world wide web. He describes growing up as the child of computer scientist parents who had worked at Bletchley Park, and building his first circuits from bits of wire, wrapped around nails, hammered into a piece of wood. Once he’d got the hang of that, he was just in time for the invention of the transistor, and then the integrated circuit. As the components available to him got ever smaller, the complexity of the machines they could power increased exponentially. As Berners-Lee tells it – with some modesty – it was a simple, natural progression from crystal radio, to building his own computer, to putting in place the fundamental transfer protocol that most of us use to access the internet. As things shrank, they also became more powerful, more networked, leading inevitably to an almost total, sublime connectivity.
...This book, For the Motherboard: The Rubáiyát of Omar Khayyám, takes into account some of the limitations of working at scale across digital devices. The first of these is the display size: in an era of ‘retina screens’, where the pixel density of our displays begins to surpass in definition the clusters of rods and cones in the human eye, we are still limited in print by the apertures of our ink nozzles. This is not a new problem: For the Motherboard… is set in Bell Centennial, a font commissioned by AT&T from the designer Matthew Carter in 1975, to replace Bell Gothic, which it had been using in its phone directories since 1938. Between those years, the number of telephones in the United States alone grew from some twenty million to around 140 million. Carter’s Bell Centennial typeface exists because of explosive, networked growth, addressing both this increased technological density, by condensing the character width, and the limitations of contemporary printing, by adding ‘ink traps’ to the letters, minute nicks in the letterforms to absorb and counter the ink spread caused by rapid printing on newsprint. When printed at sufficient size on coated paper, these traps remain visible, tiny reminders of previous technological limitations.
Christopher Bray in Spiked:
One evening in December 1966, the great American writer and critic Edmund Wilson had Sir Isaiah Berlin over for dinner. And a good time they doubtless had of it, but later that night Wilson recorded in his diary that he found Berlin prone to ‘violent, sometimes irrational prejudice against people’. On the evening in question the object of Berlin’s ire was the philosopher and political theorist Hannah Arendt, whose book about the trial of the Nazi officer Adolf Eichmann, Eichmann in Jerusalem, he excoriated without, Wilson claimed, his ever having troubled to read it.
On that last point at least, Wilson seems to have been wrong. Granted the evidence marshalled in David Caute’s Isaac & Isaiah: The Covert Punishment of a Cold War Heretic, it is fair to conclude that Berlin had not only read Arendt’s bestseller, but had also likely arranged for his close friend John Sparrow, then warden of All Souls College at Oxford, to give the book a kicking in the pages of the Times Literary Supplement. Since TLS reviews were printed without bylines back then, why didn’t Berlin write about the book himself? Because, Caute argues, he had for some reason ‘always avoided referring to Arendt in print’. Privately, though, he was happy to rubbish her work. A few years earlier, he had written Faber & Faber a report on Arendt’s The Human Condition. It opened by telling them he ‘could recommend no publisher to buy the UK rights of this book. There are two objections to it: it won’t sell, and it is no good.’
Fans of Berlin’s waspish wit will relish those last two clauses (invert them, as the logic of the sentence dictates, and the wit is gone), but did Arendt’s most considered work really merit such a stinging rebuke?