Amnesty International issues a press release warning of dangers to freedoms of worship and women:
A draft constitution approved by Egypt’s Constituent Assembly falls well short of protecting human rights and, in particular, ignores the rights of women, restricts freedom of expression in the name of protecting religion, and allows for the military trial of civilians, Amnesty International said.
“This document, and the manner in which it has been adopted, will come as an enormous disappointment to many of the Egyptians who took to the streets to oust Hosni Mubarak and demand their rights,” said Hassiba Hadj Sahraoui, Deputy Director for the Middle East and North Africa at Amnesty International.
Freedom of religion is limited to Islam, Christianity and Judaism, potentially denying the right to worship to other religious minorities such as Baha’is and Shi’a Muslims.
The constitution fails to provide for the supremacy of international law over national law, raising concerns about Egypt’s commitment to human rights treaties to which it is a state party.
Furthermore, the document fails to fully guarantee economic, social and cultural rights, such as protection against forced evictions – it also tolerates child labour.
You can read the draft (as of now) yourself here.
Also the NYT reports on protests around the draft.
Zoë Heller reviews Salman Rushdie's Joseph Anton: A Memoir, in the NYRB:
When Anis Rushdie read his son’s novel Midnight’s Children for the first time in 1980, he became convinced that Ahmed Sinai, the drunken father in the book, was a satirical portrait of himself. A family row ensued. Rushdie fils did not deny that Sinai was based on his father—“In my young, pissed-off way,” he would later tell The Paris Review, “I responded that I’d left all the nasty stuff out”—but he objected to his father’s wounded reaction and thought it revealed a crude understanding of how novels worked. “My father had studied literature at Cambridge so I expected him to have a sophisticated response to the book, but the person who did was my mother…. She understood it at once as fiction.”
The position that Rushdie took during this literary-domestic spat uncannily prefigured the position he would take nine years later, when confronted by the wrath of another, more punishing patriarch. On February 15, 1989, a day after the Ayatollah Khomeini had issued a fatwa condemning him to death for his authorship of The Satanic Verses, Rushdie appeared on British television and announced that he wished his book had been “more critical” of Islam. As he reports in Joseph Anton—a memoir he has chosen to write in a de Gaulle–like third person—his principal emotion at the time was one of bafflement:
When he was first accused of being offensive, he was genuinely perplexed. He thought he had made an artistic engagement with the phenomenon of revelation; an engagement from the point of view of an unbeliever, certainly, but a proper one nonetheless. How could that be thought offensive? The thin-skinned years of rage-defined identity politics that followed taught him, and everyone else, the answer to that question.
Given how often Rushdie has been accused of writing The Satanic Verses with the express purpose of making trouble, it is understandable that he should wish to highlight the unexpected—the unprecedented—nature of the events that followed the novel’s publication. Even so, his retrospective account of himself as a bookish innocent, bewildered by the world’s coarse intrusion into the literary sphere, seems a little over-egged. By this point in his career, Rushdie, who had already been sued by Indira Gandhi for libelous statements in Midnight’s Children and had already seen his third novel banned in Pakistan, was better qualified than most to appreciate literature’s capacity for eliciting hostile, nonliterary responses.
Richard Martin in Forbes:
Leslie Dewan and Mark Massie are Ph.D. students in nuclear engineering at MIT. For most of their peers, the options upon graduating are pretty simple: teach, or work for one of the national labs. Dewan and Massie, though, decided on an unconventional path: like a couple of Stanford grads, they’ve formed a start-up. Incorporated in 2011, it’s called Transatomic Power, and its mission is to transform the nuclear power industry.
Transatomic’s product is called a “Waste Annihilating Molten Salt Reactor.” If you’ve read my book, SuperFuel, you’ll recognize it as an update on an old reactor technology that was pioneered at Oak Ridge National Laboratory, in the 1950s and 60s. SuperFuel focused on another type of molten salt reactor, a Liquid Fluoride Thorium Reactor, or LFTR. Dewan and Massie’s design is fuel-agnostic in the sense that it can run on either uranium or thorium; as the name implies, its signal feature is that it can consume spent fuel from conventional light-water reactors. Transatomic joins a growing list of start-ups, including Flibe Energy, that are trying to revolutionize nuclear power by bringing back alternative fuels, including thorium, and alternative reactor designs.
More here. [Thanks to James Edward Kolb.]
Rabbi Michael Lerner at CNN:
Israel's security can only be assured when its neighbors believe that it is no longer oppressing the Palestinian people but instead living in peace and harmony with them.
The de facto strategy of past and present Israeli governments of seeking security through domination and by pushing Palestinians out of their homes, or allowing right-wing religious fanatics to create settlements throughout the West Bank to ensure that no Palestinian state could have contiguous parts, has not and cannot work to provide safety for Israel.
Israel's fate and its well-being are intrinsically linked to the well-being of the Palestinian people. It's time for the powerful to show generosity to the relatively powerless.
So those in the U.S. and Israel who want Israel to be secure should welcome the Palestinian Authority's decision to seek observer status as a nonmember state in the United Nations. The authority has agreed to return to negotiations with Israel without conditions once that status has been granted. The goal is creation of a state living in peace with Israel in borders roughly approximating those that existed before the 1967 war, with minor border changes mutually agreeable through negotiations.
So who opposes this? Hamas, Israel and the U.S.
Joe Kloc in the Paris Review:
Every now and then I come across someone on the subway who defies easy categorization. I remember, for instance, a man who boarded the 3 train in Brooklyn a few years ago wearing military fatigues and a bandolier packed with little glass bottles of liquids. “Who is man enough to buy my fragrances?” he shouted. (When one rider replied that he wasn’t sure, the man responded, “Are you man enough to kill a hooker in Moscow with a crowbar?”) More recently, there was a man on the uptown 6 wearing a pair of oversized New Year’s glasses—the ones where the 0’s serve as eyeholes—who played atonal jazz on his saxophone and asked for no monetary compensation in return. I could keep going, but no doubt anyone who has lived in a city for any length of time has their own mental list of these self-styled subterranean eccentrics, grouped together not so much by any particular characteristic other than the fact that they seem only to exist underground.
Over the years I’ve made casual study of this joyful band of accidental philosophers as they’ve decorated my dreary morning commutes with their Bedazzler guns of mischief. A few months ago I had the good fortune of getting to know one of them on the R train. I was reading Joseph Mitchell’s Up in the Old Hotel when a man whom I’ll call Z. (because he asked me not to use his name) approached me and said, “That’s a great fucking book.”
Our toes, our noses
Take hold on the loam,
Acquire the air.
Nobody sees us,
Stops us, betrays us;
The small grains make room.
Soft fists insist on
Heaving the needles,
The leafy bedding,
Even the paving.
Our hammers, our rams,
Earless and eyeless,
Widen the crannies,
Shoulder through holes. We
Diet on water,
On crumbs of shadow,
Little or nothing.
So many of us!
So many of us!
We are shelves, we are
Tables, we are meek,
We are edible,
Nudgers and shovers
In spite of ourselves.
Our kind multiplies:
We shall by morning
Inherit the earth.
Our foot’s in the door.
by Sylvia Plath
From The Independent:
Tomorrow night, the winner of the Polari Prize for lesbian, gay, bisexual and transgender (LGBT) debut authors will be announced at London's South Bank Centre. Tomorrow also marks the fifth birthday of the monthly literary salon that gave the prize its name. The prize is the brainchild of Paul Burston, the editor of the gay section of Time Out magazine and the flamboyant MC of the salon. As a novelist himself, the author of The Gay Divorcee, Star People and other books examining and satirising contemporary gay life, Burston is acutely conscious of the difficulties facing LGBT authors in a tough commercial climate.
…If there's one sweeping generalisation to be made about queer writing (I've been a judge for two years running), it is that it's often savagely funny. Judge a mainstream literary prize and you'll be overwhelmed by dark themes and sombre writing. This year's Polari shortlistees are North Morgan, a bitterly funny satirist in his London club novel Exit Through the Wound (Limehouse Books); the music producer Terry Ronald's Becoming Nancy (Transworld), a deliciously camp rites of passage novel; and Vicky Ryder's rollicking Ey Up and Away (Wandering Star Press), a series of almost poem-like vignettes about growing up in Nuneaton. But perhaps even more surprising is the discovery of two strong poetic voices in John McCullough (Frost Fairs, published by Salt) and Max Wallis (Modern Love, published by Flap). The elfin Wallis has an unusual day job for a poet: when I caught up with him, he had just got back from a modelling assignment in Paris. He's also unusual in his choice of inspiration: the Victorian poet George Meredith who also penned a poem sequence called Modern Love. (Also something of a model, Meredith posed for the famous Pre-Raphaelite painting The Death of Chatterton.)
…Modern Love by Max Wallis:
“… A final glance. Sigh as the door slams a wooden tongue. Do not look back as dawn carries her torch across the sky; he will stay. Tarmac claps footsteps toward day and then, within, something out loud to the world that he will never hear; 'I am sorry, you know'.”
Researchers at Harvard's Wyss Institute have coaxed single strands of DNA to fit together like Lego bricks and form scores of complex three-dimensional shapes, including a teeny-tiny space shuttle. The technique, described in this week's issue of the journal Science, adds a new dimension to molecular construction and should help open the way for nanoscale medical and electronic devices. “This is a simple, versatile and robust method,” the study's senior author, Peng Yin, said in a news release. The method starts with synthetic strands of DNA containing just 32 nucleotides, or molecular bits of genetic code. These individual “bricks” are coded in a way that they fit together like Lego pegs and holes to form larger shapes of a specific design. A cube built up from 1,000 such bricks (10 by 10 by 10) measures just 25 nanometers in width. That's thousands of times smaller than the diameter of a single human hair.
The latest research builds upon work that the Wyss researchers detailed in May, which involved piecing together DNA strands to create two-dimensional tiles (including cute smiley faces). This time around, the strands were twisted in such a way that they could be interlocked, Lego-style. As any visitor to Legoland knows, such structures can get incredibly complex in the hands of a skilled builder. Yin and his colleagues are still learning their building techniques. Fortunately, the bricks could be programmed to build themselves, with the aid of 3-D modeling software. Once the designs were set, the researchers synthesized strands with the right combinations of nucleotides — adenosine, thymine, cytosine and guanine — so that when they were mixed together in a solution, at least some of the bricks would form the desired design. To demonstrate the method, 102 different 3-D shapes were created using a 1,000-brick template.
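The numbers in the report above hang together on some simple arithmetic. A back-of-the-envelope sketch (the hair-width figure of 75,000 nm is an assumption on my part, chosen to match the "thousands of times smaller" comparison):

```python
# Back-of-the-envelope arithmetic for the DNA-brick cube described above.
# Each "brick" is one synthetic DNA strand of 32 nucleotides; a full
# 10 x 10 x 10 cube uses 1,000 such bricks and measures ~25 nm on a side.

BRICK_NT = 32          # nucleotides per DNA brick
CUBE_EDGE_BRICKS = 10  # bricks along each edge of the cube
CUBE_WIDTH_NM = 25     # reported width of the assembled cube, in nanometers

total_bricks = CUBE_EDGE_BRICKS ** 3        # bricks in the full cube
total_nt = total_bricks * BRICK_NT          # nucleotides of DNA in the cube

# Assumed hair width: a human hair is roughly 75,000 nm across.
HAIR_WIDTH_NM = 75_000
ratio = HAIR_WIDTH_NM / CUBE_WIDTH_NM       # how many cubes span one hair

print(total_bricks)   # 1000
print(total_nt)       # 32000
print(round(ratio))   # 3000
```

So a 1,000-brick cube contains only about 32,000 nucleotides of DNA, and roughly three thousand such cubes would fit across the assumed width of a single hair.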
Corey Robin over at his blog:
Two weeks ago I wrote, “When Steven Spielberg makes a movie about the Holocaust, he focuses on a German. When he makes a movie about abolition, he focuses on a white man. Say what you will, he’s consistent.”
My comment was inspired by historian Kate Masur’s excellent New York Times op-ed, which argued that Spielberg’s film Lincoln had essentially left African Americans offstage or in the gallery. In Spielberg’s hands, blacks see themselves get rescued by a savior who belongs to the very group that has ravaged and ruined them. Just as Jews do in Schindler’s List. The difference is that in the case of emancipation, blacks—both free and slave—were actually far more central to the process of their own deliverance.
Thanks in part to documents from the National Archives that historians began to rigorously amass and organize in 1976—resulting in the multi-volume Freedom: A Documentary History of Emancipation, 1861-1867—students and scholars have come to a completely different view of how emancipation happened. As three of the historians who were involved in that project wrote in the path-breaking Slaves No More:
The Destruction of Slavery [the first essay in the book] explicates the process by which slavery collapsed under the pressure of federal arms and the slaves’ determination to place their own liberty on the wartime agenda. In documenting the transformation of a war for the Union into a war against slavery, it shifts the focus from the halls of power in Washington and Richmond to the plantations, farms, and battlefields of the South and demonstrates how slaves accomplished their own liberation and shaped the destiny of a nation.
Adam Shatz reviews Benoît Peeters' Derrida: A Biography, in the LRB:
Derrida’s early work was written in the shadow of decolonisation. His first book, a translation of Husserl’s 43-page Origin of Geometry preceded by a 170-page introduction, was published in 1962, but it wasn’t until 1967 that he made his mark. That year he published three books of astonishing audacity which, taken together, amounted to a declaration of war on structuralism, then all the rage in France: Speech and Phenomena, another study of Husserl; Writing and Difference, a collection of essays originally published in journals like Tel Quel and Critique; and his masterwork, Of Grammatology. Few read the formidably dense Of Grammatology from cover to cover, but it acquired tremendous cachet; a year later, its cover made an appearance in Godard’s Le Gai Savoir. What was Of Grammatology about? When Madeleine, the heroine of Jeffrey Eugenides’s campus novel The Marriage Plot, asks a young theory-head this question, she is immediately set straight: ‘If it was “about” anything, then it was about the need to stop thinking of books as being about things.’
That’s not so far off. In all three books, Derrida’s argument was that Western thought from Plato to Rousseau to Lévi-Strauss had been hopelessly entangled in the illusion that language might provide us with access to a reality beyond language, beyond metaphor: an unmediated experience of truth and being which he called ‘presence’. Even Heidegger, a radical critic of metaphysics, had failed to escape its snares. This illusion, according to Derrida, was the corollary of a long history of ‘logocentrism’: a privileging of the spoken word as the repository of ‘presence’, at the expense of writing, which had been denigrated as a ‘dangerous supplement’, alienated from the voice, secondary, parasitic, even deceitful.
Derrida wanted not only to liberate writing from the ‘repression’ of speech, but to demonstrate that speech itself was a form of writing, a way of referring to things that aren’t there. If logocentrism was a ‘metaphysics of presence’, what he proposed was a poetics of absence – a philosophical echo of Mallarmé’s remark that what defines ‘rose’ as a word is ‘l’absence de toute rose’.
Though lasting literary friendships between natural rivals are not rare — Byron and Shelley, Coleridge and Wordsworth, and Edward Thomas and Robert Frost spring to mind — few have been as durable as the one that began in the Front Quad of St John’s College, Oxford, one afternoon in May 1941 when a mutual friend introduced what their biographer calls ‘the odd couple’ by pointing his fingers at Kingsley Amis while imitating the sound of a gunshot. On cue, the fair-haired freshman yelled in pain, clutched his chest and staggered back to fall on a convenient pile of laundry sacks. Philip Larkin, a deliberately conspicuous figure in drab wartime Oxford, clad in bow tie, yellow waistcoat and the city’s only pair of cerise trousers, was suitably impressed by the performance. ‘I stood silent. For the first time in my life I felt myself in the presence of a talent greater than my own,’ he later publicly recalled.
more from Nigel Jones at The Spectator here.
In fact, Burton knew a lot about excremental places. Twenty-seven years before this incident, we find, in the matter-of-fact diary of the fourteen-year-old Richard Jenkins (he later adopted the last name of his guardian and mentor Philip Burton), laconic entries: “Bucket of D.” “Went up Mountain and had a bucket of D.” “Fetched a bucket of D. There was another man up there but I was very keen today I could smell D. a mile off. This mountain is nothing but D.” “D.” is code for dung. The adolescent Richard earned money by climbing the mountains outside the industrial town of Port Talbot, scooping animal manure into his bucket, carrying it back down the mountain, and selling it to gardeners in the town. The alchemy of Burton’s career is the transformation of dung into diamonds. There is a delicious moment in the diaries when he is reading in bed “and E. was around the corner of the room I asked: What are you doing lumpy? She said like a little girl and quite seriously: ‘Playing with my jewels.’” The innocence is as much his as hers: Burton’s idea of wealth—dressing your princess in diamonds—is a fantasy of childhood poverty.
more from Fintan O’Toole at the NYRB here.
Like most founders of world-changing institutions (and nearly all religious ones), Young led the kind of outsize life that lends itself to these Janus-faced interpretations. The great virtue of John G. Turner’s new biography of Brigham Young—the first major study since LDS historian Leonard Arrington’s Brigham Young: American Moses (1985)—is the author’s stolid resistance to either version of the traditional Young caricature. Turner, a professor of religious studies at George Mason University, treats him as an exceptional spiritual figure (“a leader who understood himself as following in the footsteps of the ancient biblical prophets could not readily function within the US territorial system,” Turner drily notes), but also as an avatar of the frontier spirit of colonial conquest during the mid–nineteenth century. By settling a Utah territory that originally comprised one-sixth of the western United States, Young was “the greatest colonizer in American history,” Turner writes. And in establishing his desert kingdom in the face of sustained federal resistance, “he brought many of the key political issues of mid-nineteenth-century America into sharp relief: westward expansion, popular sovereignty, religious freedom, vigilantism, and Reconstruction.”
more from Chris Lehmann at The Nation here.
Christopher Wanjek in the Washington Post:
Vegetarian, vegan and raw diets can be healthful, probably far more healthful than the typical American diet. But to call these diets “natural” for humans is a bit of a stretch in terms of evolution, according to two recent studies.
Eating meat and cooking food made us human, the studies suggest, enabling the brains of our prehuman ancestors to grow dramatically over a few million years.
Although this isn’t the first such assertion from archaeologists and evolutionary biologists, the new studies demonstrate that it would have been biologically implausible for humans to evolve such a large brain on a raw, vegan diet and that meat-eating was a crucial element of human evolution at least a million years before the dawn of humankind.
At the core of this research is the understanding that the modern human brain consumes 20 percent of the body’s energy at rest, twice that of other primates. Meat and cooked foods were needed to provide the necessary calorie boost to feed a growing brain.
One study, published last month in the Proceedings of the National Academy of Sciences, examined the brain size of several primates. For the most part, larger bodies have larger brains across species. Yet humans have exceptionally large, neuron-rich brains for our body size, while gorillas — three times as massive as humans — have smaller brains with one-third the neurons. Why?
Gro Harlem Brundtland and Jimmy Carter in the New York Times:
In the current political climate, it is highly unlikely that bilateral talks between Israel and the Palestinians can restart. Action is needed that will alter the current dynamic. As Elders, we believe that the Palestinian statehood bid at the United Nations is such a moment.
On Nov. 29, U.N. member states will be asked to vote on a resolution to grant “non-member observer state status” to Palestine, a significant upgrade from its current “observer entity” status. We urge all member states to vote in favor.
In going to the General Assembly, Palestinian President Mahmoud Abbas is not carrying out a provocative act. Nor is he undermining trust and distracting from the pursuit of peace, as his critics have said.
This is a vote for human rights and the rule of law. It is completely consistent with decades of commitment by the United States, Europe and the rest of the world to peace in the Middle East based on the creation of a viable and contiguous Palestinian state existing side by side with Israel. It is a lawful, peaceful, diplomatic act in line with past U.N. resolutions and international law.
More here. [Photo of Carter and Brundtland in East Jerusalem from here.]
Let me tell you about my marvelous god
Let me tell you about my marvelous god, how he hides in the hexagons
of the bees, how the drought that wrings its leather hands
above the world is of his making, as well as the rain in the quiet minutes
that leave only thoughts of rain.
An atom is working and working, an atom is working in deepest
night, then bursting like the farthest star; it is far
smaller than a pinprick, far smaller than a zero and it has no
will, no will toward us.
This is why the heart has paced and paced,
will pace and pace across the field where yarrow
was and now is dust. A leaf catches
in a bone. The burrow’s shut by a tumbled clod
and the roots, upturned, are hot to the touch.
How my god is a feathered and whirling thing; you will singe your arm
when you pluck him from the air,
when you pluck him from that sky
where grieving swirls, and you will burn again
throwing him back.
by Susan Stewart
(University of Chicago Press, 2003)
The human genome has been busy over the past 5,000 years. Human populations have grown exponentially, and new genetic mutations arise with each generation. Humans now have a vast abundance of rare genetic variants in the protein-encoding sections of the genome. A study published today in Nature now helps to clarify when many of those rare variants arose. Researchers used deep sequencing to locate and date more than one million single-nucleotide variants — locations where a single letter of the DNA sequence is different from other individuals — in the genomes of 6,500 African and European Americans. Their findings confirm earlier work by Akey suggesting that the majority of variants, including potentially harmful ones, were picked up during the past 5,000–10,000 years. Researchers also saw the genetic stamp of the diverging migratory history of the two groups.
The large sample size — 4,298 North Americans of European descent and 2,217 African Americans — has enabled the researchers to mine down into the human genome, says study co-author Josh Akey, a genomics expert at the University of Washington in Seattle. He adds that the researchers now have “a way to look at recent human history in a way that we couldn’t before.” Akey and his colleagues were able to dig out genetic variants occurring in less than 0.1% of the sample population — a resolution that is a full order of magnitude finer than that achieved in previous studies, says Alon Keinan, a statistical geneticist at Cornell University in Ithaca, New York, who was not involved with the study. Of 1.15 million single-nucleotide variants found among more than 15,000 protein-encoding genes, 73% arose in the past 5,000 years, the researchers report. On average, 164,688 of the variants — roughly 14% — were potentially harmful, and of those, 86% arose in the past 5,000 years. “There’s so many of [variants] that exist that some of them have to contribute to disease,” says Akey.
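The reported percentages can be sanity-checked with a few lines of arithmetic. A minimal sketch, using only the figures quoted above (the derived counts are my own rounding, not numbers from the study):

```python
# Rough check of the percentages reported in the variant study summarized above.

TOTAL_VARIANTS = 1_150_000      # single-nucleotide variants found in the sample
RECENT_FRACTION = 0.73          # share that arose in the past 5,000 years
HARMFUL_COUNT = 164_688         # variants flagged as potentially harmful
HARMFUL_RECENT_FRACTION = 0.86  # share of harmful variants that arose recently

recent_variants = TOTAL_VARIANTS * RECENT_FRACTION   # recent variants overall
harmful_share = HARMFUL_COUNT / TOTAL_VARIANTS       # harmful share of total
recent_harmful = HARMFUL_COUNT * HARMFUL_RECENT_FRACTION  # recent harmful ones

print(f"{recent_variants:,.0f}")  # ~839,500 variants arose in 5,000 years
print(f"{harmful_share:.0%}")     # ~14% of all variants, matching the report
print(f"{recent_harmful:,.0f}")   # ~141,632 harmful variants are recent
```

The 14% figure checks out (164,688 / 1,150,000 ≈ 0.143), and the implication is striking: roughly 140,000 of the potentially harmful variants in the sample entered the gene pool only within the last five millennia.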