March 25th will mark the 200th anniversary of “An Act for the Abolition of the Slave Trade”, which, as the title suggests, abolished the slave trade in the British Empire. In the Economist (via Normblog):
In its tactics, boycotts, moral zeal, lobbying, research and its use of images, the British campaign was a template for many later ones—against slavery in the Belgian Congo in the late 19th century; against apartheid in South Africa; and against segregation in the American south.
For all the fervour of its opponents, the slave trade would not have collapsed without rebellions by the victims. The most important was in 1791 on Saint-Domingue. Within two months the slaves had taken control of the island, led by the remarkable Toussaint L’Ouverture. His guerrillas saw off the two greatest imperial armies of the day, the French and British; this led to the establishment of the republic of Haiti in 1804 and to the emancipation of about 500,000 slaves. It was clear that European armies would find it hard to contain many more uprisings, a point proved again on the British islands of Grenada and Barbados. Samuel Sharpe’s uprising on Jamaica in 1831 was put down at great cost; the British feared that if slavery continued, they would lose some colonies altogether. So in 1833 slavery was abolished throughout their empire.
Britain was not the first to outlaw the slave trade in its territory; the Danes had done so in 1803, the French temporarily in 1794 and several northern American states had also done so before 1807. But as Britain was the big sea power of the day, it alone could enforce abolition throughout the world, as its navy resolutely tried to do for the rest of the 19th century. Other European nations, notably the Portuguese, persisted with the trade into the 1860s.
Roger Highfield in The Telegraph:
They may look like toys, but these robots have helped to back one theory of the origins of language.
Sometime between seven million years ago, when we shared our last common ancestor with chimps, and 150,000 years ago, when anatomically modern humans emerged, true language came into being.
One idea of how it emerged from the “primordial soup” of communication in the animal kingdom, whether primitive signalling between cells, the dance of bees, territorial calls and birdsong, goes as follows.
Early humans had a few specific utterances, from howls to grunts, that became associated with specific objects. Crucially, these associations formed when information transfer was beneficial for both speaker and listener. And in this way, the evolution of cooperation was crucial for language to evolve.
But this theory has been impossible to prove, given the lack of time machines and the absence of fossil evidence of ancient tongues.
Now backing for the role of cooperation has come from experiments with robots – both real and virtual – that possess evolving software. The study is described today by a group including Dario Floreano of the Ecole Polytechnique Fédérale de Lausanne, in Switzerland, and Laurent Keller of the University of Lausanne, in the journal Current Biology.
Gary Shapiro in The New York Sun:
F. Scott Fitzgerald wrote that there are no second acts in American lives, but he never met Lewis Lapham.
In September, the silver-haired septuagenarian former editor of Harper’s Magazine is set to launch Lapham’s Quarterly, a journal of history.
With a patrician demeanor and cigarette often in hand, Mr. Lapham has for decades punctured pomposity with wry observations on American society. An editor at Verso, Tom Penn, described his crystalline prose as “rapierlike.”
In a glass-enclosed office on Irving Place, Mr. Lapham sits in a gray scarf and immaculately tailored suit, leaning over a desk, writing in longhand on a legal-size notepad — no computer.
His journal will examine current topics from numerous historical perspectives. With a sheaf of historical texts alongside reflections by contemporary essayists, each issue will, according to the Web site Laphamsquarterly.com, open “the doors of history behind the events in the news” and be a bulwark against “the general state of amnesia.” In the office lay paperback copies of Homer’s “Iliad” and “Odyssey.”
This is fitting, as the first issue is about war.
A couple of days ago I posted an article from The New Yorker about Robert J. Lang and his amazing origami. Now Jason Kottke has dug deeper and found more cool stuff:
Lang’s creations are truly astounding, almost to the point of being magical, because the comparison of the finished product to a flat, uncut sheet of paper is so dissonant. Here are two views of one of Lang’s signature “bugs”, a 7″ silverfish he folded in 2004. The folding pattern is followed by the completed product:
In 1987, Lang folded a 15″ long cuckoo clock out of a single sheet of paper. The clock, which “made Lang a sensation in the origami world”, took him three months to design and six hours to fold. These days, he uses a computer program he wrote called TreeMaker to design his creations and a laser cutter borrowed from Squid Labs to gently score the paper for quicker & easier folding.
Squid Labs is responsible for a site called Instructables, which allows people to share step-by-step instructions for how to do just about anything, from broiled peanut butter and jelly sandwiches to origami. Lang doesn’t seem to have any instructions for his designs up on Instructables, but he shares the site’s open source and collaborative spirit…crease patterns for many of his most complex creations are available on his site and TreeMaker and ReferenceFinder are free to download (with the source code released under the GPL).
(Speaking of Instructables, here’s an easy way to get started with origami. Just grab that stack of Post-It Notes sitting on your desk (the square ones, not the letterbox ones), peel the top one off, and follow these simple instructions to make a little box out of it. It’ll take you 5 minutes…here’s mine that I did this morning.)
From Scientific American:
“If we wish to learn more about cancer, we must now concentrate on the cellular genome.” Nobel laureate Renato Dulbecco penned those words more than 20 years ago in one of the earliest public calls for what would become the Human Genome Project. “Either try to discover the genes important in malignancy by a piecemeal approach, or sequence the whole genome.”
Dulbecco and others in the scientific community grasped that sequencing the human genome, though a monumental achievement itself, would mark just the first step of the quest to fully understand the biology of cancer. Over the span of two decades Dulbecco’s vision has moved from pipe dream to reality. Less than three years after the Human Genome Project’s completion, the National Institutes of Health has officially launched the pilot stage of an effort to create a comprehensive catalogue of the genomic changes involved in cancer: The Cancer Genome Atlas (TCGA).
When applied to the 50 most common types of cancer, this effort could ultimately prove to be the equivalent of more than 10,000 Human Genome Projects in terms of the sheer volume of DNA to be sequenced. The dream must therefore be matched with an ambitious but realistic assessment of the emerging scientific opportunities for waging a smarter war against cancer.
David Sedaris in The New Yorker:
In Paris they warn you before cutting off the water, but out in Normandy you’re just supposed to know. You’re also supposed to be prepared, and it’s this last part that gets me every time. Still, though, I try to make do. A saucepan of chicken broth will do for shaving, and in a pinch I can always find something to pour into the toilet tank: orange juice, milk, a lesser champagne. If I really got hard up, I suppose I could hike through the woods and bathe in the river, though it’s never quite come to that.
Most often, our water is shut off because of some reconstruction project, either in our village or in the next one over. A hole is dug, a pipe is replaced, and within a few hours things are back to normal. The mystery is that it’s so perfectly timed to my schedule. This is to say that the tap dries up at the exact moment I roll out of bed, which is usually between ten and ten-thirty. For me this is early, but for Hugh and most of our neighbors it’s something closer to midday. What they do at 6 A.M. is anyone’s guess. I only know that they’re incredibly self-righteous about it, and talk about the dawn as if it’s a personal reward, bestowed on account of their great virtue.
From The Pedestal Magazine:
Partial Building Collapse
Debris fluttered beneath the quiet frame
of police tape as I passed: feathers and dross
lay dazed on the pavement. Fourteen years ago
the possibility of destruction did not frighten me.
Then the pigeons returned to roost on the raw
edges of what had been broken.
Today’s leafy debris has crumbled beneath the sodden
edge of rain. I ignore the rawness, scuff some rocks
on my way to the mailbox. I know the disintegration
of a leaf is nothing to mourn, but I can’t help wishing
for more: perhaps the soft flutter of a feather carving
the wind’s broken corners.
Tomorrow it will snow; I’ll try not to mind the cold
shape of another season. Decay stalks the unwary.
Everywhere I walk, a new path of destruction. Grown
children. A dead mother. Closed doors locked between
all things. Sorrow is familiar and fickle as the wind—
I ignore its mercurial nature.
Rest of the poem here.
Joel Bleifuss in In These Times:
To help me parse what’s PC and what’s not, I had help from people attuned to the nuances of words, particularly those that describe race, ethnicity and sexual identity. Rinku Sen is a 40-year-old South Asian woman. She is the publisher of Colorlines, a national magazine of race and politics, for which she has developed a PC style manual. Tracy Baim is a 44-year-old white lesbian. She grapples with the ever-evolving nomenclature of sexual identity and politics as the executive editor of Windy City Times, a Chicago-based gay weekly. Lott Hill is a 36-year-old white gay male who works at Center for Teaching Excellence at Columbia College in Chicago. He interacts with lots of young people—the font from which much new language usage flows.
African American: In 1988 Jesse Jackson encouraged people to adopt this term over the then-used “black.” As he saw it, the words acknowledged black America’s ties to Africa. “African American,” says Hill, is now “used more by non-African-American people, who cling to it because they are unsure what word to use.” Sen says, “African American” is favored by “highly educated people who are not black. Whether one uses ‘black’ or ‘African American’ indicates how strong your social relations are with those communities.” And Chris Raab, founder of Afro-Netizen, says, “People who are politically correct chose to use African American, but I don’t recall any mass of black folks demanding the use of African American.”
In the FT, James Harkin on self-help books.
How did our bookshelves become a toolbox of methods for living our lives better? Some valuable clues can be found in Dubravka Ugresic’s gloriously, unashamedly bitchy dissection of the state of the publishing industry, Thank You For Not Reading.
The Croatian academic and critic compares the contemporary books market with the propaganda of the Stalinist school of socialist realism. The only difference is that, where the art of socialist realism promised a bright and shining future for society, these books promise a bright personal future – if only you do what they say.
Visit any large metropolitan bookshop, she says, and the display will be festooned with books about how to improve your personal situation and overcome your demons. There are books about fat people becoming thin, sick people recovering, poor people becoming rich, mutes speaking, alcoholics sobering up, unbelievers discovering faith. This literature of personal transformation, she believes, has so cornered the books market that all writers are now forced to “live Oprah” and the publishing world exploits this shamelessly. The title arrived at for Alain de Botton’s book How Proust Can Change Your Life, one London literary agent told me privately, probably doubled sales of the book. Even weighty works of non-fiction are no longer immune from the functional approach – Heat, the environmentalist George Monbiot’s new book about global warming, was brought to market saddled with the sub-title How to Stop the Planet Burning.
But why stop here? In the current publishing climate, a whole range of classics could surely be touched up to lend them a more contemporary feel. Karl Marx’s Capital: A Critique of Political Economy could become Capital: How to Overthrow the Capitalist System For Beginners; Robinson Crusoe could benefit from the sub-title How to Survive and Thrive On a Desert Island; Pride and Prejudice might shift a few more copies if it were subtitled How to Bag a Rich Husband and Live Happily Ever After.
In a post-cold war age, where political allegiances and ideologies often give us little in the way of guidance, many of us have turned inward in search of inspiration. Ideas, as a consequence, find it difficult to get a hearing unless they promise to turn our lives around or help us to get ahead. If the rise of “how-to lit” is as unstoppable as the rise of the self-help industry from which it takes its cue, perhaps the best we can hope for is for more imaginative attempts to subvert the whole genre.
In Dissent, Joschka Fischer’s August 1, 2006 speech to the Iranian Center for Strategic Research in Tehran.
Anyone familiar with recent Iranian history knows that its politics have been marked by a constant search for independence and for security from aggression and influence from its neighbors or from greater powers. For Iran, the lack of respect for and recognition of its independence, its ancient civilization, its strategic potential, and the talent and capabilities of its people has been particularly humiliating and indeed insulting throughout its modern history. When the British, French, and German governments decided in 2003 to react positively to President Mohammed Khatami’s letter and subsequently to send their foreign ministers to Tehran to negotiate a nuclear compromise, the main motivating factor was deep concern that, in the aftermath of the Iraq War, no chance for avoiding another military confrontation in the region should be missed. But this initiative was undoubtedly also underwritten by the spirit of mutual respect and recognition. Regrettably, the initiative failed to bring the desired success, though the Europeans took it very seriously in spite of their very realistic evaluation of the facts.
Asad Raza wrote a very interesting essay about compensation for women at Wimbledon here last year. Now, he has pointed me to this news item by Ola Galal of Reuters:
Wimbledon will pay women and men equal prize money for the first time at this year’s grasscourt grand slam, All England Club chairman Tim Phillips said on Thursday.
The tournament broke with tradition to join the Australian and U.S. Opens in paying equal prize money across the board in all events and at all stages of competition.
“This year the committee decided unanimously that the time was right to move to equal prize money and bring to a close a long progression,” Phillips told a news conference.
Prize money for the 2007 Championships will be announced in April.
Wimbledon, which dates back to 1877, went “open” in 1968 but had been criticised since then for maintaining a discrepancy in the prize money offered to its male and female competitors.
I think I may have missed something important in my initial take on the assault and attempted kidnapping of Elie Wiesel by a Holocaust denier. Are you familiar with this Feb. 1 incident? Don’t be surprised if you missed it; for some reason, this emblematic outrage has been largely ignored by the media. Perhaps the lack of coverage of the attack on the Nobel Prize–winning Holocaust survivor is understandable: It’s one of the most deeply depressing, dispiriting, demoralizing and sickening stories that one can imagine. On every level.
more from the NY Observer here.
Had Benjamin Franklin managed to outwit the Grim Reaper, he would have turned three hundred years old in 2006, and would probably have been making plans for another three hundred. Journalist, scientist, diplomat, and vendor of the virtues, Franklin stands in our imagination as the iconic “First American,” the self-made man and proud inventor of the future. His scientific achievements were indeed interesting and impressive—especially his research on electricity and his invention of the lightning rod. But equally interesting, and far more complicated, was Franklin’s idea of science. He was, you might say, our first home-grown Baconian—seeing scientific ingenuity as the greatest delight and truest redeemer of human life.
In 1780, Franklin complained to his friend and fellow natural philosopher Joseph Priestley of the disparity between scientific and moral progress: so badly constructed were most human beings, said Franklin, that Priestley should have killed boys and girls instead of innocent mice in his experiments with mephitic air. How much better than the bratty kids were the results of these experiments. Scientific progress, Franklin commented,
occasions my regretting sometimes that I was born so soon. It is impossible to imagine the height to which may be carried in a thousand years, the power of man over matter. We may perhaps learn to deprive large masses of their gravity, and give them absolute levity, for the sake of easy transport. Agriculture may diminish its labor and double its produce; all diseases may by sure means be prevented or cured, not excepting even that of old age, and our lives lengthened at pleasure even beyond the antediluvian standard.
more from The New Atlantis here.