Sponsored by the Abdorrahman Boroumand Foundation and the Georgetown chapter of Amnesty International, the exhibition was organized by two sisters, Ladan and Roya Boroumand. Their father, after whom the foundation was named, was an Iranian lawyer and democracy activist who was assassinated in Paris in 1991, almost certainly by Iranian agents. Among other valuable work, the Boroumands have created a database of some 12,000 executions carried out in Iran since the establishment of the Islamic Republic.
The display at Georgetown included three small school desks, the kind in which political detainees in Iran are required to sit to write responses during interrogations and, once they are broken, to put on paper their “confessions.” Roya Boroumand, who takes me through the exhibition, asks if I want my picture taken sitting behind one of these desks. I shudder and refuse. I have no desire to relive the long hours, days and months I spent under interrogation and writing answers to questions at Evin Prison.
The exhibition is aptly named “Interrupted Lives.” These young men and women, you think, should be playing soccer and basketball, could have gone to graduate school, might have been lawyers and doctors. Instead, jail and exile, and aborted schooling and careers, have been their fate. Manuchehr Es’haqi was arrested at age 13 and spent ten years in jail for “corruption on earth.” He now repairs coffee machines in Sweden. He looks at the camera through haunted eyes. “I am still not really living. Nothing makes me really happy,” the small inscription quotes him as saying.
Hamed Ruhinejad, a university student arrested after the 2009 elections, lingers in jail, despite multiple sclerosis and the loss of sight in his right eye. Bahareh Hedayat, the well-known human rights and women’s rights activist and a leading member of the Office for Fostering Unity, a student organization, has been in and out of jail since 2006. Only 25, she was sentenced in May to nine and a half years for speaking out on rights issues.
I'm reading about fireflies, remembering the joy these tiny beetles have given me in fields when I thought I was alone, and the first one came on and then another.
By the shadows of wild carrots, in weeds, on the bark of maples, they shine with cold light after months, years without wings. Only nothing, hunger in the sticky body, a tiny white groove in the earth, sleeping and waking in darkness.
They wait until the end of their lives to glow a sexual fire, a signal so the female will know where the male is among redolent grasses and runaway clover. They come to their senses and die. And then more lights flicker near the stone heaps of ancient fences, over the ridges my shoes make at dusk.
How plain they were in the jam jar, brought in, examined beneath the porcelain light in the kitchen. Grandmother was not an old woman then, she turned the gold lid with five straight fingers, all this excitement over brown wings and a simple body. I'm thinking
about fireflies. The more I know of them, the happier I am without wings or fire, with the heat my body creates when I stand with my back to the stars, wrists in shadow, knees chilled by a cool wind. And lonely, I speak to the flickering, white, umber, green with a dark and human voice.
by Rita Gabis, from The Wild Fields (Alice James Books, 1990)
The world, we are told, is in the midst of a revolution. The new tools of social media have reinvented social activism. With Facebook and Twitter and the like, the traditional relationship between political authority and popular will has been upended, making it easier for the powerless to collaborate, coördinate, and give voice to their concerns. When ten thousand protesters took to the streets in Moldova in the spring of 2009 to protest against their country’s Communist government, the action was dubbed the Twitter Revolution, because of the means by which the demonstrators had been brought together. A few months after that, when student protests rocked Tehran, the State Department took the unusual step of asking Twitter to suspend scheduled maintenance of its Web site, because the Administration didn’t want such a critical organizing tool out of service at the height of the demonstrations. “Without Twitter the people of Iran would not have felt empowered and confident to stand up for freedom and democracy,” Mark Pfeifle, a former national-security adviser, later wrote, calling for Twitter to be nominated for the Nobel Peace Prize. Where activists were once defined by their causes, they are now defined by their tools. Facebook warriors go online to push for change. “You are the best hope for us all,” James K. Glassman, a former senior State Department official, told a crowd of cyber activists at a recent conference sponsored by Facebook, A. T. & T., Howcast, MTV, and Google. Sites like Facebook, Glassman said, “give the U.S. a significant competitive advantage over terrorists. Some time ago, I said that Al Qaeda was ‘eating our lunch on the Internet.’ That is no longer the case. Al Qaeda is stuck in Web 1.0. The Internet is now about interactivity and conversation.”
These are strong, and puzzling, claims. Why does it matter who is eating whose lunch on the Internet? Are people who log on to their Facebook page really the best hope for us all? As for Moldova’s so-called Twitter Revolution, Evgeny Morozov, a scholar at Stanford who has been the most persistent of digital evangelism’s critics, points out that Twitter had scant internal significance in Moldova, a country where very few Twitter accounts exist.
While contemporary Africa is to all intents and purposes chaotic, corrupt, medieval and constantly engaged in civil wars, it is also a wild and thrilling theatre for those who wish to engage with life a little more vividly than we do in the more ordered and “civilised” West. It is this combination of danger and adventure that draws disgruntled, dissatisfied Westerners to it like moths to a flame.
Tony Fitzjohn, Fitz as he is known to his friends, fits the bill almost to the point of caricature. He grew up in suburban north London, a tetchy, rebellious foster child disappointed with the greyness of post-war Britain. Then, through a combination of wanderlust and a series of accidental meetings, he found his place on the planet in a raw and remote patch of African bushveld called Kora in Kenya, raising lions with George Adamson of Born Free fame. He worked as Adamson’s assistant from 1970 until 1989, living on a diet of bully beef, fresh vegetables, beer and gin in circumstances that we in the West would regard as somewhat marginal. For Fitzjohn this was nirvana.
The cost of whole-genome sequencing is dropping like a rock, and that’s fueling a “renaissance of activity” for scientific sleuths tracking down the genetic causes of disease, a pioneer in the field says. Harvard geneticist George Church provided a status report on the genome market, and its implications for medical research, during this week's “Open Questions in Neuroscience” symposium in Seattle, sponsored by the Allen Institute for Brain Science. Church is not only a Harvard professor and researcher, but also the founder of Knome, a commercial genome-sequencing venture.
Thanks to competition in the sequencing field, the price of decoding a complete human genome has been following an affordability curve that looks like Moore's Law on steroids. The cost of the federal Human Genome Project, which issued its first draft in 2000 and a complete genome sequence in 2003, was estimated at $2.7 billion in 1991 dollars. But that price tag has been falling by as much as an order of magnitude per year, and today the going rate for whole-genome sequencing is edging below $10,000 (counseling costs extra). The cost of materials — that is, the chemical reagents required to do the tests — is merely $1,000, Church said in June.
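As a back-of-the-envelope check of those figures (the seven-year span between the completed Human Genome Project and today's going rate is our own assumption, and we ignore the 1991-dollars adjustment), the quoted numbers really do look like Moore's Law on steroids:

```python
import math

# Figures quoted in the excerpt above; the time span is an assumption.
cost_hgp = 2.7e9   # Human Genome Project, complete sequence (2003)
cost_now = 1e4     # whole-genome sequencing edging below $10,000
years = 7          # assumed span between the two price points

orders = math.log10(cost_hgp / cost_now)           # total drop, orders of magnitude
annual_factor = (cost_hgp / cost_now) ** (1 / years)

# Moore's Law baseline: roughly a 2x improvement every two years,
# i.e. about a 1.41x cost reduction per year.
moore_annual = 2 ** (1 / 2)

print(f"total drop: ~{orders:.1f} orders of magnitude")   # ~5.4
print(f"implied annual drop: ~{annual_factor:.1f}x/year") # ~6x per year
print(f"Moore's Law baseline: ~{moore_annual:.2f}x/year")
```

A roughly sixfold annual price drop against Moore's Law's ~1.4x makes the "steroids" comparison, if anything, an understatement.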
One of the main themes of the Valdai Club this year was coming to terms with Russia’s twentieth-century history, or rather the ghastly period between the revolution of 1917 and the death of Stalin in 1953. This forms part of a push by Russian establishment liberals who support President Dmitri Medvedev to galvanize Russian reform and bring about a clear break with the Soviet past. Remembering the crimes of Stalinism was also a natural accompaniment to our trip by boat along parts of the White Sea Canal, constructed under Stalin in the 1930s by political prisoners at an appalling cost in human life and suffering, from cold, hunger and mass executions. This and so many other mass atrocities committed under Stalin and Lenin are only to a very limited degree officially remembered or commemorated in the Russia of today, although Russians formed a majority of their victims. This is a subject on which non-Russians have a limited moral right to speak except where their own fellow countrymen were among the mass of victims (as with Stalin’s mass murder of Polish prisoners at Katyn)—and even then, they must be very careful to acknowledge both that this was a crime of a Communist and not a Russian national state, and that innumerable Russians were also among the mass of victims.
more from Anatol Lieven at The National Interest here.
“Abstract Expressionist New York,” the huge new exhibition at the Museum of Modern Art, is three-quarters brain dead. That is better than entirely brain dead. My advice is to begin with the strongest material, which you will find in galleries on the second and third floors at MoMA. Walking through “Rock Paper Scissors” and “’Ideas Not Theories’: Artists and The Club, 1942–1962”—with their excitable mix of works in multiple media by midcentury painters, sculptors, and architects—you can feel the gritty romantic spirit of downtown Manhattan in the years during and after World War II. The Museum of Modern Art is more than justified in saluting the artistic forces at play in New York City in that period, even if an accompanying book, Abstract Expressionism at the Museum of Modern Art, makes the museum’s relationship with the city’s avant-garde appear considerably less rocky than it actually was. In our recession-conscious times the idea of a major show drawn exclusively from the museum’s outstanding holdings is not a bad thing. Done with some zest and adventuresomeness, as it is in the smaller installations on the second and third floors, the result is museumgoing of a very high order. As for the fourth floor, much of it filled with signature works by Pollock, de Kooning, Gorky, Rothko, Kline, and Newman, there is surely a great deal of wonderful material here, but the installation is so uninspired and predictable a presentation of blue-chip stuff that a visitor may be left wondering what Ann Temkin, the curator in charge, could possibly have had in mind.
Jackie Chan is the highest-paid actor in Asia, and that makes sense. Besides producing, directing, and starring in his own action movies since 1980, he’s earned millions in Hollywood with blockbusters like Rush Hour and The Karate Kid. But the No. 2 spot goes to someone who doesn’t make any sense at all. The second-highest-paid actor in Asia is a balding, middle-aged man with a paunch, hailing from the Indian state of Tamil Nadu and sporting the kind of moustache that went out of style in 1986. This is Rajinikanth, and he is no mere actor—he is a force of nature. If a tiger had sex with a tornado and then their tiger-nado baby got married to an earthquake, their offspring would be Rajinikanth. Or, as his films are contractually obligated to credit him, “SUPERSTAR Rajinikanth!” If you haven’t heard of Rajinikanth before, you will on Oct. 1, when his movie Enthiran (The Robot) opens around the world. It’s the most expensive Indian movie of all time. It’s getting the widest global opening of any Indian film ever made, with 2,000 prints exploding onto screens simultaneously. Yuen Wo-ping (The Matrix) did the action, Stan Winston Studios (Jurassic Park) did creature designs, George Lucas’ Industrial Light and Magic did the effects, and Academy Award-winning composer A.R. Rahman (Slumdog Millionaire) wrote the music. It’s a massive investment, but the producers fully expect to recoup that, because this isn’t just some film they’re releasing; this is a Rajinikanth film.
more from Grady Hendrix at Slate here (via Aditya).
Mark Blyth, besides being a close friend and occasional 3QD writer, is an international political economist and a professor at Brown University. He is writing a book, tentatively titled “Austerity: The History of a Dangerous Idea,” investigating the return to prominence of the idea of a financial orthodoxy following the global financial crisis. The book is forthcoming from Oxford University Press. Watch this video. It’s very cool. (By the way, the best cook of Indian food that I know is Robin Varghese. When I once asked him whom he learned to cook Indian food from, he replied, “Mark Blyth.” Mark is a man of many talents!)
George Price was born a Jewish half-breed to parents who kept his Semitic side a secret; lived much of his life an aggressive atheist and skeptic of the supernatural; and died a Christian, twice converted, albeit, to his mind, a defeated one. Several years before he abandoned his career in a mission to shelter and comfort homeless alcoholics, he made a number of extraordinary contributions to evolutionary biology, a field in which he had no training. Educated as a chemist, Price had worked previously for the Manhattan Project on uranium enrichment, helped develop radiation therapy for cancer, invented computer-aided design with IBM and dabbled in journalism.
Shortly after Christmas 1974, Price slashed his carotid artery with a pair of tailor's scissors in his room in a London squat. John Maynard Smith, with whom Price published a paper that applied game theory to natural selection, was one of the few people, along with some of those homeless alcoholics, to attend his funeral. Also present was William Hamilton, the father of kin selection, which proposed that self-sacrificing behavior was able to evolve between related organisms because of the advantages conferred to their shared genes. Price used Hamilton's ideas about kin selection to derive his own equation, one that could explain selection at multiple levels of organization—the genetic level, as well as among individuals in kin groups and populations of unrelated others. The equation marked a breakthrough in the field: Price had provided a working mathematical model for the emergence of altruism in a theory of the world that took dogmatic self-interest as its first principle.
What can we learn from the bees? Honeybees practice a kind of consensus democracy similar to what happens at a New England town meeting, says Thomas D. Seeley, author of “Honeybee Democracy.” A group comes to a decision through a consideration of options and a process of elimination.
The bees are making a life-and-death decision: where to establish a new hive. Choosing a site that is too exposed, too small or too close to the ground can be fatal. Swarms don’t always do it right, but they do succeed a remarkable amount of the time, with 10,000 or more bees following the advice and signals of a few hundred leaders to re-establish themselves in a new location every spring. Along the way they have to make sure the precious queen, fatter and more sluggish than the others and prone to take a rest stop, is not lost…
In the spring, when the hive’s stores are depleted and the virgin queens are still in their queen cups, peanut-shaped cells in the comb, being nurtured with a nutrient-rich secretion called royal jelly, about two-thirds of the hive detaches itself and flies off en masse, settling somewhere nearby, on a branch or a mailbox, in the familiar beard shape of a honeybee scrum. At this point a few hundred scouts take off in all directions, checking out several dozen potential new sites. They return to the hive one by one, indicating, by a waggle dance first analyzed by Martin Lindauer 60 years ago, both the location and the quality of the site.
Dr. Seeley and his colleagues have meticulously observed the process of decision making that follows, and his research reveals an astonishingly effective system.
At dusk—west of Patch Grove— two bison become an electric fence, a fox, a question about crossing the street, yellow circles of fallen leaves, a flower arrangement that turns love again to lust. Four hundred miles east the bison, lost in wandering, witness a son bankrupt a bar, bust the town of Black Wolf, fold the farm as metal folds in train wrecks. The bison, alone again in wandering, are not box knives, not crows, not a soiled sheet, a trailer-park-storm. They do not go into the woods alone. They are not a last dance, drunk, not a blue jay, not whiskey, not a time clock.
Exactly 1,200 years after its foundation, I was born in Karradat Mariam, a Shia district of Baghdad with a large Christian community, a stone's throw away from today's Green Zone and a few miles south of the spot where one of Baghdad's most famous rulers was born in 786. His name was Abū Ja'far al-Ma'mūn. Half Arab, half Persian, this enigmatic caliph was destined to become the greatest patron of science in the cavalcade of Islamic rulers, and the person responsible for initiating the world's most impressive period of scholarship and learning since Ancient Greece. By the eighth century, with western Europe languishing in its dark ages, the Islamic empire covered an area larger in expanse than either the Roman empire at its height or all the lands conquered and ruled by Alexander the Great. So powerful and influential was this empire that, for a period stretching over 700 years, the international language of science was Arabic.
The teenage prince Ma'mūn would have known Baghdad at the height of its glory: a vast, beautiful city characterised by the domes and archways of its famously intricate Abbasid architecture. It had grown to become the world's largest city just 50 years after the first brick was laid, with some estimates putting its population at more than 1 million. Ma'mūn was not the only caliph to support scholarship and science, but he was certainly the most cultured, passionate and enthusiastic. As a young man, he memorised the Qur'an, studied the history of early Islam, recited poetry and mastered the newly maturing discipline of Arabic grammar. He also studied arithmetic and its applications in the calculation of taxes. Most importantly, he was a brilliant student of philosophy and theology, or more specifically what is referred to in Arabic as kalam, which is a form of dialectic debate and argument. The early Muslim theologians found that the techniques of kalam enabled them to hold their own in theological discussions with the Christian and Jewish scholars who lived alongside them, and who had had a head start of several centuries to hone their debating skills by studying the writings of philosophers such as Socrates, Plato and Aristotle – historical figures from ancient Greece whose names would certainly have been known to the young Ma'mūn. It is even quite likely that by the early 9th century, some of their work had already been translated into Arabic.
People may volunteer for a study simply to advance science, but a large fraction of them could wind up receiving unnerving news. A paper published today reports that 40% of participants in imaging experiments had clinical anomalies beyond the scope of the investigation, and that, of these cases, 6% provoked subsequent medical intervention.
Radiologists at the Mayo Clinic in Rochester, Minnesota, appraise images from research examinations daily and report any potential problems that they spot to physicians. An expert panel of physicians, radiologists and bioethicists assessed the benefits and burdens of radiologists' findings for research examinations taken over three months in 2004 by studying individuals' medical records over a follow-up period of three years. Out of a total of 1,426 examinations, 567 revealed at least one anomaly, and the total tally of anomalies across this subset was more than 1,000.
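The percentages in the excerpt are internally consistent; a quick check (reading the 6% figure as applying to the anomalous examinations, which is our interpretation):

```python
# Figures quoted in the excerpt above.
total_exams = 1426
with_anomaly = 567
intervention_share = 0.06  # of the anomalous cases, per the text

anomaly_rate = with_anomaly / total_exams
implied_interventions = with_anomaly * intervention_share

print(f"anomaly rate: {anomaly_rate:.1%}")              # ~39.8%, i.e. the ~40% figure
print(f"implied interventions: ~{implied_interventions:.0f} cases")
```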
More here. (Note: For my radiologist sister Ga who has told me of this phenomenon for years!)
Why is cosmology so popular? Books by writers such as Paul Davies and Stephen Hawking on fine-tuning or the multiverse routinely become bestsellers. They’re good writers, of course. And there’s the aesthetic appeal of cosmology too, offering a ceaseless stream of heavenly images at which to wonder and gaze. But I suspect there’s more to it than that. After all, many other branches of physics are progressing as fast, and arguably have a bigger impact upon our daily lives. But when did you last pick up a paperback on solid state physics, one of the largest contemporary research fields? Or who would choose a book about optics over one about the Big Bang? Chaos theory gets a look in, as does quantum theory — though that’s very close to cosmology, as the history of the universe turns on the physics of the very small. So here’s a possibility. Cosmology is so popular, not just because of the science, but because it allows us to ask the big questions — where we come from, who we are, where we’re going. It’s metaphysics by other means. If the Scholastic theologians of the Middle Ages liked to speculate about the number of angels on the heads of pins, we today like to speculate about the number of dimensions wrapped up in string theory. The activities are similar insofar as they feed the delight we find in awe-inspiring wonder.
more from Mark Vernon at Big Questions Online here.
In 1965, a hotel owner named Jay Sarno began construction on a new hotel on the Las Vegas Strip, and decided to set his creation apart from the competition by modelling it on a Roman palace. Caesars Palace was really no different from any other big hotel, but the Roman arches and columns stuck on its façade, not to mention the tunic-clad cocktail waitresses inside, were such a hit that the place spawned a generation of imitations, each aiming to outdo the last in eye-popping extravagance. Las Vegas became the world’s largest theme park, with hotels intended to make you feel that you are in Venice, or Paris, or Egypt, or New York, or Bellagio, or on a pirate’s island, or among King Arthur and his knights. Or—given that these weird simulacra have become famous in their own right—that you are, quite simply, in Vegas. Sarno’s palace was vulgar and crude, but his achievement is one that even the most accomplished architects can only envy: he defined a city’s style. But it’s been clear for a while that Las Vegas has been running out of themes. The trouble is that its effects rely entirely on dazzlement, an over-the-top gigantism that gets old fast. By this point, you could do a hotel that reproduced Angkor Wat or the Aztec city of Tenochtitlan and no one would raise an eyebrow. And as Las Vegas has grown—until the recession, its expansion had helped make Nevada the fastest-growing state in the nation—the city has started to feel a little uncomfortable about its reputation as a place where developers spend billions of dollars on funny buildings.
In 2004, Bob Stein founded the Institute for the Future of the Book, with the goal of finding new models for publishing as it moved from the page to the screen, from the enclosed world of the individual reader to the networked one of the Internet. While innovative for its own time, the Institute’s mission built on Stein’s decades of experience exploring the frontiers of electronic publishing, whether with Atari, the Criterion Collection, or Voyager. Long before the popularization of the Internet, the tools that Stein developed for publishing with floppy disks, CD-ROMs, and LaserDiscs laid the groundwork for dramatic shifts in how we interact with (formerly) printed media. Much of his work proposed hybrid formats, combining the referential nature of books with the visual appeal of films, using computers to turn texts into what Stein was already calling, in the mid-’80s, “user-driven media.” Today these hybrids seem natural, but the history of publishing and technology prior to the Web, which has largely gone unrecorded, suggests that the evolution of the medium was not prescribed, but rather spurred by the experiments of Stein and his cohorts.
more from an interview with Bob Stein at Triple Canopy here.
The density of the immigrant swarm on the Lower East Side at the turn of the century, more than 2,600 people per acre, equaled in its misery, and exceeded in its crowding, the slums of Bombay. In the years since, most of the alien labor has been sanitized or outsourced, but the comforts of the city’s rich still depend on the abundance of its poor, the municipal wealth and well-being as unevenly distributed as in the good old days of the Gilded Age. When seen at a height or a distance, from across the Hudson River or from the roof of Rockefeller Center, Manhattan meets the definitions of the sublime. At ground level Manhattan is a stockyard, the narrow streets littered with debris and laid out in the manner of cattle chutes, the tenements and storefronts uniformly fitted to the framework of a factory or a warehouse. The modus vivendi under the boot of the modus operandi. The commercial imperative comes with no apology. Like most other American cities, New York is a product of the nineteenth-century Industrial Revolution, built on a standardized grid, conceived neither as a thing of beauty nor as an image of the cosmos, much less as an expression of man’s humanity to man, but as a shopping mall in which to perform the heroic feats of acquisition and consumption.
As Nietzsche constantly reminds us, morality owes a great deal, including its own existence, to the fact that it is not obeyed. It can seem to achieve closure on its own absolute kind of value only because the space in which it operates has been created, historically, socially and psychologically, by kinds of impulse that it rejects.
Tauriq Moosa is a person I usually agree with, which is why I was surprised to discover how much I disagreed with Tauriq's recent article, “How Philosophy Killed My Children and Why It Should Kill Yours, Too.” Doubtless, the breezy, polemical piece was meant to provoke rather than permanently convince, but I think that it is nonetheless quite definitely wrong. I also think an examination of why it is wrong can illuminate some very interesting, possibly disturbing things about the way certain people want us to view our actions and choices.
I take Moosa's argument to be quite simple. Human society depends vitally on procreation and on parenting. Without these, we literally have no future. Procreation is a given: children inevitably spring up all over the world for reasons that most of us understand quite well. Parenting involves the love and care of children. It does not necessarily involve the love and care of one's biological children. Given that countless needy orphans exist all over the world, a well-off person in the industrialized world is acting selfishly by having their own children. They ought to just adopt the less fortunate children.
Now, one might engage critically with this argument on several factual or practical fronts. Yet, what is most troubling about it is its uncritical acceptance of a certain form of ethical reasoning, one where our choices are evaluated from a “zoomed out” or objective perspective, one that ignores how and why individual people actually make these kinds of decisions. We see this ideology—that is what it is—at work in the idea that a child is an object into which love and care must be poured. Since young, unsocialized children are all morally interchangeable, there is no important difference between biological procreation and adoption. Potential parents are therefore morally obligated to choose the option which is better for the world in general: adoption.