On the Collapse of the Kyrgyz Kleptocracy

Alex Cooley writes on events in Kyrgyzstan.

“The revolution in Kyrgyzstan not only represents a new opportunity for the Kyrgyz people, but also for Western governments that have failed to support true democratization in post-Soviet Central Asia.

The West must abandon its support for regional strongmen who merely profess to be Western-oriented, and should set the same expectations of democratic governance for them as it has for other post-Communist states.

In fact, the United States and European governments share some responsibility for the Kyrgyz regime’s years of incompetent and corrupt rule. Immediately after the dissolution of the Soviet Union, Kyrgyzstan stood out as the only Central Asian country that appeared willing to break from its Communist past.

Its new president, Askar Akayev, was a charismatic former scientist who promised to enact Western-style democratization and pursue economic liberalization, hailing his small country as the ‘Switzerland of Central Asia.’”

The Company of Strangers: A Natural History of Economic Life

Economist Paul Seabright is fascinated by human cooperation. Mistrust and violence are in our genes, he says, but abstract, symbolic thought permits us to accept one another as “honorary relatives”—a remarkable arrangement that ultimately underlies every aspect of modern civilization.

In developing these ideas for his latest book—The Company of Strangers: A Natural History of Economic Life—Seabright traveled widely, especially in Eastern Europe and Asia. He currently lives in southwest France, where he teaches economics at the University of Toulouse.

American Scientist Online managing editor Greg Ross interviewed Seabright by e-mail in March 2005.

You point out that human society has led us to interact as strangers only in the last 10,000 years, while we still carry deeper instincts toward violence and suspicion of outsiders. How fragile is the social contract?

How full is the glass? It can seem extraordinary that the vast complexity of human cooperation—from road traffic patterns to markets, the Internet and the systems that keep our houses and cities safe—should rest on nothing more solid than social convention, as though civilization were founded purely on table manners. I may think my property is secure and my life reasonably protected, but that is only because the rest of the world has agreed, for the time being, to let them be so. And what people have agreed to respect today they can agree to violate tomorrow. Yet it is just as remarkable how robust many of our conventions turn out to be in practice. Partly this is because conventions govern our reactions to people as roles and not just as individuals—an assassinated president can be replaced by a vice president, and the system as a whole can go on functioning, with people listening to the new president much as they would have listened to the old. Partly it is because the hydra of social life has too many heads to be easily incapacitated: The conventions that sustain our physical security are not coordinated in one place, such as the U.N. or the Pentagon, but are the result of billions of individual decisions concerning how we react to neighbours, friends and colleagues. Some circumstances—the genocide in Rwanda, for instance—upset those conventions radically, so that neighbours, friends and colleagues become each other’s greatest threat. But those circumstances are—fortunately—rare, and the capacity of societies to recover from them has historically proved remarkable.

Hair is a good source of stem cells

The fact that hair grows quickly and is continually replenished makes it an attractive source for harvesting the quantities of stem cells needed for treatments. Supply has been a major stumbling block for stem cell research, as has the controversy surrounding the ethics of harvesting cells from embryos. The Proceedings of the National Academy of Sciences study shows nerve cells can be grown from hair follicle stem cells. Read more here as reported by the BBC.

Inside a hair follicle is a small bulge that houses stem cells. As hair follicles cycle through growth and rest periods, these stem cells periodically differentiate into new follicle cells. Yasuyuki Amoh of AntiCancer, Inc. and his colleagues isolated stem cells from the whiskers of mice and tested their ability to become more sophisticated cell types. The researchers cultured the cells and after one week discovered that they had changed into neurons and two other cell types–known as astrocytes and oligodendrocytes–that are associated with neurons. According to the report, when left for longer periods lasting weeks or months, the stem cells could differentiate into a variety of cell types, including skin and muscle cells.

Read more here.

Asteroid Impact Fueled Global Rain of Spherules

Michael Schirber in Space.com:

The asteroid that struck the Yucatan Peninsula 65 million years ago presumably initiated the extinction of the dinosaurs. The huge collision also unleashed a worldwide downpour of tiny BB-sized mineral droplets, called spherules.

The hard rain did not pelt the dinosaurs to death.

But the planet-covering residue left behind may tell us something about the direction of the incoming asteroid, as well as possible extinction scenarios, according to new research. The falling spherules might have heated the atmosphere enough to start a global fire, as one example.

More here.

The evolutionary revolutionary

Drake Bennett writes in The Boston Globe about Robert Trivers:

Trivers’s ideas have rippled out into anthropology, psychology, sociology, medicine, even economics. His work provided the intellectual basis for the then-emergent field of sociobiology (now better known as evolutionary psychology), which sought to challenge our conceptions of family, sex, friendship, and ethics by arguing (controversially) that everything from rape to religion is bred in the bone through the process of evolution. The linguist and Harvard psychology professor Steven Pinker calls Trivers “one of the great thinkers in the history of Western thought.”

Now his decades-long absence—what Trivers’s friends and colleagues refer to as his “fallow period”—finally seems to be ending. In 1994 he left Santa Cruz (“the worst place in the country,” he now calls it) for Rutgers, and this spring he’s back at Harvard as a visiting professor of psychology. A major new book on genetic conflicts within individual organisms, coauthored with Austin Burt, a geneticist at Imperial College London, is due out next spring from Harvard University Press. And thanks to Brockman—agent to some of the biggest names in science—he’s under contract with Viking Penguin to write a popular book on the evolutionary origins of deceit and self-deception, one that will argue that humans have evolved, in essence, to misunderstand the world around them. Trivers thinks it could be the most important topic he has yet studied.

Trivers’s work grew out of an insight made by the Oxford biologist William D. Hamilton, who died in 2000. In a 1964 paper, Hamilton proposed an elegant solution to a problem that had rankled evolutionary theorists for some time. In a battle of the fittest, why did organisms occasionally do things that benefited others at a cost to themselves? The answer, Hamilton wrote, emerged when one took evolution down to the level of the gene. Individuals were merely vessels for genes, which survived from generation to generation, and it made no difference to the gene which organism it survived in. According to this logic, the degree to which an organism was likely to sacrifice for another should vary in direct proportion to the degree of relatedness: Humans, for example, would be more likely to share food with a son than a second cousin, and more likely to share with a second cousin than someone wholly unrelated. Hamilton called the concept “inclusive fitness.”
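Hamilton’s logic is usually summarized as Hamilton’s rule: a costly act toward a relative is favored when r × b > c, where r is the coefficient of relatedness, b the benefit to the recipient, and c the cost to the actor. The sketch below is a toy illustration of that rule (the numbers are made up for the example, not taken from the article):

```python
# Toy illustration of Hamilton's rule: altruism is favored when r * b > c.
# r = coefficient of relatedness, b = benefit to recipient, c = cost to actor.

def altruism_favored(r: float, benefit: float, cost: float) -> bool:
    """Return True when inclusive fitness favors the sacrifice."""
    return r * benefit > cost

# Standard relatedness coefficients for the relatives named in the excerpt:
SON = 0.5              # parent and offspring share half their genes
SECOND_COUSIN = 1 / 32
STRANGER = 0.0

# Sharing food at cost 1 to the giver, benefit 4 to the receiver:
print(altruism_favored(SON, 4, 1))            # r*b = 2.0  > 1 -> favored
print(altruism_favored(SECOND_COUSIN, 4, 1))  # r*b = 0.125 < 1 -> not favored
print(altruism_favored(STRANGER, 4, 1))       # r*b = 0.0  < 1 -> not favored
```

The ordering son > second cousin > stranger falls straight out of the rule, which is the gradient of sacrifice the paragraph above describes.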

In 1976, the Oxford zoologist Richard Dawkins would popularize Hamilton’s ideas in his book “The Selfish Gene.” But more than anyone else, it was Trivers, then a graduate student, who grasped the profound implications of Hamilton’s work. In a way, Trivers’s legendary papers of the early 1970s were simply a series of startling applications of its logic.

Read more here.

Bacteria act as glue in nanomachines

Electrodes snare microbes in key sites on silicon wafers.

Prachi Patel Predd writes in Nature: Electric currents are being used to move bacteria around silicon chips and trap them at specific locations. The technique could help to assemble nanomachines from miniature parts, and to create a new generation of biological sensors.

Nanodevices are typically built by connecting tiny components. But such a delicate task is not easy. So, many researchers are exploring ways to fix components in place using the binding properties of biological molecules, notably DNA. Robert Hamers and his colleagues from the University of Wisconsin-Madison propose using entire microbes instead. The cells have surface proteins that attach to certain biological molecules. Once the cells are placed at specific sites on a silicon wafer, nanoparticles tagged with these molecules can bind to the cells in those locations. This is easier than dragging the nanoparticles themselves to the right spot, because their high density makes them harder to move through fluid media than the less dense living cells.

The technique gives one a way to fix components such as quantum dots or carbon nanowires at very precise locations, explains Paul Cremer, a bioanalytical chemist at Texas A&M University in College Station. “That’s potentially very exciting,” he says.

Golden rods: The researchers use Bacillus mycoides, rod-shaped bacteria that are about 5 micrometres long. They pass a solution containing the cells over a silicon wafer with gold electrodes on its surface. The charge on the electrodes captures the bacteria, which flow along the electrodes’ edges like luggage on a conveyor belt. The electrodes have tiny gaps between them. When a bacterium reaches a gap, it is trapped there by the electric field. It can be released by reducing the field between the electrodes, or permanently immobilized by increasing the voltage enough to break its cell wall.

Read more here.

Evo Devo Is the New Buzzword …

Brian K. Hall writes in Scientific American:

The study of embryonic stages across the animal kingdom–comparative embryology–flourished from 1830 on. Consequently, when On the Origin of Species appeared in 1859, Charles Darwin knew that the embryos of all invertebrates (worms, sea urchins, lobsters) and vertebrates (fish, serpents, birds, mammals) share embryonic stages so similar (which is to say, so conserved throughout evolution) that the same names can be given to equivalent stages in different organisms. Darwin also knew that early embryonic development is based on similar layers of cells and similar patterns of cell movement that generate the forms of embryos and of their organ systems. He embraced this community of embryonic development. Indeed, it could be argued that evo devo (then known as evolutionary embryology) was born when Darwin concluded that the study of embryos would provide the best evidence for evolution.

Darwin’s perception was given a theoretical basis and evo devo its first theory when Ernst Haeckel proposed that because ontogeny (development) recapitulates phylogeny (evolutionary history), evolution could be studied in embryos. Technological advances in histological sectioning and staining made simultaneously in the 1860s and 1870s enabled biologists to compare the embryos of different organisms. Though false in its strictest form, Haeckel’s theory lured most morphologists into abandoning the study of adult organisms in favor of embryos–literally to seek evolution in embryos. History does repeat itself; 100 years later a theory of how the body plan of a fruit fly is established, coupled with technological advances, ushered in the molecular phase of evo devo evaluated by Carroll.

Read more here.

Most mesmerizing music

I was fortunate enough to attend a most riveting, thrilling, gorgeous (even visually) musical performance last evening in Boston by the young and superbly talented group So Percussion at the Isabella Stewart Gardner Museum; they are performing there again this afternoon. Here’s a bit more about them:

David Weininger in the Boston Phoenix:

A salute to Steve Reich at the Gardner         

It’s customary to file Reich under the minimalism tag, but the label is proving to be less and less useful these days. The steady pulse and gradually shifting rhythms are still there, but his best pieces are so complex, with so many compositional elements, that the label rings hollow. George Steel, Miller Theatre’s director and the host of the “Composer Portraits” series, agrees. “It’s a useful term, because people sort of know you mean Steve Reich or people following Steve,” he says over the phone from his office. “But is there a minimal amount of material? No, they’re very rich pieces.” Perhaps a better label would be Steve Reich, The Artist Formerly Known As a Minimalist.

More here.

Allan Kozinn in the New York Times, 3.25.05:

As part of its Composer Portraits series, the Miller Theater is devoting an evening to Steve Reich, with two of his major scores performed by the inventive So Percussion ensemble. For So Percussion (above), the timing is perfect: the group just released a fantastic recording of one of these works, “Drumming,” on the Cantaloupe new-music label. “Drumming,” composed in 1971, is a pivotal work in Mr. Reich’s catalog, and it has been getting lots of performances lately (the most recent by Tactus, a student ensemble at the Manhattan School of Music, just a couple of weeks ago). It draws on the techniques that Mr. Reich had explored in his early “phase” pieces – works in which two lines that begin in unison move out of phase as beats are displaced, creating increasingly complicated webs of rhythm, timbre and psycho-acoustic effect. But it also looks back at his studies of African drum techniques. A performance can take on an almost ritualistic appearance; in fact, as the performers move gradually from bongos to marimbas to xylophones and then to a combination of those instruments, plus voices and a piccolo, a listener can imagine an exotic hybrid of a gamelan orchestra and a factory production line.

The other work on the program, Sextet, was composed in 1984, at a time when Mr. Reich was taking stock after the composition of “The Desert Music,” a substantial piece for orchestra and chorus. Sextet, for four percussionists and two keyboard players (who double on pianos and synthesizers), begins with a figure from “The Desert Music” but then moves in its own direction.
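The “phase” technique Kozinn describes (two identical lines that start in unison, then drift apart as one line’s beats are displaced) can be sketched as a simple rotation of a rhythmic pattern. This is only a toy model of the idea, not Reich’s actual score; the pattern below is invented for the example:

```python
# A toy sketch of phasing: two copies of one rhythmic pattern start in
# unison, then the second copy is displaced one beat at a time.
# 'x' marks a strike, '.' a rest.

PATTERN = "x.xx.x.xx.xx"  # an invented 12-beat pattern

def shifted(pattern: str, offset: int) -> str:
    """Rotate the pattern left by `offset` beats."""
    offset %= len(pattern)
    return pattern[offset:] + pattern[:offset]

for offset in range(4):
    print(f"offset {offset}:")
    print("  line 1:", PATTERN)
    print("  line 2:", shifted(PATTERN, offset))
```

At offset 0 the two lines coincide; each further displacement produces a new composite rhythm, which is the “increasingly complicated web” the review refers to, until the cycle returns to unison after a full rotation.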

R.M. Campbell in the Seattle Post Intelligencer:

The quartet, So Percussion, has been performing Reich’s “Drumming” since its formation in 1999. Who better than these young, talented musicians to tackle the difficulties of Reich’s famous score for percussion?

One of the most eloquent and inventive spokesmen in the world of minimalist music, the composer became intrigued with rhythm at an early age; his interest in Asian and African music came later. “Drumming,” the development of a single rhythmic figure inspired by Reich’s experiences in Ghana, helped make him a celebrated figure and has been widely performed.

It would be hard to imagine a more persuasive performance than So Percussion’s — accurate, engaged and rich in vitality. Rarely does Western percussion sound so varied.

A new biography of Kierkegaard

John Updike in The New Yorker:

Joakim Garff, an associate professor at the Søren Kierkegaard Research Center at the University of Copenhagen, in a brief preface to his monumental “Søren Kierkegaard: A Biography” (translated from the Danish by Bruce H. Kirmmse; Princeton; $35), states that “the Danish biographies of Kierkegaard that have appeared since Georg Brandes’s critical portrait was published in 1877 can easily be counted on the fingers of one hand, and Johannes Hohlenberg’s biography from 1940 is the most recent original work in the field.” Garff’s compendious yet lively work is undeniably a Danish biography; it assumes on the part of its readers a prior acquaintance with, say, the poetry of Adam Oehlenschläger and the intellectuality of King Christian VIII, a firm sense of what the rix-dollar could buy in the eighteen-forties, and a Copenhagener’s inherent familiarity with the saga of his world-famous, locally notorious fellow-townsman Magister Søren Aabye Kierkegaard.

The Kierkegaardian tempest needed Copenhagen’s teapot.

More here.

Too Late to Die Young

Thomas Jones in the London Review of Books:

Catching news about the Michael Jackson trial, I can’t help being reminded of a caustic song by Dan Bern, a singer less famous than Jackson by several orders of magnitude, called ‘Too Late to Die Young’. ‘The day that Elvis died was like a mercy killing,’ it begins, before turning its attention to the inglorious late careers of other fallen idols of American popular culture, challenging listeners to ‘name the last good film that Marlon Brando made/While trying to keep his kid from going to jail.’ ‘Too late to crash, too late to burn, too late to die young,’ goes the chorus. The song is softened somewhat by the singer’s sense of his own life lacking much direction or purpose; and it’s more than aware that dying young isn’t on its own enough of an achievement to turn someone into James Dean.

‘Too Late to Die Young’ points up, too, the contradictions of being both a star and a human being, in terms not only of what constitutes the good – dying young v. living an ignominiously long life, for example – but also of the expectations of the crowd, who want their (our) heroes to be above common human frailties, but all the same can’t help probing for weaknesses, and are both sorely disappointed and gleefully reassured when we find them.

More here.

A small Oedipus still struggling with his father’s shadow

Victor Sonkin in the Moscow Times:

Vladimir Nabokov’s destiny was a difficult one. Forced into exile by the Revolution, he spent the early part of his life in Germany and France, working as a tutor and tennis coach while gradually becoming the greatest Russian writer of his time. Unfortunately, his poetry and fiction were appreciated only by a small emigre circle. After relocating to the United States, he continued to pursue his interest in entomology — he had a lifelong passion for butterflies — and, with the publication of “Lolita,” he became a living classic of American literature. However, his early Russian novels, most of them translated into English by the author and his son Dmitry, have remained more obscure to U.S. readers than the books he wrote in English.

After the fall of the Soviet regime, Nabokov’s books were finally published in Russia. Except for “Lolita,” translated by the author into Russian — although some critics consider this translation seriously inferior to the original — his English-language novels have not achieved the same success here as “The Gift” or “Glory,” his Russian masterpieces.

More here.

Imagination gets its due as a real-world thinking tool

Bruce Bower writes in Science News:

Marjorie Taylor of the University of Oregon in Eugene belongs to a contingent of researchers who regard imagination as a thinking tool. Kids regularly use their imaginations to figure out how the world works and to address mysterious issues, she notes, such as what God looks like and what happened in their families or in the world before they were born. Children also apply fantasy to sidestep pain. “Fantasy is alive and well in children’s lives,” Taylor says. According to Taylor, adults as well as children are imaginative thinkers—even while posing as staunch realists. From plumbers to prime ministers, individuals encounter and converse with others purely in their own thoughts, ponder the future, and rework past events in pleasing ways. “Imagination is about considering possibilities,” Taylor says. “That’s fundamental to how people think.”

A 3-year-old boy enthusiastically describes a scary creature after Harvard University psychologist Paul L. Harris shows the boy a box and asks him to imagine that a monster lives inside it. Nevertheless, the boy reassures Harris that a monster won’t pop out if they open the box. The monster is only make-believe, the boy declares with an air of satisfaction. Harris then leaves the room for a few minutes. Alone with his thoughts, the youngster eyes the box nervously as he moves away from it.

This type of response, which kids regularly display by around age 2, doesn’t mean that they fail to distinguish fantasy from reality, in Harris’ view. Adults react in comparable ways, he says. In one experiment that he performed, adults filled a bottle with tap water and wrote the word cyanide on a label that they attached to the bottle. The volunteers knew that they were only pretending that the water was poisonous, but most wouldn’t drink it. Taylor points out another example: Grown-ups get “really scared, not pretend scared,” while watching horror movies.

In his book The Work of the Imagination (2000, Oxford), Harris proposed that people have evolved a brain system that goes to work appraising emotionally charged situations, whether or not they’re real. In fact, responding emotionally to imagined scenarios aids decision making, he holds. For example, Harris has found a deficit among people who don’t show physical signs of emotional involvement, such as an increased heart rate, while reading a suspenseful fictional passage. Such individuals score lower on tests of reasoning and logic than do people who show strong physical and emotional reactions to such tales.

From around the time that children begin to talk, Harris argues, they contemplate not only current and past events in the real world but also imaginary versions of the present and the past, future possibilities, and spiritual or supernatural concerns. He says that many other developmental psychologists neglect imagination’s role in mental development. They assume that children generate reality-based theories primarily to explain what they observe around them, much as scientists do.

Read more here.

Cilia in C-Major

Elizabeth Gudrais in Harvard Magazine:

In the human ear, it takes only a few millionths of a second from the time a sound wave vibrates the receiving “hair cells” to the time the cells generate a neural response. The equivalent process in the human eye, from photon absorption to cellular response, takes a thousand times longer. Hearing “is fast because it’s simple,” says professor of neurobiology David P. Corey of Harvard Medical School.

Well, yes and no. On a basic level, it’s easy to explain how we hear: sound waves, traveling through the air, vibrate the eardrum at certain frequencies and magnitudes, which the brain interprets to identify the sound’s pitch and volume. Betwixt vibration and human perception, though, lie several intermediate steps. Hair cells in the inner ear convert sound waves — a form of mechanical energy — into electrical signals. In the brain, those messages make several transformations between electrical and chemical signals and back again, bouncing from neuron to neuron until they reach a final resting point where we perceive them as sound.

It was 30 years ago when Corey, as a graduate student at the California Institute of Technology, began applying his undergraduate background in physics — and his childhood drive to take things apart and figure out how they work — to the mystery of hearing. In a recent article in Nature, he and his colleagues describe a protein they believe adds a crucial piece to this intricate puzzle.

Scientists have long known that the eardrum vibrates and transmits the vibration to the inner-ear bones, touching off a mechanical process in the cochlea, the snail-shaped organ containing hair cells with bristly cilia that vibrate back and forth in response to sound waves — the greater the cilia vibration, the louder the sound. (A video clip on this magazine’s website, www.harvardmagazine.com/av/hearing.html, shows these cilia vibrating in response to a piece of music.)

More here.

Evangelist of the sanguinary and excremental

Barry Gewen in the New York Times:

ARTHUR C. DANTO is arguably the most consequential art critic since Clement Greenberg. He is an erudite and sophisticated observer, a trained academic philosopher who is also wholly at home in the world of modern art, about which he writes with forcefulness and jargon-free clarity. Yet what truly distinguishes Danto from his peers is that he offers his readers more than simply his personal (if highly informed) opinions. His responses to works of art, like Greenberg’s, are grounded in a coherent intellectual structure that takes them out of the realm of free-floating subjectivity. To look at a work with Danto is to see it within the context of contemporary art, its very raison d’être.

In “Unnatural Wonders,” a collection of reviews written for The Nation between 2000 and 2004, framed by a few broadly philosophical essays, Danto declares: “I was in a sense the first posthistorical critic of art. . . . What was special about me was that I was the only one whose writing was inflected by the belief that we were not just in a new era of art, but in a new kind of era.” Greenberg was set on his critical path by Jackson Pollock. Andy Warhol performed the same function for Danto, who argues that ever since Warhol’s Brillo boxes of 1964, an art object could be anything at all (or even nothing), that for the first time in history artists were free to do whatever they wanted — to slice up dead animals, throw elephant dung on canvases, display their soiled underwear and used tampons, mold images of themselves out of their own blood. In this world of total freedom, the actual physical attributes of a work counted for less than its philosophical justifications. All art had become conceptual art, and the job of the critic was to articulate what meaning the particular artist wished to convey and how that meaning was embodied in the work at hand.

More here.

Evolution on Trial

Steve Kemper in Smithsonian Magazine:

In the summer of 1925, when William Jennings Bryan and Clarence Darrow clashed over the teaching of evolution in Dayton, Tennessee, the Scopes trial was depicted in newspapers across the country as a titanic struggle. Bryan, a three-time presidential candidate and the silver-tongued champion of creationism, described the clash of views as “a duel to the death.” Darrow, the deceptively folksy lawyer who defended labor unions and fought racial injustice, warned that nothing less than civilization itself was on trial. The site of their showdown was so obscure the St. Louis Post-Dispatch had to inquire, “Why Dayton, of all places?”

It’s still a good question. Influenced in no small part by the popular play and movie Inherit the Wind, most people think Dayton ended up in the spotlight because a 24-year-old science teacher named John Scopes was hauled into court there by Bible-thumping fanatics for telling his high-school students that humans and primates shared a common ancestry. In fact, the trial took place in Dayton because of a stunt. Tennessee had recently passed a law that made teaching evolution illegal. After the American Civil Liberties Union (ACLU) announced it would defend anyone who challenged the statute, it occurred to several Dayton businessmen that finding a volunteer to take up the offer might be a good way to put their moribund little town on the map.

More here.

The Future Will Be Peaceful, Inshallah

Rory Stewart looks at What We Owe Iraq: War and the Ethics of Nation Building by Noah Feldman, Blinded by the Sunlight: Surviving Abu Ghraib and Saddam’s Iraq by Matthew McAllester, The Fall of Baghdad by Jon Lee Anderson, and The Freedom: Shadows and Hallucinations in Occupied Iraq by Christian Parenti, in the London Review of Books:

Will the election make all the difference? For a year after the invasion the policy was to reduce the number of police in Maysan and make them more ‘citizen-friendly’. In the last six months the police force has tripled in size and is now heavily armed. People in Maysan seem to enjoy voting. But the province on election day looks a little like a police state. There are armed men at checkpoints every few kilometres up the highway; policemen with vehicle-mounted machine-guns are checking IDs on almost every street corner; no civilian vehicles are allowed to move on the streets. This may be part of the reason ‘security has improved.’ Yet despite the checkpoints, which are in place every day, there are still daily car-jackings and roadside bombs, and towards the Iranian border there’s drug smuggling, looting, and kidnapping of children. The improvement is relative. As the sheikh found when he was shot on the steps of his mosque.

No foreigner really knows what is going on in Iraq. There are diplomats – both British and American – who speak good Arabic and have studied Iraqi history; there are intelligence officers who know tribal genealogies; and there are many soldiers who get out on the ground, build good relationships with rural leaders, deliver services and win respect. The quality of journalists in Iraq has been high: Elizabeth Rubin for the New York Times Magazine and the New Republic, George Packer for the New Yorker, Rory McCarthy for the Guardian and James Astill for the Economist have produced great pieces. But even the most energetic analysts cannot move freely. Astill’s longest conversation with an Iraqi in Fallujah was with a man urinating against a wall with a suitcase on his head, and thus unable to move for twenty seconds.

I certainly don’t know what is going on in Iraq.

More here.