The Internet and collective problem solving

The whole Dan Rather-Bush national guard forgery episode appears to have brought the blogosphere into a new prominence. (Though, check out Matthew Yglesias’ objections to all the self-congratulation.) In some ways, it does appear to be an instance of collective problem solving.

As the recent and peculiar embrace of the wisdom of the masses (all the rage these days thanks to James Surowiecki’s The Wisdom of Crowds, but see this review by Daniel Davies) implies, if each individual in a pool of people who must come to a collective decision has, say, a 51% chance of being correct, then as the number of people grows, the chance that the collective decision will be right becomes significantly greater than 51%. Even for a group of three people, each of whom has, say, a 2/3 (67%) chance of being right, the chance that a majority decision is right is significantly greater: nearly 75%.

When an individual’s chance of being right is greater than his or her chance of being wrong, majorities are very likely, more likely than any individual, to produce the correct decision. This is the upshot of Condorcet’s Jury Theorem.

(Of course, if each individual’s chance of being right is less than half, y < .5, the chance that a majority will be right will be less than y, by symmetry.)
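The theorem’s arithmetic is easy to check directly. Here is a minimal sketch in Python (the function name and sample figures are my own): it sums the binomial probability that a strict majority of n independent voters, each correct with probability p, reaches the right answer.

```python
from math import comb

def majority_correct(p: float, n: int) -> float:
    """Probability that a simple majority of n independent voters,
    each correct with probability p, reaches the correct decision.
    Assumes n is odd, so there are no ties."""
    k = n // 2 + 1  # smallest winning majority
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# Three voters at 2/3 each: the majority is right 20/27 of the time.
print(round(majority_correct(2/3, 3), 3))      # → 0.741

# With p just above 1/2, the majority's accuracy climbs toward 1
# as the group grows; with p just below 1/2, it sinks below p.
print(round(majority_correct(0.51, 1001), 3))
print(round(majority_correct(0.49, 1001), 3))
```

Note the symmetry the parenthetical above describes: for odd n, the majority’s chance of being right at competence p and its chance at competence 1 − p sum to exactly 1.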

Add to this the fact that, for reasons of Bayesian rationality, we should weight what majorities believe heavily, unless we have good reason not to. (I’m discussing empirical issues, not disagreements about values.) If the Jury Theorem gives us reason to believe that the outcome is correct, we should adjust (update) our beliefs toward what the majority has thrown up as the correct answer and submit, as it were, to a tyranny of the majority’s ontology, as Robert Goodin suggested as a setup to his argument.

With the advent not simply of blogs but of large-scale collective problem solving enabled by new technologies, is there a chance that we receive better information and better answers? (Again, none of this stops value conflict.) The experience of wikipedia.org, in which incorrect information is corrected quickly, seems to suggest “yes”.

I don’t live in Korea, nor do I speak Korean, but the case of OhMyNews, a South Korean news service in which “citizen reporters”, ordinary people, call in news, may over time prove the extent to which the involvement of “crowds” improves information, though strictly speaking it isn’t clear how many people report any single story, nor how stories are corrected. The stories are ranked according to credibility. But the reception of OhMyNews in Korea does suggest that the logic holds.

“When some Yonsei University students recently met with a visiting reporter to discuss the future of news, one psychology major put it simply: ‘How can you ever get truth from one source? The Internet allows us to check multiple sources, to explore message-board postings, to debate issues with others—that is the only way to find truth. And besides, what good is information if you can’t react to it?’ ‘We’re not stupid,’ added a business student. ‘We know that there is a difference between a message board, a traditional journal and OhmyNews. But by putting them together, our understanding is better. We can piece together truth.’”

What a Story Lice Can Tell

“A spectator with an especially intimate view of human evolution is beginning to tell its story and has so far divulged two quite unexpected findings.

The human louse finds people so delicious that it will accept no substitutes and cannot live more than a few hours away from the warmth and sustenance of the human body. This devotion to the human cause means that the evolutionary history of human lice dovetails with that of their hosts and reflects several pivotal events that affected both species.

In a finding that seems bound to inspire several science fiction treatments, Dr. David Reed of the Florida Museum of Natural History in Gainesville has reached the startling conclusion that some human lice show signs of having evolved originally on a different human species.

In today’s issue of the journal PLoS Biology, he and his colleagues suggest that modern humans may have contracted this strain of lice from an archaic human like Homo erectus.”

More here by Nicholas Wade in the New York Times.

Alfred Kinsey: Liberator or Pervert?

“More than half a century after the publication of his landmark study, ‘Sexual Behavior in the Human Male,’ Alfred C. Kinsey remains one of the most influential figures in American intellectual history. He’s certainly the only entomologist ever to be immortalized in a Cole Porter song. Thanks to him, it’s now common knowledge that almost all men masturbate, that women peak sexually in their mid-30’s and that homosexuality is not some one-in-a-million anomaly. His studies helped bring sex – all kinds of sex, not just the stork-summoning kind – out of the closet and into the bright light of day.

But not everyone applauds that accomplishment. Though some hail him for liberating the nation from sexual puritanism, others revile him as a fraud whose ‘junk science’ legitimized degeneracy. Even among scholars sympathetic to Kinsey there’s disagreement. Both his biographers regard him as a brave pioneer and reformer, but differ sharply about almost everything else. One independent scholar has even accused him of sexual crimes.

All of which makes the decision by the writer and director Bill Condon to place him at the center of a major Hollywood biopic – one loaded up with stars, including Liam Neeson, Laura Linney and Peter Sarsgaard – rather striking.”

More here by Caleb Crain in the New York Times.

2 Columbia University Scientists Win Medicine Nobel

“Two American scientists who discovered how people can smell and recall about 10,000 different odors were awarded the 2004 Nobel Prize in the category of physiology or medicine today.

The winners were Dr. Richard Axel, 58, a university professor at Columbia, and Dr. Linda B. Buck, 57, of the Fred Hutchinson Cancer Research Center and University of Washington in Seattle. The two, who will share the $1.3 million award, were cited for a discovery they made in 1991 while working together at Columbia University in Manhattan.

Until publication of their fundamental paper in 1991, the sense of smell had been ‘the most enigmatic of our senses,’ the Nobel Assembly of the Karolinska Institute in Stockholm said in announcing the award. The two scientists’ work provides a molecular understanding of how people who smell a lilac in childhood can recognize the fragrance later in life and also recall associated memories.”

More here in the New York Times.

The Myth Is the Message

“Myths are stories that express meaning, morality or motivation. Whether they are true or not is irrelevant. But because we live in an age of science, we have a preoccupation with corroborating our myths.

Consider the so-called Lost Continent of Atlantis, a mythic place that has been ‘found’ in so many places around the planet that one wouldn’t think there was anywhere left to look. Think again. On June 6 the BBC released a story about satellite images locating Atlantis in, of all places, the south of Spain…

Atlantis also has been ‘found’ in the Mediterranean, the Canaries, the Azores, the Caribbean, Tunisia, West Africa, Sweden, Iceland and even South America. But what if there is nothing to find? What if Plato made up the story for mythic purposes? He did. Atlantis is a tale about what happens to a civilization when it becomes combative and corrupt. Plato’s purpose was to warn his fellow Athenians to pull back from the precipice created by war and wealth.”

More here by Michael Shermer in Scientific American.

The Sudan, Again

Last night I saw the moving and fascinating documentary film Lost Boys of Sudan made by Megan Mylan and John Shenk, on PBS.

“Lost Boys of Sudan is a feature-length documentary that follows two Sudanese refugees on an extraordinary journey from Africa to America. Orphaned as young boys in one of Africa’s cruelest civil wars, Peter Dut and Santino Chuor survived lion attacks and militia gunfire to reach a refugee camp in Kenya along with thousands of other children. From there, remarkably, they were chosen to come to America. Safe at last from physical danger and hunger, a world away from home, they find themselves confronted with the abundance and alienation of contemporary American suburbia.”

You can find out more about the film, as well as learn about how to help in Darfur and/or help the “Lost Boys” in America, here.

2004 Ig Nobel Prizes Awarded at Harvard

“Researchers who patented a comb-over hairstyle in 1975 were top prize winners in the engineering section of this year’s Ig Nobel awards.

Father and son team Frank and Donald Smith developed a method to cover partial baldness using only the individual’s own hair.

Award organisers, the Annals of Improbable Research publishers, say the Igs are meant to celebrate the unusual.

They also aim to encourage interest in science, medicine, and technology.

This year’s prizes were awarded at a sell-out gala ceremony at Harvard University in Massachusetts and the prizes were handed to the winners by genuine Nobel Laureates.”

More here from the BBC.

Stop taking your vitamins!

“Thousands of people could be dying prematurely from vitamin supplements, researchers report today, stating that the pills increase the death rate of those who take them by 6 per cent.

One in three women and one in four men in the UK is estimated to take dietary supplements for health reasons. But a review of 14 trials of vitamin pills taken by 170,000 people found they increased the death rate by 6 per cent. While they offered no explanation as to what caused the deaths, they discovered that the supplements offered no protection against cancers of the gut.

The researchers, writing in The Lancet, estimate that for every one million people taking the supplements, 9,000 would die prematurely as a result. The figure takes account of the background level of premature death in the population.”

More here by Jeremy Laurance in The Independent.

The lightweight champion of the world

“Frei Otto, the 79-year-old German architect and structural engineer whose work continues to inspire leading British architects such as Richard Rogers and Norman Foster, has won this year’s Royal Gold Medal for Architecture. Presented by the Royal Institute of British Architects (Riba), it is the world’s most prestigious architectural award.

Born in Siegmar, Saxony, in 1925, Otto made his mark with a number of impressive ultra-modern and super-light tent-like structures using new materials, beginning with the West German pavilion, designed with Rolf Gutbrod, for the 1967 Montreal Expo.”

More here by Jonathan Glancey in The Guardian.

Arab scholar “cracked Rosetta code” 800 years before the West

“It is famed as a critical moment in code-breaking history. Using a piece of basalt carved with runes and words, scholars broke the secret of hieroglyphs, the written ‘language’ of the ancient Egyptians.

A baffling, opaque language had been made comprehensible, and the secrets of one of the world’s greatest civilisations revealed – thanks to the Rosetta Stone and the analytic prowess of 18th and 19th century European scholars.

But now the supremacy of Western thinking has been challenged by a London researcher who claims that hieroglyphs had been decoded hundreds of years earlier – by an Arabic alchemist, Abu Bakr Ahmad Ibn Wahshiyah.”

More by Robin McKie here in The Guardian.

Darfur Again

Various of us at 3quarksdaily have tried to keep up on the depressing state of affairs in Sudan. Here’s an update with several editorials quite to the point. Dallaire at the Carr Center for Human Rights Policy at Harvard has an excellent piece in today’s New York Times. And from the neo-con side, David Brooks had an admirable piece in the same rag a few days ago. The Darfur Information Center has a nice archive of up-to-date articles and info.

Findings in Cleveland

I’ve had a lingering obsession with the idea that literature and art have at least one obvious function in the modern world. That function is to try and retrieve and make sense of memory and past experience. The obvious great-granddaddy of this tendency in modern art and literature is Marcel Proust. But, interestingly enough, painting has continued to play a quiet if important role in dealing with memory and experience as regards the 20th century. For example, some of the paintings of Gerhard Richter are, arguably, more powerful attempts to deal with the traumatic history of Germany in the 20th century than just about anything one can point to in prose or any other medium (some nod should probably be given here, though, to the poetic works of Paul Celan).
This week, I had the pleasant experience of running, purely accidentally, into an artist who is clearly a major figure in precisely this project. I happened to wander into the Museum of Contemporary Art in Cleveland. There, I discovered a painter I was shocked never to have heard of before. His paintings were an attempt to reconcile social realism with the abstraction of the Soviet Suprematists. The works, painted in the 20s, were out of time and yet timely as hell, and disconcertingly so. They were painted by Charles Rosenthal. Only at the end of the exhibit did I learn, by finally reading the exhibition pamphlet, that Charles Rosenthal is a fictional artist, created by the contemporary artist Ilya Kabakov. The entire exhibition of works by both Rosenthal and a double fictional version of Kabakov himself, also named Ilya Kabakov, was a grand installation by the artist.
The exhibit is one of the most profound reflections on the dilemmas of the 20th century, aesthetical, political, historical, and experiential, that I have come across in quite some time. If you’re in Cleveland between now and January 2nd I recommend stopping by.

‘America (The Book)’ –from Jon Stewart et al

“In 9/11’s wake, millions of Americans felt rallied — or were told they had been, anyhow — by President Bush’s bullhorn address at ground zero. As much as I wanted to be moved, I wasn’t one of them. I got my upsurge of patriotic defiance from another source — the famous 9/11 issue of The Onion, which rose to the almost unimaginable challenge of satirizing the attacks before the rubble stopped smoking. Under the circumstances, making jokes was heroic, which is why the Pulitzer judge who lobbied his colleagues to include The Onion’s mock coverage among 2001’s finalists wasn’t kidding.

Meanwhile, on the first post-attack faux newscast of ‘The Daily Show,’ Comedy Central’s Jon Stewart disconcerted everybody by weeping on camera. His utterly human reaction, far more spontaneous than Dan Rather’s provoking himself to choke up by reciting ‘America the Beautiful’ on David Letterman’s show, had the effect of reminding us that ‘Jon Stewart’ was a persona — one as yet unable to fit authenticity in, something that took Letterman himself years.

Even so, since its finest hour The Onion has often amused me, but its humor seems quaint — locked into a funhouse-mirror formula. And Stewart, who might have been mistaken for a real Sept. 10 kind of guy, has turned into the Bush years’ sharpest jester, a satirist who doubles for his fans as a goofy, imperturbable reality check. Nobody better demonstrates how those post-9/11 reports on the death of irony turned out to be, well, ironic.”

More here from the New York Times.

Richard Avedon dies at 81

“Richard Avedon, the revolutionary photographer who redefined fashion photography as an art form while achieving critical acclaim through his stark black-and-white portraits of the powerful and celebrated, died Friday. He was 81.

Avedon suffered a brain hemorrhage last month while on assignment in San Antonio, Texas, for The New Yorker, taking pictures for a piece called “On Democracy.” He spent months on the project, shooting politicians, delegates and citizens from around the country.

He died at Methodist Hospital in San Antonio, said Perri Dorset, a spokeswoman for the magazine.”

More here from MSNBC.

Fixing the Vote

“Electronic voting machines promise to make elections more accurate than ever before, but only if certain problems–with the machines and the wider electoral process–are rectified.

Voting may seem like a simple activity–cast ballots, then count them. Complexity arises, however, because voters must be registered and votes must be recorded in secrecy, transferred securely and counted accurately. We vote rarely, so the procedure never becomes a well-practiced routine. One race between two candidates is easy. Half a dozen races, each between several candidates, and ballot measures besides–that’s harder. This complex process is so vital to our democracy that problems with it are as noteworthy as engineering faults in a nuclear power plant.

Votes can be lost at every stage of the process. The infamous 2000 U.S. presidential election dramatized some very basic, yet systemic, flaws concerning who got to vote and how the votes were counted. An estimated four million to six million ballots were not counted or were prevented from being cast at all–well over 2 percent of the 150 million registered voters. This is a shockingly large number considering that the decision of which candidate would assume the most powerful office in the world came to rest on 537 ballots in Florida.”

More here by Ted Selker in Scientific American.