Worrying about the Singularity

Intelligence Deficit

If you’ve got any spare change, the Lifeboat Foundation of Minden, Nevada, has a worthy cause for your consideration. Sometime this century, probably sooner than you think, scientists will likely succeed in creating an artificial intelligence, or AI, greater than our own. What happens after that is anyone’s guess — we’re simply not smart enough to understand, let alone predict, what a superhuman intelligence will choose to do. But there’s a reasonable chance that the AI will eradicate humanity, either out of malevolence or through a clumsily misguided attempt to be helpful. The Lifeboat Foundation’s AIShield Fund seeks to head off this calamity by developing “Friendly AI,” and thus, as its website points out, “will benefit an almost uncountable number of intelligent entities.” As of February 9, the fund has raised a grand total of $2,010; donations are fully tax deductible in the United States.

The date of this coming “Technological Singularity,” as mathematician and computer scientist Vernor Vinge dubbed the moment of machine ascendance in a seminal 1983 article, remains uncertain. He initially predicted that the Singularity (sometimes referred to, in less reverential tones, as the “Rapture of the nerds”) would arrive before 2030. Inventor and futurist Ray Kurzweil, whose book The Singularity Is Near was turned into a movie last year, places it in 2045. Those predictions are too conservative for Canadian science fiction juggernaut Robert J. Sawyer: in his WWW trilogy, whose third volume, Wonder, appears in April, the Singularity arrives in the autumn of 2012.

More from Alex Hutchinson at The Walrus here.
