From Pygmalion to Blade Runner, we keep falling for our robot creations. But then, what else is AI good for?

George Zarkadakis in Aeon:

Artificial intelligence is arguably the most useless technology that humans have ever aspired to possess. Actually, let me clarify. It would be useful to have a robot that could make independent decisions while, say, exploring a distant planet, or defusing a bomb. But the ultimate aspiration of AI was never just to add autonomy to a robot’s operating system. The idea wasn’t to enable a computer to search data faster by ‘understanding patterns’, or communicate with its human masters via natural language. The dream of AI was — and is — to create a machine that is conscious. AI means building a mechanical human being. And this goal, as supposedly rational technological projects go, is deeply strange.

Consider the ramifications of a conscious machine: one that thinks and feels like a human, an ‘electronic brain’ that dreams and ponders its own existence, falls in and out of love, writes sonnets under the moonlight, laughs when happy and cries when sad. What exactly would it be good for? What could be the point of spending billions of dollars and countless hours of precious research time in order to arrive at a replica of oneself?

Technology is a cultural phenomenon, and as such it is molded by our cultural values. We prefer good health to sickness so we develop medicine. We value wealth and freedom over poverty and bondage, so we invent markets and the multitudinous thingummies of comfort. We are curious, so we aim for the stars. Yet when it comes to creating conscious simulacra of ourselves, what exactly is our motive? What deep emotions drive us to imagine, and strive to create, machines in our own image? If it is not fear, or want, or curiosity, then what is it? Are we indulging in abject narcissism? Are we being unforgivably vain? Or could it be because of love?

More here.

Is Wagner bad for us?

Nicholas Spice in the London Review of Books:

In one of the European galleries at the British Museum, there’s a bronze medal of Erasmus made in Antwerp in 1519 by the artist Quentin Metsys. A portrait of Erasmus in profile is on the front of the medal. On the reverse, the smiling bust of Terminus, the Roman god of boundaries, and the words ‘concedo nulli’ – ‘I yield to no one.’ It’s said that Erasmus kept a figurine of the god Terminus on his desk. He wrote: ‘Out of a profane god I have made myself a symbol exhorting decency in life. For death is the real terminus that yields to no one.’

Like anyone who has spent time thinking about Wagner, I have inevitably come back to the subject of boundaries and limits, and in particular to questions about the boundary that lies between Wagner’s works and his listeners, and about the experience, apparently not uncommon, of that boundary becoming blurred or even disappearing, an experience that may hold a clue to the feeling, also not uncommon, that Wagner’s work is in some sense not altogether good for us.

Respecting boundaries was not Wagner’s thing. Transgression he took in his stride – stealing other men’s wives when he needed them, spending other people’s money without worrying too much about paying it back – while artistically his ambitions knew no bounds. There is something awe-inspiring about his productivity under hostile conditions, the way, though living on the breadline, he turned out masterpieces when there was no reasonable prospect of any of them being performed: gigantic works, pushing singers and musicians to the limits of their technique, and taking music itself to the edges of its known universe. Theft; the breaking of vows, promises and contracts; seduction, adultery, incest, disobedience, defiance of the gods, daring to ask the one forbidden question, the renunciation of love for power, genital self-mutilation as the price of magic: Wagner’s work is everywhere preoccupied with boundaries set and overstepped, limits reached and exceeded. ‘Wagnerian’ has passed into our language as a byword for the exorbitant, the over-scaled and the interminable.

More here.

Worst Magazine Cover of the Year?

Seth Mnookin in Slate:

A decade ago, when I was on the national desk at Newsweek, a handful of us would spend slow nights competing to see who could come closest to writing the Platonic ideal of a perfect coverline.

The game only had one real guideline: The headlines had to be vaguely rooted in reality.

That’s a journalistic precept that Time feels free to ignore. Witness the headline emblazoned in all-caps on the cover of the magazine’s April 1 issue: “HOW TO CURE CANCER.” It’s followed by an asterisk that directs you to a subtitle, just to make sure you get the point: “Yes, it’s now possible, thanks to new cancer dream teams that are delivering better results faster.”

Which, of course, is completely, utterly, inarguably false. The roughly 580,000 Americans who will die this year from cancer know the reality all too well. For some context, that’s more people than will die from chronic lower respiratory diseases, strokes, accidents, Alzheimer’s disease, and diabetes combined.

That’s not to say that there haven’t been major advances in treating some types of cancer, including acute lymphoblastic leukemia in children, testicular cancer in men, and early-stage breast cancer in women. On the whole, however, our ability to treat solid tumors in late-stage disease remains, in the words of Nita Maihle, the director of Yale’s Biology of Reproductive Tract Cancers Program, “abysmal.”

More here.

“V.” at L: Pynchon’s First Novel Turns Fifty


Alexander Nazaryan in The New Yorker:

Penguin recently announced that Thomas Pynchon will publish his next novel, “Bleeding Edge,” this fall. Set in Manhattan’s “Silicon Alley,” it will mark Pynchon’s literary return to New York City, where he has not ventured since his début, “V.,” published fifty years ago this month. In the intervening years, Pynchon has journeyed far and wide: Southern California (“The Crying of Lot 49” and “Inherent Vice”), Northern California (“Vineland”), Chicago (“Against the Day”), the American colonies (“Mason & Dixon”), and pretty much all of Europe, Harvard Square, Namibia, and Siberia (“Gravity’s Rainbow”).

The world, too, has changed a little since Benny Profane chased alligators through the sewers of Manhattan. Medgar Evers was killed three months after the publication of “V.,” and J.F.K. five months after that. Then R.F.K. and M.L.K. There was the rise of acid and pot, the riots of Newark and Detroit.

Despite all of the places he’s travelled, despite the near-infinite reach of his fiction, there is nevertheless a tendency, I find, to think of the media-averse Pynchon as hermetically sealed in a vat of his own ideas, puns, and fears. His famous paranoia has to it a pervasive, timeless quality, equally suspicious of all creeds and systems, of individuals and corporations alike.

But to read “V.” today is to experience Pynchon anew. Blast through the multilayered densities of “Gravity’s Rainbow,” “Mason & Dixon,” and “Against the Day,” and you have a young Cornell graduate, an engineer from Long Island, writing with an earnestness you might not have expected, about a world he could never recover. And though we think of Pynchon as the progenitor of postmodern irony, the novel’s central theme, as uttered by the jazz saxophonist McClintic Sphere, is one of sly but unmistakable sincerity: “Keep cool but care.”

I should confess that I have no idea what “V.” is about—and I have read it twice. It may be about Benny Profane, a hopeless schlemiel who, having been discharged from the Navy, bounces around New York City with a comically harmless gang called the Whole Sick Crew, spending a good amount of time in the aforementioned crocodilian pursuit. Or the novel could be about Herbert Stencil, the son of a prominent British consular official, Sidney Stencil, who had “died under unknown circumstances in 1919 while investigating the June Disturbances in Malta.” Stencil’s entire existence is focused on the hunt for V., a classic novelistic quest-without-resolution (in fact, V. might be fiction’s greatest example of a MacGuffin). V. may be a person, or may be a place, though it could also be neither: Pynchon calls it, at one point, “a remarkably scattered concept” and, at another, “the ultimate Plot Which Has No Name.”

No Thing


Richard Marshall reviews No Medium by Craig Dworkin, in 3:AM Magazine:

Something has been fixed in. Something about nothingness, about unreadability and unwritability, about silence and absence, abjection and a special kind of boredom. Craig Dworkin’s book is about an aspect of this fix. He looks at “works that are blank, erased, clear, or silent.” He argues that “we should understand media not as blank, base things but as social events, and that there is no medium, understood in isolation, but always a plurality of media: interpretive activities taking place in socially inscribed places.” The last chapter gives a list of key examples of more than 100 scores and readings of ‘silent’ music.

Blanchot’s ‘gigantic’ de Sade impressed Beckett as being “jealous of Satan and of his eternal torments, and confronting nature more than human-kind.” Satan’s torments were in darkness, alone and in an eternity of ice. Jealousy is a feisty off-shoot of ambition. So why is de Sade jealous? De Sade is jealous of the perturbability of Satan. The 120 Days of Sodom reads like an accountant’s log. What disturbed Beckett when he read Kafka was the imperturbability. “I am wary of disasters that allow themselves to be recorded like a statement of accounts.” De Sade fails in his gigantic quest to be disturbed and so is jealous of Satan’s achievement. This links to the modern fix. In the modern fix there is a crucial disturbance freaking in blankness. There is an instinct in this stuff to not tone down what is mistakenly taken to be superfluous. Oddly, complexity and the amorphous can seem abstract. But they are correspondences of a desperate tormented plenum wriggling at the abyss. Torment in this mode stands time still, skips lives, makes space hard to cross. This is the liveliness of a “nothing that is not there and the nothing that is.”

“You would do better, at least no worse, to obliterate texts than to blacken margins, to fill in the holes of words till all is blank and flat and the whole ghastly business looks like what it is, senseless, speechless, issueless misery.” That’s Sam Beckett. Carl Andre says, “A thing is a hole in a thing it is not.” Dworkin starts to work out what he calls the logic of the substrate by examining Nudism, the blank-paged poetry book in Jean Cocteau’s 1950 film Orphée, where Orpheus dismisses it as a pretentious joke. Dworkin suggests that a sophisticated reading would get that it was a joke, but that a more sophisticated reading would refuse to get the joke. It depends on “how closely one reads a work that seems to ask only that it not be read.” At more or less the same time John Cage was delivering his ‘Lecture on Nothing’, in which he said, “I have nothing to say and I am saying it.”

Doctor Who and the New British Empire


Chris Oates in the LA Review of Books:

Doctor Who is so British that Brits tend to disbelieve that it has become popular in the US. Their reaction at being told that one of their quirky national traditions attracts an audience unfamiliar with tea towels and gap years is a bit like an American being told that the Nathan’s Hot Dog Eating Contest is being livestreamed unironically across France. Really? That’s what you’re watching? But only we watch that.

First broadcast in 1963, Doctor Who centers on a humanoid alien, the Doctor, who travels throughout time and space with a human companion from contemporary Britain, fighting aliens and extricating himself from hopeless situations. The show was famous for its low production values. The Doctor’s spaceship/time machine, the TARDIS, is a wooden box that, notwithstanding its transgalactic origins, looks exactly like a police telephone booth from 1960s Britain. The Doctor’s greatest enemies, the Daleks, are slightly smaller wooden boxes whose main weapons look strikingly like toilet plungers. Nonetheless, it was a hit. The show was in production until 1989 and rebooted in 2005. In the UK, the show is a bit like Star Trek. It often inspires sketches for the annual Comic Relief telethon, which in 2011 got a 37-percent audience share, unheard of in the US, where a network on a strong night might average 14 percent. The Guardian art critic Jonathan Jones has called Doctor Who “Britain’s greatest television show.” It has that kind of hyperbolically vaunted status.

Doctor Who is also quintessentially British not because it is made in Britain or because it is popular in Britain, but because it reflects the development of the United Kingdom’s place in the world in the past half century. The show continued the youth adventure literature enabled and encouraged by imperialism into a post-imperial time. The Doctor acts as the epitome of how Britons (and perhaps Westerners in general) would like to see themselves and their actions in the world.

Sunday Poem

Easter in the Cancer Ward

Because it has been years since my hands
have dyed an egg or I’ve remembered
my father with color in his beard,
because my fingers have forgotten
the feel of wax melting on my skin,
the heat of paraffin warping air,
because I prefer to view death politely from afar,
I agree to visit the children’s cancer ward.

In her ballet-like butterfly slippers, Elaine pad-pads
down the carpeted hall. I bring the bright bags,
press down packets of powdered dye, repress my slight unease.
She sweeps her hair from her volunteer’s badge, leaves
behind her own residents’ ward for a few hours’ release.
The new wing’s doors glide open onto great light. Everything is
vibrant and clattered with color. Racing
up, children converge, their green voices rising.

What does one do with the embarrassment of staring
at sickness? Suddenly, I don’t know where to place
my hands. Children with radiant faces
reach out thinly, clamor for the expected bags, lead
us to the Nurses’ kitchen. Elaine introduces me and reads
out a litany of names. Some of the youngest wear
old expressions. The bald little boy loves Elaine’s long mane of hair
and holds the healthy thickness to his face, hearing

her laugh as she pulls him close. “I’m dying,”
he says, and Elaine tells him she is, too: too
much iron silting her veins. I can never accept that truth
yet, in five months, she’ll slip away in a September
night – leaving her parents and me to bow our heads, bury her
in a white wedding gown, our people’s custom.
But right now, I don’t know this. Right now, we are young,
still immortal, and the kids fidget, crying

out for their eggs. Elaine divides them into teams;
I lay out the tools for the operation.
I tell them all how painting Easter eggs used to be done
in the Old Country. Before easy dyes were common,
villagers boiled onion peels, ladled eggs
into pots so the shells wouldn’t break.
They’d scoop them out, flushed a brownish-
red, and the elders would polish and polish
Read more »

How Nature Resets Our Minds and Bodies

From The Atlantic:

Just before the dawn of the twentieth century, William James, one of the early giants of modern psychology, explained that human attention comes in two different forms. The first is directed attention, which enables us to focus on demanding tasks like driving and writing. Reading a book also requires directed attention, and you'll notice that you start to zone out when you're tired, or when you've been reading for hours at a time. The second form is involuntary attention, which comes easily and doesn't require any mental effort at all. As James explained, “Strange things, moving things, wild animals, bright things, pretty things, words, blows, blood, etc., etc., etc.” all attract our attention involuntarily.

Nature restores mental functioning in the same way that food and water restore bodies. The business of everyday life — dodging traffic, making decisions and judgment calls, interacting with strangers — is depleting, and what man-made environments take away from us, nature gives back. There's something mystical and, you might say, unscientific about this claim, but its heart actually rests in what psychologists call attention restoration theory, or ART. According to ART, urban environments are draining because they force us to direct our attention to specific tasks (e.g., avoiding the onslaught of traffic) and grab our attention dynamically, compelling us to “look here!” before telling us to instead “look over there!” These demands are draining — and they're also absent in natural environments. Forests, streams, rivers, lakes, and oceans demand very little from us, though they're still engaging, ever changing, and attention-grabbing. The difference between natural and urban landscapes is how they command our attention. While man-made landscapes bombard us with stimulation, their natural counterparts give us the chance to think as much or as little as we'd like, and the opportunity to replenish exhausted mental resources.

More here.

Maya Angelou: my terrible, wonderful mother

From The Guardian:

The first decade of the 20th century was not a great time to be born black and poor and female in St Louis, Missouri, but Vivian Baxter was born black and poor, to black and poor parents. Later she would grow up and be called beautiful. As a grown woman she would be known as the butter-coloured lady with the blowback hair.

My mother, who was to remain a startling beauty, met my father, a handsome soldier, in 1924. Bailey Johnson had returned from the first world war with officer's honours and a fake French accent. They were unable to restrain themselves. They fell in love while Vivian's brothers walked around him threateningly. He had been to war, and he was from the south, where a black man learned early that he had to stand up to threats, or else he wasn't a man. The Baxter boys could not intimidate Bailey Johnson, especially after Vivian told them to lay off. Vivian's parents were not happy that she was marrying a man from the south who was neither a doctor nor lawyer. He said he was a dietician. The Baxters said that meant he was just a negro cook. Vivian and Bailey left the contentious Baxter atmosphere and moved to California, where little Bailey was born. I came along two years later. My parents soon proved to each other that they couldn't stay together. They were matches and gasoline. They even argued about how they were to break up. Neither wanted the responsibility of taking care of two toddlers. They separated and sent me and Bailey to my father's mother in Arkansas. I was three and Bailey was five when we arrived in Stamps, Arkansas. We had identification tags on our arms and no adult supervision. I learned later that Pullman car porters and dining car waiters were known to take children off trains in the north and put them on other trains heading south.

Save for one horrific visit to St Louis, we lived with my father's mother, Grandmother Annie Henderson, and her other son, Uncle Willie, in Stamps until I was 13. The visit to St Louis lasted only a short time but I was raped there and the rapist was killed. I thought I had caused his death because I told his name to the family. Out of guilt, I stopped talking to everyone except Bailey. I decided that my voice was so powerful that it could kill people, but it could not harm my brother because we loved each other so much. My mother and her family tried to woo me away from mutism, but they didn't know what I knew: that my voice was a killing machine. They soon wearied of the sullen, silent child and sent us back to Grandmother Henderson in Arkansas, where we lived quietly and smoothly within my grandmother's care and under my uncle's watchful eye.

More here.

THE FACTS, THE MYTHS AND THE FRAMING OF IMMIGRATION

Kenan Malik in Pandaemonium:

At the heart of the current debate about immigration are two issues: the first is about the facts of immigration, the second about public perception of immigration.

The facts are relatively straightforward. Immigration is a good, and the idea that immigrants come to Britain to live off benefits is laughable. Immigrants put more money into the economy than they take out and have negligible impact on jobs or wages. An independent report on the impact of immigration commissioned by the Home Office in 2003 looked at numerous international surveys and conducted its own study in Britain. ‘The perception that immigrants take away jobs from the existing population, or that immigrants depress the wages of existing workers’, it concluded, ‘do not find confirmation in the analysis of the data laid out in this report.’ More recently, studies have suggested that immigration helps raise wages except at the bottom of the jobs ladder, where it has a slight negative impact. That impact on low-paid workers matters hugely, of course, but is arguably more an issue of labour organization than of immigration.

Immigrants are less likely to claim benefits than British citizens. According to the Department for Work and Pensions, of the roughly 1.8 million non-British EU citizens of working age in this country, about 90,000, or around 5%, claim an ‘out of work benefit’, compared with around 13% of Britons. Migrants from outside the EU are also much less likely to claim benefits.
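The claim-rate comparison above can be checked directly from the absolute numbers quoted. A quick sketch (the 13% rate for Britons is taken from the excerpt as given, not recomputed here):

```python
# Check the out-of-work-benefit claim rates quoted for EU migrants vs. Britons.
eu_migrants_working_age = 1_800_000  # non-British EU citizens of working age
eu_claimants = 90_000                # of whom claim an 'out of work benefit'

claim_rate = eu_claimants / eu_migrants_working_age
print(f"EU migrant claim rate: {claim_rate:.0%}")  # 5%, matching the text

british_rate = 0.13                  # quoted rate for British citizens
print(f"Britons claim at {british_rate / claim_rate:.1f}x the rate")
```

The round figures in the excerpt hang together: 90,000 out of 1.8 million is exactly 5%, well under the 13% rate for Britons.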

More here.

It’s a part of my paleo fantasy, it’s a part of my paleo dream

David Gorski in Science-Based Medicine:

There are many fallacies that undergird alternative medicine, which evolved into “complementary and alternative medicine” (CAM), and for which the preferred term among its advocates is now “integrative medicine,” meant to imply the “best of both worlds.” If I had to pick one fallacy that rules above all among proponents of CAM/IM, it would have to be either the naturalistic fallacy (i.e., that if it’s natural—whatever that means—it must be better) or the fallacy of antiquity (i.e., that if it’s really old, it must be better). Of course, the two fallacies are not unrelated. In the minds of CAM proponents, old is more likely to have been based on nature, and the naturalistic fallacy often correlates with the fallacy of antiquity. Basically, it’s a rejection of modernity, and from it flow the interest in herbalism, various religious practices rebranded as treatments (thousands of years ago, medicine was religion and religion was medicine—the two were more or less one and physicians were often priests as well), and the all-consuming fear of “toxins,” in which it is thought that the products of modernity are poisoning us.

Yes, there is a definite belief underlying much of CAM that technology and pharmaceuticals are automatically bad and that “natural” must be better. Flowing from that belief is the belief that people were happier and much healthier in the preindustrial, preagricultural past, that cardiovascular disease was rare or nonexistent, and that cancer was seldom heard of. Of course, it’s hard not to note that cancer and heart disease are primarily diseases of aging, and life expectancy was so much lower back in the day that a much smaller percentage of the population lived to advanced ages than is the case today. Even so, an implicit assumption among many CAM advocates is that cardiovascular disease is largely a disease of modern lifestyle and diet and that, if modern humans could somehow mimic preindustrial or, according to some, even preagricultural, lifestyles, that cardiovascular disease could be avoided.

More here.

Bitcoin May Be the Global Economy’s Last Safe Haven

Paul Ford in Bloomberg Businessweek:

One of the oddest bits of news to emerge from the economic collapse of Cyprus is a corresponding rise in the value of Bitcoin, the Internet’s favorite, media-friendly, anarchist crypto-currency. In Spain, Google (GOOG) searches for “Bitcoin” and downloads of Bitcoin apps soared. The value of a Bitcoin went up to $78. Someone put out a press release promising a Bitcoin ATM in Cyprus. Far away, in Canada, a man said he’d sell his house for BTC5,362.

Bitcoin was created in 2009 by a pseudonymous hacker who calls him or herself Satoshi Nakamoto (and who might be several people). It’s a form of virtual cash used to buy goods and services online. Even by Web standards, it’s a strange and supergeeky phenomenon. This is what happens when software and networks meet the concept of currency, when you take peer-to-peer networks and advanced cryptography and ask, “How can I make a new economy?”

There are 10,952,975 Bitcoins in circulation. (With a digital currency you can be specific.) Bitcoin isn’t about to replace hard currency—with a market cap of $864 million, all of it is worth less than what Facebook (FB) paid for Instagram—but it’s bigger than anyone expected. And many people will tell you that the emergence of a virtual global money supply beyond the reach and control of any government is very real and that it’s time we take it seriously. As long as the Internet remains turned on, Bitcoin will be there—to its adherents, it’s the Platonic currency.
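The quoted figures are internally consistent to within a few percent: multiplying the coin supply by the $78 price mentioned earlier gives roughly the stated market cap. A sanity-check sketch (the small gap reflects the price moving between the two snapshots):

```python
# Sanity-check: market cap ≈ coins in circulation × price per coin.
supply = 10_952_975   # Bitcoins in circulation, as quoted
price_usd = 78        # price quoted earlier in the piece

market_cap = supply * price_usd
print(f"${market_cap:,}")  # $854,332,050, close to the stated $864 million
```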

More here.

Can Honeybees Lead To A Better Treatment For Myelodysplastic Syndromes?

Azra Raza in The MDS Beacon:

Honeybees have a fantastic story, one that may provide insight into myelodysplastic syndromes, aging, and a number of other conditions.

The drones and worker bees exist to work. Their various jobs include nursing the ever-hatching brood, visiting flowers to bring back nectar, constructing wax combs, serving as cleaners and guards for the hive, and literally living to serve the queen.

The queen bee, on the other hand, looks different and is larger than the other bees. She is fed and groomed by a horde of attendants, does not work a day in her life, and her only job is to lay eggs, which can number as many as 2,000 on a good summer day. She produces a pheromone called “queen substance” that informs the colony that a viable queen is present.

The greatest difference between the queen and her subjects, however, is that the workers have a life span of two to four weeks while the queen can live up to eight years.

The real kicker is that the worker bees and the queen share the exact same set of genes. What accounts for the dramatic physical differences is therefore not the genes but their relative expression (i.e., how much of each gene’s corresponding protein the body makes).

In the case of bees, it seems that the diet they are fed as larvae and beyond controls which genes are turned on to be translated into protein. Bees’ rich and nutritious diet, called royal jelly, is produced in the mouth glands of nursing bees and fed to all hatching larvae; however, the workers are soon weaned off the royal jelly and given nectar and pollen, while the queen bee is bathed in the royal jelly into adulthood.
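The genotype-versus-expression distinction above can be caricatured in a few lines of code. This is purely illustrative: the gene names and the on/off rule are invented for the sketch, not taken from bee biology.

```python
# Toy model: one shared genome, diet-dependent gene expression.
GENOME = {"growth", "longevity", "fertility", "foraging"}  # identical in all bees

def expressed(diet):
    """Larval diet, not genotype, decides which genes are switched on."""
    if diet == "royal jelly":   # the queen is bathed in it into adulthood
        return {"growth", "longevity", "fertility"}
    return {"foraging"}         # workers are weaned onto nectar and pollen

queen = expressed("royal jelly")
worker = expressed("nectar and pollen")
assert queen != worker                        # different phenotypes...
assert queen <= GENOME and worker <= GENOME   # ...drawn from the same genome
```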

What is the magic substance in royal jelly?

More here.

The Immortal Life of Henrietta Lacks, the Sequel

Rebecca Skloot in the New York Times:

Last week, scientists sequenced the genome of cells taken without consent from a woman named Henrietta Lacks. She was a black tobacco farmer and mother of five, and though she died in 1951, her cells, code-named HeLa, live on. They were used to help develop our most important vaccines and cancer medications, in vitro fertilization, gene mapping, cloning. Now they may finally help create laws to protect her family’s privacy — and yours.

The family has been through a lot with HeLa: they didn’t learn of the cells until 20 years after Lacks’s death, when scientists began using her children in research without their knowledge. Later their medical records were released to the press and published without consent. Because I wrote a book about Henrietta Lacks and her family, my in-box exploded when news of the genome broke. People wanted to know: did scientists get the family’s permission to publish her genetic information? The answer is no.

Imagine if someone secretly sent samples of your DNA to one of many companies that promise to tell you what your genes say about you. That report would list the good news (you’ll probably live to be 100) and the not-so-good news (you’ll most likely develop Alzheimer’s, bipolar disorder and maybe alcoholism). Now imagine they posted your genetic information online, with your name on it. Some people may not mind. But I assure you, many do: genetic information can be stigmatizing, and while it’s illegal for employers or health insurance providers to discriminate based on that information, this is not true for life insurance, disability coverage or long-term care.

More here.