Sunday, March 31, 2013
From Pygmalion to Bladerunner, we keep falling for our robot creations. But then, what else is AI good for?
George Zarkadakis in Aeon:
Artificial intelligence is arguably the most useless technology that humans have ever aspired to possess. Actually, let me clarify. It would be useful to have a robot that could make independent decisions while, say, exploring a distant planet, or defusing a bomb. But the ultimate aspiration of AI was never just to add autonomy to a robot’s operating system. The idea wasn’t to enable a computer to search data faster by ‘understanding patterns’, or communicate with its human masters via natural language. The dream of AI was — and is — to create a machine that is conscious. AI means building a mechanical human being. And this goal, as supposedly rational technological projects go, is deeply strange.
Consider the ramifications of a conscious machine: one that thinks and feels like a human, an ‘electronic brain’ that dreams and ponders its own existence, falls in and out of love, writes sonnets under the moonlight, laughs when happy and cries when sad. What exactly would it be good for? What could be the point of spending billions of dollars and countless hours of precious research time in order to arrive at a replica of oneself?
Technology is a cultural phenomenon, and as such it is molded by our cultural values. We prefer good health to sickness so we develop medicine. We value wealth and freedom over poverty and bondage, so we invent markets and the multitudinous thingummies of comfort. We are curious, so we aim for the stars. Yet when it comes to creating conscious simulacra of ourselves, what exactly is our motive? What deep emotions drive us to imagine, and strive to create, machines in our own image? If it is not fear, or want, or curiosity, then what is it? Are we indulging in abject narcissism? Are we being unforgivably vain? Or could it be because of love?
Is Wagner bad for us?
Nicholas Spice in the London Review of Books:
In one of the European galleries at the British Museum, there’s a bronze medal of Erasmus made in Antwerp in 1519 by the artist Quentin Metsys. A portrait of Erasmus in profile is on the front of the medal. On the reverse, the smiling bust of Terminus, the Roman god of boundaries, and the words ‘concedo nulli’ – ‘I yield to no one.’ It’s said that Erasmus kept a figurine of the god Terminus on his desk. He wrote: ‘Out of a profane god I have made myself a symbol exhorting decency in life. For death is the real terminus that yields to no one.’
Like anyone who has spent time thinking about Wagner, I have inevitably come back to the subject of boundaries and limits, and in particular to questions about the boundary that lies between Wagner’s works and his listeners, and about the experience, apparently not uncommon, of that boundary becoming blurred or even disappearing, an experience that may hold a clue to the feeling, also not uncommon, that Wagner’s work is in some sense not altogether good for us.
Respecting boundaries was not Wagner’s thing. Transgression he took in his stride – stealing other men’s wives when he needed them, spending other people’s money without worrying too much about paying it back – while artistically his ambitions knew no bounds. There is something awe-inspiring about his productivity under hostile conditions, the way, though living on the breadline, he turned out masterpieces when there was no reasonable prospect of any of them being performed: gigantic works, pushing singers and musicians to the limits of their technique, and taking music itself to the edges of its known universe. Theft; the breaking of vows, promises and contracts; seduction, adultery, incest, disobedience, defiance of the gods, daring to ask the one forbidden question, the renunciation of love for power, genital self-mutilation as the price of magic: Wagner’s work is everywhere preoccupied with boundaries set and overstepped, limits reached and exceeded. ‘Wagnerian’ has passed into our language as a byword for the exorbitant, the over-scaled and the interminable.
Worst Magazine Cover of the Year?
Seth Mnookin in Slate:
The game only had one real guideline: The headlines had to be vaguely rooted in reality.
That’s a journalistic precept that Time feels free to ignore. Witness the headline emblazoned in all-caps on the cover of the magazine’s April 1 issue: “HOW TO CURE CANCER.” It’s followed by an asterisk that directs you to a subtitle, just to make sure you get the point: “Yes, it’s now possible, thanks to new cancer dream teams that are delivering better results faster.”
Which, of course, is completely, utterly, inarguably false. The roughly 580,000 Americans who will die this year from cancer know the reality all too well. For some context, that’s more people than will die from chronic lower respiratory diseases, strokes, accidents, Alzheimer’s disease, and diabetes combined.
That’s not to say that there haven’t been major advances in treating some types of cancer, including acute lymphoblastic leukemia in children, testicular cancer in men, and early-stage breast cancer in women. On the whole, however, our ability to treat solid tumors in late-stage disease remains, in the words of Nita Maihle, the director of Yale’s Biology of Reproductive Tract Cancers Program, “abysmal.”
"Accidental" by RAVI SHAVI
“V.” at L: Pynchon’s First Novel Turns Fifty
Alexander Nazaryan in The New Yorker:
Penguin recently announced that Thomas Pynchon will publish his next novel, “Bleeding Edge,” this fall. Set in Manhattan’s “Silicon Alley,” it will mark Pynchon’s literary return to New York City, where he has not ventured since his début, “V.,” published fifty years ago this month. In the intervening years, Pynchon has journeyed far and wide: Southern California (“The Crying of Lot 49” and “Inherent Vice”), Northern California (“Vineland”), Chicago (“Against the Day”), the American colonies (“Mason & Dixon”), and pretty much all of Europe, Harvard Square, Namibia, and Siberia (“Gravity’s Rainbow”).
The world, too, has changed a little since Benny Profane chased alligators through the sewers of Manhattan. Medgar Evers was killed three months after the publication of “V.,” and J.F.K. five months after that. Then R.F.K. and M.L.K. There was the rise of acid and pot, the riots of Newark and Detroit.
Despite all of the places he’s travelled, despite the near-infinite reach of his fiction, there is nevertheless a tendency, I find, to think of the media-averse Pynchon as hermetically sealed in a vat of his own ideas, puns, and fears. His famous paranoia has to it a pervasive, timeless quality, equally suspicious of all creeds and systems, of individuals and corporations alike.
But to read “V.” today is to experience Pynchon anew. Blast through the multilayered densities of “Gravity’s Rainbow,” “Mason & Dixon,” and “Against the Day,” and you have a young Cornell graduate, an engineer from Long Island, writing with an earnestness you might not have expected, about a world he could never recover. And though we think of Pynchon as the progenitor of postmodern irony, the novel’s central theme, as uttered by the jazz saxophonist McClintic Sphere, is one of sly but unmistakable sincerity: “Keep cool but care.”
I should confess that I have no idea what “V.” is about—and I have read it twice. It may be about Benny Profane, a hopeless schlemiel who, having been discharged from the Navy, bounces around New York City with a comically harmless gang called the Whole Sick Crew, spending a good amount of time in the aforementioned crocodilian pursuit. Or the novel could be about Herbert Stencil, the son of a prominent British consular official, Sidney Stencil, who had “died under unknown circumstances in 1919 while investigating the June Disturbances in Malta.” Stencil’s entire existence is focused on the hunt for V., a classic novelistic quest-without-resolution (in fact, V. might be fiction’s greatest example of a MacGuffin). V. may be a person, or may be a place, though it could also be neither: Pynchon calls it, at one point, “a remarkably scattered concept” and, at another, “the ultimate Plot Which Has No Name.”
Richard Marshall reviews No Medium by Craig Dworkin, in 3:AM Magazine:
Something has been fixed in. Something about nothingness, about unreadability and unwritability, about silence and absence, abjection and a special kind of boredom. Craig Dworkin’s book is about an aspect of this fix. He looks at “works that are blank, erased, clear, or silent.” He argues that “we should understand media not as blank, base things but as social events, and that there is no medium, understood in isolation, but always a plurality of media: interpretive activities taking place in socially inscribed places.” The last chapter gives a list of key examples of more than 100 scores and readings of ‘silent’ music.
Blanchot’s ‘gigantic’ de Sade impressed Beckett as being “jealous of Satan and of his eternal torments, and confronting nature more than human-kind.” Satan’s torments were in darkness, alone and in an eternity of ice. Jealousy is a feisty off-shoot of ambition. So why is de Sade jealous? De Sade is jealous of the perturbability of Satan. 120 Days of Sodom reads like an accountant’s log. What disturbed Beckett when he read Kafka was the imperturbability. “I am wary of disasters that allow themselves to be recorded like a statement of accounts.” De Sade fails in his gigantic quest to be disturbed and so is jealous of Satan’s achievement. This links to the modern fix. In the modern fix there is a crucial disturbance freaking in blankness. There is an instinct in this stuff to not tone down what is mistakenly taken to be superfluous. Oddly, complexity and the amorphous can seem abstract. But they are correspondences of a desperate tormented plenum wriggling at the abyss. Torment in this mode stands time still, skips lives, makes space hard to cross. This is the liveliness of a “nothing that is not there and the nothing that is.”
“You would do better, at least no worse, to obliterate texts than to blacken margins, to fill in the holes of words till all is blank and flat and the whole ghastly business looks like what it is, senseless, speechless, issueless misery.” That’s Sam Beckett. Carl Andre says, “A thing is a hole in a thing it is not.” Dworkin starts to work out what he calls the logic of the substrate by examining the blank-paged poetry book Nudism in Jean Cocteau’s film Orphee of 1950. It is considered a pretentious joke in the film by Orpheus. Dworkin suggests that a sophisticated reading would get that it was a joke, but that a more sophisticated reading would refuse to get the joke. It depends on “how closely one reads a work that seems to ask only that it not be read.” At more or less the same time John Cage was delivering his ‘Lecture on Nothing’ where he said, “I have nothing to say and I am saying it.”
Doctor Who and the New British Empire
Chris Oates in the LA Review of Books:
Doctor Who is so British that Brits tend to disbelieve that it has become popular in the US. Their reaction at being told that one of their quirky national traditions attracts an audience unfamiliar with tea towels and gap years is a bit like an American being told that the Nathan’s Hot Dog Eating Contest is being livestreamed unironically across France. Really? That’s what you’re watching? But only we watch that.
First broadcast in 1963, Doctor Who centers on a humanoid alien, the Doctor, who travels throughout time and space with a human companion from contemporary Britain, fighting aliens and extricating himself from hopeless situations. The show was famous for its low production values. The Doctor’s spaceship/time machine, the TARDIS, is a wooden box that, notwithstanding its transgalactic origins, looks exactly like a police telephone booth from 1960s Britain. The Doctor’s greatest enemies, the Daleks, are slightly smaller wooden boxes whose main weapons look strikingly like toilet plungers. Nonetheless, it was a hit. The show was in production until 1989 and rebooted in 2005. In the UK, the show is a bit like Star Trek. It often inspires sketches for the annual Comic Relief telethon, which in 2011 got a 37-percent audience share, unheard of in the US, where a network on a strong night might average 14 percent. The Guardian art critic Jonathan Jones has called Doctor Who “Britain’s greatest television show.” It has that kind of hyperbolically vaunted status.
Doctor Who is also quintessentially British not because it is made in Britain or because it is popular in Britain, but because it reflects the development of the United Kingdom’s place in the world in the past half century. The show continued the youth adventure literature enabled and encouraged by imperialism into a post-imperial time. The Doctor acts as the epitome of how Britons (and perhaps Westerners in general) would like to see themselves and their actions in the world.
Sunday Poem
Easter in the Cancer Ward
Because it has been years since my hands
have dyed an egg or I’ve remembered
my father with color in his beard,
because my fingers have forgotten
the feel of wax melting on my skin,
the heat of paraffin warping air,
because I prefer to view death politely from afar,
I agree to visit the children’s cancer ward.
In her ballet-like butterfly slippers, Elaine pad-pads
down the carpeted hall. I bring the bright bags,
press down packets of powdered dye, repress my slight unease.
She sweeps her hair from her volunteer’s badge, leaves
behind her own residents’ ward for a few hours’ release.
The new wing’s doors glide open onto great light. Everything is
vibrant and clattered with color. Racing
up, children converge, their green voices rising.
What does one do with the embarrassment of staring
at sickness? Suddenly, I don’t know where to place
my hands. Children with radiant faces
reach out thinly, clamor for the expected bags, lead
us to the Nurses’ kitchen. Elaine introduces me and reads
out a litany of names. Some of the youngest wear
old expressions. The bald little boy loves Elaine’s long mane of hair
and holds the healthy thickness to his face, hearing
her laugh as she pulls him close. “I’m dying,”
he says, and Elaine tells him she is, too: too
much iron silting her veins. I can never accept that truth
yet, in five months, she’ll slip away in a September
night – leaving her parents and me to bow our heads, bury her
in a white wedding gown, our people’s custom.
But right now, I don’t know this. Right now, we are young,
still immortal, and the kids fidget, crying
out for their eggs. Elaine divides them into teams;
I lay out the tools for the operation.
I tell them all how painting Easter eggs used to be done
in the Old Country. Before easy dyes were common,
villagers boiled onion peels, ladled eggs
into pots so the shells wouldn’t break.
They’d scoop them out, flushed a brownish-
red, and the elders would polish and polish
them with olive oil, singing hymns for the Holy Thursday hours.
The children laugh and boo when I try to sing. The boys swirl
speckles of color into hot water, while the girls
time the eggs. When a white-faced boy asks from nowhere
if I believe in Christ and living forever,
I stop stirring the mix, answer, “Yes, I do.” I answer slowly
and when I speak, my own voice deafens me.
The simple truth blooms like these painted flowers
riding up the bright kitchen walls. I come
to belief. I know that much. Still, what a man may
do with belief demands more than what he says.
Now, the hot waters are a stained, rich red. The eggs have
boiled and cooled. To each set of hands, Elaine gives
one towel, three eggs. I pass the pot of melted paraffin,
show these children how to take the eggs and dip them in
and out. While the wax hardens to an opaque film, we hum
Christos Aneste and the room bustles, ajabber
with speech. Holding pins firmly, we scratch out mad
designs where the color will fill. Small, flurried hands
etch and scrim the shells. Everyone’s fingers whorl
and scratch in names, delicate and final.
Edging the hall’s threshold, an April’s allow-
ance of sun filters through tinted windows. Faces furrow
in solemn concentration. Looking to Elaine, my thoughts clamor
for what is redemptive in illness, for having
a Credo to hold these people to me. Etchings
done, everyone immerses the waxy eggs in the pooled
dye. We ooh together when transfigured eggs are spooned
out, wiped and dried on the counters. Soft wax
is peeled gingerly, flecked away; more oohs for the tracks
of limned lines, testimonial names.
We burnish the shells with olive oil for a fine sheen.
For a moment, the cultivated, finished eggs hush
the room. Then, every child goes wild in a rush
to compare, they show the nurses, each
other. The bald boy taps my waist. Lined up and speech-
less, they present me with a bright, autographed
egg, communally done. Elaine makes me close my eyes and laughs
when small limbs push at my back to follow
her. They shove my hand in the cool, wet, red dye. The hollow-
eyed girl squeals till tears streak from her laughing.
Another child cries, “You’ll never get it off!”
And today, I don’t want to. Today,
we’ve painted eggs a lively color, not caring
about the body’s cells and the cells’ incarceration.
I lift my arms to embrace Elaine and dab her nose and chin.
And my hands are vivid red. My hands
are bloody with resurrection.
and we are laughing.
by Nicholas Samaras
How Nature Resets Our Minds and Bodies
From The Atlantic:
Just before the dawn of the twentieth century, William James, one of the early giants of modern psychology, explained that human attention comes in two different forms. The first is directed attention, which enables us to focus on demanding tasks like driving and writing. Reading a book also requires directed attention, and you'll notice that you start to zone out when you're tired, or when you've been reading for hours at a time. The second form is involuntary attention, which comes easily and doesn't require any mental effort at all. As James explained, "Strange things, moving things, wild animals, bright things, pretty things, words, blows, blood, etc., etc., etc." all attract our attention involuntarily.
Nature restores mental functioning in the same way that food and water restore bodies. The business of everyday life -- dodging traffic, making decisions and judgment calls, interacting with strangers -- is depleting, and what man-made environments take away from us, nature gives back. There's something mystical and, you might say, unscientific about this claim, but its heart actually rests in what psychologists call attention restoration theory, or ART. According to ART, urban environments are draining because they force us to direct our attention to specific tasks (e.g., avoiding the onslaught of traffic) and grab our attention dynamically, compelling us to "look here!" before telling us to instead "look over there!" These demands are draining -- and they're also absent in natural environments. Forests, streams, rivers, lakes, and oceans demand very little from us, though they're still engaging, ever changing, and attention-grabbing. The difference between natural and urban landscapes is how they command our attention. While man-made landscapes bombard us with stimulation, their natural counterparts give us the chance to think as much or as little as we'd like, and the opportunity to replenish exhausted mental resources.
Maya Angelou: my terrible, wonderful mother
From The Guardian:
The first decade of the 20th century was not a great time to be born black and poor and female in St Louis, Missouri, but Vivian Baxter was born black and poor, to black and poor parents. Later she would grow up and be called beautiful. As a grown woman she would be known as the butter-coloured lady with the blowback hair.
My mother, who was to remain a startling beauty, met my father, a handsome soldier, in 1924. Bailey Johnson had returned from the first world war with officer's honours and a fake French accent. They were unable to restrain themselves. They fell in love while Vivian's brothers walked around him threateningly. He had been to war, and he was from the south, where a black man learned early that he had to stand up to threats, or else he wasn't a man. The Baxter boys could not intimidate Bailey Johnson, especially after Vivian told them to lay off. Vivian's parents were not happy that she was marrying a man from the south who was neither a doctor nor lawyer. He said he was a dietician. The Baxters said that meant he was just a negro cook. Vivian and Bailey left the contentious Baxter atmosphere and moved to California, where little Bailey was born. I came along two years later. My parents soon proved to each other that they couldn't stay together. They were matches and gasoline. They even argued about how they were to break up. Neither wanted the responsibility of taking care of two toddlers. They separated and sent me and Bailey to my father's mother in Arkansas. I was three and Bailey was five when we arrived in Stamps, Arkansas. We had identification tags on our arms and no adult supervision. I learned later that Pullman car porters and dining car waiters were known to take children off trains in the north and put them on other trains heading south.
Save for one horrific visit to St Louis, we lived with my father's mother, Grandmother Annie Henderson, and her other son, Uncle Willie, in Stamps until I was 13. The visit to St Louis lasted only a short time but I was raped there and the rapist was killed. I thought I had caused his death because I told his name to the family. Out of guilt, I stopped talking to everyone except Bailey. I decided that my voice was so powerful that it could kill people, but it could not harm my brother because we loved each other so much. My mother and her family tried to woo me away from mutism, but they didn't know what I knew: that my voice was a killing machine. They soon wearied of the sullen, silent child and sent us back to Grandmother Henderson in Arkansas, where we lived quietly and smoothly within my grandmother's care and under my uncle's watchful eye.
Saturday, March 30, 2013
THE FACTS, THE MYTHS AND THE FRAMING OF IMMIGRATION
Kenan Malik in Pandaemonium:
The facts are relatively straightforward. Immigration is a good and the idea that immigrants come to Britain to live off benefits laughable. Immigrants put more money into the economy than they take out and have negligible impact on jobs or wages. An independent report on the impact of immigration commissioned by the Home Office in 2003, looked at numerous international surveys and conducted its own study in Britain. ‘The perception that immigrants take away jobs from the existing population, or that immigrants depress the wages of existing workers’, it concluded, ‘do not find confirmation in the analysis of the data laid out in this report.’ More recently studies have suggested that immigration helps raise wages except at the bottom of the jobs ladder where it has a slight negative impact. That impact on low paid workers matters hugely, of course, but is arguably more an issue of labour organization than of immigration.
Immigrants are less likely to claim benefits than British citizens. According to the Department for Work and Pensions, of the roughly 1.8 million non-British EU citizens of working age in this country, about 90,000, or around 5%, claim an ‘out of work benefit’, compared with around 13% of Britons. Migrants from outside the EU are also much less likely to claim benefits.
It’s a part of my paleo fantasy, it’s a part of my paleo dream
David Gorski in Science-Based Medicine:
There are many fallacies that undergird alternative medicine, which evolved into “complementary and alternative medicine” (CAM), and for which the preferred term among its advocates is now “integrative medicine,” meant to imply the “best of both worlds.” If I had to pick one fallacy that rules above all among proponents of CAM/IM, it would have to be either the naturalistic fallacy (i.e., that if it’s natural—whatever that means—it must be better) or the fallacy of antiquity (i.e., that if it’s really old, it must be better). Of course, the two fallacies are not unrelated. In the minds of CAM proponents, old is more likely to have been based on nature, and the naturalistic fallacy often correlates with the fallacy of antiquity. Basically, it’s a rejection of modernity, and from it flow the interest in herbalism, various religious practices rebranded as treatments (thousands of years ago, medicine was religion and religion was medicine—the two were more or less one and physicians were often priests as well), and the all-consuming fear of “toxins,” in which it is thought that the products of modernity are poisoning us.
Yes, there is a definite belief underlying much of CAM that technology and pharmaceuticals are automatically bad and that “natural” must be better. Flowing from that belief is the belief that people were happier and much healthier in the preindustrial, preagricultural past, that cardiovascular disease was rare or nonexistent, and that cancer was seldom heard of. Of course, it’s hard not to note that cancer and heart disease are primarily diseases of aging, and life expectancy was so much lower back in the day that a much smaller percentage of the population lived to advanced ages than is the case today. Even so, an implicit assumption among many CAM advocates is that cardiovascular disease is largely a disease of modern lifestyle and diet and that, if modern humans could somehow mimic preindustrial or, according to some, even preagricultural, lifestyles, that cardiovascular disease could be avoided.
Bitcoin: the fastest growing currency in the world
From The Guardian:
Bitcoin May Be the Global Economy's Last Safe Haven
Paul Ford in Bloomberg Businessweek:
One of the oddest bits of news to emerge from the economic collapse of Cyprus is a corresponding rise in the value of Bitcoin, the Internet’s favorite, media-friendly, anarchist crypto-currency. In Spain, Google (GOOG) searches for “Bitcoin” and downloads of Bitcoin apps soared. The value of a Bitcoin went up to $78. Someone put out a press release promising a Bitcoin ATM in Cyprus. Far away, in Canada, a man said he’d sell his house for BTC5,362.
Bitcoin was created in 2009 by a pseudonymous hacker who calls him or herself Satoshi Nakamoto (and who might be several people). It’s a form of virtual cash used to buy goods and services online. Even by Web standards, it’s a strange and supergeeky phenomenon. This is what happens when software and networks meet the concept of currency, when you take peer-to-peer networks and advanced cryptography and ask, “How can I make a new economy?”
There are 10,952,975 Bitcoins in circulation. (With a digital currency you can be specific.) Bitcoin isn’t about to replace hard currency—with a market cap of $864 million, all of it is worth less than what Facebook (FB) paid for Instagram—but it’s bigger than anyone expected. And many people will tell you that the emergence of a virtual global money supply beyond the reach and control of any government is very real and that it’s time we take it seriously. As long as the Internet remains turned on, Bitcoin will be there—to its adherents, it’s the Platonic currency.
Can Honeybees Lead To A Better Treatment For Myelodysplastic Syndromes?
Azra Raza in The MDS Beacon:
The drones and worker bees exist to work. Their various jobs include nursing the ever-hatching brood, visiting flowers to bring back nectar, constructing wax combs, serving as cleaners and guards for the hive, and literally living to serve the queen.
The queen bee, on the other hand, looks different and is larger than the other bees. She is fed and groomed by a horde of attendants, does not work a day in her life, and her only job is to lay eggs that can amount to as many as 2,000 on a good summer day. She produces a pheromone called “queen substance” that informs the colony that a viable queen is present.
The greatest difference between the queen and her subjects, however, is that the workers have a life span of two to four weeks while the queen can live up to eight years.
The real kicker is that the worker bees and the queen share the exact same set of genes. What accounts for the dramatic physical differences is therefore not the genes but their relative expression (i.e., how much of each gene’s corresponding protein the body makes).
In the case of bees, it seems that the diet they are fed as larvae and beyond controls which genes are turned on to be translated into protein. Bees’ rich and nutritious diet, called royal jelly, is produced in the mouth glands of nursing bees and fed to all hatching larvae; however, the workers are soon weaned off the royal jelly and given nectar and pollen, while the queen bee is bathed in the royal jelly into adulthood.
What is the magic substance in royal jelly?
The Immortal Life of Henrietta Lacks, the Sequel
Rebecca Skloot in the New York Times:
Last week, scientists sequenced the genome of cells taken without consent from a woman named Henrietta Lacks. She was a black tobacco farmer and mother of five, and though she died in 1951, her cells, code-named HeLa, live on. They were used to help develop our most important vaccines and cancer medications, in vitro fertilization, gene mapping, cloning. Now they may finally help create laws to protect her family’s privacy — and yours.
The family has been through a lot with HeLa: they didn’t learn of the cells until 20 years after Lacks’s death, when scientists began using her children in research without their knowledge. Later their medical records were released to the press and published without consent. Because I wrote a book about Henrietta Lacks and her family, my in-box exploded when news of the genome broke. People wanted to know: did scientists get the family’s permission to publish her genetic information? The answer is no.
Imagine if someone secretly sent samples of your DNA to one of many companies that promise to tell you what your genes say about you. That report would list the good news (you’ll probably live to be 100) and the not-so-good news (you’ll most likely develop Alzheimer’s, bipolar disorder and maybe alcoholism). Now imagine they posted your genetic information online, with your name on it. Some people may not mind. But I assure you, many do: genetic information can be stigmatizing, and while it’s illegal for employers or health insurance providers to discriminate based on that information, this is not true for life insurance, disability coverage or long-term care.
A Conversation with Mohsin Hamid
Watch “A How-to on Finding Love and Success in ‘Rising Asia’” from PBS NewsHour.
An Interactive Scale of Everything in the Universe
From Scientific American:
This infographic may look modest, but it is nothing short of exceptional. A few days ago, I posted it to Twitter and it seems at least the Twittersphere agrees. Now the graphic is up on Visual.ly with an embed button, so of course I had to pass it along! Truly an awesome graphic in scope and execution. Go directly to the full interactive version and sail from the boundaries of the universe to stars, planets, people, ants, atoms, quarks and beyond. There’s a slightly more colorful version as well, but I miss the icons on the slider that provide a road map for where you’re going vs. where you’ve been.
More here. (Note: Do try the interactive version...fascinating!)
At times my life suddenly opens its eyes in the dark.
A feeling of masses of people pushing blindly
through the streets, excitedly, toward some miracle,
while I remain here and no one sees me.
It is like the child who falls asleep in terror
listening to the heavy thumps of his heart.
For a long, long time till morning puts his light in the locks
and the doors of darkness open.
by Tomas Tranströmer.
from The Half-Finished Heaven
Translated from the Swedish by Robert Bly, 2001
Graywolf Press, St. Paul, Minnesota
Vow: A Memoir of Marriage (And Other Affairs)
From The Telegraph:
Wendy Plump had been married for 18 years when she found out about the Other Woman. Susan lived, apparently, a mile down the road in a house that Plump’s husband, Bill, had bought for her; and living there also was their eight-month-old baby boy. “The news fell into place,” Plump writes, “with an almost audible click. Like a bullet revolving in its cylinder and lining up with the chamber.” Plump uses this image because it describes the “kind of violence that I lived with later on”, but the marriage between Wendy and Bill had been a crime scene from the start. Vow is not the kind of memoir usually written by a bruised wife bent on revenge, but then Plump was not the usual kind of wife, and it is not revenge that she is after. For a start, she had affairs of her own (three in all: honesty is what Plump does best), all of them in the early years. By the time Bill met Susan, the Plumps had a “360-degree view of infidelity. We knew it from every angle.” Wendy was unfaithful because she was young and excited and wanted something inexpressible. She just wanted, and wanting, she says, puts a terrible strain on marriage vows. What Plump didn’t want was for her marriage to end; she loved her husband and she liked their life. But she was hungry, and as W H Auden reminds us in the epigraph to the book, “Hunger allows no choice”.
...Like all books which take aim at the truth, Vow may change the way you see yourself; it will at least change your marriage. This is not because it is a morality tale or an account of infidelity that will scare you enough not to give it a go. It would be easy to produce this kind of book, and Plump doesn’t take the easy way through anything. What makes Vow so powerful is that she dissects not only the carcass of her own marriage, but the drive that propels so many otherwise sane people to destroy, for the sake of a moment of wanting, the bullet-proof world they have tried to create. Plump, who is now as wise as a serpent and as harmless as a dove, suggests that monogamous couples – at whom she gawps as though they were animals in a zoo – are committed less to their spouses than they are to themselves. Fidelity “has to do with their own honour”.
Friday, March 29, 2013
Goodfriday, 1613. Riding Westward
As in so much of Donne’s devotional verse, “Goodfriday” is structured around a “collision of the liturgy with the ego”, as Kirsten Stirling has put it. The speaker expresses guilt about travelling west on the day commemorating Christ’s crucifixion in the east, fulfilling personal obligations when he ought to be performing religious duties. However, awed by his contemplation of the crucifixion he reasons that he is facing the right way, and the direction of travel therefore directly enables the poem’s devotional climax. First, the poem explores the overwhelming nature of the crucifixion vision itself. To witness God’s death on earth would lead to a kind of paradoxical death difficult even to imagine (and Donne rhymes “dye” with “dye” at this point to powerful effect). How could a human being behold hands which could encompass infinity, or comprehend the “endlesse height” of heaven “Humbled below”? Given these visual impossibilities, the back of the head – said by Galen to be the seat of the memory – offers the more appropriate means of contemplation.
More from Daniel Starza Smith at the TLS here.
Berezovsky wasn’t just an oligarch: he was the first oligarch. He is sometimes referred to slightingly as a “former used car salesman”—this is a kind of joke. In fact Berezovsky was an accomplished mathematician, a corresponding member of the Soviet Academy of Sciences, with a specialization in game theory. In the late 1980s, as free enterprise began to be introduced in the USSR, piecemeal and with every possible loophole for corruption, the other future oligarchs began to go into “business”: Mikhail Prokhorov, future owner of Norilsk Nickel and then the New Jersey Nets, sold acid-washed jeans at the local market; Vladimir Gusinsky, future owner of Most-Bank and the country’s first independent television channel, NTV, became an event planner; Mikhail Khodorkovsky, future owner of the country’s largest oil company, now in prison for a decade, opened a cafe. Berezovsky, a generation older than these others, had an in at the Avtovaz factory in Togliatti, in central Russia; he had helped them set up their computer systems, and for years had been picking up hard-to-get auto parts there and reselling them in Moscow (so he was a bit of a used car salesman—but they were new parts). As the USSR fell apart, Berezovsky saw that the country was moving from a barter economy to a cash economy.
More from Keith Gessen at n+1 here.
the hard life of deconstruction
It’s hard to say what’s more remarkable: that the so-called father of deconstruction was already hatching his apostasy while just barely out of his teens, or that the undertaking involved so much suffering. Peeters’ Derrida is a nervous wreck: “a fragile and tormented man,” prone to nausea, insomnia, exhaustion, and despair. By the summer of 1960, after failing to get a promised post as a maître assistant at the Sorbonne and having spent the year teaching in a provincial capital instead, he was on Anafranil, one of the original anti-depressants, which had just appeared on the market. During another bout of the blues, he wrote to a friend from his infirmary bed, “I’m no good for anything except taking the world apart and putting it together again (and I manage the latter less and less frequently).” That’s not a bad description of deconstruction, an exercise in which unraveling—of meaning and coherence, of the kind of binary logic that tends to populate philosophical texts—is the path to illumination. In Derrida’s reading, Western philosophers’ preoccupation with first principles, a determination to capture reality, truth, “presence”—what he called in reference to the phenomenologist Edmund Husserl “the thing itself”—was doomed. He traced this impulse in thinkers from Aristotle to Heidegger, famously arguing, for example, that a tendency to favor the immediacy of speech over the remoteness of writing was untenable.
More from Emily Eakin at the NYRB here.
Study: Eat Protein in the Morning
PROBLEM: Skipping breakfast is strongly correlated with weight gain. "Start your day off right," right? Still, young people eat nearly half of their daily calories between 4 p.m. and midnight. So, eat breakfast, but what's best?
METHODOLOGY: A small experiment out of the University of Missouri involved 20 overweight or obese females, aged 18 to 20, who identified as infrequent breakfast eaters. Each morning for a week, the researchers had the participants eat either 350 calories of cereal (13 grams of protein), 350 calories of eggs and beef (35 grams of protein), or skip breakfast entirely. Dietary fat, fiber, sugar, and energy density were kept constant across all of their breakfasts.
Participants adjusted to their diets for six days. On the seventh day, they were kept in a lab so that researchers could track/control their behavior. They had them fill out questionnaires about their hunger levels and cravings. They took repeated blood samples. They hooked them up to an fMRI while showing them pictures of food. These tests were repeated on three different Saturdays.
On lab days, the participants were all given a standard 500-calorie lunch; for dinner they were given cut-up pieces of microwaveable pizza pockets and told to eat until they were full. They were then sent home with coolers packed with 4,000 calories worth of snacks: cookies, cakes, granola bars, candy (in its hard, chocolate, and gummy forms), chips, popcorn, crackers, pretzels, microwaveable mac and cheese, string cheese, fruits and veggies, single servings of ice cream, beef jerky, yogurt, and more microwaveable pizza pockets. This was meant to simulate the overexposure to and wide availability of snacks typical of the "modern food environment."
RESULTS: Eating any breakfast was associated with increased feelings of fullness, a reduced desire to eat, and lower levels of ghrelin (a hunger-stimulating hormone) throughout the morning. But the meaty, eggy breakfast was associated with these benefits over the course of the entire day. Participants who had a lot of protein in the morning also had reductions in their "cravings-related" brain activity, and increased levels of a hormone associated with satiety. They snacked less on fatty foods in the evening, as compared to those who ate cereal or nothing.
True Fame Lasts Longer Than 15 Minutes
In 1968, Andy Warhol—already famous in his own right—further added to his celebrity by creating a lasting cliché: “In the future, everyone will be world-famous for 15 minutes.”
Prescient as Warhol might have been, it seems we haven’t reached that future quite yet, at least according to science. A new study, published today in the American Sociological Review, finds that true fame lasts a good deal longer than 15 minutes. In an analysis of celebrity journalism nationwide, researchers found that the most famous (and most often-mentioned) celebrities stick around for decades. To come to the finding, a number of sociologists each spent a multi-year sabbatical meticulously combing the “Stars: They’re Just Like Us” feature of UsMagazine. Several reportedly declined to return to the field of academia, apparently taking their talents to the analytical departments of the glossy magazine industry full-time. Just kidding! In all seriousness, the sociologists, led by Eran Shor of McGill University and Arnout van de Rijt of Stony Brook University, used an automated search to take a random sample of roughly 100,000 names that appeared in the entertainment sections of 2,200 daily American newspapers published between 2004 and 2009. Their sample didn’t include every single name published, but rather a random selection of names published at all different frequencies—so it wouldn’t be useful for telling you who was the most often-mentioned celebrity overall, but would be illustrative of the sorts of trends that famous (and not-so-famous) names go through over time. The ten most frequently-mentioned names in their sample: Jamie Foxx, Bill Murray, Natalie Portman, Tommy Lee Jones, Naomi Watts, Howard Hughes, Phil Spector, John Malkovich, Adrien Brody and Steve Buscemi. All celebrities, they note, were relatively famous before the year 2000, in some cases decades earlier (Howard Hughes rose to fame in the 1920s). All ten names, additionally, are still fairly well-known today.
Overall, 96 percent of the most famous names in the sample (those mentioned more than 100 times over the course of a given year) had already been frequently featured in the news three years earlier, further dispelling the 15 minutes cliché. Furthermore, if a name was mentioned extremely often in its first year of appearing, it stood a greater chance of sticking around for an extended period of time.
There is, however, some truth to the 15-minutes idea: Names of lesser fame (those less frequently mentioned to start) exhibit significantly higher amounts of turnover from year to year. The researchers say these names mostly fall into the category of people involved in newsworthy events—such as natural disasters and crimes—rather than people who readers find newsworthy in their own right. As an example, Van de Rijt mentions Chesley Sullenberger, the US Airways pilot who briefly achieved celebrity after successfully executing an emergency landing on the Hudson River in 2009, but is now scarcely mentioned in the press.