Beliefs that give meaning to life can’t be dislodged by factual evidence

Salman Hameed in The Guardian:

Millions of individuals in the UK believe in UFOs and ghosts. Yet we know that there is no credible evidence for any visitation from outer space or for some dead souls hanging out in abandoned houses. On the other hand, there is now overwhelming evidence that humans and other species on the planet have evolved over the past 4.5bn years. And yet 17% of the British population and 40% of Americans reject evolution. It seems that for many there is no connection between belief and evidence.

Some – maybe most – of the blame can be attributed to an education system that does not train people to think critically. Similarly, most people do not understand methodologies of science and the way theories get accepted. For some, scientific evidence has no role in the way they envision the world.

People who claim to have been abducted by aliens provide an interesting example. The “abductions” happen mostly in the early morning hours and, apart from psychological trauma, there is no physical evidence left behind. Some scientists have attributed these episodes to sleep paralysis – a momentary miscommunication between the brain and the body, just before going to sleep or waking up.

While abductions have most likely not taken place, the trauma experienced by the individuals may still be real.

More here.

Curing the Pelvic Headache

Robert Pinsky in the New York Times Book Review:

The appeal of conversion stories often depends on descriptions of the darkness before enlightenment: we enjoy learning in detail about the presalvation misery, debauchery or sinfulness. The more detail, the better. The English novelist Tim Parks understands that principle. In his urbane, droll, weird yet far from charmless account of the pain and misery suffered by his body in general, and by his bladder, prostate, penis and related bits in particular, the conversion is from a cerebral, anxious, hunched-over and compulsively verbal kvetch (not his term, but the literal “squeeze” makes the Yiddish word seem appropriate) to something resembling the opposite.

Like the reformed sinner who diverts his audience with lurid, prolonged accounts of nights in the fleshpots, Parks gives an amusing, anxiously over-the-top confession of his former condition: “I was nothing but tension. . . . I brushed my teeth ferociously, as if I wanted to file them down. I yanked on my socks as if determined to thrust my toes right through them. . . . When I pushed a command button, I did so as if it was my personal strength that must send the elevator to the sixth floor, or raise the door of the garage. While I shaved I tensed my jaw, while I read I tensed my throat, while I ate (too fast) I tensed my forehead, while I talked I tensed my shoulders, while I listened I tensed my neck, while I drove I tensed everything.”

This passage — much longer without my ellipses — extends over an entire page. The next paragraph begins, “And this is only the briefest summary of my chronically maladjusted state.”

More here.

O’Keeffe and Stieglitz


At the beginning of their correspondence, in 1915, she addressed him as Mr. Stieglitz. He called her Miss O’Keeffe. Within a few years, he was Dearest Duck and she was Fluffy. By 1933, when the first volume of their letters ends, much more than appellations had changed. Photographer Alfred Stieglitz and painter Georgia O’Keeffe had evolved from acquaintances to lovers and then from marital partners to distant combatants struggling to maintain a passionate relationship despite his infidelities and her quest for independence. Much has been written about Stieglitz and O’Keeffe — his pioneering modern art galleries and photographic work, her paintings of enormous flowers and Southwestern landscapes, their epic love affair. But “My Faraway One” is not just one more big book about the couple. It’s a substantial sampling of a huge trove of correspondence that was sealed until 2006, 20 years after O’Keeffe’s death, and the first annotated selection of those letters to appear in print.

more from Suzanne Muchnic at the LA Times here.

Hüsker Dü


It was early 1983, probably, after the “Everything Falls Apart” EP presaged Hüsker Dü’s departure from hard-core punk and before the “Metal Circus” EP made it official. Just a gig at a crummy club near CBGB, and late — after 1. There weren’t a dozen onlookers, but Hüsker Dü’s two early records were knockouts, and that Minneapolis trio never came east, so there we were. From our booth in back the music sounded terrific: headlong and enormous, the guitar unfashionably full, expressive and unending, with two raving vocalists alternating leads on songs whose words were hard to understand and whose tunes weren’t. Another half-dozen curious fans drifted in. And then, halfway through, the guitarist passed into some other dimension. When he stepped yowling off the low stage, most of us gravitated closer, glancing around and shaking our heads. The climax was the band’s now legendary cover of “Eight Miles High,” which transformed the Byrds’ gentle paean to the chemical-technological sublime into a roller coaster lifted screaming off its tracks — bruising and exhilarating, leaving the rider both very and barely alive.

more from Robert Christgau at the NYT here.

being human


What is human nature? A biologist might see it like this: humans are animals and, like all animals, consist mostly of a digestive tract into which they relentlessly stuff other organisms – whether animal or vegetable, pot-roasted or raw – in order to fuel their attempts to reproduce yet more such insatiable, self-replicating omnivores. The fundamentals of human nature, therefore, are the pursuit of food and sex. But that, the biologist would add, is only half the story. What makes human nature distinctive is the particular attribute that Homo sapiens uses to hunt down prey and attract potential mates. Tigers have strength, cheetahs have speed – that, if you like, is tiger nature and cheetah nature. Humans have something less obviously useful: freakishly large brains. This has made them terrifyingly inventive in acquiring other organisms to consume – and, indeed, in preparing them (what other animal serves up its prey cordon bleu?) – if also more roundabout in their reproductive strategies (composing sonnets, for example, or breakdancing). Human nature – the predilection for politics and war, industry and art – is, therefore, just the particularly brainy way that humans have evolved to solve the problems of eating and reproducing. Thus biologists believe that once they understand the human brain and the evolutionary history behind it, they will know all they need to about this ubiquitous brand of ape.

more from Stephen Cave at the FT here.

Rereading: Mildred Pierce

From Guardian:

Edmund Wilson once called James M Cain (1892-1977) one of America's “poets of the tabloid murder”. After Dashiell Hammett and Raymond Chandler, Cain is the writer most often credited with defining the “hard-boiled”, the tough-talking, fast-moving urban stories of violence, sex and money that characterised so much popular film and fiction in America during the 1930s and 40s. Unlike Hammett and Chandler, however, Cain did not focus his fiction on the consoling figure of the detective bringing a semblance of order to all that urban chaos. His novels are told from the perspective of the confused, usually ignorant, all-too-corruptible central actors in his lurid dramas of betrayal and murder. His first two novels, The Postman Always Rings Twice and Double Indemnity, were narrated by men destroyed by femmes fatales; both were made into enormously successful films, especially Billy Wilder's now-classic Double Indemnity, starring Fred MacMurray and Barbara Stanwyck in an improbable blonde wig.

In 1941, Cain published Mildred Pierce, his first novel to focus on a female protagonist; in 1945, it was duly made into a film, starring Joan Crawford in her only Oscar-winning performance, as an overprotective mother trying to cover up for her homicidal daughter. That version of Mildred Pierce is now a classic piece of stylish film noir; but its plot and tone diverge sharply from the novel, a more ostensibly “realistic” story about a divorced woman trying to raise her daughters in depression-era California. Now the film-maker Todd Haynes has returned to Cain's original text to bring us a mini-series of Mildred Pierce, with a cast including Kate Winslet in the title role, Evan Rachel Wood as the treacherous daughter and Guy Pearce. This new Mildred Pierce, produced for HBO with an apparently unlimited budget, may well be the most faithful adaptation of a book ever made: the dialogue is nearly verbatim, and the film moves painstakingly through a virtual transcription of Cain's novel. The attention to historical detail is astonishing, the performances outstanding, and the finished product is visually gorgeous, steeped in a golden sepia tone. But by the end some viewers may well be wondering what, exactly, about this story merited such reverential treatment: Cain's characterisation is uneven, to say the least, and the narrative is resolved only by means of contorted turns of the plot.

More here.

The Trouble With Common Sense

From The New York Times:

The popularity of the Mona Lisa is an illusion. As Duncan J. Watts explains: “We claim to be saying that the Mona Lisa is the most famous painting in the world because it has attributes X, Y and Z. But really what we’re saying is that the Mona Lisa is famous because it’s more like the Mona Lisa than anything else.” In other words, we are trapped inside a hall of mirrors of our own devising. We think the Mona Lisa is famous because of its traits, but we think those traits are significant only because they belong to the Mona Lisa, which we know to be famous. Ditto Shakespeare? Yes. When an incredulous English professor asked him whether he believed “Shakespeare might just be a fluke of history,” Watts indicated that he meant exactly that.

Watts doesn’t tell us how that conversation ended, but common sense does. Either the literature professor sputtered that Watts — a sociologist, physicist and former officer of the Australian Navy — had no idea what he was talking about, and left him standing with a half-empty drink in his hand, or she was quite taken with his unorthodox views and spent the rest of the evening engrossed. That both outcomes — although incompatible — strike us as predictable is actually Watts’s point in this penetrating and engaging book. We rely on common sense to understand the world, but in fact it is an endless source of just-so stories that can be tailored to any purpose. “We can skip from day to day and observation to observation, perpetually replacing the chaos of reality with the soothing fiction of our explanations,” Watts writes. Common sense is a kind of bespoke make-believe, and we can no more use it to scientifically explain the workings of the social world than we can use a hammer to understand mollusks.

More here.

Does Islam Stand Against Science?

Steve Paulson in the Chronicle of Higher Education:

Science in Muslim societies already lags far behind the scientific achievements of the West, but what adds a fair amount of contemporary angst is that Islamic civilization was once the unrivaled center of science and philosophy. What's more, Islam's “golden age” flourished while Europe was mired in the Dark Ages.

This history raises a troubling question: What caused the decline of science in the Muslim world?

Now, a small but emerging group of scholars is taking a new look at the relationship between Islam and science. Many have personal roots in Muslim or Arab cultures. While some are observant Muslims and others are nonbelievers, they share a commitment to speak out—in books, blogs, and public lectures—in defense of science. If they have a common message, it's the conviction that there's no inherent conflict between Islam and science.

More here.

The Brain on Trial

Advances in brain science are calling into question the volition behind many criminal acts. A leading neuroscientist describes how the foundations of our criminal-justice system are beginning to crumble, and proposes a new way forward for law and order.

David Eagleman in The Atlantic:

On the steamy first day of August 1966, Charles Whitman took an elevator to the top floor of the University of Texas Tower in Austin. The 25-year-old climbed the stairs to the observation deck, lugging with him a footlocker full of guns and ammunition. At the top, he killed a receptionist with the butt of his rifle. Two families of tourists came up the stairwell; he shot at them at point-blank range. Then he began to fire indiscriminately from the deck at people below. The first woman he shot was pregnant. As her boyfriend knelt to help her, Whitman shot him as well. He shot pedestrians in the street and an ambulance driver who came to rescue them.

The evening before, Whitman had sat at his typewriter and composed a suicide note:

I don’t really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I can’t recall when it started) I have been a victim of many unusual and irrational thoughts.

By the time the police shot him dead, Whitman had killed 13 people and wounded 32 more. The story of his rampage dominated national headlines the next day. And when police went to investigate his home for clues, the story became even stranger: in the early hours of the morning on the day of the shooting, he had murdered his mother and stabbed his wife to death in her sleep.

It was after much thought that I decided to kill my wife, Kathy, tonight … I love her dearly, and she has been as fine a wife to me as any man could ever hope to have. I cannot rationa[l]ly pinpoint any specific reason for doing this …

Along with the shock of the murders lay another, more hidden, surprise: the juxtaposition of his aberrant actions with his unremarkable personal life. Whitman was an Eagle Scout and a former marine, studied architectural engineering at the University of Texas, and briefly worked as a bank teller and volunteered as a scoutmaster for Austin’s Boy Scout Troop 5. As a child, he’d scored 138 on the Stanford-Binet IQ test, placing in the 99th percentile. So after his shooting spree from the University of Texas Tower, everyone wanted answers.

For that matter, so did Whitman. He requested in his suicide note that an autopsy be performed to determine if something had changed in his brain—because he suspected it had.

More here.

Leap Seconds May Hit a Speed Bump

Sophie Bushwick in Scientific American:

NIST-F1 is one of several international atomic clocks used to define international civil time (dubbed Coordinated Universal Time, or UTC), a job they perform a little too well. In fact, atomic clocks are actually more stable than Earth's orbit—to keep clocks here synched up with the motion of celestial bodies, timekeepers have to add leap seconds. The use of a leap year, adding a day to February every four years, locks the seasons, which result from Earth's orbit about the sun and the planet's tilt as it orbits, into set places in the civil calendar. Similarly, leap seconds ensure that the time it takes Earth to spin 360 degrees is equal to one day as defined by humans and their atomic clocks. Most recently, an extra second was tacked on to universal time on December 31, 2008.
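
A minimal sketch of why those extra seconds pile up (the numbers below are assumptions for illustration, not figures from the article): Earth's rotation currently takes slightly longer than 86,400 SI seconds per day, by an irregular amount on the order of a millisecond or two, and once the accumulated difference approaches a full second, a leap second is inserted.

```python
# Illustrative sketch only: how a small daily excess in Earth's rotation
# accumulates into a leap second. The 1.5 ms/day value is an assumed round
# number for the sketch; the real excess varies irregularly.

EXCESS_MS_PER_DAY = 1.5   # assumed average excess of the mean solar day over 86,400 SI seconds

accumulated_ms = 0.0
days = 0
while accumulated_ms < 1000.0:      # stop once a full second of drift has built up
    accumulated_ms += EXCESS_MS_PER_DAY
    days += 1

print(f"At ~{EXCESS_MS_PER_DAY} ms/day, a full second accumulates in about {days} days")
# -> roughly 667 days at this assumed rate, i.e. a leap second every couple of years,
#    broadly in line with how often they have historically been added.
```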

However, since 1999, the Radiocommunication Sector of the International Telecommunication Union (ITU) has been proposing the elimination of leap seconds from the measurement of UTC. Although the organization did not participate in the creation of the current leap second system, the radio waves it regulates are used to transmit UTC, giving it some influence.

Getting rid of leap seconds would certainly make it easier to calculate UTC, but this measure would also decouple astronomical time from civil time: The time measured by atomic clocks would gradually diverge from the time counted out by the movement of Earth through space. Eventually, one year will no longer be the length of Earth's orbit around the sun. Instead, it will be equivalent to a certain number of cycles of radiation from the cesium-133 atom (almost a billion billion cycles, to be precise).
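
A quick back-of-the-envelope check of that cycle count (a sketch, not part of the article): the SI second is defined as 9,192,631,770 periods of the cesium-133 radiation, so multiplying by the number of seconds in a mean year gives the order of magnitude quoted above.

```python
# Back-of-the-envelope check: cesium-133 cycles in one calendar year.
CESIUM_HZ = 9_192_631_770                 # periods per SI second (the definition of the second)
SECONDS_PER_YEAR = 365.2425 * 86_400      # mean Gregorian year, in SI seconds

cycles_per_year = CESIUM_HZ * SECONDS_PER_YEAR
print(f"Cesium-133 cycles per year: {cycles_per_year:.3e}")
# -> about 2.9e17, i.e. a few tenths of a "billion billion" (1e18) cycles
```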

More here.

Mohandas and the Unicorn

Patrick French in The National Interest:

If celebrity is a mask that eats into the face, posthumous fame is more like an accretion of silt and barnacles that can leave the face unrecognizable, or recognizable only as something it is not. We might feel we know Mohandas Gandhi, Abraham Lincoln, Albert Einstein, Joan of Arc or Martin Luther King Jr., but, rather, we know their iconic value: their portraits or statues, their famous deeds and sayings. We have trouble seeing them as their contemporaries did—as people. Jawaharlal Nehru, writing in the 1930s when he was in a British prison and some distance from becoming India’s prime minister, said that Gandhi’s views on marital relationships were “abnormal and unnatural” and “can only lead to frustration, inhibition, neurosis, and all manner of physical and nervous ills. . . . I do not know why he is so obsessed by this problem of sex.” Nehru was writing publicly, in his autobiography, but it is fair to say that few Indian politicians today would speak of the Father of the Nation in this unfettered way. Gandhi has become, in India and across the world, a simplified character: a celibate, cheerful saint who wore a white loincloth and round spectacles, ate small meals and succeeded in bringing down an empire through nonviolent civil disobedience. Barack Obama, who kept a portrait of Gandhi hanging on the wall of his Senate office, is fond of citing him.

Joseph Lelyveld has already found himself in some trouble over Great Soul, not for what he wrote, but for what other people say he wrote. In a contemporary morality tale of high-speed information transfer and deliberate misconstruction, his book has been identified as something it is not. The Daily Mail, one of London’s lively and vituperative tabloids, ran a story saying Great Soul claimed Gandhi “was bisexual and left his wife to live with a German-Jewish bodybuilder.”

More here.

Yemeles Bibesy, Illecebrous Quagswagging, Malagrugrous Sanguinolency, etc.

Heather Carreiro in Matador Abroad:

During my undergraduate studies as a Linguistics major, one of the things that struck me most was the amazing fluidity of language. New words are created; older words go out of style. Words can change meaning over time, vowel sounds shift, consonants are lost or added and one word becomes another. Living languages refuse to be static.

The following words have sadly disappeared from modern English, but it’s easy to see how they could be incorporated into everyday conversation.

Words are from Erin McKean’s two-volume series: Weird and Wonderful Words and Totally Weird and Wonderful Words. Definitions have been quoted from the Oxford English Dictionary.

1. Jargogle

Verb trans. – “To confuse, jumble” – First of all this word is just fun to say in its various forms. John Locke used the word in a 1692 publication, writing “I fear, that the jumbling of those good and plausible Words in your Head..might a little jargogle your Thoughts…” I’m planning to use it next time my husband attempts to explain complicated Physics concepts to me for fun: “Seriously, I don’t need you to further jargogle my brain.”

2. Deliciate

Verb intr. – “To take one’s pleasure, enjoy oneself, revel, luxuriate” – Often I feel the word “enjoy” just isn’t enough to describe an experience, and “revel” tends to conjure up images of people dancing and spinning around in circles – at least in my head. “Deliciate” would be a welcome addition to the modern English vocabulary, as in “After dinner, we deliciated in chocolate cream pie.”

More here. [Thanks to Gabika Bočkaj.]

The Intelligent Homosexual’s Guide to Natural Selection and Evolution, with a Key to Many Complicating Factors

Jeremy Yoder in Scientific American:

I'm looking forward to celebrating Pride for the first time in my new hometown of Minneapolis this weekend–but as an evolutionary biologist, I suspect I have a perspective on the life and history of sexual minorities that many of my fellow partiers don't. In spite of the progress that LGBT folks have made, and seem likely to continue to make, towards legal equality, there's a popular perception that we can never really achieve biological equality. This is because same-sex sexual activity is inherently not reproductive sex. To put it baldly, as the idea is usually expressed, natural selection should be against men who want to have sex with other men–because we aren't interested in the kind of sex that makes babies. An oft-cited estimate from 1981 is that gay men have about 80 percent fewer children than straight men.

Focusing on the selective benefit or detriment associated with particular human traits and behaviors gets my scientific dander up, because it's so easy for the discussion to slip from what is “selectively beneficial” to what is “right.” A superficial understanding of what natural selection favors or doesn't favor is a horrible standard for making moral judgements. A man could leave behind a lot of children by being a thief, a rapist, and a murderer–but only a sociopath would consider that such behavior was justified by high reproductive fitness.

And yet, as an evolutionary biologist, I have to admit that my sexual orientation is a puzzle.

More here.

Biologists discover how yeast cells reverse aging

From PhysOrg:

Human cells have a finite lifespan: They can only divide a certain number of times before they die. However, that lifespan is reset when reproductive cells are formed, which is why the children of a 20-year-old man have the same life expectancy as those of an 80-year-old man.

…When yeast cells reproduce, they undergo a special type of cell division called meiosis, which produces spores. The MIT team found that the signs of cellular aging disappear at the very end of meiosis. “There’s a true rejuvenation going on,” Amon says. The researchers discovered that a gene called NDT80 is activated at the same time that the rejuvenation occurs. When they turned on this gene in aged cells that were not reproducing, the cells lived twice as long as normal. “It took an old cell and made it young again,” Amon says.

In aged cells with activated NDT80, the nucleolar damage was the only age-related change that disappeared. That suggests that nucleolar changes are the primary force behind the aging process, Amon says. The next challenge, says Daniel Gottschling, a member of the Fred Hutchinson Cancer Research Center in Seattle, will be to figure out the cellular mechanisms driving those changes. “Something is going on that we don’t know about,” says Gottschling, who was not involved in this research. “It opens up some new biology, in terms of how lifespan is being reset.”

The protein produced by the NDT80 gene is a transcription factor, meaning that it activates other genes. The MIT researchers are now looking for the genes targeted by NDT80, which likely carry out the rejuvenation process.

More here.