Meet the Manhattan mothers who think they deserve a ‘wife bonus’

Celia Walden in The Telegraph:

Women are notoriously bad at asking for bonuses. Which is why I did my homework and created – as BusinessInsider.com suggested – “a master plan”. I waited “the appropriate amount of time” (in my case, five years), made sure the big boss was in a good mood and took him out to lunch (“somewhere intimate, where there will be no interruptions”). I eschewed any use of the word “need” (stinking, as it does, of desperation) in my pitch – which was “backed up with reports, charts and documentation of my positive performance” – and I tried to “remain respectful” as he stared slack-jawed back at me, before throwing his head back and roaring with laughter. Asking my own husband for a bonus simply for being his wife was never going to be anything less than preposterous. Yet according to the author of the forthcoming memoir Primates of Park Avenue, this is what a glittering tribe of crispy-haired Upper East Side Manhattan wives do every year – depending, of course, on how well they have managed the domestic budget, socialised, upheld a variety-filled performance in the bedroom… and succeeded in getting the kids into a ‘Big Ten’ school.

Wednesday Martin, a social researcher who has been immersing herself in the lives of “Park Avenue Primates” for over a decade, explains how the “wife bonus”, as she has called it, works in practice. “It might be hammered out in a pre-nup or post-nup, and distributed on the basis of not only how well her husband’s fund had done, but her own performance — the same way their husbands were rewarded at investment banks. In turn, these bonuses were a ticket to a modicum of financial independence and participation in a social sphere where you don’t just go to lunch, you buy a $10,000 table at the benefit luncheon a friend is hosting.”

More here.

Reproducibility crisis: Blame it on the antibodies

Monya Baker in Nature:

In 2006, things were looking pretty good for David Rimm, a pathologist at Yale University in New Haven, Connecticut. He had developed a test to guide effective treatment of the skin cancer melanoma, and it promised to save lives. It relied on antibodies — large, Y-shaped proteins that bind to specified biomolecules and can be used to flag their presence in a sample. Rimm had found a combination of antibodies that, when used to 'stain' tumour biopsies, produced a pattern that indicated whether the patient would need to take certain harsh drugs to prevent a relapse after surgery. He had secured more than US$2 million in funding to move the test towards the clinic. But in 2009, everything started to fall apart. When Rimm ordered a fresh set of antibodies, his team could not reproduce the original results. The antibodies were sold by the same companies as the original batches, and were supposed to be identical — but they did not yield the same staining patterns, even on the same tumours. Rimm was forced to give up his work on the melanoma antibody set. “We learned our lesson: we shouldn't have been dependent on them,” he says. “That was a very sad lab meeting.”

Antibodies are among the most commonly used tools in the biological sciences — put to work in many experiments to identify and isolate other molecules. But it is now clear that they are among the most common causes of problems, too. The batch-to-batch variability that Rimm experienced can produce dramatically differing results. Even more problematic is that antibodies often recognize extra proteins in addition to the ones they are sold to detect. This can cause projects to be abandoned, and waste time, money and samples. Many think that antibodies are a major driver of what has been deemed a 'reproducibility crisis', a growing realization that the results of many biomedical experiments cannot be reproduced and that the conclusions based on them may be unfounded. Poorly characterized antibodies probably contribute more to the problem than any other laboratory tool, says Glenn Begley, chief scientific officer at TetraLogic Pharmaceuticals in Malvern, Pennsylvania, and author of a controversial analysis1 showing that results in 47 of 53 landmark cancer research papers could not be reproduced.

More here.

Thursday Poem

Captain of the Lighthouse

The late hour trickles into morning. The cattle low profusely by the anthill
where brother and I climb and call Land’s End. We are watchmen
overlooking a sea of hazel-acacia-green, over torrents of dust whipping about
in whirlwinds and dirt tracks that reach us as firths.

We man our lighthouse – cattle as ships. We throw warning lights whenever
they come too close to our jagged shore. The anthill, the orris-earth
lighthouse, from where we hurl stones like light in every direction.

Tafara stands on its summit speaking in sea-talk, Aye-aye me lad – a ship’s a-
coming! And hurls a rock at the cow sailing in. Her beefy hulk jolts and turns.
Aye, Captain, another ship saved! I cry and furl my fingers into an air-long
telescope – searching for more vessels in the day-night.

Now they low on the anthill, stranded in the dark. Their sonorous cries haunt
through the night. Aye, methinks, me miss my brother, Captain of the
lighthouse, set sail from land’s end into the deepest seventh sea.
.

by Togara Muzanenhamo
from Spirit Brides
Carcanet Press Ltd., Manchester, 2006

Comics and the Eternal Present

Kurt Klopmeier in The Critical Flame:

“What, then, is time?” Christian philosopher St. Augustine asked. “If no one asks me, I know what it is. If I wish to explain it to him who asks, I do not know.” We say that time flies or that it drags. We have it on our hands or we are pressed for it. And although we cannot experience any time other than our own present, physicists tell us that there is nothing particularly special about the present. It seems that all times exist at once, and it is only our perception that limits our view of it. Because of this, it’s very hard to understand, and even harder to convey, the idea that, according to special relativity, everything is constantly happening.

The way people experience time—that it has a direction and a flow—appears to be inaccurate, at least according to our best understanding of physics. In his account of special relativity, The Fabric of the Cosmos, Brian Greene explains: “There is no use crying over spilled milk, because once spilled it can never be unspilled: we never see splattered milk gather itself together, rise off the floor, and coalesce in a glass that sets itself upright on a kitchen counter.” Events happen in one direction, and one alone. Time seems to move always forward in a particular sequence that is never interrupted. However, Greene writes, “as hard as physicists have tried, no one has found any convincing evidence within the laws of physics that supports this intuitive sense that time flows. In fact, a reframing of some of Einstein’s insights from special relativity provides evidence that time does not flow … The outside perspective … in which we’re looking at the whole universe, all of space at every moment of time, is a fictitious vantage point, one that none of us will ever have.” But this view of time, and the way that authors have tried to use it, can offer enlightening insights about the world that normal sequential narratives cannot, and can shed light on the way narrative operates on our understanding.

More here.

The Myth of American Meritocracy

Ron Unz in The American Conservative:

Just before the Labor Day weekend, a front page New York Times story broke the news of the largest cheating scandal in Harvard University history, in which nearly half the students taking a Government course on the role of Congress had plagiarized or otherwise illegally collaborated on their final exam.1 Each year, Harvard admits just 1600 freshmen while almost 125 Harvard students now face possible suspension over this single incident. A Harvard dean described the situation as “unprecedented.”

But should we really be so surprised at this behavior among the students at America’s most prestigious academic institution? In the last generation or two, the funnel of opportunity in American society has drastically narrowed, with a greater and greater proportion of our financial, media, business, and political elites being drawn from a relatively small number of our leading universities, together with their professional schools. The rise of a Henry Ford, from farm boy mechanic to world business tycoon, seems virtually impossible today, as even America’s most successful college dropouts such as Bill Gates and Mark Zuckerberg often turn out to be extremely well-connected former Harvard students. Indeed, the early success of Facebook was largely due to the powerful imprimatur it enjoyed from its exclusive availability first only at Harvard and later restricted to just the Ivy League.

During this period, we have witnessed a huge national decline in well-paid middle class jobs in the manufacturing sector and other sources of employment for those lacking college degrees, with median American wages having been stagnant or declining for the last forty years. Meanwhile, there has been an astonishing concentration of wealth at the top, with America’s richest 1 percent now possessing nearly as much net wealth as the bottom 95 percent.2 This situation, sometimes described as a “winner take all society,” leaves families desperate to maximize the chances that their children will reach the winners’ circle, rather than risk failure and poverty or even merely a spot in the rapidly deteriorating middle class. And the best single means of becoming such an economic winner is to gain admission to a top university, which provides an easy ticket to the wealth of Wall Street or similar venues, whose leading firms increasingly restrict their hiring to graduates of the Ivy League or a tiny handful of other top colleges.3 On the other side, finance remains the favored employment choice for Harvard, Yale or Princeton students after the diplomas are handed out.4

More here.

Fake Diplomas, Real Cash: Pakistani Company Axact Reaps Millions

Declan Walsh in the New York Times:

Seen from the Internet, it is a vast education empire: hundreds of universities and high schools, with elegant names and smiling professors at sun-dappled American campuses.

Their websites, glossy and assured, offer online degrees in dozens of disciplines, like nursing and civil engineering. There are glowing endorsements on the CNN iReport website, enthusiastic video testimonials, and State Department authentication certificates bearing the signature of Secretary of State John Kerry.

“We host one of the most renowned faculty in the world,” boasts a woman introduced in one promotional video as the head of a law school. “Come be a part of Newford University to soar the sky of excellence.”

Yet on closer examination, this picture shimmers like a mirage. The news reports are fabricated. The professors are paid actors. The university campuses exist only as stock photos on computer servers. The degrees have no true accreditation.

In fact, very little in this virtual academic realm, appearing to span at least 370 websites, is real — except for the tens of millions of dollars in estimated revenue it gleans each year from many thousands of people around the world, all paid to a secretive Pakistani software company.

More here.

Study may explain mysterious cancer–day care connection

Warren Cornwall in Science:

For years, scientists have noticed an interesting pattern of cancer among children. Those who went to day care early in life were less likely to later develop the most common childhood cancer: acute lymphoblastic leukemia (ALL). Now, a 7-year study appears to have unraveled the molecular mechanism driving ALL. The work may explain why early exposure to infections in places such as day cares seems to protect against the disease and why unrelated vaccines help guard against this cancer. For Mel Greaves, a cancer cell biologist at the University of London’s Institute of Cancer Research, the finding provides an explanation for the hypothesis he has long promoted: that when infants in modern societies are sheltered from routine infections, their immune systems are more likely to overreact during later infections, paving the way for ALL. “I see it as the missing link,” he says of the new research.

Most childhood ALL involves a malfunction of B cells, the scouts of the immune system that patrol the bloodstream looking for intruders like viruses and bacteria; they make antibodies that help fight infections. But with leukemia, the immune system goes haywire, churning out flawed, immature B cells at a prodigious rate and crowding out healthy blood cells. Normal B cells are a marvel of adaptability. As they mature, they reprogram their own DNA, enabling the immune system to produce millions of different B cells programmed to recognize the vast range of potential infections. The DNA rearrangement relies on a sequence of enzymes. First, proteins known as RAGs cut and paste whole chunks of DNA. After that, another enzyme, AID, goes to work “fine-tuning” the DNA by altering single nucleotides. But Greaves and colleagues suspected this process could go awry, introducing mutations that create flawed B cells that could cause leukemia. In a series of experiments, they found evidence that much of the problem lay with a breakdown in the orderly sequence of gene editing during infections. Rather than the RAGs doing their business and then stepping aside for the AID, the AID kicked in simultaneously, potentially increasing the risk of gene-editing errors. These tantalizing results came to a head in an experiment on mice with a genetic abnormality linked to childhood ALL. The condition, in which two genes associated with blood formation are fused together, is found in the cord blood of 1% of all newborns. But most children with it never go on to develop full-blown ALL. The researchers wondered if unregulated mutations set off by repeated infections later in childhood could make the difference, triggering the leukemia.

More here.

Wednesday Poem

Adelle Explains Urgency to the Judge
.

A few hours before the wedding I started to draw again. I had never taken my sketches seriously. But these new pictures showed a mastery I never thought possible before. In those few hours, I gained a sense of myself. I locked the door to the bridal suite and sketched everything I could: the windows, an armoire, my violet nightgown hanging from a hanger. That was when I heard a knock, followed by shouts and threats. It was my mother with the white dress.
.

by Kristina Marie Darling
from Amethyst Arsenic, 4.1

Women’s Work: the legacy of the 1970s women’s movement

Vivian Gornick in Book Forum:

FORTY YEARS AGO, when the second wave of the American feminist movement was young, and its signature phrase, “the personal is political,” was electrifying, many of the movement’s radicals (this reviewer among them) went to war with the age-old conviction that marriage and motherhood were the deepest necessities of every woman’s life. If we looked honestly at what many of us really wanted, as we were doing in the 1970s and ’80s, it was not marriage and motherhood at all; it was rather the freedom to discover for ourselves the lives we might actually want to pursue. In our pain and anger at having been denied that freedom, we often turned recklessly on these conventional wisdoms. Marriage was rape, we cried, motherhood slavery. No equality in love? We’ll do without! What we didn’t understand—and this for years on end—was that between the ardor of our revolutionary rhetoric and the dictates of flesh-and-blood reality lay a no-man’s-land of untested pronouncements. How easy it was for us to declare ourselves “liberated,” how chastening to experience the force of contradictory feeling that undermined these defiant simplicities. As we moved inexorably toward the moment when we were bound to see that we were throwing the baby out with the bathwater, nearly every one of us became a walking embodiment of the gap between theory and practice: the place in which we were to find ourselves time and again.

KATE BOLICK is a forty-two-year-old journalist who, since childhood, has harbored a fantasy of living alone and becoming what she calls a “real” writer, but, like many women of her generation, she has found it nearly impossible to pursue that dream. In a memoir, Spinster, she traces the problem to its origins. “Whom to marry, and when it will happen” are the book’s opening words. “These two questions define every woman’s existence, regardless of where she was raised or what religion she does or doesn’t practice. She may grow up to love women instead of men, or to decide she simply doesn’t believe in marriage. No matter. These dual contingencies govern her until they’re answered, even if the answers are nobody and never.”

More here.

A Conversation with An-My Lê

Over at The Brooklyn Rail:

Sara Christoph (Rail): The current blockbuster American Sniper, which deals with the same subject matter as your own work, might be a good place for us to begin. The success of these types of movies fascinates me, though it is not surprising, given the way the films tend to mythologize the soldier’s experience in a one-dimensional way. As someone who has spent years carefully parsing the nuances of what it means to live through or participate in a war, what was your reaction to the film?

An-My Lê: You know, I rarely have time to go to the movies, but I did see American Sniper. I also saw Rory Kennedy’s Last Days in Vietnam. I should have seen it months ago. I think I had P.T.S.D. afterwards. I was very happy to see American Sniper, because I am always fascinated with this subject, but I was disappointed. It was kind of a great story—

Rail: Just the feat of his accomplishments, leaving aside the moral issues.

Lê: Yes, the feat of it. The stress, the focus, the psychology of the mission and how it affected him—all of that really interests me more than anything else. But you’re right, it is very one-dimensional. Some filmmakers, like Kathryn Bigelow and her film Zero Dark Thirty, are interested in portraying something that is three-dimensional. She’s an artist, and hers is a fictional account. And there is something about working within that fiction that allows for a satisfying and challenging description. I don’t think Clint Eastwood did that, even though he can be a great filmmaker. I’m not sure why. Perhaps he got so caught up in wanting to pay tribute to Chris Kyle as a veteran. And of course that is important. It is a responsibility.

Rail: Specifically because of the way Kyle’s story ends, being killed by a fellow vet. There’s an added responsibility to an individual’s legacy.

Lê: The topic of the military raises questions in ways that other topics would not. There are photographers who have dealt with extreme poverty, or who have photographed horrific labor conditions, and they are not held accountable in the same way. They aren’t asked: what do you think of poverty? But the question of the military is so complicated that it riles up people’s opinions. And when your work is about the military, people want to know: are you for or are you against it? Maybe American Sniper was too caught up in having a straightforward message.

More here.

What Reading Wordsworth Teaches Us About Poverty

Jamison Kantor in The Brooklyn Quarterly (Photo: Wikimedia Commons):

How does one get from British Romantic writers to Paul Ryan? The answer may lie in the language that each of them used.

Before turning to the emotions that are associated with poetic language, let’s look briefly at the emotional logic of the system itself. With the emergence of industrial labor in England, rural workers had to dramatically change their mindset. Now, people who had never lived under the rule of capitalism were expected to enter the industrial marketplace, endure the vicissitudes of prices—and the poor-relief to which they were connected—and reorganize their lifestyle around an administration over which they had almost no control. Swift market fluctuations did not just mean that foresight and planning were difficult. Existence under this new regime also meant a change in consciousness. In order to tolerate such insecurity, workers would have to believe in the promise of the new capitalist enterprise; that, despite the incessant variability built into their lives, rising industrial productivity would eventually bring them comforts far greater than what they had through rural work.

Ironically, the Speenhamland system may have played a role in this. “Hope,” the economic historian Karl Polanyi writes in his classic The Great Transformation (1944), “…was distilled out of the nightmare population and wage laws, and was embodied in a concept of progress so inspiring that it appeared to justify the vast and painful dislocations to come.”[3] This belief has remained a part of modern urban poverty. The endless, small decisions that the working poor have to make merely in order to survive act as a cruel stand-in for the sanctified idea of capitalist choice. In actuality, the constant pressure of evaluation and selection can fatigue people so much that their cognitive function is diminished. Recently, a group of contemporary neuroscientists from the University of Warwick has shown that poverty actually impedes cognitive function by putting this burden of choice on workers continually: which bus to take, which groceries will be least expensive, which residential utility will be most essential for living.[4] Add to these factors the inherent bustle of urban life, and poverty becomes a twofold deprivation: not only do people lack material provisions, but they also lack the time for deeper moments of contemplation. Poverty literally trades intellection for survival.

More here.

Austerity Bites: Fiscal Lessons from the British General Election

Jonathan Hopkin and Mark Blyth in Foreign Affairs:

Despite Conservative spending cuts, the United Kingdom’s deficit was reduced by only half of what the party anticipated when it took office in 2010. The nation’s economy did not start to grow until late 2013, after a panicked treasury minister, George Osborne, relaxed austerity measures. The United Kingdom’s economic problems, the Conservatives maintained, were the result of Labour’s supposed profligacy in running budget deficits during the boom years of the early 2000s, leaving the economy exposed to the financial crisis. This, they argued, made draconian spending cuts inevitable.

However, as the crisis hit in 2007, the United Kingdom had the lowest debt-to-GDP ratio in the G7, lower than when Labour had taken power a decade earlier. And if Labour was supposedly running excessive deficits, the markets remained strangely unconcerned, with market rates on British bonds running close to pre-collapse lows. This left many wondering why the British budget exploded in 2008 and what it might say about coalition rule in the United Kingdom.

These questions, however, were strangely missing from discussion during the election. Cameron did not discuss why the United Kingdom’s outsized and overleveraged financial sector made the nation suffer disproportionately from the worst financial crisis since the 1930s. Financial deregulation and the unsustainable growth in private, not public, credit fatally exposed the United Kingdom’s banks to the United States’ subprime credit crisis. The collapse in credit growth in 2007–09 hurt the United Kingdom’s budget not because the Labour government was too deep in debt but because the national economy was more dependent on financial activity than elsewhere. Just prior to the crisis, in 2007, the British Exchequer was taking nearly 25 percent of total tax revenue out of the financial sector, which made up a mere ten percent of the economy. With the financial crisis, these revenues plummeted, leaving the government short of cash and needing to borrow heavily.

More here.

America and horses

Nell Boeschenstein at The Morning News:

For as long as humans and horses have coexisted, humans have looked at horses and seen in them that ineffable quality we associate with the things we have lumped into a broad box labeled “Beautiful Things,” along with mountain ranges, the night sky, flowers in bloom, and the female form. The human-horse relationship is so much more intimate than the human-cow or human-chicken or human-sheep relationship. These are animals we ride, and when riding them we have been both mistaken for and mythologized as one and the same being. I don’t need to belabor how little man might have accomplished and how much more slowly he would have done it without the horse. The way we feel about the horse is more like how we feel about dogs than how we feel about stock animals.

Yet as urban dog and cat ownership skyrocketed in the United States between 1920 and 1940, so did meat-processing plants supplying demand for pet food with horsemeat. About 200 such meat plants opened during those decades, even as horses were publicly beloved on a scale we can hardly imagine from today’s perspective. There were the YA novels, yes. There was also Seabiscuit running across headlines, and between 1930 and 1948 Gallant Fox, Omaha, War Admiral, Whirlaway, Count Fleet, Assault, and Citation won the Triple Crown in an impressive string, a feat only one American Thoroughbred had managed before. Only three have managed it since; the last, Affirmed, in 1978.

more here.

on Jack Smith’s ‘Hamlet in the Rented World’

J. Hoberman at Artforum:

GIVEN THAT JACK SMITH never actually completed another movie after Flaming Creatures (1963), that most of his theater pieces concern the impossibility of their coming into existence, and that many all-but-identical drafts of the same scripts were found among his papers, it’s hardly surprising that he should have been fascinated by the most famously indecisive character in world literature.

Hamlet in the Rented World (A Fragment) is a twenty-seven-minute assemblage put together by Jerry Tartaglia on behalf of the Gladstone Gallery in New York from materials discovered in the Jack Smith Archives, including five quarter-inch audio reels and four rolls of 16-mm film (two of them untouched camera originals), all dating from the early 1970s. Guided by Smith’s scripts, Tartaglia’s reconstruction may be considered the artist’s last, posthumous word. (Hundreds of slides, the material for scores of the slide shows Smith presented during the ’70s and ’80s, remain—but we won’t go there.) Tartaglia, an avant-garde filmmaker whose deep involvement with Smith’s movies began when he discovered Flaming Creatures’ camera original in a laboratory discard bin in 1978 and who has labored over restorations of all Smith’s other film projects, knows this material better than anyone on earth.

more here.

Dennis Cooper’s Haunted HTML Novel

Paige K. Bradley at Bookforum:

Dennis Cooper's latest book, Zac’s Haunted House, was released online in mid-January by the Paris-based small press and label Kiddiepunk. Dubbed an “html novel” and offered as a free download, it consists of seven html files, each of which expands into a long, vertical scroll of animated gifs. You could call Zac’s Haunted House many things: net art, a glorified Tumblr, a visual novel, a mood board, or a dark night of the Internet's soul. It has just a few words—the chapter titles and a few subtitles embedded in some of the gifs—but it still very clearly belongs to Cooper’s own haunted oeuvre, capable of evoking powerful and gnarled emotions. Although it is something of an about-face from his last novel, The Marbled Swarm—with that book’s intentionally contrived, digressive language—Zac’s Haunted House still displays Cooper’s obsessive attention to form and style. It also features his by now nearly classical imagery and interests: The vulnerable young male body juxtaposed with death and failure; charged use of subcultural vernacular; and confused bodies, to say nothing of identities, fumbling through sex and subterfuge. Cooper has always written characters whose ineloquence hints at experiences that defy language; now, telling a story almost exclusively in images, he pushes this inarticulateness in a new direction. The result is surprisingly eloquent, and accurately speaks to our experience of the present, online and IRL.

more here.

Tuesday Poem

I thought I was following a track of freedom
and for awhile it was
Adrienne Rich

Rivers/Roads

Consider the earnestness of pavement
its dark elegant sheen after rain,
its insistence on leading you somewhere

A highway wants to own the landscape,
it sections prairie into neat squares
swallows mile after mile of countryside
to connect the dots of cities and towns,
to make sense of things

A river is less opinionated
less predictable
it never argues with gravity
its history is a series of delicate negotiations with
time and geography

Wet your feet all you want
Heraclitus says,
it's never the river you remember;
a road repeats itself incessantly
obsessed with its own small truth,
it wants you to believe in something particular

The destination you have in mind when you set out
is nowhere you have ever been;
where you arrive finally depends on
how you get there,
by river or by road
.

by Michael Crummey
from Arguments With Gravity
Kingston, Ont.: Quarry Press, 1996.

The Trouble With Scientists

Philip Ball in Nautilus:

Sometimes it seems surprising that science functions at all. In 2005, medical science was shaken by a paper with the provocative title “Why most published research findings are false.”1 Written by John Ioannidis, a professor of medicine at Stanford University, it didn’t actually show that any particular result was wrong. Instead, it showed that the statistics of reported positive findings were not consistent with how often one should expect to find them. As Ioannidis concluded more recently, “many published research findings are false or exaggerated, and an estimated 85 percent of research resources are wasted.”2 It’s likely that some researchers are consciously cherry-picking data to get their work published. And some of the problems surely lie with journal publication policies. But the problems of false findings often begin with researchers unwittingly fooling themselves: they fall prey to cognitive biases, common modes of thinking that lure us toward wrong but convenient or attractive conclusions. “Seeing the reproducibility rates in psychology and other empirical science, we can safely say that something is not working out the way it should,” says Susann Fiedler, a behavioral economist at the Max Planck Institute for Research on Collective Goods in Bonn, Germany. “Cognitive biases might be one reason for that.” Psychologist Brian Nosek of the University of Virginia says that the most common and problematic bias in science is “motivated reasoning”: We interpret observations to fit a particular idea. Psychologists have shown that “most of our reasoning is in fact rationalization,” he says. In other words, we have already made the decision about what to do or to think, and our “explanation” of our reasoning is really a justification for doing what we wanted to do—or to believe—anyway. Science is of course meant to be more objective and skeptical than everyday thought—but how much is it, really?
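
A rough sense of the base-rate arithmetic behind Ioannidis's claim may help: if only a small fraction of the hypotheses being tested are true, even well-powered studies will produce a surprising share of false positives among "significant" results. Here is a toy calculation in Python, with illustrative numbers of our own choosing rather than figures from the paper:

```python
# Toy version of the base-rate arithmetic behind "most published research
# findings are false". All numbers below are illustrative assumptions.

def positive_predictive_value(prior: float, power: float, alpha: float) -> float:
    """Probability that a statistically significant finding is actually true."""
    true_positives = power * prior          # true hypotheses correctly detected
    false_positives = alpha * (1 - prior)   # false hypotheses passing the test
    return true_positives / (true_positives + false_positives)

# Suppose 10% of tested hypotheses are true, studies have 80% power,
# and significance is declared at the conventional alpha of 0.05:
ppv = positive_predictive_value(prior=0.10, power=0.80, alpha=0.05)
print(f"Chance a 'significant' finding is real: {ppv:.0%}")  # about 64%
```

Lower power, or any bias toward testing and publishing attractive hypotheses, drags that number down further, which is the gap between reported and expected positives that the paper pointed to.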

Whereas the falsification model of the scientific method championed by philosopher Karl Popper posits that the scientist looks for ways to test and falsify her theories—to ask “How am I wrong?”—Nosek says that scientists usually ask instead “How am I right?” (or equally, to ask “How are you wrong?”). When facts come up that suggest we might, in fact, not be right after all, we are inclined to dismiss them as irrelevant, if not indeed mistaken. The now infamous “cold fusion” episode in the late 1980s, instigated by the electrochemists Martin Fleischmann and Stanley Pons, was full of such ad hoc brush-offs. For example, when it was pointed out to Fleischmann and Pons that their energy spectrum of the gamma rays from their claimed fusion reaction had its spike at the wrong energy, they simply moved it, muttering something ambiguous about calibration.

More here.

Human Ingenuity Takes On Cancer’s Darwinian Ways

George Johnson in The New York Times:

DarwinThe powerful algorithm that has populated the earth with 10 million species, each occupying a different ecological niche, is an example of what computer scientists call “random generate and test.” Start with the DNA alphabet, then blindly shuffle the letters to produce a kaleidoscope of living forms. The fittest, selected by the demands of the environment, will multiply and fill their habitats. The Darwinian principle is also at work inside the body, though in very different ways. Through random variation and selection, the immune system spins out the endless diversity of antibodies that it uses to stop microscopic invaders. But cancer also thrives through this evolutionary imperative as, mutation by mutation, a normal human cell transforms into a deadly tumor, which becomes fitter and fitter at the expense of its host. Among the advantages it evolves is the ability to outwit our immunological defenses.
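
To see the "random generate and test" loop in miniature, here is a toy Python sketch of our own (not from Johnson's article): candidate "genomes" over the DNA alphabet are blindly mutated, and the environment, here just a fixed target string, selects the fittest each generation.

```python
import random

# Toy "random generate and test": blind mutation over the DNA alphabet,
# with selection by a fixed "environment" (a target string). Illustrative only.
ALPHABET = "ACGT"
TARGET = "GATTACAGATTACA"

def fitness(genome: str) -> int:
    # Fitness = number of positions matching the target.
    return sum(a == b for a, b in zip(genome, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    # Each letter is blindly replaced with probability `rate`.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def evolve(offspring_per_generation: int = 100) -> int:
    genome = "".join(random.choice(ALPHABET) for _ in TARGET)  # random start
    generations = 0
    while fitness(genome) < len(TARGET):
        brood = [mutate(genome) for _ in range(offspring_per_generation)]
        genome = max(brood + [genome], key=fitness)  # selection keeps the fittest
        generations += 1
    return generations

print(f"Fittest genome fixed after {evolve()} generations")
```

The same loop, with the fixed target swapped for whatever a tumor's microenvironment happens to reward, is the evolutionary imperative the article describes cancer exploiting.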

One of the most encouraging developments in medical research has been the effort to help the immune system fight back, beating cancer at its own evolutionary game. That was a dominant theme last month at the annual meeting of the American Association for Cancer Research in Philadelphia as scientists discussed recent successes in immunotherapy while considering how far the field still has to go. Why have these treatments been working so well with some cancers but not others? And why, even in the best cases, do not all patients respond? The realization that Darwinian forces, for good and bad, are at work inside us can be traced to the early 1950s, when Frank Macfarlane Burnet, an Australian virologist, was pondering how we manage to fight off a potentially infinite variety of invading microbes, tailoring an antibody against each one. One possibility was that when an interloper is identified, by its molecular bumps and grooves, the immune system systematically engineers an appropriate weapon. Nature doesn’t work in such a methodical manner, and Burnet suggested a messier, more intuitive explanation: the clonal selection theory of immunity.

More here.

Bad Women (A Retro View)

by Lisa Lieberman

Frigid women. Manipulative wives. Bad mothers. Dumb blondes. Alcoholism. Failing marriages. Furtive sex. Before Mad Men revived these retro conventions and somehow made them hip, they were just tawdry. The poster for BUtterfield 8 (1960) shows Liz Taylor in a slip, highball in one hand, a mink coat hanging off her shoulder. “The most desirable woman in town and the easiest to find. Just call BUtterfield 8.” (In the more risqué version, she's standing by a pink telephone wearing nothing but a sheet.)

In real life, Liz had just wrecked Eddie Fisher's marriage. He plays her friend Steve in this picture, long-suffering in an older-brotherly way, a real prince. He left Debbie Reynolds for Liz, but she's the one doing penance here. Liz's character, Gloria, is angry, manipulative, and a nymphomaniac: the dark side of 1950s womanhood, as perceived by 1950s men. Nobody would ever mistake her for a nice girl.

The married guy she's cheating with, Liggett, is married to a nice girl, Emily. She's long-suffering too. She knows her husband is lying to her, that he drinks too much and knocks her around, but she blames herself for tempting him with a job in Daddy's company when she should have let him stand on his own two feet. Actually, it's not all Emily's fault. Emily's mother played a part in emasculating Liggett. They blamed mothers for everything in the 1950s and, let me tell you, Gloria's mother's got a lot to answer for too.

Poor Gloria. Behind her back, the men who buy her drinks and expensive trinkets (less crass than paying money for her “services”) make jokes about how they ought to rent out Yankee Stadium, the only place big enough to hold all her ex-conquests. Poor Liz. She may have won the Oscar for her role, but it wasn't worth the humiliation.

It wasn't only Liz, though. “Prepare to be shocked,” promised the trailer to A Summer Place, “because this bold, outspoken drama is the kind of motion picture excitement demanded by audiences today.” Really? I can't imagine what audiences in 1959 found shocking about this picture. As an exposé of sexual hypocrisy, it's pretty tame. Yes, there's an extramarital affair, but the betrayed spouses are so unsympathetic you're cheering the adulterous couple on. There's a pair of teenaged lovers having sex too, but Molly (Sandra Dee) and Johnny (Troy Donahue) are driven into one another's arms by the screwed-up adults in their lives. Knowing the mess that both Dee and Donahue made of their own lives, it's tempting to read more into this picture. When Johnny's alcoholic father calls Molly “a succulent little wench,” we're obviously meant to feel, with Johnny, that this accusation is unjust, but he only disputes the “wench” part. Dee is indeed succulent, her surface innocence barely concealing her sexual readiness. Toward the end of her life, the actress revealed that she had been raped repeatedly by her step-father as a child. The way she was presented in A Summer Place, it's all there. Poor Sandra.

Read more »