Friday, May 22, 2015
Matt Jakubowski in Truce:
Before we discuss your work at Harper’s and The Nation, I’d like to ask about the early years of your career. Were there specific experiences that drew you toward a life in letters, as they say? What convinced you that this was the kind of work you wanted to pursue when you were first starting out?
I had a pretty happy childhood that was clearly divided into Life and Books, the latter being as vivid and immersive for me as the former. My mother is a huge reader and took me to the library every week when I was little; at a certain point she decided to have the bus drop me and my sister off at the local branch after school, because libraries are not just repositories of knowledge but also some of the only places you can stick a latchkey kid without people calling the police.
There were no restrictions on what I could read. My upbringing in a hippied-out racially integrated neighborhood in Philadelphia wasn’t very structured, and I was terrible at sports. I liked to play with the neighborhood kids, dress up, and produce ridiculous plays with my sister. My mother worked a lot but she took us to museums and festivals and children’s concerts on weekends, so I was actively engaged with the world outside of home and school and extremely curious about it.
By the time I finished high school I was pretty done with “being taught.” I went to college to read primary sources and not textbooks. I wasn’t a specialist, and didn’t think learning specific types of methodologies was all that useful. I didn’t want to spend the rest of my life studying minutiae and writing boring papers with colons in their titles, but I did want to continue to learn.
After college most of my friends went to grad school and I went on an adventure, traveling and working abroad. I was a cook and housekeeper in Mallorca and a newspaper editor in Hanoi.
Emily Greenhouse in The Nation:
When Stephen Gaskin passed away last July, his local paper eulogized him as a “tie-dye-clad hippie philosopher, a proud ‘freethinker’” with “crystalline blue eyes.” Those of my generation who are familiar with Gaskin know him as the founder of the Farm, the 44-year-old intentional community in Summertown, Tennessee, where Gaskin’s wife, Ina May, started a movement of authentic midwifery and female body-empowerment. The Farm has 180 residents today—in the early 1970s, between 200 and 300 people traveled to Summertown in a caravan of painted school buses to create it—and maintains a focus on green community. Beyond its Ecovillage Training Center, the collective’s furthest-reaching project is a “woman-centered” approach to childbirth. Last year, a doula in Santa Cruz who runs the blog Yogini Momma posted a TEDx Talk by Ina May and praised her as midwifery’s “grandmother guru.”
I e-mailed the news of Gaskin’s death to a friend from college, a professional nurse-midwife. She replied, “When I was training at the Farm it was fascinating to see how everyone treated him with such deference.” Gaskin, the commune’s patriarch and source of “spiritual revelation,” had been in a flexible group marriage when both he and a partner began to be sexually involved with Ina May, who was still married to her first husband. Gaskin would later institutionalize monogamy on the Farm. “We think of Ina May as such a powerhouse, but really Stephen was the cult leader!” my friend noted. “When we would eat dinner he would always be served first.”
What to make of a man whose lessons as well as beliefs, it would seem, were unabashedly feminist, but who lived a life that clashed with them? This is the question posed by Jill Lepore’s invigorating and perplexing The Secret History of Wonder Woman.
Carl Zimmer in Quanta:
In March 2011, the Tara, a 36-meter schooner, sailed from Chile to Easter Island — a three-week leg of a five-year global scientific expedition. All but one of the seven scientists aboard the ship spent much of their time on the sun-drenched deck hauling up wondrous creatures such as luminous blue jellyfish and insects known as sea-skaters, which spend their entire lives skimming the surface of the ocean far from land.
At the stern of the Tara, a shipping container was bolted to the deck, with a door and a tiny window cut through the metal walls. One of the scientists, Melissa Duhaime, spent most of the voyage inside the dark, tiny cell, where she fought off an endless bout of seasickness.
“People would come in to see what I was doing and leave pretty quickly,” Duhaime said.
Inside her cell, Duhaime sat next to a hose as wide as an outstretched hand. A pump drew water through the hose from several meters below the boat and then pushed it through a series of filters. Each filter was finer than the last, blocking smaller and smaller life forms. The setup stopped animals first, then zooplankton and algae. The last filter in the hose, with pores just 220 nanometers wide, was fine enough to block bacteria. Scrubbed of all these living things, the water finally flowed into three 30-liter vats.
To the untrained eye, these vats might seem to be full of sterile water. But they were seething with ocean life — or life-like things, at the very least. The three vats held up to 1 trillion viruses.
In Tokyo, in 1964, the 31-year-old conceptual artist Yoko Ono organized a happening in which she screened a Hollywood film and gave the audience a simple instruction: Do not look at Rock Hudson, look only at Doris Day.
Like most of the countercultural riddles that appear in Grapefruit, Ono’s book from the same year, the instruction — titled Film Script 5 — was at once facile and mischievously impossible. (Other variations on the piece include asking the audience not to look at any round objects in a film, or to see only red.) It was also, in its way, autobiographical: As one of the few women associated with New York’s avant-garde music scene and the “neo-Dada” Fluxus movement, Ono was by then used to being overshadowed by the more powerful and self-serious men around her. (“I wonder why men can get serious at all,” she mused in Grapefruit. “They have this delicate long thing hanging outside their bodies, which goes up and down by its own will.”) The year she first staged Film Script 5, she’d already extricated herself from one failed marriage and her second was unraveling. She was still two years away from meeting the man with whom she would realize her dream of a completely egalitarian partnership — to symbolize this, they both wore white during their wedding ceremony — but the rest of the world wouldn’t see it that way. They would, of course, see only the towering, superior Him — what could he have possibly seen in Her?
My book, Unity Mitford: An Enquiry into Her Life and the Frivolity of Evil, became a nine-day-wonder, I can only suppose, because it brought out into the open collaboration with Hitler and the outlines of a British Vichy regime in the event of a successful Nazi invasion. The British flatter themselves that they had united to defeat a totalitarian enemy, and this was Our Finest Hour. Here I was pointing a finger at people whose beliefs and activities undermined this cozy national myth. I was to hear that I was “a traitor to my class,” a charge which concedes that England really did have its Quislings and Vichyites in waiting. Intending to analyze the social significance of my book, Bernard Levin, then the leading columnist on The Times, interviewed me over a period of several days. However many drafts he wrote, he finally told me, he couldn’t make sense of the storm, and gave up on the idea. It was left to Rebecca West to say what had to be said. She had known Meidling before the war and could remember seeing me there when I was a few days old. She had also studied the subject of treason. In a review she likened the moral atmosphere of my book to that of a burnt-out fairground.
Kahlo’s gardening was of a piece with her art, in asserting a nationalist mythos that extended even to her menagerie of pets: monkeys, parrots, turkeys, an eagle, and a pack of dogs that included Mexican hairless Xoloitzcuintles. What Rivera did on a monumental public scale, in murals picturing Mexico’s storied past and hoped-for future, Kahlo performed—and lived—privately. Even some of the nonnative plants in her garden told apposite stories. Calla lilies came to Mexico with slaves from Africa, and Chinese chrysanthemums arrived aboard Spanish galleons. By today’s gardening standards, not much of the show’s flora is particularly exotic. Even less is what you could call understated. Like everything else about Kahlo, her horticulture commands attention and rewards it with jolts of vicarious, insatiable ardor, if you open your eyes, mind, and heart to her.
Kahlo today inhabits international culture at variable points on a sliding scale between sainthood and a brand. The Botanical Garden show, besides being beautiful, can seem either reverential or exploitative. It’s really both, to a degree beyond the institution’s previous star-powered exhibitions devoted to the gardens of Charles Darwin, Claude Monet, and Emily Dickinson.
Each night we bought red wine from a small supermarket
Not too far from the Seine, where an overweight deaf teller
Smiled whenever we walked in. At the counter he read our lips
As we bought the cheapest wine we could find – never any change
As each time we paid, we paid the exact amount in coins you
Counted, one by one, into his open palm: six francs seventy-five.
Late in the evening you’d count up another six seventy-five
And we’d walk through the narrow streets back to the supermarket –
Fumbling through rich Parisians on their way to dinner; and you,
Who loved the city for our anonymity, became fond of the young teller
Who seemed alone and estranged and liked us too for the change
We brought to his long nights, when he read our hearts and lips.
Remember, when we figured out what he asked behind his mute lips,
“Why come twice, why not save yourself the walk and buy four or five
Bottles in the early evening?” We laughed, as nothing would change
The way we bought or the walks we took, hand in hand, to the supermarket.
The following evening, as we paid, we looked into the eyes of the deaf teller
And said, “It’s our habit” and left it at that; and he smiled, more so at you.
From that night on – every night, this game with him and you;
He’d lift his finger and wait for the silent words to form on our lips
And we’d say, “it’s our habit”; and he’d laugh – the deaf teller –
As we played our game, and all we needed was six francs seventy-five
On those evenings near the banks of the Seine, in that small supermarket –
Always paying the exact amount, never receiving any change.
Then you left and went away, and so heartfelt was the change –
Each night I cried, and it’s safe to say that he too sorely missed you.
In the evenings I still walk the narrow streets to the supermarket –
Remembering our walks in expensive coats, the jokes and your pale lips,
The way you kept the coins in a velvet pouch – the six seventy-five
That you’d always count into the soft, open palm of the deaf teller.
The night before I went away, I looked into the eyes of the deaf teller
And told him I was leaving the next day, his round face changed,
Something sad swelled in his young eyes as I placed the six seventy-five
Into his palm; he then signed to the sky, asking if I was on my way to you –
But no words this time, I could say nothing, no words of you from my lips.
I packed the bottles of wine and slowly began to exit the supermarket.
The deaf teller ran to me, tapped me on the shoulder as I thought of you,
With no change to his eyes, he shook my hand and silently said with his lips,
“It’s our habit, and exactly six seventy-five”. I smiled and left the supermarket.
from Spirit Brides
Carcanet Press Ltd., Manchester, 2006
Dana Stevens in Slate:
To live without seeing the films of the Indian director Satyajit Ray, said Akira Kurosawa in 1975, “means existing in the world without seeing the sun or the moon.” Though Ray was 11 years his junior, Kurosawa spoke of him that day in Moscow as a master. “I can never forget the excitement in my mind after seeing it,” he recalled of Ray’s debut Pather Panchali, 20 years after that film’s success at Cannes helped to usher in a new era of cinematic globalism—one that would eventually make it possible for a Japanese filmmaker to praise an Indian one in a speech being translated for a Russian audience. “It is the kind of cinema that flows with the serenity and nobility of a big river.”
In 2015—now 60 years since Pather Panchali’s release—Kurosawa’s simple words remain the best Ray criticism I’ve heard and, really, all the recommendation his films require. Pather Panchali, along with its two sequels, Aparajito (1956) and The World of Apu (1959)—the three together are known as “the Apu Trilogy,” after their main character—has just been re-released by Janus Films in a pristine 4K restoration, to be made available in a Criterion Blu-ray set later this year. (The original negatives of all three films were burned in a film-lab fire in London in 1993, making the restoration process especially difficult.) If this trilogy comes anywhere near your town—it opened earlier this month for a run at New York’s Film Forum, with plans to spread to more U.S. cities through the summer—I can’t exhort you any more strongly to see it than Kurosawa already has. Do you really want to exist in the world without ever seeing the sun or the moon?
Esther Landhuis in Scientific American:
For a long time researchers figured the body had a tidy way of dealing with immune cells that might trigger diabetes, lupus or other autoimmune diseases—it must kill off these rogue cells early in life, before the immune system matures. New research published on May 19 in Immunity challenges this age-old thinking. Instead, the body seems to keep these so-called self-reactive T cells in benign form to fight potential invaders later. That conclusion comes from a comprehensive set of immune analyses in mice and people, in which a team at Stanford University has found surprisingly large numbers of self-reactive T cells lurking in the bloodstream through adulthood. The cells are not easily activated, though, suggesting the presence of “a built-in brake,” says immunologist Mark Davis, the paper’s senior author. The findings renew debate about how the immune system manages to marshal its forces against myriad foreign invaders while leaving our own tissues alone.
The controversy emerged decades ago when researchers learned the secret to the immune system’s incredible versatility. They discovered that a special gene-shuffling process makes millions of antibodies and receptors. Their sheer number and variety allow our immune cells to recognize any conceivable pathogen, in principle. But the explanation also posed a puzzle: Those random gene rearrangements also produce T cells that could attack the body’s own tissues. As a solution, some scientists proposed that the body wipes out those self-reactive cells while the immune system is developing. Subsequent experiments by several labs supported this proposal.
Thursday, May 21, 2015
Chris Mooney in the Washington Post:
For a long time, we’ve been having a pretty confused discussion about the relationship between religious beliefs and the rejection of science — and especially its two most prominent U.S. incarnations, evolution denial and climate change denial.
At one extreme is the position that science denial is somehow deeply or fundamentally religion’s fault. But this neglects the wide diversity of views about science across faiths and denominations — and even across individuals of the same faith or denomination — not all of which are anti-climate science, or anti-evolution.
At the other extreme, meanwhile, is the view that religion has no conflict with science at all. But that can’t be right either: Though the conflict between the two may not be fundamental or necessary in all cases, it is pretty clear that the main motive for evolution denial is, indeed, a perceived conflict with faith (not to mention various aspects of human cognition that just make accepting evolution very hard for many people).
The main driver of climate science rejection, however, appears to be a free market ideology — which is tough to characterize as religious in nature. Nonetheless, it has often been observed (including by me) that evolution denial and climate science rejection often seem to overlap, at least to an extent.
“When two Englishmen meet”, wrote Samuel Johnson in 1758, “their first talk is of the weather; they are in haste to tell each other, what each must already know, that it is hot or cold, bright or cloudy, windy or calm.” It remains an insightful observation, not for what it says about the British obsession with weather – that was a truism even then – but for what it says about the value of natural knowledge. Talking about the weather in the present tense is a more or less futile undertaking, but it was as far as the science of meteorology had advanced in the millennium and a half since the appearance of Aristotle’s influential treatise, the Meteorologica, in the fourth century BC. Since then, the sky had remained an unknowable blue wilderness, populated by meteors (“any bodies in the air or sky that are of a flux and transitory nature”, according to Johnson’s Dictionary: hence “meteorology”), but as the nineteenth century dawned, things began to change. In 1802, Luke Howard gave clouds the names we still use today (cirrus, stratus, cumulus), and in 1804, Francis Beaufort devised the standardized wind-scale that now bears his name. “People were looking at the skies in new ways”, as Peter Moore observes at the outset of The Weather Experiment, his gripping account of nineteenth-century weather science, and by the middle of the century the Meteorological Department of the Board of Trade (better known today as the Met Office) was ready to issue the world’s first official weather forecast.
Walter Russell (May 19, 1871–May 19, 1963) was the progenitor of a “new world-thought” centered on light; in books such as The Electrifying Power of Man-Woman Balance, The Book of Early Whisperings, and The Dawn of a New Day in Human Relations, he foresaw “a marriage between religion and science” in which the laws of physics would be rewritten. He believed that weight “should be measured dually as temperature is,” with “an above and below zero,” and that “the sunlight we feel upon our bodies is not actual light from the sun.” (Russell’s Wikipedia entry notes gingerly that his ideology “has not been accepted by mainstream scientists.”)
In what’s ostensibly his seminal text, The Secret of Light, he outlines a philosophy rife with capitalized Nouns and portentous pseudo-erudition:
Man lives in a bewildering complex world of EFFECT of which he knows not the CAUSE. Because of its seemingly infinite multiplicity and complexity, he fails to vision the simple underlying principle of Balance in all things. He, therefore, complexes Truth until its many angles, sides and facets have lost balance with each other and with him.
This book, which manages to be even slimmer than How Fiction Works, also manages to be even better. The Nearest Thing to Life is as close as we’ll ever get to a manifesto from the British-born New Yorker critic. Contained in the book’s 134 pages is a passionate defense of criticism, a memoir of Wood’s early life and influences, and an insightful study of the meaning of fiction.
This should all be old hat by now. Every year, new books arrive promising some meditation on fiction’s quintessence, and though many of them are useful and even well written, they rarely offer truly fresh observations. All of which makes The Nearest Thing to Life that much more remarkable. Wood succeeds so well because of his knack for recognizing defining contradictions. Consider the way he unpacks the duality of fiction through the lens of religion:
The idea that anything can be thought and said inside the novel –– a garden where the great Why? hangs unpicked, gloating in the free air –– had, for me, an ironically symmetrical connection with the actual fears of official Christianity outside the novel: that without God, as Dostoyevsky put it, “everything is permitted.” Take away God, and chaos and confusion reign; people will commit all kinds of crimes, think all kinds of thoughts. You need God to keep a lid on things. This is the usual conservative Christian line. By contrast, the novel seems, commonsensically, to say: ‘Everything has always been permitted, even when God was around. God has nothing to do with it.’
Celia Walden in The Telegraph:
Women are notoriously bad at asking for bonuses. Which is why I did my homework and created – as BusinessInsider.com suggested – “a master plan”. I waited “the appropriate amount of time” (in my case, five years), made sure the big boss was in a good mood and took him out to lunch (“somewhere intimate, where there will be no interruptions”). I eschewed any usage of the word “need” (stinking, as it does, of desperation) in my pitch – which was “backed up with reports, charts and documentation of my positive performance” – and I tried to “remain respectful” as he stared slack-jawed back at me, before throwing his head back and roaring with laughter. Asking my own husband for a bonus simply for being his wife was never going to be anything less than preposterous. Yet according to the author of the forthcoming memoir Primates of Park Avenue, this is what a glittering tribe of crispy-haired Upper East Side Manhattan wives do every year – depending, of course, on how well they have managed the domestic budget, socialised, upheld a variety-filled performance in the bedroom… and succeeded in getting the kids into a ‘Big Ten’ school.
Wednesday Martin, a social researcher who has been immersing herself in the lives of “Park Lane Primates” for over a decade, explains how the “wife bonus”, as she has called it, works in practice. “It might be hammered out in a pre-nup or post-nup, and distributed on the basis of not only how well her husband’s fund had done, but her own performance — the same way their husbands were rewarded at investment banks. In turn, these bonuses were a ticket to a modicum of financial independence and participation in a social sphere where you don’t just go to lunch, you buy a $10,000 table at the benefit luncheon a friend is hosting.”
Monya Baker in Nature:
In 2006, things were looking pretty good for David Rimm, a pathologist at Yale University in New Haven, Connecticut. He had developed a test to guide effective treatment of the skin cancer melanoma, and it promised to save lives. It relied on antibodies — large, Y-shaped proteins that bind to specified biomolecules and can be used to flag their presence in a sample. Rimm had found a combination of antibodies that, when used to 'stain' tumour biopsies, produced a pattern that indicated whether the patient would need to take certain harsh drugs to prevent a relapse after surgery. He had secured more than US$2 million in funding to move the test towards the clinic. But in 2009, everything started to fall apart. When Rimm ordered a fresh set of antibodies, his team could not reproduce the original results. The antibodies were sold by the same companies as the original batches, and were supposed to be identical — but they did not yield the same staining patterns, even on the same tumours. Rimm was forced to give up his work on the melanoma antibody set. “We learned our lesson: we shouldn't have been dependent on them,” he says. “That was a very sad lab meeting.”
Antibodies are among the most commonly used tools in the biological sciences — put to work in many experiments to identify and isolate other molecules. But it is now clear that they are among the most common causes of problems, too. The batch-to-batch variability that Rimm experienced can produce dramatically differing results. Even more problematic is that antibodies often recognize extra proteins in addition to the ones they are sold to detect. This can cause projects to be abandoned, and waste time, money and samples. Many think that antibodies are a major driver of what has been deemed a 'reproducibility crisis', a growing realization that the results of many biomedical experiments cannot be reproduced and that the conclusions based on them may be unfounded. Poorly characterized antibodies probably contribute more to the problem than any other laboratory tool, says Glenn Begley, chief scientific officer at TetraLogic Pharmaceuticals in Malvern, Pennsylvania, and author of a controversial analysis showing that results in 47 of 53 landmark cancer research papers could not be reproduced.
Captain of the Lighthouse
where brother and I climb and call Land’s End. We are watchmen
overlooking a sea of hazel-acacia-green, over torrents of dust whipping about
in whirlwinds and dirt tracks that reach us as firths.
We man our lighthouse – cattle as ships. We throw warning lights whenever
they come too close to our jagged shore. The anthill, the orris-earth
lighthouse, from where we hurl stones like light in every direction.
Tafara stands on its summit speaking in sea-talk, Aye-aye me lad – a ship’s a-
coming! And hurls a rock at the cow sailing in. Her beefy hulk jolts and turns.
Aye, Captain, another ship saved! I cry and furl my fingers into an air-long
telescope – searching for more vessels in the day-night.
Now they low on the anthill, stranded in the dark. Their sonorous cries haunt
through the night. Aye, methinks, me miss my brother, Captain of the
lighthouse, set sail from land’s end into the deepest seventh sea.
from Spirit Brides
Carcanet Press Ltd., Manchester, 2006
Wednesday, May 20, 2015
Kurt Klopmeier in The Critical Flame:
“What, then, is time?” Christian philosopher St. Augustine asked. “If no one asks me, I know what it is. If I wish to explain it to him who asks, I do not know.” We say that time flies or that it drags. We have it on our hands or we are pressed for it. And although we cannot experience any time other than our own present, physicists tell us that there is nothing particularly special about the present. It seems that all times exist at once, and it is only our perception that limits our view of them. Because of this, it is very hard to understand, and even harder to convey, the idea that, according to special relativity, everything is happening all at once.
The way people experience time—that it has a direction and a flow—appears to be inaccurate, at least according to our best understanding of physics. In his explanation of special relativity, The Fabric of the Cosmos, Brian Greene explains: “There is no use crying over spilled milk, because once spilled it can never be unspilled: we never see splattered milk gather itself together, rise off the floor, and coalesce in a glass that sets itself upright on a kitchen counter.” Events happen in one direction, and one alone. Time seems to move always forward in a particular sequence that is never interrupted. However, Greene writes, “as hard as physicists have tried, no one has found any convincing evidence within the laws of physics that supports this intuitive sense that time flows. In fact, a reframing of some of Einstein’s insights from special relativity provides evidence that time does not flow … The outside perspective … in which we’re looking at the whole universe, all of space at every moment of time, is a fictitious vantage point, one that none of us will ever have.” But this view of time, and the way that authors have tried to use it, can offer enlightening insights about the world that normal sequential narratives cannot, and can shed light on the way narrative operates on our understanding.
Ron Unz in The American Conservative:
Just before the Labor Day weekend, a front page New York Times story broke the news of the largest cheating scandal in Harvard University history, in which nearly half the students taking a Government course on the role of Congress had plagiarized or otherwise illegally collaborated on their final exam. Harvard admits just 1,600 freshmen each year, yet almost 125 Harvard students now face possible suspension over this single incident. A Harvard dean described the situation as “unprecedented.”
But should we really be so surprised at this behavior among the students at America’s most prestigious academic institution? In the last generation or two, the funnel of opportunity in American society has drastically narrowed, with a greater and greater proportion of our financial, media, business, and political elites being drawn from a relatively small number of our leading universities, together with their professional schools. The rise of a Henry Ford, from farm boy mechanic to world business tycoon, seems virtually impossible today, as even America’s most successful college dropouts such as Bill Gates and Mark Zuckerberg often turn out to be extremely well-connected former Harvard students. Indeed, the early success of Facebook was largely due to the powerful imprimatur it enjoyed from its exclusive availability first only at Harvard and later restricted to just the Ivy League.
During this period, we have witnessed a huge national decline in well-paid middle class jobs in the manufacturing sector and other sources of employment for those lacking college degrees, with median American wages having been stagnant or declining for the last forty years. Meanwhile, there has been an astonishing concentration of wealth at the top, with America’s richest 1 percent now possessing nearly as much net wealth as the bottom 95 percent. This situation, sometimes described as a “winner take all society,” leaves families desperate to maximize the chances that their children will reach the winners’ circle, rather than risk failure and poverty or even merely a spot in the rapidly deteriorating middle class. And the best single means of becoming such an economic winner is to gain admission to a top university, which provides an easy ticket to the wealth of Wall Street or similar venues, whose leading firms increasingly restrict their hiring to graduates of the Ivy League or a tiny handful of other top colleges. On the other side, finance remains the favored employment choice for Harvard, Yale or Princeton students after the diplomas are handed out.
Declan Walsh in the New York Times:
Their websites, glossy and assured, offer online degrees in dozens of disciplines, like nursing and civil engineering. There are glowing endorsements on the CNN iReport website, enthusiastic video testimonials, and State Department authentication certificates bearing the signature of Secretary of State John Kerry.
“We host one of the most renowned faculty in the world,” boasts a woman introduced in one promotional video as the head of a law school. “Come be a part of Newford University to soar the sky of excellence.”
Yet on closer examination, this picture shimmers like a mirage. The news reports are fabricated. The professors are paid actors. The university campuses exist only as stock photos on computer servers. The degrees have no true accreditation.
Warren Cornwall in Science:
For years, scientists have noticed an interesting pattern of cancer among children. Those who went to day care early in life were less likely to later develop the most common childhood cancer: acute lymphoblastic leukemia (ALL). Now, a 7-year study appears to have unraveled the molecular mechanism driving ALL. The work may explain why early exposure to infections in places such as day cares seems to protect against the disease and why unrelated vaccines help guard against this cancer. For Mel Greaves, a cancer cell biologist at the University of London’s Institute of Cancer Research, the finding provides an explanation for the hypothesis he has long promoted: that when infants in modern societies are sheltered from routine infections, their immune systems are more likely to overreact during later infections, paving the way for ALL. “I see it as the missing link,” he says of the new research.
Most childhood ALL involves a malfunction of B cells, the scouts of the immune system that patrol the bloodstream looking for intruders like viruses and bacteria; they make antibodies that help fight infections. But with leukemia, the immune system goes haywire, churning out flawed, immature B cells at a prodigious rate and crowding out healthy blood cells.

Normal B cells are a marvel of adaptability. As they mature, they reprogram their own DNA, enabling the immune system to produce millions of different B cells programmed to recognize the vast range of potential infections. The DNA rearrangement relies on a sequence of enzymes. First, proteins known as RAGs cut and paste whole chunks of DNA. After that, another enzyme, AID, goes to work “fine-tuning” the DNA by altering single nucleotides.

But Greaves and colleagues suspected this process could go awry, introducing mutations that create flawed B cells that could cause leukemia. In a series of experiments, they found evidence that much of the problem lay with a breakdown in the orderly sequence of gene editing during infections. Rather than the RAGs doing their business and then stepping aside for the AID, the AID kicked in simultaneously, potentially increasing the risk of gene-editing errors.

These tantalizing results came to a head in an experiment on mice with a genetic abnormality linked to childhood ALL. The condition, in which two genes associated with blood formation are fused together, is found in the cord blood of 1% of all newborns. But most children with it never go on to develop full-blown ALL. The researchers wondered if unregulated mutations set off by repeated infections later in childhood could make the difference, triggering the leukemia.
Adelle Explains Urgency to the Judge
A few hours before the wedding I started to draw again. I had never taken my sketches seriously. But these new pictures showed a mastery I never thought possible before. In those few hours, I gained a sense of myself. I locked the door to the bridal suite and sketched everything I could: the windows, an armoire, my violet nightgown hanging from a hanger. That was when I heard a knock, followed by shouts and threats. It was my mother with the white dress.
by Kristina Marie Darling
from Amethyst Arsenic, 4.1
Vivian Gornick in Bookforum:
FORTY YEARS AGO, when the second wave of the American feminist movement was young, and its signature phrase, “the personal is political,” was electrifying, many of the movement’s radicals (this reviewer among them) went to war with the age-old conviction that marriage and motherhood were the deepest necessities of every woman’s life. If we looked honestly at what many of us really wanted, as we were doing in the 1970s and ’80s, it was not marriage and motherhood at all; it was rather the freedom to discover for ourselves the lives we might actually want to pursue. In our pain and anger at having been denied that freedom, we often turned recklessly on these conventional wisdoms. Marriage was rape, we cried, motherhood slavery. No equality in love? We’ll do without! What we didn’t understand—and this for years on end—was that between the ardor of our revolutionary rhetoric and the dictates of flesh-and-blood reality lay a no-man’s-land of untested pronouncements. How easy it was for us to declare ourselves “liberated,” how chastening to experience the force of contradictory feeling that undermined these defiant simplicities. As we moved inexorably toward the moment when we were bound to see that we were throwing the baby out with the bathwater, nearly every one of us became a walking embodiment of the gap between theory and practice: the place in which we were to find ourselves time and again.
...KATE BOLICK is a forty-two-year-old journalist who, since childhood, has harbored a fantasy of living alone and becoming what she calls a “real” writer, but, like many women of her generation, she has found it nearly impossible to pursue that dream. In a memoir, Spinster, she traces the problem to its origins. “Whom to marry, and when it will happen” are the book’s opening words. “These two questions define every woman’s existence, regardless of where she was raised or what religion she does or doesn’t practice. She may grow up to love women instead of men, or to decide she simply doesn’t believe in marriage. No matter. These dual contingencies govern her until they’re answered, even if the answers are nobody and never.”