Sunday, August 23, 2015
Helen Thomson in The Guardian:
Genetic changes stemming from the trauma suffered by Holocaust survivors are capable of being passed on to their children, the clearest sign yet that one person’s life experience can affect subsequent generations.
The conclusion from a research team at New York’s Mount Sinai hospital led by Rachel Yehuda stems from the genetic study of 32 Jewish men and women who had either been interned in a Nazi concentration camp, witnessed or experienced torture, or had had to hide during the second world war.
They also analysed the genes of their children, who are known to have increased likelihood of stress disorders, and compared the results with Jewish families who were living outside of Europe during the war. “The gene changes in the children could only be attributed to Holocaust exposure in the parents,” said Yehuda.
Her team’s work is the clearest example in humans of the transmission of trauma to a child via what is called “epigenetic inheritance” - the idea that environmental influences such as smoking, diet and stress can affect the genes of your children and possibly even grandchildren.
The idea is controversial, as scientific convention states that genes contained in DNA are the only way to transmit biological information between generations. However, our genes are modified by the environment all the time, through chemical tags that attach themselves to our DNA, switching genes on and off. Recent studies suggest that some of these tags might somehow be passed through generations, meaning our environment could have an impact on our children’s health.
.. —for my sons
when i was twenty
five .. we hiked the grass
spare trails that snake
from ocean to Swan Pond .. my
two small crab catchers & me .. we
buried pet turtles at sea
.......... beneath the crooked
footbridge .. sailed stick regattas
in the slim stream in the slow
woods .. bouncing like great
explorers of Kettle Cove & sea
slashed rocks .. listening to each other's
breath .. we trudged home sand fed
by Jim Bell
from Crossing the Bar
Slate Roof: a publishing collective
Ben Ehrlich in Nautilus:
Santiago Ramón y Cajal, a Spanish histologist and anatomist known today as the father of modern neuroscience, was also a committed psychologist who believed psychoanalysis and Freudian dream theory were “collective lies.” When Freud published The Interpretation of Dreams in 1900, the science world swooned over his theory of the unconscious. Dreams quickly became synonymous with repressed desire. Puzzling dream images could unlock buried conflicts, the psychoanalyst said, given the correct interpretation. Cajal, who won the 1906 Nobel Prize for discovering neurons and, more remarkably, intuiting the form and function of synapses, set out to prove Freud wrong. To disprove the theory that every dream is the result of a repressed desire, Cajal began keeping a dream journal and collecting the dreams of others, analyzing them with logic and rigor.
Cajal eventually deemed the project unpublishable. But before his death in 1934, he gave his research, scribbled on stained loose papers and in the margins of books and newspapers, to his good friend and former student, the psychiatrist José Germain Cebrián. Germain typed the diary into a book, which was thought lost during the 1936 Spanish Civil War. In fact, Germain carried the manuscript with him as he traveled through Europe. Before his death, he gave it to José Rallo, a Spanish psychiatrist and dream researcher. To the delight of scholars and enthusiasts, Los sueños de Santiago Ramón y Cajal was published in Spanish in 2014, containing 103 of Cajal’s dreams, recorded between 1918 and his death in 1934. Translated here into English for the first time, these dreams, and Cajal’s notes on them, offer insight into the mind of a great scientist—insight that perhaps he himself did not always have.
A Common Dream
[Falling of pants]
I attend a diplomatic soiree and as I am leaving my pants fall down (Is it desire?)
[Drowning with daughter]
I take a walk by the bay (Santander?) and I fall into the water with one of my little daughters in my arms. I fight the waves, I am almost drowning, despite touching the seawall. The nightmare awakens me.
Saturday, August 22, 2015
Lina Sergie Attar in Politico:
Our recent history tells us that the revolutions of the Arab Spring broke the walls of fear and silence, especially in Syria, where people began speaking, writing and chanting about the injustices they endured as they demanded freedom and dignity. Then the years passed, the losses mounted and the world grew more and more indifferent; it was three full years ago that President Obama pledged to intervene if the Syrian government crossed the “red line” of using chemical weapons, a promise he has broken. Some Syrians began to recede into silence, out of not only fear, and later, exhaustion, but collective trauma. In many ways, the realities Syrians faced had become simply inexpressible.
Now, the everyday violence and death Syrians witness is no longer recorded in full force unless events surpass the daily “acceptable” quota of death—as happened on August 16 in Douma, when more than 100 people were killed by a regime aerial attack on a crowded marketplace. These kinds of mass tragedies, like the chemical weapons attack in 2013 and the Daraya massacre in 2012, capture the world’s attention—headlines, outrage, condemnation—for a few moments before Syria’s suffering once again fades to white noise. When the country has been reduced to smoldering ashes and its people have been forced into a mass exodus to new countries and new homes, our capacity to document—to speak or write and chant—dwindles. History collapses into a simple etcetera.
More here. [Thanks to Idrees Ahmad.]
Tom Stafford at the BBC:
It is perhaps the most famous experiment in neuroscience. In 1983, Benjamin Libet sparked controversy with his demonstration that our sense of free will may be an illusion, a controversy that has only increased ever since.
Libet’s experiment has three vital components: a choice, a measure of brain activity and a clock.
The choice is to move either your left or right arm. In the original version of the experiment this is by flicking your wrist; in some versions of the experiment it is to raise your left or right finger. Libet’s participants were instructed to “let the urge [to move] appear on its own at any time without any pre-planning or concentration on when to act”. The precise time at which you move is recorded from the muscles of your arm.
The measure of brain activity is taken via electrodes on the scalp. When the electrodes are placed over the motor cortex (roughly along the middle of the head), a different electrical signal appears between right and left as you plan and execute a movement on either the left or right.
The clock is specially designed to allow participants to discern sub-second changes. It has a single dot, which travels around the face of the clock every 2.56 seconds, so that by reporting the dot’s position you are reporting a time. If we assume you can report position accurately to within 5 degrees, you can use this clock to report time to within about 36 milliseconds – that’s 36 thousandths of a second.
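The clock’s resolution follows directly from that arithmetic; a minimal sketch (the variable names are illustrative, not taken from Libet’s paper):

```python
# Libet clock: one full 360-degree sweep of the dot takes 2.56 seconds,
# so each degree of dot position corresponds to a fixed slice of time.
revolution_s = 2.56                       # seconds per full revolution
ms_per_degree = revolution_s * 1000 / 360 # milliseconds per degree of sweep

# Assumed positional accuracy of a participant's report, per the text.
report_accuracy_deg = 5
resolution_ms = ms_per_degree * report_accuracy_deg

print(round(ms_per_degree, 2))  # → 7.11 ms per degree
print(round(resolution_ms))     # → 36 ms, the figure quoted in the text
```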
Putting these ingredients together, Libet took one extra vital measurement. He asked participants to report, using the clock, exactly the point when they made the decision to move.
Shahnaz Habib in The Guardian:
My father, who lives in India, loathes travel. He will tell you this himself. When he hears about other people’s road trips, he shakes his head, wishing they had more common sense. The greatest pleasure, for him, is to be at home, reading the news and eating rice and coconut chammanthi. Ideally, the coconut should be from his own village in southern Kerala.
Alas, all his children live abroad. My siblings live in the United Arab Emirates and I’m in New York. Every few months, my brother will send my parents a non-refundable round-trip ticket and my father’s reluctance to travel will battle with his parsimony. Eventually, he will climb on the flight, bundled up thoroughly against air-conditioning, which he hates almost as much as travel. Once he arrives at my brother’s house in Sharjah, he ventures out as little as possible. He knows what he likes: reading news. Why bother doing anything else?
Outside of these Sharjah exiles, my father has made two epic trips. As soon as my parents could afford it, they went on the Hajj pilgrimage to Mecca. The other journey was a visit to New York. When my daughter was a few months old my parents and sister arrived to take over my household, cook four multi-course south Indian meals every day, and sing endless lullabies. “You must be exhausted,” I said when they arrived at the apartment after 20 hours of flying. “Of course not,” my father said and fell asleep on the couch.
I knew that between my adventure-averse father and my infant, we would be home a lot. But I also wanted to show off my city. I surprised my parents with a helicopter tour over Manhattan. My mother got off the chopper with windswept hair and shining eyes. “Just wonderful. Everyone should do this,” she declared. My father shook his head and said, “eminently avoidable.”
“Notes on the Death of Culture: Essays on Spectacle and Society” is a new nonfiction diatribe by Mario Vargas Llosa, or (should I say) by the Spanish-language Peruvian novelist, lapsed Catholic, last living public face of the Latin American “boom” and 2010 Nobel laureate in literature Mario Vargas Llosa, the author of over two dozen previous books. The subject of this one is “our” lack: of common culture, or common context, common sets of referents and allusions, and a common understanding of who or what that pronoun “our” might refer to anymore, now that even papers of record have capitulated to individually curated channels and algorithmicized feeds. “Notes” begins with a survey of the literature of cultural decline, focusing on Eliot’s “Notes Toward the Definition of Culture,” before degenerating into a series of squibs — on Islam, the Internet, the pre-eminence of sex over eroticism and the spread of the yellow press — most of which began as columns in the Spanish newspaper El País. All of which is to say that Vargas Llosa’s cranky, hasty manifesto is made of the very stuff it criticizes: journalism.
Vargas Llosa’s opening essay reduces its Eliotic ur-text to its crassest points, but my own version here must be crasser: After all, I have six browser tabs open, and my phone has been beeping all day. Eliot defines culture as existing in, and through, three different spheres: that of the individual, the group or class, and the entire rest of society.
You occasionally think living in Pakistan is an advantage. Since so much is obviously unsayable, you have developed a heightened sensitivity to the ways in which power operates on speech, not just there but everywhere. It is like living in a desiccated nook on the cliff wall of some dry, desert valley. Looking out from your nook you can see the forces of erosion at work. Erosion reshapes everything. One day soon, though hopefully not very soon, your nook, too, will be gone.
You see from your nook that humanity is afflicted by a great mass murderer about whom we are encouraged not to speak. The name of that murderer is Death. Death comes for everyone. Sometimes Death will pick out a newborn still wet from her aquatic life in her mother’s womb. Sometimes Death will pick out a man with the muscles of a superhero, pick him out in repose, perhaps, or in his moment of maximum exertion, when his thighs and shoulders are trembling and he feels most alive. Sometimes Death will pick singly. Sometimes Death will pick by the planeload. Sometimes Death picks the young, sometimes the old, and sometimes Death has an appetite for the in-between.
Before Modiano won the Nobel Prize, this most singular writer, noted for his elliptical plots and regretful tone of voice, had barely caused a ripple in the English-speaking world. Only eight of his 30 novels had been translated into English and most of those had fallen out of print. But since the award, publishers in Britain and the US have been falling over themselves to have their own Modiano moment.
Last year, Yale University Press rushed into print Suspended Sentences, a standalone book comprising a trio of newly translated novellas — Afterimage (1993), Suspended Sentences (1988) and Flowers of Ruin (1991). This month sees the UK publication of Bloomsbury’s Occupation Trilogy, a retrospective grouping devised by his Spanish publisher that constitutes translations of Modiano’s first three novels, originally published in France between 1968 and 1972. And in September, MacLehose Press will publish the first English-language translations of Pedigree and his most recent novel So You Don’t Get Lost in the Neighbourhood, which came out in France last year; they are to be published in the US by Yale and Houghton Mifflin Harcourt. In January, MacLehose will also bring out new translations of Modiano’s 2007 novel In the Café of Lost Youth and The Black Notebook (2012).
The following remarks were delivered at The New Criterion’s gala on April 29, 2015 honoring Charles Murray with the third Edmund Burke Award for Service to Culture and Society.
We are living in a political system that has tied itself in knots. “Cleaning house” in Washington will do nothing to untie those knots. When it comes to an explanation of why government under both Democrats and Republicans has become so pathetically ineffectual across the board, even at simple tasks, a powerful underlying explanation is that American government suffers from an advanced case of institutional sclerosis.
Mancur Olson argued that there’s only one way to recover from advanced institutional sclerosis: be utterly defeated in a world war. He compares the postwar experiences of Germany and Japan with the postwar experiences of Britain and France to make his case. Germany and Japan had to start from scratch. That’s precisely why they were able to grow so much more quickly after the war than Britain and France, which won the war and were thereby encumbered by the survival of their prewar institutions—and their prewar sclerosis. How did the United States government avoid institutional sclerosis through almost two centuries of its existence? The answer is simple: the founders set up a system that by its nature prevents institutional sclerosis from getting out of hand. The enumerated powers restricted the number of favors within the power of government to sell. Sclerosis is impossible if no amount of lobbying can give Congress the power to satisfy the desires of the special interests.
And that brings me to my second reason for arguing that we cannot roll back the reach of government through the political process: the constitutional revolution that occurred from 1937 through 1943.
Colin Marshall in Open Culture:
One often hears lamented the lack of well-spoken public intellectuals in America today. Very often, the lamenters look back to James Baldwin, who in the 1950s and 1960s wrote such powerful race-, class-, and sex-examining books as Go Tell It on the Mountain, Giovanni’s Room, and The Fire Next Time, as one of the greatest figures in the field. Though Baldwin expatriated himself to France for much of his life, he seems never to have let the state of his homeland drift far from his mind, and his opinions on it continued to put a charge into the grand American debate.
Upon one return from Paris in 1957, Baldwin found himself wrapped up in the controversy around the Civil Rights Act and the related movements across the south. He wrote several high-profile essays on the subject, even ending up himself the subject of a 1963 Time magazine cover story on his views. That same year, he went on a lecture tour on race in America which put him in close contact with a variety of student movements and other protests, whose efficacy he and Malcolm X debated in the broadcast above.
Fareed Zakaria in The New York Times:
The world has been horrified but also puzzled by the rise of ISIS. How does one comprehend its brutality and success? What is its likely path? In March 2015, The Atlantic offered an answer, in an analysis by Graeme Wood that quickly became the most widely read essay in the magazine’s 158-year history. Titled “What ISIS Really Wants,” it focused on the ideology that animates the group. Understand its ideas, Wood suggested, and you will understand the phenomenon and how to fight it. Many other, more polemical explanations of jihadi terrorism today — from Bill Maher to Sam Harris — also shine a spotlight on the ideas behind the mayhem.
Most intellectuals think ideas matter. In one of his most famous and oft-quoted lines, John Maynard Keynes declared, “Practical men who believe themselves to be quite exempt from any intellectual influence are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.” Scott L. Montgomery and Daniel Chirot concur, arguing that ideas “do not merely matter; they matter immensely, as they have been the source for decisions and actions that have structured the modern world.” In “The Shape of the New: Four Big Ideas and How They Made the Modern World,” Montgomery and Chirot make the case for the importance of four powerful ideas, rooted in the European Enlightenment, that have created the world as we know it. “Invading armies can be resisted,” they quote Victor Hugo. “Invading ideas cannot be.”
When grief comes to you as a purple gorilla
you must count yourself lucky.
You must offer her what’s left
of your dinner, the book you were trying to finish
you must put aside
and make her a place to sit at the foot of your bed,
her eyes moving from the clock
to the television and back again.
I am not afraid. She has been here before
and now I can recognize her gait
as she approaches the house.
Some nights, when I know she’s coming,
I unlock the door, lie down on my back,
and count her steps
from the street to the porch.
Tonight she brings a pencil and a ream of paper,
tells me to write down
everyone I have ever known
and we separate them between the living and the dead
so she can pick each name at random.
I play her favorite Willie Nelson album
because she misses Texas
but I don’t ask why.
She hums a little,
the way my brother does when he gardens.
We sit for an hour
while she tells me how unreasonable I’ve been,
crying in the check-out line,
refusing to eat, refusing to shower,
all the smoking and all the drinking.
Eventually she puts one of her heavy
purple arms around me, leans
her head against mine,
and all of a sudden things are feeling romantic.
So I tell her,
things are feeling romantic.
She pulls another name, this time
from the dead
and turns to me in that way that parents do
so you feel embarrassed or ashamed of something.
Romantic? She says,
reading the name out loud, slowly
so I am aware of each syllable
wrapping around the bones like new muscle,
the sound of that person’s body
and how reckless it is,
how careless that his name is in one pile and not the other.
by Matthew Dickman
from American Poetry Review, 2008
Friday, August 21, 2015
Sarah Crown in The Guardian:
In 1527, a fleet of five ships set sail from Spain for the New World, on a mission to settle the recently discovered land of La Florida. After making landfall on the Gulf coast, near where the city of St Petersburg stands today, the expedition’s leader, Pánfilo de Narváez, headed into the country’s unmapped interior in search of the gold he had convinced himself would be found there. Within days his men became hopelessly lost; soon after they began to die, from starvation, disease, drowning and the depredations of local tribes. In the end, of an original contingent of 300, just four survived: three Spanish gentlemen – Álvar Núñez Cabeza de Vaca, Alonso del Castillo and Andrés Dorantes – and Estebanico, a Moorish slave.
History, it’s said, is written by the victors. While the Narváez expedition was a catastrophe of almost absurd proportions, its name used for years afterwards as a byword for disaster, these four men (who were eventually picked up in northern Mexico by a group of Spanish slavers, “strangely dressed and in the company of Indians”), were, if not victors, at least survivors. Together, they’d lived through the worst the continent could throw at them and even, ultimately, carved out a niche for themselves as healers among the indigenous Americans. Their reappearance was a triumph bordering on the miraculous, and Cabeza de Vaca’s tale of their adventures, which he published on his return to Spain, was justly celebrated. But his story also revealed that, even among survivors, some are more equal than others. While the three Castilians were given joint billing, the slave who had been with them every step of the way for the eight long years of their exile was confined to a single line of biography. “The fourth [of us],” says Cabeza de Vaca, “is Estebanico, an Arab Negro from Azemmour.” And that’s it.
When she came across Cabeza de Vaca’s chronicle nearly 500 years after it was written, Laila Lalami was first puzzled, then fascinated by the omission.
Liz Kruesi in Quanta:
Dark matter — the unseen 80 percent of the universe’s mass — doesn’t emit, absorb or reflect light. Astronomers know it exists only because it interacts with our slice of the ordinary universe through gravity. Hence the hunt for this missing mass has focused on so-called WIMPs — Weakly Interacting Massive Particles — which interact with each other as infrequently as they interact with normal matter.
Physicists have reasons to look for alternatives to WIMPs. For two decades, astronomers have found less dark matter at the centers of galaxies than WIMP models suggest they should contain. The discrepancy is even worse at the cores of the universe’s tiny dwarf galaxies, which have few ordinary stars but lots of dark matter.
About four years ago, James Bullock, a professor of physics and astronomy at the University of California, Irvine, began to wonder whether the standard view of dark matter was failing important empirical tests. “This was the point where I really started thinking hard about alternatives,” he said.
Bullock thinks that dark matter might instead be complex, something that interacts with itself strongly in the way that ordinary matter interacts with itself to form intricate structures like atoms and atomic elements. Such a self-interacting dark matter, Bullock suspects, could exist in a “dark sector,” somewhat parallel to our own light sector, but detectable only through the way it affects gravity.
He and his colleagues have created numerical simulations that predict what the universe would look like if dark matter feels strong interactions. They expected to see the model fail. Instead, they found that it was consistent with what astronomers observe.
More here. [Thanks to Sean Carroll.]
Shoaib Daniyal in Scroll:
Organised by the far-right group Hindu Samhati, the procession was a commemoration of the Great Calcutta Killings, the terrible communal riot that began exactly 69 years ago on August 16, 1946. In particular, it was feting the role of a certain Gopal Chandra Mukherjee in it. Large billboards mounted on vans proclaimed Mukherjee to be “Kolkatar Rakhakarta” (Kolkata’s protector) and prefixed the title “Hindu bir” (Hindu braveheart) before his name.
It was also connecting 1946 to 2015: people carried banners which called for an end to the “torture” of Hindus in Bengal, warned politicians to stop “appeasing” certain groups in the “greed for votes” and called for an end to “Jihadi riots”. A van carried a lurid billboard asking why Kolkata’s intellectuals were silent about the everyday killing of bloggers in Bangladesh.
On a truck, flanked by hectic activity, a man on a public address system drilled everyone about how the march would be conducted: regular slogans, march in line and be peaceful. The Mamata Banerjee government also seemed interested in the last bit: there was heavy police bandobast for the event, with scores of policemen milling around, in case things went out of hand.
Rail: One thing I noticed in your criticism of the ’70s and early ’80s: you made arguments against something that Robert Hughes had written, or some other prominent review. Before long that falls out of the writing. Was that because there were less people saying things you wanted to fight with, or—
Schjeldahl: That was ambition and antagonism. It was partly a sense of embattled vulnerability, which faded. I’m no longer the insecure kid that just ran into the room. Also I think it had to do with a trend in editorial judgment. It’s like magazines don’t like you reminding people of their competitors. I wish there was more reciprocal, name-citing argument—not name-calling, please. Critics being pissy about other critics is pathetic—as if anyone cares about our tender egos. At that time, I was antagonized by my elders, as I know I now antagonize young writers who want their turns at bat. It’s natural. I remember when Harold Rosenberg died, I felt a pang of guilt. I must have harbored a dark wish that he would.
Rail: You wanted him out of the way so that you didn’t have to deal with him?
Schjeldahl: I wanted to go toward the light and he was blocking it. But of course the big nemesis of us all was Clement Greenberg, and I’m reading him again—he’s great. An asshole on many levels and after the mid-’50s he ceased to be right about much of anything, but nobody in American history has been a more acute critic, who held himself to standards of evidence and logic that make everybody else seem like dilettantes. He had the strength and the weakness of his model, T. S. Eliot—a genius for analysis and a tic of overreaching, as the Voice of Culture. Greenberg’s Art and Culture has a hilarious title—there’s a tremendous lot about art but hardly a cogent word about culture in that entire book.
On the 12th of February, 1804, Immanuel Kant lay on his deathbed. “His eye was rigid, and his face and lips became discoloured by a cadaverous pallor.” A few days following his death, his head was shaved, and “a plaster cast was taken, not a mask merely, but a cast of the whole head, designed to enrich the craniological collection of Dr. Gall,” a local physician. The corpse of Kant was made up and dressed appropriately, and, according to some accounts, throngs of visitors came day and night. “Everybody was anxious to avail himself of the last opportunity he would have for entitling himself to say, ‘I too have seen Kant.’” Their impressions seemed to be at once reverent and grotesque. “Great was the astonishment of all people at the meagreness of Kant’s appearance; and it was universally agreed that a corpse so wasted and fleshless had never been beheld.” Accompanied by the church bells of Konigsberg, Kant’s corpse was carried from his home by torchlight, to a candle-lit cathedral, whose Gothic arches and spires were perhaps reminiscent of the philosopher’s elaborate, vaulted books.
In his book A Short History of Decay, E.M. Cioran once wrote: “I turned away from philosophy when it became impossible to discover in Kant any human weakness, any authentic accent of melancholy, in Kant and in all the philosophers.” Indeed, for many, the name of Immanuel Kant has become synonymous with a certain type of elaborate, grand, system-building philosophy that characterizes works such as The Critique of Pure Reason, first published in 1781.
The Wrights’ first aircraft, really a large kite, was made of bamboo and paper and had two wings, one over the other, with struts and crisscross wires connecting them. A system of control cords enabled its flight to be directed from the ground. Although they ended with a crash, the tests were successful, the brothers felt, and the following summer they built a full-sized glider with an eighteen-foot wingspan meant to be flown as a kite and, if that went well, to carry a man. Like any kite, this very large kite-glider needed wind to rise on, and Wilbur had written to Octave Chanute, an eminent engineer and a leading authority on aviation and gliders, asking for advice—they were looking for a location with good weather and reliable wind where they could conduct tests. Chanute suggested the coast of South Carolina or Georgia where there was also sand for soft landings. Poring through Weather Bureau records they became focused on a wide strip of land in the Outer Banks of North Carolina occupied only by fishermen, called Kitty Hawk. The winds there, they were informed, were reliably steady at ten to twenty miles an hour.
Kitty Hawk was isolated and accessible only by boat. It was seven hundred miles from Dayton, most of it by train. Wilbur went first. It was September and still extremely hot. It took him four days to find a boatman who agreed to take him across Albemarle Sound and they ran into a storm. The voyage was only forty miles but it took them two days. Kitty Hawk, Wilbur saw, was comprised of not much more than a lonely stretch a mile wide and five miles long with a single small hill. There were some houses but almost no vegetation. To the east lay the open Atlantic.
[Thanks to David Schneider.]
Tom Slater in Spiked:
Why Grow Up?, the latest book by American philosopher and essayist Susan Neiman, begins with a slyly subversive statement: ‘Being grown up is itself an ideal.’ In Britain today, this couldn’t seem further from the truth. Today, we’re told, is the worst time to be reaching adulthood. With economic strife, rising house prices, tuition fees and widespread youth unemployment weighing on Generation Y’s pasty back, coming of age merely means coming to the realisation that debt, destitution and living with mum and dad into your thirties is your inevitable inheritance. And that’s hardly an adulthood worth having. The question this book seeks to answer is why growing up seems such a grim prospect today. From the off, Neiman dispenses with the sort of neuroscientific apologism that we’ve become accustomed to in recent years. Within the current, fatalistic climate, adulthood has been defined down. The Science now says that adolescence stretches into your mid-twenties. But, as Neiman observes in her introduction, there’s nothing scientific about growing up. The lines between childhood, adolescence and adulthood are mutable, and have changed over time. Less than a century ago, childhood, as a time of pampered play and dependence, lasted barely a few years for the vast majority of the population. And when most young people were out of school and married by the end of their teens, adolescence – the rebellious grace period between Tonka trucks and 2.4 children – didn’t even exist.
Instead, Neiman presents adulthood as a process of coming to terms with the circumstances you find yourself in and then committing to changing them – reconciling the ‘is’ and the ‘ought’. She situates this in the history of Enlightenment thought, in which the doomy realism of Hume clashed with the rugged idealism of Rousseau. ‘It would take Kant’, Neiman writes, ‘to appreciate the fact that we must take both seriously – if we are ever to arrive at an adulthood we need not merely acquiesce in but actively claim as [our] own’. Kant’s concept of ‘the Unconditioned’, a point at which the world makes perfect sense, is central here. In order to develop into intellectual and moral maturity we must never lose sight of the idea of perfectible society – even as we come to recognise that the world is far from perfect. This rests, Neiman argues, on a refusal to rest in teeny cynicism, to be like Thrasymachus – the indignant yoof of Plato’s Republic who rejects Socrates’ concept of justice as a prop for the powerful. ‘He is convinced that he’s seen through everything. It takes a grown up to know that this doesn’t mean he’s seen it’, she writes.
Chorus of Cells
even being very old,
(or perhaps because of it),
I like to make my bed.
In fact, the starting of each day
is the biggest thing I ever do.
I smooth away the dreams disclosed by tangled sheets,
I smack the dented pillow’s revelations to oblivion,
I finish with the pattern of the spread exactly centered.
The night is won.
And now the day can open.
All this I like to do,
mastering the making of my bed
with hands that trust beginnings,
All this I need to do,
directed by the silent message
of the luxury of my breathing.
And every night,
I like to fold the covers back,
and get in bed,
and live the dark, wise poetry of the night’s dreaming,
dreading the extent of its probabilities,
but surrendering to the truth it knows and I do not;
even though its technicolor cruelties,
or the music of its myths,
feels like someone else’s experience,
I know that I could no more cease
to want to make my bed each morning,
and fold the covers back at night,
than I could cease
to want to put one foot before the other.
Being very old and so because of it,
all this I am compelled to do,
day after day,
night after night,
directed by the silent message
of the constancy of my breathing,
that bears the news that I am alive.
by Peggy Freydberg
from Poems from the Pond
publisher: Hybrid Nation, 2015
Two of the most enticing ideas in cell biology have recently converged to create a paradigm shift of epic proportions. The first is that not only is it possible for mitochondria to emigrate from their host cell, they are in fact exchanged among cells much more regularly than has ever been imagined. The second is that while happenstance mutations are clearly associated with different aspects of a litany of cancers, the canonical force consistently driving tumor initiation, progression, and metastasis is now broadly understood to be the metabolic fickleness of their mitochondria. Mike Berridge is one of a handful of researchers firmly planted at the intersection of these two now ineluctable conclusions. As an author on a recent review in Cancer Research on the horizontal transfer of mitochondrial DNA (mtDNA), he adds much-needed flesh to the first-order simplification that cancer is merely a mitochondrial respiratory insufficiency. Most poignantly, in noting that the hidden force driving tumor formation forward can more generally be understood to be the reacquisition of once-lost mitochondrial function, new therapeutic opportunities immediately present themselves.
Of particular note, Berridge found that the apparent need and ability of mitochondria-free primary tumor lines to re-assemble functional respirasomes, the supercomplexes responsible for respiration, differed according to cancer type. For example, breast cancer cells were found to have a unique 'threshold' level of respiration that was different from melanoma cells. Novel anticancer agents could in theory be designed to target specific components in more respiration-dependent cancer cells while leaving other cell types unscathed.
Thursday, August 20, 2015
Olivier Roy in Eurozine:
We Europeans live in secular societies and not in pre- or post-secular societies. Secularization has prevailed globally, even in Muslim countries. Of course, that does not mean that people have become irreligious. A society can consist of a majority of believers and still be secular, as in the United States.
In order to explain this assertion, which might sound paradoxical when the world is being shaken by the rise of the "Islamic State", it will be necessary to discuss the changing nature of the link between culture and religion, and particularly the "de-culturation" of religion.
There are many different ways to define secularization. As a social phenomenon, it is not an abstract process; it is always the secularization of a given religion, whose nature changes as secularization unfolds. Common definitions of secularization include three elements.
The first is the separation of state and religion, of politics and confession, without necessarily entailing a secularization of society. The United States is a good example: although there is a strong separation of church and state, levels of religiosity among the population are still high. The First Amendment of the American Constitution stresses both secularity and religious freedom. The second element in definitions of secularization is the decline in the influence of religious institutions in societies. Activities such as healthcare and education are now managed by the state or the private sector. In Europe, the churches have clearly withdrawn from the "management of society".
The third element in definitions of secularization is what Max Weber called Entzauberung – the disenchantment of the world. This does not mean that people become atheists, but that they care less about religion. Religion no longer plays a major role in our everyday lives, even if we still consider ourselves part of a religious community. In this sense, secularization corresponds to the marginalization of religion in society, rather than its exclusion.
Lee Drutman in Vox:
As the punditry attempts to make sense of the continued popularity of Donald Trump, the prevailing establishment narrative has been simple: He's an anti-establishment buffoon; he's channeling an angry mood; his moment will pass. But as Ezra Klein argued on Monday, this narrative may be wrong. What if Trump actually represents a sizable electorate that Beltway elites have marginalized?
The data on this is pretty clear. Put simply: While most elite-funded and elite-supported Republicans want to increase immigration and decrease Social Security, a significant number of voters (across both parties) want precisely the opposite — to increase Social Security and decrease immigration. So when Trump speaks out both against immigration and against fellow Republicans who want to cut Social Security, he's speaking out for a lot of people.
By my count of National Election Studies (NES) data, 24 percent of the US population holds this position (increase Social Security, decrease immigration). If we add in the folks who want to maintain (not cut) Social Security and decrease immigration, we are now at 40 percent of the total electorate, which I'll call "populist." No wonder folks are flocking to Trump — and to Bernie Sanders, who holds similar positions, though with more emphasis on expanding Social Security and less aggression on immigration.
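The tally behind those figures is a simple cross-tabulation of two survey items. A minimal sketch of that bookkeeping, using a handful of invented toy responses (not the actual NES data, whose real cell shares are the 24 and 40 percent quoted above):

```python
# Illustrative only: toy respondents standing in for NES survey data.
# Each respondent is a pair: (view on Social Security, view on immigration),
# where each view is "increase", "maintain", or "decrease".
from collections import Counter

respondents = [
    ("increase", "decrease"),  # the core "populist" cell
    ("increase", "decrease"),
    ("maintain", "decrease"),  # added under the broader 40% definition
    ("decrease", "increase"),  # the elite-aligned opposite position
    ("maintain", "maintain"),
]

counts = Counter(respondents)
total = len(respondents)

# Core populist share: increase Social Security AND decrease immigration.
core = counts[("increase", "decrease")] / total
# Broad populist share: also count maintain-Social-Security respondents.
broad = core + counts[("maintain", "decrease")] / total

print(f"core populist share:  {core:.0%}")
print(f"broad populist share: {broad:.0%}")
```

With these made-up five respondents the shares come out to 40% and 60%; the point is the method, not the numbers, which in Drutman's analysis come from weighted NES responses.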