Tuesday, February 21, 2017
Paul Freedman in DelanceyPlace:
In the mid-1800s, unaccompanied women in America were generally not allowed to dine at restaurants: "Midday dining presented a challenge for women too busy or too far from home to return there for lunch. They might be in the company of other women or alone, but at any rate not escorted by men who were occupied with work and work-related socializing; men had their own luncheon habits. In the nineteenth-century United States, men made the rules about public dining and admitted women to restaurants on sufferance, according to a complex series of arrangements. Different practices governed the two main meals of the day.

"Restaurants depended economically on women accompanying men at the evening meal. Lunch, however, was segregated by gender and involved a series of problems, according to the social customs of the nineteenth century. In the grand and even not-so-grand metropolis, men were increasingly likely to work at some distance from home and to stay near their workplace for the midday meal. The point at which women too absented themselves from the house created a demand for their sustenance. The growth of cities and the creation of specialized shopping districts meant that it was often inconvenient for women as well as men to return home for lunch.
"The public rooms at fancy restaurants were usually reserved at lunch for men only, but some of them allowed women to have lunch in private dining spaces. In the era before Prohibition, bars offered free food, which, along with a crowded and boisterous atmosphere, encouraged demand for drink. Free-lunch bars were hopelessly inappropriate spaces for respectable women, as alcohol-driven conviviality was inevitably coarse -- the antithesis of what was considered ladylike.
Tim Stanley in The Telegraph:
Can a white person ever really understand how a black person sees the world? Back in 1959, six years before Martin Luther King marched for civil rights in Selma, one man tried. A white Texan writer called John Howard Griffin walked into a doctor’s office in New Orleans and asked him to turn his skin colour black. Griffin took oral medication and was bombarded with ultraviolet rays; he cut off his hair to hide an absence of curls and shaved the back of his hands. Then he went on a tour of the Deep South. The result was a bestselling book called Black Like Me, which is still regarded as an American classic. Griffin wanted to test the claim that although the southern United States was segregated it was essentially peaceful and just – that the two races were separate but equal.
What he discovered tells us a lot about the subtleties of racism. In 1959, unlike today, it was legally instituted. But, like today, it also flourished at the personal level – in hostility, suspicion, fear and even self-loathing.

Griffin was an extraordinary man. Born in Dallas in 1920, he went to school in France and joined the French Resistance after Hitler invaded. Griffin helped Jewish children escape to England before fleeing to America. While serving in the US army, he was blinded by shrapnel. Griffin took it all in his stride – he married, had children and converted to Catholicism. Griffin’s strong personal faith reminds us that much of the civil rights movement was in fact a Christian mission – made possible, in this instance, by what seemed like a miracle. Walking around his yard one afternoon, Griffin suddenly saw red swirls where hitherto there was only darkness. Within months his sight had returned. And it was a man determined to make the most of his second chance who hit upon the novel idea of crossing the colour line.

Those reading the book today might regard Griffin’s attempt to change his colour as akin to blacking up. Certainly, the transformation was awkward. Griffin may well have had dark skin but he retained his classically Caucasian features, and one suspects that the awkwardness of his encounters with some black people was down to them wondering if he was one of them or just horribly sunburnt.
More here. (Note: At least one post throughout February will be in honor of Black History Month.)
Shehryar Fazli in the Los Angeles Review of Books:
Timothy B. Tyson has written a concise and urgent book about Emmett Till’s 1955 murder in a small Mississippi town, a crime that galvanized civil rights defenders into a long, hard struggle against the Jim Crow regime in the South, and inspired an outraged Rosa Parks to defy segregation laws on a Montgomery city bus. It’s a macabre story of inhumanity and injustice, but also of resistance and unity across a divided nation.
The facts may be known, but bear repeating. Fourteen-year-old Emmett, during a visit from Chicago to his family’s hometown of Money, Mississippi, allegedly whistled at a white woman, Carolyn Bryant, in a grocery store. After Bryant claimed, untruthfully, that the black boy had also grabbed her, her husband Roy Bryant and his half-brother J. W. Milam abducted Emmett from his great-uncle’s house, beat, mutilated and shot him, then dumped his body into the Tallahatchie River, from which it was recovered three days later. Just another lynching in the Jim Crow South … until it wasn’t. If not for the specific time and place, it would likely never have become what is arguably the most consequential hate crime in United States history, the first act in a drama of reckoning that tested a nation’s moral fiber.
Expertly, Tyson demarcates and mines the territory of Till’s murder, including why the killers assumed it would be ignored; of the trial, which indeed concluded with a not-guilty verdict; and of the countrywide reaction to both. Yet his analysis of the big national moment does not upstage his attention to the Till family’s unimaginable personal loss.
Victoria Jaggard in National Geographic:
Creatures that thrive on iron, sulfur, and other chemicals have been found trapped inside giant crystals deep in a Mexican cave. The microbial life-forms are most likely new to science, and if the researchers who found them are correct, the organisms are still viable even though they have been dormant for tens of thousands of years.
If verified, the discovery adds to evidence that microbial life on Earth can endure harsher conditions in isolated places than scientists previously thought possible. (See “Life Found Deep Under Antarctic Ice for First Time?”)
“These organisms have been dormant but viable for geologically significant periods of time, and they can be released due to other geological processes,” says NASA Astrobiology Institute director Penelope Boston, who announced the find today at a meeting of the American Association for the Advancement of Science. “This has profound effects on how we try to understand the evolutionary history of microbial life on this planet.”
Emily Dreyfuss in Wired:
In the past, the president has also promised to publish a weekly list of crimes committed by undocumented immigrants. What he hasn’t promised to publish is a list of crimes committed by Americans. That’s not news. But his list is likely to create the false impression that undocumented immigrants are especially prone to commit violent crimes—an impression in which the human brain is complicit.
Lakoff, a University of California, Berkeley linguist and well-known Democratic activist, cites Ronald Reagan’s “welfare queen” as the signature “salient exemplar.” Reagan’s straw woman—a minority mother who uses her government money on fancy bling rather than on food for her family—became an effective rhetorical bludgeon to curb public assistance programs even though the vast majority of recipients didn’t abuse the system in that way. The image became iconic, even though it was the exception rather than the rule.
Psychologists call this bias the “availability heuristic,” an effect Trump has sought to exploit since the launch of his presidential campaign, when he referred to undocumented Mexican immigrants as rapists.
“It basically works the way memory works: you judge the frequency, the probability, of something based on how easily you can bring it to mind,” says Northeastern University psychologist John Coley. “Creating a vivid, salient image like that is a great way to make it memorable.”
Monday, February 20, 2017
by Katrin Trüstedt
Fundamental questions of migration and asylum that determine contemporary political debates also take center stage in some contemporary theaters. On the German-speaking stage, asylum seekers have made a lasting impact that has given rise to controversial recent discussions. Announcing his latest book, Bernd Stegemann, a professor at the Hochschule für Schauspielkunst Ernst Busch, claimed that the impact of the "refugees on stage" marks a problematic turn towards authenticity. According to him, it amounts to banishing mimetic art from a stage on which no fictive world emerges anymore. In line with several "new realisms" and a dominance of documentary forms, refugees are put on the stage as "real human beings" who are supposedly just "being themselves." Stegemann understands this as a takeover by a performative art form that not only expels mimetic art from the theater but, in his view, also furthers a new populism with its claim to authenticity.
This diagnosis underestimates the complex conditions and reflective potential of the contemporary stage. A play like Elfriede Jelinek's The Charges (The Supplicants) does not take the appearance of "real people" for granted. Rather, it explores the question of what it means to enter a stage, to make an appearance, and to take on or receive a role, even "as oneself." And it highlights especially what it means to appear as "stateless," in the double sense of being without a state and being without status. Not only is any claim to authenticity very much up for debate here; the very foundations of mimetic as well as performative art are being explored. In Jelinek's text, the question of asylum on the one hand, and the very nature of the theater on the other, are fundamentally linked.
Things We Learn
Things come to us
out of nowhere
Surfers riding waves
we learn the nuances of gravity
its center-of, its bonding property,
its Gs, its fatal promise, we learn
how to stand erect and,
for the most part, stay that way
learn how to take a fall
how to shuck and jive
through sticky moments
through disequilibrium to stoop
or, chest out, stand tall
falling even into the troughs of its waves
we ride, we glide skulls full of juice
snapping, crackling through calculations
needed to adjust, adjust
we learn to know the force of the wave
behind, its feel, learn to fear and not to,
to not let its immensity in terror lock us,
to knock us off our board, we learn
immediately where our feet should be,
the optimal pose, how to shift without thought,
to enter the exhilaration of a barrel
and ride despite threat of a lethal dive
to surface sane, with soul intact, alive
by Paul Braterman
Evolution has nothing to do with progress. Most evolution doesn't even have anything to do with adaptation, and it is perfectly possible for a change that is worse than useless to spread through a population. Paradoxically, however, such non-adaptive change may be a necessary prelude for major adaptations.
This post was inspired by a recent opinion piece (open access here [1]) in BMC Biology, entitled "Splendor and misery of adaptation, or the importance of neutral null for understanding evolution" (I will explain what "neutral null" means later). The paper itself is in parts highly technical, with 86 references to the original scientific literature, but I will try here to give a general overview of some of the main conclusions, and to place them in context.
Darwin and Wallace both thought that evolution was driven by selection. If so, then whenever we find a feature in an organism, it makes sense to ask what function it serves. The function may for example be help in survival (natural selection, in the narrowest sense of the term), or help in obtaining mating opportunities (sexual selection).
L: The recurrent laryngeal nerve passing under the aortic arch. Illustration by Jkwchui after Truthseeker-2004, via Wikipedia
Because the evolution of a species is constrained by its history, there will be features that are themselves non-adaptive, but come about as side-effects of more important adaptive changes. Such incidental maladaptations include the tortuous paths of major nerves and arteries, which have arisen as the unwanted by-product of changes in body plan since our fish-like ancestors. One well-known example is the recurrent laryngeal nerve, which loops under the aorta near the heart and back up again on its way from the cranium to the larynx and oesophagus. In a fish, its path is more or less a straight line, but as the heart has moved down in the body, and the aorta with it, the nerve has been forced into this contorted pathway.
Likewise, we can expect to find vestigial organs, which once had a function, and are now redundant, but have not yet completely disappeared. An example is the pelvis of the whale, inherited from its four-legged ancestors. Such vestigial organs often acquire secondary functions, in the phenomenon known as exaptation. The bones in the mammalian ear, related to bones in a reptile's flexible jaw, illustrate this. And indeed whales use their pelvis and femur relics in sexual embraces.
R: Sperm whale with drawing of skeleton, NOAA via Wikipedia
Adaptationism is the view that all aspects of an organism are, directly or indirectly, the result of selection. So every feature needs to be explained, either in terms of its own function, or as an incidental relic or side-effect of more directly functional features. This is a natural enough assumption, but like all assumptions it requires justification. Otherwise it is merely a "Just So" story.
Sughra Raza. Hong Kong Alley, January 2017.
by Yohan J. John
As a neuroscientist, I am frequently asked about consciousness. In academic discourse, the celebrated problem of consciousness is often divided into two parts: the "Easy Problem" involves identifying the processes in the brain that correlate with particular conscious experiences. The "Hard Problem" involves murkier questions: what are conscious experiences, and why do they exist at all? This neat separation into Easy and Hard problems, which comes courtesy of the Australian philosopher David Chalmers, seems to indicate a division of labor. The neuroscientists, neurologists and psychologists can, at least in principle, systematically uncover the neural correlates of consciousness. Most of them agree that calling this the "Easy Problem" somewhat underestimates the theoretical and experimental challenges involved. It may not be the Hard Problem, but at the very least it's A Rather Hard Problem. And many philosophers and scientists think that the Hard Problem may well be a non-problem, or, as Ludwig Wittgenstein might have said, the kind of problem that philosophers typically devise in order to maximize unsolvability.
One might assume that as a neuroscientist, I should be gung-ho to prove the imperious philosophers wrong, and to defend the belief that science can solve any sort of problem one might throw at it: hard, soft, or half-baked. But I have become increasingly convinced that science is severely limited in what it can say about consciousness. In a very important sense, consciousness is invisible to science.
The word "consciousness" means different things to different people, so it might help to cover some of the typical ways its used. The most objective notion of consciousness arises in the world of medicine. We don't usually require a degree in philosophy to tell when a person is conscious and when they are unconscious. The conscious/unconscious distinction is only loosely related to subjective experience: we say a person is unconscious if they are unresponsive to stimuli. These stimuli may come from outside the body, or from the still-mysterious wellspring of dreams.
But the interesting thing about any "medical" definition of consciousness is that it evolves with technology.
by Brooks Riley
by Claire Chambers
In her 1963 book, Eichmann in Jerusalem, Hannah Arendt argues that there is nothing in evil that is radical or lucid. Instead, she claims, even the most extreme evil is senseless and banal. Amos Elon summarized Arendt's argument in terms that cannot but resonate with the current political circumstances in the United States: 'Evil […] need not be committed only by demonic monsters, but—with disastrous effect—by morons and imbeciles as well'. As Arendt writes about Adolf Eichmann, one of the Holocaust's prime orchestrators: '[he] was not Iago and not Macbeth […]. Except for an extraordinary diligence in looking out for his personal advancement, he had no motives at all'.
The world's new Orange Overlord, 45th President of the United States Donald J. Trump, has gifted us too many irrational, muddled, and downright idiotic statements and actions over the last year to enumerate in this short blog post. To take just one example, on the first day of Black History Month, Trump seemed to believe that Frederick Douglass, the nineteenth-century author of Narrative of the Life of Frederick Douglass, an American Slave, was still alive. According to Trump, Douglass was 'an example of somebody who is doing an amazing job, who is being recognized more and more, I notice'.
Arendt was right to observe that the slide from thoughtlessness to evil is easy and smooth. A week before his Douglass gaffe, on Holocaust Remembrance Day 2017, Trump issued his executive order suspending the entry of refugees into the United States for 120 days, and of Syrian refugees indefinitely. Additionally, citizens from seven Muslim-majority countries (Syria, Iraq, Iran, Yemen, Libya, Sudan and Somalia) were blocked from entering for 90 days. What a way to commemorate the premeditated and industrial killing of six million Jews and 200,000 Roma, by singling out refugees and a religious group for exclusion. Thankfully, Trump soon found himself facing implacable opposition from the US legal system, and at the time of writing he has been unable to execute his order.
Moreover, Trump's Holocaust Remembrance Day statement made no mention of the Jews or anti-Semitism. His inept Press Secretary Sean Spicer later clarified that this omission was not regretted, because the White House's intention was to 'acknowledg[e] all of the people' who died. Prince Charles responded by saying the lessons of the Holocaust are being forgotten. Yet these lessons are in fact being wilfully erased by Trump and his team.
So, I took off her blouse as she raised her arms
A trumpeter blared outside my window
She ran her fingers through my hair
I unclasped her bra
Trumpeter boomed a tune I’d heard before
“My husband will be angry if I stay”
Tip of my tongue touched her nipple
She unzipped my fly
“I should go back to my husband”
She sipped my scotch neat
Unzipped, the flame leapt
I kissed her nipples red
We savored the scotch as our lips met
My tongue trailed south from her nipples
No bush by the door bloomed
She straddled me on the king bed
My tongue brushed the door where no bush bloomed
She sighed as the epicenter shook
“Hey Hey Ho Ho: Trumpeter Must Go”
Arms raised, her hands waved double O
By Rafiq Kathwari / @brownpundit / rafiqkathwari.com
by Elise Hempel
I admit that Obama sometimes bored me. Not when he was fired up, almost singing, gospel-style, at a rally. Not when he was broadly smiling, affectionately joking with Joe Biden or being teased by Michelle. Not even when he was doing a serious interview with Steve Kroft, leaning forward with his hands together, deep-voiced. But during a press conference, fielding a random question – the long pauses for thought, the even longer, deliberate responses.... That's when I'd change the channel or walk out of the room for a snack. But no matter. I always knew that behind his ability to bore was a solid president, a decent man.
One afternoon last year, standing at the kitchen sink, I heard a low, almost-monotone, almost-mumbling voice that kept drifting here and there as it spoke, a voice that seemed to have no direction. I thought the radio was on in my partner's office, tuned to some daytime talk show, a soft-voiced FM deejay meandering, filling the air-space. But when I walked out of the kitchen I saw that what I'd been hearing was really the TV, a Trump rally my partner, Ray, had paused on in his channel-surfing. What I'd been hearing was really Donald Trump going on and on about something, changing from one thing to the next without transition, filling time and somehow having filled the venue with a crowd. How could anyone possibly stay awake at his rallies? (And, standing now in the living-room with the dish towel in my hand, staring at the television screen, is it possible that I noticed the possibly-paid spectators directly behind Trump turning their heads in distraction, shifting in boredom, laughing with each other about something, anything, besides what Trump was rambling on about?)
Steve Bannon has called Donald Trump "probably the greatest orator since William Jennings Bryan." (Huh?) And I've read a handful of articles that say that people like me just don't get it: Trump speaks in a language, with a style understood by only his fervent supporters. Might it really be that, as Donald Trump rambles, his supporters are hearing, through the static and "white noise," only the bits that catch their ear, that serve their needs and wants, as I do when I'm "wool-gathering" while my partner talks, my head finally turning when he says he'll watch tonight's real-life murder show with me, or as my dog does when, somewhere in the endless jumble of my baby-talk, I speak the word "bone" or "kitty" or "walkies"?
by Max Sirak
Ah, nothing like a made-up holiday to honor the CEO of our oligarchy.
Mmmm…drink it in. It goes down jagged with a bitter and retching aftertaste. Which is good. It means we're of a like mind and among friends.
3QD is a bastion of thought. It's a place where words still mean things and facts still matter. It's a digital, international safe space for liberal sisters and brothers. It's a place to exchange ideas, gather, and garner support.
This last point is important. It's easy to look around today, become discouraged, and feel alone.
But we are not alone. We all have friends and loved ones who are fighting or flighting. I know I've spent a good amount of time recently trying to figure out what I can do to make things better. My quest has led me to travel in time and look back. Today I'd like to share some of what I've found.
So - whether you're running away or ‘rastling to make the world a better place - here are some things I've learned over the last month.
by Christopher Bacas
I left a Texas college somewhere between my sophomore and junior years. My body and saxophone continued to attend for another year and a half. Degree requirements unfulfilled, I eventually packed and left with some South Carolina buddies, padding my stereo and LPs with tangled knots of dirty laundry. High on speed, we drove sleepless, taking a stealthy moonlight swim in a motel pool en route. After dumping stuff at my parents' house, we rolled on to New York City, then in its early-'80s menacing, funky glory. Greatness poured out of musicians everywhere, as Stevie Wonder says: "Jus' like I pitchered it". I was only visiting, though. It was all too scary.
Back home in Pennsylvania, I practiced for the big leagues. A new local eatery, with a Vonnegut-inspired name, featured fine local musicians. I went by to sit in. A trumpet player, a mostly legit guy, played as well, having fun after his lawyer day gig. He noticed the shaggy tenor player blasting away with schoolboy enthusiasm. After talking down our city (as if I didn't know it was backwards), he encouraged me to join the Musicians' Union and take advantage of their bookings. They could swear me in at the next meeting.
The Union office sat between a hat shop and barber on the town's western-running main drag.
Its interior: one large room with beat-up office furniture and framed band photos. Our local hung tough as gigs, dues, and membership declined. Its crown jewel, a community band once led by the Music Man-monikered Elwood Sprigle, remained strong. I introduced myself to the secretary, who smiled sweetly when told I attended the city high school. As the folding chairs around me filled, I barely looked up. The men, my father's age or older, ignored me. Robert's Rules brought the meeting to order. After sad, monotone reporting on finances and the many bars and clubs using non-Union bands, the floor went to the business agent.
Sunday, February 19, 2017
Ulka Anjaria in the Boston Review:
In June of 1997, on the verge of graduating from high school, I received an award for my study of foreign languages, a book wrapped in blue shiny paper. As I opened it, a small clipping from TIME slipped out—an article on an Indian writer, Arundhati Roy, whose novel was taking the literary world by storm. My prize was Roy’s novel, The God of Small Things (1997). I sat down to read it immediately.
On a visit to India the summer before, I had poked around bookshops desperately seeking out new fiction—something other than the requisite thin copy of Mulk Raj Anand’s Untouchable (1935) that seemed to be everywhere, dusty and unthumbed, the few books by Anita Desai and Gita Mehta, Salman Rushdie’s Haroun and the Sea of Stories (1990), Kamala Markandaya’s Nectar in a Sieve (1954), and a graying Sahitya Akademi translation of Rabindranath Tagore’s Chaturanga (1916). This was 1997 and, unlike today, most of Mumbai’s bookstores were hidden inside luxury hotels, Indian literature meant Rudyard Kipling and E. M. Forster, and the bookshelves dedicated to India offered little more than old Lonely Planet volumes and coffee table books on the lives of the Maharajas. At age eighteen, I found Anand dry and Rushdie pompous. Desai and Mehta felt like they were writing for my parents’ generation. There was even something dull and unfashionable about the packaging of these books, most of which were published not in India but in England. Indian literature wasn’t cool—it was, somehow, embarrassing.
The God of Small Things changed all that. The idea that India could have a contemporary novel of its own, shorn of Anand’s unwieldy idioms or Markandaya’s awkward exoticisms, a novel whose writing style was new and fresh, whose irony and anger were youthful and contemporary, a novel that shouted rather than whispered, a novel by a young woman, was, to my mind, a revelation.
Hannah Devlin in The Guardian:
The woolly mammoth vanished from the Earth 4,000 years ago, but now scientists say they are on the brink of resurrecting the ancient beast in a revised form, through an ambitious feat of genetic engineering.
Speaking ahead of the American Association for the Advancement of Science (AAAS) annual meeting in Boston this week, the scientist leading the “de-extinction” effort said the Harvard team is just two years away from creating a hybrid embryo, in which mammoth traits would be programmed into an Asian elephant.
“Our aim is to produce a hybrid elephant-mammoth embryo,” said Prof George Church. “Actually, it would be more like an elephant with a number of mammoth traits. We’re not there yet, but it could happen in a couple of years.”
The creature, sometimes referred to as a “mammophant”, would be partly elephant, but with features such as small ears, subcutaneous fat, long shaggy hair and cold-adapted blood. The mammoth genes for these traits are spliced into the elephant DNA using the powerful gene-editing tool, Crispr.
Beena Sarwar in The Wire:
Did the woman bouncing a little girl on her shoulders, chanting and dancing to an inner beat before the drums sounded, go back last Thursday? Did they survive the blast?
I saw them one Thursday last April when I went to Sehwan Sharif with friends from India who were in Pakistan to attend a wedding. Every week, the day before Friday, the Muslim holy day, draws the largest crowds at the Sufi shrines that dot the landscape across South Asia.
Devotees believe that you only go to the dargah – the shrine built over the grave of a revered religious figure – when you are “called” to do so. I have been “called” to Sehwan Sharif several times.
These Sufi dargahs are a symbol of the region’s syncretic culture – the unique blend of Islam with local cultures. It was the Sufi philosopher-poets’ teachings of peace and love that led to the spread of Islam in the subcontinent. It is this culture that today’s hard-line Islamists, who draw their stark puritan ideology from Wahhabi teachings, are trying to counter.
How a Ruthless Network of Super-Rich Ideologues Killed Choice and Destroyed People’s Faith in Politics
George Monbiot in Evonomics:
The events that led to Donald Trump’s election started in England in 1975. At a meeting a few months after Margaret Thatcher became leader of the Conservative party, one of her colleagues, or so the story goes, was explaining what he saw as the core beliefs of conservatism. She snapped open her handbag, pulled out a dog-eared book, and slammed it on the table. “This is what we believe,” she said. A political revolution that would sweep the world had begun.
The book was The Constitution of Liberty by Friedrich Hayek. Its publication, in 1960, marked the transition from an honest, if extreme, philosophy to an outright racket. The philosophy was called neoliberalism. It saw competition as the defining characteristic of human relations. The market would discover a natural hierarchy of winners and losers, creating a more efficient system than could ever be devised through planning or by design. Anything that impeded this process, such as significant tax, regulation, trade union activity or state provision, was counter-productive. Unrestricted entrepreneurs would create the wealth that would trickle down to everyone.
This, at any rate, is how it was originally conceived. But by the time Hayek came to write The Constitution of Liberty, the network of lobbyists and thinkers he had founded was being lavishly funded by multimillionaires who saw the doctrine as a means of defending themselves against democracy. Not every aspect of the neoliberal programme advanced their interests. Hayek, it seems, set out to close the gap.
Lydialyle Gibson in Harvard Magazine:
What causes aging? “Scientists have been thinking about this question for centuries,” says Harvard professor of medicine Vadim Gladyshev. It sounds almost simple, but in fact it’s thorny and complicated, and although several theories have emerged—that organisms are “programmed” by nature to die, or that aging is the result of “hyperfunction” of biological activities, or that it’s controlled by genetics—there are as yet no settled answers. But a study published today in Science Advances, coauthored by Gladyshev, offers evidence bolstering one long-held theory: that aging is caused, at least in part, by molecular damage accumulating in the cells. “This damage is generated by nearly every cellular process,” he says—by the work of enzymes and proteins and the life-sustaining metabolic processes that occur at every level of complexity, from simple molecules and cell components to whole cells and entire organs. “So over time we have many, many ‘damage forms,’ millions or billions”—unavoidable byproducts of enzyme function, for example, or of protein-to-protein interactions, errors in DNA transcription or translation. “And as a function of age, they accumulate.” Eventually, it’s more than the body can cope with.
...“Aging is the most important biological question.” It is at the root of so many diseases. “Even if we eliminate cancer, for example, the effect would be minor, because of all the other diseases of aging: diabetes, Alzheimer’s, sarcopenia, cardiovascular disease, and so on and so on.” All of those maladies will still add up. “But if we can learn how to slow down the aging process, we can deal with all of those diseases at once. We delay their appearance. That’s why it’s important to study these fundamental questions, to ask: what is aging?”