Wednesday, April 01, 2015
Adam Shatz in the NYT (photo: Ferhat Bouda/Agence Vu, for The New York Times):
What impressed me about Daoud’s writing, both his journalism and his novel, was the fearlessness with which he defended the cause of individual liberty — a fearlessness that, it seemed to me, bordered on recklessness in a country where collectivist passions of nation and faith run high. I wondered whether his experience might provide clues as to the state of intellectual freedom in Algeria, a peculiar hybrid of electoral democracy and police state. Late last year, I had an answer of sorts. Daoud was no longer merely a writer. He was now someone you had to take a side on, in Algeria and in France.
His ordeal began on Dec. 13, during a book tour in France, where “Meursault” received rapturous reviews, sold more than 100,000 copies and came two votes shy of winning the Prix Goncourt, the nation’s most prestigious literary prize. He was on a popular late-night talk show called “On n’est pas Couché” (“We’re Not Asleep”), and he felt, he would tell me later, “as if I had all of Algeria on my shoulders.” He insisted to the French-Lebanese journalist Léa Salamé, one of the guests on the program, that he considered himself an Algerian, not an Arab — a view that’s not uncommon in Algeria, but that is opposed by Arab nationalists. He said that he spoke a distinct language called “Algerian,” not Arabic. He said that he preferred to meet with God on foot, by himself, rather than in an “organized trip” to a mosque, and that religious orthodoxy had become an obstacle to progress in the Muslim world. Daoud said nothing on the program that he hadn’t said in his columns or his novel. But saying it in France, the country that ruled Algeria from 1830 to 1962, got him noticed by people back home who tend to ignore the French-language press.
One of them was an obscure imam named Abdelfattah Hamadache, who had reportedly been an informer for the security services. Three days after Daoud’s appearance on French television, Hamadache wrote on his Facebook page that Daoud — an “apostate” and “Zionized criminal” — should be put on trial for insulting Islam and publicly executed. It was not quite a call for Daoud’s assassination: Hamadache was appealing to the state, not to freelance jihadists. But Algeria is a country in which more than 70 journalists were murdered by Islamist rebels during the civil war of the 1990s, the so-called Black Decade. Those murders were often preceded by anonymous threats in letters, leaflets or graffiti scrawled on the walls of mosques. Hamadache’s “Facebook fatwa,” as it became known, was something new, and uniquely brazen, for being signed in his own name.
William Logan in The New Criterion:
As he recalled it,
I got out of a train at, I think, La Concorde and in the jostle I saw a beautiful face, and then, turning suddenly, another and another, and then a beautiful child’s face, and then another beautiful face. All that day I tried to find words for what this made me feel. That night as I went home along the rue Raynouard I was still trying. I could get nothing but spots of colour. I remember thinking that if I had been a painter I might have started a wholly new school of painting. . . . Only the other night, wondering how I should tell the adventure, it struck me that in Japan, where a work of art is not estimated by its acreage and where sixteen syllables are counted enough for a poem if you arrange and punctuate them properly, one might make a very little poem which would be translated about as follows:—
“The apparition of these faces in the crowd :
Petals on a wet, black bough.”
—“How I Began,” T.P.’s Weekly, June 6, 1913
Early in March 1911, Ezra Pound arrived in Paris. By late May he had moved on. The specters in the Métro obviously haunted him. The lines were finished by fall the following year, when he sent Poetry a batch of poems that, he hoped, would “help to break the surface of convention.” When these “Contemporania” were published at the head of the April 1913 issue, the poem appeared in this fashion:
In a Station of the Metro
The apparition    of these faces    in the crowd :
Petals    on a wet, black    bough .
The first thing striking about the couplet is the subject—beauty discovered underground.
Emily Anthes in Nature:
In 2009, a major California hospital was looking for ways to cut costs. Stanford Hospital and Clinics was on track that year to purchase nearly US$6.8 million worth of blood for transfusions. But a growing body of evidence was suggesting that physicians could often forgo the procedure. So, beginning in July 2010, whenever a clinician used the hospital's computerized ordering system to request blood, it would call up the patient's most recent lab results. If the numbers indicated that she or he should be healthy enough to get by without a transfusion, an alert would pop onto the screen gently reminding the doctor of the guidelines and requesting further justification for the order.
The results, detailed in two papers published in the past 18 months, were dramatic. The number of red-blood-cell transfusions dropped by 24% between 2009 and 2013, representing an annual savings of $1.6 million in purchasing costs alone. And as transfusion rates fell, so did mortality, average length of stay and the number of patients who needed to be readmitted within 30 days of a transfusion. By simply asking doctors to think twice about transfusions, the hospital had not only reduced costs, but also improved patient outcomes. Transfusions are common procedures, at least in developed nations. In 2011, US doctors transfused 21 million units of blood and blood products; in the United Kingdom, the number was nearly 3 million. But although transfusions can be lifesaving, they are often unnecessary and are sometimes even harmful. “I think we were kind of brainwashed into thinking that blood saves lives, and the more you give the better,” says Steven Frank, an anaesthesiologist and director of the blood-management programme at the Johns Hopkins Health System in Baltimore, Maryland. “We've gone 180 degrees, and now we think that less is more.”
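The alert logic the article describes can be pictured as a simple threshold check on the patient's latest labs. The sketch below is illustrative only: the 7 g/dL hemoglobin threshold, the field names, and the message wording are assumptions, not Stanford's actual rule.

```python
# Minimal sketch of a clinical decision-support check for blood orders.
# The 7 g/dL hemoglobin threshold and the record field names are
# illustrative assumptions, not Stanford's actual implementation.

HEMOGLOBIN_THRESHOLD_G_DL = 7.0

def transfusion_alert(latest_labs):
    """Return an alert message if the most recent hemoglobin suggests
    the patient may not need a transfusion; otherwise return None."""
    hgb = latest_labs.get("hemoglobin_g_dl")
    if hgb is None:
        return None  # no recent result on file; let the order proceed
    if hgb >= HEMOGLOBIN_THRESHOLD_G_DL:
        return (f"Hemoglobin is {hgb} g/dL (threshold "
                f"{HEMOGLOBIN_THRESHOLD_G_DL} g/dL). Guidelines suggest "
                "a transfusion may be unnecessary; please justify the order.")
    return None  # labs indicate transfusion may be warranted

print(transfusion_alert({"hemoglobin_g_dl": 9.2}))  # triggers an alert
print(transfusion_alert({"hemoglobin_g_dl": 6.1}))  # None: order proceeds
```

The key design point is that the alert interrupts but does not block: the clinician can still place the order after providing justification, which is why the intervention is described as "gently reminding" rather than forbidding.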
From Jam Tarts:
How has your experience as a professor of creative writing and literature influenced your personal tastes? How has what you've taught – and perhaps who you've taught – over the years challenged or even transformed your sense of what’s pleasing and what’s not?
RP: For years I've required the young poets in my MFA workshop to compile an anthology: 36 pages that show what you mean by the words “poem” or “poetry.” Ideally, typed up by hand. The students learn from the exercise – sometimes typing something they didn't realize they liked, sometimes beginning to type something they thought they liked, then abandoning it.
A kind of secret function of that exercise has been to develop and expand my taste. The student anthologies are scouts for me, keeping my taste limber, I think. Sometimes, there’s the plausible mediocrity that the young poets (or their teachers) are reading in a particular decade, or lustrum, or year. But sometimes I get some free education. Not only contemporary finds (it may have been in one of those anthologies that I first read a poem by Terrance Hayes or Katie Peterson) but poets translated from other languages. And re-discoveries: some very hip, rather experimental young poet types out “Lycidas” and I realize I’ve sort of underestimated it as a dusty, ornate perennial.
From The Scientist:
Growing old is a fact of life. And there’s no mistaking it, given the increased fatigue, weakened bones, and ill health that generally accompany aging. Indeed, age is the number one risk factor for myriad diseases, including Alzheimer’s, cancer, cataracts, and macular degeneration. And while researchers are making progress in understanding and treating each of these ailments, huge gaps remain in our understanding of the aging process itself.
“We age so completely and in so many different ways,” says stem cell biologist Derrick Rossi of Harvard University. “We are programmed to die.”
The aging process can be traced down to the level of cells, which themselves die or enter senescence as they age, and even to the genomic level. Accumulation of mutations and impairments in DNA repair processes are highly associated with symptoms of aging. In fact, disorders that cause premature aging are typically caused by mutations in genes involved in the maintenance of our DNA. And at the cellular level, decreases in stem cells’ proliferative abilities, impairments in mitochondrial function, and proneness to protein misfolding can all contribute to aging. As scientists continue to detail these various processes, says Paul Robbins of the Scripps Research Institute, “the big question is, ‘At what step along all these pathways is the best place to intervene to try to promote healthy aging?’”
While diverse strategies—from caloric restriction to genetic manipulation—have proven to extend life span in model organisms in the lab, these animals are not necessarily enjoying longer periods of health. (See “Quantity or Quality?”) In the end, researchers studying aging must learn not just how to extend life, but how to prevent age-related disease and physical decline.
Nicholas Fitz in Scientific American:
In a candid conversation with Frank Rich last fall, Chris Rock said, "Oh, people don’t even know. If poor people knew how rich rich people are, there would be riots in the streets." The findings of three studies, published over the last several years in Perspectives on Psychological Science, suggest that Rock is right. We have no idea how unequal our society has become.
In their 2011 paper, Michael Norton and Dan Ariely analyzed beliefs about wealth inequality. They asked more than 5,000 Americans to guess the percentage of wealth (i.e., savings, property, stocks, etc., minus debts) owned by each fifth of the population. Next, they asked people to construct their ideal distributions. Imagine a pizza of all the wealth in the United States. What percentage of that pizza belongs to the top 20% of Americans? How big of a slice does the bottom 40% have? In an ideal world, how much should they have?
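The quantity survey respondents were asked to estimate, each quintile's share of total wealth, is easy to compute from raw holdings. A small sketch, using ten invented household net worths rather than any real data:

```python
# Compute each fifth's share of total wealth from individual net worths.
# The ten net-worth figures below are invented for illustration; they are
# not the Norton & Ariely data.

def quintile_shares(net_worths):
    """Sort net worths ascending, split into five equal groups, and
    return each group's percentage of total wealth (bottom fifth first)."""
    w = sorted(net_worths)
    n = len(w) // 5
    total = sum(w)
    return [round(100 * sum(w[i * n:(i + 1) * n]) / total, 1)
            for i in range(5)]

# Ten hypothetical households (savings + property - debts), in $1,000s
sample = [-5, 2, 10, 15, 30, 45, 80, 120, 300, 900]
print(quintile_shares(sample))  # -> [-0.2, 1.7, 5.0, 13.4, 80.2]
```

Even in this toy example the top fifth holds roughly 80 percent of the wealth, which is close to the actual US distribution the study's respondents so badly underestimated.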
Over at the NYRB (photo by Katherine Cecil):
On March 14–15, 2015, The New York Review of Books Foundation, Fritt Ord, and the Dan David Prize held a conference, “What’s Wrong with the Economy—and with Economics?” at Scandinavia House in New York. We are pleased to present the following video footage of the event.
The Crash of 2007-2008 was an acute crisis of market disequilibrium which has imposed itself upon an economics discipline still giving pride of place to models where market forces nudge economies in the very opposite direction—towards equilibrium. Crises of disequilibrium have occurred with increasing frequency over the past thirty years: with the Latin American debt crises of the 1980s, the American Savings and Loans collapse of the late 1980s, the Scandinavian banking crisis of the early 1990s, the Asian and Russian financial crises of the late 1990s, the American “dot-com” bust of 2000, and the Crash of 2007-2008 itself which has been global in impact.
Yet treating these crises as a series of near-identical events susceptible to economic modelling does not, on the face of it, do justice to the complexity and singularity of the forces which combined to bring them about. Many of these influences seem to have had their origins well beyond the home territory of economics. Doing justice to these outside forces may require a knowledge of ethics, anthropology, contemporary history and politics, public policy, and an understanding of the beliefs, frequently delusional, which seized many of the economic actors before and during the crises.
Among these disciplines it is, unsurprisingly, ethics which intrudes questions of value deepest within the territory of economics, and forces a reappraisal of where the discipline stands in the disciplinary continuum between the humanities and the natural sciences. The overwhelming preference of economists themselves is to be as closely aligned as possible with the natural sciences. But with the intrusion of such ethically charged issues as the human fallout from the Crash, and the unrelenting growth of economic inequality in the US and most European countries, the scientific and the normative in economics are becoming increasingly difficult to keep apart.
Disputes between economists which seem to derive from disagreements about data and methodologies may on closer examination be rooted in profound disagreements about values.
Videos of the panels can be found here.
Massimo Pigliucci in Scientia Salon:
I have devoted a serious amount of time to reading the new book by Roberto Mangabeira Unger and Lee Smolin, The Singular Universe and the Reality of Time: A Proposal in Natural Philosophy. Indeed, this review actually pertains to the first part of the book, written by Unger, the philosopher in the pair. Eventually I will come back to it with a second review, focusing on the part written by Smolin, the physicist. They make the same argument, but one goes at it from a broad, philosophical perspective, the other from a more empirical, scientific point of view.
It is an ambitious book, bound to be controversial both among philosophers and among scientists, but it is worth the effort, if nothing else in order to expose one’s mind to a fairly radical way of conceiving of metaphysics, physics, and mathematics — and this despite the fact that the first part, written by Unger, is somewhat slow going and repetitious, compared to Smolin’s contribution.
Before we get to what the authors set out to accomplish, it is worth discussing a more basic premise of the book: they see it as an exercise in what they call (a revived form of) “natural philosophy.” Of course, natural philosophy was the name by which science went before it became a field of inquiry independent of philosophy itself. Descartes, Galileo, Newton and even Darwin thought of themselves as natural philosophers (the word scientist, in fact, was invented by Darwin’s mentor, William Whewell, in 1833). But what’s the point of going back to the old term, aside from a bit of historical nostalgia and perhaps intellectual pretentiousness?
Actually, Unger & Smolin (henceforth, U&S) make a very good case for it, which begins with the observation that many of their colleagues have indeed engaged, often stealthily, or perhaps without recognizing it, in precisely this sort of activity.
My father’s in my fingers, but my mother’s in my palms.
I lift them up and look at them with pleasure –
I know my parents made me by my hands.
They may have been repelled to separate lands,
to separate hemispheres, may sleep with other lovers,
but in me they touch where fingers link to palms.
With nothing left of their togetherness but friends
who quarry for their image by a river,
at least I know their marriage by my hands.
I shape a chapel where a steeple stands.
And when I turn it over,
my father’s by my fingers, my mother’s by my palms
demure before a priest reciting psalms.
My body is their marriage register.
I re-enact their wedding with my hands.
So take me with you, take up the skin’s demands
for mirroring in bodies of the future.
I’ll bequeath my fingers, if you bequeath your palms.
We know our parents make us by our hands.
by Sinead Morrissey
from The State of the Prisons
publisher: Carcanet, Manchester, 2005
Helen Fisher in Nautilus (scene from Before Sunrise):
One-night stands; hooking-up; friends with benefits; living together; pre-nups; civil unions. These all spell caution. But they also spell logic—because our brain is soft-wired to attach slowly to a partner.
The basic circuits for romantic love lie in primitive regions of the brain, near those that orchestrate thirst and hunger. Romantic love is a drive—one of three basic brain systems that evolved to direct our fundamental human mating and breeding strategy. The sex drive predisposes you to seek a range of mating partners; romantic love enables you to focus your mating energy on a single individual at a time; and feelings of attachment incline you to form a pair-bond at least through the infancy of a single child. Feelings of romantic love and deep attachment to a partner emerge in a pattern highly compatible with the spirit of the times—that is, with slow love.
I say this because my colleagues Lucy Brown, Art Aron, Bianca Acevedo, and I have put new lovers into a brain scanner (using functional Magnetic Resonance Imaging, or fMRI) to measure neural activity as these men and women gazed at a photo of their sweetheart. Those who had fallen madly in love within the past eight months showed activity in brain regions associated with energy, focus, motivation, craving, and intense romantic love. But those who had been passionately in love for eight to 17 months also showed activity in an additional brain region associated with feelings of attachment.
Romantic love is like a sleeping cat; it can be awakened at any time. Feelings of deep attachment, however, take time, and they can endure. In another of our studies, led by Acevedo, we put 17 men and women in their 50s and early 60s into the brain scanner. These participants had been married an average of 21 years, and all maintained that they were still madly in love with their spouse. Their brains showed that they were: They were deeply attached as well.
We have even begun to map some of the brain circuitry responsible for this marital happiness. In our study of long-term lovers, those who scored higher on a marital satisfaction questionnaire showed more activity in a brain region linked with empathy, a trait they had most likely retained from their initial passion. Moreover, when psychologist Mona Xu and her team used my original research design to collect similar brain data on 18 young men and women in China, she found that those who were in love long term showed activity in a brain region associated with the ability to suspend negative judgment and over-evaluate a partner, what psychologists call “positive illusions.” Much like men and women who have just fallen madly in love, these long-term partners still swept aside what they didn’t like about their mate and focused on what they adored.
Because feelings of attachment emerge with time, slow love is natural. In fact, rapidly committing to a new partner before the liquor of attachment has emerged may be more risky to long-term happiness than first getting to know a partner via casual sex, friends with benefits and living together. Sexual liberalism has aligned our courtship tactics with our primordial brain circuits for slow love.
Tuesday, March 31, 2015
Kat McGowan in Aeon (Anthony Quinn and Anna Karina on the set of 'The Magus'. 1976. Photo by Eve Arnold/Magnum):
Every modern generation has had its own idiosyncratic obsession with telepathy, the hope that one human being might be able to read another person’s thoughts. In the late 19th century, when spiritualism was in vogue, mind-reading was a parlour game for the fashionable, and the philosopher William James considered telepathy and other psychic phenomena legitimate subjects of study for the new science of psychology. By the 1960s, the Pentagon was concerned about Soviet telepathy research and reports that the Soviets had established remote communication with submarine commanders. In the 1970s, one ambitious Apollo 14 astronaut took it upon himself to try broadcasting his brainwaves from the moon.
In our technologically obsessed era, the search for evidence of psychic communication has been replaced by a push to invent computerised telepathy machines. Just last year, an international team of neurobiologists in Spain, France and at Harvard set up systems that linked one brain to another and permitted two people to communicate using only their thoughts. The network was basically one massive kludge, including an electroencephalography cap to detect the sender’s neural activity, computer algorithms to transform neural signals into data that could be sent through the internet and, at the receiving end, a transcranial magnetic stimulation device to convert that data into magnetic pulses that cross another person’s skull and activate certain clusters of neurons with an electrical field. With this contraption, the researchers were able to send a signal of 140 bits (the word ‘ciao’) from one person’s brain to another.
This apparatus is complex, expensive and extremely low-bandwidth, achieving a speed of about two bits per minute. Nonetheless, this study and others like it inspire a wave of hope that it might one day be possible to read another person’s thoughts. It’s easy to see why people won’t give up on the idea. Telepathy promises an intimate connection to other human beings. If isolation, cruelty, malice, violence and wars are fuelled by misunderstandings and communication failures, as many people believe, telepathy would seem to offer the cure.
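The bandwidth figures in the study make the gap between this apparatus and anything resembling telepathy concrete. A back-of-envelope sketch; the 5-bit-per-letter comparison encoding is an assumption for illustration, not the experiment's actual (deliberately redundant) coding scheme:

```python
# Back-of-envelope arithmetic on the brain-to-brain link's bandwidth.
# The 5-bit-per-letter "minimal" encoding below is an illustrative
# assumption; the experiment's own redundant coding used 140 bits
# to convey the four-letter word 'ciao'.

BITS_SENT = 140          # total bits transmitted for 'ciao'
RATE_BITS_PER_MIN = 2.0  # reported throughput: about two bits per minute

minutes = BITS_SENT / RATE_BITS_PER_MIN
print(f"Sending 'ciao' took about {minutes:.0f} minutes")  # ~70 minutes

# For comparison, a bare 5-bit alphabetic code would need only
# 4 letters * 5 bits = 20 bits -- still 10 minutes at this rate.
plain_bits = 4 * 5
print(f"Minimal encoding: {plain_bits} bits, "
      f"about {plain_bits / RATE_BITS_PER_MIN:.0f} minutes")
```

At roughly an hour per word, the contraption is many orders of magnitude slower than speech, which is part of why the article frames it as inspiration for hope rather than a working technology.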
In my first year as a medical student I thought I had a pretty good notion of what medicine was all about. I saw it as a branch of mechanical engineering, like building bridges, say, but inside the human body. If you want to build a bridge across a river, you’d have to take measurements and make calculations, choose building materials and then construct your bridge. Doesn’t matter whether you are working in Timbuktu, in Marseille, or on the moon.
Medicine is not like that at all. In Timbuktu it’s a completely different enterprise than in Marseille, and on the moon there is no medicine. This multifariousness of medicine was brought home to me when Professor B., after describing a very complicated surgical procedure, concluded the demonstration by adding, “Of course I would never do this type of operation in a patient who is above eighty.”
I was shocked. What’s wrong with patients older than eighty? Aren’t they worth the trouble? It took me quite some time before I realized that that was not the point. Humans form a biological population, which means that every individual is a little different from all the others. Without this variety there would be nothing to select within the evolutionary process. Every face is different. Every fingerprint, every nose, every tone of voice—they are all different. The same goes for all the potatoes and roses of one kind. They are all different from each other.
Anne Fausto-Sterling in Boston Review (Photo: Rachel Mack):
During the 1950s peanut butter came in notoriously hard-to-close pry-top jars, and an enterprising rat in my family’s home took advantage. At night, after my parents’ bedroom door clicked shut, they would hear a clatter as the rat removed the metal lid and dropped it to the floor. Eventually mom and dad poisoned the ingenious beast. My brother skinned it, tanned its hide, and nailed it to a bulletin board—a stark warning to future marauders.
Lately I’ve been thinking about that rat. How did it figure out how to open the jar? How did it learn that the coast was clear when my parents retired for the night? Did it have a lid-opening gene? A peanut butter gene? Was it predisposed to explore, or did its family and rodent companions teach it to investigate? Why did that individual, and not, for example, its twin or its sister, exploit this food source?
It is a quirky case, the condiment-thieving rat, but the question of how individual differences arise is important to us all. In his State of the Union address, President Obama made a big push for personalized medicine—based on the idea that if we know an individual’s genome, we can predict medical outcomes and tailor individual treatments—and the success of that enterprise relies on the right answer.
This is a complicated challenge because, as biologists first proposed in the early twentieth century, a genotype—all the genes in the cells of an organism—does not guarantee a phenotype—what the organism looks like and how it behaves. A vast territory links genes in their cellular starting environment to individual phenotypes. Recently behavioral biologist Julia Freund and her colleagues published a fascinating study that directly addresses this problem.
Andrew Heisel in the LA Review of Books:
THIS MAY BE a golden age of television, but it’s hard to feel particularly blessed about it. According to Brett Martin’s recent book Difficult Men, this TV golden age is actually America’s third. In fact, if you add up the spans of Martin’s different ages, we’ve spent more time since 1950 within a golden age of television than without. Our current one has been running since the late ’90s. This is odd, not because we’ve found ourselves so frequently in the company of great television, but because we’ve found ourselves in golden ages at all. For a long time, the idea of “the golden age” just didn’t work like that.
The Greek poet Hesiod gave the world its first glimpse of the golden age. In his Works and Days, from ca. 700 BC, he describes a decline of man, from golden to silver to bronze, to the iron present. In the golden age, humans were without flaw and the earth was cornucopial. Man did not have to work and knew no master. In the ongoing iron, we are “with toils and grief oppressed, / Nor day nor night can yield a pause of rest.” Over the years, Plato, Ovid, and Virgil all played with the idea, but the essential concept remained the same: the golden age already happened; things are worse now. The job of the golden age is to remind us how much better things could be.
And so the golden age remained, firmly Classical, mythic, and over. And not just among the poets. To see how it worked its way into present American usage, I’ve traced the concept’s deployment through databases covering British and American writing since the 16th century — thousands of old books and newspapers. They bear out my suspicion that the “golden age” simply isn’t what it used to be, and not just on television. What changed?
With few exceptions, in earlier eras nobody ever seriously declares, “This is the golden age.” Instead, they accuse opponents of saying it or put it in the mouth of a foolish narrator. It only exists in the present as a bill of goods. In Britain, it’s what those charlatans behind the Corn Laws, the Reform Laws, or the Poor Laws would have you believe they’re creating. It’s the false promise made by those trying to root out, as The London Morning Post put it in 1828, “the rusty old customs of their forefathers” and replace them with “the delightful pleasures of change and variety.” Dreamers no less dangerous than the French revolutionaries, Edmund Burke writes, sought “a golden age, full of peace, order, and liberty.”
E. Alex Jung in Vulture:
Thank you Ellen, thank you Ellen, thank you Ellen, thank you Ellen, so much. We just love having you and your beautiful, extraordinary wife in our Scandal family. It's a good night for Shondaland up in here. It's good. So, forgive me, so I thought I was going to have a podium, so I'm going to do this the best I can without one. I am truly honored to be here and to be receiving this award. When I was told I was going to get an award for being an ally to GLAAD, it got me thinking. Being an ally means a great deal to me, and so I'm going to say some stuff. And I might be preaching to the choir, but I'm going to say it. Not just for us, but because on Monday morning, people are going to click a link to hear what that woman from Scandal said at that award show, so I think some stuff needs to be said.
There are people in this world who have the full rights of citizenship in our communities, our countries, and around the world, and then there are those of us who, to varying degrees, do not. We don't have equal access to education, to health care, and some other basic liberties like marriage, a fair voting process, fair hiring practices. Now, you would think that those of us who are kept from our full rights of citizenship would band together and fight the good fight, but history tells us that, no, often we don't. Women, poor people, people of color, people with disabilities, immigrants, gay men, lesbians, bisexuals, trans people, intersex people - we have been pitted against each other and made to feel like there are limited seats at the table for those of us who fall into the category of "other." As a result, we have become afraid of one another. We compete with one another. We judge one another. Sometimes, we betray one another. Sometimes, even within our own communities, we designate who among us is best suited to represent us, and who really shouldn't even be invited to the party. As "others," we are taught that to be successful, we must reject those other "others," or we will never belong. I know part of why I'm getting this award is because I play characters that belong to segments of society that are often pushed to the margins. Now, as a woman and as a person of color, I don't often have a choice about that - but I've also made the choice to participate in storytelling about the LGBT community. I've made the choice to play a lot of different kinds of people in a lot of different kinds of situations. In my career, I've not been afraid of inhabiting characters who are judged, and who are misunderstood, and who have not been granted full rights of citizenship as human beings.
Sabrina Tavernise in The New York Times:
In a letter to colleagues announcing his departure as the director of the National Cancer Institute, Dr. Harold Varmus, 75, quoted Mae West. “I’ve been rich and I’ve been poor,” he wrote, “and rich is better.” The line was characteristic of Dr. Varmus: playful and frank, not what one might expect from a Nobel laureate. But it also distilled a central question facing biomedical research today. Is the decline in funding that has shaken universities and research labs here to stay? If so, what does that mean for scientific research? Dr. Varmus, whose last day at the cancer institute is Tuesday, recently reflected on financial constraints in science, the fight against cancer and his own efforts to remain healthy. Our interview has been condensed and edited for space.
Where are we in the fight against cancer?
One of the major advances we’ve had as a result of cancer research is deep recognition of the complexity of cancer. It’s not one disease, it’s lots of different diseases. Every single cancer is different when you look at it on a genetic level. When the president recently announced his precision medicine initiative, a lot of it was based on the information we are getting from genetic and molecular analysis of cancer. Precision medicine depends on being much more precise about diagnosis. That allows you to target therapies more correctly and make better inferences about likely outcomes. This is the most transforming thing that’s happened. We are beginning to understand now how different sets of mutations increase or decrease the likelihood that somebody’s going to respond to a therapy. There have been some sensational successes in immunotherapies. Some use antibodies that block the immune system’s self-regulation. I think there’s tremendous promise here. Cancer goes through an evolutionary process that is complex and not fully understood. There’s a tremendous amount of basic disease research to be done.
Is that basic research getting done?
People feel their likelihood of getting funded is greater if they work on things that may have a clinical application. I’m worried about that, because I look at the big things that have changed the face of health care, and it’s usually the result of some pioneering discovery not made in conjunction with the notion of how to treat somebody. You’ve got to do clinical testing, but if we become slackers on funding the absolutely most fundamental things, we will not hit upon the real answers. To understand how a normal cell becomes a cancer cell — we can’t lose sight of that.
What’s your advice on staying healthy?
I don’t want to go to my doctor, ever. I know flesh is heir to disease. I’ve been taking aspirin every day. It can be protective against cancers, heart disease and some strokes. I believe in keeping cholesterol levels down and keeping a healthy lifestyle. I try to be on my bike or doing something every day. Not just for health — I choose the sports I like. It’s a social event as well.
Read the rest here.
Go to any market in Mexico and you’ll see piles of grasshoppers—dusted with chile powder, roasted with garlic, sprinkled with lime juice. I’ve eaten grasshoppers ground up in salsas and semi-pulverized in micheladas, their intact legs floating in the refreshing mix of beer, lime juice, and hot sauce. If you’ve ever been served chile-dusted orange slices along with a shot of mezcal—surprise! That chile powder was actually ground-up grasshoppers.
By now you’ve probably heard that entomophagy—insect eating—is in our dietary future, or at least should be. Put aside the yuck factor; insects are packed with protein, much less damaging to the environment than other livestock, and can even be killed humanely by popping them in the freezer. It’s all so crazy it just might work; the United Nations published a whole book in 2013 promoting edible insects as a solution to global food insecurity. With Earth looking down the barrel of a population of 9 billion humans, all of them hungry for protein, it makes sense to cultivate animals with 80 percent-edible bodies (crickets) instead of 40 percent (beef), and that don’t require 10 pounds of feed to get two pounds of meat (pigs). In theory.
Jonathan Guyer in Nieman Reports:
The Charlie Hebdo murders, and an attack a month later in Copenhagen aimed at Swedish cartoonist Lars Vilks, who had drawn images of the Prophet Muhammad many Muslims considered offensive, focused attention on the threat to Western satirists. But political cartoonists around the world are at risk.
In Turkey in 2014, Recep Tayyip Erdogan, who was then prime minister, brought a criminal complaint against cartoonist Musa Kart, who in the midst of a corruption investigation into Erdogan’s inner circle depicted a hologram of Erdogan standing watch as thieves stole money from a safe. Lawyers sought nine years’ imprisonment; Kart was acquitted, but Erdogan has appealed. In Ecuador, one of the best-known cartoonists in Latin America, Javier Bonilla, whose pen name is Bonil, is accused of “socioeconomic discrimination” for mocking the stutter of Agustin Delgado, a congressman from President Rafael Correa’s ruling party, and questioning his suitability for office. In Singapore, the government charged Leslie Chew with sedition for the cartoonist’s criticism of state discrimination against ethnic minorities in his strip “Demon-Cratic Singapore.” And in Malaysia, Prime Minister Najib Razak’s government also accused cartoonist Zulkiflee Anwar Haque, known as Zunar, of sedition, for a cartoon criticizing a corrupt judiciary. “He points out corruption,” says John A. Lent, editor in chief of the International Journal of Comic Art, of Zunar. “He’s what a political cartoonist is supposed to be: a watchdog on government.”
Monday, March 30, 2015
by Scott F. Aikin and Robert B. Talisse
We've noticed a strange phenomenon in contemporary political discourse. As our politics at almost every level become increasingly tribal -- devoted to circle-the-wagons campaigns and on-point messaging of carefully curated party-lines -- the dominant images of our politics are all the more dressed in the rhetoric of reason, debate, evidence, and truth. Hence a puzzle: political communication is almost exclusively conducted by means of purported debate among people with different views, yet citizens seem increasingly unable to grasp the perspectives of those with whom they politically disagree. Indeed, that there could be reasoned disagreement about politics among well-informed, rational, and sincere people is a thought that looks increasingly alien to democratic citizens. Consequently, despite all of the rhetoric, citizens show very little interest in actually talking to those with whom they disagree. In short, as appeals to reason, argument, and evidence become more common in political communication, our capacity to actually disagree -- to respond to criticisms and objections, to address considerations that countervail our views, and to identify precisely where we think our opponents have erred -- has significantly deteriorated. That's an odd combination of phenomena. Let's call it the puzzle of political debate.
To be sure, the images that dominate the landscape of political communication are mere images. Popular tropes such as "the no spin zone," "fair and balanced" reporting, "straight talk," "real clear politics," and so on are merely slogans. And, similarly, the dominant "debate" format of television news is mostly political theater. However, these images and practices prevail. And they prevail because they are effective as marketing tools. So one must ask why citizens should demand that political views come packaged in this way. Here's an answer: an unavoidable fact about us is that we need to see ourselves as reasoners, debaters, and thinkers; and we need to see our own views regarding pressing social and political matters as the products of epistemically proper practice.
Consequently, any vision of democracy that prizes public discourse and civic debate must be supplemented by a properly social epistemology, an account of the ways in which people should go about forming, maintaining, and revising their political views, and a corresponding view of how democratic political institutions can aid or obstruct these processes. In providing a normative account of such matters, a social epistemology can also serve as a critical tool for assessing our present conditions.
by Alexander Bastidas Fry
The most commonly used noun in the English language is time. Yet time is nothing more than an idea. It is an intangible concept invoked to make sense of the world such that, ‘everything doesn't happen at once,' as Einstein said. The actual most common thing in the universe is dark matter. Dark matter purports to be more than an idea. It has some kind of elusive tangible existence, yet it has never been held in anyone's hands.
The nearly invisible components of nature such as cells or atoms can only be seen with the aid of tools. If you see a cell with a microscope there exists a physical and philosophical stratification between your perception, your eye, the optics of the microscope, and the observed cell. If you see an atom on a computer monitor rendered from data from an atomic microscope then the layers of complex stratification between you and the atom are monumental. What can we truly know about the nature of things which can only be observed through tools? I would argue quite a lot. Dark matter will always remain isolated from basic human perception, but we can know it through tools or imagination.
Imagine a sea of particles gliding through you unnoticed; this is dark matter. Imagine anything, and dark matter doesn't stop for it. Dark matter doesn't interact strongly with earth, fire, wind or water. There are many particles that have elusive existences similar to dark matter, like photons or neutrinos. Unfamiliarity with these known particles doesn't hinder your ability to imagine dark matter: even these particles were not discovered without stratification between human perception and the thing itself. Imagine bits of dark matter passing through your brain at this moment, every moment, because they probably are. And if they are, but never interact with you in any way, does it matter?
On a street
by Jim Culleny
by Jalees Rehman
All obsessions can be dangerous. When I read the title "Why America's obsession with STEM education is dangerous" of Fareed Zakaria's article in the Washington Post, I assumed that he would call for more balance in education. An exclusive focus on STEM (science, technology, engineering and mathematics) is unhealthy because students miss out on the valuable knowledge that the arts and humanities teach us. I would wholeheartedly agree with such a call for balance because I believe that a comprehensive education makes us better human beings. This is the reason why I encourage discussions about literature and philosophy in my scientific laboratory. To my surprise and dismay, Zakaria did not analyze the respective strengths of liberal arts education and STEM education. Instead, his article is laced with odd clichés and misrepresentations of STEM.
Misrepresentation #1: STEM teaches technical skills instead of critical thinking and creativity
"If Americans are united in any conviction these days, it is that we urgently need to shift the country's education toward the teaching of specific, technical skills. Every month, it seems, we hear about our children's bad test scores in math and science — and about new initiatives from companies, universities or foundations to expand STEM courses (science, technology, engineering and math) and deemphasize the humanities."
"The United States has led the world in economic dynamism, innovation and entrepreneurship thanks to exactly the kind of teaching we are now told to defenestrate. A broad general education helps foster critical thinking and creativity."
Zakaria is correct when he states that a broad education fosters creativity and critical thinking, but his article portrays STEM as being primarily focused on technical skills whereas liberal education focuses on critical thinking and creativity. Zakaria's view is at odds with the goals of STEM education. As a scientist who mentors Ph.D. students in the life sciences and in engineering, my goal is to help our students become critical and creative thinkers.
by Jonathan Kujawa
Human beings are tightly bound by the limits of our intuition and imagination. Even if we grasp an idea on an intellectual level, we often struggle to internalize it to the point where it becomes a native part of our thinking. Rather like the difference between being able to comfortably converse in a foreign language by translating on the fly and being fluent enough to think in the language like a native. Or, as the philosopher Stephen Colbert explained, it's the distinction between truth and truthiness.
We struggle to imagine things much different from what we see around us. This failure leads one in four Americans to believe the Sun goes around the Earth. It means we can't truly grasp the staggering, mind-boggling length of a billion years and this fuels skepticism about evolution. And for science fiction readers it leads to raging internet arguments about whether the authors have any imagination at all.
When it comes to geometry our everyday intuition tells us that we live in the planar geometry of good old Euclid. The angles of a triangle add up to 180 degrees, parallel lines never meet, and the shortest distance between two points is a straight line. But intellectually we know we live on the sphere called Earth, and that the geometry of the sphere leads to triangles whose angles sum to 230 degrees, parallel lines which meet, and flight paths between cities which follow "Great Circles".
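The claim that spherical triangles break the 180-degree rule is easy to check for oneself. Here is a minimal sketch (plain Python, not from the article) that measures the three interior angles of the "octant" triangle whose corners are the North Pole and two points on the equator 90 degrees of longitude apart. The angle at each corner is the angle between the planes of the two great-circle arcs meeting there. All three corners turn out to be right angles, so this triangle's angles sum to 270 degrees.

```python
import math

def cross(u, v):
    # Cross product of two 3-vectors.
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(a*b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def vertex_angle(a, b, c):
    # Interior angle at vertex a between the great-circle arcs a-b and a-c,
    # computed as the angle between the planes containing those arcs.
    n1, n2 = cross(a, b), cross(a, c)
    return math.acos(dot(n1, n2) / (norm(n1) * norm(n2)))

# Octant triangle: North Pole plus two equatorial points 90 degrees apart,
# as unit vectors from the center of the sphere.
A = (0.0, 0.0, 1.0)   # North Pole
B = (1.0, 0.0, 0.0)   # on the equator
C = (0.0, 1.0, 0.0)   # on the equator, 90 degrees east of B

angles = [vertex_angle(A, B, C), vertex_angle(B, C, A), vertex_angle(C, A, B)]
total = math.degrees(sum(angles))
print(round(total, 6))  # 270.0 -- 90 degrees more than any flat triangle allows
```

The excess over 180 degrees is no accident: by Girard's theorem it is proportional to the triangle's area, which is why tiny triangles drawn on the ground look perfectly Euclidean while continent-spanning ones do not.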
Media portrayals to the contrary, mathematicians are human, too. From Euclid until the first half of the 19th century, everyone was on board with Euclidean geometry. After all, that was what their gut told them geometry should be. But then Bolyai and Lobachevsky showed us that there are more things in heaven and earth than Euclid could dream of. In two dimensions there are also hyperbolic and spherical (elliptic) geometry. In higher dimensions the possible geometries multiply like rabbits, and Einstein's theory of relativity tells us that the geometry of our universe isn't Euclidean.
How can we free our feeble minds from their Euclidean prison and develop an intuition for these new geometries?