Thursday, February 23, 2017
Elizabeth Kolbert in The New Yorker:
In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.
Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.
As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.
In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”
Matthew Sedacca in Nautilus:
In the late 1990s, a team of physicists at the Laboratori Nazionali del Gran Sasso in Italy began collecting data for DAMA/LIBRA, an experiment investigating the presence of dark matter particles. The scientists used a scintillation detector to spot the weakly interacting massive particles, known as WIMPs, thought to constitute dark matter. They reported seeing an annual modulation in the number of “hits” that the detector receives—a potential sign that the Earth is moving through the galaxy’s supposed halo of dark matter, and a detection that few, if any, other researchers could claim.
Reina Maruyama’s job, at a detector buried two kilometers deep in the ice at the South Pole, is to determine whether or not these researchers’ findings are actually valid. Previously, Maruyama worked at the South Pole to detect neutrinos, the smallest known particles. But when it came to detecting dark matter, especially using detectors buried under glacial ice, she was initially skeptical of the task. In those conditions, she “couldn’t imagine having it run and produce good physics data.”
Contrary to Maruyama’s expectations, the detector’s first run went smoothly. The team’s most recent paper, published in Physical Review D earlier this year, affirmed the South Pole as a viable location for experiments detecting dark matter. The detector, despite the conditions, kept working. At the moment, however, “DM-Ice17,” as her operation is known, is on hiatus, with the team having relocated to Yangyang, South Korea, to focus on COSINE-100, another dark matter particle detector experiment, and to continue the search for the modulation seen in DAMA/LIBRA.
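The signal all of these experiments are chasing has a simple expected shape: if the hits come from WIMPs in a galactic halo, the event rate should rise and fall over the year as Earth's orbital motion adds to and subtracts from the Sun's velocity through the halo, peaking around early June. As a rough illustration of what such an analysis looks for, here is a minimal sketch that generates a synthetic rate with the standard cosine form and fits it back out; the rate, amplitude, and noise values are invented for illustration and are not DAMA/LIBRA's or DM-Ice's actual numbers.

```python
import numpy as np
from scipy.optimize import curve_fit

# Expected annual-modulation rate, in the standard form used in the
# dark-matter literature: R(t) = R0 + Sm * cos(2*pi*(t - t0) / T),
# with period T of one year and phase t0 near early June (day ~152),
# when Earth's orbital velocity adds to the Sun's motion through the halo.
T = 365.25  # period in days

def modulation(t, R0, Sm, t0):
    """Mean rate R0 plus a cosine modulation of amplitude Sm and phase t0."""
    return R0 + Sm * np.cos(2 * np.pi * (t - t0) / T)

# Five years of synthetic daily rates. All numbers are illustrative
# placeholders, not any experiment's real values.
rng = np.random.default_rng(0)
t = np.arange(0, 5 * 365, dtype=float)
truth = modulation(t, 1.0, 0.02, 152.5)        # 2% modulation peaking in June
observed = truth + rng.normal(0.0, 0.01, t.size)

# Fit the model back out of the noisy data.
params, _ = curve_fit(modulation, t, observed, p0=[1.0, 0.01, 120.0])
R0_fit, Sm_fit, t0_fit = params
print(f"fitted R0={R0_fit:.3f}, Sm={Sm_fit:.4f}, t0={t0_fit:.1f} days")
```

The fitted phase is the telling diagnostic: plenty of mundane effects (temperature, atmospheric muon flux) also vary seasonally, so a candidate signal is judged partly by whether its peak lands where the halo model says it should.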
Samuel Hammond at the Niskanen Center:
The ideals of liberalism seem increasingly under threat these days, so it’s worth reviewing what they are, where they come from, and why it’s essential that they make a comeback. The first step is to recognize that they were not invented by some obsolete English philosopher. Rather, in their most general form, liberal principles have been rediscovered repeatedly and throughout history as practical tools for reconciling two basic social facts:
- Many of our deepest moral and metaphysical beliefs, like how to live a good life or which God to worship, are inherently contestable — reasonable people can and will disagree;
- We nonetheless all stand to benefit (on our own terms) from a social structure that enables peaceful cooperation.
Take, for instance, our separation of church and state. Yes, the Founding Fathers were cognizant of (and deeply influenced by) great liberal philosophers like John Locke, but the edict that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof” had a much more practical origin: the extraordinary religious diversity that permeated colonial America.
Video length: 11:05
Kantorowicz’s first book on the Staufer Emperor alone may have sufficed to secure his place in the scholarly pantheon. But he managed to outdo himself with the even more influential The King’s Two Bodies, which appeared in 1957, six years before his death in 1963, and has never been out of print. Yet most striking is that these two monumental books exist at opposite ends of the ideological and historiographic spectrum: the first, written in the overwrought, mystagogic style cultivated within the “circle” around the poet Stefan George, celebrates an almighty, autocratic ruler who held sway over a vast realm. Kantorowicz deliberately – and compellingly – cast his book as a political allegory meant to inspire his fellow Germans to seek and submit to such a leader should he appear. The latter book, written in English in American exile in Berkeley and Princeton, is a sober, meticulous, but no less scintillating study of an esoteric historical problem in what Kantorowicz called “political theology”. It is an immensely learned work bolstered by thousands of footnotes (the first book had none) and spiked with rebarbative terms only a pedant could embrace: “catoptromancy”, “geminate”, “caducity” and “equiparation”. When it was published, one reviewer hailed it as “a great book, perhaps the most important work in the history of medieval political thought, surely the most spectacular, of the past several generations”. Its appeal for subsequent readers was enhanced when Michel Foucault approvingly cited The King’s Two Bodies in Discipline and Punish, while Giorgio Agamben called it “one of the great texts of our age on the techniques of power”.
Verdun thus came into the war on the Western Front almost by accident. Frustrated by their inability to break through the German lines, the Allied military leaders agreed to launch coordinated attacks on all fronts—Western, Eastern, and, since May 1915, Italian—in the summer of 1916. On the Western Front this coalesced into the other massive bloodletting of 1916, the Battle of the Somme. But the Germans had looked for their own opportunities after the dismal disappointments of 1915. By the turn of 1916, the fortified area around Verdun sat like an uncomfortable elbow right where the fixed positions bent in a near 90-degree angle from the relatively ignored frontier zone into the German trenches deep inside northern France. Erich von Falkenhayn, the chief of the German general staff, wrote in his memoirs that his idea was to launch a massive assault that would force the French to defend the area and bleed their army to death in the process. In a “Christmas memorandum” he purportedly sent Kaiser Wilhelm in late 1915, the operation’s bleak goal was to start a duel of attrition. Since no copy of this “Christmas memorandum” has ever been found and no other evidence of it exists, historians have surmised that this was not Falkenhayn’s actual plan. Operational orders to the local commanders and preparations for an artillery bombardment of unprecedented power instead suggest that his real intention was to break through at Verdun and then roll up the French positions to the north and west. When this failed in a bloody stalemate, Falkenhayn likely invented the attrition plan after the fact to disguise the magnitude of his failure and justify his tremendous losses.
What was originally likened by its creator to a fluttering paloma de la paz (dove of peace) because of its white, winglike, upwardly flaring rooflines seems more like a steroidal stegosaurus that wandered onto the set of a sci-fi flick and died there. Instead of an ennobling civic concourse on the order of Grand Central or Charles Follen McKim’s endlessly lamented Pennsylvania Station, what we now have on top of the new transit facilities is an eerily dead-feeling, retro-futuristic, Space Age Gothic shopping mall with acres of highly polished, very slippery white marble flooring like some urban tundra. Formally known as Westfield World Trade Center, it is filled with the same predictable mix of chain retailers one can find in countless airports worldwide: Banana Republic, Hugo Boss, Breitling, Dior, and on through the global label alphabet. (The Westfield Corporation is an Australian-based British-American shopping center company.) Far from this being the “exhilarating nave of a genuine people’s cathedral,” as Paul Goldberger claimed in Vanity Fair, Calatrava’s superfluous shopping shrine is merely what the Germans call a Konsumtempel (temple of consumption), and a generic one at that.
Still to come are 2 World Trade Center by the Bjarke Ingels Group (BIG) and 3 World Trade Center by the office of Richard Rogers. Plans are doubtful for a putative 5 World Trade Center (to replace the former Deutsche Bank Building, which was irreparably damaged by debris from the collapse of the Twin Towers and laboriously dismantled), and no architect has been selected. There will be no 6 World Trade Center to replace that eponymous eight-story component of Yamasaki’s original five-building World Trade Center ensemble, also destroyed on September 11.
Nisha Gaind in Nature:
South Korea is likely to become the first country where life expectancy will exceed 90 years, according to a study in The Lancet. Researchers led by public-health researcher Majid Ezzati at Imperial College London have projected how life expectancy will change in 35 developed countries by 2030, using data from the World Health Organization and a suite of 21 statistical models they developed. Life expectancy is expected to increase in all 35 countries, in keeping with steady progress in recent decades, the team found. But it is South Korean women who will be living longest by 2030: there is a nearly 60% chance that their life expectancy at birth will exceed 90 years by that time, the team calculates. Girls born in the country that year can expect to live, on average, to nearly 91, and boys to 84, the highest in the world for both sexes.
The nation's rapid improvement in life expectancy — the country was ranked twenty-ninth for women in 1985 — is probably down to overall improvements in economic status and child nutrition, the study notes, among other factors. South Koreans also have relatively equal access to health care, lower blood pressure than people in Western countries and low rates of smoking among women.
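That "nearly 60% chance" is a statement about a spread of projections rather than a single forecast: when projections come from an ensemble of models, the probability that life expectancy exceeds 90 is simply the share of the combined projection draws that land above 90. Below is a minimal sketch of that logic; the per-model estimates and spreads are fabricated for illustration and bear no relation to the study's actual 21 models or data.

```python
import numpy as np

# Sketch of how an ensemble forecast yields "a 60% chance life expectancy
# exceeds 90". Each draw stands in for one projection sampled from the
# combined ensemble; all values here are fabricated for illustration.
rng = np.random.default_rng(1)

# Pretend each of 21 models contributes draws centered on its own
# (hypothetical) projection of female life expectancy at birth in 2030.
model_means = rng.normal(90.5, 1.0, 21)
draws = np.concatenate([rng.normal(m, 1.5, 1000) for m in model_means])

p_exceed_90 = np.mean(draws > 90.0)
print(f"ensemble median: {np.median(draws):.1f} years")
print(f"P(life expectancy > 90) = {p_exceed_90:.0%}")
```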
Jen Doll in The Atlantic:
In 1965, 11 years after the Supreme Court outlawed segregated schools, Nancy Larrick wrote an article titled "The All-White World of Children's Books" for the Saturday Review. Marc Aronson, author of Race: A History Beyond Black and White, described that piece to The Atlantic Wire as "a call to arms." Larrick had been inspired to write the piece, which criticized the omission of black characters in children's literature, after a 5-year-old black girl asked why all the kids in the books she read were white. According to Larrick's survey of trade books over a three-year period, "only four-fifths of one percent" of those works included contemporary black Americans as characters. Further, the characterizations of pre-World War II blacks consisted of slaves, menial workers, or sharecroppers. Via Reading Is Fundamental, "'Across the country,' she stated in that piece, '6,340,000 nonwhite children are learning to read and to understand the American way of life in books which either omit them entirely or scarcely mention them.'"
…Myers shared the story of an 8-year-old girl who came up to him praising his picture book about a dog that plays the blues. “I said, ‘You like the blues?’” he told us. “She said no. I said, ‘You like dogs?’ She said no. I said, ‘What did you like?’ She said, ‘It looks like me.’ If you have a black kid on the cover, black kids will pick it up faster.” The flip side of this is a brutal one: What does it mean when kids don’t see themselves on, or in, the books intended for them? As Myers told us, “I was asked by some teachers, ‘What’s the effect of video games on reading?’ At first I was thinking it’s not that much, but a video game will give you more self-esteem than a book [especially a book that you don’t see yourself in], so you go for the video games. Air Jordans will give you even more esteem. At 13 or 14, you’ve assessed yourself. You know if you’re good-looking, you know if you’re hip. So many black kids are looking at themselves and saying, ‘I ain’t much,’” he said. “This is why you need diversity.”
More here. (Note: At least one post throughout February will be in honor of Black History Month)
After Sehwan —dedicated to Sheema Kermani—
because she went. and Ahmed Faiz’s Aaj Bazaar mein
Pa ba JollaN Chalo
Unwept tears, inner torments
Hidden desires, silent accusations
Flaunt your fetters in the street
Arms aloft, enraptured, intoxicated
Disheveled, blood stained
Lovers are yearning for your love
Tyrant and crowd
Slings and stones
Sorrows and failures
Who else is left to love
Who else is left to fight
Who else is left to die
Arise and go
For love’s honor
by Anjum Altaf
Sehwan is home to the shrine of a major Sufi saint in Sindh, where a suicide bombing last week killed almost 100 devotees.
Sheema Kermani is a symbol of defiance in Pakistan as a dancer who has continued to perform in public all through the rise of fundamentalism and suppression. She went to the shrine to dance right after the bombing.
Wednesday, February 22, 2017
NASA Press Release:
NASA's Spitzer Space Telescope has revealed the first known system of seven Earth-size planets around a single star. Three of these planets are firmly located in the habitable zone, the area around the parent star where a rocky planet is most likely to have liquid water.
The discovery sets a new record for the greatest number of habitable-zone planets found around a single star outside our solar system. All seven of these planets could have liquid water – key to life as we know it – under the right atmospheric conditions, but the chances are highest with the three in the habitable zone.
“This discovery could be a significant piece in the puzzle of finding habitable environments, places that are conducive to life,” said Thomas Zurbuchen, associate administrator of the agency’s Science Mission Directorate in Washington. “Answering the question ‘are we alone’ is a top science priority and finding so many planets like these for the first time in the habitable zone is a remarkable step forward toward that goal.”
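For a rough sense of where a habitable zone sits, a common back-of-the-envelope method is to scale orbital distance by the square root of stellar luminosity, since the flux a planet receives falls off as L/d². The sketch below applies that scaling to the star in question (TRAPPIST-1, an ultra-cool dwarf), using illustrative flux limits and an approximate published luminosity; none of these numbers come from the press release itself.

```python
import math

# Back-of-the-envelope habitable-zone estimate. A planet at distance d (AU)
# from a star of luminosity L (in solar units) receives flux S = L / d**2
# (in units of Earth's insolation), so a zone defined by flux limits
# [s_outer, s_inner] spans d = sqrt(L / S). The flux limits and the
# luminosity below are rough illustrative values, not NASA's figures.

def habitable_zone(L, s_inner=1.1, s_outer=0.53):
    """Return (inner, outer) habitable-zone edges in AU for luminosity L."""
    return math.sqrt(L / s_inner), math.sqrt(L / s_outer)

# TRAPPIST-1's luminosity is roughly 0.0005 of the Sun's (an approximate
# literature value, used here purely for illustration).
inner, outer = habitable_zone(0.0005)
print(f"approximate habitable zone: {inner:.3f} - {outer:.3f} AU")

# The Sun, for comparison, gives roughly 0.95 - 1.37 AU with the same limits.
print("Sun: %.2f - %.2f AU" % habitable_zone(1.0))
```

The striking consequence of the square-root scaling is how close such a zone sits to a dim star: hundredths of an astronomical unit, well inside where Mercury orbits the Sun.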
Walter Scheidel in The Atlantic:
Calls to make America great again hark back to a time when income inequality receded even as the economy boomed and the middle class expanded. Yet it is all too easy to forget just how deeply this newfound equality was rooted in the cataclysm of the world wars.
The pressures of total war became a uniquely powerful catalyst of equalizing reform, spurring unionization, extensions of voting rights, and the creation of the welfare state. During and after wartime, aggressive government intervention in the private sector and disruptions to capital holdings wiped out upper-class wealth and funneled resources to workers; even in countries that escaped physical devastation and crippling inflation, marginal tax rates surged upward. Concentrated for the most part between 1914 and 1945, this “Great Compression” (as economists call it) of inequality took several more decades to fully run its course across the developed world until the 1970s and 1980s, when it stalled and began to go into reverse.
This equalizing was a rare outcome in modern times but by no means unique over the long run of history. Inequality has been written into the DNA of civilization ever since humans first settled down to farm the land. Throughout history, only massive, violent shocks that upended the established order proved powerful enough to flatten disparities in income and wealth. They appeared in four different guises: mass-mobilization warfare, violent and transformative revolutions, state collapse, and catastrophic epidemics. Hundreds of millions perished in their wake, and by the time these crises had passed, the gap between rich and poor had shrunk.
Sheila Kaplan in Stat:
“My daughter had a seizure, lost consciousness, and stopped breathing about 30 minutes after I gave her three Hyland’s Teething Tablets,” the mother later told the Food and Drug Administration. “She had to receive mouth-to-mouth CPR to resume breathing and was brought to the hospital.”
The company, Hyland’s, promotes “safe, effective, and natural health solutions” that appeal to parents seeking alternative treatments. But the agency would soon hear much more about Hyland’s teething products. Staff at the FDA would come to consider Case 7682299 one of the luckier outcomes.
A review of FDA records obtained by STAT under the Freedom of Information Act paints a far grimmer picture: Babies who were given Hyland’s teething products turned blue and died. Babies had repeated seizures. Babies became delirious. Babies were airlifted to the hospital, where emergency room staff tried to figure out what had caused their legs and arms to start twitching.
Ratik Asokan in The Baffler:
Ratik Asokan: Age of Anger feels like a continuation of a project that began with your first book, Butter Chicken in Ludhiana, and which you’ve approached from various angles—memoir, fiction, reportage, now intellectual history—since then. Perhaps its subject can be described as “latecomers to modernity”?
Pankaj Mishra: As a writer, you can’t afford to become too self-conscious. You can’t become too aware of your origins or background. Because that impairs your capacity to think spontaneously. There are certain crucial experiences we have early on that set our trajectory. It’s for other people to identify them . . .
You’re right in that this particular quest started twenty years ago, with Butter Chicken in Ludhiana, which is an account of the provinces in India. Now looking back—I haven’t looked at the book for a long time—I think, and this is something I’ve been thinking of writing about, that something missing from much political, and literary-intellectual discourse, at least in the last three decades or so, is the experience of the provincial. Of the outsider from the provinces.
I have been insisting all along that that experience is very crucial, that it’s going to shape our futures, especially our cultural future. And what we are seeing today is a political assertion of people who did not really have a voice in our political and literary discourses. That’s one reason why we find ourselves so politically and intellectually helpless before contemporary phenomena. We simply had no inkling that people in these places who felt ressentiment—felt excluded, marginalized, disdained, scorned—might at some point strike back by electing figures like Modi and Trump.
Video length: 9:55
Sanford Pinsker in VQR:
Enter the new generation of black intellectuals—everyone from Henry Louis Gates, Jr., Stephen Carter, and Cornel West to Shelby Steele, Orlando Patterson and Stanley Crouch. Taken together, they represent a direction that began 40 years ago with the Brown vs. Board of Education decision and continued through the civil rights movement. In short, the black intellectual voices now speaking out from our most prestigious universities are, as the title of Stephen Carter’s 1991 book would have it, “affirmative action babies.” From token representation in the 1950’s and 60’s—when, say, Harvard typically admitted ten blacks per class—enrollments have fairly soared as Afro-American studies programs took root (often in response to student protest) and universities slowly but surely embraced a new educational paradigm based on race, class, and gender. In the process, culture became, well, one of those words. It was once spelt with a capital letter, defined by Matthew Arnold as “the best that has been thought and said,” and generally agreed to be a good thing. Now, many in the academy were not so sure, partly because selecting the “best” invariably means leaving out the “least,” and partly because culture itself often seems to be a suspect operation. Rather than “sweetness and light” (the title of the Arnold essay in which his famous definition appeared), “culture”—yet another term destined to be surrounded by inverted commas—stands for everything that first bullies and then silences minority voices.
No one would seriously argue with the proposition that black intellectuals have played a major role in the culture wars that define our time. Indeed, some would insist that they are what the New York intellectuals once were—namely, activist scholars who bring fresh blood and new perspectives to our understanding of American culture. At the same time, however, there are important differences. Regardless of how much the New York intellectuals were divided by temperament and later, by politics, they shared a fund of common experience that, for want of a better term, might be called “immigrant gratitude.” America, and perhaps more to the point, American culture, offered an escape from the hardships and parochial limitations that had narrowly defined the lives of their immigrant parents. Granted, the giddy possibility of self-transformation did not come without cost, and it would take a long arc indeed before many would rediscover the Jewishness from which they had fled. Not surprisingly, the conflict was the very stuff of which intellectuals, rather than scholars, are made, for as Daniel Bell once shrewdly observed, the scholar finds his place within an established tradition and adds his tiny piece to the mosaic. By contrast, the intellectual begins with “HIS experience, HIS individual perceptions of the world, HIS privileges and deprivations, and judges the world by these sensibilities.”
More here. (Note: At least one post throughout February will be in honor of Black History Month)
“Ready to go back in time?” the guy sitting beside me says, rather dramatically. He’s from Long Island and is also an amateur. We’re in a dusty Suburban pitching itself headfirst down a sharp slope into the Badlands. Through the cracked windshield, I see a moonscape eroded out of the prairie: a mottled topography of red, brown, black, yellow, green, and gray studded with naked buttes—the sediments of the sea, silt and clay deposited and then worn down, epochs later, by water and wind. In places, the buttes are scorched and collapsed by burning coal turned into ash. Nonnative sweet yellow clover has choked out the prairie grass that usually grows between the desolate washouts and draws; in parts, the clover stands waist-high. Above, sparse thickets of cottonwoods, maybe a green ash, a few ponderosa pines. Below, baked beaches where alien outcrops of rocks bloom in strangled, man-sized shapes. A landscape of hard eternity, home to rattlers, bull snakes, prairie dogs, pheasants, foxes, coyotes, pronghorns, bobcats, mule deer, minks, and ever-thirsty toads. My companions and I are dressed in paleontologist chic: tan pants, wide-brim hat, long-sleeve button-down, boots, bandanna. As our vehicle lumbers down the hill to the desolate floor, we pass a rock layer known as the Cretaceous/Tertiary (K/T) boundary, a thin line of tan clay beneath a band of coal that pinpoints the “sudden” geological moment when the dinosaurs disappeared.
These aren’t the Badlands of South Dakota, which are thirty million years younger and far more popular. These are the Badlands that in 1864 Brigadier General Alfred Sully of the US Cavalry, busy marauding against the Sioux, described as “hell with the fires out.”
Ellison is co-chair of the Congressional Progressive Caucus, the putative left-wing answer to the brinksmen of the Freedom Caucus on the right, and he was an early and fervent supporter of Sanders’s Presidential campaign. Like Sanders, he consistently opposed the Trans-Pacific Partnership, a trade deal sought by the Obama White House in its final two years which was attacked by populists in both parties. (President Donald Trump recently withdrew the U.S. from the T.P.P.) Ellison announced his candidacy for the D.N.C. chairmanship six days after the Presidential election. Sanders and Senator Elizabeth Warren, of Massachusetts, predictably endorsed him—but so did establishment figures, such as Senate Minority Leader Charles Schumer, and his predecessor, Harry Reid. One of the early objectives of Schumer’s leadership has been to placate the increasingly powerful Sanders, whom he made a member of his leadership team, and Schumer has said that he endorsed Ellison because Sanders recommended him. This may have been a canny bit of political maneuvering, but it also indicated to Sanders’s supporters that the populist wing of the Democratic Party was poised to lead the opposition against Trump.
The race for the chair has often echoed the acrimony and confusion of the Presidential primaries. Ten candidates are competing for the job, though few have a national profile. Ellison’s chief rival, Thomas E. Perez, was formerly Barack Obama’s Labor Secretary. Perez has consolidated support from much of the Democratic establishment, and increasingly appears to have seized the role of front-runner. Pete Buttigieg, the young mayor of South Bend, Indiana, has positioned himself as a compromise candidate, saying, of the 2016 Democratic primary race, “I don’t know why we’d want to live through it a second time.”
One of the many complications that make the Bruegels the most confusing clan in art is the letter H. Pieter Bruegel the Elder, the founder of the dynasty and its greatest artist, was the painter of such celebrated works as The Hunters in the Snow (1565) and The Tower of Babel (1563). Contrary to the elegance and elevating tenets of the Italian Renaissance, he made the peasant life of the Low Countries his subject, in all its scatological, rambunctious and therefore human detail. In 1559 he dropped the H in his surname and started signing in Roman capital letters – Brueghel becoming the rather more stately Bruegel.
Bruegel had two sons, Pieter and Jan, aged four and one at the time of his death in 1569. Both became painters, too, and as their careers took off Pieter the Younger reinstated the H his father had discarded (though in later life, to add to the disorder, he reversed the order of the U and E) and it remained the moniker of the innumerable painting Brueghels who followed. Rather more confusing than this alphabet jiggery-pokery, though, is the sheer number of painters in the dynasty – some 15 blood relations over the course of 150 years, before a plethora of apprentices, collaborators and intermarriages is factored in.
"Nobody asked (the candidate) what makes America great?
What are the metrics?” —Jon Stewart
Photograph, Maryland Agricultural
College Livestock Show, 1924
Blond, wholesome, serene,
their white shirtsleeves rolled,
these boys in white ducks
keep sleek black hogs at their feet,
hogs cleaner than licorice sticks in the sun.
Five haltered calves are also held
in tandem while their names
and pedigrees are said aloud.
Mostly I think about
the unseen mud and manure, flies
and screwworms, that connect these boys
and their wildest hopes
poised radiant between two wars
while just out of reach of the lens
in their stained bib overalls
stand the farm laborers
greasy with sweat
and undoubtedly black.
by Maxine Kumin
Viking Penguin Books, 1989
Tuesday, February 21, 2017
Paul Freedman in DelanceyPlace:
In the mid-1800s, unaccompanied women in America were generally not allowed to dine at restaurants: "Midday dining presented a challenge for women too busy or too far from home to return there for lunch. They might be in the company of other women or alone, but at any rate not escorted by men who were occupied with work and work-related socializing; men had their own luncheon habits. In the nineteenth-century United States, men made the rules about public dining and admitted women to restaurants on sufferance, according to a complex series of arrangements. Different practices governed the two main meals of the day.
"Restaurants depended economically on women accompanying men at the evening meal. Lunch, however, was segregated by gender and involved a series of problems, according to the social customs of the nineteenth century. In the grand and even not-so-grand metropolis, men were increasingly likely to work at some distance from home and to stay near their workplace for the midday meal. The point at which women too absented themselves from the house created a demand for their sustenance. The growth of cities and the creation of specialized shopping districts meant that it was often inconvenient for women as well as men to return home for lunch.
"The public rooms at fancy restaurants were usually reserved at lunch for men only, but some of them allowed women to have lunch in private dining spaces. In the era before Prohibition, bars offered free food, which, along with a crowded and boisterous atmosphere, encouraged demand for drink. Free-lunch bars were hopelessly inappropriate spaces for respectable women, as alcohol-driven conviviality was inevitably coarse -- the antithesis of what was considered ladylike.
Tim Stanley in The Telegraph:
Can a white person ever really understand how a black person sees the world? Back in 1959, six years before Martin Luther King marched for civil rights in Selma, one man tried. A white Texan writer called John Howard Griffin walked into a doctor’s office in New Orleans and asked him to turn his skin colour black. Griffin took oral medication and was bombarded with ultraviolet rays; he cut off his hair to hide an absence of curls and shaved the back of his hands. Then he went on a tour of the Deep South. The result was a bestselling book called Black Like Me, which is still regarded as an American classic. Griffin wanted to test the claim that although the southern United States was segregated it was essentially peaceful and just – that the two races were separate but equal.
What he discovered tells us a lot about the subtleties of racism. In 1959, unlike today, it was legally instituted. But, like today, it also flourished at the personal level – in hostility, suspicion, fear and even self-loathing. Griffin was an extraordinary man. Born in Dallas in 1920, he went to school in France and joined the French Resistance after Hitler invaded. Griffin helped Jewish children escape to England before fleeing to America. While serving in the US army, he was blinded by shrapnel. Griffin took it all in his stride – he married, had children and converted to Catholicism. Griffin’s strong personal faith reminds us that much of the civil rights movement was in fact a Christian mission – made possible, in this instance, by what seemed like a miracle. Walking around his yard one afternoon, Griffin suddenly saw red swirls where hitherto there was only darkness. Within months his sight had returned. And it was a man determined to make the most of his second chance who hit upon the novel idea of crossing the colour line. Those reading the book today might regard Griffin’s attempt to change his colour as akin to blacking up. Certainly, the transformation was awkward. Griffin may well have had dark skin but he retained his classically Caucasian features, and one suspects that the awkwardness of his encounters with some black people was down to them wondering if he was one of them or just horribly sunburnt.
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Shehryar Fazli in the Los Angeles Review of Books:
Timothy B. Tyson has written a concise and urgent book about Emmett Till’s 1955 murder in a small Mississippi town, a crime that galvanized civil rights defenders into a long, hard struggle against the Jim Crow regime in the South, and inspired an outraged Rosa Parks to defy segregation laws on a Montgomery city bus. It’s a macabre story of inhumanity and injustice, but also of resistance and unity across a divided nation.
The facts may be known, but bear repeating. Fourteen-year-old Emmett, during a visit from Chicago to his family’s hometown of Money, Mississippi, allegedly whistled at a white woman, Carolyn Bryant, in a grocery store. After Bryant claimed, untruthfully, that the black boy had also grabbed her, her husband Roy Bryant and his half-brother J. W. Milam abducted Emmett from his grand-uncle’s house, beat, mutilated, and shot him, then dumped his body into the Tallahatchie River, from which it was recovered three days later. Just another lynching in the Jim Crow South … until it wasn’t. Had it not been for the specific time and place, it would likely never have become arguably the United States’s most consequential hate crime, the first act in a drama of reckoning that tested a nation’s moral fiber.
Expertly, Tyson demarcates and mines the territory of Till’s murder, including why the killers assumed it would go ignored; of the trial, which indeed concluded with a not-guilty verdict; and of the countrywide reaction to both. Yet his analysis of the big national moment does not upstage his attention to the Till family’s unimaginable personal loss.
Video length: 1:00:17
Video length: 15:48
Victoria Jaggard in National Geographic:
Creatures that thrive on iron, sulfur, and other chemicals have been found trapped inside giant crystals deep in a Mexican cave. The microbial life-forms are most likely new to science, and if the researchers who found them are correct, the organisms are still viable even though they have been dormant for tens of thousands of years.
If verified, the discovery adds to evidence that microbial life on Earth can endure harsher conditions in isolated places than scientists previously thought possible.
“These organisms have been dormant but viable for geologically significant periods of time, and they can be released due to other geological processes,” says NASA Astrobiology Institute director Penelope Boston, who announced the find today at a meeting of the American Association for the Advancement of Science. “This has profound effects on how we try to understand the evolutionary history of microbial life on this planet.”