Ibram X. Kendi in The New York Times:
Many Americans might not know the more polemical side of race writing in our history. The canon of African-American literature is well established. Zora Neale Hurston, Richard Wright, James Baldwin are familiar figures. Far less so are Samuel Morton (champion of the obsolete theory of polygenesis) and Thomas Dixon (author of novels romanticizing Klan violence). It is tempting to think that the influence of those dusty polemics ebbed as the dust accumulated. But their legacy persists, freshly shaping much of our racial discourse.
On the occasion of Black History Month, I’ve selected the most influential books on race and the black experience published in the United States for each decade of the nation’s existence — a history of race through ideas, arranged chronologically on the shelf. (In many cases, I’ve added a complementary work, noted with an asterisk.) Each of these books was either published first in the United States or widely read by Americans. They inspired — and sometimes ended — the fiercest debates of their times: debates over slavery, segregation, mass incarceration. They offered racist explanations for inequities, and antiracist correctives. Some — the poems of Phillis Wheatley, the memoir of Frederick Douglass — stand literature’s test of time. Others have been roundly debunked by science, by data, by human experience. No list can ever be comprehensive, and “most influential” by no means signifies “best.” But I would argue that together, these works tell the history of anti-black racism in the United States as painfully, as eloquently, as disturbingly as words can. In many ways, they also tell its present.
“Poems on Various Subjects, Religious and Moral,” by Phillis Wheatley (1773)
No book during the Revolutionary era stirred more debates over slavery than this first-ever book by an African-American woman. Assimilationists and abolitionists exhibited Wheatley and her poetry as proof that an “uncultivated barbarian from Africa” could be civilized, that enslaved Africans “may be refin’d, and join th’ angelic train” of European civilization and human freedom. Enslavers disagreed, and lashed out at Wheatley’s “Poems.”
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Paul Buchheit in AlterNet:
Yes, inequality is getting worse every year. In early 2016 Oxfam reported that just 62 individuals had the same wealth as the bottom half of humanity. About a year later Oxfam reported that just 8 men had the same wealth as the world's bottom half. Based on the same methodology and data sources used by Oxfam, that number is now down to 6.
How to account for the dramatic increase in the most flagrant and perverse of extreme inequalities? Two well-documented reasons: (1) The poorest half (and more) of the world has continued to lose wealth; and (2) The VERY richest individuals — especially the top thousand or so — continue to add billions of dollars to their massive fortunes.
Inequality deniers and apologists say the Oxfam methodology is flawed, but they're missing the big picture. Whether it's 6 individuals or 62 or 1,000 doesn't really matter. The data from the Credit Suisse Global Wealth Databook (GWD) and the Forbes Billionaire List provide the best available tools to make it clear that inequality is extreme and pathological and getting worse every year.
Brent Simpson, Robb Willer & Ashley Harrell in Nature:
The threat of free-riding makes the marshalling of cooperation from group members a fundamental challenge of social life. Where classical social science theory saw the enforcement of moral boundaries as a critical way by which group members regulate one another’s self-interest and build cooperation, moral judgments have most often been studied as processes internal to individuals. Here we investigate how the interpersonal expression of positive and negative moral judgments encourages cooperation in groups and prosocial behavior between group members. In a laboratory experiment, groups whose members could make moral judgments achieved greater cooperation than groups with no capacity to sanction, levels comparable to those of groups featuring costly material sanctions. In addition, members of moral judgment groups subsequently showed more interpersonal trust, trustworthiness, and generosity than all other groups. These findings extend prior work on peer enforcement, highlighting how the enforcement of moral boundaries offers an efficient solution to cooperation problems and promotes prosocial behavior between group members.
Heather Jones in the Irish Times:
“I’ve found another one!” My mother is delighted, full of excitement, cup of tea in hand at our small kitchen table in Dublin, overloaded with notes and books. She has long had a passionate interest in Irish history but this is her biggest project yet – an investigation into Irish Protestant nationalists who contributed to the Easter Rising.
She has a hunch that there were more of them than anyone has realised. I know she is writing a book for the centenary. It has become all-consuming: for several years she has scoured archives and libraries and interviewed descendants of Protestant rebels, including Garret FitzGerald, whose rebel mother was Presbyterian. Each document seam uncovers a new lead, a fresh name. She feels a need to reinsert these lives that she believes have been overlooked into the history of the Rising, especially the working-class Protestants of Dublin, long neglected.
I don’t dare ask her to what extent it is a search for self. From a practising Church of Ireland family, of very humble Dublin and Wicklow origins, my mother was a scholarship girl, educated through Irish in Coláiste Moibhí, the training college established by the State to produce Gaelic-speaking, nationalist teachers for Protestant primary schools. Devout and liberal, patriotic and pacifist, she defies easy stereotypes, just like the lives she is researching.
Five months later she is dying.
More here. [Thanks to Kris Kotarski.]
Maximillian Alvarez at The Baffler:
Second, in just about every takedown or defense of highfalutin academic jargon, it’s generally taken for granted that such jargon is just part of the job academics do, but when it comes to determining the role of “the academic” in society, things get messier. The arguments make it seem like the main choice facing academics involves determining to what degree they might deign to display some civic-mindedness and try to translate their findings into something that will somehow engage and benefit “the public.” But all such arguments tend to rest on unchallenged assumptions about academics in general, and these assumptions are often the biggest problem.
There’s a huge difference, for instance, between defending academic jargon as such and defending academic jargon as the typical academic so often uses it. There’s likewise a huge difference between justifying jargon when it is absolutely necessary (when all other available terms simply do not account for the depth or specificity of the thing you’re addressing) and pretending that jargon is always justified when academics use it. And there’s a huge difference between jargon as a necessarily difficult tool required for the academic work of tackling difficult concepts, and jargon as something used by tools simply to prove they’re academics.
Steven Poole at The New Statesman:
In the preface to his new book, the philosopher Daniel Dennett announces proudly that what we are about to read is “the sketch, the backbone, of the best scientific theory to date of how our minds came into existence”. By the end, the reader may consider it more scribble than spine – at least as far as an account of the origins of human consciousness goes. But this is still a superb book about evolution, engineering, information and design. It ranges from neuroscience to nesting birds, from computing theory to jazz, and there is something fascinating on every page.
The term “design” has a bad reputation in biology because it has been co-opted by creationists disguised as theorists of “intelligent design”. Nature is the blind watchmaker (in Richard Dawkins’s phrase), dumbly building remarkable structures through a process of random accretion and winnowing over vast spans of time. Nonetheless, Dennett argues stylishly, asking “design” questions about evolution shouldn’t be taboo, because “biology is reverse engineering”: asking what some phenomenon or structure is for is an excellent way to understand how it might have arisen.
Joshua Sperling at Guernica:
In the early 1960s, Berger and his wife, the translator Anya Bostock, left London for a suburb of Geneva, where he wrote in relative obscurity for several years, publishing two further novels that attracted little attention. “It is a struggle,” he explained in a letter to an older novelist, “because I made so many enemies as an art-critic; I have now offended the sense of order by abandoning art criticism; and I have exiled myself here seeing nobody except a few cherished but powerless friends. But meanwhile one must write and hope.”
The silence of exile was in fact a preparation for the great flowering of Berger’s middle period. As the generation of ‘68 spread its wings and the New Left seemed to promise revolution, Berger’s work broke free of all previous models. Between 1965 and 1975 he produced an awe-inspiring array of forms: photo-texts, broadcasts, novels, documentaries, feature films, essays. Many of these were done in collaboration. With the Swiss photographer, Jean Mohr, he made A Fortunate Man, a documentary portrait in words and images of a country doctor in the Forest of Dean, and A Seventh Man, a kind of modernist visual essay about the courage and perseverance of migrant laborers in Europe. (This latter project was the book Berger always said he was proudest of, and in a 2010 reprinting he mused that sometimes a book, unlike its authors, can grow more of-the-moment with time, a statement that itself has only grown truer in recent years as the migrant crisis reaches new levels.) Berger also worked with Alain Tanner on several films, including Jonah who will be 25 in the year 2000, an ensemble comedy that became a touchstone of post-‘68 optimism.
This photograph was captured on the way to the Capitol Building in Washington D.C. during the 20th Anniversary of the Million Man March. The young man's face is one of determination, strength, and a burning desire for change. Holding the Pan-African flag high above his head and looking into the distance, he is a symbol of hope, unity, and future prosperity.
“My two great loves are physics and New Mexico. It’s a pity they can’t be combined.”
I could have loved you
wrapped my legs tightly
around your white buttocks
to keep you thinly against me
for water from mountain streams
for the journey to Jornada del Muerto
for the creation of Trinity
I would have met you along
the ridge of Frijoles Canyon
caught breathless by your intensity
and sad eyes
your boyish dishevelment
would have seduced me
to seduce you
just clumsily enough
to surprise and charm you
away from quantum mechanics
the enigmatic half-life of identical nuclei
and the gray uniform houses of Los Alamos
at least for awhile
until the 14th passed and the 15th
and the 16th
defying the test that would test us all
Before the red dawn
I would have awakened beside you
untangled myself in the narrow bed
to slide on top of you
and whisper only for you
mi Nuevo Méjico
Read more »
Betsy Schlabach in AAIHS:
In 1986, African American poet Gwendolyn Brooks wrote of her lifelong friend Langston Hughes: “WHAT was Langston Hughes? An overwhelmer. Long ago I felt it was proper to say he had ‘a long reach / strong speech / remedial fears / muscular tears.’ I gave him titles: ‘Helmsman, hatchet, headlight.’ And I suggested: ‘See / one restless in the exotic time! and ever, / till the air is cured of its fever.’” Hughes and Brooks were both in Chicago’s South Side neighborhood, Bronzeville, at mid-twentieth century. Their friendship and poetry showcased a critical love of the black experience. Brooks described Hughes’ mission:
He judged himself the adequate appreciator of his own people, and he judged blacks ‘the most wonderful people in the world.’ He wanted to celebrate them in his poetry, fiction, essays and plays. He wanted to record their strengths, their resiliency, courage, humor.
Hughes first visited Chicago in 1918 as a sophomore in high school. His mother worked as a maid for a milliner in the Loop. Hughes took a job delivering hats, which exposed him to many different neighborhoods near downtown Chicago. On Sundays, he would stroll along the South Side, which he said was more exciting than anything he had ever seen before. “Midnight was like day,” he exclaimed as he explored its crowded theaters and cabarets. It was vibrant, yes, but it was violent as well. He experienced racial violence firsthand one day when, after wandering into a Polish neighborhood, he was assaulted physically and verbally by a gang of white boys. He found Chicago “vast, ugly, brutal, and monotonous.” Brooks met Hughes when she was very young and grew to know him well enough “to observe that when subjected to offense and icy treatment because of his race, he was capable of jagged anger – and vengeance, instant or retroactive. And I have letters from him that reveal he could respond with real rage when he felt he was treated cruelly by other people.” Brooks confirmed that Bronzeville offered a young poet like herself plenty of material. She wrote: “If you wanted a poem you only had to look out a window. There was material always walking, running, screaming, or singing.” Born in Topeka, Kansas, but raised in Bronzeville, Brooks published A Street in Bronzeville in 1945. Her work bravely confronted segregation, abortion, poverty, lynching, class, restrictive gender roles, and childhood dreams. Hers was a very honest portrait of Chicago.
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Elizabeth Kolbert in The New Yorker:
In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.
Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.
As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.
In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”
Matthew Sedacca in Nautilus:
In the late 1990s, a team of physicists at the Laboratori Nazionali del Gran Sasso in Italy began collecting data for DAMA/LIBRA, an experiment investigating the presence of dark matter particles. The scientists used a scintillation detector to spot the weakly interactive massive particles, known as WIMPs, thought to constitute dark matter. They reported seeing an annual modulation in the number of “hits” that the detector receives. This was a potential sign that the Earth is moving through the galaxy’s supposed halo of dark matter—something that few, if any, researchers could claim.
Reina Maruyama’s job, at a detector buried two kilometers deep at the South Pole, is to determine whether or not these researchers’ findings are actually valid. Previously, Maruyama worked at the South Pole to detect neutrinos, among the smallest known particles. But when it came to detecting dark matter, especially using detectors buried under glacial ice, she was initially skeptical of the task. In those conditions, she “couldn’t imagine having it run and produce good physics data.”
Contrary to Maruyama’s expectations, the detector’s first run went smoothly. Their most recent paper, published in Physical Review D earlier this year, affirmed the South Pole as a viable location for experiments detecting dark matter. The detector, despite the conditions, kept working. At the moment, however, “DM-Ice17,” as her operation is known, is on hiatus, with the team having relocated to Yangyang, South Korea, to focus on COSINE-100, another dark matter particle detector experiment, and continue the search for the modulation seen in DAMA/LIBRA.
Samuel Hammond at the Niskanen Center:
The ideals of liberalism seem increasingly under threat these days, so it’s worth reviewing what they are, where they come from, and why it’s essential that they make a comeback (a PDF version of this essay is available here). The first step is to recognize that they were not invented by some obsolete English philosopher. Rather, in their most general form, liberal principles have been rediscovered repeatedly and throughout history as practical tools for reconciling two basic social facts:
- Many of our deepest moral and metaphysical beliefs, like how to live a good life or which God to worship, are inherently contestable — reasonable people can and will disagree;
- We nonetheless all stand to benefit (on our own terms) from a social structure that enables peaceful cooperation.
Take, for instance, our separation of church and state. Yes, the Founding Fathers were cognizant of (and deeply influenced by) great liberal philosophers like John Locke, but the edict that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof” had a much more practical origin: the extraordinary religious diversity that permeated colonial America.
Robert E. Norton at the Times Literary Supplement:
Kantorowicz’s first book on the Staufer Emperor alone may have sufficed to secure his place in the scholarly pantheon. But he managed to outdo himself with the even more influential The King’s Two Bodies, which appeared in 1957, shortly before he died in 1963, and has never been out of print. Yet most striking is that these two monumental books exist at opposite ends of the ideological and historiographic spectrums: the first, written in the overwrought, mystagogic style cultivated within the “circle” around the poet Stefan George, celebrates an almighty, autocratic ruler who held sway over a vast realm. Kantorowicz deliberately – and compellingly – cast his book as a political allegory meant to inspire his fellow Germans to seek and submit to such a leader should he appear. The latter book, written in English in American exile in Berkeley and Princeton, is a sober, meticulous, but no less scintillating study of an esoteric historical problem in what Kantorowicz called “political theology”. It is an immensely learned work bolstered by thousands of footnotes (the first book had none) and spiked with rebarbative terms only a pedant could embrace: “catoptromancy”, “geminate”, “caducity” and “equiparation”. When it was published, one reviewer hailed it as “a great book, perhaps the most important work in the history of medieval political thought, surely the most spectacular, of the past several generations”. Its appeal for subsequent readers was enhanced when Michel Foucault approvingly cited The King’s Two Bodies in Discipline and Punish, while Giorgio Agamben called it “one of the great texts of our age on the techniques of power”.
Paul de Quenoy at The New Criterion:
Verdun thus came into the war on the Western Front almost by accident. Frustrated by their inability to break through the German lines, the Allied military leaders agreed to launch coordinated attacks on all fronts—Western, Eastern, and, since May 1915, Italian—in the summer of 1916. On the Western Front this coalesced into the other massive bloodletting of 1916, the Battle of the Somme. But the Germans had looked for their own opportunities after the dismal disappointments of 1915. By the turn of 1916, the fortified area around Verdun sat like an uncomfortable elbow right where the fixed positions bent in a near 90-degree angle from the relatively ignored frontier zone into the German trenches deep inside northern France. Erich von Falkenhayn, the chief of the German general staff, wrote in his memoirs that his idea was to launch a massive assault that would force the French to defend the area and bleed their army to death in the process. In a “Christmas memorandum” he purportedly sent Kaiser Wilhelm in late 1915, the operation’s bleak goal was to start a duel of attrition. Since no copy of this “Christmas memorandum” has ever been found and no other evidence of it exists, historians have surmised that this was not Falkenhayn’s actual plan. Operational orders to the local commanders and preparations for an artillery bombardment of unprecedented power instead suggest that his real intention was to break through at Verdun and then roll up the French positions to the north and west. When this failed in a bloody stalemate, Falkenhayn likely invented the attrition plan after the fact to disguise the magnitude of his failure and justify his tremendous losses.
Martin Filler at the NYRB:
What was originally likened by its creator to a fluttering paloma de la paz (dove of peace) because of its white, winglike, upwardly flaring rooflines seems more like a steroidal stegosaurus that wandered onto the set of a sci-fi flick and died there. Instead of an ennobling civic concourse on the order of Grand Central or Charles Follen McKim’s endlessly lamented Pennsylvania Station, what we now have on top of the new transit facilities is an eerily dead-feeling, retro-futuristic, Space Age Gothic shopping mall with acres of highly polished, very slippery white marble flooring like some urban tundra. Formally known as Westfield World Trade Center, it is filled with the same predictable mix of chain retailers one can find in countless airports worldwide: Banana Republic, Hugo Boss, Breitling, Dior, and on through the global label alphabet. (The Westfield Corporation is an Australian-based British-American shopping center company.) Far from this being the “exhilarating nave of a genuine people’s cathedral,” as Paul Goldberger claimed in Vanity Fair, Calatrava’s superfluous shopping shrine is merely what the Germans call a Konsumtempel (temple of consumption), and a generic one at that.
Still to come are 2 World Trade Center by the Bjarke Ingels Group (BIG) and 3 World Trade Center by the office of Richard Rogers. Plans are doubtful for a putative 5 World Trade Center (to replace the former Deutsche Bank Building, which was irreparably damaged by debris from the collapse of the Twin Towers and laboriously dismantled) and no architect has been selected. There will be no 6 World Trade Center to replace that eponymous eight-story component of Yamasaki’s original five-building World Trade Center ensemble, also destroyed on September 11.
Nisha Gaind in Nature:
South Korea is likely to become the first country where life expectancy will exceed 90 years, according to a study in The Lancet. Researchers led by public-health researcher Majid Ezzati at Imperial College London have projected how life expectancy will change in 35 developed countries by 2030, using data from the World Health Organization and a suite of 21 statistical models they developed. Life expectancy is expected to increase in all 35 countries, in keeping with steady progress in recent decades, the team found. But it is South Korean women who will be living longest by 2030: there is a nearly 60% chance that their life expectancy at birth will exceed 90 years by that time, the team calculates. Girls born in the country that year can expect to live, on average, to nearly 91, and boys to 84, the highest in the world for both sexes (see 'Ageing populations').
The nation's rapid improvement in life expectancy — the country was ranked twenty-ninth for women in 1985 — is probably down to overall improvements in economic status and child nutrition, the study notes, among other factors. South Koreans also have relatively equal access to health care, lower blood pressure than people in Western countries and low rates of smoking among women.
Jen Doll in The Atlantic:
In 1965, 11 years after the Supreme Court outlawed segregated schools, Nancy Larrick wrote an article titled "The All-White World of Children's Books" for the Saturday Review. Marc Aronson, author of Race: A History Beyond Black and White, described that piece to The Atlantic Wire as "a call to arms." Larrick had been inspired to write the piece, which criticized the omission of black characters in children's literature, after a 5-year-old black girl asked why all the kids in the books she read were white. According to Larrick's survey of trade books over a three-year period, "only four-fifths of one percent" of those works included contemporary black Americans as characters. Further, the characterizations of pre-World War II blacks consisted of slaves, menial workers, or sharecroppers. Via Reading Is Fundamental, "'Across the country,' she stated in that piece, '6,340,000 nonwhite children are learning to read and to understand the American way of life in books which either omit them entirely or scarcely mention them.'"
…Myers shared the story of an 8-year-old girl who came up to him praising his picture book about a dog that plays the blues. "I said, 'You like the blues?'" he told us. "She said no. I said, 'You like dogs?' She said no. I said, 'What did you like?' She said, 'It looks like me.' If you have a black kid on the cover, black kids will pick it up faster." The flip side of this is a brutal one: What does it mean when kids don't see themselves on, or in, the books intended for them? As Myers told us, "I was asked by some teachers, 'What's the effect of video games on reading?' At first I was thinking it’s not that much, but a video game will give you more self-esteem than a book [especially a book that you don't see yourself in], so you go for the video games. Air Jordans will give you even more esteem. At 13 or 14, you’ve assessed yourself. You know if you’re good-looking, you know if you’re hip. So many black kids are looking at themselves and saying, 'I ain't much,'" he said. "This is why you need diversity."
More here. (Note: At least one post throughout February will be in honor of Black History Month)