Friday, March 31, 2017
The politics of outrage, and the crisis of free speech on campus
Ira Wells in the Literary Review of Canada:
Among those invested in the notion that higher education is currently collapsing before our eyes, few pieces of evidence are proffered more frequently (or more uncritically) than the modern university’s supposed tendency to nurture and promote “offence taking” as a default attitude toward the world. Our universities, we are told, have discarded their traditional raison d’être in order to become incubators of moral outrage. Administrators, having abandoned time-honoured liberal arts ideals, today quiver to the cheap thrill of indignation; professors, having given up on Shakespeare and the “great books,” now indoctrinate students in radical Marxist ideology and seek to cultivate a generation of “social justice warriors.” Our campuses have become closed, ideologically insular places that are hostile to the freedom of speech and intolerant of dissent.
This opinion—broadcast by bilious media personalities who have never listened in on a faculty meeting, have no knowledge of universities’ academic priorities and have not set foot in an undergraduate lecture since Trudeau père occupied 24 Sussex—is, unsurprisingly, a grotesque parody of the complex, often internally conflicted reality of modern institutions of higher learning.
Yet this view, however exaggerated, is not entirely baseless. An increasingly sensitive and fine-grained vocabulary for registering and opposing forms of sexism, racism, ableism and religious intolerance has undeniably been developing within higher education.
Evolution Is Slower Than It Looks and Faster Than You Think
Carrie Arnold in Wired:
In the 1950s, the Finnish biologist Björn Kurtén noticed something unusual in the fossilized horses he was studying. When he compared the shapes of the bones of species separated by only a few generations, he could detect lots of small but significant changes. Horse species separated by millions of years, however, showed far fewer differences in their morphology. Subsequent studies over the next half century found similar effects—organisms appeared to evolve more quickly when biologists tracked them over shorter timescales.
Then, in the mid-2000s, Simon Ho, an evolutionary biologist at the University of Sydney, encountered a similar phenomenon in the genomes he was analyzing. When he calculated how quickly DNA mutations accumulated in birds and primates over just a few thousand years, Ho found the genomes chock-full of small mutations. This indicated a briskly ticking evolutionary clock. But when he zoomed out and compared DNA sequences separated by millions of years, he found something very different. The evolutionary clock had slowed to a crawl.
Baffled by his results, Ho set to work trying to figure out what was going on. He stumbled upon Kurtén’s 1959 work and realized that the differences in rates of physical change Kurtén saw also appeared in genetic sequences.
Testing a guy who can talk backwards!
Video length: 12:12
Nate Silver says media assumptions, not data, led to surprise over 2016 election results
Christina Pazzanese in the Harvard Gazette:
GAZETTE: At last year’s conference here, you were still skeptical of Trump’s viability as the Republican Party nominee, which was fairly late. On election night, your site had Hillary’s chances at 71 percent; almost everyone else had her up by even more. Why do you think Trump’s victory blindsided so many?
SILVER: I think people shouldn’t have been so surprised. Clinton was the favorite, but the polls showed, in our view, particularly at the end, a highly competitive race in the Electoral College. We had him with a 30 percent chance, and that’s a pretty likely occurrence. Why did people think it was much less than that? I think there are a few things. One is that I don’t think people have a good intuitive sense for how to translate polls to probabilities. In theory, that’s the benefit of a model. But I think people thought “Well, Clinton’s ahead in most of the polls in most states, and I remember that seems similar to Obama four years ago, and therefore I’m very confident that she’ll win.” It’s ad hoc and not really very rigorous, that thought process.
The second part is that there is a certain amount of groupthink. People looking at the polls are mostly in newsrooms in Washington and Boston and New York. These are liberal cities, and so people tend to see evidence (in our view, it was kind of conflicting polling data) as pointing toward a certain thing.
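The poll-to-probability translation Silver describes can be sketched in a few lines. This is a toy illustration, not FiveThirtyEight's actual model; the 5-point "total error" (sampling plus systematic) is an assumed figure chosen for the example. It treats the true margin as normally distributed around the polled margin and asks how often it lands above zero:

```python
from statistics import NormalDist

def win_probability(polled_margin, total_error):
    """Model the true margin (in points) as Normal(polled_margin, total_error)
    and return the probability that it is positive, i.e. that the
    polling leader actually wins."""
    return 1 - NormalDist(mu=polled_margin, sigma=total_error).cdf(0.0)

# A 3-point lead with a 5-point total error is a favorite's position,
# but far from a sure thing:
print(round(win_probability(3.0, 5.0), 2))  # → 0.73
```

The point of the exercise is Silver's: a steady lead in the polls translates into a probability well short of certainty once realistic error is accounted for, which is why a 30 percent chance for the trailing candidate was never a long shot.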
—The Chinese micro-carver Chen Zhongen
can inscribe poems on a single strand of hair
I asked for a headful of sonnets
(Petrarchan) from scalp to split end.
Short-haired one, said he,
the most I can do for you
is a crop of haiku.
A bit miffed, I looked round the room
at enormous close-ups of women
with sestinas twirled through their ringlets,
thousands of Möbius strips
curled round recidivist words.
A man with a brylled-black mullet
sported tercets over his ears,
and a thicket of octets ending in knots:
the days of the week in Old Norse.
Poems with upbeat conclusions
on the flick-ups of nymphet models.
Bawdy love-lyrics from the 1700s
hidden inside dense dark shag perms,
and rhyming couplets at the outer tips
of a blonde boy’s barely-there eyebrows.
Not fair, I thought; oh, to be Rapunzel
with space for the lost Latin epics
of Valerius Flaccus cascading
down past my backside.
But no; I got Ezra Pound’s petals
above my wet and blackening brow.
Some highlights from Japanese wisdom.
And one of the stylist’s own:
What hard work this is,
blinded by flurries of snow:
by Mary O'Donoghue
from: Among These Winters
Dedalus Press, Dublin, 2007
Reza Aslan: Scholar or retailer of import goods?
Aslan’s “all religion is the same and it’s all about being a good person” claims would be fine if he positioned himself as a spiritual guide or religious teacher, but Aslan insists that he is a scholar of religion. The opening credits say so four times. But such claims are not scholarship. In his “Theses on Method,” scholar of religion Bruce Lincoln summed up the religious scholar’s job well:
When one permits those whom one studies to define the terms in which they will be understood, suspends one’s interest in the temporal and contingent, or fails to distinguish between “truths”, “truth-claims”, and “regimes of truth”, one has ceased to function as historian or scholar. In that moment, a variety of roles are available: some perfectly respectable (amanuensis, collector, friend and advocate), and some less appealing (cheerleader, voyeur, retailer of import goods). None, however, should be confused with scholarship.
Aslan permits the reforming Aghori to define their own terms, his “all religions are the same” approach ignores the temporal and contingent, and so he has ceased to function as a scholar. It’s up to viewers to decide which of the other roles suits him best. Moreover, these same things that get in the way of his role as scholar are also the cause of his troubles with his Hindu critics. By saying that he knows what Hinduism (and religion) is really about, he abnegates his role as scholar and offends Hindus who have their own ideas about Hinduism’s truths.
The Unlikely Rise of an Alt-Right Hero
In a video posted on Facebook on March 27 by Kyle Chapman, the camera pans across what can only be described as a DIY armory: baseball helmet, ski goggles, shin guards, face mask, wooden shield, flag pole. “The benefit of the baseball helmet is that you have holes where the ears are,” Chapman tells his viewers. “This allows you to hear what’s going on around you.” The helmet is emblazoned with a decal reading molon labe (“come and take them”), the Second Amendment rallying cry borrowed from ancient Sparta. Chapman also recommends going to Home Depot, where one can find a wooden table top for just $25 to fashion into a homemade buckler.
Chapman made the video in response to a barrage of inquiries into his riot gear, which was on full display when he fought anti-fascist protesters (sometimes collectively referred to as Antifa) at the University of California, Berkeley, in early March. Before then, he had been a relative unknown. A 41-year-old commercial diver living in the Bay Area, Chapman told the New Republic that he “doesn’t really care for social media or the internet.” But on March 4, a video of Chapman breaking a wooden sign post over the head of an Antifa activist went viral, quickly launching him to fame as the subject of a new alt-right meme: “Based Stickman.”
on bosch and bruegel
A new book by Joseph Koerner is always an event. Here, as usual, he seems to have read everything and to have thought about everything connected with his chosen subject, the two early modern Netherlands painters Hieronymus Bosch and Pieter Bruegel the Elder, so similar in many ways and yet so different: their lives and their work; the complex history of the Netherlands and Europe in the sixteenth century; the seismic cultural shifts occurring at the time; the commissioning and afterlife of individual paintings; the way they lay on the paint and the way they intend their work to be seen and how it is seen now – the Boschs mainly in the Prado, the Bruegels mainly in the Kunsthistorisches Museum in Vienna. He feels it is as important to note how visitors respond to their work in these galleries as it is to understand the iconography they are using (visitors crowd excitedly round Bosch, they smile happily to themselves as they view Bruegel).
But it is his ability to look and to find words for what he is looking at that sets him in the very front rank of art historians. Here he is looking at Bosch’s great drawing of the Tree-Man (probably, he thinks, a version of the mysterious Tree-Man in the “Garden of Earthly Delights”, made for a collector after the painting) on a sheet now in the Albertina in Vienna:
Invention demands a capacity of mind. But as the medium of drawing lays it bare, it happens in the bodily and material activity of painting. Bosch creates his lines out of bistre, a brown pigment produced by boiling the soot of wood [there is much here about the way the artist plays with his name, “forest”].
Do We Owe Our Large Primate Brains to a Passion for Fruit?
Bret Stetka in Scientific American:
Compared with other mammals, and along with those of a few other notably bright creatures—dolphins, whales and elephants among them—the brain to body-size ratios of monkeys, apes and humans are among the highest. For decades the prevailing evolutionary explanation for this was increasing social complexity. The so-called “social brain hypothesis” holds that the pressures and nuances of interacting and functioning within a group gradually boosted brain size. Yet new research suggests otherwise. A study conducted by a team of New York University anthropologists, and published Monday in Nature Ecology & Evolution, reports diet was in all likelihood much more instrumental in driving primate brain evolution. In particular, it appears that we and our primate cousins may owe our big brains to eating fruit.
Much of the research exploring the social hypothesis has rendered inconsistent results. And as many in the field have noted, a number of oft-cited studies in support of the theory suffer from small sample sizes and flawed design, including out-of-date species classification. The new work is based on a primate sample more than three times larger than that used in prior studies, and one that used a more accurate evolutionary family tree.
In over 140 primate species, the study authors compared brain size with the consumption of fruit, leaves and meat. They also compared it with group size, social organization and mating systems. By looking at factors such as whether or not a particular primate group prefers solitary to pair living or whether they are monogamous, the researchers figured they should theoretically be able to determine if social factors contributed to the evolution of larger brains. And it appears they could not. Dietary preferences—especially fruit consumption—seem to have been much more influential. The researchers found that fruit-eating species, or frugivores, have significantly larger brains than both omnivores and “foliovores,” those that prefer eating leaves. “These findings call into question the current emphasis on the social brain hypothesis, which suggests larger brains are associated with increased social complexity,” explains Alex DeCasien, a doctoral candidate in anthropology and lead author of the study. “Instead, our results resurrect older ideas about the evolutionary relationship between foraging complexity and brain size.”
Mini reproductive system on a chip mimics human menstrual cycle
Sara Reardon in Nature:
In the quest to study human reproduction, scientists have built a rudimentary model of the female system in the lab. Every 28 days, the 'ovary', cultured on a small plastic chip, releases an egg and starts producing hormones to prepare for pregnancy. The hormones travel through a series of tiny channels that mimic Fallopian tubes and into a uterus-like chamber made of human tissue. The system, described in a study published on 28 March in Nature Communications, is the latest in a series of organs-on-chips — miniature devices seeded with human tissues and cells that are engineered to model biological functions. Researchers hope that the synthetic reproductive system will provide another avenue for studying diseases such as cervical cancer, and allow them to test new contraceptives and fertility treatments before being used in people. There is no good animal model for the 28-day human reproductive cycle, says Teresa Woodruff, a reproductive scientist at Northwestern University in Chicago, Illinois, and a co-author of the study. The artificial system fills an “urgent unmet need for us”, she says.
All together now
Woodruff and her colleagues named their system Evatar — a portmanteau of Eve and avatar. It contains five ‘organs’ linked together by a blood-like liquid carrying hormones, cell signalling molecules and drugs. The Fallopian tubes, uterus and cervix are made from human tissues obtained from women undergoing hysterectomies. The ovaries, however, are from mouse tissue, because healthy ovaries are rarely removed from women. Tissue for the fifth ‘organ’, the liver, which metabolizes drugs, comes from humans.
Thursday, March 30, 2017
How philosophy came to disdain the wisdom of oral cultures
Justin E. H. Smith in Aeon:
As the theorist Walter J Ong pointed out in Orality and Literacy: Technologizing the Word (1982), it is difficult, perhaps even impossible, now to imagine how differently language would have been experienced in a culture of ‘primary orality’. There would be nowhere to ‘look up a word’, no authoritative source telling us the shape the word ‘actually’ takes. There would be no way to affirm the word’s existence at all except by speaking it – and this necessary condition of survival is important for understanding the relatively repetitive nature of epic poetry. Say it over and over again, or it will slip away. In the absence of fixed, textual anchors for words, there would be a sharp sense that language is charged with power, almost magic: the idea that words, when spoken, can bring about new states of affairs in the world. They do not so much describe, as invoke.
As a consequence of the development of writing, first in the ancient Near East and soon after in Greece, old habits of thought began to die out, and certain other, previously latent, mental faculties began to express themselves. Words were now anchored and, though spellings could change from one generation to another, or one region to another, there were now physical traces that endured, which could be transmitted, consulted and pointed to in settling questions about the use or authority of spoken language.
Writing rapidly turned customs into laws, agreements into contracts, genealogical lore into history.
A Burning Collection: Norman Rush reviews a book of essays from Teju Cole
Norman Rush in the New York Review of Books:
Teju Cole is a kind of realm. He has written three books—two exceptional novels and the volume of essays to be considered here—as well as many uncollected essays, interviews, newspaper columns, and a vast online oeuvre made up of skeins of tweets on fixed themes, faits divers, e-mail arguments, captioned Instagrams, mixed media exercises, and rants. At the moment he is credited with more than 13,000 tweets, 263,000 Twitter followers, 1,035 photos, and around 22,000 fans who officially like his Facebook page. Even in a time when many writers are enlarging their literary footprints by means of the Internet, he is a prodigy.
There is a strong interconnectedness between the different parts of his work. Cole’s personal story, sometimes given straight, sometimes fictionalized, pervades. The bicultural Teju Cole was born in the US in 1975, raised in Nigeria until his seventeenth year, brought back to America where he first studied art and attended medical school, and then went abroad to study African art history; he later studied Northern Renaissance art at Columbia. His initial novels brought him a storm of prizes and attention. He is currently a writer in residence at Bard College and the photography critic for The New York Times Magazine and is himself an exhibiting photographer. Cole has said in an interview that the essays on photography in this collection, which also collects many of his writings on literature, travel, politics, and art, are the most important of his writings.
Cole is very conscious of the difference between what one might think of as books aimed at a presumed posterity and his online works, aimed at a real-time and frequently interactive fandom.
"Hi-C": The Game-Changing Technique That Cracked the Zika-Mosquito Genome
Ed Yong in The Atlantic:
Ten years ago, a team of scientists published the first genome of Aedes aegypti—the infamous mosquito that spreads Zika, dengue fever, and yellow fever. It was a valiant effort, but also a complete mess. Rather than tidily bundled in the insect’s three pairs of chromosomes, its DNA was scattered among 36,000 small fragments, many of which were riddled with gaps and errors. But last week, a team of scientists led by Erez Lieberman Aiden at the Baylor College of Medicine announced that they had finally knitted those pieces into a coherent whole—a victory that will undoubtedly be helpful to scientists who study Aedes and the diseases it carries.
This milestone is about more than mosquitoes. The team succeeded by using a technique called Hi-C, which allows scientists to assemble an organism’s genome quickly, cheaply, and accurately. To prove that point, the team used Hi-C to piece together a human genome from scratch for just $10,000; by contrast, the original Human Genome Project took $4 billion to accomplish the same feat. “It’s very clear that this is the way that you want to be doing it,” says Olga Dudchenko, who was part of Aiden’s team. “At least in the foreseeable future, there’s no method that can compete,” adds her colleague Sanjit Singh Batra.
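The core idea behind Hi-C assembly is that loci sitting close together on a chromosome contact each other far more often than distant ones, so a matrix of contact counts between contigs constrains their order. A deliberately simplified greedy sketch of that principle (not the Aiden lab's actual algorithm, which also orients contigs and corrects misassemblies):

```python
def greedy_scaffold(contacts):
    """contacts[i][j]: Hi-C read pairs linking contig i and contig j.
    Returns one ordering of the contigs, built by repeatedly chaining
    the current end of the scaffold to its strongest unused partner."""
    n = len(contacts)
    order = [0]
    used = {0}
    while len(order) < n:
        tail = order[-1]
        # pick the unused contig with the most contacts to the current tail
        best = max((j for j in range(n) if j not in used),
                   key=lambda j: contacts[tail][j])
        order.append(best)
        used.add(best)
    return order

# Three contigs whose true order is 0-2-1: contig 0 and contig 2 share
# many contacts, 2 and 1 share many, while 0 and 1 barely interact.
contacts = [[0, 5, 90],
            [5, 0, 80],
            [90, 80, 0]]
print(greedy_scaffold(contacts))  # → [0, 2, 1]
```

Even this toy version shows why the approach scales: the signal comes from ordinary sequencing reads of cross-linked DNA, so no new lab technology is needed to jump from 36,000 fragments to chromosome-length scaffolds.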
Linton Kwesi Johnson's tribute to Derek Walcott
Video length: 1:04
Richard King reviews three books on populism in the Sydney Review of Books:
So, it’s happened. Donald J. Trump, the guy hardly anyone thought could win the Republican nomination, and, having won the Republican nomination, hardly anyone thought could become US President, is US President. It still doesn’t feel entirely real, and the sense that we’re living in an alternative present, a counterfactual come to life – more Back to the Future Part II, at the moment, than It Can’t Happen Here or The Plot Against America – has yet to fully dissipate. But dissipate it will, must. The Cheeto Jesus is in da House. Hair Force One has landed.
Trump’s supporters are ecstatic, his opponents appalled: not since the war in Vietnam has the US looked so deeply divided. In his much-shared piece published the day after the election, New Yorker editor David Remnick warned against the media normalisation that was sure to follow the result. But if anything positions have hardened in the two months since the inauguration, with Trump’s people renewing their attacks on the media, and his political detractors – no, enemies – oscillating between denial and anger: denial that a man who lost the popular vote and possibly conspired with the Russians to undermine Clinton could ever be deemed legitimate; and anger that someone so remote from the standards of liberal decency now sits in the Oval Office. Thus do the first two stages of grief define the liberal and progressive reaction: not morning in America but America in mourning.
Getting Us Through: Ralph Waldo Emerson
Jessica Collier in Avidly:
Hiking with my young dog in the rainy woods, I think of Emerson. Normally, when amped on endorphins in inclement weather, I recall the seer of American optimism’s most jubilant moment: ‘Crossing a bare common, in snow puddles, at twilight, under a clouded sky, without having in my thoughts any occurrence of special good fortune, I have enjoyed a perfect exhilaration. I am glad to the brink of fear.’ Today, as the pup bounces and skitters and slides along the trail, slippery with rotting underbrush, I feel myself twitch. Even at play, my body insists on a tensile state. It’s not Emerson the seer but Emerson the grieving father I call to mind. Later, at home, I flip a battered anthology open to ‘Experience,’ bookmarked years ago with a white paper clip. The essay, written after his son Waldo’s death, is longer than I remember, or maybe the timely, snappy news pieces that dominate my reading lately just make everything substantive and considered seem interminable.
...Here loss, a singular devastating event, doesn’t compare to the loss of sensation when you keep on keeping on. The grief comes in not feeling grief palpably enough: ‘There are moods in which we court suffering, in the hope that here, at least, we shall find reality, sharp peaks and edges of truth. But it turns out to be scene-painting and counterfeit. The only thing grief has taught me, is to know how shallow it is.’ Faced with this emotional subterfuge, Emerson lowers the bar in a way that’s painful to read and familiar to recount these days, when we shut down the barrage of news only to turn to Netflix for repose. ‘Do not craze yourself with thinking,’ he admonishes us pragmatically, ‘but go about your business anywhere. Life is not intellectual or critical, but sturdy. Its chief good is for well-mixed people who can enjoy what they find, without question.’
But there are always questions with Emerson, which is why we tolerate him. ‘Experience’ is a humbled litany, an ode to falling down and getting, if not entirely up again, then on. The final lines gear up to be a sop, a salve for the reader: ‘Patience and patience, we shall win at the last.’ Shall we, though? I ask out loud. The sentiment borders on patronizing—I wonder if optimism will ever feel authentic again—but he gives us reason to press on: ‘Never mind the ridicule, never mind the defeat: up again, old heart!—it seems to say,—there is victory yet for all justice.’ Emerson, like all of us inspecting optimism through the pall of loss, equivocates. He refuses to encourage the reader (or himself) directly, summoning instead a voice that ‘seems to say’ better days are ahead.
Elon Musk's latest target: Brain-computer interfaces
Mae Anderson in PhysOrg:
Tech billionaire Elon Musk is announcing a new venture called Neuralink focused on linking brains to computers. The company plans to develop brain implants that can treat neural disorders—and that may one day be powerful enough to put humanity on a more even footing with possible future superintelligent computers, according to a Wall Street Journal report citing unnamed sources. Musk, a founder of both the electric-car company Tesla Motors and the private space-exploration firm SpaceX, has become an outspoken doomsayer about the threat artificial intelligence might one day pose to the human race. Continued growth in AI cognitive capabilities, he and like-minded critics suggest, could lead to machines that can outthink and outmaneuver humans with whom they might have little in common. In a tweet Tuesday, Musk gave few details beyond confirming Neuralink's name and tersely noting the "existential risk" of failing to pursue direct brain-interface work.
STIMULATING THE BRAIN
Some neuroscientists and futurists, however, caution against making overly broad claims for neural interfaces. Hooking a brain up directly to electronics is itself not new. Doctors implant electrodes in brains to deliver stimulation for treating such conditions as Parkinson's disease, epilepsy and chronic pain. In experiments, implanted sensors have let paralyzed people use brain signals to operate computers and move robotic arms. Last year, researchers reported that a man regained some movement in his own hand with a brain implant. Musk's proposal goes beyond this. Although nothing is developed yet, the company wants to build on those existing medical treatments as well as one day work on surgeries that could improve cognitive functioning, according to the Journal article. Neuralink is not the only company working on artificial intelligence for the brain. Entrepreneur Bryan Johnson, who sold his previous payments startup Braintree to PayPal for $800 million, last year started Kernel, a company working on "advanced neural interfaces" to treat disease and extend cognition.
Wednesday, March 29, 2017
The Rising Tide of Educated Aliteracy
Alex Good in The Walrus:
The author of the surprise bestseller How to Talk About Books You Haven’t Read, Pierre Bayard, is a standard-bearer for today’s highbrow aliterates. Bayard is a college professor of French literature, a position that paradoxically leaves him with “no way to avoid commenting on books that most of the time I haven’t even opened” (or, for that matter, ever had any desire to open). And this is nothing he feels any shame or anxiety about. Not reading, Bayard believes, is in many cases preferable to reading and may allow for a superior form of literary criticism—one that is more creative and doesn’t run the risk of getting lost in all the messy details of a text. Actual books are thus “rendered hypothetical,” replaced by virtual books in phantom libraries that represent an inner, fantasy scriptorium or shared social consciousness.
Assuming that Bayard’s tongue isn’t stuck too far in his cheek, one can interpret his reasoning as an argument that not reading books can be a cultured activity in itself, a way of expressing one’s faith in and affection for literature. More often, however, top-down aliteracy only expresses weariness, cynicism, and even contempt for the written word.
My first exposure to this type of thinking came, naturally enough, while studying English literature in university. Academics, for no good reason whatsoever, are expected to publish a great deal of stuff that nobody—and I mean nobody—reads.
On Flannery O’Connor and T.S. Eliot
Early in her novel Wise Blood, Flannery O’Connor describes protagonist Hazel Motes, leader of the Church without Christ, by the silhouette he casts on the sidewalk. “Haze’s shadow,” she writes, “was now behind him and now before him.” It’s a strange way to situate a character — skulking between his shadows — but it’s not unprecedented. In The Waste Land, T.S. Eliot’s narrator refers to “Your shadow at morning striding behind you/Or your shadow at evening rising to meet you.” Coincidence? Nobody can say for certain. But in the rare case of a critic linking O’Connor and Eliot, Sally Fitzgerald (O’Connor’s close friend) wrote that “it was Eliot and his Waste Land who provided for her the first impetus to write such a book as Wise Blood.”
Harold Bloom, the literary critic who thrives on making such connections, famously argued that great writers, burdened by what he called the “anxiety of influence,” subconsciously misread established literary giants to achieve originality. But in this case, O’Connor is not misreading Eliot. She’s answering him. The Waste Land delivers a darkly poetic proposition. Every line relentlessly reiterates the theme that, in the wake of World War One, hope had been leached from life. Existence, in the poem’s assessment, culminates in a word one rueful lover repeats in The Waste Land’s second section: “Nothing . . . Nothing . . . nothing . . . nothing . . . Nothing.”
The New Cult of Consensus
The revival of interest in the conflicts and the violence that mark American history proved enormously fruitful. In 1969, in a beautiful book that was his final reckoning with The Progressive Historians, Hofstadter himself acknowledged the limitations of the consensus approach, singling out the Civil War as a historic convulsion that scarcely exemplified the pragmatic genius of American politics. In some ways this was not surprising. Hofstadter had been influenced by Marxism when he was young, and he was one of the first historians to blow the whistle on U. B. Phillips’ romanticized histories of slavery. Nor should it surprise us that in the 1960s Marxism became the most effective means by which historians recovered the fundamental issues at stake in the Civil War—although it was a Marxism that accepted the structural foundations of the conflict between the North and the South but went on to examine the political and ideological manifestations of that conflict.
I think of Judith Stein’s work as having emerged from that same intellectual ferment: attentive to class divisions, but always sensitive to the unpredictable ways class conflict has played out in American politics. It’s that sensitivity to the particularities of time and place that has repeatedly sent Judith off to the archives and makes her such an industrious researcher. She had a set of priorities but no predetermined answer. Who knew, for example, that it was the foreign policy apparatus that prevented the federal government from protecting American workers from unfair trade practices during the 1970s?
How we made the hated typeface Comic Sans
Interviews by Ben Beaumont-Thomas in The Guardian:
I was working for Microsoft’s typography team, which had a lot of dealings with people from applications like Publisher, Creative Writer and Encarta. They wanted all kinds of fonts – a lot of them strange and childlike. One program was called Microsoft Bob, which was designed to make computers more accessible to children. I booted it up and out walked this cartoon dog, talking with a speech bubble in Times New Roman. Dogs don’t talk in Times New Roman! Conceptually, it made no sense.
So I had an idea to make a comic-style text and started looking at Watchmen and Dark Knight Returns, graphic novels where the hand lettering was like a typeface. I could have scanned it in and copied the lettering, but that was unethical. Instead, I looked at various letters and tried to mimic them on screen. There were no sketches or studies – it was just me drawing with a mouse, deleting whatever was wrong.
I didn’t have to make straight lines, I didn’t have to make things look right, and that’s what I found fun. I was breaking the typography rules. My boss Robert Norton, whose mother Mary Norton wrote The Borrowers, said the “p” and “q” should mirror each other perfectly. I said: “No, it’s supposed to be wrong!” There were a lot of problems like that at Microsoft, a lot of fights, though not physical ones.
A Long-Sought Mathematical Proof, Found and Almost Lost
Natalie Wolchover in Quanta:
As he was brushing his teeth on the morning of July 17, 2014, Thomas Royen, a little-known retired German statistician, suddenly lit upon the proof of a famous conjecture at the intersection of geometry, probability theory and statistics that had eluded top experts for decades.
Known as the Gaussian correlation inequality (GCI), the conjecture originated in the 1950s, was posed in its most elegant form in 1972 and has held mathematicians in its thrall ever since. “I know of people who worked on it for 40 years,” said Donald Richards, a statistician at Pennsylvania State University. “I myself worked on it for 30 years.”
Royen hadn’t given the Gaussian correlation inequality much thought before the “raw idea” for how to prove it came to him over the bathroom sink. Formerly an employee of a pharmaceutical company, he had moved on to a small technical university in Bingen, Germany, in 1985 in order to have more time to improve the statistical formulas that he and other industry statisticians used to make sense of drug-trial data. In July 2014, still at work on his formulas as a 67-year-old retiree, Royen found that the GCI could be extended into a statement about statistical distributions he had long specialized in. On the morning of the 17th, he saw how to calculate a key derivative for this extended GCI that unlocked the proof. “The evening of this day, my first draft of the proof was written,” he said.
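For readers wondering what the conjecture actually asserts, the Gaussian correlation inequality has a compact standard formulation (this statement is supplied here for context and is not quoted from the Quanta article):

```latex
% Gaussian correlation inequality (standard formulation):
% for any centered Gaussian measure \mu on \mathbb{R}^n, and any
% convex sets A, B \subseteq \mathbb{R}^n that are symmetric about
% the origin,
\mu(A \cap B) \;\ge\; \mu(A)\,\mu(B).
```

Informally: for a multivariate normal distribution centered at the origin, the probability of landing in the intersection of two symmetric convex regions is at least as large as the product of the probabilities of landing in each region separately.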
Does poverty damage children's brains? An interview with Amy Wax
Video length: 43:06
A Tale of Two Bell Curves
Bo Winegard and Ben Winegard in Quillette:
To paraphrase Mark Twain, an infamous book is one that people castigate but do not read. Perhaps no modern work better fits this description than The Bell Curve by political scientist Charles Murray and the late psychologist Richard J. Herrnstein. Published in 1994, the book is a sprawling (872 pages) but surprisingly entertaining analysis of the increasing importance of cognitive ability in the United States. It also includes two chapters that address well-known racial differences in IQ scores (chapters 13-14). After a few cautious and thoughtful reviews, the book was excoriated by academics and popular science writers alike. A kind of grotesque mythology grew around it. It was depicted as a tome of racial antipathy; a thinly veiled expression of its authors’ bigotry; an epic scientific fraud, full of slipshod scholarship and outright lies. As hostile reviews piled up, the real Bell Curve, a sober and judiciously argued book, was eclipsed by a fictitious alternative. This fictitious Bell Curve still inspires enmity; and its surviving co-author is still caricatured as a racist, a classist, an elitist, and a white nationalist.
Myths have consequences. At Middlebury College, a crowd of disgruntled students, inspired by the fictitious Bell Curve — it is doubtful that many had bothered to read the actual book — interrupted Charles Murray’s March 2nd speech with chants of “hey, hey, ho, ho, Charles Murray has got to go,” and “racist, sexist, anti-gay, Charles Murray go away!” After Murray and moderator Allison Stanger were moved to a “secret location” to finish their conversation, protesters began to grab at Murray, who was shielded by Stanger. Stanger suffered a concussion and neck injuries that required hospital treatment.
It is easy to dismiss this outburst as an ill-informed spasm of overzealous college students, but their ignorance of The Bell Curve and its author is widely shared among social scientists, journalists, and the intelligentsia more broadly. Even media outlets that later lamented the Middlebury debacle had published – and continue to publish – opinion pieces that promoted the fictitious Bell Curve, a pseudoscientific manifesto of bigotry.
Above Pate Valley
We finished clearing the last
Section of trail by noon,
High on the ridge-side
Two thousand feet above the creek
Reached the pass, went on
Beyond the white pine groves,
Granite shoulders, to a small
Green meadow watered by the snow,
Edged with Aspen—sun
Straight high and blazing
But the air was cool.
Ate a cold fried trout in the
Trembling shadows. I spied
A glitter, and found a flake
Black volcanic glass—obsidian—
By a flower. Hands and knees
Pushing the Bear grass, thousands
Of arrowhead leavings over a
Hundred yards. Not one good
Head, just razor flakes
On a hill snowed all but summer,
A land of fat summer deer,
They came to camp. On their
Own trails. I followed my own
Trail here. Picked up the cold-drill,
Pick, singlejack, and sack
Ten thousand years.
by Gary Snyder
from Riprap and Cold Mountain Poems
Shoemaker & Hoard Publishers.