Interviews by Ben Beaumont-Thomas in The Guardian:
Vincent Connare, typographer
I was working for Microsoft’s typography team, which had a lot of dealings with people from applications like Publisher, Creative Writer and Encarta. They wanted all kinds of fonts – a lot of them strange and childlike. One program was called Microsoft Bob, which was designed to make computers more accessible to children. I booted it up and out walked this cartoon dog, talking with a speech bubble in Times New Roman. Dogs don’t talk in Times New Roman! Conceptually, it made no sense.
So I had an idea to make a comic-style text and started looking at Watchmen and Dark Knight Returns, graphic novels where the hand lettering was like a typeface. I could have scanned it in and copied the lettering, but that was unethical. Instead, I looked at various letters and tried to mimic them on screen. There were no sketches or studies – it was just me drawing with a mouse, deleting whatever was wrong.
I didn’t have to make straight lines, I didn’t have to make things look right, and that’s what I found fun. I was breaking the typography rules. My boss Robert Norton, whose mother Mary Norton wrote The Borrowers, said the “p” and “q” should mirror each other perfectly. I said: “No, it’s supposed to be wrong!” There were a lot of problems like that at Microsoft, a lot of fights, though not physical ones.
As he was brushing his teeth on the morning of July 17, 2014, Thomas Royen, a little-known retired German statistician, suddenly lit upon the proof of a famous conjecture at the intersection of geometry, probability theory and statistics that had eluded top experts for decades.
Known as the Gaussian correlation inequality (GCI), the conjecture originated in the 1950s, was posed in its most elegant form in 1972 and has held mathematicians in its thrall ever since. “I know of people who worked on it for 40 years,” said Donald Richards, a statistician at Pennsylvania State University. “I myself worked on it for 30 years.”
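The article never states the conjecture itself; for reference, the standard modern form (my paraphrase of the usual statement, not a quotation from the piece) is:

```latex
% Gaussian correlation inequality (GCI), standard statement:
% for every centered Gaussian measure $\mu$ on $\mathbb{R}^n$ and all
% convex sets $K, L \subseteq \mathbb{R}^n$ symmetric about the origin,
\mu(K \cap L) \;\geq\; \mu(K)\,\mu(L).
```

In statistical terms, this says that for a multivariate normal vector centered at the origin, the probability of landing in the intersection of two symmetric convex regions is at least the product of the individual probabilities, a positive-correlation statement that is easy to state and was remarkably hard to prove.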
Royen hadn’t given the Gaussian correlation inequality much thought before the “raw idea” for how to prove it came to him over the bathroom sink. Formerly an employee of a pharmaceutical company, he had moved on to a small technical university in Bingen, Germany, in 1985 in order to have more time to improve the statistical formulas that he and other industry statisticians used to make sense of drug-trial data. In July 2014, still at work on his formulas as a 67-year-old retiree, Royen found that the GCI could be extended into a statement about statistical distributions he had long specialized in. On the morning of the 17th, he saw how to calculate a key derivative for this extended GCI that unlocked the proof. “The evening of this day, my first draft of the proof was written,” he said.
To paraphrase Mark Twain, an infamous book is one that people castigate but do not read. Perhaps no modern work better fits this description than The Bell Curve by political scientist Charles Murray and the late psychologist Richard J. Herrnstein. Published in 1994, the book is a sprawling (872 pages) but surprisingly entertaining analysis of the increasing importance of cognitive ability in the United States. It also includes two chapters that address well-known racial differences in IQ scores (chapters 13-14). After a few cautious and thoughtful reviews, the book was excoriated by academics and popular science writers alike. A kind of grotesque mythology grew around it. It was depicted as a tome of racial antipathy; a thinly veiled expression of its authors’ bigotry; an epic scientific fraud, full of slipshod scholarship and outright lies. As hostile reviews piled up, the real Bell Curve, a sober and judiciously argued book, was eclipsed by a fictitious alternative. This fictitious Bell Curve still inspires enmity; and its surviving co-author is still caricatured as a racist, a classist, an elitist, and a white nationalist.
It is easy to dismiss this outburst as an ill-informed spasm of overzealous college students, but their ignorance of The Bell Curve and its author is widely shared among social scientists, journalists, and the intelligentsia more broadly. Even media outlets that later lamented the Middlebury debacle had published – and continue to publish – opinion pieces that promoted the fictitious Bell Curve, a pseudoscientific manifesto of bigotry.
We finished clearing the last
Section of trail by noon,
High on the ridge-side
Two thousand feet above the creek
Reached the pass, went on
Beyond the white pine groves,
Granite shoulders, to a small
Green meadow watered by the snow,
Edged with Aspen—sun
Straight high and blazing
But the air was cool.
Ate a cold fried trout in the
Trembling shadows. I spied
A glitter, and found a flake
Black volcanic glass—obsidian—
By a flower. Hands and knees
Pushing the Bear grass, thousands
Of arrowhead leavings over a
Hundred yards. Not one good
Head, just razor flakes
On a hill snowed all but summer,
A land of fat summer deer,
They came to camp. On their
Own trails. I followed my own
Trail here. Picked up the cold-drill,
Pick, singlejack, and sack
Of dynamite.
Ten thousand years.
by Gary Snyder, from Riprap and Cold Mountain Poems (Shoemaker & Hoard Publishers).
Oncologists know that men are more prone to cancer than women; one in two men will develop some form of the disease in a lifetime, compared with one in three women. But until recently, scientists have been unable to pinpoint why. In the past, they theorized that men were more likely than women to encounter carcinogens through factors such as cigarette smoking and factory work. Yet the ratio of men with cancer to women with cancer remained largely unchanged across time, even as women began to smoke and enter the workforce in greater numbers. Pediatric cancer specialists also noted a similar “male bias to cancer” among babies and very young children with leukemia. “It’s not simply exposures over a lifetime,” explains Andrew Lane, assistant professor of medicine and a researcher at the Dana-Farber Cancer Institute. “It’s something intrinsic in the male and female system.” Now, discoveries by Lane and researchers at the Broad Institute of Harvard and MIT reveal that genetic differences between males and females may account for some of the imbalance. A physician-researcher who studies the genetics of leukemia and potential treatments, Lane says that he and others noted that men with certain types of leukemia often possess mutations on genes located on the X chromosome. These mutations damage tumor-suppressor genes, which normally halt the rampant cell division that triggers cancer.
Lane initially reasoned that females, who have two X chromosomes, would be less prone to these cancers because they have two copies of each tumor suppressor gene. In contrast, men have an X and a Y chromosome—or just one copy of the protective genes, which could be “taken out” by mutation. But the problem with that hypothesis, Lane says, was a “fascinating phenomenon from basic undergraduate biology called X-inactivation.” In a female embryo, he explains, cells randomly inactivate one of the two X chromosomes. “When a female cell divides, it remembers which X chromosome is shut down, and it keeps it shut down for all of its progeny.” If female cells have only one X chromosome working at a time, then they should be just as likely as male cells to experience cancer-causing gene mutations. So Lane and his team dug deeper into existing studies and encountered a little-known and surprising finding: “There are about 800 genes on the X chromosome,” he says, “and for reasons that are still unclear, about 50 genes on that inactive X chromosome stay on.” In a “big Aha! moment,” Lane’s group realized that those gene mutations common in men with leukemia were located on genes that continue to function on women’s inactive chromosome. The researchers dubbed those genes EXITS for “Escape from X-Inactivation Tumor Suppressors.” Women, Lane explains, thus have some relative protection against cells becoming cancerous because, unlike men, they have two copies of these tumor-suppressor genes functioning at all times.
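The one-hit versus two-hit logic behind this protection can be sketched with a toy probability model. This is purely illustrative: the per-cell mutation probability `p` below is an arbitrary made-up number, not a figure from the research, and real mutagenesis is far more complicated.

```python
import random

def simulate(n_cells=100_000, p=0.01, seed=0):
    """Toy model of losing an EXITS-type tumor suppressor.

    A 'male' cell has one functional copy, so a single hit
    (probability p) disables it. A 'female' cell has two
    independently functioning copies (the gene escapes
    X-inactivation), so both must be hit (probability p*p).
    Returns the fraction of cells that lost all copies.
    """
    rng = random.Random(seed)
    male = sum(rng.random() < p for _ in range(n_cells))
    female = sum(rng.random() < p and rng.random() < p
                 for _ in range(n_cells))
    return male / n_cells, female / n_cells

m, f = simulate()
# With p = 0.01, roughly 1 in 100 "male" cells loses the
# suppressor, versus roughly 1 in 10,000 "female" cells.
```

The point of the sketch is only the square: if a single hit occurs with probability p, requiring two independent hits drops the risk to p², which is why a gene that stays active on the "inactive" X confers real protection.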
Growing human tissue is a huge challenge for researchers, even on a small scale. But some ultra-creative scientists hit on a potential solution last week when they flushed out a plant's cells and injected human cells in their place. That was how they got heart cells to beat on a spinach leaf. A major issue in tissue regeneration is creating a vascular system that ensures blood can flow to the tissue and deliver all-important oxygen and nutrients to keep the tissue alive and growing. Current techniques, including 3D printing, as innovative as they are, can't yet create the blood vessels and tinier capillaries needed in a circulatory system. But guess what's abundant and already has lots of veins? Plants, that's what. Researchers from Worcester Polytechnic Institute in Massachusetts, Arkansas State University-Jonesboro, and the University of Wisconsin-Madison hope to use plants as "scaffolds" to grow human tissue. For a proof-of-concept experiment, which will be published in the May issue of Biomaterials, WPI biomedical engineering graduate student Joshua Gershlak cleared out spinach leaves' plant cells by flushing a detergent solution through the stem.
…Down the line, researchers may be able to use this technique on multiple spinach leaves to create heart tissue, which could be grafted on to the hearts of people who've had heart attacks. (Parts of survivors' hearts have died from a lack of blood flow and no longer contract properly; other researchers are looking into using stem cells to repair this tissue.) While this is all super cool and exciting, we're many years away from any salad-based heart patches. The team was able to flush the cells out of other plants including parsley, peanut hairy roots, and sweet wormwood, and they think the technique could be adapted to work with other plants that would be a good match to grow certain types of human cells. They wrote:
"The spinach leaf might be better suited for a highly-vascularized tissue, like cardiac tissue, whereas the cylindrical hollow structure of the stem of Impatiens capensis (jewelweed) might better suit an arterial graft. Conversely, the vascular columns of wood might be useful in bone engineering due to their relative strength and geometries."
This is far from the only lab looking to the plant world for body parts: One Canadian researcher is working on making ears out of apples. The phrase "you are what you eat" suddenly takes on a whole new meaning, doesn't it?
Muneeza Shamsie reviews Only the Longest Threads by Tasneem Zehra Husain in Newsweek Pakistan:
Her novel is framed by the growing friendship between Sara Byrne, a theoretical physicist, and Leonardo Santorini, a science journalist. They are both in Geneva on July 4, 2012, among an expectant and excited crowd, to witness a historic event: proof of the Higgs boson’s existence. This elusive subatomic particle, so crucial to the understanding of the universe and its building blocks, is revealed onscreen in an auditorium and becomes reality when the underground Large Hadron Collider creates such a high-speed collision of protons that it releases energy and short-lived particles, akin to the Big Bang—the birth of the universe.
Sara, heady from the jubilation of the moment, encourages Leo to move beyond the immediacy of journalism to the imaginative realms of fiction. He wants to recreate those moments of intensity and joy which impelled scientists in their search for answers. Sara says, “Theoretical physics is largely a private matter, a life lived out in the mind.” Leo captures this in the six stories he creates. In each, he employs a different narrator. In each, he welds scientific ideas of the era in which the narrator lives with the language, intonations, references, and lifestyle of that time. Husain enhances her narrative by creating an email exchange between them that gives further context to Leo’s stories. He sends all six to her for comment in three installments. He then asks her to write the seventh one, on string theory.
Just to give some idea of what killing the NEA will (or more aptly, will not) accomplish, the $146 million budget of the National Endowment for the Arts represents just 0.012% (about one one-hundredth of one percent) of our federal discretionary spending. According to 2012 NEA figures, the annual budget for the arts per capita (in dollars) in Germany was $19.81; in England, $13.54; in Australia, $8.16; in Canada, $5.19, and in the United States just $0.47. Yes, 47 cents annually per capita. For all the arts combined. And the new POTUS feels that’s too much.
It would be impossible to enumerate all the programs that will likely die when the NEA and the NEH are killed, and the many people these cuts will deprive of things like public television programming and National Public Radio; school enrichment programs in the arts; and community programs to encourage music, dance, theater, visual art and literary art, literacy, and the pleasure of reading.
In September 2013, Marko Ahtisaari resigned from his position as the head of product design at Nokia. The Finnish company had just been acquired by Microsoft and Ahtisaari, the son of a former president of Finland, decided it was time to look for his next startup. He joined the MIT Media Lab shortly after, where he was introduced by Joi Ito, the Lab’s director, to Ketki Karanam, a biologist who was studying how music affects the brain. Ahtisaari was naturally interested: he grew up playing the violin and later studied music composition at Columbia University. “I used to be part of the New York scene,” Ahtisaari says. “I left to do product design and to be an entrepreneur. For 15 years I didn’t play much. I have friends who are now playing with Thom Yorke and the Red Hot Chili Peppers.”
Karanam showed Ahtisaari that there was an increasing body of evidence based on imaging studies that showed what happens to the brain when exposed to music. “It fires very broadly,” Ahtisaari says. “It’s not just the auditory cortex. What happens is essentially similar to when we take psycho-stimulants. In other words, when we take drugs.”
To Ahtisaari, this indicated that music could, at least in principle, complement or even replace the effects that pharmaceuticals had on our neurology. For instance, there were studies that showed that patients with Parkinson’s disease improved their gait when listening to a song with the right beat pattern.
Holmes’s “This Long Pursuit” is itself a complement to two earlier volumes: “Footsteps: Adventures of a Romantic Biographer” (1985) and “Sidetracks: Explorations of a Romantic Biographer” (2000). All three are, essentially, collections of essays, talks, reminiscences and reviews held together by their author’s description of himself as a “romantic biographer.” That phrase carries multiple meanings: While Holmes’s field is, roughly, England in the age of Coleridge, he sometimes writes about romantic figures of other nations and periods (poet Gérard de Nerval, novelist Robert Louis Stevenson) and he himself clearly possesses an adventurous, romantic spirit.
In this new book’s first essay, “Travelling,” Holmes suggests that “biography is not merely a mode of historical enquiry. It is an act of imaginative faith.” To attain the requisite empathy, he early on adopted two key practices. The first he called the Footsteps principle. “I had come to believe that the serious biographer must physically pursue his subject through the past,” he explains. “Mere archives were not enough. He must go to all the places where the subject had ever lived or worked, or travelled or dreamed.” The biographer must then try to grasp their impact on his subject. “He must step back, step down, step inside the story.”
Though few contemporary Christians would likely admit it, many of the American colonies were built upon the idea of redistribution. Those dour Puritans who first populated the territories of New England were not lured by the promise of windfall profits. Nor had they endured months of seasickness and disease for the chance to start a small business. Instead, they were hopeless utopians, runaway apostates of the established church who yearned to embrace a higher manner of being, one founded upon a system of communitarian ethics. John Winthrop, the Puritan governor of the Massachusetts Bay Colony, sketched the tenets of this new society in a sermon called “A Model of Christian Charity,” which he delivered in 1630 while on board a British ship headed across the Atlantic. A gusty ode to American exceptionalism, the homily christened the new continent “The City Upon a Hill,” a metaphor that Ronald Reagan would make a watchword for Republicans some three-hundred-and-fifty years later. But in Winthrop’s eyes what gave the New World its luster were the egalitarian principles of the Protestant gospel, central among them the commitment to redistributing wealth on the basis of individual need. “We must be willing,” Winthrop said, “to abridge ourselves of our superfluities for the sakes of others’ necessities . . . we must bear one another’s burdens.”
It is stupefying to consider how, over the course of four centuries, American Christianity would forsake these humble sentiments for the telegenic hucksterism of preachers like Joel Osteen. This Pentecostal quack with a garish smile doesn’t tout the spiritual benefits of communal interdependence. Nor does he acknowledge the ethical requirements of the Christian social contract. Instead, like so many stewards of the “prosperity gospel,” Osteen thinks individual wealth is a hallmark of Christian virtue and urges his followers to reach inside themselves to unlock their hidden potential.
My husband and I have lived in Bulgaria for six months, lived in this country often confused for other places. “You’ll have to brush up on your French,” said a friend before I left the U.S., believing me bound for Algeria. “Enjoy the northern lights,” said another. Bulgaria is one of the forgotten nations once tucked behind the Iron Curtain, its cities now stocked with crumbling Soviet tenements and silent factories and stray dogs too hungry to bark. In the winter, in Haskovo —the city where I teach English to three hundred hardened teenagers—the air thickens to a gray haze as residents burn brush and scraps of trash to heat their homes. The smoke makes me cough, makes my eyes sting, makes my thoughts turn dark.
Today, though, we have left Haskovo. We have left winter as well. The first spring blossoms are starting to show, forsythia yellowing the countryside. As the road to the Devil’s Throat continues its manic winding route through the Rhodopes, we pass the occasional village of squat red-roofed dwellings, laundry lines strung with colorful underwear like prayer flags. Chickens bustle after bugs. Kids kick soccer balls on smears of new grass.
“21 km,” says a sign.
Even in the presence of spring, I feel nervous. I can’t help imagining the ways we might die on this mountain road, squeezed between cliffs and a squalling river. It’s a bad habit of mine: envisioning worst-case scenarios.
Explanations run shallow and deep. You have a red blister on your finger because you touched a hot iron; you have a red blister on your finger because the burn excited an inflammatory cascade of prostaglandins and cytokines, in a regulated process that we still understand only imperfectly. Knowing why—asking why—is our conduit to every kind of explanation, and explanation, increasingly, is what powers medical advances. Hinton spoke about baseball players and physicists. Diagnosticians, artificial or human, would be the baseball players—proficient but opaque. Medical researchers would be the physicists, as removed from the clinical field as theorists are from the baseball field, but with a desire to know “why.” It’s a convenient division of responsibilities—yet might it represent a loss? “A deep-learning system doesn’t have any explanatory power,” as Hinton put it flatly. A black box cannot investigate cause. Indeed, he said, “the more powerful the deep-learning system becomes, the more opaque it can become. As more features are extracted, the diagnosis becomes increasingly accurate. Why these features were extracted out of millions of other features, however, remains an unanswerable question.” The algorithm can solve a case. It cannot build a case.
Yet in my own field, oncology, I couldn’t help noticing how often advances were made by skilled practitioners who were also curious and penetrating researchers. Indeed, for the past few decades, ambitious doctors have strived to be at once baseball players and physicists: they’ve tried to use diagnostic acumen to understand the pathophysiology of disease. Why does an asymmetrical border of a skin lesion predict a melanoma? Why do some melanomas regress spontaneously, and why do patches of white skin appear in some of these cases? As it happens, this observation, made by diagnosticians in the clinic, was eventually linked to the creation of some of the most potent immunological medicines used clinically today. (The whitening skin, it turned out, was the result of an immune reaction that was also turning against the melanoma.) The chain of discovery can begin in the clinic. If more and more clinical practice were relegated to increasingly opaque learning machines, if the daily, spontaneous intimacy between implicit and explicit forms of knowledge—knowing how, knowing that, knowing why—began to fade, is it possible that we’d get better at doing what we do but less able to reconceive what we ought to be doing, to think outside the algorithmic black box?
I first realized I'd been bitten by the science bug in the summer of 1987. I was walking home from the laboratory, mulling over an organic chemistry reaction that I had been attempting — and mostly failing — to execute. Suddenly, a notion coalesced in my 19-year-old brain: all human biology and disease must ultimately come down to reactions that either proceed properly or go awry. As I savoured the evening breeze, I knew that I wanted to dedicate my career to understanding these mechanisms and thereby to hasten new treatments. Nearly every scientist remembers moments like these. I am saddened, therefore, by the cynical view that has become increasingly common in both academia and industry: that much biomedical science, even — or perhaps especially — that which appears in 'high-profile' journals, is bogus. I am one of many scientists who have seen their past research subjected to unexpected scrutiny as a result. An attempt to replicate work from my team was among the first described by the Reproducibility Project: Cancer Biology, an initiative that independently repeated experiments from high-impact papers. In this case, as an editorial that surveyed the first replications explained, differences between how control cells behaved in the two sets of experiments made comparisons uninformative [1]. The replicators' carefully conducted experiment showed just how tough it can be to reproduce results.
…We scientists search tenaciously for information about how nature works through reason and experimentation. Who can deny the magnitude of knowledge we have gleaned, its acceleration over time, and its expanding positive impact on society? Of course, some data and models are fragile, and our understanding remains punctuated by false premises. Holding fast to the three Rs ensures that the path — although tortuous and treacherous at times — remains well lit.
The decision guaranteeing abortion rights in the United States, found in Roe v. Wade (1973), was based on a right to privacy, which the court found to be primarily protected by the Fourteenth Amendment's "concept of personal liberty and restrictions upon state action" and the Ninth Amendment's "reservation of rights to the people". While it is not discussed at any length, the First Amendment is cited in relation to the freedom of speech, most substantially as a subsidiary foundation for the right to privacy, established by Stanley v. Georgia (1969). Religion played no role in Roe v. Wade, though it has arguably played a direct role in Planned Parenthood v. Casey (1992). There, the majority's decision plainly states, "The destiny of the woman must be shaped to a large extent on her own conception of her spiritual imperatives and her place in society." One might naturally read this as an expression of "religious liberty" and an implication of the non-establishment clause of the First Amendment of the Constitution, stating that "Congress shall make no law respecting an establishment of religion".
Despite this, "religious liberty" has come to the fore most forcefully in recent years as a contrary banner under which some religiously minded people insist that the First Amendment's protection against laws "prohibiting the free exercise" of religion secures the right to refuse various services to homosexuals and to deny homosexual couples the right to marry. The free exercise clause is invoked in the Supreme Court case Burwell v. Hobby Lobby (2014), in a decision finding that corporations need not pay for employees' contraception. It is worth noting that Neil Gorsuch, the current nominee to the Supreme Court, was an author of the appellate decision that was upheld in Burwell. But as important as the "free exercise" clause is, it must be balanced against the "non-establishment" clause, which precedes it in the document as the first clause in the amendment.
In philosophy the most important development in the last 300 years has been the idea that what can be intelligibly said about reality is constructed out of our subjective responses, suitably constrained by social norms and intersubjective communication. This is the essence of Immanuel Kant's so-called Copernican Revolution in philosophy which converted us from naïve realists who took reality at face value to sophisticated anti-realists constructing reality via the structures of consciousness and language.
Kant's argument may be valid, but its conclusion is preposterous. One would have thought that reality's stubborn resistance to our ideas and expectations, and the fact that we are often surprised by this resistance, might lead us to take the idea of a real world more seriously. There is a performative contradiction in claiming that all reality is a social construction while traipsing off to the doctor when ill; and anti-realism renders truth and knowledge the exclusive purview of scientists, who have never shown much inclination toward anti-realism themselves. But once these "naïve" realist thoughts are cast out in favor of Kant's fastidious, critical skepticism, common sense can't find a way back in. And so for 300 years we have been denying what to non-philosophers seems obvious—there is a real world out there with which our senses put us into contact.
In light of this revolution in thought we were, by now, supposed to be basking in the friendly solidarities of intersubjective agreement, a consequence that unfortunately appears to be increasingly remote. This idea that reality is a social construction ebbs and flows outside the philosophy class but in today's "post-truth" society it seems ascendant. Perhaps a new way must be found to anchor truth in something more substantial than contingent, collective agreements.