The Tempest, Act 4, Scene 1 (excerpt)
Our revels now are ended. These our actors,
As I foretold you, were all spirits and
Are melted into air, into thin air:
And, like the baseless fabric of this vision,
The cloud-capp’d towers, the gorgeous palaces,
The solemn temples, the great globe itself,
Yea, all which it inherit, shall dissolve
And, like this insubstantial pageant faded,
Leave not a rack behind. We are such stuff
As dreams are made on, and our little life
Is rounded with a sleep.
Rockwell Anyoha in SITN (Harvard University):
In the first half of the 20th century, science fiction familiarized the world with the concept of artificially intelligent robots. It began with the “heartless” Tin Man from the Wizard of Oz and continued with the humanoid robot that impersonated Maria in Metropolis. By the 1950s, we had a generation of scientists, mathematicians, and philosophers with the concept of artificial intelligence (or AI) culturally assimilated in their minds. One such person was Alan Turing, a young British polymath who explored the mathematical possibility of artificial intelligence. Turing suggested that humans use available information as well as reason in order to solve problems and make decisions, so why can’t machines do the same thing? This was the logical framework of his 1950 paper, Computing Machinery and Intelligence, in which he discussed how to build intelligent machines and how to test their intelligence.
Unfortunately, talk is cheap. What stopped Turing from getting to work right then and there? First, computers needed to fundamentally change. Before 1949 computers lacked a key prerequisite for intelligence: they couldn’t store commands, only execute them. In other words, computers could be told what to do but couldn’t remember what they did. Second, computing was extremely expensive. In the early 1950s, the cost of leasing a computer ran up to $200,000 a month. Only prestigious universities and big technology companies could afford to dillydally in these uncharted waters. A proof of concept as well as advocacy from high profile people were needed to persuade funding sources that machine intelligence was worth pursuing.
Five years later, the proof of concept was initialized through Allen Newell, Cliff Shaw, and Herbert Simon’s Logic Theorist. The Logic Theorist was a program designed to mimic the problem solving skills of a human and was funded by the Research and Development (RAND) Corporation. It’s considered by many to be the first artificial intelligence program and was presented at the Dartmouth Summer Research Project on Artificial Intelligence (DSRPAI) hosted by John McCarthy and Marvin Minsky in 1956. At this historic conference, McCarthy, imagining a great collaborative effort, brought together top researchers from various fields for an open ended discussion on artificial intelligence, the term which he coined at the very event.
Kathryn Hughes at The Guardian:
Above all, Cockayne wants us to revise our assumption that people in the past must have been better at reuse and recycling (not the same thing) than we are today. There is, she insists, “no linear history of improvement”, no golden age when everyone automatically sorted their household scraps and spent their evenings turning swords into ploughshares because they knew it was the right thing to do. Indeed, in the 1530s the more stuff you threw away, the better you were performing your civic duty: refuse and surfeit was built into the semiotics of display in Henry VIII’s England, which explains why, on formal occasions, it was stylish to have claret and white wine running down the gutters. Three generations later and Oliver Cromwell’s Parliamentarian troops had found new ways to manipulate codes of waste and value. In 1646 they ransacked Winchester cathedral and sent all the precious parchments off to London to be made into kites “withal to fly in the air”.
Martha Ackmann at The Atlantic:
Dickinson said her life had not been constrained or dreary in any way. “I find ecstasy in living,” she explained. The “mere sense of living is joy enough.” When at last the opportunity arose, Higginson posed the question he most wanted to ask: Did you ever want a job, have a desire to travel or see people? The question unleashed a forceful reply. “I never thought of conceiving that I could ever have the slightest approach to such a want in all future time.” Then she loaded on more. “I feel that I have not expressed myself strongly enough.” Dickinson reserved her most striking statement for what poetry meant to her, or, rather, how it made her feel. “If I read a book [and] it makes my whole body so cold no fire can ever warm me I know that is poetry,” she said. “If I feel physically as if the top of my head were taken off, I know that is poetry. These are the only way I know it. Is there any other way.” Dickinson was remarkable. Brilliant. Candid. Deliberate. Mystifying. After eight years of waiting, Higginson was finally sitting across from Emily Dickinson of Amherst, and all he wanted to do was listen.
S. Jones and Stefan Helmreich in Boston Review (figure from Kristine Moore et al., “The Future of the COVID-19 Pandemic” (April 30, 2020). Used by Boston Review with permission from the Center for Infectious Disease Research and Policy, University of Minnesota):
On January 29, just under a month after the first instances of COVID-19 were reported in Wuhan, Chinese health officials published a clinical report about their first 425 cases, describing them as “the first wave of the epidemic.” On March 4 the French epidemiologist Antoine Flahault asked, “Has China just experienced a herald wave, to use terminology borrowed from those who study tsunamis, and is the big wave still to come?” The Asia Times warned shortly thereafter that “a second deadly wave of COVID-19 could crash over China like a tsunami.” A tsunami, however, struck elsewhere, with the epidemic surging in Iran, Italy, France, and then the United States. By the end of April, with the United States having passed one million cases, the wave forecasts had become bleaker. Prominent epidemiologists predicted three possible future “wave scenarios”—described by one Boston reporter as “seascapes”—characterized by oscillating outbreaks, the arrival of a “monster wave,” or a persistent and rolling crisis.
While this language may be new to much of the public, the figure of the wave has long been employed to describe, analyze, and predict the behavior of epidemics. Understanding this history can help us better appreciate the conceptual inheritances of a scientific discipline suddenly at the center of public discussion. It can also help us judge the utility as well as limitations of those representations of epidemiological waves now in play in thinking about the science and policy of COVID-19. As the statistician Edward Tufte writes in his classic work The Visual Display of Quantitative Information (1983), “At their best, graphics are instruments for reasoning about quantitative information.” The wave, operating as a hybrid of the diagrammatic, mathematical, and pictorial, certainly does help to visualize and think about COVID-19 data, but it also does much more. The wave image has become an instrument for public health management and prediction—even prophecy—offering a synoptic, schematic view of the dynamics it describes.
Aparna Kapadia in Scroll.in:
In India, some of the earliest printed recipe books became popular in the 19th century and were written for Anglo-Indians, the term used for the British settled there. While recipes were recorded in the precolonial era, these were in manuscript form, their production and use restricted to very elite, mostly royal settings. Even then, unlike our contemporary cookbooks, entire books containing only recipes were rare: the late 15th-century Persian work Nimatnama from the Malwa sultanate and Supa Shastra or Science of Cooking, composed around the same period by a Jain king from the present-day Karnataka region, are some known examples of collections that contain recipes for food as well as aphrodisiacs and health potions.
Prescriptions for what people could and should eat, as well as recipes, were more often assimilated into texts produced for broader purposes. For instance, the 16th-century work, Ain-i-Akbari, is primarily a record of Mughal emperor Akbar’s administration. But this compendium also contains sections on the management of various branches of the imperial kitchens and describes recipes that range from simple everyday items like khichdi and saag (greens) to richer dishes including a saffron-infused lamb biryani and halwa made in ghee. The Emperor, it seems, liked to oversee the management of every part of his empire.
Cate Lineberry over at Smithsonian Magazine:
Like so many enslaved people, Smalls was haunted by the idea that his family—his wife, Hannah; their four-year-old daughter, Elizabeth; and their infant son, Robert, Jr.—would be sold. And once separated, family members often never saw each other again.
The only way Smalls could ensure that his family would stay together was to escape slavery. This truth had occupied his mind for years as he searched for a plan with some chance of succeeding. But escape was hard enough for a single man; to flee with a young family in tow was nearly impossible: enslaved families often did not live or work together, and an escape party that included children would slow the journey significantly and make discovery much more likely. Traveling with an infant was especially risky; a baby’s cry could alert the slave patrols. And the punishment if caught was severe; owners could legally have runaways whipped, shackled, or sold.
Now Smalls’ chance at freedom had finally come. With a plan as dangerous as it was brilliant, he quietly alerted the other enslaved crew members on board. It was time to seize the Planter.
Gita Gopinath over at the IMF:
This crisis like no other will have a recovery like no other.
First, the unprecedented global sweep of this crisis hampers recovery prospects for export-dependent economies and jeopardizes the prospects for income convergence between developing and advanced economies. We are projecting a synchronized deep downturn in 2020 for both advanced economies (-8 percent) and emerging market and developing economies (-3 percent; -5 percent if excluding China), and over 95 percent of countries are projected to have negative per capita income growth in 2020. The cumulative hit to GDP growth over 2020–21 for emerging market and developing economies, excluding China, is expected to exceed that in advanced economies.
Second, as countries reopen, the pick-up in activity is uneven. On the one hand, pent-up demand is leading to a surge in spending in some sectors like retail, while, on the other hand, contact-intensive services sectors like hospitality, travel, and tourism remain depressed. Countries heavily reliant on such sectors will likely be deeply impacted for a prolonged period.
Third, the labor market has been severely hit and at record speed, and particularly so for lower-income and semi-skilled workers who do not have the option of teleworking. With activity in labor-intensive sectors like tourism and hospitality expected to remain subdued, a full recovery in the labor market may take a while, worsening income inequality and increasing poverty.
Charles Swanton in Nature:
Before the COVID-19 pandemic, the TRACERx project had recruited 760 people with early-stage lung cancer. After a person is diagnosed with a primary lung tumour, it is surgically removed and the cells are analysed to reconstruct the tumour’s evolutionary history. Each individual receives a computed tomography (CT) scan every year for five years to check whether their cancer has returned. If there is no sign of relapse, they are discharged and deemed to have been cured. People with later-stage tumours (stages 2 and 3) are offered chemotherapy following surgery to improve the chance of remission or cure.
Analysis of tumours from the first 100 people enrolled in the study has revealed many genomic changes. These include chromosome deletions and duplications, and even the doubling of whole genomes in nearly three-quarters of tumours — a feature of many cancers [1]. Point mutations in DNA, arising from single changes in the genome sequence, were also prevalent. These occurred as a result of tobacco exposure and the activity of enzymes called cytidine deaminases, which normally deactivate invading viruses as part of the immune response. Another finding is that whole-genome doubling often occurs early on in those with lung cancer who have a history of smoking [1]. This doubling seems to protect the genes needed for the tumour’s survival in the face of the excessive mutations and chromosomal losses that occur in its genome as it develops [2]. Strikingly, mutations induced by smoking tend to dominate the ‘trunk’ of the tumour’s evolutionary tree. These are known as founder or truncal mutations, and are present in every tumour cell. For the most common type of non-small-cell lung cancer — adenocarcinomas that form in mucus-secreting glands — the number of smoking-related mutations in the trunk correlates with the number of cigarettes that the person has smoked. As the cancer advances, cytidine deaminases cause haphazard mutations to accumulate in some cells; we refer to these as branched mutations [2].
With a view to devising immunotherapy [3], we also investigated the DNA sequences of receptors on T cells, a type of white blood cell that fights off infection and emerging cancers. We were surprised to discover that the sequences of T-cell receptors evolved in parallel with the tumour.
…The overriding message is that it will take multiple approaches to outpace cancers that have sophisticated evolutionary mechanisms, such as lung cancer. Each approach will have to focus on different stages of the disease to improve outcomes.
Kamila Shamsie in Tate Etc:
It is worth mentioning here that it’s widely believed that the greatest piece of writing – indeed, the greatest piece of art – created about Partition is a short story, called ‘Toba Tek Singh’, just a handful of pages long, by the Urdu writer Saadat Hasan Manto. The story is about the (and I’m using the language of the time) inmates of a lunatic asylum in the district of Toba Tek Singh, near Lahore. When Partition takes place, the lunatics must be divided between India and Pakistan. It ends with one of the inmates lying down in no man’s land, muttering nonsense words: ‘Upar di gur gur di annexe di be-dhiyana mung di daal of di Toba Tek Singh and Pakistan’. A rough translation would go: ‘Upstairs the rumbling the annex the heedlessness the lentils of Toba Tek Singh and Pakistan’. A man speaking nonsense words in a world where reality is beyond what words can convey – that is part of the effectiveness of this story. The Partition of word and meaning. The inability of language to make sense of what is going on, except through conveying senselessness.
It is certainly true that sometimes the linearity of language, the logic of first this-then this-then this, words become sentences become paragraphs – sometimes that seems too neat, too false, to convey a shattering, a sundering.
I’ve felt this in my own writing. Some years ago, when writing about the dropping of the atom bomb on Nagasaki, I knew the moment of the bomb detonating could only be depicted by two blank pages – the closest thing to a flash of white light I could put on a page.
If I had to invent a story about my life as a writer leading up to the moment of those two blank pages, it would start with me looking and looking at that framed letter on the wall and choosing not to read it. That was the moment when I first knew that sometimes words are not what you need to convey the weight of a particular kind of moment.
As a writer it’s hard not to be drawn to considerations of how text functions in art – I’m particularly interested in how it functions around events that seem to leach the meaning out of everyday language, or else make every word so freighted that it collapses under its own weight. What are words when you don’t use them to convey their literal meaning, but find a way to burrow down into them and find the symbolic heft of them instead?
After a long time, we spoke in the same tongue.
Should I move on, I asked God and God replied,
You do not seem like you want to move on.
I sighed, realizing God’s tactic.
Our talks were indoors, in candlelit rooms.
Anytime God spoke, flames fluttered.
Would you mind if I moved on, I asked God
and waited for the candle’s noise.
God’s silence seemed a staunch no.
I asked if you were at fault for leaving me
and if I was at fault for hurting you.
Both of you made mistakes, God said.
It relieved me to know that you, too, were wrong.
by Okwudili Nebeolisa
from Ecotheo Review
Alexander Stern at the Los Angeles Review of Books:
When I was a child, my brother and I played a computer game based on the Indiana Jones movie franchise. In the course of his adventures, Indy would sometimes come to a delicate impasse that required tact and nuance to resolve. Some adversary was blocking his path to a relic or treasure (including, oddly enough, one of Plato’s lost dialogues), and he needed to say just the right thing to get past them and on to the next challenge. The game offered players a selection of five or so sentences to choose from. The world would sit there in 8-bit paralysis and wait while we pondered how to make it do what we wanted. Press enter on the right sentence and pixelated Indy would sail through. Choose something inapt and game over.
We are all playing Indiana Jones and the Fate of Atlantis now. A significant chunk of our lives involves gaping at screens, finger hovering over the send button, weighing our options, strategizing, obsessing over what will happen next. When we’ve finally made our decision we sit back and wait for the program to update — the response to land in our inbox, the algorithmic world to process our input and tell us if it has produced the hoped-for effect — or, God forbid, backfired and ruined us forever.
The interpolation of screens into the fabric of our lives has changed the character of our interactions in innumerable small but important ways that we have scarcely begun to recognize, reckon with, or reconcile ourselves to.
Sharon Begley in Stat:
What they are understanding is that this coronavirus “has such a diversity of effects on so many different organs, it keeps us up at night,” said Thomas McGinn, deputy physician in chief at Northwell Health and director of the Feinstein Institutes for Medical Research. “It’s amazing how many different ways it affects the body.”
One early hint that that would be the case came in late January, when scientists in China identified one of the two receptors by which the coronavirus, SARS-CoV-2, enters cells. It was the same gateway, called the ACE2 receptor, that the original SARS virus used. Studies going back some two decades had mapped the body’s ACE2 receptors, showing that they’re in cells that line the insides of blood vessels — in what are called vascular endothelial cells — in cells of the kidney’s tubules, in the gastrointestinal tract, and even in the testes.
Simon Blackburn at 3:16:
When I migrated from Cambridge to my new post at Pembroke College Oxford in the fall of 1969 I got the sense that many of the Old Guard there regarded me as some kind of foreign usurper. Perhaps their own favourite pupils had not been offered the job as they undoubtedly deserved. An exception was Peter Strawson, who was always friendly, and whom I came to admire greatly. Some years later when I belonged to a rather grubby photography workshop in the city I persuaded him to come downtown for me to make a portrait of him. I hope that some of the affection I felt comes through, as well as his undoubted amusement at the occasion.
The film of Strawson’s conversation with Gareth Evans brings many memories of those years flooding back. It reminds me too of the Davidsonic Boom, as it was christened by Bernard Williams (the noise a research program makes when it gets to Oxford), for Davidson’s Locke lectures had ignited all the Young Turks in the Faculty. Convention T was all the rage. I later wrote Spreading the Word to try to help puzzled students make sense of it all.
Revisiting the film I was struck once more by Strawson’s wonderfully delicate, feline, handling of the issues surrounding truth. As Cheryl Misak has emphasized, he sounds remarkably like Frank Ramsey, whose famous dismissal of the idea that there is a separate problem of truth is but a prelude to his hurling himself at problems of meaning and assertion.
More here. [Thanks to Huw Price.]
Maria Popova at Brain Pickings:
In 1977, the poet Adrienne Rich exhorted a graduating class of young women to think of education not as something one receives but as something one claims. But what does an education mean, and what does claiming it look like, for lives and minds animating bodies born into dramatically different points along the vast spectrum of privilege and possibility which human society spans?
This question comes alive in a wonderfully unexpected and necessary way in one of the highlights of the third annual Universe in Verse by another great poet, essayist, and almost unbearably moving memoirist: Elizabeth Alexander — the fourth poet in history to read at an American presidential inauguration (she welcomed Barack Obama to the presidency with her shimmering poem “Praise Song for the Day”) and the first woman of color to preside over one of the world’s largest philanthropic foundations.
Sianne Ngai at The Paris Review:
Arising by most accounts in the last decades of the nineteenth century, the novel of ideas reflects the challenge posed by the integration of externally developed concepts long before the arrival of conceptual art. Although the novel’s verbal medium would seem to make it intrinsically suited to the endeavor, the mission of presenting “ideas” seems to have pushed a genre famous for its versatility toward a surprisingly limited repertoire of techniques. These came to obtrude against a set of generic expectations—nondidactic representation; a dynamic, temporally complex relation between events and the representation of events; character development; verisimilitude—established only in the wake of the novel’s separation from history and romance at the start of the nineteenth century. Compared to these and even older, ancient genres like drama and lyric, the novel is astonishingly young, which is perhaps why departures from its still only freshly consolidated conventions seem especially noticeable.
Albert and Mary Lasker Foundation Newsletter:
The highlights of Leroy Hood’s scientific career are like peaks in a mountain range spanning diverse fields, from molecular immunology and engineering, to genomics, to systems medicine. But Hood doesn’t think his trailblazing approach should be unusual, emphasizing that “one of the really key things about science is every 10 or 15 years, you really make a dramatic break and do something new… and you have to learn a lot before you can make fundamental contributions.”
One of Hood’s early contributions was in cracking the long-standing mystery of how the immune systems of humans and all vertebrates give rise to the vast diversity of antibodies that is critical for fighting myriad pathogens and foreign substances. During his PhD research at the California Institute of Technology (Caltech) in the mid-1960s, Hood and his advisor, William Dreyer, determined the amino acid sequences of components of antibody molecules and found that their sequences varied greatly between different antibodies. The finding helped advance their idea that each antibody is actually encoded by more than one gene, a big challenge to the existing dogma. Later, with his own research group at Caltech, Hood detailed the intricate process of how segments of antibody-encoding genes are rearranged, further creating antibody diversity. For his work in this area, Hood, along with Philip Leder and Susumu Tonegawa, won the Albert Lasker Basic Medical Research Award in 1987.
Despite his groundbreaking discoveries in molecular immunology, Hood has not focused on this field for the last 20 years or so. Soon after starting his research group in 1970, he realized there were “striking technological limitations” that were holding scientists back from a deeper understanding of the immune system. So, over the next two decades, Hood and his colleagues set about developing instruments, including the first automated DNA sequencer, which allowed more rapid reads of genes. Hood’s intense focus on technology development as a faculty member at Caltech opened his eyes to what was the first paradigm shift of his career: “bringing engineering to biology.”