“There is alcohol in this establishment. You love alcohol!”
These words recently greeted me from a chalkboard sign at a bar a few blocks away from my apartment. The sheer cheekiness nearly knocked me over. If I’d been about to enter that bar, I might have turned on my heel and walked away. The commercialism mixed with annoying solicitousness mixed with elbow-in-ribcage jokiness—it all felt so familiar. When did bar and café chalkboards start reading like some kind of cross between a pick-up line, “neg,” and Internet meme?
Long after the printing press rendered town criers obsolete, that other ancient form of information dissemination, the sidewalk sandwich board, quietly persists. Sometimes these chalkboards—you can find them standing outside certain not-corporate-and-proud-of-it businesses like bars, coffee shops, and boutiques—list the day’s specials or when happy hour is. But perhaps you too have lately noticed a certain creep away from the practical toward a softer sell: jokes, puns, quotations, drawings, and other creative expressions of branding. Too often, the results are cringeworthy…
Wondering if I was the only crank who found these signs aggressively unnecessary, I took to the Internet in search of sympathizers. I found plenty. “I think what irks me in general about these signs is just the overfamiliarity,” emailed Chiara Atik, a playwright and writer who has tweeted her ire for these signs. “Like I just want a coffee, not some timely allusion to last night’s Game of Thrones.” The strategy of attracting attention through clever signage may even be backfiring, resulting not in additional business but eye rolls. (From me anyway. I acknowledge the possibility that some people read these signs, laugh heartily, and happily hand over their dollars.)
In the 1960s, the English psychologist Peter Wason devised an experiment that would revolutionize his field. This clever puzzle, known as the “Wason selection task,” is often claimed to be “the single most investigated experimental paradigm in the psychology of reasoning,” in the words of one textbook author. Wason was a funny and clever man and an idiosyncratic thinker. His great insight was to treat reasoning as an enigma, something to scrutinize both critically and playfully. He told his colleagues, for instance, that he would familiarize himself with their work only after doing his own experiments, so as not to bias his own mind. He also said, quixotically, that researchers should never really know exactly why they were running an experiment before doing it. “The purpose of his experiments was not usually to test a hypothesis or theory, but rather to explore the nature of thinking,” a pair of his students wrote in Wason’s obituary. (He died in 2003.) “His aim was to reveal a surprising phenomenon—to show that thinking was not what psychologists including himself had taken it to be.”
The groundbreaking nature of Wason’s selection task may have been a result of his unconventional style. In one version of the task, one subject (always one—he spurned testing subjects in groups) is presented with four cards lying flat on a table, each with a single-digit number on one face and one of two colors on the other. Let’s imagine that you’re Wason’s subject. The first and second cards you see are a five and an eight; the third and fourth cards are blue and green, respectively. Wason liked to chat with his subjects, but he probably didn’t tell them that this logical puzzle was “deceptively easy,” which was how he described it in the paper he would later write, in 1968. Wason tells you that if a card shows an even number on one face, then its opposite face is blue. Which cards must you turn over in order to test the truth of his proposition, without turning over any unnecessary cards?
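The logic of the task can be made mechanical. A card needs to be turned over exactly when some possible hidden face could falsify the rule “if a card shows an even number, its opposite face is blue.” A minimal sketch, using the card faces from the example above:

```python
# Brute-force check of the Wason selection task described above.
# Each card shows one face; the rule under test is
# "if a card shows an even number, its other face is blue."
# A card must be flipped iff a hidden face could falsify the rule.
visible = ["5", "8", "blue", "green"]

def could_falsify(face):
    """True if some hidden face behind this card would break the rule."""
    if face.isdigit():
        # A number card can falsify the rule only if it is even
        # (its hidden colour might turn out not to be blue).
        return int(face) % 2 == 0
    # A colour card can falsify the rule only if it is not blue
    # (its hidden number might turn out to be even).
    return face != "blue"

must_flip = [face for face in visible if could_falsify(face)]
print(must_flip)  # ['8', 'green']
```

The odd card and the blue card are irrelevant: nothing on their hidden faces can contradict the rule. Most subjects, famously, choose the blue card anyway.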
Stem-cell scientists at McMaster University have developed a way to directly convert adult human blood cells to sensory neurons, providing the first objective measure of how patients may feel things like pain, temperature, and pressure, the researchers reveal in an open-access paper in the journal Cell Reports. Currently, scientists and physicians have a limited understanding of the complex issue of pain and how to treat it. “The problem is that unlike blood, a skin sample or even a tissue biopsy, you can’t take a piece of a patient’s neural system,” said Mick Bhatia, director of the McMaster Stem Cell and Cancer Research Institute and research team leader. “It runs like complex wiring throughout the body and portions cannot be sampled for study.” “Now we can take easy-to-obtain blood samples, and make the main cell types of neurological systems in a dish that is specialized for each patient,” said Bhatia. “We can actually take a patient’s blood sample, as routinely performed in a doctor’s office, and with it we can produce one million sensory neurons, [which] make up the peripheral nerves. We can also make central nervous system cells.”
Testing pain drugs
The new technology has “broad and immediate applications,” said Bhatia: It allows researchers to understand disease and improve treatments by asking questions such as: Why is it that certain people feel pain versus numbness? Is this something genetic? Can the neuropathy that diabetic patients experience be mimicked in a dish? It also paves the way for the discovery of new pain drugs that don’t just numb the perception of pain. Bhatia noted that the non-specific opioids used for decades are still in use today. “If I was a patient and I was feeling pain or experiencing neuropathy, the prized pain drug for me would target the peripheral nervous system neurons, but do nothing to the central nervous system, thus avoiding addictive drug side effects,” said Bhatia.
It was Kashmiri poetry that sparked the idea of a family summer holiday in Srinagar. I encountered Ranjit Hoskote’s I, Lalla—The Poems of Lal Ded in 2011, and was instantly hooked by the power packed in the four-line vakhs. Lal Ded, an unusual 14th-century female Kashmiri mystic and poet, inhabited a “Hindu-Buddhist universe of meaning,” as Hoskote puts it, while simultaneously drawing on Persian, Arabic, and Sufi philosophy. A similarly deep-rooted syncretism is part of my Goan heritage, and Lal Ded’s poems touched a personal chord. Before long, I became obsessed with the idea of an extended visit to Kashmir to learn more about the cultural roots that yielded this intriguing poetry.
When my wife, three young sons, and I finally arrived in Srinagar the following summer, we discovered Lal Ded’s poems are truly the bedrock of Kashmir’s many-layered identity. Favourite vakhs were recited to us proudly by schoolchildren and kebab-sellers; by the gate-keeper who ushered us through the wood-and-brick shrine dedicated to Naqshband Sahib, a 17th-century mystic who came to Kashmir from Bukhara; and also by the young man with wildly curly hair who piloted us through Dal Lake’s floating tomato plantations.
The heartfelt verses of Lal Ded are an important part of Kashmir’s living regional tradition, where Shaivism flows into Sufism through the unique “Muslim Rishis”. We found this richly confluent identity—Kashmiriyat—shining brightly on our very first night in Srinagar, when we attended a moonlit bhand pather performance as part of the Dara Shikoh festival hosted at Almond Villa, on the shores of Dal Lake. Directed by one of India’s best-known theatre directors, M.K. Raina, the folk troupe poked exuberant fun at the hypocrisies of religion.
Modern, fast, processed food is a disaster. That, at least, is the message conveyed by newspapers and magazines, on television cooking programs, and in prizewinning cookbooks.
It is a mark of sophistication to bemoan the steel roller mill and supermarket bread while yearning for stone ground flour and brick ovens; to seek out heirloom apples and pumpkins while despising modern tomatoes and hybrid corn; to be hostile to agronomists who develop high-yielding modern crops and to home economists who invent new recipes for General Mills.
We hover between ridicule and shame when we remember how our mothers and grandmothers enthusiastically embraced canned and frozen foods. We nod in agreement when the waiter proclaims that the restaurant showcases the freshest local produce. We shun Wonder Bread and Coca-Cola. Above all, we loathe the great culminating symbol of Culinary Modernism, McDonald’s — modern, fast, homogenous, and international.
Like so many of my generation, my culinary style was created by those who scorned industrialized food; Culinary Luddites, we may call them, after the English hand workers of the nineteenth century who abhorred the machines that were destroying their traditional way of life. I learned to cook from the books of Elizabeth David, who urged us to sweep our store cupboards “clean for ever of the cluttering debris of commercial sauce bottles and all synthetic flavorings.”
It was the 18th-century scientist Carolus Linnaeus who laid the foundations for modern biological taxonomy. It was also Linnaeus who argued for the existence of Homo troglodytes, a primitive people said to inhabit the caves of an Indonesian archipelago. Although troglodytes has since been proven to be an invalid taxon, archaeological doctrine continued to describe our ancestors as cavemen. The idea fits with a particular narrative of human evolution, one that describes a steady march from the primitive to the complex: Humans descended from the trees, stumbled about the land, made homes in caves, and finally found glory in high-rises. In this narrative, progress includes living inside confined physical spaces. This thinking was especially prevalent in Western Europe, where caves yielded so much in the way of art and artifacts that archaeologists became convinced that a cave was also a home, in the modern sense of the word.
By the 1980s, archaeologists understood that this picture was incomplete: The cave was far from being the primary residence. But archaeologists continued focusing on excavating caves, both because it was habitual and because the techniques involved were well understood.
Then along came the American anthropological archaeologist Margaret Conkey. Today a professor emerita at the University of California, Berkeley, she asked a simple question: What did cave people do all day? What if she looked at the archaeological record from the perspective of a mobile culture, like the Inuit? She decided to look outside of caves.
Each of the three monotheistic religions, commonly referred to as ‘Abrahamic’, has its own affirmation of faith, a single statement held to be fundamental by its adherents.
In Judaism, such a proclamation is Shema (Listen), drawn from Deuteronomy 6:4. It reads: “Listen, O Israel: The Lord is our God, the Lord is One!” Observant Jews must recite Shema daily—for instance, before falling asleep—and it is supposed to be the last thing they utter before dying. Even in the most private nocturnal moments and on the deathbed, Shema announces monotheistic creed, in the imperative, to the religious community, united around “our God” who is “One.”
Christianity, too, has its dogma going back to the Apostles’ Creed, dating to the year 150. Still read during the baptismal ritual, the statement of faith begins with the Latin word Credo, “I believe,” and continues “…in the all-powerful God the Father, Creator of heavens and earth, and in Jesus Christ, His only Son, our Lord, conceived by the Holy Spirit, born of the Virgin Mary…” Credo individualizes the believer; not only does it start with a verb in the first person singular, but it also crafts her or his identity through this very affirmation. While the Judaic Shema forges a community through a direct appeal to others, the Christian profession of faith self-referentially produces the individual subject of that faith.
The declaration of Islamic creed is called Shahada, “Testimony.” In contrast to its other monotheistic counterparts, however, it commences with a negation.
WHY DO THESE PEOPLE need so much water? The answer, in large part, is corn. In the 19th century, cattle raised on the plains were shipped off to Chicago for slaughter, but over time meatpacking moved progressively closer to the cow. The stockyards grew so huge that their size became inefficient. Improvements in the railroads and, later, the advent of the semitruck made it cheap to transport meat without a central site of production. Decentralization also enabled management to escape Chicago’s strong labor movement. The industry is now dispersed across dozens of small plains cities: Dodge City and Garden City on the Arkansas in Kansas, and Liberal, which isn’t far, as well as Greeley, Colorado, and Grand Island, Nebraska, along the Platte. Each city and its small hinterland is a vertically integrated unit for producing beef, and corn is the cheapest means to fatten cattle before they are sent to the slaughterhouse. Consequently, many plains farmers now grow corn instead of dryland crops like wheat. But corn is water-hungry and must have twenty inches of rainfall a year to survive and at least forty to thrive. Only one of the corn-growing counties along the upper Arkansas receives twenty inches of rain a year, and some places are so dry that they are, both technically and in outward appearance, deserts. Although corn is manifestly unsuited to the climate, it is grown in enormous volumes, and irrigation is what allows this to continue.
Since the early 1980s, conflicts have generally become more fragmented, meaning they involve more than two warring parties. The spread of internal conflicts has led outside nations to become more involved, which tends to prolong hostilities. In the 1990s, few internal conflicts drew outside powers. By 2010, almost 27 per cent of internal wars entangled outside nations. The causes of these fragmented internal conflicts are complex, varying from region to region. In parts of Africa, especially parts of West Africa in the 1990s, diamonds and other easily looted resources have helped drive conflict. In other parts of Africa, such as the eastern edge of the DRC, disease and environmental degradation have shaped regional fighting. An unrelenting appetite for narcotics in the US has stoked violence in many Latin American countries. Globally, a booming arms trade has helped give rise to Kalashnikov politics, ie politics practised with either an overt or implied threat of armed violence by competing factions. For the world’s aggrieved and malcontent, making war is easier than ever, and politics more violent and dangerous. So when the US goes to war today, it typically becomes a party to internal conflict instead of a combatant against another country.
Military triumphs against other nations – for example Iraq in 2003 – offer only fleeting victories and serve as preludes to the actual war. In these internal, fragmented conflicts, victory is elusive for any party involved…Statistically, the odds of the US coming up a winner in a modern war are perhaps as low as one in seven.
Superpowers and hegemons are also winning less frequently these days than they once did. From 1900 to 1949, strong militaries fighting conventionally weaker forces won victories about 65 per cent of the time. From 1950 to 1998, advantaged military powers claimed war victories only 45 per cent of the time. In the first part of the 19th century, superior powers won wars almost 90 per cent of the time. For hundreds of years, nations with the will and the means to raise strong militaries have wagered that the extraordinary investment of time, treasure and lives would yield rewards in war when the moment came. For hundreds of years, that was a safe bet – but not any more. For 21st-century superpowers, war is no longer likely to be a winning endeavour.
John Nash, a Nobel laureate and mathematical genius whose struggle with mental illness was documented in the Oscar-winning film A Beautiful Mind, was killed in a car accident on Saturday. He was 86. The accident, which occurred when the taxi Nash was traveling in collided with another car on the New Jersey Turnpike, also claimed the life of his 82-year-old wife, Alicia. Neither of the two drivers involved in the accident sustained life-threatening injuries. Born in West Virginia in 1928, Nash displayed an acuity for mathematics early in life, independently proving Fermat’s little theorem before graduating from high school. By the time he turned 30 in 1958, he was a bona fide academic celebrity. At Princeton, Nash published a 27-page thesis that upended the field of game theory and led to applications in economics, international politics, and evolutionary biology. His signature solution—known as a “Nash Equilibrium”—found that competition among two opponents is not necessarily governed by zero-sum logic. Two opponents can, for instance, each achieve their maximum objectives through cooperating with the other, or gain nothing at all by refusing to cooperate. This intuitive, deceptively simple understanding is now regarded as one of the most important social science ideas in the 20th century, and a testament to his almost singular intellectual gifts.
But in the late 1950s, Nash began a slide into mental illness—later diagnosed as schizophrenia—that would cost him his marriage, derail his career, and plague him with powerful delusions. Nash believed at various times that he was the biblical figure Job, a Japanese shogun, and a “messianic figure of great but secret importance.” He obsessed over numbers and believed the New York Times published coded messages from extraterrestrials that only he could read.
Mental institutions and electroshock therapy failed to cure him, and for much of the next three decades, Nash wandered freely on the Princeton campus, scribbling idly on empty blackboards and staring blankly ahead in the library.
One of the biggest mistakes my husband made as a new father was to tell me he thought his diaper-changing technique was better than mine. From then on, guess who assumed the lion’s share of diaper patrol in our household? Or rather, the northern flicker’s share. According to a new report in the journal Animal Behaviour on the sanitation habits of these tawny, 12-inch woodpeckers with downcurving bills, male flickers are more industrious housekeepers than their mates. Researchers already knew that flickers, like many woodpeckers, are a so-called sex role reversed species, the fathers spending comparatively more time incubating the eggs and feeding the young than do the mothers. Now scientists have found that the males’ parental zeal also extends to the less sentimental realm of nest hygiene: When a chick makes waste, Dad, more readily than Mom, is the one who makes haste, plucking up the unwanted presentation and disposing of it far from home.
Researchers have identified honeybee undertakers that specialize in removing corpses from the hive, and they have located dedicated underground toilet chambers to which African mole rats reliably repair to perform their elaborate ablutions. Among chimpanzees, hygiene often serves as a major driver of cultural evolution, and primatologists have found that different populations of the ape are marked by distinctive grooming styles. The chimpanzees in the Tai Forest of Ivory Coast, for example, will extract a tick or other parasite from a companion’s fur with their fingers and then squash the offending pest against their own forearms. Chimpanzees in the Budongo Forest of Uganda prefer to daintily place the fruits of grooming on a leaf for inspection, to decide whether the dislodged bloodsuckers are safe to eat, or should simply be smashed and tossed. Budongo males, those fastidious charmers, will also use leaves as “napkins,” to wipe their penises clean after sex.
In Oslo on May 19 John Nash and Louis Nirenberg received the 2015 Abel Prize “for striking and seminal contributions to the theory of nonlinear partial differential equations and its applications to geometric analysis”. The Abel Prize is barely a decade old but has quickly become one of the most prestigious awards in mathematics. To learn more about this year's winners, visit the Abel Prize webpage here. For an insight into the personalities of the two winners, I especially recommend these short videos.
This year's prize comes with sad news. On their way home from the award ceremony, John and Alicia Nash were killed in an auto accident. You can read the New York Times obituary here.
Last year at 3QD we talked about Yakov Sinai's work in dynamical systems. By coincidence this year's winners' work is closely related to the “exotic” non-Euclidean geometries we discussed at 3QD in March. It's a good chance to dig a little deeper into these topics and get the flavor of Nash and Nirenberg's work. Like last year I should say straight off that I'm not an expert, but I'm happy to talk about some cool mathematics.
John Nash, of course, is one of the most widely known mathematicians of the twentieth century. His life story was told by Sylvia Nasar in “A Beautiful Mind”. The book was made into an award-winning film of the same name starring Russell Crowe. It tells of Nash's brilliant work as a young man and his subsequent difficulties with mental health issues. It's a dramatic story, and the film is well worth watching. It should go without saying, but the movie turns the drama knob up to eleven and shouldn't be taken as an accurate depiction of Nash's life. For a more nuanced version of events I recommend Nasar's book.
The movie closes with John Nash winning the Nobel prize in Economics for his work in game theory. In game theory we use mathematics to study potential strategies, outcomes, etc., when two or more players are in competition. If you think only about tic-tac-toe, chess, and other such games, at first it sounds like a mathematical trifle. But once you begin to look around you see players in competition everywhere: people and corporations in the marketplace, countries in geopolitics, species in evolutionary competition, etc. Game theory is serious business!
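Nash's central idea can even be checked mechanically. A pair of strategies is a Nash equilibrium when neither player can do better by deviating alone. Here is a minimal sketch using an illustrative payoff table in the spirit of the prisoner's dilemma (the numbers are hypothetical, chosen only to show the non-zero-sum flavor):

```python
# Pure-strategy Nash equilibrium check for a two-player game.
# payoffs[(row_move, col_move)] = (row_player_payoff, col_player_payoff)
from itertools import product

payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
moves = ["cooperate", "defect"]

def is_nash(r, c):
    """Neither player gains by unilaterally switching strategies."""
    best_r = all(payoffs[(r, c)][0] >= payoffs[(alt, c)][0] for alt in moves)
    best_c = all(payoffs[(r, c)][1] >= payoffs[(r, alt)][1] for alt in moves)
    return best_r and best_c

equilibria = [(r, c) for r, c in product(moves, moves) if is_nash(r, c)]
print(equilibria)  # [('defect', 'defect')]
```

Mutual defection is the only stable point, even though both players would be better off cooperating: a tidy illustration of how competition need not be zero-sum, and why the equilibrium concept matters far beyond parlor games.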
I was first inspired to write a Qasida in English when I came across Lorca’s “Casida de la Rosa” while researching the history of Al Andalus for my book-length series of poems on Muslim Spain. I also knew of Qasida poems in Urdu. For Lorca, who was a native of Granada, Andalucia, and had fallen under the spell of Andalusi history, writing a “casida” was a way to enter an erased, haunting, vivifying past whose mystique and poetic sensibility he identified with and felt the urgency to express. Lorca’s work was produced at a time when, according to a contemporary of his, Europe was “suffering from a withering of the ability to desire.” A recurrent word in Lorca’s poetry is “quiero” or “I desire,” and in Bly’s words, Lorca “adopted old Arab forms to help entangle that union of desire and darkness, which the ancient Arabs loved so much.”
The qasida can certainly be seen as a poetic tradition with desire as its central theme. The classical Arabic qasida has fifty to a hundred lines with a fixed rhyming pattern. It is divided into three main thematic components and further divided into smaller units of certain fixed metaphors, which find nuances in the hands of the particular poet using the form. The primary metaphor that constitutes the qasida is that of being in sojourn, lost in the desert, in the pursuit of the loved one whose caravan always eludes the speaker. The journey, a figurative and literal subject of the qasida, may stand for desire. The different movements in the poem signify specific places along the journey that correlate to the poet’s emotional journey: the origins of his desire, nostalgia for past campsites, intense passion for the absent beloved, the larger map of life, the pride he takes in his tribe/caravan, how he relates to the tribe of the beloved, and so on. The tone of the subsections could be laudatory, melancholic, or romantic, allowing even humor and light-hearted derision of other tribes in one of the sub-sections. The imagery often tends to be abstract or symbolic, relying on the traditional, complex network of metaphors. As the ancient form of qasida developed through the centuries and across cultures, poets adapted it to suit concerns relevant to them, as in the case of the Andalusi Arabic poets that Lorca emulates.