The development of Pakistani literature in English

Rafia Zakaria in the Times Literary Supplement:

“What, then, shall that language be? One-half of the committee maintain that it should be the English. The other half strongly recommend the Arabic and Sanscrit. The whole question seems to me to be – which language is the best worth knowing?” So asked Lord Macaulay of the British Parliament on February 2, 1835. He went on, of course, to answer his own question; there was no way that the natives of the subcontinent over which they now ruled could be “educated by means of their mother-tongue”, in which “there are no books on any subject that deserve to be compared to our own”. And even if there had been, it did not matter, for English “was pre-eminent even among languages of the West”. English, it was decided, would be the language that would be taught to the natives. By 1837, English replaced Persian as the language of courtrooms and official business in Muslim India and took with it the cultural ascendancy of the Persian speakers.

This sordid story of tainted beginnings is aptly recounted in Muneeza Shamsie’s Hybrid Tapestries: The development of Pakistani literature in English, which traces the history of an often vexed but always intriguing literary lineage from the nineteenth century until today. It is a tricky tale to tell, not least because the moment of origin is also the moment of imposition and conquest. The development of Pakistani literature is directly linked to those deposed Muslims and their cherished Persian, which adds further flavours of resentment and betrayal to the mixture.

More here.

Philosophy shrugged: ignoring Ayn Rand won’t make her go away

Skye C Cleary in Aeon:

Philosophers love to hate Ayn Rand. It’s trendy to scoff at any mention of her. One philosopher told me: ‘No one needs to be exposed to that monster.’ Many propose that she’s not a philosopher at all and should not be taken seriously. The problem is that people are taking her seriously. In some cases, very seriously.

A Russian-born writer who moved to the United States in 1926, Rand promoted a philosophy of egoism that she called Objectivism. Her philosophy, she wrote in the novel Atlas Shrugged (1957), is ‘the concept of man as a heroic being, with his own happiness as the moral purpose of his life, with productive achievement as his noblest activity, and reason as his only absolute’. With ideals of happiness, hard work and heroic individualism – alongside a 1949 film starring Gary Cooper and Patricia Neal based on her novel The Fountainhead (1943) – it’s perhaps no wonder that she caught the attention and imagination of the US.

Founded three years after her death in 1982, the Ayn Rand Institute in California reports that her books have sold more than 30 million copies. By early 2018, the institute planned to have given away 4 million copies of Rand’s novels to North American schools. The institute has also actively donated to colleges, with the funding often tied to requirements to offer courses taught by professors who have ‘a positive interest in and [are] well-versed in Objectivism, the philosophy of Ayn Rand’ – with Atlas Shrugged as required reading.

Rand’s books are becoming increasingly popular. The Amazon Author Rank lists her alongside William Shakespeare and J D Salinger. While these rankings fluctuate and don’t reflect all sales, the company her name keeps is telling enough.

More here.

If borders were open: A world of free movement would be $78 trillion richer

From The Economist:

A HUNDRED-DOLLAR BILL is lying on the ground. An economist walks past it. A friend asks the economist: “Didn’t you see the money there?” The economist replies: “I thought I saw something, but I must have imagined it. If there had been $100 on the ground, someone would have picked it up.”

If something seems too good to be true, it probably is not actually true. But occasionally it is. Michael Clemens, an economist at the Centre for Global Development, an anti-poverty think-tank in Washington, DC, argues that there are “trillion-dollar bills on the sidewalk”. One seemingly simple policy could make the world twice as rich as it is: open borders.

Workers become far more productive when they move from a poor country to a rich one. Suddenly, they can join a labour market with ample capital, efficient firms and a predictable legal system. Those who used to scrape a living from the soil with a wooden hoe start driving tractors. Those who once made mud bricks by hand start working with cranes and mechanical diggers. Those who cut hair find richer clients who tip better.

“Labour is the world’s most valuable commodity—yet thanks to strict immigration regulation, most of it goes to waste,” argue Bryan Caplan and Vipul Naik in “A radical case for open borders”.

More here.

The depression epidemic and why the medical profession is failing patients

William Leith in The Sunday Times:

In 1989, a trainee physician called Edward Bullmore treated a woman in her late fifties. Mrs P had swollen joints in her hands and knees. She had an autoimmune disease. Her own immune system had attacked her, flooding her joints with inflammation. This, in turn, had eaten away at Mrs P’s collagen and bone, noted Bullmore, who was 29, and whose real ambition was to become a psychiatrist.

He asked Mrs P some routine questions about her physical symptoms, and made a correct diagnosis of rheumatoid arthritis. Then he asked her a few questions he wasn’t supposed to ask. How was she feeling? How would she describe her mood? Well, said Mrs P, she was feeling very low – she was tired, listless and losing the will to live. She couldn’t sleep.

At this point, Bullmore made another diagnosis. “She’s depressed,” he told his boss at the hospital.

“Depressed?” said the consultant. “Well, you would be, wouldn’t you?”

Both of these doctors understood that Mrs P had an inflammatory disease. They knew that it had wrecked her joints. They understood the basic process that caused the joints to be wrecked. And they also knew that Mrs P was depressed.

More here.

We Need to Save Ignorance From AI

Leuker and Van Den Bos in Nautilus:

After the fall of the Berlin Wall, East German citizens were offered the chance to read the files kept on them by the Stasi, the much-feared Communist-era secret police service. To date, it is estimated that only 10 percent have taken the opportunity. In 2007, James Watson, the co-discoverer of the structure of DNA, asked that he not be given any information about his APOE gene, one allele of which is a known risk factor for Alzheimer’s disease. Most people tell pollsters that, given the choice, they would prefer not to know the date of their own death—or even the future dates of happy events. Each of these is an example of willful ignorance. Socrates may have made the case that the unexamined life is not worth living, and Hobbes may have argued that curiosity is mankind’s primary passion, but many of our oldest stories actually describe the dangers of knowing too much. From Adam and Eve and the tree of knowledge to Prometheus stealing the secret of fire, they teach us that real-life decisions need to strike a delicate balance between choosing to know, and choosing not to.

But what if a technology came along that shifted this balance unpredictably, complicating how we make decisions about when to remain ignorant? That technology is here: It’s called artificial intelligence. AI can find patterns and make inferences using relatively little data. Only a handful of Facebook likes are necessary to predict your personality, race, and gender, for example. Another computer algorithm claims it can distinguish between homosexual and heterosexual men with 81 percent accuracy, and homosexual and heterosexual women with 71 percent accuracy, based on their picture alone. An algorithm named COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) can predict criminal recidivism from data like juvenile arrests, criminal records in the family, education, social isolation, and leisure activities with 65 percent accuracy.

More here.

Book clinic: which books best explain why life is worth living?

Julian Baggini in The Guardian:

Surprisingly, few of the world’s great philosophers have directly addressed this question. Instead, they have focused on a subtly different question: what does it mean to live well? In his Nicomachean Ethics, Aristotle emphasised the need to cultivate good character, finding the sweet spot between harmful extremes. For example, generosity lies between the extremes of meanness and profligacy, courage between cowardice and rashness. A remarkably similar vision is presented in the Chinese classics The Analects of Confucius and Mencius.

However, in the west, millennia of Christian dominance created the assumption that life needed some justification outside of itself. As religious belief waned, the question of whether life is worth living emerged as a central concern for the French existentialists of the 20th century. The gist of their answer was hardly inspiring: life is absurd so you’ve just got to get on with it and create your own meaning. If you’re up for the challenge, Jean-Paul Sartre’s Existentialism and Humanism and The Myth of Sisyphus by Albert Camus expand on this.

More recently, anglophone philosophers have offered more positive answers by pulling together threads in their tradition that have previously been separate. Two fine examples of this are Robert Nozick’s The Examined Life and Christopher Belshaw’s 10 Good Questions About Life and Death.

More here.

Monday Poem

9-Lived Cat

where are you

on the willow-hung swing
in a goldfield of grass
where
in the hemlock
straddling the branch just below the top
hands sticky with sap
where, where 
sitting on the well-house step
with the lake at your back
remembering a future
of victory or collapse
where
on the topside deck above the bridge
holding the cable-rail fast
exhilarated at how the bow’s pitch feels
spearing a new wave’s gut
as green water breaks over steel
and you feel up your spine
the meaning of
…………………….….….splash!
among zucchini
grubbing for ones green and fat
or off in a high in a twelve-string cage
hoping to harmonize with truth in that
where
are you tumbling up a shaft
like a 9-lived cat

Jim Culleny
6/18/18

Reading the Bakhshali Manuscript

Bill Casselman at the website of the American Mathematical Society:

The Bakhshali manuscript is a mathematical document found in 1881 by a local farmer in the vicinity of the village of Bakhshali, near the city of Peshawar in what was then British India and is now Pakistan. It is written in ink on birch bark, a common medium for manuscripts in northwestern India throughout much of history. In the tough climate of India and neighbouring regions, such things deteriorate rapidly, and it is miraculous that this document has survived.

The Bakhshali manuscript is in a very damaged state, but is a valuable mathematical record nonetheless. It now consists of 70 pages, but was probably once part of something much longer. Some of the pages we have are themselves broken up into fragments, and large parts are missing. Even the exact order of the pages has been a matter of conjecture, since the order in which the pages were found when the manuscript first came under careful examination is not necessarily the original one. The first edition of the manuscript was published by the Government of India in Calcutta in 1927, and its editor was G. R. Kaye. In 1995 a new edition was published, edited by Takao Hayashi as an extension of his PhD thesis at Brown University. He ordered the pages very differently from Kaye, and made a much more thorough translation.

The manuscript was donated to the Bodleian Library at Oxford University early in the twentieth century. Attempts to assess its age have generated much controversy: estimates have ranged, roughly, from 300 C.E. to 1200 C.E.

More here.  [Thanks to Pramathanath Sastry.]

Weapons reveal how this 5,300-year-old ice mummy lived, and died

Ashley Strickland at CNN:

Although he’s older than the Giza pyramids and Stonehenge, the 5,300-year-old mummy of Otzi the Tyrolean Iceman continues to teach us things.

The latest study of the weapons he was found with, published in the journal PLOS ONE on Wednesday, reveals that Otzi was right-handed and had recently resharpened and reshaped some of his tools before his death. The researchers were able to determine this by using high-powered microscopes to analyze the traces of wear on his tools.

The upper half of the Iceman’s body was accidentally discovered by a vacationing German couple hiking in the North Italian Alps in 1991. Otzi was found with a dagger, borer, flake, antler retoucher and arrowheads. But some of the stone was collected from different areas in Italy’s Trentino region, which would have been about 43.5 miles from where he was thought to live.

More here.

The age of patriarchy: how an unfashionable idea became a rallying cry for feminism today

Charlotte Higgins in The Guardian:

On 7 January this year, the alt-right insurgent Steve Bannon turned on his TV in Washington DC to watch the Golden Globes. The mood of the event was sombre. It was the immediate aftermath of multiple accusations of rape and sexual assault against film producer Harvey Weinstein, which he has denied. The women, whose outfits would normally have been elaborate and the subject of frantic scrutiny, wore plain and sober black. In the course of a passionate speech, Oprah Winfrey told the audience that “brutally powerful men” had “broken” something in the culture. These men had caused women to suffer: not only actors, but domestic workers, factory workers, agricultural workers, athletes, soldiers and academics. The fight against this broken culture, she said, transcended “geography, race, religion, politics and workplace”.

Bannon, Donald Trump’s former chief strategist, was one of 20 million Americans watching. In his view, the scene before him augured the beginning of a revolution “even more powerful than populism”, according to his biographer Joshua Green. “It’s deeper. It’s primal. It’s elemental. The long black dresses and all that – this is the Puritans. It’s anti-patriarchy,” Bannon declared. “If you rolled out a guillotine, they’d chop off every set of balls in the room … Women are gonna take charge of society. And they couldn’t juxtapose a better villain than Trump. He is the patriarch.” He concluded: “The anti-patriarchy movement is going to undo 10,000 years of recorded history.”

Until very recently, “patriarchy” was not something rightwing men were even supposed to believe in, let alone dilate upon with such apocalyptic relish. It was the sort of word that, if uttered without irony, marked out the speaker as a very particular type of person – an iron-spined feminist of the old school, or the kind of ossified leftist who complained bitterly about the evils of capitalism. Even feminist theorists had left it behind.

Nevertheless, “patriarchy” has, in the past year or so, bloomed in common parlance and popular culture.

More here.

The fall of New York and the urban crisis of affluence

Kevin Baker in Harper’s:

As New York enters the third decade of the twenty-first century, it is in imminent danger of becoming something it has never been before: unremarkable. It is approaching a state where it is no longer a significant cultural entity but the world’s largest gated community, with a few cupcake shops here and there. For the first time in its history, New York is, well, boring.

This is not some new phenomenon but a cancer that’s been metastasizing on the city for decades now. And what’s happening to New York now—what’s already happened to most of Manhattan, its core—is happening in every affluent American city. San Francisco is overrun by tech conjurers who are rapidly annihilating its remarkable diversity; they swarm in and out of the metropolis in specially chartered buses to work in Silicon Valley, using the city itself as a gigantic bed-and-breakfast. Boston, which used to be a city of a thousand nooks and crannies, back-alley restaurants and shops, dive bars and ice cream parlors hidden under its elevated, is now one long, monotonous wall of modern skyscraper. In Washington, an army of cranes has transformed the city in recent years, smoothing out all that was real and organic into a town of mausoleums for the Trump crowd to revel in.

More here.

Saturday Poem

“They flee from me that sometime did me seek”
– Sir Thomas Wyatt

After Wyatt After 11/8/’01

I flee from some whom sometime I sought out
for honest kind opinions: ‘Tell – do you like this?’
These trustees’ approval I have drunk like milk
to fortify my unproved notions’ bones;
their contrasting praises have supplied
my groping inspirations’ vitamins.

I’d now avoid their eyes and voices.

War’s ejaculation having mashed to dust
machines and walls and flesh, injects cement
into divergent certainties.
The knowing now all know
what knowledge will improve their faiths.

Discovered hesitant between, I would be crushed.

I grow surreptitious,
hide away my thoughts,
in case dear confidants, now fired
with passionate convictions,
find me out – the insult of my questioning,
the chill treachery inherent in my doubt.

by Lionel Abrahams
from International Poetry Web, 2004

On Ahmed Bouanani’s ‘The Hospital’

Chris Clarke at The Quarterly Conversation:

Like Bouanani’s memories of his childhood rue de Monastir, The Hospital is fastened securely to Morocco, even if it floats above it in a haze of time and space. Vergnaud expresses this endemic connection between lexicon and place succinctly: “The taxonomy of flora and fauna, smells and tastes, saints and legends permeates The Hospital,” she writes, meaning of course the one in Bouanani’s novel, and Bouanani’s novel itself. “With amnesia as the disease, and time itself in question, Bouanani delights in naming things—weeping willows and cyclamen flowers, prickly pears and esparto grass, Sidi bel Abbas and the two-horned Alexander—to anchor his character’s memories and dream lives.” Vergnaud’s lexical choices in these instances affect her reader in a slightly different way than do Bouanani’s, as the local implications can’t necessarily cross the gap, but in the end, the result is quite similar: these precisely vague choices tie us to a Morocco we can’t reach, much as they connect the in-patients to a Morocco that is fragmentary, inaccessible, and lost in the past.

more here.

David Lynch’s memoir-slash-biography

Tyler Malone at the LA Times:

The book gives us a glimpse not only into Lynch, the man and the artist, but also into Lynch’s America — the place the man came from, the space the artist depicts. “In Lynch’s realm,” McKenna writes, “America is like a river that flows ever forward, carrying odds and ends from one decade into the next, where they intermingle and blur dividing lines we’ve invented to mark time.” Lynch’s America is dream-like, uncanny, full of mystery, full of madness, ever-askew.

Lynch was born Jan. 20, 1946, in Missoula, Mont., but he’s lived all over the country, getting a taste for its small towns, its cookie-cutter suburbs, its bustling metropolises. He attended the Pennsylvania Academy of Fine Arts in Philadelphia and graduated from Los Angeles’ American Film Institute in its storied early years.

more here.

AI rest my case: Intelligence is pointless if you can’t crack a joke

Tim Smith-Laing in More Intelligent Life:

The internet has for some time hummed with anxious murmuring about the Singularity. The rate of technological progress is accelerating exponentially; the Singularity refers to the moment when computers have become so smart that they escape our control and eventually become super-intelligences capable of stamping out humans like so much vermin. Those tuned into news of the coming catastrophe keep a beady eye on IBM, whose scientists are doing all they can to ensure their own survival as obsequious quislings to our future mechanical overlords. On Tuesday, the company announced that it had brought us one step closer to “real AI” (an intelligence as smart as a human) with its snappily named Project Debater: a supercomputer dedicated to the art of competitive debating. After years of research, this week it finally competed against two real-life human debaters. The result? A thumping one-all draw – according to an audience that I suspect was almost entirely made up of people who thought that HAL, the genial yet murderous computer in “2001: A Space Odyssey”, was the real hero of the film.

It was not quite John Henry versus the steam hammer. Even as IBM’s press office trumpeted the passing of another milestone on the road to true AI, one of the researchers offered the more understated claim that Project Debater had managed to do something “sort of like what a human does when debating”. In fanfare terms, that is like hearing the Twentieth Century Fox theme tune played on a kazoo. Most editors, even in the tech press, reached for their Brief-Recycled-Thinkpiece-on-the-End-of-Man button and left it at that. A good chunk of the rest of the internet just kept repeating the phrase “master debater”, as if it were actually a pun.

It is undeniably impressive, though. Set aside for the moment the following facts: that Project Debater is called Project Debater, that it manifests as a black monolith emitting the gentle, affectless tones of a child-killing psychopath, and that its “thinking face” is an animated set of gently bouncing blue balls. Ignore these, and you are left with a machine that can argue with a real human in real time. The hot topics at issue were the questions of whether “we should subsidise space exploration” and whether “we should increase the use of telemedicine”. It’s not clear what investment Project Debater was meant to have in either.

More here.

Harper Lee and Her Father, the Real Atticus Finch

Howell Raines in The New York Times:

When “Go Set a Watchman” was published in 2015, an Alabama lawyer called me with a catch in his voice. Had I heard that his hero Atticus Finch had an evil twin? Unlike the virtuous lawyer who saved an innocent black man from a lynch mob in “To Kill a Mockingbird,” the segregationist Atticus organized the white citizens council, figuratively speaking, in Boo Radley’s peaceful backyard. Three years later, my friend still believes that Harper Lee was tricked, in her dotage, into shredding the image of perhaps the only white Alabamian other than Helen Keller to be admired around the world. Never mind that this better Atticus is fictional; my home state has learned to grab admiration where it can.

Atticus-worship is not confined to Alabamians who revere the saint portrayed in “To Kill a Mockingbird” and then enshrined in 1962’s movie version by a magisterially virtuous Gregory Peck. By winning the Pulitzer Prize for fiction in 1961 and selling more than 40 million copies worldwide, Lee’s novel created a global role model for a virtuous life. Even the gifted Northern novelist Jonathan Franzen cited the original Atticus as the epitome of moral perfection in a New Yorker essay on Edith Wharton.

Although dismaying to some Lee fans, the belated publication of “Watchman,” an apprentice work containing the germ plasm of “Mockingbird,” cast light on the virtues and limitations of the author and her canonical novel. It also opened the door to serious scholarship like “Atticus Finch: The Biography,” Joseph Crespino’s crisp, illuminating examination of Harper Lee’s dueling doppelgängers and their real-life model, Lee’s politician father, A. C. Lee. Crespino, who holds a wonderful title — he is the Jimmy Carter professor of history at Emory University — displays a confident understanding of the era of genteel white supremacists like A. C. Lee. He understands that the New South still labors, as Lee’s daughter did throughout her long, complicated life, under an old shadow. This book’s closely documented conclusion is that A. C. Lee, who once chased an integrationist preacher out of the Monroeville Methodist Church, and his devoted albeit sporadically rebellious daughter, Nelle Harper Lee, both wanted the world to have a better opinion of upper-class Southern WASPs than they deserve. These are the people Harper Lee and I grew up among — educated, well-read, well-traveled Alabamians who would never invite George Wallace into their homes, but nonetheless watched in silence as he humiliated poor Alabama in the eyes of the world.

More here.

Ehrenreich’s critique of wellness and self-improvement

Gabriel Winant in The New Republic:

Barbara Ehrenreich cuts an unusual figure in American culture. A prominent radical who never became a liberal, a celebrity, or a reactionary, who built a successful career around socialist-feminist writing and activism, she embodies an opportunity that was lost when the New Left went down to defeat. Since the mid-1970s she has devoted her work to an unsparing examination of what she viewed as the self-involvement of her professional, middle-class peers: from their narcissism and superiority in Fear of Falling and Nickel and Dimed to their misplaced faith in positive thinking in Bright-Sided. Again and again, she has offered a critique of the world they were making and leaving behind them. She is, in other words, both a boomer and the opposite.

At first glance, her new book, Natural Causes, is a polemic against wellness culture and the institutions that sustain it. What makes the argument unusual is its embrace of that great humbler, the end of life. “You can think of death bitterly or with resignation … and take every possible measure to postpone it,” she offers at the beginning of the book. “Or, more realistically, you can think of life as an interruption of an eternity of personal nonexistence, and seize it as a brief opportunity to observe and interact with the living, ever-surprising world around us.” With a winning shrug, she declares herself “old enough to die” and have her obituary simply list “natural causes.”

Ehrenreich contemplates with some satisfaction not just the approach of her own death but also the passing of her generation. As the boomers have aged, denial of death, she argues, has moved to the center of American culture, and a vast industrial ecosystem has bloomed to capitalize on it.

More here.

Einstein’s General Relativity Passes Its First Extragalactic Test

Ethan Siegel in Forbes:

In order to test General Relativity as a theory of gravity, you need to find a system where the signal you’ll see differs from other theories of gravity. This must at least include Newton’s theory, but should, ideally, include alternative theories of gravity that make distinct predictions from Einstein’s. Classically, the first such test that did this was right at the edge of the Sun: where gravity is strongest in our Solar System.

As light from a distant star passes close to the limb of the Sun, it should bend by a very specific amount, as dictated by Einstein’s theory. The amount is twice that of Newton’s theory, and was verified during the total solar eclipse of 1919. Since then, a number of additional tests have been performed to great precision. Each and every time, Einstein’s theory has been validated, and alternatives emerge defeated. Yet on scales larger than the Solar System, the results have always been inconclusive.
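
For a sense of the numbers behind that factor of two, the standard textbook figures (not quoted from the article) for a light ray grazing a mass $M$ at impact parameter $b$ are

\[
\delta_{\mathrm{GR}} = \frac{4GM}{c^{2}b} \approx 1.75''\ \text{at the solar limb},
\qquad
\delta_{\mathrm{Newton}} = \frac{2GM}{c^{2}b} \approx 0.87'',
\]

a difference large enough for the 1919 eclipse measurements to tell the two predictions apart.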

Until today. We’ve finally taken that first step towards verifying General Relativity on those large, cosmic scales, where gravity is often the only force that matters.

More here.