America Came, America Went

by Mathangi Krishnamurthy

Long years ago, when I waddled around in pigtails, I said aloud the magic words that for many years characterized how I felt about the world, my world. “I will settle in America”, I said. Neither did I know how heavy “settling” could be, nor was I clued into the power of words. Carelessly, toddler-ly, I threw around that which would one day make my world.

We didn't say politically correct things then. As far as we all knew, all of the Americas was North America, and all of North America was the US. My father had just returned from travels to the US, and he had brought back suitcases spilling over with things guaranteed to charm curmudgeonly three-year-olds.

America was then not only an idea but an escape. I was charmed into thinking that going to America indicated not only the newness of a world, but a not-ness of the one I inhabited. No school, no dreary days, no strange scapes of a scary adult world with its inexplicable sorrows and forbidding rules. America was fabulous, with its flowery denims, and video games, and automatic erasers. I was mesmerized by View-Masters, with their otherworldly scuffed gaze onto so-near foreign shores.

These were the eighties. India was a sovereign, socialist, secular, democratic republic with one, and later two, television channels. We all read the national pledge aloud in school, that went something to the effect of “India is my country and all Indians are my brothers and sisters”. We all suffered one heckler in every class who would mutter sotto voce “Well who do I marry then?” We received our news from singular sources and imagined our leaders sovereign, if ineffectual. We trusted secularism, even if in its often troubled avatar, tolerance. We muddled through power cuts, and ration cards, and held onto a quiet, steely middle-classness. Benedict Anderson would have pronounced us a truly well-imagined nation; or at least, some of us.

In this world, America's otherness beckoned ever so strongly with its free love (read sex), and rampant spending; with its alter-egoness of individualism and seeming control over the world. But India allied with the USSR. The mythical Russia communicated to us only held Mathematics books, fairy tales, and War and Peace in stock. I hated math, much preferred the Brothers Grimm, and to date, am at odds with the melancholies of Tolstoy.

Read more »

A potent theory has emerged explaining a mysterious statistical law that arises throughout physics and mathematics

Natalie Wolchover in Quanta:

Imagine an archipelago where each island hosts a single tortoise species and all the islands are connected — say by rafts of flotsam. As the tortoises interact by dipping into one another’s food supplies, their populations fluctuate.

In 1972, the biologist Robert May devised a simple mathematical model that worked much like the archipelago. He wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others. By indexing chance interactions between species as random numbers in a matrix, he calculated the critical “interaction strength” — a measure of the number of flotsam rafts, for example — needed to destabilize the ecosystem. Below this critical point, all species maintained steady populations. Above it, the populations shot toward zero or infinity.
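May's calculation lends itself to a quick numerical sketch. The snippet below is a simplification of his setup, assuming NumPy: it uses a dense Gaussian interaction matrix with uniform self-regulation -d on the diagonal (May also varied the connectance, i.e. the fraction of species pairs that interact at all), and checks the circular-law rule of thumb that the ecosystem is stable roughly when the interaction strength sigma satisfies sigma * sqrt(N) < d.

```python
import numpy as np

def may_matrix(n, sigma, d=1.0, rng=None):
    """Random community matrix in the style of May (1972):
    self-regulation -d on the diagonal, i.i.d. Gaussian
    interactions of strength sigma everywhere else."""
    rng = rng if rng is not None else np.random.default_rng(0)
    m = sigma * rng.standard_normal((n, n))
    np.fill_diagonal(m, -d)
    return m

def is_stable(m):
    # The equilibrium is locally stable iff every eigenvalue
    # of the community matrix has negative real part.
    return np.linalg.eigvals(m).real.max() < 0

n, d = 200, 1.0
rng = np.random.default_rng(42)
# The eigenvalues of the interaction part fill a disk of radius
# sigma * sqrt(n) centred at -d, so the tipping point is sigma*sqrt(n) = d.
weak   = may_matrix(n, sigma=0.3 * d / np.sqrt(n), rng=rng)  # well below threshold
strong = may_matrix(n, sigma=3.0 * d / np.sqrt(n), rng=rng)  # well above threshold
print(is_stable(weak), is_stable(strong))  # prints: True False
```

Below the threshold the whole disk of eigenvalues sits in the left half-plane and perturbations die out; above it, part of the disk crosses into the right half-plane and some populations shoot toward zero or infinity, just as May found.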

Little did May know, the tipping point he discovered was one of the first glimpses of a curiously pervasive statistical law.

The law appeared in full form two decades later, when the mathematicians Craig Tracy and Harold Widom proved that the critical point in the kind of model May used was the peak of a statistical distribution. Then, in 1999, Jinho Baik, Percy Deift and Kurt Johansson discovered that the same statistical distribution also describes variations in sequences of shuffled integers — a completely unrelated mathematical abstraction. Soon the distribution appeared in models of the wriggling perimeter of a bacterial colony and other kinds of random growth. Before long, it was showing up all over physics and mathematics.
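The distribution itself is easy to glimpse numerically: in random matrix theory it governs the fluctuations of the largest eigenvalue of a large random symmetric matrix. A minimal sketch, assuming NumPy and the standard Gaussian Orthogonal Ensemble centring at 2*sqrt(n) with scaling by n^(1/6):

```python
import numpy as np

def goe_top_eigenvalue(n, rng):
    """Largest eigenvalue of an n x n matrix from the Gaussian
    Orthogonal Ensemble (a symmetrized Gaussian matrix)."""
    a = rng.standard_normal((n, n))
    h = (a + a.T) / np.sqrt(2)          # off-diag variance 1, diag variance 2
    return np.linalg.eigvalsh(h)[-1]    # eigvalsh returns ascending order

rng = np.random.default_rng(0)
n, trials = 100, 200
samples = np.array([goe_top_eigenvalue(n, rng) for _ in range(trials)])
# Centred and rescaled, the samples approximate the Tracy-Widom
# (beta = 1) distribution, whose mean is about -1.21.
scaled = (samples - 2 * np.sqrt(n)) * n ** (1 / 6)
print(round(scaled.mean(), 2))
```

Plotting a histogram of `scaled` reveals the distribution's signature lopsided shape: a steeper tail on one side of the peak than the other, quite unlike the symmetric bell curve.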

“The big question was why,” said Satya Majumdar, a statistical physicist at the University of Paris-Sud. “Why does it pop up everywhere?”

More here.

Modi’s Idea of India

Pankaj Mishra in the New York Times:

India, V.S. Naipaul declared in 1976, is “a wounded civilization,” whose obvious political and economic dysfunction conceals a deeper intellectual crisis. As evidence, he pointed out some strange symptoms he noticed among upper-caste middle-class Hindus since his first visit to his ancestral country in 1962. These well-born Indians betrayed a craze for “phoren” consumer goods and approval from the West, as well as a self-important paranoia about the “foreign hand.” “Without the foreign chit,” Mr. Naipaul concluded, “Indians can have no confirmation of their own reality.”

Mr. Naipaul was also appalled by the prickly vanity of many Hindus who asserted that their holy scriptures already contained the discoveries and inventions of Western science, and that an India revitalized by its ancient wisdom would soon vanquish the decadent West. He was particularly wary of the “apocalyptic Hindu terms” of such 19th-century religious revivalists as Swami Vivekananda, whose exhortation to nation-build through the ethic of the kshatriya (the warrior caste) has made him the central icon of India’s new Hindu nationalist rulers.

Despite his overgeneralizations, Mr. Naipaul’s mapping of the upper-caste nationalist’s id did create a useful meme of intellectual insecurity, confusion and aggressiveness. And this meme is increasingly recognizable again. Today a new generation of Indian nationalists lurches between victimhood and chauvinism, and with ominous implications.

More here.

Julian Assange: Google Is Not What It Seems

Julian Assange in Newsweek:

Eric Schmidt is an influential figure, even among the parade of powerful characters with whom I have had to cross paths since I founded WikiLeaks. In mid-May 2011 I was under house arrest in rural Norfolk, England, about three hours’ drive northeast of London. The crackdown against our work was in full swing and every wasted moment seemed like an eternity. It was hard to get my attention.

But when my colleague Joseph Farrell told me the executive chairman of Google wanted to make an appointment with me, I was listening.

In some ways the higher echelons of Google seemed more distant and obscure to me than the halls of Washington. We had been locking horns with senior U.S. officials for years by that point. The mystique had worn off. But the power centers growing up in Silicon Valley were still opaque and I was suddenly conscious of an opportunity to understand and influence what was becoming the most influential company on earth. Schmidt had taken over as CEO of Google in 2001 and built it into an empire.

I was intrigued that the mountain would come to Muhammad. But it was not until well after Schmidt and his companions had been and gone that I came to understand who had really visited me.

More here.

Project Cybersyn and the Origins of the Big Data Nation

Evgeny Morozov in The New Yorker:

In June, 1972, Ángel Parra, Chile’s leading folksinger, wrote a song titled “Litany for a Computer and a Baby About to Be Born.” Computers are like children, he sang, and Chilean bureaucrats must not abandon them. The song was prompted by a visit to Santiago from a British consultant who, with his ample beard and burly physique, reminded Parra of Santa Claus—a Santa bearing a “hidden gift, cybernetics.”

The consultant, Stafford Beer, had been brought in by Chile’s top planners to help guide the country down what Salvador Allende, its democratically elected Marxist leader, was calling “the Chilean road to socialism.” Beer was a leading theorist of cybernetics—a discipline born of midcentury efforts to understand the role of communication in controlling social, biological, and technical systems. Chile’s government had a lot to control: Allende, who took office in November of 1970, had swiftly nationalized the country’s key industries, and he promised “worker participation” in the planning process. Beer’s mission was to deliver a hypermodern information system that would make this possible, and so bring socialism into the computer age. The system he devised had a gleaming, sci-fi name: Project Cybersyn.

Beer was an unlikely savior for socialism. He had served as an executive with United Steel and worked as a development director for the International Publishing Corporation (then one of the largest media companies in the world), and he ran a lucrative consulting practice. He had a lavish life style, complete with a Rolls-Royce and a grand house in Surrey, which was fitted out with a remote-controlled waterfall in the dining room and a glass mosaic with a pattern based on the Fibonacci series. To convince workers that cybernetics in the service of the command economy could offer the best of socialism, a certain amount of reassurance was in order. In addition to folk music, there were plans for cybernetic-themed murals in the factories, and for instructional cartoons and movies. Mistrust remained. “CHILE RUN BY COMPUTER,” a January, 1973, headline in the Observer announced, shaping the reception of Beer’s plan in Britain.

At the center of Project Cybersyn (for “cybernetics synergy”) was the Operations Room, where cybernetically sound decisions about the economy were to be made. Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country. The prototype op room was built in downtown Santiago, in the interior courtyard of a building occupied by the national telecom company. It was a hexagonal space, thirty-three feet in diameter, accommodating seven white fibreglass swivel chairs with orange cushions and, on the walls, futuristic screens. Tables and paper were banned. Beer was building the future, and it had to look like the future.

That was a challenge: the Chilean government was running low on cash and supplies; the United States, dismayed by Allende’s nationalization campaign, was doing its best to cut Chile off. And so a certain amount of improvisation was necessary.

More here. Greg Grandin follows up on “The Anti-Socialist Origins of Big Data” in The Nation.

What Do Animals Think They See When They Look in the Mirror?

Chelsea Wald in Slate:

The six horses in a 2002 study were “known weavers.” When stabled alone, they swayed their heads, necks, forequarters, and sometimes their whole bodies from side to side. The behavior is thought to stem from the social frustration brought on by isolation. It can be seen in a small percentage of all stabled horses, and owners hate it—they think it causes fatigue, weight loss, and uneven muscle development, and it looks disturbing. People had tried stopping the weaving by installing metal bars that limit a horse’s movement, but the study found that a different modification to the stable worked surprisingly well: a mirror. “Those horses with the mirror were rarely [observed] weaving,” the researchers reported. A later study even found that the mirror worked just as well as the presence of another horse.

Studies have shown that mirrors can improve the lives of a variety of laboratory, zoo, farm, and companion animals. Isolated cows and sheep have lower stress reactions when mirrors are around. With mirrors, monkeys alone or in groups show a healthy increase in social behaviors such as threats, grimaces, lip-smacking, and teeth chattering, and laboratory rabbits housed alone are also more active. Mirrors in birdcages reduce some birds’ fear. Gordon Gallup invented the test that shows whether an animal recognizes itself in the mirror: He marked primates’ faces and ears with dye and watched whether they used a mirror to investigate the spots. If they did, it revealed that the animals understood that the faces in the mirror were their own. But he thinks that most animals probably think of their reflections as another animal. The calming effect in some cases could come partly from the reflection’s apparent mimicking. “The animal confronting its own reflection in a mirror has complete control over the behavior of the image, and therefore the image is always attentive and ready to reciprocate when the animal is,” he and Stuart Capper wrote in 1970. In other words, the mirror image is sort of like a friend who always does exactly what you want.

More here.

Jared Diamond: ‘150,000 years ago, humans wouldn’t figure on a list of the five most interesting species on Earth’

Oliver Burkeman in The Guardian:

Most people would be overjoyed to receive one of the MacArthur Foundation’s annual “genius grants” – around half a million dollars, no strings attached – but when Jared Diamond won his, in 1985, it plunged him into a depression. At 47, he was an accomplished scholar, but in two almost comically obscure niches: the movement of sodium in the gallbladder and the birdlife of New Guinea. “What the MacArthur call said to me was, ‘Jared, people think highly of you, and they expect important things of you, and look what you’ve actually done with your career’,” Diamond says today. It was a painful thought for someone who recalled being told, by an admiring teacher at his Massachusetts school, that one day he would “unify the sciences and humanities”. Clearly, he needed a larger canvas. Even so, few could have predicted how large a canvas he would choose.

In the decades since, Diamond has enjoyed huge success with several “big books” – most famously, 1997’s Guns, Germs and Steel – which ask the most sweeping questions it is possible to ask about human history. For instance: why did one species of primate, unremarkable until 70,000 years ago, come to develop language, art, music, nation states and space travel? Why do some civilisations prosper, while others collapse? Why did westerners conquer the Americas, Africa and Australia, instead of the other way round? Diamond, who describes himself as a biogeographer, answers them in translucent prose that has the effect of making the world seem to click into place, each fact assuming its place in an elegant arc of pan-historical reasoning. Our interview itself provides an example: one white man arriving to interview another, in English, on the imposing main campus of the University of California, Los Angeles, in a landscape bearing little trace of the Native Americans who once thrived here. Why? Because 8,000 years ago – to borrow from Guns, Germs and Steel – the geography of Europe and the Middle East made it easier to farm crops and animals there than elsewhere.

More here.

Sunday Poem

Hazards of Hindsight

For a moment
forget hindsight
prudence and reconsideration
Hindsight dry-cleans your speech
Forget caution and correction
don’t render me speechless with your reason –
all I want from you is a quick artless response
that knocks judgement off into history’s oblivion
only then I'll get a pure no, a simple yes from you
not the elusive past, I wasn’t a part of

To make any sense of history
I need an artless response
In its freshness
I can see better
the peanuts enclosed in the sturdy shell
the fresh oil in its ripened seeds.

by Monika Kumar
from Samalochan, 2012
translation by author

Art of Darkness

Pico Iyer at the New York Times:

To what extent is the price of immortality humanity, as you could put it? Must the revolutionary artist ignore — even flout — the basic laws of decency that govern our world in order to transform that world? “Perfection of the life, or of the work,” as Yeats had it. “And if it take the second,” he went on, the intellect of man “must refuse a heavenly mansion, raging in the dark.”

It was an ancient question even then, but somehow every other book I’ve been reading of late comes back to it. Walter Isaacson’s unbiddable 2011 biography of Steve Jobs presents his subject as a kind of Lee Kuan Yew of the tech industry, demanding we give up our ideas of democracy and control in exchange for a gorgeously designed new operating system. Innovation doesn’t have to be so dictatorial: Albert Einstein, the subject of Isaacson’s previous biography, is revered in part for his readiness to defer to what he didn’t understand. Yet the more we read about Jobs publicly humiliating colleagues and refusing to acknowledge responsibility for the birth of his first child, the more we see that his genius could seem inextricable from his indifference to social norms.

more here.

the hilltop: life in the occupied territories

David L. Ulin at The LA Times:

Assaf Gavron's 2010 novel “Almost Dead” does something I would have thought impossible — it makes satire out of terrorism. The story of a man who becomes an Israeli national hero after surviving three attacks in a single week, the book offers a sharply ironic look at the intersection of image and reality.

This character is no role model; he's a guy in the wrong place at the right time. Gavron, who was born near Jerusalem and lives in Tel Aviv, is suggesting that we are all of us (citizens, nations, even, to some extent, terrorists) making it up as we go along.

A similar sensibility centers “The Hilltop,” Gavron's seventh book, although only the second (after “Almost Dead”) to appear in the United States. A sprawling novel that revolves around a small settlement in the occupied territories, its focus is less satirical than absurdist, offering a middle vision between the ridiculous and the sublime.

more here.

Dylan Thomas: a poet unlike any other

Owen Sheers at The Financial Times:

Perhaps of greater importance to Thomas’s poetry, however, was the wider cultural landscape of 1930s Wales, and Thomas’s geographical and familial location within it. Thomas’s parents were the personification of the intellectual and industrial movement from rural to urban that characterised Wales in the early 20th century. Both originated from Welsh-speaking families of agricultural and religious occupation. The young Thomas, a listener “in love with words”, found himself at the centre of a linguistic and cultural maelstrom. The languages of both Welsh and English informed his ear, just as both the streets of Swansea and the fertile fields of the Llanstephan peninsula informed his eye.

In his now famous notebooks, Thomas’s search for a poetic voice can be traced as if following his route on a map. In these notebooks, he passes through a period of derivative free verse before evolving his poems into the grander, more visceral and patterned work the world met just a few years later when he published 18 Poems. Although the book itself was modest, even retiring – no jacket copy and no author portrait, both on Thomas’s request – inside, the poems themselves were the polar opposite. Bold, physical and sonorous, they have been described by some critics as “biomorphic”. Thomas once wrote: “Every idea, intuitive or intellectual, can be imaged and translated in terms of the body, its flesh, skin, blood, sinews, veins, glands, organs, cells and senses.”

more here.

‘The Immortal Evening’

Priscilla Gilman in The New York Times:

On Dec. 28, 1817, Benjamin Robert ­Haydon, then England’s pre-eminent history painter, hosted a dinner party to celebrate his progress on his latest work, “Christ’s Entry Into Jerusalem.” He invited, among others, three men anachronistically pictured in that painting: John Keats, William Wordsworth and the essayist Charles Lamb. In “The Immortal Evening” (the phrase is from Haydon’s letters and diaries), the poet and biographer Stanley Plumly offers an idiosyncratic, heartfelt, at once sinuous and expansive exploration of the dinner, its “aesthetic context and the larger worlds of the individual guests, particularly the three ‘immortal’ writers, Keats, ­Wordsworth and Lamb.”

Plumly begins the story strikingly, ­elliptically, in the present tense: “Keats has the most ground to cover.” What the walk to the dinner was like for each of its major participants, the look and feel of Regency London, what kind of food they would have eaten, all come to vivid life in Plumly’s evocative rendering. But if it presents a historical place and moment with immediacy and “present-tense personal intensity” (as Plumly says of Romantic art), “The Immortal Evening” also tackles timeless questions: “How does a living moment in time become ‘immortal’? What are a painting’s terms of immortality?” (Or, for that matter, a poem’s?) Why are some artists remembered and some forgotten?

More here.

Who Really Found the Higgs Boson?

Neal Hartman in Nautilus:

To those who say that there is no room for genius in modern science because everything has been discovered, Fabiola Gianotti has a sharp reply. “No, not at all,” says the former spokesperson of the ATLAS Experiment, the largest particle detector at the Large Hadron Collider at CERN. “Until the fourth of July, 2012 we had no proof that nature allows for elementary scalar fields. So there is a lot of space for genius.”

She is referring to the discovery of the Higgs boson two years ago—potentially one of the most important advances in physics in the past half century. It is a manifestation of the eponymous field that permeates all of space, and completes the standard model of physics: a sort of baseline description for the existence and behavior of essentially everything there is.

By any standards, it is an epochal, genius achievement.

What is less clear is who, exactly, the genius is. An obvious candidate is Peter Higgs, who postulated the Higgs boson, as a consequence of the Brout-Englert-Higgs mechanism, in 1964. He was awarded the Nobel Prize in 2013 along with Francois Englert (Englert and his deceased colleague Robert Brout arrived at the same result independently). But does this mean that Higgs was a genius? Peter Jenni, one of the founders and the first “spokesperson” of the ATLAS Experiment Collaboration (one of the two experiments at CERN that discovered the Higgs particle), hesitates when I ask him the question.

More here.