Monday, October 27, 2014
by Randolyn Zinn
Flipping through photos of a recent trip to Spain, I was struck by this one.
A typical tobacco drying barn a few miles from Granada, Spain, in the fields of Fuente Vaqueros -- Federico García Lorca's birthplace. In town we toured the Lorca family house and museum (no photos allowed) to ogle his cradle, his mother's kitchen and the piano where he practiced canciones. Out back, a pomegranate tree in the courtyard was old enough to have shaded Federico as a child as he played beneath its boughs. Upstairs, glass cases displayed selected drawings, notebooks and first editions of his poetry and plays. We sat down to watch a short silent film of the young poet in overalls unloading scenery from the back of a truck with his theatrical troupe, La Barraca, on tour performing Calderón's La Vida Es Sueño, or Life Is A Dream, in the white towns of Andalucía. He wrote his own plays at this time: Blood Wedding, Yerma and The House of Bernarda Alba. We gasped at the end of the clip when Lorca smiled and waved at the camera…he was waving to us ninety years later in his own house. Life is a dream.
by Brooks Riley
by Carl Pierer
It is necessary that two men have the same number of hairs, gold, and others.[i]
This meme is taken from a scene in the Coen brothers' 1998 comedy "The Big Lebowski". During a game of bowling, Walter, in the picture, gets annoyed at the other characters constantly overstepping the line. Drawing a gun, he asks: "Am I the only one around here who gives a shit about the rules?"[ii]
Considering that there are roughly 7 billion people on earth, a positive answer seems highly unlikely. But it is possible to do better. We can know with certainty, i.e. prove, that the creator of the meme is not the only one. This is a simple and straightforward application of a fascinating, intuitive and yet powerful mathematical principle. It is usually called "pigeonhole principle" (for reasons to be explained below) or "Dirichlet's principle".
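The hair-count version of this argument can be sketched in a few lines of code. (This is a hypothetical illustration, not from the article: the one-million-hair bound, the function name, and the random "population" are all assumptions made for the demonstration.) With more people than possible hair counts, a shared count is guaranteed:

```python
import random

def shared_hair_count(hair_counts, max_hairs):
    """Pigeonhole principle: if there are more people than possible
    hair counts (the 'boxes' 0..max_hairs), two must share a count."""
    assert len(hair_counts) > max_hairs + 1, "need more people than boxes"
    seen = {}
    for person, hairs in enumerate(hair_counts):
        if hairs in seen:
            return seen[hairs], person  # two people with the same count
        seen[hairs] = person

MAX_HAIRS = 1_000_000  # a generous upper bound on human hair counts
population = [random.randrange(MAX_HAIRS + 1) for _ in range(MAX_HAIRS + 2)]
pair = shared_hair_count(population, MAX_HAIRS)
print(pair)  # a pair is always found: the collision cannot be avoided
```

However the random counts fall, the loop must hit a repeat before it runs out of people, which is exactly the certainty (as opposed to mere likelihood) that the principle delivers.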
The German mathematician Gustav Lejeune Dirichlet was born in 1805 in Düren, a small town near Aachen. Although Dirichlet was no child prodigy, his love for mathematics and for study in general became apparent early in his life. His parents had destined him for a merchant's career, but when he insisted on attending the Gymnasium (secondary school), they sent him to Bonn at the age of 12. After only two years, he transferred to a Gymnasium in Cologne, where he studied mathematics with Georg Simon Ohm (1789-1854), famous for his discovery of Ohm's Law. Dirichlet left this school after only one year, with a leaving certificate in his pocket but without an Abitur, which would cause him some trouble later in life. At that time, students were required to be able to carry on a conversation in Latin to pass the Abitur examination. With only three years of secondary education, Dirichlet could not meet this crucial requirement. He was fortunate, however, that no Abitur was required to study mathematics.
by Thomas Rodham Wells
You may have heard of the gender income gap. It is one of the most obvious signs that despite being equal in theory, women still lack real equality. Some of it is still due to active discrimination by people who still haven't got the equal treatment message. But much more of it is the result of a history of unjust gender norms and factual errors inscribed into our institutions, most notably the bundle of moral expectations we hold about what can be demanded of women rather than men in terms of unpaid care of children, the disabled and the elderly.
The problem is that fairness – the principle of the equal treatment of equals – is a poor guide to action here. Our history has bequeathed us a gender injustice complex of interlocking and mutually reinforcing institutional arrangements and moral values that altogether make women less economically valued than men. The outcome is pretty clear - women tend to earn much less than men - but it is hard to pin down specific violations of fair treatment by specific agents who can be held responsible. Sexist pigs are relatively easy to pick out and chastise, and in some cases may even be successfully prosecuted for discrimination or other misbehaviour. But it's rather harder to condemn a university educated couple for agreeing between themselves to follow the traditional model of male breadwinner and female homemaker. Even if that decision is replicated in household after household leading to dramatic aggregate differences in labour market participation rates for women, especially in full-time professional work.
It is true that a great many policies have been proposed, and sometimes even implemented, to address different pieces of the gender injustice complex, from quotas in boardrooms and the top management of public institutions to compulsory paternity leave. But such reforms struggle politically, not least because they seem to impose more unfairness - the unequal treatment of men and women because of their gender. A good many people, including many women, reasonably object to the incoherence of trying to solve a fairness problem by creating more unfairness. More positive measures, such as providing free child-care from tax revenues, are considered too expensive to fully implement. And for all the political capital these policies require to be put into action, each can only have incremental effects anyway because they only address one piece of the puzzle at a time. They rarely inspire much popular support.
We've been thinking about this the wrong way, distracted by the idea that unfairness must be produced by bad motives that are best addressed by cumulative moral exhortation, or something else equally cheap like training young women to 'lean in'. If we all want gender equality then eventually, surely, it will come about by itself.
by Mathangi Krishnamurthy
Long years ago, when I waddled around in pigtails, I said aloud the magic words that for many years characterized how I felt about the world, my world. "I will settle in America", I said. Neither did I know how heavy "settling" can be, nor was I clued into the power of words. Carelessly, toddler-ly, I threw around that which would one day make my world. We didn't say politically correct things then. As far as we all knew, all of the Americas was North America, and all of North America was the US. My father had just returned from travels to the US, and he had brought back suitcases spilling over with things guaranteed to charm curmudgeonly three-year-olds.
America was then not only an idea but an escape. I was charmed into thinking that going to America indicated not only the newness of a world, but a not-ness of the one I inhabited. No school, no dreary days, no strange scapes of a scary adult world with its inexplicable sorrows and forbidding rules. America was fabulous, with its flowery denims, and video games, and automatic erasers. I was mesmerized by View-Masters, with their otherworldly scuffed gaze onto so-near foreign shores.
These were the eighties. India was a sovereign, socialist, secular, democratic republic with one, and later two, television channels. We all read the national pledge aloud in school, which went something to the effect of "India is my country and all Indians are my brothers and sisters". We all suffered one heckler in every class who would mutter sotto voce, "Well, who do I marry then?" We received our news from singular sources and imagined our leaders sovereign, if ineffectual. We trusted secularism, even if in its often troubled avatar, tolerance. We muddled through power cuts, and ration cards, and held onto a quiet, steely middle-classness. Benedict Anderson would have pronounced us a truly well-imagined nation; or at least, some of us.
In this world, America's otherness beckoned ever so strongly with its free love (read sex) and rampant spending; with its alter-egoness of individualism and seeming control over the world. But India allied with the USSR. The mythical Russia communicated to us stocked only mathematics books, fairy tales, and War and Peace. I hated math, much preferred the Brothers Grimm, and to date am at odds with the melancholies of Tolstoy.
Sunday, October 26, 2014
A potent theory has emerged explaining a mysterious statistical law that arises throughout physics and mathematics
Natalie Wolchover in Quanta:
Imagine an archipelago where each island hosts a single tortoise species and all the islands are connected — say by rafts of flotsam. As the tortoises interact by dipping into one another’s food supplies, their populations fluctuate.
In 1972, the biologist Robert May devised a simple mathematical model that worked much like the archipelago. He wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others. By indexing chance interactions between species as random numbers in a matrix, he calculated the critical “interaction strength” — a measure of the number of flotsam rafts, for example — needed to destabilize the ecosystem. Below this critical point, all species maintained steady populations. Above it, the populations shot toward zero or infinity.
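May's calculation can be replayed numerically with random matrices. (A hypothetical sketch, not May's original code: the community size, connectance, and function name here are assumptions chosen for illustration.) The equilibrium is stable when every eigenvalue of the interaction matrix has negative real part, and stability is lost roughly when the interaction strength passes May's critical point:

```python
import numpy as np

rng = np.random.default_rng(0)

def is_stable(n=250, connectance=0.25, sigma=0.05):
    """Build a random community matrix in the spirit of May (1972).

    Each species regulates itself (diagonal = -1); each pairwise
    interaction exists with probability `connectance` and has random
    strength of scale `sigma`. The equilibrium is stable iff every
    eigenvalue has negative real part."""
    m = rng.normal(0.0, sigma, size=(n, n))
    m *= rng.random((n, n)) < connectance   # keep a random subset of links
    np.fill_diagonal(m, -1.0)
    return bool(np.linalg.eigvals(m).real.max() < 0)

# May's threshold: stability is lost near sigma * sqrt(n * connectance) = 1
print(is_stable(sigma=0.05))   # interaction strength ~0.4: stable
print(is_stable(sigma=0.30))   # interaction strength ~2.4: unstable
```

The eigenvalues of the random part fill a disk of radius about sigma * sqrt(n * connectance); once that disk, shifted left by the self-regulation term, crosses zero, populations shoot toward zero or infinity, which is the tipping point May found.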
Little did May know, the tipping point he discovered was one of the first glimpses of a curiously pervasive statistical law.
The law appeared in full form two decades later, when the mathematicians Craig Tracy and Harold Widom proved that the critical point in the kind of model May used was the peak of a statistical distribution. Then, in 1999, Jinho Baik, Percy Deift and Kurt Johansson discovered that the same statistical distribution also describes variations in sequences of shuffled integers — a completely unrelated mathematical abstraction. Soon the distribution appeared in models of the wriggling perimeter of a bacterial colony and other kinds of random growth. Before long, it was showing up all over physics and mathematics.
“The big question was why,” said Satya Majumdar, a statistical physicist at the University of Paris-Sud. “Why does it pop up everywhere?”
Pankaj Mishra in the New York Times:
India, V.S. Naipaul declared in 1976, is “a wounded civilization,” whose obvious political and economic dysfunction conceals a deeper intellectual crisis. As evidence, he pointed out some strange symptoms he noticed among upper-caste middle-class Hindus since his first visit to his ancestral country in 1962. These well-born Indians betrayed a craze for “phoren” consumer goods and approval from the West, as well as a self-important paranoia about the “foreign hand.” “Without the foreign chit,” Mr. Naipaul concluded, “Indians can have no confirmation of their own reality.”
Mr. Naipaul was also appalled by the prickly vanity of many Hindus who asserted that their holy scriptures already contained the discoveries and inventions of Western science, and that an India revitalized by its ancient wisdom would soon vanquish the decadent West. He was particularly wary of the “apocalyptic Hindu terms” of such 19th-century religious revivalists as Swami Vivekananda, whose exhortation to nation-build through the ethic of the kshatriya (the warrior caste) has made him the central icon of India’s new Hindu nationalist rulers.
Despite his overgeneralizations, Mr. Naipaul’s mapping of the upper-caste nationalist’s id did create a useful meme of intellectual insecurity, confusion and aggressiveness. And this meme is increasingly recognizable again. Today a new generation of Indian nationalists lurches between victimhood and chauvinism, with ominous implications.
Julian Assange in Newsweek:
Eric Schmidt is an influential figure, even among the parade of powerful characters with whom I have had to cross paths since I founded WikiLeaks. In mid-May 2011 I was under house arrest in rural Norfolk, England, about three hours’ drive northeast of London. The crackdown against our work was in full swing and every wasted moment seemed like an eternity. It was hard to get my attention.
But when my colleague Joseph Farrell told me the executive chairman of Google wanted to make an appointment with me, I was listening.
In some ways the higher echelons of Google seemed more distant and obscure to me than the halls of Washington. We had been locking horns with senior U.S. officials for years by that point. The mystique had worn off. But the power centers growing up in Silicon Valley were still opaque and I was suddenly conscious of an opportunity to understand and influence what was becoming the most influential company on earth. Schmidt had taken over as CEO of Google in 2001 and built it into an empire.
I was intrigued that the mountain would come to Muhammad. But it was not until well after Schmidt and his companions had been and gone that I came to understand who had really visited me.
Evgeny Morozov in The New Yorker:
In June, 1972, Ángel Parra, Chile’s leading folksinger, wrote a song titled “Litany for a Computer and a Baby About to Be Born.” Computers are like children, he sang, and Chilean bureaucrats must not abandon them. The song was prompted by a visit to Santiago from a British consultant who, with his ample beard and burly physique, reminded Parra of Santa Claus—a Santa bearing a “hidden gift, cybernetics.”
The consultant, Stafford Beer, had been brought in by Chile’s top planners to help guide the country down what Salvador Allende, its democratically elected Marxist leader, was calling “the Chilean road to socialism.” Beer was a leading theorist of cybernetics—a discipline born of midcentury efforts to understand the role of communication in controlling social, biological, and technical systems. Chile’s government had a lot to control: Allende, who took office in November of 1970, had swiftly nationalized the country’s key industries, and he promised “worker participation” in the planning process. Beer’s mission was to deliver a hypermodern information system that would make this possible, and so bring socialism into the computer age. The system he devised had a gleaming, sci-fi name: Project Cybersyn.
Beer was an unlikely savior for socialism. He had served as an executive with United Steel and worked as a development director for the International Publishing Corporation (then one of the largest media companies in the world), and he ran a lucrative consulting practice. He had a lavish life style, complete with a Rolls-Royce and a grand house in Surrey, which was fitted out with a remote-controlled waterfall in the dining room and a glass mosaic with a pattern based on the Fibonacci series. To convince workers that cybernetics in the service of the command economy could offer the best of socialism, a certain amount of reassurance was in order. In addition to folk music, there were plans for cybernetic-themed murals in the factories, and for instructional cartoons and movies. Mistrust remained. “CHILE RUN BY COMPUTER,” a January, 1973, headline in the Observer announced, shaping the reception of Beer’s plan in Britain.
At the center of Project Cybersyn (for “cybernetics synergy”) was the Operations Room, where cybernetically sound decisions about the economy were to be made. Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country. The prototype op room was built in downtown Santiago, in the interior courtyard of a building occupied by the national telecom company. It was a hexagonal space, thirty-three feet in diameter, accommodating seven white fibreglass swivel chairs with orange cushions and, on the walls, futuristic screens. Tables and paper were banned. Beer was building the future, and it had to look like the future.
That was a challenge: the Chilean government was running low on cash and supplies; the United States, dismayed by Allende’s nationalization campaign, was doing its best to cut Chile off. And so a certain amount of improvisation was necessary.
Chelsea Wald in Slate:
The six horses in a 2002 study were “known weavers.” When stabled alone, they swayed their heads, necks, forequarters, and sometimes their whole bodies from side to side. The behavior is thought to stem from the social frustration brought on by isolation. It can be seen in a small percentage of all stabled horses, and owners hate it—they think it causes fatigue, weight loss, and uneven muscle development, and it looks disturbing. People had tried stopping the weaving by installing metal bars that limit a horse’s movement, but the study found that a different modification to the stable worked surprisingly well: a mirror. “Those horses with the mirror were rarely [observed] weaving,” the researchers reported. A later study even found that the mirror worked just as well as the presence of another horse.
Studies have shown that mirrors can improve the lives of a variety of laboratory, zoo, farm, and companion animals. Isolated cows and sheep have lower stress reactions when mirrors are around. With mirrors, monkeys alone or in groups show a healthy increase in social behaviors such as threats, grimaces, lip-smacking, and teeth chattering, and laboratory rabbits housed alone are also more active. Mirrors in birdcages reduce some birds’ fear. Gordon Gallup invented the test that shows whether an animal recognizes itself in the mirror: He marked primates’ faces and ears with dye and watched whether they used a mirror to investigate the spots. If they did, it revealed that the animals understood that the faces in the mirror were their own. But he thinks that most animals probably think of their reflections as another animal. The calming effect in some cases could come partly from the reflection’s apparent mimicking. “The animal confronting its own reflection in a mirror has complete control over the behavior of the image, and therefore the image is always attentive and ready to reciprocate when the animal is,” he and Stuart Capper wrote in 1970. In other words, the mirror image is sort of like a friend who always does exactly what you want.
Jared Diamond: ‘150,000 years ago, humans wouldn’t figure on a list of the five most interesting species on Earth’
Oliver Burkeman in The Guardian:
Most people would be overjoyed to receive one of the MacArthur Foundation’s annual “genius grants” – around half a million dollars, no strings attached – but when Jared Diamond won his, in 1985, it plunged him into a depression. At 47, he was an accomplished scholar, but in two almost comically obscure niches: the movement of sodium in the gallbladder and the birdlife of New Guinea. “What the MacArthur call said to me was, ‘Jared, people think highly of you, and they expect important things of you, and look what you’ve actually done with your career’,” Diamond says today. It was a painful thought for someone who recalled being told, by an admiring teacher at his Massachusetts school, that one day he would “unify the sciences and humanities”. Clearly, he needed a larger canvas. Even so, few could have predicted how large a canvas he would choose.
In the decades since, Diamond has enjoyed huge success with several “big books” – most famously, 1997’s Guns, Germs and Steel – which ask the most sweeping questions it is possible to ask about human history. For instance: why did one species of primate, unremarkable until 70,000 years ago, come to develop language, art, music, nation states and space travel? Why do some civilisations prosper, while others collapse? Why did westerners conquer the Americas, Africa and Australia, instead of the other way round? Diamond, who describes himself as a biogeographer, answers them in translucent prose that has the effect of making the world seem to click into place, each fact assuming its place in an elegant arc of pan-historical reasoning. Our interview itself provides an example: one white man arriving to interview another, in English, on the imposing main campus of the University of California, Los Angeles, in a landscape bearing little trace of the Native Americans who once thrived here. Why? Because 8,000 years ago – to borrow from Guns, Germs and Steel – the geography of Europe and the Middle East made it easier to farm crops and animals there than elsewhere.
Hazards of Hindsight
For a moment
prudence and reconsideration
Hindsight dry-cleans your speech
Forget caution and correction
don’t render me speechless with your reason –
all I want from you is a quick artless response
that knocks judgement off into history’s oblivion
only then I'll get a pure no, a simple yes from you
not the elusive past, I wasn’t a part of
To make any sense of history
I need an artless response
In its freshness
I can see better
the peanuts enclosed in the sturdy shell
the fresh oil in its ripened seeds.
by Monika Kumar
from Samalochan, 2012
translation by author
Saturday, October 25, 2014
To what extent is humanity the price of immortality, as you could put it? Must the revolutionary artist ignore — even flout — the basic laws of decency that govern our world in order to transform that world? “Perfection of the life, or of the work,” as Yeats had it. “And if it take the second,” he went on, the intellect of man “must refuse a heavenly mansion, raging in the dark.”
It was an ancient question even then, but somehow every other book I’ve been reading of late comes back to it. Walter Isaacson’s unbiddable 2011 biography of Steve Jobs presents his subject as a kind of Lee Kuan Yew of the tech industry, demanding we give up our ideas of democracy and control in exchange for a gorgeously designed new operating system. Innovation doesn’t have to be so dictatorial: Albert Einstein, the subject of Isaacson’s previous biography, is revered in part for his readiness to defer to what he didn’t understand. Yet the more we read about Jobs publicly humiliating colleagues and refusing to acknowledge responsibility for the birth of his first child, the more we see that his genius could seem inextricable from his indifference to social norms.
Assaf Gavron's 2010 novel "Almost Dead" does something I would have thought impossible — it makes satire out of terrorism. The story of a man who becomes an Israeli national hero after surviving three attacks in a single week, the book offers a sharply ironic look at the intersection of image and reality.
This character is no role model; he's a guy in the wrong place at the right time. Gavron, who was born near Jerusalem and lives in Tel Aviv, is suggesting that we are all of us (citizens, nations, even, to some extent, terrorists) making it up as we go along.
A similar sensibility centers "The Hilltop," Gavron's seventh book, although only the second (after "Almost Dead") to appear in the United States. A sprawling novel that revolves around a small settlement in the occupied territories, its focus is less satirical than absurdist, offering a middle vision between the ridiculous and the sublime.
Perhaps of greater importance to Thomas’s poetry, however, was the wider cultural landscape of 1930s Wales, and Thomas’s geographical and familial location within it. Thomas’s parents were the personification of the intellectual and industrial movement from rural to urban that characterised Wales in the early 20th century. Both originated from Welsh-speaking families of agricultural and religious occupation. The young Thomas, a listener “in love with words”, found himself at the centre of a linguistic and cultural maelstrom. The languages of both Welsh and English informed his ear, just as both the streets of Swansea and the fertile fields of the Llanstephan peninsula informed his eye.
In his now famous notebooks, Thomas’s search for a poetic voice can be traced as if following his route on a map. In these notebooks, he passes through a period of derivative free verse before evolving his poems into the grander, more visceral and patterned work the world met just a few years later when he published 18 Poems. Although the book itself was modest, even retiring – no jacket copy and no author portrait, both on Thomas’s request – inside, the poems themselves were the polar opposite. Bold, physical and sonorous, they have been described by some critics as “biomorphic”. Thomas once wrote: “Every idea, intuitive or intellectual, can be imaged and translated in terms of the body, its flesh, skin, blood, sinews, veins, glands, organs, cells and senses.”
Priscilla Gilman in The New York Times:
On Dec. 28, 1817, Benjamin Robert Haydon, then England’s pre-eminent history painter, hosted a dinner party to celebrate his progress on his latest work, “Christ’s Entry Into Jerusalem.” He invited, among others, three men anachronistically pictured in that painting: John Keats, William Wordsworth and the essayist Charles Lamb. In “The Immortal Evening” (the phrase is from Haydon’s letters and diaries), the poet and biographer Stanley Plumly offers an idiosyncratic, heartfelt, at once sinuous and expansive exploration of the dinner, its “aesthetic context and the larger worlds of the individual guests, particularly the three ‘immortal’ writers, Keats, Wordsworth and Lamb.”
Plumly begins the story strikingly, elliptically, in the present tense: “Keats has the most ground to cover.” What the walk to the dinner was like for each of its major participants, the look and feel of Regency London, what kind of food they would have eaten, all come to vivid life in Plumly’s evocative rendering. But if it presents a historical place and moment with immediacy and “present-tense personal intensity” (as Plumly says of Romantic art), “The Immortal Evening” also tackles timeless questions: “How does a living moment in time become ‘immortal’? What are a painting’s terms of immortality?” (Or, for that matter, a poem’s?) Why are some artists remembered and some forgotten?
Neal Hartman in Nautilus:
To those who say that there is no room for genius in modern science because everything has been discovered, Fabiola Gianotti has a sharp reply. “No, not at all,” says the former spokesperson of the ATLAS Experiment, the largest particle detector at the Large Hadron Collider at CERN. “Until the fourth of July, 2012 we had no proof that nature allows for elementary scalar fields. So there is a lot of space for genius.”
She is referring to the discovery of the Higgs boson two years ago—potentially one of the most important advances in physics in the past half century. It is a manifestation of the eponymous field that permeates all of space, and completes the standard model of physics: a sort of baseline description for the existence and behavior of essentially everything there is.
By any standards, it is an epochal, genius achievement.
What is less clear is who, exactly, the genius is. An obvious candidate is Peter Higgs, who postulated the Higgs boson, as a consequence of the Brout-Englert-Higgs mechanism, in 1964. He was awarded the Nobel Prize in 2013 along with Francois Englert (Englert and his deceased colleague Robert Brout arrived at the same result independently). But does this mean that Higgs was a genius? Peter Jenni, one of the founders and the first “spokesperson” of the ATLAS Experiment Collaboration (one of the two experiments at CERN that discovered the Higgs particle), hesitates when I ask him the question.
Farahnaz Ispahani and Nina Shea in The Weekly Standard:
Pakistan’s blasphemy law, which turns 30 this year, has become only more deadly with age. Since blasphemy was made a capital crime under the nation’s secular penal code, the effect has been to suppress moderate influences, pushing “Pakistani society further out on the slippery slope of extremism,” said Mujeeb-ur-Rahman, senior advocate at the Supreme Court of Pakistan, in Washington last week. With its large population and sensitive location, Pakistan is a place where any societal shift in the direction of the Taliban deserves the attention of all concerned about Islamic extremism. Instead, this is one more foreign threat that the Obama administration underestimates.
On October 16, for the first time, an appeals court affirmed a death sentence for blasphemy meted out to a woman. A Christian mother of five, Asia Bibi was arrested in 2009 after fellow field hands complained that, during a dispute, she had insulted the prophet of Islam. No evidence was produced, because to repeat blasphemy is blasphemous. Similarly, anyone who defends an accused blasphemer risks being labeled a blasphemer; two officials who made appeals on Bibi’s behalf—Salman Taseer, governor of Punjab, and Shahbaz Bhatti, federal minister for minorities affairs—were assassinated in 2011. Bibi has one last legal recourse, an appeal to the federal Supreme Court, but now no public official dares speak up for her—or for any other blasphemy defendant.
Accusations of blasphemy are brought disproportionately against Pakistan’s Christians, some 2 percent of the population. Intent is not an element of the crime, and recent years have seen cases brought against illiterate, mentally disabled, and teenage Christians. Each case seems to heighten the sensitivities of the extremists and further fracture society. The flimsiest rumor of a Koran burning can spark hysteria ending in riots against entire Christian communities. Lahore’s St. Joseph Colony was torched last year in such a pogrom.
Moustafa Bayoumi, Kayla Epstein, Alan Yuhas, and Eli Valley in The Guardian:
As this panel’s Orthodox Jewish participant, I’m aware that I’ve been asked to participate for a very specific purpose: to bring the bearing of my religious and cultural upbringing to the question of whether The Death of Klinghoffer is antisemitic.
So let’s just get that out of the way: the answer is no.
I can understand why some would jump to that conclusion, especially if they haven’t seen the opera. Klinghoffer forces Jewish audiences to confront some uncomfortable aspects of Israel’s history, and to relive a tragic chapter in the history of the Israeli-Palestinian conflict. There are some antisemitic lines delivered by one of the hijackers: “America is one big Jew,” he sneers at his cowering captives. But his brutal actions and the shrill, frenzied music that accompanies his words so clearly prove him a villain that it’s ridiculous to say composer John Adams and his librettist Alice Goodman are promoting that view.
Though two opposing sides are given the opportunity, over three hours, to present their narratives, it’s crucial to remember who gets the last word. Klinghoffer ends with a beautiful, heartbreaking aria by his widow Marilyn, who has just learned of the death of her husband. “I wanted to die,” she cries out. The finale lays bare the suffering and anguish that terrorism and antisemitism have wrought.
So why the outrage? Because there’s a lot of ignorance out there drowning out the facts about this opera.
Sitting up with a yawn,
Rolling up the tattered mat,
Tucking up the torn mundu,
Walking along the hedges.
Not for a lark.
The muddy fields grimace,
The cows wag their tails.
Where is that long night –
The one they sang their fervent hymns about,
The one they said spring thunder
Would light up with brilliant flashes
Before the great new dawn arrived?
Hate, anger –
On racing pulses.
They stood leaning against the good old walls,
The graying firebrands.
Out of the dry, cracked, poetry-less soil they had sprung.
Drained by the waters of compassion
They had grown dreams on their bodies.
They now watch
As texts are served on a platter.
by Raghavan Atholi
from Poetry International Web
Over at the Columbia University Press Blog, an interview with Herve This on his new book:
Question: How does note-by-note cooking differ from molecular gastronomy?
Herve This: Molecular gastronomy is a scientific activity, not to be confused with molecular cooking. Indeed, molecular gastronomy, being science, has nothing to do with cooking. In other words, science is not about making dishes. Science looks for the mechanism of phenomena. That’s all. And technology uses the results of science to improve technique. So, note-by-note cooking is a technique.
Another question could be, how is note-by-note cooking different from molecular cooking? And here the answer would be that the definition of molecular cooking is “to cook using modern tools” (such as siphons, liquid nitrogen, etc.). But you still use meat, vegetables, etc. However, with note-by-note cooking, the instruments are not important, and the big revolution is to cook with pure compounds, instead of meat, vegetables, fruits, eggs, etc.
Q: Where does the name Note-by-Note Cooking come from?
HT: In 1999, when I introduced the name “molecular cooking,” I was upset, because it was a bad choice, which had to be made for many complex reasons. Unfortunately, people now confuse molecular gastronomy and molecular cooking. So, for note-by-note cooking, I wanted a name that could appeal to artists, and it’s fair to say that note-by-note cooking is comparable to a term such as electro-acoustic music.
Q: Won’t not-by-note cooking produce artificial forms of food?
HT: Yes, but all food is “artificial”! Do you think that barbecued meat hangs “naturally” on the trees of the wild forest? Or that French fries appear suddenly from potatoes? No, you need a cook to make them. In ordinary language, “natural” means “what was not transformed by human beings”, and “artificial” means that it was transformed, that it was the result of human “art”.