Sunday, September 21, 2014
Nathaniel Comfort in Nature:
Is race biologically real? A clutch of books published this year argue the question. All miss the point.
Michael Yudell's Race Unmasked and Robert Sussman's The Myth of Race can be read as inadvertent retorts to former New York Times journalist Nicholas Wade's A Troublesome Inheritance, published while the former were in press. Wade's book is by far the most insidious, but all three are polemics that become mired in proving (in Wade's case) or disproving (in the others') whether race is biological and therefore 'real'. This question is a dead end, a distraction from what is really at stake in this debate: human social equality.
Race is certainly real — ask any African American. It originated long before the science of genetics, as sets of phenotypes and stereotypes. These correlate with haplotypes, clusters of genetic variation. In this sense, race is genetically 'real'. But those correlations depend on judgement calls. Wade cites population-genetics studies that identify three principal races: Caucasian, African and East Asian. Elsewhere he cites five, adding Australasian and Native American; or seven, splitting Caucasians into people from Europe, the Middle East and the Indian subcontinent. A study in Scientific Reports this year identified 19 “ancestral components”, including Mozabites, Kalash and Uygurs (D. Shriner et al. Sci. Rep. 4, 6055; 2014). Palaeogeneticist Svante Pääbo and others have revealed the underlying human genetic variation to be a series of gradients. Whether and how one parses that variation depends on one's training, inclination and acculturation.

So: race is real and race is genetic, but that does not mean that race is 'really' genetic. The completion of the draft human-genome sequence in 2000 led some optimists to forecast the end of race (one of them, Craig Venter, wrote the foreword to Yudell's book), but use of the term in the biomedical literature has actually increased since then.

For clinicians, race is a matter of pragmatism. Although each of us is genetically and epigenetically unique, our ancestry leaves footprints in our genomes. Consequently, clinicians use familiar racial categories such as 'black' or 'Ashkenazi Jewish' as crude markers of genotypes, in a step towards individualized medicine. For them, the reality of race is immaterial; diagnosis and treatment are what count (see page 301).
Land of the Houyhnhnms
     …it is what they see done every day, and
     they look upon it as one of the necessary
     actions of a reasonable being. —Swift
even crossing the street
how not to draw the ire
of tall, beautiful blonds
(chiseled from the bosom of Odin)
for the WALK sign
on a street empty of cars,
glaring at me,
the dark little savage,
unable to abide by civil law,
already on the other side of the street.
LUND, SWEDEN, MARCH 2009
by Sassan Tabatabai
The Pen and Anvil Press, 2011
Saturday, September 20, 2014
In Full Stop:
Paul Holdengräber of the New York Public Library claims that good conversation can leave one “hopeful about the possibility of speech.”
As one of the world’s leading conversationalists, he would know. In the hundreds of events he’s done since coming to New York a decade ago to join the library as its director of Live from the NYPL, he’s spoken with everyone from Mike Tyson to John Waters, Toni Morrison to Matthew Barney, Pete Townshend to Paul Auster, Harold Bloom to Jay-Z. Listening to him talk with various artists and intellectuals, one is reminded that Western philosophy began with dialogue. How could it have started with anything else? Conversation, which is really just connection, is what makes us human, what gives us our will to live.
I’ve never been more hopeful about the possibility of speech, and thus hopeful about the possibility of life itself, than the two times I’ve had the privilege to converse with this curator of public curiosity.
Tyler Malone: What is conversation to you and what is its value?
Paul Holdengräber: It’s interesting to have children in that regard because one thing you notice very early on is that conversation is how we become human. The word “infant” literally means “without the possibility of phatic expression.” We begin our lives by being spoken to and then slowly by responding. It’s what makes us come together as a kindred species. Without this dialogue, without this possibility of exchange, part of our humanity — that which makes us truly human — is lost. So for me conversation is a way of going back to that initial moment. Conversation is a giving and a taking, back and forth.
You know, the only sport I emphatically love is ping pong. Now I often play with my young boy at six in the morning before he goes off to school. I caught myself thinking the other day that our game was but a variation on what I do professionally. Ping pong is a form of exchange, of back and forth, of quick and slow, of spin and no spin — conversation is something of that nature.
Patrícia Vieira on Doris Sommer's The Work of Art in the World: Civic Agency and Public Humanities and Peter Brooks's (ed.) The Humanities and Public Life, in The LA Review of Books:
Peter Brooks’s edited collection The Humanities and Public Life and Doris Sommer’s The Work of Art in the World: Civic Agency and Public Humanities offer spirited defenses of the humanities that attempt to explain why these fields of study matter. Both volumes cogently argue for the significance of the humanities, focusing primarily on their role in public life. Studying philosophical, historical, and artistic works may well make one a better person — or at least more knowledgeable, skilled, or intelligent — but these two books are chiefly concerned with the public benefit of such studies. What is the social function of the humanities? Is there a correlation between reading and ethics? What about between the humanities and human rights? Can the arts empower disenfranchised communities and, if so, in what ways?
Brooks’s volume is based upon the papers given at a symposium he convened at Princeton University. It brings together essays by a group of distinguished scholars from various fields in the humanities, together with a few representatives from the social sciences, to reflect on a series of key issues related to the social function of their research and teaching. The editor’s introduction is followed by Judith Butler’s article “Ordinary, Incredulous,” which, surprisingly enough, is one of the most melancholic pieces in the collection. Going back to Louis Althusser’s concept of ideology, Butler denounces neoliberal anti-intellectualism that substitutes ideologically backed “obviousnesses” for genuine thought. One of these obviousnesses is the compulsion to justify the humanities for their instrumental value, as being useful for economic or political life.
The humanities are precisely the space to consider what value is and to ponder upon different ways of valuating, in other words, to unravel the obviousnesses of ideology.
David Marquand reviews Inventing the Individual: the Origins of Western Liberalism by Larry Siedentop and Liberalism: the Life of an Idea by Edmund Fawcett, in TNR:
In its 19th-century heyday, as Fawcett’s history reminds us, liberalism was optimistic, passionate and imbued with strongly held moral convictions. Without using the terms, its proponents were for Burke’s social freedom and for Mill’s vision of human nobility. In France, radicals such as Clemenceau took on the army, an exceptionally reactionary Catholic Church and an ugly wave of anti-Semitism in defense of the unjustly imprisoned Captain Dreyfus, an Alsatian Jew by origin. In Britain, Gladstone made his extraordinary transformation from High Tory to Liberal messiah because he came to believe that the masses were nobler and more virtuous than the classes.
Twenty-first-century liberalism is a pale shadow of its 19th-century ancestor. Albeit with some honourable exceptions, the passion and optimism have gone. Latter-day Clemenceaus and Gladstones are nowhere to be seen. Burke’s vision of social freedom has virtually disappeared from the liberal repertoire; few now echo Mill’s call for strenuous self-improvement. For the most part, today’s liberals see individuals as free-floating, history-less and untethered social atoms, quite unlike the rooted, flesh-and-blood individuals presupposed by their counterparts of yesteryear. The most obvious result is that, all too often, the robust moral convictions of the past have withered into a querulous self-righteousness, strongly tinged with moral relativism.
Why should this be?
Levé’s projects often invite us into discomfort, into awkward gaping at the failures of art. This work features a number of speakers—a confused adolescent trying to recount a dream, an old man recalling a shameful event which he describes “through a long and obscure circumlocution which he alone understands.” Another man tells a story with too many characters that soon becomes impossible to follow. It is like the worst episode of This American Life ever. It puts us in mind of one of the remarkable things about Levé’s career: that he seems to have rejected conventional narrative right from the beginning. There was no realist teething phase as with the American avant-gardists David Markson, Padgett Powell, and David Shields. This innate confidence could be attributed to the strong tradition of French experimental writing, particularly the Oulipo group. The notion of potential literature is obviously in play here, with the paradoxical liberations offered by its strict forms. The specters of Alain Robbe-Grillet and Raymond Roussel also haunt Levé’s intricately imagined performances, stupefying films, and impossible architecture. In Works he is obsessed with those moments at which conventional art-responses break down: the catalog includes a number of projects where masterpieces are turned into bad copies, videos played without sound, books attributed to the wrong authors. It is a systematic undermining of “aura” as Walter Benjamin put it memorably in “The Work of Art in the Age of Mechanical Reproduction,” his ode to democratization and lament for art’s failed transcendence.
Her thesis is that the climate movement has been a victim of appalling timing. Scientists came to a decisive view on the dangers of global warming in the 1980s, a decade when faith in the power of unfettered markets surged and it was harder than ever to make the case for collective action, market regulation and a strong role for the state.
Now, she argues, a looming climate crisis has created a “historic opportunity” to attack globalisation, privatisation and other aspects of an economic model that is fundamentally at odds with a habitable climate. Just as the disasters of the Great Depression and the second world war ushered in a swath of social and economic reforms, from retirement pensions to public housing, Klein hopes the climate threat will galvanise a grassroots movement to revive vigorous market interventionism.
This is, of course, precisely the type of thinking that some conservative writers have long claimed underpins a “watermelon” climate movement (green on the outside, red on the inside) filled with closet socialists using global warming to advance their ideological aims. On this, Klein is frank: “I have long been greatly concerned about the science of global warming – but I was propelled into a deeper engagement with it partly because I realized it could be a catalyst for forms of social and economic justice in which I already believed.”
Noel Malcolm in The Telegraph:
“Many of us saw religion as harmless nonsense. We thought, if people needed a crutch for consolation, where’s the harm? September 11 changed all that.” So said Richard Dawkins, who until his retirement enjoyed the title of Professor for the Public Understanding of Science at Oxford University. Some of us began to wonder whether Dawkins had secretly renegotiated the terms of his job, becoming instead the Professor for the Public Misunderstanding of Religion. To argue that one act of terrorism, however extreme, committed by members of one radical movement proved the harmfulness of all religion was a strange piece of reasoning. But, undeniably, it caught a popular mood, and the Dawkins-Hitchens denunciation of religious faith as a force for evil in the world has been on a roll ever since. If the argument here were just about radical Islamism, this debate would at least have a clear and narrow focus. But the Dawkinsite argument is grafted on to an older tradition of anti-religious rhetoric going back to Enlightenment thinkers such as Voltaire, who compiled an entire history of religiously inspired mayhem – from the brutal campaigns of the ancient Israelites to the Crusades, the Spanish Inquisition and the many “wars of religion” in western Europe. This is a heavy burden for any would-be defender of the faith to pick up and deal with.
Karen Armstrong does not flinch from this task. A prolific author of books about religion, she seems to have the right qualifications to be a moderate, non-dogmatic apologist for it: as a former nun, she can see things, so to speak, from both sides of the convent wall. Previously she has written about early religious history as well as modern fundamentalism; her new book runs from the one to the other, from Gilgamesh to bin Laden, covering almost five millennia of human experience in between. This is both an apologia and a history book, aimed always at supplying the context of what may look like religiously motivated episodes of violence, in order to show that religion as such was not the prime cause.
Marc Parry in the Chronicle of Higher Education:
On a Friday night in early August, Corey Robin put out a call on his blog. There had been plenty of grumbling over the University of Illinois’s decision to revoke a job offer to Steven G. Salaita, who gained notoriety for incendiary tweets about Israel. But it had not been enough to persuade the university to reinstate the professor. So Mr. Robin, a political theorist at the City University of New York’s Brooklyn College, ratcheted up the pressure.
He suggested that scholars in every field begin organizing public statements refusing to accept any invitations to speak on any campus of the University of Illinois—a serious disruption of academic business.
"Nobody’s gonna do this," Mr. Robin remembers telling his wife, who was reading in the bedroom of the Park Slope apartment that the couple shares with a daughter and five cats.
To his surprise, they did. Philosophers, citing CoreyRobin.com, took up the challenge. The boycotts snowballed. English professors. Political scientists. Anthropologists. All signed on, and Mr. Robin blogged each fresh step. By the professor’s last count, more than 5,000 scholars have joined boycotts.
The Salaita Affair has riveted academe.
Elliott Colla in his blog:
My first year as a student in Cairo. I visit Cairo’s main book market located in the famous area of Ezbekiyya. When Napoleon tried to conquer Egypt, this was the site of a man-made lake surrounded by the ornate palaces of Turkish Pashas and high-ranking officials of the late Mameluke state. A century later, during British rule, the lake had been filled in and the area converted into a vast entertainment district. Bars and theatres, cabarets and brothels catered to Cairo’s elites who met in this border zone located between the medieval casbah and the new colonial downtown. By the time I get to Cairo, most of this history has disappeared under flyovers and Soviet-era concrete projects. Still, a few sordid belly-dance clubs hold out near the decrepit old fire station and post office.
The book market is literally fastened to an old black iron fence. Inside the bars sit the stately gardens of Ezbekiyya Park, completely off-limits to the general public. Outside, the book market stalls cling to a tiny strip between the fence, a chaotic bus depot, and the busy streets of Ataba.
I do not read Arabic in 1985. So, I mostly look around at the posters. During those years, most of them featured the Indian beefcake actor Amitabh Bachchan, and a woman provocatively fixated on a snake, her full red lips about to kiss it.
On the 10th anniversary of the Clinton Global Initiative, Bill Clinton assesses the state of the world, and of his post-presidency
James Bennet in The Atlantic:
In his distinction-defying way, Clinton has managed to prove the worriers both right and, more fundamentally, wrong. He certainly hasn’t focused; instead, he has found a way to turn his appetite for everything and everyone—along with his instinctive preference for what he has called “bite-size” approaches over sweeping, one-size-fits-all solutions—into a force for significant change, through the Clinton Foundation and through the do-gooder conference he created, the Clinton Global Initiative, or CGI, as he usually calls it. Overall, Bill Clinton has conducted the most energetic, high-profile post-presidency since at least Teddy Roosevelt’s, pouring himself into philanthropic, political, and, yes, moneymaking ventures. But besides supporting his wife as she worked as a senator, secretary of state, and once-and-future presidential candidate, he has made his most unconventional contribution through the Clinton Global Initiative. On the cusp of its 10th anniversary, I sat down with the former president in Washington, D.C., to ask about its lessons so far, and what he hopes to do with it in the future.
One is becoming as well-known for her autobiographical work as she is for her test for what movies meet a gender-balance baseline. Another directed one of the best-reviewed and most surreal documentaries of the past decade and has a follow-up on this year's film-festival circuit. Another has been leading the fight for gay-marriage rights since 2004 in Massachusetts.
Alongside cartoonist Alison Bechdel, The Act of Killing director Joshua Oppenheimer and attorney Mary Bonauto, other 2014 MacArthur Award winners are exploring the subtleties of race via psychology and poetry, using math to model the human brain or define the limits of prime numbers, or providing physical, home and job security to some of the country's most at-risk populations.
Anand Giridharadas in The New York Times:
There are places in America where life is so cheap and fate so brutal that, if they belonged to another country, America might bomb that country to “liberate” them. This book is a mesmeric account of such a place — a ghetto near Newark — that asks the consummate American question: Is it possible to reinvent yourself, to sculpture your own destiny? “The Short and Tragic Life of Robert Peace” seeks answers in the true story of two men, reared in the same mostly black, mostly luckless neighborhood, whose trajectories spectacularly diverge.

One man is Shawn, born to a sweet-talking, drug-pushing father named Skeet, who tries to keep his son from books, fearing they will make him too soft for a hard world. Instead, Skeet teaches Shawn how to fight, intimidate, know everyone on avenues where it’s lethal not to. When Skeet is imprisoned for killing two women, Shawn inherits his friends. He becomes a dealer, too, eventually sleeping in his car, wearing a Kevlar vest.

The other man is Rob, son of a feistily aspirational mother, who, while toiling in kitchens, wishes for her child the escape she never had. She borrows books from the local library to read to her small son, and later buys him the first volume of an encyclopedia, getting additional ones, letter by letter, when she can afford them. She navigates their bleak world to find institutions and people who will help him. A Benedictine school rescues Rob. A bank executive offers to pay all his college expenses. Yale accepts him. He majors in molecular biophysics and biochemistry, and works in a cancer and infectious disease laboratory.
What makes this book so devastating is that these two men, Rob and Shawn, are really one: Robert DeShaun Peace, who went from a New Jersey ghetto to Yale to wherever men go after dying face down, knees bent, in a drug-related murder.
We Are Here
we are here
slaving for sovereignty by selling freedom
into the captivity of patriotism.
we are here silent, brainwashed
we are here
poor, frightened, and angry
wondering who is the next torture victim or petrol-
we are here, clutching at a fragile economy
a disintegrating social system.
we are here feasting on propaganda
while poets sing praise litanies
we are here
queuing for basic commodities
chasing sky-rocketing prices
doing business in an unstable environment
we are here where the dollar is extinct
and millionaires are homeless
mother, what happened to the breadbasket of Africa?
sister, what happened to Africa’s paradise?
brother, what happened to the sunshine city
and that of Kings?
we are here honouring the zhing-zhong products flooding the
market while home industry gathers dust in
well, we are here,
wondering where, when and how
we lost our bearings.
by Cosmas Mairosi
from Poetry International Web, 2007
Friday, September 19, 2014
Ian Sample in The Guardian:
Not to be confused with the more prestigious – and lucrative – prizes doled out from Stockholm next month, the Ig Nobels are awarded for science that makes people laugh and then makes them think.
The winners this year received their awards at a ceremony at Harvard University, where a stern eight-year-old girl was on hand to enforce the strict 60-second limit on acceptance speeches. The ceremony is organised by the science humour magazine Annals of Improbable Research.
Speaking at the event was Rob Rhinehart, creator of the all-in-one food, Soylent, and Dr Yoshiro Nakamatsu, a prolific inventor with more than 3,000 patents who won an Ig Nobel in 2005 for photographing every meal he ate in the previous 34 years.
Holding the flag for Britain, though only figuratively because the flight to Boston cost too much, was Amy Jones, who shared the Ig Nobel prize for psychology. Her work with Minna Lyons at Liverpool Hope University revealed that people who habitually stayed up late were, on average, more self-admiring, manipulative and psychopathic.
"To be honest, I hadn't heard of the awards before," Jones told the Guardian. "It's absolutely overwhelming. No one could be more surprised than me."
People who display the traits often do very well in life, having desirable jobs and more sexual partners, she said. "Successful psychopaths are going to end up in all the high end jobs, in charge of companies, making millions. The unsuccessful psychopaths are the ones that end up in jail."
Frances Stonor Saunders in the LRB:
In his youth Pasternak looked, Marina Tsvetaeva said, ‘like an Arab and his horse’. In older age, he looked the same. Sinewy and tanned from long walks and tending his orchard, at 66 he was still an intensely physical presence. This was the woodsman-poet who was waiting by the garden gate to greet his friend Isaiah Berlin, 19 years younger, bespectacled and pudgy, his indoor skin betraying the rigours of the Senior Common Room and the international diplomatic circuit.
‘The Foreigner Visiting Pasternak at His Dacha’ is its own subgenre of intellectual history. Its principal theme is the excitement of discovering a lost generation who, like ‘the victims of shipwreck on a desert island’, have been ‘cut off for decades from civilisation’ (Berlin). The foreigner, moved by his role as witness to an impossible reality, records every detail of the encounter: the welcome (Pasternak’s handshake is ‘firm’, his smile ‘exuberant’); the walk (oh, that ‘cool’ pine forest, and look, some dusty peasants); the conversation, with Pasternak holding forth ‘as if Goethe and Shakespeare were his contemporaries’; the meal, at which his wife, ‘dark, plump and inconspicuous’ (and often unnamed), makes a sour appearance; the arrival of other members of the Peredelkino colony, the dead undead; the toasts, invoking spiritual companions – Tolstoy, Chekhov, Scriabin, Rachmaninov. And finally the farewell at the gate, at which Pasternak disappears back into the dacha and re-emerges with sheaves of typescript. These are given to the visitor (‘the guest from the future’, as Anna Akhmatova put it), who is now tasked with the sacred and thrillingly immortalising responsibility of carrying Pasternak’s writings out of this place where the clock has stopped and into the world beyond.
Berlin’s reports of his meetings with Pasternak, which cover two periods spanning a decade, conform to the conventions of the genre (not surprising, as he largely invented it) but his published account of his visit of 18 August 1956 is curiously short on colour, and there is no mention of his bride, Aline, who accompanied him, or of Pasternak’s wife, Zinaida. We learn only that the two men convened in a lengthy conversation, which must have vibrated amid the pine trees like some strange antiphon. Pasternak, Berlin once observed, ‘spoke slowly in a low tenor monotone, with a continuous even sound, something between a humming and a drone’; Berlin’s voice was variously described as ‘a low, rapid rumble’, ‘a melting Russian river’, the ‘bubble and rattle’ of a ‘samovar on the boil’. At some point, Pasternak took Berlin into his study, where he thrust a thick envelope into Berlin’s hands and said: ‘My book, it is all there. It is my last word. Please read it.’
Richard Marshall interviews Lori Gruen in 3:AM magazine:
3:AM: A recent book of yours looks at ethics and animals. You begin by looking at the position of human exceptionalism, something that goes back to at least Aristotle. What is the position, and is it a kind of default position for those who just don’t think we should think about animals ethically?
LG: Human exceptionalism is a prejudice that not only sees humans as different from other animals but that also sees humans as better than other animals. Of course humans are unique in a variety of ways, although those differences are often articulated based on naïve views about other animals. In Ethics and Animals, I explore some of the claims that have been made to differentiate humans from other animals (that we are the only beings that use tools or that use language or that have a theory of mind) and show that they do not establish that humans are unique in the ways postulated. But I also discuss the ways that other animals are indeed different from us and different from each other. These differences are important for understanding them and for promoting, or at least not negatively impacting, their well-being.
Human exceptionalism also underlies skepticism about including other animals in the sphere of moral concern. It is related to two other views that are discussed more often in the literature about moral considerability – speciesism and anthropocentrism. Speciesism is the view that I only owe moral consideration to members of my own species. Although this view is usually thought to be focused on humans, it seems consistent with the view that only Vulcans matter to members of that species, or only orangutans matter to that species. Anthropocentrism is the view that humans are at the center of everything and that everything is understood through our human interpretive lenses. Of course we humans experience everything as humans, so in some sense humans are necessarily the center of our own perceptions, but that doesn’t mean we are unable to try to understand or care about non-humans. There is a sense in which we are inevitable anthropocentrists, but we needn’t be human exceptionalists.
Human exceptionalism sees humans as the only beings worthy of moral concern. Normative exceptionalist arguments generally fail in one of two ways—they pick out a supposedly unique characteristic or property upon which moral worth is supposed to supervene but it turns out that either not all humans have that property or that humans aren’t the only ones that have it.
Often at night, sometimes
out in the snow or going into the music, the hunch says,
I don't know what it means.
Just, "Push it. Go further. Go deeper."
And when they come talking at me I get
antsy at times, but mostly I stay put and it keeps saying,
"Deeper. This is not it. You must go deeper."
There is danger in this, also
beautiful fingers and I believe it can issue in
gestures of concord; but I
cannot control it, all I know is one thing—
"Deeper. You must go further. You must go deeper."
by Dennis Lee
from Canadian Poetry Online
If philosophy questions everything, surely it must also question the periodization of its own history. Professional historians themselves tend to agree that the imposition of periods on the past – premodern, Renaissance, early modern, and so on – is always to some degree arbitrary, even if it is also impossible to imagine how we could describe the past without any periodization at all. The bounding off of temporal regions in this way is made all the more problematic if we wish to consider the past from a global perspective, rather than simply focusing on a single region, since the rationale for periodization in one place might not apply in another. However artificial the notion of the ‘medieval’ period is, we may nonetheless say with certainty that this notion is more usefully applied to Europe than to, say, South America: there is nothing ‘medieval’ about the 10th century in Peru (nor, strictly speaking, is there any meaningful sense in which Peruvians can be said to have experienced the 10th century). There is also nothing medieval about what we often call ‘medieval Islamic philosophy’. Whether or not we may see the period between the 8th and the 12th centuries as a ‘Golden Age’, a term that implies a subsequent decline, it is in any case a mistake to see the period of flourishing of ibn Rushd in Iberia, or of ibn Sina in Central Asia, as a relative void between antiquity and modernity. It was certainly not experienced by the people who lived it as ‘between two ages’, and nor, within the context of Islamic history, is there any interesting sense in which this period was a transitional one.
To be considered a serious artist in late-19th-century America, you had to have studied in a European, academic workshop, testing your brushstrokes among the masters of the continent. But art is nothing if not transformation, and almost as soon as American artists embraced the European traditions, they rebelled against them. Taking a cue from the French Impressionists who made their debut in their own private exhibition in 1874, these Americans grappled for a style that reflected the new realities of the post-war industrial American city.
It is this journey—from the European tradition of impressionism to the avant-garde movement of Modernism—that will be on display at the Smithsonian Affiliate Peoria Riverfront Museum from September 26 through January 11, 2015. Featuring works spanning the 1880s to the 1950s, the exhibition "Impressionism Into Modernism: A Paradigm Shift in American Art" covers the Industrial Revolution, two world wars and a depression—all of which shaped the way American artists worked. "I felt that it would be interesting and appropriate to use American impressionism as a jumping off point as the story of the process of American artists embracing change," says Kristan McKinsey, the show's curator. "It's a time where American artists are moving away from academic art traditions and looking to create an art that was original and not derivative of European art."
When I was a child, my favorite room at home was the library, a large oak-paneled room with all four walls covered by bookcases—and a solid table for writing and studying in the middle. It was here that my father had his special library, as a Hebrew scholar; here too were all of Ibsen’s plays—my parents had originally met in a medical students’ Ibsen society; here, on a single shelf, were the young poets of my father’s generation, many killed in the Great War; and here, on the lower shelves so I could easily reach them, were the adventure and history books belonging to my three older brothers. It was here that I found The Jungle Book; I identified deeply with Mowgli, and used his adventures as a taking-off point for my own fantasies.
My mother had her favorite books in a separate bookcase in the lounge—Dickens, Trollope, and Thackeray, Bernard Shaw’s plays in pale green bindings, as well as an entire set of Kipling bound in soft morocco. There was a beautiful three-volume set of Shakespeare’s works, a gilt-edged Milton, and other books, mostly poetry, that my mother had got as school prizes.
Medical books were kept in a special locked cabinet in my parents’ surgery (but the key was in the door, so it was easy to unlock).
Saskya Jain in MoreIntelligentLife:
I don’t remember thinking of running away when I asked Ram Singh, our house help, to get my small grey suitcase from the storeroom. We were living in a government flat surrounded by a big garden in the centre of New Delhi. I was five or six years old, and it was the first of many long summer holidays. My classmates had all fled from the heat—abroad, mostly. The school fees sapped my parents’ income and, with both of them working full-time, the only prospect of travel was accompanying my father to meetings in nearby Jaipur. So began what turned into a ritual of sorts. Every day I would arrange a varying selection of belongings in the empty stomach of my suitcase—only to unpack them all a little while later.
To fill our own empty stomachs, my family relied on Ram Singh’s limited repertoire of roti, sabzi, dal and chawal—unleavened flatbread, fried or curried vegetables, lentils and rice. My brother and I often ate by ourselves, and we knew that, of the four, only the roti lent itself to mid-meal entertainment. It could be torn in half if a ship’s hold needed to be loaded up with potato bricks, okra beams or chickpea crates. It could be attached to each ear, to make a pair of giant earrings such as we had seen dangling from certain aunties’ rubbery lobes. With just one bite, a solo roti could become Krishna’s lethal chakra, which he’d spin around his finger on Sunday-morning episodes of “The Mahabharata” before using it to slice off his enemies’ heads. But despite our best efforts, lunch rarely brought us more than 15 minutes closer to the end of the holidays. I was often in our garden, watching aeroplane trails wrinkle the clear blue sky. The promise of discovery wrapped in the idea of travel appealed to me. I started telling my parents that I had lived in America in my previous life, before I was born into our family. They encouraged me to tell them stories of my prenatal adventures; it took me some time to figure out that their queries were motivated by something other than a keen interest in geography.
Kenneth Chang in The New York Times:
Artificial sweeteners may disrupt the body’s ability to regulate blood sugar, causing metabolic changes that can be a precursor to diabetes, researchers are reporting. That is “the very same condition that we often aim to prevent” by consuming sweeteners instead of sugar, said Dr. Eran Elinav, an immunologist at the Weizmann Institute of Science in Israel, at a news conference to discuss the findings. The scientists performed a multitude of experiments, mostly on mice, to back up their assertion that the sweeteners alter the microbiome, the population of bacteria in the digestive system. The different mix of microbes, the researchers contend, changes the metabolism of glucose, causing levels to rise higher after eating and to decline more slowly than they otherwise would.
The findings by Dr. Elinav and his collaborators in Israel, including Eran Segal, a professor of computer science and applied mathematics at Weizmann, are being published Wednesday by the journal Nature. Cathryn R. Nagler, a professor of pathology at the University of Chicago who was not involved with the research but did write an accompanying commentary in Nature, called the results “very compelling.” She noted that many conditions, including obesity and diabetes, had been linked to changes in the microbiome. “What the study suggests,” she said, “is we should step back and reassess our extensive use of artificial sweeteners.”