October 31, 2010
The Great Indian Love Affair With Censorship
Ashis Nandy in Outlook India:
“Patriotism,” Samuel Johnson said nearly 250 years ago, “is the last refuge of a scoundrel.” These days in India, the adage can be safely applied to nationalism. There is no other explanation of the threat to arrest and try Arundhati Roy on charges of sedition for what she said at a public meeting on Kashmir, where Syed Ali Geelani too spoke. I was not there at the meeting, but I have read her moving statement defending herself afterwards. I feel both proud and humbled by it. I am a psychologist and political analyst, handicapped by my vocation; I could not have put the case against censorship so starkly and elegantly. What she has said is simultaneously a plea for a more democratic India and a more humane future for Indians.
I faced a similar situation a couple of years ago, when I wrote a column in the Times of India on the long-term cultural consequences of the anti-Muslim pogrom in 2002. It was a sharp attack on Gujarat’s changing middle-class culture. I was served summons for inciting communal hatred. I had to take anticipatory bail from the Supreme Court and get the police summons quashed. The case, however, goes on, even though the Supreme Court, while granting me anticipatory bail, said it found nothing objectionable in the article. The editor of the Ahmedabad edition of the Times of India was less fortunate. He was charged with sedition.
I shall be surprised if the charges of sedition against Arundhati are taken to their logical conclusion. Geelani is already facing more than a hundred cases of sedition, so one more probably won’t make a difference to him. Indeed, the government may fall back on time-tested traditions and negotiate with recalcitrant opponents through income-tax laws. People never fully trusted the income-tax officials; now they will distrust them the way they distrust the CBI.
In the meanwhile, we have made fools of ourselves in front of the whole world. All this because some protesters demonstrated at the meeting that Arundhati and Geelani addressed! Yet, I hear from those who were present at the meeting that Geelani did not once utter the word “secession”, and even went so far as to give a soft definition of azadi. By all accounts, he put forward a rather moderate agenda. Was it his way of sending a message to the government of India? How much of it was cold-blooded public relations, how much a clever play with political possibilities in Kashmir?
We shall never know, just because most of those who pass as politicians today and our knowledge-proof babus have proved themselves incapable of understanding the subtleties of public communication. They are not literate enough to know what role free speech and free press play in an open society, not only in keeping the society open but also in serious statecraft. In the meanwhile, it has become dangerous to demand a more compassionate and humane society, for that has come to mean a serious criticism of contemporary India and those who run it. Such criticism is being redefined as anti-national and divisive. In the case of Arundhati, it is of course the BJP that is setting the pace of public debate and pleading for censorship. But I must hasten to add that the Congress looks unwilling to lose the race. It seems keen to prove that it is more nationalist than the BJP.
Bela Lugosi's Dead (for Shuffy)
Scientific evidence for psychic powers?
Jerry Coyne in Why Evolution is True:
A respected peer-reviewed journal in psychology, The Journal of Personality and Social Psychology, is about to publish a paper that presents scientific evidence for precognition. The paper, by Daryl Bem of Cornell University, is called “Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect,” and you can download a preprint on his webpage. I’ve scanned the paper only briefly, and am posting about it in hopes that some of you will read it carefully and provide analyses, either here or elsewhere.
The paper purports to show that a choice that you make in a computer test can be influenced by stimuli you receive after you’ve already made the choice. This implies you have some way, consciously or unconsciously, of detecting things that haven’t yet happened. In an article in Psychology Today, “Have scientists finally discovered evidence for psychic phenomena?“, psychologist Melissa Burkley at Oklahoma State University summarizes two of Bem’s studies:
However, Bem’s studies are unique in that they represent standard scientific methods and rely on well-established principles in psychology. Essentially, he took effects that are considered valid and reliable in psychology – studying improves memory, priming facilitates response times – and simply reversed their chronological order.
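For readers who take up Coyne's invitation to analyze the paper, the elementary check that Bem's two-choice designs invite is straightforward: under the null hypothesis of no precognition, hit counts should follow a binomial distribution with p = 0.5. A minimal sketch of that test (the trial counts below are hypothetical, not Bem's data):

```python
from math import comb

def binomial_p_value(hits, trials, p=0.5):
    """One-sided P(X >= hits) under the null hypothesis of chance guessing."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(hits, trials + 1))

# Hypothetical result: 530 correct guesses in 1000 two-choice trials.
# A hit rate only slightly above 50% can still look "significant"
# when the trial count is large, which is why small above-chance
# effects in big samples deserve scrutiny rather than headlines.
pv = binomial_p_value(530, 1000)
print(f"one-sided p-value: {pv:.4f}")
```

Note that this kind of test says nothing about replication or about the many analytic choices available to an experimenter, which is where most of the subsequent criticism of the paper concentrated.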
The Cask of Amontillado
what's inside a girl
Economic Recessions, Banking Reform and the Future of Capitalism
Jesús Huerta de Soto delivers the Hayek Memorial Lecture at the London School of Economics:
I would like to start off by stressing the following important idea: all the financial and economic problems we are struggling with today are the result, in one way or another, of something that happened precisely in this country on July 19, 1844… What happened on that fateful day that has conditioned up to the present time the financial and economic evolution of the whole world? On that date, Peel’s Bank Act was enacted after years of debate between Banking and Currency School Theorists on the true causes of the artificial economic booms and the subsequent financial crises that had been affecting England especially since the beginning of the Industrial Revolution.
The Bank Charter Act of 1844 successfully incorporated the sound monetary theoretical insights of the Currency School. This school was able to correctly discern that the origin of the boom and bust cycles lay in the artificial credit expansions orchestrated by private banks and financed not by the prior or genuine savings of citizens, but through the issue of huge doses of fiduciary media (in those days mainly paper banknotes, or certificates of demand deposits issued by banks for a much greater amount than the gold originally deposited in their vaults). So, the requirement by Peel’s Bank Act of a 100 percent reserve on the banknotes issued was not only in full accordance with the most elementary general principles of Roman Law regarding the need to prevent the forgery or the over-issue of deposit certificates, but also was a first and positive step in the right direction to avoid endlessly recurring cycles of booms and depressions.
However, Peel’s Bank Act, notwithstanding the good intentions behind it and its sound theoretical foundations, was a huge failure. Why? Because it stopped short of extending the 100 percent reserve requirement to demand deposits as well (Mises 1980, 446-448).
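The arithmetic behind the Currency School's complaint is simple enough to sketch. Fiduciary media are the portion of the note issue (or deposits) not backed by specie; a 100 percent reserve rule, as in Peel's Act for banknotes, requires that portion to be zero. A toy illustration (the figures are invented):

```python
def fiduciary_media(claims_issued, gold_reserves):
    """Unbacked portion of notes or deposits: zero under a 100% reserve rule."""
    return max(claims_issued - gold_reserves, 0)

def meets_full_reserve(claims_issued, gold_reserves):
    """True only when every claim is backed one-for-one by specie."""
    return claims_issued <= gold_reserves

# Hypothetical bank: 1,000,000 pounds of demand claims against 400,000 in gold.
print(fiduciary_media(1_000_000, 400_000))    # -> 600000
print(meets_full_reserve(1_000_000, 400_000)) # -> False
```

Huerta de Soto's point is that Peel's Act applied this check to banknotes but not to demand deposits, so credit expansion simply migrated to the unregulated instrument.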
Stephen Fry Kinetic Typography - Language
The pros of a rapidly aging planet
Stefany Anne Golberg in The Smart Set:
The world has never aged like this before, and the aging of the world is happening everywhere. It is true that, now, developed countries are aging fastest, but it won’t be this way for long. Countries such as Brazil and Sri Lanka may not experience rapid aging now, but when they do, it will happen in just a couple of decades, while the rest of the world needed the entire 20th century.
People worry whether our social welfare systems will collapse. Whether we will have enough hospitals and housing. Whether overall human productivity will decrease. In the New York Times Magazine, Ted C. Fishman worries that global power may be determined by how much a country is willing to invest in care for its elderly, that the old may be pushed aside if they prove too costly. He worries that the old, unable to work, will live in poverty, but that the very act of an old country taking young workers from young countries will just hasten the aging of our last remaining young nations.
And yet, we haven't really asked ourselves just what it will feel like to live in an old world. Will reminiscing replace love songs? Will wisdom replace surprise?
Browsing the Annual Tree Ring Data Bank
I found your Himalayan chronology:
a comprehensive set of cores
from a ski area in Kashmir.
I know you were there in 1973
and you likely felt the stay of November,
before snow slams down
the airplanes -- mountain-shine
through long blue needles, shadows
and cores fresh on the snow in stripes.
I can picture the measurement, later:
Ashok bringing in tea, sweet, gingery,
goat-milk thick and held far
from the calipers. You drank
the first half in 1790 between the earlywood
and latewood. In 1600, you remembered
the rest of it but it had a skin by then.
The oldest pith came from a seedling
in the year of Babur’s first arrival,
complete with court painters to capture
wild Hindustani beasts. (There’s a moment of privacy
before uploading data onto the Persian vellum
of the internet like a miniature painting
before the gold leaf.)
by Hanna Coy
from You Are Here–
The Journal of Creative Geography, 2010
Spartan Means, Splendid Spaces
From Harvard Magazine:
In the summer of 1991, as a new North Carolina State University graduate in environmental design in architecture, Elizabeth Whittaker, M.Arch. ’99, wore a hard hat, pouring concrete over rebars at Arcosanti, a planned community in the Arizona desert designed by the celebrated architect Paolo Soleri. “It was a hippie-throwback place,” she recalls. “Living off the land in a progressive, communal atmosphere. A hilarious place.” Today, as principal of MERGE Architects, Inc. (www.mergearchitects.com) in Boston, Whittaker still dons a hard hat occasionally, but now she’s overseeing the pours, and the buildings under construction are her own designs.
The hard hat suggests the hands-on, intimate involvement with details of a project that Whittaker specializes in, a way of working that she calls “extreme collaboration.” It’s a modus operandi that took form in the early days of her firm, which she founded in 2003, when “we were flying by the seat of our pants, doing these small, quick, needs-to-be-built-in-three-weeks-for-10-dollars kind of projects,” she explains. “We would be inventing the construction details right in the shop or on site with the artists and craftsmen—the steel fabricators, woodworkers, structural engineers, concrete fabricators. Every architect collaborates; this is extreme only in that it is so immediate. We’re inventing it with the tradesmen. I’ve built a practice on learning from these people—it’s more inventive when there are more voices.”
The Cancer Sleeper Cell
Siddhartha Mukherjee in The New York Times:
The word “relapse” comes from the Latin for “slipping backward,” or “slipping again.” It signals not just a fall but another fall, a recurrent sin, a catastrophe that happens again. It carries a particularly chilling resonance in cancer — for it signals the reappearance of a disease that had once disappeared. When cancer recurs, it often does so in treatment-resistant or widely spread form. For many patients, it is relapse that presages the failure of all treatment. You may fear cancer, but what cancer patients fear is relapse. Why does cancer relapse? From one perspective, the answer has to do as much with language, or psychology, as with biology. Diabetes and heart failure, both chronic illnesses whose acuity can also wax and wane, are rarely described in terms of “relapse.” Yet when a cancer disappears on a CT scan or becomes otherwise undetectable, we genuinely begin to believe that the disappearance is real, or even permanent, even though statistical reasoning might suggest the opposite. A resurrection implies a previous burial. Cancer’s “relapse” thus implies a belief that the disease was once truly dead.
But what if my patient’s cancer had never actually died, despite its invisibility on all scans and tests? CT scans, after all, lack the resolution to detect a single remnant cell. Blood tests for cancer also have a resolution limit: they detect cancer only when millions of tumor cells are present in the body. What if her cancer had persisted in a dormant state during her remissions — effectively frozen but ready to germinate? Could her case history be viewed through an inverted lens: not as a series of remissions punctuated by the occasional relapse, but rather a prolonged relapse, relieved by an occasional remission?
More here. (Note: Congratulations to dear friend and brilliant colleague, Sid. My MDS patients have hope because of you! BRAVO!)
October 30, 2010
Stefan Collini in LRB:
Much of the initial response to the Browne Report seems to have missed the point. Its proposals have been discussed almost entirely in terms of ‘a rise in fees’. Analysis has largely concentrated on the amount graduates might pay and on which social groups may gain or lose by comparison with the present system. In other words, the discussion has focused narrowly on the potential financial implications for the individual student, and here it should be recognised that some of the details of Browne’s proposed system of graduate contributions to the cost of fees are, if his premises are granted, an improvement on the present patchwork arrangements.
But the report proposes a far, far more fundamental change to the way universities are financed than is suggested by this concentration on income thresholds and repayment rates. Essentially, Browne is contending that we should no longer think of higher education as the provision of a public good, articulated through educational judgment and largely financed by public funds (in recent years supplemented by a relatively small fee element). Instead, we should think of it as a lightly regulated market in which consumer demand, in the form of student choice, is sovereign in determining what is offered by service providers (i.e. universities). The single most radical recommendation in the report, by quite a long way, is the almost complete withdrawal of the present annual block grant that government makes to universities to underwrite their teaching, currently around £3.9 billion. This is more than simply a ‘cut’, even a draconian one: it signals a redefinition of higher education and the retreat of the state from financial responsibility for it.
Instead, Browne wants to see universities attracting customers in a competitive marketplace: there will be a certain amount of public subsidy of these consumers’ purchasing power, especially for those who do not go on to a reasonably well-paid job, but the mechanism which would henceforth largely determine what and how universities teach, and indeed in some cases whether they exist at all, will be consumer choice.
Mystery Science Theater 2010
“American Gothic” has been described as the most reproduced painting in this country, which is not necessarily high praise. What artist would be elated to hear that one of his paintings had been appropriated in an advertising campaign for General Mills country cornflakes, or Coors beer? For most of his life, Grant Wood endured the scorn of leading art critics, who failed to recognize his refinement. He was known for one painting only, that image of a pale, homely farming pair posed in front of their white house, looking as if their dog just died. Wood painted his creaky masterpiece in 1930, amid the ravages of the Great Depression. Unable to move forward, Americans glanced back and found consolation in images of the sturdy agrarian past. Wood rose to fame as one of the three leaders of Regionalism (Thomas Hart Benton and John Steuart Curry were the other two) and, dressed in his bibbed overalls, presented himself as an antidote to East Coast pretentiousness. “All the really good ideas I’ve ever had came to me while I was milking a cow,” he said, somewhat goofily, in his most famous statement.

More from Deborah Solomon at the NYT here.
"Where Good Ideas Come From": Epiphanies are overrated
Where do brilliant ideas come from? When reporters ask Tim Berners-Lee about the moment he conceived of the World Wide Web, he can't answer. He hasn't forgotten; it just never happened. The idea percolated in his mind for nearly a decade, based on a desire to organize massive amounts of data shared between connected computers. He needed the ideas of others to buzz around him, and he needed an image that would make his idea understandable. His "stack" of information became a "mesh" before eventually becoming a "web." The cliché did not hold true: His moment of insight, as it turns out, wasn't the result of a single flashbulb going off in his brain.
In his sixth book, "Where Good Ideas Come From: The Natural History of Innovation," popular science writer Steven Johnson tries to dispel the notion of the "eureka moment." As with nature, new concepts, like the Internet, slowly grow out of old concepts. They don't spring forth from nowhere. Darwin's theory, for instance, was built on centuries of observation, including his own. During his fateful voyage on the HMS Beagle, Darwin also discovered that atolls, islands made of coral, were created through the lives and deaths of tropical marine organisms, hardened bodies built up on one another. This key image, according to Johnson, gave Darwin a picture for his epic explanation of how life emerged. Using natural science's tendencies to build upon itself, as well as examples of major innovations in science, technology and even art, Johnson makes a case that ideas beget ideas, which means would-be innovators don't need an ivory tower; they need a crowd.
A flashier sort of supernatural novel, aimed at teenagers, is experiencing a startling revival; at the moment you can’t move for vampires and werewolves. Yet the corny “English country house with a spook” template is also being dusted off. It became respectable – and fit for the grown-ups – when Sarah Waters used a full-on array of supernatural effects in her last novel, The Little Stranger. If anything, she overdid it with her bumpings, visions, scratchings, unexplained fires and malign entities; but she also managed to pull off some splendid shocks, as well as cleverly investigating the many purposes a ghost can serve in a narrative. So what do the latest supernatural novels bring to the Hallowe’en party? In an age where viewers are inured to ever more graphic scenes of horror on film, how do you frighten with simple words on a page? I road-tested five recent examples to see if they could make me shudder: two classic English ghost stories and a sparky American take on the genre; an 18th-century chiller set in a spooky old Cambridge college; and a wainscot-free novel that colonises new territory for terror.

More from Suzi Feay at the FT here.
I was sitting on an airplane with a copy of "Storyteller: The Authorized Biography of Roald Dahl" when an elegant woman in the seat next to me murmured, almost to herself, "I live just down the lane from his old cottage in Oxfordshire." Turning to her with excitement I asked if she'd ever run into him. "Oh, no, no," she said with obvious amusement, as if the very suggestion was completely absurd. "He was a great writer," she said, sounding very genuine. Yet she had a puzzled expression on her face. She asked me what I did for a living. I said I wrote books for grown-ups and children, just like Dahl. There was an awkward silence. We parted ways. For those who do not know Dahl's grown-up stories, one of his most beloved — if I may use that word — is called "Pig" (1959), about an orphan raised by a tender, vegetarian aunt. The boy's talents as a young vegetarian chef are depicted in a magical, mystical tone. When the aunt dies, the boy buries her and goes to the city where he encounters, gasp … pork! He loves it, and ends up with his throat slit by a butcher. Pure horror.

More from Donald Sturrock at the LAT here.
Isn’t It Rich?
Paul Simon in The New York Times:
I saw “West Side Story” when I was 16 years old, and I have two vivid memories of the show. One, I didn’t believe for a minute that the dancers were anything like the teenage hoods I knew from the street corner, and secondly, I was completely overwhelmed by the beauty of the song “Maria.” It was a perfect love song. Sondheim was less enamored with the lyric he wrote for Bernstein. He describes it as having a kind of “overall wetness” — “a wetness, I regret to say, which persists throughout all the romantic lyrics in the show.” Sondheim’s rule, taught to him by his mentor, Oscar Hammerstein II, is that the book and composer are better served by lyrics that are “plainer and flatter.” It is the music that is meant to lift words to the level of poetry.
Sondheim’s regret about “Maria” reminded me of my own reluctance to add a third verse to “Bridge Over Troubled Water.” I thought of the song as a simple two-verse hymn, but our producer argued that the song wanted to be bigger and more dramatic. I reluctantly agreed and wrote the “Sail on silvergirl” verse there in the recording studio. I never felt it truly belonged. Audiences disagreed with both Sondheim and me. “Maria” is beloved, and “Sail on silvergirl” is the well-known and highly anticipated third verse of “Bridge.” Sometimes it’s good to be “wet.”
When I think of Stephen Sondheim songs, I think of his melody and lyrics as one. His career as a lyricist for other composers (Bernstein, Jule Styne and Richard Rodgers) is as distinct from his later work as night is to day, or conversely, day to night, since the quintessential Sondheim song is perceived to be somehow darker, lyrically more cerebral and colder than his earlier collaborative work. From “Sweeney Todd”:
There’s a hole in the world
Like a great black pit
And the vermin of the world
And its morals aren’t worth
What a pig could spit,
And it goes by the name of London.
Arab elections: free, sort of fair... and meaningless
Shadi Hamid in Foreign Policy:
A certain Arab country recently held parliamentary elections. The vote was reasonably free and fair. Turnout was 67 percent, and the opposition won a near majority of the seats -- 45 percent to be exact. Sounds like a model democracy. Yet, rather than suggesting a bold, if unlikely, democratic experiment, Saturday's elections in Bahrain instead reflected a new and troubling trend in the Arab world: the free but unfair -- and rather meaningless -- election.
Something similar will happen on Nov. 9 in Jordan. The Hashemite Kingdom is a close U.S. ally that has grown increasingly proficient at predetermining election results without actually rigging them. It involves gerrymandering at a scale unknown in the West and odd electoral engineering (Jordan is one of only three countries in the world that uses something called Single Non Transferable Vote for national elections). Even when the opposition is allowed to win, the fundamentals do not necessarily change. Parliamentary legislation in countries like Jordan and Bahrain, after all, can be blocked by appointed "Upper Houses." And even if that were not the case, the King (or the President) and his ministers -- all appointed -- can also kill any threatening legislation.
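For readers unfamiliar with the mechanism Hamid mentions, SNTV gives each voter a single non-transferable vote in a multi-member district, and the top vote-getters take the seats; the system notoriously punishes blocs that split their vote across too many candidates, which is precisely what makes it attractive for electoral engineering. A toy tally (the district and numbers are hypothetical):

```python
from collections import Counter

def sntv_winners(votes, seats):
    """Tally single non-transferable votes; the top `seats` candidates win."""
    tally = Counter(votes)
    return [candidate for candidate, _ in tally.most_common(seats)]

# Hypothetical 3-seat district: an opposition bloc holds 60% of the voters
# but splits them across four candidates, while the loyalist bloc
# concentrates its 40% on two -- so the majority takes only one seat.
ballots = (["opp_A"] * 16 + ["opp_B"] * 15 + ["opp_C"] * 15 + ["opp_D"] * 14 +
           ["gov_X"] * 21 + ["gov_Y"] * 19)
print(sntv_winners(ballots, seats=3))  # -> ['gov_X', 'gov_Y', 'opp_A']
```

Combined with gerrymandered district magnitudes, this is how a vote can be "free" in the polling-station sense yet predetermined in outcome.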
Why Intelligent People Drink More Alcohol
Satoshi Kanazawa in Psychology Today:
Controlling for a large number of demographic variables, such as sex, race, ethnicity, religion, marital status, number of children, education, earnings, depression, satisfaction with life, frequency of socialization with friends, number of recent sex partners, childhood social class, mother’s education, and father’s education, more intelligent children grow up to drink more alcohol in the UK and the US.
[This] graph shows the association between childhood intelligence (grouped into five “cognitive classes”: “very dull” – IQ < 75; “dull” – 75 < IQ < 90; “normal” – 90 < IQ < 110; “bright” – 110 < IQ < 125; “very bright” – IQ > 125) and the latent factor for the frequency of alcohol consumption. The latter variable is constructed from a large number of indicators for the frequency of alcohol consumption throughout adult life and standardized to have a mean of 0 and a standard deviation of 1.0. The data come from the National Child Development Study (NCDS) in the United Kingdom. There is a clear monotonic association between childhood intelligence (measured before the age of 16) and the frequency of alcohol consumption in their 20s, 30s, and 40s. “Very bright” British children grow up to consume alcohol nearly one full standard deviation more frequently than their “very dull” classmates.
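For readers curious about the mechanics, the two transformations Kanazawa describes are easy to sketch: binning IQ at the stated cutoffs (75, 90, 110, 125) into the five "cognitive classes," and rescaling a consumption measure to mean 0 and standard deviation 1. A minimal illustration with invented scores (not NCDS data):

```python
import statistics

def cognitive_class(iq):
    """Map an IQ score to the five bands used in the study."""
    if iq < 75:
        return "very dull"
    if iq < 90:
        return "dull"
    if iq < 110:
        return "normal"
    if iq < 125:
        return "bright"
    return "very bright"

def standardize(values):
    """Rescale to mean 0, standard deviation 1 -- the 'latent factor' scale."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

iqs = [70, 85, 100, 115, 130]
print([cognitive_class(iq) for iq in iqs])
# -> ['very dull', 'dull', 'normal', 'bright', 'very bright']

drinking = [1, 2, 3, 4, 5]  # made-up frequency scores
print(standardize(drinking))
```

On this scale, "nearly one full standard deviation" means the "very bright" group's average sits roughly 1.0 above the "very dull" group's average on the standardized consumption factor.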
So you Want to Get a PhD in the Humanities
In Writings of Obama, a Philosophy Is Unearthed
Patricia Cohen in the New York Times:
When the Harvard historian James T. Kloppenberg decided to write about the influences that shaped President Obama’s view of the world, he interviewed the president’s former professors and classmates, combed through his books, essays, and speeches, and even read every article published during the three years Mr. Obama was involved with the Harvard Law Review (“a superb cure for insomnia,” Mr. Kloppenberg said). What he did not do was speak to President Obama.
“He would have had to deny every word,” Mr. Kloppenberg said with a smile. The reason, he explained, is his conclusion that President Obama is a true intellectual — a word that is frequently considered an epithet among populists with a robust suspicion of Ivy League elites.
In New York City last week to give a standing-room-only lecture about his forthcoming intellectual biography, “Reading Obama: Dreams, Hopes, and the American Political Tradition,” Mr. Kloppenberg explained that he sees Mr. Obama as a kind of philosopher president, a rare breed that can be found only a handful of times in American history.
“There’s John Adams, Thomas Jefferson, James Madison and John Quincy Adams, then Abraham Lincoln and in the 20th century just Woodrow Wilson,” he said.
To Mr. Kloppenberg the philosophy that has guided President Obama most consistently is pragmatism, a uniquely American system of thought developed at the end of the 19th century by William James, John Dewey and Charles Sanders Peirce.
October 29, 2010
Leadership and Leitkultur
JÜRGEN HABERMAS in The New York Times:
SINCE the end of August Germany has been roiled by waves of political turmoil over integration, multiculturalism and the role of the “Leitkultur,” or guiding national culture. This discourse is in turn reinforcing trends toward increasing xenophobia among the broader population. These trends have been apparent for many years in studies and survey data that show a quiet but growing hostility to immigrants. Yet it is as though they have only now found a voice: the usual stereotypes are being flushed out of the bars and onto the talk shows, and they are echoed by mainstream politicians who want to capture potential voters who are otherwise drifting off toward the right. Two events have given rise to a mixture of emotions that are no longer easy to locate on the scale from left to right — a book by a board member of Germany’s central bank and a recent speech by the German president.
It all began with the advance release of provocative excerpts from “Germany Does Away With Itself,” a book that argues that the future of Germany is threatened by the wrong kind of immigrants, especially from Muslim countries. In the book, Thilo Sarrazin, a politician from the Social Democratic Party who sat on the Bundesbank board, develops proposals for demographic policies aimed at the Muslim population in Germany. He fuels discrimination against this minority with intelligence research from which he draws false biological conclusions that have gained unusually wide publicity. In sharp contrast to the initial spontaneous objections from major politicians, these theses have gained popular support. One poll found that more than a third of Germans agreed with Mr. Sarrazin’s prognosis that Germany was becoming “naturally more stupid on average” as a result of immigration from Muslim countries.
‘The Ticking Is the Bomb’
From The Paris Review:
The Ticking Is the Bomb, the second memoir by nonfiction writer and poet Nick Flynn, describes his experiences with fatherhood, writing, and the Abu Ghraib torture victims, some of whom he met personally.
Going back to the book’s organization, I love how the scenes with the Abu Ghraib victims are juxtaposed with more personal scenes; it doesn’t establish equivalence, but it mixes the intimacies and distances of both in really gripping ways. Is there any one thing that you want readers to take away as far as our connection to the victims is concerned?
With the Abu Ghraib photographs I was never interested in the question of how our soldiers came to torture other human beings, or even in how Dick Cheney came to authorize it. That Dick Cheney is pro-torture surprises no one; he freely admits it. That soldiers do terrible things during wartime should not surprise us. So at some point the book became about the darker impulses we all carry within us, which led me to examine my own darker impulses. The only way to break out of these darker impulses, for me, was to make a human, face-to-face connection with some of the ex-detainees from the photographs. This is always the only way out.
Any notes from your time with them that weren’t in the book, after your having more time and distance to reflect?
I hope that their humanity came through in the pages, how each had internalized what had happened to them in completely different ways. All of us, we laughed a lot during our time together, when we weren’t hearing about atrocities.
600 million years of jet lag
At first glance, a sea anemone doesn’t seem much like a human. It’s a creature from the tidal zone, affixed to the rock or coral below, and without most of the anatomical features associated with humankind: It has no arms, legs, ears, eyes, or nose. It almost seems more like a plant than an animal. Anemones don’t even have a brain; instead their nerves form a network distributed throughout the body; each nerve cell can communicate with its neighbors, but no central structure controls the entire organism. But a study published last month shows that anemones share one trait with humans: They, like us, are susceptible to jet lag. Like humans, anemones have a strong circadian rhythm, an activity cycle kept on a roughly 24-hour period by built-in biological clocks.

More from Dave Munger at Seed here.
why houdini matters
Magic is an amusing, intellectual art in which what you see collides with what you know, and there’s a sparkling little jolt that makes you gasp or laugh. It’s recreation. Magic is clever and fun. We buy children magic kits in toy stores. When we shop for magic books, we find them shelved among the “games and pastimes.” In Las Vegas production shows, magic occupies the “variety arts” spot as an alternative to trained dogs that dance in tutus—a sorbet to refresh our palates between the important courses of perfect naked bodies. Even Harry Kellar, the “dean” of American stage magicians in the generation that preceded Houdini, declared that a magician should transport his audiences “to fairyland without scaring them with the devil.” This—with one hairstyle or another—has pretty much been the job description ever since. But there was nothing fairyland about Houdini (the subject of a major exhibition that opened recently at the Jewish Museum, in Manhattan, with a handsome catalogue by Brooke Kamin Rapaport). He was made of flesh—taut, handsome, muscular—and never let us forget it. The buttoned-up world devoured pictures of Houdini’s physique as he leaped handcuffed from the bridges we crossed every day. Houdini gleefully defied authority. He would challenge police to throw him naked into a jail cell (always a great photo op, with manacles discreetly covering his privates); his clothes were locked in an adjoining cell. A little later the officers—smugly congratulating themselves on stumping the Great Self-Liberator—would hear the telephone ring. It was Houdini, calling from across town.

More from Teller at Vanity Fair here.
thin slicing times
Making a photograph -- a snapshot of a passing scene or the staging of a scene as though for posterity -- has usually been understood as an act of consciousness, what Henri Cartier-Bresson called a “decisive moment” of consciousness, but I suggest that it has less to do with consciousness than the unconscious. It has to do with that “critical part of rapid cognition known as thin-slicing” -- “the ability of our unconscious to find patterns in situations and behaviors based on very narrow slices of experience” -- those thin slices of experience we call photographs. “Thin-slicing is part of what makes the unconscious so dazzling” -- and photographs so haunting and fascinating -- “but it’s also what we find most problematic about rapid cognition. How is it possible to gather the necessary information for a sophisticated judgment in such a short time?” -- in the blink of the camera’s eye, which seems to think without thinking, to refer to Malcolm Gladwell’s Blink: The Power of Thinking Without Thinking?(1)
more from Donald Kuspit at Artnet here.
A Conversation with Amitava Kumar
Barbara Spindel in the Barnes and Noble Review:
Amitava Kumar wears many hats: Vassar College English professor, literary critic, journalist, poet, and novelist. Duke University Press has just published two books by the prolific writer. Nobody Does the Right Thing is a richly textured novel about a Bombay journalist struggling to reconcile his idealism with his desire to write a Bollywood screenplay. A Foreigner Carrying in the Crook of His Arm a Tiny Bomb is an impassioned critique of the war on terror that focuses on the cases of Hemant Lakhani and Shahawar Matin Siraj, two men whom the U.S. government, with the help of paid informants, convicted of plotting acts of terrorism.
Kumar uses the cases of the two men, whom he sees as "accidental terrorists" created by a government desperate for suspects to prosecute, to argue that in the post-9/11 world, "public interest will need to be defined more boldly as the rights that offer protection against the encroachments of a security state." Our wide-ranging email conversation covered the two books and also touched on Kumar's education in India, the role of politics in art, and the "ground zero mosque" controversy.
B&N Review: I thought I'd begin with A Foreigner. I'm wondering how the idea for the book was born and how you ended up focusing on Lakhani and Siraj. What did you find particularly compelling about their cases?
Amitava Kumar: I had just come out of Home Depot and turned on the car radio. On the news was Hemant Lakhani. His lawyer was saying how no real terrorist would have come to Lakhani. Lakhani was a bungler. And right there, in the parking lot, while loading boxes in my car, I thought I would write a story about it.
Electoral victory brings a surprising consequence: the winners look at porn
From The Economist:
When Barack Obama won the American presidency in 2008 his supporters cheered, cried, hugged—and in many cases logged on to their computers to look at pornography. And, lest Republicans crow about the decadence of their opponents, precisely the obverse happened when their man won in 2004.
That, at least, is the conclusion of a study by Patrick Markey of Villanova University, in Pennsylvania, and his wife Charlotte, who works at Rutgers, in New Jersey. The Markeys were looking for confirmation of a phenomenon called the challenge hypothesis. This suggests that males involved in a competition will experience a rise in testosterone levels if they win, and a fall if they lose.
The challenge hypothesis was first advanced to explain the mating behaviour of monogamous birds. In these species, males’ testosterone levels increase in the spring, to promote aggression against potential rivals. When the time comes for the males to settle down and help tend their young, their testosterone falls, along with their aggressive tendencies.
Something similar has since been found to apply to fish, lizards, ring-tailed lemurs, rhesus monkeys, chimpanzees—and humans. In many of these animals, though, there is a twist. It is not just that testosterone ramps up for breeding and ramps down for nurturing. Rather, its production is sensitive to a male’s success in the breeding competition itself. In men, then, levels of the hormone rise in preparation for a challenge and go up even more if that challenge is successfully completed. Failure, by contrast, causes the level to fall.
The View from East Rock (Day #17,119)
You stand at the helm of East Rock as
Though it were a ship setting sail. Your gaze
Reads the horizon like tea leaves, and,
Suddenly, you feel dwarfed by this ordinary
Sunset as if for the first time in 17,119 days.
Transfixed by the light and the longing,
You follow the ungodly caw-cawing of gulls
Gutting fish on a not-too-distant dock.
In fact, you too are restless tonight.
Or is it “restive?” You wonder about that.
And you remember wondering about
That before, and looking it up, and then
The forgetting. It seems as though forgetting
Should be harder than remembering,
Like running downhill is harder than up.
Instead, all you can track is the wind
Rushing by, carrying leaves like you once
Carried children across Mill River, and
Before that, the silence of fireflies kindling
A path up the mountain that marks our
Once-upon-a-time New Haven. One by one,
Such moments slip like pearls off a strand,
And, as you blink, the strand itself blows by.
By K. Ann Cavanaugh
Everyday BPA Exposure Decreases Human Semen Quality
From Scientific American:
The common industrial chemical bisphenol A (BPA) has been linked to many ills, including reproductive abnormalities, cardiovascular disease and cancer. Much of the evidence for these associations, however, has been drawn from animal or in vitro research and has been somewhat controversial as to its precise implications for human health.
Now, a human study has found strong links between BPA levels and semen quality—and the findings are not looking good, especially for men frequently exposed to the compound on the job.
Researchers studied the urine (where BPA can be measured) and semen of 218 male factory workers in China, some of whom make BPA or put it into other products (such as plastics and epoxy resins that line cans), and the remainder, whose work did not put them in direct contact with the chemical.
The Power of Noise vs. The Power of Now
The scientific impotence excuse
Christian Jarrett in the Research Digest blog of the British Psychological Society:
When attempting to change people’s behaviour – for example, encouraging them to eat more healthily or recycle more – a common tactic is to present scientific findings that justify the behaviour change. A problem with this approach, according to recent research by Geoffrey Munro at Towson University in America, is that when people are faced with scientific research that clashes with their personal view, they invoke a range of strategies to discount the findings.
Perhaps the most common of these is to challenge the methodological soundness of the research. However, with newspaper reports and other brief summaries of science findings, that’s often not possible because of lack of detail. In this case, Munro's research suggests that people will often judge that the topic at hand is not amenable to scientific enquiry. What’s more, he’s found that, having come to this conclusion about the specific topic at hand, the sceptic will then generalise their belief about scientific impotence to other topics as well. Munro says that by embracing the general idea that some topics are beyond the reach of science, such people are able to maintain belief in their own intellectual credibility, rather than feeling that they’ve selectively dismissed unpalatable findings.
The Digest caught up with Professor Munro to ask him, first of all, whether he thinks there are any ways to combat the scientific impotence excuse or reduce the likelihood of it being deployed.
October 28, 2010
Is there anything new to say about Bruce Chatwin?
William Dalrymple in the Times Literary Supplement:
In the autumn of 1986, the painter Derek Hill rang out of the blue and invited me to lunch at his club in St James’s. He was then about seventy; I was twenty-one. I was just back from a journey following in the footsteps of Marco Polo, and Derek wanted me to bring to the lunch some of the Mongol roof tiles I had found at Polo’s final destination, Kubla Khan’s summer palace at Xanadu. The lunch, he explained, was for a friend of his who particularly wanted to see them.
That friend turned out to be Bruce Chatwin, and the lunch was one of those encounters that happen only once or twice in a lifetime and that really do change the direction you end up taking. Chatwin, I thought, was simply astounding. As we sat in the panelled dining room, surrounded by whispering pin-striped clubmen, my small fragments of glazed tile were the starting point for a conversational riff that moved from the nomads of Mongolia in the thirteenth century and cantered over the steppes to Timurid Herat, then leapt polymathically to Ibn Battuta, Ibn Khaldun, Sufi sheikhs and the shamans of the Kalahari bushmen; before long we were being told about Taoist sages, Aboriginal “dreaming” pictures and ancient Cycladic sculpture and thence, as coffee came, via Proust and Pascal and Berenson, to Derek’s portraits, and the latter’s story about sharing a railway carriage with Robert Byron who performed a pitch-perfect imitation of Queen Victoria, using the train’s antimacassar as the Queen’s mourning veil.
At the end, Chatwin limped off on crutches to the London Library saying he needed to check some references for his forthcoming book on the Aborigines of Central Australia...
An evolutionary psychologist proposes a new framework for understanding the root causes of our political beliefs
Tom Jacobs in Miller McCune:
As much as we stake our identity on such core beliefs, it’s unlikely we emerged from the womb as little liberals or libertarians. This raises a fundamental question: At what point in our development did such predispositions begin to form, to coalesce and to harden? What is it about our biology and/or psychology that propels us toward a liberal or conservative mindset?
The question has long intrigued social psychologists such as John Jost of New York University. In a 2003 meta-analysis of 50 years of research, he summarizes the overwhelming evidence that political ideologies, “like virtually all other belief systems, are adopted in part because they satisfy various psychological needs.” Jost quickly adds that this “is not to say they are unprincipled, unwarranted, or unresponsive to reason or evidence” — only that the underlying motivation to believe in them emerges from somewhere other than the rational, conscious mind.
“Most of the research literature … suggests that conservatives are more easily threatened, more likely to perceive the world as dangerous, and less trusting in comparison with liberals,” he notes. This is fairly self-evident. If you perceive the world as a threatening place, you’re more likely to cling tightly to those you trust (i.e., your in-group, however you define it), and to warily eye those you don’t.
the rosenberg case lives
AS A COLLEGE STUDENT in the mid-1960s, I was assigned an array of books that for the most part were unremarkable and quickly forgotten. One of the few that really captured my interest explored the trial and execution of a young, Jewish couple from New York convicted of conspiring to steal the secrets of the atom bomb. Invitation to an Inquest struck me as a powerful piece of investigative journalism and I told many friends the book was a must-read. The authors, Walter and Miriam Schneir, persuasively argued that Julius and Ethel Rosenberg were an innocent, progressive couple caught up in an anti-Communist, FBI-inspired witch hunt and that a “pathological liar” and “weirdly twisted creature” named Harry Gold was the government-appointed finger man who fabricated a highly unlikely story that put them in the electric chair. Young, impressionable, and unschooled in the nuances of the case, I kept my admiration for the book intact for many years. Of course, I was aware that the guilt or innocence of the Rosenbergs was a controversial and much-debated issue with numerous and knowledgeable advocates on both sides. As time went on, I read other accounts of the case and my confidence in the Schneir thesis began to wane. For example, The Rosenberg File, Ron Radosh and Joyce Milton’s 1983 take on the case, was equally compelling and easily matched the Schneirs’ for solid historical detective work.
more from Allen M. Hornblum at The Fortnightly Review here.
There can be no turning two hundred without regrets. Even so, the element of wistfulness was bound to play an especially large role in the Argentine case. The surprise for me last month, as a yanqui spectator auto-marooned these past few years in Buenos Aires, while I strolled up and down the Avenida 9 de Julio—broadest street in the world, so they say—picking my way through the throngs of Argentines out celebrating the May Revolution of 1810, was that the experience of the bicentenario should look so joyous, as it was later reported to have been in polls of the huge numbers who took part, and that the official commemoration of two centuries of Argentine history should at the same time concentrate on several of the darkest passages in the country’s history. On the occasion of the big parade, fighter jets flew overhead and gauchos rode by on horseback, just as you might expect. But there were also actors depicting militant workers calling for a general strike, to evoke the hundreds cut down by paramilitary gangs in the semana trágica of 1919; a gigantic installation, suspended on guy-wires, of the constitution in flames; a float portraying the Mothers of the Disappeared who campaigned to know their children’s whereabouts during the ruling junta’s frenzy of state terrorism in the late ’70s; and another troupe of actors in business suits tossing funny money to the crowd in much the way—this was the idea—that the Argentina of the ’90s had plunged into a delirium, soon punctured, of fictitious prosperity.
more from Benjamin Kunkel at n+1 here.
In January 1765, Mr and Mrs Ricketts, the new tenants of Hinton Manor in Hampshire, “became alarmed by the frequent opening and shutting of doors during the night”, a phenomenon that persisted even after all of the locks had been changed, and which soon came to be augmented by sightings of “a figure in a ‘snuff-coloured’ coat” and by the sounds of disembodied conversation between “a shrill female voice . . . and then two others with deep and manlike tones”. In 1695, the curate of Warblington, dispatched to investigate a haunted building, sensed “something in the room that went about whistling”. In 1879, a lady staying with “some north country cousins . . . at their house in Yorkshire” woke in the night to see “at the foot of the bed a child . . . a little girl with dark hair and a very white face . . .”, her eyes “turned up with a look of entreaty, an almost agonised look”. In 1682 in Spraiton, Devon, a shoelace “was observed (without the assistance of any hand) to come of its own accord out of its shoe and fling itself to the other side of the room”. A maid went to retrieve the thing only to discover that “it strangely clasp’d and curl’d about her . . . like a living eel or serpent”. In 1649, a party of Cromwell’s men, staying in “the Mannor-house of Woodstock”, were tormented by something “treading as they conceived much like a Bear”, which threw “a Glass and great Stones at them . . . and the bones of Horses, and all so violently that the Bed-stead and the Walls were bruised by them”. One hundred and fifty years later, Lord Brougham, reclining in his bath and “enjoying the comfort of the heat”, was confronted by the figure of his oldest friend, recently deceased but sitting, all the same, on a chair beside the tub, “looking calmly” at him.
more from Jonathan Barnes at the TLS here.
All Programs Considered
Bill McKibben in the New York Review of Books:
Radio receives little critical attention. Of the various methods for communicating ideas and emotions—books, newspapers, visual art, music, film, television, the Web—radio may be the least discussed, debated, understood. This is likely because it serves largely as a transmission device, a way to take other art forms (songs, sermons) and spread them out into the world. Its other uses can be fairly pedestrian too: ball games and repetitive, if remarkably effective, right-wing commercial talk radio. Rush Limbaugh is the radio ratings champ; according to the industry’s trade journal he reaches 14.25 million listeners in an average week. Sean Hannity, working the same turf, trails him slightly.
But an equally large audience turns to the part of the dial where public radio in its various forms can be found. Public radio claims at least 5 percent of the radio market. National Public Radio’s flagship news programs—Morning Edition and All Things Considered, featuring news and commentary alongside in-depth reports and stories that can stretch over twenty minutes—are the second- and third-most-popular radio programs in the country, each drawing about 13 million unique listeners in the course of the week. These NPR shows have far larger audiences than the news on cable television; indeed, all four television broadcast networks combined only draw twice as large an audience for their evening newscasts.
Stories vs. Statistics
John Allen Paulos in the New York Times:
Half a century ago the British scientist and novelist C. P. Snow bemoaned the estrangement of what he termed the “two cultures” in modern society — the literary and the scientific. These days, there is some reason to celebrate better communication between these domains, if only because of the increasingly visible salience of scientific ideas. Still a gap remains, and so I’d like here to take an oblique look at a few lesser-known contrasts and divisions between subdomains of the two cultures, specifically those between stories and statistics.
I’ll begin by noting that the notions of probability and statistics are not alien to storytelling. From the earliest of recorded histories there were glimmerings of these concepts, which were reflected in everyday words and stories. Consider the notions of central tendency — average, median, mode, to name a few. They most certainly grew out of workaday activities and led to words such as (in English) “usual,” “typical,” “customary,” “most,” “standard,” “expected,” “normal,” “ordinary,” “medium,” “commonplace,” “so-so,” and so on. The same is true about the notions of statistical variation — standard deviation, variance, and the like. Words such as “unusual,” “peculiar,” “strange,” “original,” “extreme,” “special,” “unlike,” “deviant,” “dissimilar” and “different” come to mind. It is hard to imagine even prehistoric humans not possessing some sort of rudimentary idea of the typical or of the unusual. Any situation or entity — storms, animals, rocks — that recurred again and again would, it seems, lead naturally to these notions. These and other fundamentally scientific concepts have in one way or another been embedded in the very idea of what a story is — an event distinctive enough to merit retelling — from cave paintings to “Gilgamesh” to “The Canterbury Tales,” onward.
A bunch of barbarians in the desert
This video contains the most preposterous performance I have yet seen in a talk show, by the "Republican strategist" Jack Burkman. It is as offensive as it is ludicrous and laughable, and Robert Fisk and Anas al-Tikriti seem to have trouble deciding whether to get angry or break out in guffaws at the ridiculousness of it all!
Where does homosexuality come from?
What makes a person gay? Is it genetics, upbringing, or some combination of the two? Over the past few decades, a slew of scientific research has bolstered the notion that sexuality is, at least in part, innate. Studies of the sexual behavior of various animal species have shown that homosexuality is not just a human phenomenon. Then there is the curious finding that the number of older brothers a male has may biologically increase his chances of being gay.
Now Simon LeVay, a former Harvard neuroscientist, has written, "Gay, Straight, and the Reason Why: The Science of Sexual Orientation," a comprehensive, engaging and occasionally quite funny look at the current state of the research on the topic. LeVay is one of the leading authorities in the field: Back in 1991, he discovered that INAH3, a structure in the hypothalamus of the brain that helps to regulate sexual behavior, tended to be smaller in gay men than in straight men. It was a watershed moment in our understanding of sexual orientation (the study was published at the height of the AIDS epidemic, when the disease was widely regarded in religious circles as divine punishment for the sin of being gay) and the first scientific finding to support the idea that gayness might be more than just a lifestyle.
'Marilyn Monroe' neuron aids mind control
People have used mind control to change images on a video screen, a study reports. The volunteers, whose brains were wired up to a computer, enhanced one of two competing images of famous people or objects by changing firing rates in individual brain cells.
The research, by Moran Cerf from the California Institute of Technology in Pasadena and his colleagues, demonstrates how our brains, which are constantly bombarded with images, noise and smells, can, through conscious thought, select what stimuli to notice and what to ignore (see video). The research is particularly exciting, says neuroengineer John Donoghue of Brown University, "because it shows how we can now peer into the process of thinking at a level we have not been able to get at before". Donoghue was responsible for the first successful transplantation of a chip into the motor cortex of a tetraplegic man, enabling him to move a computer cursor and manipulate a robotic arm with his mind.
In the last six years or so, the researchers have shown that single neurons can fire when subjects recognise — or even imagine — just one particular person or object. They propose that activity in these neurons reflects the choices the brain is making about what sensory information it will consider further and what information it will neglect.
October 27, 2010
Romanticism, Reflexivity, Design: Nathan Schneider Interviews Colin Jager
Over at The Immanent Frame:
NS: What makes modern sociological terms like “secularism” and “secularization” useful for interpreting eighteenth- and nineteenth-century literature? Is there a danger of falling into misleading anachronism?
CJ: There’s always that danger when we use a term from one historical period to describe aspects of another one. “Secularism” first emerges in Victorian England as a self-description, a way to avoid being labeled an atheist, and it has a long history within Christianity before that, as the secular, or worldly, time before the Second Coming. “Secularization” is a bit trickier, since it aims to describe a process and to give that process the aura of scientific neutrality, like the weather. I think the danger is not so much anachronism—which, frankly, I don’t think is a bad thing anyway—but rather forgetting that terms are never merely descriptive. So, I use the term, and I try to be reflexive about it. It’s comforting for many people to see themselves as living on the far side of a secularization process, and it’s that sense of comfort that I’d like to disrupt a bit.
NS: What does it mean for you to be reflexive?
CJ: What I mean by “reflexivity” is really just a critical consciousness that whenever you invoke a term, you are also invoking its history—the conditions under which it was forged and the uses to which it has been subsequently put. At the same time, we need these terms: something has changed over the course of modernity, for instance, and I’m comfortable with calling that change “secularization,” as long as it’s defined very carefully and I know what the stakes are in a given definition. Reflexivity is just my shorthand for the process, which I take to be central to serious intellectual practice, anyway—to strike the balance between using a term or concept or idea and simultaneously being aware of what you’re doing when you use it. It’s a mental habit of disembedding from the stuff you really care about—which, appropriately enough, is a pretty good definition of the secular!
An Interview with Lera Boroditsky
Why Sisterly Chats Make People Happier
My most beloved and second oldest friend, my younger sister, turned a new decade a few days ago. It was an occasion to openly appreciate the happiness she brings me. A recent study suggests that this effect is wholly normal. Deborah Tannen in the NYT offers some further thoughts on the topic:
“Having a Sister Makes You Happier”: that was the headline on a recent article about a study finding that adolescents who have a sister are less likely to report such feelings as “I am unhappy, sad or depressed” and “I feel like no one loves me.”
These findings are no fluke; other studies have come to similar conclusions. But why would having a sister make you happier?
The usual answer — that girls and women are more likely than boys and men to talk about emotions — is somehow unsatisfying, especially to a researcher like me. Much of my work over the years has developed the premise that women’s styles of friendship and conversation aren’t inherently better than men’s, simply different.
A man once told me that he had spent a day with a friend who was going through a divorce. When he returned home, his wife asked how his friend was coping. He replied: “I don’t know. We didn’t talk about it.”
His wife chastised him. Obviously, she said, the friend needed to talk about what he was going through.
This made the man feel bad. So he was relieved to read in my book “You Just Don’t Understand” (Ballantine, 1990) that doing things together can be a comfort in itself, another way to show caring. Asking about the divorce might have made his friend feel worse by reminding him of it, and expressing concern could have come across as condescending.
The man who told me this was himself comforted to be reassured that his instincts hadn’t been wrong and he hadn’t let his friend down.
But if talking about problems isn’t necessary for comfort, then having sisters shouldn’t make men happier than having brothers. Yet the recent study — by Laura Padilla-Walker and her colleagues at Brigham Young University — is supported by others.
[H/t: Maeve Adams]
Values Added: Women and Islam
Let me start from the beginning. Vasilij had fallen ill, and I went to see him after visiting an exhibition in the Riga Art Space. The exhibition included paintings of the most diverse quality (including very poor ones), grouped by decade and forming what was quite literally a kind of labyrinth. The subterranean exhibition space was crammed with works, which had even prompted someone to write in the visitors' book that art isn't firewood (evidently meaning that paintings cannot be piled up like logs of firewood). I recalled this comment when, trying to step back in order to get a better view of a large painting, I tripped on the steps directly in front of the painting. And so I sat in Vasilij's living room with my sprained leg on a pillow and, while sipping tea, recounted my impressions. Our state of health prompted us to adopt a resignedly ironic view. At the beginning of the conversation I mentioned the guiding principle of the exhibition: to cast a look at the art of the Soviet period without ideological prejudices, something that may have accounted for the varying quality of the exhibited works. "I didn't know that ideological prejudice or the lack of it could serve as a criterion for quality in art", said Vasilij scornfully.
more from Janis Taurens at Eurozine here.
'I am a camera with its shutter open, quite passive, recording, not thinking.' Anyone familiar with the declaration by the narrator of Christopher Isherwood's most enduring work of fiction, Goodbye to Berlin (1939), will be surprised by how uncinematic, indeed incomprehensive, his diary entries can be. There's a lot of thinking, and nothing like the gestures towards abandoning subjectivity and self-consciousness that Isherwood crafted into his novels, not least the one masterpiece penned during the period covered by this second collection - A Single Man (1964). As in the first volume of diaries, published in 1996, Isherwood comes across as, by turns, rebarbative, loving, insecure, opinionated, self-critical, self-destructive, reticent, controlling and grand. His sing-song voice - caught in the 2007 documentary Chris and Don: A Love Story - is hard to square with these entries, which are rarely light-hearted. What they are, however, is a huge relief after this book's thousand-page predecessor.
more from Richard Canning at Literary Review here.
dust in the wind?
For most of us, one of the fundamental appeals of art is its exemplary capacity in the struggle against entropy — a cultural artifact is valued according to the degree of order it embodies — and the strength of its resistance to the ravages of time. The more intricately woven the tapestry or solidly constructed the pyramid, the more reassured we are that perhaps Kansas got it wrong with regard to all we are being dust in the wind. Of course, this being the case, modernist and postmodernist artists have made it their business to challenge this preconception on a number of fronts — by ostentatiously reintegrating the already discarded detritus of culture into new arrangements, as in the collages of Kurt Schwitters and the Combines of Robert Rauschenberg; by emphasizing the spontaneous improvisational gesture in order to destabilize the balance between order and chaos, as in the abstract expressionist drip paintings of Jackson Pollock; by creating deliberately ephemeral performances, happenings and installations whose only record is whatever documentation or relics happen to be left over, as in Chris Burden's often life-endangering actions of the early 1970s, whose collectible evidence consists of snapshots, Super-8 film, audiocassettes and a handful of used bullets. One of the pivotal figures in the development of this broad-spectrum aesthetic of decay was Alberto Burri (1915-1995), an Italian painter who first gained attention with his abstract compositions stitched together from scraps of surplus burlap sacks, then proceeded to explore the surface possibilities of shredded and burned plastic, welded plates of scrap metal, eroded acoustic tile and other quotidian industrial materials.
more from Doug Harvey at the LA Weekly here.