Critical Digressions: Literary Pugilists, Underground Men

Ladies and gentlemen, boys and girls,

After being attacked for a number of years by a new generation of literary critics – indeed, sucker-punched, phantom-punched, even body-slammed – “contemporary” (or “postmodern”) prose has hit back: in this month’s Harper’s, one of our favorite publications (less than $10.99 for an annual subscription), one Ben Marcus has donned his fighting gloves – which seem a little big for his hands, his pasty, bony frame – climbed into the ring, earnestly, knock-kneed, sweating from the hot lights, the camera flashes, the hoarse roar of the audience, the sense of anticipation, broken noses, blood…

Like Dostoevsky’s Underground Man, Ben announces, “I am writing this essay from…a hole…” He continues:

“…it’s my view that…the elitists are not supposedly demanding writers such as myself but rather those who caution the culture away from literary developments, who insist that the narrative achievements of the past be ossified, lacquered, and rehearsed by younger generations. In this climate…writers are encouraged to behave like cover bands, embellishing the oldies, maybe, while ensuring that buried in the song is an old familiar melody to make us smile in recognition, so that we may read more from memory than by active attention.”

Fighting words, ladies and gentlemen! We’d like to tell you that Ben fought a good fight; that he came out swinging; that he staged an upset; that an underdog took on the emerging consensus on contemporary prose, shaped by the likes of James Wood, Dale Peck and B.R. Myers, and, according to Ben, Jonathan Franzen, Tom Wolfe and Jonathan Yardley. But criticism is no fairy-tale world, and Ben is no hero. A welterweight in a heavyweight fight, he doesn’t have enough behind his punch.

The ambitiously titled Why Experimental Fiction Threatens to Destroy Publishing, Jonathan Franzen, and Life As We Know It begins with a peculiar digression on the anatomy of the brain, including a quick explanation of Heschl’s gyri, Broca’s area and Wernicke’s area (“think of Wernicke’s area as the reader’s muscle”), which may be novel, experimental, but has no business in a literary critique. Had Ben fused literary theory with neuroscience in a more serious, symbiotic, technically rigorous way, he might have achieved something. But just as Ben gets us thinking about the neural implications of literature, he gets wishy-washy, namby-pamby: “If we [writers] are successful, we touch or break readers’ hearts. But the heart cannot be trained to understand language…”

This introduction might have been overlooked had Ben knocked the reigning heavyweight champions down by the second or third round. But he doesn’t. He quarrels with the prevailing neo-realist sensibilities of critics – that is, “The notion that reality can be represented only through a certain kind of narrative attention” – and with those who argue against “literature as an art form, against the entire concept of artistic ambition.” He then has beef with Franzen: “Even while popular writing has quietly glided into the realm of the culturally elite, doling out its severe judgment of fiction that has not sold well, we have entered a time when book sales and artistic merit can be neatly equated without much of a fuss, Franzen has argued that complex writing, as practiced by…Joyce…Beckett and their descendants, is being forced upon readers by powerful cultural institutions…and that this less approachable literature…is doing serious damage to the commercial prospects for the literature industry.” Fair enough but not hard enough.

But though we want Ben to win this fight because we champion underdogs and such contrarian projects on principle, Ben is quite unable to summon the fierce intelligence and evangelical zeal of, say, James Wood or the flamboyance and shock value of Dale Peck. He may pretend to be the Underground Man but he’s not “a sick man…a spiteful man…” In 2001, however, B.R. Myers, more non-entity than underdog, managed the sort of upset Ben aspires to. Writing in the Atlantic, he opened his thorough, articulate attack thus:

“Nothing gives me the feeling of having been born several decades too late quite like the modern ‘literary’ best seller. Give me a time-tested masterpiece or what critics patronizingly call a fun read – Sister Carrie or just plain Carrie. Give me anything, in fact, as long as it doesn’t have a recent prize jury’s seal of approval on the front and a clutch of raves on the back. In the bookstore I’ll sometimes sample what all the fuss is about, but one glance at the affected prose – “furious dabs of tulips stuttering,” say, or “in the dark before the day yet was” – and I’m hightailing it to the friendly spines of the Penguin Classics.”

A Reader’s Manifesto: An Attack on the Growing Pretentiousness in American Literary Prose caused a commotion as the Wall Street Journal, New Yorker, Harper’s, NYT, Washington Post and New York Review of Books joined the fray, a real battle royale. And Myers came out on top: presently, he’s a senior editor at the Atlantic. When you hit hard, it doesn’t really matter what you say. So what’s Myers’s beef? “What we are getting today is a remarkably crude form of affectation: a prose so repetitive, so elementary in syntax, and so numbing in its overuse of wordplay that it often demands less concentration than the average ‘genre’ novel.” And what is his methodology? He proceeds to categorize contemporary prose in five broad groups – “evocative,” “muscular,” “edgy,” “spare” and “generic ‘literary prose’” – citing weak passages from the writers whom he finds representative of each group: Proulx (The Shipping News), McCarthy (All the Pretty Horses), DeLillo (White Noise), Auster (City of Glass) and Guterson (Snow Falling on Cedars). Manifestly, Myers packs a formidable punch.

Of course, even back in 2001, Myers may have been a non-entity but he was no underdog. Literary fashion had been changing well before him, with Wood in the pages of the Guardian, and subsequently in the New Republic, where Wood was joined by Dale “The Hatchet Man” Peck. Peck, you may remember, famously proclaimed, “I will say it once and for all, straight out: it all went wrong with James Joyce…Ulysses is nothing but a hoax upon literature.” Like Tyson, Peck writes, “Sometimes even I am overwhelmed by the extent of the revaluation I’m calling for, the sheer f***ing presumptuousness of it.” In one critique, in one sentence in fact, Peck excises “most of Joyce, half of Faulkner and Nabokov, nearly all of Gaddis, Pynchon and DeLillo, not to mention the contemporary heirs.” This assertion makes for interesting if idle exercises: we mull, for example, which half of The Sound and the Fury Peck would excise if given the opportunity – the first two books, of course, Benjy’s and Quentin’s – and what effect his reductive, retrograde editing would have on the novel as a whole. Peck, like Mike Tyson before him, bites ears off, and often punches below the belt, smack in the crotch. Tyson once said, “I wish that you guys had children so I could kick them in the f***ing head or stomp on their testicles so you could feel my pain because that’s the pain I have waking up every day.”

The New York Review of Books noted that “Like his colleague at the New Republic, the estimable and excellent James Wood, Peck seems to want more novels like the great [19th] century social novels: serious, impassioned, fat.” Were we to step into the ring, brandishing our shiny brown muscles, we would simply but forcefully argue that the world, that civilization, and literature with it, has moved a hundred years forward since the 19th century. Looking fondly back towards realism is quite literally retrograde, like those other Underground Men, Wahhabi Islamists urging Muslims to return to the 7th century. The novel, like these critics and the critical canon (which includes the Russian Formalists, the New Critics, the Structuralists, the Post-Structuralists, whatever), is grounded in a certain context. It is a palimpsest, distilling, processing the anxieties, the sensibilities, the diction, the colloquial, the news, the popular culture of a particular time and place and people.

Dreiser and Dos Passos, for example – two different writers, the former considered traditional, the latter experimental – were unable to write novels that are relevant today except as history, as part of the evolution of the modern novel. On the other hand, and off the top of our head, we just finished Roth’s Goodbye, Columbus – his first novel – which features a Jewish protagonist, a class divide, a sectarian divide, specific references and allusions to the fifties in America – including, incidentally, the title itself – and we were charmed by the sweet, straightforward adolescent love story (and the voice). Unlike Manhattan Transfer, Goodbye, Columbus remains relevant. Some novels transcend their cultural and temporal trappings.

We dig Roth for different reasons than, say, Melville, Dostoevsky, Dickens. We dig 20th century writers for different reasons than their antecedents: the lyrical and frenetic Marquez and Rushdie, the postcolonial and serious Naipaul and Coetzee, the very contemporary Franzen and Wallace.

Sure, from Dostoevsky to Wallace, the conventions of storytelling have changed and prose has become more self-conscious, but don’t let the Underground Men lecture you that change is good or bad; change is. And we’ll tell you this much: anybody advocating cutting Nabokov down to size should be paraded naked in the ring, weak chest, hairy buttocks, spindly legs exposed, wearing his own novel as a fig leaf. Sure, some contemporary prose has become gimmicky, adjective-laden, rife with metaphor (which, in a way, is arguably Nabokov’s legacy); and sure, silly alliteration needs to be caught, condemned. Myers will rightly beat you up for it. That’s his job, and Wood’s and Peck’s. Ben nobly got into the ring but he needs to train harder if he’s going to go twelve rounds with them. Somebody, however, needs to hit back, to keep it real.

As Eddie “Scrap-Iron” Dupris once said (somewhat heavy-handedly), “Boxing is an unnatural act…everything in boxing is backwards: sometimes the best way to deliver a punch is to step back…But step too far and you ain’t fighting at all.” We’re not entirely sure if this is relevant but it sure sounds good.

Other Critical Digressions:

Gangbanging and Notions of the Self

Dispatch from Cambridge (or Notes on Deconstructing Chicken)

The Three-Step Program for Historical Inquiry

The Naipaulian Imperative and the Phenomenon of the Post-National

The Media Generation and Nazia Hassan

Dispatch from Karachi

Dispatches: Where I’m Coming From

When I was little, I used to go to JFK airport a lot. I would pass through on my way to and from Pakistan, or we would drive the four hundred miles from Buffalo to pick up my grandparents or various siblings, aunts, uncles, cousins, friends of the family.  Among Pakistanis, letting a relation arrive unmet at the airport is not done.  It’s an expression of filial duty and the strength of extended family bonds to drive for seven hours personally to escort your parents or grandparents or older sister from New York, the great gateway, back to your house or some other relative’s house.  So I ended up there frequently, and I loved it.

Back then, as now, Pakistan International Airlines flights came into and departed from Terminal 4, known as the International Arrivals Building.  However, it was an older IAB, now destroyed, that I knew and loved.  When JFK was first built, replacing Idlewild as New York City’s, and thus the country’s, primary international airport, the idea was to let each major European and American airline build its own terminal.  Thus a sort of competition occurred in which British Airways, American Airlines, etc. hired architects and took it upon themselves to demonstrate the modernist flair of their brand identities by the design of their flagship terminals.  The most famous of these, of course, was Eero Saarinen’s TWA terminal, a beautiful concrete structure reminiscent of an insect’s compound eye, with a cantilevered overhang that seems to hover in flight.  (That terminal has now been taken over by JetBlue Airways, whose management already painted the striking raw concrete of Terminal 6 an ugly glossy white, but I digress.)

Terminal 4, however, was reserved for the many airlines of the world that would be using JFK but weren’t large enough to need an entire building.  These carriers were mostly from Asia, South America, and Africa: Egypt Air, PIA, Varig, etc.  For them, the designers of the airport hired Skidmore, Owings and Merrill, the great mid-century American corporate-modernist firm, to build the International Arrivals Building.  Utilitarian in layout, the IAB used two long two-story corridors to house the ticket and check-in counters of the individual airlines, with a rectangular central structure through which all arriving passengers came.  Instead of competing with the perpetually shifting tableaux of humanity that resided within it, the architecture faded into the background, its square enormity framing the action humbly, like the simple black frame around a gelatin silver print.

The great hall, square and with a balcony overlooking its main space, was a nonstop riot of second- and third-world peoples hugging, jostling, exclaiming, blearily treading, and generally searching for whoever was going to shepherd them to the subway, the taxi, the Dodge, the new home, the old home.  I remember very clearly arriving as a ten-year-old from a summer spent with my older brother in Karachi and Islamabad, getting lost in a crowd of white-robed Africans, unable to see anything above these tall, slender people.  For a while I listened to their conversations (perhaps in Swahili? French?), and then, from behind me, my younger sister and cousin, waiting to receive us, karate-chopped me on the back by way of welcome, excitedly telling me all about our new parakeet, E.T.  Later, when I would pass through the terminal on my way to London (I used to fly Air India there, because it was cheap, and was the last airline to allow smoking), I would stand on the second floor balcony and watch the scenes of hello taking place on the floor below.  I had my own such scenes too, picking out my girlfriend’s head amongst the crowds from above, and then hurriedly descending into the fray, losing her all over again, then being found by her while looking the other way.  Abbas and I would sometimes wait for people in the bar on the eastern side, where they had a pool table and an encouragingly squalid atmosphere, like the cocktail bar next to the Port Authority’s bowling alley.  I went back before it was demolished, armed with a super-wide-angle lens, to document the place, but they wouldn’t let me in.

Recent writing about airports, by people like Pico Iyer, celebrates their symbolic relation to our postmodern condition, the Rushdie-like sense of everything’s connection to everything else.  Everywhere infects here, and here leaks into everywhere, in the form of these dusty traveler-viruses, and the airport is the one place where you see the anomie caused by the meaninglessness of it all, or, alternately, a kind of rampant giddiness in its blurring of identities.  I don’t think that’s quite right.  Or at least, I never felt that way there.  The scene at JFK always reminded me of a touching collectivism, a faith in extended ties and a certain dogged kind of loyalty that was so different from the atomized individualism of my young suburban life.  It was a mingling.  And yet it wasn’t the same thing as the potentially suffocating extended family life in Karachi, where privacy is unimaginable.  Terminal 4 was melancholic but heartening all at once, and, most importantly, rather than symbolizing placelessness or globalism, always seemed to me very specifically New York, and particularly its role as the liminal space between America and the world.

The new Terminal 4, by contrast, is very much a glib, postmodern, placeless place.  A bland wing-shaped immensity, roughly isomorphic with Stansted or Dulles, it even has signage designed by the Dutch-based design team that did Heathrow and Schiphol’s yellow markers, the same typeface and everything.  There’s no particular sense of where you are in the world.  Worst of all, despite having probably quadruple the square footage of the old place, in the new terminal passengers arrive into a smallish area on the ground floor, with no sense that the architects ever considered it the place where passengers first gaze at New York and the U.S.  Even those waiting see only half the arrivants, as they have to choose one of two forks as they exit, the only notable feature in the linear progress out of the airport being a large yellow question mark erected above the information desk.  That question mark might as well stand for the sensation the space produces: where am I?  Without grandeur, you are just arriving at an anonymous node of global circulation, reminiscent of a luxurious version of Rem Koolhaas’s ‘junkspace.’

The old International Arrivals Building’s architecture was very New York.  The large cubic space, with its high ceiling, framed the people like the grid of the city, highlighting their colors by its comparative drabness and lack of architectural hubris.  Not that it was badly designed – to the contrary, it was a solid and imposing structure, and looked like no other place.  You knew it was JFK.  It also provided a suitably grand but functional setting for one’s arrival to the country, the chaotic bricolage it contained becoming a metaphor for the city’s true identity: the home of those from elsewhere.  By providing a balcony from which to view the secular pilgrimage of disembarkation, immigration and emigration, the airport dramatized and made visible the social.  To be there was to be a part of the social, to witness and be a part of a scene.  For me, growing up without feeling native to either Buffalo or Pakistan, that scene was like home: more than anywhere else, where I’m from.

Dispatches:

Optimism of the Will
Everything You Always Wanted to Know About Vince Vaughan…
The Other Sweet Science
Rain in November
Disaster!
On Ethnic Food and People of Color
Aesthetics of Impermanence

Selected Minor Works: Early Modern Primitives

Justin E. H. Smith

I have recently come across a delightfully obscure 1658 treatise by the very pious John Bulwer, entitled Anthropometamorphosis: or, the Artificial Changling. This may very well be the first study in Western history of piercing, tattooing, scarification, and other forms of bodily modification. It is thus a distant ancestor of such contemporary classics as the 1989 RE/Search volume, Modern Primitives.

But if the Voice Literary Supplement once praised RE/Search for its dispassionateness, today a hallmark of respectable ethnography, Bulwer’s science is at once a moral crusade. In each chapter, Bulwer bemoans a different deplorable practice, including “Nationall monstrosities appearing in the Neck,” “Strange inventive contradictions against Nature, practically maintained by diverse Nations, in the ordering of their Privie parts,” and (my favorite) “Pap-Fashions.”

If Bulwer hates nipple rings and dick bars, he is no less concerned about the now rather innocent habit of shaving. He rails in one chapter against “Beard haters, or the opinion and practice of diverse Nations, concerning the naturall ensigne of Manhood, appearing about the mouth.” For him any bodily modification is but a “Cruell and fantasticall invention of men, practised… in a supposed way of bravery… to alter and deforme the Humane Fabrique.”

Bulwer believes that morally degenerate practices can over time lead to actual physical degeneration within a human population. Thus, for him, phenotypic variation in the species is a consequence of cultural bad habits, rather than teleologically driven progress from lower to higher forms, let alone adaptation by way of natural selection. The ugliness of non-Europeans may be attributed to the rottenness of their souls and consequent savage lifestyles. Indian pinheads and Chinese blockheads, whose skulls are sculpted from birth by malevolent adults, are cited as cases of degeneration in action.

200 years before Darwin, then, there was widespread acceptance of the idea that species could change over time. But for moralists such as Bulwer, change could only ever be change for the worse. In this connection, Bulwer denounces the view of a (regrettably unnamed) libertine philosopher that human beings evolved from other primates: “[I]n discourse,” he writes, “I have heard to fall, somewhat in earnest, from the mouth of a Philosopher that man was a meer Artificial creature, and was at first but a kind of Ape or Baboon, who through his industry by degrees in time had improved his Figure & his Reason up to the perfection of man.”

Bulwer believes that the ‘Philosopher’s’ opinion constitutes a symptom of the moral decline of the modern period. For, he thinks, if mutation of humanity over time can occur, it will not, as the Philosopher thinks, take the character of an ascent from beast to man, but rather the reverse, a descent into ape-likeness: “But by this new History of abused Nature it will appeare a sad truth, that mans indeavours have run so farr from raising himselfe above the pitch of his Originall endowments, that he is much fallen below himselfe; and in many parts of the world is practically degenerated into the similitude of a Beast.”

Evolutionary thinking, then, opens up the possibility not just of progress out of animality, but of degeneration into it, and this was a possibility that the pious, such as Bulwer, were beginning to fear.

If we move forward a few hundred years, we find that the human species still has technology that beats the reed dipped into the anthole, and that we still exercise our freedom to mate outside of estrus. Indeed, not much of anything has changed since the 17th century, either through degeneration or evolutionary progress. One thing that has remained entirely the same is the art of moralistic ranting: we find that, now as then, precisely those who are most concerned about the moral stain of body piercing and tattoos, who are most active in the movement to make visible thongs in suburban Virginia malls a misdemeanor, are the same people who would have us believe that humans were instantaneously and supernaturally created with no kinship relation to other animal species.

It is worth reflecting on why these two crusades, which prima facie have nothing in common, have proven such a durable pair throughout the centuries. I suspect that human thought is constrained (as a result of the way our minds evolved) to move dialectically between two opposite conceptions of animal kinds: that of the book of Genesis on the one hand, positing eternally fixed and rigid kinds with no overlap, and that of Ovid’s Metamorphoses on the other. In spite of the relatively recent ascent of evolutionism to accepted scientific orthodoxy, there has always been available a conception of species as fluid and dynamic. This conception easily captures the imaginations of social progressives and utopians, that is, of those who believe that change for the better is possible and indeed desirable. The numerous monuments to Darwin throughout the Soviet Union (which I hope have not been scrapped along with those to Lenin) were once a testament to this.

Social conservatives on the other hand see fixity as desirable, and tend to conceive of change in terms of degeneration. A bestiary of eternal, non-overlapping animal species would provide for them a paradigm of stability that could easily be carried over from the natural to the social world, while the loss of this fixed taxonomy of natural kinds would seem equally to threaten the social stasis the conservative seeks.

The prospect of change in species over time, then – including the human species – will be a more useful way of conceptualizing the natural world in times of heady social upheaval; in political climates such as the current one, it is not surprising to see public figures shying away from the chaotic instability of the Metamorphoses in favor of the clear boundaries of the Old Testament.

I am not saying that evolution is just ideology. I believe it is true. I believe that creationism, in turn, is false, and that it is an ideology. And precisely because it is one, it is a waste of time to do intellectual battle with creationists as though they had a respectable scientific theory. Instead, what we should focus on is the rather remarkable way in which folk cosmology – whether that of the Azande, the ancient Hebrews, or Pat Buchanan – may be seen to embody social values, and indeed may be read as an expression on a grand scale of rather small human concerns.

The small human concerns at the heart of the creationist movement are really just these: that everything is going to hell, that the kids don’t listen to their folks anymore, that those low-cut jeans show far too much. Creationism is but the folk cosmology of a frightened tribe. This is also an ancient tribe, counting the authors of Genesis, Bulwer, and Buchanan among its members, and one that need be shown no tolerance by those of us who recognize that change reigns supreme in nature, and that fairy tales are powerless to stop it.

Monday Musing: Be the New Kinsey

Last week my wife and I saw the biopic Kinsey in which Liam Neeson plays the entomologist turned pioneering sex researcher, Dr. Alfred Charles Kinsey. It’s a pretty good movie. Rent the DVD. Kinsey spent the early part of his career as a zoologist studying gall wasps, on which he became the world’s foremost expert. He collected over one million different specimens and published two books on the subject. Then, at some point in the early 30s, while pondering the variety of sexual behavior in the wasps he was studying, he started wondering about the range and variety of sexual behavior in humans. When he looked into it, he was dismayed by the prevalent scientific ignorance about even very basic physiological sexual function in humans, much less complex sexual behaviors. Remember, this was a time when even college-age men and women often had very little information about sex.

But in this vacuum of knowledge where the angels of science feared to tread, as usual, the ever-confident fools of myth and dogma had rushed in with loud proclamations such as: masturbation causes blindness; oral sex leads to infertility; the loss of one ounce of precious, life-giving semen is equivalent to the (rather more obviously enervating) loss of 40 ounces of blood; and so on and on. We’ve all heard these sorts of things. In addition there was very little information about the more real dangers and risks of sexual behavior, such as venereal disease transmission. When Kinsey taught a “marriage” class at Indiana University in Bloomington, a type of early sex-ed, his students often asked him questions that he could not answer because the answers simply were not known.

Embarrassment at this state of affairs prompted Kinsey to action. As an accomplished ethologist (one who studies animal behavior in the natural environment) he realized that in addition to studying the physiological sexual equipment of humans and the mechanics of sexual response, it was important to compile data on human sexual behavior “in the wild”, and he undertook the prodigious task of conducting detailed surveys of men and women across the 48 states of America to compile statistics about their sexual behavior. He didn’t reach his goal of interviewing more than a hundred thousand people, but he did make it to the tens of thousands. I cannot go into the details of his methodology here but it has been fairly well defended. In 1948, Kinsey published the first of two books based on his exhaustive sex research that would eventually alter the course of this country in significant ways: Sexual Behavior in the Human Male. Five years later he published Sexual Behavior in the Human Female.

What Kinsey found is well-known now but was absolutely scandalous at the time: the prevalence of homosexuality and bisexuality; the ubiquity of masturbation, especially in males; the common practice (at least much more than anyone had previously thought) of oral sex, anal sex, premarital sex, infidelity, and other forms of “deviant” behavior. While Kinsey simply reported the raw data, never advocating social policy or drawing moral conclusions, the open airing of these previously taboo subjects had far-reaching effects, not only contributing significantly to the sexual revolution of the 60s, but, importantly, resulting in the eventual massive decriminalization of various sexual practices such as (but not limited to) sodomy and oral sex across the states. There were other results as well: it took until 1973 for the American Psychiatric Association to remove homosexuality from its list of mental illnesses, but it happened at least partly because of Kinsey’s work. Most significant of all, however, Kinsey’s reports went a long way toward lifting the clouds of ignorance and fear that had long whelmed sex.

Now it occurred to me, after I saw the movie, that there is an area of human practice (and, yes, it is more-or-less universal) which is today covered in the same clouds of ignorance and fear; which has distorted the well-intentioned aims of the criminal justice system and is filling up our jails; and which is dominated by myth and dogma in much the same way sex was before Kinsey had the courage to defy the taboos surrounding it and clear that fog with his bright beam of information: it is drug use.

What is drug use? I shall define it here for my purposes as the consumption (whether by ingestion, inhalation, injection, absorption or any other means) of a substance with little or no nutritional benefit, simply out of a desire for its effect on the nervous system. This then includes substances ranging from caffeine and nicotine, to alcohol, marijuana, LSD, PCP, ecstasy, cocaine, crystal meth, heroin, and the thousands of other substances that people use to enhance, or at least alter, their subjective experience of the world. And I won’t even get into prescription drug (ab)use from Valium to Viagra. Extremely large portions of all the cultures that I know of use at least some of these and other substances. Of course, most people enjoying a friendly chat over tea are not explicitly aware that they are taking a drug, but they are, and it makes them feel better, more energetic, and more awake. Just as some sexual practices are relatively harmless, while others pose real dangers to those who practice them, certain drugs and related practices are more dangerous than others.

Here’s the problem: there is very little useful information available about drugs. The reason is that there is a reluctance bordering on taboo on the part of government and non-government agencies alike to actually spell out the risks of taking drugs. In the case of illegal or other “bad” drugs, there is an absolute refusal to accept the fact that a large part of the population uses and finds pleasure in these substances, and an attempt to marginalize all drug users as criminals, addicts, and degenerates; just as in Kinsey’s time, absolute abstinence is at present the only acceptable message for the public. “Just say no!” That’s it. Where there should be informed messages about the exact risks of various substances, there is fear-mongering instead: the “this is your brain on drugs” ad on TV showing a frying egg, for instance. The implication is that, just as an egg cannot be unfried, once you have used drugs (which drugs? how much? for how long? – these questions are not to be asked, and cannot be answered), your brain is permanently fried, whatever that means. After all, a fried brain is just a metaphor; it does not say anything scientific about exactly what sort of damage may be done to one’s brain by how much of what drug over what period of time. “Cocaine Kills!” is a common billboard ad. Have you noticed that Kate Moss hasn’t died? Why? I happen to know a bunch of Wall Street types who have been snorting lines off little mirrors in the bathrooms of fancy downtown clubs pretty much every weekend for at least a decade, and so, probably, do you. None of the ones I know have died so far. I also know a man who tried cocaine for the first time and ended up in an emergency room. So what is the real risk?

The problem with just telling people “Cocaine Kills!” and nothing more is that because they may see many people doing cocaine who are not dropping like flies, they completely dismiss the message as crying wolf. Or, they may think, “Yeah, sure cocaine kills, but so does skiing. Think Sonny Bono. Think Michael Kennedy. Just say no to skiing? I don’t think so. The pleasures outweigh the risks for me.” Why not tell them something useful about what the real statistical risks are? What percentage of the people who do it die from cocaine? What are the chances of dying for every ten times, or a hundred, or a thousand that you take cocaine? Even where there is an almost religious zeal to curb smoking, the information about the actual risks is endlessly confusing. I have repeatedly read that nine out of ten people who have lung cancer were smokers, but this tells me nothing about what risk I am taking of getting lung cancer by smoking. It could be that only a small percentage of the population gets lung cancer, and of those, the smokers are at disproportionately higher risk. There are hugely inflated figures of the number of deaths caused by smoking which are routinely thrown around. I have even seen a poster claiming that 3 million people are killed every year in the U.S. by smoking. That’s more than the total number of deaths every year in the U.S.! What I would really like to know is: on average, how much am I shortening my life with every pack-year of cigarettes I smoke? I just looked at various websites such as those of the American Heart Association, the Centers for Disease Control, the World Health Organization, and countless others, and I cannot find this simple information. Why? Just as Kinsey could not answer many of his students’ questions about sex, if a young person today asks just how risky it is to use ecstasy at a rave, for example, we have nothing to say.
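To see why “nine out of ten lung-cancer patients were smokers” by itself tells you almost nothing about your own risk, here is a minimal sketch of the missing arithmetic. The 90 percent figure is the one quoted above; the smoking rate and the lifetime cancer rate below are purely illustrative assumptions, not real statistics.

```python
# Base-rate sketch: turning P(smoker | lung cancer) into P(lung cancer | smoker)
# via Bayes' rule. All inputs except the 90% figure are ASSUMED for illustration.

p_smoker = 0.25                 # assumed fraction of the population that smokes
p_cancer = 0.07                 # assumed lifetime probability of lung cancer
p_smoker_given_cancer = 0.90    # "nine out of ten lung-cancer patients were smokers"

# Bayes' rule: P(cancer | smoker) = P(smoker | cancer) * P(cancer) / P(smoker)
p_cancer_given_smoker = p_smoker_given_cancer * p_cancer / p_smoker
p_cancer_given_nonsmoker = (1 - p_smoker_given_cancer) * p_cancer / (1 - p_smoker)

print(f"P(lung cancer | smoker)     ~ {p_cancer_given_smoker:.1%}")      # ~25%
print(f"P(lung cancer | non-smoker) ~ {p_cancer_given_nonsmoker:.1%}")   # ~0.9%
print(f"relative risk               ~ {p_cancer_given_smoker / p_cancer_given_nonsmoker:.0f}x")
```

With these made-up inputs the individual risk works out to roughly 25 percent for smokers versus under 1 percent for non-smokers; change the assumed base rates and the answer changes, which is exactly the information the billboard slogans leave out.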

Another problem with this “just say no” approach is that just as the total-abstinence approach masks the differences in health risk between sexual practices (having frequent unprotected sex with prostitutes is obviously not the same risk as having protected sex with a regular partner) because they are all equally discouraged, this approach also masks the differences between the various practices of using drugs. Smoking a joint is not the same as mainlining heroin, but there is no room for even such crude distinctions in this simplified message. There is only the stark line drawn between legal and illegal drugs. Go ahead and have your fourth martini, but, hey, that hash brownie will kill you!

The same unrealistic refusal to acknowledge that drug use is very common (yes, there are always a few polls of high school sophomores and college freshmen, but nothing serious and comprehensive) across all strata of society results in a distorted blanket approach to all drug use, and the same ignorant fear-mongering that used to exist about sex. The first thing to do is to compile, like Kinsey, detailed information on all drug use (or at least the top twenty most used drugs) by employing the best polling techniques and statistical methods we have. Let’s find out who is using what drugs, legal or illegal. Break it down by age, gender, race, income, geographical location, education, and every demographic category you can think of. Ask how often the drug is taken, how much, in what situations. Ask why the drug is taken. What are the pleasures? Poll emergency rooms. Research the physiological effects of drugs on the human body. Write a very fat book called Drug Taking Behavior in the American Human. I am not advocating any policy at all. I am just advocating replacing ignorance and confusion with irrefutable facts. New directions will suggest themselves once this is done. Maybe, just as people who engage in oral sex are no longer seen as perverts and degenerates, one day Bill Clintons won’t have to say “But I didn’t inhale,” and George W. Bushes won’t have to lie about their cocaine use. On the other hand, maybe we will decide as a society that Muslims were right all along, and ban alcohol. Go ahead, be the new Kinsey.

Have a good week!

My other recent Monday Musings:
General Relativity, Very Plainly
Regarding Regret
Three Dreams, Three Athletes
Rocket Man
Francis Crick’s Beautiful Mistake
The Man With Qualities
Special Relativity Turns 100
Vladimir Nabokov, Lepidopterist
Stevinus, Galileo, and Thought Experiments
Cake Theory and Sri Lanka’s President

Science only adds to our appreciation for poetic beauty and experiences of emotional depth

From Scientific American:

Does a scientific explanation for any given phenomenon diminish its beauty or its ability to inspire poetry and emotional experiences? I think not. Science and aesthetics are complementary, not conflicting; additive, not detractive. I am nearly moved to tears, for example, when I observe through my small telescope the fuzzy little patch of light that is the Andromeda galaxy. It is not just because it is lovely, but because I also understand that the photons of light landing on my retina left Andromeda 2.9 million years ago, when our ancestors were tiny-brained hominids.

In Charles Darwin’s “M Notebook,” in which he began outlining his theory of evolution, he penned this musing: “He who understands baboon would do more towards metaphysics than Locke.” Science now reveals that love is addictive, trust is gratifying and cooperation feels good. Evolution produced this reward system because it increased the survival of members of our social primate species. He who understands Darwin would do more toward political philosophy than Jefferson.

More here.

Why children shouldn’t have the world at their fingertips

From Orion:

There is a profound difference between learning from the world and learning about it. Any young reader can find a surfeit of information about worms on the Internet. But the computer can only teach the student about worms, and only through abstract symbols—images and text cast on a two-dimensional screen. Contrast that with the way children come to know worms by hands-on experience—by digging in the soil, watching the worm retreat into its hole, and of course feeling it wiggle in the hand. There is the delight of discovery, the dirt under the fingernails, an initial squeamishness followed by a sense of pride at overcoming it. This is what can infuse knowledge with reverence, taking it beyond simple ingestion and manipulation of symbols.

At the heart of a child’s relationship with technology is a paradox—that the more external power children have at their disposal, the more difficult it will be for them to develop the inner capacities to use that power wisely. Once educators, parents, and policymakers understand this phenomenon, perhaps education will begin to emphasize the development of human beings living in community, and not just technical virtuosity. I am convinced that this will necessarily involve unplugging the learning environment long enough to encourage children to discover who they are and what kind of world they must live in. That, in turn, will allow them to participate more wisely in using external tools to shape, and at times leave unshaped, the world in which we all must live.

More here.

Turning the Pages™: 14 Great Books

From Slashdot:

The British Library has made available 14 great books on its website. One of them is a 1508 notebook by Leonardo Da Vinci containing short treatises, notes and drawings on a wide range of subjects, from mechanics to the moon. The site allows you to view the original manuscript written in Leonardo’s own handwriting.

Slashdot commentary:

Will I have to flip my display to read Leonardo Da Vinci’s backwards handwriting?

…they actually have a “mirror” button to flip it over for you!

Among other works included on the site are Jane Austen’s early writing and the original Alice in Wonderland manuscript, written and illustrated by Lewis Carroll.

More here.

Scientists dangle bait for screenwriters

From Nature:

A strange event took place at a Manhattan theatre this week. The packed audience was normal for this lively venue, but the stars of the stage were not: at the end of the show, it was Nobel laureates rather than actors who obliged with autographs. The Sloan Film Summit, coordinated by the philanthropic Alfred P. Sloan Foundation and the Tribeca Film Institute, aims to bring scientists and film-makers together to make more realistic and entertaining stories about science. On 6 October, many of the summit guests assembled for panel discussions on science as entertainment, among them James Watson, co-discoverer of the structure of DNA.

Watson suggested the crowd should take a closer look at stomach ulcers. Barry Marshall, he went on to explain, is an Australian pathologist who won half of this year’s Nobel Prize in Medicine for helping to prove that bacteria are the culprits behind ulcers. He would make a fantastic film hero, said Watson. At one point Marshall went so far as to swallow a solution containing the bacteria Helicobacter pylori to show a sceptical medical world that the microbes, and not stress, caused the stomach condition.

More here.

A Less Fashionable War

From Newtopia Magazine:

Malcolm X once said, “Any person who claims to have deep feeling for other human beings should think a long, long time before he votes to have other men kept behind bars—caged. I am not saying there shouldn’t be prisons, but there shouldn’t be bars. Behind bars, a man never reforms.” On Friday September 9th I became one of the roughly 25,000 people released from an Illinois prison this year—600,000 nationally—after completing only 10 weeks of a one year sentence due to extreme overcrowding. My crime was victimless, simple possession of a controlled substance, specifically a small amount of marijuana and MDMA.

But as the rare upper-middle class educated White American in prison, I found myself in a truly alien, self-perpetuating world of crushing poverty and ignorance, violent dehumanization, institutionalized racism, and an entire sub-culture of recidivists, some of whom had done nine and ten stints, many dating back to the Seventies. Most used prison as a form of criminal networking knowing full well they would be left to fend for themselves when released. We were told on many occasions that an inmate was worth more inside prison than back in society. Considering it costs an average of $37,000 a year to incarcerate offenders, and the average income for Black Americans is $24,000, and only $8,000-12,000 for poor Blacks, one can easily see their point.

More here.

God Bless You, Mr. Vonnegut

A.O. Scott in the New York Times:

His liberalism grows out of some principles that can only be called conservative, like the belief in community and extended family that has become one of the big themes of his later work. He remains unimpressed by technology or the other trappings of progress, and he remains one of America’s leading critics of evolution – not of the theory, mind you, but of the practice, which has left us far too clever and vain for our own good.

It will hardly come as a shock that Vonnegut – who identifies himself as “a lifelong Northern Democrat in the Franklin Delano Roosevelt tradition, a friend of the working stiffs,” and therefore unapologetically “sappy” – has a low opinion of the current American administration and its policies, and “Man Without a Country” has already joined the ranks of the Bush-bashing best sellers that compete with liberal-bashing best sellers for dominance in our overheated climate of opinion. But Vonnegut is much funnier, and much crabbier, than the cable-bred polemicists, and smarter too. At times, he may slide toward Andy Rooneyesque or Grandpa Simpsonesque crotchetiness, but mostly, like his literary ancestor Mark Twain, his crankiness is good-humored and sharp-witted, and aimed at well-defended soft spots of hypocrisy and arrogance.

On Nov. 11 he will turn 83, and since he has no expectation of a heavenly perch from which to look down and eavesdrop on his friends, it is best that we appreciate him while he’s still around.

More here.

Archimedes Death Ray: Idea Feasibility Testing

From an MIT class website:

Ancient Greek and Roman historians recorded that during the siege of Syracuse in 212 BC, Archimedes (a notably smart person) constructed a burning glass to set the Roman warships, anchored within bow and arrow range, afire. The story has been much debated and oft dismissed as myth.

TV’s MythBusters were not able to replicate the feat and “busted” the myth.

Intrigued by the idea and an intuitive belief that it could work, MIT’s 2.009ers decided to apply the early product development ‘sketch or soft modeling’ process to the problem.

Our goal was not to make a decision on the myth—we just wanted to assess if it was at least possible, and have some fun in the process. Jumping ahead, you can see the result… but let’s start at the beginning of the process.

(btw, the boat is made of 1″ thick red oak and this is a photoshop-free zone!)

When a new idea pops into one’s head it’s a good idea to do a quick feasibility estimate. The course instructor’s quick “back of the envelope” calculation (done while pondering the MythBuster result) indicated that it could be possible (assuming that the wood is not reflective).
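For a sense of what a quick check like that involves, here is a rough sketch of the arithmetic. The solar irradiance, mirror size, losses and ignition threshold below are our own order-of-magnitude assumptions, not figures from the 2.009 course.

```python
# Back-of-the-envelope sketch of the death-ray feasibility question.
# Every number here is an ASSUMED order-of-magnitude value for illustration only.

solar_irradiance = 1000.0   # W/m^2, direct sunlight on a clear day (assumed)
mirror_area = 1.0           # m^2, area of one flat polished mirror (assumed)
efficiency = 0.6            # reflection and aiming losses per mirror (assumed)
spot_area = 1.0             # m^2, size of the focused patch on the hull (assumed)
ignition_flux = 30_000.0    # W/m^2, rough flux needed to ignite dry wood (assumed)

# Each mirror contributes roughly one "sun" of flux, minus losses, to the spot.
flux_per_mirror = solar_irradiance * mirror_area * efficiency / spot_area
mirrors_needed = ignition_flux / flux_per_mirror

print(f"flux per mirror ~ {flux_per_mirror:.0f} W/m^2")
print(f"mirrors needed  ~ {mirrors_needed:.0f}")   # ~50 with these assumptions
```

With these assumptions you land at something like fifty mirror-bearers holding a tight focus on one spot, which is why a quick estimate can conclude “not obviously impossible” even when a live test proves much harder.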

When the 2.009 class was given a 5 minute challenge to assess technical feasibility, about 95% (of 80 students) deemed the death ray infeasible. In a democracy this would probably doom the idea. However, since ‘the bosses’ thought it might work, further exploration and sketch model tests to learn more were merited.

More here.

Great White Shark Travels Shocking Distance

Larry O’Hanlon in Discovery News:

Why did the great white shark cross the Indian Ocean? It sounds like a chicken joke, but it’s a genuine puzzler among shark scientists after the announcement that a female great white shark was tracked crossing and then re-crossing the Indian Ocean.

What’s more, the shark, named Nicole in honor of the shark-admiring actress Nicole Kidman, made its 6,900-mile round trip in just nine months, which is faster than any known marine traveler, said Nicole’s discoverers.

That’s some pretty efficient traveling for a shark traditionally thought of as a lifetime coastal local.

More here.

Harold Pinter has devised a new radio play for his 75th birthday

Alice Jones in The Independent:

As he approaches his 75th birthday on Monday, Harold Pinter appears frail and gaunt, leaning heavily on a walking stick decorated, somewhat incongruously, with sparkly stickers. When we meet to celebrate the unveiling of his latest work, Voices, he tells me: “I’m exhausted, I’m at the end of my tether,” and admits that he is “not writing anything much at the moment”. His formerly stentorian stage voice is notably weakened – a consequence of his battle with cancer of the oesophagus over the last three years. But in Voices, a 29-minute musical-dramatic collaboration with the composer James Clarke, his creative voice rings out as powerfully as ever.

This latest work by the indefatigable playwright will be broadcast on BBC Radio 3 on his birthday. In February, Pinter appeared to herald his retirement as a playwright, announcing with characteristic terseness: “I think I’ve stopped writing plays now… I’ve written 29 plays, isn’t that enough?”

But there was never really any danger that Pinter, a self-professed “bit of a pain in the arse”, was going to bow out of the limelight for good.

More here.

3 generations later, Lolita has lost none of its allure

Kim Curtis in the Chicago Tribune:

Lolita was 12 when Vladimir Nabokov brought her to life as the obsession of her stepfather, a middle-age man who calls her “light of my life, fire of my loins. My sin. My soul. . . . Lo. Lee. Ta.”

After three generations, readers remain relentlessly drawn to Nabokov’s opening lines — more poetry than prose. They remain equally repelled by Humbert Humbert, a child molester who essentially held his stepdaughter captive; he is as despicable today as he was in 1955.

“Lolita,” a deceptively thin volume, has sold 50 million copies. Vintage Books already has sold all 50,000 copies of a new, special 50th anniversary edition it released this month.

A close-up of a young woman’s mouth replaces the previous cover photograph, a black-and-white photograph of a girl’s legs in ankle socks and saddle shoes.

“Lolita” and “nymphet” — another word Nabokov coined — have worked their way into the lexicon. Two movie versions, first by Stanley Kubrick in 1962 starring James Mason and later by Adrian Lyne in 1997 starring Jeremy Irons, have coaxed millions into theaters. Iranian author Azar Nafisi penned her own contemporary best seller, “Reading Lolita in Tehran: A Memoir in Books,” inspired in part by Nabokov, and the “Gothic Lolita” is all the rage among teenage fans of Japanese anime.

More here.

Fifteen Bush bureaucrats you should worry about

From The New Republic:

…while cronies populate every presidency, no administration has etched the principles of hackocracy into its governing philosophy as deeply as this one. If there’s an underappreciated corner of the bureaucracy to fill, it has found just the crony (or college roommate of a crony), party operative (or cousin of a party operative) to fill it. To honor this achievement, we’ve drawn up a list of the 15 biggest Bush administration hacks – from the highest levels of government to the civil servant rank and file. The TNR 15 is a diverse group – from the assistant secretary of commerce who started his career by supplying Bush with Altoids to the Republican National Committee chair-turned-Veterans Affairs secretary who forgot about wounded Iraq war vets – but they all share two things: responsibility and inexperience…

15: Israel Hernandez
Assistant Secretary for Trade Promotion and Director General of the United States and Foreign Commercial Service, Department of Commerce (confirmation pending)

Fresh out of college and seeking a job on George W. Bush’s 1994 Texas gubernatorial campaign, Israel Hernandez showed up an hour early for his interview with the candidate. Impressed by his punctuality, Bush hired Hernandez within days and eventually invited him to live with the Bush family in their Dallas home, where Hernandez reportedly became like an older brother to Jenna and Barbara Bush. Serving as Bush’s travel aide for the next few years, “He was always there with the Altoids, the speech box, the schedule, whatever I needed,” Bush later wrote in his autobiography…

More here.

Doom is Everywhere

Paul Laity reviews Worst Cases by Lee Clarke, in the London Review of Books:

If you’re feeling vulnerable in these cataclysmic times, stay clear of Lee Clarke, the Eeyore of American sociology and author of the forthcoming study of disaster, Worst Cases (Chicago, £16). ‘Doom is everywhere,’ he says, ‘catastrophes are common.’ Viruses as deadly as Ebola could circle the globe in 24 hours, ‘on the planes that don’t crash’. And ‘it’s not a question of if but of when terrorists will detonate a nuclear device.’

Bad things happen all the time, but once in a while the bad thing is so unlikely as to be almost inconceivable. In 2001 a hunter in the middle of a wood in Pennsylvania fired his gun: the bullet failed to hit a single tree, travelled through the window of a house, went through a door and a wall, and killed a woman standing in her bedroom. A few years before, on Long Island, Andres Perez, testing his new .22 rifle, pointed it into the sky and fired. A minute or so later, Christina Dellaratta, sunbathing in her backyard nearby, felt a nasty sting.

For Clarke, five hundred airline passengers are five hundred potential casualties.

More here.

Inventor of Fake Dog Testicles Wins Ig Nobel Prize

From Physorg.com:

This year’s Ig Nobel winners include:

PHYSICS: John Mainstone and the late Thomas Parnell of the University of Queensland, Australia, for patiently conducting an experiment that began in the year 1927 — in which a glob of congealed black tar has been slowly, slowly dripping through a funnel, at a rate of approximately one drop every nine years.

MEDICINE: Gregg A. Miller of Oak Grove, Missouri, for inventing Neuticles — artificial replacement testicles for dogs, which are available in three sizes, and three degrees of firmness.

CHEMISTRY: Edward Cussler of the University of Minnesota and Brian Gettelfinger of the University of Minnesota and the University of Wisconsin, for conducting a careful experiment to settle the longstanding scientific question: can people swim faster in syrup or in water?

More here.

new atheism


As Alan Wolfe points out, the newly revitalized religions have made next to no changes on the doctrinal level. But they have modified their practices, appeals, and attitudes in a more accepting and nurturing direction, creating a new sense of community. This is more than a matter of marketing; it involves living one’s faith and meeting people’s needs. Atheists have much to learn from this. If the appeal of atheism relies on arguments or it casts itself as a messenger bearing cold hard truths, it will continue to fare poorly in today’s world. For secularists, the most urgent need is for a coherent popular philosophy that answers vital questions about how to live one’s life. As McGrath points out, classical atheists were able to provide this, but no more. A new atheism must absorb the experience of the twentieth century and the issues of the twenty-first. It must answer questions about living without God, face issues concerning forces beyond our control as well as our own responsibility, find a satisfying way of thinking about what we may know and what we cannot know, affirm a secular basis for morality, point to ways of coming to terms with death, and explore what hope might mean today. The new atheists have made a beginning, but much remains to be done.

more from Bookforum here.

do unto self, do unto others


Models of movement, which are activated in the brain when we observe the actions of another person, hold information and knowledge about the way our own body functions. The possibilities and limitations of movement of our own body are the reference from which we process and interpret the actions of another person. In other words, we understand in others that which we can do ourselves, and what we cannot do ourselves we also cannot understand in others. Feedback from our own bodies apparently plays a role in our intuitive knowledge of the intentions of other people. In this way, we can not only predict the consequences of other people’s actions but also “put ourselves in the position” of the other person. Such a mechanism is the basis for sympathy and empathy, and thus decisive for the success and continuity of social relationships.

more from the Max Planck Society here.