Tuesday, May 31, 2016
The Seven Broken Guardrails of Democracy
David Frum in The Atlantic:
A long time ago, more than 20 years in fact, the Wall Street Journal published a powerful, eloquent editorial, simply headlined: “No Guardrails.”
In our time, the United States suffers every day of the week because there are now so many marginalized people among us who don't understand the rules, who don't think that rules of personal or civil conduct apply to them, who have no notion of self-control.
Twenty years later, that same newspaper is edging toward open advocacy in favor of Donald Trump, the least self-controlled major-party candidate for high office in the history of the republic. And as he forged his path to the nomination, he snapped through seven different guardrails, revealing how brittle the norms that safeguard the American republic had grown.
Here’s the part of the 2016 story that will be hardest to explain after it’s all over: Trump did not deceive anyone. Unlike, say, Sarah Palin in 2008, Trump appeared before the electorate in his own clothes, speaking his own words. When he issued a promise, he instantly contradicted it. If you chose to accept the promise anyway, you did so with abundant notice of its worthlessness. For all the times Trump said believe me and trust me in his salesman patter, he communicated constantly and in every medium that there was only one thing you could believe and trust: If you voted for Donald Trump, you’d get Donald Trump, in all his Trumpery and Trumpiness.
The television networks that promoted Trump; the primary voters who elevated him; the politicians who eventually surrendered to him; the intellectuals who argued for him, and the donors who, however grudgingly, wrote checks to him—all of them knew, by the time they made their decisions, that Trump lied all the time, about everything.
After Tens of Thousands of Pigeons Vanish, One Comes Back
Robert Krulwich in National Geographic:
At first the whole thing seemed preposterous. No way this could happen. Tom Roden, 66 at the time, was standing at the door of his home near Manchester, England. “I was just setting out on a walk with my dog when I saw him,” he told a reporter. “I recognized him straight away because of his white tail feathers.”
It was a pigeon. His pigeon. It had been missing for five years. Suddenly it was back. Why? And where were the tens of thousands of pigeons that vanished with him?
It had a name: Champion Whitetail. In 1997, Roden had sent Whitetail and a bunch of other racing birds to France, 430 miles south, to compete in the Royal Pigeon Association’s centenary cross-Channel competition, a major long-distance pigeon race with cash prizes that attracted 60,000 bird entries. The contestants, quietly cooing, were brought to a field near Nantes and released at 6:30 in the morning—that was the race’s motto: “At dawn we go.”
At the signal the birds took flight and, following a deep pigeon instinct, dashed at speeds as high as 50 miles an hour straight back toward their roosts, or “lofts,” all across England. This is something pigeons do. It’s called a homing instinct, and even though many of these animals had never been to France before, didn’t recognize the land below them, and had to cross a wide channel of ocean water before finding the house or roof or backyard from which they came, normally most of these racers would have found their way home.
Whitetail was expected to arrive early, because he was a champion. He’d already won 13 races in his lifetime, had flown across the English Channel 15 times, and had finished the Central Southern Classic from Lessay in northern France against a field of 3,026 birds with the winning time. He was a bird to watch.
How the Profound Changes in Economics Make Left Versus Right Debates Irrelevant
Eric Beinhocker in Evonomics:
Economic ideas matter. The writings of Adam Smith over two centuries ago still influence how people in positions of power – in government, business, and the media – think about markets, regulation, the role of the state, and other economic issues today. The words written by Karl Marx in the middle of the 19th century inspired revolutions around the world and provided the ideological foundations for the Cold War. The Chicago economists, led by Milton Friedman, set the stage for the Reagan/Thatcher era and now fill Tea Partiers with zeal. The debates between Keynes and Hayek in the 1930s are repeated daily in the op-ed pages and blogosphere today.
Economic thinking is changing. If that thesis is correct – and there are many reasons to believe it is – then historical experience suggests policy and politics will change as well. How significant that change will be remains to be seen. It is still early days and the impact thus far has been limited. Few politicians or policymakers are even dimly aware of the changes underway in economics; but these changes are deep and profound, and the implications for policy and politics are potentially transformative.
For almost 200 years the politics of the west, and more recently of much of the world, have been conducted in a framework of right versus left – of markets versus states, and of individual rights versus collective responsibilities. New economic thinking scrambles, breaks up and re-forms these old dividing lines and debates.
Tales of African-American History Found in DNA
Carl Zimmer in the New York Times:
Two great migrations have shaped African-American history. The first brought hundreds of thousands of Africans to the southern United States as slaves. The second, the Great Migration, began around 1910 and sent six million African-Americans from the South to New York, Chicago and other cities across the country.
In a study published on Friday, a team of geneticists sought evidence for this history in the DNA of living African-Americans. The findings, published in PLOS Genetics, provide a map of African-American genetic diversity, shedding light on both their history and their health.
Buried in DNA, the researchers found the marks of slavery’s cruelties, including further evidence that white slave owners routinely fathered children with women held as slaves.
And there are signs of the migration that led their descendants away from such oppression: Genetically related African-Americans are distributed closely along the routes they took to leave the South, the scientists discovered.
John Von Neumann Documentary
ENRIQUE VILA-MATAS talks about the future
I have come to talk to you about the future. The future of the novel, I suppose, though possibly just the future of this speech. I’m going to describe to you the future as for years I imagined it would be. Put yourselves in 1948, the year I was born, on the August afternoon when music stations in Maryland began to play the sounds of a strange, all but noiseless disc, soon spreading all along the East Coast, leaving a trail of perplexity in anyone who happened to hear them. What was it? Nothing of the kind had ever been heard before, so it still didn’t have a name, but it was—we now know—the first Rock n’ Roll song in history. Whoever heard it was suddenly pitched into the future. The music of that disc seemed to come from the ether and to literally float on the airwaves of Maryland. This, ladies and gentlemen, was the arrival of Rock n’ Roll, and it came with the deep unhurriedness of that which is truly unexpected. The song was called It’s Too Soon to Know, and it was the first recording by The Orioles, five musicians from Baltimore. It sounded strange—which isn’t so strange, bearing in mind that it was the first sign that something was changing.
What thoughts might have crossed the mind of the first person who, hearing Radio Maryland that morning, comprehended that it was the start of a new era? It’s so, went the song, in the halting whispered delivery of singer Sonny Til, it’s too soon, way too soon to know.
I have come to talk to you about the future, which was for years something I thought of as arriving in the same way that Rock music arrived in the year I was born, with the deep unhurriedness of that which is truly unexpected.
'The Noise of Time' by Julian Barnes
How strange the whirligigs of time are when it comes to literature. It’s only a few decades, a second in the eye of eternity, since Julian Barnes and his then friend and ally Martin Amis represented the new British writing. I remember Barnes saying to me back in the days when I published him in Scripsi and celebrated him when I could – or should – in the literary pages, ‘I think my work and Martin Amis’s both benefited from the fact that the dominant mode of British fiction ceased to be social realism with a comic twist.’
That was in the early nineties when Smarty Anus (as he has long been called) was looking like a behemoth of Dickensian novelistic invention. In books such as London Fields, sordor, sorrow, squalor and sex were all wrapped in the silk (sometimes the cellophane) of Amis’s prose. That prose appeared back then, more than any British prose before it, as something like a lassoing larrikin idiom, prose that could give the Americans at their wildest and most idiolectally inspired a run for their money.
But of course, there had always been another voice in the new British writing — Julian Barnes. It was clear then, and remains so now, that the unassailable masterpiece of the period was Barnes’ shortish novel Flaubert’s Parrot (1984). This strange story of obsession, which distilled the essence of the author of Madame Bovary via Barnes’ transfiguration of Steegmuller translating the Master, is the work that stands in relation to Amis not only at his grandest but also his most loose and baggy the way Jeffrey Eugenides’ Virgin Suicides would stand to David Foster Wallace’s Infinite Jest a few years later.
How Francis Bacon constructed his striking faces
The Waste Land was one of Francis Bacon’s favourite poems. A phrase from section 2, “A Game of Chess”, exactly epitomises Study of the Human Body (1982): “And other withered stumps of time/Were told upon the walls”. This closing picture, one of 29 paintings on show at Tate Liverpool, depicts a body part, a gross truncation, bereft of torso and head. Topped by its bottom, it is a rump, a sturdy circumcised cock in a haze of pubic hair, and white-booted legs, advancing towards the viewer, clad in cricket pads. We are advised that David Gower, the England batsman, was an inspiration, but I wonder if Eliot’s word “stumps” didn’t also play its part, consciously or sub-consciously, as a verbal trigger.
The whole of Bacon’s masochistic homosexuality is encapsulated in this painting. Most of the indispensable parts are there – the penis, the buttocks, the anus – though the mouth is absent, presumably too tender for the ideal rough encounter. In fact, a mouth-part is there, displaced, but not the lips and tongue. You can see a row of teeth in the right cricket pad where, just below the knee, the white protective ridges have been summarily and severely pruned.
Triggering the protein that programs cancer cells to kill themselves
Programmed cell death (a.k.a. apoptosis) is a natural process that removes unwanted cells from the body. Failure of apoptosis can allow cancer cells to grow unchecked or immune cells to inappropriately attack the body. The protein known as Bak is central to apoptosis. In healthy cells, Bak sits in an inert state but when a cell receives a signal to die, Bak transforms into a killer protein that destroys the cell.
Institute researchers Sweta Iyer, PhD, Ruth Kluck, PhD, and colleagues unexpectedly discovered that an antibody they had produced to study Bak actually bound to the Bak protein and triggered its activation. They hope to use this discovery to develop drugs that promote cell death.
The researchers used information about Bak’s three-dimensional structure to find out precisely how the antibody activated Bak. “It is well known that Bak can be activated by a class of proteins called ‘BH3-only proteins’ that bind to a groove on Bak. We were surprised to find that despite our antibody binding to a completely different site on Bak, it could still trigger activation,” Kluck said. “The advantage of our antibody is that it can’t be ‘mopped up’ and neutralized by pro-survival proteins in the cell, potentially reducing the chance of drug resistance occurring.”
Monday, May 30, 2016
Skepticism about skepticism
by Dave Maier
If you ever meet a guy who tells you that he is a skeptic, most likely he means that he doesn’t believe in angels or fairies or anything “metaphysical”. Maybe he is a member of CSICOP (the Committee for Scientific Investigation of Claims of the Paranormal, publishers of Skeptical Inquirer magazine). We should, he will tell you, examine the evidence carefully before committing to anything, and be neither gullible nor dogmatic. But of course he himself believes plenty of things, and one person’s skeptic is another’s denialist. What, after all, is “intelligent design” if not skepticism about the biological theory of evolution, and climate change “denialism” if not skepticism about climate science? In all such cases the objector accuses his opponents of epistemological dirty pool and demands that the matter be instead illuminated by the sweet light of reason, as manifested (naturally) in his own views and the ironclad evidence for same.
Such battles about which particular things to believe do not concern the philosopher, who has bigger, more theoretical fish to fry. But these fish can smell pretty fishy to those primarily concerned to beat back the dark forces of dogma and superstition (or “metaphysics”). Perhaps they should be left out for the cat.
Not long ago, for example, Bill Nye the Science Guy opined on the value of philosophy. He was not impressed. One of his gripes was that philosophers spill lots of ink on pointless questions such as whether there’s really a real world out there, or whether instead we might all be in the Matrix, maaaan [*bong hit*]. There is much indefensible stupidity and ignorance packed into Nye’s short remarks, and it is not our task today to air it out, but I did want to say a few things about the very idea of philosophical skepticism.
As it is presented in popular works and (sometimes) in Phil 101, the skeptical question is indeed given in just this form: how do we know anything at all about what’s “out there”? Most of the time we think we know all kinds of things, but here comes the skeptic to burst our bubble, and put everything we thought we knew into question. Maybe we all (or just you) are simply dreaming! Maybe we don’t know anything at all! And yet of course we do, for that way madness lies; so the whole thing looks like a perverse, logic-chopping sideshow. Why should we care about such nonsense?
The first thing to understand about modern philosophical skepticism (we’ll leave the ancients for another day) is that it is not concerned to show that we don’t know anything. After all, as critics point out, that itself would be as much a dogma about knowledge as its target. Instead, the skeptic simply presents us with a paradox, demanding not that we give this or that theoretical belief up – let alone any empirical beliefs – but instead that we get clearer about what knowledge is, and thus relieve the intolerable theoretical pressure the skeptical problem reveals.
In fact it’s precisely because the modern skeptic agrees that it is an unacceptable solution to the paradox to even think about putting our empirical knowledge into doubt that he presents it as a paradox in the first place. I don’t want to get into it in any great detail (the locus classicus modernus is Barry Stroud’s The Significance of Philosophical Skepticism (1984)), but the simplest and indeed strongest version of the idea, although it is rarely put so bluntly, is this. First, it is granted by all sides that if our conception of empirical knowledge is to apply to the cases we’re actually interested in, the inference from evidence to belief must be, as we say, ampliative – that is, it cannot be simply one of entailment, as mathematical or logical inferences are, but must instead tell us something to which we were not already (however implicitly) committed. It is this epistemic gap that explains the very possibility of error – something which again all sides concede.
Not only is error theoretically possible – it’s actual, as we can see (Descartes makes this point repeatedly) from the actuality of disagreement, on matters, that is, where not all of the competing views can be correct. Someone in error is deceived: he believes that his evidence is sufficient when in fact (obviously, to us) it is not. But we also, who are (ex hypothesi) not deceived, are in precisely the same phenomenological boat: it seems to us that our evidence is sufficient and thus that we are not deceived. From the inside, then, we cannot tell whether or not we are deceived.
This, it seems, given the conception of knowledge under the microscope, is enough – or so the skeptic argues – to cause serious conceptual problems. But again, that what we think of as our knowledge is not actually known is not the skeptic’s conclusion. It is instead, as Stroud puts it with admirable candor, that there must be some mistake in there somewhere, and the bulk of his book is devoted not to debunking our knowledge claims but instead to attacking what he sees as ineffective diagnoses and treatments of the ills caused by the paradox. It is as much a defense of the modern conception of knowledge as it is an attack on it – even more, in fact, as the “anti-skeptics” he criticizes seem to him all too willing to abandon central pillars of that conception in order to preserve our knowledge against the perceived skeptical attack.
In this sense, then (paradoxically enough), modern skepticism doesn’t concern the epistemological status of our beliefs at all – after all, even Stroud finds the skeptical conclusion unacceptable – but rather their metaphysical status. Seen this way, the concept of knowledge concerns not a bunch of things – the good beliefs as opposed to the bad – but a relation, making the question not “are our beliefs true and how can we tell?” but instead “what relates to what?”
Naturally we moderns tend to think we know the answer to this one as well: when we know something, we have an accurate representation of the outside world, like a map that shows us where the various places in the world really are, only ideally accurate (as maps are not). On this view, while it is the world’s objective nature – that it does not depend in any way on what we think or say – that forces the skeptical paradox upon us, once we are satisfied, as Descartes himself came to be, that our knowledge is safe from skeptical doubt, then the objectivity of the world is unobjectionable, and indeed a key component of the modern picture.
Thanks to Descartes and his contemporaries, the modern worldview is a broadly scientific one. That is, the modern realization, thanks in part to Descartes’s skeptical inquiries, that our representations must be rigorously scrubbed of the merely subjective contamination of their sensory pedigree in order to be valid fits perfectly with the newly mathematized and experimental scientific methods pioneered by Descartes, Huygens, et al. Since it is science and science alone which manifests the required epistemological rigor, we must turn to science if we are to have knowledge of the world as it is anyway, independent of our believing it to be that way. Premodern appeals to intuition, faith/superstition, and tradition are inferior and downright suspect. The contemporary skepticism of the Skeptical Inquirer turns out to be a direct result of Descartes’s skeptical philosophy.
Indeed, when philosopher Michael Williams subjects Stroud’s skeptical argument to sustained criticism (taking it, typically, as an attack on our knowledge and looking, as anti-skeptics will, to defend it), he specifically clears Stroud’s “objectivity condition” on knowledge of being the source of the theoretical problem (as well as the “theoretical” condition, the idea that it is philosophical grounding of our everyday belief which is our concern, not the beliefs themselves). Williams locates the problem in Stroud’s “totality” condition – the idea that we must theoretically vindicate our knowledge all at once rather than piecemeal – and proposes his own doctrine, a version of epistemological contextualism, in response to what he calls Stroud’s “epistemological realism”. (While this sort of philosophical niggling is to my mind precisely as boring as it sounds – I wrote about it at length in chapter 4 of my dissertation – it is still far from earning Nye’s contemptuous dismissal.)
In my view, then, contemporary skeptics like Stroud are doing the same thing as Descartes: defending modern metaphysics – i.e., the same general picture assumed without argument (not inappropriately for a modern non-philosopher, I grant) by our CSICOP skeptic – against theoretical attack. What Stroud does, in Significance and later work, that Descartes cannot is respond directly to contemporary critics of the Cartesian picture such as Kant, Quine, Wittgenstein, and Rorty, all of whom he regards as too quick to respond to the skeptical threat by abandoning the Cartesian notion of objectivity as a world-in-itself ideally free of all subjectivity, and thus as soft on relativism or idealism in one way or other.
This is my terminology, as no one nowadays – least of all our scientific skeptic – likes to think of himself as Cartesian. Descartes’s dualistic metaphysics is generally supposed to have been decisively refuted, in favor of a scientifically respectable materialism. Daniel Dennett, for example, has led the contemporary charge against the Cartesian picture of the disembodied mind and, more importantly, the persistent fantasy of essentially subjective internal states inexplicable in objective scientific terms which survives the collapse of strict mind-body substance dualism, e.g. as manifested in the views of such contemporary Cartesians (in this sense) as Thomas Nagel, Colin McGinn, and Stroud himself.
As I have argued earlier in this very space, though, the materialist attack on the Cartesian subjective “inner” (as the last remaining residue of the rejected substance-dualist view) leaves in place, and indeed often overtly appeals to, the ideally objective “outer” world which is its direct analogue. That is, it displaces mind-body substance dualism only to leave in place a remarkably similar conceptual dualism of subject and object. This is why we cannot overcome the incoherent notion of an inaccessible subjective “inner”, as Dennett has managed more than anyone else to do, without at the same time addressing the corresponding commitment to the equally incoherent notion of an “objective” world viewable from no particular point of view – the other half of the subject-object dualism.
That this latter conception goes along with the former, overtly Cartesian one is explicit in Nagel, e.g. The View from Nowhere, the title phrase meant to capture the conception of objectivity he sees as “philosophically fundamental”. It is also evidently a paradox, like Stroud’s, meant not to be dismissed as incoherent but instead to be redeemed, in Nagel and elsewhere, with a renewed commitment to the Cartesian picture.
The dualism itself is what’s keeping the Cartesian view in place, but if we can’t see how to give it up without threatening the objectivity of science (in the relevant sense), then no modern will be willing to do so – especially in the face of postmodern attacks on science as at best a philosophical house of cards, if not an outright fraud. Cartesians have long been so successful at painting their opponents as relativists and idealists that the mere accusation can seem to carry the day. Perhaps ironically, one of the favored putdowns of contemporary anti-Cartesian views is that it amounts to “postmodern skepticism” – that is, doubt concerning the idea that the Cartesian ideally objective world is a world that we can know.
Of course doubt of this kind is just what I described Stroud pushing as part of his project to defend the Cartesian view. So it’s not surprising that the issue can seem to be a giant muddle. We’ll have to stop here for now, but at the very least I think we should find ourselves out of spitting distance from the likes of Bill Nye the Science Guy.
Searching For America
by Michael Liss
It is time for navel-gazing here in the US.
We are about to have an election in which the two likely nominees have managed to alienate the electorate to an unprecedented degree. It has led to a surreal atmosphere. Hillary Clinton slogs on with a message that brings to mind the appeal of an appointment with a dental hygienist—it won’t be the highlight of your day, but it’s the healthy choice. Donald Trump has managed to do something quite brilliant—he has identified his target audience, taken disgust with dysfunction, mixed it with a shot of anger, and distilled it into one easily digestible slogan: “Make America Great Again.”
It is a genius-level move by a master salesman. With those few words, Trump seizes for himself and his supporters a core identity as the true heirs of a legacy of American preeminence. Like a classic old building, American greatness is still here—it’s just covered under layers of accumulated grime. With the right man in charge, someone of vigor and boldness, we can sandblast it all away and have a palace—even a cathedral—that celebrates us. As we once were, so shall we be again.
But who were we? To what are we returning? That’s a fascinating question, because to own something, you need to be able to define it. And history lacks the clarity of a mathematical proof or a replicable scientific experiment. To paraphrase an interesting point Mary Beard makes in SPQR: A History of Ancient Rome, the historian engages in a work of reconstruction which, by definition, is self-limiting. When the written word is absent or suspect, you learn about things by piecing together inference and fact, as if you were reassembling a broken amphora. You can scientifically analyze the contents, you can date the time it was fired, you can make assumptions about the economic and social standing of the owner and the community he lived in, but, in the end, what you have in front of you is likely the remains of an attractive, once useful, pot. A pot—not an unimpeachable set of facts about the nature of the people who used it.
Yet, from some pots, and some ruins, and some odes and epics, we think we know Professor Beard’s Romans, even from two millennia distant, and what we see, we like. Putting aside, for delicacy’s sake, the more violent and unsavory aspects of conquest and governance (sacking Carthage, lions and Christians, etc.) we admire what they defined as Roman virtues: Prudence, mercy, dignity, tenacity, truthfulness, and virtus (manliness, excellence, valor). Of course, we don’t know how many Romans lived up to that code, and we can be sure that at least some of it is self-aggrandizing myth. But we project upon these ancients a modern set of values and give them life in our imagination.
In looking for what might be thought of as an American character, and placing it in contemporary political terms, we need to apply a different type of rigor than the historian might. A purely objective analysis, or how others might think of us, is less relevant than our subjective sense of what we were, and want to return to (Trump’s potent message), or what we are, and aspire to be.
So, who are we?
We do have a creation myth—a costume drama of 18th Century men in hose and formal wear, debating over gigantic issues such as the meaning of democracy, the form of government, the rights of the individual. There are common people in the story—militiamen springing out from behind some tree to pick off a Redcoat, or Molly Pitcher, or the shivering, starving soldiers at Valley Forge, but it is very much an aristocratic narrative. Creation needs Gods, the Founders are our secular equivalent, the Constitution our Ten Commandments.
But few children grow up wanting to be James Madison. What fires the imagination is what happens after the Gods do their work, when ordinary people are left to till the garden. That is one of the first inflection points that is part of the American reality (or the American myth)—that neither power nor virtue is permanent or hereditary. The great things to come—the taming of our own wilderness, and then the preserving of the world—are much less the product of great men than of countless individual efforts by those who took risks and braved hardships.
For that part of the story, the key to determining our emotional center, pots and buildings, even the written word, have far less value. Instead, we have visual poetry, and the reflections of two great filmmakers—John Ford and Steven Spielberg—on American character and values.
Ford is the “tame the wilderness” artist. He deals with the opening of the West, cowboys and Indians, the struggle against difficult conditions, and the sometimes tenuous hold on life. His locations—dusty trails, beat-up saloons in small towns, tired ranches, arid farms, displayed against a backdrop of physical magnificence like Monument Valley—reflect the sheer scale of the challenge of the frontier. His heroes are quintessentially American in the sense that they explain a range of our sensibilities—complex emotionally, but sometimes unrefined; able, but not always in every way; seeking justice, but not always free of prejudice or anger; overcoming obstacles, but occasionally failing. Ford relies on an entire troupe of stock characters to give familiar texture and color, but he convinces his stars to take risks. John Wayne not only gets to play the hero, in movies like Fort Apache, She Wore a Yellow Ribbon, and Rio Grande, but also, in The Searchers, the bitter monomaniacal racist Ethan Edwards, who (just barely) redeems himself. Henry Fonda is a thoughtful and deliberate Lincoln in Young Mr. Lincoln, saintly Tom Joad in The Grapes of Wrath, the man of action and moral avenger (or revenger) Wyatt Earp in My Darling Clementine, and the brittle, arrogant, contemptuous Colonel Thursday in Fort Apache.
Ford’s later work in The Man Who Shot Liberty Valance comes to grips with another part of the frontier/Western legacy—hard men have to do hard work to secure the peace, and other men stand on their shoulders. Jimmy Stewart’s Rance Stoddard has the moral and physical courage to challenge Liberty Valance, but can barely aim a gun, much less be willing to kill someone with it—even if it means his own death. John Wayne’s Tom Doniphon, who contemptuously calls Stoddard “Pilgrim,” recognizes that physical evil like Valance’s has to be opposed, even with extreme acts. From an unseen vantage point, Doniphon fires the shot that kills Valance, but it’s Stoddard who gets the “credit” for Valance’s death—and uses it to launch a successful political career. The movie is tremendously bittersweet—neither man’s hopes are fully realized: Doniphon’s secret rescue of Stoddard costs him the woman he loves—Hallie, who then marries Stoddard. And Stoddard knows he’s been living a lie—the last seven minutes of Liberty Valance, following Doniphon’s funeral, are confessional, redemptive, and excruciating all at once.
Spielberg, who approaches Ford’s talent in both visual mastery and story-telling ability, uses Tom Hanks’ Fonda-like appeal to great effect in wrestling with America’s post-frontier ethos—the time in which our battles were the world’s. In Saving Private Ryan, Hanks’ Captain John Miller is like millions of others in the Greatest Generation—an ordinary man placed in extraordinary circumstances, who then rises to the challenge. He is sent on a mission he sees as largely symbolic, and, to carry it out, must put himself and his seven men at great risk. He sees and suffers loss all around him, and ultimately settles on the only rationale that gives him comfort—finish the task and he can get home to his wife. But he is acutely aware of the paradox of war and what it does to the psyche: “Every man I kill, the farther away from home I feel.”
Take that “Greatest Generation” forward to the 1960s, and Hanks appears again as the lawyer James Donovan, in Bridge of Spies. The world has changed dramatically. The absolute evil of Hitler and the Nazis has been replaced by a Cold War between stupendously powerful forces with incalculably deadly weapons. The people who fight this war are cognizant, to the point of paranoia, that mistakes can have existential consequences. The public feels the same. With that as backdrop, Donovan finds himself the one straight arrow in a world of shadow-boxers for whom the ends justify any means. “Don’t go all boy-scout on me,” says Hoffman, Donovan’s CIA contact. But boy scout is what Donovan is. He defends the accused Russian spy Rudolf Abel at significant personal and reputational risk. He sticks to his principles, at home, and in East Berlin, and succeeds. This is Spielberg’s statement about American values—our strength derives from our belief that we made a compact some 200-plus years ago, that we play fair, that everyone is equal before the law. What Donovan calls “the Rules” (the Constitution) is what binds us as Americans, regardless of our place in society, ethnicity, or national origin.
Do those ties still hold and are we all legatees? We live in a complex and contradictory world where risk aversion can muffle the better angels of our nature. Are we still the America of Ford and Spielberg, realists and idealists at the same time? This is a test, in this election cycle and beyond. As Teddy Roosevelt once said: “Character, in the long run, is the decisive factor in the life of an individual and of nations alike.”
Pakistan is digging trenches — graves for people who have not yet died
as the country prepares for another record-breaking heat wave. Scientists
place the blame for rising temperatures squarely on climate change.
—IndiaTimes, May 23, 2016
spades trace dolorous arcs in dry air
making long scars for many corpses.
sharp bell-like clangs of steel on stone
echo from the depths of this new scar.
the swoosh of pick-heads through air
end in thuds as their pikes take bites.
men sling dry earth over shoulders.
they lean into their work.
they heave the earth upon itself
raising mountains of waist-high ranges
that parallel the long straight wound they carve.
these sweating ghosts-to-be
who may soon be thrown as well
into the coarse cut of their work,
a ditch that will soon be healed, forgotten, lost
when the undulating range piled by gravediggers
is thrown back in to bury hearts that break,
covering myriad sins: myopia,
misanthropy, masochism, mistake,
this ditch where now-breathing, sweating,
living, loving dead will go—
we’re so good to ourselves, so profligate
we'll waste even our own last breath,
we'll make a place for it in a hewn slash,
bury it in our blue mother’s flesh,
the one we have not wisely loved
but sold for cash instead
by Jim Culleny
Memorial Day: The Heartbreaking Convergence of Freedom and Fear
by Humera Afridi
Mere steps from Castle Clinton in Battery Park, on the southern tip of Manhattan, stands a striking bronze sculpture titled The Immigrants. Created in 1973 by the Spanish sculptor Luis Sanguino, it portrays a group of individuals who have undertaken an arduous voyage. Their gripping expressions and postures tell a story of endurance—borne with patience and prayer; kindled by hope for a life of dignity, free of fear, whose nimbus-like promise will surely unfurl in this new world where they have disembarked.
Amid the deep-green lawns, beds of blooming tulips, and the sunny melodies of street jazz, the bronze figures beckoned. I spotted them on my lunch break, a fortnight or so before Memorial Day. Their raw emotions and the naked display of the human spirit expressed in all its earnestness caught me by surprise. Here in plain sight was a visual testimony to the search for sanctuary—a struggle that is painfully alive in a world beset by wars, but also immediate and close to home, visceral in the lives of many thousands of immigrants in America who, having found refuge here, now tragically live in fear of being deported and separated from their families.
A figure kneels, bare-chested with head thrown back, arms spread wide, broken chain-links dangling from fingers; another clasps both hands in fervent prayer, gaze directed heavenward. Disconcertingly candid and telling is the stance of one at the front of the line, who crouches, with a hand outstretched—surely symbolic of the labor of immigrants, and former slaves, upon whose foundation this nation is built. In the middle of the group stands a robed male of dignified bearing, arm held across his breast in a gesture of allegiance? of self-determination?
"Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!"
If once, on approaching these shores, the mercy and beneficence of these words elicited tears of gratitude and relief, today, remembering these lines against a backdrop of fear-ridden rhetoric and anti-immigrant vitriol, one weeps with despair at their growing hollowness, at the realization that the "golden door" is starting to resemble more and more the metal gate of a jail cell.
On January 6, 1941, President Franklin D. Roosevelt delivered his famous Four Freedoms speech. The world was at war, he declared. And in order to safeguard America's cherished value of Liberty, the dictators and tyrants abroad had to be defeated. In fact, the outcome of the war would determine if America's ideal of freedom would prevail over tyranny. The expansion of liberty in the world, Roosevelt insisted, was the best hope for peace at home. He envisioned Americans at the vanguard of establishing freedom and democracy, first at home, and then enabling their fruition in the world beyond.
"In the future days which we seek to make secure, we look forward to a world founded upon four essential human freedoms," Roosevelt said. "The first is freedom of speech and expression—everywhere in the world. The second is freedom of every person to worship God in his own way—everywhere in the world. The third is freedom from want…. The fourth is freedom from fear…"
Standing in historic Battery Park—former "golden gate" to the new world, and, at the same time, an area of defense equipped with artillery batteries to protect the settlement—I suddenly understood that freedom and fear breathe side by side. They share the same topography. They are tribal cousins in feudal rivalry. Caught in a tensile dance, a forever friction, freedom and fear are creative-exploitative; they are neighbors and enemies; they are light and shadow, sharing a volatile locus. To remain at peace Americans needed to prepare to fight, said Roosevelt. To secure Liberty, armaments and defenses had to be built. It was a cruel paradox.
Thirteen months after Franklin D. Roosevelt's eloquent speech—in which he impressed upon Americans the right of every human being to be free of fear—America, in a sinister, Kafka-esque turn of events, turned on its own citizens, and systematically destroyed dreams and families, homes and livelihoods. It was Roosevelt himself who signed the evacuation order for Japanese Americans to be rounded up and interned in concentration camps. Two-thirds of the 127,000 imprisoned were American born; some were even veterans of the United States Army in World War I; many had never set foot in Japan. They resembled the enemy; they were of Japanese ancestry. That was their crime. And no one who resembled the enemy was exempt.
At the beginning of Julie Otsuka's harrowing novella, When the Emperor was Divine, we witness the movements of a woman who remains nameless as she works with the quiet, concentrated energy of one upon whom a great violence has been enacted. She is a woman without a husband now—we learn that the FBI came for him around midnight a few weeks earlier, took him away in his bathrobe and bedroom slippers and locked him away in detention, in a treeless wilderness, where he sleeps on a metal cot. She is a woman with children and pets to care for. We sense the love coiled tight in her heart. The weight of care in this new reality, and the work of love, were she to ponder these now and allow them to unravel, she might collapse. She can't afford that luxury. She has seen the notices all over town for evacuation orders.
She must act. She must pack. She is a good citizen, an obedient citizen, a loving mother. So, she feeds the old, half-blind dog a delicious meal of rice balls, egg and salmon; she rubs his stomach, talks to him, walks him over to a tree, ties him with a piece of twine and bludgeons him to death. Then she buries him. She feeds her daughter's beloved parakeet before she lets it out of its cage, out of the window, and shoos it away to freedom. She packs her suitcase. She buries the silver in the garden. The children each have a small suitcase. That's all they're allowed to carry. A subdued, stifled, ominous quality pervades the narrative, creating a haunting evocation of the living death experienced by those who left home to be quarantined on the other side of the barbed wire fence.
After the war, the mother and her two children return to a world that looks upon them with suspicion, with a mistrust and hate that the children internalize. "We looked at ourselves in the mirror and did not like what we saw: black hair, yellow skin, slanted eyes. The cruel face of the enemy. We were guilty now… No good… A dangerous people who could never be trusted again… On the street we tried to avoid our own reflections wherever we could. We turned away from shiny surfaces and storefront windows. We ignored the passing glances of strangers. What kind of "ese" are you, Japanese or Chinese?"
They were told it was a matter of military necessity, the camps were an opportunity for them to prove their loyalty. It was all in the interest of national security.
Fear, every politician knows, is a powerful political tool. And, in the current electoral campaign in the United States, it is fear, once again, that is being wielded—in the guise of Liberty. Ah, the seductive power of Liberty! Liberty governs the quality of our material lives; it fires our spiritual ideals; our patriotism. Liberty it is that ensures happiness. And to pursue happiness, we must attain freedom from fear, and to do so we must eliminate the enemy. What happens, though, when you resemble the enemy?
For Muslims in America, it feels very much like history is repeating itself. Muslim Americans are experiencing and feeling anti-Muslim bigotry at unprecedented levels, more so than in the aftermath of 9/11. Community organizations that are at the forefront of confronting Islamophobia agree that discrimination is heightened by government and state policies that view Muslims through a national security lens. Fahd Ahmed, Executive Director of Desis Rising Up and Moving (DRUM), a working class and youth-based organization in New York City, states: "Nowhere in history will you find a time of war creating an environment for open-minded and progressive thinking around inclusion and community-building. Wars inherently create suspicion, distrust, division and the environment for these ideas." Fifteen years of war have created the atmosphere for politicians to openly and vehemently denigrate Muslims and immigrants.
Nationally there are 3.3 million Muslims, who make up just under one percent of the population of the United States. One out of five Muslims in the U.S. lives in the New York region. Muslim youth are feeling the burden of representing their community. They're feeling the pressure to be politicized, to know more than their peers, to be representative and knowledgeable about Islam and the politics of the region—because they find themselves under scrutiny, and asked such questions by teachers and peers. Muslim youth are experiencing a critical need to define their identity. Many are fearful of speaking another language lest they be perceived as not assimilating. Girls in hijab want to be seen beyond the barrier of this visual marker. Many are experiencing the challenges of poverty but feel isolated and marginalized. They want to express themselves and be heard. They want to counter the mainstream narrative, the biased slant of the media. A great number of Muslim youth who do not have a pre-9/11 reference point feel unwelcome in America. Linda Sarsour, Executive Director of the Arab American Association of New York, believes that if more Americans don't speak out, if policy makers and government don't provide sufficient support, the country will fail its young American Muslims.
"We have a firm belief that institutional forms of violence actually create the platform for social violence," emphasizes Fahd Ahmed. He lists racial profiling, surveillance, anti-immigration policies, the targeting of South Asian and Arab students by school safety officers; deportations and detentions as sending a message—there's something suspicious about these communities, something wrong.
In her book We Too Sing America, lawyer and activist Deepa Iyer writes: "Post 9/11 policies and the narratives used to justify them bear an eerie resemblance to those implemented during World War II. Their ultimate ineffectiveness … (in) fighting terrorism generate the inquiry: are these policies, in effect, ways to purge America of its ‘undesirable' immigrants?" Iyer advises that the country assess the impact of its policies on immigrant communities. "The state cannot both welcome immigrants and enforce the civil rights of people of color while simultaneously engaging in practices that justify wholesale profiling of these same communities. Instead, the state must hold itself to the highest civil and human rights standards at all times, especially during times of significant national turmoil. Otherwise, we risk losing our nation's core values and compromising the ideals that draw so many immigrants to America."
Today, it is Memorial Day, a United States federal holiday commemorating all those who've lost their lives while serving in the armed forces. I think of the Muslim American women and men who are currently serving in the United States military. To be oriented wholly by your conviction in the cause of Liberty, where faith and family and cultural roots and ancestry come second, is truly laudable. I can't celebrate Memorial Day without thinking of Noor Inayat Khan, the first woman wireless operator parachuted into Nazi-occupied France in World War II. As a Sufi, Noor believed in nonviolence, but she also believed in the right to freedom and felt it was a spiritual and moral duty to fight the horrors of fascism. The daughter of an Indian mystic and an American mother, Noor was born in Moscow and grew up in France. During her interviews with the British War Office, her interviewers asked questions and contemplated her allegiance. Could they trust this brown-skinned ‘Indian' woman? India was restless for independence. If Noor felt so strongly against occupying forces what did she truly feel about Britain?
In truth, Noor was as French as could be and turned out to be extremely valuable to the Special Operations Executive, the British espionage and sabotage organization, which recruited her. She worked for the Resistance in highly dangerous circumstances until she was betrayed and captured by the Nazis. Held in solitary confinement for a year, she was transferred to the Dachau concentration camp. There, over the course of a night, she was tortured and beaten. In the early hours of September 13, 1944, she uttered her last word, Liberté!, right before she was shot. In 1946, Noor was posthumously awarded the Croix de Guerre for bravery and in 1949 the George Cross.
For those who look like ‘the enemy,' but are, in truth, valued members of this nation, who gave and are giving of their lives, let's remember them today. Muslims have fought for the United States in all her major wars, including as far back as the Revolutionary War, serving in the Continental Army under George Washington. At Arlington National Cemetery there are graves of Muslim soldiers decorated with Purple Hearts who died fighting in Iraq. On the granite pylons of the East Coast Memorial in Battery Park—bearing in alphabetical order the names of WWII air and navy servicemen who lost their lives in the Atlantic Ocean during combat and who now "sleep in the American coastal water"—I discover Anees under the A's, Khoury in the list of names beginning with K. Assimilation of non-European immigrants in America has been far from smooth; it's easier for those who pass as White. I think of the spouses and the children of those who died in battle, who are left to fight the war of survival, with its wounds of loss and rupture. I wonder, too, how it must be to give all of yourself over—body, mind, heart and soul— to the cause of Liberty, to a nation that accepts your labor and your life, but still perceives you, your children, in essence, as other?
The Immigrants call me back. I visit again and again. There's something prophetic about the sculpture. The patina of vulnerability on these figures is real. It points to a crisis in America—one that can no longer remain hidden in the shadows, where vulnerable immigrants are suspect and mistreated in the name of Liberty. On this Memorial Day as we honor our lost brave, let us also remember the terrifying, late night knocks on the door, the families that are riven by the violence of discriminatory policies right here at home. And let us not forget Liberty, who awaits with her lightning-flame lamp beside the golden door.
The Banality Of Neoliberalism (As Exemplified By The Clintons) And Why Americans Never Saw Its Evil (Until Occupy, Bernie Sanders And Donald Trump Alerted Us)
If it hadn't been for the disaster that was George W. Bush, the worst president of our time would be that arch-neoliberal serial philanderer Bill Clinton.
Clinton was almost as crappy an a-hole as W.
George W killed hundreds of thousands of innocent Iraqi women and children in a monstrous war crime. Bill Clinton merely made the lives of millions of Americans utterly miserable.
1. How Bad Was Bill Clinton?
Breathe in the stench from the pile of crap that Slick Willy stuffed up our nostrils.
He destroyed thousands of good American jobs by exporting them with NAFTA.
He created the 2008 Wall Street crash and the Great Recession when he signed the two laws that repealed Glass-Steagall and removed financial derivatives from all oversight — the two worst laws signed by any president ever.
Internationally, he refused to intervene in Rwanda, allowing 800,000 Tutsis to be slaughtered in a genocide.
He exploded the size of our Black and Latino prison population with his harsh 1994 crime bill and the building of many privatized prisons.
He doubled the number of our poor with his welfare reforms (today 47 million Americans live in poverty, and over 20% of our kids are poor, a higher rate than in any other developed nation).
Clinton's presidency left Americans jailed, poorer, and brutally screwed in every sensitive orifice. He forced many of us to eat an eternal shit sandwich, a record of destruction topped by George W. only because W committed the satanic war crime of the Iraq War.
2. What made Bill Clinton so bad?
It's all because Clinton was a consummate neoliberal, whose agenda favored anti-labor-union big business (via the right-leaning Democratic Leadership Council), with its faith in a fundamentalist global "free market" ideology that wants to privatize all economic activity and drown government in the bathtub. This fundamentalist global religion, devoted to a "free market" God, comes down to a peculiar un-Christian heartlessness: cut government regulations to the point that corporations are free to screw us every which way.
Few Americans know that in concert with the abominable Newt Gingrich, Clinton was secretly getting ready in 1997 to privatize Social Security, before he got distracted by America's discovery that when he pulled his stiffie from Monica Lewinsky's luscious mouth, he splooged presidential pearl jam all over her dress. (Just like Obama, another neoliberal, was getting ready to give up Social Security benefits to make a grand deficit-cutting bargain with the Republicans, only to be frustrated by their unwillingness to work with him on anything.)
But who today would even admit that Bill Clinton was a close second to George W. Bush as a piece-of-crap president?
And why is that? Why do Americans still love the man who slathered our faces in fecal matter?
3. The Banality Of Neoliberalism
It's because Clintonian neoliberalism has become the sea in which we swim: so much a part of us, we don't know it exists.
Neoliberalism has become what's normal. Big business rules. Neoliberalism is what's accepted. Wall Street pays fines instead of going to jail. Neoliberalism is the status quo. Our wages have remained stagnant since 1970. Neoliberalism is the natural order of things. Congress doesn't listen to ordinary folks, only to the 1% rich.
Neoliberalism: an everyday commonplace quotidian banality.
Neoliberalism: what unregulated capitalism has turned into.
Neoliberalism has become what we are, how we live, how we think the world is, our humdrum selves. As per Fredric Jameson, the literary Marxist scholar: "It is easier for people to imagine the end of the earth than it is to imagine the end of capitalism."
In short, neoliberalism is our current banality of evil.
As Margaret Thatcher remarked: "There is no alternative."
And nary a one of our many public intellectuals has ever told us that there is any alternative. Instead, they've sold us a bunch of hooey. They've swallowed neoliberalism whole, like a slavish sub swallowing her dom's insults along with his jism.
As Joe Bageant, the deceased red-neck philosopher and pundit blogger extraordinaire put it:
"The brutal way Americans were forced to internalize the values of a gangster capitalist class continues to elude nearly all Americans. Most foreigners too. This is to say nothing of how our system replaced our humanity with ideology, our liberty with money, and fostered fascist nationalism through profound degeneration of the people's mind and spirit. It's not as if one can ever escape that sort of thing. We are made in America's image, whether we admit it or not, and America's image is the face on a ten-dollar bill."
4. A Walmart economy instead of a Ford economy
Neoliberalism is why, instead of a distributive Ford economy, in which Henry Ford paid his workers double the going rate so they could afford to buy the cars they made, we have a predatory Walmart economy, whose owners pay their workers so little they need food stamps to survive on their slave wages (your taxes subsidize the profits of the Walton family to the tune of $6 billion a year).
It's why today, labor unions represent only 10% of our workers, down from 30% in 1971. The last three Democratic presidents — Carter, Clinton and Obama — didn't care a hairy testicle about labor unions or the working class (Obama dropped labor's wish for card-check like a used condom). As neoliberals, these presidents favored white-collar professionals over blue-collar workers. They preferred the meritocracy to the proletariat. Limousine liberals.
That's why Obama did not prosecute Wall Street fraudsters, because, like him, they were white-collar professionals, and because his administration was stacked with their pals (Summers, Geithner et al).
A kid can go to jail for years if he smokes a joint, but a Wall Street crook gets rewarded with a big bonus for committing massive fraud.
President Obama has been the worst lackey of the 1% we've ever had. Under Reagan, some 800 senior executives went to jail because of the Savings and Loan scandal. Today, after a far bigger fraud that led to millions of jobs and homes lost — with banks shorting their own fraudulent products, ripping off homeowners, hiring trucks to drive bags of drug lord cash over borders, investing funds for terrorists, rigging the Libor rate — not a single big bank executive has gone to jail. The crooks have kept their blood money. Sometimes the banks pay a fine, but that's the cost of doing business, and it's a small percentage of their criminal profits. The fact that Wall Street can steal, con, commit fraud, cheat pension funds, screw little old ladies and tank the economy with impunity proves that the 1% has become untouchable. The fix is in. We the 99% are powerless to change it.
You might want to call all this neoliberalism terribly unfair to regular folks, but I want to call it straight-up evil.
This only sounds shocking because we're so used to the banality of its existence.
5. Wage Stagnation And Other Neoliberal Evils
Take wage stagnation. Our wages are what they were in 1970. Stuck in the remote past. Yet worker productivity has sky-rocketed since then. The average household income of Americans today is around $50,000 a year. However, if we were paid for how much more productive we have become, the average household income today would have been $92,000. Imagine what a thriving economy we'd have today if that were our median household income. Where did that extra $40,000 a year go that we should have been earning for our improved productivity? To the top 1%. To CEOs, who used to make maybe 20 times what an average worker makes, and now make between 300 and 500 times that.
But how many Americans know this? No one except me and a few other progressives (probably no more than the readership of that most excellent publication, The Nation).
The banality of evil neoliberalism, and its unthinking acceptance by our corporate-owned media and pundits, have shielded us from the truth.
Take the boondoggle of the Pentagon budget. It's the biggest item in government spending, and it's totally unnecessary. We can easily get by with a defense capability at least a hundred times smaller than what we pay for. Who is going to attack America? Nobody. To guarantee our security, we need no more than, say, a hundred thousand drones (useful little critters), a couple of thousand Navy SEALs for ground troops, a thousand nuclear-tipped rockets tops, at most five hundred jet fighters, and perhaps a dozen aircraft carriers. That about covers it.
Shut the Pentagon down; turn it into an assisted-living facility. Spend our defense billions on free pre-K for all our kids, and free college tuition for all our students. Not on defense — a euphemism for corporate welfare. Or on homeland security, another billion-dollar boondoggle. The entire Pentagon — which takes our tax dollars to support manufacturers of weapons (can you think of a more evil occupation than this business of mass killing?) — is corporate welfare at its most pernicious.
Meanwhile, we suffer. According to a 2014 study by the Social Progress Index, among similar countries, the US ranks 31st in personal safety, 34th in access to water and sanitation, 39th in basic education, 69th in ecosystem sustainability, and 70th in health.
6. Capitalism on steroids
Neoliberalism is capitalism on steroids. It believes in free-trade arrangements that give corporations the right to sue states for causing them any loss in profits. For example, if a government legislates against the dangers of tobacco, free-trade arrangements want to give tobacco companies the right to sue that government for their loss of profits. Not in a court, but in arbitration presided over by corporate representatives.
Capitalism on steroids breeds monopolies like rabbits. Today our economy is dominated by them. Amazon, Google, Facebook and Wall Street banks (now bigger than they were before the crash) are monopolies that would have made the hands of trust-busting Teddy Roosevelt itch. But today these monopolies are untouchable, because they exist safely in the banality of neoliberalism.
Neoliberalism means shifting the tax burden from corporations to individuals. After WW2, for every dollar that individuals paid in taxes, corporations paid $1.50. Today it's 25 cents. Corporations used to pay more taxes than people, but now people pay more taxes than corporations. Goldman Sachs often pays less than 2% in taxes, and GE often nothing. We hard-working regular folks (Romney's notorious 47%) are not the takers who mooch off the government tit — our big corporations are. So are our 1%. Since 1997 the 400 richest Americans have more than tripled their average annual income to $345m, while their taxes have gone down by 40%. A billionaire like Romney pays 13% in taxes, far less than you or I do.
And take Wall Street. Its job used to be to lend money to businesses — to start businesses, to expand businesses, to fuel business. Now only 15% of its money goes to funding businesses with productive lending (that's why startups — which are how new jobs get created — have declined by 44% from 1978 to 2012). Wall Street uses the other 85% to make money for itself from debt-fueled speculation. Financialization: the corroding worm inside the apple of neoliberal capitalism.
And don't forget that neoliberalism started under President Carter. He deregulated the railroads, airlines and, most importantly, interest rates (which led to financial "innovations" and shifted banking from lending to trading). And he cut the top marginal tax rate from 90% to 70% (and then Reagan cut it to 28%).
7. We Don't Know How Buggered We Are
We have gone from a democracy to a plutocracy without us being aware of it — like the frog luxuriating in slowly warming water until the poor fellow is boiled alive.
The 1% has stolen $40,000 a year from our workers and we don't know it.
Big business steals our tax dollars with the defense budget and we don't know it.
Corporate welfare is bigger than social welfare and we don't know it.
Monopolies run our lives and we don't know it.
There has been a relentless attack on the New Deal since the Powell Memo inspired big business to start think tanks and siphon neoliberalism into our lives … and we don't know it.
Our democratic government operates for the rich and against the average American and we don't know it.
We don't know it because the banality of neoliberalism has inculcated a banality of psychology into our minds. This psychology makes of us not human beings, but consumers. We are surrounded by products and advertising. Our lives and status are determined by what we buy. Marxist analysts call this commodity fetishism. When the iPhone came out, people lined up around the block for it. Commodity fetishism rules us: we're more attached to our iPhones than to our pets.
And we don't know any of this. We live mute and deaf in the banality of neoliberalism. A kind of soft totalitarianism. While we're hooked on our iPhones and our Game of Thrones and our diabetes-inducing soda pop and our GM food and our Prozac and our opioids and our social media and our superhero movies (the reverse coin of consolation to our helplessness), we forget that there is a big neoliberal dick up our asses, shoving away till we bleed out.
It wasn't always like this. There was a time when the Democrats, and even corporate leaders, weren't neoliberals. That most essential of pundits and investigatory journos, Rolling Stone's Matt Taibbi, puts it like this:
"There was a time in this country – and many voters in places like Indiana and Michigan and Pennsylvania are old enough to remember it – when business leaders felt a patriotic responsibility to protect American jobs and communities. Mitt Romney's father, George, was such a leader, deeply concerned about the city of Detroit, where he built AMC cars.
"But his son Mitt wasn't. That sense of noblesse oblige disappeared somewhere during the past generation, when the newly global employer class cut regular working stiffs loose, forcing them to compete with billions of foreigners without rights or political power who would eat toxic waste for five cents a day.
"Then they hired politicians and intellectuals to sell the peasants in places like America on why this was the natural order of things. Unfortunately, the only people fit for this kind of work were mean, traitorous scum, the kind of people who in the military are always eventually bayoneted by their own troops. This is what happened to the Republicans, and even though the cost was a potential Trump presidency, man, was it something to watch."
8. Four Hopeful Chinks In Neoliberal Armor
Perhaps the prime function of any democratic government should be this: to stop the elite from stealing everything.
In this, the American government has failed us miserably. In fact, they've encouraged the elite to steal everything. Made it easy for them.
And why was it possible for our government to do this? Because nobody knows what the hell is going on. We're as clueless as a Texan evangelist about the transgendered.
That cluelessness is how the banality of evil neoliberalism has had its way with us. We and our corporate-owned media and our intellectuals accept neoliberalism without giving it a moment's thought.
We're as blind as plankton in a whale's gut.
But now there are four chinks in the banality of neoliberal armor:
1. Occupy Wall Street, which brought income inequality into the national conversation.
2. Elizabeth Warren, who tells us that the economy is rigged in favor of Wall Street.
3. Donald Trump, who explains how free trade punishes our workers and our economy.
4. Bernie Sanders, who tells us that Washington does not regulate Wall Street, but that Wall Street regulates Washington, and who has presented us with the first effective plan of attack on the evil of Clintonian neoliberalism. His agenda aims to subvert neoliberalism with the following: a living minimum wage of $15 an hour; free community college tuition; getting big money out of politics; higher taxes on the rich; breaking up the big banks; a financial transaction tax; Medicare for all; infrastructure spending that will create millions of good jobs; fighting climate change (the destruction of our planet being the final evil outcome of neoliberalism); and wrapping all this up in his call for a political revolution. Because that's the only thing that will topple neoliberalism: a revolution. In thought and in action. Clarity must upend banality.
(Some other chinks to be noted: Noam Chomsky, Naomi Klein, and the activists who waged the Battle of Seattle.)
9. Our Youth Sees Through The Banality of Neoliberalism
It's a pity that Bernie, the democratic socialist outsider, has not been able to hijack the Democratic Party the way Trump, the billionaire outsider, has hijacked the Republican Party. Unfortunately a neoliberal champion, Hillary Clinton, stands in Bernie's way, with the formidable backing of Wall Street, The New York Times, and the entire neoliberal establishment.
But at least Bernie's voice has been heard, and 80% of our voters under 30 are on his side.
Which means the future of neoliberalism is doomed. All that has to happen is for the old neoliberal Kool-Aid drinkers to die off. We could change the way Iran will when its old mullahs die off.
Unlike their elders, our educated young people — saddled with massive college debt, suffering through the Great Recession, unable to get the good jobs their education prepared them for — clearly see through the banality of neoliberalism and capitalism to the evil it has veiled. In fact, in a recent YouGov survey, folks under 30 rated socialism more favorably (43%) than capitalism (23%).
In another 15 years — when Republican gerrymandering has been corrected so Nancy Pelosi can preside over a productive Democratic Congress again, and when a liberal Supreme Court drives big money out of politics, and when President Elizabeth Warren enters the final year of her second term — America will be firmly set on the progressive course now presaged by Bernie's advent.
We will have capitalism with a human face.
And the banality of Clintonian neoliberalism will finally be a forgotten evil.
between mountains and the sea (山海间)
I was recently reading a book by the dreadful Robert Kaplan on the topic of China and the South China Sea, in which the author suggested that Chinese culture exists in one of its purest forms in Malaysia. He argued that only in the overseas Chinese communities that have continuously existed scattered around the Pacific Rim has Chinese civilization survived, uninfected by the tumultuous events of the Communist Revolution. Similarly, I have a friend, a political philosopher and expert in Chinese philosophy, who believes that it is in Japan and Korea where one can most easily find the artifacts of Chinese civilization--specifically Confucian philosophy. Japan is, after all, a place where a lot of cultural practices and material culture from China have been preserved. And not just from China: many Silk Road artifacts are preserved in Japan as well, for the country has long stood as a kind of terminus, lying at the end of the line in East Asia.
And speaking of Confucianism and the Communist Revolution, have you ever wondered why Confucian philosophy has such a bad name in the West? Largely unknown--except in its fortune-cookie format--if it is recognized at all, the tradition is rarely fully appreciated. This is partly because of its association with patriarchy and elitism--and this bad rap is something that was invented by the Chinese communists, who strongly discouraged Confucian thinking as being counter-productive to the egalitarian ideals of the revolution. (They were especially worried by its patriarchal stance toward women.)
Personally, I've always thought this a shame, as Chinese philosophy happens to be one of the world's oldest existing philosophies, one which has arguably impacted more human lives than any other philosophy--past or present. It stands as one of the world's greatest philosophical traditions, and it is also my own personal belief that Chinese philosophy--in particular Confucian philosophy--is, more than any other tradition, the most compelling for what it tells us about the Good Life.
I was, therefore, thrilled to see all the press that Michael Puett and Christine Gross-Loh's new book is receiving. The Path became an Amazon best-seller immediately after its publication and soon began seeing reviews in major papers, like The Guardian and the Wall Street Journal. Pretty impressive for a book on Chinese philosophy!
First, a word about the authors.
This is a true story. Several years ago a reader of these pages sent me an email with the subject line: "On the Far East, mindblowingly." The email contained a transcript of Michael Puett giving a talk in Korea about building a more enlightened state; namely, a kind of state that would be capable of better promoting human flourishing. I was very excited by this new friend's obvious excitement about Puett's ideas, as I knew that Michael Puett's lectures on Chinese philosophy at Harvard are so popular that they are now receiving attention from the general public as well. It is exciting indeed! The other author, Christine Gross-Loh, is even more exciting--if that is possible. A few years back Christine Gross-Loh wrote the best book on child rearing I have ever read, called Parenting Without Borders. By providing different ways of approaching parenting from other parts of the world, she challenges American parents to think critically about what we are seeing in the form of over-parenting and the infamous helicopter parent. Both authors, therefore, are working to bring in different styles of thinking in order to challenge people to think about "absolutely everything."
So what does Chinese philosophy have to say to us, then?
Well, what if I told you that Puett and Gross-Loh agree with the philosopher Charles Taylor in seeing some of the ills of our modern secular age as stemming from Calvinism? From our concept of ourselves as unique individuals on a path to uncover our greatest potential, to our distrust of ritual and organized religion ("spiritual but not religious"): these are all ideas that derive ultimately from the Protestant Reformation and John Calvin. This might surprise people who think that our modern secular understanding of the self is more firmly rooted in the Enlightenment and the scientific revolution (as I had)--but having read Taylor's A Secular Age, I am with Puett and Gross-Loh in rooting our modern secular understanding of the self in Calvin.
[For those who have not read Taylor's A Secular Age, don't lose any more time!]
But what does this have to do with the Good Life? Puett, as mentioned, made a name for himself with the students at Harvard for his incredibly exciting presentation of Chinese philosophy. He achieved this not by presenting its venerable history or the intricacies of its logic; rather, he excited students by using Chinese philosophy to stimulate them to think of their lives and their world in totally new ways. Sounds good, right?
Can you imagine a young person who has been raised to "follow their passions" and "be the best me you can be" listening to their famous professor tell them that being an authentic self is actually stressful; that authenticity is an illusion based on a flawed understanding of the self, since there is no such thing as a "true self"? What must they think when confronted with the idea that humans are multi-faceted "works in progress"? Rather than being tortured by decisions and elaborate plans for the future, imagine him standing there telling Harvard undergraduates they might be better off "going with the flow." And what could these young people make of the idea of the transformational power of ritual?
Stop trying to find yourself! Stop deciding! Don't try to impose your will on everything and whatever you do, stop taking your choices in life so seriously! Stop trying to define yourself all the time. And, most difficult perhaps of all: you must work to moderate your emotions. (Has there ever been a generation raised to so thoroughly indulge in dwelling and expressing their own emotions as this generation? I know, I am such an old lady...)
For all these reasons and more, this beautifully-written short book is a must-read.
And it is beautifully-written.
I guess my only question about the project concerns the limitations of presenting various concepts from a foreign tradition so totally out of context. Thinking of the Confucian commitment to the unity of thinking and ritual action as a form of knowing, Confucius suggests:
知者樂水 仁者樂山 知者動 仁者靜 知者樂 仁者壽
The wise delight in water while the virtuous delight in the mountains.
A wise person is active and enjoys change, while a virtuous person seeks serenity and enjoys long life.
That is to suggest that in an enlightened world, both wisdom and virtue are necessary. The good life is a byproduct of the attainment of both wisdom and virtue, but according to all the great philosophers mentioned in the book, from Confucius to Mencius and Xunzi to Zhuangzi and Laozi, a person is really a person-in-context. Think of the Chinese characters for person (人) and personhood (人間). Packed right into the characters themselves is the notion of inter-personhood and that "no man is an island." There is no self-encapsulated self; rather, the philosophy is rooted in the idea that our personhood is based in interconnectedness, or person-in-context. This communal and inter-connected aspect of personhood is not a by-product of sagehood but rather what makes personal cultivation possible. I think it is very hard for some Westerners to fully grasp Chinese and Indian philosophy because of this significantly different approach to understanding what a self is.
To wit, in The Path the authors end their book with a short meditation on the problems inherent to American-style Buddhism. According to them (and I completely agree), American forms of Buddhism, rather than diminishing the ego, instead serve to prop up or strengthen an individualistic understanding of personhood. I have a friend, a scholar of Sanskrit and Buddhist philosophy in Korea, who calls Buddhism as practiced in the US the Path of Ignorance. Because it is unmoored from its context, with each person cherry-picking the parts of the tradition that "work for them," it can be problematic and indeed, as the authors suggest, counter-productive. So I couldn't help but wonder what would save their book from a similar fate. That is: without the communal rituals and shared practices, how can any concepts from a foreign tradition avoid the fate of much American-style Buddhism?
I am not sure how to answer this, but I do appreciate that they offer ideas from the more natural notions of selfhood conceived in the philosophy. Maybe, much like William Irvine's A Guide to the Good Life: The Ancient Art of Stoic Joy (I loved that book!), The Path has tremendous riches to offer young people (and not so young people)--precisely because it begins with a challenge to Western notions of the Self. Does anyone not know a young person who, graduating from college, becomes petrified because they actually have no idea what to do next? The limitless choices they were promised are illusory at best, and at worst utterly box them in to the point of paralysis. Haven't you ever wanted to shout at someone, "the world is unpredictable and a person grows up by living 'as if,' not by seeking some kind of authenticity--just do something!"? Have you not found as you grow older that more and more you are at the whims of your emotions, and indeed that negative emotions are undermining any sense of equanimity and serenity that you had once upon a time? And what about the feeling that our lives have become more and more cut off from real-world experience--to the point that we feel enervated and non-receptive to new things? I personally feel that these are the most pressing problems I see in my own life--and maybe that's why I thought that this slender little book carried a very big punch.
Sughra Raza. Scaffolding. April, 2016.
Franz Wright's Poetry of Remission
by Evan Edwards
I have a copy of Franz Wright's Walking to Martha’s Vineyard on my bedside table. It has been there since my son was born last year. I’ve been trying to educate myself on contemporary poets for more than a year now. Wright was the one who happened to stick the most readily. I want my son to know about poetry; good, modern poetry that speaks to the vibrancy of the present. Of course, we’ll always read the classics, but I want him to also get an education in the words of those who aren’t yet dead, who are living and here and maybe coming to speak somewhere nearby at some point so that we can go together to hear a great poet speak in person and then walk out of the lecture hall feeling the brief surge of ecstasy you feel when you experience something extraordinary. Maybe it’s my obstinacy that drew me to him, or maybe it’s just the way that irony works, because of course Franz Wright is dead.
I first encountered Wright through the blog of a poet I met when I lived in Indianapolis. In an interview he gave there, I remember feeling overwhelmed by the way he spoke about his recent economic troubles. The way he hadn't been invited to speak or teach or fraternize (to be part of the brother/sisterhood of poets) since he'd made some admittedly snide and vicious remarks about MFA programs. How he was struggling with cancer. How he didn't have the means to keep up the struggle for much longer. He was in remission, and had a tenuous relationship with hope. The cancer would, eventually, come back and then end his life in May of last year.
The word remission comes up once in the interview, when he mentions that he has posted on Facebook that he is in remission, and that he can give talks and readings, if anyone wants to get in touch with him. There was something very tender and heartbreaking in that statement. Here is one of the greatest living poets, recipient of a fucking Pulitzer Prize in poetry, Guggenheim fellow, son of poetry royalty, subtlest and most brutal portrayer of spiritual suffering, reaching out for work through his personal social media page. The desperation of that. It seemed hauntingly appropriate to speak of remission in that moment.
From the hellish existence of chemotherapy, constant doctor visits, insomniac fear. From that half-life to the tentative steps back into the light of the life that everyone has been taking for granted since you were gone.
So when this man announces on the informal, the casual, and humble stage of a Facebook page that he is in remission, and available to do readings, one has to feel the full weight of that claim. He’s also looking to be included once again in the communal life of poetry that he’d been absent from for so long. To come into that company again, like a penitent in the church of art, seeking remission.
And in fact, Wright’s poetry is a poetry of remission. A constant struggle with sobriety, alcohol, addiction, faith. Cancer was just one more barrier between him and Life, one more debt to cancel. This feeling of being on the wrong end of a loan emanates from his poetry. His personal faith might be to blame/thank for this. To be in an eternal debt to Christ is unfathomable for me. In the first poem of Martha’s Vineyard we’re struck by this permeating sense of seeking remission, of desperate searching for Life.
I was standing
on a northern corner.
Moonlit winter clouds the color of the desperation of wolves.
Proof
of Your existence? There is nothing
but.
In this poem we’re presented with Wright himself, standing, presumably, on a street corner at night. One can imagine it’s an empty road. Who knows whether or not there is a streetlight. The sky is the color of desperation. The proof of the debt we owe to God, all around. And that line, “There is nothing/but.” The thought fits into the poem as a whole, but it also stands alone. There is nothing, meaning, we are alone and desperate addicts, cancer victims. Abandoned.
The poem “Baptism,” later in the collection, speaks to another aspect of this relationship. Here, we get the clearest example of a theme that ties the whole text together: fathers, Father, and sons, the Son. The complex relationship between James Wright, God, Franz, and Christ comes to a head here. Who is the ‘insane asshole?’ James? Or the sort of creator God who would subject us to so much suffering? And in what medium does this baptism take place? Is it in “alcohol, water, or light,” as he says earlier in the collection? It is entirely unclear. What do we owe to the ones who gave us Life? Earthly or otherwise. Here the genius of Wright breaks through with an impassioned luminosity. In his hands, the themes of Christianity are rendered universal. Who is your father? Mother? What do you owe them? What debt do they ask you to remit?
Do you have any children?
No, lucky for them.
Bad things happen when you get hands, dolphin.
If they’d stabbed me to death on the day I was born, it
would have been an act of mercy.
We don’t ask for our lives. They’re thrust upon us. And then we’re asked to remit that debt. To seek remission. Always, it seems. And to find the strength to bear that debt.
Wright found, from time to time, the courage to live up to the challenge.
You said, though my own heart condemn you
I do not condemn you.
Who is speaking here? Whose words are said? It is either Franz, speaking to his F/father, or his F/father speaking to him. But it is certainly a prayer, to find the strength to persist in the search for remission.
Franz Wright did not remain in remission, of course. He died on May 14, 2015, at his home in Waltham, Massachusetts, at the age of 62. As far as I know, he did not receive remission in the company of poets. Remission is not always granted, it seems.
‘Every symphony is a suicide postponed, true or false?’ he wrote in ‘Intake Interview.’ Or, every work of art is a way to come back to Life. To seek, to be in remission.
I read these poems to my son, who is just over five months now. My partner tells me not to read the line where he says that “if they’d stabbed me to death on the day I was born, it/would have been an act of mercy.” She’s probably right. Perhaps it’s just a reminder for me, a message I send back to myself from time to time. That this is the heaviest burden to have placed on another person. That to be in any relationship with another human being is at once to place a burden upon them and to release them from their burden. That love is a load and a crutch. And even though he is dead, to consider the work of Franz Wright a work of Life, a living work.
The closer I get to death, the more I love the earth, the thought
introduced itself as I sat shivering on my old park bench before
the dusk fog; as it has, I suppose, to every human being
who has ever lived
The Prescriptivist's Progress
by Ryan Ruby
This month, two minor controversies revived the specter of the "language wars" and reintroduced the literary internet to the distinction between prescriptivism and descriptivism. One began when Han Kang's novel The Vegetarian won the Man Booker Prize and readers took to their search engines en masse to look up the word "Kafkaesque," which had been used by the book's publishers and reviewers to describe it. Remarking upon the trend, Merriam-Webster noted sourly: "some argue that ‘Kafkaesque' is so overused that it's begun to lose its meaning." A few weeks before, Slate's Laura Miller had lodged a similar complaint about the abuse of the word "allegory." "An entire literary tradition is being forgotten," she warned, "because writers use the term allegory to mean, like, whatever they want."
When it comes to semantics, prescriptivists insist that precise rules ought to govern linguistic usage. Without such rules there would be no criteria by which to judge whether a word was being used correctly or incorrectly, and thus no way to fix its meaning. Descriptivists, by contrast, argue that a quick glance at the history of any natural language will show that, whether we like it or not, words are vague and usage changes over time. The meaning of a word is whatever a community of language users understands it to mean at any given moment. In both of the above cases, Merriam-Webster and Miller were flying the flag of prescriptivism, protesting the kind of semantic drift that results from the indiscriminate, over-frequent usages of a word, a drift that has no doubt been exacerbated thanks to the internet itself, which has increased the recorded usages of words and accelerated their circulation.
Since the trials of the word "Kafkaesque" have already received ample coverage (by Allison Flood writing for The Guardian and Jonathon Sturgeon writing at Flavorwire), I'd like to turn my attention instead to the uses and abuses of the word "allegory" as described by Miller. Most of the time Miller is not one to quibble with the way people use words. But a recent spate of film reviews—one claimed Batman vs. Superman was an allegory for the primary contest between Ted Cruz and Donald Trump, another said that Zootopia was an anti-Trump allegory, a third called Jafar Panahi's Taxi an allegory of artistic repression in Iran—caused her to draw a line in the sand. "What people usually mean when they call something an allegory today is that the fictional work in question can function as a metaphor for some real-world situation or event," Miller writes. But allegory "is not just another word for metaphor."
Because one good quibble deserves another, allow me to point out that this last assertion isn't entirely accurate. The offending examples Miller lists are indeed abuses of the term. The first two films were made before the political events they are supposed to allegorize; the third simply is about artistic repression in Iran. But this is not because allegory stands in no relation to metaphor, it's because these particular films stand in little to no relation to what the reviewers claim they are metaphors for. If Miller is normally a descriptivist, it's quite difficult to understand why she has chosen to make an exception in the case of allegory, which Angus Fletcher, in his definitive study of the term, calls "a protean device, omnipresent in Western Literature from the earliest time to the modern era."
Miller takes the features of the medieval literary genre to define its limits. Unlike more realistic fictions, the characters of medieval allegory are personified representations rather than representations of people. The protagonist of a typical medieval allegory, let's call him Everyman, journeys from Doomville to Blisstown, encountering, along the way, such embodied abstractions as Truth, Justice, and Sin who act and speak truthfully, justly, and sinfully, helping our hero reach his destination or tempting him away from the right path. Beginning "in the waning years of the Roman Empire"—presumably with Boethius' Consolation of Philosophy (c. 524)—Miller claims that allegory reaches its heights in works such as Guillaume de Lorris and Jean de Meun's Romance of the Rose (1275), Edmund Spenser's The Faerie Queene (1596) and John Bunyan's The Pilgrim's Progress (1678). Although she admits that the genre has largely been eclipsed by the realist novel, it lives on in the writing of C.S. Lewis, J.K. Rowling and Haruki Murakami, in the films of David Lynch and in the drawings of today's political cartoonists.
Unfortunately, this simplifies history to the point of falsification (and not just because The Divine Comedy does not figure into it). To fix a word's meaning, a prescriptivist should start with its etymology, lest her definition seem as cherry-picked as that of the descriptivists she criticizes. Allegory comes from the Greek words allos ("other") and agoreuein ("to speak openly"). Originally the word did not refer to a literary genre at all, but to a rhetorical mode. "In the simplest terms," Fletcher writes, "allegory says one thing and means another." Like irony, allegory exploits the natural polysemy of language. It's a kind of double talk that is especially useful under conditions of political censorship or in societies where blasphemy is a crime. Allegorical speech deploys figurative language to alert the hearer to the existence of a latent meaning beneath the manifest content of what is said. You would not be wrong to detect in agoreuein the word agora, the place where the Greeks came together to discuss politics. Nor would you be wrong to detect in Fletcher's paraphrase something akin to metaphor, which, to quote the prescriptivists at Merriam-Webster, is "an object, activity, or idea that is used as a symbol for something else." The English lexicographer Edward Phillips, writing in the same year as The Pilgrim's Progress was published, defined allegory as a kind of semantic "Inversion," derived from translatio, the Latin word for metaphor.
Allegory—"one of the foundations of Western literature"—is in fact much older than Miller suggests. The first known usage of the word can be found in the Moralia, a collection of essays by the Hellenist philosopher, biographer and literary critic Plutarch, who died in 125, four hundred years before The Consolation of Philosophy and over a millennium before The Romance of the Rose were written. According to Plutarch, the ancients called it hyponoiai ("under-thought" or "hidden ideas"). The most famous example from antiquity is of course the "strange image" in the seventh book of Plato's Republic (c. 380 B.C.). There, Socrates describes a society of imprisoned cave dwellers who take the shadows of things for the things themselves and relates what happens when one of them frees himself from his shackles and sees what the world beyond the cave is like. In what is variously known as the Analogy, Myth, Metaphor, or Allegory of the Cave, Socrates' story reveals itself to be a network of metaphors or symbols, wherein each element is meant to correspond to an element of reality as Plato sees it. Platonic allegory is a corpus symbolicum whose cells are metaphors. In so far as allegory and metaphor are different here, it is a difference of degree, not kind.
The same is true of allegorical reading. In Plutarch's time, allegorical exegesis of canonical texts, the Homeric epics above all, was a well-established critical practice, as philosophers demonstrated correspondences between the stories of Greek mythology and their own cosmological and ethical theories. In "How a Young Man Should Study Poetry," Plutarch instructed readers not to take the myths about the Gods in the Iliad and the Odyssey literally, but rather to interpret them as astronomical metaphors and symbolic prefigurations of Platonic ideas. Around the same time, a similar operation was being performed on the myths of Genesis by the philosopher Philo of Alexandria and by the early biographers of a parable-speaking preacher from Nazareth.
By focusing on medieval allegory, Miller takes a particular, historically situated usage of a word—albeit a well-known one—to stand in, synecdochically as it were, for a whole tradition of usage. The works Miller takes as emblematic of the form are actually deviations from and even inversions of this older tradition. The personages and places of these works are entirely literal; irony is absent from their narratives and metaphors are reified as proper names. When Lady Philosophy speaks to Boethius, or when Despair tempts Red Cross Knight with an argument about suicide, there's no need to wonder whether the author means anything other than what he says. All allegories alert their reader to the fact that they are allegories, but few do so as ham-handedly as Pilgrim's Progress. Nearly everything a reader needs to know about Bunyan's book can be found on its frontispiece (see above).
Bunyan turns the distinction between manifest and latent content inside out; then he dispenses with latent content altogether. In so doing he dispenses with the very feature that had distinguished the form for centuries (all the way back to the prophet Hosea in the 8th century B.C. if we are to take his word for it). The Pilgrim's Progress does not represent the form's culmination; it represents its decadence.
Miller is right to wonder if we are even capable of reading such books any more. Aside from children, who can still enjoy allegories as pure tales of adventure, contemporary readers are likely to prefer the round characters, psychological depth, moral ambiguity, and narrative complexities that are some of the hallmarks of the realist novel, which has been the dominant form of storytelling since the late eighteenth century. "Should a book or form present its argument so simply that even a child can discern it, what's left to talk about?" she asks. "Merely language, story, and imagery—all the pleasures that art is made of."
As a defense of allegory in the age of the novel, this is puzzling, to say the least. Having begun with an attempt to distinguish allegory from metaphor, Miller ends up arguing that pure formalism is the only way we can still appreciate the most didactic of all genres. The pleasures of language, story, and imagery were the very criteria by which Flaubert wanted his arch-realist "book about nothing" to be judged. For all the formal differences between a book like The Pilgrim's Progress and a book like Madame Bovary, the ideological literalism of medieval allegory is only a step away from the mimetic naturalism of the realist novel. In any event, stripping an allegory of its ideological framework in order to read it as "entertaining adventure yarn" isn't how the form stays relevant in the twenty-first century. It's how Dante's Inferno gets turned into a video game.
This reductio ad absurdum is the inevitable consequence of taking medieval allegory to exhaust the meaning of the term. More generally, it shows how a narrow definition of a word can be just as harmful to its meaning as overly broad usage of it. With a prescriptivist for a white knight, meaning hardly needs a dragon.
Grandpa, Proust, Ulysses and World War II
My paternal grandfather, Axel Benzon, was a Dane. He and his wife, Louise, immigrated to America early in the 20th Century. He was trained as an engineer, was educated in the classics, and took up photography and woodcarving. He ended his professional career as chief engineer of the main U.S. Post Office in Manhattan.
He kept a diary, the pages of which are generically entitled: “Leaves from my diary.” It’s not handwritten in one of those blank books one can buy at a stationery store; it’s typed on ordinary 8.5 by 11 paper. I’ve got a photocopy of much or most of it, but, judging by his index, not all.
In commemoration of this Memorial Day, May 31, 2016, I would like to share some passages from his diary, passages written just before the United States was drawn into the war. As you read these passages keep in mind that you are reading the reflections of a well-educated middle-class European who had immigrated to the United States.
But I want to approach the war obliquely. Let’s start with the best Western civilization has on offer. Here we have Grandpa commenting on Grandma’s interest in Proust (November 22, 1938):
Talking about books I think mama [his wife Louise] is on the way to become literary. She was interested in Anatole France some time ago and read some of his books, and now she is buried in Marcel Proust. Whether she is enjoying their language or their outpourings or both I do no know for she does not say much about it. Anatole France’s language is of course concise, clear and classically French and is therefore enjoyable …
… As to Proust it is said that the translation into English is so much better than the French edition that if it were retranslated into French it would be a much better book. The French language is not adapted to the outpourings of the quickly decaying spirit departing disillusioned from the splendor that was nothing less than a stinking dung heap as was the fate of Proust. He longed for what he thought was the highest he could think of on this earth; he found it and discovered it was rottenness. But just the same his description is more worth than Dos Passos’ description of the world as he found it in the twenties, to take an example.
Mama enjoys her reading more than she enjoys bringing up flowers or plants.
I just barely remember her. She died when I was quite young. Grandpa lived into my teens. I didn’t hear about Proust until I went to college, in 1969.
About a year later Grandpa fears for his homeland (14 April 1940):
Sunday and cloudy with occasionally a little snow–a good day to remain indoors and listen to the war news from Europe. These news are coming in frequently but are most confusing and it is difficult from the British and German dispatches to form a true picture about the situation in all parts of Norway.
The Danish goose is cooked–there the Germans are in possession of all parts and are now fortifying points of vantage, especially the northernmost part of Jutland from where they can dominate a great part of Skagerak and Kartegat. [The Skagerrak strait between the Jutland peninsula of Denmark and Norway and Sweden; the Kattegat sea leads to the Baltic.]
The invasion of Norway was a masterstroke, no matter how it turns out. It gave evidence of the usual German thoroughness and precision and coupled with the fact that the German navy is so much inferior to that of the English it has been most successful and must have taken the English by surprise.
As you can imagine, his reflections are much occupied by the war. But not entirely so. For example, he also talks of his fondness for the game of golf and playing it on public courses in New York City—he lived in Jackson Heights at the time. I rather imagine that THAT land has long since been given over to building of one sort or another. In fact, at one point he mentions exactly that.
At one point he has copied one of his letters into his diary. He’d written the letter to one of his daughters, Karen, who apparently was visiting Denmark at the time, the time when the Germans entered the country. This entry is dated April 20, 1940, just a week after the previous entry:
You are affected by the insensate sacrifice to the voracious Moloch of the flower of youth driven to the slaughter by monsters whose greed can never be sated and by the senseless destruction of the fruits of toilers whose only earthly desire is to be permitted peacefully to toil as long as they can labor.
How are your aunts and cousins in Denmark, and how is aunt Kate in Oslo with her two boys? Pity for they have toiled and suffered for many years until lately they all felt reasonably secure to enjoy the fruits of their labor. Well–I often ask myself this question–and we cannot help, cannot even communicate with them.
But let these blows not deprive you of the desire to continue your own life as happily and peacefully as you are privileged to do. You are born in an age different from that in which your parents were born and in a country different from the little pastoral Denmark. The premature invasion into your time of an unbridled science which as a colt breathlessly has galloped over your era will in time be curbed and led into the field of anthropology where it will either destroy or make useful the parasitic growth that now is the cause of our folly and inhumanity.
Notice the contrast between “pastoral Denmark” and “unbridled science.” I wonder just exactly what was on his mind there for, as an engineer, he was himself a man of science. I wonder what he would have thought about the “shock and awe” of the 2003 American invasion of Iraq or of the drones so beloved by our first black president, the one who received a Nobel Prize for Peace, and who is also the first sitting president to have visited Hiroshima?
In a letter of May 19, 1940, Grandpa writes about one of his fellow expatriate Danes:
Some time ago when Bang from Baltimore visited us he lamented about the poor condition in which Denmark was situated with respect to defend herself against an aggressor. The Finnish war was on at that time and we were filled with reports about the bravery of the Finns. The Danes could do as well and it would be better to go down in glory than to give in without a fight.
Poor Bang, he still lives in a world of illusion. He did not see that the news we received from Finland were all highly colored and that Finland was doomed. And still, he wanted Denmark to defend herself from German invasion.
I wonder what he’d think about The New York Times reporting on Iraq, or Afghanistan, or Syria. Would he think that the mainstream media now has been as lost in illusion as his countryman Bang had been back then?
And yet the mail must go on (June 8, 1940):
With all this misery in Europe things are quiet at the Post Office. Mail is not heavy and we can take our vacations knowing that there are hard times ahead of us so far as money is concerned. We must be glad if our salaries are not cut, for that in addition to increased taxation will be hard to bear.
He was waiting for the war to get worse. What are we now waiting for? What are the chances that the undeclared war on terror will end before the Statue of Liberty is claimed by the rising sea? I’m pretty sure that Grandfather would have had little trouble accepting the data on climate change.
War brings immigrants, a fact which is painfully and tragically evident these days (August 14, 1940):
At the Post office we are preparing for registering the aliens. This gives me more work for we have to build a number of typewrite desks and other things that have to be used. We do much work in the Post Office other than handling mail.
I don’t know what Grandpa would have thought about all the Arab immigrants who’ve been fleeing to Europe these days.
Here he alludes to the northwestern corner of the Roman Empire:
Incidentally I listened to [H.G.] Wells the other day over the radio and was shocked to hear how feeble was his voice–hardly distinguishable–but the old radical spirit was there undaunted–he really sounded as were he speaking from one of the many and deep shell holes dug by the barbaric German bombers in the relics from the old Londinium.
There's that late 19th Century education for you, and he was educated as an engineer, not a preacher or a diplomat.
But it’s not all war. Here Grandpa talks about more mundane matters (September 8, 1941):
From Billy we finally got words today. They have moved and are now settled in the town [Johnstown, PA]. It was not all good news in his letter for Betty’s mother is bedridden with a bad heart and his former landlady presumably has cancer.
He has further more lost his nice golf clubs–they were mislaid by a caddie in a wrong automobile when he went in for a drink and now after ten days he has not gotten them yet. That is a serious loss and I sympathize with him for he had a very good set of clubs.
Billy was my father and Betty, his wife, was my mother.
Grandfather goes on to report on a book he’s been reading, The Managerial Revolution by Burnham (whoever he was), that offers “another alternative to capitalism than socialism namely the ruling of the country by a new class of managers.” He says a bit more about the book and then: “I agree with him in most of his points, but if that is not socialism as I understand it then I do not know what it is. Socialism as he defines it is the Utopia which, if we should try to establish it now would be anarchism and chaos.” I don’t think Grandfather had much objection to socialism, though I rather suspect he wouldn’t think too much of the financial managers who run the world these days. If he were alive today, would he feel the Bern?
At last, as he continues talking about his reading, he closes with this:
Also a book by Frank Buck, the animal dealer and a novel by Storm, Count Ten, which I should read at least twice in order to understand it properly. The style is somewhat like that of Ulysses and it deals with a man who does not know what he should do but tries his utmost to live a life of decency wherein he can retain his self-respect.
Never heard of this (Hans Otto) Storm or his novel, but the Internet of course has. He was a Stanford-educated engineer; Count Ten was his third novel. Edmund Wilson thought it inferior to Storm’s previous two, Pity the Tyrant and Made in U.S.A., but found material of interest in it:
Implausible though a good deal of it is, it evidently makes use of actual experience; and the experience of Hans Otto Storm has been of a kind rather unusual among our fiction-writers. In the first place, Mr. Storm, though a radical, is not, like so many other novelists, a radical of the depression vintage. He is–one gathers from Count Ten–the descendant of German refugees of the Revolution of 1848 settled in Southern California. The hero of his novel, at any rate, begins by going to jail for resisting the draft in the last war and ends by going to jail again as the result of his activities as campaign manager for a movement evidently drawn from Upton Sinclair’s EPIC. He has, in the meantime, had a successful career as an agent of the mining interests.
Commenting on the fact that Storm is not a writer by vocation, but an engineer, Wilson observes:
An engineer who thus goes in for literature is such a novelty that Hans Otto Storm is able to carry us with him because we have never listened to precisely his story before. His writing about the sea–in Made in U.S.A. and in the episode of the yacht in Count Ten–without the parade of technical knowledge which is the betrayal of the layman in Kipling, gives us a much more intimate sense of living the life of the ship than we get from The Ship That Found Herself or The Devil and the Deep Sea.
But this is a digression. It wasn’t Grandpa’s reference to a forgotten book by a forgotten writer that caught my eye. It was his reference to Ulysses, a celebrated book by a celebrated writer, though a book that, in my experience, is mostly read by college students and their teachers. And yet there it is, in Grandpa’s diary, mentioned as though any well-read person would know it.
That’s what was on Grandpa’s mind on September 8, 1941, my father’s lost golf clubs and a forgotten book in the style of Ulysses. Two months later, on December 7, 1941, here is what’s on Grandpa’s mind:
It is cold today on this Sunday but the wires or rather the air is hot with reports about the attack of the Japanese air forces upon Hawaii this morning when five civilians and apparently three hundred fifty soldiers were killed. It is also reported that a large battleship was set afire and two others sunk …
The Dutch East Indies and the republic of Costa Rica have declared war on Japan.
10 pm. Canada has declared war on Japan.
He must have been typing while listening to the radio. A day later, December 8, 1941, America too declared war on Japan.
by Brooks Riley
Current Genres of Fate: Fate's Epic Side
—because despite being enlightened, civilized, advanced, and free, we are trapped—
by Paul North
In the 1930s a Hungarian psychiatrist, Leopold Szondi, began to think that families predetermine the lives of their members, before he was deported to Bergen-Belsen because his family was Jewish. Through a special negotiation he and other intellectuals were released and sent into exile. Szondi settled in Switzerland, where he worked the rest of his long life on tests and treatments for Genotropism, the name he gave to this curse on families. Members of a family share, he thought, a narrow set of psychological tendencies that are transmitted across generations. Who you choose as a life partner, what kind of career you end up practicing, even how much money you make are all determined up to a point by a ‘familial unconscious.'
The familial unconscious contains drives and needs specific to the family and gives them their desires, their limits, their fate. Now, although Szondi wanted to release individuals from the family's unconscious predeterminations, and he invented a therapy to do so, the principle that underpinned his therapy was itself a fateful idea. Instead of staying limited by family traits, he wanted you to learn that: "Wahl macht Schicksal" — "Choice makes fate." With this principle, Szondi hoped to break through the walls of his patients' familial unconscious. What if he succeeded? Well, through this principle he also locked patients into a new idea of destiny. Fate may not pre-determine you, but it does determine you. The way it determines you now is not necessarily better, only different. Now your fate happens to you choice by choice.
Let us imagine that there is a history for the idea of fate. It is a fiction or a semi-fiction, but that doesn't matter. It will help us to see a pattern. The first stage of the history is ancient, even archaic. We see Greek and Roman worry about fate all over epic poetry and stoic philosophy. In monotheisms, however, and especially in Christianity, fate takes a back seat to a different kind of story, where what happens at the end of time cannot be pre-judged by humans. At the end of all things, whether it comes as a last judgment or a gift of grace, a human-looking God will be there, making all the final decisions.
The philosophical essayist Odo Marquard, who first sketched out this historical tale about fate—the fate of fate, he called it—was right: the weightiest things in life, which used to be completely out of our hands (threads were held by "the fates," judgments were made by God) at some point were put directly into our hands. After the great monotheisms (this is fiction too: we know they have not ended), everything, Marquard wrote in 1981, comes to be seen as made by human beings, including the highest things, like God, history, and truth. He notes that the expansive new human power of making did not actually put an end to the fate idea. Just because we began to think of ourselves as in charge, as making all things, including our own history, our ideas and ideals, this did not mean that we were free—on the contrary.
As Szondi had already recognized in the 1930s, fate was put into our hands too. In the great making era, making makes—not freedom—but destiny. It may take some time to come about, it may shift, its force may accumulate slowly, but it comes to be, in the end, as ineluctable as ever. We make our fate through our choices. Absolutely: fate has changed. There may no longer be a preexisting plan, but there is still a single way things will turn out, and this way is coded into all our individual actions.
Let us call this genre of fate "epic," although we will have to paint the term with some new colors, to see how it has changed over the millennia. It is not uncommon, in Europe and its satellites, especially in the U.S., to feel that we live in a world without an overall plan, in a human life that is long and full of forking paths, overflowing with experiences, encounters with friends, allies, enemies, plagued by journeys, personal and public battles, upturns and downturns, not to mention day to day banalities. What's more, we are confounded by conflicting accounts of our agency—we make democracy work/ we can do nothing to change the system. For us, fate has to take a different shape. A split-second decision, a blink of an eye, a wrong turn is an index of the whole, each small event a single shard of the large urn of destiny. The urn, we presume, must be reconstructed out of these shards. Fate lies then in the way each piece fits together with another, and then another.
A great friend of this, the epic side of fate, is the movies. The sense that the staccato scenes will eventually merge together, that the cuts will add up to something great and continuous, the feeling of loose threads gradually being woven up into a tight fabric—this is the stuff of film art. Film borrows this procedure from novels, which carried the epic side of fate through the 19th century. A novel implies, before it is even picked up and read, that the story has already been completed and thus each event, each page, is a symbol of the whole. A reader enters the novel ignorant about everything except for one fact: that the story will take its pre-written course. Likewise, no matter how wild the story, the course of a movie can be felt in each frame, each episode. It doesn't really matter if the story is complete. The smallest details carry an extra glow, shining indications of the invisible unity.
A minor genre of movies has been taking shape since the late 1970s, a new and improved epic genre with an updated, modernized, though perhaps not all that modern, pattern of total determination. These movies have taken epic fate to an extreme. They do not even have a "story." The more random, the more improbable the characters, dialogue, and happenings, the more the power of the fateful movement is at work. At the beginning of Robert Altman's 1975 Nashville (or Short Cuts, produced almost 20 years later), you have no idea that the disparate characters and situations will turn out to be intricately interrelated. In the end you do. Through a series of coincidences, a musician carrying a fiddle case finds the perfect opportunity to open it and take out a gun. And you recognize retrospectively that fate had been accomplishing its handiwork all along. That feeling of order, unity, and careful construction that accompanies a movement from lack to satisfaction—the feeling we long to get from art works—is denied you in these films. The unknown is immensely pleasurable of course, as are incongruity, indefiniteness, and surprise. But nothing can be more pleasurable, I think, than the moment when these unstable things that cannot possibly be subsumed under any higher order are in fact subsumed under a higher order, a rule, a set of connections, a pattern to which they all belong.
Given the nature of the 20th century, it is not surprising how the pattern finally emerges. The various details that seemed so unassimilable are shown to be fundamentally connected in a violent scene. In Nashville there is an assassination. Short Cuts ends in suicide and an earthquake. Even nature expresses the network of human fate. Magnolia, Paul Thomas Anderson's film from 1999, is obviously indebted to Altman's style of filmmaking. This is the most allegorical of these films. It is a full return to Szondi's theory. You might say that its motto is "everyone is related." Magnolia closes with a whole series of violent events—a suicide attempt, an ambulance crash, teeth smashed in, a death—but none of these is the final coup de destin. That happens when frogs fall from the sky, in a sort of backward biblical plague. It is as if to announce out loud: the improbable is the new fate. We might think the ‘butterfly effect' is a chance beginning to an enchained series of events. In this finale we learn that the butterfly's wing is the tip of a great iceberg, a hidden system where every little thing is dependent on everything else. Improbabilities are not isolated events. They reveal their enmeshment in a secret system. And every seeming accident throws another stick onto the bonfire of the final, violent eruption of the hidden matrix.
In the Homeric poem the Iliad, the Trojan hero Hector tells his wife:
"And fate? No one alive has ever escaped it,
neither brave man nor coward, I tell you—
it's born with us the day that we are born."
(from Robert Fagles' translation)
What is born with all human beings in this archaic ethos is death, personal death. Fate means to Hector only the when, where, and how of his own end. And that means that, in the period before his death, he is more or less free. The difference between this fate idea and Szondi's "Wahl macht Schicksal" is huge. Fate in the Homeric poem (moira) is quite different from progressive entrapment by the limitations placed on us by our choices and by the accidents that befall us. In the Homeric poem, a human being is substantially free until the final appointment with death. My friend Francesco Casetti calls these end-of-20th-century films "informal epics." In them, characters are progressively trapped by their own freedom, so to speak. Their choices and chances add to their ultimate immobilization.
We can't say that this latter day epic idea rules our lives, or even that we believe it to be in force most of the time. But we can appeal to this idea whenever we like. And we appeal to it, I would imagine, when we most need to tell ourselves that, despite appearances, frighteningly unconnected things actually belong together.
Leopold Szondi should be given a posthumous prize. He should get it for describing the new form of fate—"Wahl macht Schicksal." We see this form taking hold in many places—not just in Hollywood's imagination. We see it in conspiracy theories (here the matrix that explains random events is the government, or aliens), and in their converse, in governmental theories of terrorism (unseen underground networks, distributions of cells issuing in apparently random events, and random events that point back to the hidden network). We see it in the "environment," a condition made by our inattentions that totally surrounds and encompasses us, which soon will return our neglect with violent interest. Experimental natural science shares the idea too to some extent. Experiments become data points, and statistics derives from them an underlying pattern, which we call nature.
Strained Analogies Between Recently Released Films and Current Events: Angry Birds and Angry Voters
by Matt McKenna
Angry Birds is one of those generic children’s films that incorporates already popular intellectual property to mitigate the risk of losing money. The logic is that kids might skip a boring film about madcap animated animals, but if these madcap animated animals are the same ones with whom the children already have an established connection through video games, toys, and school supplies, the terribleness of the film won’t impact revenue. It works too: Angry Birds, a movie based on a smartphone video game franchise, has already made $164 million at the box office worldwide. I don’t mean this as a knock on children’s taste in films--the same risk reduction strategy applies to grown-up films as well. I bring up the Angry Birds intellectual property only because the “angry” in Angry Birds reflects the “angry” in America’s current political zeitgeist. So while children aren’t allowed to vote in the upcoming 2016 election, Hollywood is still able to provide them with an alternative entertainment option that promotes anger as the most responsible reaction to current events.
Angry Birds’ protagonist is Red, a bird who isolates himself from his community by being a pugnacious jerk. While the other adult birds are nauseatingly nice, Red is sociopathic: in one of his first scenes, Red assaults a father and smashes the father’s egg to cause a premature birth of the bird within (the baby bird survives, thank goodness). Of course, Red’s nasty disposition is eventually validated when the island is invaded by deceitful pigs who claim to be friendly but wind up stealing the birds’ eggs. Because Red had previously warned his fellow birds that the visitors were up to no good, he is subsequently chosen to lead this once wimpy flock to battle against the duplicitous pigs. By the end, Red defeats the pigs, saves the stolen eggs, and the birds who used to look down on Red now sing songs about how angry and valiant he is.
If Angry Birds weren't so boring, its horrifying moral would be the film’s primary attribute. Reinforcing the current “you're either with us or against us” political climate in America, Angry Birds’ moral hinges on the idea that kindness is for the weak, and aggression is the only way to avoid looking like a sap.
I realize Angry Birds is a kids' film so nuance isn't the main objective, but protagonists in kids’ films usually learn something about themselves by the end of the story. However, instead of having the film’s plot enlighten the main character, Angry Birds takes a different tack and uses its plot to reaffirm the main character’s initial feelings. Indeed, at the end of Angry Birds, Red hasn’t learned anything about himself--it is society that has been enlightened by Red.
I should address the obvious “red herring” about Red’s color. Though his color could easily be misinterpreted as representing Republicans in the “red state” sense, Democrats don't get off easy in this film either. Sure, Red was living outside of town and could be thought of as a rural bird who would likely vote Republican. But by the end of the film, Red moves back into the city, joining the throngs of other birds where he would most likely vote Democrat. Therefore, if the film is saying anything about Republicans and Democrats specifically, it's that the anger from one party is always mirrored by the anger in the other party. Despite his red feathers, Red doesn’t represent one particular political ideology, but rather he embodies the attitude all political ideologies possess: anger.
It may seem strange for a children’s film like Angry Birds to advocate rage as a problem solving tool, but it makes sense when considering the culture that spawned it. A CNN poll from last year showed that 69% of Americans are either “very angry” or “somewhat angry” about “the way things are going” in the United States. So perhaps it shouldn’t be surprising for our movies to condone the anger we already feel. Unfortunately for us in the United States, while Angry Birds has a happy ending in which the birds rally around Red’s point of view, there is no hope for such consensus in reality--it’s not as if Democrat voters will rally behind a Republican president or vice versa. Alas, maybe someday our attitudes towards elections will shift and we can look forward to a video game franchise called Reasonable Birds in which even-keeled birds solve their differences through thoughtful discussion.
by Akim Reinhardt
Hotter. I need it to be hotter.
I'm sitting in the backyard of my sister's carriage house apartment in Orange, California, a circle of jolly boutique and micro brew quaintness amid the sprawling shit hole that is Orange County.
Of course nowadays, most any place in America afflicted by people is a shit hole. Indeed, even a quotient of the unpopulated spaces is beginning to emit a fecal stench, as if the human foulness emanating from the peopled portions of our nation is so strong as to waft and stain everything around it, like a halo of shimmering, homo sapiens stank.
I want it to be hotter.
After all, there are no more distinct places in the United States, or precious few at any rate. Instead, there are just types. The urban playground loaded with bars and restaurants, and kickball and skeeball leagues for childless 20- and 30-somethings; the poor and working class black and brown food deserts that gird the yuppies and empty nesters; the little towns hemorrhaging people, stragglers holding onto the local bar like shipwreck survivors grasping a buoy in the ocean; the increasingly opulent college towns full of precious students, microcosmic training yards for the urban playgrounds; the tourist spots offering up overpriced drinks and glossy nostalgia; all of it bound together by highways, those endless concourses of fast food, gasoline, and the occasional pile of roadkill.
But all of those types are just islands scattered about the uber-type, that oceanic wasteland of suburbia and its relentless waves of roads, strip malls, and tract housing, repeating itself over and over again like the backdrop of a cheap 1970s cartoon where a boring bipedal cat, arms outstretched, chases a smarmy little mouse who's certainly got it coming, but predictably manages to perpetually escape the fanged horror it deserves, thus prolonging the crankshaft repetition of house tree fence; house tree fence; house tree fence . . .
And all of it, every last bit of it, shot through with shitty chain outlets. Your uppers, your downers, your food in wrappers and boxes, your slave labor clothing, your mega stores, your tech shacks, and your money huts, all of them speckling the landscape like aggressive tumors mindlessly devouring their host.
No more places. Just types.
And now I'm in this type. The southern California backyard, walled off from everything but the murderous sun, several blocks from the bubbling dot of a used-to-be-an-actual-town-center-but-is-now-a-bourgeois-simulacrum-of-a-town-in-the-form-of-antique shops-and-almost-interesting-food, itself a lonely island amid the yawning expanse of ubiquitous sprawl.
And I'm wishing it were hotter.
When I'm in southern California, I prefer to do my writing outside, half-naked and sweating onto a laptop. There's something about those cinder block privacy walls and the endless, arid sunshine that puts an even cruder bent to my degeneracy than I'm apt to feel elsewhere. Nothing matters here. That's what everyone strives very hard to convince themselves of.
Truth be told, they're more neurotic than a bespectacled Upper West Sider stumbling out of a therapy session. But their biggest neurosis of all is the gut wrenching need to believe they're not neurotic. So they wear flip flops and self-medicate with weed or wine if they're not partial to pills, and vaguely intimate that the official street food of the West Coast, the burrito, is inherently more relaxing than the official street food of the East Coast, the slice and/or the hotdog.
They try so hard to not give a shit. But they're failing miserably, and deep down they know it, which is why they shudder at the sight of my wiry salt and pepper maw. Yet making them twitch isn't as much fun as it used to be, so I don naught but a pair of stained gym shorts, retreat to the walled off yard, bang on the keyboard, and occasionally pee on the fig tree.
If there's anything to care about in Orange County, it's the doughnuts. I'd say the Mexican food, but there are a lot of places you can get good Mexican food. However, the man tells me there are nearly 300 independent doughnut shops in this wide eyed paean to sunshine and orange juice. Why they insist on spelling it "donut" is beyond me, but either way, fried dough is OC's saving grace. A great doughnut can revive the soul. Hell, a merely good one is enough to ward off genital warts.
From here, I head north to the Bay Area. For a long time, San Francisco was a unique spot on the map. I remember Johnny Carson making late-night fag jokes about the place back when most Americans thought "a little light in the loafers" was an inherently funny phrase. Then again, they also thought the Village People were just some theatrical young men. If ignorance is bliss, then innocence is the white, faux-suede gloves we use to hide the blood on our hands.
Before it was a gay Mecca, San Francisco helped invent the hippie subculture. Some nice things came out of that. "White Rabbit" is a helluva song, and while no one wants to admit it, those patchouli-reeking bastards were right: deodorant will kill you in the end.
Then again, Raoul Duke probably hit it flush when he deemed that whole scene a failure: just another orgiastic Baby Boomer sideshow that disavowed both politics and serious art, while drugs became the goal instead of the pathway. Drifting potheads morphed into homeless junkies; from naive and directionless to mean and chintzy. All of it self-absorbed.
It's not enough to turn me into a reactionary conservative, but between you and me, I didn't really give a shit when Nicholson got stomped in Easy Rider.
Before the hippies, Frisco (a name the natives detest, which is why I use it) was a crazy patchwork quilt of misfits and castoffs. The Italians, the Chinese, and various other tightrope walkers balanced themselves along the fine line and managed to cobble together a vibrant urban space despite the race riots and lynchings.
Go back far enough and the place was a 3-2-1 liftoff spot for the genocide of California's Indigenous peoples. That level of evil, it marks you. Sets you apart as, if not unique, then goddamn special in ways too wrong to remember, which is why most Americans live in daily denial.
But that was a long time ago, before the hoary dot com bubble bloated and burst like an inflamed corpuscle. Of course that wasn't the end of it; the pus oozed and the infection spread. During the last two decades, Silicon Valley has reshaped the entire region by flooding it with the kind of callow money that makes the con game shoot all the angles until every loser thinks they're a winner and every winner is an insufferable boor.
Not all money's created equal. Don't believe me? Wait till the day comes when they throw your filthy lucre back in your face like a zoo ape flinging feces at the plexiglass.
Either way, the bottom line for the City by the Bay is the same as everywhere else. Its vast metroplex is just another melange of types, from the world-class playground in the middle, to the archetypal preciousness of Berkeley, to the Oakland food deserts shrinking in the face of gentrification, and finally the aching morass of suburbia surrounding it all.
We'll stay for two days. Maybe I'll catch a ball game. Maybe I'll blow my brains out and file next month's venomous screed from the grave. 3ZombiesDaily, motherfucker.
Living or dead, after the Bay I'll make my way to Reno, Nevada. The Biggest Little City in the World, they like to call it. I guess that's because they still got trains hauling silver from somewhere to somewhere else passing through downtown and blowing their horns in the middle of the night. But the hustlers and whores are mostly gone, the 24-hour chili dog was never that good, and the usual creep has crept through the place just like every other place. So to hell with it. One night at a locals casino, room courtesy of a local friend with points up the wazoo, and then on to the great adventure across a continent.
We'll head east and follow a tendril of highway out to the dry void, that grand expanse of the West which, unlike Phoenix, SoCal and Vegas, isn't raping the environment for hundreds of miles around in the quest for water so they can turn the desert into suburbia.
Somewhere in Utah we aim to find the remains of a WWII Japanese-American internment camp. A rotting reminder that while it's all the same now, being a special little snowflake wasn't always a good thing.
Afterwards we'll trek on, with stops in eastern Wyoming, eastern Nebraska, and whichever god-forsaken Midwestern motel we collapse in before finally returning to Baltimore.
It's good to return to Baltimore. Baltimore knows what it is and what it ain't. And while the is can sometimes leave you wanting, at least the ain't is honest.
Akim Reinhardt's website is ThePublicProfessor.com
Sunday, May 29, 2016
Charles Yu in The New Yorker:
The man lived in a one-bedroom efficiency cottage all by himself, in a sort of dicey part of town. One day, the man woke up and realized that this was pretty much it for him. It wasn’t terrible. But it wasn’t great, either. And not likely to improve. The man was smart enough to realize this, yet not quite smart enough to do anything about it. He lived out the rest of his days and eventually died. The end. Happy now?
The man could see that his therapist was not amused.
A rather unsatisfactory ending, the therapist opined, and suggested that the man could do better. The man thought, Is she really serious about this? But he didn’t say anything out loud. The man was not convinced that he needed to be talking to the therapist at all, but he had tried so many other things (potions, spells, witches), and spent so much of his copper and silver, with absolutely nothing to show for it, that he figured why the hell not.
So how do I do this? he asked.
Why don’t you start again? the therapist replied. And, instead of rushing to the end, try to focus on the details.
O.K., the man said.
A Shocking Find In a Neanderthal Cave In France
Ed Yong in The Atlantic:
The cave sits in France’s scenic Aveyron Valley, but its entrance had long been sealed by an ancient rockslide. Kowalsczewski’s father had detected faint wisps of air emerging from the scree, and the boy spent three years clearing away the rubble. He eventually dug out a tight, thirty-meter-long passage that the thinnest members of the local caving club could squeeze through. They found themselves in a large, roomy corridor. There were animal bones and signs of bear activity, but nothing recent. The floor was pockmarked with pools of water. The walls were punctuated by stalactites (the ones that hang down) and stalagmites (the ones that stick up).
Some 336 meters into the cave, the caver stumbled across something extraordinary—a vast chamber where several stalagmites had been deliberately broken. Most of the 400 pieces had been arranged into two rings—a large one between 4 and 7 meters across, and a smaller one just 2 meters wide. Others had been propped up against these donuts. Yet others had been stacked into four piles. Traces of fire were everywhere, and there was a mass of burnt bones.