Monday, August 25, 2014
by Gerald Dworkin
Many of the quotations that appear in my e-reader Philosophy: A Commonplace Book are one-liners:
There are many ways in which the thing I am trying in vain to say may be tried in vain to be said. —Beckett
Better latent than never. —Morgenbesser
Philosophy is to the real world as masturbation is to sex. —Marx
But often the difference between a one-liner and a many-liner is arbitrary. Wilde's "I do all the talking. It saves time and prevents arguments" could have a semicolon instead of a period and be a one-liner. Thus Shaw: One sees things as they are and asks why; another dreams things that never were, and asks why not. Or, as Nabokov put it: The difference between a therapist and the rapist is a matter of distance.
1) a pithy observation
2) a terse saying
3) a short phrase
4) a brief statement
5) a concise statement
6) a laconic expression
It was Nietzsche's aim "to say in ten sentences what everyone else says in a book—what everyone else does not say in a book." But he was also brilliant at much shorter length. "All truth is simple…is that not doubly a lie?"
Obviously this pithetic character is at best a necessary condition but not a sufficient one. "Today is Monday" is pithy enough but lacks a certain je ne sais quoi. Again, definitions try to supply the missing ingredient in different ways: "embodying a general truth," "makes a statement of wisdom," "astute observation." The first of these seems, to us now, too weak. "Objects fall when unsupported" is both pithy and a general truth. But Bacon titled one of his books on the nature of science Aphorisms Concerning the Interpretation of Nature.
For what it is worth, the word derives from the Greek aphorismos, meaning definition. And indeed, historically many aphorisms took the form of definitions; Ambrose Bierce's Devil's Dictionary is a prime example. ACADEMY. Originally a grove in which philosophers sought a meaning in nature; now a school in which naturals seek a meaning in philosophy. PHILOSOPHY. A route of many roads leading from nowhere to nothing.
All of this is by way of introducing the reader to a particularly clever and astute practitioner of that current system of aphoristic communication known as the TWEET. Now those inclined to resist all things contemporary may object that any message that may be as large as 140 characters cannot be an aphorism. (Joke interruption: I was asked the other day to supply a password with at least eight characters. I decided upon Snow White and the Seven Dwarfs.) And those truly hostile may note that the definition of Tweet given first in all dictionaries is "A weak chirping sound, as of a young or small bird." But the tweets I am bringing to your attention are both short and clever.
by Tasneem Zehra Husain
I'm a total pushover when it comes to stories of connection. I am delighted by accounts of barriers breaking down and disparate people uniting in purpose, of ideas coalescing and theories fusing to reveal the common threads that underlie diversity. As I look back upon the history of physics, what reaches out and grabs me are the moments of unification when strands long thought separate are suddenly braided together in a whole that is stronger and more beautiful than the sum of its parts. Sometimes we uncover hidden affinities by exploring a motif repeated in apparently unrelated contexts; at other times, we are compelled by circumstance to form alliances with those we may have neglected, to put our heads together and come up with a solution acceptable to all. The conundrum of dark matter falls solidly in the latter category.
For several decades, cosmologists and astronomers had been growing progressively distant from their particle physics colleagues. As one group craned their necks further out into uncharted space, the other crawled deeper into the recesses of the atom. The disciplines began to seem as divergent as the scales upon which they operate, but there is a surprising resonance between the minute and the colossal. Even objects of cosmic proportions are built from subatomic particles. The discovery of dark matter was a reminder that no part of the universe can be completely understood by those who turn their backs on the rest.
Discussions of dark matter (and dark energy) are often front-ended by a startling admission of ignorance: the entire gamut of matter particles we conventionally study - quarks and leptons combined - forms less than 5% of the known universe. There is about five times as much dark matter out there, we are told, while the rest of the universe is made up of dark energy. But, since neither dark matter nor dark energy can be seen, how do scientists justify this shocking claim? An analogy might help. The mechanism of human vision is such that we see objects only when they reflect light. But if you find yourself in a pitch dark room, you don't immediately conclude that just because nothing is visible, the room must be empty. You simply realize that sight is no longer a reliable guide under these circumstances, and you must lean on sounds and smells, and touch (and taste?) to probe your surroundings. For lifeforms less dependent on vision, the darkness is multi-textured and alive with variety. Consider bats, for instance. Where we rely on light hitting objects and bouncing back, bats bank on sound. They emit high frequency calls, inaudible to human ears, and use the resulting echoes to construct a sonic map of their surroundings (the further an object is, the longer it takes for the echo to come back). The moral of the story is this: as long as there is a way for you to interact with an object, you can "sense" its presence.
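The echo-ranging arithmetic the bats rely on is simple enough to sketch in a few lines of code (a toy illustration of the parenthetical point above, not anything from the essay itself; the function name and the round-number speed of sound are my own assumptions):

```python
# A bat's call travels out to an object and back, so the one-way
# distance is (speed of sound x round-trip echo delay) / 2.

SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 C

def distance_from_echo(delay_seconds):
    """Estimate how far away an object is from the round-trip echo delay."""
    return SPEED_OF_SOUND * delay_seconds / 2

# The farther the object, the longer the echo takes to come back:
print(distance_from_echo(0.01))  # a 10 ms echo: object ~1.7 m away
print(distance_from_echo(0.10))  # a 100 ms echo: object ~17 m away
```

The same halving-the-round-trip idea underlies sonar and radar ranging generally: any signal you can emit and detect on its return lets you "sense" distance without sight.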
by Rishidev Chaudhuri
It's impossible for me to leave a place well. I used to think that I was merely bad at logistics and planning (and I am), but I manage to conspire against myself with such sinister competence that this explanation no longer seems viable. As the time to leave approaches my consciousness starts to fragment, and I become exhausted and flee into sleep. I wait too long to do things, unable to act unless I have killed my inertia with drink or other confusion, or distracted myself sufficiently that anything I do is useless. I spend hours on minutiae, reorganizing my book collection and cataloguing my kitchen equipment; they're happy hours, once I forget why I'm doing it.
Perhaps it's that leaving is quite obviously a rehearsal for death, disrupting even the faint illusions of permanence that spatial and environmental contiguity offer us. So is everything, if we have learned to listen to the philosophers and to live well, but of course we have not learned to listen and who has the time to rehearse for death these days?
I have trouble even with leaving hotel rooms and getting off of airplanes. I'm haunted by the sense that I've left traces of my self behind. Maybe in the shape of things: do I have my keys? has my wallet finalized the escape it has been plotting all these years? Perhaps these things I've left are important and their absence will make the self who leaves unviable. Eventually I get frustrated and resentful of the unreasonable claims of that future self but by then it is too late: I am nearly that future self and the instincts of self-preservation take over.
With leaving comes the return of beginner's mind, that flush of seeing things fresh as you did when you first arrived, of being once again surprised at the particularity of things, troubled by their contingency and delighted by the odd way the fragments of a world fit together (Louis MacNeice's delightful "drunkenness of things being various"). As everyone knows by now, the only time it is truly possible to appreciate anything is when you are faced with its transience and, by then, it is too late and the moments are inextricably entangled with the melancholy of their endings. Sometimes, though, the melancholy parts to reveal intimations of an exuberant noonday joy, as when the sun stands still and makes the world bright and shimmering for a few moments before it begins to fall towards the horizon.
Bev Butkow. Untitled/Unknown/Unwanted. 2013
Vinyl collaged onto clear plastic sheeting.
by Akim Reinhardt
Thirteen-year-old Mo'ne Davis recently took America by storm when she pitched her south Philly baseball team deep into the Little League World Series, where clubs from around the world compete every August.
A beloved celebrity of the moment, her success brought to mind my own somewhat tortured little league experiences.
I. While not terribly big, my father was nevertheless a super-stud athlete at his high school in Fresno, California during the mid-1950s. Captain of the football team (he played end on both sides of the ball), member of the track, field, diving, swimming, and basketball teams, he was popular enough to be voted president of the class of '56. And he was good enough, despite being only 145 pounds, to earn a football scholarship to Redding College in northern California, although he would soon lose it in a gambling scandal. True.
So you'd think I grew up in a household that paid attention to sports and that I learned it all at my father's knee.
Quite to the contrary, not only didn't the old man watch sports, he didn't even understand the appeal. To him, sports were something to do, not something you watch other people do. I think he looked at it like drinking: he liked drinking, especially with others and alone if need be, but why on earth would he turn on the TV to watch someone else drink? Or drive across the city and pay for parking and admission to watch people drink. It didn't make any sense to him.
Fair enough, you say. But then he must've been a great coach when I was a kid, right? The kind of dad who could really teach the fundamentals and show you the tricks to getting ahead.
Again, not really.
Great players often make for lousy coaches. One common explanation is that their prodigious talent makes it more difficult for them to become good teachers, not easier. That the concept of pedagogy is foreign to them. That they are dumbfounded when mediocre players play, well, mediocre.
How could you not hit that ball or make that shot? That's easy, what's wrong with you? It was easy for them, of course. Not so much for the other 99% of humanity.
And that's kind of what it was like with my dad. As I became old enough to participate in organized sports on the rock and glass strewn fields of the Bronx, he was, more than anything, dumbfounded when it became obvious that I wasn't a great, natural athlete. He wondered about my eyesight (which was fine), and told me to concentrate more (which I did, sometimes). But generally, he was at a loss to explain it.
by Eric Byrd
There's a subgenre of military memoirs produced by elderly emeriti, the crew-cut close readers of postwar English departments, who in late career published personal recollections of themselves and the other terrified teenagers who mostly fought World War Two. Alvin Kernan (Shakespeare editor, torpedo bomber crewman) belongs alongside Paul Fussell (Johnsonian, infantry officer) and Samuel Hynes (Auden biographer, Marine aviator). Seventeen-year-old Kernan joined the Navy before the war, to escape the bleakness of Depression Wyoming: Ma and Pa down on the ranch, hard winters and harder times. Kernan's mother had a representatively difficult life. She killed herself while he was at sea. Home on leave, he inspects her grave "already collapsing and pocked with gopher holes":
The World War I generation to which she, born in 1900, belonged was the first to leave the land, and with a little education, she married a soldier, moved to town, went to Florida, lost the money from the sale of her father's farm in the land boom, had a child, divorced, and began wandering—Chicago, Memphis, a ranch in Wyoming. She remarried, became a Catholic, and put a determined face on it all, but she was part of the first generation of really rootless modern Americans, moving restlessly by car about the country, emancipated socially and intellectually to a modest degree, but lost, really, without the supporting ethos and family that had protected people in the years when the continent was being settled. Alienation was the familiar state of my generation of Depression and another world war, but the old people had few defenses against it when it appeared.
Hemingway, Fitzgerald, and Dos Passos are the favorite writers of young Seaman Kernan. He could be one of their characters. As with Hemingway's Nick Adams, death-shaded excursions in the American wilds precede and forebode initiation overseas. And Kernan must have recognized his family in Dos Passos' panorama of the wandering and the unmoored, the war-mobilized, the desperately migratory. The down but not out, bumming the freights, going to sea, following work; displaced but for all that able to dream of landing somewhere better:
Returning from our baseball game, we came alongside the ship and began to send sailors up the gangway. At that moment another landing craft came up carrying officers, including the executive officer of the Suwanee—a small, dark, mean man—who stood up in the bow, dead drunk, shouting in a loud voice to the officer-of-the-deck, "Get those fucking enlisted men out of there and get us aboard." Protocol was that officers always take precedence in landing, and our boat shoved off immediately, circling while the officers staggered up the gangway after their afternoon drinking in the officers' club. The gap between enlisted men and officers in the American navy during WWII was medieval. Enlisted men accepted the division as a necessary part of military life, but it never occurred to us that it in any way diminished our status as freeborn citizens who, because of a run of bad luck and some unfortunate circumstances like the Depression, just happened to be down for a brief time. "When we get rich" were still words deep in everybody's psyche. But the exec's words, "those fucking enlisted men," spoke of deep and permanent divisions. He obviously really disliked us, and his words made shockingly clear that he, and maybe the other officers he represented, had no sense that we had shared great danger and won great victories together.
Dos Passos' Three Soldiers, in a paragraph.
Beyond the charm of the Lost Generation atmosphere, the virtues of Crossing the Line are its swift pace and concision of evocation. No episode lasts longer than is necessary to make the essential impressions—usually Kernan's fear and awe (at times laced with boyish glee) before the military juggernauts whose savage collisions he is witnessing. Kernan did not set out to reconstruct the birth of his literary consciousness, or find the boy in the vitae. Quite the opposite. Seaman Kernan is a small animal in a world of threats. He thinks with his gut, senses through the soles of his feet.
by Evert Cilliers aka Adam Ash
You've got to be mad brave to whack yourself. Yep, suicide takes a lot of balls. The most courage any human can ever muster. Suicides are the bravest people who ever lived, because they commit the greatest act possible -- a deed against actual existence, against their very being. They say no to life itself, and then have the courage of that unbelievable conviction to end everything. Suck on the barrel of a gun or cast themselves down from a great height on to the indifference of solid ground.
And we often resent them for it. Because they say no to all of us, to all of us who persist in living. They place the idea of living in jeopardy. They undermine our pathetic belief in life. How could they? How dare they?
Why do they say no to life? Because for them, living is not worthy. Life is too crappy to merit a fart. Not up to scratch. They feel this way because they are depressed. So depressed, there is no more pleasure in being alive; only persistent, absolute pain. And no advice from the living can help.
I know about that.
I've been mortally depressed in my life, clinically depressed, and thought about committing suicide, but never got around to trying it. (I believe I saved myself from depression by exercise: as a runner all my life, I think I finally ran my topsy-turvy brain chemistry into balance: if more people exercised, we'd need fewer therapists.)
by Josh Yarden
I posted a story last month about biblical metaphor, entitled "What Fruit Grows on the Tree of Knowledge?" The class discussion I related there continues with this question from a student:
"Ok, let's get back into the mythical garden."
"So, you're saying it's all just a myth?"
"It's not just a myth. When I say a text is mythic, I don't mean that it is false. I mean that the power of the story is in the way it reflects experiences that happen over and over again in our lives. That's how people in different cultures over thousands of years can relate to these essentially human stories. We do know for a fact that the story has existed for millennia, and it has had a powerful and a memorable impact on our society. That makes it real, whether or not the events happened as described.
"Ok, but that doesn't explain whether or not the tree of knowledge of good and bad is an actual tree."
"You can decide for yourself, but keep in mind that Torah does not claim to be ‘the truth, the whole truth and nothing but the truth.' There aren't enough details in these brief stories to suggest that any of them are full accounts of actual events, but they do contain enough symbolism to be read on three levels: the myth, the moral and the metaphor.
by Brooks Riley
Strained Analogies Between Recently Released Films and Current Events: The Guardians of the Galaxy are Taking Our Jobs
by Matt McKenna
At the end of Guardians of the Galaxy, there is much rejoicing by the citizens of the noble planet of Xandar after their having been saved by the film's titular ragtag bunch of lovable anti-heroes. What is interesting to note, however, is how unconcerned the individuals on Xandar are by the troubling labor dynamics made apparent in their Pyrrhic victory against the evil tyrant, Ronan the Accuser. Consider that a mere five "Guardians" (three humanoids, a tree, and a raccoon) were required to protect the Milky Way, a galaxy containing three hundred billion stars and one that, in the Marvel canon anyway, is so utterly teeming with bipedal life forms that one can't even land a spaceship on a random abandoned husk of a planet without running into at least one English-speaking vigilante/mercenary/henchman who has dedicated her/his/its life to finding one lost relic or another. For goodness sake, just imagine the sheer number of plots against freedom-loving Xandarians that would arise in such a galaxy. And yet, Marvel's Milky Way apparently only requires a handful of part-time crime-fighting goofballs to prevent evil from running roughshod over the forces of its PG-13-themed justice. Though it may sound as if I'm suggesting this implausibly small cosmic police force is a plot hole in Guardians of the Galaxy, it is precisely this minuscule ratio of guardians-to-villains that constitutes the film's most salient point about the real world: in our Milky Way as in Marvel's, the good jobs of the future will be dominated by a lionized elite few.
by Leanne Ogasawara
One of my favorite 3QD associates recently wrote a wonderful blog post, Old Man Bush: The Last Motherfucker. Reminiscing about the good ol' days, he asks the inevitable question, what happened to today's youth?
It's true, George HW Bush was old school. Despite being accepted at Yale, he postpones college to fight in the war, becoming a young aviator and then war hero... and not just that, says Akim, but the badass is still jumping out of helicopters at 90 years old today. Akim is impressed and wonders how it is that we all became so soft.
Honestly. How else do you explain seedless watermelons? Nope, we can’t be bothered to spit black watermelon seeds anymore, much less just eat the white ones. Cause we’re soft.
I mean, good luck finding regular grapefruit juice. No siree Bob, it’s gotta be ruby red on every grocery store shelf, cause the plain old yellow grapefruits are a little bitter. Can’t be expected to put up with that.
Or reading a map. Or cooking dinner from scratch. Or getting up to change the channel. Or waving a hand fan. Or walking anywhere. Nope. Middle class America is too soft for any of that. Just gimme a smart phone, a remote, some takeout, a shit ton of air conditioning, and a good parking spot.
You know things are bad when you start looking back at old HW's presidency with nostalgia, right? What is really scary, though, is I had just been thinking the exact same thing!
by Bill Benzon, appendix by Charlie Keil
Adolescents seem gifted in the belief that, if only the adults would get out of the way and grow up, the world would be a much better place. In that sense I am still, at 66 going on 67 (Pearl Harbor Day of this year) an adolescent. I still believe that the world needs changing, though it’s been decades since I naively thought that letters to The New York Times were a reasonable means to that end. And I still believe that it’s the adults that need changing.
But I must also move forward in the realization that I too am an adult, and have been so for some time now.
What to do?
I painted this when I was nine or ten.
I was ten years old when the Russians put Sputnik into the heavens. I still remember the October evening when my father took me outside and pointed to a moving light in the night sky. “That’s it.”
That’s when my personal history joined world history. That’s the first event that was both personally meaningful to me–I’d been drawing sketches of spaceships for years and had even done a painting or two–and was also of world historical importance. By the time I was old enough to be an astronaut, however, I’d changed.
I’d gone to college, marched against the Vietnam War, done my conscientious objector’s alternative service in the Chaplain’s Office at Johns Hopkins, and lost all interest in becoming an astronaut. Inner space had become my territory.
I got my PhD, then a job at the Rensselaer Polytechnic Institute, was astounded when Reagan was elected and re-elected–that hadn’t been in the plan, no it hadn’t. And I was really surprised when the Soviet Union collapsed. After all, I’d grown up during the height of the Cold War, read articles about back-yard bomb shelters, and had even picked out the spot in our back yard where a shelter should go. I figured that, whatever else happened in the world, I’d go to my grave in the shadow of the Cold War.
by Grace Boey
Imagine this: someone secretly laces your coffee with meth, every morning, for 28 mornings. Over the first week, you become increasingly hyperactive, and start to bubble with confidence and energy. You feel great, but by day 7, your behaviour starts to get erratic, and you’re irritated with everyone else who can’t keep up. By day 21, you’re having flashes of paranoia, and freak out from time to time because your mind keeps racing, and you’re convinced everyone’s watching you move too fast.
By day 28, you haven’t slept for a week. You feel invincible, so much so that you decide to take all the drugs you’ve got to see if it will kill you. Because that which doesn’t kill you makes you stronger, right? And if it does kill you, you’ll die feeling amazing… and dying would be such an incredible thing to do. In fact, this had damn well better be fatal. Thanks to the meth and sleep deprivation, you are so confused, irrational and psychotic, that this babbling seems entirely sensible.
Was the suicide attempt ‘your own decision’, in any meaningful sense? Of course it wasn’t. It certainly wasn’t my decision, when those very events happened to me a couple of years ago. The only difference? No one had secretly laced my coffee with drugs (though they might as well have). The terrifying effects were a product of my very first full-blown bipolar manic episode. Thankfully, I survived—although the doctors who treated me assured me I could just as easily not have. I had no clue what was happening at the time; my mania had swept me away, before I even realized anything was amiss.
Despite all this, people like Christian blogger Matt Walsh would say I had committed a “terrible, monstrous atrocity” that was entirely my decision. On August 12, one day after Robin Williams appeared to have killed himself as a result of depression, Walsh published an article with the headline “Robin Williams didn’t die from a disease, he died from his choice.” In it, he claimed that “suicide does not claim anyone against their will”. Depression—and by extension of Walsh’s arguments, all mental illness—is not responsible for suicide: you are. When a huge backlash ensued, he stuck to his guns and wrote a detailed response to his critics.
When I first came across the headline of Walsh's original post, I took a deep breath, read the article, took another deep breath... and read it again. My conclusion at the end of this exercise was exactly the same as my initial response: what a load of exploitative, uninformed rubbish. Walsh's statements reflect deep misconceptions about mental illness, competent decision-making and ‘free will’, which (unfortunately) hinge on the supernatural metaphysics that accompanies Christianity. It angers me that someone like this should feel entitled to piss on the grave of Robin Williams with a headline like that. And personally, as someone who has attempted suicide under the grips of both mania and depression, I am insulted by Walsh's backward ideas.
Sunday, August 24, 2014
Richard Marshall interviews Rebecca Gordon in 3:AM Magazine:
3:AM: Are you approaching this via virtue ethics, four cardinal virtues and Alasdair MacIntyre, and what is the best way to understand what torture is?
RG: I’m going to reverse the order of these questions, because I think that once we understand what institutionalized state torture is, it becomes clearer why I think MacIntyre’s contemporary virtue ethics provide a useful way of understanding torture’s moral implications.
The torture that I am concerned with is institutionalized state torture – the kind of organized, intentional program carried on by governments. It’s not Jack Bauer saving Los Angeles on 24. It’s not some brave person preventing a ticking time-bomb from going off by torturing the one person who can stop it. We must stop thinking of torture as a series of isolated actions taken by heroic individuals in moments of extremity, and begin instead to understand it as a socially embedded practice. A study of past and present torture regimes suggests that institutionalized state torture has its own histories, its own traditions, its own rituals of initiation. It encourages, both in its individual practitioners and in the society that harbors it, a particular set of moral habits, call them virtues or vices as you prefer.
Here’s my definition of institutionalized state torture: It is the intentional infliction of severe mental or physical suffering by an official or agent of a political entity, which results in dismantling the victim’s sensory, psychological, and social worlds, with the purpose of establishing or maintaining that entity’s power. This definition can be expanded to reveal its legal, phenomenological, and political dimensions.
The language about “intentional infliction of severe mental or physical suffering by an agent of a political entity” mirrors the definition found in the UN Convention against Torture and Other Cruel, Inhuman or Degrading Treatment, to which the U.S. is a signatory. A phenomenological definition describes the ways in which torture reduces and distorts its targets’ orientation in time and space, its effects on language, and its destruction of persons’ social connections. The “political” portion deals with the purposes of torture, which, when it is institutionalized by a state, have much less to do with “intelligence gathering” than with political and social control.
So what does this understanding of torture have to do with virtue ethics and Alasdair MacIntyre? I would argue that when we understand torture as an ongoing practice, we can begin to see how it affects moral habits.
Matthew Snyder in the LA Review of Books:
A biologist at Duke University, Stuart Pimm, recently published a research article in the journal Science, which claims that in the past, before humans evolved, only about one species per ten million went extinct each year. After the emergence of humans, however, that extinction rate has exploded to between 100 and 1,000 times the background rate. To make matters worse, by 2050, three things are quite likely to occur: 1) the North Pole will have melted to such an unreal extent that, by summer, passenger ships will be able to cross through the North Pole with ease, nudging small, dainty ice chunks past the petro-churn of their port and starboard; 2) the largest living thing on our planet, with an ecology stretching for 2,600 kilometers, the Great Barrier Reef, will go extinct, its coral increasingly bleached into a white death as ocean temperatures rise and the water acidifies; 3) as well, according to the United Nations, the Earth’s oceans might be entirely absent of fish — the kind of fish people like to eat in their sushi restaurants. Beyond these dire benchmarks, by 2100, the Earth’s temperatures will rise by 2.4 to 6.4 degrees Celsius. If global warming ramps up to its ultimate extreme, centuries into the future, with both the North and South Poles having melted completely, important swaths of the world’s continents would be engulfed by water. In an elaborate and sophisticated work of cartography, Martin Vargic, an amateur graphic designer from Slovakia, imagines such a future, where sea levels rise 260 feet as a result of the ice caps melting: America’s major cities would be underwater: Miami, New Orleans, New York and Washington D.C. Farther south, in Latin America, the Amazon would burst its banks, becoming a sea reaching into vast expanses of Brazil, while a sizable portion of Australia’s continent would be swamped by the Murray Gulf and the Artesian Sea.
All of this would make Kevin Costner’s fictional Waterworld (1995) and Radiohead’s animated video for “Pyramid Song” hapless documentaries streamed to us from this forlorn future. But this new world map would match the depressing baritone of Paolo Bacigalupi’s The Windup Girl — a masterful SF corollary to Dickens’s Bleak House, which stratocasts us into such a future. This paradigm-shifting novel details a world where global warming in the 23rd century has left Earth’s coastlines underwater and new eco-plagues make any hour of life a precarious one; where calorie-companies control global food production via private armies and bioterrorist acts on third-world ecologies (or just the usual buying off of their politicians and the wealthy); and where sophisticated levees and pumps keep Bangkok from going underwater.
Unlike the drowned world of the future-possible described in The Windup Girl, Jacques Lob and Jean-Marc Rochette’s French comic, Le Transperceneige (1982), imagines a dystopian world that’s frozen over. The comic book cleverly blends the SF subgenre of the post-apocalypse, where life no longer exists outside the train, with the dystopia that exists inside the train’s many caste-cars, where, minus any awareness of its slowing engine, the people accept that the train is bleak but altogether sustainable. Therein, Le Transperceneige, written by Lob and illustrated by Rochette, details a world where an unnamed ecological catastrophe has frozen the Earth solid, leaving no living beings other than those aboard the formidable Snowpiercer — a train whose engine of perpetual motion keeps its passengers alive, and which, in doing so, no longer stops at stations for the lonely, frozen, and forgotten. Everything outside is dead, encased in a white carapace of ice, storms, and snow.
Ray Monk in Prospect:
Ludwig Wittgenstein is regarded by many, including myself, as the greatest philosopher of this century. His two great works, Tractatus Logico-Philosophicus (1921) and Philosophical Investigations (published posthumously in 1953) have done much to shape subsequent developments in philosophy, especially in the analytic tradition. His charismatic personality has fascinated artists, playwrights, poets, novelists, musicians and even movie-makers, so that his fame has spread far beyond the confines of academic life.
And yet in a sense Wittgenstein’s thought has made very little impression on the intellectual life of this century. As he himself realised, his style of thinking is at odds with the style that dominates our present era. His work is opposed, as he once put it, to “the spirit which informs the vast stream of European and American civilisation in which all of us stand.” Nearly 50 years after his death, we can see, more clearly than ever, that the feeling that he was swimming against the tide was justified. If we wanted a label to describe this tide, we might call it “scientism,” the view that every intelligible question has either a scientific solution or no solution at all. It is against this view that Wittgenstein set his face.
Scientism takes many forms. In the humanities, it takes the form of pretending that philosophy, literature, history, music and art can be studied as if they were sciences, with “researchers” compelled to spell out their “methodologies”—a pretence which has led to huge quantities of bad academic writing, characterised by bogus theorising, spurious specialisation and the development of pseudo-technical vocabularies. Wittgenstein would have looked upon these developments and wept.
Brian Leiter in The Huffington Post:
Late Friday afternoon (August 22), the University of Illinois broke its three-week-long silence on the controversy regarding the Chancellor's revocation of a tenured offer to Steven Salaita, who had accepted a faculty position in the American Indian Studies Program at the flagship campus at Urbana-Champaign. Chancellor Phyllis Wise and Board of Trustees Chairman Christopher Kennedy both issued statements explaining the revocation, but in terms far more alarming than the original decision itself. It is not an exaggeration to say that the Chancellor and the Board of Trustees have now declared that the First Amendment does not apply to any tenured faculty at the University of Illinois.
A bit of background to Friday's bombshell statements. Last October, Professor Salaita, then teaching at Virginia Tech, accepted a tenured offer from the Urbana-Champaign campus. He went through the regular appointments process at the University of Illinois, and received approval by the relevant departments and deans after a review of his scholarship and teaching. The offer, which he accepted, was conditional on approval by the Board of Trustees. Such approval clauses are typical in all teaching contracts and had, previously, been pro forma at Illinois, as they are at all serious universities: it is not the job of the Board of Trustees of a research institution to second-guess the judgment of academics and scholars. Well before the Board took the matter up, even University officials were describing Salaita as a faculty member, and he moved to Illinois and was scheduled to teach two classes this fall.
Salaita also has a Twitter account. "Tweets" are limited to 140 characters, so the medium is conducive primarily to spontaneous and superficial commentary. As a Palestinian-American and scholar of colonialism, Salaita tweeted extensively about the Israeli attack on Gaza. Contrary to the initial misrepresentations put into circulation by far right websites, none of the tweets were either anti-semitic or incitements to violence. Some were vulgar, some juvenile, some insulting, some banal. The First Amendment unequivocally protects Salaita's right to express every one of those opinions on a matter of public concern, and to do so, if he wants, with vulgarity and insults. As a matter of American constitutional law, this is not a close case.
Amar Sindhu in Herald:
Fahmida Riaz, writer, human rights activist and the author of more than 15 books of fiction and poetry, has always remained at the centre of controversy. When Badan Dareeda, her second collection of verse, appeared, she was accused of using erotic and sensual expressions in her poetry. The themes prevalent in her verse were, until then, considered taboo for women writers. Feminist scholars and the women’s movement, however, not only acknowledged her expression but welcomed it with applause. Riaz also faced challenges due to her political ideology. More than 10 cases were filed against her during General Ziaul Haq’s dictatorship. She was forced into exile during the same regime, only to return to Pakistan after Haq’s death in 1988. The poems from her collection Apna Jurm Sabit Hae are politically charged and reflect the torment her homeland experienced under dictatorship. In terms of using creative expression for political discourse, Riaz stands among literary greats such as Nazim Hikmet, Pablo Neruda, Sartre and Simone de Beauvoir. Following are excerpts of a conversation she had with Herald on her literary journey and the issues confronting Pakistan’s literati.
Amar Sindhu: Does creativity need ideology?
Fahmida Riaz: Once creativity expands beyond the very personal, almost biological paradigms, it seeks some ground to stand upon. Creativity is very often rooted in some idea. Our folk songs and stories do not seem to be ideological but they turn out to have ideas, when looked at closely. The question of ideology is raised mostly in the context of progressive literature that sees individuals in a web of external circumstances and class conflicts. Literary creativity does not have to emanate from this consciousness, nor does this consciousness hamper creativity. In the 20th century, great writers such as Pablo Neruda, Paul Nizan, Nazim Hikmet, Faiz Ahmed Faiz and Gabriel García Márquez declared themselves to be Marxists. An artist like Pablo Picasso, who revolutionised the world of painting, was a member of the Communist Party of France. On the other hand, two literary giants before these writers, Leo Tolstoy and Dostoyevsky, saw the individual and the society in the context of Christian teachings and sought the answers to all human problems in Christ. You may notice, though, that too was a kind of ideology.
Mary Beth McCauley in The Christian Science Monitor:
Gallup polls report that 86 percent of Americans say they believe in God. Thirty-nine percent say they attended worship services in the past week. So while God may not be dead, religion struggles. And why shouldn’t it? Religion has awful PR: unrelenting sectarian wars abroad, political infighting over morality at home, scandal, shopping to be done and football to be watched on the Sabbath, high-profile competition from secular ideology, and a worldview that can seem out of step with popular culture.
Krista Tippett, host of the award-winning public radio show and podcast “On Being,” which takes up questions of religion and meaning, is alarmed. This, she says, is “the first generation of humans in any culture who didn’t inherit a religious identity.” But even while parents who have distanced themselves from their faith traditions are hesitant to pass that religion on to their children, science seems to take up the cause, as it unearths a host of practical benefits of religious practice. Everything from the physical effects of a heightened immune response to social benefits like closer interpersonal ties and better behavior in teens is linked to the state of being religious. Is there a way parents can overcome their personal ambivalence about religion in order for their children to have its benefits?
October, Month Without Gods
The Japanese think this is the month-without-gods.
They celebrate it this way. They don’t alliterate October
with gold falling from the fragile trees,
or with revolutions that changed history.
October, like a truce. Like an absence of everything
that exceeds limits. May it be for us
liberation. Because now they don’t exhibit
the relentless naked gods of summer,
the too many gods, and so much remains
for the child of winter to be born,
and our sight doesn’t reach any further, from this
month of distances, month of far aways,
imperfect, attained, fortuitous. If only it would be
like this for us. Without the eight million
gods that hide in the city or in the forest,
the scales coincide with our statures.
Let us be carried away by our premonitions.
Let us write things with small letters.
Let us celebrate October for its absence of gods.
Let us enjoy its name because it is only a number
in a truncated series. And forgotten. It is October.
We have thirty days all to ourselves.
by Juan Antonio González-Iglesias
from Circumference Magazine
translated from Spanish by Curtis Bauer
Saturday, August 23, 2014
Sam Leith in The Guardian:
Around the time his novel The Pregnant Widow came out in 2010, an interviewer asked Martin Amis whether the book constituted a return to form. "What's this return shit?" he shot back. "I don't know how this will go down, but my talent seems to me to be perfectly vigorous." You can almost hear the voice, the italics, the roll-your-own rasp: part surly, part amused. A little bit more surly.
This is Amis in combat stance, the position he has occupied for as long as most of us can remember. There is no living British writer who garners as much attention as Amis; so much of it hostile; and so much of that hostility, circularly, arising from the attention itself. He pushes back.
With a new novel coming – this month's heavily embargoed Auschwitz book The Zone of Interest – the circus starts up again. Amis occupies a really peculiar position in our national life. He is the object of envy, contempt, anger, disapproval, theatrical expressions of weariness – but also of fascination. Has there in living memory been a writer whom we (by which I mean the papers, mostly) so assiduously seek out for comment – we task him to review tennis, terrorism, pornography, the state of the nation – and whom we are then so keen to denounce as worthless? In recent years his public interventions on everything from Islamist terror to population demographics have caused mini shitstorms; and critics seem to take a particular, giant-killing glee in slamming his fiction. Setting out to write a retrospective essay on his work and reputation, the implied title you find yourself reaching for is "in defence of ... "
It's as if, and in answer to some inchoate public need, we demand of Amis that he say things in public so we can all agree on what an ass he is. He has spoken in the past – surly/amused – of an "eisteddfod of hostility", as if his detractors were the excitable participants in a provincial arts festival.
Why the eisteddfod? Why him? I think it has to do with the way we have positioned him, and – to an extent – with the way he has positioned himself.
Tom Holland in The Spectator:
As the fighters of the Islamic State drive from village to captured village in their looted humvees, they criss-cross what in ancient times was a veritable womb of gods. For millennia, the Fertile Crescent teemed with a bewildering variety of cults and religions. Back in the 3rd Christian century, a philosopher by the name of Bardaisan was so overwhelmed by the sheer array of beliefs to be found in Mesopotamia that he invoked it to disprove the doctrines of astrology. ‘It is not the stars that make people behave the way they do but rather the diversity of their customs.’
Bardaisan himself was a one-man monument to Mesopotamian multiculturalism. A Jewish convert to Christianity, a Platonist fascinated by the wisdom of the Brahmins, an inhabitant of the border zone between the Roman East and the Iranian empire of the Parthians, he stood at the crossroads where antiquity’s most potent traditions met and intermingled. Just how far the process of blending rival faiths could be taken was best illustrated by a man born in Mesopotamia a few years before Bardaisan’s death: a soi-disant prophet called Mani. Brought up within a Christian sect that practised circumcision, held the Holy Spirit to be female, and prayed in the direction of Jerusalem, he fused elements of Christianity with Jewish and Zoroastrian teachings, while also claiming, just for good measure, to be the heir of the Buddha. Although Mani himself would end up executed by a Persian king, his followers were nothing daunted. Cells of Manichaeans were soon to be found from China to Carthage. Syncretic as their religion was, and global in its ambitions, Manichaeism was a classic Mesopotamian export of the age.