Monday, May 04, 2015
Utopia, Frame By Frame
"A map of the world that does not include Utopia
is not worth even glancing at."
I've recently become obsessed with a TV show, which is rather unusual for me. I like to tell people that, after HBO wrapped The Wire, I went ahead and sold my TV. Perhaps more melodramatic than true, but this is nevertheless close enough for essayistic purposes. The present show, however, could not be more different from the gritty realism of David Simon's character-driven creation. Created by Dennis Kelly and broadcast by Channel 4, the UK's fourth public service broadcaster, Utopia had a short run – only two seasons of six episodes each. Late in 2014, it was decisively announced that the series would not be renewed for a third season, but I think this was for the best. My grandfather once related an old Arab proverb to me: "One should always stop eating when it tastes the sweetest". I don't know if this is really an old Arab proverb, but there are certain things one just isn't inclined to Google.
At any rate, one thing that is certainly true for Utopia and shows like it: if you thrive on massively complex, increasingly far-fetched scenarios, the longer you go on, the more likely you are to trip over your own plotlines, and all hopes for a tightly orchestrated dramatic tension eventually evaporate. The most instructive recent example, which still rankles with fans, is how Lost wrecked itself on reefs of its own devising, despite the impressive hermeneutical gymnastics deployed by some in its defense. I would imagine that few producers and executives enjoy contemplating a similar fate for their own endeavors.
The hazard for Utopia's genre – the paranoid thriller – is especially acute. And settling on the proverbial 'shadowy international conspiracy' as the principal plot mechanism only doubles down on the risk, since it is tempting to mop up any inconveniences using said conspiracy. Nevertheless, I have always had great faith in the British when it comes to the respect required to make conspiracies, erm, plausible. That's right – a conspiracy needs to be treated respectfully if it is to have any currency.
Obviously, we must at this juncture invoke the masters of the genre, such as Alfred Hitchcock and John le Carré. More recently, Chris Mullin's novel A Very British Coup comes to mind, not only for the fact that it was also adapted into a serial by Channel 4 in 1988, but also because the premise – the consolidation of Margaret Thatcher's grip on power in 1979-1980 with the help of MI5 – forms one of many casual subplots thrown off by Utopia. If you always suspected that Labour got shivved in that election, this is the show for you.
The British are also decidedly superior at conveying paranoia, at least in the English-speaking world (although the Russians may have a cultural advantage here). By my reckoning, Duncan Jones' Moon is one of the masterpieces of paranoid sci-fi filmmaking of the last 25 years, and it relies on little more than Sam Rockwell's acting and Kevin Spacey's voice. The loneliness of Rockwell's character – the sole human resident of a lunar base, locked into a three-year contract – is a different kind of loneliness from that of Dave Bowman, Kubrick's astronaut in 2001: A Space Odyssey. In Bowman's case, he is coddled by an alien intelligence whose intentions may be unknown, but never come off as sinister. Weirdly, there is a kind of comfort in the cosmically inscrutable nature of his fate, which is one of instrumentality. Rockwell's character, by contrast, struggles to understand the purposes for which he has been deployed to the lunar base, in this case by his corporate sponsors. It is therefore appropriate that those purposes are decidedly practical and driven by economics, to the exclusion of any moral considerations. In order for the paranoid genre to reach its fullest expression, it seems best to have only a few people around, and for the ultimate author of one's fate to be present and known, if only just out of reach – something that Kafka understood most intimately. In an ideal world, paranoia and claustrophobia are generative of one another.
Kubrick also provides an interesting reference point when considering the visual appeal of Utopia. Almost every scene is shot in one-point perspective, a technique favored by Kubrick (and, truth be told, Wes Anderson). Centering the camera on its subject generally forces attention and creates drama and suspense – think of the hotel hallways in The Shining. Now you see the twins, now you don't. For his part, Anderson subverts the dramatic in favor of the ironic (you can view supercuts of the two directors' use of one-point perspective here and here, respectively). But in both cases the result is one of visual depth. Those hallways go on for a long time, perhaps forever.
For its part, Utopia drives the one-point perspective to new levels of zealotry. Remarkably, however, the end result is that depth is entirely sucked out of the frame. A crucial reason for this flatness is the equally fanatical use of color. Before you begin finding your way around the characters and plot, the first thing that hits you is the extraordinarily acidic nature of the palette. Blues are not warm but cyan and teal, and there are exquisite purples and pinks. The greens take on a lurid quality, and the yellows and oranges are nothing short of radioactive. Just as the one-point perspective is ever-present, for the length of the series there is no respite in the assault of color. Somewhat remarkably, this is not exhausting to the eye. The relentlessness consolidates itself into a sort of queasy consistency. Taken together, the resulting flatness forces a paradox onto the viewer: a kind of hyperreality mashed right up into an utter disregard for reality. What are we really seeing here?
Too often we mistake – or perhaps more accurately, accept – striking visual effects as a semi-autonomous phenomenon, existing within a film without necessarily being fully integrated into it. This blunting of our critical faculties may be the most pernicious legacy of CGI in cinema today. The fact that "The movie was crap, but the effects were really great" is considered an acceptable thing to say about a film at all (if not an actual endorsement!) illustrates just how far we have fallen. But the use of color and perspective in Utopia is so extreme that it clearly warrants further consideration.
The event that sets the Utopia universe in motion is the discovery of a manuscript, thought to be a sequel to a graphic novel, The Utopia Experiments, that was originally published in the late 1980s. The original novel is less a coherent narrative, and more of a cryptic set of drawings that imply the tale of a brilliant geneticist who has made a deal with the devil. All that is known of the author is that he was a paranoid schizophrenic who created the drawings as part of his art therapy while institutionalized. Soon after, the author/patient died in the same institution. Twenty-odd years later, a marginal yet robust community of enthusiasts continue to discuss and parse the novel and its possible meanings via an online discussion forum. Through chance timing, four of these participants are called together by a fifth, who claims he has just acquired the sequel manuscript. This motley mixture of the curious, bored, afflicted and conspiratorially minded constitutes the core protagonists of the series.
However, the first people we meet are the nemeses of this group, the representatives of the aforementioned 'shadowy conspiracy'. Arby and Lee are cryptic, implacable killers who come to a comics shop, attempting to track down the manuscript 'sequel'. The importance of the manuscript, or rather what it may hide, is demonstrated by the fact that, in the opening four minutes, about as many people get offed.
Needless to say, Arby and Lee don't recover the manuscript, but procure a lead that sets them on a collision course with the first group. But more importantly, what is established, in a sort of strange loop fashion, is the key to both the substance and the presentation of Utopia: the graphic novel is both the object of desire, and the lens through which that world is seen. For to watch Utopia is, literally, to watch a graphic novel unfold on the screen, panel by panel. This has been attempted previously, the prime example being the 2009 film adaptation of Alan Moore and Dave Gibbons's Watchmen, but Utopia has no pre-existing source material, so it is free to invent itself from whole cloth. Hence the very deliberate use of color and one-point perspective I discuss above – the resulting flatness makes each scene easily imagined as a frame in a graphic novel that is itself entirely imaginary.
As visually rewarding and intellectually stimulating as it is, do not think that Utopia shirks violence. In keeping with the aesthetic of a paranoid conspiracy theory graphic novel, plenty of people meet untimely ends. Pretty much everyone seems to be a good shot; when they aim for the head, they tend not to miss, and they rarely aim for any other part of the body. Children are not only murdered in cold blood, but become murderers themselves. There is a stark unsentimentality, and not a small dose of psychopathy, that befits a narrative that ultimately concerns itself with the end of the human race, or rather its possible salvation.
But this also introduces a further interesting consequence of the integration of substance and presentation: as the stakes rise implacably higher and the characters find themselves in ever more absurdly improbable and dangerous circumstances, the 'graphic-novelization' of the visual style acts as a vaccine against our disbelief. Initially jarring, the cinematography comes to offer us license to accept what is happening. I think that if the series had been filmed in a more conventional, that is to say, realistic way, losing the audience would have been far likelier, regardless of how tightly scripted both seasons were.
To say more about Utopia risks running into spoiler territory. There are, in fact, quite a few timely issues that are either alluded to or form core parts of the narrative. Some historical events that were appropriated by the show's writers even riled up the public, although I doubt that this contributed to the show's cancellation. Suffice it to say that, except for some shoddy microbiology, I didn't really find myself rolling my eyes at each big reveal, mostly thanks to the series' clever construction.
That said, throughout Utopia there are plenty of Easter eggs for sci-fi enthusiasts: Bejan's fall off the terrace of his London high-rise is a reference to The Comedian's similar demise in the opening of Watchmen; a common spoon becomes an object of meditation for Wilson Wilson, but for entirely different reasons than it did for Neo in The Matrix. Utopia also has a close kinship with Black Mirror, about which I have written previously, but whereas Black Mirror is concerned with excavating the intended and unintended consequences of the relationship between society and technology, Utopia does not indulge in the dark satire that is Charlie Brooker's stock in trade. It is more otherworldly than didactic, and yet does not lack its own moments of leavening humor, which are expertly sprinkled. Nevertheless, both shows truck with the idea that we, either as individuals or as a society, are not nearly as in control of our destinies as we might like to believe. Decisions have already been made, ostensibly in our collective best interest, whether we like it or not; such is the nature of Utopia.
Monday, March 23, 2015
You're on the Air!
by Carol A. Westbrook
The excitement of a live TV broadcast...a breaking news story...a presidential announcement...an appearance of the Beatles on Ed Sullivan. These words conjure up a time when all America would tune in to the same show, and families would gather round their TV set to watch it together.
This is not how we watch TV anymore. It is watched at different times and on different devices--on mobile phones, computers, and tablets, from previously recorded shows on your DVR, or via streaming services such as Netflix and, soon, Apple. Live news can be viewed on the web, via cell phone apps, or as tweets. An increasing number of people are forgoing TV completely to get news and entertainment from other sources, with content that is never "on the air" (see the chart, below, from the Nov 24, 2013 Business Insider). Many Americans don't even own a television set!
We take it for granted that we will have instant access to video content--whether digital or analog, television, cell phone or iPad. But video itself has its roots in television, a word that literally means "to view at a distance." The story of TV broadcasting is a fascinating one about technology development, entrepreneurship, engineering, and even space exploration. It is an American story, and it is a story worth telling.
At first, America was tuned in to radio. From the early 1920s through the 1940s, people would gather around their radios to listen to music and variety shows, serial dramas, news, and special announcements. Yet they dreamed of seeing moving pictures over the airwaves, like they did in newsreels and movies. A series of technical breakthroughs was needed to make this happen.
The first important breakthrough was the invention, in 1927, of a way to send and view moving images electronically--Farnsworth's "television." There followed a series of patent wars, but at the end of the day we had television sets which could be used to view moving pictures transmitted over the airwaves. In 1939, RCA televised the opening of the New York World's Fair, including a speech by the first President to appear on TV, President Franklin D. Roosevelt. There were few televisions to watch it on, though, until after the end of World War II, when America's demand for commercial television rapidly increased.
This led to the next big advance in television--network broadcasting. The big radio broadcast companies such as RCA (Radio Corporation of America) and CBS (Columbia Broadcasting System) naturally expanded into this medium, but their infrastructure was limited. Though the frequencies used for AM radio transmission, from 540 to 1600 kHz (kilohertz: thousands of cycles per second), can travel long distances from their transmitting stations, each channel can only carry a limited amount of information; in other words, it has a narrow bandwidth. Much higher frequencies, in the megahertz range (millions of cycles per second), are required for television so they can carry the additional information needed for picture as well as sound. As a result there was a scramble for the higher frequencies, which was mediated by the FCC (Federal Communications Commission), the entity that regulates broadcasting. In 1948 the FCC allocated the higher frequency bands, designating which ones would be reserved for radio and which for television, and assigned channel numbers to the TV bands. The VHF television channels were designated 2 - 13. Channel 1 was reallocated to public and emergency communications, which explains why your TV starts with Channel 2! Several still higher bands, designated as UHF, were reserved for later TV use, eventually becoming channels 14 to 83. The FCC also froze the licensing of new stations in 1948, capping their number at 108.
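To put rough numbers on that bandwidth gap, here is a small illustrative sketch (my own, not from the original article), using the standard published post-1948 US channel plan: a single 6 MHz NTSC television channel occupies as much spectrum as roughly 600 AM radio stations at about 10 kHz apiece.

```python
# A small sketch (mine, not the article's) comparing channel widths
# under the FCC's post-1948 allocation plan.

AM_CHANNEL_HZ = 10_000       # one AM radio station: about 10 kHz
TV_CHANNEL_HZ = 6_000_000    # one NTSC television channel: 6 MHz

# Lower edge of each VHF TV channel, in MHz (published US plan;
# each channel is 6 MHz wide).
VHF_LOWER_EDGE_MHZ = {
    2: 54, 3: 60, 4: 66,                      # low-VHF band
    5: 76, 6: 82,                             # low-VHF, second segment
    7: 174, 8: 180, 9: 186, 10: 192,          # high-VHF band
    11: 198, 12: 204, 13: 210,
}

print(f"One TV channel spans as much spectrum as "
      f"{TV_CHANNEL_HZ // AM_CHANNEL_HZ} AM stations.")
for ch, lo in sorted(VHF_LOWER_EDGE_MHZ.items()):
    print(f"Channel {ch:2d}: {lo}-{lo + 6} MHz")
```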
Because the number of broadcast stations was limited, TV was available only if you lived within range of a broadcast network, primarily CBS, NBC or ABC. In other words, if you lived in a large city--New York, Chicago, Washington, Philadelphia, Boston, Los Angeles, Seattle or Salt Lake City. Outside of these areas, you might have a chance if you lived on a hill, put up a very high antenna, and prayed for a thermal inversion or a charged ionosphere to propagate the signal beyond its normal range to your television. My husband Rick, an electrical engineer and amateur radio buff, recounts that he watched the coronation of Queen Elizabeth in 1953 from his TV set in a small town in Pennsylvania, due to an environmental quirk (sunspots?), but everyone else had to wait for the films to cross the Atlantic and be shown on their local station.
Yet, for those of us who lived in a prime location, there was an ever-expanding number of programs to watch, such as the Texaco Star Theater, the Milton Berle Show, and a variety of news shows. Many of us grew up on Howdy Doody, or shows created locally and televised live. I recall walking home from grade school for lunch as a child in Chicago, spending an hour watching "Lunchtime Little Theater," before returning to school to finish the afternoon's lessons! Many of these early shows have been lost, as they were never recorded, and videotape had not yet been invented.
Television broadcasting eventually went nationwide, thanks to microwave transmission, which developed out of WWII radar. This technology was used to relay television broadcasts to local affiliate stations, which could then broadcast them on their regular channels in the local area. Microwaves use point-to-point transmission, from one microwave tower to the next, and microwave towers were constructed to span the continent. The FCC increased the number of television station licenses, and the broadcast companies truly became "networks." Finally, everyone could watch the same shows at the same time.
But TV was still limited geographically--it could not cross the ocean. This problem was not solved until the third important technology was developed, that of satellite broadcasting. Sputnik, the first space satellite, was launched in 1957. Five years later, July 23, 1962, the first satellite-based transatlantic broadcast took place using the Telstar satellite to relay TV signals from the US ground station in Andover, Maine, to the receiving stations in Goonhilly Downs, England and Pleumeur-Bodou, France.
It's fun to watch this broadcast, which was introduced by Walter Cronkite, and began with a split screen showing the Statue of Liberty on the left and the Eiffel Tower on the right. The satellite transmission was followed by a live broadcast of a baseball game in progress at Chicago's Wrigley Field between the Philadelphia Phillies and the Chicago Cubs, and also included live remarks from President Kennedy, as well as footage from Cape Canaveral, Florida, Seattle, and Canada. I've included a short clip of the Kennedy broadcast.
If you looked up at night in 1962, you might have seen the Telstar satellite zoom across your backyard sky. It took about 20 minutes to cross, passing over every 2.5 hours. Broadcast signals could be transmitted to Telstar and relayed back to land stations on either side of the Atlantic only during this 20-minute transit, so the tracking satellite dishes had to be fast-moving; they also had to be very large to capture such a weak signal. It is impressive to see the massive size of the dishes in these satellite ground stations, and to imagine how quickly they had to move to sweep the sky. This picture of Goonhilly Downs gives you an idea of their size.
Although Telstar demonstrated that satellite transmission was possible for long-range broadcasting, the equipment and precision needed to track a rapidly moving low-earth satellite were onerous. So the space scientists at NASA and Hughes Aircraft launched the next generation of satellites, named "Syncom," into high earth orbit at just the right distance from the earth so that their orbital speed matched the speed of the earth's rotation. Orbiting directly above the equator, the Syncom satellites appeared to be stationary over a single geographic location. Thus, the geostationary (or geosynchronous) satellite was born.
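For the curious, "just the right distance" can be recovered with a back-of-the-envelope calculation from Kepler's third law (my own illustration, not part of the original article): an orbit whose period equals one sidereal day sits roughly 35,800 km above the equator.

```python
# Back-of-the-envelope: how high is "just the right distance"?
import math

MU_EARTH = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY_S = 86_164      # one full rotation of the Earth, in seconds
EARTH_RADIUS_KM = 6_378      # equatorial radius, km

# Kepler's third law, T = 2*pi*sqrt(a**3 / mu), solved for the
# semi-major axis a of an orbit whose period is one sidereal day.
a_m = (MU_EARTH * (SIDEREAL_DAY_S / (2 * math.pi)) ** 2) ** (1 / 3)
altitude_km = a_m / 1000 - EARTH_RADIUS_KM

print(f"Geostationary altitude: about {altitude_km:,.0f} km above the equator")
# -> about 35,786 km
```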
Stationary satellites paved the way for a tremendous expansion in telecommunications, and are still in widespread use. Satellites enabled the rise of cable TV networks such as HBO and CNN in the 1970s, which broadcast without having to go through FCC-regulated television transmitting stations. Instead, their programming was sent via satellite to the cable service, and from there selected programs went by cable to the TVs of paying subscribers. These stations could also be accessed through a satellite TV subscription service, such as Galaxy, which broadcast them directly to customers' satellite dishes. Because early satellites could only carry a limited number of cable channels, multiple satellites had to be accessed to provide the purchased programming. Moveable satellite dishes of about four to twelve feet in diameter were positioned in subscribers' yards or on their roofs. Satellite TV further expanded Americans' access to television, reaching rural communities that had limited (or no) cable service and poor antenna reception; it also provided special paid programming, such as sports events watched at bars. This picture shows a 10-foot moveable dish in my yard in Indiana.
Stationary TV dishes--such as DirecTV antennas--were not feasible until satellites were able to carry more programming, so that the dish could stay parked on a single geosynchronous satellite. The technical advance which allowed this was the development of digital video in the late 1990s. Digital video would eventually displace analog--remember when the DVD was introduced, rendering VCRs obsolete in just a few years' time? Each geosynchronous satellite could now carry many more simultaneous channels than before, since each digital channel takes up only a small fraction of the bandwidth of an analog signal. Digital signals also increased the capacity of traditional TV broadcast from ground towers, which eventually transitioned to the HDTV standards, broadcast on the high-capacity UHF frequencies. The transition to HDTV was completed in June 2009, when the TV networks abandoned analog transmission on the old VHF channels, though many stations still display the old numbers (2 - 13). TV viewers are surprised to learn that they can watch their favorite channels on the newer HDTV sets using only a simple indoor antenna, and many are giving up their pricey cable services. Digital video signals were also ready for growth in other media, as they could theoretically be transmitted over the internet or by cell phone, and could be stored easily for re-broadcast.
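As a rough illustration of why digital multiplied capacity (the figures below are typical published values, not the author's): an ATSC digital broadcast fits about 19.4 Mbit/s of compressed video into the same 6 MHz slot that previously carried a single analog program.

```python
# Rough arithmetic: one 6 MHz broadcast slot, analog vs. digital.
# Figures are typical published values, used only for illustration.

ATSC_PAYLOAD_MBPS = 19.39    # usable bitrate of one 6 MHz ATSC channel
SD_STREAM_MBPS = 3.5         # typical MPEG-2 standard-definition stream
HD_STREAM_MBPS = 15.0        # typical MPEG-2 high-definition stream

sd_per_slot = int(ATSC_PAYLOAD_MBPS // SD_STREAM_MBPS)
print(f"An analog slot carried 1 program; the same slot now carries "
      f"~{sd_per_slot} SD programs, or one {HD_STREAM_MBPS} Mbit/s HD program.")
```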
Yet one more step was needed before widespread internet and cellular-based video could occur, allowing us to watch television programs as we do now. This was not a technical advance but an economic one--the sharp drop in the price of computer memory, which happened around 2009. Prior to that, computers had far less memory and storage capacity. Perhaps you remember the agony of trying to watch a YouTube video in its early years? Or of waiting for your browser to load? Now we take it for granted that we can view digitized images, create them, share them, watch pre-recorded programs, and record from multiple sources on our TiVo. There seems to be no limit to the ways that we can enjoy television, truly viewing "pictures at a distance." It is a far cry from the early years of television that many of us still remember, when we all gathered around a small, black-and-white screen with poor sound to watch John, Paul, George and Ringo sing "She Loves You." Now those were the days!
Thanks to my husband Rick Rikoski, for his patient and helpful explanations of the technology of television and its early development.
Monday, April 01, 2013
by Jalees Rehman
Nietzschean, Heideggerian, fascist, anarchist, libertarian, brilliant genius, blabbering nutjob - these and many other labels have probably been used to describe Peter Sloterdijk, who is one of Germany's most widely known contemporary philosophers. He has achieved a rock-star status in the echelons of contemporary German thinkers, perhaps because none is more apt than Sloterdijk at fulfilling the true purpose of a public intellectual: inculcating his audience with an insatiable desire to think. His fans adore him; his critics are maddened by him. Few, if any, experience indifference when they encounter the provocateur Sloterdijk.
Sloterdijk achieved fame in Germany after publishing his masterpiece "Kritik der zynischen Vernunft" (English translation: "Critique of Cynical Reason") in 1983, but his hosting of the regular late-night talk show "Das Philosophische Quartett" on the major German TV network ZDF for ten years turned him into a cultural icon and a household name. I realize that it might seem strange to non-Germans that philosophers instead of comedians can host TV talk shows; however, Sloterdijk would probably be the first to agree that there isn't much of a difference between a true comedian and a true philosopher. Not only do we Germans have TV philosophers, we even enjoy the TV gossip and cockfights that they indulge in. When the ZDF network decided to get rid of Sloterdijk and replace him with the younger, more handsome and less thoughtful philosopher Richard David Precht, the two started engaging in reciprocal mockery and name-calling.
Unfortunately, Sloterdijk is not quite so well-known in the English-speaking world, and this may in part be due to the fact that much of his oeuvre has only recently been translated into English. It is no easy feat to translate his writings, in part because his playful mastery of German words is one of his signatures. Sloterdijk is a wonderful story-teller who weaves beautiful images and puns into his narration, many of which are unique to the German language. His story-telling also makes it difficult to understand some of his texts in the original German. One may be enthralled by his stories, but after reading a whole chapter or book, it is quite difficult to condense it into a handy "message" or "point". Sloterdijk is a professional digressor, going off on tangents that are entertaining and exciting, but at times quite frustrating. He shares his brilliant insights on a broad range of topics, from metaphysics to politics, with his readers, but he also offers practical advice on how we can change our lives, as well as bizarre and pompous statements.
One of his more recent books is called "Philosophische Temperamente: Von Platon bis Foucault", which can be translated as "Philosophical Temperaments: From Plato to Foucault" and it is not yet available in an English translation. In the 1990s, Sloterdijk assembled a collection of texts and excerpts by 19 philosophers (Plato, Aristotle, Augustine, Bruno, Descartes, Pascal, Leibniz, Hegel, Schelling, Fichte, Schopenhauer, Kierkegaard, Marx, Nietzsche, Husserl, Wittgenstein, Sartre, Foucault) which he felt ought to be studied. Sloterdijk was convinced that the best way to truly approach a philosopher was to read the primary texts instead of relying on secondary sources. He also wrote short prefaces for the 19 volumes, each containing 400-500 pages of texts by one philosopher. The prefaces were intended to serve as brief introductions, enticing the readers to delve into the main volume. These prefaces were not academic-style summaries of the lives or works of the philosophers, they were verbal portraits painted by Sloterdijk. They were subjective impressions of their philosophical moods and temperaments, which explains why the collection of these 19 prefaces was released under the title "Philosophical Temperaments".
As with so many portraits, they reveal more about the painter than the subject. "Philosophische Temperamente" allows us to take a peek into Sloterdijk's own temperaments. These portraits are stand-alone essays, but what is most striking is that despite their brevity, they are packed with provocative insights. The whole book has only 144 pages, and few of the portraits are longer than seven pages. Even in these tiny portraits, Sloterdijk manages to digress, using a few core ideas of the philosopher as a starting point and then drawing parallels to our lives today. But it is precisely these kinds of digressions and parallels that remind us why these dusty classics of philosophy continue to be relevant for our lives.
This past decade has seen the rise of the TED-talk mentality. The idea of providing a forum for innovative thinkers to share their ideas with rich conference attendees, as well as the not-so-rich general public via a free internet broadcast, has become a hot fad. Now that we are inundated with thousands of TED-talks and TED-copycats, many of us have developed TED-fatigue. The expression "TEDtalking" may soon become a new form of insult, referring to the watering down and oversimplification of complex ideas, the sharing of touching and life-changing personal stories, or the exuding of excessive positivity which fills the audience with vacuous joy and earns heartfelt applause. I always thought of Sloterdijk as the prototypical anti-TEDtalker, because his writings do not attempt to leave the reader in a happy and cozy place. Sloterdijk likes to challenge us, evoking intellectual unease and restlessness in our minds and inviting us to disagree. His essays and books, with all their digressions, tend to be so long that I thought it inconceivable for him to condense them into a 15-minute TED time slot. Sloterdijk does not offer any convenient prefab take-home messages or TED-style smug happiness.
After reading "Philosophische Temperamente", I have begun to reconsider my views on Sloterdijk and TED-talks. In these 19 mini-essays, Sloterdijk gives TED-talks without TEDtalking. His TED stands for "Tease Entertain Disagree" and instead of the traditional TED motto of "Ideas worth spreading", Sloterdijk presents us with "Ideas worth critiquing". Perhaps the organizers and presenters at TED-conferences could learn something from Sloterdijk's style.
Each mini-essay is a teaser which could potentially ignite discussions, not only about a specific philosopher, but also about the role of philosophy itself. The portrait of Augustine, arguably the least flattering in the book, suggests that he infused Western thought with a sense of debasing anti-humanist "masochism", the idea that humankind is worthless were it not for the grace of God. This idea directly connects Augustine to contemporary debates revolving around the role of religion, which apply not only to Augustine or Christianity, but to all religions. The other portraits offer similarly provocative statements.
Here are translations of a few short excerpts from the book:
The chapter on Plato is the longest in Sloterdijk's book, but it discusses far more than just Plato, ranging from the purpose of philosophy to the ills of contemporary fundamentalism.
„Der Fundamentalismus, der heute weltweit aus dem Mißtrauen gegen die Modernität entspringt, kann immer nur Hilfskonstruktion für Hilflose liefern; er erzeugt nur Scheinsicherheiten ohne Weiterwissen; auf lange Sicht ruiniert er die befallenen Gesellschaften durch die Drogen der falschen Gewißheit."
"The world-wide phenomenon of fundamentalism which in today's world is rooted in a distrust of modernity can only serve as futile aides for the helpless; it generates pseudo-certainties without the desire for further knowledge; in the long run it ruins the afflicted societies with the addictive drug of false certainty."
The portrait of Schopenhauer introduces him as the pioneering thinker who quit the "Church of Reason" ("Vernunftkirche").
„Von Schopenhauer könnte der Satz stammen: Nur die Verzweiflung kann uns noch retten; er hatte freilich nicht von Verzweiflung, sondern von Verzicht gesprochen. Verzicht ist für die Modernen das schwierigste Wort der Welt."
"Schopenhauer might have uttered the phrase: Only despair can save us now. Yet he did not speak of despair, but of renunciation. For moderns, renunciation is the most difficult word in the world."
This passage from the chapter on Marx includes a fascinating statement about contemporary media:
„Telekommunikation läßt sich von Televampyrismus immer schwerer unterscheiden. Fernseher und Fernsauger schöpfen aus einer verflüssigten Welt, die kaum noch weiß, was widerstandsfähiges oder eigenes Leben wäre."
"It is becoming difficult to distinguish between telecommunication and televampirism. Television and Telesuction draw from a liquefied world that hardly knows the concept of an independent or resistant life."
It is difficult to translate Sloterdijk's neologism "Fernsauger", which literally means "tele-sucker" or "tele-suction device". In the original German, it is a beautiful play on the words Fernseher (television or tele-viewer) and the German word for a vacuum cleaner ("Staubsauger", literally a "dust-sucker").
Finally, from the portrait of Sartre:
„Was Sartre angeht, so blieb er zeitlebens seiner Weise, die bodenlose Freiheit zu leben, treu. Für ihn war das Nichts der Subjektivität kein herabziehender Abgrund, sondern eine heraufsprudelnde Quelle, ein Überschuß an Verneinungskraft gegen alles Umschließende."
"As for Sartre, he remained true to leading a life of boundless freedom. For him, the void of subjectivity was not an abyss that pulls us down. Instead, it was a spring, gushing upwards and resisting all forms of enclosure."
English-speaking readers will soon be able to read a translation of the complete book, to be published by Columbia University Press in May 2013. I have not yet seen the translation, but I suspect and hope that the nature of this particular book will make it one of the most accessible introductions to Sloterdijk's thinking, and an explanation of why we should continue to study the classic Western philosophers.
Monday, November 07, 2011
The (De)Merits of Pop Culture Conferences: Coyne and Tanner on the 'Jersey Shore' Academic Conference
by Tauriq Moosa
The gods of irony are smiling. I recently cited the existence of the TV show Jersey Shore as the closest thing to an insult I could fathom for myself, when comparing myself to Christians who regularly want things banned. Then, thanks to Jerry Coyne, I discovered that my old friend – my seriously old and now obviously senile friend – academia has cozied up to said show, in order to get them young folk interested in “bigger questions”.
Not so long ago, the University of Chicago held an academic conference on Jersey Shore, where the various sessions discussed important topics like: “The Monetization of Being: Reputational Labor, Brand Culture, and Why Jersey Shore Does, and Does Not, Matter”, “The Construction of Guido Identity” and “Foucault’s Going to the Jersey Shore, Bitch!”. What are the merits of holding conferences on pop culture, where questions of metaphysics, ethics and “identity” (I still don’t understand that topic) are discussed? Anchoring these questions to pop-culture topics, like Jersey Shore, is like putting scented oils on a corpse, serving little purpose other than to keep our breakfasts down before we bury the whole mess and carry on with our actual lives.
Coyne certainly thinks it’s largely useless. He says: “(1) I’m not a huge fan of academic pop-culture studies, which seem shallow, too infested with postmodern obscurantism, and bad in that they replace more substantive material that can actually make students think deeply about things. (2) Pop-culture courses seem to me to be an easy way for professors to attract students by tapping into their t.v.-watching and music-listening habits.”
Now those are two distinct points. The first argues that pop-culture conferences are largely useless: a waste of time and resources, too indulgent of obscurantism, replacing actual learning with the illusion of grappling with profound subjects because the titles indicate “big questions”. The second points out why such conferences exist at all and how professors can be comfortable teaching this with a straight face: it gets them students, and therefore maintains income, because more students would come to a course on Jersey Shore than to vanilla ones on Plato, etc. The second is a description and seems to me obviously true: it is one way to keep education alive, one way to secure oneself a regular job, and so on, by affixing your learning to what your audience actually cares about.
Whether it is a good or even the best way to do so is another question, which, by assessing Coyne’s first point, we should come closer to answering. In the end, I do think there is a better way and I think it’s in fact more damaging if we indulge in what “the kids” are focused on rather than what they should be focused on.
Larry Tanner, at his blog Textuality, has written something of a reply to Coyne (actually that’s the name of the post). Tanner notes that Coyne only went to two lectures (which is two more than I would have). Coyne summarised his views at the end, which only confirmed his opinion of pop-culture conferences. Says Coyne:
Waste of time and the money used to fund it. I know readers will contest this, and I did go to only two talks, but both were dire, boring, and completely unenlightening. It was a deadly combination of postmodern theory and pop culture. It’s harmless to talk about this, I suppose, but it’s a question of how to prioritize academic funds—and scholarship.
Tanner, looking at the program, which Prof Coyne was kind enough to share, says “the broad topics seem worthwhile.” I want to look at what Tanner says and see whether he manages to offer good reasons for us to take the conference even a little seriously, and therefore disagree with Coyne on pop-culture conferences in general. Each topic I'm numbering here is on the program which you can see over at Prof. Coyne's website.
1. “The Construction of Guido Identity”
Tanner on the topic “The Construction of Guido Identity” says: “I don't know much about Jersey Shore in particular, but a session that looks at the show's representations of masculinity, race, sexuality, and identity seems pretty interesting. What makes someone manly in that world? What importance is placed on identifying as an Italian American?”
Again, I’ve never understood what identity studies are about. I say this as someone who lives in South Africa. Having engaged with it for many years, I’ve found identity studies to be nothing but nonsense posturing as deep, complex, psychological questions. In the end, who the hell cares? I’m an ex-Muslim who studies bioethics, to change public policy on matters of euthanasia and organ donation, and I read too many comics – I’ve never considered what my identity is or means in the context of a society that is largely unemployed and uneducated. What I have considered is what those factors of unemployment and lack of education will do when I attempt to engage in political change on matters of medicine (since the majority of the very population I want to benefit might not at first understand my reasons for wanting medical practitioners to kill their patients, legally).
But will engaging with what it means to, say, be a man in today’s world really be an important topic? I’m always hesitant about such topics, since sometimes people want to turn what should be a discussion into a platform to advocate how men (or women) should be; which I think is unfounded, since gender roles don’t make sense anymore with, for example, increasing acceptance of homosexual relationships and artificial insemination. Who cares “how” a man should be in today’s world? I don’t think it’s a relevant topic, but then that’s just me.
Furthermore, what does it matter what such ideas mean to Italian-Americans? Unless you want to discuss culture and politics, I’m not sure gender discussions are relevant. What conclusion could one possibly reach from this kind of discussion that is so important it would change our view on Italian-American men or vice versa?
Notice, all we’ve done here is focus on the topic and not its relation to Jersey Shore. If the actual topic is pointless and a waste of time, what advantage is there to adding the characters and interactions of this very stupid show? It seems to make it even worse.
2. “Morality and Ethics”
I know Prof. Coyne thinks this an important topic. Naturally, I do too, given that I’m doing a postgrad degree in it. Coyne would not say the topic of morality and ethics is pointless, but how it is discussed can be. And given that Jersey Shore and morality are as incompatible as science and monotheistic religion, we should not be surprised Coyne thinks this. Consider: The topic of whether euthanasia should be legalised is an important one. But this debate could be held by two over-emotional first-year students and lead nowhere (I’ve even seen futile moral discussions among medical students I’ve taught and my colleagues who are medical practitioners). This, it seems to me, is what Coyne means: the discussion is futile even if the topic is legitimately important.
Tanner asks: “Again, the general topic [of morality and ethics] seems worthy of the humanities. Why not use a popular TV show to explore moral behavior and ethical dilemmas faced by the characters (albeit as edited by the show's producers)?”
Firstly, we disagreed that the first topic was worth investigating at all. The fact that it was discussed within a Jersey Shore context only highlighted how pointless the discussion was.
Secondly, as I’ve said, the way the conversation is conducted could leave a legitimately important topic, like morality and ethics, inundated with myopic views and going nowhere. A glance through that section (on Coyne’s website) shows that in fact two out of three discussions deal with Foucault. So there isn’t much diversity in the topic itself, let alone aspects that would appeal to those who know little or nothing about Foucault. Note that Foucault discussions tend to be quite narrowly focused anyway, so it wouldn't even fit the criteria Tanner discusses.
Now, to be clear, I can understand using certain examples from pop culture to highlight important topics. I’ve just completed a chapter on The Walking Dead and philosophy (about why having babies is immoral), which should be coming out soon. Using familiar subjects can make discussing the overall topic enlightening to newcomers. My favourite modern philosopher, James Rachels, used various examples from literature and television to clarify his points. It’s simply a method which, depending on your ability as a teacher, can either work or bomb. However, as I’ve highlighted, the topics themselves in this conference seem poorly handled. As I was not there, I can’t say whether this is true, but I find it hard to believe you could salvage a good discussion from these topics except to say why no one in academia should be wasting their time discussing them in the first place. Furthermore, staying within the bracket of "morality and ethics", if the majority of your topics concern Foucault, I’d hardly call the entire topic morality and ethics.
I won’t say more, since I do think it possible – if I imagine very hard – that one could have a fruitful discussion on morality and ethics, using Jersey Shore. But I’m not convinced, given the actual discussion topics and the nature of the whole conference from what I’ve read.
3. Affect, Honor and Desire
Tanner says of this topic:
This is one session I would have liked to attend, especially the paper involving medieval Iceland. Ultimately, this session appears to want to understand how the characters of the show view their world. This understanding is the bread and butter of the humanities. We want to know the world just as Hamlet did, or just as Beowulf did, or just as Guinevere did. We want to capture the cultural perspectives that audiences in the past brought to their lives and to the artworks presented to them.
I have a short response to this: Why?
4. Guido Cultural Signifiers
Tanner says: “Another good session, if the paper titles are any indication. Surely, the people on the show have personalities and behaviors that appeal to viewers. Asking why this is so and looking for answers that go beyond pat stereotypes--well, these seem like good things to me.”
I don’t know what cultural signifiers are. I assume they are properties that indicate what constitutes a particular culture? Perhaps like the crescent moon and star signifying Muslim culture, red-white-and-blue signifying America, etc. Regardless, Tanner does have a point. It is good to know why millions of people would pay to watch mundane, untalented people go about getting drunk and acting idiotic. There are plenty of hypotheses: most people are comfortably bored with their lives and, lacking creative stimulus, enjoy seeing "better" versions of themselves through the tanned, ripped abs of Italian-American people from New Jersey; the show is so unbelievably stupid, you watch it the same way you do a car-crash in slow motion, except the things breaking are people’s lives and what’s dissolving is time better spent elsewhere; and so on.
Whatever the reason, you could use that information to get those millions of viewers on to better, more important topics. For example, we can also ask: why could Carl Sagan get the whole world fascinated with science? How does Richard Dawkins (and indeed, Jerry Coyne) do it?
These are legitimate questions, since here we actually have something the other topics appear to lack: data. Yes, information we can use. I’m not sure how many humanities students have heard about data and evidence – judging from my year of teaching them, not many – but here is a topic in which data and evidence could arise.
But, um, given that it’s about cultural studies and signifiers, I think it’s doubtful.
Tanner has raised at least one good point, but quite by accident, I think. Coyne is right that these all seem rather boring or obscure, pointless and a waste; Coyne is not saying, for example, that morality and ethics are useless, but how the discussion is conducted surely matters. Similarly, if the topics in and of themselves are useless – as I’ve indicated – then it’s hard to see what adding the Jersey Shore element could possibly contribute (make it more tanned, stupid, pointless, rough, dirty?).
Notice that despite my intense dislike for Jersey Shore, I’ve not argued that the conference was silly merely because its topics deal with Jersey Shore – I’ve even indicated that I could imagine (because I have a wonderful, active imagination) using Jersey Shore to explain elements of ethics. All I’ve argued is that this conference seems to be just as Prof. Coyne has said. The topic could’ve been anything: the Darwin biopic Creation, The Dark Knight, Carnivale, LOST, etc., and my points would remain.
Tanner has said we discuss all manner of topics, like identity and sexuality and power in Shakespeare: we put them in the context of England at the time. This has historical importance, since we gain clarity about the period. By examining Jersey Shore, Tanner says, we can do the same.
But again: even if this conference was on Shakespeare, it still seems pointless. The danger with pop culture topics is that it allows more easily for nonsense topics to sift through. What could the lecture topic “Foucault’s Going to Jersey Shore, Bitch” possibly be about? You could put whatever you want there, perhaps qualifying it by mentioning Foucault in passing. (I find Foucault a fascinating writer about European history. I stop listening when he talks normative politics and ethics).
It is also good to see that Tanner agrees with Coyne that “most humanities scholars are taught, at least through imitation, to present the paper first and foremost, and worry very much less about connecting with an audience. This is unfortunate, especially since in our classrooms we are talking to our students and seeking to engage them as best we can.” Alison Gopnik has recently given an interview confirming this: that lecturing is perhaps not the best way to teach. Small groups, with active engagement, are better where it is possible to have them. This forces you, as the teacher, to be on your feet, asking questions and having a conversation. Of course there’s a place for lectures, but even lectures should be, I think, tempered by trying to convey the information rather than serving as a display of your intellect. I would rather be thought dumb but coherent than smart but incomprehensible. The humanities, I’ve learnt, train you to be the latter – again, the main problem is that it, like this conference, prevents the most important aspect of education - grappling with problems, failing, succeeding in arguments, discovering alternate views and so on - from arising.
In the end, though, Coyne and Tanner’s views serve to engage us in this discussion of how education is best served. It seems to me, as it does to Coyne, that this is not a good way to do it. But then, what do I know?
Monday, December 13, 2010
An Open Letter to the National Punditry
Dear Esteemed Pundits of America,
The 2010 mid-term elections are behind us, and all the post-mortem analyses of the races are complete. Yet the 24/7 news cycle, and the corresponding demand for your incisive commentary, will not abate. So, what next? Will you turn your attention to the Congress and examine the ways in which the new House leadership clashes with President Obama? Will you look ahead to 2012 and offer odds on who will be the Republican nominee and how likely he or she is to defeat Obama? Will you continue to discuss the Tea Party in your ongoing attempt to discern who they are, what they want, and whether they matter? Will you investigate the gradual implementation of our healthcare bill and monitor the inevitable dissolution of DADT? Will you be able to sustain your interest in our increasingly quixotic military adventures? Or will you take up a cause you regard as underappreciated among the American people? These are all arguably worth your consideration. But we have a better idea: Resign from your job in broadcasting and run for public office.
We admit that this is a bold suggestion. Perhaps it has never occurred to you to seek political office. But consider how this course of action is required in light of the things you say and how you understand yourselves.
You take yourselves to be public figures committed to keeping the American government in check and on the right track. You offer daily commentary on national politics as a crucial contribution to American democracy. You do not merely report the day’s news; indeed, many of you claim that you are not reporters at all. Rather, you claim to be commentators on the news, and you draw a sharp conceptual divide between yourselves and “the mainstream media.” We understand that you must insist on this distinction, for you take one of your central tasks to be that of exposing the media’s biases, distortions, and blind-spots. You understand your job to be that of helping the American citizenry to strip away propaganda, double-talk, and spin. You present the facts, and then you help the American people to understand what they mean. We’re thankful.
As critics, you play an invaluable role in American democracy. If, as you say, nearly everything reported on the nightly news and in the daily papers is infected with spin and bias, democracy surely needs people like you to help us to sort it out. Democracy is centrally about holding power accountable to those over whom it is exercised. And transparency is necessary for accountability.
That’s where you come in. You help us to hold power accountable by making transparent to us all the ways in which our government is incompetent, inefficient, dishonest, untrustworthy, two-faced, and unscrupulous. In fact, many of you go further than this. You claim not only to expose the intellectual and moral failures of those in power; you also take yourselves to know how to do better. You frequently speak as if seemingly complex questions about domestic and international policy are actually simple and easy, once the scales of stupidity and immorality are lifted from one’s eyes. You take it upon yourselves to educate us. Consequently, in your daily communications you present your cases in support of the policies you think best. And since you are in control of a forum which enables you to address large masses of people over many days, weeks, and months, you can be highly persuasive.
Since you take yourselves to know so much, and to know that others know so little, it arguably is a very good thing that you wield the power to persuade sizable portions of the American citizenry. However, there is a catch. And it’s a catch that should come as no surprise to you, given the principles you espouse. The power to persuade people about political matters is itself a form of political power. And, as you know, in a democracy, those who wield political power must be accountable.
Thus we find it odd that you, political commentators who hold that accountability is so crucially important, are content to hold positions of power that are not accountable to the people. How can you stomach it? Every day you call your audience to indignation over the incompetence, duplicity, cowardice, and immorality of those who hold public office. You claim for yourselves the heroic role of uncovering governmental ineptitude. You proffer warnings of our great country’s imminent demise. You openly condemn those who hold public office for being out of touch with the American people. And you freely provide what you claim are fail-proof prescriptions for restoring freedom and prosperity in America. Still, you sit in a television or radio studio, safe from actually having to work to put your ideas into practice, forever insulated from the accountability you claim to so highly prize.
At the risk of sounding unappreciative of your daily service to American democracy, we ask: Aren’t you ashamed of yourselves? You profess to know so much about how the country ought to be run. You claim to know how to save America. Isn’t it immoral of you to decline to serve the American public in an official capacity? Given the principles you embrace concerning power and accountability, it would seem that you should take yourself to be morally required to seek public office. So why don’t you?
Of course, cynics will assert that in writing this open letter, we are simply proving our naïveté. They will say that we have falsely assumed from the start that you sincerely take yourselves to be serving American democracy, when in fact you couldn’t care less about America, but care only about that which you on a daily basis purport to hold in contempt: amassing money and exercising power. To be clear, we’re not cynics. But we must confess that in light of the considerations above we find it increasingly difficult to resist the cynical conclusion that you’re nothing but a collection of wind-bag opportunists immorally and obscenely profiting from the civic decay that you are in significant measure willfully precipitating. So consider: Major elections are coming up in 2012. Now is the time to begin planning your campaign. Resign from your comfy studio, get up from your armchair, and hit the pavement. Present your ideas to audiences that are not composed of self-selected sympathizers. Defend your views in forums where you cannot control the phone-lines or edit the footage after the fact. Seek a position where your political power and ambitions are constrained by the constitution you say you honor above almost all else. Hold yourself accountable to the people whose interests you claim to serve and for whom you profess to speak. Prove the cynics wrong.
Scott F. Aikin and Robert B. Talisse
Department of Philosophy
Monday, December 21, 2009
Remediality Studies: The Decade Gone By
T.S. Eliot might well have smirked at the events of the Naughty Oughties. By one yardstick, they came in with a bang and ended with a whimper, trussed up and devoured by the dirty deeds, done extravagantly, of the stuffed men, the hollow men.
Back in the green days of Communism's defeat (which we, in our typical hubris, called capitalism's victory) an American president spoke of creating a "bridge to the 21st century." Of course, this was dismissed as mere rhetoric by less (publicly) priapic politicians. Through the hindsight of the intervening years, however, it's become clear that such a bridge was indeed necessary. The left and right banks of America, blue in mood and red in face, were left hanging by chads on a Bridge to Nowhere, suspended within a fiction called The End of History.
History, that's the rub – history, and its myths. From the very first days of the Bush Administration, I sensed that the conservative American consciousness, boiled down into its thick molasses, was simply in fear of the future. We were held back, as a nation, by a persistent fear (predominantly among those who witnessed the chaos of the '60s) that history Xeroxes itself; that any struggle towards positive change, any at all, was a fool's errand, doomed to devolve back to Fascism or Communism, except this time with the extra added bonus of nuclear apocalypse. And those of us who came to oppose this nation's decisions perhaps understood ourselves as being held back, from advancing a grade in a school called Democracy and the Pursuit of Happiness. Held back, by a dubiously legitimate leader who clearly attended Bible School dutifully but spoke as if he himself hadn't passed the 3rd grade.
History, as Morpheus said, is not without a sense of irony. And it doesn't like to be declared deceased.
I know we want to leave this low, dishonest decade, but I say: not yet, not quite yet. There are still a few days in which we may legitimately consider what happened to us, before the tsunami of ever-present tensions crashes down upon us anew.
From my peculiar and partial vantage point, every great American crisis of the '00s – Y2K, the Dot-Com Collapse, The Great Indecision, 9/11, Iraq, Abu Ghraib, Katrina, global warming, the Media Crisis, and the Financial Crisis – stemmed from our inability to integrate the hyperspeed advances in media technology with our aging infrastructures – physical, economic, managerial, governmental, and moral. This is the chasm that needed and still needs to be bridged – it is, I believe, the parsing of Clinton's metaphor.
And from this chasm (with ceaseless turmoil seething) I saw two great übercrises mingling, and seeding the events of the Double Zeroes: a Crisis of Information, and a Crisis of "Reality." Information: too much of it, in terms too jargonized, too euphemized, and too fractured. "Reality": a state of being controlled by the new technologies of media, without sufficient intellectual tools or time for us to interrogate it adequately.
Yeah, whatevs, you say. Too subtle by half, you say. It's the "postmodern condition," get over it. Or: hubris and incompetence, failing upward rather than failing better, same as it ever was. Or: The Matrix. Live in the sewers, Neo, and jump buildings in your brain (got a better idea?). Sorry, folks, but I need to plumb a little deeper than those keyword searches.
The first true terror I felt in this decade, the first moment I perceived a great unraveling, was not on September 11, 2001. The date was May 8, 2002, when MTV broadcast the episode of "The Real World: Chicago" that was filmed on 9/11.
I kept an extensive journal in those days, and here are some of my immediate responses to that moment:
Pornography: it doesn't have to involve ass-fucking, ladies and gentlemen.
"The Real World," they call it – staged and manipulated by directorial and cinematographic choices, cut, distorted, warped by 15 minutes of fame – "The Real World" looks up, horrified, to find the real world.
Picture: a real-time television show manipulated to give the illusion of a "reality," which comes face-to-face with its evil twin, its alter-ego, its own Bizarro-World: a real event, an event so terrifyingly real that it blisters out any attempts at simulacra, a real event which then itself becomes a mirror of its own unreality as it's been portrayed in dozens of blockbuster action movies.
I watched the towers fall on t.v., videotaped and relayed, rewound and replayed for my your our pornoedification. Then I watched the towers fall on (in) "The Real World: Chicago," reflected in the eyes of the characters, who were watching it on t.v. I was watching the towers fall on a t.v. in "The Real World." In "The Real World," I was watching the towers fall on t.v. How many panes of glass are between me and it now? How bloody will my fists get when I try to punch through them all?
And what was more pornographic? I did and didn't want to see the rawness of emotion in the characters' faces. It was as if the producers had the same bifurcation of sensibilities: they started – then stopped. I began comparing: at this moment, when they are doing X, I was doing Y. I couldn't make any sense of it. They took characters out of the group ensemble around the t.v. – I assume at a later point, but edited as if it were commentary at the present moment – for individual "speak-to-the-camera" interludes – measured, solemn platitudes. Defense of the country, tragedy, yadda yadda yadda. Then a sober circle-discussion... unreal city. Unreal everything. As if the State Department and the CIA hiked straight over to MTV and said, "Listen, now, we can't broadcast anything that might be seen as 'inflammatory,' if you get my drift..." Or was it MTV itself? Or the kids? Or was it unreal? And God, GOD, how could they have manipulated these kids' emotional responses? How could they live with themselves? How could they have had the cynical, the serpentine savvy, in the middle of that day and the days that followed, to think, "Okay, folks, we gotta make some tee-vee out of it, let's work." To convert the unreal indigestible reality into digestible unreal "reality"... my god, my GOD! TV eats itself – I couldn't even trust the reactions of the kids sitting in front of the t.v., I couldn't even trust them to be honest, I was embarrassed to think of their reactions as "acted," "melodramatic" ...People drank, smoked, stoned themselves into frightened numbness – I did, you did, we all did... what did they do? Will we ever know? "The Real World," "The Real World" like dada.
As I discovered much later, of course it was fake. The entirety of that episode was staged. On the morning of September 11, the whole cast of the show was at Wrigley Field for a photo shoot. I still think about the psychological damage done to those kids. When asked for advice to up-and-coming RealWorlders, cast member Cara Kahn shouted out, "Don't do it!" But she might just be bitter, the entertainment press sniped: she admits she can't find an agent. Oh, good. So that's what it's about.
Not long after, I was road-tripping through Wisconsin. I stopped for a bite at Culver's ButterBurgers, a much-loved local fast-food chain. In every corner of the restaurant, upon every support column, there were televisions. And each one held a static image: not just one, but five pairs of burning Twin Towers, four corners and the center of the screen, with superimposed text: WE WILL NOT FORGET. For the record, that's four corners of the restaurant, perhaps another four support columns, making eight television sets, each with five sets of burning Twin Towers – that's 40 images, 80 smoking skyscrapers, in one random fast-food joint off a random Midwestern interstate.
My companion at Culver's, who'd been there in Manhattan, who'd watched the towers collapse from Washington Square Park, practically had a (totally justified) nervous breakdown in the time it took to order a milkshake. This was a multiplied terrorism: multiplied not by the images alone, but also by the exploitation of terrorism to service commercial sales under the guise of patriotism, (probably) unwittingly coordinated by American businesses.
So I was unsurprised when, two months later, White House Chief of Staff Andrew Card announced the Iraq War as a commercial enterprise: "From a marketing point of view, you don't introduce new products in August." As a veteran of the marketing industry, having observed the entire media campaign, I can only say: well played, folks. (Do I need a license for that lemonade stand?) As a human being and an American citizen, I can only say: can we finally indict these bastards on any one of a number of charges, beginning with Conspiracy to Defraud the Government of the United States? (Remember: We the People.)
No, I, at least, will not forget. I will not forget that one word said to "The Graduate," "plastics," (petroleum-based, you know), which came to designate not just the card that we indebted ourselves upon but the surgery which we committed upon our minds and bodies. I will especially not forget a program called "The Pulse," which was broadcast on FOX Thursday, April 3, 2003, three weeks after the war began. It contained a segment called "Extreme Makeover" (perhaps the first of its kind), in which a woman submitted to plastic surgery, on the network's dime, to satisfy her husband.
"Honey, if you lost that weight on your legs, you could almost be pretty," said her husband in an interview, as tears streamed down her face. No, I won't forget. I won't forget the look in her eyes, as the plastic surgeon drew lines upon her cheeks, describing how he was going to fix her "flaws" – the look of a cow's eyes, as it's herded to the abattoir. I won't forget how her individual face was replaced with the anonymous glaze of market-researched, pageant-approved "beauty."
We don't need Calvino's talents to trace that trope. It's a straight linear narrative to "The Swan," the FOX Network's 2004 confla(gra)tion of surgical self-improvement and beauty pageant. Jump-cut to a few years later, and Ralph Lauren models are posing with heads wider than their hips. And for evidence that history plays first as tragedy, then repeats as farce, gaze upon the Miss Plastic competition of 2009. In this international beauty pageant for surgically enhanced women, held this autumn in Hungary, front-runner Alexandra Horvath's silicone breasts proved too top-heavy: she tumbled over on the catwalk, tearing a ligament and ending up in a wheelchair.
She ought to have learned from former British Prime Minister Tony Blair: there are limits to the amount you can sex up your body of evidence, before your artful seduction causes self-harm. In an irreducible sign of our delusion (or is it a reality?) that the superficial appearance is of paramount importance, the Healthcare Bill's provision for a 5% tax on elective cosmetic surgery drew protests.
In the early 21st century, Andy Warhol proved himself a sage. Others followed his augury, and proved to be excellent speculators. With entire industries devoted to the manufacture of the fleeting famous (make 'em cheap, so they break down after a year and you'll have to buy this season's model) selling your soul was not enough to make your name. (That came cut-rate. When did a soul ever bank a decent ROI?) It was no longer your name; it was theirs. Your body, and your image, and your history, were to be forfeit too. On September 18, 2002, Salon obtained a copy of the contract that "American Idol" finalists were to sign. It stipulated, in part, that
"... I hereby grant to Producer the unconditional right throughout the universe in perpetuity to use, simulate or portray (and to authorize others to do so) or to refrain from using, simulating or portraying, my name, likeness (whether photographic or otherwise), voice, singing voice, personality, personal identification or personal experiences, my life story, biographical data, incidents, situations and events which heretofore occurred or hereafter occur, including without limitation the right to use, or to authorize others to use any of the foregoing in or in connection with the Series ...
"... I understand that, in and in connection with the Series, I may reveal and/or relate, and other parties ... may reveal and/or relate information about me of a personal, private, intimate, surprising, defamatory, disparaging, embarrassing or unfavorable nature, that may be factual and/or fictional."
Yes, you read that right. "Throughout the universe." "In perpetuity." "To use." "My life story." Hundreds of thousands of teenagers and young adults have bivouacked outside for days, and have waited on mile-long lines, for the incredible opportunity to sell the entirety of their lives to a corporation – before they're even alive for very long – speculating that the corporation will make their life worthwhile. It's a superb economic model: cashier a generation raised on shoddily-administered self-esteem programs, profit off its inflated self-worth, then turn yet another profit by finding a single diamond-in-the-rough who's worth far more than she bargained for.
Hey, it's a living.
(In this respect, digital-media thievery from the entertainment factories may be seen – in a time far more advanced than ours, perhaps – as a nonviolent civil-rights protest.)
America was attacked on 9/11, but it was a symbol of capitalism, trade and finance that was most visibly and horrifically targeted. With our first "MBA President" installed in office, it was natural that Business would fight its war, with its world-beating weapons: investment banking, media, and entertainment. CGI extravaganzas and demagogic opinionators dissolved the distinction between reality and "reality," while Wall Street deployed its ARMs, blowing away Reason with such dexterity that it advanced to the Bonus Round.
Considering the hypermaterialism with which we pacified our unreal worlds, the housing boom and bust, Iraq, Afghanistan, Bernie Madoff and the rest, perhaps the exact phrase to describe our epoch would be: "The Decade of the Own Goal."
It is said of the writer's art that lies are deployed in the service of deeper truths, because, as T.S. Eliot wrote, "human kind cannot bear very much reality." I cannot lie: the fact of the fiction of "The Real World" on 9/11 tells us more about our age than we really want to know.
David Schneider is completing a book on American media, culture and politics in the early 21st century.
Monday, March 16, 2009
Of Sleuths and Starships
One of the great achievements in the art of today will draw to its conclusion this Friday on the Sci-Fi Channel. If you're not familiar with Battlestar Galactica, but you admire superb filmmaking, literature, or the languages of symbol and myth; if the sci-fi genre gives you the geeky creepies, but you consider issues of government, history and technology to be critically important for our collective future – if you want to provide a superior education for your children of teenage years or above – I recommend marathoning the DVDs. The four-season show caps an extraordinary decade of accomplishment in a medium that we, for the moment at least, refer to as "television," however increasingly antiquated that word might sound.
A completely new type of televisual art has bloomed right under our noses, so quickly it's only just acquired a genre name. (I hope the name's provisional. "Mega-movie" is pretty bad.) I prefer the term "video literature," or "VidLit," as the college shorthand would have it: densely woven, symbolically rich, long-arc dramas with a large ensemble cast of rounded, three-dimensional characters who mature and evolve. In this category we'd place, among others, Buffy the Vampire Slayer, Firefly, The Wire, Deadwood and Veronica Mars.
Veronica Mars, for its first two seasons (2004-06), really upped the ante in terms of what could be accomplished through video narrative. The show, which tv writers pegged as a "high-school detective drama," wasn't just diagnosing America's socioeconomic illnesses; it began predicting them with freakish accuracy. In the Season Two premiere, Veronica and her high-school journalism class go on a field trip to meet with a local baseball star. The rich popular kids hop a limo back to class; the working-class kids take the schoolbus. The schoolbus drives off a cliff and plummets into the Pacific. It's the lever on which the entire season is lifted. And it was broadcast on September 28, 2005, less than a month after New Orleans drowned, exposing the fatal inequities in American society.
Two episodes later, Veronica finds herself in a Future Business Leaders of America club, betting on investments. A local real-estate investment trust, Casablancas Enterprises, appears to be doing spectacularly well – until Veronica discovers the properties are all scams. Two scenes later, the CEO, H. Richard Casablancas, hears "There's a gentleman from the SEC here to see you" – abandons his office, tells his employees to "shred everything," and jumps aboard a company 'copter bound for Mexico.
That's the mortgage crisis in a nutshell, folks: narrated by the character of a 17-year-old girl, on a show about a "high-school detective," almost exactly three years before the collapse of Lehman Brothers.
In Veronica Mars, the action-movie celebrity is a sex addict; the Major League slugger, a gambler in hock to the mob. The mayor schemes to incorporate the town along its class lines; and the gang-banger from the barrio delivers street justice when the politically-knotted sheriff's department sits on its hands. The fictional town of Neptune, CA becomes a psychopolitical portrait of America, and the Private Investigator, Veronica Mars, is protagonist, author and instrument: an X-ray revealing the interpersonal archaeology, the socioeconomic strata, and the webs of personal politics that comprise 21st-century American life. For a societal cross-section of equal bandwidth, I think you'd have to turn to Dickens. You can watch the entire first season online.
It's been thrilling to witness so many artistic renderings of our postmodern condition: Buffy's use of horror and the occult to illuminate our adolescence; Firefly's portrayal of the rogue versus the establishment; The Wire's surveillance of our social ills; Deadwood's ballad of interdependence and moral compromise in the making of the American West. And then there is Battlestar Galactica.
For the uninitiated, BSG was originally an ersatz 1978 sci-fi movie and 1979 tv series about the extermination of the human species by a race of robots called Cylons. The survivors of the holocaust flee aboard a ragtag fleet of space-freighters and pleasure cruisers led by the sole surviving military vessel, the Battlestar Galactica. Their goal: an uncharted planet, known only to myth – a new home, a planet called Earth. It was reimagined in 2003 by Ronald D. Moore, who cut his sci-fi teeth writing for Star Trek: The Next Generation, Star Trek: Deep Space Nine, and Star Trek: Voyager.
For a sense of what he did with it, seek out the witty, zippy video recap of Seasons 1-3, designed not to "spoil" the show but to whet your appetite for a detailed viewing. It's also pretty darn entertaining.
It has been said, with a degree of accuracy, that Battlestar Galactica's plotlines have tracked, and have provided a running commentary on, American government after 9/11 and during the Iraq War: a subtle and sophisticated counterpoint to the ideological certitude of 24. There's a Memorial Wall for victims of the Cylons' terrorism; Cylon "sleeper cells"; a president of dubious legitimacy directed by religious prophecy; questions about responsibility and "blowback"; and a brutal, complicated conversation on the ethics of torture, both physical and psychological. There are also important conversations about the constitution of democratic government, the balance of power between civil and military leadership, the role of a free press, radicalism and political imprisonment, and the role of labor unions. To which we can say, "Not half bad, Mr. Moore." But BSG's ambitions are far greater than portraiture; it attempts, and I would say succeeds in, creating a vital, dynamic myth for contemporary Western civilization.
BSG's primary underlying theme, as of most science fiction, is the relationship between our technology and our humanity. In its narrative, we can see a thematic genetic code stretching back through a number of landmarks in science-fiction: the consequences of the scientifically monstrous creation in Frankenstein; the replicants of Blade Runner; the robot wars of The Terminator. And William Gibson's Neuromancer and The Matrix introduce the fusion of cybernetic and organic mind, the mind able to interpret multiple sets of streaming code.
What BSG brings to this lineage is a spectacularly creative rendering of religious history. The Cylons are monotheists – radically defamiliarizing our own civilizational heritage of monotheism. Perhaps Moore was inspired by the ominous Cyclopean eye of a Cylon, moving back and forth in its black slit. Interestingly, in checking this idea I encountered the original Cylon – an Athenian who, directed by the Oracle at Delphi, attempted a coup in 632 BC, the first reliably dated event in ancient Greek history. This fact may well have inspired Moore to make the humans in this story polytheists, with an Olympian pantheon; they comprise the 12 colonies of Kobol, each named for a Zodiac sign. (A mythical 13th colony is said, in the ancient writings of Kobol, to have discovered the route to Earth.) It may have also inspired Moore's single greatest creative innovation, the Hybrid: a failed Cylon clone who, plugged into the heart of each Cylon base ship, effectively becomes the ship – sensing the entire universe in its data stream, which it voices in a mad Oracular stream-of-consciousness like Ulysses with a tech degree.
A recent article in The Ampersand considered BSG's relationship to The Aeneid; I think that limits the scope of our imagination. Galactica is commanded by an Admiral Adam-a. Two important characters are named Saul (the first King of Israel, and also Saul of Tarsus, later St. Paul) and Ellen (a letter shy of Helen, the face that launched a thousand ships). BSG, I think, has remixed the chaotic uncertainty and strange syncretism of religious belief that characterized the late Roman Empire, during the first few centuries of Christianity.
In the midst of our anxieties over the future of our Republic, now a cultural Empire in all but name, we have been granted a work of the highest art: a saga that recontextualizes our deep cultural history by defamiliarizing it, vaulting it into the distant future, reminding us of the chaotically cyclical nature of history, and then returning us to our immediate present. In the penultimate episode, broadcast last Friday, we are returned, Lost-style, to the lives of the characters on Caprica before the nuclear apocalypse.
They look very much like our own.
"All this has happened before, and all this will happen again," say the Prophecies of the Lords of Kobol. Without giving much away, in this season's astonishing episode "No Exit," a Cylon sleeper on board Galactica is shot in the head; the bullet hyperactivates his neural net, and he begins narrating, breathlessly, the entire history of the Cylon-human conflict, thousands of years old, forgotten beneath the hardened facts of the survivalist present tension – like a prophetic poet. Like an Oracle becoming Herodotus:
bright stars I'm lost. And all the forgotten faces all the forgotten
showed and we seek the Great Forgotten we seek the Forgotten Language…
He whose guile, stirred with revenge… and all the forgotten faces…
Something wonderful is happening. Get the others. I remember everything. I see everything.
Later in the episode, another Cylon model tells of witnessing a supernova.
"I don't want to be human," spits Dean Stockwell's character with agonized contempt:
I want to see gamma-rays. I want to hear X-rays! I want to– I want to...smell dark matter. Do you see the absurdity of what I am? I can't even express these things properly because I have to– I have to conceptualize complex ideas in this stupid, limiting spoken language! But I know I want to reach out with something other than these prehensile paws, and feel the solar wind of a supernova flowing over me. I am a machine: and I could know much more. I could experience so much more but I'm trapped in this absurd body. And why? Because my…creators thought that God wanted it that way.
It's as if Hamlet's monologue had continued after "quintessence of dust." As if Milton's Satan had circuitry. And it's as if both Hamlet and Paradise Lost are being quoted by a character I'll refrain from naming.
From there, it's a debate about history and responsibility – who is ultimately responsible for the Fall? I can't go into detail here, but suddenly it appears that Homer, Moses, Shakespeare, Milton and Blake have all been collected for that Dinner Party in Heaven we've all fantasized about attending, and they're debating the Israeli-Palestinian conflict using the chicken and the egg.
"Go back far enough, a germ gets blamed for splitting in two," says one of them.
Back in the operating room, the bullet-brained Cylon is struggling through aphasia. Because this is about history. Words. The Word. He manages to explain, "…Cavil rejected mercy, he had a twisted sense of morality, he blocked access to our books – no – ahh – our memories –" and brings up a name. Daniel. "God is my judge," the name means: a famous interpreter of dreams, a prophet, able to read the writing on the wall.
In last week's episode, "Daybreak Pt. 1," this character falls into a coma, and assumes the role of a Hybrid. He remembers his life as a "human." Again, hopefully without giving too much away: he speaks some extraordinary words.
Ladies and gentlemen, we've heard from the "quintessence of dust"; we've heard the voice of Milton's God; now we're hearing Hamlet talk about the genius of this "paragon of animals" after hearing about Newton's God, a geometer.
To be as gods. We Fall. We become human. We evolve. We invent. We strive for a more perfect union. Once upon a time, the world was young and life was nasty, brutish and short. In the struggle for survival, heroes were made, and people who understood humanity were endowed with far sight and wrote histories and myths.
We believe, we investigate the world, we invent technologies to serve our ends, only to have them buckle civilization itself: the printing press, slavery, variable-rate mortgages – yes, all are technological systems we've invented. But once upon a time, science and religion and mythical syncretism were apprehended in artistic vision, and it pulled us out of our medievalism – into the Renaissance. There's a reason why academic symposia have been held on Battlestar Galactica, and why, tomorrow evening, there will be a panel discussion on it at the United Nations.
Just maybe, we can make it.
Monday, December 22, 2008
Gaza, Giza and the other CNN effect
My grandmother loves me very much. The feeling, of course, is mutual.
So, with that qualifier out of the way, please forgive the following anecdote. I am a good boy at heart, and my grandma’s English is poor enough that she will never read this.
In early June 2007, I flew to The Middle East (“The” has to be capitalized, for reasons that will become clear). I landed at Ben Gurion International Airport and made my way to Ra’anana, one of the satellite communities around Tel Aviv, where I presented an academic paper on a Polish journalist who interviewed the famous (and infamous) Avraham Stern shortly before his death.
My grandmother, who raised me in my youth and with whom I enjoy an Obama-ish relationship, was quite proud that I was presenting my research at an academic conference in a foreign country (“My grandson! Look at him!”). However, she was worried. A conference was great, she said, but why did it have to take place in what she still refers to as the Holy Land, which, in her mind, is a country of bombs, raids, irate settlers and marauding bulldozers, each liable to maim or kill her eldest grandson.
“Why don’t you present the paper in Canada?” she asked when I first told her about my trip. “Or come visit, and do it here?”
Although I had no answer for her at the time other than my customary “don’t worry,” I began to consider my grandmother’s anxieties.
She has never been to Israel, Palestine, Jordan or Egypt (my itinerary), and the last time she set foot in “The Middle East” was in the 1980s, when she travelled to Libya to visit my grandfather, who was among the Polish engineers helping the then-evil Gaddafi regime build highways in exchange for oil.
Since her very successful visit (she found Libyans to be kind and engaging, and she recalls the archeological ruins near Tripoli with a smile in her eyes), her only exposure to “The Middle East” has come from the same source as the rest of us: the international press. And, with the Cold War in the rearview mirror, international reporting in Poland and the rest of the old Soviet bloc has come to be dominated by the same international news agencies, the BBC, and, of course, CNN, which has scrutinized the region with increasing frequency (and increasing anxiety) since its rise to prominence during the first Gulf War.
While the term "CNN effect" has been used by television pundits (sometimes on CNN) and by scholars of international affairs to denote the influence of the 24-hour news cycle on foreign policy formation, there is another, non-elite, CNN effect at play. While scholars (such as George Washington University’s Steven Livingston) tend to focus on the effect that images of wars, natural catastrophes, terrorist attacks or man-made humanitarian disasters have on policy-makers, fewer studies look downstream, to the types of opinions and prejudices formed among the general population that outlast each particular crisis. What happens after the headlines change to the next earthquake, explosion or massacre? What is the residual CNN effect, and how long do the headlines echo, beyond the initial flashpoint that draws in and consumes the nomadic international press?
Because my grandmother devours the news (some of my first memories involve being told to be quiet as we listened to jammed Radio Free Europe broadcasts in her Warsaw apartment), her views on “The Middle East” are just as firm as her thoughts on her most beloved topic—Polish electoral politics. She was certain that Israel was a dangerous place to visit for her favourite Polish-Canadian academic/journalist, just as she is certain that the Kaczynski twins represent Poland’s best chance for maintaining sovereignty within the EU. (Needless to say, grandma and I do not always see eye-to-eye on the issues.)
Because she does not travel much anymore and because Middle Eastern geopolitics have never consumed her (she does, after all, have a compelling geopolitical chess match taking place in her own backyard), I am not certain that she can distinguish between the first intifada and the second, or between Ismail Haniyeh and Mohammed Dahlan. Yet, as far as the CNN effect is concerned, this does not matter. The emotional triggers that help to shape her views are fully formed, and she is not likely to be dissuaded.
Of particular pertinence is her view of Gaza, which, in her vocabulary, has become something of a four-letter word. Although she sympathizes with the Palestinian cause (and sympathizes very strongly, as one of an ever-diminishing number of Europeans who know firsthand what a military occupation looks like), the word “Gaza” evokes particular dread. When I first mentioned the conference in Ra’anana, she said that “at least it wasn’t Gaza.” This theme would continue throughout my visit.
Now, considering the timing of my trip, grandma was not entirely wrong to be worried. This was early June 2007, Hamas and Fatah were about to engage in a battle for Gaza, and the entire region was tense. The BBC’s Alan Johnston had been in captivity for almost three months at that point, and the almost-daily reports on his fate dominated the international coverage. When I told grandma that I was planning to do some minor freelance reporting once my academic duties were fulfilled, she became uneasy. When I told her that I was going to the Palestinian territories, I could hear her heart stop, and hesitate a little.
I wanted to visit Bethlehem and maybe Ramallah (both West Bank towns), partly because I wanted to write that it is a shame to form one’s opinion of the Palestinian people from CNN alone, and partly because some of my Israeli hosts kept insisting that I not go for reasons of ideology.
Now, my grandmother’s only ideology is that her eldest grandson stay safe, so when I phoned to say that I was off to Palestine, I got an earful. I tried to explain away Bethlehem by saying that I wanted to visit for biblical reasons, but grandma wasn’t sold on such a flimsy explanation.
“You don’t even go to church,” she said wearily. “Please stay safe, and please stay away from Gaza.”
Of course, my visit to the West Bank was quite pleasant, and, although it should go without saying (but sadly it does not), the Palestinian people were quite unlike the angry masses one occasionally sees on the evening news. I could happily report to my Ra’anana hosts that the Israeli portrait of the average Palestinian seems just as off base as the Palestinian portrait of the average Israeli, and that despite the obvious tensions, people remain people, even when politics can make daily life incredibly difficult.
After making my way back through the structure that some Israelis insist on calling a fence (it sometimes is a fence, but where I crossed it looked like a ten-meter concrete wall with a Berlin-esque watchtower), I called from the safety of Jerusalem saying that with my conference finished, I was off to Egypt.
“You’re not going to go to Gaza, are you?” my grandma asked in a nervous tone.
“No, grandma. Just the Sinai and Cairo.”
Just as I crossed the border into Egypt, the tense situation erupted into something much more violent. As I struggled against the heat on an Egyptian bus, completely unaware of the world around me, Hamas drove Fatah from the Gaza Strip, leading to the current equilibrium (or stalemate) in Palestinian politics.
Before I knew that anything had happened, I was in the Sinai backpacker haven of Dahab, many, many miles away, watching the images, like my grandmother, courtesy of CNN International and the BBC.
I sent grandma a brief note telling her that I was safe, and a week later, after I made it to Cairo and after I took the metro to see the pyramids (that still sounds a bit surreal), I called grandma to let her know that I was ok.
“Grandma, I went to Giza today!”
“Gaza?! You went to Gaza?!”
“No, no! Giza! G-i-z-a!”
“No, the pyramids, the Sphinx!”
At this point, my innkeeper, by then fully briefed on my grandmother’s fears, almost fell off his chair laughing.
“My friend,” he said, “sometimes, you just don’t think.”
“But I cannot lie to my grandmother,” I protested.
“And besides, it’s just the CNN effect.”
Wednesday, November 09, 2005
Engrossed in a World of Political Idealism
From The New York Times:
Most television dramas play with the question "what if?" NBC's "West Wing" revels in "if only...."
Sunday's live presidential debate was the quintessence of wishful writing. Two intelligent, principled candidates tossed aside debate rules and went at each other full throttle on live television, debating everything from immigration and energy policy to foreign debt relief.
The world hates us, and even Americans deplore the sorry state of political discourse in their country. But only the uninformed or disingenuous complain about the quality of American television. It has a variety and breadth that no other nation can match. For every offensive reality series or inane daytime talk show, there are comedies and dramas that reach far higher in a single episode than most movies or Broadway shows.
Saturday, July 30, 2005
From The New York Times:
The current tendency to political polarization in news reporting is a consequence of changes not in underlying political opinions but in costs, specifically the falling costs of new entrants. The rise of the conservative Fox News Channel caused CNN to shift to the left. CNN was going to lose many of its conservative viewers to Fox anyway, so it made sense to increase its appeal to its remaining viewers by catering more assiduously to their political preferences.
So why do people consume news and opinion? In part it is to learn of facts that bear directly and immediately on their lives - hence the greater attention paid to local than to national and international news. They also want to be entertained, and they find scandals, violence, crime, the foibles of celebrities and the antics of the powerful all mightily entertaining. And they want to be confirmed in their beliefs by seeing them echoed and elaborated by more articulate, authoritative and prestigious voices. So they accept, and many relish, a partisan press. Forty-three percent of the respondents in the poll by the Annenberg Public Policy Center thought it "a good thing if some news organizations have a decidedly political point of view in their coverage of the news."
Monday, July 04, 2005
On a New Showtime Series, America's Protector Is a Muslim
From The New York Times:
The lead character is an undercover F.B.I. agent who has managed to infiltrate a Southern California sleeper cell largely because he is a practicing Muslim. The character, Darwyn, is the first major role created on an American series - whether before or after the Sept. 11, 2001, hijackings - that depicts a Muslim as a hero seeking to check the intentions of terrorists.
That the production has such a high gloss of credibility - at least in terms of the prayers that Darwyn utters, the ways he interprets the Koran and his struggles to reconcile his religion with his daily life - is a function of the creative team supporting it: three of those playing prominent roles behind the scenes are themselves Muslims. And having been raised on a steady diet of Arab bad guys - whether on shows like "JAG" or "24," or movies like the Arnold Schwarzenegger vehicle "True Lies" - they say they welcomed the opportunity to put a character on television who looked like them, shared their values and sought to save the day.