Algorithms decide what we are recommended on Amazon, what films we are offered on Netflix. Sometimes, newspapers warn us of their creeping, insidious influence; they are the mysterious sciencey bit of the internet that makes us feel websites are stalking us—the software that looks at the e-mail you receive and tells the Facebook page you look at that, say, Pizza Hut should be the ad it shows you. Some of those newspaper warnings themselves come from algorithms. Crude programs already trawl news pages, summarise the results, and produce their own article, by-lined, in the case of Forbes magazine, “By Narrative Science”.
Others produce their own genuine news. On February 1st, the Los Angeles Times website ran an article that began “A shallow magnitude 3.2 earthquake was reported Friday morning.” The piece was written at a time when quite possibly every reporter was asleep. But it was grammatical, coherent, and did what any human reporter writing a formulaic article about a small earthquake would do: it went to the US Geological Survey website, put the relevant numbers in a boilerplate article, and hit send. In this case, however, the donkey work was done by an algorithm.
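The routine the article describes — pull the numbers from a structured feed, drop them into boilerplate, publish — can be sketched in a few lines of Python. This is a toy illustration, not the LA Times bot's actual code: the template wording, field names, and the hard-coded quake record are all invented here (a real version would fetch the record from a USGS data feed).

```python
# A toy "quakebot": fill a boilerplate story from one structured quake record.
# In practice the record would come from a USGS feed; here it is hard-coded.

TEMPLATE = (
    "A shallow magnitude {mag:.1f} earthquake was reported {day} morning "
    "{miles} miles from {place}, according to the U.S. Geological Survey. "
    "The temblor occurred at {time} at a depth of {depth:.1f} miles."
)

def write_story(quake: dict) -> str:
    """Render the boilerplate article from one quake record."""
    return TEMPLATE.format(**quake)

quake = {
    "mag": 3.2,
    "day": "Friday",
    "miles": 4,
    "place": "Westwood, California",
    "time": "6:25 a.m.",
    "depth": 5.0,
}

print(write_story(quake))
```

The entire editorial act is choosing the template; everything after that is string substitution, which is why such stories can go out while the newsroom sleeps.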
Too little has been written about Brightonian novelist Ann Quin since her death in August 1973. Most of what has been written highlights the striking opening sentence of her first novel, Berg, originally published by John Calder in 1964 and later reissued by Dalkey Archive Press:
‘A man called Berg, who changed his name to Greb, came to a seaside town intending to kill his father …’
Robert Buckeye’s Re: Quin, also published by Dalkey and described as an “unabashedly personal and partisan critical biography” of “one of the best and most neglected” British “experimental” writers of the 1960s, breaks with convention by opening with a quote from contemporary author-artist Stewart Home about “The body of a dead princess” serving “as a metaphor for literature”. Buckeye then moves on to a Malcolm X speech from 1964, using it to illustrate his point that radical times need radical culture, before placing Quin in a post-war avant-garde with William S. Burroughs, Alexander Trocchi, B. S. Johnson and others.
The price of things is crowding out their value. When it comes to art, the belief that the price of a work is its sole worth constitutes the peculiar accord between the hedge-fund millionaires driving prices into the stratosphere and the would-be revolutionaries who fantasize about the collapse of the art-market bubble and the whole hideous economic system of which it is a prominent sideshow. Is it even remotely possible to see the exhibition hanging in the Guggenheim Museum right now—paintings, drawings and photographs by the American artist Christopher Wool—as art instead of dollar signs, now that one of Wool’s paintings (not included in the exhibition, which is on view through January 22) has sold at auction for $26.5 million, just a year after another sold for what then seemed an already outlandish $7.7 million? According to a recent article in The Art Newspaper, speculation on Wool’s art over the past few years indicates that it “‘has become a parking lot for money,’ says one high-profile European curator. Like the market for Jean-Michel Basquiat, Wool’s market is in danger of being controlled by a small, powerful group of players, he [adds].”
“Parking one’s money” is apparently an everyday concept among those who have too much of it; a recent New York Times article headlined “Record Prices Mask a Tepid Market for Fine Art” quoted a market expert who accounted for the popularity of contemporary art among hedge-fund managers this way: “They can hang anything they want in their Manhattan co-ops or in Aspen and nobody can say that’s ugly because contemporary art has not been subjected to sustained critical appraisal. There are no markers of good or bad taste that have yet been laid down. It’s a safe place to park your money.”
Between 1950 and 1980, the United States experienced a reported 32 “broken arrows,” the military’s term for accidents involving nuclear weapons. The last of these occurred in September 1980, at a U.S. Air Force base in Damascus, Arkansas. It started when a young technician performing routine maintenance on a Titan II missile housed in an underground silo dropped a socket wrench. The wrench punctured the missile’s fuel tank. As the highly toxic and flammable fuel leaked from the missile, officers and airmen scrambled to diagnose the problem and fix it. Their efforts ultimately failed, and eight hours after the fuel tank ruptured, it exploded with tremendous force. The detonation of the missile’s liquid fuel was powerful enough to throw the silo’s 740-ton blast door more than 200 yards and send a fireball hundreds of feet into the night sky. The missile’s nine-megaton thermonuclear warhead — the most powerful ever deployed by the United States — was found, relatively intact, in a ditch 200 yards away from the silo.
The Damascus accident epitomizes the hidden risk of what the sociologist Charles Perrow has dubbed “normal accidents,” or mishaps that become virtually inevitable once a system grows so complex that seemingly trivial miscues can cause chain reactions with catastrophic results. As the journalist Eric Schlosser explains in his new book, Command and Control, “The Titan II explosion at Damascus was a normal accident, set in motion by a trivial event (the dropped socket) and caused by a tightly coupled, interactive system.” That system, he writes, was so overly complex that technicians in the control room could not determine what was happening inside the silo. And basic human negligence had only made things worse: “Warnings had been ignored, unnecessary risks taken, sloppy work done.”
Robert B. Talisse (on the right of the picture) and Scott F. Aikin (on the left of the picture) are the dynamic duo of 3Quarksdaily, thinking about the social nature and political significance of argument, about the two things the word ‘argument’ captures, about the straw man fallacy, about misfiring sound arguments, about the intimate connection between epistemology and democracy, about the nature of democracy, pragmatism and Rawls, about Dewey, Elizabeth Anderson and Peirce, about ‘pluralism’ as a halo term, about the truth orientation of our cognitive life, about Nietzsche’s challenge, about being fearless about the fear of regress, about the use of tone, about the need for political arguers and the dangers of cognitive insulation, about when to revise one’s beliefs, about civility in argument and about why their new book is keyed to all contemporary democracies. Epistemocracy doubled!
3:AM: What made you become philosophers?
Scott Aikin: I was a Classics major at Washington University in St. Louis, and I was very lucky to have the patient instruction of Merritt Sale, George Pepe, and Carl Conrad there. We would have class discussion about some line from Seneca or Plato, and I’d get hung up on some philosophical issue. I originally thought it was because my ancient languages weren’t good enough, but it became clear that disagreements about virtue or knowledge aren’t solved by dictionaries, but by doing some philosophical work. You had to think about what virtue and knowledge really are. It was like my mind caught fire – I was eighteen years old and could dispute with the greats on what was good and true. Authority with these matters came with having reason on your side, not any status or anything like that. It was exhilarating, and that anti-authoritarian appeal of philosophical work still enlivens me.
Robert Talisse: I grew up in northeastern New Jersey, and I took a class in Philosophy in my senior year in high school. The class was a survey of the great philosophers’ ideas, paying nearly no mind to the arguments they devised. I liked that class, but it left me with the impression that Philosophy was a dead discipline, something that had ended in the 19th Century. So, when I entered William Paterson College (it was not yet a university then), I was not aware that it was possible to major in Philosophy. I spent my first semester as an Economics major, but once I discovered that there was a Philosophy major, I switched immediately. At the time William Paterson was a small commuter school filled with Business majors, yet somehow there was a critical mass of really serious Philosophy students, all of whom eventually earned PhDs, and many of whom are now professional philosophers. In any case, I quickly learned there that Philosophy is about challenging those (including oneself) who claim to know. Like Aikin, I latched on to the anti-authoritarianism of it all. And I soon realized that the impression of Philosophy that I got from my high school class – that it had died as a discipline – was exactly wrong. Philosophy is one of the few disciplines that is not dead. I eventually found myself with a PhD in Philosophy from CUNY and a job at Vanderbilt as a philosopher. To be honest, I’m not really sure how it all happened.
Dalton is a prestigious, decades-old, K-12 prep school on New York City’s Upper East Side that filters its students into the best universities in the country. In 2010, Forbes reported that 31 percent of its students matriculated into MIT, Stanford, or an Ivy League institution. Former students include Anderson Cooper, Claire Danes, and Ralph Lauren’s daughter Dylan. Even imaginary people make sure their families are present for parent-teacher conferences. For years, however, Dalton was largely inaccessible to minority and lower-income students. Maintaining its reputation as a top-tier place of learning did not require administrators to extend invitations to those groups.
When Idris Brewster and his friend Seun Summers entered kindergarten at Dalton in the late 1990s, they were among the few students of color in their class. Idris and Seun’s parents believed that getting into Dalton was the first step to a life filled with accomplishments.
“Students that came out of independent schools were well-prepared on the level of networking, internships, job and school opportunities—you name it—and we were offered great financial-aid incentives,” Michèle Stephenson, Idris's mother, told me. “We thought this intensive, intellectually stimulating institution would open doors for Idris and take him anywhere he wanted to go.”
Fourteen years later, Idris's parents have released American Promise, a documentary that records the boys' personal and academic experiences from kindergarten through senior year of high school. The film reveals a hard truth about being a student of color at an elite school: Simply being admitted doesn't guarantee a smooth or successful educational journey.
There are many things that make humans a unique species, but a couple stand out. One is our mind, the other our brain. The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking. The human brain is exceptional, too. At three pounds, it is gigantic relative to our body size. Our closest living relatives, chimpanzees, have brains that are only a third as big.
Scientists have long suspected that our big brain and powerful mind are intimately connected. Starting about three million years ago, fossils of our ancient relatives record a huge increase in brain size. Once that cranial growth was underway, our forerunners started leaving behind signs of increasingly sophisticated minds, like stone tools and cave paintings. But scientists have long struggled to understand how a simple increase in size could lead to the evolution of those faculties. Now, two Harvard neuroscientists, Randy L. Buckner and Fenna M. Krienen, have offered a powerful yet simple explanation. In our smaller-brained ancestors, the researchers argue, neurons were tightly tethered in a relatively simple pattern of connections. When our ancestors’ brains expanded, those tethers ripped apart, enabling our neurons to form new circuits. Dr. Buckner and Dr. Krienen call their idea the tether hypothesis, and present it in a paper in the December issue of the journal Trends in Cognitive Sciences.
From The Genius in All of Us: New Insights into Genetics, Talent and IQ by David Shenk via delancyplace:
Genius. The popular conception of genius is that it is an inborn gift, yet an increasingly large body of research suggests the opposite — that genius is always the product of sustained effort. A case in point — Mozart:

“Standing above all other giftedness legends, of course, [is] that of the mystifying boy genius Wolfgang Amadeus Mozart, alleged to be an instant master performer at age three and a brilliant composer at age five. His breathtaking musical gifts were said to have sprouted from nowhere, and his own father promoted him as the 'miracle which God let be born in Salzburg.'

“The reality about Mozart turns out to be far more interesting and far less mysterious. His early achievements — while very impressive, to be sure — actually make good sense considering his extraordinary upbringing. And his later undeniable genius turns out to be a wonderful advertisement for the power of process. Mozart was bathed in music from well before his birth, and his childhood was quite unlike any other. His father, Leopold Mozart, was an intensely ambitious Austrian musician, composer, and teacher who had gained wide acclaim with the publication of the instruction book … Treatise on the Fundamental Principles of Violin Playing. For a while, Leopold had dreamed of being a great composer himself. But on becoming a father, he began to shift his ambitions away from his own unsatisfying career and onto his children — perhaps, in part, because his career had already hit a ceiling: he was vice-kapellmeister (assistant music director); the top spot would be unavailable for the foreseeable future.

“Uniquely situated, and desperate to make some sort of lasting mark on music, Leopold began his family musical enterprise even before Wolfgang's birth, focusing first on his daughter Nannerl. Leopold's elaborate teaching method derived in part from the Italian instructor Giuseppe Tartini and included highly nuanced techniques …
“Then came Wolfgang. Four and a half years younger than his sister, the tiny boy got everything Nannerl got — only much earlier and even more intensively. Literally from his infancy, he was the classic younger sibling soaking up his big sister's singular passion. As soon as he was able, he sat beside her at the harpsichord and mimicked notes that she played. Wolfgang's first pings and plucks were just that. But with a fast-developing ear, deep curiosity and a tidal wave of family know-how, he was able to click into an accelerated process of development.

“As Wolfgang became fascinated with playing music, his father became fascinated with his toddler son's fascination — and was soon instructing him with an intensity that far eclipsed his efforts with Nannerl. Not only did Leopold openly give preferred attention to Wolfgang over his daughter; he also made a career-altering decision to more or less shrug off his official duties in order to build an even more promising career for his son.
…The tiny Mozart dazzled royalty and was, at the time, unusual for his early abilities. But today many young children exposed to Suzuki and other rigorous musical programs play as well as the young Mozart did — and some play even better. Inside the world of these intensive, child-centered programs, such achievements are now straightforwardly regarded by parents and teachers for what they are: the combined consequence of early exposure, exceptional instruction, constant practice, family nurturance, and a child's intense will to learn.
I have never written anything because it is a Poem.
This is a mistake you always make about me,
A dangerous mistake.
I promise you I am not writing this because it is a Poem.

You suspect this is a posture or an act
I am sorry to tell you it is not an act.

You actually think I care if this Poem gets off the ground or not.
Well I don't care if this poem gets off the ground or not
And neither should you.
All I have ever cared about
And all you should ever care about
Is what happens when you lift your eyes from this page.

Do not think for one minute it is the Poem that matters.
It is not the Poem that matters.
You can shove the Poem.
What matters is what is out there in the large dark and in the long light,
by Gwendolyn MacEwen, from Afterworlds (McClelland & Stewart, 1987)
In the memoir he was writing at the time he died, my friend Avresh described returning to the Czechoslovakian town of Sevlush, his birthplace, in the winter of 1946. He'd left some fifteen years earlier to attend a Jewish gymnasium in a larger city, stayed on to study engineering at the university and never looked back. This was his mother's wish for him: that he enter the great, free, secular world, liberate himself from the narrowness of his tradition. Escape.
When the Germans occupied Czechoslovakia in 1939, Avresh joined the Communist resistance. Captured and tortured by the Gestapo but inexplicably released, he made his way to the Soviet Union, expecting to be welcomed with open arms, a comrade in the fight against Nazism. Instead he was arrested at the border and charged with espionage—the fate of most Jewish refugees from Eastern Europe. Avresh spent two and a half years in the Gulag, shuffled from one prison camp to the next, but ended up an artillery officer in a Czech unit of the Russian army; by the time he was discharged, he'd earned four medals for his service on the Eastern front. His favorite featured a picture of Stalin.
So it was as a decorated officer in a Russian army uniform that he returned to his town after the war. All the Jews were gone, rounded up and deported to Auschwitz. A Slovak family was living in his childhood home and not a trace of Avresh's own family remained. Looking for answers, he went to the neighborhood synagogue and peered in the door. The sanctuary, the balcony, the corridors and stairways were cluttered with belongings: furniture, pots and pans, bedding, books, knickknacks and photographs. A policeman stood watch over the household goods of the departed Jews of Sevlush. Town officials had collected the Jews' possessions and stored them in the synagogue to prevent looting. No Jews had returned to claim their things. Was there something he wanted from the collection, the policeman asked, some memento?
Avresh said he took nothing when he left Sevlush, but this is not strictly true. He carried no objects away from the synagogue, no material belongings, pointedly refusing the money the officials offered as “rent” on his family's house. What he took, along with the burden of guilt he carried—”I share the usual remorse of most Holocaust survivors lamenting why they are alive and why they did not try harder to save their perished family,” he wrote in his memoir—what he took, I would say, was a sense of spiritual belonging, the token that remained of his Jewish inheritance.
(This is the first post in a series on Pakistan's struggle against militancy).
Almost a decade in, the rebellion by the Pakistani Taliban against Islamabad shows no signs of flagging. Tough, savvy, and agile, the insurgents have expanded their campaign from the isolated northwestern tribal regions all the way to urban centers in the south such as the port city of Karachi. Their declared agenda has grown with each success: they first demanded acceptance of their control over large swathes of the tribal areas; they then denied the authority of Islamabad across Pakistan altogether; today, influenced by Al Qaeda's rhetoric, they boast of sending fighters to wars in Arab lands and attacking the United States.
We need not accept all their grandiose declarations at face value. When it comes to global terrorism, in particular, there is a chasm between their rhetoric and their capacity. The only terrorist plot on American soil they can claim is that of the failed Times Square bomber in 2010. The evidence of Taliban involvement in Middle Eastern battlefields is ambiguous at best. And their operations are constrained by an overall pool of fighters that is small: estimates vary, because data is hard to collect and the definition of an active fighter is murky, but at any given time there may be only ten to twenty thousand rebel fighters.
But the insurgents have substantially expanded their campaign within Pakistan itself. They have strategic clarity where Islamabad does not and their aspirations have been whetted by the confusion of the state. In recent years the rebels have complemented their fight against Pakistani armed forces in the tribal areas with a systematic campaign of terrorism in towns and cities across the country. To this end the insurgents have leveraged and expanded a vast ‘infrastructure of extremism', which originates in decades of state sponsorship of non-state militant groups.* The network includes combat trainers, militant recruiters, funders, suicide jacket makers, indoctrinators and foot soldiers who have access to training camps, safe houses, telephone getaway exchanges, madrassahs (some, not all) and highly sophisticated media communications facilities across the length and breadth of Pakistan. The insurgents are not cave dwellers: they are adept organization builders who have institutionalized the production of terrorism as one weapon in their broader war against the state.
There's a genre of shows, above the level of House or Friends and below that of The Wire, that exude high quality even if the actual level of characterization and plot isn't deep. Julian Fellowes' Downton Abbey is one of the prime examples of this genre. It's beautifully done and acted, has enough characters and plots to keep anyone's interest, and is full of references that seem smart.
It just so happens that none of these references is particularly intellectual or obscure. Instead, they're the sort of history that everyone knows. The first episode discusses the Titanic; we do not live in an alt history in which James Cameron chose to make more Terminator sequels in the 90s. Every time Lord Grantham's American wife's mother comes, we're treated to the usual tropes of differences between British and American culture. In the season that just concluded with its Christmas special, two additional common references are added: a rich English expat goes to Munich in 1922 and is killed by the early Nazi party because he vocally disagreed with them; and there's a subplot regarding Edward VIII's playboy philandering. This is about as smart as an American mid-18th century period drama inserting a reference to Washington not being able to tell a lie.
The problem is that even the stronger points of symbolism on the show are like this. The biggest is the analogy between the upstairs and the downstairs. The servants form a tight group (except Thomas and O'Brien) in which Carson is the father, Hughes is the mother, and the rest of the servants have a hierarchy in which valets and lady's maids are above the rest. Bates/Anna is of course parallel to Matthew/Mary, and the stronger parts of the show are the ones that showcase the differences between their relationships, with Matthew/Mary having more resources and more clout than Bates/Anna so that they face more rich-people problems rather than a possible execution.
The only problem is, the show didn't really invent this view of the butler as the father, the housekeeper as the mother, and the other servants as lesser members of the house. It was common in that era. I don't think it's as well-known a reference, but that symbolism is still a trope, and the servants' order of precedence within the great houses reflected it. It works well enough as a reference, but as symbolism, it's trite.
Everywhere else, Fellowes' Tory baron biases show. The show can't write women well, and descends to a virgin/whore/mother trichotomy. The only man who is as conniving as the median woman is gay. The treatment of race is facile. Lord Grantham is self-consciously written as an upper-class twit, but he doesn't suffer any consequences for it and is always saved by more competent family members; nor does he face any interesting moral dilemmas. The characters are never shown to engage in any effort – they do some work and succeed, without any of the failures that are associated with actual effort. The show wants to be about the aristocracy's struggles with its decline after WW1, but it's instead about an aristocratic family that weathered all the troubles, which is about as interesting as any riches-to-riches story could be.
One of the main aims of modern science is finding mathematical expressions to describe the relationships between observed quantities. For example, Newton's law of gravitation tells us that the force of gravity between two bodies depends in a certain way on their masses and the distance between them; thermodynamics tells us that the pressure of a gas depends in a certain way on its volume and temperature; and an economist studying income might conclude that income increases with educational level according to some functional form.
Sometimes these mathematical relationships emerge from an underlying model. We might model a gas as made up of molecules that collide with each other and the walls of the container, think that pressure is a measure of collisions with the walls and temperature a measure of kinetic energy, and then our functional form is a result of mechanistic insight into pressure and temperature. In other cases, the relationships serve to provide a summary representation of the data (instead of giving you a list of pressures at various temperatures, I could say pressure=3*temperature) and, even without providing an explanation of how the relationship came to be, allow us to make predictions about new data (for example, I might have observed the pressures at temperatures of 30 degrees and 60 degrees and want to predict the pressure at 90 degrees).
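The summary-plus-prediction idea can be made concrete with a minimal Python sketch built on the passage's deliberately made-up rule, pressure = 3 × temperature: the "model" is just the stored relationship, and predicting at a new temperature is simply evaluating it.

```python
# Toy summary relationship from the text: pressure = 3 * temperature.
# The rule compresses a list of (temperature, pressure) observations into
# one line; prediction at new data is evaluation at a new input.

def pressure(temperature: float) -> float:
    """Predicted pressure under the assumed linear rule (slope 3, intercept 0)."""
    return 3.0 * temperature

# Observed at temperatures 30 and 60; predict at the unobserved temperature 90.
assert pressure(30) == 90.0
assert pressure(60) == 180.0
print(pressure(90))  # → 270.0, the prediction at the new temperature
```

Note that nothing here explains *why* pressure tracks temperature; the function is pure summary, which is exactly the distinction the passage draws between representation and mechanistic insight.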
As we choose a relationship (or hypothesis) to explain a given set of data, the two goals of accounting for the existing data and making predictions for new data are often in conflict. Look at the graph below, which plots the simultaneously measured values of two quantities, X and Y.
Say we're trying to describe this relationship in a way that allows us to predict the values of Y at unobserved points (for example, we haven't measured the value of Y when X is 0.25 and we want to predict this). A common thing to do is to draw a line along the middle of this scattered cloud of points and use this as an approximation of the underlying relationship.
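Drawing "a line along the middle" of the cloud is, in its simplest form, ordinary least squares. Here is a minimal sketch using the closed-form formulas, with synthetic data standing in for the unspecified X, Y points in the graph (the numbers are invented for illustration):

```python
# Least-squares line fit via the closed-form slope/intercept formulas,
# then prediction at an unobserved point (X = 0.25, as in the text).
# The data are synthetic stand-ins for the scatter in the graph.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the squared vertical errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Noisy observations scattered around y = 2x + 1; no point at x = 0.25.
xs = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
ys = [1.05, 1.18, 1.42, 1.61, 1.79, 2.02]

slope, intercept = fit_line(xs, ys)
prediction = slope * 0.25 + intercept  # interpolate at the unobserved point
print(round(prediction, 2))
```

The fitted line deliberately does not pass through every point: it trades exact agreement with the observed data for a smooth summary that generalizes to unobserved X values, which is precisely the tension between the two goals described above.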
We are in the position, unique in human history, of possessing the knowledge of how to alleviate much of the unnecessary suffering in the world. What we lack is the knowledge of how to deliver and disseminate that knowledge, and of how to encourage its uptake. In an eerily prescient article of the sort at which he excelled, John Maynard Keynes wrote in The New Republic in 1932:
At present the world is being held back by something which would have surprised our fathers—by a failure of economic technique to exploit the possibilities of engineering and distributive technique; or, rather, engineering technique has reached a degree of perfection which is making obvious defects in economic technique which have always existed, though unnoticed, and have doubtless impoverished mankind since the days of Abraham.
Replace “engineering technique” with “scientific understanding” and “economic technique” with, well, some broadly defined sociopolitical will to implement our understanding, and Keynes's point is precisely applicable to the current state of health and longevity around the world.
Whether you read into modern medicine an amazing ability to deflect and defer nature's slings and arrows, or you bemoan the failed ‘War on Cancer' and the much-delayed genomic revolution, the fact remains that we now know the major ways for the majority of people to lead long, healthy lives. It isn't medical technology that allows people to live into old age; with the exception of vaccines, the causes of good health aren't to be found in hospitals and medical clinics. Nutrition, sanitation, and hygiene are the keys to population health, and it isn't too great an exaggeration to say that health depends, above all else, on where you live, eat and excrete. The solutions are, for the most part and in a strictly technical sense, rather simple and well understood.
Keynes wrote that statement in 1932, before modern vaccines and antibiotics, without much (successful) interventional surgery, and with scant knowledge of the mechanisms of disease. The 80 years before he wrote had seen a revolution in human health – maximum life expectancy had long been on its steep upward trajectory, and infant mortality in the Western world was fast becoming a rarity.
“Using an orienteering compass, measuring tape and a pair of snowshoes, 54-year-old Simon Beck turns the hills and frozen lakes around Les Arcs into geometrically-perfect immaculate masterpieces. His intricate prints are huge, often spanning the equivalent size of six football fields, but while you’d be tempted to think Beck needs at least several days to complete just one of these patterns, he really only needs about 10 hours, on average.”
Until my grandmother—whose 100th birthday we celebrated this year—took up residence first with my parents and then at the care center where three of her sisters also spent their last years, she lived independently and, in many ways, unconventionally. (Whereas she is content to describe her long life as “good,” my grandmother deviated from the norms of small-town Texas just enough, and in enough domains of her life, for that life to seem quite remarkable to me. That nearly everyone calls the lady “Morris”—a long story, but it originated when I was very young and couldn't replicate my mother's polite “Mrs. Morris,” so I shortened it and the name stuck—is only the first of many odd details that I'd need to explain to anyone meeting her for the first time.) When her husband suffered a fatal heart attack after a morning spent plowing, she inherited a prosperous family farm and kept it that way for four decades more. She hosted retired teacher banquets, a duty (though certainly not a grim one, my grandmother was the type to understand it as a duty nonetheless) born of a storied 40-year career as teacher and principal in the Quail Rural Consolidated School District (the largest such district in the country at the time). To this day, she is my family's only elected official, having served a term as the County Superintendent of Education. For many years she split her leisure time between a full slate of daytime TV dramas—what she called her “stories”—and virtually any televised sporting event. Whenever I asked, she could catch me up on the tangled relationships and intrigues of any given soap opera, somehow managing to dignify the most idiotic plot or one-dimensional character. 
She could conjure the same remarkable effect with sports; normally oblivious, I would suddenly understand the beauty and depth of a sport (who knew golf could be anything but tedious?), envying her effortless command of baseball stats and NFL playoff hopes, and sharing her quiet marvel at a beautiful swing.
And, on top of all this, every two years or so she would vote a straight Democratic ticket. This, at least, is how her only son, my father, tells it. About ten years ago—or it could have been fifteen, or five; it hardly matters because this stunning revelation came when Morris was already quite old, and long after Texas had turned solidly Republican—my father referred to my very proper grandmother as a “yellow dog Democrat” (meaning, to any Southerner, someone who would sooner vote for a yellow dog than a Republican). He said it with what seemed like mild exasperation, as if he couldn't make sense of, or fully commend, this irrational allegiance to a political party. But I remember being secretly thrilled (I think he could have told me that Morris was an avid day-trader and I would have been less surprised). Maybe I felt vindicated, too; apparently the Democratic gene can skip a generation, but obviously it was there, deep in me, ensuring that a family's rich history would continue to bind, and instruct. Perhaps most surprising of all, I discovered that I was proud—suddenly proud of a party that could have earned my dear grandmother's life-long support.
While I do not like the phrase, “at the height of her powers,” it comes to mind when I think of Taymor directing this comedy. I don't like the phrase because it seems to anoint the critic with a false sense of her own fortune-telling powers, and has an undue emphasis on the importance of being urgent–as if I were saying, “run, don't walk” to this play. But perhaps you should run–or, more accurately, sit. I had to sit in the stand-by line for a long time, because the play, now running at the Polonsky Shakespeare Center, is officially sold out. Sit, and wait, for a long time for this production, because the images it gives you will delight at first, and then, over time, will resolve themselves into a sort of important pastiche that helps you think about love, madness, and Shakespeare.
I had to go because my mother made me. She was a theatre student, a mime, a theatre director, and a folklorist before she became the director of a K-12 school in West Africa, where she finds an outlet for her enormous creative energy by putting on plays. This year it is Midsummer Night's Dream. “I can't get a feel for it yet,” she said. “But Julie Taymor directed it, and I love her, and they're putting it on in a theatre six blocks from your house. I read all about her troubles with Spider-Man and had been following her before I had the idea to do Midsummer. It means something. Go. Find out what she's up to.”
I had no idea that my mother loved Julie Taymor. Or that she “followed” anything that had anything to do with the internet. I promised to do it, but procrastinated, and when I saw that the tickets were sold out, I nearly panicked.
Anyway, here is the gist: initially, even if you are not compiling a list of directorial choices for your mother's use, you will be startled and awed by the choices Taymor makes. There is a stunning mixture of expensive technology and simple stagecraft, and a viewer feels safe the whole while–safe because they are in the hands of a person who will not bore them, who seems to have an exact sense of rhythm, scale and color scheme–and who presents recognizable character “types” that amuse without degrading the people who make up that type.