Tuesday, December 31, 2013
everything is an algorithm
Algorithms decide what we are recommended on Amazon, what films we are offered on Netflix. Sometimes, newspapers warn us of their creeping, insidious influence; they are the mysterious sciencey bit of the internet that makes us feel websites are stalking us—the software that looks at the e-mail you receive and tells the Facebook page you look at that, say, Pizza Hut should be the ad it shows you. Some of those newspaper warnings themselves come from algorithms. Crude programs already trawl news pages, summarise the results, and produce their own article, by-lined, in the case of Forbes magazine, "By Narrative Science".
Others produce their own genuine news. On February 1st, the Los Angeles Times website ran an article that began "A shallow magnitude 3.2 earthquake was reported Friday morning." The piece was written at a time when quite possibly every reporter was asleep. But it was grammatical, coherent, and did what any human reporter writing a formulaic article about a small earthquake would do: it went to the US Geological Survey website, put the relevant numbers in a boilerplate article, and hit send. In this case, however, the donkey work was done by an algorithm.
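The process described above — pull structured numbers from a feed, drop them into boilerplate, publish — can be sketched in a few lines of Python. The quake record and template below are hypothetical illustrations, not the LA Times's actual system:

```python
# A minimal sketch of template-driven "robot journalism".
# The record fields and wording are assumptions for illustration only.

quake = {  # hypothetical entry parsed from a USGS-style feed
    "magnitude": 3.2,
    "depth_km": 7.5,
    "place": "near Westwood, California",
    "day": "Friday",
    "time_local": "5:38 a.m.",
}

TEMPLATE = (
    "A shallow magnitude {magnitude} earthquake was reported {day} "
    "morning {place}, according to the U.S. Geological Survey. "
    "The temblor occurred at {time_local} at a depth of "
    "{depth_km} kilometers."
)

def write_story(record):
    """Fill the boilerplate template with one quake record's numbers."""
    return TEMPLATE.format(**record)

story = write_story(quake)
print(story)
```

The real systems add checks (only publish above a magnitude threshold, flag the draft for a human editor), but the core is exactly this kind of fill-in-the-blanks template.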
An overdue study of the "experimental" novelist Ann Quin
the money and art problem
The price of things is crowding out their value. When it comes to art, the belief that the price of a work is its sole worth constitutes the peculiar accord between the hedge-fund millionaires driving prices into the stratosphere and the would-be revolutionaries who fantasize about the collapse of the art-market bubble and the whole hideous economic system of which it is a prominent sideshow. Is it even remotely possible to see the exhibition hanging in the Guggenheim Museum right now—paintings, drawings and photographs by the American artist Christopher Wool—as art instead of dollar signs, now that one of Wool’s paintings (not included in the exhibition, which is on view through January 22) has sold at auction for $26.5 million, just a year after another sold for what then seemed an already outlandish $7.7 million? According to a recent article in The Art Newspaper, speculation on Wool’s art over the past few years indicates that it “‘has become a parking lot for money,’ says one high-profile European curator. Like the market for Jean-Michel Basquiat, Wool’s market is in danger of being controlled by a small, powerful group of players, he [adds].”
“Parking one’s money” is apparently an everyday concept among those who have too much of it; a recent New York Times article headlined “Record Prices Mask a Tepid Market for Fine Art” quoted a market expert who accounted for the popularity of contemporary art among hedge-fund managers this way: “They can hang anything they want in their Manhattan co-ops or in Aspen and nobody can say that’s ugly because contemporary art has not been subjected to sustained critical appraisal. There are no markers of good or bad taste that have yet been laid down. It’s a safe place to park your money.”
America's Secret History of Atomic Accidents
Gregory D. Koblenz in Foreign Affairs:
Between 1950 and 1980, the United States experienced a reported 32 “broken arrows,” the military’s term for accidents involving nuclear weapons. The last of these occurred in September 1980, at a U.S. Air Force base in Damascus, Arkansas. It started when a young technician performing routine maintenance on a Titan II missile housed in an underground silo dropped a socket wrench. The wrench punctured the missile’s fuel tank. As the highly toxic and flammable fuel leaked from the missile, officers and airmen scrambled to diagnose the problem and fix it. Their efforts ultimately failed, and eight hours after the fuel tank ruptured, it exploded with tremendous force. The detonation of the missile’s liquid fuel was powerful enough to throw the silo’s 740-ton blast door more than 200 yards and send a fireball hundreds of feet into the night sky. The missile’s nine-megaton thermonuclear warhead -- the most powerful ever deployed by the United States -- was found, relatively intact, in a ditch 200 yards away from the silo.
The Damascus accident epitomizes the hidden risk of what the sociologist Charles Perrow has dubbed “normal accidents,” or mishaps that become virtually inevitable once a system grows so complex that seemingly trivial miscues can cause chain reactions with catastrophic results. As the journalist Eric Schlosser explains in his new book, Command and Control, “The Titan II explosion at Damascus was a normal accident, set in motion by a trivial event (the dropped socket) and caused by a tightly coupled, interactive system.” That system, he writes, was so overly complex that technicians in the control room could not determine what was happening inside the silo. And basic human negligence had only made things worse: “Warnings had been ignored, unnecessary risks taken, sloppy work done.”
Epistemology, Democracy, and the dynamic duo of 3 Quarks Daily
From 3:AM Magazine:
Robert B. Talisse (on the right of the picture) and Scott F. Aikin (on the left of the picture) are the dynamic duo of 3Quarksdaily, thinking about the social nature and political significance of argument, about the two things the word ‘argument’ captures, about the straw man fallacy, about misfiring sound arguments, about the intimate connection between epistemology and democracy, about the nature of democracy, pragmatism and Rawls, about Dewey, Elizabeth Anderson and Peirce, about ‘pluralism’ as a halo term, about the truth orientation of our cognitive life, about Nietzsche’s challenge, about being fearless about the fear of regress, about the use of tone, about the need for political arguers and the dangers of cognitive insulation, about when to revise one’s beliefs, about civility in argument and about why their new book is keyed to all contemporary democracies. Epistemocracy doubled!
3:AM: What made you become philosophers?
Scott Aikin: I was a Classics major at Washington University in St. Louis, and I was very lucky to have the patient instruction of Merritt Sale, George Pepe, and Carl Conrad there. We would have class discussion about some line from Seneca or Plato, and I’d get hung up on some philosophical issue. I originally thought it was because my ancient languages weren’t good enough, but it became clear that disagreements about virtue or knowledge aren’t solved by dictionaries, but by doing some philosophical work. You had to think about what virtue and knowledge really are. It was like my mind caught fire – I was eighteen years old and could dispute with the greats on what was good and true. Authority with these matters came with having reason on your side, not any status or anything like that. It was exhilarating, and that anti-authoritarian appeal of philosophical work still enlivens me.
Robert Talisse: I grew up in northeastern New Jersey, and I took a class in Philosophy in my senior year in high school. The class was a survey of the great philosophers’ ideas, paying nearly no mind to the arguments they devised. I liked that class, but it left me with the impression that Philosophy was a dead discipline, something that had ended in the 19th Century. So, when I entered William Paterson College (it was not yet a university then), I was not aware that it was possible to major in Philosophy. I spent my first semester as an Economics major, but once I discovered that there was a Philosophy major, I switched immediately. At the time William Paterson was a small commuter school filled with Business majors, yet somehow there was a critical mass of really serious Philosophy students, all of whom eventually earned PhDs, and many of whom are now professional philosophers. In any case, I quickly learned there that Philosophy is about challenging those (including oneself) who claim to know. Like Aikin, I latched on to the anti-authoritarianism of it all. And I soon realized that the impression of Philosophy that I got from my high school class – that it had died as a discipline – was exactly wrong. Philosophy is one of the few disciplines that is not dead. I eventually found myself with a PhD in Philosophy from CUNY and a job at Vanderbilt as a philosopher. To be honest, I’m not really sure how it all happened.
When Minority Students Attend Elite Private Schools
Judith Ohikuare in The Atlantic:
Dalton is a prestigious, decades-old, K-12 prep school on New York City’s Upper East Side that filters its students into the best universities in the country. In 2010, Forbes reported that 31 percent of its students matriculated into MIT, Stanford, or an Ivy League institution. Former students include Anderson Cooper, Claire Danes, and Ralph Lauren’s daughter Dylan. Even imaginary people make sure their families are present for parent-teacher conferences. For years, however, Dalton was largely inaccessible to minority and lower-income students. Maintaining its reputation as a top-tier place of learning did not require administrators to extend invitations to those groups.
When Idris Brewster and his friend Seun Summers entered kindergarten at Dalton in the late 1990s, they were among the few students of color in their class. Idris and Seun’s parents believed that getting into Dalton was the first step to a life filled with accomplishments.
"Students that came out of independent schools were well-prepared on the level of networking, internships, job and school opportunities—you name it—and we were offered great financial-aid incentives," Michèle Stephenson, Idris's mother, told me. "We thought this intensive, intellectually stimulating institution would open doors for Idris and take him anywhere he wanted to go."
Fourteen years later, Idris's parents have released American Promise, a documentary that records the boys' personal and academic experiences from kindergarten through senior year of high school. The film reveals a hard truth about being a student of color at an elite school: Simply being admitted doesn't guarantee a smooth or successful educational journey.
More here. [Thanks to Anjuli Raza Kolb.]
Google Zeitgeist: Here's to 2013
In the Human Brain, Size Really Isn’t Everything
Carl Zimmer in The New York Times:
There are many things that make humans a unique species, but a couple stand out. One is our mind, the other our brain. The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking. The human brain is exceptional, too. At three pounds, it is gigantic relative to our body size. Our closest living relatives, chimpanzees, have brains that are only a third as big.
Scientists have long suspected that our big brain and powerful mind are intimately connected. Starting about three million years ago, fossils of our ancient relatives record a huge increase in brain size. Once that cranial growth was underway, our forerunners started leaving behind signs of increasingly sophisticated minds, like stone tools and cave paintings. But scientists have long struggled to understand how a simple increase in size could lead to the evolution of those faculties. Now, two Harvard neuroscientists, Randy L. Buckner and Fenna M. Krienen, have offered a powerful yet simple explanation. In our smaller-brained ancestors, the researchers argue, neurons were tightly tethered in a relatively simple pattern of connections. When our ancestors’ brains expanded, those tethers ripped apart, enabling our neurons to form new circuits. Dr. Buckner and Dr. Krienen call their idea the tether hypothesis, and present it in a paper in the December issue of the journal Trends in Cognitive Sciences.
The Genius in All of Us
From The Genius in All of Us: New Insights into Genetics, Talent and IQ by David Shenk via delancyplace:
Genius. The popular conception of genius is that it is an inborn gift, yet an increasingly large body of research suggests the opposite -- that genius is always the product of sustained effort. A case in point -- Mozart:

"Standing above all other giftedness legends, of course, [is] that of the mystifying boy genius Wolfgang Amadeus Mozart, alleged to be an instant master performer at age three and a brilliant composer at age five. His breathtaking musical gifts were said to have sprouted from nowhere, and his own father promoted him as the 'miracle which God let be born in Salzburg.'

"The reality about Mozart turns out to be far more interesting and far less mysterious. His early achievements -- while very impressive, to be sure -- actually make good sense considering his extraordinary upbringing. And his later undeniable genius turns out to be a wonderful advertisement for the power of process. Mozart was bathed in music from well before his birth, and his childhood was quite unlike any other. His father, Leopold Mozart, was an intensely ambitious Austrian musician, composer, and teacher who had gained wide acclaim with the publication of the instruction book ... Treatise on the Fundamental Principles of Violin Playing. For a while, Leopold had dreamed of being a great composer himself. But on becoming a father, he began to shift his ambitions away from his own unsatisfying career and onto his children -- perhaps, in part, because his career had already hit a ceiling: he was vice-kapellmeister (assistant music director); the top spot would be unavailable for the foreseeable future.

"Uniquely situated, and desperate to make some sort of lasting mark on music, Leopold began his family musical enterprise even before Wolfgang's birth, focusing first on his daughter Nannerl. Leopold's elaborate teaching method derived in part from the Italian instructor Giuseppe Tartini and included highly nuanced techniques ...
"Then came Wolfgang. Four and a half years younger than his sister, the tiny boy got everything Nannerl got -- only much earlier and even more intensively. Literally from his infancy, he was the classic younger sibling soaking up his big sister's singular passion. As soon as he was able, he sat beside her at the harpsichord and mimicked notes that she played. Wolfgang's first pings and plucks were just that. But with a fast-developing ear, deep curiosity and a tidal wave of family know-how, he was able to click into an accelerated process of development. "As Wolfgang became fascinated with playing music, his father became fascinated with his toddler son's fascination -- and was soon instructing him with an intensity that far eclipsed his efforts with Nannerl. Not only did Leopold openly give preferred attention to Wolfgang over his daughter; he also made a career-altering decision to more or less shrug off his official duties in order to build an even more promising career for his son.
...The tiny Mozart dazzled royalty and was at the time unusual for his early abilities. But today many young children exposed to Suzuki and other rigorous musical programs play as well as the young Mozart did -- and some play even better. Inside the world of these intensive, child-centered programs, such achievements are now straightforwardly regarded by parents and teachers for what they are: the combined consequence of early exposure, exceptional instruction, constant practice, family nurturance, and a child's intense will to learn.
Let me make this perfectly clear.
I have never written anything because it is a Poem.
This is a mistake you always make about me,
A dangerous mistake. I promise you
I am not writing this because it is a Poem.
You suspect this is a posture or an act
I am sorry to tell you it is not an act.
You actually think I care if this
Poem gets off the ground or not. Well
I don't care if this poem gets off the ground or not
And neither should you.
All I have ever cared about
And all you should ever care about
Is what happens when you lift your eyes from this page.
Do not think for one minute it is the Poem that matters.
It is not the Poem that matters.
You can shove the Poem.
What matters is what is out there in the large dark
and in the long light,
by Gwendolyn MacEwen
McClelland & Stewart, 1987
Monday, December 30, 2013
by Lisa Lieberman
In the memoir he was writing at the time he died, my friend Avresh described returning to the Czechoslovakian town of Sevlush, his birthplace, in the winter of 1946. He'd left some fifteen years earlier to attend a Jewish gymnasium in a larger city, stayed on to study engineering at the university and never looked back. This was his mother's wish for him: that he enter the great, free, secular world, liberate himself from the narrowness of his tradition. Escape.
When the Germans occupied Czechoslovakia in 1939, Avresh joined the Communist resistance. Captured and tortured by the Gestapo but inexplicably released, he made his way to the Soviet Union, expecting to be welcomed with open arms, a comrade in the fight against Nazism. Instead he was arrested at the border and charged with espionage—the fate of most Jewish refugees from Eastern Europe. Avresh spent two and a half years in the Gulag, shuffled from one prison camp to the next, but ended up an artillery officer in a Czech unit of the Russian army;
by the time he was discharged, he'd earned four medals for his service on the Eastern front. His favorite featured a picture of Stalin.
So it was as a decorated officer in a Russian army uniform that he returned to his town after the war. All the Jews were gone, rounded up and deported to Auschwitz. A Slovak family was living in his childhood home and not a trace of Avresh's own family remained. Looking for answers, he went to the neighborhood synagogue and peered in the door. The sanctuary, the balcony, the corridors and stairways were cluttered with belongings: furniture, pots and pans, bedding, books, knickknacks and photographs. A policeman stood watch over the household goods of the departed Jews of Sevlush. Town officials had collected the Jews' possessions and stored them in the synagogue to prevent looting. No Jews had returned to claim their things. Was there something he wanted from the collection, the policeman asked, some memento?
Avresh said he took nothing when he left Sevlush, but this is not strictly true. He carried no objects away from the synagogue, no material belongings, pointedly refusing the money the officials offered as "rent" on his family's house. What he took, along with the burden of guilt he carried—"I share the usual remorse of most Holocaust survivors lamenting why they are alive and why they did not try harder to save their perished family," he wrote in his memoir—what he took, I would say, was a sense of spiritual belonging, the token that remained of his Jewish inheritance.
My friend was not a believer. Faith in God could not console him for the loss of his father and siblings, his beloved mother. Proud of his youthful "love adventures" with the beautiful gypsy girls of Sevlush, unfaithful to his wife of forty-some years, Avresh was hardly a paragon of virtue. Quite the opposite: he played the part of a ladies' man with relish well into his eighties, calling my office and leaving cryptic messages in my voice mail as if we were carrying on some tryst behind our spouses' backs when all I wanted was to schedule him to speak in one of my classes. Hitting on my female students. And yet he tried to be useful to humanity, to heal the world as Jewish tradition instructed, in his own way.
I have a vivid memory of Avresh, age ninety, protesting in front of the Pennsylvania State Capitol building in Harrisburg on a bitter January day, demanding a moratorium on capital punishment in the name of human dignity. In the Gulag, when a fellow prisoner collapsed from overwork, he told my students, they left him lying in the snow. He could not stand by now and watch a man die. It was his obligation to speak out, his obligation as a Jew, a fallen Jew who no longer lived according to the laws of his tradition, but one who clung stubbornly to a shred of that tradition: the lesson that life is sacred and that dignity is owed to even the poorest and the most degraded members of society.
* * *
Dignity was the core value of the Polish physician, educator, and Godless Jew who is the subject of Andrzej Wajda's film Korczak (1990). In the face of terrible poverty and disease—conditions in the Warsaw Ghetto were appalling, the death rate surpassing 5,000 per month out of a total population of some 470,000 inhabitants—Janusz Korczak's orphanage was a model of order and civility. His charges were not only fed and clothed, but they were also educated to the highest standards even as the deprivations increased. Musical recitals continued to be held, art lessons, dramatic performances, poetry readings all went on as if these children's world were not coming to an end and as if the outside world had not turned callous and lost interest in the plight of such innocents. And when the round-ups began, Korczak accompanied his orphans in the cattle car to Treblinka, keeping the truth from the children so that they might meet their deaths with composure.
Korczak's insistence on upholding the cultural values of the European elite amid the squalor of the Ghetto functioned as an eloquent defense against Nazi efforts to degrade the Jews by reducing them to the level of beasts, both at the time and symbolically, in the commemorative literature that turned him into a legend after the war. But Wajda's film ends on a surreal note that undercuts this message. The Nazis raid the orphanage. Korczak bargains for a few extra minutes, to give him time to organize an orderly exodus. The children have been told they are going on a trip to the countryside. Obediently, they line up behind the adult staff members, each carrying a little knapsack filled with cherished belongings. We see them marching past armed German soldiers, soft snowflakes floating about, like feathers. We seem to have entered the realm of bedtime stories, a muffled world where bad things happen without touching us. Here's how cinema critic Danièle Heymann described what happens next in her review for Le Monde:
The deportation orders are signed. The liquidation of the ghetto is underway. Under the Star of David, the children and Dr. Korczak enter the sealed carriage singing.
And then the doors swing open—a coda to a sleepy, disgusting dream on the edge of revisionism—and we see how the little victims, energetic and joyful, emerge in slow-motion from the train of death. Treblinka as the salvation of murdered Jewish children? No . . . Not ever.
Heymann was offended by Wajda's Christianizing impulse, which presented the children's extermination as salutary suffering for the edification of humanity, but it is worth noting that Elie Wiesel's Night (1958) was embraced by Nobel laureate and Catholic humanist François Mauriac in similar terms, as a spur to Christian faith. In his foreword to the French edition (which still appears in American editions of the book), Mauriac recalled his first encounter with "the young Jew" (Wiesel) in the light of his subsequent reading of the novel.
What did I say to him? Did I speak of that other Jew, this crucified brother who perhaps resembled him, and whose Cross has conquered the world? Did I explain to him that what had been a stumbling block for his faith had become a cornerstone for mine?
Christianizing the Holocaust was nothing new, although Wajda did not help his case by insisting that it was art's responsibility to be uplifting. "There would have been nothing easier than showing the death of the children in the gas chamber," he said. "[But] it seems to me that it is beautiful that when we do not agree to the fact that the children were gassed, we create a legend that these children go somewhere, into some better world."
I do not want to diminish the seriousness of the complaints against the film. After the fall of Communism, the Holocaust became contested ground as Polish Catholics sought to reassert their identity by memorializing Catholic victims of the Nazis—something they were not permitted to do when Poland was part of the Soviet Bloc. The controversy surrounding the opening of a Carmelite convent near Auschwitz had reached a peak at the very time that Wajda was making Korczak. Polish nationalists would eventually erect hundreds of crosses at the site, angering Jewish groups and provoking a showdown with the government. A large cross commissioned in 1979 for a mass celebrated at Auschwitz by Pope John Paul II still stands in view of the camp. As historian Omer Bartov argues in The "Jew" in Cinema, "Whether the cinematic Korczak speaks as a Pole or as a Jew, he is clearly represented in the film—and remembered by the Poles—as a Pole who chose to share the fate of the Jews in the heroic manner befitting his nation."
And yet, viewing Korczak alongside Wajda's famous World War II trilogy, I've come to appreciate the director's intentions. A Generation (1955) dramatizes the Communist underground's involvement in the Warsaw Ghetto uprising. Kanal (1957) follows a doomed group of Warsaw resistance fighters as they battle it out with the Nazis, much of the action taking place in the city's sewers, while Ashes and Diamonds (1958) focuses on the confrontation between Polish partisans and the incoming Communist forces on the last day of the war. None of these films is uplifting; Wajda conceived of them as tributes to people defending lost causes. "One has to fight to the end," he said of Kanal. What made the characters heroic was their ability to master their fear.
I would like to suggest that Korczak was no less of a tribute to a heroic warrior, although the fight, here, was a nonviolent one. In a key scene toward the end of the picture, Korczak is confronted by a former charge, an orphan who has grown up to become a member of the Jewish underground. The young man sneers at his teacher's pacifism. Ghetto Jews are colluding in their own destruction thanks to teachings like his. What is wanted is armed resistance.
My friend Avresh shook his head at this point. Armed resistance got him nowhere (unless you count the Stalin medal). He owed his survival to sheer luck. The family he left behind in Sevlush was not so lucky, but Wajda's film allowed him to imagine that they went to their deaths with dignity.
Pakistan's War - Part I
by Ahmed Humayun
(This is the first post in a series on Pakistan's struggle against militancy).
Almost a decade in, the rebellion by the Pakistani Taliban against Islamabad shows no signs of flagging. Tough, savvy, and agile, the insurgents have expanded their campaign from the isolated northwestern tribal regions all the way to urban centers in the south such as the port city of Karachi. Their declared agenda has grown with each success: they first demanded acceptance of their control over large swathes of the tribal areas; they then denied the authority of Islamabad across Pakistan altogether; today, influenced by Al Qaeda's rhetoric, they boast of sending fighters to wars in Arab lands and attacking the United States.
We need not accept all their grandiose declarations at face value. When it comes to global terrorism, in particular, there is a chasm between their rhetoric and their capacity. The only terrorist plot on American soil they can claim is that of the failed Times Square bomber in 2010. The evidence of Taliban involvement in Middle Eastern battlefields is ambiguous at best. And their operations are constrained by an overall pool of fighters that is small: estimates vary because data is hard to collect and the definition of an active fighter is murky, but at any given time there may be only between ten and twenty thousand rebel fighters.
But the insurgents have substantially expanded their campaign within Pakistan itself. They have strategic clarity where Islamabad does not and their aspirations have been whetted by the confusion of the state. In recent years the rebels have complemented their fight against Pakistani armed forces in the tribal areas with a systematic campaign of terrorism in towns and cities across the country. To this end the insurgents have leveraged and expanded a vast ‘infrastructure of extremism', which originates in decades of state sponsorship of non-state militant groups.* The network includes combat trainers, militant recruiters, funders, suicide jacket makers, indoctrinators and foot soldiers who have access to training camps, safe houses, telephone getaway exchanges, madrassahs (some, not all) and highly sophisticated media communications facilities across the length and breadth of Pakistan. The insurgents are not cave dwellers: they are adept organization builders who have institutionalized the production of terrorism as one weapon in their broader war against the state.
It is easy to get ensnared in the web of this sprawling infrastructure. This past November, I interviewed a former senior Pakistani police officer who has investigated terrorism cases for over twenty years. He told me about a 16-year-old caught with two hundred and fifty kilograms of explosives in a failed assassination attempt on former Pakistani President General Pervez Musharraf. When he asked the boy why he became a suicide bomber, the boy replied: "Maybe all of you are right when you say ‘madrassah students hate us' but we are your kids. Our maulvi told us that killing Musharraf would send us to heaven but no one told us any differently." Some would-be bombers (and their families) are true believers; some are coerced; some are brainwashed; some are mentally ill; some are paid off; some are poor, some are middle class; some see no place in society for themselves and find they can play a role as a martyr. There is no one defined, predictable route to militancy, but the insurgents have shown flexibility in drawing potential recruits in.
The insurgent advance has been aided by the significant disparity between their capabilities and experiences relative to the civilian security forces. Insurgents are battle tested and have far more advanced weaponry than the demoralized policemen who, at the frontlines of the urban war, have effectively become cannon fodder in any confrontation. The rebels also tend to be much better paid. The owner of a security services company in Lahore that provides armed guards, many of whom are former police officers, told me his information suggests that the rank and file Taliban fighter gets paid at least twice the monthly salary of a low-grade police officer. The late Hakimullah Mehsud, who sat at the top of the rebel chain of command before being dispatched by a U.S. drone strike, owned a luxurious eight room compound in the tribal area of North Waziristan where his family resided and which was valued by one estimate at one hundred and twenty thousand dollars, a significant sum in Pakistan.
It is unsurprising, then, that state control in Pakistan's cities is eroding. A series of recent jailbreak videos released by the insurgency's propaganda arm show armed assaults on outgunned, overwhelmed prison authorities in cities such as Dera Ismail Khan in Khyber-Pakhtunkhwa province. The rebels have eviscerated many governance structures at the local level through assassinating government officials, local power brokers, and civil society activists; manipulated election outcomes by selectively targeting political parties; leveraged alliances and partnerships with an assortment of non-state groups outside the tribal areas, whether on the basis of shared ideology or short-term expediency or both; and participated in a vast range of criminal enterprises that have filled their coffers with proceeds from drug trafficking, kidnapping, bank heists, and racketeering.
The degree of insurgent infiltration varies by city. An editor at one of the country's largest dailies told me that of the one hundred and seventy eight local union councils (the administrative unit of local governance) in Karachi, perhaps in as many as sixty or seventy the police could not enter because of de facto control by non-state actors, of whom the Taliban form a rapidly growing constituency. History and geography matter here: where the state is weaker, opportunities for insurgents are greater. Local mafias, drug lords and crime bosses, often allied with local political parties, inhibited the state's writ in Karachi long before the Taliban showed up. The situation in Karachi is not therefore analogous to, say, the city of Lahore in the province of Punjab, where the provincial government has a tighter grip, due in part to greater investments in civilian law enforcement and service delivery. Nevertheless, Karachi is the country's largest city and its financial hub, and its ongoing subversion provides a template for the escalation of urban war elsewhere.
The insurgency's ability to overthrow the state faces some basic barriers. Pakistan is a big country of more than one hundred and eighty million people, and outright insurgent control outside the border areas, which have a unique history of semi-autonomy, is still limited. The Pakistani state can claim the seventh-largest army in the world, which has in recent years launched a series of offensives that have wrested back some territory in the northwest. And the Taliban model has limited ideological appeal, most of all where citizens have actually experienced rebel rule.
Nor is the Pakistani Taliban a monolithic movement. It is best understood as a loosely organized coalition of like-minded factions alternately cooperating and competing for recruits, funds, and credibility. Groups sometimes pledge allegiance to high-ranking rebel commanders and sometimes declare independence; they are often engaged in fierce turf battles with each other. Individual gangs may have strong leaders, but these do not necessarily exercise day-to-day operational control. These entities are decentralized networks rather than rigid top-down structures: after terrorist attacks, individual factions often do not claim immediate responsibility because they have to ask around to find out who carried out the operation. Limits on the insurgency's capacity for coordination and divisions among its factions can create an opening for effective state action.
But even if the full-on 'failed state' scenario seems implausible at this time, the fact is that the rebels do not have to overthrow the state to win. They are attenuating the relevance of an already-weak state and they are aggravating divisive trends in Pakistani society. The country has sustained enormous damage already: over the last decade, tens of thousands have died at the hands of insurgents; millions have been displaced by clashes between insurgents and government forces; the country's Ministry of Finance estimates direct and indirect economic costs upwards of sixty-five billion dollars since 2001 as a result of conflict; even foreign cricket teams don't tour the country anymore.
Equally important, the insurgency's successes have had a wider demonstration effect. Many militant groups in Pakistan that are not in revolt against the state have their own private, sectarian agendas. In a general atmosphere where it is perceived that violence can be committed with impunity, the operations of these organizations have expanded. As a result the country is turning into a theme park of religious and political violence. Pakistan may not be a failed state but it is certainly a fracturing society, the fissures beginning to widen between Deobandi and Barelvi, Sunni and Shia, an indication of the general tendency towards polarization.
If these trends are not arrested in the coming years a new social and political order may emerge. The Pakistani state will still exist and it will still be the single strongest player across the country’s territory, but its monopoly over force will gradually be reduced to scattered cantons. Divided sectarian communities will live under multiple, conflicting sovereignties that alternate between the state, insurgents, and criminals, the balance between them constantly renegotiated, region by region, through a combination of guerilla war, urban battle, targeted assassinations, backdoor political deals, and protection payoffs that purchase the peace, if only for a time.
*‘infrastructure of extremism': This was the characterization used in the report by the Abbottabad Commission (leaked by Al Jazeera) which was constituted by Islamabad to inquire into the circumstances leading to Bin Laden's assassination by American special forces.
Why Downton Abbey isn't as Good as People Think
by Alon Levy
There's a genre of shows, above the level of House or Friends and below that of The Wire, that exude high quality even if the actual level of characterization and plot isn't deep. Julian Fellowes' Downton Abbey is one of the prime examples of this genre. It's beautifully done and acted, has enough characters and plots to keep anyone's interest, and is full of references that seem smart.
It just so happens that none of these references is particularly intellectual or obscure. Instead, they're the sort of history that everyone knows. The first episode discusses the Titanic; we do not live in an alt history in which James Cameron chose to make more Terminator sequels in the 90s. Every time Lord Grantham's American wife's mother comes, we're treated to the usual tropes of differences between British and American culture. In the season that just concluded with its Christmas special, two additional common references are added: a rich English expat goes to Munich in 1922 and is killed by the early Nazi party because he vocally disagreed with them; and there's a subplot regarding Edward VIII's playboy philandering. This is about as smart as an American mid-18th century period drama inserting a reference to Washington not being able to tell a lie.
The problem is that even the stronger points of symbolism on the show are like this. The biggest is the analogy between the upstairs and the downstairs. The servants form a tight group (except Thomas and O'Brien) in which Carson is the father, Hughes is the mother, and the rest of the servants have a hierarchy in which valets and lady's maids are above the rest. Bates/Anna is of course parallel to Matthew/Mary, and the stronger parts of the show are the ones that showcase the differences between their relationships, with Matthew/Mary having more resources and more clout than Bates/Anna so that they face more rich-people problems rather than a possible execution.
The only problem is, the show didn't really invent this view of the butler as the father, the housekeeper as the mother, and the other servants as lesser members of the house. It was common in that era. I don't think it's as well-known a reference, but that symbolism is still a trope, and the servants' order of precedence within the great houses reflected it. It works well enough as a reference, but as symbolism, it's trite.
Everywhere else, Fellowes' Tory baron biases show. The show can't write women well, and descends to a virgin/whore/mother trichotomy. The only man who is as conniving as the median woman is gay. The treatment of race is facile. Lord Grantham is self-consciously written as an upper-class twit, but he doesn't suffer any consequences for it and is always saved by more competent family members, nor does he have interesting moral dilemmas. The characters are never shown to engage in any effort - they do some work and succeed, without any of the failures that are associated with actual effort. The show wants to be about the aristocracy's struggles with its decline after WW1, but it's instead about an aristocratic family that weathered all the troubles, which is about as interesting as any riches-to-riches story could be.
Early on, there are three pairs of women presented as bitches who connive against each other at every opportunity: O'Brien and Anna, Mary and Edith, Isobel and the Dowager Countess. In the first season, they were half the female main cast of the show, and the other half consisted of people who are mother figures (Lady Grantham, Hughes), had little plot at the time (Daisy, Mrs. Patmore, Sybil), or had plot but it was being put on a bus never to be seen again (Gwen). Anna improves, to the point of being presented as impossibly pure and saintly. The others don't. That's the virgin/whore/mother trichotomy. As far as we can tell, Anna is a virgin until she marries Bates, and she's so good and incorruptible and naive that when she's raped she blames herself, and of course will never tell a man, and of course will never try to stand up for herself the way a whore would. Sybil is a virgin who becomes a mother who the show couldn't figure out a plot for so she was killed. Edith is a whore - a more subdued one, but still someone who undercuts the family and then has to work to redeem herself (the only one who does) and who gets pregnant out of wedlock. Mary, the most complicated character, shifts more: she starts as a whore and redeems herself through marriage to Matthew, and after his death acts as a virgin. Even at the end of the latest Christmas special, Mary's playing two suitors against each other is chaste.
Anna's rape plot especially underscores this. We have never seen Anna have trouble standing up for herself. On the contrary, when O'Brien and Thomas wanted to frame them for something early on, she was the one who suggested framing them back, and Bates the incorruptible was the one who shot down the idea. Subsequently, as Anna became closer to Mary, she was the loyal servant-cum-spy, doing minor conniving. And now she's raped and like every saintly woman decides to blame herself, and needs other people, like her more blemished upper-class counterpart (Mary) or her mother figure (Hughes) to sort out her problems and out her to Bates. To remain unblemished, Anna can't reveal this to Bates, so he instead extracts this from Hughes; she can't say who the rapist was, so instead he figures this out himself, and it is clear that he will want to kill him. We even see the rapist set his sights on a kitchen maid just to underscore that Bates is morally right in killing him. There is no moral complexity here - Bates could just as well have killed the rapist without our seeing the rapist try to assault another character on the show.
The men who are not presented as unrealistically saintly are at least allowed to contain multitudes, except Thomas, who after O'Brien's departure is the sole token evil character, and also happens to be the only openly gay character. Bates is as saintly as Anna but can at least do dark things, though the show takes his side in the most blatant way possible. These are the rules of the society the show is set in, but these are also the rules of the show itself. What Bates is allowed to do while staying in the presumed viewers' good graces, Anna and Mary are not. Mad Men does this better - the society in the early seasons has strict gender rules, but Peggy manages to break many of them and remain a well-rounded character rather than the Bad Girl.
The Dowager Countess is a practical, witty person, saying things that sound clever when said on the fly and trite when a screenwriter has time to think them up. The West Wing has a similar problem with substituting witty banter for moral complexity and character development; Downton Abbey wisely limits itself to 1.5 such characters, the half being Isobel. But I digress. As a woman, the Dowager Countess is dishonest and manipulative; it's all for good causes, but somehow Matthew doesn't need to be so manipulative to get what he wants. But as a noble, her horribleness to the working class is an informed trait, and is expressed more in her clothing choice and her microexpressions than in how she acts in any individual case. She doesn't treat Tom well, but she doesn't do bad things to him the way Thomas does; even Mary and Edith made more of an effort to stop his relationship with Sybil. Isobel says of her that she'd be against teaching the working class to read, and yet she helps laid-off servants find jobs elsewhere. I don't think it's meant as a critique of the mainstream view of racism/classism/sexism as a string of individual acts rather than as systemic inequality; the show isn't that self-aware. It's just that the show can't bring itself to show the nobles as callous to the commoners without extremely good reason. The show seems to treat the Dowager Countess as a woman who looks down on the working class but helps working people when it matters, rather than as a woman who helps servants she knows but looks down on them. The order of those 'but' clauses is important.
Lord Grantham is an upper-class twit. He has much more rigid views than his supposedly more classist mother the Dowager Countess, and at several points in the plot his decisions would have doomed his lifestyle if he weren't bailed out by smarter characters. He faces bankruptcy for having invested all his money in one railroad that ended up failing, but Matthew bails him out with money he inherited from Lavinia's father (with the required guilt about marrying someone other than Lavinia). The estate is mismanaged and he needs Matthew and Tom to run it competently, but he doesn't need to compromise any of his lifestyle for that. In season 4 there's a crisis of ownership since he wants the estate back after Matthew dies, and Mary has to wrest the estate from him, but she has a convenient will by Matthew giving her control. No servants or tenant farmers are laid off, except for the son of a dead, rent-delinquent tenant, whom Lord Grantham loans money to and who becomes competent and steps forward as a potential manager. In other words, nobody is hurt by Lord Grantham's stupidity.
Mary then steps up, and has the formal papers to back it up. She is visited by a pair of government officials, one of whom she knew once, who survey postwar agriculture. The one she didn't know before, Blake, lets her know he doesn't care for the survival of the aristocracy and that he's only there to make sure food supplies won't be disrupted while capitalism displaces aristocracy. But not to worry: as baby pigs arrive on the farm their trough is knocked over, so Mary and Blake have to join together to haul water from a barn to give the pigs lest they die of thirst and the Downton farm probably collapse. Mary is covered in mud, and Blake comments that she's good at farm work.
Note how nobody on the show needs to make any serious effort: one try is all that's required, and you'll be good at it no matter what. If you've never seen a farm animal up close in your life you'll still make a good farmer. Alfred, too, when he leaves to work as a cook for the Ritz, is rejected, but almost instantly thereafter hired anyway as one of the people ranked above him on the cooking test leaves. There's no need for the long process of learning, or for a mentor as in the hero's journey trope. I've done wrong and/or uninteresting math; so have my peers and mentors with multiple papers in top journals and tenure or tenure-track positions at top universities. I've also done a lot of bad writing, and so have professional writers. There's a trope that first drafts of books always suck. I'm sure that Fellowes has written his share of crap, but either he forgot about it, or he thought British viewers are morons who can't stomach the processes of slow learning and of learning by failing. The most popular newspapers in Britain are the Sun and the Daily Mail, so if it's the latter then Fellowes may well be right.
Even after the pig story, the show pulls its punches. The tenant farmer who they were about to evict who Lord Grantham loaned money to steps up, and says he's experienced with pigs and could take over the day-to-day running. Mary dirties herself once to prove her bona fides to Blake, but then puts lowborn people in charge of dirtying themselves subsequently. Blake and Mary still take an instant liking to each other, but there's a problem: Blake is a commoner who is against aristocracy, and Mary is pursued by another man, a lord, who she liked at the beginning of the season but spurned out of mourning for Matthew. Mary is still class-obsessed and has trouble marrying a commoner, but in the Christmas special, the other love interest tells her that Blake is in fact the heir to a rich barony, and just doesn't like to mention his title to people. No need to reexamine her assumptions, then - she's set. The ideal woman in a romance story is a naturally pretty woman who doesn't act like it and doesn't care much for looks, but still looks like a model and will get a makeover by a side character; the ideal man is the aristocrat who doesn't care much for his heritage, but is going to inherit it anyway.
And they're so good to everyone. The strong always take care of the weak, and always know how to do so in a way that optimally matches workers to their skills. Lord Grantham never abuses others' trust in him, or physically assaults his servants. The House of the Spirits has no trouble writing hacienda owner Esteban Trueba as a sympathetic aristocrat, who nonetheless is authoritarian, looks down on others, and rapes his tenant farmers. This isn't treated as a harmless vice, but as a horrific aspect of the hacienda system that ends up biting his family many decades later. Downton Abbey doesn't have the ability to look critically at breakdowns or abuses of authority, and just wills them away. At Downton, there are no abuses. Nor are there mistakes with long-term consequences, with the partial exception of Sybil's death, in which two doctors disagreed and Lord Grantham picked the wrong one. In the show's world, there's no need for uniform wage scales or formal training, or for worker agitation or unions, or for governments and regulators except as advisors to the aristocrats. The aristocrats know best, and if they don't, they'll learn everything effortlessly through their superior breeding.
At a time when England is beset by rural flight, we never see any acrimony involving tenant farmers leaving, or demanding higher wages to the point that the family replaces them by machines. We see little of that in the house, too: technology isn't causing layoffs, and although Carson and Mrs. Patmore grumble about technology, the servants quickly accept it, and nobody is made redundant. The only layoff is of Molesley and that's because Matthew dies, and he is rehired as a footman after a few episodes. All departures are voluntary. Nobody demands raises. The servants accept their place in society, and the ones who don't try to marry up (which is treated as decent romance when Tom does it and as awful deceit when Edna does); Gwen and Alfred leave, but the house accepts that, and the jobs they leave for are ones that the modern viewer will view as unskilled and lowly. Nobody moves to York to start a small business.
Nor do we get much indication of the state of the British economy, with the exception of postwar inflation, in reality followed by a sharp recession, which doesn't seem to affect anyone on the show. Nobody moves to work at a factory at the start of the war, is laid off when the war ends, and drifts, unable to find work at Lancashire's declining textile mills and made redundant at the house by technology. This is not a self-aware show, painting problems of society while showing that the aristocrats barely even know of them, let alone care. The commoner characters do not discuss those problems, either.
After the war ends, all the characters say the world is changing. This is seen in small details: more cars, telephones, refrigerators, no more ironing newspapers, more women going out alone. But the servants are still not demanding anything the nobles wouldn't want to give them anyway, and from the perspective of the tenant farmer, all that's changing is that the methods by which his mostly idle landowner extracts income from him are becoming more modern and efficient.
Instead of going for this kind of conflict rooted in real English history, we only get to see palace intrigue, which could have been set anywhere. For the most part, we're told and shown that England is facing difficult times, but that if everyone pulls together and respects tradition and authority, nobody will suffer. The lord knows people and can choose, based on his personal judgment, whom to reward, and his choices are never shown to be wrong. There are a few freeloaders and people pulling in the wrong direction, but they are deviant and clearly wrong, and there's no conflict among good people. Watching the show you'd never understand why any state moved from administration by personal loyalty and family ties to administration by a merit-based bureaucracy.
Tom is a socialist and says his ideals are brought into conflict with his role as an estate manager, but this is informed or lampshaded conflict - we never actually see him face political dilemmas. He only has to deal with personal dilemmas, concerning relationships with other lowborn women. This isn't conflict; it's again palace intrigue in the form of gossip. He is changed after the Russian Revolution turns totalitarian, but this is a cheap punch, predictable to the viewers (the socialist believes the Bolsheviks won't execute the Romanovs), reinforcing an unreconstructed aristocratic message. If I want to see all the pieces fall together in support of an ideology, I can watch Socialist Realist art, or read Ayn Rand.
The show's view of race is in the same spirit. Rose surreptitiously dates a black man, an American jazz player. As in Guess Who's Coming to Dinner, she is naive and doesn't care what the world thinks, and he is world-weary and knows how awful the world is for interracial relationships. He even has his own mother opposing the engagement, since black people have to be shown to be symmetrically prejudiced against whites (in Guess Who's Coming to Dinner, the white woman's parents are eventually convinced, but the black man's father remains opposed). Mary discovers the plans and goes to him and asks him to call it off, since it's improper for Rose and she's only doing this to piss her mother off - she doesn't really love him even if she thinks she does. He responds by saying that because of all the problems facing interracial relationships he's already decided against marrying her. There's no conflict, again: Mary's racism hasn't caused any problems, and it was the black man rather than the white woman who decided against it. Good people - and the show unambiguously depicts the jazz player as good, while also being self-aware enough to avoid the Sidney Poitier stereotype - all pull in the same direction.
The problem is not that the show is conservative. Yes, Minister/Yes, Prime Minister advocated public choice theory and lampooned the British civil service as hopeless, and as a result it was Margaret Thatcher's favorite show. But it was capable of laughing at financiers and conservatives (Sir Humphrey is clearly Conservative) and portraying them as part of the system that makes government dysfunctional. Downton Abbey isn't that. It's a wish fulfillment exercise, and is incapable of showing any negative consequences stemming from sympathetic characters' actions. Jimmy McNulty and Don Draper screw up and cause real lasting damage to people or organizations that the viewers care about. Lord Grantham does not.
And McNulty and Don have interesting life stories. McNulty is a college dropout who joined the police when his wife got pregnant, and got tapped by homicide because as a beat cop he was better at it than one of the homicide detectives. He is shown to engage in concerted effort even when it's a relatively simple task, like calculating currents to figure out in which jurisdiction a victim was killed. It sometimes takes multiple episodes for the detail or for Major Crimes to follow clues. Don Draper is as close to literal rags-to-riches as possible; he takes multiple episodes to deal with personal crises, and needs to come up with good advertising pitches often on short notice, sometimes impressing his clients but sometimes failing. In contrast, the nobles are idle, not just in life, but also in terms of doing things that move the plot. Lord Grantham isn't changed by events, and Mary says she's changed but is only superficially different from her old self.
What makes a character a Mary Sue isn't that she's unrealistically perfect. Sidney Poitier's characters in To Sir With Love and Guess Who's Coming to Dinner are unrealistically perfect but are not Mary Sues. Rather, Mary Sues are defined around exercises in authorial wish fulfillment. The entire show works like this: if only all of England pulled in the same direction, just as Downton does, then we wouldn't have all these problems of political conflict. The aristocracy would still be intact and that would clearly be a good thing, and lords wouldn't be humiliated after WW2 by having to turn their houses into museums and work as tour guides. Lord Grantham is an exercise in wish fulfillment of a lord who succeeds despite never working at it, Mary is an exercise in wish fulfillment of a lady who gets whatever she wants despite only working toward it perhaps once per season, the house is an exercise in what-could-have-been wish fulfillment.
There's a stereotype that British shows are better than American ones, and that British culture is more intelligent than American culture. None of the people who subscribe to it outside the UK seems to have heard of the Daily Mail; the people within the UK who subscribe to it often read the Daily Mail and think it's good journalism. In Israel at least, there's the stereotype that Americans care more about ratings and Brits care more about quality. It doesn't pass a simple sanity check: the US has 5 times Britain's population, and more TV channels, so that its shows have much lower ratings on average, and need lower ratings to be commercially successful. Downton has 12 million UK viewers, the equivalent of 60 million American viewers; this is taken not as evidence that it appeals to a lowest common denominator as would be the case for any American show with the same ratings, but as evidence that it is so amazing that even the supposedly smart British all love it.
The real difference between American and British shows is how they flatter their audience's prejudices. In US mass-market shows, the writers flatter the viewers by making the characters so obviously stupid and unsubtle that the viewers feel superior; think, for example, how few times in mainstream US television a lie lasts longer than a single episode, and compare this with how long lies last on upmarket shows (typically at least a season arc) or in real life (sometimes forever). In British mass-market shows, and in some American period dramas on cable, the pandering is cleverer. The writers insert references that everybody knows and that everybody thinks they're clever for knowing, like the Titanic or the difference between British stiff upper lip and American straightforwardness. Viewers get to congratulate themselves for watching such a smart show with so many references that they get and with so much witty banter by the Dowager Countess. In period dramas like this, every issue is replaced by universal themes of palace intrigue with nice costumes. What takes courage is creating a sympathetic main character who is still flawed to the point of causing real damage to others, and either not changing or changing in a very hard manner. David Simon and Matthew Weiner, for all of their faults as writers, have this courage. Fellowes doesn't have that courage. He just wants us to bask in the glory of Merry Old England, when everyone knew their place.
SHOCK AND STOICISM
A doctor in tailcoats
Makes a house call.
He straps her ankles,
Shoves a buffer in her mouth.
Her husband kisses her eyes closed,
Pins down her fleshy arms.
The doctor pads her temples,
Inserts wires into a black box.
Bulbs flicker as he smoothes
Moonlight back in her throat.
Doctor unplugs his machine.
A boy with birthmark on forehead
Tiptoes to his mother’s bed
Where she calmly asks my name.
by Rafiq Kathwari, the first non-Irish winner of the 2013 Patrick Kavanagh Poetry Award, representing Ireland Literature Exchange at the Hyderabad Literary Festival, January 23-25, 2014.
Fitting and overfitting data
by Rishidev Chaudhuri
One of the main aims of modern science is finding mathematical expressions to describe the relationships between observed quantities. For example, Newton's law of gravitation tells us that the force of gravity between two bodies depends in a certain way on their masses and the distance between them; thermodynamics tells us that the pressure of a gas depends in a certain way on its volume and temperature; and an economist studying income might conclude that income increases with educational level according to some functional form.
Sometimes these mathematical relationships emerge from an underlying model. We might model a gas as made up of molecules that collide with each other and the walls of the container, think that pressure is a measure of collisions with the walls and temperature a measure of kinetic energy, and then our functional form is a result of mechanistic insight into pressure and temperature. In other cases, the relationships serve to provide a summary representation of the data (instead of giving you a list of pressures at various temperatures, I could say pressure=3*temperature) and, even without providing an explanation of how the relationship came to be, allow us to make predictions about new data (for example, I might have observed the pressures at temperatures of 30 degrees and 60 degrees and want to predict the pressure at 90 degrees).
As we choose a relationship (or hypothesis) to explain a given set of data, the two goals of accounting for the existing data and making predictions for new data are often in conflict. Look at the graph below, which plots the simultaneously measured values of two quantities, X and Y.
Say we're trying to describe this relationship in a way that allows us to predict the values of Y at unobserved points (for example, we haven't measured the value of Y when X is 0.25 and we want to predict this). A common thing to do is to draw a line along the middle of this scattered cloud of points and use this as an approximation of the underlying relationship.
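That line-drawing step can be sketched in a few lines of code. The data below is a synthetic stand-in, not the values from the graph: Y is assumed to depend linearly on X plus measurement noise, and the line through the middle of the cloud is chosen by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: Y depends linearly on X, plus measurement noise
x = np.linspace(0, 1, 20)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

# Draw the line through the middle of the cloud: a least-squares fit
slope, intercept = np.polyfit(x, y, deg=1)

# Use the fitted line to predict Y at an unobserved point, e.g. X = 0.25
y_at_quarter = slope * 0.25 + intercept
```

The fitted slope and intercept come out near the values used to generate the data, and the prediction at X = 0.25 is just the line evaluated there.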
What we've done here is to postulate that the relationship between the two quantities comes from a certain family of explanations (here, the set of lines with different slopes), and from this family we've picked a certain best explanation (the particular line we drew on the graph). We then noted that it did a reasonable job of explaining the data, and we now believe that it should similarly predict new data.
Why did we do this, and why is this belief justified? After all, the points don't lie exactly on the line, meaning that our relationship doesn't fit the data exactly. What if we instead fit the data using a family of curves that are allowed to wiggle a number of times (here, polynomials, which include terms like squares, cubes and so on), as below:
I've truncated the ends of the graph because the curve makes big loops back and forth. This function does a great job of fitting the data, accounting for every observed point perfectly. And note that this family of explanations is bigger than the previous family and, in fact, includes the family of lines: as you make the wiggles smaller and smaller you eventually end up with a line. Are we justified in believing that we've found the right relationship and will be able to predict new data perfectly? Intuitively, something about that seems wrong, and we might appeal to Occam's razor or some principle suggesting that simpler explanations are better. And this is often confirmed in practice: typically, when we try to predict the result of new observations we do badly, as below, where I've generated some more data (in grey).
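The perfect-but-wiggly fit is easy to reproduce in code, again on synthetic stand-in data rather than the figure's actual values. A degree-9 polynomial has ten coefficients, one per data point, so it can account for every observed point.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: an underlying line observed with noise
x = np.linspace(0, 1, 10)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

# Ten coefficients for ten points: the curve can pass through all of them
wiggly = np.polyfit(x, y, deg=9)

# The mean squared error on the observed points is essentially zero,
# even though the curve loops wildly between and beyond them
fit_error = np.mean((np.polyval(wiggly, x) - y) ** 2)
```

The near-zero fit error is exactly the "accounting for every observed point perfectly" above; the looping behavior between points is what the truncated graph hides.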
Here the line's performance on the first set of data and the second is roughly similar: it accounts for both reasonably well but not perfectly. On the other hand, the curve does very well on the first set of data and badly on the second. So what goes wrong?
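That comparison can be made concrete. In this hypothetical sketch (synthetic data again, generated from an assumed line plus noise), both families are fit to a first set of observations and then scored on a second:

```python
import numpy as np

rng = np.random.default_rng(2)

def observe(x):
    # Assumed true relationship: a line, measured with noise
    return 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

x_old = np.linspace(0, 1, 10)
y_old = observe(x_old)                    # the first set of data
x_new = np.linspace(0.05, 0.95, 10)
y_new = observe(x_new)                    # the new data (in grey)

def mse(coeffs, x, y):
    # Mean squared error of a fitted polynomial on a data set
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

line = np.polyfit(x_old, y_old, deg=1)    # the simple family
curve = np.polyfit(x_old, y_old, deg=9)   # the wiggly family

# The line's error is similar on both sets; the curve nails the
# first set and does worse on the second
errors = {
    "line_old": mse(line, x_old, y_old),
    "line_new": mse(line, x_new, y_new),
    "curve_old": mse(curve, x_old, y_old),
    "curve_new": mse(curve, x_new, y_new),
}
```

The exact numbers depend on the noise, but the pattern described above (similar line error on both sets; near-zero curve error on the first set and a worse one on the second) is the generic outcome.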
First, note what happens if the relationship really is a straight line but the points we observe don't lie exactly along this line for various reasons (perhaps our measuring instrument was noisy, perhaps each data point comes from averaging across insufficiently many samples, etc.). Naively, one might still think that using the family of hypotheses with more explanatory power is better; after all, this includes the simpler family of lines and so is perfectly able to describe straight lines. However, the problem is that the larger family is much too expressive and is also capable of describing the noise in our measurements, which isn't part of the underlying relationship. So when we generate more data, the more complicated explanation fails. We've tried to fit regularities where there are none and, in technical terms, we've overfit the data. Given that one of our primary goals is to extract simple regularities from the world, this is a useful reminder that just because our explanatory family allows us to describe simple relationships doesn't mean that it'll find them, and it tells us to be careful that our explanatory framework doesn't force regularities where there are none.
Now what happens when the true underlying relationship isn't either a line or one of those polynomial curves, but is something weird and squiggly and complicated (as in the points in the figures, which were generated by a combination of lines, sines and cosines)? After all, the world is a complicated messy place and most relationships we want to model are mediated by many unobserved factors. Are there still reasons to believe the simpler explanation over the more complicated one? Attempts to formalize this problem tell us that there are.
So we're given the data, and we fit it with a line and one of the polynomial curves, and we look at how well we've done and find that we've fit the data with some level of accuracy. Now we run our measurements some more or make more observations and want to know how well our fitted relationship predicts the new data. How does the accuracy with which we fit the existing data translate into the accuracy with which we can predict new data?
Intuitively, the simpler set of hypotheses can explain a smaller set of possible data and if we find that such a simple explanation does fit our data reasonably well, then it's more likely that we've found a real relationship. So how well we fit the first set of data is a good predictor of how well we'll fit the second set. On the other hand, a very complex set of hypotheses can construct a relationship for many possible sets of input data, so it's less surprising and informative when such a class of hypotheses is able to fit our data. If a complex explanation fits the data well, it may be because we've actually fit the relationship well but it also may be that we've just considered a broad enough class of explanations that it could fit any data we gave it without capturing an underlying relationship. So we're less sure that this good fit will generalize (though it might). Making this precise mathematically typically involves using a measure of the complexity of the class of possible explanations we're picking from and then showing that if we generate new data, the error we make when predicting this new data is less than the accuracy with which we fit the existing data combined with some factor that depends on the complexity of the class of explanations we considered.
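A schematic of the kind of guarantee described above, with generic placeholder symbols rather than any particular theorem's notation:

```latex
% With probability at least 1 - \delta over a draw of n observations,
% for every explanation h in the class \mathcal{H}:
\operatorname{error}_{\mathrm{new}}(h)
  \;\le\;
\operatorname{error}_{\mathrm{fit}}(h)
  \;+\;
C \sqrt{\frac{\operatorname{complexity}(\mathcal{H}) + \log(1/\delta)}{n}}
```

The second term is the penalty for considering a rich class of explanations: it shrinks as we gather more data and grows with the expressiveness of the class, which is exactly the tradeoff described above.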
There are all sorts of closely-related ways to keep a model simple. For example, we might restrict ourselves to a simple class of models like the set of lines above. Or we might use a broader class of models, but penalize unpleasant features, like excessive wiggliness or functions that take on very large values (this is called regularization). Or we might take a Bayesian approach and start out with some sort of prior probability distribution on functions, which effectively says that we think that simple descriptions are more likely than complex ones. Typically, we want to relax these constraints as we get more data (intuitively, we become more willing to believe a complex hypothesis as we get more data in support of it). The Bayesian methods do this semi-automatically. For the others, we might weaken our penalty or consider a larger class of functions as the data gets larger. And it's always a good idea to actually test how well the fitted relationship does, typically by fitting the relationship on only part of the data set and then testing it on the rest.
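As a hedged sketch of two of these ideas, penalizing unpleasant features (regularization) and testing the fit on held-out data, here is a ridge-style polynomial fit; the data, the penalty strengths, and all names are illustrative assumptions:

```python
# Sketch: ridge regularization plus a held-out test set.
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.2, 40)  # squiggly truth plus noise

# Fit on half the data, test the fitted relationship on the other half.
x_fit, y_fit = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

def design(x, degree=9):
    """Polynomial feature matrix [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

def ridge_fit(lam):
    """Minimize ||Xw - y||^2 + lam * ||w||^2 in closed form."""
    X = design(x_fit)
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y_fit)

def held_out_error(w):
    return np.mean((design(x_val) @ w - y_val) ** 2)

# A weaker penalty allows a wigglier fit; the held-out error tells us
# which penalty actually generalizes to data we didn't fit.
for lam in [1e-6, 1e-3, 1e-1, 10.0]:
    print(lam, held_out_error(ridge_fit(lam)))
```

The penalty strength here plays the role described in the paragraph above: relaxing it as the data set grows amounts to becoming more willing to entertain complex hypotheses once there is enough evidence for them.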
This is a very brief introduction to a large and fascinating field (look up bias-variance tradeoff, statistical learning theory, PAC learning, VC dimension and model comparison for good places to start learning more) that is particularly compelling in that it lets us take intuitive ideas of simplicity and explainability and make them precise, giving us new ways of thinking about classic and often-invoked ideas like Occam's razor.
 More precisely, it involves showing that this relationship holds with high probability. We could, after all, have been unlucky and gotten a really unrepresentative set of initial data.
Public Health, Personal Choice
by Ryan Seals
We are in the position, unique in human history, of possessing the knowledge of how to alleviate much of the unnecessary suffering in the world. What we lack is the knowledge of how to deliver and disseminate that knowledge, and of how to encourage its uptake. In an eerily prescient article of the sort at which he excelled, John Maynard Keynes wrote in The New Republic in 1932:
At present the world is being held back by something which would have surprised our fathers—by a failure of economic technique to exploit the possibilities of engineering and distributive technique; or, rather, engineering technique has reached a degree of perfection which is making obvious defects in economic technique which have always existed, though unnoticed, and have doubtless impoverished mankind since the days of Abraham.
Replace "engineering technique" with scientific understanding and "economic technique" with, well, some broadly defined sociopolitical will to implement our understanding, and Keynes's point is precisely applicable to the current state of health and longevity around the world.
Whether you read into modern medicine an amazing ability to deflect and defer nature's slings and arrows, or you bemoan the failed ‘War on Cancer' and the much-delayed genomic revolution, the fact remains that we now know the major ways for the majority of people to lead long, healthy lives. It isn't medical technology that allows people to live into old age; with the exception of vaccines, the causes of good health aren't to be found in hospitals and medical clinics. Nutrition, sanitation, and hygiene are the keys to population health, and it isn't too great an exaggeration to say that health depends, above all else, on where you live, eat and excrete. The solutions are, for the most part and in a strictly technical sense, rather simple and well understood.
Keynes wrote that statement in 1932, before modern vaccines and antibiotics, without much (successful) interventional surgery, and with scant knowledge of the mechanisms of disease. The 80 years before he wrote had seen a revolution in human health – maximum life expectancy had long been on its steep upward trajectory, and infant mortality in the Western world was fast becoming a rarity.
Overall progress in human health and longevity has been more or less constant for the past century and a half, an observation too often overlooked. If I were to nominate the single most important graph, I would point you to one produced by Jim Oeppen and James Vaupel for a 2002 article [pdf] in Science, shown below. It plots, since 1840, female record life expectancy (in other words, the best life expectancy on earth at any given time). What strikes one first is the linearity; the increase in best-performance life expectancy, while admittedly moving about the globe, has been quite constant. The slope, however, is the truly startling feature of the graph. Record life expectancy for women has increased at a slope of 0.24, meaning that two children born four years apart have been expected, on average and over the past century and a half, to have an entire year's difference in their life expectancy. If you are 40 years old, a child born today would be expected to live a full decade longer than you. (The horizontal lines denote various – erroneous – predictions of where life expectancy would peak. While the point here is a different one, such plots are a good antidote to overconfidence!)
That having been said, it's no secret that inequalities in health and longevity exist today, at greater levels than perhaps ever before. The Robert Wood Johnson Foundation has created a startling series of maps as depressing as the above is heartening, showing inequalities in life expectancy that exist within cities. These are individuals living just down the highway from one another. Within greater New Orleans, for example, there is a spread of 25 years between the highest and lowest life expectancies. While not as extreme, the pattern repeats in DC, Kansas City, Minneapolis, and the San Joaquin Valley.
The causes of such inequalities are many, and often complicated. Smoking, diet, environment, unemployment, crime, access to healthcare… a list to which much more could be added. But the fact that we can even make such a list, and with some degree of confidence explain the causes, is something new in human history. We may have partly bumbled our way into our current state, but once here we have come to understand it.
It's often said that public health deals with irreducibly ‘social' goods. Like neurons giving rise to the emergent property of consciousness, there are public health values – equality and access to name but two – that cannot be defined in any meaningful way at the individual level, and this justifies approaches that value the population over the individual. But the analogy fails in that consciousness, the emergent property of neurons in cooperation, is the ultimate valuable property; neurons are interesting only insofar as they support consciousness. The opposite is true in public health. All of the supposedly social goods are made meaningful by their effect on individuals. Inequality matters only when it is embodied in individuals, and affects their individual lives. In this it is important to remember that, while Maggie was wrong to argue "there is no such thing as society," she was right in her next line: "There are individual men and women, and there are families." Those individuals and families live in society; but society is real, and certainly affects them.
Now that we understand the causes and consequences of health inequalities, it has become incumbent on society to alleviate these inequalities. The rapid rise in the number and prominence of global health goals in the past half-century attests to this. We look to science to implement these changes in a more effective way than might otherwise be achieved; "otherwise" referring to blindly hoping for economic growth and a recurrence of the blind process by which much of the Western world has achieved long, healthy lives.
Fallibility is built into the process by which scientific evidence is accrued, and it might be left at that; so long as one remains open to new evidence, blind allegiance is effectively precluded. But of course there is, at the individual level, another equally important kind of fallibility, and it deserves a word of mention. Individuals are often at odds with the prevailing wisdom of the day. In some arenas we might write such persons off as charmingly idiosyncratic, but more and more we tend to criticize individuals for not following "best practices."
Once a fact or theory becomes well established, particularly in an arena related to personal health or safety (smoking, seatbelts, weight control) we tend to moralize, and judge negatively those who, for whatever reason, extend their eccentricity to false beliefs in such areas. When individuals, by choice or circumstance, seem to be at odds with the accepted knowledge of how to improve their health, we naturally look to encourage them to adopt better behaviors. This sums up, quite baldly, a great deal of public health research today.
The operative word in that last paragraph, though, was ‘encourage'. On the boundary between encouragement and compulsion, a little Locke is in order:
Every man has commission to admonish, exhort, convince another of error, and, by reasoning, to draw him into truth; but to give laws, receive obedience, and compel with the sword, belongs to none but the magistrate.
The more difficult type of fallibility, incumbent on those of us who engage with false belief, is to remember that we compel as man, not magistrate. This fallibility doesn't necessarily imply a rejection of legal means, or even certain types of compulsion, only a considered reluctance to impose our beliefs on others – one of the pillars, after all, of liberal democracy. If chiropractic and herbal supplements are being sold as medical interventions, they should be regulated as such. But outright prohibition, misguided as these techniques may be, is an illiberal overreach. Whether you hold individual liberties as the foundation of morality, or you believe that respecting individual liberties is the surest way to encourage human wellbeing, modern progress has been founded on a respect for individual rights, wrong as individuals so often are.
And thus we return to the modern technocratic idea of the "nudge". Can we, without significantly curbing individuals' choices, encourage behaviors that lead to demonstrably better social indicators of health and longevity? There is, of course, a wealth of work to be done at the purely societal level: environmental improvements, economic growth, healthcare access. But even these involve impacts on individuals, if only in the decision of how to allocate limited resources. The two approaches converge on the question of what a significant constraint on individual choice means.
My goal over the next few months will be to explore the gap between, in Keynes's words, "engineering technique" and "economic technique", particularly in the realm of health and longevity. How does evidence become translated into health policy? What do we know about long, healthy lives, and why don't more people live them? How do we, how should we, value health and longevity? Is the push towards a technocratic approach the right one, or are there intractably moral and political questions to be answered, touching on deep values about what constitutes the best way to live? (hint: I think the answers are yes and yes). I encourage comments and suggestions about avenues to explore.
In that same article Keynes displayed an admirable sense of perspective. "My goal is the ideal; my object is to put economic considerations into a back seat." Always one to recognize that economics serves in the interest of human flourishing, Keynes looked forward to a time when mankind could afford to do what was right, rather than only that which was "economically sound". Our ideals change with progress, but we always have them. The gap between reality and the ideal is real and, by some accounts, growing. Closing that gap, when we can explain it so well, if only at one level, is the challenge of the 21st century.
Simon Beck. #5.
"Using an orienteering compass, measuring tape and a pair of snowshoes, 54-year-old Simon Beck turns the hills and frozen lakes around Les Arcs into geometrically-perfect immaculate masterpieces. His intricate prints are huge, often spanning the equivalent size of six football fields, but while you’d be tempted to think Beck needs at least several days to complete just one of these patterns, he really only needs about 10 hours, on average."
Thanks to Walter Johnston.
My Grandmother's Democratic Party (Part 1)
by Debra Morris
Until my grandmother—whose 100th birthday we celebrated this year—took up residence first with my parents and then at the care center where three of her sisters also spent their last years, she lived independently and, in many ways, unconventionally. (Whereas she is content to describe her long life as "good," my grandmother deviated from the norms of small-town Texas just enough, and in enough domains of her life, for that life to seem quite remarkable to me. That nearly everyone calls the lady "Morris"—a long story, but it originated when I was very young and couldn't replicate my mother's polite "Mrs. Morris," so I shortened it and the name stuck—is only the first of many odd details that I'd need to explain to anyone meeting her for the first time.) When her husband suffered a fatal heart attack after a morning spent plowing, she inherited a prosperous family farm and kept it that way for four decades more. She hosted retired teacher banquets, a duty (though certainly not a grim one; my grandmother was the type to understand it as a duty nonetheless) born of a storied 40-year career as teacher and principal in the Quail Rural Consolidated School District (the largest such district in the country at the time). To this day, she is my family's only elected official, having served a term as the County Superintendent of Education. For many years she split her leisure time between a full slate of daytime TV dramas—what she called her "stories"—and virtually any televised sporting event. Whenever I asked, she could catch me up on the tangled relationships and intrigues of any given soap opera, somehow managing to dignify the most idiotic plot or one-dimensional character.
She could conjure the same remarkable effect with sports; normally oblivious, I would suddenly understand the beauty and depth of a sport (who knew golf could be anything but tedious?), envying her effortless command of baseball stats and NFL playoff hopes, and sharing her quiet marvel at a beautiful swing.
And, on top of all this, every two years or so she would vote a straight Democratic ticket. This, at least, is how her only son, my father, tells it. About ten years ago—or it could have been fifteen, or five; it hardly matters because this stunning revelation came when Morris was already quite old, and long after Texas had turned solidly Republican—my father referred to my very proper grandmother as a "yellow dog Democrat" (meaning, to any Southerner, someone who would sooner vote for a yellow dog than a Republican). He said it with what seemed like mild exasperation, as if he couldn't make sense of, or fully commend, this irrational allegiance to a political party. But I remember being secretly thrilled (I think he could have told me that Morris was an avid day-trader and I would have been less surprised). Maybe I felt vindicated, too; apparently the Democratic gene can skip a generation, but obviously it was there, deep in me, ensuring that a family's rich history would continue to bind, and instruct. Perhaps most surprising of all, I discovered that I was proud—suddenly proud of a party that could have earned my dear grandmother's life-long support.
I don't recall having much of a conversation at the time, but it seems to me that my father accounted for Morris's "yellow dog" sensibilities in the same way he explained other key features of her psyche, as the stamp left by the Great Depression. And this would have explained a lot about her, certainly: my grandmother's caution, her prudence, her invulnerability to "isms" of any kind (they simply didn't register on her; when I begged her for confirmation that a new hairstyle would transform the life of this particular awkward teenage girl, and she replied with a simple "pretty is as pretty does," I already recognized this as more than a platitude to her: she really did privilege actions over images, over ideas). But I'm not sure that surviving the Depression would explain party allegiance—as opposed to, say, gratitude for a particular Democratic president, FDR, whom my grandfather also apparently revered. Nor would it account for what I suspect was a full-bodied rather than narrowly ideological political identity, something I'm tempted to call "organic" because that word is evocative of the ways in which Morris's party affiliation intertwined with her life as a Texan; with her husband's and later her own success in a tough profession, farming; with her quiet determination over some 20 summers to secure a BA and then an MA in Education, when it was uncommon for a woman to aspire to any kind of career. And her surviving the Depression would not account for gratitude, by which I mean the decision, again and again, to support a particular party and its candidates, however varied or superficially unlike her, because it seemed the right thing to do based on what I think any of my grandparents might have called that party's character.
To explain these sorts of things, it is necessary to speak of values, of ideals capable of enduring once embodied in solemn acts and institutions. It is only possible to speculate (my grandmother is now often beyond reach, heartbreakingly so for her family), but speculation is the point of this exercise, anyway. I want to suggest that my grandmother's allegiance to the Democratic Party was due in no small part to the aggressive actions of a particular president, acting self-consciously as the leader of a particular party. FDR may have won elections with sweeping majorities, including some traditionally Republican constituencies, but he governed as a Democrat. That he governed during acutely challenging times, and was able to govern for an exceptionally long time, both undoubtedly influenced what he called "Democratic," such that there were dramatic shifts (perhaps the better word is "shades") in how he conceptualized the party's essence and his own attempts to actualize it. H. W. Brands paints a rich portrait of FDR's career and political philosophy in Traitor to His Class: The Privileged Life and Radical Presidency of Franklin Delano Roosevelt (2008), and any attempt to summarize Brands's analysis in this brief space invites certain dispute. What I wish simply to highlight here is the centrality of the Democratic Party to FDR's thought and practice, a centrality evident even when he appears to be equating the party with, or deriving it from, something else entirely. Roosevelt could virtually identify the Democratic Party with liberalism—"The Democratic party by tradition and by the continuing logic of history, past and present, is the bearer of liberalism and of progress" (on accepting the nomination at the 1932 Democratic Convention)—and then shift, often within the same speech, from what he considered liberalism's authentic past to its promise for the future, which promise he thereby claimed as the Democratic Party's own. 
Again, in 1932: "We are going to make the voters understand this year that this nation is not merely a nation of independence, but it is, if we are to survive, bound to be a nation of interdependence." And where did this prospect of "interdependence," this aspiration to "share in the distribution of national wealth," come from? Well, far from being a novel and un-American idea—it is strange that invoking interdependence and, moreover, contrasting it to "mere" independence clearly raised fewer eyebrows in 1932 than it does today—it had described a vital part of real Americans' real lives, according to Roosevelt, such that "[o]n the farms, in the large metropolitan areas, in the smaller cities and in the villages millions of our citizens cherish the hope that their old standards of living and of thought have not gone forever." There was nothing illogical about a New Deal that aimed to restore old standards of living and even to reconstitute the "logic of history"—nothing dishonest about Democrats who saw themselves as conservators of old, even specifically liberal values, and simultaneously as "prophets of a new order"—once it is understood that the Democratic Party served, in FDR's thought and politics, to bind together a variety of ideals, allegiances, experiences, facts of the matter. I would contend that it is this function of a political party that served FDR, and served him well, as a Democratic President, whatever boost it may have given him as a candidate. "This," as he always contended, "is more than a political campaign."
One thing that Brands documents uniquely well is this complex and evolving, though still strong, party affiliation and how "proudly, defiantly, confidently" FDR articulated it and how resolutely he brought it to bear across a long and challenging career. This dimension of his presidency seems to me quite extraordinary; nothing like it is in evidence these days. Democratic Party identification now seems nothing but a campaign matter; strategists worry over the party's electoral fortunes rather than the depth, substance, or integrity of its vision. Nowhere is this focus on electability rather than governance so stark as in Texas—to the point, even, of downplaying that it is a Democrat who is running, as in Wendy Davis's paradoxical "no label" campaign for governor. Perhaps we should scrutinize this strategy—this exclusive focus on winning elections—more closely, not least because resurrecting the party in Texas, making that state truly competitive, would very frequently swing presidential elections and decisively influence national politics. But I'm actually more interested in the broader consequences for the Democratic Party: this may seem a much narrower concern but I contend that it might just be more important. I think we could expect that, with greater electoral success in Texas, whatever came to define the party in Texas would reverberate well outside the state, conditioning and constraining the party's ideals, vision, its long-term purpose and viability.
That the Democratic Party in Texas is focused exclusively on electability seems clear. Indeed, what defines Battleground Texas (the effort to transform Texas into an actual battleground, where Democrats have a legitimate shot at winning statewide and national office and where Democrats and Republicans actually vie for control in the Texas Legislature) are its infrastructure and its methods. As an organization, Battleground Texas is constituted by the "top campaign talent" brought in from outside the state; it will work by exploiting the very same "data-mining" tools that have proved so successful elsewhere (most recently, of course, in President Obama's re-election campaign). Now, all of this makes perfect sense, and I am as excited as the next registered Democrat that it might actually work. But there is something dubious here, and it is latent in the very name Battleground Texas, for what is particularly Texan about all this? "Texas" is quickly revealed to be no more than a particularly expansive and weighty electoral district—to be gerrymandered, as it were, from the inside out this time—"a prize so spectacular" as to justify the money and talent poured into it. If Texas is "where the next opportunity lies" for the Democratic Party, the opportunity extolled is a pure function of demographics, that is to say of numbers and of crunching those numbers more advantageously. All right, then, two quick responses here (which I'll try to elaborate and defend in future installments of this essay, as commentators take me to task on them!): first, that while getting Democrats elected is, of course, necessary, it may not be sufficient to their governing (much less their governing well); second, that this strategy invokes at the same time that it obscures whatever it is about Texas that deserves to be replicated nationwide.
I want to emphasize that second point, as a way of closing out this part of the essay as well as bringing it back to where it began, with a fond appreciation for my grandmother's breed of Democrat. In statements that are fairly baffling given two seconds' thought, it has been claimed that Battleground Texas is essentially a matter of securing representation of Texans as Texans, of ensuring that candidates actually work for their support (as opposed to being able to count on it by virtue of how district lines are contingently drawn and redrawn). Statements such as these invoke the power of an identity, that of "Texan," and I can't quite decide whether the attempt is merely lazy (and sort of dumb in the way that campaign rhetoric often can be) or dishonest and maybe even dangerous these days. Does it make sense any longer to speak of Texans "as Texans"—and is it in fact a good thing for the Democratic Party, my party, to hope that these Texans are Democrats deep down, when I know for a fact that most of my own extended Texas family are confirmed Republicans (if they are not simply disgusted by and alienated from all politics and politicians, whatever the label)? When a Democratic Party strategist touts a "broad grass-roots organization" the purpose of which is to "identify and ultimately mobilize progressive voters," shouldn't we worry that assuming a necessary relation between "grass-roots" (a matter of numbers) and "progressive" (a matter of values) is a dead-end for the Democratic Party? Perhaps the fact that the Tea Party exposed a similar schism between electoral strength and political values—exposed the terrain between numbers and values that has to be fought for, and forged, rather than assumed—provides a valuable lesson. At the very least, it might elicit that "harder work" we're told to demand from Democratic candidates and party strategists, first by actually identifying as Democrats and then by trying to tell us all that might mean.
Is it a little clearer, then, why I feel compelled to plumb my grandmother's complex relation to her party, her identification with it, even as most people seem happy to leave such partisanship behind and settle, instead, for artful data-mining? I suppose I still hope that, through some patient "thick description" of the lives of those closest to us, we might rediscover lost opportunities for political identification, for politics itself. I may be quite wrong, but it seems clear to me that Morris's identity as Yellow Dog Democrat emerged from a rich variety of things—her life as a woman and a Texan; as a teacher and a farmer's wife, and then as a farmer herself; as a rural school district principal and the mother of an only child, a career Army officer whose life she saw risked by administrations Democratic and Republican alike; as the best kind of sports fan, appreciative of grace and skill and cheerfully indifferent among teams ("pretty is as pretty does," remember). It also seems clear that, for those in my grandmother's generation, party affiliation had a historical dimension as well; there could be a legitimate sense that a party deserved one's allegiance for its accomplishments in the past, that there was nothing nonsensical or disreputable in continuing to honor a compact that didn't promise immediate, tangible payoffs. This is both something more and something less than we expect today from political parties conceived as brands, from politics conceived as discrete battlegrounds—something more, because it is binding across a life and across time; something less, because it is not a gargantuan matter of ego. It is a Democratic Party, and a Democratic politics, built to a grandmother's size—a real grandmother, no less; my own.
 All quotations are taken from Brands, Traitor to His Class, pp. 252-53.
 As Mark Barabak observes in the Los Angeles Times, in her nearly five-minute campaign video Davis makes no mention of her party affiliation. Of course, she also neglects to mention the filibuster against abortion restrictions that catapulted her to national fame last summer. Barabak doesn't hesitate to speculate about the reason for these omissions: "[S]he can't possibly win running as an abortion rights crusader or champion of the political left." But if Davis can't run as a Democrat, then what chance does she have of winning as one? See http://www.latimes.com/nation/politics/politicsnow/la-pn-wendy-davis-no-democrat-texas-governor-run-20131008,0,5201667.story#axzz2otPXcb2E
 I suggest that, for similar reasons, the Republican National Committee might want to reconsider possible rule changes that would effectively make Texas the "big prize on March 1, 2016,…the first multiprimary day" of a proposed new calendar. "As a result," Jonathan Tilove writes in the Austin American-Statesman (December 22, p. A1), "for the first time in more than three decades, Texas would command attention in the GOP primaries befitting its size, drawing candidates' time and advertising dollars." Is it necessarily good for Republicans nationwide that Texas would be in a position to shape the agenda of a presidential candidate so profoundly?
 This is how Jeremy Bird conceptualizes the mission of Battleground Texas. Bird is former national field director for President Obama's reelection campaign, founder of 270 Strategies, and driving force behind Battleground Texas. For various perspectives on the organization, only some of which spell out the implications of a resurgent Democratic Party for Texans, as opposed to the entity of Texas, see Alexander Burns' piece here: http://www.politico.com/story/2013/01/democrats-launch-plan-to-turn-texas-blue-86651.html
Why I Love Julie Taymor's Midsummer Night's Dream
by Mara Jebsen
While I do not like the phrase "at the height of her powers," it comes to mind when I think of Taymor directing this comedy. I don't like the phrase because it seems to anoint the critic with a false sense of her own fortune-telling powers, and it places undue emphasis on the importance of being urgent--as if I were saying "run, don't walk" to this play. But perhaps you should run--or, more accurately, sit. I had to sit in the stand-by line for a long time, because the play, now running at the Polonsky Shakespeare Center, is officially sold out. Sit, and wait, for a long time for this production, because the images it gives you will delight at first, and then, over time, will resolve themselves into a sort of important pastiche that helps you think about love, madness, and Shakespeare.
I had to go because my mother made me. She was a theatre student, a mime, a theatre director, and a folklorist before she became the director of a K-12 school in West Africa, where she finds an outlet for her enormous creative energy by putting on plays. This year it is Midsummer Night's Dream. "I can't get a feel for it yet," she said. "But Julie Taymor directed it, and I love her, and they're putting it on in a theatre six blocks from your house. I read all about her troubles with Spider-Man and had been following her before I had the idea to do Midsummer. It means something. Go. Find out what she's up to."
I had no idea that my mother loved Julie Taymor. Or that she "followed" anything that had anything to do with the internet. I promised to do it, but procrastinated, and when I saw that the tickets were sold out, I nearly panicked.
Anyway, here is the gist: initially, even if you are not compiling a list of directorial choices for your mother's use, you will be startled and awed by the choices Taymor makes. There is a stunning mixture of expensive technology and simple stagecraft, and a viewer feels safe the whole while--safe because they are in the hands of a person who will not bore them, who seems to have an exact sense of rhythm, scale, and color scheme--and who presents recognizable character 'types' that amuse without degrading the people who embody them.
Which is to say: she does a lot of amazing things with pillows, and with children rolling on their backs, kicking their legs in the air. She can make an enormous, treacherous forest out of kids holding bamboo, rolling underfoot so that they must be leapt over. There is blood made out of paper streamers, and an enormous effect is created by the decision to have the characters undress. But there are also projections of huge flowers; there are trapdoors, there is flight, and the costumes are so sumptuous and correct that at the end of it, I found myself in love with both Oberon and Titania, who genuinely seemed something beyond human.
But what is at the core of it? What 'sense' of Shakespeare's play do we get? How can I help my mother get a 'sense' of this play, which I did, myself, in high school, and loved? I read one review--otherwise glowing--that likened Taymor's direction to a 'glittering necklace' and said that it held little emotional depth. This may be sort of true. But if it is true, it's only because it is difficult for us to feel different kinds of delight at once. The delight of the spectacle, and of the magic trick, seems almost to inhibit the deep thrill one can sometimes get at the theatre. But I would argue that after the urge to clap like a child wears off, the stranger bits of the bard's nutty story come to light.
For one thing, this is a good play for the lovelorn. It offers the coldish comfort that people's qualities have little or nothing to do with how much they are loved, and by whom. The initial alliances that mark the various couples we learn of make as little sense as the false alliances created by Puck, acting as a cupid. Until I saw Taymor's production, which gets the four young people to undress hilariously as they attack one another in the woods (the angers of the men and women switching from rage at the ones who would not love them to rage at their rivals), I hadn't quite sensed the vulnerability, the sheer outrage towards the fates, that seems to radiate out of the naked hipbones and chestbones of the characters, who, at one point, are all spurned by their beloveds at once. What's more, female desire gets a clearer representation on this stage than it does on most stages, and in many films, as both Hermia and Helena "forget themselves" occasionally, and are believably come-hither-ish, before they remember that they cannot afford to be that way. Titania, being a fairy queen, has no such concerns, and gorgeously parries with Oberon, and has no regrets about her weird romp with a donkey.
When I learned that Taymor also studied folklore and theatre, I had that weird chill you get sometimes when you see/feel the similarities between kinds of thinkers. I thought about all of the villages in Benin I was dragged to as a four-year-old to see the ritual dances, which were often performances involving amazing animal masks. I saw how it was that my mother somehow "found" and follows this director on the internet, a thing she generally avoids. Her need to do this is clear to me. I wish I had a more eloquent way of naming this than calling it "mind-kin," but that is what I've got so far. I have my own intellectual kinships, and the reason these kinships create a chill/thrill is that they feel uncanny, and sometimes connect you to the dead, and to people you've never met. It is lovely to me that my mother has her own mind-kin, and that they are somewhat kin to me, by association.
In any case, Julie Taymor may not be at the height of her powers. There may be more heights, and different ones. But I would urge audiences to delight unabashedly in her beautiful magic tricks, because they may end up with what she really wants to give them: a stronger sense of what Shakespeare knew about love--and that's no mean feat.
Unexpected Awesome Possum-ness
by Tom Jacobs
Years ago an earnest young student entered his professor’s office for a brief chat about a paper topic he had been turning around in his head. The professor was esteemed and well-dressed and famous for being a cool and political yet accessible writer about cool and political things. The student was spangly mediocre, wildly intimidated by his professor because he knew that he (the professor) was indifferent to this object/student before him that/who didn’t know enough about the world or the past or theory to challenge him on any level that he might recognize.
They shook hands limply and then sat down across from each other, the power differential radiating out in all directions, but it was mainly felt by the beleaguered student. The student fumblingly explained that the paper was to be about the nature of emergent electronic communities (this was the mid-nineties). He babbled and referenced a few novels and sociological and philosophical works that seemed to him potentially useful.
The professor, un-noddingly and somewhat socially autistically, stared at him from across his desk with a mixture of curiosity, interest, and pity, giving the student neither quarter nor shelter. After a few moments of squirming awfulness, the professor asked a depth-charge question: “What do we mean when we say ‘community’?” The question exploded in the student’s mind, and the shockwaves resonated well into the future. To this very day, this very moment, to speak truthfully.
It’s not necessary to go into the student’s flummoxed response. What’s important is the question: what the fuck do we mean when we say “community?” Because we say it all the time, the media says it all the time, politicians say it all the time, and it does an enormous amount of work for us even if none of us know quite what we mean when we say it.
I know, I know, there are many sociologists and philosophers and so forth who have considered this very question, and I will refer to some of them below. But the feeling precedes the concept, and that’s what makes it so interesting. We all know that there is such a thing as a community, even if we can’t put our finger on it. In that regard it’s a bit like pornography or (as has been said, I’m told) the clitoris. So what’s important is to figure out how what the really smart people who have thought long and hard about “community” mean by it maps onto what it means to the rest of us. How might we make these two disparate worlds sing in harmony?
I intend to approach this subject Augie March-like, which is freestyle, in the ways I’ve taught myself, and I will make the record in my own way. That’s the only way I know to do it, even if I nod and bow or reject the thoughts and ideas of others. I will do my best to do each of these things.
Our understanding of things begins either with our intuition and experience or with our rational and empirical selves. No question that both bundles of understanding overlap and mutually define each other like drunken co-dependents, or like a Möbius strip, but we have to begin somewhere. So I’ll begin with intuition and experience and the meaning of sensuous experience, because rationality and empiricism and judgment have never been my strong points. I’m not proud of this, but this is the way things are and have always been, with me at least.
In a conversation about the sadness and sullenness that leaches into or dissolves the substance of our sense of being from time to time, a very good friend of mine once noted that “we are meant to live in communities.” We are not meant to be alone. We are meant to live in communities; who would deny or doubt that? But what does that mean? So we return to the gristly question.
Let’s try again. What does it feel like to be in a community? Is this an experience we recognize, know, or understand? Let’s think about this.
The moments that I have felt a profound sense of community have always been fleeting.
I went to mass with my parents over the holidays. There was much talk of community but I can’t say that I felt it. Perhaps because I wasn’t a fellow-believer.
I have been to sports events and felt a profound sense of communal identity, but it fades rather quickly once the absurdity of the whole mechanism is laid bare after a few moments of reflection.
Then there is the notion of being a part of a neighborhood, which is perhaps not precisely the same as community, but close. Here’s a story that might help illustrate.
Five years or so ago, I lived in the remote provinces of Bushwick, formerly one of the most dangerous neighborhoods in Brooklyn but now one of the most gentrified. The neighborhood I lived in, though, was decidedly not gentrified. I was a thoroughly white guy living in what the census might call an “ethnically diverse” neighborhood—meaning, in this case, an African American neighborhood. And so, initially, I felt kinda odd. But odd in a good way, in the way that someone who never feels their whiteness or straightness is made to feel it when they go into a black neighborhood or a gay bar. Time passed and I became just another denizen of the block. I would loan my laptop to neighbors who didn’t have one, I would help them wash cars, I would bring them food and they would bring food to me. It was like Norman Rockwell at a slight angle.
My thoughts about community and neighborhood (and the distinction is a fuzzy one) came to a head one evening when I went out to buy a six-pack at a local bodega at about 4 am. Why I was doing this is none of your business. But at that hour you have to wait outside in a line for the man behind the bullet-proof glass to retrieve whatever it is that you want. You basically have to point and yell, “I want the Twinkies on the shelf over there and two 40s from the far end….NO! Not those 40s, the others! Yes! Perfect.” That type of thing.
So I was waiting in line, surrounded by several large men. Men who subtly but decisively began to close in on me. I began to feel that this would not go well. Eventually one of them asked me rather curtly, “Who do you think you are, the white messiah?” I paused for a moment, trying to understand what he meant, and then said, “No, man. I don’t even know what you mean…who’s the ‘white messiah’?” To which he responded with something to the effect of, “You know, motherfucker.”
It was at this point that my anus shrank to the size of a Honey Nut Cheerio and I realized I was about, for the first time, to get in a fight, and no doubt to get my ass kicked.
But at this precise moment, as these large men approached me with malice and vinegar in their eyes, a girl who couldn’t have been more than fifteen interceded and said, “No! He’s my neighbor! He’s a good man.” At which point everyone receded, and my main interlocutor said, “Nah, man, I’m just fucking with you.”
So I was saved by a neighbor. Someone in my community. And I felt that sense pretty intensely at that moment.
The readings that I heard over Christmas were from Paul and John, I believe. And they are readings that are clearly, at least from this side of history, meant to keep the church coherent, together, and unified. That’s a tough thing. But here’s a quote that will give you a sense of what I mean (and this is taken somewhat at random from the readings over the past few weeks):
Masters, give unto your servants that which is just and equal; knowing that ye also have a Master in heaven.
2 Continue in prayer, and watch in the same with thanksgiving;
3 Withal praying also for us, that God would open unto us a door of utterance, to speak the mystery of Christ, for which I am also in bonds:
4 That I may make it manifest, as I ought to speak.
5 Walk in wisdom toward them that are without, redeeming the time. Let your speech be always with grace, seasoned with salt, that ye may know how ye ought to answer every man.
7 All my state shall Tychicus declare unto you, who is a beloved brother, and a faithful minister and fellowservant in the Lord:
8 Whom I have sent unto you for the same purpose, that he might know your estate, and comfort your hearts;
9 With Onesimus, a faithful and beloved brother, who is one of you. They shall make known unto you all things which are done here.
10 Aristarchus my fellowprisoner saluteth you, and Marcus, sister's son to Barnabas, (touching whom ye received commandments: if he come unto you, receive him;)
11 And Jesus, which is called Justus, who are of the circumcision. These only are my fellowworkers unto the kingdom of God, which have been a comfort unto me.
12 Epaphras, who is one of you, a servant of Christ, saluteth you, always labouring fervently for you in prayers, that ye may stand perfect and complete in all the will of God.
13 For I bear him record, that he hath a great zeal for you, and them that are in Laodicea, and them in Hierapolis.
14 Luke, the beloved physician, and Demas, greet you.
15 Salute the brethren which are in Laodicea, and Nymphas, and the church which is in his house.
16 And when this epistle is read among you, cause that it be read also in the church of the Laodiceans; and that ye likewise read the epistle from Laodicea.
17 And say to Archippus, Take heed to the ministry which thou hast received in the Lord, that thou fulfil it.
18 The salutation by the hand of me Paul. Remember my bonds. Grace be with you. Amen.
Here's the thing. We all want to succeed, to not fail, to become some kind of messiah. It’s a tough racket, but may we never lose faith, never give up. We are all running around in circles, but even that circular movement represents movement, and movement is all that we have. We will all, no doubt, fail in various and remarkable ways; may those ways rise like incense to the heavens, and may we all, each of us, become awesome and better than we currently are. I think that’s possible. I hope so. And I hope this happens for each of us in the coming year. I really do.
The Polio Jihad
by Omar Ali
Polio is an ancient scourge that spreads only within human populations and can cause paralysis, most frequently of the lower extremities, but can also be fatal when the paralysis extends to the muscles of breathing. For reasons that are not completely clear, the disease erupted in huge epidemics from the late 19th century onwards, causing millions of victims to die or become paralyzed for life. Once a virus had been identified as the cause, the race was on to develop a vaccine. Finally, in 1952, Jonas Salk and his colleagues developed the first effective inactivated vaccine for this disease. Within a few years, mass vaccination decreased the number of victims in developed countries from hundreds of thousands to just a few hundred per year. In the mid-fifties, Albert Sabin and colleagues developed an effective live vaccine that was cheaper and easier to administer and provided better immunity; it was then adopted by the WHO as the main vaccine for use in endemic areas. Thanks to mass immunization campaigns, the number of victims dropped precipitously, and by 1988 the WHO was ready to launch a well-coordinated international initiative to completely wipe out wild polio from the planet. Like smallpox, polio does not have an animal reservoir, so if human-to-human transmission is completely blocked by mass vaccination, the disease can be effectively wiped out.
Initially, the campaign proceeded well, with the Americas being declared polio-free in 1994 and Europe in 2002. Today, there are only 3 countries where polio remains endemic: Pakistan, Afghanistan, and Nigeria. Unfortunately, the reason in all three is the same: the moronic wing of the international Jihadist movement has somehow picked up bits and pieces of chatter about risks from oral polio vaccine, combined it with pre-existing paranoia about modern international institutions, and created a robust anti-vaccine meme that is able to draw upon the ruthless killing power of Jihadi militias to effectively stop polio eradication campaigns in their areas of influence.
I would like to clarify this a bit further:
How and why these Jihadist organizations became infected with this meme is still unclear. My own hunch is that it was simply a matter of ideal host meets appropriate parasite; Islamists in general thrive on conspiracy theories and a paranoid, anti-modern worldview (the Protocols of the Elders of Zion being the best-known, but hardly the only, example). As one moves to the fringes of the movement, the educational level declines, the scientific ignorance increases, and the paranoia reaches incredible heights (I urge all readers who are suppressing an urge to jump in and say “but the paranoia is not without foundation” to please suppress that urge a little longer; I will get to that). Some mullah reads somewhere that X or Y Western anti-vaccine crusader has written about the possible effects of vaccine contaminants. Already convinced that Western powers and their evil agents in Muslim countries are working day and night to wipe out the Muslim Ummah, he finds it not hard to imagine that the oral polio vaccine may itself be a weapon in that war, a cause of sterility and impotence. Since both male impotence and vaccinations are commonplace, the “connection” is easily “proven” to the satisfaction of all concerned. Anti-vaccine propaganda thus became embedded in the fringes of the Jihadi world, and as civil wars accelerated in Nigeria, Afghanistan, and Pakistan, so did the attacks on polio teams.
It is worth noting that this meme did NOT start with Dr Shakil Afridi’s CIA-sponsored fake hepatitis vaccination jig in Abbottabad (meant to try and obtain DNA from the Bin-Laden family). Major problems had arisen in Northern Nigeria as far back as 2003 and in Pakistan by 2007.
The governments of Nigeria and Pakistan have made extensive efforts to convince resistant populations to permit vaccination. Major religious figures thought to be respected by the Jihadists (Sheikh Qardawi in Nigeria, Maulana Samiul Haq in Pakistan) have been roped in to try to show that the vaccine is not a plot against Islam. Some effort has been made to tell people that “Islamic” countries like Saudi Arabia and Malaysia immunize their populations and do not regard the vaccine as an imperialist plot, but to no avail. The insane fringe of the Jihadist movement is too far gone by now to be convinced in this manner. In any case, the Pakistani state has itself long encouraged anti-Western and especially anti-Jewish memes as tools to mobilize irregular forces for various foreign policy adventures (and some Muslim politicians in Northern Nigeria have likely done the same in their ongoing competition with Christian Nigerians). These memes have taken firm root in the network of “scholars”, journalists, and other opinion leaders who provide intellectual leadership to the Jihadist cause. Paranoia about polio vaccine fits in smoothly with the rest of this intellectual complex and is proving extremely difficult to overcome, even where pro-Jihadi mainstream politicians like Imran Khan have taken the lead and tried to convince the jihadists that this particular Jewish invention (both Sabin and Salk happened to be Jews) is kosher, so to speak.
As an example of how the meme operates, take a look at the good folks at Ummat newspaper in Karachi, who published a "research article" back in December 2012 that provided "scientific evidence" of the threat from polio vaccine. This newspaper is affiliated with the most literate and modern Islamists in Pakistan, i.e. the Jamat e Islami old guard from Karachi. These are the kind of people whom postcolonial scholars sometimes regard as a secularizing force in Pakistan (I understand that her argument in the piece linked above is relatively sophisticated, but I think it’s still wishful thinking; like many liberal Muslims, she too has difficulty accepting how committed even modern Islamists are to the medieval texts and fascist dreams of their less sophisticated Jihadist friends). Anyway, on a day when 8 polio campaign workers (mostly women) were killed in 4 different cities for trying to immunize kids against polio, this modern Islamist newspaper published a major feature article full of ignorant, paranoid claptrap about the dangers of polio vaccine.
For those who cannot read Urdu, the headline says: Monkey cells are used to prepare polio vaccine
This is followed by the following main points:
1. In 2006 Mr. Mohammed Nabi filed a petition in the Peshawar high court asking for a ban on polio vaccine because it contained female hormones.
2. In 2004 there was a campaign in Nigeria against the same vaccine, and Nigerian scientists determined (via testing in India) that the vaccine was "harmful to the reproductive system".
3. Ummat has conducted its own investigation on the internet and determined that the vaccine indeed contains female hormones and is made using monkey cells (the last part is true, and led to some SV40 virus contamination in the early years of polio vaccine but there is no conclusive evidence for harm from that contamination).
4. Before 1950 polio was mostly prevalent in Europe and America (the hint here is that we may be victims of a plot that is spreading this new disease to us, and then using this as an excuse to load us up with killer vaccine).
5. Polio vaccine can cause paralysis (this is true, but extremely rare and the risk from wild polio is MUCH higher).
6. Oral polio vaccine is banned in Europe and America but continues to be used in third world countries. (It’s true that the US switched entirely to the more expensive and less immunogenic, but safer, injectable vaccine, because the only 3-4 cases of polio still occurring there were due to vaccine virus, wild-type polio having been wiped out USING THE VACCINE. The situation is completely different in endemic regions, and the risk-benefit ratio is very much in favor of oral polio vaccine in those areas.)
7. In India the vaccine drive has led to a 1200% increase in paralytic polio (I am not sure what this claim is supposed to mean. India was declared polio-free this year so the claim makes no sense).
The point of citing this article in detail is to show that Ummat is feeding anti-polio vaccine hysteria (especially with its baseless claims about female hormones and danger to reproductive health) just as their Taliban brothers are shooting innocent female health workers trying to immunize children who are at risk. And neither is claiming Dr Afridi’s CIA campaign as their main objection to this vaccine.
In the last few weeks alone, jihadist terrorists have attacked several polio teams, killed male and female polio workers, and kidnapped teachers who took part in the polio campaign. The courage of the health workers who continue to operate under these conditions and the absolute evil of those who target them are both exemplary. But it is very difficult to see how polio eradication can proceed under these circumstances. In fact, I think it’s a safe prediction that we (as in humanity) will not be able to eliminate wild polio in the foreseeable future because of the efforts of these few determined enemies of infertility and impotence. Already we have seen outbreaks in China and Syria that have both been traced to Pakistan (and it is likely that in both cases the vector was Jihadists travelling from Waziristan to China and Syria for holy war) and if current trends continue, we may see wild polio reappear in countries like India and Indonesia from which it was recently wiped out after great effort.
Unfortunately, people who happily kill innocent teachers and health workers and are directly responsible for the paralysis and death of hundreds of children (with many more to come) sometimes get more sympathy from Western anti-imperialists and anti-globalization activists than their victims do. In fact, these activists provide the killers with new and better justifications via the internet (ironically, another feature of globalization). There is a very powerful strain of racism and paternalism hidden in this form of “understanding and empathy”. These same activists clearly do not expect their own populations to cut off their noses to spite their faces. Even if the state occasionally uses health or educational institutions to spy on people (as it clearly has in the past), Carol Grayson and her friends do not expect Welshmen and West Virginians to shoot public health workers and teachers as a result. But since they seem to regard Nigerians and Pakistanis as especially retarded and simple-minded (unspoilt and innocent, but also unsophisticated), they find it perfectly reasonable for them to go around doing the same. The fact is that no CIA or Mossad operation and no unethical drug trial is sufficient excuse for killing innocent health workers trying to stop a lethal disease. If people are doing so, they need to be told that it is not acceptable, instead of using every atrocity as another opportunity to attack imperialism, capitalism, or whatever ideological current you hold responsible for the state of the world as a whole.
I realize that the above paragraph is not philosophically air-tight. If capitalism is indeed the cause of all evil, then everyone who is gumming up the onward march of international capitalism is, by definition, a good guy. But my contention is that Western activists (and their Westoxicated Eastern admirers) do not really believe in any such absolute clash of good and evil and would not really want to live in the pre-industrial utopia of the Taliban. They only find it easy to admire heartless killers when faraway people are being discussed. I realize that I cannot stop this huge anti-capitalist cultural force with one article, but I just wish they would stay off the topic of polio vaccination. Humanity is tantalizingly close to wiping out this menace. It would be a shame to fail now just to make a point about the CIA or capitalism or American imperialism.
My New Year's Resolution: Getting to Know my Genome Sequence
by Carol A. Westbrook
On November 12, 2013, I placed a package containing a small sample of my blood into a UPS drop box. It was a fait accompli: I was going to get my genome sequenced! I was thrilled!
No doubt you are wondering why I wanted to do this. The short answer -- because I can.
When I started my research career in the early 1980s, scientists such as myself understood how valuable the human DNA sequence would be to medical research, but it seemed an unattainable dream. Yet in 1988 the Human Genome Program was begun, proposing to obtain this sequence within 20 years. I was hooked. I was active in the Program: on advisory panels, on grant reviews, and in my own research, mapping cancer genes. Obtaining DNA sequence was painstakingly difficult, while interpreting and searching the resulting sequence was almost beyond the capability of the computers of the time. Nonetheless, in 2003, a composite DNA sequence of the human genome was completed, 5 years ahead of schedule. Shortly thereafter, two of the leading genome researchers, J. Craig Venter and James Watson, volunteered to have their own genomes sequenced in their research labs, and Steve Jobs purportedly had his sequenced for $100,000.
I never imagined that in 2013, only 10 years later, sequencing and computational technology would improve so much that an individual's genome could be sequenced quickly and (relatively) affordably. I could have my own genome sequenced! For a genomic scientist like myself, this was the equivalent of going to the moon.
I found a company, Illumina, which offered whole genome sequencing for medical diagnosis. I wrote to Illumina, "I have had over 25 years of experience in the Human Genome Program, and at this time would like to truly explore what I contributed to, these many years. I think the time is right to do this. I am able to interpret the results based on my previous experience in this field, and am comfortable with any results that might be found. So is my family. Realistically, I am 63 years old and healthy, so my risk of discovering a dangerous genetic condition is minimal."
Illumina invited me to participate in their "Understand Your Genome Program," in which I and about 50 other "sequencees" would have our DNA sequenced and attend a daylong seminar on the interpretation and significance of our individual results. We would receive our personal sequence on an iPad at the seminar. This program is a combination of education, publicity, and "getting the message out," and the sequencing is offered at half the commercial cost--and within my budget. So I submitted my credit card info and sent in my sample on November 12, 2013, 10 years and 7 months after the completion of the first human genome was announced.
I hadn't really thought much about the implications of knowing my personal genome sequence until that morning, when I filled out the required paperwork to accompany my sample. A doctor's signature was required to order the test -- no problem, I'm an MD -- and there was an optional signature for genetic counseling -- I signed that, too, since I have clinical experience in that area. Next, my personal medical history: a checklist of common conditions that might have a genetic link (e.g. asthma, blood clots), and whether or not I was adopted. That was easy; I'm pretty healthy and I'm not adopted.
The family history took longer because my father and mother came from large families, 12 and 5 siblings, respectively, and I have 3 sibs of my own. Heart disease, high cholesterol, and strokes run rampant in my dad's family. But I had never really noted that there was cancer on my mother's side, and that I might carry a predisposition, too. And Mom did develop Parkinson's disease, and eventually non-Alzheimer's dementia. Hmm. That was something to think about. Did I want to know?
Finally, the informed consent. I signed a statement agreeing to go ahead with the test, and acknowledging that I understand the implications and/or will discuss them with my doctor. I agreed to let them keep my leftover specimen for research. I was also asked to indicate whether there were any categories of genetic diseases that I might find that I did not want to know about, such as those that can't be treated, or progressive neurologic conditions like Huntington's disease, or genes that put me at risk for cancer. I decided that I wanted everything revealed. I signed the forms and sent in the sample.
The next step was to talk to my children and siblings (2 brothers and a sister) about my pending genome sequence, reminding them that they each have a 50-50 chance of carrying any gene that I have. I offered to let them know my results, or to opt out of some or all of the genes, as I had been asked to do. Everyone was okay with this because they knew I was healthy, I was past the age for many genetic conditions, and I didn't have cancer. My son jokingly said "sure, but don't tell me if I have Huntington's disease."
Although I'm certain I don't have Huntington's disease, I might still carry a gene that puts me at risk of a disease, such as cancer or diabetes, but never develop the disease myself. Geneticists call this "low penetrance." My children may get the gene and the disease. I might also carry a single gene for a recessive condition, such as hemochromatosis, which causes disease only if you inherit two abnormal copies. Who knows what is in the half of my parents' genomes that I didn't inherit but my siblings may have? Or in my children's father's DNA? Finally, there are X-linked genes, in which women carry the gene and pass it on to their daughters without developing symptoms, while sons and grandsons who inherit it develop the disease. Some examples are color blindness and hemophilia. Clearly there are results of my genome sequence that may impact my relatives. I decided to bring my daughter along with me to the March reveal, and to bring her iPad along.
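The Mendelian arithmetic behind these worries is simple enough to sketch. The snippet below is my own illustration (not from the article): it assumes each carrier parent has one abnormal copy out of two, so each child has a 50-50 chance of inheriting it, and a recessive condition like hemochromatosis requires one abnormal copy from each parent.

```python
# A minimal sketch of the inheritance odds described above.
# Assumes a "carrier" parent has exactly one abnormal copy of an
# autosomal gene, so each child inherits it with probability 1/2.

def p_child_inherits(parent_is_carrier: bool) -> float:
    """Chance a child receives the abnormal copy from one parent."""
    return 0.5 if parent_is_carrier else 0.0

def p_affected_recessive(p_from_mother: float, p_from_father: float) -> float:
    """A recessive condition needs two abnormal copies, one from
    each parent, so the probabilities multiply."""
    return p_from_mother * p_from_father

# If both parents are carriers, each child's risk is 1/2 * 1/2 = 1/4.
risk = p_affected_recessive(p_child_inherits(True), p_child_inherits(True))
print(risk)  # 0.25

# If only one parent is a carrier, no child is affected (risk 0.0),
# though each still has a 50% chance of being a carrier.
print(p_affected_recessive(p_child_inherits(True), p_child_inherits(False)))
```

This is why a single recessive copy, like the one the author might carry, matters mainly for the next generation: the risk to a child depends on the other parent's genes as well.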
At this time, my DNA is going through the sequencer and the results are being uploaded to the iPad. I am curious to know what I will find. There may be some data on ethnic origins, which may be helpful in understanding my heritage, as my father's father was illegitimately conceived shortly before his mother emigrated from Poland. Was my great grandfather Polish, or will we find genes from some far-away place? Of course, it will also be fun to know what percent of my genome is Neanderthal. And from the health perspective, I will be screened for the "known" genetic conditions, including those revealed by the less expensive, more limited chip-based DNA tests such as 23andMe's. This is valuable information, particularly as it might identify risk factors (cardiac, cancer, diabetes, etc.) or unexpected interactions with medications.
Learning about the known genes is useful but, in most cases, it is not going to make a major impact on a person's health. And, considering the expense, it is certainly not going to justify implementing whole genome sequencing as a standard part of our medical care -- at least not for now. But most of the current discussion on the benefits of genome sequencing has been one-dimensional, focusing on the significance of identifying these known genes and risk factors. Yet what excites me about this project is not the known genes, but the incredible potential of a person's DNA sequence to have a major impact on his or her health and longevity in the future, in ways that we cannot even predict.
Consider, for example, common conditions that clearly have a genetic component but can't be pinned to a single gene. These include asthma, rheumatoid arthritis, lupus, cancer, hypertension, diabetes, obesity, metabolic syndrome, kidney failure, anemia, depression, schizophrenia, heart attacks, osteoarthritis and many others. In fact, conditions like these probably cover the majority of all doctors' visits (excluding infection and accidents). These are the "unknown unknowns," where combinations of genes and environmental factors come into play. Perhaps we will be able to use our genome sequence to prevent these diseases by targeting the mutation, or lessen the severity of the condition, or modify outside factors that impact them.
-What if you knew that you would get diabetes if you were overweight, but you also knew that you could prevent this obesity by modifying a gene in your liver?
-What if you knew that your daughter had the potential to be a math genius? Would you help her develop her potential?
-What if your doctor could treat your hypertension with an individualized combination of drugs that had no side effects for you?
-What if you learned you have a risk of schizophrenia, but could prevent it by a treatment designed to target the DNA sequence and stop its progression?
-What if you knew what biochemical subtype of depression you had, so you could treat it with the correct drug?
Impossible dreams? Sure, but so was obtaining the complete human genome sequence in 1988. There is no question that genome research is moving so rapidly that we don't even have a vision of where it will be in 10 years. But I'm confident that the medical implications will strengthen as research continues and more complete genomes are compiled. I am pleased to be an early contributor. I will have my iPad at the ready when some of these new discoveries are made.
Sunday, December 29, 2013
An Open Letter to the Makers of The Wolf of Wall Street, and the Wolf Himself
Christina McDowell in LA Weekly:
Let me introduce myself. My name is Christina McDowell, formerly Christina Prousalis. I am the daughter of Tom Prousalis, a man the Washington Post described as "just some guy on trial for penny-stock fraud." (I had to change my name after my father stole my identity and then threatened to steal it again, but I'll get to that part later.) I was eighteen and a freshman in college when my father and his attorneys forced me to attend his trial at New York City's federal courthouse so that he "looked good" for the jury -- the consummate family man.
And you, Jordan Belfort, Wall Street's self-described Wolf: You remember my father, right? You were chosen to be the government's star witness in testifying against him. You had pleaded guilty to money laundering and securities fraud (it was the least you could do) and become a government witness in two dozen cases involving your former business associates, but my father's attorneys blocked your testimony because, had you testified, it would have revealed more than a half-dozen other corrupt stock offerings, too. And, well, that would have been a disaster. It would have just been too many liars, and too many schemes for the jurors, attorneys or the judge to follow.
But the record shows you and my father were in cahoots with MVSI Inc. of Vienna, e-Net Inc. of Germantown, Md., Octagon Corp. of Arlington, Va., and Czech Industries Inc. of Washington, D.C., and so on -- a list of seemingly innocuous, legitimate companies that stretches on. I'll spare you. Nobody cares. None of these companies actually existed, yet all of them were taken public by the one and only Wolf of Wall Street and his firm Stratton Oakmont Inc. in order to defraud unwitting investors and enrich yourselves.
Richard Dawkins’ Hate Mail
Rowan Hooper in Slate:
Richard Dawkins: I don't know how many people think I'm mean. I'm certainly not and I didn't consciously set out to do any image-cleaning or anything. I like to think it's an honest portrayal of how I really am. And I hope it is human, yes.
RH: Nevertheless, there's a gulf between the real you and the caricature Richard Dawkins. How has that come about?
RD: I have two theories which are not mutually exclusive. One is the religion business. People really, really hate their religion being criticized. It's as though you've said they had an ugly face; they seem to identify personally with it. There is a historical attitude that religion is off-limits to criticism.
Also, some people find clarity threatening. They like muddle, confusion, obscurity. So when somebody does no more than speak clearly it sounds threatening.
RH: You definitely polarize people. How do you feel about the hate mail you get?
RD: I did a film that's on YouTube of me reading hate mail with a woman playing the cello in the background. Sweet strains to contrast with this awful, "you fucking wanker Dawkins" and so on. Making comedy of it is a pretty good way of absorbing it.