Monday, June 20, 2016
by Scott F. Aikin and Robert B. Talisse
In the real world of political talk, getting the last word is often what counts most. This is especially the case where political talk is conducted in the limited space between commercial breaks. In such a forum, "getting the last word" does not mean what it means in a purely academic setting. In academic argument, one gets the last word when one articulates a decisive point, a point to which not even one's smartest and best informed opponents could object. In popular political talk, by contrast, "getting the last word" means being the last speaker to utter a coherent and self-contained thought. Statements of this self-contained variety tend to be received by one's audience as the "take away" from the exchange, and hence they are most likely to be remembered. The arena of national politics is high-stakes and highly public; and the need to get the last word creates a strong incentive for a distinctive kind of conversational distortion, namely, that of derailing discussion. One derails a discussion when one speaks for the sake of creating a conversational disruption that replaces the topic previously under consideration with some ambiguous and unwieldy alternative. Once derailed in this sense, conversation loses focus, and the disorientation leaves subsequent speakers unable to get the last word.
Derailing of course takes many forms. But one derailing strategy has become so prevalent in current political discourse that it is worthy of focused analysis.
The derailing strategy we have in mind may be called spitballing. At its core, spitballing works as follows: One makes multiple contributions to a discussion, often as fast as one can think them up (and certainly faster than one can think them through). Some contributions may be insightful, others less so, but all are overtly provocative. What is most important, though, is that each installment express a single, self-contained thought. Accordingly, slogans are the spitballer's dialectical currency. As the metaphor of the spitball goes, one keeps tossing until something sticks; hence it helps if one's slogans are tinged with something disagreeable or slightly beyond the pale. As the spitballer's interlocutors attempt to reply to what he has said, the spitballer resolutely continues spitballing.
That is, instead of defending his claims against the interlocutor's pushback, he simply introduces an entirely new topic, usually by voicing another slogan that is even more outrageous than its predecessor. As a result, the interlocutors cannot keep up, and in the process of reacting to so many provocations, they never have the chance to fully respond to any of them. By the time the next commercial break arrives, the spitballer will have voiced multiple memorable slogans, and none of them will appear to have been adequately challenged. To many in the viewing audience, the spitballer will appear uniquely reasonable, informed, collected, and decisive, while the interlocutors will seem scattered, irrational, and desperate. In the high-stakes world of national politics, that's a huge win for the spitballer.
As we have said, spitballing is understandably effective in formats where discussion is shoehorned into 12-minute segments between commercial advertisements. But as spitballing trades in slogans, it brings corollary advantages in other forums as well. Slogans are intrinsically vague and suggestive, and they thereby admit of multiple interpretations. Accordingly, when a spitballer's pronouncement is subjected to critical analysis in, say, print media, the spitballer's response is simply to return to the confines of the television studio to denounce the interpretation of the slogan that was scrutinized. The denouncement begins with an indignant "what I actually said was . . ." and is followed by the introduction of a new slogan – hence a new provocation – which is no more precise or transparent than the original. Thus the process begins anew. All the while, the spitballer derails the discussion by ensuring that no one else actually gets to speak about anything other than what the spitballer has said. Yet, as the spitballer trades only in vague but provocative slogans, there can be no real discussion about his claims.
We are sure that you already recognize that Donald Trump is an incorrigible spitballer. It is thus no surprise that his favored communications outlet is Twitter, a platform that permits only slogans. And his performance in longer-form interviews confirms our diagnosis. He is infrequently well-informed or temperate, but what he lacks in quality, he makes up for in sheer quantity; he is never at a loss for words, and he is a wellspring of slogans, many of which suggest underlying commitments that seem outrageous. In only a short while, he has accrued a substantial and public record of claims which do not form a coherent set. And for the most part his critics are overwhelmed; they do not know where to start with substantive evaluation. Whenever a critic voices an objection to one of Trump's slogans, Trump indignantly protests that he has been willfully misrepresented and treated unfairly. Indeed, that's the central value of political slogans -- they're moving targets. So, to cite only one familiar example, we are all acquainted with Trump's pronouncements about national security and Muslims, but what exactly is his view? Unsurprisingly, it shifts opportunistically. Sometimes the view seems to be that no Muslims at all should be allowed into the United States; on other occasions, the proposal is that the US suspend legal immigration for Muslims; and at other moments, the claim is weakened to the view that Syrian asylum seekers should be thoroughly investigated. Moreover, although the policy is always presented as a "temporary" measure, the conditions under which the ban would be lifted are never articulated. Trump only says the policy (whatever it actually is) should stay in place "until we know what's going on" (whatever that means).
This deliberate imprecision makes Trump's proposal impossible to evaluate according to its feasibility, costs, and probable effectiveness. And those who have tried to assess the proposal along these metrics have been rebuffed by Trump representatives; any assessment must fix on some particular interpretation of the Trump slogan, but the entire Trump strategy depends upon escaping precision so that every criticism can be instantly dismissed as a willful and unfair misrepresentation, thereby creating opportunities for introducing new slogans, which then draw critical responses that are again swiftly dismissed. Consequently, the spitballer controls the discussion by derailing any attempt to scrutinize what he has said; thus, in a very real sense, he always speaks unopposed. Meanwhile, public conversation is dominated by counterfeit ideas; popular political discourse is crowded out by a mode of exchange that merely mimics dialogue; and the pressing political issues that face the nation remain undiscussed.
Peter Soriano. From Permanent Maintenance, 2015.
"… his largest wall drawing to date. Commissioned by the Colby College Museum of Art, this multipart piece spans approximately one hundred linear feet …"
Current exhibition at Colby College in Maine.
What Everyone is Getting Wrong about Predictive Policing
by Olivia Zhu
Predictive policing is catching the public's attention. Interest in the topic hasn't abated since greater scrutiny, strained budgets, and racial tension began to plague police departments and the communities they are meant to protect. The Marshall Project and ProPublica, among a host of other news organizations, have published in-depth—and extremely popular—descriptions and critiques of the trend.
These pieces merely scratch the surface of the technologies and methods required for predictive policing. The majority of discussions in this space focus on the ethics involved: Are the results increasing instances of racial profiling? Does the practice violate Fourth Amendment rights?
But here’s the thing.
Although predictive policing is in its infancy with regard to adoption and success, there’s far more to it—and there are better questions to be asking.
For example, journalists have wondered about the quality of the data that police departments are feeding into newly purchased software programs. It's certainly not wrong to state that predictive models will only be as good as the data that serves as their foundation. Accordingly, assessors have quibbled over whether the data that police departments collect under- or over-represents poor, often minority-dominated communities.
One theory goes that these underserved communities don’t trust the police, and thus are less likely to report crime—making it less likely they’ll be served by any of the benefits of predictive policing. Conversely, perhaps police presuppose that certain neighborhoods are more prone to crime, and decide to patrol them more frequently. That, in turn, increases the likelihood that more incident reports are filed for the region. Predictive models suggest more patrols in these areas, and racial profiling may occur as a result.
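The feedback loop described above can be made concrete with a toy simulation. This is a hypothetical sketch, not any vendor's actual model: two neighborhoods share the same true crime rate, but one starts with more recorded incidents, and patrols are then allocated in proportion to the records.

```python
import random

random.seed(0)

# Hypothetical illustration only. Neighborhoods "A" and "B" have the SAME
# true crime rate, but "B" starts with more recorded incidents because it
# was historically patrolled more heavily.
TRUE_RATE = 0.5                 # chance of observing a crime per patrol-hour
records = {"A": 10, "B": 20}    # biased historical incident counts

for year in range(10):
    total = sum(records.values())
    # predictive allocation: patrol hours proportional to past records
    allocation = {hood: 100 * records[hood] / total for hood in records}
    for hood, hours in allocation.items():
        # more patrol hours -> more crimes observed and recorded
        observed = sum(random.random() < TRUE_RATE for _ in range(round(hours)))
        records[hood] += observed

share_B = records["B"] / sum(records.values())
print(f"share of all records from B after 10 rounds: {share_B:.2f}")
```

Because allocation tracks recorded incidents rather than the underlying rate, neighborhood B keeps generating the majority of the records indefinitely: the model faithfully "learns" the bias in its own data-collection process, which is exactly why the under- versus over-reporting question matters.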
The very first set of questions that should be asked, then, is: “How can we determine if under- or overreporting is happening?” “Do reporting trends vary by type of crime?” and “Once we know, can we fold the knowledge into effective predictive policing programs?”
Startups like ShotSpotter and Knightscope might help with the data issue, as they collect and report phenomena such as gunshots, license plates, and other real-time data. Though they may help address the question of how faithful the data is to actual events, other critics have questioned whether this kind of data collection presents a privacy concern.
The data that police will consistently rely upon comes from their own troves of paperwork and documentation, in addition to the information collected by private vendors or provided by other jurisdictions. Combining that data when needed is going to be extremely difficult, requiring customization for each department’s idiosyncrasies. Scaling that fusion in a way that is lucrative for the software vendors but still useful for the communities at hand is going to be an interesting problem to tackle, and it seems unlikely to happen unless coupled with massive overhaul of IT infrastructure and thorough redesign of data collection processes.
Thus, the next question set should be: “Are predictive policing programs being implemented in a way that encourages maximum success, given the dollar and time investments of the departments?” “How is success being measured—and are the success rates colored by the fact that the programs themselves changed what data is being collected?”
Moving forward, it’s clear that the most important metrics for stakeholders involve reducing incidents of violent crime. Those statistics are the ones that grab the attention of journalists and citizens alike—they’re sexy, for one thing, beyond being undeniably important.
Yet to focus only on violent crime would be an oversight. The St. Louis County Police Department, for example, had asked a software vendor called HunchLab to only present analyses of violent crimes to mitigate the effects of “racially disproportionate policing.” According to the article, HunchLab and the St. Louis police are seeing some promising results. But it’s important to note that numbers relating to petty crimes (that could still exclude ones excessively tilted to minorities, such as drug arrests) are more robust. The data set is bigger, and therefore could be more reliable.
So, what should police departments be focusing on? First and foremost, I—as a citizen—would expect there to be methods to identify such quotidian, apparently mundane details as how ticket quotas should be set, or where and when officers could best be deployed to manage traffic flows. The ripple effects of these and other minuscule improvements are what management consultants and manufacturers have been using to make entire industries more productive for years upon years—so why not public sector organizations, too?
Moreover, policing involves so much more than estimating when and where crimes will be committed—the issues that generate so much play in the media and so much concern by civil rights groups. The market has not addressed critical questions of how, say, to improve staffing efficiency, or to manage predicting headcounts given expected population increases or decreases. Even further—could predictive policing help identify problem behavior in certain officers, not just offenders?
The final note is going to seem trite—and yes, I'm warning you in advance. The media needs to stop associating predictive policing with Minority Report, whether the short story, the film, or the short-lived television series. First of all, it's a lazy analogy to present to a lay reader. More critically, it limits the imagination in thinking of what predictive policing is capable of and should be leveraged for. Good data, math, software, and policies shouldn't be locked in a black box labeled "Precog"—they need to be understood, criticized, lauded, and used well.
The Mesh of Civilizations in Cyberspace
by Jalees Rehman
"The great divisions among humankind and the dominating source of conflict will be cultural. Nation states will remain the most powerful actors in world affairs, but the principal conflicts of global politics will occur between nations and groups of different civilizations. The clash of civilizations will dominate global politics."
—Samuel P. Huntington (1927-2008), "The Clash of Civilizations"
In 1993, the Harvard political scientist Samuel Huntington published his now infamous paper The Clash of Civilizations in the journal Foreign Affairs. Huntington hypothesized that conflicts in the post-Cold War era would occur between civilizations or cultures and not between ideologies. He divided the world into eight key civilizations which reflected common cultural and religious heritages: Western, Confucian (also referred to as "Sinic"), Japanese, Islamic, Hindu, Slavic-Orthodox, Latin-American and African. In his subsequent book "The Clash of Civilizations and the Remaking of the World Order", which presented a more detailed account of his ideas and how these divisions would fuel future conflicts, Huntington also included the Buddhist civilization as an additional entity. Huntington's idea of grouping the world into civilizational blocs has been heavily criticized for being overly simplistic and ignoring the diversity that exists within each "civilization". For example, the countries of Western Europe, the United States, Canada and Australia were all grouped together under "Western Civilization" whereas Turkey, Iran, Pakistan, Bangladesh and the Gulf states were all grouped as "Islamic Civilization" despite the fact that the member countries within these civilizations exhibited profound differences in terms of their cultures, languages, social structures and political systems. On the other hand, China's emergence as a world power that will likely challenge the economic dominance of Western Europe and the United States lends credence to a looming economic and political clash between the "Western" and "Confucian" civilizations. The Afghanistan war and the Iraq war between military coalitions from the "Western Civilization" and nations ascribed to the "Islamic Civilization" both occurred long after Huntington's predictions were made and are used by some as examples of the hypothesized clash of civilizations.
It is difficult to assess the validity of Huntington's ideas because they refer to abstract notions of cultural and civilizational identities of nations and societies without providing any clear evidence on the individual level. Do political and economic treaties between the governments of countries – such as the European Union – mean that individuals in these countries share a common cultural identity?
Also, the concept of civilizational blocs was developed before the dramatic increase in the usage of the internet and social media which now facilitate unprecedented opportunities for individuals belonging to distinct "civilizations" to interact with each other. One could therefore surmise that civilizational blocs might have become relics of the past in a new culture of global connectivity. A team of researchers from Stanford University, Cornell University and Yahoo recently decided to evaluate the "connectedness" of the hypothesized Huntington civilizations in cyberspace and published their results in the article "The Mesh of Civilizations in the Global Network of Digital Communication".
The researchers examined Twitter users and the exchange of emails between Yahoo-Mail users in 90 countries with a minimum population of five million. In total, they analyzed "hundreds of millions of anonymized email and Twitter communications among tens of millions of worldwide users to map global patterns of transnational interpersonal communication". Twitter data is public and freely available for researchers to analyze, whereas emails had to be de-identified for the analysis. The researchers did not have any access to the content of the emails; they only analyzed whether users in any given country were emailing users in other countries. The researchers focused on bi-directional ties. This means that ties between Twitter users A and B were only counted as a "bi-directional" tie or link if A followed B and B followed A on Twitter. Similarly, for the email analysis, the researchers only considered email ties in which user X emailed user Y, and there was at least one email showing that user Y had also emailed user X. This requirement for bi-directionality was necessary to exclude spam tweets or emails, in which one user may send out large numbers of messages to thousands of users without there being any true "tie" or "link" between the users that would suggest an active dialogue or communication.
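A minimal sketch of this reciprocity filter (my illustration, not the authors' code): given a list of directed communication events, keep only the pairs with traffic in both directions, so a bulk sender who never receives replies produces no ties.

```python
def bidirectional_ties(directed_edges):
    """Return the set of unordered pairs with edges in both directions."""
    edges = set(directed_edges)
    return {frozenset((a, b)) for (a, b) in edges
            if a != b and (b, a) in edges}

# Toy example: X and Y message each other; the spam-like sender S only
# sends and never receives, so S forms no ties at all.
messages = [("X", "Y"), ("Y", "X"), ("S", "X"), ("S", "Y"), ("S", "Z")]
print(bidirectional_ties(messages))
```

Only the X-Y pair survives the filter, which is exactly the property the researchers needed: an active dialogue rather than one-way broadcasting.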
The researchers then created a cluster graph which is shown in the accompanying figure. Each circle represents a country, and the 1000 strongest ties between countries are shown. The closer a circle is to another circle, the more email and Twitter links exist between individuals residing in the two countries. For the mathematical analysis to be unbiased, the researchers did not assign any countries to "civilizations", but they did observe key clusters of countries emerge which were very close to each other in the graph. They then colored each circle to reflect its civilization category as defined by Huntington, colored ties within a civilization in that same color, and kept ties between countries of two distinct civilization categories in gray.
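The figure's construction can be sketched along these lines (a hypothetical reconstruction; the tie weights and the function name below are illustrative, not taken from the paper): rank country-pair tie strengths, keep the strongest K, and color each surviving tie by its endpoints' shared Huntington category, falling back to gray for cross-civilization ties.

```python
def edge_colors(tie_strength, civilization, k=1000):
    """tie_strength: {(country_a, country_b): weight};
    civilization: {country_code: category label}.
    Returns [(pair, color)] for the k strongest ties, where intra-
    civilization ties take the civilization's label as their color."""
    top = sorted(tie_strength, key=tie_strength.get, reverse=True)[:k]
    return [((a, b),
             civilization[a] if civilization[a] == civilization[b] else "gray")
            for (a, b) in top]

# Toy data loosely echoing the countries discussed in the text.
ties = {("CL", "PE"): 9.0, ("PK", "IN"): 7.5, ("GR", "DE"): 6.0}
civ = {"CL": "Latin American", "PE": "Latin American",
       "PK": "Islamic", "IN": "Hindu",
       "GR": "Slavic-Orthodox", "DE": "Western"}
for pair, color in edge_colors(ties, civ, k=3):
    print(pair, "->", color)
```

Here the Chile-Peru tie keeps its civilization's color while the Pakistan-India and Greece-Germany ties render gray, mirroring the intra- versus cross-civilization distinction in the figure.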
At first glance, these data may appear to be a strong validation of the Huntington hypothesis because the circles of any given color (i.e. a Huntington civilization category) are, on average, far closer to each other than circles of a different color. For example, countries belonging to the "Latin American Civilization" (pink) strongly cluster together, and some countries such as Chile (CL) and Peru (PE) have nearly exclusive intra-civilizational ties (pink). Some of the "Slavic-Orthodox Civilization" (brown) countries show strong intra-civilizational ties, but Greece (GR), Bulgaria (BG) and Romania (RO) are much closer to Western European countries than to other Slavic-Orthodox countries, likely because these three countries are part of the European Union and have shared a significant cultural heritage with what Huntington considers the "Western Civilization". "Islamic Civilization" (green) countries also cluster together, but they are far more spread out. Pakistan (PK) and Bangladesh (BD) are far closer to each other and to India (IN), which belongs to the "Hindu Civilization" (purple), than to Tunisia (TN) and Yemen (YE), which Huntington also assigned to the "Islamic Civilization".
One obvious explanation for there being increased email and Twitter exchanges between individuals belonging to the same civilization is the presence of a shared language. The researchers therefore analyzed the data by correcting for language and found that even though language did contribute to Twitter and email ties, the clustering according to civilization was present even when taking language into account. Interestingly, of the various factors that could account for the connectedness between users, it appeared that religion (as defined by the World Religion Database) was one of the major factors, consistent with Huntington's focus on religion as a defining characteristic of a civilization. The researchers conclude that "contrary to the borderless portrayal of cyberspace, online social interactions do not appear to have erased the fault lines Huntington proposed over a decade before the emergence of social media." But they part ways with Huntington on one point: the closeness of countries within a civilization does not necessarily imply that it will lead to conflicts or clashes with other civilizations.
It is important not to over-interpret one study on Twitter and email links and make inferences about broader cultural or civilizational identities just because individuals in two countries follow each other on Twitter or write each other emails. The study did not investigate identities, and some of the emails could have been exchanged as part of online purchases without indicating any other personal ties. However, the data presented by the researchers does reveal some fascinating new insights about digital connectivity that are not discussed in much depth by the researchers. China (CN) and Great Britain (GB) emerge as some of the most highly connected countries at the center of the connectivity map with strong extra-civilizational ties, including to countries in Africa and to India. Whether this connectivity reflects the economic growth and increasing global relevance of China or a digital footprint of the British Empire even decades after its demise would be a worthy topic of investigation. The public availability of Twitter data makes it a perfect tool to analyze the content of Twitter communications and thus define how social media is used to engage in dialogue between individuals across cultural, religious and political boundaries.
Huntington, S. P. (1993). The Clash of Civilizations. Foreign Affairs, 72(3), 22-49.
State, B., Park, P., Weber, I., & Macy, M. (2015). The mesh of civilizations in the global network of digital communication. PLoS ONE, 10(5), e0122543.
Monday, June 13, 2016
perceptions of intense tragedy
Aftermath of the massacre at Pulse in Orlando.
This post is in honor of the LGBT community in Orlando and worldwide, and the innocent victims of this horrific crime. In solidarity.
"Club was sister's tribute to gay brother who died of AIDS ... "
Six of One and Half a Dozen of the Other
by Maniza Naqvi
Wearing white, Hillary Clinton made her speech as the presumptive Presidential nominee of the Democratic Party after the California Primary as the one who would save us from Trump. But she is the one who has been saved by Trump. Wealthy warriors, Trump and she, members of the one percent, diverting America's attention from this fact and uniting America through fear, presenting fear as their net worth and credentials to the ninety-nine percent.
The same fear, that the specter of extremism would take over, had Americans marching into war and now has them marching towards Hillary, who voted for the wars. Hillary Clinton didn't stand much of a chance (given her record on supporting war and her accumulation of wealth) in today's context of a deafening roar of protest about rising poverty and the growing gap between the rich and the middle class, the 99% versus the 1%. But then, miraculously, that context was trumped by Trump and his rhetoric of fascism. Trump has provided the theater needed to make Hillary viable and credible. The more preposterous he gets, the lighter and more bearable her very real baggage regarding trust and bad decisions becomes. Hillary's trump card is Trump, in the game against Bernie Sanders.
Everything sorted out for a television screen. Television loves Donald Trump—ratings have never been better thanks to him. He is better than Jon Stewart, Stephen Colbert and whoever else helped us laugh away the war and violence over the last sixteen years—helping us all turn facts of atrocities into late night comedy and entertainment. It's all been a hahaha moment, hasn't it? Now they have been replaced by Donald Trump, who makes us continue to laugh and smirk as we watch TV and feel comforted in fear, before we sleep. The over-the-top rhetoric of Trump seems to ensure that nothing sticks to Hillary Clinton, except the fear of the alternative. TV's Trump will get Hillary Clinton elected. But this tactic has unleashed a very real monster that has roared its racism and fascism: the crowded, mob-like rallies that Trump has gathered around him. White hooded sheets in the dead of night are no longer required; thanks to Trump and TV, hatred can now be expressed loud and clear on prime time television and be cheered.
We are to believe that Hillary Clinton will save Americans from Trump. When, in fact, it is Trump who has saved Hillary Clinton from Americans by making them think that Hillary Clinton will save them from Donald Trump. The one who acted and voted for violence will save Americans from the one whose rhetoric is violent? The one who voted for wars in Afghanistan and Iraq and Libya—and for every military action in the mainly Muslim lands—is the one who stands between Americans and the man who says he wants to ban Muslims from America. Trump has fashioned his rhetoric to an already primed audience, the cheering crowds at his rallies are those who have literally grown up and grown old listening to the propaganda of hatred and fear and the lies which were necessary for the prosecution of the wars.
The one who has been part and parcel of furthering the business of war is going to stop us from the rhetoric of violence spewed by Trump, the man who opposed the Iraq war? But Trump's rhetoric, far-fetched as it is, may be designed for ensuring the business of war. Ensuring that the System will prevail and Hillary will win, where earlier neither could have done so without just such extreme rhetoric to scare voters into acquiescence. And it has woken up the rage of the majority, who are bewildered and angered by the politics of victimhood and grievances of a coalition of minorities. The tyranny of the minorities.
How long has America been on this trajectory? This trajectory of needing a victim for its monetized and profitable violence? For 16 years? Or for far more? Did it start when the atom bombs were dropped on Hiroshima and Nagasaki? Did those who took the decision to use the bomb, not once but twice, know how many the bombs would kill? Or were they just trying to find out how many? Hundreds of thousands or millions? Did it matter? Power had to become super power. Super power. Did the trajectory begin then? Or was it with slavery? Or before then with the Native Americans. Each time, power had to become super. Eisenhower called it the military industrial complex. The System which must be safeguarded. The logic of the business of war.
Here we are, sixteen years later, a population cynically and expediently marinated in fear and hate by the highbrow and low-barred media, military and politicians. Here we are with fascism as the main discourse, being argued for and against and made stronger with every sound bite, playing strongly to cheering crowds. Donald Trump, the near-bankrupt casino and real estate fraud, plays to ecstatic crowds when he screams hate against Muslims and Mexicans. This fascist demagogue is challenged by his opponent Hillary Clinton, who voted for war in Afghanistan and Iraq and Libya and who is part and parcel of the architecture of military and covert violence and war crimes that have been unleashed against the very people that Donald Trump wants to build walls against, build surveillance against, and build prison camps for. And there is Bernie Sanders, the man we all love, what's not to love? Yet he too is running on the strength of the anger of the majority, called white, those who feel disempowered and angry about how they have been left out and for whom the system does not work. If this system worked for them, would it have been okay? All three of them, Trump, Clinton and Sanders, are appealing to angry white voters. And for the rest, whoever wins in November, it's six of one and half a dozen of the other. Because what happens when all the supporters of these three candidates don't have their candidate win? How disgruntled will they be then?
Uncharacteristically, I've fallen silent. I have nothing to say. Sixteen years of lies about Muslims, about Islam and the spurious justifications to invade, occupy and bomb lands has brought us to this. Americans who are used to winning regardless of being doped or duped or both are now feeling unmasked, so that the doping and duping is out in the open for all to see, along with the costly investments in unwinnable wars and in little else. Of course that makes for a lot of rage. Because the entitled believe ‘When we say let's roll, you roll over and die.' And that hasn't happened. So there is this huge ball of rage ready to go to the next level of "Let Freedom and Liberty ring!" Fascism.
Those who began this new series of wars in 2000, for a new century: those war criminals have escaped forward, without a single moment of reckoning for their crimes. An odd mistress uncovered here, another pathetic peccadillo there. Or the breach of confidentiality and security. No matter. Or the confidential speeches to the wealthy bankers assuring them full support. Given a pass. Or the ill-thought-through decisions of war. No matter at all. That's it. No accounting for acts of abuse of power to kill innocent people. Nothing. And who will pay the price? Not them, for they will constantly consolidate power by invoking fear. And when it isn't fear that they invoke, it is the politics of gender-appropriate bathrooms and other issues of equal weight. Or both.
My throat feels constricted as if I've swallowed a lump of grief that feels like a ball of bile, my neck and shoulders ache with the strain and stress of anxiety and my heart aches—I have been worried about this inevitability from the start as I've read and listened to and watched the so called free media all these years play the role of being the propaganda arms of the institutions of war and occupation. The relentless distortions, misrepresentations and outright lies steadily increased the space and permission for public demonstrations and admissions of bigotry, and the targeting and vilification of whole peoples and their religion.
Once again, those with their unending sense of privilege and an eternal carte blanche of innocence, who are singularly enraged and empowered by fear and hatred are poised to act egregiously. And I cannot even bring myself to say: I told you so. Who would I say it to? Them?
Me and 23: Confessions of a Genome Junky
by Carol A. Westbrook
I have 23 first cousins. Me and my 23 cousins are not particularly interested in our genealogy or our genetics. We know our roots: Polish ancestry via our common grandparents, and Polish on both sides for me; pictured here are my paternal grandparents. We know that we will eventually succumb to cardiovascular disease (heart, stroke or high blood pressure), while no other serious diseases run in the family. And we all look alike, as you can see from this picture of a recent cousins-only reunion.
In truth, I don't have 23 first cousins, I have 30. I have nine cousins on my mother's side. I have twenty-one on my father's side, most of whom were at the reunion in the picture, and all of whom are descended from our common Polish grandparents, Eva and Marek Garstka, pictured at the top of this article. I use the phrase "23 cousins" figuratively, as it is a convenient segue to the topic of the DNA test service, 23andMe. Initially I was dismissive of this genealogy-based service because, like most of my cousins, I felt that there was nothing to learn from genetic testing. I know my medical heritage, I don't need to confirm that I was 100% Polish, and I, for one, was not interested in using this service to find any more relatives. I have too many already.
I had been resisting 23andMe for another reason: I am a genome junky, and I am on the wagon. I love having my DNA tested. I feel a kinship with DNA testing because I did research in the US Human Genome Program for twenty years of my career. I had my own genome sequenced by Illumina about two years ago, at considerable expense (see my post of April 23, 2014, My Genome Report Card). I realized then that I had to stop spending money on DNA tests, and I have sworn them off since.
What changed my mind was my husband. The discussion went like this:
"You have no culture. You are truly Neanderthal," he said.
"No, you are more Neanderthal than I am," I responded.
"Okay, prove it," he countered.
So we both ordered the 23andMe kits.
We sent in our money and got the kits in the overnight mail. We collected our spit in the tubes and mailed them in. A few weeks later we were each notified by email that our results were available, so we logged into the secure website and went straight to the ancestry section on "Neanderthal DNA."
Yep, I was right. He was more Neanderthal than I was: his 295 cave-man genes to my 279. A slight margin, but not enough, he pointed out, to secure the nomination.
But there was more to learn from 23andMe.
Not surprisingly, I did not carry genes for any of 36 recessive diseases tested by 23andMe, most of which are rare diseases of childhood. I already knew this from my Illumina sequence. Unlike Illumina, though, 23andMe does not test for serious dominant diseases that show up later in life, such as breast cancer, Lou Gehrig's Disease, or Huntington's Disease.
23andMe tests for traits, and accurately predicted that I have light skin with minimal freckling, blonde, straight hair, and blue eyes, and that I don't have a cleft chin, dimples, or a widow's peak. It also reported that I was likely to be a heavy consumer of caffeine and a runner, but not a deep sleeper. All true. I can taste bitter, I can smell asparagus, I prefer salty snacks to sweet, and don't flush with alcohol (though I flush for other reasons). I have the gene for lactose intolerance, though this was puzzling because I'm not bothered by dairy products, which I consume daily.
Many people have found long-lost relatives through 23andMe, including orphans, people sired by sperm donors, and people whose real father is not the guy on their birth certificate. My 23andMe report showed that I was related to 1,611 of their customers who chose to share their information! However, no one was closer than a fourth cousin, and all shared less than 1% of their DNA with me, slightly less than eight degrees of separation. This is a relief. I do not need any more first or second cousins, thank you.
Moving on to ancestry: Rick found that he was 94.2% Eastern European or "Broadly" European, with a smattering of Northwestern European, as is to be expected for someone who is 100% Lithuanian. No surprises there.
My ethnic ancestry report confirmed that I was 99.7% European, but instead of being 100% Eastern European or "Broadly" European, I was only 84.3%! The remainder of my DNA, 13.2%, was Southern European, primarily from the Balkans, with a bit of Sardinian and "General Southern European" thrown in. That means I was not 100% Polish; I was about one-eighth "something else."
Wow, that's a surprise. One of my four grandparents was only half Polish! But which one? And from what side of the family? And what was his or her other half? Who was the unknown great grandparent, the mysterious stranger in the family?
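The arithmetic behind that deduction is simple, and a quick sketch makes it explicit (in Python, purely as an illustration; real inheritance is stochastic because of recombination, so an observed 13.2% against an expected 12.5% is well within reason):

```python
from fractions import Fraction

# On average, each of the four grandparents contributes ~1/4 of
# an autosomal genome.
grandparent_share = Fraction(1, 4)

# Hypothesis: one grandparent was half "something else," so that
# grandparent's own genome is ~1/2 non-Polish.
half_foreign = Fraction(1, 2)

expected_non_polish = grandparent_share * half_foreign
print(expected_non_polish)         # 1/8
print(float(expected_non_polish))  # 0.125 -- close to the observed 13.2%
```

This is why a single half-non-Polish grandparent, rather than, say, a fully non-Polish great-great-grandparent (which would predict ~1/16, or 6.25%), is the hypothesis that best fits the number.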
I asked around in my family.
As it turns out, the identities of our great-grandparents are shrouded in obscurity, owing to immigration and to the loss of Eastern European church and civil records during the war. What I do know is that my father's father, Marek, was likely illegitimate, while my mother's mother was the daughter of an illegitimate mother. At the turn of the last century, that alone was good reason for a young Catholic person to emigrate and keep mum about his or her roots.
What little evidence we have is the baptismal certificate of my maternal grandmother, which indicated she was the daughter of an illegitimate woman and of a Polish father whose parents may have been "Greek." Further confusing the issue is a tradition in my father's family that my paternal grandfather, known to be illegitimate, had an unknown father who was "Greek."
I put "Greek" in quotation marks because it is hard to imagine that either of these families, both from central Poland, had any occasion to run into someone from Greece. People didn't travel far from their homes in those days, and my grandparents lived about as far from the sea as you can get in Europe. I toyed with the idea of a Jewish wanderer, but 23andMe showed that I did not carry any Ashkenazi Jewish genes. Because of the Balkan genetics, I suspect that the mysterious stranger was a visitor from nearby Romania or Hungary. Perhaps in those days, any swarthy, mysterious, Southern-appearing stranger would be called a "Greek" in Poland.
We will never know who this mysterious stranger was or where he came from. There is no one to ask, as the suspect grandparents and all of their first-generation offspring (my parents, uncles, and aunts) are deceased.
But that's not the end of the story. We may yet have a chance at finding out which side of the family carries these non-Polish genes if we use genetics. That is, if a cousin on each side took the 23andMe test, we could see which of them carries this Southern European ancestry. Yet it gets complicated. We would have to eliminate cousins who have parents of other ethnicities, who are half Italian or half German, for example. We would have to limit the study to those who are 100% Polish, sharing one pair of grandparents with me, with the other pair all Polish. And who's to say they don't have a mysterious "Greek" ancestor in their family as well?
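The logic of that cousin experiment can be sketched as a back-of-the-envelope model (again in Python, and again purely illustrative: the function name and the tidy fractions are my own assumptions, and actual shares vary with recombination):

```python
from fractions import Fraction

def expected_southern_european(cousin_side: str, mystery_side: str) -> Fraction:
    """Expected Southern European fraction in an otherwise-100%-Polish
    first cousin, under the hypothesis that one grandparent on
    `mystery_side` ('maternal' or 'paternal') was half non-Polish.
    A first cousin, like me, draws ~1/4 of their DNA from each of
    the two grandparents we share on that side."""
    if cousin_side == mystery_side:
        # Cousin inherits ~1/4 from the half-non-Polish grandparent.
        return Fraction(1, 4) * Fraction(1, 2)   # ~1/8, like mine
    return Fraction(0)                            # none expected

# If paternal-side cousins show ~12.5% Southern European and
# maternal-side cousins show ~0%, the mystery great-grandparent
# was on my father's side (and vice versa).
print(expected_southern_european("paternal", "paternal"))  # 1/8
print(expected_southern_european("maternal", "paternal"))  # 0
```

The complication noted above falls out of the model too: a cousin with a half-Italian or half-German parent would carry non-Polish ancestry for reasons unrelated to the shared grandparents, which is why the test only works on the all-Polish cousins.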
I will bring it up at the next family reunion. But until we have the genome sequences for most of the rest of the earth's population, we won't be able to locate any of our presumed long-lost relatives. That is a relief. Our family reunion could become unwieldy with 1,611 more cousins.
Monday, June 06, 2016
by Holly A. Case
When Joseph Brodsky taught poetry at Mount Holyoke College, his method of choice was memorization. At the beginning of every class, students took out a blank sheet of paper and wrote out the poem for discussion that day from memory. Every comma, every line break, every word: they all had to be in the proper place. More than three errors of any kind would earn a zero.
I audited Brodsky's course on the poetry of W. H. Auden as a sophomore. Though I rarely adhered to his strict regimen, I did with Auden's "September 1, 1939."
After the ritual of the blank sheet came the discussion. Holding a plastic espresso cup, and often—in defiance of every code—a cigarette, Brodsky walked among us, repeating lines from the poem with Russian-accented rhythmic intonation:
Blind skyscrapers use
Their full height to proclaim
The strength of Collective Man,
Each language pours its vain
Competitive excuse: …
(He pronounced the noun and verb forms of "excuse" identically, always like the verb.)
Then came a question: Why "blind skyscrapers"? A hand or two went up. Possible answers were proffered and gently dismissed. Finally, he offered an image of clouds reflected in the glass; everything deflected, nothing allowed in. As I listened, the adjective "blind" opened wide, swallowing a hissing tangle of nouns: "ignorance," "hubris," "superficiality," "soullessness," "emptiness," "selfishness," whereupon—already grotesquely distended with meaning—it proceeded to engulf the hundreds of pages of Ayn Rand's The Fountainhead, as well. Brodsky was passing from behind on my right as he spoke, the light on the desks was diffuse and without shadow, and a boy in a tutu from Hampshire College was sitting to my left: nothing happened, everything changed.
Despite having memorized the poem perfectly and even experienced an in-class revelation, I still made a mistake that day that has taken a very long time to amend. What I had scrawled in nervous haste on the paper at the start of class was technically correct, but I had misunderstood one of the words in the poem. The final stanza of "September 1, 1939" reads:
Defenceless under the night
Our world in stupor lies;
Yet, dotted everywhere,
Ironic points of light
Flash out wherever the Just
Exchange their messages:
May I, composed like them
Of Eros and of dust,
Beleaguered by the same
Negation and despair,
Show an affirming flame.
My mistake came in the eighth line, where my mind had swapped one Greek deity for another. Though I wrote "Eros" (the god of love), I was thinking of "Ares" (the god of war). With this mental substitution, the meaning of the poem is wholly transformed. In Auden's version, the poet does what comes naturally to him—namely love—in a show of solidarity with others like him. In my version, the poet must defy his nature. Though he is tempted to march along to the drumbeat of war, he wills himself to resist temptation. Yet he may fail, for the final sentence, though it ends in a period, takes the rhetorical form of a question. Will he manage to do what is right? The poet's internal struggle to overcome his nature and unite in solidarity with the Just becomes part of the drama, and his triumph will be all the greater if he manages to "Show an affirming flame."
Years later I realized my error, and felt relieved that no one—thank heavens not the late Brodsky, Nobel laureate, protégé of Anna Akhmatova—had noticed. To cover for my shame, I tried to bend reality to the way I had come to see it. Was it not possible that Auden had accidentally written "Eros" for what he most certainly intended to be "Ares"? And how would we even know if that were the case? A friend of mine pointed out—with a most unbecoming smirk—that it was unlikely that the author of "The Shield of Achilles" had mixed up his Greek gods. "Well then," I retorted, in a rapture of dilettantism, "if the mistake was not Auden's, then perhaps it was a mistake on his part not to have made such a mistake." I still believed that Auden's Eros made matters too easy for the poet.
More years passed. I assigned the nine-stanza, ninety-nine-line poem in accordance with Brodsky's method to my own class. (The impact this assignment had on student evaluations can be safely left to the imagination.) Reading the many iterations the students had written—some nearly perfect, others more tortured and impressionistic—again and again I lowered my pen over botched versions of the treacherous sixth stanza, with its two Russian names and capitalized "Persons":
The windiest militant trash
Important Persons shout
Is not as crude as our wish:
What mad Nijinsky wrote
Is true of the normal heart;
For the error bred in the bone
Of each woman and each man
Craves what it cannot have,
Not universal love
But to be loved alone.
Comprehension, that miniature demon who had spent years in the prison of my blind fixation, finally lost patience and started snapping the multiple "l's" of the last two lines over his knees in exasperated disgust. He knew! the demon shrieked. Auden always knew! And Brodsky knew he knew! And Brodsky knew you ought to know, too, because you bloody well had learnt it by heart!
The demon was right. The sixth stanza reveals all: Auden had not been mistaken to fashion the Just out of Eros rather than Ares, for his Eros is forever tainted by an error, the desire to be the sole object of affection. How very nearly impossible it then becomes for a "normal heart" to transcend its crude wish and "Show an affirming flame." Auden's Eros knew wickedness intimately. Brodsky's method teaches intimate knowledge, because only what you know intimately can take you beyond what you know.
Pedro Ruiz. The Displacement Series. Ca. 2005.
Oil & resin on canvas.
Monday, May 30, 2016
Memorial Day: The Heartbreaking Convergence of Freedom and Fear
by Humera Afridi
Mere steps from Castle Clinton in Battery Park, on the southern tip of Manhattan, stands a striking bronze sculpture titled The Immigrants. Created in 1973 by the Spanish sculptor Luis Sanguino, it portrays a group of individuals who have undertaken an arduous voyage. Their gripping expressions and postures tell a story of endurance—borne with patience and prayer; kindled by hope for a life of dignity, free of fear, whose nimbus-like promise will surely unfurl in this new world where they have disembarked.
Amid the deep-green lawns, beds of blooming tulips, and the sunny melodies of street jazz, the bronze figures beckoned. I spotted them on my lunch break, a fortnight or so before Memorial Day. Their raw emotions and the naked display of the human spirit expressed in all its earnestness caught me by surprise. Here in plain sight was a visual testimony to the search for sanctuary—a struggle that is painfully alive in a world beset by wars, but also, immediate and close to home, visceral in the lives of many thousands of immigrants in America who having found refuge here, nevertheless, now tragically live in fear of being deported and separated from their families.
A figure kneels, bare-chested with head thrown back, arms spread wide, broken chain-links dangling from fingers; another clasps both hands in fervent prayer, gaze directed heavenward. Disconcertingly candid and telling is the stance of one at the front of the line, who crouches, with a hand outstretched—surely symbolic of the labor of immigrants, and former slaves, upon whose foundation this nation is built. In the middle of the group stands a robed male of dignified bearing, arm held across his breast in a gesture of allegiance? of self-determination?
"Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!"
If once, on approaching these shores, the mercy and beneficence of these words elicited tears of gratitude and relief, today, remembering these lines against a backdrop of fear-ridden rhetoric and anti-immigrant vitriol, one weeps with despair at their growing hollowness, at the realization that the "golden door" is coming to resemble, more and more, the metal gate of a jail cell.
On January 6, 1941, President Franklin D. Roosevelt delivered his famous Four Freedoms speech. The world was at war, he declared. And in order to safeguard America's cherished value of Liberty, the dictators and tyrants abroad had to be defeated. In fact, the outcome of the war would determine if America's ideal of freedom would prevail over tyranny. The expansion of liberty in the world, Roosevelt insisted, was the best hope for peace at home. He envisioned Americans to be at the vanguard of establishing freedom and democracy, first at home, and then enabling their fruition in the world beyond.
"In the future days which we seek to make secure, we look forward to a world founded upon four essential human freedoms," Roosevelt said. "The first is freedom of speech and expression—everywhere in the world. The second is freedom of every person to worship God in his own way—everywhere in the world. The third is freedom from want…. The fourth is freedom from fear…"
Standing in historic Battery Park—former "golden gate" to the new world, and, at the same time, an area of defense equipped with artillery batteries to protect the settlement—I suddenly understood that freedom and fear breathe side by side. They share the same topography. They are tribal cousins in feudal rivalry. Caught in a tensile dance, a forever friction, freedom and fear are creative-exploitative; they are neighbors and enemies; they are light and shadow, sharing a volatile locus. To remain at peace Americans needed to prepare to fight, said Roosevelt. To secure Liberty, armaments and defenses had to be built. It was a cruel paradox.
Thirteen months after Franklin D. Roosevelt's eloquent speech—in which he impressed upon Americans the right of every human being to be free of fear—America, in a sinister, Kafkaesque turn of events, turned on its own citizens and systematically destroyed dreams and families, homes and livelihoods. It was Roosevelt himself who signed the evacuation order for Japanese Americans to be rounded up and interned in concentration camps. Two-thirds of the 127,000 imprisoned were American-born; some were even veterans of the United States Army in World War I; many had never set foot in Japan. They resembled the enemy; they were of Japanese ancestry. That was their crime. And no one who resembled the enemy was exempt.
At the beginning of Julie Otsuka's harrowing novella, When the Emperor was Divine, we witness the movements of a woman who remains nameless as she works with the quiet, concentrated energy of one upon whom a great violence has been enacted. She is a woman without a husband now—we learn that the FBI came for him around midnight a few weeks earlier, took him away in his bathrobe and bedroom slippers, and locked him away in detention, in a treeless wilderness, where he sleeps on a metal cot. She is a woman with children and pets to care for. We sense the love coiled tight in her heart. Were she to ponder the weight of care in this new reality, and the work of love, and allow them to unravel, she might collapse. She can't afford that luxury. She has seen the notices all over town for evacuation orders.
She must act. She must pack. She is a good citizen, an obedient citizen, a loving mother. So, she feeds the old, half-blind dog a delicious meal of rice balls, egg and salmon; she rubs his stomach, talks to him, walks him over to a tree, ties him with a piece of twine and bludgeons him to death. Then she buries him. She feeds her daughter's beloved parakeet before she lets it out of its cage, out of the window, and shoos it away to freedom. She packs her suitcase. She buries the silver in the garden. The children each have a small suitcase. That's all they're allowed to carry. A subdued, stifled, ominous quality pervades the narrative, creating a haunting evocation of the living death experienced by those who left home to be quarantined on the other side of the barbed wire fence.
After the war, the mother and her two children return to a world that looks upon them with suspicion, with a mistrust and hate that the children internalize. "We looked at ourselves in the mirror and did not like what we saw: black hair, yellow skin, slanted eyes. The cruel face of the enemy. We were guilty now… No good… A dangerous people who could never be trusted again… On the street we tried to avoid our own reflections wherever we could. We turned away from shiny surfaces and storefront windows. We ignored the passing glances of strangers. What kind of 'ese' are you, Japanese or Chinese?"
They were told it was a matter of military necessity, the camps were an opportunity for them to prove their loyalty. It was all in the interest of national security.
Fear, every politician knows, is a powerful political tool. And, in the current electoral campaign in the United States, it is fear, once again, that is being wielded—in the guise of Liberty. Ah, the seductive power of Liberty! Liberty governs the quality of our material lives; it fires our spiritual ideals; our patriotism. Liberty it is that ensures happiness. And to pursue happiness, we must attain freedom from fear, and to do so we must eliminate the enemy. What happens, though, when you resemble the enemy?
For Muslims in America, it feels very much as if history is repeating itself. Muslim Americans are experiencing anti-Muslim bigotry at unprecedented levels, more so than in the aftermath of 9/11. Community organizations that are at the forefront of confronting Islamophobia agree that discrimination is heightened by government and state policies that view Muslims through a national-security lens. Fahd Ahmed, Executive Director of Desis Rising Up and Moving (DRUM), a working-class and youth-based organization in New York City, states: "Nowhere in history will you find a time of war creating an environment for open-minded and progressive thinking around inclusion and community-building. Wars inherently create suspicion, distrust, division and the environment for these ideas." Fifteen years of war have created the atmosphere in which politicians openly and vehemently denigrate Muslims and immigrants.
Nationally there are 3.3 million Muslims, who make up just under one percent of the population of the United States. One out of five Muslims in the U.S. lives in the New York region. Muslim youth are feeling the burden of representing their community. They're feeling the pressure to be politicized, to know more than their peers, to be representative of and knowledgeable about Islam and the politics of the region, because they find themselves under scrutiny and are asked such questions by teachers and peers. Muslim youth are experiencing a critical need to define their identity. Many are fearful of speaking another language lest they be perceived as not assimilating. Girls in hijab want to be seen beyond the barrier of this visual marker. Many are experiencing the challenges of poverty but feel isolated and marginalized. They want to express themselves and be heard. They want to counter the mainstream narrative, the biased slant of the media. A great number of Muslim youth who do not have a pre-9/11 reference point feel unwelcome in America. Linda Sarsour, Executive Director of the Arab American Association of New York, believes that if more Americans don't speak out, and if policy makers and government don't provide sufficient support, the country will fail its young American Muslims.
"We have a firm belief that institutional forms of violence actually create the platform for social violence," emphasizes Fahd Ahmed. He lists racial profiling, surveillance, anti-immigration policies, the targeting of South Asian and Arab students by school safety officers, and deportations and detentions as sending a message: there's something suspicious about these communities, something wrong.
In her book We Too Sing America, lawyer and activist Deepa Iyer writes: "Post 9/11 policies and the narratives used to justify them bear an eerie resemblance to those implemented during World War II. Their ultimate ineffectiveness … (in) fighting terrorism generate the inquiry: are these policies, in effect, ways to purge America of its ‘undesirable' immigrants?" Iyer advises that the country assess the impact of its policies on immigrant communities. "The state cannot both welcome immigrants and enforce the civil rights of people of color while simultaneously engaging in practices that justify wholesale profiling of these same communities. Instead, the state must hold itself to the highest civil and human rights standards at all times, especially during times of significant national turmoil. Otherwise, we risk losing our nation's core values and compromising the ideals that draw so many immigrants to America."
Today is Memorial Day, a United States federal holiday commemorating all those who've lost their lives while serving in the armed forces. I think of the Muslim American women and men who are currently serving in the United States military. To be oriented wholly by your conviction in the cause of Liberty, where faith and family and cultural roots and ancestry come second, is truly laudable. I can't celebrate Memorial Day without thinking of Noor Inayat Khan, the first woman wireless operator parachuted into Nazi-occupied France in World War II. As a Sufi, Noor believed in nonviolence, but she also believed in the right to freedom and felt it was a spiritual and moral duty to fight the horrors of fascism. The daughter of an Indian mystic and an American mother, Noor was born in Moscow and grew up in France. During her interviews with the British War Office, her interviewers asked questions and weighed her allegiance. Could they trust this brown-skinned 'Indian' woman? India was restless for independence. If Noor felt so strongly against occupying forces, what did she truly feel about Britain?
In truth, Noor was as French as could be, and she turned out to be extremely valuable to the Special Operations Executive, the British espionage and sabotage organization that recruited her. She worked for the Resistance in highly dangerous circumstances until she was betrayed and captured by the Nazis. Placed in solitary confinement for a year, she was then transferred to the Dachau concentration camp. There, over the course of a night, she was tortured and beaten. In the early hours of September 13, 1944, she uttered her last word, Liberté!, right before she was shot. In 1946, Noor was posthumously awarded the Croix de Guerre for bravery, and in 1949 the George Cross.
For those who look like ‘the enemy,' but are, in truth, valued members of this nation, who gave and are giving of their lives, let's remember them today. Muslims have fought for the United States in all her major wars, including as far back as the Revolutionary War, serving in the Continental Army under George Washington. At Arlington National Cemetery there are graves of Muslim soldiers decorated with Purple Hearts who died fighting in Iraq. On the granite pylons of the East Coast Memorial in Battery Park—bearing in alphabetical order the names of WWII air and navy servicemen who lost their lives in the Atlantic Ocean during combat and who now "sleep in the American coastal water"—I discover Anees under the A's, Khoury in the list of names beginning with K. Assimilation of non-European immigrants in America has been far from smooth; it's easier for those who pass as White. I think of the spouses and the children of those who died in battle, who are left to fight the war of survival, with its wounds of loss and rupture. I wonder, too, how it must be to give all of yourself over—body, mind, heart and soul— to the cause of Liberty, to a nation that accepts your labor and your life, but still perceives you, your children, in essence, as other?
The Immigrants call me back. I visit again and again. There's something prophetic about the sculpture. The patina of vulnerability on these figures is real. It points to a crisis in America—one that can no longer remain hidden in the shadows, where vulnerable immigrants are suspect and mistreated in the name of Liberty. On this Memorial Day, as we honor our lost brave, let us also remember the terrifying, late-night knocks on the door, the families that are riven by the violence of discriminatory policies right here at home. And let us not forget Liberty, who waits with her lightning-flame lamp beside the golden door.
between mountains and the sea (山海间)
I was recently reading a book by the dreadful Robert Kaplan on the topic of China and the South China Sea, in which the author suggested that Chinese culture exists in one of its purest forms in Malaysia. He argued that only in the overseas Chinese communities scattered around the Pacific Rim has Chinese civilization survived continuously, untouched by the tumultuous events of the Communist Revolution. Similarly, I have a friend, a political philosopher and expert in Chinese philosophy, who believes that it is in Japan and Korea that one can most easily find the artifacts of Chinese civilization--specifically Confucian philosophy. Japan is, after all, a place where a lot of cultural practices and material culture from China have been preserved. And not just from China, for many Silk Road artifacts are preserved in Japan as well; the country has long stood as a kind of terminus, lying at the end of the line in East Asia.
And speaking of Confucianism and the Communist Revolution, have you ever wondered why Confucian philosophy has such a bad name in the West? Largely unknown--except in its fortune-cookie format--the tradition is rarely fully appreciated, if it is recognized at all. This is partly because of its association with patriarchy and elitism--and this bad rap is something that was invented by the Chinese communists, who strongly discouraged Confucian thinking as counter-productive to the egalitarian ideals of the revolution. (They were especially worried by its patriarchal stance toward women.)
Personally, I've always thought this a shame, as Chinese philosophy happens to be one of the world's oldest living philosophies, one which has arguably impacted more human lives than any other philosophy, past or present. It stands as one of the world's greatest philosophical traditions, and it is my own personal belief that Chinese philosophy--in particular Confucian philosophy--is, more than any other tradition, the most compelling for what it tells us about the Good Life.
I was, therefore, thrilled to see all the press that Michael Puett and Christine Gross-Loh's new book is receiving. The Path became an Amazon best-seller immediately after its publication and soon began garnering reviews in major papers like The Guardian and the Wall Street Journal. Pretty impressive for a book on Chinese philosophy!
First, a word about the authors.
This is a true story. Several years ago a reader of these pages sent me an email with the subject line: "On the Far East, mindblowingly." The email contained a transcript of Michael Puett giving a talk in Korea about building a more enlightened state; namely, a kind of state that would be capable of better promoting human flourishing. I was delighted by this new friend's obvious excitement about Puett's ideas, as I knew that Michael Puett's lectures on Chinese philosophy at Harvard are so popular that they are now receiving attention from the general public as well. It is exciting indeed! The other author, Christine Gross-Loh, is even more exciting--if that is possible. A few years back, Christine Gross-Loh wrote the best book on child rearing I have ever read, called Parenting Without Borders. By presenting approaches to parenting from other parts of the world, she challenges American parents to think critically about what we are seeing in the form of over-parenting and the infamous helicopter parent. Both authors, therefore, are working to bring in different styles of thinking in order to challenge people to think about "absolutely everything."
So what does Chinese philosophy have to say to us, then?
Well, what if I told you that Puett and Gross-Loh agree with the philosopher Charles Taylor in seeing some of the ills of our modern secular age as stemming from Calvinism? From our concept of ourselves as unique individuals on a path to uncover our greatest potential, to our distrust of ritual and organized religion ("spiritual but not religious"): these are all ideas that derive ultimately from the Protestant Reformation and John Calvin. This might surprise people who think that our modern secular understanding of the self is more firmly rooted in the Enlightenment and the scientific revolution (as I once did)--but having read Taylor's A Secular Age, I am with Puett and Gross-Loh in rooting our modern secular understanding of the self in Calvin.
[For those who have not read Taylor's A Secular Age, don't lose any more time!]
But what does this have to do with the Good Life? Puett, as mentioned, made a name for himself among the students at Harvard for his incredibly exciting presentation of Chinese philosophy. He achieved this not by presenting its venerable history or the intricacies of its logic; rather, Puett excited students by using Chinese philosophy to stimulate them to think of their lives and their world in totally new ways. Sounds good, right?
Can you imagine a young person who has been raised to "follow their passions" and "be the best me you can be" listening to their famous professor telling them that being an authentic self is actually stressful; that authenticity is an illusion based on a flawed understanding of the self, since there is no such thing as a "true self"? What must they think, being confronted with the idea that humans are multi-faceted "works in progress"? Rather than being tortured by decisions and elaborate plans for the future, imagine him standing there telling Harvard undergraduates they might be better off "going with the flow." And what could these young people make of the idea of the transformational power of ritual?
Stop trying to find yourself! Stop deciding! Don't try to impose your will on everything, and whatever you do, stop taking your choices in life so seriously! Stop trying to define yourself all the time. And, perhaps most difficult of all: you must work to moderate your emotions. (Has there ever been a generation raised to so thoroughly indulge in dwelling on and expressing its own emotions as this one? I know, I am such an old lady...)
For all these reasons and more, this beautifully-written short book is a must-read.
And it is beautifully-written.
I guess my only question about the project concerns the limitations of presenting various concepts from a foreign tradition so totally out of context. Consider the Confucian commitment to the unity of thought and ritual action as a form of knowing. Confucius suggests that
知者樂水 仁者樂山 知者動 仁者靜 知者樂 仁者壽
The wise delight in water while the virtuous delight in the mountains.
A wise person is active and enjoys change, while a virtuous person seeks serenity and enjoys long life.
That is to suggest that in an enlightened world, both wisdom and virtue are necessary. The good life is a byproduct of the attainment of both wisdom and virtue, but according to all the great philosophers mentioned in the book, from Confucius to Mencius and Xunzi to Zhuangzi and Laozi--a person is really a person-in-context. Think of the Chinese characters for person (人) and personhood (人間). Packed right into the characters themselves is the notion of inter-personhood and that "no man is an island." There is no self-encapsulated self; rather, the philosophy is rooted in the idea that our personhood is based in interconnectedness, or person-in-context. This communal and inter-connected aspect of personhood is not a by-product of sagehood but rather what makes personal cultivation possible. I think it is very hard for some Westerners to fully grasp Chinese and Indian philosophy because of this significantly different approach to understanding what a self is.
To wit, in The Path the authors end their book with a short meditation on the problems inherent to American-style Buddhism. According to them (and I completely agree), American forms of Buddhism, rather than diminishing the ego, instead serve to prop up or strengthen an individualistic understanding of personhood. I have a friend, a scholar of Sanskrit and Buddhist philosophy in Korea, who calls Buddhism as practiced in the US the Path of Ignorance. Because it is unmoored from its context, with each person cherry-picking the parts of the tradition that "work for them," it can be problematic and indeed, as the authors suggest, counter-productive. So I couldn't help but wonder: what would save their book from a similar fate? Specifically, without the communal rituals and shared practices, how can any concepts from a foreign tradition avoid the fate we see in much of American-style Buddhism?
I am not sure how to answer this, but I do appreciate that they offer ideas drawn from the more natural notions of selfhood conceived in the philosophy. Maybe, much like William Irvine's A Guide to the Good Life: The Ancient Art of Stoic Joy (I loved that book!), The Path has tremendous riches to offer young people (and not-so-young people), precisely because it begins with a challenge to Western notions of the self. Does anyone not know a young person who, graduating from college, becomes petrified because they actually have no idea what to do next? The limitless choices they were promised are illusory at best, and at worst box them in to the point of paralysis. Haven't you ever wanted to shout at someone, "The world is unpredictable, and a person grows up by living 'as if,' not by seeking some kind of authenticity--just do something!"? Have you not found as you grow older that more and more you are at the whims of your emotions, and indeed that negative emotions undermine any sense of equanimity and serenity you once had? And what about the feeling that our lives have become more and more cut off from real-world experience, to the point that we feel enervated and unreceptive to new things? I personally feel that these are the most pressing problems I see in my own life--and maybe that's why I thought this slender little book carried a very big punch.
Sughra Raza. Scaffolding. April, 2016.
The Prescriptivist's Progress
by Ryan Ruby
This month, two minor controversies revived the specter of the "language wars" and reintroduced the literary internet to the distinction between prescriptivism and descriptivism. One began when Han Kang's novel The Vegetarian won the Man Booker Prize and readers took to their search engines en masse to look up the word "Kafkaesque," which had been used by the book's publishers and reviewers to describe it. Remarking upon the trend, Merriam-Webster noted sourly: "some argue that 'Kafkaesque' is so overused that it's begun to lose its meaning." A few weeks before, Slate's Laura Miller had lodged a similar complaint about the abuse of the word "allegory." "An entire literary tradition is being forgotten," she warned, "because writers use the term allegory to mean, like, whatever they want."
When it comes to semantics, prescriptivists insist that precise rules ought to govern linguistic usage. Without such rules there would be no criteria by which to judge whether a word was being used correctly or incorrectly, and thus no way to fix its meaning. Descriptivists, by contrast, argue that a quick glance at the history of any natural language will show that, whether we like it or not, words are vague and usage changes over time. The meaning of a word is whatever a community of language users understands it to mean at any given moment. In both of the above cases, Merriam-Webster and Miller were flying the flag of prescriptivism, protesting the kind of semantic drift that results from the indiscriminate, over-frequent usages of a word, a drift that has no doubt been exacerbated thanks to the internet itself, which has increased the recorded usages of words and accelerated their circulation.
Since the trials of the word "Kafkaesque" have already received ample coverage (by Allison Flood writing for The Guardian and Jonathon Sturgeon writing at Flavorwire), I'd like to turn my attention instead to the uses and abuses of the word "allegory" as described by Miller. Most of the time Miller is not one to quibble with the way people use words. But a recent spate of film reviews—one claimed Batman vs. Superman was an allegory for the primary contest between Ted Cruz and Donald Trump, another said that Zootopia was an anti-Trump allegory, a third called Jafar Panahi's Taxi an allegory of artistic repression in Iran—caused her to draw a line in the sand. "What people usually mean when they call something an allegory today is that the fictional work in question can function as a metaphor for some real-world situation or event," Miller writes. But allegory "is not just another word for metaphor."
Because one good quibble deserves another, allow me to point out that this last assertion isn't entirely accurate. The offending examples Miller lists are indeed abuses of the term. The first two films were made before the political events they are supposed to allegorize; the third simply is about artistic repression in Iran. But this is not because allegory stands in no relation to metaphor, it's because these particular films stand in little to no relation to what the reviewers claim they are metaphors for. If Miller is normally a descriptivist, it's quite difficult to understand why she has chosen to make an exception in the case of allegory, which Angus Fletcher, in his definitive study of the term, calls "a protean device, omnipresent in Western Literature from the earliest time to the modern era."
Miller takes the features of the medieval literary genre to define its limits. Unlike more realistic fictions, the characters of medieval allegory are personified representations rather than representations of people. The protagonist of a typical medieval allegory, let's call him Everyman, journeys from Doomville to Blisstown, encountering, along the way, such embodied abstractions as Truth, Justice, and Sin who act and speak truthfully, justly, and sinfully, helping our hero reach his destination or tempting him away from the right path. Beginning "in the waning years of the Roman Empire"—presumably with Boethius' Consolation of Philosophy (c. 524)—allegory, Miller claims, reaches its heights in works such as Guillaume de Lorris and Jean de Meun's Romance of the Rose (1275), Edmund Spenser's The Faerie Queene (1596) and John Bunyan's The Pilgrim's Progress (1678). Although she admits that the genre has largely been eclipsed by the realist novel, it lives on in the writing of C.S. Lewis, J.K. Rowling and Haruki Murakami, in the films of David Lynch and in the drawings of today's political cartoonists.
Unfortunately, this simplifies history to the point of falsification (and not just because The Divine Comedy does not figure into it). To fix a word's meaning, a prescriptivist should start with its etymology, lest her definition seem as cherry-picked as that of the descriptivists she criticizes. Allegory comes from the Greek words allos ("other") and agoreuein ("to speak openly"). Originally the word did not refer to a literary genre at all, but to a rhetorical mode. "In the simplest terms," Fletcher writes, "allegory says one thing and means another." Like irony, allegory exploits the natural polysemy of language. It's a kind of double talk that is especially useful under conditions of political censorship or in societies where blasphemy is a crime. Allegorical speech deploys figurative language to alert the hearer to the existence of a latent meaning beneath the manifest content of what is said. You would not be wrong to detect in agoreuein the word agora, the place where the Greeks came together to discuss politics. Nor would you be wrong to detect in Fletcher's paraphrase something akin to metaphor, which, to quote the prescriptivists at Merriam-Webster, is "an object, activity, or idea that is used as a symbol for something else." The English lexicographer Edward Phillips, writing in the same year that The Pilgrim's Progress was published, defined allegory as a kind of semantic "Inversion," derived from translatio, the Latin word for metaphor.
Allegory—"one of the foundations of Western literature"—is in fact much older than Miller suggests. The first known usage of the word can be found in the Moralia, a collection of essays by the Hellenist philosopher, biographer and literary critic Plutarch, who died in 125, four hundred years before The Consolation of Philosophy and over a millennium before The Romance of the Rose were written. According to Plutarch, the ancients called it hyponoiai ("under-thought" or "hidden ideas"). The most famous example from antiquity is of course the "strange image" in the seventh book of Plato's Republic (c. 380 B.C.). There, Socrates describes a society of imprisoned cave dwellers who take the shadows of things for the things themselves and relates what happens when one of them frees himself from his shackles and sees what the world beyond the cave is like. In what is variously known as the Analogy, Myth, Metaphor, or Allegory of the Cave, Socrates' story reveals itself to be a network of metaphors or symbols, wherein each element is meant to correspond to an element of reality as Plato sees it. Platonic allegory is a corpus symbolicum whose cells are metaphors. In so far as allegory and metaphor are different here, it is a difference of degree, not kind.
The same is true of allegorical reading. In Plutarch's time, allegorical exegesis of canonical texts, the Homeric epics above all, was a well-established critical practice, as philosophers demonstrated correspondences between the stories of Greek mythology and their own cosmological and ethical theories. In "How a Young Man Should Study Poetry," Plutarch instructed readers not to take the myths about the Gods in the Iliad and the Odyssey literally, but rather to interpret them as astronomical metaphors and symbolic prefigurations of Platonic ideas. Around the same time, a similar operation was being performed on the myths of Genesis by the philosopher Philo of Alexandria and by the early biographers of a parable-speaking preacher from Nazareth.
By focusing on medieval allegory, Miller takes a particular, historically situated usage of a word—albeit a well-known one—to stand in, synecdochically as it were, for a whole tradition of usage. The works Miller takes as emblematic of the form are actually deviations from and even inversions of this older tradition. The personages and places of these works are entirely literal; irony is absent from their narratives and metaphors are reified as proper names. When Lady Philosophy speaks to Boethius, or when Despair tempts Red Cross Knight with an argument about suicide, there's no need to wonder whether the author means anything other than what he says. All allegories alert their readers to the fact that they are allegories, but few do so as ham-handedly as Pilgrim's Progress. Nearly everything a reader needs to know about Bunyan's book can be found on its frontispiece (see above).
Bunyan turns the distinction between manifest and latent content inside out; then he dispenses with latent content altogether. In so doing he dispenses with the very feature that had distinguished the form for centuries (all the way back to the prophet Hosea in the 8th century B.C. if we are to take his word for it). The Pilgrim's Progress does not represent the form's culmination; it represents its decadence.
Miller is right to wonder if we are even capable of reading such books any more. Aside from children, who can still enjoy allegories as pure tales of adventure, contemporary readers are likely to prefer the round characters, psychological depth, moral ambiguity, and narrative complexities that are some of the hallmarks of the realist novel, which has been the dominant form of storytelling since the late eighteenth century. "Should a book or form present its argument so simply that even a child can discern it, what's left to talk about?" she asks. "Merely language, story, and imagery—all the pleasures that art is made of."
As a defense of allegory in the age of the novel, this is puzzling, to say the least. Having begun with an attempt to distinguish allegory from metaphor, Miller ends up arguing that pure formalism is the only way we can still appreciate the most didactic of all genres. The pleasures of language, story, and imagery were the very criteria by which Flaubert wanted his arch-realist "book about nothing" to be judged. For all the formal differences between a book like The Pilgrim's Progress and a book like Madame Bovary, the ideological literalism of medieval allegory is only a step away from the mimetic naturalism of the realist novel. In any event, stripping an allegory of its ideological framework in order to read it as "entertaining adventure yarn" isn't how the form stays relevant in the twenty-first century. It's how Dante's Inferno gets turned into a video game.
This reductio ad absurdum is the inevitable consequence of taking medieval allegory to exhaust the meaning of the term. More generally, it shows how a narrow definition of a word can be just as harmful to its meaning as overly broad usage of it. With a prescriptivist for a white knight, meaning hardly needs a dragon.
Grandpa, Proust, Ulysses and World War II
My paternal grandfather, Axel Benzon, was a Dane. He and his wife, Louise, immigrated to America early in the 20th Century. He was trained as an engineer, was educated in the classics, and took up photography and woodcarving. He ended his professional career as chief engineer of the main U.S. Post Office in Manhattan.
He kept a diary, the pages of which are generically entitled: “Leaves from my diary.” It’s not handwritten in one of those blank books one can buy at a stationery store. It’s typed on ordinary 8.5 by 11 paper. I’ve got a photocopy of much or most of it, but, judging by his index, not all.
In commemoration of this Memorial Day, May 31, 2016, I would like to share some passages from his diary, passages written just before the United States was drawn into the war. As you read these passages keep in mind that you are reading the reflections of a well-educated middle-class European who had immigrated to the United States.
But I want to approach the war obliquely. Let’s start with the best Western civilization has on offer. Here we have Grandpa commenting on Grandma’s interest in Proust (November 22, 1938):
Talking about books I think mama [his wife Louise] is on the way to become literary. She was interested in Anatole France some time ago and read some of his books, and now she is buried in Marcel Proust. Whether she is enjoying their language or their outpourings or both I do no know for she does not say much about it. Anatole France’s language is of course concise, clear and classically French and is therefore enjoyable …
… As to Proust it is said that the translation into English is so much better than the French edition that if it were retranslated into French it would be a much better book. The French language is not adapted to the outpourings of the quickly decaying spirit departing disillusioned from the splendor that was nothing less than a stinking dung heap as was the fate of Proust. He longed for what he thought was the highest he could think of on this earth; he found it and discovered it was rottenness. But just the same his description is more worth than Dos Passos’ description of the world as he found it in the twenties, to take an example.
Mama enjoys her reading more than she enjoys bringing up flowers or plants.
I just barely remember her. She died when I was quite young. Grandpa lived into my teens. I didn’t hear about Proust until I went to college, in 1969.
About a year later Grandpa fears for his homeland (14 April 1940):
Sunday and cloudy with occasionally a little snow–a good day to remain indoors and listen to the war news from Europe. These news are coming in frequently but are most confusing and it is difficult from the British and German dispatches to a form a true picture about the situation in all parts of Norway.
The Danish goose is cooked–there the Germans are in possession of all parts and are now fortifying points of vantage, especially the northernmost part of Jutland from where they can dominate a great port of Skagerak and Kartegat. [The Skagerrak strait between the Jutland peninsula of Denmark and Norway and Sweden; the Kattegat sea leads to the Baltic.]
The invasion of Norway was a masterstroke, no matter how it turns out. It gave evidence of the usual German thoroughness and precision and coupled with the fact that the German navy is so much inferior to that of the English it has been most successful and must have taken the English by surprise.
As you can imagine, his reflections are much occupied by the war. But not entirely so. For example, he also talks of his fondness for the game of golf and playing it on public courses in New York City—he lived in Jackson Heights at the time. I rather imagine that THAT land has long since been given over to building of one sort or another. In fact, at one point he mentions exactly that.
At one point he has copied one of his letters into his diary. He’d written the letter to one of his daughters, Karen, who apparently was visiting Denmark at the time, the time when the Germans entered the country. This entry is dated April 20, 1940, just a week after the previous entry:
You are affected by the insensate sacrifice to the voracious Moloch of the flower of youth driven to the slaughter by monsters whose greed can never be sated and by the senseless destruction of the fruits of toilers whose only earthly desire is to be permitted peacefully to toil as long as they can labor.
How are your aunts and cousins in Denmark, and how is aunt Kate in Oslo with her two boys? Pity for they have toiled and suffered for many years until lately they all felt reasonably secure to enjoy the fruits of their labor. Well–I often ask myself this question–and we cannot help, cannot even communicate with them.
But let these blows not deprive you of the desire to continue your own life as happily and peacefully as you are privileged to do. You are born in an age different from that in which your parents were born and in a country different from the little pastoral Denmark. The premature invasion into your time of an unbridled science which as a colt breathlessly has galloped over your era will in time be curbed and led into the field of anthropology where it will either destroy or make useful the parasitic growth that now is the cause of our folly and inhumanity.
Notice the contrast between “pastoral Denmark” and “unbridled science.” I wonder just exactly what was on his mind there for, as an engineer, he was himself a man of science. I wonder what he would have thought about the “shock and awe” of the 2003 American invasion of Iraq or of the drones so beloved by our first black president, the one who received a Nobel Prize for Peace, and who is also the first sitting president to have visited Hiroshima?
In a letter of May 19, 1940, Grandpa writes about one of his fellow expatriate Danes:
Some time ago when Bang from Baltimore visited us he lamented about the poor condition in which Denmark was situated with respect to defend herself against an aggressor. The Finnish war was on at that time and we were filled with reports about the bravery of the Finns. The Danes could do as well and it would be better to go down in glory than to give in without a fight.
Poor Bang, he still lives in a world of illusion. He did not see that the news we received from Finland were all highly colored and that Finland was doomed. And still, he wanted Denmark to defend herself from German invasion.
I wonder what he’d think about The New York Times reporting on Iraq, or Afghanistan, or Syria. Would he think that the mainstream media now has been as lost in illusion as his countryman Bang had been back then?
And yet the mail must go on (June 8, 1940):
With all this misery in Europe things are quiet at the Post Office. Mail is not heavy and we can take our vacations knowing that there are hard times ahead of us so far as money is concerned. We must be glad if our salaries are not cut, for that in addition to increased taxation will be hard to bear.
He was waiting for the war to get worse. What are we now waiting for? What are the chances that the undeclared war on terror will end before the Statue of Liberty is claimed by the rising sea? I’m pretty sure that Grandfather would have had little trouble accepting the data on climate change.
War brings immigrants, a fact which is painfully and tragically evident these days (August 14, 1940):
At the Post office we are preparing for registering the aliens. This gives me more work for we have to build a number of typewrite desks and other things that have to be used. We do much work in the Post Office other than handling mail.
I don’t know what Grandpa would have thought about all the Arab immigrants who’ve been fleeing to Europe these days.
Here he alludes to the northwestern corner of the Roman Empire:
Incidentally I listened to [H.G.] Wells the other day over the radio and was shocked to hear how feeble was his voice–hardly distinguishable–but the old radical spirit was there undaunted–he really sounded as were he speaking from one of the many and deep shell holes dug by the barbaric German bombers in the relics from the old Londinium.
There's that late 19th Century education for you, and he was educated as an engineer, not a preacher or a diplomat.
But it’s not all war. Here Grandpa talks about more mundane matters (September 8, 1941):
From Billy we finally got words today. They have moved and are now settled in the town [Johnstown, PA]. It was not all good news in his letter for Betty’s mother is bedridden with a bad heart and his former landlady presumably has cancer.
He has further more lost his nice golf clubs–they were mislaid by a caddie in a wrong automobile when he went in for a drink and now after ten days he has not gotten them yet. That is a serious loss and I sympathize with him for he had a very good set of clubs.
Billy was my father and Betty, his wife, was my mother.
Grandfather goes on to report on a book he’s been reading, The Managerial Revolution by Burnham (whoever he was), that offers “another alternative to capitalism than socialism namely the ruling of the country by a new class of managers.” He says a bit more about the book and then: “I agree with him in most of his points, but if that is not socialism as I understand it then I do not know what it is. Socialism as he defines it is the Utopia which, if we should try to establish it now would be anarchism and chaos.” I don’t think Grandfather had much objection to socialism, though I rather suspect he wouldn’t think too much of the financial managers who run the world these days. If he were alive today, would he feel the Bern?
At last, as he continues talking about his reading, he closes with this:
Also a book by Frank Buck, the animal dealer and a novel by Storm, Count Ten, which I should read at least twice in order to understand it properly. The style is somewhat like that of Ulysses and it deals with a man who does not know what he should do but tries his utmost to live a life of decency wherein he can retain his self-respect.
Never heard of this (Hans Otto) Storm or his novel, but the Internet of course has. He was a Stanford-educated engineer; Count Ten was his third novel. Edmund Wilson thought it inferior to Storm’s previous two, Pity the Tyrant and Made in U.S.A., but found material of interest in it:
Implausible though a good deal of it is, it evidently makes use of actual experience; and the experience of Hans Otto Storm has been of a kind rather unusual among our fiction-writers. In the first place, Mr. Storm, though a radical, is not, like so many other novelists, a radical of the depression vintage. He is–one gathers from Count Ten–the descendant of German refugees of the Revolution of 1848 settled in Southern California. The hero of his novel, at any rate, begins by going to jail for resisting the draft in the last war and ends by going to jail again as the result of his activities as campaign manager for a movement evidently drawn from Upton Sinclair’s EPIC. He has, in the meantime, had a successful career as an agent of the mining interests.
Commenting on the fact the Storm is not a writer by vocation, but an engineer, Wilson observes:
An engineer who thus goes in for literature is such a novelty that Hans Otto Storm is able to carry us with him because we have never listened to precisely his story before. His writing about the sea–in Made in U.S.A. and in the episode of the yacht in Count Ten–without the parade of technical knowledge which is the betrayal of the layman in Kipling, gives us a much more intimate sense of living the life of the ship than we get from The Ship That Found Herself or The Devil and the Deep Sea.
But this is a digression. It wasn’t Grandpa’s reference to a forgotten book by a forgotten writer that caught my eye. It was his reference to Ulysses, a celebrated book by a celebrated writer, though a book that, in my experience, is mostly read by college students and their teachers. And yet there it is, in Grandpa’s diary, mentioned as though any well-read person would know it.
That’s what was on Grandpa’s mind on September 8, 1941, my father’s lost golf clubs and a forgotten book in the style of Ulysses. Two months later, on December 7, 1941, here is what’s on Grandpa’s mind:
It is cold today on this Sunday but the wires or rather the air is hot with reports about the attack of the Japanese air forces upon Hawaii this morning when five civilians and apparently three hundred fifty soldiers were killed. It is also reported that a large battleship was set afire and two others sunk …
The Dutch East Indies and the republic of Costa Rica have declared war on Japan.
10 pm. Canada has declared war on Japan.
He must have been typing while listening to the radio. A day later, December 8, 1941, America too declared war on Japan.
by Akim Reinhardt
Hotter. I need it to be hotter.
I'm sitting in the backyard of my sister's carriage house apartment in Orange, California, a circle of jolly boutique and micro brew quaintness amid the sprawling shit hole that is Orange County.
Of course nowadays, most any place in America afflicted by people is a shit hole. Indeed, even a quotient of the unpopulated spaces is beginning to emit a fecal stench, as if the human foulness emanating from the peopled portions of our nation is so strong as to waft and stain everything around it, like a halo of shimmering, homo sapiens stank.
I want it to be hotter.
After all, there are no more distinct places in the United States, or precious few at any rate. Instead, there are just types. The urban playground loaded with bars and restaurants, and kickball and skeeball leagues for childless 20- and 30-somethings; the poor and working class black and brown food deserts that gird the yuppies and empty nesters; the little towns hemorrhaging people, stragglers holding onto the local bar like shipwreck survivors grasping a buoy in the ocean; the increasingly opulent college towns full of precious students, microcosmic training yards for the urban playgrounds; the tourist spots offering up overpriced drinks and glossy nostalgia; all of it bound together by highways, those endless concourses of fast food, gasoline, and the occasional pile of roadkill.
But all of those types are just islands scattered about the uber-type, that oceanic wasteland of suburbia and its relentless waves of roads, strip malls, and tract housing, repeating itself over and over again like the backdrop of a cheap 1970s cartoon where a boring bipedal cat, arms outstretched, chases a smarmy little mouse who's certainly got it coming, but predictably manages to perpetually escape the fanged horror it deserves, thus prolonging the crankshaft repetition of house tree fence; house tree fence; house tree fence . . .
And all of it, every last bit of it, shot through with shitty chain outlets. Your uppers, your downers, your food in wrappers and boxes, your slave labor clothing, your mega stores, your tech shacks, and your money huts, all of them speckling the landscape like aggressive tumors mindlessly devouring their host.
No more places. Just types.
And now I'm in this type. The southern California backyard, walled off from everything but the murderous sun, several blocks from the bubbling dot of a used-to-be-an-actual-town-center-but-is-now-a-bourgeois-simulacrum-of-a-town-in-the-form-of-antique shops-and-almost-interesting-food, itself a lonely island amid the yawning expanse of ubiquitous sprawl.
And I'm wishing it were hotter.
When I'm in southern California, I prefer to do my writing outside, half-naked and sweating onto a laptop. There's something about those cinder block privacy walls and the endless, arid sunshine that puts an even cruder bent to my degeneracy than I'm apt to feel elsewhere. Nothing matters here. That's what everyone strives very hard to convince themselves of.
Truth be told, they're more neurotic than a bespectacled Upper West Sider stumbling out of a therapy session. But their biggest neurosis of all is the gut wrenching need to believe they're not neurotic. So they wear flip flops and self-medicate with weed or wine if they're not partial to pills, and vaguely intimate that the official street food of the West Coast, the burrito, is inherently more relaxing than the official street food of the East Coast, the slice and/or the hotdog.
They try so hard to not give a shit. But they're failing miserably, and deep down they know it, which is why they shudder at the sight of my wiry salt and pepper maw. Yet making them twitch isn't as much fun as it used to be, so I don naught but a pair of stained gym shorts, retreat to the walled off yard, bang on the keyboard, and occasionally pee on the fig tree.
If there's anything to care about in Orange County, it's the doughnuts. I'd say the Mexican food, but there are a lot of places you can get good Mexican food. However, the man tells me there are nearly 300 independent doughnut shops in this wide eyed paean to sunshine and orange juice. Why they insist on spelling it "donut" is beyond me, but either way, fried dough is OC's saving grace. A great doughnut can revive the soul. Hell, a merely good one is enough to ward off genital warts.
From here, I head north to the Bay Area. For a long time, San Francisco was a unique spot on the map. I remember Johnny Carson making late night fag jokes about the place back when most Americans thought "a little light in the loafers" was an inherently funny phrase. Then again, they also thought the Village People were just some theatrical young men. If ignorance is bliss, then innocence is the white, faux-suede gloves we use to hide the blood on our hands.
Before it was a gay Mecca, San Francisco helped invent the hippie subculture. Some nice things came out of that. "White Rabbit" is a helluva song, and while no one wants to admit it, those patchouli-reeking bastards were right: deodorant will kill you in the end.
Then again, Raoul Duke probably hit it flush when he deemed that whole scene a failure: just another orgiastic Baby Boomer sideshow that disavowed both politics and serious art, while drugs became the goal instead of the pathway. Drifting pot heads morphed into homeless junkies; from naive and directionless to mean and chintzy. All of it self-absorbed.
It's not enough to turn me into a reactionary Conservative, but between you and me, I didn't really give a shit when Nicholson got stomped in Easy Rider.
Before the hippies, Frisco (a name the natives detest, which is why I use it) was a crazy patchwork quilt of misfits and castoffs. The Italians, the Chinese, and various other tightrope walkers balanced themselves along the fine line and managed to cobble together a vibrant urban space despite the race riots and lynchings.
Go back far enough and the place was a 3-2-1 liftoff spot for the genocide of California's Indigenous peoples. That level of evil, it marks you. Sets you aside as, if not unique, then goddamn special in ways too wrong to remember, which is why most Americans live in daily denial.
But that was a long time ago, before the hoary dot com bubble bloated and burst like an inflamed corpuscle. Of course that wasn't the end of it; the pus oozed and the infection spread. During the last two decades, Silicon Valley has reshaped the entire region by flooding it with the kind of callow money that makes the con game shoot all the angles until every loser thinks they're a winner and every winner is an insufferable boor.
Not all money's created equal. Don't believe me? Wait til the day comes when they throw your filthy lucre back in your face like a zoo ape flinging feces at the plexiglass.
Either way, the bottom line for the City by the Bay is the same as everywhere else. Its vast metroplex is just another melange of types, from the world class playground in the middle, to the archetypal preciousness of Berkeley, to the Oakland food deserts shrinking in the face of gentrification, and finally the aching morass of suburbia surrounding it all.
We'll stay for two days. Maybe I'll catch a ball game. Maybe I'll blow my brains out and file next month's venomous screed from the grave. 3ZombiesDaily, motherfucker.
Living or dead, after the Bay I'll make my way to Reno, Nevada. The Biggest Little City in the World, they like to call it. I guess that's because they still got trains hauling silver from somewhere to somewhere else passing through downtown and blowing their horns in the middle of the night. But the hustlers and whores are mostly gone, the 24 hour chili dog was never that good, and the usual creep has crept through the place just like every other place. So to hell with it. One night at a locals casino, room courtesy of a local friend with points up the wazoo, and then on to the great adventure across a continent.
We'll head east and follow a tendril of highway out to the dry void, that grand expanse of the West which, unlike Phoenix, SoCal and Vegas, isn't raping the environment for hundreds of miles around in the quest for water so they can turn the desert into suburbia.
Somewhere in Utah we aim to find the remains of a WWII Japanese-American internment camp. A rotting reminder that while it's all the same now, being a special little snowflake wasn't always a good thing.
Afterwards we'll trek on, with stops in eastern Wyoming, eastern Nebraska, and whichever god-forsaken Midwestern motel we collapse in before finally returning to Baltimore.
It's good to return to Baltimore. Baltimore knows what it is and what it ain't. And while the is can sometimes leave you wanting, at least the ain't is honest.
Akim Reinhardt's website is ThePublicProfessor.com
Monday, May 23, 2016
Kind Of Like A Metaphor
"I got my own pure little bangtail mind and
the confines of its binding please me yet."
~ Neal Cassady, letter to Jack Kerouac
One of the curious phenomena that computing in general, and artificial intelligence in particular, has emphasized is our inevitable commitment to metaphor as a way of understanding the world. Actually, it is even more ingrained than that: one could argue that metaphor, quite literally, is our way of being in the world. A mountain may or may not be a mountain before we name it - it may not even be a mountain until we name it (for example, at what point, either temporally or spatially, does it become, or cease to be, a mountain?). But it will inhabit its ‘mountain-ness' whether or not we choose to name it as such. The same goes for microbes, or the mating dance of a bird of paradise. In this sense, the material world existed, in some way or other, prior to our linguistic entrance, and these same things will continue to exist following our exit.
But what of the things that we make? Wouldn't these things somehow be more amenable to a more purely literal description? After all, we made them, so we should be able to say exactly what these things are or do, without having to resort to some external referents. Except we can't. And even more troubling (perhaps) is the fact that the more complex and representative these systems become, the more irrevocably entangled in metaphor do we find ourselves.
In a recent Aeon essay, Robert Epstein briefly guides us through a history of metaphors for how our brains allegedly work. The various models are rather diverse, ranging from hydraulics to mechanics to electricity to "information processing", whatever that is. However, there is a common theme, which I'll state with nearly the force and certainty of a theorem: the brain is really complicated, so take the most complicated thing that we can imagine, whether it is a product of our own ingenuity or not, and make that the model by which we explain the brain. For Epstein - and he is merely recording a fact here - this is why we have been laboring under the metaphor of brain-as-a-computer for the past half-century.
But there is a difference between using a metaphor as a shorthand description, and its broader, more pervasive use as a guide for understanding and action. In a 2013 talk, Hamid Ekbia of Indiana University gives the example of the term ‘fatigue' used in relation to materials. Strictly speaking, ‘fatigue' is "the weakening of a material caused by repeatedly applied loads. It is the progressive and localised structural damage that occurs when a material is subjected to cyclic loading." (I generally don't like linking to Wikipedia but in this instance the banality of the choice serves to underline the point). Now, for materials scientists and structural engineers, the term is an explicit, well-bounded shorthand. One doesn't have pity for the material in question; perhaps a poet would describe an old bridge's girders as ‘weary' but to an engineer those girders are either fatigued, or they are not. Once they are fatigued, no amount of beauty rest will assist them in recuperating their former, sturdy (let alone ‘well-rested' or ‘healthy') state.
The term ‘fatigue' is further instructive because it illustrates the process by which metaphor spills out into the world. If a group of engineers is having a discussion around an instance of ‘fatigue', their use of the term in conversation is precise and understood. This is a consequence of the consistency of their training just as much as its relevance to the physical phenomenon. After all, it's easier to say "the material is fatigued" than "the material has been weakened by the repeated application of loads, etc." But the integrity of a one-to-one relationship between a word and its explanation comes under pressure (so to speak) when this same group of experts presents its findings to a group of non-experts, such as politicians or citizens. Of course, taken by itself, the transition of a phrase such as ‘fatigue' does not have overly dramatic implications. What it does do, however, is invite the dissemination of other, adjacent metaphors into the conversation. Soon enough ‘fatigue', however rigorously defined, accumulates into declarations of the ‘exhausted' state of our nation's ‘ailing' infrastructure. There are no technical equivalents to these terms, which call us to action by insinuating that objects like roads and tunnels may be feeling pain, whereas at best we are the recipients of said suffering.
Intriguingly, the complexity of this semiotic opportunism ramps up quickly and considerably. Roads and bridges may be things that we have built, but they still exist in the world, and will continue to exist whether we fix them or not. They may remind us of our success or inadequacy, but their intended purpose is almost never unclear. On the other hand, there are other things that we have built, things that exist in a much more precarious sense - it may even be a stretch to call them objects - and whose success qua objects is also much more variable. This is where we find computation, software and artificial intelligence.
The purpose of computation, broadly speaking, is to perform an action - some kind of service, or analysis, that may or may not be regular (in the sense that it can be anticipated) and is rarely, if ever, regulated. In the world of infrastructure, you either make it across the bridge or you don't, and there are regulations meant to ensure a positive outcome. As Yoda advises, "Do or do not. There is no try." But computation is different. I am not talking about something linear, like programming a computer to add two numbers. With a search engine, for example, you may find the information or not; or what you find may be good enough, or you may think it's good enough but it's really not, and you'll never know. The service, or rather the experience of the service, becomes the object; the code, which is perhaps the true object, is obscured from your view. And we tend to be poor at processing this kind of ambiguity, and when faced with ambiguity we reach for metaphor as a sense-making bulwark against the messiness of the unknown.
As we expect more of our computing technologies, the ensuing purposes also shift temporally. Our software models the world around us, and the way in which we inhabit the world. As such, its utility is displaced into the future: we value it for its predictive nature. We want it to anticipate not simply what we need right now (let alone what we needed yesterday) but what we might want tomorrow, or six months from now. At this point we find ourselves squarely in a place of mind. That is, we expect our inventions to become extensions of ourselves, because we cannot seem to make the leap that something non-human can have any chance of assisting us at being better humans. Software (and specifically AI) is singularly pure in this regard, although traces already exist in previous technologies. So while we don't worry about making our bridges anything more than functional and, somewhat secondarily, aesthetically pleasing, we tend to additionally attribute human-like traits to ships, perhaps because we perceive our lives as much more committed to the latter's successful functioning. But while we may ascribe personality to ships, we go a step further and come to expect intelligence of the software that we make: witness the proliferation of chatbots and personal assistants, to the point that we can now consult articles about why chatbot etiquette may be important.
In the meantime, these technologies themselves are being generated via metaphor. After all, these are exceedingly complex pieces of software, designed, implemented and refined by hundreds of software engineers and other staff. It is inevitable that there should be philosophies that guide these efforts. According to Ekbia, every one of the ‘approaches' is fundamentally metaphorical in nature. That is, if you decide you're going to write software that will appear intelligent to its users, you have to put a stake in the ground as to what intelligence is, or at least how it comes about. And since we haven't really figured out how intelligence arises within ourselves to begin with, we wind up with a series of investments in a mutually exclusive array of metaphors.
Is intelligence symbolic, and therefore symbolically computable? People like Stephen Wolfram would say yes. Or perhaps intelligence arises if you have enough facts and ways to relate those facts; in which case Cyc and other expert systems are your ticket. Another approach to modeling intelligence has been getting the most press lately: the idea of reinforcement learning of neural networks. (Of course, this last one models how neurons work together within our own brains, so it is a double metaphor.)
The point is that all of these ‘approaches' are metaphorical in substance. We still have not been able to resolve the mind-body problem, or how consciousness somehow arises from the mass of neurons that are discrete, physical entities beholden to well-documented laws of nature. And even though lots of theories of mind have been disproven, the fact that we cannot agree on the nature of intelligence for ourselves implies that any idea of what a constructed intelligence may be is, by definition, a metaphor for something else. Science can avail itself of the luxury of not-knowing, of being able to say, "We are fairly certain that we know this much but no more, and these theories may or may not help us to push farther, but they also may fall apart and we'll have to start over". Technology, on the other hand, must deliver a solution - something that works from end to end. In the case of AI, where models must be robust, predictive and productive, the designers of a constructed intelligence cannot say, "Well, we know this much and the rest happens without us understanding it." Your respect for the truth results in no product, and a lot of angry investors. So metaphor in this sense is not a philosophical luxury, it's how you're able to ship any code at all.
Where things get really interesting in this kind of a world is when the metaphors start getting good at producing results. So now we find ourselves in a very weird situation. There are competing metaphors out there in the computational wild: symbolic, expert, neural network systems, as well as others. Increasingly, hybrid systems are also appearing. What if some or even all of these approaches succeed in functioning 'intelligently'? I have to put the word in quotes here, because it's pretty clear that, without a mutually agreed-upon anchoring definition, we have ventured into some very murky waters. These waters are made all the more turbulent because technology's need to solve problems for us (or perhaps to also create them) will continue to push what we consider as viably or usefully 'intelligent'.
The fact is that no AI outfit or its investors will sit around waiting for the scientific community to settle on a model for cognition and then proceed to build products consistent with that model. The truth is nice, but there are (market) demands that need to be met now. If science can supply industry with signposts on how to build better technology, great. At the same time, if the product solves the clients' or users' problems then who cares if it's really intelligent or not? Recall the old adage: Nothing succeeds like success. The tricky bit is that, with enough such success, our very definition of what is intelligent may be on the verge of shifting. Next month I'll look at the implications of living in a world awash in these kinds of feedback loops.
S. Abbas Raza. Untitled, 2016.
ZOONOTIC TALES: LIVING WITH ROACHES
by Genese Sodikoff
There is the nightmare of fecundity and the nightmare of the multitude. There is the nightmare of uncontrolled bodies and the nightmare of inside our bodies and all over our bodies. There is the nightmare of unguarded orifices and the nightmare of vulnerable places. There is the nightmare of foreign bodies in our bloodstream and the nightmare of foreign bodies in our ears and our eyes and under the surface of our skin.
—Hugh Raffles, Insectopedia
I am writing anthropological stories of zoonosis, disease that spills over from animals to humans and then potentially spreads person-to-person. A zoonosis may erupt into an alarming epidemic (Ebola, HIV/AIDS), or may idle in a reservoir host as an ever-present threat (rabies, Lyme disease, hantavirus). Insects often vector these diseases by sucking up the tainted blood of an animal and injecting it into human skin. Zoonosis can encompass parasitic infections too, such as when larvae afloat in the drinking water or nestled in the litter box penetrate our bodies and mature into worms that make us sick. By some definitions, zoonosis and vector-borne diseases are distinct categories, even though viruses and bacteria introduced by insects into human populations may have originally been lifted from an animal.
Beyond the role of vector, there's another kind of insect that acts more as a disease server. It wears pathogens like foundation, coated with bacteria, viruses, fungi, and larval cysts, as it goes about its business. Chief among these is the cockroach, whose glossy cuticle teems with unwholesome microbes. Since the cockroach does not convey pathogens from vertebrate animals to humans, it does not transmit zoonotic disease, properly understood. Instead it traffics pathogens that are just out there, free floating in the dwellings and detritus of humanity, and deposits them on our food and our wounds. Cockroaches are responsible for introducing Staphylococcus into hospitals and spreading antibiotic-resistant bacteria. They sprinkle kitchen counters and cabinets with Salmonella, Shigella, and E. coli. They truck Hepatitis A from sewers into homes. If that isn't enough, their odiferous droppings and sloughed-off skins trigger asthma attacks. The list goes on.
Several centuries ago, the ancient insect order, Blattodea, embedded itself in our history as we began voyaging overseas. Periplaneta ("around the planet") is the genus to which several pest species belong. Drawn to our dwellings and slop, cockroaches became our shadow society: well fed, enamored of our stuff, and habituated to the dark, moist spaces we create. We, in turn, have adapted to urban life with cockroaches in various ways, none of them evolutionarily positive other than being self-defensive. By laying sticky traps and poison pellets and carrying out insecticidal spray regimes, we keep roaches at bay. Mostly we have learned to push them into the dark crevices of our consciousness as much as possible because they are legion, and they are disgusting.
Yet also fascinating, as I have been learning from my colleague and friend Jessica Ware, evolutionary entomologist at Rutgers University, Newark, who was part of the team that recently identified a new invasive species, the Japanese cockroach (Periplaneta japonica), in Manhattan. Certain roaches, says Dr. Ware, have developed cunning responses to getting stomped on. Cockroaches carry their eggs in an ootheca, a kind of "suitcase with a hinge," which some females can release from their bodies upon being smushed, leaving hundreds of larval progeny to mature on their own. When ready to hatch, the baby roaches heave a collective breath, pop open the hinge, and stream out.
Although we equate them with filth and disease, says Rutgers University-Newark doctoral student Xueyang ("Sean") Fan, cockroaches are themselves fastidious creatures, obsessively grooming their antennae—their sensory organs—by swiping them through their mouths. Another fact: if a cockroach loses its head, it can live for about a month breathing through spiracles on its body segments, finally dying from hunger and thirst.
Cockroaches (from the Spanish, cucaracha, "crazy bug") probably arrived in the Americas on slave ships, Dr. Ware tells me. The theory is that the American cockroach (Periplaneta americana) sailed the triangular trade route of slavery, embarking on ships in West Africa, and settling in the Americas and Europe by around 1625.
I have seen the old drawings of slave ships that crossed the Atlantic, of tortured African bodies crammed horizontally onto lower decks like sardines, or else sitting upright and wedged between each other's legs. These were the packing strategies slavers used to hedge their bets on how much human chattel would survive the Middle Passage. Millions died. Imagine now, in more specific detail, the captives lying there, starving, for weeks at sea on fouled floorboards shared with swarms of cockroaches that sipped any moisture from their parched mouths, nibbled their toenails as they slept, and ferried bacteria from body to body, ensuring an endless cycle of gastroenteritis ("the flux") amongst them.
Waves of immigrant cockroaches arrived in the Americas and scattered into gradually emerging cityscapes. Through time, they have established genetically distinct enclaves in buildings that are nested within ethnically subdivided cities. All these formations are constantly in flux as bodies migrate. Roaches colonize new territory whenever they crawl into a handbag or get packed up in boxes with our possessions.
Cockroaches in hospitals pose a serious problem. They circulate pathogens around stitched-up cuts and open wounds, wreaking havoc on the healing process. German cockroaches extracted from US hospitals were found to carry a pathogen load of bacteria that cause wound infections, diarrhea, food poisoning, conjunctivitis, gas gangrene, leprosy, sepsis, typhoid, skin and organ infections, and pneumonia. The problem has worsened with the rise of antibiotic-resistant bacteria, like MRSA, which do not harm cockroaches in the least but kill approximately 50,000 people per year in the US.
Urban roaches have the upper hand in our skewed symbiotic relationship. But out in the wild, cockroaches are very different creatures. Entomologists at Rutgers University, New Brunswick keep bins of cockroaches in darkened rooms of the department's basement. Wlodek Lapkiewicz has an assortment of exotic roaches from other continents. Unlike the domestic varieties, the exotic cockroaches carry no pathogen load, living as they do in unsettled spaces: the Australian outback, South American caves, and tropical rain forests. They are "clean." Some are kept as household pets, and others make good food for other pets. A few species, like the giant burrowing cockroach of Australia, take several years to mature and reproduce (unlike their more fecund distant cousins). These can fetch prices of $100 or more per insect.
The exotics vary widely in appearance. The slight banana roach is grasshopper green. The tan "peppered roach" has speckles of dark brown. The brown Dubia roaches, popular feeders for reptiles, "don't stink," Wlodek says appreciatively. (Cockroach colonies otherwise give off a musky, fungal smell, the telltale clue that your place is infested.) The Giant cockroach, Blaberus giganteus, native to Central and South America, has a wingspan of up to 6 inches across. The black "Question Mark" roach sports gold punctuation marks on its back. The "Domino" roach's name speaks for itself. An evil jack-o-lantern grins on the thorax of the "Death Head" roach. The "Halloween" roach resembles a brooch, banded in gold, orange, and black. The Lucihormetica luckae glows in the dark if it eats the bioluminescent flor de coco mushroom of Brazil, making it look to predators like a toxic click beetle and to humans like a tiny, gleaming-eyed alien.
There is nothing remotely beautiful about the common American, German, and brown-banded cockroaches. Even though most of us can't identify the species, we know a roach when we see it. The entomologists at Rutgers are working diligently to control urban roach populations. Chen Zha, a doctoral student working under Dr. Changlu Wang, traps live roaches in apartment units and nursing homes (restaurant owners do not, as a rule, choose to participate in these studies) and brings the live specimens back to the lab, where they are placed in bins and fed peanut butter and jelly. "They eat better than I do!" said one masked grad student as he rearranged the bin habitat with a pair of tweezers. The masks protect the scientists from inhaling an excessive amount of roach allergens. Chen traces how many roach generations it takes for insecticide resistance to evolve and concocts the most effective chemical brews to keep them in check.
Meanwhile, on the Rutgers-Newark campus, Sean Fan, who works under Dr. Ware, sets sticky traps for his research on cockroach genetics. Sticky traps kill the roaches, but Sean only needs their DNA. The genes reveal the effects of a group's habitat, be it the lunchroom of a university office or a nearby store, as well as the degree of inbreeding going on. The quality of the roach's diet and the level of insecticide soaked up by the living space are reflected in the genes, as is the pathogen load the insect carries. Sean sorts roach genes from bacteria after pulling off the roach's leg and submerging it in a solution that makes the cells explode. "The DNA flows out of the open end of the leg (which used to be attached to the body) and into the solution," Dr. Ware explains. Pathogens can also be collected from the roaches' wing surfaces and guts.
Although cockroaches are immune to the bacteria that sicken humans, they do suffer from other ailments, such as fungus and mites. Certain bacteria have co-evolved with roaches as "endosymbionts," living inside their body fat and colorless bloodstream and enabling roaches to sequester and store nitrogen, which they need for growth, from even the scrappiest of diets (roaches can survive on meals of feces and Styrofoam).
If urban roaches have any redemptive value for us, it lies in their resistance to superbugs. The rise in multidrug-resistant bacterial strains challenges scientists to ferret out new sources of antibiotics in nature, which, unlike synthetic drugs, have nuanced properties that make them more effective. Where better to look than at insects crawling around the pathogenic stew of hospital surfaces? A few years ago, scientists discovered antimicrobial activity in cockroach brain tissue, which explains their resilience to unsavory environments. The cockroach brain contains a rich vein of antibiotics. We just need to figure out how to mine it.
This is life in the late Anthropocene, the age of human dominance on the planet, an age of blighted landscapes, warming temperatures, mass extinction, overcrowded cities, zoonotic pandemics, superbugs, nuclear fallout, and mind-boggling advances in biotechnology. The irony is that a creature that flourishes in our waste and toxic residues, one that has come to symbolize life in a post-apocalyptic, peopleless landscape—the Cockroach, the Survivor—may well prolong our lives a while longer.
Johnny B.A.N.G. Reilly, Being Free
by Olivia Zhu
When I first get to meet Johnny B.A.N.G. Reilly, he looks tired. Really tired, leaning back away from his computer screen with most of his head cocooned tight in a sweatshirt hood. The light is wan, his hood is grey, and his famous voice is at first raspy and subdued. As quiet as he is, though, Reilly speaks in punctuated, verse-like phrases. His responses to my questions seem to arrive as fully formed from his head as do the spoken word, “visual” poems he has become known for.
Chief among these is “Dear Brother,” a spec ad for Johnnie Walker created by two students, Daniel Titz and Dorian Lebherz. Since the video was uploaded half a year ago, it has amassed over four million views and plenty of praise—including some for the haunting poem and voiceover by Reilly.
“Dear Brother” was, in fact, how I learned about Reilly in the first place. He somehow has the ability to sound joyous and heartbroken in the same breath, with words timed so they roll out perfectly at the last possible second to still sound melodic. That perfect rhythm might be attributed to his time as a street dancer, or as a mixed martial artist. Yet “my rhythm comes from what’s actually beating in my chest,” says Reilly. After suffering a heart attack due to a former drug habit, he experienced irregular heartbeats that sped up and slowed down, informing the cadence of his poems. He rushes and pauses and sometimes drops single syllables, leaving them to float amidst longer phrases.
The timing, the gravel-in-the-sun voice—they make Reilly’s work distinct. However, the YouTube video makes it clear the filmmakers who created “Dear Brother” credit themselves, along with Reilly, for the creation of the poem. In the comments, they note that “It was written by voice actor John “Bang” Reilly in collaboration with us.” Reilly disagrees.
He says that the attention he’s received from “Dear Brother” is entirely due to another work he created almost a year before called “Nostalgia,” in collaboration with filmmaker Judith Veenendaal. On her Vimeo page, she calls it a “visual poem,” perhaps because of how she had mapped images to the meaning and pattern of Reilly’s words. According to Reilly, Titz and Lebherz “ripped it [“Nostalgia”] off completely. The music is the same. They used me. And at the end of it, they used the word ‘free.’” He seems more frustrated over the fact Veenendaal had not received any sort of credit for her work than Titz and Lebherz’s assertion that they helped in the writing of the poem. Reilly says to write it, he had to imagine carrying his own brother’s ashes—making the claim on his work all the more hurtful.
When comparing “Nostalgia” and “Dear Brother,” it seems clear that the former inspired the latter. “Nostalgia” opens with a set of lines read calmly, though they are replete with remembered pain (note that all the following emphasis is mine):
“I have vivid memories from my youth
Horrible beatings for my truth
Hurts me and molds me
No one holds me my soul screams free. Free.”
In what seems like a clear parallel, “Dear Brother” begins as follows:
“Walking the roads of our youth
through the land of our childhood, our home and our truth
Be near me, guide me
always stay beside me so I can be free, free.”
I’ve just bolded the repeated words, but take note, too, of the similar structure. Reilly is a frequent internal rhymer, a nod to his long-time experience with rap and street dance, and the reading of his lines alternates between fast and slow. Moreover, the repetition of “free” in both pieces here is broken by a caesura, the long pause somehow conveying the poet’s questioning of what it might mean to be free.
Another point of similarity between the pieces has to do with the presence of two personas in each. In “Nostalgia,” there is an old poet and a young, childish poet, who tells his older self “I empty me to light you” and later, “I run away to run to me.” To Reilly, his younger self was preoccupied with seeming violent and becoming physically stronger. Now, he’s “concentrating on the other parts,” focusing on better expressing his soul, emptying his violent past to run toward his poetic work.
In “Dear Brother,” the personas are more obvious—perhaps due to the constraint of the film’s plot. The first half is spoken by the brother still alive, asking his sibling to stay near him and reminiscing on their adventures. In the latter half of the poem, though, the view shifts. Now the speaking brother comforts the other, saying
“if your heart’s full of sorrow, keep walking, don’t rest
and promise me from heart to chest
to never let your memories die, never.
I will always be alive and by your side
in your mind.”
All of this together would seem to indicate that, yes, Titz and Lebherz were heavily inspired by “Nostalgia”—possibly so much so they directed Reilly to repeat the motifs and structures of his previous work. Does that mean they collaborated with Reilly on the poem itself? Perhaps, depending on your definition of collaboration. Does it mean they wrote the poem? An affirmative answer there seems somewhat less likely. (Titz and Lebherz did not respond to a request for an interview).
Again, both poems end with the phrase “I’m free.” To Reilly, that means being “free from one-dimensional expression. I want to be three-dimensional, all aspects.” He means it in the sense that, as a former homeless man, ex-addict, and current heavily tattooed fighter, he looks “brutal” but wants to move past that image, expressing “love and gentle emotions.” His collaborator over the past few years, Benjamin Hounam, agrees: “now his whole thing is love, that’s what he’s on. He’s kind of evolved.”
Part of that evolution involves expressing in words what others cannot. Reilly seems to want to be a kind of medicine man, offering poems and voice recordings of them to people trying to make peace with their loved ones. He says his “ultimate dream” is to do voiceovers for other people, not just large corporations, especially since he feels he has always been able to write love, in any circumstance. Says Reilly, “No matter what situation I was in—whether I was homeless, whether I was addicted—when I was addicted to drugs, I still wrote love songs.”
Maybe that is why his partnership with Hounam has been so fruitful. Hounam, who has about half of Reilly’s 52 years, told me “words have never really been my thing.” Filmmaking, however, is. His collaboration with Reilly seems to have made them both prolific and—more than that—extremely proficient at the kind of “visual poems” that have made Reilly popular. In them, the poet fights. He dances. He stares, and he raps. And most of all, he recites. His process is a bit of an odd one: “I write a poem about a particular subject, and I may make a piece of film to that subject, but I never put that poem to that film. I just use it as a template. And then, in a month, I find another poem and put it on that film, and it just seems to work, like the subliminal is working with the unconscious.” However the works are made, they seem to achieve what Hounam says is their aim “to make everyday life epic.”
When critic Dana Gioia asked if poetry could matter, he wondered why poetry mixed so infrequently with other art forms. In fact, he implied that poetry’s “diminished stature” might be due in part to its self-isolation from other mediums. Reilly and Hounam could be reversing that trend. Reilly talks about other artists with enthusiasm, marveling at b-boys, fast rappers, inventive beat creators. He riffs on their work and seems hungry for new challenges, a philosophy taken from his time in the ring. When fighters are taught something new, Reilly says, “your response is osu, ‘I’m going to do it.’ You dive in, and I love that. I think the way I’ve been living all my life is like osu.” That eagerness is borne out in his collaboration with Hounam, also labeled OSU, which runs the gamut of moods and types of content.
The pairing of poetry and film might be what helps re-awaken popular interest in poetry. After all, Hounam found Reilly on the Internet, too, after following “random bits of poetry, or him fighting” posted online. After that, the younger man reached out, feeling compelled to partner with the poet. I think Reilly inspires that desire to connect in a lot more people, too. The success of “Dear Brother” seems a testament to his ability to reach others—and Reilly has already harnessed technology and his poetry to start talking about the topics he cares about: socioeconomic injustice, racism and prejudice, healthy living, and—above all—love.
By the end of our conversation, I realize he seems calm, not tired. Sometimes he’ll turn his head to talk to his daughter, showing off his facial tattoos—the ones that he says make him look more of a brute—in profile. When he smiles at his child, though, the curved tattoos seem to smile too. Reilly says that “now osu means to do something that I don’t do too often, which is to only express love.” From what I can tell, he’s doing it already.
Should Biologists Be Guided by Beauty?
by Jalees Rehman
Lingulodinium polyedrum is a unicellular marine organism which belongs to the dinoflagellate group of algae. Its genome is among the largest found in any species on this planet, estimated to contain around 165 billion DNA base pairs – roughly fifty times larger than the human genome. Encased in magnificent polyhedral shells, these bioluminescent algae became important organisms for the study of biological rhythms. Each Lingulodinium polyedrum cell contains not one but at least two internal clocks which keep track of time by oscillating with a period of approximately 24 hours. Algae maintained in continuous light for weeks continue to emit a bluish-green glow at what they perceive as night-time and swim up to the water surface during day-time hours – despite the absence of any external time cues. When I began studying how nutrients affect the circadian rhythms of these algae as a student at the University of Munich, I marveled at the intricacy and beauty of these complex time-keeping mechanisms that had evolved over hundreds of millions of years.
Over the course of a quarter of a century, I have worked in a variety of biological fields, from these initial experiments in marine algae to how stem cells help build human blood vessels and how mitochondria in a cell fragment and reconnect as cells divide. Each project required its own set of research methods and techniques; each came with its own failures and successes. But with each project, my sense of awe for the beauty of nature has grown. Evolution has bestowed this planet with an amazing diversity of life-forms and biological mechanisms, allowing organisms to cope with the unique challenges they face in their respective habitats. But it is only recently that I have become aware that my sense of biological beauty was a post hoc phenomenon: Beauty was what I perceived after reviewing the experimental findings; I was not guided by a quest for beauty while designing experiments. In fact, I would have worried that such an approach might bias the design and interpretation of experiments. Might a desire to see Beauty in cell biology lead one to consciously or subconsciously discard results that seem too messy?
I was prompted to revisit the role of Beauty in biology while reading a masterpiece of scientific writing, "Dreams of a Final Theory" by the Nobel laureate Steven Weinberg, in which he describes how the search for Beauty has guided him and many fellow theoretical physicists toward an ultimate theory of the fundamental forces of nature. Weinberg explains that it is quite difficult to precisely define what constitutes Beauty in physics, but a physicist would nevertheless recognize it when she sees it.
One such key characteristic of a beautiful scientific theory is the simplicity of the underlying concepts. According to Weinberg, Einstein's theory of gravitation is described in fourteen equations whereas Newton's theory can be expressed in three. Despite the appearance of greater complexity in Einstein's theory, Weinberg finds it more beautiful than Newton's theory because the Einsteinian approach rests on one elegant central principle – the equivalence of gravitation and inertia. Weinberg's second characteristic for beautiful scientific theories is their inevitability. Every major aspect of the theory seems so perfect that it cannot be tweaked or improved on. Any attempt to significantly modify Einstein's theory of general relativity would lead to undermining its fundamental concepts, just like any attempts to move around parts of Raphael's Holy Family would weaken the whole painting.
Can similar principles be applied to biology? I realized that when I give examples of beauty in biology, I focus on the complexity and diversity of life, not its simplicity or inevitability. Perhaps this is because Weinberg was describing the search for the fundamental laws of physics, laws which would explain the basis of all matter and energy – our universe. As cell biologists, we work several orders of magnitude removed from these fundamental laws. Our building blocks are organic molecules such as proteins and sugars. We find little evidence of inevitability in the molecular pathways we study – cells have an extraordinary ability to adapt. Mutations in genes or derangements in molecular signaling can often be compensated for by alternate cellular pathways.
This also points to a fundamental difference in our approaches to the world. Physicists searching for the fundamental laws of nature balance experimentation with the development of fundamental theories, whereas biology in its current form has become primarily an experimental discipline. The latest technological developments in DNA and RNA sequencing, genome editing, optogenetics and high-resolution imaging are allowing us to amass unimaginable quantities of experimental data. In fact, the development of technologies often drives the design of experiments. The availability of a genetically engineered mouse model that allows us to track the fate of individual cells expressing fluorescent proteins, for example, will give rise to numerous experiments studying cell fate in various disease models and organs. Much of current biomedical research funding focuses on studying organisms that provide technical convenience, such as genetically engineered mice, or that fulfill a societal goal, such as curing human disease.
Uncovering fundamental concepts in biology requires comparative studies across the discipline and substantial investments in research involving a plethora of other species. In 1990, the National Institutes of Health (NIH – the primary government funding source for biomedical research in the United States) designated a handful of species as model organisms to study human disease, including mice, rats, zebrafish and fruit flies. A recent analysis of the species studied in scientific publications showed that in 1960, roughly half the papers studied what would subsequently be classified as model organisms, whereas the other half studied additional species. By 2010, over 80% of scientific papers were being published on model organisms and only 20% were devoted to other species, marking a significant narrowing of research breadth in biology. More importantly, even among the model organisms there has been a clear culling of research priorities, with a disproportionately large growth in funding and publications for studies using mice. Thousands of scientific papers are published every month on cell signaling pathways and molecular biology in mouse and human cells, whereas only a minuscule fraction of research resources is devoted to studying signaling pathways in algae.
The question of whether biologists should be guided by conceptual Beauty leads us to the even more pressing question of whether we need to broaden biological research. If we want to mirror the dizzying success of fundamental physics during the past century and similarly advance fundamental biology, then we need to substantially step up investments in fundamental biological research that is not constrained by medical goals.
Dietrich, M. R., Ankeny, R. A., & Chen, P. M. (2014). Publication trends in model organism research. Genetics, 198(3), 787-794.
Weinberg, S. (1992). Dreams of a final theory. Vintage.
The Past of Islamic Civilization
by Muhammad Aurangzeb Ahmad
“Those who control the present, control the past and those who control the past control the future.”
― George Orwell, 1984
These days every other person seems to be concerned about the future of Islamic Civilization. Islamists, traditionalists, liberals, conservatives: almost everyone seems to have a stake in the future of Islam. While these different groups may have different visions of the future, they have one thing in common – they almost always define the future in terms of the past: from the Salafis harkening back to a supposed era of purity, to the academics yearning for the Golden Age of Islam, to the more recent Ottoman nostalgia in Turkey and the wider Middle East. The study of history becomes paramount in such an encounter, since a distorted view of the past can become a potentially unrealizable view of the future.
As any historian will tell us, each group reads history in terms of its own aspirations and agenda. In the Muslim world in general, nostalgia for the past usually centers on reviving its glories. The danger is that one may start living in a non-existent, romanticized past and be condemned to repeat its mistakes. In the West, every other political pundit seems to be calling for an Islamic Reformation, even though parallel religious structures do not exist in Islam. What do these visions of the future-past look like, and what can be learned from them?
For the majority of Muslims, it is the time of the Prophet that represents an idealized society. However, many of them implicitly also realize that, by its own constitution, that era cannot be replicated, precisely because of the Muslim belief that the Prophet cannot come back and there will not be another prophet – a perfect society needs a perfect man. The era after that which is most revered by (Sunni) Muslims is that of the Righteously Guided Caliphs. What most people fail to realize is that that era was revered in classical Islam not because it was a time of peace and prosperity but because of its proximity to the time of the Prophet. On the one hand, it was a time of great expansion, when the foundations of Islamic governance were laid down. On the other hand, it was also a time of civil wars and a great deal of intra-Muslim bloodshed. To anyone who wants to revive that era, one must caution that it was also the time when the Khawarij, the intellectual ancestors of groups like ISIS, first arose. One must be careful what one wishes for.
Then there is the Golden Age of Islamic civilization, centered on Baghdad, Cordoba and Samarkand. While the Islamists tout the greatness of this era, what gets shoved under the rug is that many of its rulers were less than exemplary when it came to their orthodoxy. Another fact that many people fail to recognize is that even during the Golden Age the majority of the subjects of the Islamic empire were non-Muslims, including an appreciable percentage of the scientists and philosophers who were instrumental in the rise of Islamic civilization. Science and technology was then, as it is now, an international collaborative enterprise. The last point is especially relevant to our day and age, since the exclusion of non-Muslims from national narratives in the Muslim world has unfortunately become the norm. Outside of small academic circles, most people, Muslim or non-Muslim, are unaware that what came to be identified as the Islamic system of governance was heavily borrowed from the Sassanids. The millet system of the Ottomans was inspired by a similar system that the Rightly Guided Caliphs had enacted, which in turn was invented by the Sassanids. When the rulers of Vijaynagar in South India copied the style of Muslim palaces, they were actually copying plans that Muslims had acquired from the Sassanids. The greatest irony here is that while some modern-day non-Muslim Iranians blame Muslims for destroying their culture, it was actually Islamic culture that led to the widespread dissemination of Iranian culture, albeit under Islamic garb.
The lesson is that the early Muslims had to deal with the many practical considerations of ruling a multi-ethnic, multi-religious empire, and inspiration had to be taken from anywhere and everywhere. Even the aesthetics that came to be identified as quintessentially Islamic had roots in earlier civilizations. The archetypal image of how a mosque should look is heavily borrowed from the Eastern Orthodox churches that Muslims first encountered in their conquests of Syria and the Levant. These observations do not make these developments less Islamic; rather, they show the openness of the Islamic culture of a bygone era. The translation of Greek works did not start in the time of the Abbasids, as in the popular imagination, but had already begun in the second generation after the time of the Prophet. Khalid bin Yazid, an Umayyad prince and a scientist himself, was instrumental in sponsoring the translation project in Alexandria less than 50 years after the time of the Prophet.
Then there is Al-Andalus, or Muslim Spain, which has been romanticized by people as diverse as Tariq Ali, Osama bin Laden and various heads of state. It is supposed to represent a time when Jews, Muslims and Christians lived side by side in an era of supposed harmony. While one may disagree over the details of convivencia, it is in how it came to an end that a great deal of disconnect lies in the minds of the Muslim masses. It was not just the advancing armies of Aragon and Castile that doomed the culture of tolerance; the Almohads were equally intolerant of the culture of coexistence. It is easier to kill a culture if it already has a self-inflicted wound. The last great flowering of Islamic civilization was of course the Ottoman Empire, arguably the most successful Muslim empire in history. The empire that now stands as the emblem of the Caliphate only took up that mantle in its declining years. The Ottoman conquest of Constantinople was followed by more than a century of Romanophilia. In fact, Mehmet the Conqueror wanted to conquer Rome itself to “reunify” the Roman Empire, and the Ottoman Sultans styled themselves Emperors of the Romans until the very end of the empire. People in the Balkans focus on the ethnic conflict of the empire's dying days but fail to mention the five centuries of coexistence prior to that. Thus the question of bringing back the Ottomans is more appropriately phrased as: which incarnation of the Ottoman Empire, given how greatly it changed over the centuries? The dichotomy of Christian vs. Muslim dissolves even further when one realizes that the majority of the Ottoman army at the siege of Vienna in 1683 may actually have been made up of Christians. Can we really imagine such a scenario three hundred years after the fact?
If the Abbasids represented the Golden Age of Islam, then perhaps the rise of the Ottomans, Mughals, Safavids and the Mali Empire, the Indian Ocean trade networks, and the spread of Islam in Southeast Asia should be termed the Silver Age of Islam. There may be many more things we can learn from the less glamorous and seemingly peripheral parts of Islamic history. The question of Islamic law and how Muslims should live as minorities in a non-Muslim state has recently come to the fore in the West. Even some of the learned ulema act as if this were an unprecedented situation, as if it had never happened before. Yet Islam has been in China longer than in most Muslim-majority places in the world, and Chinese Muslims have a long history of integrating with, and conversing with, a foreign civilization. Chinese Muslim arts, culture, language and even philosophy have much to teach the rest of the Muslim world, if only it is willing to listen. Nor was Al-Andalus the only region of Europe occupied by Arabs and Berbers who brought an advanced civilization with them: Sicily was the Andalus that disappeared early on. What is missing from discussions of Sicily, however, is that Muslims continued to flourish there for 150 years even after the fall of Islamic rule, mainly because of the somewhat enlightened rule of the Normans. The best known of these Christian rulers was Roger II, after whom the most famous book of Islamic geography, Kitab Rujar (The Book of Roger) by the celebrated geographer Muhammad Al-Idrisi, is named. Even after the persecutions started, it was not until 1336 that the last group of Muslims in Italy was forced to convert.
My point here is not to argue that one must not look to the past for inspiration, but, as with most things in life, the good comes with the bad. It could even be that none of the examples I have outlined above is relevant to the present predicament of the Muslim world. Pick up almost any text on Islamic history, especially in Arabic, Persian or Urdu, and it reads more like a series of events than a meaningful analysis of history. Thus missing from the narrative, and from the historical imagination of the Muslim world, is the impact of the Black Plague, which was as devastating in the Middle East as it was in Europe, or how the disruption of the Indian Ocean trade network by the Portuguese was instrumental in the long-term economic decline of the Muslim world.
Perhaps the answer lies in something else entirely – alternate history. By forcing Muslims to think about what could have been, it may also make them think about what could be in a more nuanced manner. There will of course be people who object and ask why we should dwell in the past in any form, whether positive or negative. It is, however, impossible to escape history: the kind of people we want to become is usually a function of how we imagine we got to be what we are. Historians may argue that all national, ethnic and even religious histories have an element of useful fiction, and that objectivity is relative at best. Even if we take this view, we still have to answer the question: what kind of fiction do we choose for ourselves? We are who we are by virtue of the stories we tell about ourselves. Making sense of history thus becomes paramount in moments of civilizational crisis. This is of course not to discount that there are more urgent problems in the world, like the Syrian refugee crisis or climate change, but in a group of people that constitutes around a fourth of humanity, there are enough people to contribute interesting and valuable thoughts on any subject under the sun, and perhaps even beyond.
Monday, May 16, 2016
Perceptions: Art in Nature
"Acorn woodpeckers drill into trees not in order to find acorns, but in order to make holes in which they can store acorns for later use, especially during the winter.
As the acorn dries out, it decreases in size, and the woodpecker moves it to a smaller hole. The birds spend an awful lot of time tending to their granaries in this way, transferring acorns from hole to hole as if engaged in some complicated game of solitaire.
Multiple acorn woodpeckers work together to maintain a single granary, which may be located in a man-made structure – a fence or a wooden building – as well as in a tree trunk. And whereas most woodpecker species are monogamous, acorn woodpeckers take a communal approach to family life. In the bird world, this is called cooperative breeding. Acorn woodpeckers live in groups of up to seven breeding males and three breeding females, plus as many as ten non-breeding helpers. Helpers are young birds who stick around to help their parents raise future broods; only about five per cent of bird species operate in this way."
Narrative History or Non-Fiction Historical Novel?
by Aasem Bakhshi
Why does an apple fall when it is ripe? Is it brought down by the force of gravity? Is it because its stalk withers? Because it is dried by the sun, because it grows too heavy, or the wind shakes it, or because the boy standing under the tree wants to eat it? ‘None of these is the cause. They only make up the combination of conditions under which every living process of organic nature fulfills itself. In the same way the historian who declares that Napoleon went to Moscow because he wanted to, and perished because Alexander desired his destruction, will be just as right and wrong as the man who says that a mass weighing thousands of tons, tottering and undermined, fell in consequence of the last blow of the pickaxe wielded by the last navvy. In historical events great men - so-called - are but labels serving to give a name to the event, and like labels they have the least possible connection with the event itself. Every action of theirs, that seems to them an act of their own free-will, is in the historical sense not free at all but is bound up with the whole course of history and preordained from all eternity.
―Leo Tolstoy, War and Peace
Wouldn't you visualize Livia Drusilla ― the wife of the Roman emperor Augustus ― as a cunning and venomous political mastermind if your sole introduction to ancient Roman history were Robert Graves' engrossing autobiographical tale of the emperor Claudius? Haven't you always visualized the last Roman emperor of the Julio-Claudian dynasty, the infamous Nero, playing the fiddle while Rome burned in 64 AD? Can anyone have a more predominant image of Abu Sufyan's wife Hind Bint Utbah than the one presented by Irene Papas ― through her revengeful eyes and blood-dripping lips ― in the film The Message (1976), where she is shown chewing the liver of Prophet Muhammad's uncle Hamza after the Battle of Uhud?
These are all overpowering images, sustained over time, and hard to erase from the slate of our memories. It doesn't matter much if we argue, for instance, that it was not Hind but the black slave Wahshi who actually gouged out Hamza's liver according to the traditional Muslim historian Ibn Kathir's narrative, or that the earliest recording of the incident, by the historian Ibn Ishaq, is a dubious attribution because of broken chains of narration. Similarly, does it matter that fiddles were non-existent in first-century Rome, and that the fiddling is probably an anciently preserved metaphor, as Nero was famous for his extraordinary indulgence in music and play? It would not transform these images in the least if we juxtaposed the contradictory accounts of Suetonius, Cassius Dio and Tacitus and presented evidence that Nero in fact returned immediately from Antium and organized a great relief effort from his own funds, even opening his palaces to the survivors. And it is pretty much futile to argue ― after the BBC popularized Graves' autobiographical account of Claudius by adapting it into a TV series ― that Livia might not have been such a thoroughly Machiavellian character, and that it was in fact not her favorite pastime to scheme political upheavals and poison every other claimant to the Roman throne.
Thus after centuries of dust settling over innumerable layers of narratives, the quest for historical certainty, for that which actually happened, is overpowered by popular images that refuse to erase themselves from collective memory.
And this, of course, is also the single most important contribution of British-American psychologist Lesley Hazleton's narrative history of Shia-Sunni split: refreshing and reinforcing some already held soppy images.
But this is not about reviewing Hazleton's reading of the perennial sectarian split at the heart of Islam per se; rather, it is about using her work as a template to locate the increasingly blurred lines between narrative history and historical fiction. In the wake of this relatively new genre taking a sharp modern turn, where must a reader not well-rooted in the literary tradition of the respective historical current place his sensibility regarding the authenticity of historical truth?
That Hazleton is more interested in psychological characterization and in building a juicy, coherent narrative than in objective historical analysis and criticism is evident even from a cursory look through the text. Though characterization and speculative psychological insights are distributed evenly across the text, Hazleton surely has her pivotal choice of heroes and villains with which to build a gripping narrative: well-meaning heroes, eventually destined to be cheated, and pernicious villains, designed to exploit. As Hazleton's publishers must have carefully put it in the title, it has to be marketed to the reader as an 'Epic Story', an epic Game of Thrones adventure intricately built around the desire for power.
Therefore, right from the beginning, the narrative essentially revolves around the struggle for accession to this proverbial throne. The opening part supplies images in which Prophet Muhammad, who according to the author was perhaps leading a life of celibacy after the death of his most beloved wife Khadija, is dying, and the community is not yet ready to grapple with his imminent death. In an authorial figment of imagination, all of his wives surely did try to get pregnant by him in order to bear a son, and it was Ayesha who was especially haunted by her childlessness. Understandably so: since her readers naturally have modern sensibilities and this is a medieval monarchical structure, Hazleton must logically supply the reader with an image of a community fragile enough to disintegrate in the absence of an immediate political center. Hence, as they say, the stage is set in the opening part for the power play amidst the usual chaos of medieval folklore:
What did he intend to happen after his death? This is the question that will haunt the whole tragic story of the Sunni-Shia split, though by its nature, it is unanswerable. In everything that was to follow, everyone claimed to have insight into what the Prophet thought and what he wanted. Yet in the lack of a clear and unequivocal designation of his successor, nobody could prove it beyond any shadow of doubt. However convinced they may have been that they were right, there were always those who would maintain otherwise. Certainty was a matter of faith rather than fact.
Subsequently, in this cheesy narrative pivoted around power struggle, Ayesha is depicted as a charming and impudent young brat who, as she gets older, essentially acquires a Livian element with a soft Machiavellian composition, which Hazleton carefully imparts as if there is enough historical truth to substantiate her psychological make-up beyond reasonable doubt.
How could a teenage girl possibly compete against the hallowed memory of a dead woman? But then who but a teenage girl would even dream of trying? Charming she must have been, and sassy she definitely was. Sometimes, though, the charm wears thin, at least to the modern ear. The stories Ayesha later told of her marriage were intended to show her influence and spiritedness, but there is often a definite edge to them, a sense of a young woman not to be crossed or denied, of someone who could all too easily switch from spirited to mean-spirited.
Throughout her narrative, this Machiavellian composition of Ayesha is carefully pitted against the composed and well-balanced demeanour of Muhammad's cousin Ali, whom Hazleton portrays as something closer to an Arthurian legend complete with Excalibur (the book even compares Excalibur to Ali's famous sword, Al-Zulfiqar). And because a stronger narrative naturally demands it, Hazleton never fails to speculate, even when there is little room to supply a tinge of imagined political conflict between Ali and the other contenders for succession to Prophet Muhammad, namely Abu Bakr and Omar:
The meaning was clear: in a society where to give was more honorable than to receive, the man who gave his daughter’s hand bestowed the higher honor. While Abu Bakr and Omar honored Muhammad by marrying their daughters to him, he did not return the honor but chose Ali instead.
But if there is a true Livian character in this tale, it is Muawiyah, the powerful governor of Syria whose promised reinforcements, according to some of Hazleton's sources, never arrived to avert the assassination of the third caliph, Othman.
Certainly he was no one-dimensional villain, though it is true he looked the part. He had a protruding stomach, bulging eyes, and feet swollen by gout, but as though in compensation for his physical shortcomings, he was possessed of an extraordinary subtlety of mind [...] Eight centuries before Niccolò Machiavelli wrote The Prince, Muawiya was the supreme expert in the attainment and maintenance of power, a clear-eyed pragmatist who delighted in the art and science of manipulation, whether by bribery, flattery, intelligence, or exquisitely calculated deception [...] The famed image of Hind cramming Hamza’s liver into her mouth worked to his advantage. Any son of such a mother could inspire not just fear but respect, and Muawiya commanded both. Except from Ali [...] Poison has none of the heroics of battle. It works quietly and selectively, one might almost say discreetly. For Muawiya, it was the perfect weapon.
For an informed reader, therefore, authorial intention easily protrudes from the text; rather, it is the subtext itself which lays bare the intent to give a chilling speculative quality to the whole story as it is told. Hence, it is usually through the subtext that we see Muawiyah and his associates, among them Amr Bin Al Aas, poisoning, deceiving and, when necessary, battling their way to the throne. Posing as an impartial author with no possible conflict of interest, Hazleton carefully chooses her sources ― her chief source being the Annals of Tabari ― and claims not to prefer the less authentic ones over the stronger. However, exercising her authorial right to choose among various versions of the same incident, she intelligently prefers the most chilling and controversial version over the casual and discreet ones. This is the primary reason why readers generally unacquainted with classical Muslim sources such as Tabari, Ibn Saad, Ibn Athir and Masudi would find Hazleton's accounts of the Battles of Siffin and Jamal, and the subsequent events of Karbala in Yazid's reign, simply unputdownable. Such readers must understand, however, that the chief success of Hazleton's work lies in its ability to create an extremely readable and gripping narrative with the psychological insights of a bystander looking piercingly into her historical subjects.
Moreover, if the text is read carefully, she is able to present a decent popular point of view, drawing from both sides of history as well as from heresiography. What she fails to make emphatically clear is that historical certainty and objectivity must not be compromised for narrative flair. From a strictly academic point of view, the text is unworthy of attention, primarily because it doesn't live up to its promise of linking the present Shia-Sunni conflicts in contemporary Syria and Iraq to their alleged historical roots. There is a lot more to the Shia-Sunni conflict than a supposed Game of Thrones, and it has at least as much to do with global politics during the post-formative periods of Islam, not to mention another, more interesting conflict between two different theological meta-narratives.
Hazleton has neither the historical insight of William Dalrymple nor the profundity of Orlando Figes to produce a useful narrative history for a widely informed audience. In the absence of footnotes and textual references, it is extremely hard to trace her contentions and speculations to original sources. Furthermore, the distraught and superficially agitated nature of the narrative is generally distasteful to a serious reader, who might not be interested in an over-dramatized good-versus-evil story. At most, Hazleton's account should be read as a riveting historical novel adapting real characters and actual events. Unfortunately for a serious student of history, it leaves nothing much to chew on.
Therefore, as a reader who is certainly not a history buff but has at least enough interest for an occasional monthly drift towards the genre, I am left baffled about the genre as a whole, and compelled to raise a question about the balance between imagining a narrative and constructing one from the sources. Of course, the latter has its downside as well, since there has to be a certain degree of selective bias in choosing the particular sources that support a preconceived line of enquiry; however, a committed reader can always point that out after a little hard work.
From a Christian point of view, a somewhat similar case in point is Reza Aslan's reconstruction of the life and times of Jesus of Nazareth. One can argue that, within a particular kind of revisionist Christian setting, Aslan does a decent job of balancing the historical Jesus, that is, of Nazareth, with the theological one, that is, Christ and Savior. However, given all the hype it generated, between the generally misplaced Islamophobic critiques and the authorial defenses centered on a presumably unbiased academic historical work, shouldn't it be considered a mediocre work when placed in the narrative-history genre?
Let us see how. Aslan's basic idea is to disentangle the historical Jesus from the theological one by contending that the former was a radicalized, anti-Roman, zealous Jew. The claim is not alarmingly novel, at least from a Muslim standpoint; moreover, Aslan's work merely moves on the fringes of the arguments. In my humble opinion, it is neither a rebuttal of the classical orthodox Christian position nor a critical challenge to it. To achieve either, Aslan would have had to delve deep into the theology and scriptural interpretation of the last twenty centuries, from which he deliberately distanced himself by calling his work a 'historical study'. However, at the chronological scale on which he is working, it is nearly impossible to disentangle history from theology, and the work obviously suffers for not recognizing that. Even a Muslim reader would struggle to grapple with Aslan's portrait of Jesus, and in the end it would prove a gripping read only for his non-religious audience.
In a nutshell, Aslan claims to engage in the domain of critical history (drawing from a rich archive of secondary revisionist sources) rather than literary analysis, but the grandiose claims he makes belong as much to the latter. I do not have even an amateur's grounding in Biblical studies, but my reader's hunch says that a Biblical scholar would accuse him of cherry-picking from selective ancient sources, such as Josephus. As for his flat reconstruction of Jewish resistance into a formal zealotry, one can only leave that to a more informed reader; here I am, at best, a more or less equally informed reader, as I am in the case of Hazleton.
Coming back to my original motivation for producing this seemingly agitated tirade against writers whom I otherwise adore: where should one position oneself as a reader when assessing narrative history? More specifically, where is the exact boundary between idealized narrative history and the nonfiction historical novel? Historians know best, but from a reader's point of view, the problem is probably the failure to separate the tendency, or rather the impulsive proclivity, to sensationalize for the reader from the will to inform him about history. Since more and more lay historians are embracing the sensationally imaginative version of the narrative form, it is perhaps time to call it nonfiction historical storytelling rather than proper narrative history.
Monday, May 09, 2016
The First Garden Party of the Year
by Holly A. Case
It was the first garden party of the year. In attendance were a couple dozen writers and would-be writers, a pastor, and myself. I knew no one except the host and a couple of other people, who were all knotted around each other engaged in writerly shoptalk, so I made friends with the buffet. A potato salad presented itself to my acquaintance. We got on well, but there was no place to sit. A white-haired lady was perched on the side of a chaise out on the patio, not quite taking up the whole length of it, so I asked if I might occupy the end. She nodded her approval and looked at my plate. “How’s the potato salad?” I said I thought it was fine, but would benefit from some pickles. She claimed it as her own contribution to the buffet and quickly changed the subject.
“This party could have happened forty years ago,” she began, with the authority of an eyewitness. She pointed out the clothes people were wearing, their quiet and respectful social configurations and controlled outbursts of laughter. Her finger rose to single out a girl in shorts as the sole anachronism. Whether by force of empirical evidence or persuasion, I could see she was right.
But she did not linger long on the lawns of past parties. Turning to me she asked what I did, and soon we were talking about languages. Though she reads French, she confessed to not really believing in other languages; a chat is qualitatively not a cat. Then she stretched out her foot, “This is not a pied”; the streets of Paris may perfectly well be full of chiens, but they are not full of dogs. French words were like signifiers; they stood in for meaning like a paper cut-out stands in for the real chat. A man came by at that point whom she introduced as mon mari. Seeing my expression, she was quick to reassure me that he was not a signifier, but the real thing.
He moved on. We talked about W. H. Auden and how his poems start close up, and then pull back to a great distance. “It’s funny we’re talking about this,” she said, “because I’ve been thinking all this time that the branch of that tree looks like a horned owl from here.” I looked up and saw the branch; through a squint, the blurred outlines of an owl began to emerge. “Horned you say?” I asked. “Yes, definitely a horned owl. But if you could see it up close it would be impossible to see in it anything but a branch.” Indeed.
We talked about time travel, how utterly strange it is that Ebenezer Scrooge is watching himself up close in the present in A Christmas Carol, what it means when we say “If you could just hear yourself talk!”; how, in order to be moral, you have to be estranged from yourself, but not from a great distance. She told me about her passion for Edith Nesbit, who wrote about five children going around in time. “Nesbit was a Fabian.” Then she outed herself as the child of socialists who had called her parents by their first names. They had bought a book called The Gifted Child, which she promptly found on the shelf and read. In one passage the author explained how gifted children ask questions like, “Where does the wind come from?” As she walked to the mailbox with her father one day, she asked, “Harry, where does the wind come from?” “You read that in the book, didn’t you.”
She told me about going to Radcliffe and about the room full of typists at Oxford University Press where she typed correspondence after college; about having three babies and following her late first husband to four colleges. It began to grow dark. We moved from the chaise into the house and I lost track of her in the churn of other guests. Not knowing quite what to do, I remembered the potato salad and fell back to the buffet.
A few minutes later she made a surprising dash in my direction. “Come and see this!” she said, and led me by the sleeve to the washroom. “Look,” she pointed through the open door. “There’s another door!” On the adjacent wall there was indeed a door very like the one we were peering through. “I mistook this door for that one and opened it. Go in and see for yourself.” I went in, tried to disorient myself as one often does in another person’s house, and dutifully chose the wrong door. It opened onto a darkened room packed with shelves, a washer and dryer, stacks of toys and other exiles, faintly outlined by the dusk falling outside. The room was large, but felt tight and pushy with the crowd of half-visible objects, a sharp contrast to the eclectic warmth of the rest of the house.
“Isn’t it incredible?” she greeted me as I came out through the “right” door. “Imagine going into there from here, expecting this, and coming out into that!” I couldn’t tell if she was thrilled or terrified. I mentioned the washer and dryer. She hadn’t noticed them. “But look!” she pointed at a stitched cushion on a chair against the wall—an owl. Not a horned one, but an owl nonetheless. “A horned owl!” Later I tried to work out how it was this woman could turn English into a foreign language, how her way of not seeing worked a kind of magic, turning dogs into chiens and quite ordinary owls into horned ones.
She asked what I thought the future held in store. I said the abolition of death. The party crowd was noticeably thinning and we relocated to a couch in the living room. No one else was there. “Most conversation bounces along on high,” she said with an emphatic accompanying gesture. I told her that was a benign metaphor; I would have chosen a nastier one. “I used to be meaner myself,” she nodded wistfully. Her mother would scold her for favoring the intelligent over the good. She said she was coming around to her mother’s way of thinking.
We were the last guests to leave. The host was good-natured about it. A friend of his, somewhat tipsy, on his way out said how adorable we were, the white-haired woman and I, the way we talked so intensively the whole evening. We both straightened up and traded off protesting that there was nothing whatever unusual about it. She wrote her name and mailing address on a piece of paper before we parted ways.
The next day I sat down to write an actual letter. “Dear Alison Lurie,” it began, and picked up the conversation where we had left off. “On the subject of the horned owl and the mysterious room,” the letter concluded after several paragraphs, “when I give it a third thought, it’s absolutely identical to the boar's skull over which Mrs. Ramsay (in To the Lighthouse) drapes her shawl so a child won’t be frightened by it at night, and where the shawl stays after Mrs. Ramsay is dead and the child has grown up. The boar's skull is not in itself a horned owl, but with a shawl it approximates an owl much more readily, and when found in a mysterious room, adding to that an edge of fear and a sense that something at once trivial and immensely important has taken place, the identity is complete. All of this is to say that there are forces of cosmic disturbance in the world (viz. Mrs. Ramsay) around which (or whom) whole worlds move; invisibly, but sure enough. Insofar as you have magical sight of the sort that fails to see a washer and dryer, you must be such a person.”
[Dedicated to my own Mum.]
Seon Ghi Bahk. Existence, 2001.
"Bahk strings together delicate chunks of charcoal using nylon thread, arranging the intricate configurations into various abstract and figurative shapes. The monochromatic sculptures take the forms of everything from decomposing architectural columns to ethereal floating orbs. Tough yet ephemeral, the charcoal is reminiscent of birds in flight or an architectural explosion occurring in slow motion.
The shattered columns dwell in the space between the organic and the manmade, their imposing stature already fading into oblivion. The works embody the transience of human culture, implying that even the most ancient facets of human civilization are, in the grand scheme of nature, destined to disappear. Furthermore the charcoal that comprises the columns, made from a purely geological process, represents our eternal dependence on nature’s processes."