Tuesday, August 30, 2016
The terms “nativism,” “reactionary,” even “fascism” appear in political conversation with increasing regularity. Though few of these leaders profess deep religious commitments, their popularity seems driven in significant part by religious ressentiment — an awareness of the decline of Christian (or “Judeo-Christian”) civilization and a determination to arrest and, if possible, reverse that decline.
Political liberals who long expected to live in an increasingly liberal world may find themselves disoriented by these manifestations, whose nature they are ill prepared to understand, and they certainly wish such “forces of reaction” would just go away. But these forces will not go away. If we were to wish for something less fantastic than the disappearance of our political opposites, we might think along these lines: It would be valuable to have at our disposal some figures equipped for the task of mediation — people who understand the impulses from which these troubling movements arise, who may themselves belong in some sense to the communities driving these movements but are also part of the liberal social order. They should be intellectuals who speak the language of other intellectuals, including the most purely secular, but they should also be fluent in the concepts and practices of faith. Their task would be that of the interpreter, the bridger of cultural gaps; of the mediator, maybe even the reconciler.
Half a century ago, such figures existed in America: serious Christian intellectuals who occupied a prominent place on the national stage. They are gone now. It would be worth our time to inquire why they disappeared, where they went, and whether — should such a thing be thought desirable — they might return.
First, the scary subject of euthanasia. To avoid any misunderstanding: euthanasia, as I am defining it, is the handing or administering of a fatal overdose to a patient by a doctor at the patient’s request. This includes Physician Assisted Suicide. We shall not go into all the terms and conditions attached to such an act here in the Netherlands. Suffice it to say that it is quite a procedure and not something that is arranged overnight or on the whim of a patient or a doctor. In the United States, the administering of a lethal medication by a doctor is never allowed, but under certain conditions Physician Assisted Suicide is allowed in five states—Oregon, Washington, Vermont, Montana, and California.
It is often said that it takes courage to perform euthanasia, and a colleague described to me the other day why he finds it so difficult: “It feels somehow as if the very foundation of my existence is being undermined. The thought of it causes an experience of vertigo. A request almost seems to set me dangling above an abyss.”
I find this a very convincing description, because that is precisely what we feel when faced with the possibility of a predetermined, explicitly arranged death. It is a fearful business, but I don’t quite understand what it is we are so afraid of. Being courageous means that you realize the danger of a situation.
The pudgy cheeks and flaring hairdo of North Korea’s young ruler Kim Jong-un, his bromance with tattooed and pierced former basketball star Dennis Rodman, and his boy-on-a-lark grin at missile firings combine incongruously with the regime’s pledge to drown its enemies in a “sea of fire.” They elicit a mix of revulsion and ridicule in the West. Many predict that the Democratic People’s Republic of Korea cannot survive much longer, given its pervasive poverty, a genocidal prison camp system identified by a UN commission of inquiry as committing crimes against humanity, self-imposed economic isolation, confrontations with all of its neighbors, and its leader’s youth and inexperience. The Obama administration has adopted a position of “strategic patience,” waiting for intensifying international sanctions to force North Korea either to give up its nuclear weapons or to implode and be taken over by the pro-Western government of South Korea.
But North Korea’s other closest neighbors, the Chinese, have never expected the DPRK to surrender or collapse, and so far they have been correct. Instead of giving up its nuclear bomb and missile programs, Pyongyang is by now thought to have between ten and twenty nuclear devices and over one thousand short-, medium-, and long-range missiles, and to be developing a compact warhead that will be able to hit the US mainland.
Scott Barry Kaufman in Scientific American:
The thing is, the whole concept of giftedness was, from its inception, tied to educational outcomes. When Lewis Terman invented the concept, he made giftedness synonymous with high IQ scores (on his own test, of course), and linked it to high achievement (genius). What seems to be going on here (and I document this trend in my book Ungifted) is that a sizable proportion of the gifted and talented community -- mostly clinicians who actually work with such children on a daily basis -- fundamentally conceptualize giftedness as something very different from high achievement, and often also very different from high cognitive ability. Now, don't get me wrong: I could get behind this newer conceptualization of giftedness. What this particular segment of the gifted and talented community seems to be describing as giftedness -- exquisite sensitivity to the environment -- certainly is a particular dimension of human variation that is important, and most certainly has substantial variation, like the rest of human personality differences.
But here's the thing: I think in order for this new conceptualization of giftedness to be tractable, it should have more clearly delineated properties, better measurement, and it should also be more clearly tied to particular educational interventions. What can you specifically do to support children who "experience the world intensely"? How do you identify that unique population in the first place, independent of IQ tests, academic achievement, and other very non-experiencing-oriented assessments? From a scientist's point of view, and even from a pragmatist's point of view, I don't know what to do with this new definition of giftedness. How do you know what other people really feel, or how intensely they feel it? You know your own qualia, and that's it.
Monday, August 29, 2016
by Hari Balasubramanian
Of the 7097 languages in the world, twenty-three (including the usual suspects: Mandarin, English, Spanish, various forms of Arabic, Hindi, Bengali, Portuguese) are spoken by half of the world's population. Hundreds of languages have only a handful of speakers and are disappearing quickly; one language dies every four months. Some parts of the world (dark green regions in the map) are linguistically far more diverse than others. Papua New Guinea, Cameroon, and India have hundreds of languages while in Japan, Iceland, Norway, and Cuba a single language dominates.
Why are languages distributed this way and why such large variations in diversity? These are hard questions to answer and I won't be dealing with them in this column. So many factors – conquest, empire, globalization, migration, trade necessities, privileged access that comes with adopting a dominant language, religion, administrative convenience, geography, the kind of neighbors one has – have had a role to play in determining the course of language history. Each region has its own story and it would be too hard to get into the details.
I also won't be discussing the merits and demerits of linguistic diversity. Personally, having grown up with five mutually unintelligible Indian languages, I am biased towards diversity – each language encapsulates a unique way of looking at the world and it seems (at least theoretically) that a multiplicity of worldviews is a good thing, worth preserving. But I am sure there are opposing arguments.
Instead, I'll restrict my focus to the following questions. How can the linguistic diversity of a particular region or country be numerically quantified? How do different parts of the world compare? How to account for the fact that languages may be related to one another, that individuals may speak multiple languages?
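One widely used answer to the first question is Greenberg's diversity index: the probability that two people selected at random from a region speak different languages, which works out to one minus the sum of the squared population shares of each language. A minimal sketch of the computation (the speaker counts below are invented purely for illustration, not real census figures):

```python
def greenberg_diversity(speaker_counts):
    """Greenberg's linguistic diversity index: the probability that two
    randomly chosen individuals have different (first) languages.
    Equals 1 minus the sum of squared population shares."""
    total = sum(speaker_counts)
    return 1 - sum((n / total) ** 2 for n in speaker_counts)

# Hypothetical populations: a near-monolingual country versus one
# with ten equally sized language communities.
monolingual = [9_500_000, 300_000, 200_000]
diverse = [1_000_000] * 10

print(round(greenberg_diversity(monolingual), 3))  # 0.096 -- low diversity
print(round(greenberg_diversity(diverse), 3))      # 0.9   -- high diversity
```

The index ranges from 0 (everyone speaks the same language) toward 1 (every speaker is in a vanishingly small group); note that in this simple form it treats each language as a distinct category, so it captures neither relatedness between languages nor multilingual speakers, the two complications raised above.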
by Holly A. Case
It was from Isabel Hull that I learned what tu quoque means, and how important it is to know. Hull is a professor of German history at Cornell, where I have also taught. Once I invited her to a class to talk about the British blockade of Germany during the First World War. She explained how the Germans had made war by invading neutral Belgium in 1914, knowing full well they were breaking international law. The title of her latest book, A Scrap of Paper (2014), alludes to the phrase that the German chancellor used to describe the international agreement governing Belgium's neutrality: it meant that little to him.
Hull described to my class the blockade's origins, what the Germans had thought and done, what the British were thinking, how they reached the decision to initiate the blockade, and what its likely impact was. But one concept stood out and remained a topic for discussion for the rest of the semester, even finding its way onto the final exam: it was the Latin phrase tu quoque. A literal translation of the phrase is "you also." Tu quoque is a rhetorical strategy whereby, instead of arguing directly against the claim of your opponent, you challenge their right to make an argument by charging them with hypocrisy. For example: the British government asserts that Germany violated international law by invading neutral Belgium and persecuting its inhabitants. The German government retorts that the British government itself is in breach of international law for having subsequently initiated a naval blockade against Germany, cutting off not only its supply of raw materials, but also (potentially) food to civilians.
The tu quoque is as old as the hills. Cicero used it to win a case in the trial of the exile Ligarius: "You are accusing one who has a case, as I say, better than your own." The Nazis were especially adept at deploying it. In 1942, the Nazi propaganda minister Joseph Goebbels confided to his diary: "The question of Jewish persecution in Europe is being given top news priority by the English and the Americans…We won't even discuss this theme publicly, but instead I gave orders to start an atrocity campaign against the English on their treatment of Colonials." There have been countless examples of tu quoque since. The Soviets countered American claims of human rights abuses with the phrase "And you are lynching negroes," which has its own entry on Wikipedia. Some Turkish scholars have used tu quoque to argue against claims that the Ottoman Empire instigated a genocide against the Armenians in 1915: "No nation is innocent. [T]hough the West has always accused the rest of the world of not being civilized enough, no other nations can be compared with the Germans, French, or Americans if we are talking about racism, fascism, and genocide."
In logic, the tu quoque is considered a fallacy, because it does not actually controvert the original statement. If anything, it confirms the moral valence of wrongdoing, declaring: Yes, I have done wrong, but so have you.
I Hold Things Up
As a carpenter I learned, before you can leverage things apart
you have to find purchase. You have to have a place where a pry-bar
can be slipped in or driven with a hammer to separate.
That being done, whether by violent or persuasive means,
when two factions have been split
they're easier to manipulate.
These are also political techniques.
They apply as well to sweaty things.
They dictate the tone and conditions of our species' life.
They reach into souls and wrench them.
Though pneumatic they're not ephemeral.
They're tough and mean as muscle.
As a carpenter I also learned
If you set a post upon a solid pier
and brace it well it will never
tilt in glory
it will simply know
I'm here to serve
I hold things up,
end of story.
by Jim Culleny
by Emrys Westacott
In evaluating candidates for political office there are two main things to consider:
a) their ideology–that is, their political views and general philosophy
b) their personal qualities
With respect to ideology, the most important questions one should ask are these:
· Are their beliefs true? (Do they hold correct beliefs on, say, climate change, or on whether a particular policy will increase or reduce poverty, crime, unemployment, pollution, or the likelihood of war?)
· Do I share their values and ideals? (E.g. Are they willing to sacrifice economic growth for the sake of environmental protection (or vice versa)? Where do they stand on issues like gun control, abortion, euthanasia, capital punishment, foreign aid, gay rights, or economic inequality?)
· Whose interests do they represent? (Do they generally favor policies that benefit the rich, the middle class, the poor, employers or workers, corporations or consumers, cities or rural communities?)
Regarding personal qualities, the ones that matter most are:
· knowledge – Are they decently informed about the world and the issues they will be dealing with?
· intelligence – Are they able to understand and think through complex problems?
· wisdom – Are they reasonable? Do they exercise good judgment?
· effectiveness – Do they have the practical skills to realize their goals?
· integrity – Are they truthful? Is what they do consistent with what they say? Are they motivated by a concern for the public good rather than by self-interest?
These personal qualities obviously cannot be possessed absolutely but only to a greater or lesser degree. And they may often conflict. Most politicians who are effective sometimes have to compromise their integrity, and the first compromise is invariably made before they hold office. As the historian George Hopkins (emeritus professor at Western Illinois University) has observed, "all presidents lie for the simple reason that if they didn't, we wouldn't elect them." A candidate who was perfectly truthful would be ineffective because they would probably never get the chance to implement any of their ideas.
Effective governance may also require leaders to lie, mislead, hide the truth, and break promises. Franklin Roosevelt was by any account a highly effective president; but in the two years prior to Pearl Harbor, he consistently told the American public that he was fully committed to keeping the US out of any foreign wars while simultaneously, and secretly, preparing the country for war against Japan and Germany. The political leaders we are most inclined to venerate are those like Lincoln or Mandela who, in addition to possessing the other qualities listed above, somehow manage to be practically effective with minimum loss of integrity.
by Libby Bishop
Amid the latest privacy kerfuffle in which WhatsApp agreed to sell users' data to its parent Facebook, an article published by Jackman and Kanerva in the Washington and Lee Law Review Online that describes new procedures for research review at Facebook could be deemed inconsequential, or at best, ironic. Even readers familiar with the outcry over Facebook's "emotion contagion" experiment might conclude, with boyd (2015), that Institutional Review Boards are not the solution (IRBs are committees that assess the ethics of federally funded research in the U.S.), and move on to the next item in their newsfeed. That would be a mistake, for there is more at stake here. First, Facebook has over 1.6 billion users, all of whom are potentially its research subjects and thus would be affected by these procedures. Second, the authors hope the principles they present will "inform other companies" (Microsoft has also recently formed a review group: https://vimeo.com/134004122). Most important, however, this new system at Facebook provokes urgent questions about the role of review systems in achieving ethical research.
The Facebook contagion experiment
In 2014, researchers at Facebook and Cornell University published research that provided evidence that online social networks can transmit large-scale emotional contagion (Kramer, et al., 2014). The experiment demonstrated that reducing positive inputs to users' feeds resulted in users posting fewer positive and more negative posts; when negative inputs were reduced, the pattern was reversed: there were more positive and fewer negative posts. The change was small but statistically significant. Kramer et al. emphasised the meaning of their findings: emotional contagion had been shown to occur without face-to-face contact and non-verbal cues. Moreover, the authors pointed out that small changes can have "large aggregated consequences" (the sample size was 689,003), in part because of connections between emotions and off-line behaviour in areas such as health.
The import of the findings was swamped by the ensuing public outcry about the methodology, in particular the manipulation of users' feeds, and hence emotions, without their consent. But a key question that emerged was the issue of research review: had the project been subjected to any formal ethical review, and if not, why not? Editors of the journal where the article had been published wrote an Editorial Expression of Concern (Verma, 2014) stating that Cornell had confirmed that the research did not fall under the purview of its Human Research Protection Program because the experiment had been done at Facebook and not Cornell. Furthermore, because the research was not federally funded, it was not required to go through an IRB (boyd, 2015).
by Mathangi Krishnamurthy
"Kar le kar le, tu ik sawaal,
Kar le kar le, koi jawaab,
Aisa sawaal jo zindagi badal de…
[Ask a question,
Try and answer,
The kind of question that will change your life]
It's just a question of a question."
—Title track, Kaun Banega Crorepati
Light bursts forth like rays from the sun. The Indian film star Shahrukh Khan pirouettes across a set, made deliberately larger than life. It is glitzy, neon-inundated and disproportionate. Women in some form of modernized traditional Indian clothing stand behind the so-called King Khan as he exhorts the audience to ask a question. The irony, of course, is that in this Indian version of "Who wants to be a millionaire?" it is Khan who asks the questions. As he swiftly changes clothes from scene to scene, a rapper in one moment, a suave, sleazy conman of some sort in the next, and an overgrown American teen hipster in yet another, his supporting cast ranges from close-cropped, capped rappers to women of unidentified nationality in golden and silver lamé. In another frame, Shahrukh in waistcoat and trousers dances with women in tartan mini-skirts and white shirts. They all gyrate to a catchy tune that repeats the mantra of the one question that can change lives.
Slowly seducing the audience with song and dance, Shahrukh coaxes them into participation, insisting that they must come out with their deepest desires since this opportunity might not arise again. Assuring them that they will win the game he asks them to strengthen their hopes. He ends with the oxymoronic question "Is a hot chick cool or a cool chick hot?" On the poorly manifested and highly pixellated version that I watch on the Internet, the paucity of this content seems glaringly obvious.
Danny Boyle's film Slumdog Millionaire is set in Mumbai and chronicles the unexpected success of a contestant on Kaun Banega Crorepati, the Indian version of Who Wants to be a Millionaire. A rags-to-riches chronicle of a protagonist called Jamal Malik who wins the game show, the plot is nothing if not predictable. The twists in the plot and the form of resolution are, however, what interest this essay. Jamal is also what Prem, the character who is Shahrukh's counterpart in this reel-life version of real life, refers to as a slumdog. By winning the game's prize of Rupees one crore, Jamal stands as testimony to what chance can offer even the most underprivileged, as long as they have the hunger to grab it.
Nguyen Phan Chanh (1892-1984). Channeling Experience with a Medium, 1931.
by Richard King
When Lionel Jospin, the Socialist Party candidate for the 2002 French Presidential election, unexpectedly finished in third place in the initial round of voting – behind the Gaullist conservative Jacques Chirac (first) and the far-right candidate Jean-Marie Le Pen (second) – progressive and leftwing voters in France were presented with a stark choice: should they support Chirac in the run-off, or should they abstain from voting at all and risk a (still unlikely) victory for the Front National? Characterising the decision as a choice between ‘cholera and plague', most progressives took the first option, often demonstrating their unhappiness by turning up to vote in rubber gloves and nose-pegs. One group of activists even set up a symbolic shower in a Paris square and invited Chirac voters to pass through it after voting.
Fourteen years later, the conflict between political pragmatism and political principle is as relevant as it ever was. With rightwing demagogues on the march in Europe (Le Pen's superior genes go marching on in the shape of his youngest daughter, Marine), a situation may soon arise where progressive voters have to choose between, say, a Jobbik or a Danish People's Party on the one hand and some milquetoast neoliberal or smooth-talking Tory on the other. In the UK, Labour Party members are warned that a vote for Jeremy Corbyn in the upcoming leadership election is sure to mean another Conservative government; vote for the more electable (i.e. centrist) candidate, they are told, lest the Tories have their evil way. And then of course there's Hillary and The Donald – a cholera-or-plague choice if ever there was one. Having run Clinton close in the primaries and set out an agenda for change far to the left of the Democratic candidate, the Sandernistas are faced with a dilemma. Should they sink their differences with the Clintonoids? Or should they stay pure and risk a Trump win?
Thus the lesser evil calculus – the proposition that one must choose the candidate most likely to win who will do the least harm – continues to exert its pull. ‘Vote for me,' says the ‘cholera' candidate, ‘not because I have good policies but because I'm not the other guy, and the other guy, well, just look at him! You wouldn't want that on your conscience, now would you?' The pitch is as old as politics itself and a constant source of frustration to those who see the need for more than just piecemeal change. It is an appeal to fear, and a brake on real progress. ‘Don't waste your vote on a principle,' say the cholerites; ‘Don't risk a bout of plague.'
by Sue Hubbard
I am silver and exact. I have no preconceptions.
Whatever I see I swallow immediately
Mirror —Sylvia Plath
Like many good ideas it is deceptively simple. The artist Mark Wallinger has installed a large mirror across the ceiling of Sigmund Freud's iconic study in Maresfield Gardens. The effect is dramatic. Immediately the space is doubled, turned inside out so that top and bottom, reflection and reality all become blurred. What is real suddenly seems like an illusion. Everything is destabilised - the famous couch, the archaeological figurines and artefacts arranged on Freud's desk, the leather books and densely patterned Turkish rugs. It is disorientating. Are we looking at an actual object or its doppelganger? With its heavy red velvet curtains and oriental drapes the room surrounds us like a womb and the couch, with its comfortable Persian cushions, and Freud's chair at the head where he would have sat out of sight of his analysand, invites us to lie down and rehearse our infantile fantasies and dreams. As we look up we catch sight of our own small, isolated reflection peering into this complex double space.
The mirror has been used throughout art history as a metaphor for both revelation and philosophical conundrum. Some of the oldest drawings found on temple walls and papyrus scrolls depict images of Egyptian Neters gazing into hand-held mirrors. In Diego Velázquez's Las Meninas, one of the world's most enigmatic paintings, the artist melds the fabric of reality and the illusion of identity in a game of mirrors. And in his Rokeby Venus, the goddess of Love, the most beautiful of all the goddesses, is shown lying languidly on a bed as her son Cupid holds up a mirror – in an act that is at once both narcissistic and Oedipal. As Venus looks both at herself and the viewer, the borders between self and other disintegrate.
by Brooks Riley
by Mara Naselli
In the spring of 1917, Alfred Kreymborg brought Marianne Moore to a baseball game. In his autobiography, he recalls how they stood on the crowded elevated on the way to the Polo Grounds, holding the straps as the train lurched. Moore held forth on technical matters of poetics, undisturbed. Kreymborg, the editor of Others, strongly supported Moore’s work and held her in “absolute admiration.” He was not alone. In the early years of Moore’s career, when she circulated among the art and literary avant-garde of New York, men and women alike were enthralled. Artists asked to make her portrait. Scofield Thayer fell in love with her. Even Ezra Pound sent her pages of erotically charged prose, which she ignored. Moore was intelligent, striking, and famously felicitous in her speech. “We’re a pair of tongue-tied tyros by comparison,” said William Carlos Williams.
“Never having found her at a loss on any topic whatsoever,” Kreymborg writes, “I wanted to give myself the pleasure at least once of hearing her stumped about something.” Surely baseball was out of her reach. When Moore praised the first strike, Kreymborg asked if she knew who was pitching.
“‘I’ve never seen him before,’ she admitted, ‘but I take it it must be Mr. Mathewson.’”
“I could only gasp,” Kreymborg writes.
Actually, it wasn’t Christy Mathewson on the mound that day, but Moore had read Pitching in a Pinch and knew enough to thwart Kreymborg’s sporting attempt to find the limits of her knowledge. How difficult it is to put a smart woman in her place.
by Humera Afridi
"The pity of partition was not that instead of one country there were now two—independent India and independent Pakistan—but the fact that ‘human beings in both countries were slaves, slaves of bigotry… slaves of religious passions, slaves of animal instincts and barbarity.'" —Ayesha Jalal, The Pity of Partition: Manto's Life, Times, and Work across the India-Pakistan Divide
On a recent—I feel the urge to insert ‘historic'—trip to India—any trip to India, after all, is momentous for a person born in Pakistan, it may well be her last, given the vagaries of the visa-granting authorities—I spent the greater part of my 11 days communing with those who'd passed into the after-life. I sat cross-legged outside marble screen walls whispering supplications at the tombs of Sufi saints in Delhi, while the ancient, beautiful city crumbled all around me. Within the murmuring walls and environs of the shrines, encapsulated in the passionate verses of the qawwals singing in the courtyards, the spirit of the past was palpable and boundaries between realms of time diaphanous.
Poet, mystic, and daughter of the Mughal Emperor Shah Jehan, Jahanara Begum, whose tomb lies across the courtyard from Hazrat Nizamuddin Awliya's own tomb, whispered past me one evening. Dressed in a long muslin gown fragrant with perfumed ittar, she stepped directly into the sanctum sanctorum, unhindered and seemingly oblivious to the present-day ban against women entering the doors of the saint's shrine, to rest a garland of crimson roses, threaded with her own hands, on the blessed saint's tomb. Time collapsed; myriad histories intersected. In the heightened atmosphere created by a feeling of belonging on this exilic land, fact and imagination co-mingled to manifest new truths.
Not just at the tombs, but also in the clogged lanes of Old Delhi—Shahjahanabad as it was known before the British Raj—with my feet sunk in the sodden ground of the monsoon-humid now, dodging the tyranny of oncoming scooters and rickshaws, I found myself seeking out the palimpsest-like layers of the city's past. The pungent aromas of the marketplace and the stabbing sight of a crippled dog rooted me in the present but I walked wraith-like into history. Unfinished, amorphous stories—familial and historical—propelled me on with urgency. Time is of the essence, they whispered, yearning to be resolved.
I invite you to tell me why I am wrong. I wrote a similar post on Facebook and now want to engage you here in this debate. So tell me: am I wrong, and why?
The issue about the hijab, burka and now burkini is not simply about its presence on the beach or in public institutions and spaces including schools, or about the presence of Islam in public spaces in Europe or about freedom of choice there. The issue is about the hijab, burka and burkini becoming the symbol of Islam and all that there is about Islam.
A garment now defines Islam. A cloth has become Islam. The issue is that modesty and virtue have been reduced to the abundance or lack of abundance of a garment. And that indeed is a shame.
It isn't that the space for hijabs and niqabs is threatened to be reduced. It is Islam that is being reduced. Reduced to a piece of cloth. And who is responsible for this?
Those responsible for doing so are Muslim women who wear it. Indeed it is about misogyny and patriarchy. Those who promote it are women. And they are predominantly articulating themselves to the West. They are reducing themselves, reducing the air around them, the light, the conversation, and they are reducing the faith that they profess to belong to by this reductionist action.
They have reduced Islam to a piece of cloth. There were two American Muslim women who participated in the Olympics and won medals. NBC and the media only played up and focused on one. Yup, the one wearing the hijab. Regularly, those women invited to speak about Muslims or Islam or represent Muslims are wearing hijabs. Those appointed and recruited to police and surveil and provide security duties are in hijab. Why?
Sunday, August 28, 2016
Charles J. Shields at his own website:
Mark Vonnegut has said that the father he knew growing up wasn’t a famous author. He was a family man, a struggling freelance writer, who couldn’t get a job teaching English at the local community college. And that’s not to mention his father’s disastrous foray into selling SAAB automobiles on Cape Cod, either—another of Kurt’s attempts to make money.
For almost twenty years before the publication of Slaughterhouse-Five in 1969, Vonnegut was broke most of the time. (Someone claiming to be his newsboy told me he was somehow never around at the end of the month to pay for delivery.)
The poignancy of how success and the comfort of money eluded him year after year can be summed up in a tale, here told for the first time: Kurt Vonnegut’s idea for an atomic bowtie (alas, another anecdote that didn’t make the final draft of his biography). In 1950, Vonnegut was sure that a bowtie polka-dotted with the symbol for nuclear energy would be a big seller and bring him money he so desperately needed to keep writing and supporting his family.
Kristina Killgrove in Forbes:
The history of modern forensic anthropology is a bit murky. As an applied science rather than a “pure” one, forensics was shunned for decades, its findings inadmissible in court. But the 19th-century murder of a Harvard Medical School doctor launched the field, revolutionized law in the process, and began our longstanding fascination with TV shows like CSI and Bones.
The story starts just before Thanksgiving in 1849, when Dr. George Parkman went missing. Parkman was from a wealthy Boston family, an old-timey Doogie Howser who entered Harvard at age 15. He went to medical school in Scotland, returning after the War of 1812. Parkman donated some land in Boston to Harvard Medical College so that the school could relocate from Cambridge. He was also well-known for lending money from his considerable fortune and for walking around town to collect on those debts.
A professor of chemistry and geology at Harvard, John White Webster, was one of those debtors. He had been having financial problems, requiring him to give up his family’s Cambridge mansion. Webster’s salary as a lecturer at Harvard simply didn’t cover his grandiose lifestyle. So Webster borrowed $400 from Parkman in 1842. Seems like a paltry sum, but the equivalent in today’s dollars is nearly $10,000.
Duncan Fyfe in Slate:
Virtually every film in modern memory ends with some variation of the same disclaimer: “This is a work of fiction. Any similarity to actual persons, living or dead, or actual events, is purely coincidental.” The cut-and-paste legal rider must be the most boring thing in every movie that features it. Who knew its origins were so lurid?
For that bit of boilerplate, we can indirectly thank none other than Grigori Rasputin, the famously hard-to-assassinate Russian mystic and intimate of the last, doomed Romanovs. It all started when an exiled Russian prince sued MGM in 1933 over the studio’s Rasputin biopic, claiming that the American production did not accurately depict Rasputin’s murder. And the prince ought to have known, having murdered him.
Here’s the story. In 1916, the fabulously wealthy, Oxford-educated Prince Felix Yusupov was one of several Russian aristocrats agonizing over the unseemly influence that Rasputin—the magical healer, charismatic lech, and peasant—had over the Tsar and, particularly, the Tsarina. In December, Yusupov invited Rasputin to his palace, where he offered him cyanide-laced cakes and then shot him.
Although the Tsarina was distraught, the Tsar let Yusupov off lightly, exiling the prince and his wife Irina. (In doing so, he inadvertently spared them from the impending slaughter of the revolution.)
Sixteen years later, MGM produced Rasputin and the Empress, based on those events.
Alva Noë at NPR:
And there's the question of what types of animals you can love. You're allowed to love a dog or a cat. But can you, should you, is it appropriate, to love other kinds of animals? My brother had a hermit crab when he was a boy. I don't know how he felt about it — but can a healthy, well-rounded person love a hermit crab?
I'm not passing judgment. It strikes me that the shifting, unstable, historical, emotional, playful and earnest feelings we Americans have about animals have a lot to do with other kinds of value, meaning and quality in our lives.
And, so, it is with a real sense of curiosity that I wonder about our varying relationships with animals. Why, for example, is it that we do not even notice road kill, for the most part — let alone stop to mourn it? And what can be said about the fact that the sale of bull semen is a big part of the cattle industry — and the methods used to create supply?
You can get the salacious details in Jane C. Desmond's fascinating new book Displaying Death and Animating Life: Human-Animal Relations in Art, Science, and Everyday Life. This is a scholarly work devoted to looking at the variety and tensions surrounding human-animal relations in, as the subtitle puts it, art, science and everyday life. Her focus, in this gripping book, is our contemporary American society (to the extent that there is any such unified thing).
Nora Caplan-Bricker in The New Yorker:
The sly and playful Austrian performance artists Sonja Stummerer and Martin Hablesreiter want to make us reëxamine the culinary mores that we take for granted.
The leap to high-concept performance happened almost by accident: they wanted to submit photos of people eating for an exhibition about table manners at the 2011 Gwangju Biennale, in Korea, and a tight budget forced them to serve as their own models. “Many people liked these photos so much that for the next book, ‘Eat Design,’ we decided to make forty or fifty photos of ourselves eating,” Hablesreiter said. Eventually, they began making videos as well.