Over at Philosophy Bites:
Is there a place for the sacred in contemporary life? Roger Scruton believes there is. In this discussion he explains his understanding of the experiences he calls sacred.
Richard Marshall interviews Pascal Engel in 3:AM Magazine [Photo: Claire Poinsignon]:
3:AM: As an epistemologist do you think knowledge is elusive because the term is empty, and would that be an approach developed out of your work on Ramsey’s principle, (which I’ll ask about in a minute)?
PE: I do not take knowledge to be elusive or empty. On the contrary, it seems to me to designate a bona fide natural kind, although not one which is easy to pin down. Unlike contextualists, I take expressions such as “knowledge” or “knows” to be invariant across contexts (although I do not deny a certain amount of context sensitivity in our epistemic terms). In the current jargon I am an insensitive invariantist. Unlike eliminativists about knowledge, among whom I count a number of experimental philosophers, I do not think it is an empty term. I do not think, however, that knowledge can be defined through a set of necessary and sufficient conditions. This I take to be one of the lessons of Gettierology.
This does not mean that there is nothing to say about knowledge and that epistemologists have to pack up and leave. Although knowledge cannot be defined in the strict sense of this term, we can still characterize it functionally, through its relationships with other notions, such as those of justification, evidence, reliability, or safety, and we can try to give explanations and theories about these notions. Thus it makes sense to ask whether internalism or externalism about knowledge are correct, whether reliabilism or evidentialism are correct, and to work out the best versions of these.
I also take it that, although the analysis of knowledge is basically a conceptual and a priori matter, we can learn a lot (although not everything) from cognitive science, ethology and especially from developmental psychology about what knowledge is. These issues cannot be dealt with only at the level of an account of knowledge in general, but have to be addressed for particular kinds of knowledge, depending on their sources (perceptual, inferential, testimonial, a priori) and their domains (natural, scientific, moral, aesthetic). Perhaps there is no single account which works fully for all domains, but I take it that they have a number of traits in common (in this respect the shape of the issues is pretty much like that for truth, which can vary across domains, but keeps a functionalist core).
Ramsey’s ideas about knowledge, which were in many ways pioneering, are perfectly consistent with this functionalist account.
Anne Fausto-Sterling in The Boston Review of Books:
Consider the case of Grady Nelson, a Florida man who was convicted of murdering his wife, Angelina Martinez. During the 2010 hearing to decide if Nelson should be sentenced to death, his lawyer showed the jury an image suggesting that Nelson had an abnormality in his left frontal lobe. At least two jurors were impressed by the evidence, shifting the voting balance toward life imprisonment. As much as I oppose the death penalty, the outcome raises some basic questions. Was the brain scan taken at the time of the murder, or, more likely, after years in jail? Could the brain deformations be linked to the murder? Scientifically, the introduction of this neural image was pretty lame, but the emotional impact was huge, and it carried the day for the defendant, who escaped execution.
Or what about the flurry of news stories this past December with headlines such as “Brain wiring in men, women could explain gender differences,” all reporting on a publication in the Proceedings of the National Academy of Sciences, which used neural imaging to produce average connectomes for brains of several hundred males and females.
Again, the images are compelling, but the science is not. First, neither the Proceedings article nor any other reputable research has tied specific wiring diagrams to variation in behaviors or cognitive skills. Just as with the fly larvae, the activity of differently wired networks can lead to the same behavior. Indeed, in an earlier study, the Proceedings researchers showed only small differences in the “big skills”—map reading, social cognition, spatial processing—that supposedly separate men from women. Second, the researchers do not assess the possibility that different experiences of gender might themselves produce differently wired brains. Did the young people in their samples play the same sports, have the same hobbies, wear the same type of clothing, or study the same subjects in high school?
H. Bruce Franklin in The LA Review of Books:
Teaching students in the 21st century — including the combat veterans, National Guard soldiers, and reservists in my classes at Rutgers University, Newark — I have to keep reminding myself that they have lived their entire conscious lives during America’s endless warfare. For them, that must seem not just normal, but how it has always been and always will be. Is that also true for the rest of us?
I had to rethink this question a few months ago, when I woke one day to discover that I was 80 years old. For more than half a century I’ve been involved in struggles to stop wars being waged by our nation or to keep it from starting new ones. Before that, in the late 1950s, I had spent three years in the US Air Force flying in Strategic Air Command operations of espionage and provocation against the Soviet Union and participating in launches for full-scale thermonuclear war. Some of these launches were just practice, but a few were real war strikes that were recalled while we were in flight just minutes before it would have been too late. (I recall with embarrassment that I never had a flicker of doubt about whether I should be participating in the start of a thermonuclear Armageddon.) And before that were four years of ROTC, which I joined during the Korean War, a war that had started when I was 16. From age 11 to 16, I had bounced right from the Victory Culture at the end of World War II into the repression and militarization of the early Cold War years.
So it dawned on me that living one’s life during America’s Forever War is hardly unique to those millennials I’m teaching. How many people alive today have ever lived part of their conscious lives in a United States of America at peace with the rest of the world? Would someone even older than I am have any meaningful memory of what such a state of peace was like? How many Americans are even capable of imagining such a state? I can remember only two periods, bracketing World War II, when I believed I lived in a nation at peace. And even these were arguably just childish illusions.
Ismail Khalidi interviews Max Blumenthal in Guernica:
Guernica: The last month has seen the killing of three teenage Israeli settlers near Hebron and a massive Israeli sweep into the West Bank in which hundreds of Palestinians were arrested, injured, and killed. Earlier this month a Palestinian teen was abducted and killed by Israelis in Jerusalem (who are said to have burned the boy alive). Now the Israeli military is engaged in an offensive against Gaza while Hamas fires rockets toward Israel. What do the last month’s events tell us about the state of the conflict?
Max Blumenthal: The entire crisis occurred against the backdrop of a peace process that Netanyahu was blamed for destroying and in the wake of the Hamas-Fatah unity deal, which the US recognized and which Netanyahu was determined to destroy as well. The kidnapping of the three Israeli teens by what appears to be a rogue Hamas cell apparently seeking to generate some kind of prisoner exchange was too good of an opportunity for him to waste.
And so, as I’ve documented with on-the-record sources, Israeli investigators, Netanyahu and the honchos of the military-intelligence apparatus knew by the sound of gunshots on a recorded call by the teens to the police that the teens were killed right away. And they chose to lie, not only to the teens’ parents, whom they sought to deploy as props in their global PR campaign, but to the Israeli public. Through a military gag order, the Israeli media was not allowed to report on the investigation or the details of the recorded phone call. With the Israeli public and the world convinced that the teens were alive, Israeli troops ransacked the West Bank under the guise of a rescue mission, and embarked on a global propaganda campaign centering around the hashtag #BringBackOurBoys. The Israeli public was not emotionally prepared for the discovery of the teens’ bodies because they thought they would be returned home as Gilad Shalit was. So Netanyahu and his inner circle set the public up for a truly dangerous reaction.
In Goliath, I detailed the rise of anti-Arab mobs comprised of soccer thugs and of the burgeoning anti-miscegenation movement in Israel. Netanyahu’s manipulation of the kidnapping and his response to the discovery of the dead teens—he said, “Vengeance for the blood of a small child, Satan has not yet created”—validated these elements and emboldened them as they set out for revenge. Those young men who abducted the Palestinian teen Mohamed Abu Khdeir met at one of the revenge rallies in Jerusalem; they were fans of the soccer club Beitar Jerusalem, which I wrote about in Goliath and whose racist history is absolutely legion.
Andrew Bacevich in Notre Dame Magazine:
For well over 30 years now, the United States military has been intensively engaged in various quarters of the Islamic world. An end to that involvement is nowhere in sight.
Tick off the countries in that region that U.S. forces in recent decades have invaded, occupied, garrisoned, bombed or raided and where American soldiers have killed or been killed. Since 1980, they include Iraq and Afghanistan, of course. But also Iran, Lebanon, Libya, Turkey, Kuwait, Saudi Arabia, Qatar, Bahrain, the United Arab Emirates, Jordan, Bosnia, Kosovo, Yemen, Sudan, Somalia and Pakistan. The list goes on.
To judge by various official explanations coming out of Washington, the mission of the troops dispatched to these various quarters has been to defend or deter or liberate, punishing the wicked and protecting the innocent while spreading liberal values and generally keeping Americans safe.
What are we to make of the larger enterprise in which the U.S. forces have been engaged since well before today’s Notre Dame undergraduates were even born? What is the nature of the military struggle we are waging? What should we call it?
For several years after 9/11, Americans referred to it as the Global War on Terrorism, a misleading term that has since fallen out of favor.
More here. [Thanks to Christopher Lydon.]
Many of us get confused by claims of how much the risk of a heart attack, for example, might be reduced by taking medicine for it. And doctors can get confused, too. Just ask Karen Sepucha. She runs the Health Decisions Sciences Center at Boston's Massachusetts General Hospital. A few years ago she surveyed primary care physicians, and asked how confident they were in their ability to talk about numbers and probabilities with patients. “What we found surprised us a little bit,” Sepucha says. “Only about 20 percent of the physicians said they were very comfortable using numbers and explaining probabilities to patients.”
Doctors, including Leigh Simmons, typically prefer words. Simmons is an internist and part of a group practice that provides primary care at Mass General. “As doctors we tend to often use words like, 'very small risk,' 'very unlikely,' 'very rare,' 'very likely,' 'high risk,' ” she says. But those words can be unclear to a patient. “People may hear 'small risk,' and what they hear is very different from what I've got in my mind,” she says. “Or what's a very small risk to me, it's a very big deal to you if it's happened to a family member.” Simmons and her colleagues are working on ways to involve their patients in shared decision-making. The initiative at Mass General gives patients online, written and visual information to help them. One of the goals is to make risk understandable — bridging the gap between percent probabilities and words.
Douglas Coupland in FT Magazine:
A way of rethinking the global web of overlapping allegiances would be to wonder what might happen if Earth instituted a planet-wide citizenship flush. Whoever you are, you now have to choose just one passport – so, which is it going to be? The answer would probably boil down to multiple factors, the largest including personal identity, ease of crossing borders, consular access while abroad and, of course, taxes. Sure, a low tax rate is great, but if I break my arm do I really want to spend $75,000 fixing it? Yes, popping in and out of Europe is terrific, but would I want to forfeit getting a lump in my throat if I hear my ex-national anthem playing? What exactly is citizenship? What does it mean to say I’m this and you’re that? The fact that almost every country on Earth makes it very difficult to become a citizen means that citizenship has to mean something. I think this week when I was asking my guests what citizenship they would choose if they could only have one, I was unwittingly taking them to task for trying to have their cake and eat it, too. Can you really have the best of all worlds, bing bang boom, whenever it suits your needs? I suspect polycitizenry is a creation of the 20th century, and a creation whose days are numbered. As the world gets ever more pay-per-use, the luxury of low-commitment semi-disposable allegiance seems, if nothing else, too expensive. If nothing else, Canada put a number on it.
He built his palette
around the ragged colors
of her tortoise-shell calico,
piled like clotted earth
in a sunny corner.
His worn soul
embraced the tales
of those crumpled old shoes—
shredded laces, wilted leather—
scuffed with the stuff of a heart
that beat wild paths through
fields of irises, sunflowers, wheat,
the streets of Arles, and
dreams lost to the night sky.
by Elaine Frankonis
from What the Seasons Leave, soon to be published by
Finishing Line Press, 2014
Michael Schulson in Aeon (Illustration by Tim McDonagh):
Over the millennia, cultures have expended a great deal of time, energy and ingenuity in order to introduce some element of chance into decision-making. Naskapi hunters in the Canadian province of Labrador would roast the scapula of a caribou in order to determine the direction of their next hunt, reading the cracks that formed on the surface of the bone like a map. In China, people have long sought guidance in the passages of the I Ching, using the intricate manipulation of 49 yarrow stalks to determine which section of the book they ought to consult. The Azande of central Africa, when faced with a difficult choice, would force a powdery poison down a chicken’s throat, finding the answer to their question in whether or not the chicken survived – a hard-to-predict, if not quite random, outcome. (‘I found this as satisfactory a way of running my home and affairs as any other I know of,’ wrote the British anthropologist E E Evans-Pritchard, who adopted some local customs during his time with the Azande in the 1920s).
The list goes on. It could – it does – fill books. As any blackjack dealer or tarot reader might tell you, we have a love for the flip of the card. Why shouldn’t we? Chance has some special properties. It is a swift, consistent, and (unless your chickens all die) relatively cheap decider. Devoid of any guiding mind, it is subject to neither blame nor regret. Inhuman, it can act as a blank surface on which to descry the churning of fate or the work of divine hands. Chance distributes resources and judges disputes with perfect equanimity.
Above all, chance makes its selection without any recourse to reasons. This quality is perhaps its greatest advantage, though of course it comes at a price. Peter Stone, a political theorist at Trinity College, Dublin, and the author of The Luck of the Draw: The Role of Lotteries in Decision Making (2011), has made a career of studying the conditions under which such reasonless-ness can be, well, reasonable.
‘What lotteries are very good for is for keeping bad reasons out of decisions,’ Stone told me. ‘Lotteries guarantee that when you are choosing at random, there will be no reasons at all for one option rather than another being selected.’ He calls this the sanitising effect of lotteries – they eliminate all reasons from a decision, scrubbing away any kind of unwanted influence. As Stone acknowledges, randomness eliminates good reasons from the running as well as bad ones. He doesn’t advocate using chance indiscriminately. ‘But, sometimes,’ he argues, ‘the danger of bad reasons is bigger than the loss of the possibility of good reasons.’
Marc Weitzmann in Tablet:
Last May 24, a Saturday, at 3:27 p.m., according to the accusatory file, a man appeared at the doorstep of the Jewish Museum of Belgium. Out of his bag he pulled a Magnum .357 and fired. The bullets hit Emanuel and Miriam Riva, a couple of Israeli tourists in their mid-fifties who had just entered the place. Each was struck in the back of the skull, and they died on the spot. (Later on, a witness showing up a few minutes after the killing would post on his Facebook page a picture of Miriam’s body lying in her blood, her hand still carrying the museum’s pamphlet program; what her children, ages 15 and 16, living in Israel, thought of the photograph is not known.) Letting go of the Magnum, the shooter then took from his bag a Kalashnikov, aimed it at a 65-year-old woman by the name of Dominique Sabrier, and shot her, also in the head.
A retired art publisher of Polish descent, Sabrier had left France for Brussels only two months before. Her reason for moving, ironically enough, was, according to her friends, the anti-Semitic atmosphere that now permeates France. The Toulouse killing had scared her, as had the hate demonstration in Paris the previous winter—when, for the first time since World War II, anti-Jewish slogans were chanted in public in the French capital. In Brussels, a city Sabrier knew, she hoped to live a quiet retirement. She had registered for law classes at the Free University of the town and was volunteering as a tourist guide at the museum.
Alexander Strens, 25, found the time to seek refuge under his reception desk—before the killer found him and shot him, once again in the head. Strens, hired at the museum’s communication department the previous year, was the only victim still alive after the shooting. Sent to the Saint-Pierre hospital of Brussels, he was declared brain dead there the next day. He died on June 6, raising the murder total to four. Although Strens’ mother is Jewish, his father is a Muslim Berber from Morocco and, in accordance with the wishes of both families, he was buried in the Muslim cemetery of Taza.
Then, with Strens, and with no more reason than it had when it started, the massacre ends. The surveillance video shows the shooter running away, bag in hand. He disappears.
Brussels is the capital of Europe. The day after the shooting, an election was held for a new European parliament. Xenophobic nationalist parties across the continent were predicted to win a lot of seats even before the killing, and as soon as the news broke the already perceptible tension among the continental political class was imbued with a new sense of frailty and paranoia: Was the scheduling of the massacre just a coincidence? Or was a message being sent—and by whom? Europe was under siege, no doubt, and humiliated, too.
Dinah Birch at the Times Literary Supplement:
Andersen was profoundly committed to the Romantic ideal of the extraordinary genius, singled out for distinction from birth. “It doesn’t matter if you’re born in a duck yard if you’ve lain in a swan’s egg”, as he explains in “The Ugly Duckling”, one of his most self-reflective tales. His desire to please his many acquaintances meant he could be irritatingly anxious and deferential, and his inclination to a fawning submissiveness in his relations with aristocratic patrons was vexing to those wishing to promote the professional dignity and independence of writers. But he never doubted his credentials as an artist. Despite his lifelong social uncertainties, he was convinced that his unique gifts meant that he was perfectly entitled to special treatment from the hands of fate and his friends.
Binding is especially persuasive in tracing Andersen’s creative relations with Walter Scott, whose work had been translated into Danish in the 1820s. Andersen’s first published tale, “The Apparition at Palnatoke’s Grave” (1822), was influenced by the character of Madge Wildfire in Scott’s The Heart of Midlothian (1818), a novel that acquired cult status among its European readers, and made a lasting impression on Andersen. The story was published under the pseudonym of Villiam Christian Walter, in a volume that included one of the plays that won the approval of the Royal Theatre. Choosing the name “Walter” was an act of homage, but it was also a bold statement of intent. Binding suggests that Gerda’s journey to the icy palace of Kay’s glamorous captor in “The Snow Queen” reflects Jeanie Deans’s indomitable walk to London, undertaken so that she can plead with Queen Caroline for the pardon and release of her condemned sister.
Alexandra Molotkow at The Believer:
Rock stars are the gods of the last century, avatars for the emotional and religious yearnings 1960s youth would have had nowhere else to place. Bob Dylan’s cryptic magnetism marked him as the Person With the Answers; Mick Jagger’s shaking hips stood for personal and sexual liberation. “Mick Jagger personified a penis,” wrote Pamela Des Barres, the famed groupie and author of several books, in her 1987 memoir, I’m With the Band: Confessions of a Groupie. As a teenager, she “rushed home from school every day to throb along with Mick while he sang: ‘I’m a king bee, baby, let me come inside.’”
Getting in close proximity to a rock star, then—for a night, or a string of tour dates, or the time it would take them to compose an album especially for you—would seem a service to a higher power. “Perhaps by embracing their cherished rock gods, groupies tap into their own divinity,” Des Barres wrote in 2007’s Let’s Spend the Night Together: Backstage Secrets of Rock Muses and Supergroupies. Unfortunately, rock stars are not gods but rather human beings whose emotions happen to resonate with millions—emotions that are inspired by other human beings, some of whom have written memoirs.
Joan Acocella at The New Yorker:
For people accustomed to the cooler precincts of modernist and postmodernist art, it is often a joy to reëncounter older, messier forms of theatre, with coincidences and murders and the like. Therefore, when I arrived at the Rose Theatre for “The Ghost Tale of the Wet Nurse Tree,” the Kabuki company Heisei Nakamura-za’s contribution to the Lincoln Center Festival, I was not surprised to find the lobby packed with people spending too much money at the snack bar and looking as though they were going to a soccer game.
Here, with considerable abridgment, is what happens in “The Ghost Tale of the Wet Nurse Tree.” The distinguished painter Shigenobu and his wife, Oseki, have a new baby boy. Hanging around the neighborhood is a self-styled samurai, Namie, wearing a hat the size of a washtub, with a nasty smirk on his face. Shigenobu announces that he’s leaving town to create a dragon painting for a famous temple. Incredibly, he entrusts the care of his wife and son to Namie.
Christopher de Bellaigue in The Guardian:
Along with the “Trojan Horse” controversy about the imposition of a strict Islamic ethos on a number of Birmingham schools, the disclosure that several hundred Britons have been to Syria and Iraq to fight with the jihadis has fired up those who believe that Islam represents an urgent threat to this country. Amid the hype – some of it justifiable, much not – nuance has inevitably been lost. It is significant that the British jihadis have chosen to realise their fantasies not here but in Mesopotamia. From a theological point of view, a caliphate can only be set up in Muslim lands. Britain would be a poor choice even for a pilot scheme – it has a substantial opposing majority and a competent intelligence service. As for the Birmingham “conspiracy”, that, too, is more complicated than it seems: while there have undoubtedly been moves to Islamise the schools' curricula and atmosphere, much of the pressure in this direction has come from parents. It would be understandable if law-abiding families with children at schools such as the formerly “outstanding” Park View Academy – now deemed “inadequate” and placed in special measures – felt targeted by former education secretary Michael Gove's campaign to “drain” the fundamentalist “swamp”. While in opposition, Gove authored a famously error-strewn and intemperate screed on Islamic fundamentalism, “Celsius 7/7”. And before his unexpected sacking, he wanted to inculcate “British values” in people whose social attitudes suggest they have had a bellyful.
The deeper concern is that a significant number of British Muslims are getting more conservative while much of the rest of society – including, of course, very many other Muslims – liberalises apace. Exporting high-profile hate preachers such as Abu Qatada is no solution, for whether one likes it or not the values of conservative Muslims are “British”, too. As the shortcomings of “Celsius 7/7” demonstrated, and as Innes Bowen confirms in her sober, meticulous and revelatory new book, the state's attitude towards British Muslims has been defined in part by ignorance.
In My Spare Time
During my long, boring hours of spare time
I sit to play with the earth’s sphere.
I establish countries without police or parties
and I scrap others that no longer attract consumers.
I run roaring rivers through barren deserts
and I create continents and oceans
that I save for the future just in case.
I draw a new colored map of the nations:
I roll Germany to the Pacific Ocean teeming with whales
and I let the poor refugees
sail pirates’ ships to her coasts
in the fog
dreaming of the promised garden in Bavaria.
I switch England with Afghanistan
so that its youth can smoke hashish for free
provided courtesy of Her Majesty’s government.
I smuggle Kuwait from its fenced and mined borders
to Comoro, the islands
of the moon in its eclipse,
keeping the oil fields intact, of course.
At the same time I transport Baghdad
in the midst of loud drumming
to the islands of Tahiti.
I let Saudi Arabia crouch in its eternal desert
to preserve the purity of her thoroughbred camels.
This is before I surrender America
back to the Indians
just to give history
the justice it has long lacked.
I know that changing the world is not easy
but it remains necessary nonetheless.
by Fadhil al-Azzawi
from Poetry International
translation: 2000, Khaled Mattawa
By 2050, the number of people over the age of 80 will triple globally. These demographics could come at great cost to individuals and economies. Two groups describe how research in animals and humans should be refocused to find ways to delay the onset of frailty.
The problems of old age come as a package. More than 70% of people over 65 have two or more chronic conditions such as arthritis, diabetes, cancer, heart disease and stroke. Studies of diet, genes and drugs indicate that delaying one age-related disease probably staves off others. At least a dozen molecular pathways seem to set the pace of physiological ageing. Researchers have tweaked these pathways to give rodents long and healthy lives. Restricting calorie intake in mice or introducing mutations in nutrient-sensing pathways can extend lifespans by as much as 50%. And these 'Methuselah mice' are more likely than controls to die without any apparent diseases. Post-mortems reveal that tumours, heart problems, neurodegeneration and metabolic disease are generally reduced or delayed in long-lived mice. In other words, extending lifespan also seems to increase 'healthspan', the time lived without chronic age-related conditions.
These insights have made hardly a dent in human medicine. Biomedicine takes on conditions one at a time — Alzheimer's disease, say, or heart failure. Rather, it should learn to stall incremental cellular damage and changes that eventually yield several infirmities. The current tools for extending healthy life — better diets and regular exercise — are effective. But there is room for improvement, especially in personalizing treatments. Molecular insights from animals should be tested in humans to identify interventions to delay ageing and associated conditions. Together, preclinical and clinical researchers must develop meaningful endpoints for human trials.
Picture: Fauja Singh, here aged 100, prepares for Britain's Edinburgh marathon in 2011
William Dalrymple in The Guardian (Photograph: Karim Sahib/AFP/Getty Images):
According to tradition it was St Thomas and his cousin Addai who brought Christianity to Iraq in the first century. At the Council of Nicea, where the Christian creed was thrashed out in AD325, there were more bishops from Mesopotamia than western Europe. The region became a refuge for those persecuted by the Orthodox Byzantines, such as the Mandeans – the last Gnostics, who follow what they believe to be the teachings of John the Baptist. Then there was the Church of the East, which brought the philosophy of Aristotle and Plato, as well as Greek science and medicine, to the Islamic world – and hence, via Cordoba, to the new universities of medieval Europe.
Now almost everywhere Arab Christians are leaving. In the past decade maybe a quarter have made new lives in Europe, Australia and America. According to Professor Kamal Salibi, they are simply exhausted: “There is a feeling of fin de race among Christians all over the Middle East. Now they just want to go somewhere else, make some money and relax. Each time a Christian goes, no other Christian comes to fill his place and that is a very bad thing for the Arab world. It is Christian Arabs who keep the Arab world 'Arab' rather than 'Muslim'.”
Certainly since the 19th century Christian Arabs have played a vital role in defining a secular Arab cultural identity. It is no coincidence that most of the founders of secular Arab nationalism were men like Michel Aflaq – the Greek Orthodox Christian from Damascus who, with other Syrian students freshly returned from the Sorbonne, founded the Ba'ath party in the 1940s – or Faris al-Khoury, Syria's only Christian prime minister. Then there were intellectuals like the Palestinian George Antonius, who in 1938 wrote in The Arab Awakening of the crucial role Christians played in reviving Arab literature and the arts after their long slumber under Ottoman rule.
If the Islamic state proclaimed by Isis turns into a permanent, Christian-free zone, it could signal the demise not just of an important part of the Arab Christian realm but also of the secular Arab nationalism Christians helped create.
Amanda Marcotte in The Daily Beast (NBC/Getty):
In the three decades since the scandal erupted, some things have changed and some haven’t. You still have plenty of people who want to shame young women for failing to meet the paradoxical demand to be sexy and not sexual, but there’s a growing chorus of people who see through that hypocrisy and have stopped punishing women for being, well, human beings who enjoy sex.
Beauty pageants, of course, are ground zero for the hypocritical demands on women to flaunt their bodies without actually acknowledging the existence of sex. The whole point of being a pageant queen is to trot around in your bikini to be ogled at while feigning sexual naiveté. But, of course, women—even young women—are actually sexual beings, as much as society denies that. Williams is hardly the only beauty queen who has been the subject of a scandal because she was caught dropping the “what is this sex you speak of?” act and found to be—gasp!—interested in actually enjoying her youth.
In 2006, Miss Nevada Katie Rees, got a bunch of exploitative attention for “sexy” pictures of her showing off her breasts and underwear and kissing other women, an offense for which she lost her crown. A few years later, Miss California Carrie Prejean endured having a few semi-nude photos leaked. She was able to keep her crown, but only after Donald Trump did a big, pompous show of how magnanimous he was being by saying, “We have determined that the pictures taken were fine.” That it’s a subject that needs to be “determined” at all is ridiculous, suggesting that being in pageants still comes at the price of having outsiders—outsiders like Donald Trump—feel entitled to sit in judgment of your sexual behavior.
Women who want to go into politics find themselves under similar pressure to conceal that they have bodies under their clothes or that they know what sex is all about. Witness what happened to Krystal Ball, a young Democrat who wanted to run for Congress in 2010. A pair of conservative bloggers decided to shame her by running pictures they obtained from a party she attended many years prior, where she was seen posing for prank photos with a dildo.
Noah Smith over at his website (via Crooked Timber):
Consider Proposition H: “God is watching out for me, and has a special purpose for me and me alone. Therefore, God will not let me die. No matter how dangerous a threat seems, it cannot possibly kill me, because God is looking out for me – and only me – at all times.”
So P(H|E) is greater than P(H) – every moment that you fail to die increases your subjective probability that you are an invincible superman, the chosen of God. This is totally and completely rational, at least by the Bayesian definition of rationality.
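The arithmetic behind Smith's point can be made concrete. The sketch below is not from his post, and the numbers in it are hypothetical: if H guarantees survival, so that P(E|H) = 1, while survival without H is merely very likely, then every day you fail to die nudges the posterior P(H|E) above the prior P(H), exactly as Bayes' rule requires.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H|E) via Bayes' rule."""
    # P(E) by total probability, then Bayes' rule.
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical numbers: a tiny prior that you are invincible,
# certain survival if H is true, near-certain survival if it is false.
prior = 1e-6
p = prior
for _ in range(10_000):  # 10,000 days of observing "I did not die"
    p = bayes_update(p, 1.0, 0.999)

print(p)  # the posterior has grown well above the initial 1e-6
```

Each survival multiplies the odds of H by 1/0.999, so the belief compounds; this is why the update is "totally and completely rational" in the Bayesian sense even though the proposition itself is absurd.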