Approaching 85, cine-essayist Chris Marker remains as lively, engaged, and provocative as ever—and no less fond of indirection. (His La Jetée is not only a movie about the pathos of time travel, but a rumination on film-watching as well.) Marker’s hour-long video The Case of the Grinning Cat meditates on the state of post-9/11 French politics, taking as its apparent subject the enigmatic M. Chat, who in late 2001 began appearing, as if by magic, on Paris rooftops, walls, and métro stations.
This anonymously produced graffito—a wide-eyed, broadly smiling, boldly cartooned, bright-yellow feline—spread to other cities, and Marker does his part, matting M. Chat into artworks from cave paintings to van Goghs. During the 2002 French election that saw right-wing centrist Jacques Chirac defeat right-wing extremist Jean-Marie Le Pen, M. Chat took to the streets. Cat placards and masks dotted the anti–Le Pen demonstrations and appeared in crowds rallying against Bush’s war.
Via Political Theory Daily Review, on monogamy and polyamory, evolutionary psychology and human spirituality, in Tikkun.
For a variety of evolutionary and historical reasons, polyamory has had “bad press” in Western culture and spiritual circles—being automatically linked, for example, with promiscuity, irresponsibility, inability to commit, and even narcissistic hedonism. Given the current crisis of monogamy in our culture, however, it may be valuable to explore seriously the social potential of responsible forms of nonmonogamy. And given the spiritual potential of such exploration, it may also be important to expand the range of spiritually legitimate relationship choices that we as individuals can make at the various karmic crossroads of our lives.
It is my hope that this essay opens avenues for dialogue and inquiry in spiritual circles about the transformation of intimate relationships. It is also my hope that it contributes to the extension of spiritual virtues, such as sympathetic joy, to all areas of life and in particular to those which, due to historical, cultural, and perhaps evolutionary reasons, have been traditionally excluded or overlooked—areas such as sexuality and romantic love.
The culturally prevalent belief—supported by many contemporary spiritual teachers—that the only spiritually correct sexual options are either celibacy or monogamy is a myth that may be causing unnecessary suffering and that needs, therefore, to be laid to rest. It may be perfectly plausible to hold simultaneously more than one loving or sexual bond in a context of mindfulness, ethical integrity, and spiritual growth, for example, while working toward the transformation of jealousy into sympathetic joy and the integration of sensuous and spiritual love. I should add right away that, ultimately, I believe that the greatest expression of spiritual freedom in intimate relationships does not lie in strictly sticking to any particular relationship style—whether monogamous or polyamorous—but rather in a radical openness to the dynamic unfolding of life that eludes any fixed or predetermined structure of relationships. It should be obvious, for example, that one can follow a specific relationship style for the “right” (e.g., life-enhancing) or “wrong” (e.g., fear-based) reasons; that all relationship styles can become equally limiting spiritual ideologies; and that different internal and external conditions may rightfully call us to engage in different relationship styles at various junctures of our lives. It is in this open space catalyzed by the movement beyond monogamy and polyamory, I believe, that an existential stance deeply attuned to the standpoint of Spirit can truly emerge.
Truly a great loss. In Rolling Stone:
Fifty years after recording his first hit song, the Hardest Working Man in Show Business has played his final encore. James Brown, the Godfather of Soul, died of congestive heart failure early Christmas morning, after a brief bout with pneumonia in an Atlanta hospital. By his count, he was seventy-three years old.
One of the most influential performers of the 20th century, Brown had a hard-charging, hypnotically rhythmic signature sound that inspired peers and successors from doo-wop to hip-hop. Among his many chart successes—more than forty Top Forty hits and dozens more on the R&B charts—were the timeless classics “Papa’s Got a Brand New Bag” and “I Got You (I Feel Good)” and civil-rights anthems such as “Say It Loud – I’m Black and I’m Proud.” His best-known album, 1962’s Live at the Apollo, is often cited as the most exciting live album of all time. One of the original inductees into the Rock and Roll Hall of Fame in 1986, Brown received a Lifetime Achievement Grammy Award in 1992.
Brown, as some of his elaborate nicknames (“The Minister of the New New Super Heavy Funk”) imply, was best known for his indefatigable showmanship. His revue-style shows were designed to take his audience to ever-higher levels of delirium, and he was famous for “fainting” near the end of the evening, only to be revived by his band mates.
In news@nature, more on whether experiments in virtual worlds can bypass ethical concerns in psychological tests.
Half the volunteers could see the woman and half could not, communicating with her only through text. Both were told to give her ‘electric shocks’ of increasing voltage when she gave incorrect answers to test questions. The woman responded to these with protests and discomfort, asking for the test to stop as the voltage was ramped up.
The group from whom the virtual woman was hidden delivered shocks up to the maximum voltage, like many of those in Milgram’s experiment. Those who could see her were more likely to stop before reaching this limit.
Almost half of those who could see the woman said afterwards that they had considered withdrawing from the study, and several actually did. “Of course, consciously everybody knows nothing is happening,” says Slater. “But some parts of the person’s perceptual system just takes it as real. Some part of the brain doesn’t know about virtual reality.”
And instead of becoming accustomed to the virtual person and ceasing to empathise, many volunteers became more anxious as the study continued.
In the Economist, what do discoveries in neuroscience imply about free will?
In the late 1990s a previously blameless American began collecting child pornography and propositioning children. On the day before he was due to be sentenced to prison for his crimes, he had his brain scanned. He had a tumour. When it had been removed, his paedophilic tendencies went away. When it started growing back, they returned. When the regrowth was removed, they vanished again. Who then was the child abuser?
His case dramatically illustrates the challenge that modern neuroscience is beginning to pose to the idea of free will. The instinct of the reasonable observer is that organic changes of this sort somehow absolve the sufferer of the responsibility that would accrue to a child abuser whose paedophilia was congenital. But why? The chances are that the latter tendency is just as traceable to brain mechanics as the former; it is merely that no one has yet looked. Scientists have looked at anger and violence, though, and discovered genetic variations, expressed as concentrations of a particular messenger molecule in the brain, that are both congenital and predisposing to a violent temper. Where is free will in this case?
In 1946, Boris Vian—novelist, poet, playwright, songwriter, jazz trumpeter, screenwriter, actor, and general scourge of anyone failing to have enough fun in Paris in the postwar era—came to New York. He made the trip from France by submarine, caused a small international incident upon arrival, and had lunch. Then he ventured forth to discover America.
Vian was impressed by the state of American progress, which, he concluded, was far ahead of that of his native country, and not so impressed by American girls, whom he deemed silly things with large behinds. He ran into Hemingway but didn’t recognize him, and failed to say hello. He went to see the Empire State Building, only to discover that it had recently been demolished. He came across the Surrealist André Breton living in Harlem camouflaged as a black man and calling himself Andy. He spent a morning sitting in front of his hotel, hoping to see a lynching, but was disappointed.
more from the New Yorker here.
The premise of Cormac McCarthy’s new novel, The Road, is simple: In a ruined, postapocalyptic future, a nameless father and his young son—”each the other’s world entire”—trudge down a road toward the ocean, with the hope of finding a warmer, more hospitable locale. Along the way, they scrounge for cans of food in cities and countryside already thoroughly pillaged by other refugees. Death from starvation and exposure hovers, but a more immediate terror is the constant threat of dismemberment by roving bands of cannibals, for this is what most survivors have been reduced to. There is an urgency to each page, and a raw emotional pull in the way McCarthy, the poet laureate of violence, known for brutal and biblical novels like Child of God (1973) and Blood Meridian, or, The Evening Redness in the West (1985), renders the father’s attempts to keep alive the hopes of the young boy as well as his own, making it easily one of the most harrowing books you’ll ever encounter. Nearly unreadable in its heartbreaking detail, it is also, once opened, nearly impossible to put down; it is as if you must keep reading in order for the characters to stay alive.
Hardcore fans would have forgiven the seventy-three-year-old legend (the galley cover announces “His New Novel,” as if God himself had written the book) had he produced another in his recent string of accessible novels. Some might see it as a return to form, but The Road diverges from his earlier work as McCarthy switches the focus from the hunters to the hunted. And some might see this free-floating futuristic nightmare as a radical departure, yet for true believers who’d followed the signs in his previous work, this is where they hoped he would arrive.
more from Bookforum here.
The husband wants to be taken back
into the family after behaving terribly,
but nothing can be taken back,
not the leaves by the trees, the rain
by the clouds. You want to take back
the ugly thing you said, but some shrapnel
remains in the wound, some mud.
Night after night Tybalt’s stabbed
so the lovers are ground in mechanical
aftermath. Think of the gunk that never
comes off the roasting pan, the goofs
of a diamond cutter. But wasn’t it
electricity’s blunder into inert clay
that started this whole mess, the I-
echo in the head, a marriage begun
with a fender bender, a sneeze,
a mutation, a raid, an irrevocable
more of Dean Young’s poem at Paris Review here.
Maybe it was just a Freudian slip. Or a case of hiding in plain sight. Either way, Sigmund Freud, scribbling in the pages of a Swiss hotel register, appears to have left the answer to a question that has titillated scholars for much of the last century: Did he have an affair with his wife’s younger sister, Minna Bernays?
Rumors of a romantic liaison between Freud and his sister-in-law, who lived with the Freuds, have long persisted, despite staunch denials by Freud loyalists. The Swiss psychoanalyst Carl Gustav Jung, Freud’s disciple and later his archrival, claimed that Miss Bernays had confessed the affair to him. (The claim was dismissed by Freudians as malice on Jung’s part.) And some researchers have even theorized that she may have become pregnant by Freud and have had an abortion.
What was lacking was any proof. But a German sociologist now says he has found evidence that on Aug. 13, 1898, during a two-week vacation in the Swiss Alps, Freud, then 42, and Miss Bernays, then 33, put up at the Schweizerhaus, an inn in Maloja, and registered as a married couple, a finding that may cause historians to re-evaluate their understanding of Freud’s own psychology.
A yellowing page of the leather-bound ledger shows that they occupied Room 11. Freud signed the book, in his distinctive Germanic scrawl, “Dr Sigm Freud u frau,” abbreviated German for “Dr. Sigmund Freud and wife.”
Some biblical scholars argue that Eve pulled down the suggestive pomegranate, not an apple, in the Garden of Eden. In the Koran, as in Persian iconography and poetry, images of pomegranates symbolized fertility, and in China, a bride and groom went to bed with seeds scattered on their covers to assure conception. In the early sixteenth century, the Spanish carried crateloads of them across the sea because the vitamin C-rich fruit guarded sailors against scurvy. The friars on board, meanwhile, brought roots to plant in the New World, where the fruit flourishes four hundred years later in California’s Mediterranean climate.
Folk healers have long used every part of the fruit to staunch wounds and treat illnesses like dyspepsia and leprosy. And these days, scientists in Israel have been actively researching the fruit’s pharmaceutical properties (the country harvests three thousand tons annually) to battle everything from viruses to breast cancer and aging skin.
The pomegranate contains a flavonoid that is a powerful cancer-fighting antioxidant. The fruit is also rich in estrogen, and one company is now marketing pomegranate-derived EstraGranate as an alternative to hormone-replacement therapy. In the works is a condom coated with pomegranate juice that will reportedly fend off HIV. In rural Sonoma County, California, where I live, stores now carry pomegranates from fall through winter, but we are offered only one variety, called Wonderful, grown by Paramount Farms, the corporate farm giant. Our nurseries carry only Wonderful seedlings, so when I wanted to plant a pomegranate, it had to be Wonderful.
Two years ago we at 3QD as well as Richard Dawkins independently decided to celebrate December 25th as Newton’s Day (it is Sir Isaac’s birthday). You can see my post from last year here. So here we are again. This year I will just provide two interesting things related to Newton, who some argue was the greatest mind of all time. For example, did you know that he hung out in bars and pubs in disguise, hoping to catch criminals? He did. Read this, from Wikipedia:
As warden of the royal mint, Newton estimated that 20% of the coins taken in during The Great Recoinage were counterfeit. Counterfeiting was treason, punishable by death by drawing and quartering. Despite this, convictions of the most flagrant criminals could be extremely difficult to achieve; however, Newton proved to be equal to the task.
He gathered much of that evidence himself, disguised, while he hung out at bars and taverns. For all the barriers placed to prosecution, and separating the branches of government, English law still had ancient and formidable customs of authority. Newton was made a justice of the peace and between June 1698 and Christmas 1699 conducted some 200 cross-examinations of witnesses, informers and suspects. Newton won his convictions and in February 1699, he had ten prisoners waiting to be executed. He later ordered all records of his interrogations to be destroyed.
Newton’s greatest triumph as the king’s attorney was against William Chaloner. One of Chaloner’s schemes was to set up phony conspiracies of Catholics and then turn in the hapless conspirators whom he entrapped. Chaloner made himself rich enough to posture as a gentleman. Petitioning Parliament, Chaloner accused the Mint of providing tools to counterfeiters (a charge also made by others). He proposed that he be allowed to inspect the Mint’s processes in order to improve them. He petitioned Parliament to adopt his plans for a coinage that could not be counterfeited, while at the same time striking false coins. Newton was outraged, and went about the work to uncover anything about Chaloner. During his studies, he found that Chaloner was engaged in counterfeiting. He immediately put Chaloner on trial, but Mr Chaloner had friends in high places, and to Newton’s horror, Chaloner walked free. Newton put him on trial a second time with conclusive evidence. Chaloner was convicted of high treason and hanged, drawn and quartered on March 23, 1699 at Tyburn gallows.
More from Wikipedia here. And if you are in the mood for something much more substantive, I highly recommend watching this video of my mentor and friend, Professor Akeel Bilgrami, delivering the University Lecture at Columbia earlier this fall, entitled “Gandhi, Newton, and the Enlightenment.” I admit that the subject is only weakly related to Newton, but it is well worth watching on Newton’s Day nevertheless. The following description is excerpted from a Columbia University website:
Bilgrami devoted much of his talk to tracing the origins of “thick” rationality as well as the critiques it has received over the years. He identified the 17th century as the critical turning point, when scientific theorists such as Isaac Newton and Robert Boyle put forward the idea of matter and nature as “brute and inert”—as opposed to a classical notion of nature as “shot through with an inner source of dynamism, which is itself divine.”
Even at the time, there were many dissenters who accepted all the laws of Newtonian science but protested its underlying metaphysics, Bilgrami explained. They were anxious about the political alliances being formed between the commercial and mercantile interests and the metaphysical ideologues of the new science—anxieties echoed by the “radical enlightenment” as well as later by Gandhi.
According to Bilgrami, both Gandhi as well as these earlier thinkers argued that in abandoning our ancient, “spiritually flourishing” sense of nature, we also let go of the moral psychology that governs human beings’ engagement with the natural, “including the relations and engagement among ourselves as its inhabitants.”
Bilgrami expressed a certain sympathy for this dissenting view, noting that even if we moderns cannot accept the sacralized vision favored by these earlier thinkers, we should still seek alternative secular forms of enchantment in which the world is “suffused with value,” even if there is no divine source for this value. Such “an evaluatively enchanted world” would be susceptible not just to scientific study, Bilgrami argued, but would also demand an ethical engagement from us all.
See the video here.
And Merry Christmas!!!
Over at Cosmic Variance, Risa Wechsler has a fun and thoughtful post that’s spurred some interesting comments:
From Paul Kedrosky, via Rebecca Blood, an excellent challenge:
Physicist Richard Feynman once said that if all knowledge about physics was about to expire the one sentence he would tell the future is that “Everything is made of atoms”. What one sentence would you tell the future about your own area, whether it’s entrepreneurship, hedge funds, venture capital, or something else? Examples: An economist might say that “People respond to incentives”. I had an engineering professor years ago who said all of that field could be reduced to “F=MA and you can’t push on a rope”.
There are lots of good and diverse responses out there…
One of the most interesting conclusions of the Baker-Hamilton report resides in the observations that, since the war in Iraq, the American government has often sought to rule out any information that runs counter to its policies, and that this refusal to take the truth into account has had calamitous effects. The report says so in measured, but firm, terms: “Good policy is difficult to make when information is systematically collected in a way that minimizes its discrepancy with policy goals.” In other words, the American government has held truth to be a negligible value that could easily be sacrificed to the will to power.
This reflection is not really a surprise for observers outside the United States. The preparation and unleashing of the war were based on a double lie or double illusion – that is, that Al-Qaeda was linked to the Iraqi government and that Iraq possessed weapons of mass destruction: nuclear, biological, or chemical. Since the fall of Baghdad, this casual attitude to the truth has been in constant evidence. At the very moment when the images of torture in Abu Ghraib prison were being revealed to the whole world, the US asserted that democracy was being securely implanted in Iraq. Then, while hundreds of prisoners had already been moldering for five years in the camp at Guantanamo, subjected to degrading treatment, without any trial or any possibility of defending themselves, [the US government] declared that the United States was using its forces in defense of human rights. The very same people who declare that they are the incarnation of freedom have legalized the use of torture. The Baker-Hamilton report chose not to go into the past; it simply notes that the refrain repeated until recently that “everything is going well in Iraq” does not strictly correspond to the truth.
Political science’s inability to predict any of the great events of the previous decade had proven a serious embarrassment. Eager to make up for their prewar irrelevance, post-war political scientists sought to provide policymakers with predictions regarding, as Gabriel Almond put it, “exotic and uncouth” parts of the world (Almond and Coleman 1960, 10). As Karl Lowenstein (1944) wrote, to overcome past errors comparative politics would have to become “a conscious instrument of social engineering” (541) because “the discipline ha[d] a mission to fulfill in imparting our experience to other nations integrating scientifically their institutions into a universal pattern of government” (547). Political science therefore had to become positive and predictive, and the discipline rebuilt itself around the latest theories of the day (functionalism, modernization theory, and political culture) to meet these new expectations.
This new version of political science posited that societies were self-equilibrating entities that shared common functionally related subsystems for integration, adaptation, and goal attainment. Actually existing societies were then arrayed along a developmental continuum with the United States posited as the world’s historical end. Where states actually sat on this telos was determined by some combination of their functional fit (Huntington 1968) and/or political culture (Almond and Verba 1966). Some political cultures were seen as better or worse at adapting to the dictates of modernity, but overall the path to a stable capitalist democracy was pretty much set. At least this is what members of the discipline imagined into the 1960s, a decade that proved to be, just like the 1920s and 1930s, a watershed for political science. As occurred in the 1920s and 1930s, these new and scientifically rigorous theories were about to be punctuated (and thereby invalidated) by the politics of the day.
Thomas Dixon in the TLS:
In the heyday of natural theology, the human eye was the great example of divine design – a wonderful symbol of vision and insight, as well as a marvel of optical engineering. God’s intelligence is apparently discerned these days in the E. coli bacterium – a wonderful symbol of diseases of the gut, propelled by an ingenious rotating tail. It is this “flagellum”, a bacterial outboard motor, that is used by proponents of so-called Intelligent Design as an example of the sort of “irreducible complexity” that they claim cannot be explained by Darwinism. It was recently reported that teaching materials promoting Intelligent Design had been sent to all heads of science at British secondary schools, but it is unlikely that they will have much impact here. Intelligent Design is a quintessentially American movement responding to a set of constitutional, cultural and religious dilemmas peculiar to the United States.
Opinion polls today consistently find that, when asked to say whether human beings were created by God within the past 10,000 years, or by a process of evolution guided by God, or by an entirely natural process of evolution, about half the population of the US choose the first option, and most of the rest choose the second. In a country where the question of whether Intelligent Design should be taught in schools on equal terms with Darwinism is regularly debated, it is understandable that books about science and religion sell well and that they have a more tangible political impact than they do in Britain. In this American context, Richard Dawkins’s recent atheistic broadside, The God Delusion, also makes a little more sense. It is really a book to keep up the morale of that embattled 10 per cent of Americans who think God has nothing to do with evolution.
Although Dawkins of course has no truck with “irreducible complexity”, one thing that he and his Intelligent Design antagonists agree about is that God’s existence or non-existence is, in Dawkins’s phrase, “a scientific fact about the universe”. Most theologians would want to reject Intelligent Design, along with the theology of The God Delusion, for exactly that reason. For them it is axiomatic that if we are going to talk about God at all, then God is not part of the natural order and should not be expected either to conform to the laws of physics or to feature as another entity in scientific accounts of life or the cosmos.
In openDemocracy.net, what the Iraq Study Group’s suggestions mean for the Kurds:
In a retrograde step, the ISG unapologetically makes five potentially damaging recommendations:
- suspension and/or gross manipulation of the democratically adopted constitution
- inhibition of the constitutionally agreed referendum on the fate of the arabised city of Kirkuk, representing yet another major betrayal of the Kurds
- centralisation of power in Baghdad, in a return to dictatorial central government
- rewarding extremists by their integration into the Iraqi state machinery
- reaching out to appease terrorists and their sponsoring countries.
Thus, the US’s arch-enemies – Iran, Syria, al-Qaida, Muqtada al-Sadr’s Mahdi army and Saddamists – are among the winners; so, too, are Turkey, Saudi Arabia, Egypt and Jordan. Bush’s policymakers and their main Iraqi allies (Kurds and some Shi’a) are the chief losers.
Virginia Hughes in Seed Magazine:
Smilack has a rare form of synesthesia that involves all of her senses—the sound of one female voice looks like a thin, bending sheet of metal, and the sight of a certain fishing shack gives her a brief taste of Neapolitan ice cream—but her artistic leanings are shared by many other synesthetes. Scientists estimate that synesthesia is about seven times more common in poets, novelists, and artists than in the rest of the population. (Some of the most famous examples include artists David Hockney and Wassily Kandinsky and writer Vladimir Nabokov.)
In the last decade, this connection between synesthesia and art has drawn much attention from neuroscientists. And now several genetic and behavioral studies aim to pin down the biological mechanisms linking art and synesthesia, with hopes of answering even bigger questions about how every brain perceives art.
From The New York Times:
I finally came unhinged in the dentist’s office — one of those ritzy pediatric practices tricked out with comic books, DVDs and arcade games — where I’d taken my 3-year-old daughter for her first exam. Until then, I’d held my tongue. I’d smiled politely every time the supermarket-checkout clerk greeted her with “Hi, Princess”; ignored the waitress at our local breakfast joint who called the funny-face pancakes she ordered her “princess meal”; made no comment when the lady at Longs Drugs said, “I bet I know your favorite color” and handed her a pink balloon rather than letting her choose for herself. Maybe it was the dentist’s Betty Boop inflection that got to me, but when she pointed to the exam chair and said, “Would you like to sit in my special princess throne so I can sparkle your teeth?” I lost it.
“Oh, for God’s sake,” I snapped. “Do you have a princess drill, too?”
She stared at me as if I were an evil stepmother.
“Come on!” I continued, my voice rising. “It’s 2006, not 1950. This is Berkeley, Calif. Does every little girl really have to be a princess?”
My daughter, who was reaching for a Cinderella sticker, looked back and forth between us. “Why are you so mad, Mama?” she asked. “What’s wrong with princesses?”
A Japanese research team has succeeded in filming a giant squid live — possibly marking a first — and says the elusive creatures may be more plentiful than previously believed, a researcher said Friday.
The research team, led by Tsunemi Kubodera, videotaped the giant squid at the surface as they captured it off the Ogasawara Islands south of Tokyo earlier this month. The squid, which measured about 24 feet long (7 meters), died while it was being caught.
“We believe this is the first time anyone has successfully filmed a giant squid that was alive,” said Kubodera, a researcher with Japan’s National Science Museum. “Now that we know where to find them, we think we can be more successful at studying them in the future.”
More here. (For Sheherzad)