Robert Burriss in Psychology Today:
The femme fatale is a stock character of classic film noir and hard-boiled detective stories: the seductive, fast-talking dame who lures a man into a trap of his own making. By the end of the tale, the man usually finds himself guilty of some hitherto undreamed-of crime, and wondering how he was ever convinced to err from the path of moral rectitude.
Of course, the audience is never in any doubt as to what transpired. The poor sap had sex on the brain. Confronted with the femme fatale, our hero never stood a chance.
This is all well and good as far as fiction goes, but does the femme fatale hold as much sway in the real world? Can a good guy be turned bad by a sexy dame?
This is a question that occurred to Wen-Bin Chiou, a psychologist at National Sun Yat-sen University in Taiwan. To find out, Chiou brought 74 heterosexual men to the laboratory. The volunteers were first shown photographs of women, which they had to rate for sex appeal. Half of the volunteers, selected at random, saw women who had previously been rated as sexy; the other half of the volunteers saw women who rated low for sexiness.
Afterwards, the men took part in what they were led to believe was an unrelated task.
Advances in computer science and engineering have lifted animatronic lovers from the realms of science fiction to reality; the first models are due to go on sale by the end of the year. Jenny Kleeman meets the men who are making the sex robots, the customers who want to buy them – and the critics who say they are dangerous. [From The Guardian.]
Video length: 16:25
James Salter in The Paris Review:
In Bertolt Brecht’s diaries he writes about such things as the essence of art, which he describes as “simplicity, grandeur, and sensitivity,” and its form, coolness.
…Making the shape and rhythm of sentences intensely felt was part of the teaching method at the writing school that James Jones and a woman named Lowney Handy established in Illinois in the years after the war. Jones was in the long process of writing his novel From Here to Eternity, and Lowney Handy was his muse. Students at the school sat for several hours every day copying out by hand passages written by Hemingway, Faulkner, and Thomas Wolfe to imbibe their strength and quality. It was the mimetic method, perhaps not as ridiculous as it sounds.
I would say that teaching writing is more like teaching dancing. If someone has a sense of rhythm, you might teach them something. There’s a great longing in people to be able to write, and the teaching of it, fiction and poetry, has become widespread in colleges and universities and outside of them as well. The teachers are often well-known and eagerly sought. Some are virtual gurus with doctrines and followers. In various cities there are private classes with selected students. You hear of a dramatic figure striking in appearance wearing boots and jodhpurs, perhaps with long white hair like a prophet, and bearing a kind of literary ichor, the fluid in the veins of the gods. He has a limitless number of great—known and lesser-known—books and authors at his fingertips, just as a musician knows a thousand pieces. He speaks only the truth, the core truth about everything and the truth about you, as a writer and as a person, which of necessity is likely to be hard. The class sessions are long, lasting for hours, and cannot be interrupted. Questions are not permitted. In this intensely charged atmosphere, the students read their stories aloud, and he stops them when they have made enough mistakes. For some, that is after a few sentences. Others are allowed to go to the end. The importance of the first sentence, he insists, can’t be overemphasized. It leads the way into the story. It sets its tone and also dictates the sentence that follows. Never begin a sentence with an adverb—it only tells what the sentence itself should reveal.
Zach Campbell at Harper's Magazine:
Some say that Arnaldo Otegi is an assassin. Others call him a peacemaker. Given his history, he might be a little of both. Otegi used to be a member of E.T.A., the armed militant group that fought in Spain for fifty years for an independent Basque state, first against the dictatorship of Francisco Franco in the 1960s and ’70s and later against the country’s democratically elected government. Otegi has gone to jail on terrorism charges three times, and is now the leader of the second strongest electoral force in the Basque Country. His actions led to E.T.A. issuing a ceasefire seven years ago, but the group still hasn’t disbanded.
In the Basque Country, violence is often justified behind closed doors. Since its inception in 1959, E.T.A. has killed over 800 people and, for decades, relied on kidnapping and extortion to finance its activities. In response, Spain’s civilian and military police, and paramilitary groups financed by the Spanish government, killed hundreds and tortured thousands, even after the country’s transition to democracy. At different times in history, both sides have had enormous popular support in the Basque Country, and the conflict has divided the region as much socially as it has politically. In many workplaces, schools, social circles, and even families, people found themselves on opposite sides. Now, after fifty years of conflict, Otegi says he knows how to end the war.
Ghaith Abdul-Ahad at The London Review of Books:
On the morning of 5 March a group of soldiers belonging to the Iraqi Special Operations Forces left the ruined village that had been their base for the past three weeks and drove north towards Mosul. Their target was the Baghdad Circle, a bleak intersection on the main highway into town, adorned since 2014 with a black and white billboard showing the black flag of Islamic State, with the seal of the prophet and beneath it the words ‘The Islamic State, Wilayat al-Mosul’. Since operations to recapture the western side of Mosul began in mid-February, the Iraqi soldiers had twice attacked the Circle and twice they had been pushed back.
‘They have formidable fortifications,’ an officer told me. IS had built a berm – a raised earthwork bank – with a trench behind it, and then another berm, all laid with IEDs. ‘In a whole day of fighting,’ the officer said, ‘we advanced no more than 150 metres.’
He pinched and zoomed a satellite map on his tablet. The Circle is the gateway to western Mosul, the oldest part of the city. The eastern part, on the other side of the Tigris, had been retaken by the end of January. Western Mosul, with its dense neighbourhoods and narrow streets, was a bigger challenge. As long as IS held the Circle, the officer explained, the highway to Baghdad could not be opened to traffic. Refugees and troops were forced to take a circuitous route through the hills to avoid snipers and rocket launchers. For the third attack, he said, a small team of special forces would cross the highway under cover of a massive barrage of fire, outflank the Circle and try to breach the fortifications from behind. Once a bridgehead was established, the rest of the troops would follow.
Holly B. Shakya and Nicholas A. Christakis at Harvard Business Review:
The average Facebook user spends almost an hour on the site every day, according to data provided by the company last year. A Deloitte survey found that for many smartphone users, checking social media apps is the first thing they do in the morning – often before even getting out of bed. Of course, social interaction is a healthy and necessary part of human existence. Thousands of studies have concluded that most human beings thrive when they have strong, positive relationships with other human beings.
The challenge is that most of the work on social interaction has been conducted using “real world,” face-to-face social networks, in contrast to the types of online relationships that are increasingly common. So, while we know that old-fashioned social interaction is healthy, what about social interaction that is completely mediated through an electronic screen? When you wake up in the morning and tap on that little blue icon, what impact does it have on you?
Prior research has shown that the use of social media may detract from face-to-face relationships, reduce investment in meaningful activities, increase sedentary behavior by encouraging more screen time, lead to internet addiction, and erode self-esteem through unfavorable social comparison. Self-comparison can be a strong influence on human behavior, and because people tend to display the most positive aspects of their lives on social media, it is possible for an individual to believe that their own life compares negatively to what they see presented by others.
Julie Sedivy in Nautilus:
Reading medieval literature, it’s hard not to be impressed with how much the characters get done—as when we read about King Harold doing battle in one of the Sagas of the Icelanders, written in about 1230. The first sentence bristles with purposeful action: “King Harold proclaimed a general levy, and gathered a fleet, summoning his forces far and wide through the land.” By the end of the third paragraph, the king has launched his fleet against a rebel army, fought numerous battles involving “much slaughter in either host,” bound up the wounds of his men, dispensed rewards to the loyal, and “was supreme over all Norway.” What the saga doesn’t tell us is how Harold felt about any of this, whether his drive to conquer was fueled by a tyrannical father’s barely concealed contempt, or whether his legacy ultimately surpassed or fell short of his deepest hopes.
Jump ahead about 770 years in time, to the fiction of David Foster Wallace. In his short story “Forever Overhead,” the 13-year-old protagonist takes 12 pages to walk across the deck of a public swimming pool, wait in line at the high diving board, climb the ladder, and prepare to jump. But over these 12 pages, we are taken into the burgeoning, buzzing mind of a boy just erupting into puberty—our attention is riveted to his newly focused attention on female bodies in swimsuits, we register his awareness that others are watching him as he hesitates on the diving board, we follow his undulating thoughts about whether it’s best to do something scary without thinking about it or whether it’s foolishly dangerous not to think about it. These examples illustrate Western literature’s gradual progression from narratives that relate actions and events to stories that portray minds in all their meandering, many-layered, self-contradictory complexities. I’d often wondered, when reading older texts: Weren’t people back then interested in what characters thought and felt?
Somewhere at Google there is a database containing 25 million books and nobody is allowed to read them.
James Somers in The Atlantic:
You were going to get one-click access to the full text of nearly every book that’s ever been published. Books still in print you’d have to pay for, but everything else—a collection slated to grow larger than the holdings at the Library of Congress, Harvard, the University of Michigan, at any of the great national libraries of Europe—would have been available for free at terminals that were going to be placed in every local library that wanted one.
At the terminal you were going to be able to search tens of millions of books and read every page of any book you found. You’d be able to highlight passages and make annotations and share them; for the first time, you’d be able to pinpoint an idea somewhere inside the vastness of the printed record, and send somebody straight to it with a link. Books would become as instantly available, searchable, copy-pasteable—as alive in the digital world—as web pages.
It was to be the realization of a long-held dream. “The universal library has been talked about for millennia,” Richard Ovenden, the head of Oxford’s Bodleian Libraries, has said. “It was possible to think in the Renaissance that you might be able to amass the whole of published knowledge in a single room or a single institution.” In the spring of 2011, it seemed we’d amassed it in a terminal small enough to fit on a desk.
James Owen Weatherall in Nautilus:
Physicists know how to use quantum theory—your phone and computer give plenty of evidence of that. But knowing how to use it is a far cry from fully understanding the world the theory describes—or even what the various mathematical devices scientists use in the theory are supposed to mean. One such mathematical object, whose status physicists have long debated, is known as the quantum state.
One of the most striking features of quantum theory is that its predictions are, under virtually all circumstances, probabilistic. If you set up an experiment in a laboratory, and then you use quantum theory to predict the outcomes of various measurements you might perform, the best the theory can offer is probabilities—say, a 50 percent chance that you’ll get one outcome, and a 50 percent chance that you’ll get a different one. The role the quantum state plays in the theory is to determine, or at least encode, these probabilities. If you know the quantum state, then you can compute the probability of getting any possible outcome to any possible experiment.
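The bookkeeping described above can be sketched in a few lines. This is a hedged illustration of the Born rule: a quantum state assigns each outcome a complex amplitude, and the probability of an outcome is the squared magnitude of its amplitude. The two-outcome state below is a hypothetical example chosen to match the 50/50 case in the passage, not anything from the article itself.

```python
import math

# A hypothetical two-outcome quantum state: an equal superposition,
# so each outcome carries amplitude 1/sqrt(2).
state = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# Born rule: P(i) = |amplitude_i|^2. The probabilities always sum to 1.
probabilities = [abs(a) ** 2 for a in state]
print(probabilities)  # each outcome has probability ~0.5, as in the example above
```

Knowing the state thus means knowing the full probability table for any measurement you might perform, which is exactly the role the passage assigns to it.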
But does the quantum state ultimately represent some objective aspect of reality, or is it a way of characterizing something about us, namely, something about what some person knows about reality? This question stretches back to the earliest history of quantum theory, but has recently become an active topic again, inspiring a slew of new theoretical results and even some experimental tests.
Adam Shatz in the London Review of Books:
One of the great paradoxes of the Obama era is that it encouraged so many liberals, both black and white, to see the black experience in America not as a slow, arduous struggle for freedom culminating in the election of a black president – Obama’s version, not surprisingly – but as an unending nightmare. Not least among the reasons was that a black man of unerring self-discipline and caution, bipartisan to a fault, should have provoked such ferocious white resistance – fanned by the man who questioned the validity of his birth certificate and then succeeded him as president. This most eloquent champion of ‘post-racialism’ may have been the most powerful man in the world, yet he remained a prisoner of his race, of his ‘black body,’ as Ta-Nehisi Coates put it in Between the World and Me. In the face of repeated police shootings of young black men or atrocities such as the church massacre in Charleston, South Carolina, Obama did little more than deliver one of his formidable speeches. And – as he did in Charleston – sing ‘Amazing Grace’, as if only a higher power could cure America of its original sin, and end the nightmare.
That nightmare began in the early 17th century, when Africans were packed into slave ships and transported to the American colonies, where they – or those who survived the Middle Passage – were sold at auction, stripped naked for the perusal of prospective buyers. With the defeat of the South in the Civil War – by 1804, all the northern states had abolished slavery – four million slaves won their freedom. Under the protection of federal troops, they gained the right to vote, and to elect representatives to state legislatures and the US Congress, in the unfinished revolution known as Reconstruction. But the forces of white supremacy in the former Confederacy proved resilient and inventive, and succeeded in overturning the gains of Reconstruction: black captivity wasn’t liquidated so much as reconfigured.
Gwendoline Riley at the Times Literary Supplement:
The stories in The Left Bank are often very short, but are hardly denuded. Here are glimpses, curiosities, street scenes. In these spates of impressions and perceptions, Rhys combines sensitivity and dash to bring us the ethnography of a nightclub (“Tout Montparnasse and a lady”) and a jazz café (“In a Café”) and a department store staff canteen (“Mannequin”), each as crowded as a sketchbook page. “In the Luxembourg Gardens” illustrates a pick-up, and would not look out of place in a seaside postcard-rack, while the narrator of “Illusion” is a demi-monde Miss Marple, keenly investigating a “gentlemanly” female friend’s proclivity for hoarding frocks. Each character comes fully accoutred, with pipe or dirty waistcoat, spectacles, or monocle, green hat or yellow wig, string bag, silver rings – and ready to peer cautiously through atelier doors, or rush into a café, or burst into song.
Their situations run from “rum” to “gay”, though with a marked tendency to the former. Montparnasse is described as “full of tragedy – all sorts – blatant, hidden, silent, voluble, quick, slow”. The voice that tells all this is sometimes abject, but more often downright larky, if savagely so. It can declare: “Poor Sara . . . also a Romantic!” Or “Poor André! Let us hope he had some compensation for forgetting for once that ‘eat or be eaten’ is the inexorable law of life”. It can lament, damn and dispense. It isn’t cruel, though. How could it, why would it, out-cruel such a cruel world? In fact it can conjure pure pity. Of an exhausted drunk, Rhys writes, “She sighed heavily, instinctively, as a dog sighs”.
E. Thomas Finan at The Millions:
In March, the acclaimed poet Derek Walcott died at the age of 87. Born on the Caribbean island of Saint Lucia, Walcott became a literary voice known throughout the globe. Celebrated for his verse and his plays, he won the Nobel Prize in Literature, a MacArthur “genius” grant, an Obie award, and countless other prizes. He also taught at a number of institutions, including Boston University (where I now teach, though I didn’t know Walcott personally).
Reconciliation was one of Walcott’s great tasks as a poet. He fused the iconography of the Americas and of Europe in order to create a hybrid poetry. He combined allusions to classical myths with descriptions of the landscape of his native Saint Lucia, and he incorporated quotations from countless European authors in his works. This enterprise of poetic fusion reached a peak in perhaps his most famous work, Omeros, a reworking of Homer that loosely follows the terza rima verse form used by Dante Alighieri in The Divine Comedy. Omeros was published shortly before Walcott won the Nobel Prize in 1992, and, at least if last month’s obituaries are to be believed, will go down as a landmark piece in his poetic oeuvre.
While Omeros has gotten most of the headlines, a shorter and much earlier poem, 1956’s “Ruins of a Great House,” reveals some of the abiding concerns of Walcott’s work in a more condensed way. In only about 50 lines, it shows how Walcott reworked tradition and reflected on the legacy of colonialism. The poem’s setting is the manor house at the heart of a former lime plantation. The speaker wanders the ruins of the house and conjures hints of the suffering wrought by life on this plantation.
Howard Tharsing at Threepenny Review:
In 1896, when Jack London was twenty, the San Francisco Chronicle had referred to him as “the boy socialist of Oakland.” His fame grew out of his power as a public speaker. Week after week he stood on a soap box in the little park in front of City Hall arguing that the unbridled capitalism of his day condemned a great many of his fellow citizens to lives of degradation and misery while enriching a small number outrageously. Dozens of speakers held forth in the park every week, but Jack London always drew the biggest crowds and held their attention better than any other speaker. And in 1897, when Oakland passed a law forbidding public meetings on public streets, London challenged the law by getting himself arrested for climbing on that soap box and speaking. Oakland authorities were surprised that instead of paying the fine or consenting to spend a few days in jail, London demanded a jury trial. Acting as his own lawyer, London argued that the law violated the constitution’s guarantees of the rights to freedom of speech and freedom of assembly, and he won.
Even at that young age, London already had long experience in the exploitation of labor and the difficulties suffered by the poor. He had worked in a fish cannery, in a coal mine, and in a jute factory, and as a fisherman, a seal hunter, an oyster pirate, and an officer in the Fish Patrol. He had traveled across the country as a hobo jumping trains. He had spent thirty days in jail in upstate New York on a charge of vagrancy.
Samantha Hunt in Lapham's Quarterly:
When I wrote my book about Tesla, I thought he belonged to me alone. I had never heard of him before. No one had ever taught me about him in school, and certainly no one had ever named a car after him. I knew only of Tesla the hair-metal band. When I discovered Tesla the poet-inventor, who built a motor powered by june bugs at age nine, and later harnessed Niagara Falls, and later concocted ways to photograph thought, it seemed I’d dreamed him into existence. Thus he belonged to me, only me. Tesla worked independently in laboratories he built himself with little corporate or military interference. He invented radio. He invented our modern AC electrical system. But as he often failed to protect his patents—not believing a person could own thunder and lightning—eventually he could no longer afford a proper laboratory. He then made his inventions in his New York City hotel rooms, in his mind. What’s the difference between invention and discovery? Is it just a question of ego? Or is it one of money? Living with Tesla’s legacy and papers for more than three years of research, one hard thought kept cropping up. Everywhere I knew people who were making buildings, mugs, plays, paintings, sweaters, chocolate, operas, but I didn’t know any people, except children, who were trying to fly, who were grafting DNA for wings. I didn’t know anyone with a basement lab made for playing with protons. I wondered why we are well acquainted with the phrase starving artist while the term starving scientist does not even exist.
I now receive email from men who also believe that Tesla belongs to them alone. Some of these men are angry that I, a woman, a nonengineer, speak for Tesla. Some of these men host ideas about life on Venus, the spirit molecule, free energy, government cover-ups and conspiracies. Some of these men are just glad Tesla has finally been recognized. I like the mad men the best. I imagine they have good dreams.
Jason Daley in Smithsonian:
Wax worms, the larval stage of the wax moth Galleria mellonella, are commonly used in the United States as fishing bait or birdfeeder snacks. In Europe, however, the worms are considered a beehive pest that chews through beeswax, disrupting the hive. Now researchers have found another use for them: plastic recyclers. Federica Bertocchini, an amateur beekeeper and scientist at the Spanish National Research Council, picked some wax worms out of one of her beehives and put them in a plastic shopping bag. She left to clean the honeycomb panels. When she returned, the worms were all over the place. "When I checked, I saw that the bag was full of holes. There was only one explanation: The worms had made the holes and had escaped. This project began there and then," she says in a press release.

Bertocchini and colleagues from Cambridge University began studying the creatures and found that the common wax worm can not only munch but also metabolize polyethylene, the plastic in shopping bags, which makes up about 40 percent of the plastics used in Europe. They published their results this week in the journal Current Biology.

To study the worms’ munching ability, the researchers put 100 wax worms in a plastic shopping bag from a U.K. supermarket. Within 40 minutes, holes began to appear. Within 12 hours, the worms had eaten about 92 milligrams of plastic, which Bertocchini says is pretty fast, especially compared with bacteria discovered last year that dissolve polyethylene at a rate of about 0.13 milligrams per day. As Ian Sample at The Guardian reports, the researchers wanted to make sure the worms weren’t just chewing the plastic into microscopic particles. So they smooshed up some of the worms and applied the paste to the plastic, which also caused holes to appear.
“The caterpillars are not just eating the plastic without modifying its chemical make-up. We showed that the polymer chains in polyethylene plastic are actually broken by the wax worms,” co-author Paolo Bombelli says in a press release. “The caterpillar produces something that breaks the chemical bond, perhaps in its salivary glands or a symbiotic bacteria in its gut. The next steps for us will be to try and identify the molecular processes in this reaction and see if we can isolate the enzyme responsible.” The hope is that the discovery could lead to a method for breaking down the polyethylene currently filling landfills and clogging waterways. But just how that will work is speculative. The enzyme could be produced by modified E. coli bacteria or plankton that would attack plastic in the wild, writes Sample. Bombelli also suggests it may be possible to breed and release an army of wax worms. But that means learning more about the worm’s motivation.
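The rate comparison in the excerpt is easy to check with back-of-the-envelope arithmetic. The figures below are the ones quoted above (92 mg eaten by 100 worms in 12 hours versus 0.13 mg per day for the bacteria); normalizing both to a per-day rate is my own step, and the comparison is for the group of 100 worms, not per worm.

```python
# Scale the worms' 12-hour figure up to a full day, then compare
# against the bacterial rate quoted in the excerpt.
worm_mg_per_day = 92.0 * (24 / 12)   # 92 mg in 12 hours -> 184 mg/day
bacteria_mg_per_day = 0.13           # rate reported for the bacteria

speedup = worm_mg_per_day / bacteria_mg_per_day
print(f"roughly {speedup:.0f}x faster")  # roughly 1415x faster
```

On these numbers the worms work three orders of magnitude faster than the bacteria, which is what makes the finding notable despite the small absolute quantities involved.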
Tom Nichols in Market Watch:
Why can’t Americans agree about anything? The United States has survived periods of great division, and yet today we seem incapable of finding common ground on even the smallest issues. The problem is approaching the level of a national crisis that threatens our democracy.
Some of this tendentiousness is part of an irascible American culture that is, paradoxically, woven into our greatness as a nation. Our willingness to speak our minds and rely on our own common sense has been central to an American character noted by Tocqueville and others since our founding as a nation.
Still, American politics were once characterized by a fair amount of bipartisanship and even ticket-splitting in national elections. Today, in public forums, we engage each other not to learn or to converse, but to fight along the harshest and most intractable partisan lines — and to win, no matter how obnoxious we must be in order to carry the day.
Of course, some of this problem is generated by human nature, especially the problem of “confirmation bias.” We want to believe that our experiences and our beliefs, including the important issue of how we view ourselves, explain the world around us. We naturally want to reject evidence that conflicts with those cherished views (especially the ones about ourselves). We all do it, and it’s why we so easily drive each other crazy in our daily conversations.
More here. [Thanks to Ali Minai.]
Susana Martinez-Conde and Stephen L. Macknik in Scientific American:
We are surrounded by mirrors all day, every day—when we drive, brush our teeth, check our hair while heading out the door. Yet for all their ubiquity, mirrors remain somewhat mysterious. In folktales and fiction at least, they can be conduits to spiritual, magical or supernatural realms: mirrors can out the soulless vampires in our midst. They can summon the legendary hook-handed murderer known as Candyman. And the Mirror of Erised—of Harry Potter fame—holds the remarkable power to lay bare its viewer's deepest desire.
Our enchantment with mirrors may stem in part from the fact that they often defy expectations. Not only do we find the right-left reversal of reflecting surfaces discomfiting, but many of our hard-won intuitions about how mirrors work are dead wrong. Psychologist Marco Bertamini of the University of Liverpool in England and his colleagues have identified three false beliefs we typically have about mirrors.

First, people usually predict that they will see themselves in a mirror before they arrive in front of it. In other words, they overestimate what is visible in a mirror. This miscalculation is called the “early error.” Second, most people assume that their projection on a mirror (the outline they could trace with a pen on its surface) is the same size as their body. In reality, that projection, as they see it, is half the physical size of their body. Third, people tend to think that the mirror projection of their own image will shrink with distance, so they will see their full body in a small mirror if they move far enough away from it. But in fact, distance does not affect the size of a body’s projection.

Moreover, some research indicates that people see objects in a mirror as somehow less real than nonreflected ones. The illusions we present here all take advantage of how little we grasp about the looking glass.
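The second and third false beliefs both fall to the same similar-triangles argument: your image lies as far behind the glass as you stand in front of it, so the mirror plane cuts the eye-to-image line exactly in half, and the outline on the glass is half your height at any distance. A small sketch makes the cancellation explicit (the function and the numbers are illustrative, not from the article).

```python
def projection_height(body_height, distance):
    """Height of a body's outline traced on the mirror's surface.

    The eye-to-image line spans 2 * distance (you to glass, glass to
    image), and the mirror plane sits at 1 * distance along it, so the
    distance term cancels and the projection is always half the height.
    """
    return body_height * distance / (2 * distance)

print(projection_height(180, 1))  # 90.0 -- half of 180 cm, standing 1 m away
print(projection_height(180, 5))  # 90.0 -- unchanged at 5 m
```

This is why backing away from a small mirror never brings your full body into view: the projection stays fixed at half size, contrary to the intuition Bertamini's subjects reported.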
Debra W. Soh in Playboy:
On a beautiful, sunny Saturday morning, I followed hundreds of women, men and children making their way through Toronto’s downtown core as part of the highly publicized March for Science that took place in Washington, D.C. and another 610 cities around the world. There was no tear gas or trees lit on fire—just a lot of white lab coats and witty protest signs like “Cell-ebrate Science” and “Alternative Facts are the Square Root of Negative 1.”
Sounds uplifting, doesn’t it? In reality, scientists have been critical of the March for Science since its inception, with public figures, notably Steven Pinker and Michael Shermer, speaking out. What I found particularly concerning was the March’s emphasis on intersectionality as a “core principle.” This theory is fuelled by anti-science sentiments, such as the belief that we should prioritize subjective feelings over objective fact. These ideas have no place in the discourse on legitimate science.
Many of the people I spoke with were scientists protesting President Donald Trump’s defunding of science; others were self-proclaimed defenders of science or had friends and family working in research.
But at the March, it took all of about five minutes before many of the organized speakers broke out the usual identity politics rhetoric, with talk about how “race, religion, gender and class” should not divide us. Don’t get me wrong, I totally agree with this, but since the goal of science is to be impartial, it really wasn’t necessary for them to keep pointing this out. Another speaker blamed “power structures” and capitalism for our current predicament. I saw people holding anti-fascism signs and wasn’t sure what this had to do with funding cuts to the NIH or NASA.
More here. [Thanks to Patrick Lee Miller.]