Insects Are In Serious Trouble

Ed Yong in The Atlantic:

The bottles were getting emptier: That was the first sign that something awful was happening.

Since 1989, scientists from the Entomological Society Krefeld had been collecting insects in the nature reserves and protected areas of western Germany. They set up malaise traps—large tents that funnel any incoming insect upward through a cone of fabric and into a bottle of alcohol. These traps are used by entomologists to collect specimens of local insects, for research or education. “But over the years, [the Krefeld team] realized that the bottles were getting emptier and emptier,” says Caspar Hallmann, from Radboud University.

By analyzing the Krefeld data—1,503 traps, and 27 years of work—Hallmann and his colleagues have shown that most of the flying insects in this part of Germany are flying no more. Between 1989 and 2016, the average weight of insects that were caught between May and October fell by an astonishing 77 percent. Over the same period, the weight of insects caught in the height of summer, when these creatures should be at their buzziest, fell by 82 percent.

More here.

From The Satanic Verses to Charlie Hebdo

Kenan Malik in Pandaemonium:

On 14 February 1989, Valentine’s Day, the Ayatollah Khomeini issued his infamous fatwa against Salman Rushdie. It was a brutally shocking act that forced Rushdie into hiding for almost a decade.

Twenty-six years later, on 7 January 2015, came an even more viscerally shocking act, when two gunmen forced their way into the Paris offices of the satirical magazine Charlie Hebdo, sprayed the room with machine-gun fire, killing 12 and injuring another 11.

What I want to look at today is what each of these events represented, and how we made the journey from the one to the other.

When The Satanic Verses was published in September 1988, Salman Rushdie was perhaps the most celebrated British novelist of his generation. The novel was not, it’s worth reminding ourselves, a novel solely, or even primarily, about Islam. It was, Rushdie observed in an interview, about migration, metamorphosis, divided selves, love, death, as well as an attempt to write about religion and revelation from the point of view of a secular person.

It’s also worth reminding ourselves that until the fatwa most Muslims had ignored the book. The campaign against The Satanic Verses was largely confined to India, Pakistan and Britain. With the singular exception of Saudi Arabia, whose authorities bankrolled the initial efforts to ban the novel, there was little anti-Rushdie fervour in the Arab world or in Turkey, or among Muslim communities in France or Germany. When at the end of 1988 the Saudi government tried to persuade Muslim countries to ban the novel, few responded except those with large Indian subcontinental populations, such as South Africa and Malaysia. Even Iran was relaxed about Rushdie’s irreverence. The novel was available in Iranian bookshops and even reviewed in Iranian newspapers.

It was the fatwa that transformed the Rushdie affair into a global conflict with historic repercussions.

More here.

In Tlayacapan

Lorna Scott Fox at the LRB:

‘Here, the dead are more alive than ever,’ the ad on the radio said. ‘That’s why I love Mexico.’ I was on my way to Tlayacapan, one of Mexico’s pueblos mágicos, a category invented to promote tourism. Tourism is down in this magic village. Located near the epicentre of the earthquake of 19 September, in Morelos state, south-west of the capital, it experienced the worst impact in living memory. There are husks of adobe homes on every street, most of the churches are damaged, and the town hall clock tower fell; the arches where the last scene of Butch Cassidy and the Sundance Kid was filmed are still standing, pocked and scuffed as if after a gun battle. I saw a sign flapping taped to a gate: ‘Careful with the wall.’ A woman was organising a tequio, the old indigenous form of community labour, to make adobe bricks. Scrawled in purple all the way across a yellow house, its outbuildings now tidied into piles of rubble, was: ‘Thanks to everyone for your help.’ The state is nowhere to be seen, apparently.

This may be an exaggeration; but in Mexico City, too, friends, neighbours and volunteers stepped in where the authorities failed in the immediate aftermath. There’s a terrible sense of déjà vu, as the magnitude-7.1 quake hit 32 years to the day after the big one in 1985. Up to 40,000 died then, according to some estimates; the official figure was 4,000. This time the national total hovers around 400, progress of a sort.

more here.

On Philip K. Dick and Blade Runner 2049

Paul Youngquist at The Paris Review:

Rereleases of Blade Runner (seven total, but Ridley Scott had artistic control only over The Final Cut of 2007) clarify a decisive departure from Dick’s narrative: Deckard, too, is a replicant. Blade Runners kill for a living, making it hard to defend their humanity (Dick’s point). Why not make them replicants? Implanted memories make Deckard feel human. But he isn’t. As a Blade Runner he protects humans against predators like himself, while displaying plenty of empathy, most urgently for the alluring Rachel, with whom he absconds at movie’s end. Empathy can’t make human life sacred if replicants feel it, too. So much for the Voight-Kampff test—and Dick’s attempt to distinguish artificial from sacred life on the basis of feeling. Lucky for Deckard, the love of his replicant life enjoys an extended life span, which means he probably does, too.

Blade Runner 2049 picks up these hints from the original and runs with them. It’s a visually gorgeous film, panning, as it opens, a vast cityscape in ambient sfumato, architecture somewhere between Albert Speer, Bauhaus, and I. M. Pei. Sans-serif text reveals that replicants have seen improvements: there’s now a model with an extended life span, another capable of complete obedience. The opening scene ends with an obedient one, a Blade Runner called K (Ryan Gosling), carrying a bloody eyeball in a plastic bag toward his police-issue hover car parked in the sand. The eyeball belonged to Sapper Morton, a geriatric replicant leading a peaceful if illegal life in the desert as a grub farmer. After verifying a serial number embedded in that eyeball, K retired him. No need now for that clumsy empathy test.

more here.

On ‘The Collected Essays of Elizabeth Hardwick’

Alex Andriesse at The Millions:

A review of Elizabeth Hardwick is almost obliged to begin with the following facts: (1) she was born in Kentucky in 1916 and moved to Manhattan in the early 1940s with the self-declared aim of becoming a “New York Jewish intellectual”; (2) in 1963, along with Barbara and Jason Epstein and Robert B. Silvers, she helped found The New York Review of Books; (3) for more than two decades she was married to the famous—and famously “confessional”—poet Robert Lowell. Notable though these facts may be, however, they are hardly the reasons why Hardwick’s writing continues to be read. As the 55 essays gathered in the new Collected Essays make clear, Hardwick was one of the most penetrating literary critics of her time. Whether she was writing about Henry James or Renata Adler, Edith Wharton or Joan Didion, “every assignment got Hardwick at full sail,” as Darryl Pinckney says in his introduction. She was a “writer’s writer” without question—a prose stylist par excellence.

Hardwick’s style is not for everyone. Her wit is subtle, her syntax sinuous, her learning deep, which is no doubt why her work is so seldom taught in the classroom. It is, in the best sense, un-teachable. “The essayist,” Hardwick once wrote, distinguishing him from the journalist, “does not stop to identify the common ground; he will not write, ‘Picasso, the great Spanish painter who lived long in France.’” Such refusal to stop and explain might easily be mistaken for snobbery today; Hardwick, however, saw it as a gesture of respect. She was not only a “writer’s writer,” she was also—silly though the phrase may be—a “reader’s writer.” She addressed her readers as equals, never wanting to bore them with what they already knew, or what, in the course of their reading, they would soon enough find out for themselves.

more here.

An Essay Concerning Human Understanding by John Locke (1689)

Robert McCrum in The Guardian:

This celebrated essay, available to its first readers in December 1689, though formally dated 1690, could hardly be more topical today. It is an examination of the nature of the human mind, and its powers of understanding expressed in brilliant, lapidary prose: “General propositions are seldom mentioned in the huts of Indians: much less are they to be found in the thoughts of children.” In the first two books, the argument moves through the source of ideas, the substance of experience (the origin of ideas), leading to a discussion of “the freedom of the will”: “No man’s knowledge here can go beyond his experience”. In book three, Locke proceeds to discuss language, and in book four he defines knowledge as our perception of the agreement or disagreement between ideas. Eventually, after several arguments of great intricacy and subtlety, Locke establishes good arguments for empirical knowledge, and moves to explore the existence of God, discussing the relations between faith and reason: “Reason is natural revelation, whereby the eternal Father of light, and fountain of all knowledge, communicates to mankind that portion of truth which he has laid within the reach of their natural faculties.”

Bertrand Russell once said, possibly speaking for effect, that Locke had made a bigger difference to the intellectual climate of mankind than anyone since Aristotle. He added that “no one ever had Common Sense before John Locke” – and common sense was the watchword of much 18th and 19th century English endeavour. A sentence such as “I have always thought the actions of men the best interpreters of their thoughts” could equally have been written by Johnson. Nonetheless, there is really no writer in this series who more impressively embodies the English spirit than Locke, in the sense that it is he who teaches us to think for ourselves, to weigh evidence empirically, to keep belief within limits, and to put all things to the test of reason and experience. He is also witty: “All men are liable to error; and most men are, in many points, by passion or interest, under temptation to it.”

More here.

To Mend a Birth Defect, Surgeons Operate on the Patient Within the Patient

Denise Grady in The New York Times:

The patient, still inside his mother’s womb, came into focus on flat screens in a darkened operating room. Fingers, toes, the soles of his feet — all exquisite, all perfectly formed. But not so his lower back. Smooth skin gave way to an opening that should not have been there, a bare oval exposing a white rim of bone and the nerves of the spinal cord. “All right, it’s the real deal,” said Dr. Michael A. Belfort, the chairman of obstetrics and gynecology at Baylor College of Medicine and obstetrician and gynecologist-in-chief of Texas Children’s Hospital.

The fetus, 24 weeks and two days old, less than two pounds, was about to have surgery. He had a severe form of spina bifida, in which the backbone and spinal cord do not develop properly. Children born with this condition usually cannot walk, and suffer from fluid buildup in the brain, lack of bladder control and other complications. A pediatric neurosurgeon, Dr. William Whitehead, joined Dr. Belfort at the operating table. Doctors have been performing fetal surgery to repair spina bifida since the 1990s; it is not a cure, but can lessen the degree of disability. But now Dr. Belfort and Dr. Whitehead are testing a new, experimental technique — one that some in the field are eager to learn, but that others regard warily, questioning its long-term safety for the fetus. The surgeons had made a wide incision in the mother’s lower abdomen, gently lifted out her uterus — still attached internally — and made two tiny, 4-millimeter slits. In one, they inserted a “fetoscope,” a small telescope fitted with a camera, light and grasping tool. The second slit was for other miniature instruments. Lit from within, the uterus glowed, red and magical in the darkened room. Spina bifida occurs early, at three to four weeks of pregnancy, when the tissue forming the spinal column should fold into a tube but does not close properly. There are 1,500 to 2,000 cases a year in the United States.

More here.

Man without Qualities

by Holly A. Case and John Palattella

Sebastian Kurz, Ankara 2015

[John Palattella is editor-at-large at The Nation and contributing editor at The Point.]

On Sunday, October 15, the New York Times ran a story on the Austrian parliamentary elections that were being held that day. "As Austrians head to the polls Sunday," the web teaser said, "Foreign minister Sebastian Kurz's far-right Freedom Party is expected to grab a share of power in the next government."

Later in the day, the heading was corrected: Kurz is not a member of the Freedom Party (FPÖ), but rather the leader of the conservative People's Party (ÖVP). Yet the Times' mistake was a telling one. Even for some in Austria it has been difficult to tell the difference between the positions of the far right and those of Sebastian Kurz.

Kurz is thirty-one years old, hitherto Austria’s youngest foreign minister, and the youngest head of one of the two parties in the outgoing governing coalition. On May 14 of this year, in what felt like a well-orchestrated and oddly consensual coup, the ÖVP handed its own head to him on a platter, agreeing to all seven of his conditions for assuming party leadership. The conditions included absolute decision-making power over party matters, as well as the party’s unqualified support for a ticket he later ran as part of a movement under his own name in the parliamentary elections. The party agreed to everything, in writing. Comparisons multiplied: Is Kurz the Austrian Macron? Or Viktor Orbán (Hungary’s right-wing populist prime minister), who was among the first to congratulate him for the coup? Or "Recep Kurz" (a riff on Turkey’s neo-authoritarian Recep Tayyip Erdoğan)? One thing is certain: after Sunday’s election, in which the ÖVP came in first with 31.6 percent of the vote, an increase of 7.6 percentage points over the party’s performance in 2013, Kurz will almost certainly be Austria’s new chancellor—the youngest the country has known—and calling the shots for the next five years.

Politics does not have many child prodigies, and foreign policy in particular has long been the domain of old hands. But for over a year now there has been a swelling fascination with Kurz, who, in addition to serving as Austrian foreign and integration minister, is also now chair of the Organization for Security and Cooperation in Europe. Ask around Austria and everyone seems to have a favorite Kurz anecdote. There's the one about how, during a municipal election campaign in Vienna, he drove a "cool-o-mobile" as party girls threw black condoms to the people; or the time he took an economy seat on a commercial flight and received an enthusiastic ovation from fellow passengers; or how he used to stroll through the outdoor Hannover Market, in the middle of one of Vienna's most diverse neighborhoods, and address the Turkish vendors by name. Members of his party have long spoken of him like a horse they're waiting to trot out for the big race. In December, when Kurz entered a room to take a seat beside the Czech foreign minister at an event on regional cooperation, an economist leaned over to whisper to his neighbor: "Behold, Austria's next chancellor!"

Read more »

Why does North Korea really want nukes?

by Thomas R. Wells

North Korea’s development of atomic fission bombs and ICBMs is very worrying. Unfortunately the analysis of it in the news media is woeful. Some commentators assume that North Korea works like a normal country (like their country); some clearly don’t understand how war works; some believe the regime’s propaganda; some seem unable to think in a straight line at all. Some manage to make all those mistakes at the same time and more. One can only hope that the US, South Korean, and Japanese war ministries have better experts. In the meantime, at least we can throw out the worst nonsense.

Myth 1: This Will Lead to World War III

The exchange of threats between Kim Jong-un’s regime and Trump’s leads some to assume that world war is imminent. It is never explained how. The Cold War was the last time we seriously thought about an exchange of nuclear weapons and it seems that a lot of people who write for newspapers still think in the same patterns, in terms of extraordinary powers of annihilation and hair-trigger global alliances.

But this situation is nothing like that.

War is the use of military might to achieve political objectives against the will of another government. Killing lots of people isn’t the point of a war; it is only a means to an end. North Korea could already do that with its arsenal of chemical and biological weapons. The fact that Kim Jong-un will soon be able to kill lots of Americans in spectacular fiery explosions doesn’t mean he can now beat the USA into submission in a war. In any nuclear exchange, America’s government would be the only one left standing.

It is possible that nuclear weapons might allow Kim Jong-un to achieve certain political objectives against America by their threat rather than their use. For example, getting America to renounce its defence treaties with S. Korea and Japan. You may have noticed, though, that countries with nuclear weapons don’t generally have much success in using them to order other countries around. After all, if it worked, then America would already have used it on North Korea.

Read more »

Mood Swings: Robert Lowell at 100 (and a Bit)

by Richard King

I happened to be emerging from a bout of depression when I first realised we were approaching the centenary of Robert Lowell's birth in 1917. Now that date – 1st March – has passed, but I've been rereading the poetry anyway, in the spirit of the young student in Richard Attenborough's 1993 film Shadowlands: ‘We read to know we're not alone.' Lowell once told his fellow poet Stanley Kunitz: ‘It may be that some people have turned to my poems because of the very things that are wrong with me, I mean the difficulty I have with ordinary living …' I think that's right: The idea of the ‘mad artist' is a mystification, a secular version of the divinely inspired genius; but there is no doubt that in the middle of the twentieth century mental illness began to emerge as a topic in US poetry in particular, and Lowell was one of its most sensitive registers, the equal of John Berryman and Sylvia Plath. Whatever his personal failings (and they were many) his best poems offer an exquisite exploration of this inescapable modern theme.

He was born in Boston into a ‘Brahmin' family – i.e. a family that can claim descent from the original English colonists, or ‘Mayflower screwballs' as Lowell would later call them in ‘Waking in the Blue'. As a boy he was violent and unpredictable, earning himself the nickname ‘Cal' – a nod to the pitiless Roman Emperor Caligula, and to Caliban, the bitter savage in Shakespeare's The Tempest. He attended Harvard, then Kenyon College, graduating in 1940. A convert to Roman Catholicism, he refused the draft in 1943 in protest at the allied bombing of German cities. (His subsequent stint in jail is recounted in his poem, ‘Memories of West Street and Lepke'.)

The turbulence and rebelliousness of Lowell's early adulthood were reflected in his poetry. The first collection, Lord Weary's Castle (1946), is a blast against the ‘spirit of New England', the ‘hell-fire streets' of Calvinist Boston. His passion is the passion of the religious convert, his Catholicism charged with puritanical zeal. Typically his poems will attempt a marriage between the modern world and a religious theme, a marriage that, like Lowell's own marriages, seems always to be on the brink of collapse. The incongruity is intentional: Lowell means to give us a vision of the world as having dropped short of expectations, as having forsaken the word of God and descended into war and commerce.

Read more »

Trying to understand random violence

by Emrys Westacott

A man goes to the doctor because he is worried about a possibly malignant tumor on his neck. Two weeks later he goes back, concerned about another growth on his spine. Two more weeks and he again goes to the doctor to ask about a lesion in his mouth. Each time the doctor examines him carefully, conducts tests, and consults with colleagues. But each time, the physicians concern themselves mainly with the question of why the lesions appear where they do. Why has the tumor appeared on the neck rather than on the liver? Why on the spine and not in the brain? The patient can't help feeling that they are neglecting the more important question: why are tumors appearing in the first place?

Listening to some of the news coverage following the mass shooting in Las Vegas on October 1st, when Stephen Paddock opened fire from a hotel window on the audience at an outdoor concert, killing 58 and injuring over 500, I felt rather like this patient. Reports on NPR would typically begin: "Police still don't know why Stephen Paddock opened fire on …" Of course, it is legitimate and important to ask why this particular individual suddenly committed mass murder, just as it's worth asking why tumors appear where they do. Establishing correlations between acts of random violence and elements in the perpetrator's life story, situation, or psychological profile could possibly help us anticipate and thereby forestall future tragedies. But we also need to ask the more fundamental question. Why are lesions appearing on the body? Why are spree killings much more common in the US than in other countries?

First, it is worth establishing a few facts. According to a CNN report, there were 90 mass shootings in the US between 1966 and 2012. These are shootings that kill more than four people but don't include gang violence or incidents involving several family members. They include such spree killings as those at the Orlando night club (June 2016, 49 killed), Sandy Hook (Dec. 2012, 27 killed), and Virginia Tech (April 2007, 32 killed). In the rest of the world during this period, there were 292 incidents of this sort. And compared to other economically developed countries, the US is a total outlier. So although violent crime has declined significantly in the US over the last 20 years, the question still remains: why are there so many more spree shootings in the US than in other countries?

Read more »

Prison Literature: Constraint and Creativity

by Samir Chopra

The American philosopher Ivan Soll attributed "great sociological and psychological insight" to Hegel’s remarks that "the frustration of the freedom of action results in the search of a type of freedom immune to such frustration," that "where the capacity for abstract thoughts exists, freedom, outwardly thwarted, is sought in thought."[1] The perspicuity of this insight of Hegel—one found in Nietzsche and Freud’s explorations of the depths of human psychology too—is visible in a species of literary and intellectual production intimately associated with physical confinement: prison literature. This genre is populated with many luminaries: Boethius, John Bunyan, Marquis De Sade, Antonio Gramsci, Solzhenitsyn, Bukharin, Elie Wiesel, Henry Thoreau, Jean Genet—among others. These writers found constraint conducive to creativity; the slamming shut of one gate prompted the unlocking of another; confinement produced a search for “substitute gratification”–whether conscious or unconscious–and the channeling of the drive to freedom into the drive for concrete expression of abstract thought. Like Nietzsche in The Genealogy of Morals these writers argued—by the act of writing their works—that if pathological repression is to be avoided, our drives must be appropriately and masterfully directed toward alternative, creative, expression. The prison writer thus demonstrates the truth of the claim—with which Hannah Arendt and George Orwell’s visions in The Origins of Totalitarianism and 1984 resonate—that the prison officials who place prisoners in solitary confinement convey crucial information to future oppressors: mere imprisonment of the political or moral gadfly is not enough; if confinement is to work as a mode of repression, it must aspire to totality.

The Peculiarities of Prison

The central irony of the prison—as the prisoner quickly discovers—is that it is a zone of legal enforcement and lawlessness. Prisoners confront unblinking, resolute bureaucracy, beholden to its procedures and their utter rigidity, all the while knowing their guards—the corrections officers who can ‘correct’ them at any time—can violate them with impunity. The incarcerated are always aware they are powerless, that their guards can exert all manner of power over them. Prisoners do not just fear other prisoners; they fear the lawless application of the law too. Any formal legal redress available will not diminish the terrifying powerlessness in the face of a guard exerting total and final control over body and mind. The long arm of the law rarely reaches out to accost a prison guard; the prisoner is at the guard’s mercy.

Read more »

‘Minority’ Languages

by Carl Pierer

In a recent article, the philosopher Rebecca Roache raises the question of whether there are good reasons to preserve endangered languages. In particular, she worries about ‘minority languages', which she defines as: "(…) one that is spoken by less than half [of the population] (…) even in the country in which [it is] most widely spoken." Starting from "[t]he sorrow we feel about the death of a language (…)", she finds that languages, and endangered languages in particular, are valuable for two kinds of reasons.

First of all, there are scientific reasons. For instance, one of the big questions of linguistics, she writes, concerns the truth of the Sapir-Whorf hypothesis. The hypothesis comes in two versions: the strong one – roughly – states that language determines thought, whereas the weak one holds that language shapes thought. It seems that to decide whether either version of the hypothesis is true, many different languages would need to be studied. The greater the variety of languages examined, the more confidently a verdict can be reached. Moreover, since the hypothesis is about the mental processes of the speakers of a language, it is not enough to simply study dead languages. Therefore, endangered languages are valuable for linguistics because they provide rich study material.

The second reason is ‘sentimentality'. Here, Roache distinguishes two ways of sentimental valuing. The first is connected to what G. A. Cohen calls personal value. This would be a person valuing something due to their personal connection to it. An example could be a person's bike with which they cycled from Stockholm to Rome, or, to use Cohen's own example, an old eraser he bought when he first became a lecturer. The second is a person valuing an object because it is connected to someone or something they care about, e.g. "(…) parents around the world stick[ing] their children's drawings to the fridge." Roache then observes:

We can all agree that it is sentimental of Cohen to insist (as he did) that he would decline an opportunity to upgrade his old eraser to a brand-new one. Yet were the Louvre to decline an offer from a skilled forger to exchange the Mona Lisa for an ‘improved' copy that eliminated the damage suffered over the years by the original, we are unlikely to view this decision as sentimental.

This is surprising because the values informing both decisions are similar: one object with a history is valued over and above a ‘better' object lacking that history. What this shows is that we value certain things (cultural monuments, artefacts, artworks, etc.) for their "historical and cultural significance". This may be derided as ‘sentimental' in certain contexts. So, since "[h]istorical and cultural significance is part of why we value languages (…)", denying the value of languages (and hence ‘minority' languages) amounts to rejecting the value of the Mona Lisa and the Cologne Cathedral. Sentimental value cannot be put aside so easily, it seems.

Read more »

The moral case against procreation

David Benatar in Aeon:

In 2006, I published a book called Better Never to Have Been. I argued that coming into existence is always a serious harm. People should never, under any circumstance, procreate – a position called ‘anti-natalism’. In response, readers wrote letters of appreciation and support, and, of course, there was outrage. But I also got this message, which is the most wrenching feedback I have received:

I have suffered horribly since I was a teen because of severe bullying in school that left me profoundly traumatised to the point I had to abandon school. Unhappily, I also have terrible looks and I’ve been judged, mocked, insulted because of being ‘too ugly’ even by random strangers in the street what usually happens almost daily. I’ve been called the ugliest person they ever seen. That’s extremely hard to deal with. Then, to finish it, I’ve been diagnosed with a serious congenital heart disease when I was just 18, and today in my early 20s, I suffer from severe heart failure and malignant arrhythmia that threaten to kill me. My heart has almost stopped many times and I deal with the fear of sudden death each day of my existence. I am petrified by fear of death and the agony and torment of imminent death is indescribable. I don’t have much time left and the unavoidable will happen soon. My life has been pure hell and I don’t even know what to think anymore. Certainly, sentencing someone to such a world is the worst of all crimes, and a serious moral violation. If it wasn’t by my parents’ selfish desire, I wouldn’t be here today suffering what I suffer for no reason at all, I could have been spared in the absolute peace of non-existence but I am here living this daily torture.

One does not have to be an anti-natalist to be moved by these words (which are quoted with permission). Some might be inclined to say my correspondent’s situation is an exceptional one, which should not incline us towards anti-natalism. However, severe suffering is not a rare phenomenon, and thus anti-natalism is a view that, at the very least, should be taken seriously and considered with an open mind.

More here.

The newest AlphaGo mastered the game with no human input

Maria Temming in Science News:

The latest version of the computer program, dubbed AlphaGo Zero, is the first to master Go, a notoriously complex Chinese board game, without human guidance. Its predecessor — dubbed AlphaGo Lee when it became the first computer program with artificial intelligence, or AI, to defeat a human world champion Go player (SN Online: 3/15/16) — had to study millions of examples of human expert moves before playing practice games against itself. AlphaGo Zero trained solely through self-play, starting with completely random moves. After a few days’ practice, AlphaGo Zero trounced AlphaGo Lee 100 games to none, researchers report in the Oct. 19 Nature.

“The results are stunning,” says Jonathan Schaeffer, a computer scientist at the University of Alberta in Edmonton, Canada, who wasn’t involved in the work. “We’re talking about a revolutionary change.”

AI programs like AlphaGo Zero that can gain mastery of various tasks without human input may be able to solve problems where human expertise falls short, says Satinder Singh, a computer scientist at the University of Michigan in Ann Arbor. For instance, computer programs with superhuman smarts could find new cures for diseases, design more energy-efficient technology or invent new materials.

AlphaGo Zero’s creators at Google DeepMind designed the computer program to use a tactic during practice games that AlphaGo Lee didn't have access to.
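The excerpt only gestures at what self-play training means: the program starts from random moves, plays games against itself, and improves purely from the outcomes of those games. As a rough illustration of that idea (and only that), here is a minimal sketch of a self-play learner for the toy game of Nim, in which players remove 1 to 3 stones and whoever takes the last stone wins. The game, the tabular update rule, and all the parameter values are illustrative assumptions; AlphaGo Zero itself pairs a deep neural network with Monte Carlo tree search, which this sketch does not attempt.

```python
import random
from collections import defaultdict

PILE = 15                      # stones at the start of each game
ACTIONS = (1, 2, 3)            # a player may remove 1, 2, or 3 stones
ALPHA, EPSILON, EPISODES = 0.1, 0.2, 50_000

# Q[(stones_left, action)] -> estimated value of that move for the player making it
Q = defaultdict(float)

def choose(stones, explore=True):
    """Epsilon-greedy move selection over the legal actions."""
    legal = [a for a in ACTIONS if a <= stones]
    if explore and random.random() < EPSILON:
        return random.choice(legal)
    return max(legal, key=lambda a: Q[(stones, a)])

for _ in range(EPISODES):
    stones, history = PILE, []         # history records (state, action) for every move
    while stones > 0:
        action = choose(stones)
        history.append((stones, action))
        stones -= action
    # Whoever moved last took the final stone and wins. Walk the game backwards,
    # crediting +1 to the winner's moves and -1 to the loser's (players alternate).
    outcome = 1.0
    for state, action in reversed(history):
        Q[(state, action)] += ALPHA * (outcome - Q[(state, action)])
        outcome = -outcome

# Greedy policy learned purely from self-play, with exploration switched off.
print({s: choose(s, explore=False) for s in range(1, PILE + 1)})
```

Even this toy loop shows the property the article describes: with no examples of expert play to imitate, the greedy policy that emerges after training converges on the known optimal strategy for this version of Nim (leave the opponent a multiple of four stones whenever possible).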

More here.

The central story of our lives

Michael Saler in the Times Literary Supplement:

Once upon a time science seemed destined to replace religion as the source of all explanations. Today, however, “story” has become the master metaphor that we use to interpret experience, including the mysteries of God and Nature. This recourse to story-talk is everywhere, uniting the two cultures, the arts and sciences. It is thus not surprising to find the astrophysicist Sean Carroll endorsing Muriel Rukeyser’s line of poetry, “The universe is made of stories, not of atoms”. Carroll used it to support his own brief for the “poetic naturalism” of science: “That is absolutely correct. There is more to the world than what happens; there are the ways we make sense of it by telling its story”.

This cultural turn from metaphysics to metafictions helps to explain why so many readers, young and older, have greeted Philip Pullman’s La Belle Sauvage as if it were the Second Coming. A forthright atheist, Pullman has made the secular balm of stories one of his principal themes, finding in them the “capacity to enchant, to excite, to move, to inspire”. This holds true for “science stories” as well, assuaging our fear that science repudiates wonder for analysis, prescriptive morals for descriptive accuracy. Pullman insists that scientific narratives can be as marvellous as fairy tales, and as ethical as a chivalric quest. The key is that “we have to behave honestly towards them and to the process of doing science in the first place”.

More here.

Coming To America: The making of the South Asian diaspora in the United States

Namit Arora in The Caravan:

On a September night in 1907, an angry mob of about six hundred white people attacked and destroyed an Asian Indian settlement in Bellingham, in the north-western US state of Washington. Many of the traumatised residents fled to Canada. A San Francisco-based organisation called the Asiatic Exclusion League, dedicated to “the preservation of the Caucasian race upon American soil,” blamed the victims for the riot, adding that the “filthy and immodest habits” of Indians invited such attacks. Despite the small number of Indians in the United States—there were fewer than 4,000 at the time—the Asiatic Exclusion League had been warning of a “Hindu invasion” of the country’s west coast. Two months later, another angry white mob struck a settlement of Indian workers in Everett, Washington, forcibly driving them out of the town. In 1910, the US Immigration Commission on the Pacific Coast deemed Indians “the most undesirable of all Asiatics” and called for their exclusion.

Many anti-immigrant laws had already been enacted against other Asian communities, starting with the Chinese Exclusion Act of 1882. In 1907, a new law in the western US state of Oregon barred all Indians from becoming permanent residents (the state had long excluded black people). In 1913, California passed the Alien Land Law, mainly targeted at Japanese immigrants, after which California’s attorney general also barred Indians from owning property in the state. In 1914, at a congressional hearing on “Hindu immigration” led by a vitriolic representative from the seventh congressional district of California, Indians were variously called “a menace,” “thick-headed and obtuse,” illiterate, carriers of strange diseases, people who worked “too hard for too little,” and, according to a purportedly “scientific” document, “likely to deplete the vitality of our people, as the Negro had done.”

Now fast-forward a century. In an expression of poetic justice, California’s voters elected Kamala Harris, an Indian American, as the state’s attorney general in 2010. Two years later, the same seventh congressional district of California elected Ami Bera, another Indian American, as its congressman. Today, there are over three million Indian Americans, making up 1 percent of the US population. They are by far the richest and most educated ethnic group in one of the richest and most powerful countries in the world.

More here.

When Wealth Inequality Arose

Mark Barna in Discover Magazine:

We’ve heard how great times used to be, and I don’t mean in 1950s America.

For eons, our hunter-gatherer ancestors shared their spoils with one another, didn’t own much and had very little social hierarchy. Sure, it wasn’t all kumbaya and high-fives. But the fact that individuals had so few personal possessions took the bitter dish of economic inequality off the table. So how’d we get to a world today where 1 percent of the population controls so much of the wealth? That’s a complicated question. But scientists are in agreement that the Neolithic transition — when, between 9500 and 3000 BC, farming became the dominant subsistence strategy — was the moment economic inequality first flashed its billfold. The sedentary life of farming led to complex societies that included division of labor and land ownership. Farmers owning fertile fields got rich, while farmers with rocky plots got by or found other work.

Neolithic burial sites offer evidence of the growing divide between the rich and the poor. On the Balkan Peninsula, in a city called Varna, burials from the fifth millennium BC (about 1,000 years after the rise of agriculture in the region) show “some of the earliest evidence of extreme inequality in wealth,” according to a paper published in May. One individual “was buried with more gold than is known from any site prior to that time,” the authors say. Teresa Fernández-Crespo, a physical anthropologist, coauthored a paper, published in 2015 in the Journal of Archaeological Science, indicating that in seven megalithic graves in northern Spain, many of the burials between 3700 and 1500 BC were male adults. The stone monument structures, scientists say, are thought to convey status upon those interred. Burial there was “most likely restricted only to those with particular rights and privileges,” Crespo says. Last month, Crespo published a paper in the journal PLOS One suggesting that many people interred in the megaliths from 3500 to 1900 BC owned valley land rather than mountain land. The study was within a region in north-central Spain called Rioja Alavesa that boasts plains, rolling hills and mountainous terrain. Crespo studied both megalith burials and cave burials.

More here.

Wounded Women

Jessa Crispin in Boston Review:

If you are wounded, everything you do is brave and beyond reproach. If you are wounded, you get to say that any portrayal of a woman as lying or manipulative is harmful to the culture and all of the future wounded women. If you are wounded, you get to control what is said and thought about you, and you get to try to create a criticism-free world.

The world is not a safe place. It harms us, jostles us, exposes us to burns and pricks. So we tell ourselves and each other stories to help us understand the what and the why. If we didn’t, we would all be like Melzack’s dogs, unsure who is hurting us or what is to be done about it. But it is easy to misdiagnose the source of the problem, and once you do, the proper treatment will also elude you. Universalizing our pain challenges the culture to protect us, but it diminishes our individual responsibility. These stories gain traction because they validate what we feel—vulnerable and tossed around—and give us simplistic reasons for why we feel this way. If we claim vulnerability is our natural state, there is nothing we need to change. The world needs to change for us. Insisting we are distinct from men in our woundedness is an easy and soothing story. Men are the enemy who can redeem themselves by turning their nature to our benefit, by protecting us. But in the end we are estranged from our humanity. Here we are not participants in society; we are merely at the mercy of it.

More here.