Diving Deep into Danger

Nathaniel Rich in the New York Review of Books:

The first dive to a depth of a thousand feet was made in 1962 by Hannes Keller, an ebullient twenty-eight-year-old Swiss mathematician who wore half-rimmed glasses and drank a bottle of Coca-Cola each morning for breakfast. With that dive Keller broke a record he had set himself one year earlier, when he briefly descended to 728 feet. How he performed these dives without killing himself was a closely guarded secret. At the time, it was widely believed that no human being could safely dive to depths beyond three hundred feet. That was because, beginning at a depth of one hundred feet, a diver breathing fresh air starts to lose his mind.

This condition, nitrogen narcosis, is also known as the Martini Effect, because the diver feels as if he has drunk a martini on an empty stomach—the calculation is one martini for every additional fifty feet of depth. But an even greater danger to the diver is the bends, a manifestation of decompression sickness that occurs when nitrogen gas saturates the blood and tissues. The problem is not in the descent, but the ascent. As the diver returns to the surface, the nitrogen bubbles increase in size, lodging in the joints, arteries, organs, and sometimes the brain or spine, where they can cause pain and potentially death. The deeper a diver descends, the more slowly he must ascend in order to avoid the bends.
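(A back-of-the-envelope sketch, mine rather than the article's: the excerpt's one-martini-per-fifty-feet rule of thumb, alongside the standard approximation that seawater adds roughly one atmosphere of pressure per 33 feet of depth — the pressure figure is my addition, not the article's.)

```python
# Toy calculation of the "Martini Effect" described above, plus ambient
# pressure at depth. The 1-atm-per-33-ft figure is a standard approximation
# for seawater and is my addition; the martini rule is from the excerpt.

def ambient_pressure_atm(depth_ft: float) -> float:
    """Approximate absolute pressure: 1 atm at the surface + ~1 atm per 33 ft."""
    return 1.0 + depth_ft / 33.0

def martini_equivalent(depth_ft: float) -> float:
    """The article's rule: one 'martini' per 50 ft beyond the first 100 ft."""
    return max(0.0, (depth_ft - 100.0) / 50.0)

for depth in (100, 300, 728, 1000):  # 728 and 1000 ft are Keller's dives
    print(f"{depth:>5} ft: ~{ambient_pressure_atm(depth):5.1f} atm, "
          f"~{martini_equivalent(depth):4.1f} martinis")
```

At Keller's thousand feet, that works out to roughly 31 atmospheres and an eighteen-martini fog, which is why the three-hundred-foot limit seemed so unassailable.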

More here.

The Afghan End Game?

Ann Jones in TomDispatch:

The euphemisms will come fast and furious. Our soldiers will be greeted as “heroes” who, as in Iraq, left with their “heads held high,” and if in 2014 or 2015 or even 2019, the last of them, as also in Iraq, slip away in the dark of night after lying to their Afghan “allies” about their plans, few here will notice.

This will be the nature of the great Afghan drawdown. The words “retreat,” “loss,” “defeat,” “disaster,” and their siblings and cousins won’t be allowed on the premises. But make no mistake, the country that, only years ago, liked to call itself the globe’s “sole superpower” or even “hyperpower,” whose leaders dreamed of a Pax Americana across the Greater Middle East, if not the rest of the globe, is… not to put too fine a point on it, packing its bags, throwing in the towel, quietly admitting — in actions, if not in words — to mission unaccomplished, and heading, if not exactly home, at least boot by boot off the Eurasian landmass.

Washington has, in a word, had enough. Too much, in fact. It’s lost its appetite for invasions and occupations of Eurasia, though special operations raids, drone wars, and cyberwars still look deceptively cheap and easy as a means to control… well, whatever. As a result, the Afghan drawdown of 2013-2014, that implicit acknowledgement of yet another lost war, should set the curtain falling on the American Century as we’ve known it. It should be recognized as a landmark, the moment in history when the sun truly began to set on a great empire. Here in the United States, though, one thing is just about guaranteed: not many are going to be paying the slightest attention.

More here.

Cambridge, Cabs and Copenhagen: My Route to Existential Risk

Huw Price in the New York Times:

In Copenhagen the summer before last, I shared a taxi with a man who thought his chance of dying in an artificial intelligence-related accident was as high as that of heart disease or cancer. No surprise if he’d been the driver, perhaps (never tell a taxi driver that you’re a philosopher!), but this was a man who has spent his career with computers.

Indeed, he’s so talented in that field that he is one of the team who made this century so, well, 21st – who got us talking to one another on video screens, the way we knew we’d be doing in the 21st century, back when I was a boy, half a century ago. For this was Jaan Tallinn, one of the team who gave us Skype. (Since then, taking him to dinner in Trinity College here in Cambridge, I’ve had colleagues queuing up to shake his hand, thanking him for keeping them in touch with distant grandchildren.)

I knew of the suggestion that A.I. might be dangerous, of course. I had heard of the “singularity,” or “intelligence explosion” – roughly, the idea, originally due to the statistician I. J. Good (a Cambridge-trained former colleague of Alan Turing’s), that once machine intelligence reaches a certain point, it could take over its own process of improvement, perhaps exponentially, so that we humans would soon be left far behind. But I’d never met anyone who regarded it as such a pressing cause for concern – let alone anyone with their feet so firmly on the ground in the software business.

I was intrigued, and also impressed, by Tallinn’s commitment to doing something about it.

More here.

Genetic evidence suggests that, four millennia ago, a group of adventurous Indians landed in Australia

From The Economist:

The story of the ascent of man usually casts Australia as the forgotten continent. Both archaeology and the genes of aboriginal Australians suggest that a mere 15,000 years were required for humanity to spread from its initial toehold outside Africa, on the Arabian side of the straits of Bab el Mandeb, to the land of Oz. The first Australians thus arrived about 45,000 years ago. After that, it took until 1788, when Captain Arthur Phillip, RN, turned up in Sydney Cove with a cargo of ne’er-do-wells to found the colony of New South Wales, for gene flow between Australia and the rest of the world to be resumed.

This storyline was called into question a few years ago by the discovery, in some aboriginal Australian men, of Y chromosomes that looked as though they had come from India. But the details were unclear. Now a study by Irina Pugach of the Max Planck Institute for Evolutionary Anthropology, in Leipzig, and her colleagues, which has just been published in the Proceedings of the National Academy of Sciences, has sorted the matter out. About 4,000 years before Captain Phillip and his merry men arrived to turn the aboriginals’ world upside down, it seems that a group of Indian adventurers chose to call the place home. Unlike their European successors, these earlier settlers were assimilated by the locals. And they brought with them both technological improvements and one of Australia’s most iconic animals.

More here.

Proton is smaller than we thought

Hamish Johnston in Physics World:

The radius of the proton is significantly smaller than previously thought, say physicists who have measured it to the best accuracy yet. The surprising result was obtained by studying “muonic” hydrogen in which the electron is replaced by a much heavier muon. The finding could mean that physicists need to rethink how they apply the theory of quantum electrodynamics (QED) – or even that the theory itself needs a major overhaul.

A proton contains three charged quarks bound by the strong force, and its radius is defined as the distance at which the charge density drops below a certain value. The radius has been measured in two main ways – by scattering electrons from hydrogen and by looking very closely at the difference between certain energy levels of the hydrogen atom called the Lamb shift. Until recently the best estimate of the proton radius was 0.877 femtometres, with an uncertainty of 0.007 fm.

This Lamb shift is a result of the interactions between the electron and the constituent quarks of the proton as described by QED. These interactions are slightly different for electrons occupying the 2S and 2P energy levels and the resulting energy shift depends in part on the radius of the proton.

However, in muonic hydrogen the Lamb shift is much more dependent on the proton radius because the much heavier muon spends more time very near to – and often within – the proton itself.

Now an international team led by Randolf Pohl at the Max Planck Institute for Quantum Optics in Garching, Germany, has measured the Lamb shift in muonic hydrogen for the first time and found the proton radius to be 0.8418 fm, with an uncertainty of 0.0007 fm.
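(A quick back-of-the-envelope check, mine rather than the article's: assuming the two measurements are independent with Gaussian errors, the gap between the old and new values comes out to roughly five standard deviations — far too large to dismiss as noise.)

```python
# Significance of the discrepancy between the two proton-radius values
# quoted above, assuming independent measurements with Gaussian errors.
import math

r_old, sigma_old = 0.877, 0.007     # fm, previous best estimate
r_new, sigma_new = 0.8418, 0.0007   # fm, muonic-hydrogen Lamb shift result

difference = r_old - r_new
combined_sigma = math.hypot(sigma_old, sigma_new)  # quadrature sum

print(f"difference:      {difference:.4f} fm")
print(f"combined sigma:  {combined_sigma:.4f} fm")
print(f"significance:   ~{difference / combined_sigma:.1f} standard deviations")
```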

More here.

Sunday Poem

Clary

Her cart like a dugout canoe.

Had been an oak trunk.

Cut young. Fire-scoured.

What was bark what was heartwood : Pure Char-Hole

Adze-hacked and gouged.

Ever after (never not) wheeling hollow there behind her.

Up the hill toward Bennett Yard; down through Eight-Mile, the Narrows.

Comes Clary by here now

Body bent past bent. Intent upon horizon and carry.

Her null eye long since gone isinglassy, opal.

—The potent (brimming, fluent) one looks brown.

Courses Clary sure as bayou through here now

Bearing (and borne ahead by) hull and hold behind her.

Plies the dark.

Whole nights most nights along the overpass over Accabee.

Crosses Clary bless her barrow up there now

Pausing and voweling there— the place where the girl fell.

( )

Afterwhile passing.

Comes her cart like a whole-note held.

by Atsuro Riley
from Poetry, Vol. 192, No. 5
publisher: Poetry, Chicago, 2008

The end of an epithet: How hate speech dies

From Time:

I thought about that moment last weekend, when my 12-year-old daughter was having a Harry Potter-themed sleepover with a few of her friends. One of the girls was recalling a moment in a Potter book and came up short as she groped for a word. She was looking for ferret, but what came out was faggot. Another girl immediately jumped. “That’s a bad word,” she said. The first girl asked what it meant and after she was told, simply nodded her head at the nastiness of the thing. The girls, in effect, had gang-tackled the word, first by opprobrium, then by indifference—and then they went back to their playing. The slow, inexorable sunset of this most-used and most-loathed gay slur is by no means complete. It still burns brightly and horribly in far too many places and far too many lives, but its day is undeniably passing — a process only hastened by President Obama’s inaugural address, which included an explicit call for the rights of “our gay brothers and sisters” and memorably invoked the lessons of Seneca Falls, Selma and Stonewall. How this particular bit of hate speech finally dies will be a lesson both in the way a language and, more important, a culture matures.

The roots of the anti-gay f-word are not what most people think they are. Popular lore has it that suspected homosexuals were once put to death by fire, and that piles of sticks — or “faggots,” in the antiquated term — were used as kindling. The pile-of-sticks definition is correct, but everything else appears not to be. “There’s no historical evidence that this is how and why it originated,” says Ben Zimmer, language columnist for the Boston Globe and executive producer of the website Vocabulary.com. “Its first recorded use was in the early 20th century, when it was applied to women. As with words like queen, it then became an epithet for gay men.” But there’s value even in the etymological misconception. Gay people may never have been put to the torch, but the widespread belief that they were serves to sensitize people to the very real bigotry—and often very real danger—they’ve faced over the centuries. “Even if it has no historical truth it has a different kind of truth as a lesson,” Zimmer says. Epithets fade not just by public censure and growing disuse, but by appropriation. Queer used to pack a terrible punch of its own until gays picked it up and began using it in chants (“We’re here, we’re queer, get used to it!”), as a name for an activist group (Queer Nation) and in the “queer studies” programs offered in many college curricula.

More here.

Pride and Prejudice: universally acknowledged guide to the human heart

From The Telegraph:

“It is a truth universally acknowledged that a single man in possession of a good fortune must be in want of a wife.” Thus begins Jane Austen’s Pride and Prejudice, one of the most famous opening lines of any novel ever written. It is a story that has touched hearts for exactly 200 years: girl meets boy, girl loses boy, girl gets boy.

…And when Austen wasn’t slicing up the men, she was defining women into tribes (long before the Spice Girls): the pretty, the funny, the clever, the bookish, the bold. Of course, I knew that in real life I was an Elizabeth – not the handsomest, not the fastest, but the “sparkiest” of girls. My true love would value me for my mind first and foremost, and that – like Elizabeth – is what I would want.

Some warn that Pride and Prejudice sets modern girls up to fail. At night, we dream of an honourable man like Darcy. By day, we learn that many modern men favour the pulchritudinous countenance of a Miss Jane Bennet, the rather relaxed morals sported by Lydia-a-likes, and especially the juicy inheritance behind an Anne de Bourgh. Against those temptations, which Elizabeth among us fancies her chances? Coraggio, whispers the author, be true to yourself. Thirty-five years later, living just eight miles from Chawton, Austen’s home, now a museum devoted to her, I find my love for the book endures (although I have long since found my Darcy). So what keeps me – and so many others – wedded to this novel? Especially when we could just whack on the Colin Firth box set instead? Certainly, I enjoy a hit of Georgian grace and fantasy: a dip into that world where problems could be solved by a new gown, an invitation to a ball, or some scrumptious item of gossip. And I appreciate more knowingly Austen’s descriptions of how money rules society. But it is Austen’s knack of describing the human heart that still sets my literary pulse racing, and makes me long for a quiet corner in which to curl up with the book. And now I read it with my daughter in mind; will she, too, find Pride and Prejudice, the gold standard of love stories, a primer for romantic life?

More here.

For the first time in history we could end poverty while protecting the global environment. But do we have the will?

John Quiggin in Aeon:

Even to those who are thoroughly inured to warnings of impending catastrophe, the World Bank’s recent report on climate change, Turn Down the Heat (November 2012), made for alarming reading. Looking at the consequences of four degrees of global warming, a likely outcome under current trajectories, the Bank concludes that the full scope of damage is almost impossible to project. Even so, it states: ‘The projected impacts on water availability, ecosystems, agriculture, and human health could lead to large-scale displacement of populations and have adverse consequences for human security and economic and trade systems.’ Among the calamities anticipated in the paper are large-scale dieback in the Amazon, the collapse of coral reef systems and the subsistence fishing communities that depend on them, and sharp declines in crop yields.

By contrast, most of us are already inured to the continuing catastrophe reported in the Bank’s annual World Development Report. Hundreds of millions of people go hungry every day. Tens of millions die every year from easily treatable or preventable diseases. Uncontrolled climate change could produce more crop failures and famines, and spread diseases and the pests that cause them even more widely.

Economic development and technological progress provide the only real hope of lifting billions of people out of poverty and destitution, just as it has done for the minority in the developed world. Yet the living standards of the developed world have been built on cheap energy from carbon-based fossil fuels. If everyone in the world used energy as Americans or even Europeans do, it would be impossible to restrict climate change to even four degrees of warming.

For those of us who seek a better life for everybody, the question of how much our environment can withstand is crucial. If current First World living standards can’t safely be extended to the rest of the world, the future holds either environmental catastrophe or an indefinite continuation of the age-old struggle between rich and poor. Of course, it might hold both.

More here.

Why Did Men Stop Wearing High Heels?

Via Laura Agustín, William Kremer in the BBC:

[T]he intellectual movement that came to be known as the Enlightenment brought with it a new respect for the rational and useful and an emphasis on education rather than privilege. Men's fashion shifted towards more practical clothing. In England, aristocrats began to wear simplified clothes that were linked to their work managing country estates.

It was the beginning of what has been called the Great Male Renunciation, which would see men abandon the wearing of jewellery, bright colours and ostentatious fabrics in favour of a dark, more sober, and homogeneous look. Men's clothing no longer operated so clearly as a signifier of social class, but while these boundaries were being blurred, the differences between the sexes became more pronounced.

“There begins a discussion about how men, regardless of station, of birth, if educated could become citizens,” says Elizabeth Semmelhack of the Bata Shoe Museum in Toronto.

“Women, in contrast, were seen as emotional, sentimental and uneducatable. Female desirability begins to be constructed in terms of irrational fashion and the high heel – once separated from its original function of horseback riding – becomes a primary example of impractical dress.”

High heels were seen as foolish and effeminate. By 1740 men had stopped wearing them altogether.

But it was only 50 years before they disappeared from women's feet too, falling out of favour after the French Revolution.

By the time the heel came back into fashion, in the mid-19th Century, photography was transforming the way that fashions – and the female self-image – were constructed.

Fraud, Disclosure, and Degrees of Freedom in Science

Robert Trivers in Psychology Today:

I point out in The Folly of Fools that science is naturally self-correcting—it requires experiments, data gathering and modes of analysis to be fully explicit, the better to be replicated and thus verified or falsified—but where humans or social behavior are involved, the temptation for quick and illegitimate progress is accelerated by the apparent importance of the results and the difficulty of checking on their veracity. Recently, cases of deliberate fraud have been uncovered in the study of primate cognition (Harvard), the health benefits of resveratrol (U Conn), and numerous social psychology findings (Tilburg U, Netherlands). I will devote some later blogs to other aspects of fraud in science but will begin here with a very clever analysis of statistical fraud and lack of data sharing in psychology papers published in the United States. This and related work suggest that the problem of fraud in science is much broader than the few cases of deliberate, large-scale fraud might suggest.

Wicherts and co-authors made use of a little-noted feature of all papers published in the more than 50 journals of the American Psychological Association (APA)—the authors of these papers commit by contract to sharing their raw data with anyone who asks for it, in order to attempt replication. Yet earlier work by this same group showed that for 141 papers in four top APA journals, 73 percent of the scientists did not share data when asked to. Since, as they point out, statistical errors are known to be surprisingly common, accounts of statistical results are sometimes inaccurate, and scientists are often motivated to make decisions during statistical analysis that are biased in their own preferred direction, they were naturally curious to see if there was any connection between failure to report data and evidence of statistical bias.

Here is where they got a dramatic result. They limited their research to two of the four journals whose scientists were slightly more likely to share data and most of whose studies were similar in having an experimental design. This gave them 49 papers. Again, the majority failed to share any data, instead behaving as a parody of academics.
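(A sketch of the kind of 2×2 association test that question implies — do papers whose authors withhold data show more statistical errors? The counts below are hypothetical placeholders; they sum to the 49 papers mentioned, but the excerpt does not give the real breakdown, so this only illustrates the mechanics.)

```python
# Illustrative 2x2 association test: data sharing vs. statistical errors.
# The counts are HYPOTHETICAL placeholders, not Wicherts et al.'s results.
from scipy.stats import fisher_exact

#                     error reported   no error
shared_data         = [3,              18]   # hypothetical counts
did_not_share_data  = [14,             14]   # hypothetical counts

odds_ratio, p_value = fisher_exact([shared_data, did_not_share_data])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```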

Growth and Political Change: Transition Duration is Critical

[Figure 1]

Caroline Freund and Mélise Jaud in Vox:

The Arab world is undergoing a major political transition. The final outcomes of the changes are far from certain in nations where they have occurred. The geographical spread of the changes is also far from clear at this point. Nevertheless, there have been and will continue to be economic consequences from the moves towards democracy (see Besley and Kudamatsu 2007).

In recent research (Freund and Jaud 2013), we have looked at historical experiences to get an idea of likely outcomes. Specifically, to get a sense of what to expect, we identified and examined 90 attempts at transition from autocracy to democracy that took place over the last half century. Our results offer a cautiously optimistic tale for the Arab countries: most transitions are successful politically and/or economically.

In particular, we find that about 45% succeeded, 40% failed, and 15% achieved democracy gradually. Success is defined as achieving a high level of democracy within three years and maintaining it; failure is when democracy is achieved temporarily or only at very low levels; and gradual is sustained and significant democratic change that takes 4-15 years to complete.
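(A schematic rendering of those three categories, purely to make the definitions concrete; the underlying democracy index and its cutoffs are not specified in the excerpt, so "reached a high level" and "sustained" appear here as abstract inputs.)

```python
# Schematic encoding of the three transition outcomes defined above.
# The democracy measure itself is abstracted into boolean inputs.

def classify_transition(years_to_democracy: float,
                        reached_high_level: bool,
                        sustained: bool) -> str:
    if reached_high_level and sustained and years_to_democracy <= 3:
        return "success"   # ~45% of the 90 attempts
    if reached_high_level and sustained and 4 <= years_to_democracy <= 15:
        return "gradual"   # ~15%
    return "failure"       # democracy temporary or at very low levels, ~40%

print(classify_transition(2, True, True))    # -> success
print(classify_transition(10, True, True))   # -> gradual
print(classify_transition(1, False, False))  # -> failure
```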

Importantly, we find that the majority of countries that underwent a transition experienced long-run gains in income growth following short-run declines (see Figure 1). Typically, countries face temporary challenges around the time of change, with growth declining by 7-11 percentage points in the year of transition, though in the case of gradual transitions the declines were much larger, around 21 percentage points, and lasted longer.

Why the Ideas of Karl Marx are More Relevant than Ever in the 21st century

Bhaskar Sunkara in The Guardian:

Capital used to sell us visions of tomorrow. At the 1939 World's Fair in New York, corporations showcased new technologies: nylon, air conditioning, fluorescent lamps, the ever-impressive View-Master. But more than just products, an ideal of middle-class leisure and abundance was offered to those weary from economic depression and the prospect of European war.

The Futurama ride even took attendees through miniature versions of transformed landscapes, depicting new highways and development projects: the world of the future. It was a visceral attempt to renew faith in capitalism.

In the wake of the second world war, some of this vision became a reality. Capitalism thrived and, though uneven, progress was made by American workers. With pressure from below, the state was wielded by reformers, not smashed, and class compromise, not just class struggle, fostered economic growth and shared prosperity previously unimaginable.

Exploitation and oppression didn't go away, but the system seemed not only powerful and dynamic, but reconcilable with democratic ideals. The progress, however, was fleeting. Social democracy faced the structural crisis in the 1970s that Michal Kalecki, author of The Political Aspects of Full Employment, predicted decades earlier. High employment rates and welfare-state protections didn't buy off workers; they encouraged militant wage demands. Capitalists kept up when times were good, but with stagflation – the intersection of poor growth and rising inflation – and the Opec embargo, a crisis of profitability ensued.

An emergent neoliberalism did curb inflation and restore profits, but only through a vicious offensive against the working class.

Torture, Drones, and Detention: A Conversation Between Laleh Khalili and Lisa Hajjar

Over at Jadaliyya Reports:

[A]n audio recording of a joint book talk held on 16 January 2013. Laleh Khalili and Lisa Hajjar recently published Time in the Shadows: Confinement in Counterinsurgencies and Torture: A Sociology of Violence and Human Rights, respectively. The event was held at the School of Oriental and African Studies, and featured a conversation between the two authors entitled “Torture, Drones, and Detention: The Vagaries of Liberal Warfare.” The discussion ranged from the Boer War to drone warfare, legal torture, the role of law, and everything in between.

How We Fight

From The New York Times:

The American occupation of Iraq in its early years was a swamp of incompetence and self-delusion. The tales of hubris and reality-denial have already passed into folklore. Recent college graduates were tasked with rigging up a Western-style government. Some renegade military units blasted away at what they called “anti-Iraq Forces,” spurring an inchoate insurgency. Early on, Washington hailed the mess as a glorious “mission accomplished.” Meanwhile, a “forgotten war” simmered to the east in Afghanistan. By the low standards of the time, common sense passed for great wisdom. Any American military officer willing to criticize his own tactics and question the viability of the mission brought a welcome breath of fresh air. Most alarming was the atmosphere of intellectual dishonesty that swirled through the highest levels of America’s war on terror. The Pentagon banned American officers from using the word “insurgency” to describe the nationalist Iraqis who were killing them. The White House decided that if it refused to plan for an occupation, somehow the United States would slide off the hook for running Iraq. Ideas mattered, and many of the most egregious foul-ups of the era stemmed from abstract theories mindlessly applied to the real world.

There is no one better equipped to tell the story of those ideas — and their often hair-raising consequences — than Fred Kaplan, a rare combination of defense intellectual and pugnacious reporter. Kaplan writes Slate’s War Stories column, a must-read in security circles. He brings genuine expertise to his fine storytelling, with a doctorate from M.I.T., a government career in defense policy in the 1970s and three decades as a journalist. Kaplan knows the military world inside and out; better still, he has historical perspective. With “The Insurgents: David Petraeus and the Plot to Change the American Way of War,” he has written an authoritative, gripping and somewhat terrifying account of how the American military approached two major wars in the combustible Islamic world. He tells how it was grudgingly forced to adapt; how it then overreached; and how it now appears determined to discard as much as possible of what it learned and revert to its old ways.

More here.

Why Are Superachievers So Successful?

From Smithsonian:

What does a Pulitzer Prize-winning war photographer have in common with a tennis legend? Or how about a celebrated opera diva and a Los Angeles civil rights lawyer? What does Alec Baldwin have in common with Yogi Berra? A lot, says journalist Camille Sweeney, who, along with co-author Josh Gosfield, interviewed dozens of highly accomplished men and women for a new book, The Art of Doing: How Superachievers Do What They Do and How They Do It So Well. Whether someone is setting out to create one of the most popular blogs on the Internet, as Mark Frauenfelder did with BoingBoing, or to win a record amount of money on “Jeopardy!,” people who accomplish amazing things rely on a particular collection of strategies to get to the top—and many of them are not what you’d expect.

Who is a superachiever? Somebody at the top of their craft. Ken Jennings, for example, didn’t just win on “Jeopardy!,” he was the winningest contestant ever on “Jeopardy!”—he won 74 times. It’s the person who is going beyond success.

Do you think that the people you interviewed for the book are fundamentally different from the rest of us? No! It’s interesting. I think when we started out I might have thought that. But after talking to them and really thinking about their lives, I don’t think that they’re different. When they arrived at what they thought they were going to be doing, they just kept at it. They kept up the energy. And when all the doubters and the haters were saying, “This isn’t going to work,” they didn’t listen. When they felt like they could learn something, they took what they could. It gave me hope that if you put your mind to something, you can be a superachiever. It takes a lot of work, and the work doesn’t stop. These people are pretty 24/7 about what they’re doing.

Even if we aren’t superachievers, can regular people use these techniques and strategies in our own lives? Absolutely. There is a process of doing everything. Superachievement may seem like this impenetrable block of success, this almost intimidating concept. But when you break it down into very small things, or patterns to the way somebody does something, you can grab it and absorb it right into your life. There is this exciting opportunity for people to start seeing the world through this different lens, whether you’re looking at the people we chose or people in your life.

More here.

Saturday Poem

Djinn

Haunted, they say, believing
the soft, shifty
dunes are made up
of false promises.

Many believe
whatever happens
is the other half
of a conversation.

Many whisper
white lies
to the dead.

“The boys are doing really well.”

Some think
nothing is so
until it has been witnessed.

They believe
the bits are iffy;

the forces that bind them,
absolute.

by Rae Armantrout
from Poetry, Vol. 192, No. 3
publisher: Poetry, Chicago, 2008