Life outside the lab: The ones who got away

Ewen Callaway in Nature:

When Soroosh Shambayati left his organic-chemistry lab, he didn't leave chemical synthesis behind. As a chemistry PhD turned investment banker, he started working in the derivatives market in the 1990s. The transactions involved arranging a complex series of trades in a precise order, and it reminded him of synthesizing an organic compound, reaction by reaction. As a graduate student, Shambayati had excelled at synthesis, just as he did at everything he turned his hand to. He was “other-worldly brilliant”, says his former adviser Stuart Schreiber. He juggled three distinct projects during his PhD: one in organic synthesis, one in theoretical physical chemistry and a third in biochemistry and immunology. He was also calm, thoughtful and well read: his bookshelf spans science philosophy, evolutionary biology and physics. Schreiber, a biochemist at the Broad Institute in Cambridge, Massachusetts, knew that if Shambayati wanted to become an academic scientist, he was sure to succeed. “It was very clear to me that he was going to become a star,” he says. But Shambayati chose the financial world — and excelled there instead: he is now chief executive at Guggenheim Investment Advisors (Suisse) in Geneva, Switzerland, a firm that manages billions of dollars for wealthy families and foundations.

Shambayati is among the hundreds of thousands of scientists who train in academia but then leave to follow a different career. According to the latest survey of doctorate recipients conducted by the US National Science Foundation, nearly one-fifth of employed people with science and engineering PhDs were no longer working in science in 2010. This is partly due to a lack of room at the top. In the United States, the number of PhDs entering the workforce has skyrocketed but the number of stable academic jobs has not. In 1973, nearly 90% of US PhDs working in academia held full-time faculty positions, compared with about 75% in 2010.

More here.

Predictions for the Year 2514: Dystopia That Feels Like Utopia

Anis Shivani in TruthOut:

My intuition says that the future will be an extreme form of dystopia, encompassing all the tendencies the humanist-pessimists have amply illustrated over the course of the twentieth century, with added restraints imposed by new developments in genetic engineering, artificial intelligence, and quantum physics. But it will not appear as dystopia to our descendants; rather, it will appear as the fulfillment of utopia. This radical rupture between reality and perception is already evident to a large extent in the management of human affairs, as capitalism perfects its skills at making us eagerly swallow what used to be unpalatable not very long ago.

I remain highly skeptical of the utopian promises of nanotechnology (which, it is said by the optimists, will do away with all forms of material scarcity, since we will supposedly be able to create any product ex nihilo) and artificial intelligence (I do not think that we are anywhere near real intelligence on the part of machines, despite all the credit given to the continued inexorability of Moore's law – i.e., the doubling of computing capacity every two years). I think there is a greater possibility that genetic engineering might perhaps lead to a real extension of the human lifespan (perhaps 120 years, or 150 years, or even 200 years or beyond), resulting from changes in the human germline; this too is a far-fetched notion at the moment, but it seems to have a chance of happening.

More here.

Wednesday Poem

Learning of the Death of Al Purdy

Together we spent ten hectic days in London and Cardiff.
It was my first trip abroad but his umpteenth trip across the Atlantic.
Yet the notion of visiting London's galleries and museums
Had never occurred to him, so we jointly visited Madame Tussaud's Wax Museum;
He visited the pubs and napped a lot; I went to see all the rest.
All the books of V.S. Pritchett that he could buy he bought,
And in one pub he introduced me to three of his new Cockney friends.

He introduced me, as well, to a lot of other and different things,
Not ones that I would normally have found at all interesting.
But he shared his interests and concerns, abruptly, garrulously.
Anything ancient he found eloquent, worthy of one poem or two poems:
“Isn't there a word for the word 'Catholic' that is spelled with a K?”
“What d'ya suppose they built London Bridge for in the first place?”
“In London, there's got to be a pub at every street-corner!”

One of the experiences that he related to me will never be forgotten:
“I was workin' on this old house that my father'd built,
Trying to unscrew one of his god-damn door hinges.
Try as I might, I couldn't loosen that stubborn screw.
I thought, 'Now I'm pittin' my strength against my old man's
And I'm losin', and that's the way it's gotta be, I guess.”
I too guess: It was a duel and the only one of his that ended in a draw.

by John Robert Colombo
from Canadian Poetry Online

Perspectives on Cultural Evolution

Last May at the Santa Fe Institute, Daniel Dennett gathered Susan Blackmore, Robert Boyd, Nicolas Claidière, Peter Godfrey-Smith, Joseph Henrich, Olivier Morin, Peter Richerson, Dan Sperber, and Kim Sterelny to discuss cultural evolution (via Dan Sperber). Over at the International Cognition and Culture Institute, there are summaries of each of their comments. Dennett:

The working group agreed on a number of points, some methodological and some substantial, that are still considered controversial by others, or in some cases just not yet considered:

1. We should be Darwinian about Darwinism; there are few if any bright lines between phenomena of cultural change for which cultural natural selection is clearly at work and phenomena of cultural change that are not at all Darwinian. The intermediate and mixed cases need not be marginal or degenerate, a fact nicely portrayed in Godfrey-Smith’s Darwinian Spaces.

2. Models must always “over-”simplify, and the existence of complications and even “counterexamples” relative to any model does not automatically show that the model isn’t valid when used with discretion. For instance, the absence of explicit treatment of SCM’s “hetero-impacts” in BRH’s models “does not amount to a denial of its importance” (Godfrey-Smith). Grain level of modeling and explaining can vary appropriately depending on the questions being addressed.

3. The traditional idea that human culture advances primarily by “improvisational intelligence,” the contributions of insightful, intentional, comprehending individual minds, is largely mistaken. Just as plants and animals can be the beneficiaries of brilliant design enhancements that they cannot, and need not, understand, so we human beings enjoy culturally evolved competences that far outstrip our individual comprehension. Not only do we not need to “re-invent the wheel,” we do not need to appreciate or understand the design of many human institutions, technologies, and customs that nevertheless contribute to our welfare in various ways. Moreover (a point of agreement between Sperber and Boyd, for instance), the opacity of some cultural memes (their inscrutability to human comprehension) is often an enhancement to their fitness: “This opacity—which is a matter of degree, of course—is what makes social transmission so important. It plays, I believe, a crucial role in the acceptability of cultural traits: it is, in important ways, easier to trust what you don’t fully understand and hence cannot properly evaluate on its own merits.” (Sperber)

More here from Dennett. Other participants' comments can be found here.

Can News Literacy Grow Up?

Lindsay Beyerstein in The Columbia Journalism Review:

In 2005, as Howard Schneider was developing a plan for Stony Brook University’s new journalism school, he taught a course called Ethics & Values of the American Press as a way to get to know the students. He was shocked to discover that about a third of his students believed everything they read—from The New York Times to People magazine—and judged it all to be equally credible. Another third reflexively rejected anything in the news as hopelessly biased. And the remaining third were confused and peppered him with questions, like, “Is Michael Moore a journalist?” and “Is Oprah a journalist when she interviews the survivors of Hurricane Katrina?”

“That class haunts me,” says Schneider, a former editor at Newsday. It also shaped his proposal for the new journalism school. At the time, Bowling Alone, Robert Putnam’s 2000 treatise on the decline of civic engagement in America, had helped spur a national debate about the future of democracy and what our young people needed to be effective citizens. Schneider was convinced that a modern journalism school could no longer teach only journalism; it needed to reinvent itself as the purveyor of a core competency for the entire student body: the ability to be savvy and critical consumers of news and information.

He oversaw the creation of a 15-week “news-literacy” class, open to all students at Stony Brook, and a movement was born. In 2006, the John S. and James L. Knight Foundation gave Stony Brook $1.7 million to enroll 10,000 students in the course—the university hit that mark this fall.

In the decade since, Schneider’s vision has inspired similar programs in schools and communities around the country, from Alan Miller’s News Literacy Project, which works with high schools and middle schools, to Free Spirit Media, which teaches media production and analysis to low-income kids in Chicago. Stony Brook launched a summer institute to teach news literacy to educators and has collaborated on programs in Bhutan, Hong Kong, Australia, Vietnam, and China.

Meanwhile, the need for news literacy has only grown.

More here.

Time Travel Simulation Resolves “Grandfather Paradox”

Lee Billings in Scientific American (via Jennifer Ouellette):

Recently [Tim] Ralph and his PhD student Martin Ringbauer led a team that experimentally simulated Deutsch's model of CTCs for the very first time, testing and confirming many aspects of the two-decades-old theory. Their findings are published in Nature Communications. Much of their simulation revolved around investigating how Deutsch's model deals with the “grandfather paradox,” a hypothetical scenario in which someone uses a CTC to travel back through time to murder her own grandfather, thus preventing her own later birth. (Scientific American is part of Nature Publishing Group.)

Deutsch's quantum solution to the grandfather paradox works something like this:

Instead of a human being traversing a CTC to kill her ancestor, imagine that a fundamental particle goes back in time to flip a switch on the particle-generating machine that created it. If the particle flips the switch, the machine emits a particle—the particle—back into the CTC; if the switch isn't flipped, the machine emits nothing. In this scenario there is no a priori deterministic certainty to the particle's emission, only a distribution of probabilities. Deutsch's insight was to postulate self-consistency in the quantum realm, to insist that any particle entering one end of a CTC must emerge at the other end with identical properties. Therefore, a particle emitted by the machine with a probability of one half would enter the CTC and come out the other end to flip the switch with a probability of one half, imbuing itself at birth with a probability of one half of going back to flip the switch. If the particle were a person, she would be born with a one-half probability of killing her grandfather, giving her grandfather a one-half probability of escaping death at her hands—good enough in probabilistic terms to close the causative loop and escape the paradox. Strange though it may be, this solution is in keeping with the known laws of quantum mechanics.
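
The arithmetic behind that one-half answer can be checked in a few lines. The sketch below is purely illustrative (it is not the researchers' code, and the function names are invented): it reads Deutsch's self-consistency condition as a fixed point of the probability map around the loop. In the switch scenario, assuming the particle is emitted with probability p implies, one trip around the CTC later, that it is emitted with probability 1 - p, and the only value consistent with itself is 1/2.

# Toy fixed-point solver for Deutsch-style self-consistency (illustrative only).
# loop_map(p) returns the emission probability implied by an assumed
# emission probability p after one trip around the closed timelike curve.
def consistent_probability(loop_map, p0=0.0, tol=1e-12, max_iter=10_000):
    p = p0
    for _ in range(max_iter):
        p_next = 0.5 * (p + loop_map(p))  # damped update; plain p = loop_map(p) would oscillate
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    return p

def grandfather(p):
    """The particle's arrival flips the switch, suppressing its own emission."""
    return 1.0 - p

print(consistent_probability(grandfather))  # prints 0.5, matching the article's figure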

In their new simulation Ralph, Ringbauer and their colleagues studied Deutsch's model using interactions between pairs of polarized photons within a quantum system that they argue is mathematically equivalent to a single photon traversing a CTC.

More here.

Earthly Happenings

James Ley in The Sydney Review of Books:

‘Odysseus’ Scar’, the opening chapter of Erich Auerbach’s Mimesis: The Representation of Reality in Western Literature (1946), is a classic of twentieth-century literary criticism — a brilliant comparative reading of sections of the Odyssey and the Book of Genesis as foundational texts of Western literature’s two great informing traditions: the Hellenic and the Judaeo-Christian. Auerbach first draws our attention to the moment in book nineteen of the Odyssey, after Odysseus has returned in disguise from his wanderings, when the old servant woman Euryclea notices a scar on his leg and recognises him. At this point in the narrative, there is a long digression that explains how Odysseus came to have the scar (a hunting accident) and how Euryclea is aware of this because she has known him since he was young. Auerbach contrasts this with the biblical story of Abraham, whom God orders to sacrifice his son, Isaac. Here we find a very different style of narrative, notable for its lack of explanatory detail. God speaks to Abraham from a contextless void. Abraham obeys without question. He travels for three days to the place where he is to kill his son, but details of the journey and his state of mind are absent.

Encoded in these contrasting narrative styles, argues Auerbach, are fundamentally different ways of representing and therefore understanding reality. In the Odyssey, as in the Iliad, there is only foreground. Everything is explained and externalised; nothing is allowed to remain obscure. People do not change: they are who they are. Homer’s poetry can thus be analysed but it does not lend itself to reinterpretation. The elliptical Old Testament stories, on the other hand, open up interpretive spaces that admit figurative readings. Their perplexing omissions, which leave their protagonists’ motivations shrouded in mystery, create suspense and psychological intrigue.

So it is the biblical style that anticipates the modern notion of character as a layered psychological phenomenon, something that retains an element of inscrutability and is capable of developing over time. But no less important for Auerbach is the implication of an entirely different conception of history.

More here.

reckoning with edmund burke

Ferdinand Mount at The London Review of Books:

‘You could not stand five minutes with that man beneath a shed while it rained, but you must be convinced you had been standing with the greatest man you had ever yet seen.’ Dr Johnson’s remark on Edmund Burke, related in one of Hester Thrale’s anecdotes, is unforgettable. The greatest Tory of the 18th century takes off his hat and makes the lowest possible bow to the much younger Irish Whig (Burke’s dates are 1729-97, Johnson’s 1709-84). Johnson’s veneration started a fashion which lasted long after Burke’s death. By 1856, Karl Marx, who himself denounced Burke as a sycophant and ‘out-and-out vulgar bourgeois’, was also telling the readers of the New York Daily Tribune that he was ‘the man who is held by every party in England as the paragon of British statesmen’. Burke was revered by Tories and Liberals alike, if with rather different motives, not just for his torrential eloquence but as a politician who somehow transcended politics and as a philosopher who uniquely immersed himself in the world.

Then, quite suddenly, it all changed. For the next century or so, Burke was reviled with the same enthusiasm as he had been praised: he was a corrupt placeman, a party hack, a coward and a stick-in-the-mud, a reactionary mystagogue, his speeches and writings irredeemably tainted by his personal corruption and his superstitiousness.

more here.

Norman Mailer’s “A Fire on the Moon”

Geoff Dyer at Threepenny Review:

Mailer starts with the news of Hemingway’s death; we’ll start with Ezra Pound’s claim, in ABC of Reading, that literature “is news that STAYS news.” The appeal of having one of America’s best-known writers cover the biggest news story of the decade—probably of the century, conceivably of all time—was obvious, and Mailer was a natural fit. Back then a lot of people were quoting the opinion that he was the best journalist in America. One of those people was Mailer himself, who took umbrage at praise that tacitly downgraded his achievements as a novelist. This gets aired very early on in a book in which, sooner or later, most things get aired. The irony is that Mailer “knew he was not even a good journalist.” Unless, that is, he could succeed in redefining and enlarging journalism to cover pretty much everything, including the writing of the book in which the attempt would be made. Imagine Laurence Sterne with a huge subject, a big advance, and a looming deadline and you have some sense of the conflicting pressures at work on Of A Fire on the Moon (the original American title).

The deadline needs emphasizing. Other writers had plenty to say about the moon landing—everyone had something to say about it—but few would have had the chops to bang out 115,000 words for publication in three issues of Life magazine, the first tranche of which, Mailer groans, was due less than three weeks after the astronauts splashed down in the Pacific.

more here.

A sweeping account of how the Reagan years began as the Nixon era

Christopher Caldwell at Bookforum:

“HE WORE A PURPLE PLAID SUIT his staff abhorred and a pinstripe shirt and polka-dot tie and a folded white silk puffing up extravagantly out of his pocket.” This was not some tea-sipping Edwardian dandy. It was Ronald Reagan announcing his presidential candidacy at the National Press Club in November 1975, as described by the historian Rick Perlstein. Back then, Reagan was, to most people, a novelty candidate, with a bit of the fop or eccentric about him. Political affinities and antipathies have since hardened into a useful but wholly unreliable historical “truth” about Reagan’s political career, one that casts him as either a hero or a villain. It requires an effort of the imagination to see him as the voters he addressed did.

Most historians of the late twentieth century wallow in their youthful prejudices. Not Perlstein. For two decades, he has been scraping away layers of self-justifying platitudes and unreliable recollections. A leftist (one assumes) with an empathy for insurgents and underdogs of all stripes, he has opted not to write the eleventy-zillionth recapitulation of this or that New Deal or civil-rights milestone. He has focused instead on the followers of various reviled and misunderstood conservatives, particularly Barry Goldwater and Richard Nixon, sometimes revealing in them an affinity for straightforward radicalism. He is a man of the archives—patient, punctilious, refreshingly disinclined to moralize.

Perlstein’s ambitious new chronicle, The Invisible Bridge, runs from Nixon’s reelection in 1972 to the 1976 Republican-primary campaign, when a staffer to Nixon’s successor, Gerald Ford, warned in a memo, “We are in real danger of being out-organized by a small number of highly motivated right-wing nuts.”

more here.

The Unbearable Emptiness of a New York Times Op-Ed

Peter Beinart in The Atlantic:

I have my concerns about President Obama’s foreign policy. But nothing eases them like listening to his Republican critics. There’s an onion-like quality to the arguments GOP politicians often deploy against Obama’s policies in the Middle East. Peel away the layers of grave-sounding but vacuous rhetoric, and you’re left with almost nothing intellectually nourishing at all. Take Senators John McCain and Lindsey Graham’s op-ed on Saturday in The New York Times. It starts with a lie: that Obama said “we don’t have a strategy yet” to deal with ISIS. In fact, Obama was speaking solely about ISIS in Syria. (“Do you need Congress’s approval to go into Syria?” asked a reporter last Thursday. “We don’t have a strategy yet. … We need to make sure that we’ve got clear plans, that we’re developing them. At that point, I will consult with Congress,” Obama replied.) When it comes to Iraq, by contrast, the Obama administration does have something of a strategy: It is launching air strikes to protect imperiled religious groups, bolstering the Kurdish Peshmerga even though that may embolden Kurdish leaders to seek independence, and using the prospect of further air strikes to encourage Iraq to form a government that includes Sunnis in the hope this will convince them to abandon ISIS. Later in their op-ed, McCain and Graham call for Obama to “strengthen partners who are already resisting ISIS: the Kurdish pesh merga, Sunni tribes” and push for “an inclusive government in Baghdad that shares power and wealth with Iraqi Sunnis.” In other words, they call on Obama to pursue the same strategy in Iraq that he’s already pursuing, while simultaneously twisting his words to claim that he’s admitted to having no strategy at all.

…It’s a wonderful illustration of the emptiness of much Beltway foreign-policy-speak. McCain and Graham want Obama to act both “deliberately” and “urgently” because they’re both happy words. (As opposed to “lethargically” and “rashly,” which are nastier synonyms for the same thing.) But when you translate these uplifting abstractions into plain English, you see how contradictory McCain and Graham’s demands actually are. You can either demand that Obama not bomb Syria until he’s ensured he has a plan likely to win international and congressional support, or you can demand that he bomb as soon as possible. You can’t demand both. One reason Obama isn’t bombing in Syria yet is that he’s not clear on what the goal would be. McCain and Graham are. “ISIS,” they write, “cannot be contained.” Why not? Hasn’t the U.S. been containing al-Qaeda—ISIS’s estranged older brother—for more than a decade now? But the two senators don’t pause to explain. “It must be confronted,” they declare. What does that mean? If the U.S. is bombing ISIS in Iraq, aren’t we confronting the group already?

More here.

A Call for a Low-Carb Diet

Anahad O'Connor in The New York Times:

People who avoid carbohydrates and eat more fat, even saturated fat, lose more body fat and have fewer cardiovascular risks than people who follow the low-fat diet that health authorities have favored for decades, a major new study shows. The findings are unlikely to be the final salvo in what has been a long and often contentious debate about what foods are best to eat for weight loss and overall health. The notion that dietary fat is harmful, particularly saturated fat, arose decades ago from comparisons of disease rates among large national populations. But more recent clinical studies in which individuals and their diets were assessed over time have produced a more complex picture. Some have provided strong evidence that people can sharply reduce their heart disease risk by eating fewer carbohydrates and more dietary fat, with the exception of trans fats. The new findings suggest that this strategy more effectively reduces body fat and also lowers overall weight.

…Diets low in carbohydrates and higher in fat and protein have been commonly used for weight loss since Dr. Robert Atkins popularized the approach in the 1970s. Among the longstanding criticisms is that these diets cause people to lose weight in the form of water instead of body fat, and that cholesterol and other heart disease risk factors climb because dieters invariably raise their intake of saturated fat by eating more meat and dairy. Many nutritionists and health authorities have “actively advised against” low-carbohydrate diets, said the lead author of the new study, Dr. Lydia A. Bazzano of the Tulane University School of Public Health and Tropical Medicine. “It’s been thought that your saturated fat is, of course, going to increase, and then your cholesterol is going to go up,” she said. “And then bad things will happen in general.” The new study showed that was not the case.

More here.

Tuesday Poem

As I Grew Older

It was a long time ago.
I have almost forgotten my dream.
But it was there then,
In front of me,
Bright like a sun—
My dream.
And then the wall rose,
Rose slowly,
Slowly,
Between me and my dream.
Rose until it touched the sky—
The wall.
Shadow.
I am black.
I lie down in the shadow.
No longer the light of my dream before me,
Above me.
Only the thick wall.
Only the shadow.
My hands!
My dark hands!
Break through the wall!
Find my dream!
Help me to shatter this darkness,
To smash this night,
To break this shadow
Into a thousand lights of sun,
Into a thousand whirling dreams
Of sun!

by Langston Hughes

Deep Surface

by Brooks Riley

Jean-Luc Godard once inscribed a picture to me with these words: “This is the surface, Brooks, and that’s why it’s deep.” At the time, I was skimming the surface, darting from one life experience to another without stopping to sink down or dive deeper—or give his jeu de mots much thought. While I always relished his love of word play in both English and French, this time I was suspicious of what sounded to me like a facile paradox.

As a man of cinema, Godard must first have thought of that great cinematic paradox, the flat screen and the depth of field that miraculously occurs when a film is projected onto it. In the photograph, he stands in front of a blank wall, very like the blank screen he would soon use for a shadow play to the opening bars of Mozart’s Requiem Mass in D Minor, defying the double-entendre of flatness and cinematic depth with a chiaroscuro ballet in front of the screen—a crane operator and his crane moving the camera and cameraman slowly up, over and then down again, a graceful pas de deux silhouetted against the flat white surface—a two-dimensional triumph.

How appropriate that this holy moment was filmed in the soundstage where the glitzy streets of Las Vegas had been built for Francis Ford Coppola’s One from the Heart. The crew, borrowed for a Saturday from that bigger film, consisted of Italians—Vittorio Storaro and his Italian crew—and the hardboiled Hollywood mainstream pros who had seen it all—or thought they had. As Godard piped Mozart over the loudspeakers, and the camera rolled, a cathedral hush permeated the vast interior of the soundstage as the middle-aged, elegant crane operator began to move in front of the screen with the assurance of a dancer, or a man who knows his job. When the music faded out at the end, the hush prevailed. No one, not the crew, not the visitors, not the cast, had ever seen anything like it. It was surface magic, deep beyond words. Now I knew that his inscription made sense.

(As a 9-year-old with not enough movie experience I could easily have retorted, ‘This is the surface, Jean-Luc, and it’s a grande illusion,’ as I waited in vain for Marlon Brando to emerge from the back door of my local movie theatre after a showing of Desiree.)

Too often surface is a euphemism for superficial. But living on the surface makes it easier to be ubiquitous. The assumption that one has to dig or dive for treasures is not necessarily reliable. Analogies can also be arrived at by moving far afield over a surface, like the gerridae, those bugs who walk on water, always finding what they need on top, not deep down. Knowledge is like that body of water: You can dive down into it, but to see clearly, you have to rise to the surface.

Read more »

Monday Poem

Parallel Universe

everything unknown returns to life
upon awakening in my bed supine in light
sun bequeathed day ignites a fire beneath
my blankets burn mind’s the filament of a lamp
upon awakening stupidity tumbles down a sheer of chance
small thoughts plunge they start an avalanche
the ground gives way beneath my feet
upon awakening where am I?
light ricochets from every wall
blind see deaf hear motion stills
minutiae interlock upon awakening
east and west do not collide they mesh
upon awakening bias stands upon its head
draining deadliness, its river Cocytus circles a sewer
upon awakening states recede decline abjure
the babble of all the varied words of god unite
upon awakening they steep in a cauldron of love
the clock’s a joke upon awakening
doors swing wide though no one knocks
upon awakening each ajar as each unlocks
windows blast from jambs upon awakening
lions lie with lambs every noise becomes a note
upon awakening every weight begins to float
even cacophony sings upon awakening
nothing is ever learned again by rote
upon awakening everything becomes
the final sacrificial goat

by Jim Culleny
8/31/14

A personal ethics of clicking

by Charlie Huenemann

Now that every click we make is watched, archived, and meta-data-fied, it is time to start thinking seriously about a personal ethics of internet consumption. This goes beyond mere paranoia and worry over what others might think of what you're taking interest in. Each click is in fact a tiny vote, proclaiming to content providers that you support this sort of thing, and hope to see more of it in the future. And – as always! – we should vote responsibly.

It's too bad, really. Gone are the days when, with the adjustment of a couple of browser settings (“Privacy – on!”), no one could ever know that we were clicking away at all sorts of embarrassments, from naked people to celebrity gossip to stuff that might accurately be labeled as very nasty. It was a seemingly harmless way to let that little id go crazy and graze its fill. Content providers happily supplied the forbidden fruits and we gobbled them up.

Now the jig is up. Privacy settings are as effective as the dark vs. light lever on a toaster. But more significant than any embarrassment we may feel is the fact that our clicking is factored into incredibly effective algorithms which help to steer more of the same our way. And as more of us click on crap, more and more similar crap is generated for consumption, and the internet gradually expands into wall-to-wall crap.

Immanuel Kant's perspective on ethics might suggest to us a Categorical Internet Imperative: Click only on those links that you can at the same time will all your fellow citizens to click on. I don't know about you, but many times I feel that if everybody were just clicking on what I'm clicking on, our culture would be racing toward – well, to pretty much where we are these days, I guess: a few reliable sources of insight and information doing their best to compete with freak shows, bear-baitings, and adorable kittens attacking paper bags.

Not to say we have to be Prussian prudes, of course. Insight and information can come from surprising places, and we surely need clowns to tip us off balance and question ourselves. And there's nothing wrong with just plain old fun (as if anyone needs to be told that). But once we begin seeing our clicks as tiny votes, we begin to think about what sort of sustenance we are channeling into our own minds, and what sort of diet we are recommending to our neighbors. Let us dream a little: if all the clickers out there aimed more consistently toward “good stuff”, content providers would be competing to produce more and more of that stuff. Gradually, one hopes, we would witness the ebbing of the crap, and the waxing of a gloriously informed and inspired culture.

(Okay, that's crazy talk. But even a small shift in that direction would be to the good.)

Read more »

Why I don’t like tipping

by Emrys Westacott

I dislike tipping. That is, I dislike the whole tipping system. As a card-carrying tightwad I can't honestly say I enjoy leaving tips, but that's not my point. My point is about the general practice, the social institution.

What set me thinking about this was a slightly unpleasant experience I had recently in a café in Quebec City. My wife and I had finished breakfast and after quite a long delay the waitress brought the bill. In Canada these days, as in Europe, it's normal for customers to pay using a portable credit card reader that is brought to the table. These reportedly reduce credit card fraud by eliminating the opportunity for dishonest wait staff to “skim” the credit card information while out of sight of the card owner. The bill is displayed on a screen along with various tipping options. These vary according to the machine, but a typical range of options is: 10%, 15%, 20%, custom tip, no tip. Usually I tip 15%, but on this occasion, partly because of the long delay in getting the bill, and partly because I felt the waitress had from first to last been unpleasantly condescending, I tapped the 10% button. She was looking over my shoulder (another thing I had against her), and immediately asked me if I was dissatisfied in any way with the service. Being taken by surprise, and also being a wimp, I answered “No.” She then told me that in Quebec it was normal to tip at least 15%. I said, “Oh, I didn't know,” and left the tip at 10%. If I'd been less of a wimp I would have explained my dissatisfaction and complained about her looking over my shoulder. Then again, if I'd been even wimpier I would have adjusted the tip according to her recommendation.

Tipping is a peculiar institution. Whether you leave a tip is optional, and there are many circumstances where you would suffer no adverse consequences (other than possible feelings of guilt) should you not tip: for instance, when you check out of a hotel, alight from a taxi, or eat at a restaurant you are unlikely to revisit. If we were nothing but little carbon-based bundles of rational self-interest, as some economists prone to abstraction have at times assumed, tipping would be much less common and might even never have become an established custom. In some places—Japan, Finland, South Korea, for instance—it isn't. And even in places like the US, where tipping is widespread, the conventions aren't especially consistent. Many people leave a tip for the person who cleans up their hotel room, but not for the person at the reception desk who checks them in and out. They add a tip for their hairdresser, but not for their dental hygienist.

Read more »

Inconsistent Mathematics, Reutersvärd, and Buddhism: An Interview with Chris Mortensen

by Michael Lopresto

Chris Mortensen is Emeritus Professor of Philosophy at the University of Adelaide. He thinks that the inconsistent hasn't been taken seriously enough in Western philosophy, that the masterpieces of Reutersvärd rub our noses in the inconsistent, and that Western philosophy and Buddhism are complementary. He's the author of Inconsistent Mathematics (1995) and Inconsistent Geometry (2010).

Firstly, what made you get into philosophy?

I think I was always interested in it, really—since high school, anyway. I was diverted for a while into maths and physics during my first couple of years at university, before coming back to philosophy. I realised that if what you want to do is what you like doing, then philosophy is the thing to do. I still kept up with the maths subjects, but philosophy was more fun, and I was better at it.

Had there always been a lot of overlap between your interest in philosophy and your interest in maths?

There was, but one thing I noticed was that my logic lecturers would always motivate what they were doing. They would tell you why this was interesting, why there was a debate here. Whereas my maths lecturers, on the other hand, tended to be very pure and syntactical, leaving aside motivation much of the time. Some logicians are very pure – some of my best friends are very pure. But perhaps it is possible to be a bit too pure and syntactical in philosophy; it depends on what you are trying to achieve, I suppose. Just pop down to the library and have a look at Russell and Whitehead's Principia Mathematica. It doesn't contain too much English (even though Russell excelled as a philosopher, as opposed to a logician).

Read more »