Jane O'Grady in The Guardian:
Jack Smart, who has died aged 92, changed the course of philosophy of mind. He was a pioneer of physicalism – the set of theories that hold that consciousness, sensation and thought do not, as they seem to, float free of physicality, but can – or will eventually – be located in a scientific material worldview. His article Sensations and Brain Processes (1959) put forward his Type Identity theory of mind – that consciousness and sensations are nothing over and above brain processes. Invariably included in any collection of mind-body problem papers, it is now part of the canon, for, along with UT Place and David Armstrong, Smart converted what was once “the Australian heresy” into orthodoxy.
While all three were based principally at Australian universities, Place was born in Yorkshire and Smart to Scottish parents in Cambridge, where his father was professor of astronomy. Jack went to the Leys School in the city, studied maths, physics and philosophy at Glasgow University, and during the second world war served mainly in India and Burma. He gained a BPhil at Queen's College, Oxford, in 1948, under the behaviourist Gilbert Ryle, and in 1950 became professor at Adelaide, where he stayed until 1972.
Away from the language-centred philosophy of Britain, Smart was freer to draw the implications that science had for philosophy. He began to ask why consciousness alone should remain exempt from physico-chemical explanation. The behaviourist view he had espoused at Oxford got round this question by denying that mental states, like anger, pain or believing, can even qualify as things or events, whether physical or non-physical. Rather, to talk about mental states is, for behaviourism, simply to talk about collections of actual or potential behaviour. But Smart objected that in this case seeing an after-image due to strong light can amount to nothing more than saying “I have a yellowish-orange after-image”. Such an utterance is surely superfluous to the sensation on which the utterer, who has just experienced it, would be “reporting”.
Smart agreed with old-fashioned mind-body dualism – against behaviourism – that many mental states are indeed episodic, inner and potentially private; what he disputed was that this made their essential nature non-physical. “Why should not sensations just be brain processes of a certain sort?” he demanded. If regarded as neuro-physiological processes, they too would be potentially explicable by scientific laws.
Morgan Meis in The Smart Set:
It is one of history's great art heists. This October 16th, thieves broke into the Kunsthal Museum in Rotterdam, circumvented the high-tech security system and stole seven paintings during the wee hours before the museum opened on Tuesday morning. They got two paintings by Monet, a Picasso, a Matisse, a Gauguin, a Lucian Freud, and one painting by Meyer de Haan.
In every great crime there is a clue. And the clue is often the element that doesn't fit. In this case, the thing that doesn't fit is the painting by Meyer de Haan. That's the one that makes you stop and think. Meyer de Haan. Why would anyone stealing expensive paintings from a major museum steal a Meyer de Haan?
The Meyer de Haan is a self-portrait by a minor artist most people have never heard of. It is worth only a fraction of what the other paintings are worth. Jop Ubbens, the general director of Christie's in Amsterdam, told The Guardian that the de Haan “might have been stolen by mistake.” The Guardian's art critic, Jonathan Jones, thinks “any idea that a tasteful collector commissioned this theft is undermined by the inclusion of Meyer de Haan's ‘Self-Portrait’. No offence, but this comparatively minor Dutch artist does not really belong in the company of the others whose works have been stolen.” True. But suppose for a moment that it wasn't a mistake. Suppose that whoever masterminded this robbery actually did want the Meyer de Haan. Why? What does the painting by Meyer de Haan tell us? What might that self-portrait have to do with all the other, more famous paintings that were stolen? There is a mystery here, perhaps, that only needs the right key for unlocking, the right set of questions. And the first, most obvious question is staring us right in the face.
Who is Meyer de Haan?
One departs and three more come charging in. It’s always rush-hour for Chekhov in the capital. As the Young Vic’s production of Three Sisters is drawing to a close, the Vaudeville is preparing to host a star-studded version of Uncle Vanya. Up the road, at the Novello, another Uncle Vanya is about to arrive from Moscow. And rehearsals are already under way for The Seagull, starring Matthew Kelly, at Southwark Playhouse. For years, we’ve been recreational users of Chekhov. We’re now in danger of becoming hopeless addicts. How come we’re hooked? Chekhov’s career as a dramatist was short and full of trouble. Early plays flopped. His breakthrough hit, The Seagull, also bombed when it was first performed in 1896 at the highly traditional Alexandrinsky Theatre in St Petersburg. Two years later, a revival at the more progressive Moscow Arts Theatre was a surprise success. Chekhov followed it up with Uncle Vanya (1899), Three Sisters (1901) and The Cherry Orchard (1904). Then he died.
more from Lloyd Evans at The Spectator here.
The rap on Picasso during his lifetime and even today is that Matisse, not he, was the colorist. Picasso was the draftsman, the graphic master. (And, after all, who doesn’t love Matisse’s cosmic secret garden of color?) Ever competitive, Picasso regularly addressed the criticism himself: “Color weakens”; “I use the language of construction”; “If you don’t know what color to take, take black.” Picasso’s own dealer said he was “indifferent to … color.” I disagree, and concur with late MoMA curator and wild-man Picasso maniac, William Rubin, who crowed that Picasso was “one of the great colorists of the century.” Picasso is more of a hyena of color, rash, using it to reveal his animal-being, Yeats’s “terrible beauty,” omega points of form. For Picasso, black and white are colors, and so are the thousand shades of gray in between.
more from Jerry Saltz at New York Magazine here.
Though I have in my life experienced gout, bladder stones, a botched bone marrow biopsy, and various other screamable insults, until recently I had no idea what pain was. It islands you. You sit there in your little skeletal constriction of self—of disappearing self—watching everyone you love, however steadfastly they may remain by your side, drift farther and farther away. There is too much cancer packed into my bone marrow, which is inflamed and expanding, creating pressure outward on the bones. “Bones don’t like to stretch,” a doctor tells me. Indeed. It is in my legs mostly, but also up in one shoulder and in my face. It is a dull devouring pain, as if the earth were already—but slowly—eating me. And then, with a wrong move or simply a shift in breath, it is a lightning strike of absolute feeling and absolute oblivion fused in one flash. Mornings I make my way out of bed very early and, after taking all of the pain medicine I can take without dying, sit on the couch and try to make myself small by bending over and holding my ankles. And I pray. Not to God, who also seems to have abandoned this island, but to the pain. That it ease up ever so little, that it let me breathe. That it not—though I know it will—get worse.
more from Christian Wiman at The American Scholar here.
One of the most famous stories of H. G. Wells, “The Country of the Blind” (1904), depicts a society, enclosed in an isolated valley amid forbidding mountains, in which a strange and persistent epidemic has rendered its members blind from birth. Their whole culture is reshaped around this difference: their notion of beauty depends on the feel rather than the look of a face; no windows adorn their houses; they work at night, when it is cool, and sleep during the day, when it is hot. A mountain climber named Nunez stumbles upon this community and hopes that he will rule over it: “In the Country of the Blind the One-Eyed Man is King,” he repeats to himself. Yet he comes to find that his ability to see is not an asset but a burden. The houses are pitch-black inside, and he loses fights to local warriors who possess extraordinary senses of touch and hearing. The blind live with no knowledge of the sense of sight, and no need for it. They consider Nunez’s eyes to be diseased, and mock his love for a beautiful woman whose face feels unattractive to them. When he finally fails to defeat them, exhausted and beaten, he gives himself up. They ask him if he still thinks he can see: “No,” he replies, “That was folly. The word means nothing — less than nothing!” They enslave him because of his apparently subhuman disability. But when they propose to remove his eyes to make him “normal,” he realizes the beauty of the mountains, the snow, the trees, the lines in the rocks, and the crispness of the sky — and he climbs a mountain, attempting to escape.
more from Aaron Rothstein at The New Atlantis here.
The rich, roasted aroma of coffee or the golden-brown colour of crispy French fries are enough to set most mouths watering. But the high-temperature cooking that gives these foods their alluring taste, scent and texture also adds a sting: acrylamide, a probable human carcinogen. Swedish scientists discovered in 2002 that a wide range of baked and fried goods contain worryingly high levels of acrylamide [1] — a simple organic molecule that is a neurotoxin and carcinogen in rats. The finding sparked an international effort to reduce concentrations of the chemical by changing ingredients and cooking methods. Ten years on, a report [2] from the European Food Safety Authority (EFSA) in Parma, Italy, suggests that this effort has stalled, amid patchy monitoring, uncertainty about acrylamide’s true health effects and the challenge of weeding out a molecule present in hundreds of products.
Soon after the Swedish discovery, two teams — one led by chemist Donald Mottram at the University of Reading, UK, the other by Richard Stadler at Nestlé in Lausanne, Switzerland — unpicked the chemistry behind the problem [3, 4]. They found that sugars and amino acids such as asparagine found in potatoes and cereals were making acrylamide (C3H5NO) as a by-product of the Maillard reaction, the very process that generates the heady blend of colour, flavour and taste in cooked foods. Subsequent epidemiological studies involving tens of thousands of people have looked for links between acrylamide and various forms of cancer in humans, including breast [5] and colorectal cancer [6]. For the most part, the results have been negative. In 2007, however, a Dutch study [7] of almost 2,600 women found that, among those who had never smoked, women consuming about 40 micrograms of acrylamide per day doubled their risk of developing cancers of the womb or ovaries, compared with those taking in roughly 10 μg per day. And last month, a study [8] showed that women who ate acrylamide-rich food during pregnancy tended to give birth to smaller babies.
your life is your life
don’t let it be clubbed into dank submission.
be on the watch.
there are ways out.
there is light somewhere.
it may not be much light but
it beats the darkness.
be on the watch.
the gods will offer you chances.
you can’t beat death but
you can beat death in life, sometimes.
and the more often you learn to do it,
the more light there will be.
your life is your life.
know it while you have it.
you are marvelous
the gods wait to delight
in you.
by Charles Bukowski
more from the Weather Channel here.
It doesn’t look good for the United States. We are proud when Iraqis and Libyans dodge bombs to vote in their first free elections in decades, and then, when it’s our chance, we barely exceed their turnout rates. Often, we do worse. Roughly half of us vote, and the other half don’t. It made me wonder: What’s stopping us? Do we have reasons not to vote? How can we hear so much about the election, and not participate? If hope isn’t doing it, isn’t the fear of the other guy winning enough to brave the roads, the long lines? In the middle of October, I spoke to more than 50 people between 18 and 40, almost all of whom are planning to go to the polls on Nov. 6. That made them exceptional: only 51 percent of young people voted in 2008. A smaller group is expected this year.
more from Errol Morris with a short film at the NY Times here.
But what if science is fundamentally incapable of explaining our own existence as thinking things? What if it proves impossible to fit human beings neatly into the world of subatomic particles and laws of motion that science describes? In Mind and Cosmos (Oxford University Press), his latest book, the prominent philosopher Thomas Nagel argues that science alone will never be able to explain a reality that includes human beings. What is needed is a new way of looking at and explaining reality, one which makes mind and value as fundamental as atoms and evolution. For most philosophers, and many people in general, this is a radical departure from the way we understand things. Nagel, according to his critics, has completely lost it. Linking to one particularly damning review in The Nation, Steven Pinker tweeted, “What has gotten into Thomas Nagel? Two philosophers expose the shoddy reasoning of a once-great thinker.”
more from Malcolm Thorndike Nicholson at Prospect Magazine here.
One thing revived by the “3/11” earthquake, tsunami, and nuclear disaster is the culture of protest, which had been pretty much moribund since the great anti–Vietnam War and antipollution demonstrations of the 1960s. In his new collection of essays, Ways of Forgetting, Ways of Remembering, John Dower describes these 1960s protests as a “radical anti-imperialist critique [added] to the discourse on peace and democracy.” There hasn’t been much of that in Japan of late. But now, since the nuclear meltdowns at the Fukushima Daiichi reactors, thousands of protesters gather in front of Prime Minister Noda Yoshihiko’s Tokyo residence every Friday demanding an end to nuclear power plants. Even larger gatherings of up to 200,000 people have been demonstrating in Tokyo’s central Yoyogi Park, as part of the “10 Million People’s Action to Say Goodbye to Nuclear Power Plants.” Eight million have already signed. This has had at least some cosmetic effect.
more from Ian Buruma at the NYRB here.
September 18, 1991, was a hot day in Gloucester, tourists shuffling down Main Street and sunbathers still crowding the wide expanses of Good Harbor Beach. Day boats bobbed offshore in the heat shimmer, and swells sneaked languorously up against Bass Rocks. At Gloucester Marine Railways, a haul-out place at the end of a short peninsula, Adam Randall stood contemplating a boat named the Andrea Gail. He had come all the way from Florida to go swordfishing on the boat, and now he stood considering her uneasily. The Andrea Gail was a 70-foot longliner that was leaving for Canada’s Grand Banks within days. He had a place on board if he wanted it. “I just had bad vibes,” he would say later. Without quite knowing why, he turned and walked away.
more from Sebastian Junger in Outside from 1994 here.
In Mind and Cosmos, Nagel holds that materialism can’t deliver the goods. Drawing on his bolder and more recent paper “The Psychophysical Nexus,” he now says that materialistic reductionism is false, not that we currently don’t understand how it could be true. For Nagel, perception and other psychological processes involve irreducibly subjective facts; important aspects of the mind are, therefore, forever beyond the reach of physical explanation. This position is compatible with many doctrines that are associated with materialism. For example, Nagel doesn’t gainsay the slogan “no difference without a physical difference”—if you and I have different psychological properties, then we must be physically different. Indeed, Nagel’s position is even compatible with the idea that every mental property is identical with some physical property—for example, it may be that being in pain and being in some neurophysiological state X are identical in the same way that being made of water and being made of H2O are identical properties.
more from Elliott Sober at The Boston Review here.
Here we go, trying
the infinite possibilities of life
from the limited circumstances
At the last breath
none of us know
whether it was the chaff
or the grain
that flew off in the wind.
by Simon Ó Faoláin
from Anam Mhadra
publisher: Coiscéim, Dublin, 2008
From The Independent:
So you remember your wedding day like it was yesterday. You can spot when something is of high quality. You keep yourself well-informed about current affairs but would be open to debate and discussion. You love your phone because it's the best, right? Are you sure? David McRaney from Hattiesburg, Mississippi, is here to tell you that you don't know yourself as well as you think. The journalist and self-described psychology nerd's new book, You Are Not So Smart, consists of 48 short chapters on the assorted ways that we mislead ourselves every day. “The central theme is that you are the unreliable narrator in the story of your life. And this is because you're unaware of how unaware you are,” says McRaney. “It's fun to go through legitimate scientific research and pull out all of the examples that show how everyone, no matter how smart or educated or experienced, is radically self-deluded in predictable and quantifiable ways.” Based on the blog of the same name, You Are Not So Smart is not so much a self-help book as a self-hurt book. Here McRaney gives some key examples.
The Misconception: Your opinions are the result of years of rational, objective analysis.
The Truth: Your opinions are the result of years of paying attention to information that confirmed what you believed, while ignoring information that challenged your preconceived notions.
From The New York Times:
Many people cite Albert Einstein’s aphorism “Everything should be made as simple as possible, but no simpler.” Only a handful, however, have had the opportunity to discuss the concept with the physicist over breakfast. One of those is Peter G. Neumann, now an 80-year-old computer scientist at SRI International, a pioneering engineering research laboratory here.
As an applied-mathematics student at Harvard, Dr. Neumann had a two-hour breakfast with Einstein on Nov. 8, 1952. What the young math student took away was a deeply held philosophy of design that has remained with him for six decades and has been his governing principle of computing and computer security. For many of those years, Dr. Neumann (pronounced NOY-man) has remained a voice in the wilderness, tirelessly pointing out that the computer industry has a penchant for repeating the mistakes of the past. He has long been one of the nation’s leading specialists in computer security, and early on he predicted that the security flaws that have accompanied the pell-mell explosion of the computer and Internet industries would have disastrous consequences.
by Dave Maier
My grad school colleague M.B. once told me about an exchange he had had with one of our professors. His area was personal identity, and his dissertation advanced a view about it which our professor found counter-intuitive – or at least worried that most people would find it so. His response, he told me, was this: “Why should I worry about what most people think about this issue? Who is more likely to be right about it – someone who has spent five years becoming an expert on this very topic, considering the arguments for and against it in minute detail? or someone who knows virtually nothing about it, but simply asserts his immediate intuitive reaction as fact?”
I thought this was very well said, but I still wasn't sure. One of the tradeoffs of highly technical philosophy is that the more comprehensive and ironclad a theory is, the more likely it is to stretch our ordinary concepts to the breaking point. Whether or not this is a bad thing will depend on how you feel about comprehensive, ironclad philosophical theories, as opposed to speaking normally with one's friends and neighbors (should they not be professional philosophers).
As the “experimental philosophy” movement is typically construed, it joins this battle of philosophical intuitions firmly on the side of the folk. It's not, as critics sometimes charge, that x-phi wants to put philosophical theories to a vote – after all, my colleague had plenty of arguments to go along with his intuitions – but to the extent that it is indeed a battle of intuitions, x-phi is determined not to let traditional philosophers get away with simply saying “it seems to me that in such a case we would say that _______”.
3QD readers know all about x-phi, naturally, as our Top Philosophy Quark for 2012 was Wesley Buckwalter's most interesting post on an x-phi consideration of non-factive conceptions of knowledge. I say “an” x-phi consideration because x-phi is no one monolithic, um, monolith, but an umbrella term for a wide variety of related approaches (for more on this see here, and the links therein). That is, it doesn't have to take the form of surveys of intuitions; but sometimes it does, and in this post I wonder aloud about what we should really make of the results of such surveys.
by James McGirk
After four debates and with a tsunami of political advertising inundating the United States, it is clear that neither presidential candidate is willing to act decisively on what should be the most pressing issue of our day: student loan debt.
Democrats offer crumbs. Republicans can’t even be bothered to pander to young voters. Yet no other issue so neatly encapsulates the miseries of contemporary American existence. An entire generation of smart, educated people is being crippled with debt. Without some sort of relief, upward mobility will vanish, the gap between rich and poor will yawn wider, our economy will be left in ruins, and what’s left of our once vaunted ability to innovate will die. The parasite is killing the host.
The time has come for decisive action. Student loan debt must be forgiven completely. The federal government should not be lending money to students. All it does is drive up prices and push us deeper in debt. Offer amnesty, get rid of the program, and let colleges pare down tuition until it makes sense for a family to save up or borrow money privately for their children to go. At the very least, let these loans be dischargeable in bankruptcy. This may seem like a drastic thing to do, but the situation is out of control. Something has to be done.
Student loan debt now accounts for 18 percent of American consumer debt. Unlike a mortgage, there is no way to discharge a student loan (short of total medical disability). The interest is painful: 3.4 percent for a loan taken out as an undergraduate, and a usurious 6.8 percent for a graduate student. The interest capitalizes. Not only is it charged on the principal, but on any unpaid interest as well, meaning that a loan balloons while the student is in school, or during the increasingly frequent forbearances necessary during periods of unemployment. There is no risk of default to the lender. The government guarantees all student loans. Nor is there any risk to universities. It is a trough of free money and these swine have gorged themselves, shamelessly raising tuition year after year, at a rate far outpacing inflation.
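The ballooning effect of capitalized interest is easy to see with a little arithmetic. A minimal Python sketch, using illustrative figures only (a hypothetical balance equal to the class-of-2011 average, at the 6.8 percent graduate rate quoted above, with unpaid interest folded into the principal once a year — a simplification of how capitalization is actually timed on real loans):

```python
# Sketch: how unpaid interest capitalizes on a deferred student loan.
# All figures are hypothetical stand-ins, not claims from the article.

def balance_after_deferment(principal: float, annual_rate: float, years: int) -> float:
    """Balance when each year's unpaid interest is added to the principal."""
    balance = principal
    for _ in range(years):
        balance += balance * annual_rate  # interest now accrues on prior interest too
    return balance

start = 26_600.00  # class-of-2011 average debt, borrowed here as an example balance
rate = 0.068       # the graduate-loan rate cited above

for years in (2, 4):
    total = balance_after_deferment(start, rate, years)
    print(f"after {years} years of deferment: ${total:,.2f}")
```

Even in this toy version, a borrower who defers through a two-year master's and then sits out a stretch of unemployment owes thousands more than was ever handed over, before a single payment has been missed.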
The class of 2011 graduated with $26,600 worth of debt each. That’s just for a bachelor’s degree. And those numbers include the lucky third that graduated with no debt at all. For a shot at a job that might offer entrée into a white-collar career you need a graduate degree and a year or two of unpaid internships. Lawyers and doctors, the traditionally secure gateways into America’s upper middle class, can easily amass hundreds of thousands of dollars worth of debt. A year of unemployment could wipe them out completely. Very few people who graduate with six figures of undischargeable debt will take risks. Ever wonder why so many math and science PhDs are taking jobs on Wall Street? Wonder no more.