Pakistan’s Briefcase Warriors

Ilhan Niaz in Foreign Affairs:

One of the truly disheartening aspects of researching Pakistan's history is uncovering evidence that, at critical moments, the country's central bureaucracy provided its rulers of the day with rational and wise advice, only to be ignored.

In 1952, for example, G. Ahmed, Pakistan's Secretary of the Interior, urged Prime Minister Khwaja Nazimuddin to restrain the members of his party from treating the state as their personal estate, abandon manipulating religious fundamentalists for short-term political gain, and focus on policymaking. Nazimuddin ignored Ahmed. In March 1953, sectarian rioting broke out in the Punjab as rival factions of the ruling party aligned themselves with religious fundamentalists. The governor general and the military took the opportunity to push Nazimuddin out, establishing the bureaucracy and army's primacy over the elected government.

Similarly, in the early and mid-1980s, Syed Ijlal Haider Zaidi, Secretary Establishment (in charge of the administrative tasks of posting and transfers within the civilian bureaucracy) produced a series of prescient summaries for Zia-ul Haq, Pakistan's third military dictator. His writings dealt with the need to reform the civil service and rehabilitate the provincial administration. Zaidi proposed a number of feasible solutions, such as creating specialized civil service elites to administer education, health, and infrastructure; restoring supervisory functions to the field level; and strengthening the provincial governments. These all could have been implemented, given the relatively healthy finances of Pakistan at the time. Instead, Zia opted to do nothing.

More here.

Darpa Has Seen the Future of Computing … And It’s Analog

Robert McMillan in Wired:

“One of the things that’s happened in the last 10 to 15 years is that power-scaling has stopped,” he says. Moore’s law — the maxim that processing power will double every 18 months or so — continues, but battery lives just haven’t kept up. “The efficiency of computation is not increasing very rapidly,” he says.

Hammerstrom, who helped build chips for Intel back in the 1980s, wants the UPSIDE chips to do computing in a whole different way. He’s looking for an alternative to straight-up boolean logic, where the voltage in a chip’s transistor represents a zero or a one. Hammerstrom wants chipmakers to build analog processors that can do probabilistic math without forcing transistors into an absolute one-or-zero state, a technique that burns energy.

It seems like a new idea — probabilistic computing chips are still years away from commercial use — but it’s not entirely. Analog computers were used in the 1950s, but they were overshadowed by the transistor and the amazing computing capabilities that digital processors pumped out over the past half-century, according to Ben Vigoda, the general manager of the Analog Devices Lyric Labs group.

“The people who are just retiring from university right now can remember programming analog computers in college,” says Vigoda. “It’s been a long time since we really questioned the paradigm that we’re using.”

Probabilistic computing has been picking up over the past decade, Vigoda says, and it’s being spurred now by Darpa’s program. “They’re bringing an emerging technology into the limelight,” he says.
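
The excerpt stays at the level of ideas, so here is a minimal, hedged sketch of one classic flavor of probabilistic computing — stochastic computing — in plain Python. It is a toy illustration only, not Darpa's UPSIDE design or Lyric's hardware: a value in [0, 1] is encoded as a random bitstream, and multiplication becomes a cheap bitwise AND, trading exactness for simplicity in roughly the spirit the article describes.

```python
# Toy sketch of stochastic computing (an illustration, not the UPSIDE architecture).
# A value p in [0, 1] is encoded as a bitstream whose bits are 1 with probability p;
# multiplying two values then reduces to AND-ing two independent streams.
import random

def encode(p, n_bits, rng):
    """Bitstream of length n_bits, each bit 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n_bits)]

def decode(stream):
    """Estimate the encoded value as the fraction of 1s in the stream."""
    return sum(stream) / len(stream)

def multiply(stream_a, stream_b):
    """AND of two independent streams encodes the product of the two values."""
    return [a & b for a, b in zip(stream_a, stream_b)]

if __name__ == "__main__":
    rng = random.Random(0)
    a, b, n = 0.8, 0.5, 10_000          # longer streams buy more precision
    estimate = decode(multiply(encode(a, n, rng), encode(b, n, rng)))
    print(f"exact product: {a * b:.3f}, stochastic estimate: {estimate:.3f}")
```

The point of the sketch is the trade-off: the answer is only approximately right, and precision is bought with stream length (time and energy) rather than with exact one-or-zero switching.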

More here.

The Age of Niallism: Ferguson and the Post-Fact World

Matthew O'Brien in The Atlantic:

People who believe facts are nothing think you'll fall for anything. Call it Niallism.

This is my last word (well, last words) on Niall Ferguson, whose Newsweek cover story arguing that Obama doesn't deserve a second term has drawn deserved criticism for its mendacity from Paul Krugman, Andrew Sullivan, Ezra Klein, Noah Smith, my colleagues James Fallows and Ta-Nehisi Coates, and myself. The problem isn't Ferguson's conclusion, but how Ferguson reaches his conclusion. He either presents inaccurate facts or presents facts inaccurately. The result is a tendentious mess that just maintains a patina of factuality — all, of course, so Ferguson can create plausible deniability about his own dishonesty.

Exhibit A is Ferguson's big lie that Obamacare would increase the deficit. This is not true. Just look at the CBO report Ferguson himself cites. Paul Krugman immediately pointed this out, and asked for a correction. How did Ferguson respond? He claims he was only talking about the bill's costs and not its revenues — a curious and unconvincing defense to say the least. But then Ferguson reveals his big tell. He selectively quotes the CBO to falsely make it sound like they don't think Medicare savings will in fact be realized. Here's the section Ferguson quotes, with the part he ellipses out in bold. (Note: Pseudonymous Buzzfeed contributor @nycsouthpaw was the first to notice this quote-doctoring. The italics below are Ferguson's).

In fact, CBO's cost estimate for the legislation noted that it will put into effect a number of policies that might be difficult to sustain over a long period of time. The combination of those policies, prior law regarding payment rates for physicians' services in Medicare, and other information has led CBO to project that the growth rate of Medicare spending (per beneficiary, adjusted for overall inflation) will drop from about 4 percent per year, which it has averaged for the past two decades, to about 2 percent per year on average for the next two decades. It is unclear whether such a reduction can be achieved through greater efficiencies in the delivery of health care or will instead reduce access to care or the quality of care (relative to the situation under prior law).

Ferguson completely changes the CBO's meaning. Why not just say he finds the CBO's analysis unconvincing, like Andrew Sullivan suggested, and leave it at that? Well, Ferguson tries that later — but not before appealing to the authority of the CBO when the CBO is not on his side. The damage is done.

On the Ground in Bamako: What’s next for Mali?

Rachel Signer interviews anthropologist Bruce Whitehouse on the coup in Mali, in Construction magazine:

On his blog, Bridges From Bamako, Whitehouse documented not only snippets of his research findings, but also observations of daily life in Bamako during the pandemonium and thoughts about how Bamakois were responding to the coup. The blog began attracting the attention of journalists who were covering the turmoil in Bamako from afar. Soon, Whitehouse was giving interviews to Time magazine, the New York Times, the blog Africa Is A Country, and other media outlets.

Whitehouse returned from Mali in June. He spoke with me from the campus of Lehigh University, where he is an assistant professor.

Construction: Having previously lived in Mali, and living in Bamako as a researcher right before the coup, to what extent did you foresee the uprising?

Bruce Whitehouse: The history of Tuareg unrest and separatism goes back at least to 1963 and has recurred roughly once a decade. The Bamako regime has never fully controlled the desert. The region has long been home to smugglers, insurgents, and criminals. The coup wasn’t really a surprise to anybody. What was a surprise, I think, was just how different things were this time. When the rebellion resurged in late 2011, you had fighters showing up from Libya with lots of arms. But I don’t think the Libyan civil war made all the difference. If the state in Bamako had been stronger, the rebellion could have still been headed off. If you look at Libya, a lot of those fighters had to cross through Niger to get to Mali. And many convoys were fought and destroyed by Nigerien officials. Generally what had been seen as the Tuareg home region extended into Niger and Algeria, as well as Mali. But the Nigerien government showed that it had a firmer control of its territory. Niger has a border with Libya, yet it was able to contain the problem; Mali does not share a border and yet it wasn’t.

So why is that? You have to go back to what we’re calling the failure or the deterioration of the State.

The Moral Significance of Sex Workers and People With Disabilities

Tauriq Moosa in Big Think:

When prostitution cases are brought before a judge in Britain, a particular kind of “John” (or customer) will almost always have the case tossed out of court: that is, if the customer is a person with a disability. So explains a member of the group TLC Trust to me: a group that defends and promotes the interaction between sex workers and people with disabilities – TLC, as you’ll see, and similar groups, has become my new favourite advocacy group. This powerful statement, of judges dismissing almost out of hand any prostitution cases involving persons with a physical disability, sets an important moral precedent that I think we all ought to follow.

For many, voluntary sex work – that is, done by people who do it without being physically forced* or blackmailed into it – is inherently wrong for reasons I find extremely wanting (unless they mean sex trafficking, in which case we're not talking about the same thing. Please see the notes below for more). But many, including judges, accept that there is a unique situation when it comes to the relationship between sex workers and people with disabilities**. However, what I perceive from sex workers and this relationship is an element of morality worth emulating and promoting; and thus ultimately treating both sex workers and people with disabilities with the respect both groups deserve, as persons with interests, that warrant wider respect and, indeed, admiration.

More here.

The talent myth: How to maximise your creative potential

From The Independent:

If you thought that geniuses were born not bred, you'd be wrong, says Daniel Coyle. He visited centres of excellence across the world and discovered that, if we all just followed a few key rules, success could be ours for the taking.

A few years back, on an assignment for a magazine, I began visiting talent hotbeds: tiny places that produce large numbers of world-class performers in sports, art, music, business, maths, and other disciplines. My research also took me to a different sort of hotbed: the laboratories and research centres around the country investigating the new science of talent development. For centuries, people have instinctively assumed that talent is largely innate, a gift given out at birth. But now, thanks to the work of a wide-ranging team of scientists, including Dr K Anders Ericsson, Dr Douglas Fields, and Dr Robert Bjork, the old beliefs about talent are being overturned. In their place, a new view is being established, one in which talent is determined far less by our genes and far more by our actions: specifically, the combination of intensive practice and motivation that produces brain growth.

… What follows is a collection of simple, practical tips – all field-tested and scientifically sound – for improving skills, taken directly from the hotbeds I visited and the scientists who research them.

1. Stare at who you want to become

If you were to visit a dozen talent hotbeds tomorrow, you would be struck by how much time the learners spend observing top performers. When I say observing, I'm not talking about passively watching. I'm talking about staring – the kind of raw, unblinking, intensely-absorbed gazes you see in hungry cats or newborn babies.

More here.

Saturday Poem

I'm Alone and You're in a Bottle

In you, empty blue bottle on the windowsill,
people walk on a paved sky,
turn a swimming, sun-stroked periwinkle.
Birds fly backwards and upside-down,
traffic is truncated, tiny, curving into nothingness.

Sunlight filters through, and you,
open-mouthed and tinted blue, are learning
the world's so silly,
and nothing sticks around long enough.

I know, I've been at the window too,
standing there all blue, watching
the world come and go,
unable to hold on to any of it

by Angela Rydell
from Barrow Street, Winter 2001

Debunking the Hunter-Gatherer Workout

From The New York Times:

DARWIN isn’t required reading for public health officials, but he should be. One reason that heart disease, diabetes and obesity have reached epidemic levels in the developed world is that our modern way of life is radically different from the hunter-gatherer environments in which our bodies evolved. But which modern changes are causing the most harm? Many in public health believe that a major culprit is our sedentary lifestyle. Faced with relatively few physical demands today, our bodies burn fewer calories than they evolved to consume — and those unspent calories pile up over time as fat. The World Health Organization, in discussing the root causes of obesity, has cited a “decrease in physical activity due to the increasingly sedentary nature of many forms of work, changing modes of transportation and increasing urbanization.” This is a nice theory. But is it true? To find out, my colleagues and I recently measured daily energy expenditure among the Hadza people of Tanzania, one of the few remaining populations of traditional hunter-gatherers. Would the Hadza, whose basic way of life is so similar to that of our distant ancestors, expend more energy than we do? Our findings, published last month in the journal PLoS ONE, indicate that they don’t, suggesting that inactivity is not the source of modern obesity.

…All of this means that if we want to end obesity, we need to focus on our diet and reduce the number of calories we eat, particularly the sugars our primate brains have evolved to love. We’re getting fat because we eat too much, not because we’re sedentary. Physical activity is very important for maintaining physical and mental health, but we aren’t going to Jazzercise our way out of the obesity epidemic. We have a lot more to learn from groups like the Hadza, among whom obesity and heart disease are unheard of and 80-year-old grandmothers are strong and vital. Finding new approaches to public health problems will require further research into other cultures and our evolutionary past.

More here.

Playboy Interview: Richard Dawkins

Chip Rowe in Playboy:

PLAYBOY: Your call for militant atheism is one reason you were featured as a character on an episode of South Park. The show’s creators, Trey Parker and Matt Stone, had been accused of being atheists, so they thought of the most militant atheist they could skewer.

DAWKINS: It’s the only South Park episode I’ve seen. There was an attempt at something approaching satire in the idea of an imagined future in which different sects of atheists are fighting each other. But most of that episode was ridiculous in the sense that what they had the cartoon figure of me doing, like buggering the bald transvestite——

PLAYBOY: Transsexual, actually.

DAWKINS: Transsexual, okay. That isn’t satire because it has nothing to do with what I stand for. And the scatological part, where they had somebody throwing shit, which stuck to my forehead—that’s not even funny. I don’t understand why they couldn’t go straight to the atheists fighting each other, which has a certain amount of truth in it. It reminded me of the bit from Monty Python’s Life of Brian with the Judean People’s Front and the People’s Front of Judea.

PLAYBOY: President Obama acknowledged “nonbelievers” in his inaugural address, which caused a fuss. But when you consider religious belief, one of the largest groups in the U.S. is atheists and agnostics. Why do they get overlooked in political discussions?

DAWKINS: It’s a good point. Of course, it depends how you slice it. Christians are by far the largest group. If you divide Christians into denominations, agnostics and atheists come in third, behind Catholics and Baptists. That’s interesting when you contrast it with the lack of influence of nonbelievers. And if you count up the number of Jews, certainly observant Jews, it’s much smaller than the number of nonbelievers. Yet Jews have tremendous influence. I’m not criticizing that—bully for them. But we could do the same.

More here.

The worst art restoration project of all time

Raphael Minder in the New York Times:

An elderly woman stepped forward this week to claim responsibility for disfiguring a century-old “ecce homo” fresco of Jesus crowned with thorns, in Santuario de la Misericordia, a Roman Catholic church in Borja, near the city of Zaragoza.

Ecce homo, or behold the man, refers to an artistic motif that depicts Jesus, usually bound and with a crown of thorns, right before his crucifixion.

The woman, Cecilia Giménez, who is in her 80s, said on Spanish national television that she had tried to restore the fresco, which she called her favorite local representation of Jesus, because she was upset that parts of it had flaked off due to moisture on the church’s walls.

The authorities in Borja said they had suspected vandalism at first, but then determined that the shocking alterations had been made by an elderly parishioner. The authorities said she had acted on her own.

But Ms. Giménez later defended herself, saying she could not understand the uproar because she had worked in broad daylight and had tried to salvage the fresco with the approval of the local clergy. “The priest knew it,” she told Spanish television. “I’ve never tried to do anything hidden.”

More here.

The subtle perils of Eurocentrism

Mihir S. Sharma in the Business Standard:

Few events have so up-ended the established order as Japan’s crushing victory over the Russians at the Battle of Tsushima in 1905. This, the first salvo in the long war to push back the subjugation of the East by the West, was heard around the colonised world; and it is where Pankaj Mishra begins From the Ruins of Empire, which purports to be a history of the ways in which the East imagined that war. Sadly, the glaring flaws that populate Mishra’s book, reducing it even from pop history to puerile polemic, begin there, too. Misleading quotes, for example: he says Gandhi responds by recognising it was “self-respect” that won Japan the battle, except most of Gandhi’s writing on Tsushima actually praised Japan’s patriotism and national unity, a considerably more inward-looking and less reactive claim.

Mishra’s treatment of attitudes to Japanese ambition, in fact, is just one instance of the double standards – which match those of the most devoted apologist of empire — that riddle this book. The Russo-Japanese war was a battle of empires for land in Manchuria; but throughout, Mishra insists on describing the horrors of Japanese imperialism as “but a reaction”. So, too, could the British Empire be a “reaction” to the Spanish Empire, and the German Empire a “reaction” to the British. But white people are granted agency by Mishra, and people of colour are not – one of the many, many ways in which this book fits squarely into the Eurocentric, mentally colonised framework which Mishra wants us to believe he is helping us escape. Later on in the book, the moral blindness that comes with such double-standards is hideously exposed in his description of Japanese expansionism, where the Rape of Nanking is hastily glossed over, and that empire’s brutality against fellow-Asians is excused as “revenge for decades of racial humiliation.” Indeed, he goes on to say essentially that the occupied should be thankful for this good, Asian, empire: it allowed them to imagine what freedom from the West would be like.

More here.

No Epiphanies Whatsoever

Jane Hu in New Inquiry:

Wednesday morning, I wake up to face the computer screen—still open—on my bedside table. One swipe of keypad and a line of tabs brightens into view. I glance at the last page open and last night blows by like a smudge. Did I really read Cat Marnell’s Vice columns until I fell asleep? Rubbing liner from an eyelid, I shift the laptop onto my stomach and lie back down again. Where did I drop off?

What, you don’t know who Cat Marnell is? Oh, you don’t care. Then just forward this to the nine friends on your contacts list who do. We might be hopelessly hooked on her exploits, but that doesn’t mean you need be too.

Honestly, I hadn’t even heard of Marnell until this June, when she left xoJane.com (after failed attempts to resolve her drug addiction) and subsequently joined Vice as their “pills and narcissism” correspondent with a column titled “Amphetamine Logic.” As writers buzzed about Marnell’s media crackup, they tracked to the start of her writing career, when she interned and edited at various Condé Nast publications. Marnell worked at magazines such as NYLON, Teen Vogue, Glamour, and Lucky for, predominantly, their beauty sections. A narrative was set: Young talent starts early, works hard, rises only to go out prematurely—though with a bang.

As one tag affixed to Marnell’s xoJane columns reassures: “It Happened to Me.” That phrase performs a democratizing gesture—prompting readers to engage with a writer’s specific experience—that finally normalizes what “happened” for both writer and reader. Could the two main things that finally happened to Marnell be found in her job title for Vice? Pills and narcissism.

The public ate it up. Eager readers followed Marnell, as her articles moved deep inside half-lit bedrooms, sticky with sex and shaded with angel dust. With each new scandalous detail of her addictions, Marnell’s audience couldn’t wait to see where she would spiral next. Gimme gimme more, gimme more, gimme gimme more.

As Jen Doll has repeatedly observed in the Atlantic, “the same habit of addiction that drives a person to return again and again to the drug of his or her choice may have found a parallel in the reward-and-shame cycle of writing about oneself.” The more readers saw, the more they wanted to see. Inversely, the more Marnell displayed her disintegration, the more she had, and even wanted, to show.

The Lost Futures of Chris Marker

J. Hoberman on Chris Marker, in the New York Review of Books blog:

Gracefully off-kilter, stylized as semaphores, the shadow of a man and an outlined woman are positioned at the center of a sea shell spiral. Are they dancing on air—or falling into the void?

The poster for Alfred Hitchcock’s Vertigo is scarcely less haunting than the movie. I first saw the image, without understanding what it was, as a nine-year-old on summer vacation and carried the memory with me for some twelve or fifteen years before I first saw the film. It was, as the filmmaker Chris Marker—one of Vertigo’s most ardent admirers—might say, a memory of the future.

As Vertigo is the most uncanny of movies, it feels more than coincidental that on August 1, two days following Marker’s death, at ninety-one, in Paris, the British film journal Sight and Sound announced, with no little fanfare, that after forty years his favorite movie had finally dethroned Citizen Kane atop the magazine’s once-a-decade critics’ poll.

Kane is the movie that hyper-dramatized the act of filmmaking. Vertigo is about film-watching in extremis—the state of being hopelessly, obsessively in love with an image. Unlike Kane (or Psycho for that matter), Vertigo was not immediately recognized as great cinema, except in France by people like Marker. Steeped as it is in the pathos of unrecoverable memory, Marker’s La Jetée (1962) was probably the first movie made under Vertigo’s spell.

Tied for fiftieth place (one vote ahead of Rear Window) in the Sight and Sound poll, La Jetée is Marker’s most generally known work, in part because it was remade in the mid 1990s by Terry Gilliam as 12 Monkeys. Marker was the opposite of a celebrity; he was famous not for his well-knownness but for a certain willful unknowability. The man born Christian François Bouche-Villeneuve was permanently incognito. He allowed few interviews and carefully concealed his personal life; although he turned his camera on countless people, including several fellow filmmakers (Andrei Tarkovsky, Akira Kurosawa), he never allowed himself to be photographed.

I called Marker a “filmmaker” but it would be more accurate to term him a “film artist.” His oeuvre encompasses movies, photography, videos, TV series, CD-ROMs, computer games, and gallery installations. Some of these might be considered memento mori, often for the film medium. Others propose cinema as a model for historical consciousness. “We can see the shadow of a film on television, the longing for a film, the nostalgia, the echo of a film, but never a film,” is a characteristic Marker observation; one of his favorite aphorisms is borrowed from George Steiner: “It is not the past that rules us—it is the image of the past.”

Notes of a Novice Student of India

Justin E. H. Smith in Berfrois:

Any specialist on anything will have had that peculiar experience of coming across some casual comment from a total non-specialist about the very thing to which one has devoted one’s life, a comment made as if there were no such thing as specialist knowledge, as if what we know in any domain at all were just so much hearsay and vulgarisation. Lord knows I’ve seen plenty of people denouncing Descartes, for example, or praising Spinoza (seldom the reverse), who know nothing, but nothing, about Descartes or Spinoza. This is easy and costless to do (and we all do it, including those of us who pride ourselves on being specialists and who really care about getting things right in our special domains), so long as one doesn’t mingle with the specialists in the domain about which one holds forth.

I’ve been thinking about how this works, about this seldom-discussed aspect of the sociology of knowledge, quite a bit recently, as I go deeper in my mid-career shift to what used to be called ‘Indology’ (more on this telling term soon). I am still a near-absolute beginner, yet I am now reaching the point where I can no longer say whatever I want to say on the grounds that I don’t know anything anyway, and that the people with whom I’m speaking don’t know anything either. I am now interacting with people who do not find it at all peculiar to care about Pāṇinian syntax theory, or about the rules of proper inference in Navya-Nyāya logic. The days are over when I could make sweeping claims about civilizational differences (the sort of sweeping claims my colleagues in philosophy often make) as regards rationality, for example. So in short I’m learning to be careful about what I say, which is really nothing other than entering a community of specialists. I expect anything I say now will appear naive to me when I look back on it in a few years, which is only to say that I will have entered more fully into that community. But one has to start somewhere.

What used to be called ‘Indology’ is now referred to more obliquely by phrases such as ‘South Asian Studies’, ‘Religions- und Kulturgeschichte Südasiens’, and so on. To some extent this shift can be explained as part of the broader changes that turned geology into ‘earth science’, and so on. Here, it’s just a matter of rebranding, and has nothing to do with respecting the sensibilities of the subjects themselves that are being studied (rocks and sediment don’t have sensibilities). In addition, there is the broad impact of Saïd’s critique of Orientalism, and the bizarre presumption that if we redescribe ourselves as doing ‘studies’ of something rather than the ‘-logy’ of it, then we are somehow immune to that critique. But unlike the transformation of Sinology into East-Asian Studies (it’s gone translinguistic now, too: in Montreal you can major in ‘Études est-asiatiques’), Indology is weighted down by other historical legacies than just the one Saïd picked out, since the gaze upon India has often been one that did not treat it as exotically other, but also, for often less than liberal reasons, treated it as fundamentally, autochthonously, the same.

Wilde in the Office

From LARB:

For those interested in, or like me obsessed with, anniversaries: this summer marks the quasquicentennial of Oscar Wilde’s first ever office job (fifty years to go before the dodransbicentennial and a century before the sestercentennial). Admittedly, the significance of the event pales in comparison with the centennial of the sinking of Titanic or the bicentenary of Charles Dickens’s birth, but Wilde’s experience in the office provides that curious anniversary where the writer, who wants the best of both worlds as a journalist and a serious author, can see in practice whether such a thing is possible or desirable. When he was 33 years old, Wilde began working for the publishing firm Cassell & Company for the duration of more than two years. His best non-fiction and fiction work was produced during the time he spent in the office at Ludgate Hill, near Fleet Street. In between May 18, 1887, when he signed the contract with Thomas Wemyss Reid, who was general manager of the company, and October 1889, when he was handed his notice, Wilde managed to write the most brilliant and lengthy of his essays, including “The Critic as Artist,” “The Decay of Lying,” “Pen, Pencil and Poison,” and “The Portrait of Mr W. H.,” a speculation on Shakespeare's Sonnets (which later became a favourite of Borges), not to mention The Picture of Dorian Gray, which is often considered Wilde's best work and the defining text of the late-Victorian age. It is difficult to imagine a serious author of our day performing a similar feat. Could Jonathan Franzen, that great enemy of superficial twittering, have written The Corrections while editing GQ, spending his weekdays in its offices? While numerous contemporary authors prefer unplugging the network cable from their laptops while writing, Wilde did the opposite thing and tried to have as many connections as possible, which he thought would contribute to his competence and inventiveness as an author.

Having toured the United States and parts of England during the early 1880s for a series of lectures about decoration, fashion, and applied arts, Wilde had amused American and British audiences with his personality and oratorical skills. When this great tour came to an end, he immediately looked for fame in prestigious literary magazines and newspapers where he could review books and publish essays about his favourite subjects. In the course of a year he reviewed dozens of books, some of which he confessed to not reading in their entirety (“I never read a book I must review,” he wrote, “it prejudices you so.”) Building for himself a credible byline which he hoped would open new opportunities for him, Wilde inhabited a freelancer's existence for a few years. This period was central to his growth as an independent thinker.

More here.

Two Steps to Free Will

From Harvard Magazine:

Astronomy naturally inspires cosmic thinking, but astronomers rarely tackle philosophical issues directly. Theoretical astrophysicist Robert O. Doyle, Ph.D. ’68, associate of the department of astronomy, is an exception. For five years, Doyle has worked on a problem he has pondered since college: the ancient conundrum of free will versus determinism. Do humans choose their actions freely, exercising their own power of will, or do external and prior causes (even the will of God) determine our acts? Since the pre-Socratics, philosophers have debated whether we live in a deterministic universe, in which “every event has a cause, in a chain of causal events with just one possible future,” or an indeterministic one, in which “there are random (chance) events in a world with many possible futures,” as Doyle writes in Free Will: The Scandal in Philosophy (2011). The way out of the bottle, he says, is a “two-stage model” whose origin he traces to William James, M.D. 1869, LL.D. ’03, philosopher, psychologist, and perhaps the most famous of all Harvard’s professors. Some of the confusion, Doyle believes, stems from how thinkers have framed the question—in an either/or way that allows only a rigidly predetermined universe or a chaotic one totally at the mercy of chance. David Hume, for example, asserted that there is “no medium betwixt chance and an absolute necessity.” But Doyle also finds the term “free will” unclear and even unintelligible, because the condition of “freedom” applies to the agent of action, not the will: “I think the question is not proper, whether the will be free, but whether a man be free,” in John Locke’s concise phrasing. “The element of randomness doesn’t make us random,” Doyle says. “It just gives us possibilities.”

Doyle limns a two-stage model in which chance presents a variety of alternative possibilities to the human actor, who selects one of these options and enacts it. “Free will isn’t one monolithic thing,” he says. “It’s a combination of the free element with selection.” He finds many antecedents in the history of philosophy—beginning with Aristotle, whom he calls the first indeterminist. But he identifies James as the first philosopher to clearly articulate such a model of free will, and (in a 2010 paper published in the journal William James Studies and presented at a conference honoring James; see “William James: Summers and Semesters”) he honors that seminal work by naming such a model—“first chance, then choice”—“Jamesian” free will.
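
To make the shape of the model concrete, here is a minimal sketch in Python of the two-stage structure as described above: chance first surfaces alternative possibilities, then the agent deterministically selects among them. This is only an illustration of the "first chance, then choice" idea, not Doyle's or James's own formulation; the option pool and the evaluation criterion are hypothetical stand-ins for an agent's circumstances and reasons.

```python
# A toy sketch (an illustration, not Doyle's or James's model) of "first chance,
# then choice": stage one uses randomness to propose alternatives; stage two
# applies a deterministic evaluation that stands in for the agent's reasons.
import random

def generate_alternatives(options, k, rng):
    """Stage 1 (chance): randomly surface k candidate courses of action."""
    return rng.sample(options, k)

def choose(candidates, evaluate):
    """Stage 2 (choice): deterministically pick the candidate the agent values most."""
    return max(candidates, key=evaluate)

if __name__ == "__main__":
    rng = random.Random(7)
    # Hypothetical action pool and preference scores standing in for an agent's values.
    preferences = {"write": 3, "walk": 2, "nap": 1, "email": 0, "read": 2}
    candidates = generate_alternatives(list(preferences), k=3, rng=rng)
    decision = choose(candidates, evaluate=preferences.get)
    print(candidates, "->", decision)
```

The randomness supplies possibilities without making the decision itself random, which is the distinction the excerpt attributes to Doyle.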

More here.