Malala Yousafzai’s Fight Continues

Nicholas Kristof in The New York Times:

When the deputy head mistress pulled Malala Yousafzai out of high school chemistry class one morning a year ago, Malala nervously searched her mind for recent offenses. “You usually get a bit scared if your head teacher comes, because you think you are being caught doing something,” Malala recalled. “But she told me: ‘I need to tell you something. You have won the Nobel Peace Prize.’ ” After a brief celebration, Malala returned to class for the rest of the school day; as the world’s news organizations clamored for interviews, she wrestled with physics. She’s a champion of girls’ education worldwide, she explains, and that must include her own. Malala, now a high school junior, was in New York this past week to address the United Nations, attend the premiere of a full-length documentary movie about her life and hound world leaders to pay attention to girls’ education.

…Malala is determined not to be used as window dressing by world leaders, and her advice to presidents and prime ministers is to focus not on elementary school or middle school but on 12 full years of education. “Your dreams were too small,” she tells U.N. members. “Your achievements are too small. Now it is time that you dream bigger.” She scolded Nigeria’s president at the time for not helping girls abducted by Boko Haram. She told President Obama at the White House that drones were counterproductive and that he should invest in education. Just eight days of global military spending, she notes, would pay to get all remaining kids in school worldwide. “No world leader would want nine years of education for their children,” she told me. “Every world leader wants quality education for their children. They need to think of the rest of the world’s children as their own children.”

More here.

What Can We Learn From TV Coverage of Feminism in 1970?

Bonnie J. Dow in Women's Media Center:

Once upon a time, there were only three television networks. Before cable and especially before the Internet, a social phenomenon that went unnoticed by the “Big Three”—CBS, NBC, ABC—might as well not be happening at all. That was the case for second-wave feminism before 1970, the year that the national television news networks finally gave airtime to the rapidly growing movement. In Watching Women’s Liberation, 1970: Feminism’s Pivotal Year on the Network News (University of Illinois Press, 2014), I analyze the meaning and influence of that surge of news coverage. In addition to numerous feature stories on the movement as a whole, network news covered important protests that year, such as January’s disruption of Senate hearings on the birth control pill by radical feminists, and the March sit-in at the Ladies’ Home Journal by 100-plus women. The August 26, 1970 Women’s Strike for Equality march, which involved thousands of women across the country and closed down Fifth Avenue in New York City to demand abortion rights, child care, and equal opportunity, led the evening news on all three networks. Yet CBS’s story that night termed the marchers a “militant minority,” even though they included current and former members of Congress, editors from Ladies’ Home Journal and McCall’s, and Cosmopolitan editor Helen Gurley Brown. These 1970 reports were opening salvos in the televised battle over feminism’s public image, one that continues today in a much wider array of media forms.

In 1970, network news coverage of feminism was a surprising mix of positive and negative reporting. Most reports, for instance, treated abortion rights and the ERA as reasonable, even commonsensical, demands.

More here.

Walter Benjamin’s legacy, 75 years on

John Dugdale in The Guardian:

Like many a refugee in southern and central Europe today, Walter Benjamin was in flight from war and persecution 75 years ago, but was blocked at an intermediate border en route to the country chosen as his haven. He was part of a Jewish group which, hoping to escape occupied France, had hiked through a Pyrenean pass in autumn 1940 with a view to entering Franco’s Spain, crossing it to Portugal and then sailing to the US. However, in the words of Hannah Arendt, they arrived in the frontier village of Portbou “only to learn that Spain had closed the border that same day” and officials were not honouring American visas such as Benjamin’s. Faced with the prospect of returning to France and being handed over to the Nazis, he “took his own life” overnight on 26 September, whereupon the officials “allowed his companions to proceed to Portugal”. For Arendt, who successfully reached New York via his intended route a few months later, this was a tragedy of misunderstanding, a poignant but fitting end for a brilliant but misfortune-prone older relative (her cousin by marriage) whom she writes about with a kind of affectionate exasperation.

Yet Edward Stourton, in Cruel Crossing: Escaping Hitler Across the Pyrenees, notes “there are all sorts of unanswered questions surrounding Benjamin’s death. His travelling companions remembered him carrying a heavy briefcase containing a manuscript he described as ‘more important than I am’. No such manuscript was found after his death … A Spanish doctor’s report gave the cause of death as a cerebral haemorrhage, not a drugs overdose. There has been persistent speculation that he was actually murdered, perhaps by a Soviet agent who had infiltrated his escaping party.” By the time Arendt wrote her memoir (later used as the introduction to Illuminations) in 1968 the chaotic freelance critic she evoked, pinballing between temporary homes, disparate obsessions and the incompatible views of his friends Adorno, Brecht and Gershom Scholem, was fast emerging a la Orwell as a giant figure with an unexpectedly substantial estate in print – his collected writings were published in Germany in 1955, eventually followed by a four-volume Harvard edition in English – and formative power in multiple fields.

More here.

Silicon Valley shouldn’t let China strong-arm it into spying

Ken Roth in Quartz:

In an impassioned speech at the White House’s February 2015 cybersecurity summit, Apple’s chief executive Tim Cook argued that in a world where “too many people do not feel free to practice their religion or express their opinion or love who they chose,” privacy can “make a difference between life and death.” In the two years since Edward Snowden’s first revelations about the National Security Agency’s surveillance excesses, Silicon Valley companies such as Apple, Google, Facebook, Microsoft and Cisco have used their influence in meetings with President Obama and concerted lobbying efforts to rein in mass surveillance in the United States.

To date, these efforts have found Silicon Valley allied with its users. But with companies under mounting pressure in places like Russia and China to aid abusive government surveillance, the industry must decide whether to stand up for their most vulnerable users—those in countries where peaceful dissent can lead to serious reprisals—even if that may affect its business opportunities.

This month’s state visit by Chinese President Xi Jinping is a major test. Xi’s visit opened with meetings with US tech executives in Seattle. Chinese officials like Internet czar Lu Wei will also join companies at the US-China Internet Industry Forum, an annual meeting organized by Microsoft and the Internet Society of China. Media reports indicate that executives from Apple, Facebook, Google, Uber and Cisco are invited.

These meetings come at a time when China is considering a raft of new laws—ostensibly about security and counter-terrorism—that would expand digital surveillance and censorship.

More here.

Translating ‘The Dirty Dust: Cré na Cille’ by Máirtín Ó Cadhain

Jeremy M. Davies at The Quarterly Conversation:

Long have I labored in the temples of translation, if not as a cleric, then let us say as a graying vestal. In those drop-ceiling’d holy sites, papered with grant applications and hung with the leathered hides of forgotten interns, rumors have long persisted of the great untranslated Irish-language novel Cré na Cille, its title traditionally English’d as “Graveyard Clay.” Now called The Dirty Dust (the better to retain author Máirtín Ó Cadhain’s alliterative original, says its introduction), it has at last been made available to Anglophones thanks to translator Alan Titley and the Yale Margellos World Republic of Letters.

“An influence on Finnegans Wake!” was one commonly heard refrain concerning this as-yet obscure object of desire, never mind that the two novels’ respective dates of publication make this a strained point at best. “In a league with Flann O’Brien!” was another, more reasonable, certainly more accurate line. To complete the trifecta, I even heard a few variations on “Beckett loved it!”—presumably unsubstantiated, but nonetheless tantalizing. Whether or not Ó Cadhain’s prose could really match or anyway trot sans embarrassment alongside the mighty strides of this Holy Trinity, the book’s premise was enough to lend credence to the rumors. Cré na Cille comes with an unbeatable “elevator pitch” that rhymes most deliciously with the work of its author’s best beloved countrymen: it’s none of your garden-variety narratives, following a protagonist or protagonists through which- and whatever conflicts and experiences, no. It’s 100% dialogue, and not just any dialogue, but a chorale of dead souls, every character already having snuffed it and been stuffed into their graves. À la an Our Town or Spoon River cross-pollinated with No Exit, however, these corpses are perpetually, rather hellishly awake, aware, and gabbing in Ó Cadhain’s wonderfully unsplendid hereafter.

more here.

‘The Invention of Nature,’ by Andrea Wulf

Colin Thubron at The New York Times:

Alexander von Humboldt was the pre-eminent scientist of his time. Contemporaries spoke of him as second in fame only to Napoleon. All over the Americas and the English-speaking world, towns and rivers are still named after him, along with mountain ranges, bays, waterfalls, 300 plants and more than 100 animals. There is a Humboldt glacier, a Humboldt asteroid, a Humboldt hog-nosed skunk. Off the coast of Peru and Chile, the giant Humboldt squid swims in the Humboldt Current, and even on the moon there is an area called Mare Humboldtianum. Darwin called him the “greatest scientific traveler who ever lived.”

Yet today, outside Latin America and Humboldt’s native Germany, his name has receded into near oblivion. His insights have become so ingested by modern science that they may no longer seem astonishing. As Andrea Wulf remarks in her arresting “The Invention of Nature: Alexander von Humboldt’s New World,” “it is almost as though his ideas have become so manifest that the man behind them has disappeared.”

This formidable genius was born in 1769 to a Prussian court official and a forceful mother of Huguenot descent. He was brought up in the shadow of his precocious elder brother, Wilhelm, a linguist and philosopher, but Alexander flowered into a brilliant polymath: a slight, apparently delicate man driven by furious ambition and insecurity.

more here.

‘Portraits: John Berger on Artists’, by John Berger

Jackie Wullschlager at the Financial Times:

At its best, there is a sort of poetry about Berger’s mix of storytelling and critique, and his receptiveness to literature of all stripes, which consistently enriches this account. An outstanding example is essays on Velázquez and the harsh “Spanish landscape of the interior”, which connect to a musing on unpaintable landscapes worldwide (“if we tend to forget this it is the result of a kind of Eurocentrism”) and — verging bravely, provocatively, on fraught orientalist territory — on the “special place” in Arab poetry of the blade, knife, sword, dagger.

“In the Sahara one enters the Koran,” Berger writes. “Islam was born of, and is continually reborn from, a nomadic desert life whose needs it answers, whose anguish it assuages . . . the blade was a reminder of the thinness of life. And this thinness comes, very materially, from the closeness in the desert between sky and land . . . In the thin stratum of the living laid on the sand like a nomad’s carpet, no compromise is possible because there are no hiding places; the directness of the confrontation produces the emotion, the helplessness, the fatalism.”

Berger’s vision of geography shaping history shaping art and life is almost always infused with such imaginative empathy. When, rarely, it is not, the absence is also revealing: the artists with whom Berger struggles are those seeming to him to lack that empathy, their focus on existential alienation excluding them from social constructs and connections. Giacometti’s “extreme proposition” that no reality could ever be shared “reflects the social fragmentation and manic individualism of the late bourgeois intelligentsia”.

more here.

Why some scientists are worried about a surprisingly cold ‘blob’ in the North Atlantic Ocean

Chris Mooney in the Washington Post:

It is, for our home planet, an extremely warm year.

Indeed, last week we learned from the National Oceanic and Atmospheric Administration that the first eight months of 2015 were the hottest such stretch yet recorded for the globe’s surface land and oceans, based on temperature records going back to 1880. It’s just the latest evidence that we are, indeed, on course for a record-breaking warm year in 2015.

Yet, if you look closely, there’s one part of the planet that is bucking the trend. In the North Atlantic Ocean south of Greenland and Iceland, the ocean surface has seen very cold temperatures for the past eight months.

What’s up with that?

More here. [Thanks to Ali Minai.]

In Memoriam: Yogi Berra

Akim Reinhardt in his blog, The Public Professor:

As a boy of 8 and 9 and 10, growing up in the Bronx, I was a big New York Yankees fan. When you grow up in the Bronx, that’s really all there is to brag about. A zoo and the Yankees.

Nearly every game aired on channel 11 WPIX, and I watched as many as I could, which was nearly all of them.

The Yankees are by far the most successful team in the history of American sports. Not even close. They’re probably the most successful team in the world. For this reason, rooting for the Yankees has often been equated with rooting for a large, wealthy corporation like IBM or GM. I’ve always thought it’s a very poor analogy.

Rooting for the Yankees is actually like rooting for the United States. Each in their own way, the Yankees and United States are the 300 lb. gorilla, that most powerful of entities winning far more than anyone else. Their wealth creates many advantages. Supporters expect them to win, and they usually do. Opponents absolutely revel in their defeats.

All that success means you will be adored by some non-natives who are tired of losing and want to bask in your glory, even if it must be from afar. But mostly you are hated. Anywhere you go in America, some people love the Yankees and many more hate them. Just like the United States is either loved or hated everywhere else in the world.

Who hates IBM?

More here.

What Can ‘Star Trek’ Teach Us About American Exceptionalism?

John Feffer in The Nation:

They were the “best and the brightest,” but on a spaceship, not planet Earth, and they exemplified the liberal optimism of their era. The original Star Trek, whose three-year TV run began in 1966, featured a talented, multiethnic crew. The indomitable Captain Kirk had the can-do sex appeal of a Kennedy; his chief adviser, the half-human, half-Vulcan Mr. Spock, offered the cool rationality of that “IBM machine with legs,” then–Secretary of Defense Robert McNamara. And the USS Enterprise, on a mission “to boldly go where no man has gone before,” pursued a seemingly benign anthropological interest in seeking out, engaging with, and trying to understand the native populations of a fascinating variety of distant worlds.

The “prime directive,” designed to govern the conduct of Kirk and his crew on their episodic journey, required non-interference in the workings of alien civilizations. This approach mirrored the evolving anti-war sympathies of series creator Gene Roddenberry and many of the show’s scriptwriters. The Vietnam War, which raged through the years of its initial run, was then demonstrating to more and more Americans the folly of trying to reengineer a society distant both geographically and culturally. The best and the brightest, on Earth as on the Enterprise, began to have second thoughts in the mid-1960s about such hubris.

Even as they deliberately linked violent terrestrial interventions with celestial ones, however, the makers of Star Trek never questioned the most basic premise of a series that would delight fans for decades, spawning endless TV and movie sequels. Might it not have been better for the universe as a whole if the Enterprise had never left Earth in the first place and if Earth hadn’t meddled in matters beyond its own solar system?

More here.

The not-quite-romance of Eudora Welty and Ross Macdonald

Margaret Eby at The Paris Review:

Some friendships hover between romantic and platonic, anchored to the latter by circumstance or fate. It’s a sitcom trope: the will-they-or-won’t-they couple, always teetering at the edge of love. But though TV demands a tidy resolution—the answer is almost always that they will, and do—in life such friendships often remain in limbo indefinitely, stretching on for years, even decades.

Such was the case for Eudora Welty and Ross Macdonald. By the time they became acquainted, in 1970, both were well established in their fields—Welty in that nebulous genre called Southern literature, and Macdonald in hard-boiled detective fiction. Welty’s stories and novels captured the voice of small towns in Mississippi; Macdonald, the pen name for Ken Millar, set his novels in Southern California, where he and his wife, Margaret, had settled. His books explored, through his Philip Marlowe–equivalent Lew Archer, the ways in which the dream of suburbia could turn twisted and nightmarish.

Welty was an avid reader of crime fiction, so much so that the now-defunct Choctaw Books in Jackson used to keep a pile of paperbacks on hand for when she stopped by.

more here.

In whose service does a painter paint, or a critic criticize?

Barry Schwabsky at The Nation:

When Buchloh derides more or less all of modernist and contemporary representational art, from Giorgio de Chirico’s pittura metafisica to Neo-Expressionism circa 1980, as “a masquerade of alienation from history, a return of the repressed in cultural costume,” the vehemence of his condemnation is impressive until one considers how all-encompassing it is, and how easily it might be turned into praise. After all, maybe the repressed should be encouraged to return—and who’s to say that being alienated from history is categorically bad? Yet as slippery as Buchloh’s rhetoric may be, the object of his fulminations is, at times, clear enough. Klein is an easy target, given that he was an incorrigible mystifier who really did take a reactionary political stance. But Buchloh, who maintains that it’s impossible to “evaluate any artistic production without considering at the same time its manifest political and ideological investments”—and who also feels certain that he can detect its unconscious agenda—forgets that the artist’s politics are not necessarily those of his art. That Balzac was a royalist did not prevent his writing from having a revolutionary effect, or so Marx and Engels believed. Buchloh, by contrast, thinks he’s made his case by citing Klein’s “crypto-fascist statements.” He accuses Donald Judd, who admired the Frenchman’s blue monochrome paintings, of a “patently formalist” approach, and—pot calling the kettle black—considers Judd’s promulgation of autonomous art as an “authoritarian prohibition” of his own brand of Ideologiekritik. But while Buchloh would be happy to prohibit Judd’s attentiveness to form, the latter at least accounts for why Klein is still worth talking about today.

more here.

Did William Styron reserve his best work for non-fiction?

Phillip Lopate at the Times Literary Supplement:

Styron remains a dimly realized figure in his personal essays. We are told repeatedly the same facts about his childhood in Tidewater, Virginia, his grandmother who owned two slaves, his enlistment in the Marine Corps, his annus mirabilis in 1952 when his first novel was published and he went to France, found a circle of friends who would start The Paris Review, and met his wife. Of course any writer serially engaged in autobiographical accounts will be forced to repeat material; but Styron never rethinks or questions any of it. He uses practically the same language each time. It isn’t that he’s dishonest, but his public presentation of self lacks a more probing honesty: he always seems to be holding back. To get a sense of Styron the man you would have to turn to his daughter Alexandra Styron’s perceptive, frank portrait, Reading My Father (TLS, September 30, 2011). She claims her father had a wicked sense of humour, but there is precious little in evidence here, except maybe for a tongue-in-cheek takedown of Flo Aadland’s pop tell-all about Errol Flynn, The Big Love, and a sweet, whimsical piece about walking his dog, previously uncollected.

In the fallow decades when Styron struggled to bring off a new novel after the success of Sophie’s Choice, he dedicated considerable, if reluctant, energy to non-fiction. Many of the best pieces here, including the brilliant “This Quiet Dust” about how he was moved to write the story of the Negro slave, Nat Turner, appeared in the 1982 collection of that name. Styron was taken to task for The Confessions of Nat Turner by a group of African American critics, who objected to his having had the temerity to write a novel in the voice of a historic black rebel. He seems never to have got over the sting of that controversy, as evidenced by his follow-up piece, “Nat Turner Revisited”. In a tribute to Philip Roth, he singles out Roth’s having been the target of censorious rabbis as his point of identification.

more here.

Friday Poem

The Unnamable River

1.

Is it in the anthracite face of a coal miner,
crystallized in the veins and lungs of a steel
worker, pulverized in the grimy hands of a railroad engineer?
Is it in a child naming a star, coconuts washing
ashore, dormant in a volcano along the Rio Grande?

You can travel the four thousand miles of the Nile
to its source and never find it.
You can climb the five highest peaks of the Himalayas
and never recognize it.
You can gaze through the largest telescope
and never see it.

But it's in the capillaries of your lungs.
It's in the space as you slice open a lemon.
It's in a corpse burning on the Ganges,
in rain splashing on banana leaves.

Perhaps you have to know you are about to die
to hunger for it. Perhaps you have to go
alone in the jungle armed with a spear
to truly see it. Perhaps you have to
have pneumonia to sense its crush.

But it's also in the scissor hands of a clock.
It's in the precessing motion of a top
when a torque makes the axis of rotation describe a cone:
and the cone spinning on a point gathers
past, present, future.

Read more »

Tiny mitochondria play outsized role in human evolution and disease

From PhysOrg:

Mitochondria are not only the power plants of our cells, these tiny structures also play a central role in our physiology. Furthermore, by enabling flexible physiological responses to new environments, mitochondria have helped humans and other mammals to adapt and evolve throughout the history of life on earth.

A pioneering scientist in mitochondrial biology, Douglas C. Wallace, Ph.D., synthesizes evidence for the importance of mitochondria in a provocative Perspective article today in the journal Cell. Residing in large numbers outside the nucleus of every cell, mitochondria contain their own DNA, with unique features that “may require a reassessment of some of our core assumptions about human genetics and evolutionary theory,” concludes Wallace, director of the Center for Mitochondrial and Epigenomic Medicine at The Children's Hospital of Philadelphia.

Wallace has investigated mitochondria for more than 40 years. In 1988, he was the first to show that mutations in mitochondrial DNA (mtDNA) can cause inherited human disease. His body of research has focused on how mtDNA mutations contribute to both rare and common diseases by disrupting bioenergetics—chemical reactions that generate energy at the cellular level.

Wallace and colleagues previously showed in the late 1970s that human mitochondrial DNA is inherited exclusively through the mother. They then used this knowledge to reconstruct the ancient migrations of women by comparing variation in mtDNA among populations throughout the world. From such studies, scientists have concluded that humans arose in Africa about 200,000 years ago and that only two mtDNA lineages successfully left Africa about 65,000 years ago to colonize the rest of the world.
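The basic logic of such lineage studies can be illustrated with a toy example. The sketch below (Python; the short haplotype strings and population labels are invented for illustration, not real mtDNA data) simply counts the positions at which pairs of sequences differ — the kind of pairwise variation that, at genome scale and with proper phylogenetic methods, underlies reconstructions of maternal ancestry. It is a schematic sketch only, not the method Wallace and colleagues used.

```python
# Toy illustration of comparing mtDNA variation between populations.
# The haplotypes below are made up; real analyses use full mitochondrial
# genomes and far more sophisticated phylogenetic models.

from itertools import combinations

haplotypes = {
    "Africa_1":   "ACGTACGTAC",
    "Africa_2":   "ACGTACGTAT",
    "Europe_1":   "ACGTTCGTAC",
    "Asia_1":     "ACGATCGTAC",
    "Americas_1": "ACGATCGGAC",
}

def hamming(a: str, b: str) -> int:
    """Number of positions at which two aligned, equal-length sequences differ."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to the same length")
    return sum(x != y for x, y in zip(a, b))

# Fewer differences suggest a more recent shared maternal ancestor,
# since mtDNA is inherited only through the mother.
for (name_a, seq_a), (name_b, seq_b) in combinations(haplotypes.items(), 2):
    print(f"{name_a:11} vs {name_b:11}: {hamming(seq_a, seq_b)} differences")
```

In a real analysis, distances like these would feed a phylogenetic tree; the deepest branches of the human mtDNA tree fall within Africa, which is the pattern behind the dates quoted above.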

More here.

Extreme altruism: should you care for strangers at the expense of your family?

Larissa MacFarquhar in The Guardian:

The term “do-gooder” is, of course, often demeaning. It can mean a silly or intrusive person who tries to do good but ends up only meddling. It can mean someone who seems annoyingly earnest, or priggish, or judgmental. But even when “do-gooder” simply means a person who does good deeds, there is still some scepticism, even antagonism, in it. One reason may be guilt: nobody likes to be reminded, even implicitly, of his own selfishness. Another is irritation: nobody likes to be told, even implicitly, how he should live his life, or be reproached for how he is living it. And nobody likes to be the recipient of charity. But that is not the whole story. Ambivalence towards do-gooders also arises out of a deep uncertainty about how a person ought to live. Is it good to try to live as moral a life as possible – a saintly life? Or does a life like that lack some crucial human quality? Is it right to care for strangers at the expense of your own people? Is it good to bind yourself to a severe morality that constricts spontaneity and freedom? Is it possible for a person to hold himself to unforgiving standards without becoming unforgiving? Is it presumptuous, even blasphemous, for a person to imagine that he can transfigure the world – or to believe that what he does in his life really matters when he is only a tiny, flickering speck in a vast universe? There are powerful forces that push against do-gooders that are among the most fundamental, vital and honourable urges of human life.

For instance: there is family and there are strangers. The do-gooder has a family, like anyone else. If he does not have children, he has parents. But he holds himself to moral commitments that are so stringent and inflexible that they will at some point conflict with his caring for his family. Then he has to decide what to do. To most people, it is obvious that they owe far more to family than to strangers; caring for the children of strangers as much as your own, say, would seem not so much difficult as unnatural, even monstrous. But the do-gooder does not believe his family deserves better than anyone else’s. He loves his more, but he knows that other people love their families just as much. To a do-gooder, taking care of family can seem like a kind of moral alibi – something that may look like selflessness, but is really just an extension of taking care of yourself.

More here.

How termites ventilate

Peter Reuell in the Harvard Gazette:

As builders go, termites don’t have many tools at their disposal — just their bodies, soil, and saliva. For guidance they have nothing to go on save variations in wind speed and direction and fluctuations in temperature as the sun rises and sets.

Despite such limitations, the insects have managed to develop structures that are efficiently ventilated, a challenge that’s still a struggle for human builders.

Led by L. Mahadevan, Lola England de Valpine Professor of Applied Mathematics, of Organismic and Evolutionary Biology, and of Physics, a team of researchers that included postdoctoral fellow Hunter King and MIT grad student Samuel Ocko has for the first time described in detail how termite mounds are ventilated. The study, described in an Aug. 31 paper in the Proceedings of the National Academy of Sciences, reveals that the structures act similarly to a lung, inhaling and exhaling once a day as they are heated and cooled.

“The direct measurements essentially overthrow the conventional wisdom of the field,” said Mahadevan. “The classic theory was that if you have wind blowing over the mounds, that changes the pressure, and can lead to suction of CO2 from the interior … but that was never directly measured.

“We measured wind velocity and direction inside the mounds at different locations. We measured temperature, CO2 concentrations … and found that temperature oscillations associated with day and night can be used to drive ventilation in a manner not dissimilar to a lung. So the mound ‘breathes’ once a day, so to speak.”
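To see how a daily temperature cycle alone could pump air through a structure, here is a deliberately crude toy model (Python; every parameter is invented for illustration). The outside temperature follows a day-night sinusoid, the mound interior lags behind it, and the inside-outside difference drives an exchange flow that flushes CO2. This is not the model in the PNAS paper — just a sketch of the "breathing once a day" idea.

```python
# Toy "breathing mound": a day-night temperature cycle drives a once-daily
# exchange of air that flushes CO2 out of the mound interior.
# All parameters are invented and are for illustration only.

import math

DT = 60.0                  # time step [s]
DAY = 24 * 3600.0          # one day [s]
TAU_THERMAL = 6 * 3600.0   # thermal lag of the mound interior [s] (made up)
EXCHANGE_GAIN = 1e-5       # ventilation per degree of temperature difference (made up)
CO2_PRODUCTION = 5e-7      # CO2 fraction added per second by the colony (made up)
CO2_OUTSIDE = 0.0004       # ambient CO2 fraction (~400 ppm)

t = 0.0
temp_inside = 30.0         # deg C
co2_inside = CO2_OUTSIDE

for step in range(int(2 * DAY / DT)):        # simulate two days
    # Outside air follows a simple day-night sinusoid.
    temp_outside = 30.0 + 8.0 * math.sin(2 * math.pi * t / DAY)

    # The mound interior warms and cools slowly toward the outside temperature.
    temp_inside += DT * (temp_outside - temp_inside) / TAU_THERMAL

    # The inside-outside difference drives an exchange flow that flushes CO2,
    # while the colony's respiration pushes the concentration back up.
    vent_rate = EXCHANGE_GAIN * abs(temp_outside - temp_inside)
    co2_inside += DT * (CO2_PRODUCTION - vent_rate * (co2_inside - CO2_OUTSIDE))

    if step % int(3600 / DT) == 0:           # report once an hour
        print(f"hour {t / 3600:5.1f}  dT = {temp_outside - temp_inside:+5.2f} C  "
              f"CO2 = {co2_inside * 1e6:8.1f} ppm")
    t += DT
```

Run over a simulated day, the interior CO2 rises while the inside and outside temperatures are close and falls as the difference (and hence the exchange flow) grows — a crude analogue of the once-a-day inhale and exhale the measurements describe; the actual study measured wind velocity, temperature, and CO2 inside real mounds.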

More here.

David Hume, the Buddha, and a search for the Eastern roots of the Western Enlightenment

Alison Gopnik in The Atlantic:

In 1734, in Scotland, a 23-year-old was falling apart.

As a teenager, he’d thought he had glimpsed a new way of thinking and living, and ever since, he’d been trying to work it out and convey it to others in a great book. The effort was literally driving him mad. His heart raced and his stomach churned. He couldn’t concentrate. Most of all, he just couldn’t get himself to write his book. His doctors diagnosed vapors, weak spirits, and “the Disease of the Learned.” Today, with different terminology but no more insight, we would say he was suffering from anxiety and depression. The doctors told him not to read so much and prescribed antihysteric pills, horseback riding, and claret—the Prozac, yoga, and meditation of their day.

The young man’s name was David Hume. Somehow, during the next three years, he managed not only to recover but also, remarkably, to write his book. Even more remarkably, it turned out to be one of the greatest books in the history of philosophy: A Treatise of Human Nature.

In his Treatise, Hume rejected the traditional religious and philosophical accounts of human nature. Instead, he took Newton as a model and announced a new science of the mind, based on observation and experiment. That new science led him to radical new conclusions.

More here.