Sex and the brain

“Are men more likely than women to be born with the potential for abstract brilliance in science, mathematics, the arts or music? Los Angeles correspondent Robert Lusetich reports on new research claims from the author of The Bell Curve.”

From The Australian:

The idea is as simple as its implications are seismic: women, as a group, lack the evolutionary genetic intelligence to master the highest strata of mathematics and the hard sciences. This is the central tenet of a contentious theory forwarded by famed US social scientist Charles Murray, who a decade ago made similarly explosive claims about the inferior genetic intelligence of blacks in his best-selling book The Bell Curve.

“It’s quite satisfying to see that I didn’t get nearly the hostile reaction I was expecting this time,” Murray says from his home near Washington. “After The Bell Curve, I was the Antichrist, so perhaps we have moved on and we can start looking at this data in an un-hysterical way.”

Perhaps. Another explanation may be that Murray has used up his 15 minutes of fame. Lisa Randall, an eminent Harvard theoretical physicist and cosmologist, had agreed to dissect Murray’s work, which appeared in the September issue of Commentary magazine in the US, for Inquirer but on reflection declined to respond. “The reason is that this just isn’t news and it’s not worthy of being covered,” she says. “If it really gets to the point where people accept it, I can explain the many logical fallacies in his piece.”

Murray counters with the shrug of a man who has heard it all before; he is fully prepared to take it on the chin from the “women’s studies crowd”.

“Universities are supposed to be places where we talk about these things, not run from them,” he says. “These are, in the end, questions of data, not my opinion.”

More here.

The social logic of Ivy League admissions

Malcolm Gladwell in The New Yorker:

At Princeton, emissaries were sent to the major boarding schools, with instructions to rate potential candidates on a scale of 1 to 4, where 1 was “very desirable and apparently exceptional material from every point of view” and 4 was “undesirable from the point of view of character, and, therefore, to be excluded no matter what the results of the entrance examinations might be.” The personal interview became a key component of admissions in order, Karabel writes, “to ensure that ‘undesirables’ were identified and to assess important but subtle indicators of background and breeding such as speech, dress, deportment and physical appearance.” By 1933, the end of Lowell’s term, the percentage of Jews at Harvard was back down to fifteen per cent.

If this new admissions system seems familiar, that’s because it is essentially the same system that the Ivy League uses to this day. According to Karabel, Harvard, Yale, and Princeton didn’t abandon the elevation of character once the Jewish crisis passed. They institutionalized it.

More here.

High Metal Tower

Katharine Logan in Architecture Week:

A crisp, subtly articulated new form has risen among the towers of New York. The Helena, a 580-unit apartment building designed by FXFOWLE ARCHITECTS, formerly Fox & Fowle Architects, brings elegant design and sustainable technologies to a building type often underserved in both these regards.

As the first voluntarily sustainable highrise residential building in New York City, the Helena has won the AIA 2005 Green Affordable Housing Award from the American Institute of Architects. “It is a source of pride that the AIA has recognized the Helena as a new model of what a New York sustainable apartment building can and should be,” says Dan Kaplan, AIA, senior principal of FXFOWLE.

The Helena’s envelope of floor-to-ceiling glass, wrap-around windows, and metal panels weaves a shimmering pattern of opacity and reflection. With floor bands seeming from below to stretch on a bias across the building’s facets, the building looks taut and smart. Its understated formal composition, accented with a twist of the balcony and a tilt of the photovoltaics, balances verve with restraint: a welcome achievement in a building type that, as a supporting actor on the urban stage, often tries either too hard or not hard enough.

More here.

From the Tail: Betting on Uncertainty

I think I know where you stand on the ongoing federal court case in Pennsylvania, where parents have sued to block the teaching of intelligent design in their schools. Your position notwithstanding, only 13% of the respondents to a November 2004 Gallup poll believed that God has no part to play in the evolution or creation of human beings. Fully 45% said they believe that God created humans in their present form less than 10,000 years ago!

What’s going on here? Many (perhaps even a majority) of these respondents were taught evolution in school. Did they choose to disregard it merely because it contradicted their religion? They do seem to accept a whole host of other things during the course of their education which may contradict it as well. For example, there appears to be far less skepticism about the assertion that humans occupy a vanishingly small fraction of the universe. I’ll throw out three other explanations that are often advanced, but which I believe to be inadequate as well:

  1. Natural selection is not a good enough explanation for the facts: Clearly, it is.
  2. Natural selection has not been properly explained to the general public: Sure there are common misconceptions, but proponents have had enough school time, air time and book sales mindshare to make their points many times over.
  3. Religious zealots have successfully mounted a campaign based on lies, that has distorted the true meaning of natural selection: This has conspiracy theory overtones.  There are too many people who do not believe in natural selection — have they all been brainwashed?

My explanation is simply this: Human beings have a strong visceral reaction to disbelieve any theory which injects uncertainty or chance into their world view. They will cling to some other “explanation” of the facts which does not depend on chance until provided with absolutely incontrovertible proof to the contrary.

Part of the problem is that we all deal with uncertainty in our daily lives, but it is, at best, an uncomfortable co-existence. Think of all the stress we go through because of uncertainty. Or how it destabilizes us and makes us miserable (what fraction of the time are you worrying about things that are certain?). In addition to hating it, we confuse uncertainty with ignorance (which is just a special case), and believe that eliminating uncertainty is merely a matter of knowing more. Given this view, most people have no room for chance in the basic laws of nature. My hunch is that that is what many proponents of Intelligent Design dislike about natural selection. Actually, it’s more than a hunch. The Discovery Institute, a think tank whose mission is to make “a positive vision of the future practical” (but which appears to devote the bulk of its resources to promoting intelligent design), has gotten 400 scientists to sign the following “Scientific Dissent from Darwinism”:

We are skeptical of the claims for the ability of random mutation and natural selection to account for the complexity of life. Careful examination of the evidence for Darwinian theory should be encouraged.

In this world of sophisticated polling and sound bites, I think that the folks at the Discovery Institute have gotten their message down pat. To be sure, natural selection is not a theory of mere chance. But without uncertainty it cannot proceed. In other words, natural selection is a theory not of chance, but one that requires it. The advocates of Intelligent Design are objecting to the “purposeless” nature of natural selection and replacing it with the will of a creator. It doesn’t really help matters for Darwinians to claim that chance plays only a marginal role, or that the appeal to chance is a proxy for some other insidious agenda. Chance is the true bone of contention. In fact, as Jacques Monod put it over thirty years ago:

Even today a good many distinguished minds seem unable to accept or even to understand that from a source of noise, natural selection could quite unaided have drawn all music of the biosphere. Indeed, natural selection operates upon the products of chance and knows no other nourishment; but it operates in a domain of very demanding conditions, from which chance is banned. It is not to chance but to these conditions that evolution owes its generally progressive course.
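
To make the “not chance, but requires it” point concrete, here is a toy mutation-plus-selection sketch in the spirit of Dawkins’s well-known “weasel” program. It is only an editorial illustration, not anything from the column or from Monod, and the target string, mutation rate, and population size are arbitrary assumptions:

```python
# Toy "random variation + selective retention" sketch (a Dawkins-style weasel program).
# Illustrative only: the target string and the parameters below are arbitrary choices.
import random
import string

ALPHABET = string.ascii_uppercase + " "
TARGET = "METHINKS IT IS LIKE A WEASEL"

def fitness(candidate: str) -> int:
    """Number of positions that already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    """Copy the parent, replacing each character with a random one with probability `rate`."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c for c in parent)

random.seed(0)
current = "".join(random.choice(ALPHABET) for _ in TARGET)  # pure chance to start
generations = 0
while current != TARGET:
    offspring = [mutate(current) for _ in range(100)]   # chance proposes...
    best = max(offspring, key=fitness)                  # ...selection disposes
    if fitness(best) > fitness(current):
        current = best
    generations += 1

print(f"Selection reached the target in {generations} generations.")
# Unaided chance would need roughly 27**28 random strings, on average, to hit the same target.
```

The randomness is indispensable (it is the only source of new variants), yet the outcome is anything but a lottery: selection, operating in Monod’s “domain of very demanding conditions,” turns noise into convergence in a modest number of generations rather than an astronomically improbable draw.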

The inability of otherwise reasonable people to accept a fundamental role for randomness is not restricted to religious people — scientists are hardly immune to it. We know that even Einstein had issues with God and dice in the context of Quantum Mechanics. Earlier, in the 1870s, when Ludwig Boltzmann explained the Second Law of Thermodynamics by introducing, for the first time, probability into a fundamental law, he was met with extreme skepticism and hostility. He had broken with the classical Newtonian imperative of determinism, and so could not be right. After much heartache over answering his many critics, Boltzmann (who had been struggling with other problems as well) hanged himself while on holiday.

Of course one reason we hate to deal with uncertainty is that we are so ill equipped to do so. Even when the facts are clearly laid out, the cleverest people (probabilists included) make mistakes. I can’t resist providing the following example:

William is a short, shy man. He has a passion for poetry and loves strolling through art museums. As a child he was often bullied by his classmates. Do you suppose that William is (a) a farmer, (b) a classics scholar?

Everyone I ask this question chooses (b). But that isn’t right. There are vastly more farmers than classics scholars, and even if a small fraction of farmers match William’s characteristics, that number is likely to be larger than the entire set of classics scholars. (Did you just get burned by your meager probabilistic reasoning faculties?) The psychologists Kahneman and Tversky pioneered the field of behavioral economics, which establishes among other things that our heuristics for reasoning about uncertainty are quite bad. You can probably think of many patently dumb things that people have done with their money and with their lives when a simple evaluation of the uncertainties would have resulted in better outcomes.
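
Here, for the record, is the base-rate arithmetic behind the farmer answer, written out as a tiny script. The population counts and match rates are purely illustrative assumptions (the column supplies no real figures); the point is only that the prior ratio of farmers to classics scholars overwhelms the description:

```python
# Base-rate sketch for the "William" example. All numbers are made-up assumptions
# chosen only to show the arithmetic, not real statistics.

farmers = 2_000_000            # assumed number of farmers
classics_scholars = 5_000      # assumed number of classics scholars

p_match_given_farmer = 0.01    # assume only 1% of farmers fit William's description
p_match_given_scholar = 0.50   # assume fully half of classics scholars fit it

matching_farmers = farmers * p_match_given_farmer              # 20,000
matching_scholars = classics_scholars * p_match_given_scholar  # 2,500

# P(farmer | fits description), by simple counting (equivalently, Bayes' rule):
p_farmer = matching_farmers / (matching_farmers + matching_scholars)

print(f"Matching farmers:  {matching_farmers:,.0f}")
print(f"Matching scholars: {matching_scholars:,.0f}")
print(f"P(farmer | description) ≈ {p_farmer:.2f}")   # ≈ 0.89
```

Even with the deck stacked heavily in the scholars’ favor on the description itself, the shy, poetry-loving William is still far more likely to be a farmer.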

So back to getting people to accept uncertainty as an inherent part of the world. As you can probably tell, I am not holding my breath. On evolution, the timescales are too long to be able to provide the incontrovertible proof to change most people’s minds. Maybe a better approach is to reason by analogy. There is an absolutely staggering amount of purposeless evolution unfolding at breakneck speed before our very eyes. I am talking about the Web, the very medium through which you are reading this. In only about ten years, a significant portion of the world’s knowledge has become available, almost instantaneously accessible, and free. Consider these figures from a recent article by Kevin Kelly. The thing we call the Web has

  • more than 600 billion web pages available, which are accessible by about 1 billion people.
  • 50 million simultaneous auctions going on on eBay, adding up to 1.5 billion a year.
  • 2 billion searches a month being done on Google alone.

Think back to what you were doing ten years ago. Did you ever really think that any of this would happen? The scale at which the internet operates was envisioned by none of the engineers and computer scientists who collaboratively attempted to design the basic substrate of protocols upon which it runs. In truth, the innovations and designs of the web come from the collective energies of its users, and not according to an intelligent design or a blueprint. Here the purposelessness of evolution is much easier to see. One day in the future some theory will reveal, as a simple consequence, why all of a sudden in the years 2004-05 there sprang up 50 million blogs, with a new one coming online every 2 seconds. This theory of evolution will be framed by a Law and this law will have at its core an indelible, irreducible kernel of chance. And chances are, most people will have a hard time believing it.

Monday Musing: Enchantment and pluralism, some thoughts while reading Jonathan Strange & Mr. Norrell

Throughout much of the writings of the German sociologist Max Weber, you can find the claim that modernity and its rational control over the natural world demanded the disenchantment of the world; that is, the exit of the sacramental from material things and the end of sacrament as a means (or rather an appeal to the world) to fulfill our roles and ends. The roles of the religious and the spiritual dwindle. Science and technology displace magic. But more specifically, they displace it in the realm of means.

Weber saw this mostly in the rise of capitalism and the modern bureaucracy and in the Protestantism that has, or had, an “elective affinity” to modernity itself.

Only ascetic Protestantism completely eliminated magic and the supernatural quest for salvation, of which the highest form was intellectualist, contemplative illumination. It alone created the religious motivations for seeking salvation primarily through immersion in one’s worldly vocation. . . For the various popular religions of Asia, in contrast to ascetic Protestantism, the world remained a great enchanted garden, in which the practical way to orient oneself, or to find security in this world or the next, was to revere or coerce the spirits and seek salvation through ritualistic, idolatrous, or sacramental procedures. No path led from the magical religiosity of the non-intellectual strata of Asia to a rational, methodical control of life. (The Great Religions of the World)

And that pinnacle expression of and institution for methodical control of the world, the bureaucracy, was notable, according to Weber, precisely for its irreligion.

A bureaucracy is usually characterized by a profound disesteem of all irrational religion . . .(Religious Groups)

Reading Susanna Clarke’s Jonathan Strange & Mr. Norrell, which admittedly I’m only half-way through, I was reminded of Weber (which is not so uncommon). The novel, set in the early 19th century, concerns the reappearance of magic in the modern world. In the novel, magic existed once upon a time, but had disappeared three centuries earlier, at the end of the Middle Ages. Against the backdrop of the Napoleonic Wars, two practicing magicians appear in England—a re-enchantment, of sorts.

Prior to the appearance of the two practical magicians, magic is purely theoretical, the occupation of historians and scholars, but not practitioners. Interestingly enough, these historians and scholars in the novel are also called “magicians.” The magic societies resemble philosophy circles and salons. And the idea of magic in the novel as a metaphor for philosophy is an obvious one, if only because the line between magic and philosophy seems so blurry in the Middle Ages. Merlin certainly appears a philosopher magician, a sage.

The two, Jonathan Strange and his teacher Mr. Norrell, lend their services to the war effort, and we are given an image of magic interacting with the specialized, but also distant and abstract, knowledge of bureaucracy. And it’s a funny image: two separate relationships to means in conflict, with neither depicted in a flattering way.

Enchanted (or mysterious) means don’t seem any more sensible or effective than dis-enchanted (rational, methodical) ones. (At least so far.)

(And I was also disappointed to learn that the connection between “wizard” and “vizier” is accidental.)

I was thinking of these issues in the context of a larger one: namely, why so much fantasy appears to be conservative. The Lord of the Rings, and not just Tolkien, seems clearly conservative in its politics. And by conservative, I don’t mean that it simplifies politics but rather that it harkens back to a time before a monistic conception of the good—as given by religion, usually—collapsed in favor of the pluralism of ends that we enjoy and which defines the freedom of the moderns. To follow John Holbo and invoke Isaiah Berlin, people disagree about the ends of life and not just the means. And the modern world has been set up to allow people to disagree and live their lives in the way they like without too much conflict, at least ideally.

There are exceptions to my claim that fantasy seems to go with conservatism, to be sure: Buffy the Vampire Slayer, for one. But it does seem that the practical representation of magic often takes place against the backdrop of, at least, a locally all-embracing purpose, most commonly war. It’s almost as if the absence of a methodical control of life and the world requires that the ends of life are controlled thoroughly. Conversely, the rationalization of the world appears to go hand in hand with the pluralism of ends. (Of course, Weber, and some of those he inspired, including the Marxist Frankfurt School, was terrified that values—monistic or plural—would exit altogether from the modern world under its rationalization, and that means would become ends in themselves. Although, it seems that no one can give an example other than the accumulation of money or commodities.)

At least so far, Clarke seems to avoid the conundrum, or appears to make fun of the genre’s political naiveté. (It apparently gets even better, in terms of political richness.)  And it seems to me that to the extent that the backdrop of fantasy can shift from the Wagnerian saga into the quotidian, magic can find a place in the modern world.

Lives of the Cannibals: Redemption

On May 29, 1983, Steve Howe, a 25-year-old relief pitcher for the Los Angeles Dodgers, checked himself into a drug rehabilitation center to treat an addiction to cocaine. Howe was a promising young star, 1980’s Rookie of the Year, and endowed with the hyperactive, pugnacious demeanor of a natural-born “closer,” the pitcher charged with saving tight games in treacherous late-inning situations. He completed his rehab in late June, but was sent away again in September after missing a team flight and refusing to submit to urinalysis. He tested positive for cocaine three times that November, and was suspended from baseball for the 1984 season, one of several players caught up in the decade’s snorty zeitgeist. Howe returned to the mound in ’85 and over the next six years pitched sporadically for the Dodgers, the Minnesota Twins and the Texas Rangers, as well as a Mexican League team and a couple of independent minor-league level clubs in the States. But June of ’92 found Howe busted again, and Fay Vincent, then the commissioner of baseball, banned him for life. An arbitrator later vacated Vincent’s decision, reinstating Howe, and the New York Yankees signed him to pitch in the Bronx. After Yankee relievers suffered a mid-season collapse in 1994, Howe stepped into the breach and, notwithstanding his caged pacing and myriad facial tics, recorded 15 clutch saves and a 1.80 earned run average, winning the enduring affection and respect of Yankee fans, who have a proud history of adopting the troubled and eccentric, just so long as they win.

Welcome to New York, perhaps the most prolifically redemptive island in human history. Granted, islands are built for redemption. Their isolation and exclusivity require new beginnings from their inhabitants, and they tend in general (and New York’s islands tend in particular) to transact life on terms different from other places. In the City, where the hybrid system runs on aggression, aplomb and sex appeal, fatuous Wall Street wizards and Upper East Side tastemakers serve prison sentences and emerge hotter than ever, redeemed not by God or humanism but by the very fact of their fall from grace. It’s exotica, a matter of salacious interest and a perfect bluff for the social scene, where a big rep is all it takes, and the smart ones ride theirs all the way to a clubby write-up in Talk of the Town. Sure, a prison term is a nuisance, but it’s also useful (if bush-league) preparation for the more exigent realities of life in Manhattan. So it’s no surprise that we should admire the same things in our more middle-class heroes–our athletes and actors, and our politicians too. We want contrition, of course, and we must remember the children, but a little imperfection makes for a compelling character, and we won’t have that sacrificed.

The New York Yankees opened their 2005 season 11-19. It was the worst start anyone could remember, and it came on the heels of the greatest collapse (or comeback, depending on your regional perspective) in baseball history, when, in the second round of the 2004 playoffs, the Yankees were eliminated by the Red Sox despite winning the first three games of a best-of-seven series. In every one of the last nine years, they had made it to the playoffs, and in every one of the last seven, they had won the American League’s Eastern Division title, but 2005 seemed different. They were paying $15 million, $10 million and $7 million to three starting pitchers of dubious value–Brown, Wright and Pavano–and they had purchased the super-rich contract of Randy Johnson, once inarguably the finest pitcher in the major leagues, but now, at 41, a cranky and unreliable prima donna, whose 6'7" frame and acne-scarred face looked pained and out of place in Yankee pinstripes. Their beloved veteran center fielder Bernie Williams couldn’t throw anymore, and their traditionally solid bullpen hemorrhaged runs nightly. It was a difficult reality for fans who had been treated to a decade of near-constant success, but it was manna for the millions of Yankee haters, whose unfailing passion evinces the team’s historical greatness and cultural significance. In the wake of their ignominious 2004 defeat at the hands of the Red Sox, and finding themselves in last place in the American League East, the Yankees and their fans despaired. It was over.

Enter Jason Gilbert Giambi and Aaron James Small, high school classmates from California and unlikely Yankee teammates, whose personal redemptions spurred the 2005 Yankees to their eighth consecutive division title on Saturday. Giambi, a longtime star slugger, is one of the few known quantities in the recent steroid controversy (and Capitol Hill comedy, where the workout regimens of professional athletes have curiously attained massive political profile), whose leaked congressional testimony marks him as a confirmed (though not explicitly stated) user. Giambi spent most of 2004 on the Yankees’ disabled list, recovering from mysterious fatigue and a suspicious tumor, both of which, it seemed likely to pretty much everyone who gave it any thought, might just be the rightful wages of sticking a hypodermic needle in your ass and suffering nascent breast development, in exchange for increased strength and the ability to heal faster (a superhero’s tradeoff). But if nothing else came clear in 2005, at least Jason Giambi wasn’t on the juice. Never did a hitter look more helpless at the plate than poor Jason. He flailed and whiffed, and the earnest cheerfulness that once endeared him to fans and teammates curdled into delusive optimism. He was done.

But he wasn’t. Through the first two months of the season, Giambi claimed to be on the right track. He still had his good eye, he pointed out, referring to all the walks he earned, and it was just a matter of timing and bat speed after that. Fans and the media were indulgent but skeptical. The Yankees are a known rest-home for aging, overpriced talent, and Giambi’s story, though more dramatic than the trajectory of your average baseball player’s decline, did fit the profile. But, much to everyone’s surprise, he started hitting again, and what he started hitting were home runs–tall flies that took ages to land, and missiles that slammed into the bleachers moments after cracking off his bat. Giambi began driving in runs at a faster pace than anyone else on a team full of standout run-producers, and he continued reaching base on the walks that served as his crutch in those first miserable months, all of which amounted to league-leading slugging and on-base percentages. Jason was redeemed, and his legend is assured now as the star who wanted more, who lost everything to greed and arrogance, and who recovered his glory, which is now vastly more appealing for the fact that it’s tarnished. It’s a real New York kind of story.

As for Aaron Small, his is a story of redemption too, but one more suitable for middle America, which might not take so kindly to the resurrected likes of Steve Howe and Jason Giambi. Like Giambi, Small is a 34-year-old baseball veteran, but a veteran of the minor leagues, whose only pro success has been the several “cups of coffee” (as baseball cant has it) he’s enjoyed in the majors in 16 years of playing–short stints in the bigs, followed by interminable bus rides back to the minors. This year, Small was called up to plug the holes left by the Yankees’ multimillion-dollar washouts, Brown, Wright and Pavano. Small, it should be noted, is the type of guy who thanks God for minor successes, a tendency not uncommon in local basketball and football players, but one that seems exceedingly peculiar in a glamorous Bronx Bomber. Nevertheless, he has been embraced by New York fans, and their acceptance has everything to do with the ten victories he compiled (against no defeats) in his partial 2005 season. This modest, Southern country boy outpitched every high-priced arm the Yankee millions could buy, and after every game he shucksed his way through interviews, praising his patient wife, praising his remarkably attentive savior, and just generally expressing his shock and pleasure at finding himself in the heat of a big-league pennant race after more than a decade-and-a-half of slogging his way from minor-league town to minor-league town. Small’s story is relevant here because his time is short. His 16-year patience, his redemption, will not remain in the minds of New Yorkers very long, not unless he does something colossally self-destructive–and he better do it quick. We like a little dirt on our heroes, a little vulgarity, because otherwise it’s all hearts and flowers and straight-laced (and -faced) fortitude, and what could be more dull? New York takes pride in its corruptions, and a hero isn’t a New York hero until he’s been dragged down and beaten (preferably by his own hand).

And this is why the 2005 Yankees have a shot at being the most memorable team to come out of the City in years. They’ve seized every opportunity to make things hard this season. Every potential run-scoring at bat, every pitching change and every difficult fielding chance has come with the sour taste of unavoidable failure, the sense that we’re almost out of gas now after a decade at the top. Our trusty veterans have lost their vigor and our big-name stars are compromised–by their egos, their paychecks and their tendency to choke. The obstreperous owner is lapsing into dementia, and even Yankee Stadium itself has entered its dotage. Indeed, what we’re confronted with is the last, limping formation of a great baseball team, occasionally disgraced by its swollen personalities and bottomless, ignorant pockets, trying to fashion for itself a true New York kind of glory–one that climbs out of the depths, battered and ugly. This is our redemption.

Lives of the Cannibals: The Spell of the Sexual
Lives of the Cannibals: Rage

Poison in the Ink: The Life and Times of Fridtjof Nansen

In 1906 Santiago Ramón y Cajal and Camillo Golgi shared the Nobel Prize in Physiology or Medicine for their contributions to neuroscience: Cajal for his work that helped lay the foundation for the Neuron Doctrine, and Golgi for the development of his Golgi stain, which was crucial for the work of so many neuroscientists, including Cajal. Unknown to most people, however, is that a Norwegian zoologist named Fridtjof Nansen had declared the independence of the cellular nerve unit a year and a half earlier than Cajal, using the same Golgi stain employed by the Spanish histologist. When Cajal was just beginning to learn about the staining technique from a colleague, Nansen had already published a paper stressing the point.

On October 26, 1892, a crowd gathered for the christening of the Fram, a custom-built ship designed to take Fridtjof Nansen and his crew to the roof of the world. Four years had passed since Nansen had become the first European to cross the interior of Greenland, and he now hoped to win the race to become the first to reach the North Pole.

Among the guests present at the event was Gustaf Retzius, a colleague from Nansen’s early days as a neuroscientist. During a speech made at dinner that night, Nansen turned to Retzius and said that the field of neurobiology, like polar exploration, involved “penetrating unknown regions,” and he hoped one day to return to it.

For all of his good intentions, Nansen never did return, and it would be something he would express regret over many times throughout his life. As he put it, after “…having once really set foot on the Arctic trail, and heard the ‘call of the wild’, the call of the ‘unknown regions’, [I] could not return to the microscope and the histology of the nervous system, much as I longed to do so.”

Those familiar with Nansen probably know him as an arctic explorer and as a world-famous diplomat who was awarded the Nobel Peace Prize in 1922 for his efforts to repatriate refugees after World War I.

But before the arctic expeditions and the humanitarian work, Nansen was a young zoologist interested in biology and the nervous system. He was one of the world’s first modern neuroscientists and one of the original defenders of the idea that the nervous system was not one large interconnected web, but instead was made up of individual cells that Wilhelm Waldeyer would later call “neurons” in his famous 1891 Neuron Doctrine.

From a young age, Nansen was fascinated with nature; he loved its “wildness” and its “heavy melancholy” and he was happiest when he was outdoors. When it came time for Nansen to enter the University of Christiania (now known as the University of Oslo), he chose to major in zoology.

During his first year, Nansen answered a call from his department for someone to visit the arctic and collect specimens of marine life. In 1882, he set off for the east coast of Greenland aboard the sealing vessel Viking on a voyage that would last four and a half months.

The trip was a unique turning point in Nansen’s life. It provided him with his first glimpse of the Arctic and instilled in him the desire to cross Greenland’s icy interior.

“I saw mountains and glaciers, and a longing awoke in me, and vague plans revolved in my mind of exploring the unknown interior of that mysterious, ice-covered land,” Nansen wrote.

Upon his return, the 20-year-old Nansen was offered a post as the curator of the zoological department at the museum of Bergen. Nansen gladly accepted the position. His arctic dreams were put aside and the next six years were spent studying the invertebrate nervous system through a microscope.

One of the greatest difficulties Nansen faced in his research involved staining sections of nerve tissue. With the methods available at the time, the most that could be revealed of a neuron was its cell body, the proximal—and sometimes secondary—branch-like extensions of its dendrites and the initial segments of its thread-like axon.

At around this time, word was circulating that an Italian physician named Camillo Golgi had developed a new staining technique, one that stained only a few nerve cells in a section at a time, but which stained them so thoroughly that they were visible in their entirety.

After catching wind of Golgi’s technique from a colleague, Nansen decided to pay the Italian doctor a visit. Despite arriving unannounced at Golgi’s lab in Pavia, Nansen was surprisingly well received, and under the doctor’s careful tutelage he mastered what would become known as the Golgi stain in only a matter of days.

Upon his return, Nansen applied the Golgi stain to the nerve cells of a primitive fish-like animal called the lancelet. For the first time, Nansen could see clearly all the intricate branches of a neuron’s dendrites and could follow the entire length of an axon before it made contact with another neuron.

Armed with this new tool, Nansen began seeing things that couldn’t be explained by the reticular network theory, the reigning theory at the time for how nervous systems were organized. According to this theory, the nervous system was like a giant mesh net, with nerve impulses—whatever they might be—traveling unimpeded from one area to another.

One of Nansen’s objections to this view was based on a simple anatomical observation. The existence of unipolar neurons, or unipolar “ganglion” cells as they were known at the time, puzzled Nansen and led him to ask a very logical question: how could unipolar neurons exist if nerve cells fused into one another, as was commonly believed? “How could direct combination between the cells be present where there are no processes to produce the combination?”

As their name suggests, unipolar neurons have a single primary trunk that divides into dendrites and an axon once away from the cell body. This is different from the image of neurons that most people are accustomed to, which shows numerous dendrites branching off the cell body at one end and a long thread-like axon, terminating in tiny knobs, at the other.

Other scientists attempted to explain away unipolar neurons by arguing that they were not very common. The closer the nervous system was examined, they said, the fewer unipolar neurons were found, especially in vertebrates like mammals and humans. Nansen remained unconvinced and pointed to invertebrates like lobsters, whose nervous systems are made up almost entirely of unipolar neurons. To Nansen, this was strong evidence that the reticular network theory couldn’t be correct, and in an 1887 paper he made the statement, bold at the time, that “a direct combination between the ganglion cells is…not acceptable.”

Nansen had his own theory about how nerve cells might combine. He proposed that it was in the ‘dotted substance’ (what modern neuroscience calls “neuropil” in invertebrates and “gray matter” in vertebrates) that nerve cells communicated with one another. Nansen went even further, prophetically declaring that this ‘dotted substance’ was the “principal seat of nervous activity” and “the true seat of the psyche.”

In the concluding paragraph of his 1887 paper, Nansen insisted that the dotted substance would no doubt prove to play an essential role in whatever the final function of the nerve cells was determined to be. Unable to resist making one last speculation, Nansen also wrote the following:

“It is not impossible that [ganglion cells] may be the seat of memory. A small part of each irritation producing a reflex action, may on its way through the dotted substance be absorbed by some branches of the nervous processes of the ganglion cells, and can possibly in one way or another be stored up in the latter.”

In this, Nansen was especially farsighted, touching upon what modern neuroscience calls “neuroplasticity,” currently one of the most promising explanations to account for how simple reflexes can undergo modifications that last for minutes at a time and how learning can lead to behavioral changes that can last for a lifetime.

In the spring of 1888, Nansen presented a shortened version of his paper for PhD consideration to a review board in the ceremonial auditorium of Christiania University. In what was described as a heated presentation, Nansen reiterated his firm belief that nerve cells were not fused into a reticular network, that they were instead independent cellular units. Nansen’s conclusions were met with hostility by the review board’s members and he was accused of jumping the gun and getting ahead of his evidence.

Nansen was awarded his degree in the end, but not before one panel member expressed his firm conviction that Nansen’s hypothesis was destined to be forgotten like so many others.

The experience was a taxing one for Nansen. “In the end, there was such a confusion of one thing on top of another…that I believe that had it continued any longer I would have had a nervous breakdown,” he later wrote to a friend. “There was hardly a second to spare; we finished precisely as calculated, but no more.”

In this, Nansen wasn’t exaggerating. He was running out of time. Nansen was scheduled to depart four days after his PhD defense on a cross-country trek across the unexplored interior of Greenland. A long-time dream was finally coming true.

Nansen personally saw to every aspect of the trip. In a plan that critics called dangerous and foolhardy, Nansen proposed to cross Greenland from east to west. It would be a one-way ticket for himself and his team, with no chance of turning back.

“In this way one would burn one’s boats behind one,” Nansen wrote. “There would be no need to urge one’s men on, as the east coast would attract no one back, while in front would lie the colonies on the west coast with the allurements and amenities of civilization.”

It took nearly two months, but in the end Nansen proved his critics wrong and his company of six became the first Europeans to cross the frozen island’s expansive interior.

The Greenland expedition gave Nansen his first taste of international fame. It sealed his reputation as an explorer and ended his career as a zoologist. Wanderlust had found its perfect victim, and soon Nansen was making plans to embark on another first.

For his next adventure, Nansen set his sights on becoming the first to reach the North Pole. In a highly criticized plan, Nansen proposed to freeze the Fram into an ice floe and let it drift along a current that flowed from east to west across the Polar Sea.

But things didn’t turn out quite as Nansen had hoped, and the Fram did not drift close enough to the Pole. In a last ditch effort to salvage the mission, Nansen left the ship, determined to complete the journey on foot. He took with him only one other companion, Hjalmar Johansen, some sled dogs and enough supplies to last three months.

But the harsh conditions and uneven terrain proved to be more than the pair expected, and the two watched helplessly as their original three months stretched on for much longer.

“We built a stone hut, we shot bears and walrus, and for ten months we tasted nothing but bear meat,” Nansen wrote in his journal. “The hides of the walrus we used for the roof of our hut, and the blubber for fuel.”

In the end, a lack of supplies forced the two to turn back before they could reach the North Pole, but they held the record for Farthest North for five years, until 1900.

The Fram voyage was Nansen’s last major expedition. As he grew older, Nansen became increasingly involved in politics, first becoming the Norwegian ambassador to London and then a high commissioner for the newly formed League of Nations. From 1919 until his death in 1930, Nansen was a devoted global humanitarian. In 1920, when nations were still trying to rebuild Europe after the devastation of World War I, Nansen was dispatched by the international organization to direct the repatriation of half a million prisoners of war who had not yet been exchanged. Afterwards, Nansen successfully raised funds for famine relief efforts in Russia on behalf of the Red Cross.

For his success in these two tasks, Nansen was awarded the Nobel Peace Prize in 1922. When presenting him with the award, the Chairman of the Nobel Committee had these words to say about Nansen: “Perhaps what has most impressed all of us is his ability to stake his life time and time again on a single idea, on one thought, and to inspire others to follow him.”

The reference was to Nansen’s humanitarian work, but the same sentiment could have just as easily been applied to his numerous other undertakings. Whether he was navigating uncharted landscapes of ice, introducing compassion to the realm of politics, or defending an unpopular view of the nervous system, Nansen readily staked his reputation and often his life on his beliefs. Any of these tasks could have easily occupied a person for a lifetime, but Nansen tackled each unknown with fresh enthusiasm, and was rewarded in many cases with success.

Poison in the Ink: The Makings of a Manifesto
Poison in the Ink: Visiting Trinity

On the universals of language and rights

Noam Chomsky in the Boston Review:

Thirty-five years ago I agreed, in a weak moment, to give a talk with the title “Language and Freedom.” When the time came to think about it, I realized that I might have something to say about language and about freedom, but the word “and” was posing a serious problem. There is a possible strand that connects language and freedom, and there is an interesting history of speculation about it, but in substance it is pretty thin. The same problem extends to my topic here, “universality in language and human rights.” There are useful things to say about universality in language and about universality in human rights, but that troublesome connective raises difficulties.

The only way to proceed, as far as I can see, is to say a few words about universality in language, and in human rights, with barely a hint about the possible connections, a problem still very much on the horizon of inquiry.

More here.

What If…

Gene Weingarten in the Washington Post:

What if Freud had been a woman?

Sex would not be considered the primary force that drives human behavior. Instead, it would be Fear of Having a Large Behind. All men would be haunted by a condition known as “penis shame.” The mind would not be divided into the Id, the Ego and the Superego but the Shoe-Desire Region, the Weeping Center, and the If-You-Don’t-Know-What-You-Did-Wrong-I’m-Not-Going-to-Tell-You Lobe. Also, sometimes a dried apricot is just a dried apricot.

What if wishes were horses?

Then beggars would ride. But so would everyone else. We would each have, like, 7,000 horses. They would completely paralyze civilization, consuming all vegetable matter in a week or less. Continents would rise several feet, just from accumulated poo. And anytime anyone wished for no more horses, another horse would appear. The world would end in a terrifying, thundering apocalypse of horses, is what would happen.

What if Hitler had beaten us to the bomb?

Humor wäre heutzutage verboten, und Humoristen würde man in der Öffentlichkeit erschiessen.* [*Translation: Humor would be forbidden these days, and humorists would be shot in public.]

What if Shakespeare had been born in Teaneck, N.J., in 1973?

He would call himself Spear Daddy. His rap would exhibit a profound, nuanced understanding of the frailty of the human condition, exploring the personality in all its bewildering complexity: pretension, pride, vulnerability, emotional treachery, as well as the enduring triumph of love. Spear Daddy would disappear from the charts in about six weeks.

What if our thoughts scrolled across our foreheads, like a TV news crawl?

More here.

On Jonathan Strange and Mr. Norrell

Jenny Davidson has an interesting review of Susanna Clarke’s Jonathan Strange and Mr. Norrell in n+1.

“Like something from a fairy tale, three farfetched things had to happen before an 800-page literary fantasy by a British first-time novelist in her forties could shoot to the top of the bestseller lists. First, the success of the Harry Potter books gave credence to the idea that fantasy novels could be purchased by adults with no history of lurking in the sword-and-sorcery aisles at Barnes and Noble. Second, the internet matured as a place where serious readers and writers evaluate books and make recommendations to other readers. Since January, the comic-book writer, best-selling novelist, and influential blogger Neil Gaiman has praised Jonathan Strange and Mr Norrell repeatedly, and partly as a result Clarke’s novel became a top-ten bestseller on Amazon more than a month before publication, with bound galleys reportedly fetching as much as $200 on eBay. The novel also made the Man Booker long-list in England, where the bookmaker William Hill now lists Jonathan Strange as the third favorite, just behind David Mitchell’s Cloud Atlas and Alan Hollinghurst’s The Line of Beauty.

One other condition remained, of course: the novel had to live up to its hype. And it does. Set in a version of early 19th-century England whose history reeks of magic, the novel recounts the numerous adventures of two rival magicians, Jonathan Strange and Mr. Norrell, who try to revive magic in an England threatened by Napoleon from abroad and by social and political unrest at home.”

Secret Society, a new (primarily) music blog

Darcy Argue has a new blog about music, music technology, upcoming shows in New York and the like. Secret Society:

“is the online home of Secret Society, a New York-based big band made up of 18 of my favorite co-conspirators, plus myself as ringleader. I will, of course, bring you news of all our musical endeavors, but I also hope to encourage a community of friends, fans, and colleagues to gather here. I will be making as much of my music available as possible — live recordings, podcasts, scores, rehearsal excerpts, sketches, works in progress, thoughts on the compositional process, etc., with an eye to opening up discussion. I will be posting semi-regularly about music, politics, life in New York, and whatever shiny baubles happen to catch my eye.”

The American Mystical World View

From The Observer (via Cosmic Variance):

64 per cent of people questioned for a recent poll said they were open to the idea of teaching creationism in addition to evolution in schools, while 38 per cent favoured replacing evolution with creationism.

40 per cent of Americans believe God will eventually intervene in human affairs and bring about an end to life on Earth, according to a survey carried out in 2002. Of those believers, almost half thought this would occur in their lifetime with a return of Jesus from heaven.

1 adult American in five believes that the Sun revolves around Earth, according to one study carried out last summer.

80 per cent of Americans surveyed by the CNN TV news network believe that their government is hiding evidence of the existence of space aliens.

More here.

And a few intelligent words from an Episcopalian Bishop:

Intelligent Design is just one more smoke screen. The task of geologists and anthropologists is to study the sources of the life of this world. They should be free to follow wherever their scientific research carries them. If Christianity is threatened by truth, it is already too late to save it. Imagine worshiping a God so weak and incompetent that the Kansas School Board must defend this God from science and new learning. It is pitiful.

More here.

Visual Poetry and Other Beautiful Graphics

From Information Aesthetics:

The newest creation of Boris Müller, famous for his (yearly recurring) poetry visualizations: online, interactive applications that are capable of visualizing textual input as very beautiful graphics, so that every image is the direct representation of a specific text, which can then be used directly as book illustrations.
This year, an entire poem was considered to be a tree-like structure that branches out over the page. Attached to these branches are the words of the poems, represented by leaves. More specifically, particular symbols in a text control the growth of the tree: specific letter-combinations create a new branch, others make it grow stronger. Words are visualised as leaves: the number of letters in a word is represented by the number of spikes on a leaf, whereas the letter sequence in a word also controls the overall shape of a leaf, such as the roundness of the shapes, the length of the spikes and the density of the colour. The size of the leaves depends on the length of the poem.
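
As a rough sketch of the kind of word-to-leaf mapping the description above suggests (this is not Boris Müller’s actual code; every formula here is an invented stand-in), one might write:

```python
# Illustrative word-to-leaf mapping, loosely following the description above:
# spike count from the number of letters, shape parameters from the letter sequence.
# All formulas are invented for illustration and are not Müller's real mapping.
from dataclasses import dataclass

@dataclass
class Leaf:
    spikes: int            # one spike per letter
    roundness: float       # 0..1, derived from the letter sequence
    spike_length: float    # arbitrary scale
    colour_density: float  # 0..1, longer words -> denser colour

def leaf_for_word(word: str) -> Leaf:
    letters = [c for c in word.lower() if c.isalpha()]
    # Reduce the letter sequence to a stable value in [0, 1] (an arbitrary choice).
    seq_value = sum((i + 1) * (ord(c) - ord("a") + 1) for i, c in enumerate(letters))
    unit = (seq_value % 100) / 99 if letters else 0.0
    return Leaf(
        spikes=len(letters),
        roundness=unit,
        spike_length=0.5 + unit,
        colour_density=min(1.0, len(letters) / 12),
    )

print(leaf_for_word("poetry"))
```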

More here.

The Mournful Giant

From The Washington Post:

President Buchanan is reported to have said to President-elect Lincoln as they rode down Pennsylvania Avenue on the latter’s Inauguration Day: “My dear sir, if you are as happy on entering the White House as I shall feel on returning to Wheatland [Buchanan’s Pennsylvania home], you are a happy man indeed.” But Abraham Lincoln did not expect to attain “happiness” in the White House or, as this intellectually energetic book shows, anywhere else. Lincoln’s Melancholy sounds again the half-forgotten, minor-key background music of his life. Joshua Wolf Shenk rejects the notion that Lincoln got over his melancholy under the demands of the presidency; his Lincoln is never too busy to be gloomy. And, drawing on modern studies of depression, Shenk even has a reference — humorous, I think — to “happiness” as a mental disorder.

More here.

Roman à clef

From The Guardian:

For someone whose name has made headlines for the past 40 years, Roman Polanski is a bit of an Artful Dodger when it comes to his own publicity. At the outset it looks as though it will be a harder job to get Bill Sikes to go straight than to get Polanski to talk about his new film Oliver Twist. Since his libel victory over Vanity Fair, he has gone to ground at home in Paris, not even answering requests for interviews from a British press he believes has always had it in for him. I telephone his office and by sheer luck Polanski himself answers. ‘Why should I make an exception for you?’ he asks, in that voice fascinatingly poised between French and Polish. Because he’ll enjoy it, I tell him. ‘Bullshit,’ he replies. Then laughs.

As Charles Dickens knew so well, it’s amazing what a little laughter can do. A week later I am sitting opposite Polanski in L’avenue, a trendy restaurant situated among the Guccis and Chloes of smart Avenue Montaigne, just next door to where he lives with his third wife, the 39-year-old French actress Emmanuelle Seigner, and their two children, Morgane, 12 and Elvis, 7.

‘I am widely renowned, I know, as an evil, profligate dwarf,’ the director wrote in his 1984 autobiography Roman. But that was then. The Polanski I meet is an attractively rumpled family man with a thick head of grey hair, expensively creased linen jacket and trainers. While certainly small, he is slim and agile and, like many people who lost their childhood in the Holocaust, looks much younger than his real age, which is 72.

More here.

Scents and Sensibility

Tim Stoddard in Columbia Magazine:

Nobel Prize–winning molecular biologist Richard Axel followed his nose to the mysteries of smell and cracked the two great problems of olfaction: how the nose recognizes thousands of odors and how the brain knows what it’s smelling.

Slumping into the gray leather couch in his office, one leg draped over the armrest, Richard Axel admits that he was not the first to clone a nose. That distinction belongs to Woody Allen, who in 1973 regenerated a dead tyrant from a disembodied schnoz in the movie Sleeper. Axel, a University Professor of biochemistry, molecular biophysics, and pathology, shrugs and says, “Woody thought of it before me.”

Allen’s comic device has a whiff of scientific plausibility, as Axel recently demonstrated when he and others grew a mouse from a nose. To be accurate, the mouse was a clone, created by removing the genetic material from a nerve cell deep inside another mouse’s nose and injecting it into an empty egg. This elegant experiment was not really an homage to Sleeper, although Axel does refer to the zany movie in lectures on the science of smell. Nor was it a laboratory stunt. It was an important step toward unlocking the mysterious mechanisms of the mammalian olfactory system.

More here.

Joe Louis vs. Max Schmeling

Joyce Carol Oates in the New York Times:

What Margolick has accomplished in “Beyond Glory” is to provide an exhaustively researched background to the Louis-Schmeling rivalry that includes sympathetic portraits of both Joe Louis and Max Schmeling; an examination of racism at home and anti-Semitism in Germany; a look at the predominant role of Jews in professional boxing in the United States; and, interlarded through the text, opinions by just about anyone, from boxing experts and sportswriters to celebrities and ordinary, anonymous citizens, who might have had something to say about Louis or Schmeling that found its way into print, valuable or otherwise. Less cultural criticism than Margolick’s artfully focused “Strange Fruit,” “Beyond Glory” is historical reportage, a heavyweight of a book that is likely to be the definitive chronicle of its subject.

More here.  [This post dedicated to my favorite pugilist, Alan Koenig.]

Liars’ brains make fibbing come naturally

Celeste Biever in New Scientist:

The brains of pathological liars have structural abnormalities that could make fibbing come naturally.

“Some people have an edge up on others in their ability to tell lies,” says Adrian Raine, a psychologist at the University of Southern California in Los Angeles. “They are better wired for the complex computations involved in sophisticated lies.”

He found that pathological liars have on average more white matter in their prefrontal cortex, the area of the brain that is active during lying, and less grey matter than people who are not serial fibbers. White matter enables quick, complex thinking while grey matter mediates inhibitions.

Raine says the combination of extra white matter and less grey matter could be giving people exactly the right mix of traits to make them into good liars. These are the first biological differences to be discovered between pathological liars and the general population.

More here.

Flame-Broiled Whopper: Theo Tait on Salman Rushdie

From the London Review of Books:

Salman Rushdie’s two best books manage both these things – the big political picture and the telling individual detail – in different quantities. Midnight’s Children (1981) is a family story first and a political allegory about India second: a glorious reinvention of the Bombay of Rushdie’s childhood, of his own family stories (‘autobiography re-experienced as fairytale’, as Ian Hamilton put it). The exaggerations and magical touches are rooted in the characters and the story. Shame (1983), a savage satire about Pakistan, is a less personal and less peopled work, with a clear political message at its heart. But both, although baggy and prodigious, were anchored in subjects Rushdie knew intimately. Character and subject, like design and detail, were closely fused and passionately, originally imagined: they created something that could never be broken down into a mere message.

Perhaps understandably, these two great novels seem to have inspired Rushdie with a form of artistic megalomania. Since then, he has roved more freely, played faster and looser, written about anything and everything, and the results have never been as impressive. The Satanic Verses (1988), an interesting book with some brilliant passages, suffered from his belief that he could incorporate everything – from channel-hopping to the Prophet Muhammad’s flight to Medina, from advertising to race relations in Britain, from mountain-climbing to the nature of religious belief – into one all-singing, all-dancing extravaganza. The Moor’s Last Sigh (1995), which was based more squarely in Bombay, was better. And it’s surely no coincidence that his truly terrible last novel, Fury (2001), was an outsider’s view of New York – which begins in superficial imitation of Saul Bellow (ex-wives, big ideas, trying to read the city and the times) and ends in God knows what (serial killers, puppets, ethnic strife in the South Pacific etc).

More here.