Friday, July 22, 2016
Sara Chodosh in Scientific American:
Think about the first time you met your college roommate. You were probably nervous, talking a little too loudly and laughing a little too heartily. What else does that memory bring to mind? The lunch you shared later? The dorm mates you met that night? Memories beget memories, and as soon as you think of one, you think of more. Now neuroscientists are starting to figure out why. When two events happen in short succession, they feel somehow linked to each other. It turns out that apparent link has a physical manifestation in our brains, as researchers from the Hospital for Sick Children in Toronto (SickKids), the University of Toronto and Stanford University describe in this week’s Science. “Intuitively we know that there’s a structure to our memory,” says neuroscientist Paul Frankland, affiliated with both the University of Toronto and SickKids. “These experiments are starting to scratch the surface of how memories are linked in the brain.”
In your brain, and in the brains of lab mice, recollections are physically represented as collections of neurons with strengthened connections to one another. These clusters of connected cells are known as engrams, or memory traces. When a mouse receives a light shock to the foot in a particular cage, an engram forms to encode the memory of that event. Once that memory forms, the neurons that make up the engram are more likely to fire. Furthermore, more excitable neurons—that is, brain cells that activate easily—are more likely to be recruited into an engram, so if you increase the excitability of particular neurons, you can preferentially include them in a new engram. The question was, did that principle apply to two memories that happen close together in time? Neurons in a newly formed memory trace are subsequently more excitable than neighboring brain cells for a transient period of time. It follows then that a memory formed soon after the first might be encoded in an overlapping population of neurons, which is exactly what Frankland and study co-lead author Sheena Josselyn found.
Thursday, July 21, 2016
Julian Baggini in The Guardian:
“The way these cream cakes flaunt themselves,” says saucy Carry On star Barbara Windsor, glaring disapprovingly at a chocolate eclair bursting with whipped cream, “it’s enough to lead a girl astray.” Her frown turns into a giggle. “Given half a chance,” she adds before tucking in gleefully.
Nothing captures the peculiarly moralistic British attitude to food better than this 15-second advert from the 1970s. And if poetry is the art of capturing whole worlds in few words then its immortal slogan “naughty but nice” is greater proof of its author’s artistry than the Booker prize its writer Salman Rushdie would go on to win.
For as long as we can remember, the British have associated delicious food with depraved indulgence. Anything that tastes good has got to be bad for your body, soul or both. The marketing department of Magnum knew this when it called its 2002 limited edition range the Seven Deadly Sins. Nothing makes a product more enticing than its being naughty, or even better, wicked.
Ed Yong in The Atlantic:
In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.
At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.
Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”
Will Davies at the Political Economy Research Centre:
Given that Brexit was an event imagined and delivered from within the Conservative Party, one of the most important analyses of it is Matthew d’Ancona’s examination of how the idea shifted from the party’s margins to its mainstream over the post-Thatcher era. Two things in particular stand out in his account.
Firstly, the political plausibility of Brexit rose as a direct response to Tony Blair’s dogmatic assumption that European integration was a historical destiny, which encompassed the UK. No doubt a figure such as Blair would have discovered a messianic agenda under any historical circumstances. But given he gained power specifically in the mid-90s, he was one palpable victim of the fin de siècle ideology (stereotyped by Francis Fukuyama’s ‘end of history’ thesis, but also present in Anthony Giddens’ ‘Third Way’) that the world was programmed to converge around a single political system.
Neo-conservative faith in violent ‘democratisation’ was Blair’s worst indulgence on this front, but a view of European unification (and expansion) as inevitable was responsible for inciting the Tory reaction within Westminster. Europe could have been viewed as a particular historical path, adopted in view of the particular awfulness of the European 20th century. Instead, in a Hegelian fashion, the idea of Europe became entangled with the idea of ‘globalisation’, and the conservative reaction was to refuse both.
Secondly, Tory Brexiteers view the EU as an anti-market project, which blocks economic freedom. This is also weirdly ahistorical.
Josh Cohen in 1843 Magazine:
When Steve first came to my consulting room, it was hard to square the shambling figure slumped low in the chair opposite with the young dynamo who, so he told me, had only recently been putting in 90-hour weeks at an investment bank. Clad in baggy sportswear that had not graced the inside of a washing machine for a while, he listlessly tugged his matted hair, while I tried, without much success, to picture him gliding imperiously down the corridors of some glassy corporate palace. Steve had grown up as an only child in an affluent suburb. He recalls his parents, now divorced, channelling the frustrations of their loveless, quarrelsome marriage into the ferocious cultivation of their son. The straight-A grades, baseball-team captaincy and Ivy League scholarship he eventually won had, he felt, been destined pretty much from the moment he was born. “It wasn’t so much like I was doing all this great stuff, more like I was slotting into the role they’d already scripted for me.” It seemed as though he’d lived the entirety of his childhood and adolescence on autopilot, so busy living out the life expected of him that he never questioned whether he actually wanted it. Summoned by the bank from an elite graduate finance programme in Paris, he plunged straight into its turbocharged working culture. For the next two years, he worked on the acquisition of companies with the same breezy mastery he’d once brought to the acquisition of his academic and sporting achievements. Then he realised he was spending a lot of time sunk in strange reveries at his workstation, yearning to go home and sleep. When the phone or the call of his name woke him from his trance, he would be gripped by a terrible panic. “One time this guy asked me if I was OK, like he was really weirded out. So I looked down and my shirt was drenched in sweat.”
One day a few weeks later, when his 5.30am alarm went off, instead of leaping out of bed he switched it off and lay there, staring at the wall, certain only that he wouldn’t be going to work. After six hours of drifting between dreamless sleep and blank wakefulness, he pulled on a tracksuit and set off for the local Tesco Metro, piling his basket with ready meals and doughnuts, the diet that fuelled his box-set binges. Three months later, he was transformed into the inertial heap now slouched before me. He did nothing; he saw no one. The concerned inquiries of colleagues quickly tailed off. He was intrigued to find the termination of his employment didn’t bother him. He spoke to his parents in Chicago only as often as was needed to throw them off the scent. They knew the hours he’d been working, so didn’t expect to hear from him all that much, and he never told them anything important anyway.
Jackson Landers in Smithsonian:
Humans have been practicing agriculture for about 10,000 years. But the attine ants of South America (which include the well-known leafcutters) have us beat by a long way. According to a new paper co-authored by entomologist Ted Schultz, curator of ants at Smithsonian's National Museum of Natural History, attine ants, which farm on an industrial scale similar to humans, have been carefully cultivating gardens with a complex division of labor to grow an edible fungus. Schultz's team found that the ants have been doing this far longer than previously believed—up to 65 million years—and that we have much to learn from them. Schultz and his co-authors, led by Sanne Nygaard, Guojie Zhang and Jacobus Boomsma of the University of Copenhagen, conducted an analysis of the genomes of the various species of attine ants as well as the fungus that they cultivate. Their results answer some long-standing evolutionary questions. The 210 species of attine ants, including the 47 species of leafcutters, forage through the forests of Central and South America in search of leaves and other vegetation, which they carve into pieces using their powerful jaws and carry back to their nests. But they never eat the leaves directly. The plant matter is used as a growth medium for certain varieties of edible fungi which Schultz's team says have been cultivated and passed on by generations of ants going back tens of millions of years.
...Humans may have important lessons to learn from the attine ants. We have struggled to protect the survival of our crops for only about 10,000 years. “We're constantly coming up with herbicides or antibiotics to control pests. And the pests are constantly evolving countermeasures against those things,” Schultz says. The most economically important variety of banana became functionally extinct in the 1960s and another variety is heading in the same direction. “Somehow this system with the ants has been in an equilibrium for millions of years,” he adds.
Wednesday, July 20, 2016
Zadie Smith in the New York Review of Books:
Back in the old neighborhood in North West London after a long absence, I went past the local primary school and noticed a change. Many of my oldest friends were once students here, and recently—when a family illness returned us to England for a year—I enrolled my daughter. It’s a very pretty redbrick Victorian building, and was for a long time in “special measures,” a judgment of the school inspection authority called Ofsted, and the lowest grade a state school can receive. Many parents, upon reading such a judgment, will naturally panic and place their children elsewhere; others, seeing with their own eyes what Ofsted—because it runs primarily on data—cannot humanly see, will doubt the wisdom of Ofsted and stay put. Still others may not read well in English, or are not online in their homes, or have never heard of Ofsted, much less ever considered obsessively checking its website.
In my case I had the advantage of local history: for years my brother taught here, in an after-school club for migrant children, and I knew perfectly well how good the school is, has always been, and how welcoming to its diverse population, many of whom are recently arrived in the country. Now, a year later, Ofsted has judged it officially “Good,” and if I know the neighborhood, this will mean that more middle-class, usually white, parents will take what they consider to be a risk, move into the environs of the school, and send their kids here.
If this process moves anything like it does in New York, the white middle-class population will increase, keeping pace with the general gentrification of the neighborhood, and the boundaries of the “catchment area” for the school will shrink, until it becomes, over a number of years, almost entirely homogeneous, with dashes of diversity, at which point the regulatory body will award its highest rating at last.
Jacqueline Howard in the Huffington Post:
The concept of race in such research is “problematic at best and harmful at worst,” the researchers argued in a new paper published in the journal Science on Friday.
However, they also said that social scientists should continue to study race as a social construct to better understand the impact of racism on health.
So what does all this mean? HuffPost Science recently posed that question and others to the paper’s co-author, Michael Yudell, who is associate professor and chair of community health and prevention at the Dornsife School of Public Health at Drexel University in Philadelphia.
Why is it problematic to view race as a biological concept?
For more than a century, natural and social scientists have been arguing about whether race is a useful classificatory tool in the biological sciences — can it elucidate the relationship between humans and their evolutionary history, between humans and their health? In the wake of the U.S. Human Genome Project, the answer seemed to be a pretty resounding “no.”
Richard L. Hasen and Dahlia Lithwick in Slate:
Late last week, Supreme Court Justice Ruth Bader Ginsburg tried to put the controversy over her recent criticisms of presumptive Republican presidential nominee Donald Trump behind her, issuing a written statement of regret and telling NPR’s Nina Totenberg: “I did something I should not have done. It’s over and done with, and I don’t want to discuss it anymore.”
But the issue of judicial speech on political matters is hardly over and done with. It will remain fodder for the 2016 presidential election because Donald Trump criticized Ginsburg, even questioning her mental competence (“her mind is shot”) and calling on her to resign. Many court watchers worry about what might happen if the court is called upon to rule on any kind of election dispute, a scenario that has brought a reprise of calls for her to recuse herself from any Trump-related litigation. And on top of all that, the court itself will soon decide whether to weigh in on a case challenging an Arizona rule that bars judicial candidates from doing the very thing Justice Ginsburg did: openly supporting or opposing a candidate for public office.
Following a colorful inspection of his post-Mythologies reflections on film, what emerges is a portrait of Barthes the fetishist, deriving furtive pleasure from “the insignificant detail, the trivial object, the commonplace element that somehow seems slightly out of place.” The import of Barthes’s relish for the “ticklish detail” or the “obtuse meaning,” however, is not confined to mere fetishism. Watts persuasively argues that Barthes’s eye for the sensuous surface of things—whether the idiosyncratic exactness of Sergei Eisenstein’s mise en scène or Michelangelo Antonioni’s meandering landscape shots—has trenchant consequences for what Watts calls a “micropolitics” of film. This micropolitics, according to Watts, finds expression in a kind of egalitarian cinematic gaze, in “an aesthetic sensibility intent upon exploring the inexhaustible fascination with the ordinary.” This angle identifies the revolutionary potential of a film not strictly in its story, message, or even style, but rather in its “fractions and particles,” in those miniature, fleeting, fortuitous, or seemingly insignificant elements that nonetheless manage to “transmit to viewers . . . new conceptions of being a body, of linking one gesture to another, of moving in space, of being together.” This sensibility, conveyed across several of Barthes’s late writings from the mid- to late-1970s, also marks his repudiation of the fettering protocols of theory and the universalizing claims of abstract science. By this time, he is no longer under the impression that the power of cinema resides “in its capacity to hypnotize or render us passive,” but rather in its “ability to transform the sensory experience of the world around us.” According to Watts, Barthes’s cinema not only gradually became a delectable pastime but also a bridge to the world, and a machine for dreaming of ways to change it.
In 1943, at the age of twenty, Frederick Terna knew that if he survived the war he was going to be a painter. At the time he was an inmate of Terezín, which was not his first concentration camp, nor was it to be his last. In all, Terna was interned at four different camps: first at Lipa, known as Linden bei Deutsch-Brod, from October 3, 1941 to March 1943; then Terezín, known as Ghetto Theresienstadt, to fall of 1944; then Auschwitz to the end of 1944, and finally to Kaufering, a sub-camp of Dachau, outside of Munich, near Landsberg. Terna was born in Vienna but was raised in Prague, where his family moved when he was quite young. He lacked the realist rigor to enter art school at age thirteen. At Terezín, with horror all around him, he found himself composing mental pictures of the barracks, fences, and roads leading through the camp and adjusting them in his mind’s eye for better compositions. His fellow inmates told him he could be killed for his drawings.
Holocaust survivors are remarkable for the mere and obdurate fact of their survival. Each one is an extraordinary case study, even if they achieve little in their life afterward. The children of survivors are notoriously at risk for neuroses of their own, and so on. History’s great crimes reverberate for generations. Terna lost his entire family in the camps.
Editorial in Nature:
Millions of people have spent the past week walking around. Ostensibly, they are playing the online game Pokémon Go and hunting for critters in an ‘augmented reality’ world. But as gamers wander with their smartphones — through parks and neighbourhoods, and onto the occasional high-speed railway line — they are spotting other wildlife, too. Scientists and conservationists have been quick to capitalize on the rare potential to reach a portion of the public usually hunched over consoles in darkened rooms, and have been encouraging Pokémon hunters to snap and share online images of the real-life creatures they find. The question has even been asked: how long before the game prompts the discovery of a new species? It’s not out of the question: success is 90% perspiration after all, and millions of gamers peering around corners and under bushes across the world can create a very sweaty exercise indeed. By definition, each Pokémon hunter almost certainly holds a high-definition camera in their hands. And there is a precedent: earlier this year, scientists reported Arulenus miae, a new species of pygmy devil grasshopper, identified in the Philippines after a researcher saw an unfamiliar insect in a photo on Facebook (J. Skejo and J. H. S. Caballero Zootaxa 4067, 383–393; 2016).
But Pokémon Go players beware. It is one thing to conquer a world of imaginary magical creatures with names like Eevee and Pidgey, and quite another to tangle with the historical complexity of the International Code of Zoological Nomenclature. So, say you do manage to snap a picture of something previously unknown to science — what then? Let Nature be your guide. First, the good news — the Code (we’ll call it that from now on to save on Twitter characters) is now officially with the times, and no longer reliant on the dead trees that were so popular before you were born. Despite grumbles from traditionalists, in 2012 the International Commission on Zoological Nomenclature, which hosts the Code, agreed to embrace online-only media. In doing so, it relaxed its rule that species could be officially named only in printed academic journals. Now, the bad news — if your picture of an unusual butterfly or bird or hippopotamus does look to a friendly online biologist like a new species, then you’ll probably have to go back and catch the beast. (Whisper it, but you might even have to kill it.)
The Problem of Describing Color
If I said – remembering in summer,
The cardinal’s sudden smudge of red
In the bare gray winter woods –
If I said, red ribbon on the cocked straw hat
Of the girl with pooched-out lips
Dangling a wiry lapdog
In the painting by Renoir –
If I said fire, if I said blood welling from a cut –
Or flecks of poppy in the tar-grass scented summer air
On a wind-struck hillside outside Fano –
If I said, her one red earring tugging at her silky lobe,
If she tells fortunes with a deck of fallen leaves
Until it comes out right –
Rouged nipple, mouth –
(How could you not love a woman
Who cheats at the Tarot?)
Red, I said. Sudden, red.
by Robert Hass
from Time and Materials. Poems 1997-2005
Ecco (HarperCollins Publishers), New York, 2007
Tuesday, July 19, 2016
Sophia Efthimiatou in Literary Hub:
He was at a meditation retreat in the Catskills, sitting cross-legged on a big flat rock on the side of a lake, eyes closed, pulse steady, surrounded by chipmunks and beavers and deer and newts, when Patrick Ryan decided he would never again try to write a book.
He had completed seven unpublished novels by then, attempted eight or nine more unfinished ones, all of them shoved away into manuscript boxes that took up as much space in his apartment as a child’s coffin. As he was nearing 40, there was nothing impressive about this activity of his—writing, that is—but a sad compulsion that bordered on the absurd.
It wasn’t that he had experienced no success at all. In two decades of writing nearly every day, he had seen some of his stories published in literary journals, and he had a few close calls with publishers and agents who found value in his work but ultimately could not sell it. One of them, an editor from Simon & Schuster, had even invited him to lunch after reading one of his novel submissions. Ryan walked into that restaurant on a snowy January day thinking that his time had finally come, only to be schooled on the concept of track records.
“I really love it,” the editor told him of his unsolicited manuscript. “It’s depressing but funny, and I think the writing is wonderful. But if I publish it, it’ll be the end of your career.”
Sean Carroll in Preposterous Universe:
The other day I was amused to find a quote from Einstein, in 1936, about how hard it would be to quantize gravity: “like an attempt to breathe in empty space.” Eight decades later, I think we can still agree that it’s hard.
So here is a possibility worth considering: rather than quantizing gravity, maybe we should try to gravitize quantum mechanics. Or, more accurately but less evocatively, “find gravity inside quantum mechanics.” Rather than starting with some essentially classical view of gravity and “quantizing” it, we might imagine starting with a quantum view of reality from the start, and find the ordinary three-dimensional space in which we live somehow emerging from quantum information. That’s the project that ChunJun (Charles) Cao, Spyridon (Spiros) Michalakis, and I take a few tentative steps toward in a new paper.
We human beings, even those who have been studying quantum mechanics for a long time, still think in terms of classical concepts. Positions, momenta, particles, fields, space itself. Quantum mechanics tells a different story. The quantum state of the universe is not a collection of things distributed through space, but something called a wave function. The wave function gives us a way of calculating the outcomes of measurements: whenever we measure an observable quantity like the position or momentum or spin of a particle, the wave function has a value for every possible outcome, and the probability of obtaining that outcome is given by the squared magnitude of the wave function. Indeed, that’s typically how we construct wave functions in practice. Start with some classical-sounding notion like “the position of a particle” or “the amplitude of a field,” and to each possible value we attach a complex number. The squared magnitude of that complex number gives us the probability of observing the system with that value.
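The probability rule Carroll describes is the standard Born rule. As a minimal sketch in conventional notation (not drawn from the new paper), for a discrete observable with possible outcomes x:

```latex
% Born rule: the probability of observing outcome x is the squared
% magnitude of the complex amplitude the wave function assigns to it.
P(x) = |\psi(x)|^2,
\qquad
\sum_x |\psi(x)|^2 = 1 \quad \text{(normalization)}
```

The normalization condition is what lets the squared magnitudes be read as probabilities: they are non-negative and sum to one over all possible outcomes.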
David Sloan Wilson and Joseph Henrich in Evonomics:
Paleontologists tell us that numerous Homo species once roamed the earth, although only Homo sapiens remains. Several Homo species still inhabit the economic world, however — the world as described by traditional economics. The most common is Homo economicus, whose preferences and abilities were described by neoclassical economists a long time ago. More recently, behavioral economists described a new species called Homo anomalous, because it departs from H. economicus in so many ways. Now a brand new species has been discovered by a multi-disciplinary team of scientists. I’ll call it Homo bioculturus and it might well become the one that inherits the world of economics.
Joseph Henrich is one member of the team that discovered H. bioculturus and his new book, The Secret of Our Success: How Culture Is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter, is arguably the best way for the economics profession to learn about it. Henrich is an intellectual Indiana Jones, equally at home slashing through the jungle or conducting lab experiments. He spearheaded the famous “15 Societies Study” that played experimental economics games in traditional societies around the world. He recently moved from the University of British Columbia, where he was jointly appointed in the Departments of Psychology and the Vancouver School of Economics, to Harvard University’s Department of Human Evolutionary Biology.
DSW: Greetings, Joe, and welcome to Evonomics.com.
JH: Hello David! It’s great to be with you.
DSW: First, let me congratulate you on writing such a terrific book. Without attempting to flatter you, it is a tour de force—great fun to read in addition to brimming with ideas—my current favorite book for recommending to others. Second, let me ask you to provide a synopsis for an economically oriented audience.
The fears and the hopes of genetically engineering the human race have been haunting the modern mind for the better part of a century, although only in the last decade have techniques been developed that might give us the power to modify the genomes of human beings at the embryonic stage. Foremost among these has been the CRISPR-Cas9 system — a set of bacterial enzymes first identified in the late 1980s, and during just the last few years harnessed as a gene-editing tool. What sets CRISPR apart from earlier genetic modification techniques is its accuracy and versatility: the enzymes that cut the targeted DNA are guided by short sequences of RNA that can be custom-designed for any site in the genome. Earlier genetic engineering methods required different enzymes to target different locations in the genome, but by using RNA instead, CRISPR makes that targeting process much easier. (Although there are some differences in what the terms “genetic engineering,” “genetic modification,” and “gene editing” mean, they are for the most part interchangeable.)
This new gene-editing tool has rapidly become ubiquitous in molecular biology, with many applications beyond gene therapy. For instance, scientists have used CRISPR to remove retroviral sequences from the genomes of pig embryos in the hope of producing pigs with organs that can be transplanted more safely into humans. Because CRISPR is relatively easy to use, some journalists have even speculated that the technique might lead to the democratization of genetic engineering, with “home hobbyists” using it for who-knows-what. This claim seems overblown. While it is true that CRISPR makes the specific task of editing DNA much easier, there are other technically complicated steps and procedures involved in most forms of genetic engineering.
At this late date Southern literature is only slightly coherent as a marketing label, but if understood as a series of related, competing traditions, it’s vastly more interesting. And like the modern Republican Party, the umbrella of “Southern literature” shades a handful of not always easily alignable interests that often regard each other with suspicion.
For instance, there is the khaki-pants-and-seersucker strain, what you might call the Walker Percy tradition. There is the Faulkner strain, a fabricated country gentleman intent on creating a mythos of the South. (This fabricated gentleman also happens to be Ahab, unfortunately.) And then there is the evangelical degenerate strain, a ménage à trois of booze, Bible thumping, and Foghorn Leghorn. The patron saint of this strain is probably Barry Hannah, most famous for his early story collection Airships and one of the most legitimately weird postwar American writers, at least at the sentence level. Hannah taught writing at the University of Alabama and then at Ole Miss for a sizable chunk of his career, and he was famous for various apocryphal stories related to his drunken outlandishness, both in and out of the classroom. Hannah was himself a character, a presence of folk heroic proportions. (A beloved folk hero, it should be said.) Stories about Hannah were almost as important as stories by him. In his vapor trails he outlined the parameters of his own tradition, and we continue to read his disciples: the Hannahs. (Jim Harrison, who recently died, was, if not a direct disciple, a type of Hannah—a Hannah from Montana, if you will.)
Michael Bible is definitely one of the Hannahs, and this would be the case even if his latest book Sophia didn’t boast a promotional quote from Hannah himself, who’s been dead since 2010.
Dan P. McAdams in The Atlantic:
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage. “It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it. The same feeling perplexed Mark Singer in the late 1990s when he was working on a profile of Trump for The New Yorker. Singer wondered what went through his mind when he was not playing the public role of Donald Trump. What are you thinking about, Singer asked him, when you are shaving in front of the mirror in the morning? Trump, Singer writes, appeared baffled. Hoping to uncover the man behind the actor’s mask, Singer tried a different tack.
“O.K., I guess I’m asking, do you consider yourself ideal company?”
“You really want to know what I consider ideal company?” Trump replied. “A total piece of ass.”
I might have phrased Singer’s question this way: Who are you, Mr. Trump, when you are alone? Singer never got an answer, leaving him to conclude that the real-estate mogul who would become a reality-TV star and, after that, a leading candidate for president of the United States had managed to achieve something remarkable: “an existence unmolested by the rumbling of a soul.”
Jane Brody in The New York Times:
Does this sound like anyone you know?
*Displays a grandiose sense of self, violating social norms, throwing tantrums, even breaking laws with minimal consequences; generally behaves as if entitled to do whatever he wants regardless of how it affects others.
*Shames or humiliates those who disagree with him, and goes on the attack when hurt or frustrated, often exploding with rage.
*Is arrogant, vain and haughty, exaggerates his accomplishments, and bullies others to get his own way.
*Lies or distorts the truth for personal gain, blames others or makes excuses for his mistakes, ignores or rewrites facts that challenge his self-image, and won’t listen to arguments based on truth.
These are common characteristics of extreme narcissists as described by Joseph Burgo, a clinical psychologist, in his book “The Narcissist You Know.” While we now live in a culture that some would call narcissistic, with millions of people constantly taking selfies, spewing out tweets and posting everything they do on YouTube and Facebook, the extreme narcissists Dr. Burgo describes are a breed unto themselves. They may be highly successful in their chosen fields but extremely difficult to live and work with. Of course, nearly all of us possess one or more narcissistic traits without crossing the line into a diagnosable disorder. And it is certainly not narcissistic to have a strong sense of self-confidence based on one’s abilities.
Monday, July 18, 2016
by Paul Braterman
This is an excellent review of an important but difficult subject, and a welcome change from the ill-informed bluster of a Sam Harris, or the limp apologetics of a Karen Armstrong. It is the work of an author who is exceptionally well placed to appreciate the context of the mass of information on which he draws. Lucidly written, it is also a work of broad scholarship (there are more than 500 references and footnotes), and provides an overview of one of the most important developments of our times. Overall, it is a much-needed corrective to the popular view that these times are particularly violent, and that the roots of this violence lie within Islam.
It is also a very disturbing book, and I mean that as a compliment. While fully committed to secular Enlightenment values, Edis recognises that this cannot be the starting position in any worthwhile discussion of committed Islam. Secularism is neither historically inevitable, nor a logical necessity, nor a moral imperative. In his native Turkey, for example, secularism was the founding principle of the modern State, but has lost out to an Islamic pious modernity, whose advocates cannot simply be dismissed as deluded or wicked. Secularism cannot claim to be the more democratic option where it is not what people would prefer. The secular ideal of the rule of an impartial law is not neutral, since it places judges, members of the power elite, as arbiters. Moreover, Edis turns a critical searchlight on the ostensibly secular United States, where he now lives and works, finds echoes there of much of what concerns him about Islam today, and challenges the West's air of injured innocence in the face of violence. Ultimately, he regards Islam as a far smaller peril than a rampant neoliberalism that values individuals only as producers and consumers, sells political influence to the highest bidder, and still sponsors the denial of the world's most urgent problem, global warming. He shows how the rhetoric of the "war on terror" is used, in the West as in his native Turkey, to suppress dissent, and is contemptuous of how western defenders of freedom have accepted the facile and counterfactual narrative of an inherently violent Islam. Most disturbing of all, he critically examines his own Enlightenment assumptions, which his readers, and mine, will generally take for granted. For instance, why do we regard free speech as good? To what extent do our own institutions follow this ideal in practice? And should we not be more aware of the degree of coercion implicit in our own social order?
by Jonathan Kujawa
While I was in graduate school, the film "Trekkies" was released. You can see the trailer here and the full film here. What could easily be mocking is in fact a heartfelt look at a group of people who choose to devote their lives to something they love. After seeing the film, my friends and I semi-seriously suggested that mathematicians would make a great subject for a documentary. We have more than our share of interesting folks. And, like Trekkies, there is an entire subculture.
One corner of that subculture is Mathematical Reviews. An arm of the American Mathematical Society, Math Reviews is a compendium of everything published in mathematics. It was founded in 1940 and contains over three million publications, the earliest published in 1810. What makes Math Reviews invaluable is the reviews. Each research paper, monograph, book, etc., is assigned to a volunteer mathematician who has the expertise to write a review of the work. Short of personal attacks, slander, and the like, the reviewer is pretty much free to write what they choose. The usual thing is to give a summary of the work along with commentary. As a reviewer you might discuss how the results fit into the broader field or highlight aspects of the work that might be of particular interest. Oftentimes it's hard to tell from the title and abstract whether a paper, say, contains needed results. Well-written reviews can save the reader countless hours in the library.
Since reviewers have a free hand, there are plenty of exceptional reviews among Math Reviews' vast collection. Ten years ago my colleague, Kimball Martin, began a compilation of truly great reviews. If you have access to a library with a subscription to Math Reviews, you can read his entire collection for yourself. Some are rave reviews, but there are some real zingers in there as well (see the title of this essay) which I thought the readers of 3QD would appreciate.
With decades' worth of publications, some truly terrible papers have appeared. Reviewers aren't ones to let rubbish slide through. Sometimes it is the mathematics itself that is questionable:
It is hard to imagine in a single paper such an accumulation of garbled English, unfinished sentences, undefined notions and notations, and mathematical nonsense. The author has apparently read a large number of books and papers on the subject, if one looks at his bibliography; but it is doubtful that he has understood any of them.... What is amazing to the reviewer is that such a thing was ever printed.
Not every text containing mathematical formulae or terminology may be considered as a scientific work. Sometimes it is a mere imitation. My impression is that this is exactly the case of the paper under review. The paper deals with some relations between Riemann theta functions, but I have a feeling that the authors have only a rather vague notion about this subject. I doubt that they have read items 1,2,3,6 of their own references. All of the authors' statements are either tautological or false.