Sunday, August 30, 2015
Lydia Kiesling in The Millions:
There are a few digs at you, reader, in Purity, Jonathan Franzen’s big new novel. Here’s one buried in the musings of Andreas Wolf, the sociopathic leader of a data-dumping transparency project — one analogous to but at odds with WikiLeaks: “The more he existed as the Internet’s image of him, the less he felt like he existed as a flesh-and-blood person. The Internet meant death.” Have you read a take or a tweet excoriating Jonathan Franzen? You inhabit a world “governed…by fear: the fear of unpopularity and uncoolness, the fear of missing out, the fear of being flamed or forgotten.”
Ironically, the Internet — the thing with which Franzen’s opprobrium is most frequently associated — is also the vehicle by which his utterances become collectively memorable. The Internet is why I know, for example, that 20 years ago, Franzen expressed anxiety about cultural irrelevance in the type of tone-deaf revelation primed to annoy less-famous writers and destined to become characteristic: “I had already realized that the money, the hype, the limo ride to a Vogue shoot weren’t simply fringe benefits. They were the main prize, the consolation for no longer mattering to the culture.”
No one should be permanently lashed to his or her remarks of decades past, but Franzen, with his frequent public grumping, invites a certain amount of scrutiny. And despite the easy prey of Franzen’s Vogue shoots, that essay, “Perchance to Dream,” published in Harper’s in 1996, contains an artist’s statement that remains the tidiest, most cogent thesis on the project of Franzen’s writing: “It had always been a prejudice of mine that putting a novel’s characters in a dynamic social setting enriched the story that was being told; that the glory of the genre consisted in its spanning of the expanse between private experience and public context.”
Hannah Osborne in Yahoo! News:
Subatomic particles have been found that appear to defy the Standard Model of particle physics. The team working at CERN's Large Hadron Collider have found evidence of leptons decaying at different rates, which could point to undiscovered forces.
Publishing their findings in the journal Physical Review Letters, the team from the University of Maryland had been searching for conditions and behaviours that do not fit with the Standard Model. The model explains most known behaviours and interactions of fundamental subatomic particles, but it is incomplete – for example it does not adequately explain gravity, dark matter and neutrino masses.
Researchers say the discovery of the non-conforming leptons could provide a big lead in the search for non-standard phenomena. The Standard Model concept of lepton universality assumes leptons are treated equally by fundamental forces.
They looked at B meson decays including two types of leptons – the tau lepton and the muon, both of which are highly unstable and decay within just a fraction of a second. The tau lepton and muon should decay at the same rate after mass differences are corrected. But the researchers found small but important differences from the predicted rates of decay.
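Tests of lepton universality in these decays are conventionally framed as ratios of branching fractions. A schematic version (the particular decay channel and symbols here illustrate the general approach and are not taken from the paper itself):

```latex
R(D) \;=\; \frac{\mathcal{B}\!\left(B \to D\,\tau^{-}\,\bar{\nu}_{\tau}\right)}
                 {\mathcal{B}\!\left(B \to D\,\mu^{-}\,\bar{\nu}_{\mu}\right)}
```

After the leptons' different masses are accounted for, the Standard Model pins this ratio down precisely, which is what makes it a clean test: a statistically significant deviation from the predicted value would indicate that some force couples to the tau and the muon unequally.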
Frederic C. Hof in Foreign Policy:
On Aug. 16, Syrian regime aircraft bombed a vegetable market in the rebel-held Damascus suburb of Douma, slaughtering over 100 Syrian civilians and wounding some 300 more. Many of the victims were children; it was one of the deadliest airstrikes of a brutal war. This is far from the first regime-committed atrocity in a Damascus suburb: Exactly two years ago today, Bashar al-Assad’s forces launched a chemical weapons attack in Ghouta, which killed hundreds. In the case of the Douma attack, President Barack Obama’s administration reacted with its usual pantomime of outrage: strong verbal condemnation, condolences for the families of victims, and a plea that the international community “do more to enable a genuine political transition in Syria.”
A genuine political transition in Syria, however, is not right around the corner. Yet every airstrike by President Bashar al-Assad’s regime is fueling radicalization in the Syrian here and now. The only clear winner in the Douma abomination was the pseudo “caliph” of the so-called Islamic State, Abu Bakr al-Baghdadi, a hardened criminal who recruits followers courtesy of the Iranian-sponsored Assad regime’s atrocities and Western complacency. Iran and Assad know exactly what they are doing by bolstering this evil. The West, meanwhile, is complacently unresponsive.
Oliver Sacks has died. As my friend John Ballard has said, "He taught us how to live and die gracefully." John also sent me this article by Sacks which appeared in the New York Times a couple of weeks ago:
In December 2014, I completed my memoir, “On the Move,” and gave the manuscript to my publisher, not dreaming that days later I would learn I had metastatic cancer, coming from the melanoma I had in my eye nine years earlier. I am glad I was able to complete my memoir without knowing this, and that I had been able, for the first time in my life, to make a full and frank declaration of my sexuality, facing the world openly, with no more guilty secrets locked up inside me.
In February, I felt I had to be equally open about my cancer — and facing death. I was, in fact, in the hospital when my essay on this, “My Own Life,” was published in this newspaper. In July I wrote another piece for the paper, “My Periodic Table,” in which the physical cosmos, and the elements I loved, took on lives of their own.
And now, weak, short of breath, my once-firm muscles melted away by cancer, I find my thoughts, increasingly, not on the supernatural or spiritual, but on what is meant by living a good and worthwhile life — achieving a sense of peace within oneself. I find my thoughts drifting to the Sabbath, the day of rest, the seventh day of the week, and perhaps the seventh day of one’s life as well, when one can feel that one’s work is done, and one may, in good conscience, rest.
Put the internet, vintage TV, and C-SPAN's funniest home videos into a blender, and what you pour out might look something like the GIF art of German-American artist Peekasso. A quick glance at his Tumblr melts eyes with an avalanche of strobing fluorescent colors, heavily Photoshopped cultural icons, and ideological statements that range from the subtle and thought-provoking, to the politically incorrect, over-the-top, and unabashedly honest.
Peekasso, whose given name is Peter Stemmler, immigrated to the United States in 1997, and started the successful illustration company Quickhoney with artist Nana Rausch three years later. In 2007, he began putting personal projects on the Peekasso Tumblr, filling it with stylized memes of Spock, Mr. T, and then-Senator Obama. In 2011 he began experimenting with GIFs, "out of boredom," he tells The Creators Project. Here's his very first one. His frenetic GIF art style has developed over the last five years, through hundreds of graphic experiments mixing corporate and political branding, pornography, and nostalgia into a miasma of inside jokes and discomfort that reflects the miasma of online culture. "I like to see myself changing," he says. "I don't mind my old work, but now I'm faster, more secure in my decisions, and more political."
Steven Shapin in The Guardian:
Richard Dawkins has had a wonderful life. He’s been happy in his scientific work on evolution, blessed (if that’s a permissible word) by smooth good looks and contented in his (third) marriage. He’s been given joy by his collaborators and colleagues and taken pleasure in poetry and music, even religious music. He’s collected bouquets of honorary degrees, including one from Valencia, which, he tells us, gave special delight because it came with a “tasselled lampshade” cap, and he has both an asteroid and a genus of fish named after him. Oxford college life has been sweet, and he’s been fulfilled by his role as public intellectual, defender of scientific reason, secular saint and hammer of the godly, switching from the zoology department in 1995 to a new endowed chair which allowed him to work full-time on “the public understanding of science”. His books – from The Selfish Gene (1976), River Out of Eden (1995) and The God Delusion (2006) to the first volume of his autobiography An Appetite for Wonder (2013) – have been successful, well-received, and, as Dawkins proudly notes, are all still in print. They have sold extraordinarily well – more than 3m copies of The God Delusion alone – making their author comfortably off as well as famous. According to the notions he coined, both selfish genes and memes want to make lots of copies of themselves, but there must be some genes or memes that haven’t been as successful as Dawkins himself.
Where once the humanists and philosophers were cocks of the cultural walk, now Dawkins can claim without argument that there are “deep philosophical questions that only science can answer”. There are no mysteries, just as-yet-unsolved scientific problems: “Life is just bytes and bytes and bytes of digital information.” The culture wars are over; science has won and Dawkins is confident that he has played a non-trivial role in that victory. Surveying the enormous change in the public prestige of science since CP Snow’s The Two Cultures (1959), he takes satisfaction that his books have been “among those that changed the cultural landscape”. Snow complained that, for some unfathomable reason, scientists were not counted as “intellectuals”. That has all changed. In 2013, readers of Prospect magazine voted Dawkins the world’s “top thinker”.
Saturday, August 29, 2015
Elena Fagotto and Archon Fung in Boston Review:
Americans eat out more than ever before, and their waistlines are showing it. Restaurant foods pack more calories than most patrons imagine—a single entrée or shake can contain as many as 2,000 calories—contributing to the epidemic of obesity, which affects a third of the adult population. Will information help Americans to take better care of themselves? We will soon find out. Due to new regulations, by the end of 2015, calorie counts will appear on the menus and menu boards of large restaurant chains, grocery stores, and even movie theaters.
The calorie-disclosure rule is just one of the recent attempts at legislating transparency in the hope of changing behavior without resorting to more invasive and politically difficult regulatory approaches such as banning products or setting specific product standards. For instance, police departments in Seattle, Phoenix, and Albuquerque have deployed body cameras to reduce police violence, and in December of last year, the White House called for funding to purchase an additional 50,000. Faith in cameras seems well placed: after Rialto, California, adopted cameras, the use of force by police officers dropped by almost 60 percent and complaints declined by almost 90 percent. Transparency has also been used to inspire resource conservation. U.S. utility companies have found that by sending customers information about how their energy usage compares to their neighbors’, they can induce those customers to cut down. In another example, the incidence of food-borne illnesses decreased in Los Angeles after local laws began requiring restaurants to post cleanliness scores they received from hygiene inspections. And thanks to other disclosure requirements, you can learn about school performance, local water quality, crime levels on university campuses, and vehicle safety. The Supreme Court and many others have looked to disclosure as a bulwark against the corrosive effect of money on our democratic political institutions. The applications of transparency seem boundless, its promise to empower consumers and citizens and to discipline corporations and governments considerable.
But more information does not always make things better. Where there is a glut of information, it is often ignored. Worse still, it can be misused and cause harm.
Italo Calvino in The New York Review of Books:
I went to the cinema in the afternoon, secretly fleeing from home, or using study with a classmate as an excuse, because my parents left me very little freedom during the months when school was in session. The urge to hide inside the cinema as soon as it opened at two in the afternoon was the proof of true passion. Attending the first screening had a number of advantages: the half-empty theater (it was as if I had it all to myself) would allow me to stretch out in the middle of the third row with my legs on the back of the seat in front of me; the hope of returning home without anyone finding out about my escape, in order to receive permission to go out once again later on (and maybe see another film); a light daze for the rest of the afternoon, detrimental to studying but advantageous for daydreaming. And in addition to these explanations that were unmentionable for various reasons, there was another more serious one: entering right when it opened guaranteed the rare privilege of seeing the movie from the beginning and not from a random moment toward the middle or the end, because that was what usually happened when I got to the cinema later in the afternoon or toward the evening.
Italian spectators barbarously made entering after the film already started a widespread habit, and it still applies today. We can say that back then we already anticipated the most sophisticated of modern narrative techniques, interrupting the temporal thread of the story and transforming it into a puzzle to put back together piece by piece or to accept in the form of a fragmentary body. To console us further, I’ll say that attending the beginning of the film after knowing the ending provided additional satisfaction: discovering not the unraveling of mysteries and dramas, but their genesis; and a vague sense of foresight with respect to the characters. Vague: just like soothsayers’ visions must be, because the reconstruction of the broken plot wasn’t always easy, especially if it was a detective movie, where identifying the murderer first and the crime afterward left an even darker area of mystery in between. What’s more, sometimes a part was still missing between the beginning and the end, because suddenly while checking my watch I’d realize I was running late; if I wanted to avoid my family’s wrath I had to leave before the scene that was playing when I entered came back on.
Kieran Healy over at his website:
Abstract: Seriously, fuck it.
As alleged virtues go, nuance is superficially attractive. Isn’t the mark of a good thinker the ability to see subtle differences in kind or gracefully shade the meanings of terms? Shouldn’t we cultivate the ability to insinuate overtones of meaning in our concepts? Further, isn’t nuance especially appropriate to the difficult problems we study? I am sure that, like mine, your research problems are complex, rich, and multi-faceted. (Why would you study them if they were simple, thin, and one-dimensional?) When faced with problems like that, a cultivated capacity for nuance might seem to reflect both the difficulty of the topic and the sophistication of the researcher approaching it. I am sure that, like me, you are a sophisticated thinker. When sophisticated people like us face this rich and complex world, how can nuance not be the wisest approach?
It would be foolish, not to say barely comprehensible, for me to try to argue against the idea of nuance in general. That would be like arguing against the idea of yellow, or the concept of ostriches. It does not make much sense, in any case, to think of nuance as something that has a distinctive role all of its own in theory, or as something that we can add to or take away from theory just as we please. That is a bit like the author whom Mary McCarthy described busily revising a short story in order to “put in the symbols” (Goodman 1978, 58). What I will call “Actually-Existing Nuance” in sociological theory refers to a common and specific phenomenon, one most everyone working in Sociology has witnessed, fallen victim to, or perpetrated at some time. It is the act of making—or the call to make—some bit of theory “richer” or “more sophisticated” by adding complexity to it, usually by way of some additional dimension, level, or aspect, but in the absence of any strong means of disciplining or specifying the relationship between the new elements and the existing ones. Theorists do this to themselves and demand it of others. It is typically a holding maneuver. It is what you do when faced with a question that you do not yet have a compelling or interesting answer to. Thinking up compelling or interesting ideas is quite difficult, and so often it is easier to embrace complexity than cut through it.
World War I was the greatest empire slayer of all time. Down went the Ottoman Empire, ruling from Bosnia to Basra. Hapsburg shrank into tiny Austria. Germany and Russia remained largely intact, but Wilhelm II ended up in exile, while the Romanovs were murdered by the Bolsheviks. Exit sultans and kaisers; enter authoritarians and totalitarians.
The irony can’t be topped. All four dynastic regimes went to war for the usual reasons: security, power and possession — as did democratic France, Britain and the United States. But beset by indomitable nationality and class conflicts, they also fought for sheer regime survival, following Henry IV’s counsel, in Shakespeare’s words, to “busy giddy minds with foreign quarrels.”
It was a momentous miscalculation that would transform 20th-century history. Had the old despots been gifted with foresight, they would have opted for peace über alles.
This is the takeoff point for Dominic Lieven’s book “The End of Tsarist Russia.” The tomes on the Great War fill a small library by now. Since history is written by the victors, the first batch fingered the German Reich as starring culprit; later works spread out along an explanatory spectrum that ranged from inevitability to contingency.
It is almost impossible to overstate the significance of the scientific revolution. As David Wootton’s masterly The Invention of Science shows, it was nothing less than the triumph of the future over the past. Before it, Aristotle had been the leading authority on nature and philosophers had sought above all to recover the lost culture of the ancients. Afterwards, the idea that new knowledge was possible had become axiomatic.
According to Wootton, who is anniversary professor of history at the University of York, modern science was invented between 1572, when the astronomer Tycho Brahe saw a new star in the sky (proof that the heavens could change), and 1704, when Isaac Newton published his book Opticks, which drew conclusions on the nature of light, based on experiments. Everything changed within those decades — even, Wootton contends, the very language used to understand the world. Indeed, one of the premises of The Invention of Science is that “a revolution in ideas requires a revolution in language”.
Take the word “discovery.” Wootton argues that when Christopher Columbus discovered America in 1492, he didn’t have a word to describe what he had done. The nearest Latin verbs were invenio (find out), which Columbus used, reperio (obtain), which was employed by Johannes Stradanus in the title of his book of engravings depicting the new discoveries, and exploro (explore), which Galileo used to report his sightings of Jupiter’s moons.
How, the novel asks, do we make sense of an era in which information has become a burden? "Like the old politburos," Franzen observes, "the new politburo styled itself as the enemy of the elite and friend to the masses, dedicated to giving consumers what they wanted, but to Andreas … it seemed as if the Internet was governed more by fear: the fear of unpopularity and uncoolness, the fear of missing out."
That this is anathema to his characters should go without saying; a deep loneliness pervades the book. Sometimes, the loneliness is shared, as in Tom's relationship with Anabel, who restricts herself until they are a miserable society of two. Sometimes, it is cultural, as when Purity, or Pip, as she is known, goes to work for Andreas, only to find herself surrounded by true believers, as if his mission were a source of faith.
Always, there are bad mothers (the bad mother is a staple of Franzen's fiction): Andreas', Purity's, Tom's. "He watched as a strange thing happened in her face," Franzen writes about the first of these women, "a subtle but crazy-looking modulation of expressions, some interior struggle made visible — her fantasy of being a loving mother, her resentment at the bother of it." The description brings to mind Enid Lambert, the damaged matriarch of "The Corrections," and her struggles with the children she by turns torments and loves.
Laila Lalami in The New York Times:
Ghosh, the author of seven previous novels and five books of nonfiction, is a writer with a passion for language. He doesn’t simply create a world, he delights in giving it the truest words. In his prose, a half-Indian, half-Chinese opium addict sounds completely different from an upper-class Parsee widow, who in turn sounds completely different from a British woman who has been raised in India. (In one amusing episode, the British woman comes up with an impressive array of terms for sex organs or sex acts. She calls a slack penis “a sleeping bawhawder,” masturbation is referred to as “soaping the sepoy” and cunnilingus becomes “making a chutney.”)
Ghosh’s novel is also concerned with how the nascent free trade of the region has brought about a major conflict, which is resolved through military force. Watching one of the battles of the Opium War, Neel wonders: “How was it possible that a small number of men, in the span of a few hours or minutes, could decide the fate of millions of people yet unborn? How was it possible that the outcome of those brief moments could determine who would rule whom, who would be rich or poor, master or servant, for generations to come?”
Friday, August 28, 2015
Sam Anderson in The New York Times Magazine:
In normal everyday life, time is a liquid that flows around us all unceasingly — a kind of existential syrup. During election season, this syrup is captured, boiled down, dehydrated and separated into its constituent grains — grains that we like to call, without fail, ‘‘moments.’’ Thus Donald Trump is (according to The New York Times) ‘‘the man of the moment,’’ and although he was briefly ‘‘out-Foxed’’ (according to The Belfast Telegraph) ‘‘by a Megyn moment’’ (Jim Rutenberg's coinage in this magazine), he went on to recover with a ‘‘big, symbolic moment’’ (according to Mark Halperin on the ‘‘Today’’ show) at the Iowa State Fair. Hillary Clinton, meanwhile, might be having an ‘‘Al Capone moment’’ with regard to the legality of her private email server (The Washington Post), but she still found time to have a ‘‘celebrity moment’’ (The Times again) by taking a selfie with Kim Kardashian and Kanye West. No nexus of events is too large or heterogeneous — no geopolitical weather too swirlingly turbulent — to avoid being reduced to the shorthand of the moment.
As the election grinds on, the names attached to such moments will change. Marco Rubio might succeed Trump as the official man of the moment; Al Gore might have his Lazarus moment. The only thing we can be certain of is that the moments will arrive, incessantly, and that when they do, they will be collected, labeled neatly and displayed for public consumption. We are living in the moment moment.
Modern media-saturated humans didn’t invent the concept of the moment, of course. Our obsession with it probably goes back, as most modern obsessions do, to the ancient Greeks. The Greeks had at least two different notions of time: chronos (the vast, inhuman, infinite stretch of time) and kairos (the moment). A boring old hour — 15 degrees on the sundial, 60 soulless ticks of the clock — is a little patch of chronos. Kairos, on the other hand, is where the magic happens: those decisive instants in which the world suddenly changes. Kairos is significant time, charged time, heavenly time. It transcends calendars, soaks everything in meaning.
Isabel Ortiz in the LA Review of Books:
MAD MAX: FURY ROAD does not look like an animated movie. Throughout the film, Charlize Theron’s Imperator Furiosa is gorgeously sweaty, her skin is granularly textured, and her face is paint-smeared. Tom Hardy as Mad Max sports levels of shaggy micro-scruff that no cartoon character could ever achieve. Large, fleshy burns dot his arms, and his face appears red, coarse, and scorched. Ash collects on everyone’s arm hairs, and oil stains their cheeks. In fact, one of the primary joys of watching the movie is that such painstaking attention has been paid to the little details that make moving human flesh look like moving human flesh: chests heave, lips parch, brows sweat, clothes itch, hands burn.
And yet, despite the oozing corporeality of the film’s protagonists, many critics have commented on its resemblance to a cartoon. These resonances are often spoken of as more structural — perhaps even spiritual — than they are stylistic. In his New York Times review of Fury Road, A.O. Scott cites the film’s debt to Chuck Jones’s Road Runner cartoons as “models of ingenuity and rigor,” while Richard Brody in The New Yorker refers to the movie as a “Rube Goldberg contraption set to the speed of a ‘Road Runner’ cartoon.” A.A. Dowd of the A.V. Club praises the movie’s “Tex Avery touches,” which he attributes to the limitless imagination of its director, George Miller, who previously worked in animation.
Omar Waraich in Caravan:
ON 3 SEPTEMBER 1939, Subhas Chandra Bose was addressing a rally of 200,000 people by the oceanfront in the city now known as Chennai. The ambitious Congress leader had acquired an impressive national following. He could draw similar-sized crowds throughout the country, luring them with his charismatic style and his uncompromising demands for Indian freedom. During the speech, a member of the audience thrust an evening paper into his hand. Bose paused to glimpse the headline on the front page. War had broken out in Europe. It was the event that he, an inveterate opponent of the British, had eagerly anticipated. The moment, he would go on to write in his memoir, offered Indians “a unique opportunity for winning freedom.”
In India, Bose was a distinguished leftist with pronounced views on equality. He regarded Mohandas Gandhi’s wing of the party as too weak and too right-wing for his taste. Earlier that year, he had formed his own faction within the party, the Forward Bloc, to break Gandhi’s grip on the Congress and steer it in a more progressive direction. When it came to the wider world, however, Bose was an ultranationalist. For years, he had been busy ingratiating himself with Europe’s foremost fascists. He met Benito Mussolini multiple times in Italy, a fact he advertised with pride, provoking cringes from Jawaharlal Nehru who suspected Bose fancied himself a local variant of the Duce.
Nehru had travelled to Spain during the country’s civil war to express solidarity with the republican cause. In London, he spoke at anti-fascist rallies alongside leading British socialists. Bose, who by this time had developed a weakness for military uniforms, was unbothered by the character of the Italian and German regimes. Getting the British out was all that mattered to him. A fascist victory in Europe, he hoped, would break up the British Empire to finally deliver the dream of Indian independence.
Over the next ten months, Bose addressed hundreds of rallies like the one in Madras. He openly agitated for a British defeat. Much to Bose’s dismay, other Indian leaders didn’t share his enthusiasm. After some wavering, Gandhi came out on the side of the British. He had earlier urged the British people to resist Hitler only through “spiritual force,” while counselling the German leader, in an undelivered letter addressed to “my friend,” to discover the virtues of peace.
Michiko Kakutani in The New York Times:
Fans of Stieg Larsson’s captivating odd couple of modern detective fiction — the genius punk hacker Lisbeth Salander and her sometime partner, the crusading investigative journalist Mikael Blomkvist — will not be disappointed by the latest installment of their adventures, written not by their creator, Stieg Larsson (who died of a heart attack at the age of 50 in 2004), but by a Swedish journalist and author named David Lagercrantz. Though there are plenty of lumps in the novel along the way, Salander and Blomkvist have survived the authorship transition intact and are just as compelling as ever. “The Girl in the Spider’s Web” finds the pair drawn into the case of the enigmatic computer scientist Frans Balder: a prominent expert in artificial intelligence who’s become ensnared in a global intrigue involving the Swedish Security Police (Sapo), the Russian mob, Silicon Valley industrial spies and United States national security interests.
Mr. Lagercrantz’s efforts to connect unsavory doings in Sweden to machinations within America’s National Security Agency are strained and fuzzy — a bald attempt to capitalize on Edward J. Snowden’s revelations about the agency and the debate over its surveillance methods. But then, readers weren’t smitten by “The Girl With the Dragon Tattoo” because of its plotting (which relied heavily on straight-to-video serial-killer-movie clichés), its plausibility or Larsson’s anti-authoritarian politics. They were smitten with that novel and its two sequels — “The Girl Who Played With Fire” and “The Girl Who Kicked the Hornet’s Nest” — because of the fierce charm of Salander and Blomkvist, and their unlikely chemistry. And because Larsson was so adroit at conjuring a moody, noirish Sweden that turned the stereotype of a clean, bright Scandinavia (where people drive Volvos and buy Ikea furniture) back into a land of long winters, haunted by the ghosts of Strindberg and Bergman.
Adrienne LaFrance in The Atlantic:
Most of the time, people don’t actively track the way one thought flows into the next. But in psychiatry, much attention is paid to such intricacies of thinking. For instance, disorganized thought, evidenced by disjointed patterns in speech, is considered a hallmark characteristic of schizophrenia. Several studies of at-risk youths have found that doctors are able to guess with impressive accuracy—the best predictive models hover around 79 percent—whether a person will develop psychosis based on tracking that person’s speech patterns in interviews.
A computer, it seems, can do better.
That’s according to a study published Wednesday by researchers at Columbia University, the New York State Psychiatric Institute, and the IBM T. J. Watson Research Center in the Nature Publishing Group journal Schizophrenia. They used an automated speech-analysis program to correctly differentiate—with 100-percent accuracy—between at-risk young people who developed psychosis over a two-and-a-half year period and those who did not. The computer model also outperformed other advanced screening technologies, like biomarkers from neuroimaging and EEG recordings of brain activity.
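The article doesn't detail the program's mechanics, but the kind of sentence-to-sentence coherence tracking it describes can be sketched in miniature. The toy below uses bag-of-words cosine similarity between consecutive sentences as a rough stand-in for the latent-semantic measures such research actually relies on; every function name and example sentence here is invented for illustration:

```python
# Toy sketch of "semantic coherence" between consecutive sentences,
# a simplified stand-in for the automated speech analysis described above.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def min_coherence(sentences: list[str]) -> float:
    """Minimum first-order coherence: the weakest semantic link
    between any pair of consecutive sentences."""
    vecs = [Counter(s.lower().split()) for s in sentences]
    return min(cosine(u, v) for u, v in zip(vecs, vecs[1:]))

# A topically connected pair scores higher than a disjointed one.
coherent = ["the dog chased the ball", "the ball rolled under the fence"]
disjointed = ["the dog chased the ball", "quarterly earnings beat forecasts"]
print(min_coherence(coherent), min_coherence(disjointed))
```

A real system would use distributed word embeddings and syntactic features rather than raw word counts, but the underlying intuition is the same: disorganized thought shows up as abrupt drops in similarity between one sentence and the next.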
Monya Baker in Nature:
Don’t trust everything you read in the psychology literature. In fact, two thirds of it should probably be distrusted. In the biggest project of its kind, Brian Nosek, a social psychologist and head of the Center for Open Science in Charlottesville, Virginia, and 269 co-authors repeated work reported in 98 original papers from three psychology journals, to see if they independently came up with the same results. The studies they took on ranged from whether expressing insecurities perpetuates them to differences in how children and adults respond to fear stimuli, to effective ways to teach arithmetic.
According to the replicators' qualitative assessments, as previously reported by Nature, only 39 of the 100 replication attempts were successful. (There were 100 completed replication attempts on the 98 papers, as in two cases replication efforts were duplicated by separate teams.) But whether a replication attempt is considered successful is not straightforward. Today in Science, the team report the multiple different measures they used to answer this question.
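Two of the simpler criteria in that family can be sketched as follows. This is a toy illustration of the general logic only; the paper's actual definitions, thresholds, and data differ, and every number below is invented:

```python
# Toy sketch of two replication-success criteria of the kind such
# projects compare; all study numbers below are invented.

def significant_same_direction(p_rep: float, d_orig: float, d_rep: float,
                               alpha: float = 0.05) -> bool:
    """'Success' = replication is statistically significant
    and its effect points the same way as the original."""
    return p_rep < alpha and d_orig * d_rep > 0

def original_inside_ci(d_orig: float, ci_low: float, ci_high: float) -> bool:
    """'Success' = original effect size falls inside the
    replication's 95% confidence interval."""
    return ci_low <= d_orig <= ci_high

# Invented studies: (p_rep, d_orig, d_rep, ci_low, ci_high)
studies = [
    (0.01, 0.50, 0.45, 0.20, 0.70),   # passes both criteria
    (0.20, 0.50, 0.10, -0.10, 0.30),  # fails both
    (0.03, 0.50, 0.25, 0.05, 0.45),   # significant, but CI excludes original
]

sig_rate = sum(significant_same_direction(p, do, dr)
               for p, do, dr, _, _ in studies) / len(studies)
ci_rate = sum(original_inside_ci(do, lo, hi)
              for _, do, _, lo, hi in studies) / len(studies)
print(f"significance criterion: {sig_rate:.0%}, CI criterion: {ci_rate:.0%}")
```

As the third invented study shows, the criteria can disagree about the same replication, which is exactly why reporting a single "replication rate" understates how judgment-laden the question is.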