One of the better arguments for Britain’s leaving the EU was that it might reinvigorate and liberate national politics, stifled for too long by the absence of real choice at election time. The EU is a legalistic and treaty-based political institution designed to take some of the heat out of domestic politics. That left people complaining that the EU was generating all the heat. Brexit offered the prospect of a return to two-party politics across Britain, squeezing out the minor parties motivated by single-issue grievances. To a remarkable extent, this is what seems to have happened. The electoral map of Britain in 2017 looks very much as it did in 1970, before we joined the EEC. The two main parties now command well over 80 per cent of the popular vote between them, a figure not reached since the election of 1970. Scotland is no longer another country in electoral terms and both Labour and the Tories can claim to speak for all parts of the Union. This was achieved without either of the main parties cleaving to the centre. The choice on offer at this election was real and the voters embraced it.
It may be one of the better arguments, but that still doesn’t make it convincing. No election that results in the prospect of a minority government formed between the Tories and the DUP can be a great advertisement for two-party politics. There are many words that could be used to describe such an arrangement, but reinvigorated and liberated are not among them. British politics feels pinched and insecure after this vote. The prospect of real change hovers in the background but it is still hard to see how we get there from here. This is not 1970. The SNP retains a large bloc of support in Scotland that makes it much more difficult for either of the main parties to forge decisively ahead. Seventy seats in the new Parliament are held by MPs who are not Labour or Tory. In 1970 that figure was 12. Back then, Northern Irish electoral politics were still an extension of what happened on the mainland: the Ulster Unionists, effectively the Northern Irish branch of the Conservative Party, won eight of the 12 seats contested in the province. Now, Westminster politics looks more like an extension of what happens in Northern Ireland. If a return to the early 1970s means British politics being overshadowed by entrenched divisions in Northern Ireland, then here we go again. But that isn’t what Brexit was designed to achieve.
The idea of using the brain as a model of computation has deep roots. The first efforts focused on a simple threshold neuron, which gives one value if the sum of weighted inputs is above a threshold and another if it is below. The biological realism of this scheme, which Warren McCulloch and Walter Pitts conceived in the 1940s, is very limited. Nonetheless, it was the first step toward adopting the concept of a firing neuron as an element for computation.
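The McCulloch-Pitts scheme described above can be sketched in a few lines. This is an illustrative toy, not historical code; the inputs, weights, and thresholds are chosen here purely to show the mechanism.

```python
# A minimal sketch of a McCulloch-Pitts threshold neuron:
# output one value (1) if the weighted sum of inputs reaches a
# threshold, and another (0) if it falls below.

def threshold_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted input sum meets the threshold, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# Example: with these (illustrative) weights and a threshold of 2,
# the neuron behaves like logical AND on two binary inputs.
print(threshold_neuron([1, 1], [1, 1], 2))  # 1: both inputs active
print(threshold_neuron([1, 0], [1, 1], 2))  # 0: sum is below threshold
```

Despite its simplicity, this unit is the building block the later perceptron work elaborates on.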
In 1957, Frank Rosenblatt proposed a variation of the threshold neuron called the perceptron. A network of integrating nodes (artificial neurons) is arranged in layers. The “visible” layers at the edges of the network interact with the outside world as inputs or outputs, and the “hidden” layers, which perform the bulk of the computation, sit in between.
Rosenblatt also introduced an essential feature found in the brain: inhibition. Instead of simply adding inputs together, the neurons in a perceptron network could also make negative contributions. This feature allows a neural network using only a single hidden layer to solve the XOR problem in logic, in which the output is true only if exactly one of the two binary inputs is true. This simple example shows that adding biological realism can add new computational capabilities. But which features of the brain are essential to what it can do, and which are just useless vestiges of evolution? Nobody knows.
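To make the XOR point concrete, here is a hand-wired sketch of a network with one hidden layer in which a negative (inhibitory) weight does the crucial work. The particular weights and biases are illustrative choices, not Rosenblatt's original parameters.

```python
# XOR with one hidden layer of threshold units.
# The -2 weight on the "AND" hidden unit is the inhibitory connection:
# without negative contributions, no single-hidden-layer threshold
# network of this kind could compute XOR.

def step(x):
    """Threshold activation: fire if the net input is non-negative."""
    return 1 if x >= 0 else 0

def xor_network(a, b):
    h_or = step(a + b - 1)       # hidden unit fires for "a OR b"
    h_and = step(a + b - 2)      # hidden unit fires for "a AND b"
    # Output fires for "OR but not AND"; h_and inhibits the output.
    return step(h_or - 2 * h_and - 1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
```

Running the loop reproduces the XOR truth table: the output is 1 only when exactly one input is 1.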
We do know that some impressive computational feats can be accomplished without resorting to much biological realism. Deep-learning researchers have, for example, made great strides in using computers to analyze large volumes of data and pick out features in complicated images. Although the neural networks they build have more inputs and hidden layers than ever before, they are still based on very simple neuron models. Their great capabilities reflect not their biological realism, but the scale of the networks they contain and the very powerful computers that are used to train them. But deep-learning networks are still a long way from the computational performance, energy efficiency, and learning capabilities of biological brains.
The debate about the origins of the “Aryans” and their arrival in India has flared up again, this time triggered by new genetic findings that appear to confirm beyond any reasonable doubt that large numbers of Indo-European herders migrated into the Indian subcontinent about 4,000 years ago. Razib Khan (one of the best-informed and least biased bloggers on this topic) has written about it in detail in several posts, the most recent of which is here. I am not going to go into the genetics or the details; I just want to recap the story in a very simple layperson’s outline and focus mostly on some of the politics around this topic. My basic argument is that the Hindutvadi reaction to the political uses of the “Aryan Invasion Theory” is relatively justified, but taking a stand against population genetics and common sense in the form of a relatively recently concocted (and very unlikely) “Out of India” (OIT) theory is an unfortunate and self-defeating mistake.
The Indo-Europeans who migrated into India were one of several migratory streams that, between 4000 and 2000 BCE, spread in all directions out of the Pontic Steppe (what is now Eastern Ukraine and Southern Russia, north and northeast of the Black Sea). They were cattle-herding, horse-breeding steppe dwellers who, like practically all other human populations, were themselves a product of the layers of human settlement and migration that have woven the intricate net of human racial groups since the first emergence of modern humans in Africa. They were also a very successful, capable and warlike people who had developed light, fast, spoke-wheeled chariots (and possibly the composite bow) that were the wonder weapons of their time.
Some further observations: the two-ness of the moon and Orion is reflected in the basic structure of the poem, by which I refer to the fact that just under half of “September Sky” is between parentheses; what emerges, of course, is that the specific content of that parenthesis isn’t at all parenthetical to the basic situation — rather the parenthesis is a rhetorical or say structural device that enables the juxtaposition of the actual scene and Cendrars’ poem, with the eventual aim of bringing the second to bear on the first — to illuminate it from beyond, we might say, as if the empirical constellation was at one and the same time Cendrars’ “étoile,” which is also to say his missing hand. (This, I’m suggesting, is the poem’s aim; I’m in no position to judge whether or not the aim is realized.) And once again it matters that the shift to the French language and Cendrars’ poem begins not at the beginning of the third stanza but in line two of the second — as with the placement of “A half-moon has risen” at the end of the first stanza, the interlocking of stanzas in this respect turns out to be vital to the poem’s effectiveness. Which is not to insist on that effectiveness, only to invite you to confirm the intuition that “September Sky” would be much less effective than it is if each of the three stanzas were confined to a particular subject matter: deck, half-moon, Cendrars’ poem (and Orion).
One more observation: note how the word “form” at the end of the penultimate line manifestly refuses the alternative word “shape,” which would make a rather close off-rhyme with “sleep” three lines before. (This sort of refusal is something else poems sometimes do.) “Form,” of course, bears an alliterative relation to the pairing “favorite French” of two lines before. And it echoes the word “forme” in Cendrars’ poem. In any case, the wager of “September Sky” is that coming where it does it is a better word than “shape.”
It should be said that Dennett’s overall stance on consciousness is not always entirely clear. His “user-illusion” story comes at the end of the book, and offers a not entirely implausible explanation of why we make the mistake of thinking our conscious thoughts have a reality beyond the material realm. Still, you might well wonder where that leaves consciousness itself, as opposed to our mistaken ideas about it. What about animals, who certainly don’t monitor their own thoughts, and so are surely free of the “user-illusion”, but which one might have supposed could still be conscious for all that? As it happens, Dennett raises the question of animal consciousness at various points in the book, and says he will sort it out in due course, but as far as I could see he never does.
Even more puzzling is a claim Dennett makes in the course of the “user-illusion” story: “We won’t have a complete science of consciousness until we can align our manifest-image identifications of mental states . . . with scientific-image identifications of the subpersonal informational structures and events that are causally responsible for generating the details of the user illusion . . .”. This brought me up short. It is exactly what Dennett’s mainstream philosophical opponents would say. Conscious states are internal brain states that we currently refer to in other ways, but that will in time be identified by science. In the end, I was rather left wondering what the “user-illusion” fuss was all about.
It’s difficult for people who weren’t around at the time to grasp the scale of the Hemingway cult in twentieth-century America. As late as 1965, the editor of The Atlantic could write reverently of scenes from a kind of Ernest Hemingway Advent calendar: “Wine-stained moods in the sidewalk cafés and roistering nights in Left Bank boîtes. Walking home alone in the rain. Talk of death, and scenes of it, in the Spanish sun. Treks and trophies in Tanganyika’s green hills. Duck-shooting in the Venetian marshes. . . . Loving and drinking and fishing out of Key West and Havana.” It was real fame, too, not the thirty-minutes-with-Terry-Gross kind that writers have to content themselves with now. To get close to the tone of it today, you would have to imagine the literary reputation of Raymond Carver joined with the popularity and political piety of Bruce Springsteen. “Papa” Hemingway was not just a much admired artist; he was seen as a representative American public man. He represented the authority of writing even for people who didn’t read.
The debunking, when it came, came hard. As the bitter memoirs poured out, we got alcoholism, male chauvinism, fabulation, malice toward those who had made the mistake of being kind to him—all that. Eventually there came, from his avid estate, the lucrative but not reputation-enhancing publication of posthumous novels. The brand continues: his estate licenses the “Ernest Hemingway Collection,” which includes an artisanal rum, Papa’s preferred eyewear, and heavy Cuban-style furniture featuring “leather-like vinyl with a warm patina.” (What would Papa have said of that!)
In the past decade, the coal-mining region that runs from Ohio to West Virginia has logged nearly 1,000 cases of “black lung disease” plaguing workers who’ve faced prolonged exposure to coal dust. But Senate Republicans have stalled legislation that retired coal miners desperately need to access the healthcare plans and pensions they were promised. Donald Trump became the forty-fifth president of the United States in part based on the claim that he would restore jobs by reviving the nation’s coal industry. But coal embodies capitalism’s most telling paradox: that the most lucrative industries are often the most dangerous. From the time it was first discovered in the United States in 1701 in Chesterfield County, Virginia, coal promised to revolutionize the world of energy and transportation. Yet, coal is responsible for untold damage to the environment and has led to the exploitation of workers—as laborers and assets—stretching back to the age of legalized slavery.
Two and a half decades after coal was discovered in what is now the Richmond Coal Basin, Abraham and Archibald Woolridge leased land from Major Henry Heth to establish a mining firm. Heth, a British émigré to the United States, had settled near the nation’s oldest coal mines in 1759. He fought in the American Revolution as part of the 1st Virginia Regiment, rising through the ranks from captain to major. After the war, Heth became a successful entrepreneur, starting a family business at the Black Heath Coal Pits that was later taken over by his grandson, Confederate general Harry Heth. The end of the transatlantic slave trade and subsequent expansion of the U.S. domestic trade enticed Major Heth to make the most of his access to capital in land and slaves. Since slaves could no longer enter the country legally, planters focused on smuggling them in (which was unpredictable), breeding them (a lengthy process), and renting them. In 1810, Heth placed an ad in the Richmond Enquirer soliciting “30 or 40 able bodied Negro Men, for whom a liberal price will be given” to “be employed in the Coal Mines.” Heth’s holdings grew from twenty slaves in 1801 to 114 by 1812. In addition to the slaves he purchased, Heth would employ scores of enslaved miners rented from nearby merchants at the Black Heath Coal Pits.
Most enslaved persons were subject to harsh punishment and grueling work. Strategies of coercion and intimidation were integral to economic production, especially in the cotton kingdom. Industrial slaves like those operating in Virginia coal mines enjoyed a great deal more flexibility and a different work regimen.
Viruses and their hosts have been at war for more than a billion years. This battle has driven a dramatic diversification of viruses and of host immune responses. Although the earliest antiviral systems have long since vanished, researchers may now have recovered remnants of one of them embedded, like a fossil, in human cells. A protein called Drosha, which helps to control gene regulation in vertebrates, also tackles viruses, researchers report today in Nature. They suggest that Drosha and the enzyme family it belongs to, called RNase III, were the original virus fighters in a single-celled ancestor of animals and plants. “You can see the footprint of RNase III in the defence systems through all kingdoms of life,” says Benjamin tenOever, a virologist at the Icahn School of Medicine at Mount Sinai in New York and lead author of the paper.
Plants and invertebrates deploy RNase III proteins in an immune response called RNA interference, or RNAi. When a virus infects a host, the proteins slice the invader’s RNA into chunks that prevent it from spreading. But vertebrates take a different approach, warding off viruses with powerful interferon proteins — while Drosha and a related protein regulate genes in the nucleus. Then, in 2010, tenOever witnessed an odd phenomenon: Drosha appeared to leave the nucleus of human cells whenever a virus invaded. “That was weird and made us curious,” tenOever says. His team later confirmed the finding, and saw that Drosha demonstrates the same behaviour in cells from flies, fish and plants. To test the hypothesis that Drosha leaves the nucleus to combat viruses in vertebrates, the researchers infected cells that had been genetically engineered to lack Drosha with a virus. They found that the viruses replicated faster in these cells. The team then inserted Drosha from bacteria into fish, human and plant cells. The protein seemed to stunt the replication of viruses, suggesting that this function dates back to an ancient ancestor of all the groups. “Drosha is like the beta version of all antiviral defence systems,” tenOever says.
—for Xulhaz Mannan, LGBT activist murdered in Bangladesh, April 2016
I have heard the summons. The wind tossed my hair and wrestled me down to the earth’s amorous embrace.
I have lain down among the rushes and offered myself to whatever it was within me, calling. Some said don’t.
I went wherever the wind blew me. I fathomed the fall of that abyss, held only by the thought of one I loved—
the arch of his brow, the two-day scruff of his jaw rasping against my cheek, the pulsing veins of his slender limbs.
I have loved my brothers and comrades. I have blessed the new year and painted the town with all the colors of my love.
I have faced the flash of steel, the howl of unholy voices. But it was their eyes, their hard unloving eyes, that undid me.

—Nausheen Eusuf, from Not Elegy, But Eros (NYQ Books, US; Bengal Lights Books, Bangladesh)
Writing in one of Mussolini’s prisons in the 1930s, the Italian Marxist Antonio Gramsci jotted down the fragments that would become his theory of intellectuals. New classes, like the European bourgeoisie after the Industrial Revolution, he proposed, brought with them their own set of thinkers, which he called “organic intellectuals”—theorists, technicians, and administrators, who became their “functionaries” in a new society. Unlike “traditional intellectuals” who held positions in the old class structure, organic intellectuals helped the bourgeoisie establish its ideas as the invisible, unquestioned conventional wisdom circulating in social institutions.
Today, Gramsci’s theory has been largely overlooked in the ongoing debate over the supposed decline of the “public intellectual” in America. Great minds, we are told, no longer captivate the public as they once did, because the university is too insular and academic thinking is too narrow. Such laments frequently cite Russell Jacoby’s The Last Intellectuals (1987), which complained about the post-1960s professionalization of academia and waxed nostalgic for the bohemian, “independent” intellectuals of the earlier twentieth century. Writers like the New York Times columnist Nicholas Kristof attribute this sorry state of affairs to the culture of Ph.D. programs, which, Kristof claims, have glorified “arcane unintelligibility while disdaining impact and audience.” If academics cannot bring their ideas to a wider readership, these familiar critiques imply, it is because of the academic mindset itself.
In his book The Ideas Industry, the political scientist and foreign policy blogger Daniel W. Drezner broadens the focus to include the conditions in which ideas are formed, funded, and expressed. Describing the public sphere in the language of markets, he argues that three major factors have altered the fortunes of today’s intellectuals: the evaporation of public trust in institutions, the polarization of American society, and growing economic inequality. He correctly identifies the last of these as the most important: the extraordinary rise of the American superrich, a class interested in supporting a particular genre of “ideas.”
Two weeks ago, I interviewed Seth Stephens-Davidowitz, author of Everybody Lies, a new book that uses data on America’s Google habits as an insight into our national consciousness.
Two findings from the book dominated the conversation: America is riddled with racist and selfish people, and there may be a self-induced abortion crisis in this country.
But there was plenty more revelatory data in the book that we didn’t cover. So I wanted to follow up with Stephens-Davidowitz to talk about some of the other provocative claims he is making.
I was particularly interested in sexuality and online porn. If, as Stephens-Davidowitz puts it, “Google is a digital truth serum,” then what else does it tell us about our private thoughts and desires? What else are we hiding from our friends, neighbors, and colleagues?
Quantum mechanics is the consummate theory of particles, so it naturally describes measurements and interactions. During the past few decades, as computing has edged into the quantum realm, the theory has been reframed to encompass information, too. What quantum mechanics implies for measurements and interactions is notoriously bizarre. Its implications for information are stranger still.
One of the strangest of these implications refutes the material basis of communication as well as common sense. Some physicists believe that we may be able to communicate without transmitting particles. In 2013 an amateur physicist named Hatim Salih even devised a protocol, alongside professionals, in which information is obtained from a place where particles never travel. Information can be disembodied. Communication may not be so physical after all.
This past April, the early edition of a short article about Salih’s protocol appeared online in the Proceedings of the National Academy of Sciences. Most of the article’s 10 authors were members of the University of Science and Technology of China, at its branches in Shanghai and Hefei. The final author was Jian-Wei Pan, an eminent physicist who has also developed a constellation of satellites for communicating through quantum mechanics. He recently used this network for transmitting entangled particles over a distance of 1,200 kilometers.
One morning in April 2007 journalist Sacha Batthyany was approached by an elderly colleague at the Swiss daily where they both worked at the time. The colleague waved a newspaper clipping in front of him. It was an investigative report entitled, “The Hostess from Hell,” published by a German daily.
Glancing at the headline, Batthyany didn’t understand why he was being shown this article, but then he looked at the picture of the hostess and recognized it immediately. It was Margit, his father’s aunt —someone to whom the family demonstrated the utmost respect and also around whom they tended to tread carefully.
So he started to read the piece. In March 1945, it said, just before the end of World War II, Margit held a large party in the town of Rechnitz on the Austrian-Hungarian border to fete her Nazi friends. She, the daughter and heiress of European baron and tycoon Heinrich Thyssen, and her friends drank and danced the night away.
At the height of the evening, just for fun, 12 of the guests boarded trucks or walked to a nearby field, where 180 Jewish slave laborers who had been building fortifications were assembled. They had already been forced to dig a large pit, strip, and get down on their knees. The guests took turns shooting them to death before returning to the party. The organizer of this operation was Margit’s lover Hans Joachim Oldenberg. Margit’s husband, Count Ivan Batthyany, Sacha’s grandfather’s brother, was also at the party.
Many contemporary historians and specialists in North African history give the impression that the major interest of Ibn Khaldun's work is that it provides us with a complete explanation of the crisis that put an end to the social and economic development of the Maghreb. They argue that the crisis was the result of the gradual invasion of North Africa by nomadic Arab tribes from the east, first the Beni Hilal and then the Beni Solaym. According to C.A. Julien, the most famous specialist in North African history, the Hilalian invasion was “the most important event of the entire medieval period in the Maghreb.”1 It was, he writes, “an invading torrent of nomadic peoples who destroyed the beginnings of Berber organization — which might very well have developed in its own way — and put nothing whatever in its place.”2 It must be stressed at the outset that The Muqaddimah does not provide a systematic account of this crisis, the effects of which were still visible in the fourteenth century. Ibn Khaldun gives no methodical account of the underlying causes of this destructive phenomenon. The Histoire des Berbères describes a series of upheavals and crises, and several unsuccessful attempts to establish a centralized monarchy. But the problem of a Crisis with a capital 'C' is never raised. The Hilalian invasion is not the main theme of The Muqaddimah. Ibn Khaldun refers to it simply as one of the causes of the turmoil.
The encyclopedic Muqaddimah contains a section on methodology, an analysis of political and social structures, and a general synthesis, but basically it does not describe the spectacular collapse which modern historians claim to have discovered. Ibn Khaldun was not studying a major localized event such as an invasion and its aftermath; he makes no systematic distinction between the character of the Maghreb before and after the crisis. But he does make a methodical analysis of the permanent political and social structures that characterized North Africa. And, according to Ibn Khaldun, the arrival of the Hilalian tribes did not alter those structures to any great extent. No space is given to a detailed study of the Hilalian invasion in the systematic and analytic framework of The Muqaddimah or in the Histoire des Berbères, each chapter of which deals with a different dynasty.
The lengthy modern accounts of the Hilalian invasion do not, therefore, derive directly from Ibn Khaldun. It is, of course, quite legitimate to formulate a thesis by collating scattered data. But the theory that the “Arab invasion” was the determining factor in the crisis of medieval North Africa is less than legitimate, as it takes into account only part of the data provided by Ibn Khaldun.
Researchers have shown for years that men tend to be more confident about their intelligence and judgments than women, believing that solutions they’ve generated are better than they actually are. This hubris could be tied to testosterone levels, and new research by Gideon Nave, a cognitive neuroscientist at the University of Pennsylvania, along with Amos Nadler at Western University in Ontario, reveals that high testosterone can make it harder to see the flaws in one’s reasoning.
How might heightened testosterone lead to overconfidence? One possible explanation lies in the orbitofrontal cortex, a region just behind the eyes that’s essential for self-evaluation, decision making and impulse control. The neuroscientists Pranjal Mehta at the University of Oregon and Jennifer Beer at the University of Texas, Austin, have found that people with higher levels of testosterone have less activity in their orbitofrontal cortex. Studies show that when that part of the brain is less active, people tend to be overconfident in their reasoning abilities. It’s as though the orbitofrontal cortex is your internal editor, speaking up when there’s a potential problem with your work. Boost your testosterone and your editor goes reassuringly (but misleadingly) silent.
In a classic study conducted at the University of Wisconsin, college students taking final exams rated their confidence about each answer on a five-point scale, “one for a pure guess” and “five for very certain.” Men and women both gave themselves high scores when they answered correctly. But what happened when they’d answered incorrectly? Women tended to be appropriately hesitant, but men weren’t. Most checked “Certain” or “Very certain” when they were wrong, projecting as much confidence for their bad answers as for their good ones.
72 beats a minute. 4,320 an hour. That’s 103,680 a day, or 37,843,200 a year. Now subtract for the cigarettes, the bourbon, the sleepless nights, the lost weekend in the Poconos. That leaves 567,600,000 taps until the clouds part, the dust bows down, until the little black train comes to take me away, O Lord. All this I calculated this morning, Ash Wednesday, as I sit here under duress—one hand over my heart pledging allegiance, the other drumming furiously at the calculator, while snow drifts down over the empty lawn chairs, the flakes too many to count—and dedicate to Pythagoras, Euclid, and all the other early mathematicians, but mostly to those three overworked draft horses who darkened the stables at Washington Elementary—Miss Keeley, Miss Ramsey, and Miss Loper—whom I thank now for the flash cards, the mountains of homework, and the long-suffering hours I spent at the blackboard, adding columns taller than I was. May they rest in peace, wherever they are.
Workers wheel Ralph Ellison’s coffin to a vault at the Trinity Church Cemetery on 153rd and Riverside Drive in Manhattan: “There’s no room in the ground to be buried.” His mourners follow the pallbearers out of a small, unadorned chapel. Classical music plays faintly from a cassette player. The vaults, about fifteen feet high, look like “oversized pink marble post office boxes in the sunlight.” The George Washington Bridge is visible in the distance, darkly present in the afternoon haze, like a bridge to a world beyond our own. I’m reading an account of Ralph Ellison’s funeral, nine pages typed, hiding in a folder among the 127 boxes of Joseph Mitchell’s extant papers, at the New York Public Library. There is no byline, and it isn’t Mitchell’s prose. I stumble on it during my third day in the archives, sitting under lamplight at a corner desk in the Brooke Russell Astor Reading Room. Mitchell and Ellison’s friendship has never been documented, as far as I know, but here in the preserved debris of Mitchell’s life, Ellison fills an entire folder. Four, in fact. I keep reading: “The ceremony is perfunctory, and except for watching Joe Mitchell comfort Mrs. Ellison, his arms encircling her small body, his sorrowful face bent toward hers, you almost forget someone has died.”
Last March, during the second of six trips to the library, I spent a week reading Mitchell’s papers, attending to the details he also felt drawn to when assembling his portraits, the “scraps and crumbs and odds and ends and bits and pieces.” Mitchell’s stories reveal a writer who had little use for the spectacle at the center of things; he looked in the city’s shadows and at its dark edges for companions who reflected his own attitude toward life and death. So I began by following his threads, wondering if the truth of his life would be found beyond or beneath or adjacent to the legend that has grown up around him, in some as yet unexplored place or with the people surrounding his life—not only in the bound volumes of his work, but in his notes and collected objects, discarded things sacred only to him. My father joined me in the archives to help take photos, and I came back to Durham, North Carolina, with more than nine thousand images, including a photo for every page of Mitchell’s diary notes, all twenty boxes of them.
"Because graphene is so thin, diffusion across it will be extremely fast," Kidambi says. "A molecule doesn't have to do this tedious job of going through all these tortuous pores in a thick membrane before exiting the other side. Moving graphene into this regime of biological separation is very exciting." Dialysis, in the most general sense, is the process by which molecules filter out of one solution, by diffusing through a membrane, into a more dilute solution. Outside of hemodialysis, which removes waste from blood, scientists use dialysis to purify drugs, remove residue from chemical solutions, and isolate molecules for medical diagnosis, typically by allowing the materials to pass through a porous membrane.
Today's commercial dialysis membranes separate molecules slowly, in part due to their makeup: They are relatively thick, and the pores that tunnel through such dense membranes do so in winding paths, making it difficult for target molecules to quickly pass through. Now MIT engineers have fabricated a functional dialysis membrane from a sheet of graphene—a single layer of carbon atoms, linked end to end in a hexagonal configuration like that of chicken wire. The graphene membrane, about the size of a fingernail, is less than 1 nanometer thick. (The thinnest existing membranes are about 20 nanometers thick.) The team's membrane is able to filter out nanometer-sized molecules from aqueous solutions up to 10 times faster than state-of-the-art membranes, with the graphene itself being up to 100 times faster. While graphene has largely been explored for applications in electronics, Piran Kidambi, a postdoc in MIT's Department of Mechanical Engineering, says the team's findings demonstrate that graphene may improve membrane technology, particularly for lab-scale separation processes and potentially for hemodialysis.
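A back-of-the-envelope calculation shows why thickness matters so much. By Fick's first law, steady-state diffusive flux through a membrane scales inversely with its thickness. The diffusivity and concentration values below are illustrative placeholders, not figures from the MIT study; only the two thicknesses (about 1 nm versus about 20 nm) come from the text.

```python
# Sketch of Fick's first law across a membrane: J ≈ D * ΔC / L,
# so flux scales as 1/L. Thinner membrane -> proportionally faster diffusion.

def flux(diffusivity, delta_concentration, thickness):
    """Steady-state diffusive flux (mol / m^2 / s) through a membrane."""
    return diffusivity * delta_concentration / thickness

D = 1e-9   # m^2/s: a typical small-molecule diffusivity in water (assumed)
dC = 1.0   # mol/m^3: concentration difference across the membrane (assumed)

graphene = flux(D, dC, 1e-9)        # ~1 nm graphene membrane
conventional = flux(D, dC, 20e-9)   # ~20 nm, the thinnest existing membranes

print(graphene / conventional)  # thickness alone predicts a ~20x speedup
```

This simple scaling ignores pore tortuosity and pore density, which is why the measured advantage in the study differs from the bare thickness ratio; but it captures the intuition Kidambi describes, that a molecule crossing graphene has almost no distance to travel.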