Monday, May 25, 2015
"I can't understand why people are frightened of new ideas.
I'm frightened of the old ones."
~ John Cage
Not long after moving to New York around 2000, I picked up an odd little side gig, as a gallery sitter at a space called Engine 27. Taking its name from the decommissioned TriBeCa firehouse which housed it, Engine 27 wasn't your usual art gallery, but rather one that focused exclusively on sound art. It achieved this by meticulously renovating the ground floor of the firehouse into a nearly perfect acoustic environment. Floors, walls and ceilings were treated with rugs and acoustic paneling. Speakers were strategically situated throughout the roughly 2000 square feet; they could be found lurking in corners, or hanging from the ceiling. If you weren't careful you might stub your toe against a subwoofer squatting on a seemingly random patch of floor. Pretty much anything that wasn't already black was painted so, and the lights were kept low. Feeding all the speakers was a basement full of amplifiers, computers and other hardware. It was, to put it mildly, a sound nerd's paradise.
Engine 27 was the brainchild of Jack Weisberg, a self-taught sound engineer who earned his nut innovating approaches to both arena-scale sound and smaller, more high-brow projects. As an example of the latter, he worked with artist-composer Max Neuhaus on the 1978 MoMA iteration of his "Underground" project, which projected sound into the Sculpture Garden from beneath a ventilation shaft. (Neuhaus' Times Square version, sponsored by the Dia Foundation, ran from 1977 to 1992, then was reincarnated ten years later, but, befitting the fragility of sound, is currently ‘temporarily unavailable due to construction'.) Jack was a curmudgeonly fellow and used to getting things done his way. This is perhaps why Engine 27 became an extraordinary space for practicing what some people call "deep listening", which for me is just a tacit admission that we don't listen very closely to much of anything anymore.
Part of what makes good sound art so fascinating is exactly this prerequisite. Perhaps I am being overly optimistic here, though, since our culture, and especially what we consider to be ‘art', is so biased towards the visual. And for the purposes of the current argument – ie, I am sidestepping the question of what differentiates sound from music – the visual bias provides us with the dispensation of a quick scan. The people who speed-walk their way through an art museum will later on assert how great the museum was. They may even have the selfie to prove it. In some minimal way, they would be correct to say that they saw the art, but this is no different from saying that you "saw the grass" while driving down the freeway at 80mph. In this manner a viewer is entirely justified in dismissing an Ad Reinhardt painting as ‘just black' (although ‘none more black' might be more accurate). What else could he or she do, without spending the time needed to let the painting actually unfold before one's eyes, as was Reinhardt's intention?
by Brooks Riley
by Jalees Rehman
"The goal of privacy is not to protect some stable self from erosion but to create boundaries where this self can emerge, mutate, and stabilize. What matters here is the framework— or the procedure— rather than the outcome or the substance. Limits and constraints, in other words, can be productive— even if the entire conceit of "the Internet" suggests otherwise."
Evgeny Morozov in "To Save Everything, Click Here: The Folly of Technological Solutionism"
We cherish privacy in health matters because our health has such a profound impact on how we interact with other humans. If you are diagnosed with an illness, it should be your right to decide when and with whom you share this piece of information. Perhaps you want to hold off on telling your loved ones because you are worried about how it might affect them. Maybe you do not want your employer to know about your diagnosis because it could get you fired. And if your bank finds out, they could deny you a mortgage loan. These and many other reasons have resulted in laws and regulations that protect our personal health information. Family members, employers and insurers have no access to your health data unless you specifically authorize it. Even healthcare providers from two different medical institutions cannot share your medical information unless they can document your consent.
The recent study "Privacy Implications of Health Information Seeking on the Web" conducted by Tim Libert at the Annenberg School for Communication (University of Pennsylvania) shows that we have a far more nonchalant attitude regarding health privacy when it comes to personal health information on the internet. Libert analyzed 80,142 health-related webpages that users might come across while performing online searches for common diseases. For example, if a user uses Google to search for information on HIV, the Centers for Disease Control and Prevention (CDC) webpage on HIV/AIDS (http://www.cdc.gov/hiv/) is one of the top hits and users will likely click on it. The CDC will likely provide solid advice based on scientific results, but Libert was more interested in investigating whether visits to the CDC website were being tracked. He found that when a user visits the CDC website, information about the visit is relayed to third-party corporate entities such as Google, Facebook and Twitter. Because the webpage contains "Share" and "Like" buttons, the URL of the visited page (which contains the word "HIV") is passed on to these companies – even if the user never clicks the buttons.
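The mechanism behind this leakage is ordinary browser behavior: when a page embeds a third-party resource (such as a social-media button script), the browser's request for that resource automatically carries the embedding page's URL in the Referer header. A minimal sketch, using only the Python standard library, of what a third party can read off that header (the CDC URL comes from the article; the helper function is hypothetical):

```python
from urllib.parse import urlparse

def leaked_info(referer_header: str) -> dict:
    """Extract what a third party can learn from the Referer header
    sent automatically when its widget is embedded in a page."""
    url = urlparse(referer_header)
    return {
        "site_visited": url.netloc,
        "page_path": url.path,
        # Sensitive terms like "hiv" are visible directly in the path.
        "terms_in_path": [seg for seg in url.path.lower().split("/") if seg],
    }

# A browser loading a share button embedded on the CDC's HIV page
# would send roughly this Referer header to the button's host:
info = leaked_info("http://www.cdc.gov/hiv/")
print(info["site_visited"])   # www.cdc.gov
print(info["terms_in_path"])  # ['hiv']
```

No click is required: the request fires as part of rendering the page, which is exactly why Libert found the visited URL relayed to third parties by default.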
Sunday, May 24, 2015
Natalie Wolchover in Wired:
In January, the British-American computer scientist Stuart Russell drafted and became the first signatory of an open letter calling for researchers to look beyond the goal of merely making artificial intelligence more powerful. “We recommend expanded research aimed at ensuring that increasingly capable AI systems are robust and beneficial,” the letter states. “Our AI systems must do what we want them to do.” Thousands of people have since signed the letter, including leading artificial intelligence researchers at Google, Facebook, Microsoft and other industry hubs along with top computer scientists, physicists and philosophers around the world. By the end of March, about 300 research groups had applied to pursue new research into “keeping artificial intelligence beneficial” with funds contributed by the letter’s 37th signatory, the inventor-entrepreneur Elon Musk.
Russell, 53, a professor of computer science and founder of the Center for Intelligent Systems at the University of California, Berkeley, has long been contemplating the power and perils of thinking machines. He is the author of more than 200 papers as well as the field’s standard textbook, Artificial Intelligence: A Modern Approach (with Peter Norvig, head of research at Google). But increasingly rapid advances in artificial intelligence have given Russell’s longstanding concerns heightened urgency.
Recently, he says, artificial intelligence has made major strides, partly on the strength of neuro-inspired learning algorithms. These are used in Facebook’s face-recognition software, smartphone personal assistants and Google’s self-driving cars. In a bombshell result reported recently in Nature, a simulated network of artificial neurons learned to play Atari video games better than humans in a matter of hours given only data representing the screen and the goal of increasing the score at the top—but no preprogrammed knowledge of aliens, bullets, left, right, up or down. “If your newborn baby did that you would think it was possessed,” Russell said.
Joseph Brodsky (who would've been 75 today) in the NYRB:
No matter how daring or cautious you may choose to be, in the course of your life you are bound to come into direct physical contact with what’s known as Evil. I mean here not a property of the gothic novel but, to say the least, a palpable social reality that you in no way can control. No amount of good nature or cunning calculations will prevent this encounter. In fact, the more calculating, the more cautious you are, the greater is the likelihood of this rendezvous, the harder its impact. Such is the structure of life that what we regard as Evil is capable of a fairly ubiquitous presence if only because it tends to appear in the guise of good. You never see it crossing your threshold announcing itself: “Hi, I’m Evil!” That, of course, indicates its secondary nature, but the comfort one may derive from this observation gets dulled by its frequency.
A prudent thing to do, therefore, would be to subject your notions of good to the closest possible scrutiny, to go, so to speak, through your entire wardrobe checking which of your clothes may fit a stranger. That, of course, may turn into a full-time occupation, and well it should. You’ll be surprised how many things you considered your own and good can easily fit, without much adjustment, your enemy. You may even start to wonder whether he is not your mirror image, for the most interesting thing about Evil is that it is wholly human. To put it mildly, nothing can be turned and worn inside out with greater ease than one’s notion of social justice, public conscience, a better future, etc. One of the surest signs of danger here is the number of those who share your views, not so much because unanimity has a knack of degenerating into uniformity as because of the probability—implicit in great numbers—that noble sentiment is being faked.
By the same token, the surest defense against Evil is extreme individualism, originality of thinking, whimsicality, even—if you will—eccentricity. That is, something that can’t be feigned, faked, imitated; something even a seasoned impostor couldn’t be happy with. Something, in other words, that can’t be shared, like your own skin—not even by a minority. Evil is a sucker for solidity. It always goes for big numbers, for confident granite, for ideological purity, for drilled armies and balanced sheets. Its proclivity for such things has to do presumably with its innate insecurity, but this realization, again, is of small comfort when Evil triumphs.
Which it does: in so many parts of the world and inside ourselves. Given its volume and intensity, given, especially, the fatigue of those who oppose it, Evil today may be regarded not as an ethical category but as a physical phenomenon no longer measured in particles but mapped geographically. Therefore the reason I am talking to you about all this has nothing to do with your being young, fresh, and facing a clean slate. No, the slate is dark with dirt and it’s hard to believe in either your ability or your will to clean it. The purpose of my talk is simply to suggest to you a mode of resistance which may come in handy to you one day; a mode that may help you to emerge from the encounter with Evil perhaps less soiled if not necessarily more triumphant than your precursors. What I have in mind, of course, is the famous business of turning the other cheek.
Erica Goode in the NYT (image John F. Nash Jr. at his graduation from Princeton in 1950. Credit Courtesy of Martha Nash Legg):
Dr. Nash was widely regarded as one of the great mathematicians of the 20th century, known for the originality of his thinking and for his fearlessness in wrestling down problems so difficult that few others dared tackle them. A one-sentence letter written in support of his application to Princeton’s doctoral program in math said simply, “This man is a genius.”
“John’s remarkable achievements inspired generations of mathematicians, economists and scientists,” the president of Princeton, Christopher L. Eisgruber, said, “and the story of his life with Alicia moved millions of readers and moviegoers who marveled at their courage in the face of daunting challenges.”
Russell Crowe, who portrayed Dr. Nash in “A Beautiful Mind,” tweeted that he was “stunned” by his death. “An amazing partnership,” he wrote. “Beautiful minds, beautiful hearts.”
Dr. Nash’s theory of noncooperative games, published in 1950 and known as Nash equilibrium, provided a conceptually simple but powerful mathematical tool for analyzing a wide range of competitive situations, from corporate rivalries to legislative decision making. Dr. Nash’s approach is now pervasive in economics and throughout the social sciences and is applied routinely in other fields, like evolutionary biology.
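The idea behind the tool is simple to state: a Nash equilibrium is a profile of strategies from which no player can gain by deviating alone. A minimal sketch (not from the article) that finds the pure-strategy equilibria of a two-player game by checking exactly that condition, illustrated on the prisoner's dilemma:

```python
from itertools import product

def pure_nash_equilibria(payoffs_a, payoffs_b):
    """Return all pure-strategy Nash equilibria of a two-player game:
    cells where neither player can improve by unilaterally deviating."""
    rows = range(len(payoffs_a))
    cols = range(len(payoffs_a[0]))
    equilibria = []
    for i, j in product(rows, cols):
        best_row = all(payoffs_a[i][j] >= payoffs_a[k][j] for k in rows)
        best_col = all(payoffs_b[i][j] >= payoffs_b[i][k] for k in cols)
        if best_row and best_col:
            equilibria.append((i, j))
    return equilibria

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect.
A = [[-1, -3],   # row player's payoffs
     [ 0, -2]]
B = [[-1,  0],   # column player's payoffs
     [-3, -2]]
print(pure_nash_equilibria(A, B))  # [(1, 1)] — mutual defection
```

Mutual defection is the unique equilibrium even though both players would be better off cooperating, which is the kind of competitive logic the concept lets economists analyze.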
Harold W. Kuhn, an emeritus professor of mathematics at Princeton and a longtime friend and colleague of Dr. Nash’s who died in 2014, said, “I think honestly that there have been really not that many great ideas in the 20th century in economics and maybe, among the top 10, his equilibrium would be among them.” An economist, Roger Myerson of the University of Chicago, went further, comparing the impact of Nash equilibrium on economics “to that of the discovery of the DNA double helix in the biological sciences.”
Richard Marshall interviews Richard Healey in 3:AM Magazine:
3:AM: So does your pragmatism at work in these two cases mean that we should think of quantum mechanics as a realist or an instrumentalist theory or is it a middle way?
RH: Too often contemporary philosophers apply the terms ‘realism’ and ‘instrumentalism’ loosely in evaluating a position, as in the presumptive insult “Oh, that’s just instrumentalism!” Each term may be understood in many ways, and applied to many different kinds of things (theories, entities, structures, interpretations, languages, ….). I once characterized my pragmatist view of quantum mechanics as presenting a middle way between realism and instrumentalism. But by adopting one rather than another use of the terms ‘realism’ and ‘instrumentalism’ one can pigeonhole my view under either label.
In this pragmatist view, quantum probabilities do not apply only to results of measurements. This distinguishes the view from any Copenhagen-style instrumentalism according to which the Born rule assigns probabilities only to possible outcomes of measurements, and so has nothing to say about unmeasured systems. An agent may use quantum mechanics to adjust her credences concerning what happened to the nucleus of an atom long ago on an uninhabited planet orbiting a star in a galaxy far away, provided only that she takes this to have happened in circumstances when that nucleus’s quantum state suffered suitable environmental decoherence.
According to one standard usage, instrumentalism in the philosophy of science is the view that a theory is merely a tool for systematizing and predicting our observations. For the instrumentalist, nothing a theory supposedly says about unobservable structures lying behind but responsible for our observations should be considered significant. Moreover, instrumentalists characteristically explain this alleged lack of significance in semantic or epistemic terms: claims about unobservables are meaningless, reducible to statements about observables, eliminable from a theory without loss of content, false, or (at best) epistemically optional even for one who accepts the theory. My pragmatist view makes no use of any distinction between observable and unobservable structures, so to call it instrumentalist conflicts with this standard usage.
In this view, quantum mechanics does not posit novel, unobservable structures corresponding to quantum states, observables, and quantum probabilities; these are not physical structures at all. Nevertheless, claims about them in quantum mechanics are often perfectly significant, and many are true. This pragmatist view does not seek to undercut the semantic or epistemic status of such claims, but to enrich our understanding of their non-representational function within the theory and to show how they acquire the content they have.
Frederick Bohrer reviews Catastrophe!: The Looting and Destruction of Iraq’s Past edited by Geoff Emberling and Katharyn Hanson, in the LA Review of Books:
It should surprise no one that the threat to antiquities today — worldwide — is far greater from projects for dams, airports, parking lots, and the rest of the activities of modernization than from targeted wholesale devastation. I mention this because I think it offers a way to specify what, qualitatively, is the nature of the issue raised by ISIS’s actions. The effects of modernization parallel what Rob Nixon, in another context, calls “slow violence”: a gradual but devastating change effected almost invisibly on daily life. By contrast, ISIS purveys a sort of “fast violence”: shocking, theatrical, and easily commodified to the Western (addled, distracted) TV viewer, and highly useful for its own recruiting as well. I turn to the video evidence itself below. But it must first be noted that many Iraqi archaeological sites have already been devastated by slow violence as well, and one that cannot be conveniently relegated to Islamic extremism: looting.
In 2008 the Oriental Institute of the University of Chicago mounted Catastrophe!: The Looting and Destruction of Iraq’s Past, an exhibition and accompanying catalog (the best of its kind) that describes and pictures in horrifying detail the devastations to archaeological sites caused by hordes of looters, large and small. Just its cover photograph is enough to make one cringe, showing looters in 2004 actively digging at the site of the ancient city of Isin, now a blasted wasteland of hundreds of holes in the earth. As is well known, a rapacious worldwide antiquities market, unconcerned with ethics, fuels this looting; governments meanwhile rarely enforce existing laws. This market is one of the largest sources of funding for ISIS itself — another way besides television in which the organization cynically uses global norms for its own purposes. Under the economic sanctions first imposed in 1990, civil conditions in Iraq have been extraordinarily difficult and unemployment high. Looting is one of the few moneymaking opportunities available to local populations (much like drug production in Afghanistan). Thus a CNN correspondent casually mentions that the threat to antiquities also involves “ordinary people just desperate to make a living.” A prominent archaeologist of Iraq told me long ago that deprivation and economic inequality drive farmers to plow up their fields in search of artifacts, as they have little access to seeds, farm equipment, and other necessities.
Richard Friedman in The New York Times:
Americans disapprove of marital infidelity. Ninety-one percent of them find it morally wrong, more than the number that reject polygamy, human cloning or suicide, according to a 2013 Gallup poll. Yet the number of Americans who actually cheat on their partners is rather substantial: Over the past two decades, the rate of infidelity has been pretty constant at around 21 percent for married men, and between 10 and 15 percent for married women, according to the General Social Survey, conducted by NORC, the University of Chicago’s independent research organization. We are accustomed to thinking of sexual infidelity as a symptom of an unhappy relationship, a moral flaw or a sign of deteriorating social values. When I was trained as a psychiatrist, we were told to look for various emotional and developmental factors — like a history of unstable relationships or a philandering parent — to explain infidelity. But during my career, many of the questions we asked patients were found to be insufficient because for so much behavior, it turns out that genes, gene expression and hormones matter a lot. Now that even appears to be the case for infidelity.
We have long known that men have a genetic, evolutionary impulse to cheat, because that increases the odds of having more of their offspring in the world. But now there is intriguing new research showing that some women, too, are biologically inclined to wander, although not for clear evolutionary benefits. Women who carry certain variants of the vasopressin receptor gene are much more likely to engage in “extra pair bonding,” the scientific euphemism for sexual infidelity.
Tom Payne in The Telegraph:
From early 1915, under the “fog of war”, Armenians began to disappear from the Ottoman Empire. It could happen in a number of ways. Sometimes it was a matter of destroying villages and rounding up the inhabitants. Their murderers, Turks or Kurds, were as likely to use bayonets, swords or axes as guns, because they wanted to save bullets. Many left their homes on forced marches, to be attacked by killers, frequently thrown into the Euphrates, the women raped. Those who survived ended up in the Syrian Desert, around or in the town of Der Zor, where they were murdered or starved to death. A German witness noted: “Their stomachs, weakened by months of hunger, are no longer able to absorb any food … If you give them bread, they put it aside indifferently. They lie there quietly and wait for death.” It is impossible to say how many died. Figures begin at 600,000. Even the Ottoman government of 1919 acknowledged that 800,000 were killed. A million is probable, a million and a half possible. One problem in calculating the death toll is that some really did disappear. The slaughterers thought little of killing children, and one commander, Cevdet Bey, bragged before an attack, “I won’t leave one, not one so high,” while holding his hand below knee-height. But at other times children were taken and offered to local Muslims. Even now there are people discovering that their grandparents were survivors of the genocide.
Genocide. To study the numbers, and to hear commanders barking and repeating the orders “Burn, destroy, kill” is to think, what other word is there? And yet the rows continue to impede understanding between Turks and Armenians, and even among Armenians themselves. Those who left their homelands lobby for recognition that what happened was genocide. Those in the now-independent Republic of Armenia are more pragmatic – if they stop asking the Turks to use what negotiators have to call “the G-word”, then maybe the Turks will be calmer about Armenian claims on parts of Azerbaijan. Meanwhile, Turks don’t want to use the G-word because if they do, they fear having to give parts of Turkey to the Armenians.
Saturday, May 23, 2015
Carl Zimmer in the New York Times:
Octopuses, squid and cuttlefish — a group of mollusks known as cephalopods — are the ocean’s champions of camouflage.
Octopuses can mimic the color and texture of a rock or a piece of coral. Squid can give their skin a glittering sheen to match the water they are swimming in. Cuttlefish will even cloak themselves in black and white squares should a devious scientist put a checkerboard in their aquarium.
Cephalopods can perform these spectacles thanks to a dense fabric of specialized cells in their skin. But before a cephalopod can take on a new disguise, it needs to perceive the background that it is going to blend into.
Cephalopods have large, powerful eyes to take in their surroundings. But two new studies in The Journal of Experimental Biology suggest that they have another way to perceive light: their skin.
It’s possible that these animals have, in effect, evolved a body-wide eye.
“The 9/11 Commission Report” is one of the documents discovered on Bin Laden’s bookshelf — no surprise there. What better way to understand one’s enemy than to understand the narratives we hold dear?
Something similar might be said about the dozens of other federal reports in his possession, which range from the practical (applications for both new and reissued passports, instructions on how to register the birth of a U.S. citizen abroad) to the analytical (a 2009 Senate assessment of “the Evolving Al-Qaeda Threat to the Homeland,” a 2005 National Security Council “Strategy for Victory in Iraq”). It makes sense that Bin Laden would find such materials useful, for the insights they offer into our way of thinking, of strategizing, if nothing else.
And yet, I find myself compelled — and in a perverse way, cheered — by another aspect of these holdings, which is what they have to say about American transparency.
The Good Story is a fizzing collection of exchanges, begun in 2008, between JM Coetzee and the psychotherapist Arabella Kurtz. The authors are well matched, able to draw one another out and to nudge each other in moments of complacency. What emerges is a Platonic dialogue with a postmodern twist. Rather than presenting a series of conclusions, the two retain their differences and their uncertainties: “Does this clarify something of the matter or just add to the confusion?” Coetzee writes at one point. At another, he admits: “So, as you can see, I am still stuck”.
Coetzee and Kurtz don’t confine themselves to a single issue. Instead, the book is a freewheeling conversation about psychotherapy, fiction, fantasy, repression and, in a sense that can draw these ideas together, the relationship between subjectivity and truth. It’s because of the authors’ modest intentions — they aim to discuss, not to conclude — that their minds can roam so freely. After discussing the individual’s capacity to repress, for example, they begin a loose but fascinating debate about the way in which groups or nations do the same. Coetzee refers to Australia and apartheid South Africa, Kurtz to her observations of staff in the NHS.
In 2012, in a global game of Chinese Whispers, a single message traveled through seven languages and across six continents, starting in St. Kilda, Melbourne as “Life must be lived as play” (a commonly paraphrased quote from Plato), and ending in Homer, Alaska as “He bites snails.”
According to a Wikipedia entry, the now–politically incorrect name of the popular children’s game (alternately played as Gossip, Broken Telephone, Pass the Message, Operator, and Don’t Drink the Milk), derives from
Westerners’ use of the word Chinese to denote “confusion” and “incomprehensibility” to the earliest contacts between Europeans and Chinese people in the 1600s, and attribute[s] it to Europeans’ inability to understand China’s culture and worldview.
Chinese, it was assumed, like other “foreign” languages, was an incomprehensible one. Common phrases like it’s all Greek to me, mumbo jumbo, gibberish, and double Dutch demonstrate our apprehension of certain foreign languages as impenetrable glossolalia.
It is this assumption of the otherness and obscurity of the foreign in language that Eduard Stoklosinski examines in Another View: Tracing the Foreign in Literary Translation.
Lost Things, Found Hopes
For Nietzsche, hope was the beginning of loss.
But we can be even more radical:
the beginning of anything is the beginning of loss.
We all lose, but some lose more slowly
‘How’s it going?’ we ask mercilessly.
‘Slowly’, we answer, without really knowing.
Losing slowly is what we call winning.
But I, who do not love losing, love to lose myself in the forest.
Especially in forests
of music and breath,
skin and bark.
by Harkaitz Cano
from Malgu da gaua / Flexible is the night
publisher: Etxepare Institutua, San Sebastián, 2014
Julian Barnes in The Guardian:
Some years ago, a journalist friend, posted to Paris by his magazine, became in quick succession the father of two children. As soon as their eyes were able to focus properly, he would take them round the Louvre, tenderly pointing their infant retinas at some of the world’s greatest paintings. I don’t know if he also played classical music to them while they were in their mother’s womb, as some prospective parents do; but I have occasionally found myself wondering how those children will turn out: as potential directors of MoMA – or, perhaps, as adults with no visual sense at all, and a horror of art galleries.
My own parents never tried feeding me culture at an early (or any other) age; neither did they seek to dissuade me from it. They were both schoolteachers, and so the arts – or perhaps, more precisely, the idea of the arts – were respected in the house. There were proper books on the shelves; and there was even a piano in the sitting room – though at no point in my childhood was it actually played. My mother had been given it by her doting father when she was a young, capable and hopeful pianist. Her playing, however, came to a halt in her early 20s when she found herself faced with a difficult piece of Scriabin. She realised, as she repeatedly failed to master it, that she had reached a certain level, which she could never go beyond. She stopped playing, abruptly and finally. Even so, the piano could not be got rid of; it moved house with her, following her loyally into marriage, and maternity, and old age and widowhood. On its regularly dusted top was a pile of sheet music, including that Scriabin piece she had abandoned decades previously.
As for art, there were three oil paintings in the house. Two were of country scenes in Finistère, painted by one of my father’s French assistants.
They were, in a way, as misleading as the piano, since “Uncle Paul”, as he was known, hadn’t exactly painted them en plein air; rather, he had copied – and aggrandised – them from picture postcards. I still have the originals he worked from (one smeared with real paint) on my desk. The third picture, which hung in our hall, was somewhat more authentic. An oil of a female nude, in a gilt frame, it was probably an obscure 19th-century copy of an equally obscure original. My parents had bought it at an auction sale in the outer London suburb where we lived. I remember it mainly because I found it completely unerotic. This seemed very strange, because most other depictions of unclad women had an invigorating effect on me. Perhaps this was what art did: by being solemn, it took the excitement out of life.
...I first began writing about art with a chapter on Géricault’s Raft of the Medusa in my novel A History of the World in 10½ Chapters (1989). Since then, I have never followed any particular plan. But the period from 1850 to 1920 continues to fascinate me, as a time of great truth-speaking combined with a fundamental reexamination of the forms of art. I think we still have a lot to learn from that time. And if I was right as a boy about the dullness of that nude we had at home, I was wrong in my deductions about art’s solemnity. Art doesn’t just capture and convey the excitement, the thrill of life. Sometimes, it does even more: it is that thrill.
Friday, May 22, 2015
Matt Jakubowski in Truce:
Before we discuss your work at Harper’s and The Nation, I’d like to ask about the early years of your career. Were there specific experiences that drew you toward a life in letters, as they say? What convinced you that this was the kind of work you wanted to pursue when you were first starting out?
I had a pretty happy childhood that was clearly divided into Life and Books, the latter being as vivid and immersive for me as the former. My mother is a huge reader and took me to the library every week when I was little; at a certain point she decided to have the bus drop me and my sister off at the local branch after school, because libraries are not just repositories of knowledge but also some of the only places you can stick a latchkey kid without people calling the police.
There were no restrictions on what I could read. My upbringing in a hippied-out racially integrated neighborhood in Philadelphia wasn’t very structured, and I was terrible at sports. I liked to play with the neighborhood kids, dress up, and produce ridiculous plays with my sister. My mother worked a lot but she took us to museums and festivals and children’s concerts on weekends, so I was actively engaged with the world outside of home and school and extremely curious about it.
By the time I finished high school I was pretty done with “being taught.” I went to college to read primary sources and not textbooks. I wasn’t a specialist, and didn’t think learning specific types of methodologies was all that useful. I didn’t want to spend the rest of my life studying minutiae and writing boring papers with colons in their titles, but I did want to continue to learn.
After college most of my friends went to grad school and I went on an adventure, traveling and working abroad. I was a cook and housekeeper in Mallorca and a newspaper editor in Hanoi.
Emily Greenhouse in The Nation:
When Stephen Gaskin passed away last July, his local paper eulogized him as a “tie-dye-clad hippie philosopher, a proud ‘freethinker’” with “crystalline blue eyes.” Those of my generation who are familiar with Gaskin know him as the founder of the Farm, the 44-year-old intentional community in Summertown, Tennessee, where Gaskin’s wife, Ina May, started a movement of authentic midwifery and female body-empowerment. The Farm has 180 residents today—in the early 1970s, between 200 and 300 people traveled to Summertown in a caravan of painted school buses to create it—and maintains a focus on green community. Beyond its Ecovillage Training Center, the collective’s furthest-reaching project is a “woman-centered” approach to childbirth. Last year, a doula in Santa Cruz who runs the blog Yogini Momma posted a TEDx Talk by Ina May and praised her as midwifery’s “grandmother guru.”
I e-mailed the news of Gaskin’s death to a friend from college, a professional nurse-midwife. She replied, “When I was training at the Farm it was fascinating to see how everyone treated him with such deference.” Gaskin, the commune’s patriarch and source of “spiritual revelation,” had been in a flexible group marriage when both he and a partner began to be sexually involved with Ina May, who was still married to her first husband. Gaskin would later institutionalize monogamy on the Farm. “We think of Ina May as such a powerhouse, but really Stephen was the cult leader!” my friend noted. “When we would eat dinner he would always be served first.”
What to make of a man whose lessons as well as beliefs, it would seem, were unabashedly feminist, but who lived a life that clashed with them? This is the question posed by Jill Lepore’s invigorating and perplexing The Secret History of Wonder Woman.
Carl Zimmer in Quanta:
In March 2011, the Tara, a 36-meter schooner, sailed from Chile to Easter Island — a three-week leg of a five-year global scientific expedition. All but one of the seven scientists aboard the ship spent much of their time on the sun-drenched deck hauling up wondrous creatures such as luminous blue jellyfish and insects known as sea-skaters, which spend their entire lives skimming the surface of the ocean far from land.
At the stern of the Tara, a shipping container was bolted to the deck, with a door and a tiny window cut through the metal walls. One of the scientists, Melissa Duhaime, spent most of the voyage inside the dark, tiny cell, where she fought off an endless bout of seasickness.
“People would come in to see what I was doing and leave pretty quickly,” Duhaime said.
Inside her cell, Duhaime sat next to a hose as wide as an outstretched hand. A pump drew water through the hose from several meters below the boat and then pushed it through a series of filters. Each filter was finer than the last, blocking smaller and smaller life forms. The setup stopped animals first, then zooplankton and algae. The last filter in the hose, with pores just 220 nanometers wide, was fine enough to block bacteria. Scrubbed of all these living things, the water finally flowed into three 30-liter vats.
To the untrained eye, these vats might seem to be full of sterile water. But they were seething with ocean life — or life-like things, at the very least. The three vats held up to 1 trillion viruses.
In Tokyo, in 1964, the 31-year-old conceptual artist Yoko Ono organized a happening in which she screened a Hollywood film and gave the audience a simple instruction: Do not look at Rock Hudson, look only at Doris Day.
Like most of the countercultural riddles that appear in Grapefruit, Ono’s book from the same year, the instruction — titled Film Script 5 — was at once facile and mischievously impossible. (Other variations on the piece include asking the audience not to look at any round objects in a film, or to see only red.) It was also, in its way, autobiographical: As one of the few women associated with New York’s avant-garde music scene and the “neo-Dada” Fluxus movement, Ono was by then used to being overshadowed by the more powerful and self-serious men around her. (“I wonder why men can get serious at all,” she mused in Grapefruit. “They have this delicate long thing hanging outside their bodies, which goes up and down by its own will.”) The year she first staged Film Script 5, she’d already extricated herself from one failed marriage and her second was unraveling. She was still two years away from meeting the man with whom she would realize her dream of a completely egalitarian partnership — to symbolize this, they both wore white during their wedding ceremony — but the rest of the world wouldn’t see it that way. They would, of course, see only the towering, superior Him — what could he have possibly seen in Her?