Jason Stanley in the New York Times:
In previous columns for The Stone, I argued that the public’s trust in public speech, whether by politicians or in the media, has disintegrated, and to such a degree that it has undermined the possibility of straightforward communication in the public sphere. The expectation is that any statement made either by a politician or by a media outlet is a false ideological distortion. As a result, no one blames politicians for making false statements or statements that obviously contradict that politician’s beliefs. I believe that the unfolding presidential campaign provides a compelling demonstration of my previous claims.
Consider Paul Ryan’s speech at the Republican National Convention last night. Ryan took President Obama to task for allegedly having “funneled out of Medicare” $716 billion. It is simple for anyone with a computer to discover that the claim is problematic.
Gideon Lewis-Kraus in Wired:
Oh hai. A cat wearing a short tie plays music on a cat-shaped keyboard (“Pancake Meowsic Video“, 188,083 views). A woman performs Sun salutations with a cat on her back (“Cat Loves Yoga“, 1,584 views). A man slaps two cats on an ironing board to the beat of “Atmosphere” (“Cat Slap Joy Division“, 359,461 views; watch this one). Kittens try to keep up with an accelerating treadmill (“Treadmill Kittens“, 3.4 million views). A fat cat walks on an underwater treadmill (“Fat Cat Walking on Underwater Treadmill“, 136,922 views). Two cats cuff at a treadmill in perplexed inquisition (“Cats Try to Understand Treadmill“, 1.9 million views). Search YouTube for “cat treadmill” and see how many results there are. Or, actually, don't.
Writing that paragraph took more than an hour. To continue the catalogue for an entire page would've taken weeks. But if one has set out to say something definitive about the relationship between cats and the internet, it's important not to allow oneself to be delayed indefinitely by internet cats.
The obvious place to begin an inquiry into the internet cat is with Maru, the most famous feline on the internet. Maru's shtick, in brief: Maru gets into a box (“大きな箱とねこ”, 8.1 million views). Maru gets into a box (“箱とねこ8. A box and Maru 8”, 3.2 million views). Maru gets into some boxes (“いろいろな小さ過ぎる箱とねこ. Many too small boxes and Maru”, 8.1 million views). Maru tries to get into a box (“入れない箱とねこ. The box which Maru can't enter”, 2.3 million views).
A conversation with Alex (Sandy) Pentland, in Edge:
[SANDY PENTLAND:] Recently I seem to have become MIT's Big Data guy, with people like Tim O'Reilly and “Forbes” calling me one of the seven most powerful data scientists in the world. I'm not sure what all of that means, but I have a distinctive view about Big Data, so maybe it is something that people want to hear.
I believe that the power of Big Data is that it is information about people's behavior instead of information about their beliefs. It's about the behavior of customers, employees, and prospects for your new business. It's not about the things you post on Facebook, and it's not about your searches on Google, which is what most people think about, and it's not data from internal company processes and RFIDs. This sort of Big Data comes from things like location data off of your cell phone or credit card; it's the little data breadcrumbs that you leave behind you as you move around in the world.
What those breadcrumbs tell is the story of your life. They tell what you've chosen to do. That's very different from what you put on Facebook. What you put on Facebook is what you would like to tell people, edited according to the standards of the day. Who you actually are is determined by where you spend time and which things you buy. Big Data is increasingly about real behavior, and by analyzing this sort of data, scientists can tell an enormous amount about you. They can tell whether you are the sort of person who will pay back loans. They can tell whether you're likely to get diabetes.
They can do this because the sort of person you are is largely determined by your social context, so if I can see some of your behaviors, I can infer the rest, just by comparing you to the people in your crowd. You can tell all sorts of things about a person, even though it's not explicitly in the data, because people are so enmeshed in the surrounding social fabric that it determines the sorts of things that they think are normal, and what behaviors they will learn from each other.
As a consequence, analysis of Big Data is increasingly about finding connections: connections with the people around you, and connections between people's behavior and outcomes. You can see this in all sorts of places. For instance, one type of Big Data and connection analysis concerns financial data: not just the flash crash or the Great Recession, but also all the other sorts of bubbles that occur. These are systems of people, communications, and decisions that go badly awry. Big Data shows us the connections that cause these events, and it gives us the possibility of understanding how these systems of people and machines work, and whether they're stable.
David Plotke, one of the smarter observers of social movements and collective action I know of, in Dissent:
I survey the main accounts of what Occupy did and what it might mean. Proponents of these views often claim both to provide analytical insight (this is what OWS was and is) and to express valid preferences (this is what OWS should be).
1. It was a flash movement.
Occupy assembled and expressed anger about economic and social injustice. Not many opinions changed, but the terms of national debate shifted, with durable aftershocks. OWS actions registered deep concern among significant parts of several (mainly left-of-center) publics. Yet OWS as we saw and knew it is gone.
If this is the trajectory—an explosion and then a fast decline toward a nominal but not significant Occupy—no one gets to own it except as a memory. It can’t be reconvened. On this account, with the election season underway, OWS seems like a reminder of the angry public mood of 2011. It is tempting to say that rather than the Tea Party of the left, it was the Herman Cain of the left, but that understates the force of the views OWS expressed and its likely persistence as a symbol.
More important, OWS represents a new kind of political and social effort—intense, broad, brief, and dramatic. Such efforts involve large numbers of people rather than narrow groups. There are leaders and organizers but they do not simply control the movement, which expands rapidly and surprisingly in size and forms.
2. Occupy is a significant current on the left of the Democratic coalition.
While it arrived too late to field its own primary candidates it will be a presence in the 2012 election cycle and perhaps beyond.
Although most OWS supporters hate the Tea Party analogy, here it’s apt. The Tea Party experience shows how political currents can now appear both inside and outside the party system. One hope (for those who want to re-elect Obama) is that OWS as a symbol can attract some independent voters. This may be wishful thinking, insofar as the 2011 Occupy story occurred to such a large extent among people who were already likely Democratic voters and in pro-Democratic settings.
Jeff Sharlet on Christopher Hitchens's Mortality (via Corey Robin), in Bookforum:
In the book’s best essay, a literal consideration of “freedom of speech” following the partial loss of his voice—“like a silly cat that had abruptly lost its meow”—Hitchens writes of the “awful fact” that friends are coerced by cancer into listening to his attempts at communication “sympathetically.” And yet Hitchens had always been a sentimentalist despite himself—a quality that is no small part of his popularity with readers who think themselves too reasonable for emotional appeals. Hitchens’s sentimentalism, in fact, allowed him at his best to detect the false sweetening of public ideas—and that is also the case here, in the more private world of Tumortown. His most sustained argument is with Nietzsche’s oft-quoted maxim on being made stronger by that which doesn’t kill you, but the sharpest rebuke of Mortality is reserved for Randy Pausch’s enormously popular farewell video made before his own death from cancer, a catalogue of clichés “so sugary you may need an insulin shot to withstand it.” Hitchens proposes the criminalization of such saccharine: “It ought to be an offense to be excruciating and unfunny in circumstances where your audience is almost morally obliged to enthuse.”
Of course, that’s also the dilemma of Mortality. “My grandmother was diagnosed with terminal melanoma of the G-spot,” he writes, mimicking the cancer tales imposed on him by ostensible well-wishers, “but she hung in there . . . and the last postcard we had was from her at the top of Mount Everest.” Funny, sort of, but more like stand-up shtick than the wit that made Hitchens famous. His arguments with the pious, too, have been whittled down, but not sharpened, by suffering. Prayer is silly, he proposes, because if god (no capitalization here) is in fact almighty, he is “enjoined or thanked to do what he was going to do anyway.” Noting the Jewish woman’s prayer thanking god “for creating her ‘as she is,’” he observes that for a true divine “the achievement would seem rather a slight one.”
One needn’t be religious to grasp that Hitchens is bickering with a straw man’s notion of prayer (or, as the case may be, a straw woman’s), a crudely utilitarian conception of the appeal to divine power that seems pointedly deaf to the nuances of meaning contained within even the most rote devotions. Almost all prayer contains at least the bones of a story about how the prayerful supplicant understands suffering—“or misunderstands,” one can imagine one of Hitchens’s own devotees quipping. Perhaps; but the distinction misses a larger truth to which Hitchens himself returns again and again throughout Mortality: To understand suffering is not to master it, or to defeat it. Whether one understands chemotherapy or not makes it no more or less painful.
Margalit Fox's (slightly thin) obit in the NYT:
Shulamith Firestone, a widely quoted feminist writer who published her arresting first book, “The Dialectic of Sex,” at 25, only to withdraw from public life soon afterward, was found dead on Tuesday in her apartment in the East Village neighborhood of Manhattan. She was 67.
Ms. Firestone apparently died of natural causes, her sister Laya Firestone Seghi said.
Subtitled “The Case for Feminist Revolution,” “The Dialectic of Sex” was published by William Morrow & Company in 1970. In it, Ms. Firestone extended Marxist theories of class oppression to offer a radical analysis of the oppression of women, arguing that sexual inequity springs from the onus of childbearing, which devolves on women by pure biological happenstance.
“Just as the end goal of socialist revolution was not only the elimination of the economic class privilege but of the economic class distinction itself,” Ms. Firestone wrote, “so the end goal of feminist revolution must be … not just the elimination of male privilege but of the sex distinction itself: genital differences between human beings would no longer matter culturally.”
In the utopian future Ms. Firestone envisioned, reproduction would be utterly divorced from sex: conception would be accomplished through artificial insemination, with gestation taking place outside the body in an artificial womb. While some critics found her proposals visionary, others deemed them quixotic at best.
Reviewing “The Dialectic of Sex” in The New York Times, John Leonard wrote, “A sharp and often brilliant mind is at work here.” But, he added, “Miss Firestone is preposterous in asserting that ‘men can’t love.’ ”
“The Trail is Not a Trail”
I drove down the Freeway
And turned off at an exit
And went along a highway
Til it came to a sideroad
Drove up the sideroad
Til it turned to a dirt road
Full of bumps, and stopped.
Walked up a trail
But the trail got rough
and it faded away—
Out in the open,
Everywhere to go.
by Gary Snyder
from Left Out in the Rain
North Point Books
From New Statesman:
“We’re not going to let our campaign be dictated by fact checkers.” …Neil Newhouse, pollster for Mitt Romney.
Romney pollster Neil Newhouse admits that his party has given up on telling the truth.
In a related story:
Paul Ryan made his big speech at the Republican National Convention, and ThinkProgress summed it up best: “An energetic, post-factual speech by Ryan.” Time and again, Ryan misled, misspoke, and made “Demonstrably Misleading Assertions”. If you're interested in the politics of it, he's also been attacked on style – Mother Jones' Kevin Drum recalled Harrison Ford's famous snipe to George Lucas, “you can type this shit, but you sure can't say it” – and doubtless, his “John Galtesque” evocation of the mythical grey, socialist hellhole of Obama's America will win over some. But if Ryan gets away with some of what he said, political discourse in the United States has a lot to answer for. The most egregious of Ryan's statements was an attack on Obama for failing to protect a General Motors plant in his constituency:
A lot of guys I went to high school with worked at that GM plant. Right there at that plant, candidate Obama said: “I believe that if our government is there to support you … this plant will be here for another hundred years.” That’s what he said in 2008.
Well, as it turned out, that plant didn’t last another year. It is locked up and empty to this day. And that’s how it is in so many towns today, where the recovery that was promised is nowhere in sight.
The plant's closure was announced in June 2008, over six months before Obama was inaugurated. Ryan probably knows this, because on 3 June, he issued a statement bemoaning the closure.
Gina Kolata in The New York Times:
For 25 years, the rhesus monkeys were kept semi-starved, lean and hungry. The males’ weights were so low they were the equivalent of a 6-foot-tall man who tipped the scales at just 120 to 133 pounds. The hope was that if the monkeys lived longer, healthier lives by eating a lot less, then maybe people, their evolutionary cousins, would, too. Some scientists, anticipating such benefits, began severely restricting their own diets. The results of this major, long-awaited study, which began in 1987, are finally in. But they did not bring the vindication calorie restriction enthusiasts had anticipated. It turns out the skinny monkeys did not live any longer than those kept at more normal weights. Some lab test results improved, but only in monkeys put on the diet when they were old. The causes of death — cancer, heart disease — were the same in both the underfed and the normally fed monkeys.
Lab test results showed lower levels of cholesterol and blood sugar in the male monkeys that started eating 30 percent fewer calories in old age, but not in the females. Males and females that were put on the diet when they were old had lower levels of triglycerides, which are linked to heart disease risk. Monkeys put on the diet when they were young or middle-aged did not get the same benefits, though they had less cancer. But the bottom line was that the monkeys that ate less did not live any longer than those that ate normally.
Daniel Hartley in 3:AM Magazine:
Gone are the golden days when an author’s bio blurb read like an obituary. Date and place of birth, occupation, current abode, names and dates of publications, year of death (if applicable): this was, apparently, all an educated public really needed to know about their writers to be able to ‘place’ their work. And as staid and conventional as that may now seem, there’s a lot to be said for this approach, not the least of which is avoiding bio-blurbs like this: “X lives in New York with her three cats. She makes cookies out of the weirdest things (and they taste REAL GOOD!). Her favourite word is ‘red’ and when it snows she wears sandals.” The only reasonable response to such postmodern narcissism is, firstly, to remind the author that we don’t actually give a damn about his or her personal idiosyncrasies, and, secondly, to ask them to grow up.
That said, it is not only younger writers who fall foul of this idiotic celebration of so-called eccentricity; as well-known a writer as Stephen Fry informs us in the author bio to his Ode Less Travelled (a wonderful book, as it happens) that “[h]is powers grow daily and his disciples are many” and that “his best friends are flowers”. Likewise, Neil Gaiman tells us that he is “a messy-haired white male author trapped in the body of an identical white male author with perhaps even less-tidy hair”, which, as author bios go, is hardly a knock-out. The point here, though, is not to be a killjoy, to declaim like a troubled old soul that the good days are behind us; rather, it is to point out an increasing trend of egoism fused with an insidious celebration of “randomness”.
Indeed, if there is any word in the English language whose current popularity is profoundly and boringly unrandom, it is “random”. Intellectual after intellectual, from Fredric Jameson to David Harvey and Terry Eagleton, has shown us that the shifts in the structure of the capitalist mode of production following World War II resulted in a new “cultural logic” which reflected those shifts. The upshot was a rejoicing in the ephemeral, the fleeting, the contingent, the hybrid, the liminal, the accidental, the fragmentary, the part, the border: the random. Randomness and the affirmative cries of “Random!” which accompany it are constitutive aspects of this cultural logic which has been unfurling since the postwar boom. That means that any indulgence in this logic leaves itself open to that once terrifying Enlightenment adjective: “uncritical”.
BLS Nelson in Talking Philosophy:
In my experience, many skilled philosophers who work in the Anglo-American tradition will tend to have a feverish streak. They will tend to find a research program which conforms with their intuitions (some of which may be treated as “foundational” or givens), and then hold onto that program for dear life. This kind of philosopher will change her mind only on rare occasions, and even then only on minor quibbles that do not threaten her central programme. We might call this kind of philosopher a “programmist” or “anti-skeptic“, since the programmist downplays the importance of humility, and is more interested in characterizing herself in terms of the other virtues like philosophical rigour.
You could name a great many philosophers who seem to hold this character. Patricia and Paul Churchland come to mind: both have long held the view that the progress of neuroscience will require the radical reformation of our folk psychological vocabulary. However, when I try to think of a modern exemplar of this tradition, I tend to think of W.V.O. Quine, who held fast to most of his doctrinal commitments throughout his lifetime: his epistemological naturalism and holism, to take two examples. This is just to say that Quine thought that the interesting metaphysical questions were answerable by science. Refutation of the deeper forms of skepticism was not very high on Quine’s agenda; if there is a Cartesian demon, he waits in vain for the naturalist’s attention. The most attractive spin on the programmist’s way of doing things is by saying they have raised philosophy to the level of a craft, if not a science.
“What’s attractive about looking at all philosophers in part suspiciously and in part mockingly is not that we find again and again how innocent they are… but that they are not honest enough in what they do, while, as a group, they make huge, virtuous noises as soon as the problem of truthfulness is touched on, even remotely.” – Nietzsche, Beyond Good & Evil
Programmists are common among philosophers today. But if I were to take you into a time machine and introduce you to the elder philosophers, then it would be easy to lose all sense of how the moderns compare with their predecessors. The first philosophers lived in a world where science was young, if not absent altogether; there was no end of mystery to how the universe got on. For many of them, there was no denying that skepticism deserved a place at the table. From what we can tell from what they left behind, many ancient philosophers (save Aristotle and Pythagoras) did not possess the quality that we now think of as analytic rigour. The focus was, instead, on developing the right kind of life, and then — well, living it.
Terry Rudolph over at Cosmic Variance:
There has long been a tension between the academic publishing process, which is slow but which is still the method by which we certify research quality, and the ability to instantaneously make one’s research available on a preprint server such as the arxiv, which carries essentially no such certification whatsoever. It is a curious (though purely empirical) observation that the more theoretical and abstract the field the more likely it is that the all-important question of priority – when the research is deemed to have been time-stamped as it were – will be determined by when the paper first appeared on the internet and not when it was first submitted to, or accepted by, a journal. There are no rules about this, it’s simply a matter of community acceptance.
At the high-end of academic publishing, where papers are accepted from extremely diverse scientific communities, prestigious journals need to filter by more than simply the technical quality of the research – they also want high impact papers of such broad and general interest that they will capture attention across ranges of scientific endeavour and often the more general public as well. For this reason it is necessary they exercise considerably more editorial discretion in what they publish.
Topics such as hurdling editors and whether posting one’s paper in preprint form negatively impacts its chances of being accepted at a high-end journal are therefore grist for the mill of conversation at most conference dinners. In fact the policies at Nature about preprints have evolved considerably over the last 10 years, and officially they now say posting preprints is fine. But is it? And is there more to editorial discretion than the most obvious first hurdle – namely getting the editor to send the paper to referees at all? If you’re a young scientist without experience of publishing in such journals (I am unfortunately only one of the two!) perhaps the following case study will give you some pause for thought.
Last November my co-authors and I bowed to some pressure from colleagues to put our paper, then titled The quantum state cannot be interpreted statistically, on the arxiv. We had recently already submitted it to Nature because new theorems in the foundations of quantum theory are very rare, and because the quantum state is an object that cuts across physics, chemistry and biology – so it seemed appropriate for a broad readership. Because I had heard stories about the dangers of posting preprints so many times I wrote the editor to verify it really was ok. We were told to go ahead, but not to actively participate in or solicit pre-publication promotion or media coverage; however discussing with our peers, presenting at conferences etc was fine.
Based on the preprint Nature themselves published a somewhat overhyped pop-sci article shortly thereafter; to no avail I asked the journalist concerned to hold off until the status of the paper was known. We tried to stay out of the ensuing fracas – is discussing your paper on blogs a discussion between your peers or public promotion of the work?
Kieran Healy in Crooked Timber:
Will there be Bingo in Utopia? It is hard to say. The emancipatory potential of bingo as praxis has been criticized from the earliest days of modern social theory. In 1862 Marx was prompted to write the first draft of what became Theories of Surplus Value during very straitened financial circumstances (he had pawned the clothes of his children and his maid, Helene Demuth) brought on mostly by clandestine visits to an East London bingo emporium, where he would play games of “Housey-Housey” while his wife Jenny believed him to be at the British Library conducting research. The game itself was for some time believed to be mentioned by Marx directly in a well-known if difficult section of the Grundrisse:
Capital’s ceaseless striving towards the general form of wealth drives labour beyond the limits of its natural paltriness, and thus creates the material elements for the development of the rich individuality which is as all-sided in its production as in its consumption, and whose labour also therefore appears no longer as labour, but as the full development of bingo itself, in which natural necessity in its direct form has disappeared; because natural need has been replaced by historically produced need.
This passage provoked considerable confusion—and a substantial amount of theoretical debate—amongst the small circle of scholars who had access to it from 1935 onwards.
Following the thaw and wave of rehabilitations during the Khrushchev era, however, it transpired that David Riazanov’s original transcription of this passage (with a reading of “activity” and not “bingo”) had been correct. It was altered by an unknown member of the NKVD as part of the effort to falsify evidence establishing the existence of a so-called “United Front of Mensheviks and Mah-Jongg”. The unhappy fate of bingo as an element of emancipatory praxis was sealed by Adorno, who intensely disliked the game (and indeed much else) in all its forms, defending instead what he saw as the more austere but purer pleasures of the tombola.
Andrea Wills in American Scientist:
In grassy areas along the equator lives a tiny plant, Mimosa pudica, that has the captivating property of closing its leaves in response to touch. Rest a finger on one leaf, and that leaf and its neighbor will fold abruptly toward the stem. Brush your finger along the length of the stem and every pair of leaves will collapse in turn. For everyone who has wondered at Mimosa, the suddenly snapping Venus flytrap or the way a sunflower’s head unerringly turns to follow the sun, Daniel Chamovitz has written the perfect book.
What a Plant Knows: A Field Guide to the Senses examines the parallels and differences between plant senses and human senses by first considering how we interpret sensory inputs and then exploring how plants respond to similar inputs. Each chapter covers one sense—sight, smell, touch and hearing are covered, along with “How a Plant Knows Where It Is” and “What a Plant Remembers”—and each examines a wide taxonomical range of flora and a complementary historical range of experiments. In the book’s introduction, Chamovitz is careful to clarify his intentions in using language that might be considered anthropomorphic to explore the world of plants:
When I explore what a plant sees or smells, I am not claiming that plants have eyes or noses (or a brain that colors all sensory input with emotion). But I believe this terminology will help challenge us to think in new ways about sight, smell, what a plant is, and ultimately what we are.
A plant biologist who has held positions at Columbia and Yale and is now director of the Manna Center for Plant Biosciences at Tel Aviv University, Chamovitz is well qualified to present an archive of research on plant perception. Happily, he also has narrative dexterity: The book is delightful and a fast read.
Kurt Vonnegut excerpted in Harper's Magazine:
I, Kurt Vonnegut, Jr., that is, do hereby swear that I will be faithful to the commitments hereunder listed:
I. With the agreement that my wife will not nag, heckle, or otherwise disturb me on the subject, I promise to scrub the bathroom and kitchen floors once a week, on a day and hour of my own choosing. Not only that, but I will do a good and thorough job, and by that she means that I will get under the bathtub, behind the toilet, under the sink, under the icebox, into the corners; and I will pick up and put in some other location whatever movable objects happen to be on said floors at the time so as to get under them too, and not just around them. Furthermore, while I am undertaking these tasks I will refrain from indulging in such remarks as “Shit,” “Goddamn sonofabitch,” and similar vulgarities, as such language is nerve-wracking to have around the house when nothing more drastic is taking place than the facing of Necessity. If I do not live up to this agreement, my wife is to feel free to nag, heckle, and otherwise disturb me until I am driven to scrub the floors anyway—no matter how busy I am.
—for my sons
when i was twenty
five .. we hiked the grass
spare trails that snake
from ocean to Swan Pond .. my
two small crab catchers & me .. we
buried pet turtles at sea
…………… beneath the crooked
footbridge .. sailed stick regattas
in the slim stream in the slow
woods .. bouncing like great
explorers of Kettle Cove & sea
slashed rocks .. listening to each other's
breath .. we trudged home sand fed
by Jim Bell
from Crossing the Bar
Slate Roof, Northfield, MA, 2005
From The New Yorker:
Writing in the American Journal of Preventive Medicine, Dr. Felicia H. Stewart and Dr. James Trussell have estimated that there are twenty-five thousand rape-related pregnancies each year in the United States. While these numbers make up only a small part of this country’s annual three million unwanted pregnancies, the numbers are still extremely high. Nonetheless, the relationship between rape and pregnancy has been a topic of highly politicized debate since long before Todd Akin’s comments on “legitimate rape,” Paul Ryan’s bill with its category of “forcible rape,” and Sharron Angle’s suggestion, two years ago, that women pregnant through rape make “a lemon situation into lemonade.” There is a veritable war of statistics about rape and pregnancy, and the confusion is exacerbated by the competing agendas of the pro-choice and anti-abortion movements. It has been argued that fear promotes ovulation, and that women who are raped have a ten-per-cent risk of pregnancy; there are estimates of as little as one per cent. Numbers are also skewed when they are adjusted to include or exclude women not of reproductive age; for sodomy and other forms of rape that cannot cause pregnancy; for rape victims who may be using oral birth control or I.U.D.s; and for women who are raped and become or are pregnant as a result of consensual sex with a husband or partner who is not the rapist, before or after the rape. Women who are being abused on an ongoing basis are particularly likely to conceive in rape. Catherine MacKinnon has written, “Forced pregnancy is familiar, beginning in rape and proceeding through the denial of abortions; this occurred during slavery and still happens to women who cannot afford abortions.”
I have been researching a book, “Far from the Tree,” that deals in part with women raising children conceived in rape, and have therefore met the living reproof to Akin’s remark. Life for these children may be extremely difficult. One of the few groups founded to address this population, Stigma Inc., took as its motto, “Rape survivors are the victims … their children are the forgotten victims.”
From Smithsonian:
The symbol’s modern obscurity ended in 1971, when a computer scientist named Ray Tomlinson was facing a vexing problem: how to connect people who programmed computers with one another. At that time, each programmer was typically connected to a particular mainframe machine via a phone connection and a teletype machine—basically a keyboard with a built-in printer. But these computers weren’t connected to one another, a shortcoming the U.S. government sought to overcome when it hired BBN Technologies, the Cambridge, Massachusetts, company Tomlinson worked for, to help develop a network called Arpanet, forerunner of the Internet. Tomlinson’s challenge was how to address a message created by one person and sent through Arpanet to someone at a different computer. The address needed an individual’s name, he reasoned, as well as the name of the computer, which might service many users. And the symbol separating those two address elements could not already be widely used in programs and operating systems, lest computers be confused.
Tomlinson’s eyes fell on @, poised above “P” on his Model 33 teletype. “I was mostly looking for a symbol that wasn’t used much,” he told Smithsonian. “And there weren’t a lot of options—an exclamation point or a comma. I could have used an equal sign, but that wouldn’t have made much sense.” Tomlinson chose @—“probably saving it from going the way of the ‘cent’ sign on computer keyboards,” he says. Using his naming system, he sent himself an e-mail, which traveled from one teletype in his room, through Arpanet, and back to a different teletype in his room. Tomlinson, who still works at BBN, says he doesn’t remember what he wrote in that first e-mail. But that is fitting if, as Marshall McLuhan argued, “The medium is the message.” For with that message, the ancient @, once nearly obsolete, became the symbolic linchpin of a revolution in how humans connect.
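The user@host scheme Tomlinson settled on is still how email addresses are divided today: everything before the @ names the individual, everything after names the machine. A minimal sketch of that split, in Python (the function name and the sample address are illustrative inventions, not anything from BBN's actual code):

```python
def parse_address(address: str) -> tuple[str, str]:
    """Split a user@host address at its first "@" into (user, host)."""
    user, sep, host = address.partition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a valid user@host address: {address!r}")
    return user, host

# A hypothetical Arpanet-era address, purely for illustration.
print(parse_address("tomlinson@bbn-tenexa"))  # ('tomlinson', 'bbn-tenexa')
```

The design works precisely because of the property Tomlinson prized: since @ appeared in neither usernames nor machine names, a single unambiguous split recovers both parts.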