October 31, 2006
Silas Simmons, 1895-2006
Si Simmons, the former Negro leagues baseball player who was believed to be the longest-living professional ballplayer in history, died Sunday at a retirement home in St. Petersburg, Fla. He was 111.
His death was announced by a spokeswoman for the retirement home.
A Philadelphia native, Simmons was a left-handed pitcher for the local Germantown Blue Ribbons beginning in either 1912 or 1913, in the primordial and poorly recorded days of organized black baseball. He played for Germantown and other clubs for many years after that, including the New York Lincoln Giants of the Eastern Colored League in 1926 as well as the Negro National League’s Cuban Stars in 1929.
The fact that Simmons was still alive was unknown to baseball’s avid research community until the summer of 2006, when a genealogist discovered he was living in the St. Petersburg, Fla., nursing home.
Romans have never been more popular
Allan Massie in Prospect Magazine:
In his short book "The Future of the Classical," Salvatore Settis, director of the Scuola Normale Superiore in Pisa, writes that "the marginalisation of classical studies in our education systems and our culture at large is a profound cultural shift that would be hard to ignore." At the same time, he asks: "What place is there for the ancients in a world… characterised by the blending of peoples and cultures, the condemnation of imperialism, the end of ideologies, and the bold assertion of local traditions, and ethnic and national identities in the face of all forms of cultural hegemony? Why seek out common roots, if everyone is intent on distinguishing their own from those of their neighbour?"
The points are well made, the questions pertinent, though the implication is not always as cogent as Settis supposes. After all, one characteristic of the Roman world was a very similar "blending of peoples and cultures," as eastern gods and goddesses were introduced to Rome and worshipped there, and as the emperors came more often from the provinces than from Italy, let alone Rome.
Berman's The Primacy of Politics, a Crooked Timber Online Seminar
For those of you who've missed it, Crooked Timber has an online seminar on Sheri Berman's The Primacy of Politics: Social Democracy and the Making of Europe's Twentieth Century. Henry Farrell, Tyler Cowen of Marginal Revolution, Matthew Yglesias, John Quiggin, SEIU's Jim McNeill, and our own Mark Blyth offer insights and critiques, Berman responds, and readers chime in. From Mark's piece:
Social democracy may have been a good idea, but it was also a post-war phenomenon brought about by the devastation fascism brought upon itself. If World War Two hadn’t happened, if Strasser had bested Hitler, if the xenophobia had stayed in the bottle, would fascism have fallen? While counterfactuals are at best a parlor game, they are nonetheless helpful in clarifying possibilities. If the war had not happened, and if the alternative of the Soviet Union had not risen to post-war prominence, would the need to placate the working classes of Europe with welfarism and democracy have been so pressing? Would the victory have come about at all, never mind later than advertised?
In short, if we read the history of social democracy as a highly contingent outcome, it raises an interesting angle on contemporary developments. If social democracy was a species of fascism (or vice versa), do we need a re-born fascism now to (re)energize the ‘dead-men walking’ parties of social democracy in the present?
Paris and Practice
Helmut at Phronesisaical:
Apart from a work weekend of torture and globalization and developing a new seminar on ethics in management and leadership, my thoughts have wandered over to, well, Paris, and to a problem that constantly arises for a philosopher teaching at a public policy school: the moment of policy practice. There's a vague relation between these two disparate items. Bear with me, and I'll see if I can weave them together.
Paris: Paris is a big city, of course, but it's also very small. I don't mean this only in the sense that - like other European cities - it has a center from which the rest of the city radiates, turning the city into something more intimate, walkable, and experientially and historically rich than we usually know with American cities. I mean this also in a sense that a relatively unknown French photographer I like, Michel-Jean Dupierris, has a clever eye for: the tiny, passed-over worlds underlying the city. Paris is grand, yet infinitesimally complex. Dupierris, like other artists before him, notices the small and complex. He has the eye of an abstract expressionist.
uncle tom lives
The best-selling American novel of the nineteenth century, “Uncle Tom’s Cabin,” by Harriet Beecher Stowe, does not quite go away, much as many Americans, from black militants to white aesthetes, might wish it. Within a year of its publication, in March of 1852, it had sold three hundred thousand copies, in a country one-thirteenth its present size and—in a surprising show of Victorian globalization—more than two million in the rest of the world. Ten years later, in 1862, Abraham Lincoln allegedly greeted its diminutive author in the White House with the words “So you’re the little woman who wrote the book that started this great war!” The President’s subsequent abolition of slavery and the Union’s hard-won victory in the Civil War would seem to have taken the wind out of Stowe’s fiercely abolitionist novel of ideas, but its melodramatic images—the Kentucky slave Eliza’s flight across the ice-choked Ohio River, pursued by bloodhounds, with her son in her arms; the Louisiana slaveholder Simon Legree’s boastful villainy; fair-haired little Eva’s saintly death and the snaggle-headed black orphan Topsy’s reluctant reformation—persisted, though travestied, in popular plays, shows, films, figurines, and cartoons.
more from John Updike at The New Yorker here.
the third hockney is the best
All art, perhaps, is at heart an attempt to answer the question, How do we see? In these two shows, Hockney has a range of answers, but the one constant is the search, the gaze. The third David Hockney, the serious one, the important one, has been asking this question for over fifty years now, and his answers are consistently interesting and surprising. The body of work he has accumulated through his restless use of a vast range of media, combined with his solid technique, has given us an artist of the very first rank. Both these exhibitions set out to celebrate Hockney, and they do so magnificently: the NPG’s retrospective of half a century of his portraiture shows a depth and a breadth that is hard to match in any artist working today. There are perhaps rather too many of the very recent portraits – more rigorous selection would have made viewing easier – but there is no slackening off in quality. Annely Juda’s show of the new landscapes indicates that, if anything, David Hockney is having yet another late flowering. In a long career, he has frequently seemed to have reached a peak, only to dart off at a tangent and, in another style, another medium, surpass himself. His most recent work shows a serene, soaring mastery.
more from the TLS here.
IF THE UNITED STATES takes military action to prevent Iran from acquiring nuclear weapons, planning for which has been much speculated about but denied by the Bush administration, who will deserve the blame? The Iranian regime, for its brazen defiance of the international ban on nuclear proliferation? America’s neoconservatives, itching to remake the Middle East? Or Azar Nafisi, the Iranian expatriate author of the 2003 women’s book-club fave ‘‘Reading Lolita in Tehran’’?
Hamid Dabashi, a professor of Iranian studies at Columbia University, would blame all three, but it’s his vituperative attack on Nafisi that earned him a spot this month on the cover of the Chronicle of Higher Education.
more from Boston Globe Ideas here.
caravaggio: what a bleeder
"Great stuff, mate," said someone sticking his head through the window of the cab about to take me to St Pancras. "Caravaggio; what a bleeder!"
Too right. Music to my ears. Vox populi, vox dei. And a whole lot better than Carpo Marx in the Sundays giving us all a hard time about the first episode of The Power of Art. We did know we were taking a risk beginning with the most in-your-face of the eight films, lots of sweaty aggression and heavy pathos, but then that was what Caravaggio specialised in. One reviewer complained about the "script" which the actors had to work with but that script ("smell the artichokes") was drawn entirely from the court records of Caravaggio's trials and punishments.
more from Simon Schama at The Guardian here.
One for the Ages: A Prescription That May Extend Life
How depressing, how utterly unjust, to be the one in your social circle who is aging least gracefully. In a laboratory at the Wisconsin National Primate Research Center, Matthias is learning about time’s caprice the hard way. At 28, getting on for a rhesus monkey, Matthias is losing his hair, lugging a paunch and getting a face full of wrinkles.
Yet in the cage next to his, gleefully hooting at strangers, one of Matthias’s lab mates, Rudy, is the picture of monkey vitality, although he is slightly older. Thin and feisty, Rudy stops grooming his smooth coat just long enough to pirouette toward a proffered piece of fruit.
Tempted with the same treat, Matthias rises wearily and extends a frail hand. “You can really see the difference,” said Dr. Ricki Colman, an associate scientist at the center who cares for the animals. What a visitor cannot see may be even more interesting. As a result of a simple lifestyle intervention, Rudy and primates like him seem poised to live very long, very vital lives.
This approach, called calorie restriction, involves eating about 30 percent fewer calories than normal while still getting adequate amounts of vitamins, minerals and other nutrients. Aside from direct genetic manipulation, calorie restriction is the only strategy known to extend life consistently in a variety of animal species.
Elephants not fooled by mirrors
Elephants possess the highly cerebral ability to recognize their own jumbo reflections in mirrors, scientists have found. Traditionally, only an elite group of animals including humans, chimpanzees and orangutans have been proved to be capable of self-recognition in a mirror. A lone study several years ago also reported that dolphins could recognize their own gaze in a glass. To study the elephants' behavior, the researchers placed an "elephant-proof, jumbo-sized" mirror, 2.5 metres high by 2.5 metres wide, inside the enclosure of three female Asian elephants (Elephas maximus) at the Bronx Zoo in New York City. The team used a still camera on a roof to observe the animals over a period of five months.
Upon entering the yard, all three elephants ran to inspect the mirror. The elephants, named Happy, Maxine and Patty, immediately investigated the surface by sniffing and touching it with their trunks — even attempting to climb the mirror to look behind it and kneeling down to look under it. They didn't display threatening behaviour such as trumpeting, which might have been expected if they saw the images as intruder elephants.
October 30, 2006
Monday Musing: Milosz v. Gombrowicz
I’m tempted to make the rather bold assertion that the most interesting duo in Western literature of the 20th century is Czeslaw Milosz and Witold Gombrowicz. I say duo because you really have to take the two of them together. When Milosz zigs, Gombrowicz zags, when you’re feeling one way, Milosz expresses it for you, and when the mood shifts, there is Gombrowicz waiting in the wings with a change of pace.
The twentieth century was insane. We forget to remember that. For us, it’s what made us what we are and therefore it has taken on a sense of inevitability, even naturalness. But looking at it from the other way around, from the perspective of those who were going through it and for whom its twists and turns were anything but a foregone conclusion, the century is filled with so many shocks and amazements it is difficult to comprehend. And that, of course, was one of the great, if not the great, themes of literature from the collapse of the Austro-Hungarian empire and the outbreak of WWI to the effective finale to the 20th century in the breakup of the Soviet Union and the reuniting of Western and Eastern Europe.
Through it all, the challenge to the coherence and sustainability of human experience was relentless. If tradition was disrupted and broken down here and there in the 19th century, it was upended completely, remade from the inside out, and sometimes obliterated during the 20th. Again, we don’t often bother to stop and consider how extraordinary that process was. It was by turns exhilarating and terrifying and sometimes both simultaneously. And perhaps the horrors of the twentieth century were all the more horrifying given that the century continuously produced strains of hope that things could possibly be otherwise. Suffering is that much worse the degree to which it is perceived to have been avoidable.
Czeslaw Milosz was as sensitive to these issues as anyone. This is a man who picked his way through the rubble of Warsaw when its ruins were still steaming, when the place was just an open wound. That experience, and the knowledge gained from it, is shot through everything that Milosz ever wrote. For Milosz, man is guaranteed nothing. That’s it. Nothing. And man can be reduced, or reduce himself, to nothing, at any moment.
Gombrowicz too experienced such things. As Milosz says of him, “Gombrowicz lived in an epoch which neither quantitatively nor qualitatively brings to mind any of the previous epochs and which distinguishes itself through ubiquitous cases of ‘infection’ with mass and individual madness.” Man, as Aristotle once mentioned, needs a world, a complicated arrangement of social interactions, in order actually to be man. But that same ordering of complicated social arrangements can also be the vehicle by which human beings destroy themselves and one another.
But Gombrowicz chose flight, literally and metaphorically. From his exile in Argentina he conjured up an absurd mental universe that spins out the problems of experience in countless ‘as if’ scenarios that are so powerful exactly insofar as they make sense despite their insanity. Gombrowicz took flight into the endless malleability of human experience in order to keep a step ahead of the world as it is. That is his particular freedom. It is the freedom of Socrates as Kierkegaard describes him in The Concept of Irony, the freedom that escapes from every possible determination.
Truth be told, this version of freedom annoys Milosz. Because for Milosz, the possibility of meaning in human affairs is dependent on commitment. If nothing else, it is founded on the capacity for human beings to hold experience together even as forces from within and without work to tear it apart. How one does this is not entirely clear but Milosz’s entire oeuvre is the sustained attempt to do so even as he lacks a blueprint. That is a pretty brave literary task to set in front of oneself. From Milosz’s standpoint, Gombrowicz has retreated into his own consciousness instead of forcing himself constantly to confront the problems of the world as it is encountered. Milosz has said that, “what fascinates me is the apple: the principle of the apple, appleness in and of itself. In Gombrowicz, on the other hand, the emphasis is placed on the apple as a ‘mental fact’, on the reflection of the apple in consciousness.”
But then the two come together again, in Milosz’s mind, because Gombrowicz never falls into the trap of those intellectuals who have lost track of the root problems of experience, actual experience, that have been thrown up by the 20th century. Milosz writes that, “A comparison of Gombrowicz with western writers, with Sartre, for example, would reveal, in the case of the latter, a deficiency of a certain type of experience connected with history and specific cultural traditions, a deficiency that is compensated for by theory.”
I think we’re still working this stuff through. And I’ll make one more rash claim. The future right now is in the past. Sometimes it is in the past, the immediate past, where things get clear again. For those of us whose lives stretch from the era of the 20th century into the next one, the most important thing for taking the future seriously is doing work on the things that have recently passed. Only now is it becoming even vaguely possible to understand how important are the tentative thoughts put forward by people like Milosz and Gombrowicz. And there are others, back there, waiting for us. We simply have to take seriously the idea that turning our backs on the future is a way of renewing it.
The Future of Science is Open, Part 1: Open Access.
I've never had an idea that couldn't be improved by sharing it with as many people as possible -- and I don't think anyone else has, either. That's why I have become interested in the various "Open" movements making increasing inroads into the practice of modern science. Here I will try to give a brief introduction to Open Access to research literature; in the second instalment I will look at ways in which the same concept of "openness" is being extended to encompass data as well as publications, and beyond that, what a fully Open practice of science might look like.
The original paradigm: Open Source
Although the underlying concept of information as a public good goes back at least to the invention of the printing press and the end of the aristocratic/theocratic duopoly on literacy, programmers were the first people I know of to popularize this sort of "openness" in an academic setting. Richard Stallman started the GNU Project in 1983/4 as a reaction against the rising influence of proprietary software, and a year or so later founded the Free Software Foundation, which "is dedicated to promoting computer users' rights to use, study, copy, modify, and redistribute computer programs." What Stallman and the FSF mean by "free software" is famously summed up by the dictum, "free as in speech, not free as in beer"; more precisely, they mean "free" as in:
- The freedom to run the program, for any purpose
- The freedom to study how the program works, and adapt it to your needs
- The freedom to redistribute copies
- The freedom to improve the program and release your improvements to the public
Access to the source code is a precondition for these freedoms, and many advocates prefer that the "four fundamental freedoms" also be combined with some form of copyleft (basically a licence which explicitly disallows use of the original resource in any way that restricts the four freedoms for anyone else). About a decade later the Open Source Initiative appeared, offering itself as a "more pragmatic" approach to free software. The two definitions are pretty similar, though the OSI version allows some licensing that the FSF considers too restrictive of end users. Today, both the FSF and the OSI are powerhouse advocates for non-proprietary software, code that you can get your hands on and hack to your heart's content. There is a wealth of free software freely available for scientific purposes: for instance, the OpenScience Project maintains a list, as do (inter many alia) the NCEAS, the CBS and Indiana University. The NIH and EBI both maintain extensive services, there's an entire Linux distribution for science, SourceForge lists over 350 projects under "scientific", and a simple Google search finds dozens of free applications for molecular biology.
By analogy with Open Source, Open Access to the research literature entails the freedom to read, use and redistribute the published results of scholarly research and derivative works based on those publications. What follows is a version of Peter Suber's very brief introduction to OA; for more details, see his full Open Access Overview and Timeline of the OA Movement. The bottom line is this:
Open-access (OA) literature is digital, online, free of charge, and free of most copyright and licensing restrictions. What makes it possible is the internet and the consent of the author or copyright-holder.
Most scholarly journals do not pay authors, who therefore do not lose revenue by publishing under OA conditions. Thus the controversies about OA to music and film (was Napster "piracy"? did it cost any actual musicians any money?) do not apply to the scholarly literature, the authors of which are clearly better off if access to their work is not restricted. Online publishing is much less expensive than its print-only ancestor, but it is not free; the big question of OA is how to pay the bills that do remain without charging access fees. Nearly all current OA models reduce to one of two basic blueprints: OA archives/repositories, and OA journals.
OA archives or repositories simply make their contents freely available to the world. They may contain preprints (the author's version prior to peer review), refereed postprints, or both. Archiving preprints does not require any form of permission, and a majority of journals already permit authors to archive their postprints. Archives which comply with the metadata harvesting protocol of the Open Archives Initiative are interoperable and can be searched as though they comprised a single (enormous, virtual) database, using high-level services such as OAIster. There are a number of open-source software packages available for building and maintaining OAI-compliant archives; Peter Suber maintains a list of lists of such archives, and SHERPA maintains a database of journal policies regarding pre/post-print archiving. Archives cost very little to set up and maintain, and increasing numbers of universities and research institutions are building their own. PubMed Central, maintained by the NIH, is probably the largest and best-known in biomedical science. ArXiv, run by Cornell University, is the principal means of transfer of research results for many (if not most) mathematicians and physicists. Stevan Harnad, a leading advocate of self-archiving, maintains a comprehensive self-archiving FAQ file.
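Concretely, the OAI interoperability mentioned above works over plain HTTP: a harvester sends a request with a `verb` parameter (e.g. `?verb=ListRecords&metadataPrefix=oai_dc`) and parses the Dublin Core XML it gets back. Here is a minimal sketch in Python; the repository URL and the record contents are invented for illustration, and a real harvester would of course fetch over the network rather than parse a canned response:

```python
import xml.etree.ElementTree as ET

# Namespace URI defined by the Dublin Core element set.
DC = "{http://purl.org/dc/elements/1.1/}"

# A harvester would fetch something like
#   http://example-repository.org/oai?verb=ListRecords&metadataPrefix=oai_dc
# (base URL hypothetical); here we parse a tiny canned response instead.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>A hypothetical preprint</dc:title>
          <dc:creator>Doe, J.</dc:creator>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def extract_titles(xml_text):
    """Return the dc:title of every record in a ListRecords response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter(DC + "title")]

print(extract_titles(SAMPLE_RESPONSE))  # ['A hypothetical preprint']
```

Because every compliant archive answers the same six verbs in the same XML format, a service like OAIster can aggregate thousands of repositories with exactly this kind of simple client code.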
OA journals are in most respects the same sorts of entities as traditional paid-access journals, but without the access fees. They perform peer review, and make the refereed articles available free to all comers. They pay the bills in a number of different ways. About half charge author-side fees, though who actually pays these is widely variable (author, author's institution, funding body, etc.). Publishing in an OA journal is obviously 100% compatible with self-archiving. The DOAJ currently lists nearly 2500 peer-reviewed OA journals, of which more than 700 are searchable at the article level; for larger lists of OA journals which may or may not be peer-reviewed, see JournalSeek or Yahoo's Free Full Text. Three of the most prominent OA journal publishers are the Public Library of Science, Hindawi Publishing and BioMed Central, and a number of traditional publishing companies now offer OA options.
A personal example
I have yet to publish any data here in the US, but I published a dozen or so articles while I was at the University of Queensland. More than half of these are not freely available from the journals in which they were published (J Clin Virol, Virology, Biochim Biophys Acta, Mol Biochem Parasitol, Acta Tropica -- all Elsevier journals, pfui! -- and Rev Med Virol from Wiley InterScience). I couldn't find any full-text copies online using Google Scholar or PubMed, either. You cannot read these seven papers of mine without paying a fee (usually around $30) or physically going to a library which carries (and has therefore paid for) the journal and issue in question. Neither can my professional colleagues, unless their institution happens to subscribe to the journal or some package which includes it; these subscription fees are commonly extortionate (Elsevier being a particularly egregious offender).
For you as a taxpayer, this means that you are denied access to information you've already paid for (since I've always been funded by government grants). For me as a scientist, it means that more than half of my life's work to date is, while not useless, certainly of much less use to the world than it might be. Given that a large part of why I do what I do is that I want to leave the world a better place than I found it, that is simply not acceptable to me. Fortunately, according to RoMEO, all of the journals concerned allow postprint archiving by authors, so I might be able to rescue it. Searching for "queensland" in DOAR (one of a number of such directories) leads me to ePrints UQ, so there is a relevant archive for me to use, but there's a catch: you have to be a current UQ staff member to deposit. I can (and will) talk to David Harrich, my boss at the time, about archiving all of our HIV papers, since Dave is still at UQ. My schistosomiasis papers, though, have no one on the author lists who could deposit them, so I'll have to contact the staff at ePrints UQ and see whether there's a way for ex-staff to deposit articles. If there isn't, I'll have to either find another repository that will take the articles, or make one of my own. Since my current employers don't have an institutional repository, I'm going to have to make that choice anyway for upcoming papers. Both arXiv and Cogprints will take biology papers, although mine don't seem to fit into any of their categories, and Peter Suber has mentioned building a Universal Repository in collaboration with the Internet Archive, but I'm not sure if anything has come of that endeavour. That leaves me with the option of building my own archive, for the purposes of which there are numerous open-source software packages available. 
Alternatively, at least as a first step, I could simply upload the papers to my own webspace somewhere and try to make sure that the Internet Archive and Google Scholar know about them, so that they would be available though not interoperable with other repositories. Finally, there's one last catch: Elsevier won't let me use their pdf versions, and I don't have the original files in most instances. So whatever I do, I'm going to have to track down the published versions and then reverse-engineer an "unofficial" version.
Why would I go to all this trouble? Because OA offers significant benefits and advantages to a variety of stakeholders:
Benefits of Open Access
1. Maximal research efficiency. The usual version of Linus' Law says that given enough eyeballs, all bugs are shallow -- meaning that with enough people co-operating on a development process, nearly every problem will be rapidly discovered and solved. The same is clearly true of complex research problems, and OA provides a powerful framework for co-operation. For instance, Brody et al. showed that, for articles in the high-energy physics section of arXiv (one of the oldest archives available for such study), the time between deposit and citation has been decreasing steadily since 1991, and dropped by about half between 1999 and 2003. Alma Swan explains: "the research cycle in high energy physics is approaching maximum efficiency as a result of the early and free availability of articles that scientists in the field can use and build upon rapidly".
Moreover, the machine readability of a properly formatted body of open access literature opens up immense new possibilities. Paul Ginsparg, founder of arXiv, observes:
True open access permits any third party to aggregate and data mine the articles, themselves treated as computable objects, linkable and interoperable with associated databases. We are still just scratching the surface of what can be done with large and comprehensive full-text aggregations.
...exciting new developments in text-mining and data-mining are beginning to show what can be done to create new, meaningful scientific information from existing, dispersed information using computer technologies. Research articles and accompanying data files can be searched, indexed and mined using semantic technologies to put together pieces of hitherto unrelated information that will further science and scholarship in ways that we have yet to begin imagining. These technologies are just in their infancy at the moment. Real scientific advances will be made using them but the technologies can only be applied effectively to the open access corpus: literature and data hidden behind journal or databank access restrictions are invisible to the computer tools that can do this work...
2. Maximal return on public investment. Just as OA is, at least for now, primarily (though not exclusively) aimed at literature for which the authors are not paid any kind of royalty, so one obvious focus of attention is government-funded research. Why should taxpayers pay twice, once to support the research and then again when the scientists they are funding need access to the literature? More importantly, open access to a body of knowledge makes that knowledge more available and useful to researchers, physicians, manufacturers, inventors and others who make of it the various socially desirable outcomes, such as advances in health care, that government funding of research is intended to produce. Peter Suber has gone over this intuitive position in some detail here.
3. Advantages for authors. There are well over 20,000 scholarly journals, and even the best-funded libraries can afford subscriptions to only a fraction of them. OA offers authors a virtually unlimited, worldwide audience: the only access barrier is internet access (which is, of course, cheaper to provide in poorer nations than comprehensive libraries of print journals would be!). There is a large and steadily growing body of evidence showing that OA measurably increases citation indices (that is, the number of times other papers refer to a given article). For instance, of the papers published in the Astrophysical Journal in 2003, 75% are also available in the OA arXiv database; the latter papers account for 90% of the citations to any 2003 Astrophysical Journal article, a 250% citation advantage for OA. Repeating the exercise with other journals returns similar results.
Not only is this of vital importance to academics when it comes to applying for funding or competing for tenure, it's more or less the whole damn point of publishing research in the first place: so that other people can read and use it!
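For the curious, the arXiv/Astrophysical Journal figures above imply a striking gap in per-paper citation rates. The arithmetic below is mine, not the source's, and assumes the 75%/90% split is exact; depending on how one defines "advantage", the same numbers can be quoted several ways, but the underlying ratio is simple to compute:

```python
# Of 2003 Astrophysical Journal papers, 75% are also in arXiv (OA), and
# those papers receive 90% of all citations to 2003 ApJ papers.
oa_share_of_papers = 0.75
oa_share_of_citations = 0.90

# Citations per paper, as a fraction of the journal-wide average.
oa_rate = oa_share_of_citations / oa_share_of_papers              # 1.2
non_oa_rate = (1 - oa_share_of_citations) / (1 - oa_share_of_papers)  # 0.4

# The OA papers are cited roughly three times as often per paper.
print(round(oa_rate / non_oa_rate, 1))  # 3.0
```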
4. Advantages for publishers: the benefits that accrue to authors of OA works also work to the advantage of publishers: more widely read, used and cited articles translate into more submissions and a wider audience for advertising, paid editorials and other value-add schemes.
5. Advantages for administrators. One of the best available proxy measures for research impact is citation counting: how many times has a given paper been cited by other researchers in their published work? This idea led to the development of the impact factor, a measure of a particular journal's importance within its own field. These sorts of bibliometric indicators are relied upon heavily by science administrators making decisions about funding, by faculties making decisions about tenure cases, and so on. Open access, by removing the subscription barriers that splinter the research literature into inaccessible proprietary islands, raises the possibility of vast improvements in our ability to measure and manage scientific productivity.
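The impact factor mentioned above has a simple standard definition: citations received in a given year to a journal's items from the previous two years, divided by the number of citable items the journal published in those two years. A minimal sketch, with made-up figures for illustration:

```python
def impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """Two-year impact factor: citations in year Y to items published in
    years Y-1 and Y-2, divided by the number of citable items from those
    same two years."""
    return citations_to_prev_two_years / items_prev_two_years

# Hypothetical journal: 1200 citations in 2006 to its 2004-2005 articles,
# of which 400 citable items were published in 2004-2005.
print(impact_factor(1200, 400))  # 3.0
```

Note that this is a journal-level average, not a property of any individual paper, which is one reason open, article-level citation data would be such an improvement for the administrators' purposes described above.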
6. Scalability. Peter Suber has pointed out that, because it reduces production, distribution, storage and access costs so dramatically, OA "accommodates growth on a gigantic scale and, best of all, supports more effective tools for searching, sorting, indexing, filtering, mining, and alerting -- the tools for coping with information overload." Online distribution is necessary but not sufficient for scalability, because subscribers to paid-access journals do not have unlimited budgets even if they are enormous institutional libraries. For end users to keep pace with the explosive growth of available information, the cost of access has to be kept down to the cost of getting online.
Tune in Next Time
In the second instalment, I will look at open access to raw experimental data, cooperation over competition as a research model and the ever-expanding role of the Web in science. In the meantime, if this has piqued anyone's interest in OA (and I hope it has!), here are my Simpy collections of open access and open science links.
One Last Thing
This is an immense topic, and anyone who knows anything much about it will certainly see things I've missed or got wrong. That's what the comments are for! Blogs are conversation tools, and I'd appreciate your feedback.
This work is licensed under a Creative Commons Attribution 3.0 License.
Dispatches: On The Shining
It is the greatest scary movie and the scariest great movie there is. From its first foreboding shots of a car on mountain roads, The Shining tends to unmoor me completely from my critical faculties. It's a cliche that it only gets scarier with each viewing, but I'll reassert it: I am more scared and compelled when watching today, maybe for the twentieth time, than ever. Since it's more a dreamlife than a movie, and depends on no shocks for its disturbing power, it only lodges deeper within me each time. The problem is that in addition to being disturbing, The Shining also produces such pleasure that one has to watch it periodically anyway. It creates a world of such photographic perfection and precise beauty that the idea of, say, a serif typeface in the credits would deface it utterly. Yet it's really, really funny. It's weird, but its austere beauty contributes to its scariness: if it had aged badly, it would be much easier to laugh at and dismiss. You could look away.
A number of memorable films were released in 1980: Raging Bull, The Elephant Man, Gloria, Ordinary People, The Empire Strikes Back, Private Benjamin. The first three are great movies; seen today, however, none of them possess the strangeness of The Shining. For example, to watch Raging Bull today is to enjoy an excellent and movingly acted biopic. To watch The Elephant Man is to recall David Lynch in his embryonic period, with glimpses of the full-blown uncanny paranoia that emerges later. In both cases, we are the master of our viewing experience - after several viewings, we comfortably comprehend and analyze the movies' goals and the degree of success with which they reach them. They are movies. Nothing like this is the case upon re-viewing The Shining today. It doesn't even seem like a movie. It fits into the category "films of 1980" in a much more confounding way, when it does at all.
A profound uneasiness surrounds The Shining, enveloping it so thoroughly that it seems some sort of unclassifiable formal object, so freshly does it continue to impress itself. One way to put this would be to say that the other 1980 movies I mentioned are examples of different cinematic tones; The Shining's tone somehow stands apart and escapes the category of "movie" altogether. This strangeness expresses itself by the camera's action as well as the actors, who seem not so much to be professionals acting as the denizens of an archetypal reality. Both the setups and the action occur at a very deliberate speed, giving you time to register (and interpret) everything that happens, every authoritative cut and koan-like speech. In the very first sequence, when Jack Torrance is interviewed for the job of caretaker, the manager begins to explain the case of the previous caretaker, Grady. As he introduces the topic, he stumbles, then makes a forced laugh. There is an inexplicably pregnant cut towards his assistant, who sits in a chair, absolutely motionless - he's wondering how the boss is going to broach this particular delicate subject. That cut does everything to establish the mood of wrongness, yet it's unmotivated on first view, like a lot of other bits. That's why the movie feels at an angle to the horror genre - as everyone notices, it completely eschews the usual scare tactics: sudden music cues come at the "wrong" times, our expectation of scary shocks is mostly thwarted, and the compositions are balanced and photographic instead of aslant and "weird."
An odd thing: The Shining retains its tonal freshness even though much of it has entered pop culture. Jack chopping down the door, the two murdered girls holding hands, Danny scrawling "redrum" on the mirror, the deluge of blood from the elevator: that these are celebrated and iconic images has not dulled their talismanic power. Seeing them again, the movie's technique of preparing you for a disturbing sight, making you anticipate it, becomes more and more terrifying. (I'll confess here that I can't even watch the sequence in room 237 with the woman in the bathtub anymore - I worry so much for Danny I just can't watch it.) Even many secondary moments have entered the collective memory. I remember first watching Jurassic Park and grinning at Spielberg's allusion to The Shining as that movie's children hide in the stainless steel cabinet of a professional kitchen. Those pursuing velociraptors, though, did nothing to dislodge from memory the far more potent scene to which it alludes, of Danny hiding from his father.
It's Danny Lloyd's presence in the film, I think, that makes it so perfectly disturbing - any complicity the audience might have with Jack Nicholson's charismatic psychosis rebounds viciously whenever Danny is onscreen. His face is so beautiful and his manner so painfully innocent that he personifies the vulnerability of childhood. This splits us against ourselves, since the film has taken pains to help us identify with Jack's contemptuous antics. If Shelley Duvall's shrill Wendy Torrance makes us want to scream in frustration with Jack (and Kubrick), Danny makes us desperately protective against our own impulses. We cower, transfixed, when Jack sadistically explains cannibalism to his young son. And in a quite brilliant touch, Scatman Crothers plays the hotel's head chef as the one truly loving, caring (and "shining") person in the movie. Naturally he ends up as the victim of the film's vicious and only murder.
I don't mean to raise the ur-scenario of the isolated nuclear family as a suffocating nightmare above others, though. But the fact remains that The Shining often seems to be an allegory for something without quite enough clues as to its meaning. The massacre of Native Americans, the numerological significance of the number 12, the doctrine of eternal recurrence, the perils of alcoholism, the sense of human social life as a pathetic delusion, all of these can be supported by the right observer. What can we glean from this excess of possibility? Maybe this: the dream, not the real, is the state to which the film aspires, which is why it includes so many irregularities and "mistakes." Watch the famous scene of Jack chopping down the bathroom door behind which Wendy cowers. He chops through the right pane of the door, then sticks his hand through, which Wendy slices with her carving knife. We cut back to Jack's face, then to a shot of him in front of the door, where the right pane is splintered, but the left pane is now completely and cleanly removed. It's plainly absurd. This in a film that took a year(!) to edit, with forty or more takes of each shot.
Perhaps the most lauded innovation of the film is its use of long Steadicam (an apparatus that allows an operator to walk and run with the camera without shaking) shots that track various characters around the hotel. Among its many vivid employments, you probably particularly remember Danny riding his tricycle through the halls of the Overlook, his wheels alternately humming over the wood floor and being muted over the carpet. These shots, along with the model of the hedge maze that Jack looks into, give us a clear and masterful spatial knowledge of the hotel's layout, allying us with the maleficent authority of "the house," implicating us in the desire to kill. ("Your money's no good here, Mr. Torrance. Orders from the house.") As we haunt the hotel, so does Jack in his accelerating derangement. Kubrick even mirrors this acceleration by speeding up the intertitles, from "ONE MONTH LATER" to "THURSDAY" to "8PM." The Steadicam, the titles, the pace - these devices combine to remove the film from everyday life. When you see Vivian Kubrick's documentary on the movie's making, the humdrum randomness of life on set (Jack mugging, Shelley whining, Danny running by with... it's Leon Vitali! "Bullingdon," from Barry Lyndon!) seems totally bizarre - by contrast, the film itself feels implacable, like it has no wayward elements. It didn't include any of the normality it didn't want. It never could have been any other way.
And that, I believe, is the secret to The Shining: the perfect autonomy of its execution. Shooting fifty takes of Danny running through the maze to get that perfect one starts to make sense: the film is stripped of any inner timeliness, any traces of the reality of 1980. Its other-world hangs together so immanently, with such formal unity. But such unity turns out to be suspect. At the movie's center is Jack's typewriter and the outrageous moment when Wendy discovers what his work consists of: ALL WORK AND NO PLAY MAKES JACK A DULL BOY typed thousands of times, as I hardly need to repeat. The most chilling of the suggestions the movie makes might be the idea that authorship, and maybe auteurship, are forms of psychosis. To desire to create and escape into one's own world is to risk succumbing to a sociopathic detachment. You get the sense here of an autocritique: after all, it's Kubrick who so clearly delights in the tiniest of details, like the little toy ax and American flag on the hotel manager's desk, and it's us who cheer him on. Among all filmmakers, the sense of a pure aesthetic, a fully controlled formal world, is never greater than with Kubrick. But it's just this escapist impulse that The Shining suggests is murderous, and you can't escape it if it's in you too.
PERCEPTIONS: eat your heart out on all souls day
From the Sweet Little Domestic Life weblog.
Selected Minor Works: A Philosophical Exchange, of Sorts
Justin E. H. Smith
(For an extensive archive of Justin Smith's writing, please visit www.jehsmith.com)
Many of us in the 3QD community have witnessed a recent assault on our inboxes from Australians with big ideas. It seems the land down under plays host to a vast network of retired insurance adjustors, used-car salesmen, accounts payable clerks, etc., who believe, against all probability, that they have discovered a number of very grand truths about the meaning of life, the right path for humanity's future, etc. They live off their wives' paychecks, sit in their pj's in their basement romper rooms (at least this is how I imagine them), and post grandiose proclamations in tortured English about the decline of civilization, the one true path towards renewal, and (you guessed it) the deplorable takeover of the universities by postmodern feminist leftist brainwashers. They self-publish their ruminations, and declare that the lack of interest on the part of university presses can only have to do with the dangerousness, the epoch-making potency, of what they have to say.
I have long been fascinated with autodidactic self-styled "philosophers." I always think to myself when I encounter them: it's not like it takes a genius to be a real philosopher. I got my Ph.D. in the thick of a six-year haze of pointless partying, all-consuming "relationships" with people whose names I've since forgotten, and the usual twenty-something bon-vivantism that is at such great odds with the truly committed life of the mind. And here I am some years later, not exactly up there with Habermas in terms of the influence my proclamations carry, but also not down there with the cranky and alienated writers of letters to editors at small-time local newspapers, railing against tax-and-spend liberals, the immodesty of teen pop stars, etc. So why, oh why, would anyone choose the parasitic social role of the self-trained loner philosopher, who enjoys none of the social capital of the professional, and who inevitably will be unable to communicate with anyone whose opinion carries any weight at all in society, never having learned the appropriate behavioral and lexical cues that make communication possible? What are the social factors that make these men (and they are always men) possible?
Intrigued (and procrastinating), I responded to an e-mail sent to me by one Philip Atkinson, who had got in touch after reading a 3QD essay in which I acknowledged, innocently enough, that I do not know what philosophy is. I present to you below an abridged transcript of the correspondence that followed.
Further to your confession that despite becoming tenured as a professor of philosophy you have no idea what philosophy is. I have spent the last few years challenging professors of philosophy to define or confess they cannot define what philosophy is. None would meet my challenge, so I am delighted to at last discover a university philosopher who confesses the truth. You are unique.
The purpose of my challenge is to demonstrate that philosophy is now unhelpful but demonstrate that this can be repaired by adopting a simple set of beliefs. And as proof of this claim, by using the proposed set of beliefs I recommend, I can explain the nature of civilization: a phenomenon that has defied all previous attempts at explanation.
So would you please read my claim about philosophy (or civilization) and confirm or refute it.
I don't think you quite appreciated the tone or spirit of the essay of mine you are citing. If you had, you would have detected the sense of irony that motivated me to write it, and you also would have understood that my ironic distance from my own discipline is a result of my disappointment with systems of definitions and axioms, such as the one you are promoting.
Thank you for responding to my letter, your reply is disappointing however. Your confession of ignorance about the nature of philosophy is independent of tone or spirit. No-one knows what philosophy is or what is can be used for; this is the very thing I am trying to repair. It was refreshing to see this simple truth confessed; but now you wish to recant.
You may well feel disappointment with definitions and axioms, but they are an essential part of understanding and everyone uses them, albeit unconsciously. The very meaning of words is axiomatic. The notions I am promoting as axioms of philosophy are easy to understand, and easy to use, unlike the pompous nonsense that generally is expressed by self-declared philosophers. I could sympathise with your disappointment with axioms and premises if I had to listen to the nonsense that now passes for philosophical expression in universities. My work is an attempt to make philosophy useful for everyone. It is an attempt to make the organisation to which we all belong, civilization, understandable.
I have asked you to do what I have asked every self-declared philosopher to do: refute, or confess you cannot refute, my claims. They invariably opt to behave like you and do neither; can't do one, wont do the other. Please note that your failure to uphold truth is betrayal of your duty, your community, and yourself: but you do keep your job.
You are not going to get anywhere attempting to communicate with professional philosophers using such a heavy-handed and arrogant approach. Trust me: your website is of absolutely no interest to anyone trained in philosophy. It is not at all surprising that no one has been interested in 'refuting' what you have to say. What you have to say seethes with outsider frustration. It is a call for attention, not an invitation to dialogue.
My advice to you --and I mean this sincerely, as advice that will benefit you-- is to delete the website, forget everything you think you know, and spend the next 10-15 years of your life reading the great works of philosophy with an unprejudiced spirit: that is, do not read them hoping to confirm what you already think you know. Read them hoping to learn from them. I would recommend starting with Plato's dialogues. Get back to me then: I will be in my early fifties.
Thank you for again responding to my letter, your reply is dishonest however. You make no attempt to answer my simple claims but merely attack my character with innuendo. You publicly confess you do not know what philosophy is, but when this claim wins my attention you try to tell me you did not mean it. So please tell me what philosophy is so that I can recognise a philosophical claim when I see it.
Please note that the truth or falsity of a claim is independent of the character of the author. A claim is either true or false depending upon the claim. If you can refute a single claim that I make, please do, otherwise I will believe you cannot.
Why do you suggest I destroy a simple theory that you cannot contradict? Especially as it clarifies the very subject that you are a professor in.
I'm going to try this one more time, because I continue to believe that all of us are capable of developing intellectually, even the most hard-headed.
I did not use any innuendo in my last message. I stated outright, in no uncertain terms, that your website is not interesting. I stand by that. It is wholly and completely without philosophical interest, not just to anyone who is a career philosopher, but to anyone who values subtlety of mind.
If the project is interesting at all, it is so only as a window into the world of a curious product of our society (and I've come across many similar cases (you would get along famously with Ronald Jump of Toledo, Ohio)): the autodidactic outsider who retires from an intellectually undemanding career in which he was never able to cultivate stimulating idea-based relationships, and at some point gets it into his head that he has something far more important to say than he in fact does. You fit this demographic to a tee.
I wrote the essay you cite using stylistic and rhetorical techniques about which you appear to know nothing. But fine, let us suppose I meant, literally and bluntly, that I do not know what philosophy is. That does not entail that I do not know what it is not. And I definitely know it is not what you have posted on your website.
When I move into a more serious rhetorical mode, I am indeed confident enough to proclaim a few things about philosophy: it involves, ineliminably, humility and openness, two traits you appear to lack entirely. Socrates is, for example, a philosopher, and again, I think you could learn from him if you would just get over yourself. I think, indeed, there are plenty of professional philosophers working today who are by no means world-historical figures, but who are smart enough and from whom you could learn a thing or two if you were not so arrogant. I've learned from many of them, and I do my best to pass on what I've learned to my students.
If you insist, then I am happy enough to indulge you in your little fantasy and confess that I cannot "refute" your claims. Does that give you a little thrill? Have you won? If it does give you a thrill, then you really are a hopeless case, and you are certainly no philosopher. I also cannot refute the claims, such as they are, of the lonely souls I meet on the streets of Montreal promoting the ideas of Lyndon LaRouche, Sun Myung Moon, or Rabbi Schneerson. This does not mean that these are good ideas. Indeed it is often the case that the weaker a claim is, the harder it is to engage it substantively: one cannot get a foothold in shared background assumptions, and communication proves impossible. This is why I cannot "refute" your project. I could pick it apart critically, bit by bit (I could, for example, point out that, whatever reality is, it is certainly not, as you claim, a "criterion" of anything), but that would be a commitment that I would only be willing to make to someone who has proven willing to enter into dialog with me. And you do not seem ready to do this.
If on the other hand you find that a victory like this lacks dignity, then, again, I repeat my advice: go learn from some people who have managed to say some profound and insightful things, and get back to me in a decade or so.
Thank you for again responding to my letter, your reply is silly however. Instead of replying to my simple questions, whose answers are crucial, again you digress.
Whether a subject is interesting to a reader or not is only of concern to that reader; it has no bearing on the truth or falsity, importance or unimportance, of the ideas in question. So why bring up such a claim? You keep using a term whose meaning you do not know, cannot define and therefore cannot understand. This must mean you do not know what you are talking about whenever you use the term philosophy; no statement you make using the word philosophy has a sensible meaning. And when you claim "I definitely know it is not what you have posted on your website" you are lying. Unless you can refute a claim, you cannot know if it is true or false.
The popular opinion that the public have about philosophers is that they are pompous, tiresome, fools who spend their time demonstrating that black is white. My simple theory, that you cannot refute, not only confirms this view but offers a way of making philosophy useful. You can neither refute my simple claims nor answer simple questions; this is evidence of your inadequacy; but you do not have the courage to admit your inadequacy.
I apologise for asking you to do something that is beyond your ability.
OK: your claims are false (as well as uninteresting). Reality is not a criterion of anything; truth is a relation that obtains between a belief and a state of affairs in the world, and so cannot itself be a belief; philosophy cannot simply be the study of the understanding, because this would leave out other important faculties such as reason, perception, emotion, etc. It would also leave out all areas of philosophy other than epistemology. You might want to collapse all of these faculties into understanding, but the burden is on you to explain how this could be done. You do not do that. If you had read any serious philosophy, you would know about important distinctions such as that between understanding and reason.
"Philosophy" is not a synonym of "civilization" as you indicate in your first message to me. If it were then we would get, by substitution, the odd claim that civilization is the study of understanding, a claim so foolish I don't think even you would be willing to support it. You say on your "Theory of Civilization" page that "civilization is an understanding". Well, what is it: is civilization an understanding or is it (by substitution) the study of the understanding?
You say that understanding is the invoking by reason of a set of values, or a morality. But earlier you had said that understanding is the bestowing of meaning. Which is it? If you want to say that these two definitions amount to the same thing, you have to explain how they could. In particular, you would have to explain in what sense the meaning of a word like "the" or "and" has anything to do with morality.
You say that morality (together with knowledge) FORMS truths. That makes you a subjectivist and a relativist about truth (though you probably didn't intend this). If morality and knowledge form truths, then truth is not an objective state of affairs independent of us, which is something you have been intent on claiming. You are therefore contradicting yourself.
I could go on (and on, and on) explaining all the problems with your claims. But I have rational and thoughtful people to deal with, and that is more rewarding. Your claims are false. They are wrong. And they are the consequence of your apparent total inability to learn from others.
I repeat my advice: if you wish to be a philosopher, you will have to wipe the slate clean and make yourself ready to learn from great philosophical minds. You are not (yet) a great philosophical mind, or even a middling one, but you may have it in you to become one if you are willing to scrap this false and frivolous website of yours and start afresh.
Thank you for again responding to my letter, your reply is mistaken however. It is idle merely to state your beliefs as truth, it merely tacitly confirms the notion that all truth is, or is founded upon, a belief or set of beliefs. Hence all reality is the creation of an understanding. And to study understanding is to study the creations of understanding, which includes epistemology - knowledge. As a civilization is an understanding then it is the proper study of philosophy.
You appear to be unable to differentiate between the study of civilization and civilization. Civilization is not philosophy, but the study of civilization is. You again repeat your mistake of using the word philosophy, whose meaning you do not know, to advance an argument. To bestow meaning an understanding has to invoke its values (beliefs), and this process is called reasoning. Reasoning is the manipulation of beliefs.
I explain that morality is the set of values (beliefs) that an understanding uses to recognise right from wrong, which are permanent for the life of the understanding. This is separate from those values an understanding uses to recognise true from false, which are knowledge, and vary depending upon the experience of the understanding.
What is it you cannot understand about this simple explanation of understanding? How do you think you understand, except by using this process?
I can bestow no sensible meaning upon your paragraph claiming I am a subjectivist.
You have failed to refute a single claim; you have only demonstrated an inability to think clearly.
You have proven me wrong in at least one sense: it turns out that some people really are incapable of improving their intellects. I hope you continue to find your project rewarding, and I hope your wife will be willing to continue sponsoring it. Have you asked her, by the way, if she has any interesting ideas about philosophy?
I wish you many years of blissful, ignorant, self-righteous tranquility.
October 29, 2006
A Country Ruled by Faith
Garry Wills in the New York Review of Books:
The right wing in America likes to think that the United States government was, at its inception, highly religious, specifically highly Christian, and even more specifically highly biblical. That was not true of that government or any later government—until 2000, when the fiction of the past became the reality of the present. George W. Bush was not only born-again, like Jimmy Carter. His religious conversion came late, and took place in the political setting of Billy Graham's ministry to the powerful. He was converted during a stroll with Graham on his father's Kennebunkport compound. It is true that Dwight Eisenhower was guided to baptism by Graham. But Eisenhower was a famous and formed man, the principal military figure of World War II, the leader of NATO, the president of Columbia University—his change in religious orientation was just an addition to many prior achievements. Bush's conversion at a comparatively young stage in his life was a wrenching away from mainly wasted years. He joined a Bible study culture in Texas that was unlike anything Eisenhower bought into.
Bush was a saved alcoholic—and here, too, he had no predecessor in the White House. Ulysses Grant conquered the bottle, but not with the help of Jesus.
Anthony Kaufman in Seed Magazine:
Earlier this month, physicists in Copenhagen announced they had successfully teleported information through half a meter of space to a large object. The experiment, the first to transport information between light and matter, is said to be a revolutionary step in the field of quantum teleportation. But while it's one thing to teleport atomic data, it's quite another to teleport an entire human being.
The Prestige, a new Hollywood thriller, takes up the question of teleportation as one of its central conundrums: Is it feasible? What would people leave behind after teleporting? And could you bring your hat along for the ride?
The Silence of Günter Grass
Neal Ascherson in the London Review of Books:
A great deal of the abuse heaped on Grass in the last few months has come from old enemies and rivals. Those, especially on the nationalist right, who had writhed under his satire and resented what they saw as his systematic undermining of German self-confidence, were enchanted. What a downfall to relish! He, too, the mighty novelist accepted by the outside world as Germany’s political conscience, had hidden his past. But there are many more Germans who had used those early novels – The Tin Drum, Cat and Mouse, Dog Years – to form their own idea of their nation and its curse of amnesia. And they are hurt, as if Grass had let them down. He could have told the truth about those months, they lament, and nobody would have thought much the worse of him. In fact, to admit that he had been in the Waffen SS, however briefly, might have given even more resilience to his fiction and to his politics. What held him back, until it was too late?
Wikipedia's lamest edit wars
Occasionally, Wikipedians lose their minds and get into edit wars over the most petty things. This is to document that phenomenon. This page isn't comprehensive or authoritative, but it is designed to show the "worst-case" result of people attaching so much importance to some trivial detail that they are willing to engage in the lame pastime of edit warring over an even lamer cause. Back in the good old days, people settled this sort of thing with a gunfight. Now they do it by screwing with an encyclopedia. Truly, the Wikipedia outlook has changed the way things get done. Specifically, it has changed them from actually getting done to never getting done. On the other hand, nobody gets shot, either.
How to be funny
"Why are comedians such good liars? How hard do they work on their jokes? And how important is... timing? Jimmy Carr and Lucy Greeves explain the rules."
From the Telegraph:
They all laughed when I said I wanted to be a comedian. They're not laughing now.
This Monkhouse gag is funny but, of course, it's much better heard than read. On paper, a joke is a pale and inadequate one-dimensional version of itself. In fact, a joke scarcely exists until someone has told it and someone else has laughed.
The who, where, when, what and why of a joke's telling can be more significant than its topic, and no single theory - from Freud's notion of the joke as a release of suppressed sexual neurosis to Schopenhauer's definition of humour as a reaction to incongruity - can explain how jokes work.
Even comedy's greats seem stuck for a proper analysis. When John Cleese tired of questions about where he got his jokes from, he resorted to, 'I buy them from a little man in Swindon.' The truth is much more prosaic. Jokes are about 10 per cent inspiration and 90 per cent whittling and crafting - much of it in front of an audience.
"I can't keep up with myself"
"Elfriede Jelinek dismantles the novel with her latest, Greed. Lucy Ellmann applauds the tireless, scathing fury of a Nobel laureate."
From The Guardian:
For anyone who wants to write or read daredevil, risk-taking prose, therefore, it was tremendously encouraging that Elfriede Jelinek won the Nobel prize for literature in 2004. But most British readers hadn't heard of her, despite four novels being available from Serpent's Tail (Lust, Wonderful, Wonderful Times, Women as Lovers, and The Piano Teacher), all of them full of her uniquely sneering tone and tireless fury with the human race. Jelinek seized the novel by its bootstraps and shook it upside down. Was she looking for coins or keys, or just trying to prevent fiction swallowing any more insincerity? Her dynamic writing gives a sense of civilisation surviving against the odds.
Neocons, Betrayed by Battlestar Galactica
Brad Reed in The American Prospect on the odd love affair between neoconservatives and science fiction:
Over the sci-fi show's first two seasons, many conservatives saw it as a pitch-perfect metaphor for the United States’ post-9/11 battle against Osama bin Laden and his Muslamonazi horde. Galactica, which has become something of a surprise hit on the Sci Fi Channel, takes place in a post-apocalyptic universe where humanity has been decimated by a nuclear strike launched by an enemy race of robots known as the Cylons. Most of the action revolves around a noble band of 50,000 survivors who hurtle through space searching for a new home planet. Along the way, they have had to deal with Cylon sleeper agents, suicide bombers, and even a sinister pack of left-wingers who use violence to try to force humanity to make peace with their enemies.
“The more I watch the new Battlestar Galactica series, the more the Cylons seem like Muslims,” wrote “Michael,” the author of the Battlestar Galactica Blog, back in March. “They believe they are killing humans for their god. This is very much like the Muslim concept of jihad, which instructs Muslims to spread their religion through war.”
National Review’s Jonah Goldberg, who writes regularly about Galactica’s politics on NRO’s group blog, The Corner, also picked up on parallels between the show and the war on terror. Goldberg took particular glee in attacking Galactica’s anti-war movement, which he said consisted of “radical peaceniks” and “peace-terrorists” who “are clearly a collection of whack jobs, fifth columnists and idiots.” Goldberg also praised several characters for trying to rig a presidential election. “I liked that the good guys wanted to steal the election and, it turns out, they were right to want to,” wrote Goldberg. Stolen elections, evil robots, crazed hippies … what more could a socially inept right-winger want from a show?
But alas, this love affair between Galactica and the right was not to last: in its third season, the show has morphed into a stinging allegorical critique of America’s three-year occupation of Iraq.
How to Hack the Vote and Steal an Election
[Hat tip: Roop]
Over the course of almost eight years of reporting for Ars Technica, I've followed the merging of the areas of election security and information security, a merging that was accelerated much too rapidly in the wake of the 2000 presidential election. In all this time, I've yet to find a good way to convey to the non-technical public how well and truly screwed up we presently are, six years after the Florida recount. So now it's time to hit the panic button: In this article, I'm going to show you how to steal an election.
Now, I won't be giving you the kind of "push this, pull here" instructions for cracking specific machines that you can find scattered all over the Internet, in alarmingly lengthy PDF reports that detail vulnerability after vulnerability and exploit after exploit. (See the bibliography at the end of this article for that kind of information.) And I certainly won't be linking to any of the leaked Diebold source code, which is available in various corners of the online world. What I'll show you instead is a road map to the brave new world of electronic election manipulation, with just enough nuts-and-bolts detail to help you understand why things work the way they do.
Along the way, I'll also show you just how many different hands touch these electronic voting machines before and after a vote is cast, and I'll lay out just how vulnerable a DRE-based elections system is to what e-voting researchers have dubbed "wholesale fraud," i.e., the ability of an individual or a very small group to steal an entire election by making subtle changes in the right places.
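To make "wholesale fraud" concrete, here is a toy sketch of why subtle changes in the right places are so dangerous. This is purely hypothetical code of my own (it is not drawn from any real voting machine, and the function names are invented): a tally routine that silently reassigns a small fraction of votes differs from an honest one by only a few lines, which is exactly what makes auditing DRE software so hard.

```python
def honest_tally(votes):
    """Count votes per candidate from a list of ballot choices."""
    counts = {}
    for v in votes:
        counts[v] = counts.get(v, 0) + 1
    return counts

def rigged_tally(votes, favored, rate=0.02):
    """Same interface as honest_tally, but silently reassigns roughly
    `rate` of all ballots to the favored candidate -- a shift small
    enough to look like ordinary variation in any casual check."""
    counts = {}
    step = int(1 / rate)  # reassign every step-th ballot
    for i, v in enumerate(votes):
        if i % step == 0:
            v = favored
        counts[v] = counts.get(v, 0) + 1
    return counts
```

In a close race, a two percent shift decides the outcome, and nothing in the reported totals looks anomalous — which is the researchers' point about wholesale fraud.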
The Art of Looking Sideways
The Art of Looking Sideways is a primer in visual intelligence, an exploration of the workings of the eye, the hand, the brain and the imagination. It is an inexhaustible mine of anecdotes, quotations, images, curious facts and useless information, oddities, serious science, jokes and memories, all concerned with the interplay between the verbal and the visual, and the limitless resources of the human mind. Loosely arranged in 72 chapters, all this material is presented in a wonderfully inventive series of pages that are themselves masterly demonstrations of the expressive use of type, space, colour and imagery.
This book does not set out to teach lessons, but it is full of wisdom and insight collected from all over the world. Describing himself as a visual jackdaw, master designer Alan Fletcher has distilled a lifetime of experience and reflection into a brilliantly witty and inimitable exploration of such subjects as perception, colour, pattern, proportion, paradox, illusion, language, alphabets, words, letters, ideas, creativity, culture, style, aesthetics and value.
The Art of Looking Sideways is the ultimate guide to visual awareness, a magical compilation that will entertain and inspire all those who enjoy the interplay between word and image, and who relish the odd and the unexpected.
It's Lonely At the Top
"Stay the course" is a time-honored rallying cry in politics. But it has always been more a slogan than a strategy, meant to show the steadfastness of the person who shouts it rather than what he actually intends to do. More telling is when staying the course turns into "constantly changing tactics to meet the situation on the ground." That is how President Bush is now describing the battle plan in Iraq. It also pretty neatly sums up what his presidency has come to as he reaches the eve of a midterm congressional election that has turned into a referendum on Bush himself—and on a policy in Iraq that has left him more isolated than at any other point in his presidency.
The last time control of Congress was up for grabs in a midterm election, it seemed Republican candidates across the country couldn't see enough of—or be seen enough with—George W. Bush. In the closing five days of 2002, Bush swooped through 17 cities, playing to tens of thousands of voters who packed tarmacs and arenas from Aberdeen, S.D., to Blountville, Tenn. This midterm election is also turning out to be all about Bush, but it's a much lonelier experience for him. He still fills smaller rooms, especially the kind where people are willing to write five-figure checks for the privilege of lunch with a Republican President. And he's welcomed warmly in places where having local reporters point out Bush's difficulties provides a diversion from the candidate's own. But when Air Force One touches down in tightly contested congressional districts these days, it often turns out that the G.O.P. candidate there has discovered a previous commitment elsewhere, the political equivalent of suddenly needing to have your tires rotated.
October 28, 2006
Flawed solution to famed math problem spurs cyber soap opera
Stephen Ornes in Seed Magazine:
It all started when a mathematician tackled one of math's most enduring open problems—one that happened to be worth $1 million—in a paper she posted online. The journal Nature quickly published a story on its web site; news of a great mathematical breakthrough began to spread.
But less than two weeks after she posted the paper, the author learned that she had made an error and withdrew her work. In another era—as recently as, say, 10 years ago—that would have been the end of the story...
Oh, but times have changed. In the world of instant communication and public access to sophisticated research, this small story blossomed into a veritable cyber-drama. The narrative at "Not Even Wrong," Woit's blog, escalated quickly. Within a week, it had become a revealing chronicle of scientific hope, human disappointment, and the perils of undertaking the often messy enterprises of science and math in the age of the blog.
Robert Pinsky's favorite Halloween Poem
From the Washington Post:
My favorite poem for Halloween was written in the 16th century: the hundredth poem in "Caelica," a book-length sequence composed over a lifetime by Fulke Greville (1554-1628). He was Lord Brooke, an eminent statesman under Elizabeth I and James I, and a close friend of his fellow poet Philip Sidney. Greville's sonnet analyzes the experience of seeing spooks or devils. The devils, he says, are psychological, the products of "hurt imaginations." They are not less fearsome, or less real, for coming from inside the mind:
In night when colours all to black are cast,
Distinction lost, or gone down with the light;
The eye a watch to inward senses plac'd,
Not seeing, yet still having power of sight,
Gives vain alarums to the inward sense,
Where fear stirr'd up with witty tyranny,
Confounds all powers, and thorough self-offence,
Doth forge and raise impossibility:
Such as in thick depriving darknesses,
Proper reflections of the error be,
And images of self-confusednesses,
Which hurt imaginations only see;
And from this nothing seen, tells news of devils,
Which but expressions be of inward evils.
The Talking Ape
Christina Behme reviews The Talking Ape: How Language Evolved by Robbins Burling, in Metapsychology:
According to Robbins Burling, questions about the evolution of language are intriguing but difficult to answer because researchers cannot rely on any direct (fossil) evidence. He claims that any theorizing about language evolution has to depart from one of two anchor points: (i) the communication-behavior of our closest primate cousins (chimpanzees and baboons) as an approximation of the starting point and (ii) the languages spoken by modern humans as the endpoint. To bridge the gap between these two endpoints Burling proposes as the central argument of his book, "that language comprehension, rather than production, was the driving force for the human ability to use language" (p.4). His somewhat counterintuitive approach refocuses attention from the "obvious" part of language (speaking) to the occasionally neglected part (understanding) and offers a solution to one of the most vexing puzzles of language evolution: language seems necessary to use language, so how could it evolve in a pre-linguistic species? Burling suggests that the puzzle dissolves when we recognize that communication does not begin with a meaningful vocalization or gesture but with the interpretation of the behavior of another individual. An individual who can understand another's action even when no communication has been attempted gains an evolutionarily significant advantage (p.20). And, because social animals naturally engage in countless instrumental acts, there is always a lot to interpret. Throughout his book Burling supplies a wealth of details about language, communication, and the human mind to support his argument.
The Jew Hater
Robert O. Paxton on Bad Faith: A Forgotten History of Family, Fatherland and Vichy France by Carmen Callil and The Unfree French: Life Under the Occupation by Richard Vinen, in the New York Review of Books:
In August 1978, an enterprising French journalist, Philippe Ganier-Raymond, tracked down a nearly forgotten eighty-one-year-old French exile in Madrid named Louis Darquier de Pellepoix and cajoled him into conversation. Ganier-Raymond had brought along a tape recorder concealed in a fan.
The resulting interview was published in the French newsweekly L'Express on October 28, 1978, under a sensational title: "At Auschwitz They Gassed Only Lice." Louis Darquier (the "de Pellepoix" was fake, like a great deal else in his life) had been the Vichy French government's second commissioner for Jewish affairs between May 1942 and February 1944.
Darquier's unrepentant diatribe was, in the words of historian Henry Rousso, a "trigger" that set off one of those periodic national shouting matches that have, since the early 1970s, driven forward an enduring French fascination with the Vichy regime. Darquier's outrageous words had multiple effects. They helped place French anti-Semitism at the center of debates about Vichy, a position which that subject has never lost, at some cost to historiographical balance. They gave a decisive boost to the efforts of French lawyer and Nazi-hunter Serge Klarsfeld to bring some responsible Vichy French officials to justice, in formal recognition of Vichy's complicity in the deportation of Jews from France.
The Smart and Swinging Bonobo
Civil war in the Democratic Republic of the Congo has threatened the existence of wild bonobos, while new research on the hypersexual primates challenges their peace-loving reputation.
Paul Raffaelle in Smithsonian Magazine:
It was at Germany's Frankfurt Zoo some years ago that I first got hooked on bonobos. One of their nicknames is pygmy chimp, and I had expected to see a smaller version of the chimpanzee, with the same swagger and strut in the males and timorous fealty in the females. Bonobos are smaller than chimps, all right—a male weighs about 85 to 95 pounds and a female, 65 to 85 pounds; a male chimpanzee can weigh as much as 135 pounds. But the male bonobos I saw in the zoo, unlike the chimps, did not try to dominate the females. Both males and females strode about the enclosure picking up fruit and mingling with their friends. They looked strangely human with their upright, bipedal gait; long, slim arms and legs; slender neck; and a body whose proportions resemble ours more than they do a chimp's. More than anything, they reminded me of models I'd seen of Australopithecus afarensis, the "ape man" who walked the African savanna three million years ago.
Motion attacks failure to honour centenary of W H Auden's birth
Jonathan Brown in The Independent:
"Death," observed Wystan Hugh Auden, "is the sound of distant thunder at a picnic." Now more than three decades after his demise, an ominous rumble of discontent is emanating from the direction of the late poet's family, friends and admirers over how he should be remembered.
February next year marks the centenary of W H Auden's birth, but it is feared the date is in danger of passing largely unnoticed and unremarked in his native Britain. The BBC admitted yesterday that in contrast to the star-studded celebrations marking Sir John Betjeman's centenary this year, it had yet to commission any programming in honour of Auden.
And much to the dismay of the poet's niece, overtures to the Royal Mail to issue a stamp celebrating the life of the author of Night Mail and his work for the groundbreaking GPO film unit had been turned down flat.
It’s Her Party
Henry Alford in the New York Times Book Review:
Ever since I finished reading this book, I’ve spent a lot of time picturing Sedaris doing something she refers to several times — freshening up her cheese balls. This method of replenishing and re-forming round globs of nuts and cheese so they can be served at a second gathering is a good shorthand for Sedaris’s cooking style, which is the heart of the book (more than 200 recipes are included). In the kitchen, Sedaris is a magpie, a recycler of both foodstuffs and already published recipes. She is not afraid of the phrase “two cups potato chips, crushed.” Indeed, if Sedaris’s culinary approach seems to have gelled in about 1953, it owes less to the fresh-food enthusiast James Beard than it does to the convenience-food advocate Poppy Cannon. In one recipe, Sedaris impregnates whipped cream with canned fruit cocktail. Ardent foodies who read her book may be overcome with a desire to take her to a Greenmarket and say: “Darling, here are fresh peas. Explore.” But I viewed her retro approach less as a shortcoming than as a difference of opinion. The girl simply likes her crushed potato chips.
Carl Zimmer in his blog, The Loom:
To sequence the human genome, scientists established a network of laboratories, equipped with robots that could analyze DNA day and night. Once they began to finish up the human genome a few years ago, they began to wonder what species to sequence next. With millions of species to choose from, they could only pick a handful that would give the biggest bang for the buck. Squabbling ensued, with different coalitions of scientists lobbying for different species. Some argued successfully for medically important species, such as the mosquito that carries malaria. Others made the case for chimpanzees, to help them pinpoint the genes that make us uniquely human. And in 2002, a team of scientists made the case for the humble honeybee.
Why spend millions on the honeybee? For one thing, honeybees are commercially valuable. They make honey, and they pollinate crops. But the honeybee lobby also argued that there were much deeper reasons to sequence its genome. Honeybees live in societies that rival our own in size and complexity. A single hive may contain as many as 80,000 bees, which together build the hive, gather food, and feed the next generation of bees. They gather nectar from flowers, and they find flowers by merging many sources of information including the position of the sun and the subtle nuances of a flower's scent. When they come back to their hive, they waggle out a dance to indicate where other honeybees can find the flowers. They manage all this with only a million neurons in their head--a thousandth the number we have.
Why I will cast my vote for Green/Rainbow on Nov. 7
John Walsh in Worcester Telegram:
Never before in my memory have the two major parties so sullied themselves and so obviously betrayed the American people as they have by voting for the war in Iraq and supporting it since. True, George W. Bush and his neocon advisers took the lead in lying to the people. But at every step of the way the Democratic Party went along with a war based on lies and deceit, a war that has killed or maimed tens of thousands of Americans and hundreds of thousands of Iraqis.
As the saying goes: Bush lied. Democrats complied. Thousands died.
Let us recall that the vote to go to war was taken in October 2002, when the midterm elections were only weeks away. Looking at the votes on the war, which 22 Democrats and 1 Republican opposed — much to their credit — many of the Democratic pro-war votes were cast either by those with tight races ahead, like Max Cleland and Tom Daschle, or else those with presidential ambitions, like Kerry, Edwards, Clinton, Lieberman, Biden and others. (Cleland and Daschle lost anyway — deservedly so.) Of course these worthies contend that Bush successfully deceived them.
But that confession is an admission of incompetence since millions around the world saw through Bush’s lies and so did 23 Senators. But deception is not the likely explanation; ambition is. If Ted Kennedy knew better, then is it believable that his close colleague, John Kerry, and others did not?
And every single senatorial vote was crucial on that day of infamy. If only 11 others had joined the 23 nays in a strong stand against being stampeded into war and sustained a filibuster, we would not be in Iraq today. What greater issue is there than war? But these senators joined their Republican counterparts and put their careers and ambitions ahead of the fate of untold thousands of innocents.
What manner of men and women are these?
Can Wikipedia Ever Make the Grade?
From The Chronicle of Higher Education:
Alexander M.C. Halavais, an assistant professor of communications at Quinnipiac University, has spent hours and hours wading through Wikipedia, which has become the Internet's hottest information source. But to Wikipedia's legions of ardent amateur editors, Mr. Halavais may be best remembered as a troll.
Two years ago, when he was teaching at the State University of New York at Buffalo, the professor hatched a plan designed to undermine the site's veracity — which, at that time, had gone largely unchallenged by scholars. Adopting the pseudonym "Dr. al-Halawi" and billing himself as a "visiting lecturer in law, Jesus College, Oxford University," Mr. Halavais snuck onto Wikipedia and slipped 13 errors into its various articles. He knew that no one would check his persona's credentials: Anyone can add material to the encyclopedia's entries without having to show any proof of expertise.
Some of the errata he inserted — like a claim that Frederick Douglass, the abolitionist, had made Syracuse, N.Y., his home for four years — seemed entirely credible. Some — like an Oscar for film editing that Mr. Halavais awarded to The Rescuers Down Under, an animated Disney film — were more obviously false, and easier to fact-check. And others were downright odd: In an obscure article on a short-lived political party in New Brunswick, Canada, the professor wrote of a politician felled by "a very public scandal relating to an official Party event at which cocaine and prostitutes were made available."
Mr. Halavais expected some of his fabrications to languish online for some time. Like many academics, he was skeptical about a mob-edited publication that called itself an authoritative encyclopedia. But less than three hours after he posted them, all of his false facts had been deleted, thanks to the vigilance of Wikipedia editors who regularly check a page on the Web site that displays recently updated entries. On Dr. al-Halawi's "user talk" page, one Wikipedian pleaded with him to "refrain from writing nonsense articles and falsifying information."
Mr. Halavais realized that the jig was up.
October 27, 2006
The Valley of Transition
The idea of a "J-curve" is that societies, polities, and economies, in transitioning to other, more modern, more democratic social states, will find that things get worse before they get better. Ever since Samuel Huntington's Political Order in Changing Societies, many political scientists have seen this trajectory as holding most strongly in democratizing societies. So democracies may be more peaceful than authoritarian states, but democratizing societies are severely volatile. Others have seen the same transition problem in moves from socialism to capitalism, and from capitalism to socialism. In the transition, new institutions have yet to emerge and become effective, even as demands escalate. Thus, a heavy hand is needed in the transition. Bill Emmott and Fareed Zakaria discuss the issue in the wake of Ian Bremmer's new book, The J-Curve. In Slate:
Bremmer's target—quite like yours, Fareed, in The Future of Freedom—is the all-too-common notion that there is a smooth and even inevitable path that countries follow from dictatorship to democracy, along which others can readily nudge them. The Bush administration's "freedom agenda" is only the latest example of this delusion. The troops invading Iraq, remember, were to be greeted by cheering crowds throwing flowers.
We all know far too well that even if some did cheer, many threw bombs. Bremmer's chart explains why. It maps two things: stability, on the y axis, and openness, both internal and to the world, on the horizontal x axis. Bremmer's argument is that history shows that the most stable countries are often also the most closed: North Korea, Cuba, China under Mao, Soviet Russia. But as countries become more open, they generally become more unstable in the first instance, as existing institutions are challenged and undermined, and the old power holders lose their grip. Only as and if new institutions are built and gain legitimacy, credibility, and power will stability rise again. Hence the J. There is nothing inevitable about escape from the unstable bottom of the curve: The country could move in either direction.
I found this a useful representation of what happens as institutions and regimes change and, certainly, a salutary warning against the view that democracy will grow as naturally as flowers in the spring. The book's main interest for me, however, lay not so much in the chart that gives it its title but in the fine and revealing case studies that Bremmer lays out to establish how complicated the political form of states really is. He outlines the situations in North Korea, Cuba, Iran, and China adeptly and looks also at countries, such as South Africa, that have made a successful transition to democracy; at others, such as India, where democracy has survived seemingly against the odds; and at Russia, where democracy has lately been foundering. The conclusion? That there is no clear rule that can guide us in judging which countries will move up the J curve and which will not. It all depends. Societies are fragile and complex organisms.
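Bremmer's chart can be caricatured in a few lines of code. The functional form below is purely my own illustrative assumption (neither the book nor the review specifies one); it only reproduces the qualitative shape described above: a closed society starts out fairly stable, stability falls as openness rises, and only past the trough does it recover to a level higher than where it began.

```python
def stability(openness):
    """Schematic J-curve: stability (y) as a function of openness (x in [0, 1]).
    A hypothetical quadratic with its trough at openness = 0.35, chosen only
    to mimic the J shape -- not fitted to any data."""
    return 2.0 * (openness - 0.35) ** 2 + 0.2

# Closed end (x = 0): moderately stable.
# Trough (x = 0.35): the unstable bottom of the J.
# Open end (x = 1): the most stable point of all -- the long arm of the J.
```

The asymmetry is the whole point: a country at the bottom of the curve can climb either arm, but only the right-hand arm ends higher than where it started.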
The Human Rights of Scientists
It's not often noticed, but the world over, scientists suffer many human rights abuses. In News@nature.com:
Six medical workers are on trial in Libya, facing the death penalty for deliberately infecting hundreds of children with HIV, despite the fact that international experts say there is no evidence of their guilt (see 'A shocking lack of evidence').
And around the world, dozens of other scientists and physicians await verdicts of their own, after being imprisoned for dissenting with their government, fired for publishing unwelcome studies, or harassed for carrying out unwanted research.
The three profiles below give a taste of what some researchers face. They do not include the many who have been arrested in countries such as China, Ethiopia, Turkey, and Burma — to name a few — for speaking against their government.
Nor do they include tragic cases such as that of anthropologist Nikolai Girenko, who was studying racism in Russia when he was shot and killed in St Petersburg in 2004; Myrna Mack, a Guatemalan anthropologist who was stabbed to death in Guatemala City in 1990 after publishing a report documenting the murder of civilians by the military during the country's 36-year guerrilla war; or the many Iraqi academics who have been assassinated over the past three years (see 'Scientists become targets in Iraq').
Steve Reich at 70
In The Nation, David Schiff looks at the composer Steve Reich as he turns 70.
In his writings Reich conveys a very logical sense of his own development. There seems to be a straight line from Clapping Music to Drumming to Music for Mallet Instruments to Six Pianos, each work building on its predecessor until Reich reaches nirvana in Music for 18 Musicians. As I came to know Reich's oeuvre, I learned that Clapping Music actually marked the beginning of a second phase in his work, following a near-fatal trip to Ghana in 1970. In 1964 Reich had come upon phasing by accident when he was editing a tape recording of a black preacher; he misaligned two tape loops, setting in motion a process that transformed the preacher's words into abstract sounds. The result was Reich's opus one, It's Gonna Rain. In 1966 he refined this technique in another piece for tape, Come Out, which premiered at a benefit concert for the retrial of the "Harlem Six," a group of black youths charged with committing a murder during the 1964 Harlem riots. The voice of Daniel Hamm, a 19-year-old member of the Harlem Six--five of whom, including Hamm, were later acquitted--is first heard clearly saying, "I wanted to come out and show them." The phrase "Come out and show them" is then transformed through phasing to become an evolving series of rhythms, timbres and pitches. These early works remain fascinating, but their politics is troubling. They seem to spring directly from the civil rights struggle, and yet the phasing process calls attention away from the meaning of words to their sounds. A similar critique could be made of Drumming, where Reich extracted West African rhythms from their context and imposed on them a sophisticated process of transformation unrelated to their traditional forms. Was Reich, like many modernists before him, simply going primitive?
THE EXPANDING THIRD CULTURE
John Brockman in Edge:
Many people, even many scientists, have a narrow view of science as controlled, replicated experiments performed in the laboratory—and as consisting quintessentially of physics, chemistry, and molecular biology.
The essence of science is conveyed by its Latin etymology: scientia, meaning knowledge. The scientific method is simply that body of practices best suited for obtaining reliable knowledge. The practices vary among fields: the controlled laboratory experiment is possible in molecular biology, physics, and chemistry, but it is either impossible, immoral, or illegal in many other fields customarily considered sciences, including all of the historical sciences: astronomy, epidemiology, evolutionary biology, most of the earth sciences, and paleontology.
Just as science—that is, reliable methods for obtaining knowledge—has encroached on areas formerly considered to belong to the humanities (such as psychology), science is also encroaching on the social sciences, especially economics, geography, history, and political science. Humanities scholars and historians who spurn it condemn themselves to second-rate status and produce unreliable results. But this doesn't have to be the case. What can we do about this situation? We can start by asking a question.
Here is my question, the question I am asking myself, a question we can ask each other:
Why does society benefit from an accurate representation of knowledge?
Pompeii's most popular brothel goes on display
It was the jewel of Pompeii’s libertines: a brothel decorated with frescoes of erotic figures believed to be the most popular in the ancient Roman city. The Lupanare — which derives its name from the Latin word “lupa,” or “prostitute” — was presented to the public again Thursday following a yearlong, $253,000 restoration to clean up its frescoes and fix the structure.
Pompeii was destroyed in A.D. 79 by a cataclysmic eruption of Mount Vesuvius that killed thousands of people — and buried the city in 20 feet of volcanic ash, preserving Pompeii for 1,600 years and providing precious information on what life was like in the ancient world.
October 26, 2006
Marilynne Robinson on Dawkins
Harper's review of Dawkins's The God Delusion, via Darwiniana:
It is never a surprise to find Dawkins full of indignation. In his new book, The God Delusion, he has turned the full force of his intellect against religion, and all his verbal skills as well, and his humane learning, too, which is capacious enough to include some deeply minor poetry. Truly this book is a sword which turneth every way, to judge by the table of contents at least. There is no doubt in Dawkins’s mind that the evils of the world are to be laid at the doorstep of the church, mosque, and synagogue, and that science must be our salvation. It is the “God delusion,” which has afflicted almost everyone almost anywhere through the whole of recorded time, that has made us behave so badly. And Science (by which he really means his version of Darwinism) is our potential rescuer from this vale of tears. We need only to become more Dawkins-like in our thinking. This is a fairly cheery view of things beside others on offer, at least as regards the ongoing life of the planet, which he seems to assume.
Still, it is a difficult thing to set reason aside, and the habit of critical thought, and the sense of the past, not to mention the morning news.
Here, too, are the prose of Fernando Pessoa, J G Ballard and Bertolt Brecht; the artefacts of Cornelia Parker, Damien Hirst and Marcel Duchamp; the graphics of Ed Ruscha, Pavel Büchler and Robert Filliou, as well as a host of other treats. Dr Clock's demands close scrutiny, but despite its asserted aim of being a "definitive guide", don't linger. Look and read in dinosaur-shaped turkey nuggets of time, because if you look for too long - as I did - you become insistently aware that the knitting-pattern photographs, line drawings from golf manuals and other curious ephemera that pad out its pages are absurdisms in our eyes only, while for those there at the time, they were instructive, practical, serious even.
more from The New Statesman here.
ride and a rasher
The one stream of poetry which is continually flowing is slang. —G. K. Chesterton
One of the many benefits of owning the two-volume New Partridge Dictionary of Slang and Unconventional English (Routledge), besides the sheer size of the thing up there on a shelf with your other weighty reference books, is that you can dip in just about anywhere and enjoy the exuberant, endless display of human inventiveness with language. Let me demonstrate by flipping open volume 1 to . . . dick, as chance would have it. Besides ten meanings for that word, all familiar to an American speaker such as yourself, we have variations such as dickhead, dick-breath, dickwad, dickwipe, dicknose, and dickless wonder—more proof that the language will never have enough terms to describe "an offensive unlikeable person." But there is always something new to learn. You could guess that a dick doc is a urologist, especially if you've gone to medical school, but did you know that a dickless Tracy is a female police officer? That Dick Emery is Cockney rhyming slang for "memory"? Or that dick mittens are "hands that were not washed after urination"? Well, see?
more from Bookforum here.
daylight, firelight, lamplight, moonlight
On the right of Adam Elsheimer’s Flight into Egypt a full moon hangs above trees which are silhouetted against the night sky. Nothing ruffles the surface of the stream, which reflects both the trees and the carefully detailed face of the moon. A scattering of bright stars spreads to the Milky Way, which strikes across the sky from the top left corner. The wedge of trees which rises from right to the left is pitch black, but two other sources of light push back the darkness. In the centre foreground a mother and child on an ass are lit by the torch carried by a bearded man who holds his hand out towards the child. One can see that the ass has already entered the stream – the torchlight catches a ripple by its foot. On the far left, light from a fire two herdsmen have made carves a foliage-lined hollow out of the night, gilding at its edges the heads and flanks of animals and the surface of the stream. The only strong colour is Joseph’s red coat; the rest is moon-silver, pale fire-lit yellow, midnight blue or black. It is a small picture, so you lean forward to read it. You enter its space and wonder, item by item, what next? Will the moon rise or set? Will the family stop with the herdsmen? A picture like this is as close as a single frame can come to telling a sequential story.
more from the LRB here.
Mathematical Proofs Made Poetic
The Ganita Yuktibhasa (1530) by Jyesthadeva of the Kerala School of Mathematics is thought to be the first text on calculus, summarizing developments in Indian mathematics from the 5th century onward, including infinitesimals, infinite series, power series, Taylor series and integration. I was reading about this work, including this presentation by Sarada Rajeev over at Rochester's physics department, when I noticed that the school presented mathematical proofs and results in the form of poems. It seems alien to me (Malayalee though I am), but it does paint a picture of a beautiful genre.
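As an illustration (not drawn from the Yuktibhasa's verses themselves, but a celebrated Kerala-school result attributed to Madhava of Sangamagrama), the school derived the infinite series for the arctangent, which gives the series for π later rediscovered in Europe by Gregory and Leibniz:

```latex
% Madhava's series for the arctangent (Kerala school, c. 14th century)
\arctan x \;=\; x - \frac{x^{3}}{3} + \frac{x^{5}}{5} - \frac{x^{7}}{7} + \cdots,
\qquad |x| \le 1
% Setting x = 1 gives the series for pi:
\frac{\pi}{4} \;=\; 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots
```

Results like these were encoded in Sanskrit verse, which made them memorizable and transmissible in an oral scholarly tradition.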
Hitchens, Defending the Disinvitation of Tony Judt
Contrarian that he is (remember this defense of televised forced confessions, in the style of a show trial, while in the same breath condemning Amnesty's allusions to Stalinism to describe the current security regime), Christopher Hitchens dissents from the outrage over the cancellation of Tony Judt's talk.
A response from Jim Sleeper, one of the signatories, can be read here.
I have a perfect right, which I would defend to the death, to express my views on the question of Palestine. But I do not have a perfect right to express that opinion—which would have had to come up, even in a discussion of Iraq and the degeneration of the United Nations—at a meeting of a private group that takes the opposing view. Nor do I have an absolute right to criticize Theodor Herzl and all his works from a podium belonging to a neutral organization. Such outfits have their own right to pick and to choose and even to reconsider.
What a chance I missed to call attention to myself. I now can't open my e-mail or check my voicemail without reading or hearing about the repression visited on professor Tony Judt of New York University. It seems that he was booked to speak at a meeting sponsored by a group called Network 20/20 at the Polish Consulate in New York and had his event canceled when the relevant Polish diplomat decided that the evening might be—given professor Judt's views on Israel—more trouble than it was worth. I now hear of a fulminating letter, signed by no fewer than 114 intellectuals, that has been published in the New York Review of Books (there's glory for you) in which this repression is denounced. How dare the Polish Consulate refuse the heroic dissident Judt a platform! And how dare the Anti-Defamation League, or its chief spokesman Abraham Foxman (it's not quite clear who called) even telephone the Poles to complain?
The Fader Takes a Look at Escort
The October/November issue of The Fader (available free as a podcast) is out and has a great piece on The Wire, perhaps the best television show ever. It also has a review of the band Escort, founded by our friends Dan Balis, Eugene Cho, and Darius Maghen.
Between the three of them, only two of the present members of Escort are wearing cabana hats, but all are drinking beer from the bottle, beach-style relaxed. We're on a lounge deck talking about dance music--or more specifically, what brings people together to form a nine-piece orchestral disco band like theirs. It's the end of summer '06, and Escort has just released its first 12-inch single "Starlight," an elaborately faithful disco track where keyboardist Eugene Cho wiggles an analog theme as Zena Kitt (a vague Eartha relative, she tells me) belts out, Staaaaaarlight! I can't stop thinking of you! There's a little conga rumble underneath, some high pitched strings, and suddenly it's 1979--smooth and lovely.
The Overachievers: The Secret Lives of Driven Kids
From The Atlantic:
The frenzy of academic competition, particularly among affluent American families, has triggered a spate of cautionary new books. The titles reviewed here are all excellent: I give them all A+'s -- or, in the parlance of today's elite high schoolers, weighted GPAs of 4.687, including 5's in fifteen AP courses and a combined math/verbal SAT score of 1540.
Of course, I'm a biased reader; in my estimation, there can't be enough books written on the topic. I say, let's hurl them, one by one, at today's frenzied "helicopter parents," who deserve to be, if not bombarded, at least given a simple clonk over the head with a frying pan while a trained therapist yells, "Stop the insanity!"
Winning admission to a coveted college is so do-or-die that today's über-protective parents leave nothing to chance -- which is to say, nothing to the bumbling students themselves. For our most obsessively college-minded parents, it seems foolhardy to allow high-school seniors to track the progress of their own applications, to solicit their own letters of recommendation, even to write their own autobiographical essays about why they want to go to college. At a certain point, one might ask who is actually hoping to pull on that crimson sweatshirt.
In a telling USA Today essay on such parents, the MIT admissions head, Marilee Jones, wrote that they even "make excuses for their child's bad grades and threaten to sue high school personnel who reveal any information perceived to be potentially harmful to their child's chances of admission." (Indeed, in The Overachievers, Alexandra Robbins points out that the number of teachers purchasing liability insurance rose by 25 percent between 2000 and 2005.)
And when these litigious parents' work is well done, they need only stand back as their mini-me's shamble forward, robotlike, hurling lawsuits for them.
"Bizarre Beasts" Were Real (Believe It or Not)
From The National Geographic:
A coil of teeth caps the lower jaw of a sculpture of a 13-foot (4-meter) whorl-tooth shark, or Helicoprion, a fish genus that lived about 250 million years ago. Artist Gary Staab depicts the animal's jaw as something of a spiral conveyor belt, in which new teeth would advance to replace old ones (concealed here by skin) . But the true arrangement and purpose of the teeth remains a mystery. Some scientists suggest that it may have operated like a spiked whip, possibly curled underneath the lower jaw like a weaponized elephant trunk.
The shark adds bite to "Bizarre Beasts, Past and Present," a new exhibition of Staab's sculptures at the National Geographic Museum in Washington, D.C. (through February 2, 2007). The animals depicted are, or were, all real—testaments to the twists, turns, and blind alleys of evolution.
October 25, 2006
And you, what are you doing here?
Michael Gilsenan reviews A Season in Mecca: Narrative of a Pilgrimage by Abdellah Hammoudi ed. Pascale Ghazaleh, in the London Review of Books:
Pilgrims travelled for many motives: the religious duty to make the haj, providing one could fulfil its conditions, was not ill, had the funds, would not leave one’s family destitute and so forth; trade, local or regional; labour and remittance along the way, on a journey whose duration was limited only by God; status. The temporal scale of ‘going on pilgrimage’ was enormously variable. Pilgrims might move and settle and then move on, or not. The process could take years. But by the 1880s, modern boundaries and frontiers were being drawn, territories delineated, wars fought, treaties with native rulers signed, legal systems imposed, ‘races’ scientifically delineated, their supposed characteristics ethnographically reported, their ‘characters’ assessed. The new colonial states demanded ever more documents. The pilgrimage was to be controlled. The experience necessarily changed and it has not ceased doing so. In our own day, it is plane and airport capacities that are crucial. Indeed, trips to the Holy Places by land are now forbidden.