“Patriotism,” Samuel Johnson said more than two centuries ago, “is the last refuge of a scoundrel.” These days in India, the adage can safely be applied to nationalism. There is no other explanation for the threat to arrest and try Arundhati Roy on charges of sedition for what she said at a public meeting on Kashmir, where Syed Ali Geelani also spoke. I was not at the meeting, but I have read her moving statement defending herself afterwards. I feel both proud and humbled by it. I am a psychologist and political analyst, handicapped by my vocation; I could not have put the case against censorship so starkly and elegantly. What she has said is simultaneously a plea for a more democratic India and a more humane future for Indians.
I faced a similar situation a couple of years ago, when I wrote a column in the Times of India on the long-term cultural consequences of the anti-Muslim pogrom in 2002. It was a sharp attack on Gujarat’s changing middle-class culture. I was served summons for inciting communal hatred. I had to take anticipatory bail from the Supreme Court and get the police summons quashed. The case, however, goes on, even though the Supreme Court, while granting me anticipatory bail, said it found nothing objectionable in the article. The editor of the Ahmedabad edition of the Times of India was less fortunate. He was charged with sedition.
I shall be surprised if the charges of sedition against Arundhati are taken to their logical conclusion. Geelani is already facing more than a hundred cases of sedition, so one more probably won’t make a difference to him. Indeed, the government may fall back on time-tested traditions and negotiate with recalcitrant opponents through income-tax laws. People never fully trusted the income-tax officials; now they will distrust them the way they distrust the CBI.
Meanwhile, we have made fools of ourselves in front of the whole world. All this because some protesters demonstrated at the meeting that Arundhati and Geelani addressed! Yet, I hear from those who were present at the meeting that Geelani did not once utter the word “secession”, and even went so far as to give a soft definition of azadi. By all accounts, he put forward a rather moderate agenda. Was it his way of sending a message to the government of India? How much of it was cold-blooded public relations, how much a clever play with political possibilities in Kashmir?
We shall never know, because most of those who pass as politicians today and our knowledge-proof babus have proved themselves incapable of understanding the subtleties of public communication. They are not literate enough to know what role free speech and a free press play in an open society, not only in keeping the society open but also in serious statecraft. Meanwhile, it has become dangerous to demand a more compassionate and humane society, for that has come to mean a serious criticism of contemporary India and those who run it. Such criticism is being redefined as anti-national and divisive. In the case of Arundhati, it is of course the BJP that is setting the pace of public debate and pleading for censorship. But I must hasten to add that the Congress looks unwilling to lose the race. It seems keen to prove that it is more nationalist than the BJP.
A respected peer-reviewed journal in psychology, The Journal of Personality and Social Psychology, is about to publish a paper that presents scientific evidence for precognition. The paper, by Daryl Bem of Cornell University, is called “Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect,” and you can download a preprint on his webpage. I’ve scanned the paper only briefly, and am posting about it in hopes that some of you will read it carefully and provide analyses, either here or elsewhere.
The paper purports to show that a choice that you make in a computer test can be influenced by stimuli you receive after you’ve already made the choice. This implies you have some way, consciously or unconsciously, of detecting things that haven’t yet happened. In an article in Psychology Today, “Have scientists finally discovered evidence for psychic phenomena?”, psychologist Melissa Burkley at Oklahoma State University summarizes two of Bem’s studies:
However, Bem’s studies are unique in that they represent standard scientific methods and rely on well-established principles in psychology. Essentially, he took effects that are considered valid and reliable in psychology – studying improves memory, priming facilitates response times – and simply reversed their chronological order.
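The claims in papers like Bem’s ultimately rest on whether an observed hit rate beats chance by more than luck would allow. A minimal sketch of that kind of reasoning (this is not Bem’s actual analysis, and the numbers below are invented for illustration): given n binary trials where chance predicts a 50% hit rate, compute the one-sided binomial probability of seeing k or more hits.

```python
# Exact one-sided binomial test: P(X >= k) when X ~ Binomial(n, p).
# Invented numbers below illustrate why a small above-chance rate
# needs many trials before it looks like evidence rather than noise.
from math import comb

def binomial_p_value(k: int, n: int, p: float = 0.5) -> float:
    """Probability of observing k or more successes in n trials by chance."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A 53% hit rate over 100 trials is unremarkable...
print(round(binomial_p_value(53, 100), 3))
# ...but the same rate sustained over 1000 trials starts to look real.
print(round(binomial_p_value(530, 1000), 4))
```

This is also why the post’s request for careful reading matters: with enough experiments and flexible analyses, p-values near the conventional threshold are easy to come by.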
Jesús Huerta de Soto delivers the Hayek Memorial Lecture at the London School of Economics:
I would like to start off by stressing the following important idea: all the financial and economic problems we are struggling with today are the result, in one way or another, of something that happened precisely in this country on July 19, 1844… What happened on that fateful day that has conditioned up to the present time the financial and economic evolution of the whole world? On that date, Peel’s Bank Act was enacted after years of debate between Banking and Currency School Theorists on the true causes of the artificial economic booms and the subsequent financial crises that had been affecting England especially since the beginning of the Industrial Revolution.
The Bank Charter Act of 1844 successfully incorporated the sound monetary theoretical insights of the Currency School. This school was able to correctly discern that the origin of the boom and bust cycles lay in the artificial credit expansions orchestrated by private banks and financed not by the prior or genuine savings of citizens, but through the issue of huge doses of fiduciary media (in those days mainly paper banknotes, or certificates of demand deposits issued by banks for a much greater amount than the gold originally deposited in their vaults). So, the requirement by Peel’s Bank Act of a 100 percent reserve on the banknotes issued was not only in full accordance with the most elementary general principles of Roman Law regarding the need to prevent the forgery or the over-issue of deposit certificates, but also was a first and positive step in the right direction to avoid endlessly recurring cycles of booms and depressions.
However, Peel’s Bank Act, notwithstanding the good intentions behind it and its sound theoretical foundations, was a huge failure. Why? Because it stopped short of extending the 100 percent reserve requirement to demand deposits as well (Mises 1980, 446–448).
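The mechanism Huerta de Soto is pointing at can be made concrete with a toy model (mine, not the lecture’s): under fractional-reserve banking with reserve ratio r, each loan is redeposited and re-lent, so an initial deposit of gold can support total deposits approaching deposit / r.

```python
# Toy credit-expansion model: sum the geometric series of successive
# redeposits under a given reserve ratio. All figures are illustrative.

def total_deposits(initial: float, reserve_ratio: float, rounds: int = 1000) -> float:
    """Total deposits created as an initial deposit is repeatedly re-lent."""
    total, tranche = 0.0, initial
    for _ in range(rounds):
        total += tranche
        tranche *= (1 - reserve_ratio)  # the lendable, re-deposited portion
    return total

# With a 10% reserve, 100 units of gold can back roughly 1000 units of
# deposits; with a 100% reserve, as Peel's Act required for banknotes,
# no expansion occurs at all.
print(total_deposits(100, 0.10))
print(total_deposits(100, 1.00))
```

On this sketch, the Act’s “failure” is visible at a glance: applying the 100 percent rule to notes but not to demand deposits leaves the expansion channel wide open.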
The world has never aged like this before, and the aging of the world is happening everywhere. It is true that, now, developed countries are aging fastest, but it won’t be this way for long. Countries such as Brazil and Sri Lanka may not experience rapid aging now, but when they do, it will happen in just a couple of decades, while the rest of the world needed the entire 20th century.
People worry whether our social welfare systems will collapse. Whether we will have enough hospitals and housing. Whether overall human productivity will decrease. In the New York Times Magazine, Ted C. Fishman worries that global power may be determined by how much a country is willing to invest in care for its elderly, that the old may be pushed aside if they prove too costly. He worries that the old, unable to work, will live in poverty, but that the very act of an old country taking young workers from young countries will just hasten the aging of our last remaining young nations.
And yet, we haven't really asked ourselves just what it will feel like to live in an old world. Will reminiscing replace love songs? Will wisdom replace surprise?
I found your Himalayan chronology: a comprehensive set of cores from a ski area in Kashmir. I know you were there in 1973 and you likely felt the stay of November, before snow slams down the airplanes — mountain-shine through long blue needles, shadows and cores fresh on the snow in stripes. I can picture the measurement, later: Ashok bringing in tea, sweet, gingery, goat-milk thick and held far from the calipers. You drank the first half in 1790 between the earlywood and latewood. In 1600, you remembered the rest of it but it had a skin by then. The oldest pith came from a seedling in the year of Babur’s first arrival, complete with court painters to capture wild Hindustani beasts. (There’s a moment of privacy before uploading data onto the Persian vellum of the internet like a miniature painting before the gold leaf.)
by Hanna Coy from You Are Here – The Journal of Creative Geography, 2010
In the summer of 1991, as a new North Carolina State University graduate in environmental design in architecture, Elizabeth Whittaker, M.Arch. ’99, wore a hard hat, pouring concrete over rebars at Arcosanti, a planned community in the Arizona desert designed by the celebrated architect Paolo Soleri. “It was a hippie-throwback place,” she recalls. “Living off the land in a progressive, communal atmosphere. A hilarious place.” Today, as principal of MERGE Architects, Inc. (www.mergearchitects.com) in Boston, Whittaker still dons a hard hat occasionally, but now she’s overseeing the pours, and the buildings under construction are her own designs.
The hard hat suggests the hands-on, intimate involvement with details of a project that Whittaker specializes in, a way of working that she calls “extreme collaboration.” It’s a modus operandi that took form in the early days of her firm, which she founded in 2003, when “we were flying by the seat of our pants, doing these small, quick, needs-to-be-built-in-three-weeks-for-10-dollars kind of projects,” she explains. “We would be inventing the construction details right in the shop or on site with the artists and craftsmen—the steel fabricators, woodworkers, structural engineers, concrete fabricators. Every architect collaborates; this is extreme only in that it is so immediate. We’re inventing it with the tradesmen. I’ve built a practice on learning from these people—it’s more inventive when there are more voices.”
The word “relapse” comes from the Latin for “slipping backward,” or “slipping again.” It signals not just a fall but another fall, a recurrent sin, a catastrophe that happens again. It carries a particularly chilling resonance in cancer — for it signals the reappearance of a disease that had once disappeared. When cancer recurs, it often does so in treatment-resistant or widely spread form. For many patients, it is relapse that presages the failure of all treatment. You may fear cancer, but what cancer patients fear is relapse. Why does cancer relapse? From one perspective, the answer has to do as much with language, or psychology, as with biology. Diabetes and heart failure, both chronic illnesses whose acuity can also wax and wane, are rarely described in terms of “relapse.” Yet when a cancer disappears on a CT scan or becomes otherwise undetectable, we genuinely begin to believe that the disappearance is real, or even permanent, even though statistical reasoning might suggest the opposite. A resurrection implies a previous burial. Cancer’s “relapse” thus implies a belief that the disease was once truly dead.
But what if my patient’s cancer had never actually died, despite its invisibility on all scans and tests? CT scans, after all, lack the resolution to detect a single remnant cell. Blood tests for cancer also have a resolution limit: they detect cancer only when millions of tumor cells are present in the body. What if her cancer had persisted in a dormant state during her remissions — effectively frozen but ready to germinate? Could her case history be viewed through an inverted lens: not as a series of remissions punctuated by the occasional relapse, but rather a prolonged relapse, relieved by an occasional remission?
More here. (Note: Congratulations to dear friend and brilliant colleague, Sid. My MDS patients have hope because of you! BRAVO!)
Much of the initial response to the Browne Report seems to have missed the point. Its proposals have been discussed almost entirely in terms of ‘a rise in fees’. Analysis has largely concentrated on the amount graduates might pay and on which social groups may gain or lose by comparison with the present system. In other words, the discussion has focused narrowly on the potential financial implications for the individual student, and here it should be recognised that some of the details of Browne’s proposed system of graduate contributions to the cost of fees are, if his premises are granted, an improvement on the present patchwork arrangements.
But the report proposes a far, far more fundamental change to the way universities are financed than is suggested by this concentration on income thresholds and repayment rates. Essentially, Browne is contending that we should no longer think of higher education as the provision of a public good, articulated through educational judgment and largely financed by public funds (in recent years supplemented by a relatively small fee element). Instead, we should think of it as a lightly regulated market in which consumer demand, in the form of student choice, is sovereign in determining what is offered by service providers (i.e. universities). The single most radical recommendation in the report, by quite a long way, is the almost complete withdrawal of the present annual block grant that government makes to universities to underwrite their teaching, currently around £3.9 billion. This is more than simply a ‘cut’, even a draconian one: it signals a redefinition of higher education and the retreat of the state from financial responsibility for it.
Instead, Browne wants to see universities attracting customers in a competitive marketplace: there will be a certain amount of public subsidy of these consumers’ purchasing power, especially for those who do not go on to a reasonably well-paid job, but the mechanism which would henceforth largely determine what and how universities teach, and indeed in some cases whether they exist at all, will be consumer choice.
“American Gothic” has been described as the most reproduced painting in this country, which is not necessarily high praise. What artist would be elated to hear that one of his paintings had been appropriated in an advertising campaign for General Mills country cornflakes, or Coors beer? For most of his life, Grant Wood endured the scorn of leading art critics, who failed to recognize his refinement. He was known for one painting only, that image of a pale, homely farming pair posed in front of their white house, looking as if their dog just died. Wood painted his creaky masterpiece in 1930, amid the ravages of the Great Depression. Unable to move forward, Americans glanced back and found consolation in images of the sturdy agrarian past. Wood rose to fame as one of the three leaders of Regionalism (Thomas Hart Benton and John Steuart Curry were the other two) and, dressed in his bibbed overalls, presented himself as an antidote to East Coast pretentiousness. “All the really good ideas I’ve ever had came to me while I was milking a cow,” he said, somewhat goofily, in his most famous statement.
Where do brilliant ideas come from? When reporters ask Tim Berners-Lee about the moment he conceived of the World Wide Web, he can't answer. He hasn't forgotten; it just never happened. The idea percolated in his mind for nearly a decade, based on a desire to organize massive amounts of data shared between connected computers. He needed the ideas of others to buzz around him and he needed an image that would make his idea understandable. His “stack” of information became a “mesh” before eventually becoming a “web.” The cliché did not hold true: His moment of insight, as it turns out, wasn't the result of a single flashbulb going off in his brain.
In his sixth book, “Where Good Ideas Come From: The Natural History of Innovation,” popular science writer Steven Johnson tries to dispel the notion of the “eureka moment.” As with nature, new concepts, like the Internet, slowly grow out of old concepts. They don't spring forth from nowhere. Darwin's theory, for instance, was built on centuries of observation, including his own. During his fateful voyage on the HMS Beagle, Darwin also discovered that atolls, islands made of coral, were created through the lives and deaths of tropical marine organisms, hardened bodies built up on one another. This key image, according to Johnson, gave Darwin a picture for his epic explanation of how life emerged. Using natural science's tendencies to build upon itself, as well as examples of major innovations in science, technology and even art, Johnson makes a case that ideas beget ideas, which means would-be innovators don't need an ivory tower; they need a crowd.
A flashier sort of supernatural novel, aimed at teenagers, is experiencing a startling revival; at the moment you can’t move for vampires and werewolves. Yet the corny “English country house with a spook” template is also being dusted off. It became respectable – and fit for the grown-ups – when Sarah Waters used a full-on array of supernatural effects in her last novel, The Little Stranger. If anything, she overdid it with her bumpings, visions, scratchings, unexplained fires and malign entities; but she also managed to pull off some splendid shocks, as well as cleverly investigating the many purposes a ghost can serve in a narrative. So what do the latest supernatural novels bring to the Hallowe’en party? In an age where viewers are inured to ever more graphic scenes of horror on film, how do you frighten with simple words on a page? I road-tested five recent examples to see if they could make me shudder: two classic English ghost stories and a sparky American take on the genre; an 18th-century chiller set in a spooky old Cambridge college; and a wainscot-free novel that colonises new territory for terror.
I was sitting on an airplane with a copy of “Storyteller: The Authorized Biography of Roald Dahl” when an elegant woman in the seat next to me murmured, almost to herself, “I live just down the lane from his old cottage in Oxfordshire.” Turning to her with excitement I asked if she’d ever run into him. “Oh, no, no,” she said with obvious amusement, as if the very suggestion was completely absurd. “He was a great writer,” she said, sounding very genuine. Yet she had a puzzled expression on her face. She asked me what I did for a living. I said I wrote books for grown-ups and children, just like Dahl. There was an awkward silence. We parted ways. For those who do not know Dahl’s grown-up stories, one of his most beloved — if I may use that word — is called “Pig” (1959), about an orphan raised by a tender, vegetarian aunt. The boy’s talents as a young vegetarian chef are depicted in a magical, mystical tone. When the aunt dies, the boy buries her and goes to the city where he encounters, gasp … pork! He loves it, and ends up with his throat slit by a butcher. Pure horror.
I saw “West Side Story” when I was 16 years old, and I have two vivid memories of the show. One, I didn’t believe for a minute that the dancers were anything like the teenage hoods I knew from the street corner; and two, I was completely overwhelmed by the beauty of the song “Maria.” It was a perfect love song. Sondheim was less enamored of the lyric he wrote for Bernstein. He describes it as having a kind of “overall wetness” — “a wetness, I regret to say, which persists throughout all the romantic lyrics in the show.” Sondheim’s rule, taught to him by his mentor, Oscar Hammerstein II, is that the book and composer are better served by lyrics that are “plainer and flatter.” It is the music that is meant to lift words to the level of poetry.
Sondheim’s regret about “Maria” reminded me of my own reluctance to add a third verse to “Bridge Over Troubled Water.” I thought of the song as a simple two-verse hymn, but our producer argued that the song wanted to be bigger and more dramatic. I reluctantly agreed and wrote the “Sail on silvergirl” verse there in the recording studio. I never felt it truly belonged. Audiences disagreed with both Sondheim and me. “Maria” is beloved, and “Sail on silvergirl” is the well-known and highly anticipated third verse of “Bridge.” Sometimes it’s good to be “wet.”
When I think of Stephen Sondheim songs, I think of his melody and lyrics as one. His career as a lyricist for other composers (Bernstein, Jule Styne and Richard Rodgers) is as distinct from his later work as night is to day, or conversely, day to night, since the quintessential Sondheim song is perceived to be somehow darker, lyrically more cerebral and colder than his earlier collaborative work. From “Sweeney Todd”:
There’s a hole in the world
Like a great black pit
And the vermin of the world
Inhabit it,
And its morals aren’t worth
What a pig could spit,
And it goes by the name of London.
A certain Arab country recently held parliamentary elections. The vote was reasonably free and fair. Turnout was 67 percent, and the opposition won a near majority of the seats — 45 percent to be exact. Sounds like a model democracy. Yet, rather than suggesting a bold, if unlikely, democratic experiment, Saturday's elections in Bahrain instead reflected a new and troubling trend in the Arab world: the free but unfair — and rather meaningless — election.
Something similar will happen on Nov. 9 in Jordan. The Hashemite Kingdom is a close U.S. ally that has grown increasingly proficient at predetermining election results without actually rigging them. It involves gerrymandering on a scale unknown in the West and odd electoral engineering (Jordan is one of only three countries in the world that uses something called the single non-transferable vote for national elections). Even when the opposition is allowed to win, the fundamentals do not necessarily change. Parliamentary legislation in countries like Jordan and Bahrain, after all, can be blocked by appointed “Upper Houses.” And even if that were not the case, the King (or the President) and his ministers — all appointed — can also kill any threatening legislation.
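For readers unfamiliar with the single non-transferable vote, here is a minimal sketch (candidate names and vote totals invented for illustration) of how the counting works, and why the system rewards the kind of candidate-management the piece describes: each voter casts one vote in a multi-seat district, and the top vote-getters win, so a large bloc that splits its vote across too many candidates can lose seats to a smaller, disciplined one.

```python
# Single non-transferable vote (SNTV): one vote per voter, and the
# `seats` highest vote totals win the district. Figures are invented.

def sntv_winners(votes: dict[str, int], seats: int) -> list[str]:
    """Return the `seats` candidates with the most votes."""
    return sorted(votes, key=votes.get, reverse=True)[:seats]

# A bloc with 60% of the vote, split across four candidates, loses a
# three-seat district to a 40% bloc that runs exactly two candidates.
district = {"A1": 15, "A2": 15, "A3": 15, "A4": 15,  # 60% bloc, four candidates
            "B1": 20, "B2": 20}                      # 40% bloc, two candidates
print(sntv_winners(district, 3))  # the B bloc takes two of three seats
```

That sensitivity to nomination strategy is part of why SNTV lends itself to the “free but unfair” outcomes the article describes.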
Controlling for a large number of demographic variables, such as sex, race, ethnicity, religion, marital status, number of children, education, earnings, depression, satisfaction with life, frequency of socialization with friends, number of recent sex partners, childhood social class, mother’s education, and father’s education, more intelligent children grow up to drink more alcohol in the UK and the US.
[This] graph shows the association between childhood intelligence (grouped into five “cognitive classes”: “very dull” – IQ < 75; “dull” – 75 < IQ < 90; “normal” – 90 < IQ < 110; “bright” – 110 < IQ < 125; “very bright” – IQ > 125) and the latent factor for the frequency of alcohol consumption. The latter variable is constructed from a large number of indicators for the frequency of alcohol consumption throughout adult life and standardized to have a mean of 0 and a standard deviation of 1.0. The data come from the National Child Development Study (NCDS) in the United Kingdom. There is a clear monotonic association between childhood intelligence (measured before the age of 16) and the frequency of alcohol consumption in their 20s, 30s, and 40s. “Very bright” British children grow up to consume alcohol nearly one full standard deviation more frequently than their “very dull” classmates.
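The graph described above rests on two routine transformations, and a hedged sketch may make them concrete (the class boundaries follow the post; the treatment of exact boundary scores and all sample numbers are my own assumptions): binning IQ scores into the five “cognitive classes,” and standardizing a measure to mean 0 and standard deviation 1.

```python
# Bin IQ scores into the five "cognitive classes" quoted in the post,
# and z-standardize a sample the way the latent drinking-frequency
# factor is described (mean 0, SD 1). Sample data is invented.
from statistics import mean, stdev

def cognitive_class(iq: float) -> str:
    """Map an IQ score to its cognitive class (boundary handling assumed)."""
    if iq < 75:
        return "very dull"
    if iq < 90:
        return "dull"
    if iq < 110:
        return "normal"
    if iq < 125:
        return "bright"
    return "very bright"

def standardize(xs: list[float]) -> list[float]:
    """Rescale a sample to mean 0 and standard deviation 1."""
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

print(cognitive_class(118))  # -> bright
print([round(z, 2) for z in standardize([1.0, 2.0, 3.0, 4.0])])
```

On the standardized scale, the post’s headline claim reads directly: the “very bright” group’s drinking-frequency score sits nearly one full unit (one standard deviation) above the “very dull” group’s.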