Me, Inc.

Pamela S. Karlan in the Boston Review:

When the Supreme Court heard Santa Clara County v. Southern Pacific Railroad Co. in 1886, few would have pegged the case as a turning point in constitutional law. The matter at hand seemed highly technical: could California increase the property tax owed by a railroad if the railroad built fences on its property? As it turned out, the Court ruled unanimously in the railroad’s favor. And in so doing, the Court casually affirmed the railroad’s argument that corporations are “persons” within the meaning of the Fourteenth Amendment, which provides that no state shall “deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.” So certain were the justices of the Fourteenth Amendment’s applicability that their opinion did not engage the issue, but the Court reporter recorded the justices’ perspective on the topic:

Before argument Mr. Chief Justice Waite said: ‘The Court does not wish to hear argument on the question whether the provision in the Fourteenth Amendment to the Constitution which forbids a state to deny to any person within its jurisdiction the equal protection of the laws applies to these corporations. We are all of opinion that it does.’

That statement marks the origin of the view that corporations are persons as a matter of constitutional law. This played a central role in the 2010 decision in Citizens United v. Federal Election Commission, which struck down portions of the Bipartisan Campaign Reform Act that restricted corporate spending on electioneering communications in the run-up to a federal election. The Court declared that Congress could not discriminate between electioneering communications according to the identity of the speaker: since individual human beings clearly have a First Amendment right to speak about candidates during the election process, so too must corporations.

More here.

Atop TV Sets, a Power Drain That Runs Nonstop

Elizabeth Rosenthal in the New York Times:

Those little boxes that usher cable signals and digital recording capacity into televisions have become the single largest electricity drain in many American homes, with some typical home entertainment configurations eating more power than a new refrigerator and even some central air-conditioning systems.

There are 160 million so-called set-top boxes in the United States, one for every two people, and that number is rising. Many homes now have one or more basic cable boxes as well as add-on DVRs, or digital video recorders, which use 40 percent more power than the set-top box.

One high-definition DVR and one high-definition cable box use an average of 446 kilowatt hours a year, about 10 percent more than a 21-cubic-foot energy-efficient refrigerator, a recent study found.

These set-top boxes are energy hogs mostly because their drives, tuners and other components are generally running full tilt, or nearly so, 24 hours a day, even when not in active use. The recent study, by the Natural Resources Defense Council, concluded that the boxes consumed $3 billion in electricity per year in the United States — and that 66 percent of that power is wasted when no one is watching and shows are not being recorded. That is more power than the state of Maryland uses over 12 months.
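The arithmetic behind these figures is easy to check. A minimal sketch, in which the wattages and the electricity price are illustrative assumptions (the article gives only the resulting totals, not the inputs):

```python
HOURS_PER_YEAR = 24 * 365  # set-top boxes run essentially nonstop


def annual_kwh(watts: float) -> float:
    """Energy consumed by a device drawing `watts` continuously for a year."""
    return watts * HOURS_PER_YEAR / 1000  # watt-hours -> kilowatt-hours


# A combined HD DVR + HD cable box drawing ~51 W continuously works out
# to roughly the 446 kWh/year the study reports.
combo_kwh = annual_kwh(51)

# Scaling an assumed ~19 W average per box across 160 million boxes at an
# assumed retail rate of ~$0.11/kWh lands near the $3 billion national figure.
boxes = 160_000_000
national_cost = annual_kwh(19) * boxes * 0.11

print(round(combo_kwh), f"${national_cost / 1e9:.1f}B")
```

The point of the sketch is simply that a few tens of watts, left on around the clock, compounds into refrigerator-scale annual consumption.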

More here.

The untold stories of rape during the Holocaust

Jessica Ravitz at CNN:

Scholars are revisiting old testimonies and documents — and seeking new ones. Authors have published works to inspire conversation. Psychologists want to help survivors heal from their secrets. Activists, including feminist writer and organizer Gloria Steinem, hope these victims of the distant past can help shape a better future.

But the topic of sexual violence during the Holocaust is fraught with controversy. Some observers believe it's a subject not sufficiently widespread or proven to warrant broad attention. Others fear it's driven by a microscopic view that deflects focus from what needs to be remembered. And still others feel that pushing the issue may harm survivors who've suffered enough.

What everyone can agree on is this: When it comes to learning from those who lived through the Holocaust, time is running out.

A spotlight on this dark subject was switched on with the late 2010 publication of a landmark book bearing a straightforward but telling title, “Sexual Violence against Jewish Women during the Holocaust.”

The interdisciplinary anthology touches on everything from rape, forced prostitution and sterilizations to psychological trauma, gender identity issues and depictions of violence in the arts. Co-edited by Sonja Hedgepeth and Rochelle Saidel, it is believed to be the first book in English to focus exclusively on this subject.

More here.

Pakistan Unhitches Hitchens

Anjum Altaf in The South Asian Idea:

None of this is to argue that Pakistan is not plagued by very severe problems, some of which Hitchens has enumerated. The appropriate response to Hitchens is not a defense of Pakistan’s civil and military elite, of the kind Christine Fair has penned for The Huffington Post, with its accounting of Pakistan’s cooperation in the war against terror. Nor is it the dismissive posture adopted by many Pakistanis, pointing out their country’s various positives. These are weak defenses, the staples of many a domestic fight: This is all I’ve done for you, think of the good times, we were happy once, and ultimately those defenses are as far from the point as a Hitchens-style diatribe.

The response calls for the kind of unglamorous analysis that won’t make it into Vanity Fair or The Huffington Post. At any given time, a society is characterized by many currents and counter-currents, positives coexist with negatives, and struggles for human rights wax and wane. So has been the case in Pakistan. Hitchens’ statement that “Pakistan takes its twisted, cowardly revenge by harboring the likes of the late Osama bin Laden” is so unnuanced as to call into question the author’s credibility as an analyst; the greatest damage he has done here is to his own reputation.

There is no one Pakistan: There are many Pakistans, and the question to ask is why the forces of repression have been gaining the upper hand in the country.

More here.

Why parents can’t cut the apron strings

From Spiked:

If Amy Chua is cast as the wild-eyed Tiger Warrior of twenty-first-century parenting, Bryan Caplan, an economist at George Mason University and author of a new book, Selfish Reasons to Have More Kids, is its amiable Buddha figure. He even looks the part. His publicity photos invariably picture him hugging his kids and grinning from ear to ear. It’s a far cry from Amy Chua’s formal, somewhat stiff family portraits. Pitting the two against one another has proved irresistible. Media pundits have dubbed them ‘gurus’ and taken to sorting parents into one category or another. Of course the irony is that the ‘debate’ between the two is not, as Jennie Bristow has pointed out on spiked, a real debate in any meaningful sense. It’s really more of a half-hearted rehearsal of the old nature-vs-nurture argument. It does not change anyone’s mind; it offers no answers and is unlikely to have any effect on what parents actually do. And this is a pity because Caplan’s book at least represents an attempt to address some of the excesses of today’s parenting culture.

Selfish Reasons begins with the observation that the American family is shrinking, and the main reason for this, according to Caplan, is that parents are stressed out about taking care of kids. More kids mean more stress and less happiness. This in itself might give some readers pause. It seems a bit glib and appears to ignore long-term demographic trends; and yet, as a snapshot of American life today, it feels true.

More here.

Sunday Poem

For an Anniversary

The wing of the osprey lifted
over the nest on Tomales Bay
into fog and difficult gust
raking treetops from Inverness Ridge on over
The left wing shouldered into protective
gesture the left wing we thought broken

and the young beneath in the windy nest
creaking there in their hunger
and the tides beseeching, besieging
the bay in its ruined languor

by Adrienne Rich

The Double Game: The unintended consequences of American funding in Pakistan.

From The New Yorker:

It’s the end of the Second World War, and the United States is deciding what to do about two immense, poor, densely populated countries in Asia. America chooses one of the countries, becoming its benefactor. Over the decades, it pours billions of dollars into that country’s economy, training and equipping its military and its intelligence services. The stated goal is to create a reliable ally with strong institutions and a modern, vigorous democracy. The other country, meanwhile, is spurned because it forges alliances with America’s enemies.

The country not chosen was India, which “tilted” toward the Soviet Union during the Cold War. Pakistan became America’s protégé, firmly supporting its fight to contain Communism. The benefits that Pakistan accrued from this relationship were quickly apparent: in the nineteen-sixties, its economy was an exemplar. India, by contrast, was a byword for basket case. Fifty years then went by. What was the result of this social experiment? India has become the state that we tried to create in Pakistan. It is a rising economic star, militarily powerful and democratic, and it shares American interests. Pakistan, however, is one of the most anti-American countries in the world, and a covert sponsor of terrorism. Politically and economically, it verges on being a failed state. And, despite Pakistani avowals to the contrary, America’s worst enemy, Osama bin Laden, had been hiding there for years—in strikingly comfortable circumstances—before U.S. commandos finally tracked him down and killed him, on May 2nd.

More here.

Michele Bachmann’s Holy War

Matt Taibbi in Rolling Stone:

Close your eyes, take a deep breath, and, as you consider the career and future presidential prospects of an incredible American phenomenon named Michele Bachmann, do one more thing. Don't laugh.

It may be the hardest thing you ever do, for Michele Bachmann is almost certainly the funniest thing that has ever happened to American presidential politics. Fans of obscure 1970s television may remember a short-lived children's show called Far Out Space Nuts, in which a pair of dimwitted NASA repairmen, one of whom is played by Bob (Gilligan) Denver, accidentally send themselves into space by pressing “launch” instead of “lunch” inside a capsule they were fixing at Cape Canaveral. This plot device roughly approximates the political and cultural mechanism that is sending Michele Bachmann hurtling in the direction of the Oval Office.

Bachmann is a religious zealot whose brain is a raging electrical storm of divine visions and paranoid delusions. She believes that the Chinese are plotting to replace the dollar bill, that light bulbs are killing our dogs and cats, and that God personally chose her to become both an IRS attorney who would spend years hounding taxpayers and a raging anti-tax Tea Party crusader against big government. She kicked off her unofficial presidential campaign in New Hampshire by mistakenly declaring it the birthplace of the American Revolution. “It's your state that fired the shot that was heard around the world!” she gushed. “You are the state of Lexington and Concord, you started the battle for liberty right here in your backyard.”

More here.

Beliefs that give meaning to life can’t be dislodged by factual evidence

Salman Hameed in The Guardian:

Millions of individuals in the UK believe in UFOs and ghosts. Yet we know that there is no credible evidence for any visitation from outer space or for some dead souls hanging out in abandoned houses. On the other hand, there is now overwhelming evidence that humans and other species on the planet have evolved over the past 4.5bn years. And yet 17% of the British population and 40% of Americans reject evolution. It seems that for many there is no connection between belief and evidence.

Some – maybe most – of the blame can be attributed to an education system that does not train people to think critically. Similarly, most people do not understand methodologies of science and the way theories get accepted. For some, scientific evidence has no role in the way they envision the world.

People who claim to have been abducted by aliens provide an interesting example. The “abductions” happen mostly in the early morning hours and, apart from psychological trauma, there is no physical evidence left behind. Some scientists have attributed these episodes to sleep paralysis – a momentary miscommunication between the brain and the body, just before going to sleep or waking up.

While abductions have most likely not taken place, the trauma experienced by the individuals may still be real.

More here.

Curing the Pelvic Headache

Robert Pinsky in the New York Times Book Review:

The appeal of conversion stories often depends on descriptions of the darkness before enlightenment: we enjoy learning in detail about the presalvation misery, debauchery or sinfulness. The more detail, the better. The English novelist Tim Parks understands that principle. In his urbane, droll, weird yet far from charmless account of the pain and misery suffered by his body in general, and by his bladder, prostate, penis and related bits in particular, the conversion is from a cerebral, anxious, hunched-over and compulsively verbal kvetch (not his term, but the literal “squeeze” makes the Yiddish word seem appropriate) to something resembling the opposite.

Like the reformed sinner who diverts his audience with lurid, prolonged accounts of nights in the fleshpots, Parks gives an amusing, anxiously over-the-top confession of his former condition: “I was nothing but tension. . . . I brushed my teeth ferociously, as if I wanted to file them down. I yanked on my socks as if determined to thrust my toes right through them. . . . When I pushed a command button, I did so as if it was my personal strength that must send the elevator to the sixth floor, or raise the door of the garage. While I shaved I tensed my jaw, while I read I tensed my throat, while I ate (too fast) I tensed my forehead, while I talked I tensed my shoulders, while I listened I tensed my neck, while I drove I tensed everything.”

This passage — much longer without my ellipses — extends over an entire page. The next paragraph begins, “And this is only the briefest summary of my chronically maladjusted state.”

More here.

O’Keeffe and Stieglitz


At the beginning of their correspondence, in 1915, she addressed him as Mr. Stieglitz. He called her Miss O’Keeffe. Within a few years, he was Dearest Duck and she was Fluffy. By 1933, when the first volume of their letters ends, much more than appellations had changed. Photographer Alfred Stieglitz and painter Georgia O’Keeffe had evolved from acquaintances to lovers and then from marital partners to distant combatants struggling to maintain a passionate relationship despite his infidelities and her quest for independence. Much has been written about Stieglitz and O’Keeffe — his pioneering modern art galleries and photographic work, her paintings of enormous flowers and Southwestern landscapes, their epic love affair. But “My Faraway One” is not just one more big book about the couple. It’s a substantial sampling of a huge trove of correspondence that was sealed until 2006, 20 years after O’Keeffe’s death, and the first annotated selection of those letters to appear in print.

more from Suzanne Muchnic at the LA Times here.

Hüsker Dü


It was early 1983, probably, after the “Everything Falls Apart” EP presaged Hüsker Dü’s departure from hard-core punk and before the “Metal Circus” EP made it official. Just a gig at a crummy club near CBGB, and late — after 1. There weren’t a dozen onlookers, but Hüsker Dü’s two early records were knockouts, and that Minneapolis trio never came east, so there we were. From our booth in back the music sounded terrific: headlong and enormous, the guitar unfashionably full, expressive and unending, with two raving vocalists alternating leads on songs whose words were hard to understand and whose tunes weren’t. Another half-dozen curious fans drifted in. And then, halfway through, the guitarist passed into some other dimension. When he stepped yowling off the low stage, most of us gravitated closer, glancing around and shaking our heads. The climax was the band’s now legendary cover of “Eight Miles High,” which transformed the Byrds’ gentle paean to the ­chemical-technological sublime into a roller coaster lifted screaming off its tracks — bruising and exhilarating, leaving the rider both very and barely alive.

more from Robert Christgau at the NYT here.

being human


What is human nature? A biologist might see it like this: humans are animals and, like all animals, consist mostly of a digestive tract into which they relentlessly stuff other organisms – whether animal or vegetable, pot-roasted or raw – in order to fuel their attempts to reproduce yet more such insatiable, self-replicating omnivores. The fundamentals of human nature, therefore, are the pursuit of food and sex. But that, the biologist would add, is only half the story. What makes human nature distinctive is the particular attribute that Homo sapiens uses to hunt down prey and attract potential mates. Tigers have strength, cheetahs have speed – that, if you like, is tiger nature and cheetah nature. Humans have something less obviously useful: freakishly large brains. This has made them terrifyingly inventive in acquiring other organisms to consume – and, indeed, in preparing them (what other animal serves up its prey cordon bleu?) – if also more roundabout in their reproductive strategies (composing sonnets, for example, or breakdancing). Human nature – the predilection for politics and war, industry and art – is, therefore, just the particularly brainy way that humans have evolved to solve the problems of eating and reproducing. Thus biologists believe that once they understand the human brain and the evolutionary history behind it, they will know all they need to about this ubiquitous brand of ape.

more from Stephen Cave at the FT here.

Rereading: Mildred Pierce

From Guardian:

Edmund Wilson once called James M Cain (1892-1977) one of America's “poets of the tabloid murder”. After Dashiell Hammett and Raymond Chandler, Cain is the writer most often credited with defining the “hard-boiled”, the tough-talking, fast-moving urban stories of violence, sex and money that characterised so much popular film and fiction in America during the 1930s and 40s. Unlike Hammett and Chandler, however, Cain did not focus his fiction on the consoling figure of the detective bringing a semblance of order to all that urban chaos. His novels are told from the perspective of the confused, usually ignorant, all-too-corruptible central actors in his lurid dramas of betrayal and murder. His first two novels, The Postman Always Rings Twice and Double Indemnity, were narrated by men destroyed by femmes fatales; both were made into enormously successful films, especially Billy Wilder's now-classic Double Indemnity, starring Fred MacMurray and Barbara Stanwyck in an improbable blonde wig.

In 1941, Cain published Mildred Pierce, his first novel to focus on a female protagonist; in 1945, it was duly made into a film, starring Joan Crawford in her only Oscar-winning performance, as an overprotective mother trying to cover up for her homicidal daughter. That version of Mildred Pierce is now a classic piece of stylish film noir; but its plot and tone diverge sharply from the novel, a more ostensibly “realistic” story about a divorced woman trying to raise her daughters in depression-era California. Now the film-maker Todd Haynes has returned to Cain's original text to bring us a mini-series of Mildred Pierce, with a cast including Kate Winslet in the title role, Evan Rachel Wood as the treacherous daughter and Guy Pearce. This new Mildred Pierce, produced for HBO with an apparently unlimited budget, may well be the most faithful adaptation of a book ever made: the dialogue is nearly verbatim, and the film moves painstakingly through a virtual transcription of Cain's novel. The attention to historical detail is astonishing, the performances outstanding, and the finished product is visually gorgeous, steeped in a golden sepia tone. But by the end some viewers may well be wondering what, exactly, about this story merited such reverential treatment: Cain's characterisation is uneven, to say the least, and the narrative is resolved only by means of contorted turns of the plot.

More here.

The Trouble With Common Sense

From The New York Times:

The popularity of the Mona Lisa is an illusion. As Duncan J. Watts explains: “We claim to be saying that the Mona Lisa is the most famous painting in the world because it has attributes X, Y and Z. But really what we’re saying is that the Mona Lisa is famous because it’s more like the Mona Lisa than anything else.” In other words, we are trapped inside a hall of mirrors of our own devising. We think the Mona Lisa is famous because of its traits, but we think those traits are significant only because they belong to the Mona Lisa, which we know to be famous. Ditto Shakespeare? Yes. When an incredulous English professor asked him whether he believed “Shakespeare might just be a fluke of history,” Watts indicated that he meant exactly that.

Watts doesn’t tell us how that conversation ended, but common sense does. Either the literature professor sputtered that Watts — a sociologist, physicist and former officer of the Australian Navy — had no idea what he was talking about, and left him standing with a half-­empty drink in his hand, or she was quite taken with his unorthodox views and spent the rest of the evening engrossed. That both outcomes — although incompatible — strike us as predictable is actually Watts’s point in this penetrating and engaging book. We rely on common sense to understand the world, but in fact it is an endless source of just-so stories that can be tailored to any purpose. “We can skip from day to day and observation to observation, perpetually replacing the chaos of reality with the soothing fiction of our explanations,” Watts writes. Common sense is a kind of bespoke make-believe, and we can no more use it to scientifically explain the workings of the social world than we can use a hammer to understand mollusks.

More here.

Does Islam Stand Against Science?

Steve Paulson in the Chronicle of Higher Education:

Science in Muslim societies already lags far behind the scientific achievements of the West, but what adds a fair amount of contemporary angst is that Islamic civilization was once the unrivaled center of science and philosophy. What's more, Islam's “golden age” flourished while Europe was mired in the Dark Ages.

This history raises a troubling question: What caused the decline of science in the Muslim world?

Now, a small but emerging group of scholars is taking a new look at the relationship between Islam and science. Many have personal roots in Muslim or Arab cultures. While some are observant Muslims and others are nonbelievers, they share a commitment to speak out—in books, blogs, and public lectures—in defense of science. If they have a common message, it's the conviction that there's no inherent conflict between Islam and science.

More here.

The Brain on Trial

Advances in brain science are calling into question the volition behind many criminal acts. A leading neuroscientist describes how the foundations of our criminal-justice system are beginning to crumble, and proposes a new way forward for law and order.

David Eagleman in The Atlantic:

On the steamy first day of August 1966, Charles Whitman took an elevator to the top floor of the University of Texas Tower in Austin. The 25-year-old climbed the stairs to the observation deck, lugging with him a footlocker full of guns and ammunition. At the top, he killed a receptionist with the butt of his rifle. Two families of tourists came up the stairwell; he shot at them at point-blank range. Then he began to fire indiscriminately from the deck at people below. The first woman he shot was pregnant. As her boyfriend knelt to help her, Whitman shot him as well. He shot pedestrians in the street and an ambulance driver who came to rescue them.

The evening before, Whitman had sat at his typewriter and composed a suicide note:

I don’t really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I can’t recall when it started) I have been a victim of many unusual and irrational thoughts.

By the time the police shot him dead, Whitman had killed 13 people and wounded 32 more. The story of his rampage dominated national headlines the next day. And when police went to investigate his home for clues, the story became even stranger: in the early hours of the morning on the day of the shooting, he had murdered his mother and stabbed his wife to death in her sleep.

It was after much thought that I decided to kill my wife, Kathy, tonight … I love her dearly, and she has been as fine a wife to me as any man could ever hope to have. I cannot rationa[l]ly pinpoint any specific reason for doing this …

Along with the shock of the murders lay another, more hidden, surprise: the juxtaposition of his aberrant actions with his unremarkable personal life. Whitman was an Eagle Scout and a former marine, studied architectural engineering at the University of Texas, and briefly worked as a bank teller and volunteered as a scoutmaster for Austin’s Boy Scout Troop 5. As a child, he’d scored 138 on the Stanford-Binet IQ test, placing in the 99th percentile. So after his shooting spree from the University of Texas Tower, everyone wanted answers.

For that matter, so did Whitman. He requested in his suicide note that an autopsy be performed to determine if something had changed in his brain—because he suspected it had.

More here.

Leap Seconds May Hit a Speed Bump

Sophie Bushwick in Scientific American:

NIST-F1 is one of several international atomic clocks used to define international civil time (dubbed Coordinated Universal Time, or UTC), a job they perform a little too well. In fact, atomic clocks are actually more stable than Earth's orbit—to keep clocks here synched up with the motion of celestial bodies, timekeepers have to add leap seconds. The use of a leap year, adding a day to February every four years, locks the seasons, which result from Earth's orbit about the sun and the planet's tilt as it orbits, into set places in the civil calendar. Similarly, leap seconds ensure that the time it takes Earth to spin 360 degrees is equal to one day as defined by humans and their atomic clocks. Most recently, an extra second was tacked on to universal time on December 31, 2008.

However, since 1999, the Radiocommunication Sector of the International Telecommunication Union (ITU) has been proposing the elimination of leap seconds from the measurement of UTC. Although the organization did not participate in the creation of the current leap second system, the radio waves it regulates are used to transmit UTC, giving it some influence.

Getting rid of leap seconds would certainly make it easier to calculate UTC, but this measure would also decouple astronomical time from civil time: The time measured by atomic clocks would gradually diverge from the time counted out by the movement of Earth through space. Eventually, one year will no longer be the length of Earth's orbit around the sun. Instead, it will be equivalent to a certain number of cycles of radiation from the cesium-133 atom (almost a billion billion cycles, to be precise).
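That cycle count follows directly from the SI definition of the second: 9,192,631,770 cycles of the radiation from the cesium-133 hyperfine transition. A quick back-of-the-envelope check:

```python
# SI definition: one second = 9,192,631,770 cesium-133 radiation cycles.
CESIUM_CYCLES_PER_SECOND = 9_192_631_770

# A mean tropical year is about 365.2422 days of 86,400 SI seconds each.
seconds_per_year = 365.2422 * 86_400
cycles_per_year = CESIUM_CYCLES_PER_SECOND * seconds_per_year

print(f"{cycles_per_year:.3e}")  # ~2.9e17 cycles per year
```

So an atomic-clock year is a fixed count of roughly 2.9 × 10^17 cesium cycles, while an astronomical year drifts slowly against it; leap seconds are the patch that keeps the two aligned.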

More here.