The Idea of Egypt Begins to Emerge in 1955
AD After a Field Trip to the Mummy in
the Nashville Museum of Natural History
Unlike us they spoke with a foreign accent
Egyptian I guess and they didn’t live very long
or maybe they just kept shrinking like an old person
does on account of their being very extremely
wrinkled they took trinkets with them into their
burial chambers kind of like the stuff I would
pull out of my Mother’s top dresser drawer
that she always made me put back and couldn’t
for the life of her understand why I took it out
in the first place the mummy was kind of scary
the first time I saw it but by the second field trip
I was older and all and could look into the holes
where the ears eyes nose and mouth used to be
and count the threads on the dirty gray linen
that wound around the bones there was hardly
any other place to go on a field trip to except
Fort Nashboro which was a pretty poor excuse
for a fort which I guess is why the Cherokee
made the early Nashvillians mostly miserable
and the Upper Room which when you got there
all it was a big wooden carving of a famous
painting of the last supper which we had every
year at my house anyway except instead of Jesus
and the disciples we had my family and my sisters’
dumb boyfriends one girl fainted the first time
we saw the mummy and lots of girls screamed
kind of like they thought they were supposed to
even though it was a museum there’s a limit to
how long you can look at a mummy though and
after about two minutes we moved on down to
the glass cases that held real Indian arrowheads
that were probably just dug up out of the ground
but which I couldn’t help hoping had been ripped
from the still-beating hearts of the coon-skinned
soldiers at Fort Nashboro and plunked on Jesus’
seder plate what do you think y’all get back on the bus
now would have sounded like in hieroglyphic
by Arne Weingart
from ABZ Press
Adam Grant in The New York Times:
NORMALLY, I would have finished this column weeks ago. But I kept putting it off because my New Year’s resolution is to procrastinate more. I guess I owe you an explanation. Sooner or later.

We think of procrastination as a curse. Over 80 percent of college students are plagued by procrastination, requiring epic all-nighters to finish papers and prepare for tests. Roughly 20 percent of adults report being chronic procrastinators. We can only guess how much higher the estimate would be if more of them got around to filling out the survey.

But while procrastination is a vice for productivity, I’ve learned — against my natural inclinations — that it’s a virtue for creativity. For years, I believed that anything worth doing was worth doing early. In graduate school I submitted my dissertation two years in advance. In college, I wrote my papers weeks early and finished my thesis four months before the due date. My roommates joked that I had a productive form of obsessive-compulsive disorder. Psychologists have coined a term for my condition: pre-crastination.
…Steve Jobs procrastinated constantly, several of his collaborators have told me. Bill Clinton has been described as a “chronic procrastinator” who waits until the last minute to revise his speeches. Frank Lloyd Wright spent almost a year procrastinating on a commission, to the point that his patron drove out and insisted that he produce a drawing on the spot. It became Fallingwater, his masterpiece. Aaron Sorkin, the screenwriter behind “Steve Jobs” and “The West Wing,” is known to put off writing until the last minute. When Katie Couric asked him about it, he replied, “You call it procrastination, I call it thinking.”
So what if creativity happens not in spite of procrastination, but because of it?
Dan Piepenbring in The Paris Review:
Have you ever been tied in close contact with a person who had a strong sense of inferiority? I have, and it is hell. They carry it like a raw sore on the end of the index finger. You go along thinking well of them and doing what you can to make them happy and suddenly you are brought up short with an accusation of looking down on them, taking them for a fool, etc., but they mean to let you know and so on and so forth. It colors everything.

For example, I took this man that I cared for down to Carl Van Vechten’s one night so that he could meet some of my literary friends, since he had complained that I was always off with them, and ignoring him. I hoped to make him feel at home with the group and included so that he would go where I went. What happened? He sat off in a corner and gloomed and uglied away, and we were hardly out on the street before he was accusing me of having dragged him down there to show off what a big shot I was and how far I was above him. He had a good mind, many excellent qualities, and I am certain that he loved me. But his feeling of inferiority would crop up and hurt me at the most unexpected moments. Right in the middle of what I considered some sweet gesture on my part, I would get my spiritual pants kicked up around my neck like a horse-collar. I asked him to bring me all the clippings on TELL MY HORSE, and he brought several and literally flung them at me. “You had read them” he accused, “and knew that they were flattering. You just asked me to get them to see how great you were.”

You know how many marriages in the literary and art world have broken up on such rocks, to say nothing of other paths of life. A business man is out scuffling for dear life to get things for the woman he loves, and she is off pouting and accusing him of neglecting her. She feels that way because she does not feel herself able to keep up with the pace that he is setting, and just be confident that she is wanted no matter how far he goes.
Millions of women do not want their husbands to succeed for fear of losing him. It is a very common ailment. That is why I decided to write about it.
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Dan Jones in Nature:
In Joel and Ethan Coen's 2009 film A Serious Man, physics professor Larry Gopnik is in the middle of an existential crisis. In a dream, he gives a lecture on Heisenberg's uncertainty principle; Sy Ableman, the older man with whom Gopnik's wife is having an affair, stays on after the students disperse. In a condescending drawl, he addresses Gopnik and his equation-covered chalkboard: “I'll concede that it's subtle, clever — but at the end of the day, is it convincing?”
Philosopher and cognitive scientist Daniel Dennett has been hearing variants of this riposte for decades. If history is a guide, his latest book, From Bacteria to Bach and Back, will elicit similar responses. It is a supremely enjoyable, intoxicating work, tying together 50 years of thinking about where minds come from and how they work. Dennett's path from the origins of life to symphonies is long and winding, but you couldn't hope for a better guide. Walk with him and you'll learn a lot.
The book's backbone is Charles Darwin's theory of natural selection. That replaced the idea of top-down intelligent design with a mindless, mechanical, bottom-up process that guides organisms along evolutionary trajectories into ever more complex regions of design space. Dennett also draws heavily on the idea of 'competence without comprehension', best illustrated by mathematician Alan Turing's proof that a mechanical device could do anything computational. Natural selection has created, through genetic evolution, a world rich in competence without comprehension — the bacteria, trees and termites that make up so much of Earth's biomass.
Yet, as Dennett and others argue, genetic evolution is not enough to explain the skills, power and versatility of the human mind.
Vasudevan Mukunth in The Wire:
On February 8, 1917, Einstein published a paper titled ‘Kosmologische Betrachtungen zur allgemeinen Relativitätstheorie‘ (‘Cosmological Considerations in the General Theory of Relativity’). In it, he described a number called the cosmological constant. The constant had a value such that, when used in his newly created equations describing the behaviour of the gravitational force, a non-changing universe was spit out – agreeing with knowledge at the time, as well as his belief, that the universe was static. Without the constant in the picture, on the other hand, Einstein’s general theory of relativity suggested that the gravitational pull of masses contained in the universe would pull all the matter together, keeping the universe dynamic.
It would be more than a decade before evidence would begin to emerge that the universe was expanding. And it would be scores of years before astronomers would find that the expansion was also accelerating. Then again, it would be many years before Einstein realised his actual mistake.
While he popularly considered his addition of the constant to be an affront to his own work, it may not have been as bad as he thought. The cosmological principle states, or rather assumes, that the universe at the largest scales has the same properties everywhere. This is a spatial definition. An extension called the ‘perfect’ cosmological principle states that the universe at the largest scales has the same properties everywhere and at every time, implying that it has always remained the way it is today and it will continue to be this way forever. This is also called the steady-state theory, an alternative to the Big Bang theory that has been widely discredited – and which Einstein himself pursued for a while in 1931.
Anyway, in defence of Einstein, theoretical astrophysicist Peter Coles writes on his blog, “General relativity, when combined with the cosmological principle, but without the cosmological constant, requires the universe to be dynamical rather than static. If anything, therefore, you could argue that Einstein’s biggest blunder was to have failed to predict the expansion of the Universe!” Indeed, if Einstein had not decided to fudge his own monumental equations, he may have been onto something.
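For readers who want to see the modification concretely, Einstein’s field equations with the cosmological constant added take the following form (a standard textbook statement of the equations, not a quotation from the 1917 paper):

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

With $\Lambda = 0$, solutions for a matter-filled universe obeying the cosmological principle are dynamical; Einstein chose a positive $\Lambda$ so that its effective repulsion would exactly balance gravitational attraction, yielding his static model. The balance is unstable, which is part of why the construction later looked like a fudge.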
Andrew Sullivan in New York Magazine:
I guess I should start by saying this is not a blog. Nor is it what one might call a column. It’s an experiment of sorts to see if there’s something in between those two. Most Fridays, from now on, I’ll be writing in this space about, among other things, the end of Western civilization, the collapse of the republic, and, yes, my beagles. If you’re a veteran reader of my former site, the Dish, you may find yourselves at times in an uncanny valley. So may I. The model I’m trying to follow is more like the British magazine tradition of a weekly diary — on the news, but a little distant from it, personal as well as political, conversational more than formal.
I want to start with Trump’s lies. It’s now a commonplace that Trump and his underlings tell whoppers. Fact-checkers have never had it so good. But all politicians lie. Bill Clinton could barely go a day without some shading or parsing of the truth. Richard Nixon was famously tricky. But all the traditional political fibbers nonetheless paid some deference to the truth — even as they were dodging it. They acknowledged a shared reality and bowed to it. They acknowledged the need for a common set of facts in order for a liberal democracy to function at all. Trump’s lies are different. They are direct refutations of reality — and their propagation and repetition is about enforcing his power rather than wriggling out of a political conundrum. They are attacks on the very possibility of a reasoned discourse, the kind of bald-faced lies that authoritarians issue as a way to test loyalty and force their subjects into submission. That first press conference when Sean Spicer was sent out to lie and fulminate to the press about the inauguration crowd reminded me of some Soviet apparatchik having his loyalty tested to see if he could repeat in public what he knew to be false. It was comical, but also faintly chilling.
My friend Todd Shea and his organization CDRS are doing great work in Pakistan. Please help them by donating what you would like by clicking here. This video shows just one of their many projects.
Video length: 3:20
Diego Azurdia at The Quarterly Conversation:
Colonel revitalizes the notion of a literature that exists for and from the moment of writing, and it avoids the accompanying unchecked optimism in the possibility of transcendence by foregrounding failure. We are reminded of the pure effort, an idea coined by Ortega y Gasset in order to describe Don Quixote’s adventures and King Philip II’s construction of El Escorial. Both gargantuan in dimensions, much like the Colonel’s Vertigos of the Century, they exemplify those projects that foreground will over structure and design, so as to find their justification in effort itself. They are doomed to fail and inevitably result in a sheer state of melancholy, arguably Iberian and Latin American.
I guess it is unavoidable to end with a discussion of the tradition. Latin American writers have often dealt with challenges springing from the historical and problematic relation between cultural production and politics. The place of the intellectual in the continent has been a highly contested matter, and what we see in the Colonel is a kind of embodiment of its different stages: from an “enlightened” academic in its universal (mathematical) labors, to the political subject attempting to participate in the movements of his time, to the forgotten hermit attempting to memorialize his own life. If anything, Colonel is a novel that attempts to work out the difficult question about the place of the contemporary intellectual.
Carlo Rovelli at The New York Times:
Alan Burdick’s “Why Time Flies” certainly does not answer our every question. And precisely for this reason it captures us. Because it opens up a well of fascinating queries and gives us a glimpse of what has become an ever more deepening mystery for humans: the nature of time.
Time may appear unproblematic at first. What is there to say about it? It flies, things happen in the fullness of it, clocks measure it, and we are well aware of its passage. This review shall take you perhaps three minutes to read. Nothing particularly curious about that. But the closer we look, the less clear our temporal sense becomes: First, our brain, body and cells all keep track of time in a variety of ways that are not all that well understood. Psychologists are puzzled by a wealth of experiments showing that we process time in more subtle and complex ways than we expected. Some neuroscientists interpret the brain as a “time machine,” whose core mechanism is to collect past memories in order to predict the future. Philosophers debate the very existence of time. And perhaps most disconcertingly of all, physics teaches us that physical time happens to be astonishingly different from how we intuit it: it runs at different speeds at different altitudes; is distorted by matter; is not organized in a straightforward past, present and future. Advanced tentative theories of the universe even discard temporality altogether from the basic ingredients of the world. From whatever side we address it, the nature of time is a source of perplexity and wonder.
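The claim that time runs at different speeds at different altitudes is not a metaphor but a quantitative prediction. In the weak-field approximation (a standard general-relativity result, not drawn from the book under review), a clock raised a height $h$ in Earth’s gravity ticks faster by

```latex
\Delta\tau \approx \Delta t \left(1 + \frac{g h}{c^{2}}\right)
```

where $g \approx 9.8\,\mathrm{m/s^{2}}$. Raising a clock by one metre changes its rate by roughly one part in $10^{16}$, a shift modern optical clocks can actually measure.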
John Mullan at The Guardian:
Oliver Goldsmith has always been a puzzle. So he was to his contemporaries, many of whom found him, as the actor David Garrick put it, “a mixture so odd” of contradictory qualities. Was he brilliant or foolish? The painter Joshua Reynolds recalled that Goldsmith liked to argue from “false authorities” and talk humorous nonsense. Listeners never knew when to take him seriously. He is a puzzle to literary history too: he dabbled in this genre and that, producing no coherent body of work, yet managed to write a handful of small masterpieces.
There is his brilliant comedy of social pretensions and mistaken identities She Stoops to Conquer, almost the only play of the 18th century apart from Sheridan’s work still to be staged and relished. There is his nostalgic, melancholy poem “The Deserted Village”, once a favourite of all poetry anthologists, its quotability adaptable to any political perspective. “Ill fares the land, to hastening ills a prey, / Where wealth accumulates and men decay”. Above all, perhaps, there is The Vicar of Wakefield, one of the most frequently reprinted novels in English. This hilarious pastiche of the Book of Job manages to seem both a deliciously innocent tale and a wicked mockery of sentimentality. In its naive, sententious, oddly endearing narrator, Dr Primrose, Goldsmith created one of the great unreliable narrators of British fiction.
Robert McCrum in The Guardian:
William James, brother of the more famous Henry, was a classic American intellectual, a brilliant New Englander and renowned pragmatist – a celebrity in his time who coined the phrase “stream of consciousness”. He responded to the cultural and social ferment of the late 19th century with the Gifford lectures, given in Edinburgh during 1900-02. When he turned these talks into a book, James, a Harvard psychologist and the author of The Principles of Psychology, placed himself at the crossroads of psychology and religion to articulate an approach to religious experience that would help liberate the American mind at the beginning of the 20th century from its puritan restrictions by advancing a pluralistic view of belief inspired by American traditions of tolerance. Like his brother, he was obsessed by the problem of expressing individual consciousness through language; this is just one of the principal themes of The Varieties of Religious Experience.
…The idea that all citizens were equally and independently close to God sponsored among the James family the conviction that religious experience should not become confined within the narrow prison of a denomination. The same irreverence towards categories encouraged William James to adopt a high-low style that gives his writing a fresh and populist character that’s rather different from the mature style of his brother the novelist. William used his populism to suggest that any religious experience was “true” if the consequences of holding it were pleasing to the individual concerned. This restatement of the American pursuit of happiness gave his audiences a new appreciation of human dignity grounded in everyday reality.
Sam Roberts in The New York Times:
Nearly a century ago, during the Harlem Renaissance, the activist and writer James Weldon Johnson described his neighborhood as “a city within a city, the greatest Negro city in the world.” But he wondered, “Are the Negroes going to be able to hold Harlem?” “When colored people do leave Harlem,” he wrote, “their homes, their churches, their investments and their businesses, it will be because the land has become so valuable that they can no longer afford to live on it.” He was correct in predicting in 1925 that “the date of another move northward” — as when blacks had displaced Jews, long after Harlem had begun as a Dutch suburb — “is very far in the future.” Three new books explore Harlem’s evolution since then.
In “Harlem: The Crucible of Modern African American Culture” (Praeger, $48), Lionel C. Bascom, who teaches in the department of writing, linguistics and creative process at Western Connecticut State University, argues that the Harlem Renaissance of the 1920s was not a distant, passing phase, but “a period that left indelible influence” on black culture and beyond. Still, by the early 21st century, greater Harlem’s majority population was no longer black, as Johnson had foretold. What happened to the poorest tenants and small shopkeepers? Would the neighborhood lose its cultural cachet as its inhabitants were increasingly priced out?
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Sam Roberts in The New York Times:
Hans Rosling, a Swedish doctor who transformed himself into a pop-star statistician by converting dry numbers into dynamic graphics that challenged preconceptions about global health and gloomy prospects for population growth, died on Tuesday in Uppsala, Sweden. He was 68.
The cause was pancreatic cancer, according to Gapminder, a foundation he established to generate and disseminate demystified data using images.
Even before “post-truth” entered the lexicon, Dr. Rosling was echoing former Senator Daniel Patrick Moynihan’s maxim that everyone is entitled to his own opinions but not to his own facts.
“He challenged the whole world’s view of development with his amazing teaching skills,” Isabella Lovin, Sweden’s deputy prime minister, said in a statement.
On Twitter, Bill Gates remembered Dr. Rosling as a “great friend, educator and true inspiration.”
A self-described “edutainer,” Dr. Rosling captivated vast audiences in TED Talks — beginning a decade ago in front of live audiences and later viewed online by millions — and on television documentaries like the BBC’s “The Joy of Stats” in 2010.
Jennifer Ouellette in New Scientist:
The same type of artificial intelligence that mastered the ancient game of Go could help wrestle with the amazing complexity of quantum systems containing billions of particles.
Google’s AlphaGo artificial neural network made headlines last year when it bested a world champion at Go. After marvelling at this feat, Giuseppe Carleo of ETH Zurich in Switzerland thought it might be possible to build a similar machine-learning tool to crack one of the knottiest problems in quantum physics.
Now, he has built just such a neural network – which could turn out to be a game changer in understanding quantum systems.
Go is far more complex than chess, in that the number of possible positions on a Go board could exceed the number of atoms in the universe. That’s why an approach based on brute-force calculation, while effective for chess, just doesn’t work for Go.
In that sense, Go resembles a classic problem in quantum physics: how to describe a quantum system that consists of many billions of atoms, all of which interact with each other according to complicated equations.
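To make the analogy slightly more concrete, here is a minimal, illustrative sketch of the core idea behind neural-network quantum states — not Carleo’s actual code or method (his work with Matthias Troyer used a restricted Boltzmann machine trained by variational Monte Carlo). The names, sizes, and random parameters below are all invented for the example; the point is only that a small set of network weights can stand in for the exponentially many amplitudes of a many-spin wavefunction:

```python
import numpy as np

# Illustrative restricted-Boltzmann-machine "ansatz": a function mapping a
# spin configuration s in {-1,+1}^N to an (unnormalised) wavefunction
# amplitude psi(s). The variational parameters are a, b, W.
rng = np.random.default_rng(0)

n_visible, n_hidden = 6, 4
a = rng.normal(scale=0.1, size=n_visible)              # visible biases
b = rng.normal(scale=0.1, size=n_hidden)               # hidden biases
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))  # spin-hidden couplings

def amplitude(spins):
    """psi(s) = exp(a.s) * prod_j 2*cosh(b_j + sum_i W_ji s_i)."""
    theta = b + W @ spins
    return np.exp(a @ spins) * np.prod(2.0 * np.cosh(theta))

# For 6 spins we can still enumerate all 2^6 = 64 configurations and
# normalise by brute force. For billions of particles that is impossible --
# which is exactly where the compact parametrisation (a, b, W instead of
# 2^N stored amplitudes) is meant to pay off.
configs = [np.array([1 if (k >> i) & 1 else -1 for i in range(n_visible)])
           for k in range(2 ** n_visible)]
norm = sum(amplitude(s) ** 2 for s in configs)
probs = [amplitude(s) ** 2 / norm for s in configs]
print(f"{len(configs)} configurations, probabilities sum to {sum(probs):.6f}")
```

In the real method, the parameters are optimised to minimise the energy of a target Hamiltonian, with configurations sampled by Monte Carlo rather than enumerated.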
Amitava Kumar in The New Yorker:
If you threw a dart at the heart of India but your aim was off, a little low and to the right, you would hit the village of Matenar, in the administrative division of Bastar, in the state of Chhattisgarh. Though the region’s lush forests once found mention in ancient Sanskrit epics, Bastar now evokes for many Indians the threat or fear of the Naxalites, Maoist guerrilla groups that have waged a fifty-year insurgency against the national government. But Bastar also represents the ugly side of India’s Janus-faced democracy. The place where your errant dart fell is fabled for its mineral wealth, especially iron, coal, tin, and bauxite, and yet its inhabitants, most of whom belong to India’s indigenous population, the adivasis, are among the poorest in the country. Visitors to Chhattisgarh see dense jungle, small huts, and immense mines, but few schools, health centers, or hospitals. New construction seems devoted mostly to four-lane highways—the better to transport government troops into the state and minerals out of it.
In the past decade alone, more than two thousand people, most of them ordinary civilians, have died in the conflict between government forces and the Maoists. The aim of the police, who in many cases might more properly be called state-sponsored vigilantes, is to establish, with maximum force, federal sovereignty over Bastar—and to make the land safe for mining companies. In 2011, no less an authority than the Supreme Court of India compared Chhattisgarh to the Congo described in Joseph Conrad’s “Heart of Darkness.” With a startling forthrightness, the Justices observed that “predatory forms of capitalism, supported and promoted by the State in direct contravention of constitutional norms and values, often take deep roots around the extractive industries.”
Video length: 5:24
Zach Rabiroff at Open Letters Monthly:
Two and a half millennia ago, on the tiny Greek island of Sphacteria, something unthinkable happened. In the spring of 425 B.C., a small garrison of Athenian hoplites (heavily-armored spearmen who provided the staple of Greek fighting forces) landed on the sandy promontory of Pylos in the southern Peloponnese, and promptly began setting up camp for a long-term occupation. Their objective was to build a raiding base against the mighty Peloponnesian city of Sparta, against whom the Athenians had been waging war for six consecutive years, but the presence of an Athenian army within arm’s length of the Spartan homeland drew a swift response. Soon, a Spartan army was marching out to lay siege to Pylos. To block the entrance to the harbor, and prevent food and supplies from reaching the beleaguered fort, 420 Spartans took up position on the wooded island of Sphacteria just offshore. As Athenian stomachs grumbled, the Spartans settled in for certain victory.
But they had made a dreadful miscalculation. The Athenians were the mightiest sea power of the ancient world, with a vastly larger and more experienced navy than the landlubbing Peloponnesians. Within days, a fleet of Athenian triremes had seized control of the harbor and encircled the tiny force of soldiers on Sphacteria. Now it was the Spartans’ turn to starve. For several weeks, intrepid smugglers supplied the stranded Spartans on Sphacteria with food and water, tying waterproof sacks to the backs of helot slaves, who darted between Athenian patrol ships.
Peter Schjeldahl at The New Yorker:
The enigmatic, fantastically erudite artist Raymond Pettibon takes to Twitter like a bird to sky. My favorite of some fifty tweets that he posted on a recent day offers a reason that Donald Trump can’t be the Antichrist: “Not charming, goodlooking, endearing enuff.” In his art, Pettibon only sometimes addresses topical politics, or topical anything, but he knows his archetypes, and it’s nice to have eschatological expertise on current events. How seriously to take it is an uncertainty that haunts all of Pettibon’s art, which is surveyed in “A Pen of All Work,” a retrospective at the New Museum of some seven hundred creations, mostly drawings with text. He has intrigued and befuddled a growing audience since the late nineteen-seventies, when he emerged, in Hermosa Beach, California, as a bookish surfer who made flyers and album covers for the punk band Black Flag (his older brother Greg Ginn was the founder and guitarist) and a flurry of zines. His fame took hold slowly, and it remains confined largely to fine-art circles. Seeing the show is like being lost in a foreign but strangely familiar city, where polyphonic disembodied voices whisper, yell, or sputter wit and wisdom that you’re rarely sure that you heard quite right.
The title, “A Pen of All Work,” is from Byron’s “The Vision of Judgement,” in which the mediocre poet Robert Southey proposes to ghostwrite a memoir for Satan and, upon being rebuffed, extends the same offer to the archangel Michael. This befits Pettibon, who says that roughly a third of his texts are lifted, or rephrased, from cherished writers: a pantheon in which St. Augustine consorts with Henry James and Mickey Spillane. But every Pettibon phrasing sounds like a quotation from someone else, often in the formal, slightly stilted tones of a Victorian wordsmith.
Thomas Frank at the Australian Financial Review:
We always overlook the class interests of professionals because we have trouble thinking of professionals as a "class" in the first place. Still, if we want to understand the problems of liberalism, this is where we must look: at the assumptions and collective interests of professionals, the Democratic Party's favourite constituency.
With the rise of the post-industrial economy in the last few decades, the range of professionals has exploded. To use the voguish term, these are "knowledge workers", and many of them don't fit easily into the old framework. In addition to doctors, lawyers, the clergy, architects and engineers, the category includes economists, experts in international development, political scientists, managers, financial planners, computer programmers, aerospace designers… The top ranks of the professions are made up of highly affluent people.
The Democratic Party has other constituencies to be sure – minorities, women, and the young, for example, the other pieces of what they call the "coalition of the ascendant" – but professionals are the ones whose technocratic outlook tends to prevail. It is their tastes that are celebrated by liberal newspapers and it is their particular way of regarding the world that is taken for granted by liberals as being objectively true. Professionals dominate liberalism and the Democratic Party in the same way that Ivy Leaguers and professionals dominated the Obama cabinet (which might, by itself, guarantee closed minds and ideological uniformity).