Rebecca Newberger Goldstein in the New York Times:
The philosopher Sidney Morgenbesser, beloved by generations of Columbia University students (including me), was known for lines of wit that yielded nuggets of insight. He kept up his instructive shtick until the end, remarking to a colleague shortly before he died: “Why is God making me suffer so much? Just because I don’t believe in him?” For Morgenbesser, nothing worth pondering, including disbelief, could be entirely de-paradoxed.
The major thesis of Tim Whitmarsh’s excellent “Battling the Gods” is that atheism — in all its nuanced varieties, even Morgenbesserian — isn’t a product of the modern age but rather reaches back to early Western intellectual tradition in the ancient Greek world.
The period that Whitmarsh covers is roughly 1,000 years, during which the Greek-speaking population emerged from illiteracy and anomie, became organized into independent city-states that spawned a high-achieving culture, were absorbed into the Macedonian Empire and then into the Roman Empire, and finally became Christianized. These momentous political shifts are efficiently traced, with astute commentary on their reflection in religious attitudes.
You're sitting at a table with a friend, and a stranger offers you some candy. Hooray! Who doesn't like candy? But wait! You're not getting the same amounts. One of you gets four delicious pieces, and the other gets a measly one. Does that feel unfair? Do you bristle? Do you forfeit your candy and your friend’s candy, because they’re unevenly distributed?
For decades, psychologists have argued that the answers depend on how old you are, and whether you're the one with the bigger or smaller share. Adults seem to reject inequality of any form, and will pay a personal cost to avoid it even if they stand to get a bigger slice of the pie. Children are more nuanced.
In 2011, Katherine McAuliffe and Peter Blake showed that 8-year-olds, like adults, will reject any unequal offer. But younger children, aged 4 to 7, bristle only at situations in which they are disadvantaged. In other words, they'd take the four pieces of candy, thank you very much, and screw the other kid.
“They start out with this very self-focused idea that they recognize unfairness when it’s unfair to me,” says Blake. “It takes more years for different psychological processes to kick in before they can flip that, and say: What's unfair to you is also unfair in general.”
These and other experiments have shown that our aversion to advantageous inequity (when we get more than others) is distinct from our aversion to disadvantageous inequity (when others get more than us). These two reactions involve different parts of the brain. They appear at different ages. They appear in different species: Chimpanzees and capuchins don't like disadvantageous inequity, but they'll tolerate the advantageous kind just as much as 4-year-old humans.
Last week, Republican frontrunner Donald Trump expressed support for a database of Muslims in the United States, a registry so that “we” can keep track of “them.” Trump, of course, is no friend to civil liberties. We know this from his 1989 advocacy of the death penalty for five Black New York boys whom police had forced to confess to a rape and attempted murder they did not commit and Trump’s subsequent refusal to apologize to the boys in the face of exculpatory evidence so overwhelming that the state has dropped the charges. The youths (now adults) have been awarded millions of dollars in compensation for their wrongful imprisonment and received an apology from the state. But Trump is not alone.
…But let’s be clear about what just happened: in a nation founded on principles of religious freedom and expression, a major candidate in a presidential race has suggested that we target, profile, and “manage” a group of people on the basis of their religious beliefs and presumed ethnicity. This is fascism, not democracy, and it represents the fullest expression of the state’s racist and xenophobic biopolitical impulses. Little surprise that pundits are connecting Trump’s comments to Nazi Germany. Hitler, who felt it useful for governments that “men do not think,” moved far beyond registries and special identification badges to containment and extermination. Hate is a slippery slope, but the U.S. has an insidious history of draping itself in the mantle of patriotism while squelching human rights both here and elsewhere.
…Trump’s Islamophobia, shared by all too many around the world, in which the Other becomes a datapoint to be managed rather than a human being, and Congress’s anti-refugee bill embody a “logical” connection between biopolitics and fascism. Michel Foucault theorized biopolitics as the application of political power to human life, with “race” positioned as a technology of classification. Achille Mbembe moved beyond biopower, which he saw as insufficient to account for contemporary forms of killing, to necropolitics, locating the sovereign state’s right to kill in histories of colonization. Fascism is an ideal political form for the expression and operation of necropolitics, allowing the authoritarian state to decide whose lives matter.
This vintage video from the U.S. Department of Agriculture actually gives a very good primer on carving—frankly, it’s the best guide I’ve found, and the thigh-meat trick is indeed neat, even if the announcer’s chummy tone can grate. (Be sure to watch long enough to hear him intone, “There goes that drumstick for a hungry boy!”)
But it raises other questions. Mainly: What is “turkey time,” and why is it separate from “carving time”? Best of all is the rather menacing, passive-aggressive coda: “You can carve without these directions, but you can probably carve better with them.” As a random drunk in a bar once slurred at me when I said I didn’t want to go to the pier with him, “Fine, whatever, just thought you might want to see the Statue of Liberty!”
Wanderer, your footsteps are
the road, and nothing more;
wanderer, there is no road,
the road is made by walking.
By walking one makes the road,
and upon glancing back one sees
the path that must never be trod again.
Wanderer, there is no road—
Only wakes upon the sea.
Maria Popova reviews Lisa Randall's Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe in The NYT Book Review:
A good theory is an act of the informed imagination — it reaches toward the unknown while grounded in the firmest foundations of the known. In “Dark Matter and the Dinosaurs,” the Harvard cosmologist Lisa Randall proposes that a thin disk of dark matter in the plane of the Milky Way triggered a minor perturbation in deep space that caused the major earthly catastrophe that decimated the dinosaurs. It’s an original theory that builds on a century of groundbreaking discoveries to tell the story of how the universe as we know it came to exist, how dark matter illuminates its beguiling unknowns and how the physics of elementary particles, the physics of space, and the biology of life intertwine in ways both bewildering and profound.
If correct, Randall’s theory would require us to radically reappraise some of our most fundamental assumptions about the universe and our own existence. Sixty-six million years ago, according to her dark-matter disk model, a tiny twitch caused by an invisible force in the far reaches of the cosmos hurled a comet three times the width of Manhattan toward Earth at least 700 times the speed of a car on a freeway. The collision produced the most powerful earthquake of all time and released energy a billion times that of an atomic bomb, heating the atmosphere into an incandescent furnace that killed three-quarters of Earthlings. No creature heavier than 55 pounds, or about the size of a Dalmatian, survived. The death of the dinosaurs made possible the subsequent rise of mammalian dominance, without which you and I would not have evolved to ponder the perplexities of the cosmos.
Paul Churchland reviews Richard Rorty's Mind, Language, and Metaphilosophy: Early Philosophical Papers in Notre Dame Philosophical Review:
This glowing collection includes Rorty's earliest publications — from 1961 through 1972 — and his earliest attempts to deal with the broad landscape of problems that engulfed our discipline in the second half of the 20th Century: most centrally (for Rorty), analytical reductionism, the mind/body problem, the distinguishing or defining feature of the mental, and the proper methodology for philosophy itself. The arc of Rorty's adventures here mirrors the arc of our professional concerns generally in that period, not least because Rorty was an influential contributor to those discussions, but also because he addressed in depth the work of his most prominent philosophical contemporaries, such as Carnap, Wittgenstein, Ryle, Strawson, Sellars, Quine, and Dennett. His insightful commentaries on these figures, and others, are worth the price of this collection all by themselves.
But a larger issue shapes the relevance of Rorty's essays here. Despite an initial philosophical education that was decidedly classical, Rorty was captured by C.S. Peirce's late 19th-century pragmatism, a philosophical perspective that never left him. And that fertile perspective dominates all of his philosophical activity throughout these essays. For example, given the basic pragmatist conviction that the ultimate function of cognitive activity is to survive in and to navigate the peculiar environment in which one happens to be embedded, a philosopher is very unlikely to find plausible a story that attempts to reduce or translate all empirical statements into a unique and philosophically basic vocabulary of sensory simples. For the sensory vocabulary we happen to use is also plastic, is also in the business of helping us to navigate the world, and is ultimately to be evaluated by the pragmatic virtues that drive and select our conceptual resources generally. Understanding our sensory access to the world is indeed of major importance, and it commands constant evaluation and reevaluation. But our first-person sensory judgements themselves do not constitute an independent touchstone, forever free from pragmatic evaluation. According to Rorty, they are an integral part of the overall epistemic contest. Classical empiricism is thus pushed aside.
The official TLC trip record dataset contains data for over 1.1 billion taxi trips from January 2009 through June 2015, covering both yellow and green taxis. Each individual trip record contains precise location coordinates for where the trip started and ended, timestamps for when the trip started and ended, plus a few other variables including fare amount, payment method, and distance traveled.
I used PostgreSQL to store the data and PostGIS to perform geographic calculations, including the heavy lifting of mapping latitude/longitude coordinates to NYC census tracts and neighborhoods. The full dataset takes up 267 GB on disk, before adding any indexes. For more detailed information on the database schema and geographic calculations, take a look at the GitHub repository.
Thanks to the folks at FiveThirtyEight, there is also some publicly available data covering nearly 19 million Uber rides in NYC from April–September 2014 and January–June 2015, which I’ve incorporated into the dataset. The Uber data is not as detailed as the taxi data, in particular Uber provides time and location for pickups only, not drop offs, but I wanted to provide a unified dataset including all available taxi and Uber data. Each trip in the dataset has a cab_type_id, which indicates whether the trip was in a yellow taxi, green taxi, or Uber car.
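A minimal sketch of the aggregation behind a "pickups split by cab type" graph. The table name `trips` and the columns `pickup_datetime` and `cab_type_id` follow the post's description, but the exact schema is an assumption; see the GitHub repository for the real one. The Python helper mirrors the SQL in plain code so the shape of the result is easy to see:

```python
# Sketch of counting monthly pickups per cab type, as in the Brooklyn graph.
# Schema details (table "trips", columns pickup_datetime / cab_type_id) are
# assumptions based on the post, not the repository's actual definitions.
import collections
import datetime

def monthly_pickups_by_cab_type(trips):
    """Aggregate (pickup_datetime, cab_type) records into
    {(year, month, cab_type): count} — the data behind a
    pickups-by-cab-type time series."""
    counts = collections.Counter()
    for ts, cab_type in trips:
        counts[(ts.year, ts.month, cab_type)] += 1
    return dict(counts)

# The equivalent aggregation as SQL against the assumed trips table
# (the lat/lng-to-neighborhood mapping happens elsewhere, via PostGIS):
QUERY = """
SELECT date_trunc('month', pickup_datetime) AS month,
       cab_type_id,
       COUNT(*) AS pickups
FROM trips
GROUP BY month, cab_type_id
ORDER BY month, cab_type_id;
"""

trips = [
    (datetime.datetime(2015, 6, 1, 8, 30), "green"),
    (datetime.datetime(2015, 6, 2, 9, 0), "green"),
    (datetime.datetime(2015, 6, 2, 9, 5), "yellow"),
]
print(monthly_pickups_by_cab_type(trips))
# → {(2015, 6, 'green'): 2, (2015, 6, 'yellow'): 1}
```

Because Uber trips carry pickup information only, a query like this works across all three cab types, whereas any drop-off analysis has to exclude the Uber rows.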
Borough Trends, and the Rise of Uber
The introduction of the green boro taxi program in August 2013 dramatically increased the amount of taxi activity in the outer boroughs. Here’s a graph of taxi pickups in Brooklyn, the most populous borough, split by cab type:
Once boro taxis appeared on the scene, though, green taxis quickly overtook yellow ones: as of June 2015, green taxis accounted for 70% of Brooklyn’s 850,000 monthly taxi pickups, while yellow taxi pickups in Brooklyn had fallen back to their 2009 rate. Yellow taxis still account for more drop offs in Brooklyn, since many people continue to take taxis from Manhattan to Brooklyn, but even in drop offs, the green taxis are closing the gap.
Take a slice of cake and cut it in two. Eat one half, and let a friend scoff the other. Your blood-sugar levels will both spike, but to different degrees depending on your genes, the bacteria in your gut, what you recently ate, how recently or intensely you exercised, and more. The spikes, formally known as “postprandial glycemic responses” or PPGR, are hard to forecast since two people might react very differently to exactly the same food.
But Eran Elinav and Eran Segal from the Weizmann Institute of Science have developed a way of embracing that variability. By comprehensively monitoring the blood sugar, diets, and other traits of 800 people, they built an algorithm that can accurately predict how a person's blood-sugar levels will spike after eating any given meal.
They also used these personalized predictions to develop tailored dietary plans for keeping blood sugar in check. These plans sometimes included unconventional items like chocolate and ice-cream, and were so counter-intuitive that they baffled both the participants and dieticians involved in the study. But they seemed to work when assessed in a clinical trial, and they hint at a future when individuals will get personalized dietary recommendations, rather than hewing to universal guidelines.
English speakers know that their language is odd. So do people saddled with learning it non-natively. The oddity that we all perceive most readily is its spelling, which is indeed a nightmare. In countries where English isn’t spoken, there is no such thing as a ‘spelling bee’ competition. For a normal language, spelling at least pretends a basic correspondence to the way people pronounce the words. But English is not normal.
Spelling is a matter of writing, of course, whereas language is fundamentally about speaking. Speaking came long before writing, we speak much more, and all but a couple of hundred of the world’s thousands of languages are rarely or never written. Yet even in its spoken form, English is weird. It’s weird in ways that are easy to miss, especially since Anglophones in the United States and Britain are not exactly rabid to learn other languages. But our monolingual tendency leaves us like the proverbial fish not knowing that it is wet. Our language feels ‘normal’ only until you get a sense of what normal really is.
There is no other language, for example, that is close enough to English that we can get about half of what people are saying without training and the rest with only modest effort. German and Dutch are like that, as are Spanish and Portuguese, or Thai and Lao.
Math as both profession and course of study can be a hard sell, something even Don Draper might have trouble pitching. The field unites numbers, theories, and ideas that, yes, can be physically represented but remain intangible. Math is a language unto itself that for some might as well be Latin or Klingon. Even its rare turns in popular culture—A Beautiful Mind, Proof, and The Big Bang Theory come to mind—typically depict brilliant but troubled and/or socially handicapped thinkers more absorbed by theory than reality.
To Richard Brown, however, math can be as beautiful as a ray of morning sunlight cast upon an orchid's petals, as lyrical as a Beethoven symphony. “Math is not about the numbers,” says Brown, director of undergraduate studies in the Department of Mathematics at Johns Hopkins University. “It's the ideas behind the numbers. Yes, you can say it's built on a rigid set of rules, but out of that comes an infinite amount of creativity.” And, he adds, beauty.
UC San Francisco researchers have discovered that even brainless single-celled yeast have “sensory biases” that can be hacked by a carefully engineered illusion — a finding that could be used to develop new approaches to fighting diseases such as cancer. In the new study, published online Thursday November 19 in Science Express, Wendell Lim, PhD, the study’s senior author, and his team discovered that yeast cells falsely perceive a pattern of osmotic stress (applied via potassium chloride) that alternates at eight-minute intervals as massive, continuously increasing stress. In response, the microbes over-respond and kill themselves. (In their natural environment, salt stress normally increases gradually.) The results, Lim says, suggest a whole new way of looking at the perceptual abilities of simple cells, and this power of illusion could even be used to develop new approaches to fighting cancer and other diseases.
“Our results may also be relevant for cellular signaling in disease, as mutations affecting cellular signaling are common in cancer, autoimmune disease, and diabetes,” the researchers conclude in the paper. “These mutations may rewire the native network, and thus could modify its activation and adaptation dynamics. Such network rewiring in disease may lead to changes that can be most clearly revealed by simulation with oscillatory inputs or other ‘non-natural’ patterns.” “The changes in network response behaviors could be exploited for diagnosis and functional profiling of disease cells, or potentially taken advantage of as an Achilles’ heel to selectively target cells bearing the diseased network.”
As Of Men and War opens, a trumpet moans and won’t resolve into taps, and a van traps men who let it bounce them. They rarely come to town, they say. They are quick to road rage. They wear sunglasses. Later, one man tells his wife, “If we never had to leave the house, I’d be all right, but it’s going outside I can’t handle too much.” The French documentary (released in many U.S. cities this week after playing at theaters in France and at international festivals) follows veterans of Iraq and Afghanistan as they undergo trauma therapy at a residential center called Pathway Home, founded by therapist Fred Gusman. These characters are archetypal—“almost mythological,” says Isidore Bethel ’11, the head editor for the film and one of its associate producers: “These men could have served in any war.” Their transposition into an older pantheon of suffering is nimble. The contemporary term “PTSD” is used exactly twice, once by a vet guilt-tripping a too-persistent caller and then by another vet’s wife. Otherwise, no character mentions diagnoses or pills, only men and death and their closeness to it.
Of Men and War is the second film in Laurent Bécue-Renard’s series A Genealogy of Wrath. As a young man, the director spent several months in besieged Sarajevo for “purely romantic” reasons. He found the city’s high pitch catching and vaguely familiar. There, while writing news dispatches and short stories, he met a therapist who worked with war widows. He focused on three of these women for his first documentary, War-Wearied (2003). It dawned on him that a personal “quest” had drawn him to his project. His grandfathers served in the First World War, which killed some 1.7 million French, about 5 percent of the country’s population. “All of us who grew up in the second half of the twentieth century inherited the legacy of the wars in the first part of the twentieth century,” Bécue-Renard said. “It has been shaping the psyches of the families we belong to.”
……… Alone together a moment on the twenty-second anniversary ……… of their wedding he clasped her as she stood ……… at the sink, pressing into her backside, rubbing his cheek ……… against the stubble of her skull. He gave her a ring ……… of pink tourmaline with nine small diamonds around it. ……… She put it on her finger and immediately named it Please Don’t Die. ……… They kissed and Jane whispered, “Timor mortis conturbat me.”
On the eve of the 100th anniversary of the publication of General Relativity by Albert Einstein, Sean Carroll asks, “Einstein's legacy: if spacetime is dynamical rather than absolute, what else about the universe might be flexible?” at PBS Newshour:
Nicolaus Copernicus is famous for having suggested that the Earth moves around the sun, rather than the other way around. That’s a big deal, as it displaces the Earth from its presumed position at the center of the universe. But it’s easy for us to forget something equally amazing: the idea that the Earth can actually move at all. If anything seems like a solid foundation, it’s the Earth itself. But in our post-Copernican world, we know better.
Albert Einstein, with his general theory of relativity, took this conceptual revolution one step forward. Not only is the Earth not a fixed fulcrum around which the rest of the universe revolves, space and time themselves are not fixed and unchanging. In Einstein’s universe, space and time are absorbed into a single, four-dimensional “spacetime,” and spacetime is not solid. It twists and turns and bends in response to the motion of matter and energy. We perceive that stretching and distortion of the fabric of spacetime as the force of gravity.
The idea that space and time themselves are not immutable, but are dynamical quantities that can evolve through the history of the universe, is one of Einstein’s most dramatic legacies. It was so profound that Einstein himself had trouble accepting all the implications of the idea. When he investigated the universe as a whole in general relativity, he found that it should be expanding or contracting, not staying at a fixed size. That went contrary to his intuition, as well as to what astronomers of the time actually thought the universe was doing. When Edwin Hubble discovered the expansion of the universe in the 1920s, Einstein realized that he had missed the opportunity to make one of the great predictions in the history of science.
The longer I looked at the painting the more I was drawn to the dialogue taking place, sotto voce, between the over-coloured count and his shadowy double hung high on the wall above him – a faintly preposterous rehash of the mirror in Las Meninas, where king and queen make their necessary appearance. The dialogue in Goya – the shadow play, the hovering between repetition and caricature – seemed to me to drain both parties (I presumed that the figure on the wall was an ancestor, or maybe the monarch himself) of reality. The second man staring at me – again, a version of a great moment in Las Meninas, where Velázquez in the background fixes his royal sitters with a predatory gaze – seemed to peer from the picture with an expression compounded of alarm, disbelief and sheer uncomfortable consciousness of his place in a game of looking. Looking and being looked at and thereby ‘brought to life’. He’d be damned if he’d occupy the place he’d been allotted. I found myself staring back at the painting in much the same frame of mind. The more I responded to Floridablanca’s local (stunning) reality effects – the silver shimmer on the count’s sash, the light through the glass on the clock face, the spectacles clutched in his fingers, the Zurbarán notebook glowing on the floor – the more it seemed to me they didn’t matter. What mattered – what made the painting Goya’s – was the pervasive unreality of the set-up, swallowing the world of objects and persons no sooner than it conjured them up.
I realise that I haven’t put my finger on what produced the feeling of unreality, and I’m not sure I can. I know there are dangers in trusting the feeling at all. Anyone looking at Goya’s portraits can’t avoid seeing them against the background of the Caprichos and Black Paintings and the unbearable private albums, some drawn, some etched and aquatinted, dwelling on torment and degeneracy.
Fourteen years after September 11, the reality-concealing rhetoric of Westernism participates in a race to extremes with its ideological twin, in an escalated dialectic of bombing from the air and slaughter on the ground. It grows more aggressive in proportion to the spread of the non-West’s chaos to the West, and also blends faster into a white supremacist hatred of immigrants, refugees, and Muslims (and, often, those who just “look” Muslim). Even more menacingly, it postpones the moment of self-reckoning and course-correction among Euro-American elites who seem to have led us, a century after the First World War, into another uncontrollable and extensive conflagration.
Among the more polished examples of their intellectual rearguardism last week was a piece in the Financial Times by the paper’s foreign-affairs columnist, Philip Stephens, titled “Paris attacks must shake Europe’s complacency. The idea that the west should shoulder blame rests on a corrosive moral relativism.”
It should be said that the Financial Times, the preferred newspaper of the Anglo-American intelligentsia as well as Davos Man and his epigones, keeps a fastidious distance, editorially, from the foam-at-the-mouth bellicosity of its direct competitor, the Wall Street Journal (whose op-ed pages often seem to be elaborating on its owner’s demented tweets).