Tuesday, February 28, 2017
GROWING UP POOR, WITH TRUMP ON TV
Amanda Rea in Literary Hub:
A lot of things didn’t reach the remote corner of Colorado where I grew up. Hummus, for instance, and artichokes, mailmen, punk rock, paved roads. This was the mid 80s, so the internet wouldn’t exist for another few years, and our single TV channel played, in my memory, a near-continuous loop of The Cosby Show, broken now and then by an old episode of I Love Lucy. Airplanes passed over us, leaving white skid marks across the sky. Any passenger looking down would’ve seen nothing, a flat beige expanse where the Rocky Mountains petered out.
And yet, even there at the edge of a ghosted mining town, where jackrabbits outnumbered humans ten-to-one, we had heard of Donald Trump. His was a household name, synonymous with wealth and arrogance and a place called New York City, which might as well have been a settlement on the moon for all it meant to us. Between Cosby reruns, Trump was on the news, wearing a suit, talking about his money. He was young then, early forties, and though his feathered ducktail hadn’t yet turned lurid yellow, he had the same petulant mouth and smug manner we know so well today. Newscasters fawned over him. They said he had the “Midas touch”—his father had given him a fortune, and he’d turned it into more fortune.
My brother and I rolled our eyes. Nobody had to tell us Trump was an asshole, though our parents probably muttered something to that effect. We were no big fans of rich people. Our family had always been poor—generations of miners, ranchers, farmers, and oil hands.
Does Dark Energy Mean We're Losing Information About The Universe?
Ethan Siegel in Forbes:
Perhaps the biggest surprise of all about the Universe came at the very end of the 20th century: the discovery of dark energy and the accelerated expansion of the Universe. Rather than being pulled towards us gravitationally, the most distant galaxies in the Universe are speeding away from us at ever faster-and-faster speeds, destined to disappear from our view. But does that create its own information paradox? Rob Hansen wants to know, and inquires:
The universe's expansion means our visible horizon is retreating; things far away are vanishing continuously. (Albeit slowly, right now.) This would seem to imply we are losing information about the universe. So why is it that the idea of losing information behind a black hole's event horizon is so controversial, if we're constantly losing information to another horizon?
There's a lot to unpack here, so let's start with the accelerated expansion of the Universe.
The case for moral disengagement from politics in the age of Trump
Anis Shivani in Salon:
There continues to be a gross underestimation, even among politically aware liberals, of what we are really up against, and how to counter it. Increasingly, our fellow citizens are resorting to the concepts of fascism to describe the current situation, but this is not necessarily followed by any cogent reflection on what the political subject under fascism needs to do. Ordinary liberal prescriptions have no chance of success under a regime that has moved into an overt fascist mode; moreover, the unacknowledged continuities from the recent neoliberal past, which led to the fascist overture in the first place, mar any consistency of thought among intellectuals, activists and ordinary citizens.
The time has come to explore modes of existence that only make sense under a fascist regime, or rather, are the only modes that make sense under fascist conditions. Above all, the question of moral disengagement from any existing political practice must be taken seriously, and this includes so-called “resistance.” Are there things that pass under the activist rubric today that are actually strengthening fascism rather than weakening it? If that is the case, then those activities must undergo severe scrutiny, because it may well be that what seems like activism is actually “passivism,” and vice versa.
Yuval Noah Harari: Nationalism vs. globalism: the new political divide
Video length: 1:00:00
Black Genealogies of Power: Seven Maxims for Resistance in the Trump Years
Dan Berger in AAIHS:
“Power concedes nothing without demand,” argued Frederick Douglass in one of his most cited speeches. “It never did and it never will.” Donald Trump inaugurated his first Black History Month at the White House with a bizarre mention of Douglass that made clear he does not know who Douglass was, what he did, or that the legendary abolitionist died 122 years ago. While Trump’s ignorance is clear, Douglass’s words remain a prescient example of how the black freedom struggle has thought about power. The black freedom struggle knows power intimately, as it has needed to: both the effects of power from above and the experience of power from below. How can it be otherwise? Slavery, colonialism, segregation, policing, and other forms of racism are power in and over the flesh. At the same time, black radicalism has developed its own power through abolitionism, marronage, transnationalism, feminism, labor organizing, fugitivity, and other forms of communal struggle.
Black History Month occasions a return to how black radicalism conceptualizes the issue of power. A diasporic political tradition built over centuries, the black radical tradition resists simplistic notions of what power is and how it operates. It has displayed a concurrent attention to strength on the bottom and to weakness from above. Here I want to complement the efforts of contemporary organizations such as BYP100, the Black Alliance for Just Immigration, the Dream Defenders, the Malcolm X Grassroots Movement, and the Movement for Black Lives by anchoring some of their historical forerunners. To that end, I present seven maxims of power as a useful but by no means comprehensive list for thinking about the revanchist assaults now underway by the Trump administration as well as the historic opposition movements now gathering force nationwide.
Don’t look to the institutions of power to resolve the problems caused by power. “O, let America be America again—/ The land that never has been yet—/ And yet must be—the land where every man [sic] is free,” Langston Hughes offered in his poem “Let America Be America Again.” Hughes’s poem centers on the contradiction between demanding “America be America again” and the recognition that “America never was America to me.” There are no halcyon days to return to, no golden era when American institutions upheld universal, intersectional antiracist policies. They have been, and remain, venues for necessary fights—both to defend existing rights and win new ones. Yet such fights are not calls to return to the past, to “restore faith” in traditional institutions, as we so often hear amidst Trumpist attacks on the media, the judiciary, and other normative branches of liberal democracy. Rather, political battles are most effective when pointing to the world that could be rather than the world that was (but wasn’t really).
More here. (Note: At least one post throughout February will be in honor of Black History Month)
In ‘Exit West,’ Mohsin Hamid Mixes Global Trouble With a Bit of Magic
Michiko Kakutani in The New York Times:
Mohsin Hamid’s dynamic yet lapidary books have all explored the convulsive changes overtaking the world, as tradition and modernity clash headlong, and as refugees — fleeing war or poverty or hopelessness — try to make their way to safer ground. His compelling new novel, “Exit West,” is no exception, recounting the story of the migrants Saeed and Nadia, who leave an unnamed country in the midst of a civil war and journey to Greece, England and eventually the United States in an effort to invent new lives for themselves. The first half of their story is about how war warps everyday life; the second half, a tale of globalization and its discontents. Writing in spare, crystalline prose, Hamid conveys the experience of living in a city under siege with sharp, stabbing immediacy. He shows just how swiftly ordinary life — with all its banal rituals and routines — can morph into the defensive crouch of life in a war zone, with fears of truck bombs and sniper fire and armed soldiers at checkpoints becoming a daily reality, along with constant surveillance from drones and helicopters. He also captures how insidiously violence alters the calculus of daily life: how windows with beautiful views become a liability; how funerals become smaller, more rushed affairs because of fighting in the streets.
The fiercely independent Nadia is feverishly keen to find a way out of the besieged city, and she and her more introspective boyfriend, Saeed, soon find an agent, who, for a fee, promises to supply them with an exit plan. There have been rumors of magical doors that whisk people away to strange and distant lands, and the door that Saeed and Nadia enter transports them to a beach on the Greek island of Mykonos, where hundreds of other migrants are living in tents and lean-tos in a makeshift refugee camp. Later, the couple will try other doorways that take them to other countries, other continents. “It was said in those days,” Hamid writes, “that the passage was both like dying and like being born.”
The returns to societal capital
Dietrich Vollrath over at his website [via Brad DeLong]:
Brad DeLong had a recent post that contained a number of ideas regarding how we view redistribution in a market economy. I picked up on some comments he made towards the end of that post, in which he points out that much of our prosperity comes from a stock of societal capital that we unknowingly rely on every day. And because that societal capital is unseen and uncompensated, we are all in some way overpaid for what we do.
When he says societal capital, I think of it in two broad categories:
- Trust: I think this is much of what DeLong has in mind. We are lucky to be in the “trust” or “cooperate” equilibrium in our repeated game of exchanging goods and services. If you like, call it the “stag hunt” equilibrium Nick Rowe talks about. Regardless, we benefit from the decisions of our ancestors to play this equilibrium, so that it is the default. If you want to say this is due to some institutions, or culture, or pure luck, it doesn’t matter. We’ve found our way to the trust equilibrium, and benefit from that immensely.
- Scale: He doesn’t mention this explicitly, but I think it is as relevant as trust. Scale influences the potential profits from innovations, and so is crucial to growth. Bigger market, more profits, more incentives to innovate. But scale is not the same thing as trust, or institutions, or culture. If you doubt that, ask yourself why no firm is spending millions to get into New Zealand, paragon of free market institutions, but they are falling all over themselves to do business in China. Living in the US, or EU, or China, is to reap the benefits of living with scale.
The heart of DeLong’s point is that neither trust nor scale are things that are owned by any firm or individual. You could say that we inherited them from our ancestors, or you could say these are emergent properties, or you could say that they are designed by the institutions we choose for ourselves. Regardless, trust and scale are “ideas” in the broadest sense, and are inputs into the production process in that trust and scale mean our set of rival inputs (labor, capital) can produce more with them than without.
How is it that scale and trust mean we are overpaid?
Citizenship: A relic of European legal culture?
Dieter Gosewinkel in Eurozine:
A state is “the corporate body of a settled people equipped with sovereign authority”, wrote the influential Austrian constitutional lawyer Georg Jellinek in 1900. The defining characteristics of statehood are accordingly sovereign state power, a titular state people and a delineated state territory. This model of clear demarcations was formulated at the highpoint of the emergence of nation states, when they were at the peak of their legitimacy, and continues to shape international law to this day. National borders and national citizenship define an interior and exterior through legal means and thereby determine inclusion and exclusion. However, the theoretical and ethical bases of this legal construction are beginning to seem increasingly flawed.
Two waves of globalisation have undermined – and continue to undermine – the spatial concept of an economically and politically self-contained state. Worldwide flows of information, economic activity, communication and above all migration contradict conventional understanding of national statehood based on static models, in which the population is tied to one location, cultures are nationally delimited, and borders are only crossed as an exception. Political practice drives these developments forwards. The freedom of movement within the united Europe – the dissolution of borders for communication, goods and travel – has shaped the continent’s de facto existence to such an extent that it determines how leading European politicians imagine Europe ought to be: border checks should no longer be possible because they can no longer be conducted in practice. Praxis creates a theory that, in turn, confirms praxis. The advance of universalist global ethics and the humanitarianism of human rights legitimise a global politics of morality. Against this, the boundaries of traditional nation states seem at best anachronistic and at worst theoretically simplistic and ethically illegitimate.
Tragedy and Philosophy
Richard Marshall interviews Dennis Schmidt in 3:AM Magazine:
3:AM: You have written about the tension between tragedy and philosophy – German philosophy in particular. What is this tension?
DS: I would suggest that this tension is at the very root of the idea of philosophy that we have inherited in the West and that, until recently, has largely gone unchallenged. Over time, this tension was simply set aside as philosophy increasingly came to neglect the claims of art. But when philosophy as we now know it began, Plato took the work of art, especially tragedies (and here Homer is included since Plato does not distinguish tragedy and epic as Aristotle will), as a sort of foil in his efforts to delineate this new way of speaking about the world called “philosophy.” A different stage or theatre was exposed – a theatre of ideas in the mind, not of actors on the stage – language and dialogue were still crucial to this new theatre, but even the residue of theatre that belonged to philosophical dialogue would very soon disappear. The birth of the essay, of the treatise, is coterminous with the essential exclusion of the work of art from philosophy. [As an aside, I would suggest that interviews, such as the ones you conduct, are a gentle way of restoring something of the dialogue character of thinking to philosophy.] Part of the argument that I made in speaking of the German recovery of Greek tragedy as a philosophical problem is that this marks a genuinely new moment in the long history of philosophy and that this recovery of tragedy as a question opens up avenues for philosophy in general that have been closed off since the beginnings of philosophy.
Perhaps the most direct way to characterize this tension is to say that tragedy is the expression of a view of life as defined finally by an insurmountable contradiction (of a law of life at odds with itself), while philosophy will always aim at a sort of overcoming of contradiction (of the law of non-contradiction as the need of truth). There is, of course, more to be said. The form of presentation proper to tragedy is, as Aristotle notes, reliant upon language, meter, plot, spectacle, and stage. Tragedy needs to appeal to emotional life, to a feeling that perhaps cannot be conceptualized. Philosophy, on the other hand, is deeply distrustful of any turn to emotional life and it is equally suspicious of any language that does not abide by the rule of the concept, that is, by the demand for universalizability and consistency. The concept has long remained the mother tongue of philosophy and, at the same time, a tragedy that can be reduced to its concept does not merit the claim of being a work of art.
Monday, February 27, 2017
The Owl of Minerva Problem
by Scott F. Aikin and Robert B. Talisse
Wisdom is a product of experience and reflection. As a consequence, it's often quite a long road to that goal. It's for this reason that the poetic expression, "the Owl of Minerva Flies at Dusk," has its effect. Only at the end of the day, once the work is done and we recline in thought, do the insights of what we ought to have done, what the best option was, and what was wrong about a particular decision become clear. We live forward, but we understand backward. And that can occasion distinctive problems.
In democratic politics, this point about insight is certainly true. And it extends not only to the errors we may make as a country, but also to the errors we make in understanding ourselves and our decision-making. In its current form, much democratic theory is focused on the decision-making and argumentative elements of modern political life. This deliberative democratic movement casts democratic life as that of participating in ongoing discussions, wherein all have a voice, no issue is beyond question, and every decision must be justifiable to all those whom it affects. These are admirable ideals, but we understand the ways we can fail those ideals only in making mistakes, only in witnessing the pathologies to which public reason is prone.
We experience living in a democracy and then we see the particular kinds of challenges and errors to which reasoning together can be prone. Perhaps we should have anticipated the effects of group polarization that seem to define contemporary political discourse, but we understand it all too well now that we live under its conditions. The incurious dogmatism of epistemic closure, the slippery euphemism of Orwellian Newspeak, and the abuses of and visceral reactions to political correctness are all political phenomena that we must see as developments from particular histories, arising within particular social settings. We do not know them a priori.
The Owl of Minerva Problem at first looks like a simple point about the retrospective nature of knowledge: You must first have experience to know, so knowledge must be dependent on (at least some) events of the past. But the Owl of Minerva Problem raises distinctive trouble for our politics, especially when politics is driven by argument and discourse. Here is why: once we have a critical concept, say, of a fallacy, we can deploy it in criticizing arguments. We may use it to correct an interlocutor. But once our interlocutors have that concept, that knowledge changes their behavior. They can use the concept not only to criticize our arguments, but it will change the way they argue, too. Moreover, it will also become another thing about which we argue. And so, when our concepts for describing and evaluating human argumentative behavior are used amidst those humans, they change that behavior. They adopt it, adapt to it. They, because of the vocabulary, are moving targets, and the vocabulary becomes either otiose or abused very quickly.
Consider the use of fallacy vocabulary less as a device for the cool evaluation of arguments, now, but rather as a tool of evasion or attack.
Ted Cruz famously attacked Donald Trump during the primary season for being the kind of person who relies on the ad hominem.
Further, the use of the ‘straw man’ charge to defend against any and all criticism in online argument is so widespread that the strategy has been added to a comic pantheon of argumentative personae.
The point, again, is that the tools we've used to make sense of and evaluate and improve our attempts at rational exchange have been tools of subverting it.
And now we see the same phenomenon with the expression ‘fake news.’ The term originally gained purchase as a way to explain the proliferation of false stories about the 2016 Presidential election in the US: for example, that the Pope endorsed Donald Trump, or that Hillary Clinton was running a child pornography ring in a pizza parlor’s basement. Now, however, the expression ‘fake news’ is used by Donald Trump to disparage claims he holds to be contrary to his interests. And so he says that CNN is fake news, and that the Russian ties to General Flynn are fake news. And so vocabulary we’d used to understand our joint exercise of reason is now part of that exercise, changing and being changed by it.
And so, we may understand ourselves and the work of reasoning together only in retrospect, because the tools we use to make the parts of our practice explicit for endorsement or evaluation themselves become part of the practice and are changed by it. This is both good and bad news. The bad news is that our task of understanding ourselves and having a complete grasp of best theoretical practices is always incomplete and open to abuse by our very terms. But the good news is that those changes made by and to our critical vocabulary occur because we care for reason and wish to live up to its dictates. Even the most egregious fallacy is yet an attempt to lay claim to reason's legitimacy.
Random Triangles and Pillow Problems for Insomniacs
by Jonathan Kujawa
While lying in bed on the night of January 20, 1884, Lewis Carroll conjured up the following puzzle:
Three Points are taken at random on an infinite Plane. Find the chance of their being the vertices of an obtuse-angled Triangle.
That is, since any three points on a sheet of paper can be connected to form a triangle, what's the likelihood that one of the angles is more than ninety degrees if you pick those points at random?
If you only know Lewis Carroll from Alice in Wonderland you may be surprised that his thoughts turned to mathematics. In fact, his day job was as a mathematician at Christ Church college in Oxford, under the name Charles Dodgson. In addition to his more famous works of fiction, he was known for writing several mathematical texts. When teaching linear algebra I always take a day to talk about Dodgson Condensation.
One of the books he wrote is Curiosa Mathematica, Part II: Pillow Problems Thought Out During Wakeful Hours. It is a compendium of 72 math problems Dodgson pondered and solved while waiting to fall asleep. Helpfully, he also gives the date he dreamt up each problem and the solution he devised. Go here if you'd like to take a look at the other 71 problems.
The Obtuse Triangle Problem is No. 58. Before we take a look at his solution we should step back a minute. What does it mean to pick three points at random? Like most politicians' speeches, it sounds good but falls apart under the slightest scrutiny. Are we to pick x and y coordinates for each of these points? Alternatively, we could pick an angle between 0 and 360 degrees and a distance and, starting at the origin, take the point at that angle and distance. Or, since all we care about is the resulting triangle, maybe we should randomly pick an angle between 0 and 180 degrees, pick two side lengths at random, and take the triangle formed by drawing two sides of those lengths with that angle between them. I'm sure we could come up with a dozen different ways to randomly pick a triangle.
If a random triangle was a random triangle, and if the world was fair and just, then the odds of an obtuse triangle would be the same regardless of our method. Sadly, the world is neither fair nor just. It will matter how we choose to pick a random triangle.
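That model-dependence is easy to see numerically. Here is a quick Monte Carlo sketch (my own illustration in Python, not from the column) comparing two perfectly reasonable models: three points uniform in a unit square versus three points with Gaussian coordinates.

```python
import random

random.seed(0)

def is_obtuse(p, q, r):
    # A triangle is obtuse iff the two edge vectors at some vertex
    # have a negative dot product (the angle there exceeds 90 degrees).
    for a, b, c in ((p, q, r), (q, r, p), (r, p, q)):
        v = (b[0] - a[0], b[1] - a[1])
        w = (c[0] - a[0], c[1] - a[1])
        if v[0] * w[0] + v[1] * w[1] < 0:
            return True
    return False

def obtuse_fraction(sample_point, n=200_000):
    hits = sum(is_obtuse(sample_point(), sample_point(), sample_point())
               for _ in range(n))
    return hits / n

# Model A: each point uniform in the unit square.
square = obtuse_fraction(lambda: (random.random(), random.random()))

# Model B: each coordinate an independent standard normal.
gauss = obtuse_fraction(lambda: (random.gauss(0, 1), random.gauss(0, 1)))

print(square, gauss)
```

With a couple hundred thousand trials the square model settles near 0.725 while the Gaussian model settles near 3/4: two innocent-looking notions of "random" really do disagree.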
Mathematicians know from centuries of painful experience that the slightest sloppiness at the beginning can lead to huge problems down the road. Before we worry about picking points on a two-dimensional plane, let's look at what should be an easier, one-dimensional problem:
If you break a stick of length one in two places, what is the chance that the three pieces can be used to make a triangle?
Certainly if you break it so that there is one really long piece and two stubby bits, then you won't be able to make a triangle. But you can also see that if the pieces are roughly equal in length then you'll be able to make a triangle with them. So what is the probability you'll be able to make a triangle?
Aggravatingly the answer is: it depends. The good news is that it isn't hard to tell if you can make a triangle or not. Everybody knows that going straight along one side will always be shorter than taking a detour around the other two sides. That is, if the three pieces have lengths a, b, and c, then in a triangle you have to have a+b > c, a+c > b, and b+c > a. It turns out that the converse is also true: if you have lengths a, b, and c which satisfy these three inequalities, then you can form a triangle. Since the three pieces sum to one, this means you can form a triangle if and only if all three side lengths are less than 1/2.
Say you break the stick at x and y, where these are numbers between 0 and 1. Then the side lengths are x, y-x, and 1-y. Plotting the possible pairs of x's and y's where all three side lengths are less than 1/2 gives the green region:
Calculating areas we see that the chance of forming a triangle is 1/4.
However, that is not the only way of breaking a stick into three. You could instead pick a point at random and break the stick there. Then randomly pick one of the two pieces and break it at some random spot. If anything, this is the more realistic scenario if you were to break a stick with your bare hands. In any case, a similar analysis shows that breaking a stick into three pieces in this way allows you to form a triangle 1/6 of the time. I find it horrifying that two reasonable approaches led to dramatically different outcomes. These sorts of unintuitive results convinced me that probability was not my natural mathematical habitat.
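The 1/4 answer for the two-uniform-cuts model is easy to confirm numerically. A short simulation sketch (my own, in Python):

```python
import random

random.seed(1)

N = 200_000
hits = 0
for _ in range(N):
    # Two cut points chosen independently and uniformly on a unit stick.
    x, y = sorted((random.random(), random.random()))
    pieces = (x, y - x, 1 - y)
    # The pieces form a triangle iff every piece is shorter than 1/2
    # (equivalent to the triangle inequality, since the pieces sum to 1).
    if max(pieces) < 0.5:
        hits += 1

frac = hits / N
print(frac)
```

The fraction comes out very close to 0.25, matching the area computation above.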
This has shades of the famous Monty Hall problem: Imagine you are a game show contestant. There are three doors but only one has a prize behind it. You start by picking one door at random. Since the host knows where the prize is, she opens one of the other two doors and shows you that it is empty. The host then gives you the opportunity to either stick with your initial choice or switch to the other closed door. Should you stand pat, switch, or does it not matter? It isn't obvious at first, but it turns out that there is a 1/3 chance that your door has the prize and a 2/3 chance the other closed door has the prize. You should switch!
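The switching advantage is another claim that a simulation makes vivid. A minimal sketch (my own, in Python):

```python
import random

random.seed(2)

N = 100_000
stay_wins = switch_wins = 0
for _ in range(N):
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither your pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    # Switching means taking the one remaining closed door.
    switched = next(d for d in doors if d != pick and d != opened)
    stay_wins += (pick == prize)
    switch_wins += (switched == prize)

stay_rate = stay_wins / N
switch_rate = switch_wins / N
print(stay_rate, switch_rate)
```

Staying wins about a third of the time and switching about two thirds, just as the argument predicts.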
In the case of the pillow problem, Dodgson calculated that the probability of having an obtuse triangle is π/(8π/3 − 2√3) ≈ 0.64.
That is, using Dodgson's method of picking three points at random leads to an obtuse triangle approximately 2/3 of the time. On the other hand, Richard Guy showed that several other reasonable methods lead to an obtuse triangle 3/4 of the time!
I learned about the Obtuse Triangle Problem during a talk by Jason Cantarella at the Joint Meetings in January. The Joint Meetings is the jumbo annual meeting of mathematicians organized by the AMS, MAA, ASL, AWM, NAM, and SIAM. Dr. Cantarella is a mathematician at the University of Georgia and his research is at the interface of geometry with other subjects. We came across him more than a year ago here at 3QD when we talked about his work on knots.
In this case Dr. Cantarella was talking about his joint work with Tom Needham, Clayton Shonkwiler, and Gavin Stewart. You can read their paper here. In it they take a different approach to the Obtuse Triangle Problem. Namely, they instead consider the question of randomly picking three side lengths which sum to 2. By doing an elementary change of variables they show that the choice of three side lengths which forms a triangle is the same as choosing an x, y, and z which satisfy x² + y² + z² = 1. That is, they show that picking the three side lengths of a triangle is the same as choosing a point on the sphere.
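One way to realize that correspondence concretely (the exact map is my assumption here: with semiperimeter s = 1, set s − a = x², so a = 1 − x², and likewise for b and c) can be checked in a few lines of Python:

```python
import math
import random

random.seed(3)

def random_sphere_point():
    # Uniform point on the unit sphere via normalized Gaussians.
    while True:
        x, y, z = (random.gauss(0, 1) for _ in range(3))
        r = math.sqrt(x * x + y * y + z * z)
        if r > 1e-12:
            return x / r, y / r, z / r

N = 50_000
valid = 0
obtuse = 0
for _ in range(N):
    x, y, z = random_sphere_point()
    # Assumed change of variables: a = 1 - x^2, b = 1 - y^2, c = 1 - z^2.
    a, b, c = 1 - x * x, 1 - y * y, 1 - z * z
    # The perimeter is automatically 2 because x^2 + y^2 + z^2 = 1.
    assert abs(a + b + c - 2) < 1e-9
    # Each side is at most 1 = semiperimeter, so the triangle
    # inequality holds for every sampled point.
    if max(a, b, c) < 1:
        valid += 1
    # Obtuse iff the squared longest side exceeds the sum of the other
    # two squares (law of cosines).
    s1, s2, s3 = sorted((a * a, b * b, c * c))
    if s3 > s1 + s2:
        obtuse += 1

frac_valid = valid / N
frac_obtuse = obtuse / N
print(frac_valid, frac_obtuse)
```

Every sampled point yields a genuine triangle of perimeter 2, which is the payoff of the change of variables: sampling "triangle space" is reduced to sampling the sphere.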
This is great! Instead of picking three points in the plane, we are now randomly picking a single point on the sphere. This eliminates many of the pesky issues of ambiguity. Even better, a sphere has a natural geometry we can exploit. For example, any two points are connected by a shortest path, often called a great circle. You see these when you take intercontinental flights. The long, arcing path of your flight on the map is actually the shortest route once you take into account the curve of the Earth. You can use this page to draw great circles of your own.
In their formulation, the Obtuse Triangle Problem becomes a question of identifying the points on the sphere which correspond to triangles with an obtuse angle and then computing the fraction of the sphere covered by these points. Cantarella, Needham, Shonkwiler, and Stewart do exactly this. If you choose a triangle at random using their method the chance of an obtuse angle is
There are a couple of reasons their approach is particularly pleasing. First, they take into account the can of worms discussed earlier. Second, we can make use of the geometry of the sphere. Since every point on the sphere corresponds to a triangle, journeys around the sphere are journeys through "triangle space". In particular, great circles give us a way to talk about shortest paths between any two triangles.
Even better, their approach generalizes to other polygons. A triangle is a polygon with three sides. To be slightly fancy, the authors are identifying each triangle with a point in the Grassmannian space Gr(2,3). In a similar way, they can match polygons with n sides with points in the space Gr(2,n). Like the sphere, these spaces have natural ways of measuring distance. This lets the authors study "hexagon space" and "octagon space". You can do cool things like take the shortest path from the letter A to the letter Z in "decagon space":
Or pick a random 20,000 sided polygon:
You can find these animations and more at Clayton Shonkwiler's webpage. He regularly creates cool math animations and I highly recommend you follow him on Twitter.
 A nifty method for computing determinants of matrices which, sadly, is hardly ever discussed nowadays. It plays a key role in the 1990s proof of the Alternating Sign Conjecture.
 Also, the probability of an obtuse triangle should be scale invariant. That is, it shouldn't matter if our x and y coordinates are in inches, miles, or lightyears. Or if we turn our head at a 37-degree angle, or look at the plane from above or from below. This is another can of worms we won't open (for now).
Image from Bill the Lizard at mathoverflow.net. The symmetry of the picture comes from the fact that either x or y could be the smaller number. We did the case when x is smaller than (i.e., to the left of) y. If y is smaller, then the side lengths are y, x-y, and 1-x.
 To read more about this problem, check out Martin Gardner's column on the topic. He also discusses the problem of picking a random chord on a circle in which three reasonable methods yield three different outcomes!
 We came across Richard Guy's Law of Small Numbers here at 3QD a few months ago.
 To make the math nicer they assume their stick has length 2. This doesn't have any effect on the outcome.
 Unless, of course, they are exactly antipodal.
“The woolly mammoth vanished from the Earth 4,000 years ago,
but now scientists say they are on the brink of resurrecting the ancient
beast in a revised form, through an ambitious feat of genetic engineering.”
If the wooly mammoth becomes the new Lazarus
reborn from an ice sarcophagus
does it mean that we may all return one day
to beat our breasts at the injustice of death
but also to rejoice in miracles? It’s an
honest question, we’ve been asking it
for generations, yet it’s never been answered
but in myth, the story that elevates ignorance
to poetry, that blazes red trails in pigment,
that ends up only as sublime music to our ears,
elusive, illusory as the apparition of tomorrow
But we still have this day
It seems never to end
Reality Check: Wine, Subjectivism and the Fate of Civilization
by Dwight Furrow
I must confess to having once been an olfactory oaf. In my early days as a wine lover, I would plunge my nose into a glass of Cabernet, sniffing about for a hint of cassis or eucalyptus only to discover a blast of alcohol thwarting my ascension to the aroma heaven promised in the tasting notes. A sense of missed opportunity was especially acute when the wine was described as "sexy, flamboyant, with a bounteous body." Disappointed but undaunted, I would hurry off to wine tastings hoping the reflected brilliance of a wine expert might inspire epithelial fitness. It was small comfort when the expert would try to soften my disappointment with the banality, "it's all subjective anyway." So one evening, while receiving instruction in the finer points of wine tasting from a charming but newly minted sommelier, I let frustration get the better of me and blurted "Well, if it's all subjective, what the hell are we doing here? Is it just your personal opinion that there is cassis in the cab or is it really there? We all have opinions. If you're an expert you should be giving us your knowledge, not your opinion!" Someone muttered something about "chill out" and it was quickly decided that my glass needed refilling. But the point stands. The idea of expertise involves the skillful apprehension of facts. If there is no fact about aromas of cassis in that cab, there is no expertise at discerning it.
These conversations over a glass of wine are more pleasant (because of the wine) but structurally similar to the semester-long task of getting my college students to realize that moral beliefs are not arbitrary emendations of their lightly held personal attitudes but are rooted in our need to survive and flourish as social beings. Yet even after weeks of listening to me going on about the sources of value, they still write term papers confidently asserting that with regard to "right" and "wrong", eh, who knows?
Subjectivism, the view that a belief is made true by my subjective attitude towards it, has long been the default belief of freshman students and arbiters of taste. Unfortunately, this tendency to treat it as the wisdom of the ages has escaped the confines of the wine bar and classroom into the larger society. Buoyed by the cheers of multitudes, our fabulist-in-chief routinely finds his "own facts" circulating in what seems to be an otherwise empty mind. And this is no longer mere fodder for a seminar debate.
Accompanying this idea that we are entitled to our own facts is the belief that reality can be invented through sheer force of the will. Authoritarian leaders have always sustained their power by re-defining reality such that complex problems are amenable to simple, authoritarian solutions. The idea of the strongman who can act and succeed independently of true belief, the confidence that conviction and will are sufficient to solve problems, is the logical extension of subjectivism, and the U.S. now has its very own Combover Caligula to test the theory.
This drama takes place against the background of majorities believing that while scientists keep our planes aloft, our computers humming, and help the enormously complex human body fight disease, they can't make simple measurements of CO2 concentration and temperature gradients. Climate change denial is the ultimate fabulation, the most extreme case of simply ignoring an inconvenient reality because you would rather it were different.
The common denominator linking all these fabulations is the belief that reality is whatever the mind says it is. Reality poses no independent standard to which our thoughts and attitudes must conform. Unfortunately, this idea has a rich and influential philosophical pedigree. The monumental presence of Immanuel Kant looms over the modern world, for it was Kant who argued that reality as-it-is-in-itself can never be known. According to Kant, the structure and organization that reality appears to have—constituted by time, space, and causation—is a product of the mind imposing order on reality according to principles and categories that enable these "appearances" to make sense to us.
Before my colleagues in philosophy go apoplectic let me clarify that I am not suggesting a logical or causal connection between the sophisticated arguments of Kant and the puerile subjectivism discussed above. Kant was no subjectivist because he argued that the rules that govern perception and reason are universally shared among rational beings (among which he includes, perhaps mistakenly, persons). Furthermore, his arguments were based on the quite plausible notion that any claim about reality as-it-is-in-itself will be dependent on how the mind gives structure and meaning to that claim, and thus all reference to a mind-independent reality is pure speculation. It was Kant's laudable dislike of unsupported claims and his awareness of the limits of human knowledge that led him to be cautious about claims to know reality. The traditional notion of "the real" is that which is independent of human experience, something unsullied by the distortions imposed by human thought. Kant was right that the very attempt to think such a thing would inevitably bind it to human thought.
Nevertheless, for the rash and incautious, it's a very short step from the view that a mind-independent reality is unknowable to the claim that therefore we can just forget about reality as a constraint on our ideas altogether. Thus, I wonder if the "spirit of the age" has finally run roughshod over the careful, rigorous skepticism of Kant by demonstrating the ultimate absurdity of thought disconnected from reality. At this point in history we urgently need a dose of reality. An awareness of the limits of knowledge and the impenetrability of the real is not sufficient; we need an awareness of reality pushing back, penetrating our insights and offering stubborn facts to which we must attend. After all, Kant does require that we bite a very large bullet. He poses the question whether we should believe him or our lyin' eyes which tell us that reality is right there in front of us. We are all intuitive realists; only in a philosophy seminar would we think otherwise.
Despite its alleged universality, Kant's view that all of this is just an elaborate construction of the mind seems to invite elaborate reconstructions based on all manner of preferences and prejudices, and so I fear that if we are to get beyond fabulation we must get beyond Kant. And that means showing that we need not bite that bullet that Kant thought necessary.
However, the alternative to Kant's transcendental idealism seems equally absurd. For the most straightforward way of rejecting subjectivism is to take on board the kind of objectivity to which the natural sciences aspire—what is real is whatever the best scientific theories say is real. But that leaves us with an arid reality evacuated of all meaning and value, since the mindless, meaningless physical particles and fields of force discovered by physics seem to lack any essential reference to what matters to us. Appeals to science have little to say about what we ought to care about, let alone the aesthetics of wine, moral norms or anything else in life that depends on judgment. We seem to be stranded on one side or the other of an abyss formed by the mighty pillars of objectivity and subjectivity, with no way to traverse the chasm. Or is there a way across?
Kant is arguing that we can't prove the common sense view that we are in touch with an independent reality and so intellectual rigor demands we be skeptical. This puts the pursuit of knowledge in the driver's seat but leaves us bereft of the very knowledge we seek. Yet, before we can prove anything we must first meet the causal force of reality head on. As we move about the world it presses in on us, resisting our actions, disrupting intentions, penetrating mind and body, a piercing, gale-force wind that requires careful tacking to navigate.
Kant wants to say this causal force is itself something the mind imposes on itself. But that is only remotely plausible after stepping back in a moment of abstract doubt and asking what we can really know. It's not addressing the human experience of a reality that buffets, ingresses, rubs, wounds, attracts and fascinates. What Kant misses is that our fundamental transaction with the world is not via knowledge. It is via feeling, emotion, sensibility, attraction and repulsion, in other words, aesthetics. Reality is felt before it is known—I suffer and love, therefore I am. Skepticism gets no foothold here.
How does this acknowledgement of the felt influence of causal forces help avoid subjectivism? That would be a very long tale, but I will try to provide a sketch. The causal lines of force that resist our aims but also enable all human creativity are indicators of something deep and consequential. For they emerge out of potentialities, latent forces, dispositions in things that when activated by the presence of other things, including human beings, have a direction. I call these directed lines of force telic norms, patterns of probabilities that prescribe how reality might develop under certain conditions. These telic norms are attractors for feeling to which we respond with pleasure or aversion. There is value in the world, for without it I doubt that a frog could catch a fly.
Whatever positive influence we have over reality will be realized by responding to telic norms under conditions appropriate to their realization—otherwise chaos ensues. Objectivity is achieved by accurately tracking the lines of force that emerge from a given set of conditions and that provide an anchor for telic norms. Whatever the future holds, it will emerge from these lines of causal influence and our ability to absorb their direction and make use of them. The first contact with them is not the mind that knows but the sensibility that feels drawn or repelled. When the mind spins away from these lines of force we have subjectivity and error.
Which brings me back to wine (you just knew I would return to wine). Winemaking is an art form in which the quality of the final product depends on nature and the recognition that nature has its own powers and dispositions that we can only sometimes, and within limits, influence. With each vintage nature imposes its "will". Good winemakers accurately track the telic norms that emerge first from the grapes and later the wine in its various stages of development in light of their sensibility and intentions regarding the final product.
The problem of objectivity is not that critics or consumers disagree about the quality of particular wines. Of course they disagree. We all have different preferences and histories and convergence of judgment would not be desirable in any case. What matters for objectivity is that critics and others who taste aesthetically track the potential of a wine, taste its ability to provide satisfaction to various people with differing sensibilities. Aesthetic tasting is not a matter of asserting subjective likes or dislikes but of identifying potentiality, the latent forces and indwelling capacities of a wine to produce pleasure.
Of course wine quality (or beauty if you prefer) is subjective to a degree but it is not merely subjective. It isn't something we project or impose onto an object but is a response to something in the object being judged, an appreciation of its power to affect us which is more felt than apprehended.
Kant was alleged to have had a taste for the grape. Had he tasted aesthetically might history have developed differently?
For more on the aesthetics of food and wine visit Edible Arts or consult American Foodie: Taste, Art and the Cultural Revolution.
Politics Trump Healthcare Information: News Coverage of the Affordable Care Act
by Jalees Rehman
The Affordable Care Act, also known as the "Patient Protection and Affordable Care Act", "Obamacare" or the ACA, is a comprehensive healthcare reform law enacted in March 2010 which profoundly changed healthcare in the United States. This reform allowed millions of previously uninsured Americans to gain health insurance by establishing several new measures, including the expansion of the federal Medicaid health insurance coverage program, introducing the rule that patients with pre-existing illnesses could no longer be rejected or overcharged by health insurance companies, and by allowing dependents to remain on their parents' health insurance plan until the age of 26. The widespread increase in health insurance coverage – especially for vulnerable Americans who were unemployed, underemployed or worked for employers that did not provide health insurance benefits – was also accompanied by new regulations targeting the healthcare system itself. Healthcare providers and hospitals were provided with financial incentives to introduce electronic medical records and healthcare quality metrics.
As someone who grew up in Germany where health insurance coverage is guaranteed for everyone, I assumed that over time, the vast majority of Americans would appreciate the benefits of universal coverage. One no longer has to fear financial bankruptcy as a consequence of a major illness, and government-backed health insurance also provides for peace of mind when changing jobs. Instead of accepting employment primarily because it offers health benefits, one can instead choose a job based on the nature of the work. But I was surprised to see the profound antipathy towards this new law, especially among Americans who identified themselves as conservatives or Republicans, even if they were potential beneficiaries of the reform. Was the hatred of progressive-liberal views, the Democrats and President Obama who had passed the ACA so intense among Republicans that they were willing to relinquish the benefits of universal health coverage for the sake of their political ideology? Or were they simply not aware of the actual content of the law and opposed it purely for political reasons?
A recent study published by a team of researchers led by Sarah Gollust at the University of Minnesota may shed some light on this question. Gollust and her colleagues analyzed 1,569 local evening television news stories related to the ACA that were aired in the United States during the early months of the health care reform's rollout (between October 1, 2013, and April 19, 2014). They focused on analyzing local television news broadcasts because these continue to constitute the primary source of news for Americans, especially for those aged 50 and older. A Pew survey recently showed that 57% of all U.S. adults rely on television for their news, and among this group, local TV news (46%) is a more common source than cable news (31%) or network news (30%).
Gollust and colleagues found that 55% of the news stories either focused on the politics of the ACA such as political disagreements over its implementation (26.5%) or combined information regarding its politics with information on how it would affect healthcare insurance options (28.6%). Only 45% of the news stories focused exclusively on the healthcare insurance options provided by the law. The politics-focused news stories were also more likely to refer to the law as "Obamacare" whereas healthcare insurance focused news segments used the official name "Affordable Care Act" or "ACA". Surprisingly, the expansion of Medicaid, which was one of the cornerstones of the ACA because it would increase the potential access to health insurance for millions of Americans, was often ignored. Only 7.4% of news stories mentioned Medicaid at all, and only 5% had a Medicaid focus.
What were the sources of information used for the news stories? President Obama was cited in nearly 40% of the stories, whereas other sources included White House staff or other federal executive agencies (28.7%), Republican (22.3%) or Democratic (15.9%) politicians and officials. Researchers, academics or members of think tanks and foundations were cited in only 3.9% of the news stories about the ACA even though they could have provided important scholarly insights about the ACA and its consequences for individual healthcare as well as the healthcare system in general.
The study by Gollust and colleagues has its limitations. It did not analyze TV network news, cable news, or online news outlets which have significantly gained in importance as news sources during the past decade. The researchers also did not analyze news stories aired after April 2014 which may have been a better reflection of initial experiences of previously uninsured individuals who signed up for health insurance through the mechanisms provided by the ACA. Despite these limitations, the study suggests that one major reason for the strong opposition among Republicans against the ACA may have been the fact that it was often framed in a political context and understated the profound effects that the ACA had on access to healthcare and the reform of the healthcare system itself.
During the 2016 election campaign, many Republican politicians used the idea of "repealing" the ACA to energize their voters, without necessarily clarifying what exactly they wanted to repeal. Should all the aspects of the ACA – from the Medicaid expansion to the new healthcare quality metrics in hospitals – be repealed? If voters relied on the local television news to learn about the ACA, and if this coverage – as is suggested by Gollust's study – viewed the ACA predominantly as a political entity, then it is not surprising that voters failed to demand nuanced views from politicians who vowed to repeal the law. The research also highlights the important role that television reporting plays in framing the debate about healthcare reform. By emphasizing the actual content of the healthcare reform and its medical implications and by using more scholars instead of politicians as information sources, these media outlets could educate the public about the law.
There are many legitimate debates about the pros and cons of the healthcare reform that are not rooted in politics. For example, electronic medical records allow healthcare providers to easily monitor the results of laboratory tests and avoid wasting patients' time and money on unnecessary tests that may have been ordered by another provider. However, physicians who are continuously staring at their screens to scroll through test results may not be able to form the interpersonal bond that is critical for a patient-doctor relationship. One could consider modifying the requirements and developing better record-keeping measures to ensure a balance between adequate documentation and sufficient face-to-face doctor-patient time. The ACA's desire to track quality of healthcare delivery and penalize hospitals or providers who deliver suboptimal care could significantly improve adherence to guidelines based on sound science. On the other hand, one cannot demand robot-like adherence to guidelines, especially when treating severely ill, complex patients who require highly individualized care. These content-driven discussions are more productive than wholesale political endorsements or rejections of the healthcare reform.
Healthcare will always be a political issue but all of us – engaged citizens, patients, healthcare providers or journalists – need to do our part to ensure that debates about this issue, which directly impacts millions of lives, are primarily driven by objective information and not by political ideologies.
Gollust, S. E., Baum, L. M., Niederdeppe, J., Barry, C. L., & Fowler, E. F. (2017). Local Television News Coverage of the Affordable Care Act: Emphasizing Politics Over Consumer Information. American Journal of Public Health, (published online Feb 16, 2017).
Eileen Alice Soper (1905-1990). When Badgers Awake.
John Lister-Kaye, naturalist and wildlife writer, describes his experience with Soper in "Gods of the Morning":
"As we approached the (badger) setts in the dusk she seemed to slough off her human-ness and transmogrify into something more than half wild. I couldn't understand how she sat so still. She denied cold and rain, she ignored itches - a gnat landing on her nose - she seemed to become part of the wood herself, part of the tree, the soil, the still evening air ..."
Special note to my siblings: Eileen Soper was the illustrator of our beloved childhood books by Enid Blyton - look!
WEIMAR ON MY MIND
by Brooks Riley
To paraphrase Heinrich Heine, I dream of Weimar in the night—not the era, but the town of Weimar, a lovely word on its own, one steeped in intellectual significance, historical resonance, cultural audacity, political and artistic enlightenment, philosophical bravura--and in modern times monstrous atrocity.
I remember the first time I heard the word Weimar. It wasn't that small town in Germany where Goethe, Schiller, Nietzsche, Liszt, Luther, Cranach, Bach, Wagner, Gropius, Klee, Kandinsky, Strauss, Schopenhauer and countless other thinkers and artists once lived--or even where Kafka on a visit fell in love with the daughter of the caretaker of Goethe's house.
It wasn't the birthplace of the Bauhaus movement. It wasn't the place where the new German constitution was signed in 1919 launching the legendary Weimar Republic, that glittering era of promise before the darkness fell. And it wasn't the town closest to the murderous concentration camp at Buchenwald.
It was our Weimaraner, a hunting dog my father acquired to quell his thirst for a canine to tip the balance in a feline household. But Tonndorf, named for the castle a few miles from Weimar where my father, Artillery Commander of the 6th Armored Division, had quartered with his regiment at the end of World War 2, wasn't allowed in our household, and was banished to the stable with the horses, where he spent hours hoping to catch a rat coming out of a hole in the earthen floor of a stall, successful only once in all his years, when an emerging rat took a wrong turn and landed in his maw.
Weimaraners were exotic in the mid-Fifties. They hadn't been discovered by William Wegman or immortalized in the Museum of Modern Art. What I remember best about Tonndorf was the color of his coat, my favorite color, taupe. Taupe is the color gray with a smile, a hint of warmth that seeps through the sober neutrality of lightened black. I never think of Weimar without somehow seeing taupe, and when I look at Goethe's color wheel, I can't help wishing he had added that smile.
It would be many years before I actually went to Weimar, years before I began to understand the subterranean currents that would lead me there. So many interests of mine had their genesis in Weimar or were inextricably entwined with it. In college, a term paper of mine dealt with Friedrich Schiller's Wallenstein trilogy, which was written and premiered there. In it, I posited that Schiller might have foreseen the dangers of Napoleon, and had written Wallenstein as a parable. Ironically, Weimar later briefly fell to Napoleon.
To understand Weimar, it's important to remember how small it was, and how small it still is. At the time Goethe moved there in 1775, invited by the enlightened 18-year-old new Duke Karl August of Saxe-Weimar (who also developed the Weimaraner breed of dog, by the way), it had roughly 6,000 inhabitants, fewer than a single city block in New York. It was in the middle of nowhere--far from Frankfurt, far from Berlin, far from Munich, and very far from the other important cultural metropoles of Europe--nestled in the rolling hills of Thuringia between Erfurt and Jena. Today, it is home to around 65,000.
Even in Goethe's time, now referred to as the Golden Age, Weimar already had a rich heritage. Johann Sebastian Bach had lived there for nine years before he was jailed for a month by an earlier Duke for stubbornness and insolence (there's more to that story), and then fired; Luther had spent time there, Cranach the Elder too.
I like small towns. After an early childhood in Paris, I grew up in a small American town and have since peppered my life's big-city resume intermittently with residency in small towns. I live in one now. Small towns embody the promise of perfection even when they don't or can't live up to that promise. At their best, they are miniature well-oiled societal structures without the agonies of big-city overcrowding and logistical isolation, the kindness of neighbors trumping the kindness of strangers. At their worst, they are teapots of conflict, as anyone who's ever seen a Western can attest.
I fell in love with Weimar when I worked there in 2008 directing the Deutsches Nationaltheater's production of Wagner's Ring of the Nibelung for DVD. During a lunch break I once wandered into the empty Jakobskirche where an organist was practicing a prelude that Bach had written for that church 300 years earlier. At the theater itself, at work on that quintessential work of German culture, I was breathing the rarefied air of past triumphs: among them, the world premiere of Wagner's Lohengrin, of Richard Strauss's Death and Transfiguration (in nearby Eisenach), and Don Juan (both written when he was Kapellmeister in Weimar), of Goethe's Iphigenia in Tauris and Torquato Tasso, of the late Schiller dramas directed by Goethe, then intendant of the theater (even if the building had been replaced).
Historical Weimar is well known and accessible. But my Weimar, and the Weimar my father saw, and that people I have known survived, form a network in my brain, that spreads far beyond the physical proximity, through many degrees of separation.
The significance of Weimar as the paradigm of German culture was not lost on Adolf Hitler, who is said to have visited Weimar 40 times before he came to power. Knowing what I know about Weimar in the years leading up to and including the Third Reich, it's difficult not to see a cautionary tale at work, one that applies to the history now being made in America. When the Bauhaus decamped for Dessau in 1925, it was due to Weimar's growing conservatism and hostility to anything new. Thuringia became the first German state to have a Nazi government. The Nazis tried to co-opt the great heritage of Weimar, claiming what was not rightfully theirs, the way America's greatness is not rightfully the provenance of Donald Trump. They even distorted Nietzsche's legacy as his own racist sister had done after he died.
C.G. Jung and the collective unconscious come to mind. What could be more emblematic than the Icarus plunge to earth of Weimar and a whole nation suddenly dominated by its Shadow? The more enlightened a culture (think Weimar), the deeper its fall when the Shadow emerges from the recesses of the collective unconscious, when the negative aspects of personal character that are normally held in check by mutual agreement are suddenly unleashed (think nearby Buchenwald).
It's difficult not to think of Weimar these days, when our own, older republic now seems on the verge of annihilation in ways not unlike the massive assault on the Weimar Republic in 1933. The new liberal constitution that launched that brief period of promise after the ‘war to end all wars' was signed in the Deutsches Nationaltheater in Weimar in 1919. The plaque Gropius designed to commemorate that event was removed by the Nazis, and eventually replaced by a replica after the war.
The Weimar constitution was never repealed when the Nazis came to power. Instead, it was abused again and again, its loopholes wide enough to drive the panzers of perdition straight through to cataclysm, in ways its drafters could never have imagined.
The Weimar constitution didn't take into account the possibility of a Hitler. Our own constitution too raises the question: Are there loopholes wide enough for an orange Hummer to drive through? Does our constitution take into account the possibility of a Trump? Are there safeguards against madness, demagoguery or even infantile wilfulness? Unfortunately, it takes more than one to tango and 62 million people voted for Trump in spite of his faults and limitations (or because of them). Is the Jungian Shadow reemerging at the collective level, turning our great nation into a cesspool of racism, rancor, recrimination and mean-spirited self-destruction?
Speaking of constitutions, in 1816 Goethe's old friend and patron, now Grand Duke Karl August of Saxe-Weimar-Eisenach, was the first monarch in Germany to give his land a constitution, in which he guaranteed both freedom of speech and freedom of the press, urging his people to think for themselves. Enlightened leadership always leads to a healthy society and encourages creativity and self-determination. It was true in Weimar, and it was true in America, until now. As for walls, Weimar tore down its wall in 1757, seeing no need to keep people in, or out.
During the Third Reich, Weimar's National Theater, which had once presented the greatest works in the German canon, stooped to play host to the keepers at Buchenwald, offering them light entertainment in the form of operettas like Franz Lehar's Land of Smiles, while the librettist of that work, Fritz Löhner-Beda, was imprisoned in the camp nearby, writing the lyrics for the astonishing Buchenwaldlied (Buchenwald song) for his fellow inmates. https://en.wikipedia.org/wiki/Fritz_L%C3%B6hner-Beda
Many years later, after a screening in Santa Fe of Marc Neikrug's poignant musical drama Through Roses, about a violinist who had survived Auschwitz, I heard the song for the first time when an old man in the front row stood up and sang it. By the time he finished, we were all in Buchenwald somehow, the intervening years and separations of individual destiny evaporating, just as they had for the violinist in the film.
The Fifties in America could be called the Years of Silence. No one spoke of the Nazi genocide; it wasn't taught in schools, and the words Shoah and Holocaust had yet to join common parlance. The extermination of the Jews by the Third Reich, so sensationally reported immediately after the liberation of the camps, had slipped into oblivion, remembered only by those most affected, the survivors themselves and their families.
I first learned of the Shoah from Leon Uris's novel Exodus, which I read in high school in 1960. It would be another ten years before the Holocaust began to take hold in the American consciousness, gaining momentum as the years have gone by.
My own father, an otherwise lively storyteller, never told war stories and disapproved of those who did. After he died in 1971, I found a shoe-box full of snapshots of Buchenwald, tucked away in a cupboard in our library. The significance of those photos would not become apparent for another twenty years, when, exactly 50 years after the liberation of Buchenwald, I saw him by chance on German television, showing a French general around the camp, a month after it had been liberated.
The ensuing sadness that he'd never told me anything about that time was juxtaposed with my memory of Buchenwald survivor Elie Wiesel, with whom I once had an intense philosophical conversation about belonging. The two men breathing the same air within a few weeks in history seemed like an event of personal synchronicity.
On a day off from Wagner, I decided to visit Schloss Tonndorf, the castle on a high plateau a few miles from Weimar where my father and his men resided for two months at the end of the war in 1945. During that time, according to the Battle Book of Division Artillery, it developed into a community that boasted of ‘a laundry, a radio repair service, a tailor shop, a dry cleaning agency, a post exchange, a movie house, a beer garden, a newspaper, a baseball diamond, a military government, a police force, a photo laboratory, a ballroom, and an airport.' And, to go with that diamond, a baseball team called Riley's Red Raiders. A pop-up small town of men waiting to go home, grateful that the war was over. http://www.schloss-tonndorf.de/
After driving the rutted dirt road up to the castle, I soon learned that it was once again a community, now a utopian new-age commune devoted to ecological principles and peaceful co-existence. Thirty like-minded souls each paid 10,000 Euros to buy the castle in 2005. Now it produces honey, elderberry champagne, and bio milk products, offers classes in yoga, and puts on arts festivals. The only trace of my father's sojourn there was a photo pinned to its community bulletin board, with him and his men in the courtyard saluting the American flag.
Weimar has never been more beautiful than it is today, but its beauty is tragic, forever scarred by loss of innocence. The spirit of Weimar has been embalmed, the good along with the bad, in a town that has become a museum and no longer resonates in the modern world. In 2008 Johann Sebastian Bach was officially pardoned by the nominal Prince of Saxe-Weimar-Eisenach. A few wrongs can be made right, but most are permanent stains that will live on in perpetuity.
What can tiny Weimar tell the vast United States of America? That no society, however enlightened, principled or idealistic, is immune to the forces of darkness, as we are now reminded by the Washington Post's new slogan, ‘Democracy dies in darkness'.
In 1968, at a party at the Architectural League in New York, the widow of Bauhaus member László Moholy-Nagy told me that America was becoming just like Germany in 1933. ‘Wrong,' I countered, ‘It could never happen here.' Or could it?
The man of the hour
by Katalin Balog
"As he died to make man holy, let us die to make things cheap." --Leonard Cohen, "Steer your way"
In this article I use a distinction borrowed from philosophy, between objectivity and subjectivity, to look at the nature of the Trump presidency. I explicated that distinction in more detail in some earlier posts here, here and here.
For all the ridiculousness of our president there is a whiff of the devil about him – by monumental bad luck, America has managed to elect a person embodying the worst of human nature. He combines thoughtlessness and utter disregard for standards of objectivity and reason with the soullessness and banality of reality TV run amok. Despite real parallels with 1930s Europe and more recent autocratic regimes across the world, the Trump era also offers novelty; it is its own, unique brand of awfulness, made in America.
In trying to grasp Trump's uniqueness, many commentators resort to psychology. In this essay, I want to propose a more philosophical perspective, a sort of psycho-philosophical approach that, in my view, allows one to appreciate better the psychic vortex that sucks up and annihilates anything of value around Trump. He is the inverse Midas: everything he touches turns immediately into junk. Business, entertainment, social media and now our national politics – very little is safe from his seeping menace. Kierkegaard's philosophy offers some clues to understanding this situation.
Kierkegaard suggested that the mind oscillates between two primary perspectives on the world: objective and subjective – and that the relationship between these approaches determines what kind of a person we are going to be. Objectivity is an orientation towards reality based on abstracting away, in various degrees, from subjective experience, and from individual points of view. An objective approach is based on concepts and modes of thinking about the world that are accessible from many different points of view. A subjective orientation, on the other hand, is based on an attunement to and direct reflection on the inner experience of feeling, sensing, thinking and valuing that unfolds in our day-to-day living. It is the difference between an abstract, objective conception of water as a potable liquid that is also found in lakes, rivers and oceans, and the subjective concept of it based on what it is like to drink it or swim in it on this particular day in this particular place. Objective and subjective, of course, come in degrees. Scientific concepts are the most objective, but many of our everyday concepts are also of the more objective variety. The most subjective conceptions are those that arise in direct reflection on experience.
Much of Kierkegaard's philosophy is a warning against the tendency to take an increasingly objective stance. The spectacular advance of science and industry, and the rise of Enlightenment rationality in the last 300 years, have slowly weakened subjectivity and created a culture that offers less and less incentive to deepen one's inner life. It created modes of being that have little room for silence and increasingly invite noise and constant action. Consequently, fewer of us live thoroughly immersed in life's experience and more of us are distracted by its abstractions, by all the ways our culture conceptually frames our existence as individuals, Democrats and Republicans, men and women, white people and minorities, one percenters and workers, consumers, immigrants, and so on. But according to Kierkegaard, our experience of life matters in ineffable ways that no objective understanding of the world can capture. By mistakenly taking our objective understanding as our only connection to reality we make our world less rich. By becoming less subjective, we cut ourselves off from sources of meaning and value.
One can frame a decision, for example, in objective terms. One might decide between career choices by weighing differences in opportunities for self-promotion, or, for that matter, opportunities of service, between being a real estate developer or being the President of the United States. We are encouraged to make choices framed in these objective terms. Alternatively, one might try framing the decision, at least in part, in terms of what it might be like to work in either occupation; in this case, one needs to have the patience to dwell in experience long enough for one's feelings about either alternative to emerge. In other words, one might deliberate subjectively.
I think it is clear that our president has no inclination towards inwardness. His restlessness, his need to occupy and entertain himself with constant activity, is a symptom of a primarily objective orientation. Trump has been, as far as I can see, rightly described by Mark Singer in a 1997 New Yorker profile as having "an existence unmolested by the rumbling of a soul", a person so superficial and vapid that one might question if he has what is ordinarily thought of as an inner life. He has not only objectified and used others; he has managed to objectify himself through an identification with the gaze of others, and by extension, the camera – the symbol of the public eye. The banal, shiny look of his buildings, the gaudiness of his "palatial" dwellings express a soulless indifference to place, and a deference to the idea of wealth; a decorator's sensibility tethered to a perception of the opinion of others. As Kierkegaard observed in the Concluding Unscientific Postscript, "the world has perhaps always had a lack of what could be called authentic individualities, decisive subjectivities, those artistically permeated with reflection, the independent thinkers who differ from the bellowers and the didacticizers." But it is hard to find a leader in recent memory as unreflective as Trump.
But how can a person lack subjectivity? Isn't it true that, given one's experience of life, one cannot fail to be subjective? To the contrary, the mind can flee its own subjectivity, can escape into alienation. As Freud described, there are various ways of doing this: repressing experience, dissociating from it, numbing it, turning away from it. Most commonly, we turn our back on subjectivity to escape from pain or helplessness. Feeling weak, sad, or overwhelmed is taboo for many, especially men. According to Trump biographer Harry Hurt, Trump's father urged his sons to become "killers", and told them they were "kings". In a Playboy interview Trump said, reflecting on his older brother Fred who died as an alcoholic at age 41, "I saw people really take advantage of Fred, and the lesson I learned was always to keep my guard one hundred percent."
When sensitivity to experience wanes, when the mind is preoccupied with promoting and protecting the self in a hostile world, a quality of drivenness develops. The machinery of mind churns away without reflection and its capacity for change and self-direction diminishes. Without sympathy to oneself, one cannot feel sympathy for others, so the moral universe looms ever more distant. Without appreciation of beauty and meaning the world appears barren.
But here is the thing: an objective orientation that abstracts away from lived experience is not the same as being objective in the normative sense. You can have an objective orientation but still be biased or uninterested in facts and evidence, even be on an outright campaign denying obvious facts; you can be full of contradictions and utterly irrational but still approach reality primarily in an abstract, conceptual way. Objectivity as an epistemic norm can be separated from objectivity as a primarily conceptual, abstract orientation to the world. Normative objectivity, requiring respect toward evidence, logic, and reason, is the virtue that represents excellence in one's objective orientation to life. In fact, the term "subjective" is sometimes used – in contrast with my use of the term in this essay – to describe a certain deficiency in this virtue; a self-centered bias in one's relationship to evidence and belief. Trump's belief, for example, that his crowd sizes exceeded all previous inauguration crowds is subjective in this sense; not in the sense of being based on reflection on lived experience.
Trump conspicuously lacks the virtues of objectivity, while also lacking the virtues of subjectivity. This puts him at ground zero with respect to the sources of value. The sources of value are – for all the protestations of Kierkegaard and his romantic contemporaries – reason, underlying both intellect and agency; and lived experience, replete with feelings, moods, and emotions. Together they ground intellectual, prudential, moral and aesthetic value. Trump fails on both counts.
His deficiencies in the two realms are not unconnected. Both virtues are based on a repudiation of naked self-promotion. Objectivity in the normative sense requires putting the pursuit of truth and honesty ahead of one's own self-interest. This is an uphill battle for humans who – pace Rousseau – evolved to be social; and whose struggle for social status – as social psychologists theorize – has made us hardwired to prefer self-serving ideas to the truth. But humans have the potential to transcend self-interest; we can follow reason in discovering the world. Normative objectivity is the ideal most fully embodied in science, one of the highest achievements of Western culture.
The cultivation of subjectivity also requires a certain amount of self-denial: it requires the acceptance of one's inner world as it is, instead of seeing and presenting it in the most flattering light. This, too, runs against strong forces of human nature. Repression and denial are just the most obvious in a varied and intricate repertoire of self-deception. Art and literature at their best are attempts to break through this self-deception. Contemplative traditions and psychology both developed sophisticated mental techniques to do this and they, too, are precious achievements.
Trump's deficit in the virtues of both objectivity and subjectivity originates in his obsession with self-promotion. His guiding instinct is what Rousseau called "amour propre", love of self, in the sense of vanity, conceit, desire for recognition. Rousseau warned that amour propre, when it becomes the guiding motive in civilized society, makes people mean and alienated. In his book on education, Emile, he in fact advises
the main thing is that the child should do nothing because you are watching him or listening to him; in a word, nothing because of other people, but only what nature asks of him. Then he will only do good.
Trump is indeed corrupted by ambition; he lies and does cruel things with complete ease to promote his own advantage. His objectivity deficit is quite radical; he seems to have only a loose grip on the line between reality and representation. This makes him very dangerous.
His amour propre also makes him turn away from the reality of his self in favor of a flattering image of it. Cioran says: "This is how we recognize the man who has tendencies toward an inner quest: he will set failure above any success." This is because failure, Cioran thinks, "reveals us to ourselves, permits us to see ourselves as God sees us, whereas success distances us from what is most inward in ourselves and indeed in everything." This is gleeful provocation from a writer who enjoyed a fair amount of literary fame since adolescence. But the basic message seems correct.
For all his self-centeredness and bragging, I think it is a mistake to describe Trump as a narcissist. Narcissus of the myth sees himself in his actual reflection, and gets infatuated with it; Trump instead produces a doctored image of himself for others to fall for. He is enthralled not by himself, but a character called "Trump". Lacking the virtues of both intellect and heart, he personifies the worst and most superficial in contemporary American culture: its translation of all value into money and number (think about the size of those crowds!); its spreading contempt for facts and reason; its demonization of people different from oneself; its anti-intellectualism and dismissal of expertise; its obsession with fame, success and celebrity for its own sake and its indifference to decency, fairness or beauty. He has made junk, an addiction to worthless things his highest good. He busies himself building, as the late Leonard Cohen quipped, the Tower of Wrong. Whether Americans will eventually unite against him will be a test of the greatness of our country.
by Brooks Riley
Apelles' Lost Paintings and How to Tell a Great Work of Art
by Amanda Beth Peery
In Pliny the Elder's Natural History, he describes a fourth-century BC painter, Apelles of Kos, as superior to all other painters. According to the Encyclopedia Britannica, Apelles "continues to be regarded as the greatest painter of antiquity even though none of his work survives." How is it possible that the artist seen as the greatest painter of all of antiquity is one who left no surviving works? One possibility is that his fame has been expanded by myth and time, and with no works left to show the truth, his skills have been inflated beyond their due. That's probably true, but I believe there's another, more legitimate reason for Apelles' reputation. Apelles' art—often conveyed through the descriptions of ancient writers like Pliny—has engendered other art. One way of measuring the greatness of a work of art is to ask whether it gives rise to other works, or to say it differently, whether or not it inspires.
Apelles of Kos was the court painter of Macedon under Alexander the Great. Pliny recounts various stories about him, many of them gems. In one, Apelles comes to Egypt, then ruled by one of the Ptolemies (the first Ptolemy, I think) whom Apelles once knew. A court jester invites Apelles to a feast at the royal palace, but unbeknownst to Apelles, Ptolemy has long harbored a hatred for the artist and the pharaoh is enraged to see him at the feast. Ptolemy commands Apelles to tell him who invited him. Apelles, who never knew or doesn't remember the jester's name, picks up a piece of charcoal from the cold hearth and begins to draw the jester's face on the palace wall. Within just a stroke (or two), Ptolemy recognizes his jester. Apelles has captured the jester with just a single line.
Apelles is famed not only for his superior skill but also for his dedication to his art. Pliny attributes to Apelles the phrase "nulla dies sine linea," or "not a day without a line," because the artist worked every day. Apelles exemplified the artist's lifestyle and was so respectable and respected that he could speak out against Alexander the Great himself. In one story, Alexander is sitting for a portrait, expounding his theories on art, going on at length, until the artist quietly begs him to stop because the boys grinding the colors will laugh at him. We don't know what Alexander was saying, but by stopping him, Apelles—in his innocence—asserted the artist's superior knowledge of the craft and maybe even the ways of seeing and creating that artists are able to access. Alexander, who had been tutored by Aristotle (who was tutored in turn by Plato, who was tutored by Socrates), cannot rival Apelles'—or the color-grinding boys'—intimate knowledge and experience of art. In this story Apelles rejects the very sources of knowledge in the West. He is insisting that there is another type of knowledge. Or he is insisting, at least, that there are other things to know.
At the risk of being an Alexander, I want to talk about what makes art good or true. One way to tell good art from the rest, I think, is that it inspires other artists.
The idea that art is immortal and immortalizing is common in the history of art and literature. In epics like Beowulf, for example, the names of heroes are sung to commemorate their brave deeds, and they are thus immortalized. Their stories are twisted and made enchanted over time, and their names have different sounds on modern tongues, but they live on in some form in surviving epics. When authors' names began to be passed down with their works, they too began to seek immortality, or at least inhuman longevity, by creating works that lived on after their death.
But more important than immortality is that a work of art give rise to others, that it passes some small light along and drives people who see it to write, sculpt, paint, or compose. It's possible that this is the only way that art can begin to approach immortality. Immortal things can live in a glass box slowly, eternally withering, but art, I think, is not immortal in the same way. Instead, it carries a certain immortal madness in it that passes from the mind of one artist to another. This madness is what drives artists to rival the gods by daring to create. Or, in today's more banal formulation, it is what we might call "inspiration." Art begets art.
When Apelles was asked why he "touched and retouched his paintings so continually" he said, "I paint for eternity." This quote survives while none of his art does. (Some of his paintings were supposedly destroyed in a fire in Caesar Augustus's mansion on Palatine Hill). But his paintings were described by Pliny and by Lucian, and some of them were painted again centuries later, during the Renaissance.
Using the famed concubine Campaspe as a model, Apelles painted Aphrodite rising from the sea, wringing her hair, with silver water droplets veiling around her. According to Pliny, "time and damp" eventually destroyed Apelles' Venus Anadyomene after first washing away the lower half (as though Aphrodite were being gradually reclaimed by the sea). When just the lower half was destroyed, no one with the necessary skill could be found to restore the painting. Some sources say that this was not Apelles' only painting of Aphrodite rising from the sea and wringing her hair. He began another, also using Campaspe as a model, but died before he could finish it. This second Venus Anadyomene was said to be even more beautiful than the first. Apelles left only an outline.
Apelles' beautiful painting, as it was described by ancient writers, was a source of inspiration for Botticelli's Birth of Venus. Maybe, with Botticelli, a painter of the requisite skill was finally found, and the destroyed painting or the unfinished painting could be finished. This was not the only one of Apelles' paintings that Renaissance artists brought back to life from ancient writers' descriptions. Botticelli, for one, also painted "The Calumny of Apelles," basing his version of the allegorical painting on Lucian's description of the original.
Is it possible that the whole history of art is a series of collaborations over time? Maybe this is only true of great art; that great works are not the product of a single mind but grow over time through the work of many hands and many imaginations. In Moby Dick, Ishmael says "For small erections may be finished by their first architects; grand ones, true ones, ever leave the copestone to posterity. God keep me from ever completing anything. This whole book is but a draught—nay, but the draught of a draught. Oh, Time, Strength, Cash, and Patience!"
I wrote earlier that Botticelli might have finished Apelles' ruined or unfinished Venus. But maybe that's not where the line ended. Maybe the many artists who have been inspired by Botticelli's goddess have continued in an unending process of creation, with the painting taking different forms over time, some morphing so that the goddess is only a hint or a shade, but all, when they are successful, somehow passing on some quality of its exquisite beauty or its glory, or something else at the heart of it.
Jorge Luis Borges writes about how works of art are passed along to other works, sometimes through the medium of dreams. In "The Dream of Coleridge" he writes about Samuel Taylor Coleridge's poem "Kubla Khan." Borges tells the story, based on the poet's own account, of how Coleridge fell into an opium dream while reading about the thirteenth-century Mongolian emperor Kubla Khan's summer palace. Coleridge woke up with a poem of 200 to 300 lines perfectly formed in his mind. He began to write it. After writing just 54 lines, he was interrupted by a person from Porlock, a nearby town, who had come on business, and when he sat back down to finish the poem, the rest had vanished into the mists of a forgotten dream.
Borges writes that what Coleridge did not know and could not have known was that the palace itself came to Kubla Khan in a dream. A Persian source that described the origins of the palace in the emperor's dream only appeared in the West twenty years after Coleridge published his poem. In Borges' gorgeous essay, the Argentine writer says:
The first dreamer was given the vision of the palace and he built it; the second, who did not know of the other's dream, was given the poem about the palace. If the plan does not fail, some reader of "Kubla Khan" will dream, on a night centuries removed from us, of marble or of music. This man will not know that two others also dreamed. Perhaps the series of dreams has no end, or perhaps the last one who dreams will have the key.
After writing all this, I perceive—or think I perceive—another explanation. Perhaps an archetype not yet revealed to men, an eternal object (to use Whitehead's term) is gradually entering the world; its first manifestation was the palace; its second was the poem. Whoever compared them would have seen that they were essentially the same.
Is it possible that art is the gradual bringing-into-being of eternal objects? The story of Apelles' Venus seems to suggest this theory holds water. In some sources, Apelles' second, more beautiful and unfinished version of the painting was only an "outline." Maybe the ancient description of the painting, another outline, was "essentially the same" as Apelles' work. Maybe even Botticelli's rich recreation was, in its essentials, again the same. (And maybe the third iteration of Kubla Khan's palace was not of marble or of music but appeared in the form of Borges' essay.)
When the caves at Lascaux were discovered, Picasso went to see the prehistoric paintings, preserved from "time and damp" in the caves' long-sealed chambers and passageways. Afterwards, Picasso said, "we have learned nothing in 12,000 years." (Today it is believed that the paintings are closer to 17,000 years old.) Some paintings on the walls of Lascaux bear an eerie resemblance to Picasso's work, painted before the caves were discovered. The resemblance between the cave art and the modern art of the early twentieth century seems clear to me. Maybe this is evidence of eternal objects, or forms, coming into being.
Plato imagined forms that cast shadows against the walls of another cave. Maybe the objects in the Lascaux paintings flickered up like shadows, or maybe they were spirits emerging from the walls, as some archeologists think the prehistoric painters believed. I think it's interesting that these paintings, like Apelles' last Venus, strike some of us modern viewers as outlines. Many of Picasso's paintings, too, have heavy outlines and simple, stylized forms. Maybe these outlines are, in some way, the core or essence of an object struggling into the world. Apelles captured the nature of the jester's face with one or two strokes of charcoal on Ptolemy's wall.
We don't need to believe in the spirit world or in eternal objects to believe that two works of art, separated by time, can struggle to bring the same thing—often the same otherwise inexpressible object or quality—into the world. I think the art that most often inspires is art that has partially succeeded in bringing something with life in it into the world. But like Alexander, I've been expounding on art in Plato's shadow, and I can only hope that the boys grinding the colors aren't laughing. All I know—or think I know—is that art, when it has life in it, has inspired other artists to risk creating.
Amanda Beth Peery is an Assistant Editor of History at Princeton University Press. She has also worked in philosophy, literary criticism, and economics at Harvard University Press.
The Concussion Year
by Shadab Zeest Hashmi
The ghost that lurks around the old Bombay Company bookshelf is the ghost of an elliptical future, trailing the past like a spectacular, burning, comet-tail. It is the wispy energy of my own half-dreamed, half-written book that hovers over the rows of books I use for research, mostly works of history and poetry. After a night of writing, I have finally met my deadline. The life-size mirror leaning in the corner shows a pale face, preoccupied with time; my work is to not forget the past, and to call to poetry what may be forgotten. I am now searching for a book for remembrance, a book by the American Sufi poet Daniel Abdal-Hayy Moore. I want to honor this poet whose work I consider a beacon and who is now saying his goodbyes, dying of cancer. I am flailing for time, mine, his, and ours as poets, especially as Muslim poets living through times of brutal daily deaths. Weeks from now, earthly time will stop for him, moments from now, time will slow down for me, indefinitely.
The bookshelf phantom is poised to make projectiles of treasured objects— a miniature Chinese cabinet and framed Turkish calligraphic art on an easel— heavy objects that will slide down and cause multiple concussions and head/neck trauma. I am stunned but remain conscious, not bleeding but suddenly fatigued. It is ironic that one of the objects is Turkish— I had met Daniel Abdal-Hayy Moore and his wife Malika at the Nazim Hikmet Poetry Festival where he and I were both awarded the Hikmet Poetry Prize, where I recognized kindred souls in both Daniel and Malika and found a reservoir of inspiration and made lifelong friends at the Turkish House in Cary, NC. Despite the shock of the accident, I feel the surge of a promise, a kind of reassurance.
Over the next weeks and months, I will go through several phases and manifestations of the head and neck trauma. I will wait it out, struggle to continue my daily duties and keep promises to loved ones— kids’ sleepovers with cousins, cooking birthday treats, coaching for a science competition— I will strain to finish projects until the pain and fatigue take over; I will give up my goal of finishing my book by the summer. I will travel alone and with family, and come to know the din of airports and streets as nothing short of being trapped in a nightmare. I will feel helpless. I will also be the recipient of serendipity: I will be visited by childhood memories in unexpected places such as the ABBA museum in Stockholm or the Nivea store in Hamburg, reminisce with my younger brother; I will read my poetry among old friends and make new ones in Chicago, New York, Portland and Seattle. I will drink tea in some of the most beautiful gardens and have some of the saddest thoughts of my life.
In the course of a year, the political climate will ignite more mistrust, hatred and violence, particularly towards Muslims. I will find myself saying on stage how sick I am of having to publicly translate reality as an American Muslim, to speak as a perpetual other; I will go home exhausted and abandoned. As someone who covets solitude, I will for once discover its frigid side. Unable to read or write for extended periods, I will have my first brush with hardcore isolation— a lesson in humility that will teach me to take the first deep breaths of my life. I will no longer take breath for granted, nor will I take the beach a mile away for granted; I will catch more sunsets in six months than in my whole life. I will be thankful to have my parents’ hands to picture as I suffer through a claustrophobic hour of MRI scans. I will find comfort in my mother’s voice on the phone and in chanting the ninety-nine names of God. I will discover gifts of health in nature on my weekly walks with my sister-in-law, a physician who coaches in integrative health in addition to her practice as an MD.
There are signs nesting in signs. In between episodes of panic attacks and nausea, I aim to be the finest listener, an artist of stillness who cultivates the patience to mend her own wings, and remains in no hurry to fly. My neurologist invites me to his home to attend a Sufi zikr led by the celebrated Rumi scholars Kabir and Camille Helminsky, only days before the end of the concussion year; my attention is gently brought back to time, to letting go of time, and thereby to making what is allotted to me truly mine. I reflect on Daniel’s Muslim name— Abdal-Hayy, one in service of the Divine ever-living, the timeless One.
There's a certain kind of conversation in which I find myself every so often, which can roughly be summarized as "What's the big deal about DJing?" As someone who was a quasi-professional DJ in a former life, and is currently what one friend terms a 'monastic DJ', I've sensed a substantial gap in lay understanding of not just what a DJ does while engaged in the act of mixing, but also the place occupied by DJs in the contemporary musical ecosystem. This attitude — not unlike looking at a Jackson Pollock while muttering to yourself that you could do just as well — has received further support from the rise and fall of the spectacularly excessive (and, to my ears, creatively bankrupt) EDM scene; the unholy marriage of superstar DJs, casino-based clubs and overpriced bottle service; and the fact that watching someone DJ is fundamentally uninteresting.
Is there any value in mixing other people's music? When viewed from the most reductive position, the answer is clearly not. As critic David Hepworth noted in a now-deleted blog post, "You must surely realise that you make your living by putting on records, which is only a tiny bit removed in degree of difficulty from switching on the radio." If that's all that DJs are good for, then I suppose it's a relief that streaming services and software-driven playlists have come along to put this particular horse-and-buggy paradigm out of its misery.
Instead, it's more helpful to look at the larger role that DJs play in parsing the ocean of music in which we swim in these post-Napster days. Just as we turn to critics in other fields to understand what we should be reading or watching, we also turn to DJs for clarity on what to listen to. In this sense, the appropriate metaphor is one of the DJ as tastemaker.
In order to talk about how a DJ guides others' taste in music, we have to address the DJ's own, internal process. Over time, a DJ becomes a collector, a curator and an editor. Of course, being a DJ involves inhabiting all three of these roles at the same time, all the time, but there is also a progression here. I'll go over each of these in turn and then return to what it means to be a tastemaker at the end of this post.
Collecting is the baseline activity for any DJ. Obviously, pretty much everyone has a music collection, but DJs take it to an obsessive level. Whether you're steeped in a particular genre — probably the most common trajectory — or collect the music of a particular era or geography, a DJ's collection is the foundation from which everything else flows.
Collecting is an endless process. To be sure, there is a real joy in finding obscure gems that might be decades old, or music that's just extremely overlooked. This is generally known as crate-digging. Collecting can also become an arms race — i.e., the competition to access new releases before they drop commercially. But even if you have the hot new remix from so-and-so, or a white label vinyl pressing that no one else does, you might be ahead of the game for a week, and then only in your little tide pool of the electronic music universe. And while the collecting arms race led to interesting collective responses, too, such as the creation, in 1975, of record pools, any DJ quickly finds out that simply having a solid collection is necessary, but not sufficient, for realizing the work itself.
If collecting is about wrapping your arms around as much of your chosen domain as possible, we may logically ask: is there such a thing as too much music? It's not unusual for DJs to have 15,000 or more tracks in their collections, as well as stacks of records and CDs that patiently await a critical listening. To which I would say, that's like asking a painter if they have too many colors in their palette, or an interior decorator if they have too many fabric swatches, or a fashion designer too many styles of buttons. All of these professionals engage in the act of remixing their materials into new, exciting and, perhaps most importantly, appropriate arrangements that speak to the needs of a unique aesthetic moment. So it's not so much an issue of having too much music, but rather the possibility of owning something and not knowing that you do.
The DJ's collection is the DJ's instrument.
Curation is, admittedly, a word that's been beaten to death in the last few years. Everyone is a curator now — if only because they are ‘curating' their own life. This is nonsense. It's kind of like saying that we're all knowledge workers, or that everyone we work for is a client.
In a stricter sense, curation is the act of assembling a representative collection of (traditionally tangible) objects. The assembly makes sense in some way — it is literally sense-making. So if you went to the recent Picasso sculpture retrospective at New York's MoMA, you didn't expect to see all of Picasso's sculptures, but a strategic sampling of them, displayed and annotated to demonstrate the artist's progression through time and across media.
In the same sense, the DJ is a curator for a particular domain of sound. Having listened to thousands of tracks, the DJ can select the seminal compositions that demonstrate the development of a sound or genre through its history (indeed, in some musical cultures DJs are known simply as ‘selectors'). This curatorial act can be performed either in real time, or in hindsight.
In the case of the former, the DJ is helping to define the sound of the moment. Kool DJ Red Alert did as much for hip-hop in the 80s and 90s with his long-running show on New York's 98.7 KISS-FM. But DJs also continue to define the contours of a genre even after it's become well-established. A good example here is the seminal set of mixes that Solitude (Tom Bond) has done for UK dubstep.
One more distinction bears mentioning here: while valuable, the kind of curation seen in various oral histories and "bluffer's guides" around the Web differs from the curation a DJ does (a charming example is this guide to Italo-disco). It's also distinct from the kind of magisterial presence that trend-setters such as John Peel had. In Peel's case he cultivated a weekly show for BBC Radio 1 over the remarkable span of nearly 40 years and helped to break countless bands to a global audience.
In contrast to these functions, the DJ presents the results of curation in the form of a mix. This may seem trivial, but the fact is that much of this music is designed to be heard in a mix. For example, it's not uncommon for dance tracks to begin in a thoroughly uninspiring manner, as with a simple kick drum hitting every beat. That's because producers know that DJs need a few bars to sync up the new track to its predecessor; a naked kick drum is the toehold that allows for a quick and effective transition. By the same token, dance music tracks, unlike pop songs, rarely fade out, but will have elements drop out at regular increments of time (usually denoted in cycles of 16 beats), until there is usually only a kick drum remaining. This way the DJ can mix out of the expiring track in an elegant and seamless manner. Thus, this design for mixing carries the additional, curious trait that certain parts of a track aren't meant to be heard by anyone but the DJ.
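The arithmetic behind those 16-beat cycles is simple enough to sketch. Here is a minimal, hypothetical illustration (the function name and the example BPM are my own, not anything from DJ software) of where the phrase boundaries fall in a track at a given tempo — the points where producers tend to bring elements in or drop them out:

```python
def phrase_boundaries(bpm: float, total_beats: int, phrase_len: int = 16):
    """Return the timestamps (in seconds) at which each phrase begins.

    Dance music is typically arranged in 16-beat phrases, so at a fixed
    tempo the structural seams of a track fall at predictable times.
    """
    seconds_per_beat = 60.0 / bpm
    return [round(beat * seconds_per_beat, 2)
            for beat in range(0, total_beats, phrase_len)]

# At 128 BPM, a 16-beat phrase lasts 16 * 60/128 = 7.5 seconds, so the
# first few phrases begin at 0.0s, 7.5s, 15.0s, 22.5s ...
print(phrase_boundaries(128, 64))  # [0.0, 7.5, 15.0, 22.5]
```

This regularity is exactly what lets a DJ line up an incoming track's bare kick-drum intro against the outgoing track's stripped-down ending.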
The mix is the preferred long-form format for understanding electronic music. When DJs listen to individual tracks they are always thinking about how those tracks can be made to interact with other tracks in their collection. A beautiful song that has no possibility of interacting with the rest of a collection is simply not useful, since the desired outcome will always be a series of relationships between musical thoughts. Another way of thinking about listening to mixes versus standalone tracks is comparing it to the difference between reading a paragraph and immersing yourself in an essay. A good mix is an extended, coherent argument.
In the same way that you go to a museum to understand how to think about Picasso, you listen to DJs in order to understand how to think about a genre, or to see where a particular sound is headed.
Finally, editing draws upon the DJ's skills in making decisions in real time. This can be within the context of a live gig or a studio recording. Both have advantages — the good DJ feeds off the crowd and tailors selections for the moment, while studio recording allows a DJ to carefully assemble a definitive statement over the course of days or weeks (this exemplary techno mix by British selector Objekt took several months to refine and polish).
In both cases, DJs not only select what they will play, and in what order, but make many other decisions. There's really no reason to play a track all the way through, and I generally regard DJs who only ever do this as pretty lazy. It's more interesting if you can start Track A at the breakdown, then mix Track B from its beginning, then mix back into the beginning of Track A, and end with the second half of Track B. Even better, save the second half of Track B until you've played some of Track C.
All of these decisions are accompanied by the skills and tools that DJs have traditionally had — equalization, cross-fading, pitch shifting, simple high-pass/low-pass filters. Digital DJing has added many more, such as effects, loops, and cue points. The ability to access tracks, beats and samples quickly has also reduced the time it takes to perform an edit in real time, to the extent that DJs with the right raw materials and skills can execute what are essentially remixes on the fly, or custom flows that can never be repeated.
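To give a sense of what pitch shifting actually does in this context: beatmatching comes down to nudging one track's tempo toward another's. A minimal sketch, with a made-up function name and illustrative BPM values (not taken from any particular DJ software), of the adjustment a pitch fader performs:

```python
def pitch_adjust_percent(track_bpm: float, target_bpm: float) -> float:
    """Percentage tempo adjustment needed to match the target tempo.

    A positive result means speeding the track up; negative means
    slowing it down. This is the figure a turntable or controller's
    pitch fader expresses.
    """
    return round((target_bpm / track_bpm - 1.0) * 100.0, 2)

# Bringing a 126 BPM track up to meet a 128 BPM track:
print(pitch_adjust_percent(126.0, 128.0))  # 1.59
```

Adjustments this small are barely audible as a change in pitch, which is why two tracks can run in lockstep without either sounding obviously sped up or slowed down.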
It follows, then, that DJing technology blurs the lines between the extroverted phenomenon of playback and its introverted correlate, production. While it's not production as it's commonly understood, what's created is a grey zone that may make the source material difficult to separate out from the mix as a whole. A good DJ will emphasize specific aspects of the music, or bring elements from different tracks into dialog with one another. This is all in the service of showing a listener the best that a certain collection has to offer. I discuss an example from my own mixing here.
Good DJs take the most interesting bits and put them together in the most interesting ways.
To return to the idea of tastemaking then: the DJ stands as the interpreter through which listeners access sound. In fact, this is virtually a requirement for electronic music in particular, with thousands of producers working across hundreds of genres that are constantly cross-pollinating one another. Moreover, the tempo of production has increased dramatically, due to the falling cost of both studio gear and distribution (at this point, only a laptop and an internet connection are needed to launch a project or even a record label). This is radically different from other genres of music, which either have a fixed repertoire (classical) or are expanding, but slowly (jazz, rock and pop). It's an ocean of oceans out there — let a DJ help you make sense of it all.
This post is an expanded version of the first post on my new Medium blog, which focuses on the art of mixing records.
Sunday, February 26, 2017
Is Consciousness an Illusion?
Thomas Nagel in the New York Review of Books:
For fifty years the philosopher Daniel Dennett has been engaged in a grand project of disenchantment of the human world, using science to free us from what he deems illusions—illusions that are difficult to dislodge because they are so natural. In From Bacteria to Bach and Back, his eighteenth book (thirteenth as sole author), Dennett presents a valuable and typically lucid synthesis of his worldview. Though it is supported by reams of scientific data, he acknowledges that much of what he says is conjectural rather than proven, either empirically or philosophically.
Dennett is always good company. He has a gargantuan appetite for scientific knowledge, and is one of the best people I know at transmitting it and explaining its significance, clearly and without superficiality. He writes with wit and elegance; and in this book especially, though it is frankly partisan, he tries hard to grasp and defuse the sources of resistance to his point of view. He recognizes that some of what he asks us to believe is strongly counterintuitive. I shall explain eventually why I think the overall project cannot succeed, but first let me set out the argument, which contains much that is true and insightful.
Carl Erik Fisher in Nautilus:
Thomas was a highly successful and mild-mannered lawyer who was worried about his drinking. When he came to see me at my psychotherapy practice, his wine intake had crept up to six or seven glasses a night, and he was starting to hide it from his family and to feel the effects at work. We discussed treatment strategies and made an appointment to meet again. But when he returned two weeks later, he was despondent: His drinking was totally unchanged.
“I just couldn’t cut back. I guess I just don’t have the willpower.”
Another patient of mine, John, also initially came to me for help with drinking. At our first meeting, we talked about moderation-based approaches and setting a healthier limit. But one month later, he came back to my office declaring that he had changed his mind and made peace with his drinking habits. Sure, his wife wasn’t always thrilled with how much he drank, he told me, and occasionally the hangovers were pretty bad, but his relationship was still fairly solid and drinking didn’t cause any truly significant problems in his life.
In the abstract, John and Thomas are similar: They both succumbed to short-term temptations, and both didn’t keep their long-term goals. But while Thomas attributed that outcome to problems with willpower, John came to reframe his behavior from a perspective that sidestepped the concept of willpower altogether. Both John and Thomas would resolve their issues, but in very different ways.
Most people feel more comfortable with Thomas’ narrative. They would agree with his self-diagnosis (that he lacked willpower), and might even call it clear-eyed and courageous. Many people might also suspect that John’s reframing of his problem was an act of self-deception, serving to hide a real problem. But Thomas’ approach deserves just as much skepticism as John’s. It’s entirely possible that Thomas was seduced by the near-mystical status that modern culture has assigned to the idea of willpower itself—an idea that, ultimately, was working against him.
Ken Arrow's Impossibility Theorem
The Future of Not Working
Annie Lowrey in the NYT Magazine:
The basic or guaranteed income is a curious piece of intellectual flotsam that has washed ashore several times in the past half-millennium, often during periods of great economic upheaval. In “Utopia,” published in 1516, Thomas More suggests it as a way to help feudal farmers hurt by the conversion of common land for public use into private land for commercial use. In “Agrarian Justice,” published in 1797, Thomas Paine supports it for similar reasons, as compensation for the “loss of his or her natural inheritance, by the introduction of the system of landed property.” It reappears in the writings of French radicals, of Bertrand Russell, of the Rev. Dr. Martin Luther King Jr.
Silicon Valley has recently become obsessed with basic income for reasons simultaneously generous and self-interested, as a palliative for the societal turbulence its inventions might unleash. Many technologists believe we are living at the precipice of an artificial-intelligence revolution that could vault humanity into a postwork future. In the past few years, artificially intelligent systems have become proficient at a startling number of tasks, from reading cancer scans to piloting a car to summarizing a sports game to translating prose. Any job that can be broken down into discrete, repeatable tasks — financial analytics, marketing, legal work — could be automated out of existence.
In this vision of the future, our economy could turn into a funhouse-mirror version of itself: extreme income and wealth inequality, rising poverty, mass unemployment, a shrinking prime-age labor force. It would be more George Saunders than George Jetson. But what does this all have to do with a small village in Kenya?
A universal basic income has thus far lacked what tech folks might call a proof of concept. There have been a handful of experiments, including ones in Canada, India and Namibia. Finland is sending money to unemployed people, and the Dutch city Utrecht is doing a trial run, too. But no experiment has been truly complete, studying what happens when you give a whole community money for an extended period of time — when nobody has to worry where his or her next meal is coming from or fear the loss of a job or the birth of a child.
And so, the tech industry is getting behind GiveDirectly and other organizations testing the idea out.