Monday, March 27, 2017
Okay, poets, we get it: things are like other things
—A. R. & M. G.
Ah, But Math is Like That Too
When poets are so dissed
by engineers and physicists
they really should consider this:
(4+2) is just like 6
and keeping that in mind
81’s like the square of 9
and in case you think these
are a poet's tricks,
√36 is too like 6
(in this, poetry’s like
mathematics)
In fact, when quantities and things align
like is like an equal sign
and, what’s more,
(4×4) is 16’s metafour
Beauty is Not (Entirely) in the Eye of the Beholder
by Dwight Furrow
In philosophy the most important development in the last 300 years has been the idea that what can be intelligibly said about reality is constructed out of our subjective responses, suitably constrained by social norms and intersubjective communication. This is the essence of Immanuel Kant's so-called Copernican Revolution in philosophy, which converted us from naïve realists who took reality at face value to sophisticated anti-realists constructing reality via the structures of consciousness and language.
Kant's argument is sound but preposterous. One would have thought that reality's stubborn resistance to our ideas and expectations, and the fact that we are often surprised by this resistance, might lead us to take the idea of a real world more seriously. The performative contradiction of claiming all reality is a social construction while traipsing off to the doctor when ill renders truth and knowledge the exclusive purview of scientists, who have never shown much inclination toward anti-realism. But once these "naïve" realist thoughts are cast out in favor of Kant's fastidious, critical skepticism, common sense can't find a way back in. And so for 300 years we have been denying what to non-philosophers seems obvious—there is a real world out there with which our senses put us into contact.
In light of this revolution in thought we were, by now, supposed to be basking in the friendly solidarities of intersubjective agreement, a consequence that unfortunately appears to be increasingly remote. This idea that reality is a social construction ebbs and flows outside the philosophy class but in today's "post-truth" society it seems ascendant. Perhaps a new way must be found to anchor truth in something more substantial than contingent, collective agreements.
Yet the one area that everyone agrees is subjective and seems the least amenable to a realist treatment is aesthetics. That beauty is in the eye of the beholder is taken to be so obvious it's barely worth mentioning. However, for those of us who think reality is getting short shrift, this easy comfort with the subjectivity of aesthetic judgment is puzzling. After all, aesthetic judgment involves our sensibility and perception, the fundamental way we feel and perceive reality, the critical interface on which that contact with reality rests. Aesthetics would seem to be the arena in which we discover the hard, resistant surfaces of reality and feel them most acutely. If we are to make the case for realism it will likely be through a better understanding of sensibility, and so aesthetics seems crucially relevant.
Regarding aesthetics, Kant insisted that an aesthetic judgment is one "whose determining basis cannot be other than subjective". But beauty doesn't seem to be merely subjective. When impressed by the beauty of a sunset, we are not aware of projecting our attitudes onto the sunset. Rather, the attribution of beauty is a response to the sunset, a recognition of something there in the brilliant sky. Furthermore, if a painting is beautiful it remains beautiful even when locked away in a closet unobserved. The idea that beauty is in the eye of the beholder does a poor job of explaining the stability and persistence of those features of objects we judge to be beautiful. In fact, Kant surreptitiously admits this. When I judge a flower to be beautiful, Kant writes, the beauty is not in the flower. Rather, the flower is beautiful "only by virtue of that characteristic in which it adapts itself to the way we apprehend it". But that formulation contains an implicit reference to some characteristic in virtue of which it adapts to us. Thus, there is something in the object that plays some explanatory role.
The standard picture of our judgments of beauty is that they begin with subjective feelings of pleasure and from those feelings of pleasure we judge an object beautiful. But this ignores the fact that there is something causing that feeling of pleasure. Alfred North Whitehead called the beauty of a flower "a lure for feeling." We're lured, drawn in, captivated by the flower, which is a better way of capturing the phenomenology of beauty as more response than projection. But if beauty is a lure for feeling it can't be the result of that feeling. The pleasure I get is a by-product of this allure. The judgment that a flower is beautiful might depend on pleasure but the attraction has already taken place. The thing and its allure are not constructed by us; they construct and guide our experience. The attention to the flower is not response dependent; it is the response. It is Whitehead, not Kant, who gets this relationship right.
And so Whitehead, trying to reverse Kant's revolution, argues that the subject arises from objectivity:
For Kant the process whereby there is experience is a passage from subjectivity to apparent objectivity. The philosophy of organism inverts this analysis, and explains the process as proceeding from objectivity to subjectivity, namely, from the objectivity whereby the external world is a datum, to the subjectivity, whereby there is one individual experience (Process and Reality, 156).
Kant's mistake was to see subjectivity as prior to and thus unmoored from its rootedness in the object, a stance he could not maintain even in his own formulations. Of course, Kant does not deny that reality impresses itself on us. But for him reality, the so-called "thing-in-itself", can only be posited, not known, a mere logical requirement about which we can say nothing. By contrast, Whitehead's radical empiricism asserts that nothing comes into existence on its own; there is always an actual entity that explains all emergence. And it is by means of causality, real objects pressing upon us, that we come to experience beauty.
However, the problem with a straightforward realism about beauty, of the sort Whitehead seems at times to advocate, is that no straightforward causal account of beauty seems plausible. Causality operates according to laws, generalizations about what sorts of things cause other sorts of things. And the realm of aesthetics seems not to have such laws or general principles. Just as there is no law that connects a song slow of tempo in a minor key to our feelings of sadness, there is no law that connects symmetry and vivid color to beauty. There are no necessary and sufficient conditions that enable us to infer beauty from a set of objective properties. What is beautiful in one object may be ugly in another.
But this problem arises because the causal account connecting aesthetic properties with our responses to them is not straightforward. Recent work by theorists such as George Molnar on powers and dispositions as objective properties of reality helps make the case that beauty is at least in part in the object although we experience it indirectly.
Beautiful objects are not merely beautiful. The judgment of beauty rests, in part, on more substantive properties such as delicacy, symmetry, vibrancy, elegance, etc. As with beauty, we often experience these more fully descriptive properties as real properties of objects, not as projections. Yet we disagree about them as vehemently as we do about judgments of beauty. It is this disagreement that lends support to the claim that aesthetic judgments are subjective. How can competent perceivers disagree if aesthetic properties are in the object, just there to be perceived?
The answer is that aesthetic properties such as elegance or delicacy are dispositional properties, objective properties of objects in the external world which, under the right conditions, cause us to have an aesthetic experience. These dispositional properties are identified only via their manifestations, the way they show themselves, but are nevertheless properties of the object rather than the observer and thus explain why we see the object as beautiful and justify that attribution.
To see this, consider a non-aesthetic property such as the red color of a rose. A rose is red because we see it as red. But if we are normally sighted, we do so because of the reflective properties of rose petals. These reflective properties of the rose provide the foundation for the rose's disposition to cause our experience of red if we are standard human perceivers under normal viewing conditions. The redness of the rose is presented to us in a manifestation event, an experience, under appropriate manifestation conditions. The physical features of the rose dispose it to be perceived as red, just as the physical features of a glass bowl dispose it to break when dropped.
The property "looking red" is of course relative to a perceiver, i.e. those perceivers who are equipped to view the rose as red and are thus disposed to do so under the appropriate conditions. But although "redness" here is defined in terms of its subjective manifestation, the existence of the property being manifested is not relative. The appearance of the property "red" is relative to human perceivers, but that does not entail that the existence of the property "red" is response dependent. The disposition to appear red is in the rose; the manifestation of the disposition is in the observer.
The key move that a metaphysics of powers makes is to view dispositions, which are causal powers, as objective features of an independently existing reality. There is a causal relationship between the properties of an object and a perceiver's response. But on a dispositional analysis, those properties are not mere properties but are causal powers which aim at bringing about particular responses in observers. This relationship is only a potential relationship until a manifestation event occurs, and we only perceive the disposition, the causal power, via the manifestation event, making the relationship between the objective property and the perceiver indirect. Yet, these causal powers in no way depend on the observers for their existence. The power to produce a response exists independently of its manifestation. A fragile glass bowl remains fragile even if it never crashes to the floor; a red rose remains red even when there is no one around to perceive it.
A similar analysis is available for aesthetic properties. We experience a painting exhibiting dynamic motion, an aesthetic property, because the artist employed strong directional lines along with a variety of irregular shapes and sizes forcing the eye to move about the painting. The perception of dynamism is response dependent, relative to a properly situated observer who has been trained to look for it. But the quality of dynamic motion is also a disposition of the painting, an arrangement of lines and shapes directed toward their manifestation as dynamic motion for an appropriately disposed observer.
These dispositional properties, the arrangement of lines and shapes that explain our recognition of dynamic motion, exist independently of their observation and do not depend on an observer for their existence. Rather, they depend on the objects in the external world of which they are properties. The aesthetic disposition is distinct from the aesthetic response it causes and from the conditions under which it can be observed. Thus, the dynamic painting when locked away in the closet unobserved does not lose its dynamic motion; the dynamic motion is latent, unmanifested, yet still very much part of the painting. On a dispositional analysis, the properties of objects have a stability that they lack on a subjectivist view.
How then can there be disagreements among competent judges if these properties are simply there to be observed? Some disagreements can be attributed to differences in the background of observers or the conditions of observation. But on this realist conception of aesthetic properties, some aesthetic dispositions may be manifest to some observers and not others. And no particular observer is likely to be disposed to experience all of a work's dispositions. In fact, it is possible for the very same object to produce contradictory responses in different observers despite the fact that these properties are genuine features of objects in the world. A rose may appear red to humans and not-red to alien creatures with different perceptual mechanisms, because the manifestation conditions are different for each group. Similarly, a painting may appear vivid and ebullient to one observer and garish to another because the manifestation conditions differ. The painting is disposed to exhibit either manifestation because its features cause such responses in competent observers.
What then about beauty? The recognition of beauty involves apprehensions of complex relations between properties that are experienced together as strikingly harmonious. But there is no reason to think complex arrangements of aesthetic properties that we perceive as harmonious are any less the product of causal powers that exist in the object than simpler properties. No doubt beauty only shows itself when we experience it under the appropriate conditions, but it is, for that, no less a dispositional property of real objects.
The next time someone tells you beauty is in the eye of the beholder just point out that it depends on whether the beholder has an eye for beauty.
For more on aesthetics as it applies to food and wine visit Edible Arts or consult American Foodie: Taste, Art and the Cultural Revolution.
Asad Raza. Root Sequence. Mother Tongue. Whitney Biennial, 2017.
Installation: 26 young, potted trees, tools, and caretakers.
Under The Radar, Part 1
"In economics, the majority is always wrong."
~ JK Galbraith
One of the unfortunate gifts of the current, star-crossed administration is that there's something for everyone that will get their knickers in a twist. If immigration or climate change isn't your thing, just wait a few days, and some administration official will come out with a statement that lands somewhere in the space between spectacularly ignorant and merely deeply ill-considered. My latest opportunity to double-take arrived a few days ago, when Secretary of the Treasury (and Goldman Sachs alum) Steven Mnuchin opined that the threat of artificial intelligence to employment is "not even on my radar screen".
To be fair, the clip is brief enough that it is difficult to conclude whether or not Mnuchin knows what he is talking about. Too often when we talk about technology we fixate on one aspect of it, and intend (although not always) that this aspect stand in for the entirety of the technological phenomenon. These days, favored metonymies are ‘AI', along with ‘robots' and ‘algorithms'. Keeping this in mind while listening to the Mnuchin clip, it's unclear what he actually means when referring to AI, although I suspect he's talking about the holy grail of AI, which is artificial general intelligence, or an AI that is indistinguishable from human intelligence.
If that is the case, then he did a disservice to the question, which was about the impact of AI on employment. Or, if you'll allow me to pluck out the metaphor, the impact of technology on employment, which is much more amorphous. Mnuchin's dodge was to say that, since we won't have human-equivalent AI for the foreseeable future, it's something that's not worth thinking about, at least until it happens. Come to think of it, I've heard this dodge before, mostly from the mouths of climate change skeptics and deniers. In both cases, the purpose is to obfuscate and delay until the truly catastrophic comes to pass, then innocently maintain that "no one could have seen this coming" or some such nonsense.
However, Mnuchin gives us a good opening for asking how technology and employment are influencing one another, or at least how we might think about these categories and phenomena. At another point in the same clip, he expresses optimism that technology is good for productivity, that it creates new jobs and industries and stimulates demand - all the old chestnuts. But how much of this is true, and is this time really different?
Oddly enough, it may make sense to begin the discussion from another perspective entirely. The vast majority of articles that have lately tackled the specter of automation and unemployment have approached it from the point of view of individuals. For example, the 3.5 million truck drivers who will shortly be rendered obsolete by driverless fleets. That's a lot of people, but we are still talking about an aggregation of individuals. This way of thinking may seem to make sense, since we persistently characterize the economy as exactly that: an aggregation of individuals. Individual agents make decisions to buy and sell, and the supply and demand curves shift accordingly. Firms put goods and services out on the market, and people either buy them, or they don't. Price is revealed, and all is well with the world. This may be adequate for anyone just beginning to learn about economics, but there are other interpretations that are perhaps even more powerful, and more resonant with the historical progress of industry.
Written exactly 50 years ago, economist John Kenneth Galbraith's The New Industrial State elaborated a theory of the firm that proposed a very different way of looking at production. As he observed large corporations engaging in extremely high-stakes, long-term bets, he noted that the commitment required to design and manufacture something as complex and expensive as a jet liner required an altogether different way of looking at the market. For Galbraith, the prime directive for a firm was not to be responsive to the market, but to subsume that market as thoroughly as possible. Galbraith fils summarizes his father's thought:
Large business firms often replace the market altogether. They do this by integration: replacing activity previously mediated by open purchase and sale with activity either internal to the corporation, or between a large, stable enterprise and its small, specialized suppliers, to whom risk is transferred. People reduce uncertainty…by forming up into structured groups large enough to forge the future for themselves. In politics these are countries and parties; in economics, corporations. Once control passes to the organization, Galbraith wrote, it passes completely.
Galbraith called this manifestation a "technostructure"; for him, this was the great narrative of American industrial history. There is much more to Galbraith's theories, but for the purposes of this essay, there are two important points in the above passage. The first is that no established corporation views the free market as desirable. Free markets only lead to uncertainty, threatening profitability and the ongoing viability of the firm. Uncertainty must be quashed at any cost - this includes both new entrants as well as existing consumers. It certainly doesn't work all the time, but it works well enough that companies hewing to this worldview may indeed last for decades. One need only look at Apple, which has famously built its success on designing products that it believes people need - and mercilessly removing functionalities that it no longer considers to be necessary. These deficits are then remedied by an extraordinary PR and marketing machine, which effectively uses the company's pole position to control the market's desires.
This allergy to uncertainty leads to the second insight. The drive to control the future is why cartels and monopolies tend to be the real equilibrium state for most ‘free-market' economies. Price fixing and other forms of cartel behavior are the scourge of free market ideologues, because the fact is that it's much easier to keep the disruptors out and make deals with your pals than it is to genuinely worship at the altar of innovation and entrepreneurship. Put another way, all Objectivists are aspiring oligarchs.
Of course, Galbraith was writing in the late 1960s, when manufacturing was king and digital information technologies were but a distant glimmer. And when we think of cartels, we are usually evoking OPEC and other "old economy" phenomena. Surely the digital economy is moving too quickly for Galbraith's principles to continue to hold fast. But consider this passage from Jason Smith, writing recently in the Brooklyn Rail:
Google's parent company Alphabet speaks in exalted tones of technological moonshots, but ninety percent of its revenue and almost all of its profits still come from advertising, most of it via search engines. It is buying up smaller robotics and AI firms, but not necessarily to ramp up investment: it is to establish monopoly conditions that will guarantee super-profits and higher market share within these stagnant conditions. Today, high profits are assured for firms able to disrupt market dynamics and price signals. Such firms are often "more adept at siphoning wealth off than creating it afresh"; they thrive less through innovation than through exorbitant market shares, and streams of technological rent.
Reading Smith in light of Galbraith, one really ought to replace "Today" with "As always." And lest it be forgotten, Silicon Valley as a whole is not immune from cartel behavior: witness the $415m settlement reached in 2015 in a class-action lawsuit that accused Apple, Google, Intel and Adobe of colluding with one another on the creation of "no-poach" lists, essentially promising not to hire away each other's employees in the never-ending war for engineering talent.
Despite Smith's uncanny echo of Galbraith's half-century-old observations, his larger project is to come to a better understanding of the relationship between technology, productivity and employment. Nevertheless, it is intriguing to view this set of relationships not simply from the point of view of economic data, which is endlessly contested, but rather from the perspective of the stakeholders, that is, firms and, to a lesser extent, the government.
In this sense technology, like the market, is subordinated to the drive of firms to neutralize uncertainty. Like capital, it is deployed selectively. The notion that there is a headlong rush to replace everything (or everyone) with automated systems is simply fictitious. Smith cites the fact that since 1999, "private investment in software and computer equipment has fallen precipitously, by a full quarter: it is, today, as low as it was in 1995." This is coupled with another disquieting fact: that since the financial crisis, the United States has had "the slowest growth in productivity of any decade in American history." With so much capital on the sidelines, it is easy to conclude that investment in automation is not proceeding at nearly the rate that it could be. Smith concludes that:
Current speculations on both the promise and threat of automation are confronted with an ongoing crisis of accumulation [of capital]. In this climate, a fragmentary implementation of automation is unlikely either to liberate large fractions of humanity from work, or produce mass unemployment of the sort envisioned over and again by commentators for the past century.
At first glance, this "fragmentary implementation" may seem reassuring. As long as firms that occupy the technostructure niche are profitable and happy, the kind of catastrophic job losses implied by the sensationalizing media will occur much more slowly than they might otherwise. Firms will only implement automation to preserve their market positions; those truck drivers may yet have a chance to get retrained! But this is also a cold comfort, since it does not mean that automation isn't continuing to happen. It's just that it's happening at a pace set by the technostructure, and is meant to serve its interests, not the market's, and certainly not the public's.
There is a further, more troubling conclusion to be drawn, though. If the pace of automation is insufficient to dislocate the economy so suddenly, such that the torches and pitchforks stay stashed in people's garages, then what chance does labor have to assert its claims? What, in fact, even are the claims that labor may make, in a context that is bereft of unions and short on organizing? And what does work look like in an economy where automation is eventually, but nevertheless inevitably deployed? Next month I'll look at these issues. In the meantime though, perhaps Steve Mnuchin wasn't wrong when he said that AI replacing workers wasn't even on his radar. As long as the technostructure remains unperturbed, it would be more accurate to say these concerns remain comfortably under the radar. For the foreseeable future, it sounds like that's exactly where he wants them to be.
Deep Disagreements and Argumentative Optimism
by Scott F. Aikin and Robert B. Talisse
We all have had moments when we feel that those with whom we disagree not only reject the point we are focused on at the moment, but also reject our values, general beliefs, modes of reasoning, and even our hopes. In such circumstances, productive critical conversation seems impossible. For the most part, in order to be successful, argument must proceed against the background of common ground. Interlocutors must agree on some basic facts about the world, or they must share some source of reasons to which they can appeal, or they must value roughly the same sort of outcome. And so, if two parties disagree about who finished runners-up to Leicester City in their historic BPL win last year, they may agree to consult the league website, and that will resolve the issue. Or if two travelers disagree about which route home is better, one may say, "Yes, your way is shorter, but it runs through the traffic bottleneck at the mall, and that adds at least ten minutes to the journey." And that may resolve the dispute, depending perhaps on whether time is what matters most.
But some disagreements invoke deeper disputes, disputes about what sources are authoritative, what counts as evidence, and what matters. Such disputes quickly become argumentatively strange. And so if someone does not recognize the authority of the soccer league's website about last year's standings, it is unclear how a dispute over last year's runners-up to Leicester City could be resolved. What might one say to a disputant of this kind? Does he trust news sites, television reporting, or Wikipedia entries concerning the BPL? Does he regard the news sites and the league website as reliable sources of information concerning this year's standings or when the games are played? What if our interlocutor in the route-home case doesn't see why the quickest route is preferable to the shortest? Maybe our traveling companion regards our hurry-scurry as a part of a larger social problem, or maybe he wants to enjoy the Zen of a traffic jam. Sometimes a disagreement about one thing lies at the tip of a very large iceberg composed of many other, deeper, disagreements.
The puzzle about deep disagreements is whether or not reasoned argument works at all in them. There is a widely held view, perhaps at the core of deliberative views of democracy, and certainly central to educational programs that emphasize critical thinking, that well-run argument is at least not pointless, and often even productive. And many hold that it's important to practice good argumentation, especially in cases of deep disagreement. Call this view argumentative optimism. The trouble for this optimism is that as disagreements run progressively deeper, it grows increasingly difficult to see how argument could have any point at all; this, in turn, encourages us to regard interlocutors as targets of incredulity, bemusement, and perhaps even contempt or hatred. There's little, many think, one can argue or say that is going to rationally resolve certain disagreements. In the end, it all may come down to who's got better propaganda, more money, or, perhaps, the better weapons. Call this view argumentative pessimism.
A famous argument for pessimism was given by Robert Fogelin in "The Logic of Deep Disagreements." The core of his case is as follows:
1. Successful argument is possible only if participants share a background of beliefs, values, and resolution procedures.
2. Deep disagreements are disagreements wherein participants have no such shared background.
3. Therefore: successful argument is not possible in deep disagreement cases.
4. In disagreements needing urgent resolution that also do not admit of argumentative resolution, one should use non-argumentative means to resolve the dispute.
5. Therefore, in urgent deep disagreements, one should use non-argumentative means to resolve the dispute.
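The inference is tight, and it helps to see exactly where its joints are. As a supplement of our own (the proposition names below are our labels, not Fogelin's), here is a minimal propositional sketch of the argument in Lean 4. It makes visible that (3) and (5) are derived steps, so anyone resisting the conclusion must reject (1), (2), or (4), which are exactly the strategies canvassed below.

```lean
-- A propositional sketch of Fogelin's argument; the names are our labels:
--   Shared : participants share a background of beliefs, values,
--            and resolution procedures
--   Succ   : successful argument is possible
--   Deep   : the disagreement is deep
--   Urgent : the disagreement urgently needs resolution
--   NonArg : one should use non-argumentative means
theorem fogelin (Shared Succ Deep Urgent NonArg : Prop)
    (p1 : Succ → Shared)             -- premise 1
    (p2 : Deep → ¬Shared)            -- premise 2
    (p4 : Urgent → ¬Succ → NonArg)   -- premise 4
    (hDeep : Deep) (hUrgent : Urgent) : NonArg :=
  -- (3): in a deep disagreement, successful argument is impossible
  have p3 : ¬Succ := fun hSucc => p2 hDeep (p1 hSucc)
  -- (5): so urgent deep disagreements call for non-argumentative means
  p4 hUrgent p3
```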
Fogelin did not identify his preferred non-argumentative means, nor did he clarify how one might determine that a disagreement is deep (as opposed to merely hard) or urgent. Regardless, it is clear that argumentative optimists face a challenge. How might they respond?
For starters, optimists should ask whether deep disagreements really exist. And so, an optimist could concede Fogelin's point, and yet contend that, in fact, no actual disagreements are deep. One way the optimist could argue is as follows: In cases of persistent and hard disagreement, interlocutors seem not to share enough meanings in common to have their dispute properly count as a disagreement. That is, in order for two parties to disagree, there must be a sufficient degree of semantic overlap, otherwise there is no disagreement at all, and the parties simply "talk past" each other. In other words, when one party asserts "Birds fly," and the other says "Birds don't fly," they apparently disagree. But if it is discovered that the two parties do not share in common a broad conception of what it is to fly, what things are birds, what authorities to consult, or whether one of them really did see a seagull up in the air just the other day, we should conclude that there is no disagreement after all, but rather a case of mutual unintelligibility. Perhaps it's worse to countenance the possibility of mutual unintelligibility than deep disagreement, but it's one way to retain argumentative optimism. The deeper the disagreement, the harder it is to see it as a disagreement.
This means that insofar as we see disagreements as disagreements at all, we must take the disputants to share enough in the background to allow them to talk about the same things; that is, in order to see parties as disagreeing, we must take them to inhabit the same world. Consequently, we can never see disagreements as deep. Where we see disagreement, we see (in principle) resolvability.
A different optimistic strategy is to reject Fogelin's first premise. One might say that argument isn't only about resolving disagreements. Argument, as an exercise of our rationality, may improve our understanding of our own views and those of others. In an exchange, we may, in thinking about an issue, actually create common ground in developing a shared culture of reasoning together. Consequently, argument can be productive in deep disagreement cases, but it takes a longer-run view.
Finally, the optimist may reject even the fourth premise. She may deny that when argument gives out in urgent cases, one may resort to some form of non-argumentative persuasion. The optimist could insist that the fourth premise states a dangerous policy, since one may have mis-identified merely difficult or hard cases as instances of deep disagreement. Additionally, the optimist might claim that resorting to propaganda, rhetoric, verbal coercion or other non-argumentative means gives up on the plausible thought that even in cases of severe and stubborn disagreement, parties still can learn from each other. The Fogelin policy presumes that when disputes seem irresolvable, the only alternative is to simply defeat or at least neutralize one's opponents. But notice that these tools, were they used against us, would strike us as objectionable.
The dispute between argumentative pessimists and optimists is itself stubborn and unlikely to be soon resolved. But in light of the dangers of prematurely adopting pessimism, this tie goes to the optimist.
Monday, March 20, 2017
i’m having coffee
i’m dreaming I’m having coffee with Whistler’s mother
i’m scratching a knuckle with my nose
i’m not listening to my wife while gazing out a window
i’m imagining our small distant sun rising over the horizon of Neptune
i’m having coffee, paper cup with a heat sleeve
i’m playing with two small stones, twiddling them in my palm like Queeg
i’m remembering throwing stones through a neighbor’s bias
i’m sitting, but you don’t want to know where
i’m wondering if death is simply the mirror parenthesis of birth
i’m lying in bed staring at the ceiling slightly chilled. I need another blanket
i’m fooled again
i’m not fooled again
i’m having coffee, dark roast, the only kind
i’m wrong about a lot of things, too many
i’m dumber than a stump but smarter than a breadbox
i’m still wondering what it’s all about Alfie
i don’t care what it’s all about, I’m picking asparagus
i’m inside a cosmic question bouncing off its walls
i’m having coffee, Colombian this time, but dark, as I said…
i’m puffed as a peacock but simultaneously beside the point
i’m over the hill but still climbing
i’m loose as a goose and tight as a fundamentalist’s ass
i’m unknown, thank god, remembering Elvis
i’m anonymous as a red leaf in the Berkshires in Fall
i’m having coffee gazing over the rim of a mountain watching a small cloud glide
i’m as unbelievable as your average Mohammed or Mike
i’m at least as believable as your average Mohammed or Mike
i’m beating my head against the wall again painlessly
i’m taking an aspirin just in case
i’m having tea, green, trying to take coffee’s edge off
i’m under the gun, but still over the clover
i’m not sure
i’m cock sure
i’m as fraught with anticipation as I was when I was twenty, just not as often
i’m remembering something, but quickly change channels
i’m thinking again of a Dylan line, so many good ones blowin in the wind
time out of mind
I am having coffee
I am not having
I am not not
Aydın Büyüktaş. Flatland, USA. 2016.
"Inspired by Edwin Abbott’s 1884 publication ‘Flatland: A Romance of Many Dimensions‘, Aydın used drones and 3D modelling software to produce the elaborate images. Each image requires around 18-20 aerial drone shots which are then stitched together digitally to form sweeping landscapes that curl upward without a visible horizon. You can see more of his gravity-defying work on his personal site."
The Chickening of America, or Why We Don't Eat Fish (But Could Eat More)
by Carol A Westbrook
It's Lent. For many people, that means you have to deprive yourself of food that you like to eat, and instead punish yourself by eating fish. In actuality, you are not required to eat fish during the forty days of Lent; devout Catholics and other Christians are only required to abstain from meat on Lenten Fridays. Fish is merely a protein that can be conveniently substituted for the missing meat course--or you can eat eggs, cheese, pizza or eggplant Parmesan instead.
Yet some people are so unused to eating fish that when it appears in their diets it is memorable. Eating fish means "Lent." And they hate it.
During Lent we "try" to eat fish, and for many, McDonald's Filet-O-Fish is the answer. The company sells nearly a quarter of its filling, 390-calorie sandwiches during the six-week Lenten season. Although it contains wild-caught Alaskan Pollock, the sandwich contains only 2.8 oz. of this fish (as I calculated from the protein content provided in McDonald's online nutritional information). Since 2.8 oz. of Alaskan Pollock has only 73 calories and 0.8 g of fat, the Filet-O-Fish's 390 calories and 18.2 g of fat can only be attributed to the bread, tartar sauce, and melted cheese.
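The arithmetic is easy to check. Here is a quick back-of-the-envelope sketch in Python, using only the figures quoted above (the fish-portion numbers are my own estimate, as noted, not official McDonald's figures):

```python
# Back-of-the-envelope check of the Filet-O-Fish figures quoted above.
# Sandwich totals come from McDonald's online nutritional information;
# the fish-portion figures are the 2.8 oz estimate described in the text.
sandwich_calories = 390
sandwich_fat_g = 18.2

fish_calories = 73   # calories in 2.8 oz of Alaskan Pollock
fish_fat_g = 0.8     # grams of fat in the fish portion

# Whatever the fish doesn't account for must come from the bun,
# tartar sauce, and melted cheese.
print(sandwich_calories - fish_calories)        # 317 calories
print(round(sandwich_fat_g - fish_fat_g, 1))    # 17.4 g of fat
```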
I don't eat Filet-O-Fish because I honestly like fish a great deal more than I like bread, tartar sauce and melted cheese. Truly, I love fish. I love eating it in any way, shape or form -- from smoked and pickled, to raw, fried, steamed and everything in between. For example, while vacationing in Martinique, I had a plate of whole fried ballaboo, a local reef fish with a cute pointy nose that was meant to be eaten whole after deep-frying, sans pointy nose. Yum! (See the picture on the right). But most Americans don't share my passion; they hate fish.
Ten pounds of fish means only two meager 6-ounce fish meals each month, or one Filet-O-Fish per week. Fifteen pounds per person per year. That's a total of 4.5 billion pounds of seafood, in a country where our yearly harvest is 11 billion pounds, both fished (9.5 billion pounds) and farmed (662 million pounds). The US is the fifth largest producer of seafood in the world, and we export almost all of it. Ironically, for all of this abundance of seafood, most of the fish we eat is imported! Yes, it's true--we buy back our own fish from countries like China that purchase it, clean and bone it, process it to individual portions, re-freeze it, and sell it back to us.
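Those totals are easy to verify with the same sort of quick sketch, assuming a U.S. population of roughly 300 million (the round figure these numbers imply):

```python
# Scale per-capita seafood consumption up to a national total.
POUNDS_PER_PERSON_PER_YEAR = 15
US_POPULATION = 300_000_000   # rough figure; an assumption for this sketch

national_total_lb = POUNDS_PER_PERSON_PER_YEAR * US_POPULATION
print(f"{national_total_lb:,}")      # 4,500,000,000 -> 4.5 billion pounds

# And ten pounds of actual fish per year works out to:
ounces_per_month = 10 * 16 / 12
print(round(ounces_per_month, 1))    # ~13.3 oz: two meager 6-ounce meals
```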
My purpose is not to convince you to eat fish; I know that's impossible. I'm just trying to understand why the average American doesn't like fish. There is no biologic reason; unlike bitter foods, or cabbagey foods like broccoli, there is no specific taste receptor that might get triggered. When I ask Google why people don't like fish, I only get a lot of blogs saying things like, "Eww. Fish is so disgusting." "It's smelly." "It's scary, it will make me sick." "I don't know how to cook it." "I have to dissect it to eat it." "I don't want to look at the eyes." You get the picture. The aversion is cultural, it is learned, it is ingrained.
There are people who aren't averse to fish, but even those Americans would rather eat a bland lump of overcooked, tasteless white or pink food than delight in the culinary experience of a freshly-caught piece of fish, cooked to the proper degree of doneness, and lightly seasoned. Now that I think about it, Americans prefer to see all of their animal-based food as a bland lump, rather than appearing as it really is: a piece of meat with bones, skin, sinews and organs. The model for the ideal food is a boneless, skinless chicken breast. This is what I mean by "The Chickening of America." Don't believe me? You may recall that in the 90's the National Pork Board had a campaign to get people to buy more pork by touting it as "The other white meat." It succeeded.
Try this experiment. Walk through a supermarket and look for a bone-in pork chop, a bone-in beef or pork shoulder roast, and a ham that isn't semi-boneless and spiral sliced. Chances are you can't find all three. In some supermarkets it's even hard to find a whole chicken, though you can still buy drumsticks and wings that have bones. Thanksgiving may be the only time when we ever sacrifice a whole animal for a meal, but even then we buy one that has been cleaned, dressed, and injected with marinade.
And that's not all. These lumps of homogeneous mystery meat taste so bland that now we need to make them more palatable by covering them with seasoning and sauces, adding back as much saturated fat, high-fructose corn syrup, sugar, and salt as possible, thereby swamping any nutritional value the meat might possibly have. Or we'll take the tasteless lump and serve it smothered in sauces, gravy, melted cheese and saturated fat... Or we cover it in batter and deep-fry it. Have a hankering for a whole lobster? Red Lobster restaurants push the cheesy lobster casserole or skewered lobster meat instead. If you insist on a cooked whole lobster they will plate it for you after they have conveniently removed the offensive anatomic parts (the tasty tomalley and roe) and pulled the meat out of its shell to save you the trouble of having to dissect it.
So what's the problem with fish? Why can't we just eat more even if we have to manipulate and disguise it? We don't, because no matter how we disguise it, what remains is still the idea of fish--fishy smell, scales, fins and, well, anatomy. Yuck. It may look like chicken but, as Helene York wrote in The Atlantic, Nov 13, 2012, "Despite how it's marketed, most seafood doesn't taste like chicken."
Monday, March 13, 2017
Angel Otero. Carnival, 2012.
Art of Karachi
by Maniza Naqvi
I've traveled across the city from M.A. Jinnah Road and the Pioneer Book House in the neighborhood of Merewether Tower to an art gallery off 26th Street, Block 4, Clifton in Karachi, now in the shadow of another occupying towering tower. Same story. Of ground-breaking points of reference reaching back 1200 years. This route that encompasses galleys which brought Sidis and slaves and the Empires' soldiers, of alleys, and gullies and godowns and corridors, and mandirs and mazars and mosques, and synagogues. This route of the gods, part men-part women, and their many guest houses, whore houses, book houses, teahouses, sharab houses and more. I've crossed them all.
I have two hours before I call in to work---thousands of miles away---in the world where I am not quite like this. But still I hope the same. I'm here this evening to meet my friend Hani. But instead, in the moment, I've walked into the opening reception in the courtyard for Taqseem, the art exhibition. While I wait for Hani to arrive I go in to see the exhibit.
I stare at a photoshopped gigantic portrait of Jinnah by the artist Imran Channa---Jinnah in all his different iterations—perhaps seven different poses, now European, now Indian, now Pakistani, so cool, so well dressed, debonair, effete, sophisticated, immaculate. And I'm there—gazing at him, this beautifully dressed man—and I'm dressed in my 20-year-old khadi kurta—regretting not having washed my hands or feet or having taken a shower before I came here—and would it have killed me to have dragged a comb through my hair? But there wasn't enough time to fix things. For him, I mean. And I could've scrubbed my face. The one in the shades—I'm picking that as the one I'm feeling…
The founder of the Gallery, Noorjehan Bilgrami takes me aside and asks me whether I'll try my hand at writing about this exhibition and critiquing it. I'm astonished. I've never done this. And what would I know about critiquing art? But she insists that I do this and so here I am in between being at the Pioneer Book House and finishing a report on remittances in Somalia and discussing a Safety Nets Core Course's next agenda iteration.
Of course you can, she says. I know you can. Try it at least. She guides me to Omar Wasim…who is standing near his art work…This artist has a partner for this art—Saira Sheikh—but she is not here. I ask him if he's telling me the truth: is there really another partner in his work, or is she just a figment of his imagination, a part of the conceit—the she of he? And he assures me she very much exists. And he tells me about her. And why she is not here this evening, and I listen to him tell me about her. And I wish her the very best. And then I listen to him explain the art work to me—and how developers have divided and destroyed and made the earth sick.
Over the noise of the milling guests, the noise in my own head, I piece together his narrative of how monuments precede the land grabbing and the land destroying—and I examine the image on the wall, a black and white tower in a desolate landscape---and the way the light falls on it and given my references of homelands—it appears to me as if it is that tower. That one tower that drives all narrative now. That towering tower. Monuments that precede destruction……
Just then a pointy-headed pale young man, lithe and long, walks past and inadvertently with his foot he grazes the side of part of the installation—a mud embankment growing grass—his pink-sneakered foot chips off a chunk of its corner—I see that happen and I see him walk away. I call after him: Hey, look what you've done—you've damaged the art work here—He stops, stares at me and looks at the damage and replies in defense of his action—what is he supposed to do---I say come back and help fix it---own up to it---do something---He comes back and with his pink-sneakered foot starts to kick-sweep the fallen scattered dirt back to the base of the installation. Not like that, I say—show some respect, gather it up with your hands…He looks at me and in what I place as a Belgian accent he demands haughtily—Who are you to tell me what to do---you will not tell me what to do—you cannot tell me what to do—I will do what I want—and he walks away as quickly as he can. I am left there shaking like a leaf—saying yes I can tell you what to do---I am telling you what to do. Now do it. Who the hell do you think you are, some kind of spike-haired, pink-toed thug? That's in my head. I find out later that he is a ballet dancer from Albania via Berlin.
The curator, the gallery staff, the artist himself—all of them—try to calm me down. They are hesitant to accost this foreigner---they are somehow intimidated by him. I feel that they are not willing to confront this arrogant creature---and I am incensed. I go after him but he has left the premises…left the gallery entirely. If there ever is a narcissistic moment, as all art is expected to be--I think I'm engaged in it at full tilt and throttle.
And I am so angry it colors the rest of my evening—hot pink angry. That pink toed creature.
In the evening, as I think back to what has happened, I wonder what did happen. How did I go from serene earth mother, peace love and rock n roll—Faiz—dirt under her fingernails, to raging crazy person ninja bitch? Was all that art? Was his clipping the side of the piece of land—destroying it—breaking it apart—taking off a corner of it—then my reaction—my almost hot rage after being all sweet and gentle all day long—then his arrogance and refusal to fix it—restore it—fix what he had broken---and the rest of it—including my impotent demand that he come back right now, right this second, and fix it---fix it---was it all part of the installation—was that all supposed to happen just that way? Did I just get taken in by an elaborate artistic theater?
Or am I as always reading into things only what I want to. Projecting whatever it is that's going on in my head onto the art that I engage with? Or for that matter anything that I engage with. Did I just walk into this?
Next, Noorjehan introduces a very rattled me to Jamil Dehlavi. He is standing near his own work and looking at it. Noorjehan introduces us. I am expected, I feel, to know who Dehlavi is, but my brain is already dealing with ten different poses of Jinnah and a pink-sneakered poser, and I can't recall who this person is. So I immediately ask him about his work—the art in front of us. Just as we begin to talk, Noorjehan leaves us, and a newly discovered nephew of mine, an artist in residency at a collective in the neighborhood who is standing nearby, greets me, but I tell him that I'll catch up with him. He wears sunglasses—it's night and we're indoors. He's young. He's beautiful. He's an artist. Of course he's wearing shades indoors, at night. And he reminds me of a sepia photograph of my father and two uncles—a photograph taken in India, before partition, all of them wearing dark glasses—for a studio photograph.
I learn from Jamil Dehlavi that he is a film maker. And he made a film on Jinnah. I haven't seen it. Can't imagine how Christopher Lee gets to play Jinnah. Have they ever used an Indian or Pakistani to play Churchill or Mountbatten? I'm distracted by this and I tell him that Imran Aslam would make a great Jinnah. I don't know why I say this at this moment. Maybe the seven poses of Jinnah remind me that I had said that to Imran Aslam the one time I met him.
Koel had invited Dehlavi to create art for this exhibition Taqseem. Divide. His work is based on a personal history spanning the globe.
And now as I recall my conversation and piece it together in my memory—the script goes something like this: The scene is an art gallery—full of people. I am still recovering from the rage of art being damaged---a bit of earth being disrespected, damaged and kicked around, while the ‘natives' quietly gather it and try to fix it, all the while not confronting the destroyer. I am tired. I've been trying to save a bookstore. I'm grimy. And I add to my anxiety another one: that I have here a nephew whom I should get to know better, and I hope that he doesn't think I'm ignoring him. I tell Jamil Dehlavi about myself. I request that, of all the things on my resume---the different categories of me---he focus on the fact that I'm a writer. We are so many things at once. My hands and feet are dirty because I'm trying to restore a bookstore, the oldest in the city, on the street named after the man of seven images, and I'm trying to save it from closing down. And the man I'm talking to is a film maker who has made a film about the man of many images, and he is other things too.
His father was Indian when he met his mother and married her. She was from France. They traveled back from Europe to live in Bombay. She brought with her many books. I ask if the bottle-green ocean liner trunk carried the books his mother had brought with her. No, he replies, puzzled at the question.
I say—in the world we live in what is home? Is it a house? Can it be books? A bookshelf? A book house? The choice of books as luggage. The heaviness of it. The lightness of it. Or perhaps the seafaring trunk is just a metaphor for what is home. Or a coffin. He listens and shrugs and contemplates his work.
I wonder why he considers himself divided after all we are all many things—no one is just one thing anymore--never ever not now not ever was or is---and everyone is always traversing many different cultures, nowadays, every day--why-- in a given day—within a few hours. I check my watch and turn my gaze to another place.
At the far end I look towards the metal panel—Amin Gulgee's work. The rectangular piece is suspended from the ceiling, like a grill--like a gate---the lattice work appears as if it were Urdu calligraphy---it casts a shadow of itself on the wall as light passes through it. And as I think back to it now----I think—even the hint—the very image of what might be words----needs light to take on fleeting meanings---and it leaves an indelible impression.
Monday, March 06, 2017
Vajiko Chachkhiani. Father, 2014. Performance.
by Akim Reinhardt
Donald Trump is going down. His house of cards will collapse at some point. The leaks will keep flowing and eventually his position will become untenable. Conflicts of interest. Connections to Russia. All of it will become too great a weight to carry, especially since The Donald has very few genuine allies in Washington.
The Democrats want him gone. So too do most of the Republicans. Hell, they never wanted him to begin with. The GOP did everything it could to derail his candidacy, and only climbed aboard after Trump's runaway train was the last red line careening towards the White House. So for now they're playing nice with the former Democrat who eschews Conservative dogma in a variety of ways and is loyal to absolutely no one save himself. But when the moment comes, they'll gladly trade Trump in for Mike Pence, a Conservative's wet dream.
For all these reasons, Trump may not make it to the finish line. But there's one more factor to consider: the precedent of regicide. And to understand that, we should begin by briefly recounting the demise of the Ottoman sultan Osman II.
Young Osman II ascended the Ottoman throne in 1618 at the tender age of 14. Wishing to assert himself, in 1621 he personally led an invasion of Poland, which ended with a failed siege of Chota (aka Khotyn, now in western Ukraine). In a rather unwise move, Osman blamed the defeat on his elite fighting force, the Janissaries. Afterwards, he ordered the shuttering of Janissary coffee shops, which he saw as a hotbed of conspiracies against him. The Janissaries responded with a palace uprising. In 1622 they imprisoned the 17-year-old monarch and soon after killed him. Because it was strictly forbidden to spill royal blood, they strangled him to death.
I first learned about the rise and fall of Osman II in 1992 while taking a graduate course on Ottoman history. "Something happens," our professor warned us in a foreboding tone, "the first time an empire commits regicide."
Afterwards, this lesson stuck with me, and I began to wonder if I might apply it to U.S. history. However, I did not look at presidential assassinations as necessarily parallel. Even the murder of presidents Abraham Lincoln and William McKinley, both of whom were dispatched by killers with avowedly political motives, didn't seem to bear the same lesson. For it was not just the violence of Osman II's end that was important. It was also that he had been done in not by a rogue assassin, but rather by competing elements of his own government.
In U.S. history there have been instances when Congress attempted to remove presidents from office. But instead of the royal bow string that was wrapped around Osman II's throat, the instrument of American regicide has been impeachment hearings. Our regicides have been strictly political, not literal.
The first serious effort to impeach a president came against Andrew Johnson, who inherited the White House in 1865 after Lincoln's death. Part of the move against him can be located in crass politics. Johnson, a Democrat from a Confederate state (Tennessee), faced a Congress thoroughly dominated by Northern Republicans. And the actual grounds upon which he was eventually impeached were likely unconstitutional; hoping to restrain Johnson, Congress had passed a bill forbidding presidents to remove their own cabinet members without congressional consent. When Johnson fired Secretary of War Edwin Stanton, whom he'd inherited from Lincoln, the House filed 11 impeachment charges against him.
However, there was a lot more to the attack on Johnson than just politics. In truth, while the direct justifications for impeachment were dubious, the entire episode reflected something much larger and deeper: an epic struggle for the soul of the nation.
How does one put a country back together after it has been riven by civil war? Furthermore, what would be the fate of both former Confederate rebels and newly freed slaves?
The Constitution offers no clear formula for addressing these questions. Congress obviously wanted a say in the matter. However, after assuming the presidency, Johnson took the initiative, pursuing his own reconstruction plans while Congress was out of session.
Presidential Reconstruction (1865-66) was an utter fiasco. A former slave owner, Johnson was a bitter and virulent racist. He eventually came to oppose slavery, but he believed the former slaves should remain in a deeply subservient position. As president, he overtly opposed the 14th amendment (1868), which granted African Americans citizenship and equal protection under the law. He also stood by and did nothing as Southern states ran amok. The infamous Black Codes quickly replaced the old Slave Codes; state and local laws sprouted up to bypass the 13th amendment and trap African Americans in exploitative agricultural labor practices. The KKK and other groups used violence to suppress African American political action. When blacks tried to assert their equality, they often met lethal oppression. Scores of African Americans were killed in race riots across the South. Adding insult to injury, Johnson also handed out pardons to high ranking Confederates, eventually up to and including former Confederate president Jefferson Davis.
Congress was appalled and began administering its own version of Reconstruction, passing a series of civil rights acts, reconstruction acts, and constitutional amendments that granted African Americans political rights and oversaw a prolonged military occupation of the South. By the time he was done, Johnson had issued more vetoes than all of his predecessors combined. But with a super majority, Congress simply passed legislation and overrode his opposition. Johnson hurt his own cause with a disastrous speaking tour that turned most Northern voters against him. By the time of his impeachment trial in 1868, he was barely governing.
The House impeached Johnson and the Senate voted 35-19 in favor of removing him. However, since a 2/3 majority was needed, it failed by one vote. Part of the reason is that many moderate Republicans were less than enamored with Radical Republican Senator Benjamin Wade, who was in line to replace Johnson. Moderates figured it would be easier to advance their agenda with a hobbled Democratic president than with a feisty fellow Republican they didn't always agree with.
Thus, Johnson remained in office for a little over a year, largely neutered of power. A serious effort at American regicide had taken place for the first time.
It's difficult to measure the long term impact of Johnson's impeachment on the presidency. He was followed by Civil War hero Ulysses S. Grant, whose own presidency was fairly robust but also plagued by enough corruption that the former general's reputation waned greatly. Afterwards came a parade of weak presidents, now forgotten by most Americans and associated by most high school history students with little more than odd facial hair styles. It was not until the turn of the 20th century that the presidency assumed a stronger role, as Theodore Roosevelt and later Woodrow Wilson helped establish what historians sometimes refer to as the Imperial Presidency: strong chief executives who promoted imperial ambitions abroad and (by the standards of the time) an activist federal government at home.
After a succession of presidents dedicated to small government and laissez-faire policies, the office was again strengthened under Franklin Roosevelt, who used it to combat the Great Depression and fight World War II. By the 1970s, Andrew Johnson's impeachment was more than a century past, and for most Americans he had been reduced to little more than an odd historical footnote and a black and white illustration in textbooks.
Then came Richard Nixon.
Like many developed nations with large Baby Boom cohorts, the United States today has a rather old population. The median age is nearly 39. That means nearly half the people, a large majority of the voting age population, and an overwhelming majority of current politicians were alive when Richard Nixon resigned the presidency amid scandal in 1974 to avoid the near certain prospect of impeachment by the House of Representatives and removal from office by the U.S. Senate. I believe that this is the modern case of American regicide that has so greatly influenced the political landscape ever since, shattering the taboo and leaving every subsequent president more vulnerable to political attacks from competing elements within the state.
Space will not allow me to review the myriad twists and turns of Watergate. Suffice it to say that, unlike the impeachment of Andrew Johnson, this was not a battle for the soul of the nation and its people filtered through a flimsy, partisan impeachment and removal trial. Rather, Richard Nixon was simply a corrupt and vile politician who attempted to further his career by actively undermining American democratic institutions when his goons stole election campaign secrets from Democratic Party headquarters. And then he ordered a cover up, complete with an illegal slush fund. And then he publicly lied about it. And then he abused his office by attempting to quash investigations. And then he refused to cooperate with Congress and the courts.
In short, Richard Nixon was a felon who committed high crimes against the state, and he had to go.
However, despite the righteousness of his exit under threat of removal, the floodgates were opened. The office of the presidency had been weakened, and every subsequent president had to carry the burden. Presidents were no longer the especial symbol of American virtue. They were no longer half a step above the fray and accorded a higher degree of respect. They were now just like any other politician, subject to the dirtiest of tricks and forever within the opposition's sights, an indelible target upon their backs.
Don't believe me? Just check the numbers.
Through the tenure of Barack Obama, there have been 44 presidents. Only two of them have been impeached (Johnson and Bill Clinton), with one other facing the near certain prospect of it (Nixon). However, a dozen presidents have seen a Congressperson officially move to begin impeachment hearings, with the cases eventually going nowhere. Of those dozen, half came before Richard Nixon. The other half were aimed at Nixon and his successors.
In other words, of the 36 presidents who preceded Nixon, only six endured a motion for impeachment, and only one was actually impeached or faced serious threat of it.
After Nixon's resignation, 5 of the next 7 presidents suffered an impeachment motion in the House, and one of them, Bill Clinton, was actually impeached. In fact, every president beginning with Ronald Reagan has seen a member of Congress move to impeach him.
Ronald Reagan faced an impeachment motion over the Iran Contra Scandal.
George Bush the Elder faced an impeachment motion over the first Iraq war.
Prior to actually being impeached over the Monica Lewinsky scandal, Bill Clinton faced an impeachment motion for allegedly obstructing an investigation into campaign contributions from foreign sources.
George Bush the Younger faced an impeachment motion over his version of the Iraq (and Afghanistan) war.
Barack Obama faced two impeachment motions: one for administering the drone program in Afghanistan and Iraq, and the other for the odd combination of charges that he failed to perform his presidential duties while also abusing his presidential powers.
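To put the before-and-after shift in one glance, here is a toy tally of the counts cited above (a sketch only; the figures are the ones given in this essay, not a sourced dataset):

```python
# A toy tally of the impeachment-motion counts cited above.
# Figures are the essay's, not a sourced dataset.
before_nixon = {"presidents": 36, "with_motion": 6}
after_nixon = {"presidents": 7, "with_motion": 5}   # Ford through Obama

for era, d in [("Before Nixon", before_nixon), ("After Nixon", after_nixon)]:
    share = d["with_motion"] / d["presidents"]
    print(f"{era}: {d['with_motion']} of {d['presidents']} presidents ({share:.0%}) faced a motion")
# Before Nixon: 6 of 36 presidents (17%) faced a motion
# After Nixon: 5 of 7 presidents (71%) faced a motion
```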
All of this is not a coincidence. Rather, it marks a fundamental change in American presidential politics. It highlights a new attitude towards the presidency. This is the fallout of American regicide.
Congress did the right thing by chasing Richard Nixon from the White House. The correctness of congressional actions in that case is supported not only by almost every serious historian and political analyst who has assiduously studied the matter, but also by the bipartisan movement against Nixon.
In February of 1974, the House of Representatives voted to authorize its Judiciary Committee to consider impeachment hearings. The vote tally? A whopping 410-4.
Later that year, when the Judiciary Committee recommended impeaching Nixon on three counts, it was not quite as overwhelming a vote, but it was not strictly partisan. All the Democratic committee members voted Yes on all charges, but some Republicans also concurred: GOP members voted Yes a total of 15 times and No a total of 36 times on the three counts. And on the two measures that failed, all Republicans voted No, but so too did some Democrats: Dems voted No a total of 18 times.
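Those tallies are internally consistent, as a quick check shows (a sketch; the 21-Democrat, 17-Republican committee split is my added assumption, since the paragraph above gives only vote totals):

```python
# Consistency check on the 1974 House Judiciary Committee tallies quoted above.
# Assumption (not stated in the text): the committee had 21 Democrats and 17 Republicans.
republicans, democrats = 17, 21
adopted_counts, failed_counts = 3, 2

gop_yes, gop_no = 15, 36      # GOP votes across the three adopted counts
assert gop_yes + gop_no == republicans * adopted_counts   # 51 = 17 * 3

dem_no_on_failed = 18         # Democratic No votes across the two failed measures
assert dem_no_on_failed <= democrats * failed_counts      # 18 <= 42, consistent
```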
Moving to impeach Nixon was absolutely the right thing, and it might eventually have come to fruition even if Republicans had held the House, as the evidence against the president became ever more damning and undeniable.
Nevertheless, the unintended repercussions of House actions and Nixon's flight from office are still with us. Every president is now a potential target for impeachment, particularly if the opposition party is in control, as witnessed by Clinton's impeachment, which, regardless of its actual merits, passed the House on a partisan vote and then failed in the Senate on a partisan vote.
The regicide of Richard Nixon is a legacy we continue to live with. Impeaching the president is no longer seen as a gasp-inducing nuclear option demanding the most serious of circumstances. All presidents now live with the specter of impeachment. Thus, the possibility of a serious movement to impeach Donald Trump arising at some point seems all the likelier.
Furthermore, Mike Pence is no Benjamin Wade. Many Republicans themselves are ill at ease with The Donald and strongly prefer Pence.
And so, should the Democrats regain the House in the 2018 midterm elections, it seems all but certain that Trump will face impeachment. But even if Republicans maintain their control of the House, they may yet work behind the scenes to manifest a more informal regicide.
If things continue to deteriorate, Republicans may pressure Trump to resign. Perhaps he would cite health concerns to save face, claiming an endless string of supposed victories on his way out the door.
And if things degenerate to the point that even a sizeable share of Republican voters disavow Trump, then the GOP itself could begin impeachment proceedings should The Donald fail to heed the pressure to resign. That scenario, which seems rather far-fetched at the present, highly partisan moment, could become more viable should the revelations of Trump's connections to Russia and Vladimir Putin become so clear that all rational voters can no longer deny them.
Under those circumstances, it would be vital for Republicans to get Trump out of office with enough time for Pence to assert himself as a legitimate incumbent for the 2020 election. Over a year should do it. By the time the 1976 election rolled around, Gerald Ford had spent two years as president after taking over for Nixon. It almost worked. He was able to fend off a challenge within the party from Ronald Reagan, and probably would've beaten Jimmy Carter had he not hung the albatross of the Nixon pardon around his own neck in one of his first acts as president.
The Republicans will remember this. If they need to remove Trump from office because they risk going down in flames with him, then they will move quickly so that Pence can establish himself.
All in all, it seems some level of attempted political regicide against Donald Trump will emerge over the next four years. The details of course are impossible to predict. Whether it is the actual regicide that Nixon suffered, the near regicide that Clinton endured, or the far less successful attempts that everyone after Carter has witnessed, remains to be seen. At this point, we can't even know if it will be out in the open or take place behind closed doors, or if it will be initiated and pushed by the Democrats or the Republicans. But something is probably in the offing.
The king will soon be dead. Long live the king.
Akim Reinhardt's website is ThePublicProfessor.com
"Shut up and Calculate" (Galileo, Kepler and Schrödinger's Cat)
by Leanne Ogasawara
Trouble, you ask?
Well, according to science writer Tom Siegfried, "Quantum mechanics is science's equivalent of political polarization." He says:
Voters either take sides and argue with each other endlessly, or stay home and accept politics as it is. Physicists either just accept quantum mechanics and do their calculations, or take sides in the never-ending debate over what quantum mechanics is actually saying about reality.
This is to draw attention to the fact that as the mathematical models become increasingly sophisticated in representing a foundational picture of physical reality, the conclusions that can be drawn from that picture become "impossibly weird." Such weirdness is what led to Schrödinger's cat, for example, and later on to the weirdness called "entanglement" and to the multiverse, both of which we can safely describe as "extreme weirdness."
To understand what physicists mean when they call something "weird," Weinberg pointedly contrasts the situation in quantum mechanics with the ever-reliable consolation that is classical Newtonian physics, in which one can quite accurately calculate where a ball will land if its velocity and direction are known. Not so in our new universe: in the crazy upside-down world of quantum mechanics, no one can say for sure where an electron will be when it is measured--only where the probability wave is most intense.
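To make that classical reliability concrete, here is a minimal sketch (my illustration, not Weinberg's) of the deterministic calculation for an ideal, drag-free ball launched from the ground:

```python
import math

def landing_distance(speed: float, angle_deg: float, g: float = 9.81) -> float:
    """Horizontal range of an ideal projectile (no air resistance), in meters."""
    theta = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * theta) / g

# Classical determinism: the same inputs give the same landing spot, every time.
print(f"{landing_distance(20.0, 45.0):.1f} m")  # ~40.8 m
```

Run it a thousand times and the ball "lands" in exactly the same place; quantum mechanics offers no analogous function for the electron, only a probability distribution over possible outcomes.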
Even Einstein lost his nerve at this point, famously saying, "God does not throw dice." And Weinberg wonderfully describes how both Schrödinger and Einstein distanced themselves from quantum mechanics later in their careers for just this reason, as the observational results have led particle physicists into saying the damnedest things.
And interestingly, this weirdness eventually led to the two different approaches to the issues so wonderfully described by Tom Siegfried above. Before reading Weinberg's piece I wasn't actually aware that particle physicists thought in terms of "realism" versus "instrumentalism," as this is a dichotomy I had associated with an older period in the history and philosophy of science.
This winter, I am auditing a fascinating class at Caltech taught by science historian Stefano Gattei on the "Parallel Lives of Galileo and Kepler." If you think about it, it is pretty incredible that two such brilliant minds, working on such similar topics and both deeply absorbed in religious matters (Catholicism and Lutheranism, respectively), lived in basically the same period of time. The two great men did, as we know, exchange a few letters--but far fewer of them than one would have imagined, given how closely their interests overlapped.
Parallel lives indeed.
Today in class, Professor Gattei mentioned how historian Frances Yates once referred to Kepler as "the last true Medieval occultist." Whereas Galileo was perhaps the world's first scientific positivist, relentlessly seeking to distance himself from any metaphysical point of view, Kepler believed that he was doing "God's work," making known God's mind through mathematics. Believing that God could be recognized in the Book of Nature, Kepler saw himself as a kind of "priest." And so he approached his work with many metaphysical assumptions, in precisely the way a Lutheran pastor approaches the world! (Kepler had wanted to become a clergyman, after all.) For example, because the universe is God's handiwork and represents "the mind of God," Kepler believed that mathematical models describing the universe must also be beautiful.
But, he also believed they must be "true" or “real."
In Astronomia Nova, the book presenting his first and second planetary laws, Kepler took great pains to present his work as a testament to God's glory. Galileo, in a letter, cautioned Kepler that physical and metaphysical reasons should not concern the natural philosopher. That is because Galileo believed, as most physicists of the time believed, that mathematical models were only as good as they were accurate at predicting events and "preserving the phenomena."
And yet, Galileo, like Kepler, was a realist. In the words of Professor Gattei, "Both Galileo and Kepler were realists. They differed in their approaches: whereas Kepler preferred to work a priori, and draw conclusions about the world from a number of geometrical assumptions, Galileo chose first to observe, and then advance more general theories," leaving metaphysical claims aside.
When it comes to quantum mechanics, Steven Weinberg is, much like Galileo, a realist. But he has some doubts as well, saying in the NYRB article that:
The realist approach has a very strange implication..., first worked out in the 1957 Princeton Ph.D. thesis of the late Hugh Everett. When a physicist measures the spin of an electron, say in the north direction, the wave function of the electron and the measuring apparatus and the physicist are supposed, in the realist approach, to evolve deterministically, as dictated by the Schrödinger equation; but in consequence of their interaction during the measurement, the wave function becomes a superposition of two terms, in one of which the electron spin is positive and everyone in the world who looks into it thinks it is positive, and in the other the spin is negative and everyone thinks it is negative. Since in each term of the wave function everyone shares a belief that the spin has one definite sign, the existence of the superposition is undetectable. In effect the history of the world has split into two streams, uncorrelated with each other.
Weinberg calls this "strange" and "unsettling." And before dismissing it all as a kind of parochial preference, he says, "Like many other physicists I would prefer a single history."
Wouldn't we all?
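For readers who want to see the structure Weinberg is describing on the page, a standard schematic (my rendering in conventional notation, not Weinberg's own) writes the post-measurement state of electron, apparatus and physicist as a two-term superposition:

$$
|\Psi\rangle = a\,|\uparrow\rangle\,|\text{``saw up''}\rangle + b\,|\downarrow\rangle\,|\text{``saw down''}\rangle, \qquad |a|^2 + |b|^2 = 1
$$

Each term is an internally consistent history, and nothing inside one term can detect the other--hence the "two uncorrelated streams."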
Instrumentalists work under various assumptions as well, of course. It is impossible to be completely distanced from the current scientific paradigm, so even the most detached instrumentalist will never be working in a complete ideological vacuum. But one of the challenges with realism in science is that, as a realist aiming to make truth statements, one brings along many more preconceived notions and blind assumptions. Instrumentalists, committing only to creating models that "save the phenomena" and make accurate predictions, simply do not have to worry much about what it all means. They don't have to concern themselves too much with conclusions or inferences, such as splitting worlds or issues of non-locality and entanglement.
A realist will "want" conclusions to be in harmony with other known facts and existing laws. The realist might also very seriously prefer conclusions to be commonsensical or perhaps beautiful. Even Galileo, says J. L. Heilbron in his biography of Galileo, was guided by his sensibility and taste.
Galileo could stick to an attractive theory in the face of overwhelming experimental refutation. During his period of greatest creativity in the science of motion, from 1602 to 1609, he probably jumped from theory to experiment and from one idea to another, circled back and forth, inventing the form of descriptive mathematical physics, guided by little more than his buon gusto.
There is a wonderful essay by the great S. Chandrasekhar, called "Beauty and the Quest for Science," that I highly recommend. After using Kepler as an example of science guided by aesthetic concepts, Chandrasekhar tells a wonderful story about Einstein and Bohr that I had never heard before.
"God does not throw dice," by Einstein; or even more provocatively, "When judging a physical theory, I ask myself whether I would have made the Universe in that way if I had been God," also by Einstein. In the context of these last two statements by Einstein, it may be well to remember Bohr's remonstrance, "Nor is it our business to prescribe to God how he should run the world!"
Isn't that great? What is it about human beings that we can even assume our minds are capable of fathoming ultimate truths? This is an assumption that scientists almost always carry into their work--that they can know objective truth and ultimate causes. But reading Chandrasekhar's delightful essay, I am reminded that these gut feelings regarding "beauty" or "weirdness," as well as the faith--like Kepler's or Einstein's--that we "know what God would do," do lead to breakthroughs that are "right for all the wrong reasons." There have been so many instances of proof only much later catching up with a scientist's inspired hunch. Chandrasekhar gives examples of this sensibility leading to success. For example, he tells the story of Weyl, who once said, "My work always tried to unite the true with the beautiful; but when I had to choose one or the other, I usually chose the beautiful."
This aesthetic sensibility is what informs Weinberg, and I think he is in the company of giants in having faith in his preference for theories that "make sense" to him, since the field he is working in is far from worked out. Faith assumptions cut both ways, of course, but somehow science as a search for truth--a truth grounded in mathematical elegance and simplicity and in unified laws--seems an honorable, if risky, quest. Perhaps the great scientists all want to be philosophers in secret, since the utilitarian value of instrumentalism seems hollow and ultimately a dead end.
Kepler was perhaps the first astronomer in history to insist that he was not solely constructing mathematical models to "save the phenomena" but was also doing physics. He was quite clear about this in his later work, in fact, insisting that his aim was to provide physical causes for the phenomena that the mathematical models merely described. And so, in addition to mathematical models (remember, up till then astronomy, like music, was a subset of mathematics--not physics), he was aiming to provide hypotheses of physical causes to explain the observations he had inherited from Tycho Brahe.
Indeed, to be visionary is to see what others have not yet seen, because that is how we know where to look. But most of all the great scientists have a sense of play. Human beings show great hubris trying to understand all of existence, and we must be prepared to be surprised-- and to be delighted by the surprise. For if nothing else, we know that the Book of Nature loves to fool us again and again and again.
Recommended: the blog Renaissance Mathematicus and Steven Weinberg's To Explain the World: The Discovery of Modern Science.
O che bel stare è star in Paradiso ("Oh, how lovely it is to be in Paradise")
The future will be virtual and augmented
by Sarah Firisen
Science fiction has always run the gamut from extreme prescience on one end to paranoid fantastical delusions at the other, and everything in between. But it has always done more than merely try to predict future technologies; it has played its part in our imagining of the future. From the imaginations of writers and filmmakers spring fantastical creations that generations of science and tech geeks dedicate themselves to making a reality - there may be no better example of this than the fervor around creating Marty McFly’s hoverboard from Back to the Future. And while there is a very long list of fictional universes that have clearly inspired generations of scientists, maybe none has had quite such a direct and sustained impact as Neal Stephenson’s Snow Crash has had on the envisioning, creation and even lexicon of virtual and augmented reality. The impact is so clear that Stephenson was stalked by the founders of the extremely hyped but uber-secretive augmented reality firm Magic Leap and was eventually persuaded to join them as their Chief Futurist. Stephenson coined the term metaverse, and popularized avatar in its virtual-world sense, in his rather dark tale of a pretty dystopian early 21st century.
Actual virtual reality technology has been around for 20 years or more in one form or another. But it’s really only in the last few years that a perfect storm - cheap and plentiful data storage, extreme advances in computing power, and, perhaps most opportunely, the rise in quality and drop in cost of tiny hi-res screens thanks to the proliferation of smartphones - has meant that, despite many false starts, we may finally be on the cusp of realizing a level of virtual and augmented reality that comes close to Stephenson’s vision.
The line between technology evangelist and someone who drones on and on about the fad du jour may be one I have to walk almost daily, but I do find that one of the problems with trying to get work colleagues as excited about these technologies as I am is that they really have to be experienced. As I mentioned in an earlier piece, my first experience of wearing an Oculus Rift and playing a game that left me increasingly isolated on an asteroid brought on such an extreme manifestation of my real-world fear of heights that I had to take the headset off before the game was done. It was really at that moment that I more fully realized what the potential of virtual reality (VR) and augmented reality (AR) could be.
Over the last year or so, I’ve had the opportunity to experience many of the different headsets and technologies that are currently on the market: Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Google Cardboard and a host of smartphone-based AR apps. While the quality of the experience in the higher-end options is pretty impressive (playing with a HoloLens really is just super geeky cool), at this point there is very clearly a tradeoff between cost, functionality and portability. Most of the higher-end (in functionality and price) options involve a clunky headset that is tethered to a processor. The HoloLens is an untethered headset, but at this point it’s at the highest price point of all of them. But watch this video of Microsoft’s experiments with holoportation, the remote projection of real-time, high-quality 3D representations of people and things in a way that allows interaction between people not physically located in the same space. There’s no doubt that the video does a good production job of making the holographic avatar seem of a higher quality than it is. And Microsoft doesn’t even try to pretend that the creation of their experience is anything less than clunky at the moment. None of this is even vaguely ready for primetime, yet. But I do have it on good authority that this video wasn’t just total smoke and mirrors: they did manage to create holoportation. I’m not sure when I was last as excited about the potential of a technology as I was watching that video.
There are all sorts of significant technological hurdles to overcome before these headsets are something that can easily be integrated into our everyday lives. If and when Magic Leap finally releases its technology, we’ll see just how much undeserved hype the industry has been playing into for the last year or so. But regardless, all the major players know that the race goes to the company that can make a lightweight, untethered, fully functional wearable and sell it at a reasonable price point. Part of the Magic Leap hype is that it will do away with the need for screens - computer screens, smartphone screens, all of them. Why do you need a TV? Just have your family put their AR gear on, and everyone on your sofa can watch as many screens and shows as they want. In fact, forget watching; you can have an immersive experience of the shows. There are already companies offering live VR experiences of concerts and sporting events.
One of the more pressing VR challenges is a shortage of content. And just as in the early days of smartphones, this is about a lack of standards and distribution channels and a market still too small to compensate for development costs. But this will change. An Android-like platform for VR will inevitably come along. As will an app store. At the moment, the hardware is outpacing the platforms and so outpacing the content, but they will begin to align as the hardware becomes cheaper and the potential market grows.
One of the other challenges I have when talking to colleagues about these technologies is the incorrect perception that this is just about gaming. Like so much in technology, the initial frontiers may be gaming and pornography, but they are the very tip of the iceberg. It’s hard to think of an industry that couldn’t be impacted and potentially disrupted by AR and VR: retail, engineering, healthcare, entertainment, travel, transportation, real estate. Case Western Reserve University has piloted using the HoloLens to transform medical education. The popularity of streaming services and the growth of mobile platforms have caused the entertainment and communications industries to rethink what it means to consume and pay for content. In a world of unlimited virtual screens, will there be yet another paradigm shift in this area? I could really keep going and going with potential use cases.
And these technologies will impact how we interact with each other; there have already been reported cases of VR sexual harassment. Much as we’ve had to do with the growth and pervasiveness of the internet, we’re going to have to rethink and extend our definitions of various crimes and unpleasant human interactions. The concept of cyber-bullying is recognized as a very real, common and destructive activity. How much will we have to extend our understanding of it, and our response to it, once all of our teens spend a significant amount of their time in virtual or augmented reality?
I met my ex-husband in 1995 when I joined a local NYC Internet Service Provider and we “bumped” into each other using UNIX mail and YTalk. I was a very early adopter of HTML-based internet browsing. I remember trying to explain how amazing all this technology was to people, and the look they all gave me. The idea that you might go on a date with someone you met using a computer and a phone line was insane. Incredulous people would make me repeat our digital cute-meet story over and over. I remember being told by my then employer, a boutique investment bank, to stop playing with the internet because they wouldn’t be interested in it anytime soon!
My point: there is precedent for my pronouncements on the huge future impact of technology being discounted. There are a lot of significant, likely hugely impactful technologies that are already disrupting business and society today: blockchain, machine learning and artificial intelligence, and robotics, to name just a few. But I do think that VR and AR will be fundamentally disruptive to our lives as humans in much the same way as the internet has ended up being, maybe even more so.
I think there will come a time, in the not so distant future, when, for someone in the developed world, imagining a life that is only real-world based - that isn’t in some way digitally augmented or that has no virtual experiential components - will be as incredible as a disconnected life seems to most of us now. I’m writing this piece using Google Docs; I will then post it on a blog, with links to a variety of sources, video and text-based. At the same time, I’m occasionally checking my email and Facebook and looking at the headlines on NYTimes.com (I stopped getting an actual paper about 15 years ago). Soon I’m going to make dinner from some online recipes I’ve stored, using food that I ordered online and pans I bought on Amazon. All these activities are utterly mundane for most of us. Nothing I’m doing at the moment is in any way noteworthy. And my prediction is that in 5-10 years, maybe sooner, much of the technology depicted in Snow Crash will be equally mundane.
Louis Armstrong and the Snake-Charmin’ Hoochie-Coochie Meme
Some years ago I was looking for a way to open the final chapter of a book I had been writing about music, Beethoven’s Anvil: Music in Mind and Culture. The chapter was to be a quick tour of black music in 20th Century America, starting with jazz and blues and ending with hip-hop. So, I thought and thought and, finally, an idea crept up on me.
I had this book of Louis Armstrong trumpet solos that I’d been practicing from ever since my early teens. The solos had been transcribed from recordings Armstrong had made in the late 1920s and had been circulating ever since. These were classic Armstrong: “Cornet Chop Suey,” “Struttin’ with some Barbecue,” “Gully Low Blues,” “Muggles” (nothing to do with Harry Potter; “muggles” is old New Orleans slang for Armstrong’s favorite inhalant) and a few others. One was a response to a recent recording by McKinney’s Cotton Pickers, “Tight Like That,” and was called, naturally enough, “Tight Like This.” During his improvisation Armstrong quoted a certain riff, not once, but twice (at roughly 2:04 and then 2:13).
How did I know it was a quotation? Because I was familiar with the riff from other contexts. For one thing, it showed up in cartoons, often to accompany a snake charmer, but also as general all-purpose Oriental mystery music. For another, I knew it as a children’s song that my buddies and I used to sing, with lyrics to the effect that the girls in France didn’t wear underpants – hotcha! But how did Armstrong know this tune? He recorded “Tight Like This” in 1928, the same year that Walt Disney produced “Steamboat Willie,” generally regarded as the first cartoon with a fully synchronized soundtrack. So Armstrong’s recording predated the tune’s use in cartoon soundtracks. Did he learn it as a kid growing up on the streets of New Orleans?
I made a few phone calls, sent some emails to friends, queried a trumpeters’ listserv (sponsored by TPIN, the Trumpet Players’ International Network), and information began trickling in. In the first place, other people remember this tune from their childhoods. One Eric Johnson, from the TPIN list, told me that his daughters remember these lyrics:
All the girls in France do the hokey pokey dance, 
And the way they shake is enough to kill a snake.
Karen Stober, also from TPIN, tells me the tune was sung by two children facing one another and clapping hands to the lyrics:
On the planet Mars all the women smoke cigars.
Every puff they take is enough to kill a snake.
When the snake is dead they put flowers on its head.
When the flowers die they say 1969! [whatever year it is].
We’ve moved from France to Mars, but there’s that snake again, and now we’ve got cigars – a regular Freudian wonderland of sub-rosa implication. What fun. I found a somewhat fuller version on the web where the dance was characterized as a “hookie-kookie dance.”
I then followed a lead from my friend David Bloom, who suggested I check out the 1893 Chicago World’s Columbian Exposition (aka World’s Fair). It was a major event in American cultural life. It featured the first large-scale use of alternating current and the first Ferris wheel. The exposition hosted delegations from all over the world, including Japan – the first chance Americans had to experience that nation and its people, who were here, of course, to learn about us as well. This is when and where hamburgers became all-American fast food; Pabst Blue Ribbon beer flowed freely on the midway; Kellogg’s Cornflakes debuted here as well. And, wouldn’t you know it? Elias Disney, Walt’s father, was a carpenter on the construction job. But all this is beside the point.
The point is about the entertainment on the midway. Yes, we had Buffalo Bill Cody, and we had John Philip Sousa. But we also had a lithe young woman who danced as “Little Egypt.” The exposition’s press agent, Sol Bloom, claimed that he had written our little tune just so Little Egypt could dance to it. The tune was a hit and was subsequently copyrighted under various names, including Dance of the Midway, Coochi-Coochi Polka, Danse de Ventre, and The Streets of Cairo. Just how it came to be copyrighted several times is a bit of a mystery, but the fact that several folks claimed it as their own testifies to the tune’s popularity. One of those folks, W. J. Voges, included it as the Koochie-Koochie Dance in the second edition of the Pasquila Medley, published in New Orleans in 1895. We’ve now got the tune in New Orleans at a date prior to Armstrong’s birth.
Another of my TPIN informants, trumpeter’s trumpeter Jeanne Pocius, told me that this melody appears in Arban’s Complete Conservatory Method for Trumpet as one of the “Sixty-eight Duets for Two Cornets,” which follow the “150 Classic and Popular Melodies.” There our tune is called “Arabian Song.” Arban’s, as it is known among trumpeters, is the central method book for “legit” trumpet and cornet pedagogy. It was written and compiled by Jean-Baptiste Arban, a 19th-century cornet virtuoso, composer, conductor and teacher on the faculty of the Paris Conservatory. He first published his Grande méthode complète pour cornet à pistons et de saxhorn in 1864, well before Sol Bloom claimed he’d written the tune for Little Egypt. Was Sol Bloom telling a stretcher – as Huck Finn called it – when he claimed that tune as his own?
Where did Arban get it? Roughly half the book is a set of technical exercises graded from elementary to extremely advanced. Those exercises are followed by complete tunes and compositions, real music. And that’s where we find our snake-charmin’ shimmy-shakin’ tune. The melodies Arban collected comprise what appears to be a European songbook of the 1860s. Some of these tunes were by recognized masters of the European high art tradition—Mozart, Beethoven, Mendelssohn, Bellini, von Weber, and Haydn, among others. But many were just tunes, attributed to no one. And so it is with “Arabian Song.” As I said in Beethoven’s Anvil (pp. 253-254):
When we consider the lyrics this tune has attracted, its use in cartoons to accompany snake charming, and its title, it seems to be a musical icon of the Mysterious Licentious Orient, which had fascinated European peoples at least since the Crusades. It is the only song identified with the Orient in Arban’s collection, but other tunes have national or ethnic identification. Thus we find a “German Song,” a “Neapolitan song” and a “Swiss Song,” a “French Air” and an “Italian Air,” a “Russian Hymn” and an “Austrian Hymn,” as well as “Blue Bells of Scotland” and “Yankee Doodle.” In compiling his collection of melodies Arban clearly wanted to present music from all the civilized nations he could think of. It is thus in the service of a truncated ethnic inclusiveness that he included an “Arabian Song”—or, more likely, the one-and-only “Arabian Song” he knew.
Beyond this, the opening five notes of this song are identical to the first five notes of Colin Prend Sa Hotte, published in Paris in 1719. Writing in 1857, J. B. Wekerlin noted that the first phrase of that song is almost identical to Kradoutja, a now-forgotten Arabic or Algerian melody that had been popular in France since 1600. This song may thus have been in the European meme pool 250 years before Arban found it. It may even be a Middle Eastern song, or a mutation of one, that came to Europe via North Africa through Moorish Spain or was brought back from one of the Crusades. For all practical purposes we can consider it to be nearly as old and widely dispersed as dirt. And, on the evidence, equally fertile.
We still don’t know exactly how Armstrong and the tune found one another, but that no longer seems like much of an issue. That tune got around. But I wouldn’t be surprised if he learned it as a child, and that it had lyrics similar to those that I learned, and that others learned after me.
If we think of that melody as a meme, or perhaps several memes, then Armstrong’s uses – he used it in other tunes as well – would be mutations, and his use of only part of the melody would seem to be a kind of memetic recombination. But what is that fragment being recombined with? For one thing, it was being combined with other licks, or riffs as musicians call them. Some of these fragments may have come from the same pool that floated the “Arabian Song” across the Atlantic from Europe. Others may have been indigenous to the United States or even local to New Orleans or Chicago. Jazz culture, like any musical culture, is full of these licks, which can come from any place.
Musical quotation is rife among jazz musicians. Dizzy Gillespie often inserted a fragment of the “Habanera” from “Carmen” into his “A Night in Tunisia.” Dexter Gordon liked the opening phrase of “Mona Lisa.” Lee Morgan recorded an improvisation in which he quoted Ziggy Elman’s licks from Elman’s famous solo on “And the Angels Sing,” which Elman recorded with Benny Goodman. Morgan was too young to have seen Elman perform live and so must have learned those licks from a recording. As for Elman’s licks, they would have come from Europe with Elman’s Jewish ancestors.
Anyone who has read jazz biographies has read many accounts of jazz musicians hearing jazz on records or on the radio, becoming intrigued, inspired, and learning from recordings and broadcasts. These sources have likely been as important in jazz’s evolution as direct person-to-person transmission, for they allowed memes to spread over great distances in a matter of days or weeks. Musicians learned, not only from those whom they knew directly, but also from those who recorded and broadcast. Jazz culture was thus able to develop a huge pool of memes which musicians could use in performance. When jazz musicians play, they call on various intersecting pools of material which they then assemble into a performance.
Of course, jazz musicians and would-be jazz musicians weren’t the only ones to hear the broadcasts and the recordings. Everyone heard them and was familiar with the tunes and riffs, the licks and phrases, the memes of jazz culture. The audience was thus primed to hear and appreciate what the musicians played. And that priming is as important to cultural life as the performances by the musicians themselves. As I wrote in Beethoven’s Anvil (pp. 255-256):
The greatness of an individual musician such as Armstrong is a function, both of his power to forge compelling performances from the “raw” memes and of the existence of that meme pool. While Armstrong may have been ahead of his fellows, he couldn’t have been very far ahead of them, otherwise they could not have performed together. Beyond this, without a large population of music-lovers familiar with the same meme pool, Armstrong’s recordings would have had little effect. By the time he went to Chicago, a large population had been listening and dancing to rags and blues, show tunes, fox trots and Charlestons and marches, all with a hot pulse and raggy rhythms. Armstrong’s improvisations gave them a new wild pleasure, and their collective joy made him great.
* * * * * *
It appears in this cartoon from the 1940s, though not in the context of snake charming. It starts about 01:54:
It occurs again at roughly 05:10, where it is followed by a marvelous dance sequence with some fabulous trumpet playing on the soundtrack (the reason I like this cartoon).
Here’s “Tight Like That” by McKinney’s Cotton Pickers, arranged by the great Don Redman:
Here’s a Google query on “all the girls in France” that coughs up several hundred thousand results.
Check out the Wikipedia entry, The Streets of Cairo, or the Poor Little Country Maid. It recounts some of the history and has links to various versions of the tune, including in cartoons.
Shira also has a useful page: Streets of Cairo: That “Snake Charmer” Song.
Monday, February 27, 2017
The Owl of Minerva Problem
by Scott F. Aikin and Robert B. Talisse
Wisdom is a product of experience and reflection. As a consequence, it's often quite a long road to that goal. It's for this reason that the poetic expression, "the Owl of Minerva Flies at Dusk," has its effect. Only at the end of the day, once the work is done and we recline in thought, do the insights of what we ought to have done, what the best option was, and what was wrong about a particular decision become clear. We live forward, but we understand backward. And that can occasion distinctive problems.
In democratic politics, this point about insight is certainly true. And it extends not only to the errors we may make as a country, but also to the errors we make in understanding ourselves and our decision-making. In its current form, much democratic theory is focused on the decision-making and argumentative elements of modern political life. This deliberative democratic movement casts democratic life as participation in ongoing discussions, wherein all have a voice, no issue is beyond question, and every decision must be justifiable to all those whom it affects. These are admirable ideals, but we understand the ways we can fail those ideals only by making mistakes, only by witnessing the pathologies to which public reason is prone.
We experience living in a democracy and then we see the particular kinds of challenges and errors to which reasoning together can be prone. Perhaps we should have anticipated the effects of the group polarization that seems to define contemporary political discourse, but we understand it all too well now that we live under its conditions. The incurious dogmatism of epistemic closure, the slippery euphemism of Orwellian Newspeak, and the abuses of and visceral reactions to political correctness are all political phenomena that we must see as developments from particular histories, arising within particular social settings. We do not know them a priori.
The Owl of Minerva Problem at first looks like a simple point about the retrospective nature of knowledge: you must first have experience to know, so knowledge must be dependent on (at least some) events of the past. But the Owl of Minerva Problem raises distinctive trouble for our politics, especially when politics is driven by argument and discourse. Here is why: once we have a critical concept, say, of a fallacy, we can deploy it in criticizing arguments. We may use it to correct an interlocutor. But once our interlocutors have that concept, that knowledge changes their behavior. They can use the concept not only to criticize our arguments; it will change the way they argue, too. Moreover, it will also become another thing about which we argue. And so, when our concepts for describing and evaluating human argumentative behavior are used amidst those humans, they change their behavior. They adopt it, adapt to it. They, because of the vocabulary, are moving targets, and the vocabulary becomes either otiose or abused very quickly.
Consider how fallacy vocabulary is now used less as a device for the cool evaluation of arguments than as a tool of evasion or attack.
Ted Cruz famously attacked Donald Trump during the primary season for being the kind of person who relies on the ad hominem.
Further, the use of the ‘straw man’ charge to defend against any and all criticism in online argument is so widespread that the strategy has been added to a comic pantheon of argumentative personae.
The point, again, is that the tools we've used to make sense of, evaluate, and improve our attempts at rational exchange have become tools for subverting it.
And now we see the same phenomenon with the expression ‘fake news.' The term originally gained purchase as a way to explain the proliferation of false stories about the 2016 presidential election in the US: for example, that the Pope endorsed Donald Trump, or that Hillary Clinton was running a child pornography ring in a pizza parlor's basement. Now, however, the expression ‘fake news' is used by Donald Trump to disparage claims he holds to be contrary to his interests. And so he says that CNN is fake news, and that the Russian ties to General Flynn are fake news. And so vocabulary we'd used to understand our joint exercise of reason is now part of that exercise, changing and being changed by it.
And so, we may understand ourselves and the work of reasoning together only in retrospect, because the tools we use to make the parts of our practice explicit for endorsement or evaluation themselves become part of the practice and are changed by it. This is both good and bad news. The bad news is that our task of understanding ourselves and having a complete grasp of best theoretical practices is always incomplete and open to abuse by our very terms. But the good news is that those changes made by and to our critical vocabulary occur because we care for reason and wish to live up to its dictates. Even the most egregious fallacy is yet an attempt to lay claim to reason's legitimacy.
“The woolly mammoth vanished from the Earth 4,000 years ago,
but now scientists say they are on the brink of resurrecting the ancient
beast in a revised form, through an ambitious feat of genetic engineering.”
If the wooly mammoth becomes the new Lazarus
reborn from an ice sarcophagus
does it mean that we may all return one day
to beat our breasts at the injustice of death
but also to rejoice in miracles? It’s an
honest question, we’ve been asking it
for generations, yet it’s never been answered
but in myth, the story that elevates ignorance
to poetry, that blazes red trails in pigment,
that ends up only as sublime music to our ears,
elusive, illusory as the apparition of tomorrow
But we still have this day
It seems never to end
Reality Check: Wine, Subjectivism and the Fate of Civilization
by Dwight Furrow
I must confess to having once been an olfactory oaf. In my early days as a wine lover, I would plunge my nose into a glass of Cabernet, sniffing about for a hint of cassis or eucalyptus, only to discover a blast of alcohol thwarting my ascension to the aroma heaven promised in the tasting notes. A sense of missed opportunity was especially acute when the wine was described as "sexy, flamboyant, with a bounteous body." Disappointed but undaunted, I would hurry off to wine tastings hoping the reflected brilliance of a wine expert might inspire epithelial fitness. It was small comfort when the expert would try to soften my disappointment with the banality, "it's all subjective anyway." So one evening, while receiving instruction in the finer points of wine tasting from a charming but newly minted sommelier, I let frustration get the better of me and blurted, "Well, if it's all subjective, what the hell are we doing here? Is it just your personal opinion that there is cassis in the cab, or is it really there? We all have opinions. If you're an expert you should be giving us your knowledge, not your opinion!" Someone muttered something about "chill out" and it was quickly decided that my glass needed refilling. But the point stands. The idea of expertise involves the skillful apprehension of facts. If there is no fact about aromas of cassis in that cab, there is no expertise at discerning it.
These conversations over a glass of wine are more pleasant (because of the wine) but structurally similar to the semester-long task of getting my college students to realize that moral beliefs are not arbitrary emendations of their lightly held personal attitudes but are rooted in our need to survive and flourish as social beings. Yet even after weeks of listening to me going on about the sources of value they still write term papers confidently asserting that with regard to "right" and "wrong", eh, who knows?
Subjectivism, the view that a belief is made true by my subjective attitude towards it, has long been the default belief of freshman students and arbiters of taste. Unfortunately this tendency to treat it as the wisdom of the ages has escaped the confines of the wine bar and classroom into the larger society. Buoyed by the cheers of multitudes, our fabulist-in-chief routinely finds his "own facts" circulating in what seems to be an otherwise empty mind. This is no longer mere fodder for a seminar debate.
Accompanying this idea that we are entitled to our own facts is the belief that reality can be invented through sheer force of the will. Authoritarian leaders have always sustained their power by re-defining reality such that complex problems are amenable to simple, authoritarian solutions. The idea of the strongman who can act and succeed independently of true belief, the confidence that conviction and will are sufficient to solve problems, is the logical extension of subjectivism, and the U.S. now has its very own Combover Caligula to test the theory.
This drama takes place against the background of majorities believing that while scientists keep our planes aloft, our computers humming, and help the enormously complex human body fight disease, they can't make simple measurements of CO2 concentration and temperature gradients. Climate change denial is the ultimate fabulation, the most extreme case of simply ignoring an inconvenient reality because you would rather it were different.
The common denominator linking all these fabulations is the belief that reality is whatever the mind says it is. Reality poses no independent standard to which our thoughts and attitudes must conform. Unfortunately, this idea has a rich and influential philosophical pedigree. The monumental presence of Immanuel Kant looms over the modern world, for it was Kant who argued that reality as-it-is-in-itself can never be known. According to Kant, the structure and organization that reality appears to have--constituted by time, space, and causation—is a product of the mind imposing order on reality according to principles and categories that enable these "appearances" to make sense to us.
Before my colleagues in philosophy go apoplectic let me clarify that I am not suggesting a logical or causal connection between the sophisticated arguments of Kant and the puerile subjectivism discussed above. Kant was no subjectivist because he argued that the rules that govern perception and reason are universally shared among rational beings (among which he includes, perhaps mistakenly, persons). Furthermore, his arguments were based on the quite plausible notion that any claim about reality as-it-is-in-itself will be dependent on how the mind gives structure and meaning to that claim, and thus all reference to a mind-independent reality is pure speculation. It was Kant's laudable dislike of unsupported claims and his awareness of the limits of human knowledge that led him to be cautious about claims to know reality. The traditional notion of "the real" is that which is independent of human experience, something unsullied by the distortions imposed by human thought. Kant was right that the very attempt to think such a thing would inevitably bind it to human thought.
Nevertheless, for the rash and incautious, it's a very short step from the view that a mind-independent reality is unknowable to the claim that therefore we can just forget about reality as a constraint on our ideas altogether. Thus, I wonder if the "spirit of the age" has finally run roughshod over the careful, rigorous skepticism of Kant by demonstrating the ultimate absurdity of thought disconnected from reality. At this point in history we urgently need a dose of reality. An awareness of the limits of knowledge and the impenetrability of the real is not sufficient; we need an awareness of reality pushing back, penetrating our insights and offering stubborn facts to which we must attend. After all, Kant does require that we bite a very large bullet. He poses the question whether we should believe him or our lyin' eyes which tell us that reality is right there in front of us. We are all intuitive realists; only in a philosophy seminar would we think otherwise.
Despite its alleged universality, Kant's view that all of this is just an elaborate construction of the mind seems to invite elaborate reconstructions based on all manner of preferences and prejudices, and so I fear that if we are to get beyond fabulation we must get beyond Kant. And that means showing that we need not bite that bullet that Kant thought necessary.
However, the alternative to Kant's transcendental idealism seems equally absurd. For the most straightforward way of rejecting subjectivism is to take on board the kind of objectivity to which the natural sciences aspire—what is real is whatever the best scientific theories say is real. But that leaves us with an arid reality evacuated of all meaning and value, since the mindless, meaningless physical particles and fields of force discovered by physics seem to lack any essential reference to what matters to us. Appeals to science have little to say about what we ought to care about, let alone the aesthetics of wine, moral norms or anything else in life that depends on judgment. We seem to be stranded on one side or the other of an abyss formed by the mighty pillars of objectivity and subjectivity with no way to traverse the chasm. Is there a way across that chasm?
Kant is arguing that we can't prove the common sense view that we are in touch with an independent reality and so intellectual rigor demands we be skeptical. This puts the pursuit of knowledge in the driver's seat but leaves us bereft of the very knowledge we seek. Yet, before we can prove anything we must first meet the causal force of reality head on. As we move about the world it presses in on us, resisting our actions, disrupting intentions, penetrating mind and body, a piercing, gale-force wind that requires careful tacking to navigate.
Kant wants to say this causal force is itself something the mind imposes on itself. But that is only remotely plausible after stepping back in a moment of abstract doubt and asking what we can really know. It's not addressing the human experience of a reality that buffets, ingresses, rubs, wounds, attracts and fascinates. What Kant misses is that our fundamental transaction with the world is not via knowledge. It is via feeling, emotion, sensibility, attraction and repulsion, in other words, aesthetics. Reality is felt before it is known—I suffer and love, therefore I am. Skepticism gets no foothold here.
How does this acknowledgement of the felt influence of causal forces help avoid subjectivism? That would be a very long tale, but I will try to provide a sketch. The causal lines of force that resist our aims but also enable all human creativity are indicators of something deep and consequential. For they emerge out of potentialities, latent forces, dispositions in things that, when activated by the presence of other things, including human beings, have a direction. I call these directed lines of force telic norms: patterns of probabilities that prescribe how reality might develop under certain conditions. These telic norms are attractors for feeling, to which we respond with pleasure or aversion. There is value in the world, for without it I doubt that a frog could catch a fly.
Whatever positive influence we have over reality will be realized by responding to telic norms under conditions appropriate to their realization—otherwise chaos ensues. Objectivity is achieved by accurately tracking the lines of force that emerge from a given set of conditions and that provide an anchor for telic norms. Whatever the future holds, it will emerge from these lines of causal influence and our ability to absorb their direction and make use of them. The first contact with them is not the mind that knows but the sensibility that feels drawn or repelled. When the mind spins away from these lines of force we have subjectivity and error.
Which brings me back to wine (you just knew I would return to wine). Winemaking is an art form in which the quality of the final product depends on nature and the recognition that nature has its own powers and dispositions that we can only sometimes, and within limits, influence. With each vintage nature imposes its "will". Good winemakers accurately track the telic norms that emerge first from the grapes and later from the wine in its various stages of development, in light of their sensibility and intentions regarding the final product.
The problem of objectivity is not that critics or consumers disagree about the quality of particular wines. Of course they disagree. We all have different preferences and histories and convergence of judgment would not be desirable in any case. What matters for objectivity is that critics and others who taste aesthetically track the potential of a wine, taste its ability to provide satisfaction to various people with differing sensibilities. Aesthetic tasting is not a matter of asserting subjective likes or dislikes but of identifying potentiality, the latent forces and indwelling capacities of a wine to produce pleasure.
Of course wine quality (or beauty if you prefer) is subjective to a degree but it is not merely subjective. It isn't something we project or impose onto an object but is a response to something in the object being judged, an appreciation of its power to affect us which is more felt than apprehended.
Kant was alleged to have had a taste for the grape. Had he tasted aesthetically might history have developed differently?
For more on the aesthetics of food and wine visit Edible Arts or consult American Foodie: Taste, Art and the Cultural Revolution.
Politics Trump Healthcare Information: News Coverage of the Affordable Care Act
by Jalees Rehman
The Affordable Care Act, also known as the "Patient Protection and Affordable Care Act", "Obamacare" or the ACA, is a comprehensive healthcare reform law enacted in March 2010 which profoundly changed healthcare in the United States. This reform allowed millions of previously uninsured Americans to gain health insurance by establishing several new measures, including the expansion of the federal Medicaid health insurance coverage program, introducing the rule that patients with pre-existing illnesses could no longer be rejected or overcharged by health insurance companies, and by allowing dependents to remain on their parents' health insurance plan until the age of 26. The widespread increase in health insurance coverage – especially for vulnerable Americans who were unemployed, underemployed or worked for employers that did not provide health insurance benefits – was also accompanied by new regulations targeting the healthcare system itself. Healthcare providers and hospitals were provided with financial incentives to introduce electronic medical records and healthcare quality metrics.
As someone who grew up in Germany, where health insurance coverage is guaranteed for everyone, I assumed that over time the vast majority of Americans would appreciate the benefits of universal coverage. One no longer has to fear financial bankruptcy as a consequence of a major illness, and government-backed health insurance also provides peace of mind when changing jobs. Instead of accepting employment primarily because it offers health benefits, one can instead choose a job based on the nature of the work. But I was surprised to see the profound antipathy towards this new law, especially among Americans who identified themselves as conservatives or Republicans, even if they were potential beneficiaries of the reform. Was the hatred of progressive-liberal views, the Democrats and President Obama, who had passed the ACA, so intense among Republicans that they were willing to relinquish the benefits of universal health coverage for the sake of their political ideology? Or were they simply unaware of the actual content of the law, opposing it purely for political reasons?
A recent study published by a team of researchers led by Sarah Gollust at the University of Minnesota may shed some light on this question. Gollust and her colleagues analyzed 1,569 local evening television news stories related to the ACA that were aired in the United States during the early months of the health care reform's rollout (between October 1, 2013, and April 19, 2014). They focused on local television news broadcasts because these continue to constitute the primary source of news for Americans, especially those aged 50 and older. A Pew survey recently showed that 57% of all U.S. adults rely on television for their news, and among this group, local TV news (46%) is a more common source than cable news (31%) or network news (30%).
Gollust and colleagues found that 55% of the news stories either focused on the politics of the ACA, such as political disagreements over its implementation (26.5%), or combined information regarding its politics with information on how it would affect healthcare insurance options (28.6%). Only 45% of the news stories focused exclusively on the healthcare insurance options provided by the law. The politics-focused news stories were also more likely to refer to the law as "Obamacare", whereas insurance-focused news segments used the official name "Affordable Care Act" or "ACA". Surprisingly, the expansion of Medicaid, which was one of the cornerstones of the ACA because it would increase potential access to health insurance for millions of Americans, was often ignored. Only 7.4% of news stories mentioned Medicaid at all, and only 5% had a Medicaid focus.
What were the sources of information used for the news stories? President Obama was cited in nearly 40% of the stories, whereas other sources included White House staff or other federal executive agencies (28.7%), Republican (22.3%) or Democratic (15.9%) politicians and officials. Researchers, academics or members of think tanks and foundations were cited in only 3.9% of the news stories about the ACA even though they could have provided important scholarly insights about the ACA and its consequences for individual healthcare as well as the healthcare system in general.
The study by Gollust and colleagues has its limitations. It did not analyze TV network news, cable news, or online news outlets which have significantly gained in importance as news sources during the past decade. The researchers also did not analyze news stories aired after April 2014 which may have been a better reflection of initial experiences of previously uninsured individuals who signed up for health insurance through the mechanisms provided by the ACA. Despite these limitations, the study suggests that one major reason for the strong opposition among Republicans against the ACA may have been the fact that it was often framed in a political context and understated the profound effects that the ACA had on access to healthcare and the reform of the healthcare system itself.
During the 2016 election campaign, many Republican politicians used the idea of "repealing" the ACA to energize their voters, without necessarily clarifying what exactly they wanted to repeal. Should all aspects of the ACA – from the Medicaid expansion to the new healthcare quality metrics in hospitals – be repealed? If voters relied on local television news to learn about the ACA, and if this coverage – as Gollust's study suggests – treated the ACA predominantly as a political entity, then it is not surprising that voters failed to demand nuanced views from politicians who vowed to repeal the law. The research also highlights the important role that television reporting plays in framing the debate about healthcare reform. By emphasizing the actual content of the healthcare reform and its medical implications, and by using more scholars instead of politicians as information sources, these media outlets could educate the public about the law.
There are many legitimate debates about the pros and cons of the healthcare reform that are not rooted in politics. For example, electronic medical records allow healthcare providers to easily monitor the results of laboratory tests and avoid wasting patients' time and money on unnecessary tests that may have been ordered by another provider. However, physicians who are continuously staring at their screens to scroll through test results may not be able to form the interpersonal bond that is critical for a patient-doctor relationship. One could consider modifying the requirements and developing better record-keeping measures to ensure a balance between adequate documentation and sufficient face-to-face doctor-patient time. The ACA's aim of tracking the quality of healthcare delivery and penalizing hospitals or providers who deliver suboptimal care could significantly improve adherence to guidelines based on sound science. On the other hand, one cannot demand robot-like adherence to guidelines, especially when treating severely ill, complex patients who require highly individualized care. These content-driven discussions are more productive than wholesale political endorsements or rejections of the healthcare reform.
Healthcare will always be a political issue, but all of us – engaged citizens, patients, healthcare providers and journalists – need to do our part to ensure that debates about this issue, which directly impacts millions of lives, are driven primarily by objective information and not by political ideologies.
Gollust, S. E., Baum, L. M., Niederdeppe, J., Barry, C. L., & Fowler, E. F. (2017). Local Television News Coverage of the Affordable Care Act: Emphasizing Politics Over Consumer Information. American Journal of Public Health, (published online Feb 16, 2017).
Eileen Alice Soper (1905-1990). When Badgers Awake.
John Lister-Kaye, naturalist and wildlife writer, describes his experience with Soper in "Gods of the Morning":
"As we approached the (badger) setts in the dusk she seemed to slough off her human-ness and transmogrify into something more than half wild. I couldn't understand how she sat so still. She denied cold and rain, she ignored itches - a gnat landing on her nose - she seemed to become part of the wood herself, part of the tree, the soil, the still evening air ..."
Special note to my siblings: Eileen Soper was the illustrator of our beloved childhood books by Enid Blyton - look!
There's a certain kind of conversation in which I find myself every so often, which can roughly be summarized as "What's the big deal about DJing?" As someone who was a quasi-professional DJ in a former life, and is currently what one friend terms a 'monastic DJ', I've sensed a substantial gap in lay understanding of not just what a DJ does while engaged in the act of mixing, but also the place occupied by DJs in the contemporary musical ecosystem. This attitude — not unlike looking at a Jackson Pollock while muttering to yourself that you could do just as well — has received further support from the rise and fall of the spectacularly excessive (and, to my ears, creatively bankrupt) EDM scene; the unholy marriage of superstar DJs, casino-based clubs and overpriced bottle service; and the fact that watching someone DJ is fundamentally uninteresting.
Is there any value in mixing other people's music? When viewed from the most reductive position, the answer is clearly not. As critic David Hepworth noted in a now-deleted blog post, "You must surely realise that you make your living by putting on records, which is only a tiny bit removed in degree of difficulty from switching on the radio." If that's all that DJs are good for, then I suppose it's a relief that streaming services and software-driven playlists have come along to put this particular horse-and-buggy paradigm out of its misery.
Instead, it's more helpful to look at the larger role that DJs play in parsing the ocean of music in which we swim in these post-Napster days. Just as we turn to critics in other fields to understand what we should be reading or watching, we also turn to DJs for clarity on what to listen to. In this sense, the appropriate metaphor is one of the DJ as tastemaker.
In order to talk about how a DJ guides others' taste in music, we have to address the DJ's own, internal process. Over time, a DJ is a collector, a curator and an editor. Of course, being a DJ involves inhabiting all three of these roles at the same time, all the time, but there is also a progression here. I'll go over each of these and then return to what it means to be a tastemaker at the end of this post.
Collecting is the baseline activity for any DJ. Obviously, pretty much everyone has a music collection, but DJs take it to an obsessive level. Whether you're steeped in a particular genre — probably the most common trajectory—or collect the music from a particular era or geography, a DJ's collection is the foundation from which everything else flows.
Collecting is an endless process. To be sure, there is a real joy in finding obscure gems that might be decades old, or music that's just extremely overlooked. This is generally known as crate-digging. Collecting can also become an arms race — i.e., the competition to access new releases before they drop commercially. But even if you have the hot new remix from so-and-so, or a white label vinyl pressing that no one else does, you might be ahead of the game for a week, and then only in your little tide pool of the electronic music universe. And while the collecting arms race led to interesting collective responses, too, such as the creation, in 1975, of record pools, any DJ quickly finds out that simply having a solid collection is necessary, but not sufficient, for realizing the work itself.
If collecting is about wrapping your arms around as much of your chosen domain as possible, we may logically ask whether there is such a thing as too much music. It's not unusual for DJs to have 15,000 or more tracks in their collections, as well as stacks of records and CDs that patiently await a critical listening. To which I would say: that's like asking a painter if they have too many colors in their palette, or an interior decorator if they have too many fabric swatches, or a fashion designer too many styles of buttons. All of these professionals engage in the act of remixing their materials into new, exciting and, perhaps most importantly, appropriate arrangements that speak to the needs of a unique aesthetic moment. So it's not so much an issue of having too much music, but rather the possibility of owning something and not knowing that you do.
The DJ's collection is the DJ's instrument.
Curation is, admittedly, a word that's been beaten to death in the last few years. Everyone is a curator now — if only because they are ‘curating' their own life. This is nonsense. It's kind of like saying that we're all knowledge workers, or that everyone we work for is a client.
In a stricter sense, curation is the act of assembling a representative collection of (traditionally tangible) objects. The assembly makes sense in some way — it is literally sense-making. So if you went to the recent Picasso sculpture retrospective at New York's MoMA, you didn't expect to see all of Picasso's sculptures, but a strategic sampling of them, displayed and annotated to demonstrate the artist's progression through time and across media.
In the same sense, the DJ is a curator for a particular domain of sound. Having listened to thousands of tracks, the DJ can select the seminal compositions that demonstrate the development of a sound or genre through its history (indeed, in some musical cultures DJs are known simply as ‘selectors'). This curatorial act can be performed either in real time, or in hindsight.
In the case of the former, the DJ is helping to define the sound of the moment. Kool DJ Red Alert did as much for hip-hop in the 80s and 90s with his long-running show on New York's 98.7 KISS-FM. But DJs also continue to define the contours of a genre even after it's become well-established. A good example here is the seminal set of mixes that Solitude (Tom Bond) has done for UK dubstep.
One more distinction bears mentioning here: while valuable, the kind of curation seen in various oral histories and "bluffer's guides" around the Web differs from the curation a DJ does (a charming example is this guide to Italo-disco). It's also distinct from the kind of magisterial presence that trend-setters such as John Peel had. In Peel's case he cultivated a weekly show for BBC Radio 1 over the remarkable span of nearly 40 years and helped to break countless bands to a global audience.
In contrast to these functions, the DJ presents the results of curation in the form of a mix. This may seem trivial, but the fact is that much of this music is designed to be heard in a mix. For example, it's not uncommon for dance tracks to begin in a thoroughly uninspiring manner, as with a simple kick drum hitting every beat. That's because producers know that DJs need a few bars to sync up the new track to its predecessor; a naked kick drum is the toehold that allows for a quick and effective transition. By the same token, dance music tracks, unlike pop music, rarely fade out, but will have elements drop out at regular increments of time (usually denoted in cycles of 16 beats), until there is usually only a kick drum remaining. This way the DJ can mix out of the expiring track in an elegant and seamless manner. Thus, this design for mixing carries the additional, curious trait that certain parts of a track aren't meant to be heard by anyone but the DJ.
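Since this paragraph is implicitly doing tempo arithmetic, a quick back-of-the-envelope sketch may help (my own illustration, not the author's; the BPM figures are invented): how long a 16-beat phrase lasts at a given tempo, and how far the pitch control must move to bring an incoming track into sync with the one playing.

```python
# Beatmatching arithmetic. Illustrative only; the BPM values are made up.

def phrase_seconds(bpm: float, beats: int = 16) -> float:
    """Duration of a phrase of `beats` beats at the given tempo."""
    return beats * 60.0 / bpm

def pitch_adjust_percent(playing_bpm: float, incoming_bpm: float) -> float:
    """Percent tempo change the incoming track needs to match the one playing."""
    return (playing_bpm / incoming_bpm - 1.0) * 100.0

print(f"{phrase_seconds(128):.1f} s per 16 beats at 128 BPM")           # 7.5 s
print(f"{pitch_adjust_percent(128, 124):+.1f}% on the incoming track")  # +3.2%
```

At 128 BPM, each 16-beat phrase gives the DJ about seven and a half seconds to work with, which is exactly why those naked kick-drum intros and outros matter.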
The mix is the preferred, long-term format for understanding electronic music. When DJs listen to individual tracks they are always thinking about how those tracks can be made to interact with other tracks in their collection. A beautiful song that has no possibility of interacting with the rest of a collection is simply not useful, since the desired outcome will always be a series of relationships between musical thoughts. Another way of thinking about listening to mixes versus standalone tracks is comparing it to the difference between reading a paragraph and immersing yourself in an essay. A good mix is an extended, coherent argument.
In the same way that you go to a museum to understand how to think about Picasso, you listen to DJs in order to understand how to think about a genre, or to see where a particular sound is headed.
Finally, editing draws upon the DJ's skills in making decisions in real time. This can be within the context of a live gig or a studio recording. Both have advantages — the good DJ feeds off the crowd and tailors selections for the moment, while studio recording allows a DJ to carefully assemble a definitive statement over the course of days or weeks (this exemplary techno mix by British selector Objekt took several months to refine and polish).
In both cases, DJs not only select what they will play, and in what order, but make many other decisions. There's really no reason to play a track all the way through; I generally regard DJs who tend to only do this as pretty lazy. It's more interesting if you can start Track A at the breakdown, then mix Track B from its beginning, then mix back into the beginning of Track A, and end with the second half of Track B. Even better, save the second half of Track B until you've played some of Track C.
All of these decisions are accompanied by the skills and tools that DJs have traditionally had — equalization, cross-fading, pitch shifting, simple hi-pass/lo-pass filters. Digital DJing has added many more, such as effects, loops, and cue points. The ability to access tracks, beats and samples quickly has also reduced the time it takes to perform an edit in real time, to the extent that DJs with the right raw materials and skills can execute what are essentially remixes on the fly, or custom flows that can never be repeated.
It follows, then, that DJing technology blurs the line between the extroverted phenomenon of playback and its introverted correlate, production. While it's not production as it's commonly understood, what's created is a grey zone in which the source material may be difficult to separate out from the mix as a whole. A good DJ will emphasize specific aspects of the music, or bring elements from different tracks into dialog with one another. This is all in the service of showing a listener the best that a certain collection has to offer. I discuss an example from my own mixing here.
Good DJs take the most interesting bits and put them together in the most interesting ways.
To return to the idea of tastemaking then: the DJ stands as the interpreter through which listeners access sound. In fact, this is virtually a requirement for electronic music in particular, with thousands of producers working across hundreds of genres that are constantly cross-pollinating one another. Moreover, the tempo of production has increased dramatically, due to the falling cost of both studio gear and distribution (at this point, only a laptop and an internet connection are needed to launch a project or even a record label). This is radically different from any other genre of music, which either has a fixed repertoire (classical) or is expanding, but slowly (jazz, rock and pop). It's an ocean of oceans out there — let a DJ help you make sense of it all.
This post is an expanded version of the first post on my new Medium blog, which focuses on the art of mixing records.
Monday, February 20, 2017
Things We Learn
Things come to us
out of nowhere
Surfers riding waves
we learn the nuances of gravity
its center-of, its bonding property,
its Gs, its fatal promise, we learn
how to stand erect and,
for the most part, stay that way
learn how to take a fall
how to shuck and jive
through sticky moments
through disequilibrium to stoop
or, chest out, stand tall
falling even into the troughs of its waves
we ride, we glide skulls full of juice
snapping, crackling through calculations
needed to adjust, adjust
we learn to know the force of the wave
behind, its feel, learn to fear and not to,
to not let its immensity in terror lock us,
to knock us off our board, we learn
immediately where our feet should be,
the optimal pose, how to shift without thought,
to enter the exhilaration of a barrel
and ride despite threat of a lethal dive
to surface sane, with soul intact, alive
Sughra Raza. Hong Kong Alley; Jan, 2017.
Monday, February 13, 2017
On particle “action at a distance”: “…if particles have definite states even when no one is looking (a concept known as realism) and if indeed no signal travels faster than light (locality)… (and, as has) recently been discovered … you can keep locality and realism by giving up just a little bit of freedom.” This is known as the “freedom-of-choice” loophole.
Action at a Distance
He was not looking but
she really was not an apparition
standing at the center of the room
She was standing but
being there in that spot
precisely where she was,
not on the moon, say,
nor in the wind gathering speed
toward eternity (though
that wind always blew) he knew she
They were free but
things are not always what they seem.
The world’s a funny place.
Even Einstein said parts of it are spooky,
yet we love and hate
in the places we stand
practicing all in freedom, or not
Yet he knew
All models are wrong, some are useful
by Hari Balasubramanian
Thoughts on the differences in math applied to the physical and social sciences.
The quote in the title is attributed to the statistician George Box. The term ‘model' could refer to a single equation, a set of equations, or an algorithm that takes an input and carries out some calculations. Box's point is that you can never capture a physical or biological or social system entirely within a mathematical or algorithmic framework; you always have to leave something out. Put another way, reality yields itself to varying degrees but never completely; something always remains unknown that is not easily describable.
And in any case, for the practical matter of achieving a certain outcome, that extra effort may not be necessary. If the goal is to put a satellite into orbit, the equations that define Newton's laws of motion and gravity, though not 100% correct, are more than sufficient; you don't need Einstein's theories of relativity, though they would provide a more accurate description. But if the goal is to determine a GPS device's location on earth, you do need relativity. This is because, for an observer on earth, a clock on an orbiting satellite ticks at a different rate than a clock on the ground, and if the necessary adjustments are not made, your phone's location estimate will be inaccurate.
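To make the GPS example concrete, here is a back-of-the-envelope calculation in Python. The constants are rounded, and the full treatment also accounts for orbital eccentricity and the Earth's rotation, so treat this as a sketch rather than a reference implementation.

```python
import math

# Physical constants (SI units, rounded)
c = 299_792_458.0    # speed of light, m/s
G = 6.674e-11        # gravitational constant
M = 5.972e24         # mass of Earth, kg
R_earth = 6.371e6    # mean radius of Earth, m
r_sat = 2.6561e7     # GPS orbital radius (~20,200 km altitude), m

v = math.sqrt(G * M / r_sat)   # orbital speed of a circular orbit, ~3.87 km/s
day = 86_400                   # seconds per day

# Special relativity: the moving satellite clock runs SLOW by ~v^2 / (2 c^2)
sr_shift = -(v**2) / (2 * c**2) * day

# General relativity: a clock higher in Earth's gravity well runs FAST
gr_shift = (G * M / c**2) * (1 / R_earth - 1 / r_sat) * day

net = sr_shift + gr_shift
print(f"special-relativistic drift: {sr_shift * 1e6:+.1f} microseconds/day")
print(f"general-relativistic drift: {gr_shift * 1e6:+.1f} microseconds/day")
print(f"net drift:                  {net * 1e6:+.1f} microseconds/day")

# Uncorrected, clock drift becomes ranging error at the speed of light:
print(f"position error if ignored:  ~{net * c / 1000:.0f} km/day")
```

The two effects pull in opposite directions but don't cancel: the satellite clock gains roughly 38 microseconds per day, which, multiplied by the speed of light, amounts to kilometers of accumulated ranging error per day if left uncorrected.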
So there is this art in modeling, this choosing of some aspects and ignoring others, trying to create the right approximations. As Box notes: "there is no need to ask the question 'Is the model true?'. If 'truth' is to be the 'whole truth' the answer must be 'No'. The only question of interest is 'Is the model illuminating and useful?'"
Models vary widely in the amount of truth they capture. In the engineering disciplines that exploit physical laws – mechanical, chemical, civil, electronics and communications engineering – the test of a model is whether the mathematical answers match empirical observations to the degree of precision needed and whether the results can be reproduced again and again.
Standards are high: if equations or computer simulations describing some physical phenomenon do not match empirical observations, they will eventually be abandoned or modified. Evidence of this precision and repeatability is all around us – consider that, for the most part, the light comes on when we flip the switch, a bridge is able to withstand loads, sensors are able to measure accurately, and images, voices and messages can be searched and transmitted at near-instant speeds. Indeed, the evidence is so pervasive that it is often taken for granted.
Contrast this with mathematical models in what we can call social domains – economics, healthcare delivery, election polling, psychology and human behavior. In these fields, you can't get – at least not yet – the kind of precision and repeated successes that you get in physics. You can use the models to sharpen your thought process; you can predict general trends and derive insights. But predicting the precise value of a future quantity is quite challenging. For example, models in economics often assume rational actions when of course there is always a rogue factor in how individuals and groups behave, throwing off chances of an accurate prediction. Indeed, one use of such models is to show that equations and theorems, however elegant, have little basis in reality. Friedrich Hayek captured the spirit of this beautifully in his quote: "The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design."
Another difficulty in social systems is that cause and effect are not so neatly separated. This is particularly true when you try to analyze historical data. Consider multiple regression and other statistical models, very much in fashion these days: they are easy to run at the click of a button, a dizzying array of graphs and numbers pops out, and the user may not be aware of the nitty-gritty details that generated these results. It could well be the case that the effects have little to do with the hypothesized causes. A complex matter gets reduced to p values, percentage improvements or other newly defined metrics, meant to highlight the modeler's principal claims while inadvertently masking deficiencies. Sometimes the deficiencies can't be detected because the datasets are so large and have so many variables that they are not easily visualized; so anything goes. Perhaps this is why one study may find that such and such is true – media outlets enhance the effect by providing attention-grabbing titles – while another discovers the opposite result.
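A minimal synthetic example of that failure mode (the data and numbers here are mine, not from any study the essay discusses): a hidden confounder makes a variable with no causal effect on the outcome look strongly significant. This assumes numpy and statsmodels are available.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000

confounder = rng.normal(size=n)           # unobserved common cause
x = confounder + rng.normal(size=n)       # the "hypothesized cause"
y = 2 * confounder + rng.normal(size=n)   # outcome; x has NO effect on y

# Regress y on x alone, as an analyst who never measured the confounder would:
fit = sm.OLS(y, sm.add_constant(x)).fit()
print(f"estimated effect of x: {fit.params[1]:.2f}")   # ~1.0, though the true effect is 0
print(f"p-value: {fit.pvalues[1]:.1e}")                # vanishingly small
```

The regression itself is computed flawlessly; it is the causal story wrapped around the output that is wrong, and no p value will flag that.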
Once in a while we get something remarkably successful like Nate Silver's election forecasts, which aggregate and weight various polls. So successful that we are lulled into thinking that there is a rock-solid science of predicting how a population will vote in an election. This perception lasts until an election like 2016 comes along. Looking at the comments section of Silver's website on Nov 9, 2016, you could feel the anger – and the anger turned on the pollsters and statisticians: how could everyone get it wrong?
Like stories, models have a kind of seductive power: we get psychologically attached to them more than is warranted, forgetting that they may be more wrong than we think. If we are clear-eyed about what these models really are – in Silver's case, projections based on samples where errors could easily creep in – perhaps we wouldn't be so surprised. The advent of big data and machine learning only seems to have made us more confident, more triumphant – there's a feeling that social systems, human behavior and consciousness will finally yield themselves to massive computing power and advanced statistics, that algorithms of the same status as the great laws of physics are about to be unveiled. Maybe we really are on the verge. Already the impact of big data, both for beneficial and nefarious purposes, is undeniable. But there is also reason to be skeptical; there is no substitute for looking closely under the hood of the new algorithms, on a case-by-case basis, to ask whether the expectations were realistic to begin with.
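A toy simulation illustrates how errors creep in (the numbers are invented, and this is nothing like Silver's actual model): averaging many polls shrinks sampling noise, but a systematic error shared by all the polls, say a miscalibrated turnout model, passes straight through the average.

```python
import numpy as np

rng = np.random.default_rng(1)
true_share = 0.50                 # a dead-even race in reality
poll_size, n_polls, trials = 1000, 20, 10_000

big_misses = 0
for _ in range(trials):
    shared_bias = rng.normal(0, 0.02)      # systematic error common to every poll
    observed = true_share + shared_bias
    polls = rng.binomial(poll_size, observed, n_polls) / poll_size
    # Averaging 20 polls cuts sampling noise to ~0.35 points,
    # but the shared bias (sd = 2 points) survives intact:
    if abs(polls.mean() - true_share) > 0.015:
        big_misses += 1

print(f"poll average off by >1.5 points in {big_misses / trials:.0%} of trials")
```

More polls is not the same as more independent information; when the errors move together, the aggregate moves with them.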
Many of the thoughts in this piece come from my own modeling experience in a field called operations research. A relatively new branch of mathematics and engineering, operations research is concerned with ‘optimal' ways of running organizations and making things more ‘efficient'. (I use quotes here since these terms are far more difficult to define and achieve than it seems at first glance.) An airline needs to match its pilots, crews and passengers to flights each day and reschedule in case of unexpected events; a corporation like Intel or Apple needs to manage its far-flung supply chains so that its products are delivered on time; Doctors Without Borders needs to deploy its clinical staff and equipment on short notice during infectious disease outbreaks such as Ebola. From a computational viewpoint, these problems become difficult quickly; it is not unusual for the number of candidate solutions to run into the billions or trillions, or many orders of magnitude beyond, more than even the fastest computers can enumerate. Without models and search algorithms, finding good answers in a reasonable amount of time would be impossible.
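For a feel of the scale, take the simplest cousin of the crew-scheduling example: assigning n crews to n flights at minimum total cost. There are n! possible schedules (already more than 2.4 × 10^18 at n = 20), yet well-studied algorithms find the optimum without enumerating them. A minimal sketch with toy costs of my own invention, using SciPy's implementation of the Hungarian algorithm:

```python
from itertools import permutations
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i][j] = cost of crew i flying flight j (invented numbers)
cost = np.array([[9, 2, 7, 8],
                 [6, 4, 3, 7],
                 [5, 8, 1, 8],
                 [7, 6, 9, 4]])

# Brute force: 4! = 24 schedules here, but 20! > 2.4e18 for 20 crews
best = min(permutations(range(4)),
           key=lambda p: sum(cost[i][p[i]] for i in range(4)))
print("brute force:", list(best), sum(cost[i][best[i]] for i in range(4)))

# Hungarian algorithm: same answer without enumerating every schedule
rows, cols = linear_sum_assignment(cost)
print("hungarian:  ", cols.tolist(), cost[rows, cols].sum())
```

The Hungarian algorithm runs in roughly cubic time in n, which is why an assignment that brute force could never finish at airline scale is solved in milliseconds; the hard part, as the next paragraph suggests, is everything the cost matrix leaves out.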
But in the end, the optimizations that are carried out in an abstracted mathematical world have to be implemented in situations where human behavior – all those messy, inexplicable, contradictory, delightful, mischievous things we do – plays a non-trivial role. And so the results, I've noticed, are far from optimal in practice. Some groups may feel unfairly treated; unintended consequences pop up sooner or later in parts of the system that the model did not consider; or the environment changes to an extent that the so-called globally optimal solutions, obtained with great effort, turn out to be short-sighted.
On a personal front, I find myself – and this is the side of me that is drawn to the humanities – rebelling against excessive quantification, metrics and buzzwords such as ‘predictive analytics'. "Don't count too much," my aunt said to me sternly last August in India when I kept telling her how many of her delicious sweets I'd greedily consumed, and which I was feeling guilty about. "Don't count too much – just tell me whether you enjoyed them." That comment has stayed with me and it seemed to summarize what I'd been feeling: that perhaps we are overdoing models and analyses in situations where numbers are far from the full story.