Monday, February 27, 2017
by Scott F. Aikin and Robert B. Talisse
Wisdom is a product of experience and reflection. As a consequence, it's often quite a long road to that goal. It's for this reason that the poetic expression, "the Owl of Minerva Flies at Dusk," has its effect. Only at the end of the day, once the work is done and we recline in thought, do the insights of what we ought to have done, what the best option was, and what was wrong about a particular decision become clear. We live forward, but we understand backward. And that can occasion distinctive problems.
In democratic politics, this point about insight is certainly true. And it extends not only to the errors we may make as a country, but also to the errors we make in understanding ourselves and our decision-making. In its current form, much democratic theory is focused on the decision-making and argumentative elements of modern political life. This deliberative democratic movement casts democratic life as one of participating in ongoing discussions, wherein all have a voice, no issue is beyond question, and every decision must be justifiable to all those whom it affects. These are admirable ideals, but we understand the ways we can fail those ideals only in making mistakes, only in witnessing the pathologies to which public reason is prone.
We experience living in a democracy and then we see the particular kinds of challenges and errors to which reasoning together can be prone. Perhaps we should have anticipated the effects of group polarization that seem to define contemporary political discourse, but we understand it all too well now that we live under its conditions. The incurious dogmatism of epistemic closure, the slippery euphemism of Orwellian Newspeak, and the abuses of and visceral reactions to political correctness are all political phenomena that we must see as developments from particular histories, arising within particular social settings. We do not know them a priori.
The Owl of Minerva Problem at first looks like a simple point about the retrospective nature of knowledge: You must first have experience to know, so knowledge must be dependent on (at least some) events of the past. But the Owl of Minerva Problem raises distinctive trouble for our politics, especially when politics is driven by argument and discourse. Here is why: once we have a critical concept, say, of a fallacy, we can deploy it in criticizing arguments. We may use it to correct an interlocutor. But once our interlocutors have that concept, that knowledge changes their behavior. Not only can they use the concept to criticize our arguments; it will change the way they argue, too. Moreover, it will also become another thing about which we argue. And so, when our concepts for describing and evaluating human argumentative behavior are used among those same humans, the concepts change their behavior. They adopt them, adapt to them. Because of the vocabulary, they are moving targets, and the vocabulary becomes either otiose or abused very quickly.
Consider the use of fallacy vocabulary, then, less as a device for the cool evaluation of arguments and more as a tool of evasion or attack.
by Jonathan Kujawa
While lying in bed on the night of January 20, 1884, Lewis Carroll conjured up the following puzzle:
Three Points are taken at random on an infinite Plane. Find the chance of their being the vertices of an obtuse-angled Triangle.
That is, since any three points on a sheet of paper can be connected to form a triangle, what's the likelihood that one of the angles is more than ninety degrees if you pick those points at random?
If you only know Lewis Carroll from Alice in Wonderland, you may be surprised that his thoughts turned to mathematics. In fact, his day job, under his real name Charles Dodgson, was as a mathematician at Christ Church college in Oxford. In addition to his more famous works of fiction, he was known for writing several mathematical texts. When teaching linear algebra I always take a day to talk about Dodgson Condensation.
One of the books he wrote is Curiosa Mathematica, Part II: Pillow Problems Thought Out During Wakeful Hours. It is a compendium of 72 math problems Dodgson pondered and solved while waiting to fall asleep. Helpfully, he also gives the date he dreamt up each problem and the solution he devised. Go here if you'd like to take a look at the other 71 problems.
The Obtuse Triangle Problem is No. 58. Before we take a look at his solution we should step back a minute. What does it mean to pick three points at random? Like most politicians' speeches, it sounds good but falls apart under the slightest scrutiny. Are we to pick x and y coordinates for each of these points? Alternatively, we could pick an angle between 0 and 360 degrees and a distance and, starting at the origin, take the point at that angle and distance. Or, since all we care about is the resulting triangle, maybe we should randomly pick an angle between 0 and 180 degrees, pick two side lengths at random, and form the triangle made by drawing two sides of those lengths with that angle between them. I'm sure we could come up with a dozen different ways to randomly pick a triangle.
If a random triangle were a random triangle, and if the world were fair and just, then the odds of an obtuse triangle would be the same regardless of our method. Sadly, the world is neither fair nor just. It will matter how we choose to pick a random triangle.
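To make that method-dependence concrete, here is a minimal Monte Carlo sketch of my own (not Carroll's solution), in Python. It interprets "three random points" in two of the ways floated above: points chosen uniformly in a unit square, and points chosen uniformly in a unit disk. The choice of regions, the function names, and the trial count are all illustrative assumptions.

import random
import math

def is_obtuse(p, q, r):
    # Squared side lengths of triangle pqr.
    sides = sorted([
        (p[0] - q[0])**2 + (p[1] - q[1])**2,
        (q[0] - r[0])**2 + (q[1] - r[1])**2,
        (r[0] - p[0])**2 + (r[1] - p[1])**2,
    ])
    # Law of cosines: the triangle is obtuse exactly when the squared
    # longest side exceeds the sum of the other two squared sides.
    return sides[2] > sides[0] + sides[1]

def random_point_square():
    # A point chosen uniformly in the unit square.
    return (random.random(), random.random())

def random_point_disk():
    # A point chosen uniformly in the unit disk; taking the square root
    # of the uniform radius variable compensates for area growing as r^2.
    r = math.sqrt(random.random())
    theta = 2 * math.pi * random.random()
    return (r * math.cos(theta), r * math.sin(theta))

def estimate(sampler, trials=200_000):
    hits = sum(is_obtuse(sampler(), sampler(), sampler())
               for _ in range(trials))
    return hits / trials

print("unit square:", estimate(random_point_square))
print("unit disk:  ", estimate(random_point_disk))

Run this a few times and the two estimates settle near different values (about 0.725 for the square and 0.720 for the disk, if the classical results I remember are right). The gap is small but persistent, and that is the whole point: "at random" is not one question but many, and each way of making it precise has its own answer.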
“The woolly mammoth vanished from the Earth 4,000 years ago, but now scientists say they are on the brink of resurrecting the ancient beast in a revised form, through an ambitious feat of genetic engineering.”
If the wooly mammoth becomes the new Lazarus
reborn from an ice sarcophagus
does it mean that we may all return one day
to beat our breasts at the injustice of death
but also to rejoice in miracles? It’s an
honest question, we’ve been asking it
for generations, yet it’s never been answered
but in myth, the story that elevates ignorance
to poetry, that blazes red trails in pigment,
that ends up only as sublime music to our ears,
elusive, illusory as the apparition of tomorrow
But we still have this day
It seems never to end
by Dwight Furrow
I must confess to having once been an olfactory oaf. In my early days as a wine lover, I would plunge my nose into a glass of Cabernet, sniffing about for a hint of cassis or eucalyptus, only to discover a blast of alcohol thwarting my ascension to the aroma heaven promised in the tasting notes. A sense of missed opportunity was especially acute when the wine was described as "sexy, flamboyant, with a bounteous body." Disappointed but undaunted, I would hurry off to wine tastings hoping the reflected brilliance of a wine expert might inspire epithelial fitness. It was small comfort when the expert would try to soften my disappointment with the banality, "it's all subjective anyway." So one evening, while receiving instruction in the finer points of wine tasting from a charming but newly minted sommelier, I let frustration get the better of me and blurted, "Well, if it's all subjective, what the hell are we doing here? Is it just your personal opinion that there is cassis in the cab, or is it really there? We all have opinions. If you're an expert you should be giving us your knowledge, not your opinion!" Someone muttered something about "chill out" and it was quickly decided that my glass needed refilling. But the point stands. The idea of expertise involves the skillful apprehension of facts. If there is no fact about aromas of cassis in that cab, there is no expertise at discerning it.
These conversations over a glass of wine are more pleasant (because of the wine) but structurally similar to the semester-long task of getting my college students to realize that moral beliefs are not arbitrary emendations of their lightly held personal attitudes but are rooted in our need to survive and flourish as social beings. Yet even after weeks of listening to me going on about the sources of value, they still write term papers confidently asserting that with regard to "right" and "wrong", eh, who knows?
Subjectivism, the view that a belief is made true by my subjective attitude towards it, has long been the default belief of freshman students and arbiters of taste. Unfortunately, this tendency to treat it as the wisdom of the ages has escaped the confines of the wine bar and classroom into the larger society. Buoyed by the cheers of multitudes, our fabulist-in-chief routinely finds his "own facts" circulating in what seems to be an otherwise empty mind. This is no longer mere fodder for a seminar debate.
by Jalees Rehman
The Affordable Care Act, also known as the "Patient Protection and Affordable Care Act", "Obamacare" or the ACA, is a comprehensive healthcare reform law enacted in March 2010 which profoundly changed healthcare in the United States. This reform allowed millions of previously uninsured Americans to gain health insurance through several new measures, including expanding the federal Medicaid health insurance coverage program, introducing the rule that patients with pre-existing illnesses could no longer be rejected or overcharged by health insurance companies, and allowing dependents to remain on their parents' health insurance plan until the age of 26. The widespread increase in health insurance coverage – especially for vulnerable Americans who were unemployed, underemployed or worked for employers that did not provide health insurance benefits – was also accompanied by new regulations targeting the healthcare system itself. Healthcare providers and hospitals were given financial incentives to introduce electronic medical records and healthcare quality metrics.
As someone who grew up in Germany, where health insurance coverage is guaranteed for everyone, I assumed that over time, the vast majority of Americans would appreciate the benefits of universal coverage. One no longer has to fear financial bankruptcy as a consequence of a major illness, and government-backed health insurance also provides peace of mind when changing jobs. Instead of accepting employment primarily because it offers health benefits, one can choose a job based on the nature of the work. But I was surprised to see the profound antipathy towards this new law, especially among Americans who identified themselves as conservatives or Republicans, even if they were potential beneficiaries of the reform. Was the hatred of progressive-liberal views, the Democrats and President Obama, who had passed the ACA, so intense among Republicans that they were willing to relinquish the benefits of universal health coverage for the sake of their political ideology? Or were they simply not aware of the actual content of the law, opposing it merely for political reasons?
Eileen Alice Soper (1905-1990). When Badgers Awake.
John Lister-Kaye, naturalist and wildlife writer, describes his experience with Soper in "Gods of the Morning":
"As we approached the (badger) setts in the dusk she seemed to slough off her human-ness and transmogrify into something more than half wild. I couldn't understand how she sat so still. She denied cold and rain, she ignored itches - a gnat landing on her nose - she seemed to become part of the wood herself, part of the tree, the soil, the still evening air ..."
Special note to my siblings: Eileen Soper was the illustrator of our beloved childhood books by Enid Blyton - look!
by Brooks Riley
To paraphrase Heinrich Heine, I dream of Weimar in the night—not the era, but the town of Weimar, a lovely word on its own, one steeped in intellectual significance, historical resonance, cultural audacity, political and artistic enlightenment, philosophical bravura—and in modern times monstrous atrocity.
I remember the first time I heard the word Weimar. It wasn't that small town in Germany where Goethe, Schiller, Nietzsche, Liszt, Luther, Cranach, Bach, Wagner, Gropius, Klee, Kandinsky, Strauss, Schopenhauer and countless other thinkers and artists once lived--or even where Kafka on a visit fell in love with the daughter of the caretaker of Goethe's house.
It wasn't the birthplace of the Bauhaus movement. It wasn't the place where the new German constitution was signed in 1919, launching the legendary Weimar Republic, that glittering era of promise before the darkness fell. And it wasn't the town closest to the murderous concentration camp at Buchenwald.
It was our Weimaraner, a hunting dog my father acquired to quell his thirst for a canine to tip the balance in a feline household. But Tonndorf, named for the castle a few miles from Weimar where my father, Artillery Commander of the 6th Armored Division, had quartered with his regiment at the end of World War II, wasn't allowed in our household, and was banished to the stable with the horses, where he spent hours hoping to catch a rat coming out of a hole in the earthen floor of a stall, successful only once in all his years, when an emerging rat took a wrong turn and landed in his maw.
Weimaraners were exotic in the mid-Fifties. They hadn't been discovered by William Wegman or immortalized in the Museum of Modern Art. What I remember best about Tonndorf was the color of his coat, my favorite color, taupe. Taupe is the color gray with a smile, a hint of warmth that seeps through the sober neutrality of lightened black. I never think of Weimar without somehow seeing taupe, and when I look at Goethe's color wheel, I can't help wishing he had added that smile.
It would be many years before I actually went to Weimar, years before I began to understand the subterranean currents that would lead me there. So many interests of mine had their genesis in Weimar or were inextricably entwined with it. In college, a term paper of mine dealt with Friedrich Schiller's Wallenstein trilogy, which was written and premiered there. In it, I posited that Schiller might have foreseen the dangers of Napoleon, and had written Wallenstein as a parable. Ironically, Weimar later briefly fell to Napoleon.
by Katalin Balog
"As he died to make man holy, let us die to make things cheap." --Leonard Cohen, "Steer your way"
In this article I use a distinction borrowed from philosophy, between objectivity and subjectivity, to look at the nature of the Trump presidency. I explicated that distinction in more detail in some earlier posts here, here and here.
For all the ridiculousness of our president there is a whiff of the devil about him - by monumental bad luck, America has managed to elect a person embodying the worst of human nature. He combines thoughtlessness and utter disregard for standards of objectivity and reason with the soullessness and banality of reality TV run amok. Despite real parallels with 1930s Europe and more recent autocratic regimes across the world, the Trump era also offers novelty; it is its own, unique brand of awfulness, made in America.
In trying to grasp Trump's uniqueness, many commentators resort to psychology. In this essay, I want to propose a more philosophical perspective, a sort of psycho-philosophical approach that, in my view, allows one to appreciate better the psychic vortex that sucks up and annihilates anything of value around Trump. He is the inverse Midas: everything he touches turns immediately into junk. Business, entertainment, social media and now our national politics – very little is safe from his seeping menace. Kierkegaard's philosophy offers some clues to understanding this situation.
Kierkegaard suggested that the mind oscillates between two primary perspectives on the world: objective and subjective – and that the relationship between these approaches determines what kind of a person we are going to be. Objectivity is an orientation towards reality based on abstracting away, in various degrees, from subjective experience, and from individual points of view. An objective approach is based on concepts and modes of thinking about the world that are accessible from many different points of view. A subjective orientation, on the other hand, is based on an attunement to, and direct reflection on, the inner experience of feeling, sensing, thinking and valuing that unfolds in our day-to-day living. It is the difference between an abstract, objective conception of water as a potable liquid that is also found in lakes, rivers and oceans, and the subjective concept of it based on what it is like to drink it or swim in it on this particular day in this particular place. Objective and subjective, of course, come in degrees. Scientific concepts are the most objective, but many of our everyday concepts are also of the more objective variety. The most subjective conceptions are those that arise in direct reflection on experience.
by Amanda Beth Peery
In his Natural History, Pliny the Elder describes a fourth-century BC painter, Apelles of Kos, as superior to all other painters. According to the Encyclopedia Britannica, Apelles "continues to be regarded as the greatest painter of antiquity even though none of his work survives." How is it possible that the artist seen as the greatest painter of all of antiquity is one who left no surviving works? One possibility is that his fame has been expanded by myth and time, and with no works left to show the truth, his skills have been inflated beyond their due. That's probably true, but I believe there's another, more legitimate reason for Apelles' reputation. Apelles' art—often conveyed through the descriptions of ancient writers like Pliny—has engendered other art. One way of measuring the greatness of a work of art is to ask whether it gives rise to other works, or, to put it differently, whether or not it inspires.
Apelles of Kos was the court painter of Macedon under Alexander the Great. Pliny recounts various stories about him, many of them gems. In one, Apelles comes to Egypt, then ruled by one of the Ptolemies (the first Ptolemy, I think), whom Apelles once knew. A court jester invites Apelles to a feast at the royal palace, but unbeknownst to Apelles, Ptolemy has long harbored a hatred for the artist, and the pharaoh is enraged to see him at the feast. Ptolemy commands Apelles to tell him who invited him. Apelles, who either never knew or can't remember the jester's name, picks up a piece of charcoal from the cold hearth and begins to draw the jester's face on the palace wall. Within just a stroke (or two), Ptolemy recognizes his jester. Apelles has captured the jester with just a single line.
Apelles is famed not only for his superior skill but also for his dedication to his art. Pliny attributes to Apelles the phrase "nulla dies sine linea," or "not a day without a line," because the artist worked every day. Apelles exemplified the artist's lifestyle and was so respectable and respected that he could speak out against Alexander the Great himself. In one story, Alexander is sitting for a portrait expounding his theories on art, going on at length, until the artist quietly begs him to stop because the boys grinding the colors will laugh at him. We don't know what Alexander was saying, but by stopping him, Apelles—in his innocence—asserted the artist's superior knowledge of the craft and maybe even the way of seeing and ways of creating that artists are able to access. Alexander, who had been tutored by Aristotle (who was tutored in turn by Plato, who was tutored by Socrates) cannot rival Apelles'—or the color-grinding boys'—intimate knowledge and experience of art. In this story Apelles rejects the very sources of knowledge in the West. He is insisting that there is another type of knowledge. Or he is insisting, at least, that there are other things to know.
by Shadab Zeest Hashmi
The ghost that lurks around the old Bombay Company bookshelf is the ghost of an elliptical future, trailing the past like a spectacular, burning, comet-tail. It is the wispy energy of my own half-dreamed, half-written book that hovers over the rows of books I use for research, mostly works of history and poetry. After a night of writing, I have finally met my deadline. The life-size mirror leaning in the corner shows a pale face, preoccupied with time; my work is to not forget the past, and to call to poetry what may be forgotten. I am now searching for a book for remembrance, a book by the American Sufi poet Daniel Abdal-Hayy Moore. I want to honor this poet whose work I consider a beacon and who is now saying his goodbyes, dying of cancer. I am flailing for time, mine, his, and ours as poets, especially as Muslim poets living through times of brutal daily deaths. Weeks from now, earthly time will stop for him; moments from now, time will slow down for me, indefinitely.
The bookshelf phantom is poised to make projectiles of treasured objects — a miniature Chinese cabinet and framed Turkish calligraphic art on an easel — heavy objects that will slide down and cause multiple concussions and head/neck trauma. I am stunned but remain conscious, not bleeding but suddenly fatigued. It is ironic that one of the objects is Turkish — I had met Daniel Abdal-Hayy Moore and his wife Malika at the Nazim Hikmet Poetry Festival where he and I were both awarded the Hikmet Poetry Prize, where I recognized kindred souls in both Daniel and Malika and found a reservoir of inspiration and made lifelong friends at the Turkish House in Cary, NC. Despite the shock of the accident, I feel the surge of a promise, a kind of reassurance.
There's a certain kind of conversation in which I find myself every so often, which can roughly be summarized as "What's the big deal about DJing?" As someone who was a quasi-professional DJ in a former life, and is currently what one friend terms a 'monastic DJ', I've sensed a substantial gap in lay understanding of not just what a DJ does while engaged in the act of mixing, but also of the place occupied by DJs in the contemporary musical ecosystem. This attitude — not unlike looking at a Jackson Pollock while muttering to yourself that you could do just as well — has received further support from the rise and fall of the spectacularly excessive (and, to my ears, creatively bankrupt) EDM scene; the unholy marriage of superstar DJs, casino-based clubs and overpriced bottle service; and the fact that watching someone DJ is fundamentally uninteresting.
Is there any value in mixing other people's music? When viewed from the most reductive position, the answer is clearly no. As critic David Hepworth noted in a now-deleted blog post, "You must surely realise that you make your living by putting on records, which is only a tiny bit removed in degree of difficulty from switching on the radio." If that's all that DJs are good for, then I suppose it's a relief that streaming services and software-driven playlists have come along to put this particular horse-and-buggy paradigm out of its misery.
Instead, it's more helpful to look at the larger role that DJs play in parsing the ocean of music in which we swim in these post-Napster days. Just as we turn to critics in other fields to understand what we should be reading or watching, we also turn to DJs for clarity on what to listen to. In this sense, the appropriate metaphor is one of the DJ as tastemaker.
In order to talk about how a DJ guides others' taste in music, we have to address the DJ's own, internal process. Over time, a DJ is a collector, a curator and an editor. Of course, being a DJ involves inhabiting all three of these roles at the same time, all the time, but there is also a progression here. I'll go over each of these and then return to what it means to be a tastemaker at the end of this post.
Sunday, February 26, 2017
Thomas Nagel in the New York Review of Books:
For fifty years the philosopher Daniel Dennett has been engaged in a grand project of disenchantment of the human world, using science to free us from what he deems illusions—illusions that are difficult to dislodge because they are so natural. In From Bacteria to Bach and Back, his eighteenth book (thirteenth as sole author), Dennett presents a valuable and typically lucid synthesis of his worldview. Though it is supported by reams of scientific data, he acknowledges that much of what he says is conjectural rather than proven, either empirically or philosophically.
Dennett is always good company. He has a gargantuan appetite for scientific knowledge, and is one of the best people I know at transmitting it and explaining its significance, clearly and without superficiality. He writes with wit and elegance; and in this book especially, though it is frankly partisan, he tries hard to grasp and defuse the sources of resistance to his point of view. He recognizes that some of what he asks us to believe is strongly counterintuitive. I shall explain eventually why I think the overall project cannot succeed, but first let me set out the argument, which contains much that is true and insightful.
Carl Erik Fisher in Nautilus:
Thomas was a highly successful and mild-mannered lawyer who was worried about his drinking. When he came to see me at my psychotherapy practice, his wine intake had crept up to six or seven glasses a night, and he was starting to hide it from his family and to feel the effects at work. We discussed treatment strategies and made an appointment to meet again. But when he returned two weeks later, he was despondent: His drinking was totally unchanged.
“I just couldn’t cut back. I guess I just don’t have the willpower.”
Another patient of mine, John, also initially came to me for help with drinking. At our first meeting, we talked about moderation-based approaches and setting a healthier limit. But one month later, he came back to my office declaring that he had changed his mind and made peace with his drinking habits. Sure, his wife wasn’t always thrilled with how much he drank, he told me, and occasionally the hangovers were pretty bad, but his relationship was still fairly solid and drinking didn’t cause any truly significant problems in his life.
In the abstract, John and Thomas are similar: They both succumbed to short-term temptations, and neither kept to his long-term goals. But while Thomas attributed that outcome to problems with willpower, John came to reframe his behavior from a perspective that sidestepped the concept of willpower altogether. Both John and Thomas would resolve their issues, but in very different ways.
Most people feel more comfortable with Thomas’ narrative. They would agree with his self-diagnosis (that he lacked willpower), and might even call it clear-eyed and courageous. Many people might also suspect that John’s reframing of his problem was an act of self-deception, serving to hide a real problem. But Thomas’ approach deserves just as much skepticism as John’s. It’s entirely possible that Thomas was seduced by the near-mystical status that modern culture has assigned to the idea of willpower itself—an idea that, ultimately, was working against him.
Annie Lowrey in the NYT Magazine:
The basic or guaranteed income is a curious piece of intellectual flotsam that has washed ashore several times in the past half-millennium, often during periods of great economic upheaval. In “Utopia,” published in 1516, Thomas More suggests it as a way to help feudal farmers hurt by the conversion of common land for public use into private land for commercial use. In “Agrarian Justice,” published in 1797, Thomas Paine supports it for similar reasons, as compensation for the “loss of his or her natural inheritance, by the introduction of the system of landed property.” It reappears in the writings of French radicals, of Bertrand Russell, of the Rev. Dr. Martin Luther King Jr.
Silicon Valley has recently become obsessed with basic income for reasons simultaneously generous and self-interested, as a palliative for the societal turbulence its inventions might unleash. Many technologists believe we are living at the precipice of an artificial-intelligence revolution that could vault humanity into a postwork future. In the past few years, artificially intelligent systems have become proficient at a startling number of tasks, from reading cancer scans to piloting a car to summarizing a sports game to translating prose. Any job that can be broken down into discrete, repeatable tasks — financial analytics, marketing, legal work — could be automated out of existence.
In this vision of the future, our economy could turn into a funhouse-mirror version of itself: extreme income and wealth inequality, rising poverty, mass unemployment, a shrinking prime-age labor force. It would be more George Saunders than George Jetson. But what does this all have to do with a small village in Kenya?
A universal basic income has thus far lacked what tech folks might call a proof of concept. There have been a handful of experiments, including ones in Canada, India and Namibia. Finland is sending money to unemployed people, and the Dutch city Utrecht is doing a trial run, too. But no experiment has been truly complete, studying what happens when you give a whole community money for an extended period of time — when nobody has to worry where his or her next meal is coming from or fear the loss of a job or the birth of a child.
And so, the tech industry is getting behind GiveDirectly and other organizations testing the idea out.
Jenny C. Mann in Avidly:
I study the history of rhetoric, something that has made me intimately, painfully aware of the long history of hysteria around the idea of a woman speaking in public. The stubborn persistence of this hostility towards female speech is everywhere in evidence—as just one example, take the online and print harassment of the classicist Mary Beard, who ably responded in the London Review of Books by tracing the long history of men telling women to shut up all the way back to the Odyssey. And here we are again with Mitch McConnell and Senate Republicans denying Elizabeth Warren the right to take to the Senate Floor and read aloud a letter from Coretta Scott King in opposition to the Cabinet appointment of Senator Jeff Sessions.
In justifying the collective Republican censure of their peer in the Senate Chamber, McConnell explained: “She was warned. She was given an explanation. Nevertheless she persisted.” Already this “nevertheless” has become a rallying cry on social media for those who are horrified by the silencing of Scott King’s letter and Warren’s speech. When I awoke this morning to the many #nevertheless hashtags, I was overwhelmed with that giddy-nauseous feeling of possibility that you get when something in popular culture twangs a string that resonates with your own scholarly obsessions. For in his malice, McConnell has fastened on precisely the best word to describe the disorderly intrusions of female speech in a public forum.
Alix Oswald in Voces Novae:
On March 6, 1857, Dred Scott's eleven-year struggle for freedom had finally come to an end. The Supreme Court of the United States rendered its decision, ruling that Dred Scott was still a slave. Even more controversially, the Court ruled that the Missouri Compromise was unconstitutional, that all blacks, free or enslaved, could never be United States citizens, and that Congress did not have the right to decide the slavery question in the territories. This loaded decision, which was supposed to solve the slavery question once and for all and, more importantly, mitigate the nation's growing sectional crisis, ended up creating more tension in the country between the North and South. The reaction to the decision varied by region and political party: it was criticized by northerners and Republicans, and praised by southerners and Democrats. The nation's intense reaction to the Dred Scott decision not only had an effect on politics in the late 1850s, but would also serve as one of several precipitants of the ultimate breakdown in American politics, the southern secession and Civil War.
...The Dred Scott decision had far-reaching effects even long after it seemed like it had lost its influence. On February 23, 1865, Illinois Senator Lyman Trumbull proposed to Congress House bill No. 748, which would have provided for a bust of Chief Justice Taney to be made and placed inside the Supreme Court Room. To this proposition, Senator Charles Sumner of Massachusetts retorted, "I object to that; that now an emancipated country should make a bust to the author of the Dred Scott decision." Senator Wilson also vehemently opposed this bill, and responded with an impassioned speech. He began by declaring that "We, the chosen representatives of a people who have reversed that unrighteous decree, trampled it beneath our feet with loathing and scorn unutterable," had ended up "sitting here in the closing hours of the Thirty-Eighth Congress with an empty Treasury." He expressed that Congress had more important matters to attend to, like the "$130,000 due to the heroes of the Republic who are fighting, bleeding, dying to defend their country," which was "menaced by armed treason born of the Dred Scott decision." Senator Wilson then condemned Congress for "consuming precious time and giving our voices and votes to take $1,000 out of the pockets of the people, to keep out of the hands of our soldiers," which were "outstretched to receive them." He concluded by again denouncing the proposal to allocate "$1,000 to set up a bust to the memory of the man," whom Wilson described as doing "more than all other men that ever breathed the air or trod the soil of the North American continent to plunge the nation into this bloody revolution."
More here. (Note: At least one post throughout February will be in honor of Black History Month)
Visiting the Oracle
It’s dark on purpose
so just listen.
Maybe I inhabit a jar, maybe a pot,
maybe nothing. Only this
loose end of a voice
rising to meet you.
It sounds like water.
Don’t think about that.
Let your servants climb back down the mountain
by themselves. I’ll listen.
I’ll tell you everything
I discover, but I can’t
say what it means.
Someone will always
assure you of the best of fortunes,
but you know better.
And keep this in mind: The answer
reveals itself in time
like the clue that fits
perfectly and explains everything
after the crime has been solved.
Then you will say: I should have known.
It was there all along
and never even concealed,
like the story of the letter
overlooked by the thief because
it had not been hidden.
That’s the trick, of course.
You don’t need me.
by Lawrence Raab
from The Collector of Cold Weather
Ecco Press, NY, 1976
Saturday, February 25, 2017
Louise O. Fresco in Aeon:
The tomato is one of our lovelier foods: a juicy icon of the good life. There’s almost nothing better than buying fresh tomatoes on a Saturday morning, bringing them home to your kitchen, washing them carefully, slicing them, admiring their shiny interiors with the miraculous seeds inside, adding a few drops of green, virgin olive oil, and perhaps a leaf or two from the basil plant on the windowsill. Just paradise.
Few people are indifferent to the sun-drenched cherry tomatoes served up in every picturesque Italian village trattoria; or a well-tended vegetable garden where the branches of each tomato plant are carefully tied by hand with a green ribbon – these fruits are harvested with loving care. Most likely you feel that such tomatoes should be organically grown, on small fields, reflecting tradition and history. You might think that, this way, they accrue authenticity, honesty and truth, that their production will be small-scale, and preferably local.
But how ‘good’ are they really? And what does ‘good’ mean in this context? Are the organic hand-picked tomatoes sold at farmers’ markets really better, in a technical sense, or do they just make us feel like better consumers – perhaps even better human beings? If the organic tomato is just a vehicle for romantic fallacy, then we have to look dispassionately at how such tomatoes are grown, from the perspective of sustainability.
“You can’t go home again.” Thomas Wolfe’s famous phrase has long served as a dictum for writers and analysands, but it needs an addendum: You can’t stop trying. Sam Shepard has acknowledged the compulsion — and also the futility — in interviews and dramatized it in plays where protagonists return to the place that’s supposed to take you in, but doesn’t. They come home not for comfort but to settle scores, demand respect, even elicit an acknowledgment of their existence. Family members in extremis shout and holler, hoping, like the father in “Buried Child,” that the sounds they make will signal an affirmative reply to the question, “Are we still in the land of the living?”
This question floats over Shepard’s novella of short-burst imaginings and conversations with himself, as the aging narrator ruefully takes stock. He’s in the land of the living, but only just, hanging on by his fingernails, his memory, his imagination, his never-ending obsession with his father, his blue thermal socks (nicked from a movie set) and his ongoing arguments with women, including a sometime-girlfriend 50 years his junior. She’s called the Blackmail Girl because she’s recording their conversations for a book that will launch her literary career. Maybe. There’s a wry poetic justice in the spectacle of a writer, that scavenger of others’ lives, helplessly furnishing material for another. The voyeur voyeured.
There are no half measures to Kay Redfield Jamison’s medico-biographical study of poet Robert Lowell. It is impassioned, intellectually thrilling and often beautifully written, despite being repetitive and overlong: A little too much would seem to be just enough for Jamison.
Nonetheless, “Robert Lowell: Setting the River on Fire” achieves a magnificence and intensity — dare one say a manic brilliance? — that sets it apart from more temperate and orderly biographies. Above all, the book demands that readers seriously engage with its arguments, while also prodding them to reexamine their own beliefs about art, madness and moral responsibility. Reading this analysis of “genius, mania, and character” is an exhilarating experience.
From the late 1940s to the mid-1970s, Lowell was the most admired and talked-about American poet of his generation. Scion of a privileged New England family, he counted among many distinguished ancestors two notable poets — James Russell Lowell and Amy Lowell — as well as Percival Lowell, the astronomer who sighted what he thought were canals on Mars.