Tuesday, August 19, 2014
Simon Hammond in The White Review:
In the summer of 1959, a headstrong but lovesick English graduate took a trip to the hometown of his favourite writers, to mark the end of his degree and to help him forget his sorrows. En route to Dublin via the Welsh Coast he hitched a lift with the owner of an upscale holiday resort, who offered him a job for the summer, an offer he took up after walking in the footsteps of Joyce, Beckett and O’Brien. Travelling People, which BS Johnson wrote in fits and starts over the next two years, is the story of a young man who takes a job at a Welsh holiday resort. It has the brisk outlines of a familiar English comedy, but presented with an incongruous trickery more in keeping with Johnson’s Irish heroes. Plenty of direct experience made it into the novel (Johnson even incorporated letters that he had written that summer) but names were changed and elements added to provide excitement, perhaps even as wish-fulfilment. Henry has a passionate affair and gets a first in his degree, while Johnson wasn’t so fortunate; the heart attack that afflicts the owner, with whom Johnson fell out, never happened. But the translation of experience is uneasy: rogue autobiographical elements – Johnson’s romantic hysteria, his odd superstitions – crop up without explanation.
The novel was published, after a string of rejections, to muted applause, with some copies returned in the belief that the typographical experimentation was a printing error; Johnson was nevertheless pleased with what he saw as its ingenuity, even claiming that in some respects it had improved on Joyce’s Ulysses. But behind the bravado lay a nagging dissatisfaction. He began to feel embarrassed by the fictional additions, to believe that the novel would have been better if it had been more honest, if he hadn’t compromised the truth for the sake of a good story. Increasingly Johnson dismissed it as an apprentice work, and was later reluctant to have it republished. Never again would he be so blasé with the facts of his life. The six novels that followed would be the work of a writer at war with the imagination.
George Johnson in The New York Times:
Almost 20 years ago, in the pages of an obscure publication called Bioastronomy News, two giants in the world of science argued over whether SETI — the Search for Extraterrestrial Intelligence — had a chance of succeeding. Carl Sagan, as eloquent as ever, gave his standard answer. With billions of stars in our galaxy, there must be other civilizations capable of transmitting electromagnetic waves. By scouring the sky with radio telescopes, we just might intercept a signal.
But Sagan’s opponent, the great evolutionary biologist Ernst Mayr, thought the chances were close to zero. Against Sagan’s stellar billions, he posed his own astronomical numbers: Of the billions of species that have lived and died since life began, only one — Homo sapiens — had developed a science, a technology, and the curiosity to explore the stars. And that took about 3.5 billion years of evolution. High intelligence, Mayr concluded, must be extremely rare, here or anywhere. Earth’s most abundant life form is unicellular slime. Since the debate with Sagan, more than 1,700 planets have been discovered beyond the solar system — 700 just this year. Astronomers recently estimated that one of every five sunlike stars in the Milky Way might be orbited by a world capable of supporting some kind of life. That is about 40 billion potential habitats. But Mayr, who died in 2005 at the age of 100, probably wouldn’t have been impressed. By his reckoning, the odds would still be very low for anything much beyond slime worlds. No evidence has yet emerged to prove him wrong.
Ta-Nehisi Coates in The Atlantic:
Among the many relevant facts for any African-American negotiating their relationship with the police, the following stands out: The police departments of America are endowed by the state with dominion over your body. This summer in Ferguson and Staten Island we have seen that dominion employed to the maximum ends—destruction of the body. This is neither new nor extraordinary. It does not matter if the destruction of your body was an overreaction. It does not matter if the destruction of your body resulted from a misunderstanding. It does not matter if the destruction of your body springs from foolish policy. Sell cigarettes without proper authority and your body can be destroyed. Resent the people trying to entrap your body and it can be destroyed. Protect the home of your mother and your body can be destroyed. Visit the home of your young daughter and your body will be destroyed. The destroyers of your body will rarely be held accountable. Mostly they will receive pensions.
It will not do to point out the rarity of the destruction of your body by the people whom you pay to protect it. As Gene Demby has noted, destruction is merely the superlative form of a dominion whose prerogatives include friskings, detainings, beatings, and humiliations. All of this is common to black people. All of this is old for black people. No one is held accountable. The body of Michael Brown was left in the middle of the street for four hours. It cannot be expected that anyone will be held accountable.
Alana Marie Levinson-LaBrosse in Fair Observer:
The poet, in a collared shirt beneath a sweater vest and elbow-patched blazer, takes his seat. The more audacious fans push to shake his hand; he rises to accept, to graze cheeks in the formal kiss. Each time he stands, the audience follows, breaking into fresh, ferocious applause. He takes the stage flanked by three bodyguards who clear a path through the grabbing attendees.
During his short speech on political parties and their failings, the Kurdish language and its splintering, the audience keeps bursting into applause, like peals of thunder. I start a tally as he reads his poems. Audience members mouth the words along with him. After one poem, the clapping synchronizes and the audience takes up a chant, “Doo-bah-rah! Doo-bah-rah!” — “Again! Again!” and the poet relaunches, delivering the poem a second time. He leans over the lectern to deliver the lines. The tally: 48.
I remember the first time I’d seen such a response to live poetry — at an elocution contest sponsored by the American University of Iraq, Sulaimani (AUIS). Some 20 contestants took the stage and at least 100 students crammed into the cafeteria just to watch try-outs. At the time, the school only had 400 students. When the student-translator took the stage to read the poem in its original language first, the audience interrupted him, cheering at the end of each line. All this while the university had trouble galvanizing students to come to soccer games.
Monday, August 18, 2014
Dear Readers, Writers, Bloggers,
We are very honored and pleased to announce that Frans de Waal has agreed to be the final judge for our 5th annual prize for the best blog and online-only writing in the category of science. Details of the previous four science (and other) prizes can be seen on our prize page.
As you may know, Frans B. M. de Waal is a Dutch/American biologist and primatologist known for his work on the behavior and social intelligence of primates. His first book, Chimpanzee Politics (1982), compared the schmoozing and scheming of chimpanzees involved in power struggles with those of human politicians. Ever since, de Waal has drawn parallels between primate and human behavior, from peacemaking and morality to culture. His scientific work has been published in hundreds of technical articles in journals such as Science, Nature, Scientific American, and outlets specializing in animal behavior. His popular books - translated into twenty languages - have made him one of the world's most visible primatologists. His latest books are The Age of Empathy (2009) and The Bonobo and the Atheist (2013).
De Waal is currently C. H. Candler Professor in the Psychology Department of Emory University and Director of the Living Links Center at the Yerkes National Primate Research Center, in Atlanta, Georgia. He has been elected to the (US) National Academy of Sciences, the American Academy of Arts and Sciences, and the Royal Dutch Academy of Sciences. In 2007, he was selected by Time as one of The World’s 100 Most Influential People Today, and in 2011 by Discover as among the 47 (all time) Great Minds of Science.
As usual, this is how it will work: the nominating period is now open. A round of voting by our readers will then narrow the entries down to the top twenty semi-finalists. From these twenty voted-for nominees, the editors of 3 Quarks Daily will select six finalists, and may also add up to three wildcard entries of their own choosing. The three winners will then be chosen from these by Frans de Waal.
The first place award, called the "Top Quark," will include a cash prize of 500 dollars; the second place prize, the "Strange Quark," will include a cash prize of 200 dollars; and the third place winner will get the honor of winning the "Charm Quark," along with a 100 dollar prize.
(Welcome to those coming here for the first time. Learn more about who we are and what we do here, and do check out the full site here. Bookmark us and come back regularly, or sign up for the RSS Feed.)
August 11, 2014:
- Nominations are now open. Please nominate your favorite blog entry by placing the URL for the blog post (the permalink) in the comments section of this post. You may also add a brief comment describing the entry and saying why you think it should win. Do NOT nominate a whole blog, just one individual blog post.
- Blog posts longer than 4,000 words are strongly discouraged, but we might make an exception if there is something truly extraordinary.
- Each person can only nominate one blog post.
- Entries must be in English.
- The editors of 3QD reserve the right to reject entries that we feel are not appropriate.
- The blog entry may not be more than a year old. In other words, it must have been first published after August 10, 2013.
- You may also nominate your own entry from your own or a group blog (and we encourage you to).
- Guest columnists at 3 Quarks Daily are also eligible to be nominated, and may also nominate themselves if they wish.
- Nominations are limited to the first 200 entries.
- Prize money must be claimed within a month of the announcement of winners.
August 22, 2014
- The nominating process will end at 11:59 PM (NYC time) of this date.
- The public voting will be opened soon afterwards.
August 30, 2014
- Public voting ends at 11:59 PM (NYC time).
September 5, 2014
- The finalists are announced.
September 22, 2014
- The winners are announced.
One Final and Important Request
If you have a blog or website, please help us spread the word about our prizes by linking to this post. Otherwise, post a link on your Facebook profile, Tweet it, or just email your friends and tell them about it! I really look forward to reading some very good material, and think this should be a lot of fun for all of us.
Best of luck and thanks for your attention!
by Scott F. Aikin and Robert B. Talisse
In the course of discussing the central themes of our recent book, Why We Argue (And How We Should), with audiences of various kinds, one kind of critical response has emerged as among the most popular. It deserves a more detailed reply than we are able to provide here; nonetheless, we want to sketch our response.
Why We Argue presents a conception of proper argumentation that emphasizes its essentially cooperative and dialectical dimension. Very roughly, our view runs as follows. The central aim of cognitive life is to believe what is true and reject what is false. We pursue this by appealing to our evidence and reasons when forming and maintaining our beliefs. Yet in pursuing this aim, we quickly encounter conflicting and inadequate evidence and reasons; furthermore, we discover that we each must rely upon other people as sources of evidence and reasons. Importantly, the others upon whom we must rely do not always speak univocally; they often provide conflicting reasons and evidence. Accordingly, in the pursuit of our central cognitive aim, we confront the inevitability of disagreement. Argumentation is the process by which we attempt to resolve disagreement rationally. Consequently, argumentation is inescapable for a rational creature like us; and the aspiration to argue properly is an indispensable corollary of our central cognitive aim.
The project of arguing well requires individuals to interact with each other in certain ways, and to avoid interacting in other ways. More specifically, in order to argue well, we must individually attempt to take the reasons, perspectives, arguments, criticisms, and objections of others seriously; we must see even those with whom we most vehemently disagree as fellow participants in the process of proper argumentation, and we must engage with them on those terms. This means, among other things, that when engaging in argument, one must seek to make the most of the reasons and considerations offered by one's opposition. Verbal tricks, insults, threats, and obfuscation are failures of argumentation, even when they prove effective at closing discussion or eliciting assent. A lot of Why We Argue (And How We Should) is devoted to cataloguing and dissecting common ways in which argumentation, especially political argumentation, fails.
So much for the nutshell version of our conception of argumentation. Let's turn now to the critical reaction it commonly invites. Critics say that our view is misguided because it cannot acknowledge the brute fact that most often we argue not to rationally resolve disagreement, but to end disagreement; and the favored way of ending disagreement is by winning an argument. Here a sports analogy is often introduced. Critics often claim that just as one plays baseball not (primarily) for the exercise, camaraderie, or the cooperative teamwork, but rather to win baseball games; so it is that when one argues, one argues to win.
by Jonathan Kujawa
The big news in math this week was the opening of the quadrennial International Congress of Mathematicians (ICM) in Seoul. A number of prestigious awards are given at the ICM. Most famously this includes the Fields medal and the Nevanlinna prize (aka the Fields medal for computer science). Up to four winners of the Fields medal are announced along with the winner of the Nevanlinna prize. All the winners must be no older than 40.
I had the pleasure to attend the 2006 ICM in Madrid. This is the ICM famous for Grigori Perelman refusing to accept the Fields medal for his work in finishing the proof of the Poincaré conjecture. Perelman (or at least the media version of him) comes across as the stereotypical eccentric mathematician uninterested in worldly things. Fortunately for the PR folks, this year's winners all appear to be the sort you'd enjoy having over for dinner and drinks.
This year the Fields medal went to Artur Avila, Manjul Bhargava, Martin Hairer, and Maryam Mirzakhani. The Nevanlinna prize went to Subhash Khot. An excellent profile of each of the winners, including very nicely done videos, can be found on the Quanta website. The profiles are a bit short on the actual math of the winners. If you'd like a more meaty discussion of their work, former Fields medalist Terry Tao wrote blog posts here and here giving a more technical overview. Even better, former Fields medalist Timothy Gowers is blogging from the ICM itself! He's giving summaries of the main talks as well as his more general impressions while at the event. I can also recommend that you check out the excellent overviews of some of the winners' work on John Baez's Google+ page.
Rather than talk about the details of the winners' work, I wanted to point out a common meta-mathematical feature of their research: the idea of studying a collection of objects as a whole, rather than one by one.
the way it
a perfect metaphor
for shapes of time,
overused as moon
for that which vanishes
quiet now, the wedding past
too much so—
a house that buzzed
now hushed, silence louder
than a midnight crescent
empties, spills, ebbs and fills,
evaporates and billows like a cloud
above a sugarbush still
boiling down sweet water
for its essence
by Jim Culleny
by Paul Braterman
All men are mortal.
Socrates is a man.
Therefore Socrates is mortal.
It's perfectly valid, meaning that if the premises are true, the conclusion must also be true. Despite this, as Bertrand Russell explained very clearly many years ago, the argument is almost totally worthless.
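For readers who want Russell's point about validity made fully explicit: validity means the conclusion follows mechanically from the premises, whether or not the premises are true. The following is a minimal sketch in Lean (the names are illustrative, not from the essay), where the entire proof is a single application of the first premise to the second:

```lean
-- The classic syllogism, formalized: if all men are mortal and
-- Socrates is a man, then Socrates is mortal. The proof term makes
-- no claim that either premise actually holds.
variable (Person : Type) (Man Mortal : Person → Prop)
variable (socrates : Person)

example (allMenMortal : ∀ p, Man p → Mortal p)
    (socratesIsMan : Man socrates) : Mortal socrates :=
  allMenMortal socrates socratesIsMan
```

The triviality of the proof is, in a sense, Russell's complaint: the formal machinery adds nothing to what we already knew.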
There is no real doubt that Socrates is mortal. Just look at the poor chap, clearly showing his 70 years. Bent, scarred from the Peloponnesian War, his brow furrowed by decades of unhappy marriage, and even more unhappy attempts to persuade his fellow citizens that the best form of government is a fascist oligarchy. Besides, he is on trial for doubting the existence of the gods, and the news from the Agora is not good. Take my advice, and do not offer him life insurance.
Even if we didn't know about his circumstances, we would readily agree that he is mortal. We see decrepitude and death around us all the time; few people have been known to live beyond a hundred years, none beyond 150, and we have no reason to believe that Socrates is any different. In fact, from our experience, we are a lot more certain that Socrates is mortal than we are that all men are mortal. Ganymede, Elijah, and the Virgin Mary were all, according to various traditions, taken directly up into heaven without having to go through the tedious process of dying. However, no Zeus-worshipper or biblical literalist or devout Catholic would for such reasons doubt the mortality of Socrates. So the premise that all men are mortal is actually less certain than the conclusion, and if we seriously doubted Socrates's mortality, we would simply deny that premise. In other words, this classic example of deductive logic tells us nothing that we didn't already know.
by Jalees Rehman
"Do not put your work off till tomorrow and the day after; for a sluggish worker does not fill his barn, nor one who puts off his work: industry makes work go well, but a man who puts off work is always at hand-grips with ruin." Hesiod in "The Works and Days"
Paying bills, filling out forms, completing class assignments or submitting grant proposals – we all have the tendency to procrastinate. We may engage in trivial activities such as watching TV shows, playing video games or chatting for an hour, and risk missing important deadlines by putting off tasks that are essential for our financial and professional security. Not all humans are equally prone to procrastination, and a recent study suggests that the tendency to procrastinate may in part have a genetic underpinning. Yet even an individual with a given genetic make-up can exhibit significant variability in the extent of procrastination. A person may sometimes delay initiating and completing tasks, whereas at other times that same person will immediately tackle the same type of tasks, even under the same constraints of time and resources.
A fully rational approach to task completion would involve creating a priority list of tasks based on a composite score of task importance and the remaining time until the deadline. The most important task with the most proximate deadline would have to be tackled first, and the lowest priority task with the furthest deadline last. This sounds great in theory, but it is quite difficult to implement. A substantial amount of research has been conducted to understand how our moods, distractibility and impulsivity can undermine the best-laid plans for timely task initiation and completion. The recent research article "The Categorization of Time and Its Impact on Task Initiation" by the researchers Yanping Tu (University of Chicago) and Dilip Soman (University of Toronto) investigates a rather different and novel angle in the psychology of procrastination: our perception of the future.
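The fully rational scheduler described above can be sketched in a few lines. This is only an illustration: the scoring rule (importance divided by days remaining) and the task names are made up for the example, not taken from the study under discussion:

```python
from datetime import date

def prioritize(tasks, today):
    """Rank tasks by a composite of importance and deadline proximity.

    Each task is a (name, importance, deadline) tuple, where importance
    is on a 1-10 scale and deadline is a datetime.date. Higher importance
    and fewer days remaining both raise a task's priority.
    """
    def score(task):
        _name, importance, deadline = task
        days_left = max((deadline - today).days, 1)  # avoid division by zero
        return importance / days_left
    return sorted(tasks, key=score, reverse=True)

# Hypothetical to-do list echoing the article's examples.
tasks = [
    ("pay bills", 8, date(2014, 8, 20)),
    ("class assignment", 6, date(2014, 9, 1)),
    ("grant proposal", 10, date(2014, 8, 25)),
]
print(prioritize(tasks, today=date(2014, 8, 18)))
```

A real scheduler would also need tie-breaking, tasks without deadlines, and some model of the moods and impulses the article goes on to discuss, which is precisely why the rational model is so hard to put into practice.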
"If you explain to a musician he'll tell that
he knows it but he just can't do it"
~ Bob Marley
It's hard to imagine that the Beastie Boys released "Paul's Boutique" around this time, 25 years ago. Even more astonishing is the fact that I recently had two separate conversations with members of the so-called Millennial Generation, which resulted in the extraordinary discovery that neither person had even heard of "Paul's Boutique." Now this may make me sound like an ornery codger complaining about how the young folk of today are illiterate because they have never heard of (insert name of your own pet artist). But taken together, these two events require me to submit a modest contribution to keeping the general awareness of "Paul's Boutique" alive and well.
What makes "Paul's Boutique" so extraordinary and enduring? The sophomore effort by the brash NYC trio debuted in 1989 and was the much-anticipated follow-up to "Licensed to Ill." But instead of a new set of frat party anthems along the lines of "Fight For Your Right (To Party)," listeners were treated to a continuous magic carpet woven out of a kaleidoscope of samples. Romping over this dense, schizophrenic bricolage, MCA, Ad-Rock and Mike D traded lightning-quick call-and-response rhymes that embraced the usual MC braggadocio but at the same time drew on a vast range of sources and styles. The effect, to this day, is a delirious sort of aural whiplash.
No one is clear on how many songs were actually sampled, although the number is certainly well over a hundred. The exegesis of both samples and lyrical references is a time-honored tradition, too. Around 1995, one of the first sites that ever made me think the World Wide Web might be a good idea was (and continues to be) the Paul's Boutique Samples and References List. When studied, Torah-like, alongside the Beastie Boys Annotated Lyrics and the record itself, one begins to appreciate the catholic taste of both the rappers and their producers, the inimitable Dust Brothers, who would go on to provide much of the genius behind Beck's seminal "Odelay" album a few years later.
by Tamuira Reid
I don't like writing about depression. Because it's hard to get right in words. Because I sound like an asshole when I try. Because I am too close to it still. Because my memory of what happened feels faulty at best.
I remember light streaming through the blinds, big fat rays of sun, hitting me in the face. I remember a phone next to me, maybe in the palm of my hand, maybe wedged between the mattress and my thigh. Cold coffee on the nightstand. Cigarette ash on the sheets. I remember the sounds of kids playing on the street below, throwing rocks at a metal shop gate.
Friends told me to buck up. Pull it together. Muscle through. They said things like fake it till you make it and everything happens for a reason. They blamed it on global warming. Growing pains. Venus is in retrograde, after all.
They wanted me to will myself better and all I wanted was to write my will. I thought I was dying. I believed with every fiber left of my being that I was dying. Case closed. The party is over.
The more I needed people the more I retreated from them. How could I tell them that I couldn't feel my body? That it was completely disconnected from my mind? I was a person in parts, each part trying to function in its dysfunction. Pieces that no longer fit together in a way that made any sense.
My neighbor at the time, a well-meaning philosophy professor who left his apartment each day only long enough to teach and buy wine, told me that depression comes in waves. But that made it sound too beautiful. There was nothing good about the bad. He suffered from melancholy, a sort of condition that he became addicted to, enamored of. A powerful, deep sadness that became life-affirming for him. People broke his heart but in a pretty, poetic way. And this somehow gave him buoyancy in this world.
But my depression felt different.
by Brooks Riley
by Dwight Furrow
The world of food and wine thrives on a heavy dose of nostalgia. Culinarians ("foodies" in the vernacular) chase down heritage tomatoes, ferment their own vinegar, and learn to butcher hogs in the name of "how things used to be" before the industrial food business created TV dinners and Twinkies. As we scour the Internet for authentic recipes, we imagine simpler times of family farms supporting family feasts consisting of real food, prepared in homey, immaculate kitchens with fruit pies on the windowsill, and the kids shelling beans at the table. Similarly, the wine industry continues to thrive on the romantic myth of the noble winemaker diligently tilling a small vineyard year after year to hand-produce glorious wines that taste of the local soil and climate.
Of course, in reality the winemaking of days past was not so romantic. Bad weather would have ruined some vintages, and difficulties in controlling fermentation temperatures, along with unsanitary conditions in the winery, rendered many wines barely drinkable. As to the way we ate in the not-too-distant past: for most people, food was scarce, expensive, of poor quality and often unsafe. Kitchens, if they existed, were poorly equipped and their operation depended on difficult, relentless work by women. Only the wealthy could eat in a manner approaching the quality of contemporary nostalgic yearnings, but that quality usually depended on the work of underpaid kitchen staff after slavery was abolished.
Nostalgia is a form of selective memory, history without the bad parts, enabling us to enjoy the past without guilt.
Does this dependency on myth render our contemporary fascination with the foods of the past a kind of kitsch—a sentimental, clichéd, easily marketed longing that offers "emotional gratification without intellectual effort," in Walter Benjamin's formulation—an aesthetic and moral failure? Worse, is this longing for the past a conservative resistance to the modern world? The word "nostalgia" has Greek roots—from nostos and algia, meaning "longing to return home." Are contemporary culinarians and wine enthusiasts longing for a return to the "good" old days?
by Kathleen Goodwin
Few students of colonial history can deny the power of spoken and written language to subject and subdue a population. Zia Haider Rahman's "dazzling debut" of a novel, "In the Light of What We Know", contrasts two South Asian characters who attended Oxford together as undergrads—a privileged Pakistani narrator who becomes an investment banker in London and his friend Zafar, an impoverished refugee of the 1971 Bangladeshi Liberation War turned Harvard-educated international human rights lawyer. One of the central themes of the novel is the way language can exert power over individuals and groups, as well as entire nations. Spoken language is an obvious manifestation of the tension created by modern-day neo-colonialism or the 1971 splintering of Bengali-speaking East Pakistan from Urdu-speaking West Pakistan. But Rahman also explores a parallelism in the way language—in the form of industry jargon, acronyms and other forms of coded phrasing—can create power structures with remarkable potency.
"In The Light of What We Know" skips around temporally but the narrative is centered around London in 2008, in the midst of the unfurling financial crisis. The nameless narrator is revealed to not only be a banker, but the head of the mortgage-backed securities unit of his (also unnamed) global investment bank and thus on the verge of losing his job as the public condemns him and his counterparts for the calamity that is taking place. Tellingly, Rahman's résumé includes a stint as a Wall Street investment banker prior to becoming a novelist. His purpose does not appear to be to crucify the financial sector, rather his novel explores the great irony of the financial crisis—the securities derived from residential mortgages, which when the American housing market collapsed became essentially worthless and set off the chain of events that have changed history forever, were vetted and encouraged by the entities that should have understood and prevented the systemic risk to global markets these securities posed. Rahman's narrator explains this as being a function of the incestuous and hierarchical relationship between the big banks and the ratings agencies and regulators. The narrator describes a financial club of sorts, headed by chummy Oxford classmates who maintain a revolving door hiring policy and most importantly—speak a financial language incomprehensible to the ignorant public. The critical point is the way these hidden power structures allowed the conditions preceding the financial crisis to occur. The ratings agencies and regulators were compliant with the investment banks while the rest of society was unaware of the huge gambles being sanctioned, which eventually proved to be detrimental to the stability of the global economy.
by Shadab Zeest Hashmi
Starry night, a large starry night with infinite trees, is the background of what seems to be an architectural form— a balcony, bridge, courtyard with pillars? In the foreground, a sphere with a curve draped over it like an arm. This drawing has the expansiveness that suggests eternity (or waiting for what seems like an eternity) and monumentality, as well as intimacy, a sense of security. A birthday present from my teenage son, this abstract drawing is titled “colic.” The architectural form is a crib, the starry sky is the sleepless, endless night of shared anxiety between a mother and her colicky newborn.
I am handed this drawing on my return from an evening in New York City, my eyes still filled with the lambent and the monumental, with sorrows hidden under careful inscriptions; riches, anxiety, loneliness, poverty, and also a plenitude of heart, a sharing of burdens. My son’s drawing belongs in the series of photographs I have just taken of the city— of monumentality and intimacy: endless tunnel ceilings, vertiginous buildings, old trees, sparkle, strangers caught sharing a laugh as they contend with waiting in queue together. Wear this city like a jewel if you will, or a sensible shoe— carry it like a bouquet of nerves, or an empty envelope— New York is resplendent and humble, so high and mighty it gives you the cold shoulder, so electric it sings you into rebirth.
“Colic” is about birth, and the anxiety and excitement of growth. When I read New York into this drawing, I see the loftiness of empire— starry and sorry— the darkness of hierarchies, the bond of empathy. I see the struggle for meeting the definition of nationhood, the founding fathers are in the high rises, in the homeless, in the cogs and wheels, in the sobs and hiccups of the centuries.
But it is my birthday today and this drawing jolts me into the realization that the night sky is still full of uncertainty, mystery and hope— colic is still a good metaphor for life, that I still long for my own mother’s protective arms, that nothing is sweeter than to be remembered as an extended arm by my son.
Sunday, August 17, 2014
Carl Zimmer in the New York Times:
Your body is home to about 100 trillion bacteria and other microbes, collectively known as your microbiome. Naturalists first became aware of our invisible lodgers in the 1600s, but it wasn’t until the past few years that we’ve become really familiar with them.
This recent research has given the microbiome a cuddly kind of fame. We’ve come to appreciate how beneficial our microbes are — breaking down our food, fighting off infections and nurturing our immune system. It’s a lovely, invisible garden we should be tending for our own well-being.
But in the journal Bioessays, a team of scientists has raised a creepier possibility. Perhaps our menagerie of germs is also influencing our behavior in order to advance its own evolutionary success — giving us cravings for certain foods, for example.
Maybe the microbiome is our puppet master.
“One of the ways we started thinking about this was in a crime-novel perspective,” said Carlo C. Maley, an evolutionary biologist at the University of California, San Francisco, and a co-author of the new paper. “What are the means, motives and opportunity for the microbes to manipulate us? They have all three.”
Massimo Pigliucci in Scientia Salon:
Graham Priest is a colleague of mine at City University of New York’s Graduate Center, a world renowned expert in logic, a Buddhist connoisseur, and an all-around nice guy. So I always pay attention to what he says or writes. Recently he published a piece in Aeon magazine entitled “Beyond true and false: Buddhist philosophy is full of contradictions. Now modern logic is learning why that might be a good thing.” I approached it with trepidation, for a variety of reasons. To begin with, I am wary of attempts at reading things into Buddhism or other Asian traditions of thought that are clearly not there (the most egregious example being the “documentary” What The Bleep Do We Know?, and the most frustrating one the infamous The Tao of Physics, by Fritjof Capra). But I quickly reassured myself because I knew Graham would do better than that.
Second, Graham knows a lot more than I do about both logic and Buddhism (especially the latter), so surely I was going to learn new things about both topics and, more crucially, how they are related to each other. The problem is that I ended up learning and appreciating more about logic, not so much about Buddhism, and very little about their congruence. Hence this essay.
I am going to follow Graham’s exposition pretty closely, and will of course invite him to comment on my take at his pleasure. Broadly speaking, my thesis is that the parallels that Graham sees between logic and Buddhism are more superficial than he understands them to be and, more importantly, that Buddhism, as presented in his essay, is indeed a type of mysticism, not a philosophy, which means that logic (and, consequently, argumentation) is beside the point. Moreover, I will argue that even if the parallels with logic run as deep as Graham maintains, Buddhism would still face the issue — fundamental in any philosophy — of whether what it says is true of the world or not, an issue that no mystical tradition is actually equipped to handle properly.
As Iraq faces a new crisis, the novel Baghdad Central explores the freighted “moment of ambiguity” a decade earlier.
Guernica: Baghdad Central is unusual first of all for its Iraqi protagonist. What was the genesis of the book?
Elliott Colla: I saw The Hurt Locker in August 2009. I thought it was a great film but it also really disappointed and infuriated me. Here was another great work about the American war in the Middle East, and yet again there are no non-American characters. Iraqis in that film are either victims or perpetrators and Americans get to be heroes. I was with a friend, and we talked about how we should just flip that on its head. What would it be like to have a movie where all the heroes were Iraqi and all the Americans were on the periphery? I sat down and wrote, and when I woke up the next morning I had this character, Khafaji, in my mind. Then there was the work of imagination and research. I was going to take the American bogeyman, the villain—the Baathist war criminal—and see what it would be like to make this kind of person into a hero. What would it take to make a reader like him, or become interested in his story before they learn that he’s a war criminal?
Guernica: How knowledgeable of Iraq were you at the time?
Elliott Colla: Recently I heard Barbara Ehrenreich talk about her writing process, and in response to a question on whether to write what you know, she said, “Listen, I write what I want to know.” I couldn’t put it any better.
Justin E. H. Smith in his own blog:
There have been many forceful contributions recently to the discussion of academic philosophy's 'white man problem' (see in particular here). I have been trying in my own way to contribute to these discussions, but what I am able to contribute is limited by the fact that in my social identity I am pegged as a cis straight white man (though in truth, I feel like protesting, it is far more complicated than this; and isn't it always!), and also by the fact that I disagree with my political allies in the effort to make academic philosophy more inclusive on some fundamental philosophical points as to what this inclusiveness must involve. Allow me to elaborate briefly on this latter limitation.
Jonardon Ganeri, following Homi Bhabha, articulates a distinction between two sorts of intercultural communication: cosmopolitanism and pluralism. Cosmopolitanism tends to interpret different viewpoints as "co-inhabitants in a single matrix, and to that extent [as] susceptible to syncretism," while the cardinal tenet of pluralism "is that the irreconcilable absence of consensus is itself something of political, social, or philosophical value" (31). It has come to seem to me that most proposed solutions to the 'white man problem' in philosophy are based on a philosophical commitment to pluralism, in the sense defined, whereas I believe that cosmopolitanism is far more appropriate to the subject under investigation: expressions of philosophical ideas about, say, mind-body dualism, or the relationship between utterances and the things the utterances are about, really do exist in a universal matrix, bounded by the evolutionary history of the human species, whether they occur in Europe, India, or Amazonia. To study any of these ideas as if they were the particular property of any constituency in virtue of affiliation or ancestry is simply bad scholarship.
Nina Martyris in Forbes India:
Gabriel García Márquez, the genius of the imagination who died in April at the age of 87, may not have written on India, but he had a multifaceted connection with the country that can be boiled down to three people: A Gandhi, a gypsy, and a Rushdie. Gandhi first, and for that we must wind back to a magical morning in October 1982, when the news broke that the Latin American writer who had enchanted the world with One Hundred Years of Solitude had been awarded the Nobel Prize for literature.
In that joyous moment, Gabo became to Colombia what Pelé is to Brazil. For this beautiful Caribbean country battered by poverty, drug violence and civil war, the fact that one of its countrymen had won the Nobel was akin to it winning the World Cup. And because Gabo had grown up dirt poor and could be as coarse as a sailor and as chivalrous as Don Quixote, celebrations erupted not just in Bogotá’s linen-clad salons but in the country’s barrios and villages as well. Taxi drivers in Barranquilla, where Gabo had spent his early years as a journalist, heard the news on their radios and began to toot their horns in unison. One excited reporter asked a prostitute if she had heard, and she replied, yes, a client had told her in bed. This nugget would have delighted the new laureate, for not only are prostitutes—especially the trembling child prostitute—portrayed with extraordinary sympathy in his stories (his depiction reminds one of Manto’s Bombay prostitutes), he himself had lived above a brothel in his youth when he was unable to afford more respectable quarters. Publicly Gabo maintained that winning the Nobel would be “an absolute catastrophe”, but secretly he longed for it. And so, when the Swedish minister called his home in Mexico City with the news, he put down the phone, turned to his beloved wife Mercedes, and said: “I’m fucked”.
That day, his telephone was so jammed with calls that his old friend Fidel Castro was forced to send a telegram: “Justice has been done at last… Impossible to get through by phone.” Far away in New Delhi, Prime Minister Indira Gandhi thrilled to the news, not least because she happened to be in the middle of One Hundred Years of Solitude. In a lucky turn of events, Gandhi got a chance to meet Castro the very next month in Moscow, where they’d both gone to attend Soviet Premier Leonid Brezhnev’s funeral. Why don’t you bring your friend to India for the Non Aligned Movement summit next year, she suggested. Why not, said Castro.