Monday, September 14, 2015
by Jalees Rehman
Some years ago, I was enveloped by the desire to see our children grow up to be poets. I used to talk to them about poetic metaphors and rhymes, and read them excerpts from the biographies of famous poets. When the kids were learning about haikus at school, I took the opportunity to pontificate on the controversies surrounding the 5-7-5 syllable count and the difficulties of imposing classic Japanese schemes on the English language, which abounds in diphthongs and long syllables.
The feedback from our children was quite mixed, ranging from polite questions such as "Do you know how long this will take?" to less polite snores. I had apparently not yet succeeded in my attempts to awaken their inner poet.
Our younger son was about eight years old when we found out about a wonderful opportunity to inculcate a love of literature in our children: the Chicago Printers Row Literature Festival! I was especially excited by the fact that they would have a special "Lil' Lit" area, just for children. I convinced the whole family to go - promising to reward each kid with $5 if they accompanied us. I hoped that my poetry monologues had prepared the children for the poetic muses that they would encounter at the festival.
Even though it was early June, Chicago was experiencing one of its rare June Gloom weekends with cloudy, drizzly weather and frosty breezes. After exiting the parking garage, our kids tried to renegotiate the promised $5 reward in light of the unpleasant weather. I brushed off their whining and charged towards the long-awaited beacon of literary pleasure.
Once we arrived at "Lil' Lit", we saw a bunch of near-empty booths and an elderly author reading from a book to a couple of five-year-olds, surrounded by fifty empty seats. The few booth owners looked at us with great expectations. They had been staking out the crowd of adults walking past and had not been able to spot any children, so my children quickly became the center of attention at "Lil' Lit".
The children were not too enthusiastic about sitting down with the author who was reading from her book, perhaps because her sparse audience had the same facial expressions that our kids exhibited when I talked about poetry. We looked around and spotted a giant yellow "Bouncy Book" which caught the kids' attention. But before they could rush over there and begin jumping on it, they saw that the ginormous hollow book had a hole and kept on deflating.
One of the booths was called "Creative Creations". I was puzzled by the title, but relieved when our kids volunteered to participate in the activity. Apparently, this booth was giving children chalk so that they could unleash their creativity. All three of our children took to the idea and started drawing and writing on the sidewalk in beautiful rainbow colors. For some strange reason, my eight-year-old son took his "ninja glove" out of his pocket and grabbed a piece of green chalk. I relaxed, and my wife and I strolled around the area, pleased to live in Chicago, a city that offered such cultural enrichment for children. I uttered a silent prayer of thanks for the fact that we had left dreary Indiana and recently moved to Chicago.
After about fifteen minutes or so, we returned to the booth of creativity. One of the ladies who operated the booth came up to me and said, "Excuse me sir, your son has …ahem…written a…haiku…"
I couldn't believe it! All my hard work had finally paid off. Even though they had pretended not to listen, at least one of them had learned how to write a haiku. I was not sure if I was more proud of his accomplishment or my superb teaching.
I smiled and walked to the area of the sidewalk where the haiku was written.
I was first going to count the syllables, but once I started reading it, I stopped counting:
Torture comes to me,
I did not know how to respond to the accusatory glances of the lady. I looked at my son, who was folding away his "ninja glove".
He then calmly asked "Can we go now?"
I saw the deflating bouncy book in the background and I nodded, trying to hide my embarrassment with a half-hearted smile.
His haiku brought about the end of the poetry monologue series in our house.
Monday, June 22, 2015
The Archetype Of The Suffering Artist Must Die
by Mandy de Waal
Click on over to the New York Times and you'll find a gallery of tortured artists. First up is a youthful but ghostly-looking Jean Nicolas Arthur Rimbaud. The caption for the dark painting on the NYT site reads: "The Poet Rimbaud. Serial runaway. Absinthe and hashish benders. Shot by poet-lover Verlaine."
Born in October 1854 in the Champagne-Ardenne region of France, Rimbaud started writing poetry in primary school. By the time he was 16 he'd already written Le Dormeur du Val [The Sleeper In The Valley].
"It is a green hollow where a stream gurgles," the poem begins, before telling the story of "A young soldier, open-mouthed, bare-headed, With the nape of his neck bathed in cool blue watercress," sleeping stretched out on the grass under the sky.
Written during the Franco-Prussian War, the poem builds toward a tragic denouement:
"No odour makes his nostrils quiver;
He sleeps in the sun, his hand on his breast
At peace. There are two red holes in his right side."
Rimbaud's life was no less grim. His genius flowered early, and then stalled. By the time he was 21 he'd stopped writing. Four years earlier he'd sent Le Dormeur du Val to the celebrated French poet Paul Verlaine, who forsook his wife and child for Rimbaud. The relationship would end a few short years later, after Verlaine discharged a gun at Rimbaud in a jealous, drunken rage. Rimbaud wouldn't die then, but at the age of 37, after many agonising months of suffering from bone cancer.
Also on display in the gallery of the artiste manqué is Frida Kahlo (1907 to 1954), who contracted polio at the age of six; the disease withered her right leg, which was eventually amputated. It is also thought that Kahlo had spina bifida. When she was 18, the artist was in a freak bus accident. "The tram she was riding collided with a bus and the tram's handrail penetrated her vagina. In an extra and tragic irony, someone on the tram had been carrying gold paint which spilled over Frida and the other passengers," Mike Gonzalez writes in ‘Frida Kahlo: a Life' for the Socialist Review.
After the crash there was a long period of painful convalescence, and the Mexican painter would suffer from bouts of pain for the rest of her life. Then there was the emotional torment. The troubled, tempestuous relationship with Diego Rivera. His jealousies over her affairs, and her fury over his relationship with her sister, Cristina.
Others featured in the NYT's hall of hardship include "The Composer Beethoven. Sixteen when mom died. Went deaf at height of his gift. Chronic pain." You'll also find the novelist Jean Genet ["Mom a prostitute. As was he. Put up for adoption. Vagabond. Thief."].
These and countless other stories underscore the archetype of the suffering artist, an archetype that pervades the psyches of today's creators and makers. The narrative goes like this: if you want to be a Van Gogh, you've got to cut off your ear. You've got to suffer for your art.
But thankfully, there are those who think that the notion of the anguished artist is bullshit, and I've got to say I agree with them wholeheartedly. Surrealist film legend David Lynch thinks that suffering doesn't turn artistic dross to gold; rather, he says, it hinders artists.
"It's good for the artist to understand conflict and stress. Those things can give you ideas. But I guarantee you, if you have enough stress, you won't be able to create. And if you have enough conflict, it will just get in the way of your creativity," writes Lynch in his book Catching the Big Fish.
"Some artists believe that anger, depression or these negative things give them an edge. They think they need to hold onto that anger and fear so they can put it in their work. And they don't like the idea of getting happy — it makes them want to puke. They think it would make them lose this supposed power of the negative," Lynch writes.
But torturing oneself for some kind of creative return doesn't make sense, now does it? Where's the logic in that? Or as Lynch writes: "It's common sense: The more the artist is suffering, the less creative he is going to be. It's less likely that he is going to enjoy doing his work and less likely that he will be able to do really good work."
In South Africa, educator and digital luminary Dave Duarte is no fan of the starving artist archetype either, and is actively working to disrupt it. The CEO of learning and teaching company Treeshake.com and a Young Global Leader of the World Economic Forum believes "the story of the struggling artist has gone on long enough."
"We don't treat the artistic and creative disciplines with the same current economic respect as we do so many other disciplines, and yet artists can be just as impactful and transformative as entrepreneurs and businesses – if not more so," says Duarte, who together with Elaine Rumboll [managing director of The Creative Leadership Consultancy] teaches artists how to reject the myth of the ‘starving artist' by becoming successful creators and makers.
"Each year we get about 30 to 40 artists from a range of different disciplines - from sculpture and fine art to graphic design and product design and video or writing or comedy. People with a diverse range of disciplines come together to learn the basic business skills that aren't taught," Duarte says. "This problem was identified by Elaine [Dave's muse and life partner] as critically important because fundamentally, practicing artists are creative entrepreneurs - they enter the business market and have to fend for themselves."
"We're disrupting the ‘starving artist' myth by dealing with misconceptions that are deeply ingrained in the culture of arts. One of these fantasies is that being commercially oriented undermines the integrity of one's work. Myths like these are deeply held misconceptions that are perhaps even taught at art and/or design schools and become criticisms of art or artists. Culturally artists who are commercially successful can be seen as sell-outs or lacking integrity, which is nonsense," Duarte says.
Duarte – who serves on Endeavor's Venture Corps and in so doing helps the organisation achieve its goal of supporting high-impact entrepreneurs across the world – explains that it doesn't make sense for society to enable athletes or lawyers or accountants to be professional, while expecting artists to suffer and starve. "We are conscientising and changing the creative space by showing people how making money is not the opposite of doing good art," he says.
"The first thing that artists need to realise is that as an artist you are a creative entrepreneur essentially, and that what you want to do is to create financial constancy for yourself so you can focus on your work at the very least," says Duarte, adding that artists need to create platforms for their business that enable scale.
What are the first things all artists, makers and creators need to learn? "Things like cash flow are really important to understand but are not well understood business concepts in the arts community. In creative spaces we don't talk about deal flow leads; how to negotiate; how to make sales; or how to close deals. All of these conversations are really important for creating sustainable businesses," says Duarte.
Other important lessons artists need to learn speak to sales and marketing. "Sales is like hunting and marketing is like farming," says Duarte, who then explains: "Sales will give you a quick win, but this requires a lot of energy because artists must go out and get these quick wins." Being in a ‘sales-oriented mode' is very different to creating art, and is almost a separate part of an artist's work.
"Sales can take you out of your process. This is fine, and this is necessary, but to be sustainable as an artist in the long run, creators and makers should be thinking about the branding and marketing perspective, which is more about being a farmer. Being a farmer is about taking a long view on things," says Duarte. Just as farmers plant and nurture and nourish, so too artists need to consider and do that which grows their brands.
Farming is about investing in the creative brand, the artist's reach, and the artist's community, which enables a pull, rather than a push, marketing strategy. "Farming as an analogy for branding means artists don't have to go out to their markets that much and disrupt their creative work and process. Farming is about creating branding that enables an artist's market to come in and meet them a lot more often," Duarte says.
Rumboll and Duarte's key teachings for artists also include the basics of branding. Sensory consistency shows creatives how to look and sound the same online as in the real world, and there are teachings in experiential consistency. "For instance if your work is provocative, hopefully you are as provocative in your marketing of the work. Emotional consistency is how you make people feel around your work and you," Duarte explains, stressing that it is important to have an underlying consistency and to communicate consistently to the marketplace, because this is what grows brands. "In other words don't consider marketing as something that is separate from the act of producing your work. It is an extension and a part of your creative product that you need to imbue your philosophy into," he says.
The thinking behind this speaks to a concept articulated by the founding executive editor of Wired magazine, that of ‘a thousand true fans'. In a wildly popular blog post written in March 2008, the author of New Rules for the New Economy: 10 Radical Strategies for a Connected World and What Technology Wants states: "The long tail is a decidedly mixed blessing for creators. Individual artists, producers, inventors and makers are overlooked in the equation. The long tail does not raise the sales of creators much, but it does add massive competition and endless downward pressure on prices. Unless artists become a large aggregator of other artist's works, the long tail offers no path out of the quiet doldrums of minuscule sales."
The Long Tail is an expression used to articulate the market shift from mainstream products to the niche, a shift enabled by the democratisation of economies and markets by the internet. The best thinking on this phenomenon is captured in a book by the current editor-in-chief of Wired magazine, Chris Anderson, called The Long Tail: Why the Future of Business is Selling Less of More.
Kelly says that rather than "aiming for a blockbuster hit" artists should escape the long tail by finding 1,000 "true fans". "A creator, such as an artist, musician, photographer, craftsperson, performer, animator, designer, videomaker, or author – in other words, anyone producing works of art – needs to acquire only 1,000 True Fans to make a living," writes Kelly. "A True Fan is defined as someone who will purchase anything and everything you produce."
"They will drive 200 miles to see you sing. They will buy the super deluxe re-issued hi-res box set of your stuff even though they have the low-res version. They have a Google Alert set for your name. They bookmark the eBay page where your out-of-print editions show up. They come to your openings. They have you sign their copies. They buy the t-shirt, and the mug, and the hat. They can't wait till you issue your next work. They are true fans," Kelly writes.
In short, the thinking is that to become sustainable businesses, artists should aspire to have at least one thousand true fans.
Why do we want our artists to survive and thrive? "For innovation to thrive in a country, you need artists," says Rumboll on a YouTube video that promotes Business Acumen For Artists. "You need that creativity to drive any innovation. Without artists, innovation cannot happen." Rumboll says that in a coming paradigm shift, artists will be known as creative entrepreneurs.
Rumboll declares: "The absolute starting point is the knowledge that what Andy Warhol said is true, and this is that being good at business is the most fascinating kind of art."
The archetype of the artist who must suffer for society is as flawed as it is outdated. It is time for that myth to depart. Long live artists who create, and contribute to society, and who make a sustainable, happy livelihood doing this.
* * *
Why I hate the myth of the suffering artist by Al Kennedy at The Guardian.
The Myth of the Tortured Artist at The Daily Beast.
Monday, March 16, 2015
Lament of the Expunged Metaphor
You bastard! You butcher! You murdering swine!
I had it all: beauty, aptness, concision.
I fit snugly into that trimetric line.
And what's my reward? –A brutal excision.
Don't tell me they told you to "kill all your darlings."
Bill Faulkner's not going to take this rap.
That's a defense used by Eichmanns and Görings:
"I just followed orders." Don't give me that crap!
I could have been something—a catchphrase, a clichéd
Expression. Folk would have asked, "Who said it?"
You should have stuck by me. We would have made
Such a statement—and you'd have the credit.
I knew it was coming. I saw how you treated
That cute little simile in the first stanza.
It was she got you started; now, she's deleted.
The dreaded black line came through like a panzer.
And you smiled as you did it! I saw you smirking
As you penned her replacement. That's when I lost hope.
You'll axe us, no matter how well we're working,
The moment you're smitten with a pretty new trope.
Oh you're clever—like Bluebeard!—and so discreet.
The world never sees any trace of your crimes.
No bruises. No blood. Just a clean printed sheet
Of meticulous meter and neat little rhymes.
But not even your cunning will suffice
To save you from what I hope and trust is
To be your fate, the terrible price
Assessed by the gods of poetic justice--
One day, leafing through a rival's verse,
You'll see me, set in a beautiful line
Like a mounted gem. And then you'll curse
Your cruel folly, and cry, "But . . . you're mine!"
And too late you'll discover my charms.
And you'll want me back. And I'll say, "Never!
Your darling lies in another's arms,
A thing of beauty lost forever."
by Emrys Westacott
Monday, February 03, 2014
Haiku and Landays in Science
by Jalees Rehman
That's all that remains
Of warriors' dreams.
My favorite scientific experiments are those which resemble a haiku: simple and beautiful with a revelatory twist. This is why the haiku is very well suited for expressing scientific ideas in a poetic form. Contemporary haiku poets do not necessarily abide by the rules of traditional Japanese haiku, such as including a word which implies the season of the poem or the 17 (5-7-5) syllable structure of three verses. Especially when writing in a language other than Japanese, one can easily argue that the original 5-7-5 structure was based on Japanese equivalents of syllables and that there is no need to apply this syllable count to English-language haiku. Even the reference to seasons and nature may not apply to a modern-day English haiku about urban life or, as in my case, science.
Does this mean that contemporary haiku are not subject to any rules? In the introductory essay to an excellent anthology of English-language haiku, "Haiku in English: The First Hundred Years", the poet Billy Collins describes the benefit of retaining some degree of structure while writing a haiku:
Many poets, myself included, stick to the basic form of seventeen syllables, typically arranged in three lines in a 5-7-5 order. This light harness is put on like any formal constraint in poetry so the poet can feel the comfort of its embrace while being pushed by those same limits into unexpected discoveries. Asked where he got his inspiration, Yeats answered, "in looking for the next rhyme word." To follow such rules, whether received as is the case with the sonnet or concocted on the spot, is to feel the form pushing back against one's self-expressive impulses. For the poet, this palpable resistance can be a vital part of the compositional experience. I count syllables not out of any allegiance to tradition but because I want the indifference and inflexibility of a seventeen-syllable limit to balance my self-expressive yearnings. With the form in place, the act of composition becomes a negotiation between one's subjective urges and the rules of order, which in this case could not be simpler or firmer.
The seventeen syllable limit – like any other limit or rule in poetic forms – provides the necessary constraints that channel our boundless creativity to create a finite poem. It is a daunting task to sit down with a pen and paper, and try to write a poem about a certain topic. Our minds and souls are flooded with a paralyzing plethora of images and ideas. But, as Collins suggests, if we are already aware of certain rules, it becomes much easier to start the process of poetic filtering and negotiation.
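For readers who like to tinker, the "light harness" of a syllable count can even be checked mechanically. The little Python sketch below uses a deliberately naive vowel-group heuristic – English syllabification is far too irregular for this to be reliable, so treat it as a toy for playing with drafts rather than an arbiter; all function names here are my own invention:

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of vowels, discounting a
    trailing silent 'e' (as in 'life' or 'are')."""
    word = word.lower().strip(".,;:!?'\"")
    groups = re.findall(r"[aeiouy]+", word)
    n = len(groups)
    # Crude silent-e rule: 'life' -> 1, but keep '-le' endings ('little').
    if n > 1 and word.endswith("e") and not word.endswith("le"):
        n -= 1
    return max(n, 1)

def line_syllables(line: str) -> int:
    """Total syllable estimate for one line of verse."""
    return sum(count_syllables(w) for w in line.split())

def matches_form(lines: list[str], pattern: list[int]) -> bool:
    """Check a poem against a syllable pattern, e.g. [5, 7, 5]
    for a traditional haiku or [9, 13] for a landay couplet."""
    return [line_syllables(l) for l in lines] == pattern

haiku = ["Grainy threads in cells,",
         "powerhouses of life are",
         "harbingers of death"]
print([line_syllables(l) for l in haiku])   # → [5, 7, 5]
print(matches_form(haiku, [5, 7, 5]))       # → True
```

The heuristic happens to agree with the traditional count for this particular haiku, but it will stumble on words like "poem" or "fire"; a pronunciation dictionary would do better if one wanted a serious checker.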
What is the essence of a haiku? In the same essay, Collins offers a very elegant answer:
Whether they are the counting or the non-counting type, poets are likely to agree that at the heart of the haiku lies something beyond counting, that is, its revelatory effect on the reader, that eye-opening moment of insight that occurs whenever a haiku succeeds in drawing us through the keyhole of its details into the infinite, or to put it more ineffably, into the "Void of the Whole." No one would argue that any tercet that mentions a cloud or a frog qualifies as a real haiku; it would be like calling an eleven-line poem about courtly love a sonnet. A true haiku contains a special uncountable feature, and every serious devotee of the form aims to achieve that with every attempt.
The revelatory surprise, the "Aha moment", is what characterizes a true haiku. I have experimented with the haiku form, trying to capture scientific concepts or the process of scientific discovery. Many poets do not give titles to their haiku, but I feel that the title can be very helpful to create a poetic tension and provide a context that may be difficult to incorporate within the haiku verses. A haiku – like every good poem – should not require explanatory lines by the poet, but I think that one can make some exceptions here in the context of experimenting with haiku.
Scientific images or phrases are not always self-evident, so I include brief annotations for the haiku I have written which may be helpful for people who are not routinely exposed to the scientific research.
Grainy threads in cells,
powerhouses of life are
harbingers of death
I have been studying mitochondria for a number of years, but I still marvel at the Janus-like role of mitochondria. They are active sites of biosynthesis and produce the universal energy molecule of cells (ATP) thus ensuring the growth and survival of cells. At the same time, mitochondria can initiate a cell's suicide program (apoptosis), forcing a cell to die. You can read about some of our mitochondrial research on lung cancer here.
Ceci n'est pas une
pipette, porting microdrops
for my macrodreams
Many of us have spent hours, days and months repetitively pipetting hundreds of samples for PCR reactions, ELISA assays or other tests, and sooner or later most of us wonder about the meaning of these Sisyphean tasks.
in science are tested only
to be rejected
If I received a dollar for every wonderful scientific idea I have had that turned out to be wrong, I would not have to write any more grants to support my lab.
Haiku have become an integral part of English language poetry, but there is another poetic form that may soon be gaining popularity. The journalist and poet Eliza Griswold recently teamed up with the photographer Seamus Murphy, traveled to Afghanistan and collected landays, which are commonly composed by Afghani women in their native language Pushto. Landays are a form of folk poetry, couplets consisting of a verse with nine syllables followed by one with thirteen syllables. Griswold worked with native Pushto speakers to translate the landays into English. In her brilliant essay published in the June 2013 issue of Poetry Magazine, Griswold provides us with glimpses into the lives of Afghani women and the hardships that they face on a daily basis. The essay also contains translations of landays, which have become a form of lyrical resistance for Afghani women, allowing them to voice their anger and frustration. Illiterate women compose, share and recite these poems, often anonymously and behind closed doors, in a society that marginalizes women. The narratives about Afghani women and the translations of landays, which preserve their characteristic wit and sarcasm, are accompanied by haunting photographs that convey the beauty of war-torn Afghanistan and its people.
Here is a description of landays from Griswold's essay:
A landay has only a few formal properties. Each has twenty-two syllables: nine in the first line, thirteen in the second. The poem ends with the sound "ma" or "na." Sometimes they rhyme, but more often not. In Pashto, they lilt internally from word to word in a kind of two-line lullaby that belies the sharpness of their content, which is distinctive not only for its beauty, bawdiness, and wit, but also for the piercing ability to articulate a common truth about war, separation, homeland, grief, or love. Within these five main tropes, the couplets express a collective fury, a lament, an earthy joke, a love of home, a longing for the end of separation, a call to arms, all of which frustrate any facile image of a Pashtun woman as nothing but a mute ghost beneath a blue burqa.
Examples of landays collected by Griswold:
You sold me to an old man, father.
May God destroy your home, I was your daughter.
I tried to kiss you in secret but you're bald!
Your bare skull thumped against the wall.
I dream I am the president.
When I awake, I am the beggar of the world.
In April of 2014, Griswold and Murphy will also release the book "I Am the Beggar of the World: Landays from Contemporary Afghanistan" which will contain a more comprehensive collection of landays.
Landays have not yet caught on as a poetic form in the English language, but this landmark work by Griswold might change that. I think that landays might be a great opportunity for scientists to describe their experiences with the scientific enterprise.
My landays revolve around the work and lives of academic scientists:
I work alone in the lab each night,
conducting all our experiments for your career.
Sirens of tenure captivate us,
chained to hallowed halls of academic freedom.
Journals can make or break our careers,
careers can make or break us, we can make or break journals.
These landays attempt to approximate the 9-13 syllable count in the couplets but as with haiku, the nature, structure and themes of landays written in English will likely be different from the original Pushto landays.
It does not really matter what poetic form or structure scientists choose to express themselves, but my personal experience has been that poetry is a wonderful way to share science. Writing haiku or landays about science has forced me to think about what aspects of my scientific work I really treasure. What started as a playful exercise with words has become a journey.
Monday, December 09, 2013
Google Zeitgeist: Annoying Philosophers, Weird Germans and White Pakistanis
by Jalees Rehman
The Autocomplete function of Google Search is both annoying and fascinating. When you start typing the first letters or words of your search into the Google search box, Autocomplete takes a guess at what you are looking for and "completes" the search phrase by offering you multiple query phrases. The queries offered by Autocomplete are "a reflection of the search activity of users and the content of web pages indexed by Google". Considering that more than five billion Google searches are conducted on an average day, the Autocomplete function has a huge database of search information to reference. This also means that the Autocomplete suggestions are quite dynamic and can vary over time. A popular new song lyric, the name of a viral video or a recent movie quote can catapult itself to the top of the Autocomplete suggestion list within a matter of hours or days if millions of users start searching for that specific phrase. Autocomplete may also take a user's browsing history or location into account, which explains why it may offer a varying set of suggestions to different users.
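Out of curiosity, these suggestions can also be poked at programmatically. The sketch below builds a request for the widely known but unofficial Google "suggest" endpoint and parses its response; the endpoint URL, the `client` parameter and the `["query", [suggestions...]]` response shape are assumptions based on how the endpoint has commonly behaved, not documented guarantees, and the function names are mine:

```python
import json
from urllib.parse import urlencode

# UNOFFICIAL, undocumented endpoint (an assumption) - it may change
# or disappear without notice.
SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search"

def suggest_url(query: str, client: str = "firefox") -> str:
    """Build a suggest-request URL for a partial search query."""
    return SUGGEST_ENDPOINT + "?" + urlencode({"client": client, "q": query})

def parse_suggestions(raw: str) -> list[str]:
    """Parse the assumed JSON response shape:
    ["<query>", ["suggestion 1", "suggestion 2", ...]]"""
    payload = json.loads(raw)
    return payload[1]

# Offline illustration with a made-up response of the assumed shape:
sample = '["scientists are", ["scientists are liars", "scientists are stupid"]]'
print(suggest_url("scientists are"))
print(parse_suggestions(sample))
```

Fetching the URL with any HTTP client and feeding the body to `parse_suggestions` would show the live suggestions for your locale; since results depend on location and history, two users may well see different lists, as the post describes.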
Autocomplete can be quite annoying because the suggested lists of queries are based on their web popularity and can thus consist of bizarre combinations which are not at all related to one's intended searches. On the other hand, Autocomplete is also a fascinating tool to provide a window into the Zeitgeist of web users, revealing what kinds of phrases are most commonly used on the web, and by inference, what contemporary ideas are currently associated with the entered keywords. The Google Zeitgeist website reveals the most widely searched terms to help identify cultural trends - based on the frequency of Google search engine queries - during any given year.
The United Nations Entity for Gender Equality and the Empowerment of Women (UN Women) recently used the Google Search Autocomplete function in an ad campaign to highlight the extent of misogyny on the web. Searching for "women should…" or "women need to…" was autocompleted to phrases such as "women should be slaves" or "women need to be put in their place". The fact that Autocomplete suggested these phrases means that probably hundreds of thousands of internet users have used these phrases in their search queries or on web pages indexed by Google – a reminder of how much gender injustice still exists in our world.
A recent article in Slate pointed towards another form of bias unveiled by Autocomplete: Occupational prejudice. The search phrase "scientists are…." was autocompleted to suggest that scientists were either liars, liberal or stupid. I tried it out and received similar suggestions by Autocomplete:
I guess we scientists have been upgraded from merely being stupid to being idiots. I was curious whether other professions fare better.
Well, apparently bankers do not.
And doctors are not only as stupid as scientists, they are also overpaid, arrogant and dangerous.
I can understand that doctors are thought to be overpaid, but it is a bit of a surprise that folks on the web think that professors are overpaid, especially considering that many of them have spent a decade or more in postgraduate education before they become professors and still earn far less than non-academic colleagues in private industry.
Philosophers, on the other hand, are not perceived as being stupid by the Google Zeitgeist. They are wise and annoying with a tinge of depression.
The next time you contact your editors, please remember that they are people, too.
The fact that Autocomplete suggests these phrases means that they are frequently used in searches and web pages but there is no way to know who is using them and what the intent is behind their usage.
What does the Google Zeitgeist tell us about people of different nationalities?
Germans are not seen in a very positive light, but the prejudices regarding Germans being rude, cold and weird should not come as a surprise to anyone who watches Hollywood movies which love to propagate such clichés.
Interestingly, search queries suggest that both Americans and Germans may come across as weird and rude.
Maybe the web collective feels that members of all nationalities are weird and rude – even the Canadians, who are also known to be nice even though they are afraid of the dark.
When I queried the characteristics of Pakistanis with the "Pakistanis are…" phrase, I was surprised by the fact that Autocomplete offered very different suggestions than those for Germans and North Americans. The latter were described by adjectives such as rude, weird, nice or cold – but when it came to Pakistanis, the search queries instead focused on their ethnic identity.
Are Pakistanis white or not white? Are they mostly Indians or do they have Arab origins? The odd thing is that I have had conversations around these questions with many Pakistanis, who often try to convince me that they indeed have "white" roots. Some Pakistanis I know – especially those who are proud of their fair skin color – frequently mention their possible Greek origins, dating back to the conquests of Alexander the Great on the Indian subcontinent; others emphasize that the people who currently reside in Pakistan may descend from Arab forefathers who arrived when the Arabs invaded the Indian subcontinent. On the other hand, I also know plenty of Pakistanis who see themselves as people with a primarily Indian heritage. The fact that this is a hotly debated topic among Pakistanis suggests that the internet queries suggested by Autocomplete may in fact have been based on queries or web pages of Pakistanis who are interested in discussing this topic.
When it comes to Arabs, their ethnic identity is also apparently a popular topic in internet queries, and again my personal interactions with American Arabs mirror the Autocomplete suggestions. I have often heard American Arabs mention that they feel they ought to be accepted as part of the American "white" population ("Hello – I just received a phone call, Dr. Frantz Fanon is on hold for you on line 1").
I first thought that perhaps the desire to identify oneself with being "white" was a remnant of one's colonial past, but my search for "Nigerians are…" did not support this hypothesis.
The Web seems to hold extremely positive views of Nigerians – smart, intelligent and educated.
Moving beyond searches for nationalities, what characteristics do web users associate with members of other groups?
Well, religions do not fare well.
Christianity and Islam are seen as evil, full of falsehood and (oddly enough) may not even be religions.
In contrast, atheism is not labeled as evil. The suggested queries instead revolve around the question of whether or not atheism is a religion.
How about a cultural ideology?
Ok, Google Zeitgeist tells us that postmodernism is BS and dead.
The human emotion of Schadenfreude, on the other hand, is very much alive.
Autocomplete is not only a tool to identify biases and phrases used on the web; it has also become an inspiration for poets. The Google Poetics blog is run by Sampsa Nuotio and Raisa Omaheimo and collects Google poems, recognizing that Autocomplete suggestions sometimes contain a Dadaist beauty and are in essence prose poems. Inspired by their collection of Google poems, I sometimes enter words or verses from famous poems to generate Autocomplete's mutant versions of those famous verses:
Here is a Google Autocomplete poem based on "Do not go gentle into that good night" by Dylan Thomas:
Do not go
do not go where the path may lead
do not go gentle poem
do not go my love
Do not go beyond what is written
And here is one based on the line "Let us go then, you and I" from T.S. Eliot's "The Love Song of J. Alfred Prufrock":
let us entertain you
let us entertain you gift cards
let us play with your look
let us go then you and i
I would like to now close with a final ode to Google:
google is evil
google is god
google is your friend
google is down
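For readers who would like to assemble their own Google poems programmatically, here is a minimal sketch in Python. It relies on Google's unofficial suggest endpoint at `suggestqueries.google.com` – an undocumented interface that may change, be rate-limited, or disappear at any time – while the poem-assembly step itself is plain string formatting.

```python
import json
import urllib.parse
import urllib.request

# Unofficial, undocumented endpoint - response format may change without notice.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

def fetch_suggestions(phrase):
    """Fetch autocomplete suggestions for a phrase from Google's suggest endpoint.

    The response is a JSON array of the form [query, [suggestion, ...]].
    """
    url = SUGGEST_URL + urllib.parse.quote(phrase)
    with urllib.request.urlopen(url, timeout=5) as resp:
        data = json.load(resp)
    return data[1]

def google_poem(suggestions, max_lines=4):
    """Arrange up to max_lines autocomplete suggestions into a found poem."""
    return "\n".join(suggestions[:max_lines])

if __name__ == "__main__":
    # Seed the poem with the opening of a famous verse, as described above.
    print(google_poem(fetch_suggestions("do not go")))
```

The suggestions returned today will differ from the ones quoted in this post, of course – the Zeitgeist moves on.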
Monday, July 22, 2013
Three Seconds: Poems, Cubes and the Brain
by Jalees Rehman
A child drops a chocolate chip cookie on the floor, immediately picks it up, looks quizzically at a parental eye-witness and proceeds to munch on it after receiving an approving nod. This is one of the versions of the "three second rule", which suggests that food can be safely consumed if it has had less than three seconds contact with the floor. There is really no scientific basis for this legend, because noxious chemicals or microbial flora do not bide their time, counting "One one thousand, two one thousand, three one thousand,…" before they latch on to a chocolate chip cookie. Food will likely accumulate more bacteria the longer it is in contact with the floor, but I am not aware of any rigorous scientific study that has measured the impact of food-floor intercourse on a second-to-second basis and identified three seconds as a critical temporal threshold. Basketball connoisseurs occasionally argue about a very different version of the "three second rule", and the Urban Dictionary provides us with yet another set of definitions for the "three second rule", such as the time after which one loses a vacated seat in a public setting. I was not aware of any of these "three second rule" versions until I moved to the USA, but I had come across the elusive "three seconds" time interval in a rather different context when I worked at the Institute of Medical Psychology in Munich: Stimuli or signals that occur within an interval of up to three seconds are processed and integrated by our brain into a "subjective present".
I joined the Institute of Medical Psychology at the University of Munich as a research student in 1992 primarily because of my mentor Till Roenneberg. His intellect, charm and infectious enthusiasm were simply irresistible. I scrapped all my plans to work on HIV, cancer or cardiovascular disease and instead began researching the internal clock of marine algae in Till's laboratory – in an Institute of Medical Psychology. Within weeks of working at the institute, I realized how fortunate I was. Ernst Pöppel, one of Germany's leading neuroscientists and the director of the institute, had created a multidisciplinary research heaven. Ernst assembled a team of remarkably diverse researchers who studied neurobiology, psychology, linguistics, mathematics, philosophy, endocrinology, cell physiology, marine biology, computer science, ecology – all on the same floor. Since I left the institute nearly 20 years ago, I have worked in many academic departments at various institutions, each claiming to value multidisciplinary studies, but I have never again encountered any place that has been able to successfully integrate natural sciences, social sciences and the humanities in the same way as the Munich institute.
The central, unifying theme of the institute was time. Not physical time, but biological and psychological time. How does our brain perceive physical time? What is the structure of perceived time? What regulates biological oscillations in humans, animals and even algae? Can environmental cues modify temporal perception? The close proximity of so many disciplines made for fascinating coffee-break discussions, forcing us to re-evaluate our own research findings in the light of the discoveries made in neighboring labs and inspired us to become more creative in our experimental design.
Some of the most interesting discussions I remember revolved around the concept of the subjective present, i.e. the question of what it is that we perceive as the "now". Our brain continuously receives input from our senses, such as images we see, sounds we hear or sensations of touch. For our brain to process these stimuli appropriately, it creates a temporal structure so that it can tell apart preceding stimuli from subsequent stimuli. But the brain not only assigns a temporal order to the stimuli, it also integrates them and conveys to us a sense of the subjective past and the subjective present. We often use vague phrases such as "living in the moment" and we all have a sense of what is the "now", but we do not always realize what time intervals we are referring to. If we just saw an image or heard a musical note one second ago, physical time would clearly place them in "the past". Decades of research performed by Ernst Pöppel and his colleagues at the institute, as well as several other laboratories around the world, suggest that our brain integrates our subjective temporal reality in chunks of approximately three second duration.
Temporal order can be assessed in a rather straightforward experimental manner. Research subjects can be provided sequential auditory clicks, one to each ear. If the clicks are one second apart, nearly all participants can correctly identify whether or not the click in the right ear came before the one in the left ear. It turns out that this holds true even if the clicks are only 100 milliseconds (0.1 seconds) apart. The threshold for being able to correctly assign a temporal order to such brief stimuli lies around 30 milliseconds for young adults (up to 25 years old) and 60 milliseconds for older adults.
Temporal integration of stimuli, on the other hand, cannot be directly measured through experiments. It is not possible to ask research subjects "Are these two stimuli part of your now?" and expect a definitive answer, because everyone has a different concept and definition of what constitutes "now". Therefore, researchers such as Ernst Pöppel have had to resort to indirect assessments of temporal integration, and ascertain what interval of time is grasped as a perceptual unit by our brain. An excellent summary of the work can be found in the paper "A hierarchical model of temporal perception". Instead of reviewing the hundreds of experiments that have led researchers to derive the three-second interval, I will just review two studies which I believe are among the most interesting.
In one of the studies, Pöppel partnered up with the American poet Frederick Turner. Turner and Pöppel recorded and measured hundreds of Latin, Greek, English, Chinese, Japanese, French and German poems, analyzing the length of each LINE. They used the expression LINE to describe a "fundamental unit of metered poetry". In many cases, a standard verse or line in a poem did indeed fit the Turner-Pöppel definition of a LINE, but they used the more generic LINE for their analysis because not all languages or orthographic traditions write or print a LINE in a separate space as is common in English or German poems. If a long line in a poem was divided by a caesura into two sections, Turner and Pöppel considered this to be two LINES.
The basic idea behind this analysis was that each unit of a poem (LINE) conveys one integrated idea or thought, and that the reader experiences each LINE as a "now" moment while reading the poem. Turner and Pöppel published their results in the classic essay "The Neural Lyre: Poetic Meter, the Brain, and Time" for which they also received the Levinson Prize in 1983. Their findings were quite remarkable. The peak duration of LINES in poems was between 2.5 seconds and 3.5 seconds, independent of what language the poems were written in. For example, 73% of German poems had a LINE duration between 2 and 3 seconds. Here are some of their other specific findings:
Epic meter (a seven-syllable line followed by a five-syllable one) (average) 3.25 secs.
Waka (average) 2.75 secs.
Tanka (recited much faster than the epic, as 3 LINES of 5, 12, and 14 syllables) (average) 2.70 secs.
Four-syllable line 2.20 secs.
Five-syllable line 3.00 secs.
Seven-syllable line 3.80 secs.
Pentameter 3.30 secs.
Seven-syllable trochaic line 2.50 secs.
Stanzas using different line lengths 3.00 secs., 3.10 secs.
Ballad meter (octosyllabic) 2.40 secs.
Poets all around the world did not conspire to write three-second LINES. It is more likely that our brain is attuned to processing poetic information in three-second chunks and that poets are subconsciously aware of this. This was not a controlled, rigorous scientific study, but the results are nevertheless fascinating, not only because they point towards the three-second interval that neuroscientists have established in recent decades for temporal integration in the brain, but also because they suggest that the rules for metered poetry may be universal. I strongly advise everyone to read the now classic essay by Turner and Pöppel, and then to try reading their own favorite poems aloud and see if the LINES indeed approximate three seconds.
A second approach to gleaning insight into the inner workings of the temporal integration process in our brain is the use of perceptual reversal experiments, such as those performed with the Necker cube. This cube is a 2-D line drawing, which our brain perceives as a cube – or actually as two distinct cubes. Most people who stare at the drawing for a while will notice that their mind creates two distinct cube representations. Once the mind perceives the two different cubes, it becomes very difficult to cling to just one cube representation. Our brain starts flip-flopping between the two cubes, even when we try our best to hang on to just one of the cube representations in our mind. Interestingly, the average duration that it takes for our mind to automatically shift from one cube representation to the other approximates three seconds.
Nicole von Steinbüchel, a colleague of Ernst Pöppel at the Institute of Medical Psychology, asked a fascinating question. If the oscillatory perceptual shift between the two cube representations is indeed indicative of the "subjective present" and the temporal integration capacity, would brain injury affect the oscillation? She studied patients who had brain lesions (usually due to a stroke) in either the left or right hemisphere of the brain. She and her team of researchers were able to show that while healthy participants reported a three second interval between the automatic shifting of the cube representations in their brain, the average shift time was four seconds in patients with brain damage in the left brain hemisphere and up to six seconds if the damage had occurred in a certain part of the right brain hemisphere. Nicole von Steinbüchel's research demonstrates the clinical relevance of studying temporal integration, but it also suggests that the brain may have designated areas which specialize in creating a temporal structure.
The analysis of poetry and the Necker cube experiments are just two examples of cognitive studies indicating that our brain uses three-second intervals to process information and generate the experience of the "now" or the "subjective present". Taken alone, none of these studies is conclusive proof that our brain uses three-second intervals, but one cannot help but notice a remarkable convergence of data pointing towards a cognitive three second rule.
Frederick Turner and Ernst Pöppel (1983) "The Neural Lyre: Poetic Meter, the Brain, and Time" Poetry 142(5): 277-309. A reprint is also available online: http://www.cosmoetica.com/B22-FT2.htm
Ernst Pöppel (1997) "A hierarchical model of temporal perception" Trends in Cognitive Sciences 1(2): 56-61.
Nicole von Steinbüchel (1998) "Temporal ranges of central nervous processing: clinical evidence" Experimental Brain Research 123 (1-2): 220-233.
Monday, February 04, 2013
Ecology’s Image Problem
“There are Tories in science who regard imagination as a faculty to be avoided rather than employed. They observe its actions in weak vessels and are unduly impressed by its disasters” —John Tyndall, 1870
In his 1881 essay on Mental Imagery, Francis Galton noted that few Fellows of the Royal Society or members of the French Institute, when asked to do so, could imagine themselves sitting at the breakfast-table from which presumably they had only recently arisen. Members of the general public, women especially, fared much better, being able to conjure up vivid images of themselves enjoying their morning meal. From this Galton, an anthropologist, noted polymath, and eugenicist, concluded that learned men, bookish men, relying as they do on abstract thought, depend on mental images little, if at all.
In this rejection of a scientific role for the imagination, Galton was in disagreement with Irish physicist John Tyndall, who in an 1870 address to the British Association in Liverpool entitled The Scientific Use of the Imagination claimed that in explaining sensible phenomena, scientists habitually form mental images of that which is beyond the immediately sensible. "Newton’s passage from a falling apple to a falling moon”, Tyndall wrote, “was, at the outset, a leap of the prepared imagination.” The imagination, Tyndall claimed, is both the source of poetic genius and an instrument of discovery in science.
The role of the imagination in chemistry is well enough known. In 1890 the German Chemical Society celebrated the discovery by Friedrich August Kekulé von Stradonitz of the structure of benzene, a ring-shaped aromatic hydrocarbon. At this meeting Kekulé related that the structure of benzene came to him as a reverie of a snake seizing its own tail (the ancient symbol called the Ouroboros).
Since this is quite a celebrated case of the scientific use of the imagination I quote Kekule’s account of the events in full:
“During my stay in Ghent, Belgium, I occupied pleasant bachelor quarters in the main street. My study, however, was in a narrow alleyway and had during the day time no light. For a chemist who spends the hours of daylight in the laboratory this was no disadvantage. I was sitting there engaged in writing my text-book; but it wasn't going very well; my mind was on other things. I turned my chair toward the fireplace and sank into a doze. Again the atoms were flitting before my eyes. Smaller groups now kept modestly in the background. My mind's eye, sharpened by repeated visions of a similar sort, now distinguished larger structures of varying forms. Long rows frequently close together, all, in movement, winding and turning like serpents! And see! What was that? One of the serpents seized its own tail and the form whirled mockingly before my eyes. I came awake like a flash of lightning. This time also [he had had fruitful dreams before] I spent the remainder of the night working out the consequences of the hypothesis. If we learn to dream, gentlemen, then we shall perhaps find truth…” Berichte der deutschen chemischen Gesellschaft, 1890, 1305-1307 (in Libby 1922).
In supporting his argument about the positive role of the imagination John Tyndall quoted Sir Benjamin Brodie, the chemist, who wrote that the imagination (”that wondrous faculty”) when it is “properly controlled by experience and reflection, becomes the noblest attribute of man”. Brodie cautioned, however, that the imagination when “left to ramble uncontrolled, leads us astray into a wilderness of perplexities and errors…”
The philosopher Virgil Aldrich provided an interesting example of how imagination could be a hindrance to science. Sir Arthur Stanley Eddington, the English astrophysicist, referred frequently, according to Aldrich, to “the world outside us”. Consciousness, in contrast, can be described as being “inside of us.” Using such images Eddington was, said Aldrich, “under the spell of the telephone-exchange analogy.” Where the nerve endings leave off, the world beyond us takes over. If the telephone exchange image seems ill-chosen, the image, after all, could be worse. One might imagine inner consciousness as a submarine, and from our berth within it we come to know the outside world by means of a periscope! Now, Eddington did not use this image (others did), but when we try to make sense of it we can do so only by saying that inner consciousness is like a submarine only when one supposes that it is nothing at all like a submarine. One must “tone down the analogy” to make it useful. If you do otherwise “the lively imagination begins to protest”. Aldrich speculated that theorists persist with inept picture-making because, when toned down, the image often appears illuminating even when it is not. Moreover, a flashy image is entertaining. Thus one can easily make the “pleasant mistake” of identifying the image with the “real meaning” of an assertion.
A strength of environmental disciplines is that they bring into proximity bodies of knowledge that are often set apart. Though some quibble with him on this, historian of ecology Donald Worster places both Charles Darwin, the philosophical scientist, and Henry David Thoreau, the scientific philosopher, at the ground of ecology as a natural scientific discipline. And though it is fair to say that ecology has maintained an identity largely separate from the environmentalisms it has inspired, nevertheless ecology and environmentalisms have been good conversation partners. Both have listened to an admirable degree to their poets, artists and philosophers. A good thing this may be in many ways, but my contention here is that the environmental sciences and the practices associated with them — environmentalisms like sustainability — are prone to taking their most arresting images too literally. I wonder if there is not in environmental thought a pathology of the imagination. Too readily, it seems, we transform a provocative image into a proven hypothesis; we smuggle ancient and baffling worldviews into contemporary conceptions of nature.
I sketch a few examples here to illustrate the case. Perhaps you will have others to add.
Nature as an Organism
You are justified in calling Nature your Mother if you have a mother who wants you dead. A Mother who inculcated both your limitations and your accomplishments. Nature: A Mother who birthed a world equipped with tooth and nail and hungry eye; whose family tie is the ripping of flesh. Why, I wonder, are we quick to demand of God an explanation of evil but incline less to asking that question of Mother Nature?
To call Nature our mother is just one manifestation of the image of the Earth as organism. It is enduring, compelling and surely wrong-footing.
University of Wisconsin historian Frank N. Egerton traces the myth of cosmos as organism back to Plato. Timaeus asked “In the likeness of what animal did the Creator make the world?” He then speculated as follows: “For the Deity, intending to make this world like the fairest and most perfect of intelligible beings, framed one visible animal comprehending within itself all other animals of a kindred nature.” Because of Plato’s fateful influence on the history of western thought, Egerton noted that the implications of this myth have been enduring. According to Egerton the myth is the source of two related concepts “the supraorganismic balance-of-nature concept and the microcosm-macrocosm concept.” The supraorganismic concept views the cosmos as having the attributes of a living thing whereas the microcosm-macrocosm concept takes different parts of the universe to correspond with an organismal body.
Both flavors of the organismal concept get expressed in ecosystem ecology. Natural ecosystems, the influential University of Georgia ecologist Eugene Odum asserted, are integrated wholes, and developed in a manner that parallels the development of individual organisms or human societies. The development of the natural systems, ecological succession in other words, is orderly, predictable, and directional. It leads, in Odum’s view of things, to a stabilized ecosystem with predictable ratios of biomass, productivity, respiration and so forth. The “strategy” of ecosystem development, as Odum called it, corresponds to the “strategy” for long-term evolutionary development of the biosphere – “namely, increased control of, or homeostasis with, the physical environment in the sense of achieving maximum protection from its perturbations.” Homeostasis etymologically derives from the Greek for “standing still” and, in the sense that Odum meant to imply, indicates a dynamic and regulated stability. In other words, the stability of the organism.
Odum does not stand here accused of covertly importing the organismal image into his work; he was quite explicit about it. There is much to admire in Odum’s work and the ecology that he inspired, but the sense of design and purpose that it implied in nature (what philosophers call teleology) put Odum's ecosystem ecology at loggerheads with contemporary evolutionary theory which insists on the purposelessness of nature. It has taken quite some time to reconcile ecosystem thought with evolutionary theory.
Another example of the superorganism’s baleful influence can be found in the Gaia hypothesis. In his preface to Gaia: A New Look at Life on Earth (1979) Lovelock wrote:
“The concept of Mother Earth or, as the Greeks called her long ago, Gaia, has been widely held throughout history and has been the basis of a belief which still coexists with the great religions."
If the development of James Lovelock and Lynn Margulis’s Gaia hypothesis is anything to go by, hypotheses about the workings of nature derived from the organismal image of nature have a shelf life of a decade or so. Lovelock’s Gaia: A New Look at Life on Earth was published in 1979 and he rescinded the teleological claims of the Gaia hypothesis by 1988 in his book Ages of Gaia — or at least he became attentive to the problems that the superorganism concept created. He still maintains that the Earth’s atmosphere is homeostatically regulated, but he admitted to having been led astray by the sirens of the superorganism.
It is a banality of the ecological sciences to state that everything is connected. That ebullient Scot, and eventual stalwart of the American wilderness movement, John Muir, provided the image. He wrote, "When we try to pick out anything by itself, we find it hitched to everything else in the universe."
And if such statements are employed to sponsor a notion that individual organisms cannot be regarded in isolation from those that they consume and those that can consume them, or furthermore, that as a consequence of the deep intersections of the living and the never-alive there can be unforeseen consequences flowing from species additions or removals from ecosystems, then few may argue with this. However, just as the ripples of a stone dropped in a still pond propagate only as far as its edges (though they may entrain delightful patterns in the finest of its marginal sands), not every ecological event has intolerably large costs to exact. True, if the dominoes line up and the circumstances are just so, a butterfly’s wing beat over the Pacific may hurl a typhoon against its shores, but more often than not such lepidopterous catastrophes do not come to pass.
Ecosystems, energized so that matter cycles and conjoins the living with the dead, have their lines of demarcation, borders defined by their internal interactions being more powerful than their external ones. They are therefore buffered against many potentially contagious disasters. This, of course, is the essence of resilience - the capacity of a system to absorb disturbance without disruption to habitual structure and function. Ecology is as much the science investigating the limits of connections as it is the thought that everything is connected.
The Community Concept
Is there a greater 20th Century American environmental thinker than Aldo Leopold? Certainly there are few who provided as many genuinely poetic images: in the eyes of a dying wolf he saw “a fierce green fire”, he exhorted us to “think like a mountain”, he depicted the crane as “wilderness incarnate”. For all of that, has Leopold not led us astray with the images associated with his “ethical sequence”? Leopold’s influential land ethic “enlarges the boundaries of the community concept.” The ethical sequence that he proposed progresses stutteringly from free men, to women, to slaves, to animals, plants, rocks and land. It has a compelling lucidity. Leopold admitted, however, that it seems a little too simple. The ethic invites us into community with the land. A person’s self-image will change under a land ethic: “In short,” Leopold writes, “a land ethic changes the role of Homo sapiens from conqueror of the land-community to plain member and citizen of it.”
Now, Leopold is a subtle thinker and knows not to confuse the image with the thing. Certainly he expected this transformation to take quite some time. The land ethic would not emerge without “an internal change in our intellectual emphases, loyalties, affections, and convictions.” Now I have little problem with the image of extending the ethical circle other than noting that it makes the task seem easier than it has proven to be. My more serious objection concerns the rather thin notion of community that seems to be implied in Leopold’s image of the plain citizen. As environmental philosopher William Jordan III has illustrated in his book The Sunflower Forest (2003), missing from Leopold’s account is any acknowledgment of the negative elements of the human experience of community: envy, selfishness, fear, hatred, and shame. As Jordan pointed out, this leads Leopold and others to “a sentimental, moralizing philosophy that…insists on the naturalness of humans…but that neglects or downplays the radical difficulty of achieving such a sense of self, and also downplays the role of culture and cultural institutions in carrying out this work.” If Leopold’s image of the community and our place within it is an impoverished one, the work of extending the circle becomes impossible.
There are other images that we might have discussed here, ones that have had, at times at least, unfortunate implications for environmental thinking. For instance, in 1864 George Perkins Marsh wrote that mankind is disruptive, not just occasionally, mind you, but “is everywhere a disturbing agent.” One hundred years later the Wilderness Act renewed the image in its definition of wilderness as an area “untrammeled by man.” We might have considered contemporary accounts of social-ecological systems, in which these systems are posited as a compound substance even though, in depicting them, we tease the components apart again.
So, if environmental thought and ecological science have been susceptible to what my colleague and friend Professor David Wise of the University of Illinois at Chicago has called “malicious metaphors”, is there a more productive way to think about the role of the image in developing environmental thought?
The work of French philosopher Gaston Bachelard (1884–1962) — one of the more lovable of the French phenomenologists, certainly the hairiest — is helpful in sorting out a productive role for the imagination in science. He was renowned for his work on epistemological issues in science as well as for his phenomenological account of the poetic image and his philosophical meditation on reverie. As much as he was a materialist in his approach to science, he was subjective and personal (as a matter of theoretical orientation) in his philosophical work on the imagination.
Bachelard’s work at first glance is so inviting. Chapters in his book The Poetics of Space (1958) have enticing titles like The House from Cellar to Garret, Nests, Shells. Perhaps this is why the book is a philosophic bestseller. My copy claims “more than 80,000 copies sold”. And though indeed opening a Bachelard book is like relaxing into a warm bath, nevertheless there is an astringent in those waters. The thought is somewhat obscure as Bachelard ransacks the lexicon of the various disciplines he brings together in his work: Kantian philosophy, Husserlian phenomenology, Jungian psychoanalysis etc. Oftentimes his use of technical terms was novel; reinterpreting them, Bachelard pushed them into new service. Because of this density, I wonder how many of those 80,000 copies have languished on bookshelves. Mine certainly did until the past few weeks.
To enjoy the fruits of Bachelard’s insights we should do at least some of the work of appreciating how he produced them. In the hope that this will embolden you to return to your copy of The Poetics of Space, or other works by Bachelard on the imagination, or pick them up for the first time, I will give a summary, as best I understand it, of what his phenomenology of the image is all about. I am, I should tell you, strictly an amateur Bachelardian.
The poetic image is eruptive for both poet and reader. Bachelard says that for its creation “the flicker of the soul is all that is needed.” So, every great image is its own origin. Famously, Bachelard maintained that the imagination, contrary to the view of many philosophical accounts, is “the faculty of deforming images offered by perception.” The poetic image emerges into consciousness as a direct product of “the heart, soul and being of man.” Elsewhere Bachelard claims “the imagination [is] a major power of the human nature.”
The poetic image is therefore not caught up in a network of causalities. Our first recourse should not be to ask what archetypes an image represents, or what aspects of the poet’s psycho-biography explain it away. In this assertion Bachelard remains true to phenomenology’s maxim of going “back to the things themselves.” In as much as such things are possible, one approaches the poetic image freed from all presuppositions.
So it is of secondary importance to ask where an artistic image comes from; what matters more is to explore what opportunities for freedom an image creates. Instead of cause and effect, at the center point of which we traditionally ask the image to stand, we might rather speak of the “resonances and reverberations” of the image. This is not, I think, just some fanciful softening of language; it is a necessary acknowledgment of the way in which an image does not simply reflect a memory, but rather revives an absent one, and the way in which an image explodes into images. When we read the poetic image it resonates; when we communicate it, it reverberates. The repercussions of the image, said Bachelard, “invite us to give greater depth to our own existence.” What bearing does an image have on our freedom? A great piece of art, Bachelard says, “awakens images that have been effaced, at the same time that it confirms the unforeseeable nature of speech. And if we render speech unforeseeable, is this not an apprenticeship to freedom?”
I propose that Gaston Bachelard’s phenomenological account of the poetic image, despite its somewhat unpromising obscurity, is helpful in addressing environmental thought’s special porousness to striking images. In this short sketch I cannot fully substantiate the claim. I will end, however, with an example where an approach such as Bachelard’s seems to have been fruitful.
Tim Morton is one of the most widely read and exciting environmental writers of recent years. As far as I know, he has not cited Bachelard as a methodological inspiration, although his work is phenomenological and existential. [Added: One of Morton's earlier books, on the representation of the spice trade in Romantic literature, was entitled Poetics of Spice (2006) - making him, it would seem, an explicit Bachelardian after all!]. Morton is so concerned about the potential of sedimented ideas leading us into Sir Benjamin Brodie’s “wilderness of perplexities and errors” that he elected to drop the term “Nature” altogether. In his book Ecology Without Nature (2007) he explained the problem: “…the idea of nature is getting in the way of properly ecological forms of culture, philosophy, politics, and art.”
The results of Morton’s analysis lead us to strange, perplexing, though ultimately interesting places. Out of this natureless ecology comes a suite of insights on “dark ecology”, an ecology reminding us that we are always already implicated in the ecological. There is no outside from which we get a guilt-free view of the fantastic mess. Deriving also from an ecology developed without a sentimental view of nature comes a fresh analysis of connectedness. Morton revives Muir’s hitching image, but this time its resonances are weirder than the oceanic feeling that we are all blissfully in this together. His analysis gives us the queer bestiary of “strange strangers” with which we are stickily intimate, yet which we can never fully get to know. Morton develops this account in The Ecological Thought (2010), which I recommend to you. I am not supposing that this is an adequate summary of Morton’s recent books, but I think that Tim is converging on the idea of resonances and reverberations that Bachelard has written about.
The image, and the imagination, can play a positive role in environmental thinking. Darwin’s image of the “tangled bank” is both a pretty and a useful way of thinking about how organismal profusion developed from a common ancestor. But a misapplied image can be a disaster. Understanding our responsibilities with respect to the image is the work of the future; it is the work that will birth the future.
Walter Libby, “The Scientific Imagination,” The Scientific Monthly, Vol. 15, No. 3 (Sep. 1922), pp. 263-270.
Monday, December 10, 2012
There Was No Couch: On Mental Illness and Creativity
by Jalees Rehman
The psychiatrist held the door open for me and my first thought as I entered the room was “Where is the couch?”. Instead of the expected leather couch, I saw a patient lying down on a flat operating table surrounded by monitors, devices, electrodes, and a team of physicians and nurses. The psychiatrist had asked me if I wanted to join him during an “ECT” for a patient with severe depression. It was the first day of my psychiatry rotation at the VA (Veterans Affairs Medical Center) in San Diego, and as a German medical student I was not yet used to the acronymophilia of American physicians. I nodded without admitting that I had no clue what “ECT” stood for, hoping that it would become apparent once I sat down with the psychiatrist and the depressed patient.
I had big expectations for this clinical rotation. German medical schools allow students to perform their clinical rotations during their final year at academic medical centers overseas, and I had been fortunate enough to arrange for a psychiatry rotation in San Diego. The University of California (UCSD) and the VA in San Diego were known for their excellent psychiatry program and there was the added bonus of living in San Diego. Prior to this rotation in 1995, most of my exposure to psychiatry had taken the form of medical school lectures, theoretical textbook knowledge and rather limited exposure to actual psychiatric patients. This may have been part of the reason why I had a rather naïve and romanticized view of psychiatry. I thought that the mental anguish of psychiatric patients would foster their creativity and that they were somehow plunging from one existentialist crisis into another. I was hoping to engage in some witty repartee with the creative patients and that I would learn from their philosophical insights about the actual meaning of life. I imagined that interactions with psychiatric patients would be similar to those that I had seen in Woody Allen’s movies: a neurotic, but intelligent artist or author would be sitting on a leather couch and sharing his dreams and anxieties with his psychiatrist.
I quietly stood in a corner of the ECT room, eavesdropping on the conversations between the psychiatrist, the patient and the other physicians in the room. I gradually began to understand that “ECT” stood for “Electroconvulsive Therapy”. The patient had severe depression and had failed to respond to multiple antidepressant medications. He would now receive ECT, commonly known as electroshock therapy, a measure that was reserved for only very severe cases of refractory mental illness. After the patient was sedated, the psychiatrist initiated the electrical charge that induced a small seizure in the patient. I watched the arms and legs of the patient jerk and shake. Instead of participating in a Woody-Allen-style discussion with a patient, I had ended up in a scene reminiscent of “One Flew Over the Cuckoo's Nest”, a silent witness to a method that I thought was both antiquated and barbaric. The ECT procedure did not take very long, and we left the room to let the sedation wear off and give the patient some time to rest and recover. As I walked away from the room, I realized that my ridiculously glamorized image of mental illness was already beginning to fall apart on the first day of my rotation.
During the subsequent weeks, I received an eye-opening crash course in psychiatry. I became acquainted with DSM-IV, the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders, which was the sacred scripture of American psychiatry according to which mental illnesses were diagnosed and classified. I learned that ECT was reserved for the most severe cases, and that a typical patient was usually prescribed medications such as anti-psychotics, mood stabilizers or anti-depressants. I was surprised to see that psychoanalysis had gone out of fashion. Depictions of the USA in German popular culture and Hollywood movies had led me to believe that many, if not most, Americans had their own personal psychoanalysts. My psychiatry rotation at the VA took place in the mid-1990s, the boom time for psychoactive medications such as Prozac and the concomitant demise of psychoanalysis.
I found it exceedingly difficult to work with the DSM-IV and to appropriately diagnose patients. The two biggest obstacles I encountered were a) determining cause-effect relationships in mental illness and b) distinguishing between regular human emotions and true mental illness. The DSM-IV criteria for diagnosing a “Major Depressive Episode” included depressive symptoms such as sadness or guilt which were severe enough to “cause clinically significant distress or impairment in social, occupational, or other important areas of functioning”. I had seen a number of patients who were very sad and had lost their job, but I could not determine whether the sadness had impaired their “occupational functioning” or whether they had first lost their job and this had in turn caused profound sadness. Any determination of causality was based on the self-report of patients, and their memories of event sequences were highly subjective.
The distinction between “regular” human emotions and mental illness was another challenge for me, and the criteria in the DSM-IV manual seemed so broad that what I would have considered “sadness” was now being labeled as a Major Depression. A number of patients that I saw had severe mental illnesses such as depression, a condition so disabling that they could hardly eat, sleep or work. The patient who had undergone ECT on my first day belonged to that category. However, the majority of patients exhibited only some impairment in their sleep or eating patterns and experienced a degree of sadness or anxiety that I had seen in myself or my friends. I had considered transient episodes of anxiety or unhappiness as part of the spectrum of human emotional experience. The problem I saw with the patients in my psychiatry rotation was that they were not only being labeled with a diagnosis such as “Major Depression”, but were then prescribed antidepressant medications without any clear plan to ever take them off the medications. By coincidence, that year I met the forensic psychiatrist Ansar Haroun, who was also on faculty at UCSD and was able to help me with my concerns. Due to his extensive work in the court system and his rigorous analysis of mental states for legal proceedings, Haroun was an expert on causality in psychiatry as well as on the definition of what constitutes a truly pathological mental state.
Regarding the issue of causality, Haroun explained to me that the complexity of the mind and mental states makes it extremely difficult to clearly define cause and effect relationships in psychiatry. In infectious diseases, for example, specific bacteria can be identified by laboratory tests as causes of a fever. The fever normally does not precede the bacterial infection, nor does it cause the bacterial infection. The diagnosis of mental illnesses, on the other hand, rests on subjective assessments of patients and is further complicated by the fact that there are no clearly defined biological causes or even objective markers of most mental illnesses. Psychiatric diagnoses are therefore often based on patterns of symptoms and a presumed causality. If a patient exhibits symptoms of a depressed mood and has also lost his or her job during that same time period, psychiatrists have to determine whether the depression was the cause of losing the job or whether the job loss caused depressive symptoms. In my limited experience with psychiatry and the many discussions I have had with practicing psychiatrists, it appears that the leeway given to psychiatrists to assess cause-effect relationships may result in an over-diagnosis of mental illnesses or an over-estimation of their impact.
I also learnt from Haroun that the question of how to address the distinction between the spectrum of “regular” human emotions and actual mental illness had resulted in a very active debate in the field of psychiatry. Haroun directed me towards the writings of Tom Szasz, who was a brilliant psychiatrist but also a critic of psychiatry, repeatedly pointing out the limited scientific evidence for diagnoses of mental illness. Szasz’ book “The Myth of Mental Illness” was first published in 1960 and challenged the foundations of modern psychiatry. One of his core criticisms of psychiatry was that his colleagues had begun to over-diagnose mental illnesses by blurring the boundaries between everyday emotions and true diseases. Every dis-ease (discomfort) was being turned into a disease that required a therapy. The reasons for this overreach by psychiatry were manifold, ranging from society and the state trying to regulate what was acceptable or normal behavior to psychiatrists and pharmaceutical companies that would benefit financially from the over-diagnosis of mental illness. An excellent overview of his essays can be found in his book “The Medicalization of Everyday Life”. Even though Tom Szasz passed away earlier this year, psychiatrists and researchers are now increasingly voicing their concerns about the direction that modern psychiatry has taken. Allan Horwitz and Jerome Wakefield, for example, have recently published “The Loss of Sadness: How Psychiatry Transformed Normal Sorrow into Depressive Disorder” and “All We Have to Fear: Psychiatry's Transformation of Natural Anxieties into Mental Disorders”. Unlike Szasz, who even went as far as denying the existence of mental illness, Horwitz and Wakefield have taken a more nuanced approach. They accept the existence of true mental illnesses, admit that these illnesses can be disabling and acknowledge that patients who are afflicted by mental illnesses do require psychiatric treatment. However, Horwitz and Wakefield criticize the massive over-diagnosis of mental illness and point out the need to distinguish true mental illnesses from normal sadness and anxiety.
Before I started my psychiatry rotation in San Diego, I had been convinced that mental illness fostered creativity. I had never really studied the question in much detail, but there were constant references in popular culture, movies, books and TV shows to the creative minds of patients with mental illness. The supposed link between mental illness and creativity was so engrained in my mind that the word “psychotic” automatically evoked images of van Gogh’s paintings and other geniuses whose creative minds were fueled by the bizarreness of their thoughts. Once I began seeing psychiatric patients who truly suffered from severe disabling mental illnesses, it became very difficult for me to maintain this romanticized view of mental illness. People who truly suffered from severe depression had difficulties even getting out of bed, getting dressed and meeting their basic needs. It was difficult to envision someone suffering from such a disabling condition being able to write large volumes of poetry or to analyze the data from ground-breaking experiments. The brilliant book “Creativity and Madness: New Findings and Old Stereotypes” by Albert Rothenberg helped me understand that the supposed link between creativity and mental illness was primarily based on myths, anecdotes and a selection bias in which the creative accomplishments of patients with mental illness were glorified and attributed to the illness itself. Geniuses who suffered from schizophrenia or depression were not creative because of their mental illness but in spite of their mental illness.
I began to realize that the over-diagnosis of mental illness and the disregard for causality that had become characteristic of contemporary psychiatry also helped foster the myth that mental illness enhances creativity. Many beautiful pieces of literature or art can be inspired by emotional states such as the sadness of unrequited love or the death of a loved one. Creativity is often a response to a state of discomfort or dis-ease, an attempt to seek out comfort. However, if definitions of mental illness are broadened to the extent that nearly every such dis-ease is considered a disease, one can easily fall into the trap of believing that mental illness indeed begets creativity. With respect to establishing causality, Rothenberg found that, contrary to the prevailing myth, mental illness was actually a disabling condition that prevented creative minds from completing their artistic or scientific tasks. A few years ago, I came across “Poets on Prozac: Mental Illness, Treatment, and the Creative Process”, a collection of essays written by poets who suffer from mental illness. The personal accounts of most poets suggest that their mental illnesses did not help them write their poetry, but actually acted as major hindrances. It was only when their illness was adequately treated and they were in a state of remission that they were able to write poems. A recent comprehensive analysis of studies that attempt to link creativity and mental illness can be found in the excellent textbook “Explaining Creativity: The Science of Human Innovation” by Keith Sawyer, who concludes that there is no scientific evidence for the claim that mental illness promotes creativity. He also points to a possible origin of this myth:
The mental illness myth is based in cultural conceptions of creativity that date from the Romantic era, as a pure expression of inner inspiration, an isolated genius, unconstrained by reason and convention.
I assumed that the myth had finally been laid to rest, but, to my surprise, I came across the headline Creativity 'closely entwined with mental illness' on the BBC website in October 2012. The BBC story was referring to the large-scale Swedish study “Mental illness, suicide and creativity: 40-Year prospective total population study” by Simon Kyaga and his colleagues at the Karolinska Institute, published online in the Journal of Psychiatric Research. The BBC news report stated “Creativity is often part of a mental illness, with writers particularly susceptible, according to a study of more than a million people” and continued:
Lead researcher Dr Simon Kyaga said the findings suggested disorders should be viewed in a new light and that certain traits might be beneficial or desirable.
For example, the restrictive and intense interests of someone with autism and the manic drive of a person with bipolar disorder might provide the necessary focus and determination for genius and creativity.
Similarly, the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece.
These statements went against nearly all the recent scientific literature on the supposed link between creativity and mental illness and once again rehashed the tired, romanticized myth of the mentally ill genius. I was puzzled by these claims and decided to read the original paper. There was the additional benefit of learning more about the mental health of Swedes, because my wife is a Swedish-American. It never hurts to know more about the mental health or the creative potential of one’s spouse.
Kyaga’s study did not measure creativity itself, but merely assessed correlations between self-reported “creative professions” and the diagnoses of mental illness in the Swedish population. Creative professions included scientific professions (primarily scientists and university faculty members) as well as artistic professions such as visual artists, authors, dancers and musicians. The deeply flawed assumption of the study was that if an individual has a “creative profession”, he or she has a higher likelihood of being a creative person. Accountants were used as a “control”, implying that being an accountant does not involve much creativity. This may hold true for Sweden, but the creativity of accountants in the USA has been demonstrated by the recent plethora of financial scandals. The size of the Kyaga study was quite impressive, involving over one million patients and collecting data on the relatives of patients. The fact that Sweden has a total population of about 9.5 million and that more than one million of its adult citizens are registered in a national database as having at least one mental illness is both remarkable and worrisome.
The main outcome was the likelihood that patients with certain mental illnesses such as depression, schizophrenia or anxiety disorders were engaged in a “creative profession”. The results of the study directly contradicted the BBC hyperbole:
We found no positive association between psychopathology and overall creative professions except for bipolar disorder. Rather, individuals holding creative professions had a significantly reduced likelihood of being diagnosed with schizophrenia, schizoaffective disorder, unipolar depression, anxiety disorders, alcohol abuse, drug abuse, autism, ADHD, or of committing suicide.
Not only did the authors fail to find a positive correlation between creative professions and mental illnesses (with the exception of bipolar disorder), they actually found the opposite of what they had suspected: Patients with mental illnesses were less likely to engage in a creative profession.
Their findings do not come as a surprise to anyone who has been following the scientific literature on this topic. After all, the disabling features of mental illness make it very difficult to maintain a creative profession. Kyaga and colleagues also presented a contrived subgroup analysis, to test whether there was any group within the “creative professions” that showed a positive correlation with mental illness. It appears contrived because they broke down only the artistic professions and did not perform a similar analysis for the scientific professions. Among all these subgroup analyses, the researchers found a positive correlation between the self-reported profession ‘author’ and a number of mental illnesses. However, they also found that other artistic professions did not show such a positive correlation.
How the results of this study gave rise to the blatant misinterpretation reported by the BBC that “the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece” is a mystery in itself. It shows the power of the myth of the mad genius and how myths and convictions can tempt us to misinterpret data in a way that maintains the mythic narrative. The myth may also be an important component in the attempt to medicalize everyday emotions. The notion that mental illness fosters creativity could make the diagnosis more palatable. You may be mentally ill, but don’t worry, because it might inspire you to paint like van Gogh or write poems like Sylvia Plath.
A study of the prevalence of mental illness published in the Archives of General Psychiatry in 2005 estimated that roughly half of all Americans will have been diagnosed with a mental illness by the time they reach the age of 75. This estimate was based on the DSM-IV criteria for mental illness, but the newer DSM-V manual will be released in 2013 and is likely to further expand the diagnosis of mental illness. The DSM-IV criteria had made an allowance for bereavement, to avoid diagnosing people who were profoundly sad after the loss of a loved one with the mental illness depression. This bereavement exemption will likely be removed from the new DSM-V criteria, so that the diagnosis of major depression can be used even during the grieving period. The small group of patients who are afflicted with disabling mental illness do not find their suffering to be glamorous. There is a large number of patients who are experiencing normal sadness or anxiety and end up being inappropriately diagnosed with mental illness using broad and lax criteria of what constitutes an illness. Are these patients comforted by romanticized myths about mental illness? The continuing over-reach of psychiatry in its attempt to medicalize emotions, supported by the pharmaceutical industry that reaps large profits from this over-reach, should be of great concern to all of society. We need to wade through the fog of pseudoscience and myths to consider the difference between dis-ease and disease and the cost of medicalizing human emotions.
Image Credit: Wikimedia Commons Public Domain ECT machine (1960s) by Nasko and Self-Portrait of van Gogh.
Wednesday, August 31, 2011
I don't know what to say to you, neighbor,
as you shovel snow from your part of our street
neat in your Greek black. I've waited for
chance to find words; now, by chance, we meet.
We took our boys to the same kindergarten,
thirteen years ago when our husbands went.
Both boys hated school, dropped out feral, dropped in
to separate troubles. You shift snow fast, back bent,
but your boy killed himself, six days dead.
My boy washed your wall when the police were done.
He says, "We weren't friends?" and shakes his head,
"I told him it was great he had that gun,"
and shakes. I shake, close to you, close to you.
You have a path to clear, and so you do.
by Marie Ponsot
from Springing: New and Selected Poems