Wednesday, October 26, 2016
Trees are People and the People are Trees
And there in the crowded commons
three hundred striding people,
gesturing, eating the air,
halted around us, suddenly quiet.
They sprouted leaves and cones,
they wore strange bark for clothing,
and gently lifted their arms.
by John Haines
from The Owl in the Mask of the Dreamer
Graywolf Press, 1993
Ken Wessen in Plus Magazine:
As computers are constantly becoming faster and better, many computational problems that were previously out of reach have now become accessible. But is this trend going to continue forever, or are there problems computers will never, ever be able to solve? Let's start our consideration of this question by looking at how computer scientists measure and classify the complexity of computational algorithms.
Suppose you are responsible for retrieving files from a large filing system. If the files are all indexed and labelled with tabs, the task of retrieving any specific file is quite easy — given the required index, simply select the file with that index on its tab. Retrieving file 7, say, is no more difficult than retrieving file 77: a quick visual scan reveals the location of the file and one physical move delivers it into your hands. The total number of files doesn't make much difference. The process can be carried out in what is called constant time: the time it takes to complete it does not vary with the number of files there are in total. In computers, arrays and hash-tables are commonly used data structures that support this kind of constant time access.
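The filing-cabinet analogy can be sketched in a few lines of Python (an illustration added here, not from the article; the file labels and indices are invented). A dictionary lookup, like an array index, takes constant time on average: retrieving file 77 costs the same as retrieving file 7, no matter how many files the cabinet holds.

```python
# Constant-time retrieval: the cost does not grow with the number of files.
files = {i: f"file-{i}" for i in range(1, 101)}  # a labelled, indexed cabinet

def retrieve(index):
    """Fetch a file by its index tab in O(1) time (hash-table lookup)."""
    return files[index]

print(retrieve(7))   # file-7
print(retrieve(77))  # file-77: no harder than retrieving file 7
```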
Now suppose that over time the tabs have all fallen off the files. They are still indexed and in order, but you can no longer immediately spot the file you want. This introduces the requirement to search, and a particularly efficient way to do so is a binary search. This involves finding the middle file and seeing whether the file you need comes before or after. For example, when looking for file 77, pull out the middle file and see if its index is smaller or larger than 77, and then keep looking to the left or right of the middle file as appropriate.
With this single step you have effectively halved the size of the problem, and all you need to do is repeat the process on each appropriate subset of files until the required file is found. Since the search space is halved each step, dealing with twice as many files only requires one additional step.
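The repeated-halving procedure described above can be written out as a short sketch (added here for illustration; the sorted list of file indices is invented):

```python
def binary_search(files, target):
    """Find the position of `target` in a sorted list by repeated halving.

    Each comparison discards half of the remaining files, so the number
    of steps grows like log2(n): doubling n adds only one more step.
    """
    lo, hi = 0, len(files) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # pull out the middle file
        if files[mid] == target:
            return mid                # found it
        elif files[mid] < target:
            lo = mid + 1              # keep looking to the right
        else:
            hi = mid - 1              # keep looking to the left
    return -1                         # not in the cabinet

sorted_files = list(range(1, 101))    # files 1..100: tabs lost, but in order
print(binary_search(sorted_files, 77))  # 76 (the position of file 77)
```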
Writing N for the total number of files, it turns out that as N grows, the number of steps it takes to solve the problem (that is, the number of steps it takes to find your file) grows in proportion to the logarithm to base 2 of N (see the box below to find out why). We therefore say that a binary search is logarithmic, or, alternatively, that it has computational complexity O(log N). This is the so-called big O notation: the expression in the brackets after the O describes the type of growth you see in the number of steps needed to solve the problem as the problem size grows (see the box on the left for a formal definition).
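The logarithmic growth is easy to check numerically (an illustrative sketch added here, not from the article). The worst-case number of halving steps for N sorted files is floor(log2 N) + 1, so doubling N adds exactly one more step:

```python
import math

# Worst-case number of halving steps for a binary search over n files.
# Doubling n adds exactly one more step -- the hallmark of O(log n) growth.
for n in (100, 200, 400, 800):
    steps = math.floor(math.log2(n)) + 1
    print(f"{n:4d} files -> at most {steps} steps")
```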
A logarithmic time process is more computationally demanding than a constant time process, but still very efficient.
But what if over time the loss of the tabs has allowed the files to become disordered? If you now pull out file 50 you have no idea whether file 77 comes before or after it.
Grigori Guitchounts in Nautilus:
The animals of neuroscience research are an eclectic bunch, and for good reason. Different model organisms—like zebra fish larvae, C. elegans worms, fruit flies, and mice—give researchers the opportunity to answer specific questions. The first two, for example, have transparent bodies, which let scientists easily peer into their brains; the last two have eminently tweakable genomes, which allow scientists to isolate the effects of specific genes. For cognition studies, researchers have relied largely on primates and, more recently, rats, which I use in my own work. But the time is ripe for this exclusive club of research animals to accept a new, avian member: the corvid family.
Corvids, such as crows, ravens, and magpies, are among the most intelligent birds on the planet—the list of their cognitive achievements goes on and on—yet neuroscientists have not scrutinized their brains for one simple reason: They don’t have a neocortex. The obsession with the neocortex in neuroscience research is not unwarranted; what’s unwarranted is the notion that the neocortex alone is responsible for sophisticated cognition. Because birds lack this structure—the most recently evolved portion of the mammalian brain, crucial to human intelligence—neuroscientists have largely and unfortunately neglected the neural basis of corvid intelligence.
This makes them miss an opportunity for an important insight. Having diverged from mammals more than 300 million years ago, avian brains have had plenty of time to develop along remarkably different lines (instead of a cortex with its six layers of neatly arranged neurons, birds evolved groups of neurons densely packed into clusters called nuclei). So, any computational similarities between corvid and primate brains—which are so different neurally—would indicate the development of common solutions to shared evolutionary problems, like creating and storing memories, or learning from experience. If neuroscientists want to know how brains produce intelligence, looking solely at the neocortex won’t cut it; they must study how corvid brains achieve the same clever behaviors that we see in ourselves and other mammals.
Marcia Angell reviews Alison Gopnik's The Gardener and the Carpenter: What the New Science of Child Development Tells Us About the Relationship Between Parents and Children in the New York Review of Books:
The first sentence in Gopnik’s book is “Why be a parent?” Good question, but she answers it only abstractly, saying that having children “allows a new kind of human being to come into the world.” She does say that being a parent is profoundly satisfying, even if exhausting, but that tells us why you’re glad you did it, not why you did it. In thinking about the reasons in my own family, I realized that they have probably varied over the generations but have some things in common. One set of my grandparents (about 1880 to 1960), who farmed, fished, and built boats, had eleven children; the children provided much-needed labor, even when very young, and they were a source of pride, particularly for my grandfather (I think he saw them as proof of potency), not to mention a bid for family immortality. They were also a form of old-age and medical insurance.
My parents (about 1906 to 1990) lived a different life. They had only two children and we were of almost no use. My father worked in an office that might as well have been on the moon, and my mother was a housewife without much to do after we were of school age. I think they had children because it was expected of them, and besides, what else could my mother do? But they liked the idea of family (the reality, maybe not so much), and here, too, it offered security in old age and continuation of the dynasty, such as it was.
I am seventy-seven years old and, like Gopnik, the mother of grown children who have young children of their own, and also a woman with a postgraduate degree and a demanding profession. I knew the planet didn’t need more children, and there was now some safety net for old age and illness. So why did I have children? All I can say is that I wanted them very much, partly for the lifelong love and companionship of people whose character and values I had helped form. (Here Gopnik might accuse me of being something of a carpenter, and I may have been, but she is too, I suspect.) And like Gopnik, I am glad I had them.
Nevertheless, despite an unbroken chain of people choosing to have children, albeit for different reasons, we are now living at a time when fewer and fewer women are making that choice. The most recent data from the National Center for Health Statistics show that the fertility rate for American women ages fifteen to forty-four was 62.9 per thousand in 2014, the lowest ever recorded. In 1950 it was 106.2 per thousand, 70 percent higher. Moreover, according to Sophie Gilbert in her review in The Atlantic of a book edited by Meghan Daum, titled Selfish, Shallow, and Self-Absorbed (2015), which contains essays by writers who chose not to have children, 25 percent of women with college degrees never have children. Despite the new focus of celebrity magazines on celebrity babies, more young people seem to be finding sufficiently close and sustaining relationships with one another to forgo parenthood.
Anthony Gottlieb in Spiked:
In 2000, scholar, writer and then executive editor at The Economist Anthony Gottlieb received widespread acclaim for the first installment of his survey of Western philosophy, The Dream of Reason, which covered thought from the Greeks to the Renaissance. This year, its remarkable sequel, The Dream of Enlightenment, emerged. Focusing on that ‘150-year burst’ of intellectual energy that begins in Northern Europe after the Thirty Years War, and stretches up to the eve of the French Revolution, Gottlieb provides a profoundly illuminating portrait of an era in which the battles fought (and sometimes won) were to pave the way for the modern age. The spiked review caught up with Gottlieb to discuss toleration, freedom and the many misconceptions that have, at points, turned Enlightenment thinkers into caricatures of themselves.
review: What really comes through in The Dream… is the extent to which many Enlightenment thinkers were immersed in the natural sciences, in ‘mechanical philosophy’, practically and theoretically. Indeed, as The Dream… reveals, Descartes thought of himself principally as a mathematician and scientist, and Spinoza was famed for his microscopic technology. What’s striking, however, is that they were not only able to reconcile their religious faith with the natural sciences; they actually used natural sciences, the method of mechanical philosophy, to prove the existence of God…
Gottlieb: Yes, it was certainly common throughout the period to think that the more science shows you about nature, the more it showed the evidence of God. Isaac Newton (1643-1727) was very specific about this. He endorsed what we now call the argument of design, that is, the idea that there is evidence of design in nature. Newton thought that the further you looked into the workings of the natural world, the more you saw the evidence of God. And most Enlightenment thinkers, except for Hume and some after him, accepted that idea.
Kelly Servick in Science:
For people with knee joint injuries, the most promising source of new cartilage might be right up their noses. For the first time, doctors in Switzerland have grafted cartilage from the nose into the knees of patients with severe injuries to this connective tissue, the tearing of which can lead to pain and even osteoarthritis. Doctors now have limited means of repairing cartilage: They can graft or inject knee cartilage cells from a cadaver or a healthy part of the person’s own joint.
Or they can create tiny breaks in the underlying bone in the hopes of releasing progenitor cells that can restore the cartilage. But over the last decade, researchers have realized that cartilage cells from the nose are adept at forming new tissue that can hold up to the mechanical stress of the knee joint. And extracting those cells is much less invasive and damaging than digging around in someone’s knee. In a study published online today in The Lancet, researchers cut a flat chunk about the diameter of a pencil eraser out of the septum dividing participants’ nostrils, then broke down the tissue with enzymes and grew the cells on a porous membrane.
Tuesday, October 25, 2016
Last year marked the two-hundredth anniversary of the eruption of Indonesia’s Mount Tambora, among the largest volcanic eruptions in recorded history. This year marks the two-hundredth anniversary of Mary Shelley’s Frankenstein. Next year, 2017, will be the two-hundredth anniversary of Baron Karl Drais’s “running machine,” the precursor to the modern bicycle. Strange as it may seem, these three events are all intimately related; they’re all tied together by the great shift in climate that made 1816 the “year without a summer.”
Tambora, on the island of Sumbawa, Indonesia—then the Dutch East Indies—began its week-long eruption on April 5, 1815, though its impact would last years. Lava flows leveled the island, killing nearly all plant and animal life and reducing Tambora’s height by a third. It belched huge clouds of dust into the air, bringing almost total darkness to the surrounding area for days. The geologist Charles Lyell would reflect that “the darkness occasioned in the daytime by the ashes in Java was so profound, that nothing equal to it was ever witnessed in the darkest night.” According to Lyell, of the 12,000 residents of the province of Tambora, only twenty-six survived. Tens of thousands more were choked to their deaths by the thick black air and the falling dust, which blanketed the ground in piles more than a meter high.
Ten thousand dead – a conservative estimate at best. Three million internally displaced. Twenty million in need of aid. Two hundred thousand besieged for over a year. Thirty-four ballistic missiles fired into Saudi Arabia. More than 140 mourners killed in a double-tap strike on a funeral. These are just some of the numerical subscripts of the war in Yemen.
The British government would probably prefer to draw attention to the money being spent on aid in Yemen – £37m extra, according to figures released by the Department for International Development in September – rather than the £3.3bn worth of arms that the UK licensed for sale to Saudi Arabia in the first year of the kingdom’s bombing campaign against one of the poorest nations in the Middle East.
Yet, on the ground, the numbers are meaningless. What they do not show is how the conflict is tearing Yemeni society apart. Nor do they account for the deaths from disease and starvation caused by the hindering of food imports and medical supplies – siege tactics used by both sides – and for the appropriation of aid for financial gain.
The hesitation in the drive toward Mosul also has much to do with Iraq’s fractious politics. The three main forces advancing toward the city—the Iraqi army, the peshmerga, and the coalition of independent Shiite militias, some backed by Iran—are in conflict about their parts in the coming liberation. Nechirvan Barzani, the Kurdistan Regional Government prime minister, announced last summer that the peshmerga would play a “central role” in the liberation of Mosul, which has a minority Kurdish population. The top commanders of the Iraqi security forces, dominated by Shiites, insist that the Kurds stick to the outskirts of the city, which is itself largely Sunni—then withdraw as soon as the battle is over.
The Shiite militias, poised within striking distance of Mosul in parts of neighboring Kirkuk province, have also demanded that they participate in the Mosul operation. “They played a huge role in the liberation of areas [around Baghdad] and they are highly motivated,” a US military officer in Baghdad told me. But the prospect of armed Shiites sweeping through Mosul has alarmed many Sunnis, who recall the killings of Sunni civilians during the liberation of Fallujah and other parts of Anbar province last spring. Some Shiite militia leaders, meanwhile, say they will oppose any attempt by the peshmerga to march into Mosul. Kurdish leaders are also demanding a referendum on their own independence as soon as the Islamic State is driven out of the country. Al-Abadi has hedged on Kurdish independence, which is opposed by most of the Shiite majority. (The US government has repeatedly said it supports a united Iraq.)
Carmen Nobel at the website of Harvard Business School:
Soltes, who was doing an in-depth investigation on white-collar crime, had been interviewing Madoff every Wednesday evening for several months. Madoff, a renowned stockbroker turned fraudster, conducted the phone calls from FCI Butner, a medium-security federal correctional institution in North Carolina. At the time, he was serving the third year of a 150-year prison sentence for orchestrating the biggest Ponzi scheme in history.
Madoff’s phone-time allowance was limited, and he saved much of it for his conversations with Soltes. They conversed in 15-minute chunks, the maximum amount of uninterrupted call time that the prison would allow.
The professor and the felon shared a genuine, geeky interest in financial economics. Sometimes they discussed the early days of Madoff’s career, which began in 1960. Other times they chatted about new books, academic journal articles, or recent events in the news. But that evening Soltes led the conversation with a specific question: How would you explain your actions and misconduct to a group of students?
Ed Yong in The Atlantic:
Roses are red but violets aren’t blue. They’re mostly violet. The peacock begonia, however, is blue—and not just a boring matte shade, but a shiny metallic one. Its leaves are typically dark green in color, but if you look at them from the right angle, they take on a metallic blue sheen. “It’s like green silk, shot through with a deep royal blue,” says Heather Whitney from the University of Bristol.
And she thinks she knows why.
Similar metallic colours are common in nature—you can find them in the wings of many butterflies, the bibs of pigeons, the feathers of peacocks, and the shells of jewel beetles. These body parts get their color not from pigments but from microscopic structures that are found in evenly spaced layers. As light hits each layer, some gets reflected and the rest passes through. Because of the regular gaps between the layers, the reflected beams amplify each other to produce exceptionally strong colors—at least, from certain viewing angles. This is called iridescence.
Iridescence is less obvious among plants, but there are some stunning exceptions.
From Notes on Liberty:
Here is an illustration of these basic ideas. Today, one can buy shoes made by machine in South Korea or by hand in India. That is, modern mass production along rationalized lines, in the world, exists side by side with craft production fairly similar to all shoe production before 1750. The average line worker in a Korean shoe factory does not need to be very bright, and he can be satisfactorily trained in a month or so. By contrast, a traditional Indian shoe-maker is apprenticed for four to five years, or more.** He cannot be stupid and he needs patience, perseverance, and a superior ability to focus, among other personal traits. It's true that today's unskilled Korean worker probably has more formal education than the Indian shoe-maker. That's not because he needs it to do his job but because he lives in a rich society where formal education is a consumption item. It may also be to enable him to spend rationally. It may make him a better citizen. It's not required by his job beyond basic literacy, if that.
Brandon Keim in The Chronicle of Higher Education:
In recent years scientists have even found that insects possess evolutionarily ancient brain structures responsible for creating mental maps of one’s own place in space. Some researchers consider these structures foundational to human awareness; if they are, then insects, too, would appear to be conscious. Whatever it feels like to be a bee, it feels like something. What that something is, how instinct and awareness interact, how different forms of memory shape experience, how evolution’s convergences and divergences have shaped the development of cognition across time and circumstance — these are frontier questions now being asked. Science has come a long way from a reflexive adherence to C. Lloyd Morgan’s wariness of "higher psychical faculty," or the famed behaviorist B.F. Skinner’s insistence that other animals are "conscious in the sense of being under stimulus control" and experience pain with no more conscious resonance than "they see a light or hear a sound."
Other questions involve capacities like morality: Might its biological building blocks be widespread in the animal kingdom? Or what about motivation? After all, a human whose every physical need is provided for, but who doesn’t actually do anything except sit in a room, won’t be very happy. Beyond seeking pleasure, avoiding pain and procreating, what might an animal find fulfilling? "I don’t think I can understand that unless I try, with a whole lot of humility, to imagine what it would be like to be that animal," says Becca Franks, a cognitive psychologist at the University of British Columbia. "Then you take those insights to create an experimental, data-driven paradigm. That’s how science proceeds."
martha promise receives leadbelly, 1935
when your man comes home from prison,
when he comes back like the wound
and you are the stitch,
when he comes back with pennies in his pocket
and prayer fresh on his lips,
you got to wash him down first.
you got to have the wildweed and treebark boiled
and calmed, waiting for his skin like a shining baptism
back into what he was before gun barrels and bars
chewed their claim in his hide and spit him
stumbling backwards into screaming sunlight.
you got to scrub loose the jailtime fingersmears
from ashy skin, lather down the cuffmarks
from ankle and wrist, rinse solitary’s stench loose
from his hair, scrape curse and confession
from the welted and the smooth,
the hard and the soft,
the furrowed and the lax.
you got to hold tight that shadrach’s face
between your palms, take crease and lid
and lip and brow and rinse slow with river water,
and when he opens his eyes
you tell him calm and sure
how a woman birthed him
back whole again.
by Tyehimba Jess
Editorial in Nature:
In March 2011, this publication suggested that the US Congress seemed lost in the “intellectual wilderness”. The Republicans had taken over the House of Representatives, and one of the early acts of the chamber’s science committee was to approve legislation that denied the threat of climate change. As it turns out, this was just one tiny piece of a broader populist movement that was poised to transform the US political scene. Judging by the current presidential campaign, when it comes to reason, decency and use of evidence, much of the country’s political system seems to have lost its way. Is there anything left to say about the unsuitability of Donald Trump as a presidential candidate? Even senior figures of his own party have disowned him. The latest revelations about his sordid attitude and behaviour towards women only confirm what was obvious to many from the very beginning: Trump is a demagogue not fit for high office, or for the responsibilities that come with it.
Will the centre hold? Will the United States elect its first female president, Hillary Clinton? It should do. And not just because she is not Donald Trump. Clinton is a quintessential politician — and a good one at that. She has shown tremendous understanding of complex issues directly relevant to Nature’s readers, and has engaged with scientists and academics. Take health: as first lady, she led attempts to expand health care in the early years of her husband Bill Clinton’s presidency. She supported the Children’s Health Insurance Program, which reaches millions of poor children. She championed women’s rights, and as secretary of state made global health a priority through the Global Health Initiative, a framework to coordinate various US programmes. Clinton may not have the outsider appeal of a newcomer. But few politicians with her degree of experience and pragmatism do. She is arguably the best-qualified presidential candidate for two decades.
Monday, October 24, 2016
by Holly A. Case
Is there a relationship between politics and madness? The history of the legal strategy known as the insanity defense offers some clues. One thinker, the political philosopher Hannah Arendt, was so haunted by the moral confusion of the insanity defense as to wonder whether there is a way to tell right from wrong without reference to right and left.
Last month before a packed courtroom in Graz, Austria, a man stood trial for three counts of murder and 108 counts of attempted murder. The defendant, Alen R., appeared each day in a white suit. His face, like his last name, was obscured in the Austrian media, but the case was such a high-profile one—all seven days of the proceedings were broadcast live and it was front-page news in every one of the Austrian dailies—that Alen R. became something of an anti-celebrity.
On June 20, 2015, just after midday, Alen R. ran down pedestrians and cyclists with his SUV along a route stretching more than a mile through the city center of Graz. Witnesses estimated his maximum speed to be over 60 miles per hour. At one point he stopped to attack two people with a knife. Over the five-minute duration of his "mad driving spree," he killed three people and injured thirty-six, many of them seriously.
The focus of the trial came down to one question: "Is Alen R. so mentally ill that he can assume no responsibility for the apocalyptic drive in his SUV through the pedestrian zones of Graz?" At issue were the conflicting expert assessments of psychiatrists and a psychologist regarding the defendant's sanity: one had concluded that he was "of unsound mind" and should therefore be referred for psychiatric treatment rather than given a prison sentence, and another believed Alen R. to be very much "of sound mind" and said he should stand trial as an accused criminal. To break the tie, a third (German) psychiatrist was called in who diagnosed him with schizophrenia. In the end, the jury deferred to the testimony of a fourth expert, a psychologist, who declared Alen R. to be of sound enough mind to be criminally responsible for murder and attempted murder. He was given a life sentence (though it is not yet binding) along with a referral for incarceration in a facility for the criminally insane.
by Hari Balasubramanian
There are interests that lie dormant within us, waiting to take hold some day. If someone had said ten years back that I'd be into birds, I would have been skeptical. It's true that I always had a fondness for animals: in high school, I spent a lot of time following neighborhood stray dogs and watching cheetahs chase gazelles on National Geographic. After moving to Arizona for grad school, I sought out every opportunity to hike and visit the famous national parks of the American southwest. But despite all the time spent outdoors, birds had never intrigued me. I used to be puzzled, even amused, by people who showed up at a trail with binoculars.
For many, it's the sighting of a particular species, usually a rare or colorful one, that sparks an interest. In my case, it was a very common North American bird – the cardinal. This was in 2011. I'd been living in Amherst, Massachusetts for three years. I had heard of cardinals, mostly as the name of a football team, and had never spotted one.
But in March that year, I suddenly started seeing them: outside my apartment, during my walks in the woods around Amherst and while driving (they would often fly across the road). The crested bright red male was a thrill to watch. I felt privileged every time I saw one. Something was being revealed just to me! I asked others if they had seen any and would feel proud if their reply was negative. There was probably a simpler explanation of course. It snowed and rained a lot that year, and the population could have spiked for some ecological reason. Or the sight of the first made me look for more every day, with the result that I had simply begun to see what had always been there.
Whatever the reason, cardinals sparked a wider interest in birds and indeed all other species. It all seemed a tremendous mystery.
Otto Umbehr. Self Portrait at the Beach. c 1930.
by Richard King
The late Alexander Cockburn once suggested – mischievously, as was his wont – that the principal reason The New York Times published a "Corrections" column every morning was to convince its readers that everything else in the previous day's paper had been 100% true, morally as well as factually. In this way The Gray Lady maintained her reputation as America's premier clearing house for "All the News That's Fit to Print": by reminding the world that she, too, was ever-so-slightly fallible.
Observing the meltdown in the US media in the weeks since Donald Trump became the GOP's man, it is hard not to think of Cockburn's zinger. Faced with the prospect of a President Trump – now highly unlikely after the Access Hollywood controversy – the media has moved from shock to repentance: Grub Street is jumping with journalists eager to take their share of the blame for the elevation of the Orange One. Nor, I think, are they wrong to do so, though the terms in which the mea culpas are currently being offered in the press manage both to miss the point and to highlight the very attitudes for which they should be apologetic. I'll get to those a little later. Suffice it to say, for now, that the media's self-flagellation in this instance smells strongly of self-aggrandisement.
The self-flagellation was discernible even before Trump's nomination. The New York Times' Jim Rutenberg, for example, suggested as long ago as May that the media was failing in its duty to voters, so wide of the mark had its predictions been. But it is only in the last few weeks that the sound of hats being dutifully chewed has yielded decisively to the rustle of sackcloth. Nicholas Kristof, also writing in the Times, struck an especially masochistic note: "Those of us in the news media have sometimes blamed Donald Trump's rise on the Republican Party's toxic manipulation of racial resentments over the years. But we should also acknowledge another force that empowered Trump: Us."
by Emrys Westacott
Donald Trump epitomizes extravagance. Not the imprudently living beyond one's means sort of extravagance criticized by Ben Franklin, but the kind that spares no expense in the quest to gratify one's desires and impress people.
Gold-gilded towers, marbled mansions, emblazoned private jets: all of them scream out, "Look how f____ing rich I am!"
There is a paradox here. You'd think that Trump flaunting his wealth so unabashedly would turn off the majority of voters. You'd expect it to especially turn off the ones that the polls say make up his base–men without a college education who feel they are losing out in a changing world. After all, most people aren't rich. That's why politicians like to present themselves as commoners: so that voters can identify with them. Even those who ate baby food off silver spoons will typically tell stories about some parent or grandparent who was dirt poor and worked their way up.
There is also a deep strain in American culture that has always been highly critical of luxury, extravagance, boastfulness and pride. These are, after all, the opposite of: simplicity, frugality, modesty and humility–the traditional Christian values taught by Jesus, practiced by the Puritans, and associated with the rural homestead.
Furthermore, a preference for frugal simplicity and related values is not just a Puritan prejudice. It's supported by a rich philosophical tradition, from Socrates and Epicurus in ancient times to Thoreau and Wendell Berry more recently. Like our religious heritage, this tradition has left its mark on our thinking. According to these sages, frugal simplicity is the path to both virtue and happiness.
Now some might argue that these traditional values are out of fashion. But that's not entirely true. Simplicity is still respected. When the current pope was chosen in 2013, his simple lifestyle was hailed on all sides as a sure sign of his moral integrity. Warren Buffett, "the sage of Omaha," has a reputation for wisdom that is decidedly enhanced by his choosing to live in the same unexceptional house that he bought in 1958.
So how is Trump able to turn not just his wealth but his showy, extravagant lifestyle into political capital?
by Brooks Riley
by Humera Afridi
On a recent weekend morning, I spoke with eminent writer and intellectual Gündüz Vassaf at his home on the island of Sedef in Turkey. I was calling from Manhattan, New York, via Skype, and the distances of space and time between us collapsed to make way for a conversation that felt like a natural continuation of a felicitous meeting earlier in the summer.
Vassaf, the author of 14 critically acclaimed books of nonfiction, fiction, essays and poetry, had just returned from a brisk swim in the Sea of Marmara. It was a chilly 20 degrees Celsius on the island, the sun suspended low in the late October sky, but that did not deter him. I sense there is not much that can restrain Vassaf from following his heart. His is a quest for freedom—in work, in life, in mind, in body—a right that he asserts not just for himself but, judiciously, for all sentient beings, and does so with a rare ebullience, one balanced with wisdom.
In his 1987 bestseller, Prisoners of Ourselves, Vassaf writes:
"This book is about freedom. It's about freedom we avoid, freedom that we fear to have in our everyday lives. Even with our simple daily acts we subject ourselves to a totalitarian order of our creation and subservience.
My first idea was to write a book about our accommodation of totalitarian regimes. Throughout history, millions across the world have experienced changes in regimes from a relatively democratic state to a totalitarian order.
In the end and over time, we acquiesce to these regimes. We internalize the new norms. The very few who don't, become martyrs, unknown patients in mental hospitals, forgotten prisoners of conscience.
I did not write a book about the above because I realized that also in "democratic" regimes we can become prisoners of ourselves."
Prisoners of Ourselves explores the psychology of totalitarianism in everyday life and is a profound elucidation of human consciousness. It sold over 70,000 copies when it was published and quickly rose to the stature of a contemporary classic in Turkey. Vassaf has written many other works between this astute and marvelously prescient book of lyrical essays—one which I find illuminates the present historical moment—and his most recent, What Can I Do?, which was released, serendipitously, a week after the failed coup attempt in Turkey.
by Carl Pierer
Much has been written about Zeffirelli's adaptation of Romeo & Juliet, in particular its focus on the themes of youth and beauty. A neat narrative lends itself to explaining the film's popularity and immense success: Zeffirelli catered to a teen audience (choosing unknown, very young lead actors, exploring themes of sexuality) at a time when precisely this teen audience was preoccupied with similar explorations. The film was released in 1968; need more be said? Today, it seems, Zeffirelli's once progressive interpretation has become canonical. The film, to a modern audience, seems a trifle antiquated: the romance, the costumes, the operatic acting all add to its heaviness. Yet, beneath this striking opulence, the film is a subtle and skillful interpretation of Shakespeare's text. It is a nuanced study of the consequences of patriarchal structures based on a phallic conception of masculinity, one that has lost none of its relevance. Moreover, it treats women as agents by painting them as complicit supporters of the patriarchal hierarchy.
In a brilliant essay, Peter Donaldson has explored some alternative themes dominating Zeffirelli's adaptation. One of them is its treatment of the homoerotic undercurrents in Shakespeare's text. Donaldson reads Zeffirelli's film as visually underscoring Shakespeare's social criticism of the patriarchal structures which form the social context of the play. The feud, which produces the tragedy, is understood, on this reading, as a symptom of a much deeper illness: "misogyny and its corollary, male fear of intimacy with other men." (Donaldson, p. 153)