Tuesday, February 02, 2016
It is estimated that between 1890 and 1925, an African American was lynched every two and a half days. The academic and intellectual community was no different from the bulk of mainstream America. Peoples of African descent were conspicuously absent from any scholarship or intellectual discourse that dealt with human civilization. African Americans were so dehumanized and their history so distorted in academia that slavery, peonage, segregation, and lynching were considered justifiable conditions. Under Woodson's direction and with contributions from other African American and white scholars, Negro History Week was launched on a serious platform in 1926 to neutralize the apparent ignorance and deliberate distortion of Black history. The theme of Black History Month 2016 is "Hallowed Grounds: Sites of African American Memories."
February was selected by a man named Carter Godwin Woodson, a noted historian and publisher who was a pioneer in American Black history. He selected February for several reasons, chief among them that the month holds enormous significance in Black American history. First, it celebrates two historical figures who had a great impact on the Black population: Abraham Lincoln and Frederick Douglass. Other noteworthy persons and events that make February significant include: W. E. B. Du Bois, a civil rights leader and co-founder of the N.A.A.C.P., was born on February 23, 1868. The 15th Amendment to the United States Constitution, which gave Blacks the right to vote, was ratified on February 3, 1870. The first Black senator, Hiram R. Revels, took office on February 25, 1870. The N.A.A.C.P. (National Association for the Advancement of Colored People) was founded in New York City on February 12, 1909. And Malcolm X, the militant leader who promoted Black Nationalism, was shot and killed by Black Muslims on February 21, 1965.
More here. (Note: At least one post will be dedicated to honoring Black History Month throughout February.)
Jan Hoffman in The New York Times:
One evening in the late fall, Lucien Majors, 84, sat at his kitchen table, his wife, Jan, by his side, as he described a recent dream. Mr. Majors had end-stage bladder cancer and was in renal failure. As he spoke with a doctor from Hospice Buffalo, he was alert but faltering. In the dream, he said, he was in his car with his great pal, Carmen. His three sons, teenagers, were in the back seat, joking around. “We’re driving down Clinton Street,” said Mr. Majors, his watery, pale blue eyes widening with delight at the thought of the road trip. “We were looking for the Grand Canyon.” And then they saw it. “We talked about how amazing, because there it was — all this time, the Grand Canyon was just at the end of Clinton Street!” Mr. Majors had not spoken with Carmen in more than 20 years. His sons are in their late 50s and early 60s. “Why do you think your boys were in the car?” asked Dr. Christopher W. Kerr, a Hospice Buffalo palliative care physician who researches the therapeutic role of patients’ end-of-life dreams and visions. “My sons are the greatest accomplishment of my life,” Mr. Majors said. He died three weeks later.
For thousands of years, the dreams and visions of the dying have captivated cultures, which imbued them with sacred import. Anthropologists, theologians and sociologists have studied these so-called deathbed phenomena. They appear in medieval writings and Renaissance paintings, in Shakespearean works and set pieces from 19th-century American and British novels, particularly by Dickens. One of the most famous moments in film is the mysterious deathbed murmur in “Citizen Kane”: “Rosebud!” Even the law reveres a dying person’s final words, allowing them to be admitted as evidence in an unusual exception to hearsay rules.
Barbed Wire

Consists of two tight-twisted, separate strands
Conjoined as one: and not unlike, in fact,
Our own familiar silver wedding bands,
Though these are loosely woven, inexact,
With wide interstices, so that each makes
A circle of ellipses. Tightly caught
At random intervals, two little snakes
Of wire are crimped into a snaggled knot,
That four short ends, sharp bevel-cut, present
Unsheathed, ingenious fangs. And when in place,
Stretched taut, or strewn in loose coils, may prevent
The passage through some designated space
Of beast, or man. You got used to the stench;
The mud was worse than being under fire,
My father said. A detail left the trench
At night, to get the dead back from the wire,
And no one volunteered. They stood, to view
Our brief exchange of rings and vows, for both
Our fathers had survived that war: and knew
Of death, and bright entanglement, and troth.
by Richard Outram
from University of Toronto Libraries
J. M. Tyree in Guernica:
There’s a man on the bus sitting directly in front of you. He has a small brown spider crawling across his red shirt, near his left shoulder blade.
You say nothing, but watch it with fascination until he rings the bell and exits at his stop.
After he leaves, the woman sitting next to you says, “Did you see that?”
“What?” you say.
The man with the spider on his back turns around because you’ve tapped him on the arm.
“There’s a spider on your back,” you say.
“Que?” he says, looking pissed off.
“A spider,” you say. “Como se dice ‘spider’…uh, mira, puedo que…que...could I just brush it off your shirt?”
He shakes his head disgustedly and turns away.
“There’s a spider on your shirt,” you say. “Could I brush it off?”
“Please don’t,” the man says. “My cousin’s soul has been trapped inside that spider for eleven years. One more year to go!”
Nancy Fliesler in the Harvard Gazette:
Researchers at Harvard-affiliated Boston Children’s Hospital have, for the first time, visualized the origins of cancer from the first affected cell and watched its spread in a live animal. Their work, published in the Jan. 29 issue of Science, could change the way scientists understand melanoma and other cancers and lead to new, early treatments before the cancer has taken hold.
“An important mystery has been why some cells in the body already have mutations seen in cancer, but do not yet fully behave like the cancer,” says the paper’s first author, Charles Kaufman, a postdoctoral fellow in the Zon Laboratory at Boston Children’s Hospital. “We found that the beginning of cancer occurs after activation of an oncogene or loss of a tumor suppressor, and involves a change that takes a single cell back to a stem cell state.”
That change, Kaufman and colleagues found, involves a set of genes that could be targeted to stop cancer from ever starting.
The study imaged live zebrafish over time to track the development of melanoma. All the fish had the human cancer mutation BRAFV600E — found in most benign moles — and had also lost the tumor suppressor gene p53.
Ban Ki-moon in the New York Times:
In Israel and the occupied Palestinian territories, 2016 has begun much as 2015 ended — with unacceptable levels of violence and a polarized public discourse. That polarization showed itself in the halls of the United Nations last week when I pointed out a simple truth: History proves that people will always resist occupation.
Some sought to shoot the messenger — twisting my words into a misguided justification for violence. The stabbings, vehicle rammings and other attacks by Palestinians targeting Israeli civilians are reprehensible. So, too, are the incitement of violence and the glorification of killers.
Nothing excuses terrorism. I condemn it categorically.
It is inconceivable, though, that security measures alone will stop the violence. As I warned the Security Council last week, Palestinian frustration and grievances are growing under the weight of nearly a half-century of occupation. Ignoring this won’t make it disappear. No one can deny that the everyday reality of occupation provokes anger and despair, which are major drivers of violence and extremism and undermine any hope of a negotiated two-state solution.
Israeli settlements keep expanding. The government has approved plans for over 150 new homes in illegal settlements in the occupied West Bank. Last month, 370 acres in the West Bank were declared “state land,” a status that typically leads to exclusive Israeli settler use.
At the same time, thousands of Palestinian homes in the West Bank risk demolition because of obstacles that may be legal on paper but are discriminatory in practice.
Monday, February 01, 2016
by Dwight Furrow
For much of the 20th century, the U.S. was a culinary backwater. Outside some immigrant enclaves where old-world traditions were preserved, Americans thought of food as nutrition and fuel. Food was to be cheap, nutritious (according to the standards of the day) and, above all, convenient; the pleasures of food, if attended to at all, were a minor domestic treat unworthy of much public discussion.
How times have changed! Today, celebrity chefs strut across the stage like rock stars, a whole TV network is devoted to explaining the intricacies of fermentation or how to butcher a hog, countless blogs recount last night's meal in excruciating detail, and competitions for culinary capo make the evening news. We talk endlessly about the pleasures of food, conversations that are supported by specialty food shops, artisan producers, and aisles of fresh, organic produce in the supermarket. Restaurants, even small neighborhood establishments, feature chefs who cook with creativity and panache.
Why this sudden interest in food? As I argue in American Foodie: Taste, Art and the Cultural Revolution, our current interest in food is a search for authenticity, face-to-face contact, local control, and personal creativity amidst a world that is increasingly standardized, bureaucratic, digitized, and impersonal. In contemporary life, the public world of work, with its incessant demands for efficiency and profit, has colonized our private lives. The pressures of a competitive, unstable labor market, the so-called "gig" economy, along with intrusive communications technology make it increasingly difficult to escape a work world governed by the value of efficiency. This relentless acceleration of demands compresses our sense of time so we feel like there is never enough of it. Standardization destroys the uniqueness of localities and our social lives are spread across the globe in superficial networks of "contacts" where we interact with brands instead of whole persons. The idea that something besides production and consumption should occupy our attention, such as a sense of community or self-examination, seems quaint and inefficient—a waste of time. Thus, we lose touch with ourselves while internalizing the self-as-commodity theme and hiving off all aspects of our lives that might harm our "brand"—a homogenized, marketable self. Even our vaunted and precious capacity to choose is endangered, for we no longer choose based on a sensibility shaped by our unique experiences; instead our sensibilities are constructed by corporate choice architects, informed by their surveys and data mining, who shepherd our decisions.
by Jonathan Kujawa
I came dangerously close to not becoming a mathematician. Like many people, I found that my experiences with math in school left me irritated and bored. I have a poor memory and I'm not a detail-oriented person. The arbitrary rules to be memorized and the fiddly and unforgiving nature of calculations made each homework assignment a minefield of point-losing opportunities. And the problems! To "motivate" us with "applications," the problems were meant to be real-world, yet always seemed to involve the patently ridiculous: rectangular pastures, conical barns, spherical cows. I don't know how anyone can refer to such obviously contrived problems as "real-world" with a straight face.
Or, worse, problems were completely devoid of any motivation whatsoever. I have a strong memory of having to learn how to multiply matrices together. The rules were clearly designed to maximize the number of calculations required and, hence, the chances of making a mistake. I can't imagine who thought this was a good topic for fifteen-year-olds. Not a word was said about why we should learn such a thing, or why anyone, anywhere should care. Oh, to have known something about how matrices are used in geometry and computer graphics, or to store and manipulate data, or to compute probabilities in Markov processes. Heck, just pointing out that it is an example of a "multiplication" where AB and BA are not equal would have been a great start!
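To make that last point concrete, here is a minimal example of non-commuting matrices (my illustration, not from the essay):

$$
A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \quad
B = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \qquad
AB = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}, \quad
BA = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}.
$$

Since AB ≠ BA, matrix multiplication is a genuinely different kind of "multiplication" from the one students already know for ordinary numbers, where order never matters.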
Of course my experience is the rule, not the exception. Paul Lockhart wrote a fantastic essay in 2002 entitled "A Mathematician's Lament" which captures the situation perfectly. It's like requiring everyone to learn to read music without ever letting them hear a tune, saying only that it will be needed in some unspecified way as a working adult. Or like teaching reading using only tax forms and TV repair manuals. Everyone with an interest in math or education should read it. You can read it here. As Lockhart writes,
...if I had to design a mechanism for the express purpose of destroying a child’s natural curiosity and love of pattern-making, I couldn’t possibly do as good a job as is currently being done— I simply wouldn’t have the imagination to come up with the kind of senseless, soul-crushing ideas that constitute contemporary mathematics education.
So how was my soul saved?
Stephen Gill. Outside In Explosion.
by Scott F. Aikin and Robert B. Talisse
When it comes to political questions, reasonable people disagree. Reasonable disagreement persists also in philosophy, religion, and a broad array of interpersonal matters. That's life. And, indeed, we must live; we must make decisions, set plans, and adopt policies that affect, interest, and impact others. Our decisions have drawbacks, actions have consequences, and plans impose costs on others. We cannot always just go our own way; we have to consult others in trying to figure out how to go on. Hence disagreements arise.
Any view important enough to stimulate disagreement is a view that will look to some reasonable others as prohibitively costly, suboptimal, incorrect, or foolhardy. Thus assessing the drawbacks of one's view is where the argument concerning its overall merit begins, not where it ends. Thoughtful people are aware that their views will strike some reasonable others as manifestly rejectable, and consequently, thoughtful people take reasonable criticism not always as an attack on their proposals, but rather as an occasion for thinking and saying more about them. In some instances, the case can be made that the drawbacks of one's view must be borne (because, perhaps, the viable alternatives are yet even worse); in other cases, it might be arguable that the costs of adopting one's view are merely apparent or on the whole insignificant. The point is that it's plainly insipid to proceed as if the fact that an opponent's view is imperfect were a decisive reason to reject it. Showing that an interlocutor's proposal is thoroughly criticizable is never the end of the matter. What must also be shown is that the interlocutor's criticizable proposal is inferior to the other (criticizable) proposals worth considering. And that comparative task requires us to allow our interlocutors to respond to our criticisms.
The trouble is that so much popular political debate seems to presuppose that the only political view worth accepting would be one that could not be reasonably criticized.
by Jalees Rehman
Nearly half a million applications for asylum submitted by refugees were processed by German authorities in 2015, according to the German Federal Office for Refugees and Migration. The number of people who were officially registered in Germany as potential asylum seekers was even far higher – roughly one million in 2015 – which suggests that Germany anticipates an even higher number of official asylum applications for 2016. Chancellor Angela Merkel has defied many critics even in her own party and cabinet by emphasizing that Germany can and will take on more refugees, most of whom are coming from war-torn countries such as Syria, Iraq and Afghanistan. "We can do it!" ("Wir schaffen das!") was the phrase she used in September of 2015 to convey her optimism and determination in the face of ever-growing numbers of refugees and the gradual rise of support for far right extremist demonstrations and violent attacks by far right extremists on refugee centers in Germany.
The German media and right-wing populists are currently obsessing over statistics such as polls suggesting that the far right and libertarian party AfD (Alternative für Deutschland – Alternative for Germany) would garner 10% of the popular vote, or the fact that the vast majority of the refugees are male, which could lead to a demographic gender shift if they remain in Germany. While such statistics serve as an important barometer of the political climate in the German electorate and help prepare for the challenges faced by the refugees and German society in the next years, they do not address the fundamental philosophical questions raised by this refugee crisis. In the latest issue of the popular German philosophy periodical "Philosophie Magazin", the editors asked philosophers and other academic scholars to weigh in on some of the key issues and challenges in the face of this crisis.
Should we be motivated by a sense of global responsibility when we are confronted with the terrible suffering experienced by refugees whose homes have been destroyed? The sociologist Hartmut Rosa at the University of Jena responds to this question by suggesting that we should focus on Verbundenheit ("connectedness") instead of Verantwortung ("responsibility"). Demanding that those of us who lead privileged lives of safety and reasonable material comfort should feel individually responsible for the suffering of others can lead to a sense of moral exhaustion. Are we responsible for the suffering of millions of people in Syria and East Africa? Are we responsible for the extinction of species as a consequence of climate change? Instead of atomizing – and thus perhaps even rendering irrelevant – the abstract concept of individual responsibility, we should become aware of how we are all connected.
by Brooks Riley
by Tamuira Reid
I stopped caring about him sometime between January and May, when the weather changed and the leaves came back. He went on that big white pill and couldn't have aged cheese or avocado and I sat at the table in the kitchen, watching him watch me.
The yelling wouldn't stop until he'd had enough, when his eyes no longer felt right in his head and he'd rather lay down than stand there, fist in mouth, cat rubbing against both legs.
He once told me that depression comes in waves but that makes it sound too beautiful. There was nothing good about the bad.
Sometimes we'd try to fight it before it hit. I'd take a shower. He'd shave his face. Vacuum the hallway rug. But it never worked and the top would blow off and it would be hands to throats again, just like that.
Teacups shook in their skin, books fell over on themselves and I wanted to see how it would all play out. Does he get the girl in the end? Or does she leave during a quiet moment, smiling as she turns away. His hand pressed against her like an ear.
"No sooner does man discover intelligence
than he tries to involve it in his own stupidity."
~ Jacques-Yves Cousteau
Over the course of my last few posts I have been groping towards some kind of meeting point between, on the one hand, the current wave of information technologies, as represented by artificial intelligence (AI), social media and robotics; and on the other, what might be termed, for the sake of brevity, the social condition. The thought experiment is hardly virtual, and is in fact unfolding before us in real time, but as I have been considering the issues at stake, there are significant blind spots that will demand elaboration by many commentators in the years and decades to come. Assuming that, as Marc Andreessen put it, software (and the physical objects in which it is increasingly becoming embodied) will continue to "eat the world", how can we expect these technological goods to be distributed across society?
It's actually kind of difficult to envision this even being a problem in the first place. It's true that, up until the first years of this century, there was some discussion of the so-called 'digital divide', where certain segments of the population would not be able to get onto the 'Internet superhighway' (another term that has fallen into disuse, perhaps because it feels like we never get out of our cars anymore). These were the segments of society that were already disadvantaged in some respect, where circumstances of poverty and/or geography prevented the delivery of physical and therefore digital services. To a lesser extent, those on the wrong side of the divide may also have landed there because of language proficiency or age.
Sunday, January 31, 2016
Amy Ellis Nutt in the Washington Post:
For the first time, scientists have pinned down a molecular process in the brain that helps to trigger schizophrenia. The researchers involved in the landmark study, which was published Wednesday in the journal Nature, say the discovery of this new genetic pathway probably reveals what goes wrong neurologically in a young person diagnosed with the devastating disorder.
The study marks a watershed moment, with the potential for early detection and new treatments that were unthinkable just a year ago, according to Steven Hyman, director of the Stanley Center for Psychiatric Research at the Broad Institute at MIT. Hyman, a former director of the National Institute of Mental Health, calls it "the most significant mechanistic study about schizophrenia ever."
"I’m a crusty, old, curmudgeonly skeptic," he said. "But I’m almost giddy about these findings."
The researchers, chiefly from the Broad Institute, Harvard Medical School and Boston Children's Hospital, found that a person's risk of schizophrenia is dramatically increased if they inherit variants of a gene important to "synaptic pruning" – the healthy reduction during adolescence of brain cell connections that are no longer needed.
In patients with schizophrenia, a variation in a single position in the DNA sequence marks too many synapses for removal, and that pruning goes out of control. The result is an abnormal loss of gray matter.
Andrew Prokop in Vox:
"Today," Sanders said as he announced his campaign last May, "we begin a political revolution to transform our country economically, politically, socially and environmentally."
And what, exactly, does he mean by that?
On its surface, the concept is simple: Sanders wants to organize and mobilize the people against the powerful — specifically, corporations and the wealthy. He argues that by building a movement among average Americans, he'll be able to win elections, defeat special interests, push liberal reforms into law, and build an economy that works for everyone.
But when you drill down into the details, Sanders's hoped-for political revolution is more complicated than that — and more interesting.
For one, it's a contested theory about the best way to advance progressive policies. For another, it's an electoral argument about how the Democratic Party can expand its appeal among white voters and change the existing partisan math. It's a case that the system is so broken and corrupt that extreme measures are needed to shake it up. And it's a usually implicit, sometimes explicit critique of President Obama and his party.
Katherine Marrone in Bitch Media:
Sundance has a reputation for being more female-friendly than the Hollywood establishment: Each year, at least 25 percent of the films at Sundance are directed by women. And more female-directed and female-centered films win awards at the independent film festival than at the Oscars. But even when they’re a hit on the festival circuit, films directed by or about women often get overlooked for distribution by old-school production studios. This year, though, the big-name studios like Sony may matter less at Sundance as streaming companies like Netflix and Amazon have been snapping up most of the films. “We’re interested in distinctive films by artists who have something new and interesting to say,” Roy Price, head of Amazon Studios, told the New York Times. Still, streaming companies and traditional studios alike have bought some exciting films by and about women this year at Sundance. Here are 10 films that are either directed by a woman or starring a female protagonist (or both) that are making a splash at Sundance. Keep an eye out for them.
Tallulah: Netflix purchased streaming rights to Sian Heder’s writing/directing debut for $5 million. The film stars Ellen Page as Lu, a free-spirited young woman who, distraught after being left by her boyfriend, wanders an upscale hotel looking for leftovers. When she’s mistaken for a maid by one of the hotel’s patrons, she decides to “rescue” the patron’s child from its mother. Lu takes the child to the house of her boyfriend’s mother, Margo (played by Allison Janney). What follows is a deep story of two women from different worlds trying to understand one another.
Under the Shadow: Netflix also bought the rights to this tense and haunting horror film set in Tehran during the 1980s. Directed by Babak Anvari, the Farsi film centers on an aspiring doctor named Shideh (Narges Rashidi) who wants to continue her medical studies but faces pushback because of her political activism. As the story develops, supernatural forces seem to mix with the real-world horrors of war.
From The Telegraph:
Edith Wharton (January 24, 1862 – August 11, 1937) won the 1921 Pulitzer Prize for The Age of Innocence. She was nominated for the Nobel Prize for Literature three times. In this article, originally published in 2007, Caroline Moore reviews Edith Wharton by Hermione Lee.
"No man but a blockhead ever wrote, except for money"; a woman who does so is doubly open to ridicule. As Hermione Lee shows in this excellent biography, the reputation of the novelist Edith Wharton – a magnificently subtle, passionate and constantly surprising writer – has suffered unfairly, merely because she was born near the top of what she called the 'small, slippery pyramid' of society. She was born in 1862: her father, George, was a Jones. This may not sound distinguished; but, as Edith caustically remarked, in New York the Jones family had "for generations, in a most distinguished way… done nothing whatever remarkable". Her relations gave rise to the phrase 'keeping up with the Joneses'; but that did nothing to help the aspirations of an un-pretty, unfashionably red-headed little girl who was born to be remarkable. Almost symbolically, Edith's red hair remained defiantly unfaded until her dying day. Her mother, Lucretia, was cold, disapproving and, according to her daughter, distrusted writers with"the sort of diffidence which, thank heaven, no psychoanalyst had yet arisen to call a complex".
Edith, in accordance with the customs of her class, was forbidden to read any novels, until 'the day of my marriage'. Yet, as a child, she was a natural, even compulsive writer, 'making up' incessantly – a solitary, ritualistic, obsessive activity. Her first literary efforts were quelled. Aged 11, she showed her mother a story which began, ' "Oh, how do you do, Mrs Brown?" said Mrs Tompkins. "If only I had known you were going to call I should have tidied up the drawing room".' 'Never shall I forget', Edith wrote bitterly, 'the sudden drop of my creative frenzy when she returned it with the icy comment: "Drawing rooms are always tidy." '
Fire From My Mother
Out, lips pursed,
Hearing echoes of you
I am never alone
You are here when
I’m breathing fire,
From this world,
Never alone, breathing
Fire from my belly
Infused with embers
Of your eyes, hearth of
Your heart, umbilical cord
Connecting us once like deep
Sea diver to oxygen tank,
Sunlight to life, vitamin D
I breathe in fire breath
You feed me, like the eagle
Feeding her weak fledgling—
“Every crow thinks her crows
Are blackest,” you said…
I breathe in fire, like a bellows
In sweaty blacksmith palms
Breathes in air, I breathe in
Fire, healers breathe, shamans
Breathe, warriors, witch doctors
Dancing ‘round leaping
Red fingers of flame, breathe
I breathe in breaths of fire
From flames you ignited…
by Raymond Nat Turner
Saturday, January 30, 2016
Lynn Parramore interviews Lance Taylor over at the INET blog:
A new paper by economist Lance Taylor for the Institute for New Economic Thinking’s Working Group on the Political Economy of Distribution takes on the way economists have looked at wealth and income inequality. Taylor’s research challenges some conclusions reached by Thomas Piketty and Joseph Stiglitz about what’s driving inequality. What’s really causing the growing gap between haves and have-nots? Is it mechanical market forces? Outsourcing? Real estate? As Taylor sees it, economists have gotten the answer wrong. Worker exploitation and outsized business profits are factors, but even more key are the unjustified payments to the wealthy generated by our outsized financial sector. This hasn’t just “happened.” Flawed economic theory and politicians beholden to the rich lead to policies that make it happen. We can fix the problem, but it will take bold steps.
Lynn Parramore: You recently dived into the debate on what causes wealth and income inequality — and whether or not we can fix it within the existing social order. Heated discussions among economists got touched off by Thomas Piketty’s bestselling book, Capital in the Twenty-First Century, but you say that a key part of the story actually is a debate that happened in the late 60s and early 70s, the “Cambridge capital controversy.” Why is this old debate so vital now?
Lance Taylor: Because it tells us that mainstream economists have been wrong in how they think about inequality for a long time. Which means that they haven’t been particularly helpful in solving the problem. This is one of the key challenges of our time. We can do better.
LP: Ok, so tell us a little about this debate and why the ordinary person should care about it.
LT: The Cambridge capital controversy between economists at MIT in the U.S. and at Cambridge University in the U.K. took place at two levels. Especially for the Brits, the first level was about whether distributions of income and wealth are partly shaped by social and political relationships – class conflict if you will – or mostly by “market forces.”
There were technical skirmishes at the second level – one in particular about the nature of capital and the role of the rate of profit made by producers. Nobody denied that we need capital goods – machines, computers, buildings, railroad tracks – to produce stuff. The question was whether it makes sense to talk about an economy-wide, all-encompassing capital stock. The MIT crowd wanted to say that if you have more capital stock, then three things will happen: 1) the profit rate will fall due to decreasing returns, 2) output and the real wage will go up, and 3) as far as distribution is concerned, the world will be a better place. These ideas are built into the standard mainstream model of economic growth, mostly because of MIT's Bob Solow and Trevor Swan from Australia, the influential economists who invented it. Thomas Piketty relies on this model in his book.
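To see where those three claims come from, here is a minimal sketch (my illustration, not Taylor's) of the one-good logic with a Cobb-Douglas production function:

$$
Y = K^{\alpha}L^{1-\alpha}, \qquad
r = \frac{\partial Y}{\partial K} = \alpha\left(\frac{L}{K}\right)^{1-\alpha}, \qquad
w = \frac{\partial Y}{\partial L} = (1-\alpha)\left(\frac{K}{L}\right)^{\alpha}.
$$

Holding labor L fixed, a larger capital stock K raises output Y, lowers the profit rate r, and raises the real wage w – claims 1) through 3) exactly. The U.K. side's objection was that this only works if the many distinct capital goods can be added up into a single economy-wide K in the first place.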
Any time you produce something, you’re going to use some combination of capital goods. Trains haul iron ore, which makes steel, which makes railroad tracks. The theory of capital has always centered on the implications of how much it costs to use these goods. The Cambridge controversy focused on which combinations of workers and capital goods are cheapest to use as the rate of profit changes.
Complications arise because the production cost of each good depends on the profit rate along with the prices of the other goods.
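As a rough sketch of why (my notation, not the interview's): in a multi-good price system of the kind the Cambridge, U.K. side worked with, let a_{ij} be the amount of good j used to produce one unit of good i, l_i the labor required, w the wage, and r the profit rate. Then, under one common convention for when wages are paid, prices must satisfy

$$
p_i = (1+r)\sum_{j} a_{ij}\, p_j + w\, l_i .
$$

Each price depends on all the other prices and on r, so changing the profit rate can reorder which technique is cheapest – sometimes switching back to a technique already abandoned at a different profit rate, the "reswitching" result the U.K. economists used against the MIT story.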
Over at Meaning of Life TV, Robert Wright talks to Jeremy England:
Laura Tanenbaum reviews Garth Risk Hallberg's City on Fire, in Dissent:
City on Fire, Garth Risk Hallberg’s novel about New York in the 1970s, is a big and elaborately constructed book with 944 pages, dozens of characters, seven sections, six interludes, a prologue, and a postscript. Each section opens with images and quotations, drawn from works ranging from Walt Whitman’s Leaves of Grass to Walter Benjamin’s The Arcades Project. Hallberg seems inspired by the democratic scope of these projects, and by the belief that everyone’s story, everyone’s point of view, should matter.
The novel’s plot centers around the murder of Samantha, a young NYU student hanging around the city’s punk scene. Around this story, Hallberg weaves together a vast range of characters who come into contact with Samantha, and another set who come into contact with them. He moves around in time, filling in social and psychological background on even seemingly peripheral characters. As in the great social novels of the nineteenth century, which are clearly on Hallberg’s mind, we move through the social and class strata of the city. The Hamilton-Sweeneys, one of the city’s richest families, serve as a node for these connections. There’s the family patriarch, William, who is under threat of indictment, and his nefarious stepbrother, Amory Gould, pulling the strings. The son, also William, breaks ties with the family, enters the art world, struggles with addiction, and joins a band in Samantha’s circle. Regan struggles to be the good daughter who stays with the business and with motherhood and domesticity; her husband Keith eventually takes up with Samantha. There’s Charlie, the suburban kid who falls hard for Samantha; Nicky Chaos, the punk guru; and the cop and journalist who investigate her murder. At first, we don’t know who killed Samantha, but we know we are moving towards the events of July 17, 1977, when the city was famously plunged into darkness. Everyone has a story and everyone’s story gets told with sympathy, with the exception of Amory, whose villainy serves as a foil for the novel’s humane liberalism.
“But how was it possible for a book to be as big as life?” asks Mercer Goodman, the teacher, would-be novelist, and sometime-lover of William. “Such a book would have to allocate 30-odd pages for each hour spent living (because this was how much Mercer could read in an hour, before the marijuana)—which was like 800 pages a day. Times 365 equaled roughly 280,000 pages each year: call it 3 million per decade, or 24 million in an average human lifespan.” Hallberg does not give us a book as big as, at least, this life. But it is nonetheless big.
Samuel Beckett Play Brought to Life in an Eerie Short Film Starring Alan Rickman & Kristin Scott Thomas
Colin Marshall in Open Culture:
Here at Open Culture, when we think of authors who write work made for the movies, we do, of course, think of names like Dan Brown, J.K. Rowling, and Robert Ludlum — but even more so of names like Samuel Beckett, whose pushing of aesthetic and intellectual boundaries on the stage we welcome now more than ever on the screen. And in a way, his works have undergone more complete film adaptation than have the books of many bestselling mainstream writers, thanks to the 2002 omnibus project Beckett on Film, which rounded up nineteen auteurs to direct films, ranging in length from seven minutes to two hours, of each and every one of his nineteen plays.
Beckett on Film‘s roster of directors includes Michael Lindsay-Hogg doing Waiting for Godot, Atom Egoyan doing Krapp’s Last Tape, Neil Jordan doing Not I, the artist Damien Hirst doing Breath, and Anthony Minghella, he of The English Patient and The Talented Mr. Ripley, doing Play, which you can watch above. The sixteen-minute production adapts Beckett’s 1963 one-act, a distinctively purgatorial sort of romantic drama which presents a man (“M”), his wife (“W1”), and his mistress (“W2”), each trapped in an urn, each forced to speak about the details of their triangular relationship when, on stage, the spotlight turns to them. On film, Minghella chooses to swap out the spotlight for the camera itself, which cuts, swings, and shifts focus swiftly between the three, commanding the history of the affair from all three perspectives, each delivered with flat, rapid-fire insistence yet with surprising clarity and feeling as well.