Tuesday, May 19, 2015
For as long as humans and horses have coexisted, humans have looked at horses and seen in them that ineffable quality we associate with the things we have lumped into a broad box labeled “Beautiful Things,” along with mountain ranges, the night sky, flowers in bloom, and the female form. The human-horse relationship is so much more intimate than the human-cow or human-chicken or human-sheep relationship. These are animals we ride, and when riding them we have been both mistaken for and mythologized as one and the same being. I don’t need to belabor how little man might have accomplished and how much more slowly he would have done it without the horse. The way we feel about the horse is more like how we feel about dogs than how we feel about stock animals.
Yet as urban dog and cat ownership skyrocketed in the United States between 1920 and 1940, so did the number of meat-processing plants supplying the demand for pet food with horsemeat. About 200 such plants opened during those decades, even as horses were publicly beloved on a scale we can hardly imagine from today’s perspective. There were the YA novels, yes. There was also Seabiscuit running across headlines, and between 1930 and 1948 Gallant Fox, Omaha, War Admiral, Whirlaway, Count Fleet, Assault, and Citation won the Triple Crown in an impressive string, a feat only one American Thoroughbred had managed before. Only three have managed it since; the most recent, Affirmed, won in 1978.
GIVEN THAT JACK SMITH never actually completed another movie after Flaming Creatures (1963), that most of his theater pieces concern the impossibility of their coming into existence, and that many all-but-identical drafts of the same scripts were found among his papers, it’s hardly surprising that he should have been fascinated by the most famously indecisive character in world literature.
Hamlet in the Rented World (A Fragment) is a twenty-seven-minute assemblage put together by Jerry Tartaglia on behalf of the Gladstone Gallery in New York from materials discovered in the Jack Smith Archives, including five quarter-inch audio reels and four rolls of 16-mm film (two of them untouched camera originals), all dating from the early 1970s. Guided by Smith’s scripts, Tartaglia’s reconstruction may be considered the artist’s last, posthumous word. (Hundreds of slides, the material for scores of the slide shows Smith presented during the ’70s and ’80s, remain—but we won’t go there.) Tartaglia, an avant-garde filmmaker whose deep involvement with Smith’s movies began when he discovered Flaming Creatures’ camera original in a laboratory discard bin in 1978 and who has labored over restorations of all Smith’s other film projects, knows this material better than anyone on earth.
Dennis Cooper's latest book, Zac’s Haunted House, was released online in mid-January by the Paris-based small press and label Kiddiepunk. Dubbed an “html novel” and offered as a free download, it consists of seven html files, each of which expands into a long, vertical scroll of animated gifs. You could call Zac’s Haunted House many things: net art, a glorified Tumblr, a visual novel, a mood board, or a dark night of the Internet's soul. It has just a few words—the chapter titles and a few subtitles embedded in some of the gifs—but it still very clearly belongs to Cooper’s own haunted oeuvre, capable of evoking powerful and gnarled emotions. Although it is something of an about-face from his last novel, The Marbled Swarm—with that book’s intentionally contrived, digressive language—Zac’s Haunted House still displays Cooper’s obsessive attention to form and style. It also features his by now nearly classical imagery and interests: The vulnerable young male body juxtaposed with death and failure; charged use of subcultural vernacular; and confused bodies, to say nothing of identities, fumbling through sex and subterfuge. Cooper has always written characters whose ineloquence hints at experiences that defy language; now, telling a story almost exclusively in images, he pushes this inarticulateness in a new direction. The result is surprisingly eloquent, and accurately speaks to our experience of the present, online and IRL.
I thought I was following a track of freedom
and for awhile it was —Adrienne Rich
Consider the earnestness of pavement
its dark elegant sheen after rain,
its insistence on leading you somewhere
A highway wants to own the landscape,
it sections prairie into neat squares
swallows mile after mile of countryside
to connect the dots of cities and towns,
to make sense of things
A river is less opinionated
it never argues with gravity
its history is a series of delicate negotiations with
time and geography
Wet your feet all you want
it's never the river you remember;
a road repeats itself incessantly
obsessed with its own small truth,
it wants you to believe in something particular
The destination you have in mind when you set out
is nowhere you have ever been;
where you arrive finally depends on
how you get there,
by river or by road
by Michael Crummey
from Arguments With Gravity
Kingston, Ont.: Quarry Press, 1996.
Philip Ball in Nautilus:
In 2005, medical science was shaken by a paper with the provocative title “Why most published research findings are false.” Written by John Ioannidis, a professor of medicine at Stanford University, it didn’t actually show that any particular result was wrong. Instead, it showed that the statistics of reported positive findings were not consistent with how often one should expect to find them. As Ioannidis concluded more recently, “many published research findings are false or exaggerated, and an estimated 85 percent of research resources are wasted.”
It’s likely that some researchers are consciously cherry-picking data to get their work published. And some of the problems surely lie with journal publication policies. But the problems of false findings often begin with researchers unwittingly fooling themselves: they fall prey to cognitive biases, common modes of thinking that lure us toward wrong but convenient or attractive conclusions. “Seeing the reproducibility rates in psychology and other empirical science, we can safely say that something is not working out the way it should,” says Susann Fiedler, a behavioral economist at the Max Planck Institute for Research on Collective Goods in Bonn, Germany. “Cognitive biases might be one reason for that.” Psychologist Brian Nosek of the University of Virginia says that the most common and problematic bias in science is “motivated reasoning”: We interpret observations to fit a particular idea. Psychologists have shown that “most of our reasoning is in fact rationalization,” he says. In other words, we have already made the decision about what to do or to think, and our “explanation” of our reasoning is really a justification for doing what we wanted to do—or to believe—anyway. Science is of course meant to be more objective and skeptical than everyday thought—but how much is it, really?
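Ioannidis’s point is, at bottom, arithmetical: if only a small fraction of the hypotheses a field tests are actually true, then even well-powered studies using the conventional p < 0.05 threshold will fill the literature with “positive” findings that are mostly false. Here is a minimal sketch of that calculation, with purely illustrative priors and power; none of these numbers come from the paper itself:

```python
# Illustrative sketch of the false-findings arithmetic (assumed values,
# not figures from Ioannidis's paper).

def positive_predictive_value(prior, power=0.8, alpha=0.05):
    """Fraction of statistically significant results that reflect true effects."""
    true_positives = power * prior          # real effects correctly detected
    false_positives = alpha * (1 - prior)   # null effects crossing p < alpha by chance
    return true_positives / (true_positives + false_positives)

for prior in (0.5, 0.1, 0.01):
    ppv = positive_predictive_value(prior)
    print(f"prior {prior:>4}: {ppv:.0%} of significant findings would be true")
```

With a prior of one true hypothesis in a hundred tested, fewer than one in seven significant results would reflect a real effect, and that is before any cherry-picking or motivated reasoning enters the picture.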
George Johnson in The New York Times:
The powerful algorithm that has populated the earth with 10 million species, each occupying a different ecological niche, is an example of what computer scientists call “random generate and test.” Start with the DNA alphabet, then blindly shuffle the letters to produce a kaleidoscope of living forms. The fittest, selected by the demands of the environment, will multiply and fill their habitats. The Darwinian principle is also at work inside the body, though in very different ways. Through random variation and selection, the immune system spins out the endless diversity of antibodies that it uses to stop microscopic invaders. But cancer also thrives through this evolutionary imperative as, mutation by mutation, a normal human cell transforms into a deadly tumor, which becomes fitter and fitter at the expense of its host. Among the advantages it evolves is the ability to outwit our immunological defenses.
One of the most encouraging developments in medical research has been the effort to help the immune system fight back, beating cancer at its own evolutionary game. That was a dominant theme last month at the annual meeting of the American Association for Cancer Research in Philadelphia as scientists discussed recent successes in immunotherapy while considering how far the field still has to go. Why have these treatments been working so well with some cancers but not others? And why, even in the best cases, do not all patients respond? The realization that Darwinian forces, for good and bad, are at work inside us can be traced to the early 1950s, when Frank Macfarlane Burnet, an Australian virologist, was pondering how we manage to fight off a potentially infinite variety of invading microbes, tailoring an antibody against each one. One possibility was that when an interloper is identified, by its molecular bumps and grooves, the immune system systematically engineers an appropriate weapon. Nature doesn’t work in such a methodical manner, and Burnet suggested a messier, more intuitive explanation: the clonal selection theory of immunity.
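Both the planetary-scale Darwinism Johnson describes and Burnet’s clonal selection theory fit the “random generate and test” template: blind variation, then selection by whatever the environment demands. A toy sketch of the idea follows; the target string, population size, and mutation rate are arbitrary stand-ins, not anything from the article:

```python
import random

# Toy "random generate and test": blind mutation plus selection gradually
# fits a string of DNA letters to an arbitrary "environment" (a target).
ALPHABET = "ACGT"
TARGET = "GATTACAGATTACA"  # stands in for the demands of the environment

def fitness(s):
    return sum(a == b for a, b in zip(s, TARGET))

# Start from random sequences, then repeatedly mutate the fittest survivor.
population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(50)]
for generation in range(200):
    fittest = max(population, key=fitness)   # selection
    if fittest == TARGET:
        break
    population = [                            # blind variation around the survivor
        "".join(c if random.random() > 0.05 else random.choice(ALPHABET) for c in fittest)
        for _ in range(50)
    ]
print(generation, fittest)
```

Nothing in the loop knows what the target “means”; variation is random, and only the test against the environment gives the process direction.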
Monday, May 18, 2015
by Lisa Lieberman
Frigid women. Manipulative wives. Bad mothers. Dumb blondes. Alcoholism. Failing marriages. Furtive sex. Before Mad Men revived these retro conventions and somehow made them hip, they were just tawdry. The poster for BUtterfield 8 (1960) shows Liz Taylor in a slip, highball in one hand, a mink coat hanging off her shoulder. "The most desirable woman in town and the easiest to find. Just call BUtterfield 8." (In the more risqué version, she's standing by a pink telephone wearing nothing but a sheet).
In real life, Liz had just wrecked Eddie Fisher's marriage. He plays her friend Steve in this picture, long-suffering in an older-brotherly way, a real prince. He left Debbie Reynolds for Liz, but she's the one doing penance here. Liz's character, Gloria, is angry, manipulative, and a nymphomaniac: the dark side of 1950s womanhood, as perceived by 1950s men. Nobody would ever mistake her for a nice girl.
The married guy she's cheating with, Liggett, is married to a nice girl, Emily. She's long-suffering too. She knows her husband is lying to her, drinking too much, and knocking her around, but she blames herself for tempting him with a job in Daddy's company when she should have let him stand on his own two feet. Actually, it's not all Emily's fault. Emily's mother played a part in emasculating Liggett. They blamed mothers for everything in the 1950s and, let me tell you, Gloria's mother's got a lot to answer for too.
Poor Gloria. Behind her back, the men who buy her drinks and expensive trinkets (less crass than paying money for her "services") make jokes about how they ought to rent out Yankee Stadium, the only place big enough to hold all her ex-conquests. Poor Liz. She may have won the Oscar for her role, but it wasn't worth the humiliation.
It wasn't only Liz, though. "Prepare to be shocked," promised the trailer to A Summer Place, "because this bold, outspoken drama is the kind of motion picture excitement demanded by audiences today." Really? I can't imagine what audiences in 1959 found shocking about this picture. As an exposé of sexual hypocrisy, it's pretty tame. Yes, there's an extramarital affair, but the betrayed spouses are so unsympathetic you're cheering the adulterous couple on. There's a pair of teenaged lovers having sex too, but Molly (Sandra Dee) and Johnny (Troy Donahue) are driven into one another's arms by the screwed-up adults in their lives. Knowing the mess that both Dee and Donahue made of their own lives, it's tempting to read more into this picture. When Johnny's alcoholic father calls Molly "a succulent little wench," we're obviously meant to feel, with Johnny, that this accusation is unjust, but he only disputes the "wench" part. Dee is indeed succulent, her surface innocence barely concealing her sexual readiness. Toward the end of her life, the actress revealed that she had been raped repeatedly by her step-father as a child. The way she was presented in A Summer Place, it's all there. Poor Sandra.
To Tariq, Younger Brother
7 November 1952 – 7 November 2014
Lines written at Raj Bagh Cemetery and at Jewel House
The root of our life, the life below the life
At Raj Bagh Cemetery
Aha! There you are buried at Father’s feet,
next to uncle Rasool. Are you still
not talking to him? Why did you steer clear
of him all your adult life? Grudges?
We lived our childhood with his children, after
all. Say, “Hello! Uncle Rasool,” or your
typical “Howdy!” Believe me, talking cures.
“I don’t want to see your face again,”
you wrote me once I sold you my share in
Jewel House for a brotherly sum.
Net one-eighty. In no time, you seeded
Mia’s young mind with poison talk: Don’t
trust our family, you told her. Have faith in
only the peerless Mister Peer, best
friend—who, by the way, was not at your burial.
Everyone is Corruptible,
his creed, you told me once. No money for your
school, you wrote Mia. She spread the news:
I had taken all. Tsk! Tsk! I know no dad,
except in fiction, who would disgrace
his sole heir, not even the tuk tuk driver
who dodges rogue traffic to wheel me
to the lively veggie bazaar at Dal Gate.
Such malice! Matched only by your ex-
wife’s mediocrity, turning up her fatuous
nose as if her kind had all the world’s
culture, Kashmiris only agriculture.
"(Bee-eaters) forage over grasslands and Acacia savanna, and are well known for the ingenious use of ‘beaters’ to chase up grasshoppers, dragonflies and other prey species. These beaters usually take the form of grazing herds of game and domestic animals, and large flocks of carmine bee-eaters may gather overhead. They also use various creatures as convenient mobile perches from which to swoop off, snatching insects flushed by their ride.
Northern Carmine Bee-eaters in particular are masters of this trait, and rides range from elephants, donkeys and goats to Kori and Arabian Bustards, Abyssinian Ground Hornbills ..."
by Yohan J. John
We are routinely told that we live in a brave new Information Age. Every aspect of human life — commerce, entertainment, education, and perhaps even the shape of consciousness itself — seems to be undergoing an information-driven revolution. The tools for storing and sharing information are becoming faster, more ubiquitous, and less visible. Meanwhile, we are increasingly employing information as an explanation of phenomena outside the world of culture and technology — as the central metaphor with which to talk about the nature of life and mind. Molecular biology, for instance, tells us how genetic information is transferred from one generation to the next, and from one cell to the next. And neuroscience is trying to tell us how information from the external world and the body percolates through the brain, influencing behavior and giving rise to conscious experience.
But do we really know what information is in the first place? And is it really a helpful way to think about biological phenomena? I'd like to argue that explanations of natural phenomena that involve information make inappropriate use of our latent, unexamined intuitions about inter-personal communication, blurring the line between what we understand and what we don't quite have a grip on yet.
People who use information technologies presumably have a working definition of information. We often see it as synonymous with data: whatever can be stored on a hard drive, or downloaded from the internet. This covers text, images, sound, and video — anything that can be represented in bits and bytes. Vision and hearing are the senses we seem to rely on most often for communication, so it's easy to forget that there are still experiences that we cannot really communicate yet, like textures, odors or tastes. (Smellevision still seems a long way off.)
The data-centric conception of information is little over half a century old, and sits alongside an older sense of information. The word 'information' comes from the verb 'inform', which is from the Old French word informer, which means 'instruct' or 'teach'. This word in turn derives from Latin informare, which means 'to shape, form'. The concept of form is closely linked to this sense of information. When something is informative, it creates a specific form or structure in the mind of the receiver — one that is presumably useful.
But there is a tension between seeing information as a unit of communication, and seeing it as something that allows a sender to create a desired result in the mind of a receiver. And this tension goes back to the origins of information theory. Claude Shannon introduced the modern technical notion of information in 1948, in a paper called A Mathematical Theory of Communication. He framed his theory in terms of a transmitter, a channel, and a receiver. The mathematical results he derived showed how any signal could be coded as a series of discrete symbols, and transmitted with perfect fidelity between sender and receiver, even if the channel is noisy. But for the purposes of the theory, the meaning or content of the information was irrelevant. The theory explained how to efficiently send symbols between point A and point B, but had nothing to say about what was actually done with these symbols. All that mattered was that the sender and receiver agree on a system of encoding and decoding. Information theory, and all the technologies that emerged in its wake, allows us to communicate more and communicate faster, but it doesn't really tell us everything we would like to know about communication.
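Shannon's measure is deliberately blind to meaning, and that blindness is easy to demonstrate: the number of bits needed to encode a message depends only on the statistics of its symbols, so a sentence and a scrambled version of the same letters are, to the theory, equally informative. A small illustrative sketch (the function and the example strings are mine, not the essay's):

```python
from collections import Counter
from math import log2

# Shannon entropy: average bits per symbol needed to encode a message,
# computed purely from symbol frequencies, never from what the message says.
def entropy_bits_per_symbol(message: str) -> float:
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

meaningful = "send help immediately"
gibberish = "dnes pleh yletaidemmi"   # same letters, scrambled: identical entropy
print(entropy_bits_per_symbol(meaningful), entropy_bits_per_symbol(gibberish))
```

The two values are identical, which is exactly the gap the essay points to: the theory tells us how cheaply a signal can be sent, not what, if anything, it informs.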
by Brooks Riley
Sunday, May 17, 2015
Elizabeth Barnes in Philosop-her:
I didn’t expect to feel so angry. A few years ago, having established a certain amount of professional security, I decided to start doing more work on social and feminist philosophy – especially philosophical issues related to disability. I’d always done some work on the topic, but I considered doing lots of work on it a professional luxury that had to be earned. When I began to focus more of my research on disability, I expected plenty of things – a deeper sense of fulfillment from what I was doing, a fair amount of side eye from colleagues, worries that the topic was too niche to be of general interest – but I didn’t expect the emotional drain that the work would be. I feel angry – more than I could’ve anticipated and more than I often care to admit – when I write about disability. And I also, at times, feel so, so sad.
I have sat in philosophy seminars where it was asserted that I should be left to die on a desert island if the choice was between saving me and saving an arbitrary non-disabled person. I have been told it would be wrong for me to have my biological children because of my disability. I have been told that, while it isn’t bad for me to exist, it would’ve been better if my mother could’ve had a non-disabled child instead. I’ve even been told that it would’ve been better, had she known, for my mother to have an abortion and try again in hopes of conceiving a non-disabled child. I have been told that it is obvious that my life is less valuable when compared to the lives of arbitrary non-disabled people. And these things weren’t said as the conclusions of careful, extended argument. They were casual assertions. They were the kind of thing you skip over without pause because it’s the uncontroversial part of your talk.
Amy Fleming in More Intelligent Life:
THE REWARD SYSTEM exists to ensure we seek out what we need. If having sex, eating nutritious food or being smiled at brings us pleasure, we will strive to obtain more of these stimuli and go on to procreate, grow bigger and find strength in numbers. Only it’s not as simple in the modern world, where people can also watch porn, camp out in the street for the latest iPhone or binge on KitKats, and become addicted, indebted or overweight. As Aristotle once wrote: “It is of the nature of desire not to be satisfied, and most men live only for the gratification of it.” Buddhists, meanwhile, have endeavoured for 2,500 years to overcome the suffering caused by our propensity for longing. Now, it seems, the neuroscientist Kent Berridge has found the neuro-anatomical basis for this facet of the human condition—that we are hardwired to be insatiable wanting machines.
If you had opened a textbook on brain rewards in the late 1980s, it would have told you that the dopamine and opioids that swished and flickered around the reward pathway were the blissful brain chemicals responsible for pleasure. The reward system was about pleasure and somehow learning what yields it, and little more. So when Berridge, a dedicated young scientist who was more David than Goliath, stumbled upon evidence in 1986 that dopamine did not produce pleasure, but in fact desire, he kept quiet. It wasn’t until the early 1990s, after rigorous research, that he felt bold enough to go public with his new thesis. The reward system, he then asserted, has two distinct elements: wanting and liking (or desire and pleasure). While dopamine makes us want, the liking part comes from opioids and also endocannabinoids (a version of marijuana produced in the brain), which paint a “gloss of pleasure”, as Berridge puts it, on good experiences.
Spencer Kornhaber in The Atlantic:
In 1949, the legend goes, B.B. King ran into a burning building to save a guitar he loved. The dance hall he’d been playing at in Twist, Arkansas, caught flame when two men knocked over a barrel of fuel while fighting about a woman. The woman’s name was Lucille—and from that point on, King’s guitar was named Lucille, too. Though Gibson would later launch a B.B. King Lucille model, and King indeed favored that company’s instruments, there wasn’t just one Lucille. Most any guitar he’d play would get the name. Much like how the name came to stand in for the instrument, King’s name came to stand, in the public’s imagination, for the kind of music he played. When people today talk about the blues, they’re talking in part about B.B. King; when they talk about B.B. King, they’re talking about the blues. The two concepts are the same.
More here. (Note: I had the honor of hearing him live many times. With his passing, an entire era has ended.)
A Panic That Can Still Come Upon Me
if this time you, all of it, this time now
If exit is merely a sign
by Peter Gizzi
from The Outernationale © 2007
Saturday, May 16, 2015
For Bedford, histories that start in the parlor room can only end in the street. To illustrate the public temperament surrounding the novel’s scandals, Bedford provides unmarked fragments of dialogue, pulled, so it seems, from the cafés, the sitting rooms, and the street corners. Some are clearly from on high. When Eduard’s wife, Sarah, promises never to pay another of her husband’s debts, two voices muse: “She might have done it less subtly.” / “This kind of thing can only be done in that way or not at all.” / “Then it cannot be done at all.” Others, from on low. When the Felden Scandal erupts, so do the lower classes: “Our taxes.” / “That’s right.” / “Our savings.” / “Hear, hear!” / “The working man’s pence.” / “That’s where they go!” / “Lunatics in luxury.” And anti-Semitism: “Did you see—Jews got their fingers in it too.” / “Whenever there is something rotten in the state of Denmark . . . ” Like the Dreyfus Affair in Proust, the Felden Scandal occasions a glimpse into the larger social context beyond our principals; unlike Proust, Bedford knows where the sentiments are headed—where and when and how the casual and mocking anti-Semitism turns from words into actions.
A Legacy doesn’t find answers to the postwar era’s questions; to be fair, few books do and none conclusively. Rather, Bedford’s novel shows that the roots of our evils—our social evils, our political evils—are not just in decisions made in bunkers or boardrooms, but in kitchens and bedrooms as well. And they don’t start as evils, perhaps. Death might begin as a disagreement over dinner. That’s putting it lightly, but all histories are linked. As Sarah notes, “Crisis? There are no crises. It’s all a chain, a long chain.”
In January 2014 Michael Gove, then Britain’s education secretary, opened the centenary year of the first world war in typically belligerent style, with a full-frontal attack on the “myth” that the conflict was a “misbegotten shambles — a series of catastrophic mistakes perpetrated by an out-of-touch elite”. This falsehood was propagated, he claimed, by various guilty parties — from the authors of Oh! What a Lovely War and Blackadder to “leftwing academics”. “Leftwing academics” duly returned fire, and Gove came in for a drubbing in the liberal press.
Yet to be fair to Gove, he was simply echoing, albeit rather crudely, the work of academic historians. Hew Strachan and others have for some time been challenging the “lions led by donkeys” view of the war, championed by AJP Taylor in the bracingly anti-elitist 1960s. For Strachan, the British fought a necessary war against an illiberal and militaristic Germany. Nor does Taylor’s stress on callous elitism and aristocratic arrogance find much favour in Christopher Clark’s The Sleepwalkers — one of the most important books of the centenary season. For Clark, all of Europe’s rulers, including Britain’s, were blameworthy, but it is their myopic misreadings of international politics, not their aristocratic values, that he sees as culpable.
Medicine is dominated by the quants. We learn about human health from facts, and facts are measurable. A disease is present or not present; a reckonable proportion of people respond to a particular drug; the inability to predict gene-environment interactions reflects only a failure to map facts we will eventually be able to determine; and if the observable phenotype varies for an established genotype, the differences must be caused by calculable issues. In this version of things, the case histories that constituted most of medical literature up to the early 20th century reflect a lack of empirical sophistication. Only if we can’t compute something are we reduced to storytelling, which is inherently subjective and often inaccurate. Science trades in facts, not anecdotes.
No one has done more to shift this arithmetical naïveté than Oliver Sacks, whose career as a clinician and writer has been devoted to charting the unfathomable complexity of human lives. “All sorts of generalizations are made possible by dealing with populations,” he writes in his new memoir “On the Move,” “but one needs the concrete, the particular, the personal too.” The emergent field of narrative medicine, in which a patient’s life story is elicited in order that his immediate health crisis may be addressed, in many ways reflects Sacks’ belief that a patient may know more about his condition than those treating him do, and that doctors’ ability to listen can therefore outrank technical erudition. Common standards of physician neutrality are in Sacks’ view cold and unforgiving — a trespass not merely against a patient’s wish for loving care, but also against efficacy.
Molly Hannon in The Daily Beast:
“You are born, you grow up, and you become a wife.” “But what if it wasn’t this way?” asks Kate Bolick, the author of Spinster: Making a Life of One’s Own. What if women did not have to worry about getting married, or agonize about when and if it will happen—two questions, Bolick claims, that will hound a young girl into her adult life, regardless of where she was raised, or her religious association. “Men don’t have the same problems,” she argues. And she’s right. They don’t. So what if women were like men? What if marriage was not an end goal, but simply a choice—a choice to not settle, a choice to not search, or even the choice to forgo waiting for Mr. Right to magically appear? What if women could save themselves and carve out a life of their own—on their own terms, and be content with that choice, or at least free from the judgment of others?
Bolick’s book, which reads more like a memoir than a manifesto on the single life, manages to deliver an honest confession about the perils of being alone. She does not gush. Instead, she tells. She recounts childhood and puberty with a wry and self-deprecating fondness, homing in on how young girls are quickly evaluated on their looks—and marketability. Then, there is the confusing joy of hormones and high school, and the gradual transition into college, and the debauchery and free love that follows. From that, women come to a point where they can settle, push on, or wait. Does one venture out into the real world, where Solo cups of beer and parties are not always present or available? Or should we resist and go our own way?
Friday, May 15, 2015
Brendan O'Connor in The Verge:
Though in life Rube Goldberg was known to the world as a cartoonist, he was first an engineer. He graduated from UC Berkeley in 1904 and took a job in San Francisco, where he worked on the city’s sewer systems. But he didn’t last long. A naturally talented artist, Goldberg became a sports cartoonist for the San Francisco Chronicle, earning $8 per week.
He moved to New York in 1907; by 1915, his cartoons were nationally syndicated. This was an era in which a syndicated cartoonist could make a healthy living: according to a short profile published by The New York Times in 1963, Goldberg was earning a salary upwards of $50,000 by 1916 — over $1 million by today’s standards.
Over the course of his decades-long career, Goldberg drew cartoons that were variously political and frivolous. He penned three nationally syndicated, weekly comic strips —"Boob McNutt," "Mike and Ike: They Look Alike," and "Lala Palooza" — and wrote a single-frame cartoon called "Foolish Questions." At the peak of his career, he wrote three editorial page cartoons every week, which appeared in 43 newspapers across the country.
Goldberg’s work made him famous: he was named the first president of the National Cartoonists Society in 1946; in 1948, he won the Pulitzer Prize for a political cartoon warning of the dangers of the atomic bomb. (The conservative Goldberg was invited to the White House by Presidents Eisenhower and Nixon.) Goldberg "has won as many trophies as even his most prolific trophy-inventing machine might devise," reads a short Times profile on the occasion of his 80th birthday. "He takes them seriously but not too seriously, like nearly everything else in life."
But Goldberg’s engineering studies were not entirely wasted — no cartoons left as indelible an impact on popular culture as his mechanical chain-reaction illustrations. Goldberg drew his cockamamie inventions intermittently from the beginning of his career — he drew the first, "Automatic Weight Reducing Machine," in 1914, and in 1921 Marcel Duchamp published some of Goldberg’s designs in New York Dada. But the majority of these cartoons come from a bi-weekly series he drew for the magazine Collier’s Weekly from 1929 to 1931 called "The Inventions of Professor Lucifer G. Butts." Professor Butts (the "G" stood for "Gorgonzola") was a parody of a Berkeley engineering professor who had once asked his students to design a machine that could weigh the world. Goldberg, one of those students, found this to be a preposterous task.
Julian Baggini in Aeon:
The idea that in vitro meat (IVM) might have a part to play in a cleaner, fairer food system runs counter to a central idea put forward by many critics of industrial agriculture: that farming needs to be based more on traditional, natural, biological and ecological systems, not artificial monocultures. Surely in vitro meat would be the most artificial monoculture of them all.
Professor Mark Post of Maastricht University presents his 'cultured beef' burger. Photo by David Parry/PA
The belief that we have to choose between a food system that is over-dependent on technology and one that is more in harmony with nature rests on the assumption that there is a neat moral and conceptual contrast between ‘natural’ and ‘artificial’, and that this lines up neatly with the distinction between ‘good’ and ‘bad’. If IVM is the greenest, most animal-friendly meat, yet it is even more artificial than a pitiful, intensively reared broiler chicken, then no one can maintain the fantasy that bucolic nature has a monopoly on good, ethical food.
For those who have campaigned for a more ethical and sustainable food system, IVM is a good test of where their values really lie: with hard-nosed ethics or soft-focus sentiment. After all, it is hard for anyone concerned about the environment or animal welfare to disagree with Post’s claim that ‘from an ethical view [IVM] can have only benefits’. Cultured meat has the potential to replace lame, belching, farting, grain-guzzling, confined beasts with clean, safe, sustainable meat, direct from the factory floor.
Faced with this unsettling truth, how have greens and animal rights campaigners responded to Post’s synthetic burger?