Thirty years ago last week, Salman Rushdie’s The Satanic Verses was published. Rushdie was then perhaps the most celebrated British novelist of his generation. His new novel, five years in the making, had been expected to set the world alight, though not quite in the way that it did.
The novel was, Rushdie suggested, both about “migration, metamorphosis, divided selves, love, death” and “a serious attempt to write about religion and revelation from the point of view of a secular person”. At its heart was a clash of race, religion and identity that, ironically, prophesied the controversy that engulfed the novel and still shapes our lives today.
Within a month, The Satanic Verses had been banned in Rushdie’s native India. By the end of the year, protesters had burned a copy of the novel on the streets of Bolton. Then, on Valentine’s Day 1989, came the event that transformed the controversy – Ayatollah Khomeini’s fatwa calling for Rushdie’s death.
The affair marked a watershed in British political and cultural life. There had long been conflicts between minority communities and the state, from the Notting Hill riots of the 1950s to the Grunwick dispute in 1977, to the inner-city disturbances of the 1980s. These were in the main political conflicts, workplace struggles or issues of law and order.
Dangling from a balloon high above Antarctica, a particle detector has spotted something that standard physics is at a loss to explain.
Two unusual signals seen by the detector, known as the Antarctic Impulsive Transient Antenna, or ANITA, can’t be attributed to any known particles, a team of physicists at Penn State reports online September 25 at arXiv.org. The result hints at the possibility of new particles beyond those cataloged in the standard model, the theory that describes the various elementary particles that make up matter.
Like the old man in the Pixar movie Up, ANITA floats on a helium balloon at an altitude of 37 kilometers for about a month at a time. It searches for the signals of high-energy particles from space, including lightweight, ghostly particles called neutrinos. Those neutrinos can interact within Antarctica’s ice, producing radio waves that are picked up by ANITA’s antennas.
The two puzzling signals appear to be from extremely energetic neutrinos shooting skyward from within the Earth. A neutrino coming up from below isn’t inherently surprising: Low-energy neutrinos interact with matter so weakly that they can zip through the entire planet. But high-energy neutrinos can’t pass through as much material as lower-energy neutrinos can. So although high-energy neutrinos can skim the edges of the planet, they won’t survive a pass straight through.
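The energy dependence described above can be sketched as a back-of-the-envelope survival probability: the neutrino–nucleon cross section grows with energy, so a neutrino's chance of crossing a diameter-length chord of the Earth collapses at high energies. The power-law cross-section fit and the column depth below are rough illustrative numbers for the sake of the sketch, not ANITA's actual analysis:

```python
import math

N_A = 6.022e23          # Avogadro's number: roughly nucleons per gram of matter

def cc_cross_section(E_GeV):
    """Rough power-law fit to the charged-current neutrino-nucleon
    cross section (order-of-magnitude only), in cm^2."""
    return 7.8e-36 * E_GeV ** 0.363

def survival_probability(E_GeV, column_depth=1.1e10):
    """Chance a neutrino of energy E_GeV crosses the given column depth
    (in g/cm^2) without interacting; 1.1e10 g/cm^2 is roughly a
    diameter-length chord straight through the Earth."""
    interaction_length = 1.0 / (N_A * cc_cross_section(E_GeV))  # g/cm^2
    return math.exp(-column_depth / interaction_length)

# A low-energy (GeV) neutrino sails through the planet;
# an EeV-scale neutrino essentially never does.
print(survival_probability(1.0))    # ~0.95
print(survival_probability(1e9))    # effectively zero
```

The exponential makes the point: a modest rise in cross section turns the whole Earth from nearly transparent to completely opaque, which is why an upward-going ultra-high-energy signal is so hard to explain.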
Alexander Hertel-Fernandez, Caroline Tervo, and Theda Skocpol in The Guardian:
The cries of “Shame! Shame! Shame!” rang throughout the marbled walls of the Wisconsin state assembly chamber. Disgusted Democratic politicians, some of whom had been up for over 60 hours by this point, punctuated their chants by throwing papers – and even drinks – at their Republican counterparts. Police officers had to be summoned to physically restrain one Democratic representative yelling “Cowards!” across the aisle.
The source of this confrontation, in the early hours of February 2011, was an unprecedented push by Wisconsin Republicans, led by the state’s newly elected Republican governor, Scott Walker, to slash the union rights held by most public workers. Walker argued that budget woes in the state necessitated the shift, and barrelled forward to eliminate the rights of virtually all public-sector workers to collectively bargain with government and to allow government employees to opt out of paying dues to their unions.
At first blush this might seem like a years-old local issue in a US state that rarely lights up the international headlines. Yet events in Wisconsin are crucial to understanding how a little-known, billionaire-funded organization, called Americans for Prosperity (AFP), has tilted American politics to the right. It is intertwined with, and rivals in size, the Republican party itself.
The man was 23 when the delusions came on. He became convinced that his thoughts were leaking out of his head and that other people could hear them. When he watched television, he thought the actors were signaling him, trying to communicate. He became irritable and anxious and couldn’t sleep. Dr. Tsuyoshi Miyaoka, a psychiatrist treating him at the Shimane University School of Medicine in Japan, eventually diagnosed paranoid schizophrenia. He then prescribed a series of antipsychotic drugs. None helped. The man’s symptoms were, in medical parlance, “treatment resistant.”

A year later, the man’s condition worsened. He developed fatigue, fever and shortness of breath, and it turned out he had a cancer of the blood called acute myeloid leukemia. He’d need a bone-marrow transplant to survive.

After the procedure came the miracle. The man’s delusions and paranoia almost completely disappeared. His schizophrenia seemingly vanished. Years later, “he is completely off all medication and shows no psychiatric symptoms,” Dr. Miyaoka told me in an email. Somehow the transplant cured the man’s schizophrenia.
A bone-marrow transplant essentially reboots the immune system. Chemotherapy kills off your old white blood cells, and new ones sprout from the donor’s transplanted blood stem cells. It’s unwise to extrapolate too much from a single case study, and it’s possible it was the drugs the man took as part of the transplant procedure that helped him. But his recovery suggests that his immune system was somehow driving his psychiatric symptoms.

At first glance, the idea seems bizarre — what does the immune system have to do with the brain? — but it jibes with a growing body of literature suggesting that the immune system is involved in psychiatric disorders from depression to bipolar disorder.

The theory has a long, if somewhat overlooked, history. In the late 19th century, physicians noticed that when infections tore through psychiatric wards, the resulting fevers seemed to cause an improvement in some mentally ill and even catatonic patients. Inspired by these observations, the Austrian physician Julius Wagner-Jauregg developed a method of deliberate infection of psychiatric patients with malaria to induce fever. Some of his patients died from the treatment, but many others recovered. He won a Nobel Prize in 1927.

One much more recent case study relates how a woman’s psychotic symptoms — she had schizoaffective disorder, which combines symptoms of schizophrenia and a mood disorder such as depression — were gone after a severe infection with high fever.
Before 2016 Jordan Peterson was indistinguishable from any other relatively successful academic with a respectable scholarly pedigree: B.A. in political science from the University of Alberta (1982), B.A. in psychology from the same institution (1984), Ph.D. in clinical psychology from McGill University (1991), postdoc at McGill’s Douglas Hospital (1992–1993), assistant and associate professorships at Harvard University in the psychology department (1993–1998), full tenured professorship at the University of Toronto (1999 to present), private clinical practice in Toronto, and a scholarly book by a reputable publishing house (Routledge). This ordinary career path turned extraordinary in 2016 when the controversial Bill C-16, a federal amendment to the Canadian Human Rights Act and Criminal Code, was passed, “to protect individuals from discrimination within the sphere of federal jurisdiction and from being the targets of hate propaganda, as a consequence of their gender identity or their gender expression.”7 That sounds reasonable enough: if we’re going to protect people from discrimination based on race, age, sex, and religion, why not gender identity or expression as well? Who would disagree with this clause in the bill?
[A]ll individuals should have an opportunity equal with other individuals to make for themselves the lives that they are able and wish to have and to have their needs accommodated, consistent with their duties and obligations as members of society, without being hindered in or prevented from doing so by discriminatory practices based on race, national or ethnic origin, colour, religion, age, sex, sexual orientation, marital status, family status, disability or conviction for an offence for which a pardon has been granted or in respect of which a record suspension has been ordered.
To me this reads like another step on the moral arc bending toward justice. But in a series of YouTube videos Peterson outlined his concerns (dread really) that Bill C-16 could turn into “compelled speech” that, if not obeyed, could land one in jail for not addressing someone by their preferred pronoun (zie, xem, hir, ve, xe, xyr…).8 Peterson went on record stating, “I’m not using the words that other people require me to use, especially if they’re made up by radical left-wing ideologues. And that’s that.”9 Even more emphatically, he told a television audience, “If they fine me, I won’t pay it. If they put me in jail, I’ll go on a hunger strike.”10 The image of a Canadian psychology professor on a hunger strike over gender pronouns is a little hard to equate with Gandhi’s emaciating efforts to free his country from British rule, but it’s a sign of moral progress that we’ve shifted from condemning colonization to protesting pronouns.
In 1921, 24-year-old William Faulkner had dropped out of the University of Mississippi (for the second time) and was living in Greenwich Village, working in a bookstore—but he was getting restless. Eventually, his mentor, Phil Stone, an Oxford attorney, arranged for him to be appointed postmaster at the school he had only recently left. He was paid a salary of $1,700 in 1922 and $1,800 in the following years, but it’s unclear how he came by that raise, because by all accounts he was uniquely terrible at his job. “I forced Bill to take the job over his own declination and refusal,” Stone said later, according to David Minter’s biography. “He made the damndest postmaster the world has ever seen.”
Faulkner would open and close the office whenever he felt like it, he would read other people’s magazines, he would throw out any mail he thought unimportant, he would play cards with his friends or write in the back while patrons waited out front. A comic in the student publication Ole Miss in 1922 showed a picture of Faulkner and the post office, calling it the “Postgraduate Club. Hours: 11:30 to 12:30 every Wednesday. Motto: Never put the mail up on time. Aim: Develop postmasters out of fifty students every year.”
When René Descartes was 31 years old, in 1627, he began to write a manifesto on the proper methods of philosophising. He chose the title Regulae ad Directionem Ingenii, or Rules for the Direction of the Mind. It is a curious work. Descartes originally intended to present 36 rules divided evenly into three parts, but the manuscript trails off in the middle of the second part. Each rule was to be set forth in one or two sentences followed by a lengthy elaboration. The first rule tells us that ‘The end of study should be to direct the mind to an enunciation of sound and correct judgments on all matters that come before it,’ and the third rule tells us that ‘Our enquiries should be directed, not to what others have thought … but to what we can clearly and perspicuously behold and with certainty deduce.’ Rule four tells us that ‘There is a need of a method for finding out the truth.’
But soon the manuscript takes an unexpectedly mathematical turn. Diagrams and calculations creep in. Rule 19 informs us that proper application of the philosophical method requires us to ‘find out as many magnitudes as we have unknown terms, treated as though they were known’. This will ‘give us as many equations as there are unknowns’. Rule 20 tells us that, ‘having got our equations, we must proceed to carry out such operations as we have neglected, taking care never to multiply where we can divide’. Reading the Rules is like sitting down to read an introduction to philosophy and finding yourself, an hour later, in the midst of an algebra textbook.
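Rules 19 and 20 are, in modern terms, the recipe for setting up and solving a system of simultaneous equations: as many equations as unknowns, then the mechanical operations that isolate each unknown. A minimal sketch of what Descartes is prescribing, with numbers invented purely for illustration:

```python
from fractions import Fraction as F

# Rule 19: introduce as many unknowns as magnitudes sought, and write
# as many equations as there are unknowns. An illustrative system:
#   x + y = 10
#   x - y = 4
# Rule 20: "carry out such operations as we have neglected" -- here,
# simple elimination: add the two equations, then divide.
x = F(10 + 4, 2)   # (x + y) + (x - y) = 14, so 2x = 14, x = 7
y = F(10, 1) - x   # substitute x back into the first equation
print(x, y)        # 7 3
```

The exercise is anachronistic, of course, but it shows why the manuscript reads like an algebra textbook: the “method for finding out the truth” Descartes had in mind was, at bottom, equation-solving.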
In his consistently entertaining new book, “Immigrant, Montana,” Amitava Kumar, an Indian-born writer and scholar, recalls the youthful romantic adventures of Kailash, an Indian-born writer and scholar. The fuzzy distinctions between the author’s life and that of his fictional protagonist are multiple and intentional. “This is a work of fiction as well as nonfiction,” Kumar explains in an author’s note, “an in-between novel by an in-between writer.”
The relationship between fact and fiction provides an animating tension throughout Kailash’s recollection of his salad days. While pursuing graduate study at a university that sounds a lot like Columbia, he researched the life and career of Agnes Smedley, a real-life American writer, best known for her book “Daughter of Earth.” Kailash describes it as “neither a memoir nor simply a novel. And when I read it, I thought Smedley offered us a model for writing.” He found a similar example in a charismatic professor named Ehsaan Ali. “From Ehsaan we wanted narrative,” Kailash recalls. “We didn’t always care how much of it was nonfiction or fiction. Ehsaan lived — and narrated — his life along the blurry Line of Control between the two genres.”
Emmy Noether was a force in mathematics — and knew it. She was fully confident in her capabilities and ideas. Yet a century on, those ideas, and their contribution to science, often go unnoticed. Most physicists are aware of her fundamental theorem, which puts symmetry at the heart of physical law. But how many know anything of her and her life?
A conference in London this week, the Noether Celebration, hopes to change that. It’s a welcome move. In a world where young scientists look for inspirational female role models, it is hard to think of a more deserving candidate.
Noether was born in 1882 in Erlangen, Germany. Her parents wanted all their children to get doctorates, so although many universities at the time did not formally accept women, she went. After graduation, sexist regulations prevented Noether from getting jobs in academia. Undaunted, for many years she lectured in Erlangen and, from 1915, at the University of Göttingen — often for free.
At the time, that city was the centre of the mathematical world, largely due to the presence of two of its titans — Felix Klein and David Hilbert.
When last June we heard about the kids arriving in New York from the Southern border—the first moment the child separation policy flared into the public eye, the first I knew of its systematic existence—we raced to LaGuardia to witness, to support, to stage something visible, at least. We made posters on the M60 bus. I passed around fat sharpies, brought extra neon poster-board. No one knew what to write, or who to address. The children? Their captors? The news cameras? It maybe wasn’t a great decision. The couple of kids I saw filing out of side doors were so little and tired and quiet. I’m not sure a phalanx of screaming adults helped, though it gave the TV cameras something to show other than their tiny bodies. Some stubborn questioners managed to get information about where the children were being taken in the unmarked vans.
This emergency airport-going has been a thing these last two years. I like it. It is good to disrupt these spaces of fear and docility with liveness and spontaneity and too much mess and song and language everywhere. To show up and clog up and drown out “bags unattended” announcements with the people’s mic. And in my family, you always pick up at the airport in person. We don’t mess around with this “I’ll meet you at home” business. At Quaid-e-Azam International Airport in Karachi, my uncle and grandfather would meet us on the tarmac.
Cicadas might be a pest, but they’re special in a few respects. For one, these droning insects have a habit of emerging after a prime number of years (7, 13, or 17). They also feed exclusively on plant sap, which is strikingly low in nutrients. To make up for this deficiency, cicadas depend on two different strains of bacteria that they keep cloistered within special cells, and that provide them with additional amino acids. All three partners – the cicadas and the two types of microbes – have evolved in concert, and none could survive on its own.

These organisms together make up what’s known as a holobiont: a combination of a host, plus all of the resident microbes that live in it and on it. The concept has taken off within biology in the past 10 years, as we’ve discovered more and more plants and animals that are accompanied by a jostling menagerie of internal and external fellow-travellers. Some of the microorganisms kill each other with toxins, while others leak or release enzymes and nutrients to the benefit of their neighbours. As they compete for space and food, cohabiting microbes have been found to affect the nutrition, development, immune system and behaviour of their hosts. The hosts, for their part, can often manipulate their resident microbiota in many ways, usually via the immune system.
You yourself are swarming with bacteria, archaea, protists and viruses, and might even be carrying larger organisms such as worms and fungi as well. So are you a holobiont, or are you just part of one? Are you a multispecies entity, made up of some human bits and some microbial bits – or are you just the human bits, with an admittedly fuzzy boundary between yourself and your tiny companions? The future direction of medical science could very well hinge on the answer.
The American evolutionary theorist Lynn Margulis, who popularised the theory of symbiosis, first coined the term ‘holobiont’ in 1991. She was interested in long-term, tightly integrated associations such as those evident in lichens – the crusty-looking growths found on rocks and trees, made up of fungus conjoined with algae. Margulis thought that there was a tight analogy between an egg and a sperm coming together to form a new organism, and the coming together of two species to form a new symbiotic consortium, which she called a holobiont.
“Look at me when I’m talking to you! You’re telling me that my assault doesn’t matter. That what happened to me doesn’t matter!” Those anguished words came from Maria Gallagher, who, along with Ana Maria Archila, confronted Senator Jeff Flake after he announced on Friday morning that he would vote to confirm Brett Kavanaugh to the Supreme Court, as Mr. Flake stood in a Capitol Hill elevator that he clearly wished could transport him far, far away.
…Whatever happens next, Republican lawmakers ought to tread carefully. They thus far have not covered themselves in glory in their handling of the allegations against Judge Kavanaugh. This brief pause provides them with an opportunity to start repairing some of that damage, to try to come across as — and maybe even to actually be — more interested in the truth than in shoving through their nominee regardless of it.

As they try to figure out how best to move forward, they would do well to keep something in mind: Women are watching.

Many women have been eyeing the Republican Party with growing unease since it was taken over by Donald Trump, whose retrograde views on gender are straight out of the 1950s — or maybe the 1590s. Confronted with serious and credible allegations against Judge Kavanaugh, Republican lawmakers could have seized the moment to reassure anxious women that they realized some issues transcend partisanship. Instead, they failed, quite spectacularly, to rise to the occasion — turning their furious defense of the nominee into an illuminating microcosm of the disregard and disrespect for women that have become hallmarks of Mr. Trump’s Republican Party.

As the Kavanaugh nightmare took form, women watched in dismay as Republican lawmakers worked to discredit Dr. Blasey by suggesting that she was either hopelessly confused, a political pawn or a liar. They watched in disbelief as Republicans repeatedly declined to call for an independent investigation into Dr. Blasey’s allegations, much less the subsequent ones brought by Deborah Ramirez and Julie Swetnick. They watched in frustration as Republicans failed to call material witnesses or outside experts to testify.
…Women have not simply been watching. They’ve been preparing their response. That response may come in 2018 or in 2020. But it will come. And, without a course correction far more dramatic than the frantic shuffling spurred by Mr. Flake’s 11th-hour pang of conscience, the damage Republican lawmakers are doing to their party could last for decades.
In 2016, Russian-born novelist Gary Shteyngart took a bus ride. It was an uncertain time in American culture and politics, and Shteyngart, who’s won a devoted following for his bestsellers “Absurdistan,” “Super Sad True Love Story” and the memoir “Little Failure,” didn’t know what would come of the journey. Two years later, he’s emerged with not just the first big novel of the post-Obama era, but the first truly great novel of it.
Like Shteyngart, Barry Cohen, the protagonist of “Lake Success,” is a New Yorker who finds himself moved to explore the country via Greyhound — aka “the Hound.” But Barry is a crazy rich hedge fund manager, a father of a recently diagnosed autistic preschooler, and a man on an impulsive mission to connect with the person he once was. The book is at once a picaresque tale of Barry’s travels and a domestic novel of his wife Seema’s simultaneous odyssey on the home front. In trademark Shteyngart fashion, it is very funny. And it is likewise deeply human and warm. Without giving anything away, when the ending arrives too soon, it’s both surprising and somehow beautifully inevitable.
Transhumanism (also abbreviated as H+) is a philosophical movement which advocates for technology not only to enhance human life but to take it over, merging human and machine. The idea is that one day humans will be vastly more intelligent, healthy, and physically powerful. In fact, much of the movement rests on the notion that death need not be inevitable, with a focus on improving the somatic body and making humans immortal.
Certainly, there are those in the movement who espouse its most extreme ambitions, such as replacing perfectly healthy body parts with artificial limbs. But medical ethicists raise this and other issues as reasons why transhumanism is so dangerous: there are virtually no checks and balances over who decides what counts as acceptable life-enhancement, or who gets a say when we “go too far.” For instance, Kevin Warwick of Coventry University, a cybernetics expert, asked the Guardian, “What is wrong with replacing imperfect bits of your body with artificial parts that will allow you to perform better – or which might allow you to live longer?” while another doctor stated that he would have “no part” in such surgeries. There is, after all, a difference between placing a pacemaker or performing laser eye surgery to prolong human life and lend it a greater degree of quality, and treating the human body as a tabula rasa upon which to rewrite what is, effectively, the natural course of human life.
On 24 September 1599, while William Shakespeare was mulling over a draft of Hamlet in his house downriver from the Globe in Southwark, a mile to the north a motley group of Londoners were gathering in a half-timbered Tudor hall. The men had come together to petition the ageing Elizabeth I, then a bewigged and painted sexagenarian, to start up a company “to venter in a voiage to ye Est Indies”.
The East India Company quickly grew into the world’s first and most powerful multinational corporation, and the one that, more than any other in history, would transform not just patterns of global trade but the globe itself. Before long a mere handful of businessmen from a distant island on the rim of Europe had made themselves masters of a subcontinent whose inhabitants numbered 50 to 60 million. They succeeded the mighty Mughal empire, where even minor provincial nawabs and governors ruled over vast areas, larger in both size and population than the biggest countries of Europe, so reversing the balance of trade that from Roman times on had drained western bullion eastwards.
Over the course of three and a half centuries, a whole British colonial world was founded to exploit and administer these conquests, a world with its own peculiar argot, its own institutions, its idiosyncratic snobberies and social hierarchies, its own educational establishments and career paths – an empire within an empire. When the British finally left India in 1947, nearly 350 years after the founding of the East India Company, that world dissolved overnight. Perhaps it is only possible now, more than 70 years later, in an age when the imperial British feel almost as distant a part of history as the imperial Romans, for this expatriate society to receive the particular attention that its idiosyncrasy deserves.