Koshik the Elephant Can Speak Korean

Becky Crew in Scientific American:

Upstaged spectacularly by a young Beluga whale that can sort of speak human, an Asian elephant named Koshik can also imitate human speech, but in Korean, using his trunk.

Captive-born in 1990 and transferred to South Korea’s Everland Zoo three years later, Koshik lived with two female Asian elephants for a couple of years before being kept completely alone for the following seven years. During this time, he showed a keen interest in learning several spoken commands, and by August 2004, when he was 14 years old and about to reach sexual maturity, his trainers noticed that he was attempting to imitate their speech.

It’s not known if this was the first time Koshik imitated human speech, or if he’d started doing it earlier and his trainers hadn’t noticed, but there’s a good chance he started because, for a long stretch of his formative years, the only social interaction he had was with humans.

Isolation from conspecifics has led to speech imitation in a number of unlikely animals, such as Hoover the harbour seal (Phoca vitulina) from Maine, who in 1976 showed an ability to imitate human speech. Hoover was found as an orphaned pup, and was hand-reared by locals before being transferred at three months old to the New England Aquarium. Here he shared an exhibit pool with other harbour seals, but he was the oldest male for most of his life.

Science Fictions

Philip Ball in Aeon:

Scientists can be notoriously dismissive of other disciplines, and one of the subjects that suffers most at their hands is history. That suggestion will surprise many scientists. ‘But we love history!’ they’ll cry. And indeed, there is no shortage of accounts from scientists of the triumphant intellectual accomplishments of Einstein, Darwin, Newton, Galileo, and so on. They name institutes and telescopes after these guys, making them almost secular saints of rationalism.

And that’s the problem. All too often, history becomes a rhetorical tool bent into a shape that serves science, or else a source of lively anecdote to spice up the introduction to a talk or a book. Oh, that Mendeleev and his dream of a periodic table, that Faraday forecasting a tax on electricity!

I don’t wish to dismiss the value of a bit of historical context. But it’s troubling that the love of a good story so often leads scientists to abandon the rigorous attitude to facts that they exhibit in their own work. Most worrisome of all is the way these tales from science history become shoehorned into a modern narrative — so that, say, the persecution of Galileo shows how religion is the enemy of scientific truth.

There’s no point getting too po-faced about the commandeering of Newton’s almost certainly apocryphal falling apple to represent science in the Paralympic opening ceremony. But what Newton’s definitive biographer Richard Westfall says about that story warns us how these populist fables can end up giving a distorted view of science. He says that it ‘vulgarises universal gravitation by treating it as a bright idea. A bright idea cannot shape a scientific tradition.’ Besides, how many of those munching apples at the ceremony could have explained why, if the moon is indeed just like an apple, the apple falls but the moon does not? Anecdote can anaesthetise thought rather than stimulate it.

An Interview with Maurice Sendak

Emma Brockes in The Believer:

THE BELIEVER: Do you miss the city, living out here?

MAURICE SENDAK: I really don’t like the city anymore. You get pushed and harassed and people grope you. It’s too tumultuous. It’s too crazy. I’m afraid of falling over in New York. People are all insane and talking on machines and twittering and twottering. All that. I’m here looking for peace and quiet. A yummy death.

BLVR: A yummy death?

MS: I’m just reading a book about Samuel Palmer and the ancients in England in the 1820s. You were so lucky to have William Blake. He’s lying in bed, he’s dying, and all the young men come—the famous engravers and painters—and he’s lying and dying, and suddenly he jumps up and begins to sing! “Angels, angels!” I don’t know what the song was. And he died a happy death. It can be done. [Lifts his eyebrows to two peaks] If you’re William Blake and totally crazy.

BLVR: You do some teaching out here?

MS: I have a fellowship that started last year, two men and two women living in a house, and I go over when they want me to critique, or whatever the hell. I just talk dirty. They’re nice people. Young. It’s probably not very original, but old artists like to have young artists around… to destroy. I’m joking. I really want to help them. But publishing is such an outrageously stupid profession. Or has become so.

BLVR: More so than it was?

MS: Well, nobody knows what they’re doing. I wonder if that’s always been true. I think being old is very fortunate right now. I want to get out of this as soon as possible. It’s terrible. And the great days in the 1950s and after the war, when publishing children’s books was youthful and fun… it really was. It’s not just looking back and pretending that it was good. It was good. And now it’s just stupid.

The Corporatization of Higher Education

Nicolaus Mills in Dissent:

In 2003, only two colleges charged more than $40,000 a year for tuition, fees, room, and board. Six years later more than two hundred colleges charged that amount. What happened between 2003 and 2009 was the start of the recession. By driving down endowments and giving tax-starved states a reason to cut back their support for higher education, the recession put new pressure on colleges and universities to raise their price.

When our current period of slow economic growth will end is anybody’s guess, but even when it does end, colleges and universities will certainly not be rolling back their prices. These days, it is not just the economic climate in which our colleges and universities find themselves that determines what they charge and how they operate; it is their increasing corporatization.

If corporatization meant only that colleges and universities were finding ways to be less wasteful, it would be a welcome turn of events. But an altogether different process is going on, one that has saddled us with a higher-education model that is both expensive to run and difficult to reform as a result of its focus on status, its view of students as customers, and its growing reliance on top-down administration. This move toward corporatization is one that the late University of Montreal professor Bill Readings noted sixteen years ago in his study, The University in Ruins, but what has happened in recent years far exceeds the alarm he sounded in the 1990s.

an adventure

Patrick Leigh Fermor, who died last year aged 96, had a facility for bringing together worlds usually considered incompatible. Here was a war hero who was also one of the great English prose stylists; who adored Greece and Britain with equal passion; and who was celebrated for his love of both high- and low-living. His masterpiece, A Time of Gifts (1977), an account of the first stage of his 1933-34 walk from the Hook of Holland to Constantinople (“like a tramp, a pilgrim, or a wandering scholar”) has his 18-year-old self moving from doss-houses to Danubian ducal fortresses: “There is much to recommend moving straight from straw to a four-poster,” he writes, “and then back again.”

more from William Dalrymple at the FT here.

biafra

The architects of Biafra were correct in their frustration with the Nigerian government, which did not intervene as thousands of Ibos were massacred. But they were deluding themselves that Biafra was viable. The nascent state had virtually no chance of survival once the authorities in Lagos decided they were going to stamp out the secession in what they called a “police action.” Was Biafra ever really a “country,” as Achebe would have it? It had ministries, oil wells, a ragtag army, an often-shifting capital, official cars (Achebe had one) and a famous airstrip. But as a “country,” it was stillborn. Nonetheless, for over two brutal years, the Biafran war dragged on at the insistence of Ojukwu — described as “brooding, detached and sometimes imperious” in a 1969 New York Times profile by Lloyd Garrison — and meddling international players. Hundreds of thousands of civilians were killed. As many as 6,000 a day starved to death once the federal government blockaded the ever diminishing Republic of Biafra. But Ojukwu refused to give up. The final death toll was estimated at between one and three million people.

more from Adam Nossiter at the NY Times here.

Inside the Centre: The Life of J Robert Oppenheimer

From The Telegraph:

It’s 11 years since Ray Monk’s biography of Bertrand Russell, a book which, like his earlier one of Ludwig Wittgenstein, pulled off the impressive feat of explaining the philosophy while rivetingly portraying the life. The subject of his new 780-page book, Inside the Centre: the Life of J Robert Oppenheimer, would seem to be an excellent fit: Oppenheimer was intellectually brilliant, his work arcane and personally he was a disaster – an “unintegrated” personality made up, his friend Isidor Rabi said, of “many bright, shining splinters”. Monk has called his book Inside the Centre because Oppenheimer, the son of rich, assimilated German Jewish parents – the classic insider-outsider – had a talent for putting himself at the centre of things: at the birth of particle physics at the University of Göttingen in the Twenties, at the creation of the atom bomb, and as director of Princeton’s Institute for Advanced Study where he gathered about him the likes of Einstein and T S Eliot. He was also fascinated, throughout his adult life, by what lay within the centre of the atom.

Born in 1904 in New York into a tight-knit, cultured, liberal, philanthropic Jewish social circle, Oppenheimer was an exceptionally bright child. His parents were suffocatingly attentive. Monk describes an atmosphere that was melancholy, overprotective and short on “fun”. With a voracious appetite for, among other things, chemistry, French literature, modernist poetry, Hinduism and Sanskrit (which he taught himself), he didn’t discover physics until his second year at Harvard, blagging his way onto a postgraduate course in thermodynamics. He went on to the Rutherford laboratory in Cambridge and, aged 22, so impressed Max Born – orchestrator of the amazing advances in quantum mechanics taking place at the University of Göttingen in Germany – that the latter invited him there to collaborate. He returned to the United States from the cutting edge of theoretical physics in 1929 with the deliberate intention of building a school of physics in America to rival that in Europe. Within five years he had pretty much succeeded. In the late Thirties he made his most original scientific contribution: three articles, ignored at the time, in which he described what happened to collapsing stars and predicted the existence of black holes. Had he lived another three years – when the existence of neutron stars was confirmed – he would probably have received a Nobel Prize.

More here.

New tools reveal ‘new beginning’ in split-brain research

From PhysOrg:

Split-brain research has been conducted for decades, and scientists long ago showed that language processing is largely located in the left side of the brain. When words appear only in the left visual field—an area processed by the right side of the brain—the right brain must transfer that information to the left brain in order to interpret it. The new study at UC Santa Barbara shows that healthy test subjects respond less accurately when information is shown only to the right brain.

While hemispheric specialization is considered accurate, the new study sheds light on the highly complex interplay—with neurons firing back and forth between distinct areas in each half of the brain. The findings rely on extremely sensitive neuroscience equipment and analysis techniques from network science, a fast-growing field that draws on insights from sociology, mathematics, and physics to understand complex systems composed of many interacting parts. These tools can be applied to systems as diverse as earthquakes and brains. Fifty years ago, UC Santa Barbara neuroscientist Michael S. Gazzaniga moved the field forward when he was a graduate student at the California Institute of Technology and first author of a groundbreaking report on split-brain patients. The study, which became world-renowned, was published in the Proceedings of the National Academy of Sciences (PNAS) in August 1962. This week, in the very same journal, Gazzaniga and his team announced major new findings in split-brain research. The report is an example of the interdisciplinary science for which UC Santa Barbara is well known.

More here.

Sarah Losh, Romantic architect

What Sarah described as a “Lombardic” idiom could be at once northern and southern, regional and European, and it would honour the purity of the early Church. But it is the idiosyncratic decorative scheme of the church, rather than its structural style, that makes her design so extraordinary. Sarah’s wide reading in Romantic literature had led her to see patterns of spiritual significance in nature, and she was attracted to the myths and cults that had revered natural rhythms of birth and death long before the advent of Christianity. Her church is steeped in a history that is not confined to the traditions of Anglicanism. Lotus flowers represent light and creation, while the pomegranate symbolizes regeneration. The pulpit was made from bog oak, thousands of years old; it was carved to resemble a fossilized tree, tracing a form of growth far older than the Church. The pine cone, which gives Uglow the title of her book, is to be found everywhere, as a recurrent emblem of eternal life. Like so much that caught Sarah’s imagination, it was both local and universal. The pine cone was a familiar object in the woods she owned, but it was also a symbol common to the Romans and Egyptians, and even to the Masons, who often used it to signify renewal in their ornate halls. It embodied the mysterious multiplicity of meaning that she valued most.

more from Dinah Birch at the TLS here.

three feet high and rising

The New York Academy of Sciences has already begun examining the viability of three massive floodgates near the mouth of New York Harbor, not unlike the Thames River floodgate that protects London today. Another floodgate has been proposed for the Potomac River just south of Washington, fending against tsunami-like surge tides from future mega storms. Plus there will be levees—everywhere. Imagine the National Mall, Reagan National Airport and the Virginia suburbs—all well below sea level—at the mercy of “trust-us-they’ll-hold” levees maintained by the Army Corps of Engineers. Oceans worldwide are projected to rise as much as three more feet this century—much higher if the Greenland ice sheet melts away. Intense storms are already becoming much more common. These two factors together will in essence export the plight of New Orleans, bringing the Big Easy “bowl” effect here to New York City and Washington, as well as to Charleston, Miami and other coastal cities. Assuming we want to keep living in these cities, we’ll have to build dikes and learn to exist beneath the surface of surrounding tidal bays, rivers and open seas—just like New Orleans.

more from Mark Tidwell at The Nation here.

it’s climate change

Hurricane Sandy has emboldened more scientists to directly link climate change and storms, without the hedge. On Monday, as Sandy came ashore in New Jersey, Jonathan Foley, director of the Institute on the Environment at the University of Minnesota, tweeted: “Would this kind of storm happen without climate change? Yes. Fueled by many factors. Is [the] storm stronger because of climate change? Yes.” Raymond Bradley, director of the Climate Systems Research Center at the University of Massachusetts, was quoted in the Vancouver Sun saying: “When storms develop, when they do hit the coast, they are going to be bigger and I think that’s a fair statement that most people could sign onto.” A recent, peer-reviewed study published by several authors in the Proceedings of the National Academy of Sciences concludes: “The largest cyclones are most affected by warmer conditions and we detect a statistically significant trend in the frequency of large surge events (roughly corresponding to tropical storm size) since 1923.”

more from Mark Fischetti at Scientific American here.

Mapping the Art Genome

From Smithsonian:

If you’re not familiar, Pandora is a Web site that takes a visitor’s preference for an individual musician or song and creates a personalized radio station to fit his or her taste. If you like the Beatles’ “Paperback Writer,” you may also like “Ruby Tuesday” by The Rolling Stones, for instance, or “I Can’t Explain” by The Who. With Art.sy, a visitor can enter an artist, artwork, artistic movement or medium into a search bar and the site will generate a list of artists and works that have been deemed related in some way. “There are a lot of people who may know who Warhol is, but they have no idea who Ray Johnson is. The ability to make those connections is what this is about,” said Cwilich, Art.sy’s Chief Operating Officer, on a recent segment of The Takeaway with John Hockenberry.

The endeavor is a true collaboration between computer scientists and art historians. (This is even evident in Art.sy’s leadership. Cleveland, Art.sy’s 25-year-old chief executive officer, is a computer science engineer, and Cwilich is a former executive from Christie’s Auction House.) To create a Web site that could generate fine-art recommendations, the Art.sy team had to first tackle the Art Genome Project. Essentially, a number of art historians have identified 800-and-counting “genes,” or characteristics, that apply to different pieces of art. These genes are words that describe the medium being used, the artistic style or movement, a concept (i.e., war), content, techniques and geographic regions, among other things. All the images that are tagged with a specific gene—say, “American Realism” or “Isolation/Alienation”—are then linked within the search technology.
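The gene-tagging scheme described above amounts to a simple inverted lookup: each artwork carries a set of "genes," and two works are related when their gene sets overlap. Here is a minimal sketch of that idea in Python; the artwork titles, gene names, and the `related` function are hypothetical illustrations, not Art.sy's actual data, weighting, or API.

```python
# Hypothetical toy catalog: each artwork maps to a set of "genes"
# (characteristics), loosely in the spirit of the Art Genome Project.
artworks = {
    "Warhol, Campbell's Soup Cans": {"Pop Art", "Repetition", "Consumerism"},
    "Ray Johnson, Untitled (Collage)": {"Pop Art", "Collage", "Mail Art"},
    "Hopper, Nighthawks": {"American Realism", "Isolation/Alienation"},
}

def related(query, catalog):
    """Rank other artworks by how many genes they share with the query."""
    genes = catalog[query]
    scored = [
        (len(genes & other_genes), title)  # set intersection = shared genes
        for title, other_genes in catalog.items()
        if title != query
    ]
    # Highest overlap first; drop works that share no genes at all.
    return [title for score, title in sorted(scored, reverse=True) if score > 0]

print(related("Warhol, Campbell's Soup Cans", artworks))
# → ['Ray Johnson, Untitled (Collage)']
```

In this toy version the Warhol leads a visitor to Ray Johnson through their shared "Pop Art" gene, which is exactly the kind of connection Cwilich describes; a production system would presumably weight genes rather than count them equally.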

More here.

Memento mori: it’s time we reinvented death

From NewScientist:

It's said that when a general returned in glory to ancient Rome, he was accompanied in his procession through the streets by a slave whose job it was to remind him that his triumph would not last forever. “Memento mori,” the slave whispered into the general's ear: “remember you will die”. The story may be apocryphal, but the phrase is now applied to art intended to remind us of our mortality – from the Grim Reaper depicted on a medieval clock to Damien Hirst's bejewelled skull. As if we needed any reminder. While few of us know exactly when death will come, we all know that eventually it will. It's usual to talk about death overshadowing life, and the passing of loved ones certainly casts a pall over the lives of those who remain behind. But contemplating our own deaths is one of the most powerful forces in our lives for both good and ill (see “Death: Why we should be grateful for it“) – driving us to nurture relationships, become entrenched in our beliefs, and construct Ozymandian follies.

In this, we are probably unique. Most animals seem to have hardly any conception of mortality: to them, a dead body is just another object, and the transition between life and death unremarkable. We, on the other hand, tend to treat those who have passed away as “beyond human”, rather than “non-human” or even “ex-human”. We have developed social behaviours around the treatment of the dead whose complexity far exceeds even our closest living relatives' cursory interest in their fallen comrades. Physical separation of the living from the dead may have been one of the earliest manifestations of social culture (see “Death: The evolution of funerals“); today, the world's cultures commemorate and celebrate death in ways ranging from solemn funerals to raucous carnivals. So you could say that humans invented death – not the fact of it, of course, but its meaning as a life event imbued with cultural and psychological significance. But even after many millennia of cultural development, we don't seem to be sure exactly what it is we've invented. The more we try to pin down the precise nature of death, the more elusive it becomes; and the more elusive it becomes, the more debatable our definitions of it (see “Death: The blurred line between dead and alive“).

More here.

Peter Singer on Roe v. Wade, Obamacare, Romney

John Horgan talks to Singer in Scientific American:

Singer’s analysis of abortion surprised me. First of all, he agreed with many pro-lifers that a fetus, even at six weeks, is a “living human being.” [See postscript below] He showed us slides of fetuses, because we should not “run away from what abortion is.”

Singer nonetheless believes that abortion is ethical, because even a viable fetus is not a rational, self-aware person with desires and plans, which would be cut short by death; hence it should not have the same right as humans who have such qualities. Abortion is also justified, Singer added, both as a female right and as a method for curbing overpopulation.

Singer further surprised me, and showed his meta-commitment to democracy and reason, when he said that he, like Mitt Romney and his running mate Paul Ryan, disliked Roe v. Wade. That 1973 Supreme Court decision, Singer felt, provides a flimsy rationale for abortion and has corrupted the process whereby Supreme Court Justices are chosen. Ideally, Singer said, voters rather than unelected judges should determine the legal status of abortion. Singer nonetheless acknowledged that if Roe v. Wade is overturned, some states might outlaw or severely restrict abortion. “I’m torn,” he admitted.

Growing the Hell Up: From Middle Earth to NJ

Richard Wolinsky Interviews Junot Díaz in Guernica:

Richard Wolinsky: What prompted you to write realistic fiction then, rather than go directly into science fiction, particularly since people like Octavia Butler and Chip Delany were using science fiction in a social context? And Le Guin as well.

Junot Díaz: Well, I can’t speak for any of those three, but they seem to come out of traditions where there had been an enormous body of work about the reality that they’d come out of. Even Samuel R. Delany, Butler, Ursula Le Guin: we already had dozens of books about the African American experience, about the African American experience at Harlem, about the woman’s feminist/intellectual experience. So I think that in some ways there was a certain amount of freedom. Now I’m just sort of thinking aloud.

Whereas, I come at a time where the Dominican diasporic experience was completely non-present. It had been almost barely narrativized. And I felt like my Middle Earth, the world that I was going to retrieve, the world that I was going to create, was this world that no one had ever ever encountered, that no one had ever voyaged to. All the skills that I learned in science fiction and fantasy for world building, I brought to bear in building the world of Yunior de la Casa’s family. Building the world of the Dominican Republic in the ’50s, ’60s, ’70s, ’80s, ’90s, and 2000. And in many ways, I feel that no more is Dhalgren a constructed world than The Brief Wondrous Life of Oscar Wao.

Catastrophism: The Pseudoscience Wars

Steven Shapin reviews Michael Gordin's The Pseudoscience Wars: Immanuel Velikovsky and the Birth of the Modern Fringe, in the LRB:

Fifteen hundred years before the birth of Christ, a chunk of stuff blew off the planet Jupiter. That chunk soon became an enormous comet, approaching Earth several times around the period of the exodus of the Jews from Egypt and Joshua’s siege of Jericho. The ensuing havoc included the momentary stopping and restarting of the Earth’s rotation; the introduction into its crust of organic chemicals (including a portion of the world’s petroleum reserves); the parting of the Red Sea, induced by a massive electrical discharge from the comet to Earth; showers of iron dust and edible carbohydrates falling from the comet’s tail, the first turning the waters red and the second nourishing the Israelites in the desert; and plagues of vermin, either infecting Earth from organisms carried in the comet’s tail or caused by the rapid multiplication of earthly toads and bugs induced by the scorching heat of cometary gases. Eventually, the comet settled down to a quieter life as the planet Venus, which, unlike the other planets, is an ingénue at just 3500 years old. Disturbed by the new girl in the neighbourhood, Mars too began behaving badly, closely encountering Earth several times between the eighth and seventh centuries BCE; triggering massive earthquakes, lava flows, tsunamis and atmospheric fire storms; causing the sudden extinction of many species (including the mammoth); shifting Earth’s spin axis and relocating the North Pole from Baffin Island to its present position; and abruptly changing the length of the terrestrial year from 360 to its present 365¼ days. There were also further shenanigans involving Saturn and Mercury.

If this story makes you feel even the slightest stab of recognition, you’re probably at least fifty years old, because it’s a summary of the key ideas in Immanuel Velikovsky’s Worlds in Collision. Published in New York in 1950, the book is now almost forgotten, but it was one of the greatest cultural sensations of the Cold War era. Before it was printed, it was trailed in magazines, and immediately shot onto the American bestseller lists, where it stayed for months, grabbing the attention and occupying the energies of both enthusiasts and enraged critics. The brouhaha subsided after a few years, but the so-called Velikovsky affair erupted with greater violence in the late 1960s and early 1970s, when the author gathered a gaggle of disciples and lectured charismatically (and at times incomprehensibly) to large and enraptured campus audiences. Velikovsky’s story was chewed over by philosophers and sociologists convinced of its absurdity, some trying to find standards through which one could securely establish the grounds of its obvious wrong-headedness, others edgily exploring the radical possibility that no such standards existed and reflecting on what that meant for so-called demarcation criteria between science and other forms of knowledge.

Some Notes on the Novella

Ian McEwan in The New Yorker:

When a character in my recent book, “Sweet Tooth,” publishes his short first work of fiction, he finds some critics are suggesting that he has done something unmanly or dishonest. His experience reflects my own. A novella? Perhaps you don’t have the necessary creative juice. Isn’t the print rather large, aren’t the lines too widely spaced? Perhaps you’re trying to pass off inadequate goods and fool a trusting public. Composers, including those of the highest rank, have never had such problems of scale. Who doubts the greatness of Beethoven’s piano sonatas and string quartets or of Schubert’s songs? Some, like me, prefer them to the symphonies of either man. Who could harden his heart against the intimate drama of Mozart’s G minor trio, or not lose himself in the Goldberg variations or not stand in awe of the D minor Chaconne played on a lonesome violin?

Strangely, the short story never arouses suspicion of short-changing, probably because the form is so fundamentally different from the novel. I believe the novella is the perfect form of prose fiction. It is the beautiful daughter of a rambling, bloated ill-shaven giant (but a giant who’s a genius on his best days). And this child is the means by which many first know our greatest writers. Readers come to Thomas Mann by way of “Death in Venice,” Henry James by “The Turn of the Screw,” Kafka by “Metamorphosis,” Joseph Conrad by “Heart of Darkness,” Albert Camus by “L’Etranger.” I could go on: Voltaire, Tolstoy, Joyce, Solzhenitsyn. And Orwell, Steinbeck, Pynchon. And Melville, Lawrence, Munro. The tradition is long and glorious. I could go even further: the demands of economy push writers to polish their sentences to precision and clarity, to bring off their effects with unusual intensity, to remain focussed on the point of their creation and drive it forward with functional single-mindedness, and to end it with a mind to its unity. They don’t ramble or preach, they spare us their quintuple subplots and swollen midsections.

More here.

Fairly Simple Math Could Bridge Quantum Mechanics and General Relativity

From Scientific American:

Could an analysis based on relatively simple calculations point the way to reconciling the two most successful — and stubbornly distinct — branches of modern theoretical physics? Frank Wilczek and his collaborators hope so. The task of aligning quantum mechanics, which deals with the behaviour of fundamental particles, with Einstein’s general theory of relativity, which describes gravity in terms of curved space-time, has proved an enormous challenge. One of the difficulties is that neither is adequate to describe what happens to particles when the space-time they occupy undergoes drastic changes — such as those thought to occur at the birth of a black hole. But in a paper posted to the arXiv preprint server on 15 October (A. D. Shapere et al. http://arxiv.org/abs/1210.3545; 2012), three theoretical physicists present a straightforward way for quantum particles to move smoothly from one kind of ‘topological space’ to a very different one.

The analysis does not model gravity explicitly, and so is not an attempt to formulate a theory of ‘quantum gravity’ that brings general relativity and quantum mechanics under one umbrella. Instead, the authors, including Nobel laureate Frank Wilczek of the Massachusetts Institute of Technology (MIT) in Cambridge, suggest that their work might provide a simplified framework for understanding the effects of gravity on quantum particles, as well as describing other situations in which the spaces that quantum particles move in can radically alter, such as in condensed-matter-physics experiments. “I’m pretty excited,” says Wilczek. “We have to see how far we can push it.” The idea is attracting attention not only because of the scope of its possible applications, but because it is based on undergraduate-level mathematics. “Their paper starts with the most elementary framework,” says Brian Greene, a string theorist at Columbia University in New York. “It’s inspiring how far they can go with no fancy machinery.” Wilczek and his co-authors set up a hypothetical system with a single quantum particle moving along a wire that abruptly splits into two. The stripped-down scenario is effectively the one-dimensional version of an encounter with ripped space-time, which occurs when the topology of a space changes radically.

More here.