Jack Dunitz (1923-2021): Chemist and writer extraordinaire

by Ashutosh Jogalekar

Jack Dunitz during a student outing at Caltech in 1948 (Image credit: OSU Special Collections)

Every once in a while there is a person of consummate achievement in a field, a person who, while widely known to workers in that field, is virtually unknown outside it and whose achievements deserve to be known much better. One such person in the field of chemistry was Jack Dunitz. Over his long life of 98 years Dunitz inspired chemists across varied branches of chemistry. Many of his papers inspired me when I was in college and graduate school, and if the mark of a good scientific paper is that you find yourself regularly quoting it without even realizing it, then Dunitz's papers have few rivals.

Two rare qualities in particular made Dunitz stand out: simple thinking that extended across chemistry, and clarity of prose. He was the master of the semi-quantitative argument. Most scientists, especially in this day and age, are specialists who rarely venture outside their narrow areas of expertise. And it is even rarer to find scientists – in any field – who wrote with the clarity that Dunitz did. When he was later asked in an interview what led to his fondness for exceptionally clear prose, his answer was simple: “I was always interested in literature, and therefore in clear expression.” Which is as good a case for coupling scientific with literary training as I can think of.

Dunitz, who was born in Glasgow and received his PhD there in 1947, had both the talent and the good fortune to have been trained by three of the best chemists and crystallographers of the 20th century: Linus Pauling, Dorothy Hodgkin and Leopold Ruzicka, all Nobel Laureates. In my personal opinion, Dunitz could easily have qualified for a kind of lifetime achievement Nobel himself. Though a generalist, Dunitz made the science and art of x-ray crystallography his speciality, and few could match his acumen in the application of this tool to structural chemistry. Read more »

Grand Observations: Darwin and Fitzroy

by Mark Harvey

Captain Robert Fitzroy

One of the artifacts of modern American culture is the digital clutter that crowds our minds and crowds our days. I’m old enough to have grown up in the era before even answering machines and the glorification of fast information. It’s an era that’s hard to remember because like most Americans, I’ve gotten lost in the sea of immediate “content” and the vast body of information at our fingertips and on our phones. While it’s a delicious feeling to be able to access almost every bit of knowledge acquired by humankind over the last few thousand years, I suspect the resulting mental clutter has in many ways made us just plain dumber. Our little brains can absorb and process a lot of information but digesting the massive amount of data available nowadays has some of our minds resembling the storage units of hoarders: an unholy mess of useless facts and impressions guarded in a dark space with a lost key.

If you consider who our “wise men” and “wise women” are these days, they sure seem dumber than men and women of past centuries. I guess some of them are incredibly clever when it comes to computers, material science, genetic engineering, and the like. But when it comes to big-picture thinking, even the most glorified billionaires just seem foolish. And our batch of politicians even more so.

It’s hard to know the shape and content of the human mind in our millions of years of development but the story goes that we’ve advanced in consciousness almost every century, with major advances in periods such as the Renaissance and the Enlightenment. That may be true for certain individuals but as a whole, it seems we drove right on past the bus stop of higher consciousness with our digital orgy and embryonic embrace of artificial intelligence. Are we losing the wonderful feeling, agency, and utility of uncluttered minds? Read more »

September 1, 1939: A tale of two papers

by Ashutosh Jogalekar

Scientific ideas can have a life of their own. They can be forgotten, lauded or reworked into something very different from their creators’ original expectations. Personalities and peccadilloes and the unexpected, buffeting currents of history can take scientific discoveries in very unpredictable directions. One very telling example of this is provided by a paper that appeared in the September 1, 1939 issue of the “Physical Review”, the leading American journal of physics.

The paper had been published by J. Robert Oppenheimer and his student, Hartland Snyder, at the University of California at Berkeley. Oppenheimer was then a 35-year-old professor and had been teaching at Berkeley for ten years. He was widely respected in the world of physics for his brilliant mind and remarkable breadth of interests ranging from left-wing politics to Sanskrit. He had already made important contributions to nuclear and particle physics. Over the years Oppenheimer had collected around him a coterie of talented students. Hartland Snyder was regarded as the best mathematician of the group.

Hartland Snyder (Image credit: Niels Bohr Library and Archives)

Oppenheimer and Snyder’s paper was titled “On Continued Gravitational Contraction”. It tackled the question of what happens when a star runs out of the material whose nuclear reactions make it shine. It postulated a bizarre, wondrous, wholly new object in the universe that must be created when massive stars die. Today we know that object as a black hole. Oppenheimer and Snyder’s paper was the first to postulate it (although an Indian physicist named Bishveshwar Datt had tackled a similar case before without explicitly considering a black hole). The paper is now regarded as one of the seminal papers of 20th century physics.

But when it was published, it sank like a stone. Read more »

A horror show of technological and moral failure

by Ashutosh Jogalekar

A B-29 dropping bombs over Japan. The drift of the bombs caused by the jet stream is apparent.

“Black Snow: Curtis LeMay, the Firebombing of Tokyo and the Road to the Atomic Bomb”, by James M. Scott

On the night of March 9, 1945, almost 300 B-29 bombers took off from Tinian Island in the Marianas. Over the next six hours, 100,000 civilians in Tokyo were burnt to death, possibly more than in any six-hour period in history. James Scott's "Black Snow" tells the story of this horrific event, which was both a technological and a moral failure. It is also the story of how moral failures can result from technological failures, a lesson that we should take to heart in an age when we understand technology less and less and morality perhaps even less.

The technological failure in Scott’s story is the failure of the most expensive technological project in World War 2, the B-29 bomber. The United States spent more than $3 billion on developing this wonder of modern technology, more than on the Manhattan Project. Soaring at 30,000 feet like an impregnable iron eagle, the B-29 was supposed to drop bombs with pinpoint precision on German and Japanese factories producing military hardware.

This precision bombing was considered not only a technological achievement but a moral one. Starting with Roosevelt's plea in 1939 after the Germans invaded Poland and started the war, it was the United States' policy not to indiscriminately bomb civilians. The preferred way, the moral way, was to do precision bombing during daytime rather than carpet bombing during nighttime. When the British, led by Arthur "Butcher" Harris, resorted to nighttime bombing using incendiaries, it was a moral watershed. Notoriously, in Hamburg in 1943 and Dresden in 1945, the British took advantage of the massive, large-scale fires caused by incendiaries to burn tens of thousands of civilians to death. Read more »

A Science Thanksgiving

by Ashutosh Jogalekar

The pistol shrimp (Image credit: Wired)

It’s Thanksgiving weekend here in the U.S., and there’s an informal tradition on Thanksgiving to give thanks for all kinds of things in our lives. Certainly there’s plenty to be thankful for this year, especially for those of us whose lives and livelihoods haven’t been personally devastated by the coronavirus pandemic. But I thought I would do something different this year. Instead of being thankful for life’s usual blessings, how about being thankful for some specific facts of nature and the universe that are responsible for our very existence and make it wondrous? Being employed and healthy and surrounded by family and friends is excellent, but none of that would be possible without the amazing unity and diversity of life and the universe. So without further ado and in no particular order, I present an entirely personal selection of ten favorites for which I am eternally thankful.

I am thankful for the value of the resonance level energy of the excited state of carbon-12: carbon-12, which is the basis of all organic life on earth, is formed in stars through the reaction of beryllium-8 with helium-4. The difference in energy between the starting materials (beryllium + helium) and this excited state of carbon-12 is only about 4% (a rough sketch of the numbers follows below). If this difference had been even slightly larger, the unstable beryllium-8 would have decayed long before it could be transmuted into carbon-12, making life impossible. Read more »
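To put rough numbers on that 4% figure (a back-of-the-envelope sketch using commonly quoted textbook values, so treat them as approximate): the relevant excited state of carbon-12, the so-called Hoyle state, lies at about

$$E\!\left({}^{12}\mathrm{C}^{*}\right) \approx 7.65\ \mathrm{MeV}$$

above the carbon-12 ground state, while the combined energy of beryllium-8 and helium-4 lies at about

$$E\!\left({}^{8}\mathrm{Be} + {}^{4}\mathrm{He}\right) \approx 7.37\ \mathrm{MeV}$$

above that same ground state. The resonance therefore sits only

$$\frac{7.65 - 7.37}{7.37} \approx 0.04,$$

or roughly 4%, above the energy of the colliding nuclei. Since beryllium-8 falls apart into two helium nuclei in about $10^{-16}$ seconds, a resonance sitting much further above this energy would make the capture ${}^{8}\mathrm{Be} + {}^{4}\mathrm{He} \rightarrow {}^{12}\mathrm{C}^{*}$ far too slow, and stars would make essentially no carbon.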

What Freeman Dyson taught the world

by Ashutosh Jogalekar

Freeman Dyson combined a luminous intelligence with a genuine sensitivity toward human problems that was unprecedented among his generation's scientists. In his contributions to mathematics and theoretical physics he was second to none in the 20th century, but in the range of his thinking and writing he was probably unique. He made seminal contributions to science, advised the U.S. government on critical national security issues and won almost every award for his contributions that a scientist could. His understanding of human problems found expression in elegant prose dispersed in an autobiography and in essays and book reviews in the New Yorker and other publications. Along with being a great scientist he was also a cherished friend and family man who raised six children. He was one of a kind. Those of us who could call him a friend, colleague or mentor were blessed.

Now there is a volume from MIT Press commemorating his remarkable mind that is a must-read for anyone who wants to appreciate the sheer diversity of ideas he generated and lives he touched. From spaceships powered by exploding nuclear bombs to the eponymous "Dyson spheres" that could be used by advanced alien civilizations to capture energy from their suns, from his seminal work in quantum electrodynamics to his unique theories for the origins of life, from advising the United States government to writing far-ranging books for the public that were in equal parts science and poetry, Dyson's roving mind ranged across the physical and human universe. All these aspects of his life and career are described by a group of well-known scientists and science writers, including his son George and his daughter Esther. Edited by the eminent physicist and historian of science David Kaiser, the volume brings it all together. I myself was privileged to write a chapter about Dyson's little-known but fascinating foray into the origins of life. Read more »

The root of diverse evil

by Ashutosh Jogalekar

Steven Weinberg

It wasn’t very long ago that I was rather enamored with the New Atheist movement, of which the most prominent proponent was Richard Dawkins. I remember having marathon debates with a religious roommate of mine in graduate school about religion as the “root of all evil”, as the producers of a documentary by Dawkins called it. Dawkins and his colleagues made the point that no belief system in human history is as all-pervasive in its ability to cause harm as religion.

My attitude toward religion started changing when I realized that what the New Atheists were criticizing wasn’t religion but a caricature of religion that was all about faith. Calling religion the “root of all evil” was also a bad public relations strategy since it opened up the New Atheists to obvious criticism – surely not all evil in history has been caused by religion? But the real criticism of the movement goes deeper. Just like the word ‘God’, the word ‘religion’ is a very broad term, and people who subscribe to various religions do so with different degrees of belief and fervor. For most moderately religious people, faith is a small part of their belonging to a religion; rather, it’s about community and friendship and music and literature and what we can broadly call culture. Many American Jews and American Hindus for instance call themselves cultural Jews or cultural Hindus.

My friend Freeman Dyson made this point especially well, and he strongly disagreed with Dawkins. One of Freeman's arguments, with which I still agree, was that people like Dawkins set up an antagonistic relationship between science and religion that makes it seem like the two are completely incompatible. Now, irrespective of whether the two are intellectually compatible or not, it's simply a fact that they aren't incompatible in practice, as evidenced by scores of scientists throughout history like Newton, Kepler and Faraday who were both undoubtedly great scientists and devoutly religious. These scientists satisfied one of the popular definitions of intelligence – the ability to simultaneously hold two opposing thoughts in one's mind. Read more »

As simple as possible, but no simpler

by Ashutosh Jogalekar

Physicists writing books for the public have faced a longstanding challenge. Either they can write purely popular accounts that explain physics through metaphors and pop culture analogies, but then risk oversimplifying key concepts, or they can get into a great deal of technical detail and risk making the book opaque to most readers without specialized training. All scientists face this challenge, but for physicists it's particularly acute because of the mathematical nature of their field. Especially if you want to explain the two towering achievements of physics, quantum mechanics and general relativity, you can't really get away from the math. It seems that physicists are stuck between a rock and a hard place: include math and, as the popular belief goes, risk cutting their readership in half with every equation; or exclude math and deprive readers of a deeper understanding. The big question for a physicist who wants to communicate the great ideas of physics to a lay audience without entirely skipping the technical detail is thus: is there a middle ground?

Over the last decade or so there have been a few books that have in fact tried to tread this middle ground. Perhaps the most ambitious was Roger Penrose's "The Road to Reality", which tried to encompass, in more than 800 pages, almost everything about mathematics and physics. Then there's the "Theoretical Minimum" series by Leonard Susskind and his colleagues, which, in three volumes (and an upcoming fourth one on general relativity), tries to lay down the key principles of all of physics. But both Penrose's and Susskind's volumes, as rewarding as they are, require a substantial time commitment on the part of the reader, and both at some point become comprehensible only to specialists.

If you are trying to find a short treatment of the key ideas of physics that is genuinely accessible to pretty much anyone with a high school math background, you would be hard-pressed to do better than Sean Carroll’s upcoming “The Biggest Ideas in the Universe”. Since I have known him a bit on social media for a while, I will refer to Sean by his first name. “The Biggest Ideas in the Universe” is based on a series of lectures that Sean gave during the pandemic. The current volume is the first in a set of three and deals with “space, time and motion”. In short, it aims to present all the math and physics you need to know for understanding Einstein’s special and general theories of relativity. Read more »

Should a scientist have faith?

by Ashutosh Jogalekar

Niels Bohr took a classic leap of faith when postulating the quantum atom (Image: Atomic Heritage Foundation)

Scientists like to think that they are objective and unbiased, driven by hard facts and evidence-based inquiry. They are proud of saying that they only go wherever the evidence leads them. So it might come as a surprise to realize that not only are scientists as biased as non-scientists, but that they are often driven as much by belief as are non-scientists. In fact they are driven by more than belief: they are driven by faith. Science. Belief. Faith. Seeing these words in the same sentence might be enough to make most scientists bristle and want to throw something at the wall or at the writer of this piece. Surely you aren't painting us with the same brush that you might use for those who profess religious faith, they might say?

But there’s a method to the madness here. First consider what faith is typically defined as – it is belief in the absence of evidence. Now consider what science is in its purest form. It is a leap into the unknown, an extrapolation of what is into what can be. Breakthroughs in science by definition happen “on the edge” of the known. Now what sits on this edge? Not the kind of hard evidence that is so incontrovertible as to dispel any and all questions. On the edge of the known, the data is always wanting, the evidence always lacking, even if not absent. On the edge of the known you have wisps of signal in a sea of noise, tantalizing hints of what may be, with never enough statistical significance to nail down a theory or idea. At the very least, the transition from “no evidence” to “evidence” lies on a continuum. In the absence of good evidence, what does a scientist do? He or she believes. He or she has faith that things will work out. Some call it a sixth sense. Some call it intuition. But “faith” fits the bill equally.

If this reliance on faith seems like heresy, perhaps it’s reassuring to know that such heresies were committed by many of the greatest scientists of all time. All major discoveries, when they are made, at first rely on small pieces of data that are loosely held. A good example comes from the development of theories of atomic structure. Read more »

The meandering march of progress

by Ashutosh Jogalekar

Fossils and artist’s rendering of Ardipithecus ramidus, the bipedal ape (Photo credit: WIRED magazine)

Like other parents, we were delighted when our daughter started walking a few months ago. But just like other parents, we find it impossible to pinpoint when she went from scooting to crawling to speed-walking for a few steps before becoming unsteady again, and finally to steady walking. It's not possible because no such sudden moment exists in time. Like most other developmental milestones, walking lies on a continuum, and although the rate at which walking in a baby develops is uneven, it still happens along a continuous trajectory, going from being just one component of a locomotion toolkit to being the dominant one.

As paleoanthropologist and anatomist Jeremy DeSilva describes in his book "First Steps", this gradual transition mirrors our species's evolution toward becoming the upright ape. Just like most other human faculties, it was on a continuum. DeSilva's book is a meditation on the how, the when and the why of that signature human quality of bipedalism, which along with cooking, big brains, hairlessness and language has to be considered one of the great evolutionary innovations in our long history. Describing the myriad ins and outs of various hominid fossils and their bony structures, DeSilva tells us how occasional walking in trees was an adaptation that developed as early as 15 million years ago, long before humans and chimps split off about 6 million years ago. In fact one striking theory that DeSilva describes blunts the familiar, popular picture of the transition from knuckle-walking ape to confident upright human (sometimes followed by a hunched figure at a computer) that lines the walls of classrooms and museums; according to this theory, rather than knuckle-walking transitioning to bipedalism, knuckle-walking in fact came after a primitive form of bipedalism on trees developed millions of years earlier. Read more »

Justification and the Value-Free Ideal in Science

by Fabio Tollon

One of the cornerstones of good science is that its results furnish us with an objective understanding of the world. That is, science, when done correctly, tells us how the world is, independently of how we might feel the world to be (based, for example, on our values or commitments). It is thus central to science, and its claims to objectivity, that values do not override facts. An important feature of this view of science is the distinction between epistemic and non-epistemic values. Simply put, epistemic values are those which would seem to make for good science: external coherence, explanatory power, parsimony, etc. Non-epistemic values, on the other hand, concern things like our value judgements, biases, and preferences. In order for science to work well, so the story goes, it should only be epistemic values that come to matter when we assess the legitimacy of a given scientific theory (this is often termed the "value-free ideal"). Thus, a central presupposition underpinning this value-free ideal is that we can in fact mark a distinction between epistemic and non-epistemic values. Unfortunately, as with most things in philosophy, things are not that simple.

The first thing to note is the various ways that the value-free ideal plays out in the contexts of discovery, justification, and application. With respect to the context of discovery, it doesn't seem to matter if we find that non-epistemic values are operative. While decisions about funding lines, the significance we attach to various theories, and the choice of questions we might want to investigate are all important insofar as they influence where we might choose to look for evidence, they do not determine whether the theories we come up with are valid or not.

Similarly, in the context of application, we could invoke the age-old is-ought distinction: scientific theories cannot justify value-laden beliefs. For example, even if research shows that taller people are more intelligent, it would not follow that taller people are more valuable than shorter people. Such a claim would depend on the value that one ascribes to intelligence beforehand. Therefore, how we go about applying scientific theories is influenced by non-epistemic values, and this is not necessarily problematic.

Thus, in both the context of application and the context of discovery, we find non-epistemic values to be operative. This, however, is not seen as much of a problem, so long as these values do not "leak" into the context of justification, as it is here that science's claims to objectivity are preserved. Is this really possible in practice, though? Read more »

Should we Disregard the Norms of Assertion in Inter-scientific Discourse? A Response to a False Dilemma

by George Barimah, Ina Gawel, David Stoellger, and Fabio Tollon*

"Assertion" by Ina Gawel
“Assertion” by Ina Gawel

When thinking about the claims made by scientists you would be forgiven for assuming that such claims ought to be true, justified, or at the very least believed by the scientists themselves. When scientists make assertions about the way they think the world is, we expect these assertions to be, on the balance of things, backed up by the local evidence in that field.

The general aim of scientific investigation is to uncover the truth of the matter: in physics, this might involve discovering a new particle, or realizing that what we once thought was a particle is in fact a wave, for example. This process, however, is a collective one. Scientists are not lone wolves who isolate themselves from other researchers. Rather, they work in coordinated teams, which are embedded in institutions, which have a specific operative logic. Thus, when an individual scientist "puts forward" a claim, they are making this claim to a collection of scientists, those being other experts in their field. These are the kinds of assertions that Haixin Dang and Liam Kofi Bright deal with in a recent publication: what are the norms that govern inter-scientific claims (that is, claims between scientists)? When scientists assert that they have made a discovery they are making a public avowal: these are "utterances made by scientists aimed at informing the wider scientific community of some results obtained". The "rules of the game" when it comes to these public avowals (such as the process of peer review) presuppose that there is indeed a fact of the matter concerning which kinds of claims are worthy of being brought to the collective attention of scientists. Some assertions are proper and others improper, and there are various norms within scientific discourse that help us make such a determination.

According to Dang and Bright, we can distinguish three clusters of norms when it comes to norms of assertion more generally. First, we have factive norms, the most famous of which is the knowledge norm, which essentially holds that assertions are only proper if they are true. Second, we have justification norms, which focus on the reason-responsiveness of agents. That is, can the agent provide reasons for believing their assertion? Last, there are belief norms. Belief norms suggest that for an assertion to be proper it simply has to be the case that the speaker sincerely believes their assertion. Each norm corresponds to one of the conditions introduced at the beginning of this article, and this correspondence seems to naturally support the view that scientists should maintain at least one (if not all) of these norms when making assertions in their research papers. The purpose of Dang and Bright's paper, however, is to show that each of these norms is inappropriate in the case of inter-scientific claims. Read more »

The ethics of regulating AI: When too much may be bad

by Ashutosh Jogalekar

‘Areopagitica’ was a famous polemic, written in the form of a speech, that the poet John Milton addressed to the English Parliament in 1644, arguing for the unlicensed printing of books. It remains one of the most famous defenses of freedom of expression. Milton was arguing against a parliamentary ordinance requiring authors to get a license for their works before they could be published. Writing at the height of the English Civil War, Milton was well aware of the power of words to inspire as well as incite. He said,

For books are not absolutely dead things, but do preserve as in a vial the purest efficacy and extraction of that living intellect that bred them. I know they are as lively, and as vigorously productive, as those fabulous Dragon’s teeth; and being sown up and down, may chance to spring up armed men…

What Milton was saying is not that books and words can never incite, but that it would be folly to restrict or ban them before they have been published. This argument against restraining works before they are published found its way into the United States Constitution and has been a pillar of freedom of expression and the press ever since.

Why was Milton opposed to pre-publication restrictions on books? Not just because he realized that it was a matter of personal liberty, but because he realized that restricting a book’s contents means restricting the very power of the human mind to come up with new ideas. He powerfully reminded Parliament,

Who kills a man kills a reasonable creature, God’s image; but he who destroys a good book, kills reason itself, kills the image of God, as it were, in the eye. Many a man lives a burden to the earth; but a good book is the precious lifeblood of a master spirit, embalmed and treasured up on purpose to a life beyond life.

Milton saw quite clearly that the problem with limiting publication is in significant part a problem with trying to figure out all the places a book can go. The same problem arises with science. Read more »

Hidden Worlds: Science, Truth, and Quantum Mechanics

by Jochen Szangolies

Figure 1: A typical result of googling the word ‘quantum’: pretty, but not especially enlightening.

Hearing the words ‘quantum mechanics’ usually conjures up images of the impossibly tiny and fleeting, phenomena just barely on the edge of existence, unfathomably far removed from everyday experience. Perhaps illustrated in the form of bright, jittery sparkly things jumping about in a PBS documentary, perhaps as amorphous, hovering blobs of improbability, perhaps, sometimes, by the confounding notion of a cat that’s somehow both dead and alive, yet neither of those.

This does the subject a disservice. It paints a picture of quantum mechanics as far removed from everyday experience, as something we need not worry about in everyday life, something for boffins in lab-coats to contend with in their arcane ways. Yet, we’re told of the fantastic properties of the quantum world: particles that can be in two places at once, or spontaneously erupt out of sheer nothingness; that can jump through walls and communicate with one another across great distances instantly; that seem to know when they’re being watched; that are somehow both wave and particle; and so on.

Quantum reality, then, is at once beyond our grasp and, apparently, a source of fantastical properties. This combination has always marked the arena of the mystical: something just out of reach, something fundamentally unknowable, that, nevertheless, holds the promise of opening the doors to a strange, new world—to powers far beyond those the mundane world holds in store. The quantum world is a hidden world, and, like other hidden worlds throughout history, access to it becomes a coveted resource—to the profit of those purporting to be able to grant it. Read more »

Does belief in God make you rich?

by Ashutosh Jogalekar

Religion has always had an uneasy relationship with money-making. A lot of religions, at least in principle, are about charity and self-improvement. Money does not directly figure in seeking either of these goals. Yet one has to contend with the stark fact that over the last 500 years or so, Europe and the United States in particular acquired wealth and enabled a rise in people's standard of living to an extent that was unprecedented in human history. And during the same period, while religiosity in these countries varied, there is no doubt, especially in Europe, that religion played a role in people's everyday lives whose centrality would be hard to imagine today. Could the rise of religion in first Europe and then the United States somehow be connected with the rise of money and especially the free-market system that has brought not just prosperity but freedom to so many of these nations' citizens? Benjamin Friedman, who is a professor of political economy at Harvard, explores this fascinating connection in his book "Religion and the Rise of Capitalism". The book is a masterclass on understanding the improbable links between the most secular country in the world and the most economically developed one.

Friedman’s account starts with Adam Smith, the father of capitalism, whose “The Wealth of Nations” is one of the most important books in history. But the theme of the book really starts, as many such themes must, with The Fall. When Adam and Eve sinned, they were cast out from the Garden of Eden and they and their offspring were consigned to a life of hardship. As punishment for their deeds, all women were to deal with the pain of childbearing while all men were to deal with the pain of backbreaking manual labor – “In the sweat of thy face shalt thou eat bread, till thou return unto the ground”, God told Adam. Ever since Christianity took root in the Roman Empire and then in the rest of Europe, the Fall has been a defining lens through which Christians thought about their purpose in life and their fate in death. Read more »

The last great contrarian?

by Ashutosh Jogalekar

Freeman Dyson, photographed in 2013 in his office by the author

On February 28th this year, the world lost a remarkable scientist, thinker, writer and humanist, and many of us also lost a beloved, generous mentor and friend. Freeman Dyson was one of the last greats from the age of Einstein and Dirac who shaped our understanding of the physical universe in the language of mathematics. But what truly made him unique was his ability to bridge C. P. Snow’s two cultures with aplomb, with one foot firmly planted in the world of hard science and the other in the world of history, poetry and letters. Men like him come along very rarely indeed, and we are poorer for his absence.

The world at large, however, knew Dyson not only as a leading scientist but as a “contrarian”. He didn’t like the word himself; he preferred to think of himself as a rebel. One of his best essays is called “The Scientist as Rebel”. In it he wrote, “Science is an alliance of free spirits in all cultures rebelling against the local tyranny that each culture imposes on its children.” The essay describes pioneers like Kurt Gödel, Albert Einstein, Robert Oppenheimer and Francis Crick who cast aside the chains of conventional wisdom, challenging beliefs and systems that were sometimes age-old, beliefs both scientific and social. Dyson could count himself as a member of this pantheon.

Although Dyson did not like to think of himself as particularly controversial, he was quite certainly a very unconventional thinker and someone who liked to go against the grain. His friend and fellow physicist Steven Weinberg said that when consensus was forming like ice on a surface, Dyson would start chipping away at it. In a roomful of nodding heads, he would be the one who would have his hand raised, asking counterfactual questions and pointing out where the logic was weak, where the evidence was lacking. And he did this without a trace of one-upmanship or wanting to put anyone down, with genuine curiosity, playfulness and warmth. His favorite motto was the founding motto of the Royal Society: “Nullius in verba”, or “Nobody’s word is final”. Read more »

Making far out the norm: Or how to nurture loonshots

by Ashutosh Jogalekar

Vannevar Bush – loonshot pioneer (Picture credit: TIME magazine)

What makes a revolutionary scientific or technological breakthrough by an individual, an organization or even a country possible? In his thought-provoking book "Loonshots: How to Nurture the Crazy Ideas that Win Wars, Cure Diseases and Transform Industries", physicist and biotechnology entrepreneur Safi Bahcall dwells on the ideas, dynamics and human factors that have enabled a select few organizations and nations in history to rise above the fray and make contributions of lasting impact to modern society. Bahcall calls such seminal, unintuitive, sometimes vehemently opposed ideas "loonshots". Loonshots is a play on "moonshots", because the people who come up with these ideas are often regarded as crazy or anti-establishment, troublemakers who want to rattle the status quo.

Bahcall focuses on a handful of individuals and companies to illustrate the kind of unconventional, out of the box thinking that makes breakthrough discoveries possible. Among his favorite individuals are Vannevar Bush, Akira Endo and Edwin Land, and among his favorite organizations are Bell Labs and American Airlines. Each of these individuals or organizations possessed the kind of hardy spirit that’s necessary to till their own field, often against the advice of their peers and superiors. Each possessed the imagination to figure out how to think unconventionally or orthogonal to the conventional wisdom. And each courageously pushed ahead with their ideas, even in the face of contradictory or discouraging data. Read more »

Infinite horizons

by Ashutosh Jogalekar

The Doomsday Scenario, also known as the Copernican Principle, refers to a framework for thinking about the death of humanity. One can read all about it in a recent book by science writer William Poundstone. The principle was popularized mainly by the philosopher John Leslie and the physicist J. Richard Gott in the 1990s; since then, variants of it have been cropping up with increasing frequency, a frequency which seems to be roughly proportional to how much people worry about the world and its future.

The Copernican Principle simply states that the probability of us existing at a unique time in history is small because we are nothing special. We are therefore most likely living somewhere near the middle of humanity's total lifespan rather than at its very beginning or end. Using Bayesian statistics and the known growth of population, Gott and others then calculated bounds on humanity's future existence; one such conclusion is that there is a 95% chance that humanity will go extinct within 9,120 years.
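For a sense of how such a calculation goes, here is the simplest version of the reasoning, Gott's so-called delta-t argument (a sketch only; the 9,120-year figure quoted above comes from a more elaborate calculation that also folds in population growth, which this version ignores). Assume that the moment at which you happen to be observing humanity is uniformly distributed over its total lifespan $T$. Then with 95% probability your moment falls between the 2.5th and 97.5th percentiles of that lifespan,

$$0.025 \le \frac{t_{\mathrm{past}}}{T} \le 0.975,$$

which, writing $t_{\mathrm{future}} = T - t_{\mathrm{past}}$, rearranges to

$$\frac{t_{\mathrm{past}}}{39} \le t_{\mathrm{future}} \le 39\, t_{\mathrm{past}}.$$

Plugging in a past duration of roughly 200,000 years for Homo sapiens gives a 95% window of very roughly 5,000 to 8 million more years, which is the flavor of estimate Gott published in the 1990s.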

The Doomsday Argument has sparked a lively debate on the fate of humanity and on the different mechanisms by which the end will finally come. As far as I can tell, the argument is little more than inspired numerology and has little to do with any rigorous mathematics. But the psychological aspects of the argument are far more interesting than the mathematical ones; the arguments are interesting because they tell us that many people are thinking about the end of mankind, and that they are doing this because they are fundamentally pessimistic. This should be clear from how many people are now talking about how some combination of nuclear war, climate change and AI will doom us in the near future. I reject such grim prognostications because they are mostly compelled by psychological impressions rather than by any semblance of certainty. Read more »

Life and Death in New Jersey

by Ashutosh Jogalekar

On a whim I decided to visit the gently sloping hill where the universe announced itself in 1964, not with a bang but with ambient, annoying noise. It’s the static you saw when you turned on your TV, or at least used to back when analog TVs were a thing. But today there was no noise except for the occasional chirping of birds, the lone car driving off in the distance and a gentle breeze flowing through the trees. A recent trace of rain had brought verdant green colors to the grass. A deer darted into the undergrowth in the distance.

The town of Holmdel, New Jersey is about thirty miles east of Princeton. In 1964, the venerable Bell Telephone Laboratories had an installation there, on top of this gently sloping hill called Crawford Hill. It was a horn antenna, about as big as a small house, which the lab had built a few years earlier to pick up signals bounced off a communications satellite called Echo. Tending to the care and feeding of this piece of electronics and machinery were Arno Penzias – a working-class refugee from Nazism who had grown up in the Garment District of New York – and Robert Wilson; one was a big-picture thinker who enjoyed grand puzzles and the other an electronics whiz who could get into the weeds of circuits, mirrors and cables. The duo had been hired to work on ultra-sensitive microwave receivers for radio astronomy.

In a now famous comedy of errors, instead of simply contributing to incremental advances in radio astronomy, Penzias and Wilson ended up observing ripples from the universe’s birth – the cosmic microwave background radiation – by accident. It was a comedy of errors because others had either theorized that such a signal would exist without having the experimental know-how or, like Penzias and Wilson, were unknowingly building equipment to detect it without knowing the theoretical background. Penzias and Wilson puzzled over the ambient noise they were observing in the antenna that seemed to come from all directions, and it was only after clearing away every possible earthly source of noise including pigeon droppings, and after a conversation with a fellow Bell Labs scientist who in turn had had a chance conversation with a Princeton theoretical physicist named Robert Dicke, that Penzias and Wilson realized that they might have hit on something bigger. Dicke himself had already theorized the existence of such whispers from the past and had started building his own antenna with his student Jim Peebles; after Penzias and Wilson contacted him, he realized he and Peebles had been scooped by a few weeks or months. In 1978 Penzias and Wilson won the Nobel Prize; Dicke was among a string of theorists and experimentalists who got left out. As it turned out, Penzias and Wilson’s Nobel Prize marked the high point of what was one of the greatest, quintessentially American research institutions in history. Read more »

Animal Stories

by Joan Harvey

We are all the animals and none of them. It is so often said that poetry and science both seek truth, but perhaps they both seek hedges against it. —Thalia Field

Konrad Lorenz, still charming, circa 1981.

A handsome bearded man leads a row of eager young ducklings who mistake him for their mother. Many of us recognize this image, warm and charming, gemütlich even, as that of the ethologist Konrad Lorenz. Thalia Field, in her book Bird Lovers, Backyard, in a section titled “A Weedy Sonata,” leads us to Lorenz the way I came to him, the way I remember him from childhood: “…the imprinting idea reveals this white-bearded man in work pants and waders, a row of ducklings strolling behind him….Picture: Konrad Lorenz on his steps, feeding a baby bird from a dropper. Martina the goose waiting to go up to sleep in ‘her bedroom’ at the top of his house. A family portrait in progress.”

Recently Leanne Ogasawara, in her 3 Quarks Daily essay on Leonardo's painting Salvator Mundi, concluded that in evaluating the provenance of an Old Master, it is wisest to trust the scientists, a position with which I'm inclined to agree. But in the discussion that followed, others raised the need for a "fresh eye," suggesting that artists and philosophers and laymen should weigh in for a more balanced view, one less prone to innate bias. Today, with more women in science, with research in neuroscience leading to an explosion of ideas about what consciousness is, with neuroscientists concluding that animals too are conscious, there is recognition that we have drawn false borders where there may be none. Previously agreed-upon methods and theories have been increasingly questioned both from within and from outside a number of fields. There is a general re-visioning of assumed truths, of the canon left by mostly white men. Of course the best science is always open to correction as more information becomes available.

My mother, a passionate animal lover, who often preferred animals to humans, and who had six kids in a row, somewhat as if she’d produced a litter, had Lorenz’s book, King Solomon’s Ring, on her shelf, though I no longer remember if she gave it to me to read, or I just found it myself. And what I remember, what everyone remembers from the book, is this man, embodying both the maternal and paternal, leading a flock of baby geese around, feeding them, acting as their substitute mom. Imprinting. Read more »