Monday, November 02, 2015
Is your brain wired for science, or for bunk?
by Maarten Boudry
Science education is an uphill battle. More than 40 percent of the population of the U.S., one of the most scientifically advanced countries on the planet, believes that the earth was created in six days by supernatural fiat a few millennia ago. Ghosts, gods, angels and devils continue to populate people’s fertile imagination. Belief in telepathy and assorted psychic powers is rampant, as is belief in all sorts of quack medicine and conspiracy theories. It is no wonder that some scientists and science educators are driven to desperation: why don’t people just get it? Why do they doggedly persist in the myths of old, or the fads of late, as if the scientific revolution had never taken place?
Meanwhile, the progress of science continues unabated, with an ironic twist. Science does not just explain the way the universe is; it also explains why people continue to think the universe is different from the way it is. In other words, science is now trying to explain its own failure in persuading the population at large of its truth claims. Decades of research in cognitive psychology have revealed that our brains, alas, are just not wired up for science. Or at least not for the fruits of scientific research. To be sure, science is a product of human brains (where else would it come from?), but as scientists have made progress, they have come up with theories and views that are increasingly hard to swallow for those same brains. Take evolutionary theory, a crowning achievement of science. Our minds are prone to find purpose in nature (intuitive teleology), but evolution says there isn’t any: all is blind chance, mindless necessity and pitiless indifference. Our minds like to think of biological species as immutable categories separated by unbridgeable chasms (intuitive essentialism), but evolutionary theory just talks about imperceptibly shifting populations and changes in gene frequencies. Our minds can just about conceive of a thousand years, but scientists estimate that life on earth has been evolving for some 3.8 billion (a thousand times a thousand times a thousand) years. It’s hard to get your puny human brain around such things.
In Why Religion is Natural and Science is Not, philosopher Robert McCauley offers ample demonstrations of the truth of his book title. Many scientific theories run roughshod over our deepest intuitions. Lewis Wolpert even remarked that “I would almost contend that if something fits with common sense it almost certainly isn't science.” It is not so much that the universe is inimical to our deepest intuitions; it’s that it does not care a whit about them (it’s nothing personal, strictly business). And it gets worse as we go along. Newton’s principle of inertia was already hard to get your head around (uniform motion continuing indefinitely?), but think about the curvature of space-time in general relativity, or the bizarre phenomena of quantum mechanics, which baffle even the scientists who spend a lifetime thinking about them. Science does not have much going for it in the way of intuitive appeal.
Bearing all that in mind, it may seem remarkable, not that so many people refuse to accept the scientific worldview, but that so many have embraced it at all. Of course, science has one thing in its favor: it works, bitches. Every time your GPS device tells you where you are, or you heat up your soup by bombarding it with invisible waves, or you blindly place your fate in the hands of an able surgeon, you are relying on the achievements of science. Science is culturally successful despite the fact that it clashes with deeply engrained intuitions. By and large, people accept the epistemic authority of science—sometimes begrudgingly—because they admire its technological fruits and because deep down they know it is reasonable to defer to the expertise of more knowledgeable people. Without its technological prowess, which ultimately derives from the fact that it tracks truth, the scientific worldview would wither away. No system of beliefs could succeed in convincing so many people of so many bizarre and counterintuitive things, unless the truth was on its side, at least most of the time.
We can see that if we compare science with some of its contenders: religion, superstition, ideology, and in particular pseudoscience – belief systems that actively mimic the superficial trappings of science, trying to piggyback on its cultural prestige. By definition, pseudoscience doesn’t have truth on its side (except by a sheer stroke of luck), or else we would just call it ‘science’. Because they defy reality, pseudosciences can boast of no genuine technological success. The army does not hire psychics (or so one hopes), homeopathy has nothing but the placebo effect to count on, and creationists are marginalized in the scientific community, despite their persistent campaign for recognition.
But how do pseudoscience and other weird belief systems survive and sustain themselves? They profit exactly from that which is lacking in science: intuitive appeal. Almost all pseudosciences tap into universal cognitive biases, intuitions and heuristics of the human mind -- courtesy of evolution by natural selection. Intuitive appeal makes up for lack of epistemic warrant. Take alternative medicine. Many quack remedies are based on intuitive conceptions of physical processes. There are energy streams to be channeled, blockades to be lifted, and tensions to be released. People see health as a “balance” between different elements or forces (as in the ancient theory of humors), and sickness as a disturbance of this balance. Alternatively, illness can be conceived of as a contamination by a harmful or filthy substance, which needs to be expelled from the body. In many cultures across the world, people cut open veins to purge patients of ‘bad blood’. Bloodletting has been invented many times over, in cultures independent of one another. Modern westerners detox and purify their colons, or open up and cleanse their chakras with healing crystals. If you find these rituals of purification intuitively plausible, the placebo effect will take it from there.
Even a form of alternative medicine such as homeopathy, which may strike the scientifically literate reader of this website as the height of absurdity, is based on simple and intuitive principles. The ritual of diluting and shaking taps into intuitive essentialism: an intangible essence is imprinted on the water, and becomes more potent as the process is repeated over and over again. The homeopathic similia principle, which states that ‘like should be cured with like’, can be found in many other alternative therapies, and is a form of what James Frazer called ‘sympathetic magic’: to be cured of a disease, you have to find a remedy sharing the same “essence”. Many people find it intuitively plausible that, if you are ill, you should take some hair of the dog that bit you. (Indeed, this phrase was once literally put into practice, as a cure for rabies).
And then of course there are explanations in terms of agents and intentions, which are intuitively very satisfying to the human mind. Once people find an agent behind the scenes, they achieve psychological closure and stop looking for further explanations. Reality, alas, does not provide much satisfaction in that regard. Modern science has largely abandoned intentional explanations of natural phenomena, and at the most basic level of reality, it only admits of chance and necessity, both of which are blind and impersonal. Many pseudosciences, however, still satisfy our desire for intentional explanations. This is obvious enough in the case of Biblical creationism and other forms of religious pseudoscience, but it also holds for conspiracy theories, superstition and paranormal belief. One of the primary cognitive roots of conspiracy thinking is that our brains are prone to detecting agency in random patterns or meaningless events. In some pseudosciences, the agents are not immediately apparent: the intentional explanations are still there, but they are no longer anchored in identifiable agents. Examples of such agentless intentionality in pseudoscience are concepts such as fate, karma or élan vital, and the widespread idea that there is some meaningful order in the cosmos, such as we find in astrology and New Age spiritualism.
The problem, alas, is that the world out there does not care a whit about our intuitive whims. Reality, as the science fiction writer Philip K. Dick wrote, is that which, when you stop believing in it, doesn't go away. An idea may be intuitively palatable or psychologically comforting, but if it flies in the face of reality, it is ultimately unstable. If our beliefs and reality collide, it is usually our beliefs that must yield. So why is there still so much nonsense around? It may be interesting to adopt the perspective of the beliefs themselves here, or the ‘memes’, to use Richard Dawkins’s coinage. If a belief system is to survive in a population of human brains, it has to be intuitively appealing and easy to process and remember. But it also has to reckon with the world out there. If a belief system is blatantly false, it will not retain many adherents for long. In our scientific age, beliefs are increasingly exposed to critical scrutiny and controlled testing, which can be seen as a ruthless form of cultural selection. Religion and supernatural belief systems have partly solved this problem by lifting their claims outside of the visible realm altogether, and also by intimidating and browbeating dissenters: questioning is taboo, criticism is blasphemy, and apostates should be punished.
Pseudoscientific belief systems cannot easily resort to such drastic measures, as these memes are trying to mimic the trappings of science, including its deference to evidence and its culture of open questioning and criticism. Pseudoscience rides piggyback on the prestige of science in our modern age, so it has to look scientific. But how do pseudosciences pull that off? Apart from adopting the trappings of modern science (technical jargon, experimentation, conferences, peer-reviewed journals), pseudosciences have developed certain ‘strategies’ to cope with the threat of adverse evidence, and to resist critical scrutiny. In my dissertation Here Be Dragons. Exploring the Hinterland of Science, and in a series of papers with the philosopher Johan Braeckman, I refer to these as ‘immunizing strategies’ and ‘epistemic defense mechanisms’. These are clever tricks, sometimes embedded in the belief system itself, that serve to protect it from refutation, and to give it a spurious ring of plausibility. One very neat trick is to give a theory-internal explanation for opposition against the theory itself. Sigmund Freud, one of the cleverest pseudoscientists of his age, famously suggested that the opposition against psychoanalysis bears out one of its main predictions: that everyone is under the spell of unconscious forces, which are secretly trying their utmost to hide the inconvenient truths of Freudian theory. Those who attack psychoanalysis, Freud wrote, display “the same resistance as in our patients”, and this resistance “finds it easy to disguise itself as an intellectual rejection and to bring up arguments like those which we ward off in our patients”. Thus the “argument from resistance” was born, and became a staple of the psychoanalytic arsenal. Scientologists and Marxists have constructed their own versions of the resistance argument. It is quite neat, as it is a trump card that can be used in any discussion, against any given argument.
But there are plenty of other immunizing strategies. Parapsychology has built-in ad hoc clauses to deal with unwelcome data. For instance, many parapsychologists believe that the presence of inquisitive minds disturbs psychic phenomena, a phenomenon that is called ‘‘negative psi vibration’’ or ‘‘catapsi’’ (notice the technical jargon). In the words of one of the followers of Franz Anton Mesmer, the famous 18th-century magnetizer, ‘never magnetize before inquisitive persons!’. Defenders of alternative medicine have devised another immunizing strategy against overly inquisitive persons bent on putting their remedies to the test. The argument goes like this: “Every human subject is radically unique, so every treatment is different. In our holistic approach, one cannot just generalize across individuals. Our methods are not accessible to anything so crude and reductionist as a randomized clinical trial.” If you give it a minute of thought, this argument makes little sense, but it sure sounds convincing to the ears of believers. Should one not respect the uniqueness of every individual, rather than reducing people to numbers and data points?
Many pseudoscientific claims are also moving targets, amenable to a range of interpretations. This is notoriously so in the case of astrology and assorted forms of fortune-telling. Horoscopes may look as if they contain specific predictions or interesting observations about your character, but as soon as they are threatened with falsification, they become vague or turn into metaphors. This can also be seen as an immunizing strategy. Many such predictions lead a conceptual “double life”, as the philosopher Frank Cioffi put it, expanding and contracting as the occasion demands.
If a belief system postulates invisible intentional agents, as many pseudosciences do, a whole range of immunizing strategies opens up: the fairies in the garden may be shy and actively evade being detected, the secret conspirators may be planting false evidence to throw us off the scent, the visiting extraterrestrials may not want to disclose their visitations. The same goes for religion, of course: God can test our faith by playing hide and seek, or the devil may tempt us with clever skeptical arguments. (Some creationists believe that Satan himself whispered the idea of evolution in Darwin’s ear).
How do immunizing strategies work? Our minds tend to search for confirmations of our beliefs, and do not actively seek out adverse information. This so-called myside bias may lead to tunnel vision and belief perseverance. Immunizing strategies exacerbate this bias, by turning refutations into confirmations and by giving us the perfect excuse to turn a deaf ear to critics. In this way, immunizing strategies prevent people from getting rid of bad beliefs. The philosopher Stephen Law called such belief systems “epistemic black holes”: once you are sucked into one of them, it is extremely hard to get out.
In a recent paper in Philosophical Psychology, co-authored with Stefaan Blancke and Massimo Pigliucci, we have compared the cultural dynamics of science and pseudoscience, developing what Dan Sperber called an ‘epidemiology of representations’. This approach maps how beliefs (or ‘memes’) spread in a population, like viruses affecting the mind. The framework can be applied to any form of cultural representation, so it works for science and pseudoscience alike. There are just different selective processes at work. Science has achieved cultural stability – in the community of experts but also in the population at large – because it works and because it is true, despite the fact that it flies in the face of pretty much every human intuition. Pseudoscience, on the other hand, can thrive because it taps into our innate intuitions and biases, and because it is protected by its own in-built survival kit. Both are culturally ‘successful’, but for completely different reasons.
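To make the contrast concrete, the selective dynamics can be caricatured in a few lines of code. The toy model below is my own illustrative sketch, not the model from the paper: a belief spreads in proportion to its intuitive appeal, then faces critical scrutiny, which it survives with some probability. Every parameter value is a made-up assumption chosen only to display the qualitative pattern.

```python
# Toy "epidemiology of representations" (illustrative assumptions only).
# Each generation: transmission driven by intuitive appeal, then
# critical testing, which the belief survives with some probability.

def spread(share, appeal, survival, generations):
    """Fraction of believers after repeated rounds of
    appeal-driven transmission and scrutiny-driven pruning."""
    for _ in range(generations):
        adopted = share + (1 - share) * appeal * share  # word-of-mouth growth
        share = adopted * survival                      # scrutiny prunes believers
    return share

# Science: counterintuitive (low appeal), but true (testing rarely kills it).
science = spread(0.01, appeal=0.2, survival=0.99, generations=50)

# Pseudoscience with immunizing strategies: highly intuitive, and
# shielded from most refutations by its built-in survival kit.
immunized = spread(0.01, appeal=0.6, survival=0.95, generations=50)

# A blatant falsehood with no such protection: intuitive but often refuted.
refuted = spread(0.01, appeal=0.6, survival=0.5, generations=50)
```

With these invented numbers, the counterintuitive-but-true belief ends up dominating, the immunized pseudoscience stabilizes at a high level despite its weaker epistemic credentials, and the unprotected falsehood goes extinct: the qualitative pattern the epidemiological framework describes.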
* * *
Maarten Boudry (1984) is a postdoctoral fellow of the Flemish Fund for Scientific Research (FWO) at Ghent University. In 2011, he defended his dissertation on pseudoscience, Here Be Dragons. Exploring the Hinterland of Science, consisting of a collection of papers that have been published in Philosophy of Science, Philosophia, Quarterly Review of Biology, Science & Education and Philosophical Psychology. He is co-editor of Philosophy of Pseudoscience. Reconsidering the Demarcation Problem (2013), together with Massimo Pigliucci. His current research deals with evolutionary epistemology, in particular the problem of human irrationality. Other research interests include naturalism, skepticism, and the conflict between science and religion. He just published Illusions for the Advanced. Why Truth is Always Better ("Illusies voor gevorderden", in Dutch) and is co-author of The Doubting Thomas Might Be Right (with Johan Braeckman, 2011).
Posted by S. Abbas Raza at 12:45 AM