Science Is Truth Until It Isn’t

by Thomas O’Dwyer

Don’t forget to credit my research assistant, Erwin Schrödinger. Image: ScienceLive / Shutterstock

“Trust the science; follow the scientists” has become a familiar refrain during our past year of living dangerously. It is the admonition of world health organisations to shifty politicians; it is good advice for all whose lives have been battered into disruption by Covid-19. But another insidious pandemic has been creeping up on us. The World Health Organization calls it the “infodemic”. It includes those endlessly forwarded emails from ill-informed relatives, social media posts, and sensational videos full of spurious “cures” and malicious lies about the virus and the pandemic. The disinformation isn’t all the work of internet trolls, conspiracy theorists and “alternative” medicine peddlers. Some actual scientists have been caught in acts of deception. These are people who undermine whatever faith the public has left in science, and who sabotage the credibility of their scrupulous colleagues. One of the worst cases of fraud was Dr Andrew Wakefield’s bogus 1998 research paper linking vaccines to autism, which endangered the lives of countless children before it was debunked and its author struck off the UK medical register. In this 700th anniversary year of Dante Alighieri’s death, we should reserve a special place in his Inferno for those who profit from turning the truths of Mother Nature into dangerous lies.

“If it disagrees with experiment, it’s wrong. That’s all there is to it,” the physicist Richard Feynman once said in a lecture on scientific method. It’s a noble truth — your theory is wrong if the experiments say so — but given the flaws of human nature, it’s not that simple. Sloppy work or deliberate fraud can make your theory seem correct enough to get published in one prestigious journal, and cited in many others. Scientific theories should follow the Darwinian principle of “survival of the fittest”. Yesterday’s cast-off ideas (goodbye, phlogiston) may pave the path to progress, but along the way, there are also some fake signposts pointing in wrong directions.

Read more »

When are you past your prime?

by Emrys Westacott

Recently I had a discussion with a couple of old friends–all of us middle-aged guys–about when one's powers start to decline. God only knows why this topic came up, but it seems to have become a hardy perennial of late. My friends argued that in just about all areas, physical and mental, we basically peak in our twenties, and by the time we turn forty we're clearly on the rocky road to decrepitude.

I disagreed. I concede immediately that this is true of most, perhaps all, physical abilities: speed, strength, stamina, agility, hearing, eyesight, the ability to recover from injury, and so on. The decline after forty may be slight and slow, but it's a universal phenomenon. Of course, we can become fitter through exercise and the eschewing of bad habits, but any improvement here is made possible by our being out of shape in the first place.

What about mental abilities? Again, it's pretty obvious that some of these typically decline after forty: memory, processing speed, the ability to think laterally, perhaps. Here too, the decline may be very gradual, but these capacities do not seem to improve in middle age. Still, I think my friends focus too much on certain kinds of ability and generalize too readily from these across the rest of what we do with our minds. More specifically, I suspect they view the cognitive capabilities that figure prominently in and are especially associated with mathematics and science as somehow the core of thinking in general. Because of this, and because these capacities are more abstract and can be exercised before a person has acquired a great deal of experience or knowledge, certain abilities have come to be identified with sharpness as such, and one's performance at tasks involving quick mental agility or analytic problem solving is taken as a measure of one's raw intellectual horsepower.

A belief in pure ability, disentangled from experiential knowledge, underlies notions like IQ. It has had a rather inglorious history, and it has been used at times to justify a distribution of educational resources favouring those who are already advantaged. Today it continues to interest those who prefer to see any assessments or evaluations expressed quantitatively wherever possible–a preference that also reflects the current cultural hegemony of science. Yet what matters to us, really, shouldn't be abilities in the abstract–how quickly we can calculate, or how successfully we can recall information–but what we actually do with these or any other abilities we possess. Is there any reason to suppose that we make better use of what we've got before we're forty?

Read more »