
Paradigms lost

Science is not a ‘body of knowledge’ – it’s a dynamic, ongoing reconfiguration of knowledge and must be free to change

by David P Barash

Tyrannosaurus rex, a feathered beast. Illustration by Richard Wilkinson

Coming from a scientist, this sounds smug, but here it is: science is one of humanity’s most noble and successful endeavours, and our best way to learn how the world works. We know more than ever about our own bodies, the biosphere, the planet and even the cosmos. We take pictures of Pluto, unravel quantum mechanics, synthesise complex chemicals and can peer into (as well as manipulate) the workings of DNA, not to mention our brains and, increasingly, even our diseases.

Sometimes science’s very success causes trouble, it’s true. Nuclear weapons – perhaps the most immediate threat to life on Earth – were a triumph for science. Then there are the paradoxical downsides of modern medicine, notably overpopulation, plus the environmental destruction that science has unwittingly promoted. But these are not the cause of the crisis science faces today: a crisis of legitimacy, centred on rampant public distrust and disavowal.

A 2015 survey by the Pew Research Center in Washington, DC, conducted with the American Association for the Advancement of Science, found that a mere 33 per cent of the American public accepts evolution. A standard line from – mostly Republican – politicians when asked about climate change is ‘I’m not a scientist’… as though that absolved them from looking at the facts. Vaccines have been among medical science’s most notable achievements (essentially eradicating smallpox and nearly eliminating polio, among other infectious scourges), but the anti-vaccination movement has stalled comparable progress against measles and pertussis.

How can this be? Why must we scientists struggle to defend and promote our greatest achievements? There are many possible factors at work. In some cases, science conflicts with religious belief, particularly among fundamentalists – every year I find it necessary to give my undergraduate students a ‘talk’ in which I am frank that evolutionary science is likely to challenge any literalist religious beliefs they might have. In the political sphere, there is a conflict between scientific facts and short-term economic prospects (climate‑change deniers tend to be not merely scientifically illiterate, but funded by CO2-emitting corporations). Anti-vaxxers are propelled by the lingering effect of a single discredited research report that continues to resonate with people predisposed to ‘alternative medicine’ and stubborn opposition to establishment wisdom.

The problems run deeper than this, however. Many scientific findings run counter to common sense and challenge our deepest assumptions about reality: the fact that even the most solid objects are composed at the subatomic level of mostly empty space, or the difficulty of conceiving things that go beyond everyday experience, such as vast temperatures, time scales, distances and speeds, or (as in the case of continental drift) exceedingly slow movements – not to mention the statistically verifiable but nonetheless unimaginable ability of natural selection, over time, to generate outcomes of astounding complexity. On top of this, we have the continuing paradox that the more we learn about reality, the less central and self-important is our own species.
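
The power of cumulative selection becomes a little less unimaginable with a toy demonstration. Below is a minimal Python sketch in the spirit of Richard Dawkins’s well-known ‘weasel’ program (my illustration, not the author’s; the target phrase, mutation rate and brood size are arbitrary toy values). Typing 28 characters at random would hit the target only about once in 10^40 attempts, yet a process that keeps the best variant of each generation typically finds it within a hundred generations or so:

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"   # Dawkins's toy target phrase
ALPHABET = string.ascii_uppercase + " "   # 27 possible characters
MUTATION_RATE = 0.05                      # per-character chance of miscopying
BROOD_SIZE = 100                          # offspring per generation

def fitness(candidate: str) -> int:
    """Count the characters that already match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str) -> str:
    """Copy the parent string, occasionally miscopying a character."""
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else ch
        for ch in parent
    )

# Start from pure noise; each generation, breed copies and keep the fittest.
parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while parent != TARGET:
    generation += 1
    parent = max((mutate(parent) for _ in range(BROOD_SIZE)), key=fitness)

print(f"Reached the target in {generation} generations.")
```

The trick is that selection accumulates small improvements rather than waiting for a single lucky jackpot – which is precisely the statistically verifiable but intuitively elusive point.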

Yet one factor in the public distrust of science has been largely overlooked, and it goes to the heart of the scientific enterprise. The capacity for self-correction is the source of science’s immense strength, but the public is unnerved by the fact that scientific wisdom isn’t immutable. Scientific knowledge changes with great speed and frequency – as it should – yet public opinion, once formed, is slow to follow. And the rapid ebb and flow of scientific ‘wisdom’ has left many people feeling jerked around, confused, and increasingly resistant to science itself.

In his hugely influential book, The Structure of Scientific Revolutions (1962), the physicist and philosopher of science Thomas Kuhn argued that ‘normal science’ proceeds within certain reigning ‘paradigms’. In other words, each scientific discipline is governed by an accepted set of theories and metaphysical assumptions, within which normal science operates. Periodically, when this rather humdrum ‘puzzle solving’ leads to results that are inconsistent with the regnant perspective, there follows a disruptive, exciting period of ‘scientific revolution’, after which a new paradigm is instituted and normal science can operate once more.

Strangely, Kuhn argued that new paradigms do not necessarily offer a more accurate picture of the real world. This seems a peculiar claim: in astronomy, for example – the subject of Kuhn’s first book, The Copernican Revolution (1957) – the heliocentric model of the solar system is clearly superior to the earlier geocentric one. Kuhn’s language has lent itself to an exaggerated sense of just how revolutionary a new paradigm is liable to be. When Newton said: ‘If I have seen farther, it is by standing on the shoulders of giants’, he wasn’t merely being modest; rather, he was emphasising the extent to which science is cumulative, mostly building on past achievements rather than making quantum leaps.

But Kuhn was right about this: the accumulation process generates not just something more, but often something altogether new. Sometimes the new involves the literal discovery of something which hadn’t previously been known (electrons, general relativity, Homo naledi). At least as important, however, are conceptual novelties, changes in the ways that people understand – and often misunderstand – the material world: their operating paradigms.

Of course, the fundamental laws and processes of the natural world exist independently of human paradigms: the Earth orbited the Sun regardless of whether people signed on to a Ptolemaic or a Copernican perspective. As B F Skinner said: ‘No theory changes what it is a theory about.’ The world’s factual details are in continual Heraclitean flux, but the basic rules and patterns underlying these changes in the physical and biological world are themselves constant. So far as we know, light travelled at the same speed during the age of dinosaurs as it does today, just as special and general relativity were valid before being identified by Albert Einstein. Our insights, however, are always ‘evolving’.

This sort of change is both frightening and exciting. After all, it’s hard to give up a cherished idea, particularly one that took a while to catch on but that eventually becomes widely accepted. And for many people – scientists and non-scientists alike – it’s even harder to give up ideas that appeared to have the seal of scientific approval. Isn’t that what science is supposed to be: a series of iron-clad factual statements of what we know to be true?

In fact, this is itself untrue. Science is a process, which, unlike ideology, is distinguished by intellectual flexibility, by a graceful, grateful (albeit sometimes grudging) acceptance of the need to change our minds, as our understanding of the world evolves. Most people aren’t revolutionaries, scientific or otherwise. But anyone aspiring to be well-informed needs to understand not only the most important scientific findings, but also their provisional nature, and the need to avoid hardening of the categories: to know when it is time to lose an existing paradigm and replace it with a new one. What is more, they need to see this transition as progress rather than a sign of weakness, which is more difficult than one might think. A good paradigm is a tough thing to lose.

There is a long list of ideas that were considered ‘scientifically valid’ in their day and have since been discarded. Belief in a flat Earth is a prominent one, along with the Ptolemaic system that had enshrined our planet as the centre of all things celestial. Although it is easy to ridicule that earlier geocentric world view, it was impressively ‘scientific’ in its day, buttressed by elaborate mathematical models and supported by much of the empirical data of the time – albeit data gathered by naked-eye astronomy, before the telescope.

Alchemy is in a real sense the ancestor of what we now call chemistry, but its practitioners had to recant their previous paradigm in order to become, eventually, ‘real’ chemists. Other lost theories include the ‘luminiferous ether’, long believed to constitute a substance that propagates light waves, and whose explanatory reach was later extended to include electromagnetic radiation generally; or ‘caloric’, a hypothetical substance that ostensibly embodied heat energy, and which flowed from hotter bodies to colder ones.

Some of these paradigm shifts occurred before science itself became an institutional endeavour, and did not, therefore, undermine the legitimacy of science as an enterprise. The word ‘scientist’ didn’t even exist until the English historian and philosopher William Whewell coined the term in 1834. Once science became an intellectual discipline and scientists were identified as its practitioners, then along with the good (progress in getting the nature of the natural world right) came the bad (the fact that the wisdom of science wasn’t rock solid).

Nor are paradigm shifts confined to the distant scientific past. In my own specialty, the study of animal behaviour, it was de rigueur for many decades to avoid any assumptions about animal consciousness, or even the presence of animal minds. A mere hint of such anthropomorphism was a kind of third rail in animal behaviour research: touch it and you might not get electrocuted, but you certainly wouldn’t get a research grant, or tenure. This paradigm of ‘mindless’ animals derived in part from a misapplication of the Occam’s Razor principle (always make the simplest possible assumptions), and was in part a consequence of radical behaviourism, an effort to render psychology purely objective and scientific – itself largely outmoded. Recent discoveries, including work on Alex the (sadly deceased) African grey parrot as well as remarkable studies on cognition in chimpanzees, crows and dogs, have shown that these creatures are capable of intellectual feats that compare favourably with those of normal, healthy human beings. Once denied by science, animal minds are now legitimate subjects of research, under the rubric of ‘cognitive ethology’.

The animal mind has become a legitimate object of scientific enquiry, while at the same time the human mind has been brought firmly into the physical universe. This has been deeply disorienting for those committed to a mystical concept of consciousness as ineffably separated from materiality. René Descartes is justly renowned as a philosopher and mathematician; however, he thought of himself primarily as an empirical researcher, and was in fact a pioneering 17th-century physiologist. Part of Descartes’ science was the certainty that body and mind were quite separate, a belief that is still deeply influential in the popular imagination. Yet one of the most productive of today’s scientific disciplines is neurobiology, whose insights have made it increasingly difficult to maintain the dualistic Cartesian concept that human consciousness is beyond the reach of scientific research and, ultimately, of physical and biological explanation. ‘You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will,’ wrote Francis Crick in The Astonishing Hypothesis (1994), ‘are in fact no more than the behaviour of a vast assembly of nerve cells and their associated molecules.’

Some of the most dramatic paradigm shifts have involved bio-medicine: no wonder that much of the complaint about science being fickle stems from the ever-changing advice about our bodies and how to care for them. Thanks to Louis Pasteur, Robert Koch, Joseph Lister and other pioneering 19th- and 20th-century microbiologists, we came to understand the role of pathogens in causing disease, resulting in the scientific discovery that ‘germs are bad’. This particular paradigm – displacing belief in ‘bad air’ and the like (the term ‘influenza’ derives from the supposed ‘influence’ of miasmas in causing disease) – was vigorously resisted by the medical establishment. Doctors who routinely went straight from conducting autopsies on disease-ridden corpses to delivering babies couldn’t abide the idea that their unwashed hands were transmitting illness to their patients, to the extent that the physician Ignaz Semmelweis, who in 1847 demonstrated the role of hand-borne pathogens in causing ‘puerperal fever’, was ignored, then vilified, then literally driven mad.

More recently, just as people have finally adjusted to worrying about creatures too small to be seen, a new generation of microbiologists has demonstrated the stunning fact that most of the microbes that associate with us (including but not limited to the gut microbiome) aren’t merely benign but essential for health.

Nerve cells, we were long told, didn’t regenerate, especially not within the brain. Now we know that they do. Brains can even produce whole new neurons; you can teach old dogs new tricks. Similarly, it was assumed until recently that once an embryonic cell differentiates into, say, a skin or liver cell, its fate is sealed. The advent of cloning technology has changed this, with the finding that the nuclei of differentiated cells can be reprogrammed to give rise to other tissue types. Dolly the sheep was cloned from the nucleus of a fully differentiated mammary cell, proof that the paradigm of irreversible cell differentiation itself needed to be reversed. Biologists long maintained that life is fragile and can exist only under very exacting and special circumstances. Au contraire: living organisms have recently been found thriving in some of the most challenging environments imaginable, including super-heated oceanic vents and anaerobic conditions previously thought to be lifeless. Individual lives are indeed fragile, but life is remarkably robust.

Until recently, physicians were scientifically certain that at least a week of bed rest was necessary after even a normal, uncomplicated vaginal childbirth, not to mention invasive surgery. Now surgical patients are often encouraged to walk as soon as possible. For decades, protuberant but basically benign tonsils were unceremoniously yanked out whenever a child had a sore throat. Not any more. Psychiatry offers a pervasive, problematic panoply of paradigms lost: homosexuality was, until 1974, considered a form of mental illness; schizophrenia was thought to be caused by the verbal and emotional malfeasance of ‘schizophrenogenic mothers’; and prefrontal lobotomies were the scientifically approved treatment of choice for schizophrenia, bipolar disorder, psychotic depression and, sometimes, merely a way of calming an ornery and intransigent patient.

Probably the most notable cases of medical paradigms found, then lost, then regained, then placed in a kind of scientific limbo occur in the field of nutrition. I wasn’t the only child growing up in the 1950s for whom a normal breakfast consisted of two eggs. For later generations, dietary cholesterol was tarred as barely short of poison. Now? Not so much. There seems little doubt that trans-fats are bad, very bad. But other forms of fat have undergone a dizzying course of banishment, embrace, then moderate tolerance if only because they diminish appetite and thus might actually limit obesity. Caffeine was also bad, a verdict that has been increasingly reversed – but only up to a point. Wine, especially red wine? Bad. Well, actually – good. So long as it’s not overdone. Sugar? First OK, then not. And now, so-so. And don’t get me started on gluten.

With so many paradigms lost, many of them comforting, what’s left? Some of these unseated certainties will not be missed, at least not for long: it is relatively straightforward (although not always easy) to keep changing our diets, or to reconfigure our perception of microbes, of the capacity of nerve cells to regenerate and of other cells to differentiate.

But the loss of any paradigm is disorienting, and to be deprived of many can be downright disheartening. Perhaps we mourn the loss of certainty, of the sort that most religions offer to their followers. Perhaps it’s more a search for authority, of the sort once provided by our parents. Or a universal yearning for any reliable port – even if conceptual rather than maritime – in the storms of life’s unknowns. Whatever the underlying cause, people have difficulty accepting the unstable, shifting, impermanent reality of how the world is put together. And this difficulty, in turn, renders us uncomfortable with precisely the only stability and certainty that science offers: that paradigms come and go.

Even more worrying, changes in scientific insights have also provided opportunities for malefactors to sow undeserved doubt. Creationists point to the shifting intellectual dynamic between advocates of phyletic gradualism (evolution proceeds slowly) and punctuated equilibrium (sometimes it is rather fast), as showing that Darwinism is seriously in doubt. It isn’t; specialists merely disagree as to the rates at which evolution by natural selection occurs, not that it occurs. Ditto for controversy over whether the meaningful unit of selection is the gene, the individual, or even the group. By the same token, climate‑change deniers point to the constant revisions in atmospheric models and data as ‘proving’ that the science itself is bogus. ‘If climate science is settled,’ wrote the conservative columnist Charles Krauthammer in The Washington Post in 2014, ‘why do its predictions keep changing?’ Note to Mr Krauthammer: because as we get better data, we make better predictions (which, incidentally, turn out to confirm anthropogenic global heating, to a degree that is typically more worrisome, not less).
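
Krauthammer’s question has a straightforward statistical answer, and a toy calculation makes it concrete. The Python sketch below is my own illustration, not drawn from any actual climate model: the ‘true’ trend and the noise level are invented numbers. It fits a least-squares slope to simulated annual records of increasing length, and shows that the spread of the resulting estimates shrinks as the record grows – each re-estimate is a ‘changed prediction’, yet the sequence is homing in on the truth:

```python
import random
import statistics

random.seed(42)
TRUE_TREND = 0.02  # invented 'true' warming trend, degrees per year

def estimate_trend(n_years: int) -> float:
    """Least-squares slope fitted to n_years of noisy annual observations."""
    xs = list(range(n_years))
    ys = [TRUE_TREND * x + random.gauss(0, 0.15) for x in xs]  # noisy record
    x_bar, y_bar = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Repeat the estimate 200 times at each record length and measure the spread.
for n in (10, 40, 160):
    estimates = [estimate_trend(n) for _ in range(200)]
    print(f"{n:4d}-year record: spread of trend estimates = "
          f"{statistics.stdev(estimates):.4f}")
```

Revision, in other words, is what convergence looks like from the outside.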

One possible corrective might be to modify the way we teach science. Currently, our insights are communicated as a catalogue of Things We Know, which has the dual disadvantage of making science seem a laborious exercise in memorisation and of giving the false impression that our knowledge is petrified and immutable, a Cretaceous-era insect entombed in amber. Maybe, instead, we should teach science as an exciting examination of Things We Don’t (Yet) Know.

Denied the comforting blanket of illusory permanence and absolute truth, we have the opportunity and obligation to do something extraordinary: to see the world as it is, and to understand and appreciate that our images will keep changing, not because they are fundamentally flawed, but because we keep providing ourselves with better lenses. Our reality hasn’t become unstable; it’s just that our understanding of reality is of necessity a work in progress.

The loss of paradigms might be painful, but it is testimony to the vibrancy of science, and to the regular, unstoppable enhancement of human understanding as we approach an increasingly accurate grasp of how our world works. According to the Bible, having eaten forbidden fruit from the Tree of Knowledge of Good and Evil, we were punished for our disobedience. As we pursue knowledge – not of good and evil, but (as Shakespeare put it) of how the world wags – we too must absorb a kind of punishment. Fortunately, losing a paradigm is less devastating than being kicked out of paradise. Moreover, unlike the purported ways of God, science doesn’t need any special justification – to men or women – beyond the satisfaction it provides as well as the practical insights it yields. Every paradigm lost is compensated by wisdom found.

I recently heard a man interviewed on my local public radio station complain about the difficulty of keeping up with what he called the ‘swerves of scientific wisdom’: ‘I spent two tours in Iraq as a gunner,’ he said, ‘and I know how hard it is to hit a moving target. I wish these scientific experts would just hold still.’

But that’s the thing. Holding still is exactly what science won’t do.