Do human bodies carry more toxins now than 50 years ago?

Alan Levine/Flickr

by Matt Limmer

Look around, and you’re likely to find a burgeoning industry of alternative health professionals, laboratories and containment companies, all formed around the disturbing belief that we humans have become contaminated with metals, plastics, chemicals, hormones and all the detritus and waste of modern life, unfettered and unrestrained. The list of dangers seems endless: polychlorinated biphenyls (PCBs) dumped into the Hudson River by General Electric for three decades until finally banned in 1977; the endless flow of plastics now clogging our landfills and, presumably, our bodies; mercury from China; oestrogens, fluorides, and even electromagnetic waves. This industrial poison, a product of the past 50 years, is said to be the cause of an upsurge in autism and chronic disease.

But is it true?

Maybe not. It could be that we are no more contaminated than we were decades ago.

We, the environment and its inhabitants, can become contaminated by a wide range of toxins. Most often, we recognise products of industrialisation as environmental contaminants, but naturally occurring toxins abound, from ultraviolet light coming from space to compounds created naturally, by plants, for self-defence. By some measures, more than 99 per cent of human exposure to toxins results from natural toxins, and the toxic body-burden presented by these natural toxins is unlikely to have changed significantly in modern times.

The presence of such natural toxins, moreover, should not be frightening. As living organisms, we’ve evolved defences, such as the liver, to mitigate many of these natural invaders, and those same defences can be deployed against human-made toxins, too.

When taking account of our current and past contamination, the mere presence of a toxin is unfortunately insufficient. Instead, we need to understand each toxin’s actual risk, calculated as the mathematical product of exposure and chemical toxicity, the latter often correlated with chemical concentration. However, determining both components of risk for the plethora of toxins is difficult. At one end of the chemical spectrum sit well-understood, regulated contaminants that still create exposures, such as constituents of petroleum products. At the other end are the unknown unknowns – contaminants whose adverse effects have yet to be understood.
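That product can be made concrete with a toy calculation (my own illustration, with invented numbers – not a regulatory model):

```python
import math

# Toy illustration of risk as the product of exposure and toxicity.
# All numbers are invented for illustration, not measured values.

def risk(exposure: float, toxicity: float) -> float:
    """Relative risk score: exposure (e.g. mg per kg per day)
    multiplied by a unitless toxicity weight."""
    return exposure * toxicity

# A high-exposure, low-toxicity chemical can pose the same nominal
# risk as a rarely encountered but highly toxic one.
common_mild = risk(exposure=10.0, toxicity=0.01)
rare_potent = risk(exposure=0.1, toxicity=1.0)
print(math.isclose(common_mild, rare_potent))  # True
```

The point of the sketch is that neither factor alone settles the question: a frightening toxicity means little at negligible exposure, and vice versa.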

Since the inception of the United States Environmental Protection Agency more than 40 years ago, many known toxicants have been effectively removed from global production, particularly, as in the case of the pesticide DDT, when risk outweighs benefit. In this sense, we are less contaminated than before: drinking water is tightly regulated for a number of contaminants; rivers near industrial effluents no longer catch fire; and outdoor air contamination by particulate matter and ozone has been reduced.

Still, we have more industrial production, more cars and more people today than in the past, and the Earth is no larger. The environment can naturally remediate or tolerate many contaminants up to a point – often termed the assimilative capacity – beyond which the natural system is negatively affected.
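That threshold behaviour can be sketched with a toy model (my own simplification, not a formal environmental model): contamination arriving below the assimilative capacity is fully degraded each time step, while any excess accumulates.

```python
# Toy model of assimilative capacity: the environment removes up to
# `capacity` units of contaminant per time step; the rest accumulates.
# Units and values are arbitrary, chosen only to show the threshold.

def simulate(input_rate: float, capacity: float, steps: int) -> float:
    """Return the contaminant stock after `steps` time steps."""
    stock = 0.0
    for _ in range(steps):
        stock += input_rate            # new contamination arrives
        stock -= min(stock, capacity)  # nature degrades up to its capacity
    return stock

print(simulate(input_rate=5.0, capacity=8.0, steps=100))   # 0.0 -- within capacity
print(simulate(input_rate=10.0, capacity=8.0, steps=100))  # 200.0 -- excess accumulates
```

Below the threshold the stock stays at zero indefinitely; above it, the excess grows without bound, which is why a finite Earth with growing production matters.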

And new, relatively unstudied chemicals are being produced at substantial rates. According to the Chemical Abstracts Service (CAS), a division of the American Chemical Society, there are at least 100 million potential new contaminants for humans to deal with today, about 10 times more than the estimated number of eukaryotic species on Earth. Understanding how a potential contaminant behaves in the environment is similar to understanding how a particular species behaves in its environment; we must determine its distribution, life cycle and rate of decay under a variety of circumstances.
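One of the simplest ways to characterise a contaminant’s rate of decay is a first-order model parameterised by a half-life – a brief sketch, with the specific numbers invented for illustration:

```python
import math

# First-order decay: one of the simplest fate models used to describe
# how a contaminant's concentration declines over time.

def remaining_fraction(t: float, half_life: float) -> float:
    """Fraction of a contaminant remaining after time t,
    given its half-life (same time units)."""
    k = math.log(2) / half_life  # first-order decay constant
    return math.exp(-k * t)

# A hypothetical chemical with a 10-year half-life:
# after 20 years (two half-lives), a quarter remains.
print(remaining_fraction(t=20.0, half_life=10.0))  # 0.25
```

Real environmental fate depends on circumstances – sunlight, microbes, temperature, the medium the chemical sits in – so in practice a single contaminant may need a different decay rate for each setting, which is part of why characterising 100 million candidates is so daunting.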

Similarly, while in many ways individual species resemble related species, and contaminants resemble similar contaminants, each species and contaminant is unique in some way. Thoroughly studying each potential contaminant takes time and resources. The first step, estimating our exposure to all these chemicals, is an overwhelming task. Determining the chemical toxicity is even more difficult, because toxicity encompasses a wide variety of damages that are still being researched. Classically, acute toxicity has been investigated due to the short time frames involved, but chronic conditions such as cancers are of increasing interest.

Because of the vast diversity and complexity of chemicals and toxicities, regulation and much of the research have historically been reactionary, waiting to see which chemicals accumulate in biological systems or otherwise exert toxicity. This approach limits our ability to answer the question posed here, simply because there are too many unknowns.

To better protect human health and the environment, reactionary regulation must be supplemented with proactive research and regulation. Even with the vast number of potential toxicants produced, there is hope for reducing our exposure to emerging and fugitive contaminants. Existing laws protect us from many known contaminants, assuming these laws are properly enforced. For many other contaminants, scientific consensus needs to be pushed into regulations, without the prerequisite of a national headline disaster. Fundamental research needs to continue to uncover the unknown unknowns and to develop computational tools for predicting risk and exposure. Whether we are more or less contaminated will probably always be unanswerable, but through proactive research and regulation, we can work to minimise the risk presented by environmental contaminants.