People are understandably concerned about health and safety. They worry about hoverboards that threaten to burst into flames, or a Zika virus infection that could harm their unborn children. They worry ceaselessly about health threats and industrial hazards and accidents that they can do nothing to prevent, and they want their governments to protect them. They want to know: is it safe or not?
But safety isn’t as simple as yes or no, and believing it is can lull us into a false sense of security, or into needless worry. Safety is more nuanced and therefore more problematic, as we are learning now, five years on from the Fukushima Daiichi nuclear power-plant accident. What is more, the safety issues around Fukushima will have a tremendous impact on us all – not just on the residents of that region of Japan.
Exposure to any potential hazard involves a certain risk. It’s where we draw the line between high risk and low risk that defines what is safe and what is not. This gives rise to an interesting set of questions. Would everyone draw that line in the same place? Would we draw the same line for ourselves as we would for others? And, would we allow others to draw the line for us?
The radiation protection community has had plenty of experience drawing safety lines ever since 1895, when X-rays were discovered. Those lines have proved moveable feasts, shifting as each generation, thinking it knows better, has written over the recommendations of its predecessor.
One hundred years ago, women factory workers in New Jersey and Connecticut began painting watch dials with luminous paint doped with radium. The radium made the paint glow in the dark, and watches with glowing dials were the latest novelty. Spurred by heavy advertising, watch sales rocketed, and the women found it hard to keep pace with the demand.
But dial-painting was a risky business. The women typically restored the points to their blunted brushes by twisting the bristles in the corner of their mouths – a technique called ‘lip pointing’ – and that practice posed a health hazard. Soon the women started coming down with bone cancers and many subsequently died as a result of ingesting radium. Lip pointing was abolished, but since the watch factories were laden with radium dust, the risk of workers breathing or ingesting radium remained. Were these small residual amounts of internal radioactivity worth worrying about?
Several medical committees were convened to address the radium safety issue, but none could agree on a safe limit. The situation became urgent during the Second World War, when the need for glowing dials moved from novelty watches to the instrument panels on the dashboards of Air Force fighter planes. To break the safety deadlock, the military established its own committee of radiation experts and tasked it with reviewing existing data on radium exposure until it arrived at a consensus on a safety limit. But still it was not clear what criteria should be used to draw that all-important line between dangerous and safe. Finally, the chairman suggested that each committee member consider the level of radium that they would feel comfortable having their own wives and daughters consume.
With this ‘wife or daughter’ criterion guiding it, the committee selected 0.1 micrograms as the safety threshold for radium consumption. Above that line lay danger; below it, safety.
When X-rays and other types of external sources of radiation became prevalent, many radiation workers likewise experienced daily exposures to radiation. Most suffered no noticeable health problems, but some developed lesions and rashes on their skin. Sometimes, they later developed cancerous tumours where the lesions had been. With radiation now a valuable tool in medicine and industry, abolishing it was unthinkable. That high doses of external radiation were dangerous was obvious; the critical question was what was the dose below which workers would be safe.
Again, a series of expert scientific committees came and went, each using different criteria for what constituted ‘safe’. Again, the committees came up with different occupational limits. A conceptual breakthrough came when one committee proposed that employment in the radiation industry should pose neither more nor less risk of death than employment in other comparable occupations. So extensive surveys were conducted, and the average risk of occupational death among all types of workers calculated. Then an annual dose limit for radiation workers was selected that conferred a comparable risk of death. The limit set happened to fall between the death rates for manufacturing workers and construction workers. Radiation work was thus safer than construction, but more dangerous than manufacturing – at least it was for those who agreed with the death-rate rationale as the best way to draw the line on risk.
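The death-rate rationale described above boils down to simple arithmetic: choose a target annual risk of death comparable to other occupations, divide by an assumed risk-per-dose coefficient, and out comes a dose limit. Here is a minimal sketch of that logic, using purely illustrative numbers (the actual committees’ figures differed):

```python
# Illustrative sketch of the 'comparable occupational risk' rationale.
# Both input numbers below are hypothetical placeholders, not the
# historical committee values.

def dose_limit_for_risk(target_annual_death_risk, risk_per_sievert):
    """Annual dose (in sieverts) whose estimated fatality risk
    matches a target annual occupational death rate."""
    return target_annual_death_risk / risk_per_sievert

# Suppose the average occupational death rate across industries were
# 1 in 10,000 per year, and the fatal-risk coefficient for radiation
# were 4 per cent per sievert (both assumed for illustration):
limit_sv = dose_limit_for_risk(1e-4, 0.04)
print(f"Annual dose limit: {limit_sv * 1000:.1f} mSv")  # prints 2.5 mSv
```

The point is not the particular numbers but the structure of the argument: the line is set wherever the chosen comparison occupation happens to sit.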
The 1970s posed a particular safety challenge in Japan. Desperately short of fossil fuels and facing growing demands for electrical power, the country embarked on building dozens of nuclear power plants along its coastline, where the need for large amounts of cooling water for the reactors might be easily met. There was only one problem: Japan’s coastline is under constant threat from tsunamis, which, though infrequent, had brought death and destruction to the country’s coastal communities over many centuries. What should be done?
The government commissioned the building of seawalls as a line of defence against potential tsunamis. But the question arose as to how high the seawalls should be to guarantee safety – given that, as civil engineers pointed out, the higher the wall, the greater the technological challenges, and the higher the costs.
Many of the Japanese nuclear officials could remember the devastating tsunami of 1960 that claimed 142 lives along Japan’s eastern coast. It was the worst tsunami in modern history, with sweeping successions of waves reaching 5.7 metres (18.7 feet) in height. So the new seawalls were built at 5.7 metres – a level judged adequate to provide protection. Any lower would be too dangerous, given recent historic precedent, while building them any higher would be too costly.
This calculation proved shortsighted. Power-plant officials were not the first people to draw safety lines for tsunamis in Japan. Up in the hills, all along the coast, there still exist many ancient stone monuments that mark the high-water lines of tsunamis from centuries past. These markers, dating back to as early as 869 AD, were intended as a warning to the region’s descendants. Their individual inscriptions vary, but all the stones convey essentially the same message: learn from the misfortunes of your ancestors; do not erect buildings below this point.
As it turns out, the 1960 tsunami was one of the smaller ones to hit Japan. As we now well know, the decision to use the 1960 tsunami as the benchmark for determining a safe seawall height had a tragic outcome. The waves that struck Fukushima Daiichi on 11 March 2011 were 15 metres (49 feet) high – nearly three times higher than the seawall. The result was that the reactor buildings were flooded, the control rooms lost electrical power, and three reactor cores underwent meltdown, releasing radioactivity into the environment.
Such stories illustrate the problems of trying to set any type of safety limit. Our long history with radiation protection has taught us two lessons. One: the criteria for assessing safety are often subjective. Two: there is really no such thing as safe, only varying levels of risk. While radiation protection professionals are well aware of these lessons, the public is often clueless.
Another challenging concept for the public is the notion that the major determinant of risk is dose, that is, the amount of toxin that actually gets into your body’s tissues as a result of being exposed: two people might both get five hours of sun at the beach, but the one wearing sunscreen will get a lower dose of harmful rays. In the case of radiation – and many other environmental hazards such as pesticides and air pollutants – there is no option of avoiding low-level exposures that result in our absorbing some specific dose. Safety is not a matter of choosing between receiving a dose or not: instead, it boils down to determining what dose is too much.
Ken Buesseler, director of the Center for Marine and Environmental Radioactivity at the Woods Hole Oceanographic Institution in Massachusetts, laments the way people think that any radioactivity in the environment poses a significant health danger. He told me: ‘People ask me all the time if there is still Fukushima radioactivity in the Pacific Ocean, as though its very presence were a major health threat. I tell them that it’s still there but the amount that’s there now, compared to the amount that was found off Fukushima in the ocean in 2011, is like the difference between the temperature on the Earth’s surface and the temperature in the interior of the Sun. The two temperatures are no way comparable, and neither are the present ocean radioactivity levels compared to 2011.’
But if the public needs to get up to speed with this new way of viewing safety, public health officials haven’t made it easy. Too often they characterise risks in arcane language and confusing jargon, using metrics such as odds ratios, relative risks and standardised mortality ratios. These metrics are the workhorses of epidemiologists and other scientists who measure risk, but they don’t translate well into public discourse.
Also, safety professionals frequently use a plethora of different dose and exposure units, depending upon the circumstances. Sometimes their selection of units doesn’t even make much sense. For example, at Fukushima, radioactive water releases into the ocean were reported in tons (a weight unit) rather than in litres or gallons (volume units). When was the last time you filled your gas tank with 0.1 tons of gas? It’s little wonder the public is confused.
Despite decades of experience protecting the public from radiation risks, even radiation protection professionals often drop the ball when it comes to effectively communicating that risk. As professor Paul Locke of the Johns Hopkins Bloomberg School of Public Health told me: ‘The greatest challenge is the failure of radiation professionals to be active and engaged listeners and communicators. This needs to change if things are to move forward in the radiation protection field.’
One overdue change would be for radiation protection professionals and public health officials to think hard about how they use numbers when communicating with the public. It is well documented that people have difficulty conceptualising ratios and fractions because they focus on numerators to the detriment of denominators, yet there has been little movement away from ratio metrics when presenting risk to the public. This is despite the availability of more easily intelligible risk metrics (such as ‘effort-to-yield’ measures), which the risk-communication researcher Gerd Gigerenzer, director of the Max Planck Institute for Human Development in Berlin, has shown to be understandable by professionals and the general public alike. Unless public health professionals follow Gigerenzer’s recommendations, they will have to share some of the blame when risk communication breaks down.
On the plus side, the radiation protection community, for one, is already moving away from the simplistic concept of safe and dangerous levels of radiation, in favour of a risk-minimisation approach. Although the US Nuclear Regulatory Commission (NRC) still maintains regulatory dose limits for different categories of exposure, its working policy is actually more progressive, with users of radiation in medicine, industry and commerce now being required to keep the doses to radiation-exposed personnel ‘as low as reasonably achievable’ (called the ALARA principle). This policy acknowledges that risk occurs at all doses, so it favours keeping them prudently low, regardless of how minimal the absolute level of risk might be.
It is time to move this progressive message to the public at large, not just for radiation, but for all the environmental hazards we face. If people think that regulatory dose limits represent a wall between safety and danger, they are sadly mistaken. The regulatory limit is just a line drawn in the sand.
If anyone has issues with either the acceptability of the risk levels, or the appropriateness of the criteria used to judge safety, then they should pick up a stick and draw their own line, reflecting their individual risk tolerance and the safety criteria that are relevant to them. It will require a little more work and a little self-education, but it will release us from dependence on experts and regulatory agencies to tell us what ‘safe’ is. Nor is it necessary that we all draw the same line; different lines can be equally valid as long as they are based on scientific facts – not rumours, conspiracy theories and hysteria. It is the responsibility of regulatory agencies to provide us with the facts about risk in a clearly intelligible way, so that we can make our own judgments on safety.
Now, in 2016, another line is being drawn: the annual radiation dose that should be considered as safe for Fukushima evacuees to return to their homes. The Japanese government wants to draw the line at an effective dose of 20 millisieverts (mSv) per year; others think it should be higher or lower. Let’s explore for a moment what that 20-mSv line would mean in terms of risk.
A 20-mSv dose entails approximately the same level of cancer risk as the dose from a single whole-body CT scan. How much cancer risk does that represent? About one in 1,000. That is to say, of 1,000 people living for one year in an area with a dose rate of 20 mSv per year, we would expect one to develop a lethal type of cancer at some point later in life due to that radiation exposure. This is what our best science tells us. Is a risk of one in 1,000 too high?
Put another way, a one-in-1,000 risk adds 0.1 percentage points to the background cancer rate, which is about 25 per cent: that is, for 25 per cent of all people, the cause of their ultimate death will be cancer. Some people are surprised by this number because they underestimate their personal risk, but cancer is a very common disease. The 0.1 per cent cancer risk from the radiation exposure adds to this baseline risk of 25 per cent, so that a person exposed to 20 mSv of radiation now has a 25.1 per cent cancer risk (compared with the 25 per cent risk of an unexposed person). Living one year at 20 mSv produces a 25.1 per cent risk; two years, 25.2 per cent; three years, 25.3 per cent. These annual increments of 0.1 percentage points are too small to see in any cancer registry or epidemiological study, so they will remain theoretical risk estimates only – unconfirmable by any future health study of returning evacuees.
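The arithmetic in the last two paragraphs can be checked in a few lines. This sketch assumes the linear no-threshold model that the figures above imply, with a nominal fatal-cancer risk coefficient of 5 per cent per sievert (an assumption, chosen because it reproduces the one-in-1,000 figure for 20 mSv):

```python
# Back-of-the-envelope check of the 20-mSv risk arithmetic.
# RISK_PER_SV is an assumed coefficient, not an official value.

RISK_PER_SV = 0.05      # assumed lifetime fatal-cancer risk per sievert
BASELINE = 0.25         # background lifetime cancer-death risk (~25%)
annual_dose_sv = 0.020  # 20 mSv per year

# Added risk per year of residence: 0.02 Sv x 0.05/Sv = 0.001 (1 in 1,000)
added_per_year = annual_dose_sv * RISK_PER_SV

for years in (1, 2, 3):
    total = BASELINE + years * added_per_year
    print(f"{years} year(s) at 20 mSv/yr -> {total * 100:.1f}% lifetime risk")
# prints 25.1%, 25.2% and 25.3% for one, two and three years
```

The tiny yearly increments, stacked on a 25 per cent baseline, are why no epidemiological study could ever isolate them.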
So the question evacuees must contemplate when considering their possible return is this: am I comfortable with subjecting myself to a one-in-1,000 level of cancer risk? What evacuees ultimately decide will likely depend just as much on what they stand to lose by not returning as on what they risk by going back. This is why people should make their own decisions, individually, about what is ‘safe’ for them: regulatory agencies can calculate the cancer risks of going back, but they cannot calculate the personal costs of not returning. Some people will simply have more at stake than others, and they will have to pay a very high personal price if they insist on extremely low risk.
The aftermath of Fukushima is all about drawing safety lines. It’s about where they should be drawn, whether they’ll need to be redrawn, and by whom. In other words, exactly which risk lines must be crossed if evacuees are ever to return home. These are the questions that Fukushima dropped squarely in all of our laps. If you think the issue of whether to return home or not is a quandary only for the people of Fukushima, think again. It is exactly the same choice you’d be facing if your nearest nuclear power plant had an accidental meltdown, or a terrorist detonated a dirty bomb in your city. We all have skin in this game. And we need to start thinking about these questions now.
For too long there has been a paternalistic approach to ensuring public safety, with the public demanding: ‘Tell us whether it is safe or not, and spare us the details.’ But do we really want government and public health officials acting as though we were children and they our parents, choosing safety criteria for us? Do we want them to restrict and regulate our actions based on judgments about levels of ‘acceptable risk’ that we might not agree with, or that might not apply to our personal situation? Should men be picking radium ingestion levels for women? Should scientists be choosing the acceptable death rates for workers? Should civil engineers be telling us how big a tsunami to be concerned about? Probably not.
All that scientists can reasonably be expected to do is conduct risk assessments and report the results back to us in an intelligible and transparent manner. And that’s a big enough job in itself. After that, shouldn’t it be we – the ‘stakeholders’ – who decide what risks are acceptable to us, in full awareness that, when we insist on very low levels of risk, there will be consequences to pay? If we are too conservative about radiation risks, we might lose the benefits of diagnostic X-rays, have to leave our radon-contaminated homes, or pay higher prices for electricity. There is no free lunch when it comes to low risk tolerance.
As we move forward through the 21st century, let’s take heed of the lessons from our radiation past and abandon the concepts of ‘safe’ and ‘dangerous’ that have outlived their usefulness. We are far more sophisticated than that now – and we have far more reliable risk data. We ought to be ready, willing and able to make decisions about levels of ‘acceptable risk’ on our own. We’ll rely on the risk assessors for reliable information and guidance, but the buck for decision-making on the acceptability of risk cannot be passed on. And the consequences of our good or poor decisions will fall on us, as masters of our own ‘safety’ – just as they should.