Chaos and cause

Can a butterfly’s wings trigger a distant hurricane? The answer depends on the perspective you take: physics or human agency

by Erik Van Aken

A slight shift in Cleopatra’s beauty, and the Roman Empire unravels. You miss your train, and an unexpected encounter changes the course of your life. A butterfly takes flight from a tree in Michoacán, triggering a hurricane halfway across the globe. These scenarios exemplify the essence of ‘chaos’, a term scientists coined in the middle of the 20th century to describe how small events in complex systems can have vast, unpredictable consequences.

Beyond these anecdotes, I want to tell you the story of chaos and answer the question: ‘Can the simple flutter of a butterfly’s wings truly trigger a distant hurricane?’ To uncover the layers of this question, we must first journey into the classical world of Newtonian physics. What we uncover is fascinating – the Universe, from the grand scale of empires to the intimate moments of daily life, operates within a framework where chaos and order are not opposites but intricately connected forces.

In his bestselling book Chaos: Making a New Science (1987), James Gleick observes that 20th-century science will be remembered for three things: relativity, quantum mechanics (QM), and chaos. These theories are distinctive because they shift our understanding of classical physics toward a more complex, mysterious and unpredictable world.

Classical physics, which reached its pinnacle in the work of Isaac Newton, painted a universe ruled by determinism and order. It was a world akin to a perfectly designed machine, where each action, like the fall of a domino, inevitably triggered a predictable effect. This absolute predictability – a world where understanding the present means knowing the future – became the essence of Newtonian mechanics.

For Newton’s followers, classical physics not only presented an orderly universe; it also instilled a profound sense of mastery over the natural world. Newton’s discoveries fostered the belief that the Universe, previously shrouded in mystery, was now laid bare, sparking an unprecedented optimism in the power of science. Armed with Newton’s laws and revolutionary mathematics, leading thinkers felt they had finally unlocked the secrets of reality.

In this atmosphere of scientific triumph, Alexander Pope, the great poet of the Enlightenment, wrote a fitting epitaph for Newton that captured the monumental impact of his contribution:

Nature and Nature’s laws lay hid in night.
God said, Let Newton be! and all was light.

Not everyone was excited. In his beautiful work Lamia (1820), John Keats poignantly expressed concern over the loss of mystery and wonder in the face of empirical scrutiny:

Do not all charms fly
At the mere touch of cold philosophy?
There was an awful rainbow once in heaven:
We know her woof, her texture; she is given
In the dull catalogue of common things.
Philosophy will clip an Angel’s wings,
Conquer all mysteries by rule and line,
Empty the haunted air, and gnomed mine –
Unweave a rainbow, as it erewhile made
The tender-person’d Lamia melt into a shade.

The ‘cold philosophy’ of classical physics seemed to ‘unweave a rainbow’, stripping the natural world of its enchantment and mystery. Keats resented the process of scientific rationalisation, which could ‘clip an Angel’s wings’ and reduce the world’s wonders to simple entries in ‘the dull catalogue of common things’.

Chaos theory reveals a beguiling level of unpredictability, particularly at a macroscopic level

And yet, the 20th century witnessed a dramatic shift with the emergence of relativity, which redefines our understanding of space and time; QM, which revolutionised our understanding of the subatomic world; and chaos theory. The orderly and predictable world of Newtonian physics, the dream of a mechanical universe ready to unveil her innermost workings, was, happily or not, something of an illusion. In the 20th century, science revealed a far more intricate, less predictable and, indeed, chaotic universe.

Like the other two pillars Gleick identified, chaos theory challenges our understanding of classical physics. However, unlike QM and relativity, chaos theory operates within a Newtonian framework – it assumes a deterministic reality governed by specific laws. Yet chaos theory reveals a beguiling level of unpredictability, particularly at a macroscopic level.

The unpredictability revealed by chaos theory, seemingly at odds with a deterministic worldview, arises from the complex nature of nonlinear systems.

In dynamical systems, behaviour changes over time. The concept of determinism implies that future states are precisely determined by current conditions, without any randomness or chance involved. However, when dynamical systems exhibit nonlinearity, their behaviour becomes more complex and less predictable. This complexity arises from a disproportionate relationship between input (cause) and output (effect).

Consider a simple faucet. At low pressure, water flows in a smooth, or laminar, pattern. As pressure increases, the flow remains steady but broadens slightly. At one critical point, however, marked by no more than a tiny pressure change, we see a ‘phase transition’ – the orderly flow suddenly becomes turbulent, exemplifying chaos: the sensitivity of nonlinear systems like fluids to minor changes, leading to unpredictable outcomes.
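The jump from orderly to chaotic behaviour at a critical parameter value can be seen in miniature in the logistic map – a standard one-line model from chaos theory, offered here as a stand-in for the faucet, since real fluid turbulence requires far heavier mathematics. Below a critical value of the parameter r, which plays the role of the water pressure, the orbit settles into a repeating cycle; past it, no cycle ever appears:

```python
def logistic_orbit(r, x0=0.2, burn=500, keep=8):
    """Iterate the logistic map x -> r*x*(1-x), discard the first
    `burn` transient steps, and return the next `keep` values."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 6))
    return out

# r = 3.5 (below the onset of chaos): the orbit settles into a
# repeating 4-cycle -- the analogue of smooth, laminar flow.
orderly = logistic_orbit(3.5)

# r = 3.7 (past the onset): the orbit never repeats -- the analogue
# of turbulence, reached by a modest nudge of the parameter.
chaotic = logistic_orbit(3.7)
```

Nothing random enters the computation: both runs are fully deterministic, yet a small change in a single parameter flips the system from periodic to aperiodic behaviour.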

Think about the movement of a small pebble rolling down a mountain slope. Tiny variations in its starting point, uneven terrain, soil density, even wind direction can drastically alter its path and final position. Imagine we drop a pebble at a specific location and note where it comes to rest. Now suppose we repeat the experiment, releasing the pebble just one millimetre from the original starting point. If the pebble’s movement is slightly altered by external factors – a gust of wind, a patch of highly dense soil, a large rock – its speed could increase dramatically, and it could ultimately stop in an unexpected location 5,000 mm away from where it landed in the first drop.

A parallel in celestial mechanics is the so-called three-body problem, recently popularised by the Netflix series 3 Body Problem. Consider two bodies in space: Earth and the Moon. Newtonian mechanics allows us to predict the orbital motions of these two bodies perfectly. Yet, when we add a third body, the Sun, we discover a level of complexity that defies Newtonian predictability. The gravitational interactions among these three bodies create a dynamic, nonlinear system in which slight variations in initial conditions – for example, minor variations in the distances or velocities of any one body – can lead to vastly different outcomes; the long-term positions of the three bodies become practically impossible to predict.

In broader mathematical and scientific terms, ‘chaos’ refers to systems that appear random yet are inherently deterministic. Take the example of a roulette wheel, commonly perceived as a game of chance. While we might assume the outcome is purely random, the underlying mechanics of the roulette wheel, including the motion, friction and the force of the spin, adhere to deterministic physical laws. The true source of unpredictability stems from its extreme sensitivity to initial conditions: how forcefully the ball is dropped, the speed at which the wheel spins, subtle vibrations from environmental factors like an air conditioner, and even the movement of patrons around the table. These factors, often unnoticed, can significantly influence the outcome of each spin. Chaos theory teaches us that even seemingly insignificant variations in initial conditions – a fraction of a millimetre difference in the ball’s drop point – can lead to disproportionately large effects.

Chaos theory – whose signature phenomenon is colloquially known as the butterfly effect – can shatter our common notion of cause and effect. It suggests that predicting the long-term future is incredibly difficult because even tiny, seemingly irrelevant events can have significant consequences.

The term ‘butterfly effect’ is often attributed to the meteorologist Edward Lorenz, who used the now-familiar example to describe chaos: a butterfly flapping its wings in Brazil could set off a tornado in Texas weeks later.

This seemingly outlandish scenario underscores the counterintuitive nature of chaos theory. While the idea of small causes having large effects might feel familiar, chaos theory challenges our common assumptions about how the world works. The surprising lesson isn’t that small events can have significant consequences but, rather, the profound difficulty of predicting those consequences. This core principle – the difficulty of prediction – has a technical name: ‘sensitive dependence’ on initial conditions, such as a roulette ball’s position before it is dropped, the speed of the roulette wheel, and so on.

But sensitive dependence is not a novel concept. It has a place in history:

For want of a nail the shoe was lost.
For want of a shoe the horse was lost.
For want of a horse the rider was lost.
For want of a rider the battle was lost.
For want of a battle the kingdom was lost.

‘For want of a nail’ captures a familiar notion about causality – small events can cascade into significant consequences.

Yet, within the framework of chaos theory, we can take this idea further.

Consider the potential for sudden changes due to phase transitions, as when the swirling water goes from smooth to turbulent. Tiny variations in a system’s conditions, like a seemingly insignificant missing nail, can accumulate and trigger an unexpected shift – the shoe falling off, the horse being injured, the battle being lost. These sudden changes, surprising transitions within the system, are driven by underlying physical laws, yet they reveal the inherent unpredictability and complexity embedded in what might seem like straightforward events.

Lorenz entered some numbers and went to get a coffee. On his return, he discovered a shocking result

Just as a missing nail leads to the loss of a kingdom, could the flutter of a distant insect trigger catastrophic events? The answer, perhaps surprisingly, depends on perspective – how we choose to look at the world and how we understand cause and effect.

Before we consider the two distinct perspectives, it is critical to note that the butterfly effect is a metaphor for a theory, namely, chaos – the idea that small changes in conditions can have large, unexpected effects. While the butterfly effect is a powerful image, it’s important to remember the scientific foundation Lorenz’s work provided.

I mentioned that Lorenz was a meteorologist. Indeed, he studied the weather and tried to find ways to improve forecasting – predicting when a storm might arise, where it would turn, when it would die down, and so on. During his investigations at MIT, Lorenz developed a simple computer model to simulate hypothetical weather systems. As the story goes, Lorenz entered some numbers into his computer program and left his office to get a coffee. When he returned, he discovered a shocking result.

His model was relatively simple. It used a set of differential equations to represent how air moves and temperatures fluctuate. Lorenz was repeating a simulation he had run earlier – but he had rounded off one variable from .506127 to .506, a seemingly inconsequential alteration. To Lorenz’s surprise, that tiny alteration drastically transformed the model’s output.
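Lorenz’s rounding accident is easy to re-enact. The sketch below uses the three-variable system Lorenz published in 1963, not the larger weather model of the anecdote, and a crude forward-Euler integration; the step size and starting values are illustrative choices, not Lorenz’s. Two runs that differ only in whether one coordinate is rounded from 0.506127 to 0.506 end up wildly far apart:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the 1963 Lorenz equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def max_divergence(x0_a, x0_b, steps=3000):
    """Run two trajectories that differ only in their initial x,
    and return the widest gap in x observed along the way."""
    a, b = (x0_a, 1.0, 1.05), (x0_b, 1.0, 1.05)
    widest = 0.0
    for _ in range(steps):
        a, b = lorenz_step(a), lorenz_step(b)
        widest = max(widest, abs(a[0] - b[0]))
    return widest

# The two starting values differ by about one part in ten thousand --
# Lorenz's seemingly inconsequential rounding.
gap = max_divergence(0.506127, 0.506)
```

The initial difference is roughly 0.0001, yet the trajectories soon decorrelate completely; printing `gap` shows a separation many thousands of times larger than the perturbation that caused it.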

Lorenz’s groundbreaking work uncovered a startling phenomenon: small changes can have enormous, unforeseen consequences, leading to impenetrable barriers in long-term prediction. We call this phenomenon the butterfly effect, but its scientific foundation lies in the sensitivity of nonlinear systems to initial conditions.

The chaotic nature of nonlinear systems impacts more than just mathematics. For instance, small genetic mutations or environmental changes in biological evolution can lead to significant evolutionary shifts over time. The path of evolution is not linear or predictable; instead, it is full of unexpected twists and turns, like the movement of a pebble down the mountain. Similarly, in economics, markets function as complex, nonlinear systems. Rumours about a company or slight changes in interest rates can act as triggers, setting off substantial and unanticipated shifts. The 2007-08 financial crisis provides a sobering reminder that minor perturbations in one sector can ripple into a global meltdown.

Perhaps the point about small events is best stated by Terry Pratchett and Neil Gaiman in their book Good Omens (1990):

It used to be thought that the events that changed the world were things like big bombs, maniac politicians, huge earthquakes, or vast population movements, but it has now been realised that this is a very old-fashioned view held by people totally out of touch with modern thought. The things that really change the world, according to Chaos theory, are the tiny things. A butterfly flaps its wings in the Amazonian jungle, and subsequently a storm ravages half of Europe.

Tiny things matter. But can the movement of a butterfly, weighing roughly the same as a penny, cause a sizeable storm? The answer is complicated: both yes and no – yes, from the perspective of classical physics, and no, from our perspective as human agents.

Allow me to explain.

Consider the act of lighting a match. Conventionally, this act is perceived in a simple, linear fashion – the striking of the match (event A) leads to ignition (event B), ostensibly illustrating what 19th-century philosophers called the ‘law of causality’ – given event A, event B will follow. Simple enough. Until we learn that the law of causality breaks down when scrutinised through the lens of classical physics.

Physics informs us that igniting a match is not solely an outcome of its striking but rather the aggregate effect of a vast multitude of elements. These include the match’s chemical composition, the force exerted in the strike, the presence of oxygen, and many other factors. The critical point is that, from a physical perspective, causation is not a simplistic sequence but a complex interplay of myriad factors, each contributing more or less subtly to the final event.

Thus, in the realm of classical physics, the concept of cause is dramatically broadened, suggesting that nearly every event within an event’s ‘past light cone’ – everything in its past – could be considered causal. To illustrate, consider a tree falling in a forest. The tree’s past light cone encompasses all preceding events that could have influenced its fall: the ‘past light cone’ of an event comprises everything from which information or influence, travelling at or below the speed of light, could have reached it. For the falling tree, the past light cone includes immediate factors like wind, the tree’s health and soil conditions, as well as a multitude of more distant events – from the formation of weather patterns to ecological changes and even distant solar activity impacting Earth’s climate. No matter how seemingly unrelated or remote, each event converges within the tree’s past light cone, contributing to a complex web of causality.

If nearly everything influences everything else, the word ‘cause’ begins to lose its meaning

The philosopher Alyssa Ney summarises the above point with notable clarity. In ‘Physical Causation and Difference-Making’ (2009), Ney writes, assuming we look to physics to ground or understand causality:

[T]here are a lot of causal relations at this world, perhaps a lot more than we ordinarily assume. The fields of our best physical theories are spread out across the entire universe and interact with everything in their reach. They link small events like your leaving the house this morning with those more significant ones transpiring in Iraq a little later and more distant ones farther away in the galaxy. It is not quite true on this picture that ‘everything causes everything’, but things come close.

Bertrand Russell’s arguments in ‘On the Notion of Cause’ (1912-13) complicate the picture of causality in physics even further. Russell attacks the idea of cause and effect altogether. In essence, he argues that if A produces B, and A encompasses the environment (the past light cone of A), this broadens the scope of event A to such an extent that it becomes essentially unrepeatable.

Russell’s argument leads us to a dilemma: to uphold the law of causality, we must define events by noting invariable uniformities and by abstracting away most of the physical influences on A. Yet, this abstraction may inadvertently exclude causal influences, undermining the principle of causality. Thus, Russell asserts two significant conclusions: first, that our conventional notion of causality is not grounded in physics; and second, if notions like ‘cause’ must be reducible to physics, we should eliminate our use of the term ‘cause’.

According to Russell, there is no cause and effect at all.

What does this mean for the butterfly effect? Quite simply, it means that when we look at causality through the lens of physics, the flapping of a butterfly’s wings counts as a contributing cause to a later storm. But so too is everything else within the storm’s past light cone. All flapping butterflies, a breaching whale in the Pacific, a young child playing football in Edinburgh, and the Moon’s gravitational effect all count as causal.

The tension compels us closer to Russell’s radical conclusion – if nearly everything influences everything else, the word ‘cause’ begins to lose its meaning.

Yet, there is a line in the philosophy of causation, traceable through thinkers like R G Collingwood, Nancy Cartwright, Huw Price and James Woodward, which posits that we must locate the notion of cause in human practice by focusing on things like manipulation and control. In this view, ‘causes’ are seen as ‘handles’, things in nature that provide us with a measure of control. This framework emphasises the role of human perspectives in shaping, framing or limiting events, and compels us to consider the extent of our influence in complex systems.

This highlights the distinction between physical causation and how we use the concept of cause to understand and navigate the world. Consider my efforts to prevent a common cold: I focus on controllable factors like diet, sleep and whom I interact with, and I disregard seemingly irrelevant factors like butterflies and distant whale breaches. The crux is this: while remote and uncontrollable factors like the movement of a butterfly can have some minor physical influence, the movement of the butterfly does not make a difference to my physical health. Philosophers often spell this out in terms of probability (I can alter the probability of catching a cold by ensuring I get sufficient sleep, while the probability is unaltered by catching a butterfly and keeping it safely in a jar) or counterfactuals (had I not stayed awake until 4am, I would not have become ill). We reject as absurd the counterfactual: had this particular butterfly not moved from one flower to another, I would not have become ill.

But notice the minor tension even here. If it is true that the movement of a butterfly (or anything in the past light cone) has some effect on my health, is it arbitrary to focus on controllable factors like how much I sleep? No, because we enable basic causal reasoning once we shift our focus from physics to a more practical, human-level perspective. Indeed, it seems to be a central aspect of our ordinary use of ‘cause’. We may want to avoid being injured or getting ill, and our interest leads us to ask a specific set of questions; in turn, this leads us to the fact that much of the world becomes irrelevant. If we want to avoid getting lung cancer or the flu, for example, we will not be interested in the current migration patterns of monarch butterflies or the number of universities in California.

Consider E H Carr’s seminal work What Is History? (1961). In a chapter titled ‘Causation in History’, Carr admits that determinism introduces serious complications in historical analysis. However, he emphasises that historians focus on fruitful generalisations, or what Carr calls ‘real’ causes. To illustrate, imagine that Smith, walking to buy a pack of cigarettes, is killed by a drunk driver speeding around a blind corner. While it is true that had Smith not been a smoker, he would not have died, we cannot generalise the proposition, ‘smoking caused Smith’s death.’ It is much more useful, certainly in the context of history and everyday life, to say that the real cause of Smith’s death was the drunk driver, the speed of the vehicle, or the blind corner. This is why historians cite the Treaty of Versailles or the Nazi invasion of Poland in 1939 as a cause of the Second World War and not Hitler’s being born.

A basic thesis emerges: we can overcome Russell’s problem, the problem of causation in physics, by shifting our perspective. If we look at the world through the ordinary lens of human agency, rather than the lens of physics, we can talk about causes as handles, events within nature that make a difference to some effect and provide us with a sense of control.

Imagine that Sam and Suzy are standing near a fire. Each bystander desires to extinguish the flame. Imagine further that Suzy decides to spray the fire with a hose, and that Sam decides to pray for the fire to go out. From a physical perspective, Sam and Suzy – one spraying and one praying – affect the fire by their mere presence and, thus, by their actions. Yet, from a macro-level human perspective, only one individual affects the fire. That is, only Suzy’s spraying makes a difference to the flames.

Our actions are simultaneously bound by the determinism of physical laws and enriched with intention

When we shift our perspective from physics to agency and difference-making, we land on the most intuitive assessment of the butterfly effect. From our perspective, the butterfly is not a cause of the storm because we cannot affect storms by manipulating butterflies. And while the butterfly could have an effect on a storm, it does not make a difference to the occurrence of storms in a way that we can predict or control.

Exploring the dichotomy between the perspectives of physics and human agency uncovers a paradox: our actions are simultaneously bound by the determinism of physical laws and enriched with intention, purpose and meaning that go beyond them.

To fully appreciate what this means, heed a lesson from Fyodor Dostoyevsky’s great novel, The Brothers Karamazov (1880), which asks how a benevolent God could allow suffering. There is just one virtuous character in the novel, the monk Father Zosima, whose simple teaching, dictated through the genius of Dostoyevsky, sheds light on chaos, causation and difference-making:

See, here you have passed by a small child, passed by in anger, with a foul word, with a wrathful soul; you perhaps did not notice the child, but he saw you, and your unsightly and impious image has remained in his defenceless heart. You did not know it, but you may thereby have planted a bad seed in him, and it may grow, and all because you did not restrain yourself before the child, because you did not nurture in yourself a heedful, active love … for one ought to love not for a chance moment but for all time. Anyone, even a wicked man, can love by chance. My young brother asked forgiveness of the birds: it seems senseless, yet it is right, for all is like an ocean, all flows and connects; touch it in one place and it echoes at the other end of the world.

This Essay was made possible through the support of a grant to Aeon Media from the John Templeton Foundation. The opinions expressed in this publication are those of the author and do not necessarily reflect the views of the Foundation. Funders to Aeon Media are not involved in editorial decision-making.