Quantum common sense

Despite its confounding reputation, quantum mechanics both guides and helps explain human intuition

by Philip Ball

Quantum theory contradicts common sense. Everyone who has even a modest interest in physics quickly gets this message. The quantum view of reality, we’re often told, is a madhouse of particles that become waves (and vice versa), and that speak to one another through spooky messages that defy normal conceptions of time and space. We think the world is made from solid, discrete objects – trees and dogs and tables – things that have objective properties that we can all agree on; but in quantum mechanics the whole concept of classical objects with well-defined identities seems not to exist. Sounds ridiculous? The much-lauded physicist Richard Feynman thought so, yet he implored us to learn to live with it. ‘I hope you can accept Nature as She is – absurd,’ he said in 1985.

Except that much of the popular picture is wrong. Quantum theory doesn’t actually say that particles can become waves or communicate in spooky ways, and it certainly does not say that classical objects don’t exist. Not only does it not deny the existence of classical objects, it gives a meaningful account of why they do exist. In some important respects, the modern formulation of the theory reveals why common sense looks the way it does. You could say that the classical world is simply what quantum mechanics looks like if you are six feet tall. Our world, and our intuition, are quantum all the way up.

Why, then, is it still so common to find talk of quantum mechanics defying logic and generally messing with reality? We might have to put some of the blame on the Danish physicist Niels Bohr. He was probably the deepest thinker about the meaning of quantum theory among its founding pioneers, and his intuitions were usually right. But during the 1920s and ’30s, Bohr drove a lasting wedge between the quantum and classical worlds. They operate according to quite different principles, he said, and we simply have to accept that.

According to Bohr, what quantum mechanics tells us is not how the world is, but what we’ll find when we make measurements. The mathematical machinery of the theory gives us the probabilities of the various possible outcomes. When we make a measurement, we get just one of those possibilities, but there’s no telling which; nature’s selection is random. The quantum world is probabilistic, whereas the classical world (which is where all of our measurements happen) contains only unique outcomes. Why? That’s just how things are, Bohr answered, and it is fruitless to expect quantum mechanics to supply deeper answers. It tells us (with unflagging reliability) what to expect. What more do you want?

Bohr’s ‘Copenhagen interpretation’ – named after the location of the physics institute he founded in 1921 – didn’t exactly declare a contradiction between classical and quantum physics, but it implied an incompatibility that Bohr patched over with a mantra of what he called ‘complementarity’. The classical and quantum worlds are complementary aspects of reality, he said: there’s common sense and there’s quantum sense, but you can’t have both – at least, not at the same time.

The principle of complementarity seemed a deeply unsatisfying compromise to many physicists, since it not only evaded difficult questions about the nature of reality but essentially forbade them. Still, complementarity had at least the virtue of pinpointing where the problems lay: in understanding what we mean by measurement. It is through measurement that objects become things rather than possibilities – and furthermore, they become things with definite states, positions, velocities and other properties. In other words, that’s how the counterintuitive quantum world gives way to common-sense experience. What we needed to unite the quantum and classical views, then, was a proper theory of measurement. There things languished for a long time.

Now we have that theory. Not a complete one, mind you, and the partial version still doesn’t make the apparent strangeness of quantum rules go away. But it does enable us to see why those rules lead to the world we experience; it allows us to move past the confounding either/or choice of Bohr’s complementarity. The boundary between quantum and classical turns out not to be a chasm after all, but a sensible, traceable path.

It’s a strange idea that measurement needs explaining at all. Usually what we mean by a measurement seems so trivial that we don’t even ask the question. A ball has a position, or a speed, or a mass. I can measure those things, and the things I measure are the properties of the ball. What more is there to say?

But in the quantum world things aren’t so obvious. There, the position of a particle is nothing more than a whole set of possible positions until the moment when it is observed. The same holds true for any other aspect of the particle. How does the multitude of potential properties in a quantum object turn into one specific reading on a measuring device? What is it about the object that caused the device to point to that precise answer? The modern answer is surprising: the act of measurement doesn’t entail a collapse of quantum-ness and a shift to classical-ness after all.

Quantum objects have a wave nature – which is to say, the theory tells us that they can be described as if they were waves, albeit waves of a peculiar sort. The waves do not move through any physical substance, as do waves in air or water, but are encoded in a purely mathematical object called a wave function that can be converted to probabilities of values of observable quantities.

As a result, quantum particles (such as photons of light, electrons, atoms, or even entire molecules) can exhibit interference, a classical property of waves in which two peaks reinforce each other when they overlap, whereas when a peak coincides with a trough the two can cancel each other out. It’s hard to talk about this phenomenon without giving the impression that the particles themselves are somehow wavy, and the unfortunate expression ‘wave-particle duality’ only compounds the confusion. But all we’re really seeing here is a feature of the particles’ wave functions, for want of a better term. Asking if these quantum objects really are particles or waves misses the point, because both of those are classical concepts. The reason we ask anyway is that we’re trying instinctively to recover some common-sense picture of the quantum world. But what we call ‘common sense’ is a feature of the classical world, and we can’t expect to use it for quantum things.
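A minimal numerical sketch of the interference just described (illustrative only, not part of the article’s argument; it assumes Python with the numpy library): two waves added crest-to-crest reinforce each other, while shifting one wave by half a cycle makes the pair cancel.

```python
import numpy as np

# Two waves of equal amplitude, combined with different relative phases.
x = np.linspace(0, 2 * np.pi, 1000)
wave = np.sin(x)

constructive = wave + np.sin(x)          # crests meet crests: amplitude doubles
destructive = wave + np.sin(x + np.pi)   # crests meet troughs: the waves cancel

print(round(float(np.max(np.abs(constructive))), 2))  # ~2.0: reinforcement
print(round(float(np.max(np.abs(destructive))), 2))   # ~0.0: cancellation
```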

Quantum effects such as interference rely on the wave functions of different entities being coordinated (the technical term is coherent) with one another. If they’re not, the effects are averaged away. That sort of coherence is what permits the quantum property of superposition, in which particles are said to be in two or more states at once. Again, they’re not really in two states at once – we don’t know how best to describe what they really are in a classical sense. But if the wave functions of those states are coherent, then both states remain possible outcomes of a measurement.

If their wave functions are not coherent, two states cannot interfere, nor maintain a superposition. The process called decoherence therefore destroys these fundamentally quantum properties, and the states behave more like distinct classical systems. Macroscopic objects don’t display quantum interference or exist as superpositions because they can’t be described by coherent wave functions. This – and not sheer size per se – is the fundamental dividing line between what we think of as quantum versus classical (familiar) behaviour. Quantum coherence is essentially what defines ‘quantum-ness’.
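To make ‘coherence’ a little more concrete, here is a small illustrative sketch (again assuming Python with numpy, and a single two-state system rather than anything realistic). In the standard density-matrix description, the off-diagonal entries record the phase relationship between the two states; erasing them, which is what decoherence does, is exactly what removes the interference, while the ordinary probabilities on the diagonal are left untouched.

```python
import numpy as np

# A single two-state system ('qubit') in an equal superposition of states 0 and 1.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Coherent superposition: the off-diagonal terms record the phase relationship.
rho_coherent = np.outer(plus, plus)

# Full decoherence: the off-diagonal terms are wiped out, leaving a 50/50
# classical mixture with no phase relationship at all.
rho_decohered = np.diag(np.diag(rho_coherent))

# Probability of finding the system in the superposition state itself
# (the kind of measurement in which interference shows up).
p_coherent = plus @ rho_coherent @ plus
p_decohered = plus @ rho_decohered @ plus

print(round(float(p_coherent), 2))   # 1.0: interference fully visible
print(round(float(p_decohered), 2))  # 0.5: interference averaged away
```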

What, though, causes decoherence? This arises because of a long-neglected aspect of quantum entities: their environment. The way a quantum system behaves and evolves can depend crucially on the fact that it doesn’t exist in isolation. The environment is what conjures classical physics – and ‘common-sense’ behaviour – out of the quantum soup.

There’s no obvious reason why decoherence couldn’t have been understood by Bohr and his peers in the early days of quantum mechanics, because it involves nothing but the basic principles of quantum theory. It was probably neglected largely because that’s what usually happens in science. Researchers figure that they can focus in on the system they’re interested in, and either ignore its surroundings totally or relegate them to a minor background perturbation. Usually that works fine. But not if we want to observe anything about the quantum world.

The foundations of decoherence theory were laid in the 1970s by the German physicist H Dieter Zeh. Even then it was largely ignored until two papers on the ‘decoherence programme’ the following decade, by Wojciech Zurek at the Los Alamos National Laboratory in New Mexico, brought it to a wide audience. Polish by birth and exuberantly curly haired, Zurek displays a laconic calm in the face of the mind-boggling aspects of quantum mechanics that he has uncovered. That composure makes sense once you appreciate that he studied under John Wheeler, the near-legendary American physicist who himself worked with Bohr and had a rare talent for the wry epigram. (He coined the term wormhole and popularised the concept of black holes.)

Zurek has become one of the key architects and advocates of decoherence theory, helping to establish it as the central concept connecting the quantum and classical worlds. This connection comes from the fact that quantum coherence is contagious. If one quantum object interacts with another, they become linked into a composite superposition: in some sense, they become a single system. This is, in fact, the only thing that can happen in such an interaction, according to quantum mechanics. The two objects are then said to be entangled. It might sound spooky, but this is merely what happens when a quantum system interacts with its environment – as a photon of light or an air molecule bounces off it, say. As a result, coherence spreads into the environment.

In theory, there is no end to this process. An entangled air molecule hits another, and the second molecule gets drawn into the entangled state. Meanwhile, other particles hit the initial quantum system, too. As time passes, the system becomes more and more entangled with its environment, which means that it can’t be broken down into separate entities any more.

This spreading of entanglement is the thing that destroys the manifestation of coherence in the original quantum system. Because superposition becomes a shared property of the system and its environment, we can’t any longer see the superposition just by looking at the little part of that shared state corresponding to the original system. We can’t see the wood for the trees, you might say. Decoherence is not actually a loss of superposition and coherence, but rather a loss of our ability to detect these things in the original system.
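The ‘can’t see the wood for the trees’ point can be put in the same toy terms. In the sketch below (illustrative only; a textbook CNOT gate stands in for a photon or air molecule bouncing off the system), the joint state of system plus environment remains a perfectly coherent superposition, yet the system viewed on its own has lost its visible coherence.

```python
import numpy as np

# System qubit in a superposition; one 'environment' qubit starts in state 0.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
env0 = np.array([1.0, 0.0])
joint = np.kron(plus, env0)          # system (x) environment, not yet entangled

# A CNOT-style 'copying' interaction: the environment picks up a record of the
# system, turning |+>|0> into the entangled state (|00> + |11>)/sqrt(2).
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
entangled = cnot @ joint

# Look at the system alone by summing ('tracing') over the environment.
rho_joint = np.outer(entangled, entangled).reshape(2, 2, 2, 2)
rho_system = np.trace(rho_joint, axis1=1, axis2=3)

print(rho_system)
# [[0.5 0. ]
#  [0.  0.5]]  The joint state is still a superposition, but the coherence is
#  no longer visible in the system by itself: that is decoherence.
```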

Only by looking closely at the states of all the entangled particles can we deduce that they’re in a superposition. And how can we possibly hope to do that – to monitor every photon that bounces off the original system, every air molecule that collided with it and then subsequently with others? The pieces of the puzzle have been scattered so widely that they are lost, for all practical purposes, even though in principle they are still out there, and remain so (as far as quantum mechanics tells us) indefinitely. That’s the essence of what decoherence is: a loss of (personally) meaningful coherence. It is a gradual and real process that occurs at a particular rate.

Quantum mechanics allows us to calculate that rate, so that we can put the theory of decoherence to the test. Serge Haroche and colleagues at the École Normale Supérieure in Paris first did that in 1996 by measuring decoherence of an atom held in a device called a ‘light trap’ and interacting with photons. The loss of interference between states of the atom owing to decoherence, as calculated from quantum theory, matched the experimental observations perfectly. And in 2003 a team at the University of Vienna led by Anton Zeilinger and Markus Arndt watched interference vanish between the quantum waves of large molecules, as they altered the rate of decoherence by gradually admitting a background gas into the chamber where the interference took place, so that the gas molecules would collide with those in the matter waves. Again, theory and experiment tallied well.

Decoherence is a phenomenally efficient process, probably the most efficient one known to science. For a dust grain one hundredth of a millimetre across floating in air, it takes about 10⁻³¹ seconds: millions of times faster than the passage of a photon of light across a single proton! Even in the near-isolation of interstellar space, the ubiquitous photons of the cosmic microwave background – the Big Bang’s afterglow – will decohere such a grain in about one second.

So, for objects approaching the macroscopic scale under ordinary conditions, decoherence is, to all practical purposes, inevitable and instantaneous: you can’t keep them looking ‘quantum’. It’s almost as if the laws of quantum physics that make the world are contrived to hide those very laws from anything much bigger than atom-sized, tricking us into thinking that things just have to be the way we experience them. But if we watch nature carefully enough, we can see how the trick is done.

Notice that this effect of decoherence has nothing to do with observation in the normal sense. To turn quantum to classical, we don’t need a conscious mind to measure or look; we just need an environment full of stuff. With or without us, the Universe is always looking.

The decay of quantum superposition and interference by decoherence is only the first element in a quantum theory of measurement, however. We also have to explain why classical measuring instruments register the values they do. Exactly how we define a superposition state depends on how we choose to write the maths. From the quantum perspective, all states are equally valid solutions to the equations. So why do some of these states survive decoherence and get translated into those unambiguous readouts, or ‘pointer states’, in a measuring device, while others don’t? Why do we see the common-sense states but not the imponderable superpositions?

There are two parts to the answer. First, it turns out that the decoherence-inducing interaction with the environment doesn’t just squash quantumness indiscriminately. It specifically selects states that have particular mathematical properties of symmetry, and trashes the others. Zurek calls this environment-induced selection or einselection. In this way, ‘the environment functions not just as a garbage dump, but as a communication channel’, he says.
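Environment-induced selection can be glimpsed in the same toy model. In the sketch below (illustrative only, and not Zurek’s actual analysis), the same ‘copying’ interaction with one environment qubit leaves the states labelled 0 and 1 completely untouched, while a superposition of them is degraded into a mixture: 0 and 1 are the surviving pointer states for this particular interaction.

```python
import numpy as np

zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (zero + one) / np.sqrt(2)

# The environment qubit flips only if the system qubit is in state 1,
# so it ends up holding a copy of the system's 0/1 value.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def purity_after_interaction(system_state):
    """Purity of the system once the environment has interacted with it:
    1.0 means the state came through intact, 0.5 means it is fully mixed."""
    final = cnot @ np.kron(system_state, zero)        # environment starts in 0
    rho = np.outer(final, final).reshape(2, 2, 2, 2)
    rho_system = np.trace(rho, axis1=1, axis2=3)      # ignore the environment
    return float(np.trace(rho_system @ rho_system))

for name, state in [('state 0', zero), ('state 1', one), ('superposition', plus)]:
    print(name, purity_after_interaction(state))
# States 0 and 1 keep purity 1.0: they are the pointer states this environment
# selects. The superposition drops to 0.5: it is what decoherence removes.
```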

It’s not enough, though, for a quantum state to survive decoherence in order for us to be able to measure it. Survival means that the state is measurable in principle – but we still have to get at that information to detect the state. So we need to ask how that information becomes available to an experimenter. (Really, who’d have thought there is so much to the mere act of observation?)

Here’s the exciting answer: it’s precisely because a quantum system interacts with its environment that it leaves an imprint on a classical measuring device at all. If we were able, with some amazing instrument, to record the trajectories of all the air molecules bouncing off the speck of dust, we could figure out where the speck is without looking at it directly; we could just monitor the imprint it leaves on its environment. And this is, in effect, all we are doing whenever we determine the position, or any other property, of anything: we’re detecting not the object itself, but the effect it creates.

Just as coupling the object to its environment sets decoherence in train, so too it imprints information about the object onto the environment, creating a kind of replica. A measurement of that object then amounts to acquiring this information from the replica.

A detailed theoretical analysis of decoherence carried out by Zurek and his colleagues shows that some quantum states are better than others at producing these replicas: they leave a more robust footprint, which is to say, more copies. These robust states are the ones that we can measure, and that ultimately produce a unique classical signature from the underlying quantum morass. You could say that it’s only the ‘fittest’ states that survive the decoherence process by producing abundant copies in the environment. Zurek fittingly calls the idea ‘Quantum Darwinism’.

Just as in nature, fitness here is determined both by the entity and its environment. Some environments are good at inducing decoherence of a quantum object but not at retaining reliable, sharply defined replicas of it. The collisions of air molecules are like this. Yes, you could reconstruct where an object is from the trajectories of air molecules bouncing off it, but only if you could collect that information before it gets scrambled by the molecules subsequently colliding with one another.

Photons, on the other hand, are much better at retaining an imprint, because they don’t generally interact with one another after they have bounced off the object, so the information they carry away doesn’t get messed up so easily. It’s no coincidence that vision is a reliable and widespread way that organisms find out about their environment! Smell, which relies on the passage of odorant molecules through the busy, jostling air, is less good. Some animals use it when vision won’t work well (at night, say), but the smeller has to sniff out a wandering, diffusing trail rather than just seeing the target and heading for it.

Zurek and his colleague Jess Riedel have been able to calculate how fast and extensive this proliferation of quantum copies is for a few simple situations, such as a dust speck in a vacuum flooded by sunlight. They find that, after being illuminated for just one microsecond, a grain of dust a micrometre across will have its location imprinted about 100 million times in the scattered photons.

It’s because of this multiple imprinting that such objects seem to have objective, classical-like properties at all. Ten observers, say, can separately measure the position of a dust grain and all agree that it’s in the same location. Each observation consumes a different replica of the grain in the reflected photons. In this view, we can assign an objective position to the speck not because it truly ‘has’ such a position (whatever that means), but because its position state can imprint many indistinguishable replicas in the environment. What we take as obvious common sense turns out to have secure yet far-from-obvious underpinning in quantum theory.
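A toy version of that redundancy (illustrative only, and vastly cruder than the calculation by Zurek and Riedel mentioned above): give one system qubit three ‘environment’ qubits that have each picked up a copy of its state, and any one of them, read on its own, reports the same answer as the system itself.

```python
import numpy as np

# One system qubit (index 0) and three 'environment' qubits (indices 1-3) in
# the GHZ-like state (|0000> + |1111>)/sqrt(2) that repeated copying
# interactions would produce.
n = 4
amps = np.zeros([2] * n)
amps[(0,) * n] = amps[(1,) * n] = 1 / np.sqrt(2)

probs = np.abs(amps) ** 2   # probability of each joint measurement outcome

for k in range(1, n):
    # How often does environment qubit k agree with the system qubit?
    agree = sum(p for idx, p in np.ndenumerate(probs) if idx[0] == idx[k])
    print(f'environment qubit {k} matches the system with probability {agree:.1f}')
# Each line prints 1.0: any single fragment of the environment carries its own
# readable record of the system's state, which is why independent observers can
# consult different fragments and still agree on what they find.
```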

There’s a seemingly bizarre corollary to this picture. When we measure a property of a system by probing its replica in the environment, we destroy that replica. Might we then potentially use up all the copies by repeated measurement, so that the state can’t any longer be observed? Yes we can: too much measurement will ultimately make the state seem to vanish.

But we needn’t be perplexed by the finite number of replicas. It just tells us that if we keep poking at a system to find out about it, eventually we’ll perturb it into another state. That’s completely consistent with our experience. Sure, you can gaze for as long as you like at a coffee mug without altering it in any substantial way. But you can’t do that to an Old Master painting, for the pigments will fade under too much light: you will alter their state. And if what you examine is small enough – an electron, say – the reflection of even a single photon becomes a big deal, so you don’t have many replicas to capture before you end up seeing a different state.

What Quantum Darwinism tells us is that, fundamentally, the issue is not really about whether probing physically disturbs what is probed (although that can happen). It is the gathering of information that alters the picture. Through decoherence, the Universe retains selected highlights of the quantum world, and those highlights have exactly the features that we have learnt to expect from the classical world. We come along and sweep up that information – and in the process we destroy it, one copy at a time.

Decoherence doesn’t completely neutralise the puzzle of quantum mechanics. Most importantly, although it shows how the probabilities inherent in the quantum wave function get pared down to classical-like particulars, it does not resolve the issue of uniqueness: why, out of the possible outcomes of a measurement that survive decoherence, we see only one of them. Some researchers feel compelled to add this as an extra (you might say ‘super-common-sensical’) axiom: they define reality as quantum theory plus uniqueness.

All the same, thanks to the theory of decoherence, we no longer have to make quantum measurement some magical and mysterious event that crystallises knowledge. We have a mathematical theory to explain how information gets out of the quantum system and into the macroscopic apparatus. We can use the theory to calculate how quickly that happens, and how robustly. We have, at long last, a theory of measurement. What’s more, it is a theory that confers no privileged status on the conscious observer, stripping away the seemingly mystical veneer from quantum mechanics.

There’s no longer any need for Bohr’s arbitrary division of the world into the microscopic, where quantum mechanics rules, and macroscopic, which is necessarily classical. Now we can see not only that they are a continuum, but also that classical physics is just a special case of quantum physics. Regarded this way, common sense is a direct and utterly sensible outgrowth of quantum sense.

This quantum theory of measurement is a reversal of the usual way that science works. We normally take our human common sense and experience for granted, and work back from it to deduce more fundamental physical behaviours. Sure, what we discover that way might sometimes seem a long way from common sense – heliocentrism, Higgs bosons, black holes, etc. But we typically get to those points by taking it for granted that there is an uncomplicated relationship between what we measure and what is there.

Decoherence theory doesn’t take that common-sense view of measurement for granted. It starts by accepting that the world is fundamentally governed by quantum rules, which seem at face value to run deeply counter to experience, and then it works upwards to see if it can recover common sense. Remarkably, it can.

That is why the quantum theory of measurement can be thought of as nothing less than a ‘theory of common sense’. Decoherence theory explains where common sense comes from – namely, out of principles that seem very far from common-sensical. The challenge, then, is for all of us to reconcile our instinctive common sense with its quantum origins. But we no longer have to regard the two as being in conflict, since they are not only consistent but inextricably linked.

We can seek solace in the knowledge that the conflict between classical and quantum is not in the physics. It’s just in our minds.