The Universe may have begun as a quantum event, but astrophysical and terrestrial phenomena today are distinctly classical in nature, with no hint of quantum indefiniteness. The ultimate reason, a new report in Physical Review Letters suggests, may be a tiny but ubiquitous background of gravitational waves left over from the big bang. These waves randomly disturb quantum states everywhere enough to ensure that real-life Schrödinger’s cats are never seen.
Quantum mechanics allows the existence of the strange states known as superpositions, in which, say, an electron can be both spin up and spin down at the same time. While single-particle superpositions can be fairly stable, theorists have long wondered why larger objects never seem to appear in such states, a puzzle exemplified by Erwin Schrödinger’s famous half-alive, half-dead cat. Many theorists now believe that macroscopic superpositions, in which numerous quantum components must maintain a precise relationship with one another, are disrupted by continual environmental influences. Such disturbances, acting differently on each component of a superposition, “decohere” it into a classical state that is, say, dead or alive, but not both. Even a system as small as an atom requires extraordinary protection from stray electromagnetic fields in the lab to remain in a superposition.
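The mechanism described above can be illustrated with a generic toy model (this is not Blencowe's calculation, just a standard sketch of phase decoherence): the environment applies a random relative phase to one component of a two-state superposition, and averaging over many such kicks shrinks the "coherence" (the off-diagonal density-matrix element, 0.5 for a pure equal superposition) toward zero.

```python
import cmath
import random

random.seed(0)

# Toy model: a superposition (|0> + |1>)/sqrt(2). The environment kicks |1>
# by a random phase relative to |0>; averaging over many realizations
# suppresses the off-diagonal (coherence) term of the density matrix.
n_runs = 10_000

for sigma in (0.0, 0.5, 2.0):  # width of the random phase kicks, in radians
    total = sum(cmath.exp(1j * random.gauss(0.0, sigma)) for _ in range(n_runs))
    coherence = 0.5 * abs(total) / n_runs  # 0.5 means fully coherent, 0 means classical
    print(f"kick width {sigma}: coherence ~ {coherence:.3f}")
```

With no kicks the coherence stays at 0.5; as the kicks strengthen it decays (for Gaussian phases the average falls as exp(-sigma^2/2)), which is the sense in which environmental noise turns a superposition into an ordinary classical mixture.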
Since gravitational fields are both pervasive and inescapable, researchers have proposed that they play a fundamental role in ensuring that macroscopic systems behave in a classical way. Miles Blencowe of Dartmouth College in Hanover, New Hampshire, has now calculated the decoherence caused by the cosmic background of gravitational radiation, small ripples in spacetime representing “echoes” of the big bang. This background is a cousin to the more familiar cosmic microwave background but is thought to be at a somewhat lower temperature. Although these gravitational waves would rarely be the largest environmental influence on an object, Blencowe wanted to see whether they alone would guarantee that macroscopic quantum superpositions would never be seen in the wild.
A gravitational wave represents a disturbance in space and time that, as it travels across a superposition, can push its components out of sync. To get a sense of the size of such effects, Blencowe studied a simple model of an object that could be placed in a superposition. He theoretically constructed an atomic-scale blob of mass and energy consisting of an arrangement of a fictitious field. He then imagined this blob prepared in a superposition of a ground state and an excited state separated by one electron volt (eV) in energy and estimated how quickly gravitational background radiation would upset it. The answer: not quickly at all. The superposition would survive for vastly longer than the age of the Universe before decohering.
This result isn’t surprising, because the interaction of weak gravitational waves with a small object like an atom is tiny. But Blencowe showed that the decoherence rate rises with the square of the energy difference between the two states in a superposition. For a larger object, equivalent to Avogadro’s number of atoms prepared in an overall superposition, that difference would be roughly 6 × 10²³ eV, and the decoherence time would shrink to a minuscule fraction of a second.
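The scaling argument above is simple enough to check by hand. A back-of-the-envelope sketch (the 1 eV per atom and the quadratic rate law come from the text; the rest is just arithmetic, not the paper's full calculation):

```python
# Scaling argument from the text: the decoherence rate grows as the
# square of the energy difference between the two superposed states.

AVOGADRO = 6.022e23            # atoms in the macroscopic object

delta_e_atom = 1.0             # single-atom energy splitting, in eV (from the text)
delta_e_macro = AVOGADRO * delta_e_atom   # overall splitting, about 6 x 10^23 eV

# Rate ~ (energy difference)^2, so the macroscopic superposition
# decoheres faster than the single-atom one by this enormous factor.
speedup = (delta_e_macro / delta_e_atom) ** 2
print(f"macroscopic energy difference: {delta_e_macro:.2e} eV")
print(f"decoherence rate increase: {speedup:.2e}x")
```

The rate increase of roughly 10⁴⁷ is what turns a decoherence time far longer than the age of the Universe for a single atom into an essentially instantaneous collapse for a macroscopic object.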
In inflationary models of cosmology, the gravitational radiation background is likely to be more complicated than the simple form assumed by Blencowe, says Claus Kiefer of the University of Cologne, Germany. Still, he says, whatever gravitational radiation exists would lead to decoherence in the way Blencowe describes. The new result is “certainly interesting because it shows the ubiquitous action of decoherence,” says Kiefer.