Questioning the rules of the game
We know how to use the “rules” of quantum physics to build lasers, microchips, and nuclear power plants, but when students question the rules themselves, the best answer we can give is often, “The world just happens to be that way.” Yet why are individual outcomes in quantum measurements random? What is the origin of the Schrödinger equation? In a paper [1] appearing in Physical Review A, Giulio Chiribella at the Perimeter Institute in Waterloo, Canada, and Giacomo Mauro D’Ariano and Paolo Perinotti at the University of Pavia, Italy, offer a framework in which to answer these penetrating questions. They show that by making six fundamental assumptions about how information is processed, they can derive quantum theory. (Strictly speaking, their derivation only applies to systems that can be constructed from a finite number of quantum states, such as spin.) In this sense, Chiribella et al.’s work is in the spirit of John Wheeler’s belief that one obtains “it from bit,” in other words, that our account of the universe is constructed from bits of information, and the rules on how that information can be obtained determine the “meaning” of what we call particles and fields.
Instead of taking the meaning of quantum theory from the mathematics one uses to calculate wave functions and energy levels, principles-based reconstructions attempt to extract the meaning together with the formalism by deriving the theory from deeper physical principles [2]. In the past, the vast majority of attempts to find a set of physical principles behind quantum theory (most notably within the quantum logic approach of the sixties) either fell short of uniquely deriving quantum theory or rested on abstract mathematical assumptions that themselves called for a more convincing physical motivation. The rise of quantum information science increased the awareness that information—the key concept for understanding, for example, why unknown quantum states cannot be cloned, or how quantum state teleportation is possible—plays a more fundamental role in quantum physics than in classical physics [3].
In his seminal work from 2001, Lucien Hardy (now at the Perimeter Institute) reopened the field by deriving quantum theory from five “reasonable” axioms [4]. Hardy’s reconstruction was developed entirely within what is called the operational approach: instead of using notions like position, momentum, or energy from “traditional” physics, the focus is placed on primitive laboratory operations [5], such as how a state is prepared, transformed, and measured (Fig. 1). In this picture, the state of a system is determined by the preparation procedure and is the mathematical object from which one can calculate the probability of any conceivable measurement outcome. Pure states are those that cannot be written as probabilistic mixtures of other states; they correspond to a situation of maximal knowledge about the system’s preparation. Hardy’s reconstruction, however, left open the uncomfortable possibility that quantum theory is just the “simplest” theory in a hierarchy of probabilistic theories, in which each “lower” theory is a special case of the “higher” one. (This is analogous to classical probability theory being a special case of quantum theory.) It was later proven [5], and the proof further sharpened [6], that of all the theories in the hierarchy, only quantum theory is consistent with the notion of entanglement—a cornerstone of quantum theory.
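As a concrete illustration of this operational picture in the quantum case (a standard textbook rendering, not part of Hardy’s axioms, which do not presuppose the Born rule but derive it), a preparation is summarized by a density matrix and an outcome by an effect:

```latex
% Illustration: the state as a catalogue of measurement probabilities.
% A preparation is summarized by a density matrix \rho, an outcome by an effect E.
\[
  p(E \mid \rho) = \mathrm{Tr}(\rho E), \qquad 0 \le E \le \mathbb{1} .
\]
% A pure state is extremal, \rho = |\psi\rangle\langle\psi|; a mixed state admits a
% nontrivial probabilistic decomposition, e.g.
\[
  \rho = \lambda\,\rho_1 + (1-\lambda)\,\rho_2, \qquad 0 < \lambda < 1, \quad \rho_1 \neq \rho_2 .
\]
```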
Still using the operational approach, Chiribella et al. nonetheless follow a completely different route to deriving quantum theory [1]. They assume five new elementary axioms—causality, perfect distinguishability, ideal compression, local distinguishability, and pure conditioning—which define a broad class of theories of information processing. For example, the causality axiom—stating that one cannot signal from future measurements to past preparations—is so basic that it is usually assumed a priori. Both classical and quantum theory fulfil the five axioms. What is significant about Chiribella et al.’s work is that they show that a sixth axiom—the assumption that every state has what they call a “purification”—is what singles out quantum theory within this class. In fact, this last axiom is so important that they call it a postulate. The purification postulate can be defined formally (see below), but to understand its meaning in simple words, we can look to Schrödinger, who in describing entanglement gave the essence of the postulate: “Maximal knowledge of a total system does not necessarily include maximal knowledge of all its parts.” (Formally, the purification postulate states that every mixed state ρ_A of a system A can always be seen as the state of part of a composite system AB that is itself in a pure state Ψ_AB. This pure state is called the “purification” of ρ_A and is assumed to be unique up to a reversible transformation on B.)
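To make the postulate concrete, here is a standard textbook example from the quantum formalism (an illustration, not taken from the paper itself): a mixed qubit state and one of its purifications.

```latex
% A mixed qubit state ...
\[
  \rho_A = p\,|0\rangle\langle 0| + (1-p)\,|1\rangle\langle 1|, \qquad 0 < p < 1,
\]
% ... is the marginal of the pure two-qubit state
\[
  |\Psi\rangle_{AB} = \sqrt{p}\,|0\rangle_A|0\rangle_B + \sqrt{1-p}\,|1\rangle_A|1\rangle_B,
  \qquad
  \rho_A = \mathrm{Tr}_B\,|\Psi\rangle_{AB}\langle\Psi| .
\]
% Any other purification of \rho_A differs from |\Psi\rangle_{AB} only by a reversible
% (unitary) transformation acting on B alone.
```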
Chiribella et al. conclude there is only one way in which a theory can satisfy the purification postulate: it must contain entangled states. (The other option, that the theory contains no mixed states at all, that is, that the probabilities of outcomes in any measurement are either 0 or 1 as in a classical deterministic theory, cannot hold, as one can always prepare mixed states by mixing deterministic ones.) The purification postulate alone allows some of the key features of quantum information processing to be derived, such as the no-cloning theorem and teleportation [7]. By combining this postulate with the other five axioms, Chiribella et al. were able to derive the entire mathematical formalism of quantum theory.
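A one-line check, sketched within the standard quantum formalism, of why the postulate forces entanglement:

```latex
% If a purification were a product state, the marginal on A would itself be pure:
\[
  |\Psi\rangle_{AB} = |\phi\rangle_A \otimes |\chi\rangle_B
  \;\Rightarrow\;
  \rho_A = \mathrm{Tr}_B\,|\Psi\rangle_{AB}\langle\Psi| = |\phi\rangle_A\langle\phi| .
\]
% Hence a genuinely mixed \rho_A (which always exists, since deterministic
% preparations can be mixed) can only be purified by an entangled |\Psi\rangle_{AB}.
```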
But what is the deeper meaning of the purification postulate? After all, we will have learned little about quantum theory if we derive it from axioms that are equally opaque. A possible answer to this question may be found in a little-known, unpublished paper written by Heisenberg in 1935 [8,9], titled “Is a deterministic completion of quantum mechanics possible?”, in which he outlined his own response to the famous Einstein, Podolsky, and Rosen paper of the same year. In it, Heisenberg argued that it is necessary to make an epistemological divide between the “system” and the “measurement device,” a divide he referred to as the “cut.” Heisenberg was asking whether the prediction for the outcome of a measurement made by a classical device on a quantum system remains the same if, instead, both the system and the device measuring it are described by quantum wave functions and are measured by yet another device: “At what place should one draw the cut between the description by wavefunctions and the classical description? The answer to this question is: the quantum mechanical predictions about the outcome of an arbitrary experiment are independent of the location of the cut just discussed.”
Heisenberg’s statement can be understood in terms of the purification postulate. Any measurement on the “system” can be viewed as a measurement on the “measurement device,” where a composite of the two is in a suitable pure state. Hence, the prediction for the measurement is the same, irrespective of where we put the “cut”—immediately after the “system” or only after the “measurement device.” In a theory that is probabilistic and at the same time universal, in the sense that a pure state can be ascribed to any system, the purification postulate ensures consistency of probability assignments independently of what the observer chooses to consider as “system under observation.”
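In the quantum formalism, this cut independence can be checked with the textbook von Neumann measurement model (a standard illustration, not drawn from Heisenberg’s manuscript or from Ref. [1]):

```latex
% Cut placed after the system: the Born rule is applied directly,
\[
  |\psi\rangle = \sum_i c_i\,|i\rangle
  \quad\Rightarrow\quad
  p(i) = |c_i|^2 .
\]
% Cut placed after the measurement device: the device is treated quantum mechanically,
% and the pre-measurement interaction correlates its pointer states with the system,
\[
  U\big(|\psi\rangle \otimes |A_0\rangle\big) = \sum_i c_i\,|i\rangle \otimes |A_i\rangle,
  \qquad \langle A_i | A_j \rangle = \delta_{ij} .
\]
% Reading the pointer |A_i\rangle with a second device again yields p(i) = |c_i|^2:
% the prediction does not depend on where the cut is placed.
```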
Having principles from which one can reconstruct a known physical theory is fine, but can this help us search for new physics? As in any axiomatic reconstruction, one can ask how Chiribella et al.’s results change when the principles are weakened or modified. The most radical generalization of their work [1] would be to drop the assumption of causality. Research developing frameworks that do not presume an underlying spacetime or fixed causal structure is under way [10–12] and will likely have consequences for the program of merging quantum theory and general relativity.
References
1. G. Chiribella, G. M. D’Ariano, and P. Perinotti, Phys. Rev. A 84, 012311 (2011)
2. A. Zeilinger, Found. Phys. 29, 631 (1999); Č. Brukner and A. Zeilinger, in Time, Quantum and Information, edited by L. Castell and O. Ischebeck (Springer, New York, 2003)
3. C. Fuchs, in Proceedings of the NATO Advanced Research Workshop on Decoherence and its Implications in Quantum Computation and Information Transfer, Mykonos, Greece, 2000, edited by A. Gonis (IOS Press, Amsterdam, 2001); arXiv:quant-ph/0106166
4. L. Hardy, arXiv:quant-ph/0101012 (2001)
5. B. Dakic and Č. Brukner, in Deep Beauty, edited by H. Halvorson (Cambridge University Press, New York, 2011); arXiv:0911.0695 (2009)
6. L. Masanes and M. P. Mueller, New J. Phys. 13, 063001 (2011)
7. G. Chiribella, G. M. D’Ariano, and P. Perinotti, Phys. Rev. A 81, 062348 (2010)
8. W. Pauli, Wissenschaftlicher Briefwechsel mit Bohr, Einstein, Heisenberg, Vol. 2: 1930–1939, edited by K. von Meyenn, A. Hermann, and V. F. Weisskopf (Springer, Berlin, 1985)
9. For an English translation of Heisenberg’s manuscript with a brief introduction and bibliography, see E. Crull and G. Bacciagaluppi, http://philsci-archive.pitt.edu/8590/
10. L. Hardy, arXiv:gr-qc/0509120 (2005)
11. G. Chiribella, G. M. D’Ariano, P. Perinotti, and B. Valiron, arXiv:0912.0195 (2009)
12. O. Oreshkov, F. Costa, and Č. Brukner, arXiv:1105.4464 (2011)