Information theory says that every physical system, from a glass of water to a microchip, holds 1s and 0s in the states of its component particles. Changes in those states could be called “computation,” just as your desktop machine computes by changing the information in its memory. In the 10 June print issue of *PRL*, one researcher pushes this idea to the limit, calculating the total number of bits in the Universe and how many computations they could have performed since the big bang. In one interpretation, these numbers outline a grand computation the Universe itself is performing.

Two years ago, Seth Lloyd of MIT studied the properties of the “ultimate laptop,” a hypothetical device whose speed and memory would be limited solely by physical laws [1]. He had to consider two things for this calculation: energy and entropy. It takes energy to turn a 0 into a 1 by inverting a nuclear spin or switching on a transistor; and entropy has a well-known relation to the number of states in a system, each of which embodies a bit.
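The energy side of that limit comes from the Margolus-Levitin theorem, which bounds the operation rate of a system with average energy E at 2E/(πħ) operations per second. A minimal back-of-envelope sketch, using round values for the physical constants (the specific numbers here are illustrative, not Lloyd's exact inputs):

```python
import math

hbar = 1.05e-34   # J*s, reduced Planck constant (rounded)
c = 3.0e8         # m/s, speed of light (rounded)

# Margolus-Levitin theorem: a system with average energy E can perform
# at most 2E / (pi * hbar) elementary operations per second.
mass = 1.0                      # kg, the hypothetical "ultimate laptop"
energy = mass * c**2            # devote all rest energy to computation
ops_per_second = 2 * energy / (math.pi * hbar)
print(f"{ops_per_second:.1e} ops/s")   # on the order of 10^50 ops per second
```

Devoting the full rest energy of one kilogram to computation gives roughly 5 x 10^50 operations per second, the speed limit Lloyd derived for his ultimate laptop.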

Now Lloyd has applied the same approach to the Universe as a whole. “The ideas are cosmic but the physics is mundane,” he says, involving mostly well-established laws. The Universe’s energy is locked up primarily in matter according to $E = mc^2$, and is basically constant. Taking rough estimates of the age and energy density of the Universe, and assuming that gravity’s total energy is equal in magnitude to that of matter, he finds that the Universe could have performed $10^{120}$ basic operations, or ops, on all its bits so far.
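The arithmetic behind that figure can be sketched in a few lines. This is a rough reconstruction with assumed round numbers (cosmic density, horizon size, age), not Lloyd's precise calculation; it lands within an order of magnitude of his $10^{120}$:

```python
import math

hbar = 1.05e-34          # J*s, reduced Planck constant
c = 3.0e8                # m/s, speed of light
t_universe = 4.3e17      # s, roughly 13.7 billion years (assumed)
rho = 1.0e-26            # kg/m^3, rough cosmic mass density (assumed)

r_horizon = c * t_universe                  # m, crude horizon radius
volume = (4 / 3) * math.pi * r_horizon**3   # m^3, horizon volume
energy = rho * volume * c**2                # J, E = m c^2 for enclosed matter

# Margolus-Levitin bound applied to the whole Universe:
ops_per_second = 2 * energy / (math.pi * hbar)
total_ops = ops_per_second * t_universe
print(f"~10^{int(math.log10(total_ops))} ops since the big bang")
```

With these round inputs the total comes out near $10^{121}$, consistent at the order-of-magnitude level with the $10^{120}$ quoted in the article.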

Because entropy is closely related to temperature, he also imagines maxing out the entropy by turning all of the Universe’s matter into radiation and using the blackbody radiation formula to get a temperature. Subtotal: $10^{90}$ bits. If gravitational fields can contribute entropy, as theorists believe, the total could be much larger. Theorists deduced years ago that a black hole’s entropy is proportional to its surface area. A more recent conjecture, that the Universe itself stores information this way, leads Lloyd to a speculative grand total of $10^{120}$ bits.
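Both bit counts can be sketched the same way. The version below uses assumed round values for the Universe's energy, volume, and horizon radius, so it only reproduces the article's figures to within an order of magnitude or two:

```python
import math

k_B = 1.38e-23    # J/K, Boltzmann constant
a = 7.56e-16      # J m^-3 K^-4, radiation constant for u = a T^4
hbar = 1.05e-34   # J*s, reduced Planck constant
c = 3.0e8         # m/s, speed of light
G = 6.67e-11      # m^3 kg^-1 s^-2, gravitational constant

volume = 9.0e78   # m^3, crude horizon volume (assumed)
energy = 8.0e69   # J, rest energy of matter within the horizon (assumed)

# Convert all energy to blackbody radiation: u = a T^4 fixes the temperature,
# S = (4/3) a T^3 V is the radiation entropy, and bits = S / (k_B ln 2).
T = (energy / volume / a) ** 0.25
S_rad = (4 / 3) * a * T**3 * volume
bits_rad = S_rad / (k_B * math.log(2))

# Holographic conjecture: bits scale with the horizon area in Planck units,
# roughly area / (4 * l_p^2 * ln 2), by analogy with black hole entropy.
l_p = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
r_horizon = 1.3e26                 # m, crude horizon radius (assumed)
bits_holo = 4 * math.pi * r_horizon**2 / (4 * l_p**2 * math.log(2))

print(f"radiation: ~10^{int(math.log10(bits_rad))} bits, "
      f"holographic: ~10^{int(math.log10(bits_holo))} bits")
```

The radiation estimate comes out near $10^{91}$ and the holographic one near $10^{122}$, in the same ballpark as the article's $10^{90}$ and $10^{120}$.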

So what does any of this mean? At the very least, the numbers represent the most computing the Universe could have supported, or the minimum size of a computation for simulating the Universe, Lloyd notes. More interestingly, he says, the fact that the ops calculation from information theory and the bits calculation from quantum gravity gave equivalent results “suggests that we can go on being hopeful that there’s a connection out there” between quantum gravity and computation. The Universe’s program could be thought of as the primordial quantum fluctuations that seeded the formation of galaxies, he says. “It’s not running Windows or Linux, and we hope that it never shall.”

For Y. Jack Ng of the University of North Carolina in Chapel Hill, recent results indicating that quantum computers could simulate any physical system make it logical to think of the cosmos as processing information. Taking the quantum gravity aspects of Lloyd’s calculation at face value, he points out, nature has performed an average of only a single op on every bit. So “it appears that the Universe is not the most efficient computer.” Let’s just hope it doesn’t crash.

JR Minkel is a freelance science writer in New York City.

### References

- S. Lloyd, Nature (London) **406**, 1047 (2000).