Sensitive magnetic detectors are essential for applications ranging from land-mine clearance to imaging the magnetic activity of our brains, as well as for fundamental tests of the symmetry of nature's laws. Superconductor-based magnetometers have long been the primary devices for ultrasensitive detection. But atomic magnetometers, based on vapors of atoms such as rubidium, have recently begun to offer comparable or better sensitivity, with the advantage of not requiring bulky and expensive cryogenic cooling. Until now, however, the most sensitive schemes could only work in a highly shielded environment, screened from Earth's own magnetic field. Writing in Physical Review Letters, Dong Sheng at Princeton University, New Jersey, and co-workers report an atomic magnetometer that can detect fields one hundred billion times smaller than Earth's while operating in a finite field.
Atomic magnetometers detect how an external magnetic field splits internal atomic levels into different spin states through the Zeeman effect. Typically, a pump laser "polarizes" the atoms by populating specific spin states, and a probe laser reads out the spin precession, yielding a signal that is proportional to the magnetic field. Sheng et al. introduce two key improvements. First, they use a multipass cell in which the probe laser beam passes many times through the rubidium vapor, enhancing the measured signal. Second, they use a fast time-resolved setup that completes the measurement within a millisecond of laser pumping, before the mechanisms that cause spin relaxation (the ultimate limit to noise in these systems) kick in. The demonstrated sensitivity, on par with that of the best available sensors, is achieved without the need to operate in a close-to-zero magnetic field. – Matteo Rini
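The detection principle described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' analysis: it assumes a textbook gyromagnetic ratio for rubidium-87 (gamma/2pi of roughly 7 Hz per nanotesla) and a hypothetical relaxation time T2, and shows why the spin-precession signal must be read out quickly, before relaxation erodes it.

```python
import math

# Illustrative constant (not from the paper): the 87Rb ground-state
# gyromagnetic ratio is roughly gamma/2pi ~ 7 Hz per nanotesla.
GAMMA_OVER_2PI = 7.0  # Hz/nT

def larmor_frequency(b_nT):
    """Spin-precession (Larmor) frequency in Hz for a field of b_nT nanotesla."""
    return GAMMA_OVER_2PI * b_nT

def probe_signal(t, b_nT, t2=0.01, amplitude=1.0):
    """Toy model of the probe-laser signal: an oscillation at the Larmor
    frequency, damped by spin relaxation with time constant t2 (seconds).
    Relaxation is what ultimately limits the measurement."""
    f = larmor_frequency(b_nT)
    return amplitude * math.exp(-t / t2) * math.sin(2 * math.pi * f * t)

# In a 100 nT field the spins precess at about 700 Hz, so a measurement
# completed within a millisecond of pumping still sees nearly the full
# signal amplitude (exp(-0.001/0.01) ~ 0.90 of it, in this toy model).
print(larmor_frequency(100.0))             # 700.0
print(round(probe_signal(1e-3, 100.0), 3))
```

The numbers here are placeholders chosen to make the scaling visible; the actual cell, field, and relaxation parameters in the experiment differ.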