Quantum Computers Have a Fit
As researchers were snaring the Higgs boson at CERN, the LHC detectors were cranking out gigabytes of data each second. Even with the uninteresting bits filtered out, modern large-scale science creates mind-boggling amounts of data, and standard analysis techniques like curve fitting run into a brick wall. Quantum computing, which harnesses superposition and entanglement to tackle otherwise intractable problems, might have the prescription for this headache. In a paper in Physical Review Letters, Nathan Wiebe at the University of Waterloo, Canada, and colleagues propose an algorithm for performing the data analyzer's best friend, least-squares fitting, on a quantum computer.
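For orientation, here is a minimal sketch of the classical baseline the quantum proposal aims to beat: an ordinary least-squares fit done with NumPy. The synthetic data and the linear model are illustrative choices, not anything from the paper; the point is only that classical solvers must touch every data point, which becomes prohibitive for exponentially large data sets.

```python
# Classical baseline: ordinary least-squares fit of a small synthetic data set.
# (Illustrative only; the data and model are not taken from Wiebe et al.)
import numpy as np

# Synthetic data: y is roughly linear in x, with a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(x.size)

# Design matrix for the model y ~ a*x + b.
F = np.column_stack([x, np.ones_like(x)])

# Solve the least-squares problem min ||F c - y||^2.
coeffs, residuals, rank, _ = np.linalg.lstsq(F, y, rcond=None)
print("fitted coefficients:", coeffs)            # approximately [2.0, 0.5]
print("sum of squared residuals:", residuals[0])  # a measure of fit quality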
The authors build upon earlier theoretical work by Harrow et al. [see Phys. Rev. Lett. 103, 150502 (2009)], which investigated a quantum method for finding expectation values of the solutions to systems of linear equations. Wiebe et al. adapt this algorithm to estimate the quality of a least-squares fit to an exponentially large data set (the kind that stymies classical computers) without first obtaining a full solution and without fully characterizing the state of the quantum computer (a process called quantum state tomography).
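For readers who want the linear-algebra connection spelled out, the sketch below uses generic notation (design matrix F, coefficient vector c) rather than the paper's, and shows why a least-squares fit reduces to the kind of linear system the Harrow et al. algorithm is built for.

```latex
% Least-squares fitting as a linear system (generic notation, not the paper's).
% Fit the model f(x) = \sum_j c_j f_j(x) to data points (x_i, y_i) by
% minimizing the residual:
\[
  \min_{\mathbf{c}} \,\bigl\lVert F\mathbf{c} - \mathbf{y} \bigr\rVert^{2},
  \qquad F_{ij} = f_j(x_i).
\]
% The minimizer satisfies the normal equations, a system of linear equations
% of the type addressed by the Harrow et al. algorithm:
\[
  F^{\dagger} F\,\mathbf{c} = F^{\dagger}\mathbf{y}
  \;\;\Longrightarrow\;\;
  \mathbf{c} = \bigl(F^{\dagger}F\bigr)^{-1} F^{\dagger}\mathbf{y}.
\]
% The quality of the fit can be summarized by the residual norm
% \lVert F\mathbf{c} - \mathbf{y} \rVert, a single number the quantum routine
% can estimate without reading out every component of \mathbf{c}.
```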
When realistic fault-tolerant quantum computing becomes available, the algorithm of Wiebe et al. could be used to find concise, continuous fitting functions for a given bounded approximation error. From a more general perspective, the result not only applies to one of the most widely used analysis techniques in science, but also shows that quantum computing can find use outside of niche applications like prime-number factoring. – David Voss