By Fred Chong, Seymour Goodman Professor, University of Chicago
Quantum computing is the only technology in which every device (qubit) we add to a machine doubles the machine's potential computing power. If we can overcome the challenges of developing practical algorithms, software and machines, quantum computing could solve problems whose computational cost grows too quickly (exponentially in the size of the input) for classical machines.
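To make that doubling concrete, here is a minimal Python sketch (plain NumPy, no quantum framework; the function name add_qubit is my own) that tracks the state vector a classical simulator would need to store. Tensoring on each new qubit doubles the number of amplitudes:

```python
import numpy as np

def add_qubit(state, qubit=np.array([1.0, 0.0])):
    # Tensor a new qubit (default |0>) onto the existing state.
    # The result is twice as long: every added qubit doubles the
    # number of amplitudes a classical simulator must track.
    return np.kron(state, qubit)

state = np.array([1.0, 0.0])  # one qubit in the |0> state
for n in range(1, 6):
    print(f"{n} qubit(s): state vector of length {len(state)}")
    state = add_qubit(state)
```

Five qubits already require 32 amplitudes; 50 qubits would require about a quadrillion, which is why each added qubit matters so much.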
In the short term, quantum computing will change our understanding of sciences that fundamentally rely on the behavior of electrons, such as quantum chemistry. A classical computer needs a number of bits that grows exponentially with the number of electrons to model their positions and how they change. Nature, of course, uses only one electron to "model" each electron in a molecule. Quantum computers will need only a small (constant) number of quantum bits (qubits) to model each electron. This video by our EPiQC teammate, Ken Brown, gives a great example of understanding electrons using the state-of-the-art in quantum chemistry algorithms.
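As a rough back-of-the-envelope sketch (not the algorithm in Ken Brown's video; classical_amplitudes and quantum_qubits are illustrative names, and the one-qubit-per-spin-orbital count assumes a Jordan-Wigner-style encoding), this compares the classical and quantum bookkeeping:

```python
def classical_amplitudes(spin_orbitals):
    # A full classical state description keeps one amplitude per basis
    # state: 2^n numbers for n spin-orbitals (exponential growth).
    return 2 ** spin_orbitals

def quantum_qubits(spin_orbitals):
    # Under a Jordan-Wigner-style encoding, a quantum computer needs
    # roughly one qubit per spin-orbital (linear growth).
    return spin_orbitals

for n in (10, 20, 50):
    print(f"{n} spin-orbitals: ~{classical_amplitudes(n):,} classical "
          f"amplitudes vs {quantum_qubits(n)} qubits")
```

The gap is the whole story: at 50 spin-orbitals, the classical description needs roughly 10^15 numbers, while the quantum machine needs about 50 qubits.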