About four years ago, our team made the pivotal professional decision to focus on quantum chemistry as one of the key applications for quantum computing. Since then, scientists at Xanadu have published multiple papers and learned crucial lessons, some of which we’re sharing with you today. We focus on conceptual insights that inform our research roadmap, valuable to experts and non-experts alike.
Contents
- Lesson 1: There is always a speed versus accuracy tradeoff.
- Lesson 2: Don’t use Hartree–Fock as the initial state.
- Lesson 3: Quantum phase estimation is a method to improve classical energy estimates.
- Lesson 4: There is a bandwidth problem.
- Final thoughts
Lesson 1: There is always a speed versus accuracy tradeoff.
Imagine that your job is to factor the number 27,843,913. There is no sense in which this can be done approximately — you need to figure out that the prime factors are precisely 6,211 and 4,483.
This is not true for numerical simulations: it is always possible to sacrifice accuracy for speed, and to sacrifice speed for accuracy.
We can illustrate this in terms of tradeoff curves. Depending on the properties of the system and the available simulation algorithms — which can be classical or quantum — it may be worthwhile to trade accuracy for speed, or vice versa. The curve may also suggest that it’s best to aim for maximum accuracy or maximum speed.
Figure: three illustrative tradeoff curves. (a) Here it is worthwhile to sacrifice accuracy for speed, and speed for accuracy. (b) In this case, it is best to aim for maximum accuracy or maximum speed. (c) Large gains in speed are possible by slightly reducing accuracy, but it is unappealing to aim for very high speeds.
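For readers who prefer code to pictures, here is a rough sketch that recreates the three regimes. The functional forms are invented assumptions, chosen only to match the qualitative descriptions above:

```python
import numpy as np
import matplotlib.pyplot as plt

# Three illustrative accuracy-vs-speed tradeoff curves. The functional
# forms are made up; only the qualitative shapes matter.
speed = np.linspace(0, 1, 200)
curves = {
    "(a) Trading in both directions pays off": np.sqrt(1 - speed**2),
    "(b) Aim for max accuracy or max speed": (1 - speed) ** 4,
    "(c) Slight accuracy loss buys big speedups": 1 - speed**8,
}

fig, axes = plt.subplots(1, 3, figsize=(12, 3), sharey=True)
for ax, (title, accuracy) in zip(axes, curves.items()):
    ax.plot(speed, accuracy)
    ax.set_title(title, fontsize=9)
    ax.set_xlabel("speed")
axes[0].set_ylabel("accuracy")
plt.tight_layout()
plt.show()
```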
So while it is probably hopeless to factor very large numbers without a quantum computer, scientists are already able to simulate all types of molecules and materials. They are only limited by the accuracy of the simulations and the size of the systems they can tackle.
This makes assessing quantum advantage for quantum chemistry very tricky.
We can’t just talk about quantum speedups, because classical algorithms can be faster if they sacrifice accuracy; nor can we just talk about higher accuracy, since classical methods can increase accuracy with more costly algorithms. Even worse, we can’t just talk about cost for a given accuracy since the specific tradeoffs depend on the system of interest! The only option is to do a case-by-case analysis, carefully considering all possible algorithms and tradeoffs on each side. This is complicated by the fact that we don’t always have reliable benchmarks for comparison.
The conclusion is that quantum algorithms for quantum chemistry will not simply outperform classical methods across the board; they will play a role in an ecosystem of simulation techniques, each with their own strengths and weaknesses. Our job as researchers is to understand exactly where quantum computers could shine and how they can best be combined with existing classical techniques. Evidence currently points to medium-sized systems with strong electronic correlations that cannot be accurately described by classical methods that truncate the underlying Hilbert space. You can find some of these systems in our list of top 20 molecules for quantum computing.
Lesson 2: Don’t use Hartree–Fock as the initial state.
Most quantum algorithms for chemistry require an initial state. Countless papers have chosen the Hartree–Fock state. This is usually not the best choice.
It is almost always worthwhile to prepare more sophisticated initial states.
Preparing advanced states is more work, but the quality of the state compensates for it. This is especially true for large-scale quantum algorithms targeting quantum advantage: the full procedure is complicated enough that we can afford to prepare more advanced states. A better initial state means fewer repetitions of the main algorithm, leading to an overall cost reduction and improved performance.
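To put numbers on “fewer repetitions”: when phase estimation is repeated until the ground-state energy is sampled, the expected number of repetitions scales as 1/p, where p is the squared overlap of the initial state with the ground state. A quick back-of-the-envelope calculation, with illustrative overlap values:

```python
# Expected number of QPE repetitions to sample the ground-state energy
# scales as 1/p, where p is the squared overlap of the initial state
# with the ground state. The values of p below are illustrative.
for p in (0.01, 0.1, 0.5, 0.9):
    print(f"squared overlap p = {p:4}: ~{1 / p:6.1f} repetitions on average")
```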
Our recent paper on initial state preparation introduces state-of-the-art techniques based on the “sum-of-Slaters” and matrix product state representations. Tentatively, we identify the density matrix renormalization group algorithm (DMRG) as the leading general method. Consider using DMRG to prepare your initial state. The DMRG package block2 and Xanadu's Overlapper library combined with PennyLane state preparation functionality are useful resources for this purpose.
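As a minimal sketch of the idea, here is how a truncated sum-of-Slaters expansion could be loaded into PennyLane. The determinants and coefficients below are placeholders; in practice they would come from a DMRG or CI calculation via block2 and Overlapper:

```python
import numpy as np
import pennylane as qml

# Placeholder sum-of-Slaters expansion: each key is an occupation-number
# bitstring (Jordan-Wigner mapping, one qubit per spin orbital) and each
# value is its coefficient. Real coefficients would come from a truncated
# DMRG or CI wavefunction, e.g. via block2 and Overlapper.
slater_terms = {
    "111100": 0.98,   # Hartree-Fock-like reference determinant
    "110011": -0.15,  # a double excitation
    "001111": 0.12,   # another double excitation
}

n_qubits = 6
state = np.zeros(2**n_qubits)
for bits, coeff in slater_terms.items():
    state[int(bits, 2)] = coeff
state /= np.linalg.norm(state)  # renormalize the truncated expansion

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def initial_state():
    # Compile the dense vector into a state-preparation circuit.
    qml.StatePrep(state, wires=range(n_qubits))
    return qml.state()

prepared = initial_state()
```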
Classical methods based on wave functions, like DMRG and coupled-cluster, have many properties in common with quantum algorithms. Compared to density functional theory, these wave function techniques are more expensive but more accurate, and their errors can be better controlled. This is similar to what quantum algorithms can offer.
We can thus pinpoint some potential applications of quantum computing by identifying problems where wavefunction methods are already in widespread use; especially for calculating ground- and excited-state energies.
Lesson 3: Quantum phase estimation is a method to improve classical energy estimates.
A prevailing notion in the community is that ground-state energies can be calculated by running quantum phase estimation many times to ensure a high chance of sampling the ground-state energy. The probability of success on any given run is the squared overlap of the initial state with the true ground state.
The issue is that this perspective leads to an apparent “no-win” scenario for quantum computing. As argued in a recent paper, due to the orthogonality catastrophe, an initial state's overlap with the ground state decreases exponentially with system size. Bad news. On the other hand, if the initial state has a large overlap, it typically already provides a good estimate of the ground-state energy, making the quantum algorithm less attractive. More bad news. The implication is that small and large overlaps are both problematic for quantum computing: a seemingly contradictory conclusion.
The resolution to this problem is that we should think in terms of energy distributions, not overlaps.
Quantum phase estimation is doing something extraordinary: it samples eigenvalues of the Hamiltonian without having to diagonalize it. These energies are sampled from a distribution determined by the initial state. The average of this distribution is the expectation value of the Hamiltonian in the initial state, which is precisely the energy estimate that would be obtained classically.
Figure: an example energy distribution. The average energy from the initial state is not as good as the best classical estimate, but there is a considerable chance of sampling an even lower energy, as shown by the shaded area.
Since exactly computing ground-state energies is so difficult, the general goal is instead to find good upper bounds for them. We obtain an improvement from quantum phase estimation every time an energy smaller than the average is observed. If we get really lucky, this may be precisely the ground-state energy. But even if we don’t hit the jackpot, quantum phase estimation can systematically improve the classical estimates given sufficiently many samples. Great news. More repetitions increase the chance of observing outliers with very low energy. Ideal candidates for quantum improvements are “Goldilocks” molecules where classical estimates are neither too good nor too bad; they are just right to ensure a high chance of substantial improvement. Regardless, as long as classical estimates fall short, quantum phase estimation provides a pathway to improve them.
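A toy numerical experiment makes the sampling picture concrete. The Hamiltonian and initial state below are arbitrary stand-ins; the point is only that the lowest sampled energy can beat the classical expectation value:

```python
import numpy as np

rng = np.random.default_rng(7)

# Arbitrary 6x6 Hermitian matrix standing in for a molecular Hamiltonian.
A = rng.normal(size=(6, 6))
H = (A + A.T) / 2
evals, evecs = np.linalg.eigh(H)

# An imperfect initial state: the ground state plus random contamination.
psi0 = evecs[:, 0] + 0.6 * rng.normal(size=6)
psi0 /= np.linalg.norm(psi0)

# QPE samples eigenvalue E_k with probability |<E_k|psi0>|^2.
probs = np.abs(evecs.T @ psi0) ** 2
samples = rng.choice(evals, size=500, p=probs)

print("classical estimate <psi0|H|psi0>:", psi0 @ H @ psi0)
print("lowest QPE sample:              ", samples.min())
print("true ground-state energy:       ", evals[0])
```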
Lesson 4: There is a bandwidth problem.
The naive perspective on chemistry applications is that quantum computers are memory-efficient: we need N qubits to represent any state of N orbitals, while we would need many more bits classically — exponentially many in a worst-case scenario. We actually get this answer in technical interviews sometimes.
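To make the naive counting concrete: storing a generic wavefunction over N orbitals as a dense classical vector, the worst case mentioned above, takes 2^N complex amplitudes. A rough sketch:

```python
# Worst-case classical memory for a generic wavefunction on N orbitals:
# 2**N complex amplitudes at 16 bytes each.
for n in (20, 30, 40, 50):
    gigabytes = 16 * 2**n / 1e9
    print(f"N = {n}: {gigabytes:.3g} GB")
```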
This memory argument is incomplete, because we also care about the time it takes to carry out a quantum algorithm.
It can be difficult to encode information in a quantum state (like an approximate ground state or Hamiltonian coefficients) and to extract information (like an expectation value). Borrowing from a paper by colleagues at Microsoft, we like to refer to this as the bandwidth problem for quantum computing.
This is one of the most important challenges in quantum algorithms: we need better methods to prepare states, and we need better methods to extract information from those states. In particular, sampling-based extraction is problematic, since the extremely large number of repetitions needed to reach accurate estimates can negate the potential advantages of quantum algorithms. A potential solution is to build quantum supercomputers with thousands of independent QPUs, where massive parallelization lets us sample more efficiently by trading time for space.
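As a back-of-the-envelope illustration of why sampling is so punishing, assume standard shot-noise scaling, where reaching precision ε requires roughly (σ/ε)² samples. The numbers below (unit single-shot standard deviation, chemical-accuracy target) are assumptions:

```python
# Shot-noise scaling for sampling-based estimation: N ~ (sigma/epsilon)^2.
sigma = 1.0        # assumed std. dev. of single-shot estimates (hartree)
epsilon = 1.6e-3   # target precision: chemical accuracy, ~1.6 mHa

shots = (sigma / epsilon) ** 2
print(f"single QPU:  ~{shots:.1e} samples")

# Trading time for space: split the samples across many independent QPUs.
n_qpus = 1000
print(f"{n_qpus} QPUs: ~{shots / n_qpus:.1e} samples each")
```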
Final thoughts
Part of our technical research strategy is to put these lessons on quantum computing for quantum chemistry into action. Understanding tradeoffs in simulation has motivated us to develop software tools combining classical and quantum methods: a differentiable Hartree–Fock solver, fermionic operator arithmetic, the Overlapper library for initial state preparation, and the GradDFT library for training neural functionals. We continue to investigate better methods for initial state preparation, especially using quantum algorithms. We have also refined the search for quantum advantage by identifying “Goldilocks” molecules where classical estimates are off, but initial states are good enough to provide a substantial chance of quantum improvements. To tackle the bandwidth problem, we are exploring new algorithms and applications that compute chemical properties using fewer samples.
The scientific importance of quantum computing is undeniable, but open questions remain regarding commercial applications. Addressing this challenge requires innovations across hardware, error correction, algorithms, software, applications, and business development. Xanadu is working relentlessly to solve these problems and build quantum computers that are useful and available to people everywhere.
About the author
Juan Miguel Arrazola
Making quantum computers useful