
September 09, 2024

What is the time-ordered exponential and why you should stop using it

What is the time-ordered exponential operator? Why do people like the time-ordered exponential? And why should we stop using the time-ordered exponential? Time-ordering for time-dependent Hamiltonian evolution is not necessary and just overcomplicates the whole topic for the sake of having a neat-looking function, which is not actually a function.

In this highly opinionated blog post, I want to show some of the misleading implications the formula brings and argue that we should abolish it in undergrad teaching.

Contents

What is the time-ordered exponential operator?
Why do people use the time-ordered exponential?
Why should we stop using the time-ordered exponential operator?
Conclusion

What is the time-ordered exponential operator?

The dynamics of a pure quantum state $|\psi(t)\rangle$ is determined by the Schrödinger equation (SE)

$$ i \frac{d}{dt} |\psi(t)\rangle = H(t)\, |\psi(t)\rangle, $$

where $H(t)$ is the (time-dependent) Hamiltonian of the system and we set $\hbar = 1$. With the ansatz $|\psi(t)\rangle = U(t)\, |\psi_0\rangle$ for some initial state $|\psi_0\rangle$ we can rewrite the equation equivalently to the matrix equation:

$$ i \frac{d}{dt} U(t) = H(t)\, U(t), \qquad U(0) = \mathbb{1}. $$

When the system Hamiltonian has no explicit time-dependence, $H(t) = H$, we can directly write the solution as a matrix exponential of the system Hamiltonian,

$$ U(t) = e^{-i t H}. $$

Inserting this back into the SE we get

$$ i \frac{d}{dt} e^{-i t H} = i\, (-i H)\, e^{-i t H} = H\, e^{-i t H} = H\, U(t) $$

and see that it is a valid solution.
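As a quick numerical check of this statement, here is a minimal sketch that verifies $i\,\frac{d}{dt}U(t) = H\, U(t)$ for $U(t) = e^{-itH}$ via a finite difference, assuming a made-up single-qubit Hamiltonian $H = 0.3\, Z + 0.7\, X$:

import jax
import jax.numpy as jnp
from jax.scipy.linalg import expm

jax.config.update("jax_enable_x64", True)

# made-up time-independent Hamiltonian H = 0.3 Z + 0.7 X
Z = jnp.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
X = jnp.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
H = 0.3 * Z + 0.7 * X

U = lambda s: expm(-1j * s * H)

# compare i dU/dt (central finite difference) with H U(t) at t = 0.5
t, dt = 0.5, 1e-6
lhs = 1j * (U(t + dt) - U(t - dt)) / (2 * dt)
rhs = H @ U(t)
print(jnp.allclose(lhs, rhs, atol=1e-8))  # True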


Things get significantly more complicated when the Hamiltonian is explicitly time-dependent. Simply replacing $t H$ in the matrix exponential with $\int_0^t dt_1\, H(t_1)$ does not do the trick anymore:

$$ i \frac{d}{dt}\, e^{-i \int_0^t dt_1\, H(t_1)} \neq H(t)\, e^{-i \int_0^t dt_1\, H(t_1)} \quad \text{in general}. $$
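One way to see where this breaks down: differentiating the exponential series term by term, already the quadratic term gives

$$ \frac{d}{dt} \left( \int_0^t ds\, H(s) \right)^2 = H(t) \int_0^t ds\, H(s) + \int_0^t ds\, H(s)\, H(t), $$

which only collapses to $2\, H(t) \int_0^t ds\, H(s)$ if $H(t)$ commutes with $\int_0^t ds\, H(s)$. The same obstruction appears in every higher-order term, so the convenient rule $\frac{d}{dt} e^{A(t)} = A'(t)\, e^{A(t)}$ is only guaranteed when $A(t)$ and $A'(t)$ commute.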

One correct solution is given by the so-called Dyson series,

$$ U(t) = \sum_{n=0}^{\infty} (-i)^n \int_0^t dt_1 \int_0^{t_1} dt_2 \cdots \int_0^{t_{n-1}} dt_n\, H(t_1)\, H(t_2) \cdots H(t_n). $$
Full derivation of the Dyson series

To derive the Dyson series, we first rewrite the initial value problem of the SE, $i \frac{d}{dt} U(t) = H(t)\, U(t)$ and $U(0) = \mathbb{1}$, as the equivalent integral equation

$$ U(t) = \mathbb{1} - i \int_0^t dt_1\, H(t_1)\, U(t_1). $$
To see the correctness we can differentiate and obtain back the SE,

$$ \frac{d}{dt} U(t) = -i\, H(t)\, U(t), $$

as well as the initial value, because $\int_0^0 dt_1\, H(t_1)\, U(t_1) = 0$ and hence $U(0) = \mathbb{1}$.

In the integral equation, $U(t)$ enters on both the left and right hand side of the equation. We can construct better and better solutions by iteratively substituting in better approximations of $U(t)$. For tiny $t$, a 0th order approximation is $U^{(0)}(t) = \mathbb{1}$. We can substitute that in the integral equation to obtain the 1st order approximation

$$ U^{(1)}(t) = \mathbb{1} - i \int_0^t dt_1\, H(t_1). $$
We can substitute $U^{(1)}(t)$ again in the integral equation to obtain the 2nd order approximation

$$ U^{(2)}(t) = \mathbb{1} - i \int_0^t dt_1\, H(t_1) + (-i)^2 \int_0^t dt_1 \int_0^{t_1} dt_2\, H(t_1)\, H(t_2). $$
Continuing this process we obtain the n-th order approximation, given by the Dyson sum

$$ U^{(n)}(t) = \sum_{m=0}^{n} (-i)^m \int_0^t dt_1 \int_0^{t_1} dt_2 \cdots \int_0^{t_{m-1}} dt_m\, H(t_1) \cdots H(t_m). $$
The exact solution is obtained in the limit $n \to \infty$, the Dyson series.


For the case that $H(t) = H$ is time-independent, the Hamiltonians can be pulled out of the nested integrals, which then evaluate to $\int_0^t dt_1 \int_0^{t_1} dt_2 \cdots \int_0^{t_{n-1}} dt_n = \frac{t^n}{n!}$. We get

$$ U(t) = \sum_{n=0}^{\infty} \frac{(-i t)^n}{n!}\, H^n = e^{-i t H} $$

and retrieve the standard matrix exponential for $U(t)$.
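To make the iterative construction concrete, here is a minimal numerical sketch of the Picard iteration behind the Dyson series, assuming the single-qubit Hamiltonian $H(t) = \cos(t)\, Z + \sin(t)\, X$ that is also used in the example further below, and a crude Riemann-sum discretization of the integrals:

import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

Z = jnp.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
X = jnp.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)

def H(t):
    return jnp.cos(t) * Z + jnp.sin(t) * X

# discretize [0, t] on a fine grid
t_final, N = 0.5, 4000
ts = jnp.linspace(0.0, t_final, N)
dt = ts[1] - ts[0]
Hs = jax.vmap(H)(ts)  # H evaluated on the grid, shape (N, 2, 2)
eye = jnp.broadcast_to(jnp.eye(2, dtype=complex), (N, 2, 2))

# Picard iteration: U_{k+1}(t) = 1 - i * int_0^t H(s) U_k(s) ds
Us = eye
for _ in range(8):
    integrand = jnp.einsum("nij,njk->nik", Hs, Us)
    Us = eye - 1j * jnp.cumsum(integrand, axis=0) * dt

print(Us[-1])  # converges to the exact U(0.5), up to the Riemann-sum error

Note that no time-ordering operator appears anywhere in this construction.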

Until this point, all is well. But you can spoil everyone's fun and make things unnecessarily complicated. The goal will now be to rewrite the Dyson series into something that resembles a matrix exponential with some imagination. This comes at a high cost though, as we need to introduce the time-ordering operator $\mathcal{T}$,

$$ \mathcal{T}\left[ H(t_1)\, H(t_2) \cdots H(t_n) \right] = H(t_{\sigma(1)})\, H(t_{\sigma(2)}) \cdots H(t_{\sigma(n)}) \quad \text{with } t_{\sigma(1)} \geq t_{\sigma(2)} \geq \cdots \geq t_{\sigma(n)}, $$

that re-arranges any series of time-dependent Hamiltonians such that later times always stand to the left of earlier times. The time-ordering operator allows us to shift the bounds of the integrals in the Dyson series (see section 4 here) to

$$ U(t) = \sum_{n=0}^{\infty} \frac{(-i)^n}{n!} \int_0^t dt_1 \int_0^t dt_2 \cdots \int_0^t dt_n\, \mathcal{T}\left[ H(t_1)\, H(t_2) \cdots H(t_n) \right]. $$

Here, the reduction of the series to the matrix exponential when the Hamiltonian is time-independent is much more evident. You can think of the integral part as the integral $\int_0^t dt_1\, H(t_1)$ raised to the power $n$, with the caveat that the products have to remain time-ordered.
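For the $n = 2$ term, for instance, the equivalence of the two forms can be checked explicitly:

$$ \int_0^t dt_1 \int_0^{t_1} dt_2\, H(t_1)\, H(t_2) = \frac{1}{2!} \int_0^t dt_1 \int_0^t dt_2\, \mathcal{T}\left[ H(t_1)\, H(t_2) \right], $$

since the square $[0, t]^2$ splits into the two triangles $t_2 \leq t_1$ and $t_1 \leq t_2$, and the time-ordering makes both triangles contribute the same time-ordered product. The factor $\frac{1}{n!}$ compensates for the $n!$ orderings of the integration variables in general.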

An aside on something that is wrong but helps with the intuition

Do not use this, as it yields the exact same problems that I am going to lament below. It helps, however, to understand the intuition behind defining the time-ordered matrix exponential, as we would obtain

$$ U(t) = \mathcal{T}\left[ \sum_{n=0}^{\infty} \frac{(-i)^n}{n!} \left( \int_0^t dt_1\, H(t_1) \right)^n \right] = \mathcal{T}\left[ e^{-i \int_0^t dt_1\, H(t_1)} \right]. $$


This motivates the definition of the time-ordered matrix exponential that simply summarizes this whole term,

$$ U(t) = \mathcal{T}\, e^{-i \int_0^t dt_1\, H(t_1)}. $$

Why do people use the time-ordered exponential?

The time-ordered exponential is a neat formula that resembles the matrix exponential from the time-independent case and allows us to write down a formal solution to the time-dependent case in a concise fashion.

It further provides some intuition about the dimensions of $H(t)$, as the exponent should be dimensionless. It also makes it clear that an operator that commutes with $H(t)$ at all times $t$ also commutes with the evolution unitary. To some degree, it could even be argued that it motivates writing the gradient of time-evolutions (pulses) in terms of an effective generator, as we did in the ODEgen PennyLane Demo.
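The commutation statement can also be read off directly from the Dyson series: if $[A, H(t)] = 0$ for all $t$, then $A$ can be pulled through every factor of every term,

$$ A\, H(t_1)\, H(t_2) \cdots H(t_n) = H(t_1)\, H(t_2) \cdots H(t_n)\, A, $$

and hence $[A, U(t)] = 0$.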

However, arguably these can also all be deduced from drawing parallels with the time-independent case.

Why should we stop using the time-ordered exponential operator?

Misleading notation

Stated as-is, without the context of the Dyson series, the formula can lead to all kinds of confusion and wrong results.

Let us walk through an explicit example with the time-dependent Hamiltonian $H(t) = \cos(t)\, Z + \sin(t)\, X$. Let us assume we are interested in the time evolution up to time $t = 0.5$. We can obtain the correct result by employing qml.evolve, which internally solves the SE numerically.

import pennylane as qml
import jax.numpy as jnp
import jax

jax.config.update("jax_enable_x64", True)

# time-dependent coefficients of H(t) = cos(t) Z + sin(t) X
def sin(p, t):
    return jnp.sin(t)

def cos(p, t):
    return jnp.cos(t)

H = cos * qml.Z(0) + sin * qml.X(0)

t = 0.5
Ut = qml.evolve(H)([[], []], t=t)  # numerically solve the SE from 0 to t
Ut_m = qml.matrix(Ut)

However, we can also easily integrate the Hamiltonian directly and evaluate $\int_0^t dt_1\, H(t_1)$ for our fixed $t = 0.5$. So naively one could think that the solution to the SE is just

$$ U(t) = \mathcal{T}\left[ e^{-i \int_0^t dt_1\, H(t_1)} \right]. $$

The argument on the right hand side is a concrete, numerical matrix. How do we interpret this formula?

One possible interpretation is that this is the time-ordering operator applied to the matrix exponential. Trivially, for a concrete matrix there is nothing to order and we would just have $\mathcal{T}\left[ e^{-i \int_0^t dt_1\, H(t_1)} \right] = e^{-i \int_0^t dt_1\, H(t_1)}$. This leads to wrong results, as can be seen in the following example.

>>> int_H = jnp.sin(t) * qml.Z(0) + (jnp.cos(t) - 1) * qml.X(0)
>>> Ut_expm = qml.evolve(int_H, t)  # exp(-1j * t * int_H)
>>> Ut_expm_m = qml.matrix(Ut_expm)
>>> jnp.allclose(Ut_m, Ut_expm_m)
False
>>> Ut_m
Array([[ 0.88010107-0.45961545j,  0.02006437-0.11735909j],
       [-0.02006437-0.11735909j,  0.88010107+0.45961545j]], dtype=complex128)
>>> Ut_expm_m
Array([[9.69551427e-01-0.23727482j, 0.00000000e+00+0.06058621j],
       [2.60208521e-18+0.06058621j, 9.69551427e-01+0.23727482j]], dtype=complex128)

While the time-ordering operator is a well-defined operator that maps a series of operators to a series of operators, the time-ordered exponential is not. In particular, $\mathcal{T}\left[ e^{-i \int_0^t dt_1\, H(t_1)} \right]$ is not a well-defined function that has $\int_0^t dt_1\, H(t_1)$ as its argument and that we can manipulate as such. Rather, the time-ordered exponential is just a short-hand notation for the Dyson series, or any other equivalent solution to the time-dependent SE (e.g. the Magnus series).
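For reference, the Magnus series writes the same evolution as a genuine matrix exponential, $U(t) = e^{\Omega(t)}$, at the price of an infinite series in the exponent. Its first two terms are

$$ \Omega_1(t) = -i \int_0^t dt_1\, H(t_1), \qquad \Omega_2(t) = -\frac{1}{2} \int_0^t dt_1 \int_0^{t_1} dt_2\, \left[ H(t_1), H(t_2) \right], $$

and all higher orders consist of nested commutators. The commutator in $\Omega_2$ is exactly what the naive exponential of $\int_0^t dt_1\, H(t_1)$ misses.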

A much more suitable and clear notation would be to write

$$ U(t, t_0) = \mathcal{T} \exp\left[ -i \int_{t_0}^{t} dt_1\, H(t_1) \right]. $$

This is just as compact and still conveys that we are time-evolving from some initial time $t_0$ to some target time $t$.

However, that expression still contains the superfluous time-ordering operator $\mathcal{T}$.

Time-ordering is not necessary

Time-ordering is not necessary in the first place. It is introduced artificially to create a stronger resemblance of the Dyson series with an exponential series. In practice, you never actually compute any time-orderings. Analytically it is much more convenient to use the Dyson or Magnus series, and numerically, one either uses differential equation solvers (as is done in qml.evolve) or splits the evolution into a product of small, finite and discrete steps at times $t_j = j\, \Delta t$ with $\Delta t = t / n$,

$$ U(t) \approx e^{-i \Delta t\, H(t_n)} \cdots e^{-i \Delta t\, H(t_2)}\, e^{-i \Delta t\, H(t_1)}. $$
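A minimal sketch of this product-formula approach, assuming the same $H(t) = \cos(t)\, Z + \sin(t)\, X$ and $t = 0.5$ as in the example above, and evaluating the Hamiltonian at the midpoint of each small step:

import jax
import jax.numpy as jnp
from jax.scipy.linalg import expm

jax.config.update("jax_enable_x64", True)

Z = jnp.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
X = jnp.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)

def H(t):
    return jnp.cos(t) * Z + jnp.sin(t) * X

t_final, n = 0.5, 1000
dt = t_final / n

U = jnp.eye(2, dtype=complex)
for j in range(n):
    # later steps are multiplied onto the left; no time-ordering operator needed
    U = expm(-1j * dt * H((j + 0.5) * dt)) @ U

print(U)  # agrees with Ut_m from the example above up to a small discretization error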

There may be analytic calculations that warrant using time-orderings but in most remotely practical settings, time-ordering is not employed. I strongly believe the widespread use of the time-ordered exponential as the formal solution to the SE comes predominantly with disadvantages.

My personal recommendation would be to resort to the common short-hand notation

$$ U(t, t_0), $$

with or without $H$ as a subscript, superscript or argument. Alternatives that are more explicit could be something like $U_H(t, t_0)$ or $U[H](t, t_0)$.

Conclusion

I hope I managed to convince you of the misleading implications of the time-ordered exponential. Further, I argue that the formula overly complicates time-dependent dynamics by introducing unnecessary time-ordering just to motivate a function, which is not actually a function. I suggest we stop confusing undergrad students and just drop the whole time-ordering business when teaching and discussing time-dependent Hamiltonian dynamics.

Last modified: September 09, 2024
