
The PennyLane Guide to Quantum Machine Learning

Quantum Machine Learning (QML) asks if quantum hardware will be a game changer for the way computers learn from data. The field has been tackling this question with various approaches and tools ranging from abstract learning theory to practical benchmarks. However, there is a lot we don't understand, and the field is rapidly evolving.

Discover the different flavours of quantum machine learning in this curated guide.

Quantum neural networks

It turns out that parametrized or variational quantum circuits can be trained much like neural networks, which earned them the name 'quantum neural networks' (QNNs) and has sparked extensive research into their behaviour as machine learning models. However, the early optimism around running these models on near-term hardware has not stood up to scrutiny. Here you can explore the basic ideas behind these models.
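
As a minimal sketch (the toy cost function, data point, and hyperparameters below are chosen purely for illustration), a variational circuit in PennyLane can be optimized with gradient descent much like a small neural network:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(weights, x):
    # Encode the classical input into rotation angles
    qml.RX(x[0], wires=0)
    qml.RX(x[1], wires=1)
    # Trainable rotations plus entanglement, loosely analogous to a network layer
    qml.RY(weights[0], wires=0)
    qml.RY(weights[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(1))

def cost(weights, x, target):
    # Squared-error cost on a single toy data point
    return (circuit(weights, x) - target) ** 2

weights = np.array([0.1, 0.2], requires_grad=True)
x, target = np.array([0.5, 0.8], requires_grad=False), 0.7

opt = qml.GradientDescentOptimizer(stepsize=0.3)
for _ in range(50):
    weights = opt.step(lambda w: cost(w, x, target), weights)
```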


Integrating QNNs with machine learning software

Since quantum neural networks typically depend on classical software to perform an optimization task (or other analyses), it becomes important to integrate them with classical machine learning software. This includes general machine learning toolkits such as scikit-learn, but also frameworks designed around gradient descent and backpropagation such as JAX and PyTorch.
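
For instance, a PennyLane QNode can be handed to an automatic-differentiation framework through its interface argument. The sketch below (with arbitrary example values) uses the JAX interface, so jax.grad differentiates through the quantum circuit like any other function:

```python
import pennylane as qml
import jax
import jax.numpy as jnp

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, interface="jax")
def circuit(theta, x):
    qml.RX(x, wires=0)      # data-encoding rotation
    qml.RY(theta, wires=0)  # trainable rotation
    return qml.expval(qml.PauliZ(0))

# The QNode behaves like a differentiable JAX function,
# so standard tools such as jax.grad apply directly.
grad_fn = jax.grad(circuit, argnums=0)
print(grad_fn(jnp.array(0.3), jnp.array(0.5)))
```

A similar pattern works for PyTorch via interface="torch", or by wrapping the QNode with qml.qnn.TorchLayer to use it inside a torch.nn model.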

Scaling and benchmarking challenges with QNNs

After an initial period of optimism, it became increasingly clear that naive QNN model designs face scalability issues and, on near-term quantum hardware, do not stand out in performance compared to classical models. It remains a critical open question where variational circuits fit in the fault-tolerant regime, and how they can be integrated into more sophisticated quantum model designs.

Traditional quantum algorithms for machine learning

Since the beginning of QML, researchers have tried to use famous quantum subroutines (like the HHL algorithm, amplitude amplification, and more recent additions such as quantum singular value decomposition or QROM) to speed up machine learning. Early speedup claims advertised sub-linear runtimes in the size of the data by making strong (and sometimes dubious) assumptions about how the data is stored and loaded. Since most classical machine learning algorithms already run in time linear in the size of the data, it is becoming clear that quantum computers cannot just speed up existing models; they have to solve problems that current machine learning struggles with, possibly in the small-data regime.
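
To make the data-access assumption concrete, the sketch below (with a made-up toy dataset) shows amplitude encoding in PennyLane: 2^n numbers are packed into the amplitudes of an n-qubit state, and preparing (or assuming access to) that state is exactly the loading step whose cost early speedup claims often glossed over:

```python
import pennylane as qml
import numpy as np

# Toy dataset: 8 numbers fit into log2(8) = 3 qubits as amplitudes
data = np.arange(1.0, 9.0)

dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def load_data(features):
    # On a simulator this is a single call; on hardware, preparing this
    # state (or assuming QRAM-like access to it) is the costly part.
    qml.AmplitudeEmbedding(features, wires=range(3), normalize=True)
    return qml.state()

print(load_data(data))
```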

Quantum learning advantages

Some known quantum advantages, like Shor's algorithm or reconstructing states from quantum measurements, can be used to construct formal learning problems for which quantum speedups can be proven. This has become a vibrant field of academic research, but it is still unclear whether the highly abstract results can help guide the search for real-world QML algorithms.


Quantum machine learning and symmetries

The secret sauce of many quantum algorithms, like Shor's, is to process information in 'Fourier space' using the Quantum Fourier Transform (QFT). Fourier transforms are deeply linked to groups and symmetries (think of the symmetry of a periodic signal) and are increasingly important concepts in deep learning, for example to build symmetry-aware models or to understand the dynamics of learning. An exciting new area in QML asks whether methods like the QFT can unlock fundamentally different approaches to machine learning.
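
As a small illustration of this link, the sketch below (weights chosen arbitrarily) uses PennyLane's qml.fourier module to compute the Fourier coefficients of a simple variational circuit viewed as a function of its data input; the data-encoding gates determine which frequencies can appear:

```python
from functools import partial

import pennylane as qml
import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(weights, x):
    qml.RX(x[0], wires=0)        # data-encoding gate: fixes the available frequencies
    qml.RY(weights[0], wires=0)  # trainable gates shape the Fourier coefficients
    qml.RZ(weights[1], wires=0)
    return qml.expval(qml.PauliZ(0))

weights = np.array([0.4, 1.2])

# A single Pauli-rotation encoding yields frequencies {-1, 0, 1}, i.e. degree 1
coeffs = qml.fourier.coefficients(partial(circuit, weights), n_inputs=1, degree=1)
print(coeffs)
```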


Documentation

  • qml.gradients
  • qml.kernels
  • qml.fourier
  • qml.liealg
  • Gradients and training
  • PennyLane optimizers
