# Community

Have an existing GitHub repository or Jupyter notebook showing off quantum machine learning with PennyLane? Read the guidelines and submission instructions here, and have your demonstration and research featured on our community page.

#### Meta-Variational Quantum Eigensolver

###### Nahum Sá

03/27/2021

In this tutorial I follow the Meta-VQE paper. The Meta-VQE algorithm is a variational quantum algorithm suited for NISQ devices that encodes the parameters of a Hamiltonian into a variational ansatz. We can then obtain good estimates of the ground state of the Hamiltonian by changing only those encoded parameters.
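The core idea can be illustrated in a few lines. Below is a minimal NumPy sketch (not the notebook's actual code): for a toy one-qubit Hamiltonian H(λ) = λZ + X, the ansatz RY(θ)|0⟩ has energy E(θ; λ) = λ cos θ + sin θ, and the meta-VQE encoding makes θ a trained linear function of λ. All variable names here are illustrative.

```python
import numpy as np

# Toy Hamiltonian H(lam) = lam * Z + X on one qubit.
# The ansatz state RY(theta)|0> gives <Z> = cos(theta) and <X> = sin(theta),
# so the energy is E(theta; lam) = lam*cos(theta) + sin(theta).
def energy(theta, lam):
    return lam * np.cos(theta) + np.sin(theta)

# Meta-VQE encoding: the ansatz angle is a function of the Hamiltonian
# parameter, here simply theta(lam) = w0 + w1*lam.
lams = np.linspace(-1.0, 1.0, 5)   # training Hamiltonians
w = np.array([0.1, 0.1])           # encoding weights (w0, w1)

for _ in range(300):               # gradient descent on the mean energy
    theta = w[0] + w[1] * lams
    dE = -lams * np.sin(theta) + np.cos(theta)        # dE/dtheta
    grad = np.array([dE.mean(), (dE * lams).mean()])  # chain rule through theta(lam)
    w -= 0.2 * grad

# A single set of trained weights now gives a good ground-state estimate
# for every lam (exact ground energy of lam*Z + X is -sqrt(1 + lam^2)).
exact = -np.sqrt(1.0 + lams**2)
trained = energy(w[0] + w[1] * lams, lams)
print(np.abs(trained - exact).max())
```

Note that the trained energies remain variational upper bounds on the exact ground-state energies for each λ.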

#### Feature maps for kernel-based quantum classifiers

###### Semyon Sinchenko

03/03/2021

In this tutorial we implement a few examples of feature maps for kernel-based quantum machine learning. We'll see how quantum feature maps can make linearly inseparable data separable after applying a kernel and measuring an observable. We follow an article and implement all the kernel functions with PennyLane.
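The basic construction behind such kernels can be sketched without any quantum library: a feature map x ↦ |φ(x)⟩ defines the kernel k(x, x′) = |⟨φ(x)|φ(x′)⟩|². Below is a pure-NumPy illustration using an angle-embedding feature map (RY(xᵢ) on qubit i); the demo itself builds richer maps with PennyLane.

```python
import numpy as np

def feature_map(x):
    """Angle-embedding feature map: RY(x_i)|0> on qubit i; the register
    state is the tensor product of the single-qubit states."""
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, [np.cos(xi / 2), np.sin(xi / 2)])
    return state

def kernel(x, y):
    """Quantum kernel: squared overlap of the two embedded states."""
    return np.abs(feature_map(x) @ feature_map(y)) ** 2

x1 = np.array([0.3, 1.2])
x2 = np.array([0.7, -0.4])
print(kernel(x1, x1), kernel(x1, x2))
```

For this product-state map the kernel factorizes as k(x, y) = Πᵢ cos²((xᵢ − yᵢ)/2); entangling gates in the feature map break this factorization and give more expressive kernels.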

#### Variational Quantum Circuits for Deep Reinforcement Learning

###### Samuel Yen-Chi Chen

03/03/2021

This work explores variational quantum circuits for deep reinforcement learning. Specifically, we reshape classical deep reinforcement learning techniques like experience replay and target networks into a representation based on variational quantum circuits. Moreover, we use a quantum information encoding scheme to reduce the number of model parameters compared to classical neural networks. To the best of our knowledge, this work is the first proof-of-principle demonstration of variational quantum circuits approximating the deep Q-value function for decision-making and policy-selection reinforcement learning with experience replay and a target network. In addition, our variational quantum circuits can be deployed on many near-term NISQ machines.

#### QCNN for Speech Commands Recognition

###### C.-H. Huck Yang

02/03/2021

We train a hybrid quantum convolutional neural network (QCNN) on acoustic data with up to 10,000 features. This model uses layers of random quantum gates to efficiently encode convolutional features. We perform a neural saliency analysis to provide a classical activation mapping to compare classical and quantum models, illustrating that the QCNN self-attention model learned meaningful representations. An additional connectionist temporal classification (CTC) loss on character recognition is also provided for continuous speech recognition.

#### Layerwise learning for quantum neural networks

###### Felipe Oyarce Andrade

26/01/2021

In this project we implement a strategy presented by Skolik et al., 2020 for effectively training quantum neural networks. In layerwise learning, the number of parameters is increased gradually: a few layers are added and trained at a time while the parameters of previously trained layers stay frozen. An easy way to understand this technique is to think of dividing the problem into smaller circuits so as to avoid falling into barren plateaus. We provide a proof-of-concept implementation of this technique in PennyLane's PyTorch interface for binary classification on the MNIST dataset.
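The training schedule itself (grow the circuit, train only the newest layer, keep earlier parameters frozen) can be sketched on a toy one-qubit cost; this is illustrative NumPy, not the project's actual PyTorch code. With L layers of RY(θₗ) on one qubit, the cost ⟨Z⟩ = cos(Σ θₗ) has the simple analytic gradient used below.

```python
import numpy as np

# Toy circuit: L layers of RY(theta_l) on one qubit.
# Cost = <Z> = cos(sum of all layer angles).
def cost(thetas):
    return np.cos(np.sum(thetas))

def grad(thetas):
    # d/dtheta_l cos(sum) = -sin(sum), identical for every layer
    return -np.sin(np.sum(thetas)) * np.ones_like(thetas)

thetas = np.array([])              # start with no layers
for _ in range(3):                 # grow the circuit layer by layer
    frozen = len(thetas)
    thetas = np.append(thetas, 0.05)   # add one fresh layer, small init
    for _ in range(200):               # train only the new layer
        g = grad(thetas)
        g[:frozen] = 0.0               # freeze previously trained layers
        thetas -= 0.1 * g
print(len(thetas), cost(thetas))
```

Each phase optimizes a shallow sub-circuit, which is the mechanism the paper argues keeps gradients from vanishing as the circuit deepens.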

#### A Quantum-Enhanced Transformer

###### Riccardo Di Sipio

20/01/2021

The Transformer neural network architecture revolutionized the analysis of text. Here we show an example of a Transformer with quantum-enhanced multi-headed attention. In the quantum-enhanced version, dense layers are replaced by simple Variational Quantum Circuits. An implementation based on PennyLane and TensorFlow-2.x illustrates the basic concept.

#### A Quantum-Enhanced LSTM Layer

###### Riccardo Di Sipio

18/12/2020

In Natural Language Processing, documents are usually presented as sequences of words. One of the most successful techniques for manipulating this kind of data is the Recurrent Neural Network architecture, and in particular a variant called Long Short-Term Memory (LSTM). Using the PennyLane library and its PyTorch interface, one can easily define an LSTM network in which Variational Quantum Circuits (VQCs) replace the linear operations. An application to part-of-speech tagging is presented in this tutorial.
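Structurally, "VQCs replace the linear operations" means each of the four LSTM gate maps is computed by a small circuit instead of a weight matrix. The hypothetical NumPy sketch below stands in for each VQC with a simulated one-qubit circuit per output unit: RY data-encoding of a weighted sum followed by a ⟨Z⟩ measurement, giving cos(W·v). The real demo uses PennyLane QNodes through the PyTorch interface.

```python
import numpy as np

def vqc(v, W):
    """Simulated VQC layer: unit j encodes W[j] @ v into RY on one qubit
    and measures <Z> = cos(W[j] @ v). Stands in for a linear layer."""
    return np.cos(W @ v)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def qlstm_cell(x, h, c, weights):
    """One LSTM step with VQCs replacing the four gate maps."""
    v = np.concatenate([x, h])
    f = sigmoid(vqc(v, weights["f"]))   # forget gate
    i = sigmoid(vqc(v, weights["i"]))   # input gate
    g = np.tanh(vqc(v, weights["g"]))   # candidate cell state
    o = sigmoid(vqc(v, weights["o"]))   # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
weights = {k: rng.normal(size=(n_hid, n_in + n_hid)) for k in "figo"}
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):    # run a length-5 input sequence
    h, c = qlstm_cell(x, h, c, weights)
print(h.shape)
```

The gating equations are the standard LSTM ones; only the map from (x, h) to each gate pre-activation changes.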

#### Quantum Machine Learning Model Predictor for Continuous Variables

###### Roberth Saénz Pérez Alvarado

16/12/2020

According to the paper "Predicting toxicity by quantum machine learning" (Teppei Suzuki, Michio Katouda, 2020), it is possible to predict continuous variables, like those in the continuous-variable quantum neural network model described in Killoran et al. (2018), using two qubits per feature. This is done by applying encodings, variational circuits, and some linear transformations on expectation values in order to predict values close to the real target. Based on an example from PennyLane, and using a small dataset consisting of a single one-dimensional feature and one output (so that the processing does not take too long), the algorithm showed reliable results.
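The "linear transformations on expectation values" step can be made concrete with a toy NumPy sketch (hypothetical, not the demo's code): embedding a feature x as RY(x) yields the two expectation values ⟨Z⟩ = cos x and ⟨X⟩ = sin x, and a linear readout on them is fitted by least squares. The toy target below is chosen to lie exactly in their span.

```python
import numpy as np

def expectations(x):
    """Two expectation values from RY(x)|0>: <Z> = cos(x), <X> = sin(x)."""
    return np.column_stack([np.cos(x), np.sin(x)])

# Toy 1-D dataset whose target lies in the span of the two expectations.
x = np.linspace(-1.0, 1.0, 25)
y = 0.7 * np.sin(x) + 0.2 * np.cos(x)

# Linear transformation on the expectation values, fitted by least squares.
feats = expectations(x)
coef, *_ = np.linalg.lstsq(feats, y, rcond=None)
pred = feats @ coef
print(np.abs(pred - y).max())
```

For targets outside this span, a trained variational circuit before measurement enriches the available basis functions, which is the role the paper's variational layers play.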

#### Trainable Quanvolutional Neural Networks

###### Denny Mattern, Darya Martyniuk, Fabian Bergmann, and Henri Willems

26/11/2020

We implement a trainable version of Quanvolutional Neural Networks using parametrized `RandomCircuits`. Parameters are optimized using standard gradient descent. Our code is based on the Quanvolutional Neural Networks demo by Andrea Mari. This demo results from our research as part of the PlanQK consortium.
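The quanvolution pattern can be sketched in plain NumPy (a hypothetical stand-in for the parametrized random circuits, with entangling gates omitted): each 2×2 image patch is angle-encoded, a trainable rotation is added per pixel, and one ⟨Z⟩ expectation per pixel becomes an output channel.

```python
import numpy as np

def quanv_patch(patch, w):
    """One 'trainable quanvolution' on a flattened patch: pixel j is encoded
    as RY(patch[j]) followed by a trainable RY(w[j]); measuring <Z> gives
    cos(patch[j] + w[j]). (Entangling gates of the real circuit omitted.)"""
    return np.cos(patch + w)

def quanv_layer(img, w, size=2):
    """Slide the patch circuit over the image with stride = patch size."""
    h, wd = img.shape
    out = np.zeros((h // size, wd // size, size * size))
    for r in range(0, h - size + 1, size):
        for c in range(0, wd - size + 1, size):
            patch = img[r:r + size, c:c + size].ravel()
            out[r // size, c // size] = quanv_patch(patch, w)
    return out

rng = np.random.default_rng(1)
img = rng.uniform(0, np.pi, size=(8, 8))   # toy 8x8 "image"
w = rng.normal(size=4)                     # trainable rotation angles
features = quanv_layer(img, w)
print(features.shape)
```

Because the outputs depend smoothly on `w`, the same gradient-descent machinery used for classical convolutions applies to the patch parameters.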

#### Using a Keras optimizer for Iris classification with a QNode and loss function

###### Hemant Gahankari

09/11/2020

Using PennyLane, we explain how to create a quantum function and train it with a Keras optimizer directly, i.e., without using a Keras layer. The objective is to train a quantum function to predict classes of the Iris dataset.

#### Linear regression using angle embedding and a single qubit

###### Hemant Gahankari

09/11/2020

In this example, we create a hybrid neural network (a mix of classical and quantum layers), train it, and get predictions from it. The dataset consists of temperature readings in degrees Centigrade and the corresponding Fahrenheit values. The objective is to train a neural network that predicts Fahrenheit values given Centigrade values.

#### Amplitude embedding in Iris classification with PennyLane's KerasLayer

###### Hemant Gahankari

09/11/2020

Using amplitude embedding from PennyLane, this demonstration aims to explain how to pass classical data into the quantum function and convert it to quantum data. It also shows how to create a PennyLane KerasLayer from a QNode, train it and check the performance of the model.
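The classical-to-quantum conversion step can be illustrated in NumPy: amplitude embedding writes the feature vector into the amplitudes of an n-qubit state, after padding to a power of two and normalizing to unit length (in PennyLane, `qml.AmplitudeEmbedding` with `normalize=True` performs this).

```python
import numpy as np

def amplitude_embed(x):
    """Map a feature vector to the amplitudes of an n-qubit state:
    pad to the next power of two, then normalize to unit length."""
    n_qubits = int(np.ceil(np.log2(len(x))))
    padded = np.zeros(2 ** n_qubits)
    padded[: len(x)] = x
    return padded / np.linalg.norm(padded)

x = np.array([5.1, 3.5, 1.4, 0.2])   # one Iris sample: 4 features -> 2 qubits
state = amplitude_embed(x)
print(state, np.linalg.norm(state))
```

Note the economy of this encoding: d features need only ⌈log₂ d⌉ qubits, at the cost of losing the overall scale of the input to normalization.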

#### Angle embedding in Iris classification with PennyLane's KerasLayer

###### Hemant Gahankari

09/11/2020

Using angle embedding from PennyLane, this demonstration aims to explain how to pass classical data into the quantum function and convert it to quantum data. It also shows how to create a PennyLane KerasLayer from a QNode, train it and check the performance of the model.
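Angle embedding works differently from amplitude embedding: each feature sets a rotation angle on its own qubit, so measuring Pauli-Z on wire i returns cos(xᵢ). A small NumPy illustration (in PennyLane this is `qml.AngleEmbedding`):

```python
import numpy as np

def angle_embed(x):
    """RY(x_i)|0> on each qubit; the register state is their tensor product."""
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, [np.cos(xi / 2), np.sin(xi / 2)])
    return state

def expval_z(state, wire, n_wires):
    """<Z> on one wire: +1 for basis states where that wire's bit is 0,
    -1 where it is 1 (wire 0 is the most significant bit)."""
    probs = state ** 2
    bits = (np.arange(2 ** n_wires) >> (n_wires - 1 - wire)) & 1
    return np.sum(probs * (1 - 2 * bits))

x = np.array([0.4, 1.1, 2.0])        # 3 features -> 3 qubits
state = angle_embed(x)
for i, xi in enumerate(x):
    print(expval_z(state, i, len(x)), np.cos(xi))
```

Unlike amplitude embedding, this uses one qubit per feature but needs no normalization of the input data beyond scaling features into a sensible angle range.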

#### Characterizing the loss landscape of variational quantum circuits

###### Patrick Huembeli and Alexandre Dauphin

30/09/2020

Using PennyLane and complex PyTorch, we compute the Hessian of the loss function of VQCs and show how to characterize the loss landscape with it. We show how the Hessian can be used to escape flat regions of the loss landscape.
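The idea can be sketched with a central finite-difference Hessian on a toy two-parameter cost (the demo itself uses PyTorch autograd on a real VQC; this NumPy stand-in is illustrative). Near-zero Hessian eigenvalues flag flat directions of the loss landscape, while all-negative eigenvalues indicate a maximum.

```python
import numpy as np

def loss(theta):
    # Toy two-parameter VQC cost: E(theta) = cos(theta0) * cos(theta1)
    return np.cos(theta[0]) * np.cos(theta[1])

def hessian(f, theta, eps=1e-4):
    """Central finite-difference Hessian of f at theta."""
    n = len(theta)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            H[i, j] = (f(theta + e_i + e_j) - f(theta + e_i - e_j)
                       - f(theta - e_i + e_j) + f(theta - e_i - e_j)) / (4 * eps ** 2)
    return H

theta = np.array([0.0, 0.0])   # at the cost's maximum: analytic Hessian is -I
H = hessian(loss, theta)
print(np.round(H, 4), np.linalg.eigvalsh(H))
```

Finite differences cost O(n²) circuit evaluations; automatic differentiation, as used in the demo, is the practical route for larger circuits.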
