Abstract
Inspired by the success of Boltzmann machines based on the classical Boltzmann distribution, we propose a new machine-learning approach based on the quantum Boltzmann distribution of a quantum Hamiltonian. Because of the noncommutative nature of quantum mechanics, the training process of the quantum Boltzmann machine (QBM) can become nontrivial. We circumvent the problem by introducing bounds on the quantum probabilities, which allow us to train the QBM efficiently by sampling. We show examples of QBM training with and without the bounds, using exact diagonalization, and compare the results with classical Boltzmann training. We also discuss the possibility of using quantum annealing processors for QBM training and application.
- Received 21 July 2017
- Revised 16 January 2018
- DOI: https://doi.org/10.1103/PhysRevX.8.021050
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.
Popular Summary
Machine-learning techniques have allowed the automation of many tasks that resist a traditional algorithmic approach. While the success of machine learning has so far been confined to the realm of traditional digital computation, the question of whether quantum computing can speed up the learning process is important and largely unanswered. We have constructed and tested a novel training procedure for a quantum version of a neural network that can be readily implemented on existing quantum computing hardware.
Machine-learning techniques can capture useful representations of various data sets with only limited resources. The core building block in many architectures is a neural network known as a Boltzmann machine, which is based on a classical probability distribution. It is therefore natural to ask whether a quantum extension of the Boltzmann machine can provide further power to encode classical data sets.
We develop a learning algorithm for a quantum Boltzmann machine. Via the training process, the machine learns to replicate correlations of a provided data set. We compare the performance of the quantum machine to its classical analog with the help of numerical experiments. Our results demonstrate how the machine exploits its quantum nature to mimic data sets in both supervised and unsupervised settings.
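The quantum-vs-classical comparison above can be illustrated with a toy numerical experiment (a sketch of the general idea, not the authors' code, with made-up parameter values): for a small transverse-field Ising Hamiltonian, the quantum Boltzmann distribution over classical spin configurations is the diagonal of the density matrix ρ = e⁻ᴴ/Z, and setting the transverse field to zero recovers the classical Boltzmann distribution.

```python
# Minimal sketch: exact-diagonalization quantum Boltzmann distribution for a
# tiny transverse-field Ising model. The couplings below are illustrative
# placeholders, not values from the paper.
import numpy as np

I2 = np.eye(2)
SZ = np.diag([1.0, -1.0])
SX = np.array([[0.0, 1.0], [1.0, 0.0]])

def kron_at(op, site, n):
    """Embed a single-qubit operator acting on `site` in an n-qubit space."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, op if i == site else I2)
    return out

def tfim_hamiltonian(h, J, gamma):
    """H = -sum_a gamma_a X_a - sum_a h_a Z_a - sum_{a<b} J_ab Z_a Z_b."""
    n = len(h)
    H = np.zeros((2**n, 2**n))
    for a in range(n):
        H -= gamma[a] * kron_at(SX, a, n)
        H -= h[a] * kron_at(SZ, a, n)
        for b in range(a + 1, n):
            H -= J[a][b] * kron_at(SZ, a, n) @ kron_at(SZ, b, n)
    return H

def boltzmann_probs(H):
    """Diagonal of rho = e^{-H}/Z in the computational (z) basis."""
    evals, evecs = np.linalg.eigh(H)
    w = np.exp(-(evals - evals.min()))      # shift for numerical stability
    rho = (evecs * w) @ evecs.T             # e^{-(H - E_min)}
    rho /= np.trace(rho)
    return np.diag(rho)

n = 2
h = [0.5, -0.3]
J = [[0.0, 1.0], [0.0, 0.0]]
p_classical = boltzmann_probs(tfim_hamiltonian(h, J, [0.0] * n))  # Gamma = 0
p_quantum = boltzmann_probs(tfim_hamiltonian(h, J, [1.0] * n))    # Gamma = 1
print("classical:", p_classical)
print("quantum:  ", p_quantum)
```

With the transverse field switched off, the diagonal of ρ coincides with exp(−E(v))/Z over the four spin configurations; turning it on redistributes probability through quantum fluctuations, which is the extra freedom the QBM can exploit during training.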
Our work opens the door to a novel application of quantum hardware as a sampler for a quantum Boltzmann machine, a technology that might prove pivotal for the next generation of machine-learning algorithms.