Trainability barriers and opportunities in quantum generative modeling

05/04/2023
by Manuel S. Rudolph, et al.

Quantum generative models, by providing inherently efficient sampling strategies, show promise for achieving a near-term advantage on quantum hardware. Nonetheless, important questions remain regarding their scalability. In this work, we investigate the barriers to the trainability of quantum generative models posed by barren plateaus and exponential loss concentration. We explore the interplay between explicit and implicit models and losses, and show that using implicit generative models (such as quantum circuit-based models) with explicit losses (such as the KL divergence) leads to a new flavour of barren plateau. In contrast, the Maximum Mean Discrepancy (MMD), a popular example of an implicit loss, can be viewed as the expectation value of an observable that is either low-bodied and trainable, or global and untrainable, depending on the choice of kernel. In parallel, however, we highlight that the low-bodied losses required for trainability cannot in general distinguish high-order correlations, leading to a fundamental tension between exponential concentration and the emergence of spurious minima. We further propose a new local quantum fidelity-type loss which, by leveraging quantum circuits to estimate the quality of the encoded distribution, is both faithful and enjoys trainability guarantees. Finally, we compare the performance of different loss functions for modelling real-world data from the High-Energy Physics domain and confirm the trends predicted by our theoretical results.
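As a minimal illustration of the kernel dependence discussed above, the sketch below estimates an MMD loss between samples from a quantum circuit Born machine and a target dataset using a Gaussian kernel. The bandwidth parameter sigma stands in for the kernel choice that determines whether the corresponding observable behaves as low-bodied or global. All names and the random placeholder samples are illustrative assumptions, not code from the paper.

```python
# Minimal sketch (not the authors' implementation): MMD^2 estimate between
# bitstring samples from a Born machine and a target dataset.
import numpy as np

def gaussian_kernel(x, y, sigma):
    """Gaussian kernel between two sets of bitstrings (0/1 arrays).

    The bandwidth sigma controls the locality of the associated observable:
    a very narrow bandwidth approaches the delta kernel (a global observable),
    while wider bandwidths give weight to lower-order correlations.
    """
    diff = x[:, None, :] - y[None, :, :]          # pairwise differences, shape (n, m, d)
    sq_dist = np.sum(diff ** 2, axis=-1)          # squared Hamming-type distance
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def mmd_squared(samples_p, samples_q, sigma=1.0):
    """Biased (V-statistic) estimate of MMD^2 between two sample sets."""
    k_pp = gaussian_kernel(samples_p, samples_p, sigma).mean()
    k_qq = gaussian_kernel(samples_q, samples_q, sigma).mean()
    k_pq = gaussian_kernel(samples_p, samples_q, sigma).mean()
    return k_pp + k_qq - 2.0 * k_pq

# Example usage with random placeholder samples over 8 qubits.
rng = np.random.default_rng(0)
data_samples = rng.integers(0, 2, size=(500, 8))   # stand-in for the target data
model_samples = rng.integers(0, 2, size=(500, 8))  # stand-in for Born machine measurements
print(mmd_squared(data_samples, model_samples, sigma=2.0))
```

In practice the model samples would be measurement outcomes of a parametrized circuit, and gradients of this estimator with respect to the circuit parameters could be obtained, for instance, via parameter-shift rules; the sketch is only meant to show the loss itself and how the kernel bandwidth enters.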


