Theory of overparametrization in quantum neural networks

09/23/2021
by Martin Larocca et al.

The prospect of achieving quantum advantage with Quantum Neural Networks (QNNs) is exciting. Understanding how QNN properties (e.g., the number of parameters M) affect the loss landscape is crucial to the design of scalable QNN architectures. Here, we rigorously analyze the overparametrization phenomenon in QNNs with periodic structure. We define overparametrization as the regime where the QNN has more than a critical number of parameters M_c that allows it to explore all relevant directions in state space. Our main results show that the dimension of the Lie algebra obtained from the generators of the QNN is an upper bound for M_c, and for the maximal rank that the quantum Fisher information and Hessian matrices can reach. Underparametrized QNNs have spurious local minima in the loss landscape that start disappearing when M ≥ M_c. Thus, the overparametrization onset corresponds to a computational phase transition where the QNN trainability is greatly improved by a more favorable landscape. We then connect the notion of overparametrization to the QNN capacity, so that when a QNN is overparametrized, its capacity achieves its maximum possible value. We run numerical simulations for eigensolver, compilation, and autoencoding applications to showcase the overparametrization computational phase transition. We note that our results also apply to variational quantum algorithms and quantum optimal control.
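The key quantity in the abstract, the dimension of the Lie algebra generated by the QNN's generators (the dynamical Lie algebra, or DLA), can be computed by brute force for small systems: take the anti-Hermitian operators i H_j, repeatedly form commutators, and count how many real-linearly independent elements appear. The sketch below is an illustration, not the paper's implementation; the function name `dla_dimension` and the real-vectorization independence test are our own choices.

```python
import numpy as np

# Single-qubit Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def dla_dimension(hermitian_generators, tol=1e-10):
    """Real dimension of the Lie algebra spanned by {i H_j} and their
    nested commutators.  Brute-force closure; small matrices only."""
    def vec(A):
        # Real vectorization, so rank tests real-linear independence.
        return np.concatenate([A.real.ravel(), A.imag.ravel()])

    elems, vecs = [], []

    def try_add(A):
        cand = vecs + [vec(A)]
        if np.linalg.matrix_rank(np.array(cand), tol=tol) > len(vecs):
            elems.append(A)
            vecs.append(vec(A))
            return True
        return False

    # Seed with the anti-Hermitian generators i H_j.
    frontier = []
    for H in hermitian_generators:
        A = 1j * np.asarray(H, dtype=complex)
        if try_add(A):
            frontier.append(A)

    # Close under commutators until no new independent element appears.
    while frontier:
        new = []
        for A in frontier:
            for B in list(elems):
                C = A @ B - B @ A  # commutator [A, B]
                if np.linalg.norm(C) > tol and try_add(C):
                    new.append(C)
        frontier = new
    return len(elems)

# Example: generators {X, Z} close into su(2), so the DLA dimension is 3.
print(dla_dimension([X, Z]))  # → 3
```

In the paper's framework, this dimension upper-bounds the critical parameter count M_c and the maximal rank of the quantum Fisher information matrix; a QNN whose number of parameters M reaches this scale is in the overparametrized regime.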

Related research

- 10/06/2021 · Exponentially Many Local Minima in Quantum Neural Networks
- 02/10/2023 · Effects of noise on the overparametrization of quantum neural networks
- 08/30/2022 · Symmetric Pruning in Quantum Neural Networks
- 07/13/2021 · How many degrees of freedom do we need to train deep networks: a loss landscape perspective
- 10/22/2018 · A jamming transition from under- to over-parametrization affects loss landscape and generalization
- 11/17/2020 · Optimizing parametrized quantum circuits via noise-induced breaking of symmetries
- 05/25/2022 · A Convergence Theory for Over-parameterized Variational Quantum Eigensolvers
