Convergence of stochastic gradient descent on parameterized sphere with applications to variational Monte Carlo simulation

03/21/2023
by Nilin Abrahamsen et al.

We analyze stochastic gradient descent (SGD)-type algorithms on a high-dimensional sphere that is parameterized by a neural network up to a normalization constant. We present a new algorithm for the supervised-learning setting and show its convergence both theoretically and numerically. We also give the first proof of convergence for the unsupervised setting, which corresponds to the widely used variational Monte Carlo (VMC) method in quantum physics.
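To make the setting concrete, here is a minimal toy sketch (not the paper's algorithm) of noisy gradient descent on a sphere parameterized up to normalization. The "network" is a plain linear map `f(theta) = W @ theta`, and the point on the sphere is its normalization `x = f(theta)/||f(theta)||`; the dimensions, target, learning rate, and noise schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": a linear map theta -> W @ theta; the model point lives on
# the unit sphere after normalization, x = f(theta) / ||f(theta)||.
d, p = 10, 5                      # ambient dimension, parameter dimension
W = rng.normal(size=(d, p))

def sphere_point(theta):
    f = W @ theta
    return f / np.linalg.norm(f)

# Supervised toy target: a reachable point on the sphere.
target = sphere_point(rng.normal(size=p))

def loss(theta):
    return 0.5 * np.linalg.norm(sphere_point(theta) - target) ** 2

def grad(theta):
    # Chain rule through the normalization: with x = f/||f||,
    # dx/df = (I - x x^T) / ||f||, and f = W theta.
    f = W @ theta
    n = np.linalg.norm(f)
    x = f / n
    dx = (np.eye(d) - np.outer(x, x)) / n
    return W.T @ dx @ (x - target)

theta = rng.normal(size=p)
lr = 1.0
for step in range(5000):
    # Decaying injected noise stands in for the stochastic gradient estimate.
    noise = (0.01 / np.sqrt(step + 1)) * rng.normal(size=p)
    theta -= lr * (grad(theta) + noise)

print(loss(theta))  # small: the normalized point approaches the target
```

Note that the loss is invariant under rescaling of `theta` (scaling does not move `x` on the sphere), which is exactly the degeneracy that analyses of SGD on normalized parameterizations must handle.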
