Optimization and Sampling Under Continuous Symmetry: Examples and Lie Theory

09/02/2021
by Jonathan Leake, et al.

In the last few years, the notion of symmetry has provided a powerful and essential lens for viewing several optimization and sampling problems that arise in areas such as theoretical computer science, statistics, machine learning, quantum inference, and privacy. Here, we present two examples of nonconvex problems in optimization and sampling where continuous symmetries play, implicitly or explicitly, a key role in the development of efficient algorithms. These examples rely on deep and hidden connections between nonconvex symmetric manifolds and convex polytopes, and they admit far-reaching generalizations. To formulate and understand these generalizations, we then present an introduction to Lie theory, an indispensable mathematical toolkit for capturing and working with continuous symmetries. We first present the basics of Lie groups, Lie algebras, and the adjoint actions associated with them, and we also mention the classification theorem for Lie algebras. Subsequently, we present Kostant's convexity theorem and show how it allows us to reduce linear optimization problems over orbits of Lie groups to linear optimization problems over polytopes. Finally, we present the Harish-Chandra and the Harish-Chandra–Itzykson–Zuber (HCIZ) formulas, which convert partition functions (integrals) over Lie groups into sums over the corresponding (discrete) Weyl groups, enabling efficient sampling algorithms.
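To make the last two points concrete, the following recalls standard special cases for the unitary group U(n); this is a sketch for orientation, stated in the usual normalization (Haar probability measure, distinct eigenvalues assumed), while the text itself treats the general Lie-theoretic setting.

```latex
% Kostant's convexity theorem, specialized to U(n) acting by conjugation on a
% Hermitian matrix A with eigenvalues a_1, ..., a_n (this special case is the
% Schur--Horn theorem): the diagonals of the orbit {U A U^*} fill out the
% permutohedron of a, so linear optimization over the nonconvex orbit reduces
% to linear optimization over a polytope.
\[
  \bigl\{ \operatorname{diag}(U A U^{*}) : U \in \mathrm{U}(n) \bigr\}
  \;=\;
  \operatorname{conv}\bigl\{ (a_{\sigma(1)}, \dots, a_{\sigma(n)}) : \sigma \in S_n \bigr\}.
\]

% The HCIZ formula for U(n), with dU the Haar probability measure and A, B
% Hermitian with distinct eigenvalues a_1, ..., a_n and b_1, ..., b_n:
\[
  \int_{\mathrm{U}(n)} e^{\operatorname{tr}(A U B U^{*})} \, dU
  \;=\;
  \Bigl( \prod_{p=1}^{n-1} p! \Bigr)\,
  \frac{\det\bigl( e^{a_i b_j} \bigr)_{i,j=1}^{n}}
       {\prod_{i<j} (a_j - a_i) \prod_{i<j} (b_j - b_i)}.
\]
```

Expanding the n-by-n determinant as a signed sum over permutations exhibits the phenomenon described above: an integral over the group becomes a sum over the Weyl group (here S_n), which is what makes efficient sampling possible.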
