Neural approximation of Wasserstein distance via a universal architecture for symmetric and factorwise group invariant functions

08/01/2023
by   Samantha Chen, et al.

Learning distance functions between complex objects, such as the Wasserstein distance between point sets, is a common goal in machine learning applications. However, functions on such complex objects (e.g., point sets and graphs) are often required to be invariant to a wide variety of group actions, e.g., permutations or rigid transformations. Consequently, continuous and symmetric product functions (such as distance functions) on such objects must also be invariant to the product of such group actions. We call these functions symmetric and factor-wise group invariant (SFGI) functions. In this paper, we first present a general neural network architecture for approximating SFGI functions. Our main contribution combines this general architecture with a sketching idea to develop a specific and efficient neural network that can approximate the p-th Wasserstein distance between point sets. Importantly, the required model complexity is independent of the sizes of the input point sets. On the theoretical front, to the best of our knowledge, this is the first result showing that there exists a neural network with the capacity to approximate the Wasserstein distance with bounded model complexity. Our work provides an interesting integration of sketching ideas for geometric problems with the universal approximation of symmetric functions. On the empirical front, we present a range of results showing that our newly proposed neural network architecture performs comparably to or better than other models (including a state-of-the-art Siamese autoencoder based approach). In particular, our network generalizes significantly better and trains much faster than the SOTA Siamese autoencoder. Finally, this line of investigation could be useful in exploring effective neural network designs for solving a broad range of geometric optimization problems (e.g., k-means in a metric space).
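To make the target function concrete (this is an illustration of the quantity being approximated, not the paper's architecture): for two equal-size 1-D point sets with uniform weights, the p-th Wasserstein distance has a closed form obtained by matching points in sorted order. The function `wasserstein_p_1d` below is a hypothetical helper written for this sketch; note that the result is invariant to permuting either input independently, i.e., it is an SFGI function.

```python
import numpy as np

def wasserstein_p_1d(xs, ys, p=2):
    """Exact p-th Wasserstein distance between two equal-size 1-D point sets.

    With uniform weights in one dimension, the optimal transport plan
    matches points in sorted order, so no optimization is needed.
    """
    xs = np.sort(np.asarray(xs, dtype=float))
    ys = np.sort(np.asarray(ys, dtype=float))
    assert xs.shape == ys.shape, "equal-size point sets assumed in this sketch"
    # Average the p-th powers of matched distances, then take the p-th root.
    return float(np.mean(np.abs(xs - ys) ** p) ** (1.0 / p))

# Permutation invariance in each factor: reordering an input set
# does not change the distance.
d1 = wasserstein_p_1d([0.0, 2.0], [1.0, 3.0], p=2)
d2 = wasserstein_p_1d([2.0, 0.0], [3.0, 1.0], p=2)
```

In higher dimensions no such closed form exists, which is why the paper's sketch-based symmetric network (with model complexity independent of the input set sizes) is needed to approximate the distance efficiently.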


