Controlling the Complexity and Lipschitz Constant improves polynomial nets

02/10/2022
by Zhenyu Zhu, et al.

While the class of Polynomial Nets demonstrates comparable performance to neural networks (NNs), it currently lacks both a theoretical generalization characterization and robustness guarantees. To this end, we derive new complexity bounds for the Coupled CP-Decomposition (CCP) and Nested Coupled CP-Decomposition (NCP) models of Polynomial Nets in terms of the ℓ_∞-operator norm and the ℓ_2-operator norm. In addition, we derive bounds on the Lipschitz constant of both models, establishing a theoretical certificate for their robustness. These theoretical results enable us to propose a principled regularization scheme, which we evaluate experimentally on six datasets and show that it improves both the accuracy and the robustness of the models to adversarial perturbations. We also showcase how this regularization can be combined with adversarial training, resulting in further improvements.
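
To make the two ingredients of the abstract concrete, the sketch below combines (a) a minimal CCP polynomial net, following the standard Π-net recursion x_1 = A_1^T z, x_n = (A_n^T z) ⊙ x_{n-1} + x_{n-1}, y = C x_N + β, and (b) an operator-norm penalty on the factor matrices, which is one plausible reading of the regularization scheme rather than the authors' exact implementation. The names `CCPNet` and `operator_norm_penalty`, and all hyperparameter values, are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (assumptions noted above): a degree-N CCP polynomial net
# and an operator-norm penalty on its weight matrices.
import torch
import torch.nn as nn


class CCPNet(nn.Module):
    """Coupled CP-Decomposition (CCP) polynomial net of a given degree."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, degree: int = 3):
        super().__init__()
        # One factor matrix A_n per term of the polynomial expansion.
        self.factors = nn.ModuleList(
            nn.Linear(in_dim, hidden_dim, bias=False) for _ in range(degree)
        )
        self.head = nn.Linear(hidden_dim, out_dim)  # y = C x_N + beta

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        x = self.factors[0](z)            # x_1 = A_1^T z
        for layer in self.factors[1:]:
            x = layer(z) * x + x          # x_n = (A_n^T z) * x_{n-1} + x_{n-1}
        return self.head(x)


def operator_norm_penalty(model: CCPNet, mu: float = 1e-3) -> torch.Tensor:
    """Sum of l2- and l_inf-operator norms of every weight matrix, scaled by mu.

    ||W||_2 is the largest singular value; ||W||_inf is the maximum absolute
    row sum. Shrinking these quantities tightens the complexity and Lipschitz
    bounds the abstract refers to.
    """
    terms = []
    for layer in list(model.factors) + [model.head]:
        W = layer.weight
        terms.append(torch.linalg.matrix_norm(W, ord=2))   # l2-operator (spectral) norm
        terms.append(W.abs().sum(dim=1).max())              # l_inf-operator norm
    return mu * torch.stack(terms).sum()


# Usage: add the penalty to the task loss (on clean or adversarial examples alike).
model = CCPNet(in_dim=784, hidden_dim=128, out_dim=10, degree=3)
z = torch.randn(32, 784)
labels = torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(z), labels) + operator_norm_penalty(model)
loss.backward()
```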
