Can convolutional ResNets approximately preserve input distances? A frequency analysis perspective

06/04/2021
by Lewis Smith, et al.

ResNets constrained to be bi-Lipschitz, that is, approximately distance preserving, have been a crucial component of recently proposed techniques for deterministic uncertainty quantification in neural models. We show that theoretical justifications for recent regularisation schemes trying to enforce such a constraint suffer from a crucial flaw: the theoretical link between the regularisation scheme used and bi-Lipschitzness is only valid under conditions which do not hold in practice, rendering existing theory of limited use, despite the strong empirical performance of these models. We provide a theoretical explanation for the effectiveness of these regularisation schemes using a frequency analysis perspective, showing that under mild conditions these schemes will enforce a lower Lipschitz bound on the low-frequency projection of images. We then provide empirical evidence supporting our theoretical claims, and perform further experiments which demonstrate that our broader conclusions appear to hold when some of the mathematical assumptions of our proof are relaxed, corresponding to the setup used in prior work. In addition, we present a simple constructive algorithm to search for counterexamples to the distance preservation condition, and discuss possible implications of our theory for future model design.
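The abstract does not spell out the constructive search, so the following is only a minimal sketch of one plausible instantiation, not the authors' exact algorithm. It optimises a perturbation delta so that an input x and x + delta stay far apart in input space while a convolutional feature map collapses them together (a counterexample to the lower Lipschitz bound), and then reports the distance between the low-frequency projections of the two inputs, which the frequency-analysis claim above predicts should remain comparatively large. The network `feature_extractor` is a hypothetical stand-in for a spectrally normalised convolutional ResNet.

```python
# Sketch: gradient search for counterexamples to distance preservation, plus a
# low-frequency projection to check where the distance is (or is not) preserved.
# All model and parameter choices here are illustrative assumptions.
import torch
import torch.nn as nn


def low_frequency_projection(x, keep_fraction=0.25):
    """Keep only the lowest `keep_fraction` of spatial frequencies per axis."""
    X = torch.fft.fft2(x)                    # 2D FFT over the spatial dimensions
    X = torch.fft.fftshift(X, dim=(-2, -1))  # move the zero frequency to the centre
    h, w = x.shape[-2:]
    mask = torch.zeros(h, w, dtype=torch.bool, device=x.device)
    kh, kw = int(h * keep_fraction / 2), int(w * keep_fraction / 2)
    mask[h // 2 - kh: h // 2 + kh, w // 2 - kw: w // 2 + kw] = True
    X = torch.fft.ifftshift(X * mask, dim=(-2, -1))
    return torch.fft.ifft2(X).real


def search_counter_example(f, x, input_margin=5.0, steps=500, lr=0.05):
    """Find x' with ||x - x'|| >= input_margin but ||f(x) - f(x')|| small."""
    delta = (0.1 * torch.randn_like(x)).requires_grad_(True)
    opt = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        fx = f(x)
    for _ in range(steps):
        opt.zero_grad()
        feat_dist = (f(x + delta) - fx).norm()
        in_dist = delta.norm()
        # Collapse the feature distance while penalising any drop below the input margin.
        loss = feat_dist + torch.relu(input_margin - in_dist)
        loss.backward()
        opt.step()
    return delta.detach()


if __name__ == "__main__":
    # Hypothetical stand-in for a regularised convolutional residual feature extractor.
    feature_extractor = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        nn.Flatten(), nn.Linear(16 * 32 * 32, 64),
    )
    x = torch.rand(1, 3, 32, 32)
    delta = search_counter_example(feature_extractor, x)
    x_adv = x + delta
    print("input distance      :", delta.norm().item())
    print("feature distance    :", (feature_extractor(x_adv) - feature_extractor(x)).norm().item())
    # If the low-frequency projections stay far apart while the features collapse,
    # the counterexample differs mainly in high frequencies, consistent with the claim
    # that the regulariser only controls the low-frequency projection.
    lf_dist = (low_frequency_projection(x_adv) - low_frequency_projection(x)).norm()
    print("low-freq input dist :", lf_dist.item())
```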


