Relaxing Equivariance Constraints with Non-stationary Continuous Filters

Equivariances provide useful inductive biases in neural network modeling, with the translation equivariance of convolutional neural networks being a canonical example. Equivariances can be embedded in architectures through weight-sharing, and they place symmetry constraints on the functions a neural network can represent. The type of symmetry is typically fixed and has to be chosen in advance. Although some tasks are inherently equivariant, many tasks do not strictly follow such symmetries; in such cases, equivariance constraints can be overly restrictive. In this work, we propose a parameter-efficient relaxation of equivariance that can effectively interpolate between (i) a non-equivariant linear product, (ii) a strictly-equivariant convolution, and (iii) a strictly-invariant mapping. The proposed parameterization can be viewed as a building block that allows adjustable symmetry structure in neural networks. Compared to non-equivariant and strictly-equivariant baselines, we verify experimentally that soft equivariance improves test accuracy on the CIFAR-10 and CIFAR-100 image classification tasks.
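To make the interpolation between cases (i) and (ii) concrete, the minimal PyTorch sketch below blends a shared stationary filter with per-position non-stationary weights through a learnable gate: at gate 0 the layer is a strict translation-equivariant 1-D convolution, and at gate 1 it is an unconstrained position-dependent linear map. This is a toy discrete sketch under our own assumptions, not the paper's method (which parameterizes non-stationary continuous filters); the names `RelaxedConv1d`, `w_shared`, `w_pos`, and `gate_logit` are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelaxedConv1d(nn.Module):
    """Toy layer interpolating between a strict 1-D convolution
    (translation-equivariant, shared filter) and an unconstrained
    position-dependent linear map (no weight sharing).

    Illustrative sketch only; not the paper's parameterization.
    """

    def __init__(self, length: int, kernel_size: int):
        super().__init__()
        assert kernel_size % 2 == 1, "use an odd kernel for symmetric padding"
        self.kernel_size = kernel_size
        self.pad = kernel_size // 2
        # Stationary filter, shared across positions: weight sharing yields equivariance.
        self.w_shared = nn.Parameter(0.1 * torch.randn(kernel_size))
        # Non-stationary filters: one per absolute output position.
        self.w_pos = nn.Parameter(0.1 * torch.randn(length, kernel_size))
        # Learnable gate in (0, 1): 0 -> strict convolution, 1 -> fully position-dependent.
        self.gate_logit = nn.Parameter(torch.zeros(()))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length)
        g = torch.sigmoid(self.gate_logit)
        # Effective filter at output position i: (1 - g) * w_shared + g * w_pos[i].
        w = (1 - g) * self.w_shared.unsqueeze(0) + g * self.w_pos   # (length, kernel)
        x_pad = F.pad(x, (self.pad, self.pad))
        windows = x_pad.unfold(1, self.kernel_size, 1)              # (batch, length, kernel)
        return (windows * w.unsqueeze(0)).sum(dim=-1)               # (batch, length)

x = torch.randn(8, 32)
layer = RelaxedConv1d(length=32, kernel_size=3)
print(layer(x).shape)  # torch.Size([8, 32])
```

Regularizing `w_pos` or the gate toward zero is one simple way to bias such a layer back toward the strictly-equivariant solution, which makes the symmetry constraint soft rather than hard.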


