TI-POOLING: transformation-invariant pooling for feature learning in Convolutional Neural Networks

04/21/2016
by Dmitry Laptev, et al.

In this paper we present a deep neural network topology that incorporates a simple-to-implement transformation-invariant pooling operator (TI-POOLING). This operator efficiently exploits prior knowledge of nuisance variations in the data, such as rotation or scale changes. Most current methods rely on dataset augmentation to address this issue, but this requires a larger number of model parameters and more training data, and results in significantly longer training time and a higher chance of under- or overfitting. The main reason for these drawbacks is that the learned model must capture adequate features for all possible transformations of the input. We instead formulate the features of convolutional neural networks to be transformation-invariant. We achieve this by using parallel siamese architectures for the considered transformation set and applying the TI-POOLING operator to their outputs before the fully-connected layers. We show that this topology internally finds the optimal "canonical" instance of the input image for training and therefore limits redundancy in the learned features. This more efficient use of training data results in better performance on popular benchmark datasets with a smaller number of parameters compared to standard convolutional neural networks with dataset augmentation and to other baselines.
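The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of the idea: a set of rotated copies of each input is passed through one weight-shared convolutional branch, and an element-wise maximum over the resulting feature vectors (the TI-POOLING step) is taken before the fully-connected classifier. The class name, layer sizes, rotation set, and the use of torchvision's rotate are illustrative assumptions, not the authors' exact configuration.

```python
# A minimal sketch of TI-POOLING over a set of rotations.
# Network name, layer sizes, and the rotation set are illustrative assumptions.
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF


class TIPoolingNet(nn.Module):
    """Siamese convolutional branches over a set of rotations, followed by
    an element-wise max (TI-POOLING) across branches before the classifier."""

    def __init__(self, num_classes=10, num_rotations=8):
        super().__init__()
        self.angles = [i * 360.0 / num_rotations for i in range(num_rotations)]
        # One shared feature extractor; its weights are reused for every
        # transformed copy of the input (the "siamese" branches).
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, 128), nn.ReLU(),  # assumes 28x28 inputs
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):
        # Apply the shared branch to every transformed instance of the input.
        branch_outputs = [self.features(TF.rotate(x, angle)) for angle in self.angles]
        # TI-POOLING: element-wise maximum over the transformation set, so the
        # pooled feature vector is invariant to the chosen rotations.
        pooled, _ = torch.stack(branch_outputs, dim=0).max(dim=0)
        return self.classifier(pooled)


if __name__ == "__main__":
    model = TIPoolingNet()
    dummy = torch.randn(4, 1, 28, 28)  # batch of 4 grayscale 28x28 images
    print(model(dummy).shape)          # -> torch.Size([4, 10])
```

Because only the branch that produced the maximum activation receives gradient for each feature, training effectively concentrates on one transformed instance per feature, which corresponds to the "canonical" instance selection the abstract refers to.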

Related research

- Isometric Transformation Invariant Graph-based Deep Neural Network (08/21/2018): Learning transformation invariant representations of visual data is an i...
- Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration (02/08/2022): Leveraging prior knowledge on intraclass variance due to transformations...
- Towards Dropout Training for Convolutional Neural Networks (12/01/2015): Recently, dropout has seen increasing use in deep learning. For deep con...
- Locally Scale-Invariant Convolutional Neural Networks (12/16/2014): Convolutional Neural Networks (ConvNets) have shown excellent results on...
- Generalizing Pooling Functions in Convolutional Neural Networks: Mixed, Gated, and Tree (09/30/2015): We seek to improve deep neural networks by generalizing the pooling oper...
- Best Practices for Convolutional Neural Networks Applied to Object Recognition in Images (10/29/2019): This research project studies the impact of convolutional neural network...
- DeepSquare: Boosting the Learning Power of Deep Convolutional Neural Networks with Elementwise Square Operators (06/12/2019): Modern neural network modules which can significantly enhance the learni...
