Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks

01/29/2018
by   Jason Kuen, et al.
It is desirable to train convolutional networks (CNNs) to run more efficiently during inference. In many cases, however, the computational budget available for inference cannot be known beforehand during training, or the inference budget depends on changing real-time resource availability. It is therefore inadequate to train only inference-efficient CNNs whose inference costs are fixed and cannot adapt to varied inference budgets. We propose a novel approach for cost-adjustable inference in CNNs: Stochastic Downsampling Point (SDPoint). During training, SDPoint applies feature map downsampling at a random point in the layer hierarchy, with a random downsampling ratio. The different stochastic downsampling configurations, known as SDPoint instances (of the same model), have computational costs different from each other, while being trained to minimize the same prediction loss. Sharing network parameters across the different instances provides a significant regularization boost. During inference, one may handpick an SDPoint instance that best fits the inference budget. The effectiveness of SDPoint, as both a cost-adjustable inference approach and a regularizer, is validated through extensive experiments on image classification.
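The training procedure described above can be sketched in a few lines: sample a random downsampling point and ratio, then apply the downsampling at that point during the forward pass. This is a minimal illustrative sketch, not the authors' implementation; the function names, the candidate ratio set, and the convention that an index one past the last block means "no downsampling" are all assumptions.

```python
import random

def pick_sdpoint(num_blocks, ratios=(0.5, 0.75), seed=None):
    """Sample one stochastic downsampling configuration (an "SDPoint instance").

    Returns (block_index, ratio): downsample the feature map by `ratio`
    just before block `block_index`. Index `num_blocks` (one past the
    last block) stands for "no downsampling", i.e. the full-cost instance.
    The ratio set here is an illustrative assumption, not the paper's
    exact choice.
    """
    rng = random.Random(seed)
    idx = rng.randrange(num_blocks + 1)   # +1 so "no downsampling" is a valid draw
    ratio = rng.choice(ratios)
    return idx, ratio

def forward_with_sdpoint(x, blocks, sdpoint, downsample):
    """Run `x` through `blocks`, downsampling once at the chosen point.

    `downsample(x, ratio)` is a placeholder for a spatial downsampling
    op (e.g. pooling or strided interpolation on a feature map).
    """
    idx, ratio = sdpoint
    for i, block in enumerate(blocks):
        if i == idx:
            x = downsample(x, ratio)
        x = block(x)
    return x
```

At inference time, instead of sampling, one would pass a fixed `(block_index, ratio)` pair chosen to match the available compute budget, since every such instance was trained against the same prediction loss with shared parameters.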

