Smooth Inter-layer Propagation of Stabilized Neural Networks for Classification

09/27/2018
by Jingfeng Zhang, et al.

Recent work has studied the reasons for the remarkable performance of deep neural networks in image classification. We examine batch normalization on the one hand and the dynamical systems view of residual networks on the other. Our goal is to understand the notions of stability and smoothness of convergence in ResNets, so as to explain when they contribute to significantly enhanced performance. We postulate that convergence stability is important for a trained ResNet to transfer.
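The dynamical systems view mentioned in the abstract reads a residual block as one forward-Euler step of an ODE, x_{t+1} = x_t + h * f(x_t), so the stability of that discretization becomes one lens on how smoothly features propagate between layers. Below is a minimal PyTorch sketch of this reading, with batch normalization inside the residual branch; the class name `EulerResidualBlock` and the explicit `step_size` parameter are illustrative assumptions for exposition, not the paper's implementation.

```python
import torch
import torch.nn as nn

class EulerResidualBlock(nn.Module):
    """A residual block viewed as one forward-Euler step of an ODE:
    x_{t+1} = x_t + h * f(x_t), where f is a small conv sub-network
    that uses batch normalization. The step size h is illustrative."""

    def __init__(self, channels: int, step_size: float = 0.1):
        super().__init__()
        self.step_size = step_size
        # Pre-activation residual branch: BatchNorm -> ReLU -> Conv, twice.
        self.f = nn.Sequential(
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # A smaller step size perturbs the features less per layer,
        # giving a smoother layer-to-layer trajectory.
        return x + self.step_size * self.f(x)

block = EulerResidualBlock(channels=16)
x = torch.randn(8, 16, 32, 32)
print(block(x).shape)  # torch.Size([8, 16, 32, 32])
```

Shrinking `step_size` damps how much each block changes its input, which is one simple way to stabilize inter-layer propagation in this ODE reading; the paper's specific stabilization scheme may differ.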

Related research

10/27/2017
Multi-level Residual Networks from Dynamical Systems View
Deep residual networks (ResNets) and their variants are widely used in m...

09/21/2020
Kernel-Based Smoothness Analysis of Residual Networks
A major factor in the success of deep neural networks is the use of soph...

04/18/2022
An Optimal Time Variable Learning Framework for Deep Neural Networks
Feature propagation in Deep Neural Networks (DNNs) can be associated to ...

03/22/2023
An Empirical Analysis of the Shift and Scale Parameters in BatchNorm
Batch Normalization (BatchNorm) is a technique that improves the trainin...

02/18/2019
LocalNorm: Robust Image Classification through Dynamically Regularized Normalization
While modern convolutional neural networks achieve outstanding accuracy ...

06/11/2018
State Space Representations of Deep Neural Networks
This paper deals with neural networks as dynamical systems governed by d...

05/16/2019
AlgoNet: C^∞ Smooth Algorithmic Neural Networks
Artificial neural networks revolutionized many areas of computer science...
