Stable ResNet

10/24/2020
by Soufiane Hayou et al.

Deep ResNet architectures have achieved state-of-the-art performance on many tasks. While they solve the problem of vanishing gradients, they might suffer from exploding gradients as the depth becomes large (Yang et al. 2017). Moreover, recent results have shown that ResNet might lose expressivity as the depth goes to infinity (Yang et al. 2017, Hayou et al. 2019). To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, which stabilize the gradient while ensuring expressivity in the infinite-depth limit.
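The abstract does not spell out the stabilization mechanism, but the scaling studied in Stable ResNet multiplies each residual branch by a depth-dependent factor, with the uniform choice 1/√L (L being the total depth) as the canonical example. Below is a minimal PyTorch sketch of that idea; the class names, layer widths, and the uniform 1/√L choice are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch (not the authors' implementation): a residual block
# whose branch output is scaled by 1/sqrt(L), which keeps the accumulated
# gradient norm bounded as the depth L grows.
import math
import torch
import torch.nn as nn


class ScaledResidualBlock(nn.Module):
    def __init__(self, width: int, depth: int):
        super().__init__()
        # Uniform scaling assumption: lambda_l = 1 / sqrt(L) for every block.
        self.scale = 1.0 / math.sqrt(depth)
        self.branch = nn.Sequential(
            nn.Linear(width, width),
            nn.ReLU(),
            nn.Linear(width, width),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y_{l+1} = y_l + (1 / sqrt(L)) * F(y_l)
        return x + self.scale * self.branch(x)


class StableResNetSketch(nn.Module):
    def __init__(self, width: int = 64, depth: int = 100):
        super().__init__()
        self.blocks = nn.Sequential(
            *[ScaledResidualBlock(width, depth) for _ in range(depth)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.blocks(x)


# Example usage: even at depth 100, activations stay well-scaled.
net = StableResNetSketch(width=64, depth=100)
x = torch.randn(8, 64)
y = net(x)
```

Without the 1/√L factor (i.e., scale = 1), the variance of the output grows with depth; with it, the sum of the squared scaling factors stays bounded, which is the property the paper exploits to control gradients at large depth.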

