Optimal Approximation and Learning Rates for Deep Convolutional Neural Networks

08/07/2023
by   Shao-Bo Lin, et al.

This paper analyzes the approximation and learning performance of deep convolutional neural networks with zero-padding and max-pooling. We prove that, to approximate an r-smooth function, deep convolutional neural networks of depth L achieve approximation rates of order (L^2/log L)^{-2r/d}, which is optimal up to a logarithmic factor. Building on this, we derive almost optimal learning rates for empirical risk minimization over deep convolutional neural networks.
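In display form, the claimed approximation rate can be sketched as follows (this is a restatement of the abstract, not the paper's exact theorem: the hypothesis class \mathcal{H}_L, the norm \|\cdot\|, and the constant C are notational assumptions, since the abstract does not fix them):

```latex
\inf_{h \in \mathcal{H}_L} \, \| f - h \| \;\le\; C \left( \frac{L^2}{\log L} \right)^{-2r/d},
```

where f is an r-smooth target function on a d-dimensional domain and \mathcal{H}_L denotes the class of depth-L convolutional networks with zero-padding and max-pooling. The rate improves as depth L grows and degrades with the dimension d, matching the usual smoothness/dimension trade-off.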


Related research

- Optimal Learning Rates of Deep Convolutional Neural Networks: Additive Ridge Functions (02/24/2022)
  Convolutional neural networks have shown extraordinary abilities in many...

- Learning Ability of Interpolating Convolutional Neural Networks (10/25/2022)
  It is frequently observed that overparameterized neural networks general...

- Detecting Slag Formations with Deep Convolutional Neural Networks (10/13/2021)
  We investigate the ability to detect slag formations in images from insi...

- DeepSquare: Boosting the Learning Power of Deep Convolutional Neural Networks with Elementwise Square Operators (06/12/2019)
  Modern neural network modules which can significantly enhance the learni...

- The Costs and Benefits of Goal-Directed Attention in Deep Convolutional Neural Networks (02/06/2020)
  Attention in machine learning is largely bottom-up, whereas people also ...

- Universal Consistency of Deep Convolutional Neural Networks (06/23/2021)
  Compared with avid research activities of deep convolutional neural netw...

- Squeezed Very Deep Convolutional Neural Networks for Text Classification (01/28/2019)
  Most of the research in convolutional neural networks has focused on inc...
