A Survey of Deep Learning: From Activations to Transformers

02/01/2023
by Johannes Schneider, et al.

Deep learning has made tremendous progress over the last decade. A key success factor is the large number of architectures, layers, objectives, and optimization techniques that have emerged in recent years, including myriad variants of attention, normalization, skip connections, transformers, and self-supervised learning schemes, to name a few. We provide a comprehensive overview of the most important recent works in these areas for readers who already have a basic understanding of deep learning. We hope that a holistic, unified treatment of these influential works helps researchers form new connections between diverse areas of deep learning.
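To make a few of the surveyed ingredients concrete, here is a minimal NumPy sketch (illustrative only, not taken from the survey) of a transformer-style sublayer that combines three of the techniques named above: scaled dot-product attention, layer normalization, and a skip connection. All function names and sizes are assumptions for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

def layer_norm(x, eps=1e-5):
    """Normalize each token's feature vector to zero mean, unit variance."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

# Toy sequence: 4 tokens with 8 features each; self-attention uses Q = K = V.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))

# Pre-norm sublayer: skip connection adds the input back to the attention output.
h = layer_norm(x)
out = x + scaled_dot_product_attention(h, h, h)
print(out.shape)  # (4, 8)
```

The skip connection (`x + ...`) lets gradients bypass the attention block, while the pre-norm placement of layer normalization is one of the design variants the survey's normalization discussion covers.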


Related research

- Self-Supervised Learning with Swin Transformers (05/10/2021)
- Beyond BatchNorm: Towards a General Understanding of Normalization in Deep Learning (06/10/2021)
- Optimization Methods in Deep Learning: A Comprehensive Overview (02/19/2023)
- An Essay on Optimization Mystery of Deep Learning (05/17/2019)
- Deep Learning for Visual Localization and Mapping: A Survey (08/27/2023)
- A Selective Overview of Deep Learning (04/10/2019)
- Survey on software ISP methods based on Deep Learning (05/19/2023)
