Towards an understanding of CNNs: analysing the recovery of activation pathways via Deep Convolutional Sparse Coding

06/26/2018
by Michael Murray, et al.

Deep Convolutional Sparse Coding (D-CSC) is a framework reminiscent of deep convolutional neural networks (DCNNs), but because the dictionaries are fixed rather than learned, it allows a more transparent analysis of the activation function and of its ability to recover activation paths through the layers. Papyan, Romano and Elad analysed such an architecture, demonstrated its relationship with DCNNs, and proved conditions under which D-CSC is guaranteed to recover specific activation paths. A technical innovation of their work is that the efficacy of the ReLU activation function in a DCNN can be viewed through a new variant of the tensor's sparsity, referred to as stripe-sparsity; using this notion they proved that representations whose activation density is proportional to the ambient dimension of the data are recoverable. We extend their uniform guarantees to a modified model and prove that, with high probability, the true activations can typically be recovered at a greater density of activations per layer. Our extension builds on prior work on one-step thresholding by Schnass and Vandergheynst.
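To make the mechanism concrete, here is a minimal sketch of one layer of thresholding pursuit in plain NumPy: a biased ReLU acts as a nonnegative soft-thresholding operator applied to correlations with a fixed dictionary, and for a typical (random) support it recovers the true activations from a noiseless measurement. All dimensions, the bias value, and the random Gaussian dictionary are illustrative assumptions for this sketch, not quantities taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random matrix stands in for the fixed, unlearned convolutional
# dictionary; atoms are normalised to unit norm (illustrative dimensions).
n, m, k = 400, 800, 4            # signal dim, number of atoms, sparsity
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)

# Ground-truth sparse, nonnegative activation: one layer's "activation path".
support = rng.choice(m, size=k, replace=False)
gamma = np.zeros(m)
gamma[support] = rng.uniform(1.5, 2.5, size=k)

x = D @ gamma                    # noiseless observation x = D @ gamma

# One-step thresholding: correlate with the dictionary, then apply a biased
# ReLU, i.e. the nonnegative soft-thresholding operator
# S_b(v) = max(v - b, 0).  For a typical random support this isolates the
# true activations with high probability.
b = 0.9                          # threshold / ReLU bias (illustrative choice)
gamma_hat = np.maximum(D.T @ x - b, 0.0)

print("true support:     ", np.sort(support))
print("recovered support:", np.flatnonzero(gamma_hat))
```

The bias plays the role of the threshold in one-step thresholding: too small and spurious atoms survive, too large and weak true activations are suppressed, which is why recoverable activation density is governed by how well the true correlations separate from the off-support ones.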

Related research:

02/10/2020 · On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks
In this paper, we have extended the well-established universal approxima...

05/08/2023 · TaLU: A Hybrid Activation Function Combining Tanh and Rectified Linear Unit to Enhance Neural Networks
The application of the deep learning model in classification plays an im...

01/29/2022 · On Polynomial Approximation of Activation Function
In this work, we propose an interesting method that aims to approximate ...

07/13/2022 · MorphoActivation: Generalizing ReLU activation function by mathematical morphology
This paper analyses both nonlinear activation functions and spatial max-...

06/18/2020 · Image classification in frequency domain with 2SReLU: a second harmonics superposition activation function
Deep Convolutional Neural Networks are able to identify complex patterns...

03/12/2018 · Representation Learning and Recovery in the ReLU Model
Rectified linear units, or ReLUs, have become the preferred activation f...

02/08/2018 · A Generalization Method of Partitioned Activation Function for Complex Number
A method to convert real number partitioned activation function into com...
