Convolutional Rectifier Networks as Generalized Tensor Decompositions

03/01/2016
by Nadav Cohen, et al.

Convolutional rectifier networks, i.e. convolutional neural networks with rectified linear activation and max or average pooling, are the cornerstone of modern deep learning. However, despite their wide use and success, our theoretical understanding of the expressive properties that drive these networks is partial at best. On the other hand, we have a much firmer grasp of these issues in the world of arithmetic circuits. Specifically, it is known that convolutional arithmetic circuits possess the property of "complete depth efficiency", meaning that besides a negligible set, all functions that can be implemented by a deep network of polynomial size require exponential size in order to be implemented (or even approximated) by a shallow network. In this paper we describe a construction based on generalized tensor decompositions that transforms convolutional arithmetic circuits into convolutional rectifier networks. We then use mathematical tools available from the world of arithmetic circuits to prove new results. First, we show that convolutional rectifier networks are universal with max pooling but not with average pooling. Second, and more importantly, we show that depth efficiency is weaker with convolutional rectifier networks than it is with convolutional arithmetic circuits. This leads us to believe that developing effective methods for training convolutional arithmetic circuits, thereby fulfilling their expressive potential, may give rise to a deep learning architecture that is provably superior to convolutional rectifier networks but has so far been overlooked by practitioners.
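
The construction mentioned in the abstract can be pictured as a single decomposition template in which only the "activation-pooling" operator changes: multiplication yields a convolutional arithmetic circuit, while ReLU combined with max or average pooling yields a convolutional rectifier network. Below is a minimal NumPy sketch of that idea, not the paper's implementation: the function names, the pairwise pooling wiring in shallow_network, the parameter shapes, and the 0.5 normalization for average pooling are illustrative assumptions.

```python
import numpy as np

# Generalized "activation-pooling" operators rho(a, b) = pool(activation(a), activation(b)).
# Swapping rho inside one decomposition template changes the resulting architecture.

def rho_arithmetic(a, b):
    # linear activation + product pooling -> convolutional arithmetic circuit
    return a * b

def rho_relu_max(a, b):
    # ReLU activation + max pooling -> convolutional rectifier network (max pooling)
    return np.maximum(np.maximum(a, 0.0), np.maximum(b, 0.0))

def rho_relu_avg(a, b):
    # ReLU activation + average pooling -> convolutional rectifier network (avg pooling)
    return 0.5 * (np.maximum(a, 0.0) + np.maximum(b, 0.0))

def shallow_network(patches, filters, weights, rho):
    """Toy shallow model in the generalized-decomposition template (illustrative).

    patches : (N, d)   N local input patches, each represented as a d-vector
    filters : (d, r)   r representation functions applied to every patch
    weights : (r,)     output weights applied after global pooling
    rho     : callable one of the activation-pooling operators above
    """
    rep = patches @ filters            # (N, r): patch-wise representation layer
    pooled = rep[0]
    for i in range(1, rep.shape[0]):   # global pooling by repeated application of rho
        pooled = rho(pooled, rep[i])
    return pooled @ weights            # linear output layer

rng = np.random.default_rng(0)
patches = rng.normal(size=(4, 5))
filters = rng.normal(size=(5, 3))
weights = rng.normal(size=3)

for name, rho in [("arithmetic", rho_arithmetic),
                  ("relu+max", rho_relu_max),
                  ("relu+avg", rho_relu_avg)]:
    print(name, shallow_network(patches, filters, weights, rho))
```

The only difference between the three variants is the choice of rho, which is what allows proof techniques developed for arithmetic circuits to be carried over to rectifier networks with max or average pooling.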

Related research

On the Expressive Power of Deep Learning: A Tensor Analysis (09/16/2015)
It has long been conjectured that hypotheses spaces suitable for data th...

Tractable Generative Convolutional Arithmetic Circuits (10/13/2016)
Casting neural networks in generative frameworks is a highly sought-afte...

Boosting Dilated Convolutional Networks with Mixed Tensor Decompositions (03/20/2017)
The driving force behind deep networks is their ability to compactly rep...

On Relaxing Determinism in Arithmetic Circuits (08/22/2017)
The past decade has seen a significant interest in learning tractable pr...

Analysis and Design of Convolutional Networks via Hierarchical Tensor Decompositions (05/05/2017)
The driving force behind convolutional networks - the most successful de...

Deep Learning and Quantum Entanglement: Fundamental Connections with Implications to Network Design (04/05/2017)
Deep convolutional networks have witnessed unprecedented success in vari...

Inductive Bias of Deep Convolutional Networks through Pooling Geometry (05/22/2016)
Our formal understanding of the inductive bias that drives the success o...
