Boosting Dilated Convolutional Networks with Mixed Tensor Decompositions

by   Nadav Cohen, et al.

The driving force behind deep networks is their ability to compactly represent rich classes of functions. The primary notion for formally reasoning about this phenomenon is expressive efficiency, which refers to a situation where one network must grow unfeasibly large in order to realize (or approximate) functions of another. To date, expressive efficiency analyses have focused on the architectural feature of depth, showing that deep networks are representationally superior to shallow ones. In this paper we study the expressive efficiency brought forth by connectivity, motivated by the observation that modern networks interconnect their layers in elaborate ways. We focus on dilated convolutional networks, a family of deep models delivering state-of-the-art performance in sequence processing tasks. By introducing and analyzing the concept of mixed tensor decompositions, we prove that interconnecting dilated convolutional networks can lead to expressive efficiency. In particular, we show that even a single connection between intermediate layers can already lead to an almost quadratic gap, which in large-scale settings typically makes the difference between a model that is practical and one that is not. Empirical evaluation demonstrates how the expressive efficiency of connectivity, similarly to that of depth, translates into gains in accuracy. This leads us to believe that expressive efficiency may play a key role in the development of new tools for deep network design.
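To make the object of study concrete: a minimal sketch of a dilated 1D convolution and the layer stacking scheme (dilations doubling with depth) characteristic of the network family the paper analyzes. This is an illustrative toy, not the paper's model — the actual architectures also include product pooling and the inter-network connections under study; the function name and the kernel values are ours.

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """Valid 1D convolution of x with kernel w, whose taps are
    spaced `dilation` positions apart (dilation=1 is an ordinary
    convolution)."""
    k = len(w)
    span = (k - 1) * dilation + 1  # receptive field of this layer
    out_len = len(x) - span + 1
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(out_len)
    ])

# Stacking size-2 kernels with dilations 1, 2, 4, ... makes the
# receptive field grow exponentially with depth, which is what lets
# dilated convolutional networks model long sequences compactly.
x = np.arange(16, dtype=float)
w = np.array([1.0, 1.0])  # toy kernel; real networks learn these
h = x
for d in (1, 2, 4):
    h = dilated_conv1d(h, w, dilation=d)
# After three layers each output sees 1 + 1 + 2 + 4 = 8 inputs,
# so the 16-sample input yields 16 - 8 + 1 = 9 outputs.
```

The paper's "mixing" corresponds to adding connections between the intermediate feature maps of two such stacks (e.g. with different dilation orderings), which is where the expressive-efficiency gap arises.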


