Deep Exponential-Family Auto-Encoders

by Bahareh Tolooshams, et al.

We consider the problem of learning recurring convolutional patterns from data that are not necessarily real valued, such as binary or count-valued data. We cast the problem as one of learning a convolutional dictionary, subject to sparsity constraints, given observations drawn from a distribution that belongs to the canonical exponential family. We propose two general approaches towards its solution. The first approach uses the ℓ_0 pseudo-norm to enforce sparsity and is reminiscent of the alternating-minimization algorithm for classical convolutional dictionary learning (CDL). The second approach, which uses the ℓ_1 norm to enforce sparsity, generalizes to the exponential family the recently-shown connection between CDL and a class of ReLU auto-encoders for Gaussian observations. The two approaches can each be interpreted as an auto-encoder, the weights of which are in one-to-one correspondence with the parameters of the convolutional dictionary. Our key insight is that, unless the observations are Gaussian valued, the input fed into the encoder ought to be modified iteratively, and in a specific manner, using the parameters of the dictionary. Compared to the ℓ_0 approach, once trained, the forward pass through the ℓ_1 encoder computes sparse codes orders of magnitude more efficiently. We apply the two approaches to the unsupervised learning of the stimulus effect from neural spiking data acquired in the barrel cortex of mice in response to periodic whisker deflections. We demonstrate that they are both superior to generalized linear models, which rely on hand-crafted features.



Scalable Convolutional Dictionary Learning with Constrained Recurrent Sparse Auto-encoders

Given a convolutional dictionary underlying a set of observed signals, c...

Mixture Model Auto-Encoders: Deep Clustering through Dictionary Learning

State-of-the-art approaches for clustering high-dimensional data utilize...

Deep Residual Auto-Encoders for Expectation Maximization-based Dictionary Learning

Convolutional dictionary learning (CDL) has become a popular method for ...

RandNet: deep learning with compressed measurements of images

Principal component analysis, dictionary learning, and auto-encoders are...

Relating graph auto-encoders to linear models

Graph auto-encoders are widely used to construct graph representations i...

Deep Convolutional Compressed Sensing for LiDAR Depth Completion

In this paper we consider the problem of estimating a dense depth map fr...

Deep Double Sparsity Encoder: Learning to Sparsify Not Only Features But Also Parameters

This paper emphasizes the significance to jointly exploit the problem st...
