Interpretable Deep Recurrent Neural Networks via Unfolding Reweighted ℓ_1-ℓ_1 Minimization: Architecture Design and Generalization Analysis

by Huynh Van Luong, et al.

Deep unfolding methods—for example, the learned iterative shrinkage thresholding algorithm (LISTA)—design deep neural networks as learned variations of optimization methods. These networks have been shown to achieve faster convergence and higher accuracy than the original optimization methods. In this line of research, this paper develops a novel deep recurrent neural network (coined reweighted-RNN) by unfolding a reweighted ℓ_1-ℓ_1 minimization algorithm and applies it to the task of sequential signal reconstruction. To the best of our knowledge, this is the first deep unfolding method that explores reweighted minimization. Due to the underlying reweighted minimization model, our RNN has a different soft-thresholding function (that is, a different activation function) for each hidden unit in each layer. Furthermore, it has higher network expressivity than existing deep unfolding RNN models due to its over-parameterized weights. Importantly, we establish theoretical generalization error bounds for the proposed reweighted-RNN model by means of Rademacher complexity. The bounds reveal that the parameterization of the proposed reweighted-RNN ensures good generalization. We apply the proposed reweighted-RNN to the problem of video frame reconstruction from low-dimensional measurements, that is, sequential frame reconstruction. The experimental results on the Moving MNIST dataset demonstrate that the proposed deep reweighted-RNN significantly outperforms existing RNN models.
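To make the unfolding idea concrete, the sketch below unrolls plain ISTA (the non-learned ancestor of LISTA) for a fixed number of iterations, so each iteration plays the role of one network "layer" with a soft-thresholding activation. This is a minimal illustration of the general technique, not the paper's reweighted-RNN: in LISTA the matrices and thresholds would be learned, and in the reweighted-RNN each hidden unit in each layer would get its own threshold. All function names and parameters here are our own illustrative choices.

```python
import numpy as np

def soft_threshold(x, theta):
    # Element-wise soft-thresholding: the proximal operator of the l1 norm.
    # Deep unfolding networks use this as the layer activation function;
    # the reweighted model in the paper learns a distinct threshold per
    # hidden unit per layer instead of a single scalar theta.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def unfolded_ista(y, A, num_layers=10, step=None, lam=0.1):
    # Unrolls the ISTA recursion
    #   x^{k+1} = soft_threshold(x^k - step * A^T (A x^k - y), step * lam)
    # into num_layers feed-forward stages. In LISTA-style networks the
    # matrices (I - step*A^T A), step*A^T and the thresholds become
    # trainable weights instead of being fixed by A.
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_layers):
        x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    return x
```

Stacking such layers and letting gradient descent tune the per-layer weights is exactly what turns the optimization algorithm into a trainable network.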


Designing recurrent neural networks by unfolding an l1-l1 minimization algorithm

We propose a new deep recurrent neural network (RNN) architecture for se...

Generalization Error Bounds for Iterative Recovery Algorithms Unfolded as Neural Networks

Motivated by the learned iterative soft thresholding algorithm (LISTA), ...

A New Hybrid-parameter Recurrent Neural Networks for Online Handwritten Chinese Character Recognition

The recurrent neural network (RNN) is appropriate for dealing with tempo...

Learned ISTA with Error-based Thresholding for Adaptive Sparse Coding

The learned iterative shrinkage thresholding algorithm (LISTA) introduce...

Neural Estimation and Optimization of Directed Information over Continuous Spaces

This work develops a new method for estimating and optimizing the direct...

Optimizing Recurrent Neural Networks Architectures under Time Constraints

Recurrent neural network (RNN)'s architecture is a key factor influencin...

Generalization bounds for deep thresholding networks

We consider compressive sensing in the scenario where the sparsity basis...
