Only sparsity based loss function for learning representations

03/07/2019
by Vivek Bakaraju, et al.

We study the emergence of sparse representations in neural networks. We show that in unsupervised models with regularization, sparsity emerges when the input data samples are distributed along a highly non-linear or discontinuous manifold. We derive a similar argument for discriminatively trained networks and present experiments supporting this hypothesis. Based on our study of sparsity, we introduce a new loss function that can be used as a regularization term for models such as autoencoders and MLPs. The same loss function can also serve as the cost function of an unsupervised single-layer neural network model for learning efficient representations.
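The abstract does not spell out the form of the proposed loss, so as a rough illustration of how a sparsity term enters an autoencoder objective, the PyTorch sketch below uses a standard L1 activation penalty as a stand-in; the class and function names (SparseAutoencoder, loss_fn, lam) are illustrative, not the authors' implementation.

```python
# Minimal sketch of a sparsity-regularized autoencoder objective.
# NOTE: the paper's actual loss function is not reproduced here; the
# L1 penalty on the hidden code is a common stand-in for illustration.
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Linear(hidden_dim, in_dim)

    def forward(self, x):
        h = self.encoder(x)           # hidden representation (the "code")
        return self.decoder(h), h

def loss_fn(x, x_hat, h, lam=1e-3):
    # reconstruction term + sparsity penalty on the hidden activations
    recon = nn.functional.mse_loss(x_hat, x)
    sparsity = h.abs().mean()         # L1 penalty encourages sparse codes
    return recon + lam * sparsity

# usage on a dummy batch
model = SparseAutoencoder()
x = torch.rand(32, 784)
x_hat, h = model(x)
loss = loss_fn(x, x_hat, h)
loss.backward()
```

Per the abstract, the authors' loss can also stand alone as the cost function of an unsupervised single-layer model; the specific form that makes this possible is not given in this excerpt, so the sketch above keeps the conventional reconstruction-plus-penalty structure.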

