Implicit Regularization with Polynomial Growth in Deep Tensor Factorization

07/18/2022
by   Kais Hariz, et al.

We study the implicit regularization effects of deep learning in tensor factorization. While implicit regularization in deep matrix and 'shallow' tensor factorization via linear and certain types of non-linear neural networks promotes low-rank solutions with at most quadratic growth, we show that its effect in deep tensor factorization grows polynomially with the depth of the network. This provides a remarkably faithful description of the observed experimental behaviour. Using numerical experiments, we demonstrate the benefits of this implicit regularization in yielding more accurate estimates and better convergence properties.
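The setting can be made concrete with a small experiment. Below is a minimal, hypothetical sketch (not the authors' code): an order-3 CP tensor-completion problem in PyTorch in which each factor matrix is over-parameterized as a product of `depth` matrices and fitted by plain gradient descent from a small initialization. The tensor sizes, rank, width R, depth, learning rate, and step count are illustrative assumptions. The point is the implicit bias: although the parameterization allows rank R, the end-to-end factors learned by gradient descent tend toward numerically low rank.

```python
import torch

torch.manual_seed(0)

# Ground-truth low-rank order-3 tensor (sizes and rank are illustrative only).
I, J, K, true_rank = 10, 10, 10, 2
A0, B0, C0 = torch.randn(I, true_rank), torch.randn(J, true_rank), torch.randn(K, true_rank)
T = torch.einsum('ir,jr,kr->ijk', A0, B0, C0)

# Observe a random subset of entries (tensor-completion setting).
mask = torch.rand(I, J, K) < 0.3

# "Deep" factorization: each CP factor matrix is the product of `depth` matrices,
# over-parameterized with width R > true_rank and initialized near zero.
depth, R, scale = 3, 10, 1e-2

def deep_factor(n_rows):
    shapes = [(n_rows, R)] + [(R, R)] * (depth - 1)
    return [torch.nn.Parameter(scale * torch.randn(*s)) for s in shapes]

factors = [deep_factor(n) for n in (I, J, K)]
params = [w for f in factors for w in f]

def collapse(layers):
    # Multiply the depth-many matrices into a single end-to-end factor matrix.
    out = layers[0]
    for w in layers[1:]:
        out = out @ w
    return out

opt = torch.optim.SGD(params, lr=0.05)
for step in range(20000):
    A, B, C = (collapse(f) for f in factors)
    pred = torch.einsum('ir,jr,kr->ijk', A, B, C)
    loss = ((pred - T)[mask] ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Inspect the implicit bias: the singular values of the collapsed A factor
# typically decay sharply, with only about `true_rank` of them non-negligible.
print(torch.linalg.svdvals(collapse(factors[0])))
```

Varying `depth` in such a sketch is the natural way to probe the depth-dependent (polynomially growing) regularization effect described above; the sketch itself only illustrates the general setup, not the paper's specific experiments.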

Related research

- 02/19/2021 · Implicit Regularization in Tensor Factorization
- 01/27/2022 · Implicit Regularization in Hierarchical Tensor Factorization and Deep Convolutional Neural Networks
- 01/13/2020 · On implicit regularization: Morse functions and applications to matrix factorization
- 05/04/2021 · Implicit Regularization in Deep Tensor Factorization
- 06/17/2021 · Adaptive Low-Rank Regularization with Damping Sequences to Restrict Lazy Weights in Deep Networks
- 04/18/2023 · Generalized Implicit Factorization Problem
- 05/10/2021 · Exploiting Elasticity in Tensor Ranks for Compressing Neural Networks
