A consistent and flexible framework for deep matrix factorizations

by Pierre De Handschutter et al.

Deep matrix factorizations (deep MFs) are recent unsupervised data mining techniques inspired by constrained low-rank approximations. They aim to extract complex hierarchies of features from high-dimensional datasets. Most loss functions proposed in the literature to evaluate the quality of deep MF models, and the underlying optimization frameworks, are not consistent because different losses are used at different layers. In this paper, we introduce two meaningful loss functions for deep MF and present a generic framework to solve the corresponding optimization problems. We illustrate the effectiveness of this approach through the integration of various constraints and regularizations, such as sparsity, nonnegativity and minimum volume. The models are successfully applied to both synthetic and real data, namely for hyperspectral unmixing and the extraction of facial features.
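To make the setting concrete, a common baseline for deep MF is the sequential scheme: factorize X ≈ W1·H1, then H1 ≈ W2·H2, and so on, giving X ≈ W1·W2·…·WL·HL. The sketch below (an illustrative assumption, not the consistent framework proposed in the paper) implements this layer-by-layer approach with plain nonnegative MF solved by multiplicative updates; note that each layer minimizes its own local loss, which is exactly the inconsistency the paper addresses.

```python
import numpy as np

def nmf(X, r, iters=300, seed=0):
    """Plain NMF X ~ W @ H via Lee-Seung multiplicative updates
    on the Frobenius loss. W is (m, r), H is (r, n), both nonnegative."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 1e-2
    H = rng.random((r, n)) + 1e-2
    eps = 1e-12  # avoids division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

def sequential_deep_nmf(X, ranks, iters=300):
    """Sequential deep MF: peel off one factor per layer,
    X ~ W1 @ W2 @ ... @ WL @ HL with decreasing ranks.
    Each layer optimizes its own local loss (the inconsistent baseline)."""
    Ws, H = [], X
    for r in ranks:
        W, H = nmf(H, r, iters)
        Ws.append(W)
    return Ws, H

# Synthetic nonnegative data of exact rank 4, decomposed with two layers
# of ranks 8 > 4 (all sizes here are arbitrary illustration choices).
rng = np.random.default_rng(1)
X = rng.random((30, 4)) @ rng.random((4, 50))
Ws, H = sequential_deep_nmf(X, ranks=[8, 4])
approx = Ws[0] @ Ws[1] @ H
rel_err = np.linalg.norm(X - approx) / np.linalg.norm(X)
```

Because the layers are fitted greedily, the product W1·W2·H2 is not the minimizer of any single global loss over all factors, which motivates the consistent loss functions introduced in the paper.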


Related research

- Deep matrix factorizations: "Constrained low-rank matrix approximations have been known for decades a..."
- A Flexible Optimization Framework for Regularized Matrix-Tensor Factorizations with Linear Couplings: "Coupled matrix and tensor factorizations (CMTF) are frequently used to j..."
- Gradient descent in Gaussian random fields as a toy model for high-dimensional optimisation in deep learning: "In this paper we model the loss function of high-dimensional optimizatio..."
- Matrix denoising for weighted loss functions and heterogeneous signals: "We consider the problem of recovering a low-rank matrix from a noisy obs..."
- A Unified Joint Matrix Factorization Framework for Data Integration: "Nonnegative matrix factorization (NMF) is a powerful tool in data explor..."
- Certifying Out-of-Domain Generalization for Blackbox Functions: "Certifying the robustness of model performance under bounded data distri..."
