An Inequality with Applications to Structured Sparsity and Multitask Dictionary Learning

02/08/2014
by Andreas Maurer, et al.

From concentration inequalities for the suprema of Gaussian or Rademacher processes, an inequality is derived. It is applied to sharpen existing bounds and to derive novel bounds on the empirical Rademacher complexities of unit balls in various norms arising in the context of structured sparsity and multitask dictionary learning or matrix factorization. A key role is played by the largest eigenvalue of the data covariance matrix.
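To make the abstract's central quantity concrete, the sketch below estimates the empirical Rademacher complexity of the simplest unit ball, the Euclidean one, by Monte Carlo, and compares it to the classical trace-based bound. This is an illustrative example only, not the paper's method: the function name, the data, and the comparison are all assumptions made here for demonstration; the paper's contribution concerns sharper bounds for structured-sparsity and dictionary-learning norms in which the largest eigenvalue of the covariance replaces cruder quantities such as the full trace.

```python
import numpy as np

def rademacher_complexity_l2_ball(X, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of the
    l2 unit ball {w : ||w||_2 <= 1} on the sample X of shape (n, d).

    For this ball the supremum has a closed form,
        sup_{||w||<=1} (1/n) sum_i sigma_i <w, x_i> = (1/n) ||sum_i sigma_i x_i||_2,
    so we only need to average this norm over random sign vectors sigma.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Rademacher signs: each entry is +1 or -1 with probability 1/2.
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    sums = sigma @ X  # shape (n_draws, d): each row is sum_i sigma_i x_i
    return np.linalg.norm(sums, axis=1).mean() / n

# Toy data (hypothetical, for illustration).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
C = X.T @ X / X.shape[0]  # empirical second-moment (covariance) matrix

est = rademacher_complexity_l2_ball(X)
# Classical bound via Jensen: R_hat <= sqrt(tr(C) / n).
trace_bound = np.sqrt(np.trace(C) / X.shape[0])
```

Here `est` always lies below `trace_bound`, since the bound follows from Jensen's inequality; the theme of the paper is that for the structured norms it considers, the relevant spectral quantity is the top eigenvalue of the covariance rather than its trace.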


Related research

research · 04/12/2013 · Distributed dictionary learning over a sensor network
We consider the problem of distributed dictionary learning, where a set ...

research · 01/01/2012 · Collaborative Filtering via Group-Structured Dictionary Learning
Structured sparse coding and the related structured dictionary learning ...

research · 11/10/2020 · Applications of Online Nonnegative Matrix Factorization to Image and Time-Series Data
Online nonnegative matrix factorization (ONMF) is a matrix factorization...

research · 12/13/2013 · Sample Complexity of Dictionary Learning and other Matrix Factorizations
Many modern tools in machine learning and signal processing, such as spa...

research · 04/16/2018 · Binary Matrix Factorization via Dictionary Learning
Matrix factorization is a key tool in data analysis; its applications in...

research · 09/04/2012 · Sparse coding for multitask and transfer learning
We investigate the use of sparse coding and dictionary learning in the c...

research · 11/07/2014 · A totally unimodular view of structured sparsity
This paper describes a simple framework for structured sparse recovery b...
