A determinantal point process for column subset selection

12/23/2018
by Ayoub Belhadji, et al.

Dimensionality reduction is a first step of many machine learning pipelines. Two popular approaches are principal component analysis, which projects onto a small number of well-chosen but non-interpretable directions, and feature selection, which selects a small number of the original features. Feature selection can be abstracted as a numerical linear algebra problem called the column subset selection problem (CSSP). CSSP corresponds to selecting the best subset of columns of a matrix X ∈ R^{N × d}, where best is often meant in the sense of minimizing the approximation error, i.e., the norm of the residual after projection of X onto the space spanned by the selected columns. Such an optimization over subsets of {1,...,d} is usually impractical. One workaround that has been extensively explored is to resort to polynomial-cost, random subset selection algorithms that favor small values of this approximation error. We propose such a randomized algorithm, based on sampling from a projection determinantal point process (DPP), a repulsive distribution over subsets of {1,...,d} of fixed cardinality k that favors diversity among the selected columns. We give bounds on the ratio of the expected approximation error for this DPP over the optimal error of PCA. These bounds improve over the state-of-the-art bounds of volume sampling when some realistic structural assumptions are satisfied for X. Numerical experiments suggest that our bounds are tight, and that our algorithm performs comparably to the double phase algorithm, often considered to be the practical state of the art. Column subset selection with DPPs thus inherits the best of both worlds: good empirical performance and tight error bounds.
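To make the idea concrete, here is a minimal sketch (not the authors' implementation) of sampling column indices from a projection DPP built from the top-k right singular vectors of X. It uses the standard chain-rule sampler for projection DPP kernels, with Cholesky-style updates of the conditional variances; the function name `sample_projection_dpp` is hypothetical.

```python
import numpy as np

def sample_projection_dpp(X, k, rng=None):
    """Sample k column indices of X from the projection DPP whose
    marginal kernel is K = V_k V_k^T, with V_k the top-k right
    singular vectors of X (a sketch, not the paper's exact code)."""
    rng = np.random.default_rng(rng)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    K = Vt[:k].T @ Vt[:k]          # d x d projection kernel, rank k
    d = K.shape[0]
    diag = np.diag(K).copy()       # conditional inclusion variances
    B = np.zeros((d, 0))           # incremental Cholesky factor rows
    selected = []
    for _ in range(k):
        p = np.clip(diag, 0.0, None)
        i = rng.choice(d, p=p / p.sum())   # chain rule: pick i prop. to variance
        selected.append(i)
        # condition the kernel on having selected column i
        c = (K[:, i] - B @ B[i]) / np.sqrt(diag[i])
        B = np.hstack([B, c[:, None]])
        diag = diag - c**2
        diag[selected] = 0.0       # never reselect (also kills round-off)
    return sorted(selected)
```

Because a projection DPP of rank k has exactly k points almost surely, the loop always returns k distinct indices; projecting X onto the selected columns then gives the residual whose expected norm the paper's bounds compare to the optimal PCA error.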

Related research

Provably Correct Algorithms for Matrix Column Subset Selection with Selectively Sampled Data (05/17/2015)
We consider the problem of matrix column subset selection, which selects...

Low-rank approximation in the Frobenius norm by column and row subset selection (08/16/2019)
A CUR approximation of a matrix A is a particular type of low-rank appro...

oASIS: Adaptive Column Sampling for Kernel Matrix Approximation (05/19/2015)
Kernel matrices (e.g. Gram or similarity matrices) are essential for man...

A Statistical View of Column Subset Selection (07/24/2023)
We consider the problem of selecting a small subset of representative va...

Streaming and Distributed Algorithms for Robust Column Subset Selection (07/16/2021)
We give the first single-pass streaming algorithm for Column Subset Sele...

Asymptotically Sharp Upper Bound for the Column Subset Selection Problem (03/14/2023)
This paper investigates the spectral norm version of the column subset s...

Feature Subset Selection for Software Cost Modelling and Estimation (10/03/2012)
Feature selection has been recently used in the area of software enginee...
