On Projections to Linear Subspaces

by Erik Thordsen et al.

The merit of projecting data onto linear subspaces is well known from, e.g., dimension reduction. One key aspect of subspace projections, the maximum preservation of variance (principal component analysis), has been thoroughly researched, and the effect of random linear projections on measures such as intrinsic dimensionality remains an active area of research. In this paper, we investigate the less explored case of linear projections onto explicit subspaces of varying dimensionality and the expected variance that results. The result is a new family of bounds for Euclidean distances and inner products. We demonstrate the quality of these bounds and investigate their close relation to intrinsic dimensionality estimation.
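To make the variance-preservation contrast concrete, here is a minimal NumPy sketch (not taken from the paper, and not its proposed bounds): it projects centered data onto a k-dimensional PCA subspace and onto a random k-dimensional subspace, then compares the fraction of total variance each projection preserves.

```python
import numpy as np

# Illustrative sketch only: compare variance preserved by a PCA subspace
# versus a random linear subspace of the same dimensionality k.
rng = np.random.default_rng(0)

# Synthetic data with decaying per-axis scale, then centered.
X = rng.normal(size=(500, 10)) @ np.diag(np.linspace(3.0, 0.3, 10))
X -= X.mean(axis=0)

k = 3

# PCA subspace: the top-k right singular vectors maximize preserved variance.
_, s, Vt = np.linalg.svd(X, full_matrices=False)
pca_ratio = (s[:k] ** 2).sum() / (s ** 2).sum()

# Random subspace: an orthonormal basis of an arbitrary k-dim subspace.
Q, _ = np.linalg.qr(rng.normal(size=(10, k)))
rand_ratio = (X @ Q).var(axis=0).sum() / X.var(axis=0).sum()

print(f"PCA subspace preserves    {pca_ratio:.3f} of the variance")
print(f"random subspace preserves {rand_ratio:.3f} of the variance")
```

The PCA ratio is an upper bound over all k-dimensional subspaces, while the random ratio hovers around k/d in expectation; the gap between the two is exactly the kind of subspace-dependent variance behavior the abstract refers to.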




