Schrödinger PCA: You Only Need Variances for Eigenmodes

06/08/2020
by Ziming Liu, et al.

Principal component analysis (PCA) has achieved great success in unsupervised learning by identifying covariance correlations among features. If the data collection fails to capture the covariance information, PCA cannot discover meaningful modes. In particular, PCA fails for the spatial Gaussian process (GP) model in the undersampling regime, i.e., when the average distance between neighboring anchor points (spatial features) is greater than the correlation length of the GP. Counterintuitively, by drawing a connection between PCA and the Schrödinger equation, we can not only attack the undersampling challenge but also compute eigenmodes in an efficient and decoupled way with the proposed algorithm, called Schrödinger PCA. Our algorithm requires only the variances of the features and an estimated correlation length as input; it constructs the corresponding Schrödinger equation and solves it to obtain the energy eigenstates, which coincide with the principal components. We also establish the connection of our algorithm to model reduction techniques in the partial differential equation (PDE) community, where the steady-state Schrödinger operator is identified as a second-order approximation to the covariance function. Numerical experiments verify the validity and efficiency of the proposed algorithm and demonstrate its potential for unsupervised learning tasks on general graphs and manifolds.
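The abstract describes an eigenvalue problem built from per-feature variances and a correlation length rather than a full covariance matrix. The sketch below is a minimal illustration of that idea on a hypothetical 1D grid of anchor points, not the paper's exact construction: the grid, the variance profile `sigma2`, the scaling of the kinetic term by the correlation length `ell`, and the placeholder potential `V ∝ 1/σ²(x)` are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical 1D grid of anchor points with known per-point variances.
n = 200                                            # number of spatial features (anchor points)
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
sigma2 = 1.0 + 0.5 * np.sin(2 * np.pi * x) ** 2    # toy feature variances (assumed)
ell = 0.05                                         # estimated GP correlation length (assumed)

# Discrete Laplacian via second-order finite differences (Dirichlet boundaries).
lap = (np.diag(-2.0 * np.ones(n)) +
       np.diag(np.ones(n - 1), 1) +
       np.diag(np.ones(n - 1), -1)) / dx ** 2

# Placeholder potential derived from the variances: larger where the variance
# is small, so low-energy eigenstates concentrate on high-variance regions.
# The paper's precise potential may differ; this is only illustrative.
V = np.diag(1.0 / sigma2)

# Steady-state Schrödinger-like operator: kinetic term scaled by the
# correlation length plus the variance-derived potential.
H = -0.5 * ell ** 2 * lap + V

# Eigendecomposition: the lowest-energy eigenstates play the role of the
# leading principal components (eigenmodes).
energies, states = np.linalg.eigh(H)
principal_modes = states[:, :5]                    # first few eigenmodes
```

Note that the operator `H` is sparse and local, so only variances and the correlation length enter its construction; no pairwise covariance estimates are needed, which is the computational advantage the abstract highlights.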
