Learning Polynomials of Few Relevant Dimensions

04/28/2020
by Sitan Chen, et al.

Polynomial regression is a basic primitive in learning and statistics. In its most basic form, the goal is to fit a degree-d polynomial to a response variable y in terms of an n-dimensional input vector x. This is extremely well studied, with many applications, and has sample and runtime complexity Θ(n^d). Can one achieve better runtime if the intrinsic dimension of the data is much smaller than the ambient dimension n? Concretely, we are given samples (x, y) where y is a polynomial of degree at most d in an unknown r-dimensional projection (the relevant dimensions) of x. This can be seen both as a generalization of phase retrieval and as a special case of learning multi-index models where the link function is an unknown low-degree polynomial. Note that without distributional assumptions, this is at least as hard as junta learning. In this work we consider the important case where the covariates are Gaussian. We give an algorithm that learns the polynomial to accuracy ϵ with sample complexity roughly N = O_{r,d}(n log^2(1/ϵ) (log n)^d) and runtime O_{r,d}(N n^2). Prior to our work, no such results were known even for the case r = 1. We introduce a new filtered PCA approach to obtain a warm start for the true subspace, and use geodesic SGD to boost to arbitrary accuracy; our techniques may be of independent interest, especially for problems involving subspace recovery or analyzing SGD on manifolds.
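The filtered-PCA warm start can be illustrated on synthetic data. The sketch below is a toy interpretation rather than the paper's actual algorithm: it assumes the filter simply keeps samples with an atypically large response |y|, and that PCA on the second moment of the surviving covariates recovers the relevant subspace. The polynomial used here (a sum of squares of the relevant coordinates, a multi-index analogue of phase retrieval) and all constants are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, N = 50, 2, 40000            # ambient dim, relevant dim, sample count

# Hidden r-dimensional relevant subspace (orthonormal columns).
U, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Gaussian covariates; the response depends on x only through the
# r relevant coordinates z = U^T x, via a degree-2 polynomial.
X = rng.standard_normal((N, n))
Z = X @ U
y = Z[:, 0] ** 2 + Z[:, 1] ** 2

# Filtered PCA (toy version): keep samples whose response is atypically
# large -- their covariates are biased toward the relevant subspace --
# then take the top-r eigenvectors of the empirical second moment.
mask = np.abs(y) > np.quantile(np.abs(y), 0.9)
M = X[mask].T @ X[mask] / mask.sum()
_, eigvecs = np.linalg.eigh(M)    # eigenvalues in ascending order
U_hat = eigvecs[:, -r:]           # warm-start estimate of span(U)

# Distance between the estimated and true subspaces, measured by the
# spectral norm of the difference of projection matrices (0 = exact).
err = np.linalg.norm(U_hat @ U_hat.T - U @ U.T, ord=2)
print(f"subspace error: {err:.3f}")
```

In this toy setting, conditioning on a large response inflates the second moment only along the relevant directions, so the top-r eigenvectors give a coarse estimate of the subspace; the paper's geodesic SGD phase would then refine such a warm start to arbitrary accuracy.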

