Regress Consistently when Oblivious Outliers Overwhelm
We give a novel analysis of the Huber loss estimator for consistent robust linear regression, proving that it simultaneously achieves an optimal dependence on the fraction of outliers and on the dimension. We consider a linear regression model with an oblivious adversary, who may corrupt the observations in an arbitrary way but without knowing the data. (This adversary model also captures heavy-tailed noise distributions.) Given observations y_1, …, y_n of which an α fraction are uncorrupted, we obtain error guarantees Õ(√(d/(α²·n))), optimal up to logarithmic factors. Our algorithm works with a nearly optimal fraction of inliers α ≥ Õ(√(d/n)) and under mild restricted isometry property (RIP) assumptions on the (transposed) design matrix. Prior to this work, even in the simple case of spherical Gaussian design, no estimator was known to achieve vanishing error guarantees in the high-dimensional setting d ≳ √n whenever the fraction of uncorrupted observations is smaller than 1/log n. Our analysis of the Huber loss estimator exploits only the first-order optimality conditions. Furthermore, in the special case of Gaussian design X ∼ N(0,1)^{n×d}, we show that a strikingly simple algorithm based on computing coordinate-wise medians achieves similar guarantees in linear time. The algorithm also extends to the setting where the parameter vector β^* is sparse.
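As a concrete illustration of the estimator being analyzed, here is a minimal NumPy sketch of Huber loss regression fit by plain gradient descent: the objective is convex and smooth, so a first-order method reaches a global minimizer, matching the abstract's emphasis on first-order optimality conditions. The transition width `delta`, the step size, the iteration count, and the demo's corruption model are illustrative choices of ours, not the tuning or adversary analyzed in the paper.

```python
import numpy as np

def huber_grad(residuals, delta):
    # Derivative of the Huber penalty: identity on [-delta, delta],
    # constant +/- delta outside (the "robust" linear tails).
    return np.clip(residuals, -delta, delta)

def huber_regression(X, y, delta=1.0, n_iters=500):
    """Minimize sum_i H_delta(y_i - <x_i, beta>) by gradient descent."""
    n, d = X.shape
    beta = np.zeros(d)
    # Step size 1/L, where L = largest eigenvalue of X^T X upper-bounds
    # the Huber objective's curvature (its Hessian is X^T D X, D in {0,1}).
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iters):
        residuals = y - X @ beta
        grad = -X.T @ huber_grad(residuals, delta)
        beta -= grad / L
    return beta

# Demo: an oblivious adversary corrupts a (1 - alpha) fraction of y,
# chosen without looking at X.
rng = np.random.default_rng(0)
n, d, alpha = 5000, 25, 0.2
X = rng.standard_normal((n, d))
beta_star = rng.standard_normal(d)
y = X @ beta_star + 0.1 * rng.standard_normal(n)
corrupted = rng.random(n) > alpha
y[corrupted] += 100.0 * rng.standard_normal(corrupted.sum())
print(np.linalg.norm(huber_regression(X, y) - beta_star))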
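For the Gaussian-design case, the sketch below shows one natural coordinate-wise-median scheme in the spirit the abstract describes; it is our illustrative instantiation, not necessarily the exact update rule from the paper. The idea: for spherical Gaussian design, the ratio of the j-th partial residual to X_ij equals β^*_j plus noise that is symmetric around zero, because dividing by the symmetric, independent X_ij symmetrizes both the cross-terms from other coordinates and the oblivious corruptions, so the median of these ratios concentrates around β^*_j.

```python
import numpy as np

def coordinate_median_regression(X, y, n_rounds=10):
    """Iteratively refine each coordinate by a median of residual ratios.

    Assumes X has i.i.d. N(0,1) entries, as in the abstract's special case.
    """
    n, d = X.shape
    beta = np.zeros(d)
    residuals = y - X @ beta  # maintained as y - X beta throughout
    for _ in range(n_rounds):
        for j in range(d):
            # Partial residuals with coordinate j's contribution added back:
            # r_j[i] = X[i, j] * beta*_j + (symmetric noise), so the median
            # of r_j / X[:, j] is a robust estimate of beta*_j.
            r_j = residuals + X[:, j] * beta[j]
            new_bj = np.median(r_j / X[:, j])
            residuals -= X[:, j] * (new_bj - beta[j])
            beta[j] = new_bj
    return beta
```

Each round costs O(n·d) time, in line with the linear-time claim. If β^* were known to be sparse, one could plausibly threshold small coordinates after each round; that extension is only gestured at here, not spelled out.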