New algorithms for computing the least trimmed squares estimator

03/19/2022
by Yijun Zuo, et al.

Instead of minimizing the sum of all n squared residuals as the classical least squares (LS) does, Rousseeuw (1984) proposed to minimize the sum of the h (n/2 ≤ h < n) smallest squared residuals; the resulting estimator is called the least trimmed squares (LTS) estimator. The idea behind the LTS is simple, but its computation is challenging since no LS-type analytical formula exists. Attempts have been made since its introduction: the feasible solution algorithm (Hawkins (1994)), fastlts.f (Rousseeuw and Van Driessen (1999)), and FAST-LTS (Rousseeuw and Van Driessen (2006)), among others, are promising approximate algorithms. The latter two have been incorporated into the R function ltsReg by Valentin Todorov. These algorithms rely on combinatorial or subsampling approaches. With broad software accessibility and fast speed, the LTS, which enjoys many desirable properties, has become one of the most popular robust regression estimators across multiple disciplines. This article proposes analytic approaches that employ the first-order derivative (gradient) and the second-order derivative (Hessian matrix) of the objective function. Our approximate algorithms for the LTS are vetted on synthetic and real data examples. Compared with ltsReg, the benchmark in robust regression and well known for its speed, our algorithms are comparable (and sometimes even favorable) with respect to both speed and accuracy. Other major contributions include (i) establishing the uniqueness of the estimator and its strong and Fisher consistency in the empirical and population settings, respectively; (ii) deriving its influence function in a general setting; (iii) re-establishing the asymptotic normality (and consequently root-n consistency) of the estimator with a neat and general approach.
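To make the objective concrete, below is a minimal Python/NumPy sketch, not the paper's proposed algorithm: it evaluates the LTS criterion (the sum of the h smallest squared residuals) and iterates a Newton-type step restricted to the currently selected h-subset. On that fixed subset the objective is an ordinary LS criterion, so the gradient is -2·Xₕᵀrₕ, the Hessian is 2·XₕᵀXₕ, and a full Newton step coincides with the LS fit on the subset (a concentration, or C-, step). The function names and the toy data are illustrative assumptions.

```python
import numpy as np

def lts_objective(X, y, beta, h):
    # LTS criterion: sum of the h smallest squared residuals at beta.
    r2 = (y - X @ beta) ** 2
    return np.sort(r2)[:h].sum()

def newton_step_on_active_set(X, y, beta, h):
    # On the h-subset with the smallest residuals the objective is a plain
    # LS criterion (gradient -2*Xh'rh, Hessian 2*Xh'Xh), so a full Newton
    # step is simply the LS fit on that subset (a concentration step).
    r2 = (y - X @ beta) ** 2
    idx = np.argsort(r2)[:h]
    beta_new, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    return beta_new

# Toy usage with 20% gross outliers (illustrative, not from the paper).
rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[:20] += 10.0                     # contaminate the first 20 responses
h = (n + p + 1) // 2               # a common default coverage level

beta = np.zeros(p)
for _ in range(50):                # iterate to a local fixed point
    beta = newton_step_on_active_set(X, y, beta, h)
print(beta, lts_objective(X, y, beta, h))
```

Each such step never increases the objective, so the iteration converges to a local minimum; global strategies (e.g., many subsample starts, as in FAST-LTS, or the analytic gradient/Hessian schemes the paper proposes) are needed to approximate the global solution.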
