# New algorithms for computing the least trimmed squares estimator

Instead of minimizing the sum of all n squared residuals, as classical least squares (LS) does, Rousseeuw (1984) proposed minimizing the sum of the h (n/2 ≤ h < n) smallest squared residuals; the resulting estimator is called least trimmed squares (LTS). The idea behind LTS is simple, but its computation is challenging since no LS-type closed-form solution exists. Attempts have been made since its introduction: the feasible solution algorithm (Hawkins (1994)), fastlts.f (Rousseeuw and Van Driessen (1999)), and FAST-LTS (Rousseeuw and Van Driessen (2006)), among others, are promising approximate algorithms. The latter two have been incorporated into the R function ltsReg by Valentin Todorov. These algorithms rely on combinatorial or subsampling approaches. With its wide software availability and fast speed, the LTS, which enjoys many desirable properties, has become one of the most popular robust regression estimators across multiple disciplines. This article proposes analytic approaches that employ the first-order derivative (gradient) and the second-order derivative (Hessian matrix) of the objective function. Our approximate algorithms for the LTS are vetted on synthetic and real data examples. Compared with ltsReg, the benchmark in robust regression and well known for its speed, our algorithms are comparable (and sometimes even favorable) with respect to both speed and accuracy. Other major contributions include (i) establishing the uniqueness of the estimator and its strong and Fisher consistency in the empirical and population settings, respectively; (ii) deriving its influence function in a general setting; and (iii) re-establishing the asymptotic normality (and consequently the root-n consistency) of the estimator with a neat and general approach.

research · 02/21/2022
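To make the trimming idea concrete, here is a minimal sketch of the LTS criterion itself: square the residuals, sort them, and sum only the h smallest. This is an illustration of the objective function being minimized, not the paper's gradient/Hessian algorithm; the data, the function name `lts_objective`, and the choice h = n − 1 are all assumptions made for the example.

```python
import numpy as np

def lts_objective(X, y, beta, h):
    """Sum of the h smallest squared residuals for coefficients beta.

    Illustrative sketch of the LTS criterion (n/2 <= h < n): residuals
    are squared, sorted, and only the smallest h are summed, so up to
    n - h gross outliers are trimmed away.
    """
    r2 = (y - X @ beta) ** 2       # squared residuals
    return np.sort(r2)[:h].sum()   # keep only the h smallest

# Toy data: points on the line y = 2x plus one gross outlier.
X = np.column_stack([np.ones(5), np.arange(5.0)])  # intercept + slope design
y = np.array([0.0, 2.0, 4.0, 6.0, 100.0])          # last point is an outlier
beta = np.array([0.0, 2.0])                        # the fit through the clean points

# With h = 4 the outlier's huge residual is trimmed, so the criterion is 0;
# the untrimmed sum (h = n) is dominated by that single point.
print(lts_objective(X, y, beta, h=4))
print(lts_objective(X, y, beta, h=5))
```

Note that the classical LS objective is the special case h = n, which is exactly why a single sufficiently bad point can ruin the LS fit but not the LTS one.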

### Least sum of squares of trimmed residuals regression

In the famous least sum of trimmed squares (LTS) of residuals estimator ...
research · 10/12/2022

### Asymptotics for the least trimmed squares estimator

Novel properties of the objective function in both empirical and populat...
research · 04/01/2022

### Asymptotic normality of the least sum of squares of trimmed residuals estimator

To enhance the robustness of the classic least sum of squares (LS) of th...
research · 12/11/2018

### Distribution-free properties of isotonic regression

It is well known that the isotonic least squares estimator is characteri...
research · 07/09/2020

### Robust Geodesic Regression

This paper studies robust regression for data on Riemannian manifolds. G...
research · 10/25/2018

### Nuclear Norm Regularized Estimation of Panel Regression Models

In this paper we investigate panel regression models with interactive fi...
research · 03/29/2018

### A Formula for Type III Sums of Squares

Type III methods were introduced by SAS to address difficulties in dummy...