Minimum discrepancy principle strategy for choosing k in k-NN regression

08/20/2020
by Yaroslav Averyanov, et al.

This paper presents a novel data-driven strategy for choosing the hyperparameter k in the k-NN regression estimator. We treat the choice of hyperparameter as an iterative procedure (over k) and propose a strategy, easily implemented in practice, based on the idea of early stopping and the minimum discrepancy principle. This estimation strategy is proven to be minimax-optimal, under a fixed-design assumption on the covariates, over several smoothness function classes, for instance the class of Lipschitz functions on a bounded domain. In simulations on artificial and real-world data sets, the novel strategy performs consistently well in comparison with other model selection strategies such as the hold-out method.
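To make the idea concrete, here is a minimal sketch of a discrepancy-principle stopping rule for k-NN regression. It assumes the noise variance sigma2 is known (or pre-estimated), sweeps k from the most-smoothed fit (k = n) toward the least-smoothed (k = 1), and stops at the first k whose training residual falls to the noise level. The function name, the use of scikit-learn's KNeighborsRegressor, and the exact scan order are illustrative assumptions; the paper's estimator and risk definitions may differ in detail.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def discrepancy_stopping_k(X, y, sigma2):
    """Choose k via a discrepancy-principle stopping rule (sketch).

    Scans k from n (maximal smoothing) down to 1 and returns the first k
    whose empirical risk drops to the assumed noise level sigma2.
    """
    n = len(y)
    for k in range(n, 0, -1):
        fit = KNeighborsRegressor(n_neighbors=k).fit(X, y)
        # Empirical risk = mean squared training residual; it shrinks
        # as k decreases and the fit tracks the data more closely.
        empirical_risk = np.mean((y - fit.predict(X)) ** 2)
        if empirical_risk <= sigma2:
            # Residuals have reached the noise level: stop early here.
            return k
    return 1
```

In practice sigma2 would itself have to be estimated, for example from residuals of a deliberately under-smoothed fit; any such estimator is outside the scope of this sketch.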
