Consistency of random forests

05/12/2014
by Erwan Scornet et al.

Random forests are a learning algorithm proposed by Breiman [Mach. Learn. 45 (2001) 5--32] that combines several randomized decision trees and aggregates their predictions by averaging. Despite its wide usage and outstanding practical performance, little is known about the mathematical properties of the procedure. This gap between theory and practice stems from the difficulty of simultaneously analyzing both the randomization process and the highly data-dependent tree structure. In the present paper, we take a step forward in forest exploration by proving a consistency result for Breiman's [Mach. Learn. 45 (2001) 5--32] original algorithm in the context of additive regression models. Our analysis also sheds an interesting light on how random forests can nicely adapt to sparsity.

1. Introduction. Random forests are an ensemble learning method for classification and regression that constructs a number of randomized decision trees during the training phase and predicts by averaging their results. Since its publication in the seminal paper of Breiman (2001), the procedure has become a major data analysis tool that performs well in practice in comparison with many standard methods. What has greatly contributed to the popularity of forests is the fact that they can be applied to a wide range of prediction problems and have few parameters to tune. Aside from being simple to use, the method is generally recognized for its accuracy and its ability to deal with small sample sizes, high-dimensional feature spaces and complex data structures. The random forest methodology has been successfully applied to many practical problems, including air quality prediction (winning code of the EMC data science global hackathon in 2012, see http://www.kaggle.com/c/dsg-hackathon), chemoinformatics [Svetnik et al. (2003)], ecology [Prasad, Iverson and Liaw (2006), Cutler et al. (2007)], 3D
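The two ingredients described above, randomized trees and aggregation by averaging, can be illustrated with a short sketch. The snippet below is a minimal example, not the paper's construction: it uses scikit-learn's RandomForestRegressor on synthetic data drawn from an additive regression model (the setting in which the consistency result is proved), and checks that the forest prediction is exactly the average of the individual tree predictions. The data-generating function and all parameter choices are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic additive regression model: y = sum_j m_j(x_j) + noise.
# The component functions below are arbitrary illustrative choices.
n, d = 500, 4
X = rng.uniform(size=(n, d))
y = (np.sin(np.pi * X[:, 0]) + 2.0 * X[:, 1] ** 2 + X[:, 2]
     + rng.normal(scale=0.1, size=n))

# Breiman-style forest: each tree is grown on a bootstrap sample,
# with a random subset of features considered at each split.
forest = RandomForestRegressor(n_estimators=100, max_features="sqrt",
                               random_state=0)
forest.fit(X, y)

# The forest predicts by averaging the randomized trees' predictions.
x_new = rng.uniform(size=(1, d))
tree_preds = np.array([t.predict(x_new)[0] for t in forest.estimators_])
print(float(tree_preds.mean()), float(forest.predict(x_new)[0]))
```

The two printed values coincide, confirming that the ensemble output is the plain average over trees; the randomization enters only through the bootstrap resampling and the per-split feature subsampling.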


Related research:

- Consistency of Online Random Forests (02/20/2013): As a testament to their success, the theory of random forests has long b...
- Universal Consistency of Decision Trees in High Dimensions (04/28/2021): This paper shows that decision trees constructed with Classification and...
- Neural Random Forests (04/25/2016): Given an ensemble of randomized regression trees, it is possible to rest...
- A Random Forest Guided Tour (11/18/2015): The random forest algorithm, proposed by L. Breiman in 2001, has been ex...
- Robust Similarity and Distance Learning via Decision Forests (07/27/2020): Canonical distances such as Euclidean distance often fail to capture the...
- Interpretable Random Forests via Rule Extraction (04/29/2020): We introduce SIRUS (Stable and Interpretable RUle Set) for regression, a...
- Universal consistency and minimax rates for online Mondrian Forests (11/08/2017): We establish the consistency of an algorithm of Mondrian Forests, a rand...
