Hyperparameter Importance for Machine Learning Algorithms

by Honghe Jin, et al.
Wells Fargo

Hyperparameters play an essential role in fitting supervised machine learning algorithms. However, tuning all tunable hyperparameters simultaneously is computationally expensive, especially on large data sets. In this paper, we give a definition of hyperparameter importance that can be estimated by subsampling procedures. Guided by this importance, hyperparameters can then be tuned on the entire data set more efficiently. We show theoretically that, under weak conditions, the importance estimated on subsets of the data is consistent with the importance on the population. Numerical experiments show that the proposed importance measure is stable across subsamples and can save substantial computational resources.
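The abstract's core idea, estimating hyperparameter importance on subsamples and checking it against the full-data estimate, can be illustrated with a toy sketch. The paper's exact estimator is not reproduced here; the sketch below uses a simple fANOVA-style main-effect decomposition (variance of each hyperparameter's marginal mean score, normalized to sum to one) on a synthetic ridge-regression task. All names and grid values (`ridge_mse`, `importance`, the choices of `alpha` and `n_features`) are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data in which all 10 features carry signal,
# so the number of features used matters much more than the ridge penalty.
n, p = 2000, 10
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)
y = X @ beta + 0.1 * rng.normal(size=n)

def ridge_mse(X, y, alpha, n_features):
    """Validation MSE of a closed-form ridge fit on the first `n_features` columns."""
    m = len(y) // 2
    Xtr, Xva = X[:m, :n_features], X[m:, :n_features]
    ytr, yva = y[:m], y[m:]
    w = np.linalg.solve(Xtr.T @ Xtr + alpha * np.eye(n_features), Xtr.T @ ytr)
    return float(np.mean((Xva @ w - yva) ** 2))

# Hypothetical hyperparameter grid for the sketch.
GRID = {"alpha": [0.01, 1.0, 100.0], "n_features": [2, 5, 10]}

def importance(X, y):
    """fANOVA-style main-effect importance: variance of each hyperparameter's
    marginal mean validation score, normalized so the importances sum to 1."""
    scores = {(a, k): ridge_mse(X, y, a, k)
              for a in GRID["alpha"] for k in GRID["n_features"]}
    var = {
        "alpha": np.var([np.mean([scores[(a, k)] for k in GRID["n_features"]])
                         for a in GRID["alpha"]]),
        "n_features": np.var([np.mean([scores[(a, k)] for a in GRID["alpha"]])
                              for k in GRID["n_features"]]),
    }
    total = sum(var.values())
    return {h: v / total for h, v in var.items()}

# Importance estimated on a 25% subsample vs. on the full data: the subsample
# ranking should agree with the full-data ranking, mirroring the consistency claim.
idx = rng.choice(n, size=n // 4, replace=False)
imp_sub = importance(X[idx], y[idx])
imp_full = importance(X, y)
```

On this toy task, both the subsample and the full-data estimates attribute most of the score variance to `n_features`, so a tuning budget informed by the cheap subsample estimate would concentrate on the same hyperparameter as a full-data analysis.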


Related Research

- Importance of Tuning Hyperparameters of Machine Learning Algorithms
- Hyperparameter Search in Machine Learning
- Automatic Exploration of Machine Learning Experiments on OpenML
- Experimental Investigation and Evaluation of Model-based Hyperparameter Optimization
- PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces
- Tuning structure learning algorithms with out-of-sample and resampling strategies
- A quasi-Monte Carlo data compression algorithm for machine learning
