Tuning support vector machines and boosted trees using optimization algorithms

by Jill F. Lundell, et al.

Statistical learning methods have been growing in popularity in recent years. Many of these procedures have parameters that must be tuned for models to perform well. Tuning has been researched extensively for neural networks, but far less so for many other learning methods. We examined the behavior of tuning parameters for support vector machines, gradient boosting machines, and AdaBoost in both classification and regression settings. We used grid search to identify ranges of tuning parameters where good models can be found across many different datasets. We then explored different optimization algorithms for selecting a model across the tuning parameter space. Models selected by each optimization algorithm were compared to the best models obtained through grid search in order to identify well-performing algorithms. This information was used to create an R package, EZtune, that automatically tunes support vector machines and boosted trees.
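The grid-search step described above can be sketched with a short example. This is not the authors' code (EZtune is an R package); it is an illustrative Python analogue using scikit-learn, where a grid over an SVM's cost (C) and kernel-width (gamma) parameters is scored by cross-validation, mirroring how regions of the tuning space with good models can be identified.

```python
# Illustrative sketch, not the EZtune implementation: grid search over
# SVM tuning parameters, with cross-validation scoring each candidate.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic classification data standing in for the paper's benchmark datasets.
X, y = make_classification(n_samples=200, n_features=10, random_state=1)

# Candidate values for cost (C) and kernel width (gamma); the grid values
# here are arbitrary choices for illustration.
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_)  # grid point with the best cross-validated accuracy
```

An optimization algorithm, as explored in the paper, would replace the exhaustive grid with a search procedure (e.g., a derivative-free optimizer) over the same parameter space, evaluating far fewer candidate models.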




