Tuning support vector machines and boosted trees using optimization algorithms

03/13/2023
by Jill F. Lundell et al.

Statistical learning methods have been growing in popularity in recent years. Many of these procedures have tuning parameters that must be selected for models to perform well. Tuning research has been extensive for neural networks, but not for many other learning methods. We examined the behavior of tuning parameters for support vector machines, gradient boosting machines, and AdaBoost in both classification and regression settings. We used grid search to identify ranges of tuning parameters where good models can be found across many different datasets. We then explored several optimization algorithms for selecting a model across the tuning parameter space. Models selected by each optimization algorithm were compared to the best models obtained through grid search in order to identify well-performing algorithms. This information was used to create an R package, EZtune, that automatically tunes support vector machines and boosted trees.
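As an illustration of the workflow described above, the sketch below shows how a model might be tuned automatically with EZtune. It assumes the package's eztune() interface with x, y, method, and optimizer arguments and the returned parameter names; these details are assumptions based on the package documentation rather than text from this abstract, and the example dataset comes from the mlbench package.

# Minimal sketch of automated tuning with EZtune (interface details assumed).
# install.packages("EZtune")   # uncomment if the package is not installed
library(EZtune)
library(mlbench)

# Binary classification example: Pima Indians Diabetes data
data(PimaIndiansDiabetes)
x <- PimaIndiansDiabetes[, -9]   # predictors
y <- PimaIndiansDiabetes[, 9]    # response (pos/neg)

# Tune a support vector machine using a Hooke-Jeeves search over the
# tuning parameter region identified by the grid-search study
svm_fit <- eztune(x, y, method = "svm", optimizer = "hjn")
svm_fit$cost    # selected cost parameter
svm_fit$gamma   # selected kernel parameter

# Tune a gradient boosting machine with a genetic algorithm instead
gbm_fit <- eztune(x, y, method = "gbm", optimizer = "ga")

Swapping the method and optimizer arguments is the only change needed to move between the learning methods and search algorithms compared in the paper.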

