An l_1-oracle inequality for the Lasso in mixture-of-experts regression models

09/22/2020 ∙ by TrungTin Nguyen, et al.

Mixture-of-experts (MoE) models are a popular framework for modeling heterogeneity in data, for both regression and classification problems in statistics and machine learning, due to their flexibility and the abundance of statistical estimation and model choice tools. Such flexibility comes from allowing the mixture weights (or gating functions) in the MoE model to depend on the explanatory variables, along with the experts (or component densities). This permits the modeling of data arising from more complex data generating processes than the classical finite mixture and finite mixture of regression models, whose mixing parameters are independent of the covariates. The use of MoE models in a high-dimensional setting, when the number of explanatory variables can be much larger than the sample size (i.e., p ≫ n), is challenging from a computational point of view and, in particular, from a theoretical point of view, where the literature still lacks results on handling the curse of dimensionality, in both statistical estimation and feature selection. We consider the finite mixture-of-experts model with soft-max gating functions and Gaussian experts for high-dimensional regression on heterogeneous data, and its l_1-regularized estimation via the Lasso. We focus on the Lasso estimation properties rather than its feature selection properties. We provide a lower bound on the regularization parameter of the Lasso penalty that ensures an l_1-oracle inequality is satisfied by the Lasso estimator with respect to the Kullback-Leibler loss.
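
For concreteness, a minimal sketch of the model and estimator the abstract refers to, written in standard (assumed) notation rather than the paper's own. The conditional density is a K-component mixture of Gaussian regression experts with soft-max gating, where φ(·; m, σ²) denotes the Gaussian density with mean m and variance σ²:

s_\psi(y \mid x) \;=\; \sum_{k=1}^{K} g_k(x; w)\, \phi\!\big(y;\, \beta_{k,0} + x^\top \beta_k,\ \sigma_k^2\big),
\qquad
g_k(x; w) \;=\; \frac{\exp\!\big(w_{k,0} + x^\top w_k\big)}{\sum_{l=1}^{K} \exp\!\big(w_{l,0} + x^\top w_l\big)}.

The Lasso estimator penalizes the negative conditional log-likelihood by the l_1 norm of the gating and expert regression coefficients, with regularization parameter λ:

\widehat{s}^{\,\mathrm{Lasso}}(\lambda) \;\in\; \operatorname*{arg\,min}_{\psi}
\left\{ -\frac{1}{n} \sum_{i=1}^{n} \log s_\psi(y_i \mid x_i)
\;+\; \lambda \sum_{k=1}^{K} \big( \lVert w_k \rVert_1 + \lVert \beta_k \rVert_1 \big) \right\}.

An l_1-oracle inequality of the kind announced in the abstract bounds the Kullback-Leibler risk of \widehat{s}^{\,\mathrm{Lasso}}(\lambda) by the best l_1-penalized approximation error achievable within the model class, up to a remainder term, for any λ above the lower bound established in the paper.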
