Simultaneous Factors Selection and Fusion of Their Levels in Penalized Logistic Regression

12/20/2022
by   Lea Kaufmann, et al.

Many data analysis problems nowadays call for complexity reduction, that is, for removing non-influential covariates from the model and delivering a sparse model. When categorical covariates are present, with their levels being dummy coded, the number of parameters in the model grows rapidly, which emphasizes the need to reduce the number of parameters to be estimated. In this case, beyond variable selection, sparsity is also achieved by fusing levels of covariates that do not differ significantly in their influence on the response variable. In this work a new regularization technique, called L_0-Fused Group Lasso (L_0-FGL), is introduced for binary logistic regression. It uses a group lasso penalty for factor selection and, for the fusion part, applies an L_0 penalty on the differences among the level parameters of a categorical predictor. Using adaptive weights, the adaptive version of the L_0-FGL method is derived. Theoretical properties, such as existence, √(n) consistency, and oracle properties under certain conditions, are established. In addition, it is shown that even in the diverging case, where the number of parameters p_n grows with the sample size n, √(n) consistency and a consistency result for variable selection are achieved. Two computational methods, PIRLS and a block coordinate descent (BCD) approach using quasi-Newton updates, are developed and implemented. A simulation study supports that L_0-FGL shows outstanding performance, especially in the high-dimensional case.
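To make the penalty structure concrete, the following is a minimal sketch of evaluating an L_0-FGL-style penalty for dummy-coded categorical predictors. It assumes the group lasso term is the Euclidean norm of each factor's coefficient block and the L_0 term counts nonzero pairwise differences among that factor's level parameters; the function name, tuning-parameter names, and the pairwise (rather than, e.g., successive) differencing are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def l0_fgl_penalty(beta_groups, lam_group, lam_fuse, tol=1e-12):
    """Sketch of an L_0-Fused Group Lasso penalty value.

    beta_groups : list of coefficient vectors, one per categorical factor
    lam_group   : weight of the group lasso term (factor selection)
    lam_fuse    : weight of the L_0 fusion term (level fusion)

    Hypothetical helper for illustration only; the paper's exact
    penalty and weighting may differ.
    """
    penalty = 0.0
    for beta in beta_groups:
        beta = np.asarray(beta, dtype=float)
        # Group lasso part: Euclidean norm of the whole block,
        # so an entire factor can be removed from the model at once.
        penalty += lam_group * np.linalg.norm(beta)
        # L_0 fusion part: count distinct pairwise level differences;
        # levels with (numerically) equal parameters incur no cost,
        # which encourages fusing them.
        k = len(beta)
        n_distinct = sum(
            abs(beta[i] - beta[j]) > tol
            for i in range(k) for j in range(i + 1, k)
        )
        penalty += lam_fuse * n_distinct
    return penalty

# Example: two of three levels share a parameter, so only the
# differences involving the third level count toward the L_0 term.
print(l0_fgl_penalty([[1.0, 1.0, 0.0]], lam_group=0.0, lam_fuse=1.0))  # → 2.0
```

The nonconvex L_0 count is what motivates the specialized solvers mentioned above (PIRLS and the BCD/quasi-Newton scheme); a plain gradient method cannot handle it directly.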
