
Adiabatic Quantum Feature Selection for Sparse Linear Regression

06/04/2021
by Surya Sai Teja Desu, et al.

Linear regression is a popular machine learning approach for learning and predicting real-valued outputs (dependent variables) from independent variables, or features. In many real-world problems, it is beneficial to perform sparse linear regression to identify the features that matter most for predicting the dependent variable. This not only yields interpretable results but also avoids overfitting when the number of features is large and the amount of data is small. The most natural way to achieve this is 'best subset selection', which penalizes non-zero model parameters by adding an ℓ_0 norm over the parameters to the least-squares loss. However, this makes the objective function non-convex and intractable even for a small number of features. This paper aims to address the intractability of sparse linear regression with the ℓ_0 norm using adiabatic quantum computing, a quantum computing paradigm that is particularly well suited to solving optimization problems. We formulate the ℓ_0 optimization problem as a Quadratic Unconstrained Binary Optimization (QUBO) problem and solve it on the D-Wave adiabatic quantum computer. We study and compare the quality of the QUBO solution on synthetic and real-world datasets. The results demonstrate the effectiveness of the proposed adiabatic quantum computing approach in finding the optimal solution: the QUBO solution matches the optimal solution for a wide range of sparsity penalty values across the datasets.
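To make the idea concrete, below is a minimal sketch of how best subset selection can be cast as a QUBO; it is illustrative only and not necessarily the paper's exact construction. Each weight is represented by a small fixed-precision binary expansion (the precision vector `p` is an assumption), the least-squares loss is expanded into linear and quadratic terms over the bits, and the ℓ_0 penalty is approximated by a per-bit sparsity penalty `lam` so that the objective stays quadratic in binary variables. The tiny instance is solved with dimod's brute-force `ExactSolver` as a stand-in for a D-Wave sampler.

```python
# Minimal QUBO sketch for sparse (best-subset-style) linear regression.
# Illustrative only; this is not necessarily the formulation used in the paper.
import itertools

import numpy as np
import dimod

rng = np.random.default_rng(0)

# Tiny synthetic problem: 3 features, only the first is informative.
n, d = 20, 3
X = rng.normal(size=(n, d))
y = 1.5 * X[:, 0] + 0.1 * rng.normal(size=n)

# Assumed fixed-precision encoding: w_j = sum_k p[k] * b[j, k], with b binary.
p = np.array([0.5, 1.0])   # precision vector (an assumption, not from the paper)
K = len(p)
lam = 0.5                  # per-bit sparsity penalty, an l0-like surrogate

# Expand ||y - Xw||^2 + lam * sum(b) into QUBO coefficients over the bits.
# Bit (j, k) gets the flat index j * K + k.
G = X.T @ X                # Gram matrix X^T X
c = X.T @ y                # X^T y
Q = {}
for (j, k), (j2, k2) in itertools.product(
        itertools.product(range(d), range(K)), repeat=2):
    u, v = j * K + k, j2 * K + k2
    if u > v:
        continue           # handle each unordered pair of bits once
    coeff = p[k] * p[k2] * G[j, j2]
    if u == v:
        # b^2 = b for binaries: fold the quadratic diagonal, the linear
        # loss term, and the sparsity penalty into the linear coefficient.
        Q[(u, u)] = coeff - 2.0 * p[k] * c[j] + lam
    else:
        Q[(u, v)] = 2.0 * coeff   # symmetric (u, v) and (v, u) terms combined

bqm = dimod.BinaryQuadraticModel.from_qubo(Q, offset=float(y @ y))
best = dimod.ExactSolver().sample(bqm).first   # brute force; a D-Wave sampler would go here
bits = np.array([best.sample[i] for i in range(d * K)]).reshape(d, K)
w = bits @ p
print("recovered weights:", w)   # expect only w[0] to be clearly non-zero
```

The `ExactSolver` simply enumerates all 2^(d·K) bit assignments, which is only feasible for toy sizes; on hardware, the same binary quadratic model could instead be submitted to a D-Wave sampler through the Ocean SDK.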

