Minimax Rates for High-dimensional Double Sparse Structure over ℓ_q-balls

07/25/2022
by   Zhifan Li, et al.

In this paper, we focus on the high-dimensional double sparse structure, in which the parameter of interest simultaneously exhibits group-wise sparsity and element-wise sparsity within each group. Combining the Gilbert-Varshamov bound and its variants, we develop a novel lower bound technique for the metric entropy of the parameter space that is well suited to the double sparse structure over ℓ_q-balls for q ∈ [0,1]. Using this technique together with Fano's inequality, we prove information-theoretic lower bounds on the estimation error. Matching upper bounds are also established; their proof follows from a direct analysis of the constrained least-squares estimator and results on empirical processes. Moreover, we extend the results over ℓ_q-balls to the double sparse linear regression model and establish the minimax rate of its estimation error. Finally, we develop the DSIHT (Double Sparse Iterative Hard Thresholding) algorithm and show that it is minimax optimal for the double sparse linear regression problem.
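To make the double sparse structure concrete, the following display sketches the parameter set in the exact-sparsity case. The notation is illustrative and not taken from the abstract: θ is split into m groups G_1, ..., G_m of size d, at most s groups are active, and each active group has at most s_0 nonzero entries; for q ∈ (0,1] the ℓ_0 constraints would be replaced by ℓ_q constraints.

```latex
% Illustrative exact-sparsity (q = 0) version of the double sparse parameter set.
\Theta(s, s_0) \;=\; \Bigl\{\, \theta \in \mathbb{R}^{md} \;:\;
    \textstyle\sum_{j=1}^{m} \mathbf{1}\{\theta_{G_j} \neq 0\} \le s, \quad
    \|\theta_{G_j}\|_0 \le s_0 \ \text{for every } j \,\Bigr\}
```

The sketch below illustrates one natural way an iterative hard thresholding scheme can enforce this structure: a gradient step on the least-squares loss followed by a two-level projection that keeps the s_0 largest entries within each group and then the s groups with the largest remaining ℓ_2 norm. It is a minimal illustration under these assumptions, not the authors' exact DSIHT algorithm; all names (m, d, s, s0, eta, n_iter) are hypothetical.

```python
# Minimal double-sparse iterative-hard-thresholding sketch for y = X @ theta + noise.
# Assumed setup (not from the paper): m groups of size d, group sparsity s,
# within-group sparsity s0.
import numpy as np

def double_sparse_threshold(theta, m, d, s, s0):
    """Project theta (length m*d) onto the exactly double-sparse set:
    keep the s0 largest-magnitude entries inside each group, then keep
    only the s groups with the largest resulting l2 norm."""
    theta = theta.reshape(m, d).copy()
    # Within-group hard thresholding: zero all but the top-s0 entries per group.
    for j in range(m):
        small = np.argsort(np.abs(theta[j]))[: d - s0]
        theta[j, small] = 0.0
    # Group-wise hard thresholding: zero all but the top-s groups by l2 norm.
    group_norms = np.linalg.norm(theta, axis=1)
    weak_groups = np.argsort(group_norms)[: m - s]
    theta[weak_groups] = 0.0
    return theta.ravel()

def dsiht_sketch(X, y, m, d, s, s0, eta=None, n_iter=200):
    """Gradient step on the least-squares loss, then the two-level projection."""
    n, p = X.shape
    assert p == m * d
    if eta is None:
        eta = n / np.linalg.norm(X, 2) ** 2  # 1 / largest eigenvalue of X^T X / n
    theta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ theta - y) / n
        theta = double_sparse_threshold(theta - eta * grad, m, d, s, s0)
    return theta
```

The two-stage projection is the key design choice in this sketch: thresholding within groups first and then across groups yields an iterate that always lies in the double-sparse set, mirroring the simultaneous group-wise and element-wise sparsity described in the abstract.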


