Cramér-Rao Bound for Estimation After Model Selection and its Application to Sparse Vector Estimation

by Elad Meir et al.

In many practical parameter estimation problems, such as coefficient estimation in polynomial regression and direction-of-arrival (DOA) estimation, model selection is performed prior to estimation. In these cases, it is assumed that the true measurement model belongs to a set of candidate models. The data-based model selection step affects the subsequent estimation and may result in biased estimators. In particular, the oracle Cramér-Rao bound (CRB), which assumes knowledge of the true model, is inappropriate for post-model-selection performance analysis and system design outside the asymptotic region. In this paper, we analyze the performance of post-model-selection estimators by using the mean-squared-selected-error (MSSE) criterion. We consider coherent estimators, which force unselected parameters to zero, and introduce the concept of selective unbiasedness in the sense of Lehmann unbiasedness. We derive a non-Bayesian Cramér-Rao-type bound on the MSSE and on the mean-squared-error (MSE) of any coherent and selectively unbiased estimator. As an important special case, we illustrate the computation and applicability of the proposed selective CRB for sparse vector estimation, in which the selection of a model is equivalent to the recovery of the support. Finally, we demonstrate in numerical simulations that the proposed selective CRB is a valid lower bound on the performance of the post-model-selection maximum likelihood estimator for the general linear model with different model selection criteria, and for sparse vector estimation with one-step thresholding. It is shown that for these cases the selective CRB outperforms the existing bounds: the oracle CRB, the averaged CRB, and the SMS-CRB from [1].
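The sparse-vector special case described above can be illustrated with a minimal sketch. Assuming a Gaussian linear model `y = A x + w` with a `K`-sparse `x`, the sketch performs model selection by one-step thresholding of the correlations `A^T y` (support recovery), then computes the coherent post-selection estimate (least squares on the selected columns, zeros elsewhere) and compares its squared error to the oracle CRB, which here is `sigma^2 * trace((A_S^T A_S)^{-1})` for the true support `S`. All dimensions, the threshold rule, and variable names are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sparse linear model: y = A x + w, with x K-sparse.
n, m, K = 50, 20, 3              # measurements, parameters, sparsity (assumed)
sigma = 0.1                      # noise standard deviation (assumed)
A = rng.standard_normal((n, m)) / np.sqrt(n)
x = np.zeros(m)
x[rng.choice(m, K, replace=False)] = 1.0

y = A @ x + sigma * rng.standard_normal(n)

# Model selection by one-step thresholding of the correlations A^T y:
# selecting a model is equivalent to recovering the support.
corr = np.abs(A.T @ y)
support = np.sort(np.argsort(corr)[-K:])     # keep the K largest correlations

# Coherent post-model-selection estimate: least squares on the selected
# columns, with all unselected parameters forced to zero.
x_hat = np.zeros(m)
x_hat[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]

# Oracle CRB (true support assumed known): sigma^2 * tr((A_S^T A_S)^{-1}).
S_true = np.flatnonzero(x)
A_S = A[:, S_true]
oracle_crb = sigma**2 * np.trace(np.linalg.inv(A_S.T @ A_S))

sq_err = np.sum((x_hat - x) ** 2)
print(f"post-selection squared error: {sq_err:.4f}, oracle CRB: {oracle_crb:.4f}")
```

Averaging the squared error over many noise realizations would give the empirical MSE; the paper's point is that, when the selection step sometimes misses the true support, this MSE can exceed the oracle CRB, whereas the proposed selective CRB remains a valid lower bound.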


