Confidence Sets under Generalized Self-Concordance

12/31/2022
by Lang Liu, et al.

This paper revisits a fundamental problem in statistical inference from a non-asymptotic theoretical viewpoint – the construction of confidence sets. We establish a finite-sample bound for the estimator, characterizing its asymptotic behavior in a non-asymptotic fashion. An important feature of our bound is that its dimension dependency is captured by the effective dimension – the trace of the limiting sandwich covariance – which can be much smaller than the parameter dimension in some regimes. We then illustrate how the bound can be used to obtain a confidence set whose shape is adapted to the optimization landscape induced by the loss function. Unlike previous works that rely heavily on the strong convexity of the loss function, we only assume that the Hessian is lower bounded at the optimum and allow it to gradually become degenerate. This property is formalized by the notion of generalized self-concordance, which originated in convex optimization. Moreover, we demonstrate how the effective dimension can be estimated from data and characterize its estimation accuracy. We apply our results to maximum likelihood estimation with generalized linear models, score matching with exponential families, and hypothesis testing with Rao's score test.
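To make the "effective dimension" notion concrete, below is a minimal sketch of a plug-in estimate for logistic regression (one of the generalized linear models the abstract mentions). It takes the effective dimension to be the trace of a sandwich-type matrix, tr(H^{-1} G), with H the empirical Hessian of the loss at the fitted parameter and G the empirical covariance of per-sample gradients. This is a common convention and an assumption on our part; the paper's exact definition, estimator, and accuracy guarantees may differ, and all function names below are illustrative.

```python
import numpy as np

def logistic_effective_dimension(X, y, theta_hat):
    """Plug-in estimate of tr(H^{-1} G) for logistic regression.

    X: (n, d) design matrix, y: (n,) labels in {0, 1},
    theta_hat: (d,) fitted parameter (e.g. the MLE).
    """
    n, d = X.shape
    p = 1.0 / (1.0 + np.exp(-X @ theta_hat))     # predicted probabilities
    grads = (p - y)[:, None] * X                  # per-sample gradients of the log-loss
    G = grads.T @ grads / n                       # empirical gradient covariance
    w = p * (1.0 - p)                             # Hessian weights
    H = (X * w[:, None]).T @ X / n                # empirical Hessian at theta_hat
    # tr(H^{-1} G); solve a linear system rather than forming an explicit inverse
    return np.trace(np.linalg.solve(H, G))

# Toy usage: under a well-specified model G ≈ H, so the estimate should be close to d.
rng = np.random.default_rng(0)
n, d = 2000, 5
X = rng.normal(size=(n, d))
theta_star = rng.normal(size=d)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_star)))
# In practice theta_hat would come from an optimizer; we plug in theta_star for brevity.
print(logistic_effective_dimension(X, y, theta_star))
```

Under misspecification, G and H can differ substantially, and the trace of the sandwich matrix can be much smaller than d, which is the regime the abstract highlights.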
