Information-based inference for singular models and finite sample sizes

06/19/2015
by Colin H. LaMont, et al.

A central problem in statistics is model selection: the choice between competing models of a stochastic process whose observables are corrupted by noise. In the information-based paradigm of inference, model selection is performed by estimating the predictive performance of the competing models; the candidate model with the best estimated predictive performance is selected. Information-based inference depends on the accuracy of the estimate of the predictive complexity, a measure of the flexibility of the model in fitting the data. A large-sample-size approximation for the performance is the Akaike Information Criterion (AIC). The AIC approximation fails in a wide range of important applications, significantly under- or over-estimating the complexity. We introduce an improved approximation for the complexity, which we use to define a new information criterion: the frequentist information criterion (FIC). FIC extends the applicability of information-based inference to the finite-sample-size regime of regular models and to singular models. We demonstrate the power of the approach in a number of example problems.
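The AIC-based selection procedure the abstract refers to can be illustrated with a minimal sketch. The example below is not the paper's FIC; it is the standard AIC recipe (AIC = 2k − 2 ln L̂) applied to a hypothetical polynomial-regression problem with Gaussian noise, where the model with the lowest AIC is selected. The data-generating model and all parameter values are invented for illustration.

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 ln L-hat."""
    return 2 * k - 2 * log_likelihood

def gaussian_log_likelihood(residuals):
    """Maximized log-likelihood of a Gaussian noise model,
    with the noise variance estimated by maximum likelihood."""
    n = len(residuals)
    sigma2 = np.mean(residuals ** 2)
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

# Synthetic data: the true process is a quadratic plus noise (illustrative only).
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
y = 1.0 + 2.0 * x - 1.5 * x ** 2 + rng.normal(scale=0.1, size=x.size)

# Competing models: polynomials of increasing degree.
scores = {}
for degree in range(6):
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    # k counts the polynomial coefficients plus one noise parameter (sigma).
    scores[degree] = aic(gaussian_log_likelihood(residuals), degree + 2)

# Information-based selection: pick the model with the lowest AIC.
best = min(scores, key=scores.get)
print("selected degree:", best)
```

The paper's point is precisely that this penalty (2k) is only a large-sample approximation to the predictive complexity; for small samples or singular models the true complexity can differ substantially, which is what FIC is designed to correct.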
