Model selection by minimum description length: Lower-bound sample sizes for the Fisher information approximation

08/01/2018
by Daniel W. Heck et al.

The Fisher information approximation (FIA) is an implementation of the minimum description length principle for model selection. Unlike information criteria such as AIC or BIC, it has the advantage of taking the functional form of a model into account. Unfortunately, FIA can be misleading in finite samples, resulting, in the worst case, in an inversion of the correct rank order of the complexity terms of competing models. As a remedy, we propose a lower bound N' on the sample size that suffices to preclude such errors. We illustrate the approach with three examples from the family of multinomial processing tree models.
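
The bound in the abstract can be made concrete under the standard FIA form, in which each model's complexity term equals (k/2) ln(N / 2π) plus the log of the integrated square-root determinant of the unit Fisher information. Under that assumption, an inversion between a simpler and a more complex model is ruled out once ((k_complex − k_simple)/2) ln(N/2π) exceeds the difference of the two log-integrals, which solves to N > 2π exp[2(a_simple − a_complex)/(k_complex − k_simple)]. The Python sketch below computes such a bound; the function names and numeric inputs are illustrative only and not taken from the paper, so results should be checked against the authors' exact N'.

import math

def fia_complexity(n, k, log_fisher_integral):
    # Standard FIA complexity term: (k / 2) * ln(n / (2 * pi)) plus the
    # log of the integrated square-root Fisher-information determinant.
    return 0.5 * k * math.log(n / (2 * math.pi)) + log_fisher_integral

def lower_bound_n(k_simple, log_int_simple, k_complex, log_int_complex):
    # Smallest N beyond which the model with more free parameters is
    # guaranteed the larger complexity term, i.e. no rank-order inversion.
    if k_complex <= k_simple:
        raise ValueError("k_complex must exceed k_simple")
    exponent = 2.0 * (log_int_simple - log_int_complex) / (k_complex - k_simple)
    return 2.0 * math.pi * math.exp(exponent)

# Hypothetical example: a 2-parameter and a 3-parameter MPT model with
# made-up log Fisher-information integrals (values are not from the paper).
n_prime = lower_bound_n(k_simple=2, log_int_simple=1.4,
                        k_complex=3, log_int_complex=0.9)
print(f"Complexity-term inversion ruled out for N > {n_prime:.1f}")

With these illustrative values the bound is roughly N' ≈ 17, i.e. a few dozen observations would already order the penalties correctly; real MPT applications can require substantially larger N depending on the Fisher-information integrals.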

Related research

Minimum Message Length Autoregressive Moving Average Model Order Selection (10/07/2021)
This paper derives a Minimum Message Length (MML) criterion for the mode...

Higher-order asymptotics for the parametric complexity (10/01/2015)
The parametric complexity is the key quantity in the minimum description...

Information-based inference for singular models and finite sample sizes (06/19/2015)
A central problem in statistics is model selection, the choice between c...

Minimum Description Length Principle in Supervised Learning with Application to Lasso (07/11/2016)
The minimum description length (MDL) principle in supervised learning is...

Unsupervised Discretization by Two-dimensional MDL-based Histogram (06/02/2020)
Unsupervised discretization is a crucial step in many knowledge discover...

Unified model selection approach based on minimum description length principle in Granger causality analysis (10/25/2019)
Granger causality analysis (GCA) provides a powerful tool for uncovering...

Inferring Latent dimension of Linear Dynamical System with Minimum Description Length (06/23/2019)
Time-invariant linear dynamical system arises in many real-world applica...
