Alpha-NML Universal Predictors

02/25/2022
by Marco Bondaschi, et al.

Inspired by Sibson's alpha-mutual information, we introduce a new class of universal predictors that depend on a real parameter greater than one. This class interpolates between two well-known predictors: the mixture estimator, which includes the Laplace and the Krichevsky-Trofimov predictors, and the Normalized Maximum Likelihood (NML) estimator. We point out some advantages of this class of predictors and study its performance in terms of known regret measures under logarithmic loss, in particular for the well-studied case of discrete memoryless sources.
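The interpolation described in the abstract can be illustrated on a binary memoryless source. A minimal sketch, assuming the alpha-NML assigns each sequence a score proportional to the 1/alpha power of the alpha-th moment of the likelihood under a prior (here a uniform prior is chosen for illustration; the paper's construction may use a different prior such as Jeffreys'): at alpha = 1 this reduces to the mixture (Laplace) estimator, and as alpha grows it approaches the NML estimator, whose score is the maximized likelihood.

```python
from math import lgamma, exp, comb

def log_beta(a, b):
    """Log of the Beta function B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def alpha_nml_binary(n, alpha):
    """Alpha-NML probabilities for binary sequences of length n, uniform prior.

    Returns a dict mapping k (number of ones) to the probability assigned
    to any single sequence containing k ones. For a Bernoulli(theta) source,
    the integral of theta^(alpha*k) * (1-theta)^(alpha*(n-k)) over [0, 1]
    is B(alpha*k + 1, alpha*(n-k) + 1); the score is its 1/alpha power.
    """
    scores = {
        k: exp(log_beta(alpha * k + 1, alpha * (n - k) + 1) / alpha)
        for k in range(n + 1)
    }
    # Normalize over all 2^n sequences: comb(n, k) sequences contain k ones.
    z = sum(comb(n, k) * s for k, s in scores.items())
    return {k: s / z for k, s in scores.items()}
```

For example, `alpha_nml_binary(3, 1.0)` reproduces the uniform-prior mixture (the all-ones sequence gets probability B(4, 1) = 1/4), while for large alpha the values approach the NML probabilities obtained by normalizing the maximized likelihoods (k/n)^k ((n-k)/n)^(n-k).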


