An inverse Sanov theorem for curved exponential families
We prove the large deviation principle (LDP) for posterior distributions arising from curved exponential families in a parametric setting, allowing for misspecification of the model. Moreover, motivated by the so-called inverse Sanov theorem, obtained in a nonparametric setting by Ganesh and O'Connell (1999 and 2000), we study the relationship between the rate function for the LDP studied in this paper and the one for the LDP for the corresponding maximum likelihood estimators. In our setting, even in the correctly specified case, it is not true in general that the rate functions for posterior distributions and for maximum likelihood estimators are Kullback-Leibler divergences with exchanged arguments. Finally, the results of the paper have some further interest for the case of exponential families admitting a dual one (see Letac (2021+)).
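The contrast between the two rate functions hinges on the asymmetry of the Kullback-Leibler divergence: exchanging its arguments generally changes its value. A minimal sketch of this fact, using the (hypothetical, purely illustrative) example of two Bernoulli distributions:

```python
import math

def kl_bernoulli(p, q):
    """Kullback-Leibler divergence K(p, q) between Bernoulli(p) and Bernoulli(q),
    assuming 0 < p < 1 and 0 < q < 1."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

p, q = 0.3, 0.6
print(kl_bernoulli(p, q))  # K(p, q)
print(kl_bernoulli(q, p))  # K(q, p): a different value, since KL is asymmetric
```

The two printed values differ, illustrating why a rate function expressed as K(theta, theta0) is in general distinct from the same divergence with exchanged arguments, K(theta0, theta).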