Empirical Phi-discrepancies and quasi-empirical likelihood: exponential bounds
1 MODAL’X, Université Paris-Ouest-Nanterre-La Défense
2 CREST-LS and Laboratoire REGARDS, Université de Reims Champagne-Ardenne
3 Ecole Normale Supérieure de Cachan
We review some recent extensions of the so-called generalized empirical likelihood method, in which the Kullback divergence is replaced by a general convex divergence. We propose to use, instead of empirical likelihood, a regularized or quasi-empirical likelihood method, corresponding to a convex combination of the Kullback and χ2 discrepancies. We show that, for an adequate choice of the weight in this combination, the corresponding quasi-empirical likelihood is Bartlett-correctable. We also establish non-asymptotic exponential bounds for the confidence regions obtained by this method. These bounds are derived from bounds for self-normalized sums in the multivariate case, obtained in previous work by the authors. We further show that results of this kind extend to infinite-dimensional, process-valued parameters; in this case, known results on self-normalized processes can be used to control the behavior of the generalized empirical likelihood.
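The convex combination underlying the quasi-empirical likelihood can be sketched as follows; the particular normalizations of the two divergence generators and the weight symbol ε are conventions chosen here for illustration, not necessarily those used in the paper itself:

```latex
% Standardly normalized divergence generators (nonnegative, vanishing at x = 1):
%   Kullback:     \varphi_{K}(x)      = x \log x - x + 1,   x > 0,
%   chi-squared:  \varphi_{\chi^2}(x) = \tfrac{1}{2}(x-1)^2.
% Quasi-empirical discrepancy: a convex combination with weight
% \varepsilon \in [0,1]:
\varphi_{\varepsilon}(x) \;=\; \varepsilon \, \varphi_{\chi^2}(x)
  \;+\; (1 - \varepsilon) \, \varphi_{K}(x).
```

For ε = 0 this recovers the Kullback case (empirical likelihood) and for ε = 1 the pure χ2 (Euclidean likelihood) case; the Bartlett-correctability claim of the abstract concerns an adequate intermediate choice of the weight ε.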
© EDP Sciences, SMAI 2015