The Hellinger Bounds on the Kullback-Leibler Divergence and the Bernstein Norm

Authors: Tetsuya Kaji

Year: 2026

Subjects: math.ST, econ.EM


Abstract

The Kullback-Leibler divergence, the Kullback-Leibler variation, and the Bernstein "norm" are used to quantify discrepancies among probability distributions in likelihood models such as nonparametric maximum likelihood and nonparametric Bayes. They are closely related to the Hellinger distance, which is often easier to work with. Consequently, it is of interest to characterize conditions under which the Hellinger distance serves as an upper bound for these measures. This article characterizes a necessary and sufficient condition for each of the discrepancy measures to be bounded by the Hellinger distance. It accommodates unbounded likelihood ratios and generalizes all previously known results. We then apply it to relax the regularity condition for the sieve maximum likelihood estimator.
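The need for conditions such as bounded likelihood ratios can be seen numerically. The sketch below (my own illustration, not from the paper) uses the standard definitions of the squared Hellinger distance and the KL divergence on discrete distributions: as one distribution puts vanishing mass where the other does not, the likelihood ratio becomes unbounded and the KL divergence diverges while the Hellinger distance stays bounded, so no universal bound of KL by Hellinger can hold without such a condition.

```python
import math

def hellinger_sq(p, q):
    # Squared Hellinger distance: H^2(P, Q) = 1 - sum_i sqrt(p_i * q_i)
    return 1.0 - sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def kl(p, q):
    # Kullback-Leibler divergence: KL(P || Q) = sum_i p_i * log(p_i / q_i)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# As eps -> 0 the likelihood ratio p_i / q_i is unbounded:
# KL(P || Q) grows without bound, while H^2(P, Q) stays below 1.
p = [0.5, 0.5]
for eps in (1e-1, 1e-3, 1e-6):
    q = [1.0 - eps, eps]
    print(f"eps={eps:.0e}  H^2={hellinger_sq(p, q):.4f}  KL={kl(p, q):.4f}")
```

This is only a toy example with Bernoulli-type distributions; the article's contribution is the exact characterization of when such blow-ups are ruled out and the Hellinger distance does bound these discrepancy measures.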
