Cressie-Read Power Divergence for Moment-Based Estimation: Hyperparameter and Finite-Sample Behavior

Authors: Jieun Lee, Anil K. Bera

Year: 2026

econ.EM


Abstract

We study Cressie-Read power divergence (CRPD) estimation for moment-based models, focusing on finite-sample behavior. While generalized empirical likelihood estimators, which are dual to CRPD, are known to outperform generalized method of moments estimators in small to moderate samples, the power parameter is typically chosen arbitrarily by the researcher and serves mainly as an index. We instead interpret it as a hyperparameter that determines the loss function and governs the learning procedure, shaping the curvature of the objective and influencing finite-sample performance. Using second-order asymptotics, we show that it affects both the structural estimator and the associated Lagrange multipliers, governing robustness, bias, and sensitivity to sampling variation. Monte Carlo simulations illustrate how estimator performance varies with the choice of the power parameter and with underlying distributional features, with implications for second-order bias and coverage distortion. An empirical illustration based on Owen's (2001) classical example highlights the practical relevance of tuning the power parameter.
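As a rough illustration of the object the abstract refers to (not the paper's own code), the Cressie-Read power divergence of implied probabilities $\pi_i$ from the uniform empirical weights $1/n$ can be written, following Cressie and Read (1984), as $I_\lambda(\pi : 1/n) = \frac{1}{\lambda(\lambda+1)} \sum_i \pi_i \left[(n\pi_i)^\lambda - 1\right]$, with the power parameter $\lambda$ indexing the family: the limit $\lambda \to -1$ recovers empirical likelihood, $\lambda \to 0$ gives Kullback-Leibler divergence (exponential tilting), and $\lambda = 1$ gives the Euclidean case. A minimal sketch, with the function name and example weights being my own choices:

```python
# Illustrative sketch of the Cressie-Read power divergence family
# between weights pi and uniform empirical weights 1/n:
#   I_lambda(pi : 1/n) = sum_i pi_i * ((n*pi_i)^lambda - 1) / (lambda*(lambda+1))
# The function name and example weights are hypothetical, for illustration only.
import numpy as np

def cressie_read(pi, lam, tol=1e-10):
    """Power divergence of weights pi from uniform 1/n, with limits at lam = 0, -1."""
    pi = np.asarray(pi, dtype=float)
    n = pi.size
    r = n * pi                       # likelihood ratio of pi_i against 1/n
    if abs(lam) < tol:               # lam -> 0 limit: Kullback-Leibler (exponential tilting)
        return float(np.sum(pi * np.log(r)))
    if abs(lam + 1.0) < tol:         # lam -> -1 limit: empirical likelihood objective
        return float(-np.mean(np.log(r)))
    return float(np.sum(pi * (r**lam - 1.0)) / (lam * (lam + 1.0)))

# Uniform weights give zero divergence for every lam;
# non-uniform weights are penalized, with curvature controlled by lam.
u = np.full(5, 0.2)
w = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
```

The curvature point in the abstract shows up here directly: for the same non-uniform `w`, different values of `lam` penalize large ratios $n\pi_i$ very differently, which is one way to see why the power parameter shapes finite-sample behavior.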
