Weak convergence of Bayes estimators under general loss functions

Authors

Requadt R, Li H, Munk A

Journal

arXiv (preprint)

Citation

arXiv:2510.05645

Abstract

We investigate the asymptotic behavior of parametric Bayes estimators under a broad class of loss functions that extends beyond the classical translation-invariant setting. To this end, we develop a unified theoretical framework for loss functions exhibiting locally polynomial structure. This general theory encompasses important examples such as the squared Wasserstein distance, the Sinkhorn divergence, and Stein discrepancies, which have gained prominence in modern statistical inference and machine learning. Building on the classical Bernstein–von Mises theorem, we establish sufficient conditions under which Bayes estimators inherit the posterior's asymptotic normality. As a by-product, we also derive conditions for the differentiability of Wasserstein-induced loss functions and provide new consistency results for Bayes estimators. Several examples and numerical experiments demonstrate the relevance and accuracy of the proposed methodology.
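
Example

A minimal numerical sketch (not taken from the paper) of the defining property the abstract builds on: a Bayes estimator minimizes the posterior expected loss. The setting below, a Gaussian location model with a conjugate normal prior, and all variable names are illustrative assumptions. It uses the standard closed form W_2^2(N(a, s^2), N(b, s^2)) = (a - b)^2, so the Bayes estimator under this Wasserstein-induced loss should coincide with the posterior mean, which the Monte Carlo minimization checks numerically.

    # Illustrative sketch: Bayes estimator as the minimizer of the
    # posterior expected loss, here a Wasserstein-induced loss.
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(0)

    # Data from a Gaussian location model N(theta0, sigma^2), sigma known.
    theta0, sigma, n = 1.5, 1.0, 200
    x = rng.normal(theta0, sigma, size=n)

    # Conjugate N(mu0, tau0^2) prior => Gaussian posterior (standard formulas).
    mu0, tau0 = 0.0, 10.0
    post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
    post_mean = post_var * (mu0 / tau0**2 + x.sum() / sigma**2)

    # Posterior draws for a Monte Carlo approximation of the expected loss.
    theta_samples = rng.normal(post_mean, np.sqrt(post_var), size=5000)

    def w2_squared(a, b):
        # Squared 2-Wasserstein distance between N(a, sigma^2) and
        # N(b, sigma^2); with equal variances it reduces to (a - b)^2.
        return (a - b) ** 2

    def expected_loss(d):
        # Monte Carlo estimate of E[ W_2^2(P_theta, P_d) | data ].
        return np.mean(w2_squared(theta_samples, d))

    res = minimize_scalar(expected_loss,
                          bounds=(post_mean - 5, post_mean + 5),
                          method="bounded")
    print(f"Bayes estimator under W2^2 loss: {res.x:.4f}")
    print(f"Posterior mean:                  {post_mean:.4f}")

Under this loss the two printed values agree up to Monte Carlo error; the paper's general framework concerns losses (e.g. Sinkhorn divergences, Stein discrepancies) where no such closed form is available and the minimizer must be studied asymptotically.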

DOI

10.48550/arXiv.2510.05645