Optimal-order uniform and nonuniform bounds on the rate of convergence to normality for maximum likelihood estimators
Document Type
Article
Publication Date
1-1-2017
Abstract
© 2017, Institute of Mathematical Statistics. All rights reserved. It is well known that, under general regularity conditions, the distribution of the maximum likelihood estimator (MLE) is asymptotically normal. Very recently, bounds of the optimal order O(1/√n) on the closeness of the distribution of the MLE to normality in the so-called bounded Wasserstein distance were obtained [2, 1], where n is the sample size. However, the corresponding bounds on the Kolmogorov distance were only of the order O(1/n^{1/4}). In this paper, bounds of the optimal order O(1/√n) on the closeness of the distribution of the MLE to normality in the Kolmogorov distance are given, as well as their nonuniform counterparts, which work better in tail zones of the distribution of the MLE. These results are based in part on previously obtained general optimal-order bounds on the rate of convergence to normality in the multivariate delta method. The crucial observation is that, under natural conditions, the MLE can be tightly enough bracketed between two smooth enough functions of the sum of independent random vectors, which makes the delta method applicable. It appears that the nonuniform bounds for MLEs in general have no precedents in the existing literature; a special case was recently treated by Pinelis and Molzon [20]. The results can be extended to M-estimators.
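The O(1/√n) rate in the Kolmogorov distance can be illustrated numerically. The sketch below (not from the paper; the exponential model and all parameter values are chosen here purely for illustration) estimates, by Monte Carlo, the Kolmogorov distance between the distribution of the standardized MLE of an exponential rate and the standard normal, at two sample sizes:

```python
# Illustrative simulation (assumption: exponential model with rate 2.0,
# chosen only for this sketch). The MLE of the rate is lambda_hat = 1/mean(X),
# and sqrt(n)*(lambda_hat - rate)/rate is asymptotically standard normal
# since the Fisher information is 1/rate**2. The Kolmogorov distance to
# normality should shrink roughly like 1/sqrt(n).
import math
import numpy as np

def kolmogorov_distance_to_normal(n, rate=2.0, reps=20000, seed=0):
    """Sup over a grid of |empirical CDF of the standardized MLE - Phi|."""
    rng = np.random.default_rng(seed)
    # reps independent samples of size n from Exp(rate)
    samples = rng.exponential(scale=1.0 / rate, size=(reps, n))
    mle = 1.0 / samples.mean(axis=1)            # MLE of the rate parameter
    z = math.sqrt(n) * (mle - rate) / rate      # standardized MLE
    grid = np.linspace(-4.0, 4.0, 801)
    ecdf = (z[:, None] <= grid).mean(axis=0)    # empirical CDF on the grid
    phi = 0.5 * (1.0 + np.vectorize(math.erf)(grid / math.sqrt(2.0)))
    return float(np.abs(ecdf - phi).max())

d_small = kolmogorov_distance_to_normal(n=25)
d_large = kolmogorov_distance_to_normal(n=400)
print(d_small, d_large)  # distance at n=400 is noticeably smaller than at n=25
```

Monte Carlo noise of order 1/√reps limits how precisely the rate itself can be read off, but the shrinking of the distance with growing n is clearly visible.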
Publication Title
Electronic Journal of Statistics
Recommended Citation
Pinelis, I.
(2017).
Optimal-order uniform and nonuniform bounds on the rate of convergence to normality for maximum likelihood estimators.
Electronic Journal of Statistics,
11(1), 1160-1179.
http://doi.org/10.1214/17-EJS1264
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/13133