On the asymptotic consistency of minimum divergence and least-squares principles
Document Type
Article
Publication Date
12-1-2007
Abstract
Euclidean distance is a discrepancy measure between two real-valued functions. Divergence is a discrepancy measure between two positive functions. Corresponding to these two well-known discrepancy measures, there are two inference principles; namely, the least-squares principle for choosing a real-valued function subject to linear constraints, and the minimum-divergence principle for choosing a positive function subject to linear constraints. To make the connection between these two principles more transparent, this correspondence provides an observation and a constructive proof that the minimum-divergence principle reduces to the least-squares principle asymptotically as the positivity requirements are de-emphasized. Hence, these two principles are asymptotically consistent. © 2007 IEEE.
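The paper's constructive proof is not reproduced in this record, but the claimed limit can be sketched numerically. The following illustration (my own, not the authors') takes a flat positive reference q_i = c, a single linear constraint a · (x − q) = d, and the I-divergence Σ_i [x_i log(x_i/c) − x_i + c]; the vector a, the value d, and the function names are all illustrative assumptions. As c grows, i.e., as the positivity floor is pushed far from zero, the minimum-divergence correction x − q approaches the least-squares correction d·a/||a||².

```python
import numpy as np

# Illustrative setup (not taken from the paper): one linear constraint
# a . (x - q) = d, with a flat positive reference q_i = c.
a = np.array([1.0, 2.0, -1.0])
d = 0.5

def min_divergence_solution(a, d, c, iters=200):
    """Minimize the I-divergence sum_i [x_i log(x_i/c) - x_i + c]
    subject to a . (x - c) = d.  Stationarity gives x_i = c*exp(lam*a_i);
    the multiplier lam is found by bisection on the constraint residual,
    which is strictly increasing in lam."""
    def residual(lam):
        return c * np.sum(a * (np.exp(lam * a) - 1.0)) - d
    lo, hi = -1.0, 1.0
    while residual(lo) > 0.0:   # expand bracket downward if needed
        lo *= 2.0
    while residual(hi) < 0.0:   # expand bracket upward if needed
        hi *= 2.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if residual(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return c * np.exp(lam * a)

# Least-squares: min ||x - q||^2 s.t. a . (x - q) = d has the
# closed-form correction d * a / ||a||^2.
ls_correction = d * a / np.dot(a, a)

# As c grows (positivity de-emphasized), the divergence solution's
# correction x - c approaches the least-squares correction.
for c in [1.0, 10.0, 1000.0]:
    x = min_divergence_solution(a, d, c)
    gap = np.max(np.abs((x - c) - ls_correction))
    print(f"c = {c:7.1f}   max |(x - c) - LS correction| = {gap:.2e}")
```

The convergence is visible in a first-order expansion: x_i − c = c(e^{λa_i} − 1) ≈ cλa_i, and the constraint then forces cλ ≈ d/||a||², which is exactly the least-squares multiplier.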
Publication Title
IEEE Transactions on Information Theory
Recommended Citation
Zhao, Z., & Blahut, R. (2007). On the asymptotic consistency of minimum divergence and least-squares principles. IEEE Transactions on Information Theory, 53(9), 3283-3287. http://doi.org/10.1109/TIT.2007.903127
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/11043