On the asymptotic consistency of minimum divergence and least-squares principles
Euclidean distance is a discrepancy measure between two real-valued functions. Divergence is a discrepancy measure between two positive functions. Corresponding to these two well-known discrepancy measures are two inference principles: the least-squares principle, for choosing a real-valued function subject to linear constraints, and the minimum-divergence principle, for choosing a positive function subject to linear constraints. To make the connection between these two principles more transparent, this correspondence provides an observation and a constructive proof that the minimum-divergence principle reduces to the least-squares principle asymptotically as the positivity requirements are de-emphasized. Hence, the two principles are asymptotically consistent. © 2007 IEEE.
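The asymptotic reduction can be illustrated numerically under a common interpretation of "de-emphasizing positivity": shift both functions by a large positive constant c, so the positivity boundary recedes. The sketch below (an illustration only, not the paper's construction; the I-divergence form, the shift c, and the constraint setup are assumptions) minimizes the I-divergence D(p‖q) = Σᵢ (pᵢ log(pᵢ/qᵢ) − pᵢ + qᵢ) of p = c + x from q = c + x₀ subject to a linear constraint Ax = b, and compares the minimizer with the least-squares projection of x₀ onto the same constraint set as c grows.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative setup (assumed, not from the paper): a reference
# function x0, a linear constraint A x = b, and a positive shift c.
rng = np.random.default_rng(0)
n = 5
x0 = rng.normal(size=n)          # reference function values
A = rng.normal(size=(2, n))      # linear constraint matrix
b = A @ rng.normal(size=n)       # feasible right-hand side

# Least-squares solution: orthogonal projection of x0 onto {x : A x = b}.
ls = x0 + A.T @ np.linalg.solve(A @ A.T, b - A @ x0)

def divergence_solution(c):
    """Minimize the I-divergence of p = c + x from q = c + x0 s.t. A x = b."""
    def D(x):
        p, q = c + x, c + x0
        return np.sum(p * np.log(p / q) - p + q)
    cons = {"type": "eq", "fun": lambda x: A @ x - b}
    bnds = [(-0.99 * c, None)] * n   # keep p = c + x strictly positive
    res = minimize(D, ls, constraints=cons, bounds=bnds, tol=1e-12)
    return res.x

# As c grows, the divergence minimizer approaches the least-squares one,
# since D(c + x || c + x0) ~ ||x - x0||^2 / (2c) to second order.
for c in (10.0, 100.0, 1000.0):
    gap = np.linalg.norm(divergence_solution(c) - ls)
    print(f"c = {c:7.1f}   ||x_div - x_ls|| = {gap:.2e}")
```

The second-order Taylor expansion in the final comment is the heart of the reduction: once the shift c dominates, the divergence behaves like a scaled squared Euclidean distance, so both principles select the same function in the limit.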
IEEE Transactions on Information Theory
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/11043