On the asymptotic consistency of minimum divergence and least-squares principles

Euclidean distance is a discrepancy measure between two real-valued functions. Divergence is a discrepancy measure between two positive functions. Corresponding to these two well-known discrepancy measures, there are two inference principles: the least-squares principle for choosing a real-valued function subject to linear constraints, and the minimum-divergence principle for choosing a positive function subject to linear constraints. To make the connection between these two principles more transparent, this correspondence provides an observation and a constructive proof that the minimum-divergence principle reduces to the least-squares principle asymptotically as the positivity requirements are de-emphasized. Hence, these two principles are asymptotically consistent. © 2007 IEEE.
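The asymptotic reduction described in the abstract can be illustrated numerically. The sketch below is an assumption on my part: it uses the generalized I-divergence D(p‖q) = Σ pᵢ log(pᵢ/qᵢ) − pᵢ + qᵢ as the divergence (the correspondence's exact definition is not reproduced here), and it de-emphasizes positivity by shifting two real-valued sequences a, b far into the positive orthant by a constant c. A second-order Taylor expansion gives D(c + a ‖ c + b) ≈ ‖a − b‖² / (2c) for large c, so 2c·D approaches the squared Euclidean distance:

```python
import math

def i_divergence(p, q):
    """Generalized I-divergence between positive sequences:
    D(p || q) = sum_i p_i * log(p_i / q_i) - p_i + q_i."""
    return sum(pi * math.log(pi / qi) - pi + qi for pi, qi in zip(p, q))

# Two arbitrary real-valued sequences (illustrative values).
a = [0.3, -0.5, 1.2, 0.0]
b = [0.1, 0.4, -0.2, 0.7]
sq_dist = sum((ai - bi) ** 2 for ai, bi in zip(a, b))

# Shift both sequences by a growing constant c so positivity
# becomes less and less binding; 2*c*D then tends to ||a - b||^2.
for c in (10.0, 100.0, 10000.0):
    p = [c + ai for ai in a]
    q = [c + bi for bi in b]
    print(f"c = {c:>8}: 2cD = {2 * c * i_divergence(p, q):.6f}, "
          f"||a-b||^2 = {sq_dist:.6f}")
```

This is only a pointwise numerical check of the limiting behavior, not a reconstruction of the paper's constructive proof, which concerns constrained minimizers rather than the divergence value itself.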

Publication Title

IEEE Transactions on Information Theory