Information | |
---|---|
has gloss | eng: In information theory and statistics, Kullback's inequality is a lower bound on the Kullback–Leibler divergence expressed in terms of the large deviations rate function. If P and Q are probability distributions on the real line, such that P is absolutely continuous with respect to Q, i.e. P \ll Q, and whose first moments exist, then D_{\mathrm{KL}}(P\|Q) \ge \Psi_Q^*(\mu'_1(P)), where \Psi_Q^* is the rate function, i.e. the convex conjugate of the cumulant-generating function, of Q, and \mu'_1(P) is the first moment of P (see the numerical sketch below the table). |
lexicalization | eng: Kullback's inequality |
instance of | c/Statistical inequalities |
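
The bound in the gloss can be checked numerically. The following Python sketch is an illustration added here, not part of the Lexvo data; the choice of P = N(0, 2) and Q = N(0, 1) and all helper names are assumptions made for the example. It computes D_KL(P‖Q) by quadrature and evaluates the rate function \Psi_Q^* as the Legendre transform of Q's cumulant-generating function, then confirms the lower bound.

```python
# Illustrative sketch: numerically check Kullback's inequality for two Gaussians.
# The distributions P, Q and helper names below are assumptions for this example.
import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

P = stats.norm(loc=0.0, scale=np.sqrt(2.0))   # P = N(0, 2)
Q = stats.norm(loc=0.0, scale=1.0)            # Q = N(0, 1)

def kl_divergence(p, q):
    """D_KL(P || Q) by numerical quadrature of p(x) * log(p(x)/q(x))."""
    integrand = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    val, _ = quad(integrand, -np.inf, np.inf)
    return val

def cgf_Q(t):
    """Cumulant-generating function Psi_Q(t) = log E_Q[exp(t X)] for Q = N(0, 1)."""
    return 0.5 * t**2

def rate_function_Q(x):
    """Psi_Q^*(x) = sup_t (t*x - Psi_Q(t)), the convex conjugate of the CGF."""
    res = minimize_scalar(lambda t: -(t * x - cgf_Q(t)))
    return -res.fun

mu1_P = P.mean()                      # first moment mu'_1(P)
lhs = kl_divergence(P, Q)             # D_KL(P || Q), about 0.1534 here
rhs = rate_function_Q(mu1_P)          # Psi_Q^*(mu'_1(P)), equal to 0 here

print(f"D_KL(P||Q)       = {lhs:.4f}")
print(f"Psi_Q^*(mu'_1(P)) = {rhs:.4f}")
assert lhs >= rhs - 1e-9              # Kullback's inequality holds
```

For this pair the bound is strict (about 0.153 ≥ 0); if P and Q are normals with the same variance and different means, the two sides coincide, so the bound is tight within that family.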