Kullback's inequality


Information
has gloss (eng): In information theory and statistics, Kullback's inequality is a lower bound on the Kullback–Leibler divergence, expressed in terms of the large-deviations rate function. If P and Q are probability distributions on the real line such that P is absolutely continuous with respect to Q, i.e. P ≪ Q, and whose first moments exist, then D_{KL}(P\|Q) \ge \Psi_Q^*(\mu'_1(P)), where \Psi_Q^* is the rate function of Q, i.e. the convex conjugate of its cumulant-generating function, and \mu'_1(P) is the first moment of P.
lexicalizationeng: Kullback's inequality
instance of: Statistical inequalities
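The bound in the gloss can be checked concretely. As a minimal sketch (not part of the Lexvo entry), take Q = N(0, 1), whose cumulant-generating function is Ψ_Q(t) = t²/2, so its convex conjugate (the rate function) is Ψ_Q*(x) = x²/2. For P = N(μ, σ²), both sides of Kullback's inequality have closed forms, and the inequality reduces to σ² − 1 − ln σ² ≥ 0, which always holds:

```python
import math

def kl_gaussian(mu_p, sigma_p, mu_q=0.0, sigma_q=1.0):
    # Closed-form KL divergence D_KL(P || Q) between univariate Gaussians.
    return (math.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)
            - 0.5)

def rate_function_std_normal(x):
    # Rate function of Q = N(0, 1): the convex conjugate of
    # Psi_Q(t) = t^2/2 is Psi_Q*(x) = sup_t (t*x - t^2/2) = x^2/2.
    return x**2 / 2

# Kullback's inequality: D_KL(P || Q) >= Psi_Q*(mu'_1(P)),
# where mu'_1(P) = mu is the first moment of P. (Illustrative values.)
for mu, sigma in [(0.5, 1.0), (1.0, 2.0), (-2.0, 0.5)]:
    lhs = kl_gaussian(mu, sigma)
    rhs = rate_function_std_normal(mu)
    assert lhs >= rhs - 1e-12
    print(f"mu={mu:+.1f}, sigma={sigma:.1f}: D_KL={lhs:.4f} >= rate={rhs:.4f}")
```

Note that when σ = 1 the two sides coincide (D_KL = μ²/2 = Ψ_Q*(μ)), so the bound is tight within this family.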



Lexvo © 2008-2025 Gerard de Melo.