Needs:
Logarithm
Outcome Probabilities
Needed by:
Relative Entropy

Cross Entropy of Probability Distributions

Definition

Consider two distributions on the same finite set. The cross entropy of the first distribution relative to the second is the expectation, under the second distribution, of the negative logarithm of the first.

Notation

Suppose $p: A \to \R$ and $q: A \to \R$ are distributions on the finite set $A$. We denote the cross entropy of $p$ relative to $q$ by $H(q, p)$; in symbols,

\[ H(q, p) := -\sum_{a \in A} q(a) \log p(a) \]
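As an illustration (the particular distributions here are chosen only for this sketch), take $A = \{0, 1\}$ with $q(0) = q(1) = \tfrac{1}{2}$, $p(0) = \tfrac{1}{4}$, and $p(1) = \tfrac{3}{4}$. The definition gives

\[ H(q, p) = -\tfrac{1}{2} \log \tfrac{1}{4} - \tfrac{1}{2} \log \tfrac{3}{4} = \tfrac{1}{2} \log 4 + \tfrac{1}{2} \log \tfrac{4}{3} = \tfrac{1}{2} \log \tfrac{16}{3} \]

where the numerical value depends on the base chosen for the logarithm.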

Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc