Consider two distributions on the same finite set. The cross entropy of the first distribution relative to the second is the expectation, under the second distribution, of the negative logarithm of the first.
Suppose $p, q : A \to \R$ are distributions on the finite set $A$.
We denote the cross entropy of $p$ relative to
$q$ by $H(q, p)$; in symbols,
\[
H(q, p) := -\sum_{a \in A} q(a) \log p(a).
\]
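As a quick worked example (the specific numbers are illustrative, not drawn from the text above), take $A = \{0, 1\}$, let $p$ be uniform, so $p(0) = p(1) = \tfrac{1}{2}$, and let $q(0) = \tfrac{3}{4}$, $q(1) = \tfrac{1}{4}$. Then
\[
H(q, p) = -\tfrac{3}{4} \log \tfrac{1}{2} - \tfrac{1}{4} \log \tfrac{1}{2} = \log 2.
\]
More generally, when $p$ is uniform on $A$, each term contributes $q(a) \log \lvert A \rvert$, so $H(q, p) = \log \lvert A \rvert$ for every distribution $q$.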