The entropy of a distribution is the expectation of the negative logarithm of the distribution under the distribution itself. It is sometimes called the discrete entropy to distinguish it from the differential entropy of a continuous distribution.1
Let $A$ be a finite set.
Let $p: A \to \R$ be a distribution, so that $p(a) \geq 0$ for every $a \in A$ and $\sum_{a \in A} p(a) = 1$.
The entropy of $p$ is
\[
-\sum_{a \in A} p(a) \log(p(a)),
\]
with the convention that $0 \log(0) = 0$.
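For example, the uniform distribution on the two-element set $A = \{0, 1\}$, with $p(0) = p(1) = \tfrac{1}{2}$, has entropy
\[
-\tfrac{1}{2} \log(\tfrac{1}{2}) - \tfrac{1}{2} \log(\tfrac{1}{2}) = \log(2).
\]
More generally, the uniform distribution on a finite set with $n$ elements has entropy $\log(n)$, the largest entropy of any distribution on that set.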
Let $x: \Omega \to V$ be a discrete random variable. The entropy of $x$ is the entropy of its distribution, the function sending each $v \in V$ to $P(x = v)$:
\[
-\sum_{v \in V} P(x = v) \log(P(x = v)).
\]
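For instance (taking $\log$ to be the natural logarithm), a biased coin $x$ with $P(x = 1) = \tfrac{3}{4}$ and $P(x = 0) = \tfrac{1}{4}$ has entropy
\[
-\tfrac{3}{4} \log(\tfrac{3}{4}) - \tfrac{1}{4} \log(\tfrac{1}{4}) = 2 \log(2) - \tfrac{3}{4} \log(3) \approx 0.562,
\]
which is less than the fair coin's $\log(2) \approx 0.693$; skewing the distribution lowers its entropy.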