Needs:
Outcome Probabilities
Logarithm
Needed by:
Differential Entropy
Optimal Average Codeword Length
Relative Entropy

Discrete Entropy

Definition

The entropy of a distribution is the expectation, under the distribution, of the negative logarithm of the distribution. It is sometimes called the discrete entropy to distinguish it from differential entropy.1

Notation

Let $A$ be a finite set. Let $p:A \to \R $ be a distribution. The entropy of $p$ is

\[ -\sum_{a \in A} p(a) \log(p(a)), \]

with the convention that $0 \log 0 = 0$, so outcomes of probability zero do not contribute.

We denote the entropy of $p$ by $H(p)$.
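As an illustration of the definition, consider the uniform distribution $p(a) = 1/n$ on a set $A$ with $n$ elements. Then

\[ H(p) = -\sum_{a \in A} \frac{1}{n} \log\left(\frac{1}{n}\right) = \log n. \]

In particular, a fair coin ($n = 2$) has entropy $\log 2$, which equals $1$ when the logarithm is taken base $2$.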

Properties

Let $x: \Omega \to V$ be a discrete random variable, and write $H(x)$ for the entropy of the distribution of $x$.

  1. $H(x) \geq 0$, since each term $-p(a) \log(p(a))$ is nonnegative when $0 \leq p(a) \leq 1$.
  2. For any function $f$, $H(f(x)) \leq H(x)$; applying a function can only merge outcomes, never split them (see the example below).
  3. For invertible $g$, $H(g(x)) = H(x)$.
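To illustrate the second and third properties, let $x$ be uniform on $\{1, 2, 3, 4\}$, so $H(x) = \log 4$. The map $f(v) = \lceil v/2 \rceil$ merges outcomes in pairs, so $f(x)$ is uniform on $\{1, 2\}$ and $H(f(x)) = \log 2 < \log 4$. An invertible map such as $g(v) = v + 1$ only relabels the outcomes; the multiset of probabilities, and hence the entropy, is unchanged, so $H(g(x)) = H(x)$.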


  1. Future editions may not forward reference differential entropy. ↩︎