Needs:
Relative Entropy
Marginal Probability Distributions
Needed by:
Differential Mutual Information
Mutual Information Graph

Mutual Information

Definition

The mutual information of a joint distribution over two random variables is the relative entropy of the joint distribution with respect to the product of its marginal distributions.

Notation

Let $A$ and $B$ be two non-empty sets. Let $p_{12}: A \times B \to \R$ be a distribution with marginal distributions $p_1: A \to \R$ and $p_2: B \to \R$. The mutual information of $p_{12}$ is $d(p_{12}, p_1 p_2)$, where $d$ denotes the relative entropy and $p_1 p_2: A \times B \to \R$ is the product of the marginal distributions, given by $(p_1 p_2)(a, b) = p_1(a)\,p_2(b)$.
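As a sketch of how the definition unwinds, suppose $A$ and $B$ are finite and that the relative entropy is given by the summation form $d(p, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}$ (an assumption about the convention on the Relative Entropy sheet). Then the mutual information takes the familiar form

\[
d(p_{12}, p_1 p_2) \;=\; \sum_{a \in A} \sum_{b \in B} p_{12}(a, b) \,\log \frac{p_{12}(a, b)}{p_1(a)\, p_2(b)},
\qquad
p_1(a) = \sum_{b \in B} p_{12}(a, b), \quad
p_2(b) = \sum_{a \in A} p_{12}(a, b).
\]

In particular, if $p_{12} = p_1 p_2$ (the two random variables are independent), every logarithm in the sum vanishes and the mutual information is $0$.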

Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc