Needs:
Multivariate Normal Entropy
Differential Mutual Information
Normal Correlation
Needed by:
None.

Multivariate Normal Mutual Informations

Why

What is the differential mutual information between two components of a multivariate normal?

Result

Let $g \sim \mathcal{N}(\mu, \Sigma)$. Then the differential mutual information between components $i$ and $j$ is

\[ I(g_i; g_j) = -\frac{1}{2}\ln(1 - \rho_{ij}^2) \]

where $\rho_{ij} = \Sigma_{ij} / \sqrt{\Sigma_{ii}\Sigma_{jj}}$ is the correlation between components $i$ and $j$.
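
As a sketch of why this holds (using the entropy and correlation results listed under Needs, and writing $\sigma_i^2 = \Sigma_{ii}$): the pair $(g_i, g_j)$ is bivariate normal with covariance determinant $\sigma_i^2 \sigma_j^2 (1 - \rho_{ij}^2)$, so

\[ \begin{aligned} I(g_i; g_j) &= h(g_i) + h(g_j) - h(g_i, g_j) \\ &= \frac{1}{2}\ln(2\pi e \sigma_i^2) + \frac{1}{2}\ln(2\pi e \sigma_j^2) - \frac{1}{2}\ln\!\left((2\pi e)^2 \sigma_i^2 \sigma_j^2 (1 - \rho_{ij}^2)\right) \\ &= -\frac{1}{2}\ln(1 - \rho_{ij}^2). \end{aligned} \]

For example, $\rho_{ij} = 1/2$ gives $-\frac{1}{2}\ln(3/4) \approx 0.144$ nats; $\rho_{ij} = 0$ gives zero, consistent with uncorrelated jointly normal components being independent, and the mutual information diverges as $\rho_{ij} \to \pm 1$.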