Needs:
Real Positive Semidefinite Matrix Order
Differential Entropy
Probability Measures
Needed by:
Multivariate Normal Mutual Informations

Multivariate Normal Entropy

Result

Suppose $x: \Omega \to \R ^d$ has normal density $g: \R ^d \to \R $ on probability space $(\Omega , \mathcal{A} , \mathbfsf{P} )$. Suppose $x$ has mean $\mu \in \R ^d$ and covariance $\Sigma \succ 0$. Then the entropy of $x$ is

\[ h(g) = -\int g \log g = \frac{1}{2} \log ((2\pi e)^d \det \Sigma ) \]
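A short sketch of the standard derivation (not spelled out in the source): write out $-\log g$ and take its expectation. Since

\[ -\log g(x) = \frac{1}{2} \log ((2\pi )^d \det \Sigma ) + \frac{1}{2}(x - \mu )^\top \Sigma ^{-1}(x - \mu ), \]

and the expected quadratic form is $\mathbfsf{E} \left[(x - \mu )^\top \Sigma ^{-1}(x - \mu )\right] = \operatorname{tr}(\Sigma ^{-1}\Sigma ) = d$, we obtain

\[ h(g) = \frac{1}{2} \log ((2\pi )^d \det \Sigma ) + \frac{d}{2} = \frac{1}{2} \log ((2\pi e)^d \det \Sigma ). \]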

This result gives the multivariate normal (or Gaussian) entropy.
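The closed form can be checked numerically by estimating $-\mathbfsf{E}[\log g(x)]$ with Monte Carlo samples. The following sketch uses NumPy; the particular mean and covariance are illustrative choices, not values from the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (an assumption for this example, not from the text).
d = 2
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])  # symmetric positive definite

# Closed form: h(g) = (1/2) log((2*pi*e)^d det Sigma)
closed_form = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(Sigma))

# Monte Carlo estimate of -E[log g(x)] under x ~ N(mu, Sigma).
xs = rng.multivariate_normal(mu, Sigma, size=200_000)
diff = xs - mu
quad = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)
log_g = -0.5 * (d * np.log(2 * np.pi) + np.log(np.linalg.det(Sigma)) + quad)
mc_estimate = -log_g.mean()

print(closed_form, mc_estimate)
```

With this sample size the two values should agree to roughly two decimal places.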

Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc