We want to extend our notion of entropy (see \sheetref{discrete_entropy}{Discrete Entropy}) to real-valued (continuous) random variables.
The differential entropy of a probability density function is the integral of the density against the negative logarithm of the density. This definition is made to parallel the definition of discrete entropy. If a real-valued random variable has a density, we call the differential entropy of that density the differential entropy of the random variable.
Let $f: \R^n \to \R$ be a probability
density function.
The differential entropy of $f$ is
\[
h(f) = - \int_{\R^n} f(x) \log f(x) \, dx .
\]
Let $x: \Omega \to \R$ be uniform on $[0, 1/2]$. Then $h(x) = \log \tfrac{1}{2} < 0$; in particular, unlike discrete entropy, differential entropy can be negative.
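To verify this, note that the density of $x$ is the constant $2$ on $[0, 1/2]$, so the defining integral evaluates directly (a quick worked check, not part of the original notes):
\[
h(x) = - \int_0^{1/2} 2 \log 2 \, dx = - \log 2 = \log \tfrac{1}{2} .
\]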
For a scalar $a \neq 0$ we have $h(ax) = h(x) + \log\abs{a}$. More generally, for an invertible matrix $A$ we have $h(Ax) = h(x) + \log\abs{\det A}$.
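For the scalar case, here is a sketch of the usual change-of-variables argument (assuming $x$ has density $f$): the density of $ax$ is $y \mapsto \frac{1}{\abs{a}} f(y/a)$, so substituting $y = ax$ gives
\[
h(ax)
= - \int \frac{1}{\abs{a}} f\!\left(\tfrac{y}{a}\right) \log\!\left(\tfrac{1}{\abs{a}} f\!\left(\tfrac{y}{a}\right)\right) dy
= - \int f(x) \log f(x) \, dx + \log\abs{a}
= h(x) + \log\abs{a} ,
\]
using $\int f = 1$. The matrix case is identical with $\abs{\det A}$ in place of $\abs{a}$.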
Even though the value of the differential
entropy itself is not a good analogue of
discrete entropy (it can be negative and shifts under rescaling), differences of differential entropies still are.
In particular, the following holds:
\[
I(X; Y) = h(Y) - h(Y \mid X) = h(X) - h(X \mid Y) .
\]
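As a small illustration of why such differences behave well (this uses, in addition, that conditional differential entropy obeys the same scaling rule): rescaling $Y$ by $a \neq 0$ shifts $h(Y)$ and $h(Y \mid X)$ by the same $\log\abs{a}$, so the mutual information is unchanged:
\[
I(X; aY) = h(aY) - h(aY \mid X) = \bigl(h(Y) + \log\abs{a}\bigr) - \bigl(h(Y \mid X) + \log\abs{a}\bigr) = I(X; Y) .
\]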