The differential mutual information between the $i$th and $j$th components of a multivariate density is the differential relative entropy between the $(i,j)$th bivariate marginal density and the product of the $i$th and $j$th univariate marginal densities.
Let $f: \R ^d \to \R $ be a density, let $f_i$ denote its $i$th univariate marginal and $f_{ij}$ its $(i,j)$th bivariate marginal, and let $d(\cdot , \cdot )$ denote the differential relative
entropy.
The mutual information between components
$i$ and $j$, for $i,j = 1, \dots , d$
with $i \neq j$, is
\[
d(f_{ij}, f_{i}f_{j}).
\]
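As an illustrative sketch (not part of the text above), the definition can be checked numerically for a bivariate standard Gaussian with correlation $\rho$, where the mutual information has the known closed form $-\tfrac{1}{2}\log(1-\rho^2)$. The function name, grid limits, and resolution below are our own choices; the integral $d(f_{ij}, f_i f_j) = \int f_{ij} \log\!\big(f_{ij}/(f_i f_j)\big)$ is approximated by a Riemann sum on a grid.

```python
import numpy as np

def gaussian_mutual_information(rho, lim=6.0, n=400):
    """Grid approximation of d(f_ij, f_i f_j) for a bivariate
    standard Gaussian with correlation rho (a numerical sketch)."""
    x = np.linspace(-lim, lim, n)
    dx = x[1] - x[0]
    X, Y = np.meshgrid(x, x)
    det = 1.0 - rho**2
    # Joint bivariate standard normal density f_ij.
    f_xy = np.exp(-(X**2 - 2*rho*X*Y + Y**2) / (2*det)) / (2*np.pi*np.sqrt(det))
    # Standard normal marginals f_i = f_j, and their product f_i f_j.
    f_x = np.exp(-x**2 / 2) / np.sqrt(2*np.pi)
    f_prod = np.outer(f_x, f_x)
    # d(f_ij, f_i f_j) = \int f_ij log(f_ij / (f_i f_j)) dx dy.
    integrand = f_xy * np.log(f_xy / f_prod)
    return integrand.sum() * dx * dx

rho = 0.5
approx = gaussian_mutual_information(rho)
closed_form = -0.5 * np.log(1 - rho**2)
```

For $\rho = 0.5$ the two values agree to a few decimal places, which gives a quick sanity check that the relative-entropy form above really does measure dependence between the two components.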