Lucas Fidon edited subsubsection_Mutual_Information_There_are__.tex almost 8 years ago
Commit id: 1bd15bfdb156adee7ae513e2049adf60cdc4d866
diff --git a/subsubsection_Mutual_Information_There_are__.tex b/subsubsection_Mutual_Information_There_are__.tex
index 77eccc3..6dec5b9 100644
--- a/subsubsection_Mutual_Information_There_are__.tex
+++ b/subsubsection_Mutual_Information_There_are__.tex
...
We define the Mutual Information between $X$ and $Y$, denoted $MI(X,Y)$, as:
\[ MI(X,Y) = S(X) + S(Y) - S(X,Y) \]
or, equivalently, as:
\[ MI(X,Y) = S(X) - S(X|Y) \]
or symmetrically as:
\[ MI(X,Y) = S(Y) - S(Y|X) \]
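The equivalence of these three expressions follows from the chain rule for entropy, $S(X,Y) = S(Y) + S(X|Y) = S(X) + S(Y|X)$; a short derivation of the second form (the third is symmetric):
\[ MI(X,Y) = S(X) + S(Y) - S(X,Y) = S(X) + S(Y) - \big( S(Y) + S(X|Y) \big) = S(X) - S(X|Y). \]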
Thus it may be interpreted as the amount of uncertainty (or information) shared by the random variables $X$ and $Y$.
The first form of mutual information contains the term $-S(X,Y)$, which means that the lower the joint entropy, the higher the mutual information.
Furthermore, the latter form, with the term $-S(X|Y)$, can be read as ``the amount of uncertainty about $X$, minus the uncertainty about $X$ that remains once $Y$ is known''.
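For instance, if $X$ and $Y$ are two perfectly correlated fair coin flips, then
\[ S(X) = S(Y) = S(X,Y) = 1 \text{ bit} \quad\Rightarrow\quad MI(X,Y) = 1 + 1 - 1 = 1 \text{ bit}, \]
whereas if they are independent, $S(X,Y) = 2$ bits and $MI(X,Y) = 0$: knowing $Y$ tells us nothing about $X$.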