Lucas Fidon edited subsubsection_Mutual_Information_There_are__.tex
almost 8 years ago
Commit id: df141a6ef1be209116688e52c988bacae52c90e5
The most intuitive definition is the following.
Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $P_{1}$ and $P_{2}$ are probability spaces and $E_{1}$ and $E_{2}$ are discrete sets.
We define the Mutual Information of $X$ and $Y$, denoted $I(X,Y)$ or $MI(X,Y)$, as:
\[
MI(X,Y) = S(X) + S(Y) - S(X,Y) \]
or equivalently as:
\[ MI(X,Y) = S(X) - S(X|Y)\]
or symmetrically as:
\[ MI(X,Y) = S(Y) - S(Y|X)\]
Thus it may be interpreted as the amount of information shared by the random variables $X$ and $Y$.
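As an illustrative sketch (not part of the original text), the first identity $MI(X,Y) = S(X) + S(Y) - S(X,Y)$ can be checked numerically from a joint probability table. The function names below are hypothetical:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X,Y) = S(X) + S(Y) - S(X,Y), with joint[x][y] = P(X=x, Y=y)."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    s_xy = entropy(p for row in joint for p in row)
    return entropy(px) + entropy(py) - s_xy

# Perfectly correlated variables share 1 bit of information:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
# Independent variables share none:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

When $X$ determines $Y$ completely, $S(X|Y) = 0$ and the mutual information equals $S(X)$, matching the interpretation above of shared information.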