Lucas Fidon edited subsubsection_Mutual_Information_There_are__.tex
almost 8 years ago
Commit id: 8be77af45c526efebda2de0e4557a8f0c7595718
There are several equivalent ways to define Mutual Information.
The most intuitive definition is the following.
Let $X : P_{1} \rightarrow E_{1}$ and
$Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are two discrete sets.
We define the Mutual Information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[ I(X,Y) = S(X) + S(Y) - S(X,Y) \]
where $S$ denotes the Shannon entropy and $S(X,Y)$ the joint entropy of the pair $(X,Y)$.
Another, equivalent definition of $I(X,Y)$ is:
\[ I(X,Y) = \sum_{(x,y) \in E_1\times
E_2}P_{(X,Y)}(x,y)\,\log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)\,P_{Y}(y)}\Bigg) \]
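The equivalence of the two definitions can be checked numerically. The following Python sketch uses a small, purely illustrative joint distribution on $\{0,1\}\times\{0,1\}$ (the values of \texttt{p\_xy} are assumptions, not from the text) and computes $I(X,Y)$ both as $S(X)+S(Y)-S(X,Y)$ and as the weighted sum of log-ratios:

```python
import math

# Illustrative joint distribution P_{(X,Y)} on E_1 x E_2 = {0,1} x {0,1}
# (these probabilities are arbitrary example values).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions P_X and P_Y obtained by summing out the other variable.
p_x = {x: sum(p for (a, _), p in p_xy.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in p_xy.items() if b == y) for y in (0, 1)}

def entropy(dist):
    """Shannon entropy S of a discrete distribution, in nats."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# Definition 1: I(X,Y) = S(X) + S(Y) - S(X,Y)
mi_entropy = entropy(p_x) + entropy(p_y) - entropy(p_xy)

# Definition 2: sum over (x,y) of P(x,y) * log( P(x,y) / (P_X(x) * P_Y(y)) )
mi_sum = sum(p * math.log(p / (p_x[x] * p_y[y]))
             for (x, y), p in p_xy.items() if p > 0)

print(mi_entropy, mi_sum)  # the two values coincide
```

Both expressions give the same value, and for this distribution (where $X$ and $Y$ are correlated) the mutual information is strictly positive.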