% Edited by Lucas Fidon almost 8 years ago, commit d59793ea2c91e37cf40ff6e7621e7b749d5462f5
\subsubsection{Mutual Information}
There are several equivalent ways to define Mutual Information.
The most intuitive definition is the following:
Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are two discrete probability spaces.
We define the Mutual Information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[ I(X,Y) = \sum_{x \in E_{1},\, y \in E_{2}} P_{(X,Y)}(x,y) \log\left(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)\, P_{Y}(y)}\right) \]
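As a short sanity check of this definition (assuming a base-2 logarithm for the numeric value), take $X = Y$ uniform on $\{0,1\}$, so that $P_{(X,Y)}(x,y) = \frac{1}{2}$ if $x = y$ and $0$ otherwise, while $P_{X}(x) = P_{Y}(y) = \frac{1}{2}$. Only the two diagonal terms of the sum are nonzero, giving
\[ I(X,Y) = 2 \cdot \frac{1}{2} \log\left(\frac{1/2}{\frac{1}{2}\cdot\frac{1}{2}}\right) = \log 2 , \]
that is, $1$ bit: observing $Y$ reveals $X$ completely.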