\subsubsection{Mutual Information}

There are several equivalent ways to define mutual information. The most intuitive definition is the following. Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are discrete spaces. We define the mutual information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[
I(X,Y) = \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y) \, \log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x) \, P_{Y}(y)}\Bigg)
\]
where $P_{(X,Y)}$ denotes the joint distribution of $(X,Y)$ and $P_{X}$, $P_{Y}$ the marginal distributions of $X$ and $Y$.
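For a concrete sense of the definition, here is a minimal worked example (the joint distribution below is an illustrative choice, not taken from the text above). Take $E_1 = E_2 = \{0, 1\}$ and let $X = Y$ be uniform, so that $P_{(X,Y)}(0,0) = P_{(X,Y)}(1,1) = \frac{1}{2}$ and $P_{(X,Y)}(0,1) = P_{(X,Y)}(1,0) = 0$. The marginals are then $P_{X}(0) = P_{X}(1) = P_{Y}(0) = P_{Y}(1) = \frac{1}{2}$, and the two non-zero terms of the sum give
\[
I(X,Y) = \frac{1}{2} \log\Bigg(\frac{1/2}{\frac{1}{2} \cdot \frac{1}{2}}\Bigg) + \frac{1}{2} \log\Bigg(\frac{1/2}{\frac{1}{2} \cdot \frac{1}{2}}\Bigg) = \log 2 ,
\]
the maximal value for binary variables: observing $X$ determines $Y$ completely. Conversely, if $X$ and $Y$ were independent, then $P_{(X,Y)}(x,y) = P_{X}(x) P_{Y}(y)$ for all $(x,y)$, every logarithm in the sum would vanish, and $I(X,Y) = 0$.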