The most intuitive definition is the following. Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are two discrete sets of values. We define the Mutual Information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[ I(X,Y) = S(X) + S(Y) - S(X,Y) \]
An equivalent definition of $I(X,Y)$ is:
\[ I(X,Y) = \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y) \log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)\,P_{Y}(y)}\Bigg) \]
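To see why the two definitions coincide, here is a short derivation (a sketch added for clarity, assuming $S$ denotes the Shannon entropy, i.e. $S(X) = -\sum_{x \in E_1} P_X(x)\log P_X(x)$ and likewise for $S(Y)$ and $S(X,Y)$). Using the marginalizations $P_X(x) = \sum_{y \in E_2} P_{(X,Y)}(x,y)$ and $P_Y(y) = \sum_{x \in E_1} P_{(X,Y)}(x,y)$, we can write each entropy as a sum over $E_1 \times E_2$ and obtain:
\begin{align*}
S(X) + S(Y) - S(X,Y) &= \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y)\Big(\log P_{(X,Y)}(x,y) - \log P_X(x) - \log P_Y(y)\Big)\\
&= \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y)\log\frac{P_{(X,Y)}(x,y)}{P_X(x)\,P_Y(y)},
\end{align*}
which is exactly the second definition above.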