Lucas Fidon edited subsubsection_Mutual_Information_There_are__.tex  almost 8 years ago

Commit id: ca0f523b835c1e3fe548c9a6900b780f4813bf82


       

The most intuitive definition is the following. Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are two discrete probability spaces. We define the mutual information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[
I(X,Y) = \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y)\,\log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)\,P_{Y}(y)}\Bigg)
\]
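As a sanity check, the sum above can be evaluated directly from a joint distribution. A minimal Python sketch (the function name and the dictionary encoding of $P_{(X,Y)}$ are illustrative choices, not from the source):

```python
import math

def mutual_information(joint):
    """I(X, Y) for a discrete joint distribution.

    `joint` maps pairs (x, y) to probabilities P_{(X,Y)}(x, y).
    """
    # Marginals P_X and P_Y obtained by summing out the other variable
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # Sum of P(x,y) * log( P(x,y) / (P_X(x) * P_Y(y)) ),
    # skipping zero-probability pairs (their contribution is 0 by convention)
    return sum(
        p * math.log(p / (px[x] * py[y]))
        for (x, y), p in joint.items()
        if p > 0
    )

# Independent X and Y: every term's log argument is 1, so I(X, Y) = 0
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

# Fully dependent case (Y = X, each value with probability 1/2): I(X, Y) = log 2
dep = {(0, 0): 0.5, (1, 1): 0.5}
```

The two test distributions illustrate the extremes: mutual information vanishes for independent variables and equals the (shared) entropy when one variable determines the other.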