The most intuitive definition is the following. Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are two discrete sets, and assume $X$ and $Y$ admit a joint law $P_{(X,Y)}$. We define the mutual information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[
I(X,Y) = \sum_{x \in E_{1},\, y \in E_{2}} P_{(X,Y)}(x,y)\, \log\!\left(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)\, P_{Y}(y)}\right)
\]
where $P_{X}$ and $P_{Y}$ denote the marginal laws of $X$ and $Y$.
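As a quick worked example (not in the original text), take $E_{1} = E_{2} = \{0,1\}$ with $X = Y$ a fair coin flip, so that $P_{(X,Y)}(0,0) = P_{(X,Y)}(1,1) = \tfrac{1}{2}$ and the two mixed pairs have probability $0$. Using the standard convention $0 \log 0 = 0$ for the zero-probability pairs, the sum reduces to:
\[
I(X,Y) = \frac{1}{2}\log\!\left(\frac{1/2}{\tfrac{1}{2}\cdot\tfrac{1}{2}}\right) + \frac{1}{2}\log\!\left(\frac{1/2}{\tfrac{1}{2}\cdot\tfrac{1}{2}}\right) = \log 2,
\]
which equals $1$ bit when the logarithm is taken in base $2$: observing $Y$ reveals $X$ completely.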