The most intuitive definition is the following. Let $X : P_1 \rightarrow E_1$ and $Y : P_2 \rightarrow E_2$ be two random variables, where $P_1$ and $P_2$ are probability spaces and $E_1$ and $E_2$ are discrete sets. We define the mutual information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[
I(X,Y) = \sum_{x \in E_1,\, y \in E_2} P_{(X,Y)}(x,y)\, \log\left(\frac{P_{(X,Y)}(x,y)}{P_X(x)\, P_Y(y)}\right)
\]
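As an illustration (a worked example with assumed values, not part of the original text), take $X$ and $Y$ binary with joint distribution $P_{(X,Y)}(0,0) = P_{(X,Y)}(1,1) = \frac{1}{2}$ and zero elsewhere, so that the marginals are uniform: $P_X(0) = P_X(1) = P_Y(0) = P_Y(1) = \frac{1}{2}$. The sum then has two nonzero terms:
\[
I(X,Y) = \frac{1}{2}\log\frac{1/2}{1/4} + \frac{1}{2}\log\frac{1/2}{1/4} = \log 2,
\]
which equals one bit when the logarithm is taken in base $2$, matching the intuition that observing $X$ completely determines $Y$.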