We define the Mutual Information of $X$ and $Y$, denoted $I(X,Y)$, as:
\[ I(X,Y) = S(X) + S(Y) - S(X,Y) \]
Another definition of $I(X,Y)$ is:
\[ I(X,Y) = \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y) \log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)P_{Y}(y)}\Bigg) \]
Thus it may be interpreted as the amount of information shared by the random variables $X$ and $Y$.
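As a sanity check of this interpretation, if $X$ and $Y$ are independent then $P_{(X,Y)}(x,y) = P_{X}(x)P_{Y}(y)$ for every $(x,y) \in E_1 \times E_2$, so each term of the sum contains $\log(1) = 0$ and:
\[ I(X,Y) = \sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y) \log(1) = 0 \]
which matches the intuition that independent random variables share no information.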