However, the previous definitions are well-nigh impossible to use in practice. Fortunately, it can be proved that the Mutual Information between $X$ and $Y$ can be expressed as:
\[ MI(X,Y) = \sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)P_{Y}(y)}\Bigg) \]
This form can be interpreted as a distance between the joint distribution $P_{(X,Y)}$ and the joint distribution that $X$ and $Y$ would have if they were independent, namely $P_{X}P_{Y}$. It is therefore a measure of the \textit{dependence} between two distributions (or random variables). A small numerical illustration is given at the end of this section.

\subsubsection{Properties}

Mutual information has the following properties:
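As a quick sanity check of the expression above (the numbers below are purely illustrative and base-2 logarithms are assumed, so the result is in bits), take two binary variables with joint distribution $P_{(X,Y)}(0,0)=P_{(X,Y)}(1,1)=0.4$ and $P_{(X,Y)}(0,1)=P_{(X,Y)}(1,0)=0.1$. Both marginals are uniform, so $P_{X}(x)P_{Y}(y)=0.25$ for every pair, and
\[
MI(X,Y) = 2\times 0.4\,\log_2\!\left(\frac{0.4}{0.25}\right) + 2\times 0.1\,\log_2\!\left(\frac{0.1}{0.25}\right) \approx 0.54 - 0.26 \approx 0.28 \text{ bits}.
\]
If instead all four joint probabilities were equal to $0.25$, i.e. if $X$ and $Y$ were independent, every logarithm would vanish and $MI(X,Y)$ would be $0$, which is consistent with reading $MI$ as a measure of dependence.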