However, the previous definitions are well nigh impossible to use in practice. Fortunately, it can be proved that the mutual information between $X$ and $Y$ can be expressed as:
\[
MI(X,Y) = \sum_{(x,y) \in E_1\times E_2} P_{(X,Y)}(x,y) \log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)P_{Y}(y)}\Bigg)
\]
This form can be interpreted as a measure of the distance between the joint distribution $P_{(X,Y)}$ and the joint distribution that would hold if $X$ and $Y$ were independent, namely the product of the marginals $P_{X}P_{Y}$. It is therefore a measure of the \textit{dependence} between two distributions (or random variables).

\subsubsection{Properties}

Mutual information has the following properties:
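As a small worked illustration of the formula above (an added example, assuming the usual convention $0\log 0 = 0$), take $E_1 = E_2 = \{0,1\}$ with $P_{(X,Y)}(0,0) = P_{(X,Y)}(1,1) = \tfrac{1}{2}$ and $P_{(X,Y)}(0,1) = P_{(X,Y)}(1,0) = 0$, so that $P_X(x) = P_Y(y) = \tfrac{1}{2}$ for every $x$ and $y$. Then
\[
MI(X,Y) = \tfrac{1}{2}\log\Bigg(\frac{1/2}{\tfrac{1}{2}\cdot\tfrac{1}{2}}\Bigg) + \tfrac{1}{2}\log\Bigg(\frac{1/2}{\tfrac{1}{2}\cdot\tfrac{1}{2}}\Bigg) = \log 2,
\]
whereas if $X$ and $Y$ were independent, every ratio inside the logarithm would equal $1$ and the sum would vanish, giving $MI(X,Y) = 0$.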