However, the previous definitions are well-nigh impossible to use in practice. Fortunately, it can be shown that the mutual information between $X$ and $Y$ can be expressed as:
\[ MI(X,Y) = \sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)P_{Y}(y)}\Bigg) \]
This form can be interpreted as the Kullback--Leibler divergence between the joint distribution $P_{(X,Y)}$ and the product of the marginals $P_{X}P_{Y}$, i.e.\ the joint distribution that would hold if $X$ and $Y$ were independent. Mutual information is therefore a measure of \textit{dependence} between two
random variables (or, more generally, between two probability distributions).
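As a minimal numerical sketch of the sum above (not part of the original text; the function name and the NumPy-based implementation are illustrative assumptions), mutual information can be computed directly from a table of joint probabilities $P_{(X,Y)}(x,y)$:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in nats) of a discrete joint distribution.

    `joint` is a 2-D array with joint[i, j] = P(X = x_i, Y = y_j).
    """
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal P_X(x), column vector
    py = joint.sum(axis=0, keepdims=True)  # marginal P_Y(y), row vector
    prod = px * py                         # product of marginals P_X(x) P_Y(y)
    mask = joint > 0                       # by convention, 0 * log(0) = 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / prod[mask])))

# If X and Y are independent, the joint factorises and MI = 0.
p_indep = np.outer([0.5, 0.5], [0.3, 0.7])

# Two perfectly correlated fair coins: MI = log 2.
p_dep = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
```

The two examples match the interpretation above: a factorised joint gives zero mutual information, while full dependence between two fair coins gives $\log 2$ nats.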
\subsubsection{Properties}
Mutual information has the following properties: