However, the previous definitions are nearly impossible to use in practice. Fortunately, it can be proved that the Mutual Information between $X$ and $Y$ can be expressed as:
\[ MI(X,Y) = \sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)P_{Y}(y)}\Bigg) \]
This form can be interpreted as measuring the distance between the joint distribution and the joint distribution that would hold if $X$ and $Y$ were independent, i.e. the product of the marginals $P_X P_Y$. So it is a measure of \textit{dependence} between two distributions (or random variables).
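As a quick numerical illustration (a hypothetical example, not part of the original text; the logarithm is taken in base 2 here so the result is in bits), consider two binary random variables with joint distribution $P_{(X,Y)}(0,0)=P_{(X,Y)}(1,1)=0.4$ and $P_{(X,Y)}(0,1)=P_{(X,Y)}(1,0)=0.1$, so that both marginals are uniform ($P_X(x)=P_Y(y)=0.5$):
\[ MI(X,Y) = 2\times 0.4\,\log_2\Bigg(\frac{0.4}{0.25}\Bigg) + 2\times 0.1\,\log_2\Bigg(\frac{0.1}{0.25}\Bigg) \approx 0.278 \text{ bits.} \]
Had $X$ and $Y$ been independent, every ratio inside the logarithm would equal $1$ and the mutual information would be $0$.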
\subsubsection{Properties}
Mutual information has the following properties:
\[MI(X,Y) = MI(Y,X) \quad \text{(symmetry)} \]
\[MI(X,X) = S(X) \]
The amount of information a random variable shares with itself is simply the entropy of $X$.
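A one-line sketch of why this holds, directly from the expression above (assuming $P_{(X,X)}(x,x)=P_X(x)$ and that $S(X)=-\sum_{x}P_X(x)\log P_X(x)$ as defined previously):
\[ MI(X,X) = \sum_{x \in E_1}P_X(x)\log\Bigg(\frac{P_X(x)}{P_X(x)P_X(x)}\Bigg) = -\sum_{x \in E_1}P_X(x)\log\big(P_X(x)\big) = S(X). \]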
\[MI(X,Y) \leq S(X), \quad MI(X,Y) \leq S(Y) \]
The amount of information shared by two random variables cannot be greater than the information contained in either one of them alone.
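One way to see this (a sketch, relying on the conditional entropy $S(X \mid Y)$, which is assumed here with its usual definition) is the decomposition
\[ MI(X,Y) = S(X) - S(X \mid Y) \leq S(X), \]
since the conditional entropy $S(X \mid Y)$ is non-negative; the bound by $S(Y)$ follows by symmetry.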
\[