...
The interpretation of this form is that it measures the distance between the joint distribution and the joint distribution that would hold if $X$ and $Y$ were independent, i.e.\ the product of the marginals. It is therefore a measure of the \textit{dependence} between two distributions (or random variables).
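For concreteness, and assuming the discrete setting and entropy notation $S$ used above, this interpretation can be made explicit by writing mutual information as a Kullback--Leibler divergence between the joint distribution and the product of the marginals:
\[
MI(X,Y) \;=\; D_{KL}\big(p(x,y)\,\|\,p(x)\,p(y)\big) \;=\; \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},
\]
which is zero exactly when $p(x,y) = p(x)\,p(y)$ for all $x$ and $y$.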
\subsubsection{Properties}
Mutual information has the following properties:
\begin{enumerate}
\item \[MI(X,Y) = MI(Y,X) \quad \text{(symmetry)}\]
\item \[MI(X,X) = S(X) \]
...
\item \[MI(X,Y) \leq S(X), \quad MI(X,Y) \leq S(Y)\]
The amount of information shared by two random variables cannot be greater than the information contained in either one of them alone.
\item \[MI(X,Y) \geq 0\]
The uncertainty about $X$ cannot be increased by learning about $Y$.
\item \[MI(X,Y) = 0 \iff X \text{ and } Y \text{ are independent}\]
(see the worked example after this list).
\end{enumerate}
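As a small worked check of these properties (an illustrative sketch assuming fair binary variables and logarithms in base~2, not part of the original derivation): if $X$ and $Y$ are independent fair coins, then $p(x,y) = \tfrac{1}{4} = p(x)p(y)$ for every pair, so
\[
MI(X,Y) = \sum_{x,y} \tfrac{1}{4}\,\log_2\frac{1/4}{(1/2)(1/2)} = 0,
\]
consistent with the last property. If instead $Y = X$, then $p(0,0) = p(1,1) = \tfrac{1}{2}$ and
\[
MI(X,Y) = 2 \times \tfrac{1}{2}\,\log_2\frac{1/2}{(1/2)(1/2)} = 1 = S(X),
\]
which illustrates both $MI(X,X) = S(X)$ and the upper bound $MI(X,Y) \leq S(X)$.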