\begin{enumerate}
\item $MI(X,Y) = MI(Y,X)$ (symmetry)
\item $MI(X,X) = S(X)$
The amount of information a random variable shares with itself is simply its entropy.
\item $MI(X,Y) \leq S(X)$,
$MI(X,Y) \leq S(Y)$
The amount of information shared by two random variables cannot exceed the information contained in either one of them alone.
\item
$MI(X,Y) \geq 0$
The uncertainty about $X$ cannot be increased by learning about $Y$.
\item $MI(X,Y) = 0$ iff $X$ and $Y$ are independent.
\end{enumerate}
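Assuming the usual definition $MI(X,Y) = S(X) + S(Y) - S(X,Y)$ from the previous section, properties 2, 1, and 5 can be checked numerically on small joint distributions. The following is a minimal sketch (the function names \texttt{entropy} and \texttt{mutual\_information} are illustrative, not from any particular library):

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """MI(X,Y) = S(X) + S(Y) - S(X,Y), where joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    sxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - sxy

# X = Y, each uniform on {0,1}: MI equals S(X) = 1 bit (property 2)
joint_equal = [[0.5, 0.0], [0.0, 0.5]]
# X and Y independent and uniform: MI = 0 (property 5)
joint_indep = [[0.25, 0.25], [0.25, 0.25]]

print(mutual_information(joint_equal))  # 1.0
print(mutual_information(joint_indep))  # 0.0
```

Symmetry (property 1) follows from the definition being unchanged when the joint table is transposed, and can be checked on any joint distribution the same way.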