However, the previous definitions are nearly impossible to use directly in practice. Fortunately, it can be proved that the mutual information between $X$ and $Y$ can also be expressed as:
\[ MI(X,Y) = \sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\Bigg(\frac{P_{(X,Y)}(x,y)}{P_{X}(x)P_{Y}(y)}\Bigg) \]
The interpretation of this form is that it measures the distance between the joint distribution and the joint distribution that would hold if $X$ and $Y$ were independent. It is therefore a measure of \textit{dependence} between two distributions (or random variables).
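As a simple illustration of this formula, take $E_1 = E_2 = \{0,1\}$ with the perfectly dependent joint distribution $P_{(X,Y)}(0,0) = P_{(X,Y)}(1,1) = \frac{1}{2}$ and $P_{(X,Y)}(0,1) = P_{(X,Y)}(1,0) = 0$, so that both marginals are uniform. Only the two nonzero terms contribute (with the convention $0\log 0 = 0$), giving
\[ MI(X,Y) = \frac{1}{2}\log\Bigg(\frac{1/2}{\frac{1}{2}\cdot\frac{1}{2}}\Bigg) + \frac{1}{2}\log\Bigg(\frac{1/2}{\frac{1}{2}\cdot\frac{1}{2}}\Bigg) = \log 2, \]
which is exactly the entropy of either variable: knowing one of them determines the other completely.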
\subsubsection{Properties}
Mutual information has the following properties:
\begin{enumerate}
\item $MI(X,Y) \leq S(Y)$
The amount of information shared by two random variables cannot be greater than the information contained in a single one of those random variables.
\item $MI(X,Y) \geq 0$
The uncertainty about $X$ cannot be increased by learning about $Y$.
\item $MI(X,Y) = 0$ iff $X$ and $Y$ are independent.
When $X$ and $Y$ are not related in any way, no information is gained about one of the random variables when the other is known (a direct verification of one direction is given after this list).
\end{enumerate}
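The last property can be verified directly in one direction: if $X$ and $Y$ are independent, then $P_{(X,Y)}(x,y) = P_X(x)P_Y(y)$ for every pair $(x,y)$, so each ratio inside the logarithm equals $1$ and
\[ MI(X,Y) = \sum_{(x,y) \in E_1\times E_2} P_X(x)P_Y(y)\log(1) = 0. \]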
For more information about mutual information, the reader can refer to \cite{Pluim_2003}; for further details on multivariate mutual information, see \cite{srinivasa2005review}.