\subsubsection{Properties}

Mutual information has the following properties:
\begin{enumerate}
\item $MI(X,Y) = MI(Y,X)$ (symmetry).
\item $MI(X,X) = S(X)$. The amount of information a random variable shares with itself is simply its entropy (a short derivation sketch follows this list).
\item $MI(X,Y) \leq S(X)$ and $MI(X,Y) \leq S(Y)$. The amount of information shared by two random variables cannot exceed the information contained in either one of them.
\item $MI(X,Y) \geq 0$. The uncertainty about $X$ cannot be increased by learning about $Y$.
\end{enumerate}
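
As an illustration of the second property, the following is a minimal derivation sketch; it assumes the entropy-based definition $MI(X,Y) = S(X) + S(Y) - S(X,Y)$, and the exact form of the definition given earlier is an assumption here.
% Sketch for property 2, assuming MI(X,Y) = S(X) + S(Y) - S(X,Y);
% the exact form of the earlier definition is an assumption.
\begin{align*}
MI(X,X) &= S(X) + S(X) - S(X,X) && \\
        &= 2\,S(X) - S(X)       && \text{since } S(X,X) = S(X)\\
        &= S(X). &&
\end{align*}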