\item \[MI(X,Y) = MI(Y,X)\] (symmetry)
\item \[MI(X,X) = S(X)\] The amount of information a random variable shares with itself is simply the entropy of $X$.
\item \[MI(X,Y) \leq S(X), \qquad MI(X,Y) \leq S(Y)\] The amount of information shared by two random variables cannot be greater than the information contained in either one of them alone.
\item \[MI(X,Y) \geq 0\] The uncertainty about $X$ cannot be increased by learning about $Y$.
\item \[MI(X,Y) = 0\] if and only if $X$ and $Y$ are independent.
\end{enumerate}
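As a brief consistency check, the first two properties can be verified directly from the entropy-based identity $MI(X,Y) = S(X) + S(Y) - S(X,Y)$ (a sketch, assuming this is the form of the definition given earlier):
\[
MI(Y,X) = S(Y) + S(X) - S(Y,X) = S(X) + S(Y) - S(X,Y) = MI(X,Y),
\]
\[
MI(X,X) = S(X) + S(X) - S(X,X) = 2S(X) - S(X) = S(X),
\]
and the upper bound follows from rewriting $MI(X,Y) = S(X) - S(X|Y)$ together with the non-negativity of the conditional entropy, $S(X|Y) \geq 0$.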