Lucas Fidon edited However_the_previous_definitions_are__.tex  almost 8 years ago

Commit id: 9a1744ba08e8ab6dcbe74d85f87a4c0316fb4e73

\subsubsection{Properties}

Mutual information has the following properties:

\begin{enumerate}
\item
\[ MI(X,Y) = MI(Y,X) \quad \text{(symmetry)} \]
\item
\[ MI(X,X) = S(X) \]
The amount of information a random variable shares with itself is simply its entropy (a short derivation is given after this list).
\item
\[ MI(X,Y) \leq S(X), \quad MI(X,Y) \leq S(Y) \]
The amount of information shared by two random variables cannot be greater than the information contained in either one of them alone.
\end{enumerate}
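As a brief check of the second property, here is a sketch assuming the entropy-based identity $MI(X,Y) = S(X) + S(Y) - S(X,Y)$, which follows from the previous definitions:
\[
MI(X,X) = S(X) + S(X) - S(X,X) = 2\,S(X) - S(X) = S(X),
\]
where we used that the joint entropy of a variable with itself reduces to its own entropy, $S(X,X) = S(X)$, since once $X$ is known there is no remaining uncertainty about its second copy.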