The amount of information a random variable shares with itself is simply the entropy of $X$.
\item $MI(X,Y) \leq S(X)$ and $MI(X,Y) \leq S(Y)$.

The amount of information shared by two random variables cannot be greater than the information contained in either of them alone.
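Both bounds follow in one line, assuming the standard identity $MI(X,Y) = S(X) - S(X \mid Y)$ (where $S(X \mid Y)$ denotes the conditional entropy, a notation introduced here) and the non-negativity of conditional entropy for discrete random variables:
\begin{align*}
MI(X,Y) &= S(X) - S(X \mid Y) \leq S(X),\\
MI(X,Y) &= S(Y) - S(Y \mid X) \leq S(Y),
\end{align*}
since $S(X \mid Y) \geq 0$ and $S(Y \mid X) \geq 0$.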