Lucas Fidon edited subsection_Mutual_Information_definition_and__.tex  almost 8 years ago

Commit id: cd6e6a65c217e156d33a79efc2325884fc8f66b9

Shannon introduced entropy as a measure of the quantity of information carried by a random variable. Let $X: P \rightarrow E$ be a random variable, where $P$ is a probability space and $E$ is a discrete set of values. The entropy of $X$, denoted $S(X)$, is defined as:
\[ S(X) = -\sum_{x \in E} P_{X}(x)\log\left(P_{X}(x)\right) \]
The entropy has three interpretations:
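Before turning to those interpretations, a short worked example may help fix the definition; the fair-coin distribution used here is chosen purely for illustration and is not part of the text above. For $X$ uniform on two outcomes, taking the logarithm in base 2 so that entropy is measured in bits,
% Illustrative example (fair coin), added for clarity; not from the original definition.
\[ S(X) = -\tfrac{1}{2}\log_{2}\tfrac{1}{2} - \tfrac{1}{2}\log_{2}\tfrac{1}{2} = \log_{2} 2 = 1 \text{ bit}. \]
More generally, if $X$ is uniform on a finite set $E$ with $|E| = n$, then $S(X) = \log n$, which is the largest entropy any distribution on $n$ outcomes can attain, whereas a deterministic variable has entropy $0$.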