Let $X: P \rightarrow E$ be a random variable taking its values in a discrete set $E$. The entropy of the probability law of $X$, denoted $S(X)$, is defined as:
\[ S(X) = -\sum_{x \in E} P_{X}(x)\log\big(P_{X}(x)\big) \]
In particular, the entropy of the joint probability law of two random variables $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ is defined as:
\[ S(X,Y) = -\sum_{(x,y) \in E_1\times E_2} P_{(X,Y)}(x,y)\log\big(P_{(X,Y)}(x,y)\big) \]
Somewhat imprecisely, we refer to the entropy of the probability law of a random variable $X$ simply as \textit{the entropy of $X$}; hence the notation $S(X)$ for the entropy of $X$.
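For instance (a simple worked example of the definitions above), consider a fair coin with $E = \{0,1\}$ and $P_{X}(0) = P_{X}(1) = \tfrac{1}{2}$:
\[ S(X) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2 \]
Likewise, if $X$ and $Y$ are two independent fair coins, then $P_{(X,Y)}(x,y) = \tfrac{1}{4}$ for each of the four outcomes, so $S(X,Y) = \log 4 = 2\log 2$; the joint entropy of independent variables is the sum of their entropies.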