It is noteworthy that, in particular, the entropy of the joint probability law of two random variables $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ is defined as:
\[ S(X,Y)=-\sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\big(P_{(X,Y)}(x,y)\big) \]
Moreover, the entropy of the probability law of $X$ conditionally on the probability law of $Y$ is defined as:
\[ S(X|Y)=-\sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\big(P_{X|Y}(x|y)\big) \]
where $P_{X|Y}(x|y)=P_{(X,Y)}(x,y)/P_{Y}(y)$ whenever $P_{Y}(y)>0$.

Somewhat imprecisely, it is customary to refer to the entropy of the probability law of a random variable $X$ simply as \textit{the entropy of $X$}; hence the notation $S(X)$ for the entropy of $X$.

The entropy of a probability law (or of a random variable) has three interpretations:
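Before turning to these interpretations, here is a minimal numerical sketch of the two definitions above, computing $S(X,Y)$ and $S(X|Y)$ for a small joint distribution. The table \texttt{p\_xy}, its values, and the variable names are illustrative assumptions, not part of the text; natural logarithms are used, so the results are in nats.

\begin{verbatim}
import math

# Hypothetical joint probability table P_(X,Y) over E1 x E2 (values are illustrative).
p_xy = {
    ("x1", "y1"): 0.25, ("x1", "y2"): 0.25,
    ("x2", "y1"): 0.40, ("x2", "y2"): 0.10,
}

# Joint entropy: S(X,Y) = - sum_{(x,y)} P_(X,Y)(x,y) * log P_(X,Y)(x,y)
S_XY = -sum(p * math.log(p) for p in p_xy.values() if p > 0)

# Marginal law of Y: P_Y(y) = sum_x P_(X,Y)(x,y)
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# Conditional entropy: S(X|Y) = - sum_{(x,y)} P_(X,Y)(x,y) * log( P_(X,Y)(x,y) / P_Y(y) )
S_X_given_Y = -sum(
    p * math.log(p / p_y[y]) for (x, y), p in p_xy.items() if p > 0
)

print(f"S(X,Y) = {S_XY:.4f} nats")
print(f"S(X|Y) = {S_X_given_Y:.4f} nats")
\end{verbatim}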