Lucas Fidon edited subsection_Mutual_Information_definition_and__.tex  almost 8 years ago

Commit id: fe9b2a9bd176982f734d49a395e911e8030f55f6

\[ S(X) = -\sum_{x \in E} P_{X}(x)\log\big(P_{X}(x)\big) \]
It is noteworthy that, in particular, the entropy of the joint distribution of two random variables $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ is defined as:
\[ S(X,Y) = -\sum_{(x,y) \in E_1 \times E_2} P_{(X,Y)}(x,y)\log\big(P_{(X,Y)}(x,y)\big) \]
(A short worked example of these definitions is given below.) The entropy has three interpretations:
\begin{itemize}
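As a quick illustration of the definitions above (the fair-coin setting is an assumed example, not taken from the original text), let $X$ be a fair coin flip, so $E = \{0,1\}$ and $P_{X}(0) = P_{X}(1) = \tfrac{1}{2}$. Then
\[ S(X) = -\Big(\tfrac{1}{2}\log\tfrac{1}{2} + \tfrac{1}{2}\log\tfrac{1}{2}\Big) = \log 2. \]
If $Y$ is a second, independent fair coin flip, then $P_{(X,Y)}(x,y) = \tfrac{1}{4}$ for each of the four pairs $(x,y) \in \{0,1\}^2$, so
\[ S(X,Y) = -\sum_{(x,y)} \tfrac{1}{4}\log\tfrac{1}{4} = \log 4 = S(X) + S(Y), \]
which illustrates that the joint entropy of independent variables is the sum of their entropies.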