Lucas Fidon edited subsubsection_Mutual_Information_There_are__.tex  almost 8 years ago

Commit id: c8926e2380842e5c72cae4c0cc3bdacbe0fcabad

The most intuitive definition is the following. Let $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ be two random variables, where $E_{1}$ and $E_{2}$ are two discrete probability spaces. We define the Mutual Information between $X$ and $Y$, denoted $MI(X,Y)$, as: \[ MI(X,Y) = S(X) + S(Y) - S(X,Y) \] or equivalently as:
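As a minimal numerical sketch of this definition (the function names \texttt{entropy} and \texttt{mutual\_information} are our own, not from the text), $MI(X,Y) = S(X) + S(Y) - S(X,Y)$ can be evaluated directly from a joint probability table of two discrete random variables:

```python
import numpy as np

def entropy(p):
    """Shannon entropy S of a discrete distribution, in bits."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]  # by convention, 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """MI(X,Y) = S(X) + S(Y) - S(X,Y), given the joint table p_xy."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)  # marginal distribution of X
    p_y = p_xy.sum(axis=0)  # marginal distribution of Y
    return entropy(p_x) + entropy(p_y) - entropy(p_xy)

# Two perfectly correlated fair bits share exactly 1 bit of information.
p_correlated = np.array([[0.5, 0.0],
                         [0.0, 0.5]])
print(mutual_information(p_correlated))  # -> 1.0
```

For independent variables the joint entropy satisfies $S(X,Y) = S(X) + S(Y)$, so the same function returns $0$, matching the intuition that independent variables carry no information about each other.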