Lucas Fidon edited subsection_Mutual_Information_definition_and__.tex
almost 8 years ago
Commit id: 78641ad49ae454a77233be3b5a990f248d423cab
diff --git a/subsection_Mutual_Information_definition_and__.tex b/subsection_Mutual_Information_definition_and__.tex
index 939ebc3..b45a534 100644
--- a/subsection_Mutual_Information_definition_and__.tex
+++ b/subsection_Mutual_Information_definition_and__.tex
...
\[ S(X,Y)=-\sum_{(x,y) \in E_1\times E_2}P_{(X,Y)}(x,y)\log\big(P_{(X,Y)}(x,y)\big) \]
Moreover, the entropy of the probability law of $X$ conditional on the probability law of $Y$ is defined as:
\[ S(X|Y)=-\sum_{x \in E_1}P_{X|Y}(x)\log\big(P_{X|Y}(x)\big) \]
Somewhat imprecisely, we will refer to the entropy of the probability law of a random variable $X$ simply as \textit{the entropy of $X$}; hence the notation $S(X)$ for the entropy of $X$.
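The two definitions above can be checked numerically. The sketch below is not from the source; the joint distribution $P_{(X,Y)}$ over $E_1 = E_2 = \{0,1\}$ and the function names are illustrative assumptions, and the conditional entropy follows the single-sum definition in the text, with $Y$ fixed to a given value $y$ (natural logarithm throughout).

```python
import math

# Illustrative joint distribution P_{(X,Y)} over E1 x E2 = {0,1} x {0,1}
# (values chosen for the example, not taken from the source).
P_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

def joint_entropy(p_xy):
    """S(X,Y) = -sum over (x,y) of P(x,y) * log(P(x,y))."""
    return -sum(p * math.log(p) for p in p_xy.values() if p > 0)

def conditional_entropy_given(p_xy, y):
    """Entropy of the law of X conditional on a fixed value Y = y,
    matching the single-sum definition S(X|Y) in the text."""
    p_y = sum(p for (_, yy), p in p_xy.items() if yy == y)
    p_x_given_y = [p / p_y for (_, yy), p in p_xy.items() if yy == y]
    return -sum(p * math.log(p) for p in p_x_given_y if p > 0)

print(round(joint_entropy(P_xy), 4))            # joint entropy S(X,Y)
print(round(conditional_entropy_given(P_xy, 0), 4))  # S(X|Y=0)
```

Since the entries with zero probability contribute $0 \cdot \log 0 = 0$ by convention, they are skipped in the sums.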