Lucas Fidon edited subsection_Mutual_Information_definition_and__.tex
almost 8 years ago
Commit id: 0ba309491b46325d99613080e9380c268131afc2
diff --git a/subsection_Mutual_Information_definition_and__.tex b/subsection_Mutual_Information_definition_and__.tex
index 4f3ec51..adaf3b2 100644
--- a/subsection_Mutual_Information_definition_and__.tex
+++ b/subsection_Mutual_Information_definition_and__.tex
...
Besides, the entropy of the conditional probability distribution of $X$ given $Y$ is defined as:
\[ S(X|Y)=-\sum_{x \in E_1}P_{X|Y}(x)\log\big(P_{X|Y}(x)\big) \]
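As a quick illustration (a worked example not in the original text), suppose $P_{X|Y}$ is uniform over two outcomes $x_1, x_2 \in E_1$. The definition then gives
\[ S(X|Y) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2, \]
which is the maximal entropy attainable by a distribution over two outcomes.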
Somewhat imprecisely, we refer to the entropy of the probability distribution of a random variable $X$ as simply \textit{the entropy of $X$}; hence the notation $S(X)$ for the entropy of $X$.
The entropy of a probability distribution (or of a random variable) has three interpretations:
\begin{itemize}