Shannon introduced entropy as a measure of the amount of information contained in a random variable.
Let $X: \Omega \rightarrow E$ be a random variable taking values in a discrete set $E$, and let $P_{X}$ denote its probability distribution.
The entropy of $X$, denoted $S(X)$, is defined as:
\[ S(X) = -\sum_{x \in E} P_{X}(x)\,\log\big(P_{X}(x)\big) \]
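For instance, taking the logarithm in base $2$ (a choice of unit the definition above leaves open), a fair coin flip with $P_{X}(\text{heads}) = P_{X}(\text{tails}) = \tfrac{1}{2}$ has entropy
\[ S(X) = -\left(\tfrac{1}{2}\log_{2}\tfrac{1}{2} + \tfrac{1}{2}\log_{2}\tfrac{1}{2}\right) = \log_{2} 2 = 1 \text{ bit}. \]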
The entropy has three interpretations: