Kim H. Parker edited subsection_information_theory_Information_theory__.tex
about 8 years ago
Commit id: 020525624632562f1cb51db599ffa54050e403a8
\[
\Delta_{JS}(A;B) = \sqrt{JS(A;B)}
\]
is a metric that is positive
($\Delta_{JS} \ge 0$), symmetric
($\Delta_{JS}(A;B) = \Delta_{JS}(B;A)$) and satisfies the triangle inequality
($\Delta_{JS}(A;B) \le \Delta_{JS}(A;C) + \Delta_{JS}(B;C)$). We will use this metric of the distance between probability density functions in the following analysis.
Finally, we note that the definition of entropy requires knowledge of the probability density function, and there are well-known problems in estimating the underlying pdf from a single sample of variables drawn from the distribution. This will be discussed below.
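To make the estimation problem concrete, the following sketch (an illustration, not the analysis method of this paper) forms a naive plug-in estimate of the differential entropy of a standard normal distribution from a histogram of a single sample; the bin count and sample sizes are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy(sample, bins=20):
    """Naive plug-in differential-entropy estimate (bits) from a histogram."""
    counts, edges = np.histogram(sample, bins=bins)
    p = counts / counts.sum()        # empirical bin probabilities
    width = edges[1] - edges[0]      # uniform bin width
    p = p[p > 0]
    # discrete entropy of the binned sample plus the bin-width correction
    return -np.sum(p * np.log2(p)) + np.log2(width)

# True differential entropy of N(0,1) is (1/2) log2(2*pi*e) bits
true_h = 0.5 * np.log2(2 * np.pi * np.e)

h_small = plugin_entropy(rng.normal(size=50))      # one small sample
h_large = plugin_entropy(rng.normal(size=100_000)) # one large sample

print(true_h, h_small, h_large)
```

With a large sample the estimate settles near the true value, but a single small sample gives a noisy, typically biased estimate, which is the difficulty alluded to above.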