\[
\Delta_{JS}(A;B) = \sqrt{JS(A;B)}
\]
is a metric: it is positive $\big[\Delta_{JS} \ge 0\big]$, symmetric $\big[\Delta_{JS}(A;B) = \Delta_{JS}(B;A)\big]$, and satisfies the triangle inequality $\big[\Delta_{JS}(A;B) \le \Delta_{JS}(A;C) + \Delta_{JS}(C;B)\big]$. We will use this metric of the distance between probability density functions in the following analysis.
Finally, we note that the definition of entropy requires knowledge of the probability density function, and there are well-known problems in estimating the underlying pdf from a single sample drawn from the distribution. This will be discussed below.
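To make the estimation problem concrete, a minimal sketch (an illustration under our own assumptions, not the analysis promised below): a plug-in entropy estimate built from a histogram of a single Gaussian sample depends noticeably on the arbitrary choice of bin count; the sample size and bin counts here are illustrative.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
sample = rng.standard_normal(500)      # one sample from N(0, 1)

# True differential entropy of N(0, 1): 0.5 * ln(2 * pi * e) ~ 1.419
true_h = 0.5 * np.log(2.0 * np.pi * np.e)

# Plug-in estimate from a histogram pdf; the answer shifts with the
# bin count, illustrating the difficulty of estimating the pdf (and
# hence the entropy) from a single finite sample.
for bins in (5, 20, 100):
    density, edges = np.histogram(sample, bins=bins, density=True)
    widths = np.diff(edges)
    mask = density > 0
    h_hat = -np.sum(density[mask] * np.log(density[mask]) * widths[mask])
    print(f"bins={bins:3d}  h_hat={h_hat:.3f}  true={true_h:.3f}")
\end{verbatim}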