Kim H. Parker edited subsection_information_theory_Information_theory__.tex
about 8 years ago
\[
\Delta_{JS}(A;B) = \sqrt{JS(A;B)}
\]
is a
metric; that is, it is non-negative $\big[\Delta_{JS} \ge 0\big]$, symmetric $\big[\Delta_{JS}(A;B) = \Delta_{JS}(B;A)\big]$, and satisfies the triangle inequality $\big[\Delta_{JS}(A;B) \le \Delta_{JS}(A;C) + \Delta_{JS}(B;C)\big]$. We will use this metric as a measure of the distance between probability density functions in the following analysis.
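As an illustrative aside (not part of the original analysis), the three metric properties can be checked numerically. The sketch below, in Python with NumPy, implements the Jensen-Shannon distance for discrete distributions using natural logarithms; the function name \texttt{js\_distance} is our own choice.

```python
import numpy as np

def js_distance(p, q):
    """Jensen-Shannon distance: the square root of the JS divergence.

    p, q are discrete probability distributions (1-D arrays summing to 1).
    With natural logarithms the divergence is bounded above by ln(2).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture distribution

    def kl(a, b):
        # Kullback-Leibler divergence, with the 0*log(0) = 0 convention.
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))

    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))
```

Evaluating \texttt{js\_distance} on randomly generated distributions confirms non-negativity, symmetry, and the triangle inequality to machine precision.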
Finally, we note that the definition of entropy requires knowledge of the probability density function, and there are well-known problems in estimating the underlying pdf from a single sample drawn from the distribution. This will be discussed below.
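To illustrate this estimation problem (the function name \texttt{plugin\_entropy} and the bin count are our own choices, not taken from the text), a simple ``plug-in'' estimator approximates the pdf by a histogram and evaluates the entropy of the binned distribution:

```python
import numpy as np

def plugin_entropy(samples, bins=20):
    """Plug-in estimate of differential entropy (in nats) from one sample.

    The pdf is approximated by histogram bin frequencies. Small samples
    under-populate rare bins, so the estimate tends to be biased downward.
    """
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()   # empirical bin probabilities
    mask = p > 0                # drop empty bins (0*log 0 = 0 convention)
    # Differential entropy estimate: -sum_i p_i * log(p_i / width_i)
    return -np.sum(p[mask] * np.log(p[mask] / widths[mask]))
```

For a standard normal variable the true differential entropy is $\tfrac{1}{2}\ln(2\pi e)\approx 1.419$ nats; a large sample gives an estimate close to this value, while a sample of a few tens of points typically falls well short of it, illustrating the small-sample bias discussed below.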