\[
JS(A,B) = H(M) - \frac{1}{2}\left(H(A) + H(B)\right)
\]
That is, the Jensen-Shannon divergence is equal to the entropy of the average distribution of
the two distributions minus the average of the entropies of the individual distributions.
Finally, it has been shown that the Jensen-Shannon distance defined as
\[
\Delta_{JS}(A,B) = \sqrt{JS(A,B)}
\]
is a metric, satisfying symmetry and the triangle inequality. We will use this metric of the distance between probability density functions in what follows.
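As an illustration, the Jensen-Shannon distance can be computed directly from the definition above for two discrete distributions. The following Python sketch (function names are ours, not from the text) uses base-2 logarithms, so the distance is bounded by 1:

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits; terms with zero probability contribute 0.
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def js_distance(a, b):
    # Jensen-Shannon distance: the square root of
    # JS(A,B) = H(M) - (H(A) + H(B)) / 2,
    # where M = (A + B) / 2 is the average distribution.
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    m = 0.5 * (a + b)
    return np.sqrt(entropy(m) - 0.5 * (entropy(a) + entropy(b)))
```

For disjoint distributions such as $(1,0)$ and $(0,1)$ the distance attains its maximum of 1 bit$^{1/2}$, and it vanishes for identical distributions, consistent with the metric properties noted above.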
Finally, we note that the definition of entropy requires knowledge of the probability density function, and there are well-known problems in estimating the underlying pdf from a single sample drawn from the distribution, e.g. through the use of binned histograms. Some of these problems can be overcome by methods based on the nearest-neighbour statistics of the sample. These methods will be discussed below.
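To make the nearest-neighbour idea concrete, the sketch below implements one widely used estimator of this family, the Kozachenko-Leonenko estimator for a one-dimensional sample (chosen here for illustration; the text does not specify which estimator is used). It estimates differential entropy in nats from the distance of each sample point to its $k$-th nearest neighbour, with no binning:

```python
import numpy as np
from scipy.special import digamma

def knn_entropy_1d(x, k=3):
    # Kozachenko-Leonenko estimator of differential entropy (in nats)
    # for a 1-D sample: H ≈ ψ(N) − ψ(k) + log(2) + mean(log ε_i),
    # where ε_i is the distance from x_i to its k-th nearest neighbour
    # and log(2) is the log-volume of the unit ball in one dimension.
    x = np.asarray(x, dtype=float)
    n = len(x)
    eps = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        # sorted distances: d=0 to itself, so index k is the k-th neighbour
        eps[i] = np.partition(d, k)[k]
    return digamma(n) - digamma(k) + np.log(2.0) + np.mean(np.log(eps))
```

Applied to a sample from a standard normal distribution, the estimate converges towards the analytic value $\tfrac{1}{2}\ln(2\pi e) \approx 1.419$ nats as the sample size grows, without ever constructing a histogram.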