Kim H. Parker edited subsection_information_theory_Information_theory__.tex
about 8 years ago
Commit id: 9b7a5cb20f470374796943b598c66c5cdca66e27
diff --git a/subsection_information_theory_Information_theory__.tex b/subsection_information_theory_Information_theory__.tex
index 6b88e03..d754079 100644
--- a/subsection_information_theory_Information_theory__.tex
+++ b/subsection_information_theory_Information_theory__.tex
...
From its definition it can easily be shown that
\[
JS(A,B) = H(M) - \frac{1}{2}\big(H(A) + H(B)\big)
\]
That is, the Jensen-Shannon divergence equals the entropy of the average of the two distributions minus the average of their individual entropies.
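As a minimal sketch of this identity, the following computes the Jensen-Shannon divergence directly from the entropy form above; the function names (`entropy`, `js_divergence`) and the two-point example distributions are illustrative, not from the source.

```python
import math

def entropy(p):
    # Shannon entropy in bits of a discrete distribution p
    # (terms with zero probability contribute nothing).
    return -sum(x * math.log2(x) for x in p if x > 0)

def js_divergence(a, b):
    # JS(A,B) = H(M) - (H(A) + H(B)) / 2,
    # where M is the pointwise average of A and B.
    m = [(x + y) / 2 for x, y in zip(a, b)]
    return entropy(m) - (entropy(a) + entropy(b)) / 2

# Identical distributions give JS = 0; fully disjoint
# distributions give the maximum of 1 bit.
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
print(js_divergence([0.3, 0.7], [0.3, 0.7]))  # 0.0
```

With base-2 logarithms the divergence is bounded between 0 and 1 bit, which the two example calls illustrate at the extremes.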