The Jensen-Shannon divergence between two distributions $A$ and $B$ is defined as
\[
JS(A;B) = \frac{1}{2} D(A||M) + \frac{1}{2} D(B||M)
\]
where $M$ is the average of the two distributions, $\phi(M_x) = \frac{1}{2}\big(\phi(A_x) + \phi(B_x)\big)$.
From its definition it can easily be shown that
\[
JS(A;B) = H(M) - \frac{1}{2}\big(H(A) + H(B)\big)
\]
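A sketch of the derivation, assuming $D$ is the standard Kullback-Leibler divergence $D(A||M) = \sum_x \phi(A_x) \log\big(\phi(A_x)/\phi(M_x)\big)$: splitting the logarithm in each term gives
\[
\frac{1}{2} D(A||M) + \frac{1}{2} D(B||M)
= -\frac{1}{2} H(A) - \frac{1}{2} H(B)
- \sum_x \frac{1}{2}\big(\phi(A_x) + \phi(B_x)\big) \log \phi(M_x)
\]
and the final sum is $\sum_x \phi(M_x) \log \phi(M_x) = -H(M)$ by the definition of $M$.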
That is, the Jensen-Shannon divergence equals the entropy of the average of the two distributions minus the average of their individual entropies.
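For example, if $A = B$ then $M = A$ and $JS(A;B) = H(A) - H(A) = 0$; at the other extreme, if $A$ and $B$ are concentrated on two different values of $x$ then $H(A) = H(B) = 0$, $M$ is uniform over those two values, and $JS(A;B) = H(M) = \log 2$, its maximum possible value.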