Kim H. Parker edited subsection_information_theory_Information_theory__.tex
about 8 years ago
Commit id: bdeb5153afce54a56a6bde22e7799b19f8012dcb
diff --git a/subsection_information_theory_Information_theory__.tex b/subsection_information_theory_Information_theory__.tex
index 24b18b2..5acbdb2 100644
--- a/subsection_information_theory_Information_theory__.tex
+++ b/subsection_information_theory_Information_theory__.tex
...
\]
This measure of distance has several disadvantages: it is not symmetric, and it does not satisfy the triangle inequality, so it is not a metric. The Jensen-Shannon divergence is defined in terms of the Kullback-Leibler divergence in a way that makes it symmetric:
\[
JS(A;B) = \frac{1}{2} D(A||M) + \frac{1}{2} D(B||M)
\]
where $M$ is the equal-weight mixture of $A$ and $B$: $\phi(M_x) = \frac{1}{2}\big(\phi(A_x) + \phi(B_x)\big)$.
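As a small numerical sketch of the symmetrisation above (in Python, using base-2 logarithms; the example distributions \texttt{a} and \texttt{b} are illustrative, not from the text):

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence D(p||q) for discrete distributions.
    # Terms with p[x] == 0 contribute nothing; assumes q[x] > 0 wherever p[x] > 0.
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

def js(p, q):
    # Jensen-Shannon divergence: average of the two KL divergences
    # against the mixture M with M[x] = (p[x] + q[x]) / 2.
    m = [0.5 * (px + qx) for px, qx in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

a = [0.5, 0.5, 0.0]
b = [0.0, 0.5, 0.5]
print(js(a, b), js(b, a))  # → 0.5 0.5 (symmetric, unlike KL)
```

Note that $D(A||B)$ itself would be infinite here, since each distribution puts mass where the other has none; the mixture $M$ avoids this, which is another practical advantage of the Jensen-Shannon form.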