Kim H. Parker edited subsection_information_theory_Information_theory__.tex
about 8 years ago
Commit id: c88ea3b16a2e0177662894fb21ece3ab6c24b0b9
diff --git a/subsection_information_theory_Information_theory__.tex b/subsection_information_theory_Information_theory__.tex
index 2ce5d9f..6b88e03 100644
--- a/subsection_information_theory_Information_theory__.tex
+++ b/subsection_information_theory_Information_theory__.tex
...
\]
This measure of distance has several disadvantages: it is not symmetric, and it is not a metric. The Jensen-Shannon divergence is defined using the Kullback-Leibler divergence in a way that makes it symmetric:
\[
JS(A;B) = \frac{1}{2}\big(D(A||M) + D(B||M)\big)
\]
where $\phi(M_x) = \frac{1}{2}\big(\phi(A_x) + \phi(B_x)\big)$, i.e.\ $M$ is the equal-weight mixture of the distributions $A$ and $B$.
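The definition above can be sketched numerically. The following is a minimal illustration (not from the original text): the function names `kl` and `js` are hypothetical, distributions are plain lists of probabilities over a common support, and logarithms are taken base 2 so divergences are in bits.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence D(p||q) in bits.
    # Terms with p[x] == 0 contribute nothing; assumes q[x] > 0 wherever p[x] > 0.
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

def js(p, q):
    # Jensen-Shannon divergence: the average KL divergence of p and q
    # to their equal-weight mixture M, as in the equation above.
    m = [(px + qx) / 2 for px, qx in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

Unlike the KL divergence, `js(p, q)` equals `js(q, p)`, it is always finite, and with base-2 logarithms it is bounded by 1 bit (attained for distributions with disjoint support).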