
From its definition it can easily be shown that
\[
JS(A,B) = H(M) - \frac{1}{2}\big(H(A) + H(B)\big)
\]
That is, the Jensen-Shannon divergence is equal to the entropy of the average of the two distributions minus the average of the entropies of the individual distributions.
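A sketch of the derivation, assuming the standard definition $JS(A,B) = \frac{1}{2}KL(A\,\|\,M) + \frac{1}{2}KL(B\,\|\,M)$ with $M = \frac{1}{2}(A+B)$ (the notation here is assumed and may differ slightly from the definition given above):
\begin{align*}
JS(A,B) &= \frac{1}{2}\sum_i A_i \log\frac{A_i}{M_i} + \frac{1}{2}\sum_i B_i \log\frac{B_i}{M_i} \\
        &= -\sum_i \frac{A_i + B_i}{2}\log M_i
           + \frac{1}{2}\sum_i A_i \log A_i
           + \frac{1}{2}\sum_i B_i \log B_i \\
        &= -\sum_i M_i \log M_i - \frac{1}{2}\big(H(A) + H(B)\big) \\
        &= H(M) - \frac{1}{2}\big(H(A) + H(B)\big),
\end{align*}
where the second line splits each logarithm of a ratio into a difference of logarithms, and the third collects the $\log M_i$ terms using $M_i = \frac{1}{2}(A_i + B_i)$.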