\subsection{Minimising the Jensen-Shannon distance between $dP$ and $(dP_+ + dP_-)$ and between $\rho c\,dU$ and $(dP_+ - dP_-)$}

The separation of the measured $dP$ and $dU$ can be done for any arbitrary value of $c$ using the water hammer equations and the assumption that the forward and backward waves are additive. Thus for any value of $c$ (assuming that $\rho$ is constant) we obtain distributions of $dP_+$ and $dP_-$. For noiseless measurements and the `true' value of $c$, the distribution of the measured $dP$ should equal the sum of the distributions of $dP_+$ and $dP_-$, and the distribution of $\rho c\,dU$ should equal the sum of the distributions of $dP_+$ and $-dP_-$. For noisy measurements these distributions will never be equal. We argue, however, that the sum of the distances between these distributions will be a minimum when the true value of $c$ is used.
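For reference, the separation for a given $c$ uses the standard water hammer relations of wave intensity analysis (assuming the usual convention that forward-travelling waves are taken as positive):
\[ dP_\pm = \frac{1}{2}\left(dP \pm \rho c\,dU\right) \]
so that, by construction, $dP = dP_+ + dP_-$ and $\rho c\,dU = dP_+ - dP_-$.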

\[ \phi(dP_+ + dP_-) = \frac{1}{2}\big(\phi(dP_+) + \phi(dP_-)\big) \]
\[ \phi(dP_+ - dP_-) = \frac{1}{2}\big(\phi(dP_+) + \phi(-dP_-)\big) \]

The problem of estimating the probability density function (pdf) from a sampled distribution has received considerable attention over the years. The simplest approach, generally called the maximum likelihood estimator, is to estimate the pdf from the normalised histogram of the sampled distribution. For example, given the distribution of $dP$,
\[ \phi(dP) = h(dP;b)/N \]
where $h(dP;b)$ is the number of values of $dP$ lying in the bins $b$ and $N$ is the total number of samples.

The units of $\Delta$ depend on the base of the logarithm used in calculating the entropy; traditionally log base 2 is used, in which case the entropy and the Jensen-Shannon divergence are measured in bits. Since $\Delta$ is defined as the square root of the Jensen-Shannon divergence, it therefore has units of $\sqrt{\mathrm{bits}}$.
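As an illustration of the procedure, the following Python sketch (assuming NumPy and SciPy are available; all variable names are hypothetical) separates the measured increments for a candidate wave speed, estimates the pdfs by normalised histograms on a common set of bins, and evaluates the two Jensen-Shannon distances in $\sqrt{\mathrm{bits}}$; scanning over candidate values of $c$ and taking the minimiser of their sum gives the estimate argued for above. Note that \texttt{scipy.spatial.distance.jensenshannon} returns the Jensen-Shannon distance (the square root of the divergence) directly.

\begin{verbatim}
import numpy as np
from scipy.spatial.distance import jensenshannon

def separate(dP, dU, rho, c):
    """Water hammer separation of the measured increments."""
    dPp = 0.5 * (dP + rho * c * dU)   # forward wave, dP_+
    dPm = 0.5 * (dP - rho * c * dU)   # backward wave, dP_-
    return dPp, dPm

def js_distance_sum(dP, dU, rho, c, nbins=50):
    """Sum of the two Jensen-Shannon distances for a candidate c."""
    dPp, dPm = separate(dP, dU, rho, c)
    # one common set of bins so that all histograms are comparable
    pooled = np.concatenate([dP, rho * c * dU, dPp, dPm, -dPm])
    edges = np.histogram_bin_edges(pooled, bins=nbins)

    def phi(x):                       # normalised-histogram pdf estimate
        return np.histogram(x, bins=edges)[0] / len(x)

    mix_sum = 0.5 * (phi(dPp) + phi(dPm))     # compare with phi(dP)
    mix_diff = 0.5 * (phi(dPp) + phi(-dPm))   # compare with phi(rho c dU)
    # jensenshannon returns the JS *distance* (sqrt of the divergence);
    # base=2 gives entropies in bits, hence distances in sqrt(bits)
    return (jensenshannon(phi(dP), mix_sum, base=2) +
            jensenshannon(phi(rho * c * dU), mix_diff, base=2))

# scan candidate wave speeds and take the minimiser as the estimate of c
cs = np.linspace(1.0, 15.0, 141)              # m/s, illustrative range
# Delta = [js_distance_sum(dP, dU, rho, c) for c in cs]
# c_hat = cs[int(np.argmin(Delta))]
\end{verbatim}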