% Julian Schrenk edited untitled.tex over 9 years ago (commit cf5f07ee92103a909858d651b799d9e21ecbb422)
If we want to approximate the probability density without assuming a functional form, we can use density estimation \url{http://en.wikipedia.org/wiki/Density_estimation}, in particular kernel density estimation \url{http://en.wikipedia.org/wiki/Kernel_density_estimation}. This is similar to histogramming with smoothing, but it yields an analytic form for the density.
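As a minimal sketch of what this could look like in practice (assuming Python with SciPy; \texttt{gaussian\_kde} uses Scott's rule for the bandwidth by default):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Stand-in data: 1000 samples from a standard normal distribution
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=1000)

# Build the kernel density estimate; the result is a callable that
# returns the (smoothed) density at any point, i.e. an analytic form
kde = gaussian_kde(samples)
density = kde(np.array([0.0]))  # density estimate at x = 0
```

The estimate at $x=0$ should be close to the true normal density there ($\approx 0.40$), slightly lowered by the kernel smoothing.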
\section{Entropy computation}
Once we have all of the above tools, we should be able to compute the entropy or total number of basins in several different ways.
These correspond to different entropy definitions and analysis techniques.
\subsection{APF entropy}
Description:
This is the granular entropy as defined in Asenjo14 (DOI: 10.1103/PhysRevLett.112.098002).
Definition:
\begin{equation}
S_\text{APF}^* = \langle F \rangle_\text{biased} + \log(V_\text{acc})
\end{equation}
\begin{equation}
S_\text{APF} = S_\text{APF}^* - \log(N!)
\end{equation}
Requirements:
Needs the $\log(V)$ values as obtained from the basin volume computation.
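A minimal sketch of the two APF formulas above, assuming the input is an array of $\log(V)$ values for the sampled basins so that $F_i = -\log v_i$; the helper name \texttt{apf\_entropy} is hypothetical:

```python
import numpy as np
from scipy.special import gammaln

def apf_entropy(log_volumes, log_v_acc, n_particles):
    """S_APF = <F>_biased + log(V_acc) - log(N!), with F_i = -log(v_i).

    Assumption: log_volumes holds log(v_i) for the basins found by the
    biased sampling, and log_v_acc is log of the accessible volume.
    """
    mean_F = -np.mean(log_volumes)          # <F>_biased
    s_star = mean_F + log_v_acc             # S*_APF
    return s_star - gammaln(n_particles + 1)  # subtract log(N!) via gammaln
```

Using \texttt{gammaln(N+1)} avoids overflow that a direct factorial would hit for large $N$.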
\subsection{Fit to CDF and Jackknife}
Description:
JackLogOmega gives the LogOmega entropy (the log of the number of basins), after
unbiasing the distribution with a fit to the generalised Gaussian CDF and numerical
integration, as in Asenjo14.
Definition:
\begin{equation}
S^\star = \log \Omega = \log(V_\text{acc}) - \log(\text{unbiased mean volume})
\end{equation}
\begin{equation}
S = S^\star - \log(N!)
\end{equation}
Requirements:
Needs an analytical (or at least numerically integrable) expression for the unbiased basin volume distribution.
Currently this is obtained by fitting a generalised Gaussian to the empirical CDF.
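One way the CDF fit could be sketched, assuming the fit is done in $\log(V)$ space using SciPy's \texttt{gennorm} (generalised normal, i.e. generalised Gaussian) and least squares against the empirical CDF; \texttt{fit\_gennorm\_cdf} is a hypothetical helper name:

```python
import numpy as np
from scipy.stats import gennorm
from scipy.optimize import curve_fit

def fit_gennorm_cdf(log_volumes):
    """Fit a generalised-Gaussian CDF to the empirical CDF of log(V).

    Returns (beta, loc, scale); beta = 2 recovers an ordinary Gaussian.
    """
    x = np.sort(log_volumes)
    ecdf = np.arange(1, len(x) + 1) / len(x)   # empirical CDF at the sample points
    popt, _ = curve_fit(
        lambda x, beta, loc, scale: gennorm.cdf(x, beta, loc=loc, scale=scale),
        x, ecdf,
        p0=[2.0, np.mean(x), np.std(x)])       # start from a Gaussian guess
    return popt
```

The jackknife error estimate would then repeat this fit on leave-one-out subsamples of the basin volumes.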
\subsection{ML fit}
TODO: MLLogOmega gives LogOmega from the maximum-likelihood estimate, i.e.\ after a
maximum-likelihood fit of the generalised Gaussian.
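SciPy's \texttt{gennorm.fit} already performs a maximum-likelihood fit of the generalised Gaussian, so a sketch (on synthetic stand-in data; the true parameters here are assumptions for illustration) could be as short as:

```python
import numpy as np
from scipy.stats import gennorm

# Synthetic stand-in for the biased log-volume sample
data = gennorm.rvs(2.0, loc=-50.0, scale=3.0, size=2000, random_state=1)

# Maximum-likelihood fit of the generalised Gaussian: returns (beta, loc, scale)
beta, loc, scale = gennorm.fit(data)
```

The fitted parameters would then feed into the same unbiasing integral as in the CDF-fit approach.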
\subsection{Bayesian}
TODO: BayesianLogOmega uses Bayesian inference to obtain the parameters of the
generalised Gaussian.
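One way this could be sketched is a minimal random-walk Metropolis sampler over the generalised-Gaussian parameters with flat priors; both the flat priors and the sampler are assumptions here, since the actual choices are still open:

```python
import numpy as np
from scipy.stats import gennorm

def log_posterior(theta, data):
    """Log-posterior of (beta, loc, scale) under flat priors (assumption)."""
    beta, loc, scale = theta
    if beta <= 0 or scale <= 0:
        return -np.inf                      # outside the support of the priors
    return np.sum(gennorm.logpdf(data, beta, loc=loc, scale=scale))

def metropolis(data, theta0, steps=2000, prop_scale=0.05, seed=0):
    """Minimal random-walk Metropolis sampler; returns the parameter chain."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_posterior(theta, data)
    chain = []
    for _ in range(steps):
        proposal = theta + prop_scale * rng.standard_normal(3)
        lp_prop = log_posterior(proposal, data)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = proposal, lp_prop
        chain.append(theta.copy())
    return np.array(chain)
```

A production version would use a proper MCMC library, tuned proposals, and informative priors, but the structure would be the same: the posterior over the parameters then induces a posterior over LogOmega.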
\subsection{Non-parametric}
TODO: NonParametricLogOmega constructs a non-parametric description of the
biased distribution and integrates it with the un-biasing factor to get
LogOmega without assuming a generalised Gaussian.
This needs kernel density estimation, smoothing, or similar to obtain the
PDF description.
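A sketch of the non-parametric route, under the assumption that basins are sampled with probability proportional to their volume, so that $\Omega = V_\text{acc}\,\langle 1/v\rangle_\text{biased} = V_\text{acc}\,\langle e^{F}\rangle_\text{biased}$ with $F=-\log v$ (consistent with the $S^\star$ definition above; the helper name is hypothetical):

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def nonparametric_log_omega(F_samples, log_v_acc):
    """LogOmega from a KDE of the biased F = -log(v) distribution.

    Assumption: basins are hit with probability v_i / V_acc, so
    Omega = V_acc * <exp(F)>_biased, evaluated with the KDE as p(F).
    """
    F = np.asarray(F_samples, dtype=float)
    kde = gaussian_kde(F)                       # non-parametric PDF of F
    grid = np.linspace(F.min() - 1.0, F.max() + 1.0, 2001)
    shift = F.max()                             # log-sum-exp shift for stability
    integral = trapezoid(np.exp(grid - shift) * kde(grid), grid)
    return log_v_acc + shift + np.log(integral)  # log V_acc + log <exp(F)>
```

The constant shift keeps $e^{F}$ from overflowing when the $F$ values are large, which is the regime of interest for hard-to-find small basins.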