% Jacob Stevenson edited untitled.tex (commit 15265407ae7de88e1143e3d2660d175aa8e1d9a4)
\section{probability distribution}
The end result of this project will be a series of volumes sampled from an unknown probability distribution, and we want to determine what that distribution is. We will assume a functional form for the distribution, $P(V | \theta)$, where $\theta$ are its unknown parameters. In the past we have used a generalized Gaussian. $P(V | \theta)$ is the biased distribution, so it will be the generalized Gaussian times $V$.
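As a sketch of what the biased form could look like, assuming the generalized Gaussian referred to above is the standard three-parameter form (the symbols $\mu$, $\alpha$, $\beta$ are illustrative and not defined in the original):
\begin{equation}
g(V | \mu, \alpha, \beta) = \frac{\beta}{2 \alpha \Gamma(1/\beta)} \exp\left( -\left( \frac{|V - \mu|}{\alpha} \right)^{\beta} \right), \qquad P(V | \theta) = \frac{V \, g(V | \theta)}{\int V' \, g(V' | \theta) \, dV'}
\end{equation}
where the denominator normalizes the volume-weighted density.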
This is mostly taken from \url{http://en.wikipedia.org/wiki/Distribution_fitting}
There are several ways to determine the values of $\theta$, none of which involve histogramming.
\subsection{maximum likelihood method}
\url{http://en.wikipedia.org/wiki/Maximum_likelihood}
The likelihood is the joint probability of all the observations given the set of parameters $\theta$:
\begin{equation}
L(V_1, ..., V_N | \theta) = \prod_{i=1}^{N} P(V_i | \theta)
\end{equation}
The maximum likelihood estimate of $\theta$ is the value that maximizes $L$ (in practice, its logarithm).
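A minimal numerical sketch of a maximum likelihood fit, using a plain Gaussian as a stand-in for the generalized Gaussian above (the function names, the synthetic data, and the choice of optimizer are assumptions, not from the source):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, volumes):
    """Negative log-likelihood of a Gaussian P(V | mu, sigma).
    A plain Gaussian stands in for the generalized Gaussian here."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf  # keep the optimizer out of invalid territory
    # sum of log P(V_i | theta) over all observations, negated
    return -np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                   - (volumes - mu)**2 / (2 * sigma**2))

rng = np.random.default_rng(0)
volumes = rng.normal(loc=10.0, scale=2.0, size=5000)  # synthetic "volumes"

result = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(volumes,),
                  method="Nelder-Mead")
mu_hat, sigma_hat = result.x
```

Maximizing the log-likelihood rather than $L$ itself avoids numerical underflow from the product of many small probabilities.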
\subsection{Bayesian updates}
\url{http://en.wikipedia.org/wiki/Bayesian_inference#Parametric_formulation}
With this method you start with a prior distribution over the parameters, $P(\theta)$, and apply Bayes' theorem iteratively, updating the distribution as each new observation arrives:
\begin{equation}
P(\theta | V_1) = \frac{ P(V_1 | \theta) P(\theta) }{ P(V_1) }
\end{equation}
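A minimal grid-based sketch of this sequential update, estimating the mean of a Gaussian $P(V | \theta)$ with known width (the grid discretization and all parameter values are illustrative assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0                      # assumed known width of P(V | theta)
true_theta = 10.0
observations = rng.normal(true_theta, sigma, size=200)

theta_grid = np.linspace(0.0, 20.0, 1001)   # discretized parameter space
posterior = np.ones_like(theta_grid)        # flat prior P(theta)
posterior /= posterior.sum()

for V in observations:
    # Bayes' theorem: P(theta | V) proportional to P(V | theta) P(theta)
    likelihood = np.exp(-(V - theta_grid)**2 / (2 * sigma**2))
    posterior *= likelihood
    posterior /= posterior.sum()            # division by P(V) as normalization

theta_est = float(np.sum(theta_grid * posterior))  # posterior mean
```

Each pass through the loop uses the previous posterior as the new prior, which is exactly the iterative update described above; normalizing after every observation keeps the weights from underflowing.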