genetic algorithm to estimate parameters of multiple neuron models. Of particular interest, they propose genetic algorithms as a means of solving STRF models. We applied this method as one means of estimating the parameters of an augmented HR-neuron model that includes a sensory filter (DSTRF).
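As a generic illustration of the approach, a toy genetic algorithm for fitting a parameter vector might look like the following sketch; the fitness function, population size, and mutation scale here are placeholders, not the actual model or settings used by Lynch and Houghton (2015):

```python
# Illustrative sketch of a simple genetic algorithm for parameter fitting.
# The fitness function and hyper-parameters are toy placeholders, not the
# actual HR-neuron/DSTRF setup.
import numpy as np

rng = np.random.default_rng(0)

def fitness(params, target):
    # Toy fitness: negative squared error to a known parameter vector.
    return -np.sum((params - target) ** 2)

def genetic_algorithm(target, pop_size=50, n_params=3,
                      n_generations=200, mutation_scale=0.1):
    pop = rng.normal(0.0, 2.0, size=(pop_size, n_params))
    for _ in range(n_generations):
        scores = np.array([fitness(p, target) for p in pop])
        # Selection: keep the best half of the population.
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        # Crossover: average randomly chosen pairs of parents.
        idx = rng.integers(0, len(parents), size=(pop_size, 2))
        children = parents[idx].mean(axis=1)
        # Mutation: add small Gaussian perturbations.
        pop = children + rng.normal(0.0, mutation_scale, children.shape)
    best = pop[np.argmax([fitness(p, target) for p in pop])]
    return best

best = genetic_algorithm(np.array([1.0, -0.5, 2.0]))
```

In a real application the fitness would be a measure of similarity between model-generated and observed spike trains rather than a direct comparison of parameter values, which are unknown.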


Affine Invariant MCMC

*Tyler - Will need to add in some generic information on how MCMC works, I guess?*

We also estimated neuron and filter parameters using a Markov chain Monte Carlo ("MCMC") technique. MCMC provides some distinct advantages over other parameter estimation methods, such as variational methods. First, MCMC provides an estimate of the full posterior distribution of the parameters, rather than just a single value as with genetic algorithms. Having the posterior distribution for the parameters is useful for drawing inferences from the uncertainty of the parameter estimates. Second, MCMC allows for a Bayesian approach to estimating the parameters: prior knowledge or beliefs about the parameters may be incorporated into the estimation procedure and then updated afterwards. Finally, MCMC is simple to run in parallel on multi-core or multi-processor computers, which allows for significant reductions in run time.
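As a generic illustration of how MCMC draws samples from a posterior, a minimal Metropolis-Hastings sampler for a toy one-dimensional target might look like this (the target density and step size are placeholders, not the neuron-model posterior):

```python
# Minimal Metropolis-Hastings sketch: draws samples from a toy 1-D
# standard-normal "posterior".  Purely illustrative.
import numpy as np

rng = np.random.default_rng(1)

def log_post(theta):
    # Toy log-posterior: standard normal, N(0, 1), up to a constant.
    return -0.5 * theta ** 2

def metropolis(n_samples=20000, step=1.0, theta0=0.0):
    samples = np.empty(n_samples)
    theta, lp = theta0, log_post(theta0)
    for i in range(n_samples):
        prop = theta + rng.normal(0.0, step)       # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            theta, lp = prop, lp_prop
        samples[i] = theta                         # keep current state
    return samples

samples = metropolis()
```

The retained states form a Markov chain whose stationary distribution is the target, so histograms of `samples` approximate the posterior.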


To sample from the posterior distribution, we used emcee, a Python package implementing an affine-invariant ensemble sampler (Foreman-Mackey et al. 2013). The affine-invariant sampler is insensitive to covariances between parameters and requires tuning far fewer hyper-parameters than standard MCMC algorithms (Goodman and Weare 2010). Further, the ensemble method used by the sampler was designed to run in a parallel processing environment. Rather than having one chain randomly sampling the posterior distribution, the ensemble sampler has hundreds of small chains sampling at once. These features should allow the sampler to converge on good parameter estimates more quickly than standard MCMC algorithms.
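The stretch move at the core of the affine-invariant ensemble sampler can be sketched as follows. This is a simplified serial version with a toy target density; emcee itself additionally splits the ensemble for parallel updates:

```python
# Simplified serial version of the Goodman & Weare (2010) stretch move
# that underlies emcee.  The target density is a toy placeholder.
import numpy as np

rng = np.random.default_rng(2)

def log_prob(x):
    # Toy correlated 2-D Gaussian target, up to a constant.
    return -0.5 * (x[0] ** 2 + (x[1] - x[0]) ** 2)

def stretch_move_sampler(n_steps=3000, n_walkers=40, ndim=2, a=2.0):
    walkers = rng.normal(0.0, 1.0, size=(n_walkers, ndim))
    lp = np.array([log_prob(w) for w in walkers])
    chain = []
    for _ in range(n_steps):
        for k in range(n_walkers):
            # Pick a complementary walker j != k.
            j = (k + rng.integers(1, n_walkers)) % n_walkers
            # Stretch factor z drawn from g(z) ~ 1/sqrt(z) on [1/a, a].
            z = ((a - 1.0) * rng.uniform() + 1.0) ** 2 / a
            # Propose a point on the line through walkers k and j.
            proposal = walkers[j] + z * (walkers[k] - walkers[j])
            lp_prop = log_prob(proposal)
            # Acceptance includes the z^(ndim-1) volume factor.
            if np.log(rng.uniform()) < (ndim - 1) * np.log(z) + lp_prop - lp[k]:
                walkers[k], lp[k] = proposal, lp_prop
        chain.append(walkers.copy())
    return np.array(chain)

chain = stretch_move_sampler()
```

Because proposals are built from the positions of other walkers, the move automatically adapts to linear correlations between parameters, which is the source of the sampler's affine invariance.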


Evaluating Fitness

**Tyler - This is where we'll talk about SPIKy and stuff. I'm working on this part now.**



\[SYNC = \frac{1}{M}\sum_{k=1}^{M}C_k\]

\[C_i^{(n,m)} = \left\{\begin{matrix}
1 & \text{if } \min_j\left(\left|t_i^{(n)}-t_j^{(m)}\right|\right) < \tau_{i,j}\\
0 & \text{otherwise}
\end{matrix}\right.\]
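A sketch of how the coincidence criterion above could be evaluated for a pair of spike trains, assuming a fixed coincidence window tau rather than the adaptive, spike-pair-dependent window used in practice:

```python
# Sketch of the spike-coincidence measure for two spike trains (times in
# ms), using a fixed coincidence window tau instead of an adaptive one.
import numpy as np

def coincidences(train_a, train_b, tau):
    # C_i = 1 if the nearest spike in the other train lies within tau.
    return np.array([1.0 if np.min(np.abs(train_b - t)) < tau else 0.0
                     for t in train_a])

def sync(train_a, train_b, tau=2.0):
    # Average the coincidence indicators over all M spikes of both trains.
    c = np.concatenate([coincidences(train_a, train_b, tau),
                        coincidences(train_b, train_a, tau)])
    return c.mean()

a = np.array([10.0, 50.0, 90.0])
b = np.array([11.0, 49.0, 200.0])
value = sync(a, b)   # 4 of the 6 spikes have a partner within 2 ms
```

A value of 1 means every spike in each train has a matching spike in the other train; a value of 0 means no spikes coincide.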



Twin Experiments



Results

not sure we have much to go here yet
  • Bobby: Maybe break this into an emcee and a GA section? All I have ready is stuff from the 1D twin data.

Genetic Algorithm Results


To assess the capabilities of a genetic algorithm similar to the one used by Lynch and Houghton (2015), we generated a 2400ms spike train derived from a DSTRF HR-neuron model with known parameters (table X). For the r(t) parameter, we convolved Gaussian random noise with a filter h(t) presented
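That stimulus-generation step can be sketched as follows; the exponential filter shape and the 1 ms time step are illustrative placeholders, not the actual h(t) used in the twin experiment:

```python
# Sketch of generating a surrogate stimulus r(t) by convolving Gaussian
# white noise with a filter h(t).  The exponential-decay filter and the
# 1 ms time step are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(3)

dt = 1.0                                    # time step, ms
t = np.arange(0.0, 2400.0, dt)              # 2400 ms of stimulus
noise = rng.normal(0.0, 1.0, size=t.size)   # Gaussian white noise

# Placeholder filter: exponential decay with a ~50 ms time constant.
t_h = np.arange(0.0, 200.0, dt)
h = np.exp(-t_h / 50.0)

# r(t) = (noise * h)(t), truncated to the stimulus length.
r = np.convolve(noise, h, mode="full")[:t.size] * dt
```

The convolution smooths the white noise on the time scale of the filter, producing a temporally correlated input like that seen by a sensory neuron downstream of its receptive-field filter.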