We also estimated the neuron and filter parameters using a Markov chain Monte Carlo (MCMC) technique. MCMC offers several distinct advantages over other parameter estimation methods. First, MCMC yields an estimate of the full posterior distribution of the parameters rather than the single point estimate a genetic algorithm provides; the posterior distribution is useful for drawing inferences from the uncertainty in the parameter estimates. Second, MCMC permits a Bayesian approach to estimation: prior knowledge or beliefs about the parameters can be incorporated into the procedure and updated by the data. Finally, MCMC is straightforward to run in parallel on multi-core or multi-processor computers, which significantly reduces run time.
To sample from the posterior distribution, we used emcee, a Python package implementing an affine-invariant ensemble sampler (Foreman-Mackey et al. 2013). The affine-invariant sampler is insensitive to covariances between parameters and requires tuning far fewer hyper-parameters than standard MCMC algorithms (Goodman and Weare 2010). Further, the ensemble method used by the sampler was designed for a parallel processing environment: rather than one chain randomly sampling the posterior distribution, the ensemble sampler runs hundreds of small chains ("walkers") at once. These features should allow the sampler to converge to good parameter estimates more quickly than standard MCMC algorithms.
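The stretch move at the heart of the affine-invariant sampler is simple enough to sketch in pure Python. The function below is an illustrative reimplementation for exposition, not our analysis code and not emcee itself: each walker is updated by stretching it toward or away from a randomly chosen partner walker, following the update rule of Goodman and Weare (2010). The name `stretch_move_sample` and its arguments are our own.

```python
import math
import random

def stretch_move_sample(log_prob, walkers, n_steps, a=2.0, rng=None):
    """Minimal sketch of the Goodman & Weare (2010) stretch move.

    log_prob : callable mapping a parameter list to the log posterior
    walkers  : list of parameter lists (the ensemble's starting positions)
    a        : stretch scale; z is drawn from [1/a, a]
    Returns the flattened chain of ensemble positions after each step.
    """
    rng = rng or random.Random()
    dim = len(walkers[0])
    walkers = [list(w) for w in walkers]
    log_p = [log_prob(w) for w in walkers]
    chain = []
    for _ in range(n_steps):
        for i in range(len(walkers)):
            # Pick a different walker to define the stretch direction.
            j = rng.randrange(len(walkers) - 1)
            if j >= i:
                j += 1
            # Draw z from g(z) proportional to 1/sqrt(z) on [1/a, a]
            # via the inverse CDF.
            z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
            proposal = [walkers[j][d] + z * (walkers[i][d] - walkers[j][d])
                        for d in range(dim)]
            log_p_new = log_prob(proposal)
            # Accept with probability z^(dim-1) * p(new) / p(old).
            log_accept = (dim - 1) * math.log(z) + log_p_new - log_p[i]
            if log_accept >= 0 or rng.random() < math.exp(log_accept):
                walkers[i], log_p[i] = proposal, log_p_new
        chain.extend(list(w) for w in walkers)
    return chain
```

Because each proposal is built from the current positions of other walkers, the move adapts automatically to linear correlations between parameters, which is why only the single scale `a` needs tuning. (emcee additionally splits the ensemble into halves so the walker updates within each half can run in parallel; the serial loop above omits that detail.)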
**Tyler - This is where we'll talk about SPIKy and stuff. I'm working on this part now.**
Genetic Algorithm Results