
Introduction

What is the big picture question?

What are the current methods available for addressing this question?

What are the limitations of these methods?

Recently, Lynch and Houghton (2015) developed a method to predict the in vivo responses of auditory neurons using an estimated spectrotemporal receptive field (STRF) filter combined with an integrate-and-fire neuron model based on the Izhikevich neuron (Izhikevich, 2003). This model adds an adaptive parameter to the integrate-and-fire neuron for enhanced biophysical realism. Optimization is performed iteratively on the parameters of the filter and the parameters of the neuron model using a genetic algorithm in which fitness is evaluated by the van Rossum distance metric (van Rossum, 2001).

Lynch and Houghton’s method fills a clear need in the neuroinformatics field and moves forward the possibilities for neuron modeling in in vivo electrophysiology research. There exist many avenues for further improvement, including models with more biologically interpretable parameters and improved optimization algorithms.

In this paper, we propose a dynamical systems-based neuron model combined with an STRF filter that provides superior prediction accuracy with a computationally efficient optimization algorithm. This method is based on a Hindmarsh-Rose (HR) neuron model (Hindmarsh & Rose, 1984), which strikes a balance between the limited parameter sets of the integrate-and-fire models and the biological realism of the Hodgkin-Huxley ion current models (Hodgkin & Huxley, 1952). The feature space is explored by emcee (Foreman-Mackey et al., 2013), a Python implementation of Markov chain Monte Carlo (MCMC), to find a local optimum using the computationally efficient SPIKE synchronization metric (Kreuz et al., 2015) as the fitness function. This combination of algorithms is first tested on simulated data with known parameters and then validated on a real data set recorded in vivo from auditory neurons in the zebra finch (Taeniopygia guttata).


Methods


The Hindmarsh-Rose Model

The Hindmarsh-Rose model is a simple model of neuronal activity that allows for complex behaviors such as bursting and chaotic spiking (Hindmarsh & Rose, 1984). It is described by three coupled first-order differential equations with eight parameters:

ẋ = y - ax^3 + bx^2 - z + I

ẏ = c - dx^2 - y

ż = r(s(x - x_0) - z)



All variables are dimensionless. In these equations, x is the membrane potential, y is the recovery variable, and z is the slow adaptation current. The parameters a, b, c, and d model the ion channels that produce action potentials, with a and b determining spike shape and c and d determining spike frequency. The parameter r forces z to adjust slowly relative to x and y, while s controls the tendency to burst and x_0 is the resting potential. Finally, I is an applied current from outside the neuron, such as from a patch clamp.
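As a concrete illustration, the three equations above can be integrated with a simple forward Euler scheme. This is only a sketch: the parameter values below are commonly used illustrative choices for the bursting regime, not fits to any data set in this paper.

```python
import numpy as np

def simulate_hr(a=1.0, b=3.0, c=1.0, d=5.0, r=0.006, s=4.0, x0=-1.6,
                I=3.0, dt=0.01, steps=50_000):
    """Forward-Euler integration of the Hindmarsh-Rose equations.

    Returns the membrane potential trace x(t). Parameter defaults are
    illustrative values for the bursting regime, not fitted values.
    """
    x, y, z = x0, 0.0, 0.0            # start near the resting potential
    trace = np.empty(steps)
    for i in range(steps):
        dx = y - a * x**3 + b * x**2 - z + I
        dy = c - d * x**2 - y
        dz = r * (s * (x - x0) - z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        trace[i] = x
    return trace

v = simulate_hr()
# Count upward crossings of an arbitrary spike threshold
spikes = int(np.sum((v[1:] >= 1.0) & (v[:-1] < 1.0)))
```

Plotting v shows the characteristic bursts of spikes separated by quiescent periods as the slow variable z waxes and wanes.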



The HR-neuron model exhibits chaotic characteristics that allow for multiple possible variations in spike times, shape, and quantity. These chaotic dynamics are well studied for many possible parameter combinations (Storace, Linaro, & de Lange, 2008; Shilnikov & Kolomiets, 2008).


Researchers have previously demonstrated that both genetic algorithms and MCMC methods are capable of estimating the parameters of chaotic systems (Loskutov, Molkov, Mukhin, & Feigin, 2007; Peng, Liu, Zhang, & Wang, 2009).


**Next: I think we'll want to talk about extending the HR model with an STRF**

Twin studies with 1D data
  • Bobby: For the sake of the final I am putting the 1D stuff I have for GA here and in the results.

Genetic Algorithm

Mathematical models such as the HR neuron provide researchers a framework to understand and predict qualities of a given system of interest. Researchers have developed an augmented HR-neuron model that includes a sensory filter (DSTRF).
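As a sketch of the idea, the loop below runs a minimal genetic algorithm (truncation selection, averaging crossover, Gaussian mutation) on a toy fitness function. The "true" parameter vector and the fitness function here are hypothetical stand-ins; in the actual procedure, fitness would be a spike-train similarity between simulated and recorded responses.

```python
import numpy as np

rng = np.random.default_rng(1)
TARGET = np.array([1.0, 3.0, 0.006])     # hypothetical "true" (a, b, r)

def fitness(theta):
    """Toy fitness: negative squared distance to the target parameters.
    A real run would instead score spike-train similarity (e.g. the van
    Rossum distance) between model output and recorded responses."""
    return -np.sum((theta - TARGET) ** 2)

pop = rng.normal(0.0, 2.0, size=(40, 3))          # random initial population
for generation in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-10:]]         # selection: keep the best 10
    parents = elite[rng.integers(0, 10, size=(40, 2))]
    pop = parents.mean(axis=1)                    # crossover: average two parents
    pop += rng.normal(0.0, 0.05, size=pop.shape)  # mutation: small Gaussian jitter

best = pop[np.argmax([fitness(ind) for ind in pop])]
```

After a hundred generations the population clusters near the target vector; with a spike-train fitness in place of the toy one, the same loop structure applies.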


Affine Invariant MCMC

*Tyler - Will need to add in some generic information on how MCMC works, I guess?*

We also estimated neuron and filter parameters using a Markov chain Monte Carlo (MCMC) technique. MCMC provides some distinct advantages over other parameter estimation methods, such as variational methods. First, MCMC provides an estimate of the full posterior distribution of the parameters rather than just a single value, as with genetic algorithms; having the posterior distribution is useful for drawing inferences about the uncertainty of the parameter estimates. Second, MCMC allows for a Bayesian approach to estimating the parameters: prior knowledge or beliefs about the parameters may be used in the estimation procedure and then updated by the data. Finally, MCMC is simple to run in parallel on multi-core or multi-processor computers, which allows for significant reductions in run time.


To sample from the posterior distribution, we used emcee, a Python package implementing an affine-invariant ensemble sampler (Foreman-Mackey et al., 2013). The affine-invariant sampler is insensitive to covariances between parameters and requires tuning far fewer hyperparameters than standard MCMC algorithms (Goodman & Weare, 2010). Further, the ensemble method used by the sampler was designed to run in a parallel processing environment: rather than one chain randomly sampling the posterior distribution, the ensemble sampler has hundreds of small chains sampling at once. These features should allow the sampler to converge on good parameter estimates more quickly than standard MCMC algorithms.


Evaluating Fitness

**Tyler - This is where we'll talk about SPIKy and stuff. I'm working on this part now.**



Twin Experiments


Results

Not sure we have much to go here yet.
  • Bobby: Maybe break this into an emcee section and a GA section? All I have ready is stuff from the 1D twin data.

Figures


Let's get down some ideas for the figures we want. Don't worry about the order for now, we can rearrange.

Figure 1:

Figure 2:

Figure 3:

Figure 4:

Figure 5:

Figure 6:

Figure 7:

Figure 8:

References

  1. Foreman-Mackey, D., Hogg, D. W., Lang, D., & Goodman, J. (2013). emcee: The MCMC Hammer. PASP, 125, 306–312.
  2. Goodman, J., & Weare, J. (2010). Ensemble samplers with affine invariance. Communications in Applied Mathematics and Computational Science, 5(1), 65–80.
  3. Hindmarsh, J. L., & Rose, R. M. (1984). A model of neuronal bursting using three coupled first order differential equations. Proc. R. Soc. London, Ser. B, 221, 87–102.
  4. Hodgkin, A., & Huxley, A. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol., 117, 500–544.
  5. Holland, J. H. (1973). Genetic algorithms and the optimal allocation of trials. SIAM J. Comput., 2, 88–105.
  6. Izhikevich, E. M. (2003). Simple model of spiking neurons. IEEE Transactions on Neural Networks, 14, 1569–1572.
  7. Kreuz, T., Mulansky, M., & Bozanic, N. (2015). SPIKY: A graphical user interface for monitoring spike train synchrony. J. Neurophysiol., 113, 3432.
  8. Loskutov, E., Molkov, Y., Mukhin, D., & Feigin, A. (2007). Markov chain Monte Carlo method in Bayesian reconstruction of dynamical systems from noisy chaotic time series. Phys. Rev. E, 77(6), 066214.
  9. Lynch, E. P., & Houghton, C. J. (2015). Parameter estimation of neuron models using in-vitro and in-vivo electrophysiological data. Frontiers in Neuroinformatics, 9, 1–15.
  10. Peng, B., Liu, B., Zhang, F., & Wang, L. (2009). Differential evolution algorithm-based parameter estimation for chaotic systems. Chaos, Solitons & Fractals, 39(5), 2110–2118.
  11. Shilnikov, A., & Kolomiets, M. (2008). Methods of the qualitative theory for the Hindmarsh–Rose model: A case study. A tutorial. International Journal of Bifurcation and Chaos, 18(8), 2141–2168.
  12. Storace, M., Linaro, D., & de Lange, E. (2008). The Hindmarsh–Rose neuron model: Bifurcation analysis and piecewise-linear approximations. Chaos, 18, 033128.
  13. van Rossum, M. C. W. (2001). A novel spike distance. Neural Computation, 13, 751–763.