Automated neuron model optimization techniques: The increasing complexity of computational neuron models makes hand tuning of model parameters more difficult than ever. Fortunately, the parallel increase in computer power allows scientists to automate this tuning. Optimization algorithms need two essential components. The first is a function that measures the difference between the output of the model, for a given set of parameters, and the data. This error function (or fitness function) makes it possible to rank different parameter sets. The second component is a search algorithm that explores the parameter space to find the best parameter set in a minimal amount of time. In this review we distinguish three types of error functions: feature-based functions, point-by-point comparison of voltage traces, and multi-objective functions. We then detail several popular search algorithms, including brute-force methods, simulated annealing, genetic algorithms, evolution strategies, differential evolution and particle-swarm optimization. Finally, we briefly describe Neurofitter, a free software package that combines a phase-plane trajectory density fitness function with several search algorithms.
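The two components described above can be illustrated with a minimal sketch, not taken from the review itself: a toy passive-membrane "model" with two free parameters, a point-by-point sum-of-squares error function, and a bare-bones differential evolution (DE/rand/1/bin) search. All names (`simulate`, `error`, `differential_evolution`) and the model itself are hypothetical stand-ins for a real conductance-based model and a production optimizer.

```python
import random

# Toy "neuron model" (hypothetical stand-in for a conductance-based model):
# a membrane voltage relaxing toward rest with time constant tau.
def simulate(v0, tau, n=50, dt=1.0, v_rest=-65.0):
    v, trace = v0, []
    for _ in range(n):
        v += dt * (v_rest - v) / tau
        trace.append(v)
    return trace

# Point-by-point error function: sum of squared differences between
# the model voltage trace and the target data trace.
def error(params, data):
    model = simulate(*params)
    return sum((m - d) ** 2 for m, d in zip(model, data))

# Minimal differential evolution search (DE/rand/1/bin scheme).
def differential_evolution(data, bounds, pop_size=20, gens=100, f=0.8, cr=0.9):
    rng = random.Random(0)  # fixed seed for reproducibility
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [error(ind, data) for ind in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct individuals other than i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = []
            for d in range(dim):
                if rng.random() < cr:
                    x = pop[a][d] + f * (pop[b][d] - pop[c][d])
                else:
                    x = pop[i][d]
                lo, hi = bounds[d]
                trial.append(min(max(x, lo), hi))  # clip to the search box
            s = error(trial, data)
            if s < scores[i]:  # greedy selection: keep the better candidate
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

# Synthetic "recording" generated with known parameters v0=-40, tau=12,
# so we can check that the search recovers them.
target = simulate(-40.0, 12.0)
(best_v0, best_tau), best_err = differential_evolution(
    target, bounds=[(-60.0, -20.0), (1.0, 30.0)])
```

The same structure applies to any of the search algorithms listed in the abstract: only the body of `differential_evolution` changes, while the error function and model simulator stay the same.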

References in zbMATH (referenced in 14 articles)

Sorted by year (citations)

  1. Zhou, Shijie; Lai, Ying-Cheng; Lin, Wei: Stochastically adaptive control and synchronization: from globally one-sided Lipschitzian to only locally Lipschitzian systems (2022)
  2. Burghi, Thiago B.; Schoukens, Maarten; Sepulchre, Rodolphe: Feedback identification of conductance-based models (2021)
  3. Ioan, Daniel; Bărbulescu, Ruxandra; Silveira, Luis Miguel; Ciuprina, Gabriela: Reduced order models of myelinated axonal compartments (2019)
  4. Narayanan, Vignesh; Li, Jr-Shin; Ching, ShiNung: Biophysically interpretable inference of single neuron dynamics (2019)
  5. Moye, Matthew J.; Diekman, Casey O.: Data assimilation methods for neuronal state and parameter estimation (2018)
  6. Coventry, Brandon S.; Parthasarathy, Aravindakshan; Sommer, Alexandra L.; Bartlett, Edward L.: Hierarchical winner-take-all particle swarm optimization social network for neural model fitting (2017)
  7. van der Scheer, H. T.; Doelman, A.: Synapse fits neuron: joint reduction by model inversion (2017)
  8. Tumanova, Natalija; Čiegis, Raimondas; Meilūnas, Mečislavas: Numerical analysis of nonlinear model of excited carrier decay (2013)
  9. Buhry, Laure; Grassia, Filippo; Giremus, Audrey; Grivel, Eric; Renaud, Sylvie; Saïghi, Sylvain: Automated parameter estimation of the Hodgkin-Huxley model using the differential evolution algorithm: application to neuromimetic analog integrated circuits (2011)
  10. Pospischil, Martin; Piwkowska, Zuzanna; Bal, Thierry; Destexhe, Alain: Comparison of different neuron models to conductance-based post-stimulus time histograms obtained in cortical pyramidal cells using dynamic-clamp in vitro (2011)
  11. Fairhurst, D.; Tyukin, I.; Nijmeijer, H.; van Leeuwen, C.: Observers for canonic models of neural oscillators (2010)
  12. Ma, Huan-Fei; Lin, Wei: Nonlinear adaptive synchronization rule for identification of a large amount of parameters in dynamical models (2009)
  13. Jolivet, Renaud; Roth, Arnd; Schürmann, Felix; Gerstner, Wulfram; Senn, Walter: Special issue on quantitative neuron modeling (2008)
  14. Van Geit, W.; De Schutter, E.; Achard, P.: Automated neuron model optimization techniques: a review (2008)
