Efficient optimization of ODE neurons using gradient descent

Abstract

Neuroscientists fit simulations of morphologically and biophysically detailed neurons to data, often using evolutionary algorithms. However, such gradient-free approaches are computationally expensive, making convergence slow when neuron models have many parameters. Here we introduce a gradient-based algorithm using differentiable ODE solvers that scales well to high-dimensional problems. GPUs make parallel simulations fast, and gradient calculations make optimization fast. We verify the usefulness of our approach by optimizing neuron models with active dendrites and heterogeneously distributed ion channel densities. We find that individually stimulating and recording all dendritic compartments makes such models identifiable. Identification degrades gracefully as fewer stimulation and recording sites are provided. Differentiable neuron models, which should be added to popular neuron simulation packages, promise a new era of optimizable neuron models with many free parameters, a key feature of real neurons.
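The core idea of the talk, fitting ODE neuron parameters by gradient descent through the solver rather than by gradient-free search, can be illustrated with a minimal sketch. This is not the speaker's implementation: it uses a toy single-compartment leaky neuron, a hand-unrolled Euler solver, and forward sensitivity equations in place of reverse-mode autodiff; all names and constants are illustrative.

```python
def simulate_with_sensitivity(g, C=1.0, E=-70.0, I=2.0, dt=0.1, steps=200):
    """Euler-integrate dV/dt = (-g*(V - E) + I)/C together with the
    forward sensitivity s = dV/dg, which obeys ds/dt = (-(V - E) - g*s)/C.
    Returns the voltage trace and its sensitivity to the conductance g."""
    V, s = E, 0.0
    Vs, ss = [], []
    for _ in range(steps):
        dV = (-g * (V - E) + I) / C
        ds = (-(V - E) - g * s) / C
        V += dt * dV
        s += dt * ds
        Vs.append(V)
        ss.append(s)
    return Vs, ss

def loss_and_grad(g, target):
    """Squared error between simulated and target traces, and its exact
    gradient w.r.t. g obtained from the sensitivity trace (chain rule)."""
    Vs, ss = simulate_with_sensitivity(g)
    loss = sum((v - t) ** 2 for v, t in zip(Vs, target))
    grad = sum(2.0 * (v - t) * s for v, t, s in zip(Vs, target, ss))
    return loss, grad

# Synthetic "recording" generated from a ground-truth conductance of 0.3.
target, _ = simulate_with_sensitivity(0.3)

# Gradient descent recovers the conductance from the voltage trace alone.
g, lr = 1.0, 1e-6
for _ in range(500):
    loss, grad = loss_and_grad(g, target)
    g -= lr * grad
```

In the full approach described in the abstract, automatic differentiation through a differentiable ODE solver plays the role of the hand-derived sensitivity equation here, and GPU parallelism scales this from one parameter to many thousands.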

Date: May 16, 2024, 10:00 AM
Location: 19700 Helix Drive, Ashburn, Virginia 20147

Presentation at HHMI Janelia’s Computation and Theory Seminar at the Janelia Research Campus on May 16th, 2024.

Ilenna Jones
Research Fellow

My research interests include neuronal biophysics, dendritic computation, and neuroscience for AI.