2026-08-12, Room 6
Chaotic neural dynamics resist equation discovery because sensitive dependence on parameters creates intractable optimization landscapes. Using the SciML ecosystem, we combine prediction-error methods with universal differential equations (PEM-UDE) and multiple shooting to tame chaos during learning. From spiking networks, we learn novel mean-field equations for sparse cortical connectivity that predict frequency shifts and synchrony patterns, validated against intracranial recordings.
Understanding how neural populations generate brain rhythms requires equations that bridge the biophysics of individual neurons with macroscopic network dynamics. Next-generation neural mass models (NGNMMs) achieve this analytically but rely on the assumption of all-to-all connectivity between neurons, a condition that holds only in deep brain structures like the hippocampus but fails badly in the cortex, where connectivity is typically 1-5%. Removing this assumption analytically has proven contentious, with competing approaches yielding inconsistent results.
We sidestep the analytical difficulty entirely by learning the governing equations directly from data using scientific machine learning. Our PEM-UDE approach combines the prediction-error method with universal differential equations to discover equations from chaotic neural time series. The key insight is that the PEM correction removes the sensitive dependence on parameters that makes chaotic systems intractable for standard UDE training. This effectively smooths the loss landscape while preserving the correct solution. After training, we extract symbolic equations via sparse regression (STLSQ) or genetic algorithm methods.
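To make the mechanism concrete, here is a minimal sketch of the prediction-error idea on the classic Lorenz system, written in Python/SciPy purely for self-containment (the actual work uses the Julia SciML stack; the observer gain `k`, the observed coordinate, and all parameter values below are illustrative choices, not the authors' settings):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lorenz system: a standard stand-in for a chaotic neural time series.
def lorenz(t, u, sigma, rho, beta):
    x, y, z = u
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# "Data": a reference trajectory from the true parameters.
t_eval = np.linspace(0, 8, 801)
true_p = (10.0, 28.0, 8.0 / 3.0)
ref = solve_ivp(lorenz, (0, 8), [1.0, 0.0, 0.0], args=true_p,
                t_eval=t_eval, rtol=1e-9, atol=1e-9)

def x_data(t):
    # Observed coordinate, interpolated from the recording.
    return np.interp(t, t_eval, ref.y[0])

def pem_rhs(t, u, sigma, rho, beta, k):
    # PEM idea: feed the measurement error back into the vector field,
    # pulling the simulated state toward the data (k = 0 recovers the
    # free-running model).
    du = lorenz(t, u, sigma, rho, beta)
    du[0] += k * (x_data(t) - u[0])
    return du

# Simulate with a slightly perturbed parameter. Without correction the
# chaotic trajectory decorrelates from the data; with feedback it stays
# pinned to it, so the loss varies smoothly with the parameters.
bad_p = (10.0, 28.5, 8.0 / 3.0)
free = solve_ivp(pem_rhs, (0, 8), [1.0, 0.0, 0.0], args=(*bad_p, 0.0),
                 t_eval=t_eval, rtol=1e-9, atol=1e-9)
pem = solve_ivp(pem_rhs, (0, 8), [1.0, 0.0, 0.0], args=(*bad_p, 20.0),
                t_eval=t_eval, rtol=1e-9, atol=1e-9)

err_free = np.max(np.abs(free.y[0] - ref.y[0]))
err_pem = np.max(np.abs(pem.y[0] - ref.y[0]))
print(f"max |x error|: free-running {err_free:.2f}, PEM-corrected {err_pem:.3f}")
```

Note that when the parameters are exactly right, the correction term vanishes along the data, so the true system remains a solution; meanwhile the feedback suppresses the exponential divergence that makes the unassisted loss landscape intractable.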
We have already demonstrated PEM-UDE on benchmark chaotic systems (the Rössler attractor and the Petrzela-Polak circuit with 5x noise) and applied it to learn novel NGNMMs from populations of Izhikevich neurons with connectivity ranging from 5% to 100% (arXiv:2507.03631). The learned equations include correction terms that capture how sparsity modifies the firing-rate and voltage dynamics. Here, we extend the original method by combining PEM with multiple shooting (arXiv:2602.21588) to handle longer time series, regime transitions, and mixed excitatory-inhibitory populations with heterogeneous neuron parameters.
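Multiple shooting can be sketched in the same spirit: split the series into short windows, give each window its own free initial state, and penalize both the data misfit and the mismatch at the seams, so no single simulation runs long enough to become ill-conditioned. The toy fit below uses a damped oscillator (again a Python/SciPy stand-in for the Julia implementation; the segment count, continuity weight, and starting guesses are illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Damped oscillator with unknown frequency omega and damping ratio zeta.
def rhs(t, u, omega, zeta):
    x, v = u
    return [v, -2 * zeta * omega * v - omega**2 * x]

t = np.linspace(0, 8, 161)
true_p = (2.0, 0.1)
data = solve_ivp(rhs, (t[0], t[-1]), [1.0, 0.0], args=true_p,
                 t_eval=t, rtol=1e-8).y

n_seg, seg_len = 8, 20  # 8 segments of 20 intervals each (161 points)

def residuals(theta):
    omega, zeta = theta[:2]
    inits = theta[2:].reshape(n_seg, 2)  # per-segment free initial states
    res = []
    for s in range(n_seg):
        i0, i1 = s * seg_len, (s + 1) * seg_len
        ts = t[i0:i1 + 1]
        sol = solve_ivp(rhs, (ts[0], ts[-1]), inits[s],
                        args=(omega, zeta), t_eval=ts, rtol=1e-8)
        res.append((sol.y - data[:, i0:i1 + 1]).ravel())  # data misfit
        if s < n_seg - 1:
            # Continuity penalty: each segment must end where the next begins.
            res.append(10.0 * (sol.y[:, -1] - inits[s + 1]))
    return np.concatenate(res)

# Initialize parameters away from the truth; seed the segment states
# from the data itself, as is standard for multiple shooting.
theta0 = np.concatenate([[1.5, 0.3],
                         data[:, ::seg_len][:, :n_seg].T.ravel()])
fit = least_squares(residuals, theta0)
omega_hat, zeta_hat = fit.x[:2]
print(f"recovered omega = {omega_hat:.3f}, zeta = {zeta_hat:.3f}")
```

The same decomposition is what lets training cope with long recordings and regime transitions: each window only has to track the dynamics locally, while the continuity penalties stitch the windows into one consistent trajectory.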
We will present and discuss how the neural-network architecture, the prediction-error correction, and the multiple-shooting parameters affect training speed and accuracy. Our longer-term goal is an automated toolchain for fitting UDEs to data.
Finally, we will present progress on using UDEs to reduce a more complex neural circuit model (Nature Communications 17(390), 2026). This work is motivated by the fact that large-scale spiking-neuron simulations are impractical for fitting experimental data. By reducing the dimensionality of the system from tens of thousands of states to tens, we should be able to fit our models to data from individual patients.
I am an Associate Professor in the Biomedical Engineering Department at Stony Brook University, with affiliate positions at the Martinos Center for Biomedical Imaging at MGH/Harvard Medical School and at the Julia Lab at MIT/CSAIL. I currently lead the development of Neuroblox.jl, a Julia package to design, simulate, and analyze dynamic models of the brain. Our effort is built on top of ModelingToolkit.jl, but we also develop our own, sometimes more efficient, algorithms for building graphs of dynamical motifs (we recently released GraphDynamics.jl).
Dr. Chris Rackauckas is the VP of Modeling and Simulation at JuliaHub, the Director of Scientific Research at Pumas-AI, Co-PI of the Julia Lab at MIT, and the lead developer of the SciML Open Source Software Organization. His work in mechanistic machine learning is credited with a 15,000x acceleration of NASA Launch Services simulations and recently demonstrated a 60x-570x acceleration over Modelica tools in HVAC simulation, earning him the US Air Force Artificial Intelligence Accelerator Scientific Excellence Award. See more at https://chrisrackauckas.com/. He is also the lead developer of the Pumas project and has received a top presentation award at each of the last three ACoP meetings for improving methods for uncertainty quantification, automated GPU acceleration of nonlinear mixed-effects modeling (NLME), and machine-learning-assisted construction of NLME models with DeepNLME. For these achievements, he received the Emerging Scientist award from ISoP.