2025-07-23, Main Room 2
State-space models (SSMs) are powerful tools for modeling the time series data that arise naturally in neuroscience, finance, and engineering. These models assume that observations arise from a hidden latent sequence, and they encompass methods such as Hidden Markov Models (HMMs) and Linear Dynamical Systems (LDS). We introduce StateSpaceDynamics.jl, an open-source, modular Julia package designed to be fast, readable, and self-contained, for the express purpose of making a broad family of SSMs easy to fit.
Advances in systems neuroscience have enabled the collection of massive, multivariate time series datasets, in which simultaneous recordings from hundreds to thousands of neurons are increasingly common. Interpreting these high-dimensional recordings presents a significant challenge. Recent modeling approaches suggest that neural activity can be effectively characterized by sets of low-dimensional latent variables. Accordingly, there is a growing need for models that combine dimensionality reduction with temporal dynamics, for which state-space models (SSMs) provide a natural framework.
While advanced SSM implementations exist in Python, such as the SSM package from the Linderman lab, the Julia programming language lacks an equivalent library that meets the needs of modern neuroscientists. Existing Julia offerings, such as StateSpaceModels.jl, are limited to Gaussian observation models. This fundamental limitation precludes the analysis of the non-Gaussian observations that are so common in neuroscience, where spike counts often follow Poisson or other discrete distributions.
To address these limitations, we have developed StateSpaceDynamics.jl, which employs direct maximization of the complete-data log-likelihood for LDS models (Paninski et al., 2010). By exploiting the block-tridiagonal structure of the Hessian of the log-likelihood with respect to the latent states, this approach computes the Kalman smoother exactly in O(T) time. It also generalizes naturally to other observation noise models, such as Poisson and Bernoulli.
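The direct-maximization idea can be sketched in a few lines. The following Python sketch is illustrative only — the scalar model and all variable names are assumptions for the example, not the package's Julia API. It builds the tridiagonal system that sets the gradient of the complete-data log-likelihood to zero for a scalar linear-Gaussian LDS, solves it in O(T) with the Thomas algorithm, and checks that the resulting MAP path matches a classical Kalman filter plus RTS smoother:

```python
import numpy as np

# Hypothetical scalar LDS (illustrative, not the package API):
#   x_t = a*x_{t-1} + w_t,  w_t ~ N(0, q),  x_1 ~ N(0, p0)
#   y_t = c*x_t + v_t,      v_t ~ N(0, r)
rng = np.random.default_rng(0)
a, c, q, r, p0, T = 0.95, 1.0, 0.1, 0.5, 1.0, 200
x = np.zeros(T)
x[0] = rng.normal(0.0, np.sqrt(p0))
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
y = c * x + rng.normal(0.0, np.sqrt(r), T)

# The complete-data log-likelihood is quadratic in the latent path, so the
# smoothed (MAP) path solves H m = b, where H is (block-)tridiagonal.
d = np.full(T, 1.0 / q + a * a / q + c * c / r)   # diagonal of H
d[0] = 1.0 / p0 + a * a / q + c * c / r
d[-1] = 1.0 / q + c * c / r
e = np.full(T - 1, -a / q)                        # off-diagonal of H
b = c * y / r

# Thomas algorithm: O(T) solve exploiting the tridiagonal structure.
cp = np.zeros(T - 1); dp = np.zeros(T)
cp[0] = e[0] / d[0]; dp[0] = b[0] / d[0]
for t in range(1, T):
    denom = d[t] - e[t - 1] * cp[t - 1]
    if t < T - 1:
        cp[t] = e[t] / denom
    dp[t] = (b[t] - e[t - 1] * dp[t - 1]) / denom
m_direct = np.zeros(T)
m_direct[-1] = dp[-1]
for t in range(T - 2, -1, -1):
    m_direct[t] = dp[t] - cp[t] * m_direct[t + 1]

# Classical Kalman filter + RTS smoother, for comparison.
mf = np.zeros(T); pf = np.zeros(T); mp = np.zeros(T); pp = np.zeros(T)
for t in range(T):
    mp[t] = a * mf[t - 1] if t > 0 else 0.0       # predicted mean
    pp[t] = a * a * pf[t - 1] + q if t > 0 else p0  # predicted variance
    k = pp[t] * c / (c * c * pp[t] + r)           # Kalman gain
    mf[t] = mp[t] + k * (y[t] - c * mp[t])
    pf[t] = (1.0 - k * c) * pp[t]
ms = mf.copy()
for t in range(T - 2, -1, -1):
    g = pf[t] * a / pp[t + 1]                     # RTS smoother gain
    ms[t] = mf[t] + g * (ms[t + 1] - mp[t + 1])

print(np.max(np.abs(m_direct - ms)))              # agreement up to round-off
```

For multivariate latents, the scalars become d×d blocks and the Thomas recursion becomes a block-tridiagonal solve, preserving the O(T) scaling. For Poisson or Bernoulli observations the objective is no longer exactly quadratic, and Newton iterations over the same block-tridiagonal structure replace the single solve.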
Additionally, StateSpaceDynamics.jl implements a variational expectation-maximization (vEM) method (Ghahramani and Hinton, 2000) that enables the learning of hierarchical SSMs, such as switching linear dynamical systems (sLDS). These models have gained traction in the study of neural dynamics, especially as they relate to movement preparation and execution.
Lastly, StateSpaceDynamics.jl provides implementations of discrete state-space models, such as Hidden Markov Models (HMMs), along with the ability to fit them using the expectation-maximization algorithm. While these models are not the primary development target of the package, and implementations already exist in HiddenMarkovModels.jl (Dalle, 2024), they are necessary building blocks for the hierarchical models described above.
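As a reminder of what expectation-maximization for an HMM involves, here is a minimal Python sketch — illustrative names and a toy model only, not the package's API — that fits a two-state, two-symbol HMM with Baum-Welch: a scaled forward-backward E-step followed by closed-form M-step updates, with the data log-likelihood increasing at every iteration:

```python
import numpy as np

# Toy 2-state, 2-symbol HMM; all names here are illustrative Python.
rng = np.random.default_rng(1)
A_true = np.array([[0.9, 0.1], [0.2, 0.8]])   # state-transition matrix
B_true = np.array([[0.8, 0.2], [0.3, 0.7]])   # emission probabilities
T = 500
z = np.zeros(T, dtype=int)
obs = np.zeros(T, dtype=int)
for t in range(T):
    if t > 0:
        z[t] = rng.choice(2, p=A_true[z[t - 1]])
    obs[t] = rng.choice(2, p=B_true[z[t]])

# Initial guesses for EM.
A = np.array([[0.6, 0.4], [0.4, 0.6]])
B = np.array([[0.6, 0.4], [0.4, 0.6]])
pi = np.array([0.5, 0.5])
lls = []
for it in range(50):
    # E-step: scaled forward-backward recursions.
    like = B[:, obs].T                        # like[t, k] = p(y_t | z_t = k)
    alpha = np.zeros((T, 2)); scale = np.zeros(T)
    alpha[0] = pi * like[0]
    scale[0] = alpha[0].sum(); alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * like[t]
        scale[t] = alpha[t].sum(); alpha[t] /= scale[t]
    beta = np.ones((T, 2))
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (like[t + 1] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)          # p(z_t | y_{1:T})
    xi = alpha[:-1, :, None] * A[None] * (like[1:] * beta[1:])[:, None, :]
    xi /= xi.sum(axis=(1, 2), keepdims=True)           # p(z_t, z_{t+1} | y)
    lls.append(np.log(scale).sum())                    # log p(y_{1:T})
    # M-step: closed-form updates from expected sufficient statistics.
    pi = gamma[0]
    A = xi.sum(axis=0)
    A /= A.sum(axis=1, keepdims=True)
    for s in range(2):
        B[:, s] = gamma[obs == s].sum(axis=0)
    B /= B.sum(axis=1, keepdims=True)

print(f"log-likelihood: {lls[0]:.2f} -> {lls[-1]:.2f}")
```

The same E-step/M-step alternation, with the forward-backward pass replaced by structured variational updates, underlies the vEM procedure for hierarchical models such as the sLDS.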
By providing these features, StateSpaceDynamics.jl fills a critical gap in the Julia ecosystem, offering modern computational neuroscientists the tools necessary to model complex neural data with state-space models that incorporate both dimensionality reduction and temporal dynamics.
Computational Neuroscience Ph.D. candidate at Boston University, currently working in the DePasquale and Scott labs. Focused on the role of astrocytes in brain computation and on the development of novel state-space models.
Ph.D. candidate in Biomedical Engineering at Boston University, co-advised by Drs. Brian DePasquale and Michael Economo.