JuliaCon 2026

NoLimits.jl: A flexible Julia framework for nonlinear, neural and latent-state mixed-effects modeling
2026-08-12, Room 4

NoLimits.jl is a flexible open-source Julia framework for nonlinear modeling and parameter estimation with random effects. It supports ODE-based mechanistic models, hidden Markov models, hybrid mechanistic-machine learning components, normalizing flows, and nested random-effect structures within a unified interface. By leveraging Julia’s composability, it enables scalable frequentist and Bayesian inference beyond the constraints of traditional open-source mixed-effects software.


Nonlinear mixed-effects models are central to longitudinal and hierarchical data analysis. While many established platforms provide robust and production-ready workflows, extending open-source tools to incorporate modern machine learning components, flexible random-effect distributions, or complex latent structures often requires substantial methodological effort.

This talk introduces NoLimits.jl (NOn LInear MIxed effecTS), an open-source Julia framework for nonlinear modeling with random effects, designed for composability and rapid methodological experimentation. The framework enables users to combine mechanistic models (including DifferentialEquations.jl-based ODE systems), hierarchical random effects with multiple nested levels, hidden Markov structures, and differentiable machine learning components within a unified interface.

NoLimits.jl integrates seamlessly with the Julia ecosystem, leveraging Distributions.jl, DifferentialEquations.jl, Lux.jl, Optimization.jl, and Turing.jl for both frequentist and Bayesian inference.
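To give a flavor of how these ecosystem packages compose, here is a minimal sketch of a random-intercept mixed-effects model written directly in Turing.jl and Distributions.jl. This is a hypothetical illustration of the underlying building blocks, not NoLimits.jl's actual interface; all names (`random_intercept`, `η`, `ω`) are chosen for this example.

```julia
# Hypothetical sketch (not NoLimits.jl's interface): a minimal random-intercept
# model in Turing.jl, illustrating how the ecosystem packages compose for
# Bayesian inference over random effects.
using Turing, Distributions

@model function random_intercept(y, subject, n_subjects)
    μ ~ Normal(0, 10)                       # population (fixed) effect
    ω ~ truncated(Normal(0, 1); lower=0)    # between-subject SD
    σ ~ truncated(Normal(0, 1); lower=0)    # residual SD
    η ~ filldist(Normal(0, 1), n_subjects)  # standardized random effects
    for i in eachindex(y)
        # subject-specific mean via non-centered parameterization
        y[i] ~ Normal(μ + ω * η[subject[i]], σ)
    end
end

y = [1.1, 0.9, 2.3, 2.1]
subject = [1, 1, 2, 2]
chain = sample(random_intercept(y, subject, 2), NUTS(), 200; progress=false)
```

The same model specification could equally be handed to Optimization.jl for frequentist (maximum a posteriori or marginal likelihood) estimation, which is the kind of dual frequentist/Bayesian workflow the framework targets.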

The focus of the talk is to present the framework's capabilities and to outline potential improvements that could benefit from community contributions. Using real data examples, I will demonstrate practical modeling workflows, including:
1) Embedding soft decision trees and neural networks as differentiable components alongside mechanistic ODE models,
2) Specifying hidden Markov models within nonlinear mixed-effects structures for individualized latent transition analysis,
3) Extending classical Gaussian random effects using normalizing flows to capture skewed or multimodal heterogeneity.
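As a sketch of the first workflow, the snippet below embeds a Lux.jl neural network as a learned correction term inside an ODE right-hand side, the "universal differential equation" pattern. This is a hypothetical illustration built only from the ecosystem packages listed above, not NoLimits.jl's actual model syntax; `rhs!` and the decay rate `0.5` are placeholders for this example.

```julia
# Hypothetical sketch (not NoLimits.jl's syntax): a Lux.jl network embedded
# as a differentiable correction term alongside a mechanistic decay model.
using Lux, DifferentialEquations, Random

rng = Random.default_rng()
nn = Chain(Dense(1 => 8, tanh), Dense(8 => 1))
ps, st = Lux.setup(rng, nn)   # network parameters and state

function rhs!(du, u, p, t)
    correction, _ = nn([u[1]], p, st)    # learned component
    du[1] = -0.5 * u[1] + correction[1]  # mechanistic decay + correction
end

prob = ODEProblem(rhs!, [1.0], (0.0, 5.0), ps)
sol = solve(prob, Tsit5())
```

Because the network parameters `ps` are the ODE parameters, the whole trajectory is differentiable end to end, which is what allows such hybrid components to sit inside a mixed-effects likelihood.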

I will conclude the talk with an outline of opportunities for collaboration and contribution, as well as future directions, including extending the framework toward federated learning capabilities.

Manuel develops statistical methods and software in Julia, R, and Python, with a focus on nonlinear mixed-effects models, longitudinal data, and federated learning. He began using Julia in 2023 and now builds research software leveraging its composability and automatic differentiation ecosystem. He is the author of Coconots.jl and NoLimits.jl. Manuel is a PhD student in Mathematics in the group of Jan Hasenauer at the University of Bonn.