Fitting Neural Ordinary Differential Equations with DiffEqFlux.jl
07-24, 16:35–17:05 (US/Eastern), Room 349

Neural ordinary differential equations (neural ODEs) are a new and exciting method for modelling nonlinear transformations, combining the fields of machine learning and differential equations. In this talk we discuss DiffEqFlux.jl, a package for designing and training neural ODEs, and introduce new methodologies that improve the efficiency and robustness of neural ODE fitting.


A neural ordinary differential equation (neural ODE) is a differential equation whose right-hand side is a neural network. Neural ODEs can model nonlinear transformations by learning the governing equations directly from time-course data. They therefore offer an elegant approach to time-series modelling, bringing sophisticated differential equation solvers into machine learning, a field where such tools are in high and growing demand.
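
As a minimal sketch, assuming the DiffEqFlux.jl v1-era API (the network architecture, time span, and solver choice here are illustrative, not taken from the talk), a neural ODE can be defined like this:

```julia
using DiffEqFlux, OrdinaryDiffEq

# The ODE's right-hand side du/dt = NN(u, p) is a small neural network.
# (Layer sizes and activation are illustrative choices.)
dudt = FastChain(FastDense(2, 16, tanh),   # hidden layer
                 FastDense(16, 2))         # output layer: the derivative

tspan  = (0.0f0, 1.5f0)
tsteps = range(tspan[1], tspan[2], length = 30)

# NeuralODE wraps the network and a solver into a trainable layer.
node = NeuralODE(dudt, tspan, Tsit5(), saveat = tsteps)

u0  = Float32[2.0, 0.0]   # initial condition (illustrative)
sol = node(u0)            # forward solve with the current parameters node.p
```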

In this talk we discuss DiffEqFlux.jl, a package for designing and training neural ODEs. We demonstrate how to fit a neural ODE to data using an L2 loss, and explain how Julia's automatic differentiation differentiates through the ODE solver to compute the gradients of that loss. While this is the "standard" method, it requires solving an ODE at every optimization step, which can be very time-consuming. We therefore introduce new methodologies available in DiffEqFlux.jl that improve the efficiency and robustness of the fit. First, we demonstrate new functionality provided by a bridge to the two-stage collocation method of DiffEqParamEstim.jl. Second, we show how to use these functions effectively in a mixed training loop to improve the speed and robustness of the fit. Third, we demonstrate and explain a new loss function in DiffEqFlux.jl that enables multiple shooting, and show its performance characteristics. Together, these three features improve the performance and robustness of neural ODE fitting in Julia, allowing it to scale to more practical models and data.
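
To make the workflow concrete, the following hedged sketches continue the definitions above. First, the "standard" solver-in-the-loop fit; the synthetic data and hyperparameters are illustrative assumptions, and `sciml_train` follows the DiffEqFlux.jl v1-era API:

```julia
using Flux  # for the ADAM optimiser

# Synthetic ground-truth data for illustration (hypothetical cubic spiral).
truth(u, p, t) = Float32[-0.1f0 * u[1]^3 + 2.0f0 * u[2],
                         -2.0f0 * u[1] - 0.1f0 * u[2]^3]
ode_data = Array(solve(ODEProblem(truth, u0, tspan), Tsit5(), saveat = tsteps))

# L2 loss: every evaluation solves the neural ODE with parameters p.
loss(p) = sum(abs2, ode_data .- Array(node(u0, p)))

# Gradients are taken through the ODE solver by Julia's automatic
# differentiation, then used by the optimiser.
res = DiffEqFlux.sciml_train(loss, node.p, ADAM(0.05); maxiters = 300)
```

Second, a sketch of two-stage collocation pre-training inside a mixed training loop; `collocate_data` and `EpanechnikovKernel` follow the names in the DiffEqFlux.jl documentation and may vary by version:

```julia
# Stage 1: estimate smoothed states and derivatives from the data alone.
du_est, u_est = DiffEqFlux.collocate_data(ode_data, tsteps,
                                          DiffEqFlux.EpanechnikovKernel())

# Stage 2: regress the network on the estimated derivatives directly;
# no ODE solves occur inside this loss, so each iteration is cheap.
colloc_loss(p) = sum(abs2, du_est .- dudt(u_est, p))

p0 = DiffEqFlux.sciml_train(colloc_loss, node.p, ADAM(0.05);
                            maxiters = 200).minimizer

# Mixed training loop: the collocation estimate p0 warm-starts the more
# expensive solver-based `loss` above.
res2 = DiffEqFlux.sciml_train(loss, p0, ADAM(0.01); maxiters = 100)
```

Third, a sketch of the multiple-shooting loss; the `multiple_shoot` signature follows the DiffEqFlux.jl documentation for later releases, and the group size and continuity weight are illustrative:

```julia
# Split the trajectory into short segments fitted independently, with a
# penalty tying the end of each segment to the start of the next.
prob_node = ODEProblem((u, p, t) -> dudt(u, p), u0, tspan, node.p)

group_size      = 3     # time points per shooting segment (illustrative)
continuity_term = 200   # weight of the segment-joining penalty (illustrative)

ms_loss(p) = multiple_shoot(p, ode_data, tsteps, prob_node,
                            (data, pred) -> sum(abs2, data .- pred),
                            Tsit5(), group_size; continuity_term)

res3 = DiffEqFlux.sciml_train(ms_loss, node.p, ADAM(0.05); maxiters = 300)
```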


Co-authors

Chris Rackauckas, Michael Stumpf

Elisabeth Rösch completed her undergraduate studies in Munich, Germany (B.Sc. in Bioinformatics, Technical University of Munich and Ludwig-Maximilians-University Munich) and her postgraduate studies in London, UK (M.Sc. in Bioinformatics and Theoretical Systems Biology, Imperial College London). She is currently a PhD student in the School of Mathematics and Statistics at the University of Melbourne, Australia. Her research focuses on combining machine learning with mechanistic modelling, applied to theoretical systems biology.