Neural Ordinary Differential Equations with DiffEqFlux
2019-07-24, Room 349

This talk will demonstrate the models described in Neural Ordinary Differential Equations (Chen et al., 2018), implemented in DiffEqFlux.jl, using DifferentialEquations.jl to solve ODEs whose dynamics are specified and trained with Flux.jl. In particular, it will show how to use gradient-based optimization with the adjoint method to train a neural network that parameterizes an ODE, both for supervised learning and for Continuous Normalizing Flows. These demonstrations will be contributed to the Flux model-zoo.
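
As a rough sketch of what this looks like, the following fits a neural ODE to a toy trajectory using the mid-2019 DiffEqFlux API (the `neural_ode` helper; later versions rename this to `NeuralODE`). The dynamics, layer sizes, and hyperparameters are illustrative assumptions, not the talk's actual code:

```julia
using Flux, DiffEqFlux, DifferentialEquations

# Toy target data: a trajectory of a known 2-D linear ODE.
u0 = Float32[2.0; 0.0]
tspan = (0.0f0, 1.5f0)
t = range(tspan[1], tspan[2], length = 30)
true_A = Float32[-0.1 2.0; -2.0 -0.1]
ode_data = Array(solve(ODEProblem((u, p, t) -> true_A * u, u0, tspan),
                       Tsit5(), saveat = t))

# Dynamics du/dt = f(u) parameterized by a small neural network.
dudt = Chain(Dense(2, 50, tanh), Dense(50, 2))
ps = Flux.params(dudt)

# neural_ode solves the ODE defined by the network; gradients with
# respect to ps flow back through the solve via the adjoint method.
predict() = neural_ode(dudt, u0, tspan, Tsit5(), saveat = t)
loss() = sum(abs2, ode_data .- predict())

# Each "batch" is empty: the loss closes over the fixed trajectory.
Flux.train!(loss, ps, Iterators.repeated((), 300), ADAM(0.1))
```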

The supervised learning demonstration will illustrate that neural ODEs can be drop-in replacements for residual networks on supervised tasks such as image recognition.
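
That claim rests on the observation, made in the paper, that a residual block is one explicit Euler step of an ODE. A minimal illustration (the layer width is an arbitrary assumption):

```julia
using Flux

f = Dense(10, 10, tanh)      # learned dynamics f(h)

# A residual block computes h .+ f(h), i.e. one Euler step of
# dh/dt = f(h) with step size 1. A neural ODE replaces a fixed stack
# of such steps with an adaptive solver integrating f over time.
resblock(h) = h .+ f(h)
```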

The Continuous Normalizing Flow demo will show how a neural ODE, using the instantaneous change of variables formula, can learn a continuous transformation from a tractable base distribution to a distribution over data that can both be sampled from and have its density evaluated.
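
For reference, the instantaneous change of variables result from the paper: if the state evolves as dz/dt = f(z(t), t), then its log-density evolves as

```latex
\frac{\partial \log p(z(t))}{\partial t} = -\operatorname{tr}\!\left(\frac{\partial f}{\partial z(t)}\right)
```

so the solver integrates the state and its log-density change together, replacing the log-determinant Jacobian of discrete normalizing flows with a trace.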


Co-authors

Jesse Bettencourt is a graduate student in the Machine Learning group at the University of Toronto and the Vector Institute. He is supervised by David Duvenaud and Roger Grosse and teaches the senior undergraduate/graduate course on probabilistic models and machine learning.