2019-07-23, 16:45–17:15, Elm B
Modeling practice seems to be partitioned into scientific models defined by mechanistic differential equations and machine learning models defined by parameterizations of neural networks. While the ability of interpretable mechanistic models to extrapolate from little information is seemingly at odds with the big-data "model-free" approach of neural networks, the next step in scientific progress is to utilize these methodologies together in order to emphasize their strengths while mitigating their weaknesses. In this talk we will describe four separate ways that we are merging differential equations and deep learning through the power of the DifferentialEquations.jl and Flux.jl libraries. Data-driven hypothesis generation of model structure, automated real-time control of dynamical systems, accelerated PDE solving, and memory-efficient deep learning workflows will all be shown to derive from this common computational structure of differential equations mixed with neural networks. The audience will leave with a new appreciation of how these two disciplines can benefit from one another, and of how neural networks can be used for more than just data analysis.
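As a minimal illustration of this common computational structure — a differential equation solve embedded inside a differentiable program — the following sketch differentiates a Lotka-Volterra solve with respect to its parameters. It uses ForwardDiff.jl in place of Flux's reverse-mode AD purely to keep the example short; the model, parameter values, and loss function are illustrative, not the talk's actual examples.

```julia
# Sketch: differentiating through an ODE solve (illustrative model and values).
# ForwardDiff is used here as a stand-in for Flux's automatic differentiation.
using DifferentialEquations, ForwardDiff

# Out-of-place Lotka-Volterra predator-prey model
function lotka(u, p, t)
    α, β, δ, γ = p
    [α*u[1] - β*u[1]*u[2],    # prey: growth minus predation
     δ*u[1]*u[2] - γ*u[2]]    # predator: growth minus death
end

u0 = [1.0, 1.0]
p  = [1.5, 1.0, 3.0, 1.0]

# Scalar loss: total prey population sampled on a time grid.
# The initial condition is promoted to eltype(p) so dual numbers propagate.
function loss(p)
    prob = ODEProblem(lotka, eltype(p).(u0), (0.0, 10.0), p)
    sol = solve(prob, Tsit5(), saveat=0.1)
    sum(sol[1, :])
end

g = ForwardDiff.gradient(loss, p)   # sensitivities w.r.t. α, β, δ, γ
```

Because the solver itself is written in generic Julia, the same `solve` call participates in differentiation with no special-purpose sensitivity code.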
Dynamical models are often interesting because of the high-level qualitative behavior they display. Differential equation descriptions of fluids accurately predict when drone flight will go unstable, and stochastic evolution models demonstrate how patterns may suddenly emerge from biological chemical reactions. However, utilizing these models in practice requires the ability to understand, predict, and control these outcomes. Traditional nonlinear control methods directly tie the complexity of the simulation to the control optimization process, making it difficult to apply them in real time to highly detailed but computationally expensive models.
In this talk we will show how to decouple the computation time of a model from the ability to predict and control its qualitative behavior through a mixture of differential equation and machine learning techniques. These new methods directly utilize the language-wide differentiable programming provided by Flux.jl to perform automatic differentiation on differential equation models described using DifferentialEquations.jl. We demonstrate an adaptive data generation technique and show that common classification methods from the machine learning literature converge to >99% accuracy for determining qualitative model outcomes directly from the parameters of the dynamical model. Using a modification of methods from Generative Adversarial Networks (GANs), we demonstrate an inversion technique with the ability to predict dynamical parameters that meet user-chosen objectives. This method is shown to determine parameters which constrain predator-prey models to a specific chosen domain, and to predict chemical reaction rates that result in Turing patterns for reaction-diffusion partial differential equations. Code examples will be shown and explained so that Julia users can directly apply these new techniques. Together, these methods provide scalable, real-time computational tools for predicting and controlling the relationship between dynamical systems and their qualitative outcomes, with many possible applications.
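The classification idea — learning qualitative outcomes directly from model parameters — can be sketched as follows. The labeling criterion (whether the prey population ever dips below a threshold), the parameter sampling ranges, the network architecture, and the training loop are all hypothetical stand-ins chosen for brevity; they are not the adaptive data-generation method described in the talk.

```julia
# Sketch: a neural classifier that maps dynamical parameters to a qualitative
# outcome label. Criterion, ranges, and hyperparameters are illustrative only.
using DifferentialEquations, Flux

lotka(u, p, t) = [p[1]*u[1] - p[2]*u[1]*u[2],
                  p[3]*u[1]*u[2] - p[4]*u[2]]

# Hypothetical qualitative label: does prey ever dip below 0.1?
function outcome(p)
    prob = ODEProblem(lotka, [1.0, 1.0], (0.0, 10.0), p)
    sol = solve(prob, Tsit5(), saveat=0.1)
    minimum(sol[1, :]) < 0.1 ? 1 : 2
end

# Random parameter draws (uniform over illustrative ranges)
ps = [rand(4) .* [2.0, 2.0, 4.0, 2.0] for _ in 1:200]
X  = hcat(ps...)                         # 4×200 feature matrix
ys = Flux.onehotbatch(outcome.(ps), 1:2) # labels from simulation

# Small classifier from parameters to outcome probabilities
model = Chain(Dense(4, 32, relu), Dense(32, 2), softmax)
loss(x, y) = Flux.crossentropy(model(x), y)
Flux.train!(loss, Flux.params(model), Iterators.repeated((X, ys), 200), ADAM())

accuracy = sum(Flux.onecold(model(X)) .== Flux.onecold(ys)) / length(ps)
```

Once trained, the classifier predicts the qualitative regime of a parameter set without running the simulation, which is what decouples prediction from the model's computation time.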
Lyndon White, Mike Innes