Solving Differential Equations in Julia
2019-07-22, 08:30–12:00, PH 103N

This workshop is for both experienced DifferentialEquations.jl users and newcomers. The first hour of the workshop will introduce the user to DifferentialEquations.jl, describing the basic workflow and the special features (automatic sparsity detection, Jacobian coloring, polyalgorithms, etc.) designed to make solving hard equations easy. After the introduction, the workshop will break out into groups to work on exercises, with the developers of the library's components available for questions. Some of the exercises are designed for beginners to learn how to solve differential equations and fit models to data, while others are for experienced users to learn the newest performance-enhancement features and upgrade to GPU-accelerated workflows.
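For newcomers, the basic workflow mentioned above can be sketched as follows. This is a minimal illustrative example, not part of the workshop materials: the equation, parameter values, and solver choice are placeholders, and it assumes DifferentialEquations.jl is installed.

```julia
using DifferentialEquations

# Define the ODE du/dt = p*u as a plain Julia function.
f(u, p, t) = p * u

# Problem: initial condition u0 = 1.0, timespan (0, 1), parameter p = -0.5.
prob = ODEProblem(f, 1.0, (0.0, 1.0), -0.5)

# Solve with an explicit Runge-Kutta method; `sol` is a continuous
# solution object supporting dense-output interpolation.
sol = solve(prob, Tsit5())
u_mid = sol(0.5)  # interpolated value at t = 0.5
```

The same `problem -> solve -> solution` pattern carries over to SDEs, DAEs, and the other problem types covered in the exercises.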

The exercises are described as follows:

  • Exercise 1 takes the user through defining a biological system with stochasticity, utilizing EnsembleProblems to compute 95% bounds on the solution, and performing Bayesian parameter estimation.
  • Exercise 2 takes the user through defining a hybrid differential equation, that is, a differential equation with events, and using adjoints to perform gradient-based parameter estimation.
  • Exercise 3 takes the user through differential-algebraic equation (DAE) modeling, the concept of index, and using both mass-matrix and implicit ODE representations.
  • Exercise 4 takes the user through optimizing a PDE solver, utilizing automatic sparsity pattern recognition, automatic conversion of numerical codes to symbolic codes for analytical construction of the Jacobian, preconditioned GMRES, and setting up a solver for IMEX and GPUs.
  • Exercise 5 focuses on a parameter sensitivity study, utilizing GPU-based ensemble solvers to quickly train a surrogate model to perform global parameter optimization.
  • Exercise 6 takes the user through training a neural stochastic differential equation, using GPU-acceleration and adjoints through Flux.jl's neural network framework to build efficient training codes.
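As a taste of the ensemble techniques in Exercise 1, a basic EnsembleProblem with solution bounds might be set up like this. This is a hedged sketch under stated assumptions: the model, the initial-condition perturbation, and the trajectory count are all illustrative, not the workshop's actual exercise.

```julia
using DifferentialEquations

# A toy scalar model standing in for the biological system.
f(u, p, t) = p[1] * u
prob = ODEProblem(f, 1.0, (0.0, 1.0), [-0.5])

# Each ensemble trajectory perturbs the initial condition to mimic uncertainty.
prob_func(prob, i, repeat) = remake(prob, u0 = 1.0 + 0.1 * randn())

eprob = EnsembleProblem(prob, prob_func = prob_func)
sim = solve(eprob, Tsit5(), EnsembleThreads(), trajectories = 100)

# Summarize the ensemble with its mean and 5%/95% quantile bounds on a time grid.
summ = EnsembleSummary(sim, 0.0:0.1:1.0; quantiles = [0.05, 0.95])
```

Swapping `EnsembleThreads()` for a GPU-based ensemble backend is the kind of upgrade Exercise 5 explores.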