2019-07-22, PH 103N
This workshop is for both experienced DifferentialEquations.jl users and newcomers. The first hour will introduce the user to DifferentialEquations.jl, describing the basic workflow and the special features designed to make solving hard equations easy (automatic sparsity detection, Jacobian coloring, polyalgorithms, etc.); a minimal sketch of that basic workflow follows this paragraph. After the introduction, the workshop will break out into groups to work on exercises, with the developers of the library's components available for questions. Some of the exercises are designed for beginners learning how to solve differential equations and fit models to data, while others let experienced users learn the newest performance-enhancement features and upgrade to GPU-accelerated workflows.
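The sketch below uses a scalar logistic ODE and illustrative tolerances as placeholders, not the workshop's actual example:

```julia
using DifferentialEquations

# Illustrative scalar ODE (logistic growth); not the workshop's actual model
f(u, p, t) = p.r * u * (1 - u / p.K)
prob = ODEProblem(f, 0.1, (0.0, 10.0), (r = 1.5, K = 1.0))

# solve() defaults to a reasonable method; here Tsit5() is requested explicitly
sol = solve(prob, Tsit5(), abstol = 1e-8, reltol = 1e-8)
sol(2.5)  # continuous (dense) output: interpolate the solution at any time
```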
The exercises are described as follows (a short code sketch for each appears after the list):
- Exercise 1 takes the user through defining the same biological system with stochasticity, utilizing EnsembleProblems to understand 95% bounds on the solution, and performing Bayesian parameter estimation.
- Exercise 2 takes the user through defining a hybrid differential equation, that is, a differential equation with events, and using adjoints to perform gradient-based parameter estimation.
- Exercise 3 takes the user through differential-algebraic equation (DAE) modeling, the concept of index, and using both mass-matrix and implicit ODE representations.
- Exercise 4 takes the user through optimizing a PDE solver, utilizing automatic sparsity pattern recognition, automatic conversion of numerical codes to symbolic codes for analytical construction of the Jacobian, preconditioned GMRES, and setting up a solver for IMEX and GPUs.
- Exercise 5 focuses on a parameter sensitivity study, utilizing GPU-based ensemble solvers to quickly train a surrogate model to perform global parameter optimization.
- Exercise 6 takes the user through training a neural stochastic differential equation, using GPU acceleration and adjoints through Flux.jl's neural network framework to build efficient training codes.
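For Exercise 1, a minimal sketch of the ensemble part of the workflow, assuming a generic two-state model with diagonal multiplicative noise in place of the workshop's specific biological system (the Bayesian estimation step, e.g. via DiffEqBayes.jl, is omitted):

```julia
using DifferentialEquations

# Illustrative drift and diagonal multiplicative noise; stand-ins for the
# workshop's biological model
function drift!(du, u, p, t)
    du[1] =  p[1]*u[1] - p[2]*u[1]*u[2]
    du[2] = -p[3]*u[2] + p[4]*u[1]*u[2]
end
noise!(du, u, p, t) = (du .= 0.1 .* u)

prob  = SDEProblem(drift!, noise!, [1.0, 1.0], (0.0, 10.0), [1.5, 1.0, 3.0, 1.0])
eprob = EnsembleProblem(prob)

# Solve many trajectories in parallel, then summarize 95% bounds over time
sim  = solve(eprob, SOSRI(), EnsembleThreads(), trajectories = 1000)
summ = EnsembleSummary(sim; quantiles = [0.025, 0.975])
```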
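For Exercise 2, a sketch of the event-handling half: a continuous callback that re-doses a decaying state whenever it crosses a threshold. The decay model and threshold are hypothetical, and the adjoint-based parameter estimation is left to the exercise:

```julia
using DifferentialEquations

# Hypothetical exponential decay model
decay!(du, u, p, t) = (du[1] = -p[1]*u[1])
prob = ODEProblem(decay!, [10.0], (0.0, 24.0), [0.5])

# Event: when u[1] crosses 1.0, apply a discrete "dose" to the state
condition(u, t, integrator) = u[1] - 1.0
affect!(integrator) = (integrator.u[1] += 9.0)
cb = ContinuousCallback(condition, affect!)

sol = solve(prob, Tsit5(), callback = cb)
```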
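For Exercise 3, a sketch of the mass-matrix representation using the classic Robertson equations (a standard index-1 DAE example, not necessarily the one used in the exercise); the implicit `DAEProblem` form is the other representation the exercise covers:

```julia
using DifferentialEquations

# Robertson equations: two differential equations plus one algebraic constraint
function rober!(du, u, p, t)
    y1, y2, y3 = u
    k1, k2, k3 = p
    du[1] = -k1*y1 + k3*y2*y3
    du[2] =  k1*y1 - k3*y2*y3 - k2*y2^2
    du[3] =  y1 + y2 + y3 - 1.0   # conservation law as an algebraic equation
end

M = [1.0 0.0 0.0
     0.0 1.0 0.0
     0.0 0.0 0.0]   # singular mass matrix encodes the constraint row

f    = ODEFunction(rober!, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1e5), (0.04, 3e7, 1e4))
sol  = solve(prob, Rodas5())
```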
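For Exercise 4, a sketch of handing a sparsity pattern and Jacobian coloring to a stiff solver, using a tridiagonal 1D diffusion stencil as a stand-in PDE discretization; the symbolic Jacobian construction, preconditioned GMRES, IMEX, and GPU steps are left to the exercise:

```julia
using DifferentialEquations, SparseArrays, SparseDiffTools

# Method-of-lines discretization of 1D diffusion: a stand-in stiff system
const N = 32
function heat!(du, u, p, t)
    du[1] = u[2] - 2u[1]
    @inbounds for i in 2:N-1
        du[i] = u[i-1] - 2u[i] + u[i+1]
    end
    du[N] = u[N-1] - 2u[N]
end

# Declare the (tridiagonal) sparsity pattern and a coloring so the stiff solver
# only differentiates the nonzero Jacobian entries
jac_sparsity = spdiagm(-1 => ones(N-1), 0 => ones(N), 1 => ones(N-1))
colors = matrix_colors(jac_sparsity)
f = ODEFunction(heat!, jac_prototype = jac_sparsity, colorvec = colors)

prob = ODEProblem(f, rand(N), (0.0, 1.0))
sol  = solve(prob, TRBDF2())
```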
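For Exercise 5, a sketch of a GPU-batched ensemble over sampled parameters, assuming DiffEqGPU.jl's `EnsembleGPUArray` backend; the surrogate training and global optimization steps are left to the exercise:

```julia
using DifferentialEquations, DiffEqGPU

# Lorenz system in Float32, written with scalar indexing so it can be compiled
# into a GPU kernel
function lorenz!(du, u, p, t)
    du[1] = p[1]*(u[2] - u[1])
    du[2] = u[1]*(p[2] - u[3]) - u[2]
    du[3] = u[1]*u[2] - p[3]*u[3]
end

u0 = Float32[1.0, 0.0, 0.0]
p  = Float32[10.0, 28.0, 8/3]
prob = ODEProblem(lorenz!, u0, (0.0f0, 10.0f0), p)

# Each trajectory gets a randomly perturbed parameter set
prob_func = (prob, i, repeat) -> remake(prob, p = rand(Float32, 3) .* p)
eprob = EnsembleProblem(prob, prob_func = prob_func)

# Batch all trajectories into one GPU-resident solve; swap EnsembleThreads()
# back in to run the same study on the CPU
sim = solve(eprob, Tsit5(), EnsembleGPUArray(), trajectories = 10_000, saveat = 1.0f0)
```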
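For Exercise 6, a sketch of the forward pass of a neural SDE: the drift is a small Flux network whose parameters are flattened with `Flux.destructure` so the solver can carry them. The network sizes and noise level are hypothetical, and the loss, adjoint sensitivity choice, and GPU transfer are left to the exercise:

```julia
using DifferentialEquations, Flux

# Hypothetical drift network; diffusion kept as fixed multiplicative noise
drift_nn = Chain(Dense(2, 16, tanh), Dense(16, 2))
p_nn, re = Flux.destructure(drift_nn)   # flat parameter vector + rebuilder

drift!(du, u, p, t)     = (du .= re(p)(u))
diffusion!(du, u, p, t) = (du .= 0.1f0 .* u)

u0   = Float32[1.0, 1.0]
prob = SDEProblem(drift!, diffusion!, u0, (0.0f0, 1.0f0), p_nn)
sol  = solve(prob, SOSRI(), saveat = 0.1f0)

# Training then differentiates a loss on `sol` with respect to `p_nn` via
# adjoint sensitivities and updates the parameters with a Flux optimiser
```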
Chris's research and software combine AI with differential equation models of human organs to give patients accurate, personalized drug doses, reducing pain and complications for patients while reducing treatment costs for hospitals.
Chris Rackauckas is an applied mathematics instructor at the Massachusetts Institute of Technology and a senior research analyst in the Center for Translational Medicine at the University of Maryland School of Pharmacy. Chris's recent work is focused on bringing personalized medicine to standard medical practice through the proliferation of mathematical software. His work developing the DifferentialEquations.jl solver suite, along with over a hundred other Julia packages, not only earned him the inaugural Julia Community Prize and front-page features on tech community sites, but also serves as the foundation of the PuMaS.jl package for Pharmaceutical Modeling and Simulation, set to release in March 2019. Chris's work with PuMaS makes it possible to predict the optimal medication dosage for individuals, reducing the costs and potential complications associated with treatments. The software is currently being tested in the administration of treatment for neonatal abstinence syndrome (NAS), an opioid withdrawal disorder in newborn babies. NAS requires medically administered morphine doses every four hours to prevent the infants from experiencing withdrawal symptoms. PuMaS is being used to predict personalized safe dosage regimens by incorporating realistic biological models (quantitative systems pharmacology) and deep learning into the traditional nonlinear mixed effects (NLME) modeling framework. This software and its methodology are also being tested in clinical trials at Johns Hopkins University for their ability to predict an individual's drug response to vancomycin and automatically prescribe optimal doses directly from a patient's health records.
Chris started this work while completing his Master's and Ph.D. at the University of California, Irvine, where he was awarded the Mathematical and Computational Biology institutional fellowship, the Graduate Dean's Fellowship, the National Science Foundation's Graduate Research Fellowship, the Ford Predoctoral Fellowship, the NIH T32 Predoctoral Training Grant, and the Data Science Initiative Summer Fellowship. His research with his advisor, Dr. Qing Nie, focused on methods for simulating stochastic biological models and on how the randomness inherent in biological organisms can be controlled using stochastic analysis. Chris bridged the gap between theory and practice by keeping a "wet lab bench" in Dr. Thomas Schilling's lab, where these methodologies were tested on zebrafish. Fluorescence lifetime imaging microscopy (FLIM) measurements of retinoic acid in the zebrafish hindbrain showed that the predicted control proteins could attenuate inherent biological randomness. The result was a verified mathematical theory for controlling the randomness in biological signaling. Chris received the Kovalevsky Outstanding Ph.D. Thesis Award from the Department of Mathematics upon graduation and was showcased in the interview "Interdisciplinary Case Study: How Mathematicians and Biologists Found Order in Cellular Noise" in iScience.
As an undergraduate at Oberlin College, Chris was awarded the NSF S-STEM scholarship and the American Crystallographic Association's Margaret C. Etter Student Lecturer Award, an honor usually given for Ph.D. dissertations, for his work on (3+1)-dimensional incommensurate crystal structure identification of H-acid. The award was given in Service Crystallography for the work's potential impact on industrial dye manufacturing.