JuliaCon 2020 (times are in UTC)

Doing Scientific Machine Learning (SciML) With Julia
2020-07-26 , Red Track

Scientific machine learning combines differentiable programming, scientific simulation, and machine learning in order to impose physical constraints on machine learning and automatically learn biological models. Given the composability of Julia, it is positioned as the best language for this set of numerical techniques, but how does one actually "do" SciML? This workshop gets your hands dirty.

Join via Zoom: link in email from Eventbrite. Backup YouTube link: https://youtu.be/QwVO0Xh2Hbg


In this workshop we'll dive into some of the latest techniques in scientific machine learning, including Universal Differential Equations (Universal Differential Equations for Scientific Machine Learning), Physics-Informed Neural Networks (Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations), and Sparse Identification of Nonlinear Dynamics (SINDy; Discovering governing equations from data by sparse identification of nonlinear dynamical systems). The goal is to familiarize participants with what these methods are, what kinds of problems they solve, and how to use Julia packages to implement them.
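To make the last of these concrete: SINDy fits a sparse coefficient matrix Ξ so that measured derivatives are explained by a library of candidate terms, Ẋ ≈ Θ(X)Ξ. The workshop uses DataDrivenDiffEq.jl for this; the short pure-Julia sketch below shows the sequentially thresholded least-squares idea at its core (the function name `stlsq` and the toy term library are illustrative, not the package's API):

```julia
using LinearAlgebra

# Sequentially thresholded least squares, the core loop of SINDy:
# given a library matrix Θ and derivatives Ẋ, find sparse Ξ with Ẋ ≈ ΘΞ.
function stlsq(Θ, Ẋ; λ = 0.1, iters = 10)
    Ξ = Θ \ Ẋ                      # initial dense least-squares fit
    for _ in 1:iters
        small = abs.(Ξ) .< λ       # zero out coefficients below the threshold
        Ξ[small] .= 0
        for k in axes(Ẋ, 2)        # refit the surviving terms, one equation at a time
            big = .!small[:, k]
            Ξ[big, k] = Θ[:, big] \ Ẋ[:, k]
        end
    end
    Ξ
end

# Toy example: recover ẋ = -2x from noisy data with library Θ = [1  x  x²]
x  = range(-1, 1; length = 50)
Θ  = hcat(ones(50), x, x .^ 2)
ẋ = -2 .* x .+ 0.001 .* randn(50)
Ξ  = stlsq(Θ, reshape(ẋ, :, 1))   # ≈ [0, -2, 0]: only the linear term survives
```

Thresholding followed by refitting is what makes the recovered model sparse and interpretable, rather than a dense regression over every candidate term.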

The workshop will jump right into how to model the missing part of a physical simulation, describe how universal approximators (neural networks) can be used in this context, and show how to transform such problems into an optimization problem that is then accelerated by specialized automatic differentiation. The set of packages involved is somewhat intense, using many tools from JuliaDiffEq (DiffEqFlux.jl, DifferentialEquations.jl, DiffEqSensitivity.jl, ModelingToolkit.jl, NeuralPDE.jl, DataDrivenDiffEq.jl, Surrogates.jl, etc.) combined with machine learning tools (Flux.jl), differentiation tooling (SparseDiffTools.jl, Zygote.jl, ForwardDiff.jl, ReverseDiff.jl, etc.), and optimization tooling (JuMP, Optim.jl, Flux.jl, NLopt.jl, etc.), all spun together in a glorious soup that automatically discovers physical laws at the end of the day. Thus this workshop has something to offer everyone: new users of Julia will get a nice overview of the unique composability of the Julia package ecosystem, while experienced Julia users will learn how to bridge an area they are comfortable with (such as machine learning) to a whole new set of phenomena. Meanwhile, even those who are only knee-deep in coding can gain a lot from learning these new mathematical advances, meaning that even a casual observer likely has a lot to learn!
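As a flavor of what "modeling the missing part of a simulation" looks like in code, here is a schematic universal differential equation: a Lotka-Volterra system in which one interaction term is replaced by a neural network to be trained against data. The API names follow DiffEqFlux.jl as of 2020 (`FastChain`, `sciml_train`), and `data` stands in for a matrix of measurements, so treat this as a hedged sketch rather than a copy-paste recipe:

```julia
using DifferentialEquations, DiffEqFlux, Flux

# Small neural network that will stand in for the unknown physics
NN = FastChain(FastDense(2, 16, tanh), FastDense(16, 1))
p  = initial_params(NN)

function ude!(du, u, p, t)
    x, y = u
    du[1] = 1.5x - x*y            # known mechanistic term kept as-is
    du[2] = NN(u, p)[1] - 3.0y    # neural network learns the missing x*y interaction
end

prob = ODEProblem(ude!, [1.0, 1.0], (0.0, 10.0), p)

# Loss: distance between the simulated trajectory and the measurements in `data`
loss(p) = sum(abs2, Array(solve(prob, Tsit5(); p = p, saveat = 0.1)) .- data)
res = DiffEqFlux.sciml_train(loss, p, ADAM(0.01); maxiters = 300)
```

The gradient of the loss with respect to `p` is computed by adjoint sensitivity analysis through the ODE solver, which is the specialized automatic differentiation the paragraph above refers to.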

Christopher Rackauckas is an Applied Mathematics Instructor at the Massachusetts Institute of Technology and a Senior Research Analyst at the University of Maryland, Baltimore, School of Pharmacy in the Center for Translational Medicine. Chris's research is focused on numerical differential equations and scientific machine learning with applications from climate to biological modeling. He is the developer of many core numerical packages for the Julia programming language, including DifferentialEquations.jl, for which he won the inaugural Julia community prize, and Pumas.jl for pharmaceutical modeling and simulation. He is the lead developer for the SciML open source scientific machine learning software organization, along with its packages such as DiffEqFlux.jl and NeuralPDE.jl.
