JuliaSim: Machine Learning Accelerated Modeling and Simulation
2021-07-30, Green

Julia is known for its speed, but how can you keep making things faster after all of the standard code optimization tricks run out? The answer is machine-learned reduced-order and approximate models. JuliaSim is an extension to the Julia SciML ecosystem for automatically generating machine learning surrogates that accurately reproduce model behavior.


In this talk we will showcase how you can take your existing ModelingToolkit.jl models and automate model order reduction of their components. By hooking into the hierarchical modeling ecosystem, the same surrogate can be reused across many models without retraining. We will show the benefits of this process on energy-efficient building design, where neural surrogates of HVAC models have accelerated simulation by orders of magnitude over the Dymola Modelica implementation. We will demo simultaneous translation and acceleration of components designed outside of Julia through JuliaSim's ability to ingest Functional Mock-up Units (FMUs) from Modelica and Simulink, along with domain-specific modeling definitions like SPICE netlists of electrical circuits and Pumas pharmacometric models. Similarly, this system can generate digital twins of real objects from their measurements, allowing one to quickly incorporate components with less physical understanding directly from data. We will show a JuliaHub-based parallelized training platform that offloads the training process to the cloud, letting engineers pull pre-accelerated models from the ever-growing JuliaSim Model Store directly into their Julia-based designs for fast exploration. Together this will leave the audience ready to integrate ML-accelerated modeling and simulation tools into their workflows.
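To make the surrogatization workflow concrete, here is a minimal sketch using the open-source ModelingToolkit.jl, OrdinaryDiffEq.jl, and Surrogates.jl packages: a toy first-order component is sampled over one parameter and replaced by a cheap radial basis surrogate. The model, parameter range, and quantity of interest are illustrative assumptions for this sketch, not JuliaSim's own (proprietary) API.

    # Minimal sketch: surrogatize a ModelingToolkit.jl component over a parameter.
    # Assumptions: toy first-order system, RadialBasis surrogate, x at t = 10 as
    # the quantity of interest. Not JuliaSim's actual API.
    using ModelingToolkit, OrdinaryDiffEq, Surrogates

    @parameters τ
    @variables t x(t)
    D = Differential(t)

    @named sys = ODESystem([D(x) ~ (1 - x) / τ], t)
    sys = structural_simplify(sys)

    # Sample the parameter space and solve the full model at each sample
    lb, ub = 0.5, 5.0
    τ_samples = Surrogates.sample(100, lb, ub, SobolSample())
    qoi = map(τ_samples) do τval
        prob = ODEProblem(sys, [x => 0.0], (0.0, 10.0), [τ => τval])
        sol = solve(prob, Tsit5())
        sol[x][end]
    end

    # Fit a cheap surrogate that stands in for the full simulation
    surrogate = RadialBasis(collect(τ_samples), qoi, lb, ub)
    surrogate(2.0)   # fast approximate evaluation at a new parameter value

In a hierarchical model, a trained surrogate like this would be swapped in for the expensive component (an HVAC unit, an imported FMU, etc.) so the rest of the design exploration runs against the fast approximation.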

Chris Rackauckas is an Applied Mathematics Instructor at MIT and the Director of Scientific Research at Pumas-AI. He is the lead developer of the SciML open source scientific machine learning organization, which develops widely used software for scientific modeling and inference. Among these is DifferentialEquations.jl, whose innovative solvers won an IEEE Outstanding Paper Award and the inaugural Julia Community Prize. Chris' work on high performance differential equation solving is seen in many applications, from the MIT-Caltech CliMA climate modeling initiative to the SIAM DSWeb award-winning DynamicalSystems.jl toolbox. Chris is also the creator of Pumas, the foundational software of Pumas-AI for nonlinear mixed effects modeling in clinical pharmacology. These efforts on Pumas led to the International Society of Pharmacometrics (ISoP) Mathematical and Computational Special Interest Group Award at the American Conference on Pharmacometrics (ACoP) 2019 for his work on improved clinical dosing via Koopman expectations, along with the ACoP 2020 Quality Award for his work on GPU-accelerated nonlinear mixed effects modeling via generation of SPMD programs. For this work, Chris received the Emerging Scientist award from ISoP in 2020, the highest early career award in pharmacometrics.
