JuliaCon 2020 (times are in UTC)

DynamicPPL: Stan-like Speed for Dynamic Probabilistic Models
07-30, 18:30–19:00 (UTC), Purple Track

We present DynamicPPL.jl, a modular library providing a lightning-fast infrastructure for probabilistic programming and Bayesian inference, used in Turing.jl. DynamicPPL enables Turing to achieve C/Stan-like speeds for Bayesian inference involving static and dynamic models alike. Besides run-time speed, DynamicPPL provides a user-friendly domain-specific language for defining and then querying probabilistic models.


We present the preliminary high-level design and features of DynamicPPL.jl (https://github.com/TuringLang/DynamicPPL.jl), a modular library providing a lightning-fast infrastructure for probabilistic programming, used as a backend for Turing.jl (https://github.com/TuringLang/Turing.jl). Besides computational performance that is often close to or better than Stan's, DynamicPPL provides an intuitive domain-specific language (DSL) that allows the rapid development of complex dynamic probabilistic programs. Being written entirely in Julia, a high-level dynamic programming language for numerical computing, DynamicPPL inherits a rich set of features from the Julia ecosystem. Since DynamicPPL is a modular, stand-alone library, any probabilistic programming system written in Julia, such as Turing.jl, can use DynamicPPL to specify models and trace their model parameters. The main features of DynamicPPL are: 1) a metaprogramming-based DSL for specifying dynamic models using an intuitive tilde-based notation; 2) a tracing data structure for tracking random variables in dynamic probabilistic models; 3) a rich contextual dispatch system allowing tailored behaviour during model execution; and 4) a user-friendly syntax for probabilistic queries. Finally, we show in a variety of experiments that DynamicPPL, in combination with Turing.jl, achieves computational performance that is often close to or better than Stan's.
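As a minimal sketch of the tilde-based DSL (feature 1 above): a simple Beta-Bernoulli coin-flip model in Turing.jl, which uses DynamicPPL's `@model` macro under the hood. The model name and data here are invented for illustration.

```julia
using Turing  # re-exports DynamicPPL's @model macro and tilde notation

# Each `~` statement either declares a random variable to be sampled (p)
# or conditions on observed data (y[i]); DynamicPPL traces both.
@model function coinflip(y)
    p ~ Beta(1, 1)               # prior on the success probability
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)      # likelihood for each observation
    end
end

# Condition on data and draw posterior samples with the NUTS sampler.
chain = sample(coinflip([1, 0, 1, 1, 1]), NUTS(), 1_000)
```

Because the loop bounds may depend on runtime values, the same notation supports dynamic models in which the number of random variables varies between executions.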

I am a PhD student at UNSW Canberra working on topology optimization. I have two seemingly unrelated interests that I hope to combine one day: topology optimization and Bayesian inference. This talk is all about the latter, so let's focus on that. I joined the TuringLang development team one to two years ago. Since then, my main goal has been to make Turing as fast as possible without sacrificing usability. In this talk, I share some of the work I have done towards that goal and give an overview of the present and future of Turing and DynamicPPL, including features, short-term goals, and bottlenecks.

GitHub: https://github.com/mohamed82008