Differentiate All The Things!
07-24, 15:45–16:15 (US/Eastern), Room 349

Explore Flux's brand-new compiler integration, and how this lets us turn anything in the Julia ecosystem into a machine learning model.


Last JuliaCon I announced the Zygote tool for automatic differentiation (AD) of Julia code. Flux now uses Zygote as its default AD,* enabling both a more elegant interface and all kinds of new models that weren't possible before.
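
For a flavour of that interface, here is a minimal sketch (my own toy example, not taken from the talk) of the kind of call Zygote supports: its `gradient` function differentiates an ordinary Julia function directly, with no tracing step or special array types.

    using Zygote

    f(x) = 3x^2 + 2x + 1          # an ordinary Julia function

    df(x) = gradient(f, x)[1]     # gradient returns a tuple, one entry per argument

    @assert df(2.0) == 14.0       # analytic derivative 6x + 2, evaluated at x = 2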

Flux's new APIs are powerful and let us easily express advanced concepts like backpropagation through time. But really, Julia's power is in its awesome open-source ecosystem, with state-of-the-art tools for differential equations, mathematical optimisation, and even colour theory! Come and see how we can take advantage of all of these tools in machine learning models, enabling "theory-driven" ML to tackle harder problems than ever.
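
As a hedged illustration of the "anything in the ecosystem" idea (again my own toy example, not code from the talk): because Zygote works on ordinary Julia code, library-style functions with loops and control flow can be differentiated as-is and dropped into a model.

    using Zygote

    # An ordinary Julia function with a loop, not written with AD in mind:
    # a truncated Taylor series for exp(x).
    function taylor_exp(x, n)
        term, total = one(x), one(x)
        for k in 1:n
            term *= x / k
            total += term
        end
        return total
    end

    # Zygote differentiates straight through the loop; d/dx exp(x) = exp(x).
    g = gradient(x -> taylor_exp(x, 20), 1.0)[1]
    @assert isapprox(g, exp(1.0); atol = 1e-6)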

  * In theory; I write this from 4 months in the past, so who knows.

I work at Julia Computing on all kinds of Julia things – mainly on turning Julia into a language for differentiable programming, via the Flux machine learning stack.