Deep learning has grown steadily, and there has been rising interest from various groups in incorporating ML techniques into their modelling via differentiable programming. Software 2.0, as it is known, will need a large pool of tools to realise its goals. In this talk, we will discuss how the Flux.jl stack, together with Zygote and next-generation AD tooling, is already enabling differentiable programming in a variety of domains, and we will tour the packages and projects taking part in it.
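To ground the idea, here is a minimal sketch (illustrative, not taken from the talk) of what differentiable programming with Zygote looks like: an ordinary Julia function is differentiated directly, with no special tensor types or graph construction.

```julia
using Zygote  # the reverse-mode AD that powers Flux

# Any plain Julia function can be differentiated directly.
f(x) = 3x^2 + 2x + 1

# `gradient` returns one entry per argument; here f'(x) = 6x + 2.
gradient(f, 2.0)  # (14.0,)
```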
Machine learning has come a long way in the past decade. With differentiable programming, we have seen renewed interest from numerous communities in applying ML techniques to diverse fields through scientific machine learning. Traditional deep learning, meanwhile, has made great strides with larger, more compute-intensive models that need increasingly complex training routines, pushing the boundaries of the current state of the art.
In this talk, we will survey the machine learning and differentiable programming ecosystem in Julia through the FluxML stack. We will discuss the tools and features the ecosystem makes available to users, and the next-generation tooling required to make even more expressive modelling possible in Julia.
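As a concrete taste of what modelling with the stack looks like, a minimal Flux training loop might be sketched as follows (the model, data, and hyperparameters here are placeholders, not examples from the talk):

```julia
using Flux

# A small multilayer perceptron built from composable layers.
model = Chain(Dense(10, 32, relu), Dense(32, 1))

# Toy data: 100 samples with 10 features each.
x = rand(Float32, 10, 100)
y = rand(Float32, 1, 100)

loss(x, y) = Flux.Losses.mse(model(x), y)

# One pass of gradient descent over the (single-batch) dataset.
opt = Descent(0.01)
Flux.train!(loss, Flux.params(model), [(x, y)], opt)
```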
We will also look at the new packages and techniques being developed in domains such as differentiable physics and chemistry, graph networks, molecular simulation, and multi-GPU training.
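To illustrate what "differentiable physics" means in practice, here is a toy sketch (a hypothetical example, not drawn from any of these packages): Zygote differentiates straight through a hand-written time-stepping loop, yielding the sensitivity of a simulation's output to its inputs.

```julia
using Zygote

# Simulate a projectile's height under gravity with explicit Euler steps.
function final_height(v0; g = 9.81, dt = 0.01, steps = 100)
    x, v = 0.0, v0
    for _ in 1:steps
        v -= g * dt   # update velocity
        x += v * dt   # update position
    end
    return x
end

# Differentiate through the whole simulation loop:
# how does the final height respond to the initial velocity?
gradient(final_height, 10.0)[1]
```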
Finally, we will talk about ongoing development across the Flux stack, including performance enhancements, broader CUDA coverage, CPU optimisations in NNlib, and the new composable, functional optimisers in Optimisers.jl.
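The functional style of Optimisers.jl can be sketched as follows (the parameters and gradients below are stand-in values): rules compose with OptimiserChain, and `update` returns fresh state and parameters rather than mutating them in place.

```julia
using Optimisers

# Parameters can be any (nested) structure of arrays.
θ = (W = rand(3, 3), b = zeros(3))

# Compose rules functionally: clip gradients, then apply Adam.
rule = OptimiserChain(ClipGrad(1.0), Adam(1e-3))

# `setup` builds an optimiser state tree matching the parameter structure.
state = Optimisers.setup(rule, θ)

# Stand-in gradients with the same structure as θ.
grads = (W = ones(3, 3), b = ones(3))

# `update` returns new state and new parameters; nothing is mutated.
state, θ = Optimisers.update(state, θ, grads)
```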
Dhairya Gandhi is a data scientist at Julia Computing, Inc., and the lead developer of the machine learning framework Flux.