JuliaCon 2024

Gradients for everyone: a quick guide to autodiff in Julia
07-11, 14:30–15:00 (Europe/Amsterdam), For Loop (3.2)

Automatic Differentiation (AD) is the art of letting your computer work out derivatives for you. The Julia ecosystem provides many different AD tools, which can be overwhelming. This talk will give everyone the necessary information to answer two simple questions:
- As a developer, how do I make my functions differentiable?
- As a user, how do I differentiate through other people's functions?
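
As a concrete taste of the second question, here is a minimal sketch of ours (not taken from the talk), differentiating one function with a forward-mode and a reverse-mode backend:

    # Minimal sketch (our example): one function, two AD backends.
    using ForwardDiff, Zygote

    f(x) = sum(abs2, x)  # maps a vector to a scalar; the gradient is 2x

    x = [1.0, 2.0, 3.0]
    g_fwd = ForwardDiff.gradient(f, x)  # forward mode
    g_rev = Zygote.gradient(f, x)[1]    # reverse mode (returns a tuple)
    g_fwd ≈ g_rev ≈ 2x                  # true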

Video: https://www.youtube.com/live/ZKt0tiG5ajw?t=19747s


The ability to take gradients of arbitrary computer programs is at the core of many breakthroughs in science and machine learning.
In Python, AD is fragmented into separate frameworks like TensorFlow, PyTorch or JAX. Each one of these has its own package ecosystem, and interoperability is limited. You first pick your AD framework, and then you write all your code in it.

Meanwhile, on the Julia side, the dream is to make the whole language differentiable. Ideally, you should be able to write the code you want and get gradients for free. In practice, there are multiple AD packages (ForwardDiff, ReverseDiff, Zygote, Enzyme, etc.), each with its own tradeoffs: should you use generic types? Can you perform mutating operations? Newcomers may not understand where these constraints come from, or whether they are inevitable.
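
To make the first of these constraints concrete, here is a sketch of ours (not taken from the talk): ForwardDiff evaluates functions on its own Dual number type, so an overly strict signature breaks differentiation.

    using ForwardDiff

    # Too strict: ForwardDiff calls the function with Dual numbers,
    # so this method does not apply and differentiation fails.
    f_strict(x::Vector{Float64}) = sum(sin, x)

    # Generic: any number type flows through, including Duals.
    f_generic(x::AbstractVector) = sum(sin, x)

    ForwardDiff.gradient(f_generic, [0.0, 1.0])  # works
    # ForwardDiff.gradient(f_strict, [0.0, 1.0]) # MethodError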

The present talk is an introduction to AD in Julia, from two complementary points of view:
- Package developers want to make their functions differentiable with as many backends as possible.
- Package users want easy ways to compute derivatives with the best possible performance.
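
On the user side, one concrete way to switch backends (our illustration, not necessarily what the talk presents) is DifferentiationInterface.jl, which puts many backends behind a single call:

    # Sketch of ours: same gradient call, interchangeable backends,
    # using DifferentiationInterface.jl (our assumption, not from the talk).
    using DifferentiationInterface
    import ForwardDiff, Zygote

    f(x) = sum(abs2, x)
    x = [1.0, 2.0, 3.0]

    gradient(f, AutoForwardDiff(), x)  # uses ForwardDiff underneath
    gradient(f, AutoZygote(), x)       # uses Zygote underneath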

We will give both of these groups the right tools to make informed decisions. Here is a rough summary of our presentation:

  1. Classification of AD systems - forward and reverse mode
  2. Python versus Julia - two different approaches
  3. Making code differentiable - quick fixes or custom rules (see the rule sketch after this list)
  4. Using differentiable code - switching backends if needed
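
To illustrate point 3, one standard mechanism for custom rules is ChainRulesCore.jl. The sketch below is ours (mysquare is a hypothetical stand-in for a function whose internals a backend might reject), showing a hand-written reverse-mode rule that rule-consuming backends such as Zygote pick up automatically:

    using ChainRulesCore, Zygote

    # Hypothetical stand-in for an AD-hostile function (e.g. one that
    # mutates arrays or makes a foreign call); we provide its rule by hand.
    mysquare(x::Real) = x^2

    function ChainRulesCore.rrule(::typeof(mysquare), x::Real)
        y = mysquare(x)
        mysquare_pullback(ȳ) = (NoTangent(), 2x * ȳ)
        return y, mysquare_pullback
    end

    Zygote.gradient(mysquare, 3.0)  # (6.0,), computed via the custom rule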

Presentation slides: https://gdalle.github.io/JuliaCon2024-AutoDiff/

Postdoctoral researcher at EPFL (Switzerland) in the INDY, IdePHICS and SPOC labs.
Currently working on graph machine learning.
Active member of the JuliaGraphs and JuliaDiff organizations.
Personal website: https://gdalle.github.io/

PhD student in the Machine Learning Group at TU Berlin.
Interested in automatic differentiation, explainability, and dynamical systems.
