JuliaCon 2023

When Enzyme meets JuMP: a tour de ronde
2023-07-28, 32-155

Julia provides a vibrant automatic differentiation (AD) ecosystem, with numerous AD libraries. Each of these libraries is unique, taking a different approach to the fundamental design choices for code transformation available in Julia. The recent refactoring of JuMP's nonlinear interface gives us an opportunity to integrate some of these AD libraries into JuMP. But how far can the integration go?


In this talk, we present our recent work with Enzyme, an AD backend that works directly at the LLVM level and enables massively parallel or vectorized modeling through GPUCompiler.jl. We place special emphasis on the extraction of sparse Jacobians and sparse Hessians with Enzyme, using forward and forward-over-reverse automatic differentiation, respectively. We give a thorough investigation of Enzyme's capabilities for nonlinear programming and present detailed results on the seminal optimal power flow (OPF) problem.
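To give a flavor of the building blocks involved, here is a minimal sketch of the two Enzyme.jl modes mentioned above, applied to a hypothetical toy objective (not the talk's OPF model): a reverse sweep to obtain a gradient, and a forward seed to obtain a directional derivative (one column of a Jacobian). Exact `autodiff` signatures may vary across Enzyme.jl versions; forward-over-reverse Hessian extraction nests these two calls and is omitted here.

```julia
using Enzyme

# Hypothetical scalar objective standing in for a JuMP-generated nonlinear function.
f(x) = sin(x[1]) * x[2]^2 + x[1] * x[3]

x = [1.0, 2.0, 3.0]

# Reverse mode: one sweep accumulates the full gradient into the shadow bx.
bx = zero(x)
Enzyme.autodiff(Reverse, f, Active, Duplicated(x, bx))
@show bx  # ∇f(x)

# Forward mode: seed a direction v to get the directional derivative,
# i.e. one column of a (possibly sparse) Jacobian when f is vector-valued.
v = [1.0, 0.0, 0.0]
dfdv = Enzyme.autodiff(Forward, f, Duplicated(x, v))
@show dfdv
```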

Assistant professor at Mines Paris - PSL