JuliaCon 2024

BLASPhemy
07-12, 10:30–11:00 (Europe/Amsterdam), Function (4.1)

Over the past few years, Enzyme.jl has developed into a popular and fast tool for automatic differentiation (AD) in Julia. The Basic Linear Algebra Subprograms (BLAS), on the other hand, have been around for decades, are highly optimized, and are automatically called by Julia to handle your Matrix and Vector operations. Sounds like Enzyme and BLAS would be an awesome combination, right? Except when things fall apart. In this talk, we show what went wrong and how we fixed it.


BLAS and LAPACK routines have been optimized for decades and are widely used in numerical and machine learning code throughout the Julia ecosystem. Unfortunately, Enzyme.jl can’t handle their black-box implementations — so we just replaced their implementation with a generic OpenBLAS fallback, we hope you don’t mind! And while we were at it, we also hardcoded the number of threads for this BLAS implementation to one, since multithreading is complex and we want to protect you from race conditions. Speaking of correctness, when differentiating our fallback we noticed that gradients could be wrong, but at least only for irrelevant scalars like 0.0 or 1.0 — nobody cares about those, right?

Although this approach “works”, it feels like there must be a faster and more reliable way. In this talk, we showcase improvements to Enzyme’s BLAS handling that use LLVM’s code-generation infrastructure (TableGen) to generate efficient differentiation rules for low-level BLAS calls, leveraging Enzyme’s ability to work within the compiler. These improvements increase BLAS AD performance by up to 64x, avoid crashes with large matrices, and make it possible to use your favorite hardware-specific, multithreaded BLAS libraries. We also discuss ongoing work to improve performance even further via tricks such as memory-management optimizations, and the benefit of these improvements for downstream Julia applications.

I am an M.Sc. computer science student at the University of Toronto.
Previously, I was an undergraduate at the Karlsruhe Institute of Technology, Germany.