JuliaCon Local Paris 2025

Reactant.jl - Optimize Julia Functions With MLIR and XLA for High-Performance Execution on CPU, GPU, TPU and more.
02/10/2025, Amphithéâtre Robert Faure
Language: English

Reactant.jl is a package that takes a Julia function operating on arrays and compiles it to an MLIR representation. It can then run powerful optimizations on top of it, including automatic differentiation via EnzymeMLIR, and generate executables for CPU/GPU/TPU via XLA.

In this talk, we go over the advantages of such a system for a variety of tensor-related workloads where tensor-aware optimization becomes beneficial. We describe the inner workings of the package and its current limitations. We also demonstrate Reactant's current raising capabilities, which enable running CUDA.jl kernels on other accelerators such as CPUs, TPUs, or AMD GPUs. Finally, we highlight successful integrations in Julia packages such as Lux.jl and Tenet.jl.
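To give a flavor of the workflow described above, the following is a minimal sketch of tracing and compiling a plain Julia array function with Reactant. It assumes the package's documented entry points (`Reactant.to_rarray` to move data to a Reactant-managed array and the `@compile` macro to produce a compiled callable); exact names and defaults may vary across Reactant.jl versions.

```julia
using Reactant

# An ordinary Julia function over arrays; no Reactant-specific code needed.
f(x) = sum(abs2, x)

# Wrap the input in a Reactant array so the function can be traced.
x = Reactant.to_rarray(rand(Float32, 128))

# Trace `f` on `x`, lower it to MLIR, optimize, and compile via XLA
# for the available backend (CPU, GPU, or TPU).
f_compiled = @compile f(x)

# The compiled function is called like the original.
f_compiled(x)
```

The key design point is that `f` itself stays backend-agnostic: the same source function can be recompiled for a different accelerator simply by tracing it with arrays placed on that device.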

PhD student at Barcelona Supercomputing Center (BSC-CNS)

Interested in Julia and developer tools.

3rd-year PhD student at IRISA (Institute for Research in Computer Science and Random Systems) interested in compilers, MLIR, and FPGAs.