2025-07-24, Main Room 2
This talk will explore the current state of Lux.jl, a deep-learning framework in Julia, and its latest advancements. We will also introduce Reactant.jl, a tool that compiles Julia code to MLIR and executes it across a variety of backends via XLA, and show how to use it with Lux. The session will highlight how Reactant.jl and Lux.jl together enable training neural networks in Julia at speeds comparable to popular frameworks such as JAX and PyTorch.
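As a taste of the workflow, here is a minimal sketch of compiling a Lux model with Reactant. It assumes recent Lux and Reactant releases, where Lux re-exports `reactant_device` and Reactant exports the `@compile` macro; exact details may shift between versions.

```julia
using Lux, Reactant, Random

model = Chain(Dense(2 => 32, relu), Dense(32 => 2))
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

xdev = reactant_device()              # moves data to Reactant's XLA device
ps_ra, st_ra = xdev(ps), xdev(st)
x_ra = xdev(rand(rng, Float32, 2, 32))

# Trace the forward pass, lower it to MLIR/XLA, and get a compiled callable.
forward = @compile model(x_ra, ps_ra, Lux.testmode(st_ra))
y, _ = forward(x_ra, ps_ra, Lux.testmode(st_ra))
```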
We will cover the following topics:
- Compiling Lux Models with Reactant and EnzymeJAX (a minimal sketch appears above).
- Model-Parallel and Data-Parallel Training: Implement scalable training strategies in Lux.jl using Reactant.jl and OpenXLA for distributing models and data across multiple devices (see the sharding sketch after this list).
- Compiler Optimizations with Enzyme and JAX (a training-step sketch using Enzyme for gradients follows this list).
- Debugging Performance with the TensorBoard Profiler: Use the TensorBoard Profiler to identify and resolve performance bottlenecks in training and inference pipelines (see the profiling sketch after this list).
- Visualizing Models with Model Explorer: Explore and analyze neural network architectures interactively using Model Explorer for better debugging and understanding.
- Exporting Models to JAX and TensorFlow: Seamlessly export trained models from Lux.jl to JAX and TensorFlow for deployment in production environments (see the export sketch after this list).
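To illustrate the Enzyme-powered training path, below is a hedged sketch of a single training step with Lux's Training API and the `AutoEnzyme` backend under Reactant. `Training.TrainState`, `Training.single_train_step!`, and `MSELoss` come from Lux, and the objective-function signature follows recent Lux releases; the model, data, and learning rate are illustrative.

```julia
using Lux, Reactant, Optimisers, Random, Enzyme
using ADTypes: AutoEnzyme

model = Chain(Dense(2 => 16, tanh), Dense(16 => 1))
ps, st = Lux.setup(Random.default_rng(), model)
xdev = reactant_device()

# TrainState bundles the model, parameters, states, and the optimiser.
ts = Training.TrainState(model, xdev(ps), xdev(st), Adam(3f-4))
x, y = xdev(rand(Float32, 2, 64)), xdev(rand(Float32, 1, 64))

# The objective returns (loss, updated_state, stats), per Lux's Training API.
loss_fn = (model, ps, st, (x, y)) -> begin
    ŷ, st_ = model(x, ps, st)
    return MSELoss()(ŷ, y), st_, (;)
end

# The first call traces and compiles forward + Enzyme gradients via Reactant;
# subsequent calls reuse the compiled step.
_, loss, _, ts = Training.single_train_step!(AutoEnzyme(), loss_fn, (x, y), ts)
```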
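For the data-parallel portion, a rough sketch of Reactant's array sharding follows. The `Sharding.Mesh` / `Sharding.NamedSharding` names and the `to_rarray` keyword are assumptions based on recent Reactant documentation and may differ across versions.

```julia
using Reactant
using Reactant: Sharding

# Arrange the available devices into a 1-D mesh along a "batch" axis
# (assumes at least 4 addressable devices; adjust to your topology).
mesh = Sharding.Mesh(reshape(Reactant.devices()[1:4], 4), (:batch,))

# Shard the trailing batch dimension across devices; Lux arrays are
# features × batch, so dimension 2 is partitioned and dimension 1 replicated.
x = Reactant.to_rarray(rand(Float32, 2, 128);
                       sharding=Sharding.NamedSharding(mesh, (nothing, :batch)))
```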
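For the profiling segment, Reactant can record traces viewable in TensorBoard's Profile tab. The sketch below assumes `Reactant.with_profiler` as documented in recent Reactant releases, and reuses the compiled `forward` and device arrays from the first sketch.

```julia
# Record an XLA/host trace while running the compiled forward pass.
Reactant.with_profiler("./tb_logs") do
    forward(x_ra, ps_ra, Lux.testmode(st_ra))
end
# Inspect with: tensorboard --logdir ./tb_logs  (open the "Profile" tab)
```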
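Finally, for export, Reactant's `@code_hlo` macro emits the StableHLO/MLIR module for a traced call, which is the interchange format consumed by JAX/TensorFlow tooling and by Model Explorer. Writing it to disk as below is a hedged sketch; the file name is illustrative.

```julia
# Emit the StableHLO module for the traced forward pass and save it to disk.
hlo = @code_hlo model(x_ra, ps_ra, Lux.testmode(st_ra))
write("lux_model.mlir", string(hlo))
```

The saved module can then be opened interactively in Model Explorer or ingested by StableHLO tooling on the Python side.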
Ph.D. Student @ MIT