JuliaCon 2026

FluxOptics.jl: A Composable Framework for Optical Inverse Design in Julia
2026-08-14, Room 6

FluxOptics.jl is a Julia framework for differentiable optical inverse design. It enables both rapid prototyping, relying on automatic differentiation with Zygote.jl, and production performance through algorithmic differentiation with custom gradient rules. The framework provides memory-efficient optimization through controlled buffer management while maintaining a composable architecture. Benchmarks against JAX show competitive performance while handling larger problems, demonstrating Julia's strengths for computational physics applications.


Context: optical systems and inverse design

The laws of optics are expressed as a set of partial differential equations (PDEs) known as Maxwell's equations. Finite-difference and finite-element methods are the standard and most rigorous ways to solve these equations at the nanoscale, but they are intractable for large optical systems made of traditional elements such as lenses, mirrors, and spatial light modulators. Instead, simplifying assumptions are commonly made to transform the original PDEs into simpler equations that can often be solved efficiently using Fourier methods, while remaining accurate enough for a large number of applications.
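
As a concrete illustration of such a Fourier method (a generic sketch, not code from FluxOptics.jl itself), scalar free-space propagation under the angular-spectrum approximation reduces to two FFTs and a multiplication by a transfer function:

```julia
using FFTW

# Angular-spectrum propagation of a scalar field over a distance z.
# Illustrative sketch; any real implementation would differ in details.
function angular_spectrum(u::AbstractMatrix{<:Complex}, dx::Real, λ::Real, z::Real)
    N, M = size(u)
    k = 2π / λ
    fx = fftfreq(N, 1 / dx)   # spatial frequencies along the first axis
    fy = fftfreq(M, 1 / dx)   # spatial frequencies along the second axis
    # Transfer function H(fx, fy) = exp(i k z sqrt(1 - (λ fx)² - (λ fy)²));
    # the complex sqrt handles evanescent components.
    H = [exp(im * k * z * sqrt(complex(1 - (λ * f1)^2 - (λ * f2)^2)))
         for f1 in fx, f2 in fy]
    return ifft(fft(u) .* H)
end

# Sanity check: propagation over zero distance leaves the field unchanged.
u  = ones(ComplexF64, 64, 64)
u2 = angular_spectrum(u, 1e-6, 633e-9, 0.0)
```

Because the whole operation is FFTs plus elementwise products, it is cheap, GPU-friendly, and straightforwardly differentiable.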

An optical system can naturally be described as a cascade of optical elements through which light propagates. Recent work has drawn analogies with neural networks, employing the term Diffractive Optical Neural Networks (DONNs), where optical components act as trainable weights and physical propagation models replace traditional convolutions or matrix multiplications. This view has motivated the use of machine learning frameworks such as TensorFlow, PyTorch, and more recently JAX to optimize optical systems for specific functions like beam shaping and mode multiplexing, with applications ranging from telecommunications to optical computing.

FluxOptics.jl architecture

FluxOptics.jl brings this composable approach to Julia with an architecture that allows selecting the trade-off between fast prototyping and performance when developing a new differentiable optical component. A key challenge in inverse design problems is memory management during the forward pass and gradient computation. One can prototype differentiable programs quickly in Julia by writing pure functions and differentiating them with Zygote.jl. However, this often limits the performance of the forward model, which could benefit from in-place mutating operations, and it leads to dynamic memory allocation that triggers the garbage collector and slows down the computation.
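
The pure-function prototyping style can be sketched in a few lines of generic Julia (this is not FluxOptics.jl's actual API): a loss written as a composition of pure operations differentiates directly with Zygote.jl, at the cost of allocating fresh arrays at every step:

```julia
using Zygote, FFTW

# Toy inverse-design loss: apply a trainable phase mask ϕ, "propagate" via
# an FFT, and compare the output intensity to a target pattern.
# Pure functions only — every broadcast allocates a new array.
function loss(ϕ, u_in, target)
    u_out = fft(u_in .* cis.(ϕ))          # phase modulation + propagation
    return sum(abs2, abs2.(u_out) .- target)
end

u_in   = ones(ComplexF64, 8, 8)
target = zeros(8, 8)
ϕ0     = [0.1 * (i + j) for i in 1:8, j in 1:8]

# Zygote differentiates the whole composition automatically.
g, = Zygote.gradient(ϕ -> loss(ϕ, u_in, target), ϕ0)
```

This is ideal for trying out a new component in minutes, but the intermediate allocations inside the broadcast chain are exactly what the custom-rule path described next is designed to avoid.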

To achieve the best efficiency, FluxOptics.jl implements algorithmic differentiation through manual gradient rules defined by extending an interface that leverages ChainRulesCore.jl. This enables controlled buffer management and reuse of intermediate computations during backpropagation. The framework distinguishes between Pure components for rapid prototyping using automatic differentiation and Custom components for production-level optimization where developers write efficient forward and backward passes with fine-grained control over memory allocation, while still relying on Zygote.jl for composing gradients across the system.
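
The custom-rule mechanism can be sketched with plain ChainRulesCore.jl (the function names below are illustrative, not FluxOptics.jl's actual interface): the forward pass caches an intermediate that the backward pass reuses instead of recomputing or reallocating it:

```julia
using ChainRulesCore, Zygote

# Phase modulation y = u .* exp(im .* ϕ), with a hand-written reverse rule.
phase_modulate(u, ϕ) = u .* cis.(ϕ)

function ChainRulesCore.rrule(::typeof(phase_modulate), u, ϕ)
    t = cis.(ϕ)            # computed once, reused by the pullback
    y = u .* t
    function phase_modulate_pullback(ȳ)
        ū = ȳ .* conj.(t)                 # cotangent w.r.t. complex u
        ϕ̄ = real.(conj.(im .* y) .* ȳ)    # cotangent w.r.t. real ϕ
        return (NoTangent(), ū, ϕ̄)
    end
    return y, phase_modulate_pullback
end

# Zygote picks up the rrule automatically when composing gradients.
u = ComplexF64[1 2; 3 4]
ϕ = [0.1 0.2; 0.3 0.4]
gu, gϕ = Zygote.gradient((u, ϕ) -> sum(abs2, phase_modulate(u, ϕ)), u, ϕ)
```

Since the loss `sum(abs2, u .* cis.(ϕ))` does not depend on the phase at all, `gϕ` comes out as zero and `gu ≈ 2u`, which makes the rule easy to check. A production rule would additionally write `ū` and `ϕ̄` into preallocated buffers.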

Components can be marked Static for fixed geometry or Trainable for optimizable parameters. The FieldProbe component captures intermediate field states and makes them accessible at the system output through a dictionary, which is useful for multi-objective optimization or visualization. Adjacent non-trainable components with compatible types, such as consecutive phase masks or propagation steps, can be automatically merged for efficiency.
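
The physics behind merging consecutive phase masks is simply that their phase profiles add (equivalently, their complex transmissions multiply). A minimal check of that identity (FluxOptics.jl's actual merging machinery is type-driven and more general):

```julia
# Two consecutive phase masks act as a single mask whose phase is the sum:
# exp(iϕ₁) .* exp(iϕ₂) == exp(i(ϕ₁ + ϕ₂)), so one broadcast replaces two.
ϕ1 = [0.3 0.7; 1.1 0.2]
ϕ2 = [0.5 0.1; 0.4 0.9]
u  = ComplexF64[1 2; 3 4]

two_masks = u .* cis.(ϕ1) .* cis.(ϕ2)   # field after two separate masks
one_mask  = u .* cis.(ϕ1 .+ ϕ2)         # field after the merged mask
```

The same reasoning lets consecutive free-space propagation steps collapse into a single step over the summed distance.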

The framework integrates naturally with Optimisers.jl and extends it with proximal operators for constrained optimization, including Total-Variation regularization, sparsity-inducing penalties (via the Iterative Shrinkage-Thresholding Algorithm, ISTA, and its accelerated variant FISTA), and box constraints.
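
To illustrate how a proximal operator composes with an Optimisers.jl update rule (a generic ISTA sketch under the ℓ₁ penalty; the names `soft_threshold` and `ista_step` are illustrative, not FluxOptics.jl's actual API):

```julia
using Optimisers

# Soft-thresholding: the proximal operator of the ℓ₁ penalty λ‖x‖₁.
soft_threshold(x, λ) = sign.(x) .* max.(abs.(x) .- λ, 0)

# One ISTA iteration: a gradient step on the smooth part of the objective,
# followed by the prox. η must match the learning rate of the Descent rule.
function ista_step(state, x, grad, λ, η)
    state, x = Optimisers.update(state, x, grad)   # x ← x .- η .* grad
    return state, soft_threshold(x, η * λ)
end

x = [0.5, -2.0, 0.05]
state = Optimisers.setup(Optimisers.Descent(0.1), x)
# With a zero gradient, only the prox acts: entries shrink toward zero by
# η·λ = 0.1, and the small entry 0.05 is clipped to exactly zero.
state, x = ista_step(state, x, zeros(3), 1.0, 0.1)
```

FISTA adds a momentum-like extrapolation between such steps; box constraints follow the same pattern with `clamp.` as the proximal map.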

Performance and validation

To enable honest comparison with the Python ecosystem, JaxOptics was developed as a minimal JAX reimplementation covering the core free-space propagation methods and phase modulation. Preliminary benchmarks show that FluxOptics.jl performs better on isolated propagation tasks, while JAX's XLA compiler can hold a slight advantage on certain end-to-end optimization workflows. Julia's key strength, however, lies in memory efficiency and the ability to handle larger problems. The framework has been published in JOSS and demonstrates its capabilities through real research applications including field retrieval, waveguide tomography, multimode intensity shaping, and 45-mode Hermite-Gaussian mode sorting.

This talk presents FluxOptics.jl's architecture and demonstrates design patterns for composable differentiable systems that could extend to other domains of computational physics beyond optics.

See also:

Independent computational physicist working on optical propagation, inverse design, and differentiable programming. Former postdoc at the University of Innsbruck, FAU Erlangen, and the University of Rennes. Developer of FluxOptics.jl.