2026-08-14, Room 6
Automatic differentiation (AD) is gaining ground as a technique for optimizing tensor networks (TN), which are widely used simulation tools in quantum computing, condensed matter, and high-energy physics. In this talk we will give an overview of ongoing work to add end-to-end AD support to the large, complex set of physics simulation packages in the QuantumKitHub organization. Efficient AD of these networks involves differentiating through complex-valued linear algebra, complicated tensor operations, and other constructs that push the boundaries of what Julia's AD frameworks are capable of.
In recent years, AD-based optimization of tensor networks has become increasingly popular. Using AD in these settings requires support for complex numbers, differentiation through linear algebra factorizations, and other features that are often not at the core of "traditional" AD frameworks such as JAX or PyTorch. For these reasons, the flexibility and extensibility of Julia's next-generation AD tooling allow us to explore the use of AD in large-scale simulation of quantum systems. In this talk we will discuss some of the challenges we have encountered in integrating AD into the TN workflow, such as supporting reverse-mode rules for the truncated SVD of complex-valued matrices, as well as some of the innovative techniques that AD enables, such as ongoing research into optimizing tensor network states by performing AD around a fixed point of an operator.
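To give a flavor of the first challenge: for the singular values alone, a reverse-mode rule stays compact, and the same formula carries over unchanged to complex-valued matrices; the rules for the truncated U and V factors are considerably more involved. The following is a minimal ChainRulesCore sketch, not the actual QuantumKitHub implementation; the helper tsvdvals and its rank argument k are hypothetical names introduced for illustration.

using LinearAlgebra
using ChainRulesCore

# Hypothetical helper for this sketch: the k largest singular values of A.
tsvdvals(A::AbstractMatrix, k::Integer) = svdvals(A)[1:k]

function ChainRulesCore.rrule(::typeof(tsvdvals), A::AbstractMatrix, k::Integer)
    F = svd(A)
    s = F.S[1:k]
    # For A = U S V', each kept singular value obeys ds_i = Re(u_i' * dA * v_i),
    # so the pullback of a (real) cotangent s̄ is Ā = U * Diagonal(s̄) * V',
    # restricted to the kept singular triples. The formula holds verbatim
    # for complex-valued A, since the singular values remain real.
    function tsvdvals_pullback(s̄)
        Ā = F.U[:, 1:k] * Diagonal(unthunk(s̄)) * F.Vt[1:k, :]
        return NoTangent(), Ā, NoTangent()
    end
    return s, tsvdvals_pullback
end

Because the gradient of each kept singular value involves only its own singular vectors, truncation introduces no error in this particular pullback; it is the factors U and V whose adjoints couple the discarded part of the spectrum.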
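The fixed-point idea can likewise be illustrated in miniature: by the implicit function theorem, the derivative of a fixed point x* = f(x*, θ) requires only local derivatives evaluated at x*, so there is no need to differentiate through the iteration history. The sketch below uses a scalar toy map and ForwardDiff, both our own stand-ins rather than anything from the talk; in an actual TN calculation f would be the application of a transfer operator to an environment tensor, and the division by (1 - ∂f/∂x) would become a linear solve.

using ForwardDiff

# Toy contraction map; stands in for one application of a transfer operator.
f(x, θ) = cos(θ * x)

# Plain fixed-point iteration: returns x* with x* ≈ f(x*, θ).
function fixed_point(f, θ; x0 = 0.5, tol = 1e-12, maxiter = 10_000)
    x = x0
    for _ in 1:maxiter
        x′ = f(x, θ)
        abs(x′ - x) < tol && return x′
        x = x′
    end
    return x
end

# Implicit function theorem at the fixed point:
#   dx*/dθ = (∂f/∂θ) / (1 - ∂f/∂x), evaluated at (x*, θ).
# Only local derivatives at x* are needed, not the iteration history.
function dfixed_point(f, θ)
    x_star = fixed_point(f, θ)
    ∂x = ForwardDiff.derivative(x -> f(x, θ), x_star)
    ∂θ = ForwardDiff.derivative(t -> f(x_star, t), θ)
    return ∂θ / (1 - ∂x)
end

# Agrees with naively differentiating through the entire iteration:
dfixed_point(f, 1.2) ≈ ForwardDiff.derivative(θ -> fixed_point(f, θ), 1.2)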
I have been a Julia contributor since 2015. I work mostly on GPUs, quantum packages, and linear algebra.
Software Research Fellow at the Flatiron Institute, CCQ, studying tensor network methods and algorithms for classical and quantum physics simulations.