JuliaCon 2026

Katharine Hyatt

I have been a Julia contributor since 2015. I work mostly on GPUs, quantum packages, and linear algebra.


Sessions

08-13
15:30
30min
GPU acceleration in the QuantumKitHub ecosystem
Katharine Hyatt, Lukas Devos

QuantumKitHub's packages provide low- and high-level tooling for implementing, among other things, tensor network algorithms. These algorithms are highly amenable to GPU acceleration, but there are many stumbling blocks along the way. Over the past year we have been actively working to add GPU support to the whole stack of TN-related packages. In this talk we will discuss the performance benefits and challenges encountered so far, our roadmap, and how this work can benefit the wider JuliaGPU developer and user community.

Julia, GPUs, and Accelerators
Room 3
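
The kind of workload this session targets can be sketched in a few lines. The example below is illustrative only: it assumes CUDA.jl on an NVIDIA GPU and uses a plain matrix product as a stand-in for a pairwise tensor contraction; the QuantumKitHub packages themselves are not shown.

```julia
using CUDA, LinearAlgebra

# Illustrative stand-in for a pairwise tensor contraction, the core kernel
# of tensor network algorithms: move the data to the GPU and let the
# contraction dispatch to cuBLAS.
A = CUDA.rand(Float32, 256, 256)   # CuArray living on the device
B = CUDA.rand(Float32, 256, 256)
C = A * B                          # matrix product runs on the GPU via cuBLAS
nrm = norm(C)                      # reductions also execute on the device
```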
08-13
17:00
15min
What's new in CUDA.jl (besides CuTile)?
Katharine Hyatt

Even more improvements and features have been added since this package was last discussed at JuliaCon 2025. This talk will highlight some of the most meaningful user-facing feature additions, performance and quality-of-life improvements, and significant bug fixes.

Julia, GPUs, and Accelerators
Room 3
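
As context for this talk, here is a minimal CUDA.jl usage sketch showing the two programming styles the package supports: array-level broadcasting and hand-written kernels launched with @cuda. The abstract does not enumerate the new features, so none are listed here.

```julia
using CUDA

xs = CUDA.fill(1.0f0, 1024)        # device array filled with ones
ys = 2f0 .* xs                     # broadcast compiles to a GPU kernel

# A hand-written kernel, launched with @cuda below.
function axpy!(y, a, x)
    i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
    if i <= length(y)
        @inbounds y[i] += a * x[i]
    end
    return nothing
end

@cuda threads=256 blocks=cld(length(ys), 256) axpy!(ys, 3.0f0, xs)
```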
08-14
10:30
30min
Automatic and fixed-point differentiation in tensor network algorithms
Katharine Hyatt, Lukas Devos

Automatic differentiation (AD) is gaining ground as a technique for optimizing tensor networks (TN), which are widely used simulation tools in quantum computing, condensed matter, and high energy physics. In this talk we will provide an overview of the ongoing work to add support for end-to-end AD in our large, complex set of physics simulation packages in the QuantumKitHub organization. Efficient AD of these networks involves differentiation through complex linear algebra, complicated tensor operations, and other constructs that push the boundaries of what Julia's AD frameworks are capable of.

Differentiable Computational Models and their Applications
Room 6
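
As a toy illustration of the kind of construct end-to-end AD must handle, the sketch below differentiates a scalar cost built from dense linear algebra with Zygote. This is a generic example under assumed names, not code from the QuantumKitHub packages; the real optimizations differentiate through far more involved contractions and fixed-point iterations.

```julia
using Zygote, LinearAlgebra

# Scalar cost built from dense linear algebra, standing in for the value
# of a tensor network contraction.
cost(A) = sum(abs2, A * A')

A = randn(8, 8)
g, = Zygote.gradient(cost, A)      # gradient of the cost with respect to A
```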