BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//pretalx.com//juliacon-2026//speaker//YYDPCZ
BEGIN:VTIMEZONE
TZID:CET
BEGIN:STANDARD
DTSTART:20001029T040000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
TZNAME:CET
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20000326T030000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
TZNAME:CEST
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:pretalx-juliacon-2026-TATRTG@pretalx.com
DTSTART;TZID=CET:20260813T153000
DTEND;TZID=CET:20260813T160000
DESCRIPTION:QuantumKitHub's various packages provide low- and high-level to
 oling for the implementation of (among other things) tensor network algori
 thms. These algorithms are highly amenable to GPU-based acceleration\, but
  there are many stumbling blocks along the way. In the past year we have b
 een actively working to add GPU support to the whole stack of TN-related p
 ackages\, and in this talk we will discuss the performance benefits and ch
 allenges thus far\, our roadmap\, and how this work can benefit the wider 
 JuliaGPU developer and user community.
DTSTAMP:20260502T092924Z
LOCATION:Room 3
SUMMARY:GPU acceleration in the QuantumKitHub ecosystem - Katharine Hyatt\,
  Lukas Devos
URL:https://pretalx.com/juliacon-2026/talk/TATRTG/
END:VEVENT
BEGIN:VEVENT
UID:pretalx-juliacon-2026-BPEJLA@pretalx.com
DTSTART;TZID=CET:20260813T170000
DTEND;TZID=CET:20260813T171500
DESCRIPTION:Even more improvements and features have been added since thi
 s package was discussed last year at JuliaCon 2025. This talk will highl
 ight some of the more meaningful user-facing feature additions\, perform
 ance and quality of life improvements\, as well as significant bug fixes.
DTSTAMP:20260502T092924Z
LOCATION:Room 3
SUMMARY:What's new in CUDA.jl (besides CuTile)? - Katharine Hyatt
URL:https://pretalx.com/juliacon-2026/talk/BPEJLA/
END:VEVENT
BEGIN:VEVENT
UID:pretalx-juliacon-2026-GFVKR3@pretalx.com
DTSTART;TZID=CET:20260814T103000
DTEND;TZID=CET:20260814T110000
DESCRIPTION:Automatic differentiation (AD) is gaining ground as a technique
  for optimization of tensor networks (TN)\, which are widely used simulati
 on tools in quantum computing\, condensed matter\, and high energy physics
 . In this talk we will provide an overview of the ongoing work to add supp
 ort for end-to-end AD in our large\, complex set of physics simulation pac
 kages at the "QuantumKitHub". Efficient AD of these networks involves diff
 erentiation through complex linear algebra\, complicated tensor operations
 \, and other constructs that push the boundaries of what Julia's AD framew
 orks are capable of.
DTSTAMP:20260502T092924Z
LOCATION:Room 6
SUMMARY:Automatic and fixed-point differentiation in tensor network algorit
 hms - Katharine Hyatt\, Lukas Devos
URL:https://pretalx.com/juliacon-2026/talk/GFVKR3/
END:VEVENT
END:VCALENDAR
