BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//pretalx.com//juliacon-2026//talk//GFVKR3
BEGIN:VTIMEZONE
TZID:CET
BEGIN:STANDARD
DTSTART:20001029T030000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
TZNAME:CET
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20000326T020000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
TZNAME:CEST
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:pretalx-juliacon-2026-GFVKR3@pretalx.com
DTSTART;TZID=CET:20260814T103000
DTEND;TZID=CET:20260814T110000
DESCRIPTION:Automatic differentiation (AD) is gaining ground as a technique
  for optimization of tensor networks (TN)\, which are widely used simulati
 on tools in quantum computing\, condensed matter\, and high energy physics
 . In this talk we will provide an overview of the ongoing work to add supp
 ort for end-to-end AD in our large\, complex set of physics simulation pac
 kages at the "QuantumKitHub". Efficient AD of these networks involves diff
 erentiation through complex linear algebra\, complicated tensor operations
 \, and other constructs that push the boundaries of what Julia's AD framew
 orks are capable of.
DTSTAMP:20260502T102651Z
LOCATION:Room 6
SUMMARY:Automatic and fixed-point differentiation in tensor network algorit
 hms - Katharine Hyatt\, Lukas Devos
URL:https://pretalx.com/juliacon-2026/talk/GFVKR3/
END:VEVENT
END:VCALENDAR
