Julia has a rich AD ecosystem: ForwardDiff, ReverseDiff, Enzyme, Zygote, FastDifferentiation, Symbolics, ChainRules, DiffRules, and more. Each AD algorithm excels on a subset of problems, but no single algorithm is good for all of them. This minisymposium is an opportunity for AD practitioners to explore the fundamental similarities underlying these algorithms and to begin designing new hybrid algorithms that address the weaknesses of existing ones.
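As a minimal sketch of why no single algorithm dominates (assuming ForwardDiff and Zygote are installed), the example below computes the same gradient with both forward and reverse mode: for a scalar-valued function of many inputs, reverse mode is usually the better fit, while forward mode tends to shine when there are few inputs or many outputs.

```julia
using ForwardDiff, Zygote

# Scalar output, many inputs: a case where reverse mode usually wins.
f(x) = sum(abs2, x)
x = rand(1000)

# Reverse mode: one backward pass yields the whole gradient.
g_rev = Zygote.gradient(f, x)[1]

# Forward mode: gives the same result, but its cost grows with the number of inputs.
g_fwd = ForwardDiff.gradient(f, x)

@assert g_rev ≈ g_fwd
```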
The purpose of this minisymposium is twofold: to gather AD practitioners in one place to share their insights and engineering know-how, and to explore the theoretical underpinnings of AD with the goal of devising new AD algorithms that are more efficient and have fewer limitations.
We will be accepting talks on the theory and practice of AD. Topics on the theory of AD include, but are not limited to:
* algorithms beyond forward and reverse
* AD for tensor differentiation
* AD for sparse Jacobians
* efficient higher order derivatives
On the practice side, we are looking for talks describing the engineering required to build efficient and widely applicable AD systems.