Adrian Hill
PhD student in the Machine Learning Group at TU Berlin.
Interested in automatic differentiation, explainability, and dynamical systems.
- Personal website: adrianhill.de
- GitHub: @adrhill
- Project spotlight
Session
10-03
11:30
30 min
Leveraging Sparsity to Accelerate Automatic Differentiation
Guillaume Dalle, Adrian Hill
Jacobians and Hessians play vital roles in scientific computing and machine learning, from optimization to probabilistic modeling. While these matrices are often considered too computationally expensive to compute, their inherent sparsity can be exploited to dramatically accelerate Automatic Differentiation (AD). By building on top of DifferentiationInterface.jl, we bring Automatic Sparse Differentiation to all major Julia AD backends, including ForwardDiff and Enzyme.
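A minimal sketch of the workflow the abstract describes, assuming the documented DifferentiationInterface.jl API together with SparseConnectivityTracer.jl (sparsity detection) and SparseMatrixColorings.jl (matrix coloring); the function `f` and input size are illustrative:

```julia
using DifferentiationInterface
using SparseConnectivityTracer, SparseMatrixColorings
import ForwardDiff

# Wrap a dense backend in AutoSparse: sparsity is detected by operator
# overloading, then columns are grouped by a greedy coloring so that
# structurally orthogonal columns share one forward pass.
backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector=TracerSparsityDetector(),
    coloring_algorithm=GreedyColoringAlgorithm(),
)

f(x) = diff(x .^ 2)  # example function with a banded (sparse) Jacobian
x = rand(10)

# Returns a sparse Jacobian, computed with far fewer passes than the
# dense approach would need for large banded problems.
J = jacobian(f, backend, x)
```

With a banded Jacobian like this one, the coloring reduces the cost from one forward pass per input dimension to a small constant number of passes, independent of the input size.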
Error, derivatives, stability
Jean-Baptiste Say Amphitheater