JuliaCon 2023

Fast Higher-order Automatic Differentiation for Physical Models
2023-07-26, Online talks and posters

Taking higher-order derivatives is crucial for physical models like ODEs and PDEs, and it would be great to get them by automatic differentiation. Yet, existing packages in Julia either scale exponentially with respect to order (nesting first-order AD) or exponentially with respect to dimension (nested Taylor polynomials in TaylorSeries.jl). The author presents TaylorDiff.jl (https://github.com/JuliaDiff/TaylorDiff.jl), which is specifically optimized for fast higher-order directional derivatives.


The author will present efficient higher-order automatic differentiation (AD) algorithms and their potential applications in scientific models where higher-order derivatives need to be calculated efficiently, such as solving ODEs and PDEs with neural functions. Existing approaches to higher-order AD often suffer from one or more of the following problems: (1) nesting first-order AD results in exponential scaling with respect to order; (2) ad-hoc hand-written higher-order rules are hard to maintain and do not utilize existing first-order AD infrastructure; (3) inefficient data representation and manipulation causes a "penalty of abstraction" at first or second order when compared to highly optimized first-order AD libraries. By combining advanced techniques in computational science, i.e. aggressive type specialization, metaprogramming, and symbolic computing, TaylorDiff.jl addresses all three problems.
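To make problem (1) concrete, here is a minimal sketch of nesting first-order AD, written in illustrative Python rather than TaylorDiff.jl's actual implementation: each extra order of differentiation wraps every component in another dual number, so an order-n derivative stores 2^n scalar components.

```python
class Dual:
    """First-order dual number a + b*eps with eps^2 = 0.
    Components may themselves be Duals when nested (hypothetical sketch)."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)
    def __mul__(self, other):
        # product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

def zero(n):  # nested representation of the constant 0 at nesting depth n
    return 0.0 if n == 0 else Dual(zero(n - 1), zero(n - 1))

def one(n):   # nested representation of the constant 1 at nesting depth n
    return 1.0 if n == 0 else Dual(one(n - 1), zero(n - 1))

def seed(n, x0):  # x0 perturbed once per nesting level
    return x0 if n == 0 else Dual(seed(n - 1, x0), one(n - 1))

def count(d):  # number of stored scalar components: 2^n
    return 1 if not isinstance(d, Dual) else count(d.a) + count(d.b)

f = lambda x: x * x * x
y = f(seed(3, 1.0))          # third derivative via triple nesting
print(y.b.b.b)               # d^3/dx^3 of x^3 is 6.0 everywhere
print(count(seed(3, 1.0)))   # 2^3 = 8 scalar components for order 3
```

The exponential component count (and the correspondingly blown-up expression tree) is what Taylor-mode AD avoids.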

TaylorDiff.jl is currently capable of the following: (1) Taylor-mode AD algorithms that compute the n-th order derivative of scalar functions in O(n) time; (2) automatic generation of higher-order rules from first-order rules in ChainRules.jl; (3) performance comparable to ForwardDiff.jl at lower orders.
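Capability (1) can be illustrated with truncated Taylor-polynomial arithmetic: a single forward pass carries n+1 coefficients, so storage grows linearly in the order instead of exponentially. The sketch below is illustrative Python using the standard Taylor-coefficient recurrences, not TaylorDiff.jl's actual code.

```python
from math import exp, factorial

class Taylor:
    """Truncated Taylor polynomial: coeffs[k] = f^(k)(x0) / k!."""
    def __init__(self, coeffs):
        self.coeffs = list(coeffs)
    def __add__(self, other):
        return Taylor([a + b for a, b in zip(self.coeffs, other.coeffs)])
    def __mul__(self, other):
        # truncated Cauchy product of the two coefficient sequences
        n = len(self.coeffs)
        return Taylor([sum(self.coeffs[j] * other.coeffs[k - j]
                           for j in range(k + 1)) for k in range(n)])

def texp(u):
    """exp on Taylor polynomials via the classic recurrence from y' = u' * y."""
    n = len(u.coeffs)
    y = [0.0] * n
    y[0] = exp(u.coeffs[0])
    for k in range(1, n):
        y[k] = sum(j * u.coeffs[j] * y[k - j] for j in range(1, k + 1)) / k
    return Taylor(y)

def nth_derivative(f, x0, n):
    """Seed x0 + t, push n+1 coefficients through f, read the n-th back."""
    seed = Taylor([x0, 1.0] + [0.0] * (n - 1))
    return f(seed).coeffs[n] * factorial(n)

# (x * e^x)''' at x = 0 is 3, since (x * e^x)^(n) = (x + n) * e^x
print(nth_derivative(lambda x: x * texp(x), 0.0, 3))  # 3.0
```

The recurrence in `texp` is also a small instance of capability (2): the higher-order rule is derived mechanically from the first-order identity y' = u' * y, rather than hand-written per order.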

Songchen is currently a second-year master's student at the MIT Center for Computational Science and Engineering (CCSE) and a research assistant at the Julia Lab within the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). They use mathematical and computational theory to build machine learning and high-performance computing infrastructure, collaborating in the highly interdisciplinary environment of CCSE.