JuliaCon 2023

Songchen Tan

Songchen is currently a second-year master's student at the MIT Center for Computational Science and Engineering (CCSE) and a research assistant at the Julia Lab within the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). They use mathematical and computational theory to build machine learning and high-performance computing infrastructure, and collaborate in the highly interdisciplinary environment of CCSE.


Session

Date: 07-26
Time: 10:30
Duration: 30 min
Fast Higher-order Automatic Differentiation for Physical Models
Songchen Tan

Taking higher-order derivatives is crucial for physical models like ODEs and PDEs, and it is desirable to compute them with automatic differentiation. Yet existing packages in Julia either scale exponentially with the derivative order (nesting first-order AD) or scale exponentially with the input dimension (nested Taylor polynomials in TaylorSeries.jl). The author presents TaylorDiff.jl (https://github.com/JuliaDiff/TaylorDiff.jl), which is specifically optimized for fast higher-order directional derivatives.
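
As a minimal sketch of the contrast described above, the example below nests first-order AD (ForwardDiff.jl) against a single Taylor-mode call. The TaylorDiff.jl entry points derivative(f, x, order) and derivative(f, x, direction, order) are assumed from the package README at the time of writing and may differ between versions; check the package documentation for the exact API.

using ForwardDiff   # first-order forward-mode AD
using TaylorDiff    # Taylor-mode AD presented in this talk

f(x) = sin(x) * exp(x)

# Third derivative by nesting first-order AD: each additional order wraps
# another layer of dual numbers, so the cost grows exponentially with order.
d3_nested = ForwardDiff.derivative(
    x -> ForwardDiff.derivative(
        y -> ForwardDiff.derivative(f, y), x), 1.0)

# Third derivative in a single Taylor-mode pass
# (assumed signature: derivative(f, x, order)).
d3_taylor = TaylorDiff.derivative(f, 1.0, 3)

d3_nested ≈ d3_taylor   # the two results should agree up to floating-point error

# Higher-order directional derivative of a multivariate function
# (assumed signature: derivative(f, x, direction, order)).
g(v) = sum(exp.(v))
v, dir = [3.0, 4.0], [1.0, 0.0]
d2_dir = TaylorDiff.derivative(g, v, dir, 2)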

SciML
Online talks and posters