Mathias Louboutin

Postdoctoral Fellow at the Georgia Institute of Technology.
My main research focuses on high-performance computing for large-scale PDE-constrained optimization (medical imaging, seismic imaging), both on standard clusters and in the cloud. In particular, I work extensively on open-source solutions in Julia and Python and on high-level abstractions for high-performance computing, such as Devito (a finite-difference DSL) and JUDI.jl (a linear-algebra abstraction for PDE-constrained optimization).
My secondary research project focuses on computational and algorithmic solutions for large-scale machine learning.


Sessions

07-29
16:30
30min
InvertibleNetworks.jl - Memory efficient deep learning in Julia
Philipp A. Witte, Mathias Louboutin, Ali Siahkoohi, Felix J. Herrmann, Gabrio Rizzuti, Bas Peters

We present InvertibleNetworks.jl, an open-source package for invertible neural networks and normalizing flows using memory-efficient backpropagation. InvertibleNetworks.jl uses manually implemented gradients to take advantage of the invertibility of its building blocks, which enables scaling to large problem sizes. We present the architecture and features of the library and demonstrate its application to a variety of problems ranging from loop unrolling to uncertainty quantification.
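To illustrate the idea behind memory-efficient backpropagation for invertible layers, here is a minimal Julia sketch of an additive coupling layer whose input is recomputed from its output during the backward pass, so intermediate activations do not need to be stored. The `AdditiveCoupling` type and its `forward`/`inverse`/`backward` functions are illustrative assumptions, not the InvertibleNetworks.jl API.

```julia
using LinearAlgebra

# Additive coupling layer: split the input into (x1, x2) and set
#   y1 = x1,  y2 = x2 + W*x1
# The map is exactly invertible, so the input can be reconstructed
# from the output instead of being kept in memory.
struct AdditiveCoupling
    W::Matrix{Float64}
end

forward(L::AdditiveCoupling, x1, x2) = (x1, x2 .+ L.W * x1)
inverse(L::AdditiveCoupling, y1, y2) = (y1, y2 .- L.W * y1)

# Backward pass without stored activations: recompute (x1, x2) from
# (y1, y2) via the inverse, then apply the chain rule by hand.
function backward(L::AdditiveCoupling, dy1, dy2, y1, y2)
    x1, x2 = inverse(L, y1, y2)   # recompute the layer input on the fly
    dW  = dy2 * x1'               # gradient w.r.t. the weights
    dx1 = dy1 .+ L.W' * dy2       # through y1 = x1 and y2 = x2 + W*x1
    dx2 = dy2                     # y2 depends linearly on x2
    return dx1, dx2, dW
end

# Tiny usage example: forward, exact inversion, and a gradient step
L = AdditiveCoupling(randn(4, 4))
x1, x2 = randn(4), randn(4)
y1, y2 = forward(L, x1, x2)
@assert inverse(L, y1, y2)[1] ≈ x1 && inverse(L, y1, y2)[2] ≈ x2
dx1, dx2, dW = backward(L, ones(4), ones(4), y1, y2)
```

Chaining many such layers keeps the memory footprint of backpropagation essentially independent of network depth, since only the final output needs to be retained.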

Green