Felix J. Herrmann

Felix J. Herrmann graduated from Delft University of Technology in 1992 and received his Ph.D. in engineering physics from the same institution in 1997. After research positions at Stanford University and the Massachusetts Institute of Technology, he joined the faculty of the University of British Columbia in 2002. In 2017, he moved to the Georgia Institute of Technology, where he is a Georgia Research Alliance Scholar Chair in Energy, cross-appointed between the Schools of Earth & Atmospheric Sciences, Computational Science & Engineering, and Electrical & Computer Engineering. His cross-disciplinary research program spans several areas of computational imaging, including seismic and, more recently, medical imaging. Dr. Herrmann is widely known for tackling challenging problems in the imaging sciences by adapting techniques from randomized linear algebra, PDE-constrained and convex optimization, high-performance computing, machine learning, and uncertainty quantification. Over his career, he has been responsible for several cost-saving innovations in industrial time-lapse seismic data acquisition and wave-equation based imaging. In 2019, he toured the world presenting the SEG Distinguished Lecture "Sometimes it pays to be cheap – Compressive time-lapse seismic data acquisition". In 2020, he received the SEG Reginald Fessenden Award for his contributions to seismic data acquisition with compressive sensing. At Georgia Tech, he leads the Seismic Laboratory for Imaging and Modeling and is co-founder and director of the Center for Machine Learning for Seismic (ML4Seismic), which fosters industrial research partnerships to drive innovation in artificial-intelligence-assisted seismic imaging, interpretation, analysis, and time-lapse monitoring.


Sessions

07-29
16:30
30min
InvertibleNetworks.jl - Memory efficient deep learning in Julia
Philipp A. Witte, Mathias Louboutin, Ali Siahkoohi, Felix J. Herrmann, Gabrio Rizzuti, Bas Peters

We present InvertibleNetworks.jl, an open-source package for invertible neural networks and normalizing flows using memory-efficient backpropagation. InvertibleNetworks.jl uses manually implemented gradients to take advantage of the invertibility of its building blocks, which allows it to scale to large problem sizes. We present the architecture and features of the library and demonstrate its application to a variety of problems ranging from loop unrolling to uncertainty quantification. A minimal sketch of the memory-efficient backpropagation idea follows the session details below.

Green
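
The mechanism mentioned in the abstract, memory-efficient backpropagation through invertible building blocks, can be illustrated with a small stand-alone example. The sketch below is hypothetical and does not use the InvertibleNetworks.jl API; it implements a toy additive coupling layer with a linear shift, where a hand-written backward pass recomputes the layer's inputs from its outputs instead of storing forward activations.

# Hypothetical illustration only -- this is NOT the InvertibleNetworks.jl API.
# A minimal additive coupling layer with a linear shift t(x1) = W*x1, showing
# how invertibility enables memory-efficient backpropagation: the backward
# pass recomputes the layer inputs from its outputs rather than storing them.

struct AdditiveCoupling
    W::Matrix{Float64}   # parameters of the shift t(x1) = W * x1
end

# Forward: the input is split into (x1, x2); the second half is shifted.
function forward(layer::AdditiveCoupling, x1, x2)
    return x1, x2 .+ layer.W * x1
end

# Inverse: the input is recovered exactly from the output.
function inverse(layer::AdditiveCoupling, y1, y2)
    return y1, y2 .- layer.W * y1
end

# Manual backward pass: takes the output gradients and the *outputs* only.
# The forward inputs are recomputed via `inverse`, so no activations need to
# be kept in memory -- this is what lets deep invertible networks scale.
function backward(layer::AdditiveCoupling, Δy1, Δy2, y1, y2)
    x1, _ = inverse(layer, y1, y2)     # recompute the forward inputs
    Δx2 = Δy2                          # y2 = x2 + W*x1  =>  ∂y2/∂x2 = I
    Δx1 = Δy1 .+ layer.W' * Δy2        # chain rule through the shift term
    ΔW  = Δy2 * x1'                    # parameter gradient (outer product)
    return Δx1, Δx2, ΔW
end

# Usage: round-trip check that the layer is exactly invertible.
layer  = AdditiveCoupling(randn(4, 4))
x1, x2 = randn(4), randn(4)
y1, y2 = forward(layer, x1, x2)
@assert inverse(layer, y1, y2)[2] ≈ x2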