I am pursuing a Ph.D. in Computational Science and Engineering at the Georgia Institute of Technology. My research currently focuses on applications of deep learning to inverse problems and uncertainty quantification.
We present InvertibleNetworks.jl, an open-source package for invertible neural networks and normalizing flows with memory-efficient backpropagation. InvertibleNetworks.jl uses manually implemented gradients that exploit the invertibility of its building blocks, which allows it to scale to large problem sizes. We present the architecture and features of the library and demonstrate its application to a variety of problems, ranging from loop unrolling to uncertainty quantification.