JuliaCon 2020 (times are in UTC)

NeuralProcesses.jl: Composing Neural Processes with Flux
2020-07-29, Red Track

Neural Processes (NPs) are a rich class of models for meta-learning that have enjoyed a flurry of interest recently. We present NeuralProcesses.jl, a compositional framework for constructing and training NPs built on top of Flux.jl. We demonstrate how the Convolutional Conditional Neural Process (ConvCNP), a new member of the NP family, can be implemented with the framework. The ConvCNP builds in translation equivariance, an important inductive bias for many learning problems.


(Conditional) Neural Processes ((C)NPs) [^1] [^2] are a rich class of models that parametrise the predictive distribution through an encoding of the observed data. Their flexibility allows them to be deployed in a wide range of applications, such as image completion and generation, time series modelling, and spatio-temporal modelling. Neural processes have enjoyed much interest recently, resulting in the development of several well-performing members of the neural process family. In an effort to accelerate the development and evaluation of NP architectures, we present NeuralProcesses.jl [^3], a framework for NPs built on top of Flux.jl. NeuralProcesses.jl provides basic building blocks that can be flexibly composed, mixed, and matched to reconstruct many (conditional) neural process architectures from the literature, as well as to construct novel ones.
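To give a flavour of the idea, the sketch below implements a bare-bones CNP directly in Flux.jl. It is a minimal illustration, not the NeuralProcesses.jl API: all names and layer sizes are our own choices. Each context point (x, y) is encoded, the encodings are mean-pooled into a single representation, and a decoder maps that representation together with each target input to the mean and variance of a Gaussian predictive distribution.

```julia
using Flux, Statistics

# A bare-bones CNP sketch (illustrative; not the NeuralProcesses.jl API).
encoder = Chain(Dense(2, 64, relu), Dense(64, 64))   # (x, y) ↦ rᵢ
decoder = Chain(Dense(65, 64, relu), Dense(64, 2))   # (r, x) ↦ (μ, log σ²)

function predict(xc, yc, xt)
    # Encode every context point and aggregate by mean-pooling,
    # which makes the representation invariant to the context order.
    r = mean(encoder(vcat(xc, yc)); dims=2)
    # Decode each target input against the shared representation.
    out = decoder(vcat(repeat(r, 1, size(xt, 2)), xt))
    return out[1:1, :], exp.(out[2:2, :])            # mean and variance
end

# Inputs are matrices with one column per data point:
μ, σ² = predict(rand(1, 10), rand(1, 10), rand(1, 15))
```

Training would then maximise the Gaussian log-likelihood of the target outputs under these predictions, averaged over randomly sampled tasks.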

A recently introduced member of the neural process family, the Convolutional Conditional Neural Process (ConvCNP) [^4], accounts for translation equivariance in the data: if the observations are shifted, then the predictions should shift accordingly. Translation equivariance is an important inductive bias for many learning problems, such as time series, spatial data, and images. The ConvCNP has been demonstrated to achieve state-of-the-art performance on several established NP benchmarks.
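The key construction is the set convolution, which maps the context set to a function on a uniform grid; a CNN then processes the grid, and a second smoothing reads predictions off at the target inputs. The snippet below sketches the first step under our own naming, with a fixed RBF length scale (the actual model learns it):

```julia
# Sketch of the ConvCNP's functional encoding (our illustration, not the
# package API). The context set is smoothed onto a uniform grid with an
# RBF kernel; a density channel records where data was observed.
rbf(d², ℓ) = exp.(-d² ./ (2ℓ^2))

function set_conv(xc, yc, grid, ℓ)
    w = rbf((grid .- xc') .^ 2, ℓ)           # |grid| × |context| weights
    density = sum(w; dims=2)                 # density channel
    signal = (w * yc) ./ (density .+ 1e-8)   # normalised data channel
    return hcat(density, signal)
end

grid = collect(range(-2, 2; length=64))
h = set_conv(randn(10), randn(10), grid, 0.1)   # 64 × 2 grid encoding
```

Because the encoding lives on a uniform grid, applying a CNN to it commutes with shifts of the data, which is precisely where the ConvCNP's translation equivariance comes from.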

In this talk, we give a brief introduction to meta-learning, the neural process family, and NeuralProcesses.jl. To demonstrate the abstractions of NeuralProcesses.jl, we walk through an implementation of the ConvCNP. We then use the ConvCNP to predict a sawtooth wave, an otherwise challenging task due to the wave's discontinuous nature. We conclude with a brief demonstration of how the building blocks of NeuralProcesses.jl can be used to construct several prominent NP architectures.
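For readers who want to reproduce the setup, a task sampler for the sawtooth experiment can be as simple as the following sketch; the frequency and input ranges are our own choices, not necessarily those used in the talk.

```julia
# Hypothetical sampler for sawtooth meta-learning tasks: each task is a
# sawtooth wave with random frequency and phase, split into a context set
# (observed) and a target set (to be predicted).
sawtooth(x, freq, shift) = 2 .* mod.(freq .* x .+ shift, 1) .- 1

function sample_task(; n_context=10, n_target=50)
    freq, shift = 3 + 2 * rand(), rand()   # random frequency and phase
    xc = 4 .* rand(n_context) .- 2         # context inputs in [-2, 2]
    xt = 4 .* rand(n_target) .- 2          # target inputs in [-2, 2]
    return (xc, sawtooth(xc, freq, shift)), (xt, sawtooth(xt, freq, shift))
end
```

The discontinuities at the teeth are what make this task hard for architectures without the right inductive bias.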

[^1] https://arxiv.org/abs/1807.01613

[^2] https://arxiv.org/abs/1807.01622

[^3] https://github.com/wesselb/NeuralProcesses.jl

[^4] https://openreview.net/forum?id=Skey4eBYPS

Hello! I am a PhD student in the Machine Learning Group at the University of Cambridge, supervised by Dr Richard Turner.