JuliaCon 2020 (times are in UTC)

KernelFunctions.jl, machine learning kernels for Julia
07-29, 12:30–12:40 (UTC), Red Track

Kernel functions are used in a huge number of applications, ranging from support vector machines (SVMs) to Gaussian processes and Stein variational methods.
With KernelFunctions.jl we propose a modular and easily customizable kernel framework. The emphasis of this package is on working smoothly with automatic differentiation while allowing the construction of arbitrarily complex kernels, both in terms of input transformations and kernel structure.
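As a flavour of what the package enables, here is a minimal sketch using names from KernelFunctions.jl's documented API (`SqExponentialKernel`, `kernelmatrix`); exact details may differ between versions:

```julia
# Minimal usage sketch, assuming KernelFunctions.jl is installed.
using KernelFunctions

k = SqExponentialKernel()       # squared-exponential (RBF) kernel
x = range(0.0, 1.0; length=5)   # five one-dimensional inputs
K = kernelmatrix(k, x)          # 5×5 positive semi-definite Gram matrix
v = k(0.2, 0.4)                 # pointwise kernel evaluation
```

Because the kernel objects are plain Julia structs and the evaluations are ordinary Julia functions, automatic-differentiation packages can differentiate through `kernelmatrix` with respect to inputs or kernel parameters.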


Realising that every package requiring kernels had its own implementation, and that MLKernels.jl, the previous standard, had many incompatibilities, William Tebbutt and I decided to work on a common project usable by all.
KernelFunctions.jl allows arbitrary transformations of the input data, making it possible, for example, to preprocess inputs with a neural network, and supports kernel sums and products for building complex kernel structures. A range of standard kernels is available, with more being added regularly, and creating a custom kernel is extremely straightforward.
The package has only a few dependencies and is very lightweight.
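The composability described above can be sketched as follows. The `∘`-based transform syntax and the `SimpleKernel`/`kappa` extension point are taken from KernelFunctions.jl's documentation; treat this as an illustration under those assumptions rather than a version-exact reference:

```julia
using KernelFunctions

# Input transformations: compose a kernel with a transform.
k1 = SqExponentialKernel() ∘ ScaleTransform(2.0)          # rescale inputs
k2 = Matern32Kernel() ∘ FunctionTransform(x -> sin.(x))   # arbitrary map
# A neural network could be plugged in the same way via FunctionTransform.

# Kernel algebra: scaled sums and products remain valid kernels.
k = 0.5 * k1 + k2 * LinearKernel()

# Custom kernel sketch: a stationary kernel only needs `kappa` and `metric`.
struct ExpDecayKernel <: KernelFunctions.SimpleKernel end  # hypothetical name
KernelFunctions.kappa(::ExpDecayKernel, d::Real) = exp(-d)
KernelFunctions.metric(::ExpDecayKernel) = KernelFunctions.Euclidean()
```

With these two or three method definitions, the custom kernel participates in the same composition, `kernelmatrix`, and automatic-differentiation machinery as the built-in kernels.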

I will give a brief introduction to kernel methods, followed by a few concrete examples such as deep kernel learning and automatic kernel selection.

PhD student at TU Berlin with Prof. Opper. I am interested in Bayesian methods, more particularly in approximate inference and Gaussian processes. I am developing AugmentedGaussianProcesses.jl and KernelFunctions.jl.
