Training deep learners and other iterative models with MLJ

MLJ.jl (Machine Learning in Julia) is a toolbox written in Julia providing meta-algorithms for selecting, tuning, evaluating, composing and comparing over 150 machine learning models written in Julia and other languages. We describe new developments enabling a user to wrap iterative models, such as a gradient tree booster or a Flux neural network, in a "control strategy". Wrapping hyper-parameter tuning itself in a control strategy is a particularly powerful possibility that we also discuss.
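
As a taste of the idea, here is a minimal sketch of how wrapping an iterative model in a control strategy might look, assuming MLJ and the EvoTrees.jl gradient-boosting package are installed; the particular controls and parameter values chosen are illustrative only:

```julia
using MLJ

# Load a gradient tree booster (assumes EvoTrees.jl is installed):
EvoTreeClassifier = @load EvoTreeClassifier pkg=EvoTrees

X, y = @load_iris  # small demonstration dataset

# Wrap the booster in a control strategy: add 5 iterations per step,
# stop after 3 consecutive non-improving out-of-sample losses,
# and cap the total number of control cycles at 100:
iterated_model = IteratedModel(
    model=EvoTreeClassifier(),
    resampling=Holdout(fraction_train=0.8),
    measure=log_loss,
    controls=[Step(5), Patience(3), NumberLimit(100)],
)

# Train as usual; iteration stops when a control says so:
mach = machine(iterated_model, X, y)
fit!(mach)
```

Because the wrapped object is itself just another MLJ model, it can be evaluated, composed, or tuned like any other.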

Anthony Blaom

Anthony Blaom is a mathematician, publishing chiefly in the areas of differential geometry and dynamical systems, and a scientific computing consultant. He is a co-creator and lead contributor for MLJ, an open-source machine learning platform written in Julia, which began as a project at the Alan Turing Institute, London.

Initially trained as a mechanical engineer, Anthony earned a PhD in Mathematics at Caltech in 1998. He is currently a Senior Research Fellow in the Department of Computer Science, University of Auckland.