Training deep learning and other iterative models with MLJ

MLJ.jl (Machine Learning in Julia) is a toolbox written in Julia providing meta-algorithms for selecting, tuning, evaluating, composing and comparing over 150 machine learning models written in Julia and other languages. We describe new developments enabling a user to wrap an iterative model, such as a gradient tree booster or a Flux neural network, in a "control strategy" that governs, for example, how many iterations to apply and when to stop training early. Wrapping hyper-parameter tuning itself in a control strategy is a particularly powerful possibility we also discuss.
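The idea can be sketched as follows, using MLJ's `IteratedModel` wrapper with a gradient tree booster. This is a minimal illustration, assuming MLJ.jl and the EvoTrees.jl package are installed; the particular model, controls and their parameters shown here are choices for the example, not the only options.

```julia
using MLJ  # assumes MLJ.jl and EvoTrees.jl are in the active environment

# Load an iterative model type (a gradient tree booster):
Booster = @load EvoTreeClassifier pkg=EvoTrees

X, y = @load_iris  # a small built-in demonstration dataset

# Wrap the booster in a control strategy: add one boosting round per
# iteration (Step), stop when the out-of-sample loss fails to improve
# for 5 consecutive checks (Patience), or after 100 iterations at most
# (NumberLimit), whichever comes first:
iterated_booster = IteratedModel(
    model = Booster(),
    resampling = Holdout(fraction_train = 0.8),
    measure = log_loss,
    controls = [Step(1), Patience(5), NumberLimit(100)],
)

# The wrapped model behaves like any other MLJ model:
mach = machine(iterated_booster, X, y)
fit!(mach)
```

Because the wrapped object is itself an ordinary MLJ model, it can in turn be passed to other meta-algorithms, such as hyper-parameter tuning or model composition.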