Martin Wistuba

Martin Wistuba is a researcher at Amazon Web Services, where he works on the automation of hyperparameter optimization and Neural Architecture Search. Earlier, he was at IBM Research, where he developed tools to automate deep learning.


LinkedIn

https://linkedin.com/in/wistuba


Session

04-17
11:40
45min
Hyperparameter optimization for the impatient
Martin Wistuba

In recent years, Hyperparameter Optimization (HPO) has become a fundamental step in the training
of Machine Learning (ML) models and in the creation of automatic ML pipelines.
Unfortunately, while HPO improves the predictive performance of the final model, it comes at a significant cost in terms of both computational resources and waiting time.
This leads many practitioners to try to lower the cost of HPO by resorting to unreliable heuristics.

In this talk we will present simple and practical algorithms for users who want to train models
with near-optimal predictive performance while incurring significantly lower cost and waiting
time. The presented algorithms are agnostic to the application and the model being trained, so they are useful in a wide range of scenarios.

We provide results from an extensive experimental evaluation on public benchmarks, including comparisons with well-known techniques such as Bayesian Optimization (BO), ASHA, and Successive Halving.
We will describe the scenarios in which the biggest gains are observed (up to 30x) and give examples of how to use these algorithms in a real-world environment.
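For readers unfamiliar with Successive Halving, its core idea fits in a few lines of Python. The sketch below is a simplified, hypothetical illustration (not the implementation used in the talk): start many configurations on a small training budget, and at each rung keep only the best fraction, giving the survivors an eta-times larger budget.

```python
def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Return the surviving config after successive halving.

    configs: candidate hyperparameter settings.
    evaluate(config, budget): returns a loss (lower is better)
        after training `config` for `budget` units of resource.
    """
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1:
        # Evaluate every surviving config at the current budget.
        scores = {c: evaluate(c, budget) for c in survivors}
        # Keep only the best 1/eta fraction (lowest loss).
        survivors.sort(key=lambda c: scores[c])
        survivors = survivors[: max(1, len(survivors) // eta)]
        # Survivors get an eta-times larger budget at the next rung.
        budget *= eta
    return survivors[0]


# Toy objective (an assumption for illustration): loss shrinks as the
# budget grows, and the best learning rate is 0.1.
best = successive_halving(
    configs=[0.001, 0.01, 0.1, 0.5, 1.0],
    evaluate=lambda lr, budget: abs(lr - 0.1) + 1.0 / budget,
)
print(best)  # 0.1
```

ASHA follows the same promote-the-top-fraction rule but runs the rungs asynchronously across workers, which avoids idling while slow trials finish.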

All the code used for this talk is available on [GitHub](https://github.com/awslabs/syne-tune).

PyData: Machine Learning & Stats
B09