BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//pretalx.com//pyconde-pydata-berlin-2023//talk//DECAHT
BEGIN:VTIMEZONE
TZID:CET
BEGIN:STANDARD
DTSTART:20001029T030000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
TZNAME:CET
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20000326T020000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
TZNAME:CEST
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:pretalx-pyconde-pydata-berlin-2023-DECAHT@pretalx.com
DTSTART;TZID=CET:20230417T114000
DTEND;TZID=CET:20230417T122500
DESCRIPTION:In recent years\, Hyperparameter Optimization (HPO) has b
 ecome a fundamental step in the training\nof Machine Learning (ML) mo
 dels and in the creation of automated ML pipelines.\nUnfortunately\, w
 hile HPO improves the predictive performance of the final model\, it c
 omes at a significant cost in both computational resources and waitin
 g time.\nThis leads many practitioners to try to lower the cost of HP
 O with unreliable heuristics.\n\nIn this talk we present simple and p
 ractical algorithms for users who want to train models\nwith near-opt
 imal predictive performance while incurring significantly lower cost a
 nd waiting\ntime. The algorithms are agnostic to the application and t
 o the model being trained\, so they are useful in a wide range of sce
 narios.\n\nWe present results from extensive experiments on public be
 nchmarks\, including comparisons with well-known techniques such as B
 ayesian Optimization (BO)\, ASHA\, and Successive Halving.\nWe descri
 be the scenarios in which the biggest gains are observed (up to 30x) a
 nd give examples of how to use these algorithms in a real-world envir
 onment.\n\nAll the code used for this talk is available on [GitHub](h
 ttps://github.com/awslabs/syne-tune).
DTSTAMP:20260410T164641Z
LOCATION:B09
SUMMARY:Hyperparameter optimization for the impatient - Martin Wistuba
URL:https://pretalx.com/pyconde-pydata-berlin-2023/talk/DECAHT/
END:VEVENT
END:VCALENDAR
