Saturday 11:45–12:30 in Kursraum 3

Towards automating machine learning: benchmarking tools for hyperparameter tuning

Dr. Thorben Jensen

Audience level:
Intermediate

Description

Fine-tuning machine learning models (hyperparameter tuning) is crucial but tedious. Fortunately, optimization promises to automate this task. We give an overview of algorithms for this task and explain their inner workings. To help you select one for your project, we benchmark Python implementations against human experts. We also connect this to the broader discussion on automating machine learning.

Abstract

While fitting machine learning models to data has become a straightforward task, finding the right model to do so often is not. Many models have a practically countless number of configurations, determined by so-called hyperparameters. Simply trying out all configurations is therefore not an option. This is why Data Scientists often tweak their models manually, trying to improve them. Even though this activity can be pleasant, one nagging question often remains: would the next hyperparameter change have improved my model further?

A promising solution for automating model tuning is hyperparameter optimization, i.e. applying rigorous optimization methods to those parameters whose values are set before model training, with model performance after training as the quantity to be optimized. This talk will introduce and compare methods of hyperparameter optimization, covering the Python libraries skopt, hyperopt, bayes_opt, and smac, as well as the most common baseline approaches, Random Search and Grid Search. The inner workings of the most important optimization methods will be explained.
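To make the idea concrete, here is a minimal sketch (our own illustration, not code from the talk) of hyperparameter optimization with skopt's gp_minimize: hyperparameters span a search space, and the objective returns the cross-validated error of a model trained with them. The specific model, search space, and dataset are arbitrary choices for the example.

    # Minimal sketch of Bayesian hyperparameter optimization with skopt.
    from skopt import gp_minimize
    from skopt.space import Integer, Real
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # Search space: each dimension is a hyperparameter set before training.
    space = [
        Integer(2, 8, name="max_depth"),
        Real(0.01, 0.3, prior="log-uniform", name="learning_rate"),
        Integer(50, 300, name="n_estimators"),
    ]

    def objective(params):
        max_depth, learning_rate, n_estimators = params
        model = GradientBoostingClassifier(
            max_depth=max_depth,
            learning_rate=learning_rate,
            n_estimators=n_estimators,
            random_state=0,
        )
        # gp_minimize minimizes, so return the negative mean accuracy.
        return -cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

    result = gp_minimize(objective, space, n_calls=30, random_state=0)
    print("best score:", -result.fun)
    print("best hyperparameters:", result.x)

Random Search and Grid Search fit the same template: only the strategy for proposing the next configuration changes, which is exactly what the compared methods differ in.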

To evaluate the potential of the different hyperparameter optimization approaches, they will be benchmarked against each other and against human experts. The benchmark will focus on hyperparameter tuning for supervised learning on tabular data with XGBoost models. For increased robustness, we benchmark on multiple datasets.
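As a rough sketch of how such a benchmark can be structured (again our own simplified illustration, not the talk's actual setup), each tuner gets the same fixed evaluation budget per dataset and we record the best cross-validated score it finds. Here the tuner is plain Random Search over an XGBoost classifier; the datasets and search ranges are placeholders.

    # Simplified benchmark skeleton: same budget per tuner, multiple datasets.
    from scipy.stats import randint, uniform
    from sklearn.datasets import load_breast_cancer, load_wine
    from sklearn.model_selection import RandomizedSearchCV
    from xgboost import XGBClassifier

    datasets = {"breast_cancer": load_breast_cancer, "wine": load_wine}

    param_distributions = {
        "max_depth": randint(2, 9),
        "learning_rate": uniform(0.01, 0.29),
        "n_estimators": randint(50, 301),
        "subsample": uniform(0.5, 0.5),
    }

    for name, loader in datasets.items():
        X, y = loader(return_X_y=True)
        search = RandomizedSearchCV(
            XGBClassifier(),
            param_distributions,
            n_iter=30,       # fixed evaluation budget, identical for every tuner
            cv=3,
            scoring="accuracy",
            random_state=0,
        )
        search.fit(X, y)
        print(f"{name}: best accuracy {search.best_score_:.3f} "
              f"with {search.best_params_}")

Swapping the Random Search loop for skopt, hyperopt, bayes_opt, or smac while keeping the budget and datasets fixed is what makes the comparison, including against human experts, meaningful.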

With our talk we intend to provide a practical understanding of automated model tuning, as well as of the general potential of automating machine learning. We will highlight the parts of machine learning workflows that hyperparameter tuning can automate effectively. Based on this, we will identify other tasks in machine learning that would benefit from automation.
