Thursday 15:00–15:30 in Track 2

Maxing out supervised learning with model-based hyperparameter selection

Artur SuchwaƂko, Tomasz Melcer

Audience level:
Intermediate

Description

Hyperparameter tuning is one of the troublesome parts of training modern classifiers like XGBoost or neural networks: a bad selection will make an otherwise good classifier useless, and even good hyperparameters can often be made better. We will show with examples how model-based selection offers a more principled and faster way to find hyperparameters that work well.

Abstract

Hyperparameter tuning is one of the troublesome parts of training modern classifiers like XGBoost or neural networks: a bad selection will make an otherwise good classifier useless, and even good hyperparameters can often be made better. This problem can be treated as a black-box optimization problem. To solve it, supervised learning practitioners commonly rely on methods like expert judgment, grad student optimization, or grid and random search. A family of more principled model-based methods has recently brought much-needed improvements to this field: faster and more accurate selection. We will show how to apply model-based methods to the hyperparameter selection problem, with examples in Python and R. We will also discuss the strengths and weaknesses of these methods when used in a business/commercial setting.
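The core idea behind model-based (sequential model-based optimization, SMBO) methods can be sketched in a few lines: fit a cheap surrogate model to the hyperparameter/loss pairs observed so far, then use it to pick the most promising hyperparameter to try next. The sketch below is illustrative only, not the speakers' material: it uses a toy quadratic as a hypothetical stand-in for an expensive train-and-validate cycle, a scikit-learn Gaussian process as the surrogate, and a lower-confidence-bound acquisition rule.

```python
# A minimal SMBO sketch for a single hyperparameter ("learning rate").
# Assumptions: scikit-learn is available; validation_loss is a hypothetical
# stand-in for training a real model and measuring its validation loss.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def validation_loss(lr):
    # Toy objective with its minimum at lr = 0.1; in practice this would
    # train XGBoost or a neural network and return the validation loss.
    return (lr - 0.1) ** 2


rng = np.random.default_rng(0)
candidates = np.linspace(0.001, 1.0, 200)           # search space
observed_x = list(rng.choice(candidates, size=3))   # a few random warm-up trials
observed_y = [validation_loss(x) for x in observed_x]

for _ in range(10):
    # Fit the surrogate to everything observed so far.
    gp = GaussianProcessRegressor().fit(
        np.array(observed_x).reshape(-1, 1), observed_y
    )
    mean, std = gp.predict(candidates.reshape(-1, 1), return_std=True)
    # Lower confidence bound: favor points the surrogate predicts are good
    # (low mean) or about which it is still uncertain (high std).
    next_x = candidates[np.argmin(mean - 1.96 * std)]
    observed_x.append(next_x)
    observed_y.append(validation_loss(next_x))

best_lr = observed_x[int(np.argmin(observed_y))]
print(best_lr)
```

Unlike grid or random search, each trial here is informed by all previous ones, which is what buys the faster convergence mentioned above; libraries such as scikit-optimize (Python) or mlrMBO (R) implement refined versions of this loop.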
