Sunday 10:15 AM–11:00 AM in Modeling & Data Techniques - Rm 100A

A Brief Introduction to Hyperparameter Optimization

Jill Cates

Audience level:
Novice

Description

Hyperparameters are often described as "knobs" that control the learning process of a machine learning model. The hyperparameters that you choose for your model can significantly impact its performance. In this talk, we will discuss examples of hyperparameter optimization and specific techniques such as grid search, random search, and sequential model-based optimization.
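As a concrete illustration of the first two techniques, the sketch below tunes a random forest with scikit-learn's GridSearchCV and RandomizedSearchCV. The estimator, dataset, and parameter ranges are illustrative assumptions, not examples taken from the talk.

```python
# Minimal sketch: grid search vs. random search with scikit-learn.
# Dataset, estimator, and parameter ranges are illustrative choices.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# Hyperparameter "knobs" to tune for a random forest.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, None],
}

# Grid search: exhaustively evaluates every combination with cross-validation.
grid = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
grid.fit(X, y)
print("Grid search best params:", grid.best_params_)

# Random search: samples a fixed number of configurations from the same space.
rand_search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=param_grid,
    n_iter=5,
    cv=5,
    random_state=0,
)
rand_search.fit(X, y)
print("Random search best params:", rand_search.best_params_)
```

Grid search scales poorly as the number of hyperparameters grows, which is why random search is often preferred as a baseline for larger search spaces.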

Abstract

Machine learning models tend to have an overwhelming number of configurations (hyperparameters) that can drastically impact your results. Selecting the best hyperparameters is not a problem with a closed-form solution; it is an iterative process that involves training the model with different hyperparameter settings and evaluating the performance of each configuration.

Hyperparameter optimization is a key step in the data science pipeline that aims to identify the hyperparameters that optimize model performance. In this talk, we will walk through the most popular techniques for tuning hyperparameters: grid search, random search, and sequential model-based (Bayesian) optimization.

We will delve into how hyperparameters are tuned with a variety of tools, including scikit-learn, scikit-optimize, and hyperopt.
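To make the sequential model-based piece concrete, here is a minimal sketch using scikit-optimize's BayesSearchCV as a drop-in replacement for grid search. The estimator and search space are again illustrative assumptions rather than material from the talk.

```python
# Minimal sketch: sequential model-based (Bayesian) optimization with
# scikit-optimize. The estimator and search space are illustrative choices.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from skopt import BayesSearchCV
from skopt.space import Categorical, Integer

X, y = load_iris(return_X_y=True)

# The search space is defined with ranges and categories rather than a fixed grid.
search_space = {
    "n_estimators": Integer(50, 300),
    "max_depth": Integer(2, 10),
    "criterion": Categorical(["gini", "entropy"]),
}

# Each iteration fits a surrogate model to past results and proposes the
# most promising configuration to evaluate next.
opt = BayesSearchCV(
    RandomForestClassifier(random_state=0),
    search_space,
    n_iter=20,
    cv=5,
    random_state=0,
)
opt.fit(X, y)
print("Best params:", opt.best_params_)
print("Best CV score:", opt.best_score_)
```

hyperopt exposes a similar workflow through its fmin function and the TPE algorithm, where you supply an objective function and a search space instead of a scikit-learn estimator.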

Note: This talk is intended for people who are new to the field of data science.
