Hands-on Experience with Advanced Hyper-parameter Optimization Techniques Using Hyperopt
We'll go step by step, starting with what hyper-parameter optimization is and why it matters. We'll then implement a simple exhaustive search from scratch and do some exercises. After that we'll try Scikit-Learn's Grid Search and Random Search, and compare them with a more effective hyper-parameter optimization algorithm implemented in the Hyperopt library: TPE. We've included exercises for every part so that you get a good understanding of what you are doing. We'll also cover how to parallelize the evaluations using MongoDB, making the optimization even more effective, and discuss solutions to the difficulties commonly faced while getting it to work.
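As a taste of the from-scratch step, here is a minimal sketch of exhaustive search written with plain for loops. The model (an SVC on the iris data set) and the candidate hyper-parameter values are illustrative assumptions, not the workshop's actual exercise:

```python
# Exhaustive (grid) search with plain for loops -- a sketch.
# Model, data set, and candidate values are illustrative only.
from itertools import product

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Candidate values for two SVC hyper-parameters (illustrative choices).
C_values = [0.1, 1.0, 10.0]
gamma_values = [0.01, 0.1, 1.0]

best_score, best_params = -1.0, None
# Try every combination and keep the one with the best CV score.
for C, gamma in product(C_values, gamma_values):
    score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, {"C": C, "gamma": gamma}

print(best_params, round(best_score, 3))
```

Note how the cost grows multiplicatively with each new hyper-parameter: this is exactly the brute-force behaviour the later parts of the workshop improve on.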
A Docker image will be provided, so that participants won't have to waste time setting up the environment.
The workflow of the workshop will be:

1. Why hyper-parameter optimization matters
2. Exhaustive search from scratch, with exercises
3. Scikit-Learn's Grid Search and Random Search
4. TPE with the Hyperopt library
5. Parallelizing evaluations with MongoDB
After attending this workshop you will be able to apply hyper-parameter optimization using algorithms that choose the next hyper-parameters based on the results of previous evaluations, instead of relying on brute-force techniques alone. In short: much more efficient model training.
So, you'd better attend! ;)