Friday 15:30–17:00 in Tower Suite 3

Follow the gradient: an introduction to mathematical optimisation

Gianluca Campanella

Audience level:


Optimisation is at the heart of many mathematical models (including most ML algorithms), yet it is often overlooked as an implementation detail. Developing an appreciation for optimisation techniques leads to a better understanding of their impact on these applications.

This workshop provides a comprehensive overview of continuous optimisation, with a practical ML focus.


The workshop starts with a brief introduction to optimisation theory, introducing a common notation as well as a taxonomy of objective functions and constraints.

This is followed by a review of continuous optimisation problems and techniques. Methods and ideas from the field are then related to practical machine learning applications such as quantile regression, penalised linear regression, and deep neural networks.
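As a flavour of the kind of method covered, the sketch below (illustrative only, not taken from the workshop materials) fits a ridge-penalised linear regression by plain gradient descent; the data, penalty strength and step size are all assumed values chosen for the example.

```python
import numpy as np

# Synthetic regression problem (assumed for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

lam = 0.1   # ridge penalty strength (assumed)
lr = 0.05   # step size (assumed)
w = np.zeros(3)

for _ in range(500):
    # Gradient of the objective 0.5*||Xw - y||^2 / n + 0.5*lam*||w||^2
    grad = X.T @ (X @ w - y) / len(y) + lam * w
    w -= lr * grad

print(w)  # close to true_w, shrunk slightly towards zero by the penalty
```

The same loop structure underlies far larger models: only the objective and its gradient change, which is precisely why an intuition for gradients transfers from penalised regression to deep networks.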

Optimisation methods are introduced by first presenting the motivating ideas and corresponding ML algorithms, followed by a more formal mathematical treatment and examples using different Python packages.
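For instance (package choice assumed here, not prescribed by the workshop), SciPy's general-purpose minimiser can solve a classic test problem in a few lines:

```python
from scipy.optimize import minimize, rosen, rosen_der

# Minimise the Rosenbrock function with gradient-based BFGS;
# rosen_der supplies the analytic gradient.
res = minimize(rosen, x0=[0.0, 0.0], method="BFGS", jac=rosen_der)
print(res.x)  # res.x is approximately [1, 1], the global minimiser
```

Supplying the analytic gradient (`jac=`) rather than relying on finite differences is one of the practical choices the workshop's examples explore.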


The workshop will be of interest to anyone seeking to develop an intuition for the practical implications of optimisation choices on the training and performance of ML algorithms, particularly with regard to custom model development.

Whilst mathematical formalism is kept to a minimum, it is recommended that attendees are comfortable with undergraduate-level calculus.


Slides and notebooks are available at