This tutorial introduces key theory and methods in Variational Inference and applies them in practice, ultimately connecting VI to recent advances in generative modelling such as VAEs and GANs.
Deep generative models have become very popular in recent years, with VAEs and then GANs gaining mainstream attention. This tutorial introduces Variational Inference and the key extensions that underpin the theory of VAEs and GANs. The aim is to present the theory in an accessible way and to provide concrete examples in PyTorch. The tutorial assumes a reasonable grounding in probability, including key Bayesian terminology (prior, posterior, etc.) and basic information theory such as KL divergence and entropy. The goal is not to generate pretty pictures, but to develop an understanding of these methods and to see and implement some key results from the machine learning literature.
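As a quick check of the assumed background, the sketch below computes the KL divergence between two univariate Gaussians using the standard closed-form expression. The function name `gaussian_kl` is purely illustrative (it is not part of any library used later):

```python
import math

def gaussian_kl(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ).

    Illustrative helper for the prerequisite material; standard result:
    KL = log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2) / (2 * sigma2^2) - 1/2
    """
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# KL of a distribution with itself is zero.
print(gaussian_kl(0.0, 1.0, 0.0, 1.0))  # -> 0.0
# Shifting the mean by 1 (same unit variance) gives KL = 0.5.
print(gaussian_kl(0.0, 1.0, 1.0, 1.0))  # -> 0.5
```

If the closed-form expression and the asymmetry of KL divergence (swapping the two arguments generally changes the result) feel familiar, the rest of the tutorial should be comfortable to follow.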