Saturday 11:00–11:45 in Auditorium

Variational Inference and Python

Peadar Coyle

Audience level:
Experienced

Description

Recent improvements in Probabilistic Programming have led to a new approach called Variational Inference, an alternative to the standard method of Markov chain Monte Carlo (MCMC). We'll discuss these methods in PyMC3 and Edward, explain the theory and its limitations, and apply them to realistic examples.

Abstract

The state of the nation

There are currently three big trends in machine learning: Probabilistic Programming, Deep Learning and "Big Data". Within Probabilistic Programming, much of the innovation is in making models scale using Variational Inference. In this talk, I will show how to use Variational Inference in PyMC3 to fit a simple Bayesian Neural Network. I will also discuss how bridging Probabilistic Programming and Deep Learning opens up very interesting avenues to explore in future research.

Probabilistic Programming

Probabilistic Programming allows very flexible creation of custom probabilistic models and is mainly concerned with insight and learning from your data. The approach is inherently Bayesian, so we can specify priors to inform and constrain our models and get uncertainty estimates in the form of a posterior distribution. Using MCMC sampling algorithms we can draw samples from this posterior to estimate these models very flexibly. PyMC3 and Stan are the current state-of-the-art tools for constructing and estimating such models.
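
As a rough illustration of this workflow, the sketch below (with made-up data and hyperparameters) specifies priors for a simple linear regression in PyMC3 and draws posterior samples with MCMC:

    import numpy as np
    import pymc3 as pm

    # Toy data for illustration: y depends linearly on x plus noise
    np.random.seed(42)
    x = np.random.randn(100)
    y = 2.0 * x + 0.5 * np.random.randn(100)

    with pm.Model() as model:
        # Priors inform and constrain the parameters
        intercept = pm.Normal("intercept", mu=0.0, sd=10.0)
        slope = pm.Normal("slope", mu=0.0, sd=10.0)
        sigma = pm.HalfNormal("sigma", sd=1.0)

        # Likelihood of the observed data
        pm.Normal("y_obs", mu=intercept + slope * x, sd=sigma, observed=y)

        # MCMC: draw samples from the posterior (NUTS by default)
        trace = pm.sample(1000, tune=1000)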

One major drawback of sampling, however, is that it is often very slow, especially for high-dimensional models. That is why variational inference algorithms have recently been developed that are almost as flexible as MCMC but much faster. Instead of drawing samples from the posterior, these algorithms fit a distribution (e.g. a normal) to the posterior, turning a sampling problem into an optimization problem. ADVI -- Automatic Differentiation Variational Inference -- is implemented in PyMC3 and Stan, as well as in a new package called Edward which is mainly concerned with Variational Inference.
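
In PyMC3, the same regression model can be fitted with ADVI instead of sampled; a minimal sketch (reusing the model above, with an arbitrarily chosen iteration count):

    with model:
        # ADVI turns inference into stochastic optimization of an approximating distribution
        approx = pm.fit(n=30000, method="advi")
        # Draw samples from the fitted approximation for downstream analysis
        advi_trace = approx.sample(1000)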

In this talk we'll apply these methods of Variational Inference to regression and neural network problems, and explain their advantages for solving big-data problems in probabilistic programming. You'll leave this talk with methods you can apply in your own work, and we'll showcase some of the new features in PyMC3 and Edward.
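
For a flavour of the neural-network case, here is a minimal sketch of a one-hidden-layer Bayesian neural network fitted with ADVI in PyMC3 (the data, layer size, and iteration count are invented for illustration):

    import numpy as np
    import pymc3 as pm

    # Toy binary-classification data
    np.random.seed(0)
    X = np.random.randn(200, 2)
    y = (X[:, 0] * X[:, 1] > 0).astype(int)

    n_hidden = 5
    with pm.Model() as bnn:
        # Gaussian priors on the weights act as regularizers
        w_in = pm.Normal("w_in", mu=0.0, sd=1.0, shape=(2, n_hidden))
        w_out = pm.Normal("w_out", mu=0.0, sd=1.0, shape=(n_hidden,))

        # One hidden layer with tanh activation, sigmoid output probability
        act = pm.math.tanh(pm.math.dot(X, w_in))
        p = pm.math.sigmoid(pm.math.dot(act, w_out))
        pm.Bernoulli("out", p=p, observed=y)

        # Fit with ADVI and sample from the approximate posterior
        approx = pm.fit(n=30000, method="advi")
        trace = approx.sample(500)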

The speaker is a contributor to PyMC3.
