Sunday 14:10–14:45 in Megatorium

Bayesian Deep Learning with 10% of the Weights

Rob Romijnders

Audience level:
Intermediate

Description

Deep learning grows in popularity and use, but it has two problems: neural networks have millions of parameters, and they provide no uncertainty estimates. In this talk, we solve both problems with one simple trick: Bayesian deep learning. We show how to prune 90% of the parameters while maintaining performance. As a bonus, we get uncertainty over our predictions, which is useful for critical applications.
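
To make the idea concrete, here is a minimal NumPy sketch of the two ingredients the talk combines, assuming a mean-field Gaussian posterior over the weights (shapes, names, and the 90% pruning fraction are illustrative, not the talk's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior for one weight matrix: a mean and a standard
# deviation learned per weight (mean-field Gaussian posterior).
mu = rng.normal(0.0, 0.1, size=(256, 128))
sigma = np.abs(rng.normal(0.05, 0.01, size=(256, 128)))

def sample_weights():
    # A forward pass draws a weight sample; repeating the pass gives a
    # distribution over outputs, i.e. predictive uncertainty.
    return mu + sigma * rng.normal(size=mu.shape)

# Pruning: drop weights whose posterior is dominated by noise.
# The signal-to-noise ratio |mu| / sigma is a common criterion.
snr = np.abs(mu) / sigma
threshold = np.quantile(snr, 0.90)  # keep only the top 10% of weights
mask = snr >= threshold

print(f"kept {mask.mean():.0%} of the weights")
```

Weights with low signal-to-noise ratio can be set to zero with little effect on the network's output, which is what makes the 90% pruning possible.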

Abstract

Deep learning grows in popularity and use, but it has two problems:

- Neural networks have millions of parameters.
- Neural networks provide no uncertainty estimates.

We discuss experiments on two popular applications of deep learning: images and time series. For image classification, we train a Bayesian neural network on the CIFAR10 data set. For time series, we train on the 5,000 ECGs from UCR. In both experiments, we achieve the same performance as the baseline with less than 10% of the weights.
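
At test time, the "bonus" uncertainty comes from Monte Carlo sampling: run the forward pass several times with different weight samples and look at the spread. A toy NumPy sketch with a single linear layer standing in for the network (all names and shapes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Bayesian "network": one linear layer with a Gaussian posterior
# over its weights (illustrative stand-in for a real classifier).
mu = rng.normal(size=(4, 3))
sigma = np.full((4, 3), 0.1)
x = rng.normal(size=(1, 4))  # one input example

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Monte Carlo prediction: average the softmax outputs over many
# weight samples; the spread across samples is the uncertainty.
samples = np.stack([
    softmax(x @ (mu + sigma * rng.normal(size=mu.shape)))
    for _ in range(100)
])
mean_prob = samples.mean(axis=0)  # averaged class probabilities
std_prob = samples.std(axis=0)    # per-class uncertainty

print(mean_prob.round(3), std_prob.round(3))
```

A wide spread on an input (e.g. an unusual ECG) signals that the model is unsure, which is exactly the information critical applications need.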

This talk will be practical, so you can get started with Bayesian deep learning yourself. All code is open source and implemented in TensorFlow. The code consists of simple helper functions that you can add to your own deep learning code.
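
As an illustration of the kind of helper such code typically contains (a hypothetical NumPy sketch, not the actual repository code): variational training adds a KL-divergence term to the loss, here for a Gaussian posterior N(mu, sigma^2) against a standard normal prior.

```python
import numpy as np

def kl_to_standard_normal(mu, sigma):
    # KL( N(mu, sigma^2) || N(0, 1) ), summed over all weights.
    # This term is added to the usual training loss; it pulls the
    # posterior toward the prior and makes many weights prunable.
    return np.sum(0.5 * (mu**2 + sigma**2 - 1.0) - np.log(sigma))

mu = np.zeros((2, 2))
sigma = np.ones((2, 2))
print(kl_to_standard_normal(mu, sigma))  # zero when posterior equals prior
```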

