Friday 12:00–12:30 in Track 2

PyTorch: a framework for research of dynamic deep learning models. How, why, and what's next?

Adam Paszke

Audience level:
Intermediate

Description

PyTorch is one of the deep learning frameworks that can natively support models without a predefined structure, and it is now widely used at companies and universities around the world. The talk introduces its core tensor library, its automatic differentiation package, and its set of helpers for constructing novel machine learning models, and then discusses the future of the library.

Abstract

So, why do we need dynamic computation graphs again?

Recently, there has been rising interest in frameworks that natively support dynamic computation graphs, and there's a good reason for that. This section introduces the concept and discusses the pros and cons compared to static-graph frameworks.
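
To make the contrast concrete, here is a minimal sketch (assuming a recent PyTorch install) of a forward pass whose structure depends on the data: ordinary Python control flow decides how deep the graph gets, and gradients follow whatever path was actually executed.

    import torch

    def forward(x, w, max_steps=20):
        # Apply the weight matrix a data-dependent number of times:
        # plain Python control flow shapes the graph on every call.
        h, steps = w @ x, 0
        while h.norm() < 10 and steps < max_steps:
            h = w @ h
            steps += 1
        return h

    x = torch.randn(3)
    w = torch.randn(3, 3, requires_grad=True)
    forward(x, w).sum().backward()   # differentiates through the executed path
    print(w.grad)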

Ok, show me the code

This section showcases the different levels of abstraction at which PyTorch can be used. I'll start with the lowest one, a NumPy-like tensor library. No fancy features; just the good old imperative paradigm sprinkled with some linear algebra (on GPUs too!).
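
By way of illustration, a minimal sketch of that lowest level (the last block assumes a CUDA-capable GPU is present):

    import torch

    a = torch.randn(1000, 1000)      # like numpy.random.randn
    b = torch.randn(1000, 1000)
    c = a @ b + a.t()                # plain imperative linear algebra
    print(c.mean().item())

    if torch.cuda.is_available():    # the same code, just on the GPU
        a, b = a.cuda(), b.cuda()
        c = a @ b + a.t()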

Automatic differentiation to the rescue!

While having a NumPy alternative with a fast GPU backend is great, it still doesn't make it easy to experiment with machine learning models. This section introduces torch.autograd, a package that wraps tensors and tracks how they are used, so that derivatives of arbitrary expressions can be computed automatically.
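
A minimal sketch of how this looks, written against the current API, where tracking is requested directly on tensors via requires_grad:

    import torch

    x = torch.randn(5, requires_grad=True)
    w = torch.randn(5, requires_grad=True)

    y = (w * x).sum() ** 2   # an arbitrary expression, built imperatively
    y.backward()             # derivatives of y w.r.t. every tracked tensor

    print(x.grad)            # dy/dx, computed automatically
    print(w.grad)            # dy/dw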

High-level helpers

Automatic differentiation is a huge leap forward, but writing out more complicated models can still get pretty verbose. torch.nn was created for exactly this purpose. It features over a hundred commonly used building blocks (not only for neural networks), so that you can compose your model in a handful of lines.
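
For instance, a small classifier built from stock torch.nn modules; the layer sizes and the fake batch here are arbitrary:

    import torch
    from torch import nn

    # A two-layer classifier assembled from ready-made building blocks.
    model = nn.Sequential(
        nn.Linear(784, 128),
        nn.ReLU(),
        nn.Linear(128, 10),
    )

    x = torch.randn(32, 784)             # a fake batch of 32 inputs
    targets = torch.randint(10, (32,))   # fake class labels
    loss = nn.functional.cross_entropy(model(x), targets)
    loss.backward()                      # gradients for every parameter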

Extending the library

So far so good, but we're talking about a research library, and it's not always enough to rely on what has been provided for you. There's also no point in having the whole Python package ecosystem available if you can't mix and match different libraries. No need to worry, we've thought about this too. In this section I'll show that it's straightforward to extend PyTorch with your own code that integrates seamlessly with the modules that are already implemented.
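
As one sketch of what this can look like, here is a toy custom autograd operation whose forward pass is computed with NumPy yet still plugs into the rest of the autograd machinery (the op itself, NumpyExp, is a made-up example, not part of PyTorch):

    import numpy as np
    import torch

    class NumpyExp(torch.autograd.Function):
        # A toy op: the forward pass drops into NumPy, the backward
        # pass hands gradients back to PyTorch.
        @staticmethod
        def forward(ctx, x):
            result = torch.from_numpy(np.exp(x.detach().cpu().numpy()))
            result = result.to(x.device)
            ctx.save_for_backward(result)
            return result

        @staticmethod
        def backward(ctx, grad_output):
            result, = ctx.saved_tensors
            return grad_output * result   # d/dx exp(x) = exp(x)

    x = torch.randn(4, requires_grad=True)
    NumpyExp.apply(x).sum().backward()
    print(x.grad)                         # matches torch.exp(x)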

Next steps

Finally, I'll discuss recently added features (distributed training) and those under active development (a just-in-time compiler for those really complicated models).
