MyGrad: Drop-in automatic differentiation for NumPy

Ryan Soklaski

Prior knowledge:
Previous knowledge expected
Familiarity with differential calculus. Some familiarity with automatic differentiation frameworks, like PyTorch or JAX.

Summary

MyGrad is a lightweight library that adds automatic differentiation to NumPy – its only dependency is NumPy! You can simply drop a MyGrad tensor into NumPy-based code to make it differentiable. Learn how this library works its magic, how it is being used to introduce high schoolers to gradient-based optimization methods, and how you can contribute to the project!

Description

Automatic differentiation frameworks (e.g. PyTorch, JAX, TensorFlow, Zygote.jl) have exploded in popularity and utility over the past decade, thanks in large part to the success of gradient-based optimization algorithms in training machine learning models (specifically, neural networks).

MyGrad is a lightweight library that adds automatic differentiation to NumPy – its only dependency is NumPy! Its primary goal is to make automatic differentiation accessible and easy to use across the Python/NumPy ecosystem. MyGrad is not meant to compete with its industrial-grade brethren, but to give NumPy users a true "it's just NumPy, but with autodiff" experience; NumPy's ufuncs, view semantics, and in-place operations are all richly supported, so you can drop a MyGrad tensor into your NumPy code and start differentiating!
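
To give a concrete sense of that drop-in workflow, here is a minimal sketch (the variable names are my own): a MyGrad tensor is fed through an ordinary NumPy expression, and calling backward() populates the gradient.

    import mygrad as mg
    import numpy as np

    x = mg.tensor([1.0, 2.0, 3.0])  # behaves like a NumPy array, but records operations

    f = np.sum(x * x)               # a plain NumPy function call returns a MyGrad tensor
    f.backward()                    # back-propagate: compute df/dx for every element of x

    print(x.grad)                   # [2. 4. 6.]  -- the gradient 2*x, stored as a NumPy array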

Check out this lightning talk to see how MyGrad makes keen use of NumPy's recently added function-dispatch mechanisms to work its magic. See some examples of MyGrad in action (linear regression, CNNs, and attention-based transformers, oh my!). And learn how MyGrad is being used in the classroom to make gradient-based algorithms accessible to high school and college students.
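
For a flavor of the simplest of those examples, the sketch below fits a one-variable linear model by gradient descent using nothing but MyGrad and NumPy. It is an illustrative sketch rather than code from the talk; the synthetic data, learning rate, and the tensor-rebuilding update step are my own choices.

    import mygrad as mg
    import numpy as np

    # Synthetic data for y ~ 3*x + 1 (illustrative only)
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 3.0 * x + 1.0 + 0.1 * rng.normal(size=100)

    w = mg.tensor(0.0)   # slope
    b = mg.tensor(0.0)   # intercept
    lr = 0.1

    for _ in range(100):
        loss = mg.mean((w * x + b - y) ** 2)  # ordinary NumPy-style arithmetic
        loss.backward()                       # populates w.grad and b.grad
        # simple gradient-descent step: rebuild each parameter from its updated value
        w = mg.tensor(w.data - lr * w.grad)
        b = mg.tensor(b.data - lr * b.grad)

    print(w.data, b.data)  # should approach ~3.0 and ~1.0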