This tutorial aims to provide some examples of how to write effective Bayesian programs using TensorFlow and TensorFlow Probability. In TFP land, effectiveness usually comes from writing models whose functions are batch-able, and from utilizing modern hardware (GPU, TPU) via an accelerator compiler (i.e., XLA). I will give a walkthrough of how to do so and highlight some gotchas.
In a way, writing down a probabilistic program (in most cases, a Bayesian model that can be represented as a directed acyclic graph) that runs is pretty straightforward. However, scaling it up is often non-trivial, especially for models containing lots of control flow. An extensive rewrite of your model might be needed to make it run fast. In this tutorial, I will give some examples of how to write a "disciplined" TFP model and its fitting/sampling routine so that it is: 1) batch-friendly, so it can take advantage of the automatic parallel computation built into many algorithms; 2) XLA-compatible, so it can be accelerated on GPU and TPU.