Thursday October 28 9:30 AM – Thursday October 28 10:00 AM in Talks II

Components, Workflows, and Cookbooks - Building Medical Grade AI pipelines with Argo Workflows

Omri Fima

Prior knowledge:
No previous knowledge expected

Summary

In this talk, I will share the practices and techniques we use to build reusable, production-grade AI pipelines with Argo Workflows: how to write workflows that are more reusable and easier to debug and deploy, and, finally, how we integrated Argo with Jupyter Notebooks for robust, fast ad-hoc experimentation.

Description

  • A brief introduction to Argo Workflows, and why we use them at Diagnostic Robotics: a quick look at what we do and how creating reusable research pipelines at scale became a challenge.
  • Why Argo YAML configurations are great. How Argo helped us build stable, reusable pipelines, and why we found the simplicity of Argo's YAML DAGs a better fit than Airflow's Python-based DAGs (a minimal example appears after this list).
  • Why Argo YAML configurations are a nightmare. The pains we hit as our DAGs became more complex and more varied, and how the limitations of YAML led us to build a Python DSL over it and enjoy the beauties of both worlds (see the DSL sketch after this list).
  • Components, Workflows, and Cookbooks - the building blocks of a reusable Argo pipeline. I will share the system we built to organize and structure our DAG code, allowing us to build reusable, composable pipelines and adapt with ease to a wide range of experiments, models, and flows (one plausible shape of such a component appears after this list).
  • How we do CI/CD for our pipelines. How we use K8s deployment mechanisms to build, test, and deploy our pipelines with ease.
  • Running pipelines ad hoc with Jupyter notebooks. Finally, I will share how we use Argo's API together with our internally built library to mix the interactive nature of Jupyter Notebooks with the scale of Argo pipelines, enabling ad-hoc experimentation at scale (a minimal submission sketch follows this list).
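
For illustration, a minimal Argo Workflows DAG of the kind contrasted with Airflow above looks like this; the image and script names are placeholders, not our production pipeline:

    apiVersion: argoproj.io/v1alpha1
    kind: Workflow
    metadata:
      generateName: train-eval-
    spec:
      entrypoint: main
      templates:
        - name: main
          dag:
            tasks:
              - name: train
                template: train-model
              - name: evaluate
                template: evaluate-model
                dependencies: [train]
        - name: train-model
          container:
            image: myorg/trainer:latest    # placeholder image
            command: [python, train.py]
        - name: evaluate-model
          container:
            image: myorg/trainer:latest    # placeholder image
            command: [python, evaluate.py]

The whole DAG, its steps, and their dependencies fit in one declarative file, which is what made YAML attractive before our pipelines grew more varied.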
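
As a rough sketch, a toy Python DSL over Argo's YAML might look like the snippet below; the Task and workflow names are hypothetical, not the actual internal library discussed in the talk:

    # Hypothetical sketch: a tiny Python DSL that renders an Argo Workflow
    # manifest. Illustrative only; not the library from the talk.
    import yaml

    class Task:
        def __init__(self, name, image, command, dependencies=()):
            self.name = name
            self.image = image
            self.command = command
            self.dependencies = list(dependencies)

        def template(self):
            # Container template executed by this task
            return {"name": self.name,
                    "container": {"image": self.image, "command": self.command}}

        def dag_task(self):
            # Node in the DAG, wired to its dependencies
            task = {"name": self.name, "template": self.name}
            if self.dependencies:
                task["dependencies"] = self.dependencies
            return task

    def workflow(name, tasks):
        # Assemble a complete Argo Workflow manifest from Task objects
        return {
            "apiVersion": "argoproj.io/v1alpha1",
            "kind": "Workflow",
            "metadata": {"generateName": f"{name}-"},
            "spec": {
                "entrypoint": "main",
                "templates": [{"name": "main",
                               "dag": {"tasks": [t.dag_task() for t in tasks]}}]
                              + [t.template() for t in tasks],
            },
        }

    train = Task("train", "myorg/trainer:latest", ["python", "train.py"])
    evaluate = Task("evaluate", "myorg/trainer:latest",
                    ["python", "evaluate.py"], dependencies=["train"])
    print(yaml.safe_dump(workflow("train-eval", [train, evaluate])))

Tasks and dependencies are composed in ordinary Python, while the output stays a plain Argo manifest: the "both worlds" trade-off the bullet above refers to.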
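
The exact Components/Workflows/Cookbooks structure is internal to the talk, but Argo's own WorkflowTemplate resource gives a flavor of what a reusable "component" can look like; the names below are illustrative:

    apiVersion: argoproj.io/v1alpha1
    kind: WorkflowTemplate
    metadata:
      name: preprocess    # illustrative component name
    spec:
      templates:
        - name: preprocess
          inputs:
            parameters:
              - name: dataset
          container:
            image: myorg/preprocess:latest    # placeholder image
            command: [python, preprocess.py]
            args: ["--dataset", "{{inputs.parameters.dataset}}"]

Any workflow can then reuse this step via a templateRef, so the same parameterized component can appear in many experiment pipelines.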
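
As a rough sketch of the ad-hoc flow, a notebook cell can submit a workflow through the Argo server's REST API; the server URL, namespace, and token below are placeholders, and our internal library wraps this kind of call:

    # Hypothetical sketch: submit a Workflow from a notebook via the
    # Argo Workflows REST API. Endpoint and token values are placeholders.
    import requests

    ARGO_SERVER = "https://argo-server.example.com"  # placeholder URL
    NAMESPACE = "research"                           # placeholder namespace

    manifest = {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": "adhoc-"},
        "spec": {
            "entrypoint": "main",
            "templates": [{
                "name": "main",
                "container": {"image": "python:3.11",
                              "command": ["python", "-c", "print('hello')"]},
            }],
        },
    }

    resp = requests.post(
        f"{ARGO_SERVER}/api/v1/workflows/{NAMESPACE}",
        json={"workflow": manifest},
        headers={"Authorization": "Bearer <token>"},  # placeholder token
    )
    resp.raise_for_status()
    print("Submitted:", resp.json()["metadata"]["name"])

The cell stays interactive while the heavy lifting runs on the cluster, which is the mix of notebook ergonomics and Argo scale described in the last bullet.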