Using Airflow to build Data Pipelines

Abstract
Airflow has become the de facto standard for scheduling and monitoring workflows. Its extensible framework has led to widespread adoption by the community, especially for designing and orchestrating ETL pipelines and ML workflows. In this talk, we will examine some of the issues and design principles commonly seen in the data pipeline space, and see how Airflow can incorporate them to build reliable and scalable pipelines.
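For attendees who have not seen Airflow before, the sketch below gives a rough idea of what a minimal pipeline looks like: a DAG with two dependent Python tasks. The DAG id "example_etl", the daily schedule, and the task bodies are illustrative assumptions for this listing, not material from the talk itself.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real pipeline would pull data from a source system.
    return {"rows": 42}


def load(ti):
    # Placeholder load step; pulls the upstream task's return value via XCom.
    data = ti.xcom_pull(task_ids="extract")
    print(f"Loading {data}")


with DAG(
    dag_id="example_etl",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # run once per day
    catchup=False,                    # skip backfilling past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare the dependency: load runs only after extract succeeds.
    extract_task >> load_task

Dropping a file like this into Airflow's dags folder is enough for the scheduler to pick it up and run it on the given schedule.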
Bio
Dakshin is a Solution Consultant at Sahaj AI Software. When he isn't busy bingeing TV or playing video games, he enjoys nerding out over computers and technology in general. Over the past year, he has been working on a Data Platform with Airflow at its core, where he encountered most of the concepts covered in this talk. His current hobbies include cycling and building custom mechanical keyboards.
Prerequisites for the talk
- Beginner-level understanding of Python
- A basic understanding of data engineering would help but is not mandatory
Zoom link: https://us02web.zoom.us/j/87201735240?pwd=eEtGcVhVMUlTcUVqMk11QjR5bjJLUT09