Airflow: To Manage Data Pipelines
Submitted by Mahendra Yadav (@userimack) on Thursday, 17 August 2017
A significant share of an IT/Data Engineering team's time is spent writing and scheduling jobs, and monitoring and troubleshooting issues when they arise. Enterprise data originates from various sources, and various business rules and processes govern how that data can be consumed.
Airflow is a platform to programmatically author, schedule and monitor workflows. (https://airflow.incubator.apache.org/)
The tasks in a workflow are configured as a Directed Acyclic Graph (DAG). This talk covers how Airflow is used to establish better workflows for data engineering.
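To make the DAG idea concrete, here is a minimal sketch (plain Python, not the Airflow API) of what "tasks configured as a Directed Acyclic Graph" means in practice: each task declares its upstream dependencies, and a scheduler runs a task only after everything upstream has finished. The task names and the `topological_order` helper are illustrative, not part of Airflow.

```python
# Conceptual sketch: a workflow's tasks form a Directed Acyclic Graph,
# and a scheduler runs each task only after all of its upstream
# dependencies have completed (Kahn's topological-sort algorithm).
from collections import deque

def topological_order(dependencies):
    """Return a valid execution order for a DAG given as
    {task: [upstream_tasks]}; raise if the graph has a cycle."""
    indegree = {t: len(ups) for t, ups in dependencies.items()}
    downstream = {t: [] for t in dependencies}
    for task, ups in dependencies.items():
        for up in ups:
            downstream[up].append(task)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(dependencies):
        raise ValueError("cycle detected: not a valid DAG")
    return order

# A typical ETL shape: extract -> transform -> load -> report.
pipeline = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
    "report": ["load"],
}
print(topological_order(pipeline))  # extract, transform, load, report
```

Airflow expresses the same structure with operator objects and `set_upstream`/`>>` dependencies inside a `DAG`; the acyclicity requirement is what guarantees the scheduler can always find a valid run order.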
P.S.: This talk is inspired by Bargava Subramanian's (@barsubra) proposal.
- Existing challenges in data engineering - creating/monitoring/troubleshooting workflows
- Introduction to Airflow
- Main advantages of Airflow
- Tasks as DAG
- Airflow in practice - case study
- Dynamic pipeline generation
- Demo UI dashboards
- Data Engineering at Scale
- Brief overview of what other options exist
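One outline item worth a concrete illustration is dynamic pipeline generation: because Airflow DAG files are ordinary Python, the task list can be derived from configuration instead of written by hand. The sketch below uses plain dictionaries rather than Airflow operators so it stays self-contained; the table names, script names, and `build_tasks` helper are all hypothetical.

```python
# Sketch of dynamic pipeline generation: derive one extract/load task
# pair per configured table, instead of hand-writing each task.
# In a real Airflow DAG file the loop body would instantiate operators
# (e.g. BashOperator) and wire their dependencies.
TABLES = ["users", "orders", "payments"]  # hypothetical source tables

def build_tasks(tables):
    """Generate an extract task and a dependent load task per table."""
    tasks = []
    for table in tables:
        tasks.append({"task_id": f"extract_{table}",
                      "command": f"python extract.py --table {table}"})
        tasks.append({"task_id": f"load_{table}",
                      "upstream": f"extract_{table}",
                      "command": f"python load.py --table {table}"})
    return tasks

for task in build_tasks(TABLES):
    print(task["task_id"])
```

Adding a new source then means adding one entry to the configuration, not editing the pipeline code — which is a large part of why programmatic authoring scales better than hand-maintained cron jobs.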
Mahendra Yadav is a Data Engineer at Azri Solutions, Hyderabad. In his day to day work he processes a lot of data from different sources.