Introducing dag-factory

November 21, 2018
3 minute read

Apache Airflow is “a platform to programmatically author, schedule, and monitor workflows.” And it is currently having its moment. At DataEngConf NYC 2018, it seemed like every other talk was either about or mentioned Airflow. There have also been countless blog posts about how different companies are using the tool, and it even has a podcast!

A major use case for Airflow seems to be ETL or ELT or ETTL or whatever acronym we are using today for moving data in batches from production systems to data warehouses. This is a pattern that will typically be repeated in multiple pipelines. I recently gave a talk on how we are using Airflow at my day job and use YAML configs to make these repeated pipelines easy to write, update, and extend. This approach is inspired by Chris Riccomini’s seminal post on Airflow at WePay. Someone in the audience then asked, “how are you going from YAML to Airflow DAGs?” My response was that we had a Python file that parsed the configs and generated the DAGs. This answer didn’t seem to satisfy the audience member or me. This pattern is very common in talks and posts about Airflow, but it seems like we are all writing the same logic independently. Thus the idea for dag-factory was born.

The dag-factory library makes it easy to create DAGs from YAML configuration by following a few steps. First, install dag-factory into your Airflow environment with pip. Then define your DAG, its default arguments, and its tasks in a YAML file. Finally, add a small Python file to your DAGs folder that hands the config to dag-factory, which parses the YAML and generates the DAG for Airflow to pick up.
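To make the YAML-to-DAG idea concrete, here is a minimal, runnable sketch of the pattern dag-factory implements. The config dict, `Task` class, and `build_dags` function are stand-ins invented for this illustration (the real library parses an actual YAML file and instantiates genuine Airflow operators); the point is the wiring logic: build every task first, then connect dependencies by name.

```python
# Sketch of the config-to-DAG pattern. A plain dict stands in for the
# parsed YAML file, and a minimal Task class stands in for an Airflow
# operator, so the example runs without Airflow installed.

config = {
    "example_dag": {
        "schedule_interval": "0 3 * * *",
        "tasks": {
            "extract": {"command": "echo extract"},
            "load": {"command": "echo load", "dependencies": ["extract"]},
        },
    },
}


class Task:
    """Stand-in for an Airflow operator."""

    def __init__(self, task_id, command):
        self.task_id = task_id
        self.command = command
        self.upstream = []  # tasks that must run before this one


def build_dags(config):
    """Turn a config mapping into DAGs of wired-up Task objects."""
    dags = {}
    for dag_name, dag_conf in config.items():
        # First pass: create every task.
        tasks = {
            task_id: Task(task_id, task_conf["command"])
            for task_id, task_conf in dag_conf["tasks"].items()
        }
        # Second pass: wire dependencies by task name, now that all
        # tasks exist regardless of the order they were declared in.
        for task_id, task_conf in dag_conf["tasks"].items():
            for dep_name in task_conf.get("dependencies", []):
                tasks[task_id].upstream.append(tasks[dep_name])
        dags[dag_name] = tasks
    return dags


dags = build_dags(config)
```

The two-pass loop is the important design choice: creating all tasks before resolving dependencies means a task can depend on one declared later in the file.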

This approach offers a number of benefits including that Airflow DAGs can be created with no Python knowledge. This opens the platform to non-engineers on your team, which can be a productivity boost to both your organization and data platform.

dag-factory also allows engineers who do not regularly work with Airflow to create DAGs. These people frequently want to use the great features of Airflow (monitoring, retries, alerting, etc.), but learning about Hooks and Operators is outside the scope of their day-to-day jobs. Instead of having to read the docs (ewwww) to learn these primitives, they can create YAML configs just as easily as the cron job (ewwwwwwww) they were going to use. This will dramatically reduce the time spent onboarding new teams to Airflow.