Data sources can contain a LOT of different entities, and you might not want Meltano to pull every one of them into your dashboard. In this step, you can choose which entities to include by clicking the "Edit Selections" button on an installed extractor.
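If you prefer the command line, entity selection can also be managed with the `meltano select` command. The extractor name (`tap-gitlab`) and the entity/attribute patterns below are only examples for your own data source:

```shell
# List every entity and attribute the extractor exposes (tap-gitlab is illustrative)
meltano select tap-gitlab --list --all

# Select entities and attributes using glob patterns
meltano select tap-gitlab projects "*"
meltano select tap-gitlab issues "id"
```

Selections made this way are stored in your project's `meltano.yml`, alongside anything chosen through the UI.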

If you're using SaaS tools to manage support, sales, marketing, revenue, and other business functions, you know your data is constantly changing. To keep your dashboards up to date, Meltano provides orchestration using Apache Airflow.

TIP

Right now, Airflow cannot be installed from inside Meltano's UI, so you need to return to your command line interface.

Run the following command:

```shell
meltano add orchestrator airflow
```

Once Airflow is installed, you can view the ELT pipeline schedule(s) created in the previous Running the ELT step via the Meltano UI. For each pipeline schedule, a DAG is automatically created in Airflow.

As the Airflow documentation puts it, a DAG "is a collection of all the tasks you want to run, organized in a way that reflects their relationships and dependencies".
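For reference, pipeline schedules like these can also be created and inspected from the command line. The schedule name, plugin names, and interval below are illustrative:

```shell
# Create a daily ELT pipeline schedule (names are examples)
meltano schedule gitlab-to-postgres tap-gitlab target-postgres @daily

# List the schedules defined in this project
meltano schedule list
```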
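You can also ask Airflow directly which DAGs it has generated by invoking it through Meltano; note that the exact subcommand depends on your installed Airflow version:

```shell
# List the DAGs Airflow created from your pipeline schedules
meltano invoke airflow list_dags     # Airflow 1.x
# meltano invoke airflow dags list   # Airflow 2.x equivalent
```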

TIP

To see a list of all your scheduled DAGs within the Meltano UI under "Orchestration", you will need to kill the terminal process running the meltano ui command and then restart it. You only need to do this the first time you install Airflow.

```shell
# After installing Airflow, you will need to shut down your current instance of Meltano and restart
meltano ui
```

If you run into issues, you may have ended up with multiple instances of Airflow running at the same time. This is a known issue (#821) that is common if you have been working with multiple Meltano projects, or have killed Meltano UI from the command line.

To troubleshoot, run `sudo lsof -i -P | grep -i "listen"` from your command line. If you see multiple instances of Python listening on port 5010, kill the first instance with `kill 12345` (substituting the process ID shown for your instance). Then run `meltano ui` and try again.
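Put together, the troubleshooting steps above might look like this (5010 is the port assumed in this guide, and 12345 stands in for whatever process ID your output shows):

```shell
# Show every process listening on a port
sudo lsof -i -P | grep -i "listen"

# Narrow the output down to the port in question
sudo lsof -i :5010 -P | grep -i "listen"

# Kill the stale instance by its PID (second column of the lsof output)
kill 12345

# Then restart the UI
meltano ui
```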