We use Airflow to orchestrate our workflows and dbt for our daily transformations in BigQuery. We have two separate Git repos: one for our dbt project and a separate one for Airflow.
The simplest approach to scheduling our daily `dbt run` seems to be a `BashOperator` in Airflow. However, to schedule dbt with Airflow this way, it seems our entire dbt project would need to be nested inside our Airflow project so that the `dbt run` bash command can point to it.
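To make the question concrete, here is a minimal sketch of the `BashOperator` approach we are considering. The path `/opt/dbt_project` is hypothetical (a checkout of the dbt repo somewhere on the Airflow workers), and the operator usage is shown commented for context:

```python
import os

# Hypothetical path where the dbt repo is checked out on the Airflow workers.
# Nothing forces this to live inside the Airflow project directory.
DBT_PROJECT_DIR = os.environ.get("DBT_PROJECT_DIR", "/opt/dbt_project")

# Chain `cd` with the dbt command so dbt finds dbt_project.yml in that directory.
dbt_run_command = f"cd {DBT_PROJECT_DIR} && dbt run"
dbt_test_command = f"cd {DBT_PROJECT_DIR} && dbt test"

# Inside a DAG definition this would look roughly like:
# from airflow.operators.bash import BashOperator
# dbt_run = BashOperator(task_id="dbt_run", bash_command=dbt_run_command)
# dbt_test = BashOperator(task_id="dbt_test", bash_command=dbt_test_command)
# dbt_run >> dbt_test
```

The question is whether something like this works when the dbt checkout lives outside the Airflow repo entirely.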
Is it possible to trigger our `dbt run` and `dbt test` without moving our dbt directory inside our Airflow directory? With the airflow-dbt package, could the `dir` key in `default_args` perhaps point to the GitHub link for the dbt project?
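For reference, this is the airflow-dbt setup we have in mind, following the pattern from that package's README. The `dir` value here is a hypothetical local checkout; whether it could instead be a GitHub URL is exactly what we are unsure about:

```python
from datetime import datetime

# `dir` is where airflow-dbt's operators run dbt from. The path below is a
# hypothetical local checkout -- could this be our GitHub repo link instead?
default_args = {
    "dir": "/opt/dbt_project",
    "start_date": datetime(2021, 1, 1),
}

# Usage per the airflow-dbt README (shown commented for context):
# from airflow import DAG
# from airflow_dbt.operators.dbt_operator import DbtRunOperator, DbtTestOperator
#
# with DAG("daily_dbt", default_args=default_args, schedule_interval="@daily") as dag:
#     DbtRunOperator(task_id="dbt_run") >> DbtTestOperator(task_id="dbt_test")
```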