I am trying to run Airflow locally. My DAG has a BigQueryOperator, and I want to use the Cloud SDK for authentication. I ran "gcloud auth application-default login" to get the JSON file with the credentials. I then try to test my DAG by running the command:
airflow test testdag make_tmp_table 2019-02-13
I get the error message "User must be authenticated when user project is provided"
If, instead of using the Cloud SDK, I use a service account that has admin rights to BigQuery, it works, but I need to authenticate through the Cloud SDK.
Does anyone know what this error message means, or how I can run Airflow using the Cloud SDK for authentication?
I have used the following source to try to understand how to run Airflow with BigQueryOperators locally: https://medium.com/@jbencina/local-testing-with-google-cloud-composer-apache-airflow-75d4213d2893
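For context, here is a minimal sketch of the kind of DAG I am testing. The DAG and task ids match the test command above; the SQL, dataset/table names, and connection id are placeholders, and I am assuming the Airflow 1.10 contrib operator:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator  # Airflow 1.10.x contrib

default_args = {
    "owner": "airflow",
    "start_date": datetime(2019, 2, 1),
}

dag = DAG(
    dag_id="testdag",
    default_args=default_args,
    schedule_interval="@daily",
)

make_tmp_table = BigQueryOperator(
    task_id="make_tmp_table",
    sql="SELECT * FROM `myproject.mydataset.mytable`",  # placeholder query
    destination_dataset_table="myproject.mydataset.tmp_table",  # placeholder target table
    write_disposition="WRITE_TRUNCATE",
    use_legacy_sql=False,
    bigquery_conn_id="bigquery_default",  # default GCP connection id
    dag=dag,
)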
"quota_project_id": "myproject"
line in theapplication_default_credentials.json
file. I don't know why Airflow doesn't like the quota project ID key, but I tested it multiple times, and this was the problem. – Recur
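If it helps, here is a small sketch for stripping that key from the credentials file. The path assumes the default Linux/macOS location of the application-default credentials (adjust for Windows, %APPDATA%\gcloud\...), and you should back the file up first:

import json
import os

# Default location of the application-default credentials on Linux/macOS.
adc_path = os.path.expanduser("~/.config/gcloud/application_default_credentials.json")

with open(adc_path) as f:
    creds = json.load(f)

# Remove the key that Airflow seems to dislike, if it is present.
removed = creds.pop("quota_project_id", None)

if removed is not None:
    with open(adc_path, "w") as f:
        json.dump(creds, f, indent=2)
    print("Removed quota_project_id =", removed)
else:
    print("No quota_project_id key found; nothing to do.")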