To keep BigQuery queries separate from the actual code, I want to store the SQL in separate files and read them from my Python code. I have tried adding the files to the same bucket as the DAGs, and also to a subfolder, but it seems I can't read the files when Airflow runs the Python script that uses them.
What I want is this:
gs://my-bucket/dags -> store dags
gs://my-bucket/dags/sql -> store sql files
The SQL files may be files that I need to read first, in order to inject things that are not supported by the Jinja templating.
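Roughly, this is what I'm trying to do (a minimal sketch; the file name, placeholder syntax, and `read_sql` helper are just examples I made up for illustration):

```python
import os

# Resolve the sql/ subfolder relative to this DAG file, so the path
# works regardless of where the DAGs bucket is mounted.
SQL_DIR = os.path.join(os.path.dirname(__file__), "sql")

def read_sql(filename, **replacements):
    """Read a SQL file and substitute custom placeholders that
    Jinja templating does not cover."""
    with open(os.path.join(SQL_DIR, filename)) as f:
        sql = f.read()
    # Simple string substitution for placeholders like {table}.
    for key, value in replacements.items():
        sql = sql.replace("{%s}" % key, value)
    return sql
```

So for example `read_sql("my_query.sql", table="my_table")` would return the query text with `{table}` replaced.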
Can I do the above?