I can reliably reproduce the issue in Composer 1.9 / Airflow 1.10.6 by performing the following actions:
- Create a new Composer Cluster
- Upload a DAG that references an Airflow Connection
- Set an Environment Variable in Composer (see the sketch after this list)
- Wait for `airflow-scheduler` and `airflow-worker` to restart
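For the environment-variable step, any variable will do. A minimal sketch using the gcloud CLI, assuming a hypothetical environment named `my-composer-env` in `us-central1`:

```
$ gcloud composer environments update my-composer-env \
    --location us-central1 \
    --update-env-variables=MY_VAR=some-value
```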
Aside from the `FERNET_KEY configuration is missing` warning, the issue manifests itself with the following Airflow error banners:
```
Broken DAG: [/home/airflow/gcs/dags/MY_DAG.py] invalid literal for int() with base 10: 'XXX'
Broken DAG: [/home/airflow/gcs/dags/MY_DAG.py] Expecting value: line 1 column 1 (char 0)
```
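To see what changed on the underlying GKE cluster, point kubectl at the cluster backing the Composer environment. A sketch, reusing the hypothetical environment name and location from above:

```
# Look up the GKE cluster backing the environment (returns a full resource
# path that includes the zone and cluster name)
$ gcloud composer environments describe my-composer-env \
    --location us-central1 \
    --format="value(config.gkeCluster)"

# Fetch kubectl credentials for that cluster, substituting values from above
$ gcloud container clusters get-credentials CLUSTER_NAME --zone ZONE
```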
The root cause of the issue is that adding a new environment variable removes the `AIRFLOW__CORE__FERNET_KEY` environment variable from the `airflow-scheduler` and `airflow-worker` Kubernetes Deployment Spec Pod Templates:
```yaml
- name: AIRFLOW__CORE__FERNET_KEY
  valueFrom:
    secretKeyRef:
      key: fernet_key
      name: airflow-secrets
```
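With kubectl credentials in place, the missing entry can be confirmed directly, where `$AIRFLOW_ENV_GKE_NAMESPACE` holds the Airflow namespace on the cluster (as in the patch command below). No output means the entry has been removed:

```
$ kubectl get deployment airflow-scheduler \
    --namespace=$AIRFLOW_ENV_GKE_NAMESPACE \
    -o yaml | grep -A 4 FERNET_KEY
```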
As a workaround, it's possible to apply a Kubernetes Deployment Spec Patch:
```
$ cat config/composer_airflow_scheduler_fernet_key_patch.yaml
spec:
  template:
    spec:
      containers:
      - name: airflow-scheduler
        env:
        - name: AIRFLOW__CORE__FERNET_KEY
          valueFrom:
            secretKeyRef:
              key: fernet_key
              name: airflow-secrets
```
```
$ kubectl patch deployment airflow-scheduler \
    --namespace=$AIRFLOW_ENV_GKE_NAMESPACE \
    --patch "$(cat config/composer_airflow_scheduler_fernet_key_patch.yaml)"
```
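The rollout of the patched Pods can then be watched to completion:

```
$ kubectl rollout status deployment/airflow-scheduler \
    --namespace=$AIRFLOW_ENV_GKE_NAMESPACE
```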
NOTE: This patch must also be applied to `airflow-worker`.
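A sketch of the equivalent command for `airflow-worker`, assuming a hypothetical patch file `config/composer_airflow_worker_fernet_key_patch.yaml` with the same contents but `airflow-worker` as the container name:

```
$ kubectl patch deployment airflow-worker \
    --namespace=$AIRFLOW_ENV_GKE_NAMESPACE \
    --patch "$(cat config/composer_airflow_worker_fernet_key_patch.yaml)"
```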