How to fix "Failed to fetch log file from worker. Unsupported URL protocol" error in Airflow DAG logs?

I am running Airflow via Docker, using the apache/airflow:2.1.0 image. Please refer to this thread for the initial error I faced.

Currently I am able to run my previously existing DAGs. However, when I add new DAGs I get the following error in the log file. I am pretty sure it is not an issue with memory or compute.

*** Log file does not exist: /opt/airflow/logs/my-task/my-task/2021-06-15T14:11:33.254428+00:00/1.log
*** Fetching from: http://:8793/log/my-task/my-task/2021-06-15T14:11:33.254428+00:00/1.log
*** Failed to fetch log file from worker. Unsupported URL protocol ''

Things I have tried already:

  • Restarting my containers
  • Running docker prune and then rebuilding
  • Deleting the DAG from the frontend
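
I have also tried parsing the new DAG file by hand inside the scheduler container, since a file that fails to import never gets a chance to write logs (a sketch; the container and file names are placeholders for my setup):

# Placeholder names; substitute your scheduler container and DAG file.
docker exec -it airflow-scheduler python /opt/airflow/dags/my_dag.py
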
Clarkclarke answered 15/6, 2021 at 14:23 Comment(1)
I am also facing this issue with the CeleryExecutor in a K8s cluster deployed through the Helm chart. But after deleting the pods and deployments it works fine. – Stichter

I don't have the solution for this, but I have a clue.

Apparently, the problem is a bug that prevents Airflow from storing the log if the task never even got to run, as you know already.

So, something that is not a syntax error is causing an error. In my case, I'm 80% sure it's about Airflow not picking up the right path to my config and utils folders: the first thing the task does is try to use functions and credentials stored in those folders, and when it can't, it crashes immediately, before it is able to store any logs. I can probably do something about it in the YAML file.
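
To test that theory, you can run the import by hand inside the worker, so the failure shows up on stdout instead of in a log file that never gets written (a sketch assuming a docker-compose setup like yours; the container name and module path are placeholders):

# Placeholders: worker container name and the utils module the tasks import.
docker exec -it airflow-worker python -c "import utils.credentials"
# If this import fails, the task crashes before any log is stored.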

BTW yesterday I saw your question across multiple platforms without any answer and I want to tell you that my soul resonated with yours on this crusade to make the godforsaken Airflow DAG work. I feel you, bro.

Stuckey answered 1/7, 2021 at 6:46 Comment(1)
Yeah... I had my Airflow state mega screwed up. Luckily it was dev and I just had to blow everything away and start over. I even removed the db volume for the Airflow db and that didn't fix it. Anyway, good answer indicating that the source of the question's error is some other error. – Ephemeron

I had the same problem. For me the cause of the task failing at the beginning of the run was that my worker didn't have write permissions on the mounted logs directory (a read-only mount on a shared drive). Once I fixed that, everything started to work.
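
A quick way to check is to test writability from inside the worker itself (a sketch; the container name is a placeholder, and /opt/airflow/logs is the default log path from the traceback in the question):

# Placeholder container name; /opt/airflow/logs matches the path in the error.
docker exec -it airflow-worker ls -ld /opt/airflow/logs
docker exec -it airflow-worker touch /opt/airflow/logs/.write-test
# If touch fails, fix ownership on the host side of the mount, e.g.:
# sudo chown -R 50000:0 ./logs   # 50000 is the default airflow UID in the image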

Dhiman answered 2/7, 2021 at 12:0 Comment(0)

The same problem here. I'm using the CeleryExecutor in a K8s cluster; each component runs as an independent pod (under a Deployment). My first thought: it could be related to a lack of mounted volumes (with the log files). I'll try to mount a PVC and report back if it works.
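
In the meantime, this is how I'm checking whether the worker pods can actually see and write the log directory (a sketch; the pod name and namespace are placeholders for my cluster):

# Placeholders: your worker pod and namespace.
kubectl exec -n <namespace> -it <worker-pod> -- ls -ld /opt/airflow/logs
kubectl describe pod -n <namespace> <worker-pod>   # check the Mounts section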

Manganese answered 17/6, 2021 at 14:1 Comment(0)

If you are using kind, here is one more way to fix it:

First of all, get the default configuration file by typing:

helm show values apache-airflow/airflow > values.yaml 

After that, check that fixPermissions is set to true:

persistence:
  # Enable persistent volumes
  enabled: true
  # Volume size for worker StatefulSet
  size: 10Gi
  # If using a custom storageClass, pass name ref to all statefulSets here
  storageClassName:
  # Execute init container to chown log directory.
  # This is currently only needed in kind, due to usage
  # of local-path provisioner.
  fixPermissions: true

Update your installation by running:

helm upgrade --install airflow apache-airflow/airflow -n airflow -f values.yaml --debug
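
After the upgrade, you can verify that the worker volume was provisioned (a sketch; it assumes the airflow namespace used in the command above):

kubectl get pvc -n airflow
# The worker log PVC should show STATUS Bound; then re-run a task and check
# that new log files appear on the volume.
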
Prunelle answered 6/7, 2021 at 15:34 Comment(0)
S
0

My Airflow was installed in the /var/airflow folder. I just gave it write permission, stopped the containers, and restarted the Docker service, as shown below.
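
The exact sequence (paths are from my setup; the blanket 777 is the quick-and-dirty option, so tighten the permissions if this is not a throwaway environment):

sudo chmod -R 777 /var/airflow/   # give write permission on the install folder
docker-compose down               # stop the containers
sudo systemctl restart docker     # restart the Docker service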

Susceptible answered 17/8, 2022 at 3:44 Comment(0)
R
0

For me this error was caused by a syntax error in one of my custom operators that I was actively working on. I didn't see the DAG parse error, so I was stuck for a while. Once the syntax error was fixed, the error went away. Silly mistake, but hopefully it helps someone.
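
A direct way to surface that kind of failure is to import the operator module by hand, which prints the traceback the UI swallowed (a sketch; the module name is a placeholder for wherever your custom operators live):

# Placeholder module name; adjust to your plugins/operators layout.
python -c "import my_operators"
# A syntax error in the file raises here with a full traceback.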

Raucous answered 5/12, 2023 at 15:39 Comment(0)

It worked for me once I gave write permission to the logs directory.
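
For the official docker-compose setup, one way to do that is the UID fix from the Airflow docker-compose quickstart, which makes the containers write logs as your host user (run from the directory containing docker-compose.yaml; adjust paths to your checkout):

mkdir -p ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)" > .env
docker-compose up -d   # recreate the containers so the logs dir is writable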

Cisneros answered 28/8 at 9:45 Comment(1)
Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center. – Parodist
