Errno 13 Permission denied when Airflow tries to write to logs
We're running into a permission error when using Airflow, receiving the following error:

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler/2019-12-18/../../../../home

We've tried using chmod -R 777 on the /usr/local/airflow/logs/scheduler directory within the container, but this doesn't seem to have done the trick.

We have this piece in our entrypoint.sh script:

export AIRFLOW__CORE__BASE_LOGS_FOLDER="/usr/local/airflow/logs"

Has anyone else run into this airflow log permission issue? Can't seem to find much about this one in particular online.

Hankypanky answered 19/12, 2019 at 15:43 Comment(0)
Just for anyone with the same issue...

Surprisingly, I had to take a look at the Airflow documentation... and according to it:

On Linux, the mounted volumes in container use the native Linux filesystem user/group permissions, so you have to make sure the container and host computer have matching file permissions.

mkdir ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env

Once you have matched file permissions:

docker-compose up airflow-init
docker-compose up
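
To sanity-check the result, you can compare the UID the containers run as with the numeric owner of the bind-mounted folders on the host. The service name airflow-scheduler below is taken from the official docker-compose.yaml and is an assumption; adjust it to your setup:

cat .env                                      # should contain AIRFLOW_UID=<your uid> and AIRFLOW_GID=0
docker-compose exec airflow-scheduler id -u   # UID the container processes run as
ls -ln ./logs                                 # numeric owner on the host should match that UID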
Exeunt answered 14/6, 2021 at 13:50 Comment(1)
Thank you, this worked for me. If you don't set AIRFLOW_UID, all files will be created as the root user, which causes permission issues.Ellord
Permissions on a bind-mounted folder can also result in this error.

For example:

docker-compose.yml (pseudo code)

   service_name:
     ...
     volumes:
      - /home/user/airflow_logs:/opt/airflow/logs

Grant permissions on the local folder so that the Airflow container can write logs, create directories if needed, etc.:

 sudo chmod u=rwx,g=rwx,o=rwx /home/user/airflow_logs
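
If you would rather not open the folder up to everyone, a tighter sketch is to hand ownership to the UID the container runs as (50000 is the default for the official apache/airflow image; verify it for your image first, and substitute your actual service name):

 docker-compose exec <service_name> id -u           # confirm the UID inside the container
 sudo chown -R 50000:0 /home/user/airflow_logs      # give that UID (group 0) ownership of the host folder
 sudo chmod -R u+rwX,g+rwX /home/user/airflow_logs  # read/write for owner and group only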
Krigsman answered 11/9, 2020 at 7:32 Comment(1)
I had the same issue and solved it with the command sudo chmod -R 777 /home/user/airflow_logs. It is also worth mentioning that this can be applied to any folder one is trying to bind-mount from the container.Arron
I solved the issue: in my case the problem was that the volume-mounted folders, logs and dags, didn't have write permission. I added it with:

chmod -R 777 dags/
chmod -R 777 logs/

and in the docker-compose file they are mounted as:

    volumes:
      - ./dags:/opt/bitnami/airflow/dags
      - ./logs:/opt/bitnami/airflow/logs
Sacristy answered 19/8, 2021 at 15:39 Comment(0)
I had the same error.

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler'

The reason I got that error is that I didn't create the initial 3 folders (dags, logs, plugins) before running the Airflow Docker container. Docker seems to have created them automatically, but the permissions were wrong.

Steps to fix:

  1. Stop the current containers:
docker-compose down --volumes --remove-orphans
  2. Delete the dags, logs and plugins folders.
  3. Just in case, destroy the images and volumes already created (in Docker Desktop).
  4. Create the folders again from the command line:
mkdir logs dags plugins
  5. Run Airflow in Docker again (a consolidated sketch follows the list):
docker-compose up airflow-init
docker-compose up
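
Put together as a single shell sketch (assuming the official docker-compose setup in the current directory; back up anything you need from dags/ before deleting it):

docker-compose down --volumes --remove-orphans
rm -rf ./dags ./logs ./plugins                    # careful: this deletes your DAGs as well
mkdir -p ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env   # so the containers create files owned by your user
docker-compose up airflow-init
docker-compose up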
Hokanson answered 18/1, 2023 at 17:0 Comment(0)
I also had the same problem using Apache Airflow 1.10.7.

Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 135, in _run_file_processor
    set_context(log, file_path)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 198, in set_context
    handler.set_context(value)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
    local_loc = self._init_file(filename)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 148, in _init_file
    os.makedirs(directory)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  [Previous line repeated 5 more times]
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 221, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: '/media/radifar/radifar-dsl/Workflow/Airflow/airflow-home/logs/scheduler/2020-01-04/../../../../../../../home'

After checking how file_processor_handler.py works, I found that the error was caused by the example DAGs living in a different directory tree than our dag folder setting: the handler builds each log path relative to the DAG folder, so for the example DAGs it walks up through a chain of ../ components and tries to create directories outside the logs folder. In my case, 7 folders above the 2020-01-04 folder is /media/radifar; in your case, 4 folders above the 2019-12-18 folder is /usr/local. That's why the PermissionError was raised.

I was able to solve this problem by cleaning the AIRFLOW_HOME folder, running airflow version, and setting load_examples to False in airflow.cfg. Then I ran airflow initdb. After that I could use Airflow without errors.
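
A sketch of the same sequence as shell commands, assuming Airflow 1.10.x and that AIRFLOW_HOME points at your Airflow home directory (the environment variable has the same effect as editing load_examples in airflow.cfg):

rm -rf "${AIRFLOW_HOME:-$HOME/airflow}"        # clean AIRFLOW_HOME (careful: removes airflow.cfg and the SQLite metadata DB)
airflow version                                # recreates AIRFLOW_HOME with a fresh airflow.cfg
export AIRFLOW__CORE__LOAD_EXAMPLES=False      # same effect as load_examples = False in airflow.cfg
airflow initdb                                 # on Airflow 2.x this is `airflow db init`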

Grandioso answered 4/1, 2020 at 15:21 Comment(5)
For me load_examples = False was enough to fix the problem.Woozy
Turning off examples also resolved the error for me.Rooney
I have set load_examples = False but I am still getting the Permission denied error. airflow initdb and the Airflow UI worked perfectly until yesterday, but today I am getting this error. Can someone please help.Jarrodjarrow
@alex yes solved it. For me it was related to permission issues. I changed the permissions to 777 and that helped me. What is the error you are getting now?Jarrodjarrow
Ok, interesting - so you changed permissions for the scheduler logs folder? I posted my issue here: #63510835 - this is Airflow running on Kubernetes.Pacifa
I was having the same problem running an Airflow image on Docker hosted on Windows.

My solution was to override the CMD in the scheduler's Dockerfile with a CMD that sets the file permissions before launching the default CMD.

The default CMD can be obtained with docker inspect -f '{{.Config.Cmd}}' <schedulerImageId>.

Example: I used the Bitnami image (docker.io/bitnami/airflow-scheduler:2.1.4-debian-10-r16). Inspecting the image, I saw that the default CMD was

/opt/bitnami/scripts/airflow-scheduler/run.sh

So I created a run.sh script with the following content:

#!/bin/bash

# Make the logs directory writable, then hand off to the image's original entrypoint.
chmod -R 777 /opt/bitnami/airflow/logs/
. /opt/bitnami/scripts/airflow-scheduler/run.sh

Then I added the following lines at the end of my Dockerfile:

COPY run.sh /
RUN  chmod +x /run.sh

CMD /run.sh
Barren answered 11/4, 2022 at 12:37 Comment(0)
A little late to the party, but you could also run the container as a user that belongs to the group which owns the directory.

When your docker-compose stack is up, you can run docker-compose exec SERVICE_NAME bash, check which group the directory in question belongs to, and then add that group to your user in docker-compose.yml:

   service_name:
     ...
     user: USER_NAME:USER_GROUP
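
A sketch of how to find the IDs to put there (the service name and container path are assumptions; adjust them to your compose file):

docker-compose exec SERVICE_NAME bash -c 'ls -ln /opt/airflow/logs && id'
# then set, e.g., user: "1000:0" in docker-compose.yml to match the numeric IDs you saw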
Cerracchio answered 7/7, 2022 at 8:35 Comment(0)
Another approach would be to copy the files into the image whilst also changing the ownership.

COPY --chown=airflow . .
Proximity answered 2/2, 2023 at 12:19 Comment(0)
