Couldn't use data file .coverage: unable to open database file

A strange issue with permissions occurred when pushing to GitHub. I have a test job that runs tests with coverage and then pushes the results to Codecov on every push and pull request. However, this scenario only works as the root user.

When running as the digitalshop user, it throws this error:

Couldn't use data file '/digital-shop-app/.coverage': unable to open database file

My question is: how do I run coverage in a Docker container so that it doesn't throw this error? My guess is that it's a permissions issue.
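
One way to confirm the permissions theory locally (a rough sketch, assuming the test service from the compose file below) is to compare the container user with the owner of the mounted project directory:

# Open a one-off shell in the test service and inspect ownership.
docker-compose run --rm test sh -c "id && ls -la /digital-shop-app"

If the UID printed by id doesn't match the owner shown by ls, coverage can't create /digital-shop-app/.coverage there.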

docker-compose.yml:

version: '3.9'

services:
  test:
    build: .
    command: >
      sh -c "
        python manage.py wait_for_db &&
        coverage run --source='.' manage.py test mainapp.tests &&
        coverage report &&
        coverage xml
      "
    volumes: 
      - ./digital-shop-app:/digital-shop-app
    env_file: .env
    depends_on: 
      - db

  db:
    image: postgres:13-alpine
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}

Dockerfile:

FROM python:3.9-alpine3.13

ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./digital-shop-app /digital-shop-app
COPY ./scripts /scripts

WORKDIR /digital-shop-app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --no-cache bash && \
    apk add --update --no-cache postgresql-client && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base jpeg-dev postgresql-dev musl-dev linux-headers \
        zlib-dev libffi-dev openssl-dev python3-dev cargo && \
    apk add --update --no-cache libjpeg && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps && \
    adduser --disabled-password --no-create-home digitalshop && \
    chown -R digitalshop:digitalshop /py/lib/python3.9/site-packages && \
    chmod -R +x /scripts

ENV PATH="/scripts:/py/bin:/py/lib:$PATH"

USER digitalshop

CMD ["run.sh"]
Illumine answered 25/3, 2022 at 8:11 Comment(2)
Small round of tips: make your issue as small and focused as possible. Can you reproduce this without the complexity of a GitHub Action, in a simple run of docker compose on your machine? Then, as a next step: what is the smallest Python code that reproduces it? Then you will have a minimal reproducible example that, hopefully, someone can help you with. – Lebar
@β.εηοιτ.βε Thanks for the useful tips. You gave me food for thought. As you mentioned, I tried it without GitHub Actions on a local machine and found the problem: coverage didn't create the .coverage file because my user didn't own the directory. I've also updated my answer. – Illumine

So I ended up creating another Dockerfile, called Dockerfile.test, with pretty much the same configuration except for the non-admin user creation. Here's the final variant:

Running code as the root user is not recommended, so please read the UPDATE section below.

Dockerfile.test:

FROM python:3.9-alpine3.13

ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./digital-shop-app /digital-shop-app

WORKDIR /digital-shop-app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --no-cache bash curl gnupg coreutils && \
    apk add --update --no-cache postgresql-client libjpeg && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base jpeg-dev postgresql-dev musl-dev linux-headers \
        zlib-dev libffi-dev openssl-dev python3-dev cargo && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps

ENV PATH="/py/bin:/py/lib:$PATH"

docker-compose.yml:

version: '3.9'

services:
  test:
    build:
      context: .
      dockerfile: Dockerfile.test
    command: >
      sh -c "
        python manage.py wait_for_db &&
        coverage run --source='.' manage.py test mainapp.tests &&
        coverage report &&
        coverage xml
      "
    volumes: 
      - ./digital-shop-app:/digital-shop-app
    env_file: .env
    depends_on: 
      - db

I don't know whether this is good practice. If not, please tell me how to do it correctly.

UPDATE:

Thanks to @β.εηοιτ.βε for giving me food for thought.

After some local debugging I found out that coverage needs the user to own the directory where the .coverage file is located. So I created a subdirectory named cov inside the project folder and made the digitalshop user its owner, including everything inside it. Finally, I specified the path to the .coverage file by setting the environment variable COVERAGE_FILE=/digital-shop-app/cov/.coverage, where /digital-shop-app is the project root folder, and specified the same path for the coverage.xml report in docker-compose.yml. Here's the code:
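
For reference, a minimal sketch of that variable (it can go in the .env file referenced by env_file, or under an environment: key of the test service):

# Point coverage at the subdirectory owned by digitalshop
COVERAGE_FILE=/digital-shop-app/cov/.coverage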

docker-compose.yml (added the -o flag to the coverage xml command):

version: '3.9'

services:
  test:
    build:
      context: .
    command: >
      sh -c "
        python manage.py wait_for_db &&
        coverage run --source='.' manage.py test mainapp.tests &&
        coverage xml -o /digital-shop-app/cov/coverage.xml
      "
    env_file: .env
    depends_on: 
      - db

  db:
    image: postgres:13-alpine
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}

Dockerfile:

FROM python:3.9-alpine3.13

ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt
COPY ./digital-shop-app /digital-shop-app
COPY ./scripts /scripts

WORKDIR /digital-shop-app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    apk add --no-cache bash && \
    apk add --update --no-cache postgresql-client && \
    apk add --update --no-cache --virtual .tmp-deps \
        build-base jpeg-dev postgresql-dev musl-dev linux-headers \
        zlib-dev libffi-dev openssl-dev python3-dev cargo && \
    apk add --update --no-cache libjpeg && \
    /py/bin/pip install -r /requirements.txt && \
    apk del .tmp-deps && \
    adduser --disabled-password --no-create-home digitalshop && \
    chown -R digitalshop:digitalshop /py/lib/python3.9/site-packages && \
    chmod -R +x /scripts && \
    # New: create the coverage output directory and make digitalshop its owner
    mkdir -p /digital-shop-app/cov && \
    chown -R digitalshop:digitalshop /digital-shop-app/cov

ENV PATH="/scripts:/py/bin:/py/lib:$PATH"

USER digitalshop

CMD ["run.sh"]
Illumine answered 25/3, 2022 at 10:52 Comment(1)
No, you shouldn't run a container as root, indeed. But you are lacking a minimal reproducible example that would let us reproduce the issue and help you. – Lebar

Problem

I ran into the same error when running pytest with coverage using docker-compose in the GitHub-hosted ubuntu-latest image in GitHub Actions. This is an instance of the Docker host filesystem owner matching problem.

In short, the user on the host (the GitHub Actions runner) and the user in the container (where my pytest suite runs) have different UIDs. The mounted directory app is owned by the user on the host. When the user in the container attempts to write app/.coverage, permission is denied (since that user is not the owner).
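
A quick way to see the mismatch (a rough sketch, assuming the app service and paths from the files listed below):

# On the GitHub Actions runner (the host); the "runner" user has UID 1001.
id -u
# Inside the container: prints the UID of the image's default user.
docker-compose -f test_utils/docker-compose.yml run app id -u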

The fix

In my case, I solved the issue by matching the UID of my Docker image's default user with that of the GitHub Actions runner user, 1001. I added this to my Dockerfile to accomplish that:

# Make the default user have the same UID as the GitHub Actions "runner" user (1001).
# This is to avoid permission issues when mounting volumes.
USER root
RUN usermod --uid 1001 <image_default_user>
USER <image_default_user>
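
After rebuilding, the container's default user should report UID 1001 (a sketch under the same assumptions as above):

docker-compose -f test_utils/docker-compose.yml build
docker-compose -f test_utils/docker-compose.yml run app id -u   # should now print 1001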

All relevant files

app/test_utils/docker-compose.yml:

version: "3.9"

services:
  app:
    build:
      context: ../
      dockerfile: ./test_utils/Dockerfile
    container_name: app
    volumes:
      - ..:/app

app/test_utils/Dockerfile:

FROM <my base image>

# Make the default user have the same UID as the GitHub Actions "runner" user (1001).
# This is to avoid permission issues when mounting volumes.
USER root
RUN usermod --uid 1001 <image_default_user>
USER <image_default_user>

COPY . /app
WORKDIR /app

RUN pip3 install -r requirements.txt -r requirements_test.txt

app/.github/workflows/unittests.yml:

name: Run unit tests, Report coverage

on:
  pull_request:
    paths:
      - app/*
      - .github/workflows/unittests.yml

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checks out the repo
        uses: actions/checkout@v2

      - name: Build docker image
        run: docker-compose -f test_utils/docker-compose.yml build

      - name: Run unit tests & produce coverage report
        # Adapted from the docker example in
        # https://github.com/MishaKav/pytest-coverage-comment?tab=readme-ov-file#example-usage
        run: |
          docker-compose \
            -f test_utils/docker-compose.yml \
            run app \
            pytest \
              --cov-report=term-missing:skip-covered \
              --junitxml=/app/pytest.xml \
              --cov=/app \
              /app \
            | tee pytest-coverage.txt

      - name: Pytest coverage comment
        uses: MishaKav/pytest-coverage-comment@main
        with:
          pytest-coverage-path: pytest-coverage.txt
          junitxml-path: pytest.xml

Blond answered 1/3 at 18:24 Comment(0)