Bitbucket Pipelines - share pip libraries across multiple steps
I am trying to store the pip libraries installed in the initial step as artifacts, so that they can be reused in the parallel steps later on. Unfortunately, when "Test part 1" and "Test part 2" are executed, those libraries do not exist.

Moreover, the artifacts are not even visible in the Artifacts tab of the Bitbucket UI.

image: python:3.8

options:
  max-time: 20

definitions:
  steps:
    - step: &fetch-and-build
        name: Update image
        caches:
          - pip
          - docker
        services:
          - docker
        script:
          - pip install -r requirements.txt -U
          - pip list -v
        artifacts:
          - /usr/local/lib/python3.8/**
  services:
    postgres:
      image: postgres
      memory: 512
      variables:
        POSTGRES_HOST_AUTH_METHOD: 'trust'
    redis:
      image: redis
      memory: 256
    docker:
      memory: 2048

pipelines:
  default:
    - step: *fetch-and-build
    - parallel:
      - step:
          name: "Test part 1"
          caches:
            - pip
            - docker
          script:
            - pip list -v
            - export COVERAGE_PROCESS_START=./.coveragerc
            - coverage run --parallel-mode --concurrency=multiprocessing --rcfile=./.coveragerc manage.py test -v 3 --parallel=5 payments
          services:
            - redis
            - postgres
            - docker
          artifacts:
            - htmlcov/**
            - htmlcov/index.html
            - coverage/.coverage
            - /usr/local/lib/python3.8/**
      - step:
          name: "Test part 2"
          caches:
            - pip
            - docker
          script:
            - export COVERAGE_PROCESS_START=./.coveragerc
            - coverage run --parallel-mode --concurrency=multiprocessing --rcfile=./.coveragerc manage.py test -v 3 --parallel=5 feed jobs
          services:
            - redis
            - postgres
            - docker
          artifacts:
            - htmlcov/**
            - htmlcov/index.html
            - coverage/.coverage
            - /usr/local/lib/python3.8/**
Feck answered 11/1, 2021 at 8:33
Did you get the answer? - Hirsh

For the "artifacts are not visible even in the Artifacts tab in Bitbucket window"

Currently Artifacts Path in bitbucket pipeline is limited to the files in the build directory. To work around this, first copy the necessary files into the build dir, then in artifacts, mention the the path

For Example:

  - step:
      name: "Test part 1"
      caches:
        - pip
        - docker
      script:
        - pip list -v
        - mkdir -p artifact/python # one directory for all artifacts; create subfolders as needed
        - cp -r /usr/local/lib/python3.8/. ./artifact/python # copy files that live outside the build directory into it
        - export COVERAGE_PROCESS_START=./.coveragerc
        - coverage run --parallel-mode --concurrency=multiprocessing --rcfile=./.coveragerc manage.py test -v 3 --parallel=5 payments
      services:
        - redis
        - postgres
        - docker
      artifacts:
        - artifact/** # reference the artifacts dir with a relative glob

In the next step you can copy the files from artifact back to their destinations.
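
A minimal sketch of what that next step could look like, assuming the artifact layout from the step above (the copy-back destination is the interpreter path from the question; the step name is illustrative):

  - step:
      name: "Test part 2"
      script:
        # restore the copied libraries from the artifact to where the
        # interpreter expects them (path taken from the question)
        - cp -r ./artifact/python/. /usr/local/lib/python3.8/
        - pip list -v # verify the restored packages are visible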

Vereeniging answered 5/9, 2022 at 1:53

Artifact paths must be relative to the build directory and must NOT use . or .. path segments. This is a recurring cause of confusion; see "Artifact not being published in bitbucket pipeline".
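
For instance (illustrative paths, not from the linked question):

  artifacts:
    - ./artifact/**                 # NOT published: uses a . segment
    - /usr/local/lib/python3.8/**   # NOT published: absolute path outside the build directory
    - artifact/**                   # published: relative glob inside the build directory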

Other than that, copying a whole Python base directory feels like overkill unless you are using virtualenvs. I'd settle for copying just a slice of the site directory, by pointing pip's install target inside the build directory:

image: python

pipelines:
  default:
    - step:
        script:
          - export PYTHONPATH=$BITBUCKET_CLONE_DIR/mysite
          - export PIP_TARGET=$BITBUCKET_CLONE_DIR/mysite
          - pip install -r requirements.txt
        artifacts:
          - mysite/**
    - step:
        script:
          - export PYTHONPATH=$BITBUCKET_CLONE_DIR/mysite
          - export PIP_TARGET=$BITBUCKET_CLONE_DIR/mysite
          - do-your-thing

But this is to strictly answer your literal question.

The correct answer is: DON'T. You are better off using caches:

image: python

definitions:
  caches:
    pip: ~/.cache/pip
    venv: .venv # or ~/.local/share/virtualenvs/ or ~/.virtualenvs/

pipelines:
  default:
    - step:
        caches: [pip, venv]
        script:
          - python -m venv .venv
          - source .venv/bin/activate
          - pip install -r requirements.txt
          - do-your-thing

And just benefit from the install command being a fast no-op from then on. But don't skip the installation in any step: steps should work regardless of the availability of the caches.
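
Applied to the parallel layout from the question, the cache approach could look like this (a sketch under the same assumptions; the test commands are shortened, and the venv cache is the one defined above):

image: python:3.8

definitions:
  caches:
    venv: .venv

pipelines:
  default:
    - step:
        name: Update image
        caches: [pip, venv]
        script:
          - python -m venv .venv
          - source .venv/bin/activate
          - pip install -r requirements.txt
    - parallel:
        - step:
            name: "Test part 1"
            caches: [pip, venv]
            script:
              # recreate and activate the venv; with a warm cache the
              # install is a fast no-op, but never skip it entirely
              - python -m venv .venv
              - source .venv/bin/activate
              - pip install -r requirements.txt
              - coverage run --rcfile=./.coveragerc manage.py test payments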

Blasphemy answered 30/6, 2023 at 12:11
