A locally built Docker image within a Bitbucket Pipeline

What I need is a way to build a Dockerfile within the repository into an image and then use that image for the next step(s).

I've tried the Bitbucket Pipelines configuration below, but the "Build" step doesn't seem to have the image (built in the previous step) in its cache.

pipelines:
  branches:
    main:
      - step:
          name: Docker Image(s)
          script:
            - docker build -t foo/bar .docker/composer 
          services:
            - docker
          caches:
            - docker
      - step:
          name: Build
          image: foo/bar
          script:
            - echo "Hello, World"
            - composer --version
          services:
            - docker
          caches:
            - docker

I've tried the answer to the Stack Overflow question below, but the context there is pushing the image in a following step, not using the image that was built as the image for a step itself.

Bitbucket pipeline use locally built image from previous step

Overbid answered 3/6, 2021 at 19:4 Comment(0)

There are a few conceptual mistakes in your current pipeline. Let me first run through those before giving you some possible solutions.

Clarifications

Caching

Bitbucket Pipelines uses the caches keyword to persist data across multiple pipelines. While a cache will also persist across steps, the primary use case is reusing data on separate builds. A cache takes 7 days to expire and will not be updated with new data during those 7 days, though you can manually delete it from the main Pipelines page. If you want to carry data across steps in the same pipeline, you should use the artifacts keyword.
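
As a minimal sketch of the artifacts keyword (the step names and file name here are just placeholders), a file listed under artifacts in one step is available to later steps of the same pipeline:

pipelines:
  default:
    - step:
        name: Produce
        script:
          - echo "hello" > build-output.txt
        artifacts:
          - build-output.txt
    - step:
        name: Consume
        script:
          # Available here only because it was declared as an artifact above
          - cat build-output.txt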

Docker service

You only need the docker service when you want a Docker daemon available to your build, most commonly when your script runs a docker command. Your second step runs no docker commands, so it doesn't need the docker service.
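
For illustration, here is a minimal sketch (the step names are made up) of where the docker service is and isn't needed:

pipelines:
  default:
    - step:
        name: Runs docker commands
        script:
          # Needs the Docker daemon, so the docker service is declared
          - docker version
        services:
          - docker
    - step:
        name: No docker commands
        script:
          - echo "No docker CLI usage here, so no docker service is declared"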

Solution 1 - Combine the steps

Combine the steps, and run composer within the created image by using the docker run command.

pipelines:
  branches:
    main:
      - step:
          name: Docker image and build
          script:
            - docker build -t foo/bar .docker/composer 
            # Replace <destination> with the working directory of the foo/bar image.
            - docker run -v $BITBUCKET_CLONE_DIR:<destination> foo/bar composer --version
          services:
            - docker
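
If you're not sure what to use for <destination>, one way to check (assuming the foo/bar Dockerfile sets a WORKDIR) is to inspect the built image:

# Prints the image's configured working directory; empty output means
# the Dockerfile never sets WORKDIR.
docker image inspect foo/bar --format '{{.Config.WorkingDir}}'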

Solution 2 - Using two steps with DockerHub

This example keeps the two-step approach. In this scenario, you push your foo/bar image to a public repository on Docker Hub, and Pipelines then pulls it for use in the subsequent step.

pipelines:
  branches:
    main:
      - step:
          name: Docker Image(s)
          script:
            - docker build -t foo/bar .docker/composer 
            - docker login -u $DOCKERHUB_USER -p $DOCKERHUB_PASSWORD
            - docker push foo/bar
          services:
            - docker
      - step:
          name: Build
          image: foo/bar
          script:
            - echo "Hello, World. I'm running insider of the previously pushed foo/bar container"
            - composer --version

If you'd like to use a private repository instead, you can replace the second step with:

...
      - step:
          name: Build
          image: 
            name: foo/bar
            username: $DOCKERHUB_USERNAME
            password: $DOCKERHUB_PASSWORD
            email: $DOCKERHUB_EMAIL
          script:
            - echo "Hello, World. I'm running insider of the previously pushed foo/bar container"
            - composer --version
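
As a side note on the docker login line used above, and not something the answer requires: docker login also accepts the password on stdin via --password-stdin, which keeps it out of the process list:

# Same credentials as above, just piped instead of passed with -p
echo $DOCKERHUB_PASSWORD | docker login -u $DOCKERHUB_USER --password-stdin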
Delhi answered 6/6, 2021 at 6:12 Comment(1)
Thank you for the detailed feedback! Will look into it further with it in mind. Still figuring out what way would be best for us. Thanks again! – Overbid

To expand on phod's answer: if you really want two steps, you can transfer the image from one step to the next as an artifact.

pipelines:
  branches:
    main:
      - step:
          name: Docker Image(s)
          script:
            - docker build -t foo/bar .docker/composer
            - docker image save foo/bar -o foobar.tar
          services:
            - docker
          caches:
            - docker
          artifacts:
            - foobar.tar
      - step:
          name: Build
          script:
            - docker image load -i foobar.tar
            - docker run -v $BITBUCKET_CLONE_DIR:<destination> foo/bar composer --version
          services:
            - docker

Note that this will upload all of the image's layers and dependencies as an artifact. It can take quite a while to execute and may therefore not be the best solution.
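
If the artifact size or transfer time becomes a problem, one option (not part of the original answer) is to compress the saved image, since docker image load can read a gzip-compressed archive directly; the file name below is just an example:

# First step: compress while saving, and list foobar.tar.gz under artifacts instead
docker image save foo/bar | gzip > foobar.tar.gz
# Second step: docker image load accepts the gzipped archive as-is
docker image load -i foobar.tar.gz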

Consumptive answered 9/6, 2022 at 4:35 Comment(0)