Bitbucket Pipeline: Container 'Build' exceeded memory limit

I am trying to run a pipeline for my Angular app, but it crashes at the "npm run build" step; the failure reason is "Container 'Build' exceeded memory limit." I have tried adjusting the memory settings in the yml file, for instance adding "size: 2x" and changing the amount of memory assigned to Docker.

bitbucket-pipelines.yml:

image: node:14.17.0

options:
  docker: true
  size: 2x

pipelines:
  custom:
    prod-deployment:
    - step:
        name: Build angular app
        caches:
        - node
        services:
        - docker
        size: 2x # Double resources available for this step.
        script:
        - mv .npmrc_config .npmrc
        - npm install --unsafe-perm
        - npm install -g @angular/[email protected]
        - free -m
        - npm run dashboard:build
        - wget "censored for security"
        artifacts:
        - dist/**

    - step:
        name: Deploy artifacts using SCP to PROD
        deployment: production
        size: 2x # Double resources available for this step.
        script:
        - pipe: atlassian/scp-deploy:1.1.0
          variables:
            USER: $USERNAME
            SERVER: $SERVER
            REMOTE_PATH: 'Censored for Security'
            LOCAL_PATH: 'dist/*'

    dev-deployment:
    - step:
        name: Build angular app
        caches:
        - node
        services:
        - docker
        size: 2x # Double resources available for this step.
        script:
        - mv .npmrc_config .npmrc
        - npm install --unsafe-perm
        - npm install -g @angular/[email protected]
        - free -m
        - npm run build
        - wget 'Censored for Security'
        artifacts:
        - dist/**

    - step:
        name: Deploy artifacts using SCP to PROD
        deployment: production
        size: 2x # Double resources available for this step.
        script:
        - pipe: atlassian/scp-deploy:1.1.0
          variables:
            USER: $USERNAME
            SERVER: $SERVER
            REMOTE_PATH: 'Censored for Security'
            LOCAL_PATH: 'dist/*'
            
definitions:
  services:
    docker:
      memory: 4096

Console error message:

Killed
npm ERR! code ELIFECYCLE
npm ERR! errno 137
npm ERR! build: `node --max_old_space_size=6144 node_modules/@angular/cli/bin/ng build --configuration=calio && npm run `
npm ERR! Exit status 137
npm ERR! 
npm ERR! Failed at the build script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR!     /root/.npm/_logs/2021-10-14T08_40_46_012Z-debug.log
Liverpool answered 14/10, 2021 at 8:55 Comment(1)
This isn't an area I have specific knowledge of, but a similar question has been asked before at the link below; perhaps something from it will help: #57605941 – Dixon
  1. Do not use the docker service unless you really need it. Its memory (the 4096 MB in your definitions) is subtracted from the total available to the step, so your 2x step's 8192 MB leaves only about 4 GB for the build itself, while your build script asks Node for a 6144 MB heap. That mismatch alone can explain the OOM kill (exit status 137); see the trimmed step sketched below.

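For example, if nothing in your build script actually talks to the Docker daemon, you could drop the service (and the docker memory definition) from the build step entirely. A sketch of the trimmed step, keeping your own commands:

    - step:
        name: Build angular app
        caches:
        - node
        size: 2x # with no docker service attached, the full step memory goes to the build
        script:
        - npm install --unsafe-perm
        - npm run build
        artifacts:
        - dist/**
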
  2. You can use even bigger builds (size: 4x or 8x), but only with self-hosted runners: https://support.atlassian.com/bitbucket-cloud/docs/step-options/#Size

    - step:
        size: 4x # or 8x
        runs-on: 
          - 'self.hosted'
          - 'my.custom.label'
        ...

Note your runners will require 16-32 GB of actual memory.

  3. A simple npm run build running out of this much memory is a sign that something is wrong. Run the build locally to find the root cause of the peak memory usage, e.g. see this similar question: https://mcmap.net/q/1023015/-yarn-build-error-command-failed-with-exit-code-137-bitbucket-pipelines-out-of-memory-using-max-memory-8192mb
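
For instance, your failing script pins the Node heap at 6144 MB (visible in the error output above), which is more than the roughly 4 GB the step has left once the docker service reserves its 4096 MB. As a local experiment, you could lower the cap to something that fits (3072 here is just a guess) and watch whether the build still completes:

    # hypothetical heap cap: pick a value below the memory actually left for the step
    node --max_old_space_size=3072 node_modules/@angular/cli/bin/ng build --configuration=calio
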
Incoherence answered 28/3, 2023 at 16:6 Comment(0)
