I am trying to create a CircleCI workflow that builds a Java WAR and uploads it to my organization's existing S3 artifact bucket. First, the project is built, some deployment information is saved to a file, and VERSION is written to BASH_ENV:
```shell
export "VERSION=$(mvn -q -Dexec.executable=echo -Dexec.args='${project.version}' --non-recursive exec:exec)" >> $BASH_ENV

if [ "$CIRCLE_BRANCH" = "master" ]; then
  ENVIRONMENT=production
elif [ "$CIRCLE_BRANCH" = "develop" ]; then
  ENVIRONMENT=qa
elif [ "$CIRCLE_BRANCH" = "release-1.0" ]; then
  ENVIRONMENT=staging
else
  ENVIRONMENT=$CIRCLE_BRANCH
fi

if [ -z "$ENVIRONMENT" ]; then
  echo "No environment is set"
  exit 1
fi

echo "ENVIRONMENT=$ENVIRONMENT" >> project.info
```
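As a sanity check, VERSION is definitely populated within that build step itself; a minimal local reproduction (with a hypothetical version string standing in for the mvn output, and a temp file standing in for BASH_ENV) shows the export line sets the variable in the current shell:

```shell
#!/usr/bin/env bash
# Hypothetical stand-in values: "1.2.3" replaces the mvn exec:exec output,
# and a temp file replaces CircleCI's real BASH_ENV.
BASH_ENV=$(mktemp)
export "VERSION=1.2.3" >> "$BASH_ENV"   # same form as in my config
echo "$VERSION"                          # prints 1.2.3 in this shell
```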
Then, I attempt to upload the artifact using the official S3 orb:
```yaml
steps:
  - attach_workspace:
      at: .
  - run: source project.info
  - run: export "VERSION=$VERSION" >> $BASH_ENV
  - aws-s3/copy:
      from: target/project-${VERSION}.war
      to: 's3://artifact.bucket/project/project-${VERSION}.war'
      arguments: '--dryrun'
```
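The error message below matches exactly what the step would produce if VERSION were empty (or unset) at interpolation time, which I can illustrate locally:

```shell
# With VERSION unset, the interpolated path collapses to the one in the error.
unset VERSION
echo "target/project-${VERSION}.war"   # prints target/project-.war
```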
The upload step fails with `The user-provided path target/project-.war does not exist`, because an empty string is interpolated where the version should be.
The Circleci documentation states:
> In every step, CircleCI uses bash to source BASH_ENV. This means that BASH_ENV is automatically loaded and run, allowing you to use interpolation and share environment variables across run steps.
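My reading of that is that every fresh non-interactive bash shell sources the file named by BASH_ENV on startup, and a minimal local test (using a temp file in place of CircleCI's BASH_ENV) does behave that way:

```shell
# Simulate a fresh step: a new non-interactive bash sources $BASH_ENV automatically.
tmp=$(mktemp)
echo 'export GREETING=hello' > "$tmp"
BASH_ENV="$tmp" bash -c 'echo "$GREETING"'   # prints hello
```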
Yet in my pipeline this does not appear to hold. How do I make the S3 upload step see the environment variables I am setting?