I am running a GitLab CI job that fetches a file locally from a remote server, more or less as follows:
```yaml
retrieve_docs:
  stage: fetch_docs
  image: debian:jessie
  script:
    - ssh $USERNAME@$SERVER /perform/some/work
    - INSTFILE=$(ssh $USERNAME@$SERVER bash -c 'find /root -iname "somepattern*" | tail -n 1 | xargs readlink -f')
    - echo "Will retrieve locally $INSTFILE"
    - scp $USERNAME@$SERVER:$INSTFILE .
    - BASEFILE=$(basename $INSTFILE)
    - mv $BASEFILE downloads/
  artifacts:
    name: $BASEFILE
    paths:
      - downloads/
```
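For what it's worth, the basename/mv half of the script behaves as expected when I run it in plain bash, so the problem does not seem to be the shell logic itself (here with a placeholder path standing in for the value fetched over ssh):

```shell
# Local reproduction of the last two script steps; the path is a
# hypothetical stand-in for the file found on the remote server.
INSTFILE=/root/somepattern-1.2.3.zip
BASEFILE=$(basename "$INSTFILE")
echo "$BASEFILE"            # prints: somepattern-1.2.3.zip
mkdir -p downloads
touch "$BASEFILE"
mv "$BASEFILE" downloads/
ls downloads/               # prints: somepattern-1.2.3.zip
```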
The above job definition, however, does not seem to work: the BASEFILE variable is rendered as empty when used as the artifact name.
Is there a way to use a dynamic artifact name?
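For reference, I believe artifact names built from predefined CI variables are documented to work, along these lines, but I specifically need the name to come from a variable set inside script::

```yaml
artifacts:
  # $CI_JOB_NAME is a predefined GitLab CI variable; this works,
  # but it is not the file name computed during the job.
  name: "$CI_JOB_NAME"
  paths:
    - downloads/
```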
Is there a reason why this artifact is also never copied into my (empty, tracked) downloads folder?
The above process actually fetches a .zip file locally. Is there a way (although I have set an expiration of 1 week) to have each job delete old artifacts and keep only the latest artifact / zip file?
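One direction I have been considering for the cleanup is a separate job that calls the artifacts API; this is only a sketch, and OLD_JOB_ID and PRIVATE_TOKEN are hypothetical variables I would have to provide myself:

```yaml
cleanup_artifacts:
  stage: cleanup
  image: debian:jessie
  script:
    # DELETE /projects/:id/jobs/:job_id/artifacts removes a single
    # job's artifacts; OLD_JOB_ID and PRIVATE_TOKEN are placeholders.
    - 'curl --request DELETE --header "PRIVATE-TOKEN: $PRIVATE_TOKEN" "$CI_API_V4_URL/projects/$CI_PROJECT_ID/jobs/$OLD_JOB_ID/artifacts"'
```

I am not sure whether this is the intended way to keep only the latest zip, though.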