passing variable to bash script in a jenkins pipeline job
I have a Jenkins pipeline job in which I configure my environment with a bash script named setup.sh which looks like:

#!/bin/bash
export ARCH=$1
echo "architecture = " ${ARCH}

In the Jenkins pipeline script, I call the setup.sh script with:

def lib_arch='linux-ubuntu-14.04-x86_64-gcc4.8.4'
sh ". /opt/setup.sh ${lib_arch}"

Unfortunately it seems that NO variable is passed to the setup.sh script, and echo ${ARCH} returns an empty string! I instead tried sh "source /opt/setup.sh ${lib_arch}", but this fails as well with a "source: not found" message. I also tried changing the first line of my script to

#!/bin/sh

but it does not help. So how can I pass a parameter to my bash script from a Jenkins pipeline script? Thanks for your help.

Update: a workaround was suggested by Bert Jan Schrijve in this thread (see below):

sh "bash -c \" source /opt/setup.sh ${lib_arch}\"" 
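The workaround can be reproduced outside Jenkins. A minimal sketch (the file /tmp/setup_demo.sh is an illustrative stand-in for the OP's /opt/setup.sh, not the actual script):

```shell
# Stand-in for the setup script: exports ARCH from its first argument.
cat > /tmp/setup_demo.sh <<'EOF'
export ARCH=$1
echo "architecture = $ARCH"
EOF

# Many /bin/sh implementations (e.g. dash) ignore arguments given to the
# `.` builtin, while bash passes them through as $1, $2, ... so forcing
# bash lets the sourced script see its argument:
result=$(bash -c '. /tmp/setup_demo.sh linux-ubuntu-14.04-x86_64-gcc4.8.4 && echo "$ARCH"')
echo "$result"
```

Under bash the last line of the output is the architecture string, confirming both that $1 reached the script and that the export survived the sourcing.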
Vacuole answered 29/10, 2016 at 20:43 Comment(0)
21

If you are using a multiline shell script with triple single quotes, you have to use this syntax:

sh '''
 echo '''+varToPrint+'''
 other commands...
'''

(from https://medium.com/devopslinks/a-hacky-hackers-guide-to-hacking-together-jenkins-scripted-pipelines-part-3-aca73bd33eaa)

Parturition answered 1/4, 2019 at 11:26 Comment(2)
Thank you! You best of the best! At least, I've found the solution, best wishes, thank you! (Travelled)
Not very nice, but the only solution that works for me +1 (Sabba)
9

The example below works:

void updateApplicationVersionMaven(String version) {
    sh "mvn -B versions:set -DnewVersion=$version"
}

And a complete pipeline script (tested on Jenkins 2.7.3):

node {
    stage('test') {
        def testVar='foo'
        sh "echo $testVar"    
    }
}

EDIT (after comments): Ah, tested some more and could reproduce the issue. It's because you're sourcing the script with ". /opt/setup.sh". This influences the shell environment, and in this case breaks the Jenkins variable injection. Interesting.

EDIT2 (after comments): I believe this is an issue with the default shell that's being used (either by Jenkins or by the OS). I could reproduce the issue from the comments and was able to work around it by explicitly using bash as a shell:

def testVar='foo3'
sh "bash -c \". /var/jenkins_home/test.sh $testVar && echo \$ARCH\""

The last echo now echoes the contents of testVar, which was passed as an argument to the script and subsequently set by the script as an environment variable.
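The distinction behind this workaround can be sketched in plain shell: exports made by a normally executed script die with the child process, while a sourced script modifies the calling shell, which is why the source and the echo must happen in the same invocation (the file /tmp/setup_env.sh below is a hypothetical stand-in for /opt/setup.sh):

```shell
# Hypothetical stand-in for the setup script:
cat > /tmp/setup_env.sh <<'EOF'
#!/bin/sh
export ARCH=$1
EOF
chmod +x /tmp/setup_env.sh

# Plain execution: ARCH is exported only inside the child process,
# so it is not visible in this parent shell afterwards.
/tmp/setup_env.sh foo
echo "after plain run: ARCH='${ARCH:-}'"

# Sourcing inside one bash invocation: the export survives to the echo.
sourced=$(bash -c '. /tmp/setup_env.sh foo && echo "ARCH=$ARCH"')
echo "$sourced"
```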

Manassas answered 29/10, 2016 at 22:2 Comment(8)
I tried your suggestion and it fails. I even tried to simply do sh ". /opt/setup.sh bla". Clearly the parameter is not passed to the shell script, since $# returns 0, whereas in the script console I get: [testsh] Running shell script + . /opt/setup.sh linux-ubuntu-14.04-x86_64-gcc4.8.4 (Vacuole)
That's odd. I've added a small example and tested it on Jenkins 2.7.3. What version are you on? (Manassas)
I am using this war version: 2.19.1. I believe there is something odd here. In your pipeline script, you are not doing what I am trying to achieve. In my case, I am trying to give a variable value defined with def (like you) to a bash script. As I wrote, if I do sh ". /opt/setup.sh bla" and in my script I try to echo $1, I can clearly see that the parameter "bla" is not given to the script. (Vacuole)
Yes, I've tested the same (with a script). If you call the script with /opt/setup.sh (without the source dot), it should work. (Manassas)
Sure, but then the variable defined in setup.sh cannot be accessed outside it. I cannot do sh "/opt/setup.sh bla && echo \$ARCH". (Vacuole)
Ah, you're trying to set environment variables with the script. Got it, found an issue and a workaround. Answer updated. (Manassas)
Thanks Bert. I applied your approach in the end with the bash -c option. (Vacuole)
Alright! Please accept the answer if it solved your problem. (Manassas)
8

Had the same problem and the posted solutions did not work for me. Using environment variables did the trick:

env.someVar='someVal'
sh "echo  ${env.someVar}"
Neb answered 24/7, 2018 at 13:3 Comment(1)
I don't know why this received a down vote. It seems to be a way to address the OP's question. Guess I'll upvote it ;-) (Update)
4

https://mcmap.net/q/508161/-passing-variable-to-bash-script-in-a-jenkins-pipeline-job THANK YOU @Tony. This worked out for me! I tried everything!

For people who need the second pipe stage's exit code and are forced to use bash in Jenkins because of the "Bad Substitution" error when using sh:

def verify(name, config) {
    script {
        sh '''#!/bin/bash
        docker-compose run --rm -e HOST='''+name+''' test rake -f test/Rakefile test_'''+config+''' | tee -a test-output-'''+config+'''.log; test "${PIPESTATUS[0]}" -eq 0
        '''
    }
}

You also have to put the shebang on the same line as the opening '''.

https://devops.stackexchange.com/questions/9942/prevent-pipelinestatus0-from-being-evaluated-in-jenkinsfile
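The ${PIPESTATUS[0]} check can be tried outside Jenkins. PIPESTATUS is a bash array holding the exit code of every stage of the last pipeline; plain sh lacks it, which is what produces the "Bad Substitution" error:

```shell
# 'false' fails (exit 1) but 'tee' succeeds, so a plain $? after the
# pipeline would report 0. Run under bash explicitly, since PIPESTATUS
# is a bash-only feature.
first_status=$(bash -c 'false | tee /dev/null >/dev/null; echo "${PIPESTATUS[0]}"')
echo "exit code of first pipe stage: $first_status"
```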

Nonalignment answered 2/10, 2020 at 15:43 Comment(0)
3

Using returnStdout together with env is another way to pass values back and forth. In the example below, a unique id from uuidgen is used as a common external resource across stages.

node {
    stage('stage 1') {
        env.UNIQUE = sh(returnStdout: true, script: 'uuidgen').trim()
        sh 'echo "started `date`" > /tmp/$UNIQUE'
    }
    stage('stage 2'){
        sh 'echo "done `date`" >> /tmp/$UNIQUE'
        println sh(returnStdout: true, script: 'cat /tmp/$UNIQUE').trim()
    }
}

This will output a date to a unique file, showing when each stage completed; uuidgen produces a different string every time you run it.

+ echo 'done Tue Oct 22 10:12:20 CDT 2019'
[Pipeline] sh
+ cat /tmp/d7bdb6a5-badb-474d-95dd-cf831ea88a2a
[Pipeline] echo
started Tue Oct 22 10:12:20 CDT 2019
done Tue Oct 22 10:12:20 CDT 2019
Update answered 22/10, 2019 at 15:13 Comment(6)
Well, if you think of it this way, env. is just the shell's global var list: the shell's echo $TERM is Groovy's env.TERM. (Update)
But without double quotes, $BRANCH_NAME should not resolve to my-branch. It should literally pass $BRANCH_NAME as an argument. It's not intuitive at all. (Eureka)
Can't see your original comment so I don't remember the exact problem, but if you have 2 steps, 1. env.USR = env.USER and 2. sh 'echo one $USR && echo "two $USR" && echo \'three $USR\'', you should see why single quotes don't matter in the pipeline until they are passed to the shell. The output is: one jenkins two jenkins three $USR (Update)
I ran sh -c 'echo one $USR' locally, and you are right that $USR resolves successfully. I was under the impression that single quotes prevented variables from resolving, but it is clear that when using sh -c ' ... ', the single quotes do not affect variable resolution, hence my confusion. BTW, running sh -c 'echo one $USR && echo "two $USR" && echo \'three $USR\'' acted like the single quotes were unbalanced, and I don't understand why... (Eureka)
Groovy is escaping the \'. What's happening is more like export USR=$USER; bash -xe script.sh, where the content of script.sh is exactly echo one $USR && echo "two $USR" && echo 'three $USR'; the -xe is how you get the + signs in the Jenkins log output. You can play with this by making a /tmp/bash.sh from this: #!/bin/bash ; cp $2 /tmp/back.txt ; echo $@; /bin/bash $1 $2. You can see your cmd in /tmp/back.txt and the passed args in the Jenkins log output. You also need to set the Shell exec in your configure system menu to /tmp/bash.sh so it does not go for the default bash. (Update)
Does not work: groovy.lang.MissingMethodException: No signature of method: Script1.sh() is applicable for argument types: (java.util.LinkedHashMap) values: [[returnStdout:true, script:uuidgen]] Possible solutions: use([Ljava.lang.Object;), is(java.lang.Object), run(), run(), any(), with(groovy.lang.Closure) (Shirtmaker)
1

I've solved it in another way:

  1. Create a file with the desired variables
  2. Run, in the same command, both the source and the command itself

Example:

sh 'echo -n HOST_IP= > host_ip.var'
sh '/sbin/ip route|awk \'/default/ { print $3 }\' >> host_ip.var'
sh 'source host_ip.var && echo your ip: $HOST_IP'

The file ends up with

HOST_IP=172.16.0.1

The output is

your ip: 172.16.0.1

Note: it is very important that the last sh command uses single quotes ('), not double quotes ("); otherwise Groovy tries to interpolate the variable before the shell ever sees it
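The approach can be sketched outside Jenkins (the file name /tmp/host_ip.var is illustrative); the key point is that sourcing and consuming happen inside a single shell invocation:

```shell
# Write the variable file, then source and use it in one shell invocation.
printf 'HOST_IP=172.16.0.1\n' > /tmp/host_ip.var
out=$(sh -c '. /tmp/host_ip.var && echo "your ip: $HOST_IP"')
echo "$out"
```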

Dragonet answered 12/4, 2018 at 8:57 Comment(0)
0

In my case I just needed to save the initial directory (where the workspace gets mounted) inside a Docker agent, so it could be used later as an input to some other build command. Normally I'd prefer to use a bash variable, but after trial and error I ended up doing this:

stage('Build') {
    agent {
        docker { ... }
    }
    steps {
        script {
            MYPWD = sh(script: 'pwd', returnStdout: true).trim() // trim the trailing newline from pwd
        }
        sh "echo $MYPWD"
    }
}
Philpot answered 18/3, 2021 at 10:16 Comment(0)
0
def lib_arch = 'linux-ubuntu-14.04-x86_64-gcc4.8.4'

withEnv(["ARCH=${lib_arch}"]) {
    sh '. /opt/setup.sh'

    // Subsequent steps inside this block can access $ARCH
    sh 'echo "Architecture in pipeline: $ARCH"'
}

Note:

  • Double quotes for variable expansion within strings
  • withEnv for persistent environment variables
  • Use . (dot) for sourcing scripts within sh
  • Adapt file paths and commands to your specific setup.
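Outside Jenkins, withEnv roughly amounts to exporting the variable before the step's shell runs, which is why both sh steps inside the block see $ARCH. A sketch of the idea, not Jenkins' actual implementation:

```shell
# Rough shell analogue of withEnv(["ARCH=${lib_arch}"]) { sh '...' }:
# export the variable, then run the step in a child shell that inherits it.
export ARCH='linux-ubuntu-14.04-x86_64-gcc4.8.4'
line=$(sh -c 'echo "Architecture in pipeline: $ARCH"')
echo "$line"
```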

If there is a pipeline with an environment variable to be passed:

pipeline {
    environment {
        LIB_ARCH = 'linux-ubuntu-14.04-x86_64-gcc4.8.4'
    }

    stages {
        stage('Deploy') {
            steps {
                script {
                    def arch = env.LIB_ARCH
                    sh """. /opt/setup.sh ${arch}"""
                }
            }
        }
    }
}

Note:

  • Use the env. prefix for accessing environment variables
  • Double quotes ("") for variable expansion
  • Ensure LIB_ARCH is correctly defined in the pipeline environment
  • Adapt file paths and commands to your specific setup
Boehmite answered 11/1 at 6:11 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.