Is changing the bash script sent to sbatch in Slurm while jobs are queued a bad idea?
I wanted to run a python script main.py multiple times with different arguments through a sbatch_run.sh script as in:

#!/bin/bash
#SBATCH --job-name=sbatch_run
#SBATCH --array=1-1000
#SBATCH --exclude=node047

arg1=10 # arg to be changed between runs
arg2=12 # arg to be changed between runs
python main.py $arg1 $arg2
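(A sketch, not from the original post: one way to avoid editing the script at all is to take the values from the sbatch command line. Anything written after the script name is passed to the script as positional parameters, so each submission carries its own arguments.)

```shell
#!/bin/bash
#SBATCH --job-name=sbatch_run
#SBATCH --array=1-1000
#SBATCH --exclude=node047

# $1 and $2 come from the sbatch command line, e.g.:
#   sbatch sbatch_run.sh 10 12
arg1="$1"
arg2="$2"
python main.py "$arg1" "$arg2"
```

Submitted as `sbatch sbatch_run.sh 10 12` and then `sbatch sbatch_run.sh 69 666`, each job array keeps the values it was submitted with.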

The arguments are hard-coded in the bash file run by sbatch. I was worried that if I ran sbatch_run.sh several times in a row, changing the values of arg1 and arg2 before each submission, it might mix up my runs. For example, if I do:

sbatch sbatch_run.sh # with arg1=10 and arg2=12

and then immediately after I change sbatch_run.sh but run the file again as in:

sbatch sbatch_run.sh # with arg1=69 and arg2=666

would that cause all my runs to use the last set of values (i.e. arg1=69 and arg2=666) instead of each run keeping its own arguments?

I know for sure that if I hard-code the arguments in main.py, submit the same sbatch script, and then change main.py, every job runs the last version. I was wondering whether the same happens if I change the sbatch_run.sh script.


Just so you know, I did try this experiment: I submitted 1000 jobs with a sleep command so that some would sit in the queue, and then changed sbatch_run.sh while they waited. It did not seem to change what my jobs ran. However, if I am wrong, this is way too important to get right by accident, so I wanted to make sure I asked too.

For the record I ran:

#!/bin/bash
#SBATCH --job-name=ECHO
#SBATCH --array=1-1000
#SBATCH --exclude=node047

sleep 15
echo helloworld
echo 5

and then changed the echo to echo 10 or echo byebyeworld while jobs were still queued.

Constriction answered 4/8, 2016 at 23:20 Comment(2)
This means, from the answer I got, that if you are running a script through sbatch and want to change its arguments between submissions (as in the main.py example), make sure the arguments live somewhere that doesn't change after submission. For example, pass them directly in the bash script run by Slurm, or use one config file per run; just make sure the correct config file is used and that you don't change it accidentally!Constriction
Upvoted for the choice of numbers in the second example, and because this Q/A was super helpful.Demitria
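(The comments above suggest keeping the arguments somewhere that cannot change after submission. One such place, as a sketch not taken from the post, is the job's environment: sbatch records the environment variables at submission time along with the job, so sbatch's `--export` option can carry per-run values. The names ARG1/ARG2 below are assumptions for illustration.)

```shell
# Submit with per-run values in the environment:
#   sbatch --export=ALL,ARG1=10,ARG2=12 sbatch_run.sh
#
# and inside sbatch_run.sh the job runs:
#   python main.py "$ARG1" "$ARG2"
#
# The variable-passing mechanism itself, demonstrated without Slurm:
ARG1=10 ARG2=12 bash -c 'echo "main.py would get: $ARG1 $ARG2"'
```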

When sbatch is run, Slurm copies the submission script to its internal database; you can convince yourself with the following experiment:

$ cat submit.sh
#!/bin/bash
#SBATCH  --hold
echo helloworld

The --hold is there to make sure the job does not start. Submit it:

$ sbatch submit.sh

Then modify the submission script:

$ sed -i 's/hello/bye/' submit.sh
$ cat submit.sh
#!/bin/bash
#SBATCH  --hold
echo byeworld

and now use scontrol show job to see the script Slurm is planning to run:

$ scontrol show -ddd job YOURJOBID
JobId=******* JobName=submit.sh
[...]
BatchScript=
   #!/bin/bash
   #SBATCH  --hold
   echo helloworld
[...]

It hasn't changed although the original script has.
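Note that the copy covers only the submission script itself. Files it merely calls (main.py in the question, or the hypothetical inner.sh below) are read when the job starts, so edits made while the job is queued do take effect. A sketch of that timeline, runnable without Slurm:

```shell
#!/bin/bash
# The queued "job" would run inner.sh, which we edit before it starts.
echo 'echo helloworld' > inner.sh   # state at submission time
echo 'echo byeworld'   > inner.sh   # edited while the job waits in the queue
bash inner.sh                        # job starts: prints byeworld, not helloworld
```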

[EDIT] Recent versions of Slurm use scontrol write batch_script <job_id> [<optional_filename>] rather than scontrol show -dd job to write the submission script to a file (named <optional_filename> if given, slurm-<job_id>.sh otherwise). The optional filename can be - to write the script to the screen rather than save it to a file.

Tauromachy answered 5/8, 2016 at 9:26 Comment(9)
ah, nice! That's why it doesn't work if I change a script that is run from within the bash script (for example the python main.py script called by the bash script): Slurm doesn't keep a copy of that one. Excellent, it means the arguments to my python script don't change then! :DConstriction
Is it possible to copy other scripts to the internal data base? For example I wanted to have a config script that I will be changing across runs but I would like to keep constant once the job is in the queue.Constriction
@CharlieParker I don't think so. You need to copy that config script for each job in a separate directory (or with a job-specific name)Tauromachy
I guess the hacky solution, but it works, is to make the config file itself the submission script. Since you can specify which interpreter to use (like python), things are actually much simpler because the submission script doesn't have to be in bash. Thanks Damien for the discussions.Constriction
why do you need the -ddd flag? I saw the manual and there is only a single -d for details; what does triple d do, as in ddd?Constriction
@CharlieParker The manual says 'Repeating the option more than once (e.g., "-dd") will cause the show job command to also list the batch script, if the job was a batch job.' So actually one additional d is sufficient; I put three out of habit, as some other tools (e.g. sshd) take up to three d'sTauromachy
I assume that this applies to srun too, i.e. changing the submission script passed to srun doesn't screw things up, right? (not sure how to check)Constriction
another side question, if we actually run a script through some command like docker as in srun docker submission_script, does that mean we can change the submission_script to run new things without having to worry it will change the ones that are already running?Constriction
scontrol write batch_script <id> doesn't show me anything except "batch script for job <id> written to slurm-<id>.sh"Retinol
