Edit shell script while it's running

Can you edit a shell script while it's running and have the changes affect the running script?

I'm curious about the specific case of a csh script I have that batch-runs a bunch of different build flavors and runs all night. If something occurs to me mid-operation, I'd like to go in and add additional commands, or comment out unexecuted ones.

If that's not possible, is there any shell or batch mechanism that would allow me to do this?

Of course I've tried it, but it will be hours before I see if it worked or not, and I'm curious about what's happening or not happening behind the scenes.

Rascal answered 3/8, 2010 at 15:49 Comment(4)
I've seen two results from editing the script file for a running script: 1) the changes are ignored as if it had read the whole thing into memory, or 2) the script crashes with an error as if it had read part of a command. I don't know if that's dependent on the size of the script. Either way, I wouldn't try it.Spinal
In short: no, unless it's self-referential/calling, in which case the main script would still be the old one.Wellspoken
There are two important questions here. 1) How can I correctly and safely add commands to a running script? 2) When I modify a running script, what will happen?Enkindle
The question is whether a shell executes a script by reading the entire script file and then executing it, or by partially reading it as it executes. I don't know which it is; it might not even be specified. You should avoid depending on either behavior.Auvil

Scripts don't work that way; the executing copy is independent from the source file that you are editing. Next time the script is run, it will be based on the most recently saved version of the source file.

It might be wise to break this script out into multiple files and run them individually. This will reduce the execution time to failure (i.e., split the batch into one script per build flavor, running each one individually to see which one is causing the trouble).
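
For instance, here is a minimal sketch of that split (the file names are placeholders for your own per-flavor scripts, not anything from the original answer):

#!/bin/sh
# run_all.sh -- driver that runs each build flavor as its own script
for flavor_script in ./build_flavor_a.sh ./build_flavor_b.sh; do
    echo "Running $flavor_script"
    "$flavor_script" || echo "$flavor_script failed" >&2
done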

Teresa answered 3/8, 2010 at 15:54 Comment(8)
I have observed the same. Is there any place in the bash (or csh or ksh) documentation where this is mentioned?Steib
I have observed the opposite. Running bash scripts that get edited can cause the running script to crash because the file seems to move under bash's script reading file position.Diminuendo
In my experience on multiple systems, the executing copy is NOT independent from the disk file, that's why this issue is so surprising and important in shell script programming.Enkindle
It’s definitely not independent of the on-disc file. The shell just usually reads the scripts in blocks of, for example, 128 bytes or 4096 bytes or 16384 bytes, and only reads the next block when it needs new input. (You can do things like lsof on a shell running a script and see it’s still got the file opened.)Shelves
No. Actually if you edit a script it causes the process to fail.Headrace
You are not correct. Buffering depends on the implementation, the actual command being called in the script, whether stdout is redirected to a file, and many other factors; your answer isn't simply correct.Energid
@Teresa - it might be what you've observed, but I suspect that this is a red herring thrown in by your editor - which may be creating a new file and moving it into place on each save, rather than editing the file directly.Sn
"These downvotes suck. I am right. It is what I have observed" -- so if I observe a face on the moon, does that mean there really is a man in the moon? My observations can be wrong... so can yours.Joelynn

It does have an effect, at least with bash in my environment, but in a very unpleasant way. See these scripts. First, a.sh:

#!/bin/sh

echo "First echo"
read y

echo "$y"

echo "That's all."

b.sh:

#!/bin/sh

echo "First echo"
read y

echo "Inserted"

echo "$y"

# echo "That's all."

Do

$ cp a.sh run.sh
$ ./run.sh
$ # open another terminal
$ cp b.sh run.sh  # while 'read' is in effect
$ # Then type "hello."

In my case, the output is always:

hello
hello
That's all.
That's all.

(Of course it's far better to automate it, but the above example is readable.)

[edit] This is unpredictable, and thus dangerous. The best workaround is, as described in the answer below, to put everything inside braces and to put "exit" before the closing brace. Read that answer carefully to avoid pitfalls.

[added] The exact behavior can depend on something as small as one extra newline, and perhaps also on your Unix flavor, filesystem, etc. If you simply want to see some effect, add a few "echo foo/bar" lines to b.sh before and/or after the "read" line.
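
For illustration, here is a minimal sketch of a.sh rewritten with that brace-and-exit workaround (my own rendering of the idea, not code from the linked answer):

#!/bin/sh
{
    echo "First echo"
    read y
    echo "$y"
    echo "That's all."
    exit    # exit inside the braces, so lines appended later are never reached
}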

Wiser answered 10/6, 2011 at 7:44 Comment(3)
Mh, I don't see the affection. Am I missing something?Twannatwattle
The exact behavior depends on one extra newline, and perhaps also on Unix flavor, filesystem, etc., though I'm not sure at all. If you simply want to see any influence, enlarge b.sh by adding 10 lines of echo foo/bar/baz. The gist of the answers by dave4220 and me is that the effect is not easy to predict. (BTW the noun "affection" means "love" =)Wiser
Yes, it's very broken. I have a solution (below). What's even more dangerous is svn/rsync/git updates.Headrace

Try this... create a file called bash-is-odd.sh:

#!/bin/bash
echo "echo yes i do odd things" >> bash-is-odd.sh

That demonstrates that bash is, indeed, interpreting the script "as you go". Editing a long-running script can therefore have unpredictable results, inserting random characters, etc. Why? Because bash resumes reading from its last byte position, so editing shifts the location of the character it reads next.

Bash is, in a word, very, very unsafe because of this "feature". svn and rsync when used with bash scripts are particularly troubling, because by default they "merge" the results... editing in place. rsync has a mode that fixes this. svn and git do not.

I present a solution. Create a file called /bin/bashx:

#!/bin/bash
source "$1"

Now use #!/bin/bashx on your scripts and always run them with bashx instead of bash. This fixes the issue - you can safely rsync your scripts.
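
A usage sketch (the script name is made up; this assumes /bin/bashx exists as above and is executable):

#!/bin/bashx
# nightly-build.sh -- the kernel invokes /bin/bashx, which sources this file,
# so edits to the file on disk should not disturb the copy already running
echo "building flavor A..."
echo "building flavor B..."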

Alternative (in-line) solution proposed/tested by @AF7:

{
    # your script's commands go here
    exit $?    # keep the exit inside the braces so appended lines never run
}

Curly braces protect against edits, and exit protects against appends. Of course, we'd all be much better off if bash came with an option, like -w (whole file), or something that did this.

Headrace answered 17/10, 2013 at 15:28 Comment(12)
Btw; here's a plus to counter the minus and because I like your edited answer.Polyphemus
I can't recommend this. In this workaround, positional parameters are shifted by one. Also remember that you can't assign a value to $0. It means if you simply change "/bin/bash" to "/bin/bashx", many scripts fail.Wiser
Please tell me that such an option has been implemented already!Marlenamarlene
A simple solution, suggested to me by my friend Giulio (credits where due), is to insert { at the beginning and } at the end of the script. Bash is forced to read everything into memory.Marlenamarlene
@Marlenamarlene improving on your friend's solution: { your_code; } && exit; will prevent lines appended to the end from being executed as well.Blockade
Adding another || exit catches pending status codes > 0 as well :-)Blockade
I guess technically this would be better as an answer to the opposite question (how to prevent edits from affecting the running script), not that it really matters now.Pyromagnetic
@teikakazura it doesn't shift the parameters - because you use source - but it does change the first argument to bashx instead of bashHeadrace
@ErikAronesty yes, rsync edits bytes with --in-place, but that doesn't change the problem with bash. bash has a byte pointer up to which point the script was executed. edits shifting code beyond or adding code after that point will be executed unless bash had to read-ahead the code in curly braces, and if that code is self-contained with an exit inside the curly braces nothing can be edited. I wrote "} && exit;" back then, today I would write "exit; }"Blockade
@Blockade yes, i agree, which is why bashx solves the problem for sure, but solutions like {...}; exit $? can potentially have a race where the file is edited between the code (which is safe) and the exit command (which is not). I think you have to put the exit in the curly braces to be sure.Headrace
@ErikAronesty That is correct and what I meant by 'today I would write "exit; }"'. I guess you should edit your answer, then :-)Blockade
This doesn't just happen in bash. It happens in sh too.Dalrymple

Break your script into functions, and each time a function is called you source it from a separate file. Then you could edit the files at any time and your running script will pick up the changes next time it gets sourced.

foo() {
  source foo.sh
}
foo
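
For instance (foo.sh here is a hypothetical file sitting next to the main script), foo.sh holds the actual work, and because it is re-sourced on every call, edits to it take effect the next time foo runs:

# foo.sh -- re-read from disk each time the foo function is called
echo "building flavor A"
# add, change, or comment out commands here while the main script is running
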
Vonvona answered 3/8, 2010 at 18:59 Comment(1)
I have been using this technique effectively for a while now to update my long-running build scripts while they are running. I'd love to learn a technique for causing the current file to read until the end of file, so that I don't have to have two files to implement each shell script.Enkindle

Good question! Hope this simple script helps:

#!/bin/sh
echo "Waiting..."
echo "echo \"Success! Edits to a .sh while it executes do affect the executing script! I added this line to myself during execution\"  " >> ${0}
sleep 5
echo "When I was run, this was the last line"

It does seem, under Linux, that changes made to an executing .sh are picked up by the executing script, if you can type fast enough!

Oxfordshire answered 6/9, 2018 at 4:24 Comment(0)

An interesting side note: if you are running a Python script, it does not change. (This is probably blatantly obvious to anyone who understands how the shell runs Python scripts, but I thought it might be a useful reminder for someone looking for this functionality.)

I created:

#!/usr/bin/env python3
import time
print('Starts')
time.sleep(10)
print('Finishes unchanged')

Then, in another shell, while this is sleeping, edit the last line. When it completes, it displays the unaltered line, presumably because it is running a .pyc? The same happens on Ubuntu and macOS.

Loritalorn answered 9/11, 2018 at 14:4 Comment(1)
Python typically reads and compiles the whole script before it runs it. Some shells do this too; others read and execute the script piece by piece. Modify a running script only if you know that it is safe for that particular implementation of the language.Aksoyn

I don't have csh installed, but

#!/bin/sh
echo Waiting...
sleep 60
echo Change didn't happen

Run that, quickly edit the last line to read

echo Change happened

Output is

Waiting...
/home/dave/tmp/change.sh: 4: Syntax error: Unterminated quoted string

Hrmph.

I guess edits to the shell scripts don't take effect until they're rerun.

Rubicund answered 3/8, 2010 at 15:57 Comment(5)
you should put the string you want to display in quotes.Perverse
actually, it proves that your editor doesn't work the way you think. many, many editors (including vim, emacs) operate on a "tmp" file, and not the live file. Try using "echo 'echo uh oh' >> myshell.sh" instead of vi/emacs... and watch as it outputs the new stuff. Worse... svn and rsync also edit this way!Headrace
-1. That error isn't related to the file being edited: it's because you're using an apostrophe! That acts as a single quote, causing the error. Put that whole string in double quotes and try again.Christyna
The fact that the error occurred shows that the edit didn't have the intended effect.Chiffon
@Chiffon Who knows? Maybe bash saw Change didn'ned.Involution

If this is all in a single script, then no, it will not work. However, if you set it up as a driver script calling sub-scripts, then you might be able to change a sub-script before it's called, or before it's called again if you're looping, and in that case I believe those changes would be reflected in the execution.
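
A minimal sketch of the looping case (the names are hypothetical; step.sh is opened fresh on every iteration, so edits made between iterations take effect):

#!/bin/sh
# driver.sh -- step.sh is re-read each time it is invoked
for flavor in debug release profile; do
    ./step.sh "$flavor"
done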

Ivan answered 3/8, 2010 at 16:10 Comment(0)

I'm hearing no... but what about with some indirection:

BatchRunner.sh:

Command1.sh
Command2.sh

Command1.sh:

runSomething

Command2.sh:

runSomethingElse

Then you should be able to edit the contents of each command file before BatchRunner gets to it, right?

OR

A cleaner version would have BatchRunner read from a single file and run it one line at a time. Then you should be able to edit this second file while the first is running, right?
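
A rough sketch of that second variant (queue.txt and the loop below are my own guess at the idea, not code from the original post). Note that the queue file is read through an open file descriptor, so it shares some of the mid-edit hazards discussed above; appending new lines is the safest kind of change:

#!/bin/sh
# BatchRunner.sh -- runs queue.txt one line at a time; each line is read
# just before it runs, so later lines can still be added or removed
while IFS= read -r cmd; do
    [ -z "$cmd" ] && continue              # skip blank lines
    case "$cmd" in "#"*) continue ;; esac  # skip comment lines
    echo "Running: $cmd"
    sh -c "$cmd"
done < queue.txt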

Rascal answered 3/8, 2010 at 16:5 Comment(1)
I wonder if it loads them into memory to run them and a change doesn't matter once the main process is initiated...Rabid

Use Zsh instead for your scripting.

AFAICT, Zsh does not exhibit this frustrating behavior.

Magnolia answered 31/3, 2020 at 23:53 Comment(1)
This is reason #473 to prefer Zsh to bash. I've recently been working on an old bash script that takes 10m to run, and I can't edit it while waiting for it to complete!Magnolia

Usually, it's uncommon to edit your script while it's running. All you have to do is put in control checks for your operations. Use if/else statements to check for conditions. If something fails, then do this, else do that. That's the way to go.
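
For example, a minimal sketch of that kind of control check (the make targets are placeholders):

#!/bin/sh
if make flavor_a; then
    echo "flavor_a built OK"
else
    echo "flavor_a failed, skipping dependent steps" >&2
fi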

Guaco answered 3/8, 2010 at 15:57 Comment(2)
It's actually less about scripts failing than it is about deciding to modify the batch job mid-operation, i.e. realizing there's more I want to compile, or that I don't need certain jobs already in the queue.Rascal
If you strictly append to scripts, then bash will do what you expect!Headrace