Run piped commands with eval

I have the following command line to check free space of a file system:

fs_used=`df -h /u01 | sed '1d' | sed '1d' | awk '{print $4}' | cut -d'%' -f1`

It works fine. It returns the percentage of the used space on the file system (without the % symbol).

Now I need to make it a variable and run it with the eval command. I tried the following, but it doesn't work (it exits with df: invalid option -- 'd'):

df_cmnd="df -h $fs1 | sed '1d' | sed '1d' | awk '{print $4}' | cut -d'%' -f1"
fs_used=eval $df_cmnd

The problem, I guess, is that eval cannot run piped commands. Is that true? Is there any workaround or alternative to make this code run?

Herrin answered 13/12, 2017 at 13:29 Comment(1)
No, you need to make it a function, not a variable. – Epigraphy

Backslash-escape the $, and use $():

#              V                                         V
df_cmnd="df -h \$fs1 | sed '1d' | sed '1d' | awk '{print \$4}' | cut -d'%' -f1"
fs_used=$(eval "$df_cmnd")
#       ^^               ^

This will use the value of fs1 at the time you eval.
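For example (a quick sketch, assuming /u01 and /home are both mounted file systems), the same string can be reused for different mount points by changing fs1 between evals:

fs1=/u01
used_u01=$(eval "$df_cmnd")    # at this point runs: df -h /u01 | sed ...

fs1=/home
used_home=$(eval "$df_cmnd")   # same string, now runs: df -h /home | sed ...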

But, in reality, please don't use eval! Make it a shell function instead:

df_cmnd(){
    # print the Use% value (without the % sign) for the given mount point
    df -h "$1" | sed '1d' | sed '1d' | awk '{print $4}' | cut -d'%' -f1
}
fs_used=$(df_cmnd /u01)

Then you don't have to worry about escaping.
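For instance, with the mount point in a variable (as with your fs1), you simply pass it as an argument; no escaping and no eval needed:

fs1=/u01
fs_used=$(df_cmnd "$fs1")
echo "Used: ${fs_used}%"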

Explanation

Look at how bash interprets your df_cmnd assignment:

$ df_cmnd="df -h $fs1 | sed '1d' | sed '1d' | awk '{print $4}' | cut -d'%' -f1"
$ echo $df_cmnd
df -h | sed '1d' | sed '1d' | awk '{print }' | cut -d'%' -f1
#    ^                                   ^

In my case, fs1 was empty, so the df part collapsed to just df -h. In both your case and mine, bash replaced $4 with its value, which is empty here since I wasn't running in a script with four positional parameters. As a result, awk will print the whole line rather than just the fourth field. (The df: invalid option -- 'd' error itself comes from fs_used=eval $df_cmnd: that line never runs eval; it puts fs_used=eval into the environment and runs the word-split contents of $df_cmnd as a single command, so df receives the literal |, sed, '1d', ..., -d'%' words as arguments and rejects -d.)
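For comparison, with the backslashes from the fix above, the string survives the assignment intact:

$ df_cmnd="df -h \$fs1 | sed '1d' | sed '1d' | awk '{print \$4}' | cut -d'%' -f1"
$ echo "$df_cmnd"
df -h $fs1 | sed '1d' | sed '1d' | awk '{print $4}' | cut -d'%' -f1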

Tungusic answered 13/12, 2017 at 13:31 Comment(4)
Thanks a lot, I am going with the function option. – Herrin
@OuldAbba I am delighted to hear it! :D :D :D Less eval makes the world a better place. – Tungusic
And if you do ever use eval, remember to double-quote your variables (e.g. eval "$df_cmnd"). This eliminates one whole class of weird eval-related bugs. (Unfortunately, it doesn't fix the other dozen or so classes of weird eval-related bugs. So you should still avoid eval.) – Endo
@GordonDavisson Good point - edited. Although I have mixed feelings about showing how to use eval "properly" ;) – Tungusic
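To illustrate the class of bug the double quotes prevent (a contrived sketch, not from the thread): if the stored command contains a newline, the unquoted form word-splits it away before eval ever sees it:

cmd=$'echo one\necho two'
eval "$cmd"    # two commands: prints "one", then "two"
eval $cmd      # word-split and re-joined with spaces: prints "one echo two"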

I had a legitimate need for eval-ing a string: a helper script runs a complicated sequence of pipes with double-quoting (e.g. --json-override='"key": "value"'), 460+ characters long. But when I run it with --noop, I want to see the command that would be executed, in a copy-pasteable form for manual execution. I came up with this:

echo_eval () {
    cmd=$(cat)          # read the whole command from stdin (the here-document)
    if ((noop)); then
        echo "$cmd"     # --noop mode: just show the command, copy-pasteable
    else
        eval "$cmd"     # normal mode: actually run it
    fi
}

foo () { for arg; do echo "[$arg]"; done; }   # test helper: bracket each argument

echo_eval <<EOF
foo 1 2 "3 4" '"5 6"' " 7 " "\"8 9\"" | tac
EOF
exit 1

The main purpose of the here-document is to skip all the pitfalls of using the ' and " quotes in an already ' or "-quoted string (pain and suffering).
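For comparison, storing the same kind of argument in an ordinary double-quoted string (a contrived sketch with a made-up mytool command, not the real helper script) means escaping every literal double quote by hand, which the here-document version avoids entirely:

cmd="mytool --json-override='\"key\": \"value\"' | tac"
echo "$cmd"    # mytool --json-override='"key": "value"' | tac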

While I haven't tested it extensively, it seems to tick the relevant boxes: pipes, quoting of space-containing arguments, quotes nested inside quotes, escaped quotes, leading/trailing spaces in arguments, and so on:

["8 9"]
[ 7 ]
["5 6"]
[3 4]
[2]
[1]

When noop is true, I get:

foo 1 2 "3 4" '"5 6"' " 7 " "\"8 9\"" | tac

which is the command verbatim, copy-pasteable for manual execution and yielding the same output.

We can also use <<'EOF' to disable substitution of any $s before echo_eval runs, but if any variable or command substitution does need to happen before calling echo_eval, we have to use <<EOF and escape every $ that should not be expanded yet.
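Here is a sketch of both variants, reusing echo_eval and the pipeline from the question (assuming /u01 is mounted):

# Quoted delimiter: nothing is expanded now, so $4 needs no escaping;
# $fs1 is only expanded when eval runs (it just has to be set by then).
fs1=/u01
echo_eval <<'EOF'
df -h "$fs1" | sed '1d' | sed '1d' | awk '{print $4}' | cut -d'%' -f1
EOF

# Unquoted delimiter: $fs1 is substituted into the string right here,
# so the awk field reference has to be escaped as \$4 to survive until eval.
echo_eval <<EOF
df -h "$fs1" | sed '1d' | sed '1d' | awk '{print \$4}' | cut -d'%' -f1
EOF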


Having said all this, I'd gladly (a) hear if there are pitfalls (apart from the $ mentioned above) and (b) see a better solution to the problem if there is one.

Jujutsu answered 17/3, 2020 at 12:3 Comment(0)
