How to implement 'set -o pipefail' in a POSIX way - almost done, expert help needed

I have to implement the Bash set -o pipefail option in a POSIX way so that it works on various Linux/Unix flavors. To explain a bit, this option lets you verify that every command in a pipeline succeeded. With the option enabled, tail app.log | grep 'ERROR' fails if tail fails; without it, the tail error is suppressed because $? only reflects grep's exit status.
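
For illustration, this is the bash behaviour I want to reproduce in plain POSIX sh (sh -c 'exit 3' just stands in for any failing command):

$ sh -c 'exit 3' | cat
$ echo $?
0
$ set -o pipefail
$ sh -c 'exit 3' | cat
$ echo $?
3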

So, I found a really nice solution here: http://cfaj.ca/shell/cus-faq-2.html

run() {
    # Clear any pipestatus_N variables left over from a previous call.
    j=1
    while eval "\${pipestatus_$j+:} false"; do
        unset pipestatus_$j
        j=$(($j+1))
    done
    # Build, in $com, a pipeline in which each segment reports its exit
    # status as an assignment of the form pipestatus_N=$?.
    j=1 com= k=1 l=
    for a; do
        if [ "x$a" = 'x|' ]; then
            com="$com { $l "'3>&-
                        echo "pipestatus_'$j'=$?" >&3
                } 4>&- |'
            j=$(($j+1)) l=
        else
            l="$l \"\$$k\""
        fi
        k=$(($k+1))
    done
    # The last segment sends its output to the saved stdout (fd 4) and
    # reports its status on the stream captured below.
    com="$com $l"' 3>&- >&4 4>&-
    echo "pipestatus_'$j'=$?"'
    # fd 4 = the real stdout; fd 3 = the command-substitution capture that
    # collects the pipestatus_N=... assignments, which are then eval'ed here.
    exec 4>&1
    eval "$(exec 3>&1; eval "$com")"
    exec 4>&-
    # Fail as soon as any segment reported a non-zero status.
    j=1
    while eval "\${pipestatus_$j+:} false"; do
        eval "[ \$pipestatus_$j -eq 0 ]" || return 1
        j=$(($j+1))
    done
    return 0
}

The above-mentioned run() function lets you invoke piped commands like this:

run cmd1 \| cmd2 \| cmd3

If one of the commands fails, run reports it through a non-zero $?.
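
In a script I can then check the result the usual way, for example:

if run tail app.log \| grep 'ERROR'; then
    echo "pipeline succeeded"
else
    echo "some command in the pipeline failed"
fi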

There is a problem, however; it does not support the grouping of commands between pipes. I want to be able to invoke something like this:

run echo "test" ; grep "test" \| awk '{print}'

When I do it, the invocation fails. I cannot get the right modification to support the grouping of commands -- the script is a bit too complex for my bash skills...

Could somebody help?

Peashooter answered 26/10, 2012 at 9:33 Comment(5)
; does not group commands between pipes in bash. ( ... ; ... ) or { ... ; ... ; } does.Deibel
set -o pipefail. set -e is different. This is possible but probably more effort than it's worth. Ksh also supports pipefail, and mksh supports PIPESTATUS, which can easily be used to implement pipefail. I would seriously consider using a different language before attempting this in POSIX sh.Pinson
@Deibel - this is what I meant. You are right. This does not work either.Peashooter
@Pinson I need a consistent and single way to handle this. The run() method is fine, so I don't think it's a lot of effort to add the grouping.Peashooter
At the peril of stating the obvious, grep x | awk '{y}' should be written awk '/x/{y}' anyway. iki.fi/era/unix/award.html#grepDigit

My two cents:

#!/bin/sh

# Saving the pid of the main shell is required,
# as each element of the pipe is a subshell.
self=$$

lots_and_fail() {
    seq 100
    return 1
}

{ lots_and_fail || kill $self; } | sed s/7/3/

This thing seems to do the job. Thoughts?
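
If the snippet is saved as a script, say pipefail-demo.sh (the name is just for illustration), the kill shows up to the caller as termination by SIGTERM, i.e. a non-zero exit status (typically 128 + 15 = 143):

$ sh pipefail-demo.sh > /dev/null
$ echo $?
143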

Nette answered 2/10, 2019 at 10:15 Comment(6)
So let me get this straight. Did you downvote the answer because it contains an obviously rhetorical question, or do you actually see a reason why this solution is not good? Would you mind elaborating on your downvote?Nette
Clever. But why doesn't it work if instead of kill you use exit 1?Esculent
@Esculent - exit 1 exits the subshell :)Nette
If you wrap the guts into a function it would be a bit nicer: try() { set -e ; trap 'kill $$' EXIT ; eval "$@" ; }. And use it as try cmd1 | ... cmdNJaques
@jsxt: nice to see, could it be that the last sentence of Jonathan Leffler's answer may also apply to this?: "It assumes that wrapping double quotes around an argument neutralizes it correctly, which is not always true (though it is true a lot of the time)." [referring originally to OP]. Or does eval "$@" make this fully transparent?Geriatrics
@hakre, surely. My suggestion just accumulates the best stuff of the suggestions by Dacav and Jonathan Leffler.Jaques

When you type:

run echo "test" ; grep "test" \| awk '{print}'

you invoke run with the arguments echo and "test"; then you invoke grep with arguments "test", |, awk and {print}. Typically, grep is not going to find any of the files called |, awk or {print}.

To invoke run as you wanted, you'd have to escape the semi-colon like you did the | (and you'd need to do things similarly for && or || or & and possibly other components of a command line; the handling of $(...) or backticks `...` needs to be thought about carefully).

If you write:

run echo "test" \; grep "test" \| awk '{print}'

you will at least get all the arguments you intended to run. Whether it then works is debatable; I don't yet understand how the run code you showed is supposed to work.

[...Later...]

It does some fearsome I/O redirections, but wraps each segment of a command separated by a pipe symbol into a separate little packet of hieroglyphs. It assumes that wrapping double quotes around an argument neutralizes it correctly, which is not always true (though it is true a lot of the time).
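
For example, for run cmd1 \| cmd2 the string accumulated in com comes out roughly like this (my reconstruction, whitespace tidied; $1 and $3 are run's positional parameters holding cmd1 and cmd2):

{ "$1" 3>&-
  echo "pipestatus_1=$?" >&3
} 4>&- | "$3" 3>&- >&4 4>&-
echo "pipestatus_2=$?"

The line eval "$(exec 3>&1; eval "$com")" then runs that pipeline with descriptor 3 pointing into the command substitution, so the pipestatus_N=... assignments are captured and evaluated in the calling shell, while the pipeline's real output still reaches the original standard output through descriptor 4.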

Thalassa answered 26/10, 2012 at 13:16 Comment(0)

dash from git (not yet released at the time of writing) has supported pipefail since this commit:

https://git.kernel.org/pub/scm/utils/dash/dash.git/commit/?id=6347b9fc52d742f36a0276cdea06cd9ad1f02c77

In Debian, that commit has been cherry-picked for version 0.5.12-7:

https://tracker.debian.org/news/1530876/accepted-dash-0512-7-source-into-unstable/

Before:

$ sh -c "exit 1" | cat
$ echo $?
0

After:

$ set -o pipefail
$ sh -c "exit 1" | cat
$ echo $?
1
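
If a script also has to run on shells that predate this change, one option is to probe for pipefail at startup and fall back to another strategy when it is missing; a minimal sketch (the have_pipefail variable name is my own):

if (set -o pipefail) 2>/dev/null; then
    set -o pipefail
    have_pipefail=1
else
    have_pipefail=0   # fall back to e.g. the run() approach from the question
fi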
Bordelaise answered 19/5, 2024 at 5:22 Comment(0)

The core of your idea should probably involve something like this:

{ cmd1 ; echo $? > status1 ; } | cmd2 && grep -q '^0$' status1

In longer form, that would be:

{ cmd1 ; echo $? > status1 ; } |
{ cmd2 ; echo $? > status2 ; } |
  # ... and so on
  cmdN &&
  # ^ note lack of wrapper around the last command
grep -q '^0$' status1 &&
grep -q '^0$' status2
  # ... and so on, to N-1
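
Applied to the pipeline from the question, a self-contained sketch of the same idea might look like this (the temporary status file and variable names are my own choices; note that a strictly POSIX shell is only required to wait for the last command of a pipeline, so in rare cases the status could be read before it is written):

#!/bin/sh
# Emulate pipefail for: tail app.log | grep 'ERROR'
status1=$(mktemp) || exit 1

{ tail app.log; echo $? > "$status1"; } | grep 'ERROR'
grep_status=$?

read tail_status < "$status1" || tail_status=1   # treat a missing status as failure
rm -f "$status1"

if [ "$tail_status" -ne 0 ] || [ "$grep_status" -ne 0 ]; then
    echo "pipeline failed (tail=$tail_status, grep=$grep_status)" >&2
    exit 1
fi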
Tremolant answered 28/2, 2019 at 17:54 Comment(0)
