How to redirect python subprocess stderr and stdout to multiple files? [duplicate]

I just want to redirect stderr and stdout to multiple files. For example, stderr should be redirected to both file_1 and file_2.

I am using the following to redirect output to a single file.

subprocess.Popen("my_commands",shell=True,stdout=log_file,stderr=err_file,executable="/bin/bash")

This redirects stdout and stderr each to a single file.
Can anybody tell me how to redirect each stream to multiple files? For example, stdout should be redirected to both log_file and err_file, and stderr should be redirected to both err_file and new_file.

Olivas asked 22/12, 2016 at 12:51 Comment(5)
You may find some of the answers here helpful: #2997387 (Burack)
You are telling me the same thing I am already using. It should redirect to multiple file descriptors, not only to a single file. (Olivas)
Not useful. Not working. (Olivas)
With the code in my answer you can redirect to as many files as you can open. :) (Burack)
What Python version are you using? Can you wait for "my_commands" to complete before writing the output data to the files? Or do you need to write to the files while "my_commands" is still running? If you can wait, what you want is easy. If you need to write to the 3 files while "my_commands" is still running it's a bit trickier. (Burack)

You can create your own file-like class that writes to multiple file handles. Here's a simple example, with a test that redirects sys.stdout and sys.stderr.

import sys

class MultiOut(object):
    """A file-like object that fans each write out to several file handles."""
    def __init__(self, *args):
        self.handles = args

    def write(self, s):
        for f in self.handles:
            f.write(s)

with open('q1', 'w') as f1, open('q2', 'w') as f2, open('q3', 'w') as f3:
    sys.stdout = MultiOut(f1, f2)   # stdout goes to q1 and q2
    sys.stderr = MultiOut(f3, f2)   # stderr goes to q3 and q2
    for i, c in enumerate('abcde'):
        print(c, 'out')
        print(i, 'err', file=sys.stderr)

After running that code, here's what those files contain:

q1

a out
b out
c out
d out
e out    

q3

0 err
1 err
2 err
3 err
4 err    

q2

a out
0 err
b out
1 err
c out
2 err
d out
3 err
e out
4 err

FWIW, you can even do this, if you like:

sys.stdout = MultiOut(f1, f2, sys.stdout)
sys.stderr = MultiOut(f3, f2, sys.stderr)
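
One caveat with these examples: they replace sys.stdout and sys.stderr for the whole interpreter session, and once the with block closes the files, anything that still prints will fail. If the rest of your program needs to keep printing, save the original streams and restore them when you're done. A minimal sketch, reusing the MultiOut class defined above:

import sys

orig_stdout, orig_stderr = sys.stdout, sys.stderr
try:
    with open('q1', 'w') as f1, open('q2', 'w') as f2, open('q3', 'w') as f3:
        sys.stdout = MultiOut(f1, f2)   # stdout -> q1 and q2
        sys.stderr = MultiOut(f3, f2)   # stderr -> q3 and q2
        print('this line goes to q1 and q2')
        print('this line goes to q3 and q2', file=sys.stderr)
finally:
    # put the real streams back so later prints work normally
    sys.stdout, sys.stderr = orig_stdout, orig_stderr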

Unfortunately, file-like objects like MultiOut can't be used with Popen, because Popen accesses files via the underlying OS file descriptor: it wants something that the OS considers to be a file, so only Python objects that supply a valid fileno() method can be used for Popen's file arguments.
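
If you don't need to see the output while the command is still running, there is a much simpler route, as the comments above hint: wait for the command to finish, capture both streams in memory, and then write each one to as many files as you like. A minimal sketch, assuming Python 3.7+ (for subprocess.run's capture_output) and the file names from the question:

import subprocess

# run the command to completion, capturing stdout and stderr as bytes
result = subprocess.run("my_commands", shell=True, executable="/bin/bash",
                        capture_output=True)

# stdout goes to log_file and err_file; stderr goes to err_file and new_file
with open('log_file', 'wb') as log, open('err_file', 'wb') as err, \
        open('new_file', 'wb') as new:
    log.write(result.stdout)
    err.write(result.stdout)
    err.write(result.stderr)
    new.write(result.stderr)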

If you do need the output while the command is still running, we can use Python 3's asyncio features to execute the shell command and to copy its stdout and stderr output concurrently.

Firstly, here's a simple Bash script that I used to test the following Python code. It simply loops over an array, echoing the array contents to stdout and the array indices to stderr, like the previous Python example.

multitest.bsh

#!/usr/bin/env bash

a=(a b c d e)
for((i=0; i<${#a[@]}; i++))
do 
    echo "OUT: ${a[i]}"
    echo "ERR: $i" >&2
    sleep 0.01
done

output

OUT: a
ERR: 0
OUT: b
ERR: 1
OUT: c
ERR: 2
OUT: d
ERR: 3
OUT: e
ERR: 4
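
(If you save the script as multitest.bsh, remember to make it executable, e.g. chmod +x multitest.bsh, since the Python code below runs it as ./multitest.bsh.)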

And here's Python 3 code that runs multitest.bsh, piping its stdout output to files q1 and q2, and its stderr output to q3 and q2.

import asyncio
from asyncio.subprocess import PIPE

class MultiOut(object):
    """Same fan-out helper as before, plus a no-op close()."""
    def __init__(self, *args):
        self.handles = args

    def write(self, s):
        for f in self.handles:
            f.write(s)

    def close(self):
        # no-op: the with statement below manages the real files
        pass

async def copy_stream(stream, outfile):
    """ Read from stream line by line until EOF, copying it to outfile. """
    while True:
        line = await stream.readline()
        if not line:
            break
        outfile.write(line) # assume it doesn't block

async def run_and_pipe(cmd, fout, ferr):
    # start the child process
    process = await asyncio.create_subprocess_shell(cmd,
        stdout=PIPE, stderr=PIPE, executable="/bin/bash")

    # read the child's stdout/stderr concurrently
    try:
        await asyncio.gather(
            copy_stream(process.stdout, fout),
            copy_stream(process.stderr, ferr))
    except Exception:
        process.kill()
        raise
    finally:
        # wait for the process to exit
        rc = await process.wait()
    return rc

# run the event loop until the child process has exited
with open('q1', 'wb') as f1, open('q2', 'wb') as f2, open('q3', 'wb') as f3:
    # binary mode: the data read from the child's pipes is bytes
    fout = MultiOut(f1, f2)
    ferr = MultiOut(f3, f2)
    rc = asyncio.run(run_and_pipe("./multitest.bsh", fout, ferr))
print('Return code:', rc)    

After running the code, here's what those files contain:

q1

OUT: a
OUT: b
OUT: c
OUT: d
OUT: e

q3

ERR: 0
ERR: 1
ERR: 2
ERR: 3
ERR: 4

q2

OUT: a
ERR: 0
OUT: b
ERR: 1
OUT: c
ERR: 2
OUT: d
ERR: 3
OUT: e
ERR: 4

The asyncio code was lifted from J.F. Sebastian's answer to the question Subprocess.Popen: cloning stdout and stderr both to terminal and variables. Thanks, J.F!

Note that data is written to the files when it becomes available to the scheduled coroutines; exactly when that happens depends on the current system load. So I put the sleep 0.01 command in multitest.bsh to keep the processing of stdout and stderr lines synchronised. Without that delay the stdout and stderr lines in q2 generally won't be nicely interleaved. There may be a better way to achieve that synchronisation, but I'm still very much a novice with asyncio programming.

Burack answered 22/12, 2016 at 13:24 Comment(3)
This will not work for me. Can you please suggest the same with subprocess.Popen? Thanks. (Olivas)
@Olivas Sorry about that! I've just learned that this doesn't work with Popen because Popen accesses files via the underlying OS file descriptor, which is why you get the AttributeError: 'MultiOut' object has no attribute 'fileno' error. What you want is possible, but a little tricky, and I'm currently looking into various solutions. (Burack)
@Olivas Please see my updated answer. (Burack)
