Constantly print Subprocess output while process is running

To launch programs from my Python scripts, I'm using the following method:

import subprocess

def execute(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    output = process.communicate()[0]
    exitCode = process.returncode

    if exitCode == 0:
        return output
    else:
        raise ProcessException(command, exitCode, output)  # ProcessException is my own exception class

So when I launch a process like Process.execute("mvn clean install"), my program waits until the process is finished, and only then do I get the complete output. This is annoying if I'm running a process that takes a while to finish.

Can I let my program write the process output line by line, by polling the process output in a loop before it finishes, or something similar?

I found this article which might be related.

Whitney answered 11/12, 2010 at 16:3 Comment(5)
Thread instead of subprocess, I thinkGrefe
No, you don't need threads. The entire piping idea works because you can get read/write from processes while they are running.Lizettelizotte
related: Python: read streaming input from subprocess.communicate()Consider
You can also try this solution https://mcmap.net/q/25570/-running-shell-command-and-capturing-the-outputBesetting
You can use asynchronous functionsMedusa

You can use iter to process lines as soon as the command outputs them: lines = iter(fd.readline, ""). Here's a full example showing a typical use case (thanks to @jfs for helping out):

from __future__ import print_function # Only Python 2.x
import subprocess

def execute(cmd):
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line 
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)

# Example
for path in execute(["locate", "a"]):
    print(path, end="")
Lizettelizotte answered 11/12, 2010 at 16:45 Comment(28)
I've tried this code (with a program that takes significant time to run) and can confirm it outputs lines as they're received, rather than waiting for execution to complete. This is the superior answer imo.Latent
Note: In Python 3, you could use for line in popen.stdout: print(line.decode(), end=''). To support both Python 2 and 3, use a bytes literal: b'', otherwise lines_iterator never ends on Python 3.Consider
The problem with this approach is that if the process pauses for a bit without writing anything to stdout there is no more input to read. You will need a loop to check whether or not the process has finished. I tried this using subprocess32 on python 2.7Raleigh
@Har: wrong. The loop ends (on EOF) when the subprocess is dead. No need to check whether the process is alive or not.Consider
Do not use PIPE unless you read from the pipe while the process is running, otherwise the child process may hang (your edit that introduced stderr=PIPE is wrong). To read more than one pipe, you need more complex code (threads, async I/O).Consider
it should work. To polish it, you could add bufsize=1 (it may improve performance on Python 2), close the popen.stdout pipe explicitly (without waiting for the garbage collection to take care of it), and raise subprocess.CalledProcessError (like check_call(), check_output() do). The print statement is different on Python 2 and 3: you could use the softspace hack print line, (note: comma) to avoid doubling all newlines like your code does and passing universal_newlines=True on Python 3, to get text instead of bytes—related answer.Consider
@Lizettelizotte I think I was mistaken. It appeared as though it was ending without reading all stdout, but really there was stderr and the whole process terminated with errors which is why eof was reachedUnadvised
It works great; however, if the child process is a Python script that calls time.sleep(), the line iteration will block.Impend
@binzhang: Can you upload somewhere that script you're running to check it?Lizettelizotte
@Lizettelizotte child_process.py contains: import time; import sys; then while True: print 'hello_world'; time.sleep(1); sys.stdout.flush(). The calling process runs: for path in execute(["python", "child_thread.py"]): print(path, end="")Impend
@Lizettelizotte Without sys.stdout.flush(), the line iteration blocks; without sys.stdout.flush() and time.sleep(), it does not block.Impend
@Impend That's not an error, stdout is buffered by default on Python scripts (also for many Unix tools). Try execute(["python", "-u", "child_thread.py"]). More info: #14259000Lizettelizotte
@binzhang: related: Python C program subprocess hangs at “for line in iter” (useful if there is no -u, grep's --line-buffered analogs in a general case)Consider
Could you explain the second argument to print(), end=""? It is treated as a syntax error for me when I try to run it in Python 3.5, and I can't figure out what it's for.Tourniquet
@Tourniquet I'm confused. That should work on 3.5. See the docs for more info: docs.python.org/3/library/functions.html#printLizettelizotte
You should also set stderr=subprocess.STDOUT when you construct Popen to make sure you don't miss out on any error messagesNumbersnumbfish
@Numbersnumbfish Sometimes you need to capture stderr, sometimes it's ok to show it on the terminal. Anyway, not sure what would be the best way to read from both at the same time, @J.F.Sebastian?Lizettelizotte
@tokland, really sleek code. I had to do quite a bit of reading about 'yield' and iterables after seeing your code, but I still can't really put the pieces together. Could you explain in more detail how your code works, and also what was your intuition for using yield in this case?Gardening
This does not work with the scp command. Trying to send a file and I'm getting no output, but it seems to work for the tree command.Clapboard
@Klik: I think that's because scp autodetects if there is a terminal or not. There is info in SO, for example: #3891309Lizettelizotte
@Lizettelizotte thanks for the reference. I have a working implementation by using rsync. askubuntu.com/questions/44059/progress-bar-for-scp-command Doing this I was able to get a progress update of the uploadClapboard
@tokland, Trying to run this with a while loop in the child process, which works great. But if I put a time.sleep(1) in, so the loop only runs once per second, I get nothing printed until the process finishes. Any thoughts?Schild
This does not work for me when my command contains a "&" (ie - trying to run two commands together). Is there any workaround to allow the "&" sign?Parallelism
How to set a timeout for it?Gadolinite
I'm puzzled...Does this not still suffer from the fundamental problem of potential blocking on the pipe in case the process sends a lot of data without any newlines? That is, the exact reason why the Python documentation advises to use communicate() instead of directly reading from the pipes?Matti
This works! I just want to give a heads up for those that it doesn't work for... for me the problem was that the program I was executing didn't flush as I expected!Playwright
@Lizettelizotte to read both streams at once, threads or asyncio could be used (see the sketch after these comments) #31834397Consider
Hi @tokland, I have a cli tool that has a progress bar (so it outputs a carriage return to always update the same line). I can't find a way to capture that progress bar. Running your script just prints an empty line. Do you know how to do it? ThanksTzong
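
Several comments above ask how to read stdout and stderr at the same time. A minimal asyncio sketch of that idea (assumes Python 3.7+; the inline child command is just a stand-in that writes to both streams):

import asyncio
import sys

async def stream_lines(stream, label):
    # Read until EOF, printing each line as it arrives.
    while True:
        line = await stream.readline()
        if not line:
            break
        print(label, line.decode(), end="")

async def run(cmd):
    proc = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE)
    # Drain both pipes concurrently so neither fills up and blocks the child.
    await asyncio.gather(stream_lines(proc.stdout, "stdout:"),
                         stream_lines(proc.stderr, "stderr:"))
    return await proc.wait()

cmd = [sys.executable, "-u", "-c",
       "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"]
asyncio.run(run(cmd))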

To print subprocess' output line-by-line as soon as its stdout buffer is flushed in Python 3:

from subprocess import Popen, PIPE, CalledProcessError

with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='') # process line here

if p.returncode != 0:
    raise CalledProcessError(p.returncode, p.args)

Notice: you do not need p.poll() -- the loop ends when EOF is reached. And you do not need iter(p.stdout.readline, '') -- the read-ahead bug is fixed in Python 3.

See also, Python: read streaming input from subprocess.communicate().

Consider answered 4/2, 2015 at 10:36 Comment(34)
This solution worked for me. The accepted solution given above just went on printing blank lines for me.Vacla
I did have to add sys.stdout.flush() to get prints immediately.Vacla
@Codename: you shouldn't need sys.stdout.flush() in the parent -- stdout is line-buffered if it is not redirected to a file/pipe and therefore printing line flushes the buffer automatically. You don't need sys.stdout.flush() in the child too -- pass -u command-line option instead.Consider
@ J.F. Sebastian Sorry, I should have mentioned that I am redirecting the output to a file.Vacla
@Codename: if it is redirected to a file then why do you need sys.stdout.flush()? Are you monitoring the file with tail -f? Have you considered check_call(cmd, stdout=file_object) instead?Consider
@ J.F. Sebastian - yes I wanted to have the ability to do a tail on the file. Also, I want to be able to dump the output via command line in the usual way with the > operator and not have a coded filename.Vacla
@Codename: if you want to use > then run python -u your-script.py > some-file. Notice: -u option that I've mentioned above (no need to use sys.stdout.flush()).Consider
I have a couple of constraints. Not everyone using my script will have python command aliased to python 3 binary and I want to keep it as simple as possible for them. Does sys.stdout.flush() have any major disadvantages that I should be concerned of?Vacla
@Codename: Sprinkling your code with sys.stdout.flush() may affect performance and it is error-prone. I don't see how -u is related to python command being aliased to whatever.Consider
Yes, you are right about the -u. But that would mean one more arg that all the users would have to add while running, right...Vacla
@Codename: no, it does not mean that. I've answered the question as stated. What to do to accommodate additional requirements depends on specifics. If you want unbuffered output in all cases; replace sys.stdout with an unbuffered object or redirect it. To avoid modifying the code, you could create a shell script that set appropriate command-line parameters, environment variables for python executable. As a quick and dirty hack, you could pass flush=True to the print() function.Consider
For my case (Python-based build scripts running under Jenkins) this happened to be the best answer at this page. But I think it is worth adding code for obtaining return code at the end - return_code = p.wait()Unchristian
@Unchristian no need to call p.wait()—it is called on exit from the with block. Use p.returncode.Consider
@J.F.Sebastian In my case p.returncode gave None - thus my comment :) (Python 3.6 on Windows and 3.4 on Ubuntu)Unchristian
@mvidelgauz: It can't be None outside the with-statement unless an exception occurred. Copy-paste the code as is. The indentation of the code is significant in Python.Consider
@J.F.Sebastian Thanks, I didn't realize that 'p' exists outside of with, i added it after for but inside withUnchristian
doesn't work for me, I'm trying to capture bully output on Kali LinuxFrawley
@Frawley "doesn't work" is too generic. What did you expect to happen, and what happens instead? Do you want to capture stderr? Does the utility change its output if it is redirected to a pipe (stdout is not a tty)? You can check by appending "| cat" to the command in your terminal.Consider
@Consider here are more details based on this code. test1 and test2 do not print anything during command execution. I waited 1 minute. test3, test4, and test5 cause bully to print the usage and exit.Frawley
@Frawley test3-5 are broken (a list arg is wrong with shell=True and I meant you to run the "| cat" variant manually in the terminal, to see the expected output). There are not enough details to know whether test1-2 are broken too (start with passing the command as a literal list, to make sure the arg are split as you expect). Try running your own script (that prints to stdout, stderr) as an exercise.Consider
@Consider These tests are now all printing nothing. The | cat variant doesn't produce any output as you probably expected. The sys.stdout.write variant provided in another answer prints something in realtime (but I had problems printing bytes).Frawley
@vault: if running manually the command with | cat from the command line doesn't produce any output then the python code with stdout=PIPE shouldn't produce any output too. sys.stdout.write() answer shouldn't work at all if you use Python 3 (as my answer assumes) -- you should get TypeError. Try: cmd = [sys.executable, '-c', "import sys, time\nfor i in range(5):\n print(i, flush=True)\n print(i*i, file=sys.stderr)\n time.sleep(1)"]Consider
@Consider two numbers each second, 10 lines: 0 0 1 1 4 2 9 3 16 4Frawley
@Frawley it is the expected behaviorConsider
@Consider is there a way with Popen to retrieve the output line by line, with such commands?Frawley
@Frawley yes, just follow the linksConsider
does this code print errors too? or should I add stderr=subprocess.PIPE?Glynis
@kr1p to read 2 streams (stdout + stderr) at the same time but separately, a different approach should be used (asyncio). You can merge streams with stderr=STDOUT to keep the current (simple) approachConsider
Is anyone on windows? This isn't working for me. It just hangs until the process is complete and then it prints all at once..Schultz
How to handle KeyboardInterrupt in this case?Raye
@MichaelPacheco If it was caused by Ctrl-C in the shell (SIGINT signal is sent to the foreground process group), then there is nothing to handle: the child process dies by default.Consider
@Consider this happens to other signals too? Like SIGTERMRaye
@MichaelPacheco if you send it to the process group, then yes.Consider
Is it possible to remove the previously printed line? With a progress bar, each progress update is printed on its own line, which fills the terminal output every second.Goodrum

OK, I managed to solve it without threads (any suggestions on why using threads would be better are appreciated) by using a snippet from this question: Intercepting stdout of a subprocess while it is running

import subprocess
import sys

def execute(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

    # Poll process for new output until finished
    while True:
        nextline = process.stdout.readline()
        if nextline == '' and process.poll() is not None:
            break
        sys.stdout.write(nextline)
        sys.stdout.flush()

    output = process.communicate()[0]
    exitCode = process.returncode

    if exitCode == 0:
        return output
    else:
        raise ProcessException(command, exitCode, output)
Whitney answered 11/12, 2010 at 18:16 Comment(13)
Merging ifischer's and tokland's code works quite well (I had to change print line, to sys.stdout.write(nextline); sys.stdout.flush(), otherwise it would print out every two lines). Then again, this is using IPython's Notebook interface, so maybe something else was happening - regardless, explicitly calling flush() works.Chavis
Mister, you're my lifesaver! It's really strange that this kind of thing isn't built into the library itself; if I write a CLI app, I want to show everything that's being processed in the loop instantly.Mackie
Can this solution be modified to constantly print both output and errors? If I change stderr=subprocess.STDOUT to stderr=subprocess.PIPE and then call process.stderr.readline() from within the loop, I seem to run afoul of the very deadlock that is warned about in the documentation for the subprocess module.Woody
@DavidCharles I think what you're looking for is stdout=subprocess.PIPE,stderr=subprocess.STDOUT this captures stderr, and I believe (but I've not tested) that it also captures stdin.Latent
thanks for waiting for exit code. Didn't know how to work it outBuck
@VitalyIsaev: there is no need to poll the exit status in the loop, you can do it after the loop using rc = process.wait()Consider
Hi all, a small doubt: we break when process.poll() is not None; does that mean poll() returns None while the process is still running, and something else once it has finished?Gloam
We are using stdout=subprocess.PIPE and stderr=subprocess.STDOUT here; can anyone give me a good reference on these and on what happens with different combinations, for instance stdout=subprocess.STDOUT, stderr=subprocess.PIPE, etc.?Gloam
execute("ls") runs forever and prints '' all the time. I ran this command on Python 3 and I needed to add str() there: nextline = str(process.stdout.readline()). Any idea why it doesn't want to quit?Denbighshire
@F1sher: readline is probably returning b"" instead of the "" used in the code above. Try using if nextline == b"" and...Sochi
the problem with this is that it blocks while waiting for output, making it so you cannot do work (like look for timeout or w/e) until the program outputs another line on stdout/stderr. For a full solution you can use threading/queues a la #375927Pellucid
How does this work without creating scroll bars and identical output? It seems to just stamp and update the screen, yet if I choose which lines to yield, then it begins to scroll with the selected lines over and over again. If I let it run, then it just "updates" the existing screen. Any insight would be greatly appreciated.Wagner
For programs with extremely fast output this methodology was not printing all of the lines. It appears that if a program produces output fast enough that multiple lines are printed before the next call to readline, those lines will be dropped.Scrutinize

There is actually a really simple way to do this when you just want to print the output:

import subprocess
import sys

def execute(command):
    subprocess.check_call(command, shell=True, stdout=sys.stdout, stderr=subprocess.STDOUT)

Here we're simply pointing the subprocess at our own stdout, and using the existing success-or-exception API.
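
check_call() raises subprocess.CalledProcessError on a nonzero exit status, so usage can mirror the question's example:

try:
    execute("mvn clean install")
except subprocess.CalledProcessError as e:
    print("Command failed with exit code", e.returncode)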

Hypallage answered 10/10, 2019 at 23:1 Comment(18)
This solution is simpler and cleaner than @tokland's solution, for Python 3.6. I noticed that the shell=True argument is not necessary.Bron
Good catch, Good Will. Removed shell=TrueHypallage
Very astute, and works perfectly with little code. Maybe you should redirect the subprocess stderr to sys.stderr too?Renny
Manu you certainly can. I didn't, here, because the attempt in the question was redirecting stderr to stdout.Hypallage
Can you explain what the difference is between sys.stdout and subprocess.STDOUT?Antifederalist
Sure, @RonSerruya. sys.stdout is a File object which allows normal writing operations. subprocess.STDOUT is a special value explicitly used to redirect stderr to the same output as stdout. Conceptually, you're saying that you want both feeds to go to the same place, rather than passing in the same value twice.Hypallage
THANK YOU! I've been browsing all of the stackoverflow subprocess questions and this is the only thing that worked.Anterior
Obviously the cleanest solution. It lets you forward the stderr as well. For click cli: subprocess.check_call(command, stdout=click.get_binary_stream("stdout"), stderr=click.get_binary_stream("stderr")). Brilliant, thanks @AndrewRingGalibi
Why would you redirect stderr to stdout, though? That seems like a dumb thing to do. Just leave both stdout and stderr alone, and the process will inherit the standard file descriptors from Python. (That means Python has no way to capture what's being printed, but in this scenario, you are not doing that.)Hamite
@Hamite In this specific case, because that's what was asked about in the question. In a more general sense, sometimes you want a single stream of output, perhaps for unified processing, or perhaps you want to reserve stderr for your own errors, not the errors in the subprocess. It's dependent on the specifics of your use-case.Hypallage
@GoodWill Look again, this code is broken on Python 3.8; you must once again use shell=True, or else Python raises FileNotFoundError with a stack trace.Supposititious
Keeping shell=True allows you to pass the command in one go. I would suggest against doing this (see why here), and just ensure that command is a list of strings that make up your command. So cp ../example.txt example.txt should be fed as ["cp", "../example.txt", "example.txt"].Slight
Never use shell=True. It needlessly invokes an extra shell process to call your program.Goodrum
Can you explain why any/all of these arguments are required? As many people pointed out, some are harmful and don't seem to have any impact on the solution.Stockist
@Stockist Shell and stderr -> stdout are done to mirror the configuration in the question. Harmful is a stretch as a general statement, though there are scenarios where it can be problematic, there are also others where it is needed. As with most things, the right solution is driven by context.Hypallage
I just want to get the output displayed on the screen, not saved in a file. I tried out this code snippet and it leads to the error UnsupportedOperation: fileno, raised at c2pwrite = stdout.fileno(). So I think it is complaining that I don't provide a file to save the output in. Can you share how I can get the output displayed simply on the screen? tnxDesiderate
This is a blocking call, it doesn't seem to run in the background.Gaffrigged
@Gaffrigged Correct. The question isn't asking about a background process.Hypallage

In Python >= 3.5, using subprocess.run works for me:

import subprocess

cmd = 'echo foo; sleep 1; echo foo; sleep 2; echo foo'
subprocess.run(cmd, shell=True)

(getting the output during execution also works without shell=True) https://docs.python.org/3/library/subprocess.html#subprocess.run
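
For instance, a self-contained variant without a shell; the child inherits this script's stdout, so each line appears live (a sketch using a throwaway inline child script):

import subprocess
import sys

# The child inherits this script's stdout, so its lines appear as printed.
subprocess.run([sys.executable, "-u", "-c",
                "import time\nfor i in range(3):\n    print(i)\n    time.sleep(1)"])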

Trever answered 5/10, 2018 at 6:56 Comment(4)
This is not "during execution". The subprocess.run() call only returns when the subprocess has finished running.Hamite
Can you explain how it is not "during execution"? Something like >>> import subprocess; subprocess.run('top') also seems to print "during execution" (and top never finishes). Maybe I'm not grasping some subtle difference?Trever
If you redirect the output back to Python e.g. with stdout=subprocess.PIPE you can only read it after top finishes. Your Python program is blocked during the execution of the subprocess.Hamite
Right, that makes sense. The run method still works if you're only interested in seeing the output as it's generated. If you want to do something with the output in python asynchronously you are right that it doesn't work.Trever

@tokland: I tried your code and corrected it for Python 3.4 and Windows. dir.cmd is a simple dir command, saved as a cmd file.

import subprocess
c = "dir.cmd"

def execute(command):
    popen = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=1)
    lines_iterator = iter(popen.stdout.readline, b"")
    while popen.poll() is None:
        for line in lines_iterator:
            nline = line.rstrip()
            print(nline.decode("latin"), end="\r\n", flush=True)  # yield line

execute(c)
Nashner answered 24/12, 2014 at 16:3 Comment(1)
you could simplify your code. iter() and end='\r\n' are unnecessary. Python uses universal newlines mode by default i.e., any '\n' is translated to '\r\n' during printing. 'latin' is probably a wrong encoding, you could use universal_newlines=True to get text output in Python 3 (decoded using locale's preferred encoding). Don't stop on .poll(), there could be buffered unread data. If Python script is running in a console then its output is line-buffered; you can force line-buffering using -u option -- you don't need flush=True here.Consider

To answer the original question, the best way IMO is just redirecting the subprocess stdout directly to your program's stdout (optionally, the same can be done for stderr, as in the example below):

import sys
from subprocess import Popen

p = Popen(cmd, stdout=sys.stdout, stderr=sys.stderr)  # cmd: your command (e.g., a list of args)
p.communicate()
Dressmaker answered 31/10, 2018 at 22:36 Comment(1)
Not specifying anything for stdout and stderr does the same thing with less code. Though I suppose explicit is better than implicit.Hamite

For anyone trying the answers to this question to get the stdout from a Python script, note that Python buffers its stdout, and therefore it may take a while to see the output.

This can be rectified by adding the following after each stdout write in the target script:

sys.stdout.flush()
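
For example, a child script shaped like this will produce output the parent can read line by line (a minimal sketch; on Python 3, print("tick", i, flush=True) achieves the same without the separate flush call):

import sys
import time

for i in range(5):
    print("tick", i)
    sys.stdout.flush()  # push the line to the parent immediately
    time.sleep(1)
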
Higa answered 21/6, 2014 at 12:40 Comment(6)
But running Python as a subprocess of Python is crazy in the first place. Your script should simply import the other script; look into multiprocessing or threading if you need parallelized execution.Hamite
@triplee There are several scenarios in which running Python as a subprocess of Python is appropriate. I have a number of python batch scripts that I wish to run sequentially, daily. These can be orchestrated by a master Python script that initiates the execution, and emails me if the child script fails. Each script is sandboxed from the other - no naming conflicts. I'm not parallelising so multiprocessing and threading aren't relevant.Higa
You could also start the other Python program using a different Python executable than the one the main program is running in, e.g., subprocess.run(["/path/to/python/executable", "pythonProgramToRun.py"])Dissection
You can also use the PYTHONUNBUFFERED env var or launch Python with -u to avoid this behaviorDobla
@Hamite what if the other Python script is executed on another machine?Holp
Then you are not running Python as a direct subprocess of Python. Having said that, there are situations where it is useful or even mandatory to run Python as a subprocess of itself (for example, if the subprocess needs to catch signals independently of the parent) but I don't think this is one of them.Hamite

In case someone wants to read from both stdout and stderr at the same time using threads, this is what I came up with:

import threading
import subprocess
import Queue  # Python 2; on Python 3 this module is named queue
from time import sleep

class AsyncLineReader(threading.Thread):
    def __init__(self, fd, outputQueue):
        threading.Thread.__init__(self)

        assert isinstance(outputQueue, Queue.Queue)
        assert callable(fd.readline)

        self.fd = fd
        self.outputQueue = outputQueue

    def run(self):
        map(self.outputQueue.put, iter(self.fd.readline, ''))

    def eof(self):
        return not self.is_alive() and self.outputQueue.empty()

    @classmethod
    def getForFd(cls, fd, start=True):
        queue = Queue.Queue()
        reader = cls(fd, queue)

        if start:
            reader.start()

        return reader, queue


process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)  # command: your shell command string
(stdoutReader, stdoutQueue) = AsyncLineReader.getForFd(process.stdout)
(stderrReader, stderrQueue) = AsyncLineReader.getForFd(process.stderr)

# Keep checking queues until there is no more output.
while not stdoutReader.eof() or not stderrReader.eof():
   # Process all available lines from the stdout Queue.
   while not stdoutQueue.empty():
       line = stdoutQueue.get()
       print 'Received stdout: ' + repr(line)

       # Do stuff with stdout line.

   # Process all available lines from the stderr Queue.
   while not stderrQueue.empty():
       line = stderrQueue.get()
       print 'Received stderr: ' + repr(line)

       # Do stuff with stderr line.

   # Sleep for a short time to avoid excessive CPU use while waiting for data.
   sleep(0.05)

print "Waiting for async readers to finish..."
stdoutReader.join()
stderrReader.join()

# Close subprocess' file descriptors.
process.stdout.close()
process.stderr.close()

print "Waiting for process to exit..."
returnCode = process.wait()

if returnCode != 0:
   raise subprocess.CalledProcessError(returnCode, command)

I just wanted to share this, as I ended up on this question trying to do something similar, but none of the answers solved my problem. Hopefully it helps someone!

Note that in my use case, an external process kills the process that we Popen().

Toxophilite answered 11/7, 2016 at 0:36 Comment(1)
I've had to use something almost exactly like this for Python 2. While something like this should have been provided in the standard library, it is not, so something like this is absolutely fine.Dunno

Building on @jfs's excellent answer, here is a complete working example for you to play with. Requires Python 3.7 or newer.

sub.py

import time

for i in range(10):
    print(i)
    time.sleep(1)

main.py

from subprocess import PIPE, Popen
import sys

with Popen([sys.executable, '-u', 'sub.py'], bufsize=1, stdout=PIPE, text=True
           ) as sub:
    for line in sub.stdout:
        print(line, end='')

-u is to avoid buffering the outputs - alternatively, print(i, flush=True) also works.

Manage answered 22/8, 2022 at 22:6 Comment(2)
Flush did it for my prints during a while loop.Clamatorial
Simplest approach that worked for me -- edited to remove flush=True requirement - several comments say it, none put it in an answer.Oversupply

None of the answers here addressed all of my needs.

  1. No threads for stdout (no Queues, etc, either)
  2. Non-blocking as I need to check for other things going on
  3. Use PIPE as I needed to do multiple things, e.g. stream output, write to a log file and return a string copy of the output.

A little background: I am using a ThreadPoolExecutor to manage a pool of threads, each launching a subprocess and running them concurrently. (In Python 2.7, but this should work in newer 3.x as well.) I don't want to use threads just for output gathering, as I want as many available as possible for other things (a pool of 20 processes would be using 40 threads just to run; 1 for the process thread and 1 for stdout... and more if you want stderr, I guess).

I'm stripping back a lot of exception handling and such here, so this is based on code that works in production. Hopefully I didn't ruin it in the copy and paste. Also, feedback very much welcome!

import os
import sys
import fcntl  # Unix-only
import subprocess
import time

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)  # cmd: your command

# Make stdout non-blocking when using read/readline
proc_stdout = proc.stdout
fl = fcntl.fcntl(proc_stdout, fcntl.F_GETFL)
fcntl.fcntl(proc_stdout, fcntl.F_SETFL, fl | os.O_NONBLOCK)

def handle_stdout(proc_stream, my_buffer, echo_streams=True, log_file=None):
    """A little inline function to handle the stdout business. """
    # fcntl makes readline non-blocking so it raises an IOError when empty
    try:
        for s in iter(proc_stream.readline, ''):   # replace '' with b'' for Python 3
            my_buffer.append(s)

            if echo_streams:
                sys.stdout.write(s)

            if log_file:
                log_file.write(s)
    except IOError:
        pass

# The main loop while subprocess is running
stdout_parts = []
while proc.poll() is None:
    handle_stdout(proc_stdout, stdout_parts)

    # ...Check for other things here...
    # For example, check a multiprocessing.Value('b') to decide whether to proc.kill()

    time.sleep(0.01)

# Not sure if this is needed, but run it again just to be sure we got it all?
handle_stdout(proc_stdout, stdout_parts)

stdout_str = "".join(stdout_parts)  # Just to demo

I'm sure there is overhead being added here but it is not a concern in my case. Functionally it does what I need. The only thing I haven't solved is why this works perfectly for log messages but I see some print messages show up later and all at once.

Izaak answered 24/4, 2019 at 23:6 Comment(0)

This PoC constantly reads the output from a process; the latest line can be accessed when needed. Only the last result is kept, all other output is discarded, which prevents the PIPE from growing out of memory:

import subprocess
import time
import threading
import Queue


class FlushPipe(object):
    def __init__(self):
        self.command = ['python', './print_date.py']
        self.process = None
        self.process_output = Queue.LifoQueue(0)
        self.capture_output = threading.Thread(target=self.output_reader)

    def output_reader(self):
        for line in iter(self.process.stdout.readline, b''):
            self.process_output.put_nowait(line)

    def start_process(self):
        self.process = subprocess.Popen(self.command,
                                        stdout=subprocess.PIPE)
        self.capture_output.start()

    def get_output_for_processing(self):
        line = self.process_output.get()
        print ">>>" + line


if __name__ == "__main__":
    flush_pipe = FlushPipe()
    flush_pipe.start_process()

    now = time.time()
    while time.time() - now < 10:
        flush_pipe.get_output_for_processing()
        time.sleep(2.5)

    flush_pipe.capture_output.join(timeout=0.001)
    flush_pipe.process.kill()

print_date.py

#!/usr/bin/env python
import time

if __name__ == "__main__":
    while True:
        print str(time.time())
        time.sleep(0.01)

Output: you can clearly see that there is only output at ~2.5 s intervals and nothing in between.

>>>1520535158.51
>>>1520535161.01
>>>1520535163.51
>>>1520535166.01
Casiecasilda answered 8/3, 2018 at 20:2 Comment(0)

This works at least in Python 3.4:

import subprocess

# cmd_list is your command as a list of strings, e.g. ["ls", "-l"]
process = subprocess.Popen(cmd_list, stdout=subprocess.PIPE)
for line in process.stdout:
    print(line.decode().strip())
Thermometry answered 20/7, 2018 at 1:36 Comment(1)
This has the problem that it blocks in the loop until the process has finished running.Hamite

Use the -u Python option in the child's command with subprocess.Popen() if you want to print from stdout while the process is running: -u disables the child Python interpreter's output buffering. (shell=True is only needed if you pass the command as a single shell string, and it comes with the usual risks.)
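
A minimal sketch of this, assuming Python 3.7+ (for text=True) and a hypothetical child script child_script.py:

import subprocess
import sys

# -u disables the child interpreter's output buffering,
# so its lines arrive while the process is still running.
proc = subprocess.Popen([sys.executable, "-u", "child_script.py"],
                        stdout=subprocess.PIPE, text=True)
for line in proc.stdout:
    print(line, end="")
proc.wait()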

Fromma answered 30/9, 2022 at 12:6 Comment(0)
import time
import sys
import subprocess
import threading
import queue

cmd='esptool.py --chip esp8266 write_flash -z 0x1000 /home/pi/zero2/fw/base/boot_40m.bin'
cmd2='esptool.py --chip esp32 -b 115200 write_flash -z 0x1000 /home/pi/zero2/fw/test.bin'
cmd3='esptool.py --chip esp32 -b 115200 erase_flash'

class ExecutorFlushSTDOUT(object):
    def __init__(self,timeout=15):
        self.process = None
        self.process_output = queue.Queue(0)
        self.capture_output = threading.Thread(target=self.output_reader)
        self.timeout=timeout
        self.result=False
        self.validator=None
        
    def output_reader(self):
        start=time.time()
        while self.process.poll() is None and (time.time() - start) < self.timeout:
            try:
                if not self.process_output.full():
                    line=self.process.stdout.readline()
                    if line:
                        line=line.decode().rstrip("\n")
                        start=time.time()
                        self.process_output.put(line)
                        if self.validator:
                            if self.validator in line: print("Valid");self.result=True

            except:pass
        self.process.kill()
        return
            
    def start_process(self,cmd_list,callback=None,validator=None,timeout=None):
        if timeout: self.timeout=timeout
        self.validator=validator
        self.process = subprocess.Popen(cmd_list,stdout=subprocess.PIPE,stderr=subprocess.PIPE,shell=True)
        self.capture_output.start()
        line=None
        self.result=False
        while self.process.poll() is None:
            try:
                if not self.process_output.empty():
                    line = self.process_output.get()
                if line:
                    if callback:callback(line)
                    #print(line)
                    line=None
            except:pass                
        error = self.process.returncode
        if error:
            print("Error Found",str(error))
            raise RuntimeError(error)
        return self.result

execute = ExecutorFlushSTDOUT()

def liveOUTPUT(line):
    print("liveOUTPUT",line)
    try:
        if "Writing" in line:
            line=''.join([n for n in line.split(' ')[3] if n.isdigit()])
            print("percent={}".format(line))
    except Exception as e:
        pass
    


result=execute.start_process(cmd2,callback=liveOUTPUT,validator="Hash of data verified.")

print("Finish",result)

Siret answered 11/12, 2021 at 19:22 Comment(0)

Simple is better than complex.

The os library has the built-in function os.system(). You can execute your command with it and see the output:

import os
os.system("python --version")
# Output
"""
Python 3.8.6
0
"""

After the version output, the return value, 0, is also shown (it is os.system()'s return value, echoed by the interactive interpreter).
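
The return value can also be captured and checked; a minimal sketch (on Unix it is the encoded wait status, so 0 means success):

import os

status = os.system("python --version")
if status != 0:
    print("command failed with status", status)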

Suitor answered 14/8, 2021 at 0:20 Comment(3)
In this example you cannot process output?Johns
You should check from documentation: docs.python.org/3/library/os.html#os.systemSuitor
@EinoMäkitalo The question asks about constantly printing. If your script file is different from the actual executed file, it works. If you run the code you will see the console output as well. The docs say: Changes to sys.stdin, etc. are not reflected in the environment of the executed command. If command generates any output, it will be sent to the interpreter standard output stream.Suitor

In Python 3.6 I used this:

import subprocess

cmd = "command"
output = subprocess.call(cmd, shell=True)
print(output)
Nichrome answered 18/4, 2019 at 10:24 Comment(1)
This is not an answer to this particular question. Waiting for the subprocess to finish before obtaining its output is specifically and precisely what the OP is trying to avoid. The old legacy function subprocess.call() has some warts which are fixed by newer functions; in Python 3.6 you would generally use subprocess.run() for this; for convenience, the older wrapper function subprocess.check_output() is also still available - it returns the actual output from the process (this code would print the exit code only, not the output).Hamite
