Constantly print Subprocess output while process is running

In this post, my aim is to give an overview of how to constantly print a subprocess's output while the process is running, collecting several approaches you can follow at any time and adapt to your own scripts.

Constantly print Subprocess output while process is running

To launch programs from my Python scripts, I'm using the following method:

import subprocess

def execute(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    output = process.communicate()[0]
    exitCode = process.returncode
    if exitCode == 0:
        return output
    else:
        raise ProcessException(command, exitCode, output)

So when I launch a process like Process.execute("mvn clean install"), my program waits until the process is finished, and only then do I get the complete output. This is annoying if I'm running a process that takes a while to finish.

Can I let my program write the process output line by line, by polling the process output in a loop before it finishes?

I found this article which might be related.

Answer #1:

You can use iter to process lines as soon as the command outputs them: lines = iter(fd.readline, ""). Here’s a full example showing a typical use case (thanks to @jfs for helping out):

from __future__ import print_function # Only Python 2.x
import subprocess
def execute(cmd):
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)
# Example
for path in execute(["locate", "a"]):
    print(path, end="")
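As a usage sketch (re-declaring the generator from above so the snippet is self-contained), note that a failure only surfaces once the stream is exhausted: lines already produced are still yielded, and then CalledProcessError is raised. The child command here is a hypothetical one-liner chosen just to demonstrate this.

```python
import subprocess
import sys

def execute(cmd):
    # Same generator as above: yield lines as they arrive, then check the exit code.
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)

# A child that prints one line and then exits with a non-zero code:
try:
    for line in execute([sys.executable, "-c", "print('ok'); raise SystemExit(2)"]):
        print(line, end="")
except subprocess.CalledProcessError as e:
    print("failed with code", e.returncode)
```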
Answered By: tokland

Answer #2:

OK, I managed to solve it without threads (any suggestions on why using threads would be better are appreciated) by using a snippet from this question: Intercepting stdout of a subprocess while it is running

import subprocess
import sys

def execute(command):
    # universal_newlines=True makes readline() return str on Python 3,
    # so the comparison against '' below works on both Python 2 and 3.
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT, universal_newlines=True)
    # Poll process for new output until finished
    while True:
        nextline = process.stdout.readline()
        if nextline == '' and process.poll() is not None:
            break
        sys.stdout.write(nextline)
        sys.stdout.flush()
    output = process.communicate()[0]
    exitCode = process.returncode
    if exitCode == 0:
        return output
    else:
        raise ProcessException(command, exitCode, output)
Answered By: Wolkenarchitekt

Answer #3:

To print subprocess’ output line-by-line as soon as its stdout buffer is flushed in Python 3:

from subprocess import Popen, PIPE, CalledProcessError
with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='') # process line here
if p.returncode != 0:
    raise CalledProcessError(p.returncode, p.args)

Notice: you do not need p.poll(), since the loop ends when EOF is reached, and you do not need iter(p.stdout.readline, ''), since the read-ahead bug is fixed in Python 3.

See also, Python: read streaming input from subprocess.communicate().

Answered By: jfs

Answer #4:

There is actually a really simple way to do this when you just want to print the output:

import subprocess
import sys
def execute(command):
    subprocess.check_call(command, stdout=sys.stdout, stderr=subprocess.STDOUT)

Here we're simply pointing the subprocess at our own stdout, and using the existing success/exception API.
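A usage sketch of this helper: output streams straight through to our stdout, and a non-zero exit raises CalledProcessError, which the caller can catch. The child command here is a hypothetical one-liner used only for illustration.

```python
import subprocess
import sys

def execute(command):
    # Inherit our stdout so the child's output appears immediately;
    # check_call raises CalledProcessError on a non-zero exit code.
    subprocess.check_call(command, stdout=sys.stdout, stderr=subprocess.STDOUT)

try:
    execute([sys.executable, "-c", "import sys; print('working'); sys.exit(3)"])
except subprocess.CalledProcessError as e:
    print("command failed with exit code", e.returncode)
```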

Answered By: Andrew Ring

Answer #5:

@tokland

I tried your code and corrected it for Python 3.4 and Windows (dir.cmd is a simple dir command, saved as a cmd file):

import subprocess
c = "dir.cmd"

def execute(command):
    popen = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=1)
    lines_iterator = iter(popen.stdout.readline, b"")
    while popen.poll() is None:
        for line in lines_iterator:
            nline = line.rstrip()
            print(nline.decode("latin"), end="\r\n", flush=True)  # yield line

execute(c)
Answered By: user3759376

Answer #6:

In case someone wants to read from both stdout and stderr at the same time using threads, this is what I came up with:

import threading
import subprocess
import Queue  # Python 2 (renamed to queue in Python 3)
from time import sleep
class AsyncLineReader(threading.Thread):
    def __init__(self, fd, outputQueue):
        threading.Thread.__init__(self)
        assert isinstance(outputQueue, Queue.Queue)
        assert callable(fd.readline)
        self.fd = fd
        self.outputQueue = outputQueue
    def run(self):
        map(self.outputQueue.put, iter(self.fd.readline, ''))
    def eof(self):
        return not self.is_alive() and self.outputQueue.empty()
    @classmethod
    def getForFd(cls, fd, start=True):
        queue = Queue.Queue()
        reader = cls(fd, queue)
        if start:
            reader.start()
        return reader, queue
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(stdoutReader, stdoutQueue) = AsyncLineReader.getForFd(process.stdout)
(stderrReader, stderrQueue) = AsyncLineReader.getForFd(process.stderr)
# Keep checking queues until there is no more output.
while not stdoutReader.eof() or not stderrReader.eof():
   # Process all available lines from the stdout Queue.
   while not stdoutQueue.empty():
       line = stdoutQueue.get()
       print 'Received stdout: ' + repr(line)
       # Do stuff with stdout line.
   # Process all available lines from the stderr Queue.
   while not stderrQueue.empty():
       line = stderrQueue.get()
       print 'Received stderr: ' + repr(line)
       # Do stuff with stderr line.
   # Sleep for a short time to avoid excessive CPU use while waiting for data.
   sleep(0.05)
print "Waiting for async readers to finish..."
stdoutReader.join()
stderrReader.join()
# Close subprocess' file descriptors.
process.stdout.close()
process.stderr.close()
print "Waiting for process to exit..."
returnCode = process.wait()
if returnCode != 0:
   raise subprocess.CalledProcessError(returnCode, command)

I just wanted to share this, as I ended up on this question trying to do something similar, but none of the answers solved my problem. Hopefully it helps someone!

Note that in my use case, an external process kills the process that we Popen().

Answered By: Will

Answer #7:

In Python >= 3.5, using subprocess.run works for me:

import subprocess
cmd = 'echo foo; sleep 1; echo foo; sleep 2; echo foo'
subprocess.run(cmd, shell=True)

(getting the output during execution also works without shell=True)
https://docs.python.org/3/library/subprocess.html#subprocess.run
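A minimal sketch of why this streams live: when stdout is not redirected to a pipe, the child inherits our stdout and writes to it directly as output is produced, rather than the parent collecting it at the end. The one-line child command here is a hypothetical example.

```python
import subprocess
import sys

# Without stdout=PIPE the child inherits our stdout, so its output
# appears as it is produced; run() still returns the exit code.
result = subprocess.run([sys.executable, "-c", "print('streamed live')"])
print("exit code:", result.returncode)
```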

Answered By: user7017793

Answer #8:

For anyone trying the answers to this question to get the stdout of a Python script: note that Python buffers its stdout, so it may take a while before the output appears.

This can be rectified by adding the following after each stdout write in the target script:

sys.stdout.flush()
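Alternatively, buffering can be disabled for the whole child process by running it with python -u (or setting PYTHONUNBUFFERED=1 in its environment), so no explicit flush calls are needed in the target script. A sketch, using an inline child script as a stand-in for the real target:

```python
import subprocess
import sys

# Hypothetical child script that writes several lines with pauses in between.
child_code = "import time\nfor i in range(3):\n    print('tick', i)\n    time.sleep(0.1)\n"

# The -u flag forces the child Python to run unbuffered, so each print
# reaches the pipe immediately and the parent can echo it right away.
proc = subprocess.Popen([sys.executable, "-u", "-c", child_code],
                        stdout=subprocess.PIPE, universal_newlines=True)
lines = []
for line in proc.stdout:
    print(line, end="")
    lines.append(line)
proc.stdout.close()
proc.wait()
```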
Answered By: user1379351
