I recently came across a situation where I wanted to capture the output of a subprocess started by a Python script, but also let it print to the terminal normally. An example of where this may be useful is with something like curl, where progress is output to stderr (with the -o option). In an interactive program, you may want to show the user that progress information, but also capture it for parsing in your script.

By default, subprocess.run does not capture any output, but the subprocess does print to the terminal. Passing stdout=subprocess.PIPE, stderr=subprocess.STDOUT to subprocess.run captures the output but does not let the subprocess print, so you don't see any output until the subprocess has completed. Redirecting sys.stdout or sys.stderr doesn't work either, because that only replaces the Python script's stdout or stderr; it has no effect on the subprocess's.
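For reference, the blocking capture approach described above looks roughly like this (the URL and output file are just placeholders):

```python
import subprocess

# Captures curl's progress output (stderr, merged into stdout here),
# but nothing appears on the terminal until curl has finished.
result = subprocess.run(
    ["curl", "-o", "out.bin", "https://example.com/file"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)
print(result.stdout.decode())
```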
The only way to accomplish this seems to be to start the subprocess with the non-blocking subprocess.Popen, poll for available output, and both print it and accumulate it in a variable. The code shown here requires the selectors module, which is only available in Python 3.4+.
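In outline, such a solution might look something like this on a POSIX system (a rough sketch, not the exact snippet from the post; the curl command at the bottom is only a placeholder):

```python
import selectors
import subprocess
import sys

def run_and_capture(cmd):
    """Run cmd, echo its output live, and return the accumulated output."""
    process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    sel = selectors.DefaultSelector()
    sel.register(process.stdout, selectors.EVENT_READ)
    sel.register(process.stderr, selectors.EVENT_READ)

    captured = []
    while sel.get_map():  # loop until both pipes have hit EOF
        for key, _ in sel.select():
            data = key.fileobj.read1(8192)
            if not data:
                # EOF on this pipe; stop watching it.
                sel.unregister(key.fileobj)
                continue
            text = data.decode(errors="replace")
            # Echo to the matching terminal stream and keep a copy.
            stream = sys.stdout if key.fileobj is process.stdout else sys.stderr
            stream.write(text)
            stream.flush()
            captured.append(text)

    sel.close()
    process.wait()
    return "".join(captured)

# Example: show curl's progress bar live while also capturing it.
output = run_and_capture(["curl", "-o", "out.bin", "https://example.com/file"])
```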
Thanks for this code snippet.
I'm using this to call bash commands from Python. It works great when you don't need to keep the bash context, by passing it ["bash", "-c", statement] where statement is the bash command. However, I'd like to keep a bash subprocess running and feed it commands whenever I want. I've tried calling your snippet with only ["bash"] so it stays open, adding stdin as a PIPE, and then writing commands to stdin, but it locks up. I don't know why that happens.