Start a background process in Python
Asked Answered
W

11

406

I'm trying to port a shell script to the much more readable python version. The original shell script starts several processes (utilities, monitors, etc.) in the background with "&". How can I achieve the same effect in python? I'd like these processes not to die when the python scripts complete. I am sure it's related to the concept of a daemon somehow, but I couldn't find how to do this easily.

Wellmeaning answered 28/7, 2009 at 18:56 Comment(4)
The question this really duplicates is How to launch and run external script in background?. Cheers ;)Discontinuance
Hi Artem. Please accept Dan's answer because (1) it has more votes, (2) subprocess.Popen() has been the recommended way since 2010 (we are in 2015 now) and (3) the duplicate question redirecting here also has an accepted answer about subprocess.Popen(). Cheers :-)Discontinuance
@olibre In fact the answer should be subprocess.Popen("<command>"), with the <command> file led by a suitable shebang. Works perfectly for me (Debian) with bash and python scripts; it implicitly shells out and survives its parent process. stdout goes to the same terminal as the parent's. So this works much like & in a shell, which was the OP's request. But hell, all the answers make it look very complex, while a little testing showed it works in no time ;)Mooch
For background maybe see also https://mcmap.net/q/25477/-running-bash-commands-in-pythonHeist
D
106

Note: This answer is less current than it was when posted in 2009. Using the subprocess module shown in other answers is now the approach recommended in the docs:

(Note that the subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using these functions.)


If you want your process to start in the background you can either use system() and call it in the same way your shell script did, or you can spawn it:

import os
# spawnl does not search PATH, so give the full path; the arguments after
# the mode must include at least the program name (argv[0]).
os.spawnl(os.P_NOWAIT, '/path/to/some_long_running_command', 'some_long_running_command')

(or, alternatively, on Windows you may try the less portable os.P_DETACH flag, which also detaches the new process from the calling process's console).

See the documentation here.

Depolymerize answered 28/7, 2009 at 19:5 Comment(10)
Remark: you must specify the full path to the executable. This function will not use the PATH variable and the variant that does use it is not available under Windows.Evilminded
straight up crashes python for me.Agnew
os.P_DETACH has been replaced with os.P_NOWAIT.Semivowel
From the docs: "Note that the subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using these functions" - use the subprocess answer below instead.Ionone
os.spawn family can crash silently (for example, due to this bug). See the subprocess.Popen and subprocess.call replacements for them: docs.python.org/2/library/…Iodoform
Could the people suggesting using subprocess give us a hint how to detach a process with subprocess?Empale
How can I use Python script (say attach.py) to find a background process and redirect its IO so that attach.py can read from / write to some_long_running_prog in background?Feudalism
what if I want to run a command and then keep sending stuff to that process running in the background?Tamas
In 2021 I get ValueError: spawnv() arg 2 cannot be empty from the code above.Ewart
AttributeError: module 'os' has no attribute 'P_DETACH'Baronetcy
T
520

While jkp's solution works, the newer way of doing things (and the way the documentation recommends) is to use the subprocess module. For simple commands it's equivalent, but it offers more options if you want to do something complicated.

Example for your case:

import subprocess
subprocess.Popen(["rm","-r","some.file"])

This will run rm -r some.file in the background. Note that calling .communicate() on the object returned from Popen will block until it completes, so don't do that if you want it to run in the background:

import subprocess
proc = subprocess.Popen(["sleep", "30"])
proc.communicate()  # Will block for 30 seconds

See the documentation here.

Also, a point of clarification: "Background" as you use it here is purely a shell concept; technically, what you mean is that you want to spawn a process without blocking while you wait for it to complete. However, I've used "background" here to refer to shell-background-like behavior.
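
If, as the original question asks, the spawned processes should also keep running after the Python script itself exits, here is a minimal sketch for POSIX; the command name is a placeholder, and start_new_session plus the DEVNULL redirections are additions along the lines of the comments below:

import subprocess

# Detach the child from this script's session and drop its standard
# streams so it keeps running after the script exits (POSIX only;
# "some_long_running_command" is a placeholder).
subprocess.Popen(
    ["some_long_running_command"],
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True,
)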

Technics answered 28/8, 2011 at 21:47 Comment(21)
@Dan: How do I kill the process once it's running in the background? I want to run it for a while (it's a daemon that I interact with) and kill it when I'm done with it. The docs aren't helpful...Idoux
@Juan: If you're running on a unix system, you could use the kill command. You could also kill it from the task manager in windows.Technics
@Technics but don't I need to know the PID for that? Activity monitor/Task manager not an option (needs to happen programmatically).Idoux
ok so how do you force the process to background when you need the result of Popen() to write to its stdin?Smolensk
It is a misleading answer. "background" job is the property of a shell but Popen() does not use a shell to run commands by default. stdout=PIPE won't change it.Viera
@J.F.Sebastian: I interpreted it as "how can I create an independent process that doesn't stop the execution of the current program". How would you suggest I edit it to make this more explicit?Technics
@Dan: the correct answer is: use Popen() to avoid blocking the main thread and if you need a daemon then look at python-daemon package to understand how a well-defined daemon should behave. Your answer is ok if you remove everything starting with "But be wary" except for the link to subprocess' docs.Viera
@Technics are you sure that it's true that the last command will not run? AFAIK you can create a Popen object and access the pid property without blocking anything in the parent process.... see the answer by "f p" below (https://mcmap.net/q/25474/-start-a-background-process-in-python)Taro
@Technics proc = subprocess.Popen(["rm","-r","some.file"]), then to kill: proc.terminate()Measles
Actually Popen(["ls", "-a"], stdout=subprocess.PIPE) runs in the background. The problem is when you later use p.communicate() which brings the process to the foreground.Marianmariana
@Pithikos: do not use stdout=PIPE unless you read from the pipe while the process is running, otherwise the child process may hang forever if the corresponding OS pipe buffer fills up.Viera
How should I integrate variables? My command is something like this: command = 'python UploadService.py ' + var1 + ' ' + var2 + ' ' + var3. Doing subprocess.Popen([command]) does not workNightly
what if I want to run a command and then keep sending stuff to that process running in the background?Tamas
@Charlie Parker: See the documentation on subprocess. If that doesn't help, you might want to ask a new question.Technics
I've just edited this aggressively to fix errors that I noticed or were pointed out above by others. Hopefully you don't object to the changes. If you do, you of course have the power to rollback, but I'd appreciate if you'd also let me know why so we can discuss. If you're happy, might be worth flagging for the mods to clean up some of the now-obsolete comments in this thread (including this one).Dextro
I am not sure why, but this technique does not work for me where I am trying subprocess.Popen(["/some/path/to/hive","-f","/some/path/to_HQL/hql.dat"])Bowling
@MarkGinsburg What exactly doesn't work about it? Does it block the rest of your script instead of running in parallel? I don't know much about how python's changed in the years since I wrote this, but a good workaround might be to also use os.fork() on unixy systems.Technics
I was wrong... it did launch but the job did not show up (in a Unix system) when I typed 'jobs' --- but it was visible when I did 'ps -elf | grep aStringContainedinMyJob' So this technique fooled me, because I expected it to show up in 'jobs'.Bowling
@MarkGinsburg: Yeah, this doesn't really run the job "in the background" (which is a shell concept), this creates another process that runs the command independently. The shell doesn't even know about it. This resembles the background in that the job runs in parallel with the main program without blocking it, which is more properly a forked process. If you want the process's information in the shell, you can print out the PID and use that to look at the process's info with ps.Technics
I wonder why you suggest communicate() in place of wait()...Dividivi
To keep the program running even if the Python script is killed, add start_new_session=True for POSIXIconic
C
58

You probably want the answer to "How to call an external command in Python".

The simplest approach is to use the os.system function, e.g.:

import os
os.system("some_command &")

Basically, whatever you pass to the system function will be executed the same as if you'd passed it to the shell in a script.
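
Because the string is handed to a shell, the usual shell tools work here too; for example, if the command should also survive the terminal closing, a sketch along these lines (nohup and the redirection are assumptions of this sketch, and some_command is a placeholder):

import os

# nohup plus redirecting output keeps the child alive even after the
# terminal that launched the Python script is closed (POSIX shells;
# "some_command" is a placeholder).
os.system("nohup some_command > /dev/null 2>&1 &")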

Cantal answered 28/7, 2009 at 19:2 Comment(8)
IMHO, python scripts are usually written to be cross-platform, and if a simple cross-platform solution exists it's better to stick with it. You never know if you'll have to work with another platform in the future :) Or if someone else wants to migrate your script to their platform.Tingly
This command is synchronous (i.e. it always waits for a termination of the started process).Sent
@Tingly is os.system not portable?Radiopaque
@Tingly isn't the choice of running something in the background already positioning you in posix-land? What would you do on Windows? Run as a service?Radiopaque
how can I use this if I need to run a command from a specific folder?Onrush
@mrRobot: see https://mcmap.net/q/25522/-equivalent-of-shell-39-cd-39-command-to-change-the-working-directory for how to change the current working directory of the current Python script, which you can do immediately before executing the sub-command.Cantal
This will not work in windows. Eg, a launched window would close when script is done.Bertelli
@mrRobot: you can use os.system("cd somedir && some_command &")Downstage
R
40

I found this here:

On Windows (Win XP), the parent process will not finish until longtask.py has finished its work. It is not what you want in a CGI script. The problem is not specific to Python; in the PHP community the problems are the same.

The solution is to pass the DETACHED_PROCESS process creation flag to the underlying CreateProcess function in the Win API. If you happen to have pywin32 installed you can import the flag from the win32process module, otherwise you should define it yourself:

DETACHED_PROCESS = 0x00000008

import subprocess
import sys

pid = subprocess.Popen([sys.executable, "longtask.py"],
                       creationflags=DETACHED_PROCESS).pid
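
On recent Python 3 versions (3.7 and later, if memory serves) the same flag is also exposed by the subprocess module itself, so neither pywin32 nor a hand-defined constant is needed; a Windows-only sketch:

import subprocess
import sys

# Windows only: subprocess.DETACHED_PROCESS replaces the hand-defined
# 0x00000008 constant on recent Python 3 releases.
pid = subprocess.Popen([sys.executable, "longtask.py"],
                       creationflags=subprocess.DETACHED_PROCESS).pid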
Rodina answered 27/11, 2012 at 21:19 Comment(4)
+1 for showing how to retain the process id. And if anyone wants to kill the program later with the process id: #17857428Hear
This seems Windows onlyScarberry
any cross-platform solution for this?Nectarous
ValueError: creationflags is only supported on Windows platformsOthaothe
S
39

Use subprocess.Popen() with the close_fds=True parameter, which will allow the spawned subprocess to be detached from the Python process itself and continue running even after Python exits.

https://gist.github.com/yinjimmy/d6ad0742d03d54518e9f

import os, time, sys, subprocess

if len(sys.argv) == 2:
    time.sleep(5)
    print('track end')
    if sys.platform == 'darwin':
        subprocess.Popen(['say', 'hello'])
else:
    print('main begin')
    # relaunch this same script with the current interpreter
    subprocess.Popen([sys.executable, os.path.realpath(__file__), '0'], close_fds=True)
    print('main end')
Sepulchral answered 25/12, 2015 at 1:47 Comment(5)
In windows, it doesn't detach but using creationflags parameter worksGiddings
This solution leaves a subprocess as Zombie on Linux.Iggie
@Iggie this can be avoided by setting SIGCHLD to SIG_IGN: #16808103Tennyson
thanks @Jimmy, your answer is the ONLY solution that works for me.Tennyson
The close_fds=True option works by detaching the process, but it didn't return back to my Python program. Hoping to find an option that truly executes a process and sends it to the background and then returns back to the Python program.Naamann
K
16

Capture output and run in the background at the same time with threading

As mentioned in this answer, if you capture the output with stdout= and then try to read(), the process blocks.

However, there are cases where you need this. For example, I wanted to launch two processes that talk to each other over a port, and to save the stdout of each both to a log file and to stdout.

The threading module allows us to do that.

First, have a look at how to do the output redirection part alone in this question: Python Popen: Write to stdout AND log file simultaneously

Then:

main.py

#!/usr/bin/env python3

import os
import subprocess
import sys
import threading

def output_reader(proc, file):
    while True:
        byte = proc.stdout.read(1)
        if byte:
            sys.stdout.buffer.write(byte)
            sys.stdout.flush()
            file.buffer.write(byte)
        else:
            break

with subprocess.Popen(['./sleep.py', '0'], stdout=subprocess.PIPE, stderr=subprocess.PIPE) as proc1, \
     subprocess.Popen(['./sleep.py', '10'], stdout=subprocess.PIPE, stderr=subprocess.PIPE) as proc2, \
     open('log1.log', 'w') as file1, \
     open('log2.log', 'w') as file2:
    t1 = threading.Thread(target=output_reader, args=(proc1, file1))
    t2 = threading.Thread(target=output_reader, args=(proc2, file2))
    t1.start()
    t2.start()
    t1.join()
    t2.join()

sleep.py

#!/usr/bin/env python3

import sys
import time

for i in range(4):
    print(i + int(sys.argv[1]))
    sys.stdout.flush()
    time.sleep(0.5)

After running:

./main.py

stdout is updated every 0.5 seconds, two lines at a time, until it contains:

0
10
1
11
2
12
3
13

and each log file contains the respective log for a given process.

Inspired by: https://eli.thegreenplace.net/2017/interacting-with-a-long-running-child-process-in-python/

Tested on Ubuntu 18.04, Python 3.6.7.

Kookaburra answered 12/12, 2018 at 21:48 Comment(0)
R
13

You probably want to start investigating the os module for forking off separate processes (by opening an interactive session and issuing help(os)). The relevant functions are fork and any of the exec ones. To give you an idea on how to start, put something like this in a function that performs the fork (the function needs to take a list or tuple 'args' as an argument that contains the program's name and its parameters; you may also want to define stdin, stdout and stderr for the new process):

import os
import sys

try:
    pid = os.fork()
except OSError as e:
    ## some debug output
    sys.exit(1)
if pid == 0:
    ## eventually use os.putenv(..) to set environment variables
    ## os.execv(path, args) runs path with args as the new argv;
    ## args[0] is conventionally the program name
    os.execv(args[0], args)
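
A self-contained sketch of such a function; the name spawn_detached, the os.setsid() call and the /dev/null redirection are choices of this sketch rather than requirements (POSIX only):

import os

def spawn_detached(args):
    """Fork and exec args (args[0] must be the full path to the program)
    so the command keeps running after this script exits."""
    pid = os.fork()
    if pid == 0:                        # child
        os.setsid()                     # detach from the controlling terminal
        devnull = os.open(os.devnull, os.O_RDWR)
        os.dup2(devnull, 0)             # redirect stdin, stdout and stderr
        os.dup2(devnull, 1)
        os.dup2(devnull, 2)
        if devnull > 2:
            os.close(devnull)
        try:
            os.execv(args[0], args)     # replace the child with the command
        except OSError:
            os._exit(127)               # exec failed; leave the child quietly
    # parent: keep the pid; waitpid() it eventually if zombies matter to you
    return pid

# example: spawn_detached(["/bin/sleep", "60"])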
Rosalvarosalyn answered 28/7, 2009 at 19:12 Comment(4)
os.fork() is really useful, but it does have a notable downside of only being available on *nix.Zinfandel
The only problem with os.fork is that it won't work on win32.Depolymerize
More details about this approach: Creating a daemon the Python wayHaugen
You can also reach similar effects with threading: https://mcmap.net/q/25474/-start-a-background-process-in-python I think that might work on Windows.Kookaburra
S
2

Unlike some prior answers that use subprocess.Popen, this answer uses subprocess.run instead. The issue with using Popen is that if the process is not manually waited for until completion, a stale <defunct> entry remains in the Linux process table as seen by ps. These entries can add up.

In contrast, subprocess.run waits for the process to complete by design, so no such defunct entry remains in the process table. Because subprocess.run blocks, it can be run in a thread; the rest of the code continues after starting that thread, so the process effectively runs in the background.

import subprocess, threading

# your_command (a list of strings) and your_kwargs (a dict of extra keyword
# arguments) are placeholders for your own values.
kwargs = dict(stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=True, **your_kwargs)
threading.Thread(target=subprocess.run, args=(your_command,), kwargs=kwargs).start()

Note that subprocess.call also waits for the process to complete, and can be used similarly.

Spiritualize answered 16/6, 2023 at 19:19 Comment(0)
C
1

You can use

import os
pid = os.fork()
if pid == 0:
    # child process: put the code that should keep running here
    ...

The forked child (the branch where pid == 0) runs independently of the parent, which gives the background effect; note that os.fork() is only available on Unix-like systems.

Contingency answered 4/3, 2021 at 16:14 Comment(0)
U
1

I'm running Python 3.9.14 on Linux. I found the following worked for me in a similar situation:

import subprocess

cmd = "sleep 5 && ls /tmp >& ls.out &"
try: 
    runResult = subprocess.run(["bash", "-c", cmd])
except Exception as ex:
    print( f"Failed to run '{cmd}'" )
    if hasattr( ex, "message" ):
        print( ex.message )
    elif hasattr( ex, "strerror" ):
        print( ex.strerror)
    else:
        print( ex )

If you run the above and quickly do an ls in the current directory, you will find that the "ls.out" file doesn't yet exist. Wait a few more seconds and the file is there. So, the command continues to run after Python exits.

The 'runResult' has a 'returncode' field that indicates whether the program launched successfully or not. I do not know of a good way of later killing the process from within Python.

I was also able to take a more Pythonic approach: I have a shell script named "run5secs":

#!/bin/bash
sleep 5
ls

and I can run it in the background with:

import shlex
import subprocess

cmd = "sleep 5 && ls /tmp >& ls.out &"

logName = "./run5secs.out"
cmd = "./run5secs my1 your2"

try:
  f = open( logName, 'w' )
except Exception as ex:
    print( f"Failed to run '{cmd}'" )
    if hasattr( ex, "message" ):
        print( ex.message )
    elif hasattr( ex, "strerror" ):
        print( ex.strerror)
    else:
        print( ex )

args = shlex.split( cmd )

try: 
    cmdRes = subprocess.Popen( args, stdout=f,
                               stderr=subprocess.STDOUT,
                               universal_newlines=True )
except Exception as ex:
    print( f"Failed to run '{cmd}'" )
    if hasattr( ex, "message" ):
        print( ex.message )
    elif hasattr( ex, "strerror" ):
        print( ex.strerror)
    else:
        print( ex )

print( cmdRes )
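
As for stopping the process later from within Python (left open above): when Python starts the command itself via Popen, as in this second approach, the Popen object can be used to terminate it. A rough, self-contained sketch (the sleep command stands in for any long-running program):

import subprocess

# Start the child via Popen so Python owns it and can stop it later.
proc = subprocess.Popen(["sleep", "60"])
# ... later, when the background work is no longer needed ...
proc.terminate()   # ask it to exit; proc.kill() forces it
proc.wait()        # reap the child so no <defunct> entry is left behind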
Unmoral answered 9/2, 2024 at 18:53 Comment(0)
E
-1

I haven't tried this yet, but using .pyw files instead of .py files should help. .pyw files don't have a console, so in theory the script should not show a window and should work like a background process.

Edulcorate answered 10/2, 2022 at 2:16 Comment(1)
This does not provide an answer to the question. Once you have sufficient reputation you will be able to comment on any post; instead, provide answers that don't require clarification from the asker. - From ReviewJerald
