Is it possible to run a function in a subprocess without threading or writing a separate file/script?

import subprocess

def my_function(x):
    return x + 100

output = subprocess.Popen(my_function, 1)  # I would like to pass the function object and its arguments
print(output)
# desired output: 101

I have only found documentation on opening subprocesses using separate scripts. Does anyone know how to pass function objects or even an easy way to pass function code?

Lipcombe answered 12/1, 2010 at 3:49 Comment(1)
I believe that you are looking for the multiprocessing module.Whaley

I think you're looking for something more like the multiprocessing module:

http://docs.python.org/library/multiprocessing.html#the-process-class

The subprocess module is for spawning processes and doing things with their input/output - not for running functions.
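
For contrast, typical subprocess usage looks something like this (a sketch using subprocess.run, Python 3.7+, spawning a fresh interpreter and capturing its output; the command string is just an illustration):

import subprocess
import sys

# subprocess runs external commands and captures their I/O; it has no
# notion of passing a live Python function object to the child
result = subprocess.run(
    [sys.executable, "-c", "print(1 + 100)"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # 101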

Here is a multiprocessing version of your code:

from multiprocessing import Process, Queue

# must be a global function    
def my_function(q, x):
    q.put(x + 100)

if __name__ == '__main__':
    queue = Queue()
    p = Process(target=my_function, args=(queue, 1))
    p.start()
    result = queue.get()  # fetch the result before join(), or a large payload can deadlock
    p.join()  # this blocks until the process terminates
    print(result)
Remington answered 12/1, 2010 at 3:57 Comment(6)
You can use the processify decorator as a shortcut: gist.github.com/2311116 (a sketch of the idea is shown below)Uropod
I assume that this clones the Python interpreter and all of its environment for the subprocess?Pharmaceutics
Here is a fork of processify that works in python 3 and supports generator functions. gist.github.com/stuaxo/889db016e51264581b50Effieeffigy
Note that passing non-trivially large data through the queue can deadlock - always queue.get() before joining the process, otherwise it'll hang on trying to write to the queue while nothing is reading it.Stage
@Uropod I want to run a function in the background, but I have some resource limitations and cannot run it as many times as I want, so I'd like to queue the extra executions. Do you have any idea how I should do that? I have my question here. Could you please take a look at my question? Any help would be great!Odontograph
What if I don't want a result from it? Can I do it without passing a queue? Also, can my_function call another function while it's running in the process?Armed
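
For reference, here's a minimal sketch of a processify-style decorator along the lines of that gist (an illustration of the idea, not the gist's exact code; it assumes the Unix fork start method, since the nested target function can't be pickled under spawn):

import traceback
from functools import wraps
from multiprocessing import Process, Queue

def processify(func):
    # Run the decorated function in a child process and return its result
    @wraps(func)
    def wrapper(*args, **kwargs):
        q = Queue()

        def target():
            try:
                q.put((func(*args, **kwargs), None))
            except Exception:
                q.put((None, traceback.format_exc()))

        p = Process(target=target)  # relies on fork; spawn can't pickle the closure
        p.start()
        result, error = q.get()  # get before join to avoid the deadlock noted above
        p.join()
        if error is not None:
            raise RuntimeError(error)
        return result
    return wrapper

@processify
def my_function(x):
    return x + 100

print(my_function(1))  # 101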

You can use the standard Unix fork system call via os.fork(). fork() creates a new process with the same script running. In the new (child) process it returns 0, while in the old (parent) process it returns the process ID of the new process.

import os

child_pid = os.fork()
if child_pid == 0:
    print("New proc")
    os._exit(0)  # exit the child explicitly so it doesn't run the rest of the script
else:
    print("Old proc")
    os.waitpid(child_pid, 0)  # reap the child so it doesn't linger as a zombie

For a higher-level library that provides a portable abstraction for using multiple processes, there's the multiprocessing module. There's an article on IBM DeveloperWorks, Multiprocessing with Python, with a brief introduction to both techniques.
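
As a quick illustration of that portable abstraction, here's a sketch using multiprocessing.Pool, which hands the function's return value back directly:

from multiprocessing import Pool

def my_function(x):
    return x + 100

if __name__ == '__main__':  # required where workers are spawned (e.g. Windows)
    with Pool(processes=1) as pool:
        print(pool.apply(my_function, (1,)))  # 101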

Ottie answered 12/1, 2010 at 3:58 Comment(6)
I'm curious; why the downvote? Is there anything wrong in my answer?Ottie
Multiprocessing is not just a higher level wrapper around fork(), it's a multiplatform multiprocessing toolkit (which uses fork on unix). Which is important, because this means it runs on, say, Windows, while fork() does not. Edit: And this was the reason for the downvote, although I later decided it probably wasn't worth it. Too late to take it back, though. Edit2: Or rather, fork() being suggested when it's not cross-platform was the reason.Postprandial
@Devin, you can always take back a downvote you did, if you want to.Cenac
Edited to clarify that, then. I explicitly mentioned that fork is not portable; I generally will give non-portable answers along with information that they are non-portable, and let the questioner decide if that's sufficient for them. As I've edited my answer, you should be able to remove the downvote if you feel that I've improved it sufficiently; though no hard feelings if you don't, I just wanted to check to see what I'd gotten wrong.Ottie
@Alex, nope, you can't. After a certain amount of time passes, you can't take it back, until an edit occurs. Such an amount of time had passed before I rethought, thus the "too late" comment. Anyway, as I said, I had decided it wasn't worth it, so it's gone. I also do appreciate and understand your reasons, and I'm glad there'd be no hard feelings either way. :pPostprandial
@BrianCampbell I want to run a function in the background but I have some resource limitations and cannot run the function as many times that I want and want to queue the extra executions of the function. Do you have any idea on how I should do that? I have my question here. Could you please take a look at my question and see if you can give me some hints (or even better, an answer) on how I should do that?Odontograph

Brian McKenna's post above about multiprocessing is really helpful, but if you want to go down the threaded route (as opposed to process-based), this example will get you started:

import threading
import time

def blocker():
    while True:
        print "Oh, sorry, am I in the way?"
        time.sleep(1)

t = threading.Thread(name='child procs', target=blocker)
t.start()

# Prove that we passed through the blocking call
print "No, that's okay" 

You can also mark the thread as a daemon (daemon=True, or the legacy setDaemon(True)) so the program can exit without waiting for it.
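
For example, a minimal sketch of a daemon thread:

import threading
import time

def blocker():
    while True:
        time.sleep(1)

# daemon=True means the interpreter won't wait for this thread at exit
t = threading.Thread(target=blocker, daemon=True)
t.start()
print("Main thread can exit even while blocker is still looping")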

Mauro answered 6/8, 2014 at 22:2 Comment(1)
Note that due to the GIL, threading in Python is only really useful for waiting on things (i.e. non-CPU-bound tasks). For CPU bound tasks, multiprocessing must be used.Houdan

You can use concurrent.futures.ProcessPoolExecutor, which not only propagates the return value, but also any exceptions:

import concurrent.futures

# must be a global function    
def my_function(x):
    if x < 0:
        raise ValueError
    return x + 100

if __name__ == '__main__':  # required on platforms that spawn workers (e.g. Windows)
    with concurrent.futures.ProcessPoolExecutor() as executor:
        f = executor.submit(my_function, 1)
        ret = f.result()  # will rethrow any exceptions
        print(ret)  # 101
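
For example, submitting a failing call re-raises the exception in the parent (a sketch reusing my_function from above):

if __name__ == '__main__':
    with concurrent.futures.ProcessPoolExecutor() as executor:
        f = executor.submit(my_function, -1)
        try:
            f.result()
        except ValueError:
            print("ValueError from the child was re-raised here")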
Liminal answered 3/6, 2022 at 14:8 Comment(0)
