PHP Background Processes

I'm trying to make a PHP script. I have the script finished, but it takes about 10 minutes to finish the process it is designed to do. This is not a problem, but I presume I have to keep the page loaded the whole time, which is annoying. Can I start the process and then come back 10 minutes later and just view the log file it has generated?

Cissie answered 5/11, 2008 at 13:0 Comment(0)

Well, you can use ignore_user_abort().

The script will then continue to work after the client disconnects (keep an eye on the script duration; perhaps add set_time_limit(0)).

But a warning here: you will not be able to stop a script that uses these two lines:

ignore_user_abort(true); 
set_time_limit(0);

Except by directly accessing the server and killing the process there! (Been there, done an endless loop calling itself over and over, brought the server to a screeching stop, got shouted at...)
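As a minimal sketch of how these pieces fit together (the log file name is just an assumption for this example):

```php
<?php
// Keep running after the browser disconnects, with no time limit.
ignore_user_abort(true);
set_time_limit(0);

// Tell the client we are done so it can stop waiting.
$msg = "Started. Check back later.";
header('Content-Length: ' . strlen($msg));
header('Connection: close');
echo $msg;
flush();

// The long-running work continues here; log progress so the user
// can check on it later ("process.log" is an assumed file name).
for ($step = 1; $step <= 10; $step++) {
    // ... one chunk of the real 10-minute job ...
    file_put_contents('process.log', date('c') . " step $step done\n", FILE_APPEND);
}
```

The user can then reload a page that simply prints process.log while the job keeps running.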

Claudy answered 5/11, 2008 at 13:9 Comment(3)
This is brilliant. This also works very well with the WordPress ajax call 'wp_ajax_nopriv_your_trigger'. Use the above two methods in your hook, and invoke (using a scheduler perhaps) yourdomain.com/wp-admin/admin-ajax.php?action=your_trigger – Balbriggan
Worked like a charm! I upload a file to my server, then a script using this method uploads it to S3, so I can easily show upload progress to my users. I was refreshing and making other requests on my site, and lo and behold, the file was in S3 and my log showed the progression. :D – Clap
@azure_ardee WordPress already has a mock cron system for doing things like this. See codex.wordpress.org/Function_Reference/wp_cron – Theodicy

Sounds like you should have a queue and an external script for processing the queue.

For example, your PHP script should put an entry into a database table and return right away. Then, a cron running every minute checks the queue and forks a process for each job.

The advantage here is that you don't tie up an Apache thread for 10 minutes.
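A minimal sketch of that pattern, using SQLite as the queue (file, table, and column names are assumptions for this example):

```php
<?php
// enqueue.php – runs inside the web request and returns immediately.
$pdo = new PDO('sqlite:/tmp/queue.db');
$pdo->exec("CREATE TABLE IF NOT EXISTS jobs (id INTEGER PRIMARY KEY, payload TEXT, status TEXT DEFAULT 'pending')");
$pdo->prepare("INSERT INTO jobs (payload) VALUES (?)")
    ->execute([json_encode(['task' => 'generate_report'])]);
echo "Queued";

// worker.php – run from cron every minute: * * * * * php /path/to/worker.php
$pdo = new PDO('sqlite:/tmp/queue.db');
foreach ($pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending'") as $job) {
    // ... the actual 10-minute work for $job['payload'] goes here ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}
```

In practice the two sections would live in separate files; they are shown together here only to keep the sketch compact.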

Grayce answered 5/11, 2008 at 16:37 Comment(0)

I had lots of issues with this sort of process under Windows. My situation was a little different in that I didn't care about the response of the script; I wanted it to start and allow other page requests to go through while it was busy working away.

For some reason, I had issues with it either hanging other requests or timing out after about 60 seconds (both Apache and PHP were set to time out after about 20 minutes). It also turns out that Firefox times out after 5 minutes by default anyway, so past that point you can't tell what's going on through the browser without changing Firefox's settings.

I ended up using the process open and process close methods to open up a php in cli mode like so:

pclose(popen("start php myscript.php", "r"));

This would (using start) open the PHP process and then kill the start process, leaving PHP running for however long it needed; again, you'd have to kill the process manually to shut it down. It didn't require setting any timeouts, and the page that called it could continue and output further details.

The only issue with this is that if you need to send the script any data, you'd either do it via another source or pass it along the "command line" as parameters; which isn't so secure.

Worked nicely for what we needed though and ensures the script always starts and is allowed to run without any interruptions.
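A cross-platform version of the same idea might look like this (the script paths are placeholders):

```php
<?php
// Launch myscript.php detached from the current request.
if (stripos(PHP_OS, 'WIN') === 0) {
    // "start" returns immediately on Windows, leaving php.exe running.
    pclose(popen('start php C:\\path\\to\\myscript.php', 'r'));
} else {
    // On Unix, redirect output and background the process with "&".
    exec('php /path/to/myscript.php > /dev/null 2>&1 &');
}
echo "Background job started";
```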

Bush answered 5/11, 2008 at 14:13 Comment(1)
Thanks, exactly what I needed. Here are some other workarounds I found: somacon.com/p395.php – Comedo

I think the shell_exec() command is what you are looking for.

However, it is disabled in safe mode.

The PHP manual article about it is here: http://php.net/shell_exec

There is an article about it here: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
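For example, a fire-and-forget call on a Unix-like system might look like this (the script path and log file are assumptions for this sketch):

```php
<?php
// Redirecting output and appending "&" lets shell_exec() return
// immediately instead of blocking until the script finishes.
shell_exec('nohup php /path/to/longtask.php > /tmp/longtask.log 2>&1 &');
echo "Task started; watch /tmp/longtask.log for progress";
```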

Lunch answered 5/11, 2008 at 13:8 Comment(0)

There is another option you can use: run the script from the CLI. It will run in the background, and you can even run it as a cron job if you want.

e.g

#!/usr/bin/php -q
<?php

// process logs

?>

This can be set up as a cron job and will execute with no time limitation. This example is for Unix-based operating systems, though.
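For example, a crontab entry to run such a script every night at 02:00 might look like this (the paths are assumptions):

```shell
# m h dom mon dow  command
0 2 * * * /usr/bin/php -q /path/to/process_logs.php >> /var/log/process_logs.log 2>&1
```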

FYI, I have a PHP script with an infinite loop that does some processing and has been running non-stop for the past 3 months.

Birdseed answered 6/11, 2008 at 7:27 Comment(1)
I think you got lucky. In my research and experience, it's difficult to get one PHP script to run continuously, without stopping, for more than a few days--it wasn't designed to do that, or so I hear. But a few hours should be no problem. This line might help someone: ini_set('max_execution_time', 28800); // 8 hours – Lolly

You could use ignore_user_abort() - that way the script will continue to run even if you close your browser or go to a different page.

Kalindi answered 5/11, 2008 at 13:9 Comment(0)

Consider Gearman:

Gearman is a generic application framework for farming out work to multiple machines or processes. It allows applications to complete tasks in parallel, to load balance processing, and to call functions between languages. The framework can be used in a variety of applications, from high-availability web sites to the transport of database replication events.

This extension provides classes for writing Gearman clients and workers. - Source php manual

Official website of Gearman

Philippe answered 25/6, 2013 at 14:32 Comment(0)

In addition to bastiandoeen's answer, you can combine ignore_user_abort(true); with a cURL request.

Fake a request abort by setting a low CURLOPT_TIMEOUT_MS and keep processing after the connection closes:

function async_curl($background_process=''){

    //-------------get curl contents----------------

    $ch = curl_init($background_process);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_NOSIGNAL => 1, // to time out immediately if the value is < 1000 ms
        CURLOPT_TIMEOUT_MS => 50, // the maximum number of milliseconds to allow cURL functions to execute
        CURLOPT_VERBOSE => 1,
        CURLOPT_HEADER => 1
    ));
    $out = curl_exec($ch);

    //-------------parse curl contents----------------

    //$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    //$header = substr($out, 0, $header_size);
    //$body = substr($out, $header_size);

    curl_close($ch);

    return true;
}

async_curl('http://example.com/background_process_1.php');

async_curl('http://example.com/background_process_1.php');

NB

If you want cURL to timeout in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to timeout immediately if the value is < 1000 ms with the error "cURL Error (28): Timeout was reached". The explanation for this behavior is:

[...]

The solution is to disable signals using CURLOPT_NOSIGNAL

pros

  • No need to switch methods (Compatible windows & linux)
  • No need to implement connection handling via headers and buffer (Independent from Browser and PHP version)

cons

  • Need curl extension
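For this approach to work, the target script (background_process_1.php in the example above) must itself survive the early disconnect caused by the short timeout; a minimal sketch (the log path is an assumption):

```php
<?php
// background_process_1.php – must keep running after the 50 ms
// cURL timeout effectively aborts the client connection.
ignore_user_abort(true);
set_time_limit(0);

// ... the actual long-running work ...
file_put_contents('/tmp/background.log', date('c') . " job finished\n", FILE_APPEND);
```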


Backplate answered 21/2, 2015 at 13:11 Comment(0)

Zuk.

I'm pretty sure this will work:

<?php 

pclose(popen('php /path/to/file/server.php &'));
echo "Server started. [OK]"; 

?>

The '&' is important. It tells the shell not to wait for the process to exit.

Also, you can use this code in your PHP (as bastiandoeen said):

ignore_user_abort(true); 
set_time_limit(0);

in your server stop command:

<?php

exec('ps aux | grep -ie /path/to/file/server.php | awk \'{print $2}\' | xargs kill -9', $output);
echo "Server stopped. [OK]";

?>
Tenderfoot answered 11/1, 2014 at 13:46 Comment(0)

Just call StartBuffer() before any output, and EndBuffer() when you want the client to close the connection. The code after the call to EndBuffer() will be executed on the server without a client connection.


    private function StartBuffer(){
        @ini_set('zlib.output_compression',0);
        @ini_set('implicit_flush',1);
        @ob_end_clean();
        @set_time_limit(0);
        @ob_implicit_flush(1);
        @ob_start();
    }

    private function EndBuffer(){
        $size = ob_get_length();
        header("Content-Length: $size");
        header('Connection: close');
        ob_end_flush();
        flush(); // push the buffered output to the client so it can disconnect
    }

Bogie answered 26/7, 2018 at 13:46 Comment(1)
This does not add to the already accepted answer and doesn't even solve the OP's issue. Also, liberal use of @ is really bad practice – Oddfellow

© 2022 - 2024 — McMap. All rights reserved.