I think some code for the cURL solution is needed here, so I will share mine (it was put together from several sources, such as the PHP manual and its comments).
It performs several HTTP requests in parallel (the URLs are in $aURLs) and prints each response as soon as it is completed (the responses are also stored in $done for other possible uses).
The code is longer than strictly necessary because of the real-time printing part and the abundance of comments, but feel free to edit the answer to improve it:
<?php
/* Strategies to avoid output buffering, ignore the block if you don't want to print the responses before every cURL is completed */
ini_set('output_buffering', 'off'); // Turn off output buffering
ini_set('zlib.output_compression', false); // Turn off PHP output compression
//Flush (send) the output buffer and turn off output buffering
ob_end_flush(); while (@ob_end_flush());
if (function_exists('apache_setenv')) apache_setenv('no-gzip', true); //prevent Apache from buffering it for deflate/gzip (only available under the Apache SAPI)
header("Content-type: text/plain"); //Remove to use HTML
ini_set('implicit_flush', true); // Implicitly flush the buffer(s)
ob_implicit_flush(true);
header('Cache-Control: no-cache'); // recommended to prevent caching of event data.
output(str_repeat(' ', 1000)); //Safari and Internet Explorer have an internal 1K buffer
//Here starts the program output
function output($string){
    ob_start();
    echo $string;
    if (ob_get_level() > 0) ob_flush();
    ob_end_clean(); // clears the buffer and closes buffering
    flush();
}
function multiprint($aCurlHandles, $print = true){
    global $done;
    // iterate through the handles and get your content
    foreach ($aCurlHandles as $url => $ch) {
        if (!isset($done[$url])) { //only check the responses that are not ready yet
            $html = curl_multi_getcontent($ch); //get the content
            if ($html) {
                $done[$url] = $html;
                if ($print) output($html.PHP_EOL);
            }
        }
    }
}
function full_curl_multi_exec($mh, &$still_running) {
    do {
        $rv = curl_multi_exec($mh, $still_running); //execute the handles
    } while ($rv == CURLM_CALL_MULTI_PERFORM); //CURLM_CALL_MULTI_PERFORM means you should call curl_multi_exec() again because there is still data available for processing
    return $rv;
}
set_time_limit(60); //Max execution time 1 minute
$aURLs = array("http://domain/script1.php","http://domain/script2.php"); // array of URLs
$done=array(); //Responses of each URL
//Initialization
$aCurlHandles = array(); // create an array for the individual curl handles
$mh = curl_multi_init(); // init the curl Multi and returns a new cURL multi handle
foreach ($aURLs as $id => $url) { //add a handle for each URL
    $ch = curl_init(); // init curl, and then set up your options
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // returns the result - very important
    curl_setopt($ch, CURLOPT_HEADER, 0); // no headers in the output
    $aCurlHandles[$url] = $ch;
    curl_multi_add_handle($mh, $ch);
}
//Process
$active = null; //the number of individual handles curl_multi is still working on
$mrc = full_curl_multi_exec($mh, $active);
//As long as there are active connections and everything looks OK…
while ($active && $mrc == CURLM_OK) { //CURLM_OK simply means "no error so far"
    // Wait for activity on any curl connection (1 second timeout); -1 would mean the select failed
    if (curl_multi_select($mh, 1) != -1) {
        usleep(500); //Adjust this wait to your needs
        //Process the data for as long as the system tells us to keep getting it
        $mrc = full_curl_multi_exec($mh, $active);
        //output("Still active processes: $active".PHP_EOL);
        //Print each response as soon as it is ready
        multiprint($aCurlHandles);
    }
}
//Printing all the responses at the end
//multiprint($aCurlHandles,false);
//Finalize
foreach ($aCurlHandles as $url => $ch) {
    curl_multi_remove_handle($mh, $ch); // remove the handle (assuming you are done with it)
}
curl_multi_close($mh); // close the curl multi handler
?>
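As a possible simplification (a minimal sketch, not the code above rewritten authoritatively): if you don't need the real-time printing, curl_multi_info_read() tells you exactly which handle has just finished, so each response is read once instead of polling every handle with multiprint(). The URLs are the same placeholders as above and the echoed format is only illustrative:
<?php
$aURLs = array("http://domain/script1.php", "http://domain/script2.php");
$mh = curl_multi_init();
foreach ($aURLs as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_multi_add_handle($mh, $ch);
}
$active = null;
do {
    // run the transfers; loop while curl asks to be called again immediately
    do {
        $status = curl_multi_exec($mh, $active);
    } while ($status == CURLM_CALL_MULTI_PERFORM);
    curl_multi_select($mh, 1); // wait up to 1 second for activity instead of busy-looping
    // read every "a transfer has finished" message that is ready
    while ($info = curl_multi_info_read($mh)) {
        $ch  = $info['handle'];
        $url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
        echo $url." finished with HTTP ".curl_getinfo($ch, CURLINFO_HTTP_CODE).PHP_EOL;
        echo curl_multi_getcontent($ch).PHP_EOL; // body of that particular request
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
} while ($active > 0);
curl_multi_close($mh);
?>
The difference is only where the "is it done yet?" check happens: multiprint() polls curl_multi_getcontent() on every handle, while curl_multi_info_read() is driven by libcurl's own completion messages.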