How to make HTTP requests in PHP and not wait on the response
Asked Answered
G

17

249

Is there a way in PHP to make HTTP calls and not wait for a response? I don't care about the response, I just want to do something like file_get_contents(), but not wait for the request to finish before executing the rest of my code. This would be super useful for setting off "events" of a sort in my application, or triggering long processes.

Any ideas?

Gerge answered 23/9, 2008 at 23:0 Comment(5)
one function - 'curl_multi', look in the php docs for it. Should solve your problemsDiarmit
The title of this post is misleading. I came looking for truly asynchronous calls similar to requests in Node.js or an AJAX request. The accepted answer isn't async (it blocks and doesn't provide a callback), just a faster synchronous request. Consider changing the question or accepted answer.Friendly
Playing with connection handling via headers and buffering is not bulletproof. I have just posted a new answer independent of OS, browser or PHP versionAnomie
Asynchronous does not mean you don't care about the response. It just means the call doesn't block the main thread execution. Asynchronous still requires a response, but the response can be processed in another thread of execution or later in an event loop. This question is asking for a fire-and-forget request which can be synchronous or asynchronous depending on message delivery semantics, whether you care about message order, or delivery confirmation.Rebozo
I think you should make this fire HTTP request in non-blocking mode (w/c is what you really want).. Because when you call a resource, you basically want to know if you reached the server or not (or whatever reason, you simply need the response). The best answer really is fsockopen and setting stream reading or writing to non-blocking mode. It's like call and forget.Brigidbrigida
G
43

The answer I'd previously accepted didn't work. It still waited for responses. This does work though, taken from How do I make an asynchronous GET request in PHP?

function post_without_wait($url, $params)
{
    // Build the URL-encoded POST body
    $post_params = array();
    foreach ($params as $key => $val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    // 30s is only the connect timeout; the socket is closed right after the write
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) return false;

    $out  = "POST ".(isset($parts['path']) ? $parts['path'] : '/')." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
    return true;
}
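
As the comments below point out, the call above can still block while connecting or writing. A variation that reduces that risk is sketched here; it only uses stock fsockopen()/stream_set_blocking() with a short connect timeout, and the helper name and GET-only shape are just for illustration:

// Sketch: fire-and-forget GET with a 1-second connect timeout and a
// non-blocking socket. Errors are deliberately ignored because we do not
// care about the response.
function get_without_wait($url)
{
    $parts = parse_url($url);
    $path  = isset($parts['path']) ? $parts['path'] : '/';
    if (isset($parts['query'])) $path .= '?'.$parts['query'];

    $fp = @fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 1); // short connect timeout instead of 30s
    if (!$fp) return false;

    stream_set_blocking($fp, false); // don't block on the write either

    $out  = "GET ".$path." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Connection: Close\r\n\r\n";

    fwrite($fp, $out);
    fclose($fp);
    return true;
}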
Gerge answered 23/9, 2008 at 23:1 Comment(17)
If you look at the link you posted here, my answer includes a way to do GET requests as well.Delightful
This is NOT async! In particular, if the server on the other side is down, this piece of code will hang for 30 seconds (the 5th parameter of fsockopen). Also, the fwrite can take its sweet time to execute (which you can limit with stream_set_timeout($fp, $my_timeout)). The best you can do is set a low fsockopen timeout of 0.1 (100 ms) and $my_timeout to 100 ms. You risk, though, that the request times out.Flyte
I assure you that it is async, and does not take 30 seconds. That's a timeout max. It's feasible that your settings are different causing that effect, but this worked great for me.Gerge
@UltimateBrent There's nothing in the code that suggests it's asynchronous. It doesn't wait for a response, but that is not asynchronous. If the remote server opens the connection and then hangs, this code would wait for 30 seconds until you hit that timeout.Trotter
I don't know what to tell you. I use it all over the place, and it works as intended.Gerge
the reason it seems to work "async" is that you don't read from the socket before closing it, so it doesn't hang even if the server does not emit a response in time. However this is absolutely not async. If the write buffer is full (unlikely, but possible) your script will definitely hang there. You should consider changing your title to something like "requesting a webpage without waiting for the response".Opaline
here is an article about actual async: vince.shiftrunstop.com/dev/…Viafore
Maybe http_build_query would be useful to build the querystring...Service
This is neither async nor is it using curl; how dare you call it curl_post_async and even get upvotes...Indecipherable
I didn't call it that, I copied from the question I linked. I've updated the function name though.Gerge
I'm trying to use it but the request doesn't actually go through every time... When I debug and step through the code I always see the request in the target server log... but in normal use I don't... Why could that be?Discourtesy
How do you read the $post_string (parameters) in the requested PHP page? I tried if (isset($_GET["myvar"])) $token = $_GET["myvar"]; but it's empty :(Mukden
exec("curl $url > /dev/null 2>&1 &"); is one of the fastest solutions here (Thanks @Matt Huggins). It's immensely faster (1.9s for 100 iterations) than the post_without_wait() function (14.8s). And it doesn't come with the same timeout/URL rewriting/etc limitations since it's full-blown cURL. AND it's a one-liner...Salmons
@DavidKrmpotic that did happen to me, I added a sleep(1); before fclose(). That should work.Camaraderie
@UltimateBrent I don't think you understand the meaning of async, as a couple of replies here already stated, this has nothing to do with async. Maybe this will help: #748675Kagu
Works well, while Guzzle, for example, which is recommended in other threads, doesn't allow "fire and forget"Conspire
How on earth is this the accepted answer? This is not asynchronous at all. It's using 100% blocking I/O. Nothing about this code is asynchronous.Parrot
P
30

If you control the target that you want to call asynchronously (e.g. your own "longtask.php"), you can close the connection from that end, and both scripts will run in parallel. It works like this:

  1. quick.php opens longtask.php via cURL (no magic here)
  2. longtask.php closes the connection and continues (magic!)
  3. cURL returns to quick.php when the connection is closed
  4. Both tasks continue in parallel

I have tried this, and it works just fine. But quick.php won't know anything about how longtask.php is doing, unless you create some means of communication between the processes.

Try this code in longtask.php, before you do anything else. It will close the connection, but still continue to run (and suppress any output):

while(ob_get_level()) ob_end_clean();
header('Connection: close');
ignore_user_abort(true);
ob_start();
echo('Connection Closed');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();

The code is copied from the PHP manual's user contributed notes and somewhat improved.
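
For the quick.php side there is no special cURL magic; any client that reads the (tiny) response will do. A minimal sketch, assuming longtask.php is reachable at a URL of your own (the localhost URL below is just a placeholder):

// quick.php - sketch of the calling side. cURL returns as soon as
// longtask.php has flushed its "Connection: close" response above.
$ch = curl_init('http://localhost/longtask.php'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_exec($ch);
curl_close($ch);

// quick.php continues here while longtask.php keeps running
echo "longtask.php has been started\n";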

Pressman answered 13/2, 2010 at 18:22 Comment(4)
This would work. But if you are using an MVC framework it may be difficult to implement because of the way these frameworks intercept and rewrite calls. For example, it does not work in a Controller in CakePHP.Flyte
A doubt about this code: must the processing you need to do in longtask go after these lines? Thanks.Rosenblatt
It doesn't work perfectly. Try adding while(true); after your code. The page will hang, which means it is still running in the foreground.Jalapa
How do I "open it via cURL"? How do I "create some means of communication between the processes"?Brittabrittain
E
24

You can do trickery by using exec() to invoke something that can do HTTP requests, like wget, but you must direct all output from the program to somewhere, like a file or /dev/null, otherwise the PHP process will wait for that output.

If you want to separate the process from the apache thread entirely, try something like (I'm not sure about this, but I hope you get the idea):

exec('bash -c "wget -O /dev/null (url goes here) > /dev/null 2>&1 &"');

It's not a nice business, and you'll probably want something like a cron job invoking a heartbeat script which polls an actual database event queue to do real asynchronous events.
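
For reference, the curl one-liner from the comments below can be wrapped in a tiny helper. This is only a sketch: it assumes curl lives at /usr/bin/curl and that exec() is not disabled on your host, and the function name is made up for illustration:

// Fire-and-forget via exec()+curl. escapeshellarg() guards against shell
// injection; all output is discarded so PHP does not wait for the process.
function fire_and_forget($url)
{
    exec('/usr/bin/curl -s ' . escapeshellarg($url) . ' > /dev/null 2>&1 &');
}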

Expellant answered 23/9, 2008 at 23:35 Comment(6)
Similarly, I've also done the following: exec("curl $url > /dev/null &");Guenzi
Question: is there a benefit of calling 'bash -c "wget"' rather than just 'wget'?Guenzi
In my testing, using exec("curl $url > /dev/null 2>&1 &"); is one of the fastest solutions here. It's immensely faster (1.9s for 100 iterations) than the post_without_wait() function (14.8s) in the "accepted" answer above. AND it's a one-liner...Salmons
Use the full path (e.g. /usr/bin/curl) to make it even fasterDislocation
does this wait until the script is finished?Jareb
Isn't exec() disabled on most shared servers?Brittabrittain
B
17

You can use this library: https://github.com/stil/curl-easy

It's pretty straightforward then:

<?php
$request = new cURL\Request('http://yahoo.com/');
$request->getOptions()->set(CURLOPT_RETURNTRANSFER, true);

// Specify function to be called when your request is complete
$request->addListener('complete', function (cURL\Event $event) {
    $response = $event->response;
    $httpCode = $response->getInfo(CURLINFO_HTTP_CODE);
    $html = $response->getContent();
    echo "\nDone.\n";
});

// Loop below will run as long as request is processed
$timeStart = microtime(true);
while ($request->socketPerform()) {
    printf("Running time: %dms    \r", (microtime(true) - $timeStart)*1000);
    // Here you can do anything else, while your request is in progress
}

Below you can see the console output of the above example. It displays a simple live clock indicating how long the request has been running:


[animation: console output showing the running time updating live]

Biform answered 19/5, 2015 at 0:27 Comment(4)
This should be the accepted answer to the question because, even if it's not true async, it's better than the accepted one and all "async" answers with guzzle (Here you can perform operations while the request is performed)Destrier
Accepted Answer ©Ammonium
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
@Destrier Why is it better than the accepted answer with exec()?Falsehood
E
16

As of 2018, Guzzle has become the de facto standard library for HTTP requests, used in several modern frameworks. It's written in pure PHP and does not require installing any custom extensions.

It can do asynchronous HTTP calls very nicely, and even pool them such as when you need to make 100 HTTP calls, but don't want to run more than 5 at a time.

Concurrent request example

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait on all of the requests to complete. Throws a ConnectException
// if any of the requests fail
$results = Promise\unwrap($promises);

// Wait for the requests to complete, even if some of them fail
$results = Promise\settle($promises)->wait();

// You can access each result using the key provided to the unwrap
// function.
echo $results['image']['value']->getHeader('Content-Length')[0];
echo $results['png']['value']->getHeader('Content-Length')[0];

See http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests
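
For the pooling case mentioned above (many requests, but never more than a handful in flight), Guzzle ships a Pool helper. The sketch below follows the pattern from the Guzzle documentation; the paths and the concurrency limit of 5 are placeholders:

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Generator producing 100 requests without building them all up front
$requests = function ($total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', '/get?i=' . $i);
    }
};

$pool = new Pool($client, $requests(100), [
    'concurrency' => 5, // never more than 5 requests in flight
    'fulfilled'   => function ($response, $index) {
        // called each time a request completes
    },
    'rejected'    => function ($reason, $index) {
        // called each time a request fails
    },
]);

// Initiate the transfers and wait for the pool to drain
$pool->promise()->wait();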

Emmettemmey answered 9/7, 2018 at 6:2 Comment(6)
However, this answer is not asynchronous; apparently Guzzle doesn't do thatPuett
Guzzle requires you to install curl. Otherwise it is non-parallel, and it doesn't give you any warning that it's non-parallel.Souza
Thanks for the link @Puett - yes, it appears it's not completely async (as in when you want to send off a request but don't care about the result) but a few posts down in that thread a user has offered a workaround by setting a very low request timeout value which still permits the connection time, but doesn't wait for the result.Emmettemmey
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install Guzzle if it comes to that?Brittabrittain
composer require guzzle/guzzle adds 537 files and 2.5 million bytes of new code to my project! For an HTTP client! No thanks.Cyprinodont
We need more people like @Cyprinodont in our projects.Nitrosamine
C
12
/**
 * Asynchronously execute/include a PHP file. Does not record the output of the file anywhere.
 *
 * @param string $filename              file to execute, relative to calling script
 * @param string $options               (optional) arguments to pass to file via the command line
 */
function asyncInclude($filename, $options = '') {
    exec("/path/to/php -f " . escapeshellarg($filename) . " {$options} > /dev/null 2>&1 &");
}
Continual answered 13/3, 2010 at 7:18 Comment(5)
This is not asynchronous because exec blocks until you quit or fork the process you want to run.Indecipherable
Did you notice the & at the end?Continual
So would this block the script then or not, im confused?Arnoldarnoldo
@Arnoldarnoldo it won't. The ampersand (&) means the command runs in the backgroundGaut
Isn't exec() disabled on most shared servers?Brittabrittain
A
9
  1. Fake a request abort using cURL by setting a low CURLOPT_TIMEOUT_MS

  2. Set ignore_user_abort(true) to keep processing after the connection has closed.

With this method there is no need to implement connection handling via headers and output buffering, which is too dependent on OS, browser and PHP version.

Master process

function async_curl($background_process=''){

    //-------------get curl contents----------------

    $ch = curl_init($background_process);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_NOSIGNAL => 1, //required so that timeouts below 1000 ms work (see NB below)
        CURLOPT_TIMEOUT_MS => 50, //the maximum number of milliseconds to allow cURL functions to execute
        CURLOPT_VERBOSE => 1,
        CURLOPT_HEADER => 1
    ));
    $out = curl_exec($ch);

    //-------------parse curl contents----------------

    //$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    //$header = substr($out, 0, $header_size);
    //$body = substr($out, $header_size);

    curl_close($ch);

    return true;
}

async_curl('http://example.com/background_process_1.php');

Background process

ignore_user_abort(true);

//do something...

NB

If you want cURL to timeout in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to timeout immediately if the value is < 1000 ms with the error "cURL Error (28): Timeout was reached". The explanation for this behavior is:

[...]

The solution is to disable signals using CURLOPT_NOSIGNAL


Anomie answered 21/2, 2015 at 13:36 Comment(5)
How do you handle connection time out (resolve, dns)? When I set timeout_ms to 1 I always end up with "resolving timed out after 4 ms" or something like thatAconcagua
I don't know but 4 ms sounds already pretty fast to me... I don't think you can resolve faster by changing any curl settings. Try optimizing the targeted request perhaps...Anomie
Ok, but timeout_ms=1 sets the timeout for the whole request. So if your resolve takes more than 1ms, then curl will timeout and stop the request. I don't see how this can work at all (assuming resolve takes >1 ms).Aconcagua
Whilst it doesn't make much sense, this works flawlessly and is a pretty great solution for doing PHP asynchronouslyStagecraft
This example shows how you can use CURLOPT_TIMEOUT_MS successfully with a value less than 1000ms (1s) by also using CURLOPT_NOSIGNAL to avoid the known bug php.net/manual/en/function.curl-setopt.php#104597Gratian
P
5

The swoole extension (https://github.com/matyhtf/swoole) is an asynchronous and concurrent networking framework for PHP.

$client = new swoole_client(SWOOLE_SOCK_TCP, SWOOLE_SOCK_ASYNC);

$client->on("connect", function($cli) {
    $cli->send("hello world\n");
});

$client->on("receive", function($cli, $data){
    echo "Receive: $data\n";
});

$client->on("error", function($cli){
    echo "connect fail\n";
});

$client->on("close", function($cli){
    echo "close\n";
});

$client->connect('127.0.0.1', 9501, 0.5);
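
The snippet above is a raw asynchronous TCP client. For an actual HTTP request, newer Swoole 4.x versions ship a coroutine HTTP client; the sketch below is based on the Swoole documentation as I recall it, so treat the class and option names as assumptions:

Co\run(function () {
    // Coroutine HTTP client: the call yields instead of blocking the process
    $client = new Swoole\Coroutine\Http\Client('example.com', 80);
    $client->set(['timeout' => 1]);
    $client->get('/');
    echo $client->statusCode, "\n";
    $client->close();
});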
Photocomposition answered 3/3, 2014 at 6:6 Comment(1)
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
B
5

You can use non-blocking sockets and one of the PECL event extensions for PHP.

You can use a library which gives you an abstraction layer between your code and the PECL extension: https://github.com/reactphp/event-loop

You can also use an async HTTP client based on that library: https://github.com/reactphp/http-client

See the other ReactPHP libraries: http://reactphp.org

Be careful with the asynchronous model. I recommend watching this video on YouTube: http://www.youtube.com/watch?v=MWNcItWuKpI
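
For a quick feel of the API, here is a minimal sketch using the newer react/http package's Browser class (which supersedes the http-client package linked above); the class and method names follow the current ReactPHP documentation, so treat them as assumptions if you are on an older version:

require 'vendor/autoload.php';

$browser = new React\Http\Browser();

// get() returns a promise immediately; the response is handled in a callback
$browser->get('https://example.com/')->then(
    function (Psr\Http\Message\ResponseInterface $response) {
        echo 'Got ' . $response->getStatusCode() . "\n";
    },
    function (Exception $e) {
        echo 'Error: ' . $e->getMessage() . "\n";
    }
);

// Other code keeps running here; recent versions of react/event-loop run
// the default loop automatically at the end of the script.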

Beefwood answered 18/6, 2014 at 15:20 Comment(1)
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
E
5

Event Extension

The Event extension is very appropriate. It is a port of the Libevent library, which is designed for event-driven I/O, mainly for networking.

I have written a sample HTTP client class based on the Event extension. It lets you schedule a number of HTTP requests and then run them asynchronously.

http-client.php

<?php
class MyHttpClient {
  /// @var EventBase
  protected $base;
  /// @var array Instances of EventHttpConnection
  protected $connections = [];

  public function __construct() {
    $this->base = new EventBase();
  }

  /**
   * Dispatches all pending requests (events)
   *
   * @return void
   */
  public function run() {
    $this->base->dispatch();
  }

  public function __destruct() {
    // Destroy connection objects explicitly, don't wait for GC.
    // Otherwise, EventBase may be free'd earlier.
    $this->connections = null;
  }

  /**
   * @brief Adds a pending HTTP request
   *
   * @param string $address Hostname, or IP
   * @param int $port Port number
   * @param array $headers Extra HTTP headers
   * @param int $cmd A EventHttpRequest::CMD_* constant
   * @param string $resource HTTP request resource, e.g. '/page?a=b&c=d'
   *
   * @return EventHttpRequest|false
   */
  public function addRequest($address, $port, array $headers,
    $cmd = EventHttpRequest::CMD_GET, $resource = '/')
  {
    $conn = new EventHttpConnection($this->base, null, $address, $port);
    $conn->setTimeout(5);

    $req = new EventHttpRequest([$this, '_requestHandler'], $this->base);

    foreach ($headers as $k => $v) {
      $req->addHeader($k, $v, EventHttpRequest::OUTPUT_HEADER);
    }
    $req->addHeader('Host', $address, EventHttpRequest::OUTPUT_HEADER);
    $req->addHeader('Connection', 'close', EventHttpRequest::OUTPUT_HEADER);
    if ($conn->makeRequest($req, $cmd, $resource)) {
      $this->connections []= $conn;
      return $req;
    }

    return false;
  }


  /**
   * @brief Handles an HTTP request
   *
   * @param EventHttpRequest $req
   * @param mixed $unused
   *
   * @return void
   */
  public function _requestHandler($req, $unused) {
    if (is_null($req)) {
      echo "Timed out\n";
    } else {
      $response_code = $req->getResponseCode();

      if ($response_code == 0) {
        echo "Connection refused\n";
      } elseif ($response_code != 200) {
        echo "Unexpected response: $response_code\n";
      } else {
        echo "Success: $response_code\n";
        $buf = $req->getInputBuffer();
        echo "Body:\n";
        while ($s = $buf->readLine(EventBuffer::EOL_ANY)) {
          echo $s, PHP_EOL;
        }
      }
    }
  }
}


$address = "my-host.local";
$port = 80;
$headers = [ 'User-Agent' => 'My-User-Agent/1.0', ];

$client = new MyHttpClient();

// Add pending requests
for ($i = 0; $i < 10; $i++) {
  $client->addRequest($address, $port, $headers,
    EventHttpRequest::CMD_GET, '/test.php?a=' . $i);
}

// Dispatch pending requests
$client->run();

test.php

This is a sample script on the server side.

<?php
echo 'GET: ', var_export($_GET, true), PHP_EOL;
echo 'User-Agent: ', $_SERVER['HTTP_USER_AGENT'] ?? '(none)', PHP_EOL;

Usage

php http-client.php

Sample Output

Success: 200
Body:
GET: array (
  'a' => '1',
)
User-Agent: My-User-Agent/1.0
Success: 200
Body:
GET: array (
  'a' => '0',
)
User-Agent: My-User-Agent/1.0
Success: 200
Body:
GET: array (
  'a' => '3',
)
...

(Trimmed.)

Note, the code is designed for long-term processing in the CLI SAPI.


For custom protocols, consider using the low-level API, i.e. buffer events and buffers. For SSL/TLS communications, I would recommend the low-level API in conjunction with Event's SSL context.


Although Libevent's HTTP API is simple, it is not as flexible as buffer events. For example, the HTTP API currently doesn't support custom HTTP methods. But it is possible to implement virtually any protocol using the low-level API.

Ev Extension

I have also written a sample HTTP client using the Ev extension with sockets in non-blocking mode. The code is slightly more verbose than the Event-based sample, because Ev is a general-purpose event loop: it doesn't provide network-specific watchers, but its EvIo watcher can listen on the file descriptor encapsulated in a socket resource, so it can be used for asynchronous processing of sockets.

The following code shows how HTTP requests can be scheduled for parallel processing.

http-client.php

<?php
class MyHttpRequest {
  /// @var MyHttpClient
  private $http_client;
  /// @var string Hostname
  private $host;
  /// @var string IP address of the host
  private $address;
  /// @var string HTTP resource such as /page?get=param
  private $resource;
  /// @var string HTTP method such as GET, POST etc.
  private $method;
  /// @var int
  private $service_port;
  /// @var resource Socket
  private $socket;
  /// @var double Connection timeout in seconds.
  private $timeout = 10.;
  /// @var int Chunk size in bytes for socket_recv()
  private $chunk_size = 20;
  /// @var EvTimer
  private $timeout_watcher;
  /// @var EvIo
  private $write_watcher;
  /// @var EvIo
  private $read_watcher;
  /// @var EvTimer
  private $conn_watcher;
  /// @var string buffer for incoming data
  private $buffer;
  /// @var array errors reported by sockets extension in non-blocking mode.
  private static $e_nonblocking = [
    11, // EAGAIN or EWOULDBLOCK
    115, // EINPROGRESS
  ];

  /**
   * @param MyHttpClient $client
   * @param string $host Hostname, e.g. google.co.uk
   * @param string $resource HTTP resource, e.g. /page?a=b&c=d
   * @param string $method HTTP method: GET, HEAD, POST, PUT etc.
   * @throws RuntimeException
   */
  public function __construct(MyHttpClient $client, $host, $resource, $method) {
    $this->http_client = $client;
    $this->host        = $host;
    $this->resource    = $resource;
    $this->method      = $method;

    // Get the port for the WWW service
    $this->service_port = getservbyname('www', 'tcp');

    // Get the IP address for the target host
    $this->address = gethostbyname($this->host);

    // Create a TCP/IP socket
    $this->socket = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
    if (!$this->socket) {
      throw new RuntimeException("socket_create() failed: reason: " .
        socket_strerror(socket_last_error()));
    }

    // Set O_NONBLOCK flag
    socket_set_nonblock($this->socket);

    $this->conn_watcher = $this->http_client->getLoop()
      ->timer(0, 0., [$this, 'connect']);
  }

  public function __destruct() {
    $this->close();
  }

  private function freeWatcher(&$w) {
    if ($w) {
      $w->stop();
      $w = null;
    }
  }

  /**
   * Deallocates all resources of the request
   */
  private function close() {
    if ($this->socket) {
      socket_close($this->socket);
      $this->socket = null;
    }

    $this->freeWatcher($this->timeout_watcher);
    $this->freeWatcher($this->read_watcher);
    $this->freeWatcher($this->write_watcher);
    $this->freeWatcher($this->conn_watcher);
  }

  /**
   * Initializes a connection on socket
   * @return bool
   */
  public function connect() {
    $loop = $this->http_client->getLoop();

    $this->timeout_watcher = $loop->timer($this->timeout, 0., [$this, '_onTimeout']);
    $this->write_watcher = $loop->io($this->socket, Ev::WRITE, [$this, '_onWritable']);

    return socket_connect($this->socket, $this->address, $this->service_port);
  }

  /**
   * Callback for timeout (EvTimer) watcher
   */
  public function _onTimeout(EvTimer $w) {
    $w->stop();
    $this->close();
  }

  /**
   * Callback which is called when the socket becomes wriable
   */
  public function _onWritable(EvIo $w) {
    $this->timeout_watcher->stop();
    $w->stop();

    $in = implode("\r\n", [
      "{$this->method} {$this->resource} HTTP/1.1",
      "Host: {$this->host}",
      'Connection: Close',
    ]) . "\r\n\r\n";

    if (!socket_write($this->socket, $in, strlen($in))) {
      trigger_error("Failed writing $in to socket", E_USER_ERROR);
      return;
    }

    $loop = $this->http_client->getLoop();
    $this->read_watcher = $loop->io($this->socket,
      Ev::READ, [$this, '_onReadable']);

    // Continue running the loop
    $loop->run();
  }

  /**
   * Callback which is called when the socket becomes readable
   */
  public function _onReadable(EvIo $w) {
    // recv() 20 bytes in non-blocking mode
    $ret = socket_recv($this->socket, $out, 20, MSG_DONTWAIT);

    if ($ret) {
      // Still have data to read. Append the read chunk to the buffer.
      $this->buffer .= $out;
    } elseif ($ret === 0) {
      // All is read
      printf("\n<<<<\n%s\n>>>>", rtrim($this->buffer));
      fflush(STDOUT);
      $w->stop();
      $this->close();
      return;
    }

    // Caught EINPROGRESS, EAGAIN, or EWOULDBLOCK
    if (in_array(socket_last_error(), static::$e_nonblocking)) {
      return;
    }

    $w->stop();
    $this->close();
  }
}

/////////////////////////////////////
class MyHttpClient {
  /// @var array Instances of MyHttpRequest
  private $requests = [];
  /// @var EvLoop
  private $loop;

  public function __construct() {
    // Each HTTP client runs its own event loop
    $this->loop = new EvLoop();
  }

  public function __destruct() {
    $this->loop->stop();
  }

  /**
   * @return EvLoop
   */
  public function getLoop() {
    return $this->loop;
  }

  /**
   * Adds a pending request
   */
  public function addRequest(MyHttpRequest $r) {
    $this->requests []= $r;
  }

  /**
   * Dispatches all pending requests
   */
  public function run() {
    $this->loop->run();
  }
}


/////////////////////////////////////
// Usage
$client = new MyHttpClient();
foreach (range(1, 10) as $i) {
  $client->addRequest(new MyHttpRequest($client, 'my-host.local', '/test.php?a=' . $i, 'GET'));
}
$client->run();

Testing

Suppose http://my-host.local/test.php script is printing the dump of $_GET:

<?php
echo 'GET: ', var_export($_GET, true), PHP_EOL;

Then the output of php http-client.php command will be similar to the following:

<<<<
HTTP/1.1 200 OK
Server: nginx/1.10.1
Date: Fri, 02 Dec 2016 12:39:54 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: close
X-Powered-By: PHP/7.0.13-pl0-gentoo

1d
GET: array (
  'a' => '3',
)

0
>>>>
<<<<
HTTP/1.1 200 OK
Server: nginx/1.10.1
Date: Fri, 02 Dec 2016 12:39:54 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: close
X-Powered-By: PHP/7.0.13-pl0-gentoo

1d
GET: array (
  'a' => '2',
)

0
>>>>
...

(trimmed)

Note, in PHP 5 the sockets extension may log warnings for EINPROGRESS, EAGAIN, and EWOULDBLOCK errno values. It is possible to turn off the logs with

error_reporting(E_ERROR);

Concerning "the Rest" of the Code

I just want to do something like file_get_contents(), but not wait for the request to finish before executing the rest of my code.

The code that is supposed to run in parallel with the network requests can be executed within the callback of an Event timer, or Ev's idle watcher, for instance (see the sketch below). You can easily figure it out from the samples above. Otherwise, I'll add another example :)
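
A minimal sketch of that idea with the Ev-based client above. It assumes EvLoop exposes an idle() factory method (used here by analogy with the timer() and io() factories already used in the class), so treat it as illustrative:

$client = new MyHttpClient();
foreach (range(1, 10) as $i) {
  $client->addRequest(new MyHttpRequest($client, 'my-host.local', '/test.php?a=' . $i, 'GET'));
}

// Idle watcher: invoked whenever the loop has no pending I/O, so the rest
// of the code runs interleaved with the requests.
$idle = $client->getLoop()->idle(function () {
  echo "doing other work while the requests are in flight\n";
});

$client->run();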

Espadrille answered 2/12, 2016 at 7:58 Comment(0)
B
4

let me show you my way :)

Needs Node.js installed on the server.

(my server sends 1,000 HTTPS GET requests in only 2 seconds)

url.php :

<?php
$urls = array_fill(0, 100, 'http://google.com/blank.html');

function execinbackground($cmd) { 
    if (substr(php_uname(), 0, 7) == "Windows"){ 
        pclose(popen("start /B ". $cmd, "r"));  
    } 
    else { 
        exec($cmd . " > /dev/null &");   
    } 
} 
fwrite(fopen("urls.txt", "w"), implode("\n", $urls));
execinbackground("nodejs urlscript.js urls.txt");
// { do your work while get requests being executed.. }
?>

urlscript.js >

var https = require('https');
var url = require('url');
var http = require('http');
var fs = require('fs');
var dosya = process.argv[2];
var logdosya = 'log.txt';
var count=0;
http.globalAgent.maxSockets = 300;
https.globalAgent.maxSockets = 300;

setTimeout(timeout,100000); // maximum execution time (in ms)

function trim(string) {
    return string.replace(/^\s*|\s*$/g, '')
}

fs.readFile(process.argv[2], 'utf8', function (err, data) {
    if (err) {
        throw err;
    }
    parcala(data);
});

function parcala(data) {
    var data = data.split("\n");
    count=''+data.length+'-'+data[1];
    data.forEach(function (d) {
        req(trim(d));
    });
    /*
    fs.unlink(dosya, function d() {
        console.log('<%s> file deleted', dosya);
    });
    */
}


function req(link) {
    var linkinfo = url.parse(link);
    if (linkinfo.protocol == 'https:') {
        var options = {
        host: linkinfo.host,
        port: 443,
        path: linkinfo.path,
        method: 'GET'
    };
https.get(options, function(res) {res.on('data', function(d) {});}).on('error', function(e) {console.error(e);});
    } else {
    var options = {
        host: linkinfo.host,
        port: 80,
        path: linkinfo.path,
        method: 'GET'
    };        
http.get(options, function(res) {res.on('data', function(d) {});}).on('error', function(e) {console.error(e);});
    }
}


process.on('exit', onExit);

function onExit() {
    log();
}

function timeout()
{
console.log("i am too far gone");process.exit();
}

function log() 
{
    var fd = fs.openSync(logdosya, 'a+');
    fs.writeSync(fd, dosya + '-'+count+'\n');
    fs.closeSync(fd);
}
Bother answered 8/2, 2012 at 19:20 Comment(2)
Please note that many hosting providers do not allow usage of certain PHP functions (like popen/exec). See disable_functions PHP directive.Bunn
Isn't exec() disabled on most shared servers? Plus, I want a pure PHP solution.Brittabrittain
A
3
// Requires the pthreads extension (CLI only).
class async_file_get_contents extends Thread {
    public $ret;
    public $url;
    public $finished;

    public function __construct($url) {
        $this->finished = false;
        $this->url = $url;
    }

    public function run() {
        $this->ret = file_get_contents($this->url);
        $this->finished = true;
    }
}

$afgc = new async_file_get_contents("http://example.org/file.ext");
// start() runs run() in a new thread; calling run() directly just blocks
// like a plain file_get_contents().
$afgc->start();
Ailee answered 27/2, 2015 at 12:35 Comment(1)
Doesn't work for me. Yeah, it fetches the files fine, but is still just as slow as regular file_get_contents().Brittabrittain
A
2

Here is a working example, just run it and open storage.txt afterwards, to check the magical result

<?php
    function curlGet($target){
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $target);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        $result = curl_exec ($ch);
        curl_close ($ch);
        return $result;
    }

    // Its the next 3 lines that do the magic
    ignore_user_abort(true);
    header("Connection: close"); header("Content-Length: 0");
    echo str_repeat("s", 100000); flush();

    $i = $_GET['i'];
    if(!is_numeric($i)) $i = 1;
    if($i > 4) exit;
    if($i == 1) file_put_contents('storage.txt', '');

    file_put_contents('storage.txt', file_get_contents('storage.txt') . time() . "\n");

    sleep(5);
    curlGet('http://' . $_SERVER['HTTP_HOST'] . $_SERVER['SCRIPT_NAME'] . '?i=' . ($i + 1));
    curlGet('http://' . $_SERVER['HTTP_HOST'] . $_SERVER['SCRIPT_NAME'] . '?i=' . ($i + 1));
Aigrette answered 9/6, 2014 at 23:56 Comment(2)
Are you sure this is async? It doesn't look like it...Brittabrittain
magic lines works fine, thanks!Sumerlin
S
2

I find this package quite useful and very simple: https://github.com/amphp/parallel-functions

<?php

use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;

$responses = wait(parallelMap([
    'https://google.com/',
    'https://github.com/',
    'https://stackoverflow.com/',
], function ($url) {
    return file_get_contents($url);
}));

It will load all 3 urls in parallel. You can also use class instance methods in the closure.

For example I use Laravel extension based on this package https://github.com/spatie/laravel-collection-macros#parallelmap

Here is my code:

    /**
     * Get domains with all needed data
     */
    protected function getDomainsWithdata(): Collection
    {
        return $this->opensrs->getDomains()->parallelMap(function ($domain) {
            $contact = $this->opensrs->getDomainContact($domain);
            $contact['domain'] = $domain;
            return $contact;
        }, 10);
    }

It loads all the needed data in 10 parallel threads, finishing in just 8 seconds instead of the 50 seconds it took without async.

Set answered 29/8, 2019 at 20:28 Comment(4)
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
@Brittabrittain composer require amphp/parallel-functionsSet
where do I run this?Brittabrittain
In terminal (console)Set
A
1

Here is my own PHP function for doing a POST to a specific URL of any page. Sample usage of the function:

    <?php
        parse_str("[email protected]&subject=this is just a test");
        $_POST['email'] = $email;
        $_POST['subject'] = $subject;
        echo HTTP_Post("http://example.com/mail.php", $_POST);
        exit;
    ?>
    <?php
    /*********HTTP POST using FSOCKOPEN **************/
    // by ArbZ

function HTTP_Post($URL, $data, $referrer = "") {

    // parsing the given URL
    $URL_Info = parse_url($URL);

    // Building referrer
    if ($referrer == "") // if not given use this script as referrer
        $referrer = $_SERVER["SCRIPT_URI"];

    // making a query string from $data
    $values = array();
    foreach ($data as $key => $value)
        $values[] = "$key=" . urlencode($value);
    $data_string = implode("&", $values);

    // Find out which port is needed - if not given use standard (=80)
    if (!isset($URL_Info["port"]))
        $URL_Info["port"] = 80;

    // building the POST request headers (HTTP requires CRLF line endings)
    $request  = "POST " . $URL_Info["path"] . " HTTP/1.1\r\n";
    $request .= "Host: " . $URL_Info["host"] . "\r\n";
    $request .= "Referer: $referrer\r\n";
    $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $request .= "Content-Length: " . strlen($data_string) . "\r\n";
    $request .= "Connection: close\r\n";
    $request .= "\r\n";
    $request .= $data_string . "\r\n";

    $fp = fsockopen($URL_Info["host"], $URL_Info["port"]);
    fputs($fp, $request);

    // STORE THE FETCHED CONTENTS in a VARIABLE
    $result = '';
    while (!feof($fp)) {
        $result .= fgets($fp, 128);
    }
    fclose($fp);

    // extract the piece of the response the caller is interested in
    $txt = getTextBetweenTags($result, "span"); // first <span> of the response (unused below)
    $result = explode("&", $result);
    return $result[1];
}

function getTextBetweenTags($string, $tagname) {
    $pattern = "/<$tagname ?.*>(.*)<\/$tagname>/";
    preg_match($pattern, $string, $matches);
    return $matches[1];
}
Algometer answered 13/5, 2014 at 23:38 Comment(1)
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
P
1

ReactPHP async http client
https://github.com/shuchkin/react-http-client

Install via Composer

$ composer require shuchkin/react-http-client

Async HTTP GET

// get.php
$loop = \React\EventLoop\Factory::create();

$http = new \Shuchkin\ReactHTTP\Client( $loop );

$http->get( 'https://tools.ietf.org/rfc/rfc2068.txt' )->then(
    function( $content ) {
        echo $content;
    },
    function ( \Exception $ex ) {
        echo 'HTTP error '.$ex->getCode().' '.$ex->getMessage();
    }
);

$loop->run();

Run php in CLI-mode

$ php get.php
Photographic answered 28/12, 2018 at 18:15 Comment(2)
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
Is it really async? The logic at my URL takes 10 seconds, and $http->get() waits for it.Falsehood
L
1

Symfony HttpClient is asynchronous https://symfony.com/doc/current/components/http_client.html.

For example you can

use Symfony\Component\HttpClient\HttpClient;

$client = HttpClient::create();
$response1 = $client->request('GET', 'https://website1');
$response2 = $client->request('GET', 'https://website1');
$response3 = $client->request('GET', 'https://website1');
//these 3 calls will return immediately
//but the requests will fire to the website1 webserver

$response1->getContent(); //this will block until content is fetched
$response2->getContent(); //same 
$response3->getContent(); //same
Leviathan answered 18/4, 2020 at 19:37 Comment(3)
I don't want to have to install anything else on my server; I want a pure PHP version. But how would I even install this if it comes to that?Brittabrittain
This is pure PHP, but you'll need the curl PHP extension enabled for it to work.Leviathan
hm ok. i just used curl_multi thoBrittabrittain
