Cache problem? Server-sent events work in localhost, not in production environment
I want to ask this question with a simple example (written at the end of the post).

I have read this: server sent events not updating until script is finished

But I don't know how to solve it.

The solution from its answer (https://mcmap.net/q/1917465/-server-sent-events-not-updating-until-script-is-finished) works perfectly, so maybe it is a caching problem on my production server (shared web hosting).

I receive the whole EventStream at the end, all in a row at the same time.

I have already checked all combinations of:

header('Cache-Control: no-cache, no-store, must-revalidate, private, max-age=0');
header('Pragma: no-cache');
header('Expires: 0');

But no luck.

Does anyone know how to solve this without "str_pad($message, 800000)"?

Any clue on how to compare my localhost server configuration with the shared web host's configuration?

Thanks,

NOTE 1: PHP version 8 in both environments. I have checked that I work with Apache in my development environment and CGI/FastCGI on my shared web server. Is it related? I have found this: Event Source -> Server returns event stream in bulk rather then returning in chunk

NOTE 2: Output buffering is the same on both servers: output_buffering 4096
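A quick way to narrow down where the buffering happens is to watch the raw stream with curl from the command line (the URL below is a placeholder for your own endpoint). If curl shows one event per second but the browser does not, the problem is on the client side; if curl also receives everything at once, something between PHP and the network is buffering:

```shell
# -N (--no-buffer) makes curl print each chunk as soon as it arrives,
# so a working SSE endpoint shows one event per second here.
# Replace the URL with your own deployed script (placeholder).
curl -N -H "Accept: text/event-stream" "https://example.com/long_process.php"
```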

This is a simple example that doesn't work on my hosting:

test.html

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8" />
    </head>
    <body>
        <br />
        <input type="button" onclick="startTask();"  value="Start Long Task" />
        <input type="button" onclick="stopTask();"  value="Stop Task" />
        <br />
        <br />
          
        <p>Results</p>
        <br />
        <div id="results" style="border:1px solid #000; padding:10px; width:300px; height:250px; overflow:auto; background:#eee;"></div>
        <br />
          
        <progress id='progressor' value="0" max='100' ></progress>  
        <span id="percentage" style="text-align:right; display:block; margin-top:5px;">0</span>
    </body>
</html>

<script>
var es;
  
function startTask() {
    if (!!window.EventSource) {
        es = new EventSource('long_process.php');
        //a message is received
        es.addEventListener('message', function(e) {
            var result = JSON.parse( e.data );
            addLog(result.message);  
            if(e.lastEventId == 'CLOSE') {
                addLog('Received CLOSE closing');
                es.close();
                var pBar = document.getElementById('progressor');
                pBar.value = pBar.max; //max out the progress bar
            }
            else {
                var pBar = document.getElementById('progressor');
                pBar.value = result.progress;
                var perc = document.getElementById('percentage');
                perc.innerHTML   = result.progress  + "%";
                perc.style.width = (Math.floor(pBar.clientWidth * (result.progress/100)) + 15) + 'px';
            }
        });

        es.addEventListener('error', function(e) {
          addLog('Error occurred');
          es.close();
        });        
    }
}
    
function stopTask() {
    es.close();
    addLog('Interrupted');
}

function addLog(message) {
    var r = document.getElementById('results');
    r.innerHTML += message + '<br>';
    r.scrollTop = r.scrollHeight;
}
</script>

long_process.php

<?php
header('Content-Type: text/event-stream');
// recommended to prevent caching of event data.
header('Cache-Control: no-cache'); 
  
function send_message($id, $message, $progress) {
    $d = array('message' => $message , 'progress' => $progress);
    echo "id: $id" . PHP_EOL;
    echo "data: " . json_encode($d) . PHP_EOL;
    echo PHP_EOL;
    //push the data out by all force possible
    ob_flush();
    flush();
}
  
//LONG RUNNING TASK
for($i = 1; $i <= 10; $i++) {
    send_message($i, 'on iteration ' . $i . ' of 10' , $i*10); 
    sleep(1);
}
send_message('CLOSE', 'Process complete', 100);
?>

UPDATE about @Tigger's answer: I have used this code, but no luck. Again I receive everything in a row at the end of the script (10 seconds), not a message every second. (I have also checked "\n" and PHP_EOL.)

function send_message($id, $message, $progress) {
    $d = array('message' => $message , 'progress' => $progress);
      
    echo "id: $id" . "\n";
    echo "data: " . json_encode($d) . "\n";
    echo "\n";
      
    //push the data out by all force possible
    while(ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
}
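For completeness, compression is another layer that can hold output back (mod_deflate on Apache, or zlib.output_compression in PHP). Below is a minimal sketch of switching those off before streaming; it is hedged, since on shared hosting some of these calls may be ignored or overridden:

```php
<?php
// Turn off PHP-level output compression for this request.
@ini_set('zlib.output_compression', '0');

// Under mod_php, also ask Apache's mod_deflate to leave this
// response alone (the function only exists under mod_php).
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1');
}

// Close any buffers opened by output_buffering=4096 so that
// flush() talks directly to the SAPI.
while (ob_get_level() > 0) {
    ob_end_flush();
}
```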

UPDATE about @Tigger's second answer: I have used the MDN sample on GitHub and no luck. XAMPP works; my production web server ... doesn't.


UPDATE about the hosting provider: As I have not found a solution, I have contacted my shared web host, and here is their answer:

(translated with Google):

Hello. After analyzing the case, as we have been able to verify, using SSE on a platform like ours, with an nginx proxy in front of Apache, would require certain customizations to the hosting's nginx configuration, which makes it incompatible with the shared hosting service. You would need a more customizable service, such as a VPS (virtual private server) or similar. Regards.

As I can't change the nginx configuration, is there any other configuration/command in my PHP files or JavaScript that will help me?

Thrilling answered 3/10, 2021 at 18:57 Comment(2)
It's possible that this server provides only FastCGI, not CGI. - Ashleeashleigh
@Ashleeashleigh Thanks a lot for your answer. How could I check that? Will I only find a solution if it is CGI, and not if it is FastCGI? Thanks. - Thrilling
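A small script uploaded to both servers and opened in the browser reports which SAPI PHP runs under, which would answer the CGI-vs-FastCGI question:

```php
<?php
// php_sapi_name() identifies how PHP is being run, e.g.
// "apache2handler" (mod_php), "cgi-fcgi" (FastCGI),
// "fpm-fcgi" (PHP-FPM) or "cli" (command line).
echo php_sapi_name() . "\n";
```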

After a lot of messing around, I found the following syntax for your long_process.php works best in my environment.

My server is using FreeBSD and my PHP scripts (PHP 8) are also Unix formatted (important for line returns). If you are on a mix of Windows and Linux, your line returns could be part of the issue.

I also found ob_get_level() helped a lot. The connection_aborted() check closes the script off more quickly too, preventing it from continuing when the user navigates away and returning resources to the web server.

My JavaScript structure is a bit different from yours as well, but your issue appears to be on the PHP side, so I have skipped that part.

long_process.php

// how long between each loop (in seconds)                                                                                                                  
define('RETRY',4);

header("Cache-Control: no-cache");
header("Content-Type: text/event-stream");

// skip the first check as the member just started
echo 'retry: '.(RETRY * 1000)."\n"; // newline needed, or "retry:" and "data:" merge into one invalid line
echo 'data: {"share":true,"update":false}';
echo "\n\n";

flush();
sleep(RETRY);

while(1) {
    if (... some conditional check here ...) {
        echo 'data: {"share":true,"update":true}';
    } else {
        echo 'data: {"share":true,"update":false}';
    }
    echo "\n\n";

    while(ob_get_level() > 0) {
        ob_end_flush();
    }

    flush();

    if (connection_aborted()) {
        break;
    }

    sleep(RETRY);
}
Simmie answered 3/10, 2021 at 21:48 Comment(3)
Hi @Simmie, thanks a lot for your answer. My long_process.php is only an example. I have tried your pseudo code but no luck. I have also tried both versions, with PHP_EOL and with "\n". See my updated question. - Thrilling
@JuanRangel: There is an MDN sample on GitHub. Does the sample code work in your two environments? - Simmie
Hi @Tigger, thanks a lot for your help, but ... no luck. I have used exactly the code that is on GitHub. In my XAMPP test environment it just works: I can see messages with a random time between them. On the production web server ... nothing. Nothing even when I click "close connection". - Thrilling

As per this answer on a similar question, this is an nginx issue. You can fix it by adding an 'X-Accel-Buffering' header with the value 'no' to your response. See this entry in the nginx documentation for more detail.
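In PHP that header would be sent like this (a sketch; it only helps when the nginx in front is configured to honour X-Accel-Buffering rather than override it):

```php
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
// Ask an nginx reverse proxy not to buffer this response.
// Ignored if the proxy strips or overrides the header.
header('X-Accel-Buffering: no');
```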

Veator answered 1/2, 2022 at 14:30 Comment(1)
Thanks @Veator. It doesn't work; maybe I am limited because of my shared server. - Thrilling

© 2022 - 2024 — McMap. All rights reserved.