I'm working on a section of a Perl module that builds a large CSV response. The server runs on Plack, which I'm far from an expert on.
Currently I'm using something like this to send the response:
$res->content_type('text/csv');

my $body = '';
query_data(
    parameters => \%query_parameters,
    callback   => sub {
        my $row_object = shift;
        $body .= $row_object->to_csv;
    },
);

$res->body($body);
return $res->finalize;
However, query_data is not a fast function and it retrieves a lot of records. Inside the callback I'm just concatenating each row onto $body, and only after all rows are processed do I send the whole response.

I don't like this for two obvious reasons: first, it holds a lot of RAM until $body is destroyed; second, the user sees no response activity at all until the query has finished and the response is finally sent with $res->body($body).
I looked for an answer to this in the documentation but didn't find what I need.

I also tried calling $res->body($row_object->to_csv) inside the callback, but it seems that only the last call to $res->body is sent, each call overriding the previous one.
Is there a way to send a Plack response that flushes the content on each row, so the user starts receiving content in real time as the data is gathered, without having to accumulate all the data into a variable first?
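
What I have in mind is something along the lines of PSGI's streaming (delayed response) interface, where the app returns a coderef and pushes each chunk through a writer object. Here is a rough, untested sketch reusing query_data and %query_parameters from above; whether the client actually sees rows in real time presumably still depends on the server and any middleware not buffering:

my $app = sub {
    my $env = shift;

    # The streaming interface requires server support
    die "Server does not support psgi.streaming\n"
        unless $env->{'psgi.streaming'};

    return sub {
        my $responder = shift;

        # Send status and headers first, and get back a writer object
        my $writer = $responder->([ 200, [ 'Content-Type' => 'text/csv' ] ]);

        query_data(
            parameters => \%query_parameters,
            callback   => sub {
                my $row_object = shift;
                $writer->write($row_object->to_csv);   # send each row as it is produced
            },
        );

        $writer->close;
    };
};

I'm not sure whether that is the right approach here, or how it is meant to interact with Plack::Response, which is part of what I'm asking.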
Thanks in advance for any comments!
Comments:

…getline method. For more detail, post a short but complete example. – Complect

…getline and it worked fine, except that Plack is still buffering the response and doesn't send anything to the browser until ->getline returns undef. Regarding my example, I modified it to make it a little more self-explanatory; posting my real code would only add a lot of irrelevant lines to the mix. The only thing I'm trying to figure out is how to make Plack send an unbuffered/autoflushed response. – Dairen

…get_data function in your code. I think you mean query_data. – Anthracite
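
For reference, the getline-style body I mentioned in the comments looked roughly like this (fetch_next_row is a hypothetical pull-style iterator standing in for my real data source); it worked, but Plack still seemed to buffer everything until getline returned undef:

package CsvRows;

# Minimal IO::Handle-like body: the server calls getline until it
# returns undef, then calls close.
sub new     { my ($class, $iter) = @_; bless { iter => $iter }, $class }
sub getline {
    my $self = shift;
    my $row  = $self->{iter}->();    # assumed to return undef when exhausted
    return defined $row ? $row->to_csv : undef;
}
sub close   { }

package main;

# Elsewhere, in the request handler ($fetch_next_row is a hypothetical
# closure that returns the next row object, or undef when done):
$res->content_type('text/csv');
$res->body( CsvRows->new($fetch_next_row) );
return $res->finalize;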