PHP Output buffering, Content Encoding Error caused by ob_gzhandler?
Can anyone explain why I am receiving the following error?

In the code below, if the echo $gz; is commented out I receive no error (but also no output!); if it isn't, I get (from Firefox):

Content Encoding Error


The page you are trying to view cannot be shown because it uses an invalid or unsupported form of compression.


Thanks for your help, here's the code:

ob_start('ob_gzhandler') OR ob_start();
echo 'eh?';
$gz = ob_get_clean();
echo $gz;
Maccarthy answered 19/6, 2011 at 14:4 Comment(2)
Do you set the appropriate Content-Encoding header? 'eh?' isn't gzipped here btw.Cherie
@Cherie I wasn't setting the Content-Encoding header manually, no, but Firebug shows that it is set correctly anyway, I have now tried manually setting it too, but no difference. I'm aware that "eh?" isn't gzipped in my example - as I mention in the question, I just used that to show that the error occurs whether I return the gzipped content or not. I'll make that a bit clearer I think. Thanks.Maccarthy

The output of your application should only contain one output encoding. If the response contains chunks that are encoded differently, the browser receives a body it cannot decode. Hence the encoding error.
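As an illustration (this script is not from the original post, just a minimal sketch of the failure mode), mixing an already-compressed chunk with plain text produces exactly such a body:

```php
<?php
// What NOT to do: the header promises a gzip body, but only part of
// the body is actually a gzip stream, so the browser's decoder fails.
header('Content-Encoding: gzip');
echo gzencode('first chunk');   // a valid, complete gzip stream
echo 'plain text appended';     // breaks the stream -> decoding error
```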

Kohana itself already makes use of the output buffer. If you want to combine that with your ob_gzhandler output buffer, you need to start your buffer before Kohana initializes its own. That works because output buffers are stackable: when Kohana has finished its output buffering, yours will apply:

ob_start('ob_gzhandler'); # your buffer
   ob_start() ... ob_end_flush() by Kohana

So whenever Kohana produces output, those chunks are passed on into your output callback (ob_gzhandler()) and get gz-encoded.

The browser then only receives gz-encoded data, because your buffer sits at the topmost level.
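The stacking can be sketched in isolation (a minimal example, not Kohana's actual code — the inner buffer simply stands in for whatever buffering the framework does):

```php
<?php
// Outermost buffer: compresses everything flushed through it at script end.
ob_start('ob_gzhandler');

// Inner buffer, as a framework like Kohana would start for itself.
ob_start();
echo 'hello from the inner buffer';
$inner = ob_get_clean();   // inner buffer ends; its content comes back as a string

echo $inner;               // now passes through the gz buffer on its way out
// When the script ends, PHP flushes the remaining buffer through
// ob_gzhandler, so the client gets a single, consistently encoded body.
```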

Using ob_gzhandler and manually echo'ing the buffer

If you make use of ob_start('ob_gzhandler') to let PHP deal with the compression and you then echo ob_get_clean(), you will create unreliable output. That's related to how the compression together with output buffering works:

PHP will buffer chunks of output. That means PHP starts to compress the output but holds back some bytes to continue compressing. So ob_get_clean() returns only the so-far compressed part of the buffer. Often that result is not a complete stream.
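The same effect can be seen with PHP's incremental compression API (a sketch, requires the zlib extension and PHP >= 7.0; it is analogous to, not identical with, what ob_gzhandler does internally):

```php
<?php
// Incremental compression keeps internal state between calls, which is
// why grabbing a compressed buffer mid-stream yields an incomplete stream.
$ctx  = deflate_init(ZLIB_ENCODING_GZIP);
$part = deflate_add($ctx, 'eh?', ZLIB_NO_FLUSH); // may be empty or partial
$rest = deflate_add($ctx, '', ZLIB_FINISH);      // finishing emits the remainder
// Only $part . $rest forms a complete gzip stream; $part alone does not.
```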

To deal with that, flush the buffer first:

ob_start('ob_gzhandler') OR ob_start();
echo 'eh?';
ob_flush();             // push the buffered output through the handler
$gz = ob_get_clean();
echo $gz;

And ensure you don't have any more output after that.

Had PHP reached the end of your script, it would have taken care of that itself: flushing and outputting.

Here you need to call ob_flush() manually to explicitly make PHP push the buffer through the callbacks.

Inspecting HTTP Compression Problems with Curl

As Firefox only shows an error page, another tool is needed to inspect what's causing the encoding error. You can use curl to track what's going on:

curl --compressed -i URL

This will request the URL with compression enabled while displaying all response headers and the unencoded body. This is necessary because PHP transparently enables or disables compression in the ob_gzhandler callback based on the request headers.

The response also shows that PHP sets the needed response headers itself, so there is no need to specify them manually. Doing so would even be dangerous, because merely calling ob_start('ob_gzhandler') does not tell you whether compression is enabled or not.

In case the compression is broken, curl gives an error description but does not display the body.

Following is such a curl error message, provoked with incompletely generated output from a faulty PHP script:

HTTP/1.1 200 OK
X-Powered-By: PHP/5.3.6
Content-Encoding: gzip
...

curl: (23) Error while processing content unencoding: invalid code lengths set

By adding the --raw switch, you can even peek into the raw response body:

curl --compressed --raw -i URL

That can give an impression of what's going wrong, such as uncompressed parts within the body.

Fulminate answered 19/6, 2011 at 15:21 Comment(6)
Oops, sorry - I just completely rewrote the question as you were typing your answer. I'll rollback if your answer solves the original problem, but as you can see in the updated question - I may have been too hasty thinking this was a Kohana problem, I'm getting the same error even if I don't route through Kohana. Thanks for your reply though - I'll give it a proper read ... now.Maccarthy
Ok, thanks. Tried it with curl it seems that the response is coming through unencoded, but with Content-Encoding: gzip hence the error when decoding.Maccarthy
Test the code example I've added. It works. Tracked this down with curl and firefox.Fulminate
Yeah was new to me as well with so much detail. Let me know if that solved it with Kohana as well.Fulminate
I will do. Now I've had another look at it though I'm confused again, the ob_get_clean won't actually contain anything right, since ob_flush discards the buffer contents.Maccarthy
Thanks, that's a perfect solution!Jeanejeanelle

This is what phpharo does:

/** output buffering */
if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false)
{
    ob_start('ob_gzhandler'); ob_start();
}
else
{
    ob_start();
}
Introrse answered 4/11, 2013 at 10:57 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.