curl error 18 - transfer closed with outstanding read data remaining
C

14

101

When retrieving data from a URL using curl, I sometimes (in about 80% of cases) get

error 18: transfer closed with outstanding read data remaining

Part of the returned data is then missing. The weird thing is that this never occurs when CURLOPT_RETURNTRANSFER is set to false, i.e. when curl_exec doesn't return the data but outputs the content directly.

What could be the problem? Can I set some of the options to avoid such behaviour?

Commune answered 18/11, 2009 at 23:52 Comment(3)
Can you give us the URL you are trying? It could be a bad connection if you are testing this on your localhost.Debrief
Are you sending a Connection: Close header? If so, try using something like Connection: Keep-Alive with Keep-Alive: *** where *** is a number of your choosing that makes sense (maybe 10 seconds, to be safe; most modern browsers use 300, which is 5 minutes).Attenuator
I just ran into this with a Node.js Express server that streams results back one line at a time. The issue for me was that I set a response header ("Content-Type": "text/csv") after the data was streamed out. My header did not appear in the response, and it appears to have caused the curl error. I explicitly set the header above the stream output and it started working.Churchwell
H
42

I bet this is related to a wrong Content-Length header sent by the peer. My advice is to let curl set the length by itself.
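To illustrate the failure mode this answer describes — a minimal Python sketch, not from the thread, with a deliberately misbehaving local server: if the server announces a Content-Length larger than what it actually sends and then closes the connection, the client knows data is outstanding, which is exactly what curl reports as error 18. Python's `http.client` surfaces the same condition as `IncompleteRead`:

```python
import http.client
import socket
import threading

# Hypothetical misbehaving server: promises Content-Length: 100,
# sends only 10 bytes, then closes the connection.
def bad_server(listener):
    conn, _ = listener.accept()
    conn.recv(1024)  # read (and ignore) the request
    conn.sendall(
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Length: 100\r\n"
        b"Connection: close\r\n\r\n"
        b"short body"  # 10 bytes instead of the promised 100
    )
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
threading.Thread(target=bad_server, args=(listener,), daemon=True).start()

client = http.client.HTTPConnection("127.0.0.1", listener.getsockname()[1])
client.request("GET", "/")
resp = client.getresponse()
err = None
try:
    resp.read()  # expects 100 bytes; the connection closes after 10
except http.client.IncompleteRead as e:
    err = e
print("got", len(err.partial), "of", len(err.partial) + err.expected, "bytes")
# -> got 10 of 100 bytes
```

curl behaves the same way: it trusts the announced length, and when the connection closes early it reports the transfer as incomplete rather than silently returning a truncated body.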

Homonym answered 19/11, 2009 at 8:17 Comment(5)
It can be related to the Content-Length response header. I have encountered a similar case in one of my coworker's projects: a Java webservice gateway sets Content-Length: 601 while the XML response is 210 bytesConciliate
I believe @Conciliate is right. We got the same with an unterminated chunked transfer encoding: curl is expecting more data (the server announced more, or didn't send the terminating 0-length chunk), but the server closes the connection.Gash
There is no "Content-Length" included, example attached.Westonwestover
How did you guys manage to fix this? I am not sending any content length in code. Is it something that gets added automatically on the server side?Beberg
The Content-Length header may also be correctly set by the server, yet the server may truncate the response and never deliver all of the "promised" content length, e.g. due to a crash of the script generating/sending the response (such as a PHP script). See this response to another question: https://mcmap.net/q/89102/-curl-transfer-closed-with-outstanding-read-data-remaining.Calix
R
56

The error string is quite simply exactly what libcurl sees: since it is receiving a chunked encoding stream it knows when there is data left in a chunk to receive. When the connection is closed, libcurl knows that the last received chunk was incomplete. Then you get this error code.

There's nothing you can do to avoid this error with the request unmodified, but you can try to work around it by issuing an HTTP 1.0 request instead (chunked encoding won't be used then). The fact remains, though, that this is most likely a flaw in the server or in your network/setup somehow.
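To make this concrete — a minimal Python sketch (the truncating server here is made up, not from the answer): a chunked response that is cut off before the terminating 0-length chunk leaves the client knowing a chunk is incomplete. Python's `http.client` raises `IncompleteRead` in exactly the situation where libcurl reports error 18:

```python
import http.client
import socket
import threading

# Hypothetical broken server: starts a chunked response, announces a
# second 5-byte chunk, but closes the connection mid-chunk and never
# sends the terminating 0-length chunk.
def chunked_server(listener):
    conn, _ = listener.accept()
    conn.recv(1024)  # read (and ignore) the request
    conn.sendall(
        b"HTTP/1.1 200 OK\r\n"
        b"Transfer-Encoding: chunked\r\n\r\n"
        b"5\r\nhello\r\n"  # one complete chunk
        b"5\r\nwo"         # second chunk announced, only 2 of 5 bytes sent
    )
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
threading.Thread(target=chunked_server, args=(listener,), daemon=True).start()

client = http.client.HTTPConnection("127.0.0.1", listener.getsockname()[1])
client.request("GET", "/")
resp = client.getresponse()
err = None
try:
    resp.read()  # fails: the last announced chunk never fully arrives
except http.client.IncompleteRead as e:
    err = e
print("complete chunks received:", err.partial)
```

An HTTP 1.0 request sidesteps this only because chunked encoding doesn't exist in 1.0; the truncation itself would then be undetectable, which is why fixing the server is the real cure.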

Rochus answered 4/12, 2009 at 18:52 Comment(5)
For me, the problem was on a remote end I had no control over and the only working fix was forcing 1.0 with this: curl_setopt($curl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);Antietam
@DanielStenberg still the same problem.Pantechnicon
Setting HTTP version 1.0 helped me with chunked encoding and strange hex marks appearing in the retrieved data. Thanks a lot for this hint!Modena
Is it something that happens for a specific URL that we are using curl for? @EricCaronDeepen
@MonaJalal I wasn't running the remote server so I'm not aware of its configuration. My guess was that IIS had different configurations and was responding differently to 1.0 and 1.1 requests.Antietam
J
18

I was seeing this error when using Guzzle as well. The following header fixed it for me:

'headers' => [
    'accept-encoding' => 'gzip, deflate',
],

I issued the request with Postman which gave me a complete response and no error. Then I started adding the headers that Postman sends to the Guzzle request and this was the one that fixed it.
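Why would this header matter? Advertising gzip lets the server send a compressed body whose Content-Length matches what actually goes over the wire, and (as the comments below note) the client must then decompress the payload itself. A minimal Python sketch of that exchange, with a made-up local server and only the standard library:

```python
import gzip
import http.client
import socket
import threading

# Hypothetical well-behaved server: since the client advertises gzip,
# it compresses the body, sets Content-Encoding: gzip, and announces a
# Content-Length that matches the compressed payload exactly.
body = gzip.compress(b"hello, world")

def gzip_server(listener):
    conn, _ = listener.accept()
    conn.recv(1024)  # read (and ignore) the request
    head = ("HTTP/1.1 200 OK\r\n"
            "Content-Encoding: gzip\r\n"
            f"Content-Length: {len(body)}\r\n"
            "Connection: close\r\n\r\n").encode()
    conn.sendall(head + body)
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
threading.Thread(target=gzip_server, args=(listener,), daemon=True).start()

client = http.client.HTTPConnection("127.0.0.1", listener.getsockname()[1])
client.request("GET", "/", headers={"Accept-Encoding": "gzip, deflate"})
resp = client.getresponse()
data = gzip.decompress(resp.read())  # http.client does not auto-decompress
print(data)  # -> b'hello, world'
```

This matches the experience in the comments: the transfer completes, but the raw response arrives gzip-compressed and has to be decoded (Guzzle and curl's `CURLOPT_ENCODING` do this automatically).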

Jarv answered 30/4, 2019 at 10:16 Comment(3)
Thanks, this worked for me. I tried with postman to get response headers and found that response is returning gzip so I added this.Hoxha
This helped however the response became in gzip format.Genia
This also solved my problem with Insomnia and querying a Lumen API with large dataset.Rutheruthenia
G
7

I had the same problem, but managed to fix it by suppressing the 'Expect: 100-continue' header that cURL usually sends (the following is PHP code, but should work similarly with other cURL APIs):

curl_setopt($curl, CURLOPT_HTTPHEADER, array('Expect:'));

By the way, I am sending calls to the HTTP server that is included in the JDK 6 REST stuff, which has all kinds of problems. In this case, it first sends a 100 response, and then with some requests doesn't send the subsequent 200 response correctly.

Guidotti answered 4/12, 2009 at 15:14 Comment(5)
Where would we modify this setting? I just don't know where to add this line when cURL is in Windows.Trimorphism
@Trimorphism I'm not sure what you mean by "cURL is in Windows", but on the command line, you can suppress the 'Expect:' header by giving it an empty value: curl -H 'Expect:' ... I hope this helps...Guidotti
I'm running wordpress on IIS and it uses cURL so I don't know where to specify this option :-/Trimorphism
I don't know much about WordPress, but it's open source, so you should be able to find the relevant part of the PHP code and basically just insert the line from my answer (probably after changing the name of $curl to whatever it's called in that code).Guidotti
I am a little confused: the Expect: 100-continue mechanism exists so the server can tell the client to go ahead and send its data, not to help the client receive data. How can suppressing this header help the client receive data? Thanks.Stereobate
T
5

I encountered a similar issue; my server is behind nginx. There was no error in the web server's (Python Flask) log, but there were error messages in the nginx log.

[crit] 31054#31054: *269464 open() "/var/cache/nginx/proxy_temp/3/45/0000000453" failed (13: Permission denied) while reading upstream

I fixed this issue by correcting the permission of directory:

/var/cache/nginx
Tommie answered 18/5, 2020 at 10:33 Comment(1)
I had this issue as well. It was caused by running nginx as the user id daemon instead of nginx (nginx.conf: user nginx;). My O/S (Alpine) package install assumes you will be running nginx as the user nginx, not daemon. Changing the permissions would probably have fixed it as well, but IMHO running as nginx is a better fix, for my circumstances.Ellipticity
A
4

I got this error when my server process got an exception midway during generating the response and simply closed the connection without saying goodbye. curl still expected data from the connection and complained (rightfully).

Aquiver answered 27/7, 2014 at 7:7 Comment(0)
B
3

I got this error when my server ran out of disk space and simply closed the connection midway through generating the response.

Beckford answered 10/2, 2021 at 22:12 Comment(0)
L
2

I worked around this error as follows:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.someurl/');
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
// CURLOPT_RETURNTRANSFER is left false, so curl_exec() prints the body;
// an output buffer captures what would otherwise go straight to output.
ob_start();
curl_exec($ch);
$data = ob_get_clean();
if (curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200) {
    // success: handle $data
}
curl_close($ch);

The error still occurs, but this way I can still handle the response data from a variable.

Loot answered 24/10, 2012 at 8:45 Comment(0)
M
2

I had this problem working with pycurl and I solved it using

c.setopt(pycurl.HTTP_VERSION, pycurl.CURL_HTTP_VERSION_1_0) 

like Eric Caron says.

Metalepsis answered 5/2, 2015 at 8:51 Comment(0)
S
1

I got this error when I accidentally downloaded a file onto itself.
(I had created a symlink in an sshfs mount of the remote directory to make it available for download, forgot to switch the working directory, and used -OJ.)

I guess it won't really "help" you when you read this, since it means your file got trashed.

Sites answered 26/1, 2019 at 8:54 Comment(0)
S
1

I had this same problem. I tried all of these solutions, but none worked. In my case, the request worked fine in Postman, but when I made it with curl in PHP I got the error mentioned above.

What I did was check the PHP code generated by Postman and replicate the same thing.

First, the request is set to use HTTP version 1.1. The second, and for me most important, part is the encoding.

Here is the code that helped me:

curl_setopt($ch, CURLOPT_ENCODING, '');
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);

If I remove the CURLOPT_ENCODING line, the error comes back.

Sikorski answered 5/4, 2021 at 10:1 Comment(0)
E
0

I got this error when running through a nginx proxy and I was running nginx under the user-id daemon instead of the user id nginx.

This means some of nginx's scratch directories weren't accessible / writable.

Switching from user daemon; to user nginx; fixed it for me.

Ellipticity answered 25/3, 2021 at 16:43 Comment(0)
D
0

It can be related to many issues. In my case, I was using curl to build an image (via the Docker API). The build was stuck, which is why I got this error; when I fixed the build, the error disappeared.

Disarray answered 8/9, 2021 at 8:24 Comment(0)
A
0

We can fix this by suppressing the Expect: 100-continue header that cURL normally sends.

Agrology answered 21/8, 2022 at 15:17 Comment(1)
Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center.Berkeley

© 2022 - 2024 — McMap. All rights reserved.