POST 4GB file from shell using cURL
I'm trying to POST a file with a size of 4 GB to a REST API.

Instead of uploading the full 4 GB, cURL POSTs an empty body (Content-Length: 0).

curl -v -i -d @"/work/large.png" -H "Transfer-Encoding: chunked" http://localhost:8080/files
* Adding handle: conn: 0x7fcafc00aa00
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x7fcafc00aa00) send_pipe: 1, recv_pipe: 0
* About to connect() to localhost port 8080 (#0)
*   Trying localhost...
* Connected to localhost (localhost) port 8080 (#0)
> POST /files HTTP/1.1
> User-Agent: curl/7.30.0
> Host: localhost:8080
> Accept: */*
> Transfer-Encoding: chunked
> Authorization: bearer XXX.XXX.XXX
> x-user-token: bearer XXX.XXX.XXX
* upload completely sent off: 5 out of 0 bytes
< HTTP/1.1 201 Created
HTTP/1.1 201 Created
< Date: Thu, 02 Jan 2014 14:55:46 GMT
Date: Thu, 02 Jan 2014 14:55:46 GMT
< ETag: "d41d8cd98f00b204e9800998ecf8427e"
ETag: "d41d8cd98f00b204e9800998ecf8427e"
< Location: http://localhost:8080/files/66032e34-9490-4556-8495-fb485ca12811
Location: http://localhost:8080/files/66032e34-9490-4556-8495-fb485ca12811
* Server nginx/1.4.1 is not blacklisted
< Server: nginx/1.4.1
Server: nginx/1.4.1
< Content-Length: 0
Content-Length: 0
< Connection: keep-alive
Connection: keep-alive

Uploading smaller files works as expected.

-rw-r--r--  1 user1  wheel  4403200000  2 Jan 15:02 /work/large.png

Why does the upload fail, and how do I correctly upload a file of this size?

Cheers.

Jell answered 2/1, 2014 at 15:19 Comment(5)
Maybe you need -X POST? https://mcmap.net/q/910387/-curl-large-file-as-post-request Scandent
Thanks Karl, but this doesn't help. IMHO cURL uses POST by default when -d is specified.Jell
Try --data-binary instead of -d. -d defaults to --data-ascii, which won't work well on a binary PNG image.Scandent
--data-binary works perfectly. Thanks a lot.Jell
Hey, @Nils! Perhaps you could consider accepting one of the answers here? This would help out other users, as it lets them know what worked for you.Stuffy

I think you should consider using the -T option instead of --data-binary. --data-binary loads the entire file into memory (curl 7.47). At best this is slow; at worst the OOM killer will reply with a Killed message.

curl -XPOST -T big-file.iso https://example.com
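
For context, a minimal sketch of this approach applied to the request from the question (the endpoint and header are taken from there; -T is the short form of --upload-file and streams the file from disk, and since -T defaults to PUT, -X POST overrides the method):

# stream the file from disk instead of buffering it in memory
curl -X POST -T /work/large.png \
     -H "Transfer-Encoding: chunked" \
     http://localhost:8080/files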
Dunghill answered 14/2, 2018 at 17:37 Comment(5)
I'm confused. You suggest using -T instead of --data-binary, and the answer below by @jb. says that --data-binary is the right way to do it?Stuffy
If the client has enough memory to hold the file in memory then --data-binary could be the "right way to do it".Unworldly
What does the -T flag do?Aquitaine
This question is 4 years old, but the -T flag stands for transfer, as in transferring or uploading a file.Chkalov
-T is the short form of --upload-file. There’s really no reason for suggesting short options without explaining what they do.Amalekite

To upload large binary files using cURL, you'll need to use the --data-binary flag.

In my case it was:

 curl -X PUT --data-binary @big-file.iso https://example.com
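
Applied to the command from the question, only the data flag changes (endpoint and header are the OP's; note, as the comment below points out, that --data-binary still reads the whole file into memory before sending):

curl -v -i --data-binary @"/work/large.png" \
     -H "Transfer-Encoding: chunked" \
     http://localhost:8080/files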

Note: this is really just an extended version of @KarlC's comment, which is actually the proper answer.

Epistemic answered 19/10, 2016 at 11:44 Comment(1)
This risks doing exactly what OP is trying to avoid, i.e. OOM because it reads the whole file in memory before sending it.Westberry

Did you verify that the connection is not timing out? Also check whether CURLOPT_POSTFIELDS has a length or size limit. See also: Can't post data to rest server using cURL with content length larger than 1MB

But based on my research, all I can say is that the issue is on the server side. It could be a memory issue (buffer-size related), a timeout issue, and so on; quite a lot depends on the platform you are using on the server side. So please provide some details about the server side and some log output, and especially try to capture the error log.

Rawdin answered 2/1, 2014 at 15:33 Comment(2)
The server side actually doesn't get a request entity. Even if I change to a remote server, there is no network activity at all. I also tried to upload a file using Jersey, which works fine with the same server backend.Jell
Can you try with a smaller file first and see if it works? Say something like less than 2 MB. If that works, we can be sure your framework is working in some form. Also, let me know the content of the access log (or its equivalent) with both the < 2 MB and the 4 GB file.Rawdin
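
For reference, one quick way to generate a small test payload of the suggested size (the file path is illustrative; the endpoint is the one from the question):

# create a ~2 MB file of random bytes and POST it
dd if=/dev/urandom of=/tmp/test-2mb.bin bs=1048576 count=2
curl -v --data-binary @/tmp/test-2mb.bin http://localhost:8080/files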

In my case, curl was consuming far too much memory, roughly in line with the file size... I noticed that the response was the issue (maybe a memory leak in curl or bash?) and I solved it by redirecting the curl output to a file:

curl {{command arguments and url}} > curl_response.data

That solved the issue of curl using up too much memory.
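
A sketch of the same idea using curl's built-in output option instead of shell redirection (keeping the placeholder from above; -o/--output writes the response body to the given file):

curl -o curl_response.data {{command arguments and url}}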

Hibben answered 20/8, 2023 at 11:32 Comment(0)

The out-of-memory issue was resolved for me by adding physical memory to the server. The server was consuming 7 GB of RAM and the archive took up 17 GB, so 7 + 17 = 24 GB of RAM were needed. I allocated 32 GB of RAM, and the recovery completed without problems.

Cutworm answered 12/8, 2024 at 23:4 Comment(1)
This does not address the question; you don’t want to physically add RAM to your server every time you have to upload a big file.Amalekite
