I have been searching for a good method, and banging my head against the wall.
For a file-sharing service project, I have been assigned to determine the best available method for uploading large files.
After going through a lot of questions here on Stack Overflow and on other forums, here's what I've got:
1. Increase the script's maximum execution time, along with the maximum allowed file size.
This approach doesn't really fit well. The script will almost always time out when the file is uploaded over a normal broadband connection (1-2 Mbps). Even though PHP scripts only start executing after the upload has completed, there is still no guarantee that the upload itself will not time out. (A sketch of the relevant settings follows this list.)
2. Chunked upload.
I roughly understand what I'm supposed to do here, but what confuses me is this: say a 1 GB file is being uploaded and I'm reading it in 2 MB chunks; if the upload is slow, won't the PHP script execution still time out and give an error? (A rough server-side sketch follows this list.)
3. Use other languages like Java and Perl?
Is it really efficient to use Java or Perl for handling file uploads?
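
For reference, this is roughly what option 1 would look like in php.ini. The directive names are the standard PHP ones; the values below are just placeholder examples to be tuned to real file sizes and connection speeds, not recommendations:

    ; php.ini - example values only
    upload_max_filesize = 1024M
    post_max_size       = 1100M   ; should be larger than upload_max_filesize
    max_input_time      = 3600    ; seconds allowed for receiving the request body
    max_execution_time  = 300     ; script run time after the upload has been received
    memory_limit        = 256M    ; uploads go to a temp file, not into memory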
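
And this is the kind of server-side handler I have in mind for option 2. It's only a sketch: the field names (upload_id, chunk_index, chunk) are made up, and it assumes chunks arrive in order. The point is that each request carries one small chunk, so no single request runs long enough to hit a timeout:

    <?php
    // chunk_receiver.php - illustrative only; field names are assumptions.

    $uploadId   = basename($_POST['upload_id']);   // client-generated ID for this upload
    $chunkIndex = (int) $_POST['chunk_index'];     // 0-based position of this chunk
    $target     = __DIR__ . '/uploads/' . $uploadId . '.part';

    // Append the chunk to the partial file (the first chunk creates/truncates it).
    $in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
    $out = fopen($target, $chunkIndex === 0 ? 'wb' : 'ab');
    stream_copy_to_stream($in, $out);              // streams the data; never holds the whole file in RAM
    fclose($in);
    fclose($out);

    echo json_encode(['received' => $chunkIndex]);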
The method used by the client is not the problem here, as we'll be shipping a client SDK and can implement the method of our choice in it. Both the client-side and server-side implementations will be decided by us.
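
Since both ends are under our control, the SDK side of the chunked approach could be as simple as the loop below (PHP/cURL purely for illustration; the endpoint URL and field names just match the hypothetical receiver above). With this split, each HTTP request only takes a few seconds, so neither max_execution_time nor max_input_time has to account for the total file size:

    <?php
    // chunk_sender.php - illustrative client; endpoint and field names are assumptions.

    $file      = 'bigfile.iso';
    $chunkSize = 2 * 1024 * 1024;                // 2 MB per request
    $uploadId  = uniqid('up_', true);
    $handle    = fopen($file, 'rb');

    $index = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);     // only 2 MB in memory at any time
        if ($chunk === '' || $chunk === false) {
            break;                               // nothing left to send
        }

        // cURL sends multipart uploads from disk, so spool the chunk to a temp file.
        $tmp = tempnam(sys_get_temp_dir(), 'chunk');
        file_put_contents($tmp, $chunk);

        $ch = curl_init('https://example.com/chunk_receiver.php');
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POSTFIELDS     => [
                'upload_id'   => $uploadId,
                'chunk_index' => $index,
                'chunk'       => new CURLFile($tmp),
            ],
        ]);
        curl_exec($ch);                          // a real client would check the response and retry failed chunks
        curl_close($ch);
        unlink($tmp);

        $index++;
    }
    fclose($handle);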
Which method, in your opinion, would be the best, considering that memory usage should stay efficient and that there may be many concurrent uploads in progress?
How do Dropbox and similar cloud storage services handle big file uploads and still stay fast at it?
…upload_max_filesize and post_max_size), and you may also need to change max_input_time if it takes longer than 5 minutes to upload the file... but it won't store the actual file in memory unless you explicitly load it into memory. – Caves