I'm trying to send large files (50MB-2GB) that I have stored in S3 to the Google App Engine Blobstore, using filepicker.io (in Python). I would like this to happen without going through a form in the client browser, since that defeats the project requirements (and often hangs with very large files).
I tried multiple solutions, including:
- loading the file into GAE with urlfetch, which fails because GAE caps request and response sizes at 32MB (first sketch below)
- constructing a multipart form in Python and POSTing it to `blobstore.create_upload_url()`: the upload URL won't pull the file from a URL for me, and I can't load the file into the script because of the same 32MB limit (second sketch below)
- using boto to read the file straight into the Blobstore: the connection times out, boto logs `encountered HTTPException exception`, and that triggers `CancelledError: The API call logservice.Flush() was explicitly cancelled.` from GAE, which crashes the process (third sketch below)
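
For reference, the urlfetch attempt was roughly the following (`S3_URL` is a placeholder for the signed download URL of my S3 object):

```python
from google.appengine.api import urlfetch

# Placeholder: a (signed) download URL for the S3 object.
S3_URL = 'https://my-bucket.s3.amazonaws.com/my-large-file'

# urlfetch caps responses at 32MB, so this raises
# urlfetch.ResponseTooLargeError for anything in the 50MB-2GB range.
result = urlfetch.fetch(S3_URL, deadline=60)
data = result.content
```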
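
The multipart attempt looked roughly like this; `file_bytes` stands in for the file contents, which I can never actually hold because of the 32MB request limit:

```python
from google.appengine.api import urlfetch
from google.appengine.ext import blobstore

upload_url = blobstore.create_upload_url('/upload_done')

# Placeholder: the full file contents. Loading these already breaks
# the 32MB limit, and the upload URL won't accept just the S3 URL.
file_bytes = '...'

boundary = '----gae-boundary'
body = ('--%s\r\n'
        'Content-Disposition: form-data; name="file"; filename="big.bin"\r\n'
        'Content-Type: application/octet-stream\r\n'
        '\r\n'
        '%s\r\n'
        '--%s--\r\n') % (boundary, file_bytes, boundary)

result = urlfetch.fetch(
    upload_url,
    payload=body,
    method=urlfetch.POST,
    headers={'Content-Type': 'multipart/form-data; boundary=%s' % boundary})
```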
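
And this is roughly the boto attempt, writing into the Blobstore via the (since-deprecated) Files API; the AWS credentials, bucket, and key names are placeholders:

```python
import boto
from google.appengine.api import files

conn = boto.connect_s3('AWS_ACCESS_KEY', 'AWS_SECRET_KEY')  # placeholders
key = conn.get_bucket('my-bucket').get_key('my-large-file')

blob_name = files.blobstore.create(mime_type='application/octet-stream')
with files.open(blob_name, 'a') as f:
    # Stream the S3 object in 1MB chunks; on large files the connection
    # eventually times out, boto logs "encountered HTTPException exception",
    # and GAE raises the CancelledError that kills the process.
    while True:
        chunk = key.read(1024 * 1024)
        if not chunk:
            break
        f.write(chunk)
files.finalize(blob_name)
blob_key = files.blobstore.get_blob_key(blob_name)
```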
I am struggling to find a working solution. Any hints on how I could perform this transfer, or on how to pass the file from S3 as a form attachment without loading it in Python first (i.e. just specifying its URL), would be very much appreciated.