Firebase Cloud Function [Error: memory limit exceeded. Function invocation was interrupted.] on YouTube video upload

I am trying to upload videos to YouTube from a Firebase Cloud Function.

What I need: when a user uploads a video to Firebase Cloud Storage, the functions.storage.object().onFinalize event gets triggered. In that handler, I download the file to a temporary location, upload it from there to YouTube, and then delete both files.

This works fine for small files.

But if I upload a large file, the function is terminated with this error:

Error: memory limit exceeded. Function invocation was interrupted.

Code for uploading the video:

    var requestData = {
        'params': {
            'part': 'snippet,status'
        },
        'properties': {
            'snippet.categoryId': '22',
            'snippet.defaultLanguage': '',
            'snippet.description': docdata.shortDesc,
            'snippet.tags[]': '',
            'snippet.title': docdata.title,
            'status.embeddable': '',
            'status.license': '',
            'status.privacyStatus': 'public',
            'status.publicStatsViewable': ''
        },
        'mediaFilename': tempLocalFile
    };

    insertVideo(tempLocalFile, oauth2Client, requestData);

The insertVideo function:

function insertVideo(file, oauth2Client, requestData) {
    return new Promise((resolve, reject) => {
        google.options({ auth: oauth2Client });
        var parameters = removeEmptyParameters(requestData['params']);
        parameters['auth'] = oauth2Client;
        parameters['media'] = { body: fs.createReadStream(requestData['mediaFilename']) };
        parameters['notifySubscribers'] = false;
        parameters['resource'] = createResource(requestData['properties']);

        console.log("INSERT >>> ");
        google.youtube('v3').videos.insert(parameters, (error, received) => {
            if (error) {
                console.log("in error");
                console.log(error);
                try {
                    fs.unlinkSync(file);
                } catch (err) {
                    console.log(err);
                } finally {
                    // response.status(200).send({ error: error })
                }
                reject(error);
            } else {
                console.log("in else");
                console.log(received.data);
                fs.unlinkSync(file);
                resolve();
            }
        });
    });
}

Code for creating the temp local file:

    bucket.file(filePath).createReadStream()
        .on('error', (err) => {
            reject(err);
        })
        .on('response', (response) => {
            console.log(response);
        })
        .on('end', () => {
            console.log("The file is fully downloaded");
            resolve();
        })
        .pipe(fs.createWriteStream(tempLocalFile));

Every file read and write is handled by streams, so any idea why the memory issue is happening?

Mystique answered 13/11, 2018 at 16:11 Comment(7)
Hey Suhail, just wondering how you set up the oauth2Client if you use a Cloud Function to handle the event. How did you get browser access to establish the OAuth flow? Thanks! – Unworthy
I used the OAuth playground, developers.google.com/oauthplayground – Mystique
Oh cool! I will check it out. Btw, once the token is generated, where did you put it on the server so that the Cloud Function can access it? Thanks a lot! – Unworthy
I added that to the config. – Mystique
Sorry Suhail, can you be more specific? Really sorry, I am new to Cloud Functions 😅. Where is the config? Is it for Firebase? – Unworthy
@Unworthy check this page: firebase.google.com/docs/functions/config-env – Mystique
Wow, that's really helpful! Thank you so much Suhail! – Unworthy

The only writeable part of the filesystem in Cloud Functions is the /tmp directory. As per the documentation here:

This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.

This is why you hit the memory limit with bigger files.

Your options are:

  • Allocate more memory to your function (currently up to 2 GB)
  • Execute the upload from an environment where you can write to the filesystem. For example, your Cloud Function could call an App Engine Flexible service to execute the upload, as sketched below.
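
A minimal sketch of the second option, assuming a hypothetical App Engine Flexible service exposed at https://YOUR_PROJECT.appspot.com/upload-to-youtube that performs the actual upload on its own writable disk; the function only hands off the object reference and never writes to /tmp:

const functions = require('firebase-functions');
const axios = require('axios');

// Hypothetical App Engine Flexible endpoint that does the heavy lifting.
const UPLOADER_URL = 'https://YOUR_PROJECT.appspot.com/upload-to-youtube';

exports.forwardToUploader = functions.storage.object().onFinalize(async (object) => {
  // Forward only the object reference; the Flexible service streams the video
  // from Cloud Storage to YouTube itself, so the function stays small.
  await axios.post(UPLOADER_URL, {
    bucket: object.bucket,
    name: object.name,
    contentType: object.contentType,
  });
});

The Flexible environment has a real writable disk, so the file size is bounded by disk space rather than by the function's memory allocation.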
Damar answered 13/11, 2018 at 16:21 Comment(0)

The simplest quick fix is to increase the function's memory allocation.

For Firebase, the docs tell you to set function memory like so:

exports.convertLargeFile = functions
  .runWith({
    timeoutSeconds: 300,
    memory: '1GB',
  })
  .storage.object()
  .onFinalize((object) => {
    // Do some complicated things that take a lot of memory and time
  });

Valid values for memory are 128MB, 256MB, 512MB, 1GB, 2GB, 4GB, and 8GB.

However, setting this alone did not work for me. I also had to go to the functions list in the Google Cloud Platform Console, then click on the name of the function whose memory you want to increase. That takes you to an edit screen where you can change the memory allocation and redeploy; it is just a matter of clicking buttons and changing dropdown values.

Once done, you should see the changes reflected both in the aforementioned functions list and in the Firebase functions list - https://console.firebase.google.com/u/0/project/YOUR_PROJECT/functions/list

Now your function should work!

Urinate answered 20/10, 2021 at 23:0 Comment(0)

You can also use a resumable video upload following a series of steps:

  1. Your GCS-triggered function fires when the video upload to Cloud Storage finishes.
  2. That function starts a resumable upload session with YouTube, works out reasonable chunk sizes, and publishes the chunk definitions to Pub/Sub, each message carrying the byte range for its chunk and the session ID.
  3. You create a new Pub/Sub-triggered function on that topic that receives each message, downloads just that chunk from GCS using a range request (undocumented on the JSON API, but I already reported it), and uploads the chunk to YouTube (a sketch of this flow follows the list).
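
A minimal sketch of that flow, assuming a hypothetical Pub/Sub topic called youtube-chunks, a hypothetical getAccessToken() helper that returns a valid OAuth2 access token, and a fixed chunk size; the chunk worker chains the next message itself so chunks go up in order:

const functions = require('firebase-functions');
const { Storage } = require('@google-cloud/storage');
const { PubSub } = require('@google-cloud/pubsub');
const axios = require('axios');

const storage = new Storage();
const pubsub = new PubSub();
const TOPIC = 'youtube-chunks';          // assumption: topic name
const CHUNK_SIZE = 32 * 1024 * 1024;     // 32 MB, a multiple of 256 KB

// 1) GCS-triggered function: start a resumable session and enqueue the first chunk.
exports.startYoutubeUpload = functions.storage.object().onFinalize(async (object) => {
  const token = await getAccessToken(); // assumption: OAuth token handling lives elsewhere
  const size = Number(object.size);

  // The resumable session URI comes back in the Location header.
  const session = await axios.post(
    'https://www.googleapis.com/upload/youtube/v3/videos?uploadType=resumable&part=snippet,status',
    { snippet: { title: object.name }, status: { privacyStatus: 'public' } },
    {
      headers: {
        Authorization: `Bearer ${token}`,
        'X-Upload-Content-Length': size,
        'X-Upload-Content-Type': object.contentType,
      },
    }
  );

  await pubsub.topic(TOPIC).publishMessage({
    json: { bucket: object.bucket, name: object.name, sessionUri: session.headers.location, start: 0, size },
  });
});

// 2) Pub/Sub-triggered function: upload one byte range, then enqueue the next one.
exports.uploadYoutubeChunk = functions.pubsub.topic(TOPIC).onPublish(async (message) => {
  const { bucket, name, sessionUri, start, size } = message.json;
  const end = Math.min(start + CHUNK_SIZE, size) - 1;
  const token = await getAccessToken();

  // Stream only this byte range out of GCS, so the function never holds the whole file.
  const chunk = storage.bucket(bucket).file(name).createReadStream({ start, end });

  await axios.put(sessionUri, chunk, {
    maxBodyLength: Infinity,
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Length': end - start + 1,
      'Content-Range': `bytes ${start}-${end}/${size}`,
    },
    // YouTube answers 308 while more chunks are expected, 200/201 on the last one.
    validateStatus: (status) => status === 308 || status === 200 || status === 201,
  });

  if (end + 1 < size) {
    await pubsub.topic(TOPIC).publishMessage({
      json: { bucket, name, sessionUri, start: end + 1, size },
    });
  }
});

Publishing the next message only after a chunk succeeds keeps each video's chunks in order, while different videos can still upload in parallel.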

I have not tried it, but this might even allow parallel uploads to YouTube from different functions uploading different chunks (which would greatly improve performance, although the docs suggest that the chunks need to be uploaded in order). You can download an arbitrary chunk from a GCS object, so the GCS side of things is not a problem for parallelization.

If parallel uploads are not allowed, you can just publish a new Pub/Sub message with the last byte uploaded when a function finishes its chunk, so the execution of functions is ordered (while still allowing parallel uploads of different videos).

This is a little more involved, but it allows you to upload arbitrarily large videos (up to the current 128 GB limit on YouTube) from small functions.

Take care to handle failures properly (for example, by re-inserting the chunk message into the Pub/Sub topic).

Nostril answered 11/3, 2019 at 11:29 Comment(0)
