Google Cloud Functions memory limit exceeded

We do a lot of image processing in Google Cloud Functions using Node.js and the Sharp (libvips) library. Even though we have the memory limit for our functions set to 2 GB, a function occasionally runs out of memory and crashes with the message 'Error: memory limit exceeded. Function invocation was interrupted.'

Is there a way to catch this exception? I want to return a more polite (JSON) response so my server knows what the problem was.
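
A simplified sketch of the kind of function we run (the HTTP trigger, names, and target size here are illustrative, not our real code):

    const sharp = require('sharp');

    exports.resizeImage = async (req, res) => {
      try {
        // req.rawBody is the raw request buffer on Cloud Functions HTTP triggers.
        const resized = await sharp(req.rawBody)
          .resize(1024, 768, { fit: 'inside' }) // illustrative target size
          .toBuffer();
        res.status(200).json({ ok: true, bytes: resized.length });
      } catch (err) {
        // Catches ordinary Sharp/libvips errors, but not the platform-level
        // 'memory limit exceeded' kill, which terminates the instance.
        res.status(500).json({ ok: false, error: err.message });
      }
    };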

Wilhelmstrasse answered 13/2, 2019 at 10:16

Application-wide uncaught exceptions in Node.js apps on Google Cloud Platform need to be reported manually.
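
As a minimal sketch (assuming the default credentials available in the Cloud Functions runtime), the @google-cloud/error-reporting client can be used to report them:

    const { ErrorReporting } = require('@google-cloud/error-reporting');
    const errors = new ErrorReporting();

    // Report application-level crashes manually. Note that an out-of-memory
    // kill by the platform ends the instance before these handlers can run.
    process.on('uncaughtException', (err) => {
      errors.report(err);
    });
    process.on('unhandledRejection', (reason) => {
      errors.report(String(reason));
    });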

That said, more details on the memory limit exceeded error may already be in the logs. You only need to search for the error message in the Logs Viewer in the GCP console, as shown in the docs, or use advanced filters, e.g. to search by time. The documentation also explains how to write log entries from your Cloud Functions. You can then use the Stackdriver Logging API, for instance, to export the logs and get them as JSON.
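
For instance, a rough sketch of pulling matching entries as JSON with the @google-cloud/logging client (the function name in the filter is a placeholder):

    const { Logging } = require('@google-cloud/logging');
    const logging = new Logging();

    // Fetch recent Cloud Functions entries mentioning the OOM message.
    async function getOomEntries() {
      const [entries] = await logging.getEntries({
        filter: [
          'resource.type="cloud_function"',
          'resource.labels.function_name="resizeImage"',
          'textPayload:"memory limit exceeded"',
        ].join(' AND '),
        orderBy: 'timestamp desc',
        pageSize: 20,
      });
      // Each entry converts to a plain, JSON-serializable object.
      return entries.map((e) => e.toJSON());
    }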

I would also suggest using Stackdriver Monitoring to track the memory usage of your Cloud Function.
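
Cloud Functions report a per-execution memory metric to Stackdriver Monitoring; as a sketch, it can be read with the @google-cloud/monitoring client (the project ID and function name are placeholders):

    const monitoring = require('@google-cloud/monitoring');
    const client = new monitoring.MetricServiceClient();

    // Read memory usage samples for one function over the last hour.
    async function memoryUsage(projectId) {
      const now = Math.floor(Date.now() / 1000);
      const [series] = await client.listTimeSeries({
        name: client.projectPath(projectId),
        filter:
          'metric.type="cloudfunctions.googleapis.com/function/user_memory_bytes"' +
          ' AND resource.labels.function_name="resizeImage"',
        interval: {
          startTime: { seconds: now - 3600 },
          endTime: { seconds: now },
        },
      });
      return series;
    }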

Shaman answered 25/4, 2019 at 14:13 Comment(1)
If this answer was helpful, could you accept it? Thank you. – Shaman
