Why do I get the error "Code uncompressed size is greater than max allowed size of 272629760" only for some deployment targets?

As part of an AWS CodePipeline, an AWS CodeBuild action deploys resources created with the Serverless Framework to a "UAT" (user acceptance testing) stage. The pipeline runs in its own tooling AWS account: it first deploys cross-account into a separate "UAT" account, then cross-account into a separate "Production" account.

The first deployment to "UAT" completes successfully, whereas the subsequent deployment to "Production" fails with the following error:

Serverless Error ----------------------------------------
 
  An error occurred: <some name>LambdaFunction - Resource handler returned message: "Code uncompressed size is greater than max allowed size of 272629760. (Service: Lambda, Status Code: 400, Request ID: <some request id>, Extended Request ID: null)" (RequestToken: <some request token>, HandlerErrorCode: InvalidRequest).
 
  Get Support --------------------------------------------
     Docs:          docs.serverless.com
     Bugs:          github.com/serverless/serverless/issues
     Issues:        forum.serverless.com
 
Your Environment Information ---------------------------
     Operating System:          linux
     Node Version:              14.17.2
     Framework Version:         2.68.0 (local)
     Plugin Version:            5.5.1
     SDK Version:               4.3.0
     Components Version:        3.18.1

This started to happen once I introduced a private Lambda Layer. The total size of all files seems well below the maximum allowed size.

This question isn't so much about the actual error (a similar question already exists). Rather, I wonder why the behavior is inconsistent across deployment targets, since the limits for the Lambda function package size (including Lambda Layers) should be the same for all environments.
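
For reference, a private layer in the Serverless Framework is typically declared roughly along these lines (a minimal sketch with placeholder names and paths, not the actual configuration from this project):

service: my-service

provider:
  name: aws
  runtime: nodejs14.x

layers:
  shared:
    path: layer                      # local directory packaged as the layer, e.g. layer/nodejs/node_modules
    compatibleRuntimes:
      - nodejs14.x

functions:
  someName:
    handler: handler.main
    layers:
      - { Ref: SharedLambdaLayer }   # logical ID is <LayerName>LambdaLayer

The uncompressed size limit applies to the function code plus all attached layers combined, which is why attaching a layer can push an otherwise small function over the limit.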

Talc answered 10/12, 2021 at 9:11 Comment(3)
Exact same boat here. We added a layer and now we get that max size error too. Did you find a solution? Or are you still dealing with it? – Drily
Ran into a similar issue. Tried to incorporate a (tiny) layer into an existing lambda that had been deployed via S3 due to its size. During terraform apply, I got the error Code uncompressed size is greater than max allowed size of 272629760 only when the layer was in play. Had to deploy a dummy version of the lambda with no dependencies, add the layer, then redeploy. Annoyingly misleading error message, as it made it seem like the size of the layer was the problem. – Flaherty
Do you use Node.js? Could you show us the package.json? – License

This is a weird issue which I also faced when moving to Lambda layers.

To fix this, as a workaround:

  • Comment out the Lambda layers configuration in your serverless.yml file and deploy
  • Uncomment the Lambda layers configuration in your serverless.yml file and deploy again

These steps may look weird, but they worked for me; a sketch of the two deploys is below.
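
A minimal sketch of what the two deploys might look like in serverless.yml (the function and layer names are placeholders, not taken from the answer):

First deploy, with the layer reference commented out:

functions:
  someName:
    handler: handler.main
    # layers:
    #   - { Ref: SharedLambdaLayer }

Second deploy, with the layer reference restored:

functions:
  someName:
    handler: handler.main
    layers:
      - { Ref: SharedLambdaLayer }

Each step is an ordinary sls deploy: the first update pushes the function without the layer attached, and the second re-attaches it.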

Teetotal answered 23/9, 2023 at 1:48 Comment(0)

I encountered an issue where the following configuration in serverless.yaml did not work as expected:

package:
  excludeDevDependencies: true

To resolve this, I manually excluded the development dependencies directly in package.json. After making this adjustment, I ran npm install to install the necessary packages. Finally, I was able to successfully deploy my application using sls deploy.
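
One way to read "manually excluded" is that the dev-only packages were removed from package.json (leaving only runtime packages under dependencies) before running npm install, so they never land in the node_modules that gets packaged. A rough sketch, with purely illustrative names and versions:

{
  "name": "my-service",
  "version": "1.0.0",
  "dependencies": {
    "axios": "^0.24.0"
  }
}

With the devDependencies block gone, npm install followed by sls deploy packages only what the function actually needs at runtime.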

I hope this helps others who might face the same problem!

Rabelaisian answered 26/6 at 13:22 Comment(0)
