AWS CloudFormation update Lambda Code to use latest version in S3 bucket
I'm trying to create a CloudFormation template supporting a Lambda function and an AWS CodeBuild project for building .NET Core source code into a deployed zip file in an S3 bucket. Here are the particulars:

  • Using a GitHub mono-repo with multiple Lambda functions as separate projects in the .NET Core solution
  • Each Lambda function (i.e. .NET Core project) has a CloudFormation YAML file generating a stack containing the Lambda function itself and the CodeBuild project.
  • The CodeBuild project is triggered by a GitHub webhook, retrieves the code for the sub-project, and uses its buildspec.yml to govern how the build should happen.
  • The buildspec builds the project with .NET Core, then zips and copies the output to a target S3 bucket
  • The Lambda function points to the S3 bucket for its source code

This is all working just fine. What I'm struggling with is how to update the Lambda function to use the updated compiled source code in the S3 bucket.

Here is a subset of the CloudFormation template:

Resources:
  Lambda:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: roicalculator-eventpublisher
      Handler: RoiCalculator.Serverless.EventPublisher::RoiCalculator.Serverless.EventPublisher.Function::FunctionHandler
      Code:
        S3Bucket: deployment-artifacts
        S3Key: RoiCalculatorEventPublisher.zip
      Runtime: dotnetcore2.1

  CodeBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: RoiCalculator-EventPublisher-Master
      Artifacts:
        Location: deployment-artifacts
        Name: RoiCalculatorEventPublisher.zip
        Type: S3
      Source:
        Type: GITHUB
        Location: https://github.com/XXXXXXX
        BuildSpec: RoiCalculator.Serverless.EventPublisher/buildspec.yml

Here is a subset of the buildspec.yml:

phases:
  install:
    runtime-versions:
      dotnet: 2.2
    commands:
      - dotnet tool install -g Amazon.Lambda.Tools
  build:
    commands:
      - dotnet restore
      - cd RoiCalculator.Serverless.EventPublisher
      - dotnet lambda package --configuration release --framework netcoreapp2.1 -o ./bin/release/netcoreapp2.1/RoiCalculatorEventPublisher.zip
      - aws s3 cp ./bin/release/netcoreapp2.1/RoiCalculatorEventPublisher.zip s3://deployment-artifacts/RoiCalculatorEventPublisher.zip

You can see the same artifact name (RoiCalculatorEventPublisher.zip) and S3 bucket (deployment-artifacts) are used in the buildspec (for generating and copying) and in the CloudFormation template (for the Lambda function's source).

Since I'm overwriting the application code in the S3 bucket using the same file name Lambda is using, why is Lambda not being updated with the latest code?

How do version numbers work? Is it possible to have a 'system variable' containing the name of the artifact (file name + version number) and access the same 'system variable' in the buildspec AND the CloudFormation template?

What's the secret sauce for using a CloudFormation template to generate source code (via buildspec) using CodeBuild as well as update the Lambda function which consumes the generated code?

Thank you.

Krupp answered 26/9, 2019 at 6:4 Comment(1)
Possible duplicate of this question. – Thirteenth
Unfortunately, unless you change the "S3Key" on the 'AWS::Lambda::Function' resource on every update, CloudFormation will not see it as a change (it does not look inside the zipped code for changes).

Options:

Option 1) Update the S3 key with every upload
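
A sketch of Option 1, assuming a hypothetical `ArtifactVersion` parameter (e.g. the commit hash) that the build passes in on every deploy:

    Parameters:
      ArtifactVersion:
        Type: String          # e.g. git commit hash, supplied by the build

    Resources:
      Lambda:
        Type: AWS::Lambda::Function
        Properties:
          FunctionName: roicalculator-eventpublisher
          Handler: RoiCalculator.Serverless.EventPublisher::RoiCalculator.Serverless.EventPublisher.Function::FunctionHandler
          Code:
            S3Bucket: deployment-artifacts
            # The key changes on every build, so CloudFormation sees an update
            S3Key: !Sub 'RoiCalculatorEventPublisher-${ArtifactVersion}.zip'
          Runtime: dotnetcore2.1

The buildspec would then upload the zip to the versioned key and run `aws cloudformation deploy` with `--parameter-overrides ArtifactVersion=$CODEBUILD_RESOLVED_SOURCE_VERSION`.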

Option 2) The recommended approach is to author the Lambda template with AWS SAM, then use the "cloudformation package" command to package the template, which takes care of creating a unique S3 key and uploading the file to the bucket. Details here: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-deploying.html

Edit 1:

In response to your comment, let me add some details of the SAM approach, which uses CloudFormation as the deployment tool for your Lambda function in your pipeline. The basic idea for deploying a Lambda function is as follows:

1) Create a SAM template for your Lambda function

2) A basic SAM template looks like:

    AWSTemplateFormatVersion: '2010-09-09'
    Transform: 'AWS::Serverless-2016-10-31'
    Resources:
      FunctionName:
        Type: 'AWS::Serverless::Function'
        Properties:
          Handler: index.handler
          Runtime: nodejs6.10
          CodeUri: ./code

3) Add a directory "code" and keep the lambda code files in this directory
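
With the template above, the resulting project layout would look like this (the file name follows the template's `index.handler`):

    .
    ├── template.yaml
    └── code/
        └── index.js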

4) Install the AWS SAM CLI [1]

5) Run the command to package and upload:

$ sam package --template-file template.yaml --output-template packaged.yaml --s3-bucket {your_S3_bucket}

6) Deploy the package:

$ aws cloudformation deploy --template-file packaged.yaml --stack-name stk1 --capabilities CAPABILITY_IAM

You can keep the template code (steps 1-3) in CodeCommit/GitHub and do steps 4-5 in a CodeBuild step. For step 6, I recommend doing it via a CloudFormation action in CodePipeline that is fed the "packaged.yaml" file as an input artifact.
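
For the .NET case in the question, the two tools compose in one buildspec: `dotnet lambda package` builds the C# bits into a zip first, and `aws cloudformation package` then uploads that zip under a unique S3 key and rewrites the template to point at it. A sketch (template and output paths are assumptions):

    phases:
      install:
        runtime-versions:
          dotnet: 2.2
        commands:
          - dotnet tool install -g Amazon.Lambda.Tools
      build:
        commands:
          - dotnet restore
          - cd RoiCalculator.Serverless.EventPublisher
          # build the function code into a zip, exactly as before
          - dotnet lambda package --configuration release --framework netcoreapp2.1 -o ./artifact/code.zip
          # upload the zip under a unique key and rewrite CodeUri to point at it
          - aws cloudformation package --template-file template.yaml --s3-bucket deployment-artifacts --output-template-file packaged.yaml
    artifacts:
      files:
        - RoiCalculator.Serverless.EventPublisher/packaged.yaml

Here `template.yaml` would be a SAM template with `CodeUri: ./artifact/code.zip`; the produced `packaged.yaml` is what the CloudFormation deploy action consumes. GitHub-triggered builds keep working because only the build commands change, not the webhook setup.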

See also [2].

References:

[1] Installing the AWS SAM CLI on Linux - https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install-linux.html

[2] Building a Continuous Delivery Pipeline for a Lambda Application with AWS CodePipeline - https://docs.aws.amazon.com/en_us/lambda/latest/dg/build-pipeline.html

Hayse answered 26/9, 2019 at 8:43 Comment(2)
So your recommended approach would be to change the CloudFormation template to use the SAM approach, then modify the buildspec commands to package the template? Does your suggestion fit into the buildspec approach I'm using? Will I still be able to have automatic builds and deploys after checking code into GitHub? Or is this a different approach? – Krupp
I'm unsure what the proper order of operations is for your recommended approach. How does cloudformation package work with a buildspec that uses .NET Core to build the C# bits? – Krupp
  • I am using `aws s3 sync` instead of `aws s3 cp` and have never had this problem.
  • I am working on a project with a serverless architecture and multiple Lambdas, where we have multiple folders, each containing just a Python file and a requirements.txt file.
  • Usually the directory and the Lambda are named the same for convenience, e.g. the folder email_sender has its Python file as email_sender.py and a requirements.txt if it needs one.
  • In CodeBuild, after installing the dependencies, this is how we zip (run per function directory, with ${d} holding the directory name):

      d=$(tr "_" "-" <<< "${d}")   # lambda names use hyphens, directories use underscores
      echo "--- Compiling lambda zip: ${d}.zip"
      zip -q -r ${d}.zip . --exclude ".gitignore" --exclude "requirements.txt" --exclude "*__pycache__/*" > /dev/null 2>&1
      mv ${d}.zip ../../${CODEBUILD_SOURCE_VERSION}/${d}.zip

  • And when copying to the S3 bucket we use sync as follows:

      aws s3 sync ${CODEBUILD_SOURCE_VERSION}/ ${S3_URI} --exclude "*" --include "*.zip" --sse aws:kms --sse-kms-key-id ${KMS_KEY_ALIAS} --content-type "binary/octet-stream" --exact-timestamps

Ury answered 26/9, 2019 at 8:46 Comment(1)
So you're saying that the currently deployed Lambda function always references the newer code/zip without having to update anything in the stack template via the AWS/SAM CLI? – Coral

There are two other options:

  1. Add an AWS CLI step in the pipeline that calls

    update-function-code

  2. New Deployment Options for AWS Lambda
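
The first option can be sketched as a `post_build` addition to the question's buildspec: after the zip is copied to S3, tell Lambda to pull it directly, with no CloudFormation change required (names taken from the question):

    post_build:
      commands:
        # point the function at the freshly uploaded zip
        - aws lambda update-function-code --function-name roicalculator-eventpublisher --s3-bucket deployment-artifacts --s3-key RoiCalculatorEventPublisher.zip

Note that this updates the function out of band, so the stack's `Code` properties no longer reflect what is actually deployed.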

Poi answered 23/6, 2023 at 15:50 Comment(0)
