AWS CodePipeline: pass Lambda function output to CloudFormation
I want to run a CloudFormation template with CodePipeline. The template expects an input parameter that needs to contain the current date/time. Unfortunately, CloudFormation isn't able to generate the current date/time by itself out of the box.

My approach was to first run a simple Lambda function that creates the current timestamp and saves it as an output artifact (OutputArtifacts). The subsequent CloudFormation task imports this artifact via InputArtifacts, reads the value of the DateTime attribute, and passes it to CloudFormation through the ParameterOverrides instruction.

Unfortunately, CodePipeline keeps saying the DateTimeInput parameter is invalid (obviously the GetArtifactAtt lookup failed). I assume the Lambda output (Python: print) doesn't get saved as an artifact properly?

Do you know how to pass the Lambda output correctly, or is there a better way to achieve this?

All pipeline components are defined in CloudFormation YAML. Here are the relevant parts:

Lambda Function:

Resources:
  ...
  GetDateTimeFunction:
    Type: AWS::Lambda::Function
    Properties:
      Handler: index.lambda_handler
      Runtime: python2.7
      Timeout: '10'
      Role: !GetAtt GetDateTimeFunctionExecutionRole.Arn
      Code:
        ZipFile: |
                import datetime
                import boto3
                import json

                code_pipeline = boto3.client('codepipeline')

                def lambda_handler(event, context):
                  now = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
                  responseData = {'DateTime':now}
                  print json.dumps(responseData)
                  response = code_pipeline.put_job_success_result(jobId=event['CodePipeline.job']['id'])
                  return response

Here are the pipeline tasks:

Resources:
...
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ArtifactStore:
        Location: !Ref ArtifactStoreBucket
        Type: S3
      DisableInboundStageTransitions: []
      Name: !Ref PipelineName
      RoleArn: !GetAtt PipelineRole.Arn
      Stages:
        - Name: Deploy
          Actions:
            - Name: GetDateTime
              RunOrder: 1
              ActionTypeId:
                Category: Invoke
                Owner: AWS
                Provider: Lambda
                Version: '1'
              Configuration:
                 FunctionName: !Ref GetDateTimeFunction
              OutputArtifacts:
                - Name: GetDateTimeOutput
            - Name: CreateStack
              RunOrder: 2
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: '1'
              InputArtifacts:
                - Name: TemplateSource
                - Name: GetDateTimeOutput
              Configuration:
                ActionMode: REPLACE_ON_FAILURE
                Capabilities: CAPABILITY_IAM
                RoleArn: !GetAtt CloudFormationRole.Arn
                StackName: !Ref CFNStackname
                TemplatePath: !Sub TemplateSource::${CFNScriptfile}
                TemplateConfiguration: !Sub TemplateSource::${CFNConfigfile}
                ParameterOverrides: |
                  {
                    "DateTimeInput" : { "Fn::GetArtifactAtt" : [ "GetDateTimeOutput", "DateTime" ] }
                  }

Update: I was too naive and thought there would be a simple way. Now I know that just delivering a simple output artifact with Lambda is a more involved, manual task.

Inside the Python code, one must evaluate the passed event dictionary (CodePipeline.job) to look up:
- the predefined output artifact's S3 bucket/key, and
- the temporary S3 session credentials provided by CodePipeline.
Then an S3 client must be initialized with these credentials, and S3 put_object must run afterwards.
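The lookup steps above can be sketched as a small helper, assuming the standard event shape that CodePipeline passes to a Lambda invoke action (the function name `evaluate` is just illustrative):

```python
def evaluate(event):
    """Extract the output artifact location and the temporary S3
    credentials from a CodePipeline Lambda invoke event."""
    job = event['CodePipeline.job']
    # Location (bucket/key) where CodePipeline expects the output artifact
    artifact = job['data']['outputArtifacts'][0]['location']['s3Location']
    # Temporary credentials scoped to the pipeline's artifact store
    creds = job['data']['artifactCredentials']
    return (job['id'],
            artifact['bucketName'],
            artifact['objectKey'],
            creds['accessKeyId'],
            creds['secretAccessKey'],
            creds['sessionToken'])
```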

https://docs.aws.amazon.com/codepipeline/latest/userguide/actions-invoke-lambda-function.html
https://forums.aws.amazon.com/thread.jspa?threadID=232174

So my question again: do you have an idea how to achieve this in a better or simpler way?
I merely want to pass the current date and time as an input parameter to CloudFormation, without breaking the automation.

Kilo answered 30/10, 2018 at 13:10
Yes, I wasn't aware of the need to manually handle the output artifacts myself.

Finally this did the trick:

  GetDateTimeFunction:
    Type: AWS::Lambda::Function
    Properties:
      Handler: index.lambda_handler
      Runtime: python2.7
      Timeout: '10'
      Role: !GetAtt GetDateTimeFunctionExecutionRole.Arn
      Code:
        ZipFile: |
                from __future__ import print_function
                from boto3.session import Session
                from zipfile import ZipFile
                import json
                import datetime
                import boto3
                import botocore
                import traceback
                import os
                import shutil
                import uuid


                code_pipeline = boto3.client('codepipeline')

                def evaluate(event):
                    # Extract attributes passed in by CodePipeline
                    job_id = event['CodePipeline.job']['id']
                    job_data = event['CodePipeline.job']['data']

                    config = job_data['actionConfiguration']['configuration']
                    credentials = job_data['artifactCredentials']

                    output_artifact = job_data['outputArtifacts'][0]
                    output_bucket = output_artifact['location']['s3Location']['bucketName']
                    output_key = output_artifact['location']['s3Location']['objectKey']

                    # Temporary credentials to access CodePipeline artifact in S3
                    key_id = credentials['accessKeyId']
                    key_secret = credentials['secretAccessKey']
                    session_token = credentials['sessionToken']

                    return (job_id, output_bucket, output_key, key_id, key_secret, session_token)

                def create_artifact(data):
                    artifact_dir = '/tmp/output_artifacts/'+str(uuid.uuid4())
                    artifact_file = artifact_dir+'/files/output.json'
                    zipped_artifact_file = artifact_dir+'/artifact.zip'
                    try:
                        shutil.rmtree(artifact_dir+'/files/')
                    except Exception:
                        pass
                    try:
                        os.remove(zipped_artifact_file)
                    except Exception:
                        pass
                    os.makedirs(artifact_dir+'/files/')
                    with open(artifact_file, 'w') as outfile:
                        json.dump(data, outfile)
                    with ZipFile(zipped_artifact_file, 'w') as zipped_artifact:
                        zipped_artifact.write(artifact_file, os.path.basename(artifact_file))

                    return zipped_artifact_file

                def init_s3client (key_id, key_secret, session_token):
                    session = Session(aws_access_key_id=key_id, aws_secret_access_key=key_secret, aws_session_token=session_token)
                    s3client = session.client('s3', config=botocore.client.Config(signature_version='s3v4'))
                    return s3client

                def lambda_handler(event, context):
                    # Extract the job ID up front so it is available in the except block
                    job_id = event['CodePipeline.job']['id']
                    try:
                        (job_id, output_bucket, output_key, key_id, key_secret, session_token)=evaluate(event)
                        (s3client)=init_s3client(key_id, key_secret, session_token)

                        now=datetime.datetime.now().strftime('%Y-%m-%d_%H:%M:%S')
                        data={"DateTime":now}

                        (zipped_artifact_file)=create_artifact(data)

                        s3client.upload_file(zipped_artifact_file, output_bucket, output_key, ExtraArgs={"ServerSideEncryption": "AES256"})

                        # Tell CodePipeline we succeeded
                        code_pipeline.put_job_success_result(jobId=job_id)

                    except Exception as e:
                        print("ERROR: " + repr(e))
                        message=repr(e)
                        traceback.print_exc()
                        # Tell CodePipeline we failed
                        code_pipeline.put_job_failure_result(jobId=job_id, failureDetails={'message': message, 'type': 'JobFailed'})

                    return "complete"


  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ArtifactStore:
        Location: !Ref ArtifactStoreBucket
        Type: S3
      DisableInboundStageTransitions: []
      Name: !Ref PipelineName
      RoleArn: !GetAtt PipelineRole.Arn
      Stages:
        - Name: S3Source
          Actions:
            - Name: TemplateSource
              RunOrder: 1
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: S3
                Version: '1'
              Configuration:
                S3Bucket: !Ref ArtifactStoreBucket
                S3ObjectKey: !Ref SourceS3Key
              OutputArtifacts:
                - Name: TemplateSource
        - Name: Deploy
          Actions:
            - Name: GetDateTime
              RunOrder: 1
              ActionTypeId:
                Category: Invoke
                Owner: AWS
                Provider: Lambda
                Version: '1'
              Configuration:
                 FunctionName: !Ref GetDateTimeFunction
              OutputArtifacts:
                - Name: GetDateTimeOutput
            - Name: CreateStack
              RunOrder: 2
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: '1'
              InputArtifacts:
                - Name: TemplateSource
                - Name: GetDateTimeOutput
              Configuration:
                ActionMode: REPLACE_ON_FAILURE
                Capabilities: CAPABILITY_IAM
                RoleArn: !GetAtt CloudFormationRole.Arn
                StackName: !Ref CFNStackname
                TemplatePath: !Sub TemplateSource::${CFNScriptfile}
                TemplateConfiguration: !Sub TemplateSource::${CFNConfigfile}
                ParameterOverrides: |
                  {
                    "DateTimeInput" : { "Fn::GetParam" : ["GetDateTimeOutput", "output.json", "DateTime"]}
                  }

Quite a lot of overhead for such a trivial task ;-)

Kilo answered 1/11, 2018 at 19:41
You should use "Fn::GetParam" instead of "Fn::GetArtifactAtt". According to the CloudFormation documentation, "Fn::GetArtifactAtt" can only get an artifact's attributes such as BucketName, ObjectKey, and URL. "Fn::GetParam" can get a value from a JSON file inside the artifact. Therefore, if you generate the artifact "GetDateTimeOutput" as a zip file that includes a JSON file (e.g. param.json) with the following content

{ "DateTime": "2018/10/31 13:32:00" }

Then you can use { "Fn::GetParam" : [ "GetDateTimeOutput", "param.json", "DateTime" ] } to get the time.

You could either modify your Lambda function to do this, or use a CodeBuild action instead. CodeBuild takes care of creating the zip; you just need to specify the build commands that create a JSON file in the output folder. You can find more information on how to use CodeBuild in CodePipeline in the documents below.
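As a sketch, a minimal buildspec for such a CodeBuild action could look like this (the file name param.json is just an example and must match the key used in "Fn::GetParam"):

```yaml
version: 0.2

phases:
  build:
    commands:
      # Write the current timestamp into a JSON file
      - echo "{ \"DateTime\": \"$(date +%Y%m%d%H%M%S)\" }" > param.json

artifacts:
  files:
    # CodeBuild zips the listed files into the action's output artifact
    - param.json
```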

CloudFormation Document https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/continuous-delivery-codepipeline-parameter-override-functions.html#w2ab1c13c17b9

CodeBuild Document https://docs.aws.amazon.com/codebuild/latest/userguide/how-to-create-pipeline.html

Kiki answered 31/10, 2018 at 20:44
