How can I deploy Google Cloud Functions in CI/CD without re-deploying unchanged Cloud Functions to avoid the quota? [closed]
Cloud Build has a create quota of 30. If we have more than 30 Cloud Functions, this quota is easily reached. Is there a way people use to deploy more than 30 Cloud Functions, preferably one that is smart enough not to redeploy unmodified Cloud Functions?

Rose answered 22/7, 2021 at 16:17

Comment: This question says "There seems to be a technical limit and I would like to know what to do in case this limit is reached." Why is this an "audit", for checking attention? – Leta

Following our conversation in the GCP community Slack channel, here is an idea with a small example. The example shows one Cloud Function, but it can easily be extended to an arbitrary set of Cloud Functions (a sketch of that extension is at the end of this answer).

Bear in mind that this is not my invention; one can find plenty of examples on the internet.

The CI/CD uses Terraform inside Cloud Build (simply speaking, the Cloud Build YAML file contains `terraform init` and `terraform apply`). Thus, a push (or pull request) triggers a Cloud Build job, which executes Terraform.
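
As a minimal sketch of that pipeline file (the file name, image tag, and flags here are assumptions, not taken from the original setup):

# cloudbuild.yaml (assumed name) - runs Terraform on every push
steps:
  # Initialize providers and the backend that stores the Terraform state
  - id: terraform-init
    name: hashicorp/terraform:1.5
    args: ["init"]

  # Apply the configuration; Terraform itself decides what actually changed
  - id: terraform-apply
    name: hashicorp/terraform:1.5
    args: ["apply", "-auto-approve"]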

In the scope of this question, the Terraform script needs 4 elements:

1/ The name of the zip archive with the Cloud Function code, as it should appear in the GCS bucket:

locals {
  # The SHA of the zipped source is embedded in the object name,
  # so the name changes whenever the function code changes
  cf_zip_archive_name = "cf-some-prefix-${data.archive_file.cf_source_zip.output_sha}.zip"
}

2/ A zip archive:

data "archive_file" "cf_source_zip" {
  type        = "zip"
  source_dir  = "${path.module}/..<<path + directory to the CF code>>"
  output_path = "${path.module}/tmp/some-name.zip"
}

3/ A GCS object in a bucket (assuming the bucket already exists or is created outside the scope of this question; a minimal sketch of the bucket follows the block below):

resource "google_storage_bucket_object" "cf_source_zip" {
  name         = local.cf_zip_archive_name
  source       = data.archive_file.cf_source_zip.output_path
  content_type = "application/zip"
  bucket       = google_storage_bucket.cf_source_archive_bucket.name
}
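
If the bucket is not managed elsewhere, a minimal sketch of it could look like this (the bucket name and location are assumptions):

resource "google_storage_bucket" "cf_source_archive_bucket" {
  name     = "my-project-cf-source-archives"  # assumed name; bucket names must be globally unique
  location = "EU"                             # assumed location
}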

4/ The Cloud Function (only the 2 relevant parameters are shown; a fuller sketch follows the block below):

resource "google_cloudfunctions_function" "sf_integrations" {

  source_archive_bucket = google_storage_bucket.cf_source_archive_bucket.name
  source_archive_object = google_storage_bucket_object.cf_source_zip.name

}
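
For reference, a fuller version of that resource might look as follows; the function name, runtime, entry point, and HTTP trigger are assumptions for illustration:

resource "google_cloudfunctions_function" "sf_integrations" {
  name         = "sf-integrations"  # assumed function name
  runtime      = "python39"         # assumed runtime
  entry_point  = "handler"          # assumed entry point in the CF code
  trigger_http = true               # assumed trigger type

  # These two lines are what make the change detection work
  source_archive_bucket = google_storage_bucket.cf_source_archive_bucket.name
  source_archive_object = google_storage_bucket_object.cf_source_zip.name
}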

How it all works together:

When Terraform is triggered, the zip file is recreated. If the Cloud Function code has been modified, the SHA hash of the zip file is different, so the local variable with the GCS object name gets a new value, and the zip file is uploaded to the GCS bucket under the new name. Because the source code object now has a new name (source_archive_object = google_storage_bucket_object.cf_source_zip.name) while the state file still holds the old name of the archive object, Terraform finds out that the Cloud Function has to be redeployed, and redeploys it.

On the other hand, if the code is not modified, the name source_archive_object = google_storage_bucket_object.cf_source_zip.name does not change, so Terraform does not deploy anything.

Obviously, if other parameters of the function are modified, the redeployment goes ahead anyway.
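
To extend this to an arbitrary set of Cloud Functions (the case in the question), one sketch is to drive everything with for_each from a map of source directories; the directory layout, function names, runtime, and trigger below are assumptions:

locals {
  # Map of function name => directory containing its source code (assumed layout)
  functions = {
    "func-a" = "${path.module}/../functions/func-a"
    "func-b" = "${path.module}/../functions/func-b"
  }
}

data "archive_file" "cf_zip" {
  for_each    = local.functions
  type        = "zip"
  source_dir  = each.value
  output_path = "${path.module}/tmp/${each.key}.zip"
}

resource "google_storage_bucket_object" "cf_zip" {
  for_each     = local.functions
  # The hash in the object name means only changed functions get a new object
  name         = "cf-${each.key}-${data.archive_file.cf_zip[each.key].output_sha}.zip"
  source       = data.archive_file.cf_zip[each.key].output_path
  content_type = "application/zip"
  bucket       = google_storage_bucket.cf_source_archive_bucket.name
}

resource "google_cloudfunctions_function" "cf" {
  for_each = local.functions

  name         = each.key
  runtime      = "python39"  # assumed runtime
  entry_point  = "handler"   # assumed entry point
  trigger_http = true        # assumed trigger type

  source_archive_bucket = google_storage_bucket.cf_source_archive_bucket.name
  source_archive_object = google_storage_bucket_object.cf_zip[each.key].name
}

Only the functions whose zip hash changed get a new object name, so Terraform redeploys exactly those and leaves the rest untouched, which keeps the number of deployments per run well under the quota.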

Declared answered 22/7, 2021 at 16:52
