Connect Google Cloud Build to Google Cloud SQL
Google Cloud Run allows for using Cloud SQL. But what if you need Cloud SQL when building your container in Google Cloud Build? Is that possible?

Background

I have a Next.js project that runs in a container on Google Cloud Run. Pushing my code to Cloud Build (installing dependencies, generating static pages, and putting everything in a container) and deploying to Cloud Run works perfectly. 👌

Cloud SQL

But I just added some functionality that also needs some data from my PostgreSQL instance running on Google Cloud SQL. This data is used when building the project (generating the static pages).

Locally, on my machine, this works fine, as the project can connect to my Cloud SQL Proxy. It should also work while running in Cloud Run, since Cloud Run allows connecting to my Postgres instance on Cloud SQL.

My problem

When building my project with Cloud Build, I need access to my database to generate my static pages. I am looking for a way to connect the Docker cloud builder to Cloud SQL, perhaps via a mechanism like the one Cloud Run (fully managed) provides, which connects using the Cloud SQL Proxy.

That way I could connect to /cloudsql/INSTANCE_CONNECTION_NAME while building my project!

Question

So my question is: How do I connect to my PostgreSQL instance on Google Cloud SQL via the Cloud SQL Proxy while building my project on Google Cloud Build?

Things like my database credentials already live in Secret Manager, so I should be able to use those details, I guess 🤔
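For reference, Cloud Build can expose Secret Manager entries to individual build steps via `availableSecrets`. A minimal sketch, assuming a secret named `db-password` and an npm-based build (all names here are placeholders):

```yaml
steps:
  - name: 'gcr.io/cloud-builders/npm'
    entrypoint: 'sh'
    secretEnv: ['DB_PASSWORD']
    args:
      - -c
      - |
        # The secret is available as an env var only inside this step
        DB_PASSWORD="$$DB_PASSWORD" npm run build

availableSecrets:
  secretManager:
    - versionName: projects/$PROJECT_ID/secrets/db-password/versions/latest
      env: 'DB_PASSWORD'
```

Note the `$$` escaping, which tells Cloud Build to pass `$DB_PASSWORD` through to the shell instead of treating it as a substitution.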

Sorel answered 16/12, 2020 at 15:4 Comment(0)

You can use whatever container you need to generate your static pages, and download the Cloud SQL Proxy inside it to open a tunnel to the database:

  - name: '<YOUR CONTAINER>'
    entrypoint: 'sh'
    args:
      - -c
      - |
        # Download the Cloud SQL Proxy binary and make it executable
        wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
        chmod +x cloud_sql_proxy
        # Open a local TCP tunnel to the instance, in the background
        ./cloud_sql_proxy -instances=<my-project-id:us-central1:myPostgresInstance>=tcp:5432 &
        <YOUR SCRIPT>        
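One caveat: the proxy starts in the background, so the build can race it before the tunnel is up. A sketch that waits for the port first (the instance name and build command are placeholders, and `nc` must exist in the container):

```sh
# Start the proxy in the background, then poll until the local port
# accepts connections before running the build.
./cloud_sql_proxy -instances=<my-project-id:us-central1:myPostgresInstance>=tcp:5432 &

for i in $(seq 1 30); do
  nc -z 127.0.0.1 5432 && break
  sleep 1
done

DB_HOST=127.0.0.1 DB_PORT=5432 npm run build
```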

Brumaire answered 16/12, 2020 at 16:42 Comment(3)
Thank you, but that won't work for me since I want this just in the build stage, not in the whole container. But, inspired by your input, I added the following to my Dockerfile: RUN GOOGLE_APPLICATION_CREDENTIALS=$(pwd)/service_account.json \ ./cloud_sql_proxy -instances=foobar:europe-west4:baz=tcp:5432 & \ GOOGLE_APPLICATION_CREDENTIALS=$(pwd)/service_account.json DB_HOST='127.0.0.1' npm run build && rm -rf .next/cache after installing the Cloud SQL Proxy and injecting the SERVICE_ACCOUNT_FILE via a base64-encoded build-arg.Sorel
Would you mind explaining the <YOUR CONTAINER> value a bit more? I’m also struggling with the same issue at OP and am trying to understand things a bit more. Would you use something like “gcr.io/cloud-builders/yarn” for <YOUR CONTAINER> and then have everything to build nextjs container image + push image to registry + deploy cloud run all under <YOUR SCRIPT>?Roldan
Your container is whichever container you want, with the binaries you need installed. It could be NodeJS, a specific runtime, or whatever. If you use wget, you must use a container with wget installed, or install it yourself with apt-get or another package manager (depending on your container OS)Brumaire

App Engine has an exec wrapper which has the benefit of proxying Cloud SQL for you, so I use that to connect to the DB in Cloud Build (as do some Google tutorials).

However, be warned of trouble ahead: Cloud Build runs exclusively* in us-central1 which means it'll be pathologically slow to connect from anywhere else. For one or two operations, I don't care but if you're running a whole suite of integration tests that simply will not work.

Also, you'll need to grant the Cloud Build service account permission to access Cloud SQL.

steps:
  - id: 'Connect to DB using appengine wrapper to help'
    name: gcr.io/google-appengine/exec-wrapper
    args:
      [
        '-i',  # The image you want to connect to the db from
        '$_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME:$SHORT_SHA',
        '-s',  # The postgres instance
        '${PROJECT_ID}:${_POSTGRES_REGION}:${_POSTGRES_INSTANCE_NAME}',
        '-e',  # Get your secrets here...
        'GCLOUD_ENV_SECRET_NAME=${_GCLOUD_ENV_SECRET_NAME}',
        '--', # And then the command you want to run, in my case a database migration
        'python',
        'manage.py',
        'migrate',
      ]

substitutions:
  _GCLOUD_ENV_SECRET_NAME: mysecret
  _GCR_HOSTNAME: eu.gcr.io
  _POSTGRES_INSTANCE_NAME: my-instance
  _POSTGRES_REGION: europe-west1
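Granting that access might look like the following (the project ID and project number are placeholders); `roles/cloudsql.client` is the role the proxy connection needs:

```sh
# Give the Cloud Build service account permission to connect to
# Cloud SQL instances in the project. PROJECT_NUMBER is the numeric
# project number, not the project ID.
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/cloudsql.client"
```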

* unless you're willing to pay more and get very stung by Beta software, in which case you can use cloud build workers (at the time of writing are in Beta, anyway... I'll come back and update if they make it into production and fix the issues)

Erythromycin answered 20/12, 2020 at 18:5 Comment(0)

Environment variables (including DB connection details) are not available during build steps. However, you can use Docker's ENTRYPOINT to run commands when the container starts (after the build steps have completed).

I needed to run DB migrations when a new build was deployed (i.e. when the container starts running), and using ENTRYPOINT (pointing at a file/command) I was able to run the migrations (which require DB connection details that are not available during the build process).

"How to" part is pretty brief and is located here : https://mcmap.net/q/2031264/-which-is-the-right-way-to-run-laravel-migrations-using-google-cloud-run-and-google-cloud-sql

Shareeshareholder answered 7/9, 2021 at 14:15 Comment(0)
