Is there a way to access google cloud SQL via proxy inside docker container
Asked Answered

I have multiple Docker machines (dev, staging) running on a Google Compute Engine instance, each hosting a Django server that needs access to Google Cloud SQL. I have multiple Google Cloud SQL instances running, and each instance is used by the corresponding Docker machine on my Compute Engine instance.

Currently I'm accessing Cloud SQL by whitelisting my Compute Engine IP, but I don't want to rely on IPs for obvious reasons, i.e., my dev machines don't use static IPs.

Now I want to use the Cloud SQL Proxy to gain access instead. But how do I do that? GCP offers multiple ways to access Cloud SQL instances, but none of them fit my use case:

There is this option, https://cloud.google.com/sql/docs/mysql/connect-compute-engine, but:

  1. It only gives the Compute Engine instance itself access to the SQL instance, which I then have to reach from inside Docker.
  2. It doesn't let me proxy multiple SQL instances on the same Compute Engine machine; I was hoping to run the proxy inside Docker if possible.

So, how do I gain access to Cloud SQL from inside Docker? If Docker Compose is a better way to start, how easy is it to carry over to Kubernetes (I use Google Container Engine for production)?

Turnery answered 23/8, 2017 at 10:11 Comment(6)
A single Cloud SQL proxy can proxy multiple instances. What is the reason you need to have multiple proxies?Lampley
I have been reading up and realised what you said is true, so my 2nd question is invalid now. Do you have any thoughts on Q1: how can I access this proxy connection inside the individual Docker containers?Turnery
I'm not sure I fully understand the question. You can run the proxy as a separate docker image (cloud.google.com/sql/docs/mysql/connect-docker) and then connect to it from your docker image.Lampley
Based on your answer, I can see you understand my question. connect-docker is what I meant by using docker-compose in my question. I see docker-compose is an option, but I'm just exploring whether it's the best one.Turnery
If you connect from GCE instances with static IPs, you can choose to whitelist those IPs and connect directly by IP. If you don't want to maintain IP whitelists, then using the proxy docker container is your best option.Lampley
Yeah. For some reasons we decided not to use the whitelisting IPs approach. Apparently this is the option I'm left with.. thanksTurnery

I was able to figure out how to use cloudsql-proxy in my local Docker environment by using docker-compose. You will need to pull down your Cloud SQL instance credentials and have them ready. I keep them in my project root as credentials.json and add that file to my project's .gitignore.

The key part I found was appending =tcp:0.0.0.0:5432 after the GCP instance ID so that the port is forwarded. Then, in your application, use cloudsql-proxy instead of localhost as the hostname. Make sure the rest of your DB credentials are valid in your application secrets so that it can connect through the local proxy supplied by the cloudsql-proxy container.
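For a Django app, as in the question, the application-side change might look like the sketch below. The compose service name cloudsql-proxy becomes the DB host; the environment variable names (DB_NAME, DB_USER, DB_PASSWORD) are assumptions, not something defined by the proxy:

```python
import os

# Sketch of a Django DATABASES setting that connects through the
# cloudsql-proxy service from docker-compose.yml. The DB_* env
# variable names here are illustrative assumptions.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        # The compose service name, not localhost or 127.0.0.1:
        "HOST": os.environ.get("DB_HOST", "cloudsql-proxy"),
        "PORT": os.environ.get("DB_PORT", "5432"),
        "NAME": os.environ.get("DB_NAME", ""),
        "USER": os.environ.get("DB_USER", ""),
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
    }
}
```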

Note: Keep in mind I'm writing a Tomcat Java application, and my docker-compose.yml reflects that.

docker-compose.yml:

version: '3'
services:
  cloudsql-proxy:
    container_name: cloudsql-proxy
    image: gcr.io/cloudsql-docker/gce-proxy:1.11
    command: /cloud_sql_proxy -dir=/cloudsql -instances=<YOUR INSTANCE ID HERE>=tcp:0.0.0.0:5432 -credential_file=/secrets/cloudsql/credentials.json
    ports:
      - 5432:5432
    volumes:
      - ./credentials.json:/secrets/cloudsql/credentials.json
    restart: always

  tomcatapp-api:
    container_name: tomcatapp-api
    build: .
    volumes:
      - ./build/libs:/usr/local/tomcat/webapps
    ports:
      - 8080:8080
      - 8000:8000
    env_file:
      - ./secrets.env
    restart: always
Traumatize answered 24/1, 2018 at 21:4 Comment(9)
interesting, thanks Dan, we used similar approach :) thanks for posting your answerTurnery
Could I know why restart: always for the cloudsql-proxy?Sherburne
I would like to run on GCE via such docker-compose.yml. However, my app couldn't connect to cloudsql-proxy. It returns gaierror: [Errno -2] Name or service not known. Do you know how to resolve? 3xSherburne
Can't you just connect via 'cloudsql-proxy' instead of '0.0.0.0'?Triplenerved
Spent 2 days debugging, this is the correct answer as of August 2021Bobbery
I am getting this error: gcloud is not in the path and -instances and -projects are emptyCaritta
I am running docker compose on cloud shell but getting this error. dockerpycreds.errors.StoreError: Credentials store docker-credential-gcloud exited with "".Alroy
Sorry, wrong answer, cloudsql-proxy instead of localhost is unusable. The rest of this is helpful though.Pejsach
Actually the whole thing doesn't work, no matter what I do it won't talk to cloud_sql_proxy, so something is missing.Pejsach

With Cloud SQL Proxy >= 2.0, the accepted answer was helpful for getting started, but we continued to have a lot of challenges connecting to the proxy from another container. The best explanation that we could build off came from this blog post: https://towardsdatascience.com/how-to-connect-to-gcp-cloud-sql-with-cloud-sql-auth-proxy-in-docker-99bdf810c498

Our final solution ended up looking like this (a streamlit dashboard connecting to cloud sql via the sql proxy):

services:
  cloudsql-proxy:
    container_name: cloudsql-proxy
    image: gcr.io/cloud-sql-connectors/cloud-sql-proxy:2.5.0
    command: <INSTANCE_ID> --credentials-file=/secrets/cloudsql/credentials.json --address 0.0.0.0 --port 5432
    networks:
      - dashboard
    ports:
      - 127.0.0.1:5432:5432
    volumes:
      - ./dashboard/credentials.json:/secrets/cloudsql/credentials.json

  dashboard:
    build: ./dashboard/
    working_dir: /dashboard/
    environment:
      - "DASHBOARD_DB_HOST=cloudsql-proxy"
      - "DASHBOARD_DB_NAME=$DASHBOARD_DB_NAME"
      - "DASHBOARD_DB_USER=$DASHBOARD_DB_USER"
      - "DASHBOARD_DB_PASSWORD=$DASHBOARD_DB_PASSWORD"
    networks:
      - dashboard
    ports:
      - "8051:8051"
    depends_on:
      - "cloudsql-proxy"
networks:
  dashboard:
    name: dashboard
    driver: bridge

Hopefully this is helpful for those coming to this question in 2023 and onwards.
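The dashboard code itself isn't shown in this answer, but on the application side the DASHBOARD_DB_* variables from the compose file might be consumed along these lines (a sketch; the DSN format shown is standard Postgres, and the port 5432 matches the proxy config above):

```python
import os

def proxy_dsn() -> str:
    """Build a Postgres DSN from the DASHBOARD_DB_* variables
    defined in the compose file. Illustrative only -- the actual
    dashboard code is not part of the answer."""
    host = os.environ.get("DASHBOARD_DB_HOST", "cloudsql-proxy")
    name = os.environ.get("DASHBOARD_DB_NAME", "postgres")
    user = os.environ.get("DASHBOARD_DB_USER", "postgres")
    password = os.environ.get("DASHBOARD_DB_PASSWORD", "")
    # Port 5432 matches --port 5432 on the proxy container.
    return f"postgresql://{user}:{password}@{host}:5432/{name}"
```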

Ivyiwis answered 12/7, 2023 at 16:21 Comment(0)

For macOS users, you can use the following as POSTGRES_HOST:

host.docker.internal

like

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "<DB-NAME>",
        "HOST": "host.docker.internal",
        "PORT": "<YOUR-PORT>",
        "USER": "<DB-USER>",
        "PASSWORD": "<DB-USER-PASSWORD>",
    },
}

Your localhost will be forwarded into the container.

Irvin answered 19/1, 2023 at 17:46 Comment(1)
This is what I needed. Replacing 127.0.0.1 with host.docker.internal allowed my webapp to work through a container.Lillylillywhite

You can refer to the Google documentation here: https://cloud.google.com/sql/docs/postgres/connect-admin-proxy#connecting-docker

That will show you how to run the proxy in a container. Then you can use docker-compose as in the answer @Dan suggested here: https://mcmap.net/q/491250/-is-there-a-way-to-access-google-cloud-sql-via-proxy-inside-docker-container

docker run -d \
  -v PATH_TO_KEY_FILE:/config \
  -p 127.0.0.1:5432:5432 \
  gcr.io/cloudsql-docker/gce-proxy:1.19.1 /cloud_sql_proxy \
  -instances=INSTANCE_CONNECTION_NAME=tcp:0.0.0.0:5432 \
  -credential_file=/config
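One practical wrinkle with any of these proxy setups: the proxy container needs a moment to authenticate before it starts listening, so an application that connects immediately on startup can see "connection refused". A small retry helper like the following (an illustrative sketch, not part of the proxy or the Google docs) can guard against that:

```python
import socket
import time

def wait_for_proxy(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP listener (e.g. the Cloud SQL proxy) accepts
    connections, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # Succeeds as soon as the proxy is accepting connections.
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```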
Caritta answered 16/10, 2021 at 19:11 Comment(0)

© 2022 - 2025 — McMap. All rights reserved.