Google CloudSQLAdmin - The service account does not have the required permissions for the bucket

I am writing a Python function which uses service account credentials to call the Google Cloud SQL Admin API to export a database to a bucket.

The service account has been given project owner permissions, and the bucket has permissions set for project owners. The SQL Admin API has been enabled for our project.

Python code:

from google.oauth2 import service_account
from googleapiclient.discovery import build
import json

def main():
    SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin',
              'https://www.googleapis.com/auth/cloud-platform',
              'https://www.googleapis.com/auth/devstorage.full_control']
    SERVICE_ACCOUNT_FILE = './creds/service-account-credentials.json'
    PROJECT = "[REDACTED]"
    DB_INSTANCE = "[REDACTED]"
    BUCKET_PATH = "gs://[REDACTED]/[REDACTED].sql"
    DATABASES = [REDACTED]
    BODY = {  # Database instance export request.
        "exportContext": {  # Details about the export operation.
            "kind": "sql#exportContext",  # This is always sql#exportContext.
            "fileType": "SQL",  # SQL: the file contains SQL statements. CSV: the file contains CSV data.
            "uri": BUCKET_PATH,  # Path of the form gs://bucketName/fileName. If the file already exists, the request succeeds but the operation fails. If fileType is SQL and the filename ends with .gz, the contents are compressed.
            "databases": DATABASES,
        },
    }

    credentials = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES)
    sqladmin = build('sqladmin', 'v1beta4', credentials=credentials)
    response = sqladmin.instances().export(
        project=PROJECT, instance=DB_INSTANCE, body=BODY).execute()
    print(json.dumps(response, sort_keys=True, indent=4))

if __name__ == '__main__':
    main()

Running this code nets the following error:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "[REDACTED]/main.py", line 47, in hello_pubsub
    response = sqladmin.instances().export(project=PROJECT, instance=DB_INSTANCE, body=BODY).execute()
  File "/usr/local/lib/python3.7/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/googleapiclient/http.py", line 851, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/sql/v1beta4/projects/[REDACTED]/instances/[REDACTED]/export?alt=json returned "The service account does not have the required permissions for the bucket.">

I have tried this across 2 GCP projects, with multiple service accounts with varying permissions.

Related question: Access denied for service account (permission issue?) when importing a csv from cloud storage to cloud sql. That issue was caused by incorrect permissions, which shouldn't be the case here, as the account has project owner permissions.

Renvoi answered 15/2, 2019 at 17:49 Comment(2)
I finally figured this one out. Google doesn't actually mention this in their documentation, but each SQL instance has a corresponding service account. It uses that service account to export the data, so you must give it access to the target bucket.Renvoi
When exporting through the console (web GUI), the service account is automatically added to the bucket permissions. If it has not been exported that way before, you need to add the account yourself.Autoeroticism

Google Cloud uses a system of identity and access management for managing resources: IAM.

Each Cloud SQL instance uses a corresponding service account that holds its permissions. To find your Cloud SQL service account name, go to:

Console > SQL > Instance Name > Service account
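
You can also look it up programmatically. Here is a minimal sketch reusing the discovery client from the question; the credentials path, project, and instance names are the question's placeholders, and the instance resource exposes the account as serviceAccountEmailAddress:

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Sketch: fetch the Cloud SQL instance's built-in service account e-mail.
    # The credentials file, project, and instance names below are the
    # question's placeholders.
    credentials = service_account.Credentials.from_service_account_file(
        './creds/service-account-credentials.json',
        scopes=['https://www.googleapis.com/auth/sqlservice.admin'])
    sqladmin = build('sqladmin', 'v1beta4', credentials=credentials)
    instance = sqladmin.instances().get(project='[REDACTED]', instance='[REDACTED]').execute()
    print(instance.get('serviceAccountEmailAddress'))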

With this service account name, you can grant it an access control role on the bucket, choosing from: Storage Admin, Storage Object Admin, Storage Object Creator, or Storage Object Viewer.

Following the principle of least privilege, you only need to grant Storage Object Creator to allow the export to write to the Cloud Storage bucket.

Detailed storage access roles descriptions are included in the documentation.
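
If you want to add that binding from code rather than the console, here is a minimal sketch assuming the google-cloud-storage client library; the bucket name and the instance service-account e-mail are placeholders:

    from google.cloud import storage

    # Sketch: grant the Cloud SQL instance's service account permission to
    # create objects in the export bucket. Both names below are placeholders.
    BUCKET_NAME = 'my-export-bucket'
    SQL_SA_EMAIL = '[email protected]'

    client = storage.Client()
    bucket = client.bucket(BUCKET_NAME)
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        'role': 'roles/storage.objectCreator',
        'members': {f'serviceAccount:{SQL_SA_EMAIL}'},
    })
    bucket.set_iam_policy(policy)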


Colorless answered 29/3, 2019 at 10:31 Comment(3)
This answered it for me. Deserves that pretty green check.Quincyquindecagon
Actually, to strictly follow the "least privilege principle", only the Storage Object Creator (or legacyBucketWriter) permission is required for a Cloud SQL instance export to a GCS bucket, which is the equivalent of the gcloud sql export sql command. See cloud.google.com/sql/docs/postgres/import-export/exportingGownsman
Which PRINCIPAL_TYPE should we use when granting the service account access to the bucket? (in the link you gave)Histrionism

Slightly off-topic, but it should be mentioned: The error message

(gcloud.sql.import.csv) HTTPError 403: The service account does not have the required permissions for the bucket.

also occurs if the file doesn't exist in the bucket, or if you used wildcards (as mentioned in gcloud sql import csv fails with permissions error when import file contains wildcard). So before spending hours debugging the permissions, first double-check that the file is really there (in my case a cleanup cron ran in the meantime and my test file got removed...).
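
A quick way to double-check, as a minimal sketch assuming the google-cloud-storage client library; the bucket and object names are placeholders:

    from google.cloud import storage

    # Sketch: confirm the object really exists before importing; the bucket
    # and object names are placeholders.
    client = storage.Client()
    blob = client.bucket('my-bucket').blob('path/to/import.csv')
    if not blob.exists():
        print('Object not found; the 403 about bucket permissions may be misleading')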

Aleydis answered 10/12, 2021 at 8:57 Comment(0)

These are the minimal permission assignments that worked for me.

Keep in mind that there are two relevant service accounts. The first is a service account that is automatically created when provisioning a GCP Cloud SQL instance. This service account will have a name similar to [email protected].

The second service account is one you create yourself; let's call this service account cloud-sql-export. Let's also assume that our GCP project ID is my_gcp_project_id.

Here are the permissions:

  1. Create a policy that grants the storage.admin role to the Cloud SQL service account, and attach this policy to the bucket you want to export the data to. Here is the Terraform code:

     data "google_iam_policy" "cloud_sql_bucket_admin" {
       binding {
         role = "roles/storage.admin"
         members = [
           "serviceAccount:[email protected]",
         ]
       }
     }
    
     resource "google_storage_bucket_iam_policy" "mypolicy" {
       bucket      = "mybucketname"
       policy_data = data.google_iam_policy.cloud_sql_bucket_admin.policy_data
     }
    
  2. For the other permissions, I created a role and assigned it to the custom cloud-sql-export service account:

     resource "google_project_iam_custom_role" "cloudsql-export-bucket-role" {
       role_id     = "cloudsqlExportBucketRole"
       title       = "Cloud SQL Export to Bucket Role"
       description = "Export from Cloud SQL and write to buckets"
       permissions = ["cloudsql.instances.get", "cloudsql.instances.export"]
     }
     resource "google_project_iam_member" "cloudsql_archive_role_bind" {
       project = "my_gcp_project_id"
       role    = "projects/my_gcp_project_id/roles/cloudsqlExportBucketRole"
       member  = "serviceAccount:cloud-sql-export@my_gcp_project_id.iam.gserviceaccount.com"
     }
    

(This is an expansion on Lucy Nunley's comment to the original question.)

Review answered 13/10, 2021 at 19:25 Comment(1)
OP didn't say anything about using Terraform...?Aleydis
