Permissions For Google Cloud SQL Import Using Service Accounts

I've successfully exported a MySQL database following the MySQL Export Guide.

Now I'm trying to import that MySQL database following the MySQL Import Guide.

I've checked the permissions for the service_account_email I'm using, and it has both the Cloud SQL Admin and Storage Admin roles.
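
For reference, the project-level roles bound to a service account can be listed with something like this (where <project_id> is a placeholder):

gcloud projects get-iam-policy <project_id> --flatten="bindings[].members" --filter="bindings.members:serviceAccount:<service_account_email>" --format="value(bindings.role)"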

I was able to successfully activate my service account using this command locally:

gcloud auth activate-service-account <service_account_email> --key-file=<service_account_json_file>  

After I ran the command:

gcloud sql import sql <instance> <gstorage_file> --database=<db_name> --async

I got this error response:

{
  "error": {
    "errors": Array[1][
      {
        "domain": "global",
        "reason": "required",
        "message": "Login Required",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Login Required"
  }
}

Other Things I've Tried

I also tried using the service_account_email of my SQL instance, which came from:

gcloud sql instances describe <instance_name>

But it results in the same error.

Question

Based on the REST API JSON error above, how do I "log in" using the service_account_email so that I don't get the 401 error?

Palace answered 13/6, 2018 at 1:56 Comment(2)
Some questions: do you have Cloud SQL and GCS in the same project? Do you have 2nd Generation instances? If the answer to all of these is yes, have you provided the permissions mentioned in step 6 (add the service account to the bucket ACL as a writer) and step 7 (add the service account to the import file as a reader) of the manual you posted?Barmaid
Yes, I have CloudSQL and GCStorage in the same project. My MySQL instance is a 2nd Generation instance. I have added my service account as a writer and reader successfully (following the MySQL Import Guide).Palace

The problem is that the database instance's service account does not have permission on the bucket. Steps to solve this issue:

1) Go to your Cloud SQL instance and copy the instance's service account (Cloud SQL -> {instance name} -> OVERVIEW -> Service account).

2) After copying the service account, go to the Cloud Storage bucket you are importing from (or dumping to) and grant that account the desired permission (Storage -> {bucket name} -> Permissions -> Add member); a command-line equivalent is sketched below.
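
A rough command-line sketch of the same two steps, assuming gcloud and gsutil are installed and INSTANCE_NAME, PROJECT_ID, and BUCKET_NAME are placeholders for your own values:

# Step 1: look up the Cloud SQL instance's service account email
SA_EMAIL=$(gcloud sql instances describe INSTANCE_NAME --project=PROJECT_ID --format="value(serviceAccountEmailAddress)")
# Step 2: grant it read access on the bucket (objectAdmin also covers exports)
gsutil iam ch serviceAccount:${SA_EMAIL}:objectViewer gs://BUCKET_NAME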

Keble answered 4/12, 2018 at 8:43 Comment(4)
Worked like a charm after an hour of frustration. Thank you!Mclean
As of now it is <bucket name> - Permissions->Grant Access. And the exact role to add is 'Storage Object Creator'Proctoscope
Thank you! They should definitely show the relevant service account email in these errors.Saurian
If you need to import a file from a bucket into cloudSQL then you need Storage Object ViewerKlaus

The Cloud SQL instance is running under a Google-managed service account that is not part of your project. You will need to grant this account permissions on the file in Cloud Storage that you want to import. Here is a handy-dandy bash snippet that will do that.

SA_NAME=$(gcloud sql instances describe YOUR_DB_INSTANCE_NAME --project=YOUR_PROJECT_ID --format="value(serviceAccountEmailAddress)")
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql

The first line gets the service account email address. The next line gives this service account read permissions on the bucket. The last line gives the service account read permissions on the file.
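
With those ACLs in place, the import from the question should be able to read the file; a usage sketch with the same placeholders (YOUR_DATABASE_NAME is also a placeholder):

gcloud sql import sql YOUR_DB_INSTANCE_NAME gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql --database=YOUR_DATABASE_NAME --project=YOUR_PROJECT_ID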

Deryl answered 14/1, 2019 at 18:1 Comment(1)
If you want to remove the permission afterwards (this removes all ACL entries for the account): gsutil acl ch -d ${SA_NAME} gs://YOUR_BUCKET_NAME and gsutil acl ch -d ${SA_NAME} gs://YOUR_BUCKET_NAME/file.sqlVasta

Google also has some of the worst error reporting around. If you get this error message, it might also be that you entered a path incorrectly. In my case it was the path to my bucket directory. Go figure, I don't have permission to access a bucket that doesn't exist. Technically correct, but hardly useful.
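
One way to rule that out is to check that the object actually exists before importing, for example (bucket and path are placeholders):

gsutil ls gs://<bucket_name>/<path_to_dump_file>.sql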

Pirzada answered 20/10, 2020 at 18:51 Comment(2)
Thank you, this was it for me. I am grateful I scrolled all the way down.Cinchonine
Had this issue. Hopefully this might save someone some time... ``` PROJECT_ID=$(gcloud info --format='value(config.project)') SA_NAME=$(gcloud sql instances describe ${DB_NAME} --project=${PROJECT_ID} --format="value(serviceAccountEmailAddress)") gsutil acl ch -u ${SA_NAME}:R gs://${PROJECT_ID} gsutil acl ch -u ${SA_NAME}:R gs://${PROJECT_ID}/${DB_NAME}.sql gcloud sql import sql ${DB_NAME} gs://${PROJECT_ID}/${DB_NAME}.sql --database ${DB_NAME} ```Radiate

According to the Google docs:

  • Describe the instance you are importing to:
    gcloud sql instances describe INSTANCE_NAME
  • Copy the serviceAccountEmailAddress field.
  • Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket.
    gsutil iam ch serviceAccount:SERVICE-ACCOUNT:objectAdmin gs://BUCKET-NAME
  • Then import the database (a command-line sketch with placeholder values follows below).
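
Putting those steps together, a minimal sketch where INSTANCE_NAME, BUCKET-NAME, DB_NAME, and dump.sql are placeholders:

# Describe the instance and grab its service account email
SERVICE_ACCOUNT=$(gcloud sql instances describe INSTANCE_NAME --format="value(serviceAccountEmailAddress)")
# Grant storage.objectAdmin on the bucket
gsutil iam ch serviceAccount:${SERVICE_ACCOUNT}:objectAdmin gs://BUCKET-NAME
# Import the database
gcloud sql import sql INSTANCE_NAME gs://BUCKET-NAME/dump.sql --database=DB_NAME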
Dashing answered 13/2, 2022 at 12:43 Comment(0)

After performing some research, and based on the permission error, this is what I find most useful for troubleshooting the issue: you might need to grant an additional IAM role such as roles/storage.admin to the service account in question, so that ACLs and permissions are easier to test; see the documentation for more information.
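
For example, granting that role at the project level could look like this (project and service account email are placeholders):

gcloud projects add-iam-policy-binding <project_id> --member="serviceAccount:<service_account_email>" --role="roles/storage.admin"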

Hokku answered 14/6, 2018 at 15:27 Comment(2)
Probably not. I've done all of those things. It could be that it has something to do with our SQL or .gz dump file, although the backup .gz file came from SQL itself, but I'm still working it out with support. Will let you know. But, this is the right method.Palace
I am with GCP support too. It is better if the issue is solved through support since they will have more information/tools. Once it is solved, share the answer in this post so the community can benefit from the lessons learnt, please.Barmaid
