Retrieving Automated Google Cloud SQL Backups

My team has a requirement that we be able to retrieve a backup of our database (hosted on Google Cloud SQL) and restore that database to a locally hosted instance of MySQL.

I know that Google Cloud SQL has the ability to schedule backups, but these don't appear to be available to download anywhere.

I also know that we are able to "export" our database to Google Cloud Storage, but we'd like to be able to schedule the "export".

The end goal here is to execute the following steps in some sort of an admin script:

  1. Automatically backup our database that is hosted on Google Cloud SQL.
  2. Download the backup to a local (not cloud) server.
  3. Restore backup to a local instance of MySQL.

Any ideas?

Manion answered 2/1, 2013 at 18:55 Comment(1)
3 years later, jmwicks has the correct answer. – Reactive

gcloud sdk commands now provide import/export functionality:

gcloud sql export sql <DATABASE_INSTANCE> \
    gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz \
    --database <DATABASE_NAME>

The export can be downloaded using gsutil and restored into a local MySQL instance with the mysql client (mysqlimport is for tab-delimited/CSV data, not SQL dumps).
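For example, a minimal sketch of scripting the export-and-download step; the instance, bucket, and database names are placeholders, and the commands are only built here so you can inspect them before running:

```python
import subprocess

def export_cmd(instance, bucket, database, dump="export.sql.gz"):
    """Build the gcloud argv that exports a Cloud SQL database to GCS."""
    return ["gcloud", "sql", "export", "sql", instance,
            f"gs://{bucket}/cloudsql/{dump}", "--database", database]

def download_cmd(bucket, dump="export.sql.gz", dest="."):
    """Build the gsutil argv that copies the dump to the local machine."""
    return ["gsutil", "cp", f"gs://{bucket}/cloudsql/{dump}", dest]

def run_backup(instance, bucket, database):
    """Execute export then download, raising on the first failure."""
    subprocess.run(export_cmd(instance, bucket, database), check=True)
    subprocess.run(download_cmd(bucket), check=True)

# Usage (placeholder names, assumes gcloud/gsutil are authenticated):
#   run_backup("my-instance", "my-bucket", "my_db")
```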

Darla answered 29/2, 2016 at 18:5 Comment(2)
You also need to grant the service account for the SQL instance (gcloud sql instances describe <instance-name> | grep -i account) the Storage Object Creator role so it can create the dump file in the bucket. – Rheumatic
This command is deprecated and will be removed in version 205.0.0. Use gcloud sql export sql instead. – Buatti

I ran into the same problem; my solution was:

  1. Go to IAM Service Accounts Management
  2. Create a new service account (I called it sql-backuper) and download its access key as JSON
  3. Grant it the Viewer and Storage Object Creator roles on the main IAM page (GCloud currently has no separate read-only role for SQL)
  4. Set it up on the machine that will do backups: gcloud auth activate-service-account [email protected] --key-file /home/backuper/gcloud-service-account.json (gcloud auth documentation)
  5. Create a new bucket at GCloud Storage Browser
  6. Now on your backup machine you can run: gcloud sql instances export [sql-instance-name] gs://[bucket-name]/[file-name].gz --database [your-db-name] (gcloud sql documentation) and gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz (gsutil cp documentation)
  7. You've got a local DB copy which you can now use as you want
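The steps above can be chained into one cron-driven script covering the question's full export → download → restore goal. A rough sketch, assuming the service account is already activated on the backup machine, the local mysql client reads credentials from ~/.my.cnf, and the newer gcloud sql export sql form mentioned in the comments (all names are placeholders):

```python
import subprocess

def backup_pipeline(instance, bucket, database, dump="backup.sql.gz"):
    """Return the three commands: export, download, local restore."""
    return [
        # 1. Export the Cloud SQL database to Cloud Storage.
        ["gcloud", "sql", "export", "sql", instance,
         f"gs://{bucket}/{dump}", "--database", database],
        # 2. Pull the dump down to the local machine.
        ["gsutil", "cp", f"gs://{bucket}/{dump}", dump],
        # 3. Restore into the local MySQL instance.
        ["bash", "-c", f"gunzip -c {dump} | mysql {database}"],
    ]

def run_pipeline(instance, bucket, database):
    """Run each step in order, stopping on the first failure."""
    for cmd in backup_pipeline(instance, bucket, database):
        subprocess.run(cmd, check=True)

# Usage (placeholder names):
#   run_pipeline("sql-instance-name", "bucket-name", "your-db-name")
```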
Overcapitalize answered 9/8, 2016 at 18:54 Comment(1)
this should be the accepted answer as it covers all requirements. – Earthbound

Note that you can now trigger an Export operation using the Cloud SQL REST API.

So your admin script can do that and then download the backup from Cloud Storage (You'll need to wait until the export operation finishes though).
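A sketch of what that API call could look like, using the current sqladmin endpoint (the v1beta URLs from 2013 have since been replaced); the project, instance, bucket, and token values are placeholders the caller must supply:

```python
import json
import time
import urllib.request

API = "https://sqladmin.googleapis.com/v1"

def export_request(project, instance, bucket, database, token):
    """Build the export POST; caller supplies an OAuth2 access token."""
    body = {"exportContext": {
        "fileType": "SQL",
        "uri": f"gs://{bucket}/export.sql.gz",
        "databases": [database],
    }}
    return urllib.request.Request(
        f"{API}/projects/{project}/instances/{instance}/export",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")

def wait_for_operation(project, op_name, token, poll_secs=10):
    """Poll the returned operation until it reports DONE."""
    url = f"{API}/projects/{project}/operations/{op_name}"
    while True:
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            op = json.load(resp)
        if op.get("status") == "DONE":
            return op
        time.sleep(poll_secs)
```

Once the operation is DONE, the script can fetch the dump from Cloud Storage as in the other answers.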

Panache answered 5/12, 2013 at 7:21 Comment(0)

Sorry, but Cloud SQL does not have this functionality currently. We'd like to make this easier in the future. In the meantime, you could use Selenium (or some other UI scripting framework) in combination with a cron job.

Greenhead answered 2/1, 2013 at 19:37 Comment(1)
Went ahead and marked this as the answer because it is. But this seems like a pretty big hole in functionality. – Manion

If you want to download a backup (manual or automated), you can launch another Cloud SQL instance and then:

  1. Open the backup (restore) options on the backed-up instance
  2. Choose to restore onto the instance you just launched
  3. Export the data from the newly restored Cloud SQL instance
  4. Retrieve the resulting .sql or .csv file from Cloud Storage
Kenelm answered 9/11, 2016 at 16:3 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.