How to restore single database from instance backup on GCP?

I am a beginner GCP administrator. I have several applications running on one instance. Each application has its own database. I set up automatic instance backup via the GCP GUI.

I would like to prepare for a possible failure of one of the applications, i.e. one database. I would like to have a procedure for restoring such a database, but the GCP GUI has no option to restore a single database; I would have to restore the entire instance, which I cannot do because the other applications are running on that instance.

I also read in the documentation that a backup cannot be exported.

Is there any way to restore only one database from the entire instance backup?

Will I have to write a MySQL script that backs up each database separately and saves it to Cloud Storage?

Valiant answered 2/3, 2020 at 12:44 Comment(2)
What is an "instance backup"? Are you using MySQL installed on a VM instance or Cloud SQL? – Klemens
@JohnHanley Cloud SQL – Valiant

As of now there is no way to restore only one database from an entire instance backup. As you can check in the documentation, restoring the instance also causes downtime for the rest of the applications (the target instance will be unavailable for connections and existing connections will be lost).

Since there is no built-in method to restore only one database from the entire instance backup, you are correct: you will need to back up each database separately and use import and export operations (here is the relevant documentation regarding import and export operations in the Cloud SQL for MySQL context).

From an implementation point of view, however, I would recommend using a separate Cloud SQL instance for each application. Then, if one particular application fails, you can restore its database without causing downtime or issues for the rest of the applications.
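
For illustration, the per-database export could be a small script run on a schedule; the instance name, bucket, and database names below are placeholders, so treat this only as a sketch:

    #!/bin/bash
    # Sketch: export each database of one Cloud SQL instance to Cloud Storage.
    # INSTANCE, BUCKET, and the database list are placeholders - replace with your own values.
    INSTANCE="my-instance"
    BUCKET="gs://my-backup-bucket"
    STAMP=$(date +%F)

    for DB in app1_db app2_db app3_db; do
      # Each export produces one compressed SQL dump per database.
      gcloud sql export sql "$INSTANCE" "$BUCKET/$DB-$STAMP.sql.gz" --database="$DB"
    done

Note that the instance's service account needs write access to the bucket for the export to succeed.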

Alika answered 4/3, 2020 at 10:28 Comment(0)

Like Daniel mentioned, you can use gcloud sql export/import to do this. You'll also need a Cloud Storage bucket.

First, export the database to a file:

gcloud sql export sql [instance-name] [gs://path-to-export-file.gz] --database=[database-name]

Then create an empty database:

gcloud sql databases create [new-database-name] --instance=[instance-name]

Use the export file to populate your fresh, empty database.

gcloud sql import sql [instance-name] [gs://path-to-export-file.gz] --database=[new-database-name]
Braddock answered 17/7, 2020 at 22:37 Comment(1)
Is there not any way to overwrite an existing db? I want to import the data itself from another db once rake has built out the tables. – Steere

I'm also a beginner here, but as an alternative, I think you could do the following:

  1. Create a new instance with the same configuration
  2. Restore the original backup into the new instance (this is possible)
  3. Create a dump of the one database that you are interested in
  4. Finally, import that dump into the production instance

In this way, you avoid messing around with data exports, limit the dump operation to the unlikely case of a restore, and save money on database instances.
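
For reference, a rough gcloud sketch of these four steps might look like the following; the instance names, backup ID, tier, and bucket path are placeholders, and you would likely drop and recreate the broken database on production before the final import:

    # 1. Create a temporary instance with a configuration matching production (placeholders).
    gcloud sql instances create temp-restore --database-version=MYSQL_5_7 --tier=db-n1-standard-1

    # 2. Find the backup on the production instance and restore it into the temporary instance.
    gcloud sql backups list --instance=prod-instance
    gcloud sql backups restore [BACKUP_ID] --restore-instance=temp-restore --backup-instance=prod-instance

    # 3. Dump only the database you need from the temporary instance to Cloud Storage.
    gcloud sql export sql temp-restore gs://my-bucket/app1_db-restore.sql.gz --database=app1_db

    # 4. Import that dump into the production instance, then delete the temporary instance.
    gcloud sql import sql prod-instance gs://my-bucket/app1_db-restore.sql.gz --database=app1_db
    gcloud sql instances delete temp-restore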

Curious what people think about this approach?

Stiegler answered 15/7, 2020 at 20:1 Comment(0)

I see that the topic has been raised again. Below is a description of how I solved the problem of backing up individual databases from one instance and uploading them to Cloud Storage, without using the built-in instance backup mechanism in GCP.

To solve the problem, I used Google Cloud Functions written in Node.js 8. Here is a step-by-step solution:

  1. Create a Cloud Storage Bucket.

  2. Create a Cloud Function using Node.js 8.

  3. Edit the code below to match your instance and database parameters:

    const {google} = require("googleapis");
    const {auth} = require("google-auth-library");
    var sqladmin = google.sqladmin("v1beta4");

    exports.exportDatabase = (_req, res) => {
      async function doBackup() {
        const authRes = await auth.getApplicationDefault();
        let authClient = authRes.credential;
        var request = {
          // Project ID
          project: "",
          // Cloud SQL instance ID
          instance: "",
          resource: {
            // Contains details about the export operation.
            exportContext: {
              // This is always sql#exportContext.
              kind: "sql#exportContext",
              // The file type for the specified uri (e.g. SQL or CSV).
              fileType: "SQL",
              /**
               * The path to the file in GCS where the export will be stored.
               * The URI is in the form gs://bucketName/fileName.
               * If the file already exists, the operation fails.
               * If fileType is SQL and the filename ends with .gz, the contents are compressed.
               */
              uri: "",
              /**
               * Databases from which the export is made.
               * If fileType is SQL and no database is specified, all databases are exported.
               * If fileType is CSV, you can optionally specify at most one database to export.
               * If csvExportOptions.selectQuery also specifies the database, this field will be ignored.
               */
              databases: [""]
            }
          },
          // Auth client
          auth: authClient
        };

        // Kick off the export with the requested arguments.
        sqladmin.instances.export(request, function(err, result) {
          if (err) {
            console.log(err);
          } else {
            console.log(result);
          }
          // Respond so the function invocation completes.
          res.status(200).send("Command completed");
        });
      }

      doBackup();
    };

  4. Save and deploy this Cloud Function.
  5. Copy the trigger URL from the configuration page of the Cloud Function.
  6. To run the function automatically at a specified frequency, create a Cloud Scheduler job: Description: "", Frequency: use unix-cron syntax, Time zone: choose yours, Target: HTTP, URL: paste the trigger URL copied in the previous step, HTTP method: POST (a gcloud equivalent is sketched below).
  7. That's all, it should work fine.
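
If you prefer the command line over the Console for steps 5 and 6, the function can presumably be triggered by hand and then scheduled with gcloud; the job name, schedule, time zone, and URL below are placeholders:

    # Trigger the function manually once to check that the export works (placeholder URL).
    curl -X POST "[TRIGGER-URL-OF-THE-CLOUD-FUNCTION]"

    # Then schedule it, e.g. every day at 03:00 (unix-cron syntax, placeholder values).
    gcloud scheduler jobs create http export-databases-job \
      --schedule="0 3 * * *" \
      --uri="[TRIGGER-URL-OF-THE-CLOUD-FUNCTION]" \
      --http-method=POST \
      --time-zone="Europe/Warsaw"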
Valiant answered 28/7, 2020 at 12:22 Comment(0)
