ERROR: (gcloud.sql.export.sql) HTTPError 409: Operation failed because another operation was already in progress

I'm running SQL exports (backups) via Jenkins. On a regular basis I receive: "ERROR: (gcloud.sql.export.sql) HTTPError 409: Operation failed because another operation was already in progress. ERROR: (gcloud.sql.operations.wait) argument OPERATION [OPERATION ...]: Must be specified."

I'm trying to determine where I can see which jobs are causing this to fail.

I've tried extending the gcloud sql operations wait --timeout to 1600, with no luck:

gcloud sql operations wait --timeout=1600

Hyperthyroidism answered 1/4, 2019 at 19:53 Comment(0)

To wait for an operation, you need to specify the ID of the operation, as @PYB said. Here's how you can do that programmatically, like in a Jenkins script:

$ gcloud sql operations list --instance=$DB_INSTANCE_NAME --filter='NOT status:done' --format='value(name)' | xargs -r gcloud sql operations wait
$ gcloud sql ... # whatever you need to do
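
As a rough sketch of how this could look inside a Jenkins backup step (the export bucket path and database name below are placeholders, not taken from the question):

$ # wait for any in-flight operations so the export does not hit HTTPError 409
$ gcloud sql operations list --instance=$DB_INSTANCE_NAME --filter='NOT status:done' --format='value(name)' | xargs -r gcloud sql operations wait --timeout=1600
$ # now it is safe to start the export
$ gcloud sql export sql $DB_INSTANCE_NAME gs://my-backup-bucket/backup.sql --database=$DB_NAME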
Scary answered 8/7, 2020 at 15:46 Comment(0)

There are two errors here that could be affecting you. The first is that an administrative operation is starting before the previous one has completed. Reading through this “Best Practices” doc for Cloud SQL will help you on that front: https://cloud.google.com/sql/docs/mysql/best-practices#admin Specifically, in the Operations tab of the console you can see the operations that are currently running.

The second error is that the [OPERATION] argument is missing from the command “gcloud sql operations wait --timeout=1600”. See the documentation for that command here: https://cloud.google.com/sdk/gcloud/reference/sql/operations/wait

OPERATION is the name of the running operation. If you wish to list all operations on the instance to find the right name, you can use this command: https://cloud.google.com/sdk/gcloud/reference/sql/operations/list.
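
For example, to see the most recent operations on the instance along with their type and status (the instance name here is just a placeholder):

$ gcloud sql operations list --instance=my-instance --limit=10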

Operation names are 36-character strings in hexadecimal (UUID) format, so your command should look something like this:
“gcloud sql operations wait aaaaaaaa-0000-0000-0000-000000000000 --timeout=1600”

Cheers

Ambidexterity answered 5/4, 2019 at 23:29 Comment(0)

I have the same problem during a long-running import:

gcloud sql import sql "mycompany-mysql-1" $DB_BACKUP_PATH --database=$DB_NAME -q

Does this really mean that if the import runs for an hour, I am not able to create any databases during that time? Really?

gcloud sql databases create $DB_NAME --instance="mycompany-mysql-1" --async

This is a big issue if you use gcloud inside CI/CD! Does anyone have an easy solution?

My idea so far (sketched below):

  1. download the backup from the cloud bucket onto the CI/CD runner
  2. connect to MySQL over the CLI and import the dump that way

But this means that whenever two tasks inside the CI/CD pipeline want to touch the instance at the same time, one of them will fail or has to wait. Very sad, if I got that correctly.
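
A minimal sketch of that workaround, assuming the dump sits in a Cloud Storage bucket and the CI/CD runner can reach the instance directly (the bucket path, host and credential variables are placeholders):

$ gsutil cp gs://my-backup-bucket/dump.sql .   # pull the dump onto the runner
$ mysql --host=$DB_HOST --user=$DB_USER --password="$DB_PASSWORD" $DB_NAME < dump.sql   # import via the MySQL client, so no Cloud SQL Admin API operation is created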

Bary answered 27/5, 2021 at 10:13 Comment(0)
