How do I run Django migrations on Google Cloud Platform?
I would like to host my app on Google App Engine, with a Google Cloud SQL instance for the database. The main part of the app is an API built with Django REST Framework. I asked Google Cloud Support about best practices for running migrations in production when I need to modify the database schema. Since I am new to web development, can anyone here with similar experience verify that the process they suggested (below) is something I can actually follow?

For database migration best practice, you could create a separate development/test/backup instance of your Cloud SQL database. For example, if your production DB instance is DB1, create a dev instance, DB2, that has all the tables of DB1. Then configure your app to point to the DB2 instance temporarily, making sure both instances are in sync and up to date. Deploy a new version of your app that points to DB2 so you can update DB1 (add new tables, columns) as your official database instance in the production environment. Afterwards, point the app back to DB1 and update DB2 the same way.
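Taken at face value, the support engineer's instance swap might look roughly like this with `gcloud` commands. This is a sketch only: the instance names (`db1`, `db2`) and the idea of switching the app's target via an `app.yaml` environment variable are assumptions, not anything Google prescribed.

```shell
# Sketch only; instance and project names are hypothetical.

# 1. Create/refresh the secondary instance as a copy of production:
gcloud sql instances clone db1 db2

# 2. Deploy a version of the app whose settings point at db2
#    (e.g. a DB_INSTANCE env variable in app.yaml set to
#    my-project:us-central1:db2):
gcloud app deploy

# 3. Apply schema changes (new tables, columns) to db1 while db2
#    serves traffic, then redeploy pointing back at db1 and repeat
#    the process for db2.
```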

Distinct answered 16/2, 2018 at 10:50

That's indeed a good practice, having a second Cloud SQL instance to perform your migration on. I'd suggest:

  1. Deploy the new App Engine version which uses the new db schema but don't direct traffic to it yet (use gcloud app deploy --no-promote)
  2. Clone your CloudSQL instance to create a new one where you'll run the migration
  3. On your local machine, configure the Cloud SQL Proxy to point to this new CloudSQL instance and run python manage.py migrate
  4. Once the migration has finished, direct traffic to the new App Engine version.
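The four steps above might look roughly like this on the command line. All project, region, instance, and version names are placeholders, and the Cloud SQL Proxy invocation uses the v1 syntax; treat this as a sketch, not a tested runbook.

```shell
# All names below are placeholders.

# 1. Deploy the new version with the new schema, without routing traffic to it:
gcloud app deploy --no-promote

# 2. Clone the production Cloud SQL instance to run the migration against:
gcloud sql instances clone prod-db prod-db-migrated

# 3. Locally, point the Cloud SQL Proxy at the clone and run the migration:
cloud_sql_proxy -instances=my-project:us-central1:prod-db-migrated=tcp:5432 &
python manage.py migrate

# 4. Route traffic to the new App Engine version:
gcloud app services set-traffic default --splits NEW_VERSION=1
```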

In a production environment your new CloudSQL instance will lack the data written to the first instance while you were performing steps 2 to 4. The easiest solution to avoid this situation altogether is to stop your App Engine app during the migration. If you cannot afford some downtime though, you'll need to track the changes made to the first instance after you've cloned it and apply these changes manually to the new instance.

Keys answered 22/2, 2018 at 15:2
Thanks for the answer. Will keep this blueprint in mind. – Distinct
Is performing the migration on a copied database worth the effort of syncing the changes written during this process? Surely it's better to just perform migrations on the live system rather than to schedule downtime? – Territorialize
This is a ridiculously inefficient solution. A database could be several terabytes large, and deploys may come several times an hour. – Graceless

I think an alternative to the proposed solution would be to create a read replica and direct your app's traffic to the read replica while performing such migrations on the write instance. (Note that replicas reject writes, so the app is effectively read-only during that window.)

Note this requires fast migrations.

It could also be a nightly operation or during low traffic periods where the site can be put into maintenance.
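Creating the replica itself is a one-liner with `gcloud`; the instance names below are hypothetical.

```shell
# Hypothetical names; creates a read replica of the primary instance.
gcloud sql instances create db1-replica \
    --master-instance-name=db1 \
    --region=us-central1
```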

For more highly available apps you might want to look at https://www.braintreepayments.com/blog/ruby-conf-australia-high-availability-at-braintree/

Prothonotary answered 24/3, 2021 at 16:52