Drupal database deployment strategies?

From this question: What's the best Drupal deployment strategy? I quote:

Databases are trickier; cleaning up the dev/staging DB and pushing it to live is easiest for the initial rollout but there are a few wrinkles when doing incremental DB updates if users on the live site are also generating content.

I want some ideas on how to do this. Currently I take a complete copy of the existing DB onto my local machine, commit it to Subversion, and then deploy the whole database. The dump is currently 15 MB, and each time I have to upload the whole file (I think Subversion treats it as an entirely new file, because so much of it changes between commits).

So, my questions are really:

  1. How can I get my DB size down when committing (other than committing less frequently)?
  2. Is there any other way to keep my local DB and the server DB in sync, especially considering that users will be posting new data all the time?
Westberg answered 30/9, 2009 at 10:45

Is there any other way to keep my local DB and the server DB in sync, especially considering that users will be posting new data all the time?

We have a large distributed team and editorial staff everywhere, so deploying the database is not feasible.

To get around this we make extensive use of update functions. We have a module with no real code of its own, which we use purely for updating settings. Every time a developer makes a configuration change, they write an update function in this module which, when run, will make the corresponding change on the other development DBs, staging, and live.

There are issues, particularly with cross-dependencies (if people write update functions in more than one module), and it can take time to code something that would be a relatively minor change in the admin UI. The Install Profile API module helps with this.

For example:

function mysite_update_6000() {
  // Load the Install Profile API helpers for the user module.
  install_include(array('user'));
  // Create an 'editor' role and get its role ID back.
  $editor_rid = install_add_role('editor');
  // Grant a permission to the built-in anonymous role.
  install_add_permissions(DRUPAL_ANONYMOUS_RID, array('do something'));
  // Grant the new editor role its permissions.
  install_add_permissions($editor_rid, array('do something', 'administer nodes'));
  return array();
}

This will add a role and assign some permissions to it when update.php is run. Doing this keeps all of the changes in code, so you don't have to try to migrate and synchronise databases.
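The same pattern works with plain core APIs, without the Install Profile API. As a minimal sketch (the mysite module name and the specific settings are just placeholders), a Drupal 6 update function that changes a setting and enables a module could look like this:

function mysite_update_6001() {
  $ret = array();
  // Record a configuration change with core's variable system so it is
  // applied identically on every environment.
  variable_set('site_frontpage', 'node/1');
  // Enable a contrib module everywhere as part of the same update.
  module_enable(array('views'));
  return $ret;
}

Each developer then just runs update.php (or drush updatedb) after pulling the code, and every copy of the database converges on the same configuration.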

There is also a migration module which may help with this: it logs changes to tables and saves them as an update function. This is not to be confused with the drupal.org Migrate module, which is for content migration.

We have had some success, but also some issues, with the Features module, which can help with migrating configuration to code.

Jackfish answered 1/10, 2009 at 9:57
+1 - Good idea to use a 'dummy' module to allow for custom update functions. – Ortrud
@Jeremy: I find your approach very interesting, given that the Features module does not work for everything. But I am wondering how you manage to write update code for all the kinds of things one does as a Drupal admin. For example, I enable all the translation/locale-related modules and then create translated versions of menu items in all my menus. Where would you look to know what code to write in update functions for something like this? – Sharitasharity

For smaller projects, we still do something similar to your current procedure: we lock the live instance to read-only by blocking all users with edit rights, then dump the database, upload it to a stage instance, perform all the updates we need there, and once we are satisfied with the results, switch the stage instance to become the next live version. But even for small instances this is painful and far from a good solution.
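For what it's worth, the dump-and-restore step can at least be scripted. A rough sketch in PHP, assuming shell access, the mysql client tools, and hypothetical database names and credentials:

<?php
// Hypothetical names and credentials; adjust for your environment.
$user = 'deploy';
$pass = 'secret';
$dump = '/tmp/drupal_live.sql';

// Dump the live database to a file. --add-drop-table lets the dump
// replace any existing tables when it is loaded into staging.
exec(sprintf('mysqldump --add-drop-table -u%s -p%s drupal_live > %s',
  escapeshellarg($user), escapeshellarg($pass), escapeshellarg($dump)));

// Load the dump into the staging database.
exec(sprintf('mysql -u%s -p%s drupal_stage < %s',
  escapeshellarg($user), escapeshellarg($pass), escapeshellarg($dump)));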

In two bigger projects, we are in the same boat as Jeremy in that the whole setup is way too complex for deploying complete database dumps, especially since we cannot afford to lock down the instances to read-only mode just for some updates.

For those, we have used Migraine to some extent (see also this related discussion). It is not a Drupal module but a Python script that we adapted a bit to our needs. It aims to create somewhat structured dumps, separating user-supplied content from settings and other data, thus allowing for more selective update and staging strategies. But given the more or less chaotic Drupal database structure (especially the lack of referential integrity enforcement), this approach needs constant tweaking when adding new modules and is pretty risky, as one needs to make extra sure to dump/update only coherent sets of tables.
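To illustrate the idea (this is not Migraine's actual code, and the table groups below are assumptions that need per-site tweaking), the split amounts to dumping different sets of tables to different files:

<?php
// Hypothetical grouping of Drupal 6 core tables; every added module
// means revisiting these lists, which is exactly the maintenance
// burden described above.
$groups = array(
  'settings' => array('system', 'variable', 'permission', 'blocks'),
  'content'  => array('node', 'node_revisions', 'comments', 'users'),
);

foreach ($groups as $name => $tables) {
  // One dump file per group, so content and settings can be staged
  // (or skipped) independently.
  exec(sprintf('mysqldump drupal_live %s > %s.sql',
    implode(' ', array_map('escapeshellarg', $tables)), $name));
}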

We try to minimize the need for 'wholesale' dump/update operations by using the update functions of our custom modules, and I like Jeremy French's suggestion of adding a 'dummy' module just for the ability to add update functions for other settings!

All in all, updating/migrating Drupal instances is a big pain right now, and I hope there will be a more coherent solution in future versions, although I can see that it is difficult to come up with a generalized approach given the current database schema and the number of custom modules with individual additions out there :/


PS: Backup and Migrate is a Drupal module that seems to take an approach similar to that of the Migraine script, but I have not used it yet.

Ortrud answered 1/10, 2009 at 12:17

Henrik and Jeremy have given excellent answers on the state of deployment. I have also heard of Capistrano (Ruby) being used to good effect. The DrupalCampLA Case Study describes the deployment mechanism they used (including Capistrano), and the download package is said to include their deployment script(s).

If you want to minimize the size of your database dumps, be sure to tailor them to exclude the cache and watchdog tables. Backup and Migrate's UI shows by default which tables it thinks are worth ignoring.
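For example, here is a sketch of a dump that skips the transient tables while keeping their structure (the table list matches common Drupal 6 names; credentials are assumed to come from ~/.my.cnf, and drupal_live is a hypothetical database name):

<?php
// Tables whose data is transient and safe to leave out of a dump.
$skip = array('cache', 'cache_filter', 'cache_menu', 'cache_page', 'sessions', 'watchdog');

$ignore = '';
foreach ($skip as $table) {
  // --ignore-table skips both structure and data for the named table.
  $ignore .= ' --ignore-table=' . escapeshellarg('drupal_live.' . $table);
}

// Full dump minus the transient tables, plus a structure-only dump
// (--no-data) so the skipped tables are recreated empty on restore.
exec("mysqldump drupal_live$ignore > dump-data.sql");
exec("mysqldump --no-data drupal_live > dump-structure.sql");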

Lurlinelusa answered 1/10, 2009 at 18:05
+1 for suggesting another alternative - I had not heard of Capistrano before, but it looks promising and worth checking out. – Ortrud

The best way to do this is to keep all your changes in code and use tools such as the Features module to push changes to staging and/or production.

It's more work while developing, that's a fact, but if you work together with a group of people who each have their own database, or you want to push changes to production easily, Features is definitely the way to go.
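To give a feel for what Features exports, here is roughly what a Drupal 6 feature capturing the editor permissions from the top answer might generate (the mysite_feature name is hypothetical, and the exact export format varies between Features versions):

function mysite_feature_user_default_permissions() {
  $permissions = array();
  // Exported permission: administer nodes, granted to the editor role.
  $permissions['administer nodes'] = array(
    'name' => 'administer nodes',
    'roles' => array('editor'),
  );
  return $permissions;
}

Reverting the feature on staging or live then applies this exported state, playing much the same role as the hand-written update functions described above.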

Burkhalter answered 1/9, 2011 at 10:18
