I have a Rails 3 app running an older version of Spree (an open source shopping cart). I am in the process of updating it to the latest version. This requires me to run numerous migrations on the database to make it compatible with the latest version. However, the app's current database is roughly 300 MB, and running the migrations on my local machine (Mac OS X 10.7, 4 GB RAM, 2.4 GHz Core 2 Duo) takes over three days to complete.
I was able to decrease this time to 16 hours using an Amazon EC2 instance (a High-I/O On-Demand instance, Quadruple Extra Large). But 16 hours is still too long, as I will have to take the site down to perform this update.
Does anyone have any other suggestions to lower this time? Or any tips to increase the performance of the migrations?
FYI: I'm using Ruby 1.9.2, and Ubuntu on the Amazon instance.
Try wrapping everything in `ActiveRecord::Base.transaction do .. end`; it should ideally bring down the time from 3 hours. The downside of this is that when on production, the site will probably be "down". Also, like @Leonhard says about rebuilding indexes, you will want to take that into account. At any rate, this is a very good learning experience. I feel your pain - hope something alleviates it asap! :) I'll keep an eye on this question. – Mango
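For illustration, here is a minimal sketch of what that transaction-wrapped approach might look like in a Rails 3 data migration. The `Order` model, `total` column, and `line_items` association are hypothetical stand-ins, not taken from the thread:

```ruby
# Sketch only: wrap a data migration's row-by-row updates in a single
# transaction so the database commits once rather than once per record.
class BackfillOrderTotals < ActiveRecord::Migration
  def up
    ActiveRecord::Base.transaction do
      Order.all.each do |order|
        # Each save happens inside the one surrounding transaction,
        # avoiding a COMMIT (and disk flush) per record.
        order.update_attribute(:total, order.line_items.sum(:price))
      end
    end
  end

  def down
    # Irreversible data backfill; nothing to undo.
  end
end
```

The trade-off the commenter mentions still applies: one large transaction holds locks for its full duration, so the site should be offline or in maintenance mode while it runs.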
Try `find_each`. I did a bunch of testing on a large migration and found that a batch size of ~100 (`find_each(batch_size: 100) {}`) was optimal. The migration ran many times faster than just using `each`. I would convert every instance of `.where().each` to use `find_each` and see how far you get. – Limp
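As a sketch of that conversion (again with a hypothetical `Order` model and `state` column, assuming nothing about Spree's actual migrations):

```ruby
# Sketch only: replace .where(...).each with find_each so records are
# loaded in small batches instead of all at once.
class CompleteLegacyOrders < ActiveRecord::Migration
  def up
    # Before: Order.where(state: 'legacy').each { ... } would instantiate
    # every matching record in memory before iterating.

    # After: fetch and process the matching rows 100 at a time.
    Order.where(state: 'legacy').find_each(batch_size: 100) do |order|
      order.update_attribute(:state, 'complete')
    end
  end
end
```

Besides being faster in the commenter's testing, batching keeps memory use flat, which matters on a 4 GB machine iterating over a 300 MB database.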