It's not all that bad. I've been using varying levels of Git and SVN management for Sugar for years.
Here's a template .gitignore I usually start with:
# Ignore Everything
/*
# Except .htaccess and Config files
!.htaccess
!config.php
!config_override.php
!.gitignore
# Except custom
!/custom/
# but do ignore some of custom and Extensions stuff
/custom/backup
/custom/blowfish
/custom/history
/custom/index.html
/custom/workflow
/custom/modulebuilder
/custom/modules/Connectors
/custom/modules/*/Ext
/custom/application/Ext
# and do track custom modules
!/modules/
/modules/*
!/modules/org_MyModule/
Make sure you keep that last section updated manually: whenever you create a new module, add it to that list.
If you do end up modifying core files, whether for bug fixes or enhancements (which you should try to avoid), you can explicitly whitelist them in your .gitignore as well.
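For example, whitelisting a patched core file and a new module might look like the following (the paths are hypothetical; note that a file inside an ignored directory can only be re-included if the directory itself is un-ignored first):

```gitignore
# Track a patched core file despite the blanket /* ignore:
# un-ignore the directory, re-ignore its contents, then whitelist the file.
!/include/
/include/*
!/include/utils.php
# Each new custom module gets its own exception line:
!/modules/org_AnotherModule/
```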
The biggest issue I end up having is with config.php and config_override.php. Depending on your environment setup, these aren't all that bad, but the site_url param needs to change based on the URL for the system, so it's kind of a PITA. Don't think that you can just track one or the other, though, since Sugar's config-updating processes can rewrite both on a regular and somewhat unpredictable basis.
As for the statement that every change causes dozens of file changes: yes and no. It depends a little, but I've certainly seen it. One thing you can do to minimize it is to make sure you've disabled languages you don't use. I've joined projects that were tracking literally tens of thousands of language files for languages no one at the organization spoke; the deployment was English-only. We configured Sugar not to generate them, and the diff I created deleting those files was so large that our GUI diff tool wouldn't display it.
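A quick way to gauge how bad it is, assuming Sugar's usual *.lang.php file naming and that you run this from the repository root:

```shell
# Count tracked SugarCRM language files (pattern is an assumption based on
# the usual <lang>.lang.php naming). Outside a git repo this prints 0.
git ls-files '*.lang.php' 2>/dev/null | wc -l
```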
What about database or Studio changes in production?
Easily the biggest difficulty. As you likely know, Sugar stores field information both in the vardef file and in the database table fields_meta_data.
Depending on your setup for development and deployment, you can work around this in one of a few ways. I'll outline a few I've seen:
- nightly backups of:
  - the database schema
  - fields_meta_data
  - the Product Catalog and Product Category tables
  - the users table (with user_hash data scrubbed)
  - the email addresses table, but only entries for users
This method gives development a pretty close match to production while scrubbing sensitive data and leaving out unnecessary data. The downside is that custom scripting needs to be employed to (1) handle the backup and (2) handle your dev environment's download of the backup.
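The nightly job could be a cron-run shell script along these lines. Everything here (database name, table list, paths) is an assumption; by default it only prints the commands it would run, so you'd set DRY_RUN=0 on the actual server:

```shell
#!/bin/sh
# Sketch of a nightly backup job. DRY_RUN=1 (the default) prints each
# command instead of executing it, so the job's shape can be reviewed
# without a live database.
DRY_RUN="${DRY_RUN:-1}"
run() {
  if [ "$DRY_RUN" = 1 ]; then echo "$@"; else "$@"; fi
}

DB=sugarcrm                  # assumed database name
OUT="backups/$(date +%F)"    # assumed backup location

run mkdir -p "$OUT"
run mysqldump --no-data --result-file="$OUT/schema.sql" "$DB"
run mysqldump --result-file="$OUT/fields_meta_data.sql" "$DB" fields_meta_data
run mysqldump --result-file="$OUT/users.sql" "$DB" users
# After restoring in dev, scrub sensitive data, e.g.:
#   UPDATE users SET user_hash = NULL;
```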
- a frequent cron-driven script that uses mysqldump to back up key tables (e.g. config, fields_meta_data, users) to individual files, then detects "actual changes" (e.g. diff configured to ignore whitespace and timestamps) and, if any are found, commits those files to master.
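Here's a runnable demonstration of the commit-on-real-change step. The dump content is a stand-in (in production it would come from mysqldump --skip-dump-date, which keeps timestamps out of the dump so diffs only fire on genuine changes), and the repo is created just for the demo:

```shell
#!/bin/sh
set -e
# Throwaway repo so the git logic can be shown end to end.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email cron@example.com
git config user.name "sugar-cron"

echo "CREATE TABLE fields_meta_data (id int);" > fields_meta_data.sql
git add fields_meta_data.sql
git commit -qm "initial dump"

# Nightly re-dump with identical content: nothing to commit.
echo "CREATE TABLE fields_meta_data (id int);" > fields_meta_data.sql
git diff --quiet -- fields_meta_data.sql && echo "no change, skipping commit"

# A Studio change alters the table: the commit fires.
echo "CREATE TABLE fields_meta_data (id int, extra varchar(255));" > fields_meta_data.sql
if ! git diff --quiet -- fields_meta_data.sql; then
  git add fields_meta_data.sql
  git commit -qm "Auto-commit: fields_meta_data changed"
fi
git log --oneline   # shows both commits
```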
You can pair this with a similar script that watches the output of git status on production; I've also seen inotify used for this. When changes are found (again, on production), they're automatically committed to master.
Once perfected, this method is much more automatic and will even keep you informed when one of your business managers makes changes in Studio. As you move to commit your local development changes, you'll notice that your master is not the same as origin/master, and you can resolve any conflicts there.