I have about a dozen repositories that are 1 GB to 10 GB in size on the file system, and I need to set up automated backups for all of them on our Windows XP 64-bit machines (our old backup scripts were lost when a computer went down).
After reading this question about the best way to back up SVN repos, I started dumping the biggest repo we have, which is about 13 GB. The following command has been running for ~2.5 hours now, and it's currently dumping revision ~200 of 300+.
svnadmin dump \\path\to\repo\folder --deltas > \\path\to\backup\folder\dump.svn
The dump file is over 100 GB and counting. I know I can 7-zip this sucker, but 100 GB?! ... o_O
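One thing I'm tempted to try is piping the dump straight into 7-Zip so the uncompressed file never touches the disk. Just a rough, untested sketch, assuming 7z.exe is on the PATH and using the same placeholder paths as above:

rem pipe the dump through 7-Zip instead of writing the raw dump file first
svnadmin dump \\path\to\repo\folder --deltas | 7z a -si \\path\to\backup\folder\dump.svn.7z

(The -si switch tells 7-Zip to read the data to compress from stdin.) That would save me the disk churn, but it doesn't change the fact that the dump itself is roughly ten times the size of the repo.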
The repositories contain a large amount of binary data, which could be part of the problem, but as of right now, switching to a more efficient version control system (assuming there is one) is not realistic; SVN is a fact of life here.
I've considered using hotcopy, which takes up a lot less space, but when I tried restoring one of our old hotcopied backups, Subversion 1.7 couldn't find a bunch of files it needed. It seems I'd have to install the version of SVN that originally made the hotcopy, dump the repo with that, and then load the dump into a newer SVN. This post seems to confirm the problem I'm having with hotcopy: http://svn.haxx.se/users/archive-2005-05/0842.shtml
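In other words, for every old hotcopy the recovery path would look roughly like this (just a sketch with placeholder paths, assuming I can still track down the old Subversion binaries):

rem with the old Subversion release (the one that made the hotcopy) on the PATH
svnadmin dump \\path\to\hotcopy > \\path\to\backup\old-format.dump
rem then, with Subversion 1.7 on the PATH
svnadmin create \\path\to\new\repo
svnadmin load \\path\to\new\repo < \\path\to\backup\old-format.dump

That's a lot of manual juggling for something that's supposed to be an automated backup.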
I feel like I've just got to be missing something. Maybe there's some flag for dump that magically makes the dump 1/5 the size...
Do I have any other options?
UPDATE: The last revision, #327, has just finished dumping. The final size of the dump file is 127 GB, from a 13.5 GB repo. I probably have roughly 3x that much across all of my repositories combined.