I'm trying to run a database dump with a query against a collection of about 5 billion documents, and the progress output seems to indicate that the dump won't finish in any reasonable time (100+ days). The dump also appeared to stop after roughly 22 hours while still showing 0%; the line that follows the last progress line is a metadata.json line.
The dump command is:
mongodump -h myHost -d myDatabase -c mycollection --query "{'cr' : {\$gte: new Date(1388534400000)}, \$or: [ { 'tln': { \$lte: 0., \$gte: -100.}, 'tlt': { \$lte: 100, \$gte: 0} }, { 'pln': { \$lte: 0., \$gte: -100.}, 'plt': { \$lte: 100, \$gte: 0} } ] }"
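For reference, here is the same filter as a sketch in strict Extended JSON v2, which (as far as I understand) the newer 100.x mongodump database tools require in place of the shell-style syntax above; the $numberLong date value and the field semantics are assumed to be unchanged from the original command:

mongodump -h myHost -d myDatabase -c mycollection --query '{
  "cr": { "$gte": { "$date": { "$numberLong": "1388534400000" } } },
  "$or": [
    { "tln": { "$lte": 0, "$gte": -100 }, "tlt": { "$lte": 100, "$gte": 0 } },
    { "pln": { "$lte": 0, "$gte": -100 }, "plt": { "$lte": 100, "$gte": 0 } }
  ]
}'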
The last few lines of output were (typed out, as I can't post images yet):
[timestamp] Collection File Writing Progress: 10214400/5066505869 0% (objects)
[timestamp] Collection File Writing Progress: 10225100/5066505869 0% (objects)
[timestamp] 10228391 objects
[timestamp] Metadata for database.collection to dump/database/collection.metadata.json
Any thoughts on how to improve performance, or any idea why this is taking so long?
metadata.json is normally emitted when the dump completes for a given collection. – Relax