mongodb import multiple collections at once
I am using this command

mongoimport --db databasename

to import a database that I exported using mongoexport. The database has over 100 collections. According to the mongoimport documentation, you need to specify the collection name and the JSON file for each import. How do I import all the collections at once without having to type a command for each one?
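
For reference, a single-collection import looks like this (users and users.json are placeholder names):

mongoimport --db databasename --collection users --file users.json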

Rochus answered 28/1, 2017 at 20:24

According to the documentation:

New in version 2.6: If you do not specify --collection, mongoimport takes the collection name from the input filename. MongoDB omits the extension of the file from the collection name, if the input file has an extension.

So mongoimport is designed to import one collection at a time. Unless you write a shell script to do it, I don't see a way.
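
A minimal sketch of such a script, assuming each collection was exported with mongoexport to a <collection>.json file in the current directory (databasename is a placeholder):

for f in *.json; do
  # derive the collection name by stripping the .json suffix, then import the file
  mongoimport --db databasename --collection "${f%.json}" --file "$f"
done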

mongodump and mongorestore are much better suited to taking a full database dump and restoring it at once, as another part of the same documentation says:

Warning

Avoid using mongoimport and mongoexport for full instance production backups. They do not reliably preserve all rich BSON data types, because JSON can only represent a subset of the types supported by BSON. Use mongodump and mongorestore as described in MongoDB Backup Methods for this kind of functionality.
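
For comparison, a full dump-and-restore round trip with those tools looks roughly like this (databasename and ./dump are placeholders):

mongodump --db databasename --out ./dump
mongorestore --db databasename ./dump/databasename/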

Cari answered 28/1, 2017 at 20:57
Agree, do not use mongoimport or mongoexport on production; I had cases of losing data when doing such operations. – Or

It is surprising that documentation on importing multiple collections is so hard to find. "Restoring a database" is not the obvious keyword to search for, but restoring is exactly what you want to do when bringing a database back from a backup. If you already have a dump of collections created with mongodump, you can use mongorestore.

This is all you really need to run:

mongorestore --db db_name ./db_dumpfiles/
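
This assumes ./db_dumpfiles/ is a directory produced by mongodump, i.e. one .bson file plus one .metadata.json file per collection, for example (users and orders are hypothetical collection names):

users.bson
users.metadata.json
orders.bson
orders.metadata.json
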
Gumdrop answered 4/10, 2019 at 0:47

If the dump files are plain .json, you can use a script like this to import them all:

ls *.json | sed 's/\.metadata\.json$//' | while read col; do mongoimport -d db_name -c "$col" < "$col.metadata.json"; done
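
Here the sed step strips the .metadata.json suffix to recover the bare collection name, which is then passed as the -c argument; the loop assumes every .json file in the directory follows that naming pattern.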

If the dump files are gzipped (.json.gz), you can decompress them on the fly, since mongoimport (unlike mongorestore) has no --gzip option:

ls *.gz | sed 's/\.metadata\.json\.gz$//' | while read col; do gunzip -c "$col.metadata.json.gz" | mongoimport -d db_name -c "$col"; done
Endmost answered 14/6, 2017 at 11:44
Thanks for this script. Saved me a lot of typing :) – Shamble

One simple command restores every collection in the dump folder:

mongorestore --db folder_name ./folder_name/
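
Note that newer versions of the MongoDB database tools deprecate --db for mongorestore in favour of --nsInclude; pointing at the parent directory that contains the folder_name/ subdirectory, the rough equivalent would be:

mongorestore --nsInclude 'folder_name.*' ./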
Amaranthaceous answered 28/1, 2021 at 9:21
