Mongoimport of JSON file
I have a JSON file consisting of about 2000 records. Each record, which will correspond to a document in the Mongo database, is formatted as follows:

{jobID:"2597401",
account:"XXXXX",
user:"YYYYY",
pkgT:{"pgi/7.2-5":{libA:["libpgc.so"],flavor:["default"]}},     
startEpoch:"1338497979",
runTime:"1022",
execType:"user:binary",
exec:"/share/home/01482/XXXXX/appker/ranger/NPB3.3.1/NPB3.3-MPI/bin/ft.D.64",
numNodes:"4",
sha1:"5a79879235aa31b6a46e73b43879428e2a175db5",
execEpoch:1336766742,
execModify: new Date("Fri May 11 15:05:42 2012"),
startTime: new Date("Thu May 31 15:59:39 2012"),
numCores:"64",
sizeT:{bss:"1881400168",text:"239574",data:"22504"}},

Each record is on a single line in the JSON file, and the only line breaks are at the end of every record. Therefore, each line in the file starts with "{jobID:"... I am trying to import these into a Mongo database using the following command:

mongoimport --db dbName --collection collectionName --file fileName.json

However, I get the following error:

Sat Mar  2 01:26:12 Assertion: 10340:Failure parsing JSON string near: ,execModif
0x10059f12b 0x100562d5c 0x100562e9c 0x10025eb98 0x10000e643 0x100010b60 0x10055c4cc 0x1000014b7    
0x100001454 
 0   mongoimport                         0x000000010059f12b _ZN5mongo15printStackTraceERSo + 43
 1   mongoimport                         0x0000000100562d5c _ZN5mongo11msgassertedEiPKc + 204
 2   mongoimport                         0x0000000100562e9c _ZN5mongo11msgassertedEiRKSs + 12
 3   mongoimport                         0x000000010025eb98 _ZN5mongo8fromjsonEPKcPi + 1576
 4   mongoimport                         0x000000010000e643          
                                         _ZN6Import8parseRowEPSiRN5mongo7BSONObjERi + 2739
 5   mongoimport                         0x0000000100010b60 _ZN6Import3runEv + 7376
 6   mongoimport                         0x000000010055c4cc _ZN5mongo4Tool4mainEiPPc + 5436
 7   mongoimport                         0x00000001000014b7 main + 55
 8   mongoimport                         0x0000000100001454 start + 52
Sat Mar  2 01:26:12 exception:BSON representation of supplied JSON is too large: Failure parsing    
    JSON string near: ,execModif
Sat Mar  2 01:26:12 
Sat Mar  2 01:26:12 imported 0 objects
Sat Mar  2 01:26:12 ERROR: encountered 1941 errors

I do not know what the problem is. Can someone recommend a solution?

Kaufman answered 2/3, 2013 at 6:40 Comment(0)

I was able to fix the error using the following command:

mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray

Hopefully this is helpful to someone.
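
As a side note, --jsonArray is needed when the file holds a single JSON array rather than one document per line (mongoimport's default format). A minimal Python sketch (a hypothetical helper, not part of mongoimport) that peeks at a file to decide whether the flag is needed:

```python
def needs_json_array(path):
    # mongoimport expects one document per line by default; a file whose
    # first non-whitespace character is '[' is a JSON array and needs
    # the --jsonArray flag instead.
    with open(path, encoding="utf-8") as f:
        for ch in iter(lambda: f.read(1), ""):
            if not ch.isspace():
                return ch == "["
    return False
```

For example, flags = ["--jsonArray"] if needs_json_array("fileName.json") else [] before building the mongoimport command line.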

Kaufman answered 11/4, 2013 at 5:18 Comment(8)
--jsonArray being the ticket, yes?Venable
Short form of this mongoimport -d <database> -c <collection> --jsonArray -f <filename>.json.Osei
sometimes a user/password is required: mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray -u user -p passwordSunder
Adding to @DiegoAndrésDíazEspinoza's comment: in my case I got the error "unable to authenticate using mechanism 'SCRAM-SHA-1'". After a search, I found that the authenticationDatabase option was missing, as mentioned in the answer https://mcmap.net/q/120534/-how-to-import-data-into-mongodb.Planography
For future searchers, you might still need to install mongoimport: docs.mongodb.com/database-tools/installation/installationEdacity
--file fileName.json is extra and causes: "Failed: cannot decode array into a primitive.D"Inalterable
I'm getting Uncaught: SyntaxError: Missing semicolon. (1:14) error on --dbDaveen
+ --uri="" for specifying the connection string for a database other than localhost (which is what mongoimport tries to use by default)Maidenhood

Try this:

mongoimport --db dbName --collection collectionName <fileName.json

Example:

mongoimport --db foo --collection myCollections < /Users/file.json
connected to: *.*.*.*
Sat Mar  2 15:01:08 imported 11 objects

The issue is caused by your date format.

I used the same JSON with the dates modified as below, and it worked:

{jobID:"2597401",
account:"XXXXX",
user:"YYYYY",
pkgT:{"pgi/7.2-5":{libA:["libpgc.so"],flavor:["default"]}},     
startEpoch:"1338497979",
runTime:"1022",
execType:"user:binary",
exec:"/share/home/01482/XXXXX/appker/ranger/NPB3.3.1/NPB3.3-MPI/bin/ft.D.64",
numNodes:"4",
sha1:"5a79879235aa31b6a46e73b43879428e2a175db5",
execEpoch:1336766742,
execModify:{"$date" : 1343779200000},
startTime:{"$date" : 1343779200000},
numCores:"64",
sizeT:{bss:"1881400168",text:"239574",data:"22504"}}

Hope this helps.
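
The rewrite from the shell-style new Date("Fri May 11 15:05:42 2012") to the Extended JSON {"$date": ...} form (epoch milliseconds) can be scripted instead of done by hand. A rough Python sketch, assuming the timestamps are UTC (the question's data looks like local time, so adjust the offset accordingly):

```python
import json
import re
from datetime import datetime, timezone

DATE_RE = re.compile(r'new Date\("([^"]+)"\)')

def to_extended_json_date(line):
    # Rewrite each new Date("Fri May 11 15:05:42 2012") occurrence as
    # {"$date": <epoch milliseconds>}, which mongoimport can parse.
    def repl(match):
        dt = datetime.strptime(match.group(1), "%a %b %d %H:%M:%S %Y")
        ms = int(dt.replace(tzinfo=timezone.utc).timestamp() * 1000)
        return json.dumps({"$date": ms})
    return DATE_RE.sub(repl, line)
```

Running this over every line of the file before importing converts all the date fields in one pass.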

Preselector answered 2/3, 2013 at 9:32 Comment(5)
I have the same error as in the question... Did you check this import?Brena
I adjusted the dates as you suggested and that did get rid of that particular error. However, now I am getting a new one. Here is the new error:Kaufman
Can you paste the new JSON and which version of Mongo you are on ?Preselector
I was able to fix the error by adding --jsonArray to the end of the query.Kaufman
You need to use "" around the .json path if it contains a folder name with spaces in it, as answered by Abhi below. E.g. this will not work: D:\>mongoimport --db testimport --collection small_zip < D:\Dev\test test\small_zips.json (The system cannot find the file specified.) This works: D:\>mongoimport --db testimport --collection small_zip < "D:\Dev\test test\small_zips.json" (2016-04-17T18:32:34.328+0800 connected to: localhost, 2016-04-17T18:32:34.610+0800 imported 200 documents)Terminology

Using mongoimport you can achieve the same:

mongoimport --db test --collection user --drop --file ~/downloads/user.json

where,

test - Database name
user - collection name
user.json - dataset file

--drop drops the collection if it already exists.

Jeffrey answered 27/12, 2017 at 7:18 Comment(0)

console:

mongoimport -d dbName -c collectionName dataFile.js 
Whence answered 16/11, 2014 at 0:49 Comment(0)

Your syntax appears completely correct in:

mongoimport --db dbName --collection collectionName --file fileName.json

Make sure you are in the correct folder or provide the full path.

Heath answered 3/11, 2013 at 22:51 Comment(0)

I used the command below to export the DB:

mongodump --db database_name --collection collection_name

and the command below worked for me to import it:

mongorestore --db database_name path_to_bson_file
Awakening answered 9/6, 2016 at 10:50 Comment(0)

Import JSON/CSV file in MongoDB

  • First check for the mongoimport.exe file in your bin folder (C:\Program Files\MongoDB\Server\4.4\bin); if it is not there, download the MongoDB Database Tools (https://www.mongodb.com/try/download/database-tools)
  • Copy the extracted (unzipped) files (inside the unzipped bin) to the bin folder (C:\Program Files\MongoDB\Server\4.4\bin)
  • Copy your JSON file to the bin folder (C:\Program Files\MongoDB\Server\4.4\bin)
  • Now open your command prompt and change its directory to bin:
cd "C:\Program Files\MongoDB\Server\4.4\bin"
  • Now run this in your command prompt:
mongoimport -d tymongo -c test --type json --file restaurants.json
  • where -d is the database (tymongo is the database name) and -c is the collection (test is the collection name)

FOR CSV FILE

 mongoimport -d tymongo -c test --type csv --file database2.csv --headerline
Ratib answered 16/2, 2021 at 15:43 Comment(0)

Run the import command in another terminal (not inside the mongo shell).

mongoimport --db test --collection user --drop --file ~/downloads/user.json
Sunk answered 11/1, 2019 at 8:42 Comment(0)

In Windows you can use your Command Prompt (cmd); in Ubuntu you can use your terminal. Type the following command:

mongoimport  -d  your_database_name  -c  your_collection_name  /path_to_json_file/json_file_name.json

Then, when you open your mongo shell, you can check for your database_name by running this command:

show databases
Hemichordate answered 26/3, 2019 at 9:10 Comment(0)

This command works when no collection is specified:

mongoimport --db zips "\MongoDB 2.6 Standard\mongodb\zips.json"

mongoimport output after executing the command:

connected to: 127.0.0.1
no collection specified!
using filename 'zips' as collection.
2014-09-16T13:56:07.147-0400 check 9 29353
2014-09-16T13:56:07.148-0400 imported 29353 objects
Kaitlynkaitlynn answered 16/9, 2014 at 17:59 Comment(0)

Solution:

mongoimport --db databaseName --collection tableName --file filepath.json

Example:

Place your file in the admin folder:

C:\Users\admin\tourdb\places.json

Run this command in your terminal:

mongoimport --db tourdb --collection places --file ~/tourdb/places.json

Output:

admin@admin-PC MINGW64 /
$ mongoimport --db tourdb --collection places --file ~/tourdb/places.json
2019-08-26T14:30:09.350+0530 connected to: localhost
2019-08-26T14:30:09.447+0530 imported 10 documents

For more, see the link.

Inapt answered 26/8, 2019 at 9:16 Comment(0)

I tried something like this and it actually works:

mongoimport --db dbName --file D:\KKK\NNN\100YWeatherSmall.data.json
Mycenaean answered 15/2, 2019 at 11:25 Comment(0)

This worked for me with a DB with a username and password:

mongoimport --db YOUR_DB --collection MyCollection --file /your_path/my_json_file.json -u my_user -p my_pass

For a db without a username/password, remove -u my_user -p my_pass.

My sample JSON:

{ 
    "_id" : ObjectId("5d11c815eb946a412ecd677d"), 
    "empid" : NumberInt(1), 
    "name" : "Rahul"
}
{ 
    "_id" : ObjectId("5d11c815eb946a412ecd677e"), 
    "empid" : NumberInt(2), 
    "name" : "Rahul"
}
Shortsighted answered 29/7, 2019 at 8:5 Comment(0)

A bit late with a probable answer, but it might help new people. In case you have multiple instances of the database:

mongoimport --host <host_name>:<host_port> --db <database_name> --collection <collection_name>  --file <path_to_dump_file> -u <my_user> -p <my_pass>

Assuming credentials are needed; otherwise remove those options.

Gezira answered 13/11, 2019 at 17:23 Comment(0)
  1. Just copy the path of the JSON file, for example "C:\persons.json"
  2. go to C:\Program Files\MongoDB\Server\4.2\bin
  3. open cmd in that MongoDB bin folder and run this command:

mongoimport --jsonArray --db dbname --collection collectionName --file FilePath

Example: mongoimport --jsonArray --db learnmongo --collection persons --file C:\persons.json

Obau answered 17/2, 2020 at 15:1 Comment(0)

A number of answers have been given already, but I would like to give the command I use frequently. It may help someone.

mongoimport original.json -d databaseName -c yourcollectionName --jsonArray --drop
Java answered 20/4, 2020 at 12:22 Comment(0)

This will work:

$  mongoimport --db databaseName --collection collectionName --file filePath/jsonFile.json 

2021-01-09T11:13:57.410+0530 connected to: mongodb://localhost/
2021-01-09T11:13:58.176+0530 1 document(s) imported successfully. 0 document(s) failed to import.

Above I shared the command along with its response.

Cold answered 9/1, 2021 at 6:5 Comment(0)

mongoimport -d <dbname> -c <collectio_name> --file <c:\users\test.json> --jsonArray

Adaliah answered 17/1, 2022 at 12:46 Comment(1)
As it’s currently written, your answer is unclear. Please edit to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers in the help center.Quenelle
  1. Import JSON array data to Atlas from your local laptop: https://www.mongodb.com/docs/atlas/import/mongoimport/
mongoimport --uri "mongodb+srv://<user>:<password>@cluster0.elddaddy.mongodb.net/test?retryWrites=true&w=majority&ssl=true" --collection Providers --drop --file /Users/Documents/data2.json --jsonArray

This command imports the data in the data2.json file to a collection named Providers in the MongoDB database located at the cluster0.dl79aky.mongodb.net URI.

The --drop option is used to drop the existing collection if it exists.

The --jsonArray option specifies that the input file is a JSON array, rather than a single JSON object. This allows us to import an array of documents as a batch.

The --uri option specifies the URI to connect to the database, which includes the user credentials, database name, and connection options.

Here's the breakdown of the URI:

mongodb+srv://: specifies that this is a connection string for a MongoDB Atlas cluster that uses an SRV DNS record

<user>:<password>@: specifies the username and password of the user that is connecting to the database

cluster0.dl79aky.mongodb.net: the name of the MongoDB Atlas cluster that you want to connect to

/test: the name of the database within the cluster that you want to connect to

?retryWrites=true&w=majority: specifies the write concern options for the connection. retryWrites=true specifies that the driver should retry writes if they fail, and w=majority specifies that the write operation should wait for the majority of nodes to acknowledge the write before returning

&ssl=true: specifies that the connection should use SSL/TLS encryption

  2. Check that your data is a JSON array:
[
  {
    "name": "John",
    "age": 30,
    "email": "[email protected]"
  },
  {
    "name": "Jane",
    "age": 25,
    "email": "[email protected]"
  },
  {
    "name": "Bob",
    "age": 40,
    "email": "[email protected]"
  }
]
Kenlee answered 18/3, 2023 at 2:5 Comment(0)

If you try to export this test collection:

> db.test.find()
{ "_id" : ObjectId("5131c2bbfcb94ddb2549d501"), "field" : "Sat Mar 02 2013 13:13:31 GMT+0400"}
{"_id" : ObjectId("5131c2d8fcb94ddb2549d502"), "field" : ISODate("2012-05-31T11:59:39Z")}

with mongoexport (the first date was created with Date(...) and the second one with new Date(...); using ISODate(...) gives the same result as the second line), the mongoexport output will look like this:

{ "_id" : { "$oid" : "5131c2bbfcb94ddb2549d501" }, "field" : "Sat Mar 02 2013 13:13:31 GMT+0400" }
{ "_id" : { "$oid" : "5131c2d8fcb94ddb2549d502" }, "field" : { "$date" : 1338465579000 } }

So you should use the same notation, because strict JSON doesn't have a Date( <date> ) type.

Also, your JSON is not valid: all field names must be enclosed in double quotes, though mongoimport works fine without them.

You can find additional information in mongodb documentation and here.
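
Since the parser rejects non-strict JSON, it can save time to find the first offending line before running the import at all. A small Python sketch (a hypothetical helper, not part of the MongoDB tools) that reports the first line of a one-document-per-line file that strict JSON parsing rejects:

```python
import json

def first_invalid_line(path):
    # Scan a one-document-per-line file and return (line number, error
    # message) for the first line that strict JSON parsing rejects,
    # or None if every non-empty line parses cleanly.
    with open(path, encoding="utf-8") as f:
        for lineno, raw in enumerate(f, start=1):
            line = raw.strip()
            if not line:
                continue
            try:
                json.loads(line)
            except json.JSONDecodeError as exc:
                return lineno, exc.msg
    return None
```

Unquoted field names like jobID: or shell-only constructs like new Date(...) will be flagged this way, pointing at the exact line to fix.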

Brena answered 2/3, 2013 at 9:37 Comment(2)
I adjusted the dates as you suggested and that did get rid of that particular error. However, now I am getting a new one. Here is the new error: ' Sat Mar 2 15:22:07 exception:BSON representation of supplied JSON is too large: Failure parsing JSON string near: data:"1949 Sat Mar 2 15:22:07 Sat Mar 2 15:22:07 imported 0 objects Sat Mar 2 15:22:07 ERROR: encountered 34763 errors'Kaufman
I think it's another error related to the field sizeT:{data: "1949..."}}Brena
