MongoDB "The handle is invalid" - JSON

I am trying to import a JSON file into MongoDB using mongoimport. It throws the following error: Failed: error processing document #1: read C:\Users\mbryant2\Documents\primer-dataset.json: The handle is invalid.
Here is my cmd:
$ mongoimport --db tempTestDb --collection restaurants --drop --file C:/Users/mbryant2/Documents/primer-dataset.json
and response:
2018-09-14T12:17:36.337-0600 connected to: localhost
2018-09-14T12:17:36.338-0600 dropping: tempTestDb.restaurants
2018-09-14T12:17:36.339-0600 Failed: error processing document #1: read C:\Users\mbryant2\Documents\primer-dataset.json: The handle is invalid.
2018-09-14T12:17:36.339-0600 imported 0 documents
Does anyone have any idea what I am missing? Does it need login credentials or something like that?

If the data is represented as a JSON array, rather than individual lines of JSON text, you will need to add the --jsonArray parameter to mongoimport.
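For example, if the file did contain a single JSON array, the command from the question would look something like this (same database, collection, and file path as above):
mongoimport --db tempTestDb --collection restaurants --drop --jsonArray --file C:/Users/mbryant2/Documents/primer-dataset.json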

Related

mongoimport fails due to invalid character in massive file, possibly an issue with the character encoding

When I run the following command:
mongoimport -v -d ntsb -c data xml_results.json --jsonArray
I get this error:
2020-07-15T22:51:41.267-0400 using write concern: &{majority false 0}
2020-07-15T22:51:41.270-0400 filesize: 68564556 bytes
2020-07-15T22:51:41.270-0400 using fields:
2020-07-15T22:51:41.270-0400 connected to: mongodb://localhost/
2020-07-15T22:51:41.270-0400 ns: ntsb.data
2020-07-15T22:51:41.271-0400 connected to node type: standalone
2020-07-15T22:51:41.271-0400 Failed: error processing document #1: invalid character '}' looking for beginning of object key string
2020-07-15T22:51:41.271-0400 0 document(s) imported successfully. 0 document(s) failed to import.
I have tried all the solutions suggested here and nothing worked. My JSON file is around 60 MB, so it would be really hard to go through it and find the bracket issue. Maybe it is a problem with the UTF-8 encoding? I take an XML file I downloaded from the internet and convert it into JSON with a Python script. With or without the --jsonArray flag, it gives the same error. Any ideas? Thanks!
It turns out there were a few unnecessary commas within this massive file. I was able to use Python's built-in JSON parsing to jump to the lines with errors and remove them manually. As far as I can tell, the invalid character had nothing to do with the } itself; the preceding comma made the parser expect another value before the closing bracket.
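A minimal sketch of that approach (assuming Python 3 and the file name from the question), which prints the line and column where the parser gives up:

import json

# Let Python's JSON parser report the exact location of the first
# syntax problem (e.g. a stray trailing comma before a closing brace).
with open("xml_results.json", encoding="utf-8") as f:
    text = f.read()

try:
    json.loads(text)
    print("File parsed cleanly")
except json.JSONDecodeError as err:
    print(f"Parse error at line {err.lineno}, column {err.colno}: {err.msg}")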
After solving this, I was still unable to import successfully because the file was too large. The workaround was to surround all the JSON objects with array brackets [] and use the following command: mongoimport -v -d ntsb -c data xml_results.json --batchSize 1 --jsonArray
After a few seconds the data imported successfully into Mongo.

MongoDB import of geojson data fails

I'm trying to import some GeoJSON data into MongoDB. The entire file is about 24 MB, so in theory the 16 MB per-document limit shouldn't be exceeded. But it looks like it's complaining about the size. I have tried the solutions offered here, but none seems to work. I type the command:
mongoimport -d userdata -c countries < countries.geojson
and I get
2017-11-17T01:09:29.561+0400 connected to: localhost
2017-11-17T01:09:31.055+0400 num failures: 1
2017-11-17T01:09:31.055+0400 Failed: lost connection to server
2017-11-17T01:09:31.055+0400 imported 0 documents
and the mongod logs show (after backtrace):
2017-11-17T01:09:31.055+0400 I - [conn153] AssertionException handling request, closing client connection: 10334 BSONObj size: 17756597 (0x10EF1B5) is invalid. Size must be between 0 and 16793600(16MB) First element: insert: "countries"
2017-11-17T01:09:31.055+0400 I - [conn153] end connection 127.0.0.1:61806 (2 connections now open)
I have tried
mongoimport -d userdata -c countries < countries.geojson --batchSize 1
and
mongoimport -d userdata -c countries -j 4 < countries.geojson
based on other similar answers but got the same result, with the same response and logs.
Does anyone have a clue what's going on here? Should I break the GeoJSON into two files and give that a shot? I thought the 16 MB limit applied to individual documents, not to collections or collection imports.

Mongoimport not adding JSON documents to db after import

The mongoimport command returns with the correct number of documents and adds a new collection, but when I try to open my db there is nothing. I am using a JSON array to store my data but am not sure why this isn't working.
C:\Program Files\MongoDB\Server\3.2\bin>mongoimport --db playerList --collection data --jsonArray --file ../../../../../nodeProjects/public/data.json
2016-07-20T09:30:05.807-0700 connected to: localhost
2016-07-20T09:30:05.813-0700 imported 1 document
C:\Program Files\MongoDB\Server\3.2\bin>mongo
MongoDB shell version: 3.2.7
connecting to: test
> use playerList
switched to db playerList
> db.playerList.find().pretty()
> db.getCollectionNames()
[ "data" ]
and my data.json file is:
[{"name":"A.J. Green","team":"CIN","pos":"WR","weeklyPts":[{"week":1,"pts":6.3},{"week":2,"pts":10.5},{"week":3,"pts":34.7}]}]
Your collection is data, not playerList, which can be seen in the last line, i.e. db.getCollectionNames(). Change db.playerList.find().pretty() to db.data.find().pretty() and it will work.
The collection name in your find() is wrong: you are doing a find on the playerList collection, but you imported the data into a collection called "data". So try:
db.data.find().pretty()
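Using the same shell session as in the question, that would be:
> use playerList
switched to db playerList
> db.data.find().pretty()
This should print the document from data.json (with an _id field that MongoDB adds on insert).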

unexpected identifier while importing json document into mongodb with mongoimport command

I am trying to import a JSON document into MongoDB but it shows me "unexpected identifier". My JSON document looks something like the following:
[
    {
        "Cancer Sites": "Female Breast",
        "State": "Alabama",
        "Year": 2000,
        "Sex": "Female",
        "Count": 550
    },
    {
        "Cancer Sites": "Female Breast",
        "State": "Alabama",
        "Year": 2000,
        "Sex": "Female",
        "Count": 2340
    },
    {
        "Cancer Sites": "Female Breast",
        "State": "Alabama",
        "Year": 2000,
        "Sex": "Female",
        "Count": 45
    }
]
I tried the following command from my mongo shell but it doesn't work:
mongoimport -d treatment -c stats --file news.json
I am executing it from the mongo shell on the Windows command prompt. My mongo shell is in the C:\mongodb\bin path and my file is also in the same path. Can anyone tell me where I am wrong?
Since the file contains a JSON array, you should use:
mongoimport -d treatment -c stats --jsonArray news.json
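Equivalently, with the input file given via the file flag (assuming news.json is in the directory you run the command from):
mongoimport -d treatment -c stats --jsonArray --file news.json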

Importing JSON or CSV file in MongoDB

I'm new to MongoDB on Windows 7 and I tried to import JSON and CSV files into MongoDB.
1. First I tried importing a JSON file using the command
"C:\>mongodb\bin\mongoimport –host localhost:27017 –db mydb –collection docs"
and it showed this error
"exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting '{': offset:0"
2. When I imported the CSV file I used the command
"C:\mongodb\bin>mongoimport --db mynewdb --collection message --type csv --fields
form,Iname,fistname,lastname --file d:\new folder\csv1.csv"
and I get the error message
"ERROR: multiple occurrences
Import CSV, TSV or JSON data into MongoDB.
When importing JSON documents, each document must be a separate line of the input file"
I downloaded the bulk JSON and CSV files randomly while browsing. I want to know whether they will get imported when the data is not organized, or should the data be organized? If so, where can I get complete bulk JSON and CSV files which are ready to import?
Try using the --jsonArray flag at the end of your command, so it would look like:
C:\>mongodb\bin\mongoimport --host localhost:27017 --db mydb --collection docs --jsonArray
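If you also want to name the input file explicitly rather than pipe it in on stdin, the same command with a file flag might look like this (the path here is only a placeholder):
C:\>mongodb\bin\mongoimport --host localhost:27017 --db mydb --collection docs --jsonArray --file C:\data\docs.json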