The mongoimport command returns the correct number of documents and adds a new collection, but when I try to open my db there is nothing. I am using a JSON array to store my data, but I'm not sure why this isn't working.
C:\Program Files\MongoDB\Server\3.2\bin>mongoimport --db playerList --collection data --jsonArray --file ../../../../../nodeProjects/public/data.json
2016-07-20T09:30:05.807-0700 connected to: localhost
2016-07-20T09:30:05.813-0700 imported 1 document
C:\Program Files\MongoDB\Server\3.2\bin>mongo
MongoDB shell version: 3.2.7
connecting to: test
> use playerList
switched to db playerList
> db.playerList.find().pretty()
> db.getCollectionNames()
[ "data" ]
And my data.json file is:
[{"name":"A.J. Green","team":"CIN","pos":"WR","weeklyPts":[{"week":1,"pts":6.3},{"week":2,"pts":10.5},{"week":3,"pts":34.7}]}]
Your collection is data, not playerList, as you can see in the last line, i.e. db.getCollectionNames(). Change db.playerList.find().pretty() to db.data.find().pretty() and it will work.
The collection name in your find() is wrong: you are querying the playerList collection, but you imported the data into a collection called "data". So try:
db.data.find().pretty()
Related
When I run the following command:
mongoimport -v -d ntsb -c data xml_results.json --jsonArray
I get this error:
2020-07-15T22:51:41.267-0400 using write concern: &{majority false 0}
2020-07-15T22:51:41.270-0400 filesize: 68564556 bytes
2020-07-15T22:51:41.270-0400 using fields:
2020-07-15T22:51:41.270-0400 connected to: mongodb://localhost/
2020-07-15T22:51:41.270-0400 ns: ntsb.data
2020-07-15T22:51:41.271-0400 connected to node type: standalone
2020-07-15T22:51:41.271-0400 Failed: error processing document #1: invalid character '}' looking for beginning of object key string
2020-07-15T22:51:41.271-0400 0 document(s) imported successfully. 0 document(s) failed to import.
I have tried all the solutions in this thread and nothing worked. My JSON file is about 60 MB in size, so it would be very hard to go through it by hand and find the bracket issue. I suspect it may be a problem with the UTF-8 formatting. I take an XML file I downloaded from the internet and convert it into JSON with a Python script. When I try the --jsonArray flag, it gives the same error. Any ideas? Thanks!
It turns out that within this massive file there were a few unnecessary commas. I was able to use Python's built-in JSON parsing to jump to the lines with errors and remove them manually. As far as I can tell, the invalid character had nothing to do with the } itself; the stray comma before it made the parser expect another value before the closing bracket.
After solving this, I was still unable to import successfully because the file was too large. The trick around this was to surround all the JSON objects with array brackets [] and use the following command: mongoimport -v -d ntsb -c data xml_results.json --batchSize 1 --jsonArray
After a few seconds the data imported successfully into Mongo.
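The trick of using Python's JSON parser to locate bad commas can be sketched from the shell. This is a hedged sketch: broken.json is a stand-in file created here for illustration, not the actual 60 MB data set from the question.

```shell
# Create a deliberately broken JSON file: a trailing comma before the
# closing brace, the same class of error described in the question.
cat > broken.json <<'EOF'
[{"name": "A", "count": 1,}]
EOF

# python -m json.tool parses the file and, on failure, reports the line
# and column of the first syntax error, then exits non-zero.
python3 -m json.tool broken.json || echo "invalid JSON detected"
```

The reported line/column lets you jump straight to the offending spot in an editor instead of scanning the whole file.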
I am trying to import a JSON file into MongoDB using mongoimport. It throws the following error: Failed: error processing document #1: read C:\Users\mbryant2\Documents\primer-dataset.json: The handle is invalid.
Here is my cmd:
$ mongoimport --db tempTestDb --collection restaurants --drop --file C:/Users/mbryant2/Documents/primer-dataset.json
and response:
2018-09-14T12:17:36.337-0600 connected to: localhost
2018-09-14T12:17:36.338-0600 dropping: tempTestDb.restaurants
2018-09-14T12:17:36.339-0600 Failed: error processing document #1: read C:\Users\mbryant2\Documents\primer-dataset.json: The handle is invalid.
2018-09-14T12:17:36.339-0600 imported 0 documents
Anyone have any ideas on what I am missing? Is it needing login credentials or something like that?
If the data is represented as a JSON array, rather than individual lines of JSON text, you will need to add the --jsonArray parameter to mongoimport.
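The two input formats can be illustrated side by side. This is a sketch with made-up file names: mongoimport's default input is newline-delimited JSON (one complete document per line), while --jsonArray accepts a single top-level array.

```shell
# Newline-delimited JSON (mongoimport's default input format):
# one complete document per line, no enclosing brackets or commas.
cat > players.ndjson <<'EOF'
{"name": "A.J. Green", "team": "CIN"}
{"name": "Julio Jones", "team": "ATL"}
EOF

# JSON array: one top-level [ ... ] containing all the documents.
cat > players.array.json <<'EOF'
[
  {"name": "A.J. Green", "team": "CIN"},
  {"name": "Julio Jones", "team": "ATL"}
]
EOF

# The first file imports with the default settings; the second needs
# --jsonArray (shown as comments since they require a running mongod):
# mongoimport --db playerList --collection data --file players.ndjson
# mongoimport --db playerList --collection data --jsonArray --file players.array.json
```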
I have successfully used mysqldump to export a db schema without data. I now want to import this db schema.
I have tried a couple of methods but come across an error related to the < character.
Any ideas?
As the error shows, input redirection with the < sign does not work in PowerShell. Instead, use the Get-Content cmdlet and pipe its output into the command's input:
Get-Content v:\mcsdb.sql | mysql -u root --sql --recreate-schema mcs
If you are interested in more in-depth info, refer to the Documentation.
I am trying to import a JSON document into MongoDB, but it shows me "unexpected identifier". My JSON document looks something like the following:
[
{
"Cancer Sites":"Female Breast",
"State":"Alabama",
"Year":2000,
"Sex":"Female",
"Count":550
},
{
"Cancer Sites":"Female Breast",
"State":"Alabama",
"Year":2000,
"Sex":"Female",
"Count":2340
},
{
"Cancer Sites":"Female Breast",
"State":"Alabama",
"Year":2000,
"Sex":"Female",
"Count":45
}
]
I tried the following command from my mongo shell, but it doesn't work:
mongoimport -d treatment -c stats --file news.json
I am executing it from the mongo shell on the Windows command prompt. My mongo shell is in the C:\mongodb\bin path and my file is also in the same path. Can anyone tell me where I am wrong?
Since the file contains a JSON array, you should use:
mongoimport -d treatment -c stats --jsonArray --file news.json
Also note that mongoimport is a standalone program, not a shell command: run it from the Windows command prompt, not from inside the mongo shell. Typing it inside the mongo shell is what produces the "unexpected identifier" error, because the shell tries to parse it as JavaScript.
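An alternative sketch, if you would rather not use --jsonArray: convert the array to newline-delimited JSON, which mongoimport accepts by default. The sample content below stands in for the real news.json from the question.

```shell
# A small stand-in for the news.json array from the question.
cat > news.json <<'EOF'
[{"Cancer Sites": "Female Breast", "State": "Alabama", "Year": 2000, "Sex": "Female", "Count": 550}]
EOF

# Rewrite the array as one JSON document per line (newline-delimited JSON).
python3 <<'PY'
import json
with open('news.json') as f:
    docs = json.load(f)
with open('news.ndjson', 'w') as out:
    for doc in docs:
        out.write(json.dumps(doc) + '\n')
PY

# Then the default invocation works without --jsonArray
# (commented out here because it needs a running mongod):
# mongoimport -d treatment -c stats --file news.ndjson
```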
I'm new to MongoDB on Windows 7, and I tried to import JSON and CSV files into MongoDB.
1. First I tried importing a JSON file using the command
"C:\>mongodb\bin\mongoimport --host localhost:27017 --db mydb --collection docs"
and it showed this error:
"exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting '{': offset:0"
2. When I imported a CSV file I used the command
"C:\mongodb\bin>mongoimport --db mynewdb --collection message --type csv --fields
form,Iname,fistname,lastname --file d:\new folder\csv1.csv"
and I got the error message:
"ERROR: multiple occurrences
Import CSV, TSV or JSON data into MongoDB.
When importing JSON documents, each document must be a separate line of the input file"
I downloaded the JSON and CSV bulk files randomly while browsing. I want to know whether the import will work when the data is unorganized, or whether the data has to be organized first. If so, where can I get complete bulk JSON and CSV files that are ready to import?
Try using the --jsonArray flag at the end of your command, so it would look like:
C:\>mongodb\bin\mongoimport --host localhost:27017 --db mydb --collection docs --jsonArray