pymongo: how to read a JSON file

I have an 'abnormal' JSON file.
The file body looks like this:
{
"_id" : ObjectId("aaaddd"),
"created_at" : ISODate("2017-05-26T18:04:31.315Z"),
"updated_at" : ISODate("2017-05-26T18:04:31.315Z"),
}
I have tried many ways to import this into Mongo with pymongo,
but I cannot load the file body with a JSON loader or a BSON loader.
I know it is not a regular JSON or BSON file,
but mongoimport imports this file into MongoDB successfully.
So does anyone know how to fix this and make it work? And how can I import this file into MongoDB using pymongo?

Because the contents of that file are not JSON, they cannot be parsed by PyMongo's JSON parser. (PyMongo just uses the Python standard JSON parser to do most of the work.) Only mongoimport understands that file format, so you must use mongoimport to load it into MongoDB.
If files like this one are part of your regular workflow, I recommend you create files that are standard JSON using mongoexport, instead of this non-JSON format.
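If you do need to load such a file from Python, one workaround (a sketch of my own, not part of the answer above) is to rewrite the shell-style constructors into Extended JSON with regular expressions first. The helper name shell_to_extended_json and the 24-hex ObjectId value below are mine, for illustration; with PyMongo installed, feeding the converted text to bson.json_util.loads instead of json.loads yields real ObjectId and datetime values you can pass to insert_one.

```python
import json
import re

def shell_to_extended_json(text):
    """Convert mongo-shell constructs to Extended JSON so a JSON parser accepts them."""
    # ObjectId("...") -> {"$oid": "..."}
    text = re.sub(r'ObjectId\("([^"]*)"\)', r'{"$oid": "\1"}', text)
    # ISODate("...") -> {"$date": "..."}
    text = re.sub(r'ISODate\("([^"]*)"\)', r'{"$date": "\1"}', text)
    # Drop the trailing comma before a closing brace (also invalid JSON)
    text = re.sub(r',\s*}', '}', text)
    return text

raw = '''{
"_id" : ObjectId("5e666e346e781e0b34864de4"),
"created_at" : ISODate("2017-05-26T18:04:31.315Z"),
}'''

doc = json.loads(shell_to_extended_json(raw))
print(doc["_id"]["$oid"])  # 5e666e346e781e0b34864de4
```

This only covers the two constructors that appear in the question; a file using other shell types (NumberLong, BinData, ...) would need more substitutions.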

Related

How to remove the JSON wrapper ( {"text": ) from a CSV response in WSO2

I am new to WSO2. I wrote a custom Java class mediator to transform a JSON request into CSV format. I have a proxy service to SFTP the generated CSV file to a (MoveIT) folder.
The custom mediator converts the JSON request to CSV properly. But when I send the CSV file to the endpoint using transport.vfs.replyfilename, I see the 'text' wrapper in the CSV file, as below:
{"text":"1,F20175_A.CSV,20200623135039\n2,123456789,2017-MO-BX-0048,123456789,987654321,Y/N,C/A,20190101,20201231,20190930,75000,44475.86,15563.52,0.00,15563.52,0.00,60039.38,14960.62,60039.38,0.00,20191218\n3,1,999999999\n"}
I set the contentType and messageType properties to "text/plain". I also set transport.vfs.ContentType to text/plain.
I know I am missing something. Has anybody come across this issue in WSO2 EI 6.6? Should I go ahead and write a custom message formatter? Any tips?
I want the output to be written as:
1,F20169_A.CSV,20200617153638
2,123456789,2017-MO-BX-0048,123456789,987654321,Y/N,C/A,20190101,20201231,20190930,75000,44475.86,15563.52,0.00,15563.52,0.00,60039.38,14960.62,60039.38,0.00,20191218
3,1,999999999
Thanks!
It seems that after converting to CSV, the result is wrapped in a text tag. Therefore, when writing to the file with the file connector [1], access the text content using the //text() XPath expression. Please refer to the following sample configuration of inputContent:
<fileconnector.create>
<filePath>{$ctx:filePath}</filePath>
<inputContent>{//text()}</inputContent>
<encoding>{$ctx:encoding}</encoding>
<setTimeout>{$ctx:setTimeout}</setTimeout>
<setPassiveMode>{$ctx:setPassiveMode}</setPassiveMode>
<setSoTimeout>{$ctx:setSoTimeout}</setSoTimeout>
<setUserDirIsRoot>{$ctx:setUserDirIsRoot}</setUserDirIsRoot>
<setStrictHostKeyChecking>{$ctx:setStrictHostKeyChecking}</setStrictHostKeyChecking>
</fileconnector.create>
[1]-https://docs.wso2.com/display/ESBCONNECTORS/Working+with+the+File+Connector#WorkingwiththeFileConnector-create

Export from MongoDB Cloud into local MongoDB ("$oid is not valid for storage." error)

I'm just learning about MongoDB. Is there an easy way to export/import data from MongoDB?
The task at hand is simple: Using MongoDB Cloud, copy a document from a collection (using the Copy Document button located in the Atlas > Collection section) and be able to import it into my local MongoDB DB. Doing so, I got the following:
{
"_id": {"$oid":"5e666e346e781e0b34864de4"},
"created_at":{"$date":{"$numberLong":"1583771188026"}}
}
Trying to import that into my local MongoDB using db.my_collection.insert() leads me to the following error:
WriteResult({
"nInserted" : 0,
"writeError" : {
"code" : 52,
"errmsg" : "$oid is not valid for storage."
}
})
So I've already done my research and found that MongoDB exports data in Extended JSON v2.0 (Relaxed mode) by default, or in Canonical mode.
So the documentation is really telling me that the format MongoDB uses for export is not natively supported for direct import? What did I get wrong?
How can I import directly what is being exported?
So... MongoDB doesn't support its own export format to import data into a collection.
The solution for your problem would be to replace the Extended JSON operators with their mongo shell equivalents, changing
{
"_id": {"$oid":"5e666e346e781e0b34864de4"},
"created_at":{"$date":{"$numberLong":"1583771188026"}}
}
to:
{
"_id": ObjectId("5e666e346e781e0b34864de4"),
"created_at": new Date(1583771188026)
}
Note that "$date" needs the same treatment as "$oid": the shell rejects any field name starting with $ for storage, so leaving it in place would raise the same error 52.
This way you'll be able to import your data without any problem.
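If you'd rather insert the exported document programmatically, PyMongo ships bson.json_util.loads, which understands both operators and returns real ObjectId and datetime values ready for insert_one. As a stdlib-only sketch of roughly what that decoding does (the function name decode_extended is mine, not a PyMongo API, and it returns the oid as a plain string where PyMongo would build a bson.ObjectId):

```python
import json
from datetime import datetime, timezone

def decode_extended(value):
    """Recursively turn canonical Extended JSON operators into native values.
    (bson.json_util.loads does this properly; this is a minimal sketch.)"""
    if isinstance(value, dict):
        if set(value) == {"$oid"}:
            return value["$oid"]  # PyMongo would build bson.ObjectId here
        if set(value) == {"$date"}:
            inner = value["$date"]
            millis = int(inner["$numberLong"]) if isinstance(inner, dict) else inner
            return datetime.fromtimestamp(millis / 1000, tz=timezone.utc)
        return {k: decode_extended(v) for k, v in value.items()}
    if isinstance(value, list):
        return [decode_extended(v) for v in value]
    return value

exported = '{"_id": {"$oid":"5e666e346e781e0b34864de4"}, "created_at":{"$date":{"$numberLong":"1583771188026"}}}'
doc = decode_extended(json.loads(exported))
print(doc["_id"], doc["created_at"].isoformat())
```

In real code, prefer `from bson.json_util import loads; doc = loads(exported)` and insert the result directly.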
MongoDB uses the BSON format to store its data, so you need to convert the JSON to BSON and then import it. Alternatively,
you can use MongoDB Compass to import your JSON file, and Compass will do the job for you!
I found another method to solve this problem. You can put the JSON file on your server and use the mongoimport command to load it into a particular collection. An example is as follows:
mongoimport --host localhost --port 27017 --username ezsonaruser --password 123456 --db ezsonar_25 --collection host_locations_test --file /root/shaql/host_locations.json

Mongo function to validate data imported from JSON file

Is there any way to validate data that I imported from a JSON file using mongoimport or other third-party tools like Studio 3T?
If I have example.json file like that:
[
{
"firstnumber" : 3,
"secondnumber" : 2,
}
]
If I import it into MongoDB, how can I validate the data against the JSON file to make sure the correct data was imported?
Any help would be greatly appreciated.
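One simple programmatic check is to load the file and compare it document-by-document against what the collection returns. This is a sketch of that idea, not an existing tool: validate_import is a hypothetical helper, and the commented-out lines assume pymongo plus a local server with your collection.

```python
import json

def validate_import(file_docs, imported_docs, ignore=("_id",)):
    """Return True when every document from the file appears in the imported set,
    comparing all fields except those in `ignore` (the import adds _id)."""
    def strip(doc):
        return {k: v for k, v in doc.items() if k not in ignore}
    remaining = [strip(d) for d in imported_docs]
    for doc in file_docs:
        wanted = strip(doc)
        if wanted in remaining:
            remaining.remove(wanted)  # consume one match so duplicates are counted
        else:
            return False
    return True

# with open("example.json") as f:
#     file_docs = json.load(f)
# imported_docs = list(MongoClient().mydb.mycoll.find())  # requires pymongo + a server
file_docs = [{"firstnumber": 3, "secondnumber": 2}]
imported_docs = [{"_id": "generated", "firstnumber": 3, "secondnumber": 2}]
print(validate_import(file_docs, imported_docs))  # True
```

Comparing stripped dicts like this only works for fields whose JSON and BSON representations round-trip cleanly (numbers, strings, booleans); dates and ObjectIds would need normalizing first.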

For jmeter post request, how can I generate input json from csv file?

I am trying to make a POST REST call to my service. My sample input JSON body is:
{
"$id": "1",
"description": "sfdasd"
}
I have one CSV file which contains a bunch of ids and descriptions. Is there an option where I can convert the CSV rows to JSON objects and pass them to the POST call?
Assuming your CSV file is called test.csv, is located in JMeter's "bin" folder, and holds the id and description values:
Add a CSV Data Set Config to your Test Plan, set its Filename to test.csv, and list id,description as the Variable Names.
You can inline the defined JMeter Variables directly into your request body like:
{
"$id": "${id}",
"description": "${description}"
}
So when you run the test, the variable placeholders will automatically be substituted with the values from the CSV file in the HTTP Request sampler.
See the Using CSV Data Set Config article for more information on parameterizing JMeter tests with CSV files.
JSON is just text. Send it as-is, with the values taken from the CSV variables:
{ "$id": "${id}", "description": "${description}" }
CSV data can be converted to JSON via a POJO using Jackson. If a POJO is not already defined or required, you can always use the Java collection classes to store the parsed data and later convert it to JSON. http://www.novixys.com/blog/convert-csv-json-java/ is a good reference on how to convert CSV to JSON in Java.
You can check the following links too.
1. Directly convert a CSV file to a JSON file using the Jackson library
2. Converting a CSV file to a JSON object in Java

ReactJS read malformed JSON file

I'm currently trying to learn how to use ReactJS, and I'm trying to read in a malformed JSON file, e.g.:
{
Item : "text",
Desc : "example"
}
Now my understanding is that typically I can use
import data from "data.json"
which would build an object I could use; however, that won't work with malformed JSON.
I think my solution would be to read the file as one big string, use a regex to clean the JSON, and then attempt to parse it, but I'm not really sure how to go about reading the file in.
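That read-as-string-then-clean approach can work when the only defect is unquoted keys, as in the example above. In React you would fetch the raw file as text (e.g. fetch('/data.json').then(r => r.text())) and apply the substitution with String.prototype.replace before JSON.parse, since the static import route requires valid JSON at build time. The regex itself is shown below in Python for illustration, and it assumes keys are simple identifiers:

```python
import json
import re

raw = '''{
Item : "text",
Desc : "example"
}'''

# Quote bare keys: a word that follows "{" or "," and precedes ":" gets wrapped in quotes.
cleaned = re.sub(r'([{,]\s*)([A-Za-z_]\w*)\s*:', r'\1"\2":', raw)

data = json.loads(cleaned)
print(data["Item"])  # text
```

This pattern would misfire on string values that happen to contain "{word:" sequences, so for anything beyond trivially malformed files a lenient parser (e.g. a JSON5-style library) is a safer bet than hand-rolled regexes.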