Mongo function to validate data imported from JSON file - json

Is there any way to validate data that I imported from a JSON file using mongoimport or third-party tools like Studio 3T?
For example, say I have an example.json file like this:
[
  {
    "firstnumber": 3,
    "secondnumber": 2
  }
]
After I import it into MongoDB, how can I validate the data against the JSON file to make sure the correct data was imported?
Any help would be greatly appreciated.
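One common way to sanity-check such an import is to re-read the source file and compare it against what actually landed in the collection. A minimal PyMongo sketch, assuming a local server and a hypothetical mydb.numbers namespace:

import json
from pymongo import MongoClient

# Load the source file that was fed to mongoimport
with open("example.json") as f:
    expected = json.load(f)

client = MongoClient("mongodb://localhost:27017")
coll = client["mydb"]["numbers"]  # hypothetical db/collection names

# 1. Compare document counts
assert coll.count_documents({}) == len(expected), "document count mismatch"

# 2. Check that every source document exists with the same field values
for doc in expected:
    assert coll.find_one(doc) is not None, f"missing or altered document: {doc}"

print("import verified")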

Related

Ingest JSON-STAT from a REST API to CSV on Azure Data Factory

I want to make a request to a REST API of the Norwegian statistics bureau (SSB).
According to their API I need to make a POST request (that's unusual, but fine) to get the tables that I need. In the body of the POST request I can specify which kind of table I want.
What I get back is a JSON-STAT file. I need some help with how to handle it in Azure Data Factory. The documentation says ADF supports only JSON when using a REST API. Does that mean it also supports JSON-STAT? If so, how can I handle it in the copy activity (source/sink)?
Thanks in advance,
Nunotrt
Here is a sample process for ingesting a JSON-STAT file from a REST API. Create a linked service to the REST API and a new REST dataset on top of it. Use the REST dataset as the source of the copy activity, set the request method to POST, and provide the request body:
{
  "query": [
    {
      "code": "Region",
      "selection": { "filter": "agg:KommSummer", "values": [ "K-3006" ] }
    }
  ],
  "response": { "format": "json-stat2" }
}
Then create a linked service for the sink dataset, and set a file path on the sink dataset so the copied file lands in the desired location. The result comes out as expected.
If you want to convert a JSON-STAT file to CSV or Parquet, an alternative is to run a small piece of Python code as part of your Data Factory job.
The Python library pyjstat makes it easy to convert JSON-STAT to a DataFrame, and from a DataFrame to any other file format.
Here is an example with JSON-STAT 1.2:
import pandas as pd
from pyjstat import pyjstat as pyj
# URL of a JSON-STAT dataset from Statistics Norway (SSB)
file_ssb = 'https://data.ssb.no/api/v0/dataset/1102.json'
# Read the JSON-STAT data into memory
dataset = pyj.Dataset.read(file_ssb)
# Convert the JSON-STAT data to a pandas DataFrame
df = dataset.write('dataframe')
print(df)
# Save as CSV
df.to_csv('../Data/test_jsonstat.csv')
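The question involves a POST request rather than a fixed dataset URL; pyjstat can parse the JSON-STAT text returned by a POST just as well. A rough sketch: the table URL below is only illustrative (table 07459 is an assumption), and the request body is the one from the copy-activity example above.

import requests
from pyjstat import pyjstat

# Hypothetical StatBank table endpoint; substitute the table you actually need
url = 'https://data.ssb.no/api/v0/en/table/07459'

body = {
    "query": [
        {
            "code": "Region",
            "selection": {"filter": "agg:KommSummer", "values": ["K-3006"]}
        }
    ],
    "response": {"format": "json-stat2"}
}

# POST the query and parse the JSON-STAT response text
response = requests.post(url, json=body)
response.raise_for_status()
dataset = pyjstat.Dataset.read(response.text)

# Convert to a DataFrame and save as CSV
df = dataset.write('dataframe')
df.to_csv('test_jsonstat.csv', index=False)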

Add data to a json file in terraform

I am not a software guy, but I am preparing a small script for declarative onboarding work using Terraform. The declaration is in a JSON file and it is sent to the device through Terraform. However, I do not want to keep sensitive data like passwords and secret keys in the JSON file. I want to inject them from a separate file in a vault into the JSON file when I execute it in Terraform. I can use Terraform to read the data from a vault or an S3 bucket, but I am not sure how to add it to the JSON declaration. I saw that Terraform has functions like jsondecode and jsonencode, but I am not sure whether they are of any use for my requirement.
Could someone help me with this please?
Thanks.
Maybe this is something that could help you.
Create a data source that loads your template file; here you can pass in some variables.
Some examples of variables:
data "template_file" "my_template" {
  # Render template.json, substituting the variables below
  template = file("template.json")
  vars = {
    foo         = "bar"
    another_foo = aws_s3_bucket.my_bucket.id
    more_foo    = var.my_variable
  }
}
The template file could look like this:
{
  "Value": "123",
  "value_2": "${foo}",
  "value_3": "${another_foo}",
  "value_4": ${more_foo}
}

Export from MongoDB Cloud into local MongoDB ("$oid is not valid for storage." error)

I'm just learning about MongoDB. Is there an easy way to export/import data from MongoDB?
The task at hand is simple: Using MongoDB Cloud, copy a document from a collection (using the Copy Document button located in the Atlas > Collection section) and be able to import it into my local MongoDB DB. Doing so, I got the following:
{
  "_id": {"$oid": "5e666e346e781e0b34864de4"},
  "created_at": {"$date": {"$numberLong": "1583771188026"}}
}
Trying to import that into my local MongoDB using db.my_collection.insert() leads me to the following error:
WriteResult({
  "nInserted" : 0,
  "writeError" : {
    "code" : 52,
    "errmsg" : "$oid is not valid for storage."
  }
})
I've already done my research and found that MongoDB produces export data in Extended JSON v2.0, in Relaxed mode by default or in Canonical mode.
So is the documentation really telling me that the format MongoDB uses for export is not natively supported for direct import? What did I get wrong?
How can I import directly what is being exported?
So... the mongo shell's insert() doesn't parse the Extended JSON operators ($oid, $date) that the export format uses.
The solution to your problem is simply to replace
{
  "_id": {"$oid": "5e666e346e781e0b34864de4"},
  "created_at": {"$date": {"$numberLong": "1583771188026"}}
}
with:
{
  "_id": ObjectId("5e666e346e781e0b34864de4"),
  "created_at": new Date(1583771188026)
}
This way you'll be able to import your data without any problem.
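If you are scripting the import instead of pasting into the shell, PyMongo's bson.json_util parses Extended JSON ($oid, $date and friends) directly. A minimal sketch, assuming a local server and a hypothetical test.my_collection namespace:

from pymongo import MongoClient
from bson import json_util

# The document exactly as copied from Atlas (Extended JSON v2)
raw = '''{
  "_id": {"$oid": "5e666e346e781e0b34864de4"},
  "created_at": {"$date": {"$numberLong": "1583771188026"}}
}'''

# json_util.loads turns $oid/$date into ObjectId/datetime values
doc = json_util.loads(raw)

client = MongoClient("mongodb://localhost:27017")
client["test"]["my_collection"].insert_one(doc)  # hypothetical db/collection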
MongoDB uses the BSON format to store its data, so the JSON has to be converted to BSON before it is imported. Alternatively,
you can use MongoDB Compass to import your JSON file, and Compass will do the conversion for you!
I found another method to solve this problem. You can put the JSON file on your server and use the mongoimport command to load it into a particular collection. An example is as follows:
mongoimport --host localhost --port 27017 --username ezsonaruser --password 123456 --db ezsonar_25 --collection host_locations_test --file /root/shaql/host_locations.json

pymongo how to read Json file

I have an 'abnormal' JSON file.
The file body looks like this:
{
  "_id" : ObjectId("aaaddd"),
  "created_at" : ISODate("2017-05-26T18:04:31.315Z"),
  "updated_at" : ISODate("2017-05-26T18:04:31.315Z"),
}
I have tried many ways to import this into Mongo with PyMongo,
but I cannot load the file body with a JSON loader or a BSON loader.
I know it is not a regular JSON or BSON file,
but I can import it into MongoDB successfully with mongoimport.
Does anyone know how to fix this and make it work? How can I import this file into MongoDB using PyMongo?
Because the contents of that file are not JSON, they cannot be parsed by PyMongo's JSON parser. (PyMongo just uses the Python standard JSON parser to do most of the work.) Only mongoimport understands that file format, so you must use mongoimport to load it into MongoDB.
If files like this one are part of your regular workflow, I recommend you create files that are standard JSON using mongoexport, instead of this non-JSON format.
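If you do need to load such a file from PyMongo, one workaround (not mentioned in the answer above) is to rewrite the shell-style constructors into Extended JSON and then let bson.json_util parse it. A rough sketch, assuming the file only contains ObjectId(...) and ISODate(...) wrappers (note that the placeholder "aaaddd" in the question is not a valid 24-hex-character ObjectId, so real ids are assumed):

import re
from bson import json_util
from pymongo import MongoClient

# Read the shell-style file (hypothetical file name)
with open("abnormal.json") as f:
    text = f.read()

# Rewrite mongo-shell constructors into Extended JSON equivalents
text = re.sub(r'ObjectId\("([^"]*)"\)', r'{"$oid": "\1"}', text)
text = re.sub(r'ISODate\("([^"]*)"\)', r'{"$date": "\1"}', text)
# Drop trailing commas before a closing brace, which strict JSON forbids
text = re.sub(r',(\s*})', r'\1', text)

# json_util turns $oid/$date back into ObjectId/datetime values
doc = json_util.loads(text)

MongoClient()["test"]["my_collection"].insert_one(doc)  # hypothetical namespace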

For jmeter post request, how can I generate input json from csv file?

I am trying to make a POST REST call to my service. My sample input JSON file is:
{
  "$id": "1",
  "description": "sfdasd"
}
I have one CSV file which contains a bunch of ids and descriptions. So is there an option to convert the CSV file to JSON objects and pass them to the POST call?
Assuming your CSV file is called test.csv, is located in JMeter's "bin" folder, and holds the id and description values:
Add a CSV Data Set Config to your Test Plan, point it at the file, and set the variable names to match the columns (id and description).
You can then inline the defined JMeter Variables directly into your request body like this:
{
  "$id": "${id}",
  "description": "${description}"
}
So when you run the test, the variable placeholders in the HTTP Request sampler will automatically be substituted with the values from the CSV file.
See the Using CSV Data Set Config article for more information on parameterizing JMeter tests with CSV files.
JSON is just text. Send it as is, with the id value taken from the CSV:
{ "$id": "${id}", "description": "sfdasd" }
CSV data can be converted to JSON via a POJO using Jackson. If a POJO is not already defined or required, you can always use the Java Collection classes to store the parsed data and later convert them to JSON. http://www.novixys.com/blog/convert-csv-json-java/ is a good reference on how to convert CSV to JSON in Java.
You can check the following links too.
1. Directly convert a CSV file to a JSON file using the Jackson library
2. Converting a CSV file to a JSON object in Java