Can we interchange JSON schema with YAML schema? Or vice versa? - json

I have a device application that receives data in JSON format. This JSON is generated by another, web-based application using a YAML schema.
The web tool validates this JSON data file against the YAML schema, and my device application also has to validate it against a schema. Since resources on my device are limited and we already have JSON schema validation in place, we are restricted to using a schema in JSON format only.
So my question is: could we replace the YAML schema with a JSON schema for the web tool? The web application uses Swagger.
On another note, is there any existing script or open-source tool to convert a YAML schema to a JSON schema?
I am not sure about the OpenAPI definition. It's a simple schema file that will be used to validate JSON data. The JSON schema (draft v4) has the format below. Our device application is written in C++. I am not sure what the web tool uses internally, but it has some Swagger framework that generates the JSON data file for us.
{
  "$schema": "https://json-schema.org/draft/2019-09/schema",
  "definitions": {
    ...
    "foobar_Result": {
      "type": "object",
      "properties": {
        "request": {
          "type": "integer"
        },
        "success": {
          "type": "boolean"
        },
        "payload": {
          "type": "array",
          "items": { "$ref": "#/definitions/foobar_Parameter" }
        }
      },
      "required": ["request"],
      "additionalProperties": false
    }
  },
  "$ref": "#/definitions/foobar_Result"
}

If you are looking to convert between API specification formats, then this tool might help: https://www.apimatic.io/transformer/
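If the YAML file is really just JSON Schema written in YAML syntax (rather than a full OpenAPI/Swagger document), the conversion is mostly a change of serialization format. Here is a minimal Java sketch, assuming Jackson with its jackson-dataformat-yaml module is available; the file names are placeholders:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import java.io.File;

public class YamlSchemaToJson {
    public static void main(String[] args) throws Exception {
        // Read the YAML schema into a generic object tree.
        ObjectMapper yamlMapper = new ObjectMapper(new YAMLFactory());
        Object schema = yamlMapper.readValue(new File("schema.yaml"), Object.class);

        // Write the same tree back out as JSON.
        ObjectMapper jsonMapper = new ObjectMapper();
        jsonMapper.writerWithDefaultPrettyPrinter()
                  .writeValue(new File("schema.json"), schema);
    }
}

The reverse direction (JSON to YAML) works the same way with the two mappers swapped.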

Related

Converting one JSON structure into another JSON structure

The client side expects a JSON string in the below format:
"descriptions": [
  {
    "lang": "string",
    "size": "string",
    "text": "string",
    "type": "string"
  }
],
but the received JSON is a bit different, like below:
"descriptions":{
"desc":[
{
"size":string,
"lang":"string",
"type":"string",
"content":"string"
}
]
},
Is there any way to ignore the "desc" part, for example using a JSON annotation? Context: I am passing this JSON through a REST API call and it will be automatically converted to a Java object at the receiving end.
A simple
descriptions = descriptions.desc;
will do.
You just construct the object you need:
var clientDescriptions = descriptions.desc.map(function(d) {
  return {
    size: d.size,
    type: d.type,
    lang: d.lang,
    text: d.content // the incoming "content" field becomes the expected "text" field
  };
});
If you are using Gson, it is possible to use a custom JsonDeserializer / JsonSerializer for your Java model. Your model object and API can then be implemented cleanly without having to deal with the different JSON structures.
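As a rough sketch of that approach (the class and field names here are illustrative, not taken from the original post), a deserializer can unwrap the extra "desc" level and map "content" to "text":

import com.google.gson.*;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.*;

// Hypothetical client-side model; field names follow the expected format.
class Description {
    String lang;
    String size;
    String text;
    String type;
}

// Unwraps the "desc" level and maps "content" to "text" while deserializing.
class DescriptionsDeserializer implements JsonDeserializer<List<Description>> {
    @Override
    public List<Description> deserialize(JsonElement json, Type typeOfT,
                                         JsonDeserializationContext context) throws JsonParseException {
        List<Description> result = new ArrayList<>();
        for (JsonElement e : json.getAsJsonObject().getAsJsonArray("desc")) {
            JsonObject o = e.getAsJsonObject();
            Description d = new Description();
            d.lang = o.get("lang").getAsString();
            d.size = o.get("size").getAsString();
            d.type = o.get("type").getAsString();
            d.text = o.get("content").getAsString(); // server sends "content", client model uses "text"
            result.add(d);
        }
        return result;
    }
}

The deserializer is then registered when building the Gson instance:

Gson gson = new GsonBuilder()
    .registerTypeAdapter(new TypeToken<List<Description>>() {}.getType(),
                         new DescriptionsDeserializer())
    .create();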
In my personal experience, the backend has a bug that should be fixed; it shouldn't be sending malformed data. In the long run that is the best solution.

JSON schema without "properties" keyword

A vendor sent me a JSON schema. Please look at this:
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "definitions": {
    ...
  },
  "oneOf": [
    { "$ref": "#/definitions/commons/strings/text" },
    { "$ref": "#/definitions/dto/scriptStep" },
    { "$ref": "#/definitions/dto/callResult" }
  ]
}
There is no "properties" keyword (though there is a very large "definitions" part). Does that mean the schema actually describes an empty JSON object {}? Or does it mean the JSON could contain one of the elements from the "oneOf" array?
All JSON Schema keywords are constraints. For example, the empty schema {} means that any JSON is valid. A schema with just { "type": "object" } means that any JSON object is valid; there are no constraints on what properties the object has.
However, that isn't all your vendor is expressing in this schema. The JSON not only has to be valid against "type": "object", but also against one of the three schemas referenced in oneOf. Presumably, those schemas include a properties keyword.
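As a minimal illustration of how keywords combine as constraints (this is not the vendor's schema, just an example):

{
  "type": "object",
  "oneOf": [
    { "required": ["a"] },
    { "required": ["b"] }
  ]
}

Against this schema, {"a": 1} is valid, {} is rejected because it satisfies neither oneOf branch, and a non-object such as "hello" is rejected by the type keyword.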
This probably isn't the best designed schema, but it is valid.

Elasticsearch - Sense - Indexing JSON files?

I'm trying to load some JSON files to my local ES instance via Sense, but I can't seem to figure the code out. I know ES has the Bulk API and the Index API, but I can't seem to bring the code together. How can I upload/index JSON files to my local ES instance using Sense? Thank you!
Yes, ES has a bulk API to upload JSON files to the ES cluster. I don't think that API is exposed through Sense, which is just JavaScript running in the browser; I don't think the Chrome browser will support executing this command. High-level clients are available in Java and C# that expose more control over the ES cluster.
To upload a JSON file to Elasticsearch using the bulk API:
1) This command uploads JSON documents from a JSON file:
curl -s -XPOST localhost:9200/_bulk --data-binary "@path_to_file"
2) The JSON file should be formatted as follows:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{ "field1" : "value3" }
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "3" } }
{ "field2" : "value2" }
Each document line carries the JSON data itself, and the preceding index object carries the metadata for that document, such as the document ID, the type within the index, and the index name.
See the Elasticsearch bulk API documentation for more details.

Date Field Schema in Avro Input Kettle

I'm using Pentaho Data Integration (Kettle) for an ETL process, extracting from a MongoDB source.
My source has an ISODate field, so the JSON returned from the extraction looks like:
{ "_id" : { "$oid" : "533a0180e4b026f66594a13b"} , "fac_fecha" : { "$date" : "2014-04-01T00:00:00.760Z"} , "fac_fedlogin" : "KAYAK"}
Now I have to deserialize this JSON with an Avro Input step, so I've defined the Avro schema like this:
{
  "type": "record",
  "name": "xml_feeds",
  "fields": [
    {"name": "fac_fedlogin", "type": "string"},
    {"name": "fac_empcod", "type": "string"},
    {"name": "fac_fecha", "type": "string"}
  ]
}
Ideally fac_fecha would be a date type, but Avro doesn't support this.
At execution time, the Avro Input step rejects all rows because they have an error. This only occurs when I use the date field.
Any suggestions on how I can do this?
Kettle version: 4.4.0
Pentaho-big-data-plugin: 1.3.0
You can convert this date string to a long (milliseconds since the epoch).
This can be done in both Java and JavaScript.
You can then convert the long back to a Date if required.
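A minimal Java sketch of that conversion, assuming a Java 8+ runtime with java.time (on older JDKs, SimpleDateFormat would be the equivalent):

import java.time.Instant;

public class DateToMillis {
    public static void main(String[] args) {
        // The $date value exported from MongoDB is an ISO-8601 instant.
        String isoDate = "2014-04-01T00:00:00.760Z";

        // Parse it and take the epoch value in milliseconds (stored as an Avro "long").
        long millis = Instant.parse(isoDate).toEpochMilli();
        System.out.println(millis);   // 1396310400760

        // Convert back to an instant when a date is needed again.
        Instant restored = Instant.ofEpochMilli(millis);
        System.out.println(restored); // 2014-04-01T00:00:00.760Z
    }
}

The corresponding Avro field would then be declared as {"name": "fac_fecha", "type": "long"}.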
The easiest solution I found for this problem was upgrading the Pentaho Big Data Plugin to a newer version, 1.3.3.
With this new version, explicitly specifying the schema for the MongoDB Input JSON is avoided, so the final solution looks as follows:
[screenshot: global view of the transformation]
[screenshot: the MongoDB Input step settings]
The schema is detected automatically and can be modified.

Create the structure of an empty collection in MongoDB

Is there any way to create the structure of an empty collection in MongoDB using mongoimport from a JSON file like the one below?
"Users" : {
"name" : "string",
"telephone" : {
"personal": { "type": "number" },
"job": { "type" : "number" }
},
"loc" : "array",
"friends" : "object"
}
My goal is to create a MongoDB schema from JSON files.
Yes, you can mongoimport a JSON file, and if you clear out the values of those fields (set them to ""), importing your JSON file should do just that.
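For example, a hypothetical invocation (assuming the database is named mydb and the document above is saved as valid JSON in users.json):

mongoimport --db mydb --collection Users --file users.json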
However, MongoDB is a NoSQL database, and creating a schema in the MongoDB database doesn't really make sense. What will happen is that you'll have one record with fields whose values are empty.