Back-converting from JSON Schema draft 4 to draft 3

I have a bit of a weird situation. I am running a JSON Schema validator that doesn't support draft 4, and due to corporate wonkiness I'm stuck with it instead of replacing it. Our developers are giving me schemas in draft 4 format, so I have to go through by hand and back-convert them, particularly the required fields.
This has all worked fine up until I hit something like this (consider this pseudo-code; I'm still getting the hang of JSON):
"items": {
"type": "array",
"required" : true,
"items": [
{...},
"required": ["0", "1"] // This bit right here
],
}
I'm told it basically says, "The first two items in the array are required." But I can't find a way to express that in JSON draft 3. Is that even supported, and if so, how would you express it?

required is a keyword that only has meaning for object instances, not for arrays.
The way to indicate that an array must have at least 2 items is through the minItems keyword, in both Draft 3 and Draft 4.
If you need to express any other schema just for the first and second item in an array, you do it by having two schemas in the items array. For instance, the following schema requires that properties "0" and "1" are included in the first and second items in the array.
For Draft 3:
"items" : [{
"properties" : {
"0" : {"required" : true},
"1" : {"required" : true}
}
}, {
"properties" : {
"0" : {"required" : true},
"1" : {"required" : true}
}
}
]
And Draft 4:
"items" : [{
"required" : ["0", "1"]
}, {
"required" : ["0", "1"]
}
]
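Since you are converting the schemas by hand, a rough Python sketch of this one rewrite (Draft 4 object-level "required" lists into Draft 3 per-property flags) might save some time. It is only a sketch: it assumes plain nested dict/list schemas and ignores every other Draft 4 feature.
def draft4_required_to_draft3(schema):
    """Rewrite Draft 4 "required" lists into Draft 3 per-property
    "required": true flags, recursing into nested subschemas."""
    if isinstance(schema, dict):
        required = schema.get("required")
        if isinstance(required, list):          # Draft 4 style only
            del schema["required"]
            props = schema.setdefault("properties", {})
            for name in required:
                props.setdefault(name, {})["required"] = True
        for value in schema.values():
            draft4_required_to_draft3(value)
    elif isinstance(schema, list):
        for value in schema:
            draft4_required_to_draft3(value)
    return schema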

Related

How to generate all test cases from a JSON schema

I am developing a system where we create documents from JSON files. The JSON files are described by a JSON schema of the following kind, where a key can either have a static default value or one of multiple enumerated values:
{
    "name": { "default": "John Doe" },
    "bilingual": { "type": "boolean" },
    "kind_of_document": {
        "type": "integer",
        "enum": [0, 1, 2]
    }
}
What I want to do is to create test files for all possible combinations of values from the schema. In the above case there would be six different json files:
{ "name" : "John Doe", "bilingual" : true, "kind_of_document" : 0 }
{ "name" : "John Doe", "bilingual" : true, "kind_of_document" : 1 }
{ "name" : "John Doe", "bilingual" : true, "kind_of_document" : 2 }
{ "name" : "John Doe", "bilingual" : false, "kind_of_document" : 0 }
{ "name" : "John Doe", "bilingual" : false, "kind_of_document" : 1 }
{ "name" : "John Doe", "bilingual" : false, "kind_of_document" : 2 }
This is a simplified version; in a normal case there may be 20-30 keys, of which most have default values, but there may still be 5-10 keys that can have multiple values. The number of keys may vary from one document to another.
The problem:
How do I iterate over the list of keys to generate all possible combinations of values? I suppose that some form of recursion is the way to go, but I can't figure out what the algorithm should be. It seems to me that this should be a fairly common problem, so somebody has most likely solved it before.
I have tried googling for it, but I don't know how to formulate the query to avoid just general questions about generating combinations of list elements, such as
How to get all possible combinations of a list’s elements?
I have also looked at itertools, but I am not sure how it could help me solve the problem:
https://docs.python.org/3/library/itertools.html#itertools.permutations
I am writing this in Python, but of course the general algorithm for solving this would be language-independent.
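One way to do it, as a sketch against the simplified schema format above: enumerate the candidate values per key and take their Cartesian product with itertools.product (permutations is the wrong tool here, since you want one value per key rather than reorderings):
import itertools
import json

schema = {
    "name": {"default": "John Doe"},
    "bilingual": {"type": "boolean"},
    "kind_of_document": {"type": "integer", "enum": [0, 1, 2]},
}

def candidate_values(spec):
    """Return the list of possible values for one key."""
    if "default" in spec:
        return [spec["default"]]
    if "enum" in spec:
        return spec["enum"]
    if spec.get("type") == "boolean":
        return [True, False]
    raise ValueError("don't know how to enumerate %r" % spec)

keys = list(schema)
# itertools.product yields every combination of one value per key.
for combo in itertools.product(*(candidate_values(schema[k]) for k in keys)):
    print(json.dumps(dict(zip(keys, combo))))
This prints exactly the six documents from the example; each combination could just as easily be written to its own test file.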

How to validate string and number using JSON Schema

I would like to validate a schema based on either its minimum/maximum (number) OR minLength/maxLength (string).
I have a json form:
[
{
"key":"foo",
"title":"Test",
"type":"string"
}
]
and a json schema:
{
    "type": "object",
    "properties": {
        "foo": {
            "type": ["number", "string"],
            "maxLength": 2,
            "minLength": 0,
            "minimum": 3,
            "maximum": 10
        }
    }
}
and a json model:
{
"foo": "bar"
}
Why does this example not work with validation? The model above is not flagged as invalid, even though "bar" is longer than the maxLength of 2. According to this document it is possible to have different types defined in an array, but how can we do validation based on min/max values?
Your schema is correct. The validator you are using doesn't work properly. Here is an alternative that uses anyOf instead.
{
    "type": "object",
    "properties": {
        "foo": {
            "anyOf": [
                { "$ref": "#/definitions/boundedNumber" },
                { "$ref": "#/definitions/boundedString" }
            ]
        }
    },
    "definitions": {
        "boundedString": {
            "type": "string",
            "maxLength": 2,
            "minLength": 0
        },
        "boundedNumber": {
            "type": "number",
            "minimum": 3,
            "maximum": 10
        }
    }
}
Although it is quite a bit longer, some would argue that this is actually easier to read/maintain because of the separation of the type-specific keywords.
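For what it's worth, a quick check with the Python jsonschema package (assuming it is available; any compliant Draft 4 validator should behave the same) confirms that your original combined-type schema already rejects the model from the question:
from jsonschema import Draft4Validator

schema = {
    "type": "object",
    "properties": {
        "foo": {
            "type": ["number", "string"],
            "maxLength": 2,
            "minLength": 0,
            "minimum": 3,
            "maximum": 10
        }
    }
}

validator = Draft4Validator(schema)
print(validator.is_valid({"foo": "bar"}))  # False: "bar" exceeds maxLength 2
print(validator.is_valid({"foo": "as"}))   # True
print(validator.is_valid({"foo": 6}))      # True
print(validator.is_valid({"foo": 60}))     # False: above maximum 10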
Your schema validates JSON objects ("type":"object"). In addition, if they have a property with the key "foo", its value must be either a number between 3 and 10, or a string of maximum length 2.
Valid objects according to your schema:
{"foo":6}
{"foo":"as"}
Invalid objects:
{"foo":60}
{"foo":"asereje"}
If you want to validate arrays you must define your parent schema as an array and use the items keyword to specify the schema for the array items, for instance:
{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "foo": {
                "type": ["number", "string"],
                "maxLength": 2,
                "minLength": 0,
                "minimum": 3,
                "maximum": 10
            }
        }
    }
}
The schema above would validate the following JSON array:
[{
"foo" : 6
}, {
"foo" : "as"
}
]
John, the issue is the combined type: "type": ["number", "string"].
Angular Schema Form generates fields from the combined JSON Schema and the UI schema (the form). When the field type is defined it knows which type to validate against; when there are multiple types it doesn't know which part of the spec to base the form field on, so it falls back to a textbox and does not add the validation requirements for the string type alone.
To achieve your desired outcome you would need to tell it which field type you wish to use. There is a bug in the 0.8.x versions where it does not validate based on the type set in the UI schema if it cannot determine the type from the data schema; I believe that is fixed in the latest development branch. If not, and an issue is raised on GitHub, I will prioritise it.

Is there any way to preserve the order while generating a patch for JSON files?

I am new to JSON, i.e. JSON Patch.
I have a scenario where I need to work out the difference between two versions of JSON files for the same object; for that I am using json-patch-master.
But unfortunately the generated patch interprets the change differently, i.e. in a different order, and hence I am getting unexpected/invalid results.
Could anyone help me with how to preserve the order while generating a JSON Patch?
Here is an actual example.
Original JSON file:
[ {
"name" : "name1",
"roolNo" : "1"
}, {
"name" : "name2",
"roolNo" : "2"
}, {
"name" : "name3",
"roolNo" : "3"
}, {
"name" : "name4",
"roolNo" : "4"
} ]
Modified/new JSON file, i.e. with the 2nd element of the original file removed:
[ {
"name" : "name1",
"roolNo" : "1"
}, {
"name" : "name3",
"roolNo" : "3"
}, {
"name" : "name4",
"roolNo" : "4"
} ]
Generated patch/diff:
[ {"op":"remove","path":"/3"},
{"op":"replace","path":"/1/name","value":"name3"},
{"op":"replace","path":"/1/roolNo","value":"3"},
{"op":"replace","path":"/2/name","value":"name4"},
{"op":"replace","path":"/2/roolNo","value":"4"}]
Every time I generate the diff/patch it gives different path/diff results.
Moreover, the interpretation is different, i.e. the order is not preserved.
Is there any way to get the expected result, i.e. [ {"op":"remove","path":"/1"} ]? In other words, can the patch/diff be generated based on some ordering so that I get what is expected?
How should I handle this kind of scenario?
Please help me.
Thank you so much.
~Shyam
We are currently working on this issue in Starcounter-Jack/JSON-Patch.
It seems to work nicely with native Array.observe: http://jsfiddle.net/tomalec/p4s7aw96/.
Try the Starcounter-Jack/JSON-Patch issues/65_ArrayObserve branch;
we will release it as a new version once the shim and performance have been checked.
Feel free to add your comments on the JSON-Patch issue board.
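As a sanity check, the minimal patch you expect really does transform the original document into the modified one. Here is a quick sketch using the Python jsonpatch package (an assumption on my part, purely for illustration; any JSON Patch implementation should apply it the same way):
import jsonpatch   # assumes the Python "jsonpatch" package is installed

original = [
    {"name": "name1", "roolNo": "1"},
    {"name": "name2", "roolNo": "2"},
    {"name": "name3", "roolNo": "3"},
    {"name": "name4", "roolNo": "4"},
]
modified = [
    {"name": "name1", "roolNo": "1"},
    {"name": "name3", "roolNo": "3"},
    {"name": "name4", "roolNo": "4"},
]

# The minimal patch: remove the element at index 1.
minimal_patch = [{"op": "remove", "path": "/1"}]
assert jsonpatch.apply_patch(original, minimal_patch) == modified

# A diff generator may instead describe the same change as a chain of
# replace operations plus a trailing remove; both patches are valid,
# they just describe the array edit differently.
print(jsonpatch.make_patch(original, modified))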

JSON array in MongoDB

I want to get objects according to an ID they have in an array, in a JSON document stored in MongoDB.
I tried a lot of ways to get them with no success:
db.collection.find({"Id":"2"})
db.collection.find({"Messages.Id":"2"})
db.collection.find({"Messages":{$elemMatch:{"Id":"2"}}})
db.collection.find({"Messages.Id":{$elemMatch:{"Id":"2"}}})
{
    "Messages": [
        {
            "text": "aaa",
            "Id": ["1", "2"]
        },
        {
            "texts": "bbb",
            "Id": ["1", "3"]
        }
    ]
}
Even though that's how it's supposed to be done according to the MongoDB documentation.
So I thought something was wrong with my JSON design (I tried changing it, but that didn't help either).
Can anyone suggest a good design, or a query that will get the objects with a certain ID?
UPDATE:
For example, if I query for the ID 2, I want only the first message to be returned, in full (I don't mind if the Id field isn't displayed):
{
"text":"aaa",
"Id":["1","2"]
}
To find single elements that match you will need to utilize the positional operator ($).
db.collection.find({"Messages.Id": "2"}, {"Messages.$": 1, _id: 0})
For finding multiple matches, you would use the aggregation pipeline:
db.collection.aggregate([
{ $unwind: "$Messages" },
{ $match: {"Messages.Id": "1"}},
{ $group: { _id: null, messages: { $push: "$Messages"}}}
])
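If you are querying from Python, here is a minimal pymongo sketch of both approaches (the database and collection names are placeholders):
from pymongo import MongoClient

client = MongoClient()                      # assumes a local MongoDB instance
coll = client["test"]["collection"]         # placeholder database/collection names

# Positional operator: project only the first array element that matches.
doc = coll.find_one({"Messages.Id": "2"}, {"Messages.$": 1, "_id": 0})
print(doc)   # {'Messages': [{'text': 'aaa', 'Id': ['1', '2']}]}

# Aggregation pipeline: return every matching array element.
pipeline = [
    {"$unwind": "$Messages"},
    {"$match": {"Messages.Id": "2"}},
    {"$group": {"_id": None, "messages": {"$push": "$Messages"}}},
]
for result in coll.aggregate(pipeline):
    print(result["messages"])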

JSON-Schema pattern repetition inside array

Is there a way to create a repetition pattern for elements inside an array in a JSON Schema document? I would like to use it to validate queries produced by a query builder tool.
The part of the schema I currently use is:
"complexCondition": {
"anyOf": [
{
"title": "Simple condition, e.x. X==0",
"type": "string"
},
{
"type": "array",
"items": [
{
"$ref": "#/complexCondition"
},
{
"type": "string", "enum": ["AND","OR"]
},
{
"$ref": "#/complexCondition"
}
],
"additionalItems": false
}
]
}
This allows me to correctly validate the query "conditionA && conditionB && conditionC" serialized as
[["conditionA","AND","conditionB"],"AND","conditionC"]
However, I would like to be able to validate documents where the query is stored as
["conditionA","AND","conditionB","AND","conditionC"]
and for this to be achievable for any number of conditions.
In order to achieve what you intend, you would need to define a rule for odd and even positions in an array, which cannot be done for arrays of arbitrary length with JSON Schema.
If your expected number of query conditions is not going to grow ad infinitum, you could hard-code the first n positions in the items keyword:
"items": [{"$ref": "#/condition"}, {"$ref": "#/operator"}, {"$ref": "#/condition"}, {"$ref": "#/operator"} ... etc.]
In this case you still have the problem that you could define a malformed condition that ends in an "operator".
Another option would be to use "oneOf" and build one alternative per possible length (this could be automated with a script, as sketched below; I guess you won't have expressions with 100 clauses...).
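A sketch of such a script, reusing the "#/condition" and "#/operator" fragments from above and always ending each alternative with a condition so the trailing-operator problem disappears:
def alternating_items(n_conditions):
    """Items list for a flat query with n_conditions conditions:
    condition, operator, condition, operator, ..., condition."""
    items = []
    for i in range(n_conditions):
        items.append({"$ref": "#/condition"})
        if i < n_conditions - 1:
            items.append({"$ref": "#/operator"})
    return items

# One alternative per allowed length, combined with "oneOf".
one_of = {
    "oneOf": [
        {"type": "array", "items": alternating_items(n), "additionalItems": False}
        for n in range(2, 11)   # e.g. 2 to 10 conditions; adjust to taste
    ]
}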
Anyway, I find it a bit strange to try to flatten a concept that is inherently recursive.
How would you store this: ((A OR B) OR (C AND B))?
So if you can still rethink your desired serialization format, I would suggest a recursive approach. Something like:
"predicate" : {
"type" : "array",
"items" : {
"$ref" : "#/clause"
}
}
"clause" : {
"type" : "array",
"items" : [{
"$ref" : "#/condition"
}
],
"additionalItems" : {
"type" : "array",
"items" : [{
"$ref" : "#/operator"
}, {
"$ref" : "#/clause"
}
]
}
}
The generated structure is uglier, but parsing it is fairly easy with a simple recursive function (see the sketch after the examples):
Simple condition:
[["condition1"]]
Simple clause:
[["condition1",["OR",["condition2"]]]]
Adding a clause at the same level:
[["condition1",["OR",["condition2"]],["OR",["condition3"]]]]
Adding a clause at a child level:
[["condition1",["OR",["condition2"]],["OR",["condition3",["AND",["condition4"]]]]]]