I am using HTTP + Swagger to make an API call (hosted in an APIM gateway) in a Logic App.
In a Logic App we can use the Parse JSON action to parse a payload against a schema, which gives us access to the individual elements of the JSON payload (which we can then use in downstream logic). But we have to manually paste the schema into the schema field of the Parse JSON action.
I want to automate that process. I am looking for a way to hold the JSON schema in a variable and then parse my JSON through this variable (schema). That way I won't have to touch the Logic App manually whenever the schema changes.
Any help much appreciated.
Thanks,
You don't really have to write the JSON schema manually; you just need a sample of the JSON that you want to parse. Consider this sample:
[
{
"document": "A",
"min": 7500001,
"policy": "X"
},
{
"document": "B",
"min": 7500001,
"policy": "Y"
},
{
"document": "C",
"min": 7500001,
"policy": "Z"
},
{
"document": "D",
"min": 7500002,
"policy": "X"
}
]
So you need to use the "Use sample payload to generate schema" option in the Parse JSON action, and the schema is then generated for you automatically.
Below is the generated schema:
{
"type": "array",
"items": {
"type": "object",
"properties": {
"document": {
"type": "string"
},
"min": {
"type": "integer"
},
"policy": {
"type": "string"
}
},
"required": [
"document",
"min",
"policy"
]
}
}
Updated Answer
This will keep working even when the JSON changes: just remove required from the schema, and parsing will still work as long as you only reference the same parameters as before.
For example, I have changed my sample to
[
{
"document": "A",
"min": 7500001,
"sample":"test1"
},
{
"document": "B",
"min": 7500001,
"sample":"test2"
},
{
"document": "C",
"min": 7500001,
"policy": "Z"
},
{
"document": "D",
"min": 7500002,
"policy": "X"
}
]
If the schema remains the same as before, this would fail schema validation, but it works when you just remove required from the schema, i.e.:
{
"type": "array",
"items": {
"type": "object",
"properties": {
"document": {
"type": "string"
},
"min": {
"type": "integer"
},
"policy": {
"type": "string"
}
}
}
}
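Purely to illustrate why dropping required helps, here is a small standalone sketch using the Ajv validator in Node (Ajv is my choice for the demonstration; it is not something the Parse JSON action itself runs):
// Sketch only: Ajv stands in here for the schema check that Logic Apps
// performs inside the Parse JSON action.
const Ajv = require("ajv");
const ajv = new Ajv();

const relaxedSchema = {
  type: "array",
  items: {
    type: "object",
    properties: {
      document: { type: "string" },
      min: { type: "integer" },
      policy: { type: "string" }
    }
    // no "required" - items may omit any of these properties
  }
};

const validate = ajv.compile(relaxedSchema);

console.log(validate([{ document: "A", min: 7500001, policy: "X" }]));     // true
console.log(validate([{ document: "B", min: 7500001, sample: "test2" }])); // true - the extra "sample" field is simply ignored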
Related
I have a JSON object like:
{
"result": [
{
"name" : "abc",
"email": "abc.test#mail.com"
},
{
"name": "def",
"email": "def.test#mail.com"
},
{
"name": "xyz",
"email": "abc.test#mail.com"
}
]
}
and the schema for this:
{
"definitions": {},
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "https://example.com/object1607582431.json",
"title": "Root",
"type": "object",
"required": [
"result"
],
"properties": {
"result": {
"$id": "#root/result",
"title": "Result",
"type": "array",
"default": [],
"uniqueItems": true,
"items": {
"$id": "#root/result/items",
"title": "Items",
"type": "object",
"required": [
"name",
"email"
],
"properties": {
"name": {
"$id": "#root/result/items/name",
"title": "Name",
"type": "string"
},
"email": {
"$id": "#root/result/items/email",
"title": "Email",
"type": "string"
}
}
}
}
}
}
I am looking for an option to check the uniqueness of email irrespective of name. How can I validate that every email is unique?
You can't. There are no keywords that let you compare one particular data value against another, other than uniqueItems, which compares an array element in toto against another.
The JSON Schema specification does not currently support this.
You can see the active GitHub issue here: https://github.com/json-schema-org/json-schema-vocabularies/issues/22
However, there are various extensions of JSON Schema that do validate unique fields within lists of objects.
If you happen to be using Python, you can use JsonVL, a package I created. It can be installed with pip install jsonvl and then run with jsonvl data.json schema.json.
Code examples in the GitHub repo: https://github.com/gregorybchris/jsonvl
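If you'd rather stay with a standard JSON Schema validator, another option is to enforce the uniqueness rule in ordinary application code after schema validation. A minimal sketch in Node (this check lives outside the schema and is only an illustration):
// Check that every "email" in result[] is unique, independent of "name".
const data = {
  result: [
    { name: "abc", email: "abc.test@mail.com" },
    { name: "def", email: "def.test@mail.com" },
    { name: "xyz", email: "abc.test@mail.com" }
  ]
};

const emails = data.result.map(item => item.email);
const allUnique = new Set(emails).size === emails.length;

console.log(allUnique); // false - "abc.test@mail.com" appears twice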
I am working with JSON Schema Draft 4 and am experiencing an issue I can't quite get my head around. Within the schema below you'll see an array, metricsGroups, where any item should validate against exactly one of (oneOf) the defined sub-schemas. Within the sub-schemas you'll notice that they both share the property name timestamp, but metricsGroupOne has the properties temperature and humidity whilst metricsGroupTwo has the properties PIR and CO2. All properties within both metricsGroups are required.
Please see the schema below. Below the schema is an example of some data that I'd expect to be validated, but instead is deemed invalid and an explanation of my issue.
{
"type": "object",
"properties": {
"uniqueId": {
"type": "string"
},
"metricsGroups": {
"type": "array",
"minItems": 1,
"items": {
"oneOf": [
{
"type": "object",
"properties": {
"metricsGroupOne": {
"type": "array",
"minItems": 1,
"items": {
"type": "object",
"properties": {
"timestamp": {
"type": "string",
"format": "date-time"
},
"temperature": {
"type": "number"
},
"humidity": {
"type": "array",
"items": {
"type": "number"
}
}
},
"additionalProperties": false,
"required": [
"timestamp",
"temperature",
"humidity"
]
}
}
},
"required": [
"metricsGroupOne"
]
},
{
"type": "object",
"properties": {
"metricsGroupTwo": {
"type": "array",
"minItems": 1,
"items": {
"type": "object",
"properties": {
"timestamp": {
"type": "string",
"format": "date-time"
},
"PIR": {
"type": "array",
"items": {
"type": "number"
}
},
"CO2": {
"type": "number"
}
},
"additionalProperties": false,
"required": [
"timestamp",
"PIR",
"CO2"
]
}
}
},
"required": [
"metricsGroupTwo"
]
}
]
}
}
},
"additionalProperties": false,
"required": [
"uniqueId",
"metricsGroups"
]
}
Here's some data that I believe should be valid:
{
"uniqueId": "d3-52-f8-a1-89-ee",
"metricsGroups": [
{
"metricsGroupOne": [
{"timestamp": "2020-03-04T12:34:00Z", "temperature": 32.5, "humidity": [45.0] }
],
"metricsGroupTwo": [
{"timestamp": "2020-03-04T12:34:00Z", "PIR": [16, 20, 7], "CO2": 653.76 }
]
}
]
}
The issue I am facing is that both of the metricsGroup arrays in my supposedly valid data validate against both of the sub-schemas; this then invalidates the data due to the use of the oneOf keyword. I don't understand how the entry for metricsGroupOne validates against the schema for metricsGroupTwo, since the property names differ, and vice versa.
I'm using a Node library under the hood that throws this error, but I've also confirmed that the same error occurs on some online validation testing websites:
jsonschemavalidator
json-schema-validator
Any help is appreciated. Thanks,
Adam
JSON Schema uses a constraints-based approach: anything you don't explicitly disallow is allowed.
What's happening here is that you haven't specified anything in oneOf[1] which would make the first item in your instance data array invalid.
Let me illustrate this with a simple example.
Here's my schema. I'm going to use draft-07, but there's no difference in this principle for draft-04:
{
"oneOf": [
{
"properties": {
"a": true
}
},
{
"properties": {
"b": false
}
}
]
}
And my instance:
{
"a": 1
}
This fails validation because the instance is valid when both oneOf schemas are applied.
Demo: https://jsonschema.dev/s/EfUc4
If the instance were instead...
{
"a": 1,
"b": 1
}
This would be valid, because the instance fails validation for the subschema oneOf[1], so it matches exactly one subschema.
If the instance was...
{
"b": 1
}
It would be valid according to oneOf[0] but not according to oneOf[1], and therefore would be valid overall because it is valid according to exactly one subschema.
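A small sketch reproducing the three cases (Ajv is used here purely as an example validator; the behaviour is the same on the linked demo site):
const Ajv = require("ajv");
const ajv = new Ajv();

const schema = {
  oneOf: [
    { properties: { a: true } },  // oneOf[0]
    { properties: { b: false } }  // oneOf[1]
  ]
};
const validate = ajv.compile(schema);

console.log(validate({ a: 1 }));       // false - valid against both subschemas
console.log(validate({ a: 1, b: 1 })); // true  - fails oneOf[1], so exactly one match
console.log(validate({ b: 1 }));       // true  - fails oneOf[1] only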
In your case, you probably need to use additionalProperties to make any properties you haven't defined in properties disallowed. I can't tell from your question whether you want to allow both properties in the same item, because your schema is defined as oneOf, which seems to conflict with the instance you expect to be valid.
I am looking to implement two schemas, one of which is an array of the other; let's call the array Payload B and the individual item Payload A.
The problem: The parent schema requires an additional key, which the child schema cannot allow.
Payload A:
{
"a": "a",
"b": "b",
"c": "2019-05-01T09:00:00Z"
}
implemented with a schema of:
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "item-schema.json",
"title": "Individual Item POST",
"description": "",
"type": "object",
"properties": {
"a": {
"type": "string"
},
"b": {
"type": "string"
},
"c": {
"type": "string",
"format": "date-time"
}
},
"additionalProperties": false,
"required": [
"a",
"b",
"c"
]
}
Payload B:
[
{
"a": "aa",
"b": "bb",
"c": "2019-05-01T10:00:00Z",
"d": "dd"
},
{
"a": "aaa",
"b": "bbb",
"c": "2019-05-01T11:00:00Z",
"d": "ddd"
}
]
implemented with a schema of:
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "items-schema.json",
"title": "Multiple Item POST",
"description": "",
"type": "array",
"items": {
"allOf": [
{
"$ref": "item-schema.json"
},
{
"properties": {
"d": {
"type": "string"
}
}
}
]
},
"additionalItems": false
}
The issue I have is that whilst Payload A is validated correctly in that it doesn't allow any extraneous keys, Payload B is invalid due to Payload A's schema.
The only way to achieve what you want with draft-7 JSON Schema is to have duplication in your schema.
You would need to modify allOf/1 so that it lists properties a, b, and c with the value true (remember, booleans are schemas) and add additionalProperties: false.
AND you would have to remove additionalProperties:false from item-schema.json.
If you can't do that and you need item-schema.json to work on its own, then I'm sorry: you're out of luck, so you'll have to duplicate the schema rather than reference it.
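Here is a sketch of that draft-7 workaround (Ajv is used here only to exercise the schemas, and the format keyword on c is omitted for brevity; the answer itself is validator-agnostic):
const Ajv = require("ajv");
const ajv = new Ajv();

// item-schema.json with additionalProperties removed so it can be extended.
const itemSchema = {
  $id: "item-schema.json",
  type: "object",
  properties: {
    a: { type: "string" },
    b: { type: "string" },
    c: { type: "string" }
  },
  required: ["a", "b", "c"]
};

// items-schema.json: allOf/1 re-lists a, b and c as boolean "true" schemas and
// owns the additionalProperties: false constraint.
const itemsSchema = {
  $id: "items-schema.json",
  type: "array",
  items: {
    allOf: [
      { $ref: "item-schema.json" },
      {
        properties: { a: true, b: true, c: true, d: { type: "string" } },
        additionalProperties: false
      }
    ]
  }
};

ajv.addSchema(itemSchema);
const validate = ajv.compile(itemsSchema);

console.log(validate([{ a: "aa", b: "bb", c: "2019-05-01T10:00:00Z", d: "dd" }])); // true
console.log(validate([{ a: "aa", b: "bb", c: "2019-05-01T10:00:00Z", e: "ee" }])); // false - "e" is not allowed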
Recognising that this is not super great, we worked hard (I approved the PR; props to the rest of the core team!) to create a new keyword for draft-8, unevaluatedProperties. You can read about it at https://github.com/json-schema-org/json-schema-spec/issues/556.
If it's any consolation, the OpenAPI Specification has the same issue, which is explained in the issue referenced above.
I have a JSON file and I want to validate it using a JSON schema in MongoDB.
How do I import the JSON schema into MongoDB and then validate the JSON file?
The JSON is already in a collection, so I want to validate it without importing a new JSON file.
JSON:
{
"Book": {
"Year:2016-2017": {
"Crs": [{
"Cr": {
"_id": {
"$oid": "5a439ff4fc0900f06fb470a4"
},
"Number": 35,
"Pag": 8,
"Desc": "Embl",
"Ad": "S",
"Type": "Embl"
}
}]
}
}
}
JSON schema:
{
"type": "object",
"$schema": "http://json-schema.org/draft-03/schema",
"id": "http://jsonschema.net",
"required": false,
"properties": {"Book": {
"type": "object",
"id": "http://jsonschema.net/Book",
"required": false,
"properties": {"Year:2016-2017": {
"type": "object",
"id": "http://jsonschema.net/Book/Year:2016-2017",
"required": false,
"properties": {"Crs": {
"type": "array",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs",
"required": false,
"items": {
"type": "object",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0",
"required": false,
"properties": {"Cr": {
"type": "object",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr",
"required": false,
"properties": {
"Ad": {
"type": "string",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/Ad",
"required": false
},
"Desc": {
"type": "string",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/Desc",
"required": false
},
"Number": {
"type": "number",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/Number",
"required": false
},
"Pag": {
"type": "number",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/Pag",
"required": false
},
"Type": {
"type": "string",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/Type",
"required": false
},
"_id": {
"type": "object",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/_id",
"required": false,
"properties": {"$oid": {
"type": "string",
"id": "http://jsonschema.net/Book/Year:2016-2017/Crs/0/Cr/_id/$oid",
"required": false
}}
}
}
}}
}
}}
}}
}}}
I want to validate the existing JSON in the collection against this schema using MongoDB.
As of MongoDB v3.6.x, document schema validation only occurs during updates and inserts; existing documents do not undergo validation checks until they are modified. This means that your existing JSON document in a MongoDB collection will not be validated by the validation rule(s) that you just applied.
Note that the $jsonSchema operator and JSON Schema support are new in MongoDB version 3.6.
I want to add a new "Cr" but it needs to respect the JSON Schema (insertion/update).
Once you have specified document schema validation on a collection, any update operations will trigger validation checks on the affected documents.
For example, if you have an existing document in a collection as below:
{
"_id": 1,
"a": {
"b": {
"crs": [
{
"cr": {
"d": 2,
"e": "two"
}
}
]
}
}
}
If you then apply a validator as below:
// Validator for nested array of objects.
var schema = {
"type": "object",
"properties": {
"a": {
"type": "object",
"properties": {
"b": {
"type": "object",
"properties": {
"crs": {
"type": "array",
"items": {
"type": "object",
"properties": {
"cr": {
"type": "object",
"properties": {
"d": {
"bsonType": "int",
},
"e": {
"type": "string",
}
}
}}
}
}}
}}
}}}
db.runCommand({"collMod": "collectionName",
"validator": { "$jsonSchema": schema}}
);
When you update the existing document to add a new cr, the whole document will be validated.
db.collectionName.update({_id: 1}, {"$push":{
"a.b.crs":{
"cr":{
"d": NumberInt(20),
"e": "new element"}}}
});
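By contrast, an update that violates the validator is rejected (a hypothetical failing example; the exact error text may vary by MongoDB version, but it is typically "Document failed validation"):
// Pushing a "cr" whose "d" is a string instead of an int fails the
// $jsonSchema validator above, so this update is rejected.
db.collectionName.update({_id: 1}, {"$push": {
    "a.b.crs": {
        "cr": {
            "d": "not-an-int",
            "e": "bad element"}}}
});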
For any pre-existing documents that are not valid according to your document schema validation rule(s), you will have to correct the documents before updating them.
I'm trying to create a JSON schema for a document where field values in one object should validate against an enum defined in another object in the same document.
More specifically, in the example below, I'd like to be able to define "properties" with an id and values (I should be able to define different properties in different JSON files).
Then "objects" should be able to refer to these properties, so that object.properties[i].id must match the id of one of the properties and object.properties[i].value must match one of the enum values defined for that property.
{
"properties": [
{
"id": "SIZE",
"values": ["small", "medium", "big"]
},
{
"id": "MATERIAL",
"values": ["wood", "glass", "steel", "plastic"]
},
{
"id": "COLOR",
"values": ["red", "green", "blue"]
}
],
"objects": [
{
"name": "chair",
"properties": [
{
"id": "SIZE",
"value": "small"
},
{
"id": "COLOR",
"value": "red"
}
]
},
{
"name": "table",
"properties": [
{
"id": "MATERIAL",
"value": "wood"
}
]
}
]
}
I tried to create a JSON schema to validate such a structure, but got stuck describing a reference to the inner fields of the "property" object. I also looked into the standard and did not find a way to achieve the goal.
Is it possible to create a JSON schema which would validate my JSON files?
There is a proposal for the $data reference that almost allows you to do it, if you change your data structure a little bit to remove one level of indirection. It's supported in Ajv (I am the author).
So if your data were:
{
"properties": {
"SIZE": ["small", "medium", "big"],
"MATERIAL": ["wood", "glass", "steel", "plastic"],
"COLOR": ["red", "green", "blue"]
},
"objects": {
"chair": {
"SIZE": "small",
"COLOR": "red"
},
"table": {
"MATERIAL": "wood"
}
}
}
then your schema could have been:
{
"type": "object",
"properties": {
"properties": {
"type": "object",
"additionalProperties": {
"type": "array",
"items": { "type": "string" }
}
},
"objects": {
"type": "object",
"additionalProperties": {
"type": "object",
"properties": {
"SIZE": {"enum": {"$data": "3/properties/SIZE"}},
"MATERIAL": {"enum": {"$data": "3/properties/MATERIAL"}},
"COLOR": {"enum": {"$data": "3/properties/MATERIAL"}}
}
}
}
}
}
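A minimal, self-contained sketch of how this could be used with Ajv (only the COLOR property is shown to keep it short; the $data option must be enabled, otherwise the {"$data": ...} object would be treated as a literal enum value):
const Ajv = require("ajv");
const ajv = new Ajv({ $data: true }); // enable $data references

const schema = {
  type: "object",
  properties: {
    properties: {
      type: "object",
      additionalProperties: { type: "array", items: { type: "string" } }
    },
    objects: {
      type: "object",
      additionalProperties: {
        type: "object",
        properties: {
          // "3/properties/COLOR": three levels up from the value, then down to the allowed list
          COLOR: { enum: { $data: "3/properties/COLOR" } }
        }
      }
    }
  }
};

const validate = ajv.compile(schema);

const good = {
  properties: { COLOR: ["red", "green", "blue"] },
  objects: { chair: { COLOR: "red" } }
};
const bad = {
  properties: { COLOR: ["red", "green", "blue"] },
  objects: { chair: { COLOR: "purple" } }
};

console.log(validate(good)); // true
console.log(validate(bad));  // false - "purple" is not listed in properties.COLOR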
And the schema could be generated dynamically based on the full list of possible properties.
With the data structure you currently have, you can either use custom keywords (if the validator supports them) or implement part of the validation logic outside of JSON Schema.