I would like to validate a value against a schema based on either its maximum/minimum (number) or its maxLength/minLength (string).
I have a json form:
[
{
"key":"foo",
"title":"Test",
"type":"string"
}
]
and a json schema:
{
"type": "object",
"properties": {
"foo": {
"type": ["number","string"],
"maxLength":2,
"minLength":0,
"minimum":3,
"maximum":10
}
}
}
and a json model:
{
"foo": "bar"
}
Why does this example not validate as expected? My model is not reported as invalid. According to this document it is possible to define multiple types in an array, but how can we then do validation based on the min/max values?
Your schema is correct; the validator you are using just doesn't handle it properly. Here is an alternative that uses anyOf instead:
{
"type": "object",
"properties": {
"foo": {
"anyOf": [
{ "$ref": "#/definitions/boundedNumber" }
{ "$ref": "#/definitions/boundedString" }
]
}
},
"definitions": {
"boundedString": {
"type": "string",
"maxLength": 2,
"minLength": 0
},
"boundedNumber": {
"type": "number",
"minimum": 3,
"maximum": 10
}
}
}
Although it is quite a bit longer, some would argue that it is actually easier to read and maintain because of the separation of the type-specific keywords.
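For example, with this anyOf schema the following instances should pass:
{"foo": 5}
{"foo": "ab"}
while these should fail:
{"foo": "bar"}
{"foo": 42}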
Your schema validates JSON objects ("type": "object"). In addition, if they have a property with key "foo", its value must be either a number between 3 and 10, or a string of at most 2 characters.
Valid objects according to your schema:
{"foo":6}
{"foo":"as"}
Invalid objects:
{"foo":60}
{"foo":"asereje"}
If you want to validate arrays, you must define the parent schema as an array and use the items keyword to specify the schema for the array items, for instance:
{
"type" : "array",
"items" : {
"type" : "object",
"properties" : {
"foo" : {
"type" : ["number", "string"],
"maxLength" : 2,
"minLength" : 0,
"minimum" : 3,
"maximum" : 10
}
}
}
}
The schema above would validate the following JSON array:
[{
"foo" : 6
}, {
"foo" : "as"
}
]
John, the issue is the type declaration "type": ["number", "string"].
Angular Schema Form generates fields from the combined JSON Schema and the UI schema (form). When the field type is defined it knows which type to validate against; when there are multiple types it doesn't know which part of the spec to base the form field on, so it falls back to a textbox and does not add the validation requirements appropriate for a string alone.
To achieve your desired outcome you would need to tell it which field type you wish to use. There is a bug in the 0.8.x versions where it does not validate based on the type set in the UI schema if it cannot determine the type from the data schema; I believe that is fixed in the latest development branch. If not, and there is an issue raised on GitHub, I would prioritise it.
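For example, a form definition along these lines should force the numeric variant (just a sketch; "number" is one of the standard Angular Schema Form field types, and it assumes the numeric bounds are the ones you want enforced):
[
  {
    "key": "foo",
    "title": "Test",
    "type": "number"
  }
]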
Related
I have a JSON that looks like this:
{
"name": "Jane",
"company_name": "BrandNewStartup",
"designation": "DesignLead"
}
The name element can appear alone in the JSON as well, so the following JSON is valid too.
{
"name": "Jane"
}
But company_name and designation cannot appear if name is missing. So the following JSON should be invalid:
{
"company_name": "BrandNewStartup",
"designation": "DesignLead"
}
I have tried the following rule:
"oneOf": [
{
"required": [
"name",
"company_name",
"designation"
]
},
{
"required": [
"name"
]
}
]
However this does not seem to work (e.g. this validation library raises an error saying the JSON should be valid against only one schema, but it is valid against all of them).
If I change this to anyOf, the first JSON with all three fields works, but when name appears alone, an error is raised saying that the company_name and designation fields are missing.
How do I define this rule?
JSON Schema is a constraints based language. Anything you don't specify is allowed.
The required keyword means a key must be present in an object, but it doesn't inherently prevent any other keys from being included.
Breaking down the schema in your question: when you have all three keys in your object, as in your first example instance, both subschemas in the oneOf are valid, which is why oneOf (which requires exactly one match) fails.
In order to restrict the allowed properties, you need to use the additionalProperties keyword, which in your case also means you need to use the properties keyword. required has no effect on additionalProperties.
The second subschema needs to only be valid when the instance has ONLY name and no other keys. Here's a live demo using the below modified JSON Schema: https://jsonschema.dev/s/7JcUa
{
"$schema": "http://json-schema.org/draft-07/schema",
"oneOf": [
{
"required": [
"name",
"company_name",
"designation"
]
},
{
"required": [
"name"
],
"properties": {
"name": true
},
"additionalProperties": false
}
]
}
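For example, against this schema the following instances should be valid:
{"name": "Jane"}
{"name": "Jane", "company_name": "BrandNewStartup", "designation": "DesignLead"}
and this one should be rejected, as required:
{"company_name": "BrandNewStartup", "designation": "DesignLead"}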
I'm currently writing a JSON schema and I was wondering if someone knows an answer to my problem: can I ensure that an object's value equals the key of an object somewhere in the JSON? Given the following JSON:
{
"defaultConfig" : "config1",
"configs" : {
"config0" : {...},
"config1" : {...},
"config2" : {...}
}
}
Can I validate that the content of "defaultConfig" must be one of the keys of the properties of "configs" (e.g. "config0", "config1", "config2")? I can't use enums in this case, as the config names are not known beforehand.
Edit: Here's the schema I have so far:
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties" : {
"defaultConfig" : {
"type" : "string"
},
"configs" : {
"type" : "object",
"patternProperties": {
"." : {"type" : "object"}
}
}
}
}
No, there is nothing in JSON Schema that allows a keyword to reference a different part of the data instance in that way.
However, the latest version of the specification allows extensions via the $vocabulary keyword, so if you are so inclined, you could write your own keyword that does what you need.
I'm trying to create a JSON schema for an existing JSON file that looks something like this:
{
"variable": {
"name": "age",
"type": "integer"
}
}
In the schema, I want to ensure the type property has the value string or integer:
{
"variable": {
"name": "string",
"type": {
"type": "string",
"enum": ["string", "integer"]
}
}
}
Unfortunately it blows up with message: ValidationError {is not any of [subschema 0]....
I've read that there are "no reserved words" in JSON Schema, so I assume a property named type is valid, assuming I declare it correctly?
The accepted answer from jruizaranguren doesn't actually answer the question.
The problem is that given JSON (not JSON schema, JSON data) that has a field named "type", it's hard to write a JSON schema that doesn't choke.
Imagine that you have an existing JSON data feed (data, not schema) that contains:
"ids": [ { "type": "SSN", "value": "123-45-6789" },
{ "type": "pay", "value": "8675309" } ]
What I've found in trying to work through the same problem is that instead of putting
"properties": {
"type": { <======= validation chokes on this
"type": "string"
}
you can put
"patternProperties": {
"^type$": {
"type": "string"
}
but I'm still working through how to mark it as a required field. It may not be possible.
I think, based on looking at the "schema" in the original question, that JSON schemas have evolved quite a lot since then - but this is still a problem. There may be a better solution.
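Put together for the ids feed above, the workaround might look something like this (only a sketch; the value subschema is an assumption based on the sample data):
{
  "type": "object",
  "properties": {
    "ids": {
      "type": "array",
      "items": {
        "type": "object",
        "patternProperties": {
          "^type$": { "type": "string" },
          "^value$": { "type": "string" }
        }
      }
    }
  }
}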
According to the specification, in the Valid types section for type:
The value of this keyword MUST be either a string or an array. If it is an array, elements of the array MUST be strings and MUST be unique.
String values MUST be one of the seven primitive types defined by the core specification.
Later, in Conditions for successful validation:
An instance matches successfully if its primitive type is one of the types defined by keyword. Recall: "number" includes "integer".
In your case:
{
"variable": {
"name": "string",
"type": ["string", "integer"]
}
}
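Embedded in a complete schema for the instance at the top of the question, that might look something like this (a sketch, assuming draft-04 syntax):
{
  "type": "object",
  "properties": {
    "variable": {
      "type": "object",
      "properties": {
        "name": { "type": "string" },
        "type": { "type": ["string", "integer"] }
      }
    }
  }
}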
I wish to write a json schema to cover this (simplified) example
{
"errorMessage": "",
"nbRunningQueries": 0,
"isError": False,
"result": {
"foo": {"price":10.0, "country":"UK"},
"bar": {"price":100.2, "country":"UK"}
}
}
which can have this pretty trivial root schema
schema = {
"type":"object",
"required":True,
"properties":{
"errorMessage": {"type":"string", "required":True},
"isError": {"type":"boolean", "required":True},
"nbRunningQueries": {"type":"number", "required":True},
"result": {"type":"object","required":True}
}
}
The complication is the result element. Unlike the standard pattern, where result would be an array of similar objects, each with an id field or the like, this response models a Python dictionary which looks like this:
{
"foo": {},
"bar": {},
...
}
So, given that I will be getting a result object of flexible size with no set keys, how can I write a JSON schema for this?
Sadly I don't control the input, or I'd rewrite it to be something like:
{
"errorMessage": "",
"nbRunningQueries": 0,
"isError": False,
"result": [
{"id": "foo", "price":10.0, "country": "UK"},
{"id": "bar", "price":100.2, "country":"UK"}
]
}
Any help or links to pertinent examples would be great. Thanks.
With JSON Schema draft 4, you can use the additionalProperties keyword to specify the schema of any additional properties that you could receive in your result object.
"result" : {
"type" : "object"
"additionalProperties" : {
"type" : "number"
}
}
If you can restrict the allowed key names, then you may use the "patternProperties" keyword and a regular expression to limit the permitted key names.
Note that in JSON Schema draft 4, "required" must be an array and is bound to the object, not to each property.
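Applied to the result object from the question, that could look something like this (a sketch; the price/country subschema is inferred from the example data):
"result": {
  "type": "object",
  "additionalProperties": {
    "type": "object",
    "properties": {
      "price": { "type": "number" },
      "country": { "type": "string" }
    },
    "required": ["price", "country"]
  }
}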
Is there a way to create a repetition pattern for elements inside an array in a JSON Schema document? I would like to use it to validate queries produced by a query builder tool.
The part of the schema I currently use is:
"complexCondition": {
"anyOf": [
{
"title": "Simple condition, e.x. X==0",
"type": "string"
},
{
"type": "array",
"items": [
{
"$ref": "#/complexCondition"
},
{
"type": "string", "enum": ["AND","OR"]
},
{
"$ref": "#/complexCondition"
}
],
"additionalItems": false
}
]
}
This allows me to correctly validate the query "conditionA && conditionB && conditionC" stored as
[[conditionA,"AND",conditionB],"AND",conditionC]
However, I would like to be able to validate documents where the query is stored as
[conditionA,"AND",conditionB,"AND",conditionC]
and for this to be achievable for any number of conditions.
In order to achieve what you intend, you would need to define a rule for odd and even positions in an array, which cannot be done for arrays of arbitrary length with JSON Schema.
If your expected number of query conditions is not going to grow ad infinitum, you could hard-code the first n positions in the items keyword:
"items":[{"$ref":"#/condition},{"$ref":"#/operator"},{"$ref":"#/condition},{"$ref":"#/operator"} ... etc.]
In this case you still have the problem that you could define an invalid condition ending in an "operator".
Another option would be to use "oneOf" and build each possibility (this could be automated with a script; I guess you won't have expressions with 100 clauses...).
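For instance, hard-coding up to three conditions with oneOf could look something like this (a sketch, assuming the same #/condition and #/operator definitions referenced above):
"oneOf": [
  {
    "type": "array",
    "items": [{ "$ref": "#/condition" }],
    "additionalItems": false
  },
  {
    "type": "array",
    "items": [
      { "$ref": "#/condition" },
      { "$ref": "#/operator" },
      { "$ref": "#/condition" }
    ],
    "additionalItems": false
  },
  {
    "type": "array",
    "items": [
      { "$ref": "#/condition" },
      { "$ref": "#/operator" },
      { "$ref": "#/condition" },
      { "$ref": "#/operator" },
      { "$ref": "#/condition" }
    ],
    "additionalItems": false
  }
]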
Anyway, I find it a bit strange to try to flatten a concept that is inherently recursive.
How would you store this ( (A OR B) OR (C AND B))?
So if you can still rethink your desired serialization format, I would suggest a recursive approach. Something like:
"predicate" : {
"type" : "array",
"items" : {
"$ref" : "#/clause"
}
}
"clause" : {
"type" : "array",
"items" : [{
"$ref" : "#/condition"
}
],
"additionalItems" : {
"type" : "array",
"items" : [{
"$ref" : "#/operator"
}, {
"$ref" : "#/clause"
}
]
}
}
The generated string is uglier, but parsing it would be fairly easy with a simple recursive function:
Simple condition:
[["condition1"]]
Simple Clause:
[["condition1",["OR",["condition2"]]]
Adding a clause at the same level
[["condition1",["OR",["condition2"]],["OR",["condition3"]]]
Add a clause to a child level
[["condition1",["OR",["condition2"]],["OR",["condition3", ["AND",["condition4"]]]]]