YAML validation against POJO/bean - JSON

I have a schema defined in YAML format. At application startup, I need to verify that this schema has not been changed. To do that, I need to write a bean in code and validate the schema against that bean. Is this possible? It may sound strange, because usually payloads are validated against a schema, but here I need to validate a schema against a bean. Is there any Java-based solution?
I looked at jackson-module-jsonSchema, but it generates a schema from a bean rather than validating a bean against a schema, and it works with JSON, not YAML.
P.S. I can convert the YAML to JSON if need be (if an API is available to do this for JSON).
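One way to frame the check is: regenerate the schema from the current bean at startup (in Java, e.g. with jackson-module-jsonSchema), convert the stored YAML schema to JSON (which the question says is acceptable), and compare the two as parsed structures rather than as raw strings. A minimal sketch of the comparison step, in Python using only the standard library, with both schemas inlined as strings purely for illustration:

```python
import json

# Schema stored on disk (assume it has already been converted from YAML to
# JSON, which the question notes is an option).
stored_schema = '{"type": "object", "properties": {"id": {"type": "integer"}}}'

# Schema freshly generated from the current bean/model at startup
# (in Java this string would come from jackson-module-jsonSchema).
generated_schema = '{"properties": {"id": {"type": "integer"}}, "type": "object"}'

def schemas_match(a: str, b: str) -> bool:
    """Compare two JSON schemas structurally, ignoring key order and whitespace."""
    return json.loads(a) == json.loads(b)

print(schemas_match(stored_schema, generated_schema))  # True: same structure, different key order
```

Comparing parsed objects instead of strings means formatting, key order, and whitespace differences in the stored copy do not trigger false "schema changed" alarms.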

Related

Kafka Connect schema format for JSONConverter

I am using Kafka Connect to retrieve an existing schema from the Schema Registry and am then trying to convert the returned schema string using JsonConverter (org.apache.kafka.connect.json.JsonConverter).
Unfortunately, I get an error from JsonConverter:
org.apache.kafka.connect.errors.DataException: Unknown schema type: object
I looked at the JsonConverter code; the error occurs because the schema "type" returned from the Schema Registry is "object" (see below), but JsonConverter does not recognize that type.
Questions:
Is the retrieved schema usable for JSONConverter? If yes, am I using this incorrectly?
Is JSONConverter expecting a different format? If yes, does someone know what the format JSONConverter is expecting?
Is there a different method of converting the schema registry response into a "Schema"?
Here are the relevant artifacts:
schema registry response (when querying for a particular schema):
[{"subject":"test-schema","version":1,"id":1,"schemaType":"JSON","schema":"{\"title\":\"test-schema\",\"type\":\"object\",\"required\":[\"id\"],\"additionalProperties\":false,\"properties\":{\"id\":{\"type\":\"integer\"}}}"}]
When the text above is cleaned up a bit, the relevant schema component ("schema") is shown below:
{
"title":"test-schema",
"type":"object",
"required":["id"],
"additionalProperties":false,
"properties":{"id":{"type":"integer"}}
}
org.apache.kafka.connect.json.JsonConverter doesn't actually use the JSON Schema specification. It has its own (not well documented) format, and it doesn't integrate with the Schema Registry at all.
In that format, an object is a struct type. See https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/#json-schemas
If you intend to use actual JSON Schema (and the registry), you need to use the converter from Confluent: io.confluent.connect.json.JsonSchemaConverter
Is there a different method of converting the schema registry response into a "Schema"?
If you use the Schema Registry Java client, then yes: use the getSchemaById method, and the schemaType() and rawSchema() methods of the response should get you close to what you want. With that, you would pass the result to some JSON Schema library (e.g. org.everit.json.schema, which is used by the registry).
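For contrast with the JSON Schema shown in the question, here is a sketch of the envelope that JsonConverter does understand when schemas.enable=true: its own dialect (note "struct" rather than "object"), wrapped together with the payload, per the Confluent blog linked above. The type names ("struct", "int32") are Connect's; the exact field layout below is an illustration mirroring the question's schema, shown in Python for brevity:

```python
import json

# Roughly the message shape org.apache.kafka.connect.json.JsonConverter expects
# with schemas.enable=true: a Connect-dialect schema plus the payload itself.
connect_message = {
    "schema": {
        "type": "struct",       # Connect's equivalent of JSON Schema's "object"
        "name": "test-schema",  # optional logical name
        "optional": False,
        "fields": [
            {"field": "id", "type": "int32", "optional": False},
        ],
    },
    "payload": {"id": 1},
}

encoded = json.dumps(connect_message)
print(encoded)
```

This makes the error message concrete: the registry hands back `"type": "object"` (JSON Schema), while JsonConverter only knows types like `struct` from its own dialect.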

BizTalk XML to JSON Pipeline - Force JSON array even without a schema target namespace

Coming from this question: Conversion of XML Schema to JSON array list in Biztalk
We have the same situation: our XML needs to be converted to JSON, and we have child objects that can occur once or multiple times and must always end up as a JSON array.
The problem is that we are not able to set a target namespace because our schema is generated by SAP (IDoc).
Are there other options for the serialization? I would like to avoid a custom implementation for JSON serialization.
Create an internal BizTalk schema with a target namespace and map the SAP payload to that.

Given a JSON object with a $schema property, how do I validate it without providing the schema again?

I have an .ndjson file with a bunch of JSON objects. Each object has a $schema property pointing to a file with a JSON Schema defining the validation rules for that object. (If it helps, we can assume that I have a bunch of files instead of a .ndjson file - splitting it into multiple files is easy.)
I've looked around for a way to validate the JSON objects against the referenced $schemas, but all validators I can find require me to specify not only the object but also the schema explicitly.
For example:
jsonschema (Python)
fastjsonschema (Python)
pajv (Node/Commandline)
all require me to specify the schema as input.
Is there a validator that will read the $schema property instead of requiring me to provide the schema explicitly?
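One practical workaround, if no off-the-shelf tool does this, is a thin wrapper: read the $schema property out of each NDJSON object yourself, then hand each group of objects plus its referenced schema to any ordinary validator (e.g. the python-jsonschema library the question mentions). A stdlib-only sketch of that dispatch step, with the actual validator call stubbed out and the file paths purely illustrative:

```python
import json
from collections import defaultdict

# Example .ndjson content: each object names its own schema via "$schema".
ndjson = """\
{"$schema": "person.schema.json", "name": "Ada"}
{"$schema": "order.schema.json", "total": 42}
{"$schema": "person.schema.json", "name": "Grace"}
"""

# Group objects by the schema they reference, so each schema file is loaded
# only once and each group can be passed to a normal validator.
by_schema = defaultdict(list)
for line in ndjson.splitlines():
    obj = json.loads(line)
    by_schema[obj["$schema"]].append(obj)

for schema_path, objects in by_schema.items():
    # with open(schema_path) as f: schema = json.load(f)   # load referenced schema
    # for obj in objects: jsonschema.validate(obj, schema) # delegate validation
    print(schema_path, len(objects))
```

The wrapper is a few lines, and the validators listed above then work unchanged, since each one still receives an explicit (object, schema) pair.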

JSON Schema validation for PL/SQL or SQL, similar to XSD for XML Schema validation

I am looking for schema validation of the JSON structure received during API invocation from ORDS. Let's say the JSON is built from 10 different keys, and I want to validate it against pre-defined rules to check whether any required parameters are missing and whether the datatypes are correct.
I am looking for functionality similar to XSD, which is used for XML schema validation.
Please note this schema validation is required for JSON using PL/SQL and ORDS (Oracle REST Data Services).
Technical components: SQL, PL/SQL, ORDS. We want something that is compatible with these components.
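The two checks the question describes (required keys present, values of the expected type) are straightforward to hand-roll against a rule table; in PL/SQL that logic maps onto Oracle's JSON_OBJECT_T API (available from 12.2). The sketch below illustrates the rule-driven logic in Python for clarity; the key names and rule shape are invented for the example:

```python
import json

# Pre-defined rules: which keys are required and what type each must have.
RULES = {
    "id":   {"required": True,  "type": int},
    "name": {"required": True,  "type": str},
    "note": {"required": False, "type": str},
}

def validate(payload: str) -> list:
    """Return a list of violation messages (an empty list means valid)."""
    doc = json.loads(payload)
    errors = []
    for key, rule in RULES.items():
        if key not in doc:
            if rule["required"]:
                errors.append(f"missing required key: {key}")
        elif not isinstance(doc[key], rule["type"]):
            errors.append(f"wrong type for key: {key}")
    return errors

print(validate('{"id": 1, "name": "x"}'))  # [] -> valid
print(validate('{"id": "1"}'))             # wrong type for id, missing name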

Generate Angular2 forms from Swagger API specification

I'm looking for a way to generate a set of Angular2 form templates from a Swagger API definition file. I want a result that will allow me to test my POST/PUT requests, and even use it in my app.
After some research I found this Angular2 form library that takes a JSON schema as input: https://github.com/makinacorpus/angular2-schema-form
So if you know of a Swagger -> JSON Schema converter that will work too.
Cheers!
So if you know of a Swagger -> JSON Schema converter that will work too.
Swagger 2.0 supports a subset of JSON Schema draft 4; this is what Swagger's Schema Object is. From the docs:
The following properties are taken directly from the JSON Schema definition and follow the same specifications:
$ref - As a JSON Reference
format (See Data Type Formats for further details)
title
description (GFM syntax can be used for rich text representation)
default (Unlike JSON Schema, the value MUST conform to the defined type for the Schema Object)
multipleOf
...
The following properties are taken from the JSON Schema definition, but their definitions were adjusted to the Swagger Specification.
items
allOf
properties
additionalProperties
It should be a fairly simple exercise to manually extract the schema from your Swagger file, but I don't know of any automated tool for this. The fact that Swagger modifies some of the JSON Schema properties may make automatic conversion problematic in certain circumstances.
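The manual extraction amounts to lifting one entry out of the Swagger file's definitions map and making it a standalone schema. A sketch of that step under simplifying assumptions (it keeps the whole definitions map so internal "#/definitions/..." refs still resolve, and it does not rewrite any Swagger-adjusted keywords; the Pet/Tag definitions are invented for the example):

```python
import json

# A trimmed Swagger 2.0 document; "definitions" entries are (near-)JSON-Schema.
swagger = {
    "swagger": "2.0",
    "definitions": {
        "Pet": {
            "type": "object",
            "required": ["name"],
            "properties": {
                "name": {"type": "string"},
                "tag": {"$ref": "#/definitions/Tag"},
            },
        },
        "Tag": {"type": "object", "properties": {"id": {"type": "integer"}}},
    },
}

def extract_schema(doc: dict, name: str) -> dict:
    """Lift one Swagger definition into a standalone draft-4 schema.

    Copies the full "definitions" map alongside it so "#/definitions/..."
    references keep resolving; Swagger-specific keyword differences
    (items, allOf, additionalProperties, ...) are NOT normalized here.
    """
    schema = dict(doc["definitions"][name])
    schema["$schema"] = "http://json-schema.org/draft-04/schema#"
    schema["definitions"] = doc["definitions"]
    return schema

pet_schema = extract_schema(swagger, "Pet")
print(json.dumps(pet_schema, indent=2))
```

The result can be fed directly to a JSON-Schema-driven form library such as angular2-schema-form, with the caveat from the answer above: definitions that rely on Swagger's adjusted keywords may still need manual touch-up.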