JSON validation against a schema (Java EE application)

I have a use case where I need to validate JSON objects against a schema that can change in real time. Let me explain my requirements:

1. I persist JSON objects (in MongoDB).
2. Before persisting, I MUST validate the data types of some of the fields of the JSON objects (mentioned in #1) against a schema.
3. I persist the schema in MongoDB.
4. I always validate the JSON objects against the latest schema available in the DB (so I don't think it matters much that the schema can change in real time; for me it is effectively static).

I am using a Java EE stack (Spring Framework). Can anyone guide me here?

Another way of doing it is to use an external library https://github.com/fge/json-schema-validator to do the work for you. The one I proposed supports draft 4 of JSON Schema.
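With version 2.2.x of that library, validation boils down to loading the schema and the instance as Jackson JsonNodes and asking for a ProcessingReport. A minimal sketch follows; in the real application the schema document would be fetched from MongoDB, so the hard-coded schema and sample document here are only placeholders:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.github.fge.jackson.JsonLoader;
    import com.github.fge.jsonschema.core.report.ProcessingReport;
    import com.github.fge.jsonschema.main.JsonSchema;
    import com.github.fge.jsonschema.main.JsonSchemaFactory;

    public class SchemaValidationDemo {
        public static void main(String[] args) throws Exception {
            // In the real application this draft-4 schema document would be
            // loaded from MongoDB; it is hard-coded here as a placeholder.
            JsonNode schemaNode = JsonLoader.fromString(
                "{\"type\":\"object\",\"properties\":{\"age\":{\"type\":\"integer\"}},"
              + "\"required\":[\"age\"]}");
            JsonNode data = JsonLoader.fromString("{\"age\":42}");

            JsonSchema schema = JsonSchemaFactory.byDefault().getJsonSchema(schemaNode);
            ProcessingReport report = schema.validate(data);
            if (!report.isSuccess()) {
                // Reject the document instead of persisting it.
                System.out.println(report);
            }
        }
    }

Since you always validate against the latest schema in the DB, you would rebuild (or cache and invalidate) the JsonSchema object whenever the stored schema document changes.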

The IBM DataPower appliance has JSON Schema validation support. This lets you offload validation to an appliance designed for it, along with routing of data within the enterprise.

Related

JSON and Schema Registry

I am trying to produce JSON records from my Scala producer code to a Kafka topic. The records are produced successfully; however, I am not able to register the schema or perform schema-evolution compatibility checks.
I am not able to find any proper code/doc references. How do I register my JSON schema, consume by connecting to the Schema Registry client, and check for compatibility?
Any suggestions, please? (More about what I am trying: I get "Class io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer could not be found".)
Compatibility is checked server-side automatically upon producing, which in turn registers the schema, by default.
You provide schema.registry.url in the producer and consumer properties when using the clients with the KafkaJsonSchemaSerializer/KafkaJsonSchemaDeserializer classes.
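A minimal sketch of the producer side (assuming the Confluent kafka-json-schema-serializer artifact is on the classpath; the bootstrap servers, registry URL, topic name, and MyEvent payload class are placeholders):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class JsonSchemaProducerDemo {
        // Hypothetical payload class; the serializer derives a JSON schema from it.
        public static class MyEvent {
            public String name;
            public MyEvent() {}
            public MyEvent(String name) { this.name = name; }
        }

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer");
            // Point the serializer at the registry; schema registration and the
            // server-side compatibility check happen on produce.
            props.put("schema.registry.url", "http://localhost:8081");

            try (Producer<String, MyEvent> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("my-topic", new MyEvent("example")));
            }
        }
    }

By default the schema is registered under the subject "<topic>-value", so this example would register it under "my-topic-value".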

Converting REST API JSON schema into a CQL Cassandra schema

I want to download data from a REST API into a database. The data I want to save are typed objects, like Java objects. I have chosen Cassandra because, unlike standard SQL databases (MySQL, SQLite, ...), it supports collection types such as list and map, which makes it better suited to serializing Java objects.
First, I need to create the CQL tables from the JSON schema of the REST API. How is it possible to generate a CQL table from the JSON schema of a REST API?
I know openapi-generator can generate a MySQL schema from a JSON schema, but it doesn't support CQL for the moment, so I need to find an alternative solution.
I haven't used off-the-shelf packages extensively to manage Cassandra schemas, but there are open-source projects and commercial tools like Hackolade that might do it for you.
https://cassandra.link/cassandra.toolkit/ managed by Anant (I don't have any affiliation) has an extensive list of resources you might be interested in. Cheers!
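Failing tool support, a workable fallback is to hand-translate each object in the JSON schema into a CQL table, mapping JSON arrays to list<...> columns and free-form objects to map<...> columns. A rough sketch with the DataStax Java driver; the keyspace, table, and column names are hypothetical, and the keyspace is assumed to already exist:

    import com.datastax.oss.driver.api.core.CqlSession;

    public class CreateTableFromJsonSchema {
        public static void main(String[] args) {
            // Hypothetical mapping of a JSON-schema object with a string "id",
            // an array of strings "tags", and a string-to-string object "attrs".
            String ddl =
                "CREATE TABLE IF NOT EXISTS api_data.items ("
              + "  id text PRIMARY KEY,"
              + "  tags list<text>,"        // JSON array  -> CQL list
              + "  attrs map<text, text>"   // JSON object -> CQL map
              + ")";

            // With no explicit contact points, the 4.x driver connects to
            // 127.0.0.1:9042 using its default configuration.
            try (CqlSession session = CqlSession.builder().build()) {
                session.execute(ddl);
            }
        }
    }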

Avro schema with JSON encoding - how to determine the schema from serialized data

I want to use Apache Avro schemas for data serialization and deserialization, with JSON encoding.
I want to put several of these serialized objects, using different schemas, into the same "source" (a Kafka topic).
When I read them back, I need to be able to resolve the right schema for the current data entry.
But the serialized data doesn't carry any schema information, and testing every possible schema for compatibility (a kind of duck-typing approach) would be unclean and error-prone (for data that fits multiple schemas, it would be unclear which one to take).
I'm currently thinking about programmatically putting the namespace and object name inside the JSON data. But such a solution would not be part of the Avro standard, and it would open a new error scenario where the wrong schema namespace and/or object name could be put inside the data.
I'm wondering if there is a better way, or if there is a general flaw in my approach.
Background: I want to use this for Kafka messages, but I don't want to use the Schema Registry (I don't want a new single point of failure). I also still want KSQL support, which is only available for the JSON format or for Avro with the Schema Registry.
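To make the idea concrete, here is a rough sketch of what I have in mind with the plain Avro Java API; the newline-delimited envelope and the local schema map are my own assumptions, not part of the Avro standard:

    import java.io.ByteArrayOutputStream;
    import java.util.Map;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.avro.io.EncoderFactory;
    import org.apache.avro.io.JsonDecoder;
    import org.apache.avro.io.JsonEncoder;

    public class EnvelopeCodec {
        // Local registry of known schemas, keyed by full name ("namespace.Name").
        private final Map<String, Schema> schemasByFullName;

        public EnvelopeCodec(Map<String, Schema> schemasByFullName) {
            this.schemasByFullName = schemasByFullName;
        }

        // Serialize: JSON-encode the record and prefix it with the schema's
        // full name and a newline (this envelope format is an assumption).
        public String encode(GenericRecord record) throws Exception {
            Schema schema = record.getSchema();
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            JsonEncoder encoder = EncoderFactory.get().jsonEncoder(schema, out);
            new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
            encoder.flush();
            return schema.getFullName() + "\n" + out.toString("UTF-8");
        }

        // Deserialize: split off the schema full name, look the schema up
        // locally, then decode the JSON payload with it.
        public GenericRecord decode(String message) throws Exception {
            int sep = message.indexOf('\n');
            Schema schema = schemasByFullName.get(message.substring(0, sep));
            JsonDecoder decoder =
                DecoderFactory.get().jsonDecoder(schema, message.substring(sep + 1));
            return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        }
    }

Writing the correct full name into the envelope remains the producer's responsibility, which is exactly the error scenario I'm worried about.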

How do I store nested JSON objects directly in ABAP DDIC?

The databases underneath ABAP (Oracle, MaxDB, et al.) are mostly RDBMS. Right now, I have a JSON structure that cannot be normalised, and hence I want to store it as is. So, I want a MongoDB-like object store in ABAP.
What's the best way to achieve this? Is a data cluster an option? Perhaps the only option?
I don't think you can connect to databases other than the supported ones directly from ABAP. If you have NetWeaver Java, you can call a custom Java application which accesses MongoDB. You can also check whether SAP HANA offers something similar.
In ABAP you interact with the RDBMS via the ABAP Dictionary.
It supports data types like LCHR, STRING, and RAWSTRING; check the docs for more details.
A data cluster is one option, but you can also simply use a binary-type DB field for storing the JSON data.
ABAP also has transformations (CALL TRANSFORMATION), which convert ABAP data to XML/JSON and vice versa.
There's a simple example on the following blog:
https://blogs.sap.com/2013/07/04/abap-news-for-release-740-abap-and-json/
Comments on the blog page contain more info.

How to get a JSON representation of a Mongoose schema?

I'm writing an API in Express, using Mongoose as my data layer. I'd like the API to be as self-describing as possible so my frontend can automatically generate forms and validation based on schema rules set up in my Mongoose models.
Is there any existing way to get a JSON representation of a Mongoose schema, or will I have to write my own? There seem to be plenty of JSON-to-Mongoose schema generators, but very little in the way of describing an existing schema.
Maybe I'm missing the point, but you can define your schema as a plain object and then use JSON.stringify on it. Or, if you want, you can access all schema paths through the model as Model.schema.paths.