I'm writing an API in Express, using Mongoose as my data layer. I'd like the API to be as self-describing as possible so my frontend can automatically generate forms and validation based on schema rules set up in my Mongoose models.
Is there any existing way to get a JSON representation of a Mongoose schema, or will I have to write my own? There seem to be plenty of JSON-to-Mongoose schema generators, but very little in the way of describing an existing schema.
Maybe I'm missing the point, but you can define your schema as a plain object and call JSON.stringify on it. Alternatively, you can access all schema paths through the model as Model.schema.paths.
Related
I want to download data from a REST API into a database. The data I want to save are typed objects, like Java objects. I chose Cassandra because it supports collection types (list, map, ...), unlike standard SQL databases (MySQL, SQLite, ...), which makes it better suited to serializing Java objects.
First, I need to create the CQL tables from the JSON schema of the REST API. How can I generate CQL tables from the JSON schema of a REST API?
I know openapi-generator can generate a MySQL schema from a JSON schema, but it doesn't support CQL at the moment, so I need to find an alternative solution.
I haven't used off-the-shelf packages extensively to manage Cassandra schemas, but there are open-source projects, and software like Hackolade, that might do it for you.
https://cassandra.link/cassandra.toolkit/ managed by Anant (I don't have any affiliation) has an extensive list of resources you might be interested in. Cheers!
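If no off-the-shelf generator supports CQL, the type mapping can also be hand-rolled. A minimal Python sketch, where the type table, table layout, and sample schema are illustrative assumptions rather than a complete JSON Schema implementation:

```python
# Illustrative mapping from simple JSON Schema types to CQL column types.
CQL_TYPES = {
    "string": "text",
    "integer": "bigint",
    "number": "double",
    "boolean": "boolean",
}

def json_type_to_cql(prop):
    """Translate one JSON Schema property definition to a CQL type."""
    t = prop.get("type")
    if t == "array":
        return f"list<{json_type_to_cql(prop['items'])}>"
    if t == "object":
        # Assumes string keys and homogeneous values declared
        # via additionalProperties.
        return f"map<text, {json_type_to_cql(prop['additionalProperties'])}>"
    return CQL_TYPES[t]

def table_from_schema(name, schema, partition_key):
    """Emit a CREATE TABLE statement for a flat JSON Schema object."""
    cols = ",\n  ".join(
        f"{col} {json_type_to_cql(prop)}"
        for col, prop in schema["properties"].items()
    )
    return f"CREATE TABLE {name} (\n  {cols},\n  PRIMARY KEY ({partition_key})\n);"

ddl = table_from_schema(
    "users",
    {"properties": {"id": {"type": "integer"},
                    "name": {"type": "string"},
                    "tags": {"type": "array", "items": {"type": "string"}}}},
    "id",
)
print(ddl)
```

This only covers flat schemas with a handful of scalar and collection types; nested records, unions, and constraints would all need explicit decisions about how they map onto CQL.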
I want to use Apache Avro schemas for data serialization and deserialization.
I want to use them with the JSON encoding.
I want to put several of these serialized objects, using different schemas, onto the same "source" (a Kafka topic).
When I read them back, I need to be able to resolve the right schema for the current data entry.
But the serialized data doesn't carry any schema information, and testing every possible schema for compatibility (a kind of duck-typing approach) would be unclean and error-prone (for data that fits multiple schemas, it would be unclear which one to pick).
I'm currently thinking about programmatically putting the namespace and object name inside the JSON data. But such a solution wouldn't be part of the Avro standard, and it would open a new error scenario where it's possible to put the wrong schema namespace and/or object name inside the data.
I'm wondering whether there is a better way, or whether there is a general flaw in my approach.
Background: I want to use this for Kafka messages, but don't want to use the Schema Registry (I don't want a new single point of failure). I also still want KSQL support, which is only available for the JSON format, or for Avro with the Schema Registry.
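The envelope idea described above (embedding the namespace and record name alongside the payload) can be sketched in a few lines of Python. All names here are hypothetical, and, as noted, this wrapper sits outside the Avro standard:

```python
import json

# Sketch of the envelope approach: wrap each JSON-encoded record with the
# fully qualified Avro name so the reader can resolve the right schema.
# NOTE: this wrapper is NOT part of the Avro standard; all names are made up.
SCHEMAS = {
    "com.example.User": {
        "type": "record", "name": "User", "namespace": "com.example",
        "fields": [{"name": "id", "type": "long"}],
    },
    "com.example.Order": {
        "type": "record", "name": "Order", "namespace": "com.example",
        "fields": [{"name": "total", "type": "double"}],
    },
}

def wrap(schema, payload):
    """Attach the fully qualified schema name to the serialized record."""
    fq_name = f"{schema['namespace']}.{schema['name']}"
    return json.dumps({"schema": fq_name, "payload": payload})

def unwrap(message):
    """Look up the schema named in the envelope; return (schema, payload)."""
    envelope = json.loads(message)
    return SCHEMAS[envelope["schema"]], envelope["payload"]

msg = wrap(SCHEMAS["com.example.User"], {"id": 42})
schema, payload = unwrap(msg)
```

A consumer would call unwrap first and only then hand the payload to the Avro deserializer for the resolved schema; the failure mode mentioned in the question (a wrong name written into the envelope) surfaces here as a lookup failure rather than silently decoding against the wrong schema.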
This may be a naive question or requirement. I parse huge raw data and restructure it into a set of JSON files, which reside in a directory. I would like to avoid setting up any NoSQL database such as MongoDB (or any database, for that matter).
Based on these JSON files, I would like to set up Express models so that I can run queries against the JSON files. Users won't perform any create, update, or delete operations.
I was looking for a module similar to Mongoose that would make things simpler.
I am using python 3 for functional testing of a bunch of rest endpoints.
But I cannot figure out the best way to validate the JSON response (verifying types and required, missing, and additional fields).
I thought of the options below:
1. Write custom code that validates the response while converting the data into Python class objects.
2. Validate using JSON Schema.
Option 1 would be difficult to maintain, and I would need to add separate functions for all the data models.
Option 2 I like, but I don't want to write a schema for each endpoint in a separate file/object. Is there a way to put them in a single object, like a Swagger YAML file? That would be easier to maintain.
I would like to know which option is best, and whether there are other, better options or libraries available.
I've been through the same process, but validating REST requests and responses with Java. In the end I went with JSON Schema (there's an equivalent Python implementation at https://pypi.python.org/pypi/jsonschema) because it was simple and powerful, and hand-crafting the validation for anything but a trivial payload soon became a nightmare. Also, reading a JSON Schema file is easier than reasoning about a long list of validation statements.
It's true you need to define the schema in a separate file, but this proved to be no big deal. And, if your endpoints share some common features you can modularise your schemas and reuse common parts. There's a good tutorial at Understanding JSON Schema.
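On the "single object instead of one file per endpoint" concern: all schemas can live in one dict and share common parts via $ref. A minimal sketch using the jsonschema package linked above (endpoint names and fields are made up):

```python
from jsonschema import validate, ValidationError

# All endpoint response schemas in one object, sharing common parts
# via $ref (endpoint names and fields are illustrative).
SCHEMAS = {
    "definitions": {
        "user": {
            "type": "object",
            "properties": {"id": {"type": "integer"},
                           "name": {"type": "string"}},
            "required": ["id", "name"],
            "additionalProperties": False,
        }
    },
    "get_user": {"$ref": "#/definitions/user"},
    "list_users": {"type": "array", "items": {"$ref": "#/definitions/user"}},
}

def check(endpoint, response):
    """Validate a response against the schema registered for an endpoint."""
    # Merge in the shared definitions so "#/definitions/..." refs resolve.
    schema = dict(SCHEMAS[endpoint], definitions=SCHEMAS["definitions"])
    validate(instance=response, schema=schema)

check("get_user", {"id": 1, "name": "Ada"})  # passes silently
```

A type mismatch, a missing required field, or an unexpected extra field all raise ValidationError with a message pointing at the offending path, which keeps the test assertions themselves short.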
I have a use case where I need to validate JSON objects against a schema that can change in real time.
Let me explain my requirements:
1. I persist JSON objects (MongoDB).
2. Before persisting, I MUST validate the data types of some of the fields of the JSON objects (mentioned in #1) against a schema.
3. I persist the schema in MongoDB.
4. I always validate the JSON objects against the latest schema available in the DB (so I don't think it matters much that the schema can change in real time; for me it is effectively static).
5. I am using a J2EE stack (Spring Framework).
Can anyone guide me here?
Another way of doing it is to use an external library, https://github.com/fge/json-schema-validator, to do the work for you. The one I proposed supports draft 4 of JSON Schema.
The IBM DataPower appliance has JSON Schema validation support. This lets you offload validation to an appliance designed for it, along with routing of data within the enterprise.