JSON validation with custom schema

I simply want to validate a JSON file against the JSON schema I have written, just to check whether I did everything right. Is there a command-line tool or anything else for this (I'm on Ubuntu)? Everything I have found so far consists of packages for various programming languages for writing validation code. I just need a tool where I can specify my JSON file and my custom schema and check it. That's it...

You have a couple of options. Have a look at the list of JSON-Schema validator implementations to find the one that best suits your needs.
You can use the fge Java json-schema-validator from the command line, as explained here.
If you are comfortable with Python, you can also use the jsonschema package and run it from any Python console:
from jsonschema import validate  # pip install jsonschema
# here, instance and schema are dicts already parsed (e.g. with json.load)
validate(instance=instance, schema=schema)
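Newer versions of the jsonschema package also install a small command-line script (something like jsonschema -i instance.json schema.json; check jsonschema --help for the exact flags), which may cover the command-line use case directly.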
Another popular JavaScript library is tv4. You can use it directly in the browser or in a Node console:
var valid = tv4.validate(json, schema);
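For a self-contained Node script, usage might look like the following sketch (TypeScript, assuming npm install tv4 plus its typings; the file names are placeholders):
import * as fs from "fs";
import * as tv4 from "tv4";

// Load the document and the schema from disk.
const data = JSON.parse(fs.readFileSync("data.json", "utf8"));
const schema = JSON.parse(fs.readFileSync("schema.json", "utf8"));

if (tv4.validate(data, schema)) {
  console.log("valid");
} else {
  // tv4 keeps the last failure in tv4.error
  console.error(tv4.error.message, "at", tv4.error.dataPath);
}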

Related

Purpose of "resolveJsonModule"?

The setting I am referencing is shown in the snippet below:
{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}
I don't really understand why the TS language engineers would add a flag like "resolveJsonModule". Either an environment supports resolving JSON as a module via an import statement (or a require() call), or it doesn't. Why bother with the extra complexity?
Context
Historically, Node has included a specialized JSON loader (unrelated to ECMA standards) to allow importing JSON data only in CommonJS mode.
Standardized importing of anything at all (ES modules) is only a relatively recent phenomenon in ECMAScript. Importing text files containing valid JSON, parsed as native JS data ("importing JSON") is described in a proposal that is still only in stage 3.
However, there has been recent movement in regard to implementation of the above mentioned proposal:
V8 implemented it in June 2021 (Chrome 91+)
TypeScript v4.5.0 implemented it in November 2021
Deno v1.17.0 implemented it in December 2021
Node LTS v16.14.0 implemented it in February 2022 (behind the CLI flag --experimental-json-modules)
TypeScript
TypeScript is a static type-checker, but also a compiler (technically a transpiler): it transforms your TS source code into syntax that is valid JavaScript for the runtime environment you have specified in your TSConfig. Because different runtime environments have different capabilities, the way you configure the compiler affects the transformed JavaScript that is emitted. As for defaults, the compiler uses algorithmic logic to determine its settings. (I can't summarize that here: you honestly have to read the entire reference in order to understand it.) Because loading JSON data has been a non-standard, specialized operation until extremely recently, it has not been a default.
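With the flag enabled (and a compatible module resolution setting), importing a JSON file type-checks like any other module. A minimal sketch, assuming a config.json file sitting next to the source:
// Requires "resolveJsonModule": true in tsconfig.json
import config from "./config.json";

// The compiler infers a structural type from the literal contents of
// config.json, so property access on `config` is type-checked.
console.log(config);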
Alternatives
All JS runtimes offer alternatives to an import statement for importing textual JSON data (which can then be parsed using JSON.parse), and none of them requires configuring the compiler in the ways that you asked about; a sketch follows the list below:
Note: the data parsed from JSON strings imported using these methods will not participate in the compiler's "automatic" type-inference capabilities, because the strings aren't part of the compilation module graph: the parsed values will be typed as any (or possibly unknown in an extremely strict configuration).
Browser and Deno: window.fetch
Deno: Deno.readTextFile
Node: fs.readFile
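For Node, for example, such an alternative might look like the following sketch (the file name is a placeholder):
import { readFile } from "fs/promises";

// Read the raw text, then parse it; typing the result as `unknown`
// forces a check or assertion before the data is used.
async function loadJson(path: string): Promise<unknown> {
  const text = await readFile(path, "utf8");
  return JSON.parse(text) as unknown;
}

loadJson("./data.json").then((data) => console.log(data));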
Additionally, because all JSON (JavaScript Object Notation) is valid JS, you can simply prepend the data in your JSON file with export default , save the file as data.js instead of data.json, and then import it as a standard module: import {default as data} from './data.js';.
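Concretely, the renamed file might look like this (a sketch with placeholder contents):
// data.js -- formerly data.json, with "export default " prepended:
export default { "greeting": "hello", "count": 3 }

// consumer.ts:
import { default as data } from './data.js';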
Final notes about inferred types:
I prefer to audit the JSON I'm importing and use my own manually-written types for the data (written either by myself or someone else and imported from a module/declaration file), rather than relying on the types the compiler infers from import statements (which I have found to be too narrow on many occasions). I do this by assigning the parsed JSON data to a new variable using a type assertion.
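For example (a sketch: the Config interface and the file name are hypothetical):
import { readFileSync } from "fs";

// A manually-written type describing the expected shape of the data:
interface Config {
  port: number;
  hosts: string[];
}

// The assertion replaces the compiler's inferred (often too-narrow) type:
const config = JSON.parse(readFileSync("./config.json", "utf8")) as Config;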

Single serialization layer to Json with Casbah/Salat

I am trying to create a serialization layer which allows me to:
Store my classes in a MongoDB data source
Convert them to JSON to use them in a REST API.
Some classes are clearly not case classes (because they are inherited from a Java codebase) and I would have to write ad-hoc code for them. Is registering a BSON hook for my non-standard type the correct approach, and does it also provide JSON serialization?
Salat maintainer here.
You might prefer to create a Salat custom transformer instead of registering a BSON hook with Casbah.
See simple example and spec.
If you run into any issues, feel free to ping the mailing list with a small sample GitHub project that demonstrates what isn't working.

Is there a json validation framework in play based on a specified grammar

An automated system is going to feed the application (Play with Scala) with JSON, and the contract of the integration is that no validation is required on the JSON, since it will always be deemed correct. But for testing purposes, when we seed the data, more often than not we are unable to send correct JSON. We would like to validate the JSON we receive against a set of grammars. Is there a library that already does this, or is there a better way to do it?
Example grammar for valid JSON:
"header"->[String, mandatory],
"footer"->[String],
"someArray"->Array[String, mandatory],
"someArrayObject"->Array[
{
{"key1"->Int, mandatory},
{"key2"->String}
},
mandatory
]
and passing,
{
  "header":"headerContent",
  "footer":"footerContent",
  "someArray":["str1", "str2"],
  "someArrayObject":[
    {"key1":4, "key2":"someStringValue"},
    {"key1":5, "key2":"someOtherStringValue"}
  ]
} // would pass
{
  "header":"headerContent",
  "footer":"footerContent",
  "someArray":["str1", "str2"]
} // would not pass, since someArrayObject, though declared mandatory, is not provided in the sample JSON
I think play-json will satisfy you.
In play-json you don't create a standalone validator; you create a JSON transformer, which is a validator in itself. The author of the framework wrote a series of blog posts showing how to work with them: json-transformers
(I hadn't noticed that you're already using Play: Play includes play-json by default.)
You don't have to roll your own DSL. This is why we have schemas. Just like using XML schemas to validate your XML documents, you can define a JSON schema to validate your JSON objects. I had a similar requirement when building a RESTful web service with Play. I solved it by using the JSON Schema Validator library.
I have used the JSON Schema draft v3. The library supports draft v3 and draft v4. You can validate your schemas against possible JSON inputs using a web application that uses the same library. The web app is hosted here.
Also there are pretty nice examples that use the draft v4. You can check them out from here.
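For the grammar sketched in the question, a draft v4 schema might look roughly like this (a sketch; it assumes nothing beyond the property names given in the question):
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "required": ["header", "someArray", "someArrayObject"],
  "properties": {
    "header": { "type": "string" },
    "footer": { "type": "string" },
    "someArray": { "type": "array", "items": { "type": "string" } },
    "someArrayObject": {
      "type": "array",
      "items": {
        "type": "object",
        "required": ["key1"],
        "properties": {
          "key1": { "type": "integer" },
          "key2": { "type": "string" }
        }
      }
    }
  }
}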
In Play 2, I have composed an action that takes the schema resource file name as input. This keeps away a lot of JSON validation code from the controller action itself.
@JsonValidate("user-register.json")
public static Result create() {
...
}
This way, all JSON Validation code stays in one place. Pretty neat :)

How to encode JSON in ABAP before 7.02

As Horst Keller mentioned in his ABAP and JSON post, "with Releases 7.02 and 7.03/7.31 (Kernelpatch 116) JSON is supported natively in ABAP".
Apparently 7.02 in my case is too generic, because the line below:
writer = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
returns the error: "The field CO_XT_JSON is unknown, but there is a field with the similar name CO_XT_XOP".
So is there any way to easily generate JSON?
Edit: [screenshot from SAP - Status]
About the class CL_TREX_JSON_SERIALIZER: I also used this class while developing a mobile SAP application, and I found that the JSON it creates is not valid. So I started googling and found this post, which also explains how to create a valid JSON serializer: http://scn.sap.com/community/mobile/blog/2012/09/24/serialize-abap-data-into-json-format
Validate your JSON with JSONLint (http://jsonlint.com/) to see whether it is valid. Otherwise you are guaranteed a lot of trouble debugging why it doesn't work, without realizing that the serializer itself is broken. Regards, zY
Take a look at the ZCL_MDP_JSON library. It can parse/encode any JSON, so it is best suited for JSON scenarios that require flexibility.
It is easy to understand if you have used JSON in other languages. You only need to study the methods of the ZCL_MDP_JSON_NODE class once and look at the examples.
Here is an extended overview of the library:
http://scn.sap.com/community/abap/blog/2016/07/03/an-open-source-abap-json-library--zclmdpjson
GitHub repo with examples directory: https://github.com/fatihpense/zcl_mdp_json
Disclaimer: I'm the author of the project. If you have questions, don't hesitate to contact me.
Here is some code I wrote for ABAP data <-> JSON conversion some time ago before the new capabilities were included with ABAP (or maybe it was just an older system).
https://gist.github.com/mydoghasworms/2291540
Include the code in your ABAP source and use the method data_to_json of the class.
A nice overview of custom ABAP <-> JSON serializers, including yet another one, can be found in this blog post.
Most popular, from my point of view, is SE38's ZJSON library, which can be installed using SAPlink and which (in contrast to many others) has an explicit license attached to it: Apache 2.0.
If upgrading to a newer patch isn't an option in the short term, you can also use class CL_TREX_JSON_SERIALIZER to serialise objects to JSON. A little bit of a quick-and-dirty solution but it works well.

Is there a defined set of JSON parser tests that validate the majority of edge cases?

I'm researching some JSON parsers, but some are home-grown. Is there a validation test suite that can be run against a JSON parser to verify that the parser is "valid" and serializes/deserializes JSON strings properly?
It seems that someone has attempted to write such a test suite here.
It is used in another parser project to validate its implementation.
It's in Java; I don't know whether that fits your needs.
(Someone has written some tests for Python too, here.)
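If you wire such a suite up to your own parser, the harness itself can stay tiny. Below is a sketch in TypeScript for Node, assuming one directory of documents the parser must accept and one it must reject (both directory names are hypothetical), with JSON.parse standing in for the parser under test:
import { readdirSync, readFileSync } from "fs";
import { join } from "path";

// Feed every file in `dir` to the parser and compare against expectations.
function expectParse(dir: string, shouldParse: boolean): void {
  for (const name of readdirSync(dir)) {
    const text = readFileSync(join(dir, name), "utf8");
    let parsed = true;
    try {
      JSON.parse(text); // replace with the parser under test
    } catch {
      parsed = false;
    }
    if (parsed !== shouldParse) {
      console.error(`FAIL ${name}: expected parse=${shouldParse}`);
    }
  }
}

expectParse("./accept", true);   // documents that must parse
expectParse("./reject", false);  // documents that must be rejected
Swap JSON.parse for your parser's entry point; counting failures instead of just logging them is an easy extension.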