Is there a way to get a JSON-Schema from a Scala Case Class hierarchy?

I'm documenting an internal REST API written in Scala. Unfortunately we are not able to integrate Swagger, so for now we are going with an in-house solution for the doc generator.
I would like to generate a JSON-Schema to show what the response looks like when getting our resources. I'm just wondering if there is any shortcut to do this by taking advantage of the case classes we have already modeled.

The autoschema project is able to export JSON schema from Scala case classes. You can use it as follows:
case class MyType(myValue: Int)
AutoSchema.createSchema[MyType]
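For a small hierarchy, a hedged sketch of what this could look like (assuming nested case classes are traversed by createSchema and that the result is a play-json value, in line with the play-json dependency mentioned below; the import path is the one used by the original Coursera project and may differ in forks, and the case classes are made up for illustration):
import org.coursera.autoschema.AutoSchema
import play.api.libs.json.Json

case class Address(street: String, zipCode: String)
case class Person(name: String, age: Int, address: Address)

// Derive a schema for the top-level type; nested case classes such as
// Address are expected to show up as nested object schemas.
// createSchema is assumed to return a play-json JsObject here.
val schema = AutoSchema.createSchema[Person]
println(Json.prettyPrint(schema))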
The Maven artifact seems to no longer be available, but it is an SBT project hosted on GitHub, so you can either copy the sources, build a JAR, or add it as a source dependency with SBT by putting the following in your build.sbt:
lazy val autoschemaProject =
  ProjectRef(uri("https://github.com/coursera/autoschema.git"), "autoschema")

lazy val root = (project in file(".")).dependsOn(autoschemaProject)
I tested this with SBT 0.13.7. Note that autoschema has its own dependencies (mainly play-json 2.3.2), so you might need to adjust their versions to avoid version conflicts with your own project's dependencies.
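If you do run into such a conflict, one option is to pin the version explicitly so that your project and autoschema resolve to the same artifact (a sketch, assuming sbt 0.13 syntax; adjust the version to the one your project already uses):
// Force a single play-json version across the whole build.
dependencyOverrides += "com.typesafe.play" %% "play-json" % "2.3.2"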

As @mziccard stated, autoschema is the way to go. However, it has been a while since there was any activity on the main repository. I took some time to fork it and update its dependencies and deprecated code (work that had already been done in other forks; I simply combined it). It is now published on Maven Central from my fork:
https://github.com/sauldhernandez/autoschema
You can use it by putting this in build.sbt:
libraryDependencies += "com.sauldhernandez" %% "autoschema" % "1.0.0"

Related

Add custom Jinja2 filters/tests to MkDocs

While writing a Jinja2 template for MkDocs, I need some processing that is not covered by the filters/tests available (specifically, I need date formatting, which is a recurring example for custom filters in Jinja2-related resources across the Web). How can I define my own filters/tests and use them from templates?
To clarify the question, I know how to register new filters/tests in a Jinja2 environment from Python. My issue is that, as a user of MkDocs, I do not configure Jinja2 myself. So what I’m looking for is a way to hook into the setup that MkDocs performs.
I assume it is possible to add filters from a plugin. In fact, I have found one such plugin (undocumented and apparently not under active development, unfortunately). However, I hope there is a simpler, local solution, one that would not involve implementing a plugin, packaging it as a Python package, and publishing it on PyPI.
A possible solution is to use mkdocs-simple-hooks, which lets you implement hooks without needing to create a plugin. For example, in your case, in mkdocs.yml:
plugins:
  - mkdocs-simple-hooks:
      hooks:
        on_env: "docs.hooks:on_env"
And in docs/hooks.py:
def on_env(env, config, files, **kwargs):
    # Register your own callables as Jinja2 filters/tests on the
    # environment MkDocs passes in, then return the environment.
    env.filters['my_filter'] = my_filter
    env.tests['my_test'] = my_test
    return env

SBT: how to override the default sources (and/or resources etc)?

I've been reading and re-reading the docs, and everywhere it just says that by default sbt uses the project base directory and src/main/scala to look for Scala sources, but I could not find any mention of how that default can be changed.
I have seen vague references to an "exclude" filter, but what I need is the opposite.
I would like to compile a subset of a large scala project into a smaller self-contained artifact. Is there a way to tell sbt exactly what files I want it to include? Something like "**/util/*.scala" for example?
It is possible to customize the source paths in SBT:
scalaSource in Compile := baseDirectory.value / "src"
scalaSource in Test := baseDirectory.value / "test-src"
More in the documentation.
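If you only need a subset of the files rather than a different directory (as in the "**/util/*.scala" example from the question), a hedged sketch that post-filters the computed sources might work (sbt 0.13 syntax; the path predicate is only illustrative):
// Keep only the sources that live under a util directory.
unmanagedSources in Compile ~= { files =>
  files.filter(_.getPath.replace('\\', '/').contains("/util/"))
}
Alternatively, there are includeFilter/excludeFilter keys scoped to unmanagedSources, but the simple string patterns there match file names rather than paths.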

How to validate JSON in Gradle build

I have a Gradle build where a large JSON configuration is bundled into a package for later upload to a server. Sometimes when changes are made to the file, it is no longer valid and thus fails to upload to the server.
I would like to catch this earlier by adding a validation step to the Gradle build.
While looking around I could not find a documented way to do this. I saw the project gradle-json-validator, which looks promising, but there is no documentation whatsoever, so I am not sure how it can be used.
Any hint on gradle-json-validator or any other way to validate a JSON file as part of the Gradle build steps?
From the source, it would seem the usage is:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'cz.alenkacz.gradle:json-validator:0.9.9'
    }
}

apply plugin: 'cz.alenkacz.gradle.jsonvalidator'
The plugin doesn't seem to expose an extension for configuration, but it appears to use jsonSchema and targetJsonFile as the input schema and the file to validate. I would try setting them on the validateJson task at the top level of build.gradle:
validateJson.jsonSchema = new File('/path/to/schema')
validateJson.targetJsonFile = new File('/path/to/jsonFile')
and the task to run is:
gradle validateJson
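If you want the validation to run as part of the regular build rather than only when invoked by hand, a hedged sketch (assuming your build also applies a plugin that provides the check lifecycle task, such as java or base, and the validateJson task name noted above):
// Fail the normal build early if the JSON is invalid.
check.dependsOn validateJson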
I have improved the readme file in the repository with a proper usage example.
Hope that helps: https://github.com/alenkacz/gradle-json-validator

How to replace GWT JSON with my own JSON implementation

I need to find a way to make my GWT project use the JSON implementation in my own project's com.google.gwt.json.client package instead of the JSON implementation that ships with GWT.
In my GWT project I want to use a JSON implementation written by myself and stored in a package in my project with the same path as the GWT JSON package.
The files inside this package are my own implementations of the com.google.gwt.json.client classes, kept under the same package path in my project. How can I configure the project to use these classes instead of the originals?
Any suggestions regarding this will be appreciated.
Thanks to all.
UPDATE:
For more clarification on what I am looking for:
When parsing with JSONUtils in GWT, I want the parser to use the JSONObject written in my project's com.google.gwt.json.client package instead of GWT's own implementation of the same class.
Sometimes it is necessary to modify certain GWT core classes, either because you need to fix an issue or because you need to add a new feature to them.
To override any GWT implementation class, for instance com.google.gwt.json.client.JSONObject, you only have to copy and modify that class in your src folder with the same path: src/main/java/com/google/gwt/json/client/JSONObject.java if you are using Maven, or src/com/google/gwt/json/client/JSONObject.java otherwise.
The only thing you need to take care of is that your src folder comes before the GWT SDK on the classpath; with Maven this is the case by default. You should also be aware that when you update the GWT version, you may need to update your implementations.
If the instances of the class you are trying to override are created using the GWT.create() call, you could replace the class with your own implementation via a <replace-with> tag in your .gwt.xml. This technique is called deferred binding. That is not the case here, since JSONObject is normally created with new.
Finally, the <super-source> tag can be used to override any class implementation at compile time. Although super-source is designed to replace JRE classes with GWT implementations, replacing GWT classes with other GWT implementations works as well. In this case you have to put your modified classes, with the same package structure, in the folder pointed to by the super-source tag.
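A hedged sketch of what this could look like in the module's .gwt.xml (the folder name "super" is just an example; it has to contain the overriding classes under their full package path):
<!-- Use the classes under the "super" folder in place of the originals
     with the same package and class name. -->
<super-source path="super"/>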

How do I disable code generation in my test plugin?

I have a couple of test files written in my DSL in my test plugin/project. Most of the tests use inline multi-line strings and Xtend, but in four cases I need to test code which does some magic with URLs and the classpath, so I really need resources on the classpath for that.
Since loading the resources only works when the extension is correct, I can't give the files a fake extension.
Now my problem: My DSL also has a code generator. This means that eventually, I end up with a couple of generated files in places where I can't have them (they don't compile, for example, and one even contains an error to test error handling when information is split across several files).
I can't disable the Xtext nature because the test project uses Xtend, so for those files I do need code generation.
Since the generator runs inside Eclipse (I have the DSL plugins installed for other projects), there is no way to override the code generator in Guice.
How can I disable the code generator in this case?
There is a simple way to achieve this:
Open the properties of your project
Expand the entry for your DSL
Select "Compiler"
Select "Enable project specific settings"
Disable/deselect "Compiler is activated" under "General"
If you don't have a properties entry for your DSL:
Add this fragment to your .mwe2 workflow file:
fragment = generator.GeneratorFragment {}
Regenerate your projects
Merge the new code from plugin.xml_gen into plugin.xml in both the base and the UI plugins.
The interesting parts are the two extension points org.eclipse.ui.preferencePages and org.eclipse.ui.propertyPages.