Object mapper from and to JSON

I am developing a system made up of multiple executable applications on different platforms (Java and .NET).
They communicate with each other using JSON, so I need to map objects to and from JSON very frequently. The current solution (which feels like a workaround) is Jackson on the Java end and Newtonsoft.Json on the .NET end. The problem is that the property names are not the same on both sides, and not all properties are required on the deserialization end.
So my questions are:
1. Is there any mapper to do this? Currently I am using NewtonSoft.JSON.DatasetMapper on the .NET end and the @JsonAnySetter annotation on the Java end, but in this approach the mapping definition has to be written for each object, since the actual mapping lives in code. For example:
// C#
myobj.prop1 = dataSet.Tables[0].Rows[0]["propertyName1"].ToString();
// and so on...

// Java
switch (key)
{
    case "prop1":
        myobj.setProperty1(value.toString());
        break;
    // and so on...
}
2. The object transformation rate needs to be very high, as objects are sent and received at very high speed, say around 10k objects per second.

We used Gson in one of our projects; I think this reference may help you. Apart from that, there is a similar question that may help you: another Q&A on Stack Overflow.
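If you go the Gson route, the differing property names can be declared on the model with @SerializedName instead of being hand-mapped in a switch statement, and unknown fields are ignored on deserialization by default. A minimal sketch (the MyObj class and its field names are made up for illustration):

import com.google.gson.Gson;
import com.google.gson.annotations.SerializedName;

public class MyObj {
    // the JSON field "propertyName1" maps onto this differently named Java field
    @SerializedName("propertyName1")
    private String prop1;

    public static void main(String[] args) {
        Gson gson = new Gson();
        // Gson silently skips JSON fields with no matching Java field,
        // so not every property has to exist on the receiving side
        MyObj obj = gson.fromJson("{\"propertyName1\":\"abc\",\"extra\":1}", MyObj.class);
        System.out.println(obj.prop1);        // abc
        System.out.println(gson.toJson(obj)); // {"propertyName1":"abc"}
    }
}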

You should take a look at Jackson. It's the de facto JSON library for Java and will happily handle turning objects into JSON and back again. It has many options that let you alter the output, and most per-object configuration is carried out using annotations, so it is visible in your model rather than hidden away in a separate configuration file.
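The two pain points in the question (property names that differ between the platforms, and extra properties that should simply be ignored on deserialization) map directly onto Jackson annotations. A hedged sketch, again with an invented MyObj class:

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

// any JSON fields without a matching Java property are simply skipped
@JsonIgnoreProperties(ignoreUnknown = true)
public class MyObj {
    // the JSON name differs from the Java field name
    @JsonProperty("propertyName1")
    private String prop1;

    public String getProp1() { return prop1; }
    public void setProp1(String prop1) { this.prop1 = prop1; }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper(); // reuse one instance for throughput
        MyObj obj = mapper.readValue("{\"propertyName1\":\"abc\",\"unused\":true}", MyObj.class);
        System.out.println(obj.getProp1());                 // abc
        System.out.println(mapper.writeValueAsString(obj)); // {"propertyName1":"abc"}
    }
}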


angular and turning json data into real objects (and vice-versa)

To keep this simple:
I have classes defined in TypeScript which have methods and properties (with lots of getter/setter logic). I then retrieve JSON data matching such classes. I need to be able to project these JSON objects into my "smart" classes. I know about class-transformer, but I wonder whether it is really the go-to approach for this kind of thing. Furthermore, I'm planning on using ngrx, so this whole class transformation just looks wrong (server to JSON, JSON to state, state to class? and vice versa?). I just don't see a clear pattern.
Any clarity is appreciated. Thanks!
I'm doing almost exactly what you describe in a fairly large app.
I'm using class-transformer to transform the JSON from http calls to instances of the appropriate objects, and then using the resulting objects as state in a store (except that I'm using Redux instead of ngrx).
I find that it works very well.
I'm not sure exactly what you mean by "server to JSON, JSON to state, state to class? and vice versa?".
For me (using your terminology), it's server to JSON, JSON to class, class to state
(but state is just a collection of objects, i.e. class instances; in other words, state is objects).
If I need to send state back to the server, then yes, I typically pull the appropriate objects from the store, serialize them to JSON, and send them to the server. But...the Angular HttpClient does the serialization for you, so you don't typically have to write that part, unless you need some custom serialization.

How to convert Model to JSON

When I naively use Jackson to convert to JSON I receive this exception:
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class org.apache.cayenne.access.DefaultDataRowStoreFactory and no properties discovered to create BeanSerializer
Edit: I'd like to do something like this:
ObjectContext context = cayenneRuntime.newContext();
List<User> users = ObjectSelect.query(User.class).select(context);
JsonObject json = Json.mapper.convertValue(users, Map.class);
Are there any existing solutions? Thanks
Considering that in the general case Cayenne gives you not just objects but a virtual graph of objects, serialization to JSON becomes a quirkier topic than it first appears.
The short answer: you'd have to manually build the JSON for whatever subgraph of your object graph you want to expose.
While not a direct answer, it may be worth mentioning that the Agrest framework (formerly LinkRest) supports rule-based serialization of Cayenne object graphs to JSON. But it is not a standalone component, i.e. it will only work if you use it for your REST services.
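To illustrate the "build the JSON by hand for a chosen subgraph" idea, one low-tech option is to copy just the attributes you want into plain maps and let Jackson serialize those; nothing from the Cayenne runtime (contexts, snapshots, row-store factories) ends up in the output. The getName()/getEmail() getters below are assumptions about the question's User class, not Cayenne API:

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;

public class UserJson {
    // copy only the fields you want exposed for this particular endpoint
    static Map<String, Object> toMap(User user) {
        Map<String, Object> map = new LinkedHashMap<>();
        map.put("name", user.getName());   // assumed getter
        map.put("email", user.getEmail()); // assumed getter
        return map;
    }

    // serialize a whole result list, e.g. the users selected in the question
    static String toJson(List<User> users) throws Exception {
        List<Map<String, Object>> rows = new ArrayList<>();
        for (User user : users) {
            rows.add(toMap(user));
        }
        return new ObjectMapper().writeValueAsString(rows);
    }
}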

Use case for a JSON file

I've used JSON a number of times within AJAX requests to perform asynchronous writes/reads of a database. I've been trying to better understand JSON and its uses within different programming environments and one of the questions I've been curious about is: what are the common use cases for JSON as external file (rather than just as an object that is passed within AJAX requests)?
More specifically, what are some use cases in which a .json file would be better suited than simply using temporary JSON objects to pass between AJAX requests? Any insight on this would be much appreciated.
I am not that familiar with AJAX etc., but JSON is so popular that many programming languages support it - not just Java and related languages.
In itself JSON simply holds information - it's merely a format for storing data.
It can often be used to transfer data between languages. Personally, I also use JSON to store my objects in persistent data storage and later rebuild the objects from it using the corresponding .class definitions. For example, Google created Gson to easily turn objects into JSON and back. Very handy!
You should also think about: How do you transfer an object from one machine to another?
To sum it up: It's simple, it doesn't create massive overhead, it's even easy to read. And most important of all: So many tools offer JSON support.
Edit:
To show the simplicity of re-building from JSON, here's an example from my game:
public static Player fromJson(String json) {
    if (json != null && !json.isEmpty()) {
        return gson.fromJson(json, Player.class);
    }
    return new Player(); // no save game present, use the default constructor
}
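Going the other way is just as short; a minimal sketch, assuming the same gson instance and Player class as above:

public static String toJson(Player player) {
    // guard against null so we never persist the JSON literal "null"
    if (player == null) {
        return gson.toJson(new Player());
    }
    return gson.toJson(player);
}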

Is there a way to use Playframework Json macro generated format for partial entity validation?

My program stack is ReactiveMongo 0.11.0, Scala 2.11.6, Play 2.4.2.
I'm adding PATCH functionality support to my controllers. I want it to be type-safe, so that a PATCH cannot mess up the data in Mongo.
The current (dirty) solution is:
1. Reading the object from Mongo first,
2. Performing JsObject.deepMerge with the provided patch,
3. Checking that the value can still be deserialized to the target type,
4. Serializing the merged object back to a JsObject and checking that the patch contains only fields that are present in the merged JSON (so that no trash is added to the stored object),
5. Calling the actual $set on Mongo.
This is obviously not perfect, but it works fine. I would write macros to generate an appropriate format generalization, but that might take too much time, which I currently lack.
Is there a way to use Playframework Json macro generated format for partial entity validation like this?
Or any other solution that can be easily integrated into Play Framework, for that matter.
With the help of @julien-richard-foy I made a small library that does exactly what I wanted:
https://github.com/clemble/scala-validator
I need to add some documentation, and then I'll publish it to a repository.

Binding JSON to nested Grails Domain Objects

I'm developing a RESTful interface which is used to provide JSON data for a JavaScript application.
On the server side I use Grails 1.3.7 with GORM domain objects for persistence. I implemented a custom JSON marshaller to support marshalling the nested domain objects.
Here are sample domain objects:
class SampleDomain {
    static mapping = { nest2 cascade: 'all' }

    String someString
    SampleDomainNested nest2
}
and
class SampleDomainNested {
    String someField
}
The SampleDomain resource is published under the URL /rs/sample/, so /rs/sample/1 points to the SampleDomain object with ID 1.
When I render the resource using my custom json marshaller (GET on /rs/sample/1), I get the following data:
{
    "someString" : "somevalue1",
    "nest2" : {
        "someField" : "someothervalue"
    }
}
which is exactly what I want.
Now comes the problem: I try to send the same data to the resource /rs/sample/1 via PUT.
To bind the json data to the Domain Object, the controller handling the request calls def domain = SampleDomain.get(id) and domain.properties = data where data is the unmarshalled object.
The binding for the "someString" field is working just fine, but the nested object is not populated using the nested data so I get an error that the property "nest2" is null, which is not allowed.
I already tried implementing a custom PropertyEditorSupport as well as a StructuredPropertyEditor and register the editor for the class.
Strangely, the editor only gets called when I supply non-nested values. So when I send the following to the server via PUT (which doesn't make any sense ;) )
{
    "someString" : "somevalue1",
    "nest2" : "test"
}
at least the property editor gets called.
I looked at the code of the GrailsDataBinder. I found out that setting properties of an association seems to work by specifying the path of the association instead of providing a map, so the following works as well:
{
    "someString" : "somevalue1",
    "nest2.somefield" : "someothervalue"
}
but this doesn't help me since I don't want to implement a custom JavaScript to JSON object serializer.
Is it possible to use Grails data binding with nested maps? Or do I really have to implement that by hand for each domain class?
Thanks a lot,
Martin
Since this question got upvoted several times I would like to share what I did in the end:
Since I had some more requirements to implement (security etc.), I implemented a service layer which hides the domain objects from the controllers. I introduced a "dynamic DTO layer" which translates domain objects to Groovy maps, which can be serialized easily using the standard serializers, and which implements the updates manually.

All the semi-automatic/meta-programming/command-pattern/... based solutions I tried to implement failed at some point, mostly resulting in strange GORM errors or a lot of configuration code (and a lot of frustration). The update and serialization methods for the DTOs are fairly straightforward and could be implemented very quickly. This does not introduce much duplicate code either, since you have to specify how your domain objects are serialized anyway if you don't want to publish your internal domain object structure.

Maybe it's not the most elegant solution, but it was the only one that really worked for me. It also allows me to implement batch updates, since the update logic is no longer tied to the HTTP requests.
However, I must say that I don't think Grails is the tech stack best suited for this kind of application, since it makes your application very heavy-weight and inflexible. My experience is that once you start doing things which are not supported by the framework by default, it starts getting messy. Furthermore, I don't like the fact that the "repository" layer in Grails essentially only exists as a part of the domain objects, which introduced a lot of problems and resulted in several "proxy services" emulating a repository layer.

If you start building an application around a JSON REST interface, I would suggest either going for a very lightweight technology like node.js or, if you want to/have to stick to a Java-based stack, using the standard Spring Framework + Spring MVC + Spring Data with a nice and clean DTO layer (this is what I've migrated to and it works like a charm). You don't have to write a lot of boilerplate code and you are completely in control of what's actually happening. Furthermore, you get strong typing, which increases developer productivity as well as maintainability, and which legitimates the additional LOCs. And of course strong typing means strong tooling!
I started writing a blog entry describing the architecture I came up with (with a sample project of course), however I don't have a lot of time right now to finish it. When it's done I'm going to link to it here for reference.
Hope this can serve as inspiration for people experiencing similar problems.
Cheers!
It requires you to provide the class name:
{ class:"SampleDomain", someString: "abc",
nest2: { class: "SampleDomainNested", someField:"def" }
}
I know, it requires different input than the output it produces.
As I mentioned in the comment earlier, you might be better off using the gson library.
Not sure why you wrote your own JSON marshaller when XStream is around.
See http://x-stream.github.io/json-tutorial.html
We have been very happy with XStream for our back-end (Grails-based) services, and this way you can marshal to either XML or JSON, or override the default marshalling for a specific object if you like.
Jettison seems to produce more compact, less human-readable JSON, and you can run into some library collision issues, but the default internal JSON stream renderer is decent.
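For reference, the write-only JSON setup from that tutorial looks roughly like the sketch below; SampleDomain is the class from the question, and the driver choice follows the XStream docs, so treat this as an untested outline:

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.json.JsonHierarchicalStreamDriver;

public class SampleDomainToJson {
    public static String toJson(SampleDomain domain) {
        // JsonHierarchicalStreamDriver is serialize-only; the Jettison-backed
        // driver is the one to use if you also need to read JSON back in
        XStream xstream = new XStream(new JsonHierarchicalStreamDriver());
        xstream.setMode(XStream.NO_REFERENCES);
        xstream.alias("sample", SampleDomain.class);
        return xstream.toXML(domain); // despite the method name, this writes JSON here
    }
}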
If you are going to publish the service to the public, you will want to take the time to return appropriate HTTP protocol responses for errors etc... ($.02)