Could someone explain to me why, in GWT, you cannot convert a client/shared POJO (that implements Serializable) into a JSON object without jumping through a load of hoops, like using the AutoBeanFactory (e.g. GWT (Client) = How to convert Object to JSON and send to Server?) or creating JavaScript overlay objects (which extend JavaScriptObject)?
GWT compiles your client objects into JavaScript objects, so why can't it then simply convert your JavaScript to JSON if you ask it to?
The GWT JSON library supplied only allows you to JSONify Java objects that extend JavaScriptObject.
I am obviously misunderstanding something about GWT, since GWT compiles a simple Java POJO into a JavaScript object, and in JavaScript you can JSON.stringify it into JSON. So why not in GWT?
GWT compiles your app; it doesn't just convert it. It does take advantage of the prototype object in JavaScript to build classes as it needs them, usually following your class hierarchy (and any GWT classes you use), but it makes many other changes:
Optimizations:
Tightens up types - if you refer to something as a List, but it can only ever be an ArrayList, the compiler rewrites the type declarations. This by itself doesn't gain much, but it lets other steps do better work, such as:
Making methods static - if nothing ever overrides ArrayList.add, for example, the compiler will turn any call it can prove is to ArrayList.add into a static call, removing the need for dynamic dispatch and allowing the 'this' string in the final JS to be replaced with a shorter argument name. This can leave a JS object without a method you would expect it to have.
Inlining methods - if a method is simple enough and is called in few enough places, the compiler might remove the method entirely, since it knows every place it is called. This directly affects your use case.
Removing/inlining unreferenced fields - if you read a field but only write it once, the compiler will assume the original value is a constant; if you never read it, there is no reason to assign it at all. Values that the compiler can't prove will ever be used don't need to take up space in the JS and time in the browser. This also directly affects treating GWT'd Java as JS.
After these passes, among others, the compiler renames fields, arguments, and types to be as small as possible - rarely will a field or argument be longer than one character when this is complete, since those identifiers are used most frequently and have the smallest scope, so they can be reused most often. This too affects trying to treat objects as JSON.
The libraries that allow you to export GWT objects as JSON do so by making some additional assumptions.
JavaScriptObject (JSO) isn't a real Java object, but actually represents a JavaScript instance, so you can cast back and forth at will - the JSNI you write will emerge relatively unoptimized, as the compiler can't tell whether you are trying to talk to an external library.
AutoBeans are generated with the assumption that they should be able to write themselves out as JSON, so specific encoding methods are generated in. They are still subject to the same rules as the rest of the compiled Java - code that isn't used may be removed, and code that is only called one way might be tightened up or inlined.
Libraries that can export JS compile Java details into the final executable, making it bigger but giving you the ability to treat those Java objects like JS in some limited way.
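To make the AutoBeans point concrete, here is a minimal sketch of that route; the Person bean and MyFactory interface are hypothetical names invented for this example, while AutoBeanFactory, AutoBeanCodex, and GWT.create are the actual GWT pieces doing the work:

import com.google.gwt.core.client.GWT;
import com.google.web.bindery.autobean.shared.AutoBean;
import com.google.web.bindery.autobean.shared.AutoBeanCodex;
import com.google.web.bindery.autobean.shared.AutoBeanFactory;

public class AutoBeanJsonExample {

    // The bean is described as an interface with getters and setters only.
    public interface Person {
        String getName();
        void setName(String name);
        int getAge();
        void setAge(int age);
    }

    // The factory tells the AutoBean generator which beans to wire up.
    public interface MyFactory extends AutoBeanFactory {
        AutoBean<Person> person();
    }

    public static String encode() {
        MyFactory factory = GWT.create(MyFactory.class);
        AutoBean<Person> bean = factory.person();
        Person p = bean.as();
        p.setName("Alice");
        p.setAge(30);
        // Serialize the bean to a JSON string.
        return AutoBeanCodex.encode(bean).getPayload();
    }

    public static Person decode(MyFactory factory, String json) {
        // Deserialize the JSON back into an AutoBean-backed Person.
        return AutoBeanCodex.decode(factory, Person.class, json).as();
    }
}

The factory and bean interfaces are exactly the "other assumption" described above: because they are generated, the compiler keeps enough structure around to encode and decode them.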
One last point, since you are talking about both JSON and JavaScript: some normal JS isn't suitable for writing out as JSON. Date objects don't have a consistent serialization recognized by JSON, and non-tree object graphs can't be serialized:
var obj = {};
obj.prop = {};
obj.prop.obj = obj;
AutoBeans come with a built-in checker for these circular references, and I would hope the JSO serialization does as well.
To keep this simple:
I have classes defined in TypeScript which have methods and properties (with lots of getter/setter logic). I then retrieve JSON data matching those classes, and I need to be able to project these JSON objects onto my "smart" classes. I know about class-transformer, but I wonder whether it is really the go-to approach for this kind of thing. Furthermore, I'm planning on using ngrx, so this whole class transformation just looks wrong (server to JSON, JSON to state, state to class? and vice versa?). I just don't see a clear pattern.
Any clarity is appreciated. Thanks!
I'm doing almost exactly what you describe in a fairly large app.
I'm using class-transformer to transform the JSON from http calls to instances of the appropriate objects, and then using the resulting objects as state in a store (except that I'm using Redux instead of ngrx).
I find that it works very well.
I'm not sure exactly what you mean by "server to json, json to state, state to class? and viceversa?".
For me (using your terminology), it's server to JSON, JSON to class, class to state
(but state is just a collection of objects, i.e. class instances, so state is objects).
If I need to send state back to the server, then yes, I typically pull the appropriate objects from the store, serialize them to JSON, and send them to the server. But...the Angular HttpClient does the serialization for you, so you don't typically have to write that part, unless you need some custom serialization.
In a Play 2.0 application, I need to deserialize some JSON from a source which I don't control which uses single-quotes around strings -- where the JSON spec calls for double-quotes.
The solution using Jackson is here:
Configure Jackson to deserialize single quoted (invalid) JSON
But trying to implement this solution in Play 2.0, I hit a wall of static objects and private classes. It should be enough to replace the object JerksonJson with one implementing the solution linked above at initialization, but because it is a static object it can't be extended, and if I try to copy it into my own code I need to drag along the classes PlaySerializers, PlayDeserializers, JsValueDeserializer, ... I stopped there, as it looked like too much.
Is there a clean solution?
How about trying to fix the invalid JSON string by replacing every single quote in it with a double quote?
That would only work if single quotes are used solely to delimit strings (and never appear inside the values themselves).
I realize that this may not help too much with the Play framework part, but perhaps you could use the Jackson Scala Module instead of Jerkson? Doing that should make it easier to just use ObjectMapper, with the Scala module registered, instead of having to go through Jerkson-specific handlers.
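For reference, the single-quote switch itself is a one-line Jackson configuration. The sketch below shows it on a plain ObjectMapper in Java terms (package names are from Jackson 2.x; Play 2.0 ships the older org.codehaus.jackson packages, where the feature has the same name, and with the Scala module you would additionally register DefaultScalaModule):

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class SingleQuoteJsonExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Accept single-quoted field names and string values (non-standard JSON).
        mapper.configure(JsonParser.Feature.ALLOW_SINGLE_QUOTES, true);

        JsonNode node = mapper.readTree("{'name': 'value'}");
        System.out.println(node.get("name").asText()); // prints: value
    }
}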
I am developing a distributed application in CORBA using the Java IDL support provided by default in the JDK; both client and server are written in Java.
I am maintaining some object state on the server side.
Now, on the client side, I want to bring over the whole state (a snapshot) of that object from the server,
and this object is of some Java type.
I cannot pass a whole object of an arbitrary Java type from server to client, because of the IDL definition and because CORBA is language neutral.
One way I found is to use JSON:
I can flatten the whole Java object (of any type) into a string, pass it to the client as a string, and on the client deflatten the object from that string.
I can also define the string type in the IDL.
But this adds some flattening/deflattening processing on both sides.
Is there any other way to pass the object to the client? Or have I missed something?
Update:
Objects of the following type are transferred:
class MyObject {
    Map<String, String> object;
}
CORBA already has the concept of Objects-By-Value, so you could use that if your ORB supports it. Put your state variables in a valuetype and go from there.
Keep in mind that CORBA is not Java. CORBA can be used with many languages so if you find yourself trying to figure out how to send Java-only things across a CORBA system, you're going to find that very difficult. To transmit anything in CORBA it's got to be representable in IDL first and foremost. If valuetype doesn't meet your needs then use the struct approach that the other answer suggested.
You just have to define your MyObject as a CORBA type. For that you'll use the IDL. Your Map is a simple name/value list.
module Foo {
    struct MapEntry {
        string name;
        string value;
    };
    typedef sequence<MapEntry> MyMap;
};
This will create an array of MapEntry objects in Java. If you want to remap them into a Java Map, feel free (a sketch follows below). This is the CORBA way of transferring a map of something: create a struct, put it into a sequence, done.
This also works properly for other languages supported by CORBA (e.g., C++)
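Not part of the original answer, but roughly what the remapping could look like on the Java side. It assumes the standard IDL-to-Java mapping, where the struct above becomes a MapEntry class in package Foo with public name/value fields and an all-fields constructor:

import java.util.HashMap;
import java.util.Map;

public class MapEntryConverter {

    // Convert the IDL sequence (a Java array) into a java.util.Map.
    public static Map<String, String> toMap(Foo.MapEntry[] entries) {
        Map<String, String> map = new HashMap<String, String>();
        for (Foo.MapEntry entry : entries) {
            map.put(entry.name, entry.value); // structs map to classes with public fields
        }
        return map;
    }

    // Convert a java.util.Map back into the IDL sequence for transmission.
    public static Foo.MapEntry[] fromMap(Map<String, String> map) {
        Foo.MapEntry[] entries = new Foo.MapEntry[map.size()];
        int i = 0;
        for (Map.Entry<String, String> e : map.entrySet()) {
            entries[i++] = new Foo.MapEntry(e.getKey(), e.getValue());
        }
        return entries;
    }
}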
That sounds as if you want the object state to become part of the interface (because if you actually transfer the state, being able to recreate the object depends on the receiver understanding the transmitted state, hence it becomes an interface).
Thus, define a struct containing the data fields in the IDL, and add a method to the object interface to return the state in this form. The transfer is then handled by the regular CORBA marshaller.
I'm developing a RESTful interface which is used to provide JSON data for a JavaScript application.
On the server side I use Grails 1.3.7 with GORM domain objects for persistence. I implemented a custom JSON marshaller to support marshalling the nested domain objects.
Here are sample domain objects:
class SampleDomain {
    static mapping = { nest2 cascade: 'all' }
    String someString
    SampleDomainNested nest2
}
and
class SampleDomainNested {
    String someField
}
The SampleDomain resource is published under the URL /rs/sample/, so /rs/sample/1 points to the SampleDomain object with ID 1.
When I render the resource using my custom json marshaller (GET on /rs/sample/1), I get the following data:
{
    "someString" : "somevalue1",
    "nest2" : {
        "someField" : "someothervalue"
    }
}
which is exactly what I want.
Now comes the problem: I try to send the same data to the resource /rs/sample/1 via PUT.
To bind the JSON data to the domain object, the controller handling the request calls def domain = SampleDomain.get(id) and then domain.properties = data, where data is the unmarshalled object.
The binding of the "someString" field works just fine, but the nested object is not populated from the nested data, so I get an error that the property "nest2" is null, which is not allowed.
I already tried implementing a custom PropertyEditorSupport as well as a StructuredPropertyEditor and registering the editor for the class.
Strangely, the editor only gets called when I supply non-nested values. So when I send the following to the server via PUT (which doesn't make any sense ;) )
{
    "someString" : "somevalue1",
    "nest2" : "test"
}
at least the property editor gets called.
I looked at the code of the GrailsDataBinder. I found out that setting properties of an association seems to work by specifying the path of the association instead of providing a map, so the following works as well:
{
    "someString" : "somevalue1",
    "nest2.someField" : "someothervalue"
}
but this doesn't help me, since I don't want to implement a custom JavaScript-to-JSON object serializer.
Is it possible to use Grails data binding with nested maps? Or do I really have to implement that by hand for each domain class?
Thanks a lot,
Martin
Since this question got upvoted several times I would like to share what I did in the end:
Since I had some more requirements to implement, like security, I added a service layer which hides the domain objects from the controllers. I introduced a "dynamic DTO layer" which translates domain objects to Groovy maps, which can be serialized easily using the standard serializers, and which implements the updates manually. All the semi-automatic/meta-programming/command-pattern-based solutions I tried failed at some point, mostly resulting in strange GORM errors or a lot of configuration code (and a lot of frustration). The update and serialization methods for the DTOs are fairly straightforward and could be implemented very quickly. It doesn't introduce much duplicate code either, since you have to specify how your domain objects are serialized anyway if you don't want to publish your internal domain object structure. Maybe it's not the most elegant solution, but it was the only one that really worked for me. It also allows me to implement batch updates, since the update logic is no longer tied to the HTTP requests.
However, I must say that I don't think Grails is the tech stack best suited for this kind of application, since it makes your application very heavy-weight and inflexible. My experience is that once you start doing things that are not supported by the framework by default, it gets messy. Furthermore, I don't like the fact that the "repository" layer in Grails essentially exists only as part of the domain objects, which introduced a lot of problems and resulted in several "proxy services" emulating a repository layer. If you are building an application around a JSON REST interface, I would suggest either going for a very light-weight technology like node.js or, if you want to or have to stick to a Java-based stack, using the standard Spring Framework + Spring MVC + Spring Data with a nice and clean DTO layer (this is what I've migrated to, and it works like a charm). You don't have to write a lot of boilerplate code and you are completely in control of what's actually happening. Furthermore, you get strong typing, which increases developer productivity as well as maintainability and which justifies the additional LOCs. And of course strong typing means strong tooling!
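To make the "clean DTO layer" idea a bit more concrete, here is a very small, hypothetical sketch of what such a Spring MVC controller plus DTO could look like (all class and field names are invented; the mapping to and from the domain objects is done explicitly by hand):

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

// Hypothetical DTO: only the fields you want to expose over the REST interface.
class SampleDto {
    public String someString;
    public String nestedField;
}

@Controller
@RequestMapping("/rs/sample")
class SampleController {

    @RequestMapping(value = "/{id}", method = RequestMethod.GET)
    @ResponseBody
    public SampleDto get(@PathVariable long id) {
        // Load the entity via a repository/service and copy it into the DTO by hand.
        SampleDto dto = new SampleDto();
        dto.someString = "somevalue1";
        dto.nestedField = "someothervalue";
        return dto;
    }

    @RequestMapping(value = "/{id}", method = RequestMethod.PUT)
    @ResponseBody
    public SampleDto update(@PathVariable long id, @RequestBody SampleDto dto) {
        // Explicitly copy the allowed fields from the DTO onto the domain object here,
        // then save it via the repository; returning the DTO echoes the updated state.
        return dto;
    }
}

With Jackson on the classpath (and annotation-driven MVC configured), @RequestBody and @ResponseBody handle the JSON (de)serialization, so the DTOs stay plain classes with no framework magic.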
I started writing a blog entry describing the architecture I came up with (with a sample project of course), however I don't have a lot of time right now to finish it. When it's done I'm going to link to it here for reference.
Hope this can serve as inspiration for people experiencing similar problems.
Cheers!
It requires you to provide the class name:
{ class:"SampleDomain", someString: "abc",
nest2: { class: "SampleDomainNested", someField:"def" }
}
I know, it requires different input than the output it produces.
As I mentioned in the comment earlier, you might be better off using the gson library.
Not sure why you wrote your own JSON marshaller when XStream is around.
See http://x-stream.github.io/json-tutorial.html
We have been very happy with XStream for our back-end (Grails-based) services; this way you can marshal to XML or JSON, or override the default marshalling for a specific object if you like.
Jettison seems to produce more compact but less human-readable JSON, and you can run into some library collision issues, but the default internal JSON stream renderer is decent.
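As a rough illustration of the two drivers mentioned above (XStream, JettisonMappedXmlDriver, and JsonHierarchicalStreamDriver are real XStream classes; the Product bean is made up for the demo):

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver;
import com.thoughtworks.xstream.io.json.JsonHierarchicalStreamDriver;

public class XStreamJsonExample {

    // Hypothetical bean used only for the demo.
    public static class Product {
        String name = "Banana";
        int id = 123;
    }

    public static void main(String[] args) {
        // Jettison-backed driver: compact JSON, and it can read JSON back in.
        XStream jettison = new XStream(new JettisonMappedXmlDriver());
        jettison.alias("product", Product.class);
        System.out.println(jettison.toXML(new Product()));

        // Built-in driver: write-only, but produces more readable JSON
        // without the extra Jettison dependency.
        XStream internal = new XStream(new JsonHierarchicalStreamDriver());
        internal.alias("product", Product.class);
        System.out.println(internal.toXML(new Product()));
    }
}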
If you are going to publish the service to the public, you will want to take the time to return appropriate HTTP protocol responses for errors etc... ($.02)
I have a custom container (C#) for the Flash ActiveX control and am passing data back and forth. Previously I would use ExternalInterface.call and pass an Array as a parameter. I would prefer to use the Vector class now that it is available, but it appears that when I do that the call is never made.
The call is made, however, when the control is embedded in IE. It appears that when running in IE, Flash sends JavaScript out to execute rather than serializing to XML. My guess is that XML serialization for Vector isn't baked in, so Flash just ignores the call.
Anyone have any ideas? Other than just going back to using Array - I've already done that for now.
The docs note that:
Other built-in or custom classes - ActionScript encodes other objects as null or as an empty object. In either case any property values are lost.
It's not completely clear what that means, since custom classes are also Objects - I guess only vanilla objects count? But at any rate, it looks like Vector falls into that "other built-in classes" category, so you'll need to either use Array or re-cast to an Array before you pass the data.
You can use arrays with [ArrayElementType("type")] instead, or write a serialization function for Vector.