Are JPA (EclipseLink) custom types possible? - json

In particular I am interested in using PostgreSQL's json type.
The core of the problem seems to be that EclipseLink has no internal mapping for the json type. So, using a naive approach with:
@Column(name = "json", columnDefinition = "json")
public String getJson() {
    return json;
}
... and trying to insert an object, I get an exception:
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "json" is of type json but expression is of type character varying
Fair enough I suppose.
Looking through the EclipseLink docs, it seems that the applicable customizations (Transformation Mappings, Native Queries, Converters) rely on the data being made up of the supported mappings (numbers, dates, strings, etc.), which makes it quite awkward to work around vendor-specific types.
The main reason this is so frustrating is that the json type in PostgreSQL is expressed the same way as text/varchar, and I believe it is (at the moment, but not forever) just an alias of that type - therefore the driver is more than capable of transmitting it; it's just validation rules in my way.
In terms of the solution, I don't mind losing portability (in terms of being database agnostic and using vendor-specific types); I just want a solution that allows me to use a json type as an attribute on a normal JPA Entity and retain all the other behavior it is accustomed to (schema generation, merge, persist, transactional code, etc.).

Walking through SO I've found many questions like this regarding JSON or XML types for mapping into Postgres. It looks like nobody has faced the problem of reading from a custom Postgres type, so here is a solution for both reading and writing using the pure JPA type conversion mechanism.
The Postgres JDBC driver maps all attributes of types unknown to Java into an org.postgresql.util.PGobject object, so it is enough to write a converter for this type. Here is an entity example:
@Entity
public class Course extends AbstractEntity {

    @Column(name = "course_mapped", columnDefinition = "json")
    @Convert(converter = CourseMappedConverter.class)
    private CourseMapped courseMapped; // have no idea why would you use String json instead of the object to map

    // getters and setters
}
Here is the converter example:
import java.io.IOException;
import java.sql.SQLException;

import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

import org.postgresql.util.PGobject;

// imports assume Jackson 2.x; adjust if you are on the older org.codehaus.jackson packages
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

@Converter
public class CourseMappedConverter implements AttributeConverter<CourseMapped, PGobject> {

    @Override
    public PGobject convertToDatabaseColumn(CourseMapped courseMapped) {
        try {
            PGobject po = new PGobject();
            // here we tell Postgres to use JSON as the type to treat our json
            po.setType("json");
            // this is Jackson, already added as a dependency to the project; it could be any JSON marshaller
            po.setValue((new ObjectMapper()).writeValueAsString(courseMapped));
            return po;
        } catch (JsonProcessingException e) {
            e.printStackTrace();
            return null;
        } catch (SQLException e) {
            e.printStackTrace();
            return null;
        }
    }

    @Override
    public CourseMapped convertToEntityAttribute(PGobject po) {
        try {
            return (new ObjectMapper()).readValue(po.getValue(), CourseMapped.class);
        } catch (IOException e) {
            e.printStackTrace();
            return null;
        }
    }
}
If you really need to stick to a String JSON representation in your entity, you can make a converter like this for the String type:
implements AttributeConverter<String, PGobject>
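Such a String-based converter might look roughly like this (a minimal sketch along the same lines as the converter above; the class name StringJsonConverter is made up):

@Converter
public class StringJsonConverter implements AttributeConverter<String, PGobject> {

    @Override
    public PGobject convertToDatabaseColumn(String json) {
        try {
            PGobject po = new PGobject();
            po.setType("json");   // same trick: tell Postgres the value is json
            po.setValue(json);    // the entity already holds the JSON text
            return po;
        } catch (SQLException e) {
            e.printStackTrace();
            return null;
        }
    }

    @Override
    public String convertToEntityAttribute(PGobject po) {
        return po == null ? null : po.getValue();
    }
}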
Here is a very dirty (though working) proof of concept; it also uses fake object serialization to tell JPA that the object was changed if it was:
https://github.com/sasa7812/psql-cache-evict-POC

PostgreSQL is too strict about implicit casts between text-like types. The simplest workaround is to create a cast; see this answer.
The clean way to do it would be to create a JPA provider extension that calls setObject(my_json), and/or teach your JPA provider to explicitly add CAST('myvalue' AS json) when it generates queries. This is a pain, as it requires JPA provider specific extensions.
This Stack Overflow search will find a bunch of related questions for the xml type, which people have similar problems with.

Related

How to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON?

I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how do I use ServiceStack to store POCOs to MariaDB with complex types (objects and structs) blobbed as JSON, and still have working de/serialization of the same POCOs? All of these single tasks are supported, but I had problems when putting it all together, mainly because of structs.
... finally, while writing this I found a solution, and it may look like I answered my own question, but I would still like to hear from more skilled people, because the solution I found is a little complicated, I think. Details and two subquestions arise later in the context.
Sorry for the length and for possible misinformation caused by my limited knowledge.
Simple example. This is the final working one I ended up with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just takes ToString() => stack overflow.
            // => Therefore Newtonsoft.Json used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for just deserialization or when part of Save().
            // Details in the text.
            // After playing with different options of altering the json input I ended with just taking what comes.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();
            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();
            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What (I think) I know / have tried but at first didn't help to solve it myself:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it the way it is proposed: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); (https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers) => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs; it is necessary to treat them in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and unit tests https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs it shall be TStruct.ToString() (the same as in a) and static TStruct.Parse().
Subquestion #1: which one is the right one? For me, ParseJson() was never called, Parse() was. Is this a documentation issue, or is it used in another situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
Through the saving process ToString() and Parse() were called. In Parse, incoming JSON looked this way:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but JSON incoming to Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: Why the "double-escaping" in Parse on deserialization?
I tried to solve structs with JsConfig (and Newtonsoft.Json to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the JSON input to JsonConvert.DeserializeObject(json) used during Save was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed, but the JSON input of DeserializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", still not deserializable as JSON.
De/serialization worked.
Then (during writing this I was still without any solution) I found JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method needed for this task.
But it is very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced just the type name, as in 2a and 3a).
if I removed RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone points me in a better direction.
Acceptable would be maybe two (both of them accessible because of different circumstances):
if I am the TStruct owner: just pure TStruct.ToString() and static TStruct.Parse() should support out-of-the-box de/serialization and DB storage by ServiceStack (without different input in Parse()).
if I am a consumer of a TStruct with no JSON support implemented and without access to its code: so far I have not found a way; if ToString() is not implemented, Save to DB does not work. It would be fine if the JsConfig serialize functions were enough both for de/serialization and when used during saving to the DB.
And the best one would be without employing another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe some JsConfig.ShallProcessStructs = true; (WARNING: just a tip, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL Value Types (e.g. TimeSpan, DateTime, etc). Overloading the Parse() and ToString() methods and the struct's constructor lets you control the serialization/deserialization of custom structs.
Docs have been corrected. Structs use Parse whilst classes use ParseJson/ParseJsv
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}
MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();

ServiceStack.Text CSV serialization of IEnumerable<object> ignores custom serialization functions

Firstly, please forgive any rookie mistakes here - I'm not a regular poster I'm afraid.
Now on to the nitty gritty...
I am trying to use ServiceStack.Text to serialize objects to CSV. If I keep it simple, everything works as expected when serializing objects of a known type.
However I want to serialize many objects and I don't know the type at runtime so I am writing a reusable component where all data is treated as a System.Object. We already do this same routine for Json serialization without problems. But CsvSerializer appears to handle objects differently during serialization.
Sample code
public void TestIEnumerableObjectSerialization()
{
    var data = GenerateSampleData();

    JsConfig<DateTime>.SerializeFn =
        time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

    var csv = CsvSerializer.SerializeToCsv(data);
    Console.WriteLine(csv);

    Assert.Equal("DateTime\r\n"
        + "2017-06-14 00:00:00\r\n"
        + "2017-01-31 01:23:45\r\n",
        csv);
}

object[] GenerateSampleData()
{
    return new object[] {
        new POCO
        {
            DateTime = new DateTime(2017, 6, 14)
        },
        new POCO
        {
            DateTime = new DateTime(2017, 1, 31, 01, 23, 45)
        }
    };
}

public class POCO
{
    public DateTime DateTime { get; set; }
}
The result of this code is that the custom serialization function is not invoked, and the DateTime is written out using the standard ToString() method.
The cause?
The CsvWriter.Write method is inspecting the type of the records and if the type is Object it is treated as a Dictionary<string, object> and CsvDictionaryWriter generates the output.
In turn, CsvDictionaryWriter uses the ToCsvField() extension method to write each property of a record.
The problem is that ToCsvField() converts the value of each property to a string using ToString() meaning no custom serialization is performed.
JsonSerializer uses TypeSerializer.SerializeToString(text) to serialize the properties of an Object using any configured custom serialization functions; but this doesn't happen with CsvSerializer.
A possible solution?
Without complicating CsvSerializer, the ToCsvField() extension method could be updated to use TypeSerializer to handle the serialization to a string. Here is what I've been testing with so far:
public static object ToCsvField(this object text)
{
    var textSerialized = TypeSerializer.SerializeToString(text).StripQuotes();
    return textSerialized == null || !CsvWriter.HasAnyEscapeChars(textSerialized)
        ? textSerialized
        : string.Concat
          (
              CsvConfig.ItemDelimiterString,
              textSerialized.Replace(CsvConfig.ItemDelimiterString, CsvConfig.EscapedItemDelimiterString),
              CsvConfig.ItemDelimiterString
          );
}
So far I haven't come across an issue with this change, although someone may prefer not to allocate a new intermediate variable before the return statement.
Hopefully that is enough information, so on to my questions...
Has anyone else experienced this issue?
Am I doing something wrong and should I be serializing Objects a different way?
If this is a suitable fix/implementation of TypeSerializer, what are the chances of this being addressed in an update to ServiceStack.Text? I would raise an issue on GitHub but the ServiceStack.Text repo doesn't let me raise issues.
Thanks in advance.

Jersey/Genson: Unmarshalling a single-object array

Similar to Jersey: Json array with 1 element is serialized as object, BUT on the client side. E.g. I receive a JSON object where a field is regularly an array, but when there is only one element, it is a single object:
{"fileInfo":[{"fileName":"weather.arff","id":"10"},{"fileName":"supermarket.arff","id":"11"}]}
versus
{"fileInfo":{"fileName":"weather.arff","id":"10"}}
I'm parsing/unmarshalling the JSON using Jersey/Genson. Of course, if the JSON doesn't match the target class I receive an error (such as expected [ but read '{').
I've read a lot about this bug and how to avoid it when creating JSON objects on the SERVER side, but I found nothing about how to handle this issue when dealing with it on the CLIENT side.
As always, I prefer the most codeless possibility if there are several solutions...
BTW: Moxy works but it does not marshal native Object-type objects which is another requirement...
Update
Starting with the Genson 1.3 release, you can achieve this by enabling permissive parsing:
Genson genson = new GensonBuilder().usePermissiveParsing(true).create();
Answer
Uh, do you know what library produces this on server side? I am curious to see who is responsible for all those badly structured jsons out there...
It is not yet supported in Genson. Originally because IMO people should not produce such dynamic json. Anyway, I opened an issue - this can be easily done, you can expect it to be present in the release coming next week.
Otherwise here is a way to achieve it without breaking the existing mechanisms.
You need to register a Factory that will use Genson's collections factory to create an instance of its standard collection converter. Then you wrap this converter in another one that handles the object-to-array logic. Here is the code (not codeless..., but if you wait a bit you won't have to code :)).
import com.owlike.genson.convert.DefaultConverters.CollectionConverterFactory;

class SingleObjectAsCollectionFactory implements Factory<Converter<Collection>> {

    // get the default factory
    Factory<Converter<Collection<?>>> defaultFactory = CollectionConverterFactory.instance;

    @Override
    public Converter<Collection> create(Type type, Genson genson) {
        // obtain an instance of the correct default converter for this type
        final CollectionConverter defaultConverter = (CollectionConverter) defaultFactory.create(type, genson);

        // wrap it in your own converter
        return new Converter<Collection>() {
            @Override
            public void serialize(Collection object, ObjectWriter writer, Context ctx) throws Exception {
                defaultConverter.serialize(object, writer, ctx);
            }

            @Override
            public Collection deserialize(ObjectReader reader, Context ctx) throws Exception {
                if (reader.getValueType() == ValueType.OBJECT) {
                    Object object = defaultConverter.getElementConverter().deserialize(reader, ctx);
                    Collection result = defaultConverter.create();
                    result.add(object);
                    return result;
                } else {
                    return defaultConverter.deserialize(reader, ctx);
                }
            }
        };
    }
}
And then register it
Genson genson = new GensonBuilder()
    .withConverterFactory(new SingleObjectAsCollectionFactory())
    .create();
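As a usage sketch (the FileInfoResponse and FileInfo classes are made up here to match the JSON from the question):

// Assumed wrapper classes for the payloads above:
//   class FileInfoResponse { public List<FileInfo> fileInfo; }
//   class FileInfo { public String fileName; public String id; }
Genson genson = new GensonBuilder()
    .withConverterFactory(new SingleObjectAsCollectionFactory())
    .create();

// Both payload shapes from the question should now end up as a list:
FileInfoResponse single = genson.deserialize(
    "{\"fileInfo\":{\"fileName\":\"weather.arff\",\"id\":\"10\"}}", FileInfoResponse.class);
FileInfoResponse regular = genson.deserialize(
    "{\"fileInfo\":[{\"fileName\":\"weather.arff\",\"id\":\"10\"},{\"fileName\":\"supermarket.arff\",\"id\":\"11\"}]}",
    FileInfoResponse.class);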

How can I pass complex objects as arguments to a RESTful service?

I have successfully set up a quick test of creating a "REST-like" service that returns an object serialized to JSON, and that was quite easy and quick (based on this article).
But while returning JSON-ified objects was easy as peach, I have yet to see any examples dealing with input parameters that are not primitives. How can I pass in a complex object as an argument? I am using Apache CXF, but examples using other frameworks like Jackson are welcome too :)
The client side would probably build a JavaScript object, pass it to JSON.stringify(complexObj), and send that string as one of the parameters.
The service would probably look something like this
@Service("myService")
class RestService {

    @GET
    @Produces("application/json")
    @Path("/fooBar")
    public Result fooBar(@QueryParam("foo") double foo, @QueryParam("bar") double bar,
                         @QueryParam("object") MyComplex object) throws WebServiceException {
        ...
    }
}
Sending serialized objects as parameters would probably quickly touch the 2KB URL-limit imposed by Internet Explorer. Would you recommend using POST in these cases, and would I need to change much in the function definitions?
After digging a bit I quickly found out there are basically two options:
Option 1
You pass a "wrapper object" containing all the other parameters to the service. You might need to annotate this wrapper class with JAXB annotations like @XmlRootElement in order for this to work with the Jettison based provider, but if you use Jackson instead there is no need. Just set the content type to the right type and the right message body reader will be invoked.
This will only work for POST type services of course (AFAIK).
Example
This is just an example of turning the service mentioned in the original question into one using a wrapper object.
@Service("myService")
class RestService {

    @POST
    @Produces("application/json")
    @Path("/fooBar")
    public Result fooBar(
            /**
             * Using "" will inject all form params directly into a FooBarParamsWrapper
             * @see http://cxf.apache.org/docs/jax-rs-basics.html
             */
            @FormParam("") FooBarParamsWrapper wrapper
    ) throws WebServiceException {
        doSomething(wrapper.foo);
    }
}

class FooBarParamsWrapper {
    double foo, bar;
    MyComplexObject object;
}
Option 2
You can provide some special string format that you pack your objects into, and then implement either a constructor taking a string, a static valueOf(String s), or a static fromString(String s) in the class that will take this string and create an object from it. Or, quite similarly, create a ParameterHandler that does exactly the same thing.
AFAIK, only the second version will allow you to call your services from a browser using JSONP (since JSONP is a trick restricted to GET). I chose this route to be able to pass arrays of complex objects in the URI.
As an example of how this works, take the following domain class and service
Example
@GET
@Path("myService")
public void myService(@QueryParam("a") MyClass[] myVals) {
    // do something
}

class MyClass {
    public int foo;
    public int bar;

    /** Deserializes an object of class MyClass from its JSON representation */
    public static MyClass fromString(String jsonRepresentation) {
        ObjectMapper mapper = new ObjectMapper(); // Jackson's JSON marshaller
        MyClass o = null;
        try {
            o = mapper.readValue(jsonRepresentation, MyClass.class);
        } catch (IOException e) {
            throw new WebApplicationException();
        }
        return o;
    }
}
A URI http://my-server.com/myService?a={"foo":1, "bar":2}&a={"foo":100, "bar":200} would in this case be deserialized into an array composed of two MyClass objects.
2019 comment:
Seeing that this answer still gets some hits in 2019, I feel I should comment. In hindsight, I would not recommend option 2, as going through these steps just to be able to do GET calls adds complexity that's probably not worth it. If your service takes such complex input, you will probably not be able to utilize client-side caching anyway, due to the number of permutations of your input. I'd just go for configuring proper Cross-Origin Resource Sharing (CORS) headers on the server and POST the input. Then focus on caching whatever you can on the server.
The accepted answer is missing @BeanParam. See
https://docs.jboss.org/resteasy/docs/3.0-rc-1/javadocs/javax/ws/rs/BeanParam.html
for further details. It allows you to define query params inside a wrapper object.
E.g.
public class TestPOJO {

    @QueryParam("someQueryParam")
    private boolean someQueryParam;

    public boolean isSomeQueryParam() {
        return someQueryParam;
    }

    public void setSomeQueryParam(boolean value) {
        this.someQueryParam = value;
    }
}

... // inside the Resource class

@GET
@Path("test")
public Response getTest(@BeanParam TestPOJO testPOJO) {
    ...
}
The best and simplest solution is to send your object as a JSON string and, on the server side, implement a method that will decode that JSON and map it to the specified object as per your need. And yes, it's better to use POST.
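A rough sketch of what that could look like with JAX-RS and Jackson (MyComplex and Result come from the question; the endpoint shape and the doSomething() call are assumptions):

@POST
@Path("/fooBar")
@Consumes("application/json")
@Produces("application/json")
public Result fooBar(String body) throws IOException {
    // Jackson maps the raw JSON request body onto the complex type
    MyComplex object = new ObjectMapper().readValue(body, MyComplex.class);
    return doSomething(object); // hypothetical business method
}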

Json <-> Java serialization that works with GWT [closed]

I am looking for a simple Json (de)serializer for Java that might work with GWT. I have googled a bit and found some solutions that either require annotating every member or defining useless interfaces. Quite boring. Why don't we have something really simple like:
class MyBean {
...
}
new GoodSerializer().makeString(new MyBean());
new GoodSerializer().makeObject("{ ... }", MyBean.class)
Take a look at GWT's Overlay Types. I think this is by far the easiest way to work with JSON in GWT. Here's a modified code example from the linked article:
public class Customer extends JavaScriptObject {

    public final native String getFirstName() /*-{
        return this.first_name;
    }-*/;

    public final native void setFirstName(String value) /*-{
        this.first_name = value;
    }-*/;

    public final native String getLastName() /*-{
        return this.last_name;
    }-*/;

    public final native void setLastName(String value) /*-{
        this.last_name = value;
    }-*/;
}
Once you have the overlay type defined, it's easy to create a JavaScript object from JSON and access its properties in Java:
public static final native Customer buildCustomer(String json) /*-{
    return eval('(' + json + ')');
}-*/;
If you want the JSON representation of the object again, you can wrap the overlay type in a JSONObject:
Customer customer = buildCustomer("{'first_name':'Bart', 'last_name':'Simpson'}");
customer.setFirstName("Lisa");
// Displays {"first_name":"Lisa","last_name":"Simpson"}
Window.alert(new JSONObject(customer).toString());
Another thing to try is the new AutoBean framework introduced with GWT 2.1.
You define interfaces for your beans and a factory that vends them, and GWT generates implementations for you.
interface MyBean {
    String getFoo();
    void setFoo(String foo);
}

interface MyBiggerBean {
    List<MyBean> getBeans();
    void setBeans(List<MyBean> beans);
}

interface Beanery extends AutoBeanFactory {
    AutoBean<MyBean> makeBean();
    AutoBean<MyBiggerBean> makeBigBean();
}

Beanery beanFactory = GWT.create(Beanery.class);

void go() {
    MyBean bean = beanFactory.makeBean().as();
    bean.setFoo("Hello, beans");
}
The AutoBeanCodex can be used to serialize them to and from json.
AutoBean<MyBean> autoBean = AutoBeanUtils.getAutoBean(bean);
String asJson = AutoBeanCodex.encode(autoBean).getPayload();

AutoBean<MyBean> autoBeanCloneAB =
    AutoBeanCodex.decode(beanFactory, MyBean.class, asJson);
MyBean autoBeanClone = autoBeanCloneAB.as();

assertTrue(AutoBeanUtils.deepEquals(autoBean, autoBeanClone));
They work on the server side too — use AutoBeanFactoryMagic.create(Beanery.class) instead of GWT.create(Beanery.class).
The simplest way would be to use GWT's built-in JSON API. Here's the documentation. And here is a great tutorial on how to use it.
It's as simple as this:
String json = //json string
JSONValue value = JSONParser.parse(json);
The JSONValue API is pretty cool. It lets you chain validations as you extract values from the JSON object so that exceptions will be thrown if anything's amiss with the format.
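For instance, a chained extraction might look like this (the field names are just for illustration):

JSONValue value = JSONParser.parse(json);
JSONObject obj = value.isObject(); // null if the top-level value is not an object
String firstName = obj.get("first_name").isString().stringValue();
double id = obj.get("id").isNumber().doubleValue();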
It seems that I found the right answer to my question.
I figured out that bean-to-json and json-to-bean conversion in GWT isn't a trivial task. Known libraries would not work, because GWT requires their full source code, and this source code must use only Java classes that are among those emulated by GWT. Also, you cannot use reflection in GWT. Very tough requirements!
I found the only existing solution named gwt-jsonizer. It uses a custom Generator class and requires a satellite interface for each "jsonable" bean. Unfortunately, it does not work without patching on the latest version of GWT and has not been updated for a long time.
So, I personally decided that it is cheaper and faster to make my beans know how to convert themselves to and from json. Like this:
public class SmartBean {

    private String name;

    public String getName() { return name; }
    public void setName(String value) { name = value; }

    public JSONObject toJson() {
        JSONObject result = new JSONObject();
        result.put("name", new JSONString(this.name));
        return result;
    }

    public void fromJson(JSONObject value) {
        this.name = value.get("name").isString().stringValue();
    }
}
JSONxxxx are GWT built-in classes that provide low-level json support.
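A small usage sketch for the SmartBean above (assuming the incoming text is a JSON object):

SmartBean bean = new SmartBean();
bean.fromJson(JSONParser.parse("{\"name\":\"Bart\"}").isObject());
bean.setName("Lisa");
String json = bean.toJson().toString(); // {"name":"Lisa"}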
RestyGWT is a powerful library for encoding and decoding Java objects to and from JSON in GWT:
import javax.ws.rs.POST;
...
public interface PizzaOrderCodec extends JsonEncoderDecoder<PizzaOrder> {
}
Then:
// GWT will implement the interface for you
PizzaOrderCodec codec = GWT.create(PizzaOrderCodec.class);
// Encoding an object to json
PizzaOrder order = ...
JSONValue json = codec.encode(order);
// decoding an object from json
PizzaOrder other = codec.decode(json);
It also has several easy-to-use APIs for consuming RESTful web services.
Have a nice time.
Check this:
GWT Professional JSON Serializer:
http://code.google.com/p/gwtprojsonserializer/
!Works with GWT 2.0+!
json.org/java seems to be included with GWT these days:
gwt-servlet-deps.jar\org\json\
Or, this project seems to be comprehensive:
http://code.google.com/p/piriti/
In Google Web Toolkit Applications, pages 510 to 522, the author, Ryan Dewsbury, shows how to use GWT code generation to do serialization to and from XML and JSON documents.
You can download the code here; you want the chapter 10 code bundles, and then you want to look in the src/com/gwtapps/serialization package. I did not see a license for this code, but have emailed the author to see what he says. I'll update this if he replies.
Issues with this solution:
You have to add a marker interface on all your objects that you want serialized (he uses java.io.Serializable but I imagine you could use others--if you are using hibernate for your backend, your pojos might already be tagged like this).
The code only supports string properties; it could be extended.
The code is only written for 1.4 and 1.5.
So, this is not an out of the box solution, but a great starting point for someone to build a JSON serializer that fits with GWT. Combine that with a JSON serializer on the server side, like json-lib and you're good to go.
I also found this project (again, some marker interface is required).
Try this serializer from Google Code: http://code.google.com/p/json-io/
If you need to write or read JSON format in Java, this is the tool to use. No need to create extra classes, etc. Convert a Java object graph to JSON format in one call. Do the opposite: create Java objects from a JSON String or Stream. This is the fastest library I have seen yet to do this. It is faster than ObjectOutputStream and ObjectInputStream in most cases, which use binary format.
Very handy utility.
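For reference, the round trip with json-io looks roughly like this (a sketch; MyBean stands in for your own class, and the calls assume the classic com.cedarsoftware.util.io API):

// Object graph -> JSON string, and back again
// (older json-io versions declare IOException on these calls)
String json = JsonWriter.objectToJson(myBean);
MyBean restored = (MyBean) JsonReader.jsonToJava(json);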
You may want to checkout this project https://gerrit.googlesource.com/gwtjsonrpc/
It's a library created in order to support a code review system for Android, Gerrit, but it's a stand-alone module meant to be embedded into any GWT project, not just Gerrit.
A reasonable tutorial is probably the README in the top level of the directory. It's quite similar to standard GWT RPC but it uses JSON encoding. It also has built-in XSRF protection.
I seem to be answering this question a lot...
There's a page on code.google.com titled Using GWT for JSON Mashups. It's (unfortunately) way over my head, as I'm not that familiar with GWT, so it may not be helpful.
OK, I deleted my previous answer because it turned out to be exactly what you didn't want.
I don't know how well it works with GWT, but we use the json-lib library to serialize objects in a normal Java project where I work.
It can create a JSONObject directly from a JavaBean, then use the resulting JSONObject's toString() method to get the actual JSON string back.
Likewise, it can also turn JSON back into a JavaBean.
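A rough sketch of that round trip with json-lib (net.sf.json; MyBean stands in for your JavaBean):

JSONObject jsonObject = JSONObject.fromObject(myBean); // JavaBean -> JSONObject
String json = jsonObject.toString();                   // JSON string

MyBean back = (MyBean) JSONObject.toBean(JSONObject.fromObject(json), MyBean.class);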
Not sure if Jackson would work for you.
I don't know if there's GWT-specific you are looking for; if not it should work.
But its serialization/deserialization works quite well, like:
// read json, create object
ObjectMapper mapper = new ObjectMapper();
MyBean bean = mapper.readValue(jsonAsString, MyBean.class);

// and write out
StringWriter sw = new StringWriter();
mapper.writeValue(sw, bean);
String jsonOut = sw.toString();
You do need accessors (getX() to serialize, setX() to deserialize; can annotate methods with other names), but that's about it.
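For example, annotating accessors with non-standard names might look like this (a hypothetical bean; @JsonProperty is Jackson's annotation, com.fasterxml.jackson.annotation.JsonProperty in Jackson 2.x, and I believe the annotated methods are then used as the property accessors):

public class MyBean {
    private String name;

    @JsonProperty("name")
    public String readName() { return name; }             // used when serializing

    @JsonProperty("name")
    public void storeName(String value) { name = value; } // used when deserializing
}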