Using JAX-RS, I'm not sure how to manually unmarshal JSON into my custom Java objects.
From my browser I'm sending a simple put request with the following JSON:
{"myDate":{"dayOfMonth":23, "monthOfYear":7, "year":2011}}
On the server I have a BlahResource which consumes this JSON and prints out the Java object properties:
@Component
@Scope("request")
@Path("/blah")
@Consumes("application/json")
@Produces("application/json")
public class BlahResource {
    @PUT
    public String putBlah(Blah blah) {
        System.out.println("Value: " + blah.getMyDate().getMonthOfYear() + "/"
                + blah.getMyDate().getDayOfMonth() + "/" + blah.getMyDate().getYear());
        return "{}";
    }
}
Here's the source code for Blah:
public class Blah {
    private LocalDate myDate;

    public Blah() {
    }

    public void setMyDate(LocalDate myDate) {
        this.myDate = myDate;
    }

    public LocalDate getMyDate() {
        return myDate;
    }
}
The problem is that Blah.myDate is a Joda-Time LocalDate, a class which does not have setters for dayOfMonth, monthOfYear, and year. So, for instance, when I run this the following exception is thrown:
Jul 10, 2011 8:40:33 AM
com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The exception contained within MappableContainerException could not
be mapped to a response, re-throwing to the HTTP container
org.codehaus.jackson.map.exc.UnrecognizedPropertyException:
Unrecognized field "dayOfMonth"
This makes perfect sense to me. The problem is I have no idea how to write some sort of adapter so that whenever the type LocalDate is encountered, my adapter class is used to convert the JSON into a LocalDate.
Ideally, I want to do something like this:
public class LocalDateAdapter {
    public LocalDate convert(String json) {
        int dayOfMonth = (Integer) SomeJsonUtility.extract("dayOfMonth");
        int year = (Integer) SomeJsonUtility.extract("year");
        int monthOfYear = (Integer) SomeJsonUtility.extract("monthOfYear");
        return new LocalDate(year, monthOfYear, dayOfMonth);
    }
}
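For reference, here is a rough sketch (my own, not from any library) of what that adapter could look like as a Jackson 1.x JsonDeserializer that reads the three named fields from the payload above:
import java.io.IOException;

import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.JsonParser;
import org.codehaus.jackson.map.DeserializationContext;
import org.codehaus.jackson.map.JsonDeserializer;
import org.joda.time.LocalDate;

public class LocalDateObjectDeserializer extends JsonDeserializer<LocalDate> {
    @Override
    public LocalDate deserialize(JsonParser parser, DeserializationContext context)
            throws IOException {
        // Read {"dayOfMonth":23,"monthOfYear":7,"year":2011} as a tree and
        // build the Joda-Time LocalDate from the three named fields.
        JsonNode node = parser.readValueAsTree();
        return new LocalDate(
                node.get("year").getIntValue(),
                node.get("monthOfYear").getIntValue(),
                node.get("dayOfMonth").getIntValue());
    }
}
It would still need to be registered with the ObjectMapper that Jersey actually uses, which is what the updates below attempt.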
UPDATE
I've now tried two methods, and neither seems to be working.
1) Using ObjectMapper
It seems all I need to do is get a handle on the ObjectMapper and add a deserializer, so I created this provider. I named my deserializer LocalDateDeserializer, and when I had Eclipse auto-fix the imports I was surprised to see that Jackson already provides an extension for Joda. When I start the server it finds the provider, but otherwise this code never seems to be invoked.
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;

import org.codehaus.jackson.Version;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.ext.JodaDeserializers.LocalDateDeserializer;
import org.codehaus.jackson.map.module.SimpleModule;
import org.joda.time.LocalDate;
import org.springframework.stereotype.Component;

@Component
@Provider
public class ObjectMapperProvider implements ContextResolver<ObjectMapper> {
    @Override
    public ObjectMapper getContext(Class<?> type) {
        ObjectMapper mapper = new ObjectMapper();
        SimpleModule testModule = new SimpleModule("MyModule", new Version(1, 0, 0, null))
                .addDeserializer(LocalDate.class, new LocalDateDeserializer());
        mapper.registerModule(testModule);
        return mapper;
    }
}
2) The second method I tried is to specify a @JsonDeserialize annotation directly on the field.
@JsonDeserialize(using = CustomDateDeserializer.class)
private LocalDate myDate;
This also didn't seem to be invoked.
public class CustomDateDeserializer extends JsonDeserializer<LocalDate> {
    @Override
    public LocalDate deserialize(JsonParser parser, DeserializationContext context)
            throws IOException, JsonProcessingException {
        return new LocalDate(2008, 2, 5);
    }
}
I'm not sure what to do. This seems like a very basic problem.
UPDATE 2
I'm considering dropping Jackson for deserialization (even though it works fairly well with Jersey).
I was already using Flexjson for serialization, and it seems Flexjson is just as simple for deserialization. All these other libraries have so much abstraction and unnecessary complexity.
In Flexjson, I just had to implement ObjectFactory:
class LocalDateTransformer implements ObjectFactory {
    @Override
    public Object instantiate(ObjectBinder context, Object value, Type targetType, Class targetClass) {
        HashMap map = (HashMap) value;
        int year = (Integer) map.get("year");
        int monthOfYear = (Integer) map.get("monthOfYear");
        int dayOfMonth = (Integer) map.get("dayOfMonth");
        return new LocalDate(year, monthOfYear, dayOfMonth);
    }
}
It looks surprisingly like the "adapter" class I originally posted! And my resource method now becomes:
@PUT
public String putBlah(String blahStr) {
    Blah blah = new JSONDeserializer<Blah>()
            .use(LocalDate.class, new LocalDateTransformer())
            .deserialize(blahStr, Blah.class);
    return "{}";
}
Related
Use-case:
We want to implement only a Protobuf (binary protocol) implementation. However, I need a way to add configuration so that the same implementation is exposed as a REST/JSON API as well, without code duplication.
I have proto endpoints exposed. I also want consumers to be able to post the JSON equivalent of those proto objects and receive the JSON equivalent of the results, with type info (POJO?). The type info helps with OpenAPI/Swagger documentation too!
What are the most elegant/simple ways to achieve that without code duplication?
Any example GitHub code that achieves that would be helpful.
Note: this is for WebFlux and Netty, not Tomcat.
ProtobufJsonFormatHttpMessageConverter works for Tomcat but does not work for Netty. Working example code would be great.
I was messing around with this and ended up with the following. Nothing else worked for me.
Using proto3 and a protobuf definition like this:
syntax = "proto3";

option java_package = "com.company";
option java_multiple_files = true;

message CreateThingRequest {
  ...
}

message CreateThingResponse {
  ...
}
I can scan for the protobuf files by setting app.protoPath in my application.properties
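For example, the property would presumably point at the package of the generated proto classes (matching the java_package above and the "com." default in the config below):
# application.properties (assumed value)
app.protoPath=com.company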
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.google.common.reflect.ClassPath;
import com.google.protobuf.Message;
import com.google.protobuf.util.JsonFormat;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.codec.ServerCodecConfigurer;
import org.springframework.http.codec.json.Jackson2JsonDecoder;
import org.springframework.http.codec.json.Jackson2JsonEncoder;
import org.springframework.http.converter.json.Jackson2ObjectMapperBuilder;
import org.springframework.web.reactive.config.WebFluxConfigurer;

@Configuration
public class WebConfig implements WebFluxConfigurer {

    @Value("${app.protoPath:com.}")
    private String protoPath;

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.defaultCodecs().jackson2JsonEncoder(
                new Jackson2JsonEncoder(Jackson2ObjectMapperBuilder.json().serializerByType(
                        Message.class, new JsonSerializer<Message>() {
                            @Override
                            public void serialize(Message value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
                                String str = JsonFormat.printer().omittingInsignificantWhitespace().print(value);
                                gen.writeRawValue(str);
                            }
                        }
                ).build())
        );

        final ClassLoader loader = Thread.currentThread().getContextClassLoader();
        Map<Class<?>, JsonDeserializer<?>> deserializers = new HashMap<>();
        try {
            for (final ClassPath.ClassInfo info : ClassPath.from(loader).getTopLevelClasses()) {
                if (info.getName().startsWith(protoPath)) {
                    final Class<?> clazz = info.load();
                    if (!Message.class.isAssignableFrom(clazz)) {
                        continue;
                    }
                    @SuppressWarnings("unchecked") final Class<Message> proto = (Class<Message>) clazz;
                    final JsonDeserializer<Message> deserializer = new CustomJsonDeserializer() {
                        @Override
                        public Class<Message> getDeserializeClass() {
                            return proto;
                        }
                    };
                    deserializers.put(proto, deserializer);
                }
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }

        configurer.defaultCodecs().jackson2JsonDecoder(
                new Jackson2JsonDecoder(Jackson2ObjectMapperBuilder.json().deserializersByType(deserializers).build()));
    }

    private abstract static class CustomJsonDeserializer extends JsonDeserializer<Message> {

        abstract Class<? extends Message> getDeserializeClass();

        @Override
        public Message deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
            Message.Builder builder = null;
            try {
                builder = (Message.Builder) getDeserializeClass()
                        .getDeclaredMethod("newBuilder")
                        .invoke(null);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
            JsonFormat.parser().merge(jp.getCodec().readTree(jp).toString(), builder);
            return builder.build();
        }
    }
}
Then I just use the proto object types in the method signature:
@PostMapping(
        path = "/things",
        consumes = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"},
        produces = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"})
Mono<CreateThingResponse> createThing(@RequestBody CreateThingRequest request);
With https://github.com/innogames/springfox-protobuf you can get the responses to show in Swagger, but the requests still aren't showing for me.
You'll have to excuse the messy Java; I'm a little rusty.
I needed to support JSON, and the following code helped:
@Bean
public WebFluxConfigurer webFluxConfigurer() {
    return new WebFluxConfigurer() {
        @Override
        public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
            ObjectMapper mapper = new ObjectMapper()
                    .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
                    .configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
                    .registerModule(new ProtobufModule());
            configurer.customCodecs().register(new Jackson2JsonEncoder(mapper));
            configurer.customCodecs().register(new Jackson2JsonDecoder(mapper));
        }
    };
}
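The ProtobufModule used here presumably comes from the HubSpot jackson-datatype-protobuf module; if so, a dependency along these lines is needed (version omitted, to match your Jackson version):
<dependency>
    <groupId>com.hubspot.jackson</groupId>
    <artifactId>jackson-datatype-protobuf</artifactId>
</dependency>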
Try adding a ProtobufEncoder in your WebFlux config:
@EnableWebFlux
public class MyConfig implements WebFluxConfigurer {
    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.customCodecs().register(new ProtobufEncoder());
    }
}
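If the service also needs to read protobuf request bodies, a decoder can presumably be registered the same way inside configureHttpMessageCodecs (this line is my addition, not part of the original answer):
// Assumption: org.springframework.http.codec.protobuf.ProtobufDecoder handles
// incoming "application/x-protobuf" request bodies.
configurer.customCodecs().register(new ProtobufDecoder());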
Then in your request mapping return the proto object:
@GetMapping(produces = "application/x-protobuf")
public MyProtoObject lookup() {
    return new MyProtoObject();
}
Furthermore, if you want to serialize the proto object into JSON and return a String, then have a look at the com.googlecode.protobuf-java-format:protobuf-java-format library and its JsonFormat::printToString capability (https://code.google.com/archive/p/protobuf-java-format/):
@GetMapping
public String lookup() {
    return new JsonFormat().printToString(new MyProtoObj());
}
Since version 4.1, Spring provides org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter for reading and writing protos as JSON.
However, if you are using Spring 5.x and Protobuf 3.x, there is org.springframework.http.converter.protobuf.ProtobufJsonFormatHttpMessageConverter for more explicit JSON conversion.
This documentation should help you:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufHttpMessageConverter.html
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufJsonFormatHttpMessageConverter.html
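As a rough sketch for a servlet-based (Spring MVC) application, registering the converter as a bean is typically enough; note that the original question targets WebFlux/Netty, where HttpMessageConverter beans do not apply:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.protobuf.ProtobufJsonFormatHttpMessageConverter;

@Configuration
public class ProtobufConverterConfig {

    // In a Spring Boot MVC application, HttpMessageConverter beans are usually
    // picked up automatically and added to the converters used by controllers.
    @Bean
    public ProtobufJsonFormatHttpMessageConverter protobufJsonFormatHttpMessageConverter() {
        return new ProtobufJsonFormatHttpMessageConverter();
    }
}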
I'm exploring reactive programming with Spring WebFlux, and therefore I'm trying to make my code completely non-blocking to get all the benefits of a reactive application.
Currently my code for the method that parses a JSON string into a JsonNode to get specific values (in this case the elementId) looks like this:
public Mono<String> readElementIdFromJsonString(String jsonString) {
    final JsonNode jsonNode;
    try {
        jsonNode = MAPPER.readTree(jsonString);
    } catch (IOException e) {
        return Mono.error(e);
    }
    final String elementId = jsonNode.get("elementId").asText();
    return Mono.just(elementId);
}
However, IntelliJ notifies me that I'm using an inappropriate blocking method call with this code:
MAPPER.readTree(jsonString);
How can I implement this code in a non-blocking way? I have seen that since Jackson 2.9 it is possible to parse a JSON string in a non-blocking, asynchronous way, but I don't know how to use that API and I couldn't find an example of how to do it correctly.
I am not sure why it says it is a blocking call, since Jackson is non-blocking as far as I know. Anyway, one way to resolve this issue is to use schedulers, if you do not want to use any other library. Like this:
public Mono<String> readElementIdFromJsonString(String input) {
    // fromCallable defers the parse, so the checked IOException is signalled
    // through the Mono instead of having to be caught up front.
    return Mono.fromCallable(() -> MAPPER.readTree(input))
            .map(it -> it.get("elementId").asText())
            .subscribeOn(Schedulers.boundedElastic());
}
Something along that line.
import java.nio.charset.StandardCharsets;
import java.util.Map;

import org.springframework.core.ResolvableType;
import org.springframework.core.env.Environment;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBuffer;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import org.springframework.http.codec.json.AbstractJackson2Decoder;
import org.springframework.util.MimeType;
import org.springframework.util.MimeTypeUtils;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

@FunctionalInterface
public interface MessageParser<T> {
    Mono<T> parse(String message);
}

public class JsonNodeParser extends AbstractJackson2Decoder implements MessageParser<JsonNode> {

    private static final MimeType MIME_TYPE = MimeTypeUtils.APPLICATION_JSON;
    // allocateDefaultObjectMapper() is the author's own helper (not shown here)
    // that builds the ObjectMapper.
    private static final ObjectMapper OBJECT_MAPPER = allocateDefaultObjectMapper();

    private final DefaultDataBufferFactory factory;
    private final ResolvableType resolvableType;

    // The Environment parameter is not used in this snippet.
    public JsonNodeParser(final Environment env) {
        super(OBJECT_MAPPER, MIME_TYPE);
        this.factory = new DefaultDataBufferFactory();
        this.resolvableType = ResolvableType.forClass(JsonNode.class);
        this.setMaxInMemorySize(100000); // ~100 KB
        canDecodeJsonNode();
    }

    @Override
    public Mono<JsonNode> parse(final String message) {
        final byte[] bytes = message.getBytes(StandardCharsets.UTF_8);
        return decode(bytes);
    }

    private Mono<JsonNode> decode(final byte[] bytes) {
        final DefaultDataBuffer defaultDataBuffer = this.factory.wrap(bytes);
        return this.decodeToMono(Mono.just(defaultDataBuffer), this.resolvableType, MIME_TYPE, Map.of())
                .ofType(JsonNode.class)
                .subscribeOn(Schedulers.boundedElastic())
                .doFinally((t) -> DataBufferUtils.release(defaultDataBuffer));
    }

    private void canDecodeJsonNode() {
        if (!canDecode(this.resolvableType, MIME_TYPE)) {
            throw new IllegalStateException(String.format("JsonNodeParser doesn't support the given target " +
                    "element type [%s] and the MIME type [%s]", this.resolvableType, MIME_TYPE));
        }
    }
}
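A hypothetical usage of the parser above (the JsonNodeParserDemo class and the sample payload are mine; the Environment argument is unused by the constructor shown, so null is passed just to keep the demo self-contained):
import com.fasterxml.jackson.databind.JsonNode;
import reactor.core.publisher.Mono;

public class JsonNodeParserDemo {
    public static void main(String[] args) {
        MessageParser<JsonNode> parser = new JsonNodeParser(null);
        Mono<String> elementId = parser.parse("{\"elementId\":\"42\"}")
                .map(node -> node.get("elementId").asText());
        elementId.subscribe(System.out::println); // prints 42
    }
}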
I have the following code:
@Data
@Validated
@ConfigurationProperties
public class Keys {

    private final Key key = new Key();

    @Data
    @Validated
    @ConfigurationProperties(prefix = "key")
    public class Key {
        private final Client client = new Client();
        private final IntentToken intentToken = new IntentToken();
        private final Intent intent = new Intent();
        private final OAuth oauth = new OAuth();
        private final ResourceToken resourceToken = new ResourceToken();

        @Valid @NotNull private String authorization;
        @Valid @NotNull private String bearer;
        ...
    }
}
That is an instance representing a properties file such as:
key.authorization=Authorization
key.bearer=Bearer
..
Since I can have different sources for the properties (properties file, MongoDB, etc.), I have clients that inherit from Keys as follows:
Properties files source
@Component
@Configuration
@Primary
@PropertySource("classpath:${product}-keys.${env}.properties")
//@JsonAutoDetect(fieldVisibility = Visibility.ANY)
public class CustomerKeysProperties extends Keys {
}
Mongo source
@Data
@EqualsAndHashCode(callSuper = true)
@Component
//@Primary
@Document(collection = "customerKeys")
public class CustomerKeysMongo extends Keys {

    @Id
    private String id;
}
I just select the source I want to use by annotating the class with @Primary. In the example above, CustomerKeysProperties is the active source.
All of this works fine.
The issue I have is when I try to convert an instance of CustomerKeysProperties into JSON, as in the code below:
@SpringBootApplication
public class ConverterUtil {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(ConverterUtil.class, args);
    }

    @Component
    class CustomerInitializer implements CommandLineRunner {

        @Autowired
        private Keys k;

        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public void run(String... args) throws Exception {
            mapper.setVisibility(PropertyAccessor.FIELD, Visibility.ANY);
            //mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
            String jsonInString = mapper.writeValueAsString(k);
            System.out.println(jsonInString);
        }
    }
}
While k contains all the properties set, the conversion fails:
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: x.client.customer.properties.CustomerKeysProperties$$EnhancerBySpringCGLIB$$eda308bd["CGLIB$CALLBACK_0"]->org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor["advised"]->org.springframework.aop.framework.ProxyFactory["targetSource"]->org.springframework.aop.target.SingletonTargetSource["target"]->x.client.customer.properties.CustomerKeysProperties$$EnhancerBySpringCGLIB$$4fd6c568["CGLIB$CALLBACK_0"])
at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77)
at com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1191)
And if I uncomment
mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
as suggested in the logs, I get an infinite loop in Jackson, causing a StackOverflowError:
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:719)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:119)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18)
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:719)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
..
Questions
In the end, I just want to provide a utility class that can convert a properties file into a JSON format that will be stored in MongoDB.
How can I solve this problem?
Without passing through the object above, how can I transform a properties file into JSON?
Can I save an arbitrary Java bean in MongoDB, with the conversion to JSON automagically done?
The answer to any of the 3 questions above would be helpful.
Notes
Note that I use Lombok. I'm not sure if this is the problem.
Another guess is that I'm trying to serialize a Spring-managed bean and the proxy it involves prevents Jackson from doing the serialization? If so, what could be the workaround?
Thanks!
So I found the problem:
Jackson can't serialize the Spring-managed (CGLIB-proxied) bean.
The workaround was:
try (InputStream input = getClass().getClassLoader().getResourceAsStream("foo.properties")) {
    JavaPropsMapper mapper = new JavaPropsMapper();
    Keys keys = mapper.readValue(input, Keys.class);
    ObjectWriter ow = new ObjectMapper().writer().withDefaultPrettyPrinter();
    String res = ow.writeValueAsString(keys);
    System.out.println(res);
} catch (IOException e) {
    e.printStackTrace();
}
where Keys is the Spring-managed bean I was injecting.
JavaPropsMapper comes from:
<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-properties</artifactId>
</dependency>
I'm about to develop a JAX-RS based RESTful web service and I use MOXy (JAXB) in order to automatically generate my web service's JSON responses.
Everything is cool, but since the web service will be the back end of a JavaScript-based web application and is therefore publicly accessible, I don't want to expose certain details like class names, etc.
But I've realized that under certain conditions MOXy embeds a "@type" entry into the marshalled string, and this entry is followed by the class name of the object that has just been marshalled.
In particular, I've realized that MOXy behaves in this way when marshalling instances of extended classes.
Assume the following super class "MyBasicResponse"
@XmlRootElement(name="res")
public class MyBasicResponse {

    @XmlElement
    private String msg;

    public MyBasicResponse() {
        // Just for conformity
    }

    public String getMsg() {
        return msg;
    }

    public void setMsg(String msg) {
        this.msg = msg;
    }
}
And this specialized (extended) class "MySpecialResponse"
@XmlRootElement(name="res")
public class MySpecialResponse extends MyBasicResponse {

    @XmlElement
    private String moreInfo;

    public MySpecialResponse() {
        // Just for conformity
    }

    public String getMoreInfo() {
        return moreInfo;
    }

    public void setMoreInfo(String moreInfo) {
        this.moreInfo = moreInfo;
    }
}
So, the MyBasicResponse object's marshalled string is
{"msg":"A Message."}
(That's okay!)
But, the MySpecialResponse object's marshalled string is like
{"#type":"MySpecialResponse","msg":"A Message.","moreInfo":"More Information."}
Is there a way to strip the
"#type":"MySpecialResponse"
out of my response?
You can wrap your object in an instance of JAXBElement specifying the subclass being marshalled to get rid of the type key. Below is a full example.
Java Model
Same as from the question, but with the following package-info class added to specify field access, matching those classes:
@XmlAccessorType(XmlAccessType.FIELD)
package com.example.foo;

import javax.xml.bind.annotation.*;
Demo Code
Demo
import java.util.*;
import javax.xml.bind.*;
import javax.xml.namespace.QName;
import org.eclipse.persistence.jaxb.JAXBContextProperties;

public class Demo {

    public static void main(String[] args) throws Exception {
        Map<String, Object> properties = new HashMap<String, Object>(2);
        properties.put(JAXBContextProperties.MEDIA_TYPE, "application/json");
        properties.put(JAXBContextProperties.JSON_INCLUDE_ROOT, false);
        JAXBContext jc = JAXBContext.newInstance(new Class[] {MySpecialResponse.class}, properties);

        Marshaller marshaller = jc.createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);

        MySpecialResponse msr = new MySpecialResponse();
        marshaller.marshal(msr, System.out);

        JAXBElement<MySpecialResponse> jaxbElement = new JAXBElement(new QName(""), MySpecialResponse.class, msr);
        marshaller.marshal(jaxbElement, System.out);
    }
}
Output
We see that when the object was marshalled directly, a type key was written (corresponding to the xsi:type attribute in the XML representation), because as far as MOXy is concerned it was necessary to distinguish between MyBasicResponse and MySpecialResponse. When we wrapped the object in an instance of JAXBElement and qualified the type, MOXy didn't need to add the type key.
{
"type" : "mySpecialResponse"
}
{
}
For More Information
http://blog.bdoughan.com/2011/05/specifying-eclipselink-moxy-as-your.html
http://blog.bdoughan.com/2012/05/moxy-as-your-jax-rs-json-provider.html
I have a really simple REST web service returning a list of questions. This code works as expected when the number of questions returned is greater than zero. But if the server returns an empty JSON array like [], JAXB creates a list with one Question instance where all fields are set to null!
I'm new to both Jersey and JAXB, so I don't know whether I haven't configured it correctly or whether this is a known problem. Any tips?
Client configuration:
DefaultApacheHttpClientConfig config = new DefaultApacheHttpClientConfig();
config.getProperties().put(DefaultApacheHttpClientConfig.PROPERTY_HANDLE_COOKIES, true);
config.getClasses().add(JAXBContextResolver.class);
//config.getClasses().add(JacksonJsonProvider.class); // <- Jackson causes other problems
client = ApacheHttpClient.create(config);
JAXBContextResolver:
@Provider
public final class JAXBContextResolver implements ContextResolver<JAXBContext> {

    private final JAXBContext context;
    private final Set<Class> types;
    private final Class[] cTypes = { Question.class };

    public JAXBContextResolver() throws Exception {
        this.types = new HashSet(Arrays.asList(cTypes));
        this.context = new JSONJAXBContext(JSONConfiguration.natural().build(), cTypes);
    }

    @Override
    public JAXBContext getContext(Class<?> objectType) {
        return (types.contains(objectType)) ? context : null;
    }
}
Client code:
public List<Question> getQuestionsByGroupId(int id) {
    return digiRest.path("/questions/byGroupId/" + id).get(new GenericType<List<Question>>() {});
}
The Question class is just a simple POJO.
I know this is not exactly an answer to your question, but I chose to use GSON on top of Jersey for my current projects (I try to avoid JAXB as much as possible), and I found it very easy and resilient.
You just have to declare
@Consumes(MediaType.TEXT_PLAIN)
or
@Produces(MediaType.TEXT_PLAIN)
or both, use the GSON marshaller/unmarshaller, and work with plain Strings. Very easy to debug and unit test, too.
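A rough sketch of that approach on the client side (the QuestionClient wrapper and the WebResource parameter are my assumptions; the question's digiRest field would play that role):
import java.lang.reflect.Type;
import java.util.List;

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import com.sun.jersey.api.client.WebResource;

public class QuestionClient {

    private final Gson gson = new Gson();

    // Read the response body as a plain String and hand it to Gson.
    public List<Question> getQuestionsByGroupId(WebResource digiRest, int id) {
        String body = digiRest.path("/questions/byGroupId/" + id).get(String.class);
        Type listType = new TypeToken<List<Question>>() {}.getType();
        // An empty JSON array ("[]") becomes an empty list here, not a list
        // containing a single null-filled Question.
        return gson.fromJson(body, listType);
    }
}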
Using Jackson may help.
See org.codehaus.jackson.map.ObjectMapper and org.codehaus.jackson.map.annotate.JsonSerialize.Inclusion.NON_EMPTY
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;

import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.annotate.JsonSerialize;

@Provider
public class SampleContextResolver implements ContextResolver<ObjectMapper>
{
    @Override
    public ObjectMapper getContext(Class<?> type)
    {
        ObjectMapper mapper = new ObjectMapper();
        mapper.setSerializationConfig(mapper.getSerializationConfig()
                .withSerializationInclusion(JsonSerialize.Inclusion.NON_EMPTY));
        return mapper;
    }
}
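Presumably the resolver (and Jackson's JAX-RS provider) still needs to be registered with the client, e.g. by extending the client configuration from the question; a sketch:
DefaultApacheHttpClientConfig config = new DefaultApacheHttpClientConfig();
config.getClasses().add(org.codehaus.jackson.jaxrs.JacksonJsonProvider.class);
config.getClasses().add(SampleContextResolver.class);
ApacheHttpClient client = ApacheHttpClient.create(config);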