Question regarding combination of Jackson/JPA
If there are about 20 entities in the current application and I have added the Jackson dependency to the POM, does that mean all entities are by default ready to be converted to JSON? I saw a sample project where only the class annotated with @JsonIgnore seemed to be skipped by the JSON conversion. If so, how does that happen, and what is the mechanism behind it? How does Jackson handle entities that don't have any Jackson annotation - are they ignored by default or not? I've been looking for resources online but without much luck.
If only one of the 20 entities needs to be mapped to a JSON object, does that mean I have to add @JsonIgnore to the other 19 entities? If not, how does Jackson know which entity to work on?
Thanks.
Jackson and JPA don't have anything to do with each other. Jackson is a JSON parsing library and JPA is a persistence framework. Jackson can serialize almost any object - the only requirement is that the object has some kind of recognizable properties (Javabean-style properties, or bare fields annotated with @JsonProperty). There is an additional requirement for deserialization: the target type must have a default (no-arg) constructor. So, for example, this is an object that Jackson can serialize:
// Class with a single Javabean property, "name"
class Person {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
And here is another:
// Class with a single field annotated with @JsonProperty
class Account {
    @JsonProperty("accountNumber")
    private String accountNumber;
}
And here is yet another:
@Entity
public class User {
    @Id
    private Long id;

    @Basic
    private String userName;

    @Basic
    @JsonIgnore
    private String password;

    @Basic
    @JsonIgnore
    private Address address;

    // Constructors, getters, setters
}
The last example shows a JPA entity class - as far as Jackson is concerned it can be serialized just like any other type. But take note of its fields: when this object is serialized into JSON, two of the fields will not be included - 'password' and 'address'. This is because they have been annotated with @JsonIgnore. The @JsonIgnore annotation lets a developer say 'Hey, it's OK to serialize this object, but when you do so, don't include these fields in the output'. This exclusion only applies to the fields of this object, so, for example, if you included an Address field in another class but did not mark that field as ignorable, it would be serialized.
To prevent serialization of a type in all cases, regardless of context, use the @JsonIgnoreType annotation. When used on a type it basically means 'I don't care where this type is used, never serialize it'.
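For example, a minimal sketch (reusing the Address type mentioned above, with hypothetical fields):

// With @JsonIgnoreType on the class, Jackson skips any Address-typed property
// in any object that references it.
@JsonIgnoreType
public class Address {
    private String street;
    private String city;
    // getters and setters
}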
No, you don't need to add @JsonIgnore to every other class - and if you had tried, you would have gotten a compile error, since the annotation cannot be placed on a class. Jackson only works on the objects you hand to it; there is no magic.
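For instance, a minimal sketch of handing an object to Jackson explicitly (using the Person class from above):

// Only the object passed to the mapper is serialized; the other entities are
// never touched unless you pass them in as well.
ObjectMapper mapper = new ObjectMapper();
Person person = new Person();
person.setName("Alice");
String json = mapper.writeValueAsString(person); // {"name":"Alice"}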
The Jackson documentation is easy to find online, for example on its GitHub project page or on the Codehaus website.
Related
I am on Spring Boot 2.0.6, where an entity Pet has a lazy many-to-one relationship to another entity Owner.
Pet entity
@Entity
@Table(name = "pets")
public class Pet extends AbstractPersistable<Long> {

    @NonNull
    private String name;

    private String birthday;

    @JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id")
    @JsonIdentityReference(alwaysAsId = true)
    @JsonProperty("ownerId")
    @ManyToOne(fetch = FetchType.LAZY)
    private Owner owner;

    // ...
}
But when a request like /pets is submitted through a client (e.g. Postman), the controller.get() method runs into the exception given below:
com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class java.lang.Long and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: java.util.ArrayList[0]->com.petowner.entity.Pet["ownerId"])
at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77) ~[jackson-databind-2.9.7.jar:2.9.7]
at com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1191) ~[jackson-databind-2.9.7.jar:2.9.7]
Controller.get implementation
@GetMapping("/pets")
public @ResponseBody List<Pet> get() {
    List<Pet> pets = petRepository.findAll();
    return pets;
}
My observations
Tried to explicitly invoke the getters of owner through pet to force lazy-loading of the Javassist proxy object for the owner inside the pet. That did not work.
@GetMapping("/pets")
public @ResponseBody List<Pet> get() {
    List<Pet> pets = petRepository.findAll();
    pets.forEach(pet -> pet.getOwner().getId());
    return pets;
}
Tried, as suggested by this Stack Overflow answer at https://stackoverflow.com/a/51129212/5107365, to have the controller delegate to a service bean so that the lazy-loading is forced within the transaction scope. That did not work either.
@Service
@Transactional(readOnly = true)
public class PetServiceImpl implements PetService {

    @Autowired
    private PetRepository petRepository;

    @Override
    public List<Pet> loadPets() {
        List<Pet> pets = petRepository.findAll();
        pets.forEach(pet -> pet.getOwner().getId());
        return pets;
    }
}
It works when the service/controller returns a DTO created from the entity. Obviously, the reason is that the JSON serializer then works with a plain POJO instead of an ORM entity containing proxy objects.
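For reference, a minimal sketch of such a DTO (field and method names are illustrative, not the exact code used):

// Plain POJO copied out of the managed entity inside the transaction, so the
// serializer never sees a Hibernate proxy.
public class PetDto {
    private Long id;
    private String name;
    private Long ownerId;

    public static PetDto from(Pet pet) {
        PetDto dto = new PetDto();
        dto.id = pet.getId();
        dto.name = pet.getName();
        dto.ownerId = pet.getOwner().getId();
        return dto;
    }

    // getters and setters
}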
Changing the entity fetch mode to FetchType.EAGER would solve the problem, but I did not want to change it.
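For reference, that change would be a one-liner on the mapping shown above:

// Not applied here - eager fetching loads the owner with every pet query.
@ManyToOne(fetch = FetchType.EAGER)
private Owner owner;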
I am curious to know why the exception is thrown in cases (1) and (2). Those should have forced the explicit loading of the lazy objects.
Probably the answer is connected to the life and scope of the Javassist objects created to maintain the lazy associations. Still, I am wondering how the Jackson serializer could fail to find a serializer for a Java wrapper type like java.lang.Long. Please note that the exception indicates the Jackson serializer did get access to owner.getId, since it recognised the type of the property ownerId as java.lang.Long.
Any clues would be highly appreciated.
Edit
The edited part of the accepted answer explains the causes. The suggestion to use a custom serializer is a very useful one in case I don't want to go down the DTO path.
I did a bit of scanning through the Jackson sources to dig down to the root cause. Thought I'd share that too.
Jackson caches most of the serialization metadata on first use. The logic relevant to this use case starts in the method com.fasterxml.jackson.databind.ser.std.CollectionSerializer.serializeContents(Collection<?> value, JsonGenerator g, SerializerProvider provider).
The statement serializer = _findAndAddDynamic(serializers, cc, provider) at line #140 triggers the flow that assigns serializers for the pet-level properties, while ownerId is skipped and later processed through serializer.serializeWithType at line #147.
Serializer assignment is done in the com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.resolve(SerializerProvider provider) method.
Serializers are assigned at line #340 only for those properties whose types are confirmed as final by the check at line #333.
When owner comes through here, its proxied properties are found to be of type com.fasterxml.jackson.databind.type.SimpleType. Had this associated entity been loaded eagerly, the proxied properties obviously would not be there; instead, the original properties would be found, with values typed as final classes like Long, String, etc. (just like the Pet properties).
I wonder why Jackson can't address this on their end by using the getter's return type instead of the type of the proxied property. Anyway, that could be a topic for a different discussion :-)
This has to do with the way Hibernate (which is what Spring Boot uses internally for JPA by default) hydrates objects. A lazy object is not loaded until some property of the object is requested. Hibernate returns a proxy which delegates to the real entity after firing the queries needed to hydrate it.
In your scenario, loading ownerId does not help, because it is the key via which you reference the owner object: the ownerId is already present in the Pet object, so no hydration takes place.
In both (1) and (2) you have not actually loaded the owner object, so when Jackson tries to serialize it at the controller level, it fails. In (3) and (4) the owner data is loaded explicitly, which is why Jackson does not run into any issues.
If you want (2) to work, load some property of owner other than the id; Hibernate will then hydrate the object and Jackson will be able to serialize it.
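A minimal sketch of that, assuming Owner exposes a getFirstName() getter (the custom serializer below uses one), applied to the transactional service from the question:

@Override
public List<Pet> loadPets() {
    List<Pet> pets = petRepository.findAll();
    // Touching a non-identifier property forces Hibernate to initialize the
    // proxy while the session is still open.
    pets.forEach(pet -> pet.getOwner().getFirstName());
    return pets;
}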
Edited Answer
The problem here is with the default Jackson serializer. It inspects the returned class and fetches the value of each attribute via reflection. In the case of Hibernate entities, the object returned is a delegating proxy class in which all fields are null, while all getters are redirected to the contained instance. When the object is inspected, the values of its attributes are still null, which by default results in an error, as explained here.
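The exception message itself points at a blunt global workaround (not the approach taken below, but worth knowing):

// Stops Jackson from failing on beans with no visible properties, such as an
// uninitialized Hibernate proxy; the offending value is then written as an
// empty object instead of throwing.
ObjectMapper mapper = new ObjectMapper();
mapper.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);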
So basically, you need to tell Jackson how to serialize this object. You can do so by creating a serializer class:
public class OwnerSerializer extends StdSerializer<Owner> {

    public OwnerSerializer() {
        this(null);
    }

    public OwnerSerializer(Class<Owner> t) {
        super(t);
    }

    @Override
    public void serialize(Owner value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException, JsonProcessingException {
        jgen.writeStartObject();
        jgen.writeNumberField("id", value.getId());
        jgen.writeStringField("firstName", value.getFirstName());
        jgen.writeStringField("lastName", value.getLastName());
        jgen.writeEndObject();
    }
}
And setting it as the default serializer for the object
@JsonSerialize(using = OwnerSerializer.class)
public class Owner extends AbstractPersistable<Long> {
Alternatively, you can create a new object of type Owner from the proxy, populate it manually, and set it in the response.
It is a little roundabout, but as a general practice you should not expose your entities externally anyway. The controller/domain layer should be decoupled from the storage layer.
I need to validate a JSON list similar to the following:
[{"op":"A","path":"C","value":"B"},...]
in a Spring MVC application - I am currently deserializing (using default Jackson) to an object along the lines of:
public class Operations extends ArrayList<Operation> {}

public class Operation {
    @NotEmpty
    public String op;

    @NotEmpty
    public String path;

    public Object value;

    public void setOp(String op)... // and other getters/setters
}
but I cannot figure out how to get the JSR-303 validation provided by the reference Hibernate implementation to fire for the attributes of Operation.
I can get it to work if I wrap the list in a class, but then I have an incorrect format for the JSON, i.e. something like:
{"ops":[{"op":"A",...},...]}
Is it possible to validate the first object (Operations)? And if not, is it possible to deserialize the first format (i.e. the JSON list) into an object of the second format (i.e. a list wrapped in a placeholder object with a placeholder field)?
Update
Having failed to find a way to trigger JSR-303 validation on a bare ArrayList, I have written a custom Jackson JSON deserializer to stick the list into a containing object with an annotated field, along the lines of:
@JsonDeserialize(using = OperationsDeserializer.class)
public class Operations {
    @NotEmpty
    private ArrayList<Operation> ops;

    public void setOps(ArrayList<Operation> ops)...
    public ArrayList<Operation> getOps()...
}
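The deserializer itself is short; a minimal sketch of what such a deserializer could look like (illustrative, not my exact code, given the Operations and Operation classes above):

import java.io.IOException;
import java.util.ArrayList;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;

public class OperationsDeserializer extends JsonDeserializer<Operations> {
    @Override
    public Operations deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        // Read the bare JSON array and wrap it in the annotated container.
        ArrayList<Operation> ops = p.readValueAs(new TypeReference<ArrayList<Operation>>() {});
        Operations operations = new Operations();
        operations.setOps(ops);
        return operations;
    }
}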
This works, but now any auto-generated documentation for my API shows JSON examples with the dummy "ops" field in them, i.e. {"ops" : [ ... ] }.
So the search for a way to trigger JSR-303 validation on an ArrayList that is not a field of another object continues - perhaps there is a way to inject a proxying wrapper class at runtime that would work around this?
Use ObjectMapper. It has a method that converts a JSON object into a class object:
new ObjectMapper().readValue(String str, Class<T> valueType)
So you can iterate over your object array, convert each element to a string, and pass it to this method to get your result. It would look like:
new ObjectMapper().readValue(object.toString(), Operation.class);
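If the whole request body is the JSON array from the question, it can also be read in a single call; a minimal sketch (json being the raw body as a String):

ObjectMapper mapper = new ObjectMapper();
// Deserializes the entire [{"op":...},...] array in one go.
List<Operation> ops = mapper.readValue(json, new TypeReference<List<Operation>>() {});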
I have a somewhat philosophical question relating to mapping JPA Objects to JSON Strings. Of course there is no necessity for the source object to be a persistent object - it is just that that is my situation.
I have a collection of objects that are managed by EclipseLink. I need to turn some of these objects into JSON strings; however, the mapping is not one-to-one. I am convinced that the conversion should be loosely coupled, so as to isolate the JSON objects from changes in the underlying entities.
I am planning to have the JPA entity as such:
@Entity
@Table(name = "AbnormalFlags")
public class AbnormalFlag implements java.io.Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    @Column(name = "Code", unique = false, nullable = false)
    private String code;

    @Column(name = "Description", unique = false, nullable = false)
    private String description;

    // Getters and setters
}
and the equivalent object to be converted to JSON
public class AbnormalFlagDTO implements java.io.Serializable {
    private String code;
    private String description;
    private Boolean disabled;

    // Getters and setters
}
Is there an elegant pattern or methodology I can use to facilitate this process for several types of objects?
Thanks in anticipation
My answer: no, and you should generally extend DTOs with care (when reusing existing DTOs). But you could use a Map<String, Object> as a DTO (if you do not use the same DTO to read the data back). Besides, you could create an annotation processor (APT) that generates the DTO code from your entities, which you then simply modify.
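A minimal sketch of the Map-as-DTO idea (key names are illustrative):

// Copy only the fields you want exposed; "disabled" is not on the entity and
// would have to be supplied by the mapping layer.
Map<String, Object> dto = new LinkedHashMap<>();
dto.put("code", flag.getCode());
dto.put("description", flag.getDescription());
dto.put("disabled", Boolean.FALSE);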
This is a perfect use case for Blaze-Persistence Entity Views as you will most probably also want to keep an eye on the performance of the query used for fetching the data.
I created the library to allow easy mapping between JPA models and custom interface-defined models. The idea is that you define your target structure the way you like and map attributes (getters) via JPQL expressions to the entity model. Since the attribute name is used as the default mapping, you mostly don't need explicit mappings, as 80% of the use cases are DTOs that are a subset of the entity model.
A mapping for your model could look as simple as the following
@EntityView(AbnormalFlag.class)
interface AbnormalFlagDTO extends Serializable {
    String getCode();
    String getDescription();
    Boolean getDisabled();
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
AbnormalFlagDTO dto = entityViewManager.find(entityManager, AbnormalFlagDTO.class, id);
The serialization of the entity view to JSON will work as expected. If you also want to deserialize objects, you will have to construct the object first and also add setters to the interface.
I am currently prototyping replacing the GWT-RPC based backend of our application to a REST based API using RestyGWT on the frontend and Spring MVC on the backend.
My issue occurs during the Java <-> JSON type conversions that both frameworks attempt to resolve automatically. All of our data objects use private fields, and many of the fields do not provide Java-bean-style setter methods. By default, neither framework inspects the private fields of a class, so this conversion fails.
For Spring MVC it was simple enough to fix this by adding an annotation to the data objects:
@JsonAutoDetect(fieldVisibility = Visibility.ANY, getterVisibility = Visibility.NONE, setterVisibility = Visibility.NONE)
For RestyGWT I have not found a suitable fix. The only available workaround I have found is to use default access to all fields and constructors which is far from ideal. Does anybody have a solution that will allow RestyGWT to inspect the private fields of a Java object?
Try using @JsonProperty and @JsonCreator (I don't remember if both are necessary) on your constructor and fields:
public abstract class Parent
{
    @JsonCreator
    public Parent(@JsonProperty("name") String name)
    {
        this.name = name;
    }

    public String getName()
    {
        return name;
    }

    private String name;
}
My question is whether it is necessary to add @XmlElement before each field in your POJO for it to be picked up by JAXB when producing a JSON response. I am using jersey-json 1.17. The reason I ask is that the example given on the Jersey site does not use the annotation.
I get an output of {}, but when I add @XmlElement before the attributes, I get the expected JSON output. Am I doing something wrong, because of which my JSON string is empty?
My code:
The vertices list is populated in the constructor.
This produces the wrong output of {}:
@XmlRootElement
public class SquareModel {
    List<Float> vertices = new ArrayList<Float>();
    // ...
}
Whereas this produces a correct JSON string:
@XmlRootElement
public class SquareModel {
    @XmlElement(name = "vertices")
    List<Float> vertices = new ArrayList<Float>();
    // ...
}
My resource class which returns the JSON
@GET
@Produces(MediaType.APPLICATION_JSON)
public SquareModel getJsonString() {
    return new SquareModel();
}
Thanks :)
No, by default a JAXB (JSR-222) implementation will treat all public fields and properties (get/set combinations) as mapped, without requiring the @XmlElement annotation.
http://blog.bdoughan.com/2012/07/jaxb-no-annotations-required.html
If you wish to map fields directly, I would recommend annotating your class with @XmlAccessorType(XmlAccessType.FIELD):
http://blog.bdoughan.com/2011/06/using-jaxbs-xmlaccessortype-to.html
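Applied to the SquareModel from the question, a minimal sketch would be:

// With FIELD access, JAXB maps the (non-public) vertices field without needing
// a per-field @XmlElement annotation.
@XmlRootElement
@XmlAccessorType(XmlAccessType.FIELD)
public class SquareModel {
    List<Float> vertices = new ArrayList<Float>();
}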
According to this http://jersey.java.net/nonav/documentation/latest/json.html#json.jaxb.approach.section
You should have this annotation (I'm also using it in my code; even though it is XML-oriented, it also gives me nice JSON). From the Jersey documentation:
Taking this approach will save you a lot of time, if you want to easily produce/consume both JSON and XML data format. Because even then you will still be able to use a unified Java model. Another advantage is simplicity of working with such a model, as JAXB leverages annotated POJOs and these could be handled as simple Java beans.

A disadvantage of JAXB based approach could be if you need to work with a very specific JSON format. Then it could be difficult to find a proper way to get such a format produced and consumed. This is a reason why a lot of configuration options are provided, so that you can control how things get serialized out and deserialized back.
Following is a very simple example of how a JAXB bean could look like.
Example 5.3. Simple JAXB bean implementation
@XmlRootElement
public class MyJaxbBean {
    public String name;
    public int age;

    public MyJaxbBean() {} // JAXB needs this

    public MyJaxbBean(String name, int age) {
        this.name = name;
        this.age = age;
    }
}
}