JSON Patch request validation in Java

In my spring boot service, I'm using https://github.com/java-json-tools/json-patch for handling PATCH requests.
Everything works except that I have no way to prevent clients from modifying immutable fields such as object IDs or creation_time. I found a similar question on GitHub, https://github.com/java-json-tools/json-patch/issues/21, but could not find a working example for it.
This blog describes some interesting approaches to validating JSON Patch requests, with a solution in Node.js. It would be good to know whether something similar already exists in Java.

Under many circumstances you can just apply the patch to an intermediate object that only has the fields the user is allowed to write to. After that you can map the intermediate object onto your entity quite easily, using an object mapper or just manually.
The downside is that if you require fields to be explicitly nullable, you won't know whether the patch set a field to null explicitly or simply never contained it.
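For illustration, a minimal sketch of the intermediate-object approach (the Project entity and all names here are hypothetical):

// intermediate DTO exposing only client-writable fields: no id, no creationTime
public class ProjectPatch {
    private String name;
    private String description;
    // getters and setters
}

// after binding and validating, copy the submitted fields onto the entity
private void applyPatch(Project project, ProjectPatch patch) {
    if (patch.getName() != null) { // note: cannot tell "absent" from explicit null here
        project.setName(patch.getName());
    }
    if (patch.getDescription() != null) {
        project.setDescription(patch.getDescription());
    }
}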
Another option is to (ab)use Optional for this, e.g.:
public class ProjectPatchDTO {
    private Optional<@NotBlank String> name;
    private Optional<String> description;
}
Although Optional was not intended to be used like this, it's the most straightforward way to implement patch operations while keeping the input typed. When the Optional field itself is null, the property was never passed by the client; when the Optional is present but empty, the client explicitly set the value to null.
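A sketch of applying one such field, assuming Jackson's jackson-datatype-jdk8 module is registered so that an absent property leaves the field null while an explicit JSON null becomes an empty Optional:

if (patch.getName() != null) {                     // property was present in the request
    project.setName(patch.getName().orElse(null)); // empty Optional means explicit null
}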

Instead of receiving a JsonPatch directly from the client, define a DTO to handle the validation, and later convert the DTO instance to a merge patch (JsonMergePatch).
Say you want to update an instance of User.class; you can define a DTO such as:
public class UserDTO {
    @Email(message = "The provided email is invalid")
    private String username;

    @Size(min = 2, max = 10, message = "firstname should have at least 2 and a maximum of 10 characters")
    private String firstName;

    @Size(min = 2, max = 10, message = "lastname should have at least 2 and a maximum of 10 characters")
    private String lastName;

    @Override
    public String toString() {
        return new Gson().toJson(this);
    }
    //getters and setters
}
The custom toString method ensures that fields not included in the update request are not serialized as null values: Gson omits null fields by default.
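For example (illustrative values), a request carrying only firstName serializes to a patch document touching just that field:

UserDTO request = new UserDTO();
request.setFirstName("John");
System.out.println(request); // prints {"firstName":"John"}; null fields are omitted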
Your PATCH handler can then look as follows (for simplicity, I didn't cater for exceptions):
@PatchMapping("/{id}")
ResponseEntity<Object> updateUser(@RequestBody @Valid UserDTO request,
        @PathVariable String id) throws ParseException, IOException, JsonPatchException {
    User oldUser = userRepository.findById(id);
    String detailsToUpdate = request.toString();
    User newUser = applyPatchToUser(detailsToUpdate, oldUser);
    userRepository.save(newUser);
    return ResponseEntity.ok(newUser);
}
The following method returns the patched User, which is saved above in the controller.
private User applyPatchToUser(String detailsToUpdate, User oldUser) throws IOException, JsonPatchException {
    ObjectMapper objectMapper = new ObjectMapper();
    // Parse the patch to JsonNode
    JsonNode patchNode = objectMapper.readTree(detailsToUpdate);
    // Create the patch
    JsonMergePatch patch = JsonMergePatch.fromJson(patchNode);
    // Convert the original object to JsonNode
    JsonNode originalObjNode = objectMapper.valueToTree(oldUser);
    // Apply the patch
    JsonNode patchedObjNode = patch.apply(originalObjNode);
    // Convert the patched node back to a User
    return objectMapper.treeToValue(patchedObjNode, User.class);
}
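To come back to the original question about immutable fields, a minimal sketch of one option: strip those fields from the patch node before applying it (the field names here are examples; you could equally reject such requests with a 400):

private static final Set<String> IMMUTABLE_FIELDS = Set.of("id", "creationTime");

private JsonNode stripImmutableFields(JsonNode patchNode) {
    if (patchNode.isObject()) {
        ((ObjectNode) patchNode).remove(IMMUTABLE_FIELDS); // ObjectNode.remove(Collection<String>)
    }
    return patchNode;
}

Call it on patchNode before JsonMergePatch.fromJson(patchNode).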

Another solution is to deserialize and validate the request body imperatively.
So your example DTO might look like this:
public class CatDto {
    @NotBlank
    private String name;

    @Min(0)
    @Max(100)
    private int laziness;

    @Max(3)
    private int purringVolume;
}
And your controller can be something like this:
@RestController
@RequestMapping("/api/cats")
@io.swagger.v3.oas.annotations.parameters.RequestBody(
        content = @Content(schema = @Schema(implementation = CatDto.class)))
// ^^ this passes your CatDto model to swagger (you must use springdoc to get it to work!)
public class CatController {

    @Autowired
    SmartValidator validator; // we'll use this to validate our request

    @PatchMapping(path = "/{id}", consumes = "application/json")
    public ResponseEntity<String> updateCat(
            @PathVariable String id,
            @RequestBody Map<String, Object> body
            // ^^ no @Valid annotation, no declarative DTO binding here!
    ) throws MethodArgumentNotValidException {
        CatDto catDto = new CatDto();
        WebDataBinder binder = new WebDataBinder(catDto);
        BindingResult bindingResult = binder.getBindingResult();
        binder.bind(new MutablePropertyValues(body));
        // ^^ imperatively bind to DTO
        body.forEach((k, v) -> validator.validateValue(CatDto.class, k, v, bindingResult));
        // ^^ imperatively validate user input
        if (bindingResult.hasErrors()) {
            throw new MethodArgumentNotValidException(null, bindingResult);
            // ^^ this can be handled by your regular exception handler
        }
        // Here you can do normal stuff with your cat DTO.
        // Map it to cat model, send to cat service, whatever.
        return ResponseEntity.ok("cat updated");
    }
}
No need for Optionals, no extra dependencies; your normal validation just works and your swagger looks good. The only problem is that you don't get a proper merge patch on nested objects, but in many use cases that isn't even required.
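For illustration, a hypothetical request against this endpoint; the partial body is bound and validated field by field:

curl -X PATCH http://localhost:8080/api/cats/1 \
  -H "Content-Type: application/json" \
  -d '{"laziness": 150}'
# expected: 400 Bad Request, since laziness violates the @Max(100) constraint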

Related

Get JSON as input in apache flink

I am trying to receive and access JSON data from a Kafka topic in Flink. What works: producing data, sending it to a Kafka topic, and receiving it in Flink as a String. But I want to access the data in an object-oriented way (e.g. extract a specific attribute from every message).
Therefore I have a Kafka Producer which sends data (e.g. every 1s) to a Kafka Topic:
ObjectMapper test = new ObjectMapper();
ObjectNode jNode= test.createObjectNode();
jNode.put("LoPos", longPos)
.put("LaPos", latPos)
.put("Timestamp", timestamp.toString());
ProducerRecord<String, ObjectNode> rec = new ProducerRecord<String, ObjectNode>(topicName, jNode);
producer.send(rec);
so the JSON data looks like this:
{"LoPos":10.5,"LaPos":2.5,"Timestamp":"2022-10-31 12:45:19.353"}
What works is, receiving the data and print it as string:
DataStream<String> input =
env.fromSource(
KafkaSource.<String>builder()
.setBootstrapServers("localhost:9092")
.setBounded(OffsetsInitializer.latest())
.setValueOnlyDeserializer(new SimpleStringSchema())
.setTopics(topicName)
.build(),
WatermarkStrategy.noWatermarks(),
"kafka-source");
Print the data as string:
DataStream<String> parsed = input.map(new MapFunction<String, String>() {
    private static final long serialVersionUID = -6867736771747690202L;

    @Override
    public String map(String value) {
        System.out.println(value);
        return "test";
    }
});
How can I receive the data in Flink and access it in an object-oriented way (e.g. extract LoPos from every message)? Which approach would you recommend? I tried it with JSONValueDeserializationSchema, but without success...
Thanks!
Update1:
I updated to Flink 1.16 to use JsonDeserializationSchema.
Then I created a Flink Pojo Event like this:
public class Event {
    public double LoPos;
    public double LaPos;
    public Timestamp timestamp;

    public Event() {}

    public Event(final double LoPos, final double LaPos, final Timestamp timestamp) {
        this.LaPos = LaPos;
        this.LoPos = LoPos;
        this.timestamp = timestamp;
    }

    @Override
    public String toString() {
        return String.valueOf(LaPos);
    }
}
To read the JSON data, I implemented the following:
KafkaSource<Event> source = KafkaSource.<Event>builder()
.setBootstrapServers("localhost:9092")
.setBounded(OffsetsInitializer.earliest())
.setValueOnlyDeserializer(new JsonDeserializationSchema<>(Event.class))
.setTopics("testTopic2")
.build();
DataStream<Event> test=env.fromSource(source, WatermarkStrategy.noWatermarks(), "test");
System.out.println(source.toString());
System.out.println(test.toString());
//test.sinkTo(new PrintSink<>());
test.print();
env.execute();
So I would expect source.toString() to return the value of LaPos. But all I get is:
org.apache.flink.connector.kafka.source.KafkaSource#510f3d34
What am I doing wrong?
This topic is covered in one of the recipes in the Immerok Apache Flink Cookbook.
In the examples below, I'm assuming Event is a Flink POJO.
With Flink 1.15 or earlier, you should use a custom deserializer:
KafkaSource<Event> source =
KafkaSource.<Event>builder()
.setBootstrapServers("localhost:9092")
.setTopics(TOPIC)
.setStartingOffsets(OffsetsInitializer.earliest())
.setValueOnlyDeserializer(new EventDeserializationSchema())
.build();
The deserializer can be something like this:
public class EventDeserializationSchema extends AbstractDeserializationSchema<Event> {

    private static final long serialVersionUID = 1L;

    private transient ObjectMapper objectMapper;

    /**
     * For performance reasons it's better to create one ObjectMapper in this open method rather
     * than creating a new ObjectMapper for every record.
     */
    @Override
    public void open(InitializationContext context) {
        // JavaTimeModule is needed for Java 8 date/time (Instant) support
        objectMapper = JsonMapper.builder().build().registerModule(new JavaTimeModule());
    }

    /**
     * If our deserialize method needed access to the information in the Kafka headers of a
     * KafkaConsumerRecord, we would have implemented a KafkaRecordDeserializationSchema instead of
     * extending AbstractDeserializationSchema.
     */
    @Override
    public Event deserialize(byte[] message) throws IOException {
        return objectMapper.readValue(message, Event.class);
    }
}
We've made this easier in Flink 1.16, where we've added a proper JsonDeserializationSchema you can use:
KafkaSource<Event> source =
KafkaSource.<Event>builder()
.setBootstrapServers("localhost:9092")
.setTopics(TOPIC)
.setStartingOffsets(OffsetsInitializer.earliest())
.setValueOnlyDeserializer(new JsonDeserializationSchema<>(Event.class))
.build();
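Once the source deserializes records into Event objects, fields can be accessed directly in downstream operators, for example (a minimal sketch):

DataStream<Event> events = env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");

// extract LoPos from every message
DataStream<Double> longitudes = events.map(event -> event.LoPos);
longitudes.print();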
Disclaimer: I work for Immerok.

Flink Kafka - Custom Class Data is always null

Custom Class
Person
class Person
{
private Integer id;
private String name;
//getters and setters
}
Kafka Flink Connector
TypeInformation<Person> info = TypeInformation.of(Person.class);
TypeInformationSerializationSchema schema = new TypeInformationSerializationSchema(info, new ExecutionConfig());
DataStream<Person> input = env.addSource( new FlinkKafkaConsumer08<>("persons", schema , getKafkaProperties()));
Now if I send the JSON below
{ "id" : 1, "name": Synd }
through the Kafka console producer, the Flink code throws a NullPointerException.
But if I use SimpleStringSchema instead of the custom schema defined above, the stream gets printed.
What is wrong in the above setup?
The TypeInformationSerializationSchema is a de-/serialization schema which uses Flink's serialization stack and, thus, also its serializer. Therefore, when using this SerializationSchema Flink expects that the data has been serialized with Flink's serializer for the Person type.
Given the excerpt of the Person class, Flink will most likely use its PojoTypeSerializer. JSON input will not be understood by this serializer.
If you want to use JSON as the input format, then you have to define your own DeserializationSchema which can parse JSON into Person.
Answer for those who have the same question:
Custom deserializer
class PersonSchema implements DeserializationSchema<Person> {

    private ObjectMapper mapper = new ObjectMapper(); // com.fasterxml.jackson.databind.ObjectMapper

    @Override
    public Person deserialize(byte[] bytes) throws IOException {
        return mapper.readValue(bytes, Person.class);
    }

    @Override
    public boolean isEndOfStream(Person person) {
        return false;
    }

    @Override
    public TypeInformation<Person> getProducedType() {
        return TypeInformation.of(new TypeHint<Person>(){});
    }
}
Using the schema
DataStream<Person> input = env.addSource( new FlinkKafkaConsumer08<>("persons", new PersonSchema() , getKafkaProperties()));

JSON Mapping Exception while calling post method with request body

I have a controller with the contract below:
@RequestMapping(value = "/api/devices/certs", method = RequestMethod.POST, consumes = {"application/json", "application/xml"})
public String submitCertificate(@RequestBody Certificate certificate) {
    System.out.println(certificate.getBase64String());
    return certificate.getBase64String();
}
Other than this, there are two POJO classes:
1)
public class DeviceCertificateRequest implements Serializable {

    private static final long serialVersionUID = -4408117936126030294L;

    private Certificate certificate;

    public Certificate getCertificate() {
        return certificate;
    }

    public void setCertificate(Certificate certificate) {
        this.certificate = certificate;
    }

    @Override
    public String toString() {
        return "DeviceCertificateRequest [certificate=" + certificate + "]";
    }
}
2)
public class Certificate implements Serializable {

    private static final long serialVersionUID = 4044105355620137636L;

    private String base64String;

    public String getBase64String() {
        return base64String;
    }

    public void setBase64String(String base64String) {
        this.base64String = base64String;
    }

    @Override
    public String toString() {
        return "Certificate [base64String=" + base64String + "]";
    }
}
Now I am using Spring Boot and have added the jackson-databind dependency for content negotiation. I also want to consume both JSON and XML data as input, mapping it to the POJOs.
But I am not able to attain the desired result; I get the error below in the logs when trying to send JSON from a REST client.
Error:
[ERROR] 2017-02-07 13:48:45.448 [http-nio-8080-exec-1] ConfigManagerExceptionHandler - exception while accessing url:-http://localhost:8080/api/devices/certs error message:-Could not read document: Can not construct instance of com.lufthansa.configmanager.request.beans.Certificate: no String-argument constructor/factory method to deserialize from String value ('DeviceCertificateRequest')
at [Source: java.io.PushbackInputStream@3c891128; line: 1, column: 1]; nested exception is com.fasterxml.jackson.databind.JsonMappingException: Can not construct instance of com.lufthansa.configmanager.request.beans.Certificate: no String-argument constructor/factory method to deserialize from String value ('DeviceCertificateRequest')
at [Source: java.io.PushbackInputStream@3c891128; line: 1, column: 1]
The JSON sent across:
"certificate": {
"base64String": "abc"
}
Please also let me know whether it will work properly for an XML payload as well, as I want to consume both XML and JSON input.
Show us how you make the request, and double-check the names of the variables.
Check and recheck that you have the correct imports in the controller, and that Certificate is actually the class from your package and not some other one.
Add
@JsonInclude(Include.NON_NULL)
class Foo{}
so you won't have null problems.
For testing, delete the serialVersionUID from Certificate.
Try adding @ResponseBody to your consuming controller method.
Try to send
{
"base64String": "abc"
}
without the wrapping variable name.
It worked after I created a parameterised constructor in the POJO class; it seems Jackson databind requires a suitable constructor or factory method for object creation here.
I still have to check XML input, though.
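Based on the error message ("no String-argument constructor/factory method to deserialize from String value"), a hedged sketch of what such a constructor could look like; whether this is the right fix depends on the payload actually being sent:

public class Certificate implements Serializable {

    private String base64String;

    // default constructor for regular object binding
    public Certificate() {}

    // lets Jackson build a Certificate from a plain JSON string value
    @JsonCreator
    public Certificate(String base64String) {
        this.base64String = base64String;
    }

    // getters and setters
}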

Spring boot / Jackson deserializes JSON of wrong type

I'm a little bit lost, I have to admit. I wrote a Spring Boot (1.3M2) application that receives a JSON object which it needs to store in a database:
@RequestMapping(value = "/fav", method = RequestMethod.POST, produces = MediaType.APPLICATION_JSON_VALUE)
@ResponseBody
public ResponseEntity<String> setFavorite(@RequestBody List<Favorite> favorites) {
...
internally this method passes the JSON to another method which stores it line by line in a database:
jdbcTemplate.batchUpdate(INSERT_FAVORITE, new BatchPreparedStatementSetter() {
    @Override
    public void setValues(PreparedStatement ps, int i) throws SQLException {
        Favorite fav = favorites.get(i);
        ps.setString( ...
    }

    @Override
    public int getBatchSize() {
        return favorites.size();
    }
});
When I POST a JSON document to the controller that does not match the structure of my Favorite object, I only see null values in my database. Apparently Jackson tries its best to convert my JSON into Java objects, fails, and sets every value it finds no match for to null.
Then this list of essentially empty objects is written to the database.
I use curl to POST the values
curl -vX POST https://localhost/fav -d @incorrectype.json
This can't be the source of the error because it works with a matching favorite.json. How can I have my controller / Jackson detect that the posted JSON does not match?
One solution is to use annotations from javax.validation, and instead of accepting a List in the controller signature, use a custom wrapper along the lines of this (getters/setters omitted):
public class FavoriteList {
    @Valid
    @NotNull
    @Size(min = 1)
    private List<Favorite> favorites;
}
then for the Favorite class add the validation as needed, e. g.:
public class Favorite {
    @NotNull
    private String id;
}
with these changes in place, modify the controller method signature along these lines:
public ResponseEntity<String> setFavorite(@Valid @RequestBody FavoriteList favoritesList) {
This way, input failing validation will throw exceptions before anything in the controller method is executed.
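If the client should see which constraints failed, a minimal sketch of a handler for that exception (class and method names are placeholders):

@RestControllerAdvice
public class ValidationExceptionHandler {

    // turn bean-validation failures into a readable 400 response
    @ExceptionHandler(MethodArgumentNotValidException.class)
    public ResponseEntity<List<String>> handleValidation(MethodArgumentNotValidException ex) {
        List<String> messages = ex.getBindingResult().getFieldErrors().stream()
                .map(error -> error.getField() + ": " + error.getDefaultMessage())
                .collect(Collectors.toList());
        return ResponseEntity.badRequest().body(messages);
    }
}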

Parse unnamed mappings in JSON using Jackson

I have some JSON in the following format that I'm trying to parse with Jackson -
"response":{
"response_inner":{
"a":{"field1":2,"field2":0,"field3":5,"field4":0,"field5":[{"field5_1":"b","field5_2":1},{"field5_1":"c","field5_2":1}]},
"d":{"field1":2,"field2":6,"field3":11,"field4":0,"field5":[{"field5_1":"c","field5_2":1},{"field5_1":"b","field5_2":1}]},
"response_inner_bool":false
}
}
Here "a", "b" etc. are some Strings that can change in each response.
I've created a Java object to represent response_inner (let's call it ResponseInner) and another to represent the objects containing the fields (let's call this one FieldInfo), but I'm not sure how to parse this using the @JsonCreator and @JsonProperty annotations: ResponseInner objects can contain any number of String -> FieldInfo mappings.
I tried parsing it like this -
public class Response {
    private ResponseInner responseInner;

    @JsonCreator
    public Response(@JsonProperty("response_inner") ResponseInner responseInner) {
        this.responseInner = responseInner;
    }
}

public class ResponseInner {
    private Map<String, FieldInfo> stringToFieldInfoMap;
    private boolean responseInnerBool;

    @JsonCreator
    public ResponseInner(Map<String, FieldInfo> stringToFieldInfoMap, @JsonProperty("response_inner_bool") boolean responseInnerBool) {
        this.stringToFieldInfoMap = stringToFieldInfoMap;
        this.responseInnerBool = responseInnerBool;
    }
}
But it complains that "Argument #0 of constructor has no property name annotation; must have name when multiple-paramater constructor annotated as Creator". Any suggestions for how to get around this?
You don't seem to be using stringToFieldInfoMap within ResponseInner anyway, so why do you need to pass it as a constructor parameter?
If you do need it in that class, you can simply set it via a setter rather than passing it to the constructor.
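For the setter route, one option is Jackson's @JsonAnySetter, which collects every JSON property without an explicit match (here the dynamic "a"/"d" keys) into a map; a minimal sketch:

public class ResponseInner {
    private Map<String, FieldInfo> stringToFieldInfoMap = new HashMap<>();
    private boolean responseInnerBool;

    @JsonProperty("response_inner_bool")
    public void setResponseInnerBool(boolean responseInnerBool) {
        this.responseInnerBool = responseInnerBool;
    }

    // called for every property with no matching field ("a", "d", ...)
    @JsonAnySetter
    public void addFieldInfo(String name, FieldInfo fieldInfo) {
        stringToFieldInfoMap.put(name, fieldInfo);
    }
}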
Alternatively, you could utilize a third class which deals with the actual mapping of the response and consumes the Response object (which would in turn consume a ResponseInner with the Map removed from it). This would let you decouple the mapping logic from the response logic:
public class MappedResponse {
private Map<String, FieldInfo> stringToFieldInfoMap;
private Response response;
public MappedResponse(Map<String, FieldInfo> stringToFieldInfoMap, Response response) {
this.stringToFieldInfoMap = stringToFieldInfoMap;
this.response = response;
}
}