jackson deserialize Map object inside List - json

I have the following JSON in a file
[
  {"numberEnrolledPerMonthPerWeek":
    [
      {"year":"2011","numberEnrolled":0,"weeks":2},
      {"year":"2011","numberEnrolled":0,"weeks":3},
      {"year":"2011","numberEnrolled":0,"weeks":4},
      {"year":"2011","numberEnrolled":0,"weeks":5},
      {"year":"2011","numberEnrolled":0,"weeks":6},
      {"year":"2011","numberEnrolled":0,"weeks":7},
      {"year":"2011","numberEnrolled":0,"weeks":8},
      {"year":"2011","numberEnrolled":0,"weeks":9}
    ],
    "country":"Argentina"
  }
]
When I use Jackson to deserialise this into a Java object, I get the following error:
org.codehaus.jackson.map.JsonMappingException: Can not deserialize instance of java.util.LinkedHashMap out of START_ARRAY token
I am using the following code:
ObjectMapper mapper = new ObjectMapper();
List<EnrolledEnrolment> enrolments = mapper.readValue(new File("src/main/resources/data/jsonQueriesTestData1.txt"),
new TypeReference<List<EnrolledEnrolment>>(){});
I have used a TypeReference for the outer array, but how do I use a TypeReference for the HashMap inside the EnrolledEnrolment object?
private Map<Integer, Enrolled> numberEnrolledPerMonthPerWeek = new HashMap<Integer,Enrolled>();
The error is thrown when it tries to parse the second array. Any ideas?
Thanks

Hard to tell whether this is exactly the same issue as in the original post, but I encountered a problem when trying to automatically deserialize JSON to an object where the JSON contained a Map within a List.
I was seeing this error:
com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot construct instance of `java.util.LinkedHashMap` (although at least one Creator exists): no String-argument constructor/factory method to deserialize from String value
I fixed it by adding @JsonDeserialize(converter = JsonToMapConverter.class) to the Map field in my POJO, where that class looked something like this:
public class JsonToMapConverter extends StdConverter<String, Map<String, MyPojo>> {

    @Override
    public Map<String, MyPojo> convert(String value) {
        try {
            return JsonUtil.convertJsonToMap(value, String.class, MyPojo.class);
        } catch (Exception e) {
            throw new RuntimeException(String.format("Failed to convert String to POJO [%s]", value), e);
        }
    }
}
And the utility method looks like this:
public static <K, V> Map<K, V> convertJsonToMap(String json, Class<K> key, Class<V> value) throws IOException {
    // 'mapper' is assumed to be a shared, statically initialized ObjectMapper
    JavaType type = mapper.getTypeFactory()
            .constructMapLikeType(Map.class, key, value);
    return mapper.readValue(json.getBytes(StandardCharsets.UTF_8), type);
}

Related

JSON Validation in Apache Beam using Google Cloud Dataflow

I am trying to write a Filter transformation using the Apache Beam Java SDK, and I need to filter out invalid JSON messages.
If I create a new Gson object for every element's validation, the implementation works fine. However, I want to avoid creating a Gson object per element (throughput is 1K/second) while still validating the JSON.
I am declaring a constant Gson object and initializing it in a static block, but this approach is not working. I'm not sure why the same object cannot be used to parse multiple elements, since we are not changing the state of the object during processing.
// Gson object declared as a constant (a blank final may be assigned exactly once)
private static final Gson gsonObj;

// initialized during class loading, before main method invocation
static {
    gsonObj = new Gson();
}
....
/*
 * enum to validate JSON messages
 */
enum InputValidation implements SerializableFunction<String, Boolean> {
    VALID {
        @Override
        public Boolean apply(String input) {
            try {
                gsonObj.fromJson(input, Object.class);
                return true;
            } catch (com.google.gson.JsonSyntaxException ex) {
                return false;
            }
        }
    }
}
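For reference, a SerializableFunction like this would typically be plugged into Filter.by; a minimal sketch of the wiring (the input collection name is assumed):
// keep only the elements that parse as JSON
PCollection<String> validJson = input.apply(Filter.by(InputValidation.VALID));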
Use TupleTags to filter out the records instead of the enum InputValidation approach.
Use the code below to filter out the unparseable JSON rows:
Pipeline p = Pipeline.create(options);
TupleTag<String> successParse = new TupleTag<String>();
TupleTag<String> failParse = new TupleTag<String>();
// shared Gson instance, declared as a static field on the enclosing class
private static final Gson gsonObj = new Gson();
PCollectionTuple results = input.apply(ParDo.of(new DoFn<String, String>() {
    @ProcessElement
    public void processElement(ProcessContext c) {
        try {
            gsonObj.fromJson(c.element(), Object.class);
            c.output(successParse, c.element());
        } catch (com.google.gson.JsonSyntaxException ex) {
            c.output(failParse, c.element());
        }
    }
}).withOutputTags(successParse, TupleTagList.of(failParse)));
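The two tagged outputs can then be pulled out of the tuple for further processing; a short usage sketch (using the results variable named above):
PCollection<String> parseable = results.get(successParse);
PCollection<String> unparseable = results.get(failParse);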
The above code worked in my case and is an efficient way to filter out the records.
Here is the official documentation example.

Cannot pass string to AWS Lambda using InvokeRequest.withPayload() method

I have a lambda function that accepts a String as an input parameter. When running the lambda function I get the following error:
Can not deserialize instance of java.lang.String out of START_OBJECT token
This is what my code to call it looks like:
InvokeRequest request = new InvokeRequest();
final String payload = "";
request.withFunctionName(FUNCTION_NAME).withPayload((String) null);
InvokeResult invokeResult = lambdaClient.invoke(request);
Assert.assertEquals(new String (invokeResult.getPayload().array(), "UTF-8"), "Success");
And this is what my handler looks like:
public String handleRequest(String s, Context context) {}
Now the contents of the string don't matter; it could be null, it could be anything, since I don't use the input. The obvious solution is to remove it, but because of an annotation generator I'm using, I can't do that. I've tried a ByteBuffer input, a String, an empty String, and the JSON String {\"s\":\"s\"}, but nothing seems to work. I believe I need to pass in a plain string (i.e. no {}), but since I'm using InvokeRequest I don't believe I can do that. Any suggestions would be greatly appreciated.
It works by passing a valid JSON String.
String payload = "{ \"subject\" : \"content\"}";
request.withFunctionName(functionName)
        .withPayload(payload);
At the receiving end you have to map it from Object back to String, if that's what you want. Here I used Jackson's ObjectMapper for this:
ObjectMapper objectMapper = new ObjectMapper();
try {
    // 'input' is the Object parameter of the handler
    String payload = objectMapper.writeValueAsString(input);
} catch (JsonProcessingException e) {
    e.printStackTrace();
}
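A side note, not from the original answer: if the handler must keep its String parameter, the payload itself has to be a JSON string literal, i.e. wrapped in escaped quotes, so that Lambda can deserialize it into a String; a minimal sketch:
// the escaped quotes make the payload the JSON string "any text",
// which matches a handler parameter of type String
request.withFunctionName(FUNCTION_NAME)
       .withPayload("\"any text\"");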

Spring boot JsonMappingException: Object is null

I have a @RestController that returns net.sf.json.JSONObject:
@PostMapping("/list")
public JSONObject listStuff(HttpServletRequest inRequest, HttpServletResponse inResponse) {
    JSONObject json = new JSONObject();
    ...
    return json;
}
When the JSONObject contains a null reference, the following exception is thrown:
Could not write JSON: Object is null; nested exception is com.fasterxml.jackson.databind.JsonMappingException: Object is null (through reference chain: net.sf.json.JSONObject["list"]->net.sf.json.JSONArray[0]->net.sf.json.JSONObject["object"]->net.sf.json.JSONNull["empty"])
This is legacy code that we are now cleaning up; at some point we will get rid of the explicit JSON manipulation, but that would be a huge change, so for now I would just like to get rid of the exception.
I tried the following solutions:
Defining Include.NON_NULL in Spring's ObjectMapper, i.e. this piece of code in my WebMvcConfigurationSupport class:
@Override
public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
    ObjectMapper webObjectMapper = objectMapper.copy();
    webObjectMapper.setSerializationInclusion(Include.NON_NULL);
    converters.add(new MappingJackson2HttpMessageConverter(webObjectMapper));
}
Setting the following property in application.yml:
spring.jackson.default-property-inclusion=non_null
Checking the version of com.fasterxml.jackson - the only one found in the dependency tree is 2.9.7.
None of the above helped.
Any suggestions on how to tell Spring to ignore null values in net.sf.json.JSONObjects?
Include.NON_NULL does not work because JSONNull represents null but is not itself null. From the documentation:
JSONNull is equivalent to the value that JavaScript calls null, whilst
Java's null is equivalent to the value that JavaScript calls
undefined.
This object is implemented as a singleton and has two methods, isArray and isEmpty, where isEmpty is the problematic one because it throws an exception. The snippet below shows its implementation:
public boolean isEmpty() {
    throw new JSONException("Object is null");
}
The best way is to define a NullSerializer for the JSONNull type. The example below shows how to configure that:
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.std.NullSerializer;
import net.sf.json.JSONArray;
import net.sf.json.JSONNull;
import net.sf.json.JSONObject;

public class JsonApp {

    public static void main(String[] args) throws Exception {
        SimpleModule netSfJsonModule = new SimpleModule("net.sf.json");
        netSfJsonModule.addSerializer(JSONNull.class, NullSerializer.instance);

        ObjectMapper mapper = new ObjectMapper();
        mapper.enable(SerializationFeature.INDENT_OUTPUT);
        mapper.registerModule(netSfJsonModule);

        JSONObject object = new JSONObject();
        object.accumulate("object", JSONNull.getInstance());

        JSONArray jsonArray = new JSONArray();
        jsonArray.add(object);

        JSONObject json = new JSONObject();
        json.accumulate("list", jsonArray);

        System.out.println(mapper.writeValueAsString(json));
    }
}
The above code prints:
{
  "list" : [ {
    "object" : null
  } ]
}
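To apply the same fix in the Spring setup from the question, the module can be registered on the copied mapper used by the message converter; a minimal sketch based on the configureMessageConverters override shown earlier (wiring assumed):
@Override
public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
    SimpleModule netSfJsonModule = new SimpleModule("net.sf.json");
    netSfJsonModule.addSerializer(JSONNull.class, NullSerializer.instance);

    // copy the context's mapper so the global configuration stays untouched
    ObjectMapper webObjectMapper = objectMapper.copy();
    webObjectMapper.registerModule(netSfJsonModule);
    converters.add(new MappingJackson2HttpMessageConverter(webObjectMapper));
}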
See also:
Maven: missing net.sf.json-lib
How do you override the null serializer in Jackson 2.0?
This is not the ideal solution but rather a workaround. As overriding the default mapper/converter behaviour didn't work, I instead changed the structure of my JSONObject by adding the following line during its generation:
listElt.put("object", "");
which produces:
{
  "list" : [ {
    "object" : ""
  } ]
}
This is fine only if I am not interested in the value of this field - and I am not.
I personally prefer @Michał Ziober's solution - it is elegant and generic. Unfortunately, it doesn't work for me.

Flink Kafka - Custom Class Data is always null

Custom Class
Person
class Person {
    private Integer id;
    private String name;

    // getters and setters
}
Kafka Flink Connector
TypeInformation<Person> info = TypeInformation.of(Person.class);
TypeInformationSerializationSchema<Person> schema = new TypeInformationSerializationSchema<>(info, new ExecutionConfig());
DataStream<Person> input = env.addSource(new FlinkKafkaConsumer08<>("persons", schema, getKafkaProperties()));
Now if I send the JSON below
{ "id" : 1, "name": Synd }
through the Kafka console producer, the Flink code throws a NullPointerException.
But if I use SimpleStringSchema instead of the custom schema defined above, the stream is printed.
What is wrong with the above setup?
The TypeInformationSerializationSchema is a de-/serialization schema which uses Flink's serialization stack and, thus, also its serializers. Therefore, when using this schema, Flink expects that the data has been serialized with Flink's serializer for the Person type.
Given the excerpt of the Person class, Flink will most likely use its PojoTypeSerializer, and JSON input data won't be understood by that serializer.
If you want to use JSON as the input format, then you have to define your own DeserializationSchema which can parse JSON into Person.
An answer for those who have the same question:
Custom deserialization schema
class PersonSchema implements DeserializationSchema<Person> {

    // com.fasterxml.jackson.databind.ObjectMapper
    private ObjectMapper mapper = new ObjectMapper();

    @Override
    public Person deserialize(byte[] bytes) throws IOException {
        return mapper.readValue(bytes, Person.class);
    }

    @Override
    public boolean isEndOfStream(Person person) {
        return false;
    }

    @Override
    public TypeInformation<Person> getProducedType() {
        return TypeInformation.of(new TypeHint<Person>(){});
    }
}
Using the schema:
DataStream<Person> input = env.addSource(new FlinkKafkaConsumer08<>("persons", new PersonSchema(), getKafkaProperties()));

Jersey + Jackson + arbitrary json

I am using Jersey + Jackson + Guice for my webapp. Now I want to implement a simple REST call for my client where I receive arbitrary JSON data on the server, but every time I get the following exception:
org.codehaus.jackson.map.exc.UnrecognizedPropertyException: Unrecognized field "validTo" (Class org.codehaus.jettison.json.JSONObject), not marked as ignorable at [Source: org.eclipse.jetty.server.HttpConnection$Input@1cafa346; line: 1, column: 25] (through reference chain: org.codehaus.jettison.json.JSONObject["validTo"])
My method signature looks like the following:
@Override
@POST
@Consumes(MediaType.APPLICATION_JSON)
public void post(JSONObject json) throws JSONException {
}
My Guice config:
return Guice.createInjector(new TTShiroModule(this.servletContext), ShiroWebModule.guiceFilterModule(),
        new ServiceModule(), new JerseyServletModule() {

            @Override
            protected void configureServlets() {
                bind(GuiceContainer.class);
                bind(MessageBodyReader.class).to(JacksonJsonProvider.class);
                bind(MessageBodyWriter.class).to(JacksonJsonProvider.class);
                serve("/rest/*").with(GuiceContainer.class, params);
            }

            @Provides
            @Singleton
            ObjectMapper objectMapper() {
                final ObjectMapper mapper = new ObjectMapper();
                return mapper;
            }

            @Provides
            @Singleton
            JacksonJsonProvider jacksonJsonProvider(ObjectMapper mapper) {
                return new JacksonJsonProvider(mapper);
            }
        });
I searched for this exception for a long time but couldn't find any help. I also tried different approaches but wasn't able to resolve this issue.
Can anyone help me? If you need more information, please let me know!
Best regards.
Jersey won't automatically unwrap the JSON string to a JSONObject on its own, but you can easily do it as follows:
@Override
@POST
@Consumes(MediaType.APPLICATION_JSON)
public void post(String json) throws JSONException {
    JSONObject object = new JSONObject(json);
    // do things with object
}
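Alternatively, if you would rather let the registered Jackson provider do the parsing, binding to Jackson's own tree model should also work; a minimal sketch, not from the original answer, assuming the JacksonJsonProvider from the Guice config above is active:
// org.codehaus.jackson.JsonNode is Jackson 1.x's tree model and can hold arbitrary JSON
@POST
@Consumes(MediaType.APPLICATION_JSON)
public void post(JsonNode json) {
    // traverse the tree, e.g. json.get("validTo")
}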