Nested JSON where the root element key is a varying path - json

I have a unique requirement where I need to construct JSON as below.
{
    "XXXMonitoring/DC/EVN/DBNAME": {
        "t": 123456777,
        "s": {
            "CAPTURE": {
                "c": 100
            }
        }
    }
}
where the root element "XXXMonitoring/DC/EVN/DBNAME" contains "/" characters because it represents a path. I tried with Gson to build nested Java objects, but I am not sure how I can represent "XXXMonitoring/DC/EVN/DBNAME" from my Java object.
Can someone help me with this?

I'm not sure if this is what you are asking...
But the solidus (/) is escaped with a backslash (\) so that the browser won't mistake it for a closing script tag.
When you need to use that key, you can remove the backslashes with the String.replaceAll() method:
json.toString().replaceAll("\\\\", "");
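For example, a minimal sketch of the round trip (the escaped form below is an assumption about how the key might arrive):
// Hypothetical input: the key with JSON-escaped solidus characters, i.e. XXXMonitoring\/DC\/EVN\/DBNAME
String escaped = "XXXMonitoring\\/DC\\/EVN\\/DBNAME";
String cleaned = escaped.replaceAll("\\\\", ""); // removes the backslashes
System.out.println(cleaned);                     // XXXMonitoring/DC/EVN/DBNAME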

The JSON string can be constructed without a POJO class using the code below.
If the JSON structure stays the same and only the values change, you can replace the hard-coded values with variables and turn this into a utility method (see the sketch after the code). The utility method can then be reused to generate the JSON string.
import com.google.gson.Gson;
import com.google.gson.JsonObject;

public static void main(String[] args) {
    Gson gson = new Gson();
    JsonObject jsonRootObject = new JsonObject();
    JsonObject jsonFirstLevelObject = new JsonObject();
    // t property
    jsonFirstLevelObject.addProperty("t", 123456777);
    JsonObject jsonCaptureObject = new JsonObject();
    JsonObject jsonCObject = new JsonObject();
    jsonCObject.addProperty("c", 100);
    jsonCaptureObject.add("CAPTURE", jsonCObject);
    // s property
    jsonFirstLevelObject.add("s", jsonCaptureObject);
    jsonRootObject.add("XXXMonitoring/DC/EVN/DBNAME", jsonFirstLevelObject);
    System.out.println(gson.toJson(jsonRootObject));
}
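A rough sketch of that utility method (the method and parameter names below are illustrative, not from the original post):
// Sketch of the reusable utility: the hard-coded values are replaced with parameters.
public static String buildMonitoringJson(String rootPath, long t, int captureCount) {
    JsonObject cObject = new JsonObject();
    cObject.addProperty("c", captureCount);

    JsonObject captureObject = new JsonObject();
    captureObject.add("CAPTURE", cObject);

    JsonObject firstLevelObject = new JsonObject();
    firstLevelObject.addProperty("t", t);
    firstLevelObject.add("s", captureObject);

    JsonObject rootObject = new JsonObject();
    rootObject.add(rootPath, firstLevelObject);

    return new Gson().toJson(rootObject);
}

// Example call:
// String json = buildMonitoringJson("XXXMonitoring/DC/EVN/DBNAME", 123456777L, 100);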

I have a library called GsonPath which might suit your needs. The aim of the library is to provide an annotation processor that generates the boilerplate code to help simplify the POJO you need to write.
By using the library you can write a POJO similar to the following:
@AutoGsonAdapter(rootField = "XXXMonitoring/DC/EVN/DBNAME")
public class SamplePojo {
    int t;

    @SerializedName("s.CAPTURE.c")
    int sCapture;
}
Then all you need to do with your Gson object is register a special TypeAdapterFactory:
GsonBuilder builder = new GsonBuilder();
builder.registerTypeAdapterFactory(GsonPath.createTypeAdapterFactory());
Gson gson = builder.create();
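Reading the JSON is then an ordinary fromJson call. A small illustrative usage (the json variable is assumed to hold the string from the question):
// Illustrative usage, assuming 'json' holds the JSON from the question
SamplePojo pojo = gson.fromJson(json, SamplePojo.class);
System.out.println(pojo.sCapture); // expected: 100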
The documentation within the library is fairly comprehensive; let me know if you have any problems!

Related

Unmarshalling with Jackson "The Json input stream must start with an array of Json objects"

I'm getting an error when unmarshalling files that only contain a single JSON object: "IllegalStateException: The Json input stream must start with an array of Json objects"
I can't find any workaround and I don't understand why it has to be so.
@Bean
public ItemReader<JsonHar> reader(@Value("file:${json.resources.path}/*.json") Resource[] resources) {
    log.info("Processing JSON resources: {}", Arrays.toString(resources));
    JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
            .jsonObjectReader(new JacksonJsonObjectReader<>(JsonHar.class))
            .resource(resources[0]) // FIXME had to force this, but it fails anyway because the file is "{...}" and not "[...]"
            .name("jsonItemReader")
            .build();
    MultiResourceItemReader<JsonHar> reader = new MultiResourceItemReader<>();
    reader.setDelegate(delegate);
    reader.setResources(resources);
    return reader;
}
I need a way to unmarshal single-object files; what's the point in forcing arrays (which I won't have in my use case)?
I don't understand why it has to be so.
The JsonItemReader is designed to read an array of objects because batch processing is usually about handling data sources with a lot of items, not a single item.
I can't find any workaround
JsonObjectReader is what you are looking for: you can implement it to read a single JSON object and use it with the JsonItemReader (either at construction time or using the setter). This is not a workaround but a strategy interface designed for specific use cases like yours; see the sketch below.
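A minimal sketch of such an implementation (the class name and details below are mine, not from Spring Batch; it assumes exactly one object per file, parsed with Jackson):

import java.io.InputStream;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.batch.item.json.JsonObjectReader;
import org.springframework.core.io.Resource;

// Hypothetical reader: parses exactly one object per resource, then signals "no more items".
public class SingleItemJsonObjectReader<T> implements JsonObjectReader<T> {

    private final Class<T> itemType;
    private final ObjectMapper mapper = new ObjectMapper();
    private T item;
    private boolean itemReturned;

    public SingleItemJsonObjectReader(Class<T> itemType) {
        this.itemType = itemType;
    }

    @Override
    public void open(Resource resource) throws Exception {
        try (InputStream in = resource.getInputStream()) {
            item = mapper.readValue(in, itemType);
        }
        itemReturned = false;
    }

    @Override
    public T read() {
        if (itemReturned) {
            return null; // null tells JsonItemReader this resource has no more items
        }
        itemReturned = true;
        return item;
    }
}

It could then be plugged into the builder with .jsonObjectReader(new SingleItemJsonObjectReader<>(JsonHar.class)) instead of the JacksonJsonObjectReader.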
Definitely not ideal @thomas-escolan. As @mahmoud-ben-hassine pointed out, the ideal would be to code a custom reader.
In case some new SO users stumble on this question, I leave a code example here showing how to do it.
Though this may not be ideal, this is how I handled the situation:
@Bean
public ItemReader<JsonHar> reader(@Value("file:${json.resources.path}/*.json") Resource[] resources) {
    log.info("Processing JSON resources: {}", Arrays.toString(resources));
    JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
            .jsonObjectReader(new JacksonJsonObjectReader<>(JsonHar.class))
            .resource(resources[0]) // DEBUG had to force this because of NPE...
            .name("jsonItemReader")
            .build();
    MultiResourceItemReader<JsonHar> reader = new MultiResourceItemReader<>();
    reader.setDelegate(delegate);
    reader.setResources(Arrays.stream(resources)
            .map(WrappedResource::new) // forcing the bride to look good enough
            .toArray(Resource[]::new));
    return reader;
}

@RequiredArgsConstructor
static class WrappedResource implements Resource {

    @Delegate(excludes = InputStreamSource.class)
    private final Resource resource;

    @Override
    public InputStream getInputStream() throws IOException {
        log.info("Wrapping resource: {}", resource.getFilename());
        InputStream in = resource.getInputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, UTF_8));
        String wrap = reader.lines().collect(Collectors.joining())
                .replaceAll("[^\\x00-\\xFF]", ""); // strips characters outside the 0x00-0xFF (Latin-1) range
        return new ByteArrayInputStream(("[" + wrap + "]").getBytes(UTF_8));
    }
}

How to directly convert a MongoDB Document to a Jackson JsonNode in Java

I would like to store a MongoDB Document (org.bson.Document) as a Jackson JsonNode. There is an outdated answer to this problem here; inspired by it, I was able to successfully parse the Document with
ObjectMapper mapper = new ObjectMapper();
...
JsonNode jsonData = mapper.readTree(someBsonDocument.toJson());
In my understanding this will:
1. convert the Document to a string, and
2. parse the string to create a JsonNode object.
I noticed there is some support for MongoDB/BSON in the Jackson project - jackson-datatype-mongo and BSON for Jackson - but I cannot figure out how to use them to do the conversion more efficiently.
I was able to figure out a solution using bson4jackson:
public static InputStream documentToInputStream(final Document document) {
    BasicOutputBuffer outputBuffer = new BasicOutputBuffer();
    BsonBinaryWriter writer = new BsonBinaryWriter(outputBuffer);
    new DocumentCodec().encode(writer, document, EncoderContext.builder().isEncodingCollectibleDocument(true).build());
    return new ByteArrayInputStream(outputBuffer.toByteArray());
}

public static JsonNode documentToJsonNode(final Document document) throws IOException {
    ObjectMapper mapper = new ObjectMapper(new BsonFactory());
    InputStream is = documentToInputStream(document);
    return mapper.readTree(is);
}
I am not sure if this is the most efficient way, but I assume it is still a better solution than converting the BSON to a String and parsing that string. There is an open ticket in the MongoDB JIRA for adding conversions between Document, DBObject and BsonDocument, which would simplify the whole process a lot.
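A quick usage sketch of the helper above (the document contents are made up):
Document doc = new Document("name", "test").append("count", 1);
JsonNode node = documentToJsonNode(doc);
System.out.println(node.get("count").asInt()); // expected: 1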
Appreciate this isn't what the OP asked for - but it might be helpful to some. I've managed to do this in reverse using MongoJack. The key thing is to use the JacksonEncoder, which can turn any JSON-like object into a BSON object. Then use BsonDocumentWriter to write it to a BsonDocument instance.
@Test
public void writeBsonDocument() throws IOException {
    JsonNode jsonNode = new ObjectMapper().readTree("{\"wibble\": \"wobble\"}");
    BsonDocument document = new BsonDocument();
    BsonDocumentWriter writer = new BsonDocumentWriter(document);
    JacksonEncoder transcoder =
            new JacksonEncoder(JsonNode.class, null, new ObjectMapper(), UuidRepresentation.UNSPECIFIED);
    var context = EncoderContext.builder().isEncodingCollectibleDocument(true).build();
    transcoder.encode(writer, jsonNode, context);
    Assertions.assertThat(document.toJson()).isEqualTo("{\"wibble\": \"wobble\"}");
}

JSONObject Alternative in Spring and Jackson

I need to pass a map back to the web application.
I'm used to encapsulating the map in a JSONObject
http://json.org/java/
But since I am using Spring and Jackson (Codehaus), is there an easier way to maintain the POJO? Maybe I can just annotate the Map?
Jackson has com.fasterxml.jackson.databind.JsonNode, and specific subtypes like ObjectNode.
These form the so-called Tree Model, which is one of three ways to handle JSON with Jackson -- some other libraries (like org.json) only offer this way.
So you should be able to just use JsonNode instead; there is little point in using the org.json library: it is slow and has an outdated API.
Alternatively, you can just use java.util.Map and return that -- Jackson handles standard Lists, Maps and other JDK types just fine (see the sketch below).
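A minimal sketch of that second option (the controller class, endpoint path and field names below are invented for illustration):

import java.util.LinkedHashMap;
import java.util.Map;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class StatusController {

    // Jackson serializes the returned Map to JSON automatically.
    @RequestMapping(value = "/api/status", method = RequestMethod.GET)
    @ResponseBody
    public Map<String, Object> status() {
        Map<String, Object> body = new LinkedHashMap<>();
        body.put("status", "ok");
        body.put("userCount", 42);
        return body; // rendered as {"status":"ok","userCount":42}
    }
}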
If you need to manipulate the output, i.e., you don't want to provide all the fields of the object, you can use JSONArray:
@RequestMapping(value = "/api/users", method = RequestMethod.GET)
@ResponseBody
public String listUsersJson(ModelMap model) throws JSONException {
    JSONArray userArray = new JSONArray();
    for (User user : userRepository.findAll()) {
        JSONObject userJSON = new JSONObject();
        userJSON.put("id", user.getId());
        userJSON.put("firstName", user.getFirstName());
        userJSON.put("lastName", user.getLastName());
        userJSON.put("email", user.getEmail());
        userArray.put(userJSON);
    }
    return userArray.toString();
}
Use the example from here.
Otherwise, if you add Jackson to your dependencies and annotate the controller method with @ResponseBody, the response will automatically be mapped to JSON. Check here for a simple example.

How to serialize a null string to an empty string in Jackson JSON

I need Jackson JSON (1.8) to serialize a Java null String to an empty string. How do you do it?
Any help or suggestion is greatly appreciated.
Thanks
See the docs on Custom Serializers; there's an example of exactly this, and it works for me.
In case the docs move, let me paste the relevant answer:
Converting null values to something else (like empty Strings)
If you want to output some other JSON value instead of null (mainly because some other processing tools prefer other constant values -- often an empty String), things are a bit trickier, as the nominal type may be anything; and while you could register a serializer for Object.class, it would not be used whenever a more specific serializer applies. But there is a specific concept of a "null serializer" that you can use as follows:
// Configuration of ObjectMapper:
{
    // First: need a custom serializer provider
    StdSerializerProvider sp = new StdSerializerProvider();
    sp.setNullValueSerializer(new NullSerializer());
    // And then configure mapper to use it
    ObjectMapper m = new ObjectMapper();
    m.setSerializerProvider(sp);
}

// Serialization is done using the regular ObjectMapper.writeValue(),
// and NullSerializer can be something as simple as:
public class NullSerializer extends JsonSerializer<Object>
{
    public void serialize(Object value, JsonGenerator jgen,
            SerializerProvider provider)
        throws IOException, JsonProcessingException
    {
        // any JSON value you want...
        jgen.writeString("");
    }
}
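A short usage sketch under the same Jackson 1.x setup (the Person bean and the exact output shown are my assumptions, not from the original answer):

import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.ser.StdSerializerProvider;

public class NullToEmptyDemo {

    public static class Person {
        public String name;   // intentionally left null
        public int age = 30;
    }

    public static void main(String[] args) throws Exception {
        StdSerializerProvider sp = new StdSerializerProvider();
        sp.setNullValueSerializer(new NullSerializer()); // NullSerializer from the snippet above
        ObjectMapper m = new ObjectMapper();
        m.setSerializerProvider(sp);

        System.out.println(m.writeValueAsString(new Person()));
        // expected output (assumption): {"name":"","age":30}
    }
}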

How to obtain an unproxied and EAGER-fetched object in Hibernate?

I want to load an object and forget that it comes from Hibernate! That is, I just want to do something like:
MyClass myObject = MyClassDAO.getUnproxiedObject(objectID);
and then have a real instance of myObject (not a Hibernate proxy) with all attributes set to the values from the database, so that I can't distinguish it from a manually created object.
In this thread a method is presented for creating an unproxied object, but it does not address the issue of eagerly loading the objects, which I suppose is necessary for achieving my ultimate goal.
For those wondering why I would want such objects: I need to serialize them to JSON with Gson, but I think it would have many other uses for many people.
Use FetchType.EAGER to eagerly load all the relations (a minimal mapping sketch follows below). Specifically for JSON serialization, if you are building a web application, consider using an OpenSessionInView interceptor for your HTTP requests.
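A minimal mapping sketch (the entity and field names are invented for illustration; Customer is assumed to be another mapped entity):

import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

// Hypothetical entity: the relation is marked EAGER so it is loaded together with its owner.
@Entity
public class Invoice {

    @Id
    @GeneratedValue
    private Long id;

    @ManyToOne(fetch = FetchType.EAGER) // fetched along with the Invoice, no lazy proxy for this relation
    private Customer customer;
}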
After testing, I found out that the method given in the cited post did exactly what I was looking for.
The reason Hibernate doesn't de-proxy while rendering with Gson is that Gson uses reflection to serialize the fields rather than using the getters of the Hibernate object. To work around this, you need to register a TypeHierarchyAdapter that will de-proxy the object as Gson serializes it.
Here's my approach:
GsonBuilder builder = new GsonBuilder();
builder.registerTypeHierarchyAdapter(HibernateProxy.class, new HibernateProxySerializer());
Gson gson = builder.create();
String jsonLove = gson.toJson(objToSerialize);
Here's the HibernateProxySerializer:
public class HibernateProxySerializer implements JsonSerializer<HibernateProxy> {

    @Override
    public JsonElement serialize(HibernateProxy proxyObj, Type arg1, JsonSerializationContext arg2) {
        try {
            GsonBuilder gsonBuilder = new GsonBuilder();
            // below ensures deep de-proxied serialization
            gsonBuilder.registerTypeHierarchyAdapter(HibernateProxy.class, new HibernateProxySerializer());
            Object deProxied = proxyObj.getHibernateLazyInitializer().getImplementation();
            return gsonBuilder.create().toJsonTree(deProxied);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}