Non-blocking parsing of a JSON String to a JsonNode - json

I'm exploring reactive programming with Spring WebFlux, and I'm therefore trying to make my code completely non-blocking to get all the benefits of a reactive application.
Currently, my method for parsing a JSON string into a JsonNode to read specific values (in this case the elementId) looks like this:
public Mono<String> readElementIdFromJsonString(String jsonString) {
    final JsonNode jsonNode;
    try {
        jsonNode = MAPPER.readTree(jsonString);
    } catch (IOException e) {
        return Mono.error(e);
    }
    final String elementId = jsonNode.get("elementId").asText();
    return Mono.just(elementId);
}
However, IntelliJ notifies me that I'm using an inappropriate blocking method call in this code:
MAPPER.readTree(jsonString);
How can I implement this in a non-blocking way? I have seen that since Jackson 2.9+ it is possible to parse a JSON string in a non-blocking, async way, but I don't know how to use that API and I couldn't find an example of how to do it correctly.

I am not sure why IntelliJ flags this as blocking, since parsing a String that is already in memory does no I/O. Anyway, one way to resolve the warning without pulling in another library is to defer the parse and shift it onto a scheduler. Like this:
public Mono<String> readElementIdFromJsonString(String input) {
    // fromCallable defers the parse and turns the checked IOException into an error signal
    return Mono.fromCallable(() -> MAPPER.readTree(input))
            .map(node -> node.get("elementId").asText())
            .subscribeOn(Schedulers.boundedElastic());
}
Something along those lines.
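For the Jackson 2.9+ non-blocking API mentioned in the question, the idea is that you feed the parser bytes as they arrive and pull tokens until it reports NOT_AVAILABLE. A minimal sketch, not WebFlux-specific; the single feedInput call and the readElementId helper are just for illustration:
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.core.async.ByteArrayFeeder;

import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class NonBlockingParseSketch {

    public static String readElementId(String json) throws IOException {
        JsonParser parser = new JsonFactory().createNonBlockingByteArrayParser();
        ByteArrayFeeder feeder = (ByteArrayFeeder) parser.getNonBlockingInputFeeder();

        byte[] bytes = json.getBytes(StandardCharsets.UTF_8);
        // feed everything we have; in a real pipeline you would feed chunks as they arrive
        feeder.feedInput(bytes, 0, bytes.length);
        feeder.endOfInput();

        JsonToken token;
        while ((token = parser.nextToken()) != null && token != JsonToken.NOT_AVAILABLE) {
            // simplistic: matches the first elementId field at any nesting depth
            if (token == JsonToken.FIELD_NAME && "elementId".equals(parser.getCurrentName())) {
                parser.nextToken();
                return parser.getText();
            }
        }
        return null; // field not found
    }
}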

import java.nio.charset.StandardCharsets;
import java.util.Map;

import org.springframework.core.ResolvableType;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBuffer;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import org.springframework.http.codec.json.AbstractJackson2Decoder;
import org.springframework.util.MimeType;
import org.springframework.util.MimeTypeUtils;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

@FunctionalInterface
public interface MessageParser<T> {
    Mono<T> parse(String message);
}

public class JsonNodeParser extends AbstractJackson2Decoder implements MessageParser<JsonNode> {

    private static final MimeType MIME_TYPE = MimeTypeUtils.APPLICATION_JSON;
    private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper(); // or your preconfigured mapper

    private final DefaultDataBufferFactory factory;
    private final ResolvableType resolvableType;

    public JsonNodeParser() {
        super(OBJECT_MAPPER, MIME_TYPE);
        this.factory = new DefaultDataBufferFactory();
        this.resolvableType = ResolvableType.forClass(JsonNode.class);
        this.setMaxInMemorySize(100_000); // ~100 KB
        canDecodeJsonNode();
    }

    @Override
    public Mono<JsonNode> parse(final String message) {
        final byte[] bytes = message.getBytes(StandardCharsets.UTF_8);
        return decode(bytes);
    }

    private Mono<JsonNode> decode(final byte[] bytes) {
        final DefaultDataBuffer defaultDataBuffer = this.factory.wrap(bytes);
        return this.decodeToMono(Mono.just(defaultDataBuffer), this.resolvableType, MIME_TYPE, Map.of())
                .ofType(JsonNode.class)
                .subscribeOn(Schedulers.boundedElastic())
                .doFinally((t) -> DataBufferUtils.release(defaultDataBuffer));
    }

    private void canDecodeJsonNode() {
        if (!canDecode(this.resolvableType, MIME_TYPE)) {
            throw new IllegalStateException(String.format("JsonNodeParser doesn't support the given target " +
                    "element type [%s] and the MIME type [%s]", this.resolvableType, MIME_TYPE));
        }
    }
}
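A hypothetical usage of the parser above, with an inline payload just for illustration:
JsonNodeParser parser = new JsonNodeParser();
Mono<String> elementId = parser.parse("{\"elementId\":\"abc-123\"}")
        .map(node -> node.get("elementId").asText());
elementId.subscribe(System.out::println); // prints abc-123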

Related

How to get a InputStream from S3client using aws java sdk?

Currently, I have code written in regular Java that gets a public-readable S3 object's InputStream and creates a thumbnail image.
Now I am looking to convert it to reactive Java using Project Reactor on Spring WebFlux. The following is my code so far, and I don't know how to convert it to an InputStream:
public ByteArrayOutputStream createThumbnail(String fileKey, String imageFormat) {
    try {
        LOG.info("fileKey: {}, endpoint: {}", fileKey, s3config.getSubdomain());
        GetObjectRequest request = GetObjectRequest.builder()
                .bucket(s3config.getBucket())
                .key(fileKey)
                .build();
        Mono.fromFuture(s3client.getObject(request, new FluxResponseProvider()))
                .map(fluxResponse -> new ResponseInputStream(fluxResponse.sdkResponse, <ABORTABLE_INPUTSTREAM?>))
I saw ResponseInputStream, and I am thinking maybe that is the way to create an InputStream, but I don't know what to pass as the AbortableInputStream in that constructor. Is that even the way to create an InputStream?
Btw, I am using the FluxResponseProvider from Baeldung's documentation, which is:
import reactor.core.publisher.Flux;
import software.amazon.awssdk.core.async.AsyncResponseTransformer;
import software.amazon.awssdk.core.async.SdkPublisher;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

import java.nio.ByteBuffer;
import java.util.concurrent.CompletableFuture;

class FluxResponseProvider implements AsyncResponseTransformer<GetObjectResponse, FluxResponse> {

    private FluxResponse response;

    @Override
    public CompletableFuture<FluxResponse> prepare() {
        response = new FluxResponse();
        return response.cf;
    }

    @Override
    public void onResponse(GetObjectResponse sdkResponse) {
        this.response.sdkResponse = sdkResponse;
    }

    @Override
    public void onStream(SdkPublisher<ByteBuffer> publisher) {
        response.flux = Flux.from(publisher);
        response.cf.complete(response);
    }

    @Override
    public void exceptionOccurred(Throwable error) {
        response.cf.completeExceptionally(error);
    }
}

class FluxResponse {
    final CompletableFuture<FluxResponse> cf = new CompletableFuture<>();
    GetObjectResponse sdkResponse;
    Flux<ByteBuffer> flux;
}
Does anybody know how to get an InputStream from an S3 object in reactive Java? I am using awssdk version 2.17.195.
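Not an authoritative answer, but one common approach for thumbnail-sized objects is to aggregate the Flux<ByteBuffer> exposed by FluxResponse into a single buffer and expose that as an InputStream, sidestepping the ResponseInputStream/AbortableInputStream constructor entirely. A sketch using Spring's DataBufferUtils (s3client, GetObjectRequest, and FluxResponseProvider are the ones from the question; this buffers the whole object in memory, so it is unsuitable for large files):
import java.io.InputStream;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import reactor.core.publisher.Mono;

Mono<InputStream> getObjectAsInputStream(GetObjectRequest request) {
    DefaultDataBufferFactory factory = new DefaultDataBufferFactory();
    return Mono.fromFuture(s3client.getObject(request, new FluxResponseProvider()))
            // join all the ByteBuffer chunks into one DataBuffer
            .flatMap(fluxResponse -> DataBufferUtils.join(fluxResponse.flux.map(factory::wrap)))
            // true = release the underlying buffer when the stream is closed
            .map(dataBuffer -> dataBuffer.asInputStream(true));
}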

Spring webflux Netty: How to expose proto as json endpoints without duplication of code?

Use-case:
Developers/I want to implement only a Protobuf implementation (binary protocol). However, I need a way to add config so that the same implementation is exposed as a REST/JSON API as well, without code duplication.
I have proto endpoints exposed. I also want consumers to post the JSON equivalent of those proto objects and return/receive the JSON equivalent of the results with type info (POJO?). The type info helps with OpenAPI / Swagger documentation too!
What are the most elegant/simple ways to achieve that without code duplication?
Any example GitHub code that achieves that would be helpful.
Note: This is for WebFlux & Netty - no Tomcat.
ProtobufJsonFormatHttpMessageConverter works for Tomcat but does not work for Netty. A working example would be great.

I was messing around with this and ended up with the following. Nothing else worked for me.
Using proto3 and setting up a protobuf like this:
syntax = "proto3";

option java_package = "com.company";
option java_multiple_files = true;

message CreateThingRequest {
  ...
}

message CreateThingResponse {
  ...
}
I can scan for the protobuf files by setting app.protoPath in my application.properties.
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.google.common.reflect.ClassPath;
import com.google.protobuf.Message;
import com.google.protobuf.util.JsonFormat;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.codec.ServerCodecConfigurer;
import org.springframework.http.codec.json.Jackson2JsonDecoder;
import org.springframework.http.codec.json.Jackson2JsonEncoder;
import org.springframework.http.converter.json.Jackson2ObjectMapperBuilder;
import org.springframework.web.reactive.config.WebFluxConfigurer;

@Configuration
public class WebConfig implements WebFluxConfigurer {

    @Value("${app.protoPath:com.}")
    private String protoPath;

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.defaultCodecs().jackson2JsonEncoder(
                new Jackson2JsonEncoder(Jackson2ObjectMapperBuilder.json().serializerByType(
                        Message.class, new JsonSerializer<Message>() {
                            @Override
                            public void serialize(Message value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
                                // delegate proto-to-JSON conversion to protobuf's own JsonFormat
                                String str = JsonFormat.printer().omittingInsignificantWhitespace().print(value);
                                gen.writeRawValue(str);
                            }
                        }
                ).build())
        );

        final ClassLoader loader = Thread.currentThread().getContextClassLoader();
        Map<Class<?>, JsonDeserializer<?>> deserializers = new HashMap<>();
        try {
            // register a JSON deserializer for every proto Message class found under protoPath
            for (final ClassPath.ClassInfo info : ClassPath.from(loader).getTopLevelClasses()) {
                if (info.getName().startsWith(protoPath)) {
                    final Class<?> clazz = info.load();
                    if (!Message.class.isAssignableFrom(clazz)) {
                        continue;
                    }
                    @SuppressWarnings("unchecked") final Class<Message> proto = (Class<Message>) clazz;
                    final JsonDeserializer<Message> deserializer = new CustomJsonDeserializer() {
                        @Override
                        public Class<Message> getDeserializeClass() {
                            return proto;
                        }
                    };
                    deserializers.put(proto, deserializer);
                }
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }

        configurer.defaultCodecs().jackson2JsonDecoder(new Jackson2JsonDecoder(Jackson2ObjectMapperBuilder.json().deserializersByType(deserializers).build()));
    }

    private abstract static class CustomJsonDeserializer extends JsonDeserializer<Message> {

        abstract Class<? extends Message> getDeserializeClass();

        @Override
        public Message deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
            Message.Builder builder;
            try {
                // every generated proto class exposes a static newBuilder() factory
                builder = (Message.Builder) getDeserializeClass()
                        .getDeclaredMethod("newBuilder")
                        .invoke(null);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
            JsonFormat.parser().merge(jp.getCodec().readTree(jp).toString(), builder);
            return builder.build();
        }
    }
}
Then I just use the proto object types in the return types:
@PostMapping(
        path = "/things",
        consumes = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"},
        produces = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"})
Mono<CreateThingResponse> createThing(@RequestBody CreateThingRequest request);
With https://github.com/innogames/springfox-protobuf you can get the responses to show in Swagger, but the requests still aren't showing for me.
You'll have to excuse the messy Java; I'm a little rusty.
I needed to support JSON as well, and the following code helped (ProtobufModule comes from the jackson-datatype-protobuf library):
@Bean
public WebFluxConfigurer webFluxConfigurer() {
    return new WebFluxConfigurer() {
        @Override
        public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
            ObjectMapper mapper = new ObjectMapper()
                    .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
                    .configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
                    .registerModule(new ProtobufModule());
            configurer.customCodecs().register(new Jackson2JsonEncoder(mapper));
            configurer.customCodecs().register(new Jackson2JsonDecoder(mapper));
        }
    };
}
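A quick standalone sanity check of that mapper, assuming CreateThingRequest from the earlier proto has a name field (the field name is my assumption for illustration, since the proto body was elided):
// e.g. inside a test method that declares throws Exception
ObjectMapper mapper = new ObjectMapper().registerModule(new ProtobufModule());
CreateThingRequest parsed = mapper.readValue("{\"name\":\"widget\"}", CreateThingRequest.class);
String json = mapper.writeValueAsString(parsed); // back to {"name":"widget"}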
Try adding a ProtobufEncoder in your WebFlux config:
@EnableWebFlux
public class MyConfig implements WebFluxConfigurer {
    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.customCodecs().register(new ProtobufEncoder());
    }
}
Then in your request mapping, return the proto object:
@GetMapping(produces = "application/x-protobuf")
public MyProtoObject lookup() {
    return MyProtoObject.newBuilder().build(); // generated proto classes are built via builders
}
Furthermore, if you want to serialize the proto object into JSON and return a String, have a look at the com.googlecode.protobuf-java-format:protobuf-java-format library and its JsonFormat::printToString capability (https://code.google.com/archive/p/protobuf-java-format/):
@GetMapping
public String lookup() {
    return new JsonFormat().printToString(MyProtoObj.newBuilder().build());
}
Since version 4.1, Spring provides org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter for reading and writing protos as JSON.
However, if you are using Spring 5.x and Protobuf 3.x, there is org.springframework.http.converter.protobuf.ProtobufJsonFormatHttpMessageConverter for more explicit JSON conversion.
This documentation should help you:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufHttpMessageConverter.html
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufJsonFormatHttpMessageConverter.html

Is it possible to pass a java.util.Stream to Gson?

I'm currently working on a project where I need to fetch a large amount of data from the database and parse it into a specific JSON format. I have already built my custom serializers, and everything works properly when I pass a List to Gson. But as I was already working with Streams from my JPA layer, I thought I could pass the Stream down to the Gson parser so that it could transform it directly to my JSON data. Instead, I'm getting an empty JSON object rather than a correctly populated one.
So, can anyone point me to a way to make Gson work with Java 8 Streams, or tell me if this isn't currently possible? I could not find anything on Google, so I came to Stack Overflow.
You could use JsonWriter to stream your data to an output stream:
public void writeJsonStream(OutputStream out, Stream<DataObject> data) throws IOException {
    try (JsonWriter writer = new JsonWriter(new OutputStreamWriter(out, StandardCharsets.UTF_8))) {
        writer.setIndent("  ");
        writer.beginArray();
        data.forEach(d -> {
            try {
                writer.beginObject();
                writer.name("yourField").value(d.getYourField());
                // ... write the rest of your fields ...
                writer.endObject();
            } catch (IOException e) {
                throw new UncheckedIOException(e); // JsonWriter throws checked IOException inside the lambda
            }
        });
        writer.endArray();
    }
}
Note that you're in charge of controlling the JSON structure.
That is, if your DataObject contains a nested object, you have to write the corresponding beginObject()/endObject() pair yourself. The same goes for nested arrays.
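A possible call site, assuming a repository that streams DataObject rows (repository.streamAll() is a placeholder for whatever your JPA layer provides). Closing the Stream in the try-with-resources matters for JPA streams, since they hold a database cursor open:
try (Stream<DataObject> data = repository.streamAll();
     OutputStream out = Files.newOutputStream(Paths.get("data.json"))) {
    writeJsonStream(out, data);
}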
It is not as trivial as one would expect, but it can be done in a generic way.
The Javadoc for TypeAdapterFactory shows a very simplistic way of writing a factory for a custom type. Alas, it does not work as expected here because of problems with element type detection. The proper way can be found in Gson's internal CollectionTypeAdapterFactory. It is quite complex, but taking what's necessary, one can come up with something like this:
final class StreamTypeAdapterFactory implements TypeAdapterFactory {

    @SuppressWarnings("unchecked")
    @Override
    public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
        Type type = typeToken.getType();
        Class<? super T> rawType = typeToken.getRawType();
        if (!Stream.class.isAssignableFrom(rawType)) {
            return null;
        }
        Type elementType = ExtraGsonTypes.getStreamElementType(type, rawType);
        TypeAdapter<?> elementAdapter = gson.getAdapter(TypeToken.get(elementType));
        return (TypeAdapter<T>) new StreamTypeAdapter<>(elementAdapter);
    }

    private static class StreamTypeAdapter<E> extends TypeAdapter<Stream<E>> {
        private final TypeAdapter<E> elementAdapter;

        StreamTypeAdapter(TypeAdapter<E> elementAdapter) {
            this.elementAdapter = elementAdapter;
        }

        @Override
        public void write(JsonWriter out, Stream<E> value) throws IOException {
            out.beginArray();
            for (E element : iterable(value)) {
                elementAdapter.write(out, element);
            }
            out.endArray();
        }

        @Override
        public Stream<E> read(JsonReader in) throws IOException {
            Stream.Builder<E> builder = Stream.builder();
            in.beginArray();
            while (in.hasNext()) {
                builder.add(elementAdapter.read(in));
            }
            in.endArray();
            return builder.build();
        }
    }

    private static <T> Iterable<T> iterable(Stream<T> stream) {
        return stream::iterator;
    }
}
The ExtraGsonTypes is a special class that I used to circumvent package-private access to $Gson$Types.getSupertype method. It's a hack that works if you're not using JDK 9's modules - you simply place this class in the same package as $Gson$Types:
package com.google.gson.internal;

import java.lang.reflect.*;
import java.util.stream.Stream;

public final class ExtraGsonTypes {

    public static Type getStreamElementType(Type context, Class<?> contextRawType) {
        return getContainerElementType(context, contextRawType, Stream.class);
    }

    private static Type getContainerElementType(Type context, Class<?> contextRawType, Class<?> containerSupertype) {
        Type containerType = $Gson$Types.getSupertype(context, contextRawType, containerSupertype);
        if (containerType instanceof WildcardType) {
            containerType = ((WildcardType) containerType).getUpperBounds()[0];
        }
        if (containerType instanceof ParameterizedType) {
            return ((ParameterizedType) containerType).getActualTypeArguments()[0];
        }
        return Object.class;
    }
}
(I filed an issue about that in GitHub)
You use it in the following way:
Gson gson = new GsonBuilder()
        .registerTypeAdapterFactory(new StreamTypeAdapterFactory())
        .create();
System.out.println(gson.toJson(Stream.of(1, 2, 3)));
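This prints [1,2,3]. One caveat worth noting: a Stream can only be consumed once, so serializing it with this adapter also exhausts it.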

Loading json into my unit test from a text file

I am working in AEM, trying to create txt files with JSON output so that I can load them into my unit tests as strings and test my model / model processors. So far I have this...
public String readFile(String path, Charset encoding) throws IOException {
    byte[] encoded = Files.readAllBytes(Paths.get(path));
    return new String(encoded, encoding);
}

private String sampleInput = readFile("/test/resources/map/sample-input.txt", Charset.forName("UTF-8"));
I need sampleInput to take the JSON that is in 'sample-input.txt' and convert it to a string. I am also running into issues with the Charset encoding.
I think the easiest way to manage JSON documents you use for unit testing is by keeping them organized in the classpath. Guava provides a neat wrapper for loading classpath resources.
import com.google.common.base.Charsets;
import com.google.common.io.Resources;

import java.io.IOException;
import java.net.URL;

public class TestJsonDocumentLoader {

    private final Class<?> clazz;

    public TestJsonDocumentLoader(Class<?> clazz) {
        this.clazz = clazz;
    }

    public String loadTestJson(String fileName) {
        URL url = Resources.getResource(clazz, fileName);
        try {
            return Resources.toString(url, Charsets.UTF_8);
        } catch (IOException e) {
            throw new RuntimeException("Couldn't load a JSON file.", e);
        }
    }
}
This can then be used to load arbitrary JSON files placed in the same package as the test class. It is assumed that the files are UTF-8 encoded. I suggest keeping all sources encoded that way, regardless of the OS your team is using. It saves you a lot of trouble with version control.
Let's say you have MyTest in src/test/java/com/example/mytestsuite; then you could place a file data.json in src/test/resources/com/example/mytestsuite and load it by calling
TestJsonDocumentLoader loader = new TestJsonDocumentLoader(MyTest.class);
String jsonData = loader.loadTestJson("data.json");
String someOtherExample = loader.loadTestJson("other.json");
Actually, this could be used for all sorts of text files.
You could also use Jackson's ObjectMapper as an alternative:
public class JsonResourceObjectMapper<T> {

    private final Class<T> model;

    public JsonResourceObjectMapper(Class<T> model) {
        this.model = model;
    }

    public T loadTestJson(String fileName) throws IOException {
        ClassLoader classLoader = this.getClass().getClassLoader();
        InputStream inputStream = classLoader.getResourceAsStream(fileName);
        return new ObjectMapper().readValue(inputStream, this.model);
    }
}
And then set up a fixture in the test, passing a .class:
private JsonClass json;

@Before
public void setUp() throws IOException {
    JsonResourceObjectMapper<JsonClass> mapper = new JsonResourceObjectMapper<>(JsonClass.class);
    json = mapper.loadTestJson("json/testJson.json");
}
Note that the testJson.json file is in the resources/json folder, same as what @toniedzwiedz mentioned.
So then you could use the JSON model as:
@Test
public void testJsonNameProperty() {
    // act
    String name = json.getName();
    // assert
    assertEquals("testName", name);
}

Remove namespace prefix while JAXB marshalling

I have JAXB objects created from a schema. While marshalling, the XML elements are getting prefixed with ns2. I have tried all the options that exist over the net for this problem, but none of them works. I cannot modify my schema or change package-info.java. Please help.
After much research and tinkering I have finally managed to achieve a solution to this problem. Please accept my apologies for not posting links to the original references - there are many and I wasn't taking notes - but this one was certainly useful.
My solution uses a filtering XMLStreamWriter which applies an empty namespace context.
public class NoNamesWriter extends DelegatingXMLStreamWriter {

    private static final NamespaceContext emptyNamespaceContext = new NamespaceContext() {
        @Override
        public String getNamespaceURI(String prefix) {
            return "";
        }

        @Override
        public String getPrefix(String namespaceURI) {
            return "";
        }

        @Override
        public Iterator getPrefixes(String namespaceURI) {
            return null;
        }
    };

    public static XMLStreamWriter filter(Writer writer) throws XMLStreamException {
        return new NoNamesWriter(XMLOutputFactory.newInstance().createXMLStreamWriter(writer));
    }

    public NoNamesWriter(XMLStreamWriter writer) {
        super(writer);
    }

    @Override
    public NamespaceContext getNamespaceContext() {
        return emptyNamespaceContext;
    }
}
You can find a DelegatingXMLStreamWriter here.
You can then filter the marshalling xml with:
// Filter the output to remove namespaces.
m.marshal(it, NoNamesWriter.filter(writer));
I am sure there are more efficient mechanisms but I know this one works.
For me, only changing the package-info.java class worked, like a charm, exactly as zatziky stated:
package-info.java
@javax.xml.bind.annotation.XmlSchema(
        namespace = "http://example.com",
        xmlns = {@XmlNs(prefix = "", namespaceURI = "http://example.com")},
        elementFormDefault = javax.xml.bind.annotation.XmlNsForm.QUALIFIED)
package my.example;

import javax.xml.bind.annotation.XmlNs;
You can let the namespaces be written only once. You will need a proxy class for the XMLStreamWriter and a package-info.java. Then, in your code:
StringWriter stringWriter = new StringWriter();
XMLStreamWriter writer = new WrapperXMLStreamWriter(
        XMLOutputFactory.newInstance().createXMLStreamWriter(stringWriter));
JAXBContext jaxbContext = JAXBContext.newInstance(Collection.class);
Marshaller jaxbMarshaller = jaxbContext.createMarshaller();
jaxbMarshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
jaxbMarshaller.marshal(books, writer);
System.out.println(stringWriter.toString());
Proxy class (the important method is "writeNamespace"):
class WrapperXMLStreamWriter implements XMLStreamWriter {

    private final XMLStreamWriter writer;

    // keeps track of which namespaces were already used so that they
    // are not written more than once
    private final List<String> namespaces = new ArrayList<String>();

    public WrapperXMLStreamWriter(XMLStreamWriter writer) {
        this.writer = writer;
    }

    public void init() {
        namespaces.clear();
    }

    public void writeStartElement(String localName) throws XMLStreamException {
        init();
        writer.writeStartElement(localName);
    }

    public void writeStartElement(String namespaceURI, String localName) throws XMLStreamException {
        init();
        writer.writeStartElement(namespaceURI, localName);
    }

    public void writeStartElement(String prefix, String localName, String namespaceURI) throws XMLStreamException {
        init();
        writer.writeStartElement(prefix, localName, namespaceURI);
    }

    public void writeNamespace(String prefix, String namespaceURI) throws XMLStreamException {
        if (namespaces.contains(namespaceURI)) {
            return;
        }
        namespaces.add(namespaceURI);
        writer.writeNamespace(prefix, namespaceURI);
    }

    // .. other delegating methods, always the same pattern: writer.method() ...
}
package-info.java:
@XmlSchema(elementFormDefault = XmlNsForm.QUALIFIED, attributeFormDefault = XmlNsForm.UNQUALIFIED,
        xmlns = {
                @XmlNs(namespaceURI = "http://www.w3.org/2001/XMLSchema-instance", prefix = "xsi")})
package your.example;

import javax.xml.bind.annotation.XmlNs;
import javax.xml.bind.annotation.XmlNsForm;
import javax.xml.bind.annotation.XmlSchema;
You can use the NamespacePrefixMapper extension to control the namespace prefixes for your use case. The same extension is supported by both the JAXB reference implementation and EclipseLink JAXB (MOXy).
http://wiki.eclipse.org/EclipseLink/Release/2.4.0/JAXB_RI_Extensions/Namespace_Prefix_Mapper
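A minimal sketch with the JAXB reference implementation, assuming a MyRoot JAXB class (MyRoot is hypothetical; the property key is RI-specific, and MOXy uses its own, as described in the link above):
import com.sun.xml.bind.marshaller.NamespacePrefixMapper;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;

Marshaller marshaller = JAXBContext.newInstance(MyRoot.class).createMarshaller();
marshaller.setProperty("com.sun.xml.bind.namespacePrefixMapper", new NamespacePrefixMapper() {
    @Override
    public String getPreferredPrefix(String namespaceUri, String suggestion, boolean requirePrefix) {
        return ""; // ask JAXB to use the default (empty) prefix for every namespace
    }
});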
Every other solution requires complex overriding or annotations, which did not seem to work with recent versions for me. I use a simpler approach: just replace the annoying namespaces. I wish Google & Co would use JSON and get rid of XML.
kml.marshal(file);
String kmlContent = FileUtils.readFileToString(file, "UTF-8");
kmlContent = kmlContent.replaceAll("ns2:", "").replace("<kml xmlns:ns2=\"http://www.opengis.net/kml/2.2\" xmlns:ns3=\"http://www.w3.org/2005/Atom\" xmlns:ns4=\"urn:oasis:names:tc:ciq:xsdschema:xAL:2.0\" xmlns:ns5=\"http://www.google.com/kml/ext/2.2\">", "<kml>");
FileUtils.write(file, kmlContent, "UTF-8");