Currently, I have code written in regular Java that gets a publicly readable S3 object's InputStream and creates a thumbnail image.
Now I am looking to convert it to reactive Java using Project Reactor on Spring WebFlux. The following is my code so far, and I don't know how to convert it to an InputStream:
public ByteArrayOutputStream createThumbnail(String fileKey, String imageFormat) {
    try {
        LOG.info("fileKey: {}, endpoint: {}", fileKey, s3config.getSubdomain());
        GetObjectRequest request = GetObjectRequest.builder()
                .bucket(s3config.getBucket())
                .key(fileKey)
                .build();
        Mono.fromFuture(s3client.getObject(request, new FluxResponseProvider()))
                .map(fluxResponse -> new ResponseInputStream(fluxResponse.sdkResponse, <ABORTABLE_INPUTSTREAM?>))
I saw ResponseInputStream and I am thinking maybe that is the way to create an InputStream, but I don't know what to put as the AbortableInputStream in that constructor.
Is that even the way to create an InputStream?
Btw, I am using FluxResponseProvider from Baeldung's documentation, which is:
import reactor.core.publisher.Flux;
import software.amazon.awssdk.core.async.AsyncResponseTransformer;
import software.amazon.awssdk.core.async.SdkPublisher;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

import java.nio.ByteBuffer;
import java.util.concurrent.CompletableFuture;

class FluxResponseProvider implements AsyncResponseTransformer<GetObjectResponse, FluxResponse> {

    private FluxResponse response;

    @Override
    public CompletableFuture<FluxResponse> prepare() {
        response = new FluxResponse();
        return response.cf;
    }

    @Override
    public void onResponse(GetObjectResponse sdkResponse) {
        this.response.sdkResponse = sdkResponse;
    }

    @Override
    public void onStream(SdkPublisher<ByteBuffer> publisher) {
        response.flux = Flux.from(publisher);
        response.cf.complete(response);
    }

    @Override
    public void exceptionOccurred(Throwable error) {
        response.cf.completeExceptionally(error);
    }
}

class FluxResponse {
    final CompletableFuture<FluxResponse> cf = new CompletableFuture<>();
    GetObjectResponse sdkResponse;
    Flux<ByteBuffer> flux;
}
Does anybody know how to get an InputStream from an S3 object in reactive Java? I am using awssdk version 2.17.195.
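For reference, the rough direction I am considering is to aggregate the Flux<ByteBuffer> into memory and wrap the bytes in a ByteArrayInputStream. This is only a sketch: thumbnailFromStream is a placeholder for my existing thumbnail code, and buffering the whole object is only acceptable here because these are small images.

public Mono<ByteArrayOutputStream> createThumbnail(String fileKey, String imageFormat) {
    GetObjectRequest request = GetObjectRequest.builder()
            .bucket(s3config.getBucket())
            .key(fileKey)
            .build();
    return Mono.fromFuture(s3client.getObject(request, new FluxResponseProvider()))
            // collect the streamed ByteBuffers into a single byte array
            .flatMap(fluxResponse -> fluxResponse.flux
                    .reduce(new ByteArrayOutputStream(), (baos, buffer) -> {
                        byte[] chunk = new byte[buffer.remaining()];
                        buffer.get(chunk);
                        baos.write(chunk, 0, chunk.length);
                        return baos;
                    }))
            // wrap the bytes in an InputStream for the existing (blocking) thumbnail code
            .map(baos -> thumbnailFromStream(
                    new ByteArrayInputStream(baos.toByteArray()), imageFormat));
}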
I'm writing an API using the Vert.x router, and I need my API to return a list of JSON objects. Is that possible? Thanks in advance.
Yes, it's possible. Here is a full example with Vert.x's JSON abstraction:
import io.vertx.core.Vertx;
import io.vertx.core.json.JsonArray;
import io.vertx.core.json.JsonObject;
import io.vertx.ext.web.Router;

public class Main {
    public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();
        Router router = Router.router(vertx);
        router.get("/foo")
                .handler(ctx -> {
                    JsonArray jsonArray = new JsonArray();
                    jsonArray.add(new JsonObject());
                    jsonArray.add(new JsonObject());
                    ctx.response()
                            .setChunked(true)
                            .setStatusCode(200)
                            .end(jsonArray.toBuffer());
                });
        vertx
                .createHttpServer()
                .requestHandler(router)
                .listen(8080);
    }
}
In case you want to use your own library for JSON manipulation plus POJOs, you can work with strings instead when ending the request, e.g. with Jackson:
router.get("/foo")
.handler(ctx -> {
List<MyObject> myObjectList = new ArrayList<>();
ObjectMapper objectMapper = new ObjectMapper();
String myObjectListJsonStr = objectMapper.writeValueAsString(myObjectList);
ctx.response()
.setChunked(true)
.setStatusCode(200)
.end(myObjectListJsonStr);
});
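As a side note (an addition, not part of the original answer): Vert.x also bundles its own Jackson-backed helper, io.vertx.core.json.Json, which can encode a POJO list directly and saves you from managing an ObjectMapper yourself:

router.get("/foo")
        .handler(ctx -> {
            List<MyObject> myObjectList = new ArrayList<>();
            ctx.response()
                    .putHeader("content-type", "application/json")
                    .setStatusCode(200)
                    .end(Json.encode(myObjectList)); // io.vertx.core.json.Json
        });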
Use-case:
Developers (myself included) want to implement only a Protobuf (binary protocol) implementation. However, I need a way to add config so that the same implementation is exposed as a REST/JSON API as well, without code duplication.
I have proto endpoints exposed. I also want consumers to be able to post the JSON equivalent of those proto objects and receive the JSON equivalent of the results, with type info (POJO?). The type info helps with OpenAPI / Swagger documentation too!
What are the most elegant/simple ways to achieve that without code duplication?
Any example GitHub code that achieves that would be helpful.
Note: This is for WebFlux & Netty - no Tomcat.
ProtobufJsonFormatHttpMessageConverter works for Tomcat but does not work for Netty. A working example would be great.
I was messing around with this and ended up with the following; nothing else worked for me.
Using proto3 and a protobuf definition like this:
syntax = "proto3";
option java_package = "com.company";
option java_multiple_files = true;
message CreateThingRequest {
...
message CreateThingResponse {
....
I can scan for the generated protobuf classes by setting app.protoPath in my application.properties.
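For example, the application.properties entry would look something like this (com.company matches the java_package above; adjust it to your own package prefix):

app.protoPath=com.company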
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.google.common.reflect.ClassPath;
import com.google.protobuf.Message;
import com.google.protobuf.util.JsonFormat;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.codec.ServerCodecConfigurer;
import org.springframework.http.codec.json.Jackson2JsonDecoder;
import org.springframework.http.codec.json.Jackson2JsonEncoder;
import org.springframework.http.converter.json.Jackson2ObjectMapperBuilder;
import org.springframework.web.reactive.config.WebFluxConfigurer;

@Configuration
public class WebConfig implements WebFluxConfigurer {

    @Value("${app.protoPath:com.}")
    private String protoPath;

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        // Serialize any protobuf Message with JsonFormat instead of Jackson's default bean serializer.
        configurer.defaultCodecs().jackson2JsonEncoder(
                new Jackson2JsonEncoder(Jackson2ObjectMapperBuilder.json().serializerByType(
                        Message.class, new JsonSerializer<Message>() {
                            @Override
                            public void serialize(Message value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
                                String str = JsonFormat.printer().omittingInsignificantWhitespace().print(value);
                                gen.writeRawValue(str);
                            }
                        }
                ).build())
        );

        // Scan the classpath for generated Message classes and register a
        // JsonFormat-based deserializer for each of them.
        final ClassLoader loader = Thread.currentThread().getContextClassLoader();
        Map<Class<?>, JsonDeserializer<?>> deserializers = new HashMap<>();
        try {
            for (final ClassPath.ClassInfo info : ClassPath.from(loader).getTopLevelClasses()) {
                if (info.getName().startsWith(protoPath)) {
                    final Class<?> clazz = info.load();
                    if (!Message.class.isAssignableFrom(clazz)) {
                        continue;
                    }
                    @SuppressWarnings("unchecked")
                    final Class<Message> proto = (Class<Message>) clazz;
                    final JsonDeserializer<Message> deserializer = new CustomJsonDeserializer() {
                        @Override
                        public Class<Message> getDeserializeClass() {
                            return proto;
                        }
                    };
                    deserializers.put(proto, deserializer);
                }
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        configurer.defaultCodecs().jackson2JsonDecoder(
                new Jackson2JsonDecoder(Jackson2ObjectMapperBuilder.json().deserializersByType(deserializers).build()));
    }

    private abstract static class CustomJsonDeserializer extends JsonDeserializer<Message> {

        abstract Class<? extends Message> getDeserializeClass();

        @Override
        public Message deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
            Message.Builder builder;
            try {
                // Generated protobuf classes expose a static newBuilder() method.
                builder = (Message.Builder) getDeserializeClass()
                        .getDeclaredMethod("newBuilder")
                        .invoke(null);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
            JsonFormat.parser().merge(jp.getCodec().readTree(jp).toString(), builder);
            return builder.build();
        }
    }
}
Then I just use the proto object types in the signatures:
@PostMapping(
    path = "/things",
    consumes = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"},
    produces = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"})
Mono<CreateThingResponse> createThing(@RequestBody CreateThingRequest request);
With https://github.com/innogames/springfox-protobuf you can get the responses to show in Swagger, but the requests still aren't showing for me.
You'll have to excuse the messy Java; I'm a little rusty.
I needed to support JSON and the following code helped (ProtobufModule here comes from the jackson-datatype-protobuf library):
@Bean
public WebFluxConfigurer webFluxConfigurer() {
    return new WebFluxConfigurer() {
        @Override
        public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
            ObjectMapper mapper = new ObjectMapper()
                    .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
                    .configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
                    .registerModule(new ProtobufModule());
            configurer.customCodecs().register(new Jackson2JsonEncoder(mapper));
            configurer.customCodecs().register(new Jackson2JsonDecoder(mapper));
        }
    };
}
Try adding a ProtobufEncoder in your WebFlux config:
@EnableWebFlux
public class MyConfig implements WebFluxConfigurer {
    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.customCodecs().register(new ProtobufEncoder());
    }
}
Then in your request mapping return the proto object:
@GetMapping(produces = "application/x-protobuf")
public MyProtoObject lookup() {
    return MyProtoObject.newBuilder().build(); // generated proto messages are built via newBuilder()
}
Furthermore, if you want to serialize the proto object into JSON and return a String, then have a look at the com.googlecode.protobuf-java-format:protobuf-java-format library and its JsonFormat::printToString capability (https://code.google.com/archive/p/protobuf-java-format/):
@GetMapping
public String lookup() {
    return new JsonFormat().printToString(MyProtoObj.newBuilder().build());
}
Since version 4.1, Spring provides org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter for reading and writing protos as JSON.
However, if you are using Spring 5.x and Protobuf 3.x, there is org.springframework.http.converter.protobuf.ProtobufJsonFormatHttpMessageConverter for more explicit JSON conversion.
This documentation should help you:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufHttpMessageConverter.html
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufJsonFormatHttpMessageConverter.html
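For completeness, a minimal registration sketch for the Spring 5.x converter (this targets servlet-based Spring MVC; as the question notes, these converters do not plug into the WebFlux/Netty codec pipeline). The parser/printer settings shown are just one reasonable choice:

@Bean
public ProtobufJsonFormatHttpMessageConverter protobufJsonConverter() {
    // JsonFormat.Parser and JsonFormat.Printer come from protobuf-java-util
    return new ProtobufJsonFormatHttpMessageConverter(
            JsonFormat.parser().ignoringUnknownFields(),
            JsonFormat.printer().omittingInsignificantWhitespace());
}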
I'm exploring reactive programming with Spring WebFlux, and therefore I'm trying to make my code completely nonblocking to get all the benefits of a reactive application.
Currently, my code for parsing a JSON string into a JsonNode to get specific values (in this case the elementId) looks like this:
public Mono<String> readElementIdFromJsonString(String jsonString) {
    final JsonNode jsonNode;
    try {
        jsonNode = MAPPER.readTree(jsonString);
    } catch (IOException e) {
        return Mono.error(e);
    }
    final String elementId = jsonNode.get("elementId").asText();
    return Mono.just(elementId);
}
However, IntelliJ notifies me that I'm using an inappropriate blocking method call with this code:
MAPPER.readTree(jsonString);
How can I implement this code in a nonblocking way? I have seen that since Jackson 2.9+ it is possible to parse a JSON string in a nonblocking, async way, but I don't know how to use that API and I couldn't find an example of how to do it correctly.
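For reference, what I have pieced together about the Jackson 2.9+ non-blocking API so far is sketched below; the feeder/token loop is my reading of the API, and how to wire this into a Mono is exactly the part I am unsure about:

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.core.async.ByteArrayFeeder;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Sketch of Jackson's non-blocking parser (ByteArrayFeeder variant, Jackson 2.9+).
static String readElementId(String jsonString) throws IOException {
    JsonFactory factory = new JsonFactory();
    JsonParser parser = factory.createNonBlockingByteArrayParser();
    ByteArrayFeeder feeder = (ByteArrayFeeder) parser.getNonBlockingInputFeeder();

    byte[] bytes = jsonString.getBytes(StandardCharsets.UTF_8);
    feeder.feedInput(bytes, 0, bytes.length); // hand the parser its input chunk
    feeder.endOfInput();                      // signal that no more input is coming

    JsonToken token;
    while ((token = parser.nextToken()) != null && token != JsonToken.NOT_AVAILABLE) {
        if (token == JsonToken.FIELD_NAME && "elementId".equals(parser.getCurrentName())) {
            parser.nextToken();
            return parser.getText();
        }
    }
    return null;
}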
I am not sure why it is saying it is a blocking call, since Jackson is non-blocking as far as I know. Anyway, one way to resolve this issue without pulling in any other library is to offload the parse to a scheduler, like this:
public Mono<String> readElementIdFromJsonString(String input) {
    // fromCallable defers the parse until subscription and turns the
    // checked IOException into an onError signal.
    return Mono.fromCallable(() -> MAPPER.readTree(input))
            .map(it -> it.get("elementId").asText())
            .subscribeOn(Schedulers.boundedElastic());
}
Something along that line.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import org.springframework.core.ResolvableType;
import org.springframework.core.env.Environment;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBuffer;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import org.springframework.http.codec.json.AbstractJackson2Decoder;
import org.springframework.util.MimeType;
import org.springframework.util.MimeTypeUtils;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

@FunctionalInterface
public interface MessageParser<T> {
    Mono<T> parse(String message);
}

public class JsonNodeParser extends AbstractJackson2Decoder implements MessageParser<JsonNode> {

    private static final MimeType MIME_TYPE = MimeTypeUtils.APPLICATION_JSON;
    private static final ObjectMapper OBJECT_MAPPER = allocateDefaultObjectMapper(); // helper from the original, not shown

    private final DefaultDataBufferFactory factory;
    private final ResolvableType resolvableType;

    public JsonNodeParser(final Environment env) {
        super(OBJECT_MAPPER, MIME_TYPE);
        this.factory = new DefaultDataBufferFactory();
        this.resolvableType = ResolvableType.forClass(JsonNode.class);
        this.setMaxInMemorySize(100_000); // ~100 KB
        canDecodeJsonNode();
    }

    @Override
    public Mono<JsonNode> parse(final String message) {
        final byte[] bytes = message.getBytes(StandardCharsets.UTF_8);
        return decode(bytes);
    }

    private Mono<JsonNode> decode(final byte[] bytes) {
        final DefaultDataBuffer defaultDataBuffer = this.factory.wrap(bytes);
        return this.decodeToMono(Mono.just(defaultDataBuffer), this.resolvableType, MIME_TYPE, Map.of())
                .ofType(JsonNode.class)
                .subscribeOn(Schedulers.boundedElastic())
                .doFinally((t) -> DataBufferUtils.release(defaultDataBuffer));
    }

    private void canDecodeJsonNode() {
        if (!canDecode(this.resolvableType, MIME_TYPE)) {
            throw new IllegalStateException(String.format(
                    "JsonNodeParser doesn't support the given target element type [%s] and the MIME type [%s]",
                    this.resolvableType, MIME_TYPE));
        }
    }
}
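A hypothetical usage of the parser above (env is an injected Spring Environment; the constructor accepts it but the body never uses it):

JsonNodeParser parser = new JsonNodeParser(env);
parser.parse("{\"elementId\":\"abc-123\"}")
        .map(node -> node.get("elementId").asText())
        .subscribe(System.out::println); // prints abc-123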
I am currently building an API gateway for a new microservices system, using the Spring Netflix Zuul library.
So far my gateway contains PRE and POST filters that intercept the requests and perform the required logic, etc.
One thing that I see is that REST calls to specific microservices require invoking an API endpoint (either GET or POST) with a very complex JSON payload.
For an end-user, sending a request to a microservice containing this JSON would not be user-friendly.
I had an idea that the API gateway could act as a mediator: the user submits a simplified, more user-friendly JSON to the API gateway, which transforms the payload into the complex JSON structure that the target microservice expects, so it can handle the request efficiently.
My understanding of Netflix Zuul is that this can be done by creating a RouteFilter and including this logic there.
Can anyone explain if (or how) this transformation could be done using Netflix Zuul?
Any advice is appreciated.
Thanks.
No doubt you can do it with Zuul; I am currently trying to do almost the same. I'd suggest you take a look at this repo:
sample-zuul-filters
and the official doc on GitHub.
Filters have to extend ZuulFilter and implement the following methods:
/**
 * Return a string defining when your filter must execute during Zuul's
 * lifecycle ('pre'/'post' routing).
 **/
@Override
public String filterType() {
    return "pre"; // run this filter before sending the final request
}

/**
 * Return an int describing the order that the filter should run in,
 * relative to the other filters and the current 'pre' or 'post' context.
 **/
@Override
public int filterOrder() {
    return 1; // this filter runs first in a pre-request context
}

/**
 * Return a boolean indicating whether the filter should run or not.
 **/
@Override
public boolean shouldFilter() {
    RequestContext ctx = RequestContext.getCurrentContext();
    return ctx.getRequest().getRequestURI().equals("/theRouteIWantToFilter");
}

/**
 * After all the config stuff you can set what your filter actually does
 * here. This is where your JSON logic goes.
 */
@Override
public Object run() {
    try {
        RequestContext ctx = RequestContext.getCurrentContext();
        HttpServletRequest request = ctx.getRequest();
        InputStream stream = ctx.getResponseDataStream();
        String body = StreamUtils.copyToString(stream, Charset.forName("UTF-8"));
        // transform your JSON and send it to the API
        ctx.setResponseBody("Modified body: " + body);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
I am not sure my answer is 100% accurate since I am still working on it, but it's a start.
I've done payload conversion in a pre filter, but this should work in a route filter as well. Use com.netflix.zuul.http.HttpServletRequestWrapper to capture and modify the original request payload before forwarding the request to the target microservice.
Sample code:
package com.sample.zuul.filters.pre;

import com.google.common.io.CharStreams;
import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;
import com.netflix.zuul.http.HttpServletRequestWrapper;
import com.netflix.zuul.http.ServletInputStreamWrapper;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

import javax.servlet.ServletInputStream;
import javax.servlet.http.HttpServletRequest;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStreamReader;

public class JsonConverterFilter extends ZuulFilter {

    @Override
    public String filterType() {
        return "pre";
    }

    @Override
    public int filterOrder() {
        return 0; // set it to whatever the order of your filter is
    }

    @Override
    public boolean shouldFilter() {
        return true;
    }

    @Override
    public Object run() {
        RequestContext context = RequestContext.getCurrentContext();
        HttpServletRequest request = new HttpServletRequestWrapper(context.getRequest());

        String requestData = null;
        JSONParser jsonParser = new JSONParser();
        JSONObject requestJson = null;
        try {
            if (request.getContentLength() > 0) {
                requestData = CharStreams.toString(request.getReader());
            }
            if (requestData == null) {
                return null;
            }
            requestJson = (JSONObject) jsonParser.parse(requestData);
        } catch (Exception e) {
            // Add your exception handling code here
        }

        JSONObject modifiedRequest = modifyJSONRequest(requestJson);
        final byte[] newRequestDataBytes = modifiedRequest.toJSONString().getBytes();
        request = getUpdatedHttpServletRequest(request, newRequestDataBytes);
        context.setRequest(request);
        return null;
    }

    private JSONObject modifyJSONRequest(JSONObject requestJSON) {
        JSONObject jsonObjectDecryptedPayload = null;
        try {
            jsonObjectDecryptedPayload = (JSONObject) new JSONParser()
                    .parse("Your new complex json");
        } catch (ParseException e) {
            e.printStackTrace();
        }
        return jsonObjectDecryptedPayload;
    }

    private HttpServletRequest getUpdatedHttpServletRequest(HttpServletRequest request, final byte[] newRequestDataBytes) {
        request = new javax.servlet.http.HttpServletRequestWrapper(request) {
            @Override
            public BufferedReader getReader() throws IOException {
                return new BufferedReader(
                        new InputStreamReader(new ByteArrayInputStream(newRequestDataBytes)));
            }

            @Override
            public ServletInputStream getInputStream() throws IOException {
                return new ServletInputStreamWrapper(newRequestDataBytes);
            }

            /*
             * Force any calls to HttpServletRequest.getContentLength to return the
             * accurate length of bytes from the modified request.
             */
            @Override
            public int getContentLength() {
                return newRequestDataBytes.length;
            }
        };
        return request;
    }
}
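One detail both answers leave implicit: Zuul only runs filters that are registered as Spring beans. A minimal sketch for the JsonConverterFilter above (the class name FilterConfig is arbitrary):

@Configuration
public class FilterConfig {
    @Bean
    public JsonConverterFilter jsonConverterFilter() {
        return new JsonConverterFilter();
    }
}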
I am working in AEM, trying to create txt files with JSON output so that I can load them into my unit tests as strings and test my model / model processors. So far I have this...
public String readFile(String path, Charset encoding) throws IOException {
    byte[] encoded = Files.readAllBytes(Paths.get(path));
    return new String(encoded, encoding);
}

private String sampleInput = readFile("/test/resources/map/sample-input.txt", Charset.forName("UTF-8"));
I need sampleInput to take the json that is in 'sampleInput.txt' and convert it to a string. I am also running into issues with the Charset encoding.
I think the easiest way to manage JSON documents you use for unit testing is by keeping them organized in the classpath. Guava provides a neat wrapper for loading classpath resources.
import com.google.common.base.Charsets;
import com.google.common.io.Resources;

import java.io.IOException;
import java.net.URL;

public class TestJsonDocumentLoader {

    private final Class clazz; // the test class whose package the JSON files live in

    public TestJsonDocumentLoader(Class clazz) {
        this.clazz = clazz;
    }

    public String loadTestJson(String fileName) {
        URL url = Resources.getResource(clazz, fileName);
        try {
            return Resources.toString(url, Charsets.UTF_8);
        } catch (IOException e) {
            throw new RuntimeException("Couldn't load a JSON file.", e);
        }
    }
}
This can then be used to load arbitrary JSON files placed in the same package as the test class. It is assumed that the files are UTF-8 encoded. I suggest keeping all sources encoded that way, regardless of the OS your team is using. It saves you a lot of trouble with version control.
Let's say you have MyTest in src/test/java/com/example/mytestsuite; then you could place a file data.json in src/test/resources/com/example/mytestsuite and load it by calling
TestJsonDocumentLoader loader = new TestJsonDocumentLoader(MyTest.class);
String jsonData = loader.loadTestJson("data.json");
String someOtherExample = loader.loadTestJson("other.json");
Actually, this could be used for all sorts of text files.
You could also use Jackson's ObjectMapper as an alternative:
public class JsonResourceObjectMapper<T> {

    private Class<T> model;

    public JsonResourceObjectMapper(Class<T> model) {
        this.model = model;
    }

    public T loadTestJson(String fileName) throws IOException {
        ClassLoader classLoader = this.getClass().getClassLoader();
        InputStream inputStream = classLoader.getResourceAsStream(fileName);
        return new ObjectMapper().readValue(inputStream, this.model);
    }
}
And then set up a fixture in the test, passing a .class:
private JsonClass json;

@Before
public void setUp() throws IOException {
    JsonResourceObjectMapper<JsonClass> mapper = new JsonResourceObjectMapper<>(JsonClass.class);
    json = mapper.loadTestJson("json/testJson.json");
}
Note that the testJson.json file is in the resources/json folder, the same as what @toniedzwiedz mentioned.
So then you could use the json model as:
@Test
public void testJsonNameProperty() {
    // act
    String name = json.getName();
    // assert
    assertEquals("testName", name);
}