Simple Camel CXF-RS consumer that consumes JSON and creates a map

I am struggling with a simple task. I want to create a CXF-RS consumer that simply consumes JSON.
The JSON should be converted to a simple map (key -> value). I created a simple test:
@Test
public final void test() throws Exception {
    MockEndpoint mockOut = context.getEndpoint(MOCK_OUT, MockEndpoint.class);
    mockOut.expectedMessageCount(1);
    context.addRoutes(createRouteBuilder());
    context.start();
    context.createProducerTemplate().sendBody(DIRECT_A, "{ \"ussdCode\":\"101#\",\"msisdn\":\"491234567\"}");
    mockOut.assertIsSatisfied();
}
private RouteBuilder createRouteBuilder() throws Exception {
    return new RouteBuilder() {
        @Override
        public void configure() throws Exception {
            from(DIRECT_A).to("cxfrs://http://localhost:8085/ussd");
            from("cxfrs://http://localhost:8085/ussd")
                .unmarshal().json(JsonLibrary.Jackson)
                .process(to).to(MOCK_OUT);
        }
    };
}
The problem is that on context.start() I get a ServiceConstructionException: No resource classes found. I also tried to create the consumer this way (setting the binding style):
private Endpoint fromCxfRsEndpoint() {
    CxfRsEndpoint cxfRsEndpoint = context.getEndpoint("cxfrs://http://localhost:8085/ussd", CxfRsEndpoint.class);
    cxfRsEndpoint.setBindingStyle(BindingStyle.SimpleConsumer);
    return cxfRsEndpoint;
}
This didn't help either. So how do I create a simple REST/JSON consumer and unmarshal the payload to a simple map?
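For what it's worth: a camel-cxfrs consumer needs a JAX-RS resource class that models the service, supplied via the resourceClasses endpoint option, and a missing one is typically what triggers "No resource classes found". A minimal sketch, where UssdResource is a hypothetical class (its name, path, and method are illustrative, not from the original post):

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;

// Hypothetical resource class: with bindingStyle=SimpleConsumer, CXF only
// uses it to describe the service; the Camel route does the actual work.
@Path("/")
public class UssdResource {
    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public void handle(String body) {
        // never invoked; the route consumes the request
    }
}

The consumer route would then reference it and unmarshal straight to a Map:

from("cxfrs://http://localhost:8085/ussd"
        + "?resourceClasses=com.example.UssdResource"
        + "&bindingStyle=SimpleConsumer")
    .unmarshal().json(JsonLibrary.Jackson, Map.class)
    .to(MOCK_OUT);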

Related

How to test KeyedBroadcastProcessFunction in Flink?

I am new to Flink and I am trying to write JUnit test cases to test a KeyedBroadcastProcessFunction. Below is my code. I am currently calling the getDataStreamOutput method in a TestUtils class, passing input data and pattern rules to the method. Once the input data is evaluated against the list of pattern rules and the input data satisfies the condition, I get the signal, call the sink function, and return the output data as a string from getDataStreamOutput.
@Test
public void testCompareInputAndOutputDataForInputSignal() throws Exception {
    Assertions.assertEquals(sampleInputSignal,
            TestUtils.getDataStreamOutput(
                    inputSignal,
                    patternRules));
}
public static String getDataStreamOutput(JSONObject input, Map<String, String> patternRules) throws Exception {
    env.setParallelism(1);
    DataStream<JSONObject> inputSignal = env.fromElements(input);
    DataStream<Map<String, String>> rawPatternStream =
            env.fromElements(patternRules);
    // Generate key/value pairs of patterns where the key is the pattern name
    // and the value is the pattern condition
    DataStream<Tuple2<String, Map<String, String>>> patternRuleStream =
            rawPatternStream.flatMap(new FlatMapFunction<Map<String, String>,
                    Tuple2<String, Map<String, String>>>() {
                @Override
                public void flatMap(Map<String, String> patternRules,
                        Collector<Tuple2<String, Map<String, String>>> out) throws Exception {
                    for (Map.Entry<String, String> stringEntry : patternRules.entrySet()) {
                        JSONObject jsonObject = new JSONObject(stringEntry.getValue());
                        Map<String, String> map = new HashMap<>();
                        for (String key : jsonObject.keySet()) {
                            String value = jsonObject.get(key).toString();
                            map.put(key, value);
                        }
                        out.collect(new Tuple2<>(stringEntry.getKey(), map));
                    }
                }
            });
    // patternRuleDescriptor is a MapStateDescriptor assumed to be defined elsewhere;
    // the stream created above is patternRuleStream (the original said patternStream)
    BroadcastStream<Tuple2<String, Map<String, String>>> patternRuleBroadcast =
            patternRuleStream.broadcast(patternRuleDescriptor);
    DataStream<Tuple2<String, JSONObject>> validSignal = inputSignal.map(new MapFunction<JSONObject,
            Tuple2<String, JSONObject>>() {
        @Override
        public Tuple2<String, JSONObject> map(JSONObject inputSignal) throws Exception {
            // org.json.JSONObject has no getSource(); read the field by name instead
            String source = inputSignal.getString("source");
            return new Tuple2<>(source, inputSignal);
        }
    }).keyBy(0).connect(patternRuleBroadcast).process(new MyKeyedBroadCastProcessFunction());
    validSignal.map(new MapFunction<Tuple2<String, JSONObject>, JSONObject>() {
        @Override
        public JSONObject map(Tuple2<String, JSONObject> inputSignal) throws Exception {
            return inputSignal.f1;
        }
    }).addSink(new getDataStreamOutput());
    env.execute("TestFlink");
    return getDataStreamOutput.dataStreamOutput;
}
#SuppressWarnings("serial")
public static final class getDataStreamOutput implements SinkFunction<JSONObject> {
public static String dataStreamOutput;
public void invoke(JSONObject inputSignal) throws Exception {
dataStreamOutput = inputSignal.toString();
}
}
I need to test different inputs with the same broadcast rules, but each time I call this function it does the whole process from the beginning: take the input signal, broadcast the data, and so on. Is there a way I can broadcast once and keep sending inputs to the method? I explored using a CoFlatMapFunction, something like below, to combine the data streams and keep sending the input rules while the method is running. But for this, one of the data streams has to keep getting data from a Kafka topic, which would again overburden the method with loading Kafka utils and a server.
DataStream<JSONObject> inputSignalFromKafka = env.addSource(inputSignalKafka);
DataStream<org.json.JSONObject> inputSignalFromMethod = env.fromElements(inputSignal);
DataStream<JSONObject> inputSignal = inputSignalFromMethod.connect(inputSignalFromKafka)
        .flatMap(new SignalCoFlatMapper());
public static class SignalCoFlatMapper
        implements CoFlatMapFunction<JSONObject, JSONObject, JSONObject> {

    @Override
    public void flatMap1(JSONObject inputValue, Collector<JSONObject> out) throws Exception {
        out.collect(inputValue);
    }

    @Override
    public void flatMap2(JSONObject kafkaValue, Collector<JSONObject> out) throws Exception {
        out.collect(kafkaValue);
    }
}
I found a link on Stack Overflow, How to unit test BroadcastProcessFunction in flink when processElement depends on broadcasted data, but it confused me a lot.
Is there any way I can broadcast only once in the @Before method of my test cases and keep sending different kinds of data to my broadcast function?
You can use KeyedTwoInputStreamOperatorTestHarness in order to achieve this. For example, let's assume you have the following KeyedBroadcastProcessFunction, where you define some business logic for both DataStream channels:
public class SimpleKeyedBroadcastProcessFunction extends KeyedBroadcastProcessFunction<String, String, String, String> {

    @Override
    public void processElement(String inputEntry,
            ReadOnlyContext readOnlyContext, Collector<String> collector) throws Exception {
        // business logic for how you want to process your data stream records
    }

    @Override
    public void processBroadcastElement(String broadcastInput, Context context,
            Collector<String> collector) throws Exception {
        // process input from your broadcast channel
    }
}
Let's now assume your process function is stateful and makes modifications to the Flink internal state. In that case, you have to create a TestHarness inside your test class to ensure you are able to keep track of the state during testing.
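One thing to note: the harness below references a BROADCAST_MAP_STATE_DESCRIPTOR constant that is not shown. A plausible definition, matching the String-keyed, String-valued broadcast state used here (the state name is an assumption):

// Assumed definition; the descriptor name "broadcast-state" is arbitrary
public static final MapStateDescriptor<String, String> BROADCAST_MAP_STATE_DESCRIPTOR =
        new MapStateDescriptor<>(
                "broadcast-state",
                BasicTypeInfo.STRING_TYPE_INFO,
                BasicTypeInfo.STRING_TYPE_INFO);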
I would then create some unit tests using the following approach:
public class SimpleKeyedBroadcastProcessFunctionTest {

    private SimpleKeyedBroadcastProcessFunction processFunction;
    private KeyedTwoInputStreamOperatorTestHarness<String, String, String, String> testHarness;

    @Before
    public void setup() throws Exception {
        processFunction = new SimpleKeyedBroadcastProcessFunction();
        testHarness = new KeyedTwoInputStreamOperatorTestHarness<>(
                new CoBroadcastWithKeyedOperator<>(processFunction, ImmutableList.of(BROADCAST_MAP_STATE_DESCRIPTOR)),
                (KeySelector<String, String>) string -> string,
                (KeySelector<String, String>) string -> string,
                TypeInformation.of(String.class));
        testHarness.setup();
        testHarness.open();
    }

    @After
    public void cleanup() throws Exception {
        testHarness.close();
    }
    @Test
    public void testProcessRegularInput() throws Exception {
        // processElement1 sends elements into your regular stream; the second param is the event time of the record
        testHarness.processElement1(new StreamRecord<>("Hello", 0));
        // Access records collected during processElement
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello", records.get(0).getValue());
    }
    @Test
    public void testProcessBroadcastInput() throws Exception {
        // processElement2 sends elements into your broadcast stream; the second param is the event time of the record
        testHarness.processElement2(new StreamRecord<>("Hello from Broadcast", 0));
        // Access records collected during processElement
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello from Broadcast", records.get(0).getValue());
    }
}
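To broadcast only once, as the question asks, the broadcast record can be pushed in the @Before method right after the harness is opened; each @Test then only feeds regular elements. A sketch under the same assumptions as above:

@Before
public void setup() throws Exception {
    // ... create testHarness exactly as shown above ...
    testHarness.setup();
    testHarness.open();
    // Broadcast the rules a single time; every @Test afterwards only
    // sends regular input via processElement1
    testHarness.processElement2(new StreamRecord<>("patternRules", 0));
}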

Is it possible to pass a java.util.Stream to Gson?

I'm currently working on a project where I need to fetch a large amount of data from the database and parse it into a specific JSON format. I have already built my custom serializers, and everything works properly when I pass a List to Gson. But since I was already working with Streams from my JPA layer, I thought I could pass the Stream down to the Gson parser so that it could transform it directly into my JSON data. Instead, I'm getting an empty JSON object rather than a correctly populated one.
So, could anyone point me to a way to make Gson work with Java 8 Streams, or tell me if this isn't currently possible? I could not find anything on Google, so I came to Stack Overflow.
You could use JsonWriter to stream your data to an output stream:
public void writeJsonStream(OutputStream out, Stream<DataObject> data) throws IOException {
    try (JsonWriter writer = new JsonWriter(new OutputStreamWriter(out, "UTF-8"))) {
        writer.setIndent("  ");
        writer.beginArray();
        data.forEach(d -> {
            try {
                writer.beginObject();
                writer.name("yourField").value(d.getYourField());
                // ... write the remaining fields the same way
                writer.endObject();
            } catch (IOException e) {
                // forEach lambdas cannot throw checked exceptions
                throw new UncheckedIOException(e);
            }
        });
        writer.endArray();
    }
}
Note that you're in charge of controlling the JSON structure.
That is, if your DataObject contains a nested object, you have to write the matching beginObject()/endObject() calls yourself. The same goes for nested arrays.
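For example, a nested object gets its own begin/end pair. Here getAddress() and getCity() are hypothetical getters, not part of the original code:

// Hypothetical helper for a nested object inside each array element
private static void writeAddress(JsonWriter writer, DataObject d) throws IOException {
    writer.name("address").beginObject();
    writer.name("city").value(d.getAddress().getCity());
    writer.endObject();
}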
It is not as trivial as one would expect, but it can be done in a generic way.
When you look into the Javadoc of TypeAdapterFactory, it shows a very simplistic way of writing a TypeAdapterFactory for a custom type. Alas, that does not work as expected because of problems with element type detection. The proper way to do this can be found in the Gson-internal CollectionTypeAdapterFactory. It is quite complex, but taking only what's necessary, one can come up with something like this:
final class StreamTypeAdapterFactory implements TypeAdapterFactory {

    @SuppressWarnings("unchecked")
    @Override
    public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
        Type type = typeToken.getType();
        Class<? super T> rawType = typeToken.getRawType();
        if (!Stream.class.isAssignableFrom(rawType)) {
            return null;
        }
        Type elementType = ExtraGsonTypes.getStreamElementType(type, rawType);
        TypeAdapter<?> elementAdapter = gson.getAdapter(TypeToken.get(elementType));
        return (TypeAdapter<T>) new StreamTypeAdapter<>(elementAdapter);
    }

    private static class StreamTypeAdapter<E> extends TypeAdapter<Stream<E>> {
        private final TypeAdapter<E> elementAdapter;

        StreamTypeAdapter(TypeAdapter<E> elementAdapter) {
            this.elementAdapter = elementAdapter;
        }

        @Override
        public void write(JsonWriter out, Stream<E> value) throws IOException {
            out.beginArray();
            for (E element : iterable(value)) {
                elementAdapter.write(out, element);
            }
            out.endArray();
        }

        @Override
        public Stream<E> read(JsonReader in) throws IOException {
            Stream.Builder<E> builder = Stream.builder();
            in.beginArray();
            while (in.hasNext()) {
                builder.add(elementAdapter.read(in));
            }
            in.endArray();
            return builder.build();
        }
    }

    private static <T> Iterable<T> iterable(Stream<T> stream) {
        return stream::iterator;
    }
}
ExtraGsonTypes is a special class that I used to circumvent package-private access to the $Gson$Types.getSupertype method. It's a hack that works if you're not using JDK 9 modules: you simply place this class in the same package as $Gson$Types:
package com.google.gson.internal;

import java.lang.reflect.*;
import java.util.stream.Stream;

public final class ExtraGsonTypes {

    public static Type getStreamElementType(Type context, Class<?> contextRawType) {
        return getContainerElementType(context, contextRawType, Stream.class);
    }

    private static Type getContainerElementType(Type context, Class<?> contextRawType, Class<?> containerSupertype) {
        Type containerType = $Gson$Types.getSupertype(context, contextRawType, containerSupertype);
        if (containerType instanceof WildcardType) {
            containerType = ((WildcardType) containerType).getUpperBounds()[0];
        }
        if (containerType instanceof ParameterizedType) {
            return ((ParameterizedType) containerType).getActualTypeArguments()[0];
        }
        return Object.class;
    }
}
(I filed an issue about that on GitHub.)
You use it in the following way:
Gson gson = new GsonBuilder()
        .registerTypeAdapterFactory(new StreamTypeAdapterFactory())
        .create();
System.out.println(gson.toJson(Stream.of(1, 2, 3)));
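With the factory registered, this should print the JSON array [1,2,3].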

Camel - CSV Headers setting not working

I have CSV files without headers. Since I'm using 'useMaps', I want to specify the headers dynamically. If I set the headers statically and then use them in the route, it works fine, as below.
Approach 1 -
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);
        ArrayList<String> list = new ArrayList<String>();
        list.add("DeviceName");
        list.add("Brand");
        list.add("status");
        list.add("type");
        list.add("features_c");
        list.add("battery_c");
        list.add("colors");
        csv.setHeader(list);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
However, if the list is passed via a Camel header, as below, then it does not work.
Approach 2 -
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    ArrayList<String> fileHeaders =
                        (ArrayList<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);
                    if (fileHeaders != null && fileHeaders.size() > 0) {
                        csv.setHeader(fileHeaders);
                    }
                }
            })
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
What could be missing in Approach 2?
The big difference between approach 1 and 2 is the scope.
In approach 1 you fully configure the CSV data format. This is all done when the Camel Context is created, since the data format is shared within the Camel Context. When messages are processed, it is the same config for all messages.
In approach 2 you just configure the basics globally. The header configuration is within the route and therefore can change for every single message. Every message would overwrite the header configuration of the context-global data format instance.
Without being sure about this, I guess that it is not possible to change a context-global DataFormat from inside the routes.
What would you expect (just as an example) when messages are processed in parallel? They would overwrite each other's header configuration.
As an alternative, you could use a POJO where you do your dynamic marshalling/unmarshalling from Java code, as in the sketch below.
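Another option in that direction: create a fresh CsvDataFormat per message inside a processor and call its unmarshal method directly, so no context-global instance is ever mutated. A rough, untested sketch, assuming Constants.FILE_HEADER_LIST carries an ArrayList<String>:

from("direct:bulkImport")
    .convertBodyTo(String.class)
    .process(exchange -> {
        // Fresh data format per exchange, so parallel messages cannot
        // overwrite each other's header configuration
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);
        @SuppressWarnings("unchecked")
        ArrayList<String> fileHeaders =
                (ArrayList<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);
        if (fileHeaders != null && !fileHeaders.isEmpty()) {
            csv.setHeader(fileHeaders);
        }
        InputStream body = new ByteArrayInputStream(
                exchange.getIn().getBody(String.class).getBytes(StandardCharsets.UTF_8));
        exchange.getIn().setBody(csv.unmarshal(exchange, body));
    })
    .split(body()).streaming()
    // ... per-record processing as before ...
    ;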

JAX-RS Exception Mapper not working in Grizzly container

I'm working on a Jersey web application with a team. As the project got bigger and bigger, we decided to switch from Tomcat to Grizzly to allow deploying parts of the project on different port numbers. What I've found out now is that our custom exception handling fails to work; instead I always get the Grizzly HTML error page.
Example exception:
public class DataNotFoundException extends RuntimeException {

    private static final long serialVersionUID = -1622261264080480479L;

    public DataNotFoundException(String message) {
        super(message);
        System.out.println("exception constructor called"); // this prints
    }
}
Mapper:
@Provider
public class DataNotFoundExceptionMapper implements ExceptionMapper<DataNotFoundException> {

    public DataNotFoundExceptionMapper() {
        System.out.println("mapper constructor called"); // doesn't print
    }

    @Override
    public Response toResponse(DataNotFoundException ex) {
        System.out.println("toResponse called"); // doesn't print
        // ErrorMessage is a simple POJO with 2 String fields and 1 int field
        ErrorMessage errorMessage = new ErrorMessage(ex.getMessage(), 404, "No documentation yet.");
        return Response.status(Status.NOT_FOUND)
                .entity(errorMessage)
                .build();
    }
}
I'm not sure where the source of the problem is; I can provide more information/code if needed. What's the problem, and what can I try?
EDIT:
Main.class:
public class Main {

    /**
     * Main method.
     * @param args
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        ...
        List<ServerInfo> serverList = new ArrayList<ServerInfo>();
        serverList.add(new ServerInfo(
                "api", 8450,
                new ResourceConfig().registerClasses(
                        the.package.was.here.ApiResource.class)
        ));
        for (ServerInfo server : serverList) {
            server.start();
        }
        System.out.println("Press enter to exit...");
        System.in.read();
        for (ServerInfo server : serverList) {
            server.stop();
        }
    }
}
EDIT2:
Based on this question, I've tried setting the ServerProperties.RESPONSE_SET_STATUS_OVER_SEND_ERROR, "true" property, which only helped a little. I still get the Grizzly HTML page when the exception happens, but now I see my exception (+ stack trace) in the body of the page.
You're only registering one resource class for the entire application:
new ResourceConfig().registerClasses(
        eu.arrowhead.core.api.ApiResource.class)
The mapper needs to be registered as well:
new ResourceConfig().registerClasses(
        eu.arrowhead.core.api.ApiResource.class,
        YourMapper.class)
You can also use package scanning, which will pick up all classes and automatically register them if they are annotated with @Path or @Provider:
new ResourceConfig().packages("the.packages.to.scan")

Jersey + Jackson + arbitrary json

I am using Jersey + Jackson + Guice for my webapp. Now I wanted to implement a simple REST call for my client where I receive arbitrary JSON data on the server, but every time I get the following exception:
org.codehaus.jackson.map.exc.UnrecognizedPropertyException: Unrecognized field "validTo" (Class org.codehaus.jettison.json.JSONObject), not marked as ignorable| at [Source: org.eclipse.jetty.server.HttpConnection$Input@1cafa346; line: 1, column: 25] (through reference chain: org.codehaus.jettison.json.JSONObject["validTo"])
My method signature looks like the following:
@Override
@POST
@Consumes(MediaType.APPLICATION_JSON)
public void post(JSONObject json) throws JSONException {
}
My Guice config:
return Guice.createInjector(new TTShiroModule(this.servletContext), ShiroWebModule.guiceFilterModule(),
        new ServiceModule(), new JerseyServletModule() {
    @Override
    protected void configureServlets() {
        bind(GuiceContainer.class);
        bind(MessageBodyReader.class).to(JacksonJsonProvider.class);
        bind(MessageBodyWriter.class).to(JacksonJsonProvider.class);
        serve("/rest/*").with(GuiceContainer.class, params);
    }

    @Provides
    @Singleton
    ObjectMapper objectMapper() {
        final ObjectMapper mapper = new ObjectMapper();
        return mapper;
    }

    @Provides
    @Singleton
    JacksonJsonProvider jacksonJsonProvider(ObjectMapper mapper) {
        return new JacksonJsonProvider(mapper);
    }
});
I searched for this exception for a long time but couldn't find any help. I also tried different approaches but wasn't able to resolve the issue.
Can anyone help me? If you need more information, please let me know!
Best regards.
Jersey won't automatically unwrap the JSON string to a JSONObject on its own, but you could easily do it as follows:
@Override
@POST
@Consumes(MediaType.APPLICATION_JSON)
public void post(String json) throws JSONException {
    JSONObject object = new JSONObject(json);
    // do things with object
}