Apache Flink: Could not extract key from ObjectNode::get - json

I'm using Flink to process data coming from a data source (such as Kafka, Pravega, etc.).
In my case, the data source is Pravega, which provides a Flink connector.
My data source is sending me some JSON data as below:
{"device":"rand-numeric","id":"b4728895-741f-466a-b87b-79c7590893b4","origin":"1591095418904441036","readings":[{"origin":"1591095418904328442","valueType":"Int64","name":"int","device":"rand-numeric","value":"0"}]}
Here is my piece of code:
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;

PravegaDeserializationSchema<ObjectNode> adapter = new PravegaDeserializationSchema<>(ObjectNode.class, new JavaSerializer<>());
FlinkPravegaReader<ObjectNode> source = FlinkPravegaReader.<ObjectNode>builder()
        .withPravegaConfig(pravegaConfig)
        .forStream(stream)
        .withDeserializationSchema(adapter)
        .build();
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<ObjectNode> dataStream = env.addSource(source).name("Pravega Stream");
dataStream.keyBy(new KeySelector<ObjectNode, String>() {
    @Override
    public String getKey(ObjectNode node) throws Exception {
        return node.get("id").asText();
    }
}).print();
env.execute("StreamingJob");
As you can see, I use FlinkPravegaReader with what I thought was a proper deserialization schema to get the JSON stream coming from Pravega.
Then I try to key the stream with a custom KeySelector and print it.
However, I get an error:
Caused by: java.lang.RuntimeException: Could not extract key from
{"device":"rand-numeric","id":"b4728895-741f-466a-b87b-79c7590893b4","origin":"1591095418904441036","readings":[{"origin":"1591095418904328442","valueType":"Int64","name":"int","device":"rand-numeric","value":"0"}]}
It seems that node.get("id").asText() threw this exception.
I don't understand why: as you can see, a key named id does exist in the JSON data. Why can't it be extracted? Have I used the ObjectNode class wrongly, or is there some other reason?
Stack-trace:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: fa9846e6834ae1391acbf51d5ad35aac)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:335)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:662)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:893)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:966)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:966)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: fa9846e6834ae1391acbf51d5ad35aac)
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:83)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1620)
at myflink.StreamingJob.main(StreamingJob.java:137)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
... 8 more
Caused by: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: fa9846e6834ae1391acbf51d5ad35aac)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:112)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$21(RestClusterClient.java:565)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:291)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:110)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:110)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:76)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:186)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:180)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:496)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:380)
at sun.reflect.GeneratedMethodAccessor77.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: Could not extract key from {"device":"rand-numeric","id":"b4728895-741f-466a-b87b-79c7590893b4","origin":"1591095418904441036","readings":[{"origin":"1591095418904328442","valueType":"Int64","name":"int","device":"rand-numeric","value":"0"}]}
at org.apache.flink.streaming.runtime.io.RecordWriterOutput.pushToRecordWriter(RecordWriterOutput.java:110)
at org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:89)
at org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:45)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:730)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:708)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
at io.pravega.connectors.flink.FlinkPravegaReader.run(FlinkPravegaReader.java:307)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:200)
Caused by: java.lang.RuntimeException: Could not extract key from {"device":"rand-numeric","id":"b4728895-741f-466a-b87b-79c7590893b4","origin":"1591095418904441036","readings":[{"origin":"1591095418904328442","valueType":"Int64","name":"int","device":"rand-numeric","value":"0"}]}
at org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannel(KeyGroupStreamPartitioner.java:56)
at org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannel(KeyGroupStreamPartitioner.java:32)
at org.apache.flink.runtime.io.network.api.writer.ChannelSelectorRecordWriter.emit(ChannelSelectorRecordWriter.java:60)
at org.apache.flink.streaming.runtime.io.RecordWriterOutput.pushToRecordWriter(RecordWriterOutput.java:107)
... 9 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
at myflink.StreamingJob$1.getKey(StreamingJob.java:125)
at org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannel(KeyGroupStreamPartitioner.java:54)
... 12 more

The bottom of the stack trace reveals the real problem:
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
At runtime the elements reaching your KeySelector are plain Strings, not ObjectNodes: JavaSerializer only round-trips whatever Java-serialized object was written to the stream (apparently Strings here); it does not parse JSON into an ObjectNode. The failure happens when Flink casts the element to ObjectNode, before your code can even call node.get("id"). You could keep the stream as String and parse each record yourself, but it is cleaner to deserialize into a POJO.
You can check the rules for POJO types in the Flink documentation. In short: the class must be public (and static, if nested), it must have a public no-argument constructor, and all fields must either be public or be accessible through getters and setters.
By using POJO types, Flink can infer a lot of information about the data types that are exchanged and stored during the distributed computation.
The following code defines POJOs for your input:
import com.google.gson.Gson;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

import java.util.Iterator;

public class FlinkPOJO {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(3);
        DataStream<String> source =
                env.addSource(new SourceFunction<String>() {
                    private volatile boolean running = true;

                    @Override
                    public void run(SourceContext<String> sourceContext) throws Exception {
                        while (running) {
                            sourceContext.collect("{\"device\":\"rand-numeric\",\"id\":\"b4728895-741f-466a-b87b-79c7590893b4\",\"origin\":\"1591095418904441036\",\"readings\":[{\"origin\":\"1591095418904328442\",\"valueType\":\"Int64\",\"name\":\"int\",\"device\":\"rand-numeric\",\"value\":\"0\"}]}");
                            Thread.sleep(1000);
                        }
                    }

                    @Override
                    public void cancel() {
                        running = false;
                    }
                });
        DataStream<Info> parsedSource =
                source.map(new MapFunction<String, Info>() {
                    @Override
                    public Info map(String s) throws Exception {
                        // Gson is not serializable, so we create it here instead of
                        // holding it as a field of the function.
                        Gson gson = new Gson();
                        return gson.fromJson(s, Info.class);
                    }
                });
        DataStream<String> output = parsedSource.keyBy(Info::getId).timeWindow(Time.seconds(1))
                .process(new ProcessWindowFunction<Info, String, String, TimeWindow>() {
                    @Override
                    public void process(String s, Context context, Iterable<Info> iterable, Collector<String> collector) throws Exception {
                        int count = 0;
                        Iterator<Info> iterator = iterable.iterator();
                        while (iterator.hasNext()) {
                            count++;
                            iterator.next();
                        }
                        collector.collect(String.format("key : %s, size : %s", s, count));
                    }
                });
        output.print();
        env.execute();
    }

    // Nested POJOs must be public and static so that Flink (and Gson) can
    // instantiate them without an enclosing instance.
    public static class Info {
        public String device;
        public String id;
        public String origin;
        public Reading[] readings;

        public Info() {
        }

        public String getDevice() { return device; }
        public void setDevice(String device) { this.device = device; }
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
        public String getOrigin() { return origin; }
        public void setOrigin(String origin) { this.origin = origin; }
        public Reading[] getReadings() { return readings; }
        public void setReadings(Reading[] readings) { this.readings = readings; }
    }

    public static class Reading {
        public String origin;
        public String valueType;
        public String name;
        public String device;
        public String value;

        public Reading() {
        }
    }
}
Actually, you can define a smaller POJO that contains only the fields you need, as shown below.
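For example, if all you need is the id for keying, a trimmed-down POJO like this is enough (a minimal sketch; BriefInfo is a hypothetical name, and Gson simply ignores JSON fields that have no counterpart in the class):

public static class BriefInfo {
    public String id; // the only field we key by; all other JSON fields are ignored by Gson

    public BriefInfo() {
    }

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }
}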

Related

How Do I Verify and Capture the Contents of an InputStream in JUnit Mockito?

I have the following setup:
public interface CommandRunner {
    void run(String cmd, InputStream is) throws IOException;
}

public class CommandRunnerInputFile {
    private final CommandRunner commandRunner;

    public CommandRunnerInputFile(CommandRunner commandRunner) {
        this.commandRunner = commandRunner;
    }

    public void run(String command, File inputFile) throws IOException {
        try (FileInputStream is = new FileInputStream(inputFile)) {
            this.commandRunner.run(command, is);
        }
    }
}

@ExtendWith(MockitoExtension.class)
public class TestCommandRunnerInputFile {
    @Mock CommandRunner commandRunner;
    @Captor ArgumentCaptor<InputStream> inputStream;

    private CommandRunnerInputFile commandRunnerInputFile;

    @BeforeEach
    void initService() {
        commandRunnerInputFile = new CommandRunnerInputFile(commandRunner);
    }

    @Test
    public void testHappyPath() throws IOException {
        ClassLoader classLoader = this.getClass().getClassLoader();
        File file = new File(classLoader.getResource("MyTestInputFile.txt").getFile());
        commandRunnerInputFile.run("MyApplication.exe", file);
        verify(commandRunner).run(eq("MyApplication.exe"), inputStream.capture());
        assertEquals('c', (char) inputStream.getValue().read());
    }
}
When I run this test, it fails with the following Exception:
java.io.IOException: Stream Closed
at java.io.FileInputStream.read0(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:207)
This makes sense to me: by the time the method finishes, the try-with-resources block has closed the underlying FileInputStream. That is to say, at the point in time that the FileInputStream is captured by Mockito it is open, but by the time I verify that it was passed (and attempt to verify its contents) it is closed. What can I do to capture not simply the InputStream object itself, but its contents for verification?
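One way to do this (a minimal sketch using Mockito's standard doAnswer API, not a verified solution from the original thread; names match the test class above) is to copy the stream's contents at invocation time, while it is still open, and assert on the copied bytes afterwards:

@Test
public void testHappyPath() throws IOException {
    ByteArrayOutputStream captured = new ByteArrayOutputStream();
    // Read the stream inside the stub, i.e. while CommandRunnerInputFile still
    // holds it open; a plain ArgumentCaptor only captures the reference, which
    // is already closed once run(..) returns.
    doAnswer(invocation -> {
        InputStream is = invocation.getArgument(1);
        byte[] buffer = new byte[4096];
        int n;
        while ((n = is.read(buffer)) != -1) {
            captured.write(buffer, 0, n);
        }
        return null;
    }).when(commandRunner).run(eq("MyApplication.exe"), any(InputStream.class));

    File file = new File(getClass().getClassLoader().getResource("MyTestInputFile.txt").getFile());
    commandRunnerInputFile.run("MyApplication.exe", file);

    assertEquals('c', (char) captured.toByteArray()[0]);
}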

How to test keyedbroadcastprocessfunction in flink?

I am new to Flink and I am trying to write JUnit test cases for a KeyedBroadcastProcessFunction. Below is my code. I currently call the getDataStreamOutput method in a TestUtils class, passing the input data and the pattern rules. The input data is evaluated against the list of pattern rules; if the input data satisfies a condition I get a signal, a sink function is called, and the output data is returned as a String from getDataStreamOutput.
@Test
public void testCompareInputAndOutputDataForInputSignal() throws Exception {
    Assertions.assertEquals(sampleInputSignal,
            TestUtils.getDataStreamOutput(
                    inputSignal,
                    patternRules));
}

public static String getDataStreamOutput(JSONObject input, Map<String, String> patternRules) throws Exception {
    env.setParallelism(1);
    DataStream<JSONObject> inputSignal = env.fromElements(input);
    DataStream<Map<String, String>> rawPatternStream = env.fromElements(patternRules);

    // Generate key/value pairs of patterns, where the key is the pattern name
    // and the value is the pattern condition
    DataStream<Tuple2<String, Map<String, String>>> patternRuleStream =
            rawPatternStream.flatMap(new FlatMapFunction<Map<String, String>,
                    Tuple2<String, Map<String, String>>>() {
                @Override
                public void flatMap(Map<String, String> patternRules,
                                    Collector<Tuple2<String, Map<String, String>>> out) throws Exception {
                    for (Map.Entry<String, String> stringEntry : patternRules.entrySet()) {
                        JSONObject jsonObject = new JSONObject(stringEntry.getValue());
                        Map<String, String> map = new HashMap<>();
                        for (String key : jsonObject.keySet()) {
                            String value = jsonObject.get(key).toString();
                            map.put(key, value);
                        }
                        out.collect(new Tuple2<>(stringEntry.getKey(), map));
                    }
                }
            });

    BroadcastStream<Tuple2<String, Map<String, String>>> patternRuleBroadcast =
            patternRuleStream.broadcast(patternRuleDescriptor);

    DataStream<Tuple2<String, JSONObject>> validSignal = inputSignal.map(new MapFunction<JSONObject,
            Tuple2<String, JSONObject>>() {
        @Override
        public Tuple2<String, JSONObject> map(JSONObject inputSignal) throws Exception {
            String source = inputSignal.getString("source");
            return new Tuple2<>(source, inputSignal);
        }
    }).keyBy(0).connect(patternRuleBroadcast).process(new MyKeyedBroadCastProcessFunction());

    validSignal.map(new MapFunction<Tuple2<String, JSONObject>, JSONObject>() {
        @Override
        public JSONObject map(Tuple2<String, JSONObject> inputSignal) throws Exception {
            return inputSignal.f1;
        }
    }).addSink(new getDataStreamOutput());

    env.execute("TestFlink");
    return (getDataStreamOutput.dataStreamOutput);
}

@SuppressWarnings("serial")
public static final class getDataStreamOutput implements SinkFunction<JSONObject> {
    public static String dataStreamOutput;

    @Override
    public void invoke(JSONObject inputSignal) throws Exception {
        dataStreamOutput = inputSignal.toString();
    }
}
I need to test different inputs against the same broadcast rules, but every time I call this method it runs the whole process from the beginning: take the input signal, broadcast the rule data, and so on. Is there a way to broadcast once and keep sending inputs to the method? I explored using a CoFlatMapFunction, something like the code below, to combine the data streams and keep sending input while the method is running, but then one of the streams has to keep reading from a Kafka topic, which again overburdens the method with loading Kafka utilities and a server.
DataStream<JSONObject> inputSignalFromKafka = env.addSource(inputSignalKafka);
DataStream<org.json.JSONObject> inputSignalFromMethod = env.fromElements(inputSignal);
DataStream<JSONObject> inputSignal = inputSignalFromMethod.connect(inputSignalFromKafka)
        .flatMap(new SignalCoFlatMapper());

public static class SignalCoFlatMapper
        implements CoFlatMapFunction<JSONObject, JSONObject, JSONObject> {

    @Override
    public void flatMap1(JSONObject inputValue, Collector<JSONObject> out) throws Exception {
        out.collect(inputValue);
    }

    @Override
    public void flatMap2(JSONObject kafkaValue, Collector<JSONObject> out) throws Exception {
        out.collect(kafkaValue);
    }
}
I found a link on Stack Overflow, How to unit test BroadcastProcessFunction in flink when processElement depends on broadcasted data, but it confused me a lot.
Is there any way I can broadcast only once, in a @Before method of my test class, and then keep sending different kinds of data to my broadcast function?
You can use KeyedTwoInputStreamOperatorTestHarness to achieve this. For example, let's assume you have the following KeyedBroadcastProcessFunction, in which you define some business logic for both data stream channels:
public class SimpleKeyedBroadcastProcessFunction extends KeyedBroadcastProcessFunction<String, String, String, String> {

    @Override
    public void processElement(String inputEntry,
                               ReadOnlyContext readOnlyContext, Collector<String> collector) throws Exception {
        // business logic for how you want to process your data stream records
    }

    @Override
    public void processBroadcastElement(String broadcastInput, Context context,
                                        Collector<String> collector) throws Exception {
        // process input from your broadcast channel
    }
}
Let's now assume your process function is stateful and makes modifications to Flink's internal state; you would then have to create a test harness inside your test class to keep track of the state during testing.
I would create some unit tests using the following approach:
public class SimpleKeyedBroadcastProcessFunctionTest {

    private SimpleKeyedBroadcastProcessFunction processFunction;
    private KeyedTwoInputStreamOperatorTestHarness<String, String, String, String> testHarness;

    @Before
    public void setup() throws Exception {
        processFunction = new SimpleKeyedBroadcastProcessFunction();
        testHarness = new KeyedTwoInputStreamOperatorTestHarness<>(
                new CoBroadcastWithKeyedOperator<>(processFunction, ImmutableList.of(BROADCAST_MAP_STATE_DESCRIPTOR)),
                (KeySelector<String, String>) string -> string,
                (KeySelector<String, String>) string -> string,
                TypeInformation.of(String.class));
        testHarness.setup();
        testHarness.open();
    }

    @After
    public void cleanup() throws Exception {
        testHarness.close();
    }

    @Test
    public void testProcessRegularInput() throws Exception {
        // processElement1 sends elements into the regular stream;
        // the second param is the event time of the record
        testHarness.processElement1(new StreamRecord<>("Hello", 0));
        // access records collected during processElement
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello", records.get(0).getValue());
    }

    @Test
    public void testProcessBroadcastInput() throws Exception {
        // processElement2 sends elements into the broadcast stream;
        // the second param is the event time of the record
        testHarness.processElement2(new StreamRecord<>("Hello from Broadcast", 0));
        // access records collected during processElement
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello from Broadcast", records.get(0).getValue());
    }
}
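Because the harness keeps the operator, and with it the broadcast state, alive between calls, you can broadcast the rules once and then feed any number of keyed records against them, which addresses the "broadcast only once" part of the question. A sketch (what the assertions look like depends entirely on what your processElement and processBroadcastElement emit):

@Test
public void testBroadcastOnceThenManyInputs() throws Exception {
    testHarness.processElement2(new StreamRecord<>("rule-1", 0));   // broadcast the rule once
    testHarness.processElement1(new StreamRecord<>("signal-a", 1)); // evaluated against rule-1
    testHarness.processElement1(new StreamRecord<>("signal-b", 2)); // evaluated against rule-1
    List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
    // assert on records according to your business logic
}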

How to marshal to JSON/XML when an exception occurs with Camel rest-dsl

I have a REST-DSL Camel route with binding mode json_xml,
with .type() and .outType(). It works perfectly when no exception occurs: JSON input gives JSON output, XML input gives XML output.
However, when an IllegalArgumentException is thrown, I always get XML back. I create an ErrorResponse POJO when the exception occurs, and the CONTENT_TYPE is set to "application/json" for JSON. How do I return a POJO and let Camel marshal it to JSON/XML when an exception occurs (given RestBindingMode.json_xml)?
onException(IllegalArgumentException.class)
    .log(LoggingLevel.ERROR, LOGGER, "error")
    .handled(true)
    .setHeader(Exchange.HTTP_RESPONSE_CODE, constant(400))
    .setHeader(Exchange.CONTENT_TYPE, exchangeProperty(Exchange.CONTENT_TYPE))
    .bean(errorResponseTranslator);

restConfiguration().component("restlet").port(port).skipBindingOnErrorCode(true)
    .bindingMode(RestBindingMode.json_xml);

rest("/whatever/api/v1/request")
    .post().type(RequestDto.class).outType(ResponseDto.class)
    .route()
    .setProperty(Exchange.CONTENT_TYPE, header(Exchange.CONTENT_TYPE))
    ...process
ErrorDto:
@XmlRootElement(name = "errorResponse")
@XmlAccessorType(XmlAccessType.PROPERTY)
public class ErrorResponseDto {
    private String errorCode;
    private String message;

    @XmlElement(name = "message")
    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }

    @XmlElement(name = "errorCode")
    public String getErrorCode() {
        return errorCode;
    }

    public void setErrorCode(String errorCode) {
        this.errorCode = errorCode;
    }
}
You need to set the content type explicitly. The line
.setHeader(Exchange.CONTENT_TYPE, exchangeProperty(Exchange.CONTENT_TYPE))
should be
.setHeader(Exchange.CONTENT_TYPE, constant("application/json"))
The error occurs because the outType is not dynamic; it seems to be a Camel bug. That is, the outType must be an XML root element that contains both the OK and the error DTO. It is possible to quick-fix this by using an XML root that takes an any element with lax=true (here you can add the ErrorDto or the OK response DTO), but that adds an unwanted wrapper element. For now we have to implement a custom content negotiator.
This applies when skipBindingOnErrorCode is set to false.
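Put together, the error handler from the question would then look like this (a sketch; only the CONTENT_TYPE line changes from the original route):

onException(IllegalArgumentException.class)
    .log(LoggingLevel.ERROR, LOGGER, "error")
    .handled(true)
    .setHeader(Exchange.HTTP_RESPONSE_CODE, constant(400))
    .setHeader(Exchange.CONTENT_TYPE, constant("application/json")) // set the content type explicitly
    .bean(errorResponseTranslator);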

@NestedConfigurationProperty and Converter Doesn't Work

I guess I have a rather complex configuration structure that I can't get to work. Here are the important pieces of the configuration classes:
@ConfigurationProperties
public abstract class AbstractConfigHolder<T extends AbstractComponentConfig> {
}

@Component
public class ExportConfigHolder extends AbstractConfigHolder<GenericExportConfig> {
    @NestedConfigurationProperty
    private Map<String, GenericExportConfig> exports;
    // getters and setters for all fields
}

public class GenericExportConfig extends AbstractComponentConfig {
    @NestedConfigurationProperty
    private AbstractLocatedConfig target;
    // getters and setters for all fields
}

public abstract class AbstractLocatedConfig extends RemoteConfig {
    @NestedConfigurationProperty
    private ProxyConfig proxy;
    // getters and setters for all fields
}

public class ProxyConfig extends RemoteConfig {
    private Type type;
    // getters and setters for all fields
}

public class RemoteConfig {
    private String host;
    private int port;
    private String user;
    private String password;
    // getters and setters for all fields
}
Here's the properties file:
exports.mmkb.name=MMKB
exports.mmkb.target=ftp
exports.mmkb.target.path=${user.home}/path/blah
# throws an exception:
exports.mmkb.target.proxy.host=super-host
The following conversion setup is what, in my opinion, should cover everything and provide the proper beans to Spring:
@Configuration
public class ConversionSupport {

    @ConfigurationPropertiesBinding
    @Bean
    public Converter<String, AbstractLocatedConfig> locatedConfigConverter(ApplicationContext applicationContext) {
        return new Converter<String, AbstractLocatedConfig>() {
            private ProxyConfigs proxyConfigs;
            private ConnectionConfigs connectionConfigs;

            @Override
            public AbstractLocatedConfig convert(String targetType) {
                System.out.println("Converting " + targetType);
                initFields(applicationContext);
                switch (targetType.toLowerCase()) {
                    case "ftp":
                        return new FtpTargetConfig(proxyConfigs, connectionConfigs);
                    // others...
                    default:
                        throw new IllegalArgumentException("Unknown target type: " + targetType);
                }
            }

            // This is necessary to avoid conflicts in bean dependencies
            private void initFields(ApplicationContext applicationContext) {
                if (proxyConfigs == null) {
                    AbstractConfigHolder<?> configHolder = applicationContext.getBean(AbstractConfigHolder.class);
                    proxyConfigs = configHolder.getProxy();
                    connectionConfigs = configHolder.getConnection();
                }
            }
        };
    }
}
However, I get this instead:
Converting ftp
2016-04-29 09:33:23,900 WARN [org.springframework.context.annotation.AnnotationConfigApplicationContext] [main] Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'exportConfigHolder': Could not bind properties to ExportConfigHolder (prefix=, ignoreInvalidFields=false, ignoreUnknownFields=true, ignoreNestedProperties=false); nested exception is org.springframework.beans.InvalidPropertyException: Invalid property 'exports[mmkb].target.proxy[host]' of bean class [at.a1.iap.epggw.exporter.config.GenericExportConfig]: Property referenced in indexed property path 'proxy[host]' is neither an array nor a List nor a Map; returned value was [at.a1.iap.epggw.commons.config.properties.ProxyConfig#52066604]
2016-04-29 09:33:23,902 ERROR [org.springframework.boot.SpringApplication] [main] Application startup failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'exportConfigHolder': Could not bind properties to ExportConfigHolder (prefix=, ignoreInvalidFields=false, ignoreUnknownFields=true, ignoreNestedProperties=false); nested exception is org.springframework.beans.InvalidPropertyException: Invalid property 'exports[mmkb].target.proxy[host]' of bean class [at.a1.iap.epggw.exporter.config.GenericExportConfig]: Property referenced in indexed property path 'proxy[host]' is neither an array nor a List nor a Map; returned value was [at.a1.iap.epggw.commons.config.properties.ProxyConfig#52066604]
at org.springframework.boot.context.properties.ConfigurationPropertiesBindingPostProcessor.postProcessBeforeInitialization(ConfigurationPropertiesBindingPostProcessor.java:339)
at org.springframework.boot.context.properties.ConfigurationPropertiesBindingPostProcessor.postProcessBeforeInitialization(ConfigurationPropertiesBindingPostProcessor.java:289)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:408)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:545)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:772)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:839)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:538)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:766)
at org.springframework.boot.SpringApplication.createAndRefreshContext(SpringApplication.java:361)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1191)
at at.a1.iap.epggw.exporter.Application.main(Application.java:23)
Caused by: org.springframework.beans.InvalidPropertyException: Invalid property 'exports[mmkb].target.proxy[host]' of bean class [at.a1.iap.epggw.exporter.config.GenericExportConfig]: Property referenced in indexed property path 'proxy[host]' is neither an array nor a List nor a Map; returned value was [at.a1.iap.epggw.commons.config.properties.ProxyConfig#52066604]
at org.springframework.beans.AbstractNestablePropertyAccessor.setPropertyValue(AbstractNestablePropertyAccessor.java:406)
at org.springframework.beans.AbstractNestablePropertyAccessor.setPropertyValue(AbstractNestablePropertyAccessor.java:280)
at org.springframework.boot.bind.RelaxedDataBinder$RelaxedBeanWrapper.setPropertyValue(RelaxedDataBinder.java:700)
at org.springframework.beans.AbstractPropertyAccessor.setPropertyValues(AbstractPropertyAccessor.java:95)
at org.springframework.validation.DataBinder.applyPropertyValues(DataBinder.java:834)
at org.springframework.validation.DataBinder.doBind(DataBinder.java:730)
at org.springframework.boot.bind.RelaxedDataBinder.doBind(RelaxedDataBinder.java:128)
at org.springframework.validation.DataBinder.bind(DataBinder.java:715)
at org.springframework.boot.bind.PropertiesConfigurationFactory.doBindPropertiesToTarget(PropertiesConfigurationFactory.java:269)
at org.springframework.boot.bind.PropertiesConfigurationFactory.bindPropertiesToTarget(PropertiesConfigurationFactory.java:241)
at org.springframework.boot.context.properties.ConfigurationPropertiesBindingPostProcessor.postProcessBeforeInitialization(ConfigurationPropertiesBindingPostProcessor.java:334)
... 17 common frames omitted
The error clearly shows that everything worked up to this point: there is a proper object in place, but somehow Spring fails to apply the remaining properties to it. I know the property is neither an array nor a List nor a Map, because I want it to be a POJO.
What can I do here to make this work?
This is Spring Boot 1.3.3, by the way.
Well, it seems as if I hit a corner case that Spring doesn't do much about. The main problem is that Spring collects the available bean structure, including the nested field structure, before it knows of (or at least makes use of) the Converters available in the system.
I let the class annotated with @ConfigurationProperties implement ApplicationContextAware and added this method:
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
    AnnotationConfigApplicationContext context = (AnnotationConfigApplicationContext) applicationContext;
    @SuppressWarnings("unchecked")
    Converter<String, AbstractLocatedConfig> locatedConfigSupport = context.getBean("locatedConfigConverter", Converter.class);
    :
}
I then looked for all properties in the context's environment that would trigger the conversion process, manually invoked the conversion, and created the bean structure that way.
For some reason Spring's subsequent lifecycle handling caused not all properties to end up in the bean, which made me do this:
@Configuration
public class SampleConfiguration {

    @Autowired
    private Environment environment;

    @Autowired
    private ClassWithTheConfigurationPropertiesAbove theBeanWithTheConfigurationPropertiesAbove;

    @PostConstruct
    void postConstruct() throws Exception {
        if (environment instanceof AbstractEnvironment) {
            MutablePropertySources sources = ((AbstractEnvironment) environment).getPropertySources();
            // This is a MUST since Spring calls the nested properties handler BEFORE
            // calling the conversion service on that field. Therefore, our converter
            // for AbstractLocatedConfigs is called too late the first time. A second
            // call will fill in the fields in the new objects and set the other ones
            // again, too.
            // See org.springframework.core.env.PropertySourcesPropertyResolver.getProperty(String, Class<T>, boolean)
            // Note: in case Spring reorders this, the logic here won't be needed.
            setProperties(theBeanWithTheConfigurationPropertiesAbove, sources);
        } else {
            throw new IllegalArgumentException("The environment must be an " + AbstractEnvironment.class.getSimpleName());
        }
    }

    void setProperties(Object target, MutablePropertySources propertySources) {
        // org.springframework.boot.bind.PropertiesConfigurationFactory.doBindPropertiesToTarget()
        // was the base for this. Go there for further logic if needed.
        RelaxedDataBinder dataBinder = new RelaxedDataBinder(target);
        dataBinder.bind(new MutablePropertyValues(getProperties(propertySources)));
    }

    public String getProperty(String propertyName) {
        return environment.getProperty(propertyName);
    }

    private Map<String, String> getProperties(MutablePropertySources propertySources) {
        Iterable<PropertySource<?>> iterable = () -> propertySources.iterator();
        return StreamSupport.stream(iterable.spliterator(), false)
                .map(propertySource -> {
                    Object source = propertySource.getSource();
                    if (source instanceof Map) {
                        @SuppressWarnings("unchecked")
                        Map<String, String> sourceMap = (Map<String, String>) source;
                        return sourceMap.keySet();
                    } else if (propertySource instanceof SimpleCommandLinePropertySource) {
                        return Arrays.asList(((SimpleCommandLinePropertySource) propertySource).getPropertyNames());
                    } else if (propertySource instanceof RandomValuePropertySource) {
                        return null;
                    } else {
                        throw new NotImplementedException("unknown property source " + propertySource.getClass().getName() + " or its source " + source.getClass().getName());
                    }
                })
                .filter(Objects::nonNull)
                .flatMap(Collection::stream)
                .collect(Collectors.toMap(Function.identity(), this::getProperty));
    }
}
It would be nice if Spring could do something about this to make it easier...

running antlr inside a javafx and swing app

I have built a GUI with JavaFX and Swing, and when I add an action listener to parse the expression in a text field I get an error. I am not sure what the problem is.
The error is:
Exception in thread "AWT-EventQueue-0" java.lang.ExceptionInInitializerError
at functionparsergui.Test.parseFunction(Test.java:110)
at functionparsergui.Test.access$000(Test.java:38)
at functionparsergui.Test$2.actionPerformed(Test.java:88)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2018)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2341)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
at java.awt.Component.processMouseEvent(Component.java:6505)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3321)
at java.awt.Component.processEvent(Component.java:6270)
at java.awt.Container.processEvent(Container.java:2229)
at java.awt.Component.dispatchEventImpl(Component.java:4861)
at java.awt.Container.dispatchEventImpl(Container.java:2287)
at java.awt.Component.dispatchEvent(Component.java:4687)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4832)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4492)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4422)
at java.awt.Container.dispatchEventImpl(Container.java:2273)
at java.awt.Window.dispatchEventImpl(Window.java:2719)
at java.awt.Component.dispatchEvent(Component.java:4687)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:729)
at java.awt.EventQueue.access$200(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:688)
at java.awt.EventQueue$3.run(EventQueue.java:686)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:87)
at java.awt.EventQueue$4.run(EventQueue.java:702)
at java.awt.EventQueue$4.run(EventQueue.java:700)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:699)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
Caused by: java.lang.UnsupportedOperationException: java.io.InvalidClassException: org.antlr.v4.runtime.atn.ATN; Could not deserialize ATN with version 3 (expected 2).
at org.antlr.v4.runtime.atn.ATNSimulator.deserialize(ATNSimulator.java:114)
at edu.chrr.util.function.FunctionLexer.<clinit>(FunctionLexer.java:504)
... 39 more
Caused by: java.io.InvalidClassException: org.antlr.v4.runtime.atn.ATN; Could not deserialize ATN with version 3 (expected 2).
... 41 more
My code, starting from declaring the action listener, is as follows:
ActionListener clearField = new ActionListener() {
    @Override
    public void actionPerformed(ActionEvent e) {
        exprField.setText("");
        JOptionPane.showMessageDialog(frame, "Input Cleared");
    }
};
clearButton.addActionListener(clearField);

ActionListener parserButton;
parserButton = new ActionListener() {
    @Override
    public void actionPerformed(ActionEvent e) {
        String expression = exprField.getText();
        String nowhiteExpr = expression.replaceAll("\\s+", "");
        parseFunction(nowhiteExpr, frame);
    }
};
parseButton.addActionListener(parserButton);

Platform.runLater(new Runnable() {
    @Override
    public void run() {
        initFX(fxPanel);
    }
});
}

private static void parseFunction(final String function, final JFrame frame) {
    try {
        ANTLRInputStream input = new ANTLRInputStream(function);
        FunctionLexer lexer = new FunctionLexer(input);
        CommonTokenStream tokens = new CommonTokenStream((TokenSource) lexer);
        FunctionParser parser = new FunctionParser(tokens);
        parser.start();
        int errorsCount = parser.getNumberOfSyntaxErrors();
        if (errorsCount == 0) {
            JOptionPane.showMessageDialog(frame, "Syntax is Correct");
        } else {
            Token t = parser.getCurrentToken();
            String msg = "Syntax Incorrect: Missing " + t.getText();
            JOptionPane.showMessageDialog(frame, msg);
        }
    } catch (RecognitionException ex) {
        JOptionPane.showMessageDialog(frame, "Syntax is Incorrect");
    }
}

private static void initFX(JFXPanel fxPanel) {
    // This method is invoked on the JavaFX thread
    Scene scene = createScene();
    fxPanel.setScene(scene);
}

private static Scene createScene() {
    Group root = new Group();
    Scene scene = new Scene(root, Color.ALICEBLUE);
    return (scene);
}

public static void main(String[] args) {
    SwingUtilities.invokeLater(new Runnable() {
        @Override
        public void run() {
            initAndShowGUI();
        }
    });
}
}
ANTLR 4.1 is not compatible with ANTLR 4.0. You are generating your code with ANTLR 4.1 but attempting to run it with the ANTLR 4.0 runtime library.
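In other words, check the ANTLR versions on your build and runtime classpaths and make them match. If you build with Maven (an assumption; the post doesn't say how the project is built), that means aligning the runtime dependency with the version of the ANTLR tool that generated FunctionLexer and FunctionParser, for example:

<dependency>
    <groupId>org.antlr</groupId>
    <artifactId>antlr4-runtime</artifactId>
    <version>4.1</version> <!-- must match the ANTLR tool version used for code generation -->
</dependency>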