I'm trying to make a simple UDP socket server for a Unity3D game I'm making, and I've got it mostly working. I can send messages to it and read them. But when I try to send a message back to the client (just for testing purposes, at the moment), I get a BufferOverflowException.
Before sending the data back, I'm converting it to JSON using groovy.json.JsonBuilder. The data has a very simple structure:
[data: "Hello World"]
But for whatever reason, JsonBuilder is building it as
{
"data": "Hello World\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\..."
}
The \u0000s go on for a while, long enough to overflow my 1024-byte ByteBuffer: each NUL is serialized as a six-character \u0000 escape, so the JSON string ends up far larger than the original message.
This is the class that's responsible for sending the data back to the client:
import groovy.json.JsonBuilder
import groovy.transform.CompileStatic
import groovyx.gpars.actor.DynamicDispatchActor
import java.nio.ByteBuffer
import java.nio.channels.DatagramChannel

@CompileStatic
class SenderActor extends DynamicDispatchActor {

    //takes message of type [data: Object, receiver: SocketAddress]
    void onMessage(Map message) {
        println(message.data) //prints "Hello World"

        def json = new JsonBuilder([data: message.data]).toString()
        println("Sending: $json") //prints '{"data": "Hello World\u0000\u0000..."}'

        def channel = DatagramChannel.open()
        channel.connect(message.receiver as SocketAddress)

        def buffer = ByteBuffer.allocate(1024)
        buffer.clear()
        buffer.put(json.getBytes())
        buffer.flip()

        channel.send(buffer, message.receiver as SocketAddress)
    }
}
And this is the stack trace I get:
An exception occurred in the Actor thread Actor Thread 2
java.nio.BufferOverflowException
at java.nio.HeapByteBuffer.put(HeapByteBuffer.java:189)
at java.nio.ByteBuffer.put(ByteBuffer.java:859)
at Croquet.Actors.SenderActor.onMessage(SenderActor.groovy:28)
at Croquet.Actors.SenderActor$onMessage.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at Croquet.Actors.ProcessorActor$onMessage.call(Unknown Source)
at groovyx.gpars.actor.impl.DDAClosure$_createDDAClosure_closure1.doCall(DDAClosure.groovy:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:292)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at groovy.lang.Closure.call(Closure.java:423)
at groovy.lang.Closure.call(Closure.java:439)
at groovyx.gpars.actor.AbstractLoopingActor.runEnhancedWithoutRepliesOnMessages(AbstractLoopingActor.java:293)
at groovyx.gpars.actor.AbstractLoopingActor.access$400(AbstractLoopingActor.java:30)
at groovyx.gpars.actor.AbstractLoopingActor$1.handleMessage(AbstractLoopingActor.java:93)
at groovyx.gpars.util.AsyncMessagingCore.run(AsyncMessagingCore.java:132)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
The data in question is encoded as UTF-8, if that helps.
This is the client code that is responsible for sending data to the server (written in C#):
void sendTestMessage(UdpClient udpClient, UdpState udpState){
    byte[] data = Encoding.UTF8.GetBytes("Hello World");
    udpClient.BeginSend(
        data,
        data.Length,
        udpState.e, //IPEndPoint
        result => {
            messageSent = true;
            Debug.Log(string.Format("Message '{1}' Sent to {0}", udpState.e, Encoding.UTF8.GetString(data)));
            udpClient.EndSend(result);
        },
        udpState);
}
I solved my problem by changing
def json = new JsonBuilder([data: message.data]).toString()
to
def json = new JsonBuilder([data: message.data]).toString().replace("\\u0000", "")
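The replace() works, but the padding most likely originates on the receive side, which isn't shown here: if the server reads the incoming datagram into a fixed 1024-byte buffer and turns the whole array into a String, the unused bytes become the trailing NULs, since the C# client only sends the eleven bytes of "Hello World". Under that assumption, a rough Java NIO sketch of a receive that keeps only the bytes actually received:

import java.net.SocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;
import java.nio.charset.StandardCharsets;

// Hypothetical receive side; 'channel' stands in for the server's DatagramChannel.
ByteBuffer buffer = ByteBuffer.allocate(1024);
SocketAddress sender = channel.receive(buffer);           // fills [0, position)
buffer.flip();                                            // limit = bytes actually received
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
String data = new String(bytes, StandardCharsets.UTF_8);  // "Hello World", no padding

On the send side, ByteBuffer.wrap(json.getBytes(StandardCharsets.UTF_8)) would also avoid the BufferOverflowException entirely, since the buffer is then sized to the payload rather than a fixed 1024 bytes.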
I am trying to read a JSON stream from an MQTT broker in Apache Spark with Structured Streaming, read some properties of the incoming JSON and output them to the console. My code looks like this:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object
import org.apache.spark.sql.types.StringType

val spark = SparkSession
  .builder()
  .appName("BahirStructuredStreaming")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._

val topic = "temp"
val brokerUrl = "tcp://localhost:1883"

val lines = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .toDF().withColumn("payload", $"payload".cast(StringType))

val jsonDF = lines.select(get_json_object($"payload", "$.eventDate").alias("eventDate"))

val query = jsonDF.writeStream
  .format("console")
  .start()

query.awaitTermination()
However, when a JSON record arrives I get the following error:
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Writing job aborted.
=== Streaming Query ===
Identifier: [id = 14d28475-d435-49be-a303-8e47e2f907e3, runId = b5bd28bb-b247-48a9-8a58-cb990edaf139]
Current Committed Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: -1}
Current Available Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: 0}
Current State: ACTIVE
Thread State: RUNNABLE
Logical Plan:
Project [get_json_object(payload#22, $.id) AS eventDate#27]
+- Project [id#10, topic#11, cast(payload#12 as string) AS payload#22, timestamp#13]
+- StreamingExecutionRelation MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496], [id#10, topic#11, payload#12, timestamp#13]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:300)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: org.apache.spark.SparkException: Writing job aborted.
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:92)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3384)
at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2783)
at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3365)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3365)
at org.apache.spark.sql.Dataset.collect(Dataset.scala:2783)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:537)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$14(MicroBatchExecution.scala:533)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:532)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:198)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:166)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:160)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
... 1 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8, localhost, executor driver): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:64)
... 34 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I am sending the JSON records using the mosquitto broker, and they look like this:
mosquitto_pub -m '{"eventDate": "2020-11-11T15:17:00.000+0200"}' -t "temp"
It seems that every string coming from the Bahir stream source provider raises this error. For instance, the following code also raises it:
spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .select("topic")
  .writeStream
  .format("console")
  .start()
It looks like Spark does not recognize strings coming from Bahir, maybe some kind of weird String class version issue. I've tried the following to make the code work:

set the Java version to 8
upgrade the Spark version from 2.4.0 to 2.4.7
set the Scala version to 2.11.12
use the decode function with every possible encoding instead of .cast(StringType) to turn the "payload" column into a String
use the substring function on the "payload" column to recreate a compatible String.
Finally, I got working code by recreating the string with the String constructor on a typed Dataset:
val lines = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .select("payload")
  .as[Array[Byte]]
  .map(payload => new String(payload))
  .toDF("payload")
This solution is rather ugly, but at least it works.
I believe there is nothing wrong with the code provided in the question, and I suspect a bug on the Bahir or Spark side that prevents Spark from handling Strings coming from the Bahir source.
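For anyone on the Java Dataset API rather than Scala, a rough, untested translation of the same workaround (same Bahir source and options as above) might look like this:

import java.nio.charset.StandardCharsets;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BahirPayloadWorkaround {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("BahirStructuredStreaming")
                .master("local[*]")
                .getOrCreate();

        // Take the payload column as raw bytes and rebuild the String on the JVM side,
        // mirroring the .as[Array[Byte]].map(...) trick from the Scala version.
        Dataset<Row> lines = spark.readStream()
                .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
                .option("topic", "temp")
                .option("persistence", "memory")
                .load("tcp://localhost:1883")
                .select("payload")
                .as(Encoders.BINARY())
                .map((MapFunction<byte[], String>) b -> new String(b, StandardCharsets.UTF_8),
                        Encoders.STRING())
                .toDF("payload");

        lines.writeStream().format("console").start().awaitTermination();
    }
}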
I'm trying to post JSON using Feign, but I get an error from the URL saying the parameters were not sent. This is my code:
JSONObject jsonObject = new JSONObject();
jsonObject.put("name", "Rabbit SEO");
jsonObject.put("price", "10");
jsonObject.put("test", "true");
jsonObject.put("return_url", "https://www.rabbitseo.com/shopifyPaidGuest");
String content = jsonObject.toString();
System.out.println("content = " + content);
String result = myClient.postRecurringPayment(content);
@RequestLine("POST /admin/recurring_application_charges.json")
@Headers("Content-Type: application/json")
String postRecurringPayment(String content);
return Feign.builder()
.requestInterceptors(requestInterceptors)
.target(MyApiClient.class, myShopifyUrl);
I also tried with the Gson decoder and encoder:
return Feign.builder()
.decoder(new GsonDecoder())
.encoder(new GsonEncoder())
.requestInterceptors(requestInterceptors)
.target(MyApiClient.class, myShopifyUrl);
Error:
feign.FeignException: status 400 reading MyApiClient#postRecurringPayment(String); content:
{"errors":{"recurring_application_charge":"Required parameter missing or invalid"}}
at feign.FeignException.errorStatus(FeignException.java:62)
at feign.codec.ErrorDecoder$Default.decode(ErrorDecoder.java:91)
at feign.SynchronousMethodHandler.executeAndDecode(SynchronousMethodHandler.java:126)
at feign.SynchronousMethodHandler.invoke(SynchronousMethodHandler.java:74)
at feign.ReflectiveFeign$FeignInvocationHandler.invoke(ReflectiveFeign.java:94)
at com.sun.proxy.$Proxy4.postRecurringPayment(Unknown Source)
at com.test.TestOauth.testShopifyProducts2(TestOauth.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.junit.runner.JUnitCore.run(JUnitCore.java:157)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:62)
Looking at the log, the HTTP error code 400 means Bad Request:
"errors":{"recurring_application_charge":"Required parameter missing or invalid"}
You must pass recurring_application_charge as part of the JSON for it to be accepted by the API.
That said, posting the complete error log and your MyClient class would help to advise more.
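A minimal sketch of that fix, assuming the same org.json JSONObject and Feign client as in the question (the field names and values are taken from the question, not from the API docs):

import org.json.JSONObject;

// Wrap the charge fields inside the "recurring_application_charge" key the API expects.
JSONObject charge = new JSONObject();
charge.put("name", "Rabbit SEO");
charge.put("price", "10");
charge.put("test", "true");
charge.put("return_url", "https://www.rabbitseo.com/shopifyPaidGuest");

JSONObject body = new JSONObject();
body.put("recurring_application_charge", charge);

String content = body.toString();
// {"recurring_application_charge":{"name":"Rabbit SEO","price":"10",...}}
String result = myClient.postRecurringPayment(content);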
I need to read a directory of CSV files (one or more files). I use Camel with Spring Boot, and I need to move any file that is processed completely (without errors) to an OUT dir, but if the last "to" step fails (an exception is thrown) I need to move the file to a REFUSED dir.
When I run my code, Camel goes into an infinite loop and keeps processing the same file forever.
24/08/2017 16:27:57.070 ERROR [Camel (camel-1) thread #0 - file://src/main/resources/data] - org.apache.camel.processor.DefaultErrorHandler: Failed delivery for (MessageId: ID-CAD1652-39380-1503584865077-0-33 on ExchangeId: ID-CAD1652-39380-1503584865077-0-34). Exhausted after delivery attempt: 1 caught: com.cadit.exceptions.FileNotEvaluableException: Error: file tipo sconosciuto
Message History
---------------------------------------------------------------------------------------------------------------------------------------
RouteId ProcessorId Processor Elapsed (ms)
[route2 ] [route2 ] [file://src/main/resources/data?idempotent=false&move=OUT%2FVB%2F ] [ 10]
[route2 ] [unmarshal1 ] [unmarshal[org.apache.camel.model.dataformat.CsvDataFormat#28f6cf0f] ] [ 1]
[route2 ] [to1 ] [bean:myCsvHandler?method=doHandleCsvDataVB ] [ 8]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
com.cadit.exceptions.FileNotEvaluableException: Error: file tipo sconosciuto
at com.cadit.handlers.MyCsvHandler.doHandleCsvDataVB(MyCsvHandler.java:172)
at com.cadit.handlers.MyCsvHandler$$FastClassBySpringCGLIB$$f4b8f70b.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:721)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:656)
at com.cadit.handlers.MyCsvHandler$$EnhancerBySpringCGLIB$$d81d9e7f.doHandleCsvDataVB(<generated>)
at sun.reflect.GeneratedMethodAccessor97.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.camel.component.bean.MethodInfo.invoke(MethodInfo.java:458)
at org.apache.camel.component.bean.MethodInfo$1.doProceed(MethodInfo.java:289)
at org.apache.camel.component.bean.MethodInfo$1.proceed(MethodInfo.java:262)
at org.apache.camel.component.bean.BeanProcessor.process(BeanProcessor.java:178)
at org.apache.camel.component.bean.BeanProducer.process(BeanProducer.java:41)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:145)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:542)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:120)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197)
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:460)
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:227)
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:191)
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:175)
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:102)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
This is the main code:
@Component
public class CamelContextConf extends RouteBuilder {

    static final Logger logger = Logger.getLogger(CamelContextConf.class);

    @Override
    public void configure() throws Exception {
        restConfiguration().component("servlet").dataFormatProperty("prettyPrint", "true");

        CsvDataFormat csv = new CsvDataFormat();
        csv.setDelimiter(";");
        csv.setSkipHeaderRecord(true);

        from("direct:csvprocessor")
            .streamCaching()
            .from("file:src/main/resources/data?move=OUT/VB/")
            .unmarshal(csv)
            .to("bean:myCsvHandler?method=doHandleCsvDataVB")
            .onCompletion().onFailureOnly().to("file:src/main/reources/data/REFUSED").end()
            .setBody(constant("OK"))
            .setHeader(Exchange.HTTP_RESPONSE_CODE, constant(200))
            .setHeader(Exchange.CONTENT_TYPE, constant("text/html"));

        logger.info("** Route config ok");
    }
}
Then, in MyCsvHandler, the doHandleCsvDataVB method explicitly throws an exception so I can test a failure:
@Component
public class MyCsvHandler {

    @Inject
    AFVINCCrudRepository _entityManagerVINC;

    @Inject
    AFFileCrudRepository _entityManagerAfFile;

    @Transactional(propagation = Propagation.REQUIRED , transactionManager="DbTransactionManagerVB")
    public void doHandleCsvDataVB(List<List<String>> csvData) throws FileNotEvaluableException
    {
        //System.out.println("stampo..");
        if (csvData.size() > 0){
            AfFileEntity afbean = new AfFileEntity();
            afbean.setNomeFile("test");
            afbean.setDataImport(new java.sql.Timestamp(System.currentTimeMillis()));
            afbean.setTipoFile("M");
            afbean.setAfStatoFlusso("I");
            _entityManagerAfFile.save(afbean);

            long pkfile = afbean.getId();
            System.out.println("pkfile: " + pkfile);

            int i = 1; VincEntity vincBean = new VincEntity();
            System.out.println(csvData.size());

            for (List<String> rows : csvData){
                ..
                _entityManagerVINC.save(..);
            }
            throw new FileNotEvaluableException("Il file non è nè una ...");
        }
    }
}
The save methods run inside the loop and keep saving data to the DB.
What's wrong?
Thanks so much.
In your code, you cannot have a from inside another from.
from("direct:csvprocessor")
.streamCaching()
.from("file:src/main/resources/data?move=OUT/VB/")
Your second from("file:...") should instead use an enricher. http://camel.apache.org/content-enricher.html
Not sure, but in your doHandleCsvDataVB() method I don't see a try/catch block. You are going straight from a for loop to throwing the exception.
I moved files from one folder to another in the following way.
@Component
public class FileRouteBuilder extends RouteBuilder {

    @Autowired
    FileProcessor fileProcessor;

    @Override
    public void configure() throws Exception {
        from("file://{{dir.in}}?readLock=changed&include=.*.csv&preMove={{dir.progress}}&move={{dir.out}}&moveFailed={{dir.error}}")
            .process(fileProcessor)
            .setId("Poller");
    }
}
dir.in - source directory for file polling
dir.progress - the file is moved from in to progress while it is being processed
dir.out - the file is moved here when it is processed successfully
dir.error - the file is moved here when an error occurs during processing
readLock=changed - this locks the file while it is being read
include - this reads only the CSV files in the source (in) directory
You may or may not have a processor, as per your needs.
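For reference, a hypothetical minimal FileProcessor bean (the class name comes from the snippet above; the body is just a placeholder):

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.springframework.stereotype.Component;

@Component
public class FileProcessor implements Processor {

    @Override
    public void process(Exchange exchange) throws Exception {
        // Per-file handling goes here; the consumed file arrives as the message body.
        String body = exchange.getIn().getBody(String.class);
        System.out.println("Processing " + exchange.getIn().getHeader(Exchange.FILE_NAME)
                + " (" + body.length() + " characters)");
    }
}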
If you don't want a processor, then use the code below:
from("file://{{dir.in}}?readLock=changed&include=.*.csv"&preMove={{dir.progress}}&move={{dir.out}}&moveFailed={{dir.error}}")
.setId("Poller");
Please note that if you are not using preMove, a default .camel folder will be created in the 'in' folder and a copy of each file will be moved into it; the file will stay in the directory until it is removed manually.
So I suggest you use preMove in the route.
I use SoapUI 5.0.0 with these libraries:
btf 1.2
jackson-annotations-2.0.1
jackson-core-2.8.1
jackson-coreutils-1.8
json-schema-core-1.2.5
json-schema-validator-2.2.6
I am trying to run this code:
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.ObjectMapper
import com.github.fge.jsonschema.core.report.ProcessingReport
import com.github.fge.jsonschema.main.JsonSchema
import com.github.fge.jsonschema.main.JsonSchemaFactory
def response = testRunner.testCase.getTestStepByName("getClientByID").getPropertyValue("response")
ObjectMapper mapper = new ObjectMapper()
JsonNode invoiceJSON = mapper.readTree(response)
JsonNode invoiceSchemaJSON = mapper.readTree(new File("C:\\test\\resources\\schems\\client-card-rs.json"))
log.info( invoiceSchemaJSON)
JsonSchemaFactory factory = JsonSchemaFactory.byDefault()
JsonSchema invoiceSchema = factory.getJsonSchema(invoiceSchemaJSON)
if (invoiceSchema.validInstance(invoiceJSON)) log.info("Response Validated!")
else {
testRunner.fail(invoiceSchema.validate(invoiceJSON).toString())
}
But the script crashes with an error at this line:
JsonSchemaFactory factory = JsonSchemaFactory.byDefault()
First, it issued this error:
"write_bigdecimal_as_plain"
Later:
java.lang.NoClassDefFoundError: Could not initialize class com.github.fge.jsonschema.SchemaVersion
SoapUI error log:
2016-09-11 15:39:49,202 ERROR [errorlog] java.lang.NoClassDefFoundError: Could not initialize class com.github.fge.jsonschema.SchemaVersion
java.lang.NoClassDefFoundError: Could not initialize class com.github.fge.jsonschema.SchemaVersion
at com.github.fge.jsonschema.core.load.configuration.LoadingConfigurationBuilder.<init>(LoadingConfigurationBuilder.java:117)
at com.github.fge.jsonschema.core.load.configuration.LoadingConfiguration.byDefault(LoadingConfiguration.java:151)
at com.github.fge.jsonschema.main.JsonSchemaFactoryBuilder.<init>(JsonSchemaFactoryBuilder.java:67)
at com.github.fge.jsonschema.main.JsonSchemaFactory.newBuilder(JsonSchemaFactory.java:123)
at com.github.fge.jsonschema.main.JsonSchemaFactory.byDefault(JsonSchemaFactory.java:113)
at com.github.fge.jsonschema.main.JsonSchemaFactory$byDefault.call(Unknown Source)
at Script6.run(Script6.groovy:20)
at com.eviware.soapui.support.scripting.groovy.SoapUIGroovyScriptEngine.run(SoapUIGroovyScriptEngine.java:100)
at com.eviware.soapui.impl.wsdl.teststeps.WsdlGroovyScriptTestStep.run(WsdlGroovyScriptTestStep.java:154)
at com.eviware.soapui.impl.wsdl.panels.teststeps.GroovyScriptStepDesktopPanel$RunAction$1.run(GroovyScriptStepDesktopPanel.java:277)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Help pls.
The only solution that worked for me was using json-schema-validator-2.2.8-lib.jar from com.github.java-json-tools (https://mvnrepository.com/artifact/com.github.java-json-tools/json-schema-validator/2.2.8) instead of com.github.fge (which maintained the project up to version 2.2.6). It is important to use the provided bundle of libs instead of collecting all the relevant dependencies manually; somehow that doesn't work correctly, likely because mixed Jackson versions (the question lists jackson-annotations 2.0.1 alongside jackson-core 2.8.1) break the static initialization of SchemaVersion.
I am using SoapUI 5.2.1 in a Docker container built on top of an OpenJDK 8 image.
I am trying to learn Retrofit and want to use it for Google search. I have written the following code:
import retrofit.RestAdapter;
import retrofit.http.GET;

public class RetrofitSample {

    private static final String URL = "https://www.google.com";

    interface Google {
        @GET("/search?q=retrofit")
        String search();
    }

    public static void main(String... args) {
        RestAdapter restAdapter = new RestAdapter.Builder()
                .setEndpoint(URL)
                .build();

        Google goog = restAdapter.create(Google.class);

        String result = goog.search();
        System.out.println(result);
    }
}
When I run this code I get the following error:
Exception in thread "main" retrofit.RetrofitError: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 12 path $
at retrofit.RetrofitError.conversionError(RetrofitError.java:33)
at retrofit.RestAdapter$RestHandler.invokeRequest(RestAdapter.java:383)
at retrofit.RestAdapter$RestHandler.invoke(RestAdapter.java:240)
at $Proxy0.search(Unknown Source)
at RetrofitSample.main(RetrofitSample.java:25)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: retrofit.converter.ConversionException: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 12 path $
at retrofit.converter.GsonConverter.fromBody(GsonConverter.java:67)
at retrofit.RestAdapter$RestHandler.invokeRequest(RestAdapter.java:367)
... 8 more
Caused by: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 12 path $
at com.google.gson.Gson.assertFullConsumption(Gson.java:786)
at com.google.gson.Gson.fromJson(Gson.java:776)
at retrofit.converter.GsonConverter.fromBody(GsonConverter.java:63)
... 9 more
Caused by: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 12 path $
at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1573)
at com.google.gson.stream.JsonReader.checkLenient(JsonReader.java:1423)
at com.google.gson.stream.JsonReader.doPeek(JsonReader.java:546)
at com.google.gson.stream.JsonReader.peek(JsonReader.java:429)
at com.google.gson.Gson.assertFullConsumption(Gson.java:782)
... 11 more
Do I have to use the Google Search API (Custom Search), or is Google search returning some other type as a result?
Thanks in advance.
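For what it's worth, https://www.google.com/search returns an HTML page, not JSON, so the default Gson converter fails as soon as it tries to parse the body. A rough Retrofit 1.x sketch (untested; the raw-response interface below is hypothetical) that fetches the body without running the Gson converter:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import retrofit.RestAdapter;
import retrofit.client.Response;
import retrofit.http.GET;

public class RetrofitRawSample {

    interface GoogleRaw {
        // Declaring retrofit.client.Response as the return type skips the converter.
        @GET("/search?q=retrofit")
        Response search();
    }

    public static void main(String... args) throws Exception {
        RestAdapter restAdapter = new RestAdapter.Builder()
                .setEndpoint("https://www.google.com")
                .build();

        Response raw = restAdapter.create(GoogleRaw.class).search();

        // The body is the HTML of the results page, not JSON.
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(raw.getBody().in(), "UTF-8"));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    }
}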