neo4j: creating multiple nodes and links from json

Please let me know where I am going wrong.
I am using Neo4j to store my routers' interface and link information. A link is to be created between two interfaces.
I have successfully created the nodes and interfaces but am running into issues creating the links.
This is the query I use to create a link:
MATCH (I:Interface), (I2:Interface)
FOREACH(p in FILTER(z in {props} WHERE z.OrigIPAddress = I.IfIPAddress or z.TermIPAddress = I.IfIPAddress) |
MERGE (I {IfIPAddress:p.OrigIPAddress})-[r:link]->(I2 {IfIPAddress:p.TermIPAddress})
ON CREATE SET r = p
ON MATCH SET r = p)
I have an array of maps called props, which I pass in the JSON as params; it contains the properties of each link, i.e. OrigIPAddress (originating interface IP) and TermIPAddress (terminating interface IP).
In the FOREACH I first filter down to those links whose source or destination interface is already present. Then I create the links out of props.
When I run this it completes without error, but no links are created, even though both the source and destination interfaces are present.
EDIT 1:
I modified the query. When I run this version:
MATCH (I:Interface), (I2:Interface)
FOREACH(p in FILTER(z in {props} WHERE z.OrigIPAddress = I.IfIPAddress and z.TermIPAddress = I2.IfIPAddress) |
MERGE (I {IfIPAddress:p.OrigIPAddress})-[r:link]->(I {IfIPAddress:p.TermIPAddress})
ON CREATE SET r = p
ON MATCH SET r = p)
I do not get any response back from Neo4j, and in the Neo4j web console I can see this message:
"Neo4j disconnected, check you socket..."
Here are the logs:
Apr 09, 2014 11:06:46 AM org.neo4j.server.logging.Logger log
WARNING: You are using an unsupported Java runtime. Please use Oracle(R) Java(TM) Runtime Environment 7.
SEVERE: The response of the WebApplicationException cannot be utilized as the response is already committed. Re-throwing to the HTTP container
javax.ws.rs.WebApplicationException: javax.ws.rs.WebApplicationException: org.eclipse.jetty.io.EofException
at org.neo4j.server.rest.repr.OutputFormat$1.write(OutputFormat.java:174)
at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:306)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1437)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:698)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1506)
at org.neo4j.server.rest.security.SecurityFilter.doFilter(SecurityFilter.java:112)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1477)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:503)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:211)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1096)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:432)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:175)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1030)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:136)
at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:445)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:268)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:229)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.run(AbstractConnection.java:358)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:601)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:532)
at java.lang.Thread.run(Thread.java:744)
Caused by: javax.ws.rs.WebApplicationException: org.eclipse.jetty.io.EofException
at org.neo4j.server.rest.repr.formats.StreamingJsonFormat$StreamingRepresentationFormat.flush(StreamingJsonFormat.java:401)
at org.neo4j.server.rest.repr.formats.StreamingJsonFormat$StreamingRepresentationFormat.complete(StreamingJsonFormat.java:389)
at org.neo4j.server.rest.repr.MappingRepresentation.serialize(MappingRepresentation.java:43)
at org.neo4j.server.rest.repr.OutputFormat$1.write(OutputFormat.java:160)
... 30 more
Caused by: org.eclipse.jetty.io.EofException
at org.eclipse.jetty.io.ChannelEndPoint.flush(ChannelEndPoint.java:186)
at org.eclipse.jetty.io.WriteFlusher.write(WriteFlusher.java:335)
at org.eclipse.jetty.io.AbstractEndPoint.write(AbstractEndPoint.java:125)
at org.eclipse.jetty.server.HttpConnection$ContentCallback.process(HttpConnection.java:784)
at org.eclipse.jetty.util.IteratingCallback.iterate(IteratingCallback.java:79)
at org.eclipse.jetty.server.HttpConnection.send(HttpConnection.java:356)
at org.eclipse.jetty.server.HttpChannel.sendResponse(HttpChannel.java:631)
at org.eclipse.jetty.server.HttpChannel.write(HttpChannel.java:661)
at org.eclipse.jetty.server.HttpOutput.flush(HttpOutput.java:151)
at com.sun.jersey.spi.container.servlet.WebComponent$Writer.flush(WebComponent.java:315)
at com.sun.jersey.spi.container.ContainerResponse$CommittingOutputStream.flush(ContainerResponse.java:145)
at org.codehaus.jackson.impl.Utf8Generator.flush(Utf8Generator.java:1091)
at org.neo4j.server.rest.repr.formats.StreamingJsonFormat$StreamingRepresentationFormat.flush(StreamingJsonFormat.java:397)
... 33 more
Caused by: java.io.IOException: Broken pipe
at sun.nio.ch.FileDispatcherImpl.writev0(Native Method)
at sun.nio.ch.SocketDispatcher.writev(SocketDispatcher.java:51)
at sun.nio.ch.IOUtil.write(IOUtil.java:148)
at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:524)
at org.eclipse.jetty.io.ChannelEndPoint.flush(ChannelEndPoint.java:167)
... 45 more
Apr 09, 2014 11:10:17 AM com.sun.jersey.server.impl.application.WebApplicationImpl _handleRequest
SEVERE: The response of the WebApplicationException cannot be utilized as the response is already committed. Re-throwing to the HTTP container
[The same EofException / Broken pipe stack trace as above repeats here; it is truncated in the original log.]
Let me know where I am going wrong.

Is this closer to what you are trying to do?
MATCH (I:Interface)
FOREACH(
p in FILTER(z in {props}
WHERE z.OrigIPAddress = I.IfIPAddress or z.TermIPAddress = I.IfIPAddress) |
MERGE (I)-[r:link]->(:Interface {IfIPAddress:p.TermIPAddress})
SET I.IfIPAddress = p.OrigIPAddress, r = p
);
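Whichever variant you end up with, {props} has to reach the server as a query parameter. A minimal sketch of the request body, assuming the legacy REST Cypher endpoint (POST http://localhost:7474/db/data/cypher on a 2.x server) and made-up placeholder IP addresses:
{
  "query": "MATCH (I:Interface) FOREACH(p in FILTER(z in {props} WHERE z.OrigIPAddress = I.IfIPAddress or z.TermIPAddress = I.IfIPAddress) | MERGE (I)-[r:link]->(:Interface {IfIPAddress:p.TermIPAddress}) SET I.IfIPAddress = p.OrigIPAddress, r = p)",
  "params": {
    "props": [
      {"OrigIPAddress": "10.0.0.1", "TermIPAddress": "10.0.0.2"},
      {"OrigIPAddress": "10.0.0.3", "TermIPAddress": "10.0.0.4"}
    ]
  }
}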

Finally, after googling, I found the issue.
It's a server timeout: since the query takes some time to execute, the server times out and drops the connection.
I increased the timeout and it worked.
Here is the reference on how to increase it:
http://docs.neo4j.org/chunked/stable/server-configuration.html
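For reference, the knob I mean lives in conf/neo4j-server.properties; a sketch of the change, assuming the 2.x-era property name (check the linked page for your version; the value is in milliseconds):
# Give long-running REST queries more time before the web server terminates them.
org.neo4j.server.webserver.limit.executiontime=120000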

Related

Apache Spark SQL get_json_object java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String

I am trying to read a JSON stream from an MQTT broker in Apache Spark with Structured Streaming, read some properties of the incoming JSON and output them to the console. My code looks like this:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object
import org.apache.spark.sql.types.StringType

val spark = SparkSession
.builder()
.appName("BahirStructuredStreaming")
.master("local[*]")
.getOrCreate()
import spark.implicits._
val topic = "temp"
val brokerUrl = "tcp://localhost:1883"
val lines = spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("topic", topic).option("persistence", "memory")
.load(brokerUrl)
.toDF().withColumn("payload", $"payload".cast(StringType))
val jsonDF = lines.select(get_json_object($"payload", "$.eventDate").alias("eventDate"))
val query = jsonDF.writeStream
.format("console")
.start()
query.awaitTermination()
However, when the json arrives I get the following errors:
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Writing job aborted.
=== Streaming Query ===
Identifier: [id = 14d28475-d435-49be-a303-8e47e2f907e3, runId = b5bd28bb-b247-48a9-8a58-cb990edaf139]
Current Committed Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: -1}
Current Available Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: 0}
Current State: ACTIVE
Thread State: RUNNABLE
Logical Plan:
Project [get_json_object(payload#22, $.id) AS eventDate#27]
+- Project [id#10, topic#11, cast(payload#12 as string) AS payload#22, timestamp#13]
+- StreamingExecutionRelation MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496], [id#10, topic#11, payload#12, timestamp#13]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:300)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: org.apache.spark.SparkException: Writing job aborted.
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:92)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3384)
at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2783)
at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3365)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3365)
at org.apache.spark.sql.Dataset.collect(Dataset.scala:2783)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:537)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$14(MicroBatchExecution.scala:533)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:532)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:198)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:166)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:160)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
... 1 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8, localhost, executor driver): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:64)
... 34 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I am sending the JSON records using the mosquitto broker and they look like this:
mosquitto_pub -m '{"eventDate": "2020-11-11T15:17:00.000+0200"}' -t "temp"
It seems that every string coming from the Bahir stream source provider raises this error. For instance, the following code also raises it:
spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("topic", topic).option("persistence", "memory")
.load(brokerUrl)
.select("topic")
.writeStream
.format("console")
.start()
It looks like Spark does not recognize strings coming from Bahir, maybe some kind of weird String class version issue. I've tried the following to make the code work:
set the Java version to 8
upgrade the Spark version from 2.4.0 to 2.4.7
set the Scala version to 2.11.12
use the decode function with all possible encoding combinations instead of .cast(StringType) to transform the "payload" column to String
use the substring function on the "payload" column to recreate a compatible String
Finally, I got working code by recreating the String from the raw payload bytes with the String constructor, via a typed Dataset:
val lines = spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("topic", topic).option("persistence", "memory")
.load(brokerUrl)
.select("payload")
.as[Array[Byte]]
.map(payload => new String(payload))
.toDF("payload")
This solution is rather ugly, but at least it works.
I believe there is nothing wrong with the code provided in the question, and I suspect a bug on the Bahir or Spark side that prevents Spark from handling Strings coming from the Bahir source.

How to configure rsyslog template for Exception error for remote logging?

I'm using rsyslog to ship logs to a remote Logstash server, and Logstash on that server expects input data in JSON format. How can I configure an rsyslog template to JSON-ify an exception? For example, I want to send the following exception as a single message.
2017-02-08 21:59:51,727 ERROR :localhost-startStop-1 [jdbc.sqlonly] 1. PreparedStatement.executeBatch() batching 1 statements:
1: insert into CR_CLUSTER_REGISTRY (Cluster_Name, Url, Update_Dttm, Node_Id) values ('customer', 'rmi://ip-10-53-123.123.eu-west-1.compute.internal:1199/2', '02/08/2017 21:59:51.639', '2')
java.sql.BatchUpdateException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.35] [Error 1338] [SQLState HY000] A failure occurred while executing a PreparedStatement batch request. Details of the failure can be found in the exception chain that is accessible with getNextException.
at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeBatchUpdateException(ErrorFactory.java:148)
at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeBatchUpdateException(ErrorFactory.java:137)
at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatchDMLArray(TDPreparedStatement.java:272)
at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatch(TDPreparedStatement.java:2584)
at com.teradata.tal.qes.StatementProxy.executeBatch(StatementProxy.java:186)
at net.sf.log4jdbc.StatementSpy.executeBatch(StatementSpy.java:539)
at org.hibernate.jdbc.BatchingBatcher.doExecuteBatch(BatchingBatcher.java:70)
at org.hibernate.jdbc.AbstractBatcher.executeBatch(AbstractBatcher.java:268)
at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:266)
at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:167)
at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321)
at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:50)
at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1028)
at com.teradata.tal.common.persistence.dao.SessionWrapper.flush(SessionWrapper.java:920)
at com.teradata.trm.common.persistence.dao.DaoImpl.save(DaoImpl.java:263)
at com.teradata.trm.common.service.AbstractService.save(AbstractService.java:509)
at com.teradata.trm.common.cluster.Cluster.init(Cluster.java:413)
at com.teradata.trm.common.cluster.NodeConfiguration.initialize(NodeConfiguration.java:182)
at com.teradata.trm.common.context.Initializer.onApplicationEvent(Initializer.java:73)
at com.teradata.trm.common.context.Initializer.onApplicationEvent(Initializer.java:30)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:97)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:324)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:929)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:467)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:385)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:284)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4973)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5467)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1247)
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1898)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.00.00.35] [Error -2801] [SQLState 23000] Duplicate unique prime key error in CIM_META.CR_CLUSTER_REGISTRY.
at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:301)
at com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:114)
at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:311)
at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:200)
at com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:137)
at com.teradata.jdbc.jdbc_4.statemachine.PreparedBatchStatementController.run(PreparedBatchStatementController.java:58)
at com.teradata.jdbc.jdbc_4.TDStatement.executeStatement(TDStatement.java:387)
at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatchDMLArray(TDPreparedStatement.java:252)
... 37 more
I have the following rsyslog configuration file. The startmsg.regex "flags" the start of a new message when it sees the "YYYY-mm-dd" date format; until it sees that format again, any text following the date is treated as part of the current message.
input(type="imfile"
File="/usr/share/tomcat/dist/logs/trm-error.log*"
Facility="local3"
Tag="trm-error:"
Severity="error"
startmsg.regex="^[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}"
escapeLF="on"
)
if $programname == 'trm-error:' then {
action(
type="omfwd"
Target="10.53.234.234"
Port="5514"
Protocol="udp"
template="textLogTemplate"
)
stop
}
...and the following template:
# Template for non-JSON logs; just sends the message wholesale with extra
# furniture.
template(name="textLogTemplate" type="list") {
constant(value="{ ")
constant(value="\"type\":\"")
property(name="programname")
constant(value="\", ")
constant(value="\"host\":\"")
property(name="hostname")
constant(value="\", ")
constant(value="\"timestamp\":\"")
property(name="timestamp" dateFormat="rfc3339")
constant(value="\", ")
constant(value="\"#version\":\"1\", ")
constant(value="\"customer\":\"customer\", ")
constant(value="\"role\":\"app2\", ")
constant(value="\"sourcefile\":\"")
property(name="$!metadata!filename")
constant(value="\", ")
constant(value="\"message\":\"")
property(name="rawmsg" format="json")
constant(value="\"}\n")
}
However, Logstash complains about a "jsonparseerror" when it tries to parse the log as JSON. Any clues?
The rsyslog configuration files I'm using are correct; that is, the Java exception log is indeed wrapped into a valid JSON document. However, Logstash is complaining about a _jsonparsefailure, so this problem is probably related to the Logstash Ruby code and not to the rsyslog side.
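For anyone comparing the receiving side: the Logstash input has to match the UDP transport and JSON payload that the omfwd action above produces. A minimal sketch, assuming Logstash listens on the same port 5514 (the port and codec here are assumptions, not taken from the question):
input {
  udp {
    port => 5514
    codec => "json"   # parse each datagram as a JSON event
  }
}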

Maximo REST API json formatting error

When trying to GET data from this path:
/maxrest/rest/mbo/taskscheduler
I'm able to get the data in xml perfectly fine.
But when I try to convert it to JSON using this parameter:
/maxrest/rest/mbo/taskscheduler?_format=json
I get a 500 server error and a NullPointerException in my logs when it tries to convert to JSON. I have gotten this to work when getting other MBO data such as person, assets, etc., but the taskscheduler MBO is giving me issues. I thought maybe there was some bad data being returned, but when I filtered it down to return just the taskname string from taskscheduler, it still threw the error. So it seems to be an issue with the taskscheduler MBO itself. Any ideas?
EDIT
Here is the stacktrace:
05 Oct 2016 15:25:10:802 [ERROR] [MXServer] [CID-MXREST-73697]
java.lang.NullPointerException
at com.ibm.tivoli.maximo.rest.MboJSONStructure.setJSONObjectAttribute(MboJSONStructure.java:601)
at com.ibm.tivoli.maximo.rest.MboJSONStructure.mbo2JSONObject(MboJSONStructure.java:491)
at com.ibm.tivoli.maximo.rest.MboJSONStructure.serializeMboSetAsJSONObject(MboJSONStructure.java:294)
at com.ibm.tivoli.maximo.rest.MboJSONStructure.serializeMboSetAsJSONObject(MboJSONStructure.java:126)
at com.ibm.tivoli.maximo.rest.MboJSONSerializer.serializeResource(MboJSONSerializer.java:76)
at com.ibm.tivoli.maximo.rest.MboSerializer.serialize(MboSerializer.java:274)
at com.ibm.tivoli.maximo.rest.BaseResourceRequestHandler.generateResponse(BaseResourceRequestHandler.java:456)
at com.ibm.tivoli.maximo.rest.BaseResourceRequestHandler.handleRequest(BaseResourceRequestHandler.java:343)
at com.ibm.tivoli.maximo.rest.MaximoRestServlet.invokeRequestHandler(MaximoRestServlet.java:783)
at com.ibm.tivoli.maximo.rest.MaximoRestServlet.doGet(MaximoRestServlet.java:336)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:575)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1232)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:781)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:480)
at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1114)
at com.ibm.ws.webcontainer.servlet.CacheServletWrapper.handleRequest(CacheServletWrapper.java:87)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:940)
at com.ibm.ws.webcontainer.WSWebContainer.handleRequest(WSWebContainer.java:1817)
at com.ibm.ws.webcontainer.channel.WCChannelLink.ready(WCChannelLink.java:200)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:463)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleNewRequest(HttpInboundLink.java:530)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.processRequest(HttpInboundLink.java:316)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.ready(HttpInboundLink.java:287)
at com.ibm.ws.ssl.channel.impl.SSLConnectionLink.determineNextChannel(SSLConnectionLink.java:1049)
at com.ibm.ws.ssl.channel.impl.SSLConnectionLink$MyReadCompletedCallback.complete(SSLConnectionLink.java:643)
at com.ibm.ws.ssl.channel.impl.SSLReadServiceContext$SSLReadCompletedCallback.complete(SSLReadServiceContext.java:1818)
at com.ibm.ws.tcp.channel.impl.AioReadCompletionListener.futureCompleted(AioReadCompletionListener.java:175)
at com.ibm.io.async.AbstractAsyncFuture.invokeCallback(AbstractAsyncFuture.java:217)
at com.ibm.io.async.AsyncChannelFuture.fireCompletionActions(AsyncChannelFuture.java:161)
at com.ibm.io.async.AsyncFuture.completed(AsyncFuture.java:138)
at com.ibm.io.async.ResultHandler.complete(ResultHandler.java:204)
at com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775)
at com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1881)
05 Oct 2016 15:25:10:802 [INFO] [MXServer] [CID-MXREST-73697] Correlated data: BEGIN HttpSessionId:cGV-jEFXtHCp2ao6aqoDkL9 RequestParams:_lid=username&_lpwd=password&_format=json UserName:username ClientIP: RequestURI:/maxrest/rest/mbo/taskscheduler ElapsedTime:63 ms END
In System Properties, try changing mxe.rest.mbo.defaultformat from xml to json.
You can then execute your request without adding _format=json.
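Spelled out, that is the single property line (per the answer above):
mxe.rest.mbo.defaultformat=json
after which a plain GET against /maxrest/rest/mbo/taskscheduler should return JSON by default.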

graphhopper GHRequest not fetching

I have a problem with this code:
GraphHopperAPI gh = new GraphHopperWeb();
gh.load("http://localhost:8989/api/route");
GHResponse ph = gh.route(new GHRequest(45.104546,7.69043,45.104546,7.69043));
It gives me this error:
2014-03-29 09:33:00,036 [main] INFO graphhopper.http.GraphHopperWeb - Full request took:0.037406, API took:0.0
Exception in thread "main" java.lang.RuntimeException: Problem while fetching path 45.104546, 7.69043->45.104546, 7.69043
at com.graphhopper.http.GraphHopperWeb.route(GraphHopperWeb.java:119)
at provaMain.main(provaMain.java:23)
Caused by: org.json.JSONException: A JSONObject text must begin with '{' at character 0
at org.json.JSONTokener.syntaxError(JSONTokener.java:410)
at org.json.JSONObject.<init>(JSONObject.java:179)
at org.json.JSONObject.<init>(JSONObject.java:402)
at com.graphhopper.http.GraphHopperWeb.route(GraphHopperWeb.java:95)
... 1 more
The documentation is currently undergoing a change (moving it from the wiki to the source). Where did you find that snippet? Please try gh.load("http://localhost:8989/"); for the latest branch, and gh.load("http://localhost:8989/api"); for earlier versions.
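For reference, the question's snippet with the base URL adjusted as suggested might look like this; a sketch, assuming the latest branch and the packages shown in the stack trace (note the start and end coordinates in the question are identical, so the route is degenerate):
import com.graphhopper.GHRequest;
import com.graphhopper.GHResponse;
import com.graphhopper.GraphHopperAPI;
import com.graphhopper.http.GraphHopperWeb;

public class RouteExample {
    public static void main(String[] args) {
        GraphHopperAPI gh = new GraphHopperWeb();
        // Point load() at the server root, not at /api/route.
        gh.load("http://localhost:8989/");
        GHResponse ph = gh.route(new GHRequest(45.104546, 7.69043, 45.104546, 7.69043));
        System.out.println(ph);
    }
}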

Accessing Apache ActiveMQ via JMX throws Exception

I'm using a fresh ActiveMQ 5.8.0 installation, where I have a message in a queue called 'testing'. I also replaced the ACTIVEMQ_SUNJMX line in bin/activemq to enable JMX:
ACTIVEMQ_SUNJMX_START="-Dcom.sun.management.jmxremote.port=1099 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false"
Then I query JMX via the following code:
try {
    JMXConnector connector = JMXConnectorFactory.connect(new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi"));
    connector.connect();
    MBeanServerConnection connection = connector.getMBeanServerConnection();
    ObjectName mbeanName = new ObjectName("org.apache.activemq:BrokerName=localhost,Type=Broker");
    BrokerViewMBean mbean = MBeanServerInvocationHandler.newProxyInstance(connection, mbeanName, BrokerViewMBean.class, true);
    System.out.println("Id:" + mbean.getBrokerId()); // here the exception will be thrown
}
catch (Exception x) {
    x.printStackTrace();
}
The Exception is thrown when accessing the mbean.
java.lang.reflect.UndeclaredThrowableException
at $Proxy0.getBrokerId(Unknown Source)
at testing.TestJmx.main(TestJmx.java:25)
Caused by: javax.management.InstanceNotFoundException: org.apache.activemq:BrokerName=localhost,Type=Broker
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1095)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:643)
at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:668)
at javax.management.remote.rmi.RMIConnectionImpl.doOperation(RMIConnectionImpl.java:1424)
at javax.management.remote.rmi.RMIConnectionImpl.access$200(RMIConnectionImpl.java:89)
at javax.management.remote.rmi.RMIConnectionImpl$PrivilegedOperation.run(RMIConnectionImpl.java:1292)
at javax.management.remote.rmi.RMIConnectionImpl.doPrivilegedOperation(RMIConnectionImpl.java:1380)
at javax.management.remote.rmi.RMIConnectionImpl.getAttribute(RMIConnectionImpl.java:621)
at sun.reflect.GeneratedMethodAccessor30.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:322)
at sun.rmi.transport.Transport$1.run(Transport.java:177)
at sun.rmi.transport.Transport$1.run(Transport.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:173)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:553)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:808)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:667)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
at sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:273)
at sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:251)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:160)
at com.sun.jmx.remote.internal.PRef.invoke(Unknown Source)
at javax.management.remote.rmi.RMIConnectionImpl_Stub.getAttribute(Unknown Source)
at javax.management.remote.rmi.RMIConnector$RemoteMBeanServerConnection.getAttribute(RMIConnector.java:901)
at javax.management.MBeanServerInvocationHandler.invoke(MBeanServerInvocationHandler.java:280)
... 2 more
I checked that the port is listening, and VisualVM also shows me the MBeans. Any ideas?
The fact that bean names have changed is documented on the 5.8.0 release page.
So your ObjectName should be of the form:
ObjectName mbeanName = new ObjectName("org.apache.activemq:type=Broker,brokerName=localhost");
Excellent answer. Here is an example of use: API Docs
There is also something important that changed from 5.7 to 5.8. The default URL for JMX services went from:
service:jmx:rmi://localhost/jndi/rmi://localhost:1099/jmxrmi
to:
service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi
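Putting both changes together, a minimal sketch of the question's code updated for 5.8 (the new ObjectName from the accepted answer plus the new default service URL; untested):
import javax.management.MBeanServerConnection;
import javax.management.MBeanServerInvocationHandler;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;
import org.apache.activemq.broker.jmx.BrokerViewMBean;

public class BrokerIdExample {
    public static void main(String[] args) throws Exception {
        // 5.8 default service URL (no host before /jndi).
        JMXConnector connector = JMXConnectorFactory.connect(
                new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi"));
        MBeanServerConnection connection = connector.getMBeanServerConnection();
        // 5.8 ObjectName layout: type=Broker, brokerName=<name>.
        ObjectName mbeanName = new ObjectName("org.apache.activemq:type=Broker,brokerName=localhost");
        BrokerViewMBean mbean = MBeanServerInvocationHandler.newProxyInstance(
                connection, mbeanName, BrokerViewMBean.class, true);
        System.out.println("Id: " + mbean.getBrokerId());
        connector.close();
    }
}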
Lovely question and answer too; I was having a hard time configuring JMX. The ActiveMQ docs aren't up to date in this regard, e.g. http://activemq.apache.org/jmx.html doesn't say that from 5.8.0 onwards "SUNJMX" becomes "ACTIVEMQ_SUNJMX_START".
The config provided by the OP, i.e.:
ACTIVEMQ_SUNJMX_START="-Dcom.sun.management.jmxremote.port=1099 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false"
and the corresponding code are what really worked for me. Thanks! (I am using ActiveMQ 5.8.)