Thingsboard: JSON parse error when reading MQTT timestamp

I get this MQTT payload from a device (and I can't change how the device sends it):
{t:2021-11-08T16:17:15Z,10:99,14:24,55:20.85,56:64.38,53:36.00}
This is the timestamp: t:2021-11-08T16:17:15Z
I know that TB expects a UNIX-style timestamp, and my plan was to convert this timestamp to UNIX style in a Transform node in the Rule Chain.
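For reference, the timestamped-telemetry shape TB accepts (as far as I understand the docs) is epoch milliseconds in a ts field alongside a values object:
{"ts":1636388235000,"values":{"55":20.85,"56":64.38}}
(1636388235000 is 2021-11-08T16:17:15Z in milliseconds.)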
But the payload never gets there: I believe TB parses it before it reaches the Rule Chain, throws an exception, and rejects the readings.
How can I prevent TB from doing that early parsing, so I can get at the value and change it to the expected ts format? Is there a configuration option, or any other way besides forking the project and rewriting the parser?
Thank you!
Here is the error I get from the logs:
2021-11-08 23:39:25,124 [nioEventLoopGroup-4-5] INFO o.t.s.t.mqtt.MqttTransportHandler - [f5a4a965-8177-4daf-9a56-9197a55fab7d] Processing connect msg for client: xxxx!
2021-11-08 23:39:25,125 [nioEventLoopGroup-4-5] INFO o.t.s.t.mqtt.MqttTransportHandler - [f5a4a965-8177-4daf-9a56-9197a55fab7d] Processing connect msg for client with user name: null!
2021-11-08 23:39:25,167 [DefaultTransportService-18-34] INFO o.t.s.t.mqtt.MqttTransportHandler - [f5a4a965-8177-4daf-9a56-9197a55fab7d] Client connected!
2021-11-08 23:39:25,183 [nioEventLoopGroup-4-5] WARN o.t.s.t.mqtt.MqttTransportHandler - [f5a4a965-8177-4daf-9a56-9197a55fab7d] Failed to process publish msg [device/lre/readings][1]
org.thingsboard.server.common.transport.adaptor.AdaptorException: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 18 path $.t
at org.thingsboard.server.transport.mqtt.adaptors.JsonMqttAdaptor.convertToPostTelemetry(JsonMqttAdaptor.java:67)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processDevicePublish(MqttTransportHandler.java:343)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processPublish(MqttTransportHandler.java:298)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processRegularSessionMsg(MqttTransportHandler.java:255)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.lambda$processMsgQueue$0(MqttTransportHandler.java:249)
at org.thingsboard.server.transport.mqtt.session.DeviceSessionCtx.tryProcessQueuedMsgs(DeviceSessionCtx.java:181)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processMsgQueue(MqttTransportHandler.java:249)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.enqueueRegularSessionMsg(MqttTransportHandler.java:241)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processMqttMsg(MqttTransportHandler.java:183)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.channelRead(MqttTransportHandler.java:156)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:719)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 18 path $.t
at com.google.gson.internal.Streams.parse(Streams.java:60)
at com.google.gson.JsonParser.parse(JsonParser.java:84)
at com.google.gson.JsonParser.parse(JsonParser.java:59)
at com.google.gson.JsonParser.parse(JsonParser.java:45)
at org.thingsboard.server.transport.mqtt.adaptors.JsonMqttAdaptor.convertToPostTelemetry(JsonMqttAdaptor.java:65)
... 30 common frames omitted
Caused by: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 18 path $.t
at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1567)
at com.google.gson.stream.JsonReader.doPeek(JsonReader.java:495)
at com.google.gson.stream.JsonReader.hasNext(JsonReader.java:418)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:742)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:718)
at com.google.gson.internal.Streams.parse(Streams.java:48)
... 34 common frames omitted
2021-11-08 23:39:25,183 [nioEventLoopGroup-4-5] INFO o.t.s.t.mqtt.MqttTransportHandler - [f5a4a965-8177-4daf-9a56-9197a55fab7d] Closing current session due to invalid publish msg [device/lre/readings][1]
2021-11-08 23:39:25,184 [nioEventLoopGroup-4-5] INFO o.t.s.t.mqtt.MqttTransportHandler - [f5a4a965-8177-4daf-9a56-9197a55fab7d] Client disconnected!
2021-11-08 23:39:28,533 [queue-scheduler-11-thread-1] INFO o.t.s.q.u.DefaultTbApiUsageClient - Reporting API usage statistics for 3 tenants and customers
2021-11-08 23:39:29,253 [sql-log-1-thread-1] INFO o.t.s.dao.sql.TbSqlBlockingQueue - Queue-0 [TS] queueSize [0] totalAdded [6] totalSaved [6] totalFailed [0]
2021-11-08 23:39:29,254 [sql-log-1-thread-1] INFO o.t.s.dao.sql.TbSqlBlockingQueue - Queue-2 [TS] queueSize [0] totalAdded [90] totalSaved [90] totalFailed [0]
2021-11-08 23:39:29,351 [sql-log-1-thread-1] INFO o.t.s.dao.sql.TbSqlBlockingQueue - Queue-0 [TS Latest] queueSize [0] totalAdded [6] totalSaved [6] totalFailed [0]
2021-11-08 23:39:29,351 [sql-log-1-thread-1] INFO o.t.s.dao.sql.TbSqlBlockingQueue - Queue-2 [TS Latest] queueSize [0] totalAdded [90] totalSaved [90] totalFailed [0]
2021-11-08 23:39:29,530 [TB-Scheduling-6] INFO o.t.server.actors.ActorSystemContext - Rule Engine JS Invoke Stats: requests [6] responses [3] failures [0]
2021-11-08 23:39:29,612 [TB-Scheduling-5] INFO o.t.s.s.s.DefaultTbEntityDataSubscriptionService - Stats: regularQueryInvocationCnt = [1], regularQueryInvocationTime = [2], dynamicQueryCnt = [3] dynamicQueryInvocationCnt = [1], dynamicQueryInvocationTime = [2], alarmQueryInvocationCnt = [0], alarmQueryInvocationTime = [0]
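As far as I can tell there is no such configuration: the message is rejected by the MQTT transport's JSON adaptor before any Rule Chain node ever runs, and the payload is not valid JSON in the first place (the unquoted value t:2021-11-08T16:17:15Z contains colons, which is exactly what trips GSON at column 18). One workaround is to repair the payload before it reaches ThingsBoard, for example with a small MQTT bridge between the device's broker and TB. A minimal sketch in Python, assuming the paho-mqtt 1.x client API; the hosts, topics, and device access token are placeholders, and the payload layout is inferred from the sample above:
# bridge.py - hypothetical repair bridge between the device's broker and ThingsBoard
import json
from datetime import datetime, timezone
import paho.mqtt.client as mqtt

SOURCE_TOPIC = "device/lre/readings"     # topic the device publishes to
TB_TOPIC = "v1/devices/me/telemetry"     # TB's default device telemetry topic

def repair(raw):
    # Turn {t:2021-11-08T16:17:15Z,10:99,...} into {"ts":<epoch ms>,"values":{...}}
    ts_ms, values = None, {}
    for pair in raw.strip().strip("{}").split(","):
        key, val = pair.split(":", 1)
        if key == "t":
            dt = datetime.strptime(val, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
            ts_ms = int(dt.timestamp() * 1000)   # epoch milliseconds, as TB expects
        else:
            values[key] = float(val)
    return json.dumps({"ts": ts_ms, "values": values})

def on_message(client, userdata, msg):
    tb.publish(TB_TOPIC, repair(msg.payload.decode()), qos=1)

tb = mqtt.Client()
tb.username_pw_set("DEVICE_ACCESS_TOKEN")    # placeholder TB device token
tb.connect("tb-host.local", 1883)            # placeholder TB host
tb.loop_start()

src = mqtt.Client()
src.on_message = on_message
src.connect("device-broker.local", 1883)     # placeholder source broker
src.subscribe(SOURCE_TOPIC, qos=1)
src.loop_forever()
The bridge republishes using the documented {"ts": <epoch ms>, "values": {...}} envelope, to TB's default telemetry topic (or to whatever custom topic filter your device profile uses), so no Rule Chain transformation is needed at all.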

Related

Thingsboard: Fails to read valid JSON payload when timestamp is in ISO 8601 format

I send this valid JSON to TB CE, and it fails to read it.
mosquitto_pub -d -q 1 -h "192.168.0.108" -t "device/sck/ybuers/readings" -i xxxxx -m '{"ts":"2021-11-08T16:17Z","value1":"99","value2":"24"}'
If I send this instead (just changing the ts format to UNIX style):
mosquitto_pub -d -q 1 -h "192.168.0.108" -t "device/sck/ybuers/readings" -i xxxxx -m '{"ts":"12345678910","value1":"99","value2":"24"}'
it works.
Is this a big limitation of the platform, or am I missing something basic?
I'm working with TB CE v3.3.1, on Windows.
I paste the error from /var/log/thingsboard below.
Thank you!
2021-11-09 15:25:23,583 [nioEventLoopGroup-4-2] INFO o.t.s.t.mqtt.MqttTransportHandler - [d13716f2-1a55-4056-93c8-11b81bc7794b] Processing connect msg for client: ybuers!
2021-11-09 15:25:23,583 [nioEventLoopGroup-4-2] INFO o.t.s.t.mqtt.MqttTransportHandler - [d13716f2-1a55-4056-93c8-11b81bc7794b] Processing connect msg for client with user name: null!
2021-11-09 15:25:23,639 [DefaultTransportService-28-60] INFO o.t.s.t.mqtt.MqttTransportHandler - [d13716f2-1a55-4056-93c8-11b81bc7794b] Client connected!
2021-11-09 15:25:23,644 [nioEventLoopGroup-4-2] WARN o.t.s.t.mqtt.MqttTransportHandler - [d13716f2-1a55-4056-93c8-11b81bc7794b] Failed to process publish msg [device/sck/ybuers/readings][1]
org.thingsboard.server.common.transport.adaptor.AdaptorException: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 18 path $.t
at org.thingsboard.server.transport.mqtt.adaptors.JsonMqttAdaptor.convertToPostTelemetry(JsonMqttAdaptor.java:67)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processDevicePublish(MqttTransportHandler.java:343)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processPublish(MqttTransportHandler.java:298)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processRegularSessionMsg(MqttTransportHandler.java:255)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.lambda$processMsgQueue$0(MqttTransportHandler.java:249)
at org.thingsboard.server.transport.mqtt.session.DeviceSessionCtx.tryProcessQueuedMsgs(DeviceSessionCtx.java:181)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processMsgQueue(MqttTransportHandler.java:249)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.enqueueRegularSessionMsg(MqttTransportHandler.java:241)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.processMqttMsg(MqttTransportHandler.java:183)
at org.thingsboard.server.transport.mqtt.MqttTransportHandler.channelRead(MqttTransportHandler.java:156)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:719)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 18 path $.t
at com.google.gson.internal.Streams.parse(Streams.java:60)
at com.google.gson.JsonParser.parse(JsonParser.java:84)
at com.google.gson.JsonParser.parse(JsonParser.java:59)
at com.google.gson.JsonParser.parse(JsonParser.java:45)
at org.thingsboard.server.transport.mqtt.adaptors.JsonMqttAdaptor.convertToPostTelemetry(JsonMqttAdaptor.java:65)
... 30 common frames omitted
Caused by: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 18 path $.t
at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1567)
at com.google.gson.stream.JsonReader.doPeek(JsonReader.java:495)
at com.google.gson.stream.JsonReader.hasNext(JsonReader.java:418)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:742)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:718)
at com.google.gson.internal.Streams.parse(Streams.java:48)
... 34 common frames omitted
2021-11-09 15:25:23,645 [nioEventLoopGroup-4-2] INFO o.t.s.t.mqtt.MqttTransportHandler - [d13716f2-1a55-4056-93c8-11b81bc7794b] Closing current session due to invalid publish msg [device/sck/ybuers/readings][1]
2021-11-09 15:25:23,646 [nioEventLoopGroup-4-2] INFO o.t.s.t.mqtt.MqttTransportHandler - [d13716f2-1a55-4056-93c8-11b81bc7794b] Client disconnected!
2021-11-09 15:25:23,664 [nioEventLoopGroup-4-1] INFO o.t.s.t.mqtt.MqttTransportHandler - [f03fba50-a2be-4c2f-a301-a48c79d6baaf] Client disconnected!
If you're using Windows to execute that command, try using Windows-style quoting:
"{\"ts\":\"12345678910\",\"value1\":\"99\",\"value2\":\"24\"}"

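For example, the second command above would then look like this from cmd.exe (same host, topic, and client id placeholders as in the question):
mosquitto_pub -d -q 1 -h "192.168.0.108" -t "device/sck/ybuers/readings" -i xxxxx -m "{\"ts\":\"12345678910\",\"value1\":\"99\",\"value2\":\"24\"}"
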
R and H2O: MySQL JDBC Connection Error

I am attempting to use the h2o package in R to query a MySQL database and store the results directly in an h2o instance I have running on my local machine.
Provided below are all of my code and the errors received; I would very much appreciate a new set of eyes to help me troubleshoot. Thanks!
Via Windows Command Prompt:
java -cp "C:\Program Files\h2o-3.8.2.6\h2o.jar;C:\Program Files\MySQL\mysql-connector-java-5.1.39\mysql-connector-java-5.1.39-bin.jar" water.H2OApp
Which returns:
05-31 12:20:41.601 127.0.0.1:54321 5280 main INFO: HDFS subsystem successfully initialized
05-31 12:20:41.602 127.0.0.1:54321 5280 main INFO: S3 subsystem successfully initialized
05-31 12:20:41.604 127.0.0.1:54321 5280 main INFO: Flow dir: 'C:/Users/ekorne201/h2oflows'
05-31 12:20:41.925 127.0.0.1:54321 5280 main INFO: Cloud of size 1 formed [/127.0.0.1:54321]
05-31 12:20:41.931 127.0.0.1:54321 5280 main INFO: Registered 0 extensions in: 531mS
05-31 12:20:42.525 127.0.0.1:54321 5280 main INFO: Registered: 124 REST APIs in: 553mS
05-31 12:20:42.995 127.0.0.1:54321 5280 main INFO: Registered: 193 schemas in: 468mS
05-31 12:20:42.995 127.0.0.1:54321 5280 main INFO:
05-31 12:20:42.996 127.0.0.1:54321 5280 main INFO: Open H2O Flow in your web browser: http://127.0.0.1:54321/
05-31 12:20:42.998 127.0.0.1:54321 5280 main INFO:
Then, via R I run the following
(db/server details masked):
#packages
require(h2o)
ip <- '127.0.0.1'
port <- 54321
h2o.init(ip = ip, port = port)
#db details
dbip <- 'XX.XX.XX.XX'
dbport <- 'XXXX'
dblogin <- 'login'
dbpass <- 'password'
#test connection
test <- h2o.import_sql_select(
connection_url = paste("jdbc:mysql://", dbip, ":", dbport, "/myDB?useSSL=false", sep = ''),
select_query = 'select * from myTable limit 1000',
username = dblogin,
password = dbpass)
which returns the following errors:
ERROR: Unexpected HTTP Status code: 500 Server Error (url = http://127.0.0.1:54321/99/ImportSQLTable)
java.lang.RuntimeException
[1] "java.lang.RuntimeException: SQLException: Communications link failure\n\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.\nFailed to connect and read from SQL database with connection_url: jdbc:mysql://XX.XX.XX.XX:XXXX/assumption_matrix?useSSL=false"
[2] " water.jdbc.SQLManager.importSqlTable(SQLManager.java:135)"
[3] " water.api.ImportSQLTableHandler.importSQLTable(ImportSQLTableHandler.java:15)"
[4] " sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)"
[5] " sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)"
[6] " sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)"
[7] " java.lang.reflect.Method.invoke(Unknown Source)"
[8] " water.api.Handler.handle(Handler.java:62)"
[9] " water.api.RequestServer.handle(RequestServer.java:655)"
[10] " water.api.RequestServer.serve(RequestServer.java:596)"
[11] " water.JettyHTTPD$H2oDefaultServlet.doGeneric(JettyHTTPD.java:745)"
[12] " water.JettyHTTPD$H2oDefaultServlet.doPost(JettyHTTPD.java:681)"
[13] " javax.servlet.http.HttpServlet.service(HttpServlet.java:755)"
[14] " javax.servlet.http.HttpServlet.service(HttpServlet.java:848)"
[15] " org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)"
[16] " org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:503)"
[17] " org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)"
[18] " org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)"
[19] " org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)"
[20] " org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)"
[21] " org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:429)"
[22] " org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)"
[23] " org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)"
[24] " org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)"
[25] " org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)"
[26] " org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)"
[27] " org.eclipse.jetty.server.Server.handle(Server.java:370)"
[28] " org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)"
[29] " org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)"
[30] " org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)"
[31] " org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)"
[32] " org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)"
[33] " org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)"
[34] " org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)"
[35] " org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)"
[36] " org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)"
[37] " org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)"
[38] " java.lang.Thread.run(Unknown Source)"
Error in .h2o.doSafeREST(h2oRestApiVersion = h2oRestApiVersion, urlSuffix = page, :
ERROR MESSAGE:
SQLException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
Failed to connect and read from SQL database with connection_url: jdbc:mysql://XX.XX.XX.XX:XXXX/myDB?useSSL=false
I'm stumped and would appreciate any help. I am able to connect to this server via ODBC without trouble. Thanks!
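"Communications link failure ... The driver has not received any packets from the server" generally means the JDBC driver never got a TCP response at all, which points at networking (a firewall, MySQL's bind-address, or a wrong port) rather than at H2O or R; note that the ODBC connection that works may be going through a different route or port. A quick reachability check from the machine running the H2O JVM, sketched in Python with placeholder host/port:
# check_port.py - hypothetical reachability test; use the same host/port as in the JDBC URL
import socket

HOST, PORT = "XX.XX.XX.XX", 3306   # placeholders
try:
    socket.create_connection((HOST, PORT), timeout=5).close()
    print("TCP connection OK - the MySQL port is reachable from this machine")
except OSError as err:
    print("TCP connection failed:", err)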

Apache Pig error while dumping Json data

I have a JSON file and want to load it using Apache Pig.
I am using the built-in JsonLoader to load the JSON data. Below is the sample data:
cat jsondata1.json
{ "response": { "id": 10123, "thread": "Sloths", "comments": ["Sloths are adorable So chill"] }, "response_time": 0.425 }
{ "response": { "id": 13828, "thread": "Bigfoot", "comments": ["hello world"] } , "response_time": 0.517 }
Here I load the JSON data using the built-in JsonLoader. While loading there is no error, but while dumping the data it gives the following error:
grunt> a = load '/home/cloudera/jsondata1.json' using JsonLoader('response:tuple (id:int, thread:chararray, comments:bag {tuple(comment:chararray)}), response_time:double');
grunt> dump a;
2016-04-17 01:11:13,286 [pool-4-thread-1] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader - Current split being processed file:/home/cloudera/jsondata1.json:0+229
2016-04-17 01:11:13,287 [pool-4-thread-1] WARN org.apache.hadoop.conf.Configuration - dfs.https.address is deprecated. Instead, use dfs.namenode.https-address
2016-04-17 01:11:13,311 [pool-4-thread-1] WARN org.apache.pig.data.SchemaTupleBackend - SchemaTupleBackend has already been initialized
2016-04-17 01:11:13,321 [pool-4-thread-1] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map - Aliases being processed per job phase (AliasName[line,offset]): M: a[5,4] C: R:
2016-04-17 01:11:13,349 [Thread-16] INFO org.apache.hadoop.mapred.LocalJobRunner - Map task executor complete.
2016-04-17 01:11:13,351 [Thread-16] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local801054416_0004
java.lang.Exception: org.codehaus.jackson.JsonParseException: Current token (FIELD_NAME) not numeric, can not use numeric value accessors
at [Source: java.io.ByteArrayInputStream@2484de3c; line: 1, column: 120]
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:406)
Caused by: org.codehaus.jackson.JsonParseException: Current token (FIELD_NAME) not numeric, can not use numeric value accessors
at [Source: java.io.ByteArrayInputStream@2484de3c; line: 1, column: 120]
at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1291)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:385)
at org.codehaus.jackson.impl.JsonNumericParserBase._parseNumericValue(JsonNumericParserBase.java:399)
at org.codehaus.jackson.impl.JsonNumericParserBase.getDoubleValue(JsonNumericParserBase.java:311)
at org.apache.pig.builtin.JsonLoader.readField(JsonLoader.java:203)
at org.apache.pig.builtin.JsonLoader.getNext(JsonLoader.java:157)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:211)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:268)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
2016-04-17 01:11:13,548 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_local801054416_0004
2016-04-17 01:11:13,548 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases a
2016-04-17 01:11:13,548 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: a[5,4] C: R:
2016-04-17 01:11:18,059 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2016-04-17 01:11:18,059 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_local801054416_0004 has failed! Stop running all dependent jobs
2016-04-17 01:11:18,059 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2016-04-17 01:11:18,059 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2016-04-17 01:11:18,060 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Detected Local mode. Stats reported below may be incomplete
2016-04-17 01:11:18,060 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.0.0-cdh4.7.0 0.11.0-cdh4.7.0 cloudera 2016-04-17 01:11:12 2016-04-17 01:11:18 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_local801054416_0004 a MAP_ONLY Message: Job failed! file:/tmp/temp-1766116741/tmp1151698221,
Input(s):
Failed to read data from "/home/cloudera/jsondata1.json"
Output(s):
Failed to produce result in "file:/tmp/temp-1766116741/tmp1151698221"
Job DAG:
job_local801054416_0004
2016-04-17 01:11:18,060 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2016-04-17 01:11:18,061 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias a
Details at logfile: /home/cloudera/pig_1460877001124.log
I am not able to find the issue. How do I define the correct schema for the above JSON data?
Try this:
comments:{(chararray)}
because this version:
comments:bag {tuple(comment:chararray)}
fits this JSON schema:
"comments": [{comment:"hello world"}]
and you have simple string values, not nested documents:
"comments": ["hello world"]

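Applying that fix to the load statement from the question, it would become:
grunt> a = load '/home/cloudera/jsondata1.json' using JsonLoader('response:tuple (id:int, thread:chararray, comments:{(chararray)}), response_time:double');
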
JsonLoader throws error in pig

I am unable to decode this simple JSON; I don't know what I am doing wrong.
Please help me with this Pig script.
I have to decode the below data, which is in JSON format.
3.json
{
"id": 6668,
"source_name": "National Stock Exchange of India",
"source_code": "NSE"
}
and my Pig script is:
a = LOAD '3.json' USING org.apache.pig.builtin.JsonLoader ('id:int, source_name:chararray, source_code:chararray');
dump a;
The error I get is given below:
2015-07-23 13:40:08,715 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local1664361500_0001_m_000000_0
2015-07-23 13:40:08,775 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : [ ]
2015-07-23 13:40:08,780 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.MapTask - Processing split: Number of splits :1
Total Length = 88
Input split[0]:
Length = 88
Locations:
-----------------------
2015-07-23 13:40:08,793 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader - Current split being processed file:/home/hariprasad.sudo/3.json:0+88
2015-07-23 13:40:08,844 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map - Aliases being processed per job phase (AliasName[line,offset]): M: a[1,4] C: R:
2015-07-23 13:40:08,861 [Thread-5] INFO org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2015-07-23 13:40:08,867 [Thread-5] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local1664361500_0001
java.lang.Exception: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@61a79110; line: 1, column: 0])
at [Source: java.io.ByteArrayInputStream@61a79110; line: 1, column: 3]
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@61a79110; line: 1, column: 0])
at [Source: java.io.ByteArrayInputStream@61a79110; line: 1, column: 3]
at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1291)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:385)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportInvalidEOF(JsonParserMinimalBase.java:318)
at org.codehaus.jackson.impl.JsonParserBase._handleEOF(JsonParserBase.java:354)
at org.codehaus.jackson.impl.Utf8StreamParser._skipWSOrEnd(Utf8StreamParser.java:1841)
at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:275)
at org.apache.pig.builtin.JsonLoader.readField(JsonLoader.java:180)
at org.apache.pig.builtin.JsonLoader.getNext(JsonLoader.java:164)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:211)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:533)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
2015-07-23 13:40:09,179 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2015-07-23 13:40:09,179 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_local1664361500_0001 has failed! Stop running all dependent jobs
2015-07-23 13:40:09,179 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2015-07-23 13:40:09,180 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2015-07-23 13:40:09,180 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Detected Local mode. Stats reported below may be incomplete
2015-07-23 13:40:09,181 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.3.0-cdh5.1.3 0.12.0-cdh5.1.3 hariprasad.sudo 2015-07-23 13:40:07 2015-07-23 13:40:09 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_local1664361500_0001 a MAP_ONLY Message: Job failed! file:/tmp/temp-65649055/tmp1240506051,
Input(s):
Failed to read data from "file:///home/hariprasad.sudo/3.json"
Output(s):
Failed to produce result in "file:/tmp/temp-65649055/tmp1240506051"
Job DAG:
job_local1664361500_0001
2015-07-23 13:40:09,181 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2015-07-23 13:40:09,186 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias a
Details at logfile: /home/hariprasad.sudo/pig_1437673203961.log
grunt> 2015-07-23 13:40:14,754 [communication thread] INFO org.apache.hadoop.mapred.LocalJobRunner - map > map
Please help me understand what is wrong.
Thanks,
Hari
Keep the compact version of the JSON in 3.json. We can use http://www.jsoneditoronline.org for this.
3.json
{"id":6668,"source_name":"National Stock Exchange of India","source_code":"NSE"}
With this we are able to dump the data:
(6668,National Stock Exchange of India,NSE)
Ref: Error from Json Loader in Pig, where a similar issue is discussed.
Extract from the above link:
Pig doesn't usually like "human readable" json. Get rid of the spaces and/or indentations, and you're good.
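If jq is installed, its -c flag produces that compact single-line form directly, e.g.:
jq -c . 3.json > 3-compact.json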

Which namespace on assign operation in bpel to use - selectionFailure, no results for expression

The situation is that I have a fairly simple BPEL process that invokes a service. I want to access the response message elements and assign them to another service (or even to the result of the BPEL process itself, to return to the client). The issue I am having is that the imported WSDL for the service being invoked has a namespace declaration in it, e.g. ldap, and all the imported XSD elements for that WSDL also have the same ldap namespace declared.
<definitions
xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"
xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:ldap="http://webservices.hrldaplookup.ecis.police.uk/"
xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.xmlsoap.org/wsdl/"
targetNamespace="http://webservices.hrldaplookup.ecis.police.uk/" name="LDAPLookupServiceImpl">
This is then imported into my BPEL process, again using the ldap namespace.
<bpel:process name="HRLDAPProces"
targetNamespace="http://ldap.ecis.police.uk/Person/process"
suppressJoinFailure="yes"
xmlns:tns="http://ldap.ecis.police.uk/Person/process"
xmlns:bpel="http://docs.oasis-open.org/wsbpel/2.0/process/executable"
xmlns:ldap="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:ns1="http://www.w3.org/2001/XMLSchema" xmlns:ns0="http://uk.police.ecis.police.uk/athena/services/ConstrainedValueService" xmlns:ns="http://webservices.cvmanagement.athena.ecis.police.uk/">
<!-- Import the client WSDL -->
<bpel:import namespace="http://webservices.cvmanagement.athena.ecis.police.uk/" location="ConstrainedValueService.wsdl" importType="http://schemas.xmlsoap.org/wsdl/"></bpel:import>
<bpel:import namespace="http://uk.police.ecis.police.uk/athena/services/ConstrainedValueService" location="ConstrainedValueService_1.wsdl" importType="http://schemas.xmlsoap.org/wsdl/"></bpel:import>
<bpel:import namespace="http://webservices.hrldaplookup.ecis.police.uk/" location="LDAPLookupServiceImpl.wsdl" importType="http://schemas.xmlsoap.org/wsdl/"></bpel:import>
<bpel:import location="HRLDAPProcesArtifacts.wsdl" namespace="http://ldap.ecis.police.uk/Person/process"
importType="http://schemas.xmlsoap.org/wsdl/" />
When the service is invoked, the response message has its own arbitrary namespaces assigned to the elements.
<getPersonnelResponse xmlns="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:S="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns2="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:ns3="http://ldap.ecis.police.uk/Person" xmlns:ns4="http://ecis.police.uk/ldaplookupservice"><personnelData xmlns="http://ecis.police.uk/ldaplookupservice"><detail xmlns="http://ldap.ecis.police.uk/Person">
When I want to assign variable parameter parts to something else, I don't know which namespace to use.
$LDAPLookupResponse.parameters/ldap:personnelData/ldap:detail/item[1]
or
$LDAPLookupResponse.parameters/ns2:personnelData/ns4:detail/ns4:item[1]
Neither seems to work.
I'm sure I am just missing something simple, I just need pointing in the right direction.
Thanks
I'm using WSO2 Business Process Server.
The full BPEL process is here, as requested by Thilini Ishaka - thanks!
The log file for the error is:
TID: [0] [BPS] [2013-01-21 16:22:47,750] DEBUG {org.wso2.carbon.bpel.messagetrace} - Service invocation completed: MEXId: hqejbhcnphr7xlanvn6p6t :: {http://webservices.hrldaplookup.ecis.police.uk/}LDAPLookupServiceImpl.getPersonnel {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:22:47,750] TRACE {org.wso2.carbon.bpel.messagetrace} - Response message: MEXId: hqejbhcnphr7xlanvn6p6t :: <?xml version='1.0' encoding='utf-8'?><S:Envelope xmlns:S="http://schemas.xmlsoap.org/soap/envelope/"><S:Body><ns2:getPersonnelResponse xmlns:ns2="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:ns4="http://ecis.police.uk/ldaplookupservice" xmlns:ns3="http://ldap.ecis.police.uk/Person"><ns4:personnelData><ns3:detail><ns3:item title="Managers Name">Bob NELSON PSE 56619</ns3:item><ns3:item title="Fullname">Conrad CRAMPTON PSE 52704</ns3:item><ns3:item title="Rank">PSE</ns3:item><ns3:item title="Collar Number">46052704</ns3:item><ns3:item title="Location">Headquarters</ns3:item><ns3:item title="Email address">conrad.crampton@kent.pnn.police.uk</ns3:item><ns3:item title="Last Name">Crampton</ns3:item><ns3:item title="Force Number">52704</ns3:item><ns3:item title="Managers Force Number">56619</ns3:item><ns3:item title="First Name">Conrad</ns3:item></ns3:detail></ns4:personnelData></ns2:getPersonnelResponse></S:Body></S:Envelope> {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:22:47,750] INFO {org.apache.ode.bpel.runtime.ASSIGN} - Assignment Fault: {http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure,lineNo=322,faultExplanation={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: R-Value expression "{OXPath10Expression $LDAPLookupResponse.parameters//ldap:item[@title = 'Rank']}" did not select any nodes. {org.apache.ode.bpel.runtime.ASSIGN}
TID: [0] [BPS] [2013-01-21 16:22:47,750] INFO {org.apache.ode.bpel.runtime.ASSIGN} - Assignment Fault: {http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure,lineNo=322,faultExplanation={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: R-Value expression "{OXPath10Expression $LDAPLookupResponse.parameters//ldap:item[@title = 'Rank']}" did not select any nodes. {org.apache.ode.bpel.runtime.ASSIGN}
TID: [0] [BPS] [2013-01-21 16:22:47,765] WARN {org.apache.ode.bpel.engine.BpelProcess} - Instance 3652 of {http://ldap.ecis.police.uk/Person/process}HRLDAPProces-31 has completed with fault: FaultData: [faultName={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure, faulType=null ({http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: R-Value expression "{OXPath10Expression $LDAPLookupResponse.parameters//ldap:item[@title = 'Rank']}" did not select any nodes.)] #322 {org.apache.ode.bpel.engine.BpelProcess}
TID: [0] [BPS] [2013-01-21 16:22:47,765] WARN {org.apache.ode.bpel.engine.BpelProcess} - Instance 3652 of {http://ldap.ecis.police.uk/Person/process}HRLDAPProces-31 has completed with fault: FaultData: [faultName={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure, faulType=null ({http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: R-Value expression "{OXPath10Expression $LDAPLookupResponse.parameters//ldap:item[@title = 'Rank']}" did not select any nodes.)] #322 {org.apache.ode.bpel.engine.BpelProcess}
TID: [0] [BPS] [2013-01-21 16:22:47,859] DEBUG {org.wso2.carbon.bpel.messagetrace} - Reply Sent: HRLDAPProces.{http://ldap.ecis.police.uk/Person/process}process {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:22:47,859] TRACE {org.wso2.carbon.bpel.messagetrace} - Response message: <?xml version='1.0' encoding='utf-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><soapenv:Header xmlns:wsa="http://www.w3.org/2005/08/addressing"><wsa:Action>http://ldap.ecis.police.uk/Person/process/HRLDAPProces/processResponse</wsa:Action><wsa:RelatesTo>http://identifiers.wso2.com/messageid/1358785364081/1999227541</wsa:RelatesTo></soapenv:Header><soapenv:Body><soapenv:Fault><faultcode>soapenv:Server</faultcode><faultstring xmlns:axis2ns2="http://docs.oasis-open.org/wsbpel/2.0/process/executable">axis2ns2:selectionFailure</faultstring><detail/></soapenv:Fault></soapenv:Body></soapenv:Envelope> {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:23:17,875] INFO {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil} - 'admin@carbon.super [-1234]' logged in at [2013-01-21 16:23:17,875+0000] {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil}
The problem could be a namespace conflict in your process file.
Ideally it should work with:
$LDAPLookupResponse.parameters/ldap:personnelData/ldap:detail/item[1]
Can you please post the full BPEL config and the full error log, to check whether there are any namespace conflicts in the configuration?
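One thing worth checking in the meantime: the prefix in the XPath is resolved against the namespace URIs declared in the BPEL process itself, not against the prefixes the service happens to emit; in the response trace above, personnelData is bound to http://ecis.police.uk/ldaplookupservice while detail and item are bound to http://ldap.ecis.police.uk/Person, neither of which is the URI behind the ldap prefix. A sketch of what the query could look like with two hypothetical prefixes (lus, per) declared on the process element and bound to those URIs:
<bpel:process ...
xmlns:lus="http://ecis.police.uk/ldaplookupservice"
xmlns:per="http://ldap.ecis.police.uk/Person" ...>
and then in the copy:
$LDAPLookupResponse.parameters/lus:personnelData/per:detail/per:item[1]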