Spark SQL: parse timestamp without seconds - json

Some data I don't control comes with a field that is supposed to be a timestamp but doesn't always comply with the ISO 8601 standard.
I defined a schema, and when Spark SQL parses my JSON data I get the following error:
java.lang.IllegalArgumentException: 2016-10-07T11:15Z
The source data has the following:
"transaction_date_time": "2016-10-07T11:15Z"
My schema is defined as follows:
val schema = (new StructType)
  .add("transaction_date_time", TimestampType)
I believe the failure is due to the missing seconds. How can I correctly parse this timestamp?
Edit:
For example, reading the data using
spark.read.schema(schema).json(rdd).show()
will trigger the following error:
16/10/24 13:06:27 ERROR Executor: Exception in task 6.0 in stage 5.0 (TID 23)
java.lang.IllegalArgumentException: 2016-10-07T11:15Z
at org.apache.xerces.jaxp.datatype.XMLGregorianCalendarImpl$Parser.skip(Unknown Source)
at org.apache.xerces.jaxp.datatype.XMLGregorianCalendarImpl$Parser.parse(Unknown Source)
at org.apache.xerces.jaxp.datatype.XMLGregorianCalendarImpl.<init>(Unknown Source)
at org.apache.xerces.jaxp.datatype.DatatypeFactoryImpl.newXMLGregorianCalendar(Unknown Source)
at javax.xml.bind.DatatypeConverterImpl._parseDateTime(DatatypeConverterImpl.java:422)
at javax.xml.bind.DatatypeConverterImpl.parseDateTime(DatatypeConverterImpl.java:417)
at javax.xml.bind.DatatypeConverter.parseDateTime(DatatypeConverter.java:327)
at org.apache.spark.sql.catalyst.util.DateTimeUtils$.stringToTime(DateTimeUtils.scala:140)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$.convertField(JacksonParser.scala:114)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$.convertObject(JacksonParser.scala:215)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$.convertField(JacksonParser.scala:182)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$.convertRootField(JacksonParser.scala:73)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$$anonfun$parseJson$1$$anonfun$apply$2.apply(JacksonParser.scala:288)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$$anonfun$parseJson$1$$anonfun$apply$2.apply(JacksonParser.scala:285)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2366)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$$anonfun$parseJson$1.apply(JacksonParser.scala:285)
at org.apache.spark.sql.execution.datasources.json.JacksonParser$$anonfun$parseJson$1.apply(JacksonParser.scala:280)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:246)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:240)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:784)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:784)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:85)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
16/10/24 13:06:27 WARN TaskSetManager: Lost task 6.0 in stage 5.0 (TID 23, localhost): java.lang.IllegalArgumentException: 2016-10-07T11:15Z
... (same stack trace as above)
16/10/24 13:06:27 ERROR TaskSetManager: Task 6 in stage 5.0 failed 1 times; aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 6 in stage 5.0 failed 1 times, most recent failure: Lost task 6.0 in stage 5.0 (TID 23, localhost): java.lang.IllegalArgumentException: 2016-10-07T11:15Z
... (same stack trace as above)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1450)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1438)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1437)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1437)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1659)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1871)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1884)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1897)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:347)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:39)
at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2183)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2532)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2182)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2189)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1925)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1924)
at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2562)
at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:239)
at org.apache.spark.sql.Dataset.show(Dataset.scala:526)
at org.apache.spark.sql.Dataset.show(Dataset.scala:486)
at org.apache.spark.sql.Dataset.show(Dataset.scala:495)
... 54 elided
Caused by: java.lang.IllegalArgumentException: 2016-10-07T11:15Z
... (same stack trace as above)

You can change
val schema = (new StructType)
  .add("transaction_date_time", TimestampType)
to
val schema = (new StructType)
  .add("transaction_date_time", StringType)
and then use df.withColumn("columnTimeWithOutSec", unix_timestamp($"time", format)),
where format is a pattern for a time without seconds, e.g. "HH:mm".
Also, have a look at DateTimeUtils.scala to stay in line with Spark-style conversions of Date and Timestamp.
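A minimal sketch of that approach, reading the field as a string and converting it afterwards (the output column name transaction_ts and the exact pattern are illustrative choices, not from the original answer):
import org.apache.spark.sql.functions.unix_timestamp
import org.apache.spark.sql.types.{StringType, StructType, TimestampType}
import spark.implicits._ // for the $"..." column syntax

// Read the field as a plain string so the JSON parser doesn't choke on it.
val schema = (new StructType)
  .add("transaction_date_time", StringType)
val df = spark.read.schema(schema).json(rdd)

// Convert explicitly with a pattern that has no seconds;
// "yyyy-MM-dd'T'HH:mmX" matches values like "2016-10-07T11:15Z".
val parsed = df.withColumn(
  "transaction_ts",
  unix_timestamp($"transaction_date_time", "yyyy-MM-dd'T'HH:mmX").cast(TimestampType))
Note that unix_timestamp only keeps second precision, which is fine here since the source values carry no seconds at all.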

It looks like Spark's TimestampType is just a wrapper around java.sql.Timestamp; at least, that's what this leads me to believe.
Accordingly, we can parse the date using SimpleDateFormat, extract the milliseconds, and pass them to the Timestamp constructor.
You could preprocess the data with something like this example:
import java.sql.Timestamp;
import java.text.*;
import java.util.Date;

public class Test {
    public static void main(String[] args) {
        String timestamp = "2016-10-07T11:15Z";
        // "XXX" parses an ISO 8601 zone designator such as the trailing "Z".
        DateFormat df = new SimpleDateFormat("yyyy-MM-dd'T'HH:mmXXX");
        Date parsedDate = null;
        try {
            parsedDate = df.parse(timestamp);
        } catch (Exception e) {
            // do nothing (note: parsedDate stays null on failure, so the next line would NPE)
        }
        Timestamp ts = new Timestamp(parsedDate.getTime());
        System.out.println(parsedDate);
        System.out.println(ts);
    }
}
Which outputs
Fri Oct 07 04:15:00 PDT 2016
2016-10-07 04:15:00.0
I searched a bit for 'optional parts in a date format' and found this SO answer saying you should just create two SimpleDateFormats.
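A minimal sketch of that two-format fallback idea, in Scala on top of the same Java API (the patterns are illustrative):
import java.text.SimpleDateFormat
import java.util.Date

// Try the full ISO pattern first, then fall back to the one without seconds.
val withSeconds = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssX")
val withoutSeconds = new SimpleDateFormat("yyyy-MM-dd'T'HH:mmX")

def parseLenient(s: String): Date =
  try withSeconds.parse(s)
  catch { case _: java.text.ParseException => withoutSeconds.parse(s) }

parseLenient("2016-10-07T11:15Z") // handled by the fallback format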

Related

Java out of memory problem with parquet in spark file

I was trying to solve the 2009 problem from the Data Exposition of the Statistical
Computing and Statistical Graphics sections. The first step was to load the datasets, merge them, and save them as a Parquet file. I am having trouble with this last step.
a.spark<-spark_read_csv(sc,"1987.csv")
b.spark<-spark_read_csv(sc,"1988.csv")
c.spark<-spark_read_csv(sc,"1989.csv")
d.spark<-spark_read_csv(sc,"1990.csv")
e.spark<-spark_read_csv(sc,"1991.csv")
f.spark<-spark_read_csv(sc,"1992.csv")
g.spark<-spark_read_csv(sc,"1993.csv")
h.spark<-spark_read_csv(sc,"1994.csv")
i.spark<-spark_read_csv(sc,"1995.csv")
j.spark<-spark_read_csv(sc,"1996.csv")
k.spark<-spark_read_csv(sc,"1997.csv")
l.spark<-spark_read_csv(sc,"1998.csv")
m.spark<-spark_read_csv(sc,"1999.csv")
n.spark<-spark_read_csv(sc,"2000.csv")
o.spark<-spark_read_csv(sc,"2001.csv")
p.spark<-spark_read_csv(sc,"2002.csv")
q.spark<-spark_read_csv(sc,"2003.csv")
r.spark<-spark_read_csv(sc,"2004.csv")
s.spark<-spark_read_csv(sc,"2005.csv")
t.spark<-spark_read_csv(sc,"2006.csv")
u.spark<-spark_read_csv(sc,"2007.csv")
v.spark<-spark_read_csv(sc,"2008.csv")
ab<-full_join(a.spark,b.spark,suffix=c("a.spark","b.spark"))
cd<-full_join(c.spark,d.spark,suffix=c("c.spark","d.spark"))
ef<-full_join(e.spark,f.spark,suffix=c("e.spark","f.spark"))
gh<-full_join(g.spark,h.spark)
ij<-full_join(i.spark,j.spark)
kl<-full_join(k.spark,l.spark)
mn<-full_join(m.spark,n.spark)
op<-full_join(o.spark,p.spark)
qr<-full_join(q.spark,r.spark)
st<-full_join(s.spark,t.spark)
uv<-full_join(u.spark,v.spark)
abcd<-full_join(ab,cd,suffix=c("ab","cd"))
efgh<-full_join(ef,gh)
ijkl<-full_join(ij,kl)
mnop<-full_join(mn,op)
qrst<-full_join(qr,st)
ah<-full_join(abcd,efgh)
ip<-full_join(ijkl,mnop)
qv<-full_join(qrst,uv)
ap<-full_join(ah,ip)
av<-full_join(ap,qv)
spark_write_parquet(av,path=("the path on the pc"),mode="overwrite")
and I get this error:
Error: org.apache.spark.SparkException: Job aborted.
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:224)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:154)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:547)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at sparklyr.Invoke.invoke(invoke.scala:161)
at sparklyr.StreamHandler.handleMethodCall(stream.scala:141)
at sparklyr.StreamHandler.read(stream.scala:62)
at sparklyr.BackendHandler$$anonfun$channelRead0$1.apply$mcV$sp(handler.scala:60)
at scala.util.control.Breaks.breakable(Breaks.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:40)
at sparklyr.BackendHandler.channelRead0(handler.scala:14)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 in stage 159.0 failed 1 times, most recent failure: Lost task 7.0 in stage 159.0 (TID 663, localhost, executor driver): org.apache.spark.memory.SparkOutOfMemoryError: error while calling spill() on org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter#1337ff33 : Spazio su disco insufficiente
at org.apache.spark.memory.TaskMemoryManager.acquireExecutionMemory(TaskMemoryManager.java:217)
at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:283)
at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:117)
at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.acquireNewPageIfNecessary(UnsafeExternalSorter.java:383)
at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.insertRecord(UnsafeExternalSorter.java:407)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter.insertRow(UnsafeExternalRowSorter.java:135)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage11.sort_addToSorter$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage11.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
at org.apache.spark.sql.execution.RowIteratorFromScala.advanceNext(RowIterator.scala:83)
at org.apache.spark.sql.execution.joins.SortMergeFullOuterJoinScanner.advancedLeft(SortMergeJoinExec.scala:992)
at org.apache.spark.sql.execution.joins.SortMergeFullOuterJoinScanner.<init>(SortMergeJoinExec.scala:982)
at org.apache.spark.sql.execution.joins.SortMergeJoinExec$$anonfun$doExecute$1.apply(SortMergeJoinExec.scala:243)
at org.apache.spark.sql.execution.joins.SortMergeJoinExec$$anonfun$doExecute$1.apply(SortMergeJoinExec.scala:150)
at org.apache.spark.rdd.ZippedPartitionsRDD2.compute(ZippedPartitionsRDD.scala:89)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.r
I don't know how to solve the problem. I've tried options(java.parameters="-Xmx3072m"), but with no results. (The Italian message "Spazio su disco insufficiente" in the trace means "insufficient disk space".)

Error Generating the Pre-Authenticated URL from Oracle Cloud Storage (oci-java-sdk)

I am getting the error
com.oracle.bmc.model.BmcException: (-1, null, false) Processing exception while communicating to: https://objectstorage.eu-frankfurt-1.oraclecloud.com (outbound opc-request-id: 610055D587CD432284C75AA1C52A6446)
I took the code from the Oracle documentation at https://docs.oracle.com/en-us/iaas/api/#/en/objectstorage/20160918/PreauthenticatedRequest/CreatePreauthenticatedRequest, but it does not work.
Code
final ConfigFileReader.ConfigFile configFile = ConfigFileReader.parseDefault();
final AuthenticationDetailsProvider provider = new ConfigFileAuthenticationDetailsProvider(configFile);
ObjectStorageClient client = new ObjectStorageClient(provider);
log.debug("Client Created = "+client.getEndpoint());
/* Create a request and dependent object(s). */
CreatePreauthenticatedRequestDetails createPreauthenticatedRequestDetails =
    CreatePreauthenticatedRequestDetails.builder()
        .name("EXAMPLE-name-Value")
        .bucketListingAction(PreauthenticatedRequest.BucketListingAction.ListObjects)
        .objectName("230222082346340.pdf")
        .accessType(CreatePreauthenticatedRequestDetails.AccessType.ObjectReadWrite)
        .timeExpires(new Date("Wed Nov 26 03:00:33 UTC 2042"))
        .build();
log.debug("createPreauthenticatedRequestDetails Created..."+createPreauthenticatedRequestDetails.getName());
CreatePreauthenticatedRequestRequest createPreauthenticatedRequestRequest =
    CreatePreauthenticatedRequestRequest.builder()
        .namespaceName("namespacename")
        .bucketName("temp")
        .createPreauthenticatedRequestDetails(createPreauthenticatedRequestDetails)
        .opcClientRequestId("ocid1.test.oc1.."+UUID.randomUUID().toString()+"-opcClientRequestId-Value")
        .build();
CreatePreauthenticatedRequestResponse response = client.createPreauthenticatedRequest(createPreauthenticatedRequestRequest);
Is there anything I am missing?
Complete Error Log:
09:02:25 INFO [co.or.bm.ht.ApacheConfigurator]Setting connector provider to ApacheConnectorProvider
09:02:25 INFO [co.or.bm.ob.ObjectStorageClient]Setting endpoint to https://objectstorage.eu-frankfurt-1.oraclecloud.com
09:02:25 DEBUG[or.fr.co.DocStoreOracleService]Client Created = https://objectstorage.eu-frankfurt-1.oraclecloud.com
09:02:25 DEBUG[or.fr.co.DocStoreOracleService]createPreauthenticatedRequestDetails Created...EXAMPLE-name-Value
com.oracle.bmc.model.BmcException: (-1, null, false) Processing exception while communicating to: https://objectstorage.eu-frankfurt-1.oraclecloud.com (outbound opc-request-id: 813E7CD051BD44FB9303B033D912F187)
at com.oracle.bmc.http.internal.RestClient.convertToBmcException(RestClient.java:994)
at com.oracle.bmc.http.internal.RestClient.post(RestClient.java:306)
at com.oracle.bmc.objectstorage.ObjectStorageClient.lambda$null$12(ObjectStorageClient.java:684)
at com.oracle.bmc.retrier.BmcGenericRetrier.doFunctionCall(BmcGenericRetrier.java:89)
at com.oracle.bmc.retrier.BmcGenericRetrier.lambda$execute$0(BmcGenericRetrier.java:60)
at com.oracle.bmc.waiter.GenericWaiter.execute(GenericWaiter.java:55)
at com.oracle.bmc.retrier.BmcGenericRetrier.execute(BmcGenericRetrier.java:51)
at com.oracle.bmc.objectstorage.ObjectStorageClient.lambda$createPreauthenticatedRequest$13(ObjectStorageClient.java:680)
at com.oracle.bmc.retrier.BmcGenericRetrier.doFunctionCall(BmcGenericRetrier.java:89)
at com.oracle.bmc.retrier.BmcGenericRetrier.lambda$execute$0(BmcGenericRetrier.java:60)
at com.oracle.bmc.waiter.GenericWaiter.execute(GenericWaiter.java:55)
at com.oracle.bmc.retrier.BmcGenericRetrier.execute(BmcGenericRetrier.java:51)
at com.oracle.bmc.objectstorage.ObjectStorageClient.createPreauthenticatedRequest(ObjectStorageClient.java:674)
at org.freshfood.controller.DocStoreOracleService.generatePAUNew(DocStoreOracleService.java:774)
at org.freshfood.controller.DocStoreOracleService_Subclass.generatePAUNew$$superforward1(Unknown Source)
at org.freshfood.controller.DocStoreOracleService_Subclass$$function$$8.apply(Unknown Source)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:62)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(Unknown Source)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at org.freshfood.controller.DocStoreOracleService_Subclass.generatePAUNew(Unknown Source)
at org.freshfood.controller.DocStoreOracleService_ClientProxy.generatePAUNew(Unknown Source)
at org.freshfood.controller.DocumentResource.getPreAuthURL(DocumentResource.java:36)
at org.freshfood.controller.DocumentResource_Subclass.getPreAuthURL$$superforward1(Unknown Source)
at org.freshfood.controller.DocumentResource_Subclass$$function$$4.apply(Unknown Source)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:62)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(Unknown Source)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at org.freshfood.controller.DocumentResource_Subclass.getPreAuthURL(Unknown Source)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:170)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:130)
at org.jboss.resteasy.core.ResourceMethodInvoker.internalInvokeOnTarget(ResourceMethodInvoker.java:660)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTargetAfterFilter(ResourceMethodInvoker.java:524)
at org.jboss.resteasy.core.ResourceMethodInvoker.lambda$invokeOnTarget$2(ResourceMethodInvoker.java:474)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:476)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:434)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:408)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:69)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:492)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$invoke$4(SynchronousDispatcher.java:261)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$preprocess$0(SynchronousDispatcher.java:161)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.SynchronousDispatcher.preprocess(SynchronousDispatcher.java:164)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:247)
at io.quarkus.resteasy.runtime.standalone.RequestDispatcher.service(RequestDispatcher.java:73)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.dispatch(VertxRequestHandler.java:151)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler$1.run(VertxRequestHandler.java:91)
at io.quarkus.vertx.core.runtime.VertxCoreRecorder$13.runWith(VertxCoreRecorder.java:543)
at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2449)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1478)
at org.jboss.threads.DelegatingRunnable.run(DelegatingRunnable.java:29)
at org.jboss.threads.ThreadLocalResettingRunnable.run(ThreadLocalResettingRunnable.java:29)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: javax.ws.rs.ProcessingException: RESTEASY004655: Unable to invoke request: javax.ws.rs.ProcessingException: RESTEASY003215: could not find writer for content-type application/json type: java.lang.String
at org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine.invoke(ManualClosingApacheHttpClient43Engine.java:321)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.invoke(ClientInvocation.java:494)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.invoke(ClientInvocation.java:69)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocationBuilder.post(ClientInvocationBuilder.java:226)
at com.oracle.bmc.http.internal.ForwardingInvocationBuilder.post(ForwardingInvocationBuilder.java:157)
at com.oracle.bmc.http.internal.RestClient.lambda$post$4(RestClient.java:304)
at com.oracle.bmc.circuitbreaker.internal.JaxRsCircuitBreakerImpl.lambda$decorateSupplier$0(JaxRsCircuitBreakerImpl.java:83)
at io.github.resilience4j.circuitbreaker.CircuitBreaker.lambda$decorateSupplier$4(CircuitBreaker.java:198)
at com.oracle.bmc.circuitbreaker.internal.JaxRsCircuitBreakerImpl.lambda$decorateSupplier$1(JaxRsCircuitBreakerImpl.java:93)
at com.oracle.bmc.http.internal.RestClient.lambda$decorateSupplier$0(RestClient.java:175)
at com.oracle.bmc.http.internal.RestClient.post(RestClient.java:304)
... 64 more
Caused by: javax.ws.rs.ProcessingException: RESTEASY003215: could not find writer for content-type application/json type: java.lang.String
at org.jboss.resteasy.core.interception.jaxrs.ClientWriterInterceptorContext.throwWriterNotFoundException(ClientWriterInterceptorContext.java:50)
at org.jboss.resteasy.core.interception.jaxrs.AbstractWriterInterceptorContext.getWriter(AbstractWriterInterceptorContext.java:302)
at org.jboss.resteasy.core.interception.jaxrs.AbstractWriterInterceptorContext.syncProceed(AbstractWriterInterceptorContext.java:240)
at org.jboss.resteasy.core.interception.jaxrs.AbstractWriterInterceptorContext.proceed(AbstractWriterInterceptorContext.java:224)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.writeRequestBody(ClientInvocation.java:446)
at org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine.writeRequestBodyToOutputStream(ManualClosingApacheHttpClient43Engine.java:625)
at org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine.buildEntity(ManualClosingApacheHttpClient43Engine.java:584)
at org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine.loadHttpMethod(ManualClosingApacheHttpClient43Engine.java:489)
at org.jboss.resteasy.client.jaxrs.engines.ManualClosingApacheHttpClient43Engine.invoke(ManualClosingApacheHttpClient43Engine.java:299)
... 74 more

SOAPUI Groovy parsing XML response and insert it into mysql database

I'm a beginner in Groovy and am trying to parse XML and insert the resulting records into a MySQL database.
Sample SOAP XML response
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<soap:Body>
<GetCitiesByCountryResponse xmlns="http://www.webserviceX.NET">
<GetCitiesByCountryResult><NewDataSet>
<Table>
<Country>British Indian Ocean Territory</Country>
<City>Diego Garcia</City>
</Table>
<Table>
<Country>India</Country>
<City>Ahmadabad</City>
</Table>
<Table>
<Country>India</Country>
<City>Akola</City>
</Table>
</NewDataSet></GetCitiesByCountryResult>
</GetCitiesByCountryResponse>
</soap:Body>
</soap:Envelope>
SoapUI Groovy test step script:
import groovy.sql.Sql
def sql = Sql.newInstance('jdbc:mysql://localhost:3306/weather', 'root', 'security', 'com.mysql.jdbc.Driver')
def response = context.expand( '${GetCitiesByCountry#Response}' )
def xml = new XmlSlurper().parseText(response).Body.GetCitiesByCountryResponse.GetCitiesByCountryResult.NewDataSet.Table
//log.info xml
xml.each { node ->
    sql.execute("INSERT INTO indcit(country,city) VALUES (?,?)", [node.Country, node.City])
    //sql.execute("INSERT INTO indcit(country,city) VALUES (${node.Country},${node.City})")
    //log.info node.Country
    //log.info node.City
}
Both sql.execute variants throw the same error, java.sql.SQLException: Invalid argument value: java.io.NotSerializableException
Error log
Thu Mar 03 01:16:36 IST 2016:ERROR:java.sql.SQLException: Invalid argument value: java.io.NotSerializableException
java.sql.SQLException: Invalid argument value: java.io.NotSerializableException
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:910)
at com.mysql.jdbc.PreparedStatement.setSerializableObject(PreparedStatement.java:3415)
at com.mysql.jdbc.PreparedStatement.setObject(PreparedStatement.java:3066)
at groovy.sql.Sql.setObject(Sql.java:3655)
at groovy.sql.Sql.setParameters(Sql.java:3620)
at groovy.sql.Sql.getPreparedStatement(Sql.java:3881)
at groovy.sql.Sql.getPreparedStatement(Sql.java:3928)
at groovy.sql.Sql.execute(Sql.java:2287)
at groovy.sql.Sql$execute.call(Unknown Source)
at Script62$_run_closure1.doCall(Script62.groovy:7)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:272)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:909)
at groovy.lang.Closure.call(Closure.java:411)
at groovy.lang.Closure.call(Closure.java:427)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:1325)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:1297)
at org.codehaus.groovy.runtime.dgm$148.doMethodInvoke(Unknown Source)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1085)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:909)
at groovy.lang.DelegatingMetaClass.invokeMethod(DelegatingMetaClass.java:149)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:39)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:54)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at Script62.run(Script62.groovy:6)
at com.eviware.soapui.support.scripting.groovy.SoapUIGroovyScriptEngine.run(SoapUIGroovyScriptEngine.java:92)
at com.eviware.soapui.impl.wsdl.teststeps.WsdlGroovyScriptTestStep.run(WsdlGroovyScriptTestStep.java:141)
at com.eviware.soapui.impl.wsdl.panels.teststeps.GroovyScriptStepDesktopPanel$RunAction$1.run(GroovyScriptStepDesktopPanel.java:250)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Thanks for any help in advance!
Try calling text() on your nodes. XmlSlurper gives you GPathResult nodes rather than strings, and the JDBC driver cannot serialize those as statement parameters (hence the NotSerializableException); text() extracts the plain String value:
sql.execute("INSERT INTO indcit(country,city) VALUES (?,?)" ,[node.Country.text(), node.City.text()])

GWT Error when trying to throw an Exception to the Client with RPC

I am getting an error when I try to throw an exception from the server to the client with RPC.
What I have is:
My own exception class:
package catan.user.session.shared;

public class SessionInvalidException extends Exception {
    private static final long serialVersionUID = 1L;
}
And the RPC implementation:
@SuppressWarnings("serial")
public class ProfileServiceImpl extends RemoteServiceServlet implements ProfileService {
    @Override
    public UserLight getUser() throws SessionInvalidException {
        SessionManager.getInstance().isSessionStillValid(this.getThreadLocalRequest());
        return ((UserLight) this.getThreadLocalRequest().getSession(false).getAttribute("User")).clone();
    }
}
The exception is thrown in SessionManager.getInstance().isSessionStillValid(),
and when I run my program I get the following errors in my log:
Wed Aug 14 17:13:45 CEST 2013 catan.client.Catan
WARNING: Exception caught
catan.user.session.shared.SessionInvalidException: null
at catan.user.session.shared.SessionInvalidException_FieldSerializer.instantiate(SessionInvalidException_FieldSerializer.java:16)
at catan.user.session.shared.SessionInvalidException_FieldSerializer.create(SessionInvalidException_FieldSerializer.java:25)
at com.google.gwt.user.client.rpc.impl.SerializerBase.instantiate(SerializerBase.java:115)
at com.google.gwt.user.client.rpc.impl.ClientSerializationStreamReader.deserialize(ClientSerializationStreamReader.java:396)
at com.google.gwt.user.client.rpc.impl.AbstractSerializationStreamReader.readObject(AbstractSerializationStreamReader.java:119)
at com.google.gwt.user.client.rpc.impl.RequestCallbackAdapter.onResponseReceived(RequestCallbackAdapter.java:216)
at com.google.gwt.http.client.Request.fireOnResponseReceived(Request.java:258)
at com.google.gwt.http.client.RequestBuilder$1.onReadyStateChange(RequestBuilder.java:412)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103)
at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71)
at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:172)
at com.google.gwt.dev.shell.BrowserChannelServer.reactToMessagesWhileWaitingForReturn(BrowserChannelServer.java:338)
at com.google.gwt.dev.shell.BrowserChannelServer.invokeJavascript(BrowserChannelServer.java:219)
at com.google.gwt.dev.shell.ModuleSpaceOOPHM.doInvoke(ModuleSpaceOOPHM.java:136)
at com.google.gwt.dev.shell.ModuleSpace.invokeNative(ModuleSpace.java:571)
at com.google.gwt.dev.shell.ModuleSpace.invokeNativeObject(ModuleSpace.java:279)
at com.google.gwt.dev.shell.JavaScriptHost.invokeNativeObject(JavaScriptHost.java:91)
at com.google.gwt.core.client.impl.Impl.apply(Impl.java)
at com.google.gwt.core.client.impl.Impl.entry0(Impl.java:242)
at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103)
at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71)
at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:172)
at com.google.gwt.dev.shell.BrowserChannelServer.reactToMessages(BrowserChannelServer.java:293)
at com.google.gwt.dev.shell.BrowserChannelServer.processConnection(BrowserChannelServer.java:547)
at com.google.gwt.dev.shell.BrowserChannelServer.run(BrowserChannelServer.java:364)
at java.lang.Thread.run(Unknown Source)
Aug 14, 2013 5:13:45 PM com.google.gwt.logging.server.RemoteLoggingServiceUtil logOnServer
WARNING: Exception caught
com.google.gwt.core.client.impl.SerializableThrowable$ThrowableWithClassName
... (same stack trace as above)
The funny thing is, I have another exception within the exact same package, thrown in nearly the exact same way, just with another name, and it works... I just don't understand what I am doing wrong...
Make sure the exception class is on a path specified by a <source> element in your *.gwt.xml module definition file. For example:
<!-- Specify the paths for translatable code -->
<source path='client' />
<source path='shared' />
Okay, I solved my problem. The exception was correctly passed to the client by the RPC. Then,
in the onFailure method, I passed the caught object to my Logger class, which then tried to send it back to the server or something. I'm not sure here, because the Logger class was written by another member of my project. So just not using this class solved the problem, but I am still not sure what those errors are trying to tell me... If someone is interested, here is the method I passed the caught object to. The Catan class is our EntryPoint class.
public class Logger {
    /**
     * Logs a Throwable.
     * @param caught the Throwable to log
     */
    public static void log(Throwable caught) {
        final java.util.logging.Logger log = java.util.logging.Logger.getLogger(Catan.class.getName());
        log.log(Level.WARNING, "Exception caught \n", caught);
    }
}

SocketTimeoutException when I use Scalaj request

I'm trying to make a simple HTTPS request using this library: https://github.com/scalaj/scalaj-http. The request contains some JSON data.
Here is what I'm doing:
val jsonHeaders = """{"jsonrpc": "2.0", "method": "someMethod", "params": {"dataIds":["12348" , "456"]}, "data2": "777"}"""
val result = Http.postData("https://someurl.com/json-rpc", jsonHeaders)
.header("content-type", "application/json")
.header("X-Application", "myCode")
.header("X-Authentication", "myCode2")
.option(HttpOptions.readTimeout(10000))
.asString
//.responseCode -- the same error
println(result)
And it always fails with a timeout error:
[error] (run-main) java.net.SocketTimeoutException: connect timed out
java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
at java.net.SocksSocketImpl.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at sun.security.ssl.SSLSocketImpl.connect(Unknown Source)
at sun.net.NetworkClient.doConnect(Unknown Source)
at sun.net.www.http.HttpClient.openServer(Unknown Source)
at sun.net.www.http.HttpClient.openServer(Unknown Source)
at sun.net.www.protocol.https.HttpsClient.<init>(Unknown Source)
at sun.net.www.protocol.https.HttpsClient.New(Unknown Source)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(Unknown Source)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(Unknown Source)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(Unknown Source)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.connect(Unknown Source)
at scalaj.http.Http$$anonfun$3.apply(Http.scala:263)
at scalaj.http.Http$$anonfun$3.apply(Http.scala:261)
at scalaj.http.Http$Request.process(Http.scala:102)
at scalaj.http.Http$Request.apply(Http.scala:90)
at scalaj.http.Http$Request.asString(Http.scala:133)
at Application$delayedInit$body.apply(Application.scala:27)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at Application$.main(Application.scala:8)
at Application.main(Application.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
If I do just
val jsonHeaders = """{"version123": "12.0", "method": "someMethod", "params": {"dataIds": ["12348" , "456"]}, "data2": "777"}"""
val result = Http.postData("https://someurl.com/some-url2", jsonHeaders)
.header("content-type", "application/json")
.header("X-Application", "myCode")
.header("X-Application1234", "myCode2")
.option(HttpOptions.readTimeout(10000))
println(result)
it returns
Request(POST,<function2>,<function1>,List(),List((X-Authentication,myCode2), (X-Application1234,myCode), (content-type,application/json)),List(<function1>, <function1>, <function1>),DIRECT)
What am I doing wrong, and is there another simple way to send an HTTPS request? Even if it involves the Spray framework, that would be OK (I can't find any example of how to do that in Spray, though).
Update:
The example was taken from here: Doing HTTP request in Scala
You need to specify a connection timeout alongside your current read timeout with .option(HttpOptions.connTimeout(10000)).option(HttpOptions.readTimeout(50000)). Change the 10000 to a value that works for you; the default connection timeout is a pretty aggressive 100 ms.
You do already have a read timeout specified, but the exception says it is timing out while establishing the connection, not while reading the response.
See docs: https://github.com/scalaj/scalaj-http#custom-connect-and-read-timeouts
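Applied to the code from the question, that would look roughly like this (the timeout values are illustrative):
val result = Http.postData("https://someurl.com/json-rpc", jsonHeaders)
  .header("content-type", "application/json")
  .header("X-Application", "myCode")
  .header("X-Authentication", "myCode2")
  .option(HttpOptions.connTimeout(10000)) // connection timeout, missing in the original code
  .option(HttpOptions.readTimeout(50000))
  .asString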
On the HttpRequest object there is also a .timeout() member method. This method is defined as:
/** The socket connection and read timeouts in milliseconds. Defaults are 1000 and 5000 respectively */
def timeout(connTimeoutMs: Int, readTimeoutMs: Int): HttpRequest = options(
Seq(HttpOptions.connTimeout(connTimeoutMs), HttpOptions.readTimeout(readTimeoutMs))
)
The nice part about this is that it was a bit easier to mock using Mockito; see the GitHub repository for the reference.
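A quick usage sketch of that method with the newer request-style API (reusing the jsonHeaders value from the question; the timeout values are illustrative):
import scalaj.http.Http

val result = Http("https://someurl.com/json-rpc")
  .postData(jsonHeaders)
  .header("content-type", "application/json")
  .timeout(connTimeoutMs = 10000, readTimeoutMs = 50000)
  .asString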