Testing JSON data using Specs2 - json

I added the matcher-extra dependency in build.sbt:
"org.specs2" %% "specs2" % "2.3.4" % "test",
"org.specs2" % "specs2-matcher-extra_2.10" % "2.3-scalaz-7.1.0-M3",
the ("/" the symbols are not resolving)
My example test case for JSON looks like this:
package specs.model
import org.specs2.mutable.Specification
import org.specs2.matcher.JsonMatchers
class Json extends Specification with JsonMatchers {
  "Json Matcher" should {
    "1st field" in {
      val json = """{"name":"sagar"}"""
      json must /("name" -> "sagar")
    }
    "2nd field" in {
      val json = """{"id":1}"""
      json must /("id" -> 1.0)
    }
  }
}
Error message:
[info] Compiling 2 Scala sources to \target\scala-2.10\test-classes...
[info] Json
[info]
[info] Json Matcher should
[info] + 1st field
[info] + 2nd field
[info]
[info] Total for specification Json
[info] Finished in 76 ms
[info] 2 examples, 0 failure, 0 error
[trace] Stack trace suppressed: run 'last specBuilder/test:test' for the full output.
[error] Could not run test specs.model.Json: java.lang.NoSuchMethodError: scalaz.Scalaz$.tuple2Monoid(Lscalaz/Monoid;Lscalaz/Monoid;)Lscalaz/std/Tuple2Monoid;
[error] Error: Total 0, Failed 0, Errors 0, Passed 0, Skipped 0
[error] Error during tests:
[error] specs.model.Json
[error] (specBuilder/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 9 s, completed 11 Dec, 2013 5:12:39 PM
I am stuck here; please help me find a solution.
Thanks,
GSY

I finally got it to work.
There are some instructions for setting up your build.sbt file here (scroll down to the very bottom of the page): http://etorreborre.github.io/specs2/#Downloads
The import commands in my Application.spec file are:
import org.specs2.mutable._
import org.specs2.mutable.Specification
import org.specs2.matcher.JsonMatchers
import org.specs2.runner._
class yourClass extends Specification with JsonMatchers { }
The following jar files are installed in the lib directory:
specs2_2.10-2.3.7-javadoc.jar
scalaz-core_2.10-7.0.4-javadoc.jar
scalaz-concurrent_2.10-7.0.4-javadoc.jar
These are the contents of my build.sbt file:
name := "playExperiments"
version := "1.0-SNAPSHOT"
libraryDependencies ++= Seq(
  "org.specs2" %% "specs2" % "2.3.7" % "test",
  jdbc,
  anorm,
  cache
)
scalacOptions in Test ++= Seq("-Yrangepos")
resolvers ++= Seq("snapshots", "releases").map(Resolver.sonatypeRepo)
play.Project.playScalaSettings
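For reference, the NoSuchMethodError in the original question is a scalaz binary mismatch: the mainline specs2 2.3.x artifacts are built against scalaz 7.0.x, while the "2.3-scalaz-7.1.0-M3" matcher-extra artifact targets scalaz 7.1.0-M3, so the two cannot be loaded together. A minimal build.sbt sketch of the idea (the exact versions are an assumption; what matters is that every specs2 artifact on the classpath is built against the same scalaz):
libraryDependencies ++= Seq(
  "org.specs2" %% "specs2" % "2.3.7" % "test",
  // if matcher-extra is needed separately, use the variant built against
  // the same scalaz as the specs2 version above:
  "org.specs2" %% "specs2-matcher-extra" % "2.3.7" % "test"
)

scalacOptions in Test ++= Seq("-Yrangepos")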

Related

Apache Spark SQL get_json_object java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String

I am trying to read a JSON stream from an MQTT broker in Apache Spark with Structured Streaming, read some properties of the incoming JSON, and output them to the console. My code looks like this:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object
import org.apache.spark.sql.types.StringType

val spark = SparkSession
  .builder()
  .appName("BahirStructuredStreaming")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._

val topic = "temp"
val brokerUrl = "tcp://localhost:1883"

val lines = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .toDF().withColumn("payload", $"payload".cast(StringType))

val jsonDF = lines.select(get_json_object($"payload", "$.eventDate").alias("eventDate"))

val query = jsonDF.writeStream
  .format("console")
  .start()

query.awaitTermination()
However, when the JSON arrives, I get the following errors:
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Writing job aborted.
=== Streaming Query ===
Identifier: [id = 14d28475-d435-49be-a303-8e47e2f907e3, runId = b5bd28bb-b247-48a9-8a58-cb990edaf139]
Current Committed Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: -1}
Current Available Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: 0}
Current State: ACTIVE
Thread State: RUNNABLE
Logical Plan:
Project [get_json_object(payload#22, $.id) AS eventDate#27]
+- Project [id#10, topic#11, cast(payload#12 as string) AS payload#22, timestamp#13]
+- StreamingExecutionRelation MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496], [id#10, topic#11, payload#12, timestamp#13]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:300)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: org.apache.spark.SparkException: Writing job aborted.
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:92)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3384)
at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2783)
at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3365)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3365)
at org.apache.spark.sql.Dataset.collect(Dataset.scala:2783)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:537)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$14(MicroBatchExecution.scala:533)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:532)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:198)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:166)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:160)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
... 1 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8, localhost, executor driver): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:64)
... 34 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I am sending the JSON records using the mosquitto broker, and they look like this:
mosquitto_pub -m '{"eventDate": "2020-11-11T15:17:00.000+0200"}' -t "temp"
It seems that every string coming from the Bahir stream source provider raises this error. For instance, the following code also raises it:
spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .select("topic")
  .writeStream
  .format("console")
  .start()
It looks like Spark does not recognize strings coming from Bahir, perhaps some kind of odd string class versioning issue. I've tried the following to make the code work:
set the Java version to 8
upgraded Spark from 2.4.0 to 2.4.7
set the Scala version to 2.11.12
used the decode function with every possible encoding combination, instead of .cast(StringType), to transform the "payload" column to String
used the substring function on the "payload" column to recreate a compatible String.
Finally, I got working code by reading the payload as raw bytes and recreating the String with its constructor in a Dataset map:
val lines = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .select("payload")
  .as[Array[Byte]]
  .map(payload => new String(payload))
  .toDF("payload")
This solution is rather ugly, but at least it works.
I believe there is nothing wrong with the code in the question, and I suspect a bug on the Bahir or Spark side that prevents Spark from handling the String coming from the Bahir source.
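To tie the workaround back to the original goal, here is a sketch (it assumes the rebuilt lines DataFrame from the snippet above): once payload is a genuine String column, the original get_json_object projection can be applied as before.
import org.apache.spark.sql.functions.get_json_object
import spark.implicits._

// `lines` is the DataFrame rebuilt from raw bytes in the workaround above.
val jsonDF = lines.select(get_json_object($"payload", "$.eventDate").alias("eventDate"))

jsonDF.writeStream
  .format("console")
  .start()
  .awaitTermination()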

Testing a DSPComplex ROM

I'm still working on building a DSPComplex ROM and have hit what I think may be an actual Chisel problem.
I've built the ROM and can generate Verilog output from the code that looks reasonable, but I can't seem to test the module with even the most basic of testers. I've simplified it below to the most basic check.
The error is a stack overflow like the following:
$ sbt 'testOnly taylor.TaylorTest'
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=8G; support was removed in 8.0
[info] Loading settings from plugins.sbt ...
[info] Loading project definition from /home/jcondley/Zendar/kodo/ZenFPGA/chisel/project
[info] Loading settings from build.sbt ...
[info] Set current project to zen-chisel (in build file:/home/jcondley/Zendar/kodo/ZenFPGA/chisel/)
[info] Compiling 1 Scala source to /home/jcondley/Zendar/kodo/ZenFPGA/chisel/target/scala-2.12/classes ...
[warn] there were 5 feature warnings; re-run with -feature for details
[warn] one warning found
[info] Done compiling.
[info] Compiling 1 Scala source to /home/jcondley/Zendar/kodo/ZenFPGA/chisel/target/scala-2.12/test-classes ...
[warn] there were two deprecation warnings (since chisel3, will be removed by end of 2017); re-run with -deprecation for details
[warn] there were two feature warnings; re-run with -feature for details
[warn] two warnings found
[info] Done compiling.
[info] [0.004] Elaborating design...
[deprecated] DspComplex.scala:22 (1029 calls): isLit is deprecated: "isLit is deprecated, use litOption.isDefined"
[deprecated] DspComplex.scala:22 (1029 calls): litArg is deprecated: "litArg is deprecated, use litOption or litTo*Option"
[deprecated] DspComplex.scala:23 (1029 calls): isLit is deprecated: "isLit is deprecated, use litOption.isDefined"
[deprecated] DspComplex.scala:23 (1029 calls): litArg is deprecated: "litArg is deprecated, use litOption or litTo*Option"
[warn] There were 4 deprecated function(s) used. These may stop compiling in a future release - you are encouraged to fix these issues.
[warn] Line numbers for deprecations reported by Chisel may be inaccurate; enable scalac compiler deprecation warnings via either of the following methods:
[warn] In the sbt interactive console, enter:
[warn] set scalacOptions in ThisBuild ++= Seq("-unchecked", "-deprecation")
[warn] or, in your build.sbt, add the line:
[warn] scalacOptions := Seq("-unchecked", "-deprecation")
[info] [1.487] Done elaborating.
Total FIRRTL Compile Time: 1887.8 ms
Total FIRRTL Compile Time: 770.6 ms
End of dependency graph
Circuit state created
[info] TaylorTest:
[info] TaylorWindow
[info] taylor.TaylorTest *** ABORTED ***
[info] java.lang.StackOverflowError:
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.evaluate(LoFirrtlExpressionEvaluator.scala:264)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.$anonfun$resolveDependency$1(LoFirrtlExpressionEvaluator.scala:453)
[info] at firrtl_interpreter.Timer.apply(Timer.scala:40)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.resolveDependency(LoFirrtlExpressionEvaluator.scala:445)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.getValue(LoFirrtlExpressionEvaluator.scala:81)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.evaluate(LoFirrtlExpressionEvaluator.scala:304)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.$anonfun$resolveDependency$1(LoFirrtlExpressionEvaluator.scala:453)
[info] at firrtl_interpreter.Timer.apply(Timer.scala:40)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.resolveDependency(LoFirrtlExpressionEvaluator.scala:445)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.getValue(LoFirrtlExpressionEvaluator.scala:81)
[info] ...
This looks suspiciously like another ROM issue from here:
https://github.com/freechipsproject/chisel3/issues/642
but trying Chick's response here:
export SBT_OPTS="-Xmx2G -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=2G -Xss2M -Duser.timezone=GMT"
does not seem to solve the issue (and one of the options, MaxPermSize, is ignored).
Is this a legitimate Chisel bug with ROMs or is something else going on here?
Actual module with the ROM:
package taylor
import chisel3._
import chisel3.util._
import chisel3.experimental.FixedPoint
import dsptools.numbers._
import scala.io.Source
class TaylorWindow(len: Int, window: Seq[FixedPoint]) extends Module {
  val io = IO(new Bundle {
    val d_valid_in = Input(Bool())
    val sample = Input(DspComplex(FixedPoint(16.W, 8.BP), FixedPoint(16.W, 8.BP)))
    val windowed_sample = Output(DspComplex(FixedPoint(32.W, 8.BP), FixedPoint(32.W, 8.BP)))
    val d_valid_out = Output(Bool())
  })

  val win_coeff = VecInit(window.map(x => DspComplex.wire(x, FixedPoint(0, 16.W, 8.BP))).toSeq) // ROM storing our coefficients.

  io.d_valid_out := io.d_valid_in

  val counter = RegInit(UInt(10.W), 0.U)
  // Implicit reset
  io.windowed_sample := io.sample * win_coeff(counter)

  when(io.d_valid_in) {
    counter := counter + 1.U
  }
}

object TaylorDriver extends App {
  val filename = "src/test/test_data/taylor_coeffs"
  val coeff_file = Source.fromFile(filename).getLines
  val double_coeffs = coeff_file.map(x => x.toDouble)
  val fp_coeffs = double_coeffs.map(x => FixedPoint.fromDouble(x, 16.W, 8.BP))
  val fp_seq = fp_coeffs.toSeq

  chisel3.Driver.execute(args, () => new TaylorWindow(1024, fp_seq))
}
Tester Code:
package taylor
import chisel3._
import chisel3.util._
import chisel3.experimental.FixedPoint
import dsptools.numbers.implicits._
import scala.io.Source
import chisel3.iotesters
import chisel3.iotesters.{ChiselFlatSpec, Driver, PeekPokeTester}
class TaylorWindowUnitTest(dut: TaylorWindow) extends PeekPokeTester(dut) {
  val filename = "src/test/test_data/taylor_coeffs"
  val coeff_file = Source.fromFile(filename).getLines
  val double_coeffs = coeff_file.map(x => x.toDouble)
  val fp_coeffs = double_coeffs.map(x => FixedPoint.fromDouble(x, 16.W, 8.BP))
  val fp_seq = fp_coeffs.toSeq

  poke(dut.io.d_valid_in, Bool(false))
  expect(dut.io.d_valid_out, Bool(false))
}

class TaylorTest extends ChiselFlatSpec {
  behavior of "TaylorWindow"

  backends foreach { backend =>
    it should s"test the basic Taylow Window" in {
      Driver(() => new TaylorWindow(1024, getSeq()), backend)(c => new TaylorWindowUnitTest(c)) should be (true)
    }
  }

  def getSeq(): Seq[FixedPoint] = {
    val filename = "src/test/test_data/taylor_coeffs"
    val coeff_file = Source.fromFile(filename).getLines
    val double_coeffs = coeff_file.map(x => x.toDouble)
    val fp_coeffs = double_coeffs.map(x => FixedPoint.fromDouble(x, 16.W, 8.BP))
    fp_coeffs.toSeq
  }
}
This looks like a failure in the firrtl-interpreter (one of the Scala-based Chisel simulators), which can have problems with a number of large FIRRTL constructs. If you have Verilator installed, can you try changing
backends foreach {backend =>
to
Seq("verilator") foreach {backend =>
and see what happens? Another thing to try is Treadle, the new version of the interpreter; it is not yet in full release, but it is available in snapshot versions of the Chisel ecosystem and should be able to handle this as well.
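For concreteness, a sketch of selecting a backend by name through the iotesters execute entry point (this assumes chisel3.iotesters is on the classpath; "treadle" would additionally require a snapshot version of the ecosystem, as noted above):
// Run the same tester against an explicitly named backend.
// "verilator" needs a local Verilator install.
val passed = chisel3.iotesters.Driver.execute(
  Array("--backend-name", "verilator"),
  () => new TaylorWindow(1024, getSeq())
) { c => new TaylorWindowUnitTest(c) }

assert(passed)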

Spark SQL error read JSON file : java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class

I am trying to read a JSON file using Spark SQL in Java.
This is my code:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
...
JavaSparkContext jsc = new JavaSparkContext(sparkConf);
SQLContext sqlContext = new SQLContext(jsc);
DataFrame df = sqlContext.jsonFile("~/test.json");
df.printSchema();
df.registerTempTable("test");
...
To keep things simple, I made a small JSON file, "test.json":
{
  "name": "myname"
}
and when I tried to run the code, this error message appeared:
17/03/30 10:02:26 INFO BlockManagerMasterEndpoint: Registering block manager 10.6.86.82:36824 with 1948.2 MB RAM, BlockManagerId(driver, 10.6.86.82, 36824)
17/03/30 10:02:26 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.6.86.82, 36824)
17/03/30 10:02:26 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at org.apache.spark.sql.sources.CaseInsensitiveMap.<init>(ddl.scala:344)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:219)
at org.apache.spark.sql.SQLContext.load(SQLContext.scala:697)
at org.apache.spark.sql.SQLContext.jsonFile(SQLContext.scala:572)
at org.apache.spark.sql.SQLContext.jsonFile(SQLContext.scala:553)
at sugi.kau.sparkonjava.SparkSQL.main(SparkSQL.java:32)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
17/03/30 10:02:26 INFO SparkContext: Invoking stop() from shutdown hook
...
Thanks.
In the Spark docs, the function jsonFile(String path) is described as:
"Loads a JSON file (one object per line), returning the result as a DataFrame." (Note that jsonFile is replaced by read().json().)
So you should have one object per line, and your source file should look like this:
{"name": "myname"}
{"name": "myname2"}
.....
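Since jsonFile is deprecated, here is a sketch of the replacement API (Scala, Spark 2.x SparkSession; the path is a placeholder):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("json-test")
  .master("local[*]")
  .getOrCreate()

// read.json replaces jsonFile and still expects one JSON object per line;
// in Spark 2.2+ you can set .option("multiLine", true) for pretty-printed
// files like the one in the question.
val df = spark.read.json("/path/to/test.json")
df.printSchema()
df.createOrReplaceTempView("test")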

Chisel AlreadyBoundException

I am developing a simple on-chip memory for an SoC, based on the Sodor scratchpad memory. So, first I'm converting a slightly modified version of that design to Chisel 3. Now I'm getting an exception, about a value that is already bound, that I can't understand.
[info] - should correctly write and read data *** FAILED ***
[info] chisel3.core.Binding$BindingException: Error: Cannot set as output .M_WR: Already bound to LitBinding()
[info] at chisel3.core.Binding$.bind(Binding.scala:100)
[info] at chisel3.core.Output$.apply(Data.scala:50)
[info] at chisel3.util.ReadyValidIO.<init>(Decoupled.scala:22)
[info] at chisel3.util.DecoupledIO.<init>(Decoupled.scala:72)
[info] at chisel3.util.Decoupled$.apply(Decoupled.scala:81)
[info] at mem.MemPortIO.<init>(memory.scala:40)
[info] at mem.OnChipMemory$$anon$1.<init>(memory.scala:49)
[info] at mem.OnChipMemory.<init>(memory.scala:47)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(memoryTest.scala:33)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(memoryTest.scala:33)
[info] at chisel3.core.Module$.do_apply(Module.scala:35)
[info] at chisel3.Driver$$anonfun$elaborate$1.apply(Driver.scala:194)
[info] at chisel3.Driver$$anonfun$elaborate$1.apply(Driver.scala:194)
[info] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:206)
[info] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:204)
[info] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[info] at chisel3.internal.Builder$.build(Builder.scala:204)
[info] at chisel3.Driver$.elaborate(Driver.scala:194)
[info] at chisel3.Driver$.execute(Driver.scala:229)
[info] at chisel3.iotesters.setupFirrtlTerpBackend$.apply(FirrtlTerpBackend.scala:110)
[info] at chisel3.iotesters.Driver$.execute(Driver.scala:47)
[info] at chisel3.iotesters.Driver$.apply(Driver.scala:210)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1.apply$mcV$sp(memoryTest.scala:33)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1.apply(memoryTest.scala:33)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1.apply(memoryTest.scala:33)
[info] at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info] at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
[info] at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
[info] at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
[info] at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info] at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
[info] at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
[info] at scala.collection.immutable.List.foreach(List.scala:381)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
[info] at scala.collection.immutable.List.foreach(List.scala:381)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
[info] at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
[info] at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
[info] at org.scalatest.Suite$class.run(Suite.scala:1424)
[info] at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
[info] at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
[info] at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
[info] at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
[info] at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:466)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:677)
[info] at sbt.TestRunner.runTest$1(TestFramework.scala:76)
[info] at sbt.TestRunner.run(TestFramework.scala:85)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
[info] at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
[info] at sbt.TestFunction.apply(TestFramework.scala:207)
[info] at sbt.Tests$$anonfun$9.apply(Tests.scala:216)
[info] at sbt.Tests$$anonfun$9.apply(Tests.scala:216)
[info] at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
[info] at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
[info] at sbt.std.Transform$$anon$4.work(System.scala:63)
[info] at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
[info] at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
[info] at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
[info] at sbt.Execute.work(Execute.scala:237)
[info] at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
[info] at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
[info] at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
[info] at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:745)
[info] ScalaCheck
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] ScalaTest
I'm not sure which value is causing the problem. At first I thought it was a problem with the Vecs I use in the Mem and IO ports (because of an explanation I found for a similar exception), but the problem persists even after I get rid of them. I even tried removing the cloneType functions.
Now I'm clueless about the cause. I'm posting the main section of the memory's source code (excluding parameter declarations and a few simple functions) in case it is useful.
trait MemOpConstants
{
  val MT_X  = Bits(0, 3) // memory transfer type
  val MT_B  = Bits(1, 3)
  val MT_H  = Bits(2, 3)
  val MT_W  = Bits(3, 3)
  val MT_D  = Bits(4, 3)
  val MT_BU = Bits(5, 3)
  val MT_HU = Bits(6, 3)
  val MT_WU = Bits(7, 3)

  val M_X  = Bits("b0", 1) // access type
  val M_RD = Bits("b0", 1) // load
  val M_WR = Bits("b1", 1) // store
}

class MemReq(data_width: Int)(implicit config: Configuration) extends Bundle with MemOpConstants
{
  val addr = UInt(width = config.xprlen)
  val data = Bits(width = data_width)
  val fcn  = Bits(width = M_X.getWidth)  // memory function code
  val typ  = Bits(width = MT_X.getWidth) // memory access type
  override def cloneType = { new MemReq(data_width).asInstanceOf[this.type] }
}

class MemResp(data_width: Int) extends Bundle
{
  val data = Bits(width = data_width)
  override def cloneType = { new MemResp(data_width).asInstanceOf[this.type] }
}

class MemPortIO(data_width: Int)(implicit config: Configuration) extends Bundle // constructor for IO interface of data memory
{
  val req  = Decoupled(new MemReq(data_width))           // ready valid pair
  val resp = (new ValidIO(new MemResp(data_width))).flip // valid signal
  override def cloneType = { new MemPortIO(data_width).asInstanceOf[this.type] }
}

class OnChipMemory(num_ports: Int = 2, num_bytes: Int = (1 << 15), seq_read: Boolean = false)(implicit config: Configuration) extends Module with MemOpConstants
{
  val io = IO(new Bundle {
    val port = Vec(num_ports, (new MemPortIO(data_width = config.xprlen)).flip)
  })

  val num_bytes_per_line = 4
  val num_lines = num_bytes / num_bytes_per_line
  val lsb_idx = log2Up(num_bytes_per_line) // index of lsb in address
  val chipMem = Mem(Vec(4, UInt(width = 32)), num_lines) // memory

  for (i <- 0 until num_ports)
  {
    io.port(i).resp.valid := Reg(next = io.port(i).req.valid)
    io.port(i).req.ready := Bool(true) // for now

    val req_valid = io.port(i).req.valid
    val req_addr = io.port(i).req.bits.addr
    val req_data = io.port(i).req.bits.data
    val req_fn = io.port(i).req.bits.fcn
    val req_typ = io.port(i).req.bits.typ
    val byte_shift_amt = io.port(i).req.bits.addr(1, 0)
    val bit_shift_amt = Cat(byte_shift_amt, UInt(0, 3))

    // mem read
    val r_data_idx = Reg(UInt())
    val data_idx = req_addr >> UInt(lsb_idx)
    val read_data = Bits()
    val rdata = Bits()
    if (seq_read)
    {
      read_data := chipMem(r_data_idx)
      rdata := LoadDataGen((read_data >> Reg(next = bit_shift_amt)), Reg(next = req_typ))
    }
    else
    {
      read_data := chipMem(data_idx)
      rdata := LoadDataGen((read_data >> bit_shift_amt), req_typ)
    }
    io.port(i).resp.bits.data := rdata

    // mem write
    when (req_valid && req_fn === M_WR)
    {
      val wdata = StoreDataGen(req_data, req_typ)
      val wmask = ((StoreMask(req_typ) << bit_shift_amt)(31, 0)).toBools
      chipMem.write(data_idx, wdata, wmask)
    }
    .elsewhen (req_valid && req_fn === M_RD)
    {
      r_data_idx := data_idx
    }
  }
}

object StoreDataGen extends MemOpConstants
{
  def apply(din: Bits, typ: Bits): Vec[UInt] =
  {
    val word = (typ.equals(MT_W)) || (typ.equals(MT_WU))
    val half = (typ.equals(MT_H)) || (typ.equals(MT_HU))
    val byte = (typ.equals(MT_B)) || (typ.equals(MT_BU))

    val dout = Mux(Bool(byte), Vec(4, din(7, 0)),
               Mux(Bool(half), Vec(din(15, 8), din(7, 0), din(15, 8), din(7, 0)),
                               Vec(din(31, 25), din(24, 16), din(15, 8), din(7, 0))))
    return dout
  }
}
It would help to have more of the code with line numbers, but based on the name of the trait MemOpConstants, I would guess that it includes some Chisel literals. The compiler is complaining that it cannot make an IO port out of MemPortIO, since it contains a Bundle, MemReq, which contains literals as elements.
IO ports may only contain pure, unbound Chisel data types, not literals, Regs, Mems, or Wires.
If you want to make the constants available at the IO ports, you should define slots (UInts) to receive them, and then wire those slots up to the actual constant/literal value. Don't place the constants themselves in the IO port definition.
All your definitions in MemOpConstants are literals (constants) of type Bits: they have values and widths. The values are what make them literals, and unsuitable for inclusion in an IO port definition.
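A minimal sketch of that advice in Chisel 3 syntax (field names follow the question's code; the fixed widths are assumptions, since the original uses config.xprlen):
import chisel3._

// Keep the literals in an object, outside any Bundle, so that Bundle
// reflection never sees an already-bound value.
object MemOpConstants {
  val M_X  = 0.U(1.W) // access type
  val M_RD = 0.U(1.W) // load
  val M_WR = 1.U(1.W) // store
}

class MemReq(dataWidth: Int) extends Bundle {
  val addr = UInt(32.W)
  val data = UInt(dataWidth.W)
  val fcn  = UInt(1.W) // unbound slot, driven with M_RD or M_WR at the use site
  val typ  = UInt(3.W)
}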

When I convert an object to JSON with json4s in an SBT task, it reports errors. How do I fix this?

I'm writing an SBT task which converts some objects to JSON and prints them, but it fails with errors. The code is:
project/plugins.sbt
libraryDependencies ++= Seq(
  "org.scala-lang" % "scalap" % "2.10.4",
  "org.json4s" %% "json4s-native" % "3.2.6",
  "org.json4s" %% "json4s-core" % "3.2.6",
  "org.json4s" %% "json4s-ext" % "3.2.6"
)
build.sbt
name := "json4s-210-test"
version := "1.0"
scalaVersion := "2.10.4"
import scala.tools.scalap.scalax.rules.scalasig.ScalaSigSymbol

lazy val json = taskKey[Unit]("output a json with json4s")

json := {
  case class Config(name: String)
  case class Project(name: String, configs: Seq[Config])

  implicit val formats = org.json4s.DefaultFormats
  val project = Project("myproject", Seq(Config("conf1")))
  val json = org.json4s.native.Serialization.write(project)
  println(json)
}
When I run sbt json, it reports this error:
➜ json4s-210-test git:(master) ✗ ./sbt
[info] Loading global plugins from /Users/twer/.sbt/0.13/plugins
[info] Loading project definition from /Users/twer/workspace/json4s-210-test/project
[info] Set current project to json4s-210-test (in build file:/Users/twer/workspace/json4s-210-test/)
> json
[trace] Stack trace suppressed: run last *:json for the full output.
[error] (*:json) java.lang.NoClassDefFoundError: scala/tools/scalap/scalax/rules/scalasig/ScalaSigSymbol
[error] Total time: 0 s, completed May 14, 2015 1:19:58 PM
> last
java.lang.NoClassDefFoundError: scala/tools/scalap/scalax/rules/scalasig/ScalaSigSymbol
at org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$constructorsAndCompanion$3$$anonfun$13.apply(Reflector.scala:123)
at org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$constructorsAndCompanion$3$$anonfun$13.apply(Reflector.scala:122)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
But as you can see, I've already added the dependency:
"org.scala-lang" % "scalap" % "2.10.4"
and even imported the required class in the build file:
import scala.tools.scalap.scalax.rules.scalasig.ScalaSigSymbol
Why does this error occur, and how can I fix it?
Update: You can clone the code and try it from here: https://github.com/freewind/json4s-210-test