I am rather new to Chisel.
Is it possible for a Chisel testbench to receive an argument passed in at runtime?
For example, sbt run --backend c --compile --test --genHarness --dut1
--dut1 is meant to be received by the testbench as an argument. It will be used to determine which DUT is instantiated.
Yes, I believe that would work.
sbt "project myproject" "run my_arg --backend c --targetDir my_target_dir"
You can catch that in your own main, strip out your arguments, and pass Chisel its arguments. Something sort of like this:
```
import Chisel._                            // Chisel 2-era imports, to match chiselMain
import scala.collection.mutable.ArrayBuffer

object top_main {
  def main(args: Array[String]): Unit = {
    val my_arg = args(0)                   // your own argument comes first
    val chiselArgs = ArrayBuffer[String]() // everything after it is for Chisel
    chiselArgs ++= args.drop(1)
    // I forget the exact syntax for the generator; substitute your own module here
    chiselMain(chiselArgs.toArray, () => Module(new MyDut(my_arg)))
  }
}
```
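To have the argument pick the DUT, the generator you hand to chiselMain can branch on it. A sketch, where Dut1 and Dut2 are hypothetical stand-ins for your own module classes:
```
// choose which DUT to elaborate based on the runtime argument (Dut1/Dut2 are placeholders)
val dutGen = () =>
  if (my_arg == "--dut1") Module(new Dut1)
  else Module(new Dut2)
chiselMain(chiselArgs.toArray, dutGen)
```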
Check out "Chisel runtime error in test harness" for an example main that invokes Chisel.
I have a complex module written in Chisel. I'm using chiseltest to verify its operation. The test is failing. I want to be able to inspect the module's internal wire values to debug what is going wrong. Since the PeekPokeTester only allows me to inspect the value of the io signals, how can I inspect the internal wires?
Here is an example:
import chisel3._
class MyModule extends Module {
val io = IO(new Bundle {
val a = Input(Bool())
val b = Input(Bool())
val c = Input(Bool())
val d = Output(Bool())
})
val i = Wire(Bool())
i := io.a ^ io.b
io.d := i | io.c
}
import chisel3._
import chisel3.tester._
import org.scalatest.FreeSpec
class MyModuleTest extends FreeSpec with ChiselScalatestTester {
"MyModule should work properly" in {
test(new MyModule) { dut =>
dut.io.a.poke(true.B)
dut.io.b.poke(false.B)
dut.io.c.poke(false.B)
dut.i.expect(true.B) // This line throws a java.util.NoSuchElementException
// : key not found: Bool(Wire in MyModule)
}
}
}
How can I inspect the intermediate value "i"?
There are a few ways to do this.
1) Turn on VCD output by adding an annotation to your test, as in:
import chiseltest.experimental.TestOptionBuilder._
import treadle._
...
test(new MyModule).withAnnotations(Seq(WriteVcdAnnotation)) { dut =>
The .vcd file will be placed in the relevant test_run_dir/; you can view it with GTKWave or similar.
2) Add printf statements to your module (see the sketch after this list).
3) There is a simulation shell in the Treadle repo that allows you to peek, poke, and step based on a FIRRTL file (the FIRRTL file should be in the same test_run_dir/ directory as above). There is a bit of documentation here.
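For option 2, a minimal sketch of what such a printf could look like inside MyModule (it prints every simulation cycle, so you can watch the wire change):
```
// inside MyModule's body, after i is assigned:
printf(p"i = $i, io.d = ${io.d}\n")
```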
Good luck!
How can I generate a FIRRTL file from Chisel code? I have installed sbt, firrtl, and verilator according to the GitHub wiki, and created Chisel code for a simple adder. I want to generate the FIRRTL and convert it to Verilog. My problem is how to get the FIRRTL file from the Chisel code.
Thanks.
Source file: MyQueueTest/src/main/scala/example/MyQueueDriver.scala
package example
import chisel3._
import chisel3.util._
class MyQueue extends Module {
val io = IO(new Bundle {
val a = Flipped(Decoupled(UInt(32.W)))
val b = Flipped(Decoupled(UInt(32.W)))
val z = Decoupled(UInt(32.W))
})
val qa = Queue(io.a)
val qb = Queue(io.b)
qa.nodeq()
qb.nodeq()
when (qa.valid && qb.valid && io.z.ready) {
io.z.enq(qa.deq() + qb.deq())
}
}
object MyQueueDriver extends App {
chisel3.Driver.execute(args, () => new MyQueue)
}
I asked a similar question here.
The solution could be to use the full template provided here, or you can simply do this:
Add these lines at the end of your Scala source:
object YourModuleDriver extends App {
chisel3.Driver.execute(args, () => new YourModule)
}
Replacing "YourModule" by the name of your module.
And add a build.sbt file in the same directory of your sources with these lines :
scalaVersion := "2.11.8"
resolvers ++= Seq(
Resolver.sonatypeRepo("snapshots"),
Resolver.sonatypeRepo("releases")
)
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.0-SNAPSHOT"
To generate FIRRTL and Verilog, you will just have to do:
$ sbt "run-main YourModuleDriver"
And the FIRRTL (yourmodule.fir) and Verilog (yourmodule.v) sources will be in the generated directory.
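Since the driver forwards its command-line arguments to Chisel, you can also choose where the files go; this assumes a chisel3 version whose driver accepts the standard --target-dir option:
$ sbt "run-main YourModuleDriver --target-dir my_output_dir"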
I am trying to create a JSON String from a Scala Object as described here.
I have the following code:
import scala.collection.mutable._
import net.liftweb.json._
import net.liftweb.json.Serialization.write
case class Person(name: String, address: Address)
case class Address(city: String, state: String)
object LiftJsonTest extends App {
val p = Person("Alvin Alexander", Address("Talkeetna", "AK"))
// create a JSON string from the Person, then print it
implicit val formats = DefaultFormats
val jsonString = write(p)
println(jsonString)
}
My build.sbt file contains the following:
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5+"
When I build with sbt package, it succeeds.
However, when I try to run it as a Spark job, like this:
spark-submit \
--packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json:2.5+ \
--class "com.foo.MyClass" \
--master local[4] \
target/scala-2.10/my-app_2.10-0.0.1.jar
I get this error:
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: net.liftweb#lift-json;2.5+: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
What am I doing wrong here? Is net.liftweb:lift-json:2.5+ in my packages argument incorrect? Do I need to add a resolver in build.sbt?
Users may also include any other dependencies by supplying a comma-delimited list of maven coordinates with --packages.
2.5+ in your build.sbt is Ivy version-matcher syntax, not an actual artifact version needed for Maven coordinates. spark-submit apparently doesn't use Ivy for resolution (and I think it would be surprising if it did; your application could suddenly stop working because a new dependency version was published). So you need to find what version 2.5+ resolves to in your case, e.g. using https://github.com/jrudolph/sbt-dependency-graph (or by looking for it in show dependencyClasspath).
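Concretely, once you know what 2.5+ resolved to (suppose it turns out to be 2.5.1), pin that exact version in both places; note that the Maven coordinate passed to --packages needs the Scala-version suffix that sbt's %% adds for you:
```
// build.sbt: pin the exact version instead of the Ivy matcher
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5.1"
```
```
spark-submit \
  --packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json_2.10:2.5.1 \
  --class "com.foo.MyClass" \
  --master local[4] \
  target/scala-2.10/my-app_2.10-0.0.1.jar
```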
I'm finding that using specs2 with ScalaCheck to verify the Monoid laws is a bit ugly when trying to make use of the scalaz scalacheck-binding library.
My code uses the scalaz Monoid so I wanted to use their laws to verify my MyType implements them.
This ugliness makes me think I'm missing something or misusing the specs2 or scalacheck-binding APIs. Suggestions appreciated.
This is what I've done:
I'm using specs2 3.7 with scalaz 7.2.0.
Reading the user guide at "http://etorreborre.github.io/specs2/guide/SPECS2-3.0/org.specs2.guide.UseScalaCheck.html"
I have extended my spec with the Scalacheck trait and I have an Arbitrary[MyType] in scope so I should be able to use scalacheck OK.
The doc mentioned above states that I need to pass a function to the prop method, as long as the passed function returns a Result; ScalaCheck's Prop is a valid Result.
The scalacheck-binding API gives me a monoid.laws[T] function that returns a Properties, which is a Prop, so this should be OK. It also takes implicit parameters of types Monoid[T], Equal[T], and Arbitrary[T], all of which I have in scope, where T is MyType.
I want to do this:
class MyTypeSpec extends Specification with ScalaCheck {
def is = s2"""
MyType spec must :-
obey the Monoid Laws $testMonoidLaws
"""
def testMonoidLaws = {
import org.scalacheck.{Gen, Arbitrary}
import scalaz.scalacheck.ScalazProperties._
implicit val arbMyType: Arbitrary[MyType] = genArbMyTpe() // a helper Arbitrary gen func I have written
prop { monoid.laws[MyType] }
}
}
but prop cannot be applied to (org.scalacheck.Properties).
It requires the T in the Arbitrary to be the type of the parameter to the function, so I have done this; notice I throw away the parameter t:
class MyTypeSpec extends Specification with ScalaCheck {
def is = s2"""
MyType spec must :-
obey the Monoid Laws $testMonoidLaws
"""
def testMonoidLaws = {
import org.scalacheck.{Gen, Arbitrary}
import scalaz.scalacheck.ScalazProperties._
implicit val arbMyType: Arbitrary[MyType] = genArbMyTpe() //some Arbitrary Gen func
prop { (t: MyType) => monoid.laws[MyType] }
}
}
My test passes, yay! So what's the problem?
I'm uneasy about the test. All it says is that it passed. I get no output, like I would if using ScalaCheck directly, telling me which laws it ran and passed.
Also, I throw away the parameter t and let monoid.laws[MyType] find the in-scope implicits, which just seems wrong. Is it working? Have I mangled the specs2 API?
Modifying MyType so it would definitely fail the laws caused the test to fail, which is good, but I am still uneasy, as it always fails with:
Falsified after 0 passed tests.
I can collect the Arbitrary[MyType] by doing
prop { (p: MyType) => monoid.laws[MyType] }.collectArg(f => "it was " + f.shows)
then running it like so
sbt "testOnly MyTypeSpec -- scalacheck.verbose"
which shows me the collected values of t when it works, but as I throw away t, I'm not sure this is valid at all.
Is there a better way to test using specs2 and the scalaz scalacheck-bindings that is less ugly and outputs info that gives me confidence that the laws were tried and tested?
Thanks
Karl
You can use Properties directly without having to use prop. Here is a full example:
import org.specs2._
import scalaz.scalacheck.ScalazProperties._
import org.scalacheck._
import scalaz._, Scalaz._
import PositiveInt._
class TestSpec extends Specification with ScalaCheck { def is = s2"""
PositiveInt should pass the Monoid laws $e1
"""
def e1 = monoid.laws[PositiveInt]
}
case class PositiveInt(i: Int)
object PositiveInt {
implicit def ArbitraryPositiveInt: Arbitrary[PositiveInt] =
Arbitrary(Gen.choose(0, 100).map(PositiveInt.apply))
implicit def EqualPositiveInt: Equal[PositiveInt] =
Equal.equalA[PositiveInt]
implicit def MonoidPositiveInt: Monoid[PositiveInt] = new Monoid[PositiveInt] {
val zero = PositiveInt(1)
def append(p1: PositiveInt, p2: =>PositiveInt): PositiveInt =
PositiveInt(p1.i + p2.i)
}
}
And because the Monoid instance is incorrect it will fail with:
[info] TestSpec
[info]
[error] x PositiveInt should pass the Monoid laws
[error] Falsified after 0 passed tests.
[error] > Labels of failing property:
[error] monoid.left identity
[error] > ARG_0: PositiveInt(3)
[info]
[info]
[info] Total for specification TestSpec
[info] Finished in 185 ms
[info] 1 example, 1 failure, 0 error
The failure indicates the first law that fails to pass. It doesn't, however, create several examples, one for each law, to display which law is being executed. If you want that, you can map each property of the laws Properties to an example:
class TestSpec extends Specification with ScalaCheck { def is = s2"""
PositiveInt should pass the Monoid laws $properties
"""
def properties = toExamples(monoid.laws[PositiveInt])
def toExamples(ps: Properties): Fragments =
t ^ Fragments.foreach(ps.properties) { case (name, prop) => br ^ name ! prop }
}
This prints (for a passing Monoid[PositiveInt] instance):
[info] TestSpec
[info]
[info] PositiveInt should pass the Monoid laws
[info] + monoid.semigroup.associative
[info] + monoid.left identity
[info] + monoid.right identity
[info]
[info] Total for specification TestSpec
[info] Finished in 91 ms
[info] 3 examples, 300 expectations, 0 failure, 0 error
The Maven surefire plugin doesn't run integration tests (they are named with an "IT" suffix by convention), but sbt runs both unit and integration tests. So, how do I prevent this behaviour? Is there a common way to distinguish integration and unit tests for ScalaTest (e.g., don't run FeatureSpec tests by default)?
How to do that is documented in the sbt manual at http://www.scala-sbt.org/release/docs/Detailed-Topics/Testing#additional-test-configurations-with-shared-sources:
//Build.scala
import sbt._
import Keys._
object B extends Build {
lazy val root =
Project("root", file("."))
.configs( FunTest )
.settings( inConfig(FunTest)(Defaults.testTasks) : _*)
.settings(
libraryDependencies += specs,
testOptions in Test := Seq(Tests.Filter(unitFilter)),
testOptions in FunTest := Seq(Tests.Filter(itFilter))
)
def itFilter(name: String): Boolean = name endsWith "ITest"
def unitFilter(name: String): Boolean = (name endsWith "Test") && !itFilter(name)
lazy val FunTest = config("fun") extend(Test)
lazy val specs = "org.scala-tools.testing" %% "specs" % "1.6.8" % "test"
}
Call sbt test for unit tests, sbt fun:test for integration tests, and sbt test fun:test for both.
The simplest way with the latest sbt is just to apply the IntegrationTest config and its corresponding settings as described here, and put your integration tests in the src/it/scala directory of your project; a minimal sketch follows.
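A minimal build.sbt sketch of that setup (the ScalaTest version here is just an example):
```
// build.sbt
lazy val root = (project in file("."))
  .configs(IntegrationTest)   // enable the predefined "it" configuration
  .settings(
    Defaults.itSettings,      // adds it:compile, it:test, the src/it/scala source dir, etc.
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % "it,test"
  )
```
Then sbt test runs only the suites under src/test/scala, and sbt it:test runs those under src/it/scala.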