How can I generate FIRRTL from Chisel code?

How can I generate a FIRRTL file from Chisel code? I have installed sbt, firrtl, and verilator according to the GitHub wiki, and I have written Chisel code for a simple adder. I want to generate the FIRRTL and convert it to Verilog. My problem is how to get the FIRRTL file from the Chisel code.
Thanks.
Source file: MyQueueTest/src/main/scala/example/MyQueueDriver.scala
package example

import chisel3._
import chisel3.util._

class MyQueue extends Module {
  val io = IO(new Bundle {
    val a = Flipped(Decoupled(UInt(32.W)))
    val b = Flipped(Decoupled(UInt(32.W)))
    val z = Decoupled(UInt(32.W))
  })
  val qa = Queue(io.a)
  val qb = Queue(io.b)
  qa.nodeq()
  qb.nodeq()
  io.z.valid := false.B // defaults, so io.z is fully initialized
  io.z.bits := 0.U
  when (qa.valid && qb.valid && io.z.ready) {
    io.z.enq(qa.deq() + qb.deq())
  }
}

object MyQueueDriver extends App {
  chisel3.Driver.execute(args, () => new MyQueue)
}

I asked a similar question here.
The solution could be to use the full template provided here, or you can simply do the following:
Add these lines at the end of your Scala sources:

object YourModuleDriver extends App {
  chisel3.Driver.execute(args, () => new YourModule)
}

Replace "YourModule" with the name of your module.
Then add a build.sbt file in the same directory as your sources with these lines:
scalaVersion := "2.11.8"
resolvers ++= Seq(
Resolver.sonatypeRepo("snapshots"),
Resolver.sonatypeRepo("releases")
)
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.0-SNAPSHOT"
To generate FIRRTL and Verilog, you just have to run:

$ sbt "run-main YourModuleDriver"

The FIRRTL (yourmodule.fir) and Verilog (yourmodule.v) sources will be in the target directory (the current directory by default; use --target-dir to change it).
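Note that on recent Chisel 3 releases (3.2 and later), chisel3.Driver is deprecated in favor of ChiselStage. A minimal sketch of the same driver using that API (the "generated" directory name is just an example):

import chisel3.stage.ChiselStage

object YourModuleDriver extends App {
  val stage = new ChiselStage
  // emits YourModule.fir and YourModule.v into ./generated
  stage.emitFirrtl(new YourModule, Array("--target-dir", "generated"))
  stage.emitVerilog(new YourModule, Array("--target-dir", "generated"))
}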

Related

Debugging module internals in Chisel

I have a complex module written in Chisel. I'm using chiseltest to verify its operation. The test is failing. I want to be able to inspect the module's internal wire values to debug what is going wrong. Since the PeekPokeTester only allows me to inspect the value of the io signals, how can I inspect the internal wires?
Here is an example:
import chisel3._

class MyModule extends Module {
  val io = IO(new Bundle {
    val a = Input(Bool())
    val b = Input(Bool())
    val c = Input(Bool())
    val d = Output(Bool())
  })
  val i = Wire(Bool())
  i := io.a ^ io.b
  io.d := i | io.c
}
import chisel3._
import chisel3.tester._
import org.scalatest.FreeSpec

class MyModuleTest extends FreeSpec with ChiselScalatestTester {
  "MyModule should work properly" in {
    test(new MyModule) { dut =>
      dut.io.a.poke(true.B)
      dut.io.b.poke(false.B)
      dut.io.c.poke(false.B)
      dut.i.expect(true.B) // This line throws a java.util.NoSuchElementException:
                           // key not found: Bool(Wire in MyModule)
    }
  }
}
How can I inspect the intermediate value "i"?
There are a few ways to do this.
1) Turn on VCD output by adding an annotation to your test, as in:
import chiseltest.experimental.TestOptionBuilder._
import treadle._
...
test(new MyModule).withAnnotations(Seq(WriteVcdAnnotation)) { dut =>

The .vcd file will be placed in the relevant test_run_dir/; you can view it with GTKWave or similar.
2) Add printf statements to your module (see the sketch after this list).
3) There is a simulation shell in the Treadle repo that allows you to peek, poke, and step based on a FIRRTL file (the FIRRTL file should be in the same test_run_dir/ directory as above). There is a bit of documentation here.
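For option 2, a minimal sketch of the module from the question with a printf added (the format string is just an example; it prints on every rising clock edge during simulation):

import chisel3._

class MyModule extends Module {
  val io = IO(new Bundle {
    val a = Input(Bool())
    val b = Input(Bool())
    val c = Input(Bool())
    val d = Output(Bool())
  })
  val i = Wire(Bool())
  i := io.a ^ io.b
  io.d := i | io.c
  // trace the intermediate wire each cycle
  printf(p"i = $i, d = ${io.d}\n")
}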
Good luck!

Generating Verilog code after BlackBoxing in Chisel3

I am trying the BlackBox feature in Chisel3. Every time I try to generate the Verilog code, I get an error.
I followed the usual steps: writing the class, the driver object, and build.sbt.
I am not sure where the problem is.
This is my Chisel code:

import chisel3._
import chisel3.util._
import chisel3.experimental._

class BlackBoxRealAdd extends BlackBox with HasBlackBoxInline {
  val io = IO(new Bundle() {
    val in1 = Input(UInt(64.W))
    val in2 = Input(UInt(64.W))
    val out = Output(UInt(64.W))
  })
  setInline("BlackBoxRealAdd.v",
    s"""
       |module BlackBoxRealAdd(
       |  input  [63:0] in1,
       |  input  [63:0] in2,
       |  output [63:0] out
       |);
       |  // combinational add; port widths match the 64-bit Chisel IO above
       |  assign out = in1 + in2;
       |endmodule
    """.stripMargin)
}

object BlackBoxRealAddDriver extends App {
  chisel3.Driver.execute(args, () => new BlackBoxRealAdd)
}
And my build.sbt:

scalaVersion := "2.11.12"
resolvers ++= Seq(
Resolver.sonatypeRepo("snapshots"),
Resolver.sonatypeRepo("releases")
)
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.1.+"
I have figured it out: the black-boxed module shouldn't be the top one. It has to be instantiated inside a regular Module that serves as the top level.
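A minimal sketch of such a wrapper, assuming the BlackBoxRealAdd definition above (the names RealAddTop and RealAddTopDriver are made up for illustration):

import chisel3._

// Regular Module serving as the top level; the BlackBox lives inside it
class RealAddTop extends Module {
  val io = IO(new Bundle {
    val in1 = Input(UInt(64.W))
    val in2 = Input(UInt(64.W))
    val out = Output(UInt(64.W))
  })
  val adder = Module(new BlackBoxRealAdd)
  adder.io.in1 := io.in1
  adder.io.in2 := io.in2
  io.out := adder.io.out
}

object RealAddTopDriver extends App {
  chisel3.Driver.execute(args, () => new RealAddTop)
}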

Scala/Spark: NoClassDefFoundError: net/liftweb/json/Formats

I am trying to create a JSON String from a Scala Object as described here.
I have the following code:
import net.liftweb.json._
import net.liftweb.json.Serialization.write

case class Person(name: String, address: Address)
case class Address(city: String, state: String)

object LiftJsonTest extends App {
  val p = Person("Alvin Alexander", Address("Talkeetna", "AK"))
  // create a JSON string from the Person, then print it
  implicit val formats = DefaultFormats
  val jsonString = write(p)
  println(jsonString)
}
My build.sbt file contains the following:
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5+"
When I build with sbt package, it succeeds.
However, when I try to run it as a Spark job, like this:
spark-submit \
--packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json:2.5+ \
--class "com.foo.MyClass" \
--master local[4] \
target/scala-2.10/my-app_2.10-0.0.1.jar
I get this error:
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: net.liftweb#lift-json;2.5+: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
What am I doing wrong here? Is net.liftweb:lift-json:2.5+ in my packages argument incorrect? Do I need to add a resolver in build.sbt?
Per the spark-submit documentation, "users may also include any other dependencies by supplying a comma-delimited list of Maven coordinates with --packages."
The problem is that 2.5+ in your build.sbt is Ivy version-matcher syntax, not an actual artifact version, and Maven coordinates need an exact version. spark-submit apparently doesn't use Ivy for resolution (and it would be surprising if it did; your application could suddenly stop working because a new dependency version was published). So you need to find out what version 2.5+ actually resolves to in your case, e.g. using https://github.com/jrudolph/sbt-dependency-graph (or by checking show dependencyClasspath).
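For example, assuming 2.5+ resolves to 2.5.1 (a published lift-json release), you would pin the version in build.sbt:

libraryDependencies += "net.liftweb" %% "lift-json" % "2.5.1"

and spell out the full Maven coordinate on the command line, including the Scala-version suffix that sbt's %% adds for you:

spark-submit \
  --packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json_2.10:2.5.1 \
  --class "com.foo.MyClass" \
  --master local[4] \
  target/scala-2.10/my-app_2.10-0.0.1.jar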

Pass arg to testbench during runtime

I am rather new to Chisel.
Is it possible for a Chisel testbench to receive an arg passed in at runtime?
For example, sbt run --backend c --compile --test --genHarness --dut1
--dut1 is meant to be received by the testbench as an arg. It will be used to determine which DUT to be instantiated.
Yes, I believe that would work.
sbt "project myproject" "run my_arg --backend c --targetDir my_target_dir"
You can catch that in your own main, strip out your arguments, and pass Chisel the rest. Something like this (YourDut is a placeholder; substitute your own module constructor):

import scala.collection.mutable.ArrayBuffer

object top_main {
  def main(args: Array[String]): Unit = {
    val my_arg = args(0) // first argument selects the DUT
    // forward the remaining arguments to Chisel
    val chiselArgs = ArrayBuffer[String](args.drop(1): _*)
    chiselMain(chiselArgs.toArray, () => Module(new YourDut(my_arg)))
  }
}
Check out "Chisel runtime error in test harness" for an example main that invokes Chisel.

How to prevent sbt from running integration tests?

Maven's surefire plugin doesn't run integration tests (by convention they are named with an "IT" suffix), but sbt runs both unit and integration tests. How can I prevent this behaviour? Is there a common way to distinguish integration tests from unit tests with ScalaTest (e.g. don't run FeatureSpec tests by default)?
How to do that is documented in the sbt manual at http://www.scala-sbt.org/release/docs/Detailed-Topics/Testing#additional-test-configurations-with-shared-sources:
// Build.scala
import sbt._
import Keys._

object B extends Build {
  lazy val root =
    Project("root", file("."))
      .configs(FunTest)
      .settings(inConfig(FunTest)(Defaults.testTasks): _*)
      .settings(
        libraryDependencies += specs,
        // sbt test runs the unit tests, sbt fun:test the integration tests
        testOptions in Test := Seq(Tests.Filter(unitFilter)),
        testOptions in FunTest := Seq(Tests.Filter(itFilter))
      )

  def itFilter(name: String): Boolean = name endsWith "ITest"
  def unitFilter(name: String): Boolean = (name endsWith "Test") && !itFilter(name)

  lazy val FunTest = config("fun") extend(Test)
  lazy val specs = "org.scala-tools.testing" %% "specs" % "1.6.8" % "test"
}
Call sbt test for unit tests, sbt fun:test for integration tests, and sbt test fun:test for both.
The simplest way with more recent sbt versions is to apply the built-in IntegrationTest config and the corresponding settings as described here, and put your integration tests in the src/it/scala directory of your project; a sketch follows.
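A minimal build.sbt sketch of that approach (the ScalaTest version is just an example):

// integration tests live in src/it/scala, unit tests stay in src/test/scala
lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % "it,test"
  )

With this, sbt test runs only the unit tests and sbt it:test runs the integration tests.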