How do you test RawModules? - chisel

I'm using a Chisel RawModule for an AXI interface to a module I'm building (so that I can use the AXI aclk and aresetn). However, I'm unable to use the normal tactic of a PeekPokeTester. What's the recommended strategy for testing RawModules that have clocks?

PeekPokeTester is currently limited to working on MultiIOModule or its subtypes. You can get around this by wrapping your RawModule in a MultiIOModule and bridging the IO (including the implicit clock/reset) from the wrapping MultiIOModule to your RawModule.
The new testing and verification library for Chisel (which replaces chisel-testers/chisel3.iotesters) is expected to support this natively and has an associated tracking issue: ucb-bar/chisel-testers2#14.
Edit: Example of wrapping a RawModule in a MultiIOModule:
import chisel3._
import chisel3.stage.ChiselStage
sealed trait CommonIO { this: RawModule =>
  val a = IO(Input(Bool()))
  val b = IO(Output(Bool()))
}
class Foo extends RawModule with CommonIO {
  val clk = IO(Input(Clock()))
  val rst = IO(Input(Reset()))
  b := withClockAndReset(clk, rst){ RegNext(a, true.B) }
}
class Wrapper extends MultiIOModule with CommonIO {
  val foo = Module(new Foo)
  foo.a := a
  b := foo.b
  foo.clk := clock
  foo.rst := reset
}
(new ChiselStage)
  .execute(Array("-X", "verilog"),
    Seq(chisel3.stage.ChiselGeneratorAnnotation(() => new Wrapper)))
This produces the following FIRRTL:
circuit Wrapper :
  module Foo :
    input a : UInt<1>
    output b : UInt<1>
    input clk : Clock
    input rst : Reset
    reg _T : UInt<1>, clk with : (reset => (rst, UInt<1>("h01")))
    _T <= a
    b <= _T
  module Wrapper :
    input clock : Clock
    input reset : UInt<1>
    input a : UInt<1>
    output b : UInt<1>
    inst foo of Foo
    foo.a <= a
    b <= foo.b
    foo.clk <= clock
    foo.rst <= reset
Note:
- withClockAndReset is used so that clk and rst drive the clock and reset of the RegNext.
- The connections from the implicit Wrapper.clock and Wrapper.reset to foo.clk and foo.rst have to be made explicitly; withClockAndReset will not work here.
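With the bridge in place, the Wrapper can be driven by an ordinary PeekPokeTester. A minimal sketch, assuming the legacy chisel3.iotesters dependency is on the classpath (WrapperTester and the poked values are illustrative, not from the original answer):
import chisel3.iotesters.{Driver, PeekPokeTester}
class WrapperTester(c: Wrapper) extends PeekPokeTester(c) {
  // drive the wrapped RawModule's a through the Wrapper and observe b
  poke(c.a, 1)
  step(1)
  expect(c.b, 1)
}
Driver.execute(Array.empty[String], () => new Wrapper) { c => new WrapperTester(c) }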

Related

Debugging module internals in Chisel

I have a complex module written in Chisel. I'm using chiseltest to verify its operation. The test is failing. I want to be able to inspect the module's internal wire values to debug what is going wrong. Since the PeekPokeTester only allows me to inspect the value of the io signals, how can I inspect the internal wires?
Here is an example:
import chisel3._
class MyModule extends Module {
  val io = IO(new Bundle {
    val a = Input(Bool())
    val b = Input(Bool())
    val c = Input(Bool())
    val d = Output(Bool())
  })
  val i = Wire(Bool())
  i := io.a ^ io.b
  io.d := i | io.c
}
import chisel3._
import chisel3.tester._
import org.scalatest.FreeSpec
class MyModuleTest extends FreeSpec with ChiselScalatestTester {
  "MyModule should work properly" in {
    test(new MyModule) { dut =>
      dut.io.a.poke(true.B)
      dut.io.b.poke(false.B)
      dut.io.c.poke(false.B)
      dut.i.expect(true.B) // This line throws a java.util.NoSuchElementException:
                           // key not found: Bool(Wire in MyModule)
    }
  }
}
How can I inspect the intermediate value "i"?
There are a few ways to do this.
1) Turn on VCD output by adding an annotation to your test, as in:
import chiseltest.experimental.TestOptionBuilder._
import treadle._
...
test(new MyModule).withAnnotations(Seq(WriteVcdAnnotation)) { dut =>
The .vcd file will be placed in the relevant test_run_dir/; you can view it with GTKWave or similar.
2) Add printf statements to your module (see the sketch after this list).
3) There is a simulation shell in the Treadle repo that allows you to peek, poke, and step based on a FIRRTL file (the FIRRTL file should be in the same test_run_dir/ directory as above). There is a bit of documentation here.
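A minimal sketch of option 2, assuming a copy of MyModule with a printf added (the statement prints once per simulated clock cycle):
class MyModuleWithPrintf extends Module {
  val io = IO(new Bundle {
    val a = Input(Bool())
    val b = Input(Bool())
    val c = Input(Bool())
    val d = Output(Bool())
  })
  val i = Wire(Bool())
  i := io.a ^ io.b
  io.d := i | io.c
  // dump the internal wire alongside the IO every cycle during simulation
  printf(p"a=${io.a} b=${io.b} c=${io.c} i=$i d=${io.d}\n")
}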
Good luck!

Generating Verilog code after BlackBoxing in Chisel3

I am trying the BlackBox feature in Chisel3. Every time I try to generate Verilog code from Chisel I get an error.
I followed the usual steps: writing the class, the driver object, and build.sbt.
I am not sure where the problem is.
This is my Chisel code:
import chisel3._
import chisel3.util._
import chisel3.experimental._
class BlackBoxRealAdd extends BlackBox with HasBlackBoxInline {
  val io = IO(new Bundle() {
    val in1 = Input(UInt(64.W))
    val in2 = Input(UInt(64.W))
    val out = Output(UInt(64.W))
  })
  setInline("BlackBoxRealAdd.v",
    s"""
       |module BlackBoxRealAdd(
       |  input  [15:0] in1,
       |  input  [15:0] in2,
       |  output [15:0] out
       |);
       |always @* begin
       |  out <= (in1) + (in2);
       |end
       |endmodule
     """.stripMargin)
}
object BlackBoxRealAddDriver extends App {
  chisel3.Driver.execute(args, () => new BlackBoxRealAdd)
}
scalaVersion := "2.11.12"
resolvers ++= Seq(
  Resolver.sonatypeRepo("snapshots"),
  Resolver.sonatypeRepo("releases")
)
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.1.+"
I have figured it out. The blackboxed module shouldn't be the top one.
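A minimal sketch of what that looks like, instantiating the BlackBox inside a regular top-level Module (TopAdd and TopAddDriver are illustrative names, not from the original question):
class TopAdd extends Module {
  val io = IO(new Bundle {
    val in1 = Input(UInt(64.W))
    val in2 = Input(UInt(64.W))
    val out = Output(UInt(64.W))
  })
  // instantiate the BlackBox and wire it straight through
  val adder = Module(new BlackBoxRealAdd)
  adder.io.in1 := io.in1
  adder.io.in2 := io.in2
  io.out := adder.io.out
}
object TopAddDriver extends App {
  chisel3.Driver.execute(args, () => new TopAdd)
}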

How do I override default codec in circe?

I'd like to encode the Array[Byte] fields of my case classes as Base64 strings. For some reason Circe doesn't pick up my codec and uses the default one instead, which converts the byte array into a JSON array of ints.
What should I do to fix it? Here is my minimized code:
import io.circe.generic.JsonCodec
sealed trait DocumentAttribute
@JsonCodec
sealed case class DAAudio(title: Option[String], performer: Option[String], waveform: Option[Array[Byte]], duration: Int) extends DocumentAttribute
@JsonCodec
sealed case class DAFilename(fileName: String) extends DocumentAttribute
object CirceEncodersDecoders {
  import java.util.Base64
  import io.circe._
  import io.circe.generic.extras._
  import io.circe.generic.extras.semiauto._
  implicit val arrayByteEncoder: Encoder[Array[Byte]] = Encoder.encodeString.contramap[Array[Byte]] { bytes ⇒
    Base64.getEncoder.encodeToString(bytes)
  }
  val printer: Printer = Printer.noSpaces.copy(dropNullValues = true, reuseWriters = true)
  implicit val config: Configuration = Configuration.default.withDiscriminator("kind").withSnakeCaseConstructorNames.withSnakeCaseMemberNames
  implicit val DocumentAttributeEncoder: Encoder[DocumentAttribute] = deriveEncoder
  implicit val DocumentAttributeDecoder: Decoder[DocumentAttribute] = deriveDecoder
}
object main {
  def main(args: Array[String]): Unit = {
    import CirceEncodersDecoders._
    import io.circe.parser._
    import io.circe.syntax._
    val attributes: List[DocumentAttribute] = List(
      DAAudio(Some("title"), Some("perform"), Some(List(1, 2, 3, 4, 5).map(_.toByte).toArray), 15),
      DAFilename("filename")
    )
    val j2 = attributes.asJson
    val decoded2 = decode[List[DocumentAttribute]](j2.noSpaces)
    println(decoded2)
  }
}
When you do this:
implicit val DocumentAttributeEncoder: Encoder[DocumentAttribute] = deriveEncoder
circe tries to find a suitable Encoder for DAFilename and DAAudio. However, since those already exist (by means of @JsonCodec on the individual classes), it does not re-derive them from scratch using generic derivation and the Encoder[Array[Byte]] you have in scope, which is what you actually want it to do.
So you can either get rid of @JsonCodec (so it auto-derives codecs for DAFilename and DAAudio together with DocumentAttribute) or trigger re-derivation manually:
implicit val AudioEncoder: Encoder[DAAudio] = deriveEncoder // takes priority over the existing one
implicit val DocumentAttributeEncoder: Encoder[DocumentAttribute] = deriveEncoder // AudioEncoder will be used here
You also need to build a Decoder for Array[Byte] and repeat the process above for the Decoders; otherwise it will try to parse the Base64 string as a list of ints, resulting in a failure.
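A sketch of that decoding side, mirroring the encoder definitions above:
implicit val arrayByteDecoder: Decoder[Array[Byte]] =
  Decoder.decodeString.map(Base64.getDecoder.decode) // throws on malformed input; emap would give a safer variant
implicit val AudioDecoder: Decoder[DAAudio] = deriveDecoder // takes priority over the @JsonCodec-generated one
implicit val DocumentAttributeDecoder: Decoder[DocumentAttribute] = deriveDecoder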
It seems that @JsonCodec annotations don't work with a custom encoder for Array[Byte].
Here is everything needed for encoding and decoding of your classes with circe:
object CirceEncodersDecoders2 {
  import java.util.Base64
  import io.circe._
  import io.circe.generic.extras._
  import io.circe.generic.extras.semiauto._
  val printer: Printer = Printer.noSpaces.copy(dropNullValues = true, reuseWriters = true)
  implicit val arrayByteEncoder: Encoder[Array[Byte]] =
    Encoder.encodeString.contramap[Array[Byte]](Base64.getEncoder.encodeToString)
  implicit val arrayByteDecoder: Decoder[Array[Byte]] =
    Decoder.decodeString.map[Array[Byte]](Base64.getDecoder.decode)
  implicit val config: Configuration = Configuration.default.withDiscriminator("kind").withSnakeCaseConstructorNames.withSnakeCaseMemberNames
  implicit val audioEncoder: Encoder[DAAudio] = deriveEncoder[DAAudio]
  implicit val audioDecoder: Decoder[DAAudio] = deriveDecoder[DAAudio]
  implicit val filenameEncoder: Encoder[DAFilename] = deriveEncoder[DAFilename]
  implicit val filenameDecoder: Decoder[DAFilename] = deriveDecoder[DAFilename]
  implicit val documentAttributeEncoder: Encoder[DocumentAttribute] = deriveEncoder[DocumentAttribute]
  implicit val documentAttributeDecoder: Decoder[DocumentAttribute] = deriveDecoder[DocumentAttribute]
}
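A brief usage sketch of the object above, assuming the @JsonCodec annotations have been removed from the case classes as noted:
import CirceEncodersDecoders2._
import io.circe.parser._
import io.circe.syntax._
val attrs: List[DocumentAttribute] = List(
  DAAudio(Some("title"), Some("perform"), Some(Array[Byte](1, 2, 3, 4, 5)), 15),
  DAFilename("filename")
)
val json = attrs.asJson.noSpaces // waveform is rendered as a Base64 string; "kind" is the discriminator
println(decode[List[DocumentAttribute]](json)) // expected: Right(List(...)) if the codecs round-trip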
If you are not limited in your selection of JSON parser/serializer, you can try a more efficient solution using jsoniter-scala.
DISCLAIMER: I'm the author of this library.
Here are results of benchmarks for both implementations:
[info] Benchmark Mode Cnt Score Error Units
[info] ListOfAdtWithBase64Benchmark.readCirce thrpt 5 114927.343 ± 7910.068 ops/s
[info] ListOfAdtWithBase64Benchmark.readJsoniterScala thrpt 5 1818299.170 ± 162757.404 ops/s
[info] ListOfAdtWithBase64Benchmark.writeCirce thrpt 5 117982.635 ± 8942.816 ops/s
[info] ListOfAdtWithBase64Benchmark.writeJsoniterScala thrpt 5 4281752.461 ± 319953.287 ops/s
Full sources are here.

Why does this getOrElse statement return type ANY?

I am trying to follow the tutorial https://www.jamesward.com/2012/02/21/play-framework-2-with-scala-anorm-json-coffeescript-jquery-heroku but of course play-scala has changed since the tutorial was written (as seems to be the case with every tutorial I find). I am using 2.4.3. This requires that I actually learn how things work, which is not necessarily a bad thing.
One thing that is giving me trouble is the getOrElse method.
Here is my Bar.scala model
package models
import play.api.db._
import play.api.Play.current
import anorm._
import anorm.SqlParser._
case class Bar(id: Option[Long], name: String)
object Bar {
  val simple = {
    get[Option[Long]]("id") ~
      get[String]("name") map {
        case id ~ name => Bar(id, name)
      }
  }
  def findAll(): Seq[Bar] = {
    DB.withConnection { implicit connection =>
      SQL("select * from bar").as(Bar.simple *)
    }
  }
  def create(bar: Bar): Unit = {
    DB.withConnection { implicit connection =>
      SQL("insert into bar(name) values ({name})").on(
        'name -> bar.name
      ).executeUpdate()
    }
  }
}
and my BarFormat.scala Json formatter
package models
import play.api.libs.json._
import anorm._
package object Implicits {
  implicit object BarFormat extends Format[Bar] {
    def reads(json: JsValue): JsResult[Bar] = JsSuccess(Bar(
      Option((json \ "id").as[Long]),
      (json \ "name").as[String]
    ))
    def writes(bar: Bar) = JsObject(Seq(
      "id" -> JsNumber(bar.id.getOrElse(0L)),
      "name" -> JsString(bar.name)
    ))
  }
}
and for completeness my Application.scala controller:
package controllers
import play.api.mvc._
import play.api.data._
import play.api.data.Forms._
import javax.inject.Inject
import javax.inject._
import play.api.i18n.{ I18nSupport, MessagesApi, Messages, Lang }
import play.api.libs.json._
import views._
import models.Bar
import models.Implicits._
class Application @Inject()(val messagesApi: MessagesApi) extends Controller with I18nSupport {
  val barForm = Form(
    single("name" -> nonEmptyText)
  )
  def index = Action {
    Ok(views.html.index(barForm))
  }
  def addBar() = Action { implicit request =>
    barForm.bindFromRequest.fold(
      errors => BadRequest,
      {
        case (name) =>
          Bar.create(Bar(None, name))
          Redirect(routes.Application.index())
      }
    )
  }
  def listBars() = Action { implicit request =>
    val bars = Bar.findAll()
    val json = Json.toJson(bars)
    Ok(json).as("application/json")
  }
}
and routes
# Routes
# This file defines all application routes (Higher priority routes first)
# ~~~~
# Home page
POST /addBar controllers.Application.addBar
GET / controllers.Application.index
GET /listBars controllers.Application.listBars
# Map static resources from the /public folder to the /assets URL path
GET /assets/*file controllers.Assets.versioned(path="/public", file: Asset)
When I try to run my project I get a type mismatch error on the JsNumber line: found Any, required BigDecimal.
Now, bar.id is defined as an Option[Long], so bar.id.getOrElse(0L) should return a Long as far as I can tell, but it is clearly returning an Any. Can anyone help me understand why?
Thank You!
That's the way type inference works in Scala...
First of all there is an implicit conversion from Int to BigDecimal:
scala> (1 : Int) : BigDecimal
res0: BigDecimal = 1
That conversion allows for Int to be converted before the option is constructed:
scala> Some(1) : Option[BigDecimal]
res1: Option[BigDecimal] = Some(1)
If we try getOrElse on its own, where the type can be fixed, we get the expected type Int:
scala> Some(1).getOrElse(2)
res2: Int = 1
However, this does not work (the problem you have):
scala> Some(1).getOrElse(2) : BigDecimal
<console>:11: error: type mismatch;
found : Any
required: BigDecimal
Some(1).getOrElse(2) : BigDecimal
^
Scala's implicit conversions kick in last, after type inference is performed. That makes sense, because if you don't know the type, how would you know what conversions need to be applied? Scala can see that a BigDecimal is expected, but the result is an Int based on the type of the Option it has. So it tries to widen the type, can't find anything in Int's type hierarchy that matches BigDecimal, and fails with the error.
This works, however, because the type is fixed in the variable declaration:
scala> val v = Some(1).getOrElse(2)
v: Int = 1
scala> v: BigDecimal
res4: BigDecimal = 1
So we need to help the compiler somehow - any type annotation or explicit conversion would work. Pick any one you like:
scala> (Some(1).getOrElse(2) : Int) : BigDecimal
res5: BigDecimal = 1
scala> Some(1).getOrElse[Int](2) : BigDecimal
res6: BigDecimal = 1
scala> BigDecimal(Some(1).getOrElse(2))
res7: scala.math.BigDecimal = 1
Here is the signature for Option.getOrElse method:
getOrElse[B >: A](default: ⇒ B): B
The bound B >: A says that the type parameter B must be a supertype of the element type A; in this case Any, being a supertype of Long, is what gets inferred:
val l: Long = 10
val a: Any = l
So, we can do something very similar here with getOrElse:
val some: Option[Long] = Some(1)
val value: Any = some.getOrElse("potatos")
val none: Option[Long] = None
val elseValue: Any = none.getOrElse("potatos")
Which brings us to your scenario: the type returned from getOrElse will be an Any and not a BigDecimal, so you will need another way to handle this situation, for instance using fold:
def writes(bar: Bar) = {
  val defaultValue = BigDecimal(0)
  JsObject(Seq(
    "id" -> JsNumber(bar.id.fold(defaultValue)(BigDecimal(_))),
    "name" -> JsString(bar.name)
  ))
}
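Another option, in line with the explicit type annotations shown earlier in this thread, is to pin getOrElse's type parameter so the result stays a Long and the implicit Long-to-BigDecimal conversion can apply; a sketch of the same writes method:
def writes(bar: Bar) = JsObject(Seq(
  "id" -> JsNumber(bar.id.getOrElse[Long](0L)), // getOrElse[Long] keeps the result from widening to Any
  "name" -> JsString(bar.name)
))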
Some other discussions that can help you:
Why is Some(1).getOrElse(Some(1)) not of type Option[Int]?
Option getOrElse type mismatch error

Play 2.3 Scala: Explicitly passing Writer - needs to match Option[T] vs T; implicit writer can handle both

I have a case class with a json.Writes[T] defined on it.
If I have an Option[T] and an implicit Writes in scope, I can call Json.toJson on the Option[T], and this works.
However, if I call Json.toJson(T)(json.Writes[T]), explicitly passing the writer, I get a compile error:
type mismatch;
found : play.api.libs.json.Writes[models.WorkflowTemplateMilestone]{def writes(o: models.WorkflowTemplateMilestone): play.api.libs.json.JsObject}
required: play.api.libs.json.Writes[Option[models.WorkflowTemplateMilestone]]
Note: implicit value workflowTemplateMilestoneAPIWrites is not applicable here because it comes after the application point and it lacks an explicit result type
Am I passing the Writer in incorrectly? How does it work between T and Option[T] with the implicit Writes?
The actual code is below, in case I've misdiagnosed the issue.
Custom writer
implicit val workflowTemplateMilestoneAPIWrites = new Writes[WorkflowTemplateMilestone] {
  def writes(o: WorkflowTemplateMilestone) = Json.obj(
    "id" -> o.id,
    "name" -> o.name,
    "order" -> o.order
  )
}
WORKS
implicit val workflowTemplateMilestoneAPIWrites = new Writes[List[(WorkflowTemplate, Option[WorkflowTemplateMilestone])]] {
  def writes(l: List[(WorkflowTemplate, Option[WorkflowTemplateMilestone])]) = Json.obj(
    "id" -> l.head._1.id,
    "name" -> l.head._1.name,
    "milestones" ->
      l.map(o =>
        Json.toJson(o._2) // Writer is picked up through the implicit; this WORKS
      )
  )
}
GIVES COMPILE ERROR
implicit val workflowTemplateMilestoneAPIWrites = new Writes[List[(WorkflowTemplate, Option[WorkflowTemplateMilestone])]] {
  def writes(l: List[(WorkflowTemplate, Option[WorkflowTemplateMilestone])]) = Json.obj(
    "id" -> l.head._1.id,
    "name" -> l.head._1.name,
    "milestones" ->
      l.map(o =>
        Json.toJson(o._2)(WorkflowTemplateMilestone.workflowTemplateMilestoneAPIWrites) // But if I explicitly pass in the Writer, I get the compile error
      )
  )
}
Thanks,
Brent
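The compile error itself points at the mismatch: Json.toJson(o._2) infers its type argument as Option[WorkflowTemplateMilestone], so an explicitly passed writer has to be a Writes[Option[WorkflowTemplateMilestone]], not a Writes[WorkflowTemplateMilestone]. A minimal sketch of lifting the existing writer by hand, using the names from the question (optionalMilestoneWrites is an assumed name):
val optionalMilestoneWrites: Writes[Option[WorkflowTemplateMilestone]] = Writes {
  case Some(m) => Json.toJson(m)(WorkflowTemplateMilestone.workflowTemplateMilestoneAPIWrites)
  case None    => JsNull
}
// ...then inside the map:
Json.toJson(o._2)(optionalMilestoneWrites)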