Generating waveforms with ChiselTest framework - chisel

I have a ChiselTest tester written as follows:
class EccTester extends FlatSpec with ChiselScalatestTester with Matchers {
  behavior of "Testers2"

  it should "send data without errors" in {
    test(new EccPair(width=8)) {
      c => {
        val rnd = new Random()
        for (i <- 0 to 20) {
          val testVal = rnd.nextInt(1 << c.getWidthParam)

          c.io.dataIn.poke(testVal.U)
          c.io.errorLocation.poke(0.U)
          c.io.injectError.poke(false.B)
          c.io.injectSecondError.poke(false.B)
          c.clock.step(1)
          c.io.dataOut.expect(testVal.U)
          c.io.outputNotEqual.expect(false.B)
        }
      }
    }
  }
}
I am able to run the test in the shell with
testOnly chisel.lib.ecc.EccTester
But when I try to generate waveforms per the ChiselTest documentation,
testOnly chisel.lib.ecc.EccTester -- -DwriteVcd=1
The test executes OK but does not dump a waveform.
The documentation I referenced is https://github.com/ucb-bar/chisel-testers2, and the full source code is at https://github.com/hutch31/ip-contributions/blob/ecc/src/test/scala/chisel/lib/ecc/EccTester.scala

I don’t think there is a formal answer to this yet, but here’s what I do. First, I add the following two imports:
import chiseltest.experimental.TestOptionBuilder._
import chiseltest.internal.WriteVcdAnnotation

then add the annotation to the test like this:

it should "send data without errors" in {
  test(new EccPair(width=8)).withAnnotations(Seq(WriteVcdAnnotation)) {
    c => {
Note: there are two definitions of WriteVcdAnnotation, one in package treadle and the other in package chiseltest.internal. Use the latter, as it works for both the treadle and verilator backends.
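For completeness, here is a minimal sketch of the whole test with the annotation applied, assuming the same EccPair DUT and getWidthParam accessor from the question (the class name EccWaveformTester is just illustrative). The VCD should end up under test_run_dir/<test name>/ for either backend.

import chisel3._
import chiseltest._
import chiseltest.experimental.TestOptionBuilder._
import chiseltest.internal.WriteVcdAnnotation
import org.scalatest._

import scala.util.Random

class EccWaveformTester extends FlatSpec with ChiselScalatestTester with Matchers {
  behavior of "Testers2"

  it should "send data without errors and dump a VCD" in {
    // WriteVcdAnnotation asks the simulation backend to emit a .vcd file
    test(new EccPair(width = 8)).withAnnotations(Seq(WriteVcdAnnotation)) { c =>
      val rnd = new Random()
      for (_ <- 0 to 20) {
        val testVal = rnd.nextInt(1 << c.getWidthParam)
        c.io.dataIn.poke(testVal.U)
        c.io.errorLocation.poke(0.U)
        c.io.injectError.poke(false.B)
        c.io.injectSecondError.poke(false.B)
        c.clock.step(1)
        c.io.dataOut.expect(testVal.U)
        c.io.outputNotEqual.expect(false.B)
      }
    }
  }
}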

Related

How to dynamically add IO ports to a Chisel Bundle?

How can you dynamically add inputs or outputs to a Bundle in order to achieve the equivalent of this pseudocode?
class MyBundle extends Bundle {
  for (i <- 1 to 10) {
    val foo_<i> = UInt(i.W)
  }
}
Note that I would like to not only create 10 dynamic ports, but would also like the index value to be reflected in the port name and port size. I think MixedVec cast to a Bundle can potentially offer something similar, but it is not quite what I am looking for.
One way to do it is to use Record instead of Bundle. Here's a pointer to the chisel3 test of the Record construct: RecordSpec.scala
As an example based on your pseudocode, it would look like this:
import scala.collection.immutable.ListMap

class MyBundle extends Record {
  val elements = ListMap(Seq.tabulate(10) { i =>
    s"foo_$i" -> UInt(i.W)
  }: _*)
  override def cloneType: this.type = (new MyBundle).asInstanceOf[this.type]
}
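As a short usage sketch (my own illustration, not part of the original answer): the fields of a Record are reached through its elements map, keyed by the generated names. I start the widths at 1 bit here because a foo_0 of width 0.W would be a zero-width port, which only complicates the illustration.

import chisel3._
import scala.collection.immutable.ListMap

class MyRecord extends Record {
  // foo_1 .. foo_10, each i bits wide (names and widths are illustrative)
  val elements = ListMap((1 to 10).map { i =>
    s"foo_$i" -> UInt(i.W)
  }: _*)
  // needed on older chisel3 versions, mirroring the answer above
  override def cloneType: this.type = (new MyRecord).asInstanceOf[this.type]
}

class MyModule extends Module {
  val io = IO(new Bundle {
    val in  = Input(new MyRecord)
    val sum = Output(UInt(16.W))
  })
  // fields are looked up (or iterated) via the generated string names
  io.sum := io.in.elements.values.reduce(_ +& _)
}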

How to create JSON output from a combined group of composite classes with Play framework

I am a Scala newbie, extending someone else's code. The code uses the Play framework's JSON libraries. I am accessing objects of class Future[Option[A]] and Future[Option[List[B]]]. The classes A and B each have their own JSON writes method, so each can return JSON as a response to a web request. I'm trying to combine these into a single JSON response that I can return as an HTTP response.
I thought creating a class which composes A and B into a single class would allow me to do this, something along these lines:
case class AAndB(a: Future[Option[A]], b: Future[Option[List[B]]])

object AAndB {
  implicit val implicitAAndBWrites = Json.writes[AAndB]
}
But that fails all over the place. A and B are both structured like this:
sealed trait A extends SuperClass {
  val a1: String = "identifier"
}

case class SubA(a2: ClassA2) extends A {
  override val a1: String = "sub identifier"
}

object SubA {
  val writes = Writes[SubA] { aa =>
    Json.obj(
      "a1" -> aa.a1,
      "a2" -> aa.a2
    )
  }
}
Since B is accessed as a List, the expected output would be along these lines:
{
  "a": {
    "a1": "val1",
    "a2": "val2"
  },
  "b": [
    {
      "b1": "val 3",
      "b2": "val 4"
    },
    {
      "b1": "val 5",
      "b2": "val 6"
    },
    {
      "b1": "val 7",
      "b2": "val 8"
    }
  ]
}
Your help is appreciated.
As @cchantep mentioned in the comments on your question, having Futures as part of a case class declaration is highly unusual. Case classes are great for encapsulating immutable domain objects (i.e., ones that don't change over time), but as soon as you involve a Future[T] you potentially have multiple outcomes:
The Future hasn't completed yet
The Future failed
The Future completed successfully, and contains a T instance
You don't want to tangle up this temporal stuff with the act of converting to JSON. For this reason you should model your wrapper class with the Futures removed:
case class AAndB(a: Option[A], b: Option[List[B]])

object AAndB {
  implicit val implicitAAndBWrites = Json.writes[AAndB]
}
and instead use Scala/Play's very concise handling of them in your Controller class to access the contents of each. In the below example, assume the existence of injected service classes as follows:
class AService {
  def findA(id: Int): Future[Option[A]] = ...
}

class BListService {
  def findBs(id: Int): Future[Option[List[B]]] = ...
}
Here's what our controller method might look like:
def showCombinedJson(id: Int) = Action.async {
  val fMaybeA = aService.findA(id)
  val fMaybeBs = bService.findBs(id)

  for {
    maybeA <- fMaybeA
    maybeBs <- fMaybeBs
  } yield {
    Ok(Json.toJson(AAndB(maybeA, maybeBs)))
  }
}
So here we launch both the A- and B-queries in parallel (we have to do this outside the for-comprehension to achieve this parallelism). The yield block of the for-comprehension will be executed only if/when both the Futures complete successfully - at which point it is safe to access the contents within. Then it's a simple matter of building an instance of the wrapper class, converting to JSON and returning an Ok result to Play.
Note that the result of the yield block will itself be inside a Future (in this case it's a Future[Result]) so we use Play's Action.async Action builder to handle this - letting Play deal with all of the actual waiting-for-things-to-happen.
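One caveat worth adding (my assumption, not spelled out in the answer above): Json.writes[AAndB] only compiles if implicit Writes instances for A and B are already visible at the macro call site. A rough sketch of what needs to be in scope, with the b1/b2 field names taken from the sample output rather than from real code:

import play.api.libs.json._

object JsonFormats {
  // Illustrative only; in the real code base SubA etc. already define their
  // own Writes, which simply need to be implicit where the macro expands.
  implicit val aWrites: Writes[A] = Writes[A] { a =>
    Json.obj("a1" -> a.a1)
  }
  implicit val bWrites: Writes[B] = Writes[B] { b =>
    Json.obj("b1" -> b.b1, "b2" -> b.b2)
  }
  // Option and List wrappers are handled by Play once the element Writes exist
  implicit val aAndBWrites: Writes[AAndB] = Json.writes[AAndB]
}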

How to insert json fixture data in Play Specification tests?

I have a Scala Play 2.2.2 application and as part of my Specification tests I would like to insert some fixture data for testing preferably in json format. For the tests I use the usual in-memory H2 database. How can I accomplish this? I have searched all the documentation but there is no mention to this anywhere.
Note that I would prefer not to build my own flavor of fixture implementation via Global. There should be a non-hacky way to do this, right?
AFAIK there is no built-in support for this, à la Rails, and it's hard to imagine what the devs could do without making Play Scala much more opinionated about the way persistence should be handled (which I'd personally consider a negative).
I also use H2 for testing and employ plain SQL fixtures in a resource file and load them before tests using a couple of (fairly simple) helpers:
package object helpers {
  import java.io.File
  import java.sql.CallableStatement

  import org.specs2.execute.{Result, AsResult}
  import org.specs2.mutable.Around
  import org.specs2.specification.Scope

  import play.api.db.DB
  import play.api.test.FakeApplication
  import play.api.test.Helpers._

  /**
   * Load a file containing SQL statements into the DB.
   */
  private def loadSqlResource(resource: String)(implicit app: FakeApplication) = DB.withConnection { conn =>
    val file = new File(getClass.getClassLoader.getResource(resource).toURI)
    val path = file.getAbsolutePath
    val statement: CallableStatement = conn.prepareCall(s"RUNSCRIPT FROM '$path'")
    statement.execute()
    conn.commit()
  }

  /**
   * Run a spec after loading the given resource name as SQL fixtures.
   */
  abstract class WithSqlFixtures(val resource: String, val app: FakeApplication = FakeApplication()) extends Around with Scope {
    implicit def implicitApp = app

    override def around[T: AsResult](t: => T): Result = {
      running(app) {
        loadSqlResource(resource)
        AsResult.effectively(t)
      }
    }
  }
}
Then, in your actual spec you can do something like so:
package models

import helpers.WithSqlFixtures
import play.api.test.PlaySpecification

class MyModelSpec extends PlaySpecification {

  "My model" should {
    "locate items correctly" in new WithSqlFixtures("model-fixtures.sql") {
      MyModel.findAll().size must beGreaterThan(0)
    }
  }
}
Note: this specs2 stuff could probably be better.
Obviously if you really need JSON you'll have to add extra machinery to deserialise your models and persist them in the database (often in your app you'll be doing these things anyway, in which case that might be relatively trivial.)
You'll also need:
Some evolutions to establish your DB schema in conf/evolutions/default
The evolution plugin enabled, which will build your schema when the FakeApplication starts up
The appropriate H2 DB config
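As a rough sketch of the last two points, assuming the Play 2.2 test helpers (the object and value names here are mine, not from the original answer): inMemoryDatabase() supplies the db.default.* settings for an in-memory H2 database, and evolutions, which are enabled by default, then build the schema from conf/evolutions/default when the FakeApplication starts.

import play.api.test.FakeApplication
import play.api.test.Helpers._

object TestApps {
  // A FakeApplication backed by an in-memory H2 database
  val withH2: FakeApplication = FakeApplication(
    additionalConfiguration = inMemoryDatabase()
  )
}

// usage in a spec:
// "locate items correctly" in new WithSqlFixtures("model-fixtures.sql", TestApps.withH2) { ... }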

Accessing java.awt.Container.getComponents from scala

I'm trying to write automated tests for a GUI application written in Scala with Swing. I'm adapting the code from this article, which leverages getName() to find the right UI element in the tree.
This is a self-contained version of my code. It's incomplete, untested, and probably broken and/or non-idiomatic and/or suboptimal in various ways, but my question is regarding the compiler error listed below.
import scala.swing._
import java.awt._
import ListView._
import org.scalatest.junit.JUnitRunner
import org.junit.runner.RunWith
import org.scalatest.FunSpec
import org.scalatest.matchers.ShouldMatchers

object MyGui extends SimpleSwingApplication {
  def top = new MainFrame {
    title = "MyGui"

    val t = new Table(3, 3)
    t.peer.setName("my-table")
    contents = t
  }
}

object TestUtils {
  def getChildNamed(parent: UIElement, name: String): UIElement = {
    // Debug line
    println("Class: " + parent.peer.getClass() +
            " Name: " + parent.peer.getName())

    if (name == parent.peer.getName()) { return parent }

    if (parent.peer.isInstanceOf[java.awt.Container]) {
      val children = parent.peer.getComponents()
      /// COMPILER ERROR HERE ^^^
      for (child <- children) {
        val matching_child = getChildNamed(child, name)
        if (matching_child != null) { return matching_child }
      }
    }

    return null
  }
}

@RunWith(classOf[JUnitRunner])
class MyGuiSpec extends FunSpec {
  describe("My gui window") {
    it("should have a table") {
      TestUtils.getChildNamed(MyGui.top, "my-table")
    }
  }
}
When I compile this file I get:
29: error: value getComponents is not a member of java.awt.Component
    val children = parent.peer.getComponents()
                               ^
one error found
As far as I can tell, getComponents is in fact a member of java.awt.Component. I used the code in this answer to dump the methods on parent.peer, and I can see that getComponents is in the list.
Answers that provide another approach to the problem (automated gui testing in a similar manner) are welcome, but I'd really like to understand why I can't access getComponents.
The issue is that you're looking at two different classes. Your javadoc link points to java.awt.Container, which indeed has a getComponents method. On the other hand, the compiler is looking for getComponents on java.awt.Component (the parent class), which is what parent.peer returns, and can't find it there.
Instead of if (parent.peer.isInstanceOf[java.awt.Container]) ..., you can verify the type of parent.peer and cast it like this:
parent.peer match {
  case container: java.awt.Container =>
    // the following works because container is a Container
    val children = container.getComponents
    // ...
  case _ => // ...
}
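If it helps, here is a sketch of the whole lookup rebuilt around that match, returning Option instead of null; the inner search works directly on java.awt.Component, which also sidesteps the UIElement-vs-Component mismatch in the recursive call of the original. This is only my suggested shape, not something required by the fix above.

import scala.swing.UIElement

object TestUtilsOption {
  // Returns the first descendant (or the element itself) whose peer has the
  // requested name, or None if nothing matches.
  def findChildNamed(parent: UIElement, name: String): Option[java.awt.Component] = {
    def search(c: java.awt.Component): Option[java.awt.Component] =
      if (name == c.getName) Some(c)
      else c match {
        case container: java.awt.Container =>
          // fold over the children, keeping the first match found
          container.getComponents.foldLeft(Option.empty[java.awt.Component]) {
            (found, child) => found.orElse(search(child))
          }
        case _ => None
      }
    search(parent.peer)
  }
}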

Why is this groovy code throwing a MultipleCompilationErrorsException?

I have the following groovy code :
class FileWalker {
  private String dir

  public static void onEachFile(String dir, IAction ia) {
    new File(dir).eachFileRecurse {
      ia.perform(it)
    }
  }
}

walker = new FileWalker()
walker.onEachFile(args[0], new PrintAction())
I noticed that if I place a def in front of walker, the script works. Shouldn't this work the way it is now?
You don't need a def in groovyConsole or in a Groovy script. I consider it good programming practice to have it, but the language will work without it and will add those variables to the script's binding.
I'm not sure about the rest of your code (as it won't compile as you've posted it), but either you have a really old version of Groovy or something else is wrong with your config or the rest of your code.
With the addition of a stub for the missing IAction interface and PrintAction class, I'm able to get it to run without modification:
interface IAction {
  def perform(obj)
}

class PrintAction implements IAction {
  def perform(obj) {
    println obj
  }
}

class FileWalker {
  private String dir

  public static void onEachFile(String dir, IAction ia) {
    new File(dir).eachFileRecurse {
      ia.perform(it)
    }
  }
}

walker = new FileWalker()
walker.onEachFile(args[0], new PrintAction())
I created a dummy directory with "foo/bar" and "foo/baz" files.
If I save it to "walkFiles.groovy" and call it from the command line with
groovy walkFiles.groovy foo
It prints:
foo/bar
foo/baz
This is with the latest version of groovy:
groovy -v
Groovy Version: 1.6-RC-3 JVM: 1.5.0_16
In scripting mode (or via "groovyConsole"), you need to declare walker with "def" before using it. A Groovy script file is translated into a class derived from class Script before it gets compiled, so every declaration needs to be done properly. On the other hand, when you run a script in "groovysh" (or through an instance of class GroovyShell), its mechanism automatically binds every referenced variable without requiring a declaration.
Update: my answer above is wrong. I decompiled a Groovy-generated .class file and found that it uses a binding object inside the script as well, so my first paragraph was indeed incorrect.