How to import getVerilog() function from the bootcamp examples? - chisel

I am not sure I understand how to use the getVerilog function from:
https://github.com/freechipsproject/chisel-bootcamp/blob/master/2.1_first_module.ipynb
[error] passthrough_test.scala:18:11: not found: value getVerilog
[error] println(getVerilog(new PassThrough(10)))
[error] ^
[error] one error found
[error] (Test / compileIncremental) Compilation failed
[error] Total time: 1 s, completed Nov 21, 2018 1:53:02 PM
I did import chisel3._ but that does not seem to be enough.

The getVerilog method is defined only in the Bootcamp. For Chisel versions before 3.2.0 there's an equivalent method called Driver.emitVerilog; from 3.2.0 onwards, the correct method is (new chisel3.stage.ChiselStage).emitVerilog:
import chisel3._
import chisel3.stage.ChiselStage

class Foo extends Module {
  val io = IO(new Bundle {})
  printf("I'm a foo")
}

/* For Chisel versions <3.2.0 use the following: */
Driver.emitVerilog(new Foo)

/* For Chisel >=3.2.0 use the following: */
(new ChiselStage).emitVerilog(new Foo)
As an additional reference, the specific implementation of getVerilog in the Chisel Bootcamp is here. It looks almost identical to what Driver.emitVerilog is doing here. A similar, but slightly different, question about generating Verilog was brought up in another SO question.
Usually, if you're wondering whether some API exists, it can be useful to search through the Chisel3 API reference. E.g., you can consult the documentation for ChiselStage for methods related to compiling Chisel to FIRRTL IR or Verilog: ChiselStage API.
For more control over Verilog generation, you'll want to use the ChiselStage class' method execute(args: Array[String], annotations: AnnotationSeq) if using Chisel >=3.2.0. Earlier versions of Chisel should use the Driver object's method Driver.execute(args: Array[String], dut: () => RawModule).
Note: ChiselStage.emitVerilog does allow you to pass command line arguments and annotations to the Chisel compiler for more control over the generated FIRRTL IR or Verilog.
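For example, a minimal sketch (the Foo module and the --target-dir argument are illustrative assumptions, not taken from the original question):

import chisel3._
import chisel3.stage.ChiselStage

// Hypothetical module, used only for illustration.
class Foo extends Module {
  val io = IO(new Bundle {})
}

object Main extends App {
  // emitVerilog also accepts command line arguments;
  // here --target-dir selects where generated files are written.
  val verilog = (new ChiselStage).emitVerilog(
    new Foo,
    Array("--target-dir", "generated")
  )
  println(verilog)
}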

Related

Cython: calling C function throws 'undefined symbol'

I am attempting to use the LMDB C API with Cython.
I want to import the following definitions from the header file:
typedef struct MDB_env MDB_env;
int mdb_env_create(MDB_env **env);
So I created a .pxd file:
cdef extern from 'lmdb.h':
    struct MDB_env:
        pass
    int mdb_env_create(MDB_env **env)
And I am using it in a Cython script:
cdef MDB_env *e
x = mdb_env_create(&e)
This code compiles fine, but if I run it, I get:
ImportError: /home/me/.cache/ipython/cython/_cython_magic_15705c11c6f56670efe6282cbabe4abc.cpython-36m-x86_64-linux-gnu.so: undefined symbol: mdb_env_create
This happens both in a Cython .pyx + .pxd setup and in a prototype typed in IPython.
If I import another symbol, say a constant, I can access it. So I seem to be looking at the right header file.
I don't see any discrepancy between my syntax and the documentation, but I am clearly doing something wrong. Can somebody give me a hint?
Thanks.
To compile it with IPython's %%cython magic (it would be nice if you mentioned this explicitly in your question) you have to provide the library path (via the -L option) and the library name (via the -l option) of the built C library you want to wrap; see also the documentation:
%%cython -L=<path to your library> -l=<your_library>
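For example, with the placeholders filled in (a sketch; the -L path is an assumption, use wherever liblmdb actually lives on your system):

%%cython -L=/usr/local/lib -l=lmdb
# Assumed install location of liblmdb; adjust -L to your system.
cdef extern from 'lmdb.h':
    struct MDB_env:
        pass
    int mdb_env_create(MDB_env **env)

cdef MDB_env *e
print(mdb_env_create(&e))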
The library you are trying to wrap is not a header-only library. That means that some symbols (e.g. mdb_env_create) are only declared, but not defined, in the header. When you build the library, the definitions of those symbols end up in the resulting artifact, which should be provided to the linker when your extension is built. These definitions are what is needed when the program runs.
If you don't do this, the following happens on Linux: when the extension (the *.so file) is built, the linker allows undefined symbols by default - so this step is "successful" - but the failure is only postponed. When the extension is loaded via import, Python loads the corresponding *.so with the help of dlopen, and in this step the loader checks that the definitions of all symbols are known. But we didn't provide a definition of mdb_env_create, so the loader fails with
undefined symbol: mdb_env_create
It is different for symbols that are defined in the header file itself, for example the enums MDB_FIRST & Co: for those the compiled library isn't necessary, and the extension can be loaded, as there are no undefined symbols.
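For a regular .pyx + .pxd build outside of IPython, the same information goes into the Extension definition in setup.py; a minimal sketch (the module name and install paths are assumptions):

# setup.py -- assumes the Cython source is named my_lmdb.pyx and that
# lmdb.h / liblmdb are installed under the paths given below.
from setuptools import Extension, setup
from Cython.Build import cythonize

extensions = [
    Extension(
        "my_lmdb",
        ["my_lmdb.pyx"],
        include_dirs=["/usr/local/include"],  # where lmdb.h lives
        library_dirs=["/usr/local/lib"],      # where liblmdb.so lives
        libraries=["lmdb"],                   # link against liblmdb
    )
]

setup(ext_modules=cythonize(extensions))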

Cython: extending Python standard library classes

I am very new to Cython and have very little experience with C, but I am testing Cython by refactoring an existing module I wrote.
I am not sure how I should extend a Python standard library class. This:
from contextlib import ContextDecorator

cdef class MyCtxManager(ContextDecorator):
    # override methods
gives me an error:
'ContextDecorator' is not a type name
How do I extend ContextDecorator into a Cython class?
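A cdef class can only inherit from another cdef class or from certain built-in types, not from an arbitrary Python class such as ContextDecorator, which is what the error is pointing at. A sketch of one workaround (not from the original thread), using a regular Python class inside the .pyx file:

# A plain (non-cdef) Python class in a .pyx file can subclass
# ContextDecorator as usual; only cdef classes are restricted.
from contextlib import ContextDecorator

class MyCtxManager(ContextDecorator):
    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False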

Export objects and classes before their declaration makes them undefined

I am trying to repeat an example from Mozilla Hacks (the "Export lists" subtitle):
//export.js
export {detectCats, Kittydar};
function detectCats() {}
class Kittydar {}
//import.js
import {detectCats, Kittydar} from "./export.js";
console.log(detectCats); // function detectCats() {}
console.log(Kittydar); // undefined
Oops: Kittydar is undefined (btw, the problem is the same with a simple Object).
But if I put the export after the Kittydar declaration, it's OK:
//export.js
class Kittydar {}
export {Kittydar};
//import.js
import {Kittydar} from "./export.js";
console.log(Kittydar); // function Kittydar() {_classCallCheck(this, Kittydar);}
Is this a typo in the article?
I transpile this with babel and bundle with browserify. Then I include output bundle in a usual .html file (with <script> tag).
The standard is hard to follow on this, but the article is correct. This code works in es6draft and in the SpiderMonkey shell: both the function and the class are initialized by the time those console.log calls run.
Here's how it's supposed to work, in minute detail:
1. The JS engine parses import.js. It sees the import declaration, so it loads export.js and parses it as well.
2. Before actually running any of the code, the system creates both module scopes and populates them with all the top-level bindings that are declared in each module. (The spec calls this ModuleDeclarationInstantiation.) A Kittydar binding is created in export.js, but it's uninitialized for now.
3. In import.js, a Kittydar import binding is created. It's an alias for the Kittydar binding in export.js.
4. export.js runs. The class is created. Kittydar is initialized.
5. import.js runs. Both console.log() calls work fine.
Babel's implementation of ES6 modules is nonstandard.
I think it's deliberate. Babel aims to compile ES6 modules into ES5 code that works with an existing module system of your choice: you can have it spit out AMD modules, UMD, CommonJS, etc. So if you ask for AMD output, your code might be ES6 modules, but the ES5 output is an AMD module and it's going to behave like an AMD module.
Babel could probably be more standard-compliant while still integrating nicely with the various module systems, but there are tradeoffs.

Why does sbt give "object swing is not a member of package scala" for import scala.swing?

Sbt version: 0.13.8
Scala version: 2.11.2
When compiling my scala swing application with scalac, it simply compiles.
However, when compiling the same files with sbt, it produces the following error:
[error] my/file/path.scala:1: object swing is not a member of package scala
[error] import scala.swing._
I added the scala version to my build.sbt. I even configured scalaHome (which I believe should never be in a build.sbt).
The lines in build.sbt:
scalaVersion := "2.11.2"
scalaHome := Some(file("/my/scala/location/opt/scala-2.11.2/"))
The /my/scala/location/opt/scala-2.11.2/lib directory contains the scala swing lib, scala-swing_2.11-1.0.1.jar; this is also why scalac simply compiles.
Some might say I should add swing to my libraryDependencies in the build.sbt; however, it shouldn't be needed, since it is part of the core library and scalaHome is configured.
How to get sbt to notice the swing core library in a natural manner?
Bonus question:
How to configure scalaHome outside of build.sbt (without hacking the sbt jar itself) or better, have it notice the SCALA_HOME environment variable?
As of 2.11, the scala swing package is no longer listed among Scala's standard library API and in fact is described in its own README as "mostly unsupported".
I think you should expect to have to include it as a dependency.
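A minimal sketch of the dependency line (the version shown matches the jar sitting in your scalaHome lib directory; a newer release may exist):

// build.sbt: add scala-swing as an ordinary library dependency
libraryDependencies += "org.scala-lang.modules" %% "scala-swing" % "1.0.1"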
See also What's wrong with my scala.swing?

Receiving "bad symbolic reference to org.json4s.JsonAST..." in Eclipse Scala-IDE using 2.11

I am attempting to use the Eclipse Scala-IDE for 2.11 (downloaded the prepackaged bundle from the website). I have been using the Scala Worksheet to work with a SaaS API returning JSON. I've been pushing through just using String methods. I decided to begin using json4s. I went to http://mvnrepository.com/ and obtained the following libraries:
json4s-core-2.11-3.2.10.jar
json4s-native-2.11-3.2.10.jar
paranamer-2.6.jar
I have added all three jars to the Project's Build Path. And they appear under the project's "Referenced Libraries".
I have the following code in a Scala Worksheet:
package org.public_domain

import org.json4s._
import org.json4s.native.JsonMethods._

object WorkSheet6 {
  println("Welcome to the Scala worksheet")

  parse(""" { "numbers" : [1, 2, 3, 4] } """)

  println("Bye")
}
I am receiving the following two compilation errors:
bad symbolic reference to org.json4s.JsonAST.JValue encountered in class file 'package.class'. Cannot access type JValue in value org.json4s.JsonAST. The current classpath may be missing a definition for org.json4s.JsonAST.JValue, or package.class may have been compiled against a version that's incompatible with the one found on the current classpath.
bad symbolic reference to org.json4s.JsonAST.JValue encountered in class file 'package.class'. Cannot access type JValue in value org.json4s.JsonAST. The current classpath may be missing a definition for org.json4s.JsonAST.JValue, or package.class may have been compiled against a version that's incompatible with the one found on the current classpath.
When I go look in the org.json4s package in the json4s-core-2.11-3.2.10.jar file, there is in fact no .class file indicating any sort of compiled object JsonAST.
This is a showstopper. Any help on this would be greatly appreciated.
Your classpath is incomplete. You are missing a dependency of json4s-core.
bad symbolic reference to org.json4s.JsonAST.JValue encountered in class file 'package.class'. Cannot access type JValue in value org.json4s.JsonAST. The current classpath may be missing a definition for org.json4s.JsonAST.JValue, or package.class may have been compiled against a version that's incompatible with the one found on the current classpath.
The simplest way to consume Scala or Java libraries is to use sbt or maven. They bring in (transitive) dependencies for you. If you check the pom definition of the json4s-core library, you'll notice it depends on json4s-ast. You should add that jar to your build path as well.
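If you do switch to sbt, a single line brings in the whole dependency tree; a minimal sketch (the version matches the jars listed in the question):

// build.sbt: json4s-core, json4s-ast and paranamer are pulled in transitively
libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.10"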