Chisel3: Vec indexWhere expected Bool, actual Any - chisel

In Chisel, I have a Vec of Bools coming into a module. I would like to know the index of the first false element.
To obtain this, I tried the following:
val faultIndex = Wire(UInt)
faultIndex := comparison.indexWhere(x:Bool => x === false.B)
When I put this in, an error was highlighted:
Unspecified value parameters: from: Int
Type mismatch, expected Bool => Bool, actual: Bool => Any
Type mismatch, expected Bool => Boolean, actual: Bool => Any
Cannot resolve symbol x
Cannot resolve symbol x
What is the proper way to use this function?

There are 2 minor syntax issues here:
val faultIndex = Wire(UInt())
Note the () after UInt. You can think of this as constructing a fresh type instance rather than referring to the UInt companion object itself.
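For example, both of these construct a type instance for the wire (the 4-bit width here is purely illustrative; with no width argument, the width is inferred):
val faultIndex = Wire(UInt())      // width will be inferred
val fixedIndex = Wire(UInt(4.W))   // explicit 4-bit width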
There are a few ways to express the indexWhere:
faultIndex := comparison.indexWhere((x: Bool) => x === false.B) // Note parentheses
// or
faultIndex := comparison.indexWhere(x => x === false.B) // Type is inferred
// or
faultIndex := comparison.indexWhere(_ === false.B) // underscore shorthand
// alternatively
faultIndex := comparison.indexWhere(x => !x) // !x is equivalent to x === false.B
// or
faultIndex := comparison.indexWhere(!_) // More underscore shorthand
Executable example: https://scastie.scala-lang.org/uHCX5wxgSzu6wXqa9OJdRA
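For reference, a minimal self-contained module along the same lines (the module name, port names, and the vector width of 8 are illustrative assumptions, not taken from the original question):
import chisel3._

class FaultFinder extends Module {
  val io = IO(new Bundle {
    val comparison = Input(Vec(8, Bool()))
    val faultIndex = Output(UInt(3.W))
  })
  // indexWhere returns the index (as a UInt) of the first element matching the predicate
  io.faultIndex := io.comparison.indexWhere(x => !x)
}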

Related

Scala: parse JSON which has a List of Maps

I have some JSON; when I parse it, it returns the structure Some(List(Map...)).
How do I match on this and get the values?
Below is the code I tried; I need to get all the map values:
import scala.util.parsing.json._
val result = JSON.parseFull("[{\"start\":\"starting\",\"test\":123,\"test2\":324,\"end\":\"ending\"}]")
result match {
  case Some(map: Map[String, Any]) => println(map)
  case None => println("Parsing failed")
  case other => println("Unknown data structure: " + other)
}
but it prints the non-matching case:
Unknown data structure: Some(List(Map(start -> starting, test -> 123, test2 -> 324, end -> ending)))
Because of type erasure, you cannot pattern match on generic types. List, Map, and Option are generic containers, and at runtime their type parameters are erased; e.g. for a List[String], the String is erased and the runtime type is just List[_].
case Some(map: List[Map[String, Any]]) => println(map)
With the case above, if result were val result: Option[Any] = Some(List(12)), i.e. the element 12 has type Int rather than Map[String, Any], the compiler would still match result against that case, even though it expects Map[String, Any] as the element type of the List.
So, what's going on?
It's all because of type erasure. The compiler erases the type parameters, so no type information about them is available at runtime unless you use reflection. That means:
case Some(map: List[Map[String, Any]]) => println(map) is essentially case Some(map: List[_]) => println(map), and therefore the match will succeed for any type parameter of List, e.g. List[Map[String, Any]], List[Map[String, Int]], List[String], List[Int], etc.
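For example, this match "succeeds" even though the element type is completely wrong (the compiler also emits an unchecked warning for the pattern); the values here are made up purely for illustration:
val xs: Option[Any] = Some(List(1, 2, 3)) // really a List[Int]
xs match {
  case Some(map: List[Map[String, Any]]) => println("matched anyway: " + map)
  case _ => println("no match")
}
// prints: matched anyway: List(1, 2, 3)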
Therefore, if you need to match on such a generic container, you have to check each container and its nested elements explicitly.
def resolve(result: Any): Unit = result match {
  case Some(res) if res.isInstanceOf[List[_]] && res.asInstanceOf[List[_]].isEmpty =>
    println("Empty List.") // Some(List())
  case Some(res) if res.isInstanceOf[List[_]] &&
    !res.asInstanceOf[List[_]].exists(p => p.isInstanceOf[Map[_, _]] && p.asInstanceOf[Map[_, _]].nonEmpty) =>
    println("List is not empty but each item of List is empty Map.") // Some(List(Map(), Map()))
  case Some(res) if res.isInstanceOf[List[_]] &&
    res.asInstanceOf[List[_]].filter(p => p.isInstanceOf[Map[_, _]] && p.asInstanceOf[Map[_, _]].nonEmpty).map(_.asInstanceOf[Map[_, _]]).exists { p =>
      p.head match {
        case e if e._1.isInstanceOf[String] && e._2.isInstanceOf[Any] => true
        case _ => false
      }
    } =>
    println("Correct data.") // Some(List(Map("key1" -> 1), Map("key2" -> 2)))
  case None => println("none")
  case other => println("other")
}
val a: Option[Any] = Some(List())
val b: Option[Any] = Some(List(Map(), Map()))
val c: Option[Any] = Some(List(Map("key1"-> 1), Map("key2" -> 2)))
val d: Option[Any] = None
val e: Option[Any] = Some("apple")
resolve(a) // match first case
resolve(b) // match second case
resolve(c) // match third case
resolve(d) // match fourth case
resolve(e) // match fifth case
As has been pointed out, the return type is actually Option[List[Map[String, Any]]], so you need to unpick this. However, you cannot do this with a single match because of type erasure, so you need nested matches to ensure that you have the correct type. This is really tedious, so I thoroughly recommend using something like the Extraction.extract function in json4s, which will attempt to match your JSON to a specific Scala type:
import org.json4s._

type ResultType = List[Map[String, Any]]

def extract(json: JValue)(implicit formats: Formats, mf: Manifest[ResultType]): ResultType =
  Extraction.extract[ResultType](json)
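For instance, a usage sketch with the json4s native parser, exercising the helper above (the parse import and DefaultFormats are assumptions about your particular setup):
import org.json4s.native.JsonMethods.parse

implicit val formats: Formats = DefaultFormats

val json = parse("""[{"start":"starting","test":123,"test2":324,"end":"ending"}]""")
val maps: ResultType = extract(json) // the helper defined above
maps.foreach(println)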
If you must do it by hand, it looks something like this:
result match {
  case Some(l: List[_]) =>
    l.headOption match {
      case Some(m) =>
        m match {
          case m: Map[_, _] =>
            m.headOption match {
              case Some(p) =>
                p match {
                  case (_: String, _) =>
                    m.foreach(println(_))
                  case _ => println("Map key was not String")
                }
              case _ => println("Map was empty")
            }
          case _ => println("List did not contain a Map")
        }
      case _ => println("Result List was empty")
    }
  case _ => println("Parsing failed")
}
Your output is Option[List[Map[String, Any]]], not Option[Map[String, Any]]. Match on the List and you'll be fine:
import scala.util.parsing.json._
val result = JSON.parseFull("[{\"start\":\"starting\",\"test\":123,\"test2\":324,\"end\":\"ending\"}]")
val l: List[Map[String, Any]] = result match {
  case Some(list: List[Map[String, Any]]) => list
  case _ => throw new Exception("I shouldn't be here") // whatever for a non-match
}
Then you can map (if you want a non-Unit return type) or foreach (if a Unit return type is fine) over that List and do whatever you want with it:
l.foreach(println)
l.map(_.toString) // or whatever you want to do with the Map

Optional parameters with defaults in Go struct constructors

I've found myself using the following pattern as a way to get optional parameters with defaults in Go struct constructors:
package main

import (
    "fmt"
)

type Object struct {
    Type int
    Name string
}

func NewObject(obj *Object) *Object {
    if obj == nil {
        obj = &Object{}
    }
    // Type has a default of 1
    if obj.Type == 0 {
        obj.Type = 1
    }
    return obj
}

func main() {
    // create object with Name="foo" and Type=1
    obj1 := NewObject(&Object{Name: "foo"})
    fmt.Println(obj1)

    // create object with Name="" and Type=1
    obj2 := NewObject(nil)
    fmt.Println(obj2)

    // create object with Name="foo" and Type=2
    obj3 := NewObject(&Object{Type: 2, Name: "foo"})
    fmt.Println(obj3)
}
Is there a better way of allowing for optional parameters with defaults?
Dave Cheney offered a nice solution to this, where you use functional options to override defaults:
https://dave.cheney.net/2014/10/17/functional-options-for-friendly-apis
So your code would become:
package main

import (
    "fmt"
)

type Object struct {
    Type int
    Name string
}

func NewObject(options ...func(*Object)) *Object {
    // Setup object with defaults
    obj := &Object{Type: 1}
    // Apply options if there are any
    for _, option := range options {
        option(obj)
    }
    return obj
}

func WithName(name string) func(*Object) {
    return func(obj *Object) {
        obj.Name = name
    }
}

func WithType(newType int) func(*Object) {
    return func(obj *Object) {
        obj.Type = newType
    }
}

func main() {
    // create object with Name="foo" and Type=1
    obj1 := NewObject(WithName("foo"))
    fmt.Println(obj1)

    // create object with Name="" and Type=1
    obj2 := NewObject()
    fmt.Println(obj2)

    // create object with Name="foo" and Type=2
    obj3 := NewObject(WithType(2), WithName("foo"))
    fmt.Println(obj3)
}
https://play.golang.org/p/pGi90d1eI52
The approach seems reasonable to me. However, you have a bug. If I explicitly set Type to 0, it will get switched to 1.
My suggested fix: Use a struct literal for the default value: http://play.golang.org/p/KDNUauy6Ie
Or perhaps extract it out: http://play.golang.org/p/QpY2Ymze3b
Take a look at "Allocation with new" in Effective Go. They explain about making zero-value structs a useful default.
If you can make Object.Type (and your other fields) have a default of zero, then Go struct literals already give you exactly the feature you're requesting.
From the section on composite literals:
The fields of a composite literal are laid out in order and must all be present. However, by labeling the elements explicitly as field:value pairs, the initializers can appear in any order, with the missing ones left as their respective zero values.
That means you can replace this:
obj1 := NewObject(&Object{Name: "foo"})
obj2 := NewObject(nil)
obj3 := NewObject(&Object{Type: 2, Name: "foo"})
with this:
obj1 := &Object{Name: "foo"}
obj2 := &Object{}
obj3 := &Object{Type: 2, Name: "foo"}
If it is not possible to make the zero value the default for all of your fields, the recommended approach is a constructor function. For example:
func NewObject(typ int, name string) *Object {
    return &Object{Type: typ, Name: name}
}
If you want Type to have a nonzero default, you can add another constructor function. Suppose Foo objects are the default and have Type 1.
func NewFooObject(name string) *Object {
    return &Object{Type: 1, Name: name}
}
You only need to make one constructor function for each set of nonzero defaults you use. You can always reduce that set by changing the semantics of some fields to have zero defaults.
Also, note that adding a new field to Object with a zero default value doesn't require any code changes above, because all struct literals use labeled initialization. That comes in handy down the line.
https://play.golang.org/p/SABkY9dbCOD
Here's an alternative that uses a method on the object to set defaults. I've found it useful a few times, although it's not much different from what you have. This might allow better usage if it's part of a package. I don't claim to be a Go expert; maybe you'll have some extra input.
package main

import (
    "fmt"
)

type defaultObj struct {
    Name      string
    Zipcode   int
    Longitude float64
}

func (obj *defaultObj) populateObjDefaults() {
    if obj.Name == "" {
        obj.Name = "Named Default"
    }
    if obj.Zipcode == 0 {
        obj.Zipcode = 12345
    }
    if obj.Longitude == 0 {
        obj.Longitude = 987654321
    }
}

func main() {
    testdef := defaultObj{Name: "Mr. Fred"}
    testdef.populateObjDefaults()
    fmt.Println(testdef)

    testdef2 := defaultObj{Zipcode: 90210}
    testdef2.populateObjDefaults()
    fmt.Println(testdef2)

    testdef2.Name = "Mrs. Fred"
    fmt.Println(testdef2)

    testdef3 := defaultObj{}
    fmt.Println(testdef3)
    testdef3.populateObjDefaults()
    fmt.Println(testdef3)
}
Output:
{Mr. Fred 12345 9.87654321e+08}
{Named Default 90210 9.87654321e+08}
{Mrs. Fred 90210 9.87654321e+08}
{ 0 0}
{Named Default 12345 9.87654321e+08}
You could use the ... operator.
Instead of writing ToCall(a=b) as in Python, you write ToCall("a", b).
See the Go Play Example
// GetKwds turns a flat list of key, value, key, value, ... arguments into a map.
func GetKwds(kwds []interface{}) map[string]interface{} {
    result := make(map[string]interface{})
    for i := 0; i < len(kwds); i += 2 {
        result[kwds[i].(string)] = kwds[i+1]
    }
    return result
}

func ToCall(kwds ...interface{}) {
    args := GetKwds(kwds)
    if value, ok := args["key"]; ok {
        fmt.Printf("key: %#v\n", value)
    }
    if value, ok := args["other"]; ok {
        fmt.Printf("other: %#v\n", value)
    }
}

func main() {
    ToCall()
    ToCall("other", &map[string]string{})
    ToCall("key", "Test", "other", &Object{})
}

Scala implicit function parameterized

Why would this code not take the implicit functions defined in the local scope?
Where else does it take the implicit functions from?
def implctest[T](a: T)(implicit b: T => T): T = { b apply a }

class Testimplcl(val a: Int) {
  override def toString() = "value of the 'a' is = " + a
}

implicit def dble(x: Int): Int = { x + x }
implicit def stringer(x: String): String = { x + " no not a pity" }
implicit def myclass(x: Testimplcl): Testimplcl = new Testimplcl(x.a + 1)

implctest[String]("oh what a pity")
implctest[Int](5)
implctest[Testimplcl](new Testimplcl(4))
None of my implicit defs in local scope are picked up. For example, implctest[Int](5) gives 5, whereas I expect it to return 10 by using dble as the implicit. There is no error either; implctest simply returns the argument passed in.
When you ask for a function A => A, Scala provides an implicit lift from a method definition, such as
implicit def dble(x: Int):Int = x + x
That is, it will treat that as a function dble _. So in the implicit resolution, this is not an immediately available value.
The problem you have is that there is an implicit A => A for any type, defined as Predef.conforms:
def conforms[A]: <:<[A, A] // where <:< is a subclass of A => A
This is useful and necessary because whenever you want a view from A => B and A happens to be B, such a "conversion" is automatically available.
See, with a direct function:
implicit val dble = (x: Int) => x + x
You see the conflict:
implicitly[Int => Int] // look for an implicit conversion of that type
<console>:49: error: ambiguous implicit values:
both method conforms in object Predef of type [A]=> <:<[A,A]
and value dble of type => Int => Int
match expected type Int => Int
implicitly[Int => Int]
^
So, in short, it's not a good idea to ask for a custom A => A. If you really need such a thing, use a custom type class, such as Foo[A] extends (A => A).
If you rewrite your implicits like this:
implicit val dble = (x: Int) => x + x
implicit val stringer = (x: String) => x + " no not a pity"
implicit val myclass = (x: Testimplcl) => new Testimplcl(x.a +1)
then you will immediately see the reason for this behavior. Now you have a problem with ambiguous implicit values:
scala: ambiguous implicit values:
both method conforms in object Predef of type [A]=> <:<[A,A]
and value stringer in object ReflectionTest of type => String => String
match expected type String => String
println(implctest[String]("oh what a pity"))
^
This basically tells you that Predef already defines an implicit function T => T, so it conflicts with your definitions.
I recommend not using such general types as plain functions for implicit parameters. Just create your own type for this, as in this example:
trait MyTransformer[T] extends (T => T)

object MyTransformer {
  def apply[T](fn: T => T) = new MyTransformer[T] {
    def apply(v: T) = fn(v)
  }
}

def implctest[T: MyTransformer](a: T): T =
  implicitly[MyTransformer[T]] apply a

class Testimplcl(val a: Int) {
  override def toString() = "value of the 'a' is = " + a
}

implicit val dble = MyTransformer((x: Int) => x + x)
implicit val stringer = MyTransformer((x: String) => x + " no not a pity")
implicit val myclass = MyTransformer((x: Testimplcl) => new Testimplcl(x.a + 1))
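With those implicits in scope, the original calls now pick up the intended transformers (the expected results in the comments follow from the definitions above):
implctest(5)                  // 10
implctest("oh what a pity")   // "oh what a pity no not a pity"
implctest(new Testimplcl(4))  // value of the 'a' is = 5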

Scala define function standard

The following are equivalent:
scala> val f1 = {i: Int => i == 1}
f1: Int => Boolean = <function1>
scala> val f2 = (i: Int) => i == 1
f2: Int => Boolean = <function1>
I am more familiar with the former (coming from Groovy), but the latter form is much more common and, AFAIK, the standard way to define a function in Scala.
Should I forget the past (Groovy) and adopt the second form? The first form feels more natural to me, as it looks similar to the Groovy/Ruby/JavaScript way of defining closures (functions).
EDIT
See Zeiger's answer in this thread for an example where the Groovy/Ruby/JavaScript closure { => } syntax seems more natural than ( ) =>. I assume both can be used interchangeably, with the same performance, ability to pass around, etc., and that the only difference is syntax.
I think this is a matter of taste (the Scala style guide recommends the first one). The former allows you to write multiline functions (several lines in the body):
val f1 = { i: Int =>
  val j = i / 2
  j == 1
}
Sometimes this is useful.
Actually, both versions are simplified forms of the "full" version.
Full version: multiple parameters, multiple statements.
scala> val f0 = { (x: Int, y: Int) => val rest = x % y; x / y + (if (rest > 0) 1 else 0) }
f0: (Int, Int) => Int = <function2>
The "groovy" version: one parameter, multiple statements.
scala> val f1 = { x: Int => val square = x * x; square + x }
f1: Int => Int = <function1>
The "scala" version: multiple parameters, one statement.
scala> val f2 = (x: Int, y: Int) => x * y
f2: (Int, Int) => Int = <function2>
A version with a single parameter and a single statement, with neither parentheses nor braces around it, does not exist, because it is not syntactically valid (i.e., the grammar for that doesn't quite work out).
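To make that concrete, a quick sketch (reading "does not exist" as referring to a bare, un-bracketed parameter is my interpretation):
// val f3 = x: Int => x * x    // does not compile: a bare typed parameter needs parens or braces
val f3a = (x: Int) => x * x    // fine: parentheses around the parameter
val f3b = { x: Int => x * x }  // fine: braces around the whole literal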

Types and functions

Consider the following:
type T () =
    member x.y = 4

let a =
    let fn () (k: T) = ()
    fn ()

let b =
    let fn () (k: System.IO.Directory) = ()
    fn ()
a fails while b is ok. The error message is:
The value 'a' has been inferred to have generic type
    val a : ('_a -> unit) when '_a :> T
Either make the arguments to 'a' explicit or, if you do not intend for it to be generic, add a type annotation.
Why, and how can I fix that?
The error message itself tells you exactly what you need to do - add a type annotation:
let a : T -> unit =
    let fn () (k: T) = ()
    fn ()
The reason that you see the error in the first place is that the compiler tries to generalize the definition of a (see this part of the spec), which results in the odd signature that you see in the error message.
The reason that you don't need to do this for b is that System.IO.Directory is sealed, so there is no need to generalize.
You are facing a value restriction, because a looks like a constant but it returns a function.
Have a look at this question:
Understanding F# Value Restriction Errors
One easy way to solve it is to add a parameter to the definition of a:
let a x =
    let fn () (k: T) = ()
    fn () x
I don't know why it works with some types, as is the case with b.
If T were a record instead of a class, it would work. But for some reason, you have to spell it out for the compiler if T is a class:
type T () =
    member x.y = 4

let a<'U when 'U :> T> =
    let fn () (k: 'U) = ()
    fn ()

let test0 = a<T> (T()) // You can be explicit about T,
let test1 = a (T())    // but you don't have to be.
edit: So I played a bit more with this, and weirdly, the compiler seems to be content with just any type restriction:
type T () =
    member x.y = 4

type S () =
    member x.z = 4.5

let a<'U when 'U :> S> =
    let fn () (k: T) = ()
    fn ()

let test = a (T())    // Is OK
let test = a<T> (T()) // Error: The type 'T' is not compatible with the type 'S'
The type S has nothing to do with anything in the code above; still, the compiler is happy as long as there is a restriction of some kind.