How to pass an operator as a parameter - Chisel

I'm trying to pass an operator to a module so that the module can be built generically. I pass a two-input operator as a parameter and then use it in a reduction operation. If I replace the passed parameter with a concrete operator, this works fine.
What's the correct way to pass a Chisel/UInt/Data operator as a module parameter?
val io = IO(new Bundle {
  val a = Vec(n, Flipped(Decoupled(UInt(width.W))))
  val z = Decoupled(UInt(width.W))
})
val a_int = for (i <- 0 until n) yield DCInput(io.a(i))
val z_int = Wire(Decoupled(UInt(width.W)))
val all_valid = a_int.map(_.valid).reduce(_ & _)
z_int.bits := a_int.map(_.bits).reduce(_ op _) // "op" is the operator parameter I want to pass in
...

Here's a fancy Scala way of doing it
import chisel3._
import chisel3.tester._
import chiseltest.ChiselScalatestTester
import org.scalatest.{FreeSpec, Matchers}

class ChiselFuncParam(mathFunc: UInt => UInt => UInt) extends Module {
  val io = IO(new Bundle {
    val a = Input(UInt(8.W))
    val b = Input(UInt(8.W))
    val out = Output(UInt(8.W))
  })
  io.out := mathFunc(io.a)(io.b)
}
class CFPTest extends FreeSpec with ChiselScalatestTester with Matchers {
  def add(a: UInt)(b: UInt): UInt = a + b
  def sub(a: UInt)(b: UInt): UInt = a - b

  "add works" in {
    test(new ChiselFuncParam(add)) { c =>
      c.io.a.poke(9.U)
      c.io.b.poke(5.U)
      c.io.out.expect(14.U)
    }
  }
  "sub works" in {
    test(new ChiselFuncParam(sub)) { c =>
      c.io.a.poke(9.U)
      c.io.b.poke(2.U)
      c.io.out.expect(7.U)
    }
  }
}
Although it might be clearer to just pass in a string form of the operator and then use a simple Scala match to control the code generation. Something like:
class MathOp(code: String) extends Module {
  val io = IO(new Bundle {
    val a = Input(UInt(8.W))
    val b = Input(UInt(8.W))
    val out = Output(UInt(8.W))
  })
  io.out := (code match {
    case "+" => io.a + io.b
    case "-" => io.a - io.b
    // ...
  })
}
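For completeness, a usage sketch (using the same chisel3.Driver API that appears later in this thread):
object MathOpMain extends App {
  // Elaborate one instance per operator string; each emits different Verilog.
  println(chisel3.Driver.emitVerilog(new MathOp("+")))
  println(chisel3.Driver.emitVerilog(new MathOp("-")))
}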

Chick has already provided a good answer, but I want to provide another example to illustrate and explain some of the really powerful features of Chisel and Scala for hardware design. I know you (Guy) probably know most of this but I wanted to provide a detailed answer for anyone else coming across this question.
I'll start with the complete example and then highlight some of the features being used.
class MyModule[T <: Data](n: Int, gen: T)(op: (T, T) => T) extends Module {
  require(n > 0, "reduce only works on non-empty Vecs")
  val io = IO(new Bundle {
    val in = Input(Vec(n, gen))
    val out = Output(gen)
  })
  io.out := io.in.reduce(op)
}
[T <: Data] This is called a Type Parameter (T) with an Upper Type Bound (<: Data). It makes the Module generic in the hardware type with which we parameterize it. We give T an upper bound of Data (a type from Chisel) to tell Scala that this is a hardware type that Chisel can use to generate hardware. The upper bound means T must be a subtype of Data, which includes all of the Chisel hardware types (e.g. UInt, SInt, Vec, Bundle, and user classes that extend Bundle). This is exactly the same way that Chisel constructors like Reg(...) are parameterized.
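As a quick aside (my sketch, not part of the original answer), you can parameterize your own helper generators the same way:
// A generic one-cycle delay: T is inferred from the argument's Chisel type
def delay[T <: Data](x: T): T = RegNext(x)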
You will notice that there are multiple parameter lists: (n: Int, gen: T) and (op: (T, T) => T). The first parameter, n: Int, is a simple integer. The second, gen: T, is our generic type T, and thus a subtype of Data; it serves as a template for the hardware we will generate inside the Module.
The second parameter list, (op: (T, T) => T), contains a function. Scala is a functional programming language: functions are values, and thus can be passed as arguments just like our Int parameter. (T, T) => T reads as "a function of two arguments, both of type T, that returns a T". Remember that T is our hardware type, a subtype of Data. Because op is in a second parameter list, Scala infers T from gen first and then uses the same T for op. For example, if gen is UInt(8.W), Scala infers T as UInt, which constrains op to be a function of type (UInt, UInt) => UInt. Bitwise AND is such a function, so we can pass an anonymous function that ANDs two UInts: (_ & _).
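To see why the separate parameter list matters, compare with a hypothetical one-list variant (MyModuleOneList is my name for it, not from the original):
class MyModuleOneList[T <: Data](n: Int, gen: T, op: (T, T) => T) extends Module {
  val io = IO(new Bundle {
    val in = Input(Vec(n, gen))
    val out = Output(gen)
  })
  io.out := io.in.reduce(op)
}
// new MyModuleOneList(4, UInt(8.W), _ & _)
//   does not compile: "missing parameter type", because T is not yet fixed
//   when the anonymous function is type-checked
// new MyModuleOneList(4, UInt(8.W), (a: UInt, b: UInt) => a & b)
//   compiles, but the annotations are noise that the two-list version avoids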
Now that we have our abstract, type parameterized MyModule class, how do we actually use it? Above I gave snippets of how to use it with UInts, but let's see how to get some actual Verilog:
object MyMain extends App {
  println(chisel3.Driver.emitVerilog(new MyModule(4, UInt(8.W))(_ & _)))
}
Alternatively, we can parameterize MyModule with a more complex type:
class MyBundle extends Bundle {
  val bar = Bool()
  val baz = Bool()
}

object MyMain extends App {
  def combineMyBundle(a: MyBundle, b: MyBundle): MyBundle = {
    val w = Wire(new MyBundle)
    w.bar := a.bar && b.bar
    w.baz := a.baz && b.baz
    w
  }
  println(chisel3.Driver.emitVerilog(new MyModule(4, new MyBundle)(combineMyBundle)))
}
We also had to define a function of type (MyBundle, MyBundle) => MyBundle which we did with combineMyBundle.
You can see a complete, runnable version of the code I presented above on Scastie.
I hope someone finds this example useful!

Using Shapeless HList to easily build Json Decoder

I am working on writing my own little lightweight toy Json library, and I am running into a roadblock trying to come up with an easy way to specify an Encoder/Decoder. I think I've got a really nice DSL syntax; I'm just not sure how to pull it off. I think it might be possible using Shapeless HList, but I've never used it before, so I'm drawing a blank as to how it would be done.
My thought was to chain these has calls together and build up some sort of HList of (String, J: Mapper) pairs, and then, if possible, have it try behind the scenes to convert a Json into an HList[J]?
Here is part of the implementation, along with how I imagine using it:
trait Mapper[J] {
  def encode(j: J): Json
  def decode(json: Json): Either[Json, J]
}

object Mapper {
  def strict[R]: IsStrict[R] =
    new IsStrict[R](true)
  def lenient[R]: IsStrict[R] =
    new IsStrict[R](false)

  class IsStrict[R](strict: Boolean) {
    def has[J: Mapper](at: String): Builder[R, J] =
      ???
  }

  class Builder[R, T](strict: Boolean, t: T) {
    def has[J: Mapper](at: String): Builder[R, J] =
      ???
    def is(decode: T => R)(encode: R => Json): Mapper[R] =
      ???
  }
}
Mapper
  .strict[Person]
  .has[String]("firstName")
  .has[String]("lastName")
  .has[Int]("age")
  .is {
    case firstName :: lastName :: age :: HNil =>
      new Person(firstName, lastName, age)
  } { person =>
    Json.Object(
      "firstName" := person.firstName,
      "lastName" := person.lastName,
      "age" := person.age
    )
  }
There is a wonderful resource for learning how to use shapeless (HList plus LabelledGeneric) for that purpose:
Dave Gurnell's The Type Astronaut's Guide to Shapeless
In your case, given a product type like:
case class Person(firstName: String, lastName: String, age: Int)
the compiler needs access to the names and the values of an instance of that type. How the compiler is able to create a JSON representation at compile time is well described in the book.
In your example, you should use LabelledGeneric and try to create a generic encoder/decoder. LabelledGeneric is a type class that creates a representation of your types as an HList in which each element corresponds to a property.
For example, if you create a LabelledGeneric for your Person type
val genPerson = LabelledGeneric[Person]
the compiler infers the following type:
/*
shapeless.LabelledGeneric[test.shapeless.Person] {
  type Repr =
    String with KeyTag[Symbol with Tagged["firstName"], String] ::
    String with KeyTag[Symbol with Tagged["lastName"], String] ::
    Int with KeyTag[Symbol with Tagged["age"], Int] ::
    HNil
}
*/
So the names and the values are already represented as Scala types, and the compiler can now derive JSON encoder/decoder instances at compile time. The code below shows the steps to create a generic JSON encoder (a summary of chapter 5 of the book) that you can customize.
First step is to create a JSON algebraic data type:
sealed trait JsonValue
case class JsonObject(fields: List[(String, JsonValue)]) extends JsonValue
case class JsonArray(items: List[JsonValue]) extends JsonValue
case class JsonString(value: String) extends JsonValue
case class JsonNumber(value: Double) extends JsonValue
case class JsonBoolean(value: Boolean) extends JsonValue
case object JsonNull extends JsonValue
The idea behind all of this is that the compiler can take your product type and build a JSON encoder for it out of the native ones.
A type class to encode your types:
trait JsonEncoder[A] {
  def encode(value: A): JsonValue
}
As a first check, you can create the three instances that would be necessary for the Person type:
object Instances {
  implicit def stringEncoder: JsonEncoder[String] = new JsonEncoder[String] {
    override def encode(value: String): JsonValue = JsonString(value)
  }
  implicit def doubleEncoder: JsonEncoder[Double] = new JsonEncoder[Double] {
    override def encode(value: Double): JsonValue = JsonNumber(value)
  }
  implicit def personEncoder(implicit strEncoder: JsonEncoder[String],
                             numberEncoder: JsonEncoder[Double]): JsonEncoder[Person] =
    new JsonEncoder[Person] {
      override def encode(value: Person): JsonValue =
        JsonObject(
          ("firstName" -> strEncoder.encode(value.firstName)) ::
          ("lastName" -> strEncoder.encode(value.lastName)) ::
          ("age" -> numberEncoder.encode(value.age)) :: Nil)
    }
}
Create an encode function that injects a JSON encoder instance:
import Instances._

def encode[A](in: A)(implicit jsonEncoder: JsonEncoder[A]) = jsonEncoder.encode(in)

val person = Person("name", "lastName", 25)
println(encode(person))
gives:
JsonObject(List((firstName,JsonString(name)), (lastName,JsonString(lastName)), (age,JsonNumber(25.0))))
Obviously you would need to create instances for each case class. To avoid that you need a function that returns a generic encoder:
// JsonObjectEncoder specializes JsonEncoder so that encode always returns a
// JsonObject (this definition is needed by the code below):
trait JsonObjectEncoder[A] extends JsonEncoder[A] {
  def encode(value: A): JsonObject
}

def createObjectEncoder[A](fn: A => JsonObject): JsonObjectEncoder[A] =
  new JsonObjectEncoder[A] {
    def encode(value: A): JsonObject =
      fn(value)
  }
It takes a function A => JsonObject as a parameter. The intuition behind this is that the compiler uses this function, while traversing the HList representation of your type, to create the type's encoder, as described in the HList encoder function below.
Then, you must create the HList encoder. That requires an implicit function to create the encoder for the HNil type and another for the HList itself.
implicit val hnilEncoder: JsonObjectEncoder[HNil] =
  createObjectEncoder(hnil => JsonObject(Nil))

/* hlist encoder */
implicit def hlistObjectEncoder[K <: Symbol, H, T <: HList](
    implicit witness: Witness.Aux[K],
    hEncoder: Lazy[JsonEncoder[H]],
    tEncoder: JsonObjectEncoder[T]): JsonObjectEncoder[FieldType[K, H] :: T] = {
  val fieldName: String = witness.value.name
  createObjectEncoder { hlist =>
    val head = hEncoder.value.encode(hlist.head)
    val tail = tEncoder.encode(hlist.tail)
    JsonObject((fieldName, head) :: tail.fields)
  }
}
The last thing we have to do is create an implicit function that provides an encoder instance for Person. It leverages the compiler's implicit resolution to create a LabelledGeneric of your type and, from it, the encoder instance.
implicit def genericObjectEncoder[A, H](
    implicit generic: LabelledGeneric.Aux[A, H],
    hEncoder: Lazy[JsonObjectEncoder[H]]): JsonEncoder[A] =
  createObjectEncoder { value =>
    hEncoder.value.encode(generic.to(value))
  }
You can code all these definitions inside the Instances object.
import Instances._

case class Person2(firstName: String, lastName: String, age: Double)

val person2 = Person2("name", "lastName", 25)
println(encode(person2))
prints:
JsonObject(List((firstName,JsonString(name)), (lastName,JsonString(lastName)), (age,JsonNumber(25.0))))
Note that you need to include the Witness instance for Symbol in the HList encoder. It gives access to the property names at runtime. Remember that the LabelledGeneric representation of your Person type is something like:
String with KeyTag[Symbol with Tagged["firstName"], String] ::
String with KeyTag[Symbol with Tagged["lastName"], String] ::
Int with KeyTag[Symbol with Tagged["age"], Int] ::
HNil
The Lazy type is necessary to create encoders for recursive or nested types, for example extending Person2 with a nested Person field:
case class Person2(firstName: String, lastName: String, age: Double, person: Person)

val person2 = Person2("name", "lastName", 25, person)
println(encode(person2))
prints:
JsonObject(List((firstName,JsonString(name)), (lastName,JsonString(lastName)), (age,JsonNumber(25.0)), (person,JsonObject(List((firstName,JsonString(name)), (lastName,JsonString(lastName)), (age,JsonNumber(25.0)))))))
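To see why Lazy matters, consider a truly self-referential type (Tree is my hypothetical example, not from the original):
// Deriving JsonEncoder[Tree] needs an encoder for List[Tree], which needs
// JsonEncoder[Tree] again while it is still being resolved. Wrapping the head
// encoder in Lazy defers that nested lookup so the implicit search can
// terminate instead of failing with "diverging implicit expansion".
case class Tree(value: Int, children: List[Tree])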
Take a look at libraries like Circe or spray-json to see how they use Shapeless for codec derivation.
Try
implicit class StringOp(s: String) {
  def :=[A](a: A): (String, A) = s -> a
}

implicit def strToJStr: String => Json.String = Json.String
implicit def dblToJNumber: Double => Json.Number = Json.Number
implicit def intToJNumber: Int => Json.Number = Json.Number(_)

sealed trait Json
object Json {
  case class Object(fields: (scala.Predef.String, Json)*) extends Json
  case class Array(items: List[Json]) extends Json
  case class String(value: scala.Predef.String) extends Json
  case class Number(value: Double) extends Json
  case class Boolean(value: scala.Boolean) extends Json
  case object Null extends Json
}
trait Mapper[J] {
  def encode(j: J): Json
  def decode(json: Json): Either[Json, J]
}

object Mapper {
  implicit val `object`: Mapper[Json.Object] = ???
  implicit val array: Mapper[Json.Array] = ???
  implicit val stringJson: Mapper[Json.String] = ???
  implicit val number: Mapper[Json.Number] = ???
  implicit val boolean: Mapper[Json.Boolean] = ???
  implicit val `null`: Mapper[Json.Null.type] = ???
  implicit val json: Mapper[Json] = ???
  implicit val int: Mapper[Int] = ???
  implicit val string: Mapper[String] = ???
  implicit val person: Mapper[Person] = ???

  def strict[R]: IsStrict[R] =
    new IsStrict[R](true)
  def lenient[R]: IsStrict[R] =
    new IsStrict[R](false)

  class IsStrict[R](strict: Boolean) {
    def has[A: Mapper](at: String): Builder[R, A :: HNil] =
      new Builder(strict, at :: Nil)
  }

  class Builder[R, L <: HList](strict: Boolean, l: List[String]) {
    def has[A: Mapper](at: String): Builder[R, A :: L] =
      new Builder(strict, at :: l)
    def is[L1 <: HList](decode: L1 => R)(encode: R => Json)(
        implicit reverse: ops.hlist.Reverse.Aux[L, L1]): Mapper[R] = {
      val l1 = l.reverse
      ???
    }
  }
}
Unfortunately, this requires L1 to be specified explicitly at the call to is:
case class Person(firstName: String, lastName: String, age: Int)

Mapper
  .strict[Person]
  .has[String]("firstName")
  .has[String]("lastName")
  .has[Int]("age")
  .is[String :: String :: Int :: HNil] {
    case (firstName :: lastName :: age :: HNil) =>
      new Person(firstName, lastName, age)
  } { person =>
    Json.Object(
      "firstName" := person.firstName,
      "lastName" := person.lastName,
      "age" := person.age
    )
  }
otherwise you get:
Error: missing parameter type for expanded function.
The argument types of an anonymous function must be fully known.
One way to improve inference is to move the implicit reverse to class Builder, but this is less efficient: the HList will be reversed at every step, not only in the last one.
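For concreteness, here is a sketch of that variant (my illustration, reusing the Builder fields from above):
class Builder[R, L <: HList](strict: Boolean, l: List[String])(
    implicit val reverse: ops.hlist.Reverse[L]) {
  // a new Reverse is derived, and the type-level list re-reversed, on every
  // has call, not just once at the end
  def has[A: Mapper](at: String)(
      implicit r: ops.hlist.Reverse[A :: L]): Builder[R, A :: L] =
    new Builder[R, A :: L](strict, at :: l)(r)
  def is(decode: reverse.Out => R)(encode: R => Json): Mapper[R] = {
    val l1 = l.reverse
    ???
  }
}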
Another way is to introduce a helper class:
def is(implicit reverse: ops.hlist.Reverse[L]) = new IsHelper[reverse.Out]

class IsHelper[L1 <: HList] {
  def apply(decode: L1 => R)(encode: R => Json): Mapper[R] = {
    val l1 = l.reverse
    ???
  }
}
but then apply (or whatever other method name you choose) has to be written out explicitly:
Mapper
  .strict[Person]
  .has[String]("firstName")
  .has[String]("lastName")
  .has[Int]("age")
  .is.apply {
    case (firstName :: lastName :: age :: HNil) =>
      new Person(firstName, lastName, age)
  } { person =>
    Json.Object(
      "firstName" := person.firstName,
      "lastName" := person.lastName,
      "age" := person.age
    )
  }
otherwise the compiler would treat the decode block as the implicit reverse argument.

How to create a vector of vectors when defining IO

I'm trying to define a Vector of Vectors for my IO, but I'm getting an error from Chisel saying:
vec element 'Vec(chisel3.util.DecoupledIO#2b57)' must be hardware, not a bare Chisel type
The code I've written looks like the following:
// Module's argument
, val ArgsOut: Array[Int]
...
val Args = for (i <- 0 until ArgsOut.length) yield {
  val arg = Vec(ArgsOut(i), Decoupled(UInt()))
  arg
}
val outputArg = Vec(Args)
Some things to consider:
IO ports, i.e. members of the IO Bundle, must be Chisel hardware constructs. Args is a Scala Vector; it needs to be a Chisel Vec.
All elements of a Vec must be the same size, mostly a consequence of the need to be able to index elements of the Vec. You have each element of Args as a Vec whose length is determined by the elements of ArgsOut, and Vec(n, type) cannot hold differently sized elements.
Do you really mean to have a 2D Vec(Vec(...)) of Decoupled IOs?
The UInt in your Decoupled has an unknown width. Strictly speaking this is not an error, because Firrtl can infer the width in most situations, but again it can run afoul of the requirement that a Vec's elements all be the same width. Inferred widths in IOs should be used cautiously.
I was able to construct an IO Bundle like this:
val io = IO(new Bundle {
  val args = Vec(ArgsOut.length, Vec(ArgsOut(0), Decoupled(UInt(18.W))))
  val outputArg = Flipped(args)
})
which compiles, but may not be quite what you had in mind. I was able to connect the IOs using
io.outputArg <> io.args
If this doesn't seem to fit your use case, I need to know a bit more about how you intend to use the fields, and we should be able to figure out how to wire them up.
Here is an illustration of how to use a subclass of Record to manage a virtual array of Vecs of varying length. This example runs for me. It is not exactly the same as your example, but I think it makes things clearer. It is for the use case where you do not need to index your Vecs with a UInt, but instead need to handle a heterogeneous mix of Vecs.
import chisel3._
import chisel3.iotesters.PeekPokeTester
import org.scalatest.{FreeSpec, Matchers}
import scala.collection.immutable.ListMap

final class VariableBundle(elts: (String, Vec[UInt])*) extends Record {
  val elements = ListMap(elts map { case (field, elt) => field -> elt.chiselCloneType }: _*)
  def apply(elt: String): Vec[UInt] = elements(elt)
  override def cloneType = (new VariableBundle(elements.toList: _*)).asInstanceOf[this.type]
}
class SeqIO(val sizes: Array[Int]) extends Module {
  val io = IO(new VariableBundle(
    Seq.tabulate(sizes.length) { i =>
      s"vec_in_$i" -> Input(Vec(sizes(i), UInt(8.W)))
    } ++
    Seq.tabulate(sizes.length) { i =>
      s"vec_out_$i" -> Output(Vec(sizes(i), UInt(8.W)))
    }: _*
  ))
  for (i <- sizes.indices) {
    io(s"vec_out_$i") := io(s"vec_in_$i")
  }
}
class SeqIOTester(c: SeqIO) extends PeekPokeTester(c) {
  for (i <- c.sizes.indices) {
    for (j <- 0 until c.sizes(i)) {
      poke(c.io(s"vec_in_$i")(j), j)
    }
  }
  step(1)
  for (i <- c.sizes.indices) {
    for (j <- 0 until c.sizes(i)) {
      expect(c.io(s"vec_out_$i")(j), j)
    }
  }
}

class SeqIOSpec extends FreeSpec with Matchers {
  "illustrate how to build bundles that have vecs wrapping different sized vecs" in {
    iotesters.Driver.execute(Array.empty[String], () => new SeqIO(Array(1, 2, 3, 4))) { c =>
      new SeqIOTester(c)
    } should be (true)
  }
}
Let me show the actual code:
class LoopLatch(val NumInputs: Int,
                val ArgsOut: Array[Int],
                val ID: Int)
               (implicit val p: Parameters) extends Module with CoreParams with UniformPrintfs {
  val io = IO(new Bundle {
    val inputArg = Vec(NumInputs, Flipped(Decoupled(new DataBundle())))
    val Args = Vec(for (i <- 0 until ArgsOut.length) yield {
      val arg = Vec(ArgsOut(i), Decoupled(new DataBundle()))
      arg
    })
DataBundle() is a custom data type which I have defined in a separate file. What I really want to do is pass an array like ArgsOut to the module and then build a vector of vectors of DataBundle, where each inner vector has its own number of DataBundles, given by the corresponding element of the input array.
The LoopLatch module is a wrapper for another module, LiveInNode, which I have designed. The LoopLatch module really only has a vector of these elements and then provides an IO for them.
Here is the actual code:
class LoopStart(val NumInputs: Int,
                val ArgsOut: Array[Int],
                val ID: Int)
               (implicit val p: Parameters) extends Module with CoreParams with UniformPrintfs {
  val io = IO(new Bundle {
    val inputArg = Vec(NumInputs, Flipped(Decoupled(new DataBundle())))
    val Args = Vec(ArgsOut.length, Vec(ArgsOut(0), Decoupled(new DataBundle())))
    val Out = new VecVecBundle(
      Seq("inputArg" -> Vec(NumInputs, Flipped(Decoupled(new DataBundle())))) ++
        ArgsOut.indices.map { i =>
          val arg = Vec(ArgsOut(i), Decoupled(new DataBundle()))
          s"Args_$i" -> arg
        }: _*
    )
    val pSignal = Vec(NumInputs, Flipped(Decoupled(Bool())))
    val Finish = Vec(NumInputs, Flipped(Decoupled(new ControlBundle())))
  })
  val Args = for (i <- 0 until NumInputs) yield {
    val arg = Module(new LiveInNode(NumOuts = 1, ID = i))
    arg
  }

  // Iterate over each loop element and connect it to the IO
  for (i <- 0 until NumInputs) {
    Args(i).io.InData <> io.inputArg(i)
    Args(i).io.Finish <> io.Finish(i)
    Args(i).io.pred <> io.pSignal(i)
  }
  for (i <- 0 until ArgsOut.length) {
    io.Out(s"Args_$i") <> Args(i).io.Out
  }
}
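As an aside, newer Chisel versions provide chisel3.util.MixedVec, which is built for exactly this kind of heterogeneous collection. Here is a sketch (the names are mine, and UInt(w.W) stands in for your DataBundle):
import chisel3._
import chisel3.util.{Decoupled, MixedVec}

class LoopLatchIO(argsOut: Seq[Int], w: Int) extends Bundle {
  // one inner Vec per entry of argsOut, each with its own length
  val args = MixedVec(argsOut.map(n => Vec(n, Decoupled(UInt(w.W)))))
}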

Chisel3: False Combinational Loop in Fixed Priority Arbiter

The following code implements an N-bit fixed-priority arbiter.
import chisel3._
import chisel3.util._

class fixedPriorityArbiter(val n_reqs: Int = 4) extends Module {
  val NO_OF_REQS = n_reqs
  val io = IO(new Bundle {
    val req = Input(UInt(NO_OF_REQS.W))
    val grant = Output(UInt(NO_OF_REQS.W))
  })
  val higherPriReq = Wire(UInt(NO_OF_REQS.W))
  higherPriReq := Cat(higherPriReq(NO_OF_REQS-2, 0) | io.req(NO_OF_REQS-2, 0), 0.U(1.W))
  io.grant := io.req & ~higherPriReq
}
object main_obj extends App {
  val DUT = () => new fixedPriorityArbiter()
  val margs = Array("--compiler", "verilog")
  chisel3.Driver.execute(args = margs, dut = DUT)
}
Nonexistent combinational loops are reported for this code. The Chisel source mirrors a Verilog implementation of the same circuit, which doesn't report any combinational loops when synthesized in Synopsys Synplify.
A combinational-loop compile error is reported by FIRRTL (in the Eclipse IDE on Windows), without any Verilog source being generated.
FIRRTL does not support subword analysis, so by using a UInt here you are creating what appears to be a combinational loop, even though it actually isn't one. To get around this, you can use aggregate types like Vec to make it explicit to FIRRTL that you are working on the individual bits. Here's an equivalent implementation using a Vec:
class FixedPriorityArbiter(val n_reqs: Int = 4) extends Module {
  val NO_OF_REQS = n_reqs
  val io = IO(new Bundle {
    val req = Input(UInt(NO_OF_REQS.W))
    val grant = Output(UInt(NO_OF_REQS.W))
  })
  val higherPriReq = Wire(Vec(NO_OF_REQS, Bool()))
  // Vec implements scala.collection.Seq, so you can use operations such as slice and map
  val upperOr = higherPriReq.slice(0, NO_OF_REQS-1).zip(io.req(NO_OF_REQS-2, 0).toBools)
    .map { case (l, r) => l | r }
  higherPriReq := VecInit(false.B +: upperOr)
  io.grant := io.req & ~higherPriReq.asUInt
}
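As an aside (not part of the original answer), chisel3.util also provides PriorityEncoderOH, which computes this lowest-index-wins one-hot grant directly; a sketch:
import chisel3._
import chisel3.util.PriorityEncoderOH

class PriorityArbiterOH(n: Int = 4) extends Module {
  val io = IO(new Bundle {
    val req = Input(UInt(n.W))
    val grant = Output(UInt(n.W))
  })
  // one-hot grant for the lowest-indexed asserted request;
  // the behavior for req === 0 should be checked for your use case
  io.grant := PriorityEncoderOH(io.req)
}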

Scala implicit function parameterized

Why does this code not pick up the implicit functions defined in the local scope?
Where else does it take the implicit functions from?
def implctest[T](a: T)(implicit b: T => T): T = { b apply a }

class Testimplcl(val a: Int) {
  override def toString() = "value of the 'a' is = " + a
}

implicit def dble(x: Int): Int = { x + x }
implicit def stringer(x: String): String = { x + " no not a pity" }
implicit def myclass(x: Testimplcl): Testimplcl = new Testimplcl(x.a + 1)

implctest[String]("oh what a pity")
implctest[Int](5)
implctest[Testimplcl](new Testimplcl(4))
None of my implicit defs in local scope are picked up.
For example, implctest[Int](5) gives 5; I expected 10, with dble taken as the implicit.
There is no error either: implctest simply returns the argument passed in.
When you ask for a function A => A, Scala provides an implicit lift from a method definition, such as
implicit def dble(x: Int):Int = x + x
That is, it will treat that as a function dble _. So in the implicit resolution, this is not an immediately available value.
The problem you have is that there is an implicit A => A for any type, defined as Predef.conforms:
def conforms[A]: <:<[A, A] // where <:< is a subclass of A => A
This is useful and necessary because whenever you want a view from A => B and A happens to be B, such a "conversion" is automatically available.
See, with a direct function:
implicit val dble = (x: Int) => x + x
You see the conflict:
implicitly[Int => Int] // look for an implicit conversion of that type
<console>:49: error: ambiguous implicit values:
both method conforms in object Predef of type [A]=> <:<[A,A]
and value dble of type => Int => Int
match expected type Int => Int
implicitly[Int => Int]
^
So, in short, it's not good to ask for a custom A => A. If you really need such a thing, use a custom type class, such as Foo[A] extends (A => A).
If you rewrite your implicits like this:
implicit val dble = (x: Int) => x + x
implicit val stringer = (x: String) => x + " no not a pity"
implicit val myclass = (x: Testimplcl) => new Testimplcl(x.a +1)
then you will immediately see the reason for this behavior. Now you have an ambiguous-implicit-values problem:
scala: ambiguous implicit values:
both method conforms in object Predef of type [A]=> <:<[A,A]
and value stringer in object ReflectionTest of type => String => String
match expected type String => String
println(implctest[String]("oh what a pity"))
^
This tells you that Predef already defines an implicit function T => T, so it conflicts with your definitions.
I recommend not using a type as general as a plain function type for implicit parameters. Just create your own type for this, as in this example:
trait MyTransformer[T] extends (T => T)

object MyTransformer {
  def apply[T](fn: T => T) = new MyTransformer[T] {
    def apply(v: T) = fn(v)
  }
}

def implctest[T: MyTransformer](a: T): T =
  implicitly[MyTransformer[T]] apply a

class Testimplcl(val a: Int) {
  override def toString() = "value of the 'a' is = " + a
}

implicit val dble = MyTransformer((x: Int) => x + x)
implicit val stringer = MyTransformer((x: String) => x + " no not a pity")
implicit val myclass = MyTransformer((x: Testimplcl) => new Testimplcl(x.a + 1))
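With these instances in scope, the calls from the question now behave as intended:
println(implctest(5))                 // 10
println(implctest("oh what a pity"))  // oh what a pity no not a pity
println(implctest(new Testimplcl(4))) // value of the 'a' is = 5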

When do I have to treat my methods as partially applied functions in Scala?

I noticed that when I'm working with functions that expect other functions as parameters, I can sometimes do this:
someFunction(firstParam,anotherFunction)
But other times the compiler gives me an error telling me that I should write the function like this, so that it is treated as a partially applied function:
someFunction(firstParam,anotherFunction _)
For example, if I have this:
object Whatever {
  def meth1(params: Array[Int]) = ...
  def meth2(params: Array[Int]) = ...
}
import Whatever._
val callbacks = Array(meth1 _, meth2 _)
Why can't I write the code like the following?
val callbacks = Array(meth1,meth2)
Under what circumstances will the compiler tell me to add _?
The rule is actually simple: you have to write the _ whenever the compiler is not explicitly expecting a Function object.
Example in the REPL:
scala> def f(i: Int) = i
f: (i: Int)Int
scala> val g = f
<console>:6: error: missing arguments for method f in object $iw;
follow this method with `_' if you want to treat it as a partially applied function
val g = f
^
scala> val g: Int => Int = f
g: (Int) => Int = <function1>
In Scala, a method is not a function. The compiler can convert a method into a function implicitly, but it needs to know which kind. So either you use the _ to convert it explicitly, or you can give some indication of which function type to use:
object Whatever {
  def meth1(params: Array[Int]): Int = ...
  def meth2(params: Array[Int]): Int = ...
}
import Whatever._
val callbacks = Array[Array[Int] => Int](meth1, meth2)
or:
val callbacks: Array[ Array[Int] => Int ] = Array( meth1, meth2 )
In addition to what Jean-Philippe Pellet said, you can use partially applied functions when writing delegate classes:
class ThirdPartyAPI {
  def f(a: Int, b: String, c: Int) = ...
  // lots of other methods
}

// You want to hide all the unnecessary methods
class APIWrapper(r: ThirdPartyAPI) {
  // instead of writing this
  def f(a: Int, b: String, c: Int) = r.f(a, b, c)
  // you can write this
  def f(a: Int, b: String, c: Int) = (r.f _)(a, b, c)
  // or even this
  def f = r.f _
}
EDIT: added the def f = r.f _ part.