Passing functions and operating on their results within Scala's Actors

I'm implementing an actor-based app in Scala and I'm trying to pass functions between the actors, so that they are evaluated only when a certain message is received by the actor.
import actors.Actor
import java.util.Random
import scala.Numeric._
import Implicits._
class Constant(val n: Number) extends Actor {
  def act() {
    loop {
      receive {
        case "value" => reply( {n} )
      }
    }
  }
}
class Arithmetic[T: Numeric](A: () => T, B: () => T) extends Actor {
  def act() {
    receive {
      case "sum" => reply( A() + B() )
      /* case "mul" => reply( A * B )
      */
    }
  }
}
object Main extends App {
  val c5 = new Constant(5)
  c5.start
  val a = new Arithmetic({ c5 !! "value" }, { c5 !! "value" })
  a.start
  println(a !? "sum")
  println(a !? "mul")
}
In the example code above I would expect the output to be both 5+5 and 5*5. The issue is that reply is not typed, and as such I'm unable to apply the operators (+, *) to the results of A and B.
Can you provide any help on how to better design/implement such system?
Edit: Code updated to better reflect the problem. Error in:
error: could not find implicit value for evidence parameter of type Numeric[Any]
val a = new Arithmetic({c5 !! "value"}, {c5!!"value"} )
I need to be able to pass the function to be evaluated in the actor whenever I call it. This example uses static values but I'll be using dynamic values in the future, so passing the value won't solve the problem. Also, I would like to accept different value types (Int/Long/Double) and still be able to use the same code.

The error is: could not find implicit value for evidence parameter of type Numeric[Any]. The definition of !! is:
def !! (msg: Any): Future[Any]
So the T that Arithmetic is getting is Any. There truly isn't a Numeric[Any].

I'm pretty sure that is not your problem. First, A and B are functions, and functions don't have + or *. If you called A() and B(), then you might stand a chance... except for the fact that they are java.lang.Number, which also does not have + or * (or any other method you'd expect it to have).
Basically, there's no "Number" type that is a superclass or interface of all numbers for the simple reason that Java doesn't have it. There's a lot of questions touching this subject on Stack Overflow, including some of my own very first questions about Scala -- investigate scala.math.Numeric, which is the best approximation for the moment.
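For instance, here is a minimal sketch (not from the original post) of how Numeric lets the same code handle Int, Long or Double; the import brings + and * into scope for any T that has a Numeric instance:
import scala.math.Numeric.Implicits._

def sumAndProduct[T: Numeric](a: T, b: T): (T, T) = (a + b, a * b)

sumAndProduct(5, 5)     // (10, 25)
sumAndProduct(2.5, 4.0) // (6.5, 10.0)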
Method vs Function and dropping parentheses
Methods and functions are different things -- see tons of related questions here, and the rules for dropping parentheses differ as well. I'll let the REPL speak for me:
scala> def f: () => Int = () => 5
f: () => Int
scala> def g(): Int = 5
g: ()Int
scala> f
res2: () => Int = <function0>
scala> f()
res3: Int = 5
scala> g
res4: Int = 5
scala> g()
res5: Int = 5
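Putting the two points together, here is a rough sketch of how the original example could be wired up (reusing the Constant actor from the question, and assuming the old scala.actors API, where the Future returned by !! extends Function0 and can be forced with ()). The asInstanceOf cast pins the untyped reply back to Int so a Numeric instance can be found, and loop lets the actor answer more than one message:
import scala.math.Numeric.Implicits._

class Arithmetic[T: Numeric](A: () => T, B: () => T) extends Actor {
  def act() {
    loop {
      receive {
        case "sum" => reply(A() + B())
        case "mul" => reply(A() * B())
      }
    }
  }
}

val c5 = new Constant(5)
c5.start()

// force the Future[Any] and cast the result, since !! loses the type
val value = () => (c5 !! "value").apply().asInstanceOf[Int]
val a = new Arithmetic(value, value)
a.start()
println(a !? "sum") // 10
println(a !? "mul") // 25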

Related

How to write an accumulator using ScalaBlackBox?

I want to create some new number types like DspReal for dsptools, such as DspPosit and DspQuire. DspPosit is based on posit, for which I have some Java code, and DspQuire is based on quire, which is a kind of accumulator for posits. Because I only want simulation for now, I have written many ScalaBlackBoxes for their operations, as DspReal does. However, I found that a ScalaBlackBox can't model sequential logic. For example, the current output of the quire accumulator depends on its input and its last output, but a ScalaBlackBox can't read the value of its own output. In addition, step(n) also influences the output, because the accumulator reads its input once per clock cycle.
I also found some problems with treadle. First, the ScalaBlackBox functions, such as twoOp and oneOp, are called many times, and I don't know why. Second, step(n) is a function of PeekPokeTester, which can't be accessed from a ScalaBlackBox. Third, when I try to read the current output, the system gives errors.
trait DspBlackBlackBoxImpl extends BlackBoxImplementation with ScalaBlackBox

abstract class DspQuireAccumulator extends DspBlackBlackBoxImpl {
  lazy val accValue = Quire32() // initial value

  /**
    * Sub-classes must implement this accumulation operation,
    * which folds the given posit into accValue.
    *
    * @param posit element to accumulate
    */
  def accOp(posit: Posit32): Unit

  def outputDependencies(outputName: String): Seq[String] = {
    outputName match {
      case "out" => Seq("in") // Seq("out", "in") gives errors
      case _     => Seq.empty
    }
  }

  def cycle(): Unit = {}

  def execute(inputValues: Seq[Concrete], tpe: Type, outputName: String): Concrete = {
    val arg1 :: _ = inputValues
    val positArg = Posit32(arg1.value)
    accOp(positArg)
    val result = quire32ToBigInt(accValue)
    ConcreteSInt(result, DspQuire.underlyingWidth, arg1.poisoned).asUInt
  }

  def getOutput(inputValues: Seq[BigInt], tpe: Type, outputName: String): BigInt = {
    val arg1 :: _ = inputValues
    val positArg = Posit32(arg1)
    accOp(positArg)
    quire32ToBigInt(accValue)
  }
}

class DspQuireAddAcc(val name: String) extends DspQuireAccumulator {
  def accOp(posit: Posit32): Unit = accValue += posit
}

class QuireBlackboxAccOperand extends BlackBox {
  val io = IO(new Bundle() {
    val in = Input(UInt(DspPosit.underlyingWidth.W))
    val out = Output(UInt(DspQuire.underlyingWidth.W))
  })
}

class BBQAddAcc extends QuireBlackboxAccOperand

class TreadleDspQuireFactory extends ScalaBlackBoxFactory {
  def createInstance(instanceName: String, blackBoxName: String): Option[ScalaBlackBox] = {
    blackBoxName match {
      case "BBQAddAcc" => Some(add(new DspQuireAddAcc(instanceName)))
      ...
accOp will be called many times. So, if I want to accumulate List(1, 2, 3), the result may be 0 + 1 + 1 + 2 + 2 + ...
And the peek function will call accOp one more time, which also confuses me.
I believe most of your problems at this point are caused by mixing two different approaches. I think you should not be using BlackBoxImplementation, because it is an older scheme used with the firrtl-interpreter. Just use ScalaBlackBox and implement the methods as described in the wiki page Black Boxes and Treadle and shown in the TreadleTest BlackBoxWithState.
Don't use outputDependencies; instead, specify any dependencies between inputs and outputs with getDependencies. inputChanged will be called whenever an input IO changes, so that is where you record or update the internal state of your black box. clockChange will be called whenever a clock changes and will provide the transition information, so you can decide what happens then. Treadle will call getOutput whenever it needs an output of your black box; since you will not have used outputDependencies, you can ignore the inputs and just provide the output value based on your internal state.
I am still trying to reproduce a running version of your code here, but it will take me a little time to put it together. If you can try my suggestions above and let me know how it goes, that would be helpful. I am interested in making this feature of Treadle better and easier to use, so all feedback is appreciated.
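In the meantime, here is a rough sketch of the shape I am describing, with a plain BigInt accumulator standing in for the Quire32 state (the inputChanged/clockChange signatures and the Transition/PositiveEdge types are assumed from the Treadle ScalaBlackBox trait). State is updated only on the rising clock edge, so repeated calls to getOutput or peek no longer re-run the accumulation:
import firrtl.ir.Type
import treadle.ScalaBlackBox
import treadle.executable.{PositiveEdge, Transition}

class DspQuireAddAccBB(val name: String) extends ScalaBlackBox {
  private var lastIn: BigInt = 0
  private var acc: BigInt = 0 // stand-in for the Quire32 state

  // remember the latest value driven onto the input
  override def inputChanged(name: String, value: BigInt): Unit = {
    if (name.endsWith("in")) lastIn = value
  }

  // accumulate exactly once per rising clock edge
  override def clockChange(transition: Transition, clockName: String): Unit = {
    if (transition == PositiveEdge) acc += lastIn
  }

  // outputs are computed purely from internal state, so extra calls are harmless
  override def getOutput(inputValues: Seq[BigInt], tpe: Type, outputName: String): BigInt = acc

  // "out" has no combinational dependency on "in"
  override def outputDependencies(outputName: String): Seq[String] = Seq.empty
}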

Could not find implicit value for parameter c: anorm.Column[Float]

I found this similar question, but it doesn't help me (Anorm parse float values), and I can honestly say I didn't understand the solution to that question.
I am getting this compile-time error:
could not find implicit value for parameter c: anorm.Column[Float]
at
def getInformation(id: Long): List[(Float, Float, Float)] = {
  DB.withConnection { implicit con =>
    val query = SQL("select principal,interest,value from myTable where userId={id} and status=true").on("id" -> id)
    val result = query().map { row =>
      Tuple3(row[Float]("principal"), row[Float]("interest"), row[Float]("value"))
      //     ^
    }.toList
    return result
  }
}
Maybe a short review of implicits help you. Let's construct a very basic example:
// some class which will be used as implicit (can be anything)
case class SomeImplicitInformation(maybe: Int, some: Int, data: Int)
// let's assume we have a function that requires an implicit
def functionRequiringImplicit(regularParameters: Int)(implicit imp: SomeImplicitInformation) {
  // ...
}
// now if you try to call the function without having an implicit in scope
// you would have to pass it explicitly as the second parameter list:
functionRequiringImplicit(0)(SomeImplicitInformation(0, 0, 0))
// instead you can declare an implicit somewhere in your scope:
implicit val imp = SomeImplicitInformation(0, 0, 0)
// and now you can call:
functionRequiringImplicit(0)
The error you get simply says that anorm.Column[Float] is not in scope as an implicit. You can solve it by adding such an implicit to your scope or by passing it explicitly.
More detailed instructions: since the Column companion object only provides an implicit rowToDouble, you simply have to use the code that is linked in your question. To get it to work, put it before your result computation. Later you might want to place it in a val in some enclosing scope.
try this...
def getInformation(id: Long): List[(Float, Float, Float)] = {
  DB.withConnection { implicit con =>
    val query = SQL("select principal,interest,value from myTable where userId={id} and status=true").on("id" -> id)
    val result = query().map { row =>
      Tuple3(row[Float]("principal").asInstanceOf[Float], row[Float]("interest").asInstanceOf[Float], row[Float]("value").asInstanceOf[Float])
    }.toList
    return result
  }
}
implicit def rowToFloat: Column[Float] = Column.nonNull { (value, meta) =>
  val MetaDataItem(qualified, nullable, clazz) = meta
  value match {
    case d: Float => Right(d)
    case _ => Left(TypeDoesNotMatch("Cannot convert " + value + ":" + value.asInstanceOf[AnyRef].getClass + " to Float for column " + qualified))
  }
}
Some functions can accept what we call implicit parameters. Such parameters can, under certain conditions, be derived from the context. If these parameters can't be found, then you have to specify them by hand. If you expect a parameter to be used as an implicit one, it must have been declared implicit, for instance this way :
implicit val myVal = ...
It can be done in the current block or in an enclosing one (in the class body, for instance, or even sometimes in the imports)
The error you get seems to be related to this feature. You're using a function that needs a parameter of type anorm.Column[Float]. The argument is defined to be implicit so that an implicit value can be used and your code may be more concise. Unfortunately, you don't seem to have such an implicit value in your code, so it fails.
Latest Anorm (included in Play 2.3) provides more numeric conversions (see details at http://applicius-en.tumblr.com/post/87829484643/anorm-whats-new-play-2-3 and in the Play migration notes).
If you have missing converter, you can add an issue on Play github project.
Best

EitherT with Reader generalizing over Reader input

A standard construct in my code is a function that returns a Reader[X,\/[A,B]] and I would like to use the Either portion in a for comprehension, so I have been trying to write a function which will convert a function (X) => \/[A,B] into EitherT[Reader[X,\/[A,B]],A,B].
I can do this with a predetermined Type for X. For instance:
case class Config(host: String)
type ReaderConfig[C] = Reader[Config, C]
type EitherReaderConfig[A,B] = EitherT[ReaderConfig, A,B]
def eitherReaderF[A,B](f: Config => \/[A,B]) : EitherReaderConfig[A,B] = EitherT[ReaderConfig, A,B](Reader[Config, \/[A,B]](f))
eitherReaderF(c => \/-(c.host)).run(Config("hostname"))
However, I am having problems removing the Config type and generalizing over X. This is because EitherT's first type argument expects a type constructor with a single parameter, F[_], whereas Reader is defined with two: Reader[A,B].
One of my attempts is to define a type in terms of an EitherT using type lambdas.
type EitherReaderM[X,A,B] = EitherT[({type λ[α] = Reader[X, α]})#λ, A,B]
def eitherReaderM[X,A,B](f: X => \/[A,B]): EitherReaderM[X,A,B] = EitherT[({type λ[α] = Reader[X, α]})#λ, A,B](Reader(f))
val r: EitherReaderM[Config, Int, String] = eitherReaderM((c: Config) => \/-(c.host))
val run = r.run /// type returns scalaz.Kleisli[[+X]X,Config,scalaz.\/[Int,String]]
run.apply(Config("host")) // fails: value apply is not a member of scalaz.Kleisli[[+X]X,Config,scalaz.\/[Int,String]]
That last bit fails. I feel like I'm close here.....
I'm not entirely sure what's going on yet, but I can run this with two calls to run: one on the EitherT and then one on the Kleisli (which I'm not sure where it came from).
run.run(Config("host"))
However, even though this runs in the console, it doesn't actually compile. I receive this error while compiling:
kinds of the type arguments ([α]scalaz.Kleisli[[+X]X,X,α],A,B)
do not conform to the expected kinds of the type parameters (type F,type A,type B).
[α]scalaz.Kleisli[[+X]X,X,α]'s type parameters do not match type F's expected parameters:
type α is invariant, but type _ is declared covariant
[ERROR] def eitherReaderM[X,A,B](f: X => \/[A,B]): EitherReaderM[X,A,B] = EitherT[({type λ[α] = Reader[X, α]})#λ, A,B](Reader(f))
And here we have it, the final, compiling version. I feel like it can be simplified a little more, but that is for a different day.
type EitherReader[X,A,B] = EitherT[({type λ[+α] = Reader[X, α]})#λ, A,B]
def eitherReader[X,A,B](f: X => \/[A,B]): EitherReader[X,A,B] = EitherT[({type λ[+α] = Reader[X, α]})#λ, A,B](Reader(f))
This allows me to replace Reader[X, A \/ B].apply with eitherReader[X,A,B].
Old:
def getSub(id: Int) = Reader[Config, String \/ Sub](config => config.findSub(id).right)
New:
def getSub(id: Int) = eitherReader[Config, String, Sub](config => config.findSub(id).right)
Seems weird that I'm doing this simply for type conversion. Probably means I'm overlooking something.
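For what it's worth, here is a rough, untested sketch of the kind of composition this buys. Sub, findSub and the error message are made up for illustration, and it assumes scalaz can find the Monad instance for the Reader type lambda so that flatMap on the EitherT works:
import scalaz._

case class Sub(id: Int)
case class Config(subs: Map[Int, Sub]) {
  def findSub(id: Int): Option[Sub] = subs.get(id)
}

def getSub(id: Int): EitherReader[Config, String, Sub] =
  eitherReader(config => config.findSub(id).fold[String \/ Sub](-\/("no sub " + id))(\/-(_)))

// the payoff: the disjunctions compose in a for comprehension,
// and the Config is threaded through by the Reader
val both = for {
  a <- getSub(1)
  b <- getSub(2)
} yield (a, b)

both.run.run(Config(Map(1 -> Sub(1), 2 -> Sub(2)))) // \/-((Sub(1),Sub(2)))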

def or val for defining Function in Scala

I'm learning Programming Paradigms at my university and reading the course material provided by the lecturer, which defines a function this way:
val double = (x: Int) => 2 * x
double: Int => Int = <function1>
But from my own studies I found and got used to defining the same function like this:
def d (x: Int) = 2 * x
d: (x: Int)Int
I'm new to Scala, and both definitions give a result of:
res21: Int = 8
upon passing 4 as the argument.
Now my main question is: why would the lecturer prefer to use val to define a function? I see it as longer and not really necessary, unless using val gives some added advantage that I don't know of. Besides, I understand that using val binds the name to a value, so later in the program I could mistakenly write val double = 5 and the function would be gone!
At this stage I'm quite convinced I learned a better way of defining a function unless someone would tell me otherwise.
Strictly speaking, def d (x: Int) = 2 * x is a method, not a Function; however, Scala can transparently convert (lift) methods into Functions for us. That means you can use the d method anywhere an Int => Int Function is required.
There is a small overhead in performing this conversion, as a new Function instance is created every time. We can see this happening here:
val double = (x: Int) => 2 * x
def d (x: Int) = 2 * x
def printFunc(f: Int => Int) = println(f.hashCode())
printFunc(double)
printFunc(double)
printFunc(d)
printFunc(d)
Which results in output like so:
1477986427
1477986427
574533740
1102091268
You can see that when explicitly defining a Function using a val, our program creates a single Function and reuses it when we pass it as an argument to printFunc (we see the same hash code). When we use a def, the conversion to a Function happens every time we pass it to printFunc, and we create several instances of the Function with different hash codes. Try it yourself.
That said, the performance overhead is small and often doesn't make any real difference to our program, so defs are often used to define Functions as many people find them more concise and easier to read.
In Scala, function values are monomorphic (i.e. they can not have type parameters, aka "generics"). If you want a polymorphic function, you have to work around this, for example by defining it using a method:
def headOption[A]: List[A] => Option[A] = {
case Nil => None
case x::xs => Some(x)
}
It would not be valid syntax to write val headOption[A]. Note that this does not make a polymorphic function value; it is just a polymorphic method, returning a monomorphic function value of the appropriate type.
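For example, each use manufactures a monomorphic function value of whatever type you ask for:
val intHead = headOption[Int]      // List[Int] => Option[Int]
intHead(List(1, 2, 3))             // Some(1)
headOption[String](List("a", "b")) // Some(a)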
Because you might have something like the following:
abstract class BaseClass {
val intToIntFunc: Int => Int
}
class A extends BaseClass {
override val intToIntFunc = (i: Int) => i * 2
}
So its purpose might not be obvious from a very simple example. But that Function value can itself be passed to higher-order functions: functions that take functions as parameters. If you look in the Scala collections documentation you will see numerous methods that take functions as parameters. It's a very powerful and versatile tool, but you need to reach a certain level of complexity and familiarity with algorithms before the cost/benefit becomes obvious.
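As a tiny illustration of that last point, the overridden function value can be handed straight to a higher-order method such as map:
val f: Int => Int = (new A).intToIntFunc
List(1, 2, 3).map(f) // List(2, 4, 6)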
I would also suggest not using "double" as an identifier name. Although legal Scala, it is easy to confuse it with the type Double.

How do I make lambda functions generic in Scala? [duplicate]

This question already has answers here:
How can I define an anonymous generic Scala function?
As most of you probably know, you can define functions in two ways in Scala: there's the 'def' way and the lambda way...
Making the 'def' kind generic is fairly straightforward:
def someFunc[T](a: T) { // insert body here
what I'm having trouble with here is how to make the following generic:
val someFunc = (a: Int) => // insert body here
of course right now a is an integer, but what would I need to do to make it generic?
val someFunc[T] = (a: T) => doesn't work, neither does val someFunc = [T](a: T) =>
Is it even possible to make them generic, or should I just stick to the 'def' variant?
As Randall Schulz said, def does not create a function, but a method. However, it can return a function and this way you can create generic functions like the identity function in Predef. This would look like this:
def myId[A] = (a: A) => a
List(1,2,3) map myId
// List(1,2,3)
List("foo") map myId
// List("foo")
But be aware that calling myId without any type information infers Nothing. In the above case it works because the type inference uses the signature of map, which is map[B](f: A => B), where A is the type of the list and B gets inferred to the same as A, because that is the signature of myId.
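A quick REPL check of that Nothing inference (continuing from the definition above):
scala> val bare = myId    // no expected type, so A is inferred as Nothing
bare: Nothing => Nothing = <function1>

scala> val intId = myId[Int]
intId: Int => Int = <function1>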
I don't believe it's possible. You can look at this previous post for more details:
How can I define an anonymous generic Scala function?
The only way around it (as one of the answers mentions) is to extend something like FunctionX, put the type parameter at the class level, and then use it in the override of the apply method.
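A minimal sketch of that FunctionX workaround (the type parameter lives on the class, so each instance is still a monomorphic function value):
class Id[T] extends (T => T) {
  def apply(a: T): T = a
}

List(1, 2, 3) map new Id[Int]  // List(1, 2, 3)
List("foo") map new Id[String] // List(foo)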
I don't believe it's possible either, but I'm a pessimist.
http://www.chuusai.com/2012/04/27/shapeless-polymorphic-function-values-1/
Edit:
Tell me if this isn't what you're asking, but here is why the accepted answer isn't what I thought you were asking for (see the link):
scala> :pa
// Entering paste mode (ctrl-D to finish)
def myId[A] = (a: A) => a
List(1,2,3) map myId
// List(1,2,3)
List("foo") map myId
// List("foo")
// Exiting paste mode, now interpreting.
myId: [A]=> A => A
res0: List[String] = List(foo)
scala> val f1 = myId[Int]
f1: Int => Int = <function1>
scala> val f2 = myId[String]
f2: String => String = <function1>
scala> List(1,2,3) map f2
<console>:10: error: type mismatch;
found : String => String
required: Int => ?
List(1,2,3) map f2
^
scala> List("foo") map f1
<console>:10: error: type mismatch;
found : Int => Int
required: String => ?
List("foo") map f1
^
The function values are not polymorphic, i.e., generic.
The closest thing is polymorphic functions I believe:
https://github.com/milessabin/shapeless#polymorphic-function-values