Using Play Framework and case class with greater than 22 parameters - json

I have seen some of the other issues involving the infamous "22 fields/parameters" problem that is an inherent bug (feature?) of Scala versions < 2.11. See here and here. However, as per this blog post, it appears that the 22-parameter limit on case classes has been fixed, at least where the language is concerned.
I have a case class into which I want to load an arbitrary (read: > 22) number of values, which will later be read into a JSON object using the Play library.
It looks something like this:
object L {
  import play.api.libs.json.Reads._
  import play.api.libs.functional.syntax._

  implicit val responseRead: Reads[L] = (
    MyField1.jsPath.read[MyField1.t] and
    MyField2.jsPath.read[MyField2.t] and
    ...
    MyField35.jsPath.read[MyField35.t]
  )(L.apply _)
}
case class L(myField1: MyField1.t, myField2: MyField2.t, ... myField35: MyField35.t)
The issue is that, on compile, Scala complains that there are more than 22 parameters in the case class. (Specifically, on the last line of the object definition, when the compiler attempts to build, I get: "implementation restricts functions to 22 parameters".) I'm currently using Scala v2.11.6, so I think it's not a language issue. That makes me think that the Play library hasn't updated its implementation of Reads.
If that's the case, then I guess the best bet is to group related fields into Tuples and pass the Tuples in through the JSON API?

As mentioned in the blog post you referenced, the 22-parameter limit is still in effect for functions in Scala 2.11 and later, so what you've encountered is a language issue. The function call in this case is:
L.apply _
Restructuring your model is one way to deal with this limit.
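For example, a hedged sketch of what that restructuring could look like (the group names and field types below are illustrative, not from your model): related fields are collected into smaller nested case classes, each safely under the 22-field limit, and their Reads are composed.
import play.api.libs.json._
import play.api.libs.functional.syntax._

// Hypothetical regrouping: related fields move into smaller case classes,
// each comfortably under the 22-field limit.
case class Group1(myField1: Int, myField2: String)
case class Group2(myField3: Double, myField4: Boolean)

case class L(group1: Group1, group2: Group2 /* ... more groups ... */)

object L {
  implicit val group1Reads: Reads[Group1] = (
    (JsPath \ "myField1").read[Int] and
    (JsPath \ "myField2").read[String]
  )(Group1.apply _)

  implicit val group2Reads: Reads[Group2] = (
    (JsPath \ "myField3").read[Double] and
    (JsPath \ "myField4").read[Boolean]
  )(Group2.apply _)

  // The outer Reads now combines a handful of nested Reads
  // instead of 35 individual fields.
  implicit val responseRead: Reads[L] = (
    (JsPath \ "group1").read[Group1] and
    (JsPath \ "group2").read[Group2]
  )(L.apply _)
}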

So the answer to this question actually has two parts:
1. Workaround
I'll call this the "workaround" because, while it does "work", it usually addresses the symptom and not the problem.
My solution was to use shapeless to provide generic heterogeneous lists of arbitrary length. This solution is already widely discussed and available elsewhere. See, e.g., (1) [SO Post] How to get around the Scala case class limit of 22 fields?; (2) Blog post; (3) Yet another blog post.
2. Solution
As #jeffrey-chung mentions, the other option is to restructure the model to deal with this limit. As many in the industry have noted, having a function with more than 30 arguments is likely a sign that your function is doing too much or that it should be refactored to ingest a smaller number of arguments. See, e.g., (1) Rule of 30 – When is a method, class or subsystem too big?; (2) Databricks' style guide.

See the answer here:
https://stackoverflow.com/a/57317220/1606452
It seems this handles it all nicely.
play-json-extensions ("22+ field case class formatter and more for play-json"):
https://github.com/xdotai/play-json-extensions
It supports Scala 2.11.x, 2.12.x, and 2.13.x and Play 2.3, 2.4, 2.5 and 2.7, and is referenced in the play-json issue as the preferred solution (but not yet merged).
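For reference, a hedged sketch of how the library is typically used, based on its README (exact import paths and required implicits may vary between library versions):
import play.api.libs.json._
import ai.x.play.json.Jsonx

case class L(
  myField1: Int,
  myField2: String,
  // ... 30-odd more fields ...
  myField35: Boolean
)

object L {
  // Jsonx.formatCaseClass is a macro-based formatter that is not subject to
  // the 22-field limit of the built-in Json.format.
  implicit val responseFormat: OFormat[L] = Jsonx.formatCaseClass[L]
}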

Why is my %h is List = 1,2; a valid assignment?

While finalizing my upcoming Raku Advent Calendar post on sigils, I decided to double-check my understanding of the type constraints that sigils create. The docs describe sigil type constraints with the table below:

Sigil   Type constraint
$       Mu (no type constraint)
&       Callable
@       Positional
%       Associative
Based on this table (and my general understanding of how sigils and containers work), I strongly expected this code
my %percent-sigil is List = 1,2;
my @at-sigil is Map = :k<v>;
to throw an error.
Specifically, I expected that is List would attempt to bind the %-sigiled variable to a List, and that this would throw an X::TypeCheck::Binding error – the same error that my %h := 1,2 throws.
But it didn't error. The first line created a List that seemed perfectly ordinary in every way, other than the sigil on its variable. And the second created a seemingly normal Map. Neither of them secretly had Scalar intermediaries, at least as far as I could tell with VAR and similar introspection.
I took a very quick look at the World.nqp source code, and it seems at least plausible that discarding the % type constraint with is List is intended behavior.
So, is this behavior correct/intended? If so, why? And how does that fit in with the type constraints and other guarantees that sigils typically provide?
(I have to admit, seeing an %-sigiled variable that doesn't support Associative indexing kind of shocked me…)
I think this is a grey area, somewhere between DIHWIDT (Doctor, It Hurts When I Do This) and an oversight in implementation.
Thing is, you can create your own class and use that in the is trait. Basically, that overrides the type with which the object will be created, replacing the default Hash (for % sigils) or Array (for @ sigils). As long as you provide the interface methods, it (currently) works. For example:
class Foo {
    method AT-KEY($) { 42 }
}
my %h is Foo;
say %h<a>; # 42
However, if you want to pass such an object as an argument to a sub with a % sigil in the signature, it will fail because the class did not consume the Associative role:
sub bar(%) { 666 }
say bar(%h);
===SORRY!=== Error while compiling -e
Calling bar(Foo) will never work with declared signature (%)
I'm not sure why the test for Associative (for the % sigil) and Positional (for @) is not enforced at compile time with the is trait. I would assume it was an oversight, maybe something to be fixed in 6.e.
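For what it's worth, a quick sketch of what that implies (I have not tested this beyond the basics): if the class does consume the Associative role, the signature check should be satisfied:
class Foo does Associative {
    method AT-KEY($) { 42 }
}
my %h is Foo;

sub bar(%) { 666 }
say bar(%h);  # 666, now that Foo does the Associative role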
Quoting the Parameters and arguments section of the S06 specification/speculation document about the related issue of binding arguments to routine parameters:
Array and hash parameters are simply bound "as is". (Conjectural: future versions ... may do static analysis and forbid assignments to array and hash parameters that can be caught by it. This will, however, only happen with the appropriate use declaration to opt in to that language version.)
Sure enough, the Rakudo compiler has implemented some rudimentary static analysis (in its AOT compilation optimization pass) that normally (but see footnote 3 in this SO answer) insists on binding @ routine parameters to values that do the Positional role and % ones to Associatives.
I think this has been the case since the first official Raku-supporting release of Rakudo, in 2016, but regardless, I'm pretty sure the "appropriate use declaration" is any language version declaration, including none. If your/our druthers are static typing for the win for @ and % sigils, and I think they are, then that's presumably very appropriate!
Another source is the IRC logs. A quick search got me nothing.
Hmm. Let's check the blame for the above verbiage so I can find when it was last updated and maybe spot contemporaneous IRC discussion. Oooh.
That is an extraordinary read.
"oversight" isn't the right word.
I don't have time tonight to search the IRC logs to see what led up to that commit, but I daresay it's interesting. The previous text was talking about a PL design I really liked the sound of in terms of immutability, such that code could become increasingly immutable by simply swapping out one kind of scalar container for another. Very nice! But reality is important, and Jonathan switched the verbiage to the implementation reality. The switch toward static typing certainty is welcome, but has it seriously harmed the performance and immutability options? I don't know. Time for me to go to sleep and head off for seasonal family visits. Happy holidays...

Using Pickle to serialise large objects - what causes 'Memory error'

I'm pickling a very large (both in terms of properties and in terms of raw size) class. I've been pickling it with no problem using pickle.dump, but now that it has hit just under 4GB I consistently get 'Memory Error'. I've also tried using json.dump (and I get an 'is not JSON serializable' error). I've also tried Hickle, but I get the same error with Hickle as I do with Pickle.
I can't post all the code here (it's very long), but in essence it's a class that holds a dictionary of values from another class, something like this:
class one:
    def __init__(self):
        self.somedict = {}

    def addItem(self, name, item):
        self.somedict[name] = item

class two:
    def __init__(self):
        self.values = [0]*100
Where name is a string and item is an instance of the class two object.
There's a lot more code to it, but this is where the vast majority of things are held. Is there a reliable and ideally fast solution to saving this object to a file and then being able to reload it at a later time? I save it every few thousand iterations (as a backup in case something goes wrong), so I need it to be reasonably quick.
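For concreteness, the save/reload cycle being described looks roughly like this (a sketch; the file name and protocol choice here are illustrative, not taken from the real code):
import pickle

def save_state(obj, path="checkpoint.pkl"):
    # Write the whole object graph to disk as a backup.
    with open(path, "wb") as f:
        pickle.dump(obj, f, protocol=pickle.HIGHEST_PROTOCOL)

def load_state(path="checkpoint.pkl"):
    # Reload the object graph saved by save_state.
    with open(path, "rb") as f:
        return pickle.load(f)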
Thanks!
Edit #1:
I've just thought that it might be useful to include some details on my system. I have 64GB of RAM, so I don't think pickling a 3-4GB file should cause this type of issue (although I could be wrong on this!).
You probably checked this one first, but just in case: did you make sure your Python installation is 64-bit? The 3-4GB immediately reminded me of the memory limit of 32-bit applications.
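A quick way to check that from within the interpreter (a small sketch):
import struct
import sys

# A 64-bit build prints 64 and True; a 32-bit build prints 32 and False.
print(struct.calcsize("P") * 8)
print(sys.maxsize > 2**32)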
I found this resource quite useful for analyzing and resolving some of the more common memory-related issues with Python.

What mechanism shows the component ID in chisel3 elaboration?

Chisel throws an exception with an elaboration error message. The following is an example produced by my code:
chisel3.core.Binding$ExpectedHardwareException: data to be connected 'chisel3.core.Bool#81' must be hardware, not a bare Chisel type. Perhaps you forgot to wrap it in Wire(_) or IO(_)?
This exception message is interesting because the 81 after chisel3.core.Bool# looks like an ID, not a hash code.
Indeed, the Data type extends the HasId trait, which has an _id field, and that _id field seems to hold a unique ID for each component.
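For illustration, a minimal sketch (not the actual Chisel source) of the counter-based ID generation that HasId appears to do:
// Hypothetical simplification: a shared builder hands every newly constructed
// component a unique, monotonically increasing _id.
object Builder {
  private var idCounter: Long = 0L
  def nextId(): Long = { idCounter += 1; idCounter }
}

trait HasId {
  val _id: Long = Builder.nextId()
}

class MyData extends HasId

object Demo extends App {
  println(new MyData()._id) // 1
  println(new MyData()._id) // 2
}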
I had thought the Data type overrides toString to build a string of the form type#ID, but it does not override it. That is why $node in the code below should not be able to use the ID.
throw Binding.ExpectedHardwareException(s"$prefix'$node' must be hardware, " +
  "not a bare Chisel type. Perhaps you forgot to wrap it in Wire(_) or IO(_)?")
Instead of toString, a toNamed method exists in Data. However, this method seems to be called to generate FIRRTL code, not to convert a component into a string.
Why can the Data type show its ID?
(If it is actually the hash code, not the ID, then this question comes from my misunderstanding.)
I think you should take a look at Chisel PR #985. It changes the way that Data's toString method is implemented. I'm not sure if it answers your question directly, but it's possible this will make the meaning and location of the error clearer. If not, you should comment on it.
Scala classes come with a default toString method that is of the form className#hashCode.
As you noted, the chisel3.core.Bool#81 sure looks like it's using the _id rather than the hashCode. That's because in the most recently published version of Chisel (3.1.6), the hashcode was the id! You can see this if you inspect the source files at the tag for that version: https://github.com/freechipsproject/chisel3/blob/dc4200f8b622e637ec170dc0728c7887a7dbc566/chiselFrontend/src/main/scala/chisel3/internal/Builder.scala#L81
This is no longer the case on master, which is probably the source of any confusion! As Chick noted, we have just changed the .toString method to be more informative than the default; expect more useful representations in 3.2.0!

Kotlin: how to simplify passing parameters to a base class constructor?

We have a package that we are looking to convert from Python to Kotlin, in order to then be able to migrate the systems using that package.
Within the package there is a set of classes that are all variants, or 'flavours', of a common base class.
Most of the code is in the base class, which has a significant number of optional parameters. So consider:
open class BaseTree(val height: Int = 10, val roots: Boolean = true, // ...... lots more!!

class FruitTree(val fruitSize: Int, height: Int = 10, roots: Boolean = true,
                // now need all possible parameters for any possible instance
) : BaseTree(height = height, roots = roots // ... yet another variation of the same list
The code is not actually about trees; I just thought this was a simple way to convey the idea. There are about 20 parameters to the base class, and around 10 subclasses, and each subclass effectively needs to repeat the same two variations of the parameter list from the base class. A real nightmare if the parameter list ever changes!
Those from a Java background may comment that "20 parameters is too many", but may miss that these are optional parameters, a language feature that impacts this aspect of the design. Twenty required parameters would be crazy, but 10 or even 20 optional parameters is not so uncommon; check sqlalchemy's Table for example.
In Python, to call a base class constructor you can have:
def __init__(self, special, *args, **kwargs):
    super().__init__(*args, **kwargs)  # pass all parameters except special to the base constructor
Does anyone know a technique, using a different method (perhaps using interfaces or something?) to avoid repeating this parameter list over and over for each subclass?
There is no design pattern to simplify this use case.
Best solution: refactor the code to use a more Java-like approach, using properties in place of optional parameters.
Use case explained: a widely used class or method having numerous optional parameters is simply not practical in Java, and Kotlin evolved largely as a way of making Java code better. A Python class with 5 optional parameters, translated to Java with no optional parameters, could need up to 2^5 = 32 different Java signatures (one overload per subset of options)... in other words, a mess.
Obviously no object should routinely be instantiated with a huge parameter list, so in Python such classes only evolve when the majority of calls do not need to specify these optional parameters, and the optional parameters are for the exceptional cases. The actual use case here is the implementation of a large number of optional parameters, where it should be very rare for any individual object to be instantiated using more than 3 of them. So a class with 10 optional parameters that is used 500 times in an application would still expect at most 3 of the optional parameters to ever be used in one instance. But this is simply a design approach that is not workable in Java, no matter how often the class is reused.
In Java, functions do not have optional parameters, which means this case, where an object is instantiated this way, simply could never happen in a Java library.
Consider an object with one mandatory instance parameter and five possible options. In Java these options would each be properties able to be set by setters; objects would then be instantiated, and the setter(s) called for any option that needs a value other than its default.
The downside is that these options cannot be set from the constructor and remain immutable, but the resultant code avoids the long optional parameter lists.
Another approach is to have a group of less 'swiss army knife' objects, with a set of specialised tools replacing the one do-it-all tool, even when the code could be seen as just slightly different nuances of the same theme.
Despite Kotlin's support for optional parameters, its inheritance structure is not yet optimised for heavier use of this feature.
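As a rough sketch of the properties-based approach described above (the class and property names are illustrative, not from the real code base):
// Options become mutable properties with defaults; the constructor keeps only
// the mandatory parameters, so subclasses no longer repeat the long list.
open class BaseTree(val name: String) {
    var height: Int = 10
    var roots: Boolean = true
    // ... further options as properties ...
}

class FruitTree(name: String, val fruitSize: Int) : BaseTree(name)

fun main() {
    val tree = FruitTree("apple", fruitSize = 5).apply {
        height = 15  // override only the options that matter for this instance
    }
    println("${tree.height} ${tree.roots} ${tree.fruitSize}")
}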
You can skip the names, as in BaseTree(height, roots), by putting the variables in order, but you cannot do what Python does, because Python is a dynamic language.
It is normal that in Java you have to pass the variables to the super class too:
FruitTree(int fruitSize, int height, boolean root) {
    super(height, root);
}
There are about 20 parameters to the base class, and around 10 subclasses
This is most likely a problem with your class design.
Reading your question I started to experiment myself and this is what I came up with:
interface TreeProperties {
    val height: Int
    val roots: Boolean
}

interface FruitTreeProperties : TreeProperties {
    val fruitSize: Int
}

fun treeProps(height: Int = 10, roots: Boolean = true) = object : TreeProperties {
    override val height = height
    override val roots = roots
}

fun TreeProperties.toFruitProperty(fruitSize: Int): FruitTreeProperties =
    object : FruitTreeProperties, TreeProperties by this {
        override val fruitSize = fruitSize
    }

open class BaseTree(val props: TreeProperties)

open class FruitTree(props: FruitTreeProperties) : BaseTree(props)

fun main(args: Array<String>) {
    val largeTree = FruitTree(treeProps(height = 15).toFruitProperty(fruitSize = 5))
    val rootlessTree = BaseTree(treeProps(roots = false))
}
Basically I define the parameters in an interface and extend the interface for sub-classes using the delegate pattern. For convenience I added functions to generate instances of those interface which also use default parameters.
I think this achieves the goal of avoiding repeated parameter lists quite nicely, but it also has its own overhead. I'm not sure if it is worth it.
If your subclass really has that many parameters in the constructor, there is no way around it: you need to pass them all.
But (mostly) it's not a good sign when a constructor/function has that many parameters...
You are not alone in this. It is already being discussed on the gradle-slack channel. Maybe in the future we will get compiler help with this, but for now you need to pass the arguments yourself.

Is there a way to pseudo-subclass Strings, Numbers, uint, ints, or other 'final' primitives in ActionScript 3 using the Proxy class?

It seems like there might be a way, but I'm not seeing it. I have, in the past, used the valueOf() and toString() methods on Object to make custom objects behave as numbers or strings based on context, but I'd like to do more.
Basically no. Final is final so they cannot be extended. You could make a class which has all the same methods as the Number class, but it still wouldn't BE a Number as far as the compiler is concerned.
To be honest there should never be a reason that you should need to extend from these classes.
As far as proxies go, you could consider making a factory class which returns a pre-formatted string/number, e.g.:
var myString:String = StringFactory.asCurrency("50"); // "$50.00"
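A possible sketch of such a factory (StringFactory and asCurrency are illustrative names from the snippet above, not an existing API):
package {
    public class StringFactory {
        // Formats a numeric string as a currency string, e.g. "50" -> "$50.00".
        public static function asCurrency(value:String):String {
            var amount:Number = Number(value);
            return "$" + amount.toFixed(2);
        }
    }
}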
As already stated by groady, this is not possible, not even in the scenarios you described. The thing is, at runtime the type detection mechanism is pretty simple: look up the traits object and check whether it matches a class/subclass, or whether it explicitly implements an interface; in any other case, you will get errors. You can use proxies to implement your own array access; however, they will not be Arrays, so passing them to a function that expects Array will cause errors. Also, in AS3 you cannot overload operators, so you will really have a hard time. You could create a class for numeric values, but then manipulating it would require methods such as add, subtract, etc. There is, however, a related request on JIRA. Still, this will not solve your problem entirely, because you cannot control the way an object responds to operators. If you compile in ECMA compatibility mode, you probably will be able to bypass the strict runtime type checks, but on the other hand you will lose a lot of speed. The best thing probably really is creating a class that has to be manipulated through methods instead of operators. Not too comfortable, but the best AS3 offers.
greetz
back2dos