Encoding/decoding shapeless records with circe

Upgrading circe from 0.4.1 to 0.7.0 broke the following code:
import shapeless._
import syntax.singleton._
import io.circe.generic.auto._
.run[Record.`'transaction_id -> Int`.T](transport)
def run[A](transport: Json => Future[Json])(implicit decoder: Decoder[A], exec: ExecutionContext): Future[A]
With the following error:
could not find implicit value for parameter decoder: io.circe.Decoder[shapeless.::[Int with shapeless.labelled.KeyTag[Symbol with shapeless.tag.Tagged[String("transaction_id")],Int],shapeless.HNil]]
[error] .run[Record.`'transaction_id -> Int`.T](transport)
[error] ^
Am I missing some import here or are these encoders/decoders not available in circe anymore?

Instances for Shapeless's hlists, records, etc. were moved to a separate circe-shapes module in the circe 0.6.0 release. If you add this module to your build, the following should just work:
import io.circe.jawn.decode, io.circe.shapes._
import shapeless._, record.Record, syntax.singleton._
val doc = """{ "transaction_id": 1 }"""
val res = decode[Record.`'transaction_id -> Int`.T](doc)
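For reference, adding the module to an sbt build would look something like this (the version below is an assumption; match it to the circe version you are already using):
// build.sbt (assuming circe 0.7.0; use the same version as your other circe modules)
libraryDependencies += "io.circe" %% "circe-shapes" % "0.7.0"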
The motivation for moving these instances was that the improved generic derivation introduced in 0.6 meant that they were no longer necessary, and keeping them out of implicit scope when they're not needed is both cleaner and potentially supports faster compile times. The new circe-shapes module also includes features that were not available in circe-generic, such as instances for coproducts.
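As a quick illustration of the coproduct instances, something along these lines should work with circe-shapes on the classpath (a sketch from memory, not tested output):
import io.circe.jawn.decode, io.circe.shapes._
import shapeless._

type IntOrString = Int :+: String :+: CNil
decode[IntOrString]("1")       // expected to land in the Int branch, e.g. Inl(1)
decode[IntOrString]("\"hi\"")  // expected to fall through to the String branch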

Related

Debugging module internals in Chisel

I have a complex module written in Chisel. I'm using chiseltest to verify its operation. The test is failing. I want to be able to inspect the module's internal wire values to debug what is going wrong. Since the PeekPokeTester only allows me to inspect the value of the io signals, how can I inspect the internal wires?
Here is an example:
import chisel3._
class MyModule extends Module {
  val io = IO(new Bundle {
    val a = Input(Bool())
    val b = Input(Bool())
    val c = Input(Bool())
    val d = Output(Bool())
  })
  val i = Wire(Bool())
  i := io.a ^ io.b
  io.d := i | io.c
}
import chisel3._
import chisel3.tester._
import org.scalatest.FreeSpec
class MyModuleTest extends FreeSpec with ChiselScalatestTester {
  "MyModule should work properly" in {
    test(new MyModule) { dut =>
      dut.io.a.poke(true.B)
      dut.io.b.poke(false.B)
      dut.io.c.poke(false.B)
      dut.i.expect(true.B) // This line throws a java.util.NoSuchElementException
                           // : key not found: Bool(Wire in MyModule)
    }
  }
}
How can I inspect the intermediate value "i"?
There are a few ways to do this.
1) Turn on VCD output by adding an annotation to your test, as in
import chiseltest.experimental.TestOptionBuilder._
import treadle._
...
test(new MyModule).withAnnotations(Seq(WriteVcdAnnotation)) { dut =>
The .vcd file will be placed in the relevant test_run_dir/; you can view it with GTKWave or a similar waveform viewer.
2) Add printf statements to your module (see the sketch after this list).
3) There is a simulation shell in the Treadle repo that allows you to peek, poke, and step based on a FIRRTL file (the FIRRTL file should be in the same test_run_dir/ directory as above). There is a bit of documentation here.
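A minimal sketch putting options 1 and 2 together (MyModuleWithPrintf and the test name are illustrative, and the imports assume the same chiseltest/treadle setup as in the question):
import chisel3._
import chisel3.tester._
import chiseltest.experimental.TestOptionBuilder._
import treadle._
import org.scalatest.FreeSpec

// Option 2: a printf inside the module prints the intermediate wire every clock cycle.
class MyModuleWithPrintf extends Module {
  val io = IO(new Bundle {
    val a = Input(Bool())
    val b = Input(Bool())
    val c = Input(Bool())
    val d = Output(Bool())
  })
  val i = Wire(Bool())
  i := io.a ^ io.b
  printf(p"i = $i (a = ${io.a}, b = ${io.b})\n")
  io.d := i | io.c
}

// Option 1: WriteVcdAnnotation makes the simulator dump a .vcd containing all internal signals.
class MyModuleWaveformTest extends FreeSpec with ChiselScalatestTester {
  "MyModule should dump a waveform" in {
    test(new MyModuleWithPrintf).withAnnotations(Seq(WriteVcdAnnotation)) { dut =>
      dut.io.a.poke(true.B)
      dut.io.b.poke(false.B)
      dut.io.c.poke(false.B)
      dut.clock.step()        // advance one cycle so the printf fires and the VCD records the values
      dut.io.d.expect(true.B) // assert only on io; inspect i in the printf output or the VCD
    }
  }
}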
Good luck!

Parsing JSON into classes with Option[T] arg types in spark-shell with Scala

I'm having trouble parsing JSON with json4s.jackson in spark-shell. The same thing works fine in sbt repl.
I wonder if there's a workaround for the version of Spark I'm using.
spark-shell v1.6, Scala v2.10.5
sbt repl, Scala v2.11.8
The following example demonstrates the problem. The sbt repl works as expected for all examples; spark-shell chokes and gives an error for val c. What's weird is that it seems to choke on Option[Int] or Option[Double] but works fine for Option[A] where A is a class.
import org.json4s.JsonDSL._
import org.json4s.jackson.JsonMethods.{render,compact,pretty}
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods._
import org.json4s.{JValue, JObject}
implicit val formats = DefaultFormats
class A(val a: Int, val aa: Int)
class B(val b: Int, val optA: Option[A]=None)
class C(val b: Int, val bb: Option[Int]=None)
val jb_optA_nested: JObject = ("b" -> 5) ~ ("optA" -> ("a" -> 999) ~ ("aa" -> 1000))
val jb_optA_not_nested: JObject = ("b" -> 5) ~ ("a" -> 999) ~ ("aa" -> 1000)
val jb_optA_none: JObject = ("b" -> 5)
val jc: JObject = ("b" -> 5) ~ ("bb" -> 100)
val b_nested = jb_optA_nested.extract[B] // works as expected in both (optA=Some(A(999,1000)))
val b_not_nested = jb_optA_not_nested.extract[B] // works as expected in both (optA=None)
val b_none = jb_optA_none.extract[B] // works as expected in both (optA=None)
val c = jc.extract[C] // error in spark-shell; works fine in sbt repl
The error generated is: org.json4s.package$MappingException: Can't find constructor for C
The only real difference I can find (other than Scala versions) is that in spark-shell it chokes on Option[native type] and seems to work on Option[user-defined class]. But maybe that's a coincidence.
In posts like this one (JSON4s can't find constructor w/Spark) I see comments where people suggest the class structure doesn't match the JSON, but to me class C and val jc look identical.
Also of note: the error persists when I compile the class defs and functions into a .jar and import them into spark-shell from the jar instead of defining them in the REPL. Sometimes that matters for Spark 1.6, but it doesn't seem to here.
Have you tried:
class C(val b: Int, val bb: Option[java.lang.Integer]=None)
I've had issues with Scala's Int and json4s before, although I can't recall exactly what the problem was.
Also, a case class is worth trying here with the Int. Any reason you prefer a regular class? I don't see any vars.
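To make that concrete, here is a sketch of both suggestions against the jc value from the question (class names are illustrative and this is untested against the spark-shell 1.6 setup; the boxed-Integer variant is exactly what was suggested above):
// Workaround 1: box the Int so the reflection-based extraction sees java.lang.Integer
class C1(val b: Int, val bb: Option[java.lang.Integer] = None)
// Workaround 2: use a case class instead of a regular class
case class C2(b: Int, bb: Option[Int] = None)

val c1 = jc.extract[C1]
val c2 = jc.extract[C2]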

Scala/Spark: NoClassDefFoundError: net/liftweb/json/Formats

I am trying to create a JSON String from a Scala Object as described here.
I have the following code:
import scala.collection.mutable._
import net.liftweb.json._
import net.liftweb.json.Serialization.write
case class Person(name: String, address: Address)
case class Address(city: String, state: String)
object LiftJsonTest extends App {
  val p = Person("Alvin Alexander", Address("Talkeetna", "AK"))
  // create a JSON string from the Person, then print it
  implicit val formats = DefaultFormats
  val jsonString = write(p)
  println(jsonString)
}
My build.sbt file contains the following:
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5+"
When I build with sbt package, it is a success.
However, when I try to run it as a Spark job, like this:
spark-submit \
--packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json:2.5+ \
--class "com.foo.MyClass" \
--master local[4] \
target/scala-2.10/my-app_2.10-0.0.1.jar
I get this error:
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: net.liftweb#lift-json;2.5+: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
What am I doing wrong here? Is net.liftweb:lift-json:2.5+ in my packages argument incorrect? Do I need to add a resolver in build.sbt?
From the Spark documentation: "Users may also include any other dependencies by supplying a comma-delimited list of Maven coordinates with --packages."
2.5+ in your build.sbt is Ivy version-matcher syntax, not an actual artifact version, which is what Maven coordinates need. spark-submit apparently doesn't use Ivy for resolution (and I think it would be surprising if it did; your application could suddenly stop working because a new dependency version was published). So you need to find out what 2.5+ resolves to in your case, e.g. using https://github.com/jrudolph/sbt-dependency-graph (or by looking for it in show dependencyClasspath).
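For instance, a minimal sketch of pinning a concrete version (2.5.1 here is only an assumption; substitute whatever your build actually resolves 2.5+ to):
// build.sbt — pin the concrete version instead of the Ivy matcher
libraryDependencies += "net.liftweb" %% "lift-json" % "2.5.1"

// and on the command line, pass the fully qualified Maven coordinate
// (note the Scala-suffixed artifact name):
//   --packages net.liftweb:lift-json_2.10:2.5.1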

Play JSON Reads/Writes with single-parameter case classes

This creates a Writes for a case class
import play.api.libs.json._
import play.api.libs.functional.syntax._
case class A(a: String, b: String, c: String)
(JsPath.write[String] and
  JsPath.write[String] and
  JsPath.write[String])(unlift(A.unapply))
This can be extended to work for 2, 3, 4, 5, 6, etc. parameters...but not 1.
case class B(a: String)
(JsPath.write[String])(unlift(B.unapply))
Compiler error:
error: overloaded method value write with alternatives:
(t: String)(implicit w: play.api.libs.json.Writes[String])play.api.libs.json.OWrites[play.api.libs.json.JsValue] <and>
(implicit w: play.api.libs.json.Writes[String])play.api.libs.json.OWrites[String]
cannot be applied to (B => String)
(JsPath.write[String])(unlift(B.unapply))
^
A similar problem happens for Reads.
How can I get Reads and Writes for single-parameter case classes?
Like Travis said:
Transforming an existing Reads: use the map method
Transforming an existing Writes: use contramap
However, contramap only works on Writes that produce JsObject. Your writes will fail at runtime:
val w = JsPath.write[String].contramap[B](_.a)
scala> w.writes(B("Hello"))
java.lang.RuntimeException: when empty JsPath, expecting JsObject
You can create a Writes "from scratch" using Writes.apply:
Writes[B](b => JsString(b.a))
Similarly you can create a Reads using Reads.apply.
implicit val reads: Reads[B] =
  Reads[B](js => js.validate[String].map(B.apply))
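Putting it together, here is a small self-contained sketch (names like bWrites are illustrative). Note that with a non-empty path, contramap and map also work, because the resulting Writes produces a JsObject:
import play.api.libs.json._

case class B(a: String)

// Path-based: produces {"a": "..."} and avoids the empty-JsPath runtime error
val bWrites = (JsPath \ "a").write[String].contramap[B](_.a)
val bReads: Reads[B] = (JsPath \ "a").read[String].map(B.apply)

// "From scratch": serializes B as a bare JSON string
val bWritesRaw: Writes[B] = Writes[B](b => JsString(b.a))
val bReadsRaw: Reads[B] = Reads[B](js => js.validate[String].map(B.apply))

bWrites.writes(B("Hello"))    // {"a":"Hello"}
bWritesRaw.writes(B("Hello")) // "Hello"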

Why does this json4s code work in the Scala REPL but fail to compile?

I'm converting a JSON-like string into JSON, and the following code works in the Scala REPL:
import org.json4s._
import org.json4s.JsonDSL._
import org.json4s.JsonDSL.WithDouble._
import org.json4s.native.JsonMethods._
val value = "{100:1.50;500:1.00;1000:0.50}"
val data = value.stripPrefix("{").stripSuffix("}").split(";").map(a => {
  val b = a.split(":")
  (b(0), b(1))
}).toMap
compact(render(data))
But when it is compiled, I get the following error:
[error] ... type mismatch;
[error] found : scala.collection.immutable.Map[String,String]
[error] required: org.json4s.JValue
[error] (which expands to) org.json4s.JsonAST.JValue
[error] compact(render(data))
[error] ^
Why is this, and how might I fix it?
I suspect something with the type system that is over my head.
render() is imported from JsonMethods._ and it requires a JValue. You have imported the implicit map2jvalue twice, once via import org.json4s.JsonDSL._ and once via import org.json4s.JsonDSL.WithDouble._.
I suspect the compiler didn't apply the implicit because of that ambiguity, so try to be more selective with your imports: the third import (the one with JsonDSL.WithDouble._) seems redundant.
It can also help to run scalac with -Xlog-implicits to see why implicits are not applied.
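For example, following that suggestion, the imports might look like this (a sketch; the point is to keep exactly one of the two JsonDSL imports so map2jvalue is unambiguous):
import org.json4s._
import org.json4s.JsonDSL._            // only one DSL import, so map2jvalue is unambiguous
import org.json4s.native.JsonMethods._

val value = "{100:1.50;500:1.00;1000:0.50}"
val data = value.stripPrefix("{").stripSuffix("}").split(";").map(a => {
  val b = a.split(":")
  (b(0), b(1))
}).toMap

compact(render(data)) // the Map is now implicitly converted to a JValue and this compiles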