Scala JSON Rapture API throws exception - json

I am trying to use the rapture.io Scala JSON parser to parse a JSON value (rows) that looks like this:
{
  "rows": [
    [
      null,
      "2016-11-16T15:43:18.000Z",
      { "p": 1, "q": 2 },
      null,
      "Game highlights"
    ],
    [
      null,
      "2007-10-09T01:52:29.000Z",
      { "p": 21, "q": 99 },
      "blaah",
      "Game reviews"
    ]
  ]
}
My code looks like this:
import rapture.io._
import rapture.codec._
import rapture.json._
import rapture.data._
import rapture.uri._
import rapture.net._
import encodings.system
import jsonBackends.jawn._
class NotesDownloader() {

  def download(): Unit = {
    val src = uri"https://some_url".slurp[Char]
    val jsonResponse = Json.parse(src)
    val rows = jsonResponse.data.rows
    val rowsBean = rows(0).as[Array[Member]]
    println(jsonResponse)
  }

  case class Member(array: Array[Some[String]])
}
When I try to extract the complete data into Member, I get this exception:
Error:(40, 30) not enough arguments for method as:
(implicit ext: rapture.data.Extractor[Array[NotesDownloader.this.Member],rapture.json.Json],
 implicit mode: rapture.core.Mode[rapture.data.ExtractionMethods])mode.Wrap[Array[NotesDownloader.this.Member],rapture.data.DataGetException].
Unspecified value parameters ext, mode.
    val rowsBean = value.as[Array[Member]]
What am I missing?

As far as I know, you can use something like this:
Json.parse(str).as[List[Member]]
since it's a list, not a simple array.

The error message tells you that you need values for the implicit parameters ext and mode. It would be something like this:
implicit val ext = ...
implicit val mode = ...
val rowsBean = rows(0).as[Array[Member]] // this uses the above implicits
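That said, Rapture normally derives extractors for case classes automatically, so the missing ext most likely means no extractor could be derived for Member in the first place: Array[Some[String]] is unlikely to be an extractable field type (Option[String] would be the usual choice), and each row is a heterogeneous array rather than an array of uniform objects. A minimal sketch of pulling a row apart element by element instead; the Score case class is a hypothetical name for the {"p": ..., "q": ...} objects, not something from the original post:

// Sketch: extract each element of the heterogeneous row individually.
case class Score(p: Int, q: Int) // hypothetical shape for {"p": 1, "q": 2}

val row = rows(0)                 // index access on the parsed JSON array
val timestamp = row(1).as[String] // "2016-11-16T15:43:18.000Z"
val score = row(2).as[Score]      // Score(1, 2)
val title = row(4).as[String]     // "Game highlights"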

Related

Scala JSON Parsing Nested Array into Case Class

I'm using the standard Scala JSON parsing (scala.util.parsing.json) and I am running into issues with nested arrays.
Here is sample JSON:
{
  "_id": "id_here",
  "_rev": "rev_here",
  "RandomHosts": [
    "randomhosts_1",
    "randomhosts_2",
    "randomhosts_3",
    "randomhosts_4"
  ],
  "FirstObject": {
    "Host": "ActualHost",
    "Port": 8888,
    "DB": 0,
    "WFMDB": 1,
    "ETLDB": 2,
    "HostListPrefix": "Dev1",
    "ExtraHostsBands": [
      {
        "Host": "dev2",
        "Port": 2222,
        "DB": 0,
        "WFMDB": 1,
        "ETLDB": 2,
        "HostListPrefix": "Dev2"
      },
      {
        "Host": "dev3",
        "Port": 3333,
        "DB": 0,
        "WFMDB": 1,
        "ETLDB": 3,
        "HostListPrefix": "Dev3"
      }
    ],
    "RandomObject": {}
  }
}
// I HAVE OTHER IMPORTS AS WELL
import scala.util.parsing.json._

case class BandClass(
  Host: String,
  Port: Int,
  DB: Int,
  WFMDB: Int,
  ETLDB: Int,
  HostListPrefix: String
)

var jsonString = "<myjson>"
val bandconfig = JSON.parseFull(jsonString)
// THIS WORKS PERFECTLY AND GIVES ME A JOINED STRING
val random_hosts = bandconfig.get.asInstanceOf[Map[String, Any]]("RandomHosts").asInstanceOf[List[String]].mkString(",")
// THIS ALSO WORKS PERFECTLY AND GIVES ME HOST AND PORT
val firstObject = bandconfig.get.asInstanceOf[Map[String, Any]]("FirstObject").asInstanceOf[Map[String, Any]]
val firstObjectHost = firstObject("Host").asInstanceOf[String]
val firstObjectPort = firstObject("Port").asInstanceOf[Double].toInt
// THIS IS WHERE EVERYTHING FALLS APART
val extraBands = firstObject("ExtraHostsBands").asInstanceOf[List[BandClass]]
// EVEN THIS DOESN'T WORK
val extraBands2 = firstObject("ExtraHostsBands").asInstanceOf[Map[String, Any]]
Caused by: java.lang.ClassCastException: scala.collection.immutable.HashMap$HashTrieMap cannot be cast to $BandClass
I'm not sure how to force that nested JSON array into my case class. I'd even settle for a map or seq or anything I could iterate over to get the host/ports out of the ExtraHostsBands JSON objects.
Can anyone point me in the right direction to get that JSON array into a case class? I also have access to play-json but can't seem to figure that out either.
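For what it's worth, the cast fails because JSON.parseFull only ever produces Maps, Lists, Strings, Doubles and Booleans, so the entries of that list are really Map[String, Any] values, not BandClass instances, which is exactly what the ClassCastException reports. A manual conversion sketch against the field names in the JSON above (numbers come back as Double, hence the .toInt calls):

val extraBands: List[BandClass] =
  firstObject("ExtraHostsBands")
    .asInstanceOf[List[Map[String, Any]]]
    .map { m =>
      BandClass(
        m("Host").asInstanceOf[String],
        m("Port").asInstanceOf[Double].toInt,
        m("DB").asInstanceOf[Double].toInt,
        m("WFMDB").asInstanceOf[Double].toInt,
        m("ETLDB").asInstanceOf[Double].toInt,
        m("HostListPrefix").asInstanceOf[String]
      )
    }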
I ended up going with play-json and it worked really well. Here is the solution in case someone needs it in the future:
import play.api.libs.json._
import play.api.libs.functional.syntax._
case class BandClass(
  Host: String,
  Port: Int,
  DB: Int,
  WFMDB: Int,
  ETLDB: Int,
  HostListPrefix: String
)

case class FirstObject(
  Host: String,
  Port: Int,
  DB: Int,
  WFMDB: Int,
  ETLDB: Int,
  HostListPrefix: String,
  ExtraHostsBands: List[BandClass]
)

case class RawConfig(
  _id: String,
  _rev: String,
  RandomHosts: List[String],
  FirstObject: FirstObject
)

implicit val bandClassFormat = Json.format[BandClass]
implicit val firstObjectFormat = Json.format[FirstObject]
implicit val rawConfigFormat = Json.format[RawConfig]

val playJsonParse = Json.parse(<myjson>).as[RawConfig]
println("playJSON ID " + playJsonParse._id)
println("playJSON REV " + playJsonParse._rev)
playJsonParse.FirstObject.ExtraHostsBands.foreach { r =>
  println("Host " + r.Host)
  println("Host Prefix " + r.HostListPrefix)
  println("ETLDB " + r.ETLDB)
}
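One note on the design choice: .as[RawConfig] throws a JsResultException if the input doesn't match. When that matters, validate reports the errors instead; a small sketch, assuming the same implicit formats are in scope:

// Safer variant: validate returns JsSuccess or JsError instead of throwing.
Json.parse(jsonString).validate[RawConfig] match {
  case JsSuccess(config, _) => println("Parsed host " + config.FirstObject.Host)
  case JsError(errors)      => println("Parse failed: " + errors)
}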

Using Json format in Confluent Platform Schema Registry?

I'm trying to put several event types in the same Kafka topic using the JSON format, but in the Producer implementation I always get org.apache.kafka.common.errors.SerializationException: Error serializing JSON message. It seems the @Schema annotation isn't working as expected: it is as if the schema defined by the annotation isn't enriched properly, and in the method that validates backward compatibility the schema defined by my event has an empty schemaObj, so the result is "not compatible" and it fails.
My event:
@Schema(
  value = "1",
  refs = Array(new SchemaReference(name = "event", subject = "event"))
)
case class Event(@BeanProperty id: String,
                 @BeanProperty name: String)
Producer:
def send(): Unit = {
  val props = new Properties() {
    put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
    put(
      ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringSerializer"
    )
    put(
      ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
      "io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer"
    )
    put("auto.register.schemas", "false")
    put("use.latest.version", "true")
    put("schema.registry.url", "http://127.0.0.1:8081")
    put("json.fail.invalid.schema", "true")
  }
  val producer = new KafkaProducer[String, Event](props)
  val topic = "all-json"
  val key = "key1"
  val event = Event("id", "name")
  val record = new ProducerRecord[String, Event](topic, key, event)
  producer.send(record).get
}
From the command line, I can produce the events perfectly. The JSON Schema is modeled as:
{
  "oneOf": [
    { "$ref": "Event.schema.json" },
    { "$ref": "EventB.schema.json" }
  ]
}
...
The Confluent dependencies used are version 6.0.1.
Do you know what the issue is?
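One thing worth checking, offered as a hedged guess rather than a confirmed fix: in Confluent's schema annotations the value parameter is meant to carry the complete JSON Schema document as a string, not a bare "1", which could explain the empty schemaObj during compatibility validation. A sketch of the shape it might take, where the $id, subject and version values are assumptions for illustration:

// Sketch only: the annotation's value holds the complete JSON Schema string.
@Schema(
  value = """{
    "$id": "Event.schema.json",
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
      "id":   { "type": "string" },
      "name": { "type": "string" }
    }
  }""",
  refs = Array(new SchemaReference(name = "Event.schema.json", subject = "event", version = 1))
)
case class Event(@BeanProperty id: String, @BeanProperty name: String)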

Encoding all subtypes with prepopulated field with Circe

I'd like to be able to add a field to certain case classes when they are encoded in JSON via circe.
e.g.
trait OntologyType {
  val ontologyType: String = this.getClass.getSimpleName
}

case class Thing(i: Int) extends OntologyType
val thing = Thing(23)
println(thing.asJson) // 1

case class Thingy(s: String, i: Int) extends OntologyType
val thingy = Thingy("Hi there", 23)
println(thingy.asJson) // 2
I'd like to find a way for the above to return
{ "i": 23, "type": "Thing" } // 1
{ "s": "Hi there", "i": 23, "type": "Thingy" } // 2
The closest I have got is making all OntologyTypes render their type, but I need to somehow mix in the standard case class encoding too:
implicit def encodeUser[T <: OntologyType]: Encoder[T] =
  Encoder.forProduct1("type")(u => u.ontologyType)
Try
implicit def encodeUser[T <: OntologyType](implicit enc: DerivedObjectEncoder[T]): Encoder[T] =
u => enc.encodeObject(u).add("type", u.ontologyType.asJson).asJson
Imports:
import io.circe._
import io.circe.generic.encoding.DerivedObjectEncoder
import io.circe.syntax._
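A quick usage check of the encoder above, assuming circe-generic is on the classpath so that DerivedObjectEncoder instances can be derived:

// The derived object encoding is produced first, then "type" is added to it.
println(Thing(23).asJson.noSpaces)              // {"i":23,"type":"Thing"}
println(Thingy("Hi there", 23).asJson.noSpaces) // {"s":"Hi there","i":23,"type":"Thingy"}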

Type mismatch: expected :: [String, HNil] => op_rabbit.Handler, actual: String => op_rabbit.Handler

I took the following code from here in order to consume messages from RabbitMQ. Once a new message is consumed, I want to execute val init = new Initiator() in order to run some calculations on this message.
In the original code the messages are parsed as follows:
package io.ticofab.scalarabbitmqexample.model
import play.api.libs.json.Json
/**
 * This is the kind of object that our listener expects in JSON format from the queue. So like
 * {
 *   "name" : "xxxx",
 *   "version" : 1234
 * }
 *
 * @param name An arbitrary String
 * @param version An arbitrary Int
 */
case class MyObject(name: String, version: Int)

object MyObject {
  implicit def myObjectFormat = Json.format[MyObject]
}
In my case the parsing is complicated and I don't want to parse the JSON string into MyObject. What I want is to run body(as[String]) instead of body(as[MyObject]) (see the code below). Then I'll do the parsing inside init.run(obj) using import spray.json._.
So my goal is to convert obj into a String. But if I do so, I get the following compilation error:
Type mismatch: expected :: [String, HNil] => op_rabbit.Handler,
actual: String => op_rabbit.Handler.
package org.test.akka_actors

import akka.actor.{Actor, ActorLogging, Props}
import com.spingo.op_rabbit.Directives._
import com.spingo.op_rabbit.{RabbitControl, Subscription, SubscriptionRef}
import com.typesafe.config.ConfigFactory
import org.test.Initiator
import org.test.akka_actors.QueueListener.{Listen, Terminate}
import org.test.query.QueryObject
import scala.concurrent.ExecutionContext.Implicits.global

object QueueListener {
  case object Listen
  case object Terminate
  def props = Props(new QueueListener)
}

class QueueListener extends Actor with ActorLogging {
  // read info from configuration
  val conf = ConfigFactory.load()
  val QUEUE = conf.getString("op-rabbit.rabbit-queue")
  // instantiate a rabbit mq controller
  val RABBIT_CONTROL = context.actorOf(Props[RabbitControl])
  // references to the queue subscriptions
  var queueSubscription: Option[SubscriptionRef] = None

  override def receive: Receive = {
    case Listen =>
      // initialize a queue subscription
      queueSubscription = Some(
        Subscription.run(RABBIT_CONTROL) {
          channel(qos = 3) {
            consume(queue(QUEUE)) {
              body(as[String]) { obj =>
                log.debug(s"received the query $obj")
                val init = new Initiator()
                init.run(obj)
                ack
              }
            }
          }
        }
      )
    case Terminate =>
      // close the subscription
      queueSubscription.foreach(_.close())
  }
}

JSON output with Groovy

I have been experimenting with the Groovy JsonBuilder, as you can see below, trying out different ways to build JSON objects and arrays. After things started to make sense, I tried expanding to what is shown below. The question I have is: why does "content" show up in the pretty-printed JSON output? I actually have another JSON object displaying this.class information in JSON string outputs as well.
Any ideas? I'm new to this, so it could definitely be an obvious one.
def tt = ["test", "test1"]
def jjj = "jason"
def js3 = new groovy.json.JsonBuilder()
def js2 = new groovy.json.JsonBuilder(tt);
js3 hello: "$jjj", "$jjj": tt
def js4 = new groovy.json.JsonBuilder()
def result = js4([sdn: js3, openflow: js2, type: 3])
println js4.toPrettyString();
outputs:
{
  "sdn": {
    "content": {
      "hello": "jason",
      "jason": [
        "test",
        "test1"
      ]
    }
  },
  "openflow": {
    "content": [
      "test",
      "test1"
    ]
  },
  "type": 3
}
The problem can be restated as...
why does this:
import groovy.json.*
def js3 = new JsonBuilder(["test", "test1"])
def js4 = new JsonBuilder(js3)
println js4.toString()
print:
{"content":["test","test1"]}
and this:
import groovy.json.*
def js3 = new JsonBuilder(["test", "test1"])
def js4 = new JsonBuilder(js3.content)
println js4.toString()
prints this (?) :
["test","test1"]
The short answer is that JsonBuilder has a member named content, which represents the payload. When one JsonBuilder absorbs another, we want to replace the payload, and not nest it. This line is the way to replace the payload:
def js4 = new JsonBuilder(js3.content)
Ultimately, this stems from the fact that JsonBuilder.toString() (code here) calls JsonOutput.toJson(object) (code here).
An exercise for the reader is to experiment with:
class MyBuilder {
def content
}
def myB = new MyBuilder(content: ["test", "test1"])
println JsonOutput.toJson(myB)
println JsonOutput.toJson(myB.content)
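If you run it, the first println should emit {"content":["test","test1"]} (JsonOutput.toJson serializes the object's properties), while the second emits just ["test","test1"], mirroring the two JsonBuilder cases above.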