Hocon: Read an array of objects from a configuration file

I have created a Play application (2.1) which uses the configuration in conf/application.conf in the HOCON format.
I want to add an array of projects to the configuration. The file conf/application.conf looks like this:
...
projects = [
  {name: "SO", url: "http://stackoverflow.com/"},
  {name: "google", url: "http://google.com"}
]
I am trying to read this configuration in my Scala project:
import scala.collection.JavaConversions._

case class Project(name: String, url: String)

val projectList: List[Project] =
  Play.maybeApplication.map { x =>
    val simpleConfig = x.configuration.getObjectList("projects").map { y =>
      y.toList.map { z =>
        Project(z.get("name").toString, z.get("url").toString) // ?!? doesn't work
        ...
      }}}}}}}} // *arg*
This approach seems very complicated; I get lost in a lot of Options, and my Eclipse IDE cannot give me any hints about the classes.
Does anybody have an example of how to read an array of objects from a HOCON configuration file?
Or should I use a JSON file with a JSON parser instead of HOCON?

The following works for me in Play 2.1.2 (though I don't have a .maybeApplication on my play.Play object, and I'm not sure why you do):
import play.Play
import scala.collection.JavaConversions._

case class Project(name: String, url: String)

val projectList: List[Project] = {
  val projs = Play.application.configuration.getConfigList("projects") map { p =>
    Project(p.getString("name"), p.getString("url"))
  }
  projs.toList
}

println(projectList)
Giving output:
List(Project(SO,http://stackoverflow.com/), Project(google,http://google.com))
There's not a whole lot different, although I don't get lost in a whole lot of Option instances either (again, different from the API you seem to have).
More importantly, getConfigList seems to be a closer match for what you want to do, since it returns List[play.Configuration], which enables you to specify types on retrieval instead of resorting to casts or .toString() calls.

What are you trying to accomplish with the y.toList.map { z => part? If you want a collection of Project as the result, why not just do:
val simpleConfig = x.configuration.getObjectList("projects").map { y =>
  Project(y.get("name").toString, y.get("url").toString)
}
In this case, the map operation should be taking instances of ConfigObject, which is what y is. That seems to be all you need to get your Project instances, so I'm not sure why you are toList-ing that ConfigObject (which is a Map) into a List of Tuple2 and then further mapping that again.
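If you do want to keep getObjectList, here is a minimal sketch of unwrapping the ConfigObject instances via the Typesafe Config API directly (ConfigFactory.load() stands in for however you obtain the underlying config). Calling toConfig gives typed accessors and avoids .toString on a ConfigValue, which would render the value with surrounding quotes:

import com.typesafe.config.ConfigFactory
import scala.collection.JavaConversions._

case class Project(name: String, url: String)

val config = ConfigFactory.load()
// ConfigObject.toConfig exposes getString and friends for each element
val projects: List[Project] = config.getObjectList("projects").map { obj =>
  val c = obj.toConfig
  Project(c.getString("name"), c.getString("url"))
}.toList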

For a normal HOCON configuration, the following will work, similar to strangefeatures' answer:
import javax.inject._
import play.api.Configuration
import scala.collection.JavaConversions._ // needed to traverse the java.util.List in the for comprehension

trait Barfoo {
  def configuration: Configuration

  def projects = for {
    projectsFound <- configuration.getConfigList("projects").toList
    projectConfig <- projectsFound
    name          <- projectConfig.getString("name").toList
    url           <- projectConfig.getString("url").toList
  } yield Project(name, url)
}

class Foobar @Inject() (val configuration: Configuration) extends Barfoo
(Using Play 2.4+ Injection)
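For later Play versions (2.6+), where getConfigList is deprecated in favour of the typed get[T] API, the same lookup might be sketched as follows (ProjectProvider is an illustrative name, not something from the question):

import javax.inject._
import play.api.Configuration

class ProjectProvider @Inject() (configuration: Configuration) {
  // get[Seq[Configuration]] relies on Play's built-in ConfigLoader instances
  def projects: Seq[Project] =
    configuration.get[Seq[Configuration]]("projects").map { c =>
      Project(c.get[String]("name"), c.get[String]("url"))
    }
}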

Given that the contents of the array are JSON and you have a case class, you could also use the Play JSON API and work with the objects that way. The JSON Inception macros should make it trivial.
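As a sketch of that idea (the render/parse round-trip through the Typesafe Config API is my assumption, not something this answer spells out):

import com.typesafe.config.{ConfigFactory, ConfigRenderOptions}
import play.api.libs.json.{Json, Reads}

case class Project(name: String, url: String)

object ProjectJson {
  // JSON Inception: the Reads is generated from the case class at compile time
  implicit val projectReads: Reads[Project] = Json.reads[Project]

  def projects: List[Project] = {
    val config = ConfigFactory.load()
    // render the HOCON array as concise JSON, then hand it to Play's JSON API
    val jsonString = config.getList("projects").render(ConfigRenderOptions.concise())
    Json.parse(jsonString).as[List[Project]]
  }
}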

Related

GCP Proto Datastore encode JsonProperty in base64

I store a blob of JSON in the datastore using JsonProperty.
I don't know the structure of the JSON data.
I am using endpoints proto datastore in order to retrieve my data.
The problem is that the JSON property is encoded in base64, and I want a plain JSON object.
For this example, the JSON data will be:
{
  "first": 1,
  "second": 2
}
My code looks something like:
import endpoints
from google.appengine.ext import ndb
from protorpc import remote
from endpoints_proto_datastore.ndb import EndpointsModel

class Model(EndpointsModel):
    data = ndb.JsonProperty()

@endpoints.api(name='myapi', version='v1', description='My Sample API')
class DataEndpoint(remote.Service):

    @Model.method(path='mymodel2', http_method='POST',
                  name='mymodel.insert')
    def MyModelInsert(self, my_model):
        my_model.data = {"first": 1, "second": 2}
        my_model.put()
        return my_model

    @Model.method(path='mymodel/{entityKey}',
                  http_method='GET',
                  name='mymodel.get')
    def getMyModel(self, model):
        print(model.data)
        return model

API = endpoints.api_server([DataEndpoint])
When I call the API for getting a model, I get:
POST /_ah/api/myapi/v1/mymodel2
{
  "data": "eyJzZWNvbmQiOiAyLCAiZmlyc3QiOiAxfQ=="
}
where eyJzZWNvbmQiOiAyLCAiZmlyc3QiOiAxfQ== is the base64 encoding of {"second": 2, "first": 1}.
And the print statement gives me: {u'second': 2, u'first': 1}
So, in the method, I can explore the JSON blob data as a Python dict.
But in the API call, the data is encoded in base64.
I expected the API call to give me:
{
  "data": {
    "second": 2,
    "first": 1
  }
}
How can I get this result?
After the discussion in the comments of your question, let me share a code sample that you can use to store a JSON object in Datastore (it will be stored as a string) and later retrieve it in such a way that:
It will show as plain JSON after the API call.
You will be able to parse it again into a Python dict using eval.
I hope I understood your issue correctly, and that this helps you with it.
import endpoints
from google.appengine.ext import ndb
from protorpc import remote
from endpoints_proto_datastore.ndb import EndpointsModel

class Sample(EndpointsModel):
    column1 = ndb.StringProperty()
    column2 = ndb.IntegerProperty()
    column3 = ndb.StringProperty()

@endpoints.api(name='myapi', version='v1', description='My Sample API')
class MyApi(remote.Service):

    # URL: .../_ah/api/myapi/v1/mymodel - POSTS A NEW ENTITY
    @Sample.method(path='mymodel', http_method='GET', name='Sample.insert')
    def MyModelInsert(self, my_model):
        dict = {'first': 1, 'second': 2}
        dict_str = str(dict)
        my_model.column1 = "Year"
        my_model.column2 = 2018
        my_model.column3 = dict_str
        my_model.put()
        return my_model

    # URL: .../_ah/api/myapi/v1/mymodel/{ID} - RETRIEVES AN ENTITY BY ITS ID
    @Sample.method(request_fields=('id',), path='mymodel/{id}', http_method='GET', name='Sample.get')
    def MyModelGet(self, my_model):
        if not my_model.from_datastore:
            raise endpoints.NotFoundException('MyModel not found.')
        dict = eval(my_model.column3)
        print("This is the Python dict recovered from a string: {}".format(dict))
        return my_model

application = endpoints.api_server([MyApi], restricted=False)
I have tested this code using the development server, but it should work the same in production using App Engine with Endpoints and Datastore.
After querying the first endpoint, it will create a new Entity which you will be able to find in Datastore, and which contains a property column3 with your JSON data in string format.
Then, if you use the ID of that entity to retrieve it, your browser will show the string without any strange encoding, just plain JSON.
And in the console, you will be able to see that this string can be converted to a Python dict (or also to JSON, using the json module if you prefer).
I hope I have not missed any point of what you want to achieve, but I think all the most important ones are covered by this code: a property holding a JSON object, storing it in Datastore, retrieving it in a readable format, and being able to use it again as JSON/dict.
Update:
I think you should have a look at the list of available Property Types yourself, in order to find which one fits your requirements better. However, as an additional note, I have done a quick test working with a StructuredProperty (a property inside another property), by adding these modifications to the code:
# Define the nested model (your JSON object)
class Structured(EndpointsModel):
    first = ndb.IntegerProperty()
    second = ndb.IntegerProperty()

# Here I added a new property for simplicity; remember, StackOverflow does not write code for you :)
class Sample(EndpointsModel):
    column1 = ndb.StringProperty()
    column2 = ndb.IntegerProperty()
    column3 = ndb.StringProperty()
    column4 = ndb.StructuredProperty(Structured)

# Modify this endpoint definition to add a new property
@Sample.method(request_fields=('id',), path='mymodel/{id}', http_method='GET', name='Sample.get')
def MyModelGet(self, my_model):
    if not my_model.from_datastore:
        raise endpoints.NotFoundException('MyModel not found.')
    # Add the new nested property here
    dict = eval(my_model.column3)
    my_model.column4 = dict
    print(json.dumps(my_model.column3))
    print("This is the Python dict recovered from a string: {}".format(dict))
    return my_model
With these changes, the response of the call to the endpoint now includes column4 as a JSON object itself (although it is not printed in-line, I do not think that should be a problem).
I hope this helps too. If this is not the exact behavior you want, maybe you should play around with the available Property Types, but I do not think there is one to which you can assign a Python dict (or JSON object) without previously converting it to a string.

POST request using play ws in Scala

I am using play-ws standalone to consume a REST service in Scala.
val data = Json.obj("message" -> "How are you?")

wsClient.url("http://localhost:5000/token").post(data).map { response =>
  val statusText: String = response.statusText
  println(response.body)
}
When I run this, I get the following error:
Cannot find an instance of play.api.libs.json.JsObject to WSBody. Define a BodyWritable[play.api.libs.json.JsObject] or extend play.api.libs.ws.ahc.DefaultBodyWritables
wsClient.url("http://localhost:5000/token").post(data).map { response =>
It tells me to define a BodyWritable. I have read the documentation but couldn't figure out BodyWritable. I am new to Scala. Can anybody help me, please? Thanks in advance.
You need to import the BodyWritables for JSON objects. Add the following import statements to your source file:
import play.api.libs.ws.JsonBodyReadables._
import play.api.libs.ws.JsonBodyWritables._
For more information, have a look at the official documentation.
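For completeness, a minimal self-contained sketch with those imports in place (the ActorSystem/materializer setup is assumed boilerplate for standalone WS; adapt it to your own wiring):

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import play.api.libs.json.Json
import play.api.libs.ws.JsonBodyWritables._ // provides BodyWritable[JsValue]
import play.api.libs.ws.ahc.StandaloneAhcWSClient
import scala.concurrent.ExecutionContext.Implicits.global

object TokenClient extends App {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()
  val wsClient = StandaloneAhcWSClient()

  val data = Json.obj("message" -> "How are you?")
  wsClient.url("http://localhost:5000/token").post(data).map { response =>
    println(response.body)
  }.andThen { case _ =>
    // close the client and actor system so the JVM can exit
    wsClient.close()
    system.terminate()
  }
}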
The current accepted answer does not work in Scala Play 2.7.x (possibly some earlier versions as well).
I couldn't find it in the docs, but you need to explicitly call asScala on the ws object. For example:
val data = Json.obj("message" -> "How are you?")

ws.asScala()
  .url("http://someurl.com")
  .post(data)
  .map { response =>
    // do something with response
  }
Note: this also returns a Scala Future instead of a Java CompletionStage.

Scala Play Json implicit writes type mismatch

I am new to Scala and Play, and I ask for help with this simple example. I tried to search for a solution by myself, but I did not succeed.
I am trying to do an example from the Mastering Play Framework for Scala book, the one about extending the JSON parser (pages 29-30).
The environment I use is:
Scala: 2.11.7
Play: 2.5.8
Activator: 1.3.10
The code is:
case class Subscription(emailId: String, interval: Long)
In the controller:
import play.api.libs.json.Json
import play.api.libs.json.JsValue
import play.api.libs.json.Writes
.....
val parseAsSubscription = parse.using { request =>
  parse.json.map { body =>
    val emailId: String = (body \ "emailId").as[String]
    val fromDate: Long = (body \ "fromDate").as[Long]
    Subscription(emailId, fromDate)
  }
}
implicit val subWrites:Writes[Subscription] = Json.writes[Subscription]
def getSub = Action(parseAsSubscription) { request =>
  val subscription: Subscription = request.body
  Ok(Json.toJson(Subscription))
}
The line Ok(Json.toJson(Subscription)) gives an error:
No Json serializer found for type models.Subscription.type. Try to
implement an implicit Writes or Format for this type.
This is odd, because the Writes object is defined one row above. Thus, I tried to pass it to the toJson method explicitly:
Ok(Json.toJson(Subscription)(subWrites))
It gave me a different error, which partially explained why the existing Writes object did not fit:
type mismatch;
 found:    play.api.libs.json.Writes[models.Subscription]
 required: play.api.libs.json.Writes[models.Subscription.type]
However, I don't understand the nature of this error and what models.Subscription.type is.
I used to do a similar thing in a different example, and it worked just fine.
Any help will be appreciated.
You're trying to serialize the companion object Subscription (whose singleton type is models.Subscription.type), rather than the request body, which you stored as the value subscription. Try replacing the last line with Ok(Json.toJson(subscription)).
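Applied to the controller above, the fixed action would be:

def getSub = Action(parseAsSubscription) { request =>
  val subscription: Subscription = request.body
  Ok(Json.toJson(subscription)) // serialize the value, not the companion object
}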

Gatling: Compare web service JSON response using jsonFileFeeder

I'm using a JSON feeder to compare the JSON output of web services, as follows:
val jsonFileFeeder = jsonFile("test_data.json")

val strategy = (value: Option[String], session: Session) => value.map { jsonFileFeeder =>
  val result = JSONCompare.compareJSON("expectedStr", "actualStr", JSONCompareMode.STRICT)
  if (result.failed) Failure(result.getMessage)
  else Success(value)
}.getOrElse(Failure("Missing body"))

val login = exec(http("Login")
  .get("/login"))
  .pause(1)
  .feed(feeder)
  .exec(http("authorization")
    .post("/auth")
    .headers(headers_10)
    .queryParam("""email""", "${email}")
    .queryParam("""password""", "${password}")
    .check(status.is(200))
    .check(bodyString.matchWith(strategy)))
  .pause(1)
But it throws an error:
value matchWith is not a member of io.gatling.core.check.DefaultFindCheckBuilder[io.gatling.http.check.HttpCheck,io.gatling.http.response.Response,String,String]
15:10:01.963 [ERROR] i.g.a.ZincCompiler$ - .check(bodyString.matchWith(jsonFileFeeder)))
s\lib\Login.scala:18: not found: value JSONCompare
15:10:05.224 [ERROR] i.g.a.ZincCompiler$ - val result = JSONCompare.compareJSON(jsonFileFeeder, jsonFileFeeder, JSONCompareMode.STRICT)
15:10:05.631 [ERROR] i.g.a.ZincCompiler$ - two errors found
Compilation failed
Here's a sample script that semantically compares a JSON response with expected output:
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.core.json.Jackson
import java.nio.charset.StandardCharsets.UTF_8
import scala.concurrent.duration._

class BasicSimulation extends Simulation {

  lazy val expectedJson = Jackson.parse(
    getClass.getResourceAsStream("/output.json"),
    UTF_8
  )

  val scn = scenario("Scenario Name")
    .exec(http("request_1")
      .get("http://localhost:8000/output.json")
      .check(bodyString.transform(Jackson.parse).is(expectedJson))
    )

  setUp(scn.inject(atOnceUsers(1)))
}
It assumes there is a file output.json in the resources directory (the directory that also contains your data and request-bodies).
However, I think you should carefully consider whether this solution is right for your needs. It won't scale as well as JSONPath or regex checks (especially for large JSON files), it's inflexible, and it seems more like a functional testing task than a performance task. I suspect that if you're trying to compare JSON files in this way, then you're probably trying to solve the wrong problem.
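For comparison, a JSONPath-based check on individual fields might look like the following sketch (the success and message fields match the sample data shown further below; adjust them to your actual payload):

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class JsonPathSimulation extends Simulation {

  val scn = scenario("JsonPath checks")
    .exec(http("request_1")
      .get("http://localhost:8000/failure.json")
      // assert on individual fields instead of comparing the whole body
      .check(jsonPath("$.success").ofType[Boolean].is(false))
      .check(jsonPath("$.message").is("Request Failed")))

  setUp(scn.inject(atOnceUsers(1)))
}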
Note that the first sample script doesn't use jsonFile, as jsonFile is designed for use as a feeder, whereas I suspect you want to compare a single request with a hard-coded response. However, jsonFile may prove useful if you will be making a number of different requests with different parameters and expect different (known) responses. Here's an example script that takes this approach:
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.core.json.Jackson
import scala.concurrent.duration._

class BasicSimulation extends Simulation {

  val myFeed = jsonFile("json_data.json").circular

  val scn = scenario("Scenario Name")
    .feed(myFeed)
    .exec(http("request_1")
      .get("${targetUrl}")
      .check(bodyString.transform(Jackson.parse).is("${expectedResponse}"))
    )

  setUp(scn.inject(atOnceUsers(2)))
}
It assumes there is a JSON resource at data/json_data.json that looks something like the following:
[
  {
    "targetUrl": "http://localhost:8000/failure.json",
    "expectedResponse": {
      "success": false,
      "message": "Request Failed"
    }
  },
  {
    "targetUrl": "http://localhost:8000/success.json",
    "expectedResponse": {
      "success": true,
      "message": "Request Succeeded"
    }
  }
]
The expectedResponse should be the exact JSON you expect to get back from the server. And of course you don't just have to parameterise targetUrl, you can parameterise whatever you want in this way.
As an aside, you may also be interested to know that Gatling 2.1 is expected to allow comparing a response with a file without using hacks like these (although the current development version only supports comparing byte-for-byte, not comparing-as-JSON).

Optimal way to read out JSON from MongoDB into a Scalatra API

I have a pre-formatted JSON blob stored as a string in MongoDB, as a field in one of my collections. Currently, in my Scalatra-based API, I have a before filter that renders all of my responses with a JSON content type. An example of how I return the content looks like the following:
get ("/boxscore", operation(getBoxscore)) {
val game_id:Int = params.getOrElse("game_id", "3145").toInt
val mongoColl = mongoDb.apply("boxscores")
val q: DBObject = MongoDBObject("game_id" -> game_id)
val res = mongoColl.findOne(q)
res match {
case Some(j) => JSON.parseFull(j("json_body").toString)
case None => NotFound("Requested document could not be found.")
}
}
Now, this certainly works, but it doesn't seem like the "Scala" way of doing things, and I feel like it can be optimized. The worrisome part to me is that when I add a caching layer and the cache misses, I am spending additional CPU time re-parsing a string I already formatted as JSON in MongoDB:
JSON.parseFull(j("json_body").toString)
I have to take the result from findOne(), run .toString on it, and then re-parse it into JSON afterwards. Is there a more optimal route? Since the JSON is already stored as a string in MongoDB, I'm guessing a serializer / case class isn't the right solution here. Of course I can just leave what's here, but I'd like to learn if there's a way that is more Scala-like and CPU-friendly going forward.
There is the option of extending Scalatra's render pipeline with handling for MongoDB classes. The following two routes act as an example. They return a MongoCursor and a DBObject as results. We are going to convert those to a string.
get("/") {
mongoColl.find
}
get("/:key/:value") {
val q = MongoDBObject(params("key") -> params("value"))
mongoColl.findOne(q) match {
case Some(x) => x
case None => halt(404)
}
}
In order to handle the types we need to define a partial function which takes care of the conversion and sets the appropriate content type.
There are two cases: the first one handles a DBObject. The content type is set to "application/json" and the object is converted to a string by calling its toString method. The second case handles a MongoCursor. Since it implements TraversableOnce, the map function can be used.
def renderMongo = {
  case dbo: DBObject =>
    contentType = "application/json"
    dbo.toString

  case xs: TraversableOnce[_] => // handles a MongoCursor; be aware of type erasure here
    contentType = "application/json"
    val ls = xs map (x => x.toString) mkString(",")
    "[" + ls + "]"
}: RenderPipeline
(Note the following type definition: type RenderPipeline = PartialFunction[Any, Any])
Now the method needs to be hooked in. After an HTTP call has been handled, the result is forwarded to the render pipeline for further conversion. Custom handling can be added by overriding the renderPipeline method from ScalatraBase. With the following definition, the renderMongo function is called first:
override protected def renderPipeline = renderMongo orElse super.renderPipeline
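Putting the pieces together, a minimal servlet might look like the following sketch (MyServlet and the constructor-injected collection are illustrative names, not from the question):

import org.scalatra._
import com.mongodb.casbah.Imports._

class MyServlet(mongoColl: MongoCollection) extends ScalatraServlet {

  get("/") {
    mongoColl.find
  }

  // converts MongoDB results to JSON strings before rendering
  def renderMongo: RenderPipeline = {
    case dbo: DBObject =>
      contentType = "application/json"
      dbo.toString
    case xs: TraversableOnce[_] =>
      contentType = "application/json"
      xs.map(_.toString).mkString("[", ",", "]")
  }

  override protected def renderPipeline = renderMongo orElse super.renderPipeline
}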
This is a basic approach to handle MongoDB types. There are other options as well, for example by making use of json4s-mongo.
Here is the previous code in a working sample project.