How to get value from JsValue? - json

For example:
My DB stores the following JSON. From this JSON I need to extract the value of a particular field.
"student": [
{
"name": "Xyz",
"college": "abc",
"student_id":{
"$oid": "59a9314f6d0000920962e247"
}},
{
"name": "DDD",
"college": "opop",
"student_id":{
"$oid": "59a9314f6d0000920962e257"
}}
]
How can I pick only the value of "$oid" and save the JSON in the following way:
"student": [
{
"name": "Xyz",
"college": "abc",
"student_id":
"59a9314f6d0000920962e247"
},
{
"name": "DDD",
"college": "opop",
"student_id":
"59a9314f6d0000920962e257"
}
]

In my scenario, I'm reading it from the client side as:
String Json = null;
JsonNode body = request().body().asJson();
Json = body.toString();
Logger.info(Json);
String role = body.get("role").get("role").asText();
users.firstName = body.get("firstName").asText();
users.lastName = body.get("lastName").asText();
You will need to change the definition of JsonNode body = request().body().asJson(); and the related code for your scenario, so that the JSON is read from the database instead of the request body.
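If you are on the Scala side of Play instead, a minimal hedged sketch of the equivalent request-body read (inside a Play controller) might look like this; the action name "save" is an assumption, and the field access mirrors the Java snippet above:
import play.api.libs.json.{JsValue, Json}
import play.api.mvc.{Action, AnyContent}

// Hedged sketch: same field access as the Java snippet, using Play's Scala JSON API.
def save: Action[AnyContent] = Action { request =>
  val body: JsValue = request.body.asJson.getOrElse(Json.obj())
  val role      = (body \ "role" \ "role").as[String]
  val firstName = (body \ "firstName").as[String]
  val lastName  = (body \ "lastName").as[String]
  Ok(Json.obj("role" -> role, "firstName" -> firstName, "lastName" -> lastName))
}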

You will need to replace the "student_id" JsValue with its string value, as below:
import play.api.libs.json._

val original: JsValue = Json.parse(
  """{"student": [
      {
        "name": "Xyz",
        "college": "abc",
        "student_id": {
          "$oid": "59a9314f6d0000920962e247"
        }
      },
      {
        "name": "DDD",
        "college": "opop",
        "student_id": {
          "$oid": "59a9314f6d0000920962e257"
        }
      }
    ]}""")

val changed = original.as[JsObject] ++ Json.obj(
  "student" -> JsArray(
    original.transform((__ \ "student").json.pick[JsArray])
      .getOrElse(Json.arr())
      .value
      .map { e =>
        val oid = e.transform((__ \ "student_id" \ "$oid").json.pick[JsString]).get
        e.as[JsObject] ++ Json.obj("student_id" -> oid)
      }
  )
)

println(Json.stringify(changed))
// Result:
// {"student":[{"name":"Xyz","college":"abc","student_id":"59a9314f6d0000920962e247"},{"name":"DDD","college":"opop","student_id":"59a9314f6d0000920962e257"}]}

Related

Scala Json parsing

This is the input JSON that I am getting. It is a nested JSON structure, and I don't want to map it directly to a class; I need custom parsing of some of the objects, for which I have made the case classes below.
{
"uuid": "b547e13e-b32d-11ec-b909-0242ac120002",
"log": {
"Response": {
"info": {
"receivedTime": "2022-02-09T00:30:00Z",
"isSecure": "Yes",
"Data": [{
"id": "75641",
"type": "vendor",
"sourceId": "3",
"size": 53
}],
"Group": [{
"isActive": "yes",
"metadata": {
"owner": "owner1",
"compressionType": "gz",
"comments": "someComment",
"createdDate": "2022-01-11T11:00:00Z",
"updatedDate": "2022-01-12T14:17:55Z"
},
"setId": "1"
},
{
"isActive": "yes",
"metadata": {
"owner": "owner12",
"compressionType": "snappy",
"comments": "someComment",
"createdDate": "2022-01-11T11:00:00Z",
"updatedDate": "2022-01-12T14:17:55Z"
},
"setId": "2"
},
{
"isActive": "yes",
"metadata": {
"owner": "owner123",
"compressionType": "snappy",
"comments": "someComment",
"createdDate": "2022-01-11T11:00:00Z",
"updatedDate": "2022-01-12T14:17:55Z"
},
"setId": "4"
},
{
"isActive": "yes",
"metadata": {
"owner": "owner124",
"compressionType": "snappy",
"comments": "someComments",
"createdDate": "2022-01-11T11:00:00Z",
"updatedDate": "2022-01-12T14:17:55Z"
},
"setId": "4"
}
]
}
}
}
}
I am trying Play JSON and have also tried circe. Please help, I'm new to the Scala world.
Below are the object and case classes I have so far:
case class DataCatalog(uuid: String, data: Seq[Data], metadata: Seq[Metadata])
object DataCatalog {
case class Data(id: String,
type: String,
sourceId: Option[Int],
size: Int)
case class Metadata(isActive: String,
owner: String,
compressionType: String,
comments: String,
createdDate: String,
updatedDate: String
)
def convertJson(inputjsonLine: String): Option[DataCatalog] = {
val result = Try {
//val doc: Json = parse(line).getOrElse(Json.Null)
//val cursor: HCursor = doc.hcursor
//val uuid: Decoder.Result[String] = cursor.downField("uuid").as[String]
val lat = (inputjsonLine \ "uuid").get
DataCatalog(uuid, data, group)
}
//do pattern matching
result match {
case Success(dataCatalog) => Some(dataCatalog)
case Failure(exception) =>
}
}
}
Any parsing API is fine.
If you use Scala Play, each case class should have a companion object, which will help you a lot with reading/writing objects to/from JSON:
object Data {
  import play.api.libs.json._
  implicit val read = Json.reads[Data]
  implicit val write = Json.writes[Data]
  def tupled = (Data.apply _).tupled
}

object Metadata {
  import play.api.libs.json._
  implicit val read = Json.reads[Metadata]
  implicit val write = Json.writes[Metadata]
  def tupled = (Metadata.apply _).tupled
}
Each companion object is required to be in the same file as its case class. For your JSON example, you will need more case classes, because you have a lot of nested objects there (log, Response, info, and so on); see the sketch below.
Or, you can read just the field you're interested in as:
(jsonData \ "fieldName").as[CaseClassName]
You can try to access the Data value (an array in your sample) as:
(jsonData \ "log" \ "Response" \ "info" \ "Data").as[Seq[Data]]
and similarly for Group, whose elements contain the Metadata objects.
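For completeness, here is a hedged sketch of a full set of case classes for the JSON above. The wrapper names (Info, Response, Log, LogLine) and the choice of sourceId: Option[String] (the sample carries it as a string) are assumptions, not part of the original question:
import play.api.libs.json._

// Hedged sketch: one case class per nested object in the sample JSON.
case class Data(id: String, `type`: String, sourceId: Option[String], size: Int)
case class Metadata(owner: String, compressionType: String, comments: String,
                    createdDate: String, updatedDate: String)
case class Group(isActive: String, metadata: Metadata, setId: String)
case class Info(receivedTime: String, isSecure: String, Data: Seq[Data], Group: Seq[Group])
case class Response(info: Info)
case class Log(Response: Response)
case class LogLine(uuid: String, log: Log)

object Data     { implicit val format: OFormat[Data]     = Json.format[Data] }
object Metadata { implicit val format: OFormat[Metadata] = Json.format[Metadata] }
object Group    { implicit val format: OFormat[Group]    = Json.format[Group] }
object Info     { implicit val format: OFormat[Info]     = Json.format[Info] }
object Response { implicit val format: OFormat[Response] = Json.format[Response] }
object Log      { implicit val format: OFormat[Log]      = Json.format[Log] }
object LogLine  { implicit val format: OFormat[LogLine]  = Json.format[LogLine] }

// Parse one line into the model, or None if it does not validate.
def convertJson(inputJsonLine: String): Option[LogLine] =
  Json.parse(inputJsonLine).validate[LogLine].asOpt
Note that Json.parse throws on malformed input, so you may still want to wrap the call in Try as in your original convertJson.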

merge lists of dictionaries in terraform v0.12

I would like to do the following using terraform:
I have 2 JSONs:
1.json:
[
{
"description": "description1",
"url": "url1",
"data": "data1"
},
{
"description": "description2",
"url": "url2",
"data": "data2",
"action": "action2"
},
{
"description": "description3",
"url": "url3",
"data": "data3"
}
]
2.json:
[
{
"description": "description1",
"url": "url1",
"data": "data1"
},
{
"description": "description2_new",
"url": "url2",
"data": "data2_new"
},
{
"description": "description4",
"url": "url4",
"data": "data4"
}
]
and I want to merge them into one. Dictionaries from the second JSON should override dictionaries from the first one if the url key is the same, i.e. the combined JSON should look like:
[
{
"description": "description1",
"url": "url1",
"data": "data1"
},
{
"description": "description2_new",
"url": "url2",
"data": "data2_new"
},
{
"description": "description3",
"url": "url3",
"data": "data3"
},
{
"description": "description4",
"url": "url4",
"data": "data4"
}
]
Using python I can easily do it:
import json

with open('1.json') as f:
    json1 = json.load(f)
with open('2.json') as f:
    json2 = json.load(f)

def list_to_dict(json_list):
    res_dict = {}
    for d in json_list:
        res_dict[d['url']] = d
    return res_dict

def merge_json(json1, json2):
    j1 = list_to_dict(json1)
    j2 = list_to_dict(json2)
    j1.update(j2)
    res_list = []
    for key in j1.keys():
        res_list.append(j1[key])
    return res_list

print(json.dumps(merge_json(json1, json2), indent=4))
How can I do that using terraform?
Using terraform 0.12.x
$ cat main.tf
locals {
  # read from files and turn into json
  list1 = jsondecode(file("1.json"))
  list2 = jsondecode(file("2.json"))

  # iterate over lists and turn url into a unique key
  dict1 = { for item in local.list1 : item.url => item }
  dict2 = { for item in local.list2 : item.url => item }

  # combine both dictionaries so values converge
  # only take its values
  merged = values(merge(local.dict1, local.dict2))
}

output "this" {
  value = local.merged
}
$ terraform apply
Apply complete! Resources: 0 added, 0 changed, 0 destroyed.
Outputs:
this = [
{
"data" = "data1"
"description" = "description1"
"url" = "url1"
},
{
"data" = "data2_new"
"description" = "description2_new"
"url" = "url2"
},
{
"data" = "data3"
"description" = "description3"
"url" = "url3"
},
{
"data" = "data4"
"description" = "description4"
"url" = "url4"
},
]
Terraform supports expanding a list into function parameters using the ... operator. This will allow an arbitrary number of documents to be read.
(I'm not sure, but I believe this feature was added in v0.15)
For this example, I added a new file 3.json with the contents:
[
{
"description": "description4_new",
"url": "url4",
"data": "data4_new"
}
]
For main.tf, I'm using the same logic as #someguyonacomputer's answer:
$ cat main.tf
locals {
  jsondocs = [
    for filename in fileset(path.module, "*.json") : jsondecode(file(filename))
  ]

  as_dicts = [
    for arr in local.jsondocs : {
      for obj in arr : obj.url => obj
    }
  ]

  # This is where the '...' operator is used
  merged = merge(local.as_dicts...)
}

output "as_list" {
  value = values(local.merged)
}
Result:
Changes to Outputs:
+ as_list = [
+ {
+ data = "data1"
+ description = "description1"
+ url = "url1"
},
+ {
+ data = "data2_new"
+ description = "description2_new"
+ url = "url2"
},
+ {
+ data = "data3"
+ description = "description3"
+ url = "url3"
},
+ {
+ data = "data4_new"
+ description = "description4_new"
+ url = "url4"
},
]
References:
Terraform Docs -- Function Calls # Expanding Function Arguments

Groovy: How to parse the json specific key's value into list/array

I am new to Groovy. From the output of prettyPrint(toJson()), I am trying to get a list of values for a specific key inside a JSON array. Using the prettyPrint output below, I want to create a list that consists only of the values of the name key.
My Code:
def string1 = jiraGetIssueTransitions(idOrKey: jira_id)
echo prettyPrint(toJson(string1.data))
def pretty = prettyPrint(toJson(string1.data))
def valid_strings = readJSON text: "${pretty}"
echo "valid_strings.name : ${valid_strings.name}"
The output of prettyPrint(toJson(string1.data)) is the JSON below:
{
"expand": "places",
"places": [
{
"id": 1,
"name": "Bulbasaur",
"type": {
"grass",
"poison"
}
},
{
"id": 2,
"name": "Ivysaur",
"type": {
"grass",
"poison"
}
}
}
Expected result
valid_strings.name : ["Bulbasaur", "Ivysaur"]
Current output
valid_strings.name : null
The pretty-printed JSON content is invalid (the type values should be arrays, not objects). If the JSON were valid, the names could be accessed as follows:
import groovy.json.JsonSlurper
def text = """
{
"expand": "places",
"places": [{
"id": 1,
"name": "Bulbasaur",
"type": [
"grass",
"poison"
]
},
{
"id": 2,
"name": "Ivysaur",
"type": [
"grass",
"poison"
]
}
]
}
"""
def json = new JsonSlurper().parseText(text)
println(json.places*.name)
Basically, use the spread operator for the attribute lookup (i.e., *.name) on the appropriate object (i.e., json.places).
I've used something similar to print out elements within the response in ReadyAPI
import groovy.json.*
import groovy.util.*
def json = '''[
  { "message" : "Success",
    "bookings" : [
      { "bookingId" : 60002172,
        "bookingDate" : "1900-01-01T00:00:00" },
      { "bookingId" : 59935582,
        "bookingDate" : "1900-01-01" },
      { "bookingId" : 53184048,
        "bookingDate" : "2019-01-15",
        "testId" : "12803798123",
        "overallScore" : "PASS" },
      { "bookingId" : 53183765,
        "bookingDate" : "2019-01-15T13:45:00" },
      { "bookingId" : 52783312,
        "bookingDate" : "1900-01-01" }
    ]
  }
]'''
def response = context.expand( json )
def parsedjson = new groovy.json.JsonSlurper().parseText(response)
log.info parsedjson
log.info " Count of records returned: " + parsedjson.size()
log.info " List of bookingIDs in this response: " + parsedjson.bookings*.bookingId

How to stream insert JSON array to BigQuery table in Apache Beam

My Apache Beam application receives a message as a JSON array, but I need to insert each element as a row into a BigQuery table. How can I support this use case in Apache Beam? Can I split the array and insert the rows into the table one by one?
JSON message example:
[
{"id": 1, "name": "post1", "price": 10},
{"id": 2, "name": "post2", "price": 20},
{"id": 3, "name": "post3", "price": 30}
]
BigQuery table schema:
[
{
"mode": "REQUIRED",
"name": "id",
"type": "INT64"
},
{
"mode": "REQUIRED",
"name": "name",
"type": "STRING"
},
{
"mode": "REQUIRED",
"name": "price",
"type": "INT64"
}
]
Here is my solution. I converted the JSON string into a List once, then called c.output for each element. My code is in Scala, but you can do the same thing in Java.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import org.apache.beam.sdk.io.kafka.KafkaRecord
import org.apache.beam.sdk.transforms.DoFn
import org.apache.beam.sdk.transforms.DoFn.ProcessElement
import org.slf4j.{Logger, LoggerFactory}

case class MyTranscationRecord(id: String, name: String, price: Int)
case class MyTranscation(recordList: List[MyTranscationRecord])

// Output type is the record, because we emit one element per record in the array.
class ConvertJSONTextToMyRecord extends DoFn[KafkaRecord[java.lang.Long, String], MyTranscationRecord]() {
  private val logger: Logger = LoggerFactory.getLogger(classOf[ConvertJSONTextToMyRecord])

  @ProcessElement
  def processElement(c: ProcessContext): Unit = {
    try {
      val mapper: ObjectMapper = new ObjectMapper()
        .registerModule(DefaultScalaModule)
      val messageText = c.element.getKV.getValue
      val transaction: MyTranscation = mapper.readValue(messageText, classOf[MyTranscation])
      logger.info(s"successfully converted to an EPC transaction = $transaction")
      for (record <- transaction.recordList) {
        c.output(record)
      }
    } catch {
      case e: Exception =>
        val message = e.getLocalizedMessage + e.getStackTrace
        logger.error(message)
    }
  }
}
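To actually stream the rows into BigQuery, one hedged way to finish the pipeline is to map each record to a TableRow and hand it to BigQueryIO. This is a sketch under assumptions: the table reference is a placeholder, and the pipeline wiring is shown only as comments.
import com.google.api.services.bigquery.model.TableRow
import org.apache.beam.sdk.transforms.DoFn
import org.apache.beam.sdk.transforms.DoFn.ProcessElement

// Hedged sketch: turn each parsed record into a BigQuery TableRow.
class ToTableRow extends DoFn[MyTranscationRecord, TableRow]() {
  @ProcessElement
  def processElement(c: ProcessContext): Unit = {
    val r = c.element
    c.output(new TableRow().set("id", r.id).set("name", r.name).set("price", r.price))
  }
}

// Wiring (comments only; assumes `records` is the PCollection[MyTranscationRecord]
// produced by ConvertJSONTextToMyRecord, and the table name is a placeholder):
//   records
//     .apply(ParDo.of(new ToTableRow))
//     .apply(BigQueryIO.writeTableRows()
//       .to("my-project:my_dataset.my_table")
//       .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
//       .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER))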

Json format - scala

I need to build a JSON structure in the following way in Scala. How do I implement this?
{
"name": "protocols",
"children": [
{
"name": "tcp", "children": [
{
"name": "source 1",
"children": [
{
"name": "destination 1",
"children": [
{
"name": "packet 1"
},
{
"name": "packet 4"
}
]
},
{
"name": "destination 2","children": [
{
"name": "packet 1"
},
{
"name": "packet 4"
}
]
},
I need a tree structure like this to be written to a file.
If you are using Play, your JSON structure can be represented with a single case class. Here is a sample, where this case class is called Node:
import play.api.libs.json.{Format, Json}

case class Node(name: String, children: List[Node] = Nil)

// The explicit Format[Node] type annotation lets the macro handle the recursive children field.
implicit val format: Format[Node] = Json.format[Node]

val childSource1 = Node("destination 1", List(Node("packet 1"), Node("packet 4")))
val childSource2 = Node("destination 2", List(Node("packet 1"), Node("packet 4")))
val source1 = Node("source 1", List(childSource1, childSource2))
val example = Node("protocols", List(Node("tcp", List(source1))))

Json.prettyPrint(Json.toJson(example))
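Since the goal is to write the tree to a file, one hedged way to finish is to dump the pretty-printed JSON with java.nio; the file name "protocols.json" below is a placeholder:
import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

// Write the pretty-printed JSON to disk.
val jsonText = Json.prettyPrint(Json.toJson(example))
Files.write(Paths.get("protocols.json"), jsonText.getBytes(StandardCharsets.UTF_8))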