I have an Elasticsearch search response that is a deeply nested JSON document, and I am stuck on how to get a particular value out of it. I'm new to Scala and to programming in general, and I have searched online but could not find an answer that explained it well.
This is the JSON, and the value I want to get out is the number under "getSum" : { "value" : ... }:
Search_response: org.elasticsearch.action.search.SearchResponse = {
"took" : 32,
"timed_out" : false,
"_shards" : {
"total" : 3,
"successful" : 3,
"failed" : 0
},
"hits" : {
"total" : 12,
"max_score" : 1.0,
"hits" : [ {
"_index" : "myIndex",
"_type" : "myType",
"_id" : "4151202002020",
"_score" : 1.0,
"_source":{"pint":[{"printer":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Lam":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Kam":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Jas":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"tiv":[{ourc""s:"wrer","sourceType":"rsd","Vag":"agaatttt363336"}],"timeLineSource:[{"LA":"DGAT","GATA":"JAS","timeline":9.111694,"GA":"SFWF2525252552552525"}
}, {
"_index" : "myIndex",
"_type" : "myType",
"_id" : "4151202002020",
"_score" : 1.0,
"_source":{"pint":[{"printer":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Lam":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Kam":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Jas":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"tiv":[{ourc""s:"wrer","sourceType":"rsd","Vag":"agaatttt363336"}],"timeLineSource:[{"LA":"DGAT","GATA":"JAS","timeline":9.111694,"GA":"SFWF2525252552552525"}
}, {
"_index" : "myIndex",
"_type" : "myType",
"_id" : "4151202002020",
"_score" : 1.0,
"_source":{"pint":[{"printer":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Lam":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Kam":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"},{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"Jas":[{"sourceName":"3636636","sourceType":"Bin","Star":0.0,"Fun":"gatayay"}],"tiv":[{ourc""s:"wrer","sourceType":"rsd","Vag":"agaatttt363336"}],"timeLineSource:[{"LA":"DGAT","GATA":"JAS","timeline":9.111694,"GA":"SFWF2525252552552525"}
}, {
},
"aggregations" : {
"DAEY" : {
"doc_count" : 59,
"histogram" : {
"buckets" : [ {
"key_as_string" : "1978-02-22T00:00:00.000Z",
"key" : 1503360000000,
"doc_count" : 59,
"nestedValue" : {
"doc_count" : 177,
"getSum" : {
"value" : 768.0690221786499
}
},
}
}
}
}
This is what I tried:
import org.json4s._                      // assuming json4s; lift-json exposes the same JsonParser API
import org.json4s.native.JsonParser

val getResult: String = searchResult.toString.stripMargin
val getValue = JsonParser.parse(getResult).asInstanceOf[JObject].values("aggregations").toString
You can solve this by using Typesafe Config. Please find the required Maven and sbt dependencies below.
Maven dependency:
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.3.1</version>
</dependency>
sbt dependency:
libraryDependencies += "com.typesafe" % "config" % "1.3.1"
Afterwards, you can get the sum value with the code below:
import com.typesafe.config.ConfigFactory
val config = ConfigFactory.parseString(getResult)
config.getConfigList("aggregations.DAEY.histogram.buckets").get(0).getString("nestedValue.getSum.value")
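If you need it as a number rather than a string, the same path should also work with getDouble (a small variation on the line above):
config.getConfigList("aggregations.DAEY.histogram.buckets").get(0).getDouble("nestedValue.getSum.value")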
Check out the API doc for the library from this link.
I finally used:
val getResult: String = searchResult.toString.stripMargin
val getValue = JsonParser.parse(getResult).asInstanceOf[JObject].values("aggregations").toString
// getValue is the aggregations map rendered as a string; the sum is its last
// whitespace-separated token, minus the 13 trailing closing characters
val valueToDouble = getValue.split(" ").last.dropRight(13).toDouble
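For reference, the value can also be pulled out without string surgery, staying in json4s (assuming that is where JsonParser comes from). A minimal sketch, assuming the aggregation names shown in the response above (DAEY, histogram, nestedValue, getSum) and a single bucket:
import org.json4s._
import org.json4s.native.JsonMethods._

implicit val formats: Formats = DefaultFormats

val json = parse(getResult)
// "\\" does a recursive descent, so the exact nesting does not have to be spelled out
val sum = (json \\ "getSum" \ "value").extract[Double]

// or spell out the full path and take the first bucket explicitly:
val buckets = (json \ "aggregations" \ "DAEY" \ "histogram" \ "buckets").children
val sumExplicit = (buckets.head \ "nestedValue" \ "getSum" \ "value").extract[Double]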
I'm trying to retrieve the data from this dictionary and for some reason I cannot seem to acquire it. I'm new to parsing JSON so apologies if this is rough.
let temp = json["list"].arrayValue.map({$0["main"].dictionaryValue})
print(temp[0])
Here I am setting a value equal to the dictionary from the JSON. However, I know I need to add the key's value that I'm searching for. To be clear, I am searching for the "temp" key which in the example is equal to 28.19999...
Here is an example of the JSON:
"list" : [
{
"dt" : 1641524400,
"main" : {
"humidity" : 68,
"sea_level" : 1014,
"temp_max" : 29.260000000000002,
"feels_like" : 28.199999999999999,
"temp_min" : 28.199999999999999,
"grnd_level" : 1004,
"temp" : 28.199999999999999,
"temp_kf" : -0.58999999999999997,
"pressure" : 1014
},{
"dt" : 1641546000,
"main" : {
"pressure" : 1009,
"feels_like" : 20.93,
"temp_max" : 27.100000000000001,
"temp" : 27.100000000000001,
"humidity" : 83,
"grnd_level" : 999,
"sea_level" : 1009,
"temp_min" : 27.100000000000001,
"temp_kf" : 0
},
"sys" : {
"pod" : "n"
},
"pop" : 0.41999999999999998,
"wind" : {
"deg" : 354,
"speed" : 5.4100000000000001,
"gust" : 10.58
},
"visibility" : 6695,
"weather" : [
{
"main" : "Snow",
"id" : 600,
"description" : "light snow",
"icon" : "13n"
}
],
"snow" : {
"3h" : 0.26000000000000001
},
"clouds" : {
"all" : 100
},
"dt_txt" : "2022-01-07 09:00:00"
},
{
"dt" : 1641556800,
"main" : {
"temp_min" : 26.82,
"humidity" : 90,
"pressure" : 1008,
"temp_kf" : 0,
"temp" : 26.82,
"feels_like" : 18.879999999999999,
"sea_level" : 1008,
"temp_max" : 26.82,
"grnd_level" : 998
},
"sys" : {
"pod" : "n"
},
"pop" : 0.97999999999999998,
"wind" : {
"deg" : 310,
"gust" : 14.359999999999999,
"speed" : 7.5199999999999996
}]
Found my answer:
let temp = json["list"].arrayValue.map({$0["main"]["temp"].stringValue})
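Since "temp" is a number, SwiftyJSON's doubleValue may be a better fit than stringValue. A minimal, self-contained sketch (the tiny sample below just stands in for the real API response):
import SwiftyJSON

let sample = """
{ "list": [ { "main": { "temp": 28.2 } }, { "main": { "temp": 27.1 } } ] }
"""
let json = JSON(parseJSON: sample)
// map every forecast entry to its numeric "temp" value
let temps = json["list"].arrayValue.map { $0["main"]["temp"].doubleValue }
print(temps)   // [28.2, 27.1]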
Hello, can someone help me extract the value of the user parameter, which is "testuser1"?
I tried to use the JSON Path expression $..data. I was able to extract the entire response but was unable to extract the user parameter. Thanks in advance.
{
"data": "{ "took" : 13, "timed_out" : false, "_shards" : { "total" : 5, "successful" : 5, "skipped" : 0, "failed" : 0 }, "hits" : { "total" : 1, "max_score" : 1.0, "hits" : [ { "_index" : "bushidodb_history_network_eval_ea9656ef-0a9b-474b-8026-2f83e2eb9df1_2021-april-10", "_type" : "network", "_id" : "6e2e58be-0ccf-3fb4-8239-1d4f2af322e21618059082000", "_score" : 1.0, "_source" : { "misMatches" : [ "protocol", "state", "command" ], "instance" : "e3032804-4b6d-3735-ac22-c827950395b4|0.0.0.0|10.179.155.155|53|UDP", "protocol" : "UDP", "localAddress" : "0.0.0.0", "localPort" : "12345", "foreignAddress" : "10.179.155.155", "foreignPort" : "53", "command" : "ping yahoo.com ", "user" : "testuser1", "pid" : "10060", "state" : "OUTGOINGFQ", "rate" : 216.0, "originalLocalAddress" : "192.168.100.229", "exe" : "/bin/ping", "md5" : "f9ad63ce8592af407a7be43b7d5de075", "dir" : "", "agentId" : "abcd-dcd123", "year" : "2021", "month" : "APRIL", "day" : "10", "hour" : "12", "time" : "1618059082000", "isMerged" : false, "timestamp" : "Apr 10, 2021 12:51:22 PM", "metricKey" : "6e2e58be-0ccf-3fb4-8239-1d4f2af322e2", "isCompliant" : false }, "sort" : [ 1618059082000 ] } ] }, "aggregations" : { "count_over_time" : { "buckets" : [ { "key_as_string" : "2021-04-10T08:00:00.000-0400", "key" : 1618056000000, "doc_count" : 1 } ] } }}",
"success": true,
"message": {
"code": "S",
"message": "Get Eval results Count Success"
}
}
Actual response: (screenshot attached in the original post)
What you posted doesn't look like valid JSON to me.
If in reality you're getting what's in your image, to wit:
{
"data": "{ \"took\" : 13, \"timed_out\" : false, \"_shards\" : { \"total\" : 5, \"successful\" : 5, \"skipped\" : 0, \"failed\" : 0 }, \"hits\" : { \"total\" : 1, \"max_score\" : 1.0, \"hits\" : [ { \"_index\" : \"bushidodb_history_network_eval_ea9656ef-0a9b-474b-8026-2f83e2eb9df1_2021-april-10\", \"_type\" : \"network\", \"_id\" : \"6e2e58be-0ccf-3fb4-8239-1d4f2af322e21618059082000\", \"_score\" : 1.0, \"_source\" : { \"misMatches\" : [ \"protocol\", \"state\", \"command\" ], \"instance\" : \"e3032804-4b6d-3735-ac22-c827950395b4|0.0.0.0|10.179.155.155|53|UDP\", \"protocol\" : \"UDP\", \"localAddress\" : \"0.0.0.0\", \"localPort\" : \"12345\", \"foreignAddress\" : \"10.179.155.155\", \"foreignPort\" : \"53\", \"command\" : \"pingyahoo.com\", \"user\" : \"testuser1\", \"pid\" : \"10060\", \"state\" : \"OUTGOINGFQ\", \"rate\" : 216.0, \"originalLocalAddress\" : \"192.168.100.229\", \"exe\" : \"/bin/ping\", \"md5\" : \"f9ad63ce8592af407a7be43b7d5de075\", \"dir\" : \"\", \"agentId\" : \"abcd-dcd123\", \"year\" : \"2021\", \"month\" : \"APRIL\", \"day\" : \"10\", \"hour\" : \"12\", \"time\" : \"1618059082000\", \"isMerged\" : false, \"timestamp\" : \"Apr10, 202112: 51: 22PM\", \"metricKey\" : \"6e2e58be-0ccf-3fb4-8239-1d4f2af322e2\", \"isCompliant\" : false }, \"sort\" : [ 1618059082000 ] } ] }, \"aggregations\" : { \"count_over_time\" : { \"buckets\" : [ { \"key_as_string\" : \"2021-04-10T08: 00: 00.000-0400\", \"key\" : 1618056000000, \"doc_count\" : 1 } ] } }}",
"success": true,
"message": {
"code": "S",
"message": "Get Eval results Count Success"
}
}
the easiest way is just using 2 JSON Extractors:
Extract the data attribute value into a JMeter Variable from the response.
Extract the user attribute value into a JMeter Variable from the ${data} JMeter Variable.
If the response looks exactly like what you posted, you won't be able to use JSON Extractors and will have to treat it as plain text, so your choice is limited to the Regular Expression Extractor; an example regular expression:
"user"\s*:\s*"(\w+)"
Add a Regular Expression Extractor as a child of the corresponding request and extract the value with the expression below.
Expression: "user" : "(.*?)"
Ref: https://jmeter.apache.org/usermanual/regular_expressions.html
Regular Expression Extractor Sample
I found a solution for JSON-to-CSV conversion. Below are the sample JSON and the solution.
{
"took" : 111,
"timed_out" : false,
"_shards" : {
"total" : 1,
"successful" : 1,
"skipped" : 0,
"failed" : 0
},
"hits" : {
"total" : {
"value" : 2,
"relation" : "eq"
},
"max_score" : 1.0,
"hits" : [
{
"_index" : "alerts",
"_type" : "_doc",
"_id" : "1",
"_score" : 1.0,
"_source" : {
"alertID" : "639387c3-0fbe-4c2b-9387-c30fbe7c2bc6",
"alertCategory" : "Server Alert",
"description" : "Successfully started.",
"logId" : null
}
},
{
"_index" : "alerts",
"_type" : "_doc",
"_id" : "2",
"_score" : 1.0,
"_source" : {
"alertID" : "2",
"alertCategory" : "Server Alert",
"description" : "Successfully stoped.",
"logId" : null
}
}
]
}
}
The solution:
jq -r '.hits.hits[]._source | [ "alertID", "alertCategory", "description", "logId" ], ([."alertID", ."alertCategory", ."description", ."logId" // "null"]) | @csv' < /root/events.json
The problem with this solution is that I have to hard-code the column names. What if my JSON gets a few additions under the _source tag later? I need a solution that can handle dynamic data under _source. I am open to any other shell tool or command.
Simply use keys_unsorted (or keys if you want them sorted). See e.g. Convert JSON array into CSV using jq or How to convert arbitrary simple JSON to CSV using jq? for two SO examples. There are many others too.
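A minimal sketch of that approach, assuming every hit's _source carries the same keys as the first one (nulls come out as empty CSV fields rather than the literal "null"):
jq -r '
  (.hits.hits[0]._source | keys_unsorted) as $cols
  | $cols, (.hits.hits[]._source | [ .[$cols[]] ])
  | @csv
' < /root/events.json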
I am using the open-source version of JFrog Artifactory for my CI/CD activities, which run via a Jenkins pipeline. I am a novice at Groovy/Java.
The REST API of open-source Artifactory does not support retrieving the latest build from a repository. With the Jenkins pipeline in play, I was wondering if I could extract the data from the JSON response provided by Artifactory using Jenkins' native Groovy support (just to avoid an external service that would have to run via Python/Java/shell).
I am looking to put the extracted JSON response into a Map, sort the Map in descending order, and extract the first key-value pair, which contains the latest build info.
I end up getting "-1" as the response when I try to extract the data.
import groovy.json.JsonSlurper
def response = httpRequest authentication: 'ArtifactoryAPIKey', consoleLogResponseBody: false, contentType: 'TEXT_PLAIN', httpMode: 'POST', requestBody: '''
items.find({
"$and": [
{"repo": {"$match": "libs-snapshot-local"}},
{"name": {"$match": "simple-integration*.jar"}}
]
})''', url: 'http://<my-ip-and-port-info>/artifactory/api/search/aql'
def jsonParser = new JsonSlurper()
Map jsonOutput = jsonParser.parseText(response.content)
List resultsInfo = jsonOutput['results']
print(resultsInfo[0].created)
def sortedResult = resultsInfo.sort { a, b -> b["created"] <=> a["created"] }
sortedResult.each {
    println it
}
The sample JSON to be parsed:
{
"results" : [ {
"repo" : "libs-snapshot-local",
"path" : "simple-integration/2.5.150",
"name" : "simple-integration-2.5.150.jar",
"type" : "file",
"size" : 1175,
"created" : "2019-06-23T19:51:30.367+05:30",
"created_by" : "admin",
"modified" : "2019-06-23T19:51:30.364+05:30",
"modified_by" : "admin",
"updated" : "2019-06-23T19:51:30.368+05:30"
},{
"repo" : "libs-snapshot-local",
"path" : "simple-integration/2.5.140",
"name" : "simple-integration-2.5.140.jar",
"type" : "file",
"size" : 1175,
"created" : "2019-06-21T19:52:40.670+05:30",
"created_by" : "admin",
"modified" : "2019-06-21T19:52:40.659+05:30",
"modified_by" : "admin",
"updated" : "2019-06-21T19:52:40.671+05:30"
},{
"repo" : "libs-snapshot-local",
"path" : "simple-integration/2.5.150",
"name" : "simple-integration-2.5.160.jar",
"type" : "file",
"size" : 1175,
"created" : "2019-06-28T19:58:04.973+05:30",
"created_by" : "admin",
"modified" : "2019-06-28T19:58:04.970+05:30",
"modified_by" : "admin",
"updated" : "2019-06-28T19:58:04.973+05:30"
} ],
"range" : {
"start_pos" : 0,
"end_pos" : 3,
"total" : 3
}
}
//The output I am looking for: the latest build info, with the fields "created" and "name"
def jsonOutput = new groovy.json.JsonSlurper().parseText('''
{
"results" : [ {
"repo" : "libs-snapshot-local",
"path" : "simple-integration/2.5.150",
"name" : "simple-integration-2.5.150.jar",
"type" : "file",
"size" : 1175,
"created" : "2019-06-23T19:51:30.367+05:30",
"created_by" : "admin",
"modified" : "2019-06-23T19:51:30.364+05:30",
"modified_by" : "admin",
"updated" : "2019-06-23T19:51:30.368+05:30"
},{
"repo" : "libs-snapshot-local",
"path" : "simple-integration/2.5.140",
"name" : "simple-integration-2.5.140.jar",
"type" : "file",
"size" : 1175,
"created" : "2019-06-21T19:52:40.670+05:30",
"created_by" : "admin",
"modified" : "2019-06-21T19:52:40.659+05:30",
"modified_by" : "admin",
"updated" : "2019-06-21T19:52:40.671+05:30"
},{
"repo" : "libs-snapshot-local",
"path" : "simple-integration/2.5.150",
"name" : "simple-integration-2.5.160.jar",
"type" : "file",
"size" : 1175,
"created" : "2019-06-28T19:58:04.973+05:30",
"created_by" : "admin",
"modified" : "2019-06-28T19:58:04.970+05:30",
"modified_by" : "admin",
"updated" : "2019-06-28T19:58:04.973+05:30"
} ],
"range" : {
"start_pos" : 0,
"end_pos" : 3,
"total" : 3
}
}
''')
def last = jsonOutput.results.sort{a, b -> b.created <=> a.created }[0]
println last.created
println last.name
The problem here is not with the Groovy code but with the Jenkins pipeline.
The code in the question, as well as the solution provided by @daggett, works like a charm in any Groovy IDE, but fails when run via a Jenkins pipeline.
The issue URL: https://issues.jenkins-ci.org/browse/JENKINS-44924
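Until then, a commonly used workaround (not from the original thread) is to move the closure-based sort into a @NonCPS helper method so it runs outside the CPS transform. A minimal sketch for a scripted pipeline:
import groovy.json.JsonSlurper

@NonCPS
def latestBuild(String body) {
    def results = new JsonSlurper().parseText(body).results
    // sort newest first outside the CPS transform and return only plain values
    def latest = results.sort { a, b -> b.created <=> a.created }[0]
    return [name: latest.name, created: latest.created]
}

// after the httpRequest step:
def latest = latestBuild(response.content)
echo "latest build: ${latest.name} created ${latest.created}"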
I hope they fix it soon.
Thanks for your help guys.
I have this JSON:
{
"totalMemory" : 12206567424,
"totalProcessors" : 4,
"version" : "0.4.1",
"agent" : {
"reconnectRetrySec" : 5,
"agentName" : "1001",
"checkRecovery" : false,
"backPressure" : 10000,
"throttler" : 100
},
"logPath" : "/eq/equalum/eqagent-0.4.1.0-SNAPSHOT/logs",
"startTime" : 1494837249902,
"status" : {
"current" : "active",
"currentMessage" : null,
"previous" : "pending",
"previousMessage" : "Recovery:Starting pipelines"
},
"autoStart" : false,
"recovery" : {
"agentName" : "1001",
"partitionInfo" : { },
"topicToInitialCapturePosition" : { }
},
"sources" : [ {
"dataSource" : "oracle",
"name" : "oracle_source",
"captureType" : "directOverApi",
"streams" : [ ],
"idlePollingFreqMs" : 100,
"status" : {
"current" : "active",
"currentMessage" : null,
"previous" : "pending",
"previousMessage" : "Trying to init storage"
},
"host" : "192.168.191.5",
"metricsType" : { },
"bulkSize" : 10000,
"user" : "STACK",
"password" : "********",
"port" : 1521,
"service" : "equalum",
"heartbeatPeriodInMillis" : 1000,
"lagObjective" : 1,
"dataSource" : "oracle"
} ],
"upTime" : "157 min, 0 sec",
"build" : "0-SNAPSHOT",
"target" : {
"targetType" : "equalum",
"agentID" : 1001,
"engineServers" : "192.168.56.100:9000",
"kafkaOptions" : null,
"eventsServers" : "192.168.56.100:9999",
"jaasConfigurationPath" : null,
"securityProtocol" : "PLAINTEXT",
"stateMonitorTopic" : "_state_change",
"targetType" : "equalum",
"status" : {
"current" : "active",
"currentMessage" : null,
"previous" : "pending",
"previousMessage" : "Recovery:Starting pipelines"
},
"serializationFormat" : "avroBinary"
}
}
I am trying to use JMeter to extract the value of agentID. How can I do that in JMeter, and which would be better: a Regular Expression Extractor or a JSON Extractor?
What I am trying to do is extract the agentID value in order to use it in another HTTP Request sampler, but first I have to extract it from this response.
Thanks!
I believe using the JSON Extractor is the best way to get this agentID value; the relevant JsonPath query will be as simple as $..agentID
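For reference, a minimal JSON Extractor configuration (the JMeter variable name agentID is just an example):
Names of created variables: agentID
JSON Path expressions:      $..agentID
Match No. (0 for Random):   1
Default Values:             NOT_FOUND
The extracted value can then be referenced in the next sampler as ${agentID}.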
See the following reference material:
JsonPath - Getting Started - for initial information regarding JsonPath language, functions, operators, etc.
JMeter's JSON Path Extractor Plugin - Advanced Usage Scenarios - for more complex scenarios.