I have Splunk search results in JSON format like the one below:
"{
"Name": "RUNQDATA",
"RunId": "2021021701",
"Details": <{
"RunQID": "796562",
"TQID": "796562",
"Ent": {
"NAME": "Inv",
"Store": {
"NAME": "FSW",
"TYPE": "QUEUE",
"USERNAME": "abc"
}
},
"ADD_COUNT": "5740",
"UPDATE_COUNT": "0",
"DELETE_COUNT": "0"
}>,
"status": "success",
}"
How can I extract fields like ADD_COUNT or UPDATE_COUNT from this? I tried spath and other options, but was not able to get the required results, probably because the JSON contains <>.
Any help here is appreciated.
Confirmed. If the angle brackets are removed, the spath command will parse the whole thing; spath doesn't handle malformed JSON.
If you can't change the format of the event, you'll have to use the rex command to extract the fields, as in this run-anywhere example:
| makeresults
| eval _raw="{
\"Name\": \"RUNQDATA\",
\"RunId\": \"2021021701\",
\"Details\": <{
\"RunQID\": \"796562\",
\"TQID\": \"796562\",
\"Ent\": {
\"NAME\": \"Inv\",
\"Store\": {
\"NAME\": \"FSW\",
\"TYPE\": \"QUEUE\",
\"USERNAME\": \"abc\"
}
},
\"ADD_COUNT\": \"5740\",
\"UPDATE_COUNT\": \"0\",
\"DELETE_COUNT\": \"0\"
}>,
\"status\": \"success\",
}"
| rex "UPDATE_COUNT\": \"(?<UPDATE_COUNT>\d+)"
| rex "DELETE_COUNT\": \"(?<DELETE_COUNT>\d+)"
Basically, I have retrieved values from a DynamoDB table using PowerShell and have gotten a row in JSON format like the following:
function gettagsfromdynamo() {
$table_name = "table_name"
$dbkey = '{"ReleaseId":{"AttributeValueList":[ {"N":"1"} ],"ComparisonOperator": "EQ"}}' | ConvertTo-Json -Compress
$dbvalue = aws dynamodb query --table-name $table_name --key-conditions $dbkey --region $region
$latest_tags = $dbvalue
$latest_tags
}
$db_tags = gettagsfromdynamo
This is what $db_tags looks like:
{
"Items": [
{
"Comment": {
"S": "The first PC release."
},
"Product": {
"S": "PC"
},
"ReleaseId": {
"N": "1"
},
"CreatedOn": {
"S": "12/14/2020 15:23:32"
},
"Tags": {
"S": "{\n \"software1\": \"software1.zip\",\n \"software2\": \"software2.zip\",\n \"software3\":\n [\n \"software3.zip\",\n \"software4.zip\",\n \"software5.zip\"\n ],\n \" data1 \": \"2020_NA\",\n \" 2020_EU \": \"20201_EU\",\n \" 2020_WW \": \"2021_WW\",\n \" dataversions\":\n [\n \"2020\",\n \"2019\",\n \"2018\",\n \"2017"\n ],\n \" products \": \" \"\n}"
}
}
],
"Count": 1,
"ScannedCount": 1,
"ConsumedCapacity": null
}
The task I want to achieve is to get the "dataversions" value, which sits at [Items][Tags][dataversions], and write that value to a local JSON file. I have tried various things, including ConvertTo-Json and ConvertFrom-Json.
The Tags JSON value looks like this without the escaped newlines (\n):
{
"software1": "software1.zip",
"software2": "software2.zip",
"software3":
[
"software3.zip",
"software4.zip",
"software5.zip"
],
" data1": "2020_NA",
" 2020_eu ": "2020_EU",
" 2020_ww": "2021_WW",
" dataversions":
[
"2020",
"2019",
"2018",
"2017"
],
" products ": " "
}
How do I retrieve the value of 'dataversions', which is a list of strings? Right now, after various attempts, I can only get it like this:
{"S":"{\n \"software1\": \"software1.zip\",\n \"software2\": \"software2.zip\",\n \"software3\":\n [\n \"software3.zip\",\n \"software4.zip\",\n \"software5.zip\"\n ],\n \" data1\": \"2020_NA\",\n \" 2020_EU\": \"20201_EU\",\n \" 2020_WW\": \"2021_WW\",\n \" dataversions\":\n [\n \"2020\",\n \"2019\",\n \"2018\",\n \"2017\"\n ],\n \" products\": \" \"\n}"}
I want to get the value of dataversions in order to overwrite another 'dataversions' inside the example.json file. How do I get to the dataversions value and also clean up the \n?
In your JSON file, the property name " dataversions" contains a leading space. When the JSON string is converted to a custom object (via ConvertFrom-Json), the space will be included in the property name. Therefore it must be considered when using member access (object.property) syntax:
($db_tags | ConvertFrom-Json).' dataversions'
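To go all the way from the raw DynamoDB response to that value, a rough sketch (untested; the output file name is only an example) is to parse twice, because Items[0].Tags.S is itself a JSON string:
# The AWS CLI output captured in $db_tags is typically an array of lines, so join it before parsing
$outer = ($db_tags -join "`n") | ConvertFrom-Json
# Tags.S holds an embedded JSON document; parse it a second time
$tags = $outer.Items[0].Tags.S | ConvertFrom-Json
# The property name has a leading space, so it must be quoted
$dataversions = $tags.' dataversions'
# ConvertFrom-Json already discards the \n noise; write the list back out as JSON
$dataversions | ConvertTo-Json | Set-Content -Path '.\dataversions.json'
From there, the array can be merged into example.json by reading that file with ConvertFrom-Json, replacing its own 'dataversions' property, and writing it back with ConvertTo-Json.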
I'm trying to convert a 7z file content listing to JSON and can't fix the missing separator between the converted output blocks.
I'm a bit of a newbie at JSON conversion, but I found that jq could do the job.
I read the jq documentation and found examples here and there, and elsewhere too, but no solution.
Please find the use case:
The command line:
jq -f pf_7z.jq -R
The input file demo.lst:
Date Time Attr Size Compressed Name
------------------- ----- ------------ ------------ ------------------------
2018-06-23 14:02:16 D.... 0 0 Installer
2018-06-23 14:02:16 ..... 3381 1157 Installer\Readme
2018-06-23 14:02:16 ..... 4646 1157 Installer\License.txt
2018-06-23 14:02:16 ..... 138892 136152 Installer\Setup.exe
The filter file pf_7z.jq:
def parse:
def parse_line:
. | map(match("(\\d+-\\d+-\\d+) (\\d+:\\d+:\\d+) (D|.).* +(\\d+) +(\\d+) +(.*\\\\)([^\\\\]*)\\.(.*)")) | .[] |
({
"date" :(.captures[0].string),
"time" :(.captures[1].string),
"attr" :(.captures[2].string),
"size" :(.captures[3].string),
"path" :(.captures[5].string),
"name" :(.captures[6].string),
"extn" :(.captures[7].string)
});
split("\n") | ( {} + (parse_line));
parse
The expected result should be:
{
"date": "2018-06-23",
"time": "14:02:16",
"attr": ".",
"size": "4646",
"path": "Installer\",
"name": "License",
"extn": "txt"
},
{
"date": "2018-06-23",
"time": "14:02:16",
"attr": ".",
"size": "138892",
"path": "Installer\",
"name": "Setup",
"extn": "exe"
}
And I only got:
{
"date": "2018-06-23",
"time": "14:02:16",
"attr": ".",
"size": "4646",
"path": "Installer\",
"name": "License",
"extn": "txt"
}
{
"date": "2018-06-23",
"time": "14:02:16",
"attr": ".",
"size": "138892",
"path": "Installer\",
"name": "Setup",
"extn": "exe"
}
without the comma separator between blocks.
Thanks ;-)
Your def for parse_line produces a stream of JSON entities, whereas you evidently want a JSON array. Using your regex, you could write:
def parse:
def parse_line:
match("(\\d+-\\d+-\\d+) (\\d+:\\d+:\\d+) (D|.).* +(\\d+) +(\\d+) +(.*\\\\)([^\\\\]*)\\.(.*)")
| .captures
| map(.string)
| { "date" :.[0],
"time" :.[1],
"attr" :.[2],
"size" :.[3],
"path" :.[5],
"name" :.[6],
"extn" :.[7] } ;
[inputs | parse_line];
parse
Invocation
jq -nR -f 7z.jq 7z.txt
Alternative regex
The regex fragment (D|.).* does not make much sense.
You should consider replacing it by (.)[^ ]* or some such.
A simpler solution
def parse_line:
capture("(?<date>\\d+-\\d+-\\d+) "
+ "(?<time>\\d+:\\d+:\\d+) "
+ "(?<attr>.)[^ ]* +"
+ "(?<size>\\d+) +\\d+ +"
+ "(?<path>.*\\\\)"
+ "(?<name>[^\\\\]*)\\."
+ "(?<extn>.*)");
[inputs | parse_line]
An alternative approach
From the comment about JSONEdit, it seems likely to me that your overall approach might be suboptimal. Have you considered using jq on its own rather than jq together with JSONEdit?
I am trying to extract the values of 3 fields (status, id, name) from my JSON file using the jq tool. Here is my JSON:
cat parse.json
{
"stream": {
"_id": 65675798730520654496,
"broadcast_platform": "live",
"community_id": "",
"community_ids": [],
"average_fps": 60.0247524752,
"delay": 0,
"created_at": "2018-09-26T07:25:38Z",
"is_playlist": false,
"stream_type": "live",
"preview": {
"small": "https://static-cdn.jtvnw.net/previews-ttv/live_user_versuta-80x4512wdfqf.jpg",
},
"channel": {
"mature": true,
"status": "status",
"broadcaster_language": "ru",
"broadcaster_software": "",
"_id": 218025408945123423423445,
"name": "djvbsdhvsdvasdv",
"created_at": "2011-04-17T17:31:36.091604Z",
"updated_at": "2018-09-26T09:49:04.434245Z",
"partner": true,
"video_banner": null,
"profile_banner": "https://static-cdn.jtvnw.net/jtv_user_pictures/25c2bec3-95b8-4347-aba0-128b3b913b0d-profile_banner-480.png",
"profile_banner_background_color": "",
"views": 103911737,
"followers": 446198,
"broadcaster_type": "",
"description": "",
"private_video": false,
"privacy_options_enabled": false
}
}
}
Online JSON validators say that it is valid, but when I try to get some field it returns null:
cat parse.json | jq '.channel'
null
cat parse.json | jq '.channel.status'
null
What am I doing wrong, guys?
Your JSON object has a top-level field "stream". You need to go through "stream" to access the other sub-properties, e.g. channel:
jq '.stream.channel.status' parse.json
You can also do cat parse.json | jq '.stream.channel.status'.
Your example JSON is also invalid because the stream.preview.small property has a trailing comma. Simply removing that comma will make it valid, though.
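Once that trailing comma is removed, a minimal example pulling all three requested fields could be:
jq '.stream.channel | .status, ._id, .name' parse.json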
To deal with the invalid JSON, you could use a JSON rectifier such as hjson; to avoid any hassles associated with identifying the relevant paths, you could use ..|objects. Thus for example:
$ hjson -j parse.json | jq '..|objects|select(.status) | .status, ._id, .name'
"status"
218025408945123440000000
"djvbsdhvsdvasdv"
I need to convert this JSON to TSV format. I have a source file like this:
{
"event": "log",
"timestamp": 1535306331840,
"tags": [
"info"
],
"data": {
"_id": "A301180827005852329209020",
"msisdn": "6282134920902",
"method": "get",
"url": "/api/tcash/balance",
"timeTaken": 32,
"channelid": "UX"
},
"pid": 7920
}
Then I want to convert it to a TSV consisting of the columns below:
event, timestamp, tags, _id, msisdn, method, url, timeTaken, channelID, pid
You just have to construct an array of atomic values. Since .tags is not atomic, in the following I'll assume (as suggested by @chepner) that we can use .tags|join(","), though you might want to use something else, such as .tags|@csv. Note that the key in .data is channelid (lowercase) and jq field access is case-sensitive:
[.event, .timestamp, (.tags | join(","))]
+ (.data|[._id, .msisdn, .method, .url, .timeTaken, .channelid])
+ [.pid]
| @tsv
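To get actual tab-separated text rather than a JSON-encoded string, the filter would be run with jq's -r (raw output) flag; the file names here are just placeholders:
jq -r '[.event, .timestamp, (.tags | join(","))]
  + (.data|[._id, .msisdn, .method, .url, .timeTaken, .channelid])
  + [.pid]
  | @tsv' source.json > output.tsv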
JSON in question:
{
"search_id": "",
"type": "Search.filter",
"query": "bar,club",
"params": {
"search_id": "",
"user_id": "",
"client": "ios",
"lat": 40.73199375351,
"lon": -74.00080404533901,
"radius": 20
}
}
Code to Retrieve the Data:
val json = Json.parse(new String(body))
println((json \ "search_id") + " | " + (json \ "query"))
println(json)
Printing just the json JsValue prints out the entire JSON as expected. Printing the first line produces: "" | "bar,club"
Why is it keeping the quotes from the JSON formatting? They're not part of the string; they just indicate that the content inside them is a string. How do I fix this?
According to the docs, you should call .as[SomeType] (unsafe conversion) or .asOpt[SomeType] (safe):
println((json \ "search_id").as[String] + " | " + (json \ "query").as[String])