In the JSON below, I need to find the value of the id key of every object where the value of state starts with "failed-"
[
{
"id": "RA_kwDOGETrS84EmTf2",
"state": "uploaded"
},
{
"id": "RA_kwDOGETrS84EmTf6",
"state": "failed-4325423"
},
{
"id": "RA_kwDOGETrS84EmTf7",
"state": "uploaded"
}
]
I got as far as extracting just the matching values of state:
.[] | .state | select(startswith("failed-"))
How do I find the corresponding values of id ?
.[] | select(.state | startswith("failed-")).id
Will output:
"RA_kwDOGETrS84EmTf6"
The trick is to pass .state to startswith() inside select(), and then take .id of each object that passes the filter.
.[] | select(.state | test("^failed")).id
is another way
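Both variants can be run against the sample above, saved as, say, input.json (the filename is just for illustration):

```shell
# Print the id of every object whose state starts with "failed-".
jq '.[] | select(.state | startswith("failed-")).id' input.json
jq '.[] | select(.state | test("^failed")).id' input.json
# each prints: "RA_kwDOGETrS84EmTf6"
```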
I want to parse a nested json to csv. The data looks similar to this.
{"tables":[{"name":"PrimaryResult","columns":[{"name":"name","type":"string"},{"name":"id","type":"string"},{"name":"custom","type":"dynamic"}]"rows":[["Alpha","1","{\"age\":\"23\",\"number\":\"xyz\"}]]]}
I want csv file as:
name id age number
alpha 1 23 xyz
I tried:
jq -r ".tables | .[] | .columns | map(.name)|@csv" demo.json > demo.csv
jq -r ".tables | .[] | .rows |.[]|@csv" demo.json >> demo.csv
But I am not getting the expected result.
Output:
name id custom
alpha 1 {"age":"23","number":"xyz"}
Expected:
name id age number
alpha 1 23 xyz
Assuming valid JSON input:
{
"tables": [
{
"name": "PrimaryResult",
"columns": [
{ "name": "name", "type": "string" },
{ "name": "id", "type": "string" },
{ "name": "custom", "type": "dynamic" }
],
"rows": [
"Alpha",
"1",
"{\"age\":\"23\",\"number\":\"xyz\"}"
]
}
]
}
And assuming fixed headers:
jq -r '["name", "id", "age", "number"],
(.tables[].rows | [.[0,1], (.[2] | fromjson | .age, .number)])
| @csv' input.json
Output:
"name","id","age","number"
"Alpha","1","23","xyz"
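The key step is fromjson, which decodes a JSON-encoded string into a proper value that can then be indexed; a minimal sketch:

```shell
# Decode an embedded JSON string, then index into the result.
printf '%s' '"{\"age\":\"23\"}"' | jq 'fromjson | .age'
# prints: "23"
```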
If any of the assumptions is wrong, you need to clarify your requirements, e.g.
How are column names determined?
What happens if the input contains multiple tables?
Is the "dynamic" object always of the same shape? Or can it sometimes contain fewer, more, or different columns?
Assuming that the .rows array is a 2D array of rows and fields, and that a column of type "dynamic" always contains a JSON-encoded object whose fields represent further columns, which may or may not be present in every row.
Then you could transpose the headers array and the rows array in order to process each column according to its type, collecting all keys from the "dynamic" columns on the fly, and then transpose back to get row-based CSV output.
Input (I have added another row for illustration):
{
"tables": [
{
"name": "PrimaryResult",
"columns": [
{
"name": "name",
"type": "string"
},
{
"name": "id",
"type": "string"
},
{
"name": "custom",
"type": "dynamic"
}
],
"rows": [
[
"Alpha",
"1",
"{\"age\":\"23\",\"number\":\"123\"}"
],
[
"Beta",
"2",
"{\"age\":\"45\",\"word\":\"xyz\"}"
]
]
}
]
}
Filter:
jq -r '
.tables[] | [.columns, .rows[]] | transpose | map(
if first.type == "string" then first |= .name
elif first.type == "dynamic" then
.[1:] | map(fromjson)
| (map(keys[]) | unique) as $keys
| [$keys, (.[] | [.[$keys[]]])] | transpose[]
else empty end
)
| transpose[] | @csv
'
Output:
"name","id","age","number","word"
"Alpha","1","23","123",
"Beta","2","45",,"xyz"
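To see what transpose does at each step, a minimal sketch on a tiny hypothetical input:

```shell
# transpose turns an array of rows into an array of columns (and back).
printf '%s' '[["a","b"],["1","2"]]' | jq -c 'transpose'
# prints: [["a","1"],["b","2"]]
```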
{
"Users": [
{
"Attributes": [
{
"Name": "sub",
"Value": "1"
},
{
"Name": "phone_number",
"Value": "1234"
},
{
"Name": "referral_code",
"Value": "abc"
}
]
},
{
"Attributes": [
{
"Name": "sub",
"Value": "2"
},
{
"Name": "phone_number",
"Value": "5678"
},
{
"Name": "referral_code",
"Value": "def"
}
]
}
]
}
How can I produce output like below ?
1,1234,abc
2,5678,def
jq '.Users[] .Attributes[] .Value' test.json
produces
1
1234
abc
2
5678
def
Not sure this is the cleanest way to handle this, but the following will get the desired output:
.Users[].Attributes | map(.Value) | @csv
Loop over each user's Attributes with .Users[].Attributes
map() to collect all the Values
Convert to CSV with @csv
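Putting it together against the sample (assuming it is saved as test.json, per the question):

```shell
jq -r '.Users[].Attributes | map(.Value) | @csv' test.json
# prints:
# "1","1234","abc"
# "2","5678","def"
```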
If you don't need the output to be guaranteed to be CSV, and if you're sure the "Name" values are presented in the same order, you could go with:
.Users[].Attributes
| from_entries
| [.[]]
| join(",")
To be safe though it would be better to ensure consistency of ordering:
(.Users[0] | [.Attributes[] | .Name]) as $keys
| .Users[]
| .Attributes
| from_entries
| [.[ $keys[] ]]
| join(",")
Using join(",") will produce the comma-separated values shown in the question (without the quotation marks), but it is not guaranteed to produce valid CSV for all possible input values. If you don't mind the pesky quotation marks, you could use @csv; or, if you only want to skip the quotation marks around numeric values:
map(tonumber? // .) | @csv
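To see the difference concretely, a minimal sketch on a hypothetical flat row:

```shell
# join(",") emits bare values; @csv quotes strings; tonumber? unquotes numerics.
printf '%s' '["2","5678","abc"]' | jq -r 'join(",")'
# prints: 2,5678,abc
printf '%s' '["2","5678","abc"]' | jq -r '@csv'
# prints: "2","5678","abc"
printf '%s' '["2","5678","abc"]' | jq -r 'map(tonumber? // .) | @csv'
# prints: 2,5678,"abc"
```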
My json is as shown below:
[
[
{
"id": "abcd"
},
{
"address": [
"140 Deco st"
]
}
],
[
{
"id": "xyz"
},
{
"dummy": "This is dummy"
}
],
[
{
"id": "12356"
},
{
"address": [
"140 Deco st"
]
}
]]
Now, I want to capture only those ids who have dummy value of "This is dummy". Some of the data may or may not have dummy and address fields.
I tried below but it gave me error "... cannot have their containment checked"
jq -c '.[] | .[] | select(.dummy | contains("This is dummy")) | .[] | .id'
Any help is much appreciated!
contains is quite tricky to use correctly. Since the requirement is:
to capture only those ids who have dummy value of "This is dummy"
I would suggest:
.[]
| select( any(.[]; .dummy == "This is dummy") )
| add
| .id
or perhaps (depending on your detailed requirements):
.[]
| select( any(.[]; .dummy == "This is dummy") )
| .[]
| .id? // empty
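Applied to the sample array (saved as input.json for illustration), the first variant yields:

```shell
# Select inner arrays containing a matching "dummy", merge each with add, take .id.
jq '.[] | select(any(.[]; .dummy == "This is dummy")) | add | .id' input.json
# prints: "xyz"
```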
Example json:
{
"version": "3",
"services": {
"web": {
"build": "web"
},
"redis": {
"image": "redis"
},
"datadog": {
"build": "datadog"
},
"another": {
"image": "mysql"
}
}
}
I'd like to return a list of services that have the "build" key, and not the "image" key. Note that the value for the build key isn't something I can key off of.
Output should be: ["web", "datadog"]
Here are two ways that work:
1.
jq '.services
| . as $services
| keys_unsorted
| map( select($services[.] | has("build")) )'
(Drill down to .services, remember it as $services for later use, get the list of keys, and select the ones such that the corresponding value in $services has a build key).
2.
jq '.services
| to_entries
| map( select(.value | has("build")) | .key)'
(Drill down to .services, convert to a list of {"key": ..., "value": ...} objects, select the ones where the .value has a build key, and return the .key for each).
The second is probably more idiomatic jq, but the first provides an interesting way to think about the problem as well.
Here's a third approach, notable for being oblivious to the upper reaches:
[paths(scalars)
| select(.[-1] == "build")
| .[-2]]
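For instance, the to_entries variant run against the sample (saved as input.json for illustration):

```shell
# List the service names whose value has a "build" key.
jq -c '.services | to_entries | map(select(.value | has("build")) | .key)' input.json
# prints: ["web","datadog"]
```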
Need help parsing JSON data using jq. I used to parse the data using the JSONPath expression [?(@.type=='router')].externalIP, but I am not sure how to do the same using jq.
The query should return the externalIP of entries whose type is router.
198.22.66.99
Json data snippet as below
[
{
"externalHostName": "localhost",
"externalIP": "198.22.66.99",
"internalHostName": "localhost",
"isUp": true,
"pod": "gateway",
"reachable": true,
"region": "dc-1",
"type": [
"router"
],
"uUID": "b5f986fe-982e-47ae-8260-8a3662f25fc2"
}
]
cat your-data.json | jq '.[]|.externalIP|select(type=="string")'
"198.22.66.99"
"192.22.66.29"
"192.22.66.89"
"192.66.22.79"
explanation:
.[] | .externalIP | select(type=="string")
for every array entry | get field 'externalIP' | drop nulls
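The select(type=="string") is what drops entries lacking externalIP, since .externalIP on those yields null; a minimal sketch on a hypothetical two-entry input:

```shell
# The second entry has no externalIP, so .externalIP yields null and is dropped.
printf '%s' '[{"externalIP":"1.2.3.4"},{"pod":"x"}]' \
  | jq '.[] | .externalIP | select(type=="string")'
# prints: "1.2.3.4"
```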
EDIT/ADDENDUM: filter on type (expects "router" to be at index 0 of the type array)
cat x | jq '.[]|select(.type[0] == "router")|.externalIP'
"198.22.66.99"
"192.22.66.89"
The description:
i would like to extract externalIP for only the array "type": [ "router" ]
The corresponding jq query is:
.[] | select(.type==["router"]) | .externalIP
To base the query on whether "router" is amongst the specified types:
.[] | select(.type|index("router")) | .externalIP
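Assuming the sample is saved as valid JSON in input.json (filename for illustration), both queries print the router's address:

```shell
# Exact match on the whole type array, and membership test via index(), respectively.
jq -r '.[] | select(.type == ["router"]) | .externalIP' input.json
jq -r '.[] | select(.type | index("router")) | .externalIP' input.json
# each prints: 198.22.66.99
```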