Iterate over JSON object keys and values with jq [duplicate]

This question already has answers here:
How to convert a JSON object to key=value format in jq?
Here is jq getting field a1:
$ echo '{"a": {"a1": 1}, "b": {"a1": 2}}' | jq -r ".[] | .a1"
1
2
I'd like to get the keys a and b as well, and output a flat result, i.e.
$ echo '{"a": {"a1": 1}, "b": {"a1": 2}}' | jq -r "<magic here>"
a 1
b 2
Suggestions?

I guess I found an answer:
$ echo '{"a": {"a1": 1}, "b": {"a1": 2}}' | \
jq -r 'to_entries[] | [.key, .value.a1] | @tsv'
a 1
b 2
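For reference, to_entries turns the object into an array of {key, value} pairs, which is what makes the key available next to the value:
$ echo '{"a": {"a1": 1}, "b": {"a1": 2}}' | jq -c 'to_entries'
[{"key":"a","value":{"a1":1}},{"key":"b","value":{"a1":2}}]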

Here's an alternative using keys (or keys_unsorted):
$ echo '{"a": {"a1": 1}, "b": {"a1": 2}}' |
jq -r 'keys[] as $key | [$key, .[$key].a1] | @tsv'
a 1
b 2
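Note that keys sorts the key names, while keys_unsorted keeps them in the order they appear in the input, so pick whichever order you want in the output:
$ echo '{"b": {"a1": 2}, "a": {"a1": 1}}' | jq -c 'keys, keys_unsorted'
["a","b"]
["b","a"]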

Related

How to not let jq interpret the newline character when exporting to CSV

I want to convert the following JSON content stored in a file tmp.json
{
  "results": [
    [
      {
        "field": "field1",
        "value": "value1-1"
      },
      {
        "field": "field2",
        "value": "value1-2\n"
      }
    ],
    [
      {
        "field": "field1",
        "value": "value2-1"
      },
      {
        "field": "field2",
        "value": "value2-2\n"
      }
    ]
  ]
}
into a CSV output
"field1","field2"
"value1-1","value1-2\n"
"value2-1","value2-2\n"
When I use this jq command, however,
cat tmp.json | jq -r '.results | (first | map(.field)), (.[] | map(.value)) | @csv'
I get this result:
"field1","field2"
"value1-1","value1-2
"
"value2-1","value2-2
"
How should the jq command be written to get the desired CSV result?
For a jq-only solution, you can use gsub("\n"; "\\n"). I'd go with something like this:
.results
| (.[0] | map(.field)),
(.[] | map( .value | gsub("\n"; "\\n")))
| @csv
Using your JSON and invoking this with the -r command line option yields:
"field1","field2"
"value1-1","value1-2\n"
"value2-1","value2-2\n"
If newlines are the only problematic characters you need to handle, you can do a string replacement:
cat tmp.json | jq -r '.results | (first | map(.field)), (.[] | map(.value) | map(gsub("\\n"; "\\n"))) | @csv'
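To see what the replacement does in isolation, here is gsub on a plain string; the embedded newline becomes a literal backslash-n:
$ jq -rn '"value1-2\n" | gsub("\n"; "\\n")'
value1-2\n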

jq, variables, select and numbers?

It seems that jq's select, when used with variables, only works for strings, not for numbers.
How do I use jq so that select works regardless of whether the value is a number or a string?
$ cat test.json
[
  {
    "id": "some-text",
    "output": "jq select works!"
  },
  {
    "id": 123,
    "output": "jq select doesn't work :("
  }
]
Let's try to select "output" where "id" is "some-text" - it works:
$ ID="some-text"
$ cat test.json | jq -r --arg ID "$ID" '.[] | select(.id==$ID) | .output'
jq select works!
$
Let's try to select "output" where "id" is "123" - it doesn't work:
$ ID="123"
$ cat test.json | jq -r --arg ID "$ID" '.[] | select(.id==$ID) | .output'
$
Interestingly, "select" does work if I don't pass the variable via --arg:
$ cat test.json | jq -r '.[] | select(.id==123) | .output'
jq select doesn't work :(
What am I doing wrong?
You can use --argjson instead of --arg to let jq treat it as a number:
➜ ID="123"
➜ cat test.json | jq -r --argjson ID "$ID" '.[] | select(.id==$ID) | .output'
jq select doesn't work :(
➜
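The difference is the variable's type: --arg always binds a string, while --argjson parses its argument as JSON, so "123" becomes the number 123:
➜ jq -nc --arg ID "123" --argjson IDNUM "123" '[$ID, $IDNUM] | map(type)'
["string","number"]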
You could also use tostring to convert the id to a string, so you can keep the existing format:
➜ ID="123"
➜ cat test.json | jq -r --arg ID "$ID" '.[] | select(.id|tostring==$ID) | .output'
jq select doesn't work :(
➜
➜ ID="some-text"
➜ cat test.json | jq -r --arg ID "$ID" '.[] | select(.id|tostring==$ID) | .output'
jq select works!
➜
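If you don't know in advance whether the id is a number or a string, one possible sketch (not from the answers above) is to try converting the variable with tonumber? and fall back to the original string:
➜ ID="123"
➜ cat test.json | jq -r --arg ID "$ID" '.[] | select(.id == ($ID | tonumber? // .)) | .output'
jq select doesn't work :(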

Format a number with thousands separators with jq

Given {"a": 1234567890}, I want 1,234,567,890 in the result, how this can be done with jq
echo '{"a": 1234567890}' | jq '.a | FORMAT?'
Thanks to @peak's answer, the solution is:
echo '{"a": 1234567890}' | jq -r 'def h: [while(length>0; .[:-3]) | .[-3:]] | reverse | join(","); .a | tostring | h'
//-> 1,234,567,890
Here's an idiomatic one-liner definition:
def h: tostring | [while(length>0; .[:-3]) | .[-3:]] | reverse | join(",");
Example
12, 123, 1234, 12345678 | h
Output (using -r option):
12
123
1,234
12,345,678
jq doesn't (yet) have a printf function that formats according to locale settings.
If that's an option for you, you can pass the number to the shell and use printf:
echo '{"a": 12345}' | jq '.a' | xargs printf "%'.f\n"
12,345
Note that the printf conversion relies on the format %'.f, which is explained in man 3 printf.
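Note also that the apostrophe flag groups digits according to the current locale; under the C/POSIX locale no separators are inserted, so you may need to select a suitable locale explicitly (the locale name below is an assumption about what is installed on your system):
$ # assumption: en_US.UTF-8 is available and your printf honors the apostrophe flag
$ LC_NUMERIC=en_US.UTF-8 printf "%'d\n" 1234567890
1,234,567,890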
Here's a generic solution for integers or integer-valued strings:
# "h" for "human-readable"
def h:
  def hh: .[0] as $s | .[1] as $answer
    | if ($s|length) == 0 then $answer
      else ((if $answer == "" then "" else "," end) + $answer) as $a
        | [$s[0:-3], $s[-3:] + $a] | hh
      end;
  [tostring, ""] | hh;
Example
12, 123, 1234, 12345678 | h
Result (using -r option):
12
123
1,234
12,345,678

jq - How to filter a json that does not contain

I have an aws query that I want to filter in jq.
I want to filter all the imageTags that don't end with "latest"
So far I did this, but it filters things containing "latest", while I want to filter things not containing "latest" (or not ending with "latest"):
aws ecr describe-images --repository-name <repo> --output json | jq '.[]' | jq '.[]' | jq "select ((.imagePushedAt < 14893094695) and (.imageTags[] | contains(\"latest\")))"
Thanks
You can use not to reverse the logic
(.imageTags[] | contains(\"latest\") | not)
Also, I'd imagine you can simplify your pipeline into a single jq call.
All you have to do is add | not within your jq filter.
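As a sketch of what that single call could look like, keeping the question's threshold and field names (and assuming the double .[] of the original pipeline reaches the per-image objects):
# sketch: field names and the .[][] path are taken from the question, not verified against the aws output
aws ecr describe-images --repository-name <repo> --output json |
  jq '.[][] | select(.imagePushedAt < 14893094695 and ([.imageTags[]? | contains("latest")] | any | not))'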
A useful example, in particular for mac brew users:
List all bottled formulae
by querying the JSON and parsing the output
brew info --json=v1 --installed | jq -r 'map(
select(.installed[].poured_from_bottle)|.name) | unique | .[]' | tr '\n' ' '
List all non-bottled formulae
by querying the JSON and parsing the output and using | not
brew info --json=v1 --installed | jq -r 'map(
select(.installed[].poured_from_bottle | not) | .name) | unique | .[]'
In this case contains() doesn't work properly; it's better to use index() combined with not:
select(.imageTags | index("latest") | not)
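index returns the position of an exact array element (or null if it is absent), so it tests for the whole tag rather than a substring:
$ jq -n '["v1.2", "latest"] | index("latest")'
1
$ jq -n '["v1.2-latest"] | index("latest")'
null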
This .[] | .[] can be shortened to .[][], e.g.,
$ jq --null-input '[[1,2],[3,4]] | .[] | .[]'
1
2
3
4
$ jq --null-input '[[1,2],[3,4]] | .[][]'
1
2
3
4
To check whether a string does not contain another string, you can combine contains and not e.g.,
$ jq --null-input '"foobar" | contains("foo") | not'
false
$ jq --null-input '"barbaz" | contains("foo") | not'
true
You can do something similar with an array of strings with either any or all e.g.,
$ jq --null-input '["foobar","barbaz"] | any(.[]; contains("foo"))'
true
$ jq --null-input '["foobar","barbaz"] | any(.[]; contains("qux"))'
false
$ jq --null-input '["foobar","barbaz"] | all(.[]; contains("ba"))'
true
$ jq --null-input '["foobar","barbaz"] | all(.[]; contains("qux"))'
false
Say you had file.json:
[ [["foo", "foo"],["foo", "bat"]]
, [["foo", "bar"],["foo", "bat"]]
, [["foo", "baz"],["foo", "bat"]]
]
And you only want to keep the nested arrays that don't have any strings containing "bat":
$ jq --compact-output '.[][] | select(all(.[]; contains("bat") | not))' file.json
["foo","foo"]
["foo","bar"]
["foo","baz"]

json parsing with jq and convert to csv

I need to get some values from a JSON file with jq and produce a CSV with the columns (Time, Data.key, Lat, Lng, Qline).
Input:
{
  "Time": "14:16:23",
  "Data": {
    "101043": {
      "Lat": 49,
      "Lng": 15,
      "Qline": 420
    },
    "101044": {
      "Lat": 48,
      "Lng": 15,
      "Qline": 421
    }
  }
}
Example output of csv:
"14:16:23", 101043, 49, 15, 420
"14:16:23", 101044, 48, 15, 421
Thanks a lot.
I have only tried:
cat test.json | jq '.Data[] | [.Lat, .Lng, .Qline] | @csv'
Try this:
{ Time } + (.Data | to_entries[] | { key: .key | tonumber } + .value)
| [ .Time, .key, .Lat, .Lng, .Qline ]
| @csv
Make sure you get the raw output by using the -r switch.
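To see what that builds before @csv, here is the intermediate object produced for each Data entry of the sample input:
$ jq -c '{ Time } + (.Data | to_entries[] | { key: .key | tonumber } + .value)' test.json
{"Time":"14:16:23","key":101043,"Lat":49,"Lng":15,"Qline":420}
{"Time":"14:16:23","key":101044,"Lat":48,"Lng":15,"Qline":421}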
Here's another solution that doesn't involve the +'s.
{Time, Data: (.Data | to_entries)[]}
| [.Time, (.Data.key | tonumber), .Data.value.Lat, .Data.value.Lng, .Data.value.Qline]
| @csv
Here is another solution. If data.json contains the sample data then
jq -M -r '(.Data|keys[]) as $k | {Time,k:$k}+.Data[$k] | [.[]] | @csv' data.json
will produce
"14:16:23","101043",49,15,420
"14:16:23","101044",48,15,421