This question already has an answer here:
getting all the values of an array with jq
(1 answer)
Closed 3 years ago.
Say I have this command:
kbc get pods -o=json | jq -c
which gives me something like:
{"apiVersion":"v1","items":[{"name":"a"},{"name":"b"},{"name":"c"}]}
how can I echo the name for each element in the items array? Something like this:
kbc get pods -o=json | jq -c | jq '.items[].name' | cat
kbc get pods -o=json | jq -r '.items[].name'
Using -r tells jq to write raw output, thus without quoting the names as JSON strings.
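To see the difference, here is a minimal sketch using an inline document (the pod names are stand-ins for real output):

```shell
#!/bin/bash
json='{"apiVersion":"v1","items":[{"name":"a"},{"name":"b"},{"name":"c"}]}'

# Without -r, each name is emitted as a JSON string, wrapped in quotes
quoted=$(jq '.items[].name' <<<"$json")

# With -r, jq prints the raw string value, one per line, no quotes
raw=$(jq -r '.items[].name' <<<"$json")

echo "$quoted"
echo "$raw"
```

The raw form is what you want when feeding the names to another shell command.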
This question already has an answer here:
"Argument list too long" while slurping JSON files [duplicate]
(1 answer)
Closed 1 year ago.
There are more than 6k JSON files, each containing exactly one JSON object. I want to prepare one list of objects from these JSONs.
When I am running below jq command I am getting an error.
Kedar.Javalkar@KD2806 MINGW64 /c/zz
$ jq -s '.' inventoryItem_*.json > inventory_items_result_$(date +"%Y%m%d_%H%M%S").json
bash: /usr/bin/jq: Argument list too long
I tried ulimit -s unlimited but the same error
I am using a windows 10 git bash
This is the job xargs was created for: splitting a list of items into individual command lines that fit within the permitted limit.
Because running jq -s once over all the files is not the same as concatenating the results of several smaller jq -s runs, the right approach is to let xargs batch the cat invocations (in the manner described in the linked duplicate) and pipe the combined stream through a single jq -s:
printf '%s\0' inventoryItem_*.json \
| xargs -0 cat \
| jq -s . \
>"inventory_items_result_$(date +"%Y%m%d_%H%M%S").json"
This question already has answers here:
How do I assign the output of a command into an array?
(3 answers)
Closed 2 years ago.
I'm trying to store all the results of a jq query in a bash array, but every result ends up in the first array element.
Could you help me? I would like each result in its own array element.
bash-4.2$ try=$(tm run jobs:status::get -s "tm=serverA&work=*&status=Failed" | $PATH_API/jq -r '.statuses[].name // empty')
bash-4.2$ echo $try
job_B job_C
bash-4.2$ echo "${#try[@]}"
1
bash-4.2$
If the status names are sufficiently uncomplicated, you could add an extra pair of parentheses:
try=($(tm run ....))
(Consider also invoking jq without the -r option.)
Otherwise, you could consider using readarray, or other techniques for initializing bash arrays.
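A minimal readarray sketch (with printf standing in for the tm run … | jq -r pipeline, since that command isn't available here):

```shell
#!/bin/bash
# printf emulates the raw, one-name-per-line output of jq -r
readarray -t try < <(printf '%s\n' job_B job_C)

echo "${#try[@]}"   # element count
echo "${try[1]}"    # second element
```

Unlike `try=($(...))`, readarray splits on newlines only, so status names containing spaces stay intact as single elements.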
This question already has answers here:
Use jq to Convert json File to csv
(1 answer)
Converting json map to csv using jq
(3 answers)
Closed 4 years ago.
I want to efficiently rewrite a large JSON file, which always has the same field names, into a CSV, ignoring its keys.
To give a concrete example, here is a large JSON file (tempST.json):
https://gist.githubusercontent.com/pedro-roberto/b81672a89368bc8674dae21af3173e68/raw/e4afc62b9aa3092c8722cdbc4b4b4b6d5bbc1b4b/tempST.json
If I rewrite just fields time, ancestorcount and descendantcount from this JSON into a CSV I should get:
1535995526,1,1
1535974524,1,1
1535974528,1,2
...
1535997274,1,1
The following script tempSpeedTest.sh writes the value of the fields time, ancestorcount and descendantcount into each line of the csv:
rm tempOutput.csv
jq -c '.[]' < tempST.json | while read line; do
descendantcount=$(echo $line | jq '.descendantcount')
ancestorcount=$(echo $line | jq '.ancestorcount')
time=$(echo $line | jq '.time')
echo "${time},${ancestorcount},${descendantcount}" >> tempOutput.csv
done
However the script takes around 3 minutes to run, which is unsatisfying:
>time bash tempSpeedTest.sh
real 2m50.254s
user 2m43.128s
sys 0m34.811s
What is a faster way to achieve the same result?
jq -r '.[] | [.time, .ancestorcount, .descendantcount] | @csv' <tempST.json >tempOutput.csv
See this running at https://jqplay.org/s/QJz5FCmuc9
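For a self-contained sketch, here is the same filter run over two records standing in for tempST.json (same field names, made-up values matching the expected output above):

```shell
#!/bin/bash
# Two records standing in for the full tempST.json
json='[{"time":1535995526,"ancestorcount":1,"descendantcount":1},
       {"time":1535974528,"ancestorcount":1,"descendantcount":2}]'

# A single jq pass builds every CSV row; @csv joins array elements with commas
out=$(jq -r '.[] | [.time, .ancestorcount, .descendantcount] | @csv' <<<"$json")
echo "$out"
```

One jq invocation replaces the per-line loop (which spawned three jq processes per record), which is where the speedup comes from.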
This question already has answers here:
Parsing JSON with Unix tools
(45 answers)
Closed 7 years ago.
I'm trying to get a value out of a JSON response, but I can't manage to reach the leaf value; can anyone help me?
Here's the json from maps.googleapis.com/maps/api
I'd like to get the duration text and value and here's the script I've been using so far.
function jsonValue() {
KEY=$1
num=$2
awk -F"[,:}]" '{for(i=1;i<=NF;i++){if($i~/'$KEY'\042/){print $(i+1)}}}' | tr -d '"' | sed -n ${num}p
}
curl --get --include 'http://maps.googleapis.com/maps/api/directions/json?origin=59.94473,30.294254&destination=59.80612,30.308552&sensor=false&departure_time=1346197500&mode=transit&format=json' | jsonValue duration["value"] 2
Thanks in advance,
Jeremie.
You are looking for jq, a lightweight and flexible command-line JSON processor.
This worked for me
curl -G -k --data "origin=59.94473,30.294254&destination=59.80612,30.308552&sensor=false&departure_time=1346197500&mode=transit" http://maps.googleapis.com/maps/api/directions/json | jq '.routes[].legs[].duration["text"]'
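If the API is unreachable, the same path expression can be checked against a trimmed-down skeleton of the response (the duration values below are illustrative):

```shell
#!/bin/bash
# Minimal skeleton of a directions response (values are made up)
json='{"routes":[{"legs":[{"duration":{"text":"30 mins","value":1800}}]}]}'

# .routes[] and .legs[] iterate over the arrays; .duration.text selects the leaf
out=$(jq -r '.routes[].legs[].duration.text' <<<"$json")
echo "$out"
```

The `[]` iterators walk every route and leg, so the filter works whether the response contains one leg or many.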
This question already has answers here:
filtering data using parameters
(2 answers)
Closed 7 years ago.
I am trying to parse a JSON result from AWS, but I get an error or null when I use $ip; when I use a specific IP it works. Something is wrong with how I use the variable inside the jq command.
#!/bin/bash
aws ec2 describe-addresses --region eu-west-1 > 1.txt
ipList=( "52.16.121.238" "52.17.250.188" )
for ip in "${ipList[@]}";
do
echo $ip
cat 1.txt | jq '.Addresses | .[] | select (.PublicIp==$ip) | .InstanceId'
#echo $result
done
Please advise.
You're using single quotes around your jq program, which is causing the shell variable to not be interpolated. Furthermore, even if it were, you would still need to add string quoting around the variable interpolation to make jq interpret it as a string literal. Since doing shell variable interpolation into jq programs is hard and error-prone, jq provides a command-line argument to this effect, --arg, intended to lift shell variables into jq variables. Your jq invocation would therefore look like this:
jq --arg ip "$ip" '.Addresses[] | select(.PublicIp == $ip) | .InstanceId'
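For instance, against a trimmed sample of describe-addresses output (the instance IDs here are hypothetical):

```shell
#!/bin/bash
# Sample standing in for 1.txt / the describe-addresses response
json='{"Addresses":[{"PublicIp":"52.16.121.238","InstanceId":"i-aaa111"},
                    {"PublicIp":"52.17.250.188","InstanceId":"i-bbb222"}]}'

ip="52.16.121.238"

# --arg lifts the shell variable into jq as the string variable $ip,
# so no shell interpolation or escaping is needed inside the program
out=$(jq -r --arg ip "$ip" \
  '.Addresses[] | select(.PublicIp == $ip) | .InstanceId' <<<"$json")
echo "$out"
```

Because `--arg` always passes the value as a string, it also avoids injection problems that arise when splicing shell variables directly into the jq program text.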
Thanks for your help. The right format is:
cat 1.txt | jq ".Addresses | .[] | select(.PublicIp==\"$ip\") | .InstanceId"