How to iterate over each object in a JSON array with jq [duplicate]

This question already has answers here:
Iterating through JSON array in Shell script
(10 answers)
Closed last year.
Intention: I need to iterate over each item in a JSON array using jq and process each one separately.
The API returns data in this form:
{
"name": [
{
"first": "first",
"class": false,
"type": "B"
},
{
"first": "second",
"class": false,
"type": "B"
},
{
"first": "third",
"class": false,
"type": "A"
}
]
}
I am able to parse it as:
data=`curl http://some.url`
echo "$data" | jq -rc '.name[] | select(.type=="B") | .class=true'
It returns data as:
{"first": "first","class": true, "type": "B"}
{"first": "second", "class": true, "type": "B"}
Now I want to process these two outputs so that I can make a PUT call for each of them. I tried to learn some xargs concepts but could not get it to work.
I piped the output to | xargs -n1, but that removed all the quotes from the strings.

You could loop through using while read:
while IFS= read -r line
do
# your PUT goes here
# dummy
printf 'PUTting JSON: %s\n' "$line"
done < <(jq -c '.name[] | select(.type == "B") | .class = true' <<< "$data")
If you absolutely need to use xargs, then use a newline character as the delimiter:
jq -c '.name[] | select(.type == "B") | .class = true' <<< "$data" \
| xargs -d $'\n' printf 'PUTting JSON: %s\n' # dummy

The only part you're missing is a while read loop.
data=$(curl ...)
while IFS= read -r line; do
curl -XPUT -d"$line" http://some_url
done < <(jq -c ... <<<"$data")
...or...
curl ... |
jq -c ... |
xargs -d $'\n' -n 1 curl -XPUT http://some_url -d
Note:
xargs -d $'\n' tells xargs to treat only newlines (and not other whitespace) as separators between items; it also turns off the special treatment of quotes and backslashes, so they are passed through literally.
xargs -n 1 tells xargs to pass only one data item to each copy of curl.
Making -d the last argument to curl means that xargs will place each data item immediately after it.
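Putting the pieces together, here is a minimal end-to-end sketch (the URLs are placeholders carried over from the question; the Content-Type header is an assumption about the receiving API):
#!/usr/bin/env bash
data=$(curl -s http://some.url)   # fetch the source document
while IFS= read -r line; do
  # one PUT per modified object
  curl -s -X PUT -H 'Content-Type: application/json' -d "$line" http://some_url
done < <(jq -c '.name[] | select(.type == "B") | .class = true' <<< "$data")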

Related

Construct json through jq tool in shell script loop

JSON can be created using the following command:
jq -n \
--arg v1 "Value1" \
--arg v2 "Value2" \
'{k1: $v1, k2: $v2}'
But when my keys are dynamic, how should I loop? For example, the script I execute is
test.sh k1=v1 k2=v2 k3=v3
test.sh is as follows
index=1
while ((index <= "$#")); do
data_i_key=$(echo ${!index} | awk -F "=" '{print $1}')
data_i_value=$(echo ${!index} | awk -F "=" '{print $2}')
let index++
JSON_STRING=$(jq -n \
--arg value "$data_i_value" \
'{'"$data_i_key"': $value}')
done
echo $JSON_STRING
The above print result is
{ "K3": "V3" }
if I replace it with
JSON_STRING+=$(jq -n \
--arg val_value "$data_i_value" \
'{'"$data_i_key"': $val_value}')
The print result is
{ "k1": "v1" }{ "k2": "v2" }{ "K3": "V3" }
Neither of the above methods achieves the goal; do you have a good way to deal with this? My desired output is
{ "k1": "v1" , "k2": "v2" ,"K3": "V3" }
I hope you can help.
I'd suggest a totally different, but simpler, approach:
for kv; do
echo "$kv" | jq -R './"=" | {key:first,value:last}'
done | jq -s 'from_entries'
It builds {key: …, value: …} objects from your positional parameters (splitting them on the equal sign), then slurps all those objects in a second jq process, converting them into a single object via from_entries.
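With the question's invocation, this should print the slurped result (jq pretty-prints by default):
$ ./test.sh k1=v1 k2=v2 k3=v3
{
  "k1": "v1",
  "k2": "v2",
  "k3": "v3"
}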
Alternatively, using -n (--null-input), --args and then accessing via $ARGS.positional. This avoids the loop and the second jq call altogether.
jq -n '$ARGS.positional | map(./"=" | {key:first,value:last}) | from_entries' --args "$@"
If your values can contain = themselves, you have to join all but the first value of the array:
jq -n '$ARGS.positional
| map(./"=" | {key:first,value:.[1:]|join("=")})
| from_entries' --args "$@"
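For example (conn=host=localhost is a hypothetical argument for illustration), a value with an embedded = now survives intact:
$ jq -n '$ARGS.positional | map(./"=" | {key:first,value:.[1:]|join("=")}) | from_entries' --args conn=host=localhost
{
  "conn": "host=localhost"
}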
Use parentheses around the key expression to have it evaluated:
jq -n --arg k1 "Key1" --arg v1 "Value1" '{($k1): $v1}'
{
"Key1": "Value1"
}
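Applied to the question's loop (a sketch reusing the question's variable names), this removes the need to splice the key into the filter with shell quoting:
jq -n --arg k "$data_i_key" --arg v "$data_i_value" '{($k): $v}'
Passing the key with --arg also keeps keys containing quotes or backslashes safe, which the string-splicing approach does not.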
Perhaps this, which will handle an arbitrary number of key=value arguments:
#!/usr/bin/env bash
args=()
for arg; do
IFS="=" read -r k v <<<"$arg"
args+=("$k" "$v")
done
jq -n --args '
$ARGS.positional as $a
| reduce range(0; $a | length; 2) as $i ({}; .[$a[$i]] = $a[$i + 1])
' "${args[#]}"
Produces:
$ ./test.sh k1=v1 k2=v2 k3=v3
{
"k1": "v1",
"k2": "v2",
"k3": "v3"
}

extract specific key value from json in bash if key matches

I have a json content (output.json)
{"project": {"id": "A", "content": [{"name": "XYZ", "location": "Berlin", "comments":""}, {"name": "ABC", "location": "NewYork", "comments": "Hwllo"}, {"name": "DEF", "location": "Paris", "comments": "Success"}]}}
I would like to extract the location key and its value when the name matches, say, ABC from the above JSON, using bash or shell commands.
I tried something like the below, which gives me the content within the curly braces, but I am not sure how to search for a specific key.
cat output.json | grep -o -e "{.*}"
Output expectations:
if name matches ABC, get output as "location":"NewYork"
Any suggestions on processing further?
For extracting values from JSON you should use jq if you can. According to its authors, "jq is like sed for JSON data".
In your case it should be:
$ jq -r '.project' output.json | jq -r '.content' | jq '.[] | select(.name=="ABC")' | jq -r '.location'
Output will be:
NewYork
To get the output in the form you required:
"location":"NewYork"
You can use:
echo "\"location\":$(jq -r '.project' output.json | jq -r '.content' | jq '.[] | select(.name=="ABC")' | jq '.location')"
Before you use jq you need to install it. On Debian and Ubuntu that would be:
$ sudo apt install jq
For other operating systems, check the jq website.
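Incidentally, the four chained jq processes above can be collapsed into a single invocation, and jq's string interpolation can emit the exact "location":"NewYork" form requested:
jq -r '.project.content[] | select(.name=="ABC") | .location' output.json
# NewYork
jq -r '.project.content[] | select(.name=="ABC") | "\"location\":\"\(.location)\""' output.json
# "location":"NewYork"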
There may be better ways to do it; here is a quick, twisted way:
cat output.json | sed 's/"name"/\n"name"/g' | grep '"name"' | awk -F',' '{print $2}'
Also add | grep <preferred name> if you need to filter based on the name.
Use Perl
$ perl -0777 -lne ' while(/"name":\s+"ABC",\s+"location":\s+(\S+)/msg) { print "$1\n" } ' output.json
"NewYork",
$ cat output.json
{"project": {"id": "A", "content": [{"name": "XYZ", "location": "Berlin", "comments":""}, {"name": "ABC", "location": "NewYork", "comments": "Hwllo"}, {"name": "DEF", "location": "Paris", "comments": "Success"}]}}
$
Please use a JSON parser for processing JSON.
With Xidel it's as simple as:
xidel -s output.json -e '($json//content)()[name="ABC"]/location'
Alternatively:
xidel -s output.json -e '$json/(.//content)()[name="ABC"]/location'
or in full:
xidel -s output.json -e '$json/project/(content)()[name="ABC"]/location'
The above is XPath notation (example #11, Reading JSON). Dot notation (like jq) is also possible:
xidel -s output.json -e '($json).project.content()[name="ABC"].location'
[edit]
The commands above output NewYork, but I just realized you require the output to be "location":"NewYork". Xidel can do that too:
xidel -s output.json -e '($json//content)()[name="ABC"]/concat("""location"":""",location,"""")'
[/edit]

Update values in json with jq (shell script)

I am trying to update a couple of values in a JSON array (in a separate file) with a shell script. Basically the logic is: set an environment variable called URL, ping that URL and write it into the JSON; if the ping exits with 0, update another JSON field to SUCCESS, otherwise update it to FAILED.
Here are the files:
info.json:
{
"name": "PingTest",",
"metrics": [
{
"event_type": "PingResult",
"provider": "test",
"providerUrl": "URL",
"providerResult": "RESULT"
}
]
}
pinger.sh:
#!/bin/sh
JSON=`cat info.json` #read in JSON
#Assume URL variable is set to www.dksmfdkf.com
ping -q -c 1 "$URL" > /dev/null #ping url
if [ $? -eq 0 ]; then #if ping success, replace result in json template
JSON=`echo ${JSON} | jq --arg v "$URL" '.metrics[].providerUrl |= $v' info.json`
JSON=`echo ${JSON} | jq '.metrics[].providerResult |= "SUCCESS"' info.json`
else
JSON=`echo ${JSON} | jq --arg v "$URL" '.metrics[].providerUrl |= $v' info.json`
JSON=`echo ${JSON} | jq '.metrics[].providerResult |= "FAILED"' info.json`
fi
#Remove whitespace from json
JSON=`echo $JSON | tr -d ' \t\n\r\f'`
#Print the result
echo "$JSON"
The problem is that my JSON file isn't getting updated properly. Example result when running:
home:InfraPingExtension home$ ./pinger.sh
ping: cannot resolve : Unknown host
{
"name": "PingTest",
"metrics": [
{
"event_type": "PingResult",
"provider": "test",
"providerUrl": "",
"providerResult": "RESULT"
}
]
}
{
"name": "PingTest",
"metrics": [
{
"event_type": "PingResult",
"provider": "test",
"providerUrl": "URL",
"providerResult": "FAILED"
}
]
}
{"name":"PingTest","metrics":[{"event_type":"PingResult","provider":"test","providerUrl":"URL","providerResult":"RESULT"}]}
This would be greatly simplified by only calling jq once.
host=${URL#http://}; host=${host#https://}; host=${host%%/*}
if ping -q -c 1 "$host"; then
result=SUCCESS
else
result=FAILED
fi
JSON=$(
jq -c \
--arg url "$URL" \
--arg result "$result" \
'.metrics[].providerUrl |= $url
| .metrics[].providerResult |= $result
' info.json
)
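For reference, the original script never saw its own pipeline because jq ignores stdin when given a filename argument; each echo ${JSON} | jq ... info.json re-read the unmodified file, which is why the RESULT placeholder survived. A quick way to see this:
echo '{"a": 1}' | jq '.' info.json   # prints info.json; the piped-in JSON is discarded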

Parse json into array in bash

{
"db_status": {
"sysa": {
"taskname": "AB",
"state": "Running",
"status": "System ATTENTION",
"updated": "0727",
"version": "5"
},
"sysb": {
"taskname": "null",
"state": "Standby",
"status": "System OK",
"updated": "0727",
"version": "6"
}
}
}
The curl command returns a JSON object. I am trying to get both state values into an array, i.e. Running and Standby. So far I have tried:
curl -s http://localhost:9099/api | grep state | sed 's/"//g' | awk -F ": " '/state/ {print $2}' | tr '\n' ' ' | sed s'/..$//'
FYI you don't need a pipeline of 20 different commands when you're using awk. The command line you provided can be written as just one command:
awk -F'"' '$2=="state"{printf "%s%s", (++c>1?", ":""),$4}'
but all you really need is:
awk -F'"' '$2=="state"{print $4}'
and to save the output in a shell array (assuming your json is really always formatted as you show in your question) would be:
$ arr=( $(cat file | awk -F'"' '$2=="state"{print $4}') )
$ echo "${arr[0]}"
Running
$ echo "${arr[1]}"
Standby
Replace cat file with your curl command.
You could use jq
$ curl -s http://localhost:9099/api | jq '.db_status.sysa.state, .db_status.sysb.state'
"Running"
"Standby"
Or if you want all the entries no matter how many
$ curl -s http://localhost:9099/api | jq '.db_status[].state'
"Running"
"Standby"
Alternatively, if you don't have jq (or are just looking for ways of feeling pain), you can parse most JSON in bash using ticktick, which is written in 250 lines of bash.
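To land those values in an actual bash array (which the question asks for), a minimal sketch building on the jq answer above, assuming bash 4+ for readarray; -r strips the JSON quotes:
readarray -t states < <(curl -s http://localhost:9099/api | jq -r '.db_status[].state')
printf '%s\n' "${states[@]}"
# Running
# Standby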

What is the best way to pipe JSON with whitespaces into a bash array?

I have JSON like this that I'm parsing with jq:
{
"data": [
{
"item": {
"name": "string 1"
},
"item": {
"name": "string 2"
},
"item": {
"name": "string 3"
}
}
]
}
...and I'm trying to get "string 1", "string 2" and "string 3" into a Bash array, but I can't find a solution that doesn't break on the whitespace in them. Is there a method in jq that I'm missing, or perhaps an elegant solution in Bash for it?
Current method:
json_names=$(cat file.json | jq ".data[] .item .name")
read -a name_array <<< $json_names
The snippets below assume your JSON text is in a string named s. That is:
s='{
"data": [
{
"item1": {
"name": "string 1"
},
"item2": {
"name": "string 2"
},
"item3": {
"name": "string 3"
}
}
]
}'
Unfortunately, both of the below will misbehave with strings containing literal newlines; since jq doesn't have support for NUL-delimited output, this is difficult to work around.
On bash 4 (with slightly sloppy error handling, but tersely):
readarray -t name_array < <(jq -r '.data[] | .[] | .name' <<<"$s")
...or on bash 3.x or newer (with very comprehensive error handling, but verbosely):
# -d '' tells read to process up to a NUL, and will exit with a nonzero exit status if that
# NUL is not seen; thus, this causes the read to pass through any error which occurred in
# jq.
IFS=$'\n' read -r -d '' -a name_array \
< <(jq -r '.data[] | .[] | .name' <<<"$s" && printf '\0')
This populates a bash array, contents of which can be displayed with:
declare -p name_array
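If literal newlines inside the strings are a real concern, one workaround (a sketch, not part of the answers here) is to have jq base64-encode each value with @base64 and decode in the shell:
name_array=()
while IFS= read -r enc; do
  # decode each base64 line; note that $( ) still strips trailing newlines from a value
  name_array+=( "$(base64 --decode <<< "$enc")" )
done < <(jq -r '.data[] | .[] | .name | @base64' <<< "$s")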
Arrays are assigned in the form:
NAME=(VALUE1 VALUE2 ... )
where NAME is the name of the variable, and VALUE1, VALUE2, and the rest are fields separated by characters that are present in the $IFS (internal field separator) variable.
Since jq outputs the string values as lines (sequences separated by the newline character), you can temporarily override $IFS, e.g.:
# Disable globbing, remember current -f flag value
[[ "$-" == *f* ]] || globbing_disabled=1
set -f
IFS=$'\n' a=( $(jq --raw-output '.data[].item.name' file.json) )
# Restore globbing
test -n "$globbing_disabled" && set +f
The above will create an array of three items for the following file.json:
{
"data": [
{"item": {
"name": "string 1"
}},
{"item": {
"name": "string 2"
}},
{"item": {
"name": "string 3"
}}
]
}
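A quick check of the result (assuming the file.json just shown) should print something like:
$ declare -p a
declare -a a=([0]="string 1" [1]="string 2" [2]="string 3")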
The following shows how to create a bash array consisting of arbitrary JSON texts produced by a run of jq.
In the following, I'll assume input.json is a file with the following:
["string 1", "new\nline", {"x": 1}, ["one\ttab", 4]]
With this input, the jq filter .[] produces four JSON texts -- two JSON strings, a JSON object, and a JSON array.
The following bash script can then be used to set x to be a bash array of the JSON texts:
#!/bin/bash
x=()
while read -r value
do
x+=("$value")
done < <(jq -c '.[]' input.json)
For example, adding this bash expression to the script:
for a in "${x[#]}" ; do echo a="$a"; done
would yield:
a="string 1"
a="new\nline"
a={"x":1}
a=["one\ttab",4]
Notice how (encoded) newlines and (encoded) tabs are handled properly.
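On bash 4+, the while loop can be replaced by readarray, with the same result for these compact single-line JSON texts:
readarray -t x < <(jq -c '.[]' input.json)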