Update JSON with jq through a shell script

I created a shell script to create and update JSON files using jq. Creating JSON files works well. I have a variable that is passed as an argument to the jq command:
#!/bin/sh
OPER=$1
FILE_PATH=$2
DATE_TIME=`date +%Y-%m-%d:%H:%M:%S`
DATE=`date +%Y-%m-%d`
CSV=$3
STEP=$4
STATUS=$5
CODE=$6
MESSAGE=$7
if [ "$#" -eq 7 ]; then
    if [ "$OPER" == "create" ]; then
        # echo "FILE_PATH: $FILE_PATH - CSV: $CSV - STEP: $STEP - STATUS: $STATUS - CODE: $CODE - MESSAGE: $MESSAGE"
        REPORT="{\"date\": \"$DATE\", \"csv\": \"$CSV\", \"messages\": [{ \"timestamp\": \"$DATE_TIME\", \"step\": \"$STEP\", \"status\": \"$STATUS\", \"code\": \"$CODE\", \"message\": \"$MESSAGE\" }] }"
        echo ${REPORT} | jq . > $FILE_PATH
    elif [ "$OPER" == "update" ]; then
        echo "FILE_PATH: $FILE_PATH - CSV: $CSV - STEP: $STEP - STATUS: $STATUS - CODE: $CODE - MESSAGE: $MESSAGE"
        REPORT="{\"timestamp\": \"$DATE_TIME\", \"step\": \"$STEP\", \"status\": \"$STATUS\", \"code\": \"$CODE\", \"message\": \"$MESSAGE\"}"
        echo "REPORTTTTT: "$REPORT
        REPORT="jq '.messages[.messages| length] |= . + $REPORT' $FILE_PATH"
        #echo $REPORT
        echo `jq '.messages[.messages| length] |= . + $REPORT' $FILE_PATH` > $FILE_PATH
    else
        echo "operation not recognized: $OPER"
    fi
else
    echo "wrong parameters."
fi
jq . $FILE_PATH
But when updating the JSON file I am getting an error. My $REPORT variable is correct and the quotes are correct. I think I must use another jq expression instead of |= . +. I used that command with plain text and it worked, but when I build the REPORT variable dynamically it throws an error.
Any clue? Thanks
REPORTTTTT: {"timestamp": "2017-02-17:12:11:11", "step": "2", "status": "OK", "code": "34", "message": "message 34 file.xml"}
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
'.messages[.messages|
jq: 1 compile error
jq: error: REPORT/0 is not defined at <top-level>, line 1:
.messages[.messages| length] |= . + $REPORT
jq: 1 compile error
Here is an example at the command line:
$ jq . file.json
{
  "date": "2017-02-17",
  "csv": "file.csv",
  "messages": [
    {
      "timestamp": "2017-02-17:12:31:21",
      "step": "1",
      "status": "OK",
      "code": "33",
      "message": "message 33"
    }
  ]
}
$ export REPORT="{\"timestamp\": \"2017-02-17:11:51:14\", \"step\": \"2\", \"status\": \"OK\", \"code\": \"34\", \"message\": \"message 34 file.xml\"}"
$ echo $REPORT
{"timestamp": "2017-02-17:11:51:14", "step": "2", "status": "OK", "code": "34", "message": "message 34 file.xml"}
$ jq '.messages[.messages| length] |= . + $REPORT' file.json
jq: error: REPORT/0 is not defined at <top-level>, line 1:
.messages[.messages| length] |= . + $REPORT
jq: 1 compile error

Pass the argument $REPORT as a JSON argument using the --argjson flag, supported from jq 1.5 onwards:
--argjson name JSON-text:
This option passes a JSON-encoded value to the jq program as a predefined variable.
Change your line to:
jq --argjson args "$REPORT" '.data.messages += [$args]' file
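Note that the file shown in the question has messages at the top level, so against that exact structure the path would likely be .messages rather than .data.messages. A minimal sketch of the whole update step, writing through a temporary file so the input file is not truncated while jq is still reading it:
jq --argjson args "$REPORT" '.messages += [$args]' "$FILE_PATH" > "$FILE_PATH.tmp" && mv "$FILE_PATH.tmp" "$FILE_PATH"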

Don't generate JSON by hand; let jq do it. This is important if the values you want to add to the JSON need to be quoted properly, which will not happen if you are just performing string interpolation in the shell:
$ foo='"hi" he said'
$ json="{ \"text\": \"$foo\" }"
$ echo "$json"
{ "text": ""hi" he said" } # Wrong: should be { "text": "\"hi\" he said" }
Further, jq can generate the date and time strings; there is no need to run date (twice) as well.
#!/bin/sh
if [ $# -ne 7 ]; then
    printf "Wrong number of parameters: %d\n" "$#" >&2
    exit 1
fi
oper=$1 file_path=$2
# Generate the new message using arguments 3-7
new_message=$(
    jq -n --arg csv "$3" --arg step "$4" \
          --arg status "$5" --arg code "$6" \
          --arg message "$7" '{
        timestamp: now|strftime("%F:%T"),
        csv: $csv, step: $step, status: $status, code: $code, message: $message}'
)
case $oper in
    create)
        jq -n --argjson new_msg "$new_message" --arg csv "$3" '{
            date: now|strftime("%F"),
            csv: $csv,
            messages: [ $new_msg ]
        }' > "$file_path"
        ;;
    update)
        jq --argjson new_msg "$new_message" \
            '.messages += [ $new_msg ]' \
            "$file_path" > "$file_path.tmp" && mv "$file_path.tmp" "$file_path" ;;
    *)  printf 'operation not recognized: %s\n' "$oper" >&2
        exit 1 ;;
esac
jq '.' "$file_path"

Thanks @Inian. I changed it a little bit but it worked... here is the solution:
if [ "$#" -eq 7 ]; then
if [ "$OPER" == "update" ]; then
echo "update Json"
echo "FILE_PATH: $FILE_PATH - CSV: $CSV - STEP: $STEP - STATUS: $STATUS - CODE: $CODE - MESSAGE: $MESSAGE"
REPORT="{\"timestamp\": \"$DATE_TIME\", \"step\": \"$STEP\", \"status\": \"$STATUS\", \"code\": \"$CODE\", \"message\": \"$MESSAGE\"}"
echo "REPORTTTTT: "$REPORT
echo `jq --argjson args "$REPORT" '.data.messages += [$args]' $FILE_PATH` > $FILE_PATH
else
echo "operation not recognized: $OPER"
fi
else
echo "wrong parameters."
fi

Related

Calculate values from json file

I have a script
#!/bin/bash
#create path to redirect output.json to same directory as output.txt
path=$(dirname $1)
#create variables for name line, test lines and result line
firstline=$(cat $1 | head -n +1 | cut -d "[" -f2 | cut -d "]" -f1)
testname=$(echo $firstline)
tests=$(cat $1 | tail -n +3 | head -n -2)
results=$(cat $1 | tail -n1)
#create main JSON variable
json=$(jq -n --arg tn "$testname" '{testname:$tn,tests:[],summary:{}}')
#test's names, status, duration and updating JSON variable
IFS=$'\n'
for i in $tests
do
    if [[ $i == not* ]]
    then
        stat=false
    else
        stat=true
    fi
    if [[ $i =~ expecting(.+?)[0-9] ]]
    then
        var=${BASH_REMATCH[0]}
        name=${var%,*}
    fi
    if [[ $i =~ [0-9]*ms ]]
    then
        test_duration=${BASH_REMATCH[0]}
    fi
    json=$(echo $json | jq \
        --arg na "$name" \
        --arg st "$stat" \
        --arg dur "$test_duration" \
        '.tests += [{name:$na,status:$st|test("true"),duration:$dur}]')
done
#final success, failed, rating, duration and finishing JSON variable
IFS=$'\n'
for l in $results
do
    if [[ $l =~ [0-9]+ ]]
    then
        success=${BASH_REMATCH[0]}
    fi
    if [[ $l =~ ,.[0-9]+ ]]
    then
        v=${BASH_REMATCH[0]}
        failed=${v:2}
    fi
    if [[ $l =~ [0-9]+.[0-9]+% ]] || [[ $l =~ [0-9]+% ]]
    then
        va=${BASH_REMATCH[0]}
        rating=${va%%%}
    fi
    if [[ $l =~ [0-9]*ms ]]
    then
        duration=${BASH_REMATCH[0]}
    fi
    json=$(echo $json | jq \
        --arg suc "$success" \
        --arg fa "$failed" \
        --arg rat "$rating" \
        --arg dur "$duration" \
        '.summary += {success:$suc|tonumber,failed:$fa|tonumber,rating:$rat|tonumber,duration:$dur}')
done
#redirect variable's output to file
echo $json | jq "." > $path"/output.json"
Output.json looks like:
{
  "testname": "Behave test",
  "tests": [
    {
      "name": "some text",
      "status": true,
      "duration": "7ms"
    },
    {
      "name": "some text",
      "status": false,
      "duration": "27ms"
    }
  ],
  "summary": {
    "success": 1,
    "failed": 1,
    "rating": 50,
    "duration": "34ms"
  }
}
The real output is much longer than my example above. My question is: how can I calculate the "summary" values and put them into the JSON?
"success" - tests with status true;
"failed" - tests with status false;
"rating" - success / total, and it can be float or int
"duration" - sum of "duration" field
I will be very grateful for help.
This adds (or replaces) a .summary field which is calculated by reducing the .tests array to an object, initialized as {success: 0, failed: 0}. For each array item, either .success or .failed is incremented, depending on the current array item's boolean .status field. The .rating field is incremented unconditionally, thus counting the total number of items, which is later used to calculate the true rating. As for the duration, the array item's .duration field is converted into a number by chopping off the ms suffix and added to the summary's (numeric) .duration field. After the reduction, the rating is corrected by dividing the success count by it (and multiplied by 100 for convenience). The ? // 0 part catches division-by-zero issues, in case the .tests array was empty. Finally, the .duration field is re-equipped with the ms suffix.
.summary = (reduce .tests[] as $t ({success: 0, failed: 0};
(if $t.status then .success else .failed end, .rating) += 1
| .duration += ($t.duration | rtrimstr("ms") | tonumber)
) | .rating = (100 * (.success / .rating)? // 0) | .duration |= "\(.//0)ms")
{
  "testname": "Behave test",
  "tests": [
    {
      "name": "some text",
      "status": true,
      "duration": "7ms"
    },
    {
      "name": "some text",
      "status": false,
      "duration": "27ms"
    }
  ],
  "summary": {
    "success": 1,
    "failed": 1,
    "rating": 50,
    "duration": "34ms"
  }
}
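To run it outside an interactive session, the filter can be stored in a file (say summary.jq, the name is only for illustration) and applied to the generated output.json:
jq -f summary.jq output.json > output.json.tmp && mv output.json.tmp output.json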

Replacing key value pair with spaces using jq

I have a properties file and a JSON file. The properties file holds the key/value pairs that need to be replaced in the JSON.
When the value has no spaces it works as expected, but when the value has spaces, the value is not replaced.
Script that replaces the values
#!/bin/bash
echo hello
while read line;
do
    #echo $line
    key=$(echo "$line" |cut -d':' -f1)
    #echo $part1
    value=$(echo "$line" |cut -d':' -f2)
    if [[ ! -z $value ]];
    then
        key="\"$key\""
        value="\"$value\""
        echo $key : $value
        jq '.parameters |= map( if .name == '$key' then .default = '$value' else . end )' cam.json > cam1.json
        mv cam1.json cam.json
    fi
done < prop.properties
property file
git_con_type:something
git_host_fqdn:something again
git_user:something again again
git_user_password:something
git_repo:something
git_repo_user:
git_branch:
JSON file
{
  "name": "${p:Instance_Name}",
  "parameters": [
    {
      "name": "git_con_type",
      "default": "",
      "immutable_after_create": false
    },
    {
      "name": "git_host_fqdn",
      "default": "hello",
      "immutable_after_create": false
    },
    {
      "name": "git_user",
      "default": "",
      "immutable_after_create": false
    },
    {
      "name": "git_user_password",
      "default": "Passw0rd",
      "immutable_after_create": false
    },
    {
      "name": "git_repo",
      "default": "lm",
      "immutable_after_create": false
    },
    {
      "name": "git_repo_user",
      "default": "-Life",
      "immutable_after_create": false
    },
    {
      "name": "git_branch",
      "default": "master",
      "immutable_after_create": false
    },
    {
      "name": "git_clone_dir",
      "default": "/opt/git",
      "immutable_after_create": false
    }
  ]
}
Error
jq: error: syntax error, unexpected $end, expecting QQSTRING_TEXT or QQSTRING_INTERP_START or QQSTRING_END (Unix shell quoting issues?) at , line 1:
.parameters |= map( if .name == "git_host_fqdn" then .default = "something
jq: error: Possibly unterminated 'if' statement at , line 1:
.parameters |= map( if .name == "git_host_fqdn" then .default = "something
jq: 2 compile errors
How can I make jq accept values with spaces? I have tried it on jqplay with spaces and it works there, but not in the script.
It would be much better to pass $key and $value into jq as illustrated in the following (deliberately minimalist) tweak of your script:
#!/bin/bash
while read line
do
    key=$(echo "$line" |cut -d':' -f1)
    value=$(echo "$line" |cut -d':' -f2)
    if [[ ! -z "$value" ]]
    then
        jq --arg key "$key" --arg value "$value" '.parameters |= map( if .name == $key then .default = $value else . end )' cam.json > cam1.json
        mv cam1.json cam.json
    fi
done < prop.properties
It would be still better to avoid the bash loop altogether and do everything with just one invocation of jq.
One-pass solution
The idea is to create a dictionary ($dict) of key-value pairs, which can easily be done using the builtin filter INDEX/1:
INDEX(inputs | split(":") | select(.[1] | length > 0); .[0])
| map_values(.[1]) as $dict
| $cam
| .parameters |= map( $dict[.name] as $value | if $value then .default = $value else . end )
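For the property file shown in the question, $dict would come out roughly as follows (the two keys with empty values are dropped by the select):
{
  "git_con_type": "something",
  "git_host_fqdn": "something again",
  "git_user": "something again again",
  "git_user_password": "something",
  "git_repo": "something"
}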
Invocation
With the above jq program in program.jq:
jq -n -R -f program.jq --argfile cam cam.json prop.properties > cam1.json && mv cam1.json cam.json
Or using sponge:
jq -n -R -f program.jq --argfile cam cam.json prop.properties | sponge cam.json
Note in particular the -n option, which is needed because program.jq uses inputs.

How to parse JSON format output of "kubectl get pods" using jq and create an array

The JSON output returned to me after running this command
kubectl get pods -o json | jq '.items[].spec.containers[].env'
on my Kubernetes cluster is this:
[
  {
    "name": "USER_NAME",
    "value": "USER_NAME_VALUE_A"
  },
  {
    "name": "USER_ADDRESS",
    "value": "USER_ADDRESS_VALUE_A"
  }
]
[
  {
    "name": "USER_NAME",
    "value": "USER_NAME_VALUE_B"
  },
  {
    "name": "USER_ADDRESS",
    "value": "USER_ADDRESS_VALUE_B"
  }
]
I'd like to create a unified array/dictionary (using a Bash script) that looks like the example below. How can I get the value of each key?
[
  {
    "USER_NAME": "USER_NAME_VALUE_A",
    "USER_ADDRESS": "USER_ADDRESS_VALUE_A"
  },
  {
    "USER_NAME": "USER_NAME_VALUE_B",
    "USER_ADDRESS": "USER_ADDRESS_VALUE_B"
  }
]
Use the jsonpath output format:
C02W84XMHTD5:~ iahmad$ kubectl get pods --all-namespaces -o=jsonpath='{range .items[*]}{.metadata.name}{"\n"}'
coredns-c4cffd6dc-nsd2k
etcd-minikube
kube-addon-manager-minikube
kube-apiserver-minikube
kube-controller-manager-minikube
kube-dns-86f4d74b45-d5njm
kube-proxy-pg89s
kube-scheduler-minikube
kubernetes-dashboard-6f4cfc5d87-b7n7v
storage-provisioner
tiller-deploy-778f674bf5-vt4mj
https://kubernetes.io/docs/reference/kubectl/jsonpath/
It can output key/value pairs as well:
C02W84XMHTD5:~ iahmad$ kubectl get pods --all-namespaces -o=jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.status.startTime}{"\n"}{end}'
coredns-c4cffd6dc-nsd2k 2018-10-16T21:44:19Z
etcd-minikube 2018-10-29T17:30:56Z
kube-addon-manager-minikube 2018-10-29T17:30:56Z
kube-apiserver-minikube 2018-10-29T17:30:56Z
kube-controller-manager-minikube 2018-10-29T17:30:56Z
kube-dns-86f4d74b45-d5njm 2018-10-16T21:44:16Z
kube-proxy-pg89s 2018-10-29T17:32:05Z
kube-scheduler-minikube 2018-10-29T17:30:56Z
kubernetes-dashboard-6f4cfc5d87-b7n7v 2018-10-16T21:44:19Z
storage-provisioner 2018-10-16T21:44:19Z
tiller-deploy-778f674bf5-vt4mj 2018-11-01T13:45:23Z
Then you can split those by whitespace and form JSON or a list.
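As a sketch of that last step (the field names are only for illustration), the tab-separated output above could be read line by line and turned into JSON with jq:
kubectl get pods --all-namespaces -o=jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.status.startTime}{"\n"}{end}' |
while IFS=$'\t' read -r name start; do
    jq -n --arg name "$name" --arg start "$start" '{name: $name, startTime: $start}'
done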
This will do it in bash. You'd be surprised how much you can do with bash:
#!/bin/bash
NAMES=`kubectl get pods -o=jsonpath='{range .items[*]}{.spec.containers[*].env[*].name}{"\n"}' | tr -d '\011\012\015'`
VALUES=`kubectl get pods -o=jsonpath='{range .items[*]}{.spec.containers[*].env[*].value}{"\n"}' | tr -d '\011\012\015'`
IFS=' ' read -ra NAMESA <<< "$NAMES"
IFS=' ' read -ra VALUESA <<< "$VALUES"
MAXINDEX=`expr ${#NAMESA[@]} - 1`
printf "[\n"
for i in "${!NAMESA[@]}"; do
    printf " {\n"
    printf " \"USER_NAME\": \"${NAMESA[$i]}\",\n"
    printf " \"USER_ADDRESS\": \"${VALUESA[$i]}\"\n"
    if [ "$i" == "${MAXINDEX}" ]; then
        printf " }\n"
    else
        printf " },\n"
    fi
done
printf "]\n"
While you are using jq as a filter, why not use it as a transformer, too?
kubectl get pods -o json | \
jq '.items|map(.spec.containers|map(.env|map({key: .name, value})|from_entries)|add)'
I know this is totally a necromancer badge, but still ;)
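For reference, from_entries is what does the reshaping here: it turns an array of {key, value} objects into a single object. A small sketch on one env list:
$ echo '[{"name":"USER_NAME","value":"USER_NAME_VALUE_A"}]' | jq 'map({key: .name, value}) | from_entries'
{
  "USER_NAME": "USER_NAME_VALUE_A"
}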

Update values in json with jq (shell script)

I am trying to update a couple of values in a JSON array (separate file) with a shell script. Basically the logic is: set an environment variable called URL, ping that URL and add it to the JSON; if the ping exits with 0, update another JSON field to SUCCESS, otherwise update it to FAILED.
Here are the files:
info.json:
{
  "name": "PingTest",
  "metrics": [
    {
      "event_type": "PingResult",
      "provider": "test",
      "providerUrl": "URL",
      "providerResult": "RESULT"
    }
  ]
}
pinger.sh:
#!/bin/sh
JSON=`cat info.json` #read in JSON
#Assume URL variable is set to www.dksmfdkf.com
ping -q -c 1 "$URL" > /dev/null #ping url
if [ $? -eq 0 ]; then #if ping success, replace result in json template
JSON=`echo ${JSON} | jq --arg v "$URL" '.metrics[].providerUrl |= $v'
info.json`
JSON=`echo ${JSON} | jq '.metrics[].providerResult |= "SUCCESS"' info.json`
else
JSON=`echo ${JSON} | jq --arg v "$URL" '.metrics[].providerUrl |= $v'
info.json`
JSON=`echo ${JSON} | jq '.metrics[].providerResult |= "FAILED"' info.json`
fi
#Remove whitespace from json
JSON=`echo $JSON | tr -d ' \t\n\r\f'`
#Print the result
echo "$JSON"
The problem is that my JSON file isn't getting updated properly. Example result when running:
home:InfraPingExtension home$ ./pinger.sh
ping: cannot resolve : Unknown host
{
  "name": "PingTest",
  "metrics": [
    {
      "event_type": "PingResult",
      "provider": "test",
      "providerUrl": "",
      "providerResult": "RESULT"
    }
  ]
}
{
  "name": "PingTest",
  "metrics": [
    {
      "event_type": "PingResult",
      "provider": "test",
      "providerUrl": "URL",
      "providerResult": "FAILED"
    }
  ]
}
{"name":"PingTest","metrics":[{"event_type":"PingResult","provider":"test","providerUrl":"URL","providerResult":"RESULT"}]}
This would be greatly simplified by only calling jq once.
host=${URL#http://}; host=${host#https://}; host=${host%%/*}
if ping -q -c 1 "$host"; then
    result=SUCCESS
else
    result=FAILED
fi
JSON=$(
    jq -c \
        --arg url "$URL" \
        --arg result "$result" \
        '.metrics[].providerUrl |= $url
        | .metrics[].providerResult |= $result
        ' info.json
)
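If the result should also be written back to info.json, the variable can simply be redirected there afterwards; the -c flag already produces compact output, so the tr step is no longer needed. A sketch:
printf '%s\n' "$JSON" > info.json
echo "$JSON"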

What is the best way to pipe JSON with whitespaces into a bash array?

I have JSON like this that I'm parsing with jq:
{
  "data": [
    {
      "item": {
        "name": "string 1"
      },
      "item": {
        "name": "string 2"
      },
      "item": {
        "name": "string 3"
      }
    }
  ]
}
...and I'm trying to get "string 1" "string 2" and "string 3" into a Bash array, but I can't find a solution that ignores the whitespace in them. Is there a method in jq that I'm missing, or perhaps an elegant solution in Bash for it?
Current method:
json_names=$(cat file.json | jq ".data[] .item .name")
read -a name_array <<< $json_names
The solutions below assume your JSON text is in a string named s. That is:
s='{
  "data": [
    {
      "item1": {
        "name": "string 1"
      },
      "item2": {
        "name": "string 2"
      },
      "item3": {
        "name": "string 3"
      }
    }
  ]
}'
Unfortunately, both of the below will misbehave with strings containing literal newlines; since jq doesn't have support for NUL-delimited output, this is difficult to work around.
On bash 4 (with slightly sloppy error handling, but tersely):
readarray -t name_array < <(jq -r '.data[] | .[] | .name' <<<"$s")
...or on bash 3.x or newer (with very comprehensive error handling, but verbosely):
# -d '' tells read to process up to a NUL, and will exit with a nonzero exit status if that
# NUL is not seen; thus, this causes the read to pass through any error which occurred in
# jq.
IFS=$'\n' read -r -d '' -a name_array \
< <(jq -r '.data[] | .[] | .name' <<<"$s" && printf '\0')
This populates a bash array, contents of which can be displayed with:
declare -p name_array
Arrays are assigned in the form:
NAME=(VALUE1 VALUE2 ... )
where NAME is the name of the variable, and VALUE1, VALUE2, and the rest are fields separated with characters that are present in the $IFS (internal field separator) variable.
Since jq outputs the string values as lines (sequences separated with the new line character), then you can temporarily override $IFS, e.g.:
# Disable globbing, remember current -f flag value
[[ "$-" == *f* ]] || globbing_disabled=1
set -f
IFS=$'\n' a=( $(jq --raw-output '.data[].item.name' file.json) )
# Restore globbing
test -n "$globbing_disabled" && set +f
The above will create an array of three items for the following file.json:
{
  "data": [
    {"item": {
      "name": "string 1"
    }},
    {"item": {
      "name": "string 2"
    }},
    {"item": {
      "name": "string 3"
    }}
  ]
}
The following shows how to create a bash array consisting of arbitrary JSON texts produced by a run of jq.
In the following, I'll assume input.json is a file with the following:
["string 1", "new\nline", {"x": 1}, ["one\ttab", 4]]
With this input, the jq filter .[] produces four JSON texts -- two JSON strings, a JSON object, and a JSON array.
The following bash script can then be used to set x to be a bash array of the JSON texts:
#!/bin/bash
x=()
while read -r value
do
    x+=("$value")
done < <(jq -c '.[]' input.json)
For example, adding this bash expression to the script:
for a in "${x[#]}" ; do echo a="$a"; done
would yield:
a="string 1"
a="new\nline"
a={"x":1}
a=["one\ttab",4]
Notice how (encoded) newlines and (encoded) tabs are handled properly.
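On bash 4 and later, mapfile (a.k.a. readarray) can replace the while-read loop; a brief sketch against the same input.json:
mapfile -t x < <(jq -c '.[]' input.json)
declare -p x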