How to update json field using jq [duplicate]

I have a bootstrap script. Every time it runs, I want to keep track of its exit status in a JSON file. The JSON file can have other fields too.
Case 1: The very first time it runs, the JSON file will be created and a field node_bootstrap_status with a boolean value will be added to it.
Case 2: In subsequent runs, the JSON file will already exist. In this case, I want to update the same JSON field node_bootstrap_status with the outcome (again a boolean).
I wrote the following bash script. It works for the first case. In the second case, however, it ends up deleting everything in the pre-existing JSON file.
exit_code=1
node_info_file="/etc/node_info.json"
if [[ -f $node_info_file ]]; then
    # /etc/node_info.json exists
    echo "${node_info_file} exists"
    if [ $exit_code = 0 ]; then
        cat $node_info_file | jq --argjson node_bootstrap_status true '{"node_bootstrap_status": $node_bootstrap_status}' > $node_info_file
    else
        cat $node_info_file | jq --argjson node_bootstrap_status false '{"node_bootstrap_status": $node_bootstrap_status}' > $node_info_file
    fi
else
    # /etc/node_info.json does NOT exist
    echo "${node_info_file} does not exist"
    touch ${node_info_file}
    if [ $exit_code = 0 ]; then
        echo '{}' | jq --argjson node_bootstrap_status true '{"node_bootstrap_status": $node_bootstrap_status}' > $node_info_file
    else
        echo '{}' | jq --argjson node_bootstrap_status false '{"node_bootstrap_status": $node_bootstrap_status}' > $node_info_file
    fi
fi
Expected outcome:
cat /etc/node_info.json
{
  "node_bootstrap_status": true/false,
  "foo": "bar"
}

You're off to a good start, but there are two problems. The { ... } object-construction syntax builds a brand-new object containing only node_bootstrap_status, so every other field is discarded. And redirecting back into the same file you are reading (cat $node_info_file | jq ... > $node_info_file) truncates the file before cat and jq have read it, which is why the file ends up empty. Use assignment and a temporary copy instead, like so:
node_info_file="node_info.json"
set_status() {
    local value="$1"
    mv -f "$node_info_file" "$node_info_file.tmp"
    # Set property key with: '.node_bootstrap_status = ...'
    jq --argjson node_bootstrap_status "$value" '.node_bootstrap_status = $node_bootstrap_status' "$node_info_file.tmp" > "$node_info_file"
    rm -f "$node_info_file.tmp"
}
# if node_info.json does not exist, create it
if [ ! -f "$node_info_file" ]; then
printf '%s\n' '{}' > "$node_info_file"
fi
set_status true
set_status false
I tested this with empty JSON, and with other contents, and it worked as expected.
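For illustration, a quick run (the file contents here are just an example) shows that fields already present survive the update:
printf '%s\n' '{"foo": "bar"}' > node_info.json
set_status true
cat node_info.json
{
  "foo": "bar",
  "node_bootstrap_status": true
}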
Also, make sure you quote "$node_info_file" - it's good practice. If you use ShellCheck then it'll catch those types of errors for you.

Try to adapt this version to your needs:
#!/usr/bin/env bash
exit_code=1
node_info_file="/etc/node_info.json"
test -f "$node_info_file" || echo '{}' > "$node_info_file"
[ $exit_code = 0 ] && status=true || status=false
tmp=$(mktemp)
# '. + {$node_bootstrap_status}' merges the field into the existing object
# instead of replacing the whole document
jq --argjson node_bootstrap_status "$status" '. + {$node_bootstrap_status}' "$node_info_file" > "$tmp"
mv "$tmp" "$node_info_file"
Just a side note, it may not be a good idea to save node_info.json in /etc.

Related

bash & jq: add attribute with object value

I'm looking for a solution to add a new attribute with a JSON object value into an existing JSON file.
My current script:
if [ ! -f "$src_file" ]; then
echo "Source file $src_file does not exists"
exit 1
fi
if [ ! -f "$dst_file" ]; then
echo "Destination file $dst_file does not exists"
exit 1
fi
if ! jq '.devDependencies' "$src_file" >/dev/null 2>&1; then
echo "The key "devDependencies" does not exists into source file $src_file"
exit 1
fi
dev_dependencies=$(jq '.devDependencies' "$src_file" | xargs )
# Extract data from source file
data=$(cat $src_file)
# Add new key-value
data=$(echo $data | jq --arg key "devDependencies" --arg value "$dev_dependencies" '. + {($key): ($value)}')
# Write data into destination file
echo $data > $dst_file
It's working, but the devDependencies value from $dev_dependencies is written as a string:
"devDependencies": "{ @nrwl/esbuild: 15.6.3, @nrwl/eslint-pl[...]".
How can I write it as raw JSON?
I think you want the --argjson option instead of --arg. Compare
$ jq --arg k '{"foo": "bar"}' -n '{x: $k}'
{
"x": "{\"foo\": \"bar\"}"
}
with
$ jq --argjson k '{"foo": "bar"}' -n '{x: $k}'
{
"x": {
"foo": "bar"
}
}
--arg will create a string variable. Use --argjson to parse the value as JSON (can be object, array or number).
From the docs:
--arg name value:
This option passes a value to the jq program as a predefined variable.
If you run jq with --arg foo bar, then $foo is available in the
program and has the value "bar". Note that value will be treated as a
string, so --arg foo 123 will bind $foo to "123".
Named arguments are also available to the jq program as $ARGS.named.
--argjson name JSON-text:
This option passes a JSON-encoded value to the jq program as a
predefined variable. If you run jq with --argjson foo 123, then $foo
is available in the program and has the value 123.
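Applied to the question's script, a minimal sketch (keeping the question's variable names, and assuming the goal is to merge .devDependencies from $src_file into $dst_file) would be:
dev_dependencies=$(jq '.devDependencies' "$src_file")   # keep it as JSON text; no xargs
jq --argjson deps "$dev_dependencies" '. + {devDependencies: $deps}' "$dst_file" > "$dst_file.tmp" &&
    mv "$dst_file.tmp" "$dst_file"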
Note that you don't need multiple invocations of jq, xargs, command substitution or variables (don't forget to quote all your variables when expanding).
To "merge" the contents of two files, read both files with jq and let jq do the work. This avoids all the complications that arise from jumping between jq and shell context. A single line is all that's needed:
jq --slurpfile deps "$dep_file" '. + { devDependencies: $deps[0].devDependencies }' "$source_file" > "$dest_file"
or
jq --slurpfile deps "$dep_file" '. + ($deps[0]|{devDependencies})' "$source_file" > "$dest_file"
alternatively (still a one-liner):
jq --slurpfile deps "$dev_file" '.devDependencies = $deps[0].devDependencies' "$source_file" > "$dest_file"
peak's answer here reminded me of the very useful input filter, which can make the program even shorter as it avoids the variable:
jq '. + (input|{devDependencies})' "$source_file" "$dep_file" > "$dest_file"

Check if result is an empty string

I have a JSON file which I created using a jq command.
The file is like this:
[
  {
    "description": "",
    "app": "hello-test-app"
  },
  {
    "description": "",
    "app": "hello-world-app"
  }
]
I would like to have just a simple if/else condition to check if the description is empty.
I tried different approaches but none of them works!!
I tried:
jq -c '.[]' input.json | while read i; do
    description=$(echo $i | jq '.description')
    if [[ "$description" == "" ]];
    then
        echo "$description is empty"
    fi
done
and with the same code but this if/else:
if [[ -z "$description" ]];
then
    echo "$description is empty"
fi
Can someone help me?
jq supports conditionals. No need to bring this back to bash (yet):
< foo jq -r '.[] | if .description == ""
then "description is empty"
else .description end'
description is empty
description is empty
If you insist on piping back to bash, here's what is happening:
jq -c '.[]' foo | while read i; do description=$(echo $i | jq '.description')
printf '%s\n' "$description"
done
""
""
You can see here that the expansion of $description is not empty. It is literally a pair of quotes each time.
There are several problems with piping to bash here -- the unquoted expansion of $i, re-piping to jq, and translating a pair of quotes into an empty string between two different programming languages. But I guess the simple answer is: just test whether "$description" expands to a pair of quotes.
Testing quotes in bash means quoting your quotes:
if [[ $description = '""' ]]; then
echo '$description expands to a pair of quotes'
fi
A better answer is, in my opinion, keep the work in jq.
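As a middle ground (a small sketch added here, not from the original answers): with jq -r the strings are printed raw, so an empty description really does expand to an empty string in bash and the -z test behaves as the asker expected:
jq -r '.[].description' input.json | while read -r description; do
    if [[ -z "$description" ]]; then
        echo "description is empty"
    fi
done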

How can I interpret variables on the fly in the shell script?

I'm reading JSON in a shell script using JQ. Here, I'm unable to interpret the variables $HOME, $HOST, $PEMFILE in my shell script on the fly.
JSON File:
{
  "script": {
    "install": "${HOME}/lib/install.sh $HOST $PEMFILE",
    "Setup": "${HOME}/lib/setup.sh $HOST $PEMFILE $VAR1 $VAR2"
  }
}
Shell Script:
#!/bin/bash
examplefile="../lib/example.json"
HOST=ec2-..-...-...-...us-west-2.compute.amazonaws.com
PEMFILE=${HOME}/test.pem
installScript=($(jq '.script.install' $examplefile))
bash "$installScript"
Is there a way I can interpret these variables on the fly without modifying the JSON?
P.S. I don't want to use eval.
It is easy using the GNU utility envsubst:
installScript=$(jq -r '.script.install' "$examplefile" | envsubst)
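One caveat (an assumption about the asker's script, where HOST and PEMFILE are plain shell variables): envsubst only substitutes environment variables, so they need to be exported first:
export HOST PEMFILE
installScript=$(jq -r '.script.install' "$examplefile" | envsubst)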
Here is a solution using env and gsub to perform the replacement.
Note that env requires the variables to be passed as environment variables as opposed to shell variables.
#!/bin/bash
examplefile="../lib/example.json"
HOST=ec2-..-...-...-...us-west-2.compute.amazonaws.com
PEMFILE=${HOME}/test.pem
export HOST
export PEMFILE
installScript=$(jq -Mr '
.script.install | gsub("(?<x>[$][{]?\\w+[}]?)"; env[.x|gsub("[${}]+";"")] )
' $examplefile)
echo $installScript
Sample Output
/home/runner/lib/install.sh ec2-..-...-...-...us-west-2.compute.amazonaws.com /home/runner/test.pem
Specific solution
Here's a jq solution to the stated problem, though it will only work for "global" environment variables.
def substitute:
    gsub("\\${HOME}"; env.HOME)
  | gsub("\\$HOST"; env.HOST)
  | gsub("\\$PEMFILE"; env.PEMFILE)
  | gsub("\\$VAR1"; env.VAR1)
  | gsub("\\$VAR2"; env.VAR2)
;
walk( if type=="string" then substitute else . end )
If your jq does not already have walk/1, then please either upgrade your jq or snarf the def from https://github.com/stedolan/jq/blob/master/src/builtin.jq
The solution above is a bit brittle but it could easily be robustified or generalized, as shown in the next section.
General solution
walk(if type == "string"
then gsub("\\$(?<x>[A-Za-z_][A-Za-z0-9_]+)"; "\(env[.x])")
| gsub("\\${(?<x>[A-Za-z_][A-Za-z0-9_]+)}"; "\(env[.x])")
else . end)
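A usage sketch (substitute.jq is a hypothetical file name holding the filter above): env[] in jq only sees environment variables, so export the shell variables first:
export HOST PEMFILE VAR1 VAR2
jq -f substitute.jq "$examplefile"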
#!/bin/sh
# Write a tiny throw-away script that cats the JSON through an unquoted
# here-document; when it is sourced, the shell expands the $VARIABLES.
TMP=$(mktemp /tmp/$$.XXX)
cat <<E_O_F > "$TMP"
cat <<EOF
$(cat so-dollar-variables.json)
EOF
E_O_F
. "$TMP"
/bin/rm "$TMP"
I've been hitting this on and off for years. I think I've finally got a decent pure-bash solution: it uses regex matching and indirect parameter substitution.
# read the file
json=$(< file.json)
echo step 0
echo "$json"
# set the relevant vars, just plain shell variables
HOST=_host_
PEMFILE=_pemfile_
VAR1=_var1_
VAR2=_var2_
# replace '$var' forms
while [[ $json =~ ("$"([[:alnum:]_]+)) ]]; do
    json=${json//${BASH_REMATCH[1]}/${!BASH_REMATCH[2]}}
done
echo
echo step 1
echo "$json"
# replace '${var}' forms
while [[ $json =~ ("$""{"([[:alnum:]_]+)"}") ]]; do
    json=${json//${BASH_REMATCH[1]}/${!BASH_REMATCH[2]}}
done
echo
echo step 2
echo "$json"
Output
step 0
{
  "script": {
    "install": "${HOME}/lib/install.sh $HOST $PEMFILE",
    "Setup": "${HOME}/lib/setup.sh $HOST $PEMFILE $VAR1 $VAR2"
  }
}
step 1
{
  "script": {
    "install": "${HOME}/lib/install.sh _host_ _pemfile_",
    "Setup": "${HOME}/lib/setup.sh _host_ _pemfile_ _var1_ _var2_"
  }
}
step 2
{
  "script": {
    "install": "/home/jackman/lib/install.sh _host_ _pemfile_",
    "Setup": "/home/jackman/lib/setup.sh _host_ _pemfile_ _var1_ _var2_"
  }
}
The magic is:
the regular expression, where I capture both $VAR and VAR, and
[[ $json =~ ("$"([[:alnum:]_]+)) ]]
# ..........1 2 21
the parameter substitution, where I search for the string "$VAR" and replace it with the indirect variable expansion ${!VAR}
${json//${BASH_REMATCH[1]}/${!BASH_REMATCH[2]}}

How to Flatten JSON using jq and Bash into Bash Associative Array where Key=Selector?

As a follow-up to Flatten Arbitrary JSON, I'm looking to take the flattened results and make them suitable for doing queries and updates back to the original JSON file.
Motivation: I'm writing Bash (4.2+) scripts (on CentOS 7) that read JSON into a Bash associative array using the JSON selector/filter as the key. I do processing on the associative arrays, and in the end I want to update the JSON with those changes.
The preceding solution gets me close to this goal. I think there are two things that it doesn't do:
It doesn't quote keys that require quoting. For example, the key com.acme would need to be quoted because it contains a special character.
Array indexes are not represented in a form that can be used to query the original JSON.
Existing Solution
The solution from the above is:
$ jq --stream -n --arg delim '.' 'reduce (inputs|select(length==2)) as $i ({};
[$i[0][]|tostring] as $path_as_strings
| ($path_as_strings|join($delim)) as $key
| $i[1] as $value
| .[$key] = $value
)' input.json
For example, if input.json contains:
{
  "a.b": [
    "value"
  ]
}
then the output is:
{
"a.b.0": "value"
}
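For reference, a sketch of the raw --stream events for that input; the select(length==2) keeps only the leaf [path, value] pairs and drops the closing events:
$ jq -c --stream . <<< '{"a.b":["value"]}'
[["a.b",0],"value"]
[["a.b",0]]
[["a.b"]]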
What is Really Wanted
An improvement would have been:
{
"\"a.b\"[0]": "value"
}
But what I really want is output formatted so that it could be sourced directly in a Bash program (implying the array name is passed to jq as an argument):
ArrayName['"a.b"[0]']='value' # Note 'value' might need escapes for Bash
I'm looking to have the more human-readable syntax above as opposed to the more general:
ArrayName['.["a.b"][0]']='value'
I don't know if jq can handle all of this. My present solution is to take the output from the preceding solution and to post-process it to the form that I want. Here's the work in process:
#!/bin/bash
Flatten()
{
    local -r OPTIONS=$(getopt -o d:m:f: -l "delimiter:,mapname:,file:" -n "${FUNCNAME[0]}" -- "$@")
    eval set -- "$OPTIONS"
    local Delimiter='.' MapName=map File=
    while true ; do
        case "$1" in
            -d|--delimiter) Delimiter="$2"; shift 2;;
            -m|--mapname) MapName="$2"; shift 2;;
            -f|--file) File="$2"; shift 2;;
            --) shift; break;;
        esac
    done
    local -a Array=()
    readarray -t Array <<<"$(
        jq -c -S --stream -n --arg delim "$Delimiter" 'reduce (inputs|select(length==2)) as $i ({}; .[[$i[0][]|tostring]|join($delim)] = $i[1])' <<<"$(sed 's|^\s*[#%].*||' "$File")" |
        jq -c "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" |
        sed -e 's|^"||' -e 's|"$||' -e 's|=|\t|')"
    if [[ ! -v $MapName ]]; then
        local -gA $MapName
    fi
    . <(
        IFS=$'\t'
        while read -r Key Value; do
            printf "$MapName[\"%s\"]=%q\n" "$Key" "$Value"
        done <<<"$(printf "%s\n" "${Array[@]}")"
    )
}
declare -A Map
Flatten -m Map -f "$1"
declare -p Map
With the output:
$ ./Flatten.sh <(echo '{"a.b":["value"]}')
declare -A Map='([a.b.0]="value" )'
1) jq is Turing complete, so it's all just a question of which hammer to use.
2)
An improvement would have been:
{
"\"a.b\"[0]": "value"
}
That is easily accomplished using a helper function along these lines:
def flattenPath(delim):
  reduce .[] as $s ("";
    if $s|type == "number"
    then ((if . == "" then "." else . end) + "[\($s)]")
    else . + ($s | tostring | if index(delim) then "\"\(.)\"" else . end)
    end );
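As a sketch of how that helper could be plugged into the streaming reduce from the existing solution (this wiring is my addition, not part of the original answer):
jq --stream -n --arg delim '.' '
  def flattenPath(delim):
    reduce .[] as $s ("";
      if $s|type == "number"
      then ((if . == "" then "." else . end) + "[\($s)]")
      else . + ($s | tostring | if index(delim) then "\"\(.)\"" else . end)
      end );
  reduce (inputs|select(length==2)) as $i ({};
    .[$i[0] | flattenPath($delim)] = $i[1])
' input.json
For the example input {"a.b":["value"]} this yields {"\"a.b\"[0]": "value"}, i.e. the improved key form shown above.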
3)
I do processing on the associative arrays, and in the end I want to update the JSON with those changes.
This suggests you might have posed an xy-problem. However, if you really do want to serialize and unserialize some JSON text, then the natural way to do so using jq is using leaf_paths, as illustrated by the following serialization/deserialization functions:
# Emit (path, value) pairs
# Usage: jq -c -f serialize.jq input.json > serialized.json
def serialize: leaf_paths as $p | ($p, getpath($p));
# Usage: jq -n -f unserialize.jq serialized.json
def unserialize:
  def pairwise(s):
    foreach s as $i ([];
      if length == 1 then . + [$i] else [$i] end;
      select(length == 2));
  reduce pairwise(inputs) as $p (null; setpath($p[0]; $p[1]));
If using bash, you could use readarray (mapfile) to read the paths and values into a single array, or if you want to distinguish between the paths and values more easily, you could (for example) use the approach illustrated by the following:
i=0
while read -r line ; do
    path[$i]="$line"; read -r line; value[$i]="$line"
    i=$((i + 1))
done < serialized.json
But there are many other alternatives.

parsing json to check whether a field is blank in bash

So, let's say I am trying to write a shell script which does the following:
1) Ping http://localhost:8088/query?key=q1
2) It returns a JSON response:
{
  "status": "success",
  "result": [],
  "query": "q1"
}
or
{
  "status": "success",
  "result": ["foo"],
  "artist_key": "q1"
}
The "result" is either an empty array or a filled array.
I am trying to write a shell script which checks whether "result" is an empty list or not.
Something like this would work:
# Assume result is from curl, but could be in a file or whatever
if curl "http://localhost:8088/query?key=q1" | grep -Pq '"result":\s+\[".+"\]'; then
echo "has result"
else
echo "does not have result"
fi
However, I'm assuming these are on separate lines. If not, there are linters to format it.
Edited (based on the jq comment), here's a jq solution as suggested by Adrian Frühwirth:
result=$( curl "http://localhost:8088/query?key=q1" | jq '.result' )
if [ "$result" == "[]" ]; then
echo "does not have result"
else
echo "has result"
fi
I have learned something new today. And as I play around with this more, maybe it's better to do this:
result=$( curl "http://localhost:8088/query?key=q1" | jq '.result | has(0)' )
if [ "$result" == "true" ]; then
echo "has result"
else
echo "does not have result"
fi
See the manual. I wasn't able to get the -e or --exit-status arguments to work.
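For what it's worth, here is a sketch of the -e approach (my addition, not from the original answer): jq -e sets the exit status to 0 when the last output value is neither false nor null, so a boolean test can drive the if directly:
if curl -s "http://localhost:8088/query?key=q1" | jq -e '.result | length > 0' >/dev/null; then
    echo "has result"
else
    echo "does not have result"
fi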
I'd use a language that can convert JSON to a native data structure:
wget -O- "http://localhost:8088/query?key=q1" |
perl -MJSON -0777 -ne '
$data = decode_json $_;
exit (@{$data->{result}} == 0)
'
That exits with a success status if the result attribute is NOT empty. Encapsulating into a shell function:
check_result() {
wget -O- "http://localhost:8088/query?key=q1" |
perl -MJSON -0777 -ne '$data = decode_json $_; exit (@{$data->{result}} == 0)'
}
if check_result; then
echo result is NOT empty
else
echo result is EMPTY
fi
I like ruby for parsing JSON:
ruby -rjson -e 'data = JSON.parse(STDIN.read); exit (data["result"].length > 0)'
It's interesting that the exit status requires the opposite comparison operator. I guess Ruby converts exit(true) to exit(0), unlike Perl, which has no true boolean objects, only integers.