Check if result is an empty string - json

I have a JSON file which I created using a jq command.
The file is like this:
[
{
"description": "",
"app": "hello-test-app"
},
{
"description": "",
"app": "hello-world-app"
}
]
I would like to have just a simple if/else condition to check if the description is empty.
I tried different approaches but none of them works!!
I tried:
jq -c '.[]' input.json | while read i; do
description=$(echo $i | jq '.description')
if [[ "$description" == "" ]];
then
echo "$description is empty"
fi
done
and with the same code but this if/else:
if [[ -z "$description" ]];
then
echo "$description is empty"
fi
Can someone help me?

jq supports conditionals. No need to bring this back to bash (yet):
< foo jq -r '.[] | if .description == ""
then "description is empty"
else .description end'
description is empty
description is empty
If you insist on piping back to bash, here's what is happening:
jq -c '.[]' foo | while read i; do description=$(echo $i | jq '.description')
printf '%s\n' "$description"
done
""
""
You can see here that the expansion of $description is not empty. It is literally a pair of quotes each time.
There are several problems with piping to bash here -- the unquoted expansion of $i, re-piping to jq, and translating a pair of quotes into an empty string between two different programming languages. But I guess the simple answer is: just test whether "$description" expands to a pair of quotes.
Testing quotes in bash means quoting your quotes:
if [[ $description = '""' ]]; then
echo '$description expands to a pair of quotes'
fi
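Alternatively, jq's -r (--raw-output) flag prints the string without the JSON quotes, so the -z test from the question works as-is; a minimal sketch:
jq -c '.[]' input.json | while IFS= read -r i; do
    # -r strips the surrounding JSON quotes, so an empty description really is ""
    description=$(jq -r '.description' <<< "$i")
    if [[ -z "$description" ]]; then
        echo "description is empty for app $(jq -r '.app' <<< "$i")"
    fi
done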
A better answer is, in my opinion, keep the work in jq.
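For instance, to report only the apps whose description is empty (reusing input.json from the question):
< input.json jq -r '.[] | select(.description == "") | "\(.app): description is empty"'
hello-test-app: description is empty
hello-world-app: description is empty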

Related

Using jq with an unknown number of arguments from a shell script?

I'm about to lose my mind working with jq for the first time today. I've tried every ever so dirty way to make this work somehow. Let's get down to business:
I'm trying to further develop the pretty barebones API for Unifi Controllers for some use cases:
https://dl.ui.com/unifi/6.0.41/unifi_sh_api
Here's some mock up data (the real json data is thousands of lines):
{
"meta": {
"rc": "ok"
},
"data": [
{
"_id": "61a0da77f730e404af0edc3c",
"ip": "10.19.31.120",
"mac": "ff:ec:ff:89:ff:58",
"name": "Test-Accesspoint"
}
]
}
Here is my advancedapi.sh
#!/bin/bash
source unifiapi.sh
unifi_login 1>dev/null
if [ "$1" ];
then
command=$1
shift
else
echo "Please enter command"
exit
fi
#jq -r '.data[] | "\(.name),\(.mac)"'
assembleKeys(){
c="0"
seperator=";"
keys=$(for i in "$@"
do
c=$(expr $c + 1)
if [ $c -gt 1 ];
then
echo -n "${seperator}\(.${i})"
else
echo -n "\(.${i})"
fi
done
echo)
}
assembleJQ(){
c=0
jq=$(echo -en 'jq -r -j \x27.data[] | '
for i in "$#"
do
if [ "$c" != "0" ];
then
echo -en ",\x22 \x22,"
fi
c=1
echo -en ".$i"
done
echo -en ",\x22\\\n\x22\x27")
echo "$jq"
}
getDeviceKeys(){
assembleJQ "$@"
#unifi_list_devices | jq -r '.data[]' | jq -r "${keys}"
#export keys=mac
#export second=name
#unifi_list_devices | jq -r -j --arg KEYS "$keys" --arg SECOND "$second" '.data[] | .[$KEYS]," ",.[$SECOND],"\n"'
unifi_list_devices | $jq
#unifi_list_devices | jq -r -j '.data[] | .mac," ",.name,"\n"'
}
"$command" $#
Users should be able to call any function from the API with as many arguments as they want.
So basically this:
#export keys=mac
#export second=name
#unifi_list_devices | jq -r -j --arg KEYS "$keys" --arg SECOND "$second" '.data[] | .[$KEYS]," ",.[$SECOND],"\n"'
which works, but only with a limited number of arguments. I want to pass $@ to jq.
I'm trying to get the name and mac-address of a device by calling:
./advancedapi.sh getDeviceKeys mac name
this should return the mac-address and name of the device. But it only gives me this:
jq -r -j '.data[] | .mac," ",.name,"\n"'
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
'.data[]
jq: 1 compile error
The top-most line being the jq command, which works perfectly fine when called manually.
The assembleKeys function was my attempt at generating a string that looks like this:
jq -r '.data[] | "\(.name),\(.mac)"'
Can anyone explain to me, preferably in-depth, how to do this? I'm going insane here!
Thanks!
I want to pass $@ to jq.
There are actually many ways to do this, but the simplest might be using the --args command-line option, as illustrated here:
File: check
#!/bin/bash
echo "${@}"
jq -n '$ARGS.positional' --args "${@}"
Example run:
./check 1 2 3
1 2 3
[
"1",
"2",
"3"
]
Projection
Here's an example that might be closer to what you're looking for.
Using your sample JSON:
File: check2
#!/bin/bash
jq -r '
def project($array):
. as $in | reduce $array[] as $k ([]; . + [$in[$k]]);
$ARGS.positional,
(.data[] | project($ARGS.positional))
| @csv
' input.json --args "${@}"
./check2 name mac
"name","mac"
"Test-Accesspoint","ff:ec:ff:89:ff:58"
There is an error in the JSON: it contains trailing commas, which are not needed.
$
$ cat j.json | jq -r '.data[] | "\(.name)█\(.mac)█\(.ip)█\(._id)"'
Test-Accesspoint█ff:ec:ff:89:ff:58█10.19.31.120█61a0da77f730e404af0edc3c
$ cat j.json
{
"meta": {
"rc": "ok"
},
"data": [
{
"_id": "61a0da77f730e404af0edc3c",
"ip": "10.19.31.120",
"mac": "ff:ec:ff:89:ff:58",
"name": "Test-Accesspoint"
}
]
}
$
If the API has given you this JSON, you should probably clean it up before piping it into jq [maybe with sed or something like it].
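If you do need to clean it up first, here is a rough sketch (GNU sed's -z option treats the whole file as one record, so the comma and the closing bracket may sit on different lines; note that this naive pattern would also mangle a string value that happens to contain a comma right before a bracket):
# strip a comma that is followed only by whitespace and a closing } or ]
sed -Ez 's/,([[:space:]]*[]}])/\1/g' j.json | jq .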

How to use a positional argument with embedded quotes in bash?

I'm trying to create a bash script that automates configuration of some letsencrypt related stuff.
The file that I have to edit is json so I would just use jq to edit it and pass the site name to it from the positional arguments of the script, but I can't get the positional argument passed into the json text.
I'm trying to do something like the following:
JSON=`jq '. + { "ssl_certificate": "/etc/letsencrypt/live/$2/fullchain.pem" }' <<< echo site_config.json`
JSON=`jq '. + { "ssl_certificate_key": "/etc/letsencrypt/live/$2/fullchain.pem" }' <<< ${JSON}`
echo -e "$JSON" > site_config.json
Where the second positional argument ($2) contains the domain name required to be set in the JSON file.
How can this be done?
Original json:
{
"key1":"value1",
"key2":"value2"
}
Wanted json:
{
"key1":"value1",
"key2":"value2",
"ssl_certificate": "/etc/letsencrypt/live/somesite.com/fullchain.pem",
"ssl_certificate_key": "/etc/letsencrypt/live/somesite.com/fullchain.pem"
}
1. String construction under bash
I use printf and octal representation for nesting quotes and double quotes:
printf -v JSON 'Some "double quoted: \047%s\047"' "Any string"
echo "$JSON"
Some "double quoted: 'Any string'"
2. Using jq, strictly answering the edited question:
myFunc() {
local file="$1" site="$2" JSON
printf -v JSON '. + {
"ssl_certificate": "/etc/letsencrypt/live/%s/fullchain.pem",
"ssl_certificate_key": "/etc/letsencrypt/live/%s/fullchain.pem"
}' "$site" "$site"
jq "$JSON" <"$file"
}
Then run:
myFunc site_config.json test.com
{
"key1": "value1",
"key2": "value2",
"ssl_certificate": "/etc/letsencrypt/live/test.com/fullchain.pem",
"ssl_certificate_key": "/etc/letsencrypt/live/test.com/fullchain.pem"
}
myFunc site_config.json test.com >site_config.temp && mv site_config.{temp,json}
Or even:
myFunc <(
printf '{ "key1":"value1","key2":"value2","comment":"Let\047s doit\041" }'
) test.com
Will render:
{
"key1": "value1",
"key2": "value2",
"comment": "Let's doit!",
"ssl_certificate": "/etc/letsencrypt/live/test.com/fullchain.pem",
"ssl_certificate_key": "/etc/letsencrypt/live/test.com/fullchain.pem"
}
2b. Better written with jq's --arg option:
Thanks to peak's detailed answer!
I use bash arrays to store strings with quotes, spaces and other special characters. This is more readable, as there is no need to escape end-of-lines (backslashes), and it permits comments:
myFunc() {
local file="$1" site="$2"
local JSON=(
--arg ssl_certificate "/etc/letsencrypt/live/$site/fullchain.pem"
--arg ssl_certificate_key "/etc/letsencrypt/live/$site/fullchain.pem"
'. + {$ssl_certificate, $ssl_certificate_key}' # this syntax offers two
# advantages: 1: no backslashes and 2: comments are permitted.
)
jq "${JSON[@]}" <"$file"
}
3. Inline edit function
For editing a small file, I prefer to use cp -a in order to preserve attributes and to make sure the operation succeeds before replacing the original.
If you plan to use this mostly for in-place replacement, you could add the replacement step to your function:
myFunc() {
local REPLACE=false
[ "$1" = "-r" ] && REPLACE=true && shift
local file="$1" site="$2"
local JSON=( --arg ssl_certificate "/etc/letsencrypt/live/$site/fullchain.pem"
--arg ssl_certificate_key "/etc/letsencrypt/live/$site/fullchain.pem"
'. + {$ssl_certificate, $ssl_certificate_key}' )
if $REPLACE;then
cp -a "$file" "${file}.temp"
exec {out}>"${file}.temp"
else
exec {out}>&1
fi
jq "${JSON[#]}" <"$file" >&$out &&
$REPLACE && mv "${file}.temp" "$file"
exec {out}>&-
}
Then, to modify the file instead of dumping the result to the terminal, add the -r option:
myFunc -r site_config.json test.org
I cannot get the positional argument passed into the json text.
In general, by far the best way to do that is to use jq's --arg and/or --argjson command-line options. This is safe, and in the present case means that you would only have to call jq once. E.g.:
< site_config.json \
jq --arg sslc "/etc/letsencrypt/live/$2/fullchain.pem" \
--arg sslck "/etc/letsencrypt/live/$2/fullchain.pem" '
. + {ssl_certificate: $sslc, ssl_certificate_key: $sslck }'
Once you're sure things are working properly, feel free to use moreutils's sponge :-)
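For example (sponge soaks up all of its input before opening the output file, so reading and rewriting site_config.json in one pipeline is safe):
< site_config.json \
jq --arg sslc "/etc/letsencrypt/live/$2/fullchain.pem" \
   --arg sslck "/etc/letsencrypt/live/$2/fullchain.pem" '
   . + {ssl_certificate: $sslc, ssl_certificate_key: $sslck }' |
sponge site_config.json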
A DRY-er solution
Thanks to a neat convenience feature of jq, one can write more DRY-ly:
< site_config.json \
jq --arg ssl_certificate "/etc/letsencrypt/live/$2/fullchain.pem" \
--arg ssl_certificate_key "/etc/letsencrypt/live/$2/fullchain.pem" '
. + {$ssl_certificate, $ssl_certificate_key }'
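The {$ssl_certificate} form is shorthand for {ssl_certificate: $ssl_certificate}, i.e. the key takes the variable's name. A quick way to see it in isolation:
$ jq -n --arg x hello '{$x}'
{
  "x": "hello"
}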

jq truncates ENV variable after whitespace

While trying to write a bash script that replaces values in a JSON file, we are running into issues with environment variables that contain whitespace.
Given an original JSON file:
{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}
We want to run a command to update some variables in it, so that after we run:
CHANNEL_ID=1701 MY_VALUE="new value" ./test.sh
The JSON should look like this:
{
"version": "base",
"myValue": "new value",
"channelId": 1701
}
Our script is currently at something like this:
#!/bin/sh
echo $MY_VALUE
echo $CHANNEL_ID
function replaceValue {
if [ -z $2 ]; then echo "Skipping $1"; else jq --argjson newValue \"${2}\" '. | ."'${1}'" = $newValue' build/config.json > tmp.json && mv tmp.json build/config.json; fi
}
replaceValue channelId ${CHANNEL_ID}
replaceValue myValue ${MY_VALUE}
In the above, all values are replaced as strings, and strings are getting truncated at whitespace. We keep alternating between this issue and a version of the code where substitutions just stop working entirely.
This is surely an issue with expansions, but we would love to figure out how we can:
- Replace values in the JSON with both strings and numeric values.
- Use whitespace in the strings we pass to our script.
You don't have to mess with --arg or --argjson to import the environment variables into jq's context. It can very well read the environment on its own. You don't need a separate script; just set the values along with the invocation of jq:
CHANNEL_ID=1701 MY_VALUE="new value" \
jq '{"version": "base", myValue: env.MY_VALUE, channelId: env.CHANNEL_ID}' build/config.json
Note that in the case above, the variables need not be exported globally, only locally to the jq command. This way you avoid exporting multiple variables into the shell and polluting the environment; only the ones jq needs to construct the desired JSON are set.
To write the changes back to the original file, do > tmp.json && mv tmp.json build/config.json or, more conveniently, use the sponge(1) utility from the moreutils package. If present, you can pipe the output of jq as
| sponge build/config.json
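One caveat with building the object from env: environment variables are always strings, so channelId would come out as "1701" rather than the number 1701. If the numeric type matters, convert it explicitly; a sketch along the same lines:
CHANNEL_ID=1701 MY_VALUE="new value" \
jq '.myValue = env.MY_VALUE | .channelId = (env.CHANNEL_ID | tonumber)' build/config.json
Unlike rebuilding the whole object, this also leaves any other keys (version here) untouched.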
Pass variables with --arg. Do:
jq --arg key "$1" --arg value "$2" '.[$key] = $value'
Notes:
#!/bin/sh indicates that this is a POSIX shell script, not bash. Use #!/bin/bash in bash scripts.
function replaceValue { is ksh syntax. Prefer replaceValue() { to declare functions; the function keyword is obsolete and deprecated in bash.
Use newlines in your script to make it readable.
--argjson passes a json formatted argument, not a string. Use --arg for that.
\"${2}\" doesn't quote $2 expansion - it only appends and suffixes the string with ". Because the expansion is not qouted, word splitting is performed, which causes your input to be split on whitespaces when creating arguments for jq.
Remember to quote variable expansions.
Use http://shellcheck.net to check your scripts.
. | means nothing in jq - it's like echo $(echo $(echo)). You could do it any number of times - jq '. | . | . | . | . | .' - it passes the same thing through. Just write the thing you want to do.
Do:
#!/bin/bash
echo "$MY_VALUE"
echo "$CHANNEL_ID"
replaceValue() {
if [ -z "$2" ]; then
echo "Skipping $1"
else
jq --arg key "$1" --arg value "$2" '.[$key] = $value' build/config.json > tmp.json &&
mv tmp.json build/config.json
fi
}
replaceValue channelId "${CHANNEL_ID}"
replaceValue myValue "${MY_VALUE}"
# edit: replaced ."\($key)" with the simpler .[$key]
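If some values (such as channelId) should stay numeric rather than become strings, the same pattern works with --argjson; a possible variant (replaceJsonValue is just an illustrative name):
# like replaceValue, but the second argument is parsed as JSON
# (number, boolean, object, ...) instead of being taken as a string
replaceJsonValue() {
    if [ -z "$2" ]; then
        echo "Skipping $1"
    else
        jq --arg key "$1" --argjson value "$2" '.[$key] = $value' build/config.json > tmp.json &&
            mv tmp.json build/config.json
    fi
}
replaceJsonValue channelId "${CHANNEL_ID}"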
jq allows you to build new objects:
MY_VALUE=foo;
CHANNEL_ID=4
echo '{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}' | jq ". | {\"version\": .version, \"myValue\": \"$MY_VALUE\", \"channelId\": $CHANNEL_ID}"
The . selects the whole input, and pipes it (|) into the construction of a new object (marked by {}). For version it selects .version from the input, but you can set your own values for the other two. We use double quotes to allow the Bash variable expansion, which means escaping the double quotes in the JSON.
You'll need to adapt my snippet above to scriptify it.

Constructing a JSON object from a bash associative array

I would like to convert an associative array in bash to a JSON hash/dict. I would prefer to use JQ to do this as it is already a dependency and I can rely on it to produce well formed json. Could someone demonstrate how to achieve this?
#!/bin/bash
declare -A dict=()
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
for i in "${!dict[#]}"
do
echo "key : $i"
echo "value: ${dict[$i]}"
done
echo 'desired output using jq: { "foo": 1, "bar": 2, "baz": 3 }'
There are many possibilities, but given that you already have written a bash for loop, you might like to begin with this variation of your script:
#!/bin/bash
# Requires bash with associative arrays
declare -A dict
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
for i in "${!dict[#]}"
do
echo "$i"
echo "${dict[$i]}"
done |
jq -n -R 'reduce inputs as $i ({}; . + { ($i): (input|(tonumber? // .)) })'
The result reflects the ordering of keys produced by the bash for loop:
{
"bar": 2,
"baz": 3,
"foo": 1
}
In general, the approach based on feeding jq the key-value pairs, with one key on a line followed by the corresponding value on the next line, has much to recommend it. A generic solution following this general scheme, but using NUL as the "line-end" character, is given below.
Keys and Values as JSON Entities
To make the above more generic, it would be better to present the keys and values as JSON entities. In the present case, we could write:
for i in "${!dict[#]}"
do
echo "\"$i\""
echo "${dict[$i]}"
done |
jq -n 'reduce inputs as $i ({}; . + { ($i): input })'
Other Variations
JSON keys must be JSON strings, so it may take some work to ensure that the desired mapping from bash keys to JSON keys is implemented. Similar remarks apply to the mapping from bash array values to JSON values. One way to handle arbitrary bash keys would be to let jq do the conversion:
printf "%s" "$i" | jq -Rs .
You could of course do the same thing with the bash array values, and let jq check whether the value can be converted to a number or to some other JSON type as desired (e.g. using fromjson? // .).
A Generic Solution
Here is a generic solution along the lines mentioned in the jq FAQ and advocated by @CharlesDuffy. It uses NUL as the delimiter when passing the bash keys and values to jq, and has the advantage of only requiring one call to jq. If desired, the filter fromjson? // . can be omitted or replaced by another one.
declare -A dict=( [$'foo\naha']=$'a\nb' [bar]=2 [baz]=$'{"x":0}' )
for key in "${!dict[#]}"; do
printf '%s\0%s\0' "$key" "${dict[$key]}"
done |
jq -Rs '
split("\u0000")
| . as $a
| reduce range(0; length/2) as $i
({}; . + {($a[2*$i]): ($a[2*$i + 1]|fromjson? // .)})'
Output:
{
"foo\naha": "a\nb",
"bar": 2,
"baz": {
"x": 0
}
}
This answer is from nico103 on freenode #jq:
#!/bin/bash
declare -A dict=()
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
assoc2json() {
declare -n v=$1
printf '%s\0' "${!v[@]}" "${v[@]}" |
jq -Rs 'split("\u0000") | . as $v | (length / 2) as $n | reduce range($n) as $idx ({}; .[$v[$idx]]=$v[$idx+$n])'
}
assoc2json dict
You can initialize a variable to an empty object {} and add the key/values {($key):$value} for each iteration, re-injecting the result into the same variable:
#!/bin/bash
declare -A dict=()
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
data='{}'
for i in "${!dict[#]}"
do
data=$(jq -n --arg data "$data" \
--arg key "$i" \
--arg value "${dict[$i]}" \
'$data | fromjson + { ($key) : ($value | tonumber) }')
done
echo "$data"
This has been posted, and credited to nico103 on IRC, which is to say, me.
The thing that scares me, naturally, is that these associative array keys and values need quoting. Here's a start that requires some additional work to dequote keys and values:
function assoc2json {
typeset -n v=$1
printf '%q\n' "${!v[@]}" "${v[@]}" |
jq -Rcn '[inputs] |
. as $v |
(length / 2) as $n |
reduce range($n) as $idx ({}; .[$v[$idx]]=$v[$idx+$n])'
}
$ assoc2json a
{"foo\\ bar":"1","b":"bar\\ baz\\\"\\{\\}\\[\\]","c":"$'a\\nb'","d":"1"}
$
So now all that's needed is a jq function that removes the quotes, which come in several flavors:
if the string starts with a single-quote (ksh) then it ends with a single quote and those need to be removed
if the string starts with a dollar sign and a single-quote and ends in a double-quote, then those need to be removed and internal backslash escapes need to be unescaped
else leave as-is
I leave this last item as an exercise for the reader.
I should note that I'm using printf here as the iterator!
bash 5.2 introduces the @k parameter transformation, which makes this much easier. Like:
$ declare -A dict=([foo]=1 [bar]=2 [baz]=3)
$ jq -n '[$ARGS.positional | _nwise(2) | {(.[0]): .[1]}] | add' --args "${dict[@]@k}"
{
"foo": "1",
"bar": "2",
"baz": "3"
}
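If the numeric values should stay numbers, the same idea works with a tonumber fallback (a sketch; like the snippet above it relies on jq's internal _nwise helper):
$ jq -n '[$ARGS.positional | _nwise(2) | {(.[0]): (.[1] | tonumber? // .)}] | add' --args "${dict[@]@k}"
{
  "foo": 1,
  "bar": 2,
  "baz": 3
}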

parsing json to check whether a field is blank in bash

So, let's say I am trying to write a shell script which does the following:
1) Ping http://localhost:8088/query?key=q1
2) It returns a JSON response:
{
"status": "success",
"result": []
"query": "q1"
}
or
{
"status": "success",
"result": ["foo"],
"artist_key": "q1"
}
The "result" is either an empty array or filled array..
I am trying to write a shell script which is checking whether "result" is empty list or not?
Something like this would work:
# Assume result is from curl, but could be in a file or whatever
if curl "http://localhost:8088/query?key=q1" | grep -Pq '"result":\s+\[".+"\]'; then
echo "has result"
else
echo "does not have result"
fi
However, I'm assuming these are on separate lines. If not, there are linters to format it.
Edited (based on the jq comment), here's a jq solution as suggested by Adrian Frühwirth:
result=$( curl "http://localhost:8088/query?key=q1" | jq '.result' )
if [ "$result" == "[]" ]; then
echo "does not have result"
else
echo "has result"
fi
I have learned something new today. And as I play around with this more, maybe it's better to do this:
result=$( curl "http://localhost:8088/query?key=q1" | jq '.result | has(0)' )
if [ "$result" == "true" ]; then
echo "has result"
else
echo "does not have result"
fi
See the manual. I wasn't able to get the -e or --exit-status arguments to work.
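For what it's worth, in current jq versions -e does work here: it sets the exit status to 1 when the last output is false or null, so the filter just has to produce the boolean itself. A sketch:
if curl -s "http://localhost:8088/query?key=q1" | jq -e '.result | length > 0' > /dev/null; then
    echo "has result"
else
    echo "does not have result"
fi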
I'd use a language that can convert JSON to a native data structure:
wget -O- "http://localhost:8088/query?key=q1" |
perl -MJSON -0777 -ne '
$data = decode_json $_;
exit (@{$data->{result}} == 0)
'
That exits with a success status if the result attribute is NOT empty. Encapsulating into a shell function:
check_result() {
wget -O- "http://localhost:8088/query?key=q1" |
perl -MJSON -0777 -ne '$data = decode_json $_; exit (@{$data->{result}} == 0)'
}
if check_result; then
echo result is NOT empty
else
echo result is EMPTY
fi
I like ruby for parsing JSON:
ruby -rjson -e 'data = JSON.parse(STDIN.read); exit (data["result"].length > 0)'
It's interesting that the exit status requires the opposite comparison operator. I guess Ruby converts exit(true) to exit(0), unlike Perl, which has no true boolean objects, only integers.