Iterate over jq loop: assign key if value.subvalue matches condition - json

I'm trying to perform some work on keys whose value matches a certain condition. First, the JSON I'm working with looks similar to this when I execute a CLI tool on Linux:
$ ./sacli ClusterQuery
"i-aaaaaaabbbbbbbbbb": {
"sacli_ip": "10.0.52.37",
"sacli_listen_ip": "10.0.52.37",
"sacli_port": "945"
},
"i-ccccccccddddddddd": {
"sacli_ip": "10.0.48.68",
"sacli_listen_ip": "10.0.48.68",
"sacli_port": "945"
}
I would ideally like to loop through these entries, do a check and then perform operations based on the check. So conceptually:
for i in $(./sacli ClusterQuery | jq -r '<some syntax to get each key, value>'); do
  node_id=key
  ip=value.sacli_ip
  ping -c 1 -W 1 $ip
  ping_result=$?
  if [[ $ping_result -eq 1 ]]; then  # ping has failed
    ./remove $node_id
  fi
done

I'd avoid your for loop in favor of piping to while, like this:
./sacli ClusterQuery | jq -rc '
  keys_unsorted[] as $k | $k, .[$k].sacli_ip' |
while read -r node_id ; do
  read -r ip
  ping -c 1 -W 1 "$ip"
  ping_result=$?
  if [[ $ping_result -eq 1 ]]; then  # ping has failed
    echo ./remove "$node_id"
  fi
done
This assumes the keys and sacli_ip values don't have embedded newlines.
You can of course easily add the appropriate select filter(s) inside the jq program.
The -c is there for safety: it guarantees that each emitted value is printed on a single line.
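For instance, a minimal sketch that restricts the loop to nodes on a particular port (the port value and the placement of the select are assumptions for illustration):

./sacli ClusterQuery | jq -rc '
  keys_unsorted[] as $k
  | select(.[$k].sacli_port == "945")   # only keys whose value matches the condition
  | $k, .[$k].sacli_ip'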


Filter json results from a file

I'm trying to write a script that reads all lines of a .txt file. Each line is valid JSON with the fields saddr and data. I want to get the byte size of the data field (with wc -c), but can't get it to work.
while IFS='' read -r line || [[ -n "$line" ]]; do
  echo $line |
  jq -r '.data' |
  bytes=$(wc -c)
  if (( $bytes > 200 )); then
    echo $bytes
  fi
done < "testear.txt"
Example testear.txt:
{ "saddr": "157.130.222.66", "data": "9f00032a30000000" }
You're trying to set bytes in a pipeline, with the command:
echo $line | jq -r '.data' | bytes=$(wc -c)
However, the commands in a pipeline run in subshells, so variables assigned there are not visible in the parent shell.
Instead, try this:
bytes=$(echo "$line" | jq -r '.data' | wc -c)
so that you're setting bytes in the top-level shell.
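Putting that together, a minimal sketch of the corrected loop (same testear.txt as above):

while IFS='' read -r line || [[ -n "$line" ]]; do
  # assign in the top-level shell, not inside the pipeline
  bytes=$(echo "$line" | jq -r '.data' | wc -c)
  if (( bytes > 200 )); then
    echo "$bytes"
  fi
done < "testear.txt"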

How to Flatten JSON using jq and Bash into Bash Associative Array where Key=Selector?

As a follow-up to Flatten Arbitrary JSON, I'm looking to take the flattened results and make them suitable for doing queries and updates back to the original JSON file.
Motivation: I'm writing Bash (4.2+) scripts (on CentOS 7) that read JSON into a Bash associative array using the JSON selector/filter as the key. I do processing on the associative arrays, and in the end I want to update the JSON with those changes.
The preceding solution gets me close to this goal. I think there are two things that it doesn't do:
It doesn't quote keys that require quoting. For example, the key com.acme would need to be quoted because it contains the delimiter character ('.').
Array indexes are not represented in a form that can be used to query the original JSON.
Existing Solution
The solution from the above is:
$ jq --stream -n --arg delim '.' 'reduce (inputs|select(length==2)) as $i ({};
    [$i[0][]|tostring] as $path_as_strings
    | ($path_as_strings|join($delim)) as $key
    | $i[1] as $value
    | .[$key] = $value
  )' input.json
For example, if input.json contains:
{
  "a.b":
  [
    "value"
  ]
}
then the output is:
{
  "a.b.0": "value"
}
What is Really Wanted
An improvement would have been:
{
  "\"a.b\"[0]": "value"
}
But what I really want is output formatted so that it could be sourced directly in a Bash program (implying the array name is passed to jq as an argument):
ArrayName['"a.b"[0]']='value' # Note 'value' might need escapes for Bash
I'm looking to have the more human-readable syntax above as opposed to the more general:
ArrayName['.["a.b"][0]']='value'
I don't know if jq can handle all of this. My present solution is to take the output from the preceding solution and to post-process it to the form that I want. Here's the work in process:
#!/bin/bash
Flatten()
{
  local -r OPTIONS=$(getopt -o d:m:f: -l "delimiter:,mapname:,file:" -n "${FUNCNAME[0]}" -- "$@")
  eval set -- "$OPTIONS"
  local Delimiter='.' MapName=map File=
  while true ; do
    case "$1" in
      -d|--delimiter) Delimiter="$2"; shift 2;;
      -m|--mapname)   MapName="$2";   shift 2;;
      -f|--file)      File="$2";      shift 2;;
      --) shift; break;;
    esac
  done
  local -a Array=()
  readarray -t Array <<<"$(
    jq -c -S --stream -n --arg delim "$Delimiter" 'reduce (inputs|select(length==2)) as $i ({}; .[[$i[0][]|tostring]|join($delim)] = $i[1])' <<<"$(sed 's|^\s*[#%].*||' "$File")" |
    jq -c "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" |
    sed -e 's|^"||' -e 's|"$||' -e 's|=|\t|')"
  if [[ ! -v $MapName ]]; then
    local -gA $MapName
  fi
  . <(
    IFS=$'\t'
    while read -r Key Value; do
      printf "$MapName[\"%s\"]=%q\n" "$Key" "$Value"
    done <<<"$(printf "%s\n" "${Array[@]}")"
  )
}
declare -A Map
Flatten -m Map -f "$1"
declare -p Map
With the output:
$ ./Flatten.sh <(echo '{"a.b":["value"]}')
declare -A Map='([a.b.0]="value" )'
1) jq is Turing complete, so it's all just a question of which hammer to use.
2)
An improvement would have been:
{
  "\"a.b\"[0]": "value"
}
That is easily accomplished using a helper function along these lines:
def flattenPath(delim):
  reduce .[] as $s ("";
    if $s|type == "number"
    then ((if . == "" then "." else . end) + "[\($s)]")
    else . + ($s | tostring | if index(delim) then "\"\(.)\"" else . end)
    end );
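For example, with that definition prepended, feeding it the path array from the earlier input yields the improved key (a quick check, not from the original answer):

$ jq -n '["a.b",0] | flattenPath(".")'   # with flattenPath defined as above
"\"a.b\"[0]"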
3)
I do processing on the associative arrays, and in the end I want to update the JSON with those changes.
This suggests you might have posed an XY problem. However, if you really do want to serialize and unserialize some JSON text, then the natural way to do so with jq is to use leaf_paths, as illustrated by the following serialization/deserialization functions:
# Emit (path, value) pairs
# Usage: jq -c -f serialize.jq input.json > serialized.json
def serialize: leaf_paths as $p | ($p, getpath($p));

# Usage: jq -n -f unserialize.jq serialized.json
def unserialize:
  def pairwise(s):
    foreach s as $i ([];
      if length == 1 then . + [$i] else [$i] end;
      select(length == 2));
  reduce pairwise(inputs) as $p (null; setpath($p[0]; $p[1]));
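For instance, inlining the body of serialize against the earlier {"a.b":["value"]} input emits alternating path/value lines (my example, mirroring serialize.jq):

$ jq -c 'leaf_paths as $p | ($p, getpath($p))' <<<'{"a.b":["value"]}'
["a.b",0]
"value"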
If using bash, you could use readarray (mapfile) to read the paths and values into a single array, or if you want to distinguish between the paths and values more easily, you could (for example) use the approach illustrated by the following:
i=0
while read -r line ; do
  path[$i]="$line"; read -r line; value[$i]="$line"
  i=$((i + 1))
done < serialized.json
But there are many other alternatives.

extract 2 values from JSON object and use as variables in loop using jq and bash

I am new to jq. I am trying to write a simple script that loops through a JSON file, gets two values within each object and assigns them to two separate variables I can use with a curl REST call. I see both values as output when I echo $i but how can I get value and addr as separate variables?
for i in `cat /Users/egraham/Downloads/test2 | jq .[] | jq ."value,.addr"`; do
You can do this:
jq -rc '.populator.value + " " + .populator.addr' file.json |
while read -r value addr; do
  echo do something with "$value" and "$addr"
done
If spaces or tabs or other special characters make using 'read -r' problematic, and if your shell has "readarray", then it could be used:
$ readarray -t v < <(jq -rc '.populator | (.value,.addr)' file.json)
The values would then be available as ${v[0]} and ${v[1]}
This approach is especially useful if there are more than two values of interest, or if the number of values is variable or not known beforehand.
If your shell does not have readarray, then you can still use the array-oriented approach, e.g. along the lines of:
i=-1; while read -r a ; do i=$((i+1)); v[$i]="$a" ; done
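For completeness, here is that loop wired to the same jq invocation via process substitution (a sketch assuming the file.json layout above):

i=-1
while read -r a ; do
  i=$((i+1)); v[$i]="$a"
done < <(jq -rc '.populator | (.value,.addr)' file.json)
echo "value=${v[0]} addr=${v[1]}"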
First:
for i in `cat /Users/egraham/Downloads/test2 | jq .[] | jq .value`; do echo $i; done
Second:
for i in `cat /Users/egraham/Downloads/test2 | jq .[] | jq .addr`; do echo $i; done
I don't know any way to get it without running the commands separately. I don't know AWK, but maybe it's something worth considering.

How can I tell if a jq filter successfully pulls data from a JSON data structure?

I want to know if a given filter succeeds in pulling data from a JSON data structure. For example:
###### For the user steve...
% Name=steve
% jq -j --arg Name "$Name" '.[]|select(.user == $Name)|.value' <<<'
[
  {"user":"steve", "value":false},
  {"user":"tom", "value":true},
  {"user":"pat", "value":null},
  {"user":"jane", "value":""}
]'
false
% echo $?
0
Note: successful results can include boolean values, null, and even the empty string.
###### Now for user not in the JSON data...
% Name=mary
% jq -j --arg Name "$Name" '.[]|select(.user == $Name)|.value' <<<'
[
  {"user":"steve", "value":false},
  {"user":"tom", "value":true},
  {"user":"pat", "value":null},
  {"user":"jane", "value":""}
]'
% echo $?
0
If the filter does not pull data from the JSON data structure, I need to know this. I would prefer the filter to return a non-zero return code.
How would I go about determining if a selector successfully pulls data from a JSON data structure vs. fails to pull data?
Important: The above filter is just an example, the solution needs to work for any jq filter.
Note: the evaluation environment is Bash 4.2+.
You can use the -e / --exit-status flag from the jq Manual, which says
Sets the exit status of jq to 0 if the last output value was neither false nor null, 1 if the last output value was either false or null, or 4 if no valid result was ever produced. Normally jq exits with 2 if there was any usage problem or system error, 3 if there was a jq program compile error, or 0 if the jq program ran.
I can demonstrate the usage with a basic filter as below, as your given example is not working for me.
For a successful query,
dudeOnMac:~$ jq -e '.foo?' <<< '{"foo": 42, "bar": "less interesting data"}'
42
dudeOnMac:~$ echo $?
0
For an invalid query, done with a non-existent entity zoo,
dudeOnMac:~$ jq -e '.zoo?' <<< '{"foo": 42, "bar": "less interesting data"}'
null
dudeOnMac:~$ echo $?
1
For an error scenario, returning code 2 which I created by double-quoting the jq input stream.
dudeOnMac:~$ jq -e '.zoo?' <<< "{"foo": 42, "bar": "less interesting data"}"
jq: error: Could not open file interesting: No such file or directory
jq: error: Could not open file data}: No such file or directory
dudeOnMac:~$ echo $?
2
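For the scenario in the question, the status of 4 is the interesting one: when the select matches nothing, no output is produced at all. A quick check (assuming the same input shape as above):

$ jq -e '.[]|select(.user == "mary")|.value' <<<'[{"user":"steve", "value":false}]'
$ echo $?
4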
I've found a solution that meets all of my requirements! Please let me know what you think!
The idea is to use jq -e "$Filter" as a first-pass check. Then, for a return code of 1, do a jq "path($Filter)" check. The latter will only succeed if, in fact, there is a path into the JSON data.
Select.sh
#!/bin/bash
Select()
{
  local Name="$1"
  local Filter="$2"
  local Input="$3"
  local Result Status
  Result="$(jq -e --arg Name "$Name" "$Filter" <<<"$Input")"
  Status=$?
  case $Status in
    1) jq --arg Name "$Name" "path($Filter)" <<<"$Input" >/dev/null 2>&1
       Status=$?
       ;;
    *) ;;
  esac
  [[ $Status -eq 0 ]] || Result="***ERROR***"
  echo "$Status $Result"
}
Filter='.[]|select(.user == $Name)|.value'
Input='[
  {"user":"steve", "value":false},
  {"user":"tom", "value":true},
  {"user":"pat", "value":null},
  {"user":"jane", "value":""}
]'
Select steve "$Filter" "$Input"
Select tom "$Filter" "$Input"
Select pat "$Filter" "$Input"
Select jane "$Filter" "$Input"
Select mary "$Filter" "$Input"
And the execution of the above:
% ./Select.sh
0 false
0 true
0 null
0 ""
4 ***ERROR***
I've added an updated solution below
The fundamental problem here is that when you try to retrieve a value from an object using the .key or .[key] syntax, jq, by definition, can't distinguish a missing key from a key whose value is null.
You can instead define your own lookup function:
def lookup(k): if has(k) then .[k] else error("invalid key") end;
Then use it like so:
$ jq 'lookup("a")' <<<'{}' ; echo $?
jq: error (at <stdin>:1): invalid key
5
$ jq 'lookup("a")' <<<'{"a":null}' ; echo $?
null
0
If you then use lookup consistently instead of the builtin method, I think that will give you the behaviour you want.
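For nested access you would chain lookup instead of chaining .a.b; a minimal sketch:

$ jq 'def lookup(k): if has(k) then .[k] else error("invalid key") end;
      lookup("a") | lookup("b")' <<<'{"a":{"b":1}}'
1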
Here's another way to go about it, with less bash and more jq.
#!/bin/bash
lib='def value(f): ((f|tojson)//error("no such value"))|fromjson;'
users=( steve tom pat jane mary )

Select () {
  local name=$1 filter=$2 input=$3
  local -i status=0
  result=$( jq --arg name "$name" "${lib}value(${filter})" <<<"$input" 2>/dev/null )
  status=$?
  (( status )) && result="***ERROR***"
  printf '%s\t%d %s\n' "$name" $status "$result"
}

filter='.[]|select(.user == $name)|.value'
input='[{"user":"steve","value":false},
        {"user":"tom","value":true},
        {"user":"pat","value":null},
        {"user":"jane","value":""}]'

for name in "${users[@]}"
do
  Select "$name" "$filter" "$input"
done
This produces the output:
steve 0 false
tom 0 true
pat 0 null
jane 0 ""
mary 5 ***ERROR***
This takes advantage of the fact that a filter which produces no output acts like empty, and empty will trigger the alternative of //, but a string, like "null" or "false", will not.
It should be noted that value/1 will not work for filters that are simple key/index lookups on objects/arrays, but neither will your solution. I'm reasonably sure that to cover all the cases, you'd need something like this (or yours) and something like get or lookup.
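To see that limitation concretely: a plain .a lookup on an object lacking the key yields null, which tojson serializes to the non-empty string "null", so the // alternative never fires (my illustration):

$ jq -n 'def value(f): ((f|tojson)//error("no such value"))|fromjson; {} | value(.a)'
null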
Given that jq is the way it is, and in particular that it is stream-oriented, I'm inclined to think that a better approach would be to define and use one or more filters that make the distinctions you want. Thus rather than writing .a to access the value of a field, you'd write get("a") assuming that get/1 is defined as follows:
def get(f): if has(f) then .[f] else error("\(type) is not defined at \(f)") end;
Now you can easily tell whether or not an object has a key, and you're all set to go. This definition of get can also be used with arrays.
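For example, with arrays has tests whether the index exists, so get distinguishes an out-of-range index from a stored null (my illustration):

$ jq 'def get(f): if has(f) then .[f] else error("\(type) is not defined at \(f)") end; get(1)' <<<'[10,20]'
20

whereas get(5) on the same input raises the error: array is not defined at 5.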

verify that a json field exists with jq and bash?

I have a script that uses jq for parsing a JSON string MESSAGE (that is read from another application). Meanwhile the JSON has changed: the field file_path has been split into two fields, folder and file. The script was reading file_path; now the folder may not be present, so to build the file's path I have to verify whether the field is there. I have searched for a while on the internet, and managed to come up with:
echo $(echo $MESSAGE | jq .folder -r)$'/'$(echo $MESSAGE | jq .file -r)

if [ $MESSAGE | jq 'has(".folder")' -r ]
then
  echo $(echo $MESSAGE | jq .folder -r)$'/'$(echo $MESSAGE | jq .file -r)
else
  echo $(echo $MESSAGE | jq .file -r)
fi
where MESSAGE='{"folder":"FLDR","file":"fl"}' or MESSAGE='{"file":"fl"}'
The first line is printing FLDR/fl or null/fl if the folder field is not present. So I have thought to create an if that is verifying if the folder field is present or not, but it seems that I am doing it wrong and cannot figure out what is wrong. The output is
bash: [: missing `]'
jq: ]: No such file or directory
null/fl
I'd do the whole thing in a jq filter:
echo "$MESSAGE" | jq -r '[ .folder, .file ] | join("/")'
In the event that you want to do it with bash (or to learn how to do this sort of thing in bash), two points:
Shell variables should almost always be quoted when they are used (i.e., "$MESSAGE" instead of $MESSAGE). You will run into funny problems if one of the strings in your JSON ever contains a shell metacharacter (such as *) and you forgot to do that; the string will be subject to shell expansion (and that * will be expanded into a list of files in the current working directory).
A shell if accepts as condition a command, and the decision where to branch is made depending on the exit status of that command (true if the exit status is 0, false otherwise). The [ you attempted to use is just a command (an alias for test, see man test) and not special in any way.
So, the goal is to construct a command that exits with 0 if the JSON object has a folder property, non-zero otherwise. jq has a -e option that makes it return 0 if the last output value was not false or null and non-zero otherwise, so we can write
if echo "$MESSAGE" | jq -e 'has("folder")' > /dev/null; then
echo "$MESSAGE" | jq -r '.folder + "/" + .file'
else
echo "$MESSAGE" | jq -r .file
fi
The > /dev/null bit redirects the output from jq to /dev/null (where it is ignored) so that we don't see it on the console.
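If you'd rather keep it to a single jq invocation, the branch can live inside the filter itself rather than in the shell; a minimal sketch:

echo "$MESSAGE" | jq -r 'if has("folder") then "\(.folder)/\(.file)" else .file end'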