Getting empty string after parsing bash associative array to JSON using jq

I have a bash associative array containing dynamic data like:
declare -A assoc_array=([cluster_name]="cpod1" [site_name]="ppod1" [alarm_name]="alarm1")
I have to create the JSON data accordingly:
{
"name": "cluster_name",
"value": $assoc_array[cluster_name],
"regex": False
}
{
"name": "site_name",
"value": $assoc_array[site_name],
"regex": False
}
{
"name": "alert_name",
"value": $assoc_array[alert_name],
"regex": False
}
I have used the following code for it:
for i in "${!assoc_array[#]}"; do
echo $i
alarmjsonarray=$(echo ${alarmjsonarray}| jq --arg name "$i" \
--arg value "${alarm_param[$i]}" \
--arg isRegex "False" \
'. + [{("name"):$name,("value"):$value,("isRegex"):$isRegex}]')
done
echo "alarmjsonarray" $alarmjsonarray
I am getting an empty string from it. Can you please help me with it?

Assuming the ASCII record separator character \x1e can't occur in your data (and given your assertions that newlines will never be present), one way to handle this would be:
for key in "${!assoc_array[#]}"; do
printf '%s\x1e%s\n' "$key" "${assoc_array[$key]}"
done | jq -Rn '
[
inputs |
split("\u001e") | .[0] as $key | .[1] as $value |
{"name": $key, "value": $value, "regex": false}
]'
...feel free to change \x1e to a different character (like a tab) that can never occur in your data, as appropriate.
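For example, here is the same pipeline with a tab as the separator (a sketch; as above, it assumes tabs and newlines can never occur in your keys or values):
for key in "${!assoc_array[@]}"; do
printf '%s\t%s\n' "$key" "${assoc_array[$key]}"
done | jq -Rn '[inputs | split("\t") | {name: .[0], value: .[1], regex: false}]'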

#!/bin/bash
declare -A assoc_array=([cluster_name]="cpod1" [site_name]="ppod1" [alarm_name]="alarm1")
for i in "${!assoc_array[#]}"; do
alarmjsonarray=$(echo "${alarmjsonarray:-[]}" |
jq --arg name "$i" \
--arg value "${assoc_array[$i]}" \
--arg isRegex "False" \
'. + [{("name"):$name,("value"):$value,("isRegex"):$isRegex}]')
done
echo "alarmjsonarray" "$alarmjsonarray"
The array was supposed to be initialized before using it; I had thought that in bash the variable would still be in scope outside the block. This is a simpler way to generate JSON from a bash associative array.
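To see why the original loop printed nothing: jq emits no output at all when its input contains no JSON values, so piping the empty $alarmjsonarray into it on the first iteration produces an empty result, and every later iteration inherits that empty string. A quick demonstration:
echo ""   | jq '. + [{"name":"x"}]'    # no output: there is no input value to transform
echo "[]" | jq '. + [{"name":"x"}]'    # [{"name":"x"}]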

Related

Send bash array to json file

I pulled some config variables from a JSON file with jq.
Once modified, I want to write the whole config array (which can contain keys that were not there at first) back to the JSON file.
The "foreach" part seems quite obvious.
But how do I express "change keyA to valueA, or add keyA=>valueA if missing" in the conf file?
I'm stuck with something like
for key in "${!conf[#]}"; do
value=${conf[$key]}
echo $key $value
jq --arg key $key --arg value $value '.$key = $value' $conf_file > $new_file
done
Thanks
Extended solution with single jq invocation:
Sample conf array:
declare -A conf=([status]="finished" [nextTo]="2018-01-24" [result]=true)
Sample file conf.json:
{
"status": "running",
"minFrom": "2018-01-23",
"maxTo": "2018-01-24",
"nextTo": "2018-01-23",
"nextFrom": "2018-01-22"
}
Processing:
jq --arg data "$(paste -d':' <(printf "%s\n" "${!conf[#]}") <(printf "%s\n" "${conf[#]}"))" \
'. as $conf | map($data | split("\n")[]
| split(":") | {(.[0]) : .[1]})
| add | $conf + .' conf.json > conf.tmp && mv conf.tmp conf.json
The resulting conf.json contents:
{
"status": "finished",
"minFrom": "2018-01-23",
"maxTo": "2018-01-24",
"nextTo": "2018-01-24",
"nextFrom": "2018-01-22",
"result": "true"
}
I finally ended up with the following solution: not as compact as RomanPerekhrest's, but a bit more human-readable. Thanks.
function update_conf() {
echo update_conf
if [ -n "$1" ] && [ -n "$2" ]; then
conf[$1]=$2
fi
for key in "${!conf[#]}"; do
value=${conf[$key]}
jq --arg key "$key" --arg value "$value" '.[$key] = $value' $confFile > $confTempFile && mv $confTempFile $confF$
done
}
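A hypothetical usage sketch (the confFile and confTempFile names are assumptions; point them at wherever your config lives):
confFile=conf.json
confTempFile=conf.tmp
update_conf nextTo 2018-01-25    # upsert one key, then rewrite the file
update_conf                      # just rewrite the file from the current conf array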

jq not replacing json value with parameter

test.sh is not replacing the test.json parameter values ($input1 and $input2); result.json ends up with the same literal ParameterValue "$input1/solution/$input2.result".
[
{
"ParameterKey": "Project",
"ParameterValue": [ "$input1/solution/$input2.result" ]
}
]
test.sh
#!/bin/bash
input1="test1"
input2="test2"
echo $input1
echo $input2
cat test.json | jq 'map(if .ParameterKey == "Project" then . + {"ParameterValue" : "$input1/solution/$input2.result" } else . end )' > result.json
Shell variables in jq scripts should be interpolated or passed as arguments via --arg name value:
jq --arg inp1 "$input1" --arg inp2 "$input2" \
'map(if .ParameterKey == "Project"
then . + {"ParameterValue" : ($inp1 + "/solution/" + $inp2 + ".result") }
else . end)' test.json
The output:
[
{
"ParameterKey": "Project",
"ParameterValue": "test1/solution/test2.result"
}
]
In your jq program, you have quoted "$input1/solution/$input2.result", and therefore it is a JSON string literal, whereas you evidently want string interpolation; you also need to distinguish between the shell variables ($input1 and $input2) on the one hand, and the corresponding jq dollar-variables (which may or may not have the same name) on the other.
Since your shell variables are strings, you could pass them in using the --arg command-line option (e.g. --arg input1 "$input1" if you chose to name the variables in the same way).
You can read up on string interpolation in the jq manual (see https://stedolan.github.io/jq/manual, but note the links at the top for different versions of jq).
There are other ways to achieve the desired results too, but using string interpolation with same-named variables, you'd write:
"\($input1)/solution/\($input2).result"
Notice that the above string is NOT itself literally a JSON string. Only after string interpolation does it become so.
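Putting the two points together, a sketch of the full command with same-named variables and string interpolation:
jq --arg input1 "$input1" --arg input2 "$input2" \
'map(if .ParameterKey == "Project"
then . + {"ParameterValue": "\($input1)/solution/\($input2).result"}
else . end)' test.json > result.json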

Constructing a JSON object from a bash associative array

I would like to convert an associative array in bash to a JSON hash/dict. I would prefer to use JQ to do this as it is already a dependency and I can rely on it to produce well formed json. Could someone demonstrate how to achieve this?
#!/bin/bash
declare -A dict=()
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
for i in "${!dict[#]}"
do
echo "key : $i"
echo "value: ${dict[$i]}"
done
echo 'desired output using jq: { "foo": 1, "bar": 2, "baz": 3 }'
There are many possibilities, but given that you already have written a bash for loop, you might like to begin with this variation of your script:
#!/bin/bash
# Requires bash with associative arrays
declare -A dict
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
for i in "${!dict[#]}"
do
echo "$i"
echo "${dict[$i]}"
done |
jq -n -R 'reduce inputs as $i ({}; . + { ($i): (input|(tonumber? // .)) })'
The result reflects the ordering of keys produced by the bash for loop:
{
"bar": 2,
"baz": 3,
"foo": 1
}
In general, the approach based on feeding jq the key-value pairs, with one key on a line followed by the corresponding value on the next line, has much to recommend it. A generic solution following this general scheme, but using NUL as the "line-end" character, is given below.
Keys and Values as JSON Entities
To make the above more generic, it would be better to present the keys and values as JSON entities. In the present case, we could write:
for i in "${!dict[#]}"
do
echo "\"$i\""
echo "${dict[$i]}"
done |
jq -n 'reduce inputs as $i ({}; . + { ($i): input })'
Other Variations
JSON keys must be JSON strings, so it may take some work to ensure that the desired mapping from bash keys to JSON keys is implemented. Similar remarks apply to the mapping from bash array values to JSON values. One way to handle arbitrary bash keys would be to let jq do the conversion:
printf "%s" "$i" | jq -Rs .
You could of course do the same thing with the bash array values, and let jq check whether the value can be converted to a number or to some other JSON type as desired (e.g. using fromjson? // .).
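As a small illustration of that idea, the following converts a raw bash value to whatever JSON type it parses as, falling back to a string (a sketch using the same -Rs trick as above):
printf '%s' "${dict[$i]}" | jq -Rs 'fromjson? // .'
# "2"       -> 2
# '{"x":0}' -> {"x":0}
# "abc"     -> "abc"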
A Generic Solution
Here is a generic solution along the lines mentioned in the jq FAQ and advocated by @CharlesDuffy. It uses NUL as the delimiter when passing the bash keys and values to jq, and has the advantage of only requiring one call to jq. If desired, the filter fromjson? // . can be omitted or replaced by another one.
declare -A dict=( [$'foo\naha']=$'a\nb' [bar]=2 [baz]=$'{"x":0}' )
for key in "${!dict[#]}"; do
printf '%s\0%s\0' "$key" "${dict[$key]}"
done |
jq -Rs '
split("\u0000")
| . as $a
| reduce range(0; (length-1)/2) as $i   # the trailing NUL leaves one empty element at the end of the split
({}; . + {($a[2*$i]): ($a[2*$i + 1]|fromjson? // .)})'
Output:
{
"foo\naha": "a\nb",
"bar": 2,
"baz": {
"x": 0
}
}
This answer is from nico103 on freenode #jq:
#!/bin/bash
declare -A dict=()
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
assoc2json() {
declare -n v=$1
printf '%s\0' "${!v[@]}" "${v[@]}" |
jq -Rs 'split("\u0000") | . as $v | ((length - 1) / 2) as $n | reduce range($n) as $idx ({}; .[$v[$idx]] = $v[$idx + $n])'
}
assoc2json dict
You can initialize a variable to an empty object {} and add the key/value pairs {($key):$value} on each iteration, re-injecting the result into the same variable:
#!/bin/bash
declare -A dict=()
dict["foo"]=1
dict["bar"]=2
dict["baz"]=3
data='{}'
for i in "${!dict[#]}"
do
data=$(jq -n --arg data "$data" \
--arg key "$i" \
--arg value "${dict[$i]}" \
'$data | fromjson + { ($key) : ($value | tonumber) }')
done
echo "$data"
This has been posted, and credited to nico103 on IRC, which is to say, me.
The thing that scares me, naturally, is that these associative array keys and values need quoting. Here's a start that requires some additional work to dequote keys and values:
function assoc2json {
typeset -n v=$1
printf '%q\n' "${!v[@]}" "${v[@]}" |
jq -Rcn '[inputs] |
. as $v |
(length / 2) as $n |
reduce range($n) as $idx ({}; .[$v[$idx]]=$v[$idx+$n])'
}
$ assoc2json a
{"foo\\ bar":"1","b":"bar\\ baz\\\"\\{\\}\\[\\]","c":"$'a\\nb'","d":"1"}
$
So now all that's needed is a jq function that removes the quotes, which come in several flavors:
if the string starts with a single-quote (ksh) then it ends with a single quote and those need to be removed
if the string starts with a dollar sign and a single-quote and ends in a single-quote, then those need to be removed and internal backslash escapes need to be unescaped
else leave as-is
I leave this last item as an exercise for the reader.
I should note that I'm using printf here as the iterator!
bash 5.2 introduces the @k parameter transformation, which makes this much easier:
$ declare -A dict=([foo]=1 [bar]=2 [baz]=3)
$ jq -n '[$ARGS.positional | _nwise(2) | {(.[0]): .[1]}] | add' --args "${dict[@]@k}"
{
"foo": "1",
"bar": "2",
"baz": "3"
}
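If you want JSON numbers back where possible, a small tweak borrowing the tonumber? // . idiom from the answers above should do (a sketch):
jq -n '[$ARGS.positional | _nwise(2) | {(.[0]): (.[1] | tonumber? // .)}] | add' --args "${dict[@]@k}"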

Create JSON using jq from pipe-separated keys and values in bash

I am trying to create a json object from a string in bash. The string is as follows.
CONTAINER|CPU%|MEMUSAGE/LIMIT|MEM%|NETI/O|BLOCKI/O|PIDS
nginx_container|0.02%|25.09MiB/15.26GiB|0.16%|0B/0B|22.09MB/4.096kB|0
The output is from the docker stats command, and my end goal is to publish custom metrics to AWS CloudWatch. I would like to format this string as JSON:
{
"CONTAINER":"nginx_container",
"CPU%":"0.02%",
....
}
I have used the jq command before, and it seems like it should work well in this case, but I have not been able to come up with a good solution yet, other than hardcoding variable names, indexing with sed or awk, and then creating the JSON from scratch. Any suggestions would be appreciated. Thanks.
Prerequisite
For all of the below, it's assumed that your content is in a shell variable named s:
s='CONTAINER|CPU%|MEMUSAGE/LIMIT|MEM%|NETI/O|BLOCKI/O|PIDS
nginx_container|0.02%|25.09MiB/15.26GiB|0.16%|0B/0B|22.09MB/4.096kB|0'
What (modern jq)
# thanks to @JeffMercado and @chepner for refinements, see comments
jq -Rn '
( input | split("|") ) as $keys |
( inputs | split("|") ) as $vals |
[[$keys, $vals] | transpose[] | {key:.[0],value:.[1]}] | from_entries
' <<<"$s"
How (modern jq)
This requires very new (probably 1.5?) jq to work, and is a dense chunk of code. To break it down:
Using -n prevents jq from reading stdin on its own, leaving the entirety of the input stream available to be read by input and inputs -- the former to read a single line, and the latter to read all remaining lines. (-R, for raw input, causes textual lines rather than JSON objects to be read).
With [$keys, $vals] | transpose[], we're generating [key, value] pairs (in Python terms, zipping the two lists).
With {key:.[0],value:.[1]}, we're making each [key, value] pair into an object of the form {"key": key, "value": value}
With from_entries, we're combining those pairs into objects containing those keys and values.
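As a tiny standalone illustration of that last step:
jq -n '[{key: "CONTAINER", value: "nginx_container"}] | from_entries'
# {"CONTAINER":"nginx_container"}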
What (shell-assisted)
This will work with a significantly older jq than the above, and is an easily adopted approach for scenarios where a native-jq solution can be harder to wrangle:
{
IFS='|' read -r -a keys # read first line into an array of strings
## read each subsequent line into an array named "values"
while IFS='|' read -r -a values; do
# setup: positional arguments to pass in literal variables, query with code
jq_args=( )
jq_query='.'
# copy values into the arguments, reference them from the generated code
for idx in "${!values[#]}"; do
[[ ${keys[$idx]} ]] || continue # skip values with no corresponding key
jq_args+=( --arg "key$idx" "${keys[$idx]}" )
jq_args+=( --arg "value$idx" "${values[$idx]}" )
jq_query+=" | .[\$key${idx}]=\$value${idx}"
done
# run the generated command
jq "${jq_args[#]}" "$jq_query" <<<'{}'
done
} <<<"$s"
How (shell-assisted)
The invoked jq command from the above is similar to:
jq --arg key0 'CONTAINER' \
--arg value0 'nginx_container' \
--arg key1 'CPU%' \
--arg value1 '0.02%' \
--arg key2 'MEMUSAGE/LIMIT' \
--arg value2 '25.09MiB/15.26GiB' \
'. | .[$key0]=$value0 | .[$key1]=$value1 | .[$key2]=$value2' \
<<<'{}'
...passing each key and value out-of-band (such that it's treated as a literal string rather than parsed as JSON), then referring to them individually.
Result
Either of the above will emit:
{
"CONTAINER": "nginx_container",
"CPU%": "0.02%",
"MEMUSAGE/LIMIT": "25.09MiB/15.26GiB",
"MEM%": "0.16%",
"NETI/O": "0B/0B",
"BLOCKI/O": "22.09MB/4.096kB",
"PIDS": "0"
}
Why
In short: Because it's guaranteed to generate valid JSON as output.
Consider the following as an example that would break more naive approaches:
s='key ending in a backslash\
value "with quotes"'
Sure, these are unexpected scenarios, but jq knows how to deal with them:
{
"key ending in a backslash\\": "value \"with quotes\""
}
...whereas an implementation that didn't understand JSON strings could easily end up emitting:
{
"key ending in a backslash\": "value "with quotes""
}
I know this is an old post, but the tool you seek is called jo: https://github.com/jpmens/jo
A quick and easy example:
$ jo my_variable="simple"
{"my_variable":"simple"}
A little more complex
$ jo -p name=jo n=17 parser=false
{
"name": "jo",
"n": 17,
"parser": false
}
Add an array
$ jo -p name=jo n=17 parser=false my_array=$(jo -a {1..5})
{
"name": "jo",
"n": 17,
"parser": false,
"my_array": [
1,
2,
3,
4,
5
]
}
I've made some pretty complex stuff with jo, and the nice thing is that you don't have to worry about rolling your own solution and possibly producing invalid JSON.
You can ask docker to give you JSON data in the first place
docker stats --format "{{json .}}"
For more on this, see: https://docs.docker.com/config/formatting/
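For example, each output line is a complete JSON object, so the lines can be slurped into one array for further processing (a sketch; note that the key names docker emits differ from the table headers above):
docker stats --no-stream --format '{{json .}}' | jq -s '.'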
JSONSTR=""
declare -a JSONNAMES=()
declare -A JSONARRAY=()
LOOPNUM=0
cat ~/newfile | while IFS='|' read -r CONTAINER CPU MEMUSE MEMPC NETIO BLKIO PIDS; do
if [[ "$LOOPNUM" = 0 ]]; then
JSONNAMES=("$CONTAINER" "$CPU" "$MEMUSE" "$MEMPC" "$NETIO" "$BLKIO" "$PIDS")
LOOPNUM=$(( LOOPNUM+1 ))
else
echo "{ \"${JSONNAMES[0]}\": \"${CONTAINER}\", \"${JSONNAMES[1]}\": \"${CPU}\", \"${JSONNAMES[2]}\": \"${MEMUSE}\", \"${JSONNAMES[3]}\": \"${MEMPC}\", \"${JSONNAMES[4]}\": \"${NETIO}\", \"${JSONNAMES[5]}\": \"${BLKIO}\", \"${JSONNAMES[6]}\": \"${PIDS}\" }"
fi
done
Returns:
{ "CONTAINER": "nginx_container", "CPU%": "0.02%", "MEMUSAGE/LIMIT": "25.09MiB/15.26GiB", "MEM%": "0.16%", "NETI/O": "0B/0B", "BLOCKI/O": "22.09MB/4.096kB", "PIDS": "0" }
Here is a solution which uses the -R and -s options along with transpose:
split("\n") # [ "CONTAINER...", "nginx_container|0.02%...", ...]
| (.[0] | split("|")) as $keys # [ "CONTAINER", "CPU%", "MEMUSAGE/LIMIT", ... ]
| (.[1:][] | split("|")) # [ "nginx_container", "0.02%", ... ] [ ... ] ...
| select(length > 0) # (remove empty [] caused by trailing newline)
| [$keys, .] # [ ["CONTAINER", ...], ["nginx_container", ...] ] ...
| [ transpose[] | {(.[0]):.[1]} ] # [ {"CONTAINER": "nginx_container"}, ... ] ...
| add # {"CONTAINER": "nginx_container", "CPU%": "0.02%" ...
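For completeness, if the filter above is saved as prog.jq, it would be invoked along these lines (assuming the input is in $s, as in the prerequisite):
jq -R -s -f prog.jq <<<"$s"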
json_template='{"CONTAINER":"%s","CPU%%":"%s","MEMUSAGE/LIMIT":"%s", "MEM%%":"%s","NETI/O":"%s","BLOCKI/O":"%s","PIDS":"%s"}'
json_string=$(printf "$json_template" "nginx_container" "0.02%" "25.09MiB/15.26GiB" "0.16%" "0B/0B" "22.09MB/4.096kB" "0")
echo "$json_string"
This doesn't use jq, but it can take its values from arguments and environment variables. (Note the doubled %% so that bash's printf doesn't treat the literal % signs as format directives.)
CONTAINER=nginx_container
json_template='{"CONTAINER":"%s","CPU%%":"%s","MEMUSAGE/LIMIT":"%s", "MEM%%":"%s","NETI/O":"%s","BLOCKI/O":"%s","PIDS":"%s"}'
json_string=$(printf "$json_template" "$CONTAINER" "$1" "25.09MiB/15.26GiB" "0.16%" "0B/0B" "22.09MB/4.096kB" "0")
echo "$json_string"
If you're starting with tabular data, I think it makes more sense to use something that works with tabular data natively, like sqawk, to turn it into JSON, and then use jq to work with it further.
echo 'CONTAINER|CPU%|MEMUSAGE/LIMIT|MEM%|NETI/O|BLOCKI/O|PIDS
nginx_container|0.02%|25.09MiB/15.26GiB|0.16%|0B/0B|22.09MB/4.096kB|0' \
| sqawk -FS '[|]' -RS '\n' -output json 'select * from a' header=1 \
| jq '.[] | with_entries(select(.key|test("^a.*")|not))'
{
"CONTAINER": "nginx_container",
"CPU%": "0.02%",
"MEMUSAGE/LIMIT": "25.09MiB/15.26GiB",
"MEM%": "0.16%",
"NETI/O": "0B/0B",
"BLOCKI/O": "22.09MB/4.096kB",
"PIDS": "0"
}
Without jq, sqawk gives a bit too much:
[
{
"anr": "1",
"anf": "7",
"a0": "nginx_container|0.02%|25.09MiB/15.26GiB|0.16%|0B/0B|22.09MB/4.096kB|0",
"CONTAINER": "nginx_container",
"CPU%": "0.02%",
"MEMUSAGE/LIMIT": "25.09MiB/15.26GiB",
"MEM%": "0.16%",
"NETI/O": "0B/0B",
"BLOCKI/O": "22.09MB/4.096kB",
"PIDS": "0",
"a8": "",
"a9": "",
"a10": ""
}
]

passing arguments to jq filter

Here is my config.json:
{
"env": "dev",
"dev": {
"projects" : {
"prj1": {
"dependencies": {},
"description": ""
}
}
}
}
Here are my bash commands:
PRJNAME='prj1'
echo $PRJNAME
jq --arg v "$PRJNAME" '.dev.projects."$v"' config.json
jq '.dev.projects.prj1' config.json
The output:
prj1
null
{
"dependencies": {},
"description": ""
}
So $PRJNAME is prj1, but the first invocation only outputs null.
Can someone help me?
The jq program .dev.projects."$v" in your example will literally try to find a key named "$v". Try the following instead:
jq --arg v "$PRJNAME" '.dev.projects[$v]' config.json
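With the sample config.json, this prints the same object as the hard-coded query:
{
"dependencies": {},
"description": ""
}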
You can also use --argjson when the value you pass is itself JSON.
--arg a v # set variable $a to value <v>;
--argjson a v # set variable $a to JSON value <v>;
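A quick illustration of the difference:
jq -n --arg a 1 '$a'        # "1"  (a string)
jq -n --argjson a 1 '$a'    # 1    (a number)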
As asked in a comment above, there is a way to pass multiple arguments. Maybe there's a more elegant way, but it works.
If you are sure all keys are always needed, you can use this:
jq --arg key1 "$k1" --arg key2 "$k2" --arg key3 "$k3" --arg key4 "$k4" '.[$key1] | .[$key2] | .[$key3] | .[$key4]'
If the key isn't always used, you could do it like this:
jq --arg key "$k" 'if $key != "" then .[$key] else . end'
If the key sometimes refers to an array index:
jq --arg key "$k" 'if type == "array" then .[$key | tonumber] else .[$key] end'
of course you can combine these!
You can do this:
key="dev.projects.prj1"
filter=".$key"
jq "$filter" config.json
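Note that building the filter by string interpolation breaks on keys that contain dots or other special characters; a safer variant of the same idea, splitting the key on "." as the last answer below does, is:
jq --arg key "$key" 'getpath($key / ".")' config.json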
My example bash command replaces an array in a JSON file.
Everything is in variables:
a=/opt/terminal/conf/config.json ; \
b='["alchohol"]' ; \
c=terminal.canceledInspections.resultSendReasons ; \
cat $a | jq .$c ; \
cat $a | jq --argjson b $b --arg c $c 'getpath($c / ".") = $b' | sponge $a ; \
cat $a | jq .$c
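sponge comes from moreutils; if it isn't installed, the same in-place update can be written with a temporary file, mirroring the conf.tmp pattern used earlier (a sketch):
tmp=$(mktemp)
jq --argjson b "$b" --arg c "$c" 'getpath($c / ".") = $b' "$a" > "$tmp" && mv "$tmp" "$a"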