Modify Nested JSON with jq

I'm trying to modify nested JSON objects using the jq map function in a bash/shell script, similar to this blog entry, but adapting the examples there to nested objects.
The returned JSON to be modified is as follows:
{
  "name": "vendor-module",
  "dependencies": {
    "abc": {
      "from": "abc#2.4.0",
      "resolved": "https://some.special.url",
      "version": "2.4.0"
    },
    "acme": {
      "from": "acme#1.2.3",
      "resolved": "<CHANGE_THIS>",
      "version": "1.2.3"
    }
  }
}
This would be my attempt:
modules="`node -pe 'JSON.parse(process.argv[1]).dependencies.$dependency' \
  "$(cat $wrapped)"`"
version="1.2.3"
resolved="some_url"
cat OLD.json |
  jq 'to_entries |
      map(if .dependencies[0].$module[0].from == "$module#$version"
          then . + {"resolved"}={"$resolved"}
          else .
          end
      ) |
      from_entries' > NEW.json
Obviously this doesn't work. When I run the script, NEW.json is created, but without modifications and without any errors being returned. If I don't target a nested object (e.g., "name": "vendor-module"), the script works as expected. I'm sure there is a way to do this using native bash and jq? Any help (with the proper escaping) will be greatly appreciated.
UPDATE:
Thanks to Charles Duffy's answer and his suggestion of using sponge, the solution that works well for me is:
jq --arg mod "acme" --arg resolved "Some URL" \
   '.dependencies[$mod].resolved |= $resolved' \
   OLD.json | sponge OLD.json

If you know the name of the dependency you want to update, you could just index into it.
$ jq --arg dep "$dep" --arg resolved "$resolved" \
'.dependencies[$dep].resolved = $resolved' \
OLD.json > NEW.json
Otherwise, to modify a dependency based on the name (or other property), search for the dependency and update.
$ jq --arg version "$version" --arg resolved "$resolved" \
'(.dependencies[] | select(.version == $version)).resolved = $resolved' \
OLD.json > NEW.json

For your existing sample data, the following suffices:
jq --arg mod "acme" \
--arg resolved "some_url" \
'.dependencies[$mod].resolved=$resolved' \
<in.json >out.json
...to filter on the from, by contrast:
jq --arg new_url "http://new.url/" \
   --arg target "acme#1.2.3" \
   '.dependencies=(.dependencies
      | to_entries
      | map(if(.value.from == $target)
            then .value.resolved=$new_url
            else . end)
      | from_entries)' \
   <in.json >out.json
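The to_entries/from_entries round-trip is only needed when the keys themselves have to change; since the test here is on a value (.from), the same update can be sketched more compactly with select (using the sample document from the question, assumed saved as in.json):

```shell
# Build the sample document from the question.
cat > in.json <<'EOF'
{
  "name": "vendor-module",
  "dependencies": {
    "abc": {"from": "abc#2.4.0", "resolved": "https://some.special.url", "version": "2.4.0"},
    "acme": {"from": "acme#1.2.3", "resolved": "<CHANGE_THIS>", "version": "1.2.3"}
  }
}
EOF

# Update the matching dependency via a path expression on select().
jq --arg new_url "http://new.url/" --arg target "acme#1.2.3" \
   '(.dependencies[] | select(.from == $target)).resolved = $new_url' \
   <in.json >out.json

jq -r '.dependencies.acme.resolved' out.json   # http://new.url/
```

Both forms leave abc untouched; the select-based one just avoids rebuilding the object entry by entry.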

Related

Structured logs in shell scripts using jq

I've been wanting to log some shell scripts to a file, to be picked up by an agent like Fluent Bit and shipped off to CloudWatch and Datadog.
I found this example on the web which works like a charm using jq.
__timestamp(){
  date "+%Y%m%dT%H%M%S"
}
__log(){
  log_level="$1"
  message="$2"
  echo '{}' | jq \
    --arg timestamp "$(__timestamp)" \
    --arg log_level "$log_level" \
    --arg message "$message" \
    '.timestamp=$timestamp|.log_level=$log_level|.message=$message' >> logs.log
}
__log "INFO" "Hello, World!"
__log "INFO" "Hello, World!"
The one issue is that the output is rendering a full json with newlines and tabs. Easy on the eyes, but not great for cloud logging services.
{
  "timestamp": "20220203T162320",
  "log_level": "INFO",
  "message": "Hello, World!"
}
How would I modify to render the output on one line like so?
{"timestamp": "20220203T171908","log_level": "INFO","message": "Hello, World!"}
Use the --compact-output or -c option: jq -c --arg … '…' >> logs.log
From the manual:
--compact-output / -c:
By default, jq pretty-prints JSON output. Using this option will
result in more compact output by instead putting each JSON object
on a single line.
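A minimal before/after illustration of the flag:

```shell
# Default: pretty-printed over several lines.
jq -n '{a: 1, b: 2}'
# With -c: one object per line, no extra whitespace.
jq -cn '{a: 1, b: 2}'    # {"a":1,"b":2}
```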
After having read your linked article about "Structured Logging", this is how I would have implemented said __log function:
__log(){
  jq -Mcn --arg log_level "$1" --arg message "$2" \
    '{timestamp: now|todate, $log_level, $message}'
}
__log "INFO" "Hello, World!"
🤦 Somehow I completely missed Step 3.0, The Finishing Touches, of this terrific guide, which answers my question perfectly.
In case anyone else was curious, the solution is here:
__timestamp(){
  date "+%Y%m%dT%H%M%S"
}
__log(){
  log_level="$1"
  message="$2"
  echo '{}' | \
    jq --monochrome-output \
       --compact-output \
       --raw-output \
       --arg timestamp "$(__timestamp)" \
       --arg log_level "$log_level" \
       --arg message "$message" \
       '.timestamp=$timestamp|.log_level=$log_level|.message=$message'
}
__log "INFO" "Hello, World!"
Which outputs:
{"timestamp":"20210812T191730","log_level":"INFO","message":"Hello, World!"}

Getting empty string after parsing bash associative array to json using jq

I have a bash associative array containing dynamic data like:
declare -A assoc_array=([cluster_name]="cpod1" [site_name]="ppod1" [alarm_name]="alarm1")
I have to create the JSON data accordingly.
{
  "name": "cluster_name",
  "value": $assoc_array[cluster_name],
  "regex": False
}
{
  "name": "site_name",
  "value": $assoc_array[site_name],
  "regex": False
}
{
  "name": "alert_name",
  "value": $assoc_array[alert_name],
  "regex": False
}
I have used the following code for it:
for i in "${!assoc_array[@]}"; do
  echo $i
  alarmjsonarray=$(echo ${alarmjsonarray} | jq --arg name "$i" \
    --arg value "${alarm_param[$i]}" \
    --arg isRegex "False" \
    '. + [{("name"):$name,("value"):$value,("isRegex"):$isRegex}]')
done
echo "alarmjsonarray" $alarmjsonarray
I am getting an empty string from it. Can you please help me with it?
Assuming the ASCII record separator character \x1e can't occur in your data (and given your assertions that newlines will never be present), one way to handle this would be:
for key in "${!assoc_array[@]}"; do
  printf '%s\x1e%s\n' "$key" "${assoc_array[$key]}"
done | jq -Rn '
  [
    inputs |
    split("\u001e") | .[0] as $key | .[1] as $value |
    {"name": $key, "value": $value, "regex": false}
  ]'
...feel free to change \x1e to a different character (like a tab) that can never exist, as appropriate.
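For reference, the pipeline above can be exercised without an associative array at all; printf reuses its format string, so pairs of arguments become one record per line (\036 is the octal spelling of \x1e, which portable printf implementations understand):

```shell
# Two key/value pairs, one \x1e-separated record per line, folded into a
# JSON array by a single jq call.
json=$(printf '%s\036%s\n' cluster_name cpod1 site_name ppod1 |
  jq -Rn '[ inputs
            | split("\u001e")
            | {name: .[0], value: .[1], regex: false} ]')
echo "$json"
```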
#!/bin/bash
declare -A assoc_array=([cluster_name]="cpod1" [site_name]="ppod1" [alarm_name]="alarm1")
for i in "${!assoc_array[@]}"; do
  alarmjsonarray=$(echo "${alarmjsonarray:-[]}" |
    jq --arg name "$i" \
       --arg value "${assoc_array[$i]}" \
       --arg isRegex "False" \
       '. + [{("name"):$name,("value"):$value,("isRegex"):$isRegex}]')
done
echo "alarmjsonarray" "$alarmjsonarray"
The array was supposed to be initialized before using it. I thought that in bash, variables stay in scope outside the block as well.
It's a simpler way to generate JSON from a bash associative array.

Looping through a json file and retrieving the values into bash variables

I have a json file which has the below data
{"Item": {"ID": {"S": "4869949"},"no":{"N": "2"}}}
I need to retrieve the S value from ID and the N value from no and put them into a query as parameters. The query looks like this:
query --table-name validation \
--key-condition-expression "ID = :v1 AND no = :v2" \
--expression-attribute-values '{":v1": {"S": "${ID}"},":v2": {"N": "${no}"}}' \
--region us-east-1
I have tried using jq but couldn't figure out how to read a json file and retrieve the values as parameters. This is the first time I am using jq. Can someone help with it?
The following may not be exactly what you're after but does show how you can avoid calling jq more than once, and how the command can partly be constructed by jq. In other words, assuming the templates for --key-condition-expression and --expression-attribute-values are fixed, you should be able to adapt the following to your needs:
Assuming a bash or bash-like shell, and that data.json holds:
{"Item": {"ID": {"S": "4869949"},"no":{"N": "2"}}}
we could write:
< data.json jq -cr '
  .Item
  | .ID.S as $v1
  | .no.N as $v2
  | $v1, $v2, {":v1": {"S": $v1}, ":v2": {"N": $v2}}
' | while read -r v1
do
  read -r v2
  read -r query      # the ready-made attribute-values object
  echo query --table-name validation \
    --key-condition-expression '"ID = :v1 AND no = :v2"' \
    --expression-attribute-values "'$query'" \
    --region us-east-1
done
jq as a template engine
If the templates containing :v1 and :v2 in the question are variable, then you could adapt the above by using jq as a template engine, a topic which is covered in the jq Cookbook: https://github.com/stedolan/jq/wiki/Cookbook#using-jq-as-a-template-engine
It's basic jq syntax:
$ ID=$(cat data.json | jq .Item.ID.S -r)
$ no=$(cat data.json | jq .Item.no.N -r)
$ echo $ID $no
4869949 2
I'd suggest you read something like https://shapeshed.com/jq-json/ to get more familiar though.
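If calling jq twice bothers you, both fields can also be pulled in a single invocation and split with read (a sketch; safe here only because neither value can contain whitespace):

```shell
# Sample document from the question.
echo '{"Item": {"ID": {"S": "4869949"},"no":{"N": "2"}}}' > data.json

# One jq call emits both values on one line; read splits them into variables.
read -r ID no <<EOF
$(jq -r '.Item | "\(.ID.S) \(.no.N)"' data.json)
EOF
echo "$ID $no"    # 4869949 2
```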

Send bash array to json file

I pulled some config variables from a json file with jq.
Once modified, I want to write the whole config array (which can contain keys that were not there at first) back to the json file.
The "foreach" part seems quite obvious.
But how do I express "change keyA to valueA, or add keyA=>valueA" in the conf file?
I'm stuck with something like
for key in "${!conf[@]}"; do
  value=${conf[$key]}
  echo $key $value
  jq --arg key $key --arg value $value '.$key = $value' $conf_file > $new_file
done
Thanks
Extended solution with single jq invocation:
Sample conf array:
declare -A conf=([status]="finished" [nextTo]="2018-01-24" [result]=true)
Sample file conf.json:
{
  "status": "running",
  "minFrom": "2018-01-23",
  "maxTo": "2018-01-24",
  "nextTo": "2018-01-23",
  "nextFrom": "2018-01-22"
}
Processing:
jq --arg data "$(paste -d':' <(printf "%s\n" "${!conf[@]}") <(printf "%s\n" "${conf[@]}"))" \
   '. as $conf
    | map($data | split("\n")[] | split(":") | {(.[0]): .[1]})
    | add
    | $conf + .' conf.json > conf.tmp && mv conf.tmp conf.json
The resulting conf.json contents:
{
  "status": "finished",
  "minFrom": "2018-01-23",
  "maxTo": "2018-01-24",
  "nextTo": "2018-01-24",
  "nextFrom": "2018-01-22",
  "result": "true"
}
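The paste -d':' trick breaks if a value ever contains a colon. A delimiter-free variant (a sketch) streams keys and values as alternating raw lines and folds them in with reduce; each key pulled from inputs is followed by an input call that consumes the matching value line:

```shell
# Same idea as above, merged without any delimiter character.
declare -A conf=([status]="finished" [nextTo]="2018-01-24" [result]=true)
printf '%s\n' '{"status":"running","nextTo":"2018-01-23"}' > conf.json

for key in "${!conf[@]}"; do
  printf '%s\n%s\n' "$key" "${conf[$key]}"
done | jq -Rn --slurpfile orig conf.json '
  $orig[0] + (reduce inputs as $k ({}; . + {($k): input}))
' > conf.tmp && mv conf.tmp conf.json

jq -r '.status' conf.json
```

As in the original answer, all merged values land as strings; use --argjson-style handling if you need real booleans or numbers.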
I finally arrived at the following solution: not as compact as RomanPerekhrest's, but a bit more human-readable. Thanks.
function update_conf() {
  echo update_conf
  if [ -n "$1" ] && [ -n "$2" ]; then
    conf[$1]=$2
  fi
  for key in "${!conf[@]}"; do
    value=${conf[$key]}
    jq --arg key "$key" --arg value "$value" '.[$key] = $value' $confFile > $confTempFile && mv $confTempFile $confFile
  done
}

passing arguments to jq filter

Here is my config.json:
{
  "env": "dev",
  "dev": {
    "projects": {
      "prj1": {
        "dependencies": {},
        "description": ""
      }
    }
  }
}
Here are my bash commands:
PRJNAME='prj1'
echo $PRJNAME
jq --arg v "$PRJNAME" '.dev.projects."$v"' config.json
jq '.dev.projects.prj1' config.json
The output:
prj1
null
{
  "dependencies": {},
  "description": ""
}
So $PRJNAME is prj1, but the first invocation only outputs null.
Can someone help me?
The jq program .dev.projects."$v" in your example will literally try to find a key named "$v". Try the following instead:
jq --arg v "$PRJNAME" '.dev.projects[$v]' config.json
You can use --argjson too, when you need to pass a JSON value rather than a string.
--arg a v # set variable $a to value <v>;
--argjson a v # set variable $a to JSON value <v>;
As asked in a comment above, there's a way to pass multiple arguments. Maybe there's a more elegant way, but it works.
If you are sure all keys are always needed, you can use this:
jq --arg key1 "$k1" --arg key2 "$k2" --arg key3 "$k3" --arg key4 "$k4" '.[$key1] | .[$key2] | .[$key3] | .[$key4]'
If the key isn't always used, you could do it like this:
jq --arg key "$k" 'if $key != "" then .[$key] else . end'
If the key sometimes refers to an array:
jq --arg key "$k" 'if type == "array" then .[$key | tonumber] else .[$key] end'
of course you can combine these!
you can do this:
key="dev.projects.prj1"
filter=".$key"
cat config.json | jq $filter
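Beware that building the filter by string interpolation executes whatever is in $key as jq code. A safer sketch keeps the path in --arg and resolves it with getpath; dividing one string by another splits it in jq, so $key / "." yields ["dev","projects","prj1"]:

```shell
# config.json as in the question.
cat > config.json <<'EOF'
{"env": "dev", "dev": {"projects": {"prj1": {"dependencies": {}, "description": ""}}}}
EOF

# The dotted path stays data, never becoming part of the jq program text.
key="dev.projects.prj1"
jq --arg key "$key" 'getpath($key / ".")' config.json
```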
My example bash command to replace an array in a json file.
Everything is in variables:
a=/opt/terminal/conf/config.json ; \
b='["alchohol"]' ; \
c=terminal.canceledInspections.resultSendReasons ; \
cat $a | jq .$c ; \
cat $a | jq --argjson b $b --arg c $c 'getpath($c / ".") = $b' | sponge $a ; \
cat $a | jq .$c