Conditionally add json array using jq and bash

I am trying to add a JSON array with just one element to a JSON document, and it has to be added only if it does not already exist.
Example JSON is below.
{
  "lorem": "2.0",
  "ipsum": {
    "key1": "value1",
    "key2": "value2"
  },
  "schemes": ["https"],
  "dorum": "value3"
}
Above is the JSON, where "schemes": ["https"] exists. I am trying to add schemes only if it does not already exist, using the code below.
scheme=$( cat rendertest.json | jq -r '. "schemes" ')
echo $scheme
schem='["https"]'
echo "Scheme is"
echo $schem
if [ -z $scheme ]
then
echo "all good"
else
jq --argjson argval "$schem" '. += { "schemes" : $schem }' rendertest.json > test.json
fi
I get the error below when the JSON array element 'schemes' does not exist: it returns null and then errors out. Any idea where I am going wrong?
null
Scheme is
["https"]
jq: error: schem/0 is not defined at <top-level>, line 1:
. += { "schemes" : $schem }
jq: 1 compile error
Edit: the question is not about how to pass bash variables to jq.

Just use an explicit if condition that checks for the attribute schemes at the root level of the JSON structure:
schem='["https"]'
After setting the above variable in your shell, run the following filter:
jq --argjson argval "$schem" 'if has("schemes")|not then .+= { schemes: $argval } else . end' json
The name given immediately after --argjson is the one that must be used inside the jq context (here $argval), but you were trying to use $schem in the filter, which is not defined there.
You can even go one step further and check whether schemes is present but does not contain the value you expect, and in that case overwrite it. Modify the condition inside the '..' to
(has("schemes")|not) or .schemes != $argval
which can be run as
jq --argjson argval "$schem" 'if (has("schemes")|not) or .schemes != $argval then . += { schemes: $argval } else . end' json
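Putting it together, the whole bash check from the question collapses into a single jq invocation. A minimal sketch, reusing the file names from the question (rendertest.json as input, test.json as output):
#!/bin/bash
schem='["https"]'
# add or overwrite "schemes" only when it is absent or differs from the expected value
jq --argjson argval "$schem" \
   'if (has("schemes")|not) or .schemes != $argval then . += { schemes: $argval } else . end' \
   rendertest.json > test.json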

Related

filter keys in JSON using jq

I have a complex nested JSON:
{
  ...
  "key1": {
    "key2": [
      { ...
        "base_score": 4.5
      }
    ],
    "key3": {
      "key4": [
        { ...
          "base_score": 0.5
          ...
        }
      ]
    }
    ...
  }
}
There may be multiple "base_score" keys in the JSON (their paths are unknown), and the corresponding values are numbers. I have to check whether at least one such value is greater than some known value, 7.0, and if there is, I have to do "exit 1". I have to write this query in a shell script.
Assuming the input is valid JSON in a file named input.json, then based on my understanding of the requirements, you could go with:
jq --argjson limit 7.0 '
any(.. | select(type=="object" and (.base_score|type=="number")) | .base_score; . > $limit)
| halt_error(if . then 1 else 0 end)
' input.json
You can modify the argument to halt_error to set the exit code as you wish.
Note that halt_error redirects its input to stderr, so you might want to append 2> /dev/null (or the equivalent expression appropriate for your shell) to the above invocation.
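For example, a minimal wrapper sketch (assuming the file is named input.json, as above) that exits with status 1 when any score exceeds the limit:
#!/bin/bash
# halt_error sets jq's exit status: 0 when no score exceeds the limit, 1 otherwise
if ! jq --argjson limit 7.0 '
     any(.. | select(type=="object" and (.base_score|type=="number")) | .base_score; . > $limit)
     | halt_error(if . then 1 else 0 end)
   ' input.json 2>/dev/null
then
    exit 1
fi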
You can easily get a stream of base_score values at any level and use that with any:
any(..|.base_score?; . > 7)
The stream will contain null values for objects without the property, but null is not greater than any number, so that shouldn't be a stopper.
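A quick way to convince yourself of that null behaviour (null sorts before every number in jq's ordering):
jq -cn '[null > 7, null > -1]'   # prints [false,false]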
You could then compare the output or specify -e/--exit-status to be used with a condition directly:
jq -e 'any(..|.base_score?; . > 7)' complexnestedfile.json >/dev/null && exit 1

how to use jq and bash to inject a file's JSON array contents while appending

I have a .json file which I want to read, take all of its contents, and append them to a JSON string that is in a bash variable.
input.json
[
  {
    "maven": {
      "coordinates": "somelib",
      "repo": "somerepo"
    }
  },
  {
    "maven": {
      "coordinates": "anotherlib",
      "exclusions": ["exclude:this", "*:and-all-that"]
    }
  }
]
OUR_BIG_JSON variable
{
  "name": "",
  "myarray": [
    {
      "generic": {
        "package": "somepymodule"
      }
    },
    {
      "rcann": {
        "package": "anothermodule==3.0.0"
      }
    }
  ]
}
This is the JSON I want to append to; it resides in a variable, so we must echo it when we use jq.
Command attempt:
echo "starting..."
inputs_json=`cat input.json`
echo "$inputs_json"
echo "$OUR_BIG_JSON"
OUR_BIG_JSON=$(echo "${OUR_BIG_JSON}" |./jq '.myarray += [{"date":"today"}]') # This worked..
# the next line fails
OUR_BIG_JSON=$(echo "${OUR_BIG_JSON}" |./jq '.myarray += ' $inputs_json)
Errors come out when trying to take the $inputs_json contents and pipe them into the variable, i.e. append them to the JSON in memory:
jq: error: syntax error, unexpected INVALID_CHARACTER (Windows cmd shell quoting issues?) at <top-level>, line 1:
.myarray += \"$inputs_json\"
jq: 1 compile error
Using the --argjson option to read in the contents of your shell variable enables you to use it as a variable inside the jq filter, too. Your input file can be read in as a positional argument, as usual.
jq --argjson v "${OUR_BIG_JSON}" '…your filter here using $v…' input.json
For example:
jq --argjson v "${OUR_BIG_JSON}" '$v + {myarray: ($v.myarray + .)}' input.json
Or by making the input too a variable inside the jq filter
jq --argjson v "${OUR_BIG_JSON}" '. as $in | $v | .myarray += $in' input.json
With the -n option you could also reference the input using input:
jq --argjson v "${OUR_BIG_JSON}" -n '$v | .myarray += input' input.json
There are also the options --argfile and --slurpfile to read the input file into a variable directly.
As you have tagged bash, you could also do it the other way around: use a herestring <<< to read the bash variable as input, and then use either --argfile or --slurpfile to reference the file content through a variable. For instance:
jq --argfile in input.json '.myarray += $in' <<< "${OUR_BIG_JSON}"
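As a follow-up, if you want to keep working with the merged document in the shell, any of these invocations can be captured back into the variable with command substitution. A small sketch using the -n/input variant from above:
OUR_BIG_JSON=$(jq --argjson v "${OUR_BIG_JSON}" -n '$v | .myarray += input' input.json)
echo "${OUR_BIG_JSON}" | jq '.myarray | length'   # sanity check: number of entries after the merge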

Looping over a list of keys to extract from a JSON file with jq

I'm trying to extract a series of properties (named in an input file) with jq, and I get an error when I feed them from bash via a loop:
while read line; do echo $line; cat big.json | jq ".$line"; sleep 1; done < big.properties.service
cfg.keyload.service.count
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
When I try to do it manually, it works:
$ line=cfg.keyload.service.count
$ echo $line
cfg.keyload.service.count
$ cat big.json | jq ".$line"
1
Is there any way to get it to work in the loop?
Here is an example:
cat >big.json <<EOF
{
  "cfg": {
    "keyload": {
      "backend": {
        "app": {
          "shutdown": {
            "timeout": "5s"
          },
          "jmx": {
            "enable": true
          }
        }
      }
    }
  }
}
EOF
cat >big.properties.service <<EOF
cfg.keyload.backend.app.shutdown.timeout
cfg.keyload.backend.app.jmx.enable
cfg.keyload.backend.app.jmx.nonexistent
cfg.nonexistent
EOF
The output should be:
cfg.keyload.backend.app.shutdown.timeout
"5s"
cfg.keyload.backend.app.jmx.enable
true
cfg.keyload.backend.app.jmx.nonexistent
null
cfg.nonexistent
null
Immediate Issue - Invalid Input
The "invalid character" at hand here is almost certainly a carriage return. Use dos2unix to convert your input file to a proper UNIX text file, and your original code will work (albeit very inefficiently, rereading your whole big.json every time it wants to extract a single property).
Performant Implementation - Loop In JQ, Not Bash
Don't use a bash loop for this at all -- it's much more efficient to have jq do the looping.
Note the sub("\r$"; "") used in this code to remove trailing carriage returns so it can accept input in DOS format.
jq -rR --argfile infile big.json '
sub("\r$"; "") as $keyname
| ($keyname | split(".")) as $pieces
| (reduce $pieces[] as $piece ($infile; .[$piece]?)) as $value
| ($keyname, ($value | tojson))
' <big.properties.service
This properly emits the following output when given the inputs in the question:
cfg.keyload.backend.app.shutdown.timeout
"5s"
cfg.keyload.backend.app.jmx.enable
true
cfg.keyload.backend.app.jmx.nonexistent
null
cfg.nonexistent
null
Your properties file is effectively a list of paths in the JSON that you want to retrieve values from. Convert them to paths that jq recognizes so you can get those values: just make an array of the keys that need to be traversed. Be sure to read your properties file as raw input (-R) since it's not JSON, and use raw output (-r) to be able to output the paths as you want.
$ jq --argfile big big.json '
., (split(".") as $p | $big | getpath($p) | tojson)
' -Rr big.properties.service
cfg.keyload.backend.app.shutdown.timeout
"5s"
cfg.keyload.backend.app.jmx.enable
true
cfg.keyload.backend.app.jmx.nonexistent
null
cfg.nonexistent
null
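Note that newer jq releases mark --argfile as deprecated in favour of --slurpfile. A sketch of the same approach with --slurpfile (which wraps the file's contents in an array, hence the $big[0]):
jq --slurpfile big big.json -Rr '
  ., (split(".") as $p | $big[0] | getpath($p) | tojson)
' big.properties.service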

jq not replacing json value with parameter

test.sh is not replacing the test.json parameter values ($input1 and $input2); result.json ends up with the same literal ParameterValue "$input1/solution/$input2.result".
[
  {
    "ParameterKey": "Project",
    "ParameterValue": [ "$input1/solution/$input2.result" ]
  }
]
test.sh
#!/bin/bash
input1="test1"
input2="test2"
echo $input1
echo $input2
cat test.json | jq 'map(if .ParameterKey == "Project" then . + {"ParameterValue" : "$input1/solution/$input2.result" } else . end )' > result.json
Shell variables in jq scripts should be interpolated or passed as arguments via --arg name value:
jq --arg inp1 "$input1" --arg inp2 "$input2" \
'map(if .ParameterKey == "Project"
then . + {"ParameterValue" : ($inp1 + "/solution/" + $inp2 + ".result") }
else . end)' test.json
The output:
[
  {
    "ParameterKey": "Project",
    "ParameterValue": "test1/solution/test2.result"
  }
]
In your jq program, you have quoted "$input1/solution/$input2.result", and therefore it is a JSON string literal, whereas you evidently want string interpolation. You also need to distinguish between the shell variables ($input1 and $input2) on the one hand, and the corresponding jq dollar-variables (which may or may not have the same names) on the other.
Since your shell variables are strings, you could pass them in using the --arg command-line option (e.g. --arg input1 "$input1" if you chose to name the variables in the same way).
You can read up on string interpolation in the jq manual (see https://stedolan.github.io/jq/manual, but note the links at the top for different versions of jq).
There are other ways to achieve the desired results too, but using string interpolation with same-named variables, you'd write:
"\($input1)/solution/\($input2).result"
Notice that the above string is NOT itself literally a JSON string. Only after string interpolation does it become so.
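Putting that together, a full invocation using string interpolation might look like the following sketch (the --arg names here simply mirror the shell variable names, which is an arbitrary choice):
jq --arg input1 "$input1" --arg input2 "$input2" \
   'map(if .ParameterKey == "Project"
        then . + {"ParameterValue": "\($input1)/solution/\($input2).result"}
        else . end)' test.json > result.json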

Dynamically add json object into array with jq

Below is the template of my employee.json file
{
  "orgConfig": {
    "departments": []
  }
}
where departments will be an array of departments like the below:
{
  "name": "physics",
  "id": "1234",
  "head": "abcd"
}
similarly
{
  "name": "chemistry",
  "id": "3421",
  "head": "xyz"
}
So the final array structure I want to construct is as below:
{
  "orgConfig": {
    "departments": [
      {
        "name": "physics",
        "id": "1234",
        "head": "abcd"
      },
      {
        "name": "chemistry",
        "id": "3421",
        "head": "xyz"
      },
      {
        "name": "Maths",
        "id": "4634",
        "head": "jklm"
      }
    ]
  }
}
Below is the code where I am adding the JSON elements to the departments array dynamically:
#!/bin/bash
source department.properties # will have departments=physics,chemistry,Maths,computers .. etc
IFS=',' read -ra NAMES <<< "$departmentsToImport"
position=0
for i in "${NAMES[@]}"; do
    # ./jsonfiles will have chemistry.json, physics.json, Maths.json etc.
    value=`cat ./jsonfiles/$i.json`
    if [ $position -eq 0 ]
    then
        cat employee.json | jq --arg value "$value" '.orgConfig.departments[0] |= .+ $value' > tmp.json && mv tmp.json employee.json
    else
        cat employee.json | jq --arg position "$position" --arg value "$value" '.orgConfig.departments[$position] |= .+ $value' > tmp.json && mv tmp.json employee.json
    fi
    ((position++))
    rm -rf tmp.json
done
exit $?
but program throws below error
jq: error (at <stdin>:51): Cannot index array with string "1"
But if I use a direct index instead of the variable position, then it works fine:
cat employee.json | jq --argjson value "$value" '.orgConfig.departments[1] |= .+ $value' > tmp.json && mv tmp.json employee.json
I do not know how many key-value maps of departments I will have, so I cannot hard-code the index. Any help with the above problem of dynamically adding a JSON object into the array?
Thanks
This can be done by turning off jq's automatic input reading, then piping the first explicit input through a filter that modifies its input using the remaining explicit inputs. Make sense? No :) But the code itself is simple:
jq -n 'input | .orgConfig.departments += [inputs]' \
employee.json chemistry.json physics.json math.json
The first input reads from the first argument (employee.json).
The += gets its input from the input filter. Its left operand
selects the field to update; the right operand provides the value
to update it with.
inputs reads the remaining command-line arguments (the department files); wrapping it in [...] collects their contents into an array, one file per element.
Combining this with your shell code to select the correct course files yields
source department.properties
IFS=, read -ra NAMES <<< "$departmentsToImport"
for c in "${NAMES[@]}"; do
courses+=("./jsonfiles/$c.json")
done
jq -n 'input | .orgConfig.departments += [inputs]' employee.json "${courses[#]}"
The task can be accomplished without invoking jq more than once.
Something very much like the following should suffice:
jq -s '{orgConfig: {departments: . }}' jsonfiles/*.json
This solution of course assumes the .json files all hold valid JSON.
The trick is to use the -s (aka --slurp) option, as this converts the input into an array for you. You'll probably find that using -s results in better run-times than some other approaches.
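If the employee.json template contains more fields than just the empty departments array and you want to preserve them, a slurp-based sketch (assuming employee.json is listed first on the command line) can merge the department files into the template instead of rebuilding the structure:
# .[0] is the template, .[1:] are the slurped department files
jq -s '.[1:] as $deps | .[0] | .orgConfig.departments += $deps' employee.json jsonfiles/*.json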